Deep learning for neural encoding and decoding of remotely controlled object
Understanding the information carried by each individual neuron's firing rate in relation to the environment is of great interest to neuroscientists. Encoding models, which relate information in the environment-to-brain direction, have shown that neurons fire specifically to stimuli such as position and movement. Conversely, decoding models, which study information in the brain-to-environment direction, can be used to study brain computations or to build engineering applications such as neural prosthetics. Here, we propose a novel behavioural task to study the information encoded by, and the ability to decode information from, individual neurons across different brain regions in a remote-control task. We optogenetically inhibited several regions of the brain and found that the secondary motor cortex and the posterior parietal cortex (PPC) were behaviourally relevant to the task. By building an encoding model, we found neurons holding representations of the position and velocity of the object's movement, located mainly in the PPC and the motor regions respectively. Using advanced deep learning models, the object's position and velocity could also be decoded from the neurons' firing rates, although the performance across brain regions differed from that of the encoding model. Applying explainable artificial intelligence techniques revealed how much individual neurons contributed to the success of the decoding model, and sequentially removing the most relevant neurons showed how many neurons contribute significantly to each predicted task variable. More importantly, the complementary findings from the encoding and decoding models reveal a plausible environment-to-brain-to-environment mechanism by which the mice solve the task.
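As an illustration only, the decoding and neuron-attribution ideas summarised in the abstract can be sketched in a few lines of Python. The snippet below is not taken from the thesis: it assumes synthetic Poisson firing rates, a small feed-forward PyTorch decoder mapping rates to two kinematic variables, and a simple zero-out ablation loop as a rough stand-in for the explainable-AI attribution and sequential neuron removal described above.

```python
import numpy as np
import torch
from torch import nn

rng = np.random.default_rng(0)

# Synthetic data standing in for recorded activity: 2000 time bins, 50 neurons,
# and 2 task variables (say, object position and velocity). These shapes are
# arbitrary choices for the example, not values from the thesis.
n_bins, n_neurons, n_outputs = 2000, 50, 2
rates = rng.poisson(lam=3.0, size=(n_bins, n_neurons)).astype(np.float32)
true_weights = rng.normal(size=(n_neurons, n_outputs)).astype(np.float32)
kinematics = rates @ true_weights + rng.normal(scale=0.5, size=(n_bins, n_outputs)).astype(np.float32)

X, y = torch.from_numpy(rates), torch.from_numpy(kinematics)

# A small feed-forward decoder working in the brain-to-environment direction.
decoder = nn.Sequential(nn.Linear(n_neurons, 64), nn.ReLU(), nn.Linear(64, n_outputs))
optimiser = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(300):
    optimiser.zero_grad()
    loss = loss_fn(decoder(X), y)
    loss.backward()
    optimiser.step()

# Crude attribution: zero out one neuron at a time and record how much the
# decoding error grows; larger increases suggest a larger contribution.
with torch.no_grad():
    baseline = loss_fn(decoder(X), y).item()
    importance = []
    for i in range(n_neurons):
        ablated = X.clone()
        ablated[:, i] = 0.0
        importance.append(loss_fn(decoder(ablated), y).item() - baseline)

ranking = np.argsort(importance)[::-1]
print("Most influential neurons under ablation:", ranking[:5])
```

The thesis itself uses advanced deep learning decoders and dedicated explainable-AI techniques; the toy ablation loop above is only meant to convey the idea of ranking neurons by their contribution and removing the most relevant ones first.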
Main Author: Lim, Amos Wei Han
Other Authors: Goh Wen Bin Wilson
Format: Thesis-Master by Research
Language: English
Published: Nanyang Technological University, 2023
Subjects: Science::Biological sciences::Biomathematics
Online Access: https://hdl.handle.net/10356/166372
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-166372
record_format: dspace
spelling: sg-ntu-dr.10356-166372 (2023-05-02T06:33:01Z). Deep learning for neural encoding and decoding of remotely controlled object. Lim, Amos Wei Han; Goh Wen Bin Wilson, School of Biological Sciences (wilsongoh@ntu.edu.sg). Subject: Science::Biological sciences::Biomathematics. Degree: Master of Science. Accessioned and available: 2023-04-24T08:58:06Z; issued: 2023. Type: Thesis-Master by Research. Citation: Lim, A. W. H. (2023). Deep learning for neural encoding and decoding of remotely controlled object. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/166372. DOI: 10.32657/10356/166372. Language: en. License: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). Format: application/pdf. Publisher: Nanyang Technological University.
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: Science::Biological sciences::Biomathematics
spellingShingle: Science::Biological sciences::Biomathematics; Lim, Amos Wei Han; Deep learning for neural encoding and decoding of remotely controlled object
description: Understanding the information carried by each individual neuron's firing rate in relation to the environment is of great interest to neuroscientists. Encoding models, which relate information in the environment-to-brain direction, have shown that neurons fire specifically to stimuli such as position and movement. Conversely, decoding models, which study information in the brain-to-environment direction, can be used to study brain computations or to build engineering applications such as neural prosthetics. Here, we propose a novel behavioural task to study the information encoded by, and the ability to decode information from, individual neurons across different brain regions in a remote-control task. We optogenetically inhibited several regions of the brain and found that the secondary motor cortex and the posterior parietal cortex (PPC) were behaviourally relevant to the task. By building an encoding model, we found neurons holding representations of the position and velocity of the object's movement, located mainly in the PPC and the motor regions respectively. Using advanced deep learning models, the object's position and velocity could also be decoded from the neurons' firing rates, although the performance across brain regions differed from that of the encoding model. Applying explainable artificial intelligence techniques revealed how much individual neurons contributed to the success of the decoding model, and sequentially removing the most relevant neurons showed how many neurons contribute significantly to each predicted task variable. More importantly, the complementary findings from the encoding and decoding models reveal a plausible environment-to-brain-to-environment mechanism by which the mice solve the task.
author2: Goh Wen Bin Wilson
author_facet: Goh Wen Bin Wilson; Lim, Amos Wei Han
format: Thesis-Master by Research
author: Lim, Amos Wei Han
author_sort: Lim, Amos Wei Han
title: Deep learning for neural encoding and decoding of remotely controlled object
title_sort: deep learning for neural encoding and decoding of remotely controlled object
publisher: Nanyang Technological University
publishDate: 2023
url: https://hdl.handle.net/10356/166372
_version_: 1765213840633495552