Cross-modal synthesis of structural and functional connectome with CycleGAN for disease classification
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2022
Subjects:
Online Access: https://hdl.handle.net/10356/156570
Institution: Nanyang Technological University
Summary: Over the last few decades, the study of the human brain, or neuroscience, has grown at a significant rate, and countless studies of how our brain functions have been published. One interesting technique is to model the human brain as a comprehensive map of neurons. From this perspective, the brain can be further studied via the Structural Connectome (SC), a network of anatomical white-matter connections, and the Functional Connectome (FC), which is commonly used to assess whole-brain dynamics and function. Understanding the relationship between SC and FC would contribute significantly to the field of neuroscience, which justifies the need for multi-view learning techniques to encode large datasets that combine both SC and FC.

In this project, our objective is to study the cross-modal synthesis of the human connectome by proposing an approach that produces SC matrices from FC matrices and vice versa, using the state-of-the-art generative model CycleGAN. Once the synthetic samples have been created, we combine them with the original samples to perform multi-view disease classification using our lab's Convolutional Neural Network (CNN) model. This analysis aims to evaluate the improvement in classification results when these simulated data are applied.
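To illustrate the cross-modal synthesis step, below is a minimal CycleGAN sketch in PyTorch that maps flattened FC matrices to synthetic SC matrices and back. Everything in it is hypothetical: the number of regions `N`, the MLP generators and discriminators, and the training hyperparameters are stand-ins, since this record does not describe the project's actual architecture.

```python
# Minimal CycleGAN sketch for FC <-> SC synthesis (illustrative only; the
# project's real architecture and hyperparameters are not given in this record).
import torch
import torch.nn as nn

N = 90  # hypothetical number of brain regions (e.g., an AAL-style parcellation)

def mlp(in_dim, out_dim):
    # Simple stand-in network; the actual generators/discriminators are unknown.
    return nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(), nn.Linear(512, out_dim))

# Generators translate a flattened connectome from one modality to the other.
G_fc2sc = mlp(N * N, N * N)   # FC -> synthetic SC
G_sc2fc = mlp(N * N, N * N)   # SC -> synthetic FC

# Discriminators score whether a connectome looks real in its modality.
D_sc = mlp(N * N, 1)
D_fc = mlp(N * N, 1)

adv = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()
opt_G = torch.optim.Adam(list(G_fc2sc.parameters()) + list(G_sc2fc.parameters()), lr=2e-4)
opt_D = torch.optim.Adam(list(D_sc.parameters()) + list(D_fc.parameters()), lr=2e-4)

def train_step(fc, sc, lambda_cyc=10.0):
    """One CycleGAN update on a batch of flattened FC/SC matrices."""
    real = torch.ones(fc.size(0), 1)
    fake = torch.zeros(fc.size(0), 1)

    # Generators: fool both discriminators and enforce cycle consistency.
    fake_sc = G_fc2sc(fc)
    fake_fc = G_sc2fc(sc)
    loss_G = (adv(D_sc(fake_sc), real) + adv(D_fc(fake_fc), real)
              + lambda_cyc * (l1(G_sc2fc(fake_sc), fc) + l1(G_fc2sc(fake_fc), sc)))
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()

    # Discriminators: separate real connectomes from detached fakes.
    loss_D = (adv(D_sc(sc), real) + adv(D_sc(fake_sc.detach()), fake)
              + adv(D_fc(fc), real) + adv(D_fc(fake_fc.detach()), fake))
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()
    return loss_G.item(), loss_D.item()

# Usage with random stand-in data (real FC/SC matrices would come from the dataset):
fc_batch = torch.randn(8, N * N)
sc_batch = torch.randn(8, N * N)
print(train_step(fc_batch, sc_batch))
```

After training, the synthetic SC (or FC) matrices could be stacked with the original matrices as additional views for the downstream classifier; how the lab's CNN consumes these views is likewise not specified in this record.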