Please use this identifier to cite or link to this item:
|Title:||Building predictive models combining structural and functional connectome data via multi-view Graph Neural Networks|
|Authors:||Debdeep Mukherjee|
|Keywords:||Engineering::Computer science and engineering|
|Issue Date:||2022|
|Publisher:||Nanyang Technological University|
|Source:||Debdeep Mukherjee (2022). Building predictive models combining structural and functional connectome data via multi-view Graph Neural Networks. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/158154|
|Abstract:||This project investigates how multi-view geometric deep learning on structural and functional connectome data of the human brain can be leveraged to build accurate and robust predictive models of an individual’s attributes and cognitive abilities. The human brain can be mapped as structural connectivity (SC) data, the white matter connections measured using Diffusion Tensor Imaging (DTI), and functional connectivity (FC) data, the corresponding activation of brain regions measured using functional Magnetic Resonance Imaging (fMRI). After neuroimaging, the SC and FC data are processed into connectivity matrices based on predefined regions of interest (ROIs), yielding n×n matrices for n ROIs. Geometric deep learning can be applied to these graph data, and such connectivity matrices have been used to predict age and gender characteristics, which our project replicates. However, for more complex attributes such as cognitive abilities, more robust predictive models are being researched to gain insight into how SC-FC coupling relationships exist. The literature shows that this coupling is not a simple one-to-one mapping that can be generalized across individuals; there exist unique elements in the coupling of SC and FC within individuals.
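To illustrate the connectivity-matrix representation described in the abstract, the sketch below derives an n×n FC matrix from ROI time series via Pearson correlation. The ROI count, time-series length, and data are hypothetical, chosen only for the example; the report's actual preprocessing pipeline may differ.

```python
import numpy as np

# Hypothetical example: n ROIs, each with a BOLD time series of length t.
# A functional connectivity (FC) matrix is the n x n matrix of pairwise
# Pearson correlations between ROI time series.
rng = np.random.default_rng(0)
n_rois, n_timepoints = 5, 100
timeseries = rng.standard_normal((n_rois, n_timepoints))

fc = np.corrcoef(timeseries)          # n x n connectivity matrix

assert fc.shape == (n_rois, n_rois)
assert np.allclose(fc, fc.T)          # symmetric
assert np.allclose(np.diag(fc), 1.0)  # each ROI correlates perfectly with itself
```

The resulting symmetric n×n matrix can then be treated as a weighted graph adjacency over the ROIs, which is what makes geometric deep learning applicable.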
Hence, this project aims to explore the relationship between SC and FC, specifically how geometric deep learning models can learn feature representations from multimodal SC-FC data and thereby make better predictions of attributes such as cognitive abilities. Through these models we learn novel insights about the brain, such as population-level generalities in the structural and functional characteristics of the human brain across individuals, as well as the relationship between them, referred to as SC-FC coupling. We explored DNN, Connectome CNN, BrainNetCNN, GCN, GraphSAGE and GAT models on both unimodal and multimodal data to predict five target variables: age, gender and three fluid intelligence components. Analysis of the results yielded novel insights into how the brain works and into SC-FC coupling. We learnt that age and gender are best predicted with unimodal SC; gender in particular, since there are distinct structural differences between male and female brains while their functional characteristics are more similar, so gender is predicted best from SC data. We also successfully built customized multimodal graph deep learning models that learn the feature representations and relationships between SC and FC to a certain extent, and show that unique SC-FC coupling exists within individuals and contributes new information to the representation learning, improving prediction of cognitive abilities over the purely unimodal models with which most current studies report results.
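A minimal sketch of one way the multimodal GCN idea above can be realized: apply one graph convolution per view (SC and FC) and concatenate the view embeddings, a simple late-fusion scheme. The weights, dimensions, and fusion choice here are illustrative assumptions, not the report's actual architecture.

```python
import numpy as np

def normalize_adjacency(a):
    """Symmetric GCN normalization: D^-1/2 (A + I) D^-1/2."""
    a_hat = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(a_norm, h, w):
    """One graph convolution: ReLU(A_hat @ H @ W)."""
    return np.maximum(a_norm @ h @ w, 0.0)

# Hypothetical symmetric SC and FC connectomes over n ROIs; each ROI's node
# features are its row of the connectivity matrix.
rng = np.random.default_rng(1)
n, hidden = 4, 8
sc = rng.random((n, n)); sc = (sc + sc.T) / 2
fc = rng.random((n, n)); fc = (fc + fc.T) / 2
w_sc = rng.standard_normal((n, hidden))
w_fc = rng.standard_normal((n, hidden))

# One GCN layer per view, then a mean readout and concatenation (late fusion).
h_sc = gcn_layer(normalize_adjacency(sc), sc, w_sc)
h_fc = gcn_layer(normalize_adjacency(fc), fc, w_fc)
embedding = np.concatenate([h_sc.mean(axis=0), h_fc.mean(axis=0)])

assert embedding.shape == (2 * hidden,)
```

The fused embedding would then feed a prediction head for targets such as age, gender or fluid intelligence; learning the fusion jointly is what lets such models exploit SC-FC coupling rather than either view alone.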
Moreover, the multimodal models were able to capture enough of the generalities in the coupling across individuals in the population that models such as Multimodal GCN and Multimodal BrainNetCNN produced the highest prediction accuracy on unseen test data for cognitive abilities, improving upon the prediction results of the reference paper by 40.7%.
|URI:||https://hdl.handle.net/10356/158154|
|Fulltext Permission:||restricted|
|Fulltext Availability:||With Fulltext|
|Appears in Collections:||SCSE Student Reports (FYP/IA/PA/PI)|
Updated on Dec 1, 2022
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.