Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/152248
|Title:||3D multi-modality medical image registration with GAN-based synthetic image augmentation|
|Authors:||Guo, Zhiwei|
|Keywords:||Engineering::Computer science and engineering|
|Issue Date:||2021|
|Publisher:||Nanyang Technological University|
|Source:||Guo, Z. (2021). 3D multi-modality medical image registration with GAN-based synthetic image augmentation. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/152248|
|Abstract:||Medical image registration is a crucial yet challenging task in medical image analysis and processing. We propose an end-to-end unsupervised multi-modality deformable image registration network with CycleGAN augmentation, designed for intra-subject brain MRI-CT registration. The registration network operates in two stages. First, it generates a synthetic CT (sCT) image from the corresponding MRI image using CycleGAN. Second, feeding the sCT and the original CT into an unsupervised registration network yields the deformation field that aligns the sCT with the CT image; the same deformation field is then applied to the MRI image to align it with the CT image. Unlike state-of-the-art unsupervised registration methods, which calculate only a single mono-modality image similarity between the CT and the warped sCT, we also include a mutual-information multi-modality image similarity between the CT and the warped MRI. We demonstrate that the proposed method outperforms both a current state-of-the-art registration algorithm and existing registration tools. Because of an impending Technical Disclosure, some details of the methodology have been omitted from this report.|
|URI:||https://hdl.handle.net/10356/152248|
|Fulltext Permission:||embargo_restricted_20230724|
|Fulltext Availability:||With Fulltext|
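To make the two-stage pipeline in the abstract concrete, the sketch below shows one plausible way to wire up the registration stage in PyTorch. Since the report withholds methodological details pending the Technical Disclosure, everything here is an illustrative assumption: the names `synthesis_generator` and `reg_net`, the MSE mono-modality term, the Parzen-window mutual-information estimator, the smoothness penalty, and all loss weights are placeholders, not the thesis's actual implementation.

```python
# Illustrative sketch only: module names, loss choices, and weights are
# assumptions; the thesis's actual method is withheld in the report.
import torch
import torch.nn.functional as F

def warp(volume, flow):
    """Warp a 3D volume (N, C, D, H, W) with a dense displacement field
    `flow` (N, 3, D, H, W); channels assumed in (x, y, z) voxel units."""
    n, _, d, h, w = volume.shape
    # Identity sampling grid in grid_sample's normalized [-1, 1] coordinates.
    theta = torch.eye(3, 4, device=volume.device).unsqueeze(0).repeat(n, 1, 1)
    grid = F.affine_grid(theta, volume.shape, align_corners=True)  # (N, D, H, W, 3)
    # Convert voxel displacements to normalized coordinates.
    scale = torch.tensor([2.0 / (w - 1), 2.0 / (h - 1), 2.0 / (d - 1)],
                         device=volume.device)
    disp = flow.permute(0, 2, 3, 4, 1) * scale
    return F.grid_sample(volume, grid + disp, align_corners=True)

def mutual_information(x, y, bins=32, sigma=0.05):
    """Differentiable MI via Parzen-window (Gaussian) soft histograms.
    Assumes intensities are normalized to [0, 1]."""
    centers = torch.linspace(0.0, 1.0, bins, device=x.device)
    x = x.reshape(x.shape[0], -1, 1)                      # (N, V, 1)
    y = y.reshape(y.shape[0], -1, 1)
    wx = torch.exp(-0.5 * ((x - centers) / sigma) ** 2)   # (N, V, B)
    wy = torch.exp(-0.5 * ((y - centers) / sigma) ** 2)
    wx = wx / (wx.sum(dim=-1, keepdim=True) + 1e-8)
    wy = wy / (wy.sum(dim=-1, keepdim=True) + 1e-8)
    pxy = torch.bmm(wx.transpose(1, 2), wy) / x.shape[1]  # joint histogram (N, B, B)
    px = pxy.sum(dim=2, keepdim=True)                     # marginal over y
    py = pxy.sum(dim=1, keepdim=True)                     # marginal over x
    mi = pxy * torch.log((pxy + 1e-8) / (px * py + 1e-8))
    return mi.sum(dim=(1, 2)).mean()

def smoothness(flow):
    """L1 penalty on spatial gradients of the displacement field."""
    dz = (flow[:, :, 1:] - flow[:, :, :-1]).abs().mean()
    dy = (flow[:, :, :, 1:] - flow[:, :, :, :-1]).abs().mean()
    dx = (flow[:, :, :, :, 1:] - flow[:, :, :, :, :-1]).abs().mean()
    return dx + dy + dz

def registration_loss(ct, mri, synthesis_generator, reg_net, lam=0.5, gamma=1.0):
    """Stage 1: synthesize an sCT from the MRI; stage 2: register sCT to CT
    and reuse the field for the MRI. `lam`, `gamma` are placeholder weights."""
    s_ct = synthesis_generator(mri)                  # CycleGAN MRI -> sCT
    flow = reg_net(torch.cat([s_ct, ct], dim=1))     # predict (N, 3, D, H, W) field
    warped_sct = warp(s_ct, flow)
    warped_mri = warp(mri, flow)
    mono = F.mse_loss(warped_sct, ct)                # mono-modality term (CT, warped sCT)
    multi = -mutual_information(ct, warped_mri)      # MI term (CT, warped MRI), maximized
    return mono + lam * multi + gamma * smoothness(flow)
```

The design point the abstract emphasizes is that synthesizing an sCT reduces the hard multi-modality problem to a mono-modality one, while the added mutual-information term keeps the real MRI intensities in the optimization rather than relying on the synthesis alone.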
|Appears in Collections:||SCSE Student Reports (FYP/IA/PA/PI)|
Files in This Item:
|4.3 MB||Adobe PDF||Under embargo until Jul 24, 2023|