Title: Deep anomaly detection for medical images
Authors: Li, Xintong
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2020
Publisher: Nanyang Technological University
Project: B3139-191
Abstract: Deep learning methods have been demonstrated to be effective in many medical tasks. However, these methods normally require large amounts of labeled data, which are costly to obtain, especially for disease screening, where abnormal/diseased data are more difficult to collect. The main purpose of this Final Year Project is to investigate transfer learning-based anomaly detection methods that require no, or only a small number of, labeled data instances. In this work, two anomaly detection methods are proposed. A semi-supervised joint learning method (SmSupJL) trains a feature extractor with two losses, namely a cross-entropy loss and an intra-class variance loss, on a small labeled training set. By applying these two losses, the feature extractor learns features that discriminate between normal and abnormal samples while keeping the normal samples compact in feature space. To further reduce the number of labeled instances needed, we propose an unsupervised domain adaptation method (UnSupDA), which requires no labeled instances from the target domain, only a small number of labeled instances from the source domain, to detect anomalies. Self-supervised tasks are used to align the source and target domains and thus transfer the knowledge learned from the source domain to the target domain. Experimental results on the Kaggle Diabetic Retinopathy (DR) dataset demonstrate that these methods either surpass or are comparable to the current state of the art.
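The SmSupJL objective described above combines a cross-entropy term with an intra-class variance term that pulls normal-class features toward their centroid. A minimal NumPy sketch of such a joint loss is shown below; the function names, the weighting factor `lam`, and the choice of class 0 as the normal class are illustrative assumptions, not the report's exact formulation.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true class.
    # probs: (N, C) predicted class probabilities; labels: (N,) class indices.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def intra_class_variance(features, labels, normal_class=0):
    # Mean squared distance of normal-class features from their centroid,
    # encouraging a compact cluster of normal samples.
    normal = features[labels == normal_class]
    center = normal.mean(axis=0)
    return np.mean(np.sum((normal - center) ** 2, axis=1))

def joint_loss(probs, features, labels, lam=0.1):
    # Weighted sum of the two losses; lam is an assumed hyperparameter.
    return cross_entropy(probs, labels) + lam * intra_class_variance(features, labels)

# Toy example: three samples, two classes (0 = normal, 1 = abnormal).
probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.8, 0.2]])
features = np.array([[1.0, 1.0], [5.0, 5.0], [1.1, 0.9]])
labels = np.array([0, 1, 0])
loss = joint_loss(probs, features, labels)
```

Minimizing the second term shrinks the normal cluster, so abnormal samples stand out by their distance from the normal centroid at test time.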
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Restricted Access, 4.98 MB, Adobe PDF

Page view(s)

Updated on Dec 3, 2022




Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.