Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/156777
Title: Web application for ICH subtype classification from CT head scans
Authors: Lim, Candy
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Engineering::Computer science and engineering::Computer applications::Life and medical sciences
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Lim, C. (2022). Web application for ICH subtype classification from CT head scans. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/156777
Abstract: Traumatic brain injury (TBI) can cause intracranial hemorrhage (ICH), which requires urgent diagnosis and treatment to improve patient outcomes. Machine learning techniques can help clinicians classify brain lesions and diagnose TBI from radiological scans. The objective of this project was to build a computer-aided diagnosis (CAD) system that assists in the detection, screening, and diagnosis of ICH in routine clinical practice. Several CNN models were trained with TensorFlow, Keras, and OpenCV on sliced CT images from the 2019 RSNA Brain CT Hemorrhage Challenge dataset. The models were evaluated, and the MobileNetV1 architecture was found to give the best performance. The CAD system, built with the Django and ReactJS frameworks, applies this deep learning solution to medical image analysis.
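The classification setup described in the abstract can be sketched with Keras. This is a minimal illustration, not the author's actual model: it assumes a MobileNetV1 backbone with six sigmoid outputs, matching the 2019 RSNA challenge's multi-label scheme (five ICH subtypes plus an "any" label); input size, optimizer, and loss are placeholder choices.

```python
import numpy as np
import tensorflow as tf

# RSNA 2019 labels: epidural, intraparenchymal, intraventricular,
# subarachnoid, subdural, and "any" (hemorrhage present at all).
NUM_LABELS = 6

def build_model(input_shape=(224, 224, 3)):
    # MobileNetV1 backbone; weights=None gives a randomly initialised
    # network so no pretrained weights need to be downloaded.
    return tf.keras.applications.MobileNet(
        input_shape=input_shape,
        weights=None,
        classes=NUM_LABELS,
        # Sigmoid rather than softmax: one CT slice can show
        # several hemorrhage subtypes at once (multi-label).
        classifier_activation="sigmoid",
    )

model = build_model()
model.compile(optimizer="adam", loss="binary_crossentropy")

# One dummy 224x224 RGB slice -> six independent probabilities.
probs = model.predict(np.zeros((1, 224, 224, 3), dtype="float32"), verbose=0)
```

In practice the Django backend would feed preprocessed CT slices (windowed and resized with OpenCV) into such a model and return the per-subtype probabilities to the ReactJS frontend.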
URI: https://hdl.handle.net/10356/156777
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Candy Lim - Final.pdf (Restricted Access, 2.8 MB, Adobe PDF)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.