Title: Medical image segmentation and visualization
Authors: Thilaga Govindasamy.
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Medical electronics
Issue Date: 2009
Abstract: Segmentation of regions of interest from medical images, such as brain tumors and anatomical structures, has always been a great challenge. In clinical practice today, such segmentation is essential for many applications, including surgical planning, disease diagnosis, and visualization of anatomical structures. Manual segmentation is a tedious and painstaking way of extracting the required regions of interest: radiologists or trained clinical staff must go through every image (one patient can have over 100 Magnetic Resonance (MR) images in a single clinical examination) and segment the regions of interest for evaluation and analysis. This is time-consuming and costly in terms of the man-hours needed to complete the task. The volume of data generated by medical imaging grows tremendously from day to day, and manual segmentation can no longer deliver fast, accurate, repeatable, and reproducible results. High segmentation accuracy is essential when human lives are at stake. This project explores the Level Set method for segmenting parts of the posterior fossa from magnetic resonance images for hemifacial spasm analysis, using prior knowledge such as the root exit zone of the facial nerve and pixel intensity. The algorithm can also visualize the segmented parts in 3D and compute their volume for analysis purposes.
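The thesis itself is restricted, so the following is only a minimal 2D sketch of the general idea the abstract describes: a level-set function evolved under an intensity-based speed term, with the segmented region's size computed by counting pixels (the 2D analogue of the volume computation mentioned above). The seed point, speed term, and all parameter values are illustrative assumptions, not the author's actual method.

```python
import numpy as np

def level_set_segment(image, seed, target_intensity,
                      n_iter=200, dt=0.5, tol=0.15):
    """Illustrative level-set evolution (not the thesis algorithm).

    The front expands where pixel intensity is within `tol` of
    `target_intensity` and shrinks elsewhere, following the PDE
    phi_t + F * |grad(phi)| = 0 in its simplest explicit form.
    """
    ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    # Initialize phi as signed distance to a small circle around the seed
    # (negative inside, positive outside).
    phi = np.sqrt((ys - seed[0]) ** 2 + (xs - seed[1]) ** 2) - 3.0
    # Speed F: positive (expand) where intensity matches the target,
    # negative (shrink) otherwise -- a stand-in for the prior knowledge
    # (pixel intensity) mentioned in the abstract.
    speed = tol - np.abs(image - target_intensity)
    for _ in range(n_iter):
        gy, gx = np.gradient(phi)
        grad_norm = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
        # phi_t = -F * |grad(phi)|; a proper implementation would use an
        # upwind scheme and periodic reinitialization, omitted for brevity.
        phi = phi - dt * speed * grad_norm
    return phi < 0  # boolean mask of the segmented region

# Synthetic example: a bright disk on a dark background.
img = np.zeros((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2] = 1.0

mask = level_set_segment(img, seed=(32, 32), target_intensity=1.0)
area = mask.sum()  # pixel count; in 3D, voxel count * voxel volume
```

On this toy image the evolving front grows outward from the seed and stops at the disk boundary, so `area` is close to the disk's true area; the same pixel/voxel-counting step is how a segmented volume would be reported for analysis.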
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Restricted Access (4.03 MB, Adobe PDF)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.