Title: Dynamic texture classification
Authors: Chockalingam, Muthiah
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Engineering::Computer science and engineering::Computing methodologies::Pattern recognition
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Chockalingam, M. (2022). Dynamic texture classification. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE21-0308
Abstract: Video object segmentation is a subfield of computer vision that has recently been garnering attention for its applicability to many real-life problems. One such problem is the effective navigation of ships: video object segmentation can be used to segment the target object (i.e., water), identifying clear regions of water for ships to sail through while avoiding possible obstacles on their course. The biggest challenge in building such a model is that water has a very tricky appearance. Its appearance and texture are highly dynamic, changing rapidly, sometimes even between consecutive frames, due to factors such as illumination, ripples, and waves. The aim of this project is therefore to analyse existing video object segmentation methods, identify the one most suitable for dynamic texture classification (i.e., tracking an object with a dynamic appearance), and test it on a dataset representative of the waters that ships sail through. Results are then recorded to evaluate the effectiveness of that model with regard to the objective of aiding a ship's navigation.
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Final Report_Muthiah.pdf (Restricted Access)
Size: 9.14 MB
Format: Adobe PDF

Updated on May 16, 2022


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.