Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/78679
Full metadata record
DC Field | Value | Language
dc.contributor.author | Hu, Minghui | -
dc.date.accessioned | 2019-06-25T07:27:01Z | -
dc.date.available | 2019-06-25T07:27:01Z | -
dc.date.issued | 2019 | -
dc.identifier.uri | http://hdl.handle.net/10356/78679 | -
dc.description.abstract | Deep learning has been widely used in recent years, and the application of machine learning techniques to computer vision tasks such as object detection and tracking has become a central part of the field. This project introduces a recurrent convolutional neural network method for object detection and tracking in dim-light underwater conditions. The convolutional neural network model is inspired by a spatially supervised regression method, while the recurrent part is based on bounding-box recurrence methods such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU). The project presents a systematic analysis of the YOLO and LSTM models, and the method is evaluated on an annotated dataset consisting of a large amount of underwater footage. Experimental results show that the recurrent convolutional neural network can perform object detection, and that the regression method meets the tracking requirements and improves the stability of the system. (An illustrative sketch of such a detection-plus-recurrence pipeline is given after the metadata record below.) | en_US
dc.format.extent | 61 p. | en_US
dc.language.iso | en | en_US
dc.subject | Engineering::Electrical and electronic engineering | en_US
dc.title | Target tracking using deep neural network (DNN) | en_US
dc.type | Thesis | -
dc.contributor.supervisor | Ponnuthurai N. Suganthan | en_US
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.description.degree | Master of Science (Power Engineering) | en_US
item.grantfulltext | restricted | -
item.fulltext | With Fulltext | -
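The abstract describes coupling a single-frame detector (YOLO) with a recurrent bounding-box regressor (LSTM or GRU) to track objects across video frames. The following is a minimal, hypothetical PyTorch sketch of that general detection-plus-recurrence idea; it is not the thesis's actual implementation, and all module names, layer sizes, and the (x, y, w, h) box encoding are assumptions made for illustration only.

# Illustrative sketch only: an LSTM that refines per-frame bounding boxes
# produced by a single-frame detector (e.g. a YOLO-style network).
# All layer sizes and names are assumptions, not taken from the thesis.
import torch
import torch.nn as nn


class BoxTrackerLSTM(nn.Module):
    """Regresses a smoothed bounding box for each frame from a sequence of
    per-frame detector outputs (visual feature vector plus raw box)."""

    def __init__(self, feat_dim=512, hidden_dim=128):
        super().__init__()
        # Per-frame input: detector feature vector concatenated with its
        # raw box prediction (x, y, w, h).
        self.lstm = nn.LSTM(feat_dim + 4, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 4)  # refined (x, y, w, h)

    def forward(self, feats, raw_boxes):
        # feats:     (batch, time, feat_dim)
        # raw_boxes: (batch, time, 4)
        x = torch.cat([feats, raw_boxes], dim=-1)
        hidden, _ = self.lstm(x)
        return self.head(hidden)  # (batch, time, 4)


if __name__ == "__main__":
    # Dummy sequence of 10 frames from a single video clip.
    feats = torch.randn(1, 10, 512)
    boxes = torch.rand(1, 10, 4)
    tracker = BoxTrackerLSTM()
    print(tracker(feats, boxes).shape)  # torch.Size([1, 10, 4])

Feeding both the detector's feature vector and its raw box into the recurrence lets the model smooth detections over time, which is the kind of stability gain the abstract attributes to combining the detector with an LSTM.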
Appears in Collections: EEE Theses
Files in This Item:
File | Description | Size | Format
HU - Dissertation Report.pdf | Restricted Access | 2.06 MB | Adobe PDF