Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/149851
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ong, Zong You | en_US
dc.date.accessioned | 2021-06-09T09:25:05Z | -
dc.date.available | 2021-06-09T09:25:05Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Ong, Z. Y. (2021). Deep learning based action recognition. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/149851 | en_US
dc.identifier.uri | https://hdl.handle.net/10356/149851 | -
dc.description.abstract | In recent times, surveillance has become increasingly prevalent: it is used to aid law enforcement, ensure home safety, and allow caregivers to track the movements of the elderly at home. Detecting and recognizing human actions through surveillance and recording devices is informative, as it lets the user monitor the actions of a target subject at any point in time. One of the more successful approaches to recognizing human actions uses skeleton data, which provides an excellent representation for describing human activities. Recently, the Graph Convolutional Network (GCN) has become a popular topic, and graph-based skeleton action recognition has improved significantly in performance, because a GCN can learn neighbourhood information and the interactions between body joints directly on the skeleton graph. GCNs therefore have exceptional potential for skeleton action recognition. This report consists of three parts. First, several ablation studies are conducted on the baseline model, the Two-Stream Adaptive Graph Convolutional Network (2s-AGCN), to better understand its performance. Second, several experiments explore modifications to the baseline model to improve its performance when benchmarked on skeleton datasets such as NTU RGB+D 60. Lastly, several models incorporating new ideas are built and tested against the baseline model. | en_US
dc.language.iso | en | en_US
dc.publisher | Nanyang Technological University | en_US
dc.relation | A3108-201 | en_US
dc.subject | Engineering::Electrical and electronic engineering | en_US
dc.title | Deep learning based action recognition | en_US
dc.type | Final Year Project (FYP) | en_US
dc.contributor.supervisor | Alex Chichung Kot | en_US
dc.contributor.supervisor | Er Meng Hwa | en_US
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.description.degree | Bachelor of Engineering (Electrical and Electronic Engineering) | en_US
dc.contributor.research | Rapid-Rich Object Search (ROSE) Lab | en_US
dc.contributor.supervisoremail | EACKOT@ntu.edu.sg, EMHER@ntu.edu.sg | en_US
item.grantfulltext | restricted | -
item.fulltext | With Fulltext | -
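
The abstract above turns on one technical idea: a graph convolution that aggregates features across skeleton joints, which 2s-AGCN makes adaptive by learning part of the adjacency. As a rough illustration only (none of this code is from the report itself), the PyTorch sketch below shows a single spatial graph-convolution layer over an NTU RGB+D-style 25-joint skeleton; the class name, the identity placeholder adjacency, and all shapes are assumptions made for the sketch.

```python
# Minimal sketch of a spatial graph convolution in the style of 2s-AGCN.
# Illustrative only: the fixed adjacency here is an identity placeholder
# (a real model would use the normalised skeleton bone graph), and the
# learnable offset B stands in for the "adaptive" adjacency of 2s-AGCN.
import torch
import torch.nn as nn

class SpatialGraphConv(nn.Module):  # hypothetical name, not from the report
    def __init__(self, in_channels, out_channels, num_joints=25):
        super().__init__()
        # Fixed skeleton adjacency A (placeholder) plus a learnable offset B
        # that lets the network adapt the joint-to-joint connections.
        self.register_buffer("A", torch.eye(num_joints))
        self.B = nn.Parameter(torch.zeros(num_joints, num_joints))
        self.proj = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x):
        # x: (batch, channels, frames, joints)
        adj = self.A + self.B                      # adaptive adjacency
        x = torch.einsum("nctv,vw->nctw", x, adj)  # aggregate neighbour joints
        return self.proj(x)                        # 1x1 conv mixes channels

# Usage: 4 clips, 3-channel joint coordinates, 64 frames, 25 joints.
layer = SpatialGraphConv(3, 64)
out = layer(torch.randn(4, 3, 64, 25))
print(out.shape)  # torch.Size([4, 64, 64, 25])
```

In the full 2s-AGCN, a second "bone" stream runs the same kind of layers on joint-difference vectors and the two streams' scores are fused; the sketch shows only the joint-stream building block.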
Appears in Collections: EEE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
FYP_final report_edit.pdf (Restricted Access) | | 1.99 MB | Adobe PDF

Page view(s): 116 (updated on Jul 1, 2022)
Download(s): 8 (updated on Jul 1, 2022)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.