Title: Skeleton based action recognition with graph convolutional networks
Authors: Han, Jia Yi
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Han, J. Y. (2021). Skeleton based action recognition with graph convolutional networks. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: A3320-202
Abstract: Human Action Recognition (HAR) has become increasingly popular in computer vision research in recent years. Its goal is to understand human actions and motion from captured data, using deep learning methods, so that each action or motion can be classified with a specific label. It can be used in a broad range of computer vision applications, such as security surveillance, autonomous navigation systems, and human safety operations. Several data modalities are available for human action recognition, such as skeleton, depth, infrared, and radar data, and the skeleton modality in particular has grown in popularity. Following recent advancements in information-capture methods and the increased number of data sensors, the vast amount of available data demands greater processing capacity, and this growth in data size leads to a much higher computational cost when classifying actions. To address this, many deep learning methods have been developed to reduce computational cost without sacrificing performance and accuracy. With recent advancements in modelling techniques, newer graph convolutional network (GCN) methods are used to model and classify human actions from skeleton data. In this project, Shift-GCN and MS-G3D are the main models used to classify human actions.
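The core idea behind GCN-based skeleton models is to treat joints as graph nodes and bones as edges, and to aggregate each joint's features from its neighbours. The sketch below illustrates the generic symmetrically normalized graph-convolution propagation rule, ReLU(D^{-1/2}(A+I)D^{-1/2} X W), on a toy skeleton; the joint layout and dimensions are hypothetical, and this is not the specific Shift-GCN or MS-G3D formulation from the report.

```python
import numpy as np

# Hypothetical 5-joint skeleton (illustrative only, not the NTU RGB+D layout):
# 0=spine, 1=head, 2=left hand, 3=right hand, 4=hip.
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
num_joints = 5

# Adjacency with self-loops: A_hat = A + I
A = np.eye(num_joints)
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Symmetric normalization: D^{-1/2} A_hat D^{-1/2}
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt

# One graph-convolution layer: aggregate neighbour features, then project.
# X holds per-joint 3D coordinates; W is a learned weight matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((num_joints, 3))   # (joints, in_channels)
W = rng.standard_normal((3, 8))            # 3 -> 8 feature channels
H = np.maximum(A_norm @ X @ W, 0.0)        # ReLU(A_norm X W)
print(H.shape)  # (5, 8)
```

Stacking such layers (with temporal convolutions across frames) is the usual pattern in skeleton-based GCN classifiers; Shift-GCN replaces the spatial aggregation with shift operations to cut computational cost, and MS-G3D uses multi-scale graph aggregation.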
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Report Jia Yi Han.pdf (Restricted Access, 1.23 MB, Adobe PDF)

Page view(s): Updated on May 15, 2022




Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.