Title: Image analytics using Artificial Intelligence (Human Action Recognition in industrial workplace)
Authors: Xiong, Jingxi
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Xiong, J. (2022). Image analytics using Artificial Intelligence (Human Action Recognition in industrial workplace). Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: A3301-211
Abstract: The recent pandemic has reinforced the concept of Industry 4.0 in traditional manufacturing industries, and one of the rising needs is to understand operators' actions to increase productivity and efficiency. Compared to traditional video action recognition tasks, video action recognition in an industrial setting involves unusual objects, complex backgrounds and more inter-human interactions, which creates an obvious gap from current public action recognition datasets. In this project, an industrial dataset is constructed to fill this gap in action recognition for industrial workplaces. Furthermore, two methods are proposed to improve the performance of the existing TSN and TSM models on human action recognition tasks by introducing the concepts of grouping and the split-attention mechanism to enhance model efficiency and accuracy. Various experimental settings and data augmentation methods are also reviewed in detail to explore the optimum setting for action recognition tasks. The model performance has improved from 78.80% to 90.51% on the UCF101 dataset, and has reached 84.22% accuracy on the self-constructed industrial dataset.
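The TSM model mentioned in the abstract is built around a temporal shift operation, in which a fraction of the channels is shifted along the time axis so that a 2D backbone can mix information across frames. The sketch below is a minimal illustration of that idea (simplified to a `(frames, channels)` array with spatial dimensions omitted); it is not the thesis's exact implementation, and the function name and `shift_div` parameter are assumptions for illustration.

```python
import numpy as np

def temporal_shift(x, shift_div=4):
    """Shift a fraction of channels along the temporal axis (TSM-style).

    x: array of shape (T, C) -- T frames, C channels per frame
       (spatial dimensions omitted for brevity).
    shift_div: 1/shift_div of the channels shift toward the past,
       another 1/shift_div shift toward the future; the rest stay put.
    """
    t, c = x.shape
    fold = c // shift_div
    out = np.zeros_like(x)
    out[:-1, :fold] = x[1:, :fold]                # shift backward in time
    out[1:, fold:2 * fold] = x[:-1, fold:2 * fold]  # shift forward in time
    out[:, 2 * fold:] = x[:, 2 * fold:]           # remaining channels unchanged
    return out
```

Because the shift only moves existing activations, it adds temporal modelling at essentially zero extra computation, which is why TSM keeps 2D-CNN efficiency while gaining temporal reasoning.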
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_Report_Xiong Jingxi_Video_Action_Recognition vF.pdf (Restricted Access, 1.51 MB, Adobe PDF)

Updated on Aug 7, 2022

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.