Please use this identifier to cite or link to this item:
Title: Image analytics using Artificial Intelligence (Human Action Recognition in industrial workplace)
Authors: Xiong, Jingxi
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Xiong, J. (2022). Image analytics using Artificial Intelligence (Human Action Recognition in industrial workplace). Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/157848
Project: A3301-211
Abstract: The recent pandemic has reinforced the concept of Industry 4.0 in traditional manufacturing industries, and one rising need is to understand operators' actions in order to increase productivity and efficiency. Compared with traditional video action recognition tasks, video action recognition in an industrial setting involves unusual objects, complex backgrounds and more inter-human interactions, creating an obvious gap from current public action recognition datasets. In this project, an industry-based dataset is constructed to fill this blank in action recognition for industrial workplaces. Furthermore, two methods are proposed to improve the performance of the existing TSN and TSM models on human action recognition by introducing grouping and a split-attention mechanism, enhancing model efficiency and accuracy. Various experimental settings and data augmentation methods are also reviewed in detail to explore the optimum configuration for action recognition. Model performance improved from 78.80% to 90.51% on the UCF101 dataset and reached 84.22% accuracy on the self-constructed industrial dataset.
URI: https://hdl.handle.net/10356/157848
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
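The abstract above builds on TSM (Temporal Shift Module), whose core operation shifts a fraction of feature channels along the temporal axis so a 2D backbone can exchange information between neighbouring frames at zero extra FLOPs. A minimal NumPy sketch of that shift follows; the `shift_div` ratio and zero-padding at the sequence ends follow the published TSM design, and this is an illustrative sketch, not the report's exact implementation:

```python
import numpy as np

def temporal_shift(frames, shift_div=8):
    """TSM-style temporal shift over per-frame features.

    frames: array of shape (T, C) -- T sampled frames, C channels.
    One 1/shift_div slice of channels is shifted forward in time,
    a second slice is shifted backward, and the remaining channels
    are left untouched. Vacated positions are zero-padded.
    """
    T, C = frames.shape
    fold = C // shift_div
    out = np.zeros_like(frames)
    out[1:, :fold] = frames[:-1, :fold]                   # shift forward in time
    out[:-1, fold:2 * fold] = frames[1:, fold:2 * fold]   # shift backward in time
    out[:, 2 * fold:] = frames[:, 2 * fold:]              # untouched channels
    return out

# 3 frames, 8 channels: with shift_div=8, one channel moves each way.
x = np.arange(24, dtype=float).reshape(3, 8)
y = temporal_shift(x)
```

After the shift, each frame's feature vector mixes channels from its past and future neighbours, which is what lets frame-level 2D convolutions model temporal structure.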
Appears in Collections: EEE Student Reports (FYP/IA/PA/PI)
Files in This Item:
FYP_Report_Xiong Jingxi_Video_Action_Recognition vF.pdf (1.51 MB, Adobe PDF)
Updated on Jun 26, 2022
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.