Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/39687
Title: Recognition of human actions
Authors: Xiao, Xu
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Control engineering
Issue Date: 2010
Abstract: Human action recognition aims to identify the actions that humans perform in daily life. Because computers offer high computing and analysis performance, such recognition benefits many areas, such as security surveillance and interactive human-machine communication. One of the fastest and most robust recognition approaches is temporal-template matching based on the Motion History Image (MHI). This project focuses on the use of the Motion History Image (MHI) and the Motion Energy Image (MEI). The traditional Hu moments matching approach was first tried to verify its recognition ability. Based on it, a modification was implemented: instead of computing Hu moments from the whole MHI image, the MHI of the human body is segmented into four body parts - head, arm, torso and leg. The Hu moments approach was carried out for each body part, and recognition was realized by combining the results of the four parts. Although this modification improved recognition performance, the increased computational complexity and some subjective operations were undesirable. Finally, a Local Binary Pattern (LBP) approach based on MHI images, originally used in face recognition, was proposed and adopted to reduce the computational complexity and improve recognition performance.
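The abstract's MHI-plus-Hu-moments pipeline can be illustrated with a minimal NumPy sketch. This is not the author's implementation: the decay step, the frame-difference motion mask, the threshold, and the synthetic moving-square "action" are all assumptions chosen for illustration; only the MHI update rule (moving pixels set to a duration tau, others decayed toward zero) and Hu's seven invariants follow the standard definitions the abstract refers to.

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau):
    """One MHI step: moving pixels are set to tau, the rest decay by 1 (floor 0)."""
    decayed = np.maximum(mhi - 1.0, 0.0)
    return np.where(motion_mask, float(tau), decayed)

def hu_moments(img):
    """Seven Hu moment invariants of a grayscale image (pure NumPy)."""
    img = img.astype(np.float64)
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    m00 = img.sum()
    xb = (x * img).sum() / m00          # centroid
    yb = (y * img).sum() / m00
    def mu(p, q):                        # central moment
        return (((x - xb) ** p) * ((y - yb) ** q) * img).sum()
    def eta(p, q):                       # normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2.0)
    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    e30, e03, e21, e12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = e20 + e02
    h2 = (e20 - e02) ** 2 + 4 * e11 ** 2
    h3 = (e30 - 3 * e12) ** 2 + (3 * e21 - e03) ** 2
    h4 = (e30 + e12) ** 2 + (e21 + e03) ** 2
    h5 = ((e30 - 3 * e12) * (e30 + e12) * ((e30 + e12) ** 2 - 3 * (e21 + e03) ** 2)
          + (3 * e21 - e03) * (e21 + e03) * (3 * (e30 + e12) ** 2 - (e21 + e03) ** 2))
    h6 = ((e20 - e02) * ((e30 + e12) ** 2 - (e21 + e03) ** 2)
          + 4 * e11 * (e30 + e12) * (e21 + e03))
    h7 = ((3 * e21 - e03) * (e30 + e12) * ((e30 + e12) ** 2 - 3 * (e21 + e03) ** 2)
          - (e30 - 3 * e12) * (e21 + e03) * (3 * (e30 + e12) ** 2 - (e21 + e03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])

# Synthetic "action": a bright square moving right across 10 frames (illustration only).
tau = 10
frames = [np.zeros((64, 64)) for _ in range(10)]
for t, f in enumerate(frames):
    f[20:30, 5 + 3 * t: 15 + 3 * t] = 1.0

mhi = np.zeros((64, 64))
prev = frames[0]
for f in frames[1:]:
    motion = np.abs(f - prev) > 0.5     # simple frame-difference motion mask
    mhi = update_mhi(mhi, motion, tau)
    prev = f

features = hu_moments(mhi)              # temporal-template descriptor for matching
```

Because Hu invariants are built from central moments, the resulting descriptor is invariant to where in the frame the action occurs, which is what makes them usable for the template matching the abstract describes. The project's body-part variant would apply `hu_moments` to four sub-regions of the MHI instead of the whole image.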
URI: http://hdl.handle.net/10356/39687
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: EA4021-091.pdf (Restricted Access, 6.12 MB, Adobe PDF)

Page view(s): 254 (updated on Dec 1, 2020)
Download(s): 28 (updated on Dec 1, 2020)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.