Title: Human action capturing and classification
Authors: Feng, Zhou
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2006
Source: Feng, Z. (2006). Human action capturing and classification. Master’s thesis, Nanyang Technological University, Singapore.
Abstract: In this report, a vision-based framework is proposed for learning and inferring occupant activities at different levels, ranging from short-interval movements, to intermediate-level events, to long-term activities. Our research focuses on a combined tracking-classification framework for the unsupervised classification of human action. An initial comparative study was done to evaluate several existing foreground segmentation methods that employ background modeling. We then propose our own probabilistic foreground-background segmentation method to extract human-centric reference frames. Based on these human-centric reference frames, a principled analysis of the correspondence problem leads to a novel probabilistic action representation, the correspondence-ambiguous feature histogram array (CAFHA), which is robust to variations across similar actions. CAFHA is shown to be effective in unsupervised action classification and quasi-real-time action inference. A novel feature selection method is proposed to select the optimal features for the CAFHA representation, so that the best discrimination between different action clusters can be found via unsupervised spectral clustering. Finally, a number of potential future directions are proposed, targeting further improvements to our framework and new research methods required to recognize human activity at longer temporal scales.
DOI: 10.32657/10356/13599
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Theses

Files in This Item:
File: FengZhou06.pdf
Description: Main report
Size: 1.89 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.