Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/138068
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ramanathan, Manoj | en_US
dc.contributor.author | Yau, Wei-Yun | en_US
dc.contributor.author | Teoh, Eam Khwang | en_US
dc.contributor.author | Thalmann, Nadia Magnenat | en_US
dc.date.accessioned | 2020-04-23T04:31:11Z | -
dc.date.available | 2020-04-23T04:31:11Z | -
dc.date.issued | 2018 | -
dc.identifier.citation | Ramanathan, M., Yau, W.-Y., Teoh, E. K., & Thalmann, N. M. (2017). Pose-invariant kinematic features for action recognition. Proceedings of the 2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 292-299. doi:10.1109/APSIPA.2017.8282038 | en_US
dc.identifier.isbn | 9781538615430 | -
dc.identifier.uri | https://hdl.handle.net/10356/138068 | -
dc.description.abstract | Recognition of actions from videos is a difficult task due to several factors such as dynamic backgrounds, occlusion and pose variations. To tackle the pose-variation problem, we propose a simple method based on a novel set of pose-invariant kinematic features encoded in a human-body-centric space. The proposed framework begins with detection of the neck point, which serves as the origin of the body-centric space. We propose a deep-learning-based classifier to detect the neck point from the output of a fully connected network layer. With the help of the detected neck point, a propagation mechanism is proposed to divide the foreground region into head, torso and leg grids. The motion observed in each of these body-part grids is represented using a set of pose-invariant kinematic features. These features represent the motion of the foreground (body) region with respect to the detected neck point's motion, encoded by view in a human-body-centric space. Based on these features, pose-invariant action recognition can be achieved. Because a body-centric space is used, actions performed in non-upright postures can also be handled easily. To test the framework's effectiveness on non-upright postures, a new dataset is introduced with 8 non-upright actions performed by 35 subjects in 3 different views. Experiments have been conducted on benchmark datasets and the newly proposed non-upright action dataset to identify limitations and gain insights into the proposed framework. | en_US
dc.description.sponsorship | NRF (Natl Research Foundation, S’pore) | en_US
dc.description.sponsorship | ASTAR (Agency for Sci., Tech. and Research, S’pore) | en_US
dc.language.iso | en | en_US
dc.rights | © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/APSIPA.2017.8282038 | en_US
dc.subject | Engineering::Computer science and engineering | en_US
dc.subject | Engineering::Electrical and electronic engineering | en_US
dc.title | Pose-invariant kinematic features for action recognition | en_US
dc.type | Conference Paper | en
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.contributor.conference | 2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC) | en_US
dc.contributor.research | Institute for Media Innovation (IMI) | en_US
dc.identifier.doi | 10.1109/APSIPA.2017.8282038 | -
dc.description.version | Accepted version | en_US
dc.identifier.spage | 292 | en_US
dc.identifier.epage | 299 | en_US
dc.subject.keywords | Action Recognition | en_US
dc.subject.keywords | Pose-invariance | en_US
dc.citation.conferencelocation | Kuala Lumpur, Malaysia | en_US
item.grantfulltext | open | -
item.fulltext | With Fulltext | -
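
The abstract describes encoding motion relative to a detected neck point so that the resulting kinematic features become body-centric. As a rough illustration only (not the paper's actual implementation — the function name, the use of 2-D point tracks per body-part grid, and the velocity feature are all assumptions), the relative-motion idea could be sketched as:

```python
import numpy as np

def body_centric_velocities(part_tracks, neck_track, fps=30.0):
    """Illustrative sketch: express each body-part trajectory relative
    to the detected neck point, then take frame-to-frame velocities.

    part_tracks: dict mapping a part name ('head', 'torso', 'legs')
                 to a (T, 2) array of image coordinates per frame.
    neck_track:  (T, 2) array of neck-point coordinates per frame.
    Returns a dict of (T-1, 2) body-centric velocity features.
    """
    features = {}
    for part, track in part_tracks.items():
        centric = track - neck_track          # shift origin to the neck point
        vel = np.diff(centric, axis=0) * fps  # per-frame velocity (pixels/s)
        features[part] = vel
    return features

# Toy example: a leg point that moves rigidly with the body drifts in
# image coordinates, but relative to the neck it stays fixed, so its
# body-centric velocity is zero regardless of global body motion.
T = 5
neck = np.stack([np.arange(T, dtype=float), np.zeros(T)], axis=1)
legs = neck + np.array([0.0, 50.0])  # rigid offset below the neck
feats = body_centric_velocities({'legs': legs}, neck)
```

The toy example shows why such features help with pose and position variation: global translation of the whole body cancels out once motion is measured against the neck point.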
Appears in Collections:IMI Conference Papers
Files in This Item:
File | Description | Size | Format
18_Pose-Invariant Kinematic Features for Action Recognition.pdf |  | 682.09 kB | Adobe PDF

SCOPUS™ Citations: 1 (updated on Sep 3, 2020)
Page view(s): 38 (updated on Dec 1, 2020)
Download(s): 5 (updated on Dec 1, 2020)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.