Title: Gait-based gender classification in unconstrained environments
Authors: Huang, T. S.
Lu, Jiwen
Wang, Gang
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2012
Abstract: This paper investigates the problem of gait-based gender classification in unconstrained environments. Unlike existing human gait analysis and recognition methods, which assume that humans walk in controlled environments, we aim to recognize human gender from uncontrolled gaits, in which people can walk freely and the walking direction may be time-varying within a single video clip. Given each gait sequence collected in an uncontrolled manner, we first obtain human silhouettes using background subtraction and cluster them into several groups. For each group, we compute the averaged gait image (AGI) as a feature. Then, we learn a distance metric under which the intraclass variations are minimized and the interclass variations are maximized simultaneously, so that more discriminative information can be exploited for gender classification. Experimental results on our dataset demonstrate the efficacy of the proposed method.
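The AGI feature named in the abstract is, in the gait-analysis literature, the per-pixel mean of the aligned binary silhouettes within a group. A minimal sketch of that averaging step (the function name and toy frames below are illustrative, not from the paper):

```python
import numpy as np

def averaged_gait_image(silhouettes):
    """Compute an AGI by averaging a group of aligned binary
    silhouette frames pixel-wise (illustrative sketch)."""
    stack = np.stack(silhouettes).astype(np.float64)  # shape: (num_frames, H, W)
    return stack.mean(axis=0)                         # shape: (H, W)

# Toy example: three identical 4x3 binary "silhouettes".
frame = np.array([[0, 1, 0],
                  [1, 1, 1],
                  [1, 1, 1],
                  [0, 1, 0]])
agi = averaged_gait_image([frame, frame, frame])
```

Each AGI pixel then holds the fraction of frames in which that pixel was foreground, giving a compact per-group descriptor that the learned distance metric can compare.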
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Conference Papers

