Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/174147
Full metadata record
DC Field | Value | Language
dc.contributor.author | Tao, Ying | en_US
dc.contributor.author | Chang, Chip Hong | en_US
dc.contributor.author | Saïgh, Sylvain | en_US
dc.contributor.author | Gao, Shengyu | en_US
dc.date.accessioned | 2024-06-06T02:47:07Z | -
dc.date.available | 2024-06-06T02:47:07Z | -
dc.date.issued | 2024 | -
dc.identifier.citation | Tao, Y., Chang, C. H., Saïgh, S. & Gao, S. (2024). GaitSpike: event-based gait recognition with spiking neural network. 2024 IEEE 6th International Conference on Artificial Intelligence Circuits and Systems (AICAS), 357-361. https://dx.doi.org/10.1109/AICAS59952.2024.10595896 | en_US
dc.identifier.isbn | 979-8-3503-8363-8 | -
dc.identifier.issn | 2834-9857 | -
dc.identifier.uri | https://hdl.handle.net/10356/174147 | -
dc.description.abstract | Existing vision-based gait recognition systems are mostly designed around video footage acquired with RGB cameras. The appearance-, model- and motion-based techniques commonly used by these systems require silhouette segmentation, skeletal contour detection and optical flow patterns, respectively, for feature extraction. The extracted features are typically classified by convolutional neural networks to identify the person. These preprocessing steps are computationally intensive due to the high redundancy of visual data, and their accuracy can be affected by background variations and non-locomotion-related external factors. In this paper, we propose GaitSpike, a new gait recognition system that synergistically combines the advantages of a sparsity-driven event-based camera and a spiking neural network (SNN) for gait biometric classification. Specifically, a domain-specific locomotion-invariant representation (LIR) is proposed to replace the static Cartesian coordinates of the event camera's raw address event representation with a floating polar coordinate system referenced to the motion center. The aim is to extract the relative motion information between the motion center and other human body parts, minimizing the intra-class variance and promoting the learning of inter-class features by the SNN. Experiments on a real event-based gait dataset, DVS128-Gait, and a synthetic event-based gait dataset, EV-CASIA-B, show that GaitSpike achieves accuracy comparable to RGB-camera-based gait recognition systems with higher computational efficiency, and outperforms state-of-the-art event-camera-based gait recognition systems. | en_US
dc.description.sponsorship | Ministry of Education (MOE) | en_US
dc.description.sponsorship | National Research Foundation (NRF) | en_US
dc.language.iso | en | en_US
dc.relation | MOE-T2EP50220-0003 | en_US
dc.rights | © 2024 IEEE. All rights reserved. This article may be downloaded for personal use only. Any other use requires prior permission of the copyright holder. The Version of Record is available online at http://doi.org/10.1109/AICAS59952.2024.10595896. | en_US
dc.subject | Engineering | en_US
dc.title | GaitSpike: event-based gait recognition with spiking neural network | en_US
dc.type | Conference Paper | en
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.contributor.conference | 2024 IEEE 6th International Conference on Artificial Intelligence Circuits and Systems (AICAS) | en_US
dc.contributor.organization | CNRS@CREATE | en_US
dc.identifier.doi | 10.1109/AICAS59952.2024.10595896 | -
dc.description.version | Submitted/Accepted version | en_US
dc.identifier.url | https://aicas2024.org/ | -
dc.identifier.spage | 357 | en_US
dc.identifier.epage | 361 | en_US
dc.subject.keywords | Gait recognition | en_US
dc.subject.keywords | RGB cameras | en_US
dc.citation.conferencelocation | Abu Dhabi, UAE | en_US
dc.description.acknowledgement | This research is supported in part by the National Research Foundation, Prime Minister’s Office, Singapore under its Campus for Research Excellence and Technological Enterprise (CREATE) DesCartes programme, and in part by the Ministry of Education, Singapore, under its AcRF Tier 2 Award MOE-T2EP50220-0003. | en_US
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
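Note: the dc.description.abstract above describes a locomotion-invariant representation (LIR) that replaces the static Cartesian coordinates of the raw address event representation with polar coordinates referenced to a floating motion center. The following is only a minimal illustrative sketch of that idea, not the authors' implementation: the fixed-length time windows, the use of the event centroid as the motion center, and the function and array names are assumptions made here for illustration.

```python
import numpy as np

def to_locomotion_invariant(events, window_us=10_000):
    """Re-express DVS events in polar coordinates relative to a per-window
    motion center (illustrative sketch only).

    `events` is an (N, 4) array of (t, x, y, polarity) rows with t in
    microseconds. The window length and the centroid-as-motion-center
    choice are assumptions, not the paper's exact LIR definition.
    """
    t = events[:, 0]
    x = events[:, 1].astype(float)
    y = events[:, 2].astype(float)
    lir = np.empty((len(events), 2))

    # Group events into fixed-length temporal windows.
    window_ids = ((t - t.min()) // window_us).astype(int)
    for w in np.unique(window_ids):
        idx = window_ids == w
        # Approximate the motion center by the centroid of events in the window.
        cx, cy = x[idx].mean(), y[idx].mean()
        dx, dy = x[idx] - cx, y[idx] - cy
        # Radius and angle referenced to the motion center capture the relative
        # motion of body parts rather than their absolute image position.
        lir[idx, 0] = np.hypot(dx, dy)
        lir[idx, 1] = np.arctan2(dy, dx)
    return lir
```

Feeding such a relative representation to the SNN is what the abstract credits with reducing intra-class variance and easing the learning of inter-class gait features.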
Appears in Collections: EEE Conference Papers
Files in This Item:
File | Description | Size | Format
2024037707.pdf | - | 952.95 kB | Adobe PDF
