Please use this identifier to cite or link to this item:
Title: Efficient feedforward categorization of objects and human postures with address-event image sensors
Authors: Chen, Shoushun; Carrasco, Jose Antonio Perez
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2012
Source: Chen, S., Akselrod, P., Zhao, B., Carrasco, J. A. P., Linares-Barranco, B., & Culurciello, E. (2012). Efficient feedforward categorization of objects and human postures with address-event image sensors. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(2), 302-314.
Series/Report no.: IEEE Transactions on Pattern Analysis and Machine Intelligence
Abstract: This paper proposes an algorithm for the feedforward categorization of objects and, in particular, of human postures in real-time video sequences from address-event temporal-difference image sensors. The system employs an innovative combination of event-based hardware and bio-inspired software architecture. An event-based temporal-difference image sensor provides the input video sequences, while a software module extracts size- and position-invariant line features inspired by models of the primate visual cortex. The detected line features are organized into vectorial segments. After feature extraction, a modified line-segment Hausdorff-distance classifier, combined with on-the-fly clustering, performs size- and position-invariant categorization. The system achieves about a 90 percent average success rate in the categorization of human postures while using only a small number of training samples. Compared to state-of-the-art bio-inspired categorization methods, the proposed algorithm requires fewer hardware resources, reduces computational complexity by at least five times, and is an ideal candidate for hardware implementation with event-based circuits.
URI: https://hdl.handle.net/10356/99187
ISSN: 0162-8828
DOI: http://dx.doi.org/10.1109/TPAMI.2011.120
Rights: © 2012 IEEE
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections: EEE Journal Articles
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.