Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/170575
Title: A hybrid neuromorphic object tracking and classification framework for real-time systems
Authors: Ussa, Andres
Rajen, Chockalingam Senthil
Pulluri, Tarun
Singla, Deepak
Acharya, Jyotibdha
Chuanrong, Gideon Fu
Basu, Arindam
Ramesh, Bharath
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2023
Source: Ussa, A., Rajen, C. S., Pulluri, T., Singla, D., Acharya, J., Chuanrong, G. F., Basu, A. & Ramesh, B. (2023). A hybrid neuromorphic object tracking and classification framework for real-time systems. IEEE Transactions On Neural Networks and Learning Systems. https://dx.doi.org/10.1109/TNNLS.2023.3243679
Journal: IEEE Transactions on Neural Networks and Learning Systems
Abstract: Deep learning inference that must largely take place on the "edge" is a highly compute- and memory-intensive workload, making it intractable for low-power embedded platforms such as mobile nodes and remote security applications. To address this challenge, this article proposes a real-time, hybrid neuromorphic framework for object tracking and classification using event-based cameras, which possess desirable properties such as low power consumption (5-14 mW) and high dynamic range (120 dB). However, unlike traditional approaches that process events one by one, this work uses a mixed frame-and-event approach to obtain energy savings without sacrificing performance. Using a frame-based region proposal method based on the density of foreground events, a hardware-friendly object tracking scheme is implemented using the apparent object velocity while tackling occlusion scenarios. The frame-based object track input is converted back to spikes for TrueNorth (TN) classification via the energy-efficient deep network (EEDN) pipeline. Using originally collected datasets, we train the TN model on the hardware track outputs, instead of using ground-truth object locations as is commonly done, and demonstrate the ability of our system to handle practical surveillance scenarios. As an alternative tracker paradigm, we also propose a continuous-time tracker, implemented in C++, in which each event is processed individually, better exploiting the low latency and asynchronous nature of neuromorphic vision sensors. Subsequently, we extensively compare the proposed methodologies to state-of-the-art event-based and frame-based methods for object tracking and classification, and demonstrate the use case of our neuromorphic approach for real-time and embedded applications without sacrificing performance. Finally, we also showcase the efficacy of the proposed neuromorphic system compared with a standard RGB camera setup when both are simultaneously evaluated over several hours of traffic recordings.
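The abstract's frame-based region proposal step, which groups dense clusters of foreground events into candidate object boxes, can be sketched as follows. This is a minimal illustrative sketch only: the grid cell size, density threshold, and 4-connected merging rule are assumptions for illustration, not the parameters or exact method used in the paper.

```python
# Sketch of an event-density region proposal: accumulate events from
# one time slice into a coarse grid, keep cells whose event count
# exceeds a threshold, and merge adjacent dense cells into boxes.
# All parameters (cell=8, min_count=3) are illustrative assumptions.
from collections import deque

def propose_regions(events, width, height, cell=8, min_count=3):
    """events: iterable of (x, y) pixel coordinates from one time slice.
    Returns bounding boxes (x0, y0, x1, y1) in pixel units."""
    gw, gh = (width + cell - 1) // cell, (height + cell - 1) // cell
    counts = [[0] * gw for _ in range(gh)]
    for x, y in events:
        counts[y // cell][x // cell] += 1

    # Cells dense enough to be part of a foreground object.
    active = {(cx, cy) for cy in range(gh) for cx in range(gw)
              if counts[cy][cx] >= min_count}
    boxes = []
    while active:
        seed = active.pop()
        queue, comp = deque([seed]), [seed]
        while queue:  # 4-connected flood fill over dense cells
            cx, cy = queue.popleft()
            for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                           (cx, cy + 1), (cx, cy - 1)):
                if (nx, ny) in active:
                    active.remove((nx, ny))
                    queue.append((nx, ny))
                    comp.append((nx, ny))
        xs = [c[0] for c in comp]
        ys = [c[1] for c in comp]
        boxes.append((min(xs) * cell, min(ys) * cell,
                      (max(xs) + 1) * cell, (max(ys) + 1) * cell))
    return boxes
```

In the full system described above, boxes like these would feed the velocity-based tracker and, ultimately, the spike-converted classification stage.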
URI: https://hdl.handle.net/10356/170575
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2023.3243679
Schools: School of Electrical and Electronic Engineering 
Rights: © 2023 IEEE. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Journal Articles
