Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/59546
Title: A biologically inspired human posture recognition system
Authors: Zhao, Bo
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2014
Source: Zhao, B. (2014). A biologically inspired human posture recognition system. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: Human posture recognition is gaining increasing attention nowadays, especially in the areas of assisted living and elderly care. The aim of this thesis is to explore the design and implementation of a highly efficient, biologically plausible engine for the categorization of objects, with real-time human posture recognition as the envisaged application. Based on commercially available image sensors and powerful personal computers, an impressive series of systems has been reported for human posture recognition. However, the high complexity of these algorithms limits their use on low-cost and lightweight embedded platforms. In addition, the conventional image sensors employed in these systems also reduce energy efficiency, since their output contains a very high level of redundancy. Unlike conventional cameras, which have little computing capability, smart image sensors utilize novel focal-plane signal processing to improve computational efficiency. This thesis targets an innovative combination of a motion-detection smart image sensor and a bio-inspired event-driven processing architecture. A biologically inspired human posture recognition system is proposed. The system was first designed using a frame-based temporal difference image sensor, followed by an improved version based on a frame-free asynchronous Address-Event-Representation (AER) vision sensor. Using the frame-based temporal difference image sensor, a bio-inspired feedforward feature extraction approach was proposed. The feature extraction unit consists of a layer of simple cells (modeled by Gabor filters) and a layer of complex cells (featuring maximum operations). After feature extraction, each frame is represented as a set of vectorial line segments. A modified line segment Hausdorff-distance (LHD) classifier, combined with a clustering-based size and position calculation module, is then used for posture categorization.
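The simple-cell/complex-cell feature extraction stage described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis implementation: the filter parameters, pooling size, and four-orientation filter bank are illustrative assumptions, and the input stands in for a temporal-difference frame.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def gabor_kernel(size=9, theta=0.0, lam=4.0, sigma=2.0, gamma=0.5):
    """Gabor filter modeling a V1 simple cell tuned to orientation `theta`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def simple_cell_layer(img, thetas):
    """Convolve the input with one Gabor filter per orientation (simple cells)."""
    responses = []
    for theta in thetas:
        k = gabor_kernel(theta=theta)
        windows = sliding_window_view(img, k.shape)  # valid-mode convolution windows
        responses.append(np.abs(np.tensordot(windows, k, axes=([2, 3], [0, 1]))))
    return responses

def complex_cell_layer(responses, pool=4):
    """MAX pooling over local neighbourhoods (complex cells)."""
    pooled = []
    for r in responses:
        h, w = r.shape[0] // pool * pool, r.shape[1] // pool * pool
        r = r[:h, :w].reshape(h // pool, pool, w // pool, pool)
        pooled.append(r.max(axis=(1, 3)))
    return pooled

# Toy temporal-difference frame: a single vertical edge.
img = np.zeros((32, 32))
img[:, 16] = 1.0
c1 = complex_cell_layer(simple_cell_layer(img, thetas=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]))
```

The MAX operation in the pooling stage is what gives the complex cells their tolerance to small shifts and scale changes, which is why this cortex-like pipeline can work with only a small number of training samples.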
The system achieves an average success rate of about 90% in the categorization of human postures, while using only a small number of training samples. Using the frame-free AER temporal contrast vision sensor, an improved event-based feedforward categorization system was further proposed. A similar cortex-like feature extraction method based on convolution and maximum operations was used, with the introduction of an event-driven processing technique to fully utilize the power of AER sensors. An asynchronous time-domain motion symbol detector was further proposed to detect a burst of motion events and then trigger the classification. The feature spikes are classified by a spiking neural network (SNN), namely the tempotron. One appealing characteristic of this system is its fully event-driven processing: the input, the features, and the classifier are all based on address events (spikes). Experimental results on two datasets demonstrate the efficacy of the proposed system. The proposed system performs well with a single object or human posture in the scene. A number of challenging problems remain to be addressed, such as view invariance, multiple-object tracking, and adaptation to high-dynamic-range lighting conditions. Resolving these issues will undoubtedly lead to a very promising new generation of categorization systems.
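The tempotron classifier mentioned above is a leaky integrate-and-fire neuron that sums weighted postsynaptic potentials (PSPs) from incoming feature spikes and classifies by whether its membrane potential crosses a firing threshold. The sketch below shows only the decision rule under assumed, illustrative parameters (the time constants, threshold, and spike times are not taken from the thesis); training would additionally adjust the weights by a gradient update at the time of the membrane-potential maximum on misclassified patterns.

```python
import numpy as np

# Illustrative constants: membrane and synaptic time constants (ms) and threshold.
TAU, TAU_S, THRESHOLD = 10.0, 2.5, 1.0
# Normalization so a unit-weight PSP peaks at exactly 1.
T_PEAK = TAU * TAU_S / (TAU - TAU_S) * np.log(TAU / TAU_S)
V0 = 1.0 / (np.exp(-T_PEAK / TAU) - np.exp(-T_PEAK / TAU_S))

def membrane_potential(t, spike_trains, weights):
    """Weighted sum of PSP kernels from all afferent spikes arriving before t."""
    v = 0.0
    for w, spikes in zip(weights, spike_trains):
        for ti in spikes:
            if ti <= t:
                s = t - ti
                v += w * V0 * (np.exp(-s / TAU) - np.exp(-s / TAU_S))
    return v

def tempotron_decision(spike_trains, weights, t_grid):
    """Fire (classify as the target posture) iff V(t) ever crosses threshold."""
    return any(membrane_potential(t, spike_trains, weights) >= THRESHOLD
               for t in t_grid)

# One afferent spiking at t = 0 ms: a strong synapse fires, a weak one does not.
t_grid = np.linspace(0.0, 50.0, 501)
fires = tempotron_decision([[0.0]], [1.5], t_grid)
stays_quiet = tempotron_decision([[0.0]], [0.5], t_grid)
```

Because the decision depends only on discrete spike arrival times, this classifier fits naturally at the end of a fully event-driven AER pipeline: no frames are reconstructed at any stage.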
URI: http://hdl.handle.net/10356/59546
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
ZHAO BO_G0902277G.pdf — Thesis softcopy, 4.75 MB, Adobe PDF

Page view(s): 248 (checked on Jan 12, 2020)
Download(s): 125 (checked on Jan 12, 2020)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.