Title: Development of a motion based supporting model for people tracking
Authors: Zhang, Zheng
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2015
Abstract: In this report, a single-person tracking system based on salient parts is taken as the original model and improved by incorporating a velocity factor into the code; in this way, a motion-based supporting model for people tracking is constructed. The original model tracks targets across consecutive images by combining data association with salient-parts-based tracking. The tracking system comprises four stages: candidate initialization, candidate update, tracker initialization, and tracker update. In the original model, the relative velocity between the target and its salient parts is assumed to be zero; in this report, a relative velocity between the target and the salient parts is added to improve performance. The velocity factor is introduced once tracker initialization begins, and the information obtained from the candidate stages (candidate initialization and candidate update) supplies the initial velocity for the tracker stages. The velocity factor is represented by the difference in location from frame to frame and is updated accordingly. The improved model produces better results, although some problems remain. Recommendations on how to further improve the system are provided in the final section of the report for future research.
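The velocity update described in the abstract — the target's displacement between consecutive frames, used to predict where to look next — can be sketched as follows. This is a minimal illustration of the idea only, not the report's actual code; all function and variable names here are hypothetical.

```python
# Sketch of the abstract's velocity factor: velocity is the target's
# frame-to-frame displacement, and the predicted next location assumes
# the target keeps moving at that velocity. Names are hypothetical.

def update_velocity(prev_location, curr_location):
    """Velocity as the per-frame displacement (dx, dy)."""
    return (curr_location[0] - prev_location[0],
            curr_location[1] - prev_location[1])

def predict_location(curr_location, velocity):
    """Predicted location in the next frame under constant velocity."""
    return (curr_location[0] + velocity[0],
            curr_location[1] + velocity[1])

# Example: the target moved from (100, 50) to (104, 53) between frames.
v = update_velocity((100, 50), (104, 53))   # (4, 3)
p = predict_location((104, 53), v)          # (108, 56)
```

Under this constant-velocity assumption, the tracker's search for the salient parts can be centered on the predicted location rather than on the previous one, which is the kind of improvement the abstract attributes to the velocity factor.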
Schools: School of Electrical and Electronic Engineering 
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
Restricted Access (Adobe PDF, 3.6 MB)

Page view(s)

Updated on Nov 29, 2023


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.