Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/38998
Full metadata record
DC Field | Value | Language
dc.contributor.author | Li, Zheng. | en_US
dc.date.accessioned | 2010-05-21T03:39:25Z | -
dc.date.available | 2010-05-21T03:39:25Z | -
dc.date.copyright | 1997 | -
dc.date.issued | 1997 | -
dc.identifier.uri | http://hdl.handle.net/10356/38998 | -
dc.description.abstract | A new model-based tracking algorithm for real-time motion tracking is reported. In this experiment, a single calibrated camera and geometric models are used. Each tracking cycle carries out a computation between the projected 3D model and the image: the first stage matches the model to the corresponding image features and measures the error between them; the second stage estimates the model pose from those measurements. The matching process has two aspects: (1) feature extraction using local minimum energy, and (2) global matching of a known three-dimensional model against the projected features. A one-dimensional search strategy is adopted instead of a local window operation (see the illustrative sketch after this record). The feature energy, defined as the negative absolute value of the edge strength, fits the small-motion hypothesis. The algorithm reported here is robust to changes in lighting and background. | en_US
dc.format.extent | 116 p. | -
dc.language.iso | en | -
dc.rights | NANYANG TECHNOLOGICAL UNIVERSITY | en_US
dc.subject | DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation | -
dc.title | Real-time 3D model-based motion tracking | en_US
dc.type | Thesis | en_US
dc.contributor.supervisor | Wang, Han | en_US
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.description.degree | Master of Engineering | en_US
item.fulltext | With Fulltext | -
item.grantfulltext | restricted | -
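Illustrative sketch: the abstract describes a one-dimensional search for minimum-energy edge features along the projected model, with the feature energy defined as the negative absolute value of the edge strength. Since the full text is restricted, the Python sketch below is only a minimal illustration of that idea under assumed conventions; the function names, the gradient-based edge-strength measure, and the search-range parameter are hypothetical and are not taken from the thesis.

    # Hypothetical sketch of the 1D minimum-energy edge search; not the thesis code.
    import numpy as np

    def edge_strength(gray, pt):
        # Central-difference gradient magnitude at a (clamped) integer pixel.
        h, w = gray.shape
        x = int(np.clip(round(pt[0]), 1, w - 2))
        y = int(np.clip(round(pt[1]), 1, h - 2))
        gx = (float(gray[y, x + 1]) - float(gray[y, x - 1])) / 2.0
        gy = (float(gray[y + 1, x]) - float(gray[y - 1, x])) / 2.0
        return np.hypot(gx, gy)

    def search_along_normal(gray, sample_pt, normal, search_range=5):
        # 1D search along the edge normal: choose the offset that minimises the
        # feature energy E = -|edge strength| (i.e. the strongest nearby edge).
        best_offset, best_energy = 0.0, np.inf
        for k in range(-search_range, search_range + 1):
            candidate = sample_pt + k * normal
            energy = -edge_strength(gray, candidate)
            if energy < best_energy:
                best_energy, best_offset = energy, float(k)
        return best_offset

    def measure_edge_errors(gray, sample_pts, normals, search_range=5):
        # Perpendicular model-to-image residuals for projected edge sample points;
        # in a tracker these residuals would drive the pose-estimation stage.
        return np.array([search_along_normal(gray, p, n, search_range)
                         for p, n in zip(sample_pts, normals)])

Per the abstract, a full tracking cycle would follow this measurement stage with a pose-estimation step; that part is omitted from the sketch.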
Appears in Collections: EEE Theses
Files in This Item:
File | Description | Size | Format
EEETHESES_51.pdf | Restricted Access | 14.14 MB | Adobe PDF