Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/61366
Title: Understanding human interaction in RGB-D videos
Authors: Shi, Yuanyuan
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2014
Abstract: In this report, a human hand gesture recognition system is proposed. The system can understand both static and dynamic human hand gestures. So far, the system is able to recognize 9 static hand gestures (the numbers one to nine) and 1 dynamic hand gesture (the number ten). In the system implementation, a right-hand CyberGlove II is used to obtain accurate and stable hand joint information for static hand gesture recognition. Based on the results of the static classification, together with hand joint motion information from the Microsoft Kinect, dynamic hand gestures can be classified. In addition, an effective and fast human hand gesture recognition algorithm is proposed to manage the data from the sensors and produce classification results in real time. To verify the effectiveness of the system, a human hand gesture sample dataset containing 250 samples collected from 5 people of different body sizes was constructed. The testing results show that the algorithm is able to understand human hand gestures accurately and quickly.
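As a rough illustration only (not the report's actual implementation, whose details are not given here), a minimal sketch of such a two-stage pipeline might look like the following. The joint-angle templates, the 22-dimensional glove frame, the wrist-trajectory input, and the motion threshold are all hypothetical placeholders.

```python
# Illustrative two-stage pipeline (hypothetical data layout, not the report's code):
# stage 1 classifies a static gesture from glove joint angles,
# stage 2 uses wrist motion (e.g. from a depth sensor) to detect the dynamic gesture.
import numpy as np

# Hypothetical templates: one mean joint-angle vector per static gesture "1".."9".
STATIC_TEMPLATES = {
    str(n): np.random.default_rng(n).uniform(0.0, 1.5, size=22)  # 22 glove joint angles
    for n in range(1, 10)
}

def classify_static(joint_angles: np.ndarray) -> str:
    """Nearest-neighbour match of one joint-angle frame against the static templates."""
    dists = {label: np.linalg.norm(joint_angles - tpl)
             for label, tpl in STATIC_TEMPLATES.items()}
    return min(dists, key=dists.get)

def classify_dynamic(static_label: str, wrist_track: np.ndarray,
                     motion_threshold: float = 0.15) -> str:
    """If total wrist displacement over the window is large enough, interpret the
    gesture as the dynamic gesture 'ten'; otherwise keep the static label."""
    displacement = np.linalg.norm(np.diff(wrist_track, axis=0), axis=1).sum()
    return "ten" if displacement > motion_threshold else static_label

# Usage with fabricated sensor frames:
frame = np.random.default_rng(0).uniform(0.0, 1.5, size=22)                   # one glove reading
track = np.cumsum(np.random.default_rng(1).normal(0, 0.02, (30, 3)), axis=0)  # wrist path
print(classify_dynamic(classify_static(frame), track))
```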
URI: http://hdl.handle.net/10356/61366
Schools: School of Electrical and Electronic Engineering 
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_Report_ShiYuanyuan.pdf (Restricted Access)
Description: Main article
Size: 5.85 MB
Format: Adobe PDF



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.