Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/63820
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Cai, Xiao
dc.date.accessioned: 2015-05-19T05:46:19Z
dc.date.available: 2015-05-19T05:46:19Z
dc.date.copyright: 2015 (en_US)
dc.date.issued: 2015
dc.identifier.uri: http://hdl.handle.net/10356/63820
dc.description.abstract: In the field of human-machine interaction, advances in 3D cameras and computer vision have made the recognition of human hand gestures and motions a prominent research topic of growing significance for Natural User Interfaces (NUI). By detecting, tracking and recognizing hand gestures and motions, and in combination with graphical user interfaces, reliance on traditional input devices such as keyboards and mice could be reduced. However, the accuracy and speed of real-time gesture recognition remain bottlenecks in current development. This final year project, Extreme Learning Machines based Human Hand Sign Language Recognition, is therefore proposed to investigate real-time recognition across three broad categories of gestures, namely static gestures, dynamic gestures, and motion detection, each with respective applications. Previous studies and available algorithms are reviewed to define the gap between the performance of Extreme Learning Machines (ELM), proposed by Prof. Huang, and other kernel methods. A suitable development platform, techniques for feature extraction and refinement, and the application of ELM methods are selected for the research. Desktop, game, and PowerPoint slide control applications are developed based on the 3Gear Nimble hand tracking library. Three applications for static gesture recognition are presented: simultaneous detection of both hands, a music player triggering application, and a rock-paper-scissors game. The dynamic gesture recognition combines training and normal operation for four main gestures; the default Windows music player can also be triggered using two of these dynamic gestures. The motion detection part of the project recognizes handwritten digits from 0 to 9 based on a countdown system, although its performance is unsatisfactory. The performance of these three categories of gesture recognition applications is evaluated against support vector machines. Recommendations for future work, in terms of achieving real-time recognition and improving performance, are provided. (en_US) (An illustrative ELM sketch follows this metadata record.)
dc.format.extent: 98 p. (en_US)
dc.language.iso: en (en_US)
dc.rights: Nanyang Technological University
dc.subject: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Control engineering (en_US)
dc.title: Extreme learning machines based human hand sign language recognition (en_US)
dc.type: Final Year Project (FYP) (en_US)
dc.contributor.supervisor: Huang Guangbin (en_US)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.description.degree: Bachelor of Engineering (en_US)
item.fulltext: With Fulltext
item.grantfulltext: restricted
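
Illustrative sketch: the abstract above describes classifying hand gesture features with an Extreme Learning Machine (ELM) and comparing it with support vector machines. The Python sketch below shows the basic ELM idea only (random, untrained hidden layer plus closed-form output weights); it is not taken from the report, and the feature dimension, hidden layer size, class count, and synthetic data are all assumptions.

    import numpy as np

    # Minimal ELM sketch for multi-class gesture classification.
    # Sizes below are hypothetical, not the values used in the report.
    rng = np.random.default_rng(0)
    n_features, n_hidden, n_classes = 20, 100, 4

    # Hidden-layer weights and biases are random and never trained (the core ELM idea).
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)

    def hidden(X):
        # Sigmoid activation of the random feature mapping.
        return 1.0 / (1.0 + np.exp(-(X @ W + b)))

    def train(X, y, reg=1e-3):
        # Output weights solved in closed form (regularized least squares).
        H = hidden(X)                        # N x n_hidden
        T = np.eye(n_classes)[y]             # one-hot targets, N x n_classes
        return np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)

    def predict(X, beta):
        return np.argmax(hidden(X) @ beta, axis=1)

    # Tiny synthetic demo; random "features" stand in for hand-tracking data.
    X_train = rng.standard_normal((200, n_features))
    y_train = rng.integers(0, n_classes, size=200)
    beta = train(X_train, y_train)
    print(predict(X_train[:5], beta))

Because only the output weights are solved, in a single linear-algebra step, ELM training avoids the iterative optimization required by the kernel methods the abstract compares against.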
Appears in Collections: EEE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File: CaiXiao FYP Report (U1122976G).pdf (Restricted Access)
Size: 2.37 MB
Format: Adobe PDF

Page view(s): 164 (updated on Jan 25, 2021)
Download(s): 12 (updated on Jan 25, 2021)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.