Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/66915
Title: Kinect-based automatic sign language recognition
Authors: Lin, Zi Ying
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Issue Date: 2016
Abstract: As technology advances, so does its popularity as a teaching tool, and it can be envisioned that technology will play an integral role in education in the future. This project aims to integrate the use of technology with sign language to help improve and promote communication not only amongst the hearing-impaired, but also with their friends and family. The project uses the depth sensor on the Microsoft Kinect to recognize gestures from the American Sign Language (ASL). The project is a prototype that demonstrates the concept of learning with technology by allowing users to practice ASL hand gestures through an educational game. The depth sensor in the Microsoft Kinect allows the project to detect and identify the hand and 5 individual fingers. The concept of gesture recognition is essential to future developments of this project and serves as a basis for them, as this project only scratches the surface of the concept's potential. This report details the development process of the project, the detection of the hand and fingers, and the recognition of gestures.
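
Illustrative note (not part of the original record): the report's own pipeline is not described here, but a common way to detect a hand and count extended fingers from a Kinect-style depth frame is to threshold the depth image and analyse convexity defects of the hand contour. The sketch below, in Python with NumPy and OpenCV, is a minimal assumed example; the depth band, defect-depth cutoff, and the function name count_fingers are hypothetical and chosen only for illustration.

    # Minimal sketch, assuming a Kinect-style depth frame supplied as a NumPy
    # array of millimetre values. Thresholds and names are hypothetical; this
    # is not the implementation from the report.
    import numpy as np
    import cv2

    def count_fingers(depth_mm: np.ndarray, near: int = 500, far: int = 800) -> int:
        """Rough heuristic: segment the hand by depth, count extended fingers."""
        # Keep only pixels inside the assumed hand depth band (in millimetres).
        mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255

        # Take the largest contour in the mask as the hand.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return 0
        hand = max(contours, key=cv2.contourArea)

        # Convexity defects between hull points roughly correspond to the
        # valleys between extended fingers.
        hull = cv2.convexHull(hand, returnPoints=False)
        if hull is None or len(hull) < 4:
            return 0
        defects = cv2.convexityDefects(hand, hull)
        if defects is None:
            return 0

        # Count sufficiently deep valleys; fingers ~= deep valleys + 1,
        # capped at 5 (a crude rule of thumb, not a learned classifier).
        deep = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20.0)
        return min(deep + 1, 5)

A gesture recognizer could then map per-frame finger counts and fingertip positions to ASL signs, but that step depends on the specific classification approach, which this record does not detail.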
URI: http://hdl.handle.net/10356/66915
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: SCE15-0076_LinZiYing.pdf
Description: Restricted Access
Size: 6.05 MB
Format: Adobe PDF

