Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/160679
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Lee, C. K. M. [en_US]
dc.contributor.author: Ng, Kam K. H. [en_US]
dc.contributor.author: Chen, Chun-Hsien [en_US]
dc.contributor.author: Lau, H. C. W. [en_US]
dc.contributor.author: Chung, S. Y. [en_US]
dc.contributor.author: Tsoi, Tiffany [en_US]
dc.date.accessioned: 2022-08-01T01:35:12Z
dc.date.available: 2022-08-01T01:35:12Z
dc.date.issued: 2021
dc.identifier.citation: Lee, C. K. M., Ng, K. K. H., Chen, C., Lau, H. C. W., Chung, S. Y. & Tsoi, T. (2021). American sign language recognition and training method with recurrent neural network. Expert Systems With Applications, 167, 114403. https://dx.doi.org/10.1016/j.eswa.2020.114403 [en_US]
dc.identifier.issn: 0957-4174 [en_US]
dc.identifier.uri: https://hdl.handle.net/10356/160679
dc.description.abstract: Although American Sign Language (ASL) has gained recognition in American society, few ASL applications have been developed for educational purposes, and those with real-time sign recognition systems are also lacking. The Leap Motion Controller enables real-time, accurate recognition of ASL signs, offering an opportunity to design a learning application with an embedded real-time sign recognition system that seeks to improve the effectiveness of ASL learning. This project proposes an ASL learning application prototype: a whack-a-mole game with an embedded real-time sign recognition system. Since the ASL alphabet contains both static and dynamic signs (J, Z), a Long Short-Term Memory recurrent neural network combined with a k-nearest-neighbour method is adopted as the classification method because it can handle sequences of input. Characteristics such as sphere radius, angles between fingers, and distances between finger positions are extracted as input for the classification model. The model is trained with 2,600 samples, 100 for each letter of the alphabet. Experimental results show that recognition of the 26 ASL letters achieves an average accuracy of 99.44%, and 91.82% under 5-fold cross-validation, using the Leap Motion Controller. [en_US]
dc.language.iso: en [en_US]
dc.relation.ispartof: Expert Systems with Applications [en_US]
dc.rights: © 2020 Elsevier Ltd. All rights reserved. [en_US]
dc.subject: Engineering::Mechanical engineering [en_US]
dc.title: American sign language recognition and training method with recurrent neural network [en_US]
dc.type: Journal Article [en]
dc.contributor.school: School of Mechanical and Aerospace Engineering [en_US]
dc.identifier.doi: 10.1016/j.eswa.2020.114403
dc.identifier.scopus: 2-s2.0-85097552443
dc.identifier.volume: 167 [en_US]
dc.identifier.spage: 114403 [en_US]
dc.subject.keywords: American Sign Language [en_US]
dc.subject.keywords: Leap Motion Controller [en_US]
item.fulltext: No Fulltext
item.grantfulltext: none
Appears in Collections:MAE Journal Articles
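The abstract describes extracting geometric characteristics (angles between fingers, distances between finger positions) from Leap Motion Controller frames and classifying sign sequences, using an LSTM combined with k-nearest-neighbour. As an illustrative sketch only, not the paper's actual model, the following shows how per-frame distance features might feed a plain k-nearest-neighbour classifier over sequences; the frame representation (fingertip coordinates) and all function names are hypothetical.

```python
import math

# Hypothetical frame representation: a list of five fingertip positions,
# each an (x, y, z) tuple, standing in for the kinds of characteristics
# the paper extracts from the Leap Motion Controller.

def finger_angle(a, b):
    """Angle in radians between two finger vectors (one of the feature
    types named in the abstract)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def fingertip_distance(p, q):
    """Euclidean distance between two points (also used as the
    distance metric between feature vectors below)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, q)))

def frame_features(fingertips):
    """Flatten one frame into distances between adjacent fingertips."""
    return [fingertip_distance(fingertips[i], fingertips[i + 1])
            for i in range(len(fingertips) - 1)]

def knn_classify(sample_seq, training, k=3):
    """Plain k-nearest-neighbour over per-sequence feature vectors.
    `training` is a list of (label, sequence) pairs; each sequence is a
    list of frames. Sequences are summarised by averaging features over
    frames, a deliberate simplification of sequence handling."""
    def seq_vector(seq):
        cols = list(zip(*[frame_features(f) for f in seq]))
        return [sum(c) / len(c) for c in cols]
    sv = seq_vector(sample_seq)
    nearest = sorted(
        training,
        key=lambda item: fingertip_distance(seq_vector(item[1]), sv),
    )[:k]
    labels = [label for label, _ in nearest]
    return max(set(labels), key=labels.count)  # majority vote
```

Averaging features over frames discards the temporal ordering that distinguishes dynamic signs such as J and Z, which is precisely why the paper adopts an LSTM for sequence handling rather than a summary statistic like this.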

SCOPUS™ Citations: 51 (updated on Nov 27, 2023)

Web of Science™ Citations: 27 (updated on Oct 25, 2023)

Page view(s): 70 (updated on Nov 30, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.