Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/73996
Title: Watch-based hand activity recognition through machine learning
Authors: Zhao, Qingmei
Keywords: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Pattern recognition
Issue Date: 2018
Abstract: To discover a novel and innovative way to interact with a smartwatch beyond its tiny touchscreen, this Final Year Project (FYP) focuses on recognizing the hand activity of people drawing the digits 0-9 in the air while wearing a smartwatch. A 2-stage hand activity recognition system is proposed that uses machine learning techniques to classify the different hand activities from time-series signals recorded by the sensors embedded in the smartwatch. To evaluate the performance of the recognition system, hand activity data was collected from 92 different people, and critical features were engineered from the sensor data to identify activities. The adopted 2-stage recognition system, built with deep convolutional neural network (CNN) models, achieves an accuracy of 94.3%. The strong performance of the CNN-based system has been verified by experiments training different machine learning models on the same set of features. Lastly, a hands-free standalone Android Watch app was developed that loads the pre-trained models and demonstrates hand activity recognition.
URI: http://hdl.handle.net/10356/73996
Schools: School of Computer Science and Engineering
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
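The abstract mentions engineering features from windowed sensor signals before classification, but does not specify which features were used. A minimal sketch of this kind of preprocessing, assuming per-window statistical features (mean, standard deviation, energy) over multi-axis accelerometer data; the window size and feature set here are illustrative assumptions, not taken from the report:

```python
import numpy as np

def window_features(signal: np.ndarray, window: int = 50) -> np.ndarray:
    """Split a (T, C) multi-channel sensor signal into fixed-size windows
    and compute per-channel mean, std, and mean energy for each window.

    Returns an (n_windows, 3 * C) feature matrix suitable as classifier input.
    """
    n = signal.shape[0] // window          # number of complete windows
    trimmed = signal[: n * window].reshape(n, window, -1)
    mean = trimmed.mean(axis=1)            # per-channel mean per window
    std = trimmed.std(axis=1)              # per-channel std per window
    energy = (trimmed ** 2).sum(axis=1) / window  # mean squared amplitude
    return np.concatenate([mean, std, energy], axis=1)

# Example: 200 samples of 3-axis accelerometer data -> 4 windows x 9 features
feats = window_features(np.random.randn(200, 3))
print(feats.shape)  # (4, 9)
```

Features like these could feed either the CNN models or the baseline machine learning models the report compares against.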
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:

File | Description | Size | Format
---|---|---|---
FYP_Report_fffffff.pdf (Restricted Access) | | 3.6 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.