Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/70233
Full metadata record
DC Field [Language]: Value
dc.contributor.author: Candinegara, Edwin
dc.date.accessioned: 2017-04-17T08:51:24Z
dc.date.available: 2017-04-17T08:51:24Z
dc.date.issued: 2017
dc.identifier.uri: http://hdl.handle.net/10356/70233
dc.description.abstract [en_US]: Human activity recognition technology has received increasing attention from researchers in recent years. This technology has a wide range of possible applications, such as healthcare, security, and sports. As such, many researchers have developed activity recognition systems that collect data from various types of devices, such as specially made sensors, video surveillance, and smartphones. With the recent introduction of the smartwatch, some researchers have started collecting sensory data from both smartphones and smartwatches due to their non-intrusive nature and high penetration rate. In this project, a robust human activity recognition framework is proposed which can detect not only basic activities (sitting, standing, running, etc.) but also hand-based activities (writing, typing, reading, etc.). The new framework is named Robust Activity Recognition using Smartphone and Smartwatch (RARSS). The framework is extensively tested on datasets collected from 15 subjects, using two testing techniques: 5-fold cross validation and Leave-One-Person-Out (LOPO) testing. Furthermore, the features extracted in RARSS are compared with two feature sets proposed by other researchers. Five main findings are presented in this report: (i) sensory data collected from the smartwatch provides additional information for better distinguishing hand-based activities; (ii) combining sensory data from the smartphone and the smartwatch improves the models' performance; (iii) sensory data collected from the barometer and gyroscope provides additional useful information for differentiating certain activities; (iv) the RARSS features are empirically shown to be better than the two benchmark feature sets; and (v) most importantly, a technique called data mean deduction, applied to the collected sensory data, can significantly reduce the differences embedded in sensory data collected for the same activity from different subjects. It is shown that using the mean-deducted data together with the original sensory data can significantly improve the performance of the activity recognition models. A real-time human activity recognition system is also developed in this project. The system is built on the RARSS framework so that it can accurately predict human activities in real time. It provides a web application that displays the latest predicted activity together with an activity history list, and the web application applies a smoothing function to the activity history list to cater for activity transitions.
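The full report is under restricted access, so the abstract above is the only technical description available on this page. Purely as a hedged illustration of two ideas it mentions, the sketch below shows one plausible reading of "data mean deduction" (per-subject mean subtraction, with the mean-deducted channels kept alongside the originals) and of LOPO testing (each fold holds out one subject). Everything in it, including the column names, the synthetic data, and the random-forest classifier, is an assumption for illustration and is not taken from the RARSS report.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def mean_deduct(df, sensor_cols, subject_col="subject"):
    # Subtract each subject's per-channel mean (one reading of "data mean deduction"),
    # keeping the original channels and adding "<col>_md" columns, since the abstract
    # reports using the mean-deducted data together with the original sensory data.
    out = df.copy()
    per_subject_mean = df.groupby(subject_col)[sensor_cols].transform("mean")
    for col in sensor_cols:
        out[col + "_md"] = df[col] - per_subject_mean[col]
    return out

# Toy usage with synthetic data standing in for smartphone/smartwatch readings.
rng = np.random.default_rng(0)
n = 600
data = pd.DataFrame({
    "subject": rng.integers(0, 15, n),   # 15 subjects, as in the abstract
    "acc_x": rng.normal(0.0, 1.0, n),    # hypothetical accelerometer channels
    "acc_y": rng.normal(0.0, 1.0, n),
    "activity": rng.integers(0, 6, n),   # e.g. sit/stand/run/write/type/read
})
sensor_cols = ["acc_x", "acc_y"]
data = mean_deduct(data, sensor_cols)

X = data[sensor_cols + [c + "_md" for c in sensor_cols]]
y = data["activity"]

# LOPO testing: every fold trains on 14 subjects and evaluates on the held-out one.
lopo = LeaveOneGroupOut()
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, groups=data["subject"], cv=lopo)
print("LOPO accuracy per held-out subject:", np.round(scores, 3))

On real windowed sensor features the same grouped split keeps all of one subject's data out of training, which is what makes LOPO a stricter test of cross-subject generalization than 5-fold cross validation.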
dc.format.extent [en_US]: 80 p.
dc.language.iso [en_US]: en
dc.rights: Nanyang Technological University
dc.subject [en_US]: DRNTU::Engineering::Computer science and engineering
dc.title [en_US]: Human activity recognition through wearable devices
dc.type [en_US]: Final Year Project (FYP)
dc.contributor.supervisor [en_US]: Tan Ah Hwee
dc.contributor.school [en_US]: School of Computer Science and Engineering
dc.description.degree [en_US]: Bachelor of Engineering (Computer Science)
item.grantfulltext: restricted
item.fulltext: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File: Revised Final Year Project Report.pdf (Restricted Access)
Size: 7.06 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.