Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/78432
Title: Human activities recognition in smart living environment
Authors: Ye, Yuchen
Keywords: DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Control engineering
Issue Date: 2019
Abstract: In the mobile internet era, sensors are widely used around the world, and their development and application benefit many fields. In recent years, sensor-based activity recognition has made great progress; within it, research based on wearable sensors and smartphone sensors occupies a major position and supports many applications in daily life. Because smartphones are easy to carry, a large number of researchers use them to collect sensor data for such studies. In this project, MATLAB was used to process accelerometer data from a smartphone and to examine the best accuracy achievable with different machine learning algorithms, including K-Nearest Neighbors (K-NN), Support Vector Machines (SVM) and ensemble learners. MATLAB is convenient software for machine learning; its Classification Learner app, which provides several classifiers and a visual interface for the sensor data, was used in the experiments. The results show that all of the machine learning algorithms reach an accuracy of 85% or higher, and the highest accuracy achieved is 94.74%, obtained with a Cubic SVM (see the sketch after the item details below).
URI: http://hdl.handle.net/10356/78432
Schools: School of Electrical and Electronic Engineering 
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)
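For context, the "Cubic SVM" named in the abstract corresponds to a polynomial-kernel SVM of order 3 in MATLAB's Classification Learner. The snippet below is a minimal sketch of how such a model could be trained and cross-validated programmatically; the feature file name and column layout are assumptions for illustration, not details taken from the report.

```matlab
% Minimal sketch (hypothetical file and column names): train a cubic SVM on
% smartphone accelerometer features, mirroring Classification Learner's setup.
data = readtable('accel_features.csv');   % hypothetical feature table
X = data{:, 1:end-1};                     % feature columns
Y = categorical(data{:, end});            % activity labels

% Cubic SVM: polynomial kernel of order 3; fitcecoc handles the multiclass
% problem with one-vs-one coding by default.
t = templateSVM('KernelFunction', 'polynomial', 'PolynomialOrder', 3, ...
                'Standardize', true);
mdl = fitcecoc(X, Y, 'Learners', t);

% Estimate accuracy with 5-fold cross-validation, the app's default scheme.
cv = crossval(mdl, 'KFold', 5);
accuracy = 1 - kfoldLoss(cv);
fprintf('Cross-validated accuracy: %.2f%%\n', 100 * accuracy);
```

Swapping templateSVM for fitcknn or fitcensemble would reproduce the other classifier families compared in the project.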

Files in This Item:
Final Report-YYC.pdf (Restricted Access, 4.05 MB, Adobe PDF)

Page view(s): 298 (updated on Jun 23, 2024)
Download(s): 44 (updated on Jun 23, 2024)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.