Title: Robust human activity recognition using lesser number of wearable sensors
Authors: Wang, Di; Candinegara, E.; Hou, J.; Tan, A.-H.; Miao, C.
Subject: DRNTU::Engineering::Computer science and engineering
Issue Date: 2017
Source: Wang, D., Candinegara, E., Hou, J., Tan, A.-H., & Miao, C. (2017). Robust human activity recognition using lesser number of wearable sensors. 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), 290-295. doi:10.1109/SPAC.2017.8304292
Abstract: In recent years, research on the recognition of human physical activities solely using wearable sensors has received increasing attention. Compared to other types of sensory devices such as surveillance cameras, wearable sensors are preferred in most activity recognition applications, mainly due to their non-intrusiveness and pervasiveness. However, many existing activity recognition applications or experiments using wearable sensors were conducted in confined laboratory settings using specifically developed gadgets. These gadgets may be useful for a small group of people in certain specific scenarios, but they are unlikely to gain popularity because they introduce additional costs and are unusual in everyday life. Alternatively, commercial devices such as smart phones and smart watches can be better utilized for robust activity recognition. However, only a few prior studies have focused on activity recognition using multiple commercial devices. In this paper, we present our feature extraction strategy and compare the performance of our feature set against other feature sets using the same classifiers. We conduct various experiments on a subset of a public dataset named PAMAP2. Specifically, we only select two sensors out of the thirteen used in PAMAP2. Experimental results show that our feature extraction strategy performs better than the others. This paper provides the necessary foundation towards robust activity recognition using only commercial wearable devices.
URI: https://hdl.handle.net/10356/89637
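The abstract's pipeline (select a small subset of sensor channels, then extract features for classification) can be sketched as follows. The paper's actual feature set is not described in this record, so this is only a generic sliding-window statistical-feature sketch of the kind commonly applied to PAMAP2-style data; the window size, step, and feature choices here are assumptions, not the authors' method.

```python
# Hedged sketch: generic sliding-window feature extraction over two
# wearable-sensor channels. The paper's actual features, window size,
# and step are NOT specified in this record; these values are assumed.
import numpy as np

def extract_window_features(signal, window_size=128, step=64):
    """Compute mean/std/min/max features over sliding windows.

    signal: array of shape (n_samples, n_channels), e.g. streams from
            the two selected sensors.
    Returns an array of shape (n_windows, n_channels * 4).
    """
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        window = signal[start:start + window_size]
        feats = np.concatenate([
            window.mean(axis=0),   # per-channel mean
            window.std(axis=0),    # per-channel standard deviation
            window.min(axis=0),    # per-channel minimum
            window.max(axis=0),    # per-channel maximum
        ])
        features.append(feats)
    return np.array(features)

# Example: 1000 samples from two synthetic sensor channels
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
F = extract_window_features(X)
print(F.shape)  # (14, 8): 14 windows, 4 statistics x 2 channels
```

The resulting feature matrix would then be fed to standard classifiers for comparison, as the abstract describes.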
DOI: 10.1109/SPAC.2017.8304292
Rights: © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: http://dx.doi.org/10.1109/SPAC.2017.8304292
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Conference Papers
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.