Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/150409
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Qian, Hangwei | en_US |
dc.contributor.author | Pan, Sinno Jialin | en_US |
dc.contributor.author | Miao, Chunyan | en_US |
dc.date.accessioned | 2021-06-07T03:17:31Z | - |
dc.date.available | 2021-06-07T03:17:31Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Qian, H., Pan, S. J. & Miao, C. (2021). Latent independent excitation for generalizable sensor-based cross-person activity recognition. The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 35. | en_US |
dc.identifier.uri | https://hdl.handle.net/10356/150409 | - |
dc.description.abstract | In wearable-sensor-based activity recognition, it is often assumed that the training and the test samples follow the same data distribution. This assumption neglects practical scenarios where activity patterns inevitably vary from person to person. To address this problem, transfer learning and domain adaptation approaches are often leveraged to reduce the gaps between different participants. Nevertheless, these approaches require additional information (e.g., labeled or unlabeled data, meta-information) from the target domain during the training stage. In this paper, we introduce a novel method named Generalizable Independent Latent Excitation (GILE) for human activity recognition, which greatly enhances the cross-person generalization capability of the model. Our proposed method is superior to existing methods in that it does not require any access to target-domain information. Moreover, the model can be applied directly to various target domains without re-training or fine-tuning. Specifically, the proposed model learns to automatically disentangle domain-agnostic and domain-specific features, the former of which are expected to be invariant across persons. To further remove correlations between the two types of features, a novel Independent Excitation mechanism is incorporated in the latent feature space. Comprehensive experimental evaluations are conducted on three benchmark datasets to demonstrate the superiority of the proposed method over state-of-the-art solutions. | en_US |
dc.description.sponsorship | Ministry of Education (MOE) | en_US |
dc.description.sponsorship | Nanyang Technological University | en_US |
dc.language.iso | en | en_US |
dc.relation | 020493-00001 | en_US |
dc.rights | © 2021 Association for the Advancement of Artificial Intelligence (AAAI). All rights reserved. This paper was published in The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21) and is made available with permission of Association for the Advancement of Artificial Intelligence (AAAI). | en_US |
dc.subject | Engineering::Computer science and engineering | en_US |
dc.title | Latent independent excitation for generalizable sensor-based cross-person activity recognition | en_US |
dc.type | Conference Paper | en |
dc.contributor.school | School of Computer Science and Engineering | en_US |
dc.contributor.conference | The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21) | en_US |
dc.contributor.research | Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY) | en_US |
dc.description.version | Accepted version | en_US |
dc.identifier.volume | 35 | en_US |
dc.subject.keywords | Human Activity Recognition | en_US |
dc.subject.keywords | Cross-person Generalization | en_US |
item.grantfulltext | open | - |
item.fulltext | With Fulltext | - |
Appears in Collections: SCSE Conference Papers
Files in This Item:
File | Description | Size | Format
---|---|---|---
5_AAAI21_GILE.pdf | full paper | 1.04 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.