Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/150409
Full metadata record
DC Field: Value (Language)

dc.contributor.author: Qian, Hangwei (en_US)
dc.contributor.author: Pan, Sinno Jialin (en_US)
dc.contributor.author: Miao, Chunyan (en_US)
dc.date.accessioned: 2021-06-07T03:17:31Z
dc.date.available: 2021-06-07T03:17:31Z
dc.date.issued: 2021
dc.identifier.citation: Qian, H., Pan, S. J. & Miao, C. (2021). Latent independent excitation for generalizable sensor-based cross-person activity recognition. The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21), 35. (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/150409
dc.description.abstract: In wearable-sensor-based activity recognition, it is often assumed that the training and the test samples follow the same data distribution. This assumption neglects practical scenarios where activity patterns inevitably vary from person to person. To address this problem, transfer learning and domain adaptation approaches are often leveraged to reduce the gaps between different participants. Nevertheless, these approaches require additional information (e.g., labeled or unlabeled data, or meta-information) from the target domain during the training stage. In this paper, we introduce a novel method named Generalizable Independent Latent Excitation (GILE) for human activity recognition, which greatly enhances the cross-person generalization capability of the model. Our proposed method is superior to existing methods in the sense that it does not require any access to target domain information. Moreover, the model can be directly applied to various target domains without re-training or fine-tuning. Specifically, the proposed model learns to automatically disentangle domain-agnostic and domain-specific features, the former of which are expected to be invariant across persons. To further remove correlations between the two types of features, a novel Independent Excitation mechanism is incorporated in the latent feature space. Comprehensive experimental evaluations are conducted on three benchmark datasets to demonstrate the superiority of the proposed method over state-of-the-art solutions. (en_US)
dc.description.sponsorship: Ministry of Education (MOE) (en_US)
dc.description.sponsorship: Nanyang Technological University (en_US)
dc.language.iso: en (en_US)
dc.relation: 020493-00001 (en_US)
dc.rights: © 2021 Association for the Advancement of Artificial Intelligence (AAAI). All rights reserved. This paper was published in The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21) and is made available with permission of the Association for the Advancement of Artificial Intelligence (AAAI). (en_US)
dc.subject: Engineering::Computer science and engineering (en_US)
dc.title: Latent independent excitation for generalizable sensor-based cross-person activity recognition (en_US)
dc.type: Conference Paper (en)
dc.contributor.school: School of Computer Science and Engineering (en_US)
dc.contributor.conference: The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21) (en_US)
dc.contributor.research: Joint NTU-UBC Research Centre of Excellence in Active Living for the Elderly (LILY) (en_US)
dc.description.version: Accepted version (en_US)
dc.identifier.volume: 35 (en_US)
dc.subject.keywords: Human Activity Recognition (en_US)
dc.subject.keywords: Cross-person Generalization (en_US)
item.grantfulltext: open
item.fulltext: With Fulltext
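The abstract describes disentangling each input into domain-agnostic and domain-specific features and then penalizing correlation between the two groups. As a rough illustration only, the sketch below uses a plain squared cross-covariance penalty between two linear feature maps; all names and dimensions are hypothetical, and this is not the authors' GILE architecture or their Independent Excitation mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 16-dim sensor input, 8-dim latent feature groups.
d_in, d_feat = 16, 8

# Two linear "encoders": one intended for domain-agnostic features
# (shared across persons), one for domain-specific features.
W_agnostic = rng.normal(size=(d_in, d_feat))
W_specific = rng.normal(size=(d_in, d_feat))

def encode(x):
    """Split an input batch into the two latent feature groups."""
    return x @ W_agnostic, x @ W_specific

def decorrelation_penalty(z_a, z_s):
    """Sum of squared cross-covariances between the two feature groups.

    Driving this toward zero during training discourages linear
    correlation between domain-agnostic and domain-specific features.
    This is a generic stand-in, not the paper's actual mechanism.
    """
    z_a = z_a - z_a.mean(axis=0)
    z_s = z_s - z_s.mean(axis=0)
    cov = z_a.T @ z_s / (len(z_a) - 1)
    return float((cov ** 2).sum())

x = rng.normal(size=(32, d_in))      # a batch of 32 sensor windows
z_a, z_s = encode(x)
penalty = decorrelation_penalty(z_a, z_s)
```

In a real model the two encoders would be trained networks and this penalty would enter the loss alongside the classification objective; here it only shows the shape of the idea.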
Appears in Collections:SCSE Conference Papers
Files in This Item:
File: 5_AAAI21_GILE.pdf (full paper, 1.04 MB, Adobe PDF)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.