Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/164108
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Wen, Jun (en_US)
dc.contributor.author: Yuan, Junsong (en_US)
dc.contributor.author: Zheng, Qian (en_US)
dc.contributor.author: Liu, Risheng (en_US)
dc.contributor.author: Gong, Zhefeng (en_US)
dc.contributor.author: Zheng, Nenggan (en_US)
dc.date.accessioned: 2023-01-05T01:26:55Z
dc.date.available: 2023-01-05T01:26:55Z
dc.date.issued: 2022
dc.identifier.citation: Wen, J., Yuan, J., Zheng, Q., Liu, R., Gong, Z. & Zheng, N. (2022). Hierarchical domain adaptation with local feature patterns. Pattern Recognition, 124, 108445. https://dx.doi.org/10.1016/j.patcog.2021.108445 (en_US)
dc.identifier.issn: 0031-3203 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/164108
dc.description.abstract: Domain adaptation is proposed to generalize learning machines and address the performance degradation of models that are trained on one specific source domain but applied to novel target domains. Existing domain adaptation methods focus on transferring holistic features, whose discriminability is generally tailored to be source-specific and insufficiently generic to transfer well. As a result, standard domain adaptation on holistic features usually damages feature structures, especially local feature statistics, and deteriorates the learned discriminability. To alleviate this issue, we propose to transfer primitive local feature patterns, whose discriminability is shown to be inherently more sharable, and to perform hierarchical feature adaptation. Concretely, we first learn a cluster of domain-shared local feature patterns and partition the feature space into cells. Local features are adaptively aggregated inside each cell to obtain cell features, which are further integrated into holistic features. To achieve fine-grained adaptation, we simultaneously perform alignment on local features, cell features, and holistic features, during which the local and cell features are aligned independently inside each cell to maintain the learned local structures and prevent negative transfer. Experimenting on typical one-to-one unsupervised domain adaptation for both image classification and action recognition, partial domain adaptation, and domain-agnostic adaptation, we show that the proposed method achieves more reliable feature transfer by consistently outperforming state-of-the-art models, and that the learned domain-invariant features generalize well to novel domains. (en_US)
dc.description.sponsorship: Nanyang Technological University (en_US)
dc.language.iso: en (en_US)
dc.relation.ispartof: Pattern Recognition (en_US)
dc.rights: © 2021 Elsevier Ltd. All rights reserved. (en_US)
dc.subject: Engineering::Electrical and electronic engineering (en_US)
dc.title: Hierarchical domain adaptation with local feature patterns (en_US)
dc.type: Journal Article (en)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.identifier.doi: 10.1016/j.patcog.2021.108445
dc.identifier.scopus: 2-s2.0-85120475087
dc.identifier.volume: 124 (en_US)
dc.identifier.spage: 108445 (en_US)
dc.subject.keywords: Domain Adaptation (en_US)
dc.subject.keywords: Local Feature Patterns (en_US)
dc.description.acknowledgement: This work was supported by Zhejiang Lab (2020KB0AC02), National Key R&D Program of China (SQ2020YFB130047, 2020YFB1313501, 2020YFB1313503), Zhejiang Provincial Natural Science Foundation (LR19F020005), the National Natural Science Foundation of China (61972347, 31070944, 31271147, 61922019, 31471063, 31671074, and 61572433) and the Fundamental Research Funds for the Central Universities, China (2017FZA7003). Qian Zheng is supported by the Rapid-Rich Object Search (ROSE) Lab, Nanyang Technological University, Singapore. (en_US)
item.grantfulltext: none
item.fulltext: No Fulltext
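
The abstract above describes a hierarchical aggregation of local features into cell features and then a holistic feature. Below is a minimal, illustrative PyTorch sketch of that idea; the class name CellAggregator, the soft-assignment scheme, and all shapes and hyperparameters are assumptions made for illustration, not the authors' implementation, and the per-level alignment losses are omitted.

```python
# Illustrative sketch: local features are softly assigned to learned
# "local feature patterns" (cell centers), aggregated per cell, and the
# resulting cell features are concatenated into a holistic feature.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CellAggregator(nn.Module):
    def __init__(self, num_cells: int = 8, feat_dim: int = 256):
        super().__init__()
        # Learnable centers acting as domain-shared local feature patterns.
        self.centers = nn.Parameter(torch.randn(num_cells, feat_dim))

    def forward(self, local_feats):
        # local_feats: (batch, num_locations, feat_dim), e.g. flattened conv features.
        batch = local_feats.size(0)
        centers = self.centers.unsqueeze(0).expand(batch, -1, -1)
        # Soft assignment of each local feature to each cell (closer center -> larger weight).
        dists = torch.cdist(local_feats, centers)            # (batch, locations, cells)
        assign = F.softmax(-dists, dim=-1)
        # Weighted aggregation of local features inside each cell -> cell features.
        cell_feats = torch.einsum('blc,bld->bcd', assign, local_feats)
        cell_feats = cell_feats / (assign.sum(dim=1).unsqueeze(-1) + 1e-6)
        # Integrate cell features into one holistic feature by concatenation.
        holistic = cell_feats.flatten(start_dim=1)           # (batch, cells * feat_dim)
        return cell_feats, holistic

# Example: 49 spatial locations of a 7x7 feature map with 256-dim local descriptors.
agg = CellAggregator(num_cells=8, feat_dim=256)
cell_feats, holistic = agg(torch.randn(4, 49, 256))
print(cell_feats.shape, holistic.shape)  # torch.Size([4, 8, 256]) torch.Size([4, 2048])
```

In the paper's setting, domain alignment would then be applied at each level (local, cell, and holistic features), with local and cell statistics aligned independently inside each cell; that alignment machinery is not reproduced in this sketch.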
Appears in Collections: EEE Journal Articles

SCOPUS™ Citations: 13 (updated on Mar 1, 2024)

Web of Science™ Citations: 7 (updated on Oct 31, 2023)

Page view(s): 73 (updated on Mar 4, 2024)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.