Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/164101
Title: Domain consistency regularization for unsupervised multi-source domain adaptive classification
Authors: Luo, Zhipeng
Zhang, Xiaobing
Lu, Shijian
Yi, Shuai
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Source: Luo, Z., Zhang, X., Lu, S. & Yi, S. (2022). Domain consistency regularization for unsupervised multi-source domain adaptive classification. Pattern Recognition, 132, 108955. https://dx.doi.org/10.1016/j.patcog.2022.108955
Journal: Pattern Recognition
Abstract: Deep learning-based multi-source unsupervised domain adaptation (MUDA) has been actively studied in recent years. Compared with single-source unsupervised domain adaptation (SUDA), domain shift in MUDA exists not only between the source and target domains but also among multiple source domains. Most existing MUDA algorithms focus on extracting domain-invariant representations among all domains whereas the task-specific decision boundaries among classes are largely neglected. In this paper, we propose an end-to-end trainable network that exploits domain Consistency Regularization for unsupervised Multi-source domain Adaptive classification (CRMA). CRMA aligns not only the distributions of each pair of source and target domains but also that of all domains. For each pair of source and target domains, we employ an intra-domain consistency to regularize a pair of domain-specific classifiers to achieve intra-domain alignment. In addition, we design an inter-domain consistency that targets joint inter-domain alignment among all domains. To address different similarities between multiple source domains and the target domain, we design an authorization strategy that assigns different authorities to domain-specific classifiers adaptively for optimal pseudo label prediction and self-training. Extensive experiments show that CRMA tackles unsupervised domain adaptation effectively under a multi-source setup and achieves superior adaptation consistently across multiple MUDA datasets.
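
Note: the full text is not available in this record, so the sketch below is only a minimal, assumption-based illustration (in PyTorch) of the ideas named in the abstract: a pair of domain-specific classifiers per source domain regularized to agree on target samples (intra-domain consistency), agreement among all classifier pairs (inter-domain consistency), and confidence-weighted "authority" scores for fusing pseudo labels. The network layout, the L1 discrepancy losses, and the max-probability authority weights are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the consistency ideas described in the abstract.
# NOT the authors' released code; loss forms and weighting are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CRMASketch(nn.Module):
    def __init__(self, feat_dim: int, num_classes: int, num_sources: int):
        super().__init__()
        # Shared feature extractor (a toy MLP stands in for a CNN backbone).
        self.backbone = nn.Sequential(nn.Linear(feat_dim, 256), nn.ReLU())
        # One pair of domain-specific classifiers per source domain.
        self.classifiers = nn.ModuleList(
            nn.ModuleList([nn.Linear(256, num_classes) for _ in range(2)])
            for _ in range(num_sources)
        )

    def forward(self, x):
        z = self.backbone(x)
        # Softmax predictions from both classifiers of every source domain.
        return [[F.softmax(c(z), dim=1) for c in pair] for pair in self.classifiers]


def consistency_losses(preds):
    """preds[j] = (p_j1, p_j2): the two classifiers' predictions on target data."""
    # Intra-domain consistency: the two classifiers of each source-target pair
    # should agree on target samples (L1 discrepancy here is an assumption).
    intra = torch.stack([(p1 - p2).abs().mean() for p1, p2 in preds]).mean()
    # Inter-domain consistency: each pair's averaged prediction should agree
    # with the mean prediction over all pairs.
    pair_means = torch.stack([(p1 + p2) / 2 for p1, p2 in preds])  # (S, B, C)
    global_mean = pair_means.mean(dim=0, keepdim=True)
    inter = (pair_means - global_mean).abs().mean()
    return intra, inter


def authorized_pseudo_labels(preds):
    """Fuse predictions with per-domain 'authority' weights (mean max-probability
    is used here as an assumed proxy for source-target similarity)."""
    pair_means = torch.stack([(p1 + p2) / 2 for p1, p2 in preds])          # (S, B, C)
    authority = F.softmax(pair_means.max(dim=2).values.mean(dim=1), dim=0)  # (S,)
    fused = (authority[:, None, None] * pair_means).sum(dim=0)             # (B, C)
    return fused.argmax(dim=1)


if __name__ == "__main__":
    model = CRMASketch(feat_dim=64, num_classes=10, num_sources=3)
    target_batch = torch.randn(8, 64)
    preds = model(target_batch)
    intra, inter = consistency_losses(preds)
    labels = authorized_pseudo_labels(preds)
    print(intra.item(), inter.item(), labels.shape)
```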
URI: https://hdl.handle.net/10356/164101
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2022.108955
Schools: School of Computer Science and Engineering 
Organisations: SenseTime Research
Rights: © 2022 Elsevier Ltd. All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:SCSE Journal Articles

SCOPUS™ Citations: 7 (updated on Feb 21, 2024)
Web of Science™ Citations: 4 (updated on Oct 30, 2023)
Page view(s): 75 (updated on Feb 28, 2024)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.