Domain consistency regularization for unsupervised multi-source domain adaptive classification

Deep learning-based multi-source unsupervised domain adaptation (MUDA) has been actively studied in recent years. Compared with single-source unsupervised domain adaptation (SUDA), domain shift in MUDA exists not only between the source and target domains but also among the multiple source domains. Most existing MUDA algorithms focus on extracting domain-invariant representations across all domains, whereas the task-specific decision boundaries among classes are largely neglected. In this paper, we propose an end-to-end trainable network that exploits domain Consistency Regularization for unsupervised Multi-source domain Adaptive classification (CRMA). CRMA aligns not only the distributions of each pair of source and target domains but also those of all domains jointly. For each source-target pair, we employ an intra-domain consistency to regularize a pair of domain-specific classifiers and achieve intra-domain alignment. In addition, we design an inter-domain consistency that targets joint inter-domain alignment among all domains. To address the different similarities between individual source domains and the target domain, we design an authorization strategy that adaptively assigns different authorities to the domain-specific classifiers for optimal pseudo-label prediction and self-training. Extensive experiments show that CRMA tackles unsupervised domain adaptation effectively under a multi-source setup and achieves superior adaptation consistently across multiple MUDA datasets.
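The abstract describes two consistency losses and an authority-weighted pseudo-labeling scheme. Below is a minimal PyTorch sketch of those ideas, reconstructed from the abstract alone; it is not the authors' CRMA implementation. All names (FeatureExtractor, intra_domain_consistency, inter_domain_consistency, pseudo_labels) and the concrete choices of discrepancy measure (mean absolute difference between softmax outputs) and authority weighting (softmax over negated intra-domain losses) are illustrative assumptions:

import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_SOURCES, NUM_CLASSES, FEAT_DIM = 3, 10, 256

class FeatureExtractor(nn.Module):
    # Stand-in shared backbone; a real system would use e.g. a ResNet.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(),
                                 nn.Linear(3 * 32 * 32, FEAT_DIM),
                                 nn.ReLU())

    def forward(self, x):
        return self.net(x)

backbone = FeatureExtractor()
# One pair of domain-specific classifiers per source domain,
# as the abstract describes.
classifiers = nn.ModuleList(
    nn.ModuleList([nn.Linear(FEAT_DIM, NUM_CLASSES) for _ in range(2)])
    for _ in range(NUM_SOURCES)
)

def domain_predictions(x_target):
    # Softmax outputs of both classifiers of every source domain.
    feats = backbone(x_target)
    return [[F.softmax(clf(feats), dim=1) for clf in pair]
            for pair in classifiers]

def intra_domain_consistency(preds):
    # Per domain: the two classifiers of that domain should agree on
    # target data (assumed measure: mean absolute difference).
    return torch.stack([(p1 - p2).abs().mean() for p1, p2 in preds])

def inter_domain_consistency(preds):
    # Across domains: each domain's mean prediction should agree with
    # the overall average over all domains.
    means = torch.stack([(p1 + p2) / 2 for p1, p2 in preds])  # (N, B, C)
    return (means - means.mean(dim=0, keepdim=True)).abs().mean()

def pseudo_labels(preds, intra_losses):
    # Authority-weighted fusion: domains whose classifier pair agrees
    # more on the target (lower intra-domain loss) get higher authority.
    authority = F.softmax(-intra_losses, dim=0)               # (N,)
    means = torch.stack([(p1 + p2) / 2 for p1, p2 in preds])  # (N, B, C)
    fused = (authority[:, None, None] * means).sum(dim=0)     # (B, C)
    return fused.argmax(dim=1)

# One illustrative step on an unlabeled target batch.
x_t = torch.randn(8, 3, 32, 32)
preds = domain_predictions(x_t)
intra = intra_domain_consistency(preds)
loss = intra.sum() + inter_domain_consistency(preds)
loss.backward()
with torch.no_grad():
    detached = [[p.detach() for p in pair] for pair in preds]
    print(pseudo_labels(detached, intra.detach()))

A full training loop would additionally apply supervised cross-entropy on labeled source batches and a self-training loss on the fused pseudo labels, per the abstract; this sketch isolates the two consistency terms and the authority weighting.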

Bibliographic Details
Main Authors: Luo, Zhipeng, Zhang, Xiaobing, Lu, Shijian, Yi, Shuai
Other Authors: School of Computer Science and Engineering; SenseTime Research
Format: Journal Article
Language: English
Published: 2022
Citation: Luo, Z., Zhang, X., Lu, S. & Yi, S. (2022). Domain consistency regularization for unsupervised multi-source domain adaptive classification. Pattern Recognition, 132, 108955. https://dx.doi.org/10.1016/j.patcog.2022.108955
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2022.108955
Rights: © 2022 Elsevier Ltd. All rights reserved.
Subjects: Engineering::Computer science and engineering; Domain Adaptation; Transfer Learning
Online Access: https://hdl.handle.net/10356/164101
Institution: Nanyang Technological University