
Your Classifier can Secretly Suffice Multi-Source Domain Adaptation

About

Multi-Source Domain Adaptation (MSDA) deals with the transfer of task knowledge from multiple labeled source domains to an unlabeled target domain, under a domain-shift. Existing methods aim to minimize this domain-shift using auxiliary distribution alignment objectives. In this work, we present a different perspective to MSDA wherein deep models are observed to implicitly align the domains under label supervision. Thus, we aim to utilize implicit alignment without additional training objectives to perform adaptation. To this end, we use pseudo-labeled target samples and enforce a classifier agreement on the pseudo-labels, a process called Self-supervised Implicit Alignment (SImpAl). We find that SImpAl readily works even under category-shift among the source domains. Further, we propose classifier agreement as a cue to determine the training convergence, resulting in a simple training algorithm. We provide a thorough evaluation of our approach on five benchmarks, along with detailed insights into each component of our approach.
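The agreement mechanism described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: it assumes one classifier head per source domain and shows (a) selecting target samples on which all heads agree on the arg-max class, which become pseudo-labeled training samples, and (b) the agreement rate, which the abstract proposes as a convergence cue. Function names and shapes are illustrative.

```python
import numpy as np

def agreement_pseudo_labels(logits_list):
    """Select target samples on which all source classifiers agree.

    logits_list: one array of shape [N, C] per source classifier,
    evaluated on the same batch of N unlabeled target samples.
    Returns the indices of the agreeing samples and their agreed
    pseudo-labels (the shared arg-max class).
    """
    # Per-classifier hard predictions, stacked to shape [S, N].
    preds = np.stack([np.argmax(logits, axis=1) for logits in logits_list])
    # A sample is kept only if every classifier predicts the same class.
    agree = np.all(preds == preds[0], axis=0)
    idx = np.nonzero(agree)[0]
    return idx, preds[0, idx]

def agreement_rate(logits_list):
    """Fraction of target samples with full classifier agreement.

    Usable as a convergence cue: training can stop once this
    rate plateaus.
    """
    idx, _ = agreement_pseudo_labels(logits_list)
    return len(idx) / logits_list[0].shape[0]

# Toy example: two source classifiers, three target samples, two classes.
c1 = np.array([[2.0, 0.0], [0.0, 2.0], [1.0, 0.0]])  # predicts 0, 1, 0
c2 = np.array([[3.0, 1.0], [2.0, 0.0], [0.0, 1.0]])  # predicts 0, 0, 1
idx, labels = agreement_pseudo_labels([c1, c2])
# Only the first sample gets a pseudo-label (class 0); rate is 1/3.
```

In a full training loop, the pseudo-labeled subset `idx` would be fed back with a standard cross-entropy loss on each classifier head, with no auxiliary alignment objective.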

Naveen Venkat, Jogendra Nath Kundu, Durgesh Kumar Singh, Ambareesh Revanur, R. Venkatesh Babu • 2021

Related benchmarks

Task                                        | Dataset            | Result                 | Rank
--------------------------------------------|--------------------|------------------------|-----
Unsupervised Domain Adaptation              | Office-Home (test) | Average Accuracy 72.2  | 332
Image Classification                        | Office-31          | --                     | 261
Unsupervised Domain Adaptation              | Office-Home        | Average Accuracy 72.2  | 238
Image Classification                        | DomainNet (test)   | Average Accuracy 48.6  | 209
Image Classification                        | DomainNet          | Accuracy (ClipArt) 66.4 | 161
Image Classification                        | OfficeHome         | Average Accuracy 72.2  | 131
Unsupervised Domain Adaptation              | ImageCLEF-DA       | Average Accuracy 88.3  | 104
Unsupervised Domain Adaptation              | DomainNet          | Average Accuracy 48.6  | 100
Image Classification                        | Office-Home        | Average Accuracy 72.2  | 59
Multi-source Unsupervised Domain Adaptation | DomainNet target   | Clipart Accuracy 66.4  | 26

Showing 10 of 21 rows
