
Learning to Generate Novel Domains for Domain Generalization

About

This paper focuses on domain generalization (DG), the task of learning, from multiple source domains, a model that generalizes well to unseen domains. A main challenge for DG is that the available source domains often exhibit limited diversity, hampering the model's ability to learn to generalize. We therefore employ a data generator to synthesize data from pseudo-novel domains to augment the source domains. This explicitly increases the diversity of available training domains and leads to a more generalizable model. To train the generator, we model the distribution divergence between the source and synthesized pseudo-novel domains using optimal transport, and maximize this divergence. To ensure that semantics are preserved in the synthesized data, we further impose cycle-consistency and classification losses on the generator. Our method, L2A-OT (Learning to Augment by Optimal Transport), outperforms current state-of-the-art DG methods on four benchmark datasets.
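The optimal-transport divergence at the core of the objective can be illustrated with a minimal Sinkhorn iteration. This is only a sketch under simplifying assumptions: the paper measures divergence between learned CNN feature distributions of source and generated domains, whereas here two small 2-D point clouds with uniform weights stand in for two "domains"; the function name and hyperparameters (`eps`, `n_iters`) are illustrative, not the paper's.

```python
import math

def sinkhorn_cost(xs, ys, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost between two point clouds (uniform weights).

    Illustrative stand-in for the divergence L2A-OT maximizes between
    source and pseudo-novel domains; plain 2-D points replace CNN features.
    """
    n, m = len(xs), len(ys)
    # Pairwise squared-Euclidean ground cost matrix C.
    C = [[sum((a - b) ** 2 for a, b in zip(x, y)) for y in ys] for x in xs]
    # Gibbs kernel K = exp(-C / eps).
    K = [[math.exp(-c / eps) for c in row] for row in C]
    u = [1.0 / n] * n
    v = [1.0 / m] * m
    for _ in range(n_iters):
        # Sinkhorn updates: u <- a / (K v), v <- b / (K^T u),
        # with uniform marginals a = 1/n, b = 1/m.
        u = [(1.0 / n) / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [(1.0 / m) / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    # Transport plan P_ij = u_i * K_ij * v_j; OT cost = <P, C>.
    return sum(u[i] * K[i][j] * v[j] * C[i][j]
               for i in range(n) for j in range(m))

# Two toy "domains": points near the origin vs. the same points shifted by (3, 0).
src = [(0.0, 0.0), (0.5, 0.5), (0.2, -0.3)]
gen = [(3.0, 0.0), (3.5, 0.5), (3.2, -0.3)]
print(sinkhorn_cost(src, gen))  # large cost: the two domains are far apart
print(sinkhorn_cost(src, src))  # near zero: identical domains
```

In L2A-OT the generator's parameters are updated to *increase* this divergence (pushing synthesized domains away from the sources), while cycle-consistency and classification losses keep labels intact.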

Kaiyang Zhou, Yongxin Yang, Timothy Hospedales, Tao Xiang • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | PACS (test) | Average Accuracy | 84.9 | 271 |
| Image Classification | PACS | Overall Average Accuracy | 82.8 | 241 |
| Domain Generalization | PACS | -- | -- | 231 |
| Domain Generalization | PACS (test) | Average Accuracy | 84.9 | 225 |
| Image Classification | Office-Home (test) | Mean Accuracy | 65.6 | 199 |
| Person Re-Identification | Market-1501 to DukeMTMC-reID (test) | Rank-1 | 50.1 | 191 |
| Domain Generalization | PACS (leave-one-domain-out) | Art Accuracy | 83.3 | 152 |
| Image Classification | OfficeHome | Average Accuracy | 65.6 | 137 |
| Object Recognition | PACS (leave-one-domain-out) | Acc (Art painting) | 83.3 | 112 |
| Domain Generalization | Office-Home (test) | Average Accuracy | 67.66 | 106 |

Showing 10 of 18 rows.
