
Learning to Generate Novel Domains for Domain Generalization

About

This paper focuses on domain generalization (DG): learning, from multiple source domains, a model that generalizes well to unseen domains. A main challenge for DG is that the available source domains often exhibit limited diversity, hampering the model's ability to learn to generalize. We therefore employ a data generator to synthesize data from pseudo-novel domains to augment the source domains. This explicitly increases the diversity of the available training domains and leads to a more generalizable model. To train the generator, we model the distribution divergence between source and synthesized pseudo-novel domains using optimal transport, and maximize this divergence. To ensure that semantics are preserved in the synthesized data, we further impose cycle-consistency and classification losses on the generator. Our method, L2A-OT (Learning to Augment by Optimal Transport), outperforms current state-of-the-art DG methods on four benchmark datasets.
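As a rough illustration of the optimal-transport component described above, the sketch below estimates an entropy-regularized OT cost between two feature clouds via Sinkhorn iterations. The function name, uniform marginals, and all hyperparameters are illustrative assumptions, not the paper's implementation; the paper's exact estimator and architecture may differ.

```python
import numpy as np

def sinkhorn_distance(x, y, eps=1.0, n_iters=200):
    """Entropy-regularized OT cost between point clouds x (n, d) and y (m, d).

    A minimal NumPy sketch of the kind of OT divergence L2A-OT maximizes.
    """
    n, m = len(x), len(y)
    # Pairwise squared-Euclidean ground cost.
    cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    K = np.exp(-cost / eps)                # Gibbs kernel
    a = np.full(n, 1.0 / n)                # uniform source marginal
    b = np.full(m, 1.0 / m)                # uniform target marginal
    u = np.ones(n)
    for _ in range(n_iters):               # Sinkhorn fixed-point updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    plan = u[:, None] * K * v[None, :]     # approximate transport plan
    return float((plan * cost).sum())

# Toy "source" vs. "pseudo-novel" feature clouds: the generator is trained to
# *increase* this divergence, while cycle-consistency and classification
# losses (not shown) keep the semantics intact.
rng = np.random.default_rng(0)
source = rng.normal(size=(64, 2))
novel = rng.normal(size=(64, 2)) + 2.0     # shifted distribution
same = rng.normal(size=(64, 2))            # same distribution
assert sinkhorn_distance(source, novel) > sinkhorn_distance(source, same)
```

In the full method, this divergence would be maximized with respect to the generator's parameters, so the synthesized domains are pushed away from the source distribution while the semantic losses anchor class content.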

Kaiyang Zhou, Yongxin Yang, Timothy Hospedales, Tao Xiang · 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | PACS (test) | Average Accuracy | 84.9 | 254 |
| Image Classification | PACS | Overall Average Accuracy | 82.8 | 230 |
| Domain Generalization | PACS (test) | Average Accuracy | 84.9 | 225 |
| Domain Generalization | PACS | Accuracy (Art) | 83.3 | 221 |
| Image Classification | Office-Home (test) | Mean Accuracy | 65.6 | 199 |
| Person Re-Identification | Market-1501 to DukeMTMC-reID (test) | Rank-1 | 50.1 | 172 |
| Domain Generalization | PACS (leave-one-domain-out) | Art Accuracy | 83.3 | 146 |
| Image Classification | OfficeHome | Average Accuracy | 65.6 | 131 |
| Object Recognition | PACS (leave-one-domain-out) | Acc (Art painting) | 83.3 | 112 |
| Domain Generalization | Office-Home (test) | Average Accuracy | 67.66 | 106 |
Showing 10 of 18 rows
