
Cross-Domain Ensemble Distillation for Domain Generalization

About

Domain generalization is the task of learning models that generalize to unseen target domains. We propose a simple yet effective method for domain generalization, named cross-domain ensemble distillation (XDED), that learns domain-invariant features while encouraging the model to converge to flat minima, which has recently been shown to be a sufficient condition for domain generalization. To this end, our method generates an ensemble of the output logits from training data with the same label but from different domains, and then penalizes each output for its mismatch with the ensemble. Also, we present a de-stylization technique that standardizes features to encourage the model to produce style-consistent predictions even in an arbitrary target domain. Our method greatly improves generalization capability on public benchmarks for cross-domain image classification, cross-dataset person re-ID, and cross-dataset semantic segmentation. Moreover, we show that models learned by our method are robust against adversarial attacks and image corruptions.
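The two ideas above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' reference implementation: the function names, the temperature value, and the KL-divergence form of the mismatch penalty are assumptions based on the description in the abstract.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def xded_loss(logits, labels, T=4.0):
    """Cross-domain ensemble distillation loss (sketch).

    logits: (N, C) output logits for a batch drawn from multiple domains
    labels: (N,) integer class labels
    For each class present in the batch, the ensemble target is the softened
    mean logit of all samples sharing that label; each sample is then
    penalized by the KL divergence from the ensemble to its own prediction.
    """
    loss, count = 0.0, 0
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        if len(idx) < 2:
            continue  # need at least two samples to form an ensemble
        ensemble = softmax(logits[idx].mean(axis=0), T)  # soft target
        preds = softmax(logits[idx], T)
        # KL(ensemble || pred), summed over classes per sample
        kl = (ensemble * (np.log(ensemble + 1e-12)
                          - np.log(preds + 1e-12))).sum(axis=-1)
        loss += kl.sum()
        count += len(idx)
    return loss / max(count, 1)

def destylize(features, eps=1e-5):
    # De-stylization (sketch): standardize each feature map over its spatial
    # dimensions to suppress instance-specific style statistics.
    # Assumes features of shape (N, C, H, W).
    mu = features.mean(axis=(2, 3), keepdims=True)
    sigma = features.std(axis=(2, 3), keepdims=True)
    return (features - mu) / (sigma + eps)
```

When all same-label samples already agree, the ensemble equals each prediction and the loss is zero; disagreement across domains pulls every prediction toward the shared soft target, which is what encourages domain-invariant, flat solutions.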

Kyungmoon Lee, Sungyeon Kim, Suha Kwak · 2022

Related benchmarks

Task | Dataset | Result | Rank
Semantic Segmentation | Cityscapes (test) | mIoU: 39.2 | 1145
Image Classification | PACS (test) | Average Accuracy: 86.4 | 254
Image Classification | PACS | Overall Average Accuracy: 86.4 | 230
Domain Generalization | PACS (test) | Average Accuracy: 66.5 | 225
Person Re-Identification | Market-1501 to DukeMTMC-reID (test) | Rank-1: 49.3 | 172
Image Classification | OfficeHome | Average Accuracy: 67.4 | 131
Person Re-Identification | DukeMTMC-reID to Market-1501 (test) | Rank-1 Accuracy: 59.0 | 119
Domain Generalization | OfficeHome (leave-one-domain-out) | Art Accuracy: 60.8 | 59
Open Domain Generalization | OfficeHome | Accuracy: 61.96 | 43
Semantic Segmentation | Mapillary (test) | mIoU: 37.1 | 30

Showing 10 of 23 rows

Other info

Code
