
Simple Domain Generalization Methods are Strong Baselines for Open Domain Generalization

About

In real-world applications, a machine learning model must handle open-set recognition (OSR), where unknown classes appear during inference, in addition to domain shift, where the data distribution differs between the training and inference phases. Domain generalization (DG) addresses the domain shift setting in which the target domain of the inference phase is inaccessible during model training. Open domain generalization (ODG) considers both DG and OSR. Domain-Augmented Meta-Learning (DAML) is a method targeting ODG, but it has a complicated learning process. On the other hand, although various DG methods have been proposed, they have not been evaluated in ODG settings. This work comprehensively evaluates existing DG methods in ODG and shows that two simple DG methods, CORrelation ALignment (CORAL) and Maximum Mean Discrepancy (MMD), are competitive with DAML in several cases. In addition, we propose simple extensions of CORAL and MMD that incorporate techniques used in DAML, such as ensemble learning and Dirichlet mixup data augmentation. The experimental evaluation demonstrates that the extended CORAL and MMD perform comparably to DAML at lower computational cost. This suggests that the simple DG methods and their simple extensions are strong baselines for ODG. The code used in the experiments is available at https://github.com/shiralab/OpenDG-Eval.
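To give a concrete sense of the two alignment losses the abstract names, below is a minimal NumPy sketch of CORAL (matching second-order feature statistics between domains) and a Gaussian-kernel MMD estimate. This is an illustrative sketch, not the paper's implementation; the function names and the 1/n_features kernel bandwidth are assumptions.

```python
import numpy as np

def coral_loss(xs, xt):
    """CORAL: squared Frobenius distance between the feature
    covariance matrices of two batches, scaled by 1/(4 d^2)."""
    d = xs.shape[1]
    cs = np.cov(xs, rowvar=False)
    ct = np.cov(xt, rowvar=False)
    return np.sum((cs - ct) ** 2) / (4.0 * d * d)

def mmd_loss(xs, xt, gamma=None):
    """Biased MMD^2 estimate with a Gaussian (RBF) kernel.
    gamma defaults to 1/n_features (a common heuristic)."""
    if gamma is None:
        gamma = 1.0 / xs.shape[1]
    def k(a, b):
        # pairwise squared distances -> RBF kernel matrix
        d2 = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
        return np.exp(-gamma * d2)
    return k(xs, xs).mean() + k(xt, xt).mean() - 2 * k(xs, xt).mean()

rng = np.random.default_rng(0)
a = rng.normal(size=(64, 8))
b = rng.normal(size=(64, 8))
# Batches from the same distribution give small losses;
# rescaling one batch increases both.
print(coral_loss(a, b), mmd_loss(a, b))
print(coral_loss(a, 2.0 * b), mmd_loss(a, 2.0 * b))
```

In DG training, losses of this form are added to the classification objective to pull the source domains' feature distributions together, which is what makes them cheap compared with DAML's meta-learning loop.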

Masashi Noguchi, Shinichi Shirakawa • 2023

Related benchmarks

Task                            Dataset             Metric          Result   Rank
Open Domain Generalization      OfficeHome          Acc             61.9     43
Open Set Domain Generalization  OfficeHome H=1      Accuracy        66.81    23
Open Set Domain Generalization  OfficeHome H=1/6    Accuracy        55.09    12
Open Set Domain Generalization  PACS                Accuracy (H=1)  84.48    12
Open Set Domain Generalization  OfficeHome Average  Accuracy        58.95    12
