When is invariance useful in an Out-of-Distribution Generalization problem?
About
The goal of the Out-of-Distribution (OOD) generalization problem is to train a predictor that generalizes across all environments. Popular approaches in this field rest on the hypothesis that such a predictor should be an *invariant predictor*, one that captures the mechanism that remains constant across environments. While these approaches have been experimentally successful in various case studies, there is still much room for the theoretical validation of this hypothesis. This paper presents a new set of theoretical conditions necessary for an invariant predictor to achieve OOD optimality. Our theory not only applies to non-linear cases, but also generalizes the necessary condition used in Rojas-Carulla et al. (2018). We also derive the Inter Gradient Alignment (IGA) algorithm from our theory and demonstrate its competitiveness on MNIST-derived benchmark datasets as well as on two of the three *Invariance Unit Tests* proposed by Aubin et al.
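The core idea behind gradient-alignment methods of this kind is to penalize disagreement between the per-environment risk gradients, so that one descent direction improves every environment at once. The sketch below is only an illustration of that idea under our own assumptions (a linear least-squares model, a variance-style penalty that sums each environment gradient's squared deviation from the mean gradient); it is not the authors' implementation, and the function names `env_gradient` and `iga_penalty` are ours.

```python
import numpy as np

def env_gradient(w, X, y):
    """Gradient of the mean-squared-error risk (1/2n)*||Xw - y||^2 w.r.t. w."""
    n = len(y)
    return X.T @ (X @ w - y) / n

def iga_penalty(w, envs):
    """Variance-style alignment penalty: sum over environments of the
    squared distance between each environment's gradient and the mean
    gradient. Zero iff all environment gradients coincide."""
    grads = [env_gradient(w, X, y) for X, y in envs]
    mean_g = np.mean(grads, axis=0)
    return float(sum(np.sum((g - mean_g) ** 2) for g in grads))

# Tiny demo with synthetic data.
rng = np.random.default_rng(0)
X1, y1 = rng.normal(size=(8, 3)), rng.normal(size=8)
X2, y2 = rng.normal(size=(8, 3)), rng.normal(size=8)
w = np.zeros(3)

print(iga_penalty(w, [(X1, y1), (X1, y1)]))  # identical environments -> 0.0
print(iga_penalty(w, [(X1, y1), (X2, y2)]))  # differing environments -> positive
```

In a training loop, a penalty like this would typically be added to the average risk with a trade-off coefficient, so the optimizer favors weights whose gradients agree across environments.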
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Node Classification | Cora Covariate shift (degree split) | OOD Accuracy | 62.62 | 50 |
| Regression | PovertyMap (test) | Worst-U/R Pearson Correlation | 0.43 | 43 |
| Regression | ACSIncome (test) | RMSE | 0.454 | 34 |
| Node Classification | Arxiv Covariate shift (degree split) | OOD Accuracy | 65.87 | 30 |
| Node Classification | WebKB Concept shift (university split) | OOD Test Accuracy | 28.44 | 30 |
| Node Classification | WebKB Covariate shift (university split) | OOD Test Accuracy | 29.89 | 30 |
| Node Classification | Arxiv Covariate shift (time split) | OOD Test Accuracy | 65.93 | 20 |
| Node Classification | Cora Covariate shift (word split) | OOD Test Accuracy | 65.07 | 15 |
| Node Classification | Cora Concept shift (word split) | OOD Accuracy | 64.56 | 15 |
| Node Classification | CBAS Covariate shift (color split) | OOD Accuracy | 65.71 | 15 |