
Predicting Out-of-Distribution Error with the Projection Norm

About

We propose a metric -- Projection Norm -- to predict a model's performance on out-of-distribution (OOD) data without access to ground truth labels. Projection Norm first uses model predictions to pseudo-label test samples and then trains a new model on the pseudo-labels. The more the new model's parameters differ from an in-distribution model, the greater the predicted OOD error. Empirically, our approach outperforms existing methods on both image and text classification tasks and across different network architectures. Theoretically, we connect our approach to a bound on the test error for overparameterized linear models. Furthermore, we find that Projection Norm is the only approach that achieves non-trivial detection performance on adversarial examples. Our code is available at https://github.com/yaodongyu/ProjNorm.
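The abstract's procedure can be sketched for the overparameterized linear case its theory covers. Below is a minimal, self-contained illustration (not the released ProjNorm code): a minimum-norm least-squares fit stands in for fine-tuning a network, and the helper names `fit` and `proj_norm` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit(X, y):
    # Minimum-norm least-squares fit; a hypothetical stand-in for
    # fine-tuning a network on labeled data.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def proj_norm(w_ref, X_test):
    """Projection Norm sketch:
    1) pseudo-label the test samples with the reference model,
    2) train a fresh model on those pseudo-labels,
    3) return the parameter distance to the reference model."""
    pseudo = np.sign(X_test @ w_ref)  # pseudo-labels from model predictions
    w_new = fit(X_test, pseudo)       # retrain on the pseudo-labels
    return float(np.linalg.norm(w_ref - w_new))

# Overparameterized linear setup (d > n), matching the theory section.
n, d = 50, 200
w_star = rng.normal(size=d)              # ground-truth labeling direction
X_train = rng.normal(size=(n, d))
y_train = np.sign(X_train @ w_star)
w_ref = fit(X_train, y_train)            # in-distribution reference model

# On in-distribution data the retrained model recovers w_ref, so the
# Projection Norm is near zero; under a covariate shift it grows.
X_ood = rng.normal(size=(n, d)) * rng.uniform(0.1, 3.0, size=d)
print(proj_norm(w_ref, X_train))  # near zero
print(proj_norm(w_ref, X_ood))    # substantially larger
```

The key property the sketch demonstrates: when the test set looks like the training distribution, pseudo-labels match the reference model's fit exactly, so retraining changes nothing; distribution shift breaks that agreement, and the parameter distance serves as the error proxy.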

Yaodong Yu, Zitong Yang, Alexander Wei, Yi Ma, Jacob Steinhardt • 2022

Related benchmarks

| Task | Dataset | Result (R²) | Rank |
| --- | --- | --- | --- |
| Accuracy Estimation | PACS | 0.474 | 50 |
| Accuracy Estimation | Entity-13 Subpopulation Shift | 0.952 | 36 |
| Accuracy Estimation | Entity-30 Subpopulation Shift | 0.959 | 36 |
| Accuracy Estimation | Nonliving-26 Subpopulation Shift | 0.939 | 36 |
| Accuracy Estimation | Living-17 Subpopulation Shift | 0.923 | 36 |
| Unsupervised Accuracy Estimation | RR1-WILDS | 0.878 | 36 |
| Unsupervised Accuracy Estimation | DomainNet | 0.363 | 36 |
| Unsupervised Accuracy Estimation | Office-Home | 0.172 | 36 |
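The R² results above are coefficients of determination between predicted and actual accuracy across shifted test sets: the closer to 1, the better the metric tracks true OOD performance. A small sketch of how that score is computed, with made-up accuracy numbers (the values below are illustrative, not from the benchmark):

```python
import numpy as np

def r_squared(y_true, y_pred):
    # Coefficient of determination: 1 minus the ratio of residual
    # variance to total variance of the true values.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical example: true OOD accuracies on five shifted test sets
# vs. accuracies predicted from an estimator's scores.
true_acc = np.array([0.82, 0.74, 0.61, 0.55, 0.43])
pred_acc = np.array([0.80, 0.70, 0.65, 0.52, 0.45])
print(round(r_squared(true_acc, pred_acc), 3))  # → 0.948
```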
