
Rethinking Out-of-distribution (OOD) Detection: Masked Image Modeling is All You Need

About

The core of out-of-distribution (OOD) detection is to learn an in-distribution (ID) representation that is distinguishable from OOD samples. Previous work applied recognition-based methods to learn the ID features, which tend to learn shortcuts instead of comprehensive representations. In this work, we find, surprisingly, that simply using reconstruction-based methods can boost the performance of OOD detection significantly. We deeply explore the main contributors to OOD detection and find that reconstruction-based pretext tasks have the potential to provide a generally applicable and efficacious prior, which helps the model learn the intrinsic data distribution of the ID dataset. Specifically, we take Masked Image Modeling as the pretext task for our OOD detection framework (MOOD). Without bells and whistles, MOOD outperforms the previous SOTA on one-class OOD detection by 5.7%, on multi-class OOD detection by 3.0%, and on near-distribution OOD detection by 2.1%. It even defeats 10-shot-per-class outlier-exposure OOD detection, although we do not include any OOD samples for our detection.
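The idea the abstract describes, pretraining with a masked-reconstruction task and then scoring test inputs by their distance to the ID feature distribution, can be sketched as below. This is a minimal illustration, not the paper's implementation: the random patch mask stands in for the Masked Image Modeling pretext, synthetic Gaussian vectors stand in for encoder features, and a Mahalanobis-style distance is assumed as the OOD score.

```python
import numpy as np

def random_patch_mask(num_patches, mask_ratio, rng):
    """MIM-style pretext: choose a random subset of patches to mask.

    The model would be trained to reconstruct the masked patches,
    forcing it to learn the intrinsic ID data distribution.
    """
    n_mask = int(num_patches * mask_ratio)
    mask = np.zeros(num_patches, dtype=bool)
    mask[rng.permutation(num_patches)[:n_mask]] = True
    return mask

def fit_id_gaussian(features):
    """Fit a Gaussian to ID features extracted by the pretrained encoder."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    # Small ridge term keeps the covariance invertible.
    prec = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))
    return mu, prec

def ood_score(feat, mu, prec):
    """Mahalanobis distance to the ID distribution: higher = more OOD."""
    d = feat - mu
    return float(d @ prec @ d)

rng = np.random.default_rng(0)
mask = random_patch_mask(num_patches=196, mask_ratio=0.75, rng=rng)  # 147 of 196 patches masked

# Synthetic stand-ins for encoder features of ID data and one far-away OOD input.
id_feats = rng.normal(0.0, 1.0, size=(500, 8))
mu, prec = fit_id_gaussian(id_feats)
id_score = ood_score(id_feats[0], mu, prec)
far_score = ood_score(np.full(8, 5.0), mu, prec)
```

With these synthetic features, `far_score` comes out much larger than `id_score`, which is the separation the detector thresholds on; no OOD samples are needed to fit the score, matching the outlier-free setting in the abstract.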

Jingyao Li, Pengguang Chen, Shaozuo Yu, Zexin He, Shu Liu, Jiaya Jia • 2023

Related benchmarks

Task                                    | Dataset                                     | Result                      | Rank
Out-of-Distribution Detection           | ImageNet 1k (test)                          | Average AUROC: 89.1         | 58
Out-of-Distribution Detection           | ImageNet-30 In-distribution labeled (test)  | Mean AUROC: 0.989           | 32
Multi-class OOD detection               | CIFAR-10 (test)                             | OOD Score (SVHN): 99.8      | 11
One-class OOD detection                 | CIFAR-10 one-class v1                       | CIFAR-10 Plane Score: 98.6  | 10
Multi-class OOD detection               | CIFAR-100 (test)                            | OOD Accuracy (SVHN): 96.5   | 9
One-class Out-of-Distribution Detection | ImageNet 30                                 | AUROC: 0.92                 | 7
One-class Out-of-Distribution Detection | CIFAR-100 super-classes (test)              | AUROC: 0.948                | 7
Outlier Exposure OOD Detection          | CIFAR-10 near-distribution (test)           | AUROC: 99.41                | 6
