
Mixture Outlier Exposure: Towards Out-of-Distribution Detection in Fine-grained Environments

About

Many real-world scenarios in which DNN-based recognition systems are deployed have inherently fine-grained attributes (e.g., bird-species recognition, medical image classification). In addition to achieving reliable accuracy, a critical subtask for these models is to detect out-of-distribution (OOD) inputs. Given the nature of the deployment environment, one may expect such OOD inputs to also be fine-grained w.r.t. the known classes (e.g., a novel bird species), making them extremely difficult to identify. Unfortunately, OOD detection in fine-grained scenarios remains largely underexplored. In this work, we aim to fill this gap by first carefully constructing four large-scale fine-grained test environments, in which existing methods are shown to have difficulties. In particular, we find that even explicitly incorporating a diverse set of auxiliary outlier data during training does not provide sufficient coverage of the broad region where fine-grained OOD samples lie. We then propose Mixture Outlier Exposure (MixOE), which mixes ID data and training outliers to expand the coverage of different OOD granularities, and trains the model such that the prediction confidence decays linearly as the input transitions from ID to OOD. Extensive experiments and analyses demonstrate the effectiveness of MixOE for building OOD detectors in fine-grained environments. The code is available at https://github.com/zjysteven/MixOE.
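The core idea, mixing an ID sample with an auxiliary outlier and making the training target decay linearly from the one-hot ID label toward a uniform distribution, can be sketched as follows. This is a minimal NumPy illustration of the scheme described in the abstract; the function names, array setup, and mixup-style interpolation are assumptions for exposition, not the released implementation (see the GitHub link for the official code).

```python
import numpy as np

def mixoe_batch(x_id, y_id, x_oe, num_classes, lam):
    """Prepare one MixOE training batch (illustrative sketch).

    x_id : ID inputs, shape (B, ...)
    y_id : integer ID labels, shape (B,)
    x_oe : auxiliary outlier inputs, same shape as x_id
    lam  : mixing coefficient in [0, 1]; lam=1 -> pure ID, lam=0 -> pure outlier
    """
    # Mixup-style interpolation between ID inputs and outliers
    x_mix = lam * x_id + (1.0 - lam) * x_oe
    # Soft target: confidence decays linearly from one-hot toward uniform
    onehot = np.eye(num_classes)[y_id]
    uniform = np.full((len(y_id), num_classes), 1.0 / num_classes)
    y_soft = lam * onehot + (1.0 - lam) * uniform
    return x_mix, y_soft

def soft_cross_entropy(logits, y_soft):
    """Cross-entropy against soft targets (numerically stable log-softmax)."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -(y_soft * log_probs).sum(axis=1).mean()
```

Training a classifier on `(x_mix, y_soft)` pairs with `soft_cross_entropy` encourages exactly the behavior the abstract describes: inputs near the ID manifold receive confident predictions, while inputs drifting toward the outlier distribution receive increasingly uniform (low-confidence) ones.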

Jingyang Zhang, Nathan Inkawhich, Randolph Linderman, Yiran Chen, Hai Li • 2021

Related benchmarks

Task | Dataset | Metric | Result | Rank
Image Classification | ImageNet-1K 1.0 (val) | Top-1 Accuracy | 75.47 | 1866
Image Classification | ImageNet-1K | Top-1 Acc | 74.62 | 836
Image Classification | CIFAR-100 | -- | -- | 622
Image Classification | CIFAR-10 | Accuracy | 96.6 | 507
Out-of-Distribution Detection | CIFAR-100 | AUROC | 86.46 | 107
Out-of-Distribution Detection | CIFAR-10 | AUROC | 97.59 | 105
OOD Detection | CIFAR-100 standard (test) | AUROC (%) | 92.93 | 94
OOD Detection | CIFAR-10 (ID) vs Places 365 (OOD) | AUROC | 96.92 | 77
OOD Detection | CIFAR-100 IND SVHN OOD | AUROC (%) | 92.27 | 74
Out-of-Distribution Detection | ImageNet-1k (ID) with 4 OOD datasets (iNaturalist, SUN, Places, Textures) | FPR95 | 58 | 45

Showing 10 of 31 rows
