Enhancing The Reliability of Out-of-distribution Image Detection in Neural Networks
About
We consider the problem of detecting out-of-distribution images in neural networks. We propose ODIN, a simple and effective method that does not require any change to a pre-trained neural network. Our method is based on the observation that using temperature scaling and adding small perturbations to the input can separate the softmax score distributions between in- and out-of-distribution images, allowing for more effective detection. We show in a series of experiments that ODIN is compatible with diverse network architectures and datasets. It consistently outperforms the baseline approach by a large margin, establishing a new state-of-the-art performance on this task. For example, ODIN reduces the false positive rate from the baseline 34.7% to 4.3% on the DenseNet (applied to CIFAR-10) when the true positive rate is 95%.
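The two ingredients described above can be sketched in a few lines. Below is a minimal, illustrative implementation for a plain linear classifier (logits = W·x), where the gradient needed for the input perturbation has a closed form; the function name `odin_score` and the default values of `T` and `eps` are illustrative choices, not the paper's tuned settings (the paper selects them per network/dataset).

```python
import numpy as np

def softmax(z):
    # numerically stable softmax
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def odin_score(x, W, T=1000.0, eps=0.0014):
    """Illustrative ODIN score for a linear classifier with logits z = W @ x.

    1. Temperature-scale the logits by T.
    2. Perturb the input in the direction that increases the maximum
       temperature-scaled softmax probability (for a linear model the
       gradient of log p_c w.r.t. x is analytic; a deep network would
       use backpropagation instead).
    3. Return the maximum softmax probability of the perturbed input;
       thresholding this score separates in- from out-of-distribution.
    """
    z = W @ x
    p = softmax(z / T)
    c = np.argmax(p)
    # gradient of log p_c with respect to x for logits W @ x
    grad = (W[c] - p @ W) / T
    x_pert = x + eps * np.sign(grad)
    return softmax((W @ x_pert) / T).max()
```

With `T = 1` and `eps = 0` this reduces to the baseline maximum-softmax-probability detector; the temperature and perturbation are what widen the gap between the score distributions of in- and out-of-distribution inputs.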
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | ImageNet-1K | -- | -- | 600 |
| Out-of-Distribution Detection | iNaturalist | AUROC | 98.57 | 219 |
| Out-of-Distribution Detection | SUN OOD with ImageNet-1k In-distribution (test) | FPR@95 | 29.05 | 204 |
| Out-of-Distribution Detection | Textures | AUROC | 0.8785 | 168 |
| Out-of-Distribution Detection | Places | FPR95 | 55.06 | 142 |
| Out-of-Distribution Detection | ImageNet OOD Average 1k (test) | FPR@95 | 54.2 | 137 |
| Out-of-Distribution Detection | ImageNet-1k ID iNaturalist OOD | FPR95 | 30.22 | 132 |
| OOD Detection | CIFAR-10 (IND) SVHN (OOD) | AUROC | 0.9506 | 131 |
| Out-of-Distribution Detection | CIFAR-10 | AUROC | 93.86 | 121 |
| OOD Detection | CIFAR-10 (ID) vs Places 365 (OOD) | AUROC | 86.61 | 117 |