How to Overcome Curse-of-Dimensionality for Out-of-Distribution Detection?
About
Machine learning models deployed in the wild can be challenged by out-of-distribution (OOD) data from unknown classes. Recent advances in OOD detection rely on distance measures to distinguish samples that lie relatively far from the in-distribution (ID) data. Despite their promise, distance-based methods can suffer from the curse of dimensionality, which limits their efficacy in high-dimensional feature spaces. To combat this problem, we propose a novel framework, Subspace Nearest Neighbor (SNN), for OOD detection. During training, our method regularizes the model and its feature representation by leveraging the most relevant subset of dimensions (i.e., a subspace). Subspace learning yields highly distinguishable distance measures between ID and OOD data. We provide comprehensive experiments and ablations to validate the efficacy of SNN. Compared to the current best distance-based method, SNN reduces the average FPR95 by 15.96% on the CIFAR-100 benchmark.
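The intuition behind subspace distances can be illustrated with a toy experiment, sketched below under stated assumptions: this is not the paper's training procedure (SNN learns the subspace during training), but a NumPy-only demonstration of why a k-NN distance restricted to the relevant dimensions separates ID from OOD far better than the same distance in the full high-dimensional space. All sizes, shifts, and the variance-free "oracle" subspace choice here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumption): ID features form a shifted cluster in only 8 of 256
# dimensions; the remaining dimensions carry pure noise, mimicking irrelevant
# directions in a high-dimensional feature space.
d, informative, shift = 256, 8, 2.0

def make_feats(n, s):
    x = rng.normal(size=(n, d))
    x[:, :informative] += s          # signal lives only in the first 8 dims
    return x

bank     = make_feats(200, shift)    # ID feature bank (stand-in for training features)
id_test  = make_feats(100, shift)    # held-out ID queries
ood_test = make_feats(100, 0.0)      # OOD queries: no shift in the subspace

def knn_score(q, bank, dims, k=10):
    # Negative distance to the k-th nearest bank feature, restricted to `dims`.
    dists = np.linalg.norm(bank[None, :, dims] - q[:, None, dims], axis=-1)
    return -np.sort(dists, axis=1)[:, k - 1]   # higher score => more ID-like

def auroc(pos, neg):
    # Rank-based AUROC: P(score of a random ID sample > score of a random OOD sample).
    return float((pos[:, None] > neg[None, :]).mean())

full_dims = np.arange(d)
sub_dims  = np.arange(informative)   # oracle subspace, for illustration only
auc_full = auroc(knn_score(id_test, bank, full_dims),
                 knn_score(ood_test, bank, full_dims))
auc_sub  = auroc(knn_score(id_test, bank, sub_dims),
                 knn_score(ood_test, bank, sub_dims))
print(f"AUROC full space: {auc_full:.3f}  subspace: {auc_sub:.3f}")
```

In the full 256-dimensional space, the noise dimensions dominate every pairwise distance, so ID and OOD k-NN distances concentrate around similar values; restricted to the informative subspace, the same scoring rule separates them almost perfectly.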
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Out-of-Distribution Detection | CIFAR-10 vs CIFAR-100 (test) | AUROC | 89.8 | 93 |
| OOD Detection | CIFAR-10 (ID) vs Places365 (OOD) | AUROC | 96.84 | 77 |
| Out-of-Distribution Detection | CIFAR-10 (ID) vs iSUN (OOD) (test) | FPR95 | 6.02 | 41 |
| Out-of-Distribution Detection | CIFAR-10 (ID) vs Texture (OOD) | AUROC | 92.91 | 41 |
| Out-of-Distribution Detection | CIFAR-10 (ID) vs SVHN (OOD) | AUROC | 97.8 | 37 |
| OOD Detection | CIFAR-10 (ID) vs LSUN-R (OOD) | FPR95 | 10.93 | 25 |
| Out-of-Distribution Detection | CIFAR-10 OpenOOD far-OOD (test) | FPR95 | 29.15 | 18 |
| Out-of-Distribution Detection | Imagenette (ID) vs close ImageNet classes (Near-OOD) (test) | FPR95 | 89.9 | 3 |
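The two metrics in the table are standard for OOD detection: FPR95 is the false positive rate on OOD data at the threshold where 95% of ID samples are accepted (lower is better), and AUROC is the area under the ROC curve (higher is better). A minimal sketch of both, assuming higher scores mean "more ID-like" and using synthetic Gaussian scores in place of real detector outputs:

```python
import numpy as np

def fpr_at_95_tpr(id_scores, ood_scores):
    # Pick the threshold that accepts 95% of ID samples (5th percentile of
    # ID scores), then measure the fraction of OOD samples still accepted.
    thresh = np.percentile(id_scores, 5)
    return float(np.mean(ood_scores >= thresh))

def auroc(id_scores, ood_scores):
    # Rank-based AUROC: probability that a random ID sample outscores
    # a random OOD sample (ties ignored; fine for continuous scores).
    id_scores = np.asarray(id_scores)
    ood_scores = np.asarray(ood_scores)
    return float((id_scores[:, None] > ood_scores[None, :]).mean())

# Synthetic scores for illustration: ID and OOD are unit Gaussians two
# standard deviations apart.
rng = np.random.default_rng(1)
id_s  = rng.normal(1.0, 1.0, 2000)
ood_s = rng.normal(-1.0, 1.0, 2000)

fpr95 = fpr_at_95_tpr(id_s, ood_s)
auc = auroc(id_s, ood_s)
print(f"FPR95: {fpr95:.3f}  AUROC: {auc:.3f}")
```

Note that the two metrics can disagree: FPR95 probes one operating point in the ROC curve's high-TPR region, while AUROC averages over all thresholds, which is why the table reports both.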