
On the Importance of Gradients for Detecting Distributional Shifts in the Wild

About

Detecting out-of-distribution (OOD) data has become a critical component in ensuring the safe deployment of machine learning models in the real world. Existing OOD detection approaches primarily rely on the output or feature space for deriving OOD scores, while largely overlooking information from the gradient space. In this paper, we present GradNorm, a simple and effective approach for detecting OOD inputs by utilizing information extracted from the gradient space. GradNorm directly employs the vector norm of gradients, backpropagated from the KL divergence between the softmax output and a uniform probability distribution. Our key idea is that the magnitude of gradients is higher for in-distribution (ID) data than for OOD data, making it informative for OOD detection. GradNorm demonstrates superior performance, reducing the average FPR95 by up to 16.33% compared to the previous best method.

Rui Huang, Andrew Geng, Yixuan Li · 2021
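The score described in the abstract can be computed in closed form when gradients are taken with respect to a single linear classification layer: the gradient of the KL divergence to uniform with respect to the logits is simply `softmax(logits) − 1/C`. Below is a minimal NumPy sketch under that assumption; the function name and signature are illustrative and not taken from the paper's released code.

```python
import numpy as np

def gradnorm_score(features, W, temperature=1.0):
    """Sketch of a GradNorm-style OOD score for one linear layer.

    features: (D,) penultimate-layer feature vector
    W: (C, D) weight matrix of the final linear layer
    Returns the L1 norm of d KL(uniform || softmax) / dW.
    Higher scores suggest in-distribution inputs.
    """
    logits = W @ features / temperature
    # numerically stable softmax
    z = logits - logits.max()
    p = np.exp(z) / np.exp(z).sum()
    C = p.shape[0]
    # Gradient of KL(u || softmax) w.r.t. the logits is p - 1/C.
    g_logits = p - 1.0 / C
    # Chain rule: dKL/dW = outer(g_logits, features); take its L1 norm.
    return np.abs(np.outer(g_logits, features)).sum()
```

A confident (peaked) softmax yields a large gradient norm, while a near-uniform softmax yields a score close to zero, which is the intuition behind using the magnitude of gradients to separate ID from OOD inputs.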

Related benchmarks

Task | Dataset | Metric | Result | Rank
Image Classification | ImageNet-1K | – | – | 600
Out-of-Distribution Detection | iNaturalist | AUROC | 93.97 | 219
Out-of-Distribution Detection | SUN OOD with ImageNet-1k In-distribution (test) | FPR@95 | 40.73 | 204
Out-of-Distribution Detection | Textures | AUROC | 0.7312 | 168
Out-of-Distribution Detection | Places | FPR95 | 55.62 | 142
Out-of-Distribution Detection | ImageNet OOD Average 1k (test) | FPR@95 | 28.92 | 137
Out-of-Distribution Detection | ImageNet-1k ID iNaturalist OOD | FPR95 | 42.46 | 132
OOD Detection | CIFAR-10 (IND) SVHN (OOD) | AUROC | 0.9411 | 131
Out-of-Distribution Detection | Texture | AUROC | 90.99 | 113
Out-of-Distribution Detection | ImageNet | FPR95 | 54.9 | 108

Showing 10 of 134 rows.
