
Nonparametric Uncertainty Quantification for Single Deterministic Neural Network

About

This paper proposes a fast and scalable method for quantifying the uncertainty of machine learning models' predictions. First, we show a principled way to measure the uncertainty of a classifier's predictions based on the Nadaraya-Watson nonparametric estimate of the conditional label distribution. Importantly, the proposed approach allows us to explicitly disentangle aleatoric and epistemic uncertainties. The resulting method works directly in the feature space; however, it can be applied to any neural network by considering the embedding of the data induced by the network. We demonstrate the strong performance of the method on uncertainty estimation tasks for text classification problems and for a variety of real-world image datasets, such as MNIST, SVHN, CIFAR-100, and several versions of ImageNet.
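The Nadaraya-Watson estimate at the heart of the method can be sketched as follows. This is an illustrative NumPy implementation with a Gaussian kernel and a simple uncertainty split (near-uniform class probabilities signal aleatoric uncertainty; low total kernel mass signals epistemic uncertainty); the paper's exact kernel, bandwidth selection, and uncertainty decomposition differ, and all names below are illustrative assumptions.

```python
import numpy as np

def nw_class_probs(x, X_train, y_train, n_classes, bandwidth=1.0):
    """Nadaraya-Watson estimate of p(y | x) in a feature/embedding space.

    Weights each training label by a Gaussian kernel on the distance from
    the query point, then normalizes over classes. Also returns the total
    kernel mass, which drops toward zero far from the training data.
    """
    d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))       # Gaussian kernel weights
    class_mass = np.array([w[y_train == c].sum() for c in range(n_classes)])
    total = class_mass.sum()
    if total == 0.0:                               # far from all training data
        return np.full(n_classes, 1.0 / n_classes), 0.0
    return class_mass / total, total

# Toy two-class example in a 2-D "embedding" space.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[-2.0, 0.0], scale=0.5, size=(50, 2))
X1 = rng.normal(loc=[+2.0, 0.0], scale=0.5, size=(50, 2))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 50 + [1] * 50)

# In-distribution query: confident prediction, large kernel mass.
p_in, mass_in = nw_class_probs(np.array([-2.0, 0.0]), X_train, y_train, 2)
# Far-away query: uniform probabilities, zero kernel mass (epistemic).
p_far, mass_far = nw_class_probs(np.array([0.0, 50.0]), X_train, y_train, 2)

print(p_in, mass_in)
print(p_far, mass_far)
```

Working in the network's embedding space means the estimator needs only a single deterministic forward pass per query plus a kernel lookup against stored training embeddings, which is what makes the approach fast and scalable relative to ensembles or Monte Carlo dropout.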

Nikita Kotelevskii, Aleksandr Artemenkov, Kirill Fedyanin, Fedor Noskov, Alexander Fishkov, Artem Shelmanov, Artem Vazhentsev, Aleksandr Petiushko, Maxim Panov · 2022

Related benchmarks

Task                           | Dataset                                              | Metric  | Result | Rank
-------------------------------|------------------------------------------------------|---------|--------|-----
Out-of-Distribution Detection  | CIFAR-100 (in-distribution) vs SVHN (out-of-distribution) (test) | AUROC   | 89.7   | 90
Out-of-Distribution Detection  | ImageNet-O                                           | AUROC   | 0.824  | 74
Out-of-Distribution Detection  | CIFAR-100 (in-distribution) / LSUN (out-of-distribution) (test) | AUROC   | 92.3   | 67
Out-of-Distribution Detection  | CIFAR-100 (ID) vs SVHN (OOD) (test)                  | AUROC   | 89.7   | 40
Out-of-Distribution Detection  | CIFAR-100 (in-distribution) vs Smooth (OOD)          | AUC     | 96.8   | 22
Out-of-Distribution Detection  | ImageNet-R                                           | ROC AUC | 0.995  | 9
Out-of-Distribution Detection  | CIFAR-100 (in-distribution) and LSUN (out-of-distribution) (test) | ROC AUC | 92.3   | 6
Out-of-Distribution Detection  | ImageNet-R (test)                                    | ROC AUC | 99.5   | 5
Out-of-Distribution Detection  | ImageNet-O (test)                                    | --      | --     | 5

Other info

Code
