
Epistemic Uncertainty Quantification For Pre-trained Neural Networks

About

Epistemic uncertainty quantification (UQ) identifies where a model lacks knowledge. Traditional UQ methods, often based on Bayesian neural networks, are not suitable for pre-trained non-Bayesian models. This work quantifies epistemic uncertainty for any pre-trained model without requiring the original training data or model modifications, ensuring broad applicability across network architectures and training techniques. Specifically, we propose a gradient-based approach that analyzes the gradients of outputs with respect to model parameters; these gradients indicate how much the model would need to adjust to represent an input accurately. We first establish theoretical guarantees for gradient-based epistemic UQ, challenging the view that this uncertainty can only be computed from disagreement among multiple models. We further improve gradient-driven UQ with class-specific weights for integrating gradients and by emphasizing the distinct contributions of different network layers. Finally, we combine gradient and perturbation methods to refine the gradients and improve UQ accuracy. We evaluate the approach on out-of-distribution detection, uncertainty calibration, and active learning, demonstrating its superiority over state-of-the-art UQ methods for pre-trained models.
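The core idea above (gradients of the output with respect to parameters signal how much the model would need to change to fit an input) can be sketched in a few lines. This is a minimal NumPy illustration using a linear softmax classifier, not the paper's method: the per-class weighting by predicted probability is a simplified stand-in for its class-specific gradient integration, and all names are illustrative.

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

def gradient_uncertainty(W, b, x):
    """Gradient-based epistemic uncertainty score for one input x.

    For a softmax classifier, the gradient of the class-c negative
    log-likelihood w.r.t. W is outer(p - onehot(c), x). We aggregate
    the class-wise gradient norms, weighted by the predicted class
    probabilities (a simplified class-specific weighting).
    """
    p = softmax(W @ x + b)
    score = 0.0
    for c in range(len(p)):
        onehot = np.zeros_like(p)
        onehot[c] = 1.0
        grad_W = np.outer(p - onehot, x)  # d(-log p_c)/dW
        score += p[c] * np.linalg.norm(grad_W)
    return score
```

Intuitively, when the model is confident (p close to a one-hot vector), the weighted gradients nearly vanish and the score is low; for ambiguous inputs the gradients are large, indicating the parameters would need substantial adjustment to represent the input.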

Hanjing Wang, Qiang Ji • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Out-of-Distribution Detection | CIFAR-10 (in-distribution) vs SVHN (OOD), test | AUROC | 0.9662 | 101 |
| Out-of-Distribution Detection | CIFAR-100 (in-distribution) vs SVHN (OOD), test | AUROC | 90.22 | 90 |
| Out-of-Distribution Detection | CIFAR-10 (in-distribution) vs LSUN (OOD), test | AUROC | 94.99 | 73 |
| Out-of-Distribution Detection | CIFAR-100 (in-distribution) vs LSUN (OOD), test | AUROC | 88.38 | 67 |
| Out-of-Distribution Detection | SVHN (in-distribution) vs CIFAR-10 (OOD), test | AUROC | 91.11 | 56 |
| Out-of-Distribution Detection | MNIST (in-distribution) vs Fashion-MNIST (OOD), test | AUPR | 0.9879 | 36 |
| Out-of-Distribution Detection | SVHN (in-distribution) vs CIFAR-100 (OOD), test | AUROC | 90.37 | 22 |
| Active Learning | MNIST (test) | Accuracy | 75.31 | 12 |
| Active Learning | CIFAR-10 (test) | Accuracy | 39.28 | 12 |
| Active Learning | SVHN (test) | Accuracy | 67.37 | 12 |
Showing 10 of 12 benchmark results.
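The AUROC values in the table measure how well an uncertainty score separates in-distribution from out-of-distribution samples: it is the probability that a randomly chosen OOD sample receives a higher score than a randomly chosen in-distribution one. A minimal sketch of that computation (function name illustrative; ties count as half):

```python
import numpy as np

def auroc(scores_in, scores_out):
    """AUROC for OOD detection from two lists of uncertainty scores.

    Equals the probability that a random OOD sample scores higher than
    a random in-distribution sample (the Mann-Whitney U statistic,
    normalized). O(n*m) pairwise comparison; fine for small arrays.
    """
    s_in = np.asarray(scores_in, dtype=float)
    s_out = np.asarray(scores_out, dtype=float)
    greater = (s_out[:, None] > s_in[None, :]).sum()
    ties = (s_out[:, None] == s_in[None, :]).sum()
    return (greater + 0.5 * ties) / (len(s_out) * len(s_in))
```

A score of 1.0 means perfect separation, 0.5 is chance level; production code would typically use `sklearn.metrics.roc_auc_score` instead of this pairwise version.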

Other info

Code
