
Optimal Transport-Induced Samples against Out-of-Distribution Overconfidence

About

Deep neural networks (DNNs) often produce overconfident predictions on out-of-distribution (OOD) inputs, undermining their reliability in open-world environments. Singularities in semi-discrete optimal transport (OT) mark regions of semantic ambiguity, where classifiers are particularly prone to unwarranted high-confidence predictions. Motivated by this observation, we propose a principled framework to mitigate OOD overconfidence by leveraging the geometry of OT-induced singular boundaries. Specifically, we formulate an OT problem between a continuous base distribution and the latent embeddings of training data, and identify the resulting singular boundaries. By sampling near these boundaries, we construct a class of OOD inputs, termed optimal transport-induced OOD samples (OTIS), which are geometrically grounded and inherently semantically ambiguous. During training, a confidence suppression loss is applied to OTIS to guide the model toward more calibrated predictions in structurally uncertain regions. Extensive experiments show that our method significantly alleviates OOD overconfidence and outperforms state-of-the-art methods.
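The pipeline in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a quadratic-cost semi-discrete OT with already-solved dual potentials (here zero, which reduces Laguerre cells to Voronoi cells), a uniform box as the continuous base distribution, and illustrative names (`laguerre_scores`, `sample_otis`, `confidence_suppression_loss`). Points are kept when their two best cell scores nearly tie, i.e. they lie close to a singular boundary where the OT map is non-unique.

```python
import numpy as np

def laguerre_scores(x, centers, psi):
    # Score of point x w.r.t. each discrete target y_i under quadratic cost:
    # 0.5 * ||x - y_i||^2 - psi_i. The OT map sends x to the argmin cell.
    d2 = ((centers - x) ** 2).sum(axis=1)
    return 0.5 * d2 - psi

def sample_otis(centers, psi, n_draws, eps, rng):
    """Rejection-sample points near singular (Laguerre cell) boundaries.

    A draw is kept when its two smallest cell scores differ by less than
    `eps`, i.e. it sits near the locus where the transport map is
    non-unique -- the geometrically ambiguous region used for OTIS.
    """
    lo, hi = centers.min(0) - 1.0, centers.max(0) + 1.0
    kept = []
    for _ in range(n_draws):
        x = rng.uniform(lo, hi)            # continuous base distribution (uniform box here)
        s = np.sort(laguerre_scores(x, centers, psi))
        if s[1] - s[0] < eps:              # near-tie between two cells => near a boundary
            kept.append(x)
    return np.array(kept)

def confidence_suppression_loss(logits):
    # Cross-entropy against the uniform label distribution: minimized when
    # predictions on boundary samples are flat, i.e. low-confidence.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_p.mean()
```

For two class embeddings at (-1, 0) and (1, 0) with zero potentials, the singular boundary is the perpendicular bisector x = 0, and every kept sample lands within eps/2 of it; the loss is smallest for uniform logits and grows as predictions sharpen.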

Keke Tang, Ziyong Du, Xiaofei Wang, Weilong Peng, Peican Zhu, Zhihong Tian · 2026

Related benchmarks

Task                           Dataset              Result          Rank
Out-of-Distribution Detection  iNaturalist          FPR@95 49.16    200
Out-of-Distribution Detection  Textures             --              141
Out-of-Distribution Detection  Places               FPR@95 68.08    110
Out-of-Distribution Detection  SUN                  FPR@95 68.17    71
Out-of-Distribution Detection  MNIST                --              13
Out-of-Distribution Detection  FMNIST               --              13
Confidence calibration         CIFAR-10 ID (test)   ECE 1.88        9
Confidence calibration         FMNIST ID (test)     ECE 3.26        9
Confidence calibration         MNIST ID (test)      ECE 0.14        9
Confidence calibration         CIFAR-100 ID (test)  ECE 6.91        9
(showing 10 of 18 rows)
