
DCAC: Dynamic Class-Aware Cache Creates Stronger Out-of-Distribution Detectors

About

Out-of-distribution (OOD) detection remains a fundamental challenge for deep neural networks, particularly because they make overconfident predictions on unseen OOD samples at test time. We reveal a key insight: OOD samples that are predicted as the same class, or assigned high probability for it, are visually more similar to one another than to the true in-distribution (ID) samples of that class. Motivated by this class-specific observation, we propose DCAC (Dynamic Class-Aware Cache), a training-free, test-time calibration module that maintains a separate cache for each ID class to collect high-entropy samples and calibrate the raw predictions of input samples. DCAC leverages cached visual features and predicted probabilities through a lightweight two-layer module to mitigate overconfident predictions on OOD samples. This module can be seamlessly integrated with various existing OOD detection methods across both unimodal and vision-language models while introducing minimal computational overhead. Extensive experiments on multiple OOD benchmarks demonstrate that DCAC significantly enhances existing methods, e.g., reducing FPR95 by 6.55% when integrated with ASH-S on the ImageNet OOD benchmark.
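The abstract specifies only the high-level mechanism: one cache per ID class, filled with high-entropy test samples, whose stored features and probabilities are used to calibrate each new prediction. The sketch below illustrates that idea under stated assumptions; the capacity, eviction rule, cosine-similarity weighting, and blending formula are all illustrative guesses, not the paper's actual two-layer module.

```python
import numpy as np

class DynamicClassAwareCache:
    """Hedged sketch of a per-class cache for test-time OOD calibration.

    Assumptions (not from the paper): a fixed per-class capacity,
    eviction that keeps the highest-entropy entries, and calibration
    that blends the raw probabilities toward the cached probabilities
    of visually similar samples.
    """

    def __init__(self, num_classes, capacity=8):
        self.capacity = capacity
        # one list of (feature, probs, entropy) tuples per ID class
        self.caches = {c: [] for c in range(num_classes)}

    @staticmethod
    def entropy(probs):
        p = np.clip(probs, 1e-12, 1.0)
        return float(-(p * np.log(p)).sum())

    def update(self, feature, probs):
        """Insert a sample into the cache of its predicted class."""
        c = int(np.argmax(probs))
        cache = self.caches[c]
        cache.append((feature, probs, self.entropy(probs)))
        if len(cache) > self.capacity:
            # keep the highest-entropy entries (assumed eviction rule)
            cache.sort(key=lambda e: e[2], reverse=True)
            del cache[self.capacity:]

    def calibrate(self, feature, probs, alpha=0.5):
        """Blend raw probs toward cached probs of visually similar samples."""
        c = int(np.argmax(probs))
        cache = self.caches[c]
        if not cache:
            return probs
        feats = np.stack([e[0] for e in cache])
        cached_probs = np.stack([e[1] for e in cache])
        # cosine similarity between the input feature and cached features
        sims = feats @ feature / (
            np.linalg.norm(feats, axis=1) * np.linalg.norm(feature) + 1e-12)
        w = np.exp(sims) / np.exp(sims).sum()
        blended = (1 - alpha) * probs + alpha * (w @ cached_probs)
        return blended / blended.sum()
```

Under these assumptions, an overconfident input whose feature resembles cached high-entropy samples of the same predicted class is pulled toward their flatter probability vectors, which lowers confidence-based OOD scores such as the maximum softmax probability.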

Yanqi Wu, Qichao Chen, Runhe Lai, Xinhua Lu, Jia-Xin Zhuang, Zhilin Zhao, Wei-Shi Zheng, Ruixuan Wang • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Out-of-Distribution Detection | SUN OOD with ImageNet-1k In-distribution (test) | FPR95 | 10.18 | 159 |
| Out-of-Distribution Detection | ImageNet-1k ID, iNaturalist OOD | FPR95 | 1.2 | 87 |
| Out-of-Distribution Detection | ImageNet (ID) vs Places365 (OOD) 1.0 (test) | FPR95 | 25.29 | 41 |
| Out-of-Distribution Detection | ImageNet-1K (ID) vs Textures (OOD) (test) | FPR95 | 11.7 | 34 |
| Out-of-Distribution Detection | ImageNet-1K Far-OOD Average (test) | FPR95 | 14.32 | 18 |
