# Synthesizing Near-Boundary OOD Samples for Out-of-Distribution Detection

## About
Pre-trained vision-language models have exhibited remarkable abilities in detecting out-of-distribution (OOD) samples. However, some challenging OOD samples, which lie close to in-distribution (InD) data in image feature space, can still lead to misclassification. The emergence of foundation models like diffusion models and multimodal large language models (MLLMs) offers a potential solution to this issue. In this work, we propose SynOOD, a novel approach that harnesses foundation models to generate synthetic, challenging OOD data for fine-tuning CLIP models, thereby enhancing boundary-level discrimination between InD and OOD samples. Our method uses an iterative in-painting process guided by contextual prompts from MLLMs to produce nuanced, boundary-aligned OOD samples. These samples are refined through noise adjustments based on gradients from OOD scores like the energy score, effectively sampling from the InD/OOD boundary. With these carefully synthesized images, we fine-tune the CLIP image encoder and negative label features derived from the text encoder to strengthen connections between near-boundary OOD samples and a set of negative labels. Finally, SynOOD achieves state-of-the-art performance on the large-scale ImageNet benchmark, with minimal increases in parameters and runtime. Our approach significantly surpasses existing methods, and the code is available at https://github.com/Jarvisgivemeasuit/SynOOD.
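The energy score used above to guide refinement is a standard OOD score computed from a model's logits: E(x) = −T · logsumexp(logits / T), where more confident (more in-distribution) predictions yield lower energy. A minimal numpy sketch with hypothetical logits (not the paper's implementation, which differentiates this score through the diffusion noise):

```python
import numpy as np

def energy_score(logits, T=1.0):
    """Energy-based OOD score: E(x) = -T * logsumexp(logits / T).

    Lower (more negative) energy suggests in-distribution; higher
    energy suggests OOD. Uses a numerically stable logsumexp.
    """
    z = np.asarray(logits, dtype=np.float64) / T
    m = z.max(axis=-1, keepdims=True)
    lse = np.squeeze(m, axis=-1) + np.log(np.exp(z - m).sum(axis=-1))
    return -T * lse

# A confidently classified sample (one dominant logit) scores lower
# energy than a maximally uncertain one (flat logits).
ind_energy = energy_score([10.0, 0.0, 0.0])  # approx -10.0
ood_energy = energy_score([1.0, 1.0, 1.0])   # -(1 + ln 3), approx -2.10
```

In SynOOD, gradients of this score with respect to the in-painting noise push synthesized samples toward the InD/OOD boundary; the sketch above only shows the score itself.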
## Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Out-of-Distribution Detection | SUN OOD with ImageNet-1k In-distribution (test) | FPR@95: 20.46 | 204 |
| Out-of-Distribution Detection | ImageNet-1k ID, iNaturalist OOD | FPR95: 1.57 | 132 |
| OOD Detection | iNaturalist (OOD) / ImageNet-1k (ID) 1.0 (test) | FPR95: 1.57 | 64 |
| Out-of-Distribution Detection | ImageNet-1K Near-OOD, OpenOOD v1.5 | AUROC: 77.55 | 51 |
| OOD Detection | ImageNet-1k ID, Average OOD | AUROC: 0.9701 | 50 |
| Out-of-Distribution Detection | ImageNet-1K OOD Average | AUROC: 97.01 | 50 |
| Out-of-Distribution Detection | Places OOD, ImageNet-1k ID | AUROC: 97.37 | 45 |
| OOD Detection | ImageNet-1K | Average FPR95: 14.27 | 44 |
| Out-of-Distribution Detection | ImageNet-1k (ID) vs Textures (OOD) | AUROC: 95.29 | 43 |
| Out-of-Distribution Detection | ImageNet-1k (ID) vs Textures (OOD) 1.0 (test) | AUC: 95.29 | 40 |