
Why Normalizing Flows Fail to Detect Out-of-Distribution Data

About

Detecting out-of-distribution (OOD) data is crucial for robust machine learning systems. Normalizing flows are flexible deep generative models that often surprisingly fail to distinguish between in- and out-of-distribution data: a flow trained on pictures of clothing assigns higher likelihood to handwritten digits. We investigate why normalizing flows perform poorly for OOD detection. We demonstrate that flows learn local pixel correlations and generic image-to-latent-space transformations which are not specific to the target image dataset. We show that by modifying the architecture of flow coupling layers we can bias the flow towards learning the semantic structure of the target data, improving OOD detection. Our investigation reveals that properties that enable flows to generate high-fidelity images can have a detrimental effect on OOD detection.
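The likelihood-thresholding setup that the abstract critiques can be sketched as follows. This is an illustrative assumption, not the paper's code: a single affine coupling layer with hypothetical fixed random weights `W_s` and `W_t` standing in for trained scale/translation networks, with a standard-normal base distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy affine coupling layer over D-dimensional inputs: the first half
# passes through unchanged; the second half is scaled and shifted by
# functions of the first half. W_s and W_t are hypothetical stand-ins
# for the trained scale/translation networks of a real flow.
D = 4
W_s = 0.1 * rng.standard_normal((D // 2, D // 2))
W_t = 0.1 * rng.standard_normal((D // 2, D // 2))

def coupling_forward(x):
    """Map x -> z and return (z, log|det Jacobian|)."""
    x1, x2 = x[: D // 2], x[D // 2 :]
    s = np.tanh(x1 @ W_s)                      # log-scale for x2
    t = x1 @ W_t                               # translation for x2
    z = np.concatenate([x1, x2 * np.exp(s) + t])
    return z, s.sum()                          # log-det is the sum of log-scales

def log_likelihood(x):
    """log p(x) = log N(z; 0, I) + log|det Jacobian| under the flow."""
    z, logdet = coupling_forward(x)
    log_pz = -0.5 * (z @ z + len(z) * np.log(2 * np.pi))
    return log_pz + logdet

def is_ood(x, threshold):
    # Standard scheme: flag inputs whose log-likelihood falls below a threshold.
    return log_likelihood(x) < threshold
```

The paper's central observation is that this score is unreliable: a flow trained on one image distribution can assign *higher* likelihood to OOD inputs, so thresholding `log_likelihood` as above fails to separate in- from out-of-distribution data.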

Polina Kirichenko, Pavel Izmailov, Andrew Gordon Wilson · 2020

Related benchmarks

Task            Dataset                                         Metric   Result   Rank
OOD Detection   ISIC Ink Artefacts (Similar)                    AUROC    83.96    70
OOD Detection   ISIC Colour Chart Artefacts Synth Similar       AUROC    97.48    40
OOD Detection   ISIC Colour Chart Artefacts Similar             AUROC    96.07    40
OOD Detection   ISIC Colour Chart Artefacts, Synth Dissimilar   AUROC    93.02    40
OOD Detection   ISIC Colour Chart Artefacts (Dissimilar)        AUROC    94.17    40
OOD Detection   ISIC Ink Artefacts (Dissimilar)                 AUROC    65.62    40
