
Anomaly Detection via Reverse Distillation from One-Class Embedding

About

Knowledge distillation (KD) achieves promising results on the challenging problem of unsupervised anomaly detection (AD). The representation discrepancy of anomalies in the teacher-student (T-S) model provides essential evidence for AD. However, using similar or identical architectures to build the teacher and student models in previous studies hinders the diversity of anomalous representations. To tackle this problem, we propose a novel T-S model consisting of a teacher encoder and a student decoder, and accordingly introduce a simple yet effective "reverse distillation" paradigm. Instead of receiving raw images directly, the student network takes the teacher model's one-class embedding as input and aims to restore the teacher's multi-scale representations. Inherently, knowledge distillation in this study starts from abstract, high-level representations and proceeds to low-level features. In addition, we introduce a trainable one-class bottleneck embedding (OCBE) module in our T-S model. The resulting compact embedding effectively preserves essential information on normal patterns while abandoning anomaly perturbations. Extensive experiments on AD and one-class novelty detection benchmarks show that our method surpasses state-of-the-art performance, demonstrating the effectiveness and generalizability of the proposed approach.
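The core scoring idea described above — anomalies show up as a representation discrepancy between the teacher's features and the student's reconstructions — can be sketched with a per-location cosine distance over multi-scale feature maps. This is a minimal NumPy illustration, not the authors' implementation: the feature shapes, the `eps` stabilizer, and the max-pooled image-level score are assumptions for the sketch.

```python
import numpy as np

def cosine_anomaly_map(f_teacher, f_student, eps=1e-8):
    """Per-location anomaly map: 1 - cosine similarity along the channel axis.

    f_teacher, f_student: feature maps of shape (C, H, W) from the teacher
    encoder and the student decoder at one scale (shapes are illustrative).
    """
    num = (f_teacher * f_student).sum(axis=0)
    denom = np.linalg.norm(f_teacher, axis=0) * np.linalg.norm(f_student, axis=0) + eps
    return 1.0 - num / denom  # 0 where features agree, up to 2 where they oppose

def image_anomaly_score(teacher_feats, student_feats):
    """Image-level score: peak discrepancy over all scales and locations.

    teacher_feats, student_feats: lists of (C, H, W) arrays, one per scale.
    """
    maps = [cosine_anomaly_map(t, s) for t, s in zip(teacher_feats, student_feats)]
    return max(m.max() for m in maps)
```

For a normal image the student restores the teacher's features closely, so the map stays near zero; anomalous regions, absent from the one-class embedding, cannot be restored and yield large values.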

Hanqiu Deng, Xingyu Li • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Anomaly Localization | MVTec AD | Pixel AUROC | 98.3 | 513
Anomaly Detection | MVTec-AD (test) | I-AUROC | 98.5 | 327
Anomaly Detection | VisA | AUROC | 96.0 | 261
Anomaly Localization | MVTec-AD (test) | Pixel AUROC | 98.3 | 211
Anomaly Detection | MVTec-AD (test) | P-AUROC | 97.8 | 152
Pixel-level Anomaly Detection | MVTec | Pixel AUROC | 99.6 | 127
Anomaly Localization | VisA | P-AUROC | 96.5 | 119
Anomaly Segmentation | MVTec AD | -- | -- | 105
Anomaly Detection | VisA (test) | I-AUROC | 96.0 | 91
Anomaly Detection | BraTS 2018 (test) | AUROC (Image) | 92.69 | 88

Showing 10 of 254 rows.
