
Contextual Affinity Distillation for Image Anomaly Detection

About

Previous works on unsupervised industrial anomaly detection mainly focus on local structural anomalies such as cracks and color contamination. While they achieve high detection performance on this kind of anomaly, they struggle with logical anomalies that violate long-range dependencies, such as a normal object placed in the wrong position. In this paper, building on previous knowledge distillation work, we propose using two students (local and global) to better mimic the teacher's behavior. The local student, which is used in previous studies, mainly focuses on structural anomaly detection, while the global student attends to logical anomalies. To further encourage the global student to capture long-range dependencies, we design a global context condensing block (GCCB) and propose a contextual affinity loss for student training and anomaly scoring. Experimental results show that the proposed method requires no cumbersome training techniques and achieves new state-of-the-art performance on the MVTec LOCO AD dataset.
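The abstract does not spell out the contextual affinity loss, but the name suggests comparing pairwise feature relations rather than raw features between teacher and student. A minimal sketch of that idea, assuming cosine-similarity affinity matrices over spatial feature vectors and an L1 penalty between them (the function name, shapes, and penalty choice are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def contextual_affinity_loss(teacher_feat, student_feat, eps=1e-8):
    """Hypothetical sketch of a contextual affinity loss.

    Both inputs are (N, C) arrays: N spatial positions, C channels.
    For each network we build an (N, N) cosine-similarity ("affinity")
    matrix over positions, then penalise the mean absolute difference
    between the teacher's and the student's matrices, so the student is
    trained to match relations between positions (long-range context)
    rather than individual feature vectors.
    """
    def affinity(f):
        f = f / (np.linalg.norm(f, axis=1, keepdims=True) + eps)  # L2-normalise rows
        return f @ f.T  # (N, N) pairwise cosine similarities

    return float(np.abs(affinity(teacher_feat) - affinity(student_feat)).mean())
```

At test time the same per-position affinity discrepancy could serve as an anomaly score: positions whose relations to the rest of the image deviate from the teacher's would score high.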

Jie Zhang, Masanori Suganuma, Takayuki Okatani • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Anomaly Detection | MVTec-LOCO 1.0 (test) | ROC-AUC (Total) | 84 | 53 |
| Image-level Anomaly Detection | MVTec LOCO AD | AUROC (Logical) | 81.2 | 26 |
| Anomaly Localization | MVTec LOCO AD (test) | Mean Score | 0.73 | 19 |
| Anomaly Detection | MVTec LOCO | AUROC | 84 | 18 |
| Anomaly Detection | MVTec AD modified (test) | Structural AUROC | 0.955 | 11 |
| Anomaly Localization | MVTec LOCO AD 1.0 (test) | sPRO (Average) | 73 | 10 |
