
MVTec 3D-AD

Benchmarks

| Task Name | Dataset Name | SOTA Result | Trend |
|---|---|---|---|
| Anomaly Detection | MVTec 3D-AD 1.0 (test) | Mean Score: 0.989 | 134 |
| Anomaly Detection | MVTec 3D-AD | I-AUROC: 97.3 | 47 |
| Anomaly Segmentation | MVTec 3D-AD | Mean Score: 99.6 | 40 |
| Anomaly Localization | MVTec 3D-AD | AUPRO (Mean): 97.7 | 29 |
| Anomaly Detection | MVTec 3D-AD | AUPRO@30% (Bagel): 98.4 | 23 |
| Anomaly Detection | MVTec 3D-AD | AUPRO@30%: 97.6 | 17 |
| Anomaly Detection | MVTec 3D-AD Setting 4: 6-1 with 4 steps | I-AUROC: 78.6 | 9 |
| Anomaly Detection | MVTec 3D-AD Setting 3: 6-4 with 1 step | I-AUROC: 82.4 | 9 |
| Anomaly Detection | MVTec 3D-AD Setting 2: 9-1 with 1 step | I-AUROC: 87.5 | 9 |
| Anomaly Detection | MVTec 3D-AD Setting 1: 10-0 with 0 step | I-AUROC: 91 | 9 |
| Industrial Anomaly Detection | MVTec 3D-AD (test) | Mean I-AUROC: 98.9 | 7 |
| Anomaly Segmentation | MVTec 3D-AD (test) | I-AUROC: 0.954 | 6 |
| Multimodal Anomaly Localization | MVTec 3D-AD (test) | Bagel AUPRO@1%: 47.2 | 5 |
| Anomaly Classification | MVTec 3D-AD | Accuracy: 53.33 | 5 |
| Image-level Anomaly Detection | MVTec 3D-AD | AUROC: 93.37 | 5 |
| Anomaly Generation | MVTec 3D-AD | KID: 51.28 | 5 |
| Anomaly Segmentation | MVTec 3D-AD RGB+3D (test) | Pixel AUROC: 0.992 | 2 |
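Most of the scores above (I-AUROC, Pixel AUROC, AUROC) are areas under the ROC curve, computed from per-image or per-pixel anomaly scores against good/anomalous ground truth. As a minimal sketch of what an image-level AUROC measures, here is the Mann-Whitney formulation with purely illustrative labels and scores (not taken from any table entry):

```python
def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen anomalous sample scores
    higher than a randomly chosen good one (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative toy data: 0 = good image, 1 = anomalous image.
labels = [0, 0, 0, 1, 1, 1]
scores = [0.10, 0.40, 0.15, 0.80, 0.90, 0.30]  # model anomaly scores
print(auroc(labels, scores))  # < 1.0: one anomalous image ranks below a good one
```

A perfect ranking (every anomalous image scored above every good one) yields 1.0, which is why the leaderboard values cluster near 1.0 (or, equivalently, near 100 when reported as percentages).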