| Task Name | Dataset Name | SOTA Result | Trend |
|---|---|---|---|
| Anomaly Detection | MVTec 3D-AD 1.0 (test) | Mean Score: 0.989 | 134 |
| Anomaly Detection | MVTec 3D-AD | I-AUROC: 97.3 | 47 |
| Anomaly Segmentation | MVTec 3D-AD | Mean Score: 99.6 | 40 |
| Anomaly Localization | MVTec 3D-AD | AUPRO (Mean): 97.7 | 29 |
| Anomaly Detection | MVTec 3D-AD | AUPRO@30% (Bagel): 98.4 | 23 |
| Anomaly Detection | MVTec 3D-AD | AUPRO@30%: 97.6 | 17 |
| Anomaly Detection | MVTec 3D-AD Setting 4: 6-1 with 4 steps | I-AUROC: 78.6 | 9 |
| Anomaly Detection | MVTec 3D-AD Setting 3: 6-4 with 1 step | I-AUROC: 82.4 | 9 |
| Anomaly Detection | MVTec 3D-AD Setting 2: 9-1 with 1 step | I-AUROC: 87.5 | 9 |
| Anomaly Detection | MVTec 3D-AD Setting 1: 10-0 with 0 step | I-AUROC: 91 | 9 |
| Industrial Anomaly Detection | MVTec 3D-AD (test) | Mean I-AUROC: 98.9 | 7 |
| Anomaly Segmentation | MVTec 3D-AD (test) | I-AUROC: 0.954 | 6 |
| Multimodal Anomaly Localization | MVTec 3D-AD (test) | Bagel AUPRO@1%: 47.2 | 5 |
| Anomaly Classification | MVTec 3D-AD | Accuracy: 53.33 | 5 |
| Image-level Anomaly Detection | MVTec 3D-AD | AUROC: 93.37 | 5 |
| Anomaly Generation | MVTec 3D-AD | KID: 51.28 | 5 |
| Anomaly Segmentation | MVTec 3D-AD RGB+3D (test) | Pixel AUROC: 0.992 | 2 |
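Several rows above report I-AUROC (image-level AUROC): each test image receives a single anomaly score, and AUROC measures how well those scores separate good from defective images. A minimal sketch of that computation, using a plain rank-based AUROC (the labels and scores below are made-up illustrative values, not MVTec 3D-AD results):

```python
def auroc(labels, scores):
    """Image-level AUROC: probability that a randomly chosen anomalous
    image scores higher than a randomly chosen good one (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]  # anomalous images
    neg = [s for s, y in zip(scores, labels) if y == 0]  # good images
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical per-image anomaly scores for 3 good (0) and 3 defective (1) images
labels = [0, 0, 0, 1, 1, 1]
scores = [0.10, 0.25, 0.70, 0.80, 0.60, 0.95]
print(f"I-AUROC: {auroc(labels, scores) * 100:.1f}")  # one good image outranks one defect
```

Pixel AUROC applies the same formula per pixel, while AUPRO instead averages per-region overlap up to a false-positive-rate cutoff (the @30% and @1% suffixes in the table), so the two metrics are not directly comparable.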