
TACTIC for Navigating the Unknown: Tabular Anomaly deteCTion via In-Context inference

About

Anomaly detection for tabular data has been a long-standing unsupervised learning problem that remains a major challenge for current deep learning models. Recently, in-context learning has emerged as a new paradigm that has shifted efforts from task-specific optimization to large-scale pretraining aimed at creating foundation models that generalize across diverse datasets. Although in-context models, such as TabPFN, perform well in supervised problems, their learned classification-based priors may not readily extend to anomaly detection. In this paper, we study in-context models for anomaly detection and show that the unsupervised extensions to TabPFN exhibit unstable behavior, particularly in noisy or contaminated contexts, and incur a high computational cost. We address these challenges and introduce TACTIC, an in-context anomaly detection approach based on pretraining with anomaly-centric synthetic priors, which provides fast, data-dependent reasoning about anomalies while avoiding dataset-specific tuning. In contrast to typical score-based approaches, which produce uncalibrated anomaly scores that require post-processing (e.g., threshold selection or ranking heuristics), the proposed model is trained as a discriminative predictor, enabling unambiguous anomaly decisions in a single forward pass. Through experiments on real-world datasets, we examine the performance of TACTIC in clean and noisy contexts with varying anomaly rates and different anomaly types, as well as the impact of prior choices on detection quality. Our experiments clearly show that specialized anomaly-centric in-context models such as TACTIC are highly competitive with task-specific methods.

Patryk Marszałek, Tomasz Kuśmierczyk, Marek Śmieja • 2026
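The abstract describes an in-context interface: the model receives a reference (context) table plus query rows and emits binary anomaly decisions in a single forward pass, with no dataset-specific training or threshold selection. The sketch below illustrates that calling convention only; the scoring rule inside (k-NN distance against a cutoff derived from the context itself) is a placeholder stand-in, not TACTIC's pretrained anomaly-centric prior, and the class name is hypothetical.

```python
import numpy as np


class InContextAnomalyDetector:
    """Toy stand-in for the interface sketched in the abstract: given a
    context table and query rows, return binary anomaly decisions in one
    forward pass. The internal k-NN rule is an illustrative placeholder,
    not TACTIC's learned prior.
    """

    def __init__(self, k: int = 5):
        self.k = k

    def predict(self, context: np.ndarray, queries: np.ndarray) -> np.ndarray:
        # Distance from each query to its k-th nearest context row.
        d = np.linalg.norm(queries[:, None, :] - context[None, :, :], axis=-1)
        knn = np.sort(d, axis=1)[:, self.k - 1]

        # Derive a decision cutoff from the context itself (leave-one-out
        # k-NN distances), mimicking a self-calibrated, discriminative
        # decision that needs no user-chosen threshold.
        dc = np.linalg.norm(context[:, None, :] - context[None, :, :], axis=-1)
        np.fill_diagonal(dc, np.inf)  # exclude self-distances
        ref = np.sort(dc, axis=1)[:, self.k - 1]
        cutoff = ref.mean() + 3.0 * ref.std()

        # Binary decision, no post-hoc ranking or threshold tuning.
        return (knn > cutoff).astype(int)
```

Usage mirrors the paper's setting: pass a (possibly contaminated) context and a batch of queries, and read off 0/1 decisions directly, rather than uncalibrated scores that would still need a threshold.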

Related benchmarks

Task              | Dataset                                                                           | Result            | Rank
------------------|-----------------------------------------------------------------------------------|-------------------|-----
Anomaly Detection | ADBench noisy-context scenario 1.0                                                | F1 Score 84.78    | 532
Anomaly Detection | Synthetic Gaussian mixture datasets, global anomalies                             | F1 Score 1        | 35
Anomaly Detection | ADBench                                                                           | Mean AUCROC 90.43 | 34
Anomaly Detection | ADBench ID 1                                                                      | AUCROC 59.86      | 17
Anomaly Detection | ADBench ID 2                                                                      | AUCROC 76.2       | 17
Anomaly Detection | ADBench 38 real-world datasets (clean-context)                                    | F1 Score 65.52    | 14
Anomaly Detection | Gaussian Mixture Local Anomalies (synthetic)                                      | AUCROC 99.19      | 12
Anomaly Detection | Gaussian mixture synthetic datasets with cluster anomalies                        | AUCROC 78.52      | 12
Anomaly Detection | Synthetic Gaussian mixture datasets with global anomalies (mean over 25 datasets) | AUCROC 99.92      | 12
Anomaly Detection | Gaussian mixture synthetic dataset ID 0                                           | F1 Score 37.76    | 10
Showing 10 of 35 rows
