
Pi-transformer: A prior-informed dual-attention model for multivariate time-series anomaly detection

About

Anomalies in multivariate time series often arise from temporal context and cross-channel coordination rather than isolated outliers. We present Pi-Transformer (Prior-Informed Transformer), a transformer with two attention pathways: data-driven series attention and a smoothly evolving prior attention that encodes temporal invariants such as scale-related self-similarity and phase synchrony. The prior provides an amplitude-insensitive temporal reference that calibrates reconstruction error. During training, we pair a reconstruction objective with a divergence term that encourages agreement between the two attentions while keeping them meaningfully distinct. The prior is regularised to evolve smoothly and is lightly distilled towards dataset-level statistics. At inference, the model combines an alignment-weighted reconstruction signal (Energy) with a mismatch signal that highlights timing and phase disruptions, and fuses them into a single score for detection. Across five benchmarks (SMD, MSL, SMAP, SWaT, and PSM), Pi-Transformer achieves state-of-the-art or highly competitive F1, with particular strength on timing and phase-breaking anomalies. Case analyses show complementary behaviour of the two streams and interpretable detections around regime changes. Embedding prior attention into transformer scoring yields a calibrated and robust approach to anomaly detection in complex multivariate systems.
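The abstract describes an inference-time score that weights reconstruction error by the agreement between the two attention pathways and fuses it with an attention-mismatch signal. The sketch below is a hypothetical illustration of that idea only: the row-wise symmetric KL divergence as the mismatch measure, the softmax weighting, and the multiplicative fusion are assumptions, not the paper's actual formulation.

```python
import numpy as np

def kl_rows(p, q, eps=1e-8):
    """Row-wise KL divergence between two (T, T) attention maps."""
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def anomaly_score(recon_err, series_attn, prior_attn):
    """Hypothetical fusion of an alignment-weighted reconstruction
    signal (Energy) with an attention-mismatch signal.

    recon_err   : (T,) per-timestep reconstruction error
    series_attn : (T, T) data-driven attention (rows sum to 1)
    prior_attn  : (T, T) prior attention (rows sum to 1)
    """
    # Mismatch: symmetric divergence between the two attentions per timestep;
    # large values flag timing/phase disruptions relative to the prior.
    mismatch = kl_rows(series_attn, prior_attn) + kl_rows(prior_attn, series_attn)
    # Energy: reconstruction error down-weighted where the attentions disagree
    # (softmax over the time axis, as one plausible normalisation).
    weight = np.exp(-mismatch) / np.sum(np.exp(-mismatch))
    energy = weight * recon_err
    # Fuse the two streams into a single per-timestep score
    # (multiplicative fusion is an assumption for illustration).
    return energy * mismatch
```

Thresholding this score per timestep would then yield detections; the paper's concrete divergence, weighting, and fusion choices may differ.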

Sepehr Maleki, Negar Pourmoazemi • 2025

Related benchmarks

Task               Dataset  Metric     Result  Rank
Anomaly Detection  SMD      F1 Score   91.23   359
Anomaly Detection  SWaT     F1 Score   96.82   276
Anomaly Detection  PSM      F1 Score   98.08   142
Anomaly Detection  MSL      Precision  96.24   95
Anomaly Detection  SMAP     F1 Score   97.02   69
