
Deep TPC: Temporal-Prior Conditioning for Time Series Forecasting

About

LLM-for-time-series (TS) methods typically treat time shallowly, injecting positional or prompt-based cues once at the input of a largely frozen decoder; this limits temporal reasoning because the information degrades as it passes through the layers. We introduce Temporal-Prior Conditioning (TPC), which elevates time to a first-class modality that conditions the model at multiple depths. TPC attaches a small set of learnable time-series tokens to the patch stream; at selected layers these tokens cross-attend to temporal embeddings derived from compact, human-readable temporal descriptors encoded by the same frozen LLM, then feed temporal context back via self-attention. This disentangles the time-series signal from temporal information while maintaining a low parameter budget. We show that by training only the cross-attention modules and enforcing this disentanglement, TPC consistently outperforms both full fine-tuning and shallow conditioning strategies, achieving state-of-the-art performance in long-term forecasting across diverse datasets. Code available at: https://github.com/fil-mp/Deep_tpc
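The conditioning mechanism described above can be sketched in a few lines of PyTorch. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name `TPCBlock`, all dimensions, and the use of a trainable self-attention layer to mix patches with the conditioned time tokens are illustrative choices (in the paper, only the cross-attention modules are trained and the backbone LLM stays frozen).

```python
import torch
import torch.nn as nn


class TPCBlock(nn.Module):
    """Minimal sketch of one Temporal-Prior Conditioning step.

    A small set of learnable time tokens cross-attends to temporal
    embeddings (derived from textual temporal descriptors), then the
    patch stream reads that context back via self-attention. Names and
    sizes are hypothetical, chosen only for illustration.
    """

    def __init__(self, d_model: int = 64, n_time_tokens: int = 4, n_heads: int = 4):
        super().__init__()
        # Learnable time-series tokens attached to the patch stream.
        self.time_tokens = nn.Parameter(torch.randn(n_time_tokens, d_model))
        # In the paper only cross-attention modules like this are trained.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Stand-in for the (frozen) self-attention that mixes context back.
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, patches: torch.Tensor, temporal_emb: torch.Tensor) -> torch.Tensor:
        # patches: (B, P, d) patch embeddings of the raw series
        # temporal_emb: (B, T, d) embeddings of temporal descriptors
        batch = patches.size(0)
        tokens = self.time_tokens.unsqueeze(0).expand(batch, -1, -1)
        # Time tokens absorb temporal context from the descriptor embeddings.
        tokens, _ = self.cross_attn(tokens, temporal_emb, temporal_emb)
        # Patches and conditioned time tokens interact via self-attention,
        # keeping signal and temporal information in separate token groups.
        mixed = torch.cat([patches, tokens], dim=1)
        mixed, _ = self.self_attn(mixed, mixed, mixed)
        # Return only the patch positions; time tokens are dropped again.
        return mixed[:, : patches.size(1)]


block = TPCBlock()
out = block(torch.randn(2, 16, 64), torch.randn(2, 8, 64))
print(out.shape)  # torch.Size([2, 16, 64])
```

The key design point this sketch captures is the separation of concerns: temporal information enters only through the dedicated time tokens, so the frozen backbone's patch representations are conditioned rather than overwritten.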

Filippos Bellos, Naveen John Premkumar, Yannis Avrithis, Nam H. Nguyen, Jason J. Corso • 2026

Related benchmarks

Task                              | Dataset     | Result    | Rank
Long-term time-series forecasting | ETTh1       | MAE 0.422 | 351
Long-term time-series forecasting | Weather     | MSE 0.23  | 348
Long-term time-series forecasting | ETTh2       | MSE 0.355 | 327
Long-term time-series forecasting | ETTm2       | MSE 0.265 | 305
Long-term time-series forecasting | ETTm1       | MSE 0.346 | 295
Long-term time-series forecasting | Traffic     | MSE 0.394 | 278
Long-term time-series forecasting | Electricity | MSE 0.166 | 103
Long-term time-series forecasting | Solar       | MSE 0.201 | 36
