Deep TPC: Temporal-Prior Conditioning for Time Series Forecasting
About
LLM-for-time-series (TS) methods typically treat time shallowly: positional or prompt-based cues are injected once at the input of a largely frozen decoder, and this information degrades through the layers, limiting temporal reasoning. We introduce Temporal-Prior Conditioning (TPC), which elevates time to a first-class modality that conditions the model at multiple depths. TPC attaches a small set of learnable time series tokens to the patch stream; at selected layers these tokens cross-attend to temporal embeddings derived from compact, human-readable temporal descriptors encoded by the same frozen LLM, then feed temporal context back via self-attention. This disentangles the time series signal from the temporal information while keeping the parameter budget low. By training only the cross-attention modules, TPC consistently outperforms both full fine-tuning and shallow conditioning strategies, achieving state-of-the-art long-term forecasting performance across diverse datasets. Code available at: https://github.com/fil-mp/Deep_tpc
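The conditioning mechanism described above can be sketched as a small PyTorch module. This is an illustrative reconstruction, not the repository's code: the module names, tensor layouts, and the choice of `nn.MultiheadAttention` / `nn.TransformerEncoderLayer` are assumptions made for the sketch.

```python
import torch
import torch.nn as nn


class TPCBlock(nn.Module):
    """Sketch of one Temporal-Prior Conditioning step.

    Illustrative only; see the linked repository for the actual
    implementation. Dimension names and module layout are assumptions.
    """

    def __init__(self, d_model: int, n_heads: int, n_ts_tokens: int):
        super().__init__()
        # Small set of learnable time series tokens attached to the patch stream.
        self.ts_tokens = nn.Parameter(torch.randn(n_ts_tokens, d_model) * 0.02)
        # Only this cross-attention module is trained; the LLM backbone stays frozen.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(
        self,
        patches: torch.Tensor,       # (B, P, d_model) patch embeddings of the series
        temporal_emb: torch.Tensor,  # (B, T, d_model) embeddings of the temporal
                                     # descriptors, encoded by the same frozen LLM
        frozen_layer: nn.Module,     # one frozen decoder layer of the backbone
    ) -> torch.Tensor:
        batch = patches.size(0)
        tokens = self.ts_tokens.unsqueeze(0).expand(batch, -1, -1)
        # The learnable tokens cross-attend to the temporal embeddings...
        tokens, _ = self.cross_attn(tokens, temporal_emb, temporal_emb)
        # ...then feed temporal context back to the patches via self-attention
        # over the concatenated stream inside the frozen layer.
        x = torch.cat([tokens, patches], dim=1)
        x = frozen_layer(x)
        # Drop the conditioning tokens before the next block.
        return x[:, tokens.size(1):, :]
```

Only `cross_attn` (and the tokens) carry gradients, which is what keeps the trainable parameter count low relative to full fine-tuning.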
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Long-term time-series forecasting | ETTh1 | MAE | 0.422 | 351 |
| Long-term time-series forecasting | Weather | MSE | 0.23 | 348 |
| Long-term time-series forecasting | ETTh2 | MSE | 0.355 | 327 |
| Long-term time-series forecasting | ETTm2 | MSE | 0.265 | 305 |
| Long-term time-series forecasting | ETTm1 | MSE | 0.346 | 295 |
| Long-term time-series forecasting | Traffic | MSE | 0.394 | 278 |
| Long-term time-series forecasting | Electricity | MSE | 0.166 | 103 |
| Long-term forecasting | solar | MSE | 0.201 | 36 |