A decoder-only foundation model for time-series forecasting
About
Motivated by recent advances in large language models for Natural Language Processing (NLP), we design a time-series foundation model for forecasting whose out-of-the-box zero-shot performance on a variety of public datasets comes close to the accuracy of state-of-the-art supervised forecasting models for each individual dataset. Our model is based on pretraining a patched-decoder style attention model on a large time-series corpus, and can work well across different forecasting history lengths, prediction lengths and temporal granularities.
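The "patched-decoder" idea above — slicing a forecasting history into fixed-length patches that serve as the decoder's input tokens — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name, patch length, and tail-dropping behavior are assumptions for the example.

```python
# Hypothetical sketch of patching a time series for a patched-decoder model.
# The real model's preprocessing may differ; sizes and names are assumed.
import numpy as np

def patch_series(series: np.ndarray, patch_len: int) -> np.ndarray:
    """Split a 1-D history into non-overlapping patches (the decoder's tokens)."""
    n = len(series) // patch_len * patch_len  # drop any ragged tail
    return series[:n].reshape(-1, patch_len)

history = np.arange(32, dtype=float)      # toy forecasting history of length 32
patches = patch_series(history, patch_len=8)
print(patches.shape)                      # (4, 8): four patch "tokens" of length 8
```

Because the patches, rather than individual time points, become the attention tokens, the same decoder can handle different history lengths: a longer history simply yields more patch tokens.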
Abhimanyu Das, Weihao Kong, Rajat Sen, Yichen Zhou • 2023
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Time Series Forecasting | ETTh1 | MSE | 0.1435 | 729 |
| Time Series Forecasting | ETTh2 | MSE | 0.315 | 561 |
| Time Series Forecasting | ETTm2 | MSE | 0.32 | 382 |
| Long-term forecasting | ETTm1 | MSE | 0.361 | 375 |
| Long-term forecasting | ETTh1 | MSE | 0.414 | 365 |
| Anomaly Detection | SMD | -- | -- | 359 |
| Time Series Forecasting | ETTh1 (test) | MSE | 0.425 | 348 |
| Time Series Forecasting | ETTm1 | MSE | 0.434 | 334 |
| Long-term forecasting | ETTm2 | MSE | 0.202 | 310 |
| Time Series Forecasting | ETTm1 (test) | MSE | 0.332 | 278 |
Showing 10 of 292 benchmark rows.