
Scaleformer: Iterative Multi-scale Refining Transformers for Time Series Forecasting

About

The performance of time series forecasting has recently been greatly improved by the introduction of transformers. In this paper, we propose a general multi-scale framework that can be applied to state-of-the-art transformer-based time series forecasting models (FEDformer, Autoformer, etc.). By iteratively refining a forecasted time series at multiple scales with shared weights, introducing architecture adaptations, and using a specially designed normalization scheme, we are able to achieve significant performance improvements, from 5.5% to 38.5% across datasets and transformer architectures, with minimal additional computational overhead. Via detailed ablation studies, we demonstrate the effectiveness of each of our contributions across the architecture and methodology. Furthermore, our experiments on various public datasets demonstrate that the proposed improvements outperform their corresponding baseline counterparts. Our code is publicly available at https://github.com/BorealisAI/scaleformer.
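The coarse-to-fine refinement loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `forecast_module` is a stand-in for the shared transformer forecaster, and the per-scale mean-centering is a simplified stand-in for the paper's normalization scheme. All function names and the naive refinement rule are illustrative assumptions.

```python
import numpy as np

def downsample(x, factor):
    """Average-pool a 1-D series by `factor` (truncates any remainder)."""
    n = len(x) // factor * factor
    return x[:n].reshape(-1, factor).mean(axis=1)

def upsample(x, factor):
    """Nearest-neighbour upsampling by `factor`."""
    return np.repeat(x, factor)

def forecast_module(history, prev_forecast):
    # Placeholder for the shared-weight transformer: here, a naive rule
    # that blends the previous-scale forecast with the last observed value.
    return 0.5 * prev_forecast + 0.5 * history[-1]

def multiscale_forecast(series, horizon, scales=(8, 4, 2, 1)):
    """Iteratively refine a forecast from the coarsest scale to the finest.

    `horizon` is assumed divisible by every entry of `scales`, and `scales`
    is assumed to decrease by a constant ratio (here 2x per step).
    """
    forecast, prev_scale = None, None
    for s in scales:
        hist = downsample(series, s)
        h = horizon // s
        if forecast is None:
            # Coarsest scale: initialize from the series mean.
            prev = np.full(h, hist.mean())
        else:
            # Upsample the previous scale's forecast as this scale's input.
            prev = upsample(forecast, prev_scale // s)[:h]
        # Simplified cross-scale normalization: center both inputs per scale.
        mu = hist.mean()
        forecast = forecast_module(hist - mu, prev - mu) + mu
        prev_scale = s
    return forecast
```

Each iteration reuses the same forecasting module (shared weights), which is what keeps the computational overhead of the extra scales small relative to a single full-resolution pass.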

Amin Shabani, Amir Abdi, Lili Meng, Tristan Sylvain · 2022

Related benchmarks

Task                                        Dataset              Metric  Result  Rank
Time Series Forecasting                     ETTh1                MSE     0.909   601
Time Series Forecasting                     ETTh2                MSE     0.591   438
Time Series Forecasting                     ETTm2                MSE     0.388   382
Multivariate long-term forecasting          ETTh1                MSE     0.468   344
Multivariate long-term series forecasting   Weather              MSE     0.248   288
Multivariate long-term forecasting          Traffic              MSE     0.443   159
Multivariate Time-series Forecasting        ETTh1 (test)         MSE     0.396   134
Time Series Forecasting                     ETTm1                MAE     0.453   66
Multivariate long-term forecasting          ETTh1 T=720 (test)   MSE     0.544   51
Multivariate long-term forecasting          ETTh1 T=96 (test)    MSE     0.381   48

(Showing 10 of 18 rows.)
