
Multi-resolution Time-Series Transformer for Long-term Forecasting

About

The performance of transformers for time-series forecasting has improved significantly. Recent architectures learn complex temporal patterns by segmenting a time-series into patches and using the patches as tokens. The patch size controls the ability of transformers to learn the temporal patterns at different frequencies: shorter patches are effective for learning localized, high-frequency patterns, whereas mining long-term seasonalities and trends requires longer patches. Inspired by this observation, we propose a novel framework, Multi-resolution Time-Series Transformer (MTST), which consists of a multi-branch architecture for simultaneous modeling of diverse temporal patterns at different resolutions. In contrast to many existing time-series transformers, we employ relative positional encoding, which is better suited for extracting periodic components at different scales. Extensive experiments on several real-world datasets demonstrate the effectiveness of MTST in comparison to state-of-the-art forecasting techniques.

Yitian Zhang, Liheng Ma, Soumyasundar Pal, Yingxue Zhang, Mark Coates • 2023
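The core idea in the abstract — segmenting a series into patches and using the patch size to control which frequencies a branch can model — can be sketched in a few lines. The sketch below is illustrative only: the `patchify` helper and the per-branch patch sizes are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

def patchify(series, patch_len, stride):
    """Segment a 1-D series into patches; each patch becomes one token."""
    n = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len] for i in range(n)])

# Multi-branch tokenization: each branch uses its own patch size, so
# short patches yield many tokens capturing localized high-frequency
# detail, while long patches yield few tokens spanning trends and
# long-term seasonality. Branch sizes here are hypothetical.
series = np.sin(np.linspace(0, 8 * np.pi, 512))
branch_configs = [(8, 8), (32, 32), (128, 128)]  # (patch_len, stride)
tokens_per_branch = [patchify(series, p, s) for p, s in branch_configs]
for (p, _), toks in zip(branch_configs, tokens_per_branch):
    print(f"patch_len={p}: {toks.shape[0]} tokens of length {p}")
```

Each branch would then feed its token sequence to its own transformer encoder, with the branch outputs combined for the final forecast.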

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Human Activity Recognition | UCI-HAR 6-Classes (test) | Accuracy 90.99 | 11 |
| Medical Time Series Classification | PTB-XL 5-Classes (test) | Accuracy 0.7214 | 11 |
| Human Activity Recognition | FLAAP 10-Classes (test) | Accuracy 70.57 | 11 |
| Medical Time Series Classification | PTB 2-Classes (test) | Accuracy 0.7659 | 11 |
| Medical Time Series Classification | ADFTD 3-Classes (test) | Accuracy 45.6 | 11 |
| Medical Time Series Classification | APAVA 2-Classes (test) | Accuracy 71.14 | 11 |
| Medical Time Series Classification | TDBrain 2-Classes (test) | Accuracy 76.96 | 11 |
