
A Transformer-based Framework for Multivariate Time Series Representation Learning

About

In this work we propose for the first time a transformer-based framework for unsupervised representation learning of multivariate time series. Pre-trained models can potentially be used for downstream tasks such as regression, classification, forecasting, and missing value imputation. By evaluating our models on several benchmark datasets for multivariate time series regression and classification, we show that not only does our modeling approach represent the most successful method employing unsupervised learning of multivariate time series presented to date, but also that it exceeds the current state-of-the-art performance of supervised methods; it does so even when the number of training samples is very limited, while offering computational efficiency. Finally, we demonstrate that unsupervised pre-training of our transformer models offers a substantial performance benefit over fully supervised learning, even without leveraging additional unlabeled data, i.e., by reusing the same data samples through the unsupervised objective.
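The core idea above — pre-training a transformer encoder on multivariate series by reconstructing masked input values — can be sketched in a few lines of PyTorch. This is an illustrative toy version, not the authors' exact architecture: the layer sizes, the learnable positional embedding, and the zero-masking scheme are assumptions chosen for brevity.

```python
# Minimal sketch of masked-value pre-training for multivariate time series.
# Hyperparameters and masking details are illustrative, not the paper's config.
import torch
import torch.nn as nn

class TSTransformer(nn.Module):
    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2, max_len=512):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)   # embed each time step
        self.pos_emb = nn.Parameter(torch.randn(max_len, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.output_proj = nn.Linear(d_model, n_features)  # reconstruct input values

    def forward(self, x):                                  # x: (batch, seq_len, n_features)
        h = self.input_proj(x) + self.pos_emb[: x.size(1)]
        return self.output_proj(self.encoder(h))

def masked_mse(model, x, mask_ratio=0.15):
    """Zero out a random subset of values; score reconstruction only there."""
    mask = torch.rand_like(x) < mask_ratio                 # True = hidden from the model
    recon = model(x.masked_fill(mask, 0.0))
    return ((recon - x)[mask] ** 2).mean()

model = TSTransformer(n_features=6)
x = torch.randn(8, 100, 6)                                 # 8 series, 100 steps, 6 variables
loss = masked_mse(model, x)
loss.backward()                                            # trainable end-to-end
```

After pre-training with this objective, the `output_proj` head would be swapped for a regression or classification head and the encoder fine-tuned on the (possibly small) labeled set — which is how the paper reuses the same samples for both the unsupervised and supervised stages.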

George Zerveas, Srideepika Jayaraman, Dhaval Patel, Anuradha Bhamidipaty, Carsten Eickhoff• 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Multivariate long-term forecasting | ETTh1 | MSE | 0.624 | 344 |
| Multivariate long-term series forecasting | ETTh2 | MSE | 0.429 | 319 |
| Multivariate long-term series forecasting | Weather | MSE | 0.419 | 288 |
| Multivariate long-term series forecasting | ETTm1 | MSE | 0.494 | 257 |
| Multivariate long-term forecasting | Electricity | MSE | 0.31 | 183 |
| Multivariate long-term series forecasting | ETTm2 | MSE | 0.425 | 175 |
| Multivariate long-term forecasting | Traffic | MSE | 0.611 | 159 |
| Time-series classification | SelfRegulationSCP2 | Accuracy | 55 | 55 |
| Time-series classification | Heartbeat | Accuracy | 77.1 | 51 |
| Time-series classification | UWaveGestureLibrary | Accuracy | 85.6 | 47 |

Showing 10 of 64 rows.
