
Self-Distilled Representation Learning for Time Series

About

Self-supervised learning for time-series data holds potential similar to that recently unleashed in Natural Language Processing and Computer Vision. While most existing works in this area focus on contrastive learning, we propose a conceptually simple yet powerful non-contrastive approach, based on the data2vec self-distillation framework. The core of our method is a student-teacher scheme that predicts the latent representation of an input time series from masked views of the same time series. This strategy avoids strong modality-specific assumptions and biases typically introduced by the design of contrastive sample pairs. We demonstrate the competitiveness of our approach for classification and forecasting as downstream tasks, comparing with state-of-the-art self-supervised learning methods on the UCR and UEA archives as well as the ETT and Electricity datasets.
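
To make the student-teacher scheme concrete, below is a minimal PyTorch sketch of a data2vec-style training step as described above. It is an illustration under stated assumptions, not the authors' implementation: the `TSEncoder` module, the patch length, zero-masking in input space, using final-layer teacher outputs as targets (data2vec averages the top-K layers), and the EMA decay of 0.999 are all hypothetical choices, and positional encodings are omitted for brevity.

```python
import copy

import torch
import torch.nn as nn
import torch.nn.functional as F


class TSEncoder(nn.Module):
    """Patchify a univariate series and encode it with a Transformer.

    Hypothetical backbone for illustration; positional encodings omitted.
    """

    def __init__(self, patch_len=16, d_model=64, n_layers=4, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, x):  # x: (batch, length), length % patch_len == 0
        patches = x.unfold(1, self.patch_len, self.patch_len)  # (B, N, P)
        return self.encoder(self.embed(patches))  # (B, N, d_model)


def train_step(student, teacher, opt, x, mask_ratio=0.5, ema_decay=0.999):
    # The teacher encodes the *unmasked* series; its latent
    # representations serve as regression targets.
    with torch.no_grad():
        targets = teacher(x)  # (B, N, d_model)

    # The student sees a masked view: whole patches are zeroed at random
    # (a simplification; learned mask tokens are another common choice).
    B, N, _ = targets.shape
    mask = torch.rand(B, N, device=x.device) < mask_ratio
    x_masked = x.clone().view(B, N, -1)
    x_masked[mask] = 0.0

    # Regress the teacher's latents at the masked positions only.
    preds = student(x_masked.view(B, -1))
    loss = F.smooth_l1_loss(preds[mask], targets[mask])

    opt.zero_grad()
    loss.backward()
    opt.step()

    # The teacher tracks the student as an exponential moving average.
    with torch.no_grad():
        for p_t, p_s in zip(teacher.parameters(), student.parameters()):
            p_t.mul_(ema_decay).add_(p_s, alpha=1.0 - ema_decay)
    return loss.item()


student = TSEncoder()
teacher = copy.deepcopy(student).requires_grad_(False)
opt = torch.optim.AdamW(student.parameters(), lr=1e-3)
x = torch.randn(8, 256)  # toy batch: 8 series of length 256
print(train_step(student, teacher, opt, x))
```

Note the property the abstract emphasizes: the targets are latent representations produced by the teacher, so no hand-crafted positive/negative sample pairs, and hence no modality-specific augmentation design, are needed.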

Felix Pieper, Konstantin Ditschuneit, Martin Genzel, Alexandra Lindt, Johannes Otterbach • 2023

Related benchmarks

Task                        Dataset                                         Avg. accuracy (%)  Rank
Time-series classification  UEA time series classification archive (test)  73.8               27
Time-series classification  UCR archive, 125 datasets (test)               83.2               10
