
Large Pre-trained Time Series Models for Cross-Domain Time Series Analysis Tasks

About

Large pre-trained models have been vital to recent advances in domains such as language and vision, making model training for individual downstream tasks more efficient and delivering superior performance. However, tackling time-series analysis tasks usually involves designing and training a separate model from scratch, leveraging training data and domain expertise specific to the task. We tackle a significant challenge in pre-training a foundational time-series model on multi-domain time-series datasets: extracting semantically useful tokenized inputs to the model across heterogeneous time series from different domains. We propose Large Pre-trained Time-series Models (LPTM), which introduce a novel method of adaptive segmentation that automatically identifies an optimal dataset-specific segmentation strategy during pre-training. This enables LPTM to perform on par with or better than domain-specific state-of-the-art models when fine-tuned on different downstream time-series analysis tasks, as well as in zero-shot settings. LPTM achieves superior forecasting and time-series classification results while using up to 40% less data and 50% less training time than state-of-the-art baselines. Code: www.github.com/AdityaLab/Samay
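To make the adaptive-segmentation idea concrete: for each pre-training dataset, candidate segment (patch) lengths can be scored by how well the series is modeled when tokenized at that granularity, keeping the best-scoring length for that dataset. The following is a minimal, self-contained sketch of this idea only, not the actual LPTM or Samay implementation; the candidate lengths, the naive reconstruction-based score standing in for the pre-training loss, and all function names are assumptions for illustration.

```python
import numpy as np

def segment(series, seg_len):
    """Split a 1-D series into non-overlapping segments (tokens) of length seg_len."""
    n = len(series) // seg_len
    return series[: n * seg_len].reshape(n, seg_len)

def segmentation_score(series, seg_len):
    """Hypothetical proxy for the pre-training loss: how poorly each segment
    is predicted by the mean of the previous segment (lower is better)."""
    segs = segment(series, seg_len)
    preds = segs[:-1].mean(axis=1, keepdims=True)  # naive next-segment forecast
    return float(np.mean((segs[1:] - preds) ** 2))

def choose_segmentation(series, candidates=(4, 8, 16, 32, 64)):
    """Pick the dataset-specific segment length with the lowest score."""
    scores = {L: segmentation_score(series, L)
              for L in candidates if len(series) >= 2 * L}
    return min(scores, key=scores.get)

# Example: a noisy signal with a daily period tends to favor
# segment lengths near that period.
rng = np.random.default_rng(0)
t = np.arange(2048)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(t.size)
print(choose_segmentation(series))
```

In the paper this selection happens during pre-training across many domains at once, so each dataset ends up with its own tokenization granularity; the sketch above only illustrates the per-dataset selection step.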

Harshavardhan Kamarthi, B. Aditya Prakash • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|------|---------|--------|--------|------|
| Time Series Forecasting | ILI | MAE | 0.83 | 58 |
| Time-series classification | SelfRegulationSCP2 | Accuracy | 69.1 | 55 |
| Time-series classification | Heartbeat | Accuracy | 74 | 51 |
| Time-series classification | UWaveGestureLibrary | Accuracy | 94 | 47 |
| Time-series classification | PEMS-SF | Accuracy | 93 | 45 |
| Multivariate Time Series Classification | Finger Movement | Accuracy | 78 | 39 |
| Time Series Forecasting | ETT1 | RMSE | 0.43 | 36 |
| Time Series Forecasting | ETT2 | RMSE | 0.46 | 36 |
| Time Series Forecasting | NY-B | RMSE | 2.31 | 36 |
| Time Series Forecasting | Flu-US | RMSE | 0.79 | 36 |
(Showing 10 of 46 benchmark rows.)
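For reference, the forecasting rows above report MAE and RMSE, and the classification rows report accuracy. The two forecasting metrics are the standard mean absolute error and root mean squared error; a minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def mae(y_true, y_pred):
    """Mean absolute error: average magnitude of forecast errors."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def rmse(y_true, y_pred):
    """Root mean squared error: like MAE, but penalizes large errors more."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))
```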

Other info

Code: www.github.com/AdityaLab/Samay