
Multi-Time Attention Networks for Irregularly Sampled Time Series

About

Irregular sampling occurs in many time series modeling applications where it presents a significant challenge to standard deep learning models. This work is motivated by the analysis of physiological time series data in electronic health records, which are sparse, irregularly sampled, and multivariate. In this paper, we propose a new deep learning framework for this setting that we call Multi-Time Attention Networks. Multi-Time Attention Networks learn an embedding of continuous-time values and use an attention mechanism to produce a fixed-length representation of a time series containing a variable number of observations. We investigate the performance of this framework on interpolation and classification tasks using multiple datasets. Our results show that the proposed approach performs as well or better than a range of baseline and recently proposed models while offering significantly faster training times than current state-of-the-art methods.
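The core idea described above — embedding continuous-time values and attending from a fixed set of reference times to a variable number of observations — can be sketched as follows. This is a minimal single-head illustration, not the authors' implementation: the function names, shapes, and the use of NumPy are assumptions, and the actual model additionally uses multiple attention heads, learned linear projections, and an output network.

```python
import numpy as np

def time_embedding(t, omega, alpha):
    """Map scalar times t (shape [n]) to d-dimensional embeddings.
    One dimension is linear in t; the rest are sinusoidal with
    learnable frequencies omega and phases alpha (shape [d] each)."""
    z = t[:, None] * omega[None, :] + alpha[None, :]  # [n, d]
    emb = np.sin(z)
    emb[:, 0] = z[:, 0]  # keep the first component linear in time
    return emb

def mtan_attention(query_times, obs_times, obs_values, omega, alpha):
    """Attend from reference (query) times to observation times,
    producing one output row per query time regardless of how many
    observations the series contains."""
    q = time_embedding(query_times, omega, alpha)  # [m, d]
    k = time_embedding(obs_times, omega, alpha)    # [n, d]
    scores = q @ k.T / np.sqrt(q.shape[1])         # scaled dot-product, [m, n]
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)              # softmax over observations
    return w @ obs_values                          # [m, v] fixed-length output
```

Because the attention weights form a convex combination over the observed values, each output row is an interpolation of the observations at its reference time, which is what makes the representation length independent of the sampling pattern.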

Satya Narayan Shukla, Benjamin M. Marlin · 2021

Related benchmarks

Task | Dataset | Result | Rank
Classification | PAMAP2 original and sensor dropout | Accuracy 74.6 | 48
Classification | PAMAP2 | F1 Score 76.8 | 48
Forecasting | MIMIC-III (test) | MSE 0.54 | 43
Event Prediction | StackOverflow | RMSE 0.95 | 42
Multivariate Time Series Classification | UEA 30% missing rate (test) | Accuracy 63.4 | 39
Time-series classification | 18 UEA datasets, Regular | Accuracy 65.4 | 38
Clinical prediction | MIMIC-III | AUROC 85.44 | 36
Time-series classification | UEA 18 datasets, 70% Missing | Accuracy 64.2 | 34
Irregularly Sampled Time Series Forecasting | MIMIC | MSE 0.9408 | 34
Lane-Keeping Action Classification | OpenAI CarRacing | Accuracy 80.86 | 33
Showing 10 of 93 rows

Other info

Code
