Approximately Equivariant Recurrent Generative Models for Quasi-Periodic Time Series with a Progressive Training Scheme

About

We present a simple yet effective generative model for time series, based on a Recurrent Variational Autoencoder, which we refer to as AEQ-RVAE-ST. Recurrent layers often suffer from unstable optimization and poor convergence when modeling long sequences. To address these limitations, we introduce a training scheme that progressively increases the sequence length, stabilizing optimization and enabling consistent learning over extended horizons. By composing known components into a recurrent, approximately time-shift-equivariant topology, our model introduces an inductive bias that aligns with the structure of quasi-periodic and nearly stationary time series. Across several benchmark datasets, AEQ-RVAE-ST matches or surpasses state-of-the-art generative models, particularly on quasi-periodic data, while remaining competitive on more irregular signals. Performance is evaluated through the ELBO, the Fréchet Distance, discriminative metrics, and visualizations of the learned latent embeddings.
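The progressive training scheme described above can be illustrated with a short sketch: sequences are cropped to a length that grows over epochs, so early optimization sees short, easy-to-fit windows. The schedule constants, the doubling rule, and the commented-out `model.train_step` call are all illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def sequence_length_schedule(epoch, start_len=16, max_len=256,
                             grow_every=10, factor=2):
    """Grow the training sequence length by `factor` every `grow_every`
    epochs, capped at `max_len`. All constants are illustrative."""
    return min(start_len * factor ** (epoch // grow_every), max_len)

# Toy quasi-periodic "dataset": 32 noisy sine series of full length 256.
rng = np.random.default_rng(0)
t = np.arange(256)
data = np.sin(2 * np.pi * t / 50)[None, :] + 0.1 * rng.standard_normal((32, 256))

for epoch in [0, 10, 20, 50]:
    L = sequence_length_schedule(epoch)
    batch = data[:, :L]          # train on a prefix of each sequence
    # model.train_step(batch)    # placeholder for the actual RVAE update
    print(epoch, L, batch.shape)
```

Under this hypothetical schedule, the model trains on 16-step windows for the first ten epochs and only reaches the full 256-step horizon later, which is the stabilizing effect the abstract attributes to the scheme.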

Ruwen Fulek, Markus Lange-Hegermann · 2025

Related benchmarks

Task                              Dataset         Metric                Result  Rank
Synthetic Time Series Generation  Electric Motor  Average ELBO          1.66    48
Synthetic Time Series Generation  ETT             Average ELBO          1.6     48
Synthetic Time Series Generation  Sine            Average ELBO Score    1.47    48
Synthetic Time Series Generation  MetroPT3        Average ELBO          1.49    48
Synthetic Time Series Generation  ECG             Average ELBO          1.64    48
Synthetic Time Series Generation  ECG             FID Score             0.08    24
Synthetic Time Series Generation  ETT             FID Score             0.58    24
Synthetic Time Series Generation  MetroPT3        FID Score             0.26    24
Time-series generation            Sine            Discriminative Score  0.021   24
Synthetic Time Series Generation  Electric Motor  FID Score             0.1     24
Showing 10 of 15 rows
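The FID Score rows presumably report a Fréchet distance between Gaussian statistics of real and generated features. A minimal sketch of that quantity, assuming SciPy is available (`frechet_distance` is an illustrative helper, not the paper's code):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_distance(mu1, cov1, mu2, cov2):
    """Fréchet distance between Gaussians N(mu1, cov1) and N(mu2, cov2):
    ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 (cov1 cov2)^{1/2})."""
    diff = mu1 - mu2
    covmean = sqrtm(cov1 @ cov2)
    if np.iscomplexobj(covmean):      # sqrtm can return tiny imaginary parts
        covmean = covmean.real
    return float(diff @ diff + np.trace(cov1 + cov2 - 2.0 * covmean))

# Identical feature statistics give a distance of (numerically) zero.
mu, cov = np.zeros(4), np.eye(4)
print(frechet_distance(mu, cov, mu, cov))
```

In practice the means and covariances would be estimated from feature embeddings of real and generated time series; lower values indicate closer distributions, consistent with the small FID Scores in the table.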
