
DB2-TransF: All You Need Is Learnable Daubechies Wavelets for Time Series Forecasting

About

Time series forecasting requires models that can efficiently capture complex temporal dependencies, especially in large-scale, high-dimensional settings. While Transformer-based architectures excel at modeling long-range dependencies, their quadratic computational complexity limits scalability and adaptability. To overcome these challenges, we introduce DB2-TransF, a novel Transformer-inspired architecture that replaces the self-attention mechanism with a learnable Daubechies wavelet coefficient layer. This wavelet-based module efficiently captures multi-scale local and global patterns and enhances the modeling of correlations across multiple time series. Extensive experiments on 13 standard forecasting benchmarks demonstrate that DB2-TransF matches or surpasses the predictive accuracy of conventional Transformers while substantially reducing memory usage. These results position DB2-TransF as a scalable and resource-efficient framework for time series forecasting. Our code is available at https://github.com/SteadySurfdom/DB2-TransF
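The core idea — swapping self-attention for a token-mixing layer built from Daubechies-2 (db2) wavelet filters that are kept as trainable parameters — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name `LearnableWaveletLayer`, the single-level decomposition, and the periodic boundary handling are all assumptions; only the db2 filter coefficients themselves are standard.

```python
import numpy as np

def db2_filters():
    """Standard Daubechies-2 (4-tap) analysis filters."""
    s3 = np.sqrt(3.0)
    # Orthonormal lowpass filter; coefficients sum to sqrt(2).
    h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))
    # Highpass via the quadrature-mirror relation g[k] = (-1)^k h[N-1-k].
    g = h[::-1].copy()
    g[1::2] *= -1
    return h, g

class LearnableWaveletLayer:
    """Hypothetical attention replacement: one level of wavelet decomposition
    whose filters would be trainable parameters, initialized to db2."""

    def __init__(self):
        h, g = db2_filters()
        self.h = h  # in a real framework these would be nn.Parameter tensors
        self.g = g

    def forward(self, x):
        """Split a 1-D signal of even length into approximation (coarse/global)
        and detail (fine/local) coefficients, each of length len(x) // 2."""
        L = len(self.h)
        xp = np.concatenate([x, x[: L - 1]])  # periodic (circular) extension
        # Correlate with each filter, then downsample by 2.
        approx = np.convolve(xp, self.h[::-1], mode="valid")[::2]
        detail = np.convolve(xp, self.g[::-1], mode="valid")[::2]
        return approx, detail
```

Because db2 is an orthonormal wavelet, this periodized single-level transform preserves signal energy exactly, so no information is lost when the layer splits a series into multi-scale components; stacking such levels on the approximation branch would yield the usual multi-resolution hierarchy.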

Moulik Gupta, Achyut Mani Tripathi • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|------|---------|--------|--------|------|
| Multivariate Forecasting | ETTh1 | MSE | 0.381 | 645 |
| Multivariate Time-series Forecasting | ETTm1 | MSE | 0.327 | 433 |
| Multivariate Forecasting | ETTh2 | MSE | 0.287 | 341 |
| Multivariate Time-series Forecasting | ETTm2 | MSE | 0.179 | 334 |
| Multivariate Time-series Forecasting | Weather | MSE | 0.165 | 276 |
| Multivariate Time-series Forecasting | Traffic | MSE | 0.404 | 200 |
| Multivariate Time-series Forecasting | Exchange | MAE | 0.203 | 165 |
| Multivariate Time-series Forecasting | Electricity | MSE | 0.148 | 150 |
| Multivariate time series prediction | PeMS03 | MSE | 0.068 | 111 |
| Multivariate Time-series Forecasting | PeMS04 | MSE | 0.108 | 74 |

Showing 10 of 13 rows.
