
FRWKV: Frequency-Domain Linear Attention for Long-Term Time Series Forecasting

About

Traditional Transformers face a major bottleneck in long-sequence time series forecasting due to their quadratic complexity $\mathcal{O}(T^2)$ in the sequence length $T$ and their limited ability to exploit frequency-domain information. Inspired by RWKV's $\mathcal{O}(T)$ linear attention and by frequency-domain modeling, we propose FRWKV, a frequency-domain linear-attention framework that overcomes these limitations. Our model integrates linear attention mechanisms with frequency-domain analysis, achieving $\mathcal{O}(T)$ computational complexity in the attention path while exploiting spectral information to enhance temporal feature representations for scalable long-sequence modeling. Across eight real-world datasets, FRWKV achieves a first-place average rank. Our ablation studies confirm the critical roles of both the linear-attention and frequency-encoder components. This work demonstrates the powerful synergy between linear attention and frequency analysis, establishing a new paradigm for scalable time series modeling. Code is available at this repository: https://github.com/yangqingyuan-byte/FRWKV.
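To make the two ingredients concrete, the sketch below shows how linear attention reaches $\mathcal{O}(T)$ cost and how a frequency encoder could inject spectral information. This is a minimal illustration, not the paper's architecture: the feature map `phi`, the low-pass `frequency_encoder`, and the `frwkv_block` composition are all assumptions made for demonstration.

```python
import numpy as np

def linear_attention(Q, K, V):
    """Kernelized linear attention.

    Instead of the O(T^2) matrix softmax(QK^T), we apply a positive feature
    map phi and reassociate: phi(Q) @ (phi(K)^T V). The summary phi(K)^T V is
    (d, d), so total cost is O(T * d^2), i.e. linear in T.
    """
    phi = lambda x: np.maximum(x, 0.0) + 1e-6  # simple positive map (assumption)
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                 # (d, d) summary, independent of T
    Z = Qp @ Kp.sum(axis=0)       # per-query normalizer, shape (T,)
    return (Qp @ KV) / Z[:, None]

def frequency_encoder(x, keep=8):
    """Illustrative spectral filter: keep the `keep` lowest frequencies.

    Uses the real FFT along time and inverts, a stand-in for whatever
    frequency-domain encoding the model actually learns (assumption).
    """
    X = np.fft.rfft(x, axis=0)
    X[keep:] = 0
    return np.fft.irfft(X, n=x.shape[0], axis=0)

def frwkv_block(x, Wq, Wk, Wv):
    """Hypothetical block: frequency filtering, then linear attention."""
    xf = frequency_encoder(x)
    return linear_attention(xf @ Wq, xf @ Wk, xf @ Wv)

# Toy usage: a length-96 window with 16 channels.
rng = np.random.default_rng(0)
T, d = 96, 16
x = rng.standard_normal((T, d))
Wq, Wk, Wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
y = frwkv_block(x, Wq, Wk, Wv)
print(y.shape)  # (96, 16): output keeps the input's time and channel dims
```

The key design point is the reassociation in `linear_attention`: because `phi(K).T @ V` is a fixed-size summary, doubling the window length only doubles the work, which is what makes $\mathcal{O}(T)$ long-sequence modeling feasible.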

Qingyuan Yang, Shizhuo Deng, Dongyue Chen, Da Teng, Zehua Gan • 2025

Related benchmarks

| Task | Dataset | Result (MSE) | Rank |
| --- | --- | --- | --- |
| Time Series Forecasting | ETTh1 (test) | 0.433 | 348 |
| Time Series Forecasting | ETTm1 (test) | 0.38 | 278 |
| Time Series Forecasting | ETTh2 (test) | 0.368 | 232 |
| Time Series Forecasting | Weather (test) | 0.243 | 200 |
| Time Series Forecasting | ETTm2 (test) | 0.272 | 171 |
| Time Series Forecasting | ECL (test) | 0.171 | 93 |
