
STAEformer: Spatio-Temporal Adaptive Embedding Makes Vanilla Transformer SOTA for Traffic Forecasting

About

With the rapid development of the Intelligent Transportation System (ITS), accurate traffic forecasting has emerged as a critical challenge. The key bottleneck lies in capturing the intricate spatio-temporal traffic patterns. In recent years, numerous neural networks with complicated architectures have been proposed to address this issue. However, the advancements in network architectures have encountered diminishing performance gains. In this study, we present a novel component called spatio-temporal adaptive embedding that can yield outstanding results with vanilla transformers. Our proposed Spatio-Temporal Adaptive Embedding transformer (STAEformer) achieves state-of-the-art performance on five real-world traffic forecasting datasets. Further experiments demonstrate that spatio-temporal adaptive embedding plays a crucial role in traffic forecasting by effectively capturing intrinsic spatio-temporal relations and chronological information in traffic time series.
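The core idea above is that a single learned spatio-temporal embedding tensor, concatenated with simple feature and periodicity embeddings, is enough input structure for a vanilla transformer. The following is a minimal numpy sketch of that input-embedding stage only (no transformer layers); the toy sizes, table shapes, and variable names are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

B, T, N = 2, 12, 207                         # batch, input steps, sensors (toy sizes)
d_feat, d_tod, d_dow, d_a = 24, 24, 24, 80   # embedding widths (assumed)

x = rng.normal(size=(B, T, N, 1))            # raw traffic readings

# feature embedding: linear projection of the raw signal
W_feat = rng.normal(size=(1, d_feat))
e_feat = x @ W_feat                          # (B, T, N, d_feat)

# periodicity embeddings looked up from time-of-day / day-of-week indices
tod_table = rng.normal(size=(288, d_tod))    # 5-minute slots per day
dow_table = rng.normal(size=(7, d_dow))
tod_idx = rng.integers(0, 288, size=(B, T))
dow_idx = rng.integers(0, 7, size=(B, T))
e_tod = np.broadcast_to(tod_table[tod_idx][:, :, None, :], (B, T, N, d_tod))
e_dow = np.broadcast_to(dow_table[dow_idx][:, :, None, :], (B, T, N, d_dow))

# spatio-temporal adaptive embedding: a freely learned tensor shared across
# the batch, indexed only by (time step, node) -- the component the paper adds
E_a = rng.normal(size=(T, N, d_a))
e_a = np.broadcast_to(E_a[None], (B, T, N, d_a))

# concatenate along the hidden dim; temporal and spatial transformer
# layers would then attend over this representation
z = np.concatenate([e_feat, e_tod, e_dow, e_a], axis=-1)
print(z.shape)  # (2, 12, 207, 152)
```

In a real model `E_a` would be a trainable parameter updated by backpropagation; the point of the sketch is only the shape logic: the adaptive embedding carries per-(step, node) information without any predefined graph.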

Hangchen Liu, Zheng Dong, Renhe Jiang, Jiewen Deng, Jinliang Deng, Quanjun Chen, Xuan Song • 2023

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Traffic speed forecasting | METR-LA (test) | -- | 195 |
| Traffic Forecasting | PeMS08 | RMSE 23.25 | 166 |
| Traffic Forecasting | PeMS07 | MAE 19.14 | 94 |
| Traffic Flow Forecasting | PEMS04 (test) | MAE 18.22 | 66 |
| Traffic Flow Forecasting | PEMS08 (test) | MAE 27.43 | 66 |
| Traffic Flow Forecasting | PEMS03 (test) | MAE 15.35 | 49 |
| Traffic Forecasting | Tokyo JARTIC 2021 (test) | MAE 5.89 | 44 |
| Traffic Forecasting | METR-LA 30min horizon 6 | MAE 2.97 | 44 |
| Spatial-temporal Time Series Forecasting | PeMS03 | MAE 15.35 | 35 |
| Traffic Flow Prediction | PEMS07 (test) | MAE 19.14 | 34 |

Showing 10 of 39 rows.

Other info

Code
