
Predicting Physics in Mesh-reduced Space with Temporal Attention

About

Graph-based next-step prediction models have recently been very successful in modeling complex high-dimensional physical systems on irregular meshes. However, due to their short temporal attention span, these models suffer from error accumulation and drift. In this paper, we propose a new method that captures long-term dependencies through a transformer-style temporal attention model. We introduce an encoder-decoder structure to summarize features and create a compact mesh representation of the system state, allowing the temporal model to operate on low-dimensional mesh representations in a memory-efficient manner. Our method outperforms a competitive GNN baseline on several complex fluid dynamics prediction tasks, from sonic shocks to vascular flow. We demonstrate stable rollouts without the need for training noise and show perfectly phase-stable predictions even for very long sequences. More broadly, we believe our approach paves the way to bringing the benefits of attention-based sequence models to solving high-dimensional complex physics tasks.
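The core idea above is that, once an encoder has compressed each mesh state into a compact latent vector, a transformer-style attention module can attend over the whole history of latents instead of only the previous step. Below is a minimal numpy sketch of causal scaled dot-product attention over a sequence of such latents; all names (`temporal_attention`, the weight matrices, the latent dimension) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def temporal_attention(Z, Wq, Wk, Wv):
    """Causal scaled dot-product attention over the time axis.

    Z: (T, d) sequence of mesh-reduced latent vectors, one per time step
       (stand-ins for the encoder's compact mesh representations).
    """
    Q, K, V = Z @ Wq, Z @ Wk, Z @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)           # (T, T) pairwise attention logits
    # Causal mask: step t may only attend to steps <= t,
    # so rollouts never peek at future states.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    # Numerically stable softmax over the history axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                      # (T, d) history-aware latents

rng = np.random.default_rng(0)
T, d = 16, 8                                # 16 time steps, 8-dim latent
Z = rng.normal(size=(T, d))                 # hypothetical encoder outputs
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = temporal_attention(Z, Wq, Wk, Wv)
print(out.shape)                            # (16, 8)
```

In the paper's setting, the attended latents would then be passed to the decoder to reconstruct the full mesh state for the next step; because attention runs in the reduced space, the memory cost of the long attention window stays small.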

Xu Han, Han Gao, Tobias Pfaff, Jian-Xun Wang, Li-Ping Liu• 2022

Related benchmarks

Task                 Dataset                       Result            Rank
Rollout Prediction   Cylinder Flow Re=400 (test)   U Velocity: 4.9   5
