
Memory-Conditioned Flow-Matching for Stable Autoregressive PDE Rollouts

About

Autoregressive generative PDE solvers can be accurate one step ahead yet drift over long rollouts, especially in coarse-to-fine regimes where each step must regenerate unresolved fine scales. This is the regime of diffusion and flow-matching generators: although their internal dynamics are Markovian, rollout stability is governed by per-step conditional-law errors. Using the Mori–Zwanzig projection formalism, we show that eliminating unresolved variables yields an exact resolved evolution with a Markov term, a memory term, and an orthogonal forcing, exposing a structural limitation of memoryless closures. Motivated by this, we introduce memory-conditioned diffusion/flow-matching with a compact online state injected into denoising via latent features. Via disintegration, memory induces a structured conditional tail prior for unresolved scales and reduces the transport needed to populate missing frequencies. We prove Wasserstein stability of the resulting conditional kernel. We then derive discrete Grönwall rollout bounds that separate memory approximation from conditional generation error. Experiments on compressible flows with shocks and multiscale mixing show improved accuracy and markedly more stable long-horizon rollouts, with better fine-scale spectral and statistical fidelity.
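The core mechanics in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's implementation: flow matching along the linear path x_t = (1 - t) x0 + t x1 (velocity target v* = x1 - x0), a hypothetical compact online memory realized as a simple exponential moving average of past resolved states (a stand-in for the paper's learned latent memory features), and an autoregressive rollout in which each step samples the next state conditioned on the memory and then updates it. All function names (`fm_pair`, `update_memory`, `euler_sample`, `rollout`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fm_pair(x0, x1, t):
    """Flow-matching training pair on the linear path:
    x_t = (1 - t) x0 + t x1, with velocity target v* = x1 - x0."""
    return (1.0 - t) * x0 + t * x1, x1 - x0

def update_memory(mem, state, beta=0.9):
    """Hypothetical compact online memory: an exponential moving average
    of past resolved states (a crude surrogate for learned latent memory)."""
    return beta * mem + (1.0 - beta) * state

def euler_sample(v_fn, x_init, n_steps=50):
    """Draw one conditional sample by integrating dx/dt = v(x, t)
    from t = 0 to t = 1 with explicit Euler."""
    x, dt = x_init.copy(), 1.0 / n_steps
    for k in range(n_steps):
        x = x + dt * v_fn(x, k * dt)
    return x

def rollout(cond_v_fn, state0, horizon=3):
    """Autoregressive rollout: each step generates the next resolved state
    conditioned on (previous state, memory), then updates the memory."""
    state, mem, traj = state0, np.zeros_like(state0), []
    for _ in range(horizon):
        noise = rng.standard_normal(state.shape)  # generative seed
        v = lambda x, t, s=state, m=mem: cond_v_fn(x, t, s, m)
        state = euler_sample(v, noise)
        mem = update_memory(mem, state)
        traj.append(state)
    return traj

# Toy check: with the exact (constant) velocity v*(x, t) = x1 - x0,
# Euler integration transports x0 onto x1 over one unit of time.
x0, x1 = rng.standard_normal(4), rng.standard_normal(4)
out = euler_sample(lambda x, t: x1 - x0, x0)
print(np.allclose(out, x1))
```

Under these assumptions, the memory enters only through the conditioning arguments of the velocity field; the paper's point is that such conditioning supplies the Mori–Zwanzig memory term that a purely Markovian per-step generator cannot represent.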

Victor Armegioiu • 2026

Related benchmarks

Task | Dataset | Result | Rank
Autoregressive PDE rollout | CE-RM (test) | Final Relative L2 Error: 0.0069 | 16
Autoregressive rollout | CRP2D four-quadrant Riemann (test) | Final Time Relative L2 Error: 1.23 | 8
