
Bias-Constrained Diffusion Schedules for PDE Emulations: Reconstruction Error Minimization and Efficient Unrolled Training

About

Conditional Diffusion Models are powerful surrogates for emulating complex spatiotemporal dynamics, yet they often fail to match the accuracy of deterministic neural emulators for high-precision tasks. In this work, we address two critical limitations of autoregressive PDE diffusion models: their sub-optimal single-step accuracy and the prohibitive computational cost of unrolled training. First, we characterize the relationship between the noise schedule, the reconstruction error reduction rate and the diffusion exposure bias, demonstrating that standard schedules lead to suboptimal reconstruction error. Leveraging this insight, we propose an Adaptive Noise Schedule framework that minimizes inference reconstruction error by dynamically constraining the model's exposure bias. We further show that this optimized schedule enables a fast Proxy Unrolled Training method to stabilize long-term rollouts without the cost of full Markov Chain sampling. Both proposed methods enable significant improvements in short-term accuracy and long-term stability over diffusion and deterministic baselines on diverse benchmarks, including forced Navier-Stokes, Kuramoto-Sivashinsky, and Transonic Flow.
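The abstract does not give the closed form of the proposed Adaptive Noise Schedule, so the following is only a minimal illustrative sketch: a standard cosine noise schedule alongside a hypothetical truncated variant that caps the maximum noise level (one simple way to limit how far inference-time inputs can drift from training inputs, i.e. exposure bias), plus a generic autoregressive rollout skeleton of the kind such emulators use. The function names, the `t_max_frac` parameter, and the truncation strategy are assumptions for illustration, not the paper's method.

```python
import math

def cosine_alpha_bar(t, T, s=0.008):
    """Cumulative signal coefficient alpha_bar(t) of a standard cosine
    diffusion schedule: 1.0 at t=0 (clean data), ~0.0 at t=T (pure noise)."""
    f = lambda u: math.cos((u / T + s) / (1 + s) * math.pi / 2) ** 2
    return f(t) / f(0)

def truncated_alpha_bar(t, T, t_max_frac=0.6):
    """HYPOTHETICAL bias-constrained variant: compress the schedule so the
    chain never reaches its noisiest states, keeping denoiser inputs closer
    to the training distribution. Stand-in for the paper's adaptive schedule."""
    return cosine_alpha_bar(t * t_max_frac, T)

def rollout(x0, steps, emulator_step):
    """Autoregressive emulation skeleton: each physical state is produced by
    one (here arbitrary) single-step emulator applied to the previous state.
    In a diffusion emulator, emulator_step would run the full denoising chain
    conditioned on the previous state."""
    traj = [x0]
    for _ in range(steps):
        traj.append(emulator_step(traj[-1]))
    return traj

# Toy usage: a damping map standing in for a learned one-step emulator.
traj = rollout(1.0, 3, lambda x: 0.5 * x)  # [1.0, 0.5, 0.25, 0.125]
```

The truncated schedule retains a strictly positive signal level at t = T, whereas the standard cosine schedule decays to zero there; the paper's point is that where the schedule sits on this spectrum trades reconstruction error against exposure bias.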

Constantin Le Cleï, Nils Thuerey, Xiaoxiang Zhu • 2026

Related benchmarks

| Task                     | Dataset              | Metric       | Result  | Rank |
|--------------------------|----------------------|--------------|---------|------|
| Fluid Dynamics Emulation | Kolmogorov Flow      | MSE (1-step) | 8.07e-7 | 6    |
| Fluid Dynamics Emulation | Transonic Flow       | MSE (1-step) | 3.87e-5 | 6    |
| Fluid Dynamics Emulation | Kuramoto-Sivashinsky | MSE (1-step) | 8.83e-8 | 6    |
