
Preconditioned One-Step Generative Modeling for Bayesian Inverse Problems in Function Spaces

About

We propose a machine-learning algorithm for Bayesian inverse problems in the function-space regime based on one-step generative transport. Building on Mean Flows, we learn a fully conditional amortized sampler with a neural-operator backbone that maps a reference Gaussian noise to approximate posterior samples. We show that while white-noise references may be admissible at a fixed discretization, they become incompatible with the function-space limit, leading to unstable inference for Bayesian problems arising from PDEs. To address this issue, we adopt a prior-aligned anisotropic Gaussian reference distribution and establish the Lipschitz regularity of the resulting transport. Our method is not distilled from MCMC: training relies only on prior samples and simulated partial, noisy observations. Once trained, it generates a $64\times64$ posterior sample in $\sim 10^{-3}$ s, avoiding the repeated PDE solves of MCMC while matching key posterior summaries.
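The prior-aligned reference can be illustrated with a short sketch. The abstract contrasts a white-noise reference, whose samples do not converge in the function-space limit, with an anisotropic Gaussian whose covariance is aligned with the prior. A common way to realize such a reference on a grid, assuming a Matérn-type prior covariance, is spectral sampling: scale the Fourier coefficients of white noise by a decaying spectrum. The function name and the exponent `s` below are illustrative choices, not details from the paper.

```python
import numpy as np

def prior_aligned_reference(n=64, s=2.0, rng=None):
    """Draw one sample from an anisotropic Gaussian reference.

    White noise on an n x n grid is filtered in Fourier space by
    (1 + |k|^2)^(-s/2), a Matern-like spectrum that damps high
    frequencies, so samples stay well-defined as n grows (unlike
    raw white noise, whose pointwise variance does not converge).
    """
    rng = np.random.default_rng() if rng is None else rng
    xi = rng.standard_normal((n, n))             # white-noise sample
    k = np.fft.fftfreq(n, d=1.0 / n)             # integer wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    scale = (1.0 + kx**2 + ky**2) ** (-s / 2)    # prior-aligned spectrum
    z_hat = np.fft.fft2(xi) * scale
    return np.real(np.fft.ifft2(z_hat))

z = prior_aligned_reference()
print(z.shape)  # (64, 64)
```

In the paper's pipeline, a sample like `z` would be the input to the learned one-step transport (the neural-operator sampler), which maps it directly to an approximate posterior sample conditioned on the observations.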

Zilan Cheng, Li-Lian Wang, Zhongjian Wang • 2026

Related benchmarks

Task                 Dataset             Result                     Rank
Posterior Sampling   Advection           Time per Sample (s): 3.5   6
Posterior Sampling   Darcy               Time per Sample (s): 3.5   6
Posterior Sampling   Reaction-diffusion  Time per Sample (s): 3.5   6
Posterior Sampling   Navier-Stokes       Time per Sample (s): 3.5   6
