pADAM: A Plug-and-Play All-in-One Diffusion Architecture for Multi-Physics Learning
About
Generalizing across disparate physical laws remains a fundamental challenge for artificial intelligence in science. Existing deep-learning solvers are largely confined to single-equation settings, limiting transfer across physical regimes and inference tasks. Here we introduce pADAM, a unified generative framework that learns a shared probabilistic prior across heterogeneous partial differential equation families. Through a learned joint distribution of system states and, where applicable, physical parameters, pADAM supports forward prediction and inverse inference within a single architecture without retraining. Across benchmarks ranging from scalar diffusion to nonlinear Navier–Stokes equations, pADAM achieves accurate inference even under sparse observations. Combined with conformal prediction, it also provides reliable uncertainty quantification with coverage guarantees. In addition, pADAM performs probabilistic model selection from only two sparse snapshots, identifying governing laws through its learned generative representation. These results highlight the potential of generative multi-physics modeling for unified and uncertainty-aware scientific inference.
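The coverage guarantee mentioned above comes from conformal prediction, which wraps any point predictor in calibrated intervals. A minimal sketch of the standard split-conformal recipe (the function name and synthetic data are illustrative, not pADAM's actual implementation):

```python
import numpy as np

def split_conformal_interval(cal_preds, cal_targets, test_preds, alpha=0.1):
    """Split conformal prediction with absolute-residual scores.

    Calibration residuals yield a quantile q such that the bands
    [pred - q, pred + q] have >= 1 - alpha marginal coverage on
    exchangeable test data, regardless of the underlying model.
    """
    scores = np.abs(cal_targets - cal_preds)  # nonconformity scores
    n = len(scores)
    # Finite-sample-corrected quantile level, clipped to 1.
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(scores, q_level, method="higher")
    return test_preds - q, test_preds + q  # lower and upper bands

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.normal(size=2000)
    preds = truth + rng.normal(scale=0.3, size=2000)  # noisy stand-in model
    lo, hi = split_conformal_interval(preds[:1000], truth[:1000],
                                      preds[1000:], alpha=0.1)
    coverage = np.mean((truth[1000:] >= lo) & (truth[1000:] <= hi))
    print(f"empirical coverage: {coverage:.3f}")  # close to the 0.90 target
```

The guarantee is distribution-free: it only assumes the calibration and test points are exchangeable, which is why it pairs naturally with a generative surrogate whose error distribution is hard to characterize analytically.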
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Inverse reconstruction | Advection (N=500 per family, test) | Relative L2 error | 1.25 | 8 |
| Inverse reconstruction | Diffusion (N=500 per family, test) | Relative L2 error | 0.45 | 4 |
| Forward prediction | Diffusion (N=500 per family, test) | Relative L2 error | 82 | 4 |
| Forward prediction | Advection (N=500 per family, test) | Relative L2 error | 1.03 | 4 |
| Forward prediction | Advection-diffusion (N=500 per family, test) | Relative L2 error | 1.54 | 4 |
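The benchmark metric in the table is the relative L2 error, the standard normalized measure for PDE solution fields. A short sketch of how it is conventionally computed (the function is illustrative; the benchmarks above may average it over samples or time steps):

```python
import numpy as np

def relative_l2_error(pred, true):
    """Relative L2 error: ||pred - true||_2 / ||true||_2.

    Both fields are flattened, so the metric is scale-invariant in the
    reference solution and comparable across grids and equation families.
    """
    pred, true = np.asarray(pred, dtype=float), np.asarray(true, dtype=float)
    return np.linalg.norm(pred - true) / np.linalg.norm(true)

# Example: a field uniformly off by 10% has relative L2 error ~0.10.
u = np.ones(64)
print(relative_l2_error(1.1 * u, u))
```

A value of 0.45 therefore means the reconstruction differs from the reference field by 45% in L2 norm.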