Physics-Informed Diffusion Models
About
Generative models such as denoising diffusion models are rapidly advancing in their ability to approximate highly complex data distributions. They are also increasingly leveraged in scientific machine learning, where samples from the implied data distribution are expected to adhere to specific governing equations. We present a framework that unifies generative modeling and partial differential equation fulfillment by introducing a first-principle-based loss term which enforces that generated samples satisfy the underlying physical constraints. Our approach reduces the residual error by up to two orders of magnitude compared to previous work in a fluid flow case study and outperforms task-specific frameworks on relevant metrics for structural topology optimization. We also present numerical evidence that the extended training objective acts as a natural regularization mechanism against overfitting. The framework is simple to implement and versatile in its applicability for imposing equality and inequality constraints as well as auxiliary optimization objectives.
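The core idea — augmenting the standard denoising objective with a residual penalty on the governing equation — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names (`physics_informed_loss`, `laplacian`), the weighting `lam`, and the choice of a discrete Poisson equation as the "governing equation" are all assumptions made for the example; the paper applies the residual to the specific PDEs of each case study.

```python
import numpy as np

def laplacian(u, h=1.0):
    """5-point finite-difference Laplacian on the interior of a 2-D field."""
    return (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
            - 4.0 * u[1:-1, 1:-1]) / h**2

def physics_informed_loss(pred_noise, true_noise, x0_hat, forcing, lam=1.0):
    """Combined objective (illustrative): the usual denoising MSE plus a
    penalty on the PDE residual of the reconstructed sample x0_hat.
    Here the governing equation is a Poisson problem -Laplace(u) = f,
    standing in for whatever PDE the application prescribes."""
    denoising = np.mean((pred_noise - true_noise) ** 2)
    residual = -laplacian(x0_hat) - forcing[1:-1, 1:-1]
    physics = np.mean(residual ** 2)
    return denoising + lam * physics
```

As a sanity check, a field that satisfies the stand-in PDE exactly contributes no physics penalty: `u(i, j) = i**2 + j**2` has discrete Laplacian 4 everywhere, so with forcing `f = -4` the residual term vanishes and only the denoising MSE remains.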
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Tire Force Estimation | Tire Force Estimation (Sporty) | Estimated Force Error | 630.5 | 22 |
| Tire Force Estimation | Tire Force Estimation (Smooth) | eF (Force Error) | 579.5 | 22 |
| PDE-Constrained Field Generation | Darcy Flow (val) | NLL | -3.5 | 16 |
| Vehicle Tracking | Downtown Driving | Position Error (x & y) | 4.845 | 11 |
| Tire Force Estimation | Tire Force Estimation (Aggressive) | Estimated Force Error | 988.6 | 11 |
| Vehicle Tracking | Rural Driving | Error (x, y) | 4.293 | 11 |
| Generative Modeling | Darcy Flow, 1024 samples (test) | Relative Error | 0.022 | 7 |