# Automatic Posterior Transformation for Likelihood-Free Inference
## About
How can one perform Bayesian inference on stochastic simulators with intractable likelihoods? A recent approach is to learn the posterior from adaptively proposed simulations using neural network-based conditional density estimators. However, existing methods are limited to a narrow range of proposal distributions or require importance weighting that can limit performance in practice. Here we present automatic posterior transformation (APT), a new sequential neural posterior estimation method for simulation-based inference. APT can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators. It is more flexible, scalable and efficient than previous simulation-based inference techniques. APT can operate directly on high-dimensional time series and image data, opening up new applications for likelihood-free inference.
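The core idea above, correcting the posterior estimate for an arbitrary proposal distribution, can be illustrated with APT's "atomic" loss: given a mini-batch of candidate parameters (atoms), the density estimator is trained to classify which atom actually generated the observed data, using the ratio of the estimated posterior to the prior as the classification logit. The sketch below, which assumes precomputed log-densities and uses an illustrative function name (`atomic_apt_loss`) not taken from the paper's code, shows only the loss computation, not the full training loop or the flow-based density estimator.

```python
import numpy as np

def atomic_apt_loss(log_q, log_prior, true_idx=0):
    """Atomic APT loss for a single observation x.

    log_q[b]     : log q_phi(theta_b | x), the estimated posterior
                   density of atom theta_b given x
    log_prior[b] : log p(theta_b), the prior density of atom theta_b
    true_idx     : index of the atom that actually produced x

    The loss is the negative log-softmax over the ratios
    q(theta_b | x) / p(theta_b), evaluated at the responsible atom.
    Minimizing it over many (theta, x) pairs pushes q_phi toward the
    true posterior regardless of which proposal generated the
    simulations -- the correction that lets APT use arbitrary,
    dynamically updated proposals.
    """
    logits = np.asarray(log_q, dtype=float) - np.asarray(log_prior, dtype=float)
    # Numerically stable log-softmax.
    logits = logits - logits.max()
    log_softmax = logits - np.log(np.exp(logits).sum())
    return -log_softmax[true_idx]

# Example: three atoms under a flat prior; the estimator assigns the
# highest posterior density to atom 0, which generated the data.
loss = atomic_apt_loss(log_q=[0.0, -2.0, -2.0],
                       log_prior=[0.0, 0.0, 0.0],
                       true_idx=0)
```

In practice the atoms for each observation are drawn from the other parameters in the mini-batch, and `log_q` comes from a conditional density estimator such as a normalizing flow, trained by backpropagating through this loss.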
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Continuous Ranked Probability Score (CRPS) Estimation | Lorenz-96, 200 samples (test) | CRPS Component F | 0.593 | 11 |
| Parameter Estimation | Multiscale Lorenz-96 (test) | Mean AP Error (F) | 11.29 | 11 |
| Simulation-Based Inference | Ricker, ε = 0% (test) | RMSE | 2.16 | 5 |
| Parameter Estimation | Kuramoto-Sivashinsky Equation (KSE) (test) | MAPE (lambda_2) | 0.043 | 5 |
| Simulation-Based Inference | Ornstein-Uhlenbeck process (OUP), ε = 20% (test) | RMSE | 2.59 | 5 |
| Uncertainty Quantification | Kuramoto-Sivashinsky (KS) equation, 100 instances (test) | CRPS (λ_s) | 0.175 | 5 |
| Simulation-Based Inference | Ricker, ε = 10% (test) | RMSE | 7.86 | 5 |
| Simulation-Based Inference | Ricker, ε = 20% (test) | RMSE | 11.2 | 5 |
| Simulation-Based Inference | Ornstein-Uhlenbeck process (OUP), ε = 0% (test) | RMSE | 0.79 | 5 |
| Simulation-Based Inference | Ornstein-Uhlenbeck process (OUP), ε = 10% (test) | RMSE | 1.26 | 5 |