
Automatic Posterior Transformation for Likelihood-Free Inference

About

How can one perform Bayesian inference on stochastic simulators with intractable likelihoods? A recent approach is to learn the posterior from adaptively proposed simulations using neural network-based conditional density estimators. However, existing methods are limited to a narrow range of proposal distributions or require importance weighting that can limit performance in practice. Here we present automatic posterior transformation (APT), a new sequential neural posterior estimation method for simulation-based inference. APT can modify the posterior estimate using arbitrary, dynamically updated proposals, and is compatible with powerful flow-based density estimators. It is more flexible, scalable and efficient than previous simulation-based inference techniques. APT can operate directly on high-dimensional time series and image data, opening up new applications for likelihood-free inference.
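The abstract describes the general pattern of sequential simulation-based inference: propose parameters, run the simulator, use the results to refine the proposal toward the posterior, and repeat. The toy sketch below illustrates that loop with a crude moment-matching update in place of APT's neural density estimator — it is not the APT algorithm, and the simulator, tolerance, and round count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta):
    # Toy stochastic simulator: Gaussian observation around the parameter.
    return theta + rng.normal(0.0, 0.5, size=theta.shape)

x_obs = 1.5                              # "observed" data point
proposal_mu, proposal_sigma = 0.0, 3.0   # start from a broad prior

for round_idx in range(4):
    # Draw parameters from the current proposal and simulate.
    theta = rng.normal(proposal_mu, proposal_sigma, size=2000)
    x = simulator(theta)
    # Keep parameters whose simulations land near the observation, then
    # refit the proposal to them -- a stand-in for training a conditional
    # density estimator q(theta | x) as APT does.
    keep = np.abs(x - x_obs) < 0.5
    proposal_mu = float(theta[keep].mean())
    proposal_sigma = float(theta[keep].std())

print(proposal_mu, proposal_sigma)
```

After a few rounds the proposal concentrates near the true posterior over the parameter; APT replaces the crude accept-and-refit step with a learned flow-based posterior estimate that remains valid under arbitrary proposals.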

David S. Greenberg, Marcel Nonnenmacher, Jakob H. Macke · 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Continuous Ranked Probability Score (CRPS) Estimation | Lorenz-96 200 samples (test) | CRPS Component F | 0.593 | 11 |
| Parameter Estimation | Multiscale Lorenz-96 (test) | Mean AP Error (F) | 11.29 | 11 |
| Simulation-Based Inference | Ricker ε = 0% (test) | RMSE | 2.16 | 5 |
| Parameter Estimation | Kuramoto-Sivashinsky Equation (KSE) (test) | MAPE (lambda_2) | 0.043 | 5 |
| Simulation-Based Inference | Ornstein-Uhlenbeck process (OUP) ε = 20% (test) | RMSE | 2.59 | 5 |
| Uncertainty Quantification | Kuramoto-Sivashinsky (KS) equation 100 instances (test) | CRPS (λ_s) | 0.175 | 5 |
| Simulation-Based Inference | Ricker ε = 10% (test) | RMSE | 7.86 | 5 |
| Simulation-Based Inference | Ricker ε = 20% (test) | RMSE | 11.2 | 5 |
| Simulation-Based Inference | Ornstein-Uhlenbeck process (OUP) ε = 0% (test) | RMSE | 0.79 | 5 |
| Simulation-Based Inference | Ornstein-Uhlenbeck process (OUP) ε = 10% (test) | RMSE | 1.26 | 5 |
