
MCMC Variational Inference via Uncorrected Hamiltonian Annealing

About

Given an unnormalized target distribution, we want to obtain approximate samples from it and a tight lower bound on its log normalization constant log Z. Annealed Importance Sampling (AIS) with Hamiltonian MCMC is a powerful method for this. Its main drawback is that it uses non-differentiable transition kernels, which makes tuning its many parameters hard. We propose a framework, called Uncorrected Hamiltonian Annealing, that runs an AIS-like procedure with uncorrected Hamiltonian MCMC. Our method leads to tight and differentiable lower bounds on log Z. We show empirically that our method yields better performance than competing approaches, and that the ability to tune its parameters using reparameterization gradients can lead to large performance improvements.
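To make the idea concrete, below is a minimal numerical sketch of an AIS-style lower bound on log Z using uncorrected Hamiltonian transitions (leapfrog steps with no Metropolis accept/reject), on a 1D toy problem where log Z is known. This is an illustration of the general scheme, not the paper's implementation: the target, step size, schedule, and full momentum refresh are all choices made here for simplicity, and the sketch uses plain numpy rather than an autodiff framework, so it demonstrates the estimator but not the gradient-based tuning.

```python
import numpy as np

def log_gamma(x):
    # Unnormalized target: N(2, 1) without its constant, so log Z = 0.5*log(2*pi)
    return -0.5 * (x - 2.0) ** 2

def log_q0(x):
    # Initial distribution q0 = N(0, 1), normalized
    return -0.5 * x ** 2 - 0.5 * np.log(2 * np.pi)

def log_pi(x, beta):
    # Geometric annealing path: log pi_beta = (1 - beta) log q0 + beta log gamma
    return (1 - beta) * log_q0(x) + beta * log_gamma(x)

def grad_log_pi(x, beta):
    # Gradient of the annealed log density (both components are Gaussian)
    return (1 - beta) * (-x) + beta * (-(x - 2.0))

def ais_uncorrected_hmc(n_samples=4000, n_steps=64, n_leap=5, eps=0.2, seed=0):
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.normal(size=n_samples)      # draws from q0
    logw = np.zeros(n_samples)          # accumulated log importance weights
    for k in range(1, n_steps + 1):
        b = betas[k]
        # Standard AIS incremental weight at the new temperature
        logw += log_pi(x, b) - log_pi(x, betas[k - 1])
        # Uncorrected HMC move targeting pi_b: leapfrog, no accept/reject step
        rho = rng.normal(size=n_samples)            # full momentum refresh
        rho = rho + 0.5 * eps * grad_log_pi(x, b)   # initial half step
        for i in range(n_leap):
            x = x + eps * rho
            g = grad_log_pi(x, b)
            rho = rho + (eps if i < n_leap - 1 else 0.5 * eps) * g
    # By Jensen's inequality, E[log w] <= log E[w] = log Z (up to the small
    # bias introduced by skipping the Metropolis correction)
    return logw.mean()
```

With enough annealing steps the estimate lands close to the true value 0.5·log(2π) ≈ 0.919; in the paper's framework, every quantity above is a differentiable function of the step size and schedule, which is what enables tuning by reparameterization gradients.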

Tomas Geffner, Justin Domke • 2021

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Density Estimation | MNIST (test) | NLL (bits/dim) | 86.3 | 56 |
| Generative Modeling | MNIST (test) | -- | -- | 35 |
| Density Estimation | KMNIST (test) | Log-likelihood | -170.2 | 20 |
| Generative Modeling | letters (test) | ELBO | -130.9 | 20 |
| Generative Modeling | KMNIST (test) | ELBO | -171.6 | 20 |
| Density Estimation | Ocr-letters (test) | Avg Log-Likelihood (nats) | -129.9 | 19 |
| Log-likelihood estimation | MNIST (test) | Log-likelihood | -86.9 | 10 |
| Log-likelihood estimation | letters (test) | Log-likelihood | -129.9 | 10 |
| Variational Inference | MNIST (test) | Negative ELBO | -88.58 | 10 |
| log Z estimation | MNIST downsampled (test) | Log Z Absolute Error | 0.17 | 9 |

Showing 10 of 17 rows
