
Riemannian Laplace Approximation with the Fisher Metric

About

Laplace's method approximates a target density with a Gaussian distribution centered at its mode. It is computationally efficient and asymptotically exact for Bayesian inference due to the Bernstein-von Mises theorem, but for complex targets and finite-data posteriors it is often too crude an approximation. A recent generalization of the Laplace approximation transforms the Gaussian approximation according to a chosen Riemannian geometry, providing a richer approximation family while still retaining computational efficiency. However, as shown here, its properties depend heavily on the chosen metric; indeed, the metric adopted in previous work yields approximations that are overly narrow and biased even in the limit of infinite data. We correct this shortcoming by developing the approximation family further, deriving two alternative variants that are exact in the limit of infinite data, extending the theoretical analysis of the method, and demonstrating practical improvements in a range of experiments.
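As background to the abstract, a minimal sketch of the classical (Euclidean) Laplace approximation it generalizes: find the mode of the target density, measure the curvature of the negative log-density there, and return the Gaussian with that mean and inverse-curvature variance. The target below is an illustrative unnormalized Gamma density, not one of the paper's experiments, and the helper names are our own.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_density(x):
    # Illustrative target: unnormalized Gamma(shape=5, rate=1) on x > 0,
    # log p(x) = 4*log(x) - x (up to an additive constant).
    return -(4.0 * np.log(x[0]) - x[0])

# Step 1: locate the mode of the target density.
res = minimize(neg_log_density, x0=np.array([1.0]), method="Nelder-Mead")
mode = res.x[0]

# Step 2: curvature of the negative log-density at the mode,
# via a central finite difference.
eps = 1e-4
h = (neg_log_density([mode + eps]) - 2.0 * neg_log_density([mode])
     + neg_log_density([mode - eps])) / eps**2

# Step 3: Laplace approximation N(mode, 1/h).
sigma = np.sqrt(1.0 / h)
print(f"mode = {mode:.3f}, sigma = {sigma:.3f}")
```

For this target the answer is known analytically (mode 4, standard deviation 2), so the skew of the true Gamma density is exactly what the symmetric Gaussian misses; the Riemannian variants studied in the paper address this by warping the Gaussian along a chosen metric.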

Hanlin Yu, Marcelo Hartmann, Bernardo Williams, Mark Girolami, Arto Klami • 2023

Related benchmarks

Task                 Dataset                     Metric                Result   Rank
NN regression        Snelson (complete)          MSE                   0.072    5
NN regression        Snelson (gap)               MSE                   0.27     5
Logistic Regression  Ripl standardized (test)    Wasserstein Distance  0.064    4
Logistic Regression  Pima standardized (test)    Wasserstein Distance  0.147    4
Logistic Regression  Hear standardized (test)    Wasserstein Distance  0.514    4
Logistic Regression  Aust standardized (test)    Wasserstein Distance  0.417    4
Logistic Regression  Germ standardized (test)    Wasserstein Distance  0.387    4
Logistic Regression  Ripl raw (test)             Wasserstein Distance  0.247    4
Logistic Regression  Pima raw (test)             Wasserstein Distance  0.112    4
Logistic Regression  Hear raw (test)             Wasserstein Distance  0.644    4

Showing 10 of 14 rows
