Riemannian Laplace Approximation with the Fisher Metric
About
Laplace's method approximates a target density with a Gaussian distribution centered at its mode. It is computationally efficient and, by the Bernstein-von Mises theorem, asymptotically exact for Bayesian inference, but for complex targets and finite-data posteriors it is often too crude an approximation. A recent generalization of the Laplace approximation transforms the Gaussian approximation according to a chosen Riemannian geometry, providing a richer approximation family while still retaining computational efficiency. However, as shown here, its properties depend heavily on the chosen metric: the metric adopted in previous work yields approximations that are overly narrow and remain biased even in the limit of infinite data. We correct this shortcoming by developing the approximation family further, deriving two alternative variants that are exact in the limit of infinite data, extending the theoretical analysis of the method, and demonstrating practical improvements in a range of experiments.
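As background, the standard (Euclidean) Laplace approximation described above can be sketched in a few lines: find the mode of the negative log density, then use the inverse Hessian at the mode as the Gaussian covariance. This is an illustrative sketch of the baseline method only, not the paper's Riemannian variant; the example target density and helper names are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_p(x):
    # Hypothetical 1D unnormalized negative log target density,
    # chosen only so the example is concrete and unimodal.
    x = np.asarray(x).item()
    return 0.5 * (x - 1.0) ** 2 + 0.1 * x ** 4

def laplace_approximation(neg_log_p, x0=0.0, eps=1e-4):
    """Fit N(mode, H^{-1}) to a 1D target via the standard Laplace method."""
    # Step 1: locate the mode by minimizing the negative log density.
    res = minimize(neg_log_p, x0)
    mode = res.x
    # Step 2: finite-difference second derivative (Hessian in 1D) at the mode.
    h = (neg_log_p(mode + eps) - 2.0 * neg_log_p(mode)
         + neg_log_p(mode - eps)) / eps ** 2
    # The Gaussian approximation has mean = mode, variance = 1/H.
    return mode.item(), 1.0 / h

mean, var = laplace_approximation(neg_log_p)
```

The Riemannian generalization studied in the paper replaces this fixed Gaussian with one transported along geodesics of a chosen metric (here, variants of the Fisher metric), which is what makes the choice of metric so consequential.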
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| NN regression | Snelson (complete) | MSE | 0.072 | 5 |
| NN regression | Snelson (gap) | MSE | 0.27 | 5 |
| Logistic Regression | Ripl standardized (test) | Wasserstein Distance (W) | 0.064 | 4 |
| Logistic Regression | Pima standardized (test) | Wasserstein Distance (W) | 0.147 | 4 |
| Logistic Regression | Hear standardized (test) | Wasserstein Distance (W) | 0.514 | 4 |
| Logistic Regression | Aust standardized (test) | Wasserstein Distance (W) | 0.417 | 4 |
| Logistic Regression | Germ standardized (test) | Wasserstein Distance (W) | 0.387 | 4 |
| Logistic Regression | Ripl raw (test) | Wasserstein Distance | 0.247 | 4 |
| Logistic Regression | Pima raw (test) | Wasserstein Distance | 0.112 | 4 |
| Logistic Regression | Hear raw (test) | Wasserstein Distance | 0.644 | 4 |