
Low Rank Based Subspace Inference for the Laplace Approximation of Bayesian Neural Networks

About

Subspace inference for neural networks assumes that a subspace of their parameter space suffices to produce reliable uncertainty quantification. In this work, we underpin the validity of this assumption using low-rank techniques. We derive an expression for a subspace model in a Bayesian inference scenario based on the Laplace approximation that is, in a certain sense, optimal for a given dataset. We empirically show that a Laplace approximation constructed with a dimensionally reduced covariance matrix closely matches the full Laplace approximation obtained with the exact covariance matrix. Where feasible, this subspace model can serve as a baseline for benchmarking the performance of subspace models. In addition, we provide a scalable approximation of this subspace construction that is usable in practice and compare it to existing subspace models from the literature. In general, our approximation scheme outperforms previous work. Furthermore, we present a metric to qualitatively compare the approximation quality of different subspace models even when the exact Laplace approximation is unknown.
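The core idea of a dimensionally reduced Laplace approximation can be illustrated with a minimal sketch: truncate the eigendecomposition of the posterior covariance to its top-k eigenpairs, i.e. the directions of largest posterior variance. This is a generic low-rank illustration, not the paper's specific construction; the matrix sizes and the Hessian here are synthetic assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a PSD "Hessian" of the loss at a MAP estimate
# for d parameters (random, for illustration only).
d, k = 50, 5
A = rng.standard_normal((d, d))
H = A @ A.T + 1e-3 * np.eye(d)      # curvature at the MAP estimate
Sigma = np.linalg.inv(H)            # full Laplace posterior covariance

# Low-rank subspace: keep the top-k eigenpairs of the covariance,
# i.e. the directions of largest posterior variance.
eigvals, eigvecs = np.linalg.eigh(Sigma)
order = np.argsort(eigvals)[::-1]
U, lam = eigvecs[:, order[:k]], eigvals[order[:k]]
Sigma_k = U @ np.diag(lam) @ U.T    # rank-k covariance approximation

# By Eckart–Young, the spectral error of the truncation is the
# (k+1)-th eigenvalue; it is small when the spectrum decays quickly.
err = np.linalg.norm(Sigma - Sigma_k, 2) / np.linalg.norm(Sigma, 2)
print(f"rank-{k} relative spectral error: {err:.3f}")
```

Sampling from N(MAP, Sigma_k) then requires only k latent dimensions, which is what makes subspace inference tractable for large networks.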

Josua Faller, Jörg Martin • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Uncertainty Quantification | FashionMNIST | NLL | 0.248 | 42 |
| Uncertainty Quantification | CIFAR10 | NLL | 0.256 | 42 |
| Uncertainty Quantification | ImageNet-10 | NLL | 0.266 | 42 |
| Image Classification | CIFAR-10-C (test) | NLL | 0.312 | 28 |
| Image Classification | CIFAR-10-C brightness | NLL | 0.313 | 6 |
| Image Classification | CIFAR-10-C elastic transform | NLL | 0.49 | 6 |
| Image Classification | CIFAR-10-C gaussian blur | NLL | 0.341 | 6 |
| Image Classification | CIFAR-10-C impulse noise | NLL | 1.66 | 6 |
| Uncertainty Quantification | MNIST-C (test) | NLL (brightness) | 0.106 | 6 |
