
Subspace Inference for Bayesian Deep Learning

About

Bayesian inference was once a gold standard for learning with neural networks, providing accurate full predictive distributions and well-calibrated uncertainty. However, scaling Bayesian inference techniques to deep neural networks is challenging due to the high dimensionality of the parameter space. In this paper, we construct low-dimensional subspaces of parameter space, such as the first principal components of the stochastic gradient descent (SGD) trajectory, which contain diverse sets of high-performing models. In these subspaces, we are able to apply elliptical slice sampling and variational inference, which struggle in the full parameter space. We show that Bayesian model averaging over the induced posterior in these subspaces produces accurate predictions and well-calibrated predictive uncertainty for both regression and image classification.
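The two core ingredients of the abstract — building a subspace from the top principal components of the SGD trajectory, and running elliptical slice sampling within it — can be sketched compactly. The sketch below is illustrative, not the authors' released code: the function names, the toy log-likelihood, and the choice of a standard Gaussian prior over the subspace coordinates are all assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def build_pca_subspace(trajectory, rank=2):
    """Return (shift, basis) from the top principal directions of an
    SGD trajectory, given as a (T, D) array of flattened parameter iterates.
    Full parameters are recovered as theta = shift + z @ basis."""
    shift = trajectory.mean(axis=0)
    # SVD of the centered trajectory; rows of vt are orthonormal directions
    _, _, vt = np.linalg.svd(trajectory - shift, full_matrices=False)
    return shift, vt[:rank]

def elliptical_slice_step(z, log_lik, prior_std=1.0):
    """One elliptical slice sampling update for subspace coordinates z,
    assuming a N(0, prior_std^2 I) prior (Murray et al.'s algorithm)."""
    nu = rng.normal(scale=prior_std, size=z.shape)   # auxiliary draw from the prior
    log_y = log_lik(z) + np.log(rng.uniform())       # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        z_new = z * np.cos(theta) + nu * np.sin(theta)
        if log_lik(z_new) > log_y:                   # accept when above the slice
            return z_new
        # shrink the bracket toward theta = 0 (the current state) and retry
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Toy usage: a synthetic "trajectory" and a quadratic log-likelihood in the subspace.
trajectory = rng.normal(size=(20, 5))
shift, basis = build_pca_subspace(trajectory, rank=2)
log_lik = lambda z: -0.5 * np.sum((z - 1.0) ** 2)
z = np.zeros(2)
for _ in range(50):
    z = elliptical_slice_step(z, log_lik)
theta_full = shift + z @ basis  # map a posterior sample back to full parameter space
```

In practice the predictive distribution is then a Bayesian model average: evaluate the network at many sampled `theta_full` values and average the predictions.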

Pavel Izmailov, Wesley J. Maddox, Polina Kirichenko, Timur Garipov, Dmitry Vetrov, Andrew Gordon Wilson • 2019

Related benchmarks

Task        Dataset              Metric                Result   Rank
Regression  Energy UCI (test)    RMSE                   1.587     27
Regression  Boston UCI (test)    RMSE                   3.453     26
Regression  Concrete UCI (test)  RMSE                   5.142     21
Regression  Yacht UCI (test)     RMSE                   0.972     20
Regression  elevators (test)     RMSE                   0.088     19
Regression  Protein (test)       Test Log Likelihood   -0.712     18
Regression  Skillcraft (test)    Test Log Likelihood   -1.179     17
Regression  Naval UCI (test)     RMSE                   0.001     16
Regression  Protein (test)       RMSE                   0.418     10
Regression  pol (test)           RMSE                   2.499      9

(Showing 10 of 13 rows.)
