
A Simple Baseline for Bayesian Uncertainty in Deep Learning

About

We propose SWA-Gaussian (SWAG), a simple, scalable, and general-purpose approach for uncertainty representation and calibration in deep learning. Stochastic Weight Averaging (SWA), which computes the first moment of stochastic gradient descent (SGD) iterates with a modified learning rate schedule, has recently been shown to improve generalization in deep learning. With SWAG, we fit a Gaussian using the SWA solution as the first moment and a low-rank plus diagonal covariance also derived from the SGD iterates, forming an approximate posterior distribution over neural network weights; we then sample from this Gaussian distribution to perform Bayesian model averaging. We empirically find that SWAG approximates the shape of the true posterior, in accordance with results describing the stationary distribution of SGD iterates. Moreover, we demonstrate that SWAG performs well on a wide variety of tasks, including out-of-sample detection, calibration, and transfer learning, in comparison to many popular alternatives including MC dropout, KFAC Laplace, SGLD, and temperature scaling.
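The procedure the abstract describes — average the SGD iterates for a first moment, keep a diagonal second-moment estimate plus a low-rank deviation matrix, then sample weights from the resulting Gaussian — can be sketched in a few lines of numpy. This is an illustrative sketch, not the authors' released implementation; the function names (`swag_moments`, `swag_sample`), the `max_rank` parameter, and the half-and-half weighting of the diagonal and low-rank terms are assumptions made for the example.

```python
import numpy as np

def swag_moments(iterates, max_rank=20):
    """Estimate SWAG moments from a sequence of SGD weight snapshots.

    iterates: array of shape (T, d) -- flattened weight vectors collected
    along the SGD trajectory. Returns the SWA mean (first moment), a
    diagonal variance estimate, and a low-rank deviation matrix.
    """
    iterates = np.asarray(iterates, dtype=float)
    mean = iterates.mean(axis=0)                    # SWA solution
    sq_mean = (iterates ** 2).mean(axis=0)          # running second moment
    diag = np.maximum(sq_mean - mean ** 2, 1e-12)   # diagonal covariance, clipped >= 0
    dev = (iterates - mean)[-max_rank:]             # last K deviations, shape (K, d)
    return mean, diag, dev

def swag_sample(mean, diag, dev, rng):
    """Draw one weight sample from the SWAG Gaussian approximation.

    Assumed covariance: 0.5 * diag + 0.5 * dev^T dev / (K - 1),
    i.e. an even mix of the diagonal and low-rank components.
    """
    K, d = dev.shape
    z1 = rng.standard_normal(d)   # drives the diagonal component
    z2 = rng.standard_normal(K)   # drives the low-rank component
    return (mean
            + np.sqrt(0.5) * np.sqrt(diag) * z1
            + np.sqrt(0.5 / (K - 1)) * dev.T @ z2)
```

Bayesian model averaging then amounts to drawing several weight samples with `swag_sample`, loading each into the network, and averaging the predicted class probabilities.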

Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, Andrew Gordon Wilson • 2019

Related benchmarks

Task                  Dataset               Result           Rank
Image Classification  CIFAR-100 (test)      Accuracy 82.12   3518
Image Classification  CIFAR-100             --               622
Image Classification  CIFAR-10              Accuracy 93.13   507
Image Classification  SVHN (test)           Accuracy 90.04   362
Image Classification  SVHN                  Accuracy 96.72   359
Image Classification  STL-10 (test)         Accuracy 76.96   357
Image Classification  CIFAR-100 (test)      Top-1 Acc 82.27  275
Image Classification  ImageNet (test)       --               235
Image Classification  FashionMNIST (test)   Accuracy 92.56   218
Classification        CIFAR-100 (test)      Accuracy 82.4    129
Showing 10 of 69 rows
