
Large-scale Score-based Variational Posterior Inference for Bayesian Deep Neural Networks

About

Bayesian (deep) neural networks (BNNs) are often more attractive than mainstream point-estimate deep learning in various respects, including uncertainty quantification, robustness to noise, and resistance to overfitting. Variational inference (VI) is one of the most widely adopted approximate inference methods. Whereas the ELBO-based variational free energy objective is the dominant choice in the literature, in this paper we introduce a score-based alternative for BNN variational inference. Although quite a few score-based variational inference methods have been proposed in the community, most are inadequate for large-scale BNNs for various computational and technical reasons. We propose a novel scalable VI method whose learning objective combines the score matching loss with a proximal penalty term over iterations, which lets our method avoid reparametrized sampling and admits noisy yet unbiased mini-batch scores through stochastic gradients. This in turn makes our method scalable to large-scale neural networks, including Vision Transformers, and allows for richer variational density families. On several benchmarks, including visual recognition and time-series forecasting with large-scale deep networks, we empirically demonstrate the effectiveness of our approach.
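To make the objective structure concrete, here is a minimal toy sketch of the general idea: each outer iteration minimizes a score matching loss plus a proximal penalty anchored at the previous iterate, using noisy but unbiased mini-batch estimates of the posterior score. Everything below (the conjugate-Gaussian model, the fixed variational variance, the step sizes) is an illustrative assumption for a one-dimensional example; it is not the paper's actual algorithm or implementation.

```python
import numpy as np

# Toy setup (assumed for illustration): data x_i ~ N(theta, 1) with prior
# theta ~ N(0, 1), so the exact posterior over theta is Gaussian and we can
# check the variational answer against it.
rng = np.random.default_rng(0)
N = 1000
x = rng.normal(1.5, 1.0, size=N)
post_prec = N + 1.0                    # exact posterior precision
post_mean = x.sum() / post_prec       # exact posterior mean (for checking)

# Variational family: q_phi = N(phi, 1/post_prec), with only the mean phi learned.
# For this model the score gap s_q(theta) - s_p(theta) = post_prec*phi - sum_i x_i
# is constant in theta, so the score matching loss is (post_prec*phi - sum_i x_i)^2.
# A mini-batch gives an unbiased estimate of sum_i x_i, hence an unbiased
# stochastic gradient -- mimicking the mini-batch scores used at scale.
phi, lam, lr, B = -3.0, 1.0, 1e-7, 64
for t in range(300):
    phi_anchor = phi                               # proximal anchor for this outer step
    for _ in range(5):                             # a few noisy inner gradient steps
        batch = rng.choice(x, size=B, replace=False)
        S_hat = (N / B) * batch.sum()              # unbiased estimate of sum_i x_i
        # gradient of score matching loss + proximal penalty lam*(phi - phi_anchor)^2
        grad = 2 * post_prec * (post_prec * phi - S_hat) + 2 * lam * (phi - phi_anchor)
        phi -= lr * grad

print(f"variational mean {phi:.4f} vs exact posterior mean {post_mean:.4f}")
```

Note that no reparametrized sampling through q is needed here: the gradient is taken directly on the (stochastic) score matching objective, with the proximal term keeping each outer update close to the previous iterate.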

Minyoung Kim • 2026

Related benchmarks

Task                       Dataset    Metric           Result    Rank
Time Series Forecasting    ETTh2      MSE              0.2375    438
Time Series Forecasting    ECL        MSE              0.1228    183
Time Series Forecasting    ILI        MAE              0.8862    58
Image Classification       Pets       Error Rate (%)   7.746     12
Image Classification       Flowers    Error Rate (%)   11.97     12
Image Classification       Aircraft   Error Rate (%)   44.196    12
Image Classification       DTD        Error Rate (%)   32.498    12
Time Series Forecasting    Traffic    MSE              0.429     3
Time Series Forecasting    Weather    MSE              0.1265    3
