
Masked Bayesian Neural Networks: Theoretical Guarantee and its Posterior Inference

About

Bayesian approaches for learning deep neural networks (BNN) have received much attention and have been successfully applied to various applications. In particular, BNNs have the merit of better generalization ability as well as better uncertainty quantification. For the success of BNNs, searching for an appropriate architecture of the neural network is an important task, and various algorithms for finding good sparse neural networks have been proposed. In this paper, we propose a new node-sparse BNN model which has good theoretical properties and is computationally feasible. We prove that the posterior concentration rate to the true model is near minimax optimal and adaptive to the smoothness of the true model. In particular, the adaptiveness is the first of its kind for node-sparse BNNs. In addition, we develop a novel MCMC algorithm which makes Bayesian inference of the node-sparse BNN model feasible in practice.
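To make the node-sparsity idea concrete, here is a minimal sketch (not the authors' implementation) of a hidden layer whose nodes are switched on or off by a binary mask vector, e.g. drawn from a Bernoulli prior. All names (`masked_layer`, `mask`) and the choice of activation are our assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_layer(x, W, b, mask):
    """Forward pass of one hidden layer; `mask` zeroes out pruned nodes.

    Node-level (rather than weight-level) sparsity: a masked node
    contributes exactly zero to every downstream unit.
    """
    h = np.tanh(x @ W + b)  # illustrative activation choice
    return h * mask         # elementwise: masked nodes output 0

d_in, d_hidden = 4, 8
W = rng.normal(size=(d_in, d_hidden))
b = np.zeros(d_hidden)
# One draw of the node mask, e.g. from a Bernoulli(0.5) prior
mask = rng.binomial(1, 0.5, size=d_hidden)

x = rng.normal(size=(1, d_in))
out = masked_layer(x, W, b, mask)
# Masked nodes contribute exactly zero
assert np.all(out[:, mask == 0] == 0)
```

In a full Bayesian treatment, an MCMC sampler would update both the weights and the mask variables, so that posterior samples range over architectures of different effective widths.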

Insung Kong, Dongyoon Yang, Jongjin Lee, Ilsang Ohn, Gyuseung Baek, Yongdai Kim • 2023

Related benchmarks

Task                          | Dataset        | Metric  | Result | Rank
Regression                    | Abalone        | RMSE    | 2.081  | 17
Classification                | breast-w       | ROC-AUC | 0.978  | 13
Regression                    | Boston         | RMSE    | 4.277  | 12
Classification                | Breast         | --      | --     | 12
Out-of-Distribution Detection | Churn (test)   | AUROC   | 0.599  | 7
Out-of-Distribution Detection | FICO (test)    | AUROC   | 0.519  | 7
Classification                | FICO           | AUROC   | 0.74   | 7
Out-of-Distribution Detection | BREAST (test)  | AUROC   | 0.503  | 7
Regression                    | MPG            | RMSE    | 2.897  | 7
Classification                | FICO           | ECE     | 0.219  | 3

Showing 10 of 14 rows
