
Bayesian Convolutional Neural Networks with Bernoulli Approximate Variational Inference

About

Convolutional neural networks (CNNs) work well on large datasets. But labelled data is hard to collect, and in some applications large amounts of data are simply not available. The problem then is how to use CNNs with small data, as CNNs overfit quickly. We present an efficient Bayesian CNN that offers better robustness to over-fitting on small data than traditional approaches. We achieve this by placing a probability distribution over the CNN's kernels. We approximate our model's intractable posterior with Bernoulli variational distributions, requiring no additional model parameters. On the theoretical side, we cast dropout network training as approximate inference in Bayesian neural networks. This allows us to implement our model using existing deep learning tools with no increase in time complexity, while highlighting a negative result in the field. We show a considerable improvement in classification accuracy compared to standard techniques and improve on published state-of-the-art results for CIFAR-10.
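The practical recipe implied by the abstract is Monte Carlo (MC) dropout: train a CNN with Bernoulli dropout, keep dropout active at test time, and average several stochastic forward passes to approximate the Bayesian predictive distribution. Below is a minimal NumPy sketch of that idea for a single convolutional layer; it is an illustration under our own assumptions (toy `conv2d`, dropout rate `p`, `T` samples), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Naive single-channel 'valid' 2-D convolution, for illustration only."""
    kh, kw = w.shape
    H, W = x.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

def mc_dropout_forward(x, w, p=0.5):
    """One stochastic pass: conv + ReLU, then Bernoulli dropout kept at test time."""
    h = np.maximum(conv2d(x, w), 0.0)
    mask = rng.binomial(1, 1.0 - p, size=h.shape) / (1.0 - p)  # inverted dropout
    return h * mask

# Monte Carlo estimate of the predictive mean; the spread across passes
# serves as a proxy for the model's (epistemic) uncertainty.
x = rng.standard_normal((8, 8))   # toy input "image"
w = rng.standard_normal((3, 3))   # toy kernel
T = 200                           # number of stochastic forward passes
samples = np.stack([mc_dropout_forward(x, w) for _ in range(T)])
pred_mean = samples.mean(axis=0)
pred_std = samples.std(axis=0)
```

Because each pass samples a fresh Bernoulli mask, averaging over `T` passes approximates integrating over the Bernoulli variational posterior on the kernels, which is why no extra parameters are needed beyond the standard dropout network.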

Yarin Gal, Zoubin Ghahramani · 2015

Related benchmarks

Task                           Dataset                                                        Metric    Result   Rank
Classification                 CUB (test)                                                     Accuracy  76.08    79
Out-of-Distribution Detection  SVHN (in-distribution) / TinyImageNet (out-of-distribution), test  AUROC  63.35  46
Classification                 Caltech101 (test)                                              Accuracy  73.45    33
Multi-view Classification      CUB (test)                                                     Accuracy  92.33    14
Multi-view Classification      HMDB (test)                                                    Accuracy  71.68    14
Multi-view Classification      PIE (test)                                                     Accuracy  91.32    14
Multi-view Classification      Caltech101 (test)                                              Accuracy  92.95    14
Image Classification           MNIST (0-4) (test)                                             Accuracy  99.91    12
Image Classification           Tiny ImageNet 0-99                                             Accuracy  43.48    12
Image Classification           CIFAR100 0-49 In-Distribution (test)                           Accuracy  59.88    12

(Showing 10 of 21 rows)
