
Binarized Neural Networks

About

We introduce a method to train Binarized Neural Networks (BNNs) - neural networks with binary weights and activations at run-time and when computing the parameters' gradient at train-time. We conduct two sets of experiments, each based on a different framework, namely Torch7 and Theano, where we train BNNs on MNIST, CIFAR-10 and SVHN, and achieve nearly state-of-the-art results. During the forward pass, BNNs drastically reduce memory size and accesses, and replace most arithmetic operations with bit-wise operations, which might lead to a great increase in power-efficiency. Last but not least, we wrote a binary matrix multiplication GPU kernel with which it is possible to run our MNIST BNN 7 times faster than with an unoptimized GPU kernel, without suffering any loss in classification accuracy. The code for training and running our BNNs is available.
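The abstract's claim that most arithmetic can be replaced by bit-wise operations comes from the fact that a dot product of two {-1, +1} vectors equals XNOR followed by a population count. A minimal NumPy sketch of that equivalence (function names here are illustrative, not from the released code):

```python
import numpy as np

def binarize(x):
    # Deterministic binarization: sign(x), with sign(0) mapped to +1
    return np.where(x >= 0, 1, -1).astype(np.int8)

def xnor_popcount_dot(a_bits, b_bits):
    # a_bits, b_bits: {0,1} arrays encoding {-1,+1} (1 -> +1, 0 -> -1)
    n = a_bits.size
    agree = ((~(a_bits ^ b_bits)) & 1).sum()   # XNOR, then popcount
    return 2 * agree - n                        # agreements minus disagreements

# Check against an ordinary float dot product of the same {-1,+1} vectors
rng = np.random.default_rng(0)
w, x = rng.standard_normal(16), rng.standard_normal(16)
wb, xb = binarize(w), binarize(x)
assert xnor_popcount_dot((wb > 0).astype(np.uint8),
                         (xb > 0).astype(np.uint8)) == int(wb @ xb)
```

On real hardware the bit vectors are packed into machine words, so one XNOR plus one popcount instruction covers 32 or 64 multiply-accumulates at once, which is the basis of the paper's binary matrix multiplication GPU kernel.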

Itay Hubara, Daniel Soudry, Ran El-Yaniv • 2016
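Training with binary weights relies on keeping real-valued weights and binarizing them on the forward pass, while the gradient flows through the sign function via a straight-through estimator that zeroes gradients where the weight leaves [-1, 1]. A hedged sketch of that rule (a simplified illustration, not the authors' released Torch7/Theano code):

```python
import numpy as np

def binarize_forward(w):
    # Forward pass: use sign(w) in place of the real-valued weight
    return np.where(w >= 0, 1.0, -1.0)

def ste_backward(w, grad_out):
    # Straight-through estimator: pass the gradient unchanged where
    # |w| <= 1, and cancel it where the weight has saturated
    return grad_out * (np.abs(w) <= 1.0)

w = np.array([0.5, 2.0, -0.3, -1.5])
print(binarize_forward(w))              # binary weights used at run-time
print(ste_backward(w, np.ones_like(w))) # gradient mask applied at train-time
```

The real-valued weights are what the optimizer updates; only their signs are ever used for inference, which is why run-time memory and arithmetic shrink so drastically.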

Related benchmarks

| Task                       | Dataset                | Metric    | Result | Rank |
|----------------------------|------------------------|-----------|--------|------|
| Image Classification       | CIFAR-10 (test)        | Accuracy  | 89.9   | 3381 |
| Image Classification       | ImageNet (val)         | Top-1 Acc | 42.2   | 1206 |
| Image Classification       | CIFAR-10 (test)        | Accuracy  | 88.6   | 906  |
| Image Super-resolution     | Set5                   | PSNR      | 13.97  | 507  |
| Image Super-resolution     | Urban100               | PSNR      | 12.75  | 221  |
| Image Classification       | ImageNet-1k (val)      | Top-1 Acc | 42.2   | 188  |
| Medical Image Segmentation | ISIC                   | DICE      | 56     | 64   |
| Image Super-resolution     | Manga109               | LPIPS     | 0.7489 | 38   |
| Image Super-resolution     | B100                   | PSNR      | 13.73  | 24   |
| Object Detection           | MS-COCO 2014 (minival) | mAP       | 6.2    | 23   |

Showing 10 of 19 rows.
