
Deep Networks with Stochastic Depth

About

Very deep convolutional networks with hundreds of layers have led to significant reductions in error on competitive benchmarks. Although the unmatched expressiveness of the many layers can be highly desirable at test time, training very deep networks comes with its own set of challenges. The gradients can vanish, the forward flow often diminishes, and the training time can be painfully slow. To address these problems, we propose stochastic depth, a training procedure that enables the seemingly contradictory setup to train short networks and use deep networks at test time. We start with very deep networks but during training, for each mini-batch, randomly drop a subset of layers and bypass them with the identity function. This simple approach complements the recent success of residual networks. It reduces training time substantially and improves the test error significantly on almost all data sets that we used for evaluation. With stochastic depth we can increase the depth of residual networks even beyond 1200 layers and still yield meaningful improvements in test error (4.91% on CIFAR-10).
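The procedure described above can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: `stochastic_depth_block` and `survival_schedule` are hypothetical names, and `residual_fn` stands in for an arbitrary residual branch. It shows the two key ideas from the abstract: during training each block is bypassed with the identity function at random, and at test time the full branch runs, scaled by its survival probability. The linearly decaying schedule (shallow blocks survive more often than deep ones) follows the rule given in the paper, p_l = 1 - (l/L)(1 - p_L).

```python
import numpy as np

def stochastic_depth_block(x, residual_fn, survival_prob, training, rng):
    """One residual block trained with stochastic depth.

    Training: the residual branch is skipped entirely with probability
    (1 - survival_prob), leaving only the identity shortcut.
    Test time: the full branch runs, scaled by survival_prob so the
    expected output matches what was seen during training.
    """
    if training:
        if rng.random() < survival_prob:
            return x + residual_fn(x)   # block "alive" for this mini-batch
        return x                        # block dropped: pure identity bypass
    return x + survival_prob * residual_fn(x)

def survival_schedule(num_blocks, p_last=0.5):
    """Linearly decaying survival probabilities p_l = 1 - (l/L)(1 - p_L)."""
    return [1.0 - (l / num_blocks) * (1.0 - p_last)
            for l in range(1, num_blocks + 1)]
```

With `p_last=0.5`, a 4-block network gets survival probabilities [0.875, 0.75, 0.625, 0.5]: the expected depth during training is well below the full depth, which is where the training-time savings come from.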

Gao Huang, Yu Sun, Zhuang Liu, Daniel Sedra, Kilian Weinberger • 2016

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | CIFAR-100 (test) | Accuracy: 88.42 | 3518
Image Classification | CIFAR-10 (test) | -- | 3381
Language Modeling | WikiText-2 (test) | -- | 1541
Image Classification | ImageNet-1k (val) | -- | 1453
Image Classification | ImageNet (val) | Top-1 Acc: 77.5 | 1206
Image Classification | CIFAR-10 (test) | -- | 906
Image Classification | CIFAR-10 | Accuracy: 82.2 | 507
Image Classification | CIFAR-10 | -- | 471
Image Classification | ImageNet ILSVRC-2012 (val) | -- | 405
Classification | SVHN (test) | Error Rate: 1.75 | 182

Showing 10 of 15 rows
