
Training Very Deep Networks

About

Theoretical and empirical evidence indicates that the depth of neural networks is crucial for their success. However, training becomes more difficult as depth increases, and training very deep networks remains an open problem. Here we introduce a new architecture designed to ease gradient-based training of very deep networks. The resulting highway networks allow unimpeded information flow across many layers on information highways. They are inspired by Long Short-Term Memory recurrent networks and use adaptive gating units to regulate the information flow. Even with hundreds of layers, highway networks can be trained directly through simple gradient descent. This enables the study of extremely deep and efficient architectures.
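The gating scheme sketched in the abstract can be illustrated with a minimal NumPy implementation of a single highway layer, y = H(x)·T(x) + x·(1 − T(x)), where T is the LSTM-style transform gate and (1 − T) the carry gate. The layer sizes, tanh nonlinearity for H, and the negative gate-bias initialization are illustrative assumptions, not taken verbatim from this page:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, W_H, b_H, W_T, b_T):
    """One highway layer: y = H(x)*T(x) + x*(1 - T(x)).

    H is an ordinary nonlinear transform; T is the adaptive
    "transform gate", and (1 - T) acts as the "carry gate" that
    lets information pass through the layer unchanged.
    """
    H = np.tanh(x @ W_H + b_H)      # candidate transform of the input
    T = sigmoid(x @ W_T + b_T)      # transform gate, values in (0, 1)
    return H * T + x * (1.0 - T)    # gated mix of transform and identity

# Hypothetical sizes; input and output widths must match so the
# identity path x * (1 - T) is well defined when layers are stacked.
d = 8
x = rng.standard_normal((4, d))
W_H = rng.standard_normal((d, d)) * 0.1
W_T = rng.standard_normal((d, d)) * 0.1
b_H = np.zeros(d)
b_T = np.full(d, -2.0)  # negative gate bias favors carrying x through early in training

y = highway_layer(x, W_H, b_H, W_T, b_T)
```

With a strongly negative gate bias the layer starts out close to the identity function, which is what makes stacks of hundreds of such layers trainable by plain gradient descent.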

Rupesh Kumar Srivastava, Klaus Greff, Jürgen Schmidhuber • 2015

Related benchmarks

Task                  Dataset                                      Result                  Rank
Image Classification  CIFAR-100 (test)                             --                      3518
Image Classification  CIFAR-10 (test)                              --                      3381
Image Classification  CIFAR-10 (test)                              Accuracy: 92.4          906
Image Classification  MNIST (test)                                 Accuracy: 99.55         882
Image Classification  CIFAR-100                                    --                      622
Image Classification  CIFAR-10                                     Accuracy: 92.4          471
Classification        CIFAR-100 (test)                             Accuracy: 67.76         129
Image Classification  CIFAR-10 (test)                              Error Rate: 7.72        102
Image Classification  CIFAR-100 2009 (test)                        Accuracy: 68.09         53
Image Classification  CIFAR-10 Standard data augmentation (test)   Test Error Rate: 7.6    43

Showing 10 of 22 rows

Other info

Code
