
Recurrent Neural Network Regularization

About

We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique for regularizing neural networks, does not work well with RNNs and LSTMs. In this paper, we show how to correctly apply dropout to LSTMs, and show that it substantially reduces overfitting on a variety of tasks. These tasks include language modeling, speech recognition, image caption generation, and machine translation.
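The core idea of the paper is to apply dropout only to the non-recurrent connections of a multilayer LSTM (the inputs flowing up from the layer below), while leaving the recurrent hidden-to-hidden connections intact across timesteps. The following is a minimal NumPy sketch of that idea; the function names, gate ordering, and dimensions are illustrative, not taken from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dropout(x, p, train=True):
    # Inverted dropout: zero each unit with probability p,
    # rescale survivors so expectations match at test time.
    if not train or p == 0.0:
        return x
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

def lstm_step(x, h_prev, c_prev, W, U, b):
    # One standard LSTM step; gate order is [input, forget, output, candidate].
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:n])
    f = sigmoid(z[n:2 * n])
    o = sigmoid(z[2 * n:3 * n])
    g = np.tanh(z[3 * n:])
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

def run_layer(xs, W, U, b, p_drop=0.5, train=True):
    # Run one LSTM layer over a sequence. Dropout is applied ONLY to
    # the non-recurrent input xs[t] (the connection from the layer
    # below); the recurrent state h flows between timesteps untouched,
    # which is the regularization scheme the paper proposes.
    n = U.shape[1]
    h = np.zeros(n)
    c = np.zeros(n)
    hs = []
    for x in xs:
        x = dropout(x, p_drop, train)  # non-recurrent connection only
        h, c = lstm_step(x, h, c, W, U, b)
        hs.append(h)
    return np.stack(hs)
```

In a deeper stack, the same `dropout` would also sit between layers (on each layer's output before it feeds the next), so information is perturbed only a fixed number of times regardless of sequence length.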

Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals • 2014

Related benchmarks

Task                  Dataset                          Metric       Result   Rank
Language Modeling     WikiText-2 (test)                Perplexity   96.41    1541
Language Modeling     WikiText-103 (test)              Perplexity   48.7     524
Language Modeling     PTB (test)                       Perplexity   78.4     471
Language Modeling     Penn Treebank (test)             Perplexity   56       411
Language Modeling     Penn Treebank (val)              Perplexity   71.9     178
Language Modeling     Penn Treebank (PTB) (test)       Perplexity   79.34    120
Language Modeling     PTB (val)                        Perplexity   82.2     83
Language Modeling     Penn Treebank word-level (test)  Perplexity   78.4     72
Machine Translation   WMT en-fr 14                     BLEU score   29.03    56
Image Classification  CIFAR-10                         Accuracy     94.53    15

(Showing 10 of 15 rows)
