
Zoneout: Regularizing RNNs by Randomly Preserving Hidden Activations

About

We propose zoneout, a novel method for regularizing RNNs. At each timestep, zoneout stochastically forces some hidden units to maintain their previous values. Like dropout, zoneout uses random noise to train a pseudo-ensemble, improving generalization. But by preserving instead of dropping hidden units, gradient information and state information are more readily propagated through time, as in feedforward stochastic depth networks. We perform an empirical investigation of various RNN regularizers, and find that zoneout gives significant performance improvements across tasks. We achieve competitive results with relatively simple models in character- and word-level language modelling on the Penn Treebank and Text8 datasets, and combining with recurrent batch normalization yields state-of-the-art results on permuted sequential MNIST.
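The update rule described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: with probability `z_prob`, each hidden unit keeps its previous value instead of the freshly computed one; at test time the stochastic choice is replaced by its expectation, analogous to dropout's inference-time scaling. All names here are illustrative.

```python
import numpy as np

def zoneout(h_prev, h_new, z_prob, training=True, rng=None):
    """Apply zoneout to one recurrent state update (illustrative sketch).

    During training, each unit independently keeps its previous value
    h_prev with probability z_prob; otherwise it takes the new value
    h_new. At inference, return the expected value of that mixture.
    """
    if training:
        rng = rng or np.random.default_rng()
        keep_old = rng.random(h_new.shape) < z_prob  # True => preserve h_prev
        return np.where(keep_old, h_prev, h_new)
    # Deterministic expectation at test time
    return z_prob * h_prev + (1.0 - z_prob) * h_new
```

Because zoned-out units copy their previous state exactly (an identity connection through time), gradients flow through them unattenuated, which is the mechanism the abstract contrasts with dropout's zeroing of activations.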

David Krueger, Tegan Maharaj, János Kramár, Mohammad Pezeshki, Nicolas Ballas, Nan Rosemary Ke, Anirudh Goyal, Yoshua Bengio, Aaron Courville, Chris Pal • 2016

Related benchmarks

Task | Dataset | Metric | Result | Rank
---- | ------- | ------ | ------ | ----
Natural Language Inference | SNLI (test) | Accuracy | 81.7 | 681
Language Modeling | PTB (test) | Perplexity | 77.4 | 471
Character-level Language Modeling | Penn Treebank (test) | BPC | 1.27 | 113
Sequential Image Classification | PMNIST (test) | Accuracy (Test) | 95.9 | 77
Language Modeling | Penn Treebank word-level (test) | Perplexity | 77.4 | 72
Question Answering | bAbI (test) | Mean Error | 36.41 | 54
Permuted Sequential Image Classification | MNIST Permuted Sequential | Test Accuracy Mean | 95.9 | 50
Sequential Image Classification | MNIST Sequential (test) | Accuracy | 98.2 | 47
Character-level Prediction | PTB (test) | BPC (Test) | 1.27 | 42
Permuted Pixel-by-Pixel MNIST Classification | Permuted MNIST (pMNIST) pixel-by-pixel (test) | Accuracy (Clean) | 95.9 | 25

(10 of 13 rows shown)
