
Architectural Complexity Measures of Recurrent Neural Networks

About

In this paper, we systematically analyze the connecting architectures of recurrent neural networks (RNNs). Our main contribution is twofold: first, we present a rigorous graph-theoretic framework describing the connecting architectures of RNNs in general. Second, we propose three architecture complexity measures of RNNs: (a) the recurrent depth, which captures the RNN's over-time nonlinear complexity; (b) the feedforward depth, which captures the local input-output nonlinearity (similar to the "depth" in feedforward neural networks (FNNs)); and (c) the recurrent skip coefficient, which captures how rapidly information propagates over time. We rigorously prove each measure's existence and computability. Our experimental results show that RNNs might benefit from larger recurrent depth and feedforward depth. We further demonstrate that increasing the recurrent skip coefficient offers performance boosts on long-term dependency problems.
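The recurrent depth and recurrent skip coefficient are defined in the paper as asymptotic limits over the unfolded computation graph: roughly, the longest (respectively shortest) chain of hidden-to-hidden transitions needed to cross n time steps, divided by n, as n grows. The sketch below is a finite-horizon approximation of that idea for a simple family of RNNs whose only architectural choice is the set of recurrent skip lengths (e.g. [1] for a plain RNN, [1, 3] for one with an added skip-3 connection). The function names and this restricted graph family are illustrative assumptions, not the paper's general framework.

```python
def skip_coefficient(skips, T):
    """Approximate recurrent skip coefficient over a T-step unrolling.

    skips: spans of the recurrent connections, e.g. [1] for a plain RNN,
    [1, 3] when a skip-3 hidden-to-hidden connection is added.
    The coefficient is T divided by the fewest hidden-to-hidden hops
    needed to carry information across all T steps.
    """
    INF = float("inf")
    dist = [INF] * (T + 1)  # dist[t]: fewest hops to reach time t from time 0
    dist[0] = 0
    for t in range(T + 1):
        if dist[t] == INF:
            continue
        for s in skips:
            if t + s <= T:
                dist[t + s] = min(dist[t + s], dist[t] + 1)
    return T / dist[T]


def recurrent_depth(skips, T):
    """Approximate recurrent depth over a T-step unrolling: the longest
    chain of nonlinear hidden-to-hidden transitions across T steps,
    divided by T."""
    best = [0] * (T + 1)  # best[t]: most hops on any path from time 0 to t
    for t in range(1, T + 1):
        cands = [best[t - s] + 1 for s in skips if t - s >= 0]
        best[t] = max(cands) if cands else 0
    return best[T] / T


# A plain RNN has skip coefficient 1; adding a skip-3 connection lets
# information cross time in one-third as many hops.
print(skip_coefficient([1], 12))     # → 1.0
print(skip_coefficient([1, 3], 12))  # → 3.0
print(recurrent_depth([1, 3], 12))   # → 1.0 (longest path still uses the 1-step edges)
```

Note that the two measures pull in opposite directions here: the skip connection shortens the path information must travel (raising the skip coefficient) without changing the longest chain of nonlinearities (the recurrent depth).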

Saizheng Zhang, Yuhuai Wu, Tong Che, Zhouhan Lin, Roland Memisevic, Ruslan Salakhutdinov, Yoshua Bengio • 2016

Related benchmarks

Task | Dataset | Metric | Result | Rank
Natural Language Inference | SNLI (test) | Accuracy | 77.6 | 681
Language Modeling | Penn Treebank (test) | Perplexity | 80.2 | 411
Sentiment Analysis | IMDB (test) | Accuracy | 92.88 | 248
Language Modeling | Penn Treebank (val) | Perplexity | 83.6 | 178
Character-level Language Modeling | text8 (test) | BPC | 1.63 | 128
Pixel-by-pixel Image Classification | Permuted Sequential MNIST (pMNIST) (test) | Accuracy | 94 | 79
Paraphrase Detection | QQP (test) | Accuracy | 82.58 | 51
Permuted Sequential Image Classification | MNIST Permuted Sequential | Test Accuracy Mean | 94 | 50
Sequential Image Classification | MNIST Sequential (test) | Accuracy | 98.1 | 47
Image Classification | pixel-by-pixel MNIST (test) | Accuracy | 98.1 | 28
Showing 10 of 14 rows
