
Hierarchical Multiscale Recurrent Neural Networks

About

Learning both hierarchical and temporal representations has been among the long-standing challenges of recurrent neural networks. Multiscale recurrent neural networks have been considered a promising approach to this problem, yet there has been a lack of empirical evidence showing that models of this type can actually capture temporal dependencies by discovering the latent hierarchical structure of the sequence. In this paper, we propose a novel multiscale approach, called the hierarchical multiscale recurrent neural network, which can capture the latent hierarchical structure in the sequence by encoding temporal dependencies at different timescales using a novel update mechanism. We show evidence that our proposed multiscale architecture can discover the underlying hierarchical structure in sequences without using explicit boundary information. We evaluate our proposed model on character-level language modelling and handwriting sequence modelling.
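The update mechanism referred to above lets each layer either keep, refresh, or reset its state depending on binary boundary detectors. The sketch below is not the authors' code; it is a minimal illustration of that COPY/UPDATE/FLUSH rule, with the function name, scalar states, fixed forget gate, and use of NumPy all being our own simplifying assumptions.

```python
# Minimal sketch of the boundary-gated layer update described in the paper.
# z_prev  : boundary detected by THIS layer at the previous time step (0/1)
# z_below : boundary detected by the layer BELOW at the current time step (0/1)
import numpy as np

def hm_layer_step(h_prev, c_prev, g, i, z_prev, z_below):
    if z_prev == 1:
        # FLUSH: a segment just ended; restart this layer's summary from scratch.
        c = i * g
    elif z_below == 1:
        # UPDATE: new information arrived from below; do a normal gated update.
        f = 0.9  # forget gate fixed to a constant only to keep the sketch short
        c = f * c_prev + i * g
    else:
        # COPY: nothing changed below; keep hidden and cell state untouched.
        return h_prev, c_prev
    h = np.tanh(c)
    return h, c

# Tiny usage example with scalar states.
h, c = hm_layer_step(h_prev=0.2, c_prev=0.5, g=0.8, i=0.6, z_prev=0, z_below=1)
```

In the paper the boundary detectors themselves are learned (with a straight-through estimator for the binarization); here they are passed in as plain 0/1 flags for brevity.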

Junyoung Chung, Sungjin Ahn, Yoshua Bengio • 2016

Related benchmarks

Task | Dataset | Metric | Result | Rank
Character-level Language Modeling | enwik8 (test) | BPC | 1.32 | 195
Character-level Language Modeling | text8 (test) | BPC | 1.29 | 128
Character-level Language Modeling | Penn Treebank (test) | BPC | 1.24 | 113
Character-level Prediction | PTB (test) | BPC (test) | 1.24 | 42
Character-level Language Modeling | Hutter Prize Wikipedia (test) | Bits/Char | 1.32 | 28
Character-level Language Modeling | Penn Treebank char-level (test) | BPC | 1.24 | 25
Character-level Language Modeling | text8 | BPC | 1.29 | 16
Character-level Language Modeling | Penn Treebank character-level (val) | BPC | 1.24 | 10
Byte-size token prediction | Byte-size token prediction dataset (val) | BPC | 1.29 | 7
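All results above use bits per character (BPC), the model's average negative log-likelihood per character expressed in base 2; lower is better. A small sketch of the conversion from cross-entropy in nats (the function name and example value are ours, not from the paper):

```python
import numpy as np

def bits_per_char(nll_nats_per_char):
    # Convert average negative log-likelihood per character from nats to bits.
    return nll_nats_per_char / np.log(2.0)

# e.g. a cross-entropy of about 0.915 nats/char corresponds to about 1.32 BPC.
print(bits_per_char(0.915))
```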
