
Mogrifier LSTM

About

Many advances in Natural Language Processing have been based upon more expressive models for how inputs interact with the context in which they occur. Recurrent networks, which have enjoyed a modicum of success, still lack the generalization and systematicity ultimately required for modelling language. In this work, we propose an extension to the venerable Long Short-Term Memory in the form of mutual gating of the current input and the previous output. This mechanism affords the modelling of a richer space of interactions between inputs and their context. Equivalently, our model can be viewed as making the transition function given by the LSTM context-dependent. Experiments demonstrate markedly improved generalization on language modelling in the range of 3-4 perplexity points on Penn Treebank and Wikitext-2, and 0.01-0.05 bpc on four character-based datasets. We establish a new state of the art on all datasets with the exception of Enwik8, where we close a large gap between the LSTM and Transformer models.
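The mutual gating described above alternates between modulating the current input with the previous state and modulating the state with the (already gated) input, over a small number of rounds. Below is a minimal NumPy sketch of that idea; the parameter names `Q` and `R`, the round count, and the toy dimensions are illustrative assumptions (the published model uses separate matrices per round and low-rank factorizations), not the authors' exact implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mogrify(x, h_prev, Q, R, rounds=5):
    """Mutually gate input x and previous state h_prev before the LSTM step.

    Odd rounds rescale x by a gate computed from the state; even rounds
    rescale the state by a gate computed from the (gated) input. Q and R
    are illustrative shared parameters; with rounds=0 this is a plain LSTM.
    """
    for i in range(1, rounds + 1):
        if i % 2 == 1:
            x = 2 * sigmoid(Q @ h_prev) * x       # gate the input by the context
        else:
            h_prev = 2 * sigmoid(R @ x) * h_prev  # gate the context by the input
    return x, h_prev

# Toy sizes (hypothetical): input dim 4, hidden dim 3.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 3))  # maps hidden -> input dimension
R = rng.normal(size=(3, 4))  # maps input -> hidden dimension
x, h = mogrify(rng.normal(size=4), rng.normal(size=3), Q, R)
```

The gated `x` and `h` would then be fed into an ordinary LSTM cell, which is why the model can equivalently be read as making the LSTM's transition function context-dependent.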

Gábor Melis, Tomáš Kočiský, Phil Blunsom • 2019

Related benchmarks

Task                                Dataset                                   Result            Rank
Language Modeling                   WikiText-2 (test)                         PPL 38.6          1541
Language Modeling                   PTB (test)                                Perplexity 44.8   471
Character-level Language Modeling   enwik8 (test)                             BPC 1.146         195
Character-level Language Modeling   Enwik8 (val)                              BPC 1.135         15
Language Modeling                   PTB English Mikolov preprocessed (val)    Perplexity 44.9   13
Language Modeling                   Wikitext-2 Standard (val)                 Perplexity 40.2   12

Other info

Code
