Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks

About

Natural language is hierarchically structured: smaller units (e.g., phrases) are nested within larger units (e.g., clauses). When a larger constituent ends, all of the smaller constituents that are nested within it must also be closed. While the standard LSTM architecture allows different neurons to track information at different time scales, it does not have an explicit bias towards modeling a hierarchy of constituents. This paper proposes to add such an inductive bias by ordering the neurons; a vector of master input and forget gates ensures that when a given neuron is updated, all the neurons that follow it in the ordering are also updated. Our novel recurrent architecture, ordered neurons LSTM (ON-LSTM), achieves good performance on four different tasks: language modeling, unsupervised parsing, targeted syntactic evaluation, and logical inference.
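To make the master-gate mechanism concrete, below is a minimal single-step sketch of an ON-LSTM cell in PyTorch. It is not the authors' released implementation: the class name ONLSTMCell, the single fused gate projection, and the omission of the paper's chunked (downsampled) master gates are simplifying assumptions of this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def cumax(x, dim=-1):
    """Cumulative softmax: a soft, monotonically increasing step from ~0 to 1."""
    return torch.cumsum(F.softmax(x, dim=dim), dim=dim)


class ONLSTMCell(nn.Module):
    """Single-step ON-LSTM cell (simplified sketch; hypothetical class name)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        # One projection for all six gates: forget, input, output,
        # cell candidate, master forget, master input.
        self.proj = nn.Linear(input_size + hidden_size, 6 * hidden_size)

    def forward(self, x, state):
        h, c = state
        f, i, o, g, mf, mi = self.proj(torch.cat([x, h], dim=-1)).chunk(6, dim=-1)

        f, i, o = torch.sigmoid(f), torch.sigmoid(i), torch.sigmoid(o)
        g = torch.tanh(g)

        # The master forget gate rises from ~0 to ~1 along the neuron ordering
        # and the master input gate falls from ~1 to ~0, so erase/write
        # decisions form contiguous blocks: updating a neuron forces an update
        # of all neurons that follow it in the ordering (the hierarchy bias).
        master_f = cumax(mf)            # ~ [0, ..., 0, 1, ..., 1]
        master_i = 1.0 - cumax(mi)      # ~ [1, ..., 1, 0, ..., 0]
        omega = master_f * master_i     # overlap where old and new info mix

        f_hat = f * omega + (master_f - omega)
        i_hat = i * omega + (master_i - omega)

        c = f_hat * c + i_hat * g
        h = o * torch.tanh(c)
        return h, (h, c)


# One step on random data (shapes only; untrained).
cell = ONLSTMCell(input_size=10, hidden_size=16)
x = torch.randn(4, 10)
h = torch.zeros(4, 16)
c = torch.zeros(4, 16)
h, (h, c) = cell(x, (h, c))
```

The fused projection only keeps the sketch short; the essential ingredient is cumax, which turns each master gate into the expectation of a hard split point over the neuron ordering.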

Yikang Shen, Shawn Tan, Alessandro Sordoni, Aaron Courville • 2018

Related benchmarks

Task                              | Dataset                        | Metric                       | Result | Rank
Language Modeling                 | PTB (test)                     | Perplexity                   | 87.2   | 471
Language Modeling                 | Penn Treebank (test)           | Perplexity                   | 56.2   | 411
Language Modeling                 | Penn Treebank (PTB) (test)     | Perplexity                   | 56.2   | 120
Unsupervised Parsing              | PTB (test)                     | F1 Score                     | 50     | 75
Language Modeling                 | Penn Treebank word-level (test)| Perplexity                   | 56.17  | 72
Language Modeling                 | Penn Treebank (PTB) (val)      | Perplexity                   | 58.3   | 70
Unconditional Text Generation     | EMNLP 2017 WMT News            | Perplexity                   | 37.46  | 64
Unsupervised Constituency Parsing | Chinese Treebank (CTB) (test)  | Unlabeled Sentence F1 (Mean) | 25.4   | 36
Unsupervised Constituency Parsing | SUSANNE (test)                 | F1 Score                     | 33.1   | 32
Grammar Induction                 | PTB English (test)             | F1 Score                     | 47.4   | 29
Showing 10 of 25 rows

Other info

Code: https://github.com/yikangshen/Ordered-Neurons
