Transition-Based Dependency Parsing with Stack Long Short-Term Memory

About

We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks: the stack LSTM. Like the conventional stack data structures used in transition-based parsing, elements can be pushed to or popped from the top of the stack in constant time, but, in addition, an LSTM maintains a continuous-space embedding of the stack contents. This lets us formulate an efficient parsing model that captures three facets of a parser's state: (i) unbounded look-ahead into the buffer of incoming words, (ii) the complete history of actions taken by the parser, and (iii) the complete contents of the stack of partially built tree fragments, including their internal structures. Standard backpropagation techniques are used for training and yield state-of-the-art parsing performance.
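The core idea can be sketched in a few lines: keep a list of LSTM states, where pushing an element runs one LSTM step from the current top state, and popping simply reverts to the previous state. Both operations are O(1), and the hidden state at the top always summarizes the full stack contents. The sketch below is illustrative only, with a randomly initialized NumPy LSTM cell (names like `LSTMCell` and `StackLSTM` are my own; the paper's trained model is not reproduced here).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Single LSTM cell; weights are random, for illustration only."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix for the input, forget, output, and cell gates.
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
        g = np.tanh(z[3*H:])
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

class StackLSTM:
    """Stack with O(1) push/pop that maintains an LSTM encoding of its
    contents: the hidden state at the top summarizes the whole stack."""
    def __init__(self, cell):
        self.cell = cell
        H = cell.hidden_dim
        # The empty stack is represented by the initial (h, c) pair.
        self.states = [(np.zeros(H), np.zeros(H))]

    def push(self, x):
        h, c = self.states[-1]
        self.states.append(self.cell.step(x, h, c))

    def pop(self):
        # Popping reverts to the previous LSTM state; nothing is recomputed.
        self.states.pop()

    def embedding(self):
        return self.states[-1][0]

# Small demo: pop exactly restores the pre-push encoding.
cell = LSTMCell(input_dim=4, hidden_dim=8)
stack = StackLSTM(cell)
v = np.ones(4)
stack.push(v)
h_after_first_push = stack.embedding().copy()
stack.push(2 * v)
stack.pop()  # back to the state after the first push
```

Because each pushed element's state is computed from the state beneath it, the top hidden state is a function of the entire stack, which is exactly the continuous-space summary the parser reads at each step.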

Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, Noah A. Smith · 2015

Related benchmarks

Task                 Dataset                                        Result                        Rank
Dependency Parsing   Chinese Treebank (CTB) (test)                  UAS 87.2                      99
Dependency Parsing   English PTB Stanford Dependencies (test)       UAS 93.1                      76
Dependency Parsing   WSJ (test)                                     UAS 93.2                      67
Dependency Parsing   CoNLL German 2009 (test)                       UAS 88.56                     25
Dependency Parsing   Penn Treebank (PTB) Section 23 v2.2 (test)     UAS 93.1                      17
POS Tagging          Penn Treebank (PTB) Section 23 v2.2 (test)     POS Accuracy 97.3             15
Dependency Parsing   CoNLL Spanish 2009 (test)                      UAS 90.76                     14
Semantic Parsing     UCCA Wiki in-domain (test)                     Primary Labeled F-score 69.9  14
Dependency Parsing   CoNLL English 2009 (test)                      UAS 91.59                     13
Dependency Parsing   CoNLL Chinese 2009 (test)                      UAS 82.45                     12

Showing 10 of 18 rows.
