
Structured Training for Neural Network Transition-Based Parsing

About

We present structured perceptron training for neural network transition-based dependency parsing. We learn the neural network representation using a gold corpus augmented by a large number of automatically parsed sentences. Given this fixed network representation, we learn a final layer using the structured perceptron with beam-search decoding. On the Penn Treebank, our parser reaches 94.26% unlabeled and 92.41% labeled attachment accuracy, which to our knowledge is the best accuracy on Stanford Dependencies to date. We also provide in-depth ablative analysis to determine which aspects of our model provide the largest gains in accuracy.
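The core training loop described above (beam-search decoding over transition sequences, with a structured-perceptron update on the final layer while the underlying representation stays fixed) can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the three-action inventory, the sentences, and `state_feature` (which simply interns each distinct parser state as a one-hot feature, standing in for the frozen network activations) are all illustrative assumptions.

```python
from collections import defaultdict

ACTIONS = ["SHIFT", "LEFT", "RIGHT"]  # toy transition inventory

FEATS = {}
def state_feature(sent, history, step):
    # Frozen "representation" of a parser state: intern each distinct
    # (sentence, action history, step) triple as its own feature index.
    key = (sent, tuple(history), step)
    return FEATS.setdefault(key, len(FEATS))

def beam_search(w, sent, n_steps, beam_size):
    # Keep the beam_size highest-scoring partial action sequences.
    beam = [((), 0.0)]
    for step in range(n_steps):
        cands = []
        for hist, score in beam:
            f = state_feature(sent, hist, step)
            for a in ACTIONS:
                cands.append((hist + (a,), score + w[(a, f)]))
        cands.sort(key=lambda c: -c[1])
        beam = cands[:beam_size]
    return list(beam[0][0])

def seq_feats(sent, actions):
    # (action, state-feature) pairs fired along one action sequence.
    return [(a, state_feature(sent, actions[:i], i))
            for i, a in enumerate(actions)]

def train(data, epochs=10, beam_size=4):
    # Structured perceptron over the final layer only: the state
    # representation above never changes, just the weights w.
    w = defaultdict(float)
    for _ in range(epochs):
        for sent, gold in data:
            pred = beam_search(w, sent, len(gold), beam_size)
            if pred != gold:
                for fa in seq_feats(sent, gold):
                    w[fa] += 1.0  # promote the gold sequence
                for fa in seq_feats(sent, pred):
                    w[fa] -= 1.0  # demote the predicted sequence
    return w

data = [("he sleeps", ["SHIFT", "SHIFT", "LEFT"]),
        ("she runs", ["SHIFT", "RIGHT", "RIGHT"])]
w = train(data)
print(beam_search(w, "he sleeps", 3, 4))
# → ['SHIFT', 'SHIFT', 'LEFT']
```

Because the update only touches the final linear layer, decoding and learning stay cheap even though the state representation comes from a network; the paper additionally uses an early-update variant during beam search, which this sketch omits.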

David Weiss, Chris Alberti, Michael Collins, Slav Petrov • 2015

Related benchmarks

Task               | Dataset                                    | Metric          | Result | Rank
-------------------|--------------------------------------------|-----------------|--------|-----
Dependency Parsing | English PTB Stanford Dependencies (test)   | UAS             | 94.26  | 76
Dependency Parsing | WSJ (test)                                 | UAS             | 94.26  | 67
Dependency Parsing | Penn Treebank (PTB) Section 23 v2.2 (test) | UAS             | 93.99  | 17
POS Tagging        | Penn Treebank (PTB) Section 23 v2.2 (test) | POS Accuracy    | 97.44  | 15
Dependency Parsing | Treebank Union (test)                      | Accuracy (News) | 92.62  | 10
Dependency Parsing | Union-Web (test)                           | UAS             | 89.29  | 8
Dependency Parsing | Union-News (test)                          | UAS             | 93.91  | 8
Dependency Parsing | Union-QTB (test)                           | UAS             | 94.17  | 8
