
Neural Probabilistic Model for Non-projective MST Parsing

About

In this paper, we propose a probabilistic parsing model that defines a proper conditional probability distribution over non-projective dependency trees for a given sentence, using neural representations as inputs. The neural network architecture is based on bi-directional LSTM-CNNs, which automatically benefit from both word- and character-level representations by combining a bidirectional LSTM with a CNN. On top of the neural network, we introduce a probabilistic structured layer that defines a conditional log-linear model over non-projective trees. By exploiting Kirchhoff's Matrix-Tree Theorem (Tutte, 1984), the partition function and marginals can be computed efficiently, leading to a straightforward end-to-end model training procedure via back-propagation. We evaluate our model on 17 datasets across 14 languages, and our parser achieves state-of-the-art parsing performance on nine of them.
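The Matrix-Tree computation the abstract refers to can be sketched concretely. The idea is that the sum of exponentiated scores over all non-projective dependency trees (the partition function Z) equals a determinant of a graph Laplacian built from the arc scores. The snippet below is a minimal NumPy illustration of that classical result, not the authors' implementation; the function name and the score-matrix layout (row 0 as the artificial ROOT) are assumptions for the example.

```python
import numpy as np

def partition_function(scores):
    """Sum of exp(tree score) over all non-projective dependency trees.

    scores: (n+1, n+1) array; scores[h, m] is the score of arc head h -> modifier m.
    Index 0 is the artificial ROOT; the diagonal and column 0 are ignored.

    By the directed Matrix-Tree Theorem, Z is the determinant of the weighted
    Laplacian of the arc graph with the ROOT row and column removed.
    """
    W = np.exp(scores)
    np.fill_diagonal(W, 0.0)  # no self-arcs
    W[:, 0] = 0.0             # no arc may point at ROOT
    # Laplacian: column sums (total incoming weight) on the diagonal, -W off it.
    L = np.diag(W.sum(axis=0)) - W
    return np.linalg.det(L[1:, 1:])

# Sanity check: with all scores 0, every arc weight is 1, so Z counts the
# spanning arborescences rooted at ROOT. For ROOT plus 3 words that count
# is 4^2 = 16 (Cayley's formula).
print(partition_function(np.zeros((4, 4))))  # -> 16.0
```

Because Z is a determinant, log Z is differentiable in the arc scores, and its gradient gives exactly the arc marginals, which is what makes end-to-end training by back-propagation straightforward. In practice one would use `np.linalg.slogdet` for numerical stability on longer sentences.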

Xuezhe Ma, Eduard Hovy • 2017

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Dependency Parsing | Chinese Treebank (CTB) (test) | UAS | 89.05 | 99 |
| Dependency Parsing | Penn Treebank (PTB) (test) | LAS | 92.98 | 80 |
| Dependency Parsing | English PTB Stanford Dependencies (test) | UAS | 94.88 | 76 |
| Dependency Parsing | CoNLL German 2009 (test) | UAS | 93.62 | 25 |
| Dependency Parsing | Dep. Parse (test) | UAS | 94.9 | 23 |
| Dependency Parsing | Penn Treebank (PTB) Section 23 v2.2 (test) | UAS | 94.88 | 17 |
| POS Tagging | Penn Treebank (PTB) Section 23 v2.2 (test) | POS Accuracy | 97.3 | 15 |
| Dependency Parsing | CoNLL Spanish 2009 (test) | UAS | 89.2 | 14 |
| Dependency Parsing | CoNLL English 2009 (test) | UAS | 94.66 | 13 |
| Dependency Parsing | CoNLL Chinese 2009 (test) | UAS | 93.4 | 12 |

Showing 10 of 20 rows.
