
Dependency Grammar Induction with Neural Lexicalization and Big Training Data

About

We study the impact of big models (in terms of the degree of lexicalization) and big data (in terms of the training corpus size) on dependency grammar induction. We experiment with L-DMV, a lexicalized version of the Dependency Model with Valence, and L-NDMV, our lexicalized extension of the Neural Dependency Model with Valence. We find that L-DMV benefits only from very small degrees of lexicalization and moderate training corpus sizes. L-NDMV can benefit from big training data and greater degrees of lexicalization, especially when enhanced with good model initialization, and it achieves results competitive with the current state of the art.

Wenjuan Han, Yong Jiang, Kewei Tu · 2017

Related benchmarks

Task | Dataset | Metric | Result | Rank
Dependency Parsing | WSJ (test) | UAS | 63.2 | 67
Dependency Parsing | WSJ, 10 or fewer words (test) | UAS | 77.2 | 25
Unsupervised Dependency Parsing | WSJ section 23, length <= 10 (test) | DDA | 77.2 | 16
Unsupervised Dependency Parsing | WSJ section 23, all lengths (test) | DDA (Directed Dependency Accuracy) | 63.2 | 16
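The table reports UAS and DDA. In unsupervised dependency parsing, where arcs are unlabeled, both reduce to the fraction of tokens whose predicted head matches the gold head. A minimal sketch of that computation (the helper below is illustrative, not from the paper):

```python
def directed_dependency_accuracy(pred_heads, gold_heads):
    """Fraction of tokens attached to the correct head.

    Each list maps token position -> index of its head (0 = artificial root).
    """
    assert len(pred_heads) == len(gold_heads) and gold_heads
    correct = sum(p == g for p, g in zip(pred_heads, gold_heads))
    return correct / len(gold_heads)


# Example: two of three tokens get the right head.
score = directed_dependency_accuracy([0, 1, 1], [0, 1, 2])
```

In practice the score is aggregated over all tokens in the test section (e.g. WSJ section 23), not averaged per sentence.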
