
Dependency Grammar Induction with a Neural Variational Transition-based Parser

About

Dependency grammar induction is the task of learning dependency syntax without annotated training data. Traditional graph-based models with global inference achieve state-of-the-art results on this task, but require $O(n^3)$ run time. Transition-based models enable faster inference with $O(n)$ time complexity, but their performance still lags behind. In this work, we propose a neural transition-based parser for dependency grammar induction, whose inference procedure utilizes rich neural features with $O(n)$ time complexity. We train the parser with an integration of variational inference, posterior regularization, and variance reduction techniques. The resulting framework outperforms previous unsupervised transition-based dependency parsers and achieves performance comparable to graph-based models, both on the English Penn Treebank and on the Universal Dependency Treebank. In an empirical comparison, we show that our approach substantially increases parsing speed over graph-based models.
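The $O(n)$ claim follows from how transition-based parsing works: each word is shifted onto a stack exactly once and reduced exactly once, so a sentence of $n$ words is parsed in $2n - 1$ transitions. The sketch below illustrates this with a plain arc-standard transition system and a trivial right-branching policy; it is an illustrative toy, not the paper's neural model, and the function names (`parse`, `right_branching`) are invented for this example.

```python
# Minimal sketch of arc-standard transition-based parsing (illustrative
# only; not the paper's neural variational parser). Each word is shifted
# once and reduced once, so parsing takes O(n) transitions.

def parse(words, choose_action):
    """choose_action(stack, buffer) -> 'shift' | 'left' | 'right'."""
    buffer = list(range(len(words)))   # word indices, left to right
    stack, arcs = [], []               # arcs collected as (head, dependent)
    while buffer or len(stack) > 1:
        action = choose_action(stack, buffer)
        if action == "shift" and buffer:
            stack.append(buffer.pop(0))
        elif action == "left" and len(stack) >= 2:
            dep = stack.pop(-2)        # second-from-top takes top as head
            arcs.append((stack[-1], dep))
        elif action == "right" and len(stack) >= 2:
            dep = stack.pop()          # top takes second-from-top as head
            arcs.append((stack[-1], dep))
        else:
            raise ValueError("invalid action for current configuration")
    return arcs

# Toy policy: shift everything, then attach rightward. This yields the
# right-branching tree, a common unsupervised-parsing baseline.
def right_branching(stack, buffer):
    return "shift" if buffer else "right"

print(parse("the cat sat".split(), right_branching))  # → [(1, 2), (0, 1)]
```

In the paper's model, the hand-written policy is replaced by a neural network scoring transitions from rich features of the stack and buffer; the $O(n)$ transition count is unchanged.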

Bowen Li, Jianpeng Cheng, Yang Liu, Frank Keller • 2018

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Dependency Parsing | WSJ (test) | UAS 37.8 | 67 |
| Grammar Induction | PTB English (test) | F1 Score 42 | 29 |
| Dependency Parsing | WSJ, 10 or fewer words (test) | UAS 54.7 | 25 |
| Dependency Parsing | Universal Dependencies Low-Res treebanks (test) | LAS (Basque) 52.9 | 8 |
| Dependency Parsing | Universal Dependency Treebanks, length <= 40 (test) | Basque 48.9 | 8 |
