Dependency Grammar Induction with a Neural Variational Transition-based Parser
About
Dependency grammar induction is the task of learning dependency syntax without annotated training data. Traditional graph-based models with global inference achieve state-of-the-art results on this task, but they require $O(n^3)$ run time. Transition-based models enable faster inference with $O(n)$ time complexity, but their performance still lags behind. In this work, we propose a neural transition-based parser for dependency grammar induction whose inference procedure exploits rich neural features in $O(n)$ time. We train the parser by integrating variational inference, posterior regularization, and variance reduction techniques. The resulting framework outperforms previous unsupervised transition-based dependency parsers and achieves performance comparable to graph-based models, both on the English Penn Treebank and on the Universal Dependency Treebank. In an empirical comparison, we show that our approach substantially increases parsing speed over graph-based models.
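To make the $O(n)$ claim concrete, the sketch below shows greedy arc-standard transition parsing: a sentence of $n$ words is parsed in exactly $2n - 1$ shift/arc transitions, each constant-time. This is only an illustration of the transition system; the paper's neural features, variational training, and posterior regularization are not reproduced, and `score_action` is a hypothetical stand-in for the learned model.

```python
# Greedy arc-standard parsing sketch. `score_action` is a placeholder for a
# trained scorer; here it just needs to rank the legal actions at each step.

def parse(n, score_action):
    """Parse a sentence of n words; returns heads[i] = head index of word i
    (-1 for the root). Uses exactly 2n - 1 transitions, each O(1)."""
    stack = []              # indices of partially built subtree roots
    heads = [-1] * n
    i = 0                   # front of the buffer
    while i < n or len(stack) > 1:
        actions = []
        if i < n:
            actions.append("SHIFT")
        if len(stack) >= 2:
            actions += ["LEFT-ARC", "RIGHT-ARC"]
        act = max(actions, key=lambda a: score_action(stack, i, a))
        if act == "SHIFT":
            stack.append(i)
            i += 1
        elif act == "LEFT-ARC":      # attach second-top word under the top
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        else:                        # RIGHT-ARC: attach top under second-top
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads


def toy_scorer(stack, buffer_front, action):
    # Toy scorer for illustration: shift while possible, then attach rightward.
    return {"SHIFT": 2, "RIGHT-ARC": 1, "LEFT-ARC": 0}[action]


parse(3, toy_scorer)  # → [-1, 0, 1], a head-initial chain
```

A graph-based parser would instead score all $O(n^2)$ candidate arcs and run global inference (e.g. Eisner's algorithm) in $O(n^3)$; the transition-based formulation trades that global search for a linear sequence of local decisions.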
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Dependency Parsing | WSJ (test) | UAS | 37.8 | 67 |
| Grammar Induction | PTB English (test) | F1 Score | 42 | 29 |
| Dependency Parsing | WSJ 10 or fewer words (test) | UAS | 54.7 | 25 |
| Dependency Parsing | Universal Dependencies Low-Res treebanks (test) | LAS Score (Basque) | 52.9 | 8 |
| Dependency Parsing | Universal Dependency Treebanks Length <= 40 (test) | Basque | 48.9 | 8 |