Unsupervised Dependency Parsing: Let's Use Supervised Parsers
About
We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called `iterated reranking' (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing, which are in turn trained on these trees. Our system achieves accuracy 1.8% higher than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
Phong Le, Willem Zuidema • 2015
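The iterated-reranking loop described above can be sketched in miniature: start from baseline trees, train a richer model on them, use that model to pick better trees, and repeat. The sketch below is a toy illustration, not the paper's actual system: it substitutes an arc-factored head/dependent tag-pair count model for the paper's supervised reranker, a right-branching baseline for the unsupervised initial parser, and a left-only head search for the paper's k-best tree search. All function names are hypothetical.

```python
from collections import Counter

def train_model(treebank):
    """Train the 'richer' model on the current trees.
    Toy stand-in: count (head_tag, dependent_tag) arc frequencies."""
    model = Counter()
    for tags, heads in treebank:
        for dep, head in enumerate(heads):
            head_tag = "ROOT" if head < 0 else tags[head]
            model[(head_tag, tags[dep])] += 1
    return model

def rerank(tags, model):
    """For each word, pick the highest-scoring head among ROOT and the
    words to its left. Left-only heads guarantee acyclicity; the real IR
    system instead reranks k-best full trees."""
    heads = []
    for dep, dep_tag in enumerate(tags):
        candidates = [(-1, "ROOT")] + [(h, tags[h]) for h in range(dep)]
        best = max(candidates, key=lambda c: model[(c[1], dep_tag)])
        heads.append(best[0])
    return heads

def iterated_reranking(sentences, rounds=3):
    """Self-training loop: trees -> model -> better trees -> ..."""
    # Initialise with a right-branching baseline: word i attaches to i-1,
    # the first word to ROOT (-1). A toy stand-in for the unsupervised parser.
    trees = [[i - 1 for i in range(len(tags))] for tags in sentences]
    for _ in range(rounds):
        model = train_model(list(zip(sentences, trees)))
        trees = [rerank(tags, model) for tags in sentences]
    return trees

# Sentences as POS-tag sequences (words omitted for brevity).
sentences = [["DT", "NN", "VBZ"], ["DT", "JJ", "NN", "VBZ"], ["NN", "VBZ"]]
trees = iterated_reranking(sentences)
```

After a few rounds the counts learned from one sentence inform the trees chosen for the others, e.g. the frequent (NN, VBZ) arc makes every VBZ attach to the nearest preceding NN. The real system replaces each toy component with a full supervised parser and its probability model.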
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Dependency Parsing | WSJ (test) | UAS | 65.8 | 67 |
| Dependency Parsing | WSJ, 10 or fewer words (test) | UAS | 73.2 | 25 |
| Unsupervised Dependency Parsing | WSJ section 23, all lengths (test) | Directed Dependency Accuracy (DDA) | 65.8 | 16 |
| Unsupervised Dependency Parsing | WSJ section 23, length <= 10 (test) | DDA | 73.2 | 16 |
| Dependency Parsing | WSJ corpus, all sentences (section 23) | DDA | 66.2 | 9 |
| Dependency Parsing | WSJ corpus, length up to 10 (section 23) | DDA | 73.2 | 9 |