Neural Bi-Lexicalized PCFG Induction
About
Neural lexicalized PCFGs (L-PCFGs) have been shown to be effective in grammar induction. However, to reduce computational complexity, they make a strong independence assumption on the generation of the child word and thus ignore bilexical dependencies. In this paper, we propose an approach to parameterize L-PCFGs without making implausible independence assumptions. Our approach directly models bilexical dependencies while reducing both the learning and the representation complexity of L-PCFGs. Experimental results on the English WSJ dataset confirm the effectiveness of our approach in improving both running speed and unsupervised parsing performance.
Songlin Yang, Yanpeng Zhao, Kewei Tu • 2021
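The abstract contrasts ignoring bilexical dependencies with modeling them at reduced complexity. The sketch below illustrates the general idea of scoring a bilexicalized rule A[h] → B[h] C[h'] through shared low-rank embeddings rather than a dense rule table; it is a toy illustration of low-rank tensor parameterization, not the paper's actual model, and all names, dimensions, and the softmax normalization are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

V, NT, R = 50, 4, 8  # toy sizes: vocabulary, nonterminals, rank

# Instead of a dense table over (A,h) x (B,h) x (C,h') -- which grows
# cubically in |NT|*|V| -- each (symbol, head word) pair gets a rank-R
# embedding, and a rule score is an inner product in the shared space.
parent_emb = rng.normal(size=(NT, V, R))  # parent nonterminal A with head h
left_emb   = rng.normal(size=(NT, V, R))  # left child B (inherits head h)
right_emb  = rng.normal(size=(NT, V, R))  # right child C with its head h'

def rule_score(A, h, B, C, hp):
    """Score of A[h] -> B[h] C[hp] via the rank-R factorization.

    Storage is O(|NT|*|V|*R) instead of O((|NT|*|V|)^3) for a full
    bilexical rule table.
    """
    return float(np.sum(parent_emb[A, h] * left_emb[B, h] * right_emb[C, hp]))

def rule_dist(A, h):
    """Conditional distribution over all (B, C, h') given parent A[h]."""
    # scores[b, c, w] = <parent_emb[A, h], left_emb[b, h] * right_emb[c, w]>
    scores = np.einsum('r,br,cwr->bcw',
                       parent_emb[A, h], left_emb[:, h], right_emb)
    e = np.exp(scores - scores.max())  # softmax for a proper distribution
    return e / e.sum()
```

Because every rule score is recovered from per-symbol embeddings, the bilexical interaction between the parent's head and the dependent child's head is modeled directly, while parameter count stays linear in the vocabulary size.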
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Unsupervised Parsing | PTB (test) | F1 Score | 60.4 | 75 |
| Unsupervised Constituency Parsing | Penn TreeBank English (test) | Mean S-F1 | 60.4 | 16 |
| Unsupervised Constituency Parsing | English SPMRL (test) | S-F1 | 60.4 | 15 |
| Unsupervised Parsing | WSJ (test) | F1 Score | 60.4 | 11 |