
PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols

About

Probabilistic context-free grammars (PCFGs) with neural parameterization have been shown to be effective in unsupervised phrase-structure grammar induction. However, due to the cubic computational complexity of PCFG representation and parsing, previous approaches cannot scale up to a relatively large number of (nonterminal and preterminal) symbols. In this work, we present a new parameterization form of PCFGs based on tensor decomposition, which has at most quadratic computational complexity in the symbol number and therefore allows us to use a much larger number of symbols. We further use neural parameterization for the new form to improve unsupervised parsing performance. We evaluate our model across ten languages and empirically demonstrate the effectiveness of using more symbols. Our code: https://github.com/sustcsonglin/TN-PCFG
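To make the complexity claim concrete, here is a minimal sketch of the idea behind the decomposed parameterization: the binary-rule tensor is represented in CP (Kruskal) form, so an inside-style contraction over child scores can be done in the rank space without ever materializing the cubic tensor. Sizes, variable names, and the use of NumPy are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Illustrative sizes (assumptions): n symbols, decomposition rank r.
n, r = 8, 4
rng = np.random.default_rng(0)

# CP/Kruskal form of the binary-rule tensor:
#   T[A, B, C] = sum_k U[A, k] * V[B, k] * W[C, k]
U, V, W = rng.random((n, r)), rng.random((n, r)), rng.random((n, r))
T = np.einsum('ak,bk,ck->abc', U, V, W)  # full cubic tensor, built only for comparison

# Inside-style scores of the left and right child spans.
beta_l, beta_r = rng.random(n), rng.random(n)

# Naive contraction with the full tensor: O(n^3) work per split point.
naive = np.einsum('abc,b,c->a', T, beta_l, beta_r)

# Decomposed contraction: project each child onto the rank space first,
# then combine and map back -- O(n * r) work, T is never materialized.
fast = U @ ((V.T @ beta_l) * (W.T @ beta_r))

assert np.allclose(naive, fast)
```

The two contractions agree exactly; the saving comes purely from reordering the sums, which is what lets the rank (and hence the symbol count) grow without paying the cubic cost.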

Songlin Yang, Yanpeng Zhao, Kewei Tu • 2021

Related benchmarks

Task | Dataset | Metric | Result | Rank
Unsupervised Parsing | PTB (test) | - | - | 75
Unsupervised Constituency Parsing | Chinese Treebank (CTB) (test) | Unlabeled Sentence F1 (Mean) | 39.2 | 36
Unsupervised Constituency Parsing | WSJ (test) | Max F1 | 61.4 | 29
Unsupervised Constituency Parsing | Penn TreeBank English (test) | Mean S-F1 | 57.7 | 16
Unsupervised Constituency Parsing | English SPMRL (test) | S-F1 | 57.7 | 15
Unsupervised Parsing | WSJ (test) | F1 Score | 57.7 | 11
Unsupervised Constituency Parsing | SPMRL French (test) | S-F1 | 45 | 11
Unsupervised Constituency Parsing | German SPMRL (test) | S-F1 | 47.1 | 11
Unsupervised Parsing | Chinese Treebank (CTB) | S-F1 | 39.2 | 6
Unsupervised Constituency Parsing | SPMRL (test) | German Score | 47.1 | 6

Showing 10 of 17 rows
