The Return of Lexical Dependencies: Neural Lexicalized PCFGs

About

In this paper we demonstrate that context-free grammar (CFG) based methods for grammar induction benefit from modeling lexical dependencies. This contrasts with the most popular current methods for grammar induction, which focus on discovering either constituents or dependencies. Previous approaches to marry these two disparate syntactic formalisms (e.g. lexicalized PCFGs) have been plagued by sparsity, making them unsuitable for unsupervised grammar induction. However, in this work, we present novel neural models of lexicalized PCFGs that allow us to overcome sparsity problems and effectively induce both constituents and dependencies within a single model. Experiments demonstrate that this unified framework yields stronger performance on both representations than is achieved when modeling either formalism alone. Code is available at https://github.com/neulab/neural-lpcfg.

Hao Zhu, Yonatan Bisk, Graham Neubig • 2020
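The core idea the abstract describes is replacing sparse count-based estimates of lexicalized rule probabilities with a neural parameterization over dense embeddings, so that statistics are shared across head words. As a rough illustration only (this is a hedged sketch, not the authors' model; the class name and dimensions here are invented, and the actual implementation is in the linked repository), one way to condition binary-rule probabilities on a head word looks like this in PyTorch:

```python
# Sketch only: NOT the paper's model (see https://github.com/neulab/neural-lpcfg).
# Illustrates scoring lexicalized binary rules A[h] -> (B, C) with a neural net
# over dense embeddings, so probabilities are shared across head words instead
# of being estimated from sparse rule counts.
import torch
import torch.nn as nn

class NeuralLexicalizedRuleScorer(nn.Module):
    def __init__(self, num_nonterminals: int, vocab_size: int, dim: int = 64):
        super().__init__()
        self.nt_emb = nn.Embedding(num_nonterminals, dim)  # nonterminal symbols
        self.word_emb = nn.Embedding(vocab_size, dim)      # head words
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(),
            # one logit per (left child, right child) nonterminal pair
            nn.Linear(dim, num_nonterminals * num_nonterminals),
        )

    def rule_log_probs(self, parent: torch.Tensor, head: torch.Tensor) -> torch.Tensor:
        """Log P(B, C | A, head word h) for every child pair (B, C).

        parent: (batch,) nonterminal ids A
        head:   (batch,) head-word ids h
        returns: (batch, NT, NT) log-probabilities over child pairs
        """
        x = torch.cat([self.nt_emb(parent), self.word_emb(head)], dim=-1)
        logits = self.mlp(x)
        nt = self.nt_emb.num_embeddings
        return torch.log_softmax(logits, dim=-1).view(-1, nt, nt)

# Usage: score all child pairs for a parent nonterminal (id 3) headed by word id 17.
scorer = NeuralLexicalizedRuleScorer(num_nonterminals=30, vocab_size=10000)
log_probs = scorer.rule_log_probs(torch.tensor([3]), torch.tensor([17]))
print(log_probs.shape)  # torch.Size([1, 30, 30])
```

Because every head word maps to a shared embedding space, a rule never observed with a particular word still receives a sensible probability, which is what makes lexicalization tractable for unsupervised induction.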

Related benchmarks

Task                              | Dataset                      | Metric    | Result | Rank
Unsupervised Parsing              | PTB (test)                   | --        | --     | 75
Unsupervised Constituency Parsing | WSJ (test)                   | Max F1    | 55.3   | 29
Unsupervised Constituency Parsing | Penn TreeBank English (test) | Mean S-F1 | 55.3   | 16
Unsupervised Parsing              | WSJ (test)                   | F1 Score  | 57.4   | 11
