
Learning-Based Single-Document Summarization with Compression and Anaphoricity Constraints

About

We present a discriminative model for single-document summarization that integrally combines compression and anaphoricity constraints. Our model selects textual units to include in the summary based on a rich set of sparse features whose weights are learned on a large corpus. We allow for the deletion of content within a sentence when that deletion is licensed by compression rules; in our framework, these are implemented as dependencies between subsentential units of text. Anaphoricity constraints then improve cross-sentence coherence by guaranteeing that, for each pronoun included in the summary, the pronoun's antecedent is included as well or the pronoun is rewritten as a full mention. When trained end-to-end, our final system outperforms prior work on both ROUGE and human judgments of linguistic quality.
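The selection problem the abstract describes can be sketched as a constrained optimization: pick subsentential units that maximize a learned score under a length budget, where compression dependencies and pronoun-antecedent links force some units to be selected together. The sketch below is an illustrative toy (the unit names, scores, and brute-force search are assumptions, not the paper's implementation, which solves an ILP over sparse learned features):

```python
# Toy sketch of constrained unit selection for summarization.
# requires[i] lists units that must also be chosen if unit i is chosen;
# this encodes both compression dependencies (a child subsentential unit
# needs its parent) and anaphoricity constraints (a pronoun's unit needs
# the unit containing its antecedent).
from itertools import combinations

def select_units(scores, lengths, budget, requires):
    """Brute-force the highest-scoring unit subset under the budget."""
    n = len(scores)
    best, best_score = (), float("-inf")
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            chosen = set(subset)
            if sum(lengths[i] for i in chosen) > budget:
                continue
            # Every selected unit must bring along the units it depends on.
            if any(dep not in chosen
                   for i in chosen for dep in requires.get(i, [])):
                continue
            score = sum(scores[i] for i in chosen)
            if score > best_score:
                best, best_score = subset, score
    return list(best)

# Hypothetical example: unit 2 contains a pronoun whose antecedent is in
# unit 0; unit 3 is a compressible modifier that depends on unit 1.
scores = [3.0, 2.0, 2.5, 0.5]
lengths = [5, 4, 3, 2]
requires = {2: [0], 3: [1]}
print(select_units(scores, lengths, budget=12, requires=requires))  # [0, 1, 2]
```

Note how unit 2 cannot be selected without unit 0: the brute-force search stands in for the ILP solver, which scales to realistic documents where exhaustive enumeration would not.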

Greg Durrett, Taylor Berg-Kirkpatrick, Dan Klein • 2016

Related benchmarks

Task                       Dataset                       Metric    Result  Rank
Extractive Summarization   NYT50 (test)                  ROUGE-1   42.2    21
Text Summarization         NYT (test)                    R1 Score  42.2    18
Summarization              NYT50 limited length (test)   ROUGE-1   42.2    8
Text Summarization         NYT50 (test)                  ROUGE-1   42.2    5
