
A Neural Attention Model for Abstractive Sentence Summarization

About

Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
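The abstract describes a local attention-based model that scores each candidate summary word against an attention-weighted encoding of the input sentence. As a rough illustration only (not the authors' architecture: all dimensions, parameter matrices, and the mean-pooled context query below are hypothetical), the core attend-then-score step can be sketched in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, for illustration only).
V, D = 12, 8          # vocabulary size, embedding size
x_len, C = 6, 3       # input sentence length, summary context window

E = rng.normal(size=(V, D))   # word embeddings
W = rng.normal(size=(D, D))   # maps the summary context to an attention query
U = rng.normal(size=(V, D))   # output projection: scores each vocabulary word

x_ids = rng.integers(V, size=x_len)   # input sentence word ids
y_ctx = rng.integers(V, size=C)       # last C words of the summary so far

X = E[x_ids]                      # (x_len, D) input embeddings
q = E[y_ctx].mean(axis=0) @ W     # query vector from the summary context

# Attention over the input sentence: softmax of query-key scores.
scores = X @ q                    # (x_len,)
p = np.exp(scores - scores.max())
p /= p.sum()                      # attention distribution over input words

ctx = p @ X                       # attention-weighted input encoding
logits = U @ ctx                  # (V,) score for each candidate next word
probs = np.exp(logits - logits.max())
probs /= probs.sum()              # distribution over the next summary word

next_word = int(np.argmax(probs))
```

Repeating this step word by word, conditioned on the input sentence and the growing summary prefix, is what makes the model trainable end-to-end on large corpora.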

Alexander M. Rush, Sumit Chopra, Jason Weston • 2015

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Text Summarization | DUC 2004 (test) | ROUGE-1 | 28.18 | 115 |
| Text Summarization | Gigaword (test) | ROUGE-1 | 37.41 | 75 |
| Abstractive Summarization | Gigaword (test) | ROUGE-1 | 29.78 | 58 |
| Summarization | Gigaword | ROUGE-L | 26.96 | 38 |
| Abstractive Summarization | Gigaword (test) | ROUGE-1 | 32.7 | 27 |
| Summarization | Summarization dataset | ROUGE-L F1 | 51.5 | 16 |
| Abstractive Text Summarization | Gigaword | ROUGE-1 | 29.76 | 14 |
| Code-to-NL generation | CodeNN C# (test) | BLEU | 19.31 | 13 |
| Abstractive Summarization | Gigaword full-length F1 (test) | ROUGE-1 F1 | 29.78 | 12 |
| Summarization | DUC 2004 (75-byte limit) | ROUGE-1 Recall | 26.55 | 6 |

Showing 10 of 14 rows.
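Most of the results above are reported as ROUGE-1, a unigram-overlap metric between a candidate summary and a reference. A minimal sketch of ROUGE-1 recall, precision, and F1 (whitespace tokenization only; real evaluations typically add stemming and stopword handling):

```python
from collections import Counter

def rouge1(candidate: str, reference: str) -> tuple[float, float, float]:
    """Unigram-overlap ROUGE-1: returns (recall, precision, f1)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped overlap: each unigram counts at most as often as in the reference.
    overlap = sum((cand & ref).values())
    recall = overlap / max(sum(ref.values()), 1)
    precision = overlap / max(sum(cand.values()), 1)
    f1 = 0.0 if recall + precision == 0 else 2 * recall * precision / (recall + precision)
    return recall, precision, f1

r, p, f = rouge1("the cat sat on the mat", "the cat lay on the mat")
```

ROUGE-L, also reported above, instead scores the longest common subsequence between candidate and reference.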
