
Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond

About

In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora. We propose several novel models that address critical problems in summarization that are not adequately modeled by the basic architecture, such as modeling key-words, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time. Our work shows that many of our proposed models contribute to further improvement in performance. We also propose a new dataset consisting of multi-sentence summaries, and establish performance benchmarks for further research.
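The core of the attentional encoder-decoder described above is an attention step that, at each decoding step, scores every encoder state against the current decoder state and forms a context vector. Below is a minimal NumPy sketch of one additive (Bahdanau-style) attention step; all names, dimensions, and random values are illustrative toy assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 8  # toy source length and hidden size

# Hypothetical encoder hidden states h_1..h_T and a previous decoder
# state s; a real model would produce these with recurrent layers.
enc_states = rng.standard_normal((T, d))
dec_state = rng.standard_normal(d)

# Additive attention parameters (illustrative initialization).
W_h = rng.standard_normal((d, d))
W_s = rng.standard_normal((d, d))
v = rng.standard_normal(d)

# Alignment scores e_i = v^T tanh(W_h h_i + W_s s).
scores = np.tanh(enc_states @ W_h.T + dec_state @ W_s.T) @ v

# Softmax over source positions gives attention weights; the context
# vector is the weighted sum of encoder states.
weights = np.exp(scores - scores.max())
weights /= weights.sum()
context = weights @ enc_states
```

The decoder would then condition its next-word distribution on `context` together with its own state; the paper's extensions (keyword features, hierarchical attention, rare-word handling) build on this same step.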

Ramesh Nallapati, Bowen Zhou, Cicero Nogueira dos Santos, Caglar Gulcehre, Bing Xiang • 2016

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Abstractive Text Summarization | CNN/Daily Mail (test) | ROUGE-L 36.67 | 169 |
| Summarization | arXiv (test) | ROUGE-1 29.3 | 161 |
| Text Summarization | DUC 2004 (test) | ROUGE-1 28.61 | 115 |
| Text Summarization | Gigaword (test) | ROUGE-1 35.3 | 75 |
| Summarization | PubMed | ROUGE-1 31.55 | 70 |
| Abstractive Summarization | Gigaword (test) | ROUGE-1 32.67 | 58 |
| Summarization | CNN/Daily Mail original, non-anonymized (test) | ROUGE-1 35.46 | 54 |
| Abstractive Summarization | CNN/Daily Mail non-anonymous (test) | ROUGE-1 35.46 | 52 |
| Abstractive Summarization | CNN/DailyMail full length F-1 (test) | ROUGE-1 35.46 | 48 |
| Summarization | Gigaword | ROUGE-L 30.64 | 38 |

Showing 10 of 25 rows
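The benchmark results above are reported in ROUGE: ROUGE-1 measures unigram overlap between a candidate and a reference summary, while ROUGE-L is based on their longest common subsequence. A minimal pure-Python sketch of both F1 scores (the example sentences are illustrative only):

```python
from collections import Counter

def rouge_1_f(candidate, reference):
    """Unigram-overlap ROUGE-1 F1 between two token lists."""
    overlap = sum((Counter(candidate) & Counter(reference)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(candidate)
    recall = overlap / len(reference)
    return 2 * precision * recall / (precision + recall)

def lcs_len(a, b):
    """Length of the longest common subsequence (dynamic programming)."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]

def rouge_l_f(candidate, reference):
    """LCS-based ROUGE-L F1 between two token lists."""
    lcs = lcs_len(candidate, reference)
    if lcs == 0:
        return 0.0
    precision = lcs / len(candidate)
    recall = lcs / len(reference)
    return 2 * precision * recall / (precision + recall)

cand = "police killed the gunman".split()
ref = "the gunman was shot down by police".split()
print(round(rouge_1_f(cand, ref), 3))  # 0.545
print(round(rouge_l_f(cand, ref), 3))  # 0.364
```

Published numbers typically use the official ROUGE toolkit, which adds stemming, stopword options, and multi-reference handling; this sketch shows only the underlying overlap computations.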
