
Cutting-off Redundant Repeating Generations for Neural Abstractive Summarization

About

This paper tackles the reduction of redundant repeating generation that is often observed in RNN-based encoder-decoder models. Our basic idea is to jointly estimate the upper-bound frequency of each target vocabulary word in the encoder and to control the output words based on that estimate in the decoder. Our method significantly improves over a strong RNN-based encoder-decoder baseline and achieves the best reported results on an abstractive summarization benchmark.
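The decoder-side control described above can be sketched as frequency-capped decoding: each vocabulary item carries an estimated upper-bound count, and once a word has been emitted that many times its logit is masked out. This is a minimal illustrative sketch, not the authors' implementation; the function name and greedy-decoding setup are hypothetical, and in the paper the upper bounds are estimated jointly by the encoder rather than supplied by hand.

```python
import numpy as np

def decode_with_frequency_cap(step_logits, upper_bounds):
    """Greedy decoding in which vocabulary item w may be emitted at most
    upper_bounds[w] times; once the cap is reached, its logit is masked.

    step_logits:  array of shape (num_steps, vocab_size)
    upper_bounds: array of shape (vocab_size,) with per-word emission caps
    """
    counts = np.zeros_like(upper_bounds)
    output = []
    for logits in step_logits:
        # Mask out any word whose emission count has hit its upper bound.
        masked = np.where(counts >= upper_bounds, -np.inf, logits)
        w = int(np.argmax(masked))
        output.append(w)
        counts[w] += 1
    return output
```

For example, with a three-word vocabulary where the model always prefers word 0 but its cap is 1, the decoder is forced to fall back to the next-best word after the first step, cutting off the repeated generation.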

Jun Suzuki, Masaaki Nagata • 2016

Related benchmarks

Task                 Dataset                        Result                  Rank
Text Summarization   DUC 2004 (test)                ROUGE-1: 32.28          115
Text Summarization   Gigaword (test)                ROUGE-1: 36.3           75
Summarization        DUC 2004 (75-byte limit)       ROUGE-1 Recall: 32.28   6
Summarization        Gigaword (two length limit)    ROUGE-1 F-Score: 36.3   6

Other info

Code
