
Selective Encoding for Abstractive Sentence Summarization

About

We propose a selective encoding model that extends the sequence-to-sequence framework for abstractive sentence summarization. It consists of a sentence encoder, a selective gate network, and an attention-equipped decoder. The sentence encoder and decoder are built with recurrent neural networks. The selective gate network constructs a second-level sentence representation by controlling the information flow from encoder to decoder. This second-level representation is tailored to the sentence summarization task, which leads to better performance. We evaluate our model on the English Gigaword, DUC 2004, and MSR abstractive sentence summarization datasets. The experimental results show that the proposed selective encoding model outperforms the state-of-the-art baseline models.
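The selective gate described above can be sketched as follows. The gate computation sigmoid(W_h h_i + W_s s + b) and the element-wise modulation follow the paper's high-level description, but the variable names, dimensions, and use of NumPy are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def selective_gate(h, s, W_h, W_s, b):
    """Second-level sentence representation via a selective gate.

    h        : (T, d) first-level encoder hidden states, one row per word
    s        : (d,)   sentence vector (e.g. the final encoder state)
    W_h, W_s : (d, d) gate parameters; b : (d,) gate bias
    Returns (T, d) gated states h_i' = h_i * gate_i, where each
    gate_i = sigmoid(W_h h_i + W_s s + b) lies in (0, 1) and controls
    how much of h_i flows on to the attention-equipped decoder.
    """
    gate = sigmoid(h @ W_h.T + W_s @ s + b)  # (T, d); sentence term broadcasts over rows
    return h * gate

# Toy usage: 4 words, hidden size 3
rng = np.random.default_rng(0)
T, d = 4, 3
h = rng.standard_normal((T, d))
s = h[-1]  # use the last encoder state as the sentence vector
W_h, W_s = rng.standard_normal((d, d)), rng.standard_normal((d, d))
b = np.zeros(d)
h2 = selective_gate(h, s, W_h, W_s, b)
assert h2.shape == (T, d)
assert np.all(np.abs(h2) <= np.abs(h))  # a (0, 1) gate can only attenuate
```

Because every gate value falls strictly between 0 and 1, the gated representation can only attenuate each hidden dimension, which is how the network filters out input words that should not surface in the summary.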

Qingyu Zhou, Nan Yang, Furu Wei, Ming Zhou • 2017

Related benchmarks

Task                                | Dataset                          | Result             | Rank
------------------------------------|----------------------------------|--------------------|-----
Text Summarization                  | DUC 2004 (test)                  | ROUGE-1: 29.21     | 115
Text Summarization                  | Gigaword (test)                  | ROUGE-1: 46.86     | 75
Abstractive Summarization           | Gigaword (test)                  | ROUGE-1: 36.15     | 58
Abstractive Summarization           | Gigawords (test)                 | ROUGE-1: 36.2      | 27
Abstractive Text Summarization      | Gigaword                         | ROUGE-1: 36.15     | 14
Abstractive Summarization           | Gigaword full-length F1 (test)   | ROUGE-1 F1: 36.15  | 12
Abstractive Sentence Summarization  | MSR-ATC (test)                   | ROUGE-1: 25.75     | 5
Sentence Summarization              | Internal English Gigaword (test) | ROUGE-1: 46.86     | 5
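For reference, the ROUGE-1 figures in the table measure unigram overlap between a system summary and a reference summary. A minimal single-reference F1 variant can be sketched as below; this is a simplified illustration, not the official ROUGE toolkit, which additionally handles stemming, stopword removal options, and multiple references:

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """Simplified ROUGE-1 F1: clipped unigram overlap between a
    candidate summary and a single reference, both pre-tokenized."""
    cand, ref = Counter(candidate), Counter(reference)
    overlap = sum((cand & ref).values())  # multiset intersection clips repeats
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Toy example: all 3 candidate unigrams match, 3 of 4 reference unigrams covered,
# so precision = 1.0, recall = 0.75, F1 = 2 * 0.75 / 1.75 ≈ 0.8571
score = rouge1_f1("the cat sat".split(), "the cat sat down".split())
assert round(score, 4) == 0.8571
```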

Other info

Code
