
SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization

About

In this paper, we present a conceptually simple yet empirically powerful framework for abstractive summarization, SimCLS, which bridges the gap between the learning objective and the evaluation metrics that results from the currently dominant sequence-to-sequence learning framework, by formulating text generation as a reference-free evaluation problem (i.e., quality estimation) assisted by contrastive learning. Experimental results show that, with minor modifications to existing top-scoring systems, SimCLS improves the performance of existing top-performing models by a large margin: in particular, a 2.51-point absolute improvement over BART and 2.50 over PEGASUS w.r.t. ROUGE-1 on the CNN/DailyMail dataset, driving the state-of-the-art performance to a new level. We have open-sourced our code and results: https://github.com/yixinL7/SimCLS. Results of our proposed models have been deployed to the ExplainaBoard platform, which allows researchers to understand our systems in a more fine-grained way.
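The abstract describes a two-stage pipeline: a sequence-to-sequence model first generates candidate summaries, and a separate scoring model then ranks them against the source document without consulting the reference summary. The sketch below illustrates only the reference-free re-ranking step. In the paper the scorer is a RoBERTa encoder trained with a contrastive ranking loss; here, purely as a stand-in assumption so the sketch is self-contained, candidates are scored by bag-of-words cosine similarity to the document. The function names (`embed`, `rerank`) are illustrative, not from the SimCLS codebase.

```python
import math
from collections import Counter


def embed(text):
    """Toy stand-in for SimCLS's learned RoBERTa encoder:
    a bag-of-words count vector over lowercase tokens."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(cnt * b[tok] for tok, cnt in a.items())
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)


def rerank(document, candidates):
    """Stage 2 of the SimCLS pipeline: score each candidate summary
    against the source document (reference-free quality estimation)
    and return the candidates best-first."""
    doc_vec = embed(document)
    return sorted(candidates, key=lambda c: cosine(embed(c), doc_vec), reverse=True)


doc = "the cat sat on the mat near the door"
cands = ["a cat sat on a mat", "stocks rallied on tuesday"]
best = rerank(doc, cands)[0]  # the candidate most similar to the document
```

In the actual system the candidates would come from beam search over a fine-tuned BART or PEGASUS model, and the learned scorer is what lets the system pick a candidate that beam-search likelihood alone would rank lower.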

Yixin Liu, Pengfei Liu · 2021

Related benchmarks

Task                           | Dataset                        | Metric                          | Result | Rank
Summarization                  | XSum (test)                    | ROUGE-2                         | 24.57  | 231
Abstractive Text Summarization | CNN/Daily Mail (test)          | ROUGE-L                         | 43.54  | 169
Summarization                  | XSum                           | ROUGE-2                         | 24.6   | 108
Summarization                  | CNN/DM                         | ROUGE-1                         | 46.67  | 56
Abstractive Summarization      | XSum (test)                    | ROUGE-L                         | 39.44  | 44
Text Summarization             | CNNDM                          | ROUGE-2                         | 22.15  | 11
Abstractive Summarization      | CNNDM (test)                   | ROUGE-1                         | 46.67  | 11
Summarization                  | CNN/DM human evaluation        | --                              | --     | 4
Summarization                  | XSum 100 sample subset (test)  | Informativeness (Pairwise Lose) | 7      | 2
