
Multi-Document Summarization with Centroid-Based Pretraining

About

In Multi-Document Summarization (MDS), the input can be modeled as a set of documents, and the output is a summary of that set. In this paper, we focus on pretraining objectives for MDS. Specifically, we introduce a novel pretraining objective that selects the ROUGE-based centroid of each document cluster as a proxy for its summary. Our objective thus requires no human-written summaries and can be used for pretraining on a dataset consisting solely of document clusters. Through zero-shot, few-shot, and fully supervised experiments on multiple MDS datasets, we show that our model, Centrum, performs better than or comparably to a state-of-the-art model. We make the pretrained and fine-tuned models freely available to the research community at https://github.com/ratishsp/centrum.
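The centroid-selection idea can be sketched in a few lines: score each document in a cluster against all the others with ROUGE, and take the highest-scoring one as the pseudo-summary. The sketch below is illustrative only; it uses a minimal unigram ROUGE-1 F1 stand-in, whereas the paper's exact ROUGE variant and aggregation may differ.

```python
from collections import Counter


def rouge1_f1(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between two texts (a simplified ROUGE-1 stand-in)."""
    c = Counter(candidate.lower().split())
    r = Counter(reference.lower().split())
    overlap = sum((c & r).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(c.values())
    recall = overlap / sum(r.values())
    return 2 * precision * recall / (precision + recall)


def select_centroid(cluster: list[str]) -> str:
    """Return the document with the highest mean ROUGE score
    against the other documents in the cluster (the 'centroid')."""
    def mean_score(i: int) -> float:
        others = [d for j, d in enumerate(cluster) if j != i]
        return sum(rouge1_f1(cluster[i], d) for d in others) / len(others)
    return cluster[max(range(len(cluster)), key=mean_score)]


# Toy cluster: two overlapping storm reports and one unrelated article.
docs = [
    "the storm hit the coast on friday causing floods",
    "a storm hit the coast friday and caused severe floods",
    "local bakery wins award for best bread",
]
centroid = select_centroid(docs)  # one of the two storm reports
```

Because the centroid is chosen purely by lexical overlap within the cluster, this objective needs only raw document sets, which is what makes it usable for large-scale pretraining without reference summaries.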

Ratish Puduppully, Parag Jain, Nancy F. Chen, Mark Steedman • 2022

Related benchmarks

| Task                         | Dataset                      | Metric          | Result | Rank |
|------------------------------|------------------------------|-----------------|--------|------|
| Multi-document summarization | Multi-News (test)            | ROUGE-2         | 20.4   | 45   |
| Multi-document summarization | Multi-News 256 (test)        | ROUGE-1         | 45.7   | 12   |
| Multi-document summarization | WCEP 50 (test)               | ROUGE-1         | 42.0   | 12   |
| Multi-document summarization | DUC 2007 250 (test)          | ROUGE-1         | 35.3   | 6    |
| Multi-document summarization | DUC 2007 (human evaluation)  | Informativeness | 65.5   | 3    |
