Multi-Document Summarization with Centroid-Based Pretraining
About
In Multi-Document Summarization (MDS), the input is modeled as a set of documents, and the output is their summary. In this paper, we focus on pretraining objectives for MDS. Specifically, we introduce a novel pretraining objective that selects the ROUGE-based centroid of each document cluster as a proxy for its summary. Our objective therefore does not require human-written summaries and can be used for pretraining on a dataset consisting solely of document sets. Through zero-shot, few-shot, and fully supervised experiments on multiple MDS datasets, we show that our model, Centrum, is better than or comparable to a state-of-the-art model. We make the pretrained and fine-tuned models freely available to the research community at https://github.com/ratishsp/centrum.
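The centroid selection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a simple unigram-overlap F1 as a stand-in for ROUGE-1, and picks the document with the highest average score against the rest of its cluster as the proxy summary.

```python
from collections import Counter

def rouge1_f(candidate: str, reference: str) -> float:
    """Unigram-overlap F1, a simple stand-in for ROUGE-1 F."""
    c, r = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((c & r).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(c.values())
    recall = overlap / sum(r.values())
    return 2 * precision * recall / (precision + recall)

def centroid(cluster: list[str]) -> str:
    """Return the document with the highest average similarity to the
    other documents in the cluster -- the proxy summary for pretraining."""
    best_doc, best_score = cluster[0], -1.0
    for i, doc in enumerate(cluster):
        others = [d for j, d in enumerate(cluster) if j != i]
        score = sum(rouge1_f(doc, o) for o in others) / len(others)
        if score > best_score:
            best_doc, best_score = doc, score
    return best_doc

# Hypothetical toy cluster for illustration.
cluster = [
    "the storm hit the coast on friday",
    "a powerful storm hit the coast friday morning",
    "local schools were closed after the storm",
]
print(centroid(cluster))  # the most "central" document of the cluster
```

In the actual pretraining setup, the selected centroid serves as the target summary while the remaining documents in the cluster form the model input.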
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Multi-document summarization | Multi-News (test) | ROUGE-2 | 20.4 | 45 |
| Multi-document summarization | Multi-News 256 (test) | ROUGE-1 | 45.7 | 12 |
| Multi-document summarization | WCEP 50 (test) | ROUGE-1 | 42 | 12 |
| Multi-document summarization | DUC 2007 250 (test) | ROUGE-1 | 35.3 | 6 |
| Multi-document summarization | DUC 2007 (human evaluation) | Informativeness | 65.5 | 3 |