
Cross-Lingual Abstractive Summarization with Limited Parallel Resources

About

Parallel cross-lingual summarization data is scarce, requiring models to make better use of the limited available cross-lingual resources. Existing methods often adopt sequence-to-sequence networks with multi-task frameworks. Such approaches apply multiple decoders, each of which is utilized for a specific task. However, these independent decoders share no parameters and therefore fail to capture the relationships between the discrete phrases of summaries in different languages, breaking the connections needed to transfer knowledge from high-resource to low-resource languages. To bridge these connections, we propose a novel Multi-Task framework for Cross-Lingual Abstractive Summarization (MCLAS) in a low-resource setting. Employing one unified decoder to generate the sequential concatenation of monolingual and cross-lingual summaries, MCLAS makes the monolingual summarization task a prerequisite of the cross-lingual summarization (CLS) task. In this way, the shared decoder learns interactions involving alignments and summary patterns across languages, which encourages knowledge transfer. Experiments on two CLS datasets demonstrate that our model significantly outperforms three baseline models in both low-resource and full-dataset scenarios. Moreover, in-depth analysis of the generated summaries and attention heads verifies that MCLAS learns these interactions well, which benefits the CLS task under limited parallel resources.
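The core idea above is that a single shared decoder generates the monolingual summary and the cross-lingual summary as one concatenated sequence. A minimal sketch of that target construction, where the special token names (`[BOS]`, `[LSEP]`, `[EOS]`) are illustrative assumptions rather than the paper's exact vocabulary:

```python
# Sketch of building a unified decoder target in the MCLAS style:
# the monolingual summary comes first, so generating it becomes a
# prerequisite for generating the cross-lingual summary that follows.
def build_unified_target(mono_tokens, cross_tokens,
                         bos="[BOS]", sep="[LSEP]", eos="[EOS]"):
    """Concatenate a monolingual summary and its cross-lingual
    counterpart into one target sequence for a single shared decoder."""
    return [bos] + list(mono_tokens) + [sep] + list(cross_tokens) + [eos]

target = build_unified_target(["the", "cat", "sat"],
                              ["le", "chat", "assis"])
print(target)
# → ['[BOS]', 'the', 'cat', 'sat', '[LSEP]', 'le', 'chat', 'assis', '[EOS]']
```

Because both summaries pass through one decoder, its parameters (and attention heads) are exposed to cross-language alignments within a single sequence, which is what enables the knowledge transfer the paper describes.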

Yu Bai, Yang Gao, Heyan Huang • 2021

Related benchmarks

| Task                        | Dataset                          | Metric          | Result | Rank |
|-----------------------------|----------------------------------|-----------------|--------|------|
| Cross-lingual Summarization | En2ZhSum (test)                  | ROUGE-1         | 42.27  | 31   |
| Cross-lingual Summarization | Zh2EnSum (test)                  | ROUGE-1         | 35.65  | 27   |
| Cross-lingual Summarization | En2DeSum (test)                  | ROUGE-1         | 36.48  | 13   |
| Cross-lingual Summarization | Zh2EnSum Maximum Scenario (test) | Informativeness | 0.057  | 4    |
| Cross-lingual Summarization | Zh2EnSum Minimum Scenario (test) | Informativeness | -0.264 | 4    |
| Cross-lingual Summarization | Zh2EnSum Medium Scenario (test)  | Informativeness | 0.000  | 4    |
| Cross-lingual Summarization | QMSumX En2Ukr (test)             | ROUGE-1         | 17.93  | 4    |
| Cross-lingual Summarization | DialogSumX En2Fr (test)          | ROUGE-1         | 39.51  | 4    |
| Cross-lingual Summarization | DialogSumX En2Ukr (test)         | ROUGE-1         | 30.24  | 4    |
| Cross-lingual Summarization | QMSumX En2Zh (test)              | ROUGE-1         | 30.09  | 4    |

Showing 10 of 11 rows.
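Most rows above report ROUGE-1, i.e. unigram overlap between generated and reference summaries. A minimal sketch of an unstemmed ROUGE-1 F1 (the leaderboard presumably uses the standard ROUGE toolkit, whose tokenization and preprocessing differ):

```python
# Sketch: ROUGE-1 F1 as unigram-overlap precision/recall,
# without the stemming or tokenization of the official toolkit.
from collections import Counter

def rouge1_f1(candidate_tokens, reference_tokens):
    """Unigram-overlap ROUGE-1 F1 between two token lists."""
    cand = Counter(candidate_tokens)
    ref = Counter(reference_tokens)
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1("the cat sat on the mat".split(),
                  "the cat was on the mat".split())
print(round(score, 4))
# → 0.8333  (5 of 6 unigrams match in both directions)
```

Informativeness, used in the Zh2EnSum scenario rows, is a human-judged relative score, which is why those values can be negative.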
