
Learning-by-Narrating: Narrative Pre-Training for Zero-Shot Dialogue Comprehension

About

Comprehending a dialogue requires a model to capture diverse kinds of key information in the utterances, which are either scattered across or implied in different turns of the conversation. Dialogue comprehension therefore demands diverse capabilities such as paraphrasing, summarizing, and commonsense reasoning. Toward the objective of pre-training a zero-shot dialogue comprehension model, we develop a novel narrative-guided pre-training strategy that learns by narrating the key information from a dialogue input. However, no dialogue-narrative parallel corpus suitable for such a pre-training strategy was previously available. We therefore first construct one by automatically aligning movie subtitles with their synopses. We then pre-train a BART model on this data and evaluate it on four dialogue-based tasks that require comprehension. Experimental results show that our model not only achieves superior zero-shot performance but also exhibits stronger fine-grained dialogue comprehension capabilities. The data and code are available at https://github.com/zhaochaocs/Diana
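
To make the corpus-construction step concrete, the following toy sketch shows one plausible way to align synopsis sentences with windows of subtitle turns, using TF-IDF cosine similarity. This is an illustrative assumption, not the paper's actual alignment procedure; the example data, the window width, and the similarity measure are all stand-ins.

```python
# Toy sketch: align each synopsis sentence to its most similar window of
# consecutive subtitle turns. The paper's real alignment may use different
# similarity measures and constraints; everything here is illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

subtitles = [
    "I can't believe you sold the house.",
    "We needed the money, you know that.",
    "Dad would never have agreed to this.",
    "Dad isn't here anymore.",
]
synopsis = [
    "Two siblings argue over selling the family home after their father's death.",
]

# Candidate windows of consecutive subtitle turns (width 2 here).
width = 2
windows = [" ".join(subtitles[i:i + width]) for i in range(len(subtitles) - width + 1)]

vectorizer = TfidfVectorizer().fit(windows + synopsis)
similarities = cosine_similarity(
    vectorizer.transform(synopsis), vectorizer.transform(windows)
)

# Pair each synopsis sentence with its best-matching dialogue window.
for sentence, row in zip(synopsis, similarities):
    print(sentence, "->", windows[row.argmax()])
```

At evaluation time, the narrative-pretrained model is applied zero-shot like any other sequence-to-sequence generator: feed in the dialogue turns, decode a narrative. The sketch below uses the Hugging Face transformers API with the vanilla facebook/bart-large checkpoint as a stand-in; the actual narrative-pretrained weights should be taken from the repository linked above, since the base model alone will not produce good narratives.

```python
# Minimal sketch of zero-shot dialogue narration with a BART checkpoint.
# "facebook/bart-large" is a placeholder; substitute the narrative-pretrained
# weights released with the paper.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-large"  # placeholder checkpoint
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

dialogue = (
    "A: Did you finish the report?\n"
    "B: Almost. I still need the sales numbers from last quarter.\n"
    "A: I'll send them over after lunch.\n"
    "B: Great, then I can submit it tonight."
)

inputs = tokenizer(dialogue, return_tensors="pt", truncation=True, max_length=1024)
output_ids = model.generate(**inputs, num_beams=4, max_length=60, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```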

Chao Zhao, Wenlin Yao, Dian Yu, Kaiqiang Song, Dong Yu, Jianshu Chen • 2022

Related benchmarks

Task                      Dataset   Metric     Result   Rank
Dialogue Comprehension    DREAM     Accuracy   53.41    15
Dialogue Summarization    SamSum    ROUGE-2    13.23    10
Dialogue Comprehension    PCMD      Accuracy   54.88    9
Dialogue Comprehension    VLEP      Accuracy   58.9     9

Other info

Code: https://github.com/zhaochaocs/Diana
