
PLATO: Pre-trained Dialogue Generation Model with Discrete Latent Variable

About

Pre-training models have proven effective for a wide range of natural language processing tasks. Inspired by this, we propose a novel dialogue generation pre-training framework to support various kinds of conversations, including chit-chat, knowledge-grounded dialogue, and conversational question answering. In this framework, we adopt flexible attention mechanisms to fully leverage the bi-directional context and the uni-directional characteristic of language generation. We also introduce discrete latent variables to tackle the inherent one-to-many mapping problem in response generation. Two reciprocal tasks, response generation and latent act recognition, are designed and carried out simultaneously within a shared network. Comprehensive experiments on three publicly available datasets verify the effectiveness and superiority of the proposed framework.
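The "flexible attention mechanisms" mentioned above can be illustrated with a UniLM-style attention mask: context tokens attend to each other bi-directionally, while response tokens attend to the full context plus only the preceding response tokens. The sketch below is illustrative, not the paper's implementation; the function name and shapes are assumptions.

```python
import numpy as np

def build_attention_mask(context_len: int, response_len: int) -> np.ndarray:
    """Build an (L, L) mask where entry (i, j) = 1 iff token i may attend to token j.

    Context tokens attend bi-directionally within the context; response
    tokens attend to the whole context and to earlier response tokens only
    (uni-directional), matching the abstract's description in spirit.
    """
    total = context_len + response_len
    mask = np.zeros((total, total), dtype=np.int64)
    # Context block: full bi-directional attention over the context.
    mask[:context_len, :context_len] = 1
    # Response tokens see the entire context...
    mask[context_len:, :context_len] = 1
    # ...and only the current and earlier response tokens (causal).
    mask[context_len:, context_len:] = np.tril(
        np.ones((response_len, response_len), dtype=np.int64)
    )
    return mask
```

With `context_len=3, response_len=2`, the mask lets context positions 0–2 see each other, while response position 3 cannot see position 4 but position 4 can see position 3.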

Siqi Bao, Huang He, Fan Wang, Hua Wu, Haifeng Wang • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Dialogue Generation | DailyDialog | Distinct-1 | 0.054 | 26 |
| Dialogue Response Generation | Persona-Chat | BLEU-1 | 45.8 | 20 |
| Response Generation | DailyDialog (test) | BLEU-2 | 32.2 | 16 |
| Dialog Generation | DSTC7-AVSD (test) | BLEU-1 | 0.784 | 12 |
| Dialog State Tracking | DuClarifyDial | Type Accuracy | 99 | 5 |
| Response Generation | DuClarifyDial | BLEU-1 | 46 | 5 |
| Dialog Act Planning | DuClarifyDial | Act Accuracy | 91 | 5 |
| End-to-End Dialog Generation | DuClarifyDial (test) | BLEU-1 | 0.32 | 5 |
