
Learning Discourse-level Diversity for Neural Dialog Models using Conditional Variational Autoencoders

About

While recent neural encoder-decoder models have shown great promise in modeling open-domain conversations, they often generate dull and generic responses. Unlike past work that alleviates this problem by diversifying the decoder's output at the word level, we present a novel framework based on conditional variational autoencoders that captures discourse-level diversity in the encoder. Our model uses latent variables to learn a distribution over potential conversational intents and generates diverse responses using only greedy decoders. We further develop a novel variant that integrates linguistic prior knowledge for better performance. Finally, the training procedure is improved by introducing a bag-of-word loss. Experiments show that our proposed models generate significantly more diverse responses than baseline approaches and are competent at discourse-level decision-making.
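The training objective the abstract alludes to combines a reconstruction term, a KL divergence between a recognition (posterior) network and a prior network over the latent intent variable, and the auxiliary bag-of-word loss, which forces the latent variable to predict every word of the response order-independently. Below is a minimal pure-Python sketch of those three ingredients under diagonal-Gaussian assumptions; the function names and toy numbers are illustrative, not taken from the paper's code.

```python
import math
import random


def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) for diagonal Gaussians,
    i.e. the divergence between the recognition and prior networks."""
    kl = 0.0
    for mq, lq, mp, lp in zip(mu_q, logvar_q, mu_p, logvar_p):
        kl += 0.5 * (lp - lq + (math.exp(lq) + (mq - mp) ** 2) / math.exp(lp) - 1.0)
    return kl


def reparameterize(mu, logvar, rng):
    """Sample z = mu + sigma * eps with eps ~ N(0, I), so gradients can
    flow through mu and logvar (the reparameterization trick)."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, logvar)]


def bow_loss(word_logprobs, response_word_ids):
    """Bag-of-word auxiliary loss: negative log-likelihood of every word
    in the reference response under a z-conditioned unigram distribution,
    ignoring word order."""
    return -sum(word_logprobs[w] for w in response_word_ids)


if __name__ == "__main__":
    rng = random.Random(0)
    # Toy 2-dim latent: posterior shifted away from a standard-normal prior.
    kl = kl_diag_gaussians([1.0, 0.0], [0.0, 0.0], [0.0, 0.0], [0.0, 0.0])
    z = reparameterize([1.0, 0.0], [0.0, 0.0], rng)
    # Toy vocabulary of 2 words with assumed probabilities 0.5 and 0.25.
    bow = bow_loss({0: math.log(0.5), 1: math.log(0.25)}, [0, 1, 1])
    print(kl, z, bow)
```

In training, the total loss would be `reconstruction + kl + bow`, typically with KL annealing; at test time, sampling different `z` values and greedily decoding each one is what yields the diverse responses described above.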

Tiancheng Zhao, Ran Zhao, Maxine Eskenazi • 2017

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Paraphrase Generation | QQP (test) | BLEU-2 | 21.5 | 22 |
| Dialogue Generation | Douban (test) | BLEU-1 | 0.064 | 20 |
| Language Modeling | Yahoo | Prior LL | -330.5 | 18 |
| Storytelling | ROCStories 8:1:1 (test) | BLEU-1 | 0.2581 | 10 |
| Conversational Question Generation | Reddit CQG (test) | Fluency | 47.4 | 10 |
| Knowledge-grounded Dialog | Wizard-of-Wikipedia (WoW) (test) | BLEU | 16.7 | 9 |
| Personalized Dialogue Generation | ConvAI2 (Human Evaluation) | Readability | 71 | 8 |
| Personalized Dialogue Generation | ConvAI2 | BLEU-1 | 6.89 | 7 |
| Personalized Dialogue Generation | Baidu PersonaChat | BLEU-1 | 10.86 | 7 |
| Dialogue Generation | REDDIT | Relevance | 2.58 | 5 |
(Showing 10 of 12 rows.)
