
Incremental Transformer with Deliberation Decoder for Document Grounded Conversations

About

Document Grounded Conversations is a task to generate dialogue responses when chatting about the content of a given document. Document knowledge plays a critical role in this task, yet existing dialogue models do not exploit it effectively enough. In this paper, we propose a novel Transformer-based architecture for multi-turn document grounded conversations. In particular, we devise an Incremental Transformer to encode multi-turn utterances along with knowledge in related documents. Motivated by the human cognitive process, we design a two-pass decoder (Deliberation Decoder) to improve context coherence and knowledge correctness. Our empirical study on a real-world Document Grounded Dataset shows that responses generated by our model significantly outperform competitive baselines on both context coherence and knowledge relevance.
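The two-pass deliberation idea described above can be sketched as plain control flow: a first pass drafts a response from the dialogue context, and a second pass refines that draft while attending to the grounding document. This is a minimal illustrative sketch, not the authors' implementation; the function names (`deliberation_decode`, `toy_first_pass`, `toy_second_pass`) are hypothetical stand-ins for the real neural decoders.

```python
def deliberation_decode(context, knowledge, first_pass, second_pass):
    """Two-pass 'deliberation' decoding: pass 1 drafts a reply from the
    dialogue context; pass 2 refines the draft while also attending to
    the document knowledge."""
    draft = first_pass(context)                     # pass 1: context -> rough draft
    return second_pass(context, knowledge, draft)   # pass 2: refine with knowledge


# Toy stand-in decoders (illustrating the control flow only, not real models).
def toy_first_pass(context):
    return "draft reply to: " + context[-1]

def toy_second_pass(context, knowledge, draft):
    return draft + " [grounded in: " + knowledge + "]"


reply = deliberation_decode(
    ["Hi!", "Have you seen Inception?"],
    "Inception is a 2010 film by Christopher Nolan.",
    toy_first_pass,
    toy_second_pass,
)
print(reply)
```

In the paper, both passes are Transformer decoders; the point of the sketch is only that the second pass conditions on the first pass's output in addition to the context and document.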

Zekang Li, Cheng Niu, Fandong Meng, Yang Feng, Qian Li, Jie Zhou • 2019

Related benchmarks

Task | Dataset | Result | Rank
Dialogue Generation | Wizard of Wikipedia (WoW) Seen (test) | BLEU-1: 15.8 | 13
Dialogue Generation | CMU-DoG (test) | BLEU-1: 9.5 | 13
Knowledge-Grounded Dialogue Generation | Wizard of Wikipedia Unseen (test) | BLEU-1: 13.4 | 11
