Incremental Transformer with Deliberation Decoder for Document Grounded Conversations
About
Document Grounded Conversations is the task of generating dialogue responses when chatting about the content of a given document. Document knowledge plays a critical role in this task, yet existing dialogue models do not exploit it effectively. In this paper, we propose a novel Transformer-based architecture for multi-turn document grounded conversations. In particular, we devise an Incremental Transformer to encode multi-turn utterances along with the knowledge in related documents. Motivated by the human cognitive process, we design a two-pass decoder (Deliberation Decoder) to improve context coherence and knowledge correctness. Our empirical study on a real-world document grounded dataset shows that our model significantly outperforms competitive baselines in both context coherence and knowledge relevance.
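The control flow described above can be sketched as follows. This is a toy illustration of the two ideas, not the paper's actual layers: every function below is a hypothetical stand-in (real implementations would use Transformer blocks with attention), and the token operations are placeholders chosen only to make the data flow visible.

```python
def encode(tokens, context):
    """Stand-in for a Transformer encoder block: fold tokens into the context."""
    return context + tokens

def incremental_encode(turns, documents):
    """Incremental Transformer (sketch): encode each (document, utterance)
    pair on top of the running multi-turn context."""
    context = []
    for utterance, document in zip(turns, documents):
        context = encode(document, context)   # attend to grounding knowledge
        context = encode(utterance, context)  # then to the utterance itself
    return context

def deliberation_decode(context, knowledge):
    """Deliberation Decoder (sketch): a first pass drafts a response from the
    context; a second pass refines the draft against the document knowledge."""
    draft = [f"draft({tok})" for tok in context[-2:]]            # first pass
    final = [f"refine({tok})" for tok in draft + knowledge[:1]]  # second pass
    return final
```

For example, `incremental_encode([["hi"], ["yo"]], [["d1"], ["d2"]])` interleaves each turn with its document into one running context, which the two-pass decoder then drafts from and refines.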
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Dialogue Generation | Wizard of Wikipedia (WoW) Seen (test) | BLEU-1: 15.8 | 13 |
| Dialogue Generation | CMU-DoG (test) | BLEU-1: 9.5 | 13 |
| Knowledge-Grounded Dialogue Generation | Wizard of Wikipedia Unseen (test) | BLEU-1: 13.4 | 11 |