Lexical Knowledge Internalization for Neural Dialog Generation

About

We propose knowledge internalization (KI), which aims to complement neural dialog models with lexical knowledge. Instead of further conditioning knowledge-grounded dialog (KGD) models on externally retrieved knowledge, we seek to integrate knowledge about each input token internally into the model's parameters. To tackle the challenge posed by the large scale of lexical knowledge, we adopt a contrastive learning approach and create an effective token-level lexical knowledge retriever that requires only weak supervision mined from Wikipedia. We demonstrate the effectiveness and general applicability of our approach on various datasets and diverse model structures.
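
As a rough illustration of the contrastive retriever training described above, here is a minimal PyTorch sketch (not the authors' released implementation): each anchor token is paired with one weakly aligned knowledge sentence mined from Wikipedia, and the other sentences in the batch serve as in-batch negatives under an InfoNCE-style objective. The function name, temperature value, and embedding sizes are illustrative assumptions.

```python
# Minimal sketch of token-level contrastive retriever training.
# Assumption: weak supervision pairs each anchor token with one
# "positive" knowledge sentence; other sentences in the batch are negatives.
import torch
import torch.nn.functional as F

def info_nce_loss(token_emb: torch.Tensor,
                  knowledge_emb: torch.Tensor,
                  temperature: float = 0.07) -> torch.Tensor:
    """token_emb: (B, d) contextual embeddings of anchor tokens.
    knowledge_emb: (B, d) embeddings of the weakly aligned knowledge
    sentences; row i is the positive for token i."""
    token_emb = F.normalize(token_emb, dim=-1)
    knowledge_emb = F.normalize(knowledge_emb, dim=-1)
    # Similarity of every token against every knowledge sentence in the batch.
    logits = token_emb @ knowledge_emb.t() / temperature   # (B, B)
    # The matching pair sits on the diagonal, so the target for row i is i.
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)

# Toy usage with random tensors standing in for encoder outputs.
loss = info_nce_loss(torch.randn(8, 256, requires_grad=True),
                     torch.randn(8, 256, requires_grad=True))
loss.backward()
```

In the full KI setup, the knowledge selected this way is internalized by training the dialog model itself, rather than by conditioning generation on retrieved text at inference time; the sketch covers only the retriever's contrastive objective.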

Zhiyong Wu, Wei Bi, Xiang Li, Lingpeng Kong, Ben Kao • 2022

Related benchmarks

Task                                     Dataset                                    Metric                  Result   Rank
Dialogue Generation                      DailyDialog                                Distinct-1              4.39     26
Knowledge-Grounded Dialogue Generation   Wizard of Wikipedia (WoW) Seen (test)      ROUGE-1                 12.84    10
Knowledge-Grounded Dialogue Generation   Wizard of Wikipedia (WoW) Unseen (test)    ROUGE-1                 11.23    10
Knowledge-Grounded Dialogue Generation   Wizard of Wikipedia (WoW) Seen             Appropriateness Score   3.9      6
Dialogue Generation                      CRD                                        BLEU-4                  3.01     3
Dialogue Generation                      DailyDialog (test)                         Appropriateness         4.22     3
Dialogue Generation                      CRD (test)                                 Appropriateness         4.22     3
Dialogue Response Generation             CRD (test)                                 PPL                     28.5     2

Other info

Code