
Diffusion-LM Improves Controllable Text Generation

About

Controlling the behavior of language models (LMs) without re-training is a major open problem in natural language generation. While recent works have demonstrated successes on controlling simple sentence attributes (e.g., sentiment), there has been little progress on complex, fine-grained controls (e.g., syntactic structure). To address this challenge, we develop a new non-autoregressive language model based on continuous diffusions that we call Diffusion-LM. Building upon the recent successes of diffusion models in continuous domains, Diffusion-LM iteratively denoises a sequence of Gaussian vectors into word vectors, yielding a sequence of intermediate latent variables. The continuous, hierarchical nature of these intermediate variables enables a simple gradient-based algorithm to perform complex, controllable generation tasks. We demonstrate successful control of Diffusion-LM for six challenging fine-grained control tasks, significantly outperforming prior work.
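The decoding loop described above (denoise a Gaussian latent step by step, nudging it with gradients from a control objective) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `denoise_step` stands in for a trained Diffusion-LM denoiser, and `control_grad` stands in for the gradient of a trained attribute classifier; both are placeholder functions invented here.

```python
# Minimal sketch of gradient-guided diffusion decoding over word-embedding
# latents. All model components below are placeholders, not trained networks.
import numpy as np

SEQ_LEN, EMB_DIM, STEPS = 8, 16, 50
rng = np.random.default_rng(0)

def denoise_step(x_t, t):
    # Stand-in denoiser: a trained Diffusion-LM would predict cleaner word
    # embeddings from the noisy latent; here we simply shrink the noise.
    return x_t * (1.0 - 1.0 / STEPS)

def control_grad(x_t):
    # Gradient of a placeholder control objective, mean(x**2); a real
    # controller backpropagates through an attribute classifier instead.
    return 2.0 * x_t / x_t.size

def controlled_sample(step_size=0.1):
    x_t = rng.standard_normal((SEQ_LEN, EMB_DIM))  # start from Gaussian noise
    for t in reversed(range(STEPS)):
        x_t = denoise_step(x_t, t)                 # diffusion denoising update
        x_t = x_t - step_size * control_grad(x_t)  # steer latent toward control
    return x_t  # final latents are rounded to the nearest word embeddings

x_0 = controlled_sample()
print(x_0.shape)  # (8, 16)
```

The key point the sketch shows is that control happens in the continuous latent space at every denoising step, rather than over discrete tokens, which is what makes simple gradient steps applicable.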

Xiang Lisa Li, John Thickstun, Ishaan Gulrajani, Percy Liang, Tatsunori B. Hashimoto • 2022

Related benchmarks

Task                        Dataset                            Metric            Result   Rank
Language Modeling           LM1B (test)                        Perplexity        118.6    120
Language Modeling           One Billion Word Benchmark (test)  Test Perplexity   118.6    108
Machine Translation         WMT 2014 (test)                    BLEU              17.41    100
Machine Translation         WMT En-De '14                      BLEU              15.3     89
Text Generation             LM1B (test)                        --                --       72
Machine Translation         WMT 2016 (test)                    BLEU              29.39    58
Machine Translation         IWSLT De-En 14                     BLEU Score        29.11    33
Machine Translation         WMT De-En 14                       BLEU              17.3     33
Language Modeling           LM1B (val)                         Perplexity        118.6    17
Structured JSON Generation  MultiWOZ, Super-NaturalInstructions, TruthfulQA, and Self-Instruct (averaged)  Similarity Score  0.72  16

Showing 10 of 27 rows

Other info

Code
