Likelihood-Based Diffusion Language Models

About

Despite a growing interest in diffusion-based language models, existing work has not shown that these models can attain nontrivial likelihoods on standard language modeling benchmarks. In this work, we take the first steps towards closing the likelihood gap between autoregressive and diffusion-based language models, with the goal of building and releasing a diffusion model which outperforms a small but widely-known autoregressive model. We pursue this goal through algorithmic improvements, scaling laws, and increased compute. On the algorithmic front, we introduce several methodological improvements for the maximum-likelihood training of diffusion language models. We then study scaling laws for our diffusion models and find compute-optimal training regimes which differ substantially from autoregressive models. Using our methods and scaling analysis, we train and release Plaid 1B, a large diffusion language model which outperforms GPT-2 124M in likelihood on benchmark datasets and generates fluent samples in unconditional and zero-shot control settings.

Ishaan Gulrajani, Tatsunori B. Hashimoto • 2023
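For background (a standard formulation, not stated on this page): likelihood training and evaluation of diffusion models rests on a variational upper bound on negative log-likelihood, so diffusion-model perplexities such as those below are upper bounds rather than exact values. For a discrete-time diffusion model with latents $z_1, \dots, z_T$ over data $x$, the bound is

\[
-\log p_\theta(x) \;\le\; \mathbb{E}_{q}\!\left[ D_{\mathrm{KL}}\!\big(q(z_T \mid x) \,\|\, p(z_T)\big) \;+\; \sum_{t=2}^{T} D_{\mathrm{KL}}\!\big(q(z_{t-1} \mid z_t, x) \,\|\, p_\theta(z_{t-1} \mid z_t)\big) \;-\; \log p_\theta(x \mid z_1) \right].
\]

Minimizing this bound over the training data is what maximum-likelihood training means for a diffusion model, and the same bound, converted to perplexity, is what gets compared against the exact likelihoods of autoregressive models such as GPT-2.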

Related benchmarks

Task | Dataset | Metric | Result | Rank
Language Modeling | WikiText2 | Perplexity | 29.42 | 2839
Code Generation | HumanEval | Pass@1 | 0.1 | 1036
Language Modeling | PTB | Perplexity | 74.33 | 1034
Language Modeling | WikiText-103 | PPL | 28.28 | 189
Math Reasoning | GSM8K | Accuracy | 32.6 | 187
Language Modeling | LAMBADA | Perplexity | 57.28 | 150
Character-level Language Modeling | text8 (test) | BPC | 1.48 | 128
Language Modeling | LM1B (val) | Perplexity | 32.4 | 55
Language Modeling | WikiText | PPL | 50.86 | 45
Language Modeling | LAMBADA zero-shot (test) | -- | -- | 44
Showing 10 of 26 rows
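To read the results above: Perplexity/PPL and bits-per-character (BPC) are standard monotone transforms of the model's average negative log-likelihood, so lower is better for all three (this conversion is generic, not specific to the paper):

\[
\mathrm{PPL} = \exp\!\Big(\tfrac{1}{N_{\mathrm{tok}}} \sum_i \mathrm{NLL}_i\Big), \qquad \mathrm{BPC} = \frac{1}{N_{\mathrm{char}} \ln 2} \sum_i \mathrm{NLL}_i,
\]

where $\mathrm{NLL}_i$ is the negative log-likelihood, in nats, of the $i$-th token or character. HumanEval Pass@1 and GSM8K accuracy, by contrast, are higher-is-better.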

Other info

Code
