
Simple and Effective Masked Diffusion Language Models

About

While diffusion models excel at generating high-quality images, prior work reports a significant performance gap between diffusion and autoregressive (AR) methods in language modeling. In this work, we show that simple masked discrete diffusion is more performant than previously thought. We apply an effective training recipe that improves the performance of masked diffusion models and derive a simplified, Rao-Blackwellized objective that results in additional improvements. Our objective has a simple form -- it is a mixture of classical masked language modeling losses -- and can be used to train encoder-only language models that admit efficient samplers, including ones that can generate arbitrary lengths of text semi-autoregressively like a traditional language model. On language modeling benchmarks, a range of masked diffusion models trained with modern engineering practices achieves a new state-of-the-art among diffusion models, and approaches AR perplexity. We provide the code, along with a blog post and video tutorial on the project page: https://s-sahoo.com/mdlm
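The abstract describes the simplified objective as a weighted mixture of classical masked language modeling losses: mask a fraction of tokens determined by a diffusion time t, then take cross-entropy only on the masked positions, reweighted by the noise schedule. Below is a minimal NumPy sketch of that kind of loss, not the authors' implementation: it assumes a linear schedule alpha_t = 1 - t (giving a 1/t weight), and the random logits stand in for an encoder-only model's forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, MASK_ID = 8, 8  # toy vocabulary of 8 tokens; id 8 plays the [MASK] role

def mdlm_loss(logits, tokens, masked, t):
    """Masked-diffusion-style loss: cross-entropy on masked positions only,
    weighted by 1/t under the assumed linear schedule alpha_t = 1 - t."""
    # log-softmax over the vocabulary
    logp = logits - np.log(np.exp(logits).sum(-1, keepdims=True))
    nll = -logp[np.arange(len(tokens)), tokens]  # per-token negative log-lik.
    return (masked * nll).sum() / t              # only masked terms, reweighted

# Toy example: one length-16 sequence.
tokens = rng.integers(0, VOCAB, size=16)
t = rng.uniform(1e-3, 1.0)               # diffusion time ~ masking ratio
masked = rng.random(16) < t              # mask each token w.p. 1 - alpha_t = t
x_t = np.where(masked, MASK_ID, tokens)  # corrupted input the model would see
logits = rng.normal(size=(16, VOCAB))    # stand-in for the encoder's output
print(mdlm_loss(logits, tokens, masked, t))
```

Because the per-position terms are ordinary masked-LM cross-entropies, the same encoder can be queried repeatedly at decode time, unmasking a block of positions at a time, which is what makes the semi-autoregressive sampling mentioned above possible.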

Subham Sekhar Sahoo, Marianne Arriola, Yair Schiff, Aaron Gokaslan, Edgar Marroquin, Justin T Chiu, Alexander Rush, Volodymyr Kuleshov · 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | HellaSwag | Accuracy | 48.32 | 1460 |
| Commonsense Reasoning | WinoGrande | Accuracy | 51.93 | 776 |
| Question Answering | ARC Challenge | Accuracy | 24.66 | 749 |
| Language Modeling | PTB | Perplexity | 89.049 | 650 |
| Commonsense Reasoning | PIQA | Accuracy | 59.63 | 647 |
| Natural Language Understanding | GLUE (dev) | SST-2 (Acc) | 92.2 | 504 |
| Language Modeling | WikiText | Perplexity | 32.093 | 479 |
| Language Modeling | PTB (test) | Perplexity | 95.26 | 471 |
| Code Generation | HumanEval (test) | Pass@1 | 20 | 444 |
| Question Answering | ARC Easy | Accuracy | 34.26 | 386 |

Showing 10 of 110 rows.
