
Block Diffusion: Interpolating Between Autoregressive and Diffusion Language Models

About

Diffusion language models offer unique benefits over autoregressive models due to their potential for parallelized generation and controllability, yet they lag in likelihood modeling and are limited to fixed-length generation. In this work, we introduce a class of block diffusion language models that interpolate between discrete denoising diffusion and autoregressive models. Block diffusion overcomes key limitations of both approaches by supporting flexible-length generation and improving inference efficiency with KV caching and parallel token sampling. We propose a recipe for building effective block diffusion models that includes an efficient training algorithm, estimators of gradient variance, and data-driven noise schedules that minimize this variance. Block diffusion sets a new state of the art among diffusion models on language modeling benchmarks and enables generation of arbitrary-length sequences. We provide the code, model weights, and a blog post on the project page: https://m-arriola.com/bd3lms
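The abstract describes generation that is autoregressive *across* blocks (so sequences can grow to arbitrary length and prior context can be KV-cached) but diffusion-like *within* a block (masked tokens are refined in parallel over several denoising steps). A minimal toy sketch of that control flow, with a random stub standing in for the denoising network — all names and the sampling rule here are illustrative assumptions, not the authors' actual API:

```python
import random

def sample_block_diffusion(num_blocks, block_len, num_steps, vocab_size, seed=0):
    """Toy sketch of block-diffusion-style sampling.

    Outer loop: autoregressive over blocks (the committed prefix plays the
    role of KV-cached context). Inner loop: parallel denoising steps that
    gradually unmask tokens inside the current block. A real model would
    predict token distributions; here a seeded RNG is the stand-in.
    """
    rng = random.Random(seed)
    MASK = vocab_size  # sentinel id for a still-masked position
    sequence = []
    for _ in range(num_blocks):
        block = [MASK] * block_len  # each new block starts fully masked
        for step in range(num_steps):
            # Parallel within-block update: every still-masked position may
            # resolve this step; the unmasking probability ramps to 1 so the
            # block is fully decoded by the final denoising step.
            for i in range(block_len):
                if block[i] == MASK and rng.random() < (step + 1) / num_steps:
                    block[i] = rng.randrange(vocab_size)
        sequence.extend(block)  # commit the block; the prefix grows, so
                                # total length is flexible, not fixed
    return sequence
```

Because blocks are committed left to right, generation can stop after any block boundary — the flexible-length property the abstract contrasts with fixed-length diffusion sampling.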

Marianne Arriola, Aaron Gokaslan, Justin T. Chiu, Zhihan Yang, Zhixuan Qi, Jiaqi Han, Subham Sekhar Sahoo, Volodymyr Kuleshov • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | HellaSwag | Accuracy | 42.48 | 1460 |
| Language Modeling | PTB | Perplexity | 118.3 | 650 |
| Commonsense Reasoning | PIQA | Accuracy | 59.79 | 647 |
| Language Modeling | WikiText | Perplexity | 39.28 | 479 |
| Language Modeling | LM1B (test) | Perplexity | 28.23 | 120 |
| Language Modeling | LAMBADA | Perplexity | 39.17 | 99 |
| Language Modeling | PTB (val) | Perplexity | 82 | 83 |
| Language Modeling | OpenWebText | Perplexity | 40.97 | 50 |
| Unconditional Generation | OpenWebText (OWT) L=1024 (held-out) | MAUVE | 0.133 | 45 |
| Coreference Resolution | WinoGrande | Accuracy | 51.38 | 36 |

Showing 10 of 32 rows
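Most rows above report perplexity, which is simply the exponential of the mean per-token negative log-likelihood under the model. A small self-contained helper illustrating the relationship (the function name is ours, not from the benchmark code):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(mean negative log-likelihood per token).

    `token_log_probs` are natural-log probabilities the model assigned to
    each observed token in the evaluation set.
    """
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# Intuition: a model that assigned every token probability 1/28.23 would
# score perplexity 28.23 -- i.e. it is "as confused" as a uniform choice
# among ~28 equally likely tokens at each step.
```

Lower is better for perplexity; higher is better for Accuracy and MAUVE.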
