Generalized Interpolating Discrete Diffusion

About

While state-of-the-art language models achieve impressive results through next-token prediction, they have inherent limitations such as the inability to revise already generated tokens. This has prompted exploration of alternative approaches such as discrete diffusion. However, masked diffusion, which has emerged as a popular choice due to its simplicity and effectiveness, reintroduces this inability to revise words. To overcome this, we generalize masked diffusion, deriving a new family of general interpolating discrete diffusion (GIDD) processes which offers greater flexibility in the design of the noising process. Leveraging a novel diffusion ELBO, we achieve compute-matched state-of-the-art performance in diffusion language modeling. Exploiting GIDD's flexibility, we explore a hybrid approach combining masking and uniform noise, leading to improved sample quality and unlocking the ability for the model to correct its own mistakes, an area where autoregressive models have notoriously struggled. Code: https://github.com/dvruette/gidd/
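To make the hybrid noising idea concrete, below is a minimal NumPy sketch of a forward process that keeps each token with probability alpha_t and otherwise corrupts it with either the mask token or a uniformly random token. The function name, signature, and the p_uniform mixing parameter are illustrative assumptions, not the paper's exact parameterization or the API of the linked repository.

```python
import numpy as np

def sample_hybrid_noised_tokens(x, alpha_t, p_uniform, vocab_size, mask_id, rng=None):
    """Illustrative hybrid forward process: keep each token of x with
    probability alpha_t; otherwise corrupt it, drawing a uniformly random
    token with probability p_uniform and the [MASK] token otherwise.

    A hypothetical sketch of the masking + uniform-noise mixture the
    abstract describes, not GIDD's exact parameterization.
    """
    rng = rng or np.random.default_rng()
    z = x.copy()
    corrupted = rng.random(x.shape) > alpha_t                  # positions to noise
    to_uniform = corrupted & (rng.random(x.shape) < p_uniform)  # uniform-noise subset
    to_mask = corrupted & ~to_uniform                           # masking subset
    z[to_mask] = mask_id
    z[to_uniform] = rng.integers(0, vocab_size, size=int(to_uniform.sum()))
    return z

# Example: noise a short token sequence at an intermediate diffusion time.
# mask_id sits just outside the vocabulary, as is common for mask tokens.
x = np.array([5, 17, 3, 42, 8])
z_t = sample_hybrid_noised_tokens(x, alpha_t=0.5, p_uniform=0.2,
                                  vocab_size=100, mask_id=100)
```

Roughly speaking, this corresponds to a per-token marginal of the form Cat(z_t; alpha_t x + (1 - alpha_t) pi), where pi mixes mask and uniform mass; sweeping alpha_t from 1 to 0 interpolates between clean data and pure noise, and masked diffusion is recovered when all corrupting mass sits on the mask token.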

Dimitri von Rütte, Janis Fluri, Yuhui Ding, Antonio Orvieto, Bernhard Schölkopf, Thomas Hofmann • 2025

Related benchmarks

| Task | Dataset | Metric | Value | Rank |
| --- | --- | --- | --- | --- |
| Language Modeling | PTB | Perplexity | 86.911 | 1034 |
| Language Modeling | WikiText | PPL | 30.809 | 732 |
| Language Modeling | LAMBADA | Perplexity | 47.811 | 150 |
| Image Generation | ImageNet-1k (val) | FID | 35.403 | 93 |
| Language Modeling | OWT | Gen. PPL | 63.8 | 61 |
| Image Generation | ImageNet-1K | FID | 7.076 | 55 |
| Language Modeling | LM1B | PPL (Generalized) | 118.6 | 55 |
| Language Modeling | LM1B (val) | Perplexity | 32.98 | 55 |
| Language Modeling | arXiv | Perplexity | 39.019 | 55 |
| Language Modeling | OpenWebText (OWT) (val) | Perplexity | 22.29 | 42 |

Showing 10 of 14 rows.
