Generalized Interpolating Discrete Diffusion

About

While state-of-the-art language models achieve impressive results through next-token prediction, they have inherent limitations such as the inability to revise already generated tokens. This has prompted exploration of alternative approaches such as discrete diffusion. However, masked diffusion, which has emerged as a popular choice due to its simplicity and effectiveness, reintroduces the inability to revise words. To overcome this, we generalize masked diffusion, deriving a new family of general interpolating discrete diffusion (GIDD) processes that offer greater flexibility in the design of the noising process. Leveraging a novel diffusion ELBO, we achieve compute-matched state-of-the-art performance in diffusion language modeling. Exploiting GIDD's flexibility, we explore a hybrid approach combining masking and uniform noise, leading to improved sample quality and unlocking the ability of the model to correct its own mistakes, an area where autoregressive models have notoriously struggled. Code: https://github.com/dvruette/gidd/
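
As a rough illustration of the kind of noising process GIDD supports, the sketch below samples a corrupted sequence z_t from an interpolating marginal of the form q_t(z_t | x) = Cat(z_t; alpha_t * x + (1 - alpha_t) * pi): each token survives with probability alpha_t and is otherwise replaced by a draw from a mixing distribution pi that blends the mask token with uniform noise. The fixed mixing weight p_u, the mask_id, and the vocabulary size are illustrative assumptions (in the paper the mixing distribution may itself vary over time), so treat this as a sketch rather than the reference implementation.

```python
import numpy as np

def sample_z_t(x_ids, alpha_t, p_u, mask_id, vocab_size, rng):
    """Sample z_t from an interpolating marginal (hybrid mask + uniform noise).

    Each token keeps its clean value with probability alpha_t; otherwise it
    is resampled from a mixing distribution that puts mass p_u on a uniform
    random token and (1 - p_u) on the mask token. NOTE: GIDD allows the
    mixing distribution to depend on t; the fixed one here is a simplifying
    assumption for illustration.
    """
    x = np.asarray(x_ids, dtype=np.int64)
    z = x.copy()
    corrupt = rng.random(x.shape) >= alpha_t    # noise each token w.p. 1 - alpha_t
    uniform = rng.random(x.shape) < p_u         # uniform-noise vs. mask branch
    random_ids = rng.integers(0, vocab_size, size=x.shape)
    z[corrupt & uniform] = random_ids[corrupt & uniform]
    z[corrupt & ~uniform] = mask_id
    return z

rng = np.random.default_rng(0)
x = [464, 3290, 318, 845, 3621]  # arbitrary example token ids
z_t = sample_z_t(x, alpha_t=0.6, p_u=0.2, mask_id=50257, vocab_size=50257, rng=rng)
print(z_t)
```

Because such marginals can be sampled directly at any noise level t, training reduces to drawing (x, z_t) pairs and teaching the model to denoise z_t back to x; mixing in uniform noise is what exposes the model to corrupted-but-unmasked tokens, which is where the self-correction ability described above comes from.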

Dimitri von Rütte, Janis Fluri, Yuhui Ding, Antonio Orvieto, Bernhard Schölkopf, Thomas Hofmann • 2025

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Language Modeling | PTB | Perplexity: 86.911 | 650 |
| Language Modeling | WikiText | PPL: 30.809 | 479 |
| Language Modeling | LAMBADA | Perplexity: 47.811 | 99 |
| Image Generation | ImageNet-1k (val) | FID: 35.403 | 84 |
| Image Generation | ImageNet-1K | FID: 7.076 | 42 |
| Language Modeling | arXiv | Perplexity: 39.019 | 21 |
| Language Modeling | AG-News | PPL: 60.607 | 20 |
| Language Modeling | Pubmed | Perplexity: 42.634 | 8 |
| Language Modeling | LM1B GPT2 | PPL: 65.898 | 4 |
