
Self-Speculative Masked Diffusions

About

We present self-speculative masked diffusions, a new class of masked diffusion generative models for discrete data that require significantly fewer function evaluations to generate samples. Standard masked diffusion models predict factorized logits over the currently masked positions; a number of masked positions are then sampled, but the factorization approximation means that sampling too many positions at once leads to poor sample quality. As a result, many simulation steps, and therefore neural network function evaluations, are required to generate high-quality data. We reduce this computational burden by generating non-factorized predictions over masked positions. This is achieved by modifying the final transformer attention mask from non-causal to causal, enabling draft token generation and parallel validation via a novel, model-integrated speculative sampling mechanism. The result is a non-factorized predictive distribution over masked positions obtained in a single forward pass. We apply our method to GPT-2 scale text modelling and protein sequence generation, finding that we can achieve a ~2x reduction in the required number of network forward passes relative to standard masked diffusion models.
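The abstract describes drafting tokens and validating them in parallel via speculative sampling. As a point of reference, the standard speculative accept/reject step works as follows: each draft token is accepted with probability min(1, p/q), where q is the draft distribution and p the target distribution, and the first rejected position is resampled from the renormalized residual max(p - q, 0). The sketch below is a generic illustration of that accept/reject rule, not the authors' model-integrated implementation; the function name and array layout are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def speculative_accept(draft_probs, target_probs, draft_tokens):
    """Accept a prefix of draft tokens via standard speculative sampling.

    draft_probs, target_probs: arrays of shape (T, V) holding per-position
    distributions over a vocabulary of size V; draft_tokens: length-T array
    of sampled token ids. Returns (number of accepted tokens, replacement
    token for the first rejected position, or None if all were accepted).
    """
    for t, tok in enumerate(draft_tokens):
        p = target_probs[t, tok]
        q = draft_probs[t, tok]
        if rng.uniform() < min(1.0, p / q):
            continue  # token t accepted; move to the next position
        # Rejection: resample from the residual max(p - q, 0), renormalized,
        # which keeps the overall sample distributed according to the target.
        residual = np.maximum(target_probs[t] - draft_probs[t], 0.0)
        residual /= residual.sum()
        return t, int(rng.choice(len(residual), p=residual))
    return len(draft_tokens), None
```

When the draft and target distributions coincide (as a sanity check), every token is accepted, so all T draft positions can be committed with a single validation pass.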

Andrew Campbell, Valentin De Bortoli, Jiaxin Shi, Arnaud Doucet • 2025

Related benchmarks

Task             Dataset      Result           Rank
Text Generation  OpenWebText  Perplexity 5.02  86
