
Autoregressive Models Rival Diffusion Models at ANY-ORDER Generation

About

Diffusion language models enable any-order generation and bidirectional conditioning, offering appealing flexibility for tasks such as infilling, rewriting, and self-correction. However, their formulation (predicting one part of a sequence from another within a single-step dependency) limits modeling depth and often yields lower sample quality and stability than autoregressive (AR) models. To address this, we revisit autoregressive modeling as a foundation and reformulate diffusion-style training into a structured multi-group prediction process. We propose Any-order Any-subset Autoregressive modeling (A3), a generalized framework that extends the standard AR factorization to arbitrary token groups and generation orders. A3 preserves the probabilistic rigor and multi-layer dependency modeling of AR while inheriting diffusion models' flexibility for parallel and bidirectional generation. We implement A3 through a two-stream attention architecture and a progressive adaptation strategy that transitions pretrained AR models toward any-order prediction. Experiments on question answering, commonsense reasoning, and story infilling demonstrate that A3 outperforms diffusion-based models while maintaining flexible decoding. This work offers a unified approach toward a flexible, efficient, and novel language modeling paradigm.
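To make the factorization concrete, here is a minimal sketch (not the authors' implementation) of the core idea behind any-order any-subset AR modeling: the sequence positions are partitioned into token groups, the groups are placed in an arbitrary order, and each group is predicted conditioned on all previously generated groups, so the overall likelihood still factorizes exactly as in standard AR modeling. The function name `a3_factorization` and the grouping scheme are illustrative assumptions.

```python
import random


def a3_factorization(tokens, num_groups=3, seed=0):
    """Sketch of an any-order any-subset AR factorization schedule.

    Splits the sequence positions into arbitrary token groups, shuffles
    the generation order over groups, and records, for each step, which
    positions are predicted and which previously generated positions
    form the (bidirectional) conditioning context.  The group scheme and
    function name are illustrative, not the paper's implementation.
    """
    rng = random.Random(seed)
    idx = list(range(len(tokens)))
    rng.shuffle(idx)  # arbitrary assignment of positions to groups
    groups = [sorted(idx[i::num_groups]) for i in range(num_groups)]
    rng.shuffle(groups)  # arbitrary generation order over groups

    steps, context = [], []
    for g in groups:
        # Each group is predicted from ALL positions generated so far,
        # regardless of whether they lie to its left or right.
        steps.append({"predict": g, "given": sorted(context)})
        context.extend(g)
    return steps
```

Standard left-to-right AR decoding is the special case where each group is a single position and the order is the identity; diffusion-style parallel decoding corresponds to large groups predicted jointly.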

Tianqi Du, Lizhe Fang, Weijie Yang, Chenheng Zhang, Zeming Wei, Yifei Wang, Yisen Wang • 2026

Related benchmarks

Task                     Dataset      Metric      Result   Rank
Commonsense Reasoning    PIQA         Accuracy    78.1     647
Commonsense Reasoning    SIQA         Accuracy    45.2     96
Commonsense Reasoning    HSWAG        Accuracy    0.584    52
Commonsense Reasoning    Wino         Accuracy    60.2     45
Question Answering       TriQA        Accuracy    19.4     21
Conditional Generation   Pile         Perplexity  11.2     18
Infilling                ROCStories   ROUGE-1     19.2     7
