
Adaptive Visual Autoregressive Acceleration via Dual-Linkage Entropy Analysis

About

Visual AutoRegressive modeling (VAR) suffers from substantial computational cost due to the massive number of tokens involved. Because they fail to account for the continuous evolution of modeling dynamics, existing VAR token reduction methods face three key limitations: heuristic stage partitioning, non-adaptive schedules, and limited acceleration scope, leaving significant acceleration potential untapped. Since entropy variation intrinsically reflects the transition of predictive uncertainty, it offers a principled measure for capturing the evolution of modeling dynamics. We therefore propose NOVA, a training-free token reduction framework for accelerating VAR models via entropy analysis. NOVA adaptively determines the acceleration activation scale during inference by identifying, online, the inflection point of scale entropy growth. Through scale-linkage and layer-linkage ratio adjustment, NOVA dynamically computes distinct token reduction ratios for each scale and layer, pruning low-entropy tokens while reusing the cache derived from the residuals at the prior scale, thereby accelerating inference while maintaining generation quality. Extensive experiments and analyses validate NOVA as a simple yet effective training-free acceleration framework.
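The two entropy-based decisions the abstract describes can be sketched in a few lines. The function names, the threshold scheme, and the inflection criterion below are illustrative assumptions, not NOVA's actual implementation (the paper's exact formulas are not reproduced here): per-token Shannon entropy is computed from the model's logits, the lowest-entropy fraction of tokens is pruned, and the activation scale is taken as the first scale where entropy growth starts to slow (discrete second difference turns negative).

```python
import numpy as np

def token_entropy(logits):
    """Shannon entropy (nats) of each token's predictive distribution.

    logits: array of shape (num_tokens, vocab_size).
    """
    z = logits - logits.max(axis=-1, keepdims=True)   # numerical stability
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return -(p * np.log(p + 1e-12)).sum(axis=-1)

def prune_low_entropy_tokens(logits, reduction_ratio):
    """Indices of tokens kept after dropping the lowest-entropy fraction.

    Low-entropy tokens are confidently predicted, so (per the paper's idea)
    they can be skipped and their values reused from the prior-scale cache.
    """
    h = token_entropy(logits)
    n_drop = int(len(h) * reduction_ratio)
    order = np.argsort(h)            # ascending: lowest entropy first
    return np.sort(order[n_drop:])   # survivors, in original token order

def entropy_inflection_scale(scale_entropies):
    """First scale index where entropy growth slows (hypothetical criterion).

    Uses the discrete second difference; returns the last scale if no
    inflection is found.
    """
    h = np.asarray(scale_entropies, dtype=float)
    for i in range(1, len(h) - 1):
        if h[i + 1] - 2 * h[i] + h[i - 1] < 0:
            return i
    return len(h) - 1
```

For example, a token whose distribution is sharply peaked has near-zero entropy and is pruned first, while tokens with near-uniform logits are retained.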

Yu Zhang, Jingyi Liu, Feng Liu, Duoqian Miao, Qi Zhang, Kexue Fu, Changwei Wang, Longbing Cao • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Text-to-Image Generation | GenEval | GenEval Score: 79 | 277 |
| Text-to-Image Generation | DPG-Bench | Overall Score: 86.53 | 173 |
| Text-to-Image Generation | ImageReward | ImageReward Score: 1.035 | 56 |
| Text-to-Image Generation | HPS v2.1 | Score (Anime): 32.03 | 9 |
