
Adaptive Classifier-Free Guidance via Dynamic Low-Confidence Masking

About

Classifier-Free Guidance (CFG) significantly enhances controllability in generative models by interpolating conditional and unconditional predictions. However, standard CFG often employs a static unconditional input, which can be suboptimal for iterative generation processes where model uncertainty varies dynamically. We introduce Adaptive Classifier-Free Guidance (A-CFG), a novel method that tailors the unconditional input by leveraging the model's instantaneous predictive confidence. At each step of an iterative (masked) diffusion language model, A-CFG identifies tokens in the currently generated sequence for which the model exhibits low confidence. These tokens are temporarily re-masked to create a dynamic, localized unconditional input. This focuses CFG's corrective influence precisely on areas of ambiguity, leading to more effective guidance. We integrate A-CFG into a state-of-the-art masked diffusion language model and demonstrate its efficacy. Experiments on diverse language generation benchmarks show that A-CFG yields substantial improvements over standard CFG, achieving, for instance, a 3.9 point gain on GPQA. Our work highlights the benefit of dynamically adapting guidance mechanisms to model uncertainty in iterative generation.
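The core A-CFG loop described above can be sketched as follows. This is an illustrative reconstruction based only on the abstract, not the authors' code: `MASK_ID`, `acfg_step`, the confidence threshold, and the callable `uncond_logits_fn` are all hypothetical names, and the guidance formula is the standard CFG interpolation.

```python
import numpy as np

MASK_ID = 0  # hypothetical id of the [MASK] token


def acfg_step(tokens, confidences, cond_logits, uncond_logits_fn,
              guidance_scale=1.5, conf_threshold=0.5):
    """One adaptive-CFG step (sketch, assuming the interface below).

    tokens:           current sequence of token ids, shape (T,)
    confidences:      model's per-token predictive confidence, shape (T,)
    cond_logits:      logits from the conditional forward pass, shape (T, V)
    uncond_logits_fn: callable running the model on a (partially re-masked)
                      sequence and returning logits of shape (T, V)
    """
    # 1. Re-mask tokens the model is uncertain about; this dynamic,
    #    localized re-masking is the key idea of A-CFG.
    uncond_input = np.where(confidences < conf_threshold, MASK_ID, tokens)

    # 2. Unconditional forward pass on the re-masked sequence.
    uncond_logits = uncond_logits_fn(uncond_input)

    # 3. Standard CFG interpolation: guided = uncond + w * (cond - uncond),
    #    now focused on the low-confidence positions re-masked in step 1.
    guided_logits = uncond_logits + guidance_scale * (cond_logits - uncond_logits)
    return uncond_input, guided_logits
```

In a full iterative decoder this step would run once per denoising iteration, with `confidences` typically taken as the model's top-token probability at each position.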

Pengxiang Li, Shilin Yan, Joey Tsai, Renrui Zhang, Ruichuan An, Ziyu Guo, Xiaowei Gao • 2025

Related benchmarks

Task                            | Dataset  | Result            | Rank
Multi-hop QA                    | HotpotQA | Exact Match: 24.2 | 76
Open-domain Question Answering  | MS Marco | --                | 48
General QA                      | NQ       | EM: 34            | 38
Open-domain QA                  | TriviaQA | EM: 70.6          | 20
Slot Filling                    | T-REx    | --                | 20
