
Entropy-Tree: Tree-Based Decoding with Entropy-Guided Exploration

About

Large language models achieve strong reasoning performance, yet existing decoding strategies either explore blindly (random sampling) or redundantly (independent multi-sampling). We propose Entropy-Tree, a tree-based decoding method that uses entropy as a signal for branching decisions, expanding the search tree only at positions where the model exhibits genuine uncertainty. Entropy-Tree delivers superior accuracy and calibration on reasoning tasks: it achieves higher pass@k than Multi-chain across multiple models and datasets, and its predictive entropy yields better AUROC than several traditional uncertainty metrics. Entropy-Tree thus unifies efficient structured exploration and reliable uncertainty estimation within a single decoding procedure.
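The core idea, branching only where the model is genuinely uncertain, can be illustrated with a minimal sketch. The paper's exact algorithm is not given here, so the entropy threshold, the top-k branching factor, and the function names below are all illustrative assumptions, not the authors' implementation:

```python
import math

def token_entropy(probs):
    """Shannon entropy (in nats) of a next-token distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def next_tokens(probs, threshold=1.0, k=2):
    """Entropy-guided branching decision (illustrative sketch).

    If the model is uncertain (entropy above `threshold`), expand the
    search tree into the top-k candidate tokens; otherwise follow the
    single most likely token, keeping the tree narrow.
    `threshold` and `k` are assumed hyperparameters, not paper values.
    """
    if token_entropy(probs) > threshold:
        ranked = sorted(range(len(probs)), key=lambda i: -probs[i])
        return ranked[:k]  # uncertain position: branch
    return [max(range(len(probs)), key=lambda i: probs[i])]  # confident: no branch
```

For example, a peaked distribution such as `[0.97, 0.01, 0.01, 0.01]` has low entropy and yields a single continuation, while a uniform `[0.25, 0.25, 0.25, 0.25]` exceeds the threshold and spawns two branches.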

Longxuan Wei, Yubo Zhang, Zijiao Zhang, Zhihu Wang, Shiwan Zhao, Tianyu Huang, Huiting Zhao, Chenfei Liu, Shenao Zhang, Junchi Yan • 2026

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Uncertainty Calibration | MATH 500 | AUROC | 0.861 | 18
Uncertainty Calibration | GPQA Diamond | AUROC | 0.696 | 18
Uncertainty Calibration | SciBench | AUROC | 78.5 | 18
Uncertainty Calibration | GPQA Main | AUROC | 0.636 | 18
Scientific problem solving | SciBench | Pass@20 | 77.46 | 17
Mathematical Reasoning | SVAMP | Pass@20 | 0.9733 | 6
Mathematical Reasoning | AIME24 | Pass@20 | 0.2333 | 6
Mathematical Reasoning | AIME 25 | Pass@20 | 36.67 | 6
Mathematical Reasoning | SVAMP | Pass@10 | 96.62 | 6
Mathematical Reasoning | AIME24 | Pass@10 | 18.33 | 6

Showing 10 of 18 rows.
