Entropy-Tree: Tree-Based Decoding with Entropy-Guided Exploration
About
Large language models achieve strong reasoning performance, yet existing decoding strategies either explore blindly (random sampling) or redundantly (independent multi-sampling). We propose Entropy-Tree, a tree-based decoding method that uses entropy as the signal for branching decisions, expanding the search tree only at positions where the model exhibits genuine uncertainty. Entropy-Tree shows superior accuracy and calibration on reasoning tasks: it achieves better pass@k than Multi-chain across multiple models and datasets, and its predictive entropy attains higher AUROC than several traditional uncertainty metrics. Entropy-Tree unifies efficient structured exploration and reliable uncertainty estimation within a single decoding procedure.
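The branching rule described above can be sketched in a few lines: compute the Shannon entropy of the model's next-token distribution at each step, and expand the tree into multiple children only when that entropy crosses a threshold. This is a minimal illustrative sketch, not the paper's implementation; the threshold `tau` and branching factor `k` are hypothetical hyperparameters chosen for the example.

```python
import math

def entropy(probs):
    """Shannon entropy (in nats) of a next-token probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def expand(probs, tau=1.0, k=3):
    """Return the token indices to branch on at one decoding step.

    Branch into the top-k tokens only when predictive entropy exceeds
    tau (i.e. the model is genuinely uncertain); otherwise continue
    greedily with the single most likely token.
    """
    if entropy(probs) > tau:
        ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
        return ranked[:k]
    return [max(range(len(probs)), key=lambda i: probs[i])]

# Confident step: low entropy, one greedy continuation.
print(expand([0.97, 0.01, 0.01, 0.01]))   # [0]
# Uncertain step: high entropy, the tree branches into the top-3 tokens.
print(expand([0.25, 0.25, 0.3, 0.2]))     # [2, 0, 1]
```

In a full tree decoder this rule would be applied at every node, so branching cost is spent only where the model is uncertain rather than uniformly across all positions.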
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Uncertainty Calibration | MATH 500 | AUROC | 0.861 | 18 |
| Uncertainty Calibration | GPQA Diamond | AUROC | 0.696 | 18 |
| Uncertainty Calibration | SciBench | AUROC | 78.5 | 18 |
| Uncertainty Calibration | GPQA Main | AUROC | 0.636 | 18 |
| Scientific problem solving | SciBench | Pass@20 | 77.46 | 17 |
| Mathematical Reasoning | SVAMP | Pass@20 | 0.9733 | 6 |
| Mathematical Reasoning | AIME24 | Pass@20 | 0.2333 | 6 |
| Mathematical Reasoning | AIME 25 | Pass@20 | 36.67 | 6 |
| Mathematical Reasoning | SVAMP | Pass@10 | 96.62 | 6 |
| Mathematical Reasoning | AIME24 | Pass@10 | 18.33 | 6 |