Training-Free Bayesianization for Low-Rank Adapters of Large Language Models
About
Estimating the uncertainty of responses from Large Language Models (LLMs) remains a critical challenge. While recent Bayesian methods have demonstrated effectiveness in quantifying uncertainty through low-rank weight updates, they typically require complex fine-tuning or post-training procedures. In this paper, we propose Training-Free Bayesianization (TFB), a simple yet theoretically grounded framework that efficiently transforms trained low-rank adapters into Bayesian ones without additional training. TFB systematically searches for the maximally acceptable level of variance in the weight posterior, constrained within a family of low-rank isotropic Gaussian distributions. Our theoretical analysis shows that under mild conditions, this search process is equivalent to KL-regularized variational optimization, a generalized form of variational inference. Through comprehensive experiments, we show that TFB achieves superior uncertainty estimation and generalization compared to existing methods while eliminating the need for complex Bayesianization training procedures. Code will be available at https://github.com/Wang-ML-Lab/bayesian-peft.
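The variance search described above can be illustrated with a minimal sketch. This is a simplification, not the paper's implementation: the function names, the grid search over `sigma`, and the `acceptable` predicate are hypothetical stand-ins, with noise placed on the low-rank factor `B` as a modeling assumption; TFB's actual acceptance criterion and KL-regularized formulation are in the repository linked above.

```python
import numpy as np


def sample_bayesian_lora(A, B, sigma, rng):
    """Draw one weight-update sample Delta_W = (B + eps) @ A, where
    eps ~ N(0, sigma^2 I) is isotropic Gaussian noise on the low-rank
    factor B (a simplifying assumption for this sketch)."""
    eps = sigma * rng.standard_normal(B.shape)
    return (B + eps) @ A


def search_max_sigma(A, B, acceptable, sigma_grid, n_samples=8, rng=None):
    """Return the largest sigma in `sigma_grid` for which every sampled
    update passes the user-supplied `acceptable` predicate -- a stand-in
    for TFB's acceptance criterion (e.g. a validation-metric threshold)."""
    if rng is None:
        rng = np.random.default_rng(0)
    best = 0.0
    for sigma in sorted(sigma_grid):
        samples = (sample_bayesian_lora(A, B, sigma, rng) for _ in range(n_samples))
        if all(acceptable(W) for W in samples):
            best = sigma  # this variance level is still acceptable
        else:
            break  # larger sigmas only add more noise; stop searching
    return best
```

At test time, calling `sample_bayesian_lora` repeatedly with the returned `sigma` yields an ensemble of weight updates whose prediction spread serves as an uncertainty estimate.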
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | ARC-C | Accuracy | 80.58 | 172 |
| Commonsense Reasoning | ARC-E | Accuracy | 91.2 | 106 |
| Commonsense Reasoning | WinoGrande-S In-Distribution (Llama-3.1-8B) | Accuracy | 76.25 | 15 |
| Commonsense Reasoning | ARC-C OOD (Small Shift) | Accuracy | 80.97 | 14 |
| Commonsense Reasoning | Chem OOD (Large Shift) | Accuracy | 47.33 | 14 |
| Uncertainty Quantification | ARC-E | Training Memory (MB) | 2.10e+4 | 14 |
| Commonsense Reasoning | ARC-E OOD (Small Shift) | Accuracy | 85.74 | 12 |
| Commonsense Reasoning | Phy OOD (Large Shift) | Accuracy | 48.83 | 12 |
| Commonsense Reasoning | OBQA In-Distribution | Accuracy | 88.3 | 12 |
| Commonsense Reasoning | WG-M In-Distribution | Accuracy | 82.7 | 12 |