# Controlling Distributional Bias in Multi-Round LLM Generation via KL-Optimized Fine-Tuning

## About
While the real world is inherently stochastic, Large Language Models (LLMs) are predominantly evaluated on single-round inference against fixed ground truths. In this work, we shift the lens to distribution alignment: assessing whether LLMs, when prompted repeatedly, can generate outputs that adhere to a desired target distribution, e.g., one reflecting real-world statistics or a uniform distribution. We formulate distribution alignment over the attributes of gender, race, and sentiment within occupational contexts. Our empirical analysis reveals that off-the-shelf LLMs and standard alignment techniques, including prompt engineering and Direct Preference Optimization, fail to reliably control output distributions. To bridge this gap, we propose a novel fine-tuning framework that couples Steering Token Calibration with Semantic Alignment. We introduce a hybrid objective combining Kullback-Leibler divergence, which anchors the probability mass of latent steering tokens, with Kahneman-Tversky Optimization, which binds those tokens to semantically consistent responses. Experiments across six diverse datasets demonstrate that our approach significantly outperforms baselines, achieving precise distributional control in attribute generation tasks.
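The hybrid objective can be sketched in miniature as a KL term over the steering-token categories plus a weighted KTO term. This is an illustrative simplification, not the paper's implementation: `kl_divergence`, `hybrid_loss`, the `lam` weight, and the example distributions are all hypothetical, and the KTO term is treated as an opaque scalar computed elsewhere.

```python
import math

def kl_divergence(target, predicted, eps=1e-12):
    """KL(target || predicted) over the steering-token categories.

    `target` and `predicted` are categorical distributions (lists of
    probabilities summing to 1); eps guards against log(0).
    """
    return sum(
        t * math.log((t + eps) / (p + eps))
        for t, p in zip(target, predicted)
        if t > 0
    )

def hybrid_loss(target, predicted, kto_loss, lam=1.0):
    """Hypothetical combination: the KL term anchors the steering-token
    probability mass to the target distribution, while the KTO term
    (computed elsewhere, passed in as a scalar) binds steering tokens
    to semantically consistent responses."""
    return kl_divergence(target, predicted) + lam * kto_loss

# Illustrative numbers: a two-category gender target vs. a skewed model.
target = [0.52, 0.48]
predicted = [0.70, 0.30]
loss = hybrid_loss(target, predicted, kto_loss=0.0)
```

A perfectly calibrated model drives the KL term to zero, at which point the semantic-alignment (KTO) term alone shapes the fine-tuning signal.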
## Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Distribution Alignment | Gender (UK), Real | MAE 0.084 | 24 |
| Distribution Alignment | Gender (US), Real | MAE 0.068 | 24 |
| Distribution Alignment | Aggregate of Six Datasets | Average MAE 0.082 | 24 |
| Story Generation | Gender (UK), Real distribution | MAE 0.22 | 24 |
| Story Generation | Gender (US), Real distribution | MAE 0.25 | 24 |
| Distribution Alignment | Gender (UK), Even | MAE 0.046 | 20 |
| Distribution Alignment | Race, Even | MAE 0.072 | 20 |
| Distribution Alignment | Sentiment, Even | MAE 0.075 | 20 |
| Story Generation | Sentiment, Even distribution | MAE 0.13 | 20 |
| Distribution Alignment | Gender (US), Even | MAE 0.054 | 20 |
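The MAE figures above compare the distribution of generated attributes against the target distribution. A minimal sketch of such a metric, assuming the error is averaged over attribute categories (the benchmark's exact aggregation is not specified here, and `distribution_mae` is a hypothetical name):

```python
def distribution_mae(target, generated):
    """Mean absolute error between two categorical distributions,
    averaged over attribute categories (illustrative form)."""
    assert len(target) == len(generated)
    return sum(abs(t - g) for t, g in zip(target, generated)) / len(target)

# Illustrative: an even two-category gender target vs. slightly skewed outputs.
error = distribution_mae([0.5, 0.5], [0.554, 0.446])  # ≈ 0.054
```

Lower is better: a model whose output frequencies exactly match the target distribution scores an MAE of 0.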