
Online Domain-aware LLM Decoding for Continual Domain Evolution

About

LLMs are typically fine-tuned offline on domain-specific data under the assumption of a static domain. In practice, domain knowledge evolves continuously through new regulations, products, services, and interaction patterns, and retraining or fine-tuning an LLM for every new instance is computationally infeasible. Real-world environments also exhibit temporal dynamics with shifting data distributions; disregarding this phenomenon, commonly referred to as concept drift, can significantly diminish a model's predictive accuracy. This mismatch between evolving domains and static adaptation pipelines highlights the need for efficient, real-time adaptation without costly retraining. In response, we introduce the Online Domain-aware Decoding (ODD) framework. ODD performs probability-level fusion between a base LLM and a prefix-tree prior, guided by adaptive confidence modulation driven by disagreement and continuity signals. Empirical evaluation under diverse drift scenarios demonstrates that ODD consistently surpasses the LLM-Greedy and LLM-Temp Scaled baselines across all syntactic and semantic NLG metrics, yielding an absolute ROUGE-L gain of 0.065 and a 13.6% relative improvement in Cosine Similarity over the best baseline. These results demonstrate ODD's robustness to evolving lexical and contextual patterns, making it well suited for dynamic LLM applications.
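The core decoding idea in the abstract — fusing the base LLM's next-token distribution with an online prefix-tree prior, with the prior's weight modulated by a disagreement signal — can be sketched roughly as below. The class and function names (`PrefixTreePrior`, `fuse`), the `base_weight` parameter, and the use of total-variation distance as the disagreement signal are illustrative assumptions; the paper's exact formulation (including its continuity signal) is not given here.

```python
import numpy as np

class PrefixTreePrior:
    """A minimal online prefix-tree (trie) prior over token sequences.
    Hypothetical sketch of the component named in the abstract, not the
    paper's exact data structure."""

    def __init__(self, vocab_size):
        self.vocab_size = vocab_size
        self.root = {}  # token id -> [count, child node]

    def add(self, tokens):
        # Online update: insert one observed domain sequence, counting edges.
        node = self.root
        for t in tokens:
            entry = node.setdefault(t, [0, {}])
            entry[0] += 1
            node = entry[1]

    def next_token_probs(self, prefix):
        # Walk the trie along the prefix; unseen prefixes fall back to uniform.
        node = self.root
        for t in prefix:
            if t not in node:
                return np.full(self.vocab_size, 1.0 / self.vocab_size)
            node = node[t][1]
        counts = np.zeros(self.vocab_size)
        for t, (c, _) in node.items():
            counts[t] = c
        total = counts.sum()
        if total == 0:
            return np.full(self.vocab_size, 1.0 / self.vocab_size)
        return counts / total

def fuse(p_llm, p_prior, base_weight=0.5):
    """Probability-level fusion with a disagreement-modulated weight:
    the more the two distributions disagree (total-variation distance),
    the less trust is placed in the prior."""
    p_llm = np.asarray(p_llm, dtype=float)
    p_prior = np.asarray(p_prior, dtype=float)
    disagreement = 0.5 * np.abs(p_llm - p_prior).sum()  # in [0, 1]
    alpha = base_weight * (1.0 - disagreement)
    fused = (1.0 - alpha) * p_llm + alpha * p_prior
    return fused / fused.sum()

# Toy example over a 4-token vocabulary.
prior = PrefixTreePrior(vocab_size=4)
prior.add([0, 1, 2])  # two observed domain sequences sharing a prefix
prior.add([0, 1, 3])
p_prior = prior.next_token_probs([0, 1])  # mass split between tokens 2 and 3
p_llm = [0.4, 0.3, 0.2, 0.1]
fused = fuse(p_llm, p_prior)
```

At decode time, each generated token would extend the prefix used to query the trie, so the prior sharpens as the model follows previously observed domain patterns and falls back to uniform (leaving the LLM unchanged) on novel prefixes.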

Mohammad Abu-Shaira, Weishi Shi • 2026

Related benchmarks

Task         | Dataset                        | Metric | Result | Rank
LLM Decoding | Bitext Telco Abrupt Drift      | EM     | 9.6    | 3
LLM Decoding | Bitext Telco Incremental Drift | EM     | 0.052  | 3
LLM Decoding | Bitext Telco Gradual Drift     | EM     | 0.037  | 3
