
KVComm: Enabling Efficient LLM Communication through Selective KV Sharing

About

Large Language Models (LLMs) are increasingly deployed in multi-agent systems, where effective inter-model communication is crucial. Existing communication protocols either rely on natural language, incurring high inference costs and information loss, or on hidden states, which suffer from information concentration bias and inefficiency. To address these limitations, we propose KVComm, a novel communication framework that enables efficient communication between LLMs through selective sharing of KV pairs. KVComm leverages the rich information encoded in the KV pairs while avoiding the pitfalls of hidden states. We introduce a KV layer-wise selection strategy based on attention importance scores with a Gaussian prior to identify the most informative KV pairs for communication. Extensive experiments across diverse tasks and model pairs demonstrate that KVComm achieves comparable performance to the upper-bound method, which directly merges inputs to one model without any communication, while transmitting as few as 30% of layers' KV pairs. Our study highlights the potential of KV pairs as an effective medium for inter-LLM communication, paving the way for scalable and efficient multi-agent systems.
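The layer-wise selection strategy described above can be sketched roughly as follows: score each layer by attention importance, weight the scores by a Gaussian prior over layer depth, and share the KV pairs of the top ~30% of layers. This is a minimal illustration only; the function and parameter names (`select_kv_layers`, `mu_frac`, `sigma_frac`) and the exact placement of the prior are assumptions, not details taken from the paper.

```python
import numpy as np

def select_kv_layers(attn_scores, share_ratio=0.3, mu_frac=0.5, sigma_frac=0.25):
    """Pick which layers' KV pairs to share.

    attn_scores: per-layer attention importance, e.g. the mean attention
    mass each layer places on the context to be communicated, shape
    (num_layers,). A Gaussian prior over layer depth (centered at
    mu_frac of the depth, width sigma_frac of the depth) reweights the
    raw scores before the top share_ratio fraction of layers is chosen.
    """
    scores = np.asarray(attn_scores, dtype=float)
    num_layers = len(scores)
    depth = np.arange(num_layers)
    mu = mu_frac * (num_layers - 1)
    sigma = sigma_frac * num_layers
    prior = np.exp(-0.5 * ((depth - mu) / sigma) ** 2)  # Gaussian prior over depth
    combined = scores * prior                           # prior-weighted importance
    k = max(1, int(round(share_ratio * num_layers)))    # e.g. ~30% of layers
    return sorted(np.argsort(combined)[-k:].tolist())

# Example: a 32-layer model with illustrative random importance scores
rng = np.random.default_rng(0)
shared = select_kv_layers(rng.random(32), share_ratio=0.3)
print(shared)  # indices of the 10 layers (round(0.3 * 32)) selected for KV sharing
```

Only the KV cache entries of the returned layers would then be transmitted to the receiving model, in place of a natural-language message or full hidden states.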

Xiangyu Shi, Marco Chiesa, Gerald Q. Maguire Jr., Dejan Kostic • 2025

Related benchmarks

Task                    Dataset                  Metric     Result   Rank
Question Answering      MuSiQuest-E 2022 (Full)  F1 Score   46       30
Question Answering      HotpotQA-E 2018 (Full)   F1 Score   71       30
Question Answering      QASPER-E 2021 (Full)     F1 Score   38       30
Summarization           SAMSum Full 2019         F1 Score   35       30
Reasoning               Countries                F1 Score   57       19
Mathematical Reasoning  TMATH                    F1 Score   36       3
Reasoning               Tipsheets                F1 Score   0.96     3
Question Answering      HotpotQA                 F1 Score   65       2
Question Answering      MultiField-QA en         F1 Score   51       2
Question Answering      2WikiM-QA                F1 Score   37       2

(10 of 11 rows shown)
