INMS: Memory Sharing for Large Language Model based Agents

About

While Large Language Model (LLM) based agents excel at complex tasks, their performance in open-ended scenarios is often constrained by isolated operation and reliance on static databases, missing the dynamic knowledge exchange of human dialogue. To bridge this gap, we propose the INteractive Memory Sharing (INMS) framework, an asynchronous interaction paradigm for multi-agent systems. By integrating real-time memory filtering, storage, and retrieval, INMS establishes a shared conversational memory pool. This enables continuous, dialogue-like memory sharing among agents, promoting collective self-enhancement and dynamically refining the retrieval mediator based on interaction history. Extensive experiments across three datasets demonstrate that INMS significantly improves agent performance by effectively modeling multi-agent interaction and collective knowledge sharing.
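To make the mechanism concrete, here is a minimal Python sketch of the memory-sharing idea the abstract describes: agents write candidate memories through a quality filter into a shared pool, and retrieve the most relevant entries later. All names (`SharedMemoryPool`, the score threshold, the word-overlap similarity) are illustrative assumptions, not the paper's actual implementation, which uses learned filtering and a trainable retrieval mediator.

```python
from dataclasses import dataclass

@dataclass
class MemoryEntry:
    content: str
    score: float  # quality score assigned by a filtering step (assumed)

class SharedMemoryPool:
    """Toy shared conversational memory pool: agents contribute
    filtered entries and retrieve those most similar to a query.
    Similarity here is plain word overlap, standing in for the
    paper's learned retrieval mediator."""

    def __init__(self, min_score: float = 0.5):
        self.min_score = min_score
        self.entries: list[MemoryEntry] = []

    def store(self, content: str, score: float) -> bool:
        # Filtering: only memories above the quality threshold are kept.
        if score < self.min_score:
            return False
        self.entries.append(MemoryEntry(content, score))
        return True

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        # Rank stored memories by word-overlap with the query.
        q = set(query.lower().split())
        ranked = sorted(
            self.entries,
            key=lambda e: len(q & set(e.content.lower().split())),
            reverse=True,
        )
        return [e.content for e in ranked[:k]]

pool = SharedMemoryPool(min_score=0.5)
pool.store("sonnets have fourteen lines", score=0.9)   # passes the filter
pool.store("irrelevant chatter", score=0.2)            # filtered out
print(pool.retrieve("how many lines in a sonnet", k=1))
# → ['sonnets have fourteen lines']
```

In the actual framework the pool is updated asynchronously as agents converse, so retrieval quality can improve as the interaction history grows; the fixed threshold above is only a placeholder for that dynamic refinement.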

Hang Gao, Yongfeng Zhang · 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Literary Composition | Limerick 1.0 (test) | F1 Score | 51 | 7 |
| Literary Composition | Sonnet 1.0 (test) | F1 Score | 24 | 7 |
| Logic reasoning | Riddle 1.0 (test) | F1 Score | 69 | 7 |
| Planning | Fitness 1.0 (test) | F1 Score | 23 | 7 |
| Planning | Travel 1.0 (test) | F1 Score | 21 | 7 |
| Logic reasoning | Puzzle 1.0 (test) | F1 Score | 18 | 7 |
| Planning | Study 1.0 (test) | F1 Score | 13 | 7 |
| Logic reasoning | Pun 1.0 (test) | F1 Score | 38 | 7 |
| Literary Composition | Wuyanlvshi 1.0 (test) | F1 Score | 0.00 | 7 |
