
Compress to Impress: Unleashing the Potential of Compressive Memory in Real-World Long-Term Conversations

About

Existing retrieval-based methods have made significant strides in maintaining long-term conversations. However, these approaches face challenges in memory database management and accurate memory retrieval, hindering their efficacy in dynamic, real-world interactions. This study introduces a novel framework, COmpressive Memory-Enhanced Dialogue sYstems (COMEDY), which eschews traditional retrieval modules and memory databases. Instead, COMEDY adopts a "One-for-All" approach, utilizing a single language model to manage memory generation, compression, and response generation. Central to this framework is the concept of compressive memory, which integrates session-specific summaries, user-bot dynamics, and past events into a concise memory format. To support COMEDY, we curated a large-scale Chinese instruction-tuning dataset, Dolphin, derived from real user-chatbot interactions. Comparative evaluations demonstrate COMEDY's superiority over traditional retrieval-based methods in producing more nuanced and human-like conversational experiences. Our code is available at https://github.com/nuochenpku/COMEDY.
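The "One-for-All" idea above can be sketched as three calls to the same language model: per-session summarization, compression of accumulated memories into one concise record, and response generation conditioned on that record. The prompt templates and the `generate` stub below are illustrative assumptions, not the authors' actual prompts or model.

```python
def generate(prompt: str) -> str:
    """Stand-in for a single instruction-tuned LLM call (hypothetical)."""
    return f"<model output for: {prompt[:40]}...>"

def summarize_session(dialogue: list[str]) -> str:
    # Task 1: turn one session's turns into a session-specific summary.
    return generate("Summarize this session:\n" + "\n".join(dialogue))

def compress_memory(session_summaries: list[str]) -> str:
    # Task 2: merge session summaries, user-bot dynamics, and past events
    # into a single concise "compressive memory" record.
    return generate("Compress these memories into one record:\n"
                    + "\n".join(session_summaries))

def respond(compressive_memory: str, user_message: str) -> str:
    # Task 3: answer conditioned on the compressed memory; no retrieval step
    # or external memory database is involved.
    return generate(f"Memory: {compressive_memory}\n"
                    f"User: {user_message}\nReply:")

# Usage: the same model serves all three roles.
summaries = [summarize_session(["User: I adopted a cat.", "Bot: Congrats!"])]
memory = compress_memory(summaries)
reply = respond(memory, "What pet do I have?")
```

The point of the sketch is the control flow, not the stub: replacing retrieval with a single compressed memory string means the response step needs no database lookup.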

Nuo Chen, Hongguang Li, Juhua Huang, Baoyuan Wang, Jia Li• 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Role-playing | RPGBench Dialogue Shift (Generalization) | Turn Composition: -0.565 | 18 |
| Role-playing | RPGBench Character Shift (Generalization) | Deviation Score (Literature): -0.564 | 18 |
| Role-playing | RPGBench User Shift (Generalization) | RP Score (German): -0.125 | 18 |
| Role-playing | RPGBench Aggregate (Overall) | Avg Score: -0.253 | 18 |
| Role-playing | RPGBench In-distribution | R-EMI: -0.287 | 18 |
