
RaSeRec: Retrieval-Augmented Sequential Recommendation

About

Although prevailing supervised and self-supervised learning augmented sequential recommendation (SeRec) models have achieved improved performance with powerful neural network architectures, we argue that they still suffer from two limitations: (1) Preference Drift, where models trained on past data can hardly accommodate evolving user preferences; and (2) Implicit Memory, where head patterns dominate parametric learning, making long-tail patterns harder to recall. In this work, we explore retrieval augmentation in SeRec to address these limitations. Specifically, we propose a Retrieval-Augmented Sequential Recommendation framework, named RaSeRec, whose main idea is to maintain a dynamic memory bank to accommodate preference drift and to retrieve relevant memories to explicitly augment user modeling. It consists of two stages: (i) collaborative-based pre-training, which learns to recommend and retrieve; and (ii) retrieval-augmented fine-tuning, which learns to leverage retrieved memories. Extensive experiments on three datasets demonstrate the superiority and effectiveness of RaSeRec. The implementation code is available at https://github.com/HITsz-TMG/RaSeRec.
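The core mechanism described above, retrieving similar past sub-sequences from a memory bank and fusing them into the current user representation, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual architecture: the memory bank contents, the cosine-similarity retrieval, the softmax readout, and the fusion weight `alpha` are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16  # embedding dimension (illustrative)

# Hypothetical memory bank: keys are embeddings of past user sub-sequences,
# values are embeddings associated with what the user consumed next.
memory_keys = rng.normal(size=(100, d))
memory_values = rng.normal(size=(100, d))

def retrieve_and_augment(query, k=5, alpha=0.5):
    """Retrieve the k most similar memories and mix them into the query.

    query: embedding of the current user sequence, shape (d,).
    Returns the retrieval-augmented user embedding, shape (d,).
    """
    # Cosine similarity between the query and every memory key.
    sims = memory_keys @ query / (
        np.linalg.norm(memory_keys, axis=1) * np.linalg.norm(query) + 1e-8
    )
    top = np.argsort(sims)[-k:]                            # indices of top-k memories
    weights = np.exp(sims[top]) / np.exp(sims[top]).sum()  # softmax over top-k scores
    retrieved = weights @ memory_values[top]               # weighted memory readout
    return alpha * query + (1 - alpha) * retrieved         # fuse query with memories

augmented = retrieve_and_augment(rng.normal(size=d))
assert augmented.shape == (d,)
```

Because the memory bank is an explicit, updatable store rather than model weights, new interactions can be appended to it to track preference drift without retraining the backbone.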

Xinping Zhao, Baotian Hu, Yan Zhong, Shouzheng Huang, Zihao Zheng, Meng Wang, Haofen Wang, Min Zhang • 2024

Related benchmarks

Task                      | Dataset         | Metric        | Result | Rank
Sequential Recommendation | ML 1M           | NDCG@10       | 0.1646 | 130
Sequential Recommendation | Beauty          | Hit Rate @ 20 | 0.1221 | 43
Sequential Recommendation | Sports          | HR@5          | 3.15   | 39
Sequential Recommendation | Beauty (test)   | NDCG@5        | 3.69   | 36
Sequential Recommendation | Sports (test)   | HR@5          | 3.31   | 26
Sequential Recommendation | OFFICE          | NDCG@5        | 4.29   | 22
Sequential Recommendation | Home            | HR@5          | 1.96   | 14
Sequential Recommendation | Clothing (test) | HR@5          | 0.0194 | 13
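For reference, HR@K and NDCG@K in next-item recommendation are typically computed per test user from the rank of the single ground-truth item in the model's ranked list, then averaged over users. The sketch below shows these standard single-target definitions; it is a generic illustration, not the evaluation code of this paper, and note that some leaderboards report these values scaled (e.g., as percentages).

```python
import math

def hit_rate_at_k(rank, k):
    # rank: 1-based position of the ground-truth item in the ranked list.
    # HR@K is 1 if the item appears in the top K, else 0.
    return 1.0 if rank <= k else 0.0

def ndcg_at_k(rank, k):
    # With a single relevant item, DCG = 1 / log2(rank + 1) and the
    # ideal DCG is 1 (item at rank 1), so NDCG@K reduces to this form.
    return 1.0 / math.log2(rank + 1) if rank <= k else 0.0

assert hit_rate_at_k(3, 5) == 1.0   # hit: item ranked 3rd, K = 5
assert hit_rate_at_k(7, 5) == 0.0   # miss: item ranked 7th, K = 5
assert abs(ndcg_at_k(1, 10) - 1.0) < 1e-9  # perfect rank gives NDCG = 1
```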

Other info

Code: https://github.com/HITsz-TMG/RaSeRec