
Continual Collaborative Distillation for Recommender System

About

Knowledge distillation (KD) has emerged as a promising technique for addressing the computational challenges associated with deploying large-scale recommender systems. KD transfers the knowledge of a massive teacher system to a compact student model, reducing the heavy computational burden of inference while retaining high accuracy. Existing KD studies primarily focus on one-time distillation in static environments, leaving a substantial gap in their applicability to real-world scenarios with continuously incoming users, items, and interactions. In this work, we present a systematic approach to operating teacher-student KD over a non-stationary data stream. Our goal is to enable efficient deployment through a compact student that preserves the high performance of the massive teacher while effectively adapting to continuously incoming data. We propose the Continual Collaborative Distillation (CCD) framework, in which the teacher and the student continually and collaboratively evolve along the data stream. CCD enables the student to adapt effectively to new data, while also enabling the teacher to fully leverage accumulated knowledge. We validate the effectiveness of CCD through extensive quantitative, ablative, and exploratory experiments on two real-world datasets. We expect this research direction to help narrow the gap between existing KD studies and practical applications, thereby enhancing the applicability of KD in real-world systems.
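CCD's exact training objective is defined in the paper; as a generic illustration of the teacher-to-student transfer step the abstract describes, here is a minimal sketch of a standard distillation loss for a recommender, where the student is trained to match the teacher's temperature-softened item-score distribution per user. The function names and the choice of KL divergence are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(scores, temp=1.0):
    """Row-wise softmax over item scores, softened by a temperature."""
    z = scores / temp
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(teacher_scores, student_scores, temp=2.0):
    """Mean KL divergence KL(teacher || student) between softened
    item-score distributions; one row of scores per user.
    Illustrative sketch only -- not the CCD objective from the paper."""
    p = softmax(teacher_scores, temp)  # teacher soft targets
    q = softmax(student_scores, temp)  # student predictions
    eps = 1e-12                        # avoid log(0)
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))
```

A student whose scores match the teacher's incurs (near-)zero loss; the loss grows as the student's ranking of items diverges from the teacher's.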

Gyuseok Lee, SeongKu Kang, Wonbin Kweon, Hwanjo Yu • 2024

Related benchmarks

Task                  | Dataset            | Metric      | Result | Rank
Recommendation        | Gowalla            | -           | -      | 153
Recommendation        | RetailRocket       | Hit Rate@10 | 47.1   | 35
Recommendation        | Yelp               | NDCG@10     | 0.112  | 32
Recommendation        | MovieLens 20M      | NDCG@10     | 31.7   | 29
Top-K Recommendation  | Amazon Books       | NDCG@10     | 0.117  | 23
Top-K Recommendation  | Amazon Electronics | NDCG@10     | 13.4   | 23
Top-K Recommendation  | Taobao             | NDCG@10     | 0.22   | 23
Top-K Recommendation  | H&M                | NDCG@10     | 26.7   | 23
