
Sparse Contrastive Learning for Content-Based Cold Item Recommendation

About

Item cold-start is a pervasive challenge for collaborative filtering (CF) recommender systems. Existing methods often train cold-start models by mapping auxiliary item content, such as images or text descriptions, into the embedding space of a CF model. However, such approaches can be limited by the fundamental information gap between CF signals and content features. In this work, we propose to avoid this limitation with purely content-based modeling of cold items, i.e. without alignment with CF user or item embeddings. We instead frame cold-start prediction in terms of item-item similarity, training a content encoder to project into a latent space where similarity correlates with user preferences. We define our training objective as a sparse generalization of sampled softmax loss with the $\alpha$-entmax family of activation functions, which allows for sharper estimation of item relevance by zeroing gradients for uninformative negatives. We then describe how this Sampled Entmax for Cold-start (SEMCo) training regime can be extended via knowledge distillation, and show that it outperforms existing cold-start methods and standard sampled softmax in ranking accuracy. We also discuss the advantages of purely content-based modeling, particularly in terms of equity of item outcomes.
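The key property of the $\alpha$-entmax family the abstract relies on is that, unlike softmax, it assigns exactly zero probability to low-scoring items, so those negatives receive exactly zero gradient. Below is a minimal NumPy sketch of the $\alpha = 2$ member of the family (sparsemax, per Martins & Astudillo, 2016) illustrating that property; the paper's actual SEMCo objective, sampling scheme, and choice of $\alpha$ are not reproduced here.

```python
# Sketch of the sparse activation underlying an alpha-entmax style loss.
# Assumption: alpha = 2 (sparsemax), one member of the alpha-entmax family;
# logits and the one-hot target below are illustrative, not from the paper.
import numpy as np

def sparsemax(z: np.ndarray) -> np.ndarray:
    """Project logits z onto the probability simplex.
    Unlike softmax, entries below a data-dependent threshold get exactly 0."""
    z_sorted = np.sort(z)[::-1]                # logits in descending order
    cumsum = np.cumsum(z_sorted)
    ks = np.arange(1, len(z) + 1)
    support = 1 + ks * z_sorted > cumsum       # which entries stay in the support
    k = ks[support][-1]                        # support size
    tau = (cumsum[k - 1] - 1) / k              # threshold
    return np.maximum(z - tau, 0.0)

# Scores of one positive item (index 0) against two sampled negatives:
# a hard negative (index 1) and an uninformative one (index 2).
logits = np.array([1.0, 0.9, 0.1])
p = sparsemax(logits)
print(p)  # the uninformative negative gets probability exactly 0

# The gradient of the matching Fenchel-Young loss w.r.t. the logits is p - onehot,
# so negatives outside the support (p == 0) contribute exactly zero gradient.
onehot = np.array([1.0, 0.0, 0.0])
grad = p - onehot
print(grad)
```

Here only the hard negative receives a gradient signal; with softmax every sampled negative, however irrelevant, would get a nonzero one.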

Gregor Meehan, Johan Pauwels • 2026

Related benchmarks

Task                       Dataset                  Metric     Result  Rank
Cold-start recommendation  Clothing Cold (test)     Recall@20  14.86   13
Cold-start recommendation  Electronics Cold (test)  Recall@20   5.26   13
Cold-start recommendation  M4A-Onion Cold (test)    Recall@20   9.1    13
Cold-start recommendation  Microlens Cold (test)    Recall@20  14.55   13
