
Tuning-Free LLM Can Build A Strong Recommender Under Sparse Connectivity And Knowledge Gap Via Extracting Intent

About

Recent advances in recommendation with large language models (LLMs) often rely on either commonsense augmentation at the item-category level or implicit intent modeling on existing knowledge graphs. However, such approaches struggle to capture grounded user intents and to handle sparsity and cold-start scenarios. In this work, we present LLM-based Intent Knowledge Graph Recommender (IKGR), a novel framework that constructs an intent-centric knowledge graph where both users and items are explicitly linked to intent nodes extracted by a tuning-free, RAG-guided LLM pipeline. By grounding intents in external knowledge sources and user profiles, IKGR canonically represents what a user seeks and what an item satisfies as first-class entities. To alleviate sparsity, we further introduce a mutual-intent connectivity densification strategy, which shortens semantic paths between users and long-tail items without requiring cross-graph fusion. Finally, a lightweight GNN layer is employed on top of the intent-enhanced graph to produce recommendation signals with low latency. Extensive experiments on public and enterprise datasets demonstrate that IKGR consistently outperforms strong baselines, particularly on cold-start and long-tail slices, while remaining efficient through a fully offline LLM pipeline.
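To make the intent-graph idea concrete, here is a minimal, hypothetical sketch of the two graph-side steps the abstract describes: linking users and items to extracted intent nodes, then densifying connectivity between users and long-tail items that share an intent. All node names, data structures, and the densification rule are illustrative assumptions, not the authors' implementation; the LLM extraction step is assumed to have already run offline.

```python
# Hypothetical sketch of an intent-centric graph plus mutual-intent
# densification (illustrative only; not the IKGR implementation).
from collections import defaultdict

# Intent nodes assumed to be extracted offline by the tuning-free,
# RAG-guided LLM pipeline described in the abstract.
user_intents = {"u1": {"budget_skincare"}, "u2": {"indie_games"}}
item_intents = {
    "i_moisturizer": {"budget_skincare"},
    "i_platformer": {"indie_games"},
    "i_longtail_serum": {"budget_skincare"},  # a sparse, long-tail item
}

def build_intent_graph(user_intents, item_intents):
    """Link users and items to shared intent nodes (intents as first-class entities)."""
    graph = defaultdict(set)
    for u, intents in user_intents.items():
        for t in intents:
            graph[u].add(t)
            graph[t].add(u)
    for i, intents in item_intents.items():
        for t in intents:
            graph[i].add(t)
            graph[t].add(i)
    return graph

def densify_by_mutual_intent(graph, user_intents, item_intents):
    """Add a direct user-item edge whenever the two share an intent,
    shortening the semantic path to long-tail items."""
    for u, uts in user_intents.items():
        for i, its in item_intents.items():
            if uts & its:
                graph[u].add(i)
                graph[i].add(u)
    return graph

g = build_intent_graph(user_intents, item_intents)
g = densify_by_mutual_intent(g, user_intents, item_intents)
print("i_longtail_serum" in g["u1"])  # True: one hop after densification
```

In a full system, the densified adjacency would then feed the lightweight GNN layer mentioned in the abstract for message passing over user, item, and intent nodes.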

Wenqing Zheng, Noah Fatsi, Daniel Barcklow, Dmitri Kalaev, Steven Yao, Owen Reinert, C. Bayan Bruss, Daniele Rosa• 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Recommendation | Beauty | NDCG@5 | 28.06 | 48 |
| Recommendation | Search | HR@1 | 0.0086 | 8 |
| Recommendation | Books | HR@1 | 12.51 | 8 |
| Recommendation | Steam | HR@1 | 10.95 | 8 |
| Recommendation | Search | p-value | 0.021 | 1 |
| Recommendation | Beauty | p-value | 0.038 | 1 |
| Recommendation | Books | p-value | 0.034 | 1 |
| Recommendation | Steam | p-value | 0.018 | 1 |
| Recommendation | Yelp 2022 | p-value | 0.061 | 1 |
