Tuning-Free LLM Can Build A Strong Recommender Under Sparse Connectivity And Knowledge Gap Via Extracting Intent
About
Recent advances in recommendation with large language models (LLMs) often rely on either commonsense augmentation at the item-category level or implicit intent modeling on existing knowledge graphs. However, such approaches struggle to capture grounded user intents and to handle sparsity and cold-start scenarios. In this work, we present LLM-based Intent Knowledge Graph Recommender (IKGR), a novel framework that constructs an intent-centric knowledge graph where both users and items are explicitly linked to intent nodes extracted by a tuning-free, RAG-guided LLM pipeline. By grounding intents in external knowledge sources and user profiles, IKGR canonically represents what a user seeks and what an item satisfies as first-class entities. To alleviate sparsity, we further introduce a mutual-intent connectivity densification strategy, which shortens semantic paths between users and long-tail items without requiring cross-graph fusion. Finally, a lightweight GNN layer is employed on top of the intent-enhanced graph to produce recommendation signals with low latency. Extensive experiments on public and enterprise datasets demonstrate that IKGR consistently outperforms strong baselines, particularly on cold-start and long-tail slices, while remaining efficient through a fully offline LLM pipeline.
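The core idea described above can be illustrated with a minimal, self-contained sketch (this is not the authors' code): users and items both link to explicit intent nodes extracted offline, and a single lightweight message-passing step over this intent-centric graph produces recommendation scores. All names, the toy intent sets, and the embeddings are illustrative assumptions; in IKGR the intents would come from the RAG-guided LLM pipeline and the embeddings would be learned.

```python
# Hypothetical sketch of an intent-centric graph: users and items are both
# linked to shared intent nodes. Intent sets stand in for the output of an
# offline LLM extraction pipeline; embeddings are fixed toys, not learned.
user_intents = {"u1": {"moisturizing", "fragrance-free"}}
item_intents = {
    "i1": {"moisturizing"},                    # shares one intent with u1
    "i2": {"fragrance-free", "moisturizing"},  # shares both intents
    "i3": {"matte-finish"},                    # shares none (long-tail for u1)
}

intent_emb = {
    "moisturizing":   [1.0, 0.0, 0.0],
    "fragrance-free": [0.0, 1.0, 0.0],
    "matte-finish":   [0.0, 0.0, 1.0],
}

def aggregate(intents):
    """One GNN-style step: sum-pool the embeddings of neighboring intent nodes."""
    out = [0.0, 0.0, 0.0]
    for i in intents:
        out = [a + b for a, b in zip(out, intent_emb[i])]
    return out

def score(user, item):
    """Dot product of intent-aggregated user and item representations."""
    u, v = aggregate(user_intents[user]), aggregate(item_intents[item])
    return sum(a * b for a, b in zip(u, v))

ranked = sorted(item_intents, key=lambda i: score("u1", i), reverse=True)
print(ranked)  # prints ['i2', 'i1', 'i3']
```

Because both endpoints attach to the same intent nodes, an item sharing more intents with a user sits on shorter semantic paths and scores higher; this is the mechanism the densification strategy exploits for long-tail items.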
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Recommendation | Beauty | NDCG@5 | 28.06 | 48 |
| Recommendation | Search | HR@1 | 0.0086 | 8 |
| Recommendation | Books | HR@1 | 12.51 | 8 |
| Recommendation | Steam | HR@1 | 10.95 | 8 |
| Recommendation | Search | p-value | 0.021 | 1 |
| Recommendation | Beauty | p-value | 0.038 | 1 |
| Recommendation | Books | p-value | 0.034 | 1 |
| Recommendation | Steam | p-value | 0.018 | 1 |
| Recommendation | Yelp 2022 | p-value | 0.061 | 1 |