
Unleash the Potential of Long Semantic IDs for Generative Recommendation

About

Semantic ID-based generative recommendation represents items as sequences of discrete tokens, but it inherently faces a trade-off between representational expressiveness and computational efficiency. Residual Quantization (RQ)-based approaches restrict semantic IDs to be short to enable tractable sequential modeling, while Optimized Product Quantization (OPQ)-based methods compress long semantic IDs through naive rigid aggregation, inevitably discarding fine-grained semantic information. To resolve this dilemma, we propose ACERec, a novel framework that decouples the granularity gap between fine-grained tokenization and efficient sequential modeling. It employs an Attentive Token Merger to distill long expressive semantic tokens into compact latents and introduces a dedicated Intent Token serving as a dynamic prediction anchor. To capture cohesive user intents, we guide the learning process via a dual-granularity objective, harmonizing fine-grained token prediction with global item-level semantic alignment. Extensive experiments on six real-world benchmarks demonstrate that ACERec consistently outperforms state-of-the-art baselines, achieving an average improvement of 14.40% in NDCG@10, effectively reconciling semantic expressiveness and computational efficiency.
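The abstract describes an Attentive Token Merger that distills a long semantic-ID token sequence into a few compact latents, with an extra Intent Token acting as a prediction anchor. The paper's actual architecture is not given here, but the general mechanism reads like cross-attention pooling with learned latent queries. A minimal numpy sketch under that assumption (all names, dimensions, and the random initialization are illustrative, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_token_merge(token_embs, num_latents=4, seed=0):
    """Cross-attention pooling: a small set of latent queries, plus one
    extra 'intent' query, attends over a long sequence of semantic-token
    embeddings and distills it into compact latent vectors.

    token_embs: (L, d) embeddings of the L tokens in one item's semantic ID.
    Returns (num_latents, d) merged latents and a (d,) intent vector.
    """
    L, d = token_embs.shape
    rng = np.random.default_rng(seed)
    # In a real model these would be learned; here they are random for illustration.
    queries = rng.normal(size=(num_latents + 1, d)) / np.sqrt(d)  # last row = intent query
    W_k = rng.normal(size=(d, d)) / np.sqrt(d)
    W_v = rng.normal(size=(d, d)) / np.sqrt(d)

    K, V = token_embs @ W_k, token_embs @ W_v
    attn = softmax(queries @ K.T / np.sqrt(d))  # (num_latents + 1, L)
    merged = attn @ V                           # each query pools the full sequence
    return merged[:-1], merged[-1]              # (compact latents, intent token)

# A 32-token semantic ID compressed to 4 latents plus 1 intent anchor,
# so the sequence model sees 5 vectors per item instead of 32 tokens.
tokens = np.random.default_rng(1).normal(size=(32, 16))
latents, intent = attentive_token_merge(tokens, num_latents=4)
print(latents.shape, intent.shape)  # (4, 16) (16,)
```

The efficiency argument in the abstract follows from this shape change: sequence-model cost scales with the number of vectors per item, so pooling 32 tokens into 5 latents shortens the modeled sequence while the long tokenization keeps its fine granularity upstream.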

Ming Xia, Zhiqin Zhou, Guoxin Ma, Dongmin Huang • 2026

Related benchmarks

Task | Dataset | Metric | Result | Rank
Sequential Recommendation | Sports | Recall@5 | 0.0341 | 43
Sequential Recommendation | Toys | Recall@5 | 0.0688 | 31
Sequential Recommendation | Beauty | HR@10 | 8.41 | 30
Sequential Recommendation | Instruments | HR@5 | 8.19 | 20
Sequential Recommendation | Office Amazon (test) | R@5 | 6.83 | 10
