Graph Transformer for Recommendation
About
This paper presents a novel approach to representation learning in recommender systems by integrating generative self-supervised learning with a graph transformer architecture. We highlight the importance of high-quality data augmentation with relevant self-supervised pretext tasks for improving performance. Toward this end, we propose a new approach that automates the self-supervision augmentation process through rationale-aware generative SSL that distills informative user-item interaction patterns. Our proposed recommender with Graph TransFormer (GFormer) offers parameterized collaborative rationale discovery for selective augmentation while preserving global-aware user-item relationships. In GFormer, the rationale-aware SSL guides graph collaborative filtering with task-adaptive invariant rationalization in the graph transformer. Experimental results show that GFormer consistently outperforms baselines on different datasets. Several in-depth experiments further investigate the invariant rationale-aware augmentation from various aspects. The source code for this work is publicly available at: https://github.com/HKUDS/GFormer.
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Recommendation | Epinions (test) | Recall@20 | 6.04 | 33 |
| Recommendation | Ali-Display (test) | NDCG@20 | 0.5781 | 17 |
| Top-K Recommendation | Douban-Book (test) | Recall@10 | 0.093 | 14 |
| Top-K Recommendation | Yelp (test) | Recall@10 | 0.0651 | 14 |
| Recommendation | Amazon-CD (test) | Hit Rate@10 | 73.21 | 8 |
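The benchmarks above report standard top-K ranking metrics. As a point of reference, a minimal sketch of how Recall@K and NDCG@K are typically computed per user is shown below (this is a generic illustration, not code from the GFormer repository; function names and the binary-relevance assumption are ours):

```python
import math

def recall_at_k(recommended, relevant, k):
    """Fraction of a user's relevant items that appear in the top-k list."""
    if not relevant:
        return 0.0
    hits = len(set(recommended[:k]) & set(relevant))
    return hits / len(relevant)

def ndcg_at_k(recommended, relevant, k):
    """NDCG@k under binary relevance: discounted gain of hits,
    normalized by the gain of an ideal ranking."""
    relevant = set(relevant)
    dcg = sum(1.0 / math.log2(i + 2)
              for i, item in enumerate(recommended[:k]) if item in relevant)
    ideal = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant), k)))
    return dcg / ideal if ideal > 0 else 0.0
```

In leaderboard evaluation these per-user scores are averaged over all test users; Hit Rate@K is the analogous binary indicator of whether any relevant item appears in the top K.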