
Graph Transformer for Recommendation

About

This paper presents a novel approach to representation learning in recommender systems by integrating generative self-supervised learning with a graph transformer architecture. We highlight the importance of high-quality data augmentation with relevant self-supervised pretext tasks for improving performance. To this end, we propose a new approach that automates the self-supervision augmentation process through rationale-aware generative SSL that distills informative user-item interaction patterns. The proposed recommender with Graph Transformer (GFormer) offers parameterized collaborative rationale discovery for selective augmentation while preserving global-aware user-item relationships. In GFormer, the rationale-aware SSL guides graph collaborative filtering with task-adaptive invariant rationalization in the graph transformer. The experimental results reveal that GFormer consistently improves performance over baselines on different datasets. Several in-depth experiments further investigate the invariant rationale-aware augmentation from various aspects. The source code for this work is publicly available at: https://github.com/HKUDS/GFormer.
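To make the idea of "collaborative rationale discovery for selective augmentation" concrete, here is a minimal, hypothetical sketch (not the authors' implementation): each observed user-item edge is scored by embedding affinity, and only the highest-scoring fraction is kept as the rationale subgraph used for self-supervised augmentation. The function name, scoring rule, and `keep_ratio` parameter are illustrative assumptions.

```python
import numpy as np

def rationale_augment(adj, user_emb, item_emb, keep_ratio=0.5):
    """Toy rationale-aware augmentation: score each observed user-item
    edge by the dot product of its endpoint embeddings, then keep only
    the top `keep_ratio` fraction of edges as the rationale subgraph.
    This is an illustrative stand-in for GFormer's learned rationale
    discovery, not the paper's actual mechanism."""
    users, items = np.nonzero(adj)                       # observed interactions
    scores = np.einsum("ij,ij->i", user_emb[users], item_emb[items])
    k = max(1, int(keep_ratio * len(scores)))
    keep = np.argsort(-scores)[:k]                       # highest-rationale edges
    aug = np.zeros_like(adj)
    aug[users[keep], items[keep]] = 1.0                  # selective augmentation view
    return aug

# toy interaction matrix: 4 users x 5 items
rng = np.random.default_rng(0)
adj = (rng.random((4, 5)) > 0.5).astype(float)
u = np.random.default_rng(1).normal(size=(4, 8))
v = np.random.default_rng(2).normal(size=(5, 8))
aug = rationale_augment(adj, u, v, keep_ratio=0.5)
```

In the actual model the edge scores come from a learned, task-adaptive rationale network rather than raw embedding dot products, so the selected subgraph adapts to the recommendation objective.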

Chaoliu Li, Lianghao Xia, Xubin Ren, Yaowen Ye, Yong Xu, Chao Huang• 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Recommendation | Epinions (test) | Recall@20 | 6.04 | 33 |
| Recommendation | Ali-Display (test) | NDCG@20 | 0.5781 | 17 |
| Top-K Recommendation | Douban-Book (test) | Recall@10 | 0.093 | 14 |
| Top-K Recommendation | Yelp (test) | Recall@10 | 0.0651 | 14 |
| Recommendation | Amazon-CD (test) | Hit Rate@10 | 73.21 | 8 |
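The benchmarks above report Recall@K and NDCG@K, the standard top-K recommendation metrics. As a reference, here is a minimal sketch of both with binary relevance; the variable names and the toy ranking are illustrative, not taken from the paper's evaluation code.

```python
import numpy as np

def recall_at_k(ranked_items, relevant, k):
    """Fraction of a user's relevant items that appear in the top-k list."""
    hits = len(set(ranked_items[:k]) & set(relevant))
    return hits / len(relevant)

def ndcg_at_k(ranked_items, relevant, k):
    """Normalized discounted cumulative gain with binary relevance:
    each hit at rank position i (0-based) contributes 1/log2(i + 2),
    normalized by the best achievable DCG for this user."""
    dcg = sum(1.0 / np.log2(i + 2)
              for i, item in enumerate(ranked_items[:k]) if item in relevant)
    ideal = sum(1.0 / np.log2(i + 2) for i in range(min(len(relevant), k)))
    return dcg / ideal

ranked = [3, 1, 7, 2, 9]   # a model's top-5 item ids for one user (toy data)
relevant = {1, 2}          # that user's held-out test interactions
r = recall_at_k(ranked, relevant, 5)   # both relevant items retrieved -> 1.0
n = ndcg_at_k(ranked, relevant, 5)     # < 1.0 because hits are not ranked first
```

Metrics are averaged over all test users; differences in K and in dataset splits are why the absolute numbers in the table are not comparable across rows.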
