
Introducing Self-Attention to Target Attentive Graph Neural Networks

About

Session-based recommendation systems suggest relevant items to users by modeling user behavior and preferences from short-term anonymous sessions. Existing methods leverage Graph Neural Networks (GNNs) that propagate and aggregate information from neighboring nodes, i.e., local message passing. Such graph-based architectures have representational limits: a single sub-graph is prone to overfitting to sequential dependencies instead of accounting for complex transitions between items across different sessions. We propose a new technique that combines a Transformer with a target attentive GNN. This allows richer representations to be learnt, which translates to empirical performance gains over a vanilla target attentive GNN. Our experimental results and ablations show that the proposed method is competitive with existing methods on real-world benchmark datasets, improving over purely graph-based approaches. Code is available at https://github.com/The-Learning-Machines/SBR
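To illustrate the two attention stages the abstract describes, here is a minimal numpy sketch: self-attention contextualizes the GNN's item embeddings with global (session-wide) dependencies, and a target-attentive readout then builds a session representation conditioned on each candidate item. All function names, shapes, and the single-head, weight-free formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H):
    # H: (n_items, d) item embeddings produced by the GNN for one session.
    # Scaled dot-product self-attention captures dependencies between all
    # item pairs, not just graph neighbors.
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)        # (n_items, n_items) attention scores
    return softmax(scores, axis=-1) @ H  # contextualized embeddings

def target_attentive_readout(H, target):
    # Weight each session item by its relevance to a candidate (target)
    # item, then pool into a target-specific session embedding.
    weights = softmax(H @ target)  # (n_items,)
    return weights @ H             # (d,)

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))     # hypothetical session: 5 items, dim 8
target = rng.normal(size=(8,))  # hypothetical candidate item embedding

H_ctx = self_attention(H)                    # Transformer-style stage
s = target_attentive_readout(H_ctx, target)  # target-attentive stage
score = s @ target                           # relevance score for ranking
```

In practice each stage would carry learned projection weights and multiple heads; the sketch only shows how the self-attention output feeds the target-attentive readout.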

Sai Mitheran, Abhinav Java, Surya Kant Sahu, Arshad Shaikh • 2021

Related benchmarks

Task                          Dataset                Metric  Result  Rank
Session-based recommendation  Yoochoose 1/64 (test)  HR@20   71.91   20
Session-based recommendation  Diginetica (test)      HR@20   51.86   20
