
Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks

About

Factorization Machines (FMs) are a supervised learning approach that enhances the linear regression model by incorporating second-order feature interactions. Despite their effectiveness, FMs can be hindered by modelling all feature interactions with the same weight, as not all feature interactions are equally useful and predictive. For example, interactions with useless features may even introduce noise and degrade performance. In this work, we improve FM by discriminating the importance of different feature interactions. We propose a novel model named Attentional Factorization Machine (AFM), which learns the importance of each feature interaction from data via a neural attention network. Extensive experiments on two real-world datasets demonstrate the effectiveness of AFM. Empirically, on the regression task AFM improves over FM with an $8.6\%$ relative improvement, and it consistently outperforms the state-of-the-art deep learning methods Wide&Deep and DeepCross with a much simpler structure and fewer model parameters. Our implementation of AFM is publicly available at: https://github.com/hexiangnan/attentional_factorization_machine
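The core idea in the abstract can be sketched numerically: form the element-wise product of every pair of feature embeddings, score each pair with a small attention network, softmax-normalize the scores, and pool the interactions with those weights. The sketch below is a minimal NumPy illustration of that attention-weighted pairwise pooling, not the authors' implementation; it omits the global bias and first-order linear terms of the full AFM model, and all parameter shapes and names here are assumptions chosen to mirror the paper's notation.

```python
import numpy as np

def afm_forward(embeddings, W, b, h, p):
    """Attention-weighted pairwise interaction pooling (AFM-style sketch).

    embeddings: (m, k) embedding vectors of the m active (non-zero) features
    W: (t, k), b: (t,), h: (t,)  -- attention network parameters
    p: (k,)  -- projection vector producing the scalar interaction score
    Returns the attentional interaction part of the prediction (a float).
    """
    m, k = embeddings.shape
    # Pair-wise interaction layer: element-wise products of all feature pairs
    pairs = np.stack([embeddings[i] * embeddings[j]
                      for i in range(m) for j in range(i + 1, m)])  # (m*(m-1)/2, k)
    # Attention scores: h^T ReLU(W e_ij + b), softmax-normalized over pairs
    scores = np.maximum(pairs @ W.T + b, 0.0) @ h
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Attention-weighted sum of interactions, projected to a scalar
    pooled = (weights[:, None] * pairs).sum(axis=0)  # (k,)
    return float(p @ pooled)

# Hypothetical usage with random parameters (k=4 embedding dims,
# t=5 attention units, m=3 active features):
rng = np.random.default_rng(0)
y = afm_forward(rng.normal(size=(3, 4)),
                rng.normal(size=(5, 4)), rng.normal(size=5),
                rng.normal(size=5), rng.normal(size=4))
```

Note that when the attention scores are all equal, the softmax weights become uniform and the pooling degenerates to a plain (scaled) sum of pairwise interactions, which is essentially what standard FM computes; the attention network is what lets AFM weight pairs unequally.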

Jun Xiao, Hao Ye, Xiangnan He, Hanwang Zhang, Fei Wu, Tat-Seng Chua · 2017

Related benchmarks

Task | Dataset | Metric | Result | Rank
---- | ------- | ------ | ------ | ----
CTR Prediction | Criteo | AUC | 0.7965 | 282
Click-Through Rate Prediction | Avazu (test) | AUC | 0.7821 | 191
CTR Prediction | Avazu | AUC | 0.7806 | 144
CTR Prediction | Criteo (test) | AUC | 0.8071 | 141
CTR Prediction | Frappe | AUC | 0.9611 | 83
CTR Prediction | MovieLens | AUC | 0.9455 | 55
Click-Through Rate Prediction | Criteo (test) | AUC | 0.7975 | 47
CTR Prediction | Frappe (test) | AUC | 0.9557 | 38
CTR Prediction | KDD12 | AUC | 0.7659 | 28
CTR Prediction | MovieLens (test) | Logloss | 0.278 | 21

Showing 10 of 22 rows.
