
Boost then Convolve: Gradient Boosting Meets Graph Neural Networks

About

Graph neural networks (GNNs) are powerful models that have been successful in various graph representation learning tasks. Meanwhile, gradient boosted decision trees (GBDT) often outperform other machine learning methods on heterogeneous tabular data. But which approach should be used for graphs with tabular node features? Previous GNN models have mostly focused on networks with homogeneous sparse features and, as we show, are suboptimal in the heterogeneous setting. In this work, we propose a novel architecture that trains GBDT and GNN jointly to get the best of both worlds: the GBDT model deals with heterogeneous features, while the GNN accounts for the graph structure. Our model benefits from end-to-end optimization by allowing new trees to fit the gradient updates of the GNN. With an extensive experimental comparison to the leading GBDT and GNN models, we demonstrate a significant increase in performance on a variety of graphs with tabular features. The code is available at https://github.com/nd7141/bgnn.
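The joint training scheme in the abstract can be sketched in miniature: each round, the current GBDT output serves as the GNN's node features, the GNN takes a gradient step on its own weights, and a new tree is fit to the gradient of the GNN's loss with respect to its input features. The sketch below is illustrative only, not the paper's implementation: the "GBDT" is a sequence of depth-1 stumps on a scalar feature, the "GNN" is a single mean-aggregation layer with an affine readout, and all names are hypothetical.

```python
# Toy sketch of the boost-then-convolve loop: (1) GBDT output -> node
# features, (2) gradient step on GNN weights, (3) a new tree fits the
# negative gradient of the GNN loss w.r.t. its input features.
# Everything here (stump "GBDT", one-layer "GNN") is a stand-in.

def fit_stump(x, target):
    """Fit a depth-1 regression tree (stump) to (x, target)."""
    best = None
    for thr in sorted(set(x)):
        left = [t for xi, t in zip(x, target) if xi <= thr]
        right = [t for xi, t in zip(x, target) if xi > thr]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((t - (lm if xi <= thr else rm)) ** 2
                  for xi, t in zip(x, target))
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda xi: lm if xi <= thr else rm

# Tiny path graph with self-loops; scalar tabular feature x, target y.
adj = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2, 3], 3: [2, 3]}
x = [0.0, 1.0, 2.0, 3.0]
y = [0.1, 0.9, 2.1, 2.9]
n = len(x)

trees, lr_tree = [], 0.5        # GBDT ensemble and its shrinkage
w, b, lr_gnn = 1.0, 0.0, 0.05   # "GNN": mean-aggregate, then affine readout

def gbdt(xi):
    return sum(lr_tree * t(xi) for t in trees)

def gnn_forward(f):
    # mean aggregation over neighbours, then a linear readout
    return [w * (sum(f[j] for j in adj[i]) / len(adj[i])) + b
            for i in range(n)]

for _ in range(20):
    f = [gbdt(xi) for xi in x]                 # (1) features from GBDT
    pred = gnn_forward(f)
    # (2) GNN step: d(MSE)/dw and d(MSE)/db
    agg = [sum(f[j] for j in adj[i]) / len(adj[i]) for i in range(n)]
    grad_w = sum(2 * (p - t) * a for p, t, a in zip(pred, y, agg)) / n
    grad_b = sum(2 * (p - t) for p, t in zip(pred, y)) / n
    w -= lr_gnn * grad_w
    b -= lr_gnn * grad_b
    # (3) gradient of the loss w.r.t. each input feature f[j]
    grad_f = [0.0] * n
    for i in range(n):
        for j in adj[i]:
            grad_f[j] += 2 * (pred[i] - y[i]) * w / len(adj[i]) / n
    # the new tree fits the negative gradient, i.e. the update the GNN
    # "asks" of its input features -- this is the end-to-end coupling
    trees.append(fit_stump(x, [-g for g in grad_f]))

final_loss = sum((p - t) ** 2
                 for p, t in zip(gnn_forward([gbdt(xi) for xi in x]), y)) / n
```

The key point the sketch captures is step (3): instead of pretraining the GBDT on the raw targets and freezing it, each new tree is fit to the GNN's backpropagated feature gradients, so the ensemble keeps adapting to what the graph model needs.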

Sergei Ivanov, Liudmila Prokhorenkova • 2021

Related benchmarks

Task                 | Dataset       | Accuracy | Rank
Node Classification  | Citeseer      | 69.1     | 804
Graph Classification | PROTEINS      | 70.5     | 742
Node Classification  | Pubmed        | 59.9     | 742
Graph Classification | MUTAG         | 80.2     | 697
Graph Classification | NCI1          | 70.5     | 460
Node Classification  | Cornell       | 68.2     | 426
Graph Classification | ENZYMES       | 58.1     | 305
Node Classification  | Actor         | 31.1     | 237
Graph Classification | PTC           | 55.5     | 167
Graph Classification | IMDB-B (test) | 68.0     | 134

(Showing 10 of 41 rows)
