
On the Bottleneck of Graph Neural Networks and its Practical Implications

About

Since the proposal of the graph neural network (GNN) by Gori et al. (2005) and Scarselli et al. (2008), one of the major problems in training GNNs has been their struggle to propagate information between distant nodes in the graph. We propose a new explanation for this problem: GNNs are susceptible to a bottleneck when aggregating messages across a long path. This bottleneck causes the over-squashing of exponentially growing information into fixed-size vectors. As a result, GNNs fail to propagate messages originating from distant nodes and perform poorly when the prediction task depends on long-range interaction. In this paper, we highlight the inherent problem of over-squashing in GNNs: we demonstrate that the bottleneck hinders popular GNNs from fitting long-range signals in the training data; we further show that GNNs that absorb incoming edges equally, such as GCN and GIN, are more susceptible to over-squashing than GAT and GGNN; finally, we show that prior work, which extensively tuned GNN models for long-range problems, suffers from over-squashing, and that breaking the bottleneck improves their state-of-the-art results without any tuning or additional weights. Our code is available at https://github.com/tech-srl/bottleneck/.
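The core counting argument behind over-squashing can be made concrete: in an r-layer message-passing GNN, each node's final representation must summarize its entire r-hop neighborhood, yet the hidden vector stays at a fixed dimension. The sketch below (illustrative only; the hidden size and the binary-tree graph are assumptions, not taken from the paper) shows how the receptive field grows exponentially with depth:

```python
# Sketch: why over-squashing happens. An r-layer message-passing GNN must
# compress a node's entire r-hop neighborhood into one fixed-size vector.
# On a binary tree, that neighborhood grows exponentially with r.
# (HIDDEN_DIM is an assumed typical hidden size, not a value from the paper.)

HIDDEN_DIM = 128

def receptive_field_binary_tree(r: int) -> int:
    """Number of nodes within r hops below the root of a full binary tree."""
    return 2 ** (r + 1) - 1  # 1 + 2 + 4 + ... + 2^r

for r in (2, 5, 10):
    n = receptive_field_binary_tree(r)
    print(f"{r}-layer GNN: {n} nodes squashed into a {HIDDEN_DIM}-dim vector")
```

With 10 layers, over two thousand node states compete for the same 128 dimensions, which is the "exponentially growing information into fixed-size vectors" the abstract describes.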

Uri Alon, Eran Yahav • 2020

Related benchmarks

Task                  Dataset    Accuracy  Rank
Graph Classification  PROTEINS   76.8      994
Graph Classification  MUTAG      83.45     862
Node Classification   Chameleon  41.4      640
Node Classification   Squirrel   34.6      591
Graph Classification  NCI1       84.3      501
Graph Classification  COLLAB     75.434    422
Node Classification   Citeseer   42.6      393
Graph Classification  IMDB-B     80.4      378
Graph Classification  ENZYMES    65.1      318
Graph Classification  IMDB-M     41        275

Showing 10 of 64 rows
