
DRew: Dynamically Rewired Message Passing with Delay

About

Message passing neural networks (MPNNs) have been shown to suffer from the phenomenon of over-squashing that causes poor performance for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node's immediate neighbours. Rewiring approaches attempting to make graphs 'more connected', and supposedly better suited to long-range tasks, often lose the inductive bias provided by distance on the graph since they make distant nodes communicate instantly at every layer. In this paper we propose a framework, applicable to any MPNN architecture, that performs a layer-dependent rewiring to ensure gradual densification of the graph. We also propose a delay mechanism that permits skip connections between nodes depending on the layer and their mutual distance. We validate our approach on several long-range tasks and show that it outperforms graph Transformers and multi-hop MPNNs.
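The two ideas above — gradual densification (at layer t, a node hears from nodes up to t+1 hops away) and delay (a message from an r-hop neighbour uses that neighbour's state from an earlier layer) — can be illustrated with a toy sketch. This is a minimal illustration, not the paper's implementation: the mean aggregation, the 0.5/0.5 update rule, and the function names are assumptions made for clarity.

```python
from collections import deque

def bfs_distances(adj, src):
    """Hop distance from src to every reachable node (plain BFS)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def drew_forward(adj, x0, num_layers, nu=1):
    """Toy DRew-style propagation over scalar node states.

    At layer t (0-indexed), node i aggregates messages from nodes j with
    graph distance d(i, j) <= t + 1 (gradual densification). A message
    from a node at distance r uses that node's state from layer
    t - nu * (r - 1), clamped at 0 (the delay / skip-connection idea).
    Aggregation and update rules here are illustrative placeholders.
    """
    n = len(adj)
    dist = [bfs_distances(adj, i) for i in range(n)]
    history = [list(x0)]            # history[t][i] = state of node i at layer t
    x = list(x0)
    for t in range(num_layers):
        new_x = []
        for i in range(n):
            msgs = []
            for j, r in dist[i].items():
                if j != i and r <= t + 1:
                    t_src = max(t - nu * (r - 1), 0)   # delayed source state
                    msgs.append(history[t_src][j])
            agg = sum(msgs) / len(msgs) if msgs else 0.0
            new_x.append(0.5 * x[i] + 0.5 * agg)       # simple convex update
        x = new_x
        history.append(list(x))
    return x
```

On a 4-node path graph 0–1–2–3 with a unit signal on node 0, two layers of this scheme leave node 3 untouched (it is 3 hops away, beyond the t+1 = 2 radius), while nodes within range mix in delayed states — showing how the rewiring densifies gradually instead of connecting all pairs at every layer.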

Benjamin Gutteridge, Xiaowen Dong, Michael Bronstein, Francesco Di Giovanni • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Graph Regression | Peptides-struct LRGB (test) | MAE | 0.2536 | 178 |
| Molecular Property Prediction | QM9 (test) | mu | 1.93 | 174 |
| Graph Classification | Peptides-func LRGB (test) | AP | 0.715 | 136 |
| Graph Regression | Peptides-struct (test) | MAE | 0.2536 | 84 |
| Node Classification | PascalVOC-SP LRGB (test) | F1 Score | 33.14 | 51 |
| Link Prediction | PCQM-Contact LRGB (test) | -- | -- | 33 |
| Multilabel Graph Classification | Peptides-func LRGB (test) | AP | 71.5 | 30 |
| Graph Classification | Peptides-func LRGB | AP | 71.5 | 29 |
| Graph Regression | Peptides-struct LRGB | MAE | 0.2536 | 29 |
| Link Prediction | PCQM-Contact LRGB | MRR | 34.44 | 17 |

Showing 10 of 17 rows.

Other info

Code
