DRew: Dynamically Rewired Message Passing with Delay
About
Message passing neural networks (MPNNs) have been shown to suffer from the phenomenon of over-squashing that causes poor performance for tasks relying on long-range interactions. This can be largely attributed to message passing only occurring locally, over a node's immediate neighbours. Rewiring approaches attempting to make graphs 'more connected', and supposedly better suited to long-range tasks, often lose the inductive bias provided by distance on the graph since they make distant nodes communicate instantly at every layer. In this paper we propose a framework, applicable to any MPNN architecture, that performs a layer-dependent rewiring to ensure gradual densification of the graph. We also propose a delay mechanism that permits skip connections between nodes depending on the layer and their mutual distance. We validate our approach on several long-range tasks and show that it outperforms graph Transformers and multi-hop MPNNs.
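The two ideas above can be illustrated concretely. Below is a minimal, framework-free sketch of one layer of dynamically rewired message passing with delay: at layer ℓ a node aggregates from nodes up to a layer-dependent distance (gradual densification), and a distance-k neighbour contributes a *delayed* state from an earlier layer, preserving the inductive bias of distance. The specific rules here (distance cutoff `layer + 1`, delay of `k - 1` layers, mean aggregation with a residual update) are illustrative assumptions, not the paper's exact parameterisation, and `drew_layer` is a hypothetical helper name.

```python
from collections import deque


def bfs_distances(adj, src):
    """Shortest-path (hop) distance from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist


def drew_layer(adj, histories, layer):
    """One illustrative DRew-style layer.

    adj:       dict node -> list of neighbours
    histories: dict node -> list of that node's states, one per past layer
               (scalars here; vectors in a real MPNN)
    layer:     current layer index ell

    At layer ell, node i aggregates from every node j at hop distance
    k <= ell + 1 (the graph densifies as depth grows), but reads j's
    state from layer ell - (k - 1): farther neighbours contribute
    older states, which is the delay mechanism.
    """
    new_states = {}
    for i in adj:
        dist = bfs_distances(adj, i)
        msgs = [
            histories[j][layer - (k - 1)]
            for j, k in dist.items()
            if j != i and k <= layer + 1
        ]
        agg = sum(msgs) / len(msgs) if msgs else 0.0
        # Simple residual update standing in for a learned update function.
        new_states[i] = histories[i][layer] + agg
    return new_states
```

To run several layers, append each node's new state to its history after every call, so that delayed reads at deeper layers can reach back to the correct earlier representations. At layer 0 this reduces to ordinary 1-hop message passing; only from layer 1 onward do multi-hop (and delayed) messages appear.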
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Graph Regression | Peptides-struct LRGB (test) | MAE 0.2536 | 178 |
| Molecular property prediction | QM9 (test) | mu 1.93 | 174 |
| Graph Classification | Peptides-func LRGB (test) | AP 0.715 | 136 |
| Graph Regression | Peptides-struct (test) | MAE 0.2536 | 84 |
| Node Classification | PascalVOC-SP LRGB (test) | F1 Score 33.14 | 51 |
| Link Prediction | PCQM-Contact LRGB (test) | -- | 33 |
| Multilabel Graph Classification | Peptides-func LRGB (test) | AP 71.5 | 30 |
| Graph Classification | Peptides-func LRGB | AP 71.5 | 29 |
| Graph Regression | Peptides-struct LRGB | MAE 0.2536 | 29 |
| Link Prediction | PCQM-Contact LRGB | MRR 34.44 | 17 |