
Gradient Gating for Deep Multi-Rate Learning on Graphs

About

We present Gradient Gating (G$^2$), a novel framework for improving the performance of Graph Neural Networks (GNNs). Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph. Local gradients are harnessed to further modulate message passing updates. Our framework flexibly allows one to use any basic GNN layer as a wrapper around which the multi-rate gradient gating mechanism is built. We rigorously prove that G$^2$ alleviates the oversmoothing problem and allows the design of deep GNNs. Empirical results are presented to demonstrate that the proposed framework achieves state-of-the-art performance on a variety of graph learning tasks, including on large-scale heterophilic graphs.
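The mechanism described above can be sketched in a few lines. Below is a minimal, illustrative NumPy version, not the authors' implementation: a simple mean-aggregation layer stands in for the wrapped GNN, `gradient_gate` builds the gate from local graph gradients (per-channel differences to neighbours, raised to a power `p`, passed through a sigmoid), and the multi-rate update is `X ← (1 − τ) ⊙ X + τ ⊙ F(X)`. All function names and the choice of base layer are assumptions for illustration.

```python
import numpy as np

def gnn_layer(X, A, W):
    # Stand-in base GNN layer F: mean aggregation over neighbours, then tanh.
    deg = A.sum(axis=1, keepdims=True)
    return np.tanh((A @ X / np.maximum(deg, 1.0)) @ W)

def gradient_gate(Y, A, p=2.0):
    # Local graph gradients: tau_hat[i, k] = sum_{j in N(i)} |Y[i,k] - Y[j,k]|^p,
    # squashed through a sigmoid so each channel gets its own rate in (0, 1).
    diff = np.abs(Y[:, None, :] - Y[None, :, :]) ** p   # (n, n, d) pairwise diffs
    tau_hat = (A[:, :, None] * diff).sum(axis=1)        # mask to edges, sum over j
    return 1.0 / (1.0 + np.exp(-tau_hat))               # sigmoid

def g2_step(X, A, W):
    # Multi-rate update: X <- (1 - tau) * X + tau * F(X).
    # Where the gate is near 0 a node's features barely move (slow rate);
    # near 1 they follow the message-passing update (fast rate).
    F = gnn_layer(X, A, W)
    tau = gradient_gate(F, A)
    return (1.0 - tau) * X + tau * F

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph on 3 nodes
X = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 4))
X_new = g2_step(X, A, W)
```

Because the gate is computed per node and per channel, different parts of the graph can update at different rates, which is what prevents all node features from collapsing to the same value (oversmoothing) as depth grows.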

T. Konstantin Rusch, Benjamin P. Chamberlain, Michael W. Mahoney, Michael M. Bronstein, Siddhartha Mishra • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Node Classification | Chameleon | Accuracy | 71.4 | 549 |
| Node Classification | Squirrel | Accuracy | 64.26 | 500 |
| Node Classification | Cornell | Accuracy | 87.3 | 426 |
| Node Classification | Wisconsin | Accuracy | 87.84 | 410 |
| Node Classification | Texas | Accuracy | 87.57 | 410 |
| Node Classification | Squirrel (test) | Mean Accuracy | 64.26 | 234 |
| Node Classification | Chameleon (test) | Mean Accuracy | 71.4 | 230 |
| Node Classification | Texas (test) | Mean Accuracy | 87.57 | 228 |
| Node Classification | Wisconsin (test) | Mean Accuracy | 87.84 | 198 |
| Node Classification | Cornell (test) | Mean Accuracy | 87.3 | 188 |

Showing 10 of 25 rows.
