Clenshaw Graph Neural Networks
About
Graph Convolutional Networks (GCNs), which use a message-passing paradigm with stacked convolution layers, are foundational methods for learning graph representations. Recent GCN models use various residual connection techniques to alleviate model degradation problems such as over-smoothing and gradient vanishing. Existing residual connection techniques, however, fail to make extensive use of the underlying graph structure in the graph spectral domain, which is critical for obtaining satisfactory results on heterophilic graphs. In this paper, we introduce ClenshawGCN, a GNN model that employs the Clenshaw Summation Algorithm to enhance the expressiveness of the GCN model. ClenshawGCN equips the standard GCN model with two straightforward residual modules: the adaptive initial residual connection and the negative second-order residual connection. We show that by adding these two residual modules, ClenshawGCN implicitly simulates a polynomial filter under the Chebyshev basis, giving it at least as much expressive power as polynomial spectral GNNs. In addition, we conduct comprehensive experiments to demonstrate the superiority of our model over spatial and spectral GNN models.
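The claim that the two residual modules implicitly evaluate a polynomial filter under the Chebyshev basis can be sanity-checked numerically with the classical Clenshaw summation identity. The sketch below is an illustration only, not the authors' implementation: it assumes a generic symmetric propagation matrix `A_hat` with spectrum in [-1, 1] and arbitrary coefficients `alpha`, and compares a Clenshaw-style recurrence — each step adds a scaled initial residual (`a_k * X`) and subtracts the representation from two layers back — against a direct expansion in Chebyshev polynomials of the second kind.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric "propagation matrix" rescaled so its spectrum lies in
# [-1, 1] (a placeholder for a normalized graph adjacency, not real data).
M = rng.standard_normal((6, 6))
A_hat = (M + M.T) / 2
A_hat /= np.abs(np.linalg.eigvalsh(A_hat)).max()

X = rng.standard_normal((6, 3))   # node features
alpha = rng.standard_normal(5)    # filter coefficients a_0 .. a_4

# Clenshaw-style recurrence: H_k = a_k * X + 2 * A_hat @ H_{k+1} - H_{k+2}.
# The "+ a_k * X" term plays the role of the (adaptive) initial residual;
# the "- H_{k+2}" term is the negative second-order residual.
H_next = np.zeros_like(X)
H_next2 = np.zeros_like(X)
for a_k in alpha[::-1]:
    H = a_k * X + 2 * A_hat @ H_next - H_next2
    H_next, H_next2 = H, H_next
clenshaw_out = H_next  # equals sum_k a_k * U_k(A_hat) @ X

# Direct evaluation of sum_k a_k * U_k(A_hat) @ X via the three-term
# recurrence U_0 = 1, U_1(x) = 2x, U_{k+1}(x) = 2x U_k(x) - U_{k-1}(x).
u_prev, u_curr = X, 2 * A_hat @ X
direct_out = alpha[0] * u_prev + alpha[1] * u_curr
for a_k in alpha[2:]:
    u_prev, u_curr = u_curr, 2 * A_hat @ u_curr - u_prev
    direct_out += a_k * u_curr

assert np.allclose(clenshaw_out, direct_out)
```

The two loops produce the same output, which is the core of the Clenshaw summation argument: stacking layers with these two residual terms is equivalent to applying a learned polynomial of the propagation matrix in the Chebyshev (second-kind) basis.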
Related benchmarks
| Task | Dataset | Accuracy | Rank |
|---|---|---|---|
| Node Classification | Cornell (60%/20%/20% random splits) | 92.46 | 78 |
| Node Classification | Cora (60%/20%/20% random splits) | 88.9 | 74 |
| Node Classification | Chameleon (60%/20%/20% random splits) | 69.44 | 55 |
| Node Classification | Texas (60%/20%/20% random splits) | 93.36 | 45 |
| Node Classification | Squirrel (60%/20%/20% random splits) | 62.14 | 44 |
| Node Classification | Actor (60%/20%/20% random splits) | 42.08 | 34 |
| Node Classification | Pubmed (60%/20%/20% random splits) | 91.99 | 31 |
| Node Classification | Citeseer (60%/20%/20% random splits) | 80.34 | 22 |
| Node Classification | Twitch-gamer (50%/25%/25% fixed splits) | 66.26 | 6 |