Breaking the Limits of Message Passing Graph Neural Networks
About
Because Message Passing (Graph) Neural Networks (MPNNs) have linear complexity in the number of nodes when applied to sparse graphs, they are widely used and continue to attract interest, even though their theoretical expressive power is bounded by the first-order Weisfeiler-Lehman test (1-WL). In this paper, we show that if the graph convolution supports are designed in the spectral domain by a nonlinear custom function of the eigenvalues and masked to an arbitrarily large receptive field, the resulting MPNN is theoretically more powerful than the 1-WL test and experimentally as powerful as existing 3-WL models, while remaining spatially localized. Moreover, by designing custom filter functions, the outputs can contain various frequency components, which allows the convolution process to learn different relationships between a given input graph signal and its associated properties. To date, the best 3-WL-equivalent graph neural networks have a computational complexity in $\mathcal{O}(n^3)$ with memory usage in $\mathcal{O}(n^2)$, rely on non-local update mechanisms, and do not provide spectrally rich output profiles. The proposed method overcomes all of these problems and reaches state-of-the-art results on many downstream tasks.
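The core idea above can be sketched in a few lines: build a dense convolution support by applying a custom nonlinear function to the eigenvalues of the normalized graph Laplacian, then mask it to a k-hop receptive field so the operator stays spatially localized. This is a minimal illustrative sketch, not the paper's implementation; the names `spectral_support` and `filter_fn` are placeholders introduced here.

```python
import numpy as np

def spectral_support(A, filter_fn, k):
    """Illustrative sketch: spectral-domain convolution support
    from a custom eigenvalue function, masked to k hops."""
    n = A.shape[0]
    # Normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    L = np.eye(n) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    # Eigendecomposition of the symmetric Laplacian
    eigvals, U = np.linalg.eigh(L)
    # Dense support from a nonlinear custom function of the eigenvalues
    C = U @ np.diag(filter_fn(eigvals)) @ U.T
    # k-hop mask: nonzero pattern of (I + A)^k keeps the support local
    M = np.linalg.matrix_power(np.eye(n) + A, k) > 0
    return C * M

# Example: a band-pass-like filter on a 4-cycle, masked to 1 hop
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
S = spectral_support(A, lambda lam: np.exp(-(lam - 1.0) ** 2), k=1)
```

With `k=1`, entries between non-adjacent nodes (e.g. opposite corners of the cycle) are zeroed by the mask, so the learned relation stays local even though the filter is defined globally in the spectral domain.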
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Graph Regression | ZINC 12K (test) | MAE | 0.161 | 164 |
| Graph Classification | EXP (test) | Accuracy | 100 | 33 |
| Graph Isomorphism Testing | Graph8c | Undistinguished Pairs Count | 0.00e+0 | 15 |
| Graph Isomorphism Testing | Strongly Regular Graphs (SRGs) | SR16622 | 1 | 15 |
| Graph Separation | GRAPH8c random initialization | Non-Separated Pairs | 0.00e+0 | 11 |
| Graph Separation | EXP random initialization | Non-Separated Graph Pairs | 0.00e+0 | 11 |