Specformer: Spectral Graph Neural Networks Meet Transformers
About
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions. However, most existing spectral graph filters are scalar-to-scalar functions, i.e., they map a single eigenvalue to a single filtered value, thus ignoring the global pattern of the spectrum. Furthermore, these filters are often constructed from fixed-order polynomials, which limits their expressiveness and flexibility. To tackle these issues, we introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain, yielding a learnable set-to-set spectral filter. We also design a decoder with learnable bases to enable non-local graph convolution. Importantly, Specformer is permutation equivariant. By stacking multiple Specformer layers, one can build a powerful spectral GNN. On synthetic datasets, we show that Specformer recovers ground-truth spectral filters better than other spectral GNNs. Extensive experiments on both node-level and graph-level tasks on real-world graph datasets show that Specformer outperforms state-of-the-art GNNs and learns meaningful spectrum patterns. Code and data are available at https://github.com/bdy9527/Specformer.
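The core idea above (encode all eigenvalues of the graph Laplacian, attend over them as a set, and decode new eigenvalues to form a learned filter) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's architecture: the sinusoidal encoding, the single attention head, the residual decoder, and all function names here are illustrative assumptions with random, untrained weights.

```python
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for a symmetric adjacency matrix A."""
    d = A.sum(axis=1)
    d_inv_sqrt = np.where(d > 0, d ** -0.5, 0.0)
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def eig_encode(lams, dim=8):
    """Sinusoidal encoding of each eigenvalue, so attention sees
    relative spectral positions rather than raw scalars."""
    freqs = np.arange(1, dim // 2 + 1)
    ang = lams[:, None] * freqs[None, :] * np.pi
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)

def self_attention(Z, Wq, Wk, Wv):
    """Plain single-head self-attention over the set of eigenvalue encodings:
    every eigenvalue's update depends on the whole spectrum (set-to-set)."""
    Q, K, V = Z @ Wq, Z @ Wk, Z @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ V

def specformer_layer(A, X, rng, dim=8):
    """One illustrative spectral filtering step with random weights."""
    lams, U = np.linalg.eigh(normalized_laplacian(A))
    Z = eig_encode(lams, dim)
    Wq, Wk, Wv = (0.1 * rng.standard_normal((dim, dim)) for _ in range(3))
    H = self_attention(Z, Wq, Wk, Wv)
    w_dec = 0.1 * rng.standard_normal(dim)   # toy decoder head (assumption)
    new_lams = lams + H @ w_dec              # residual filtered spectrum
    # Reconstructing with the new eigenvalues gives a non-local convolution.
    return U @ np.diag(new_lams) @ U.T @ X
```

Because the filtered operator `U diag(λ') Uᵀ` is generally dense even when `A` is sparse, each node can aggregate from every other node in a single layer, which is what "non-local graph convolution" refers to.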
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Node Classification | Cora (test) | Mean Accuracy | 88.41 | 687 |
| Node Classification | Chameleon | Accuracy | 49.79 | 549 |
| Node Classification | PubMed (test) | Accuracy | 89.19 | 500 |
| Node Classification | Squirrel | Accuracy | 38.24 | 500 |
| Node Classification | Actor | Accuracy | 34.12 | 237 |
| Node Classification | Squirrel (test) | Mean Accuracy | 62.15 | 234 |
| Node Classification | Chameleon (test) | Mean Accuracy | 73.31 | 230 |
| Graph Classification | ogbg-molpcba (test) | AP | 29.72 | 206 |
| Graph Regression | ZINC (test) | MAE | 0.066 | 204 |
| Node Classification | Actor (test) | Mean Accuracy | 0.3826 | 143 |