
Sign and Basis Invariant Networks for Spectral Graph Representation Learning

About

We introduce SignNet and BasisNet -- new neural architectures that are invariant to two key symmetries displayed by eigenvectors: (i) sign flips, since if $v$ is an eigenvector then so is $-v$; and (ii) more general basis symmetries, which occur in higher dimensional eigenspaces with infinitely many choices of basis eigenvectors. We prove that under certain conditions our networks are universal, i.e., they can approximate any continuous function of eigenvectors with the desired invariances. When used with Laplacian eigenvectors, our networks are provably more expressive than existing spectral methods on graphs; for instance, they subsume all spectral graph convolutions, certain spectral graph invariants, and previously proposed graph positional encodings as special cases. Experiments show that our networks significantly outperform existing baselines on molecular graph regression, learning expressive graph representations, and learning neural fields on triangle meshes. Our code is available at https://github.com/cptq/SignNet-BasisNet.
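The sign invariance (i) can be sketched numerically. SignNet handles each eigenvector $v$ with a form like $\rho(\phi(v) + \phi(-v))$, which is unchanged under $v \mapsto -v$ by construction. Below is a minimal toy sketch of this idea with random fixed weights standing in for the learned networks; the names `phi` and `rho` and the tiny tanh layers are illustrative assumptions, not the paper's actual architecture (which uses learned MLPs/GNNs).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed weights standing in for learned networks (hypothetical).
W1 = rng.standard_normal((8, 1))
W2 = rng.standard_normal((4, 8))

def phi(v):
    # Per-entry feature map applied to an eigenvector: (n,) -> (n, 8).
    return np.tanh(v[:, None] @ W1.T)

def rho(h):
    # Readout over the already sign-invariant features: (n, 8) -> scalar.
    return np.tanh(h @ W2.T).sum()

def signnet(v):
    # SignNet's sign-invariant form: rho(phi(v) + phi(-v)).
    # phi(v) + phi(-v) is identical for v and -v, so the output is too.
    return rho(phi(v) + phi(-v))

v = rng.standard_normal(5)
assert np.isclose(signnet(v), signnet(-v))  # invariant under sign flip
```

Handling (ii), full basis symmetries in higher-dimensional eigenspaces, requires the more involved BasisNet construction; this sketch only covers the sign-flip case.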

Derek Lim, Joshua Robinson, Lingxiao Zhao, Tess Smidt, Suvrit Sra, Haggai Maron, Stefanie Jegelka • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
Node Classification | ogbn-arxiv (test) | Accuracy | 71.95 | 382
Graph Classification | CIFAR10 (test) | Test Accuracy | 71.87 | 139
Node Classification | CLUSTER (test) | Test Accuracy | 77.442 | 113
Graph Classification | MNIST (test) | Accuracy | 98.16 | 110
Graph Regression | Peptides-struct (test) | MAE | 0.2501 | 84
Graph Classification | Peptides-func (test) | AP | 64.94 | 82
Molecular property prediction | BBBP (test) | ROC-AUC | 0.679 | 64
Graph Regression | ZINC subset (test) | MAE | 0.1078 | 56
Molecular property prediction | Tox21 (test) | ROC-AUC | 0.776 | 53
Molecular property prediction | SIDER (test) | ROC-AUC | 0.605 | 53
Showing 10 of 19 rows

Other info

Code
