
Exphormer: Sparse Transformers for Graphs

About

Graph transformers have emerged as a promising architecture for a variety of graph learning and representation tasks. Despite their successes, though, it remains challenging to scale graph transformers to large graphs while maintaining accuracy competitive with message-passing networks. In this paper, we introduce Exphormer, a framework for building powerful and scalable graph transformers. Exphormer consists of a sparse attention mechanism based on two components: virtual global nodes and expander graphs, whose mathematical characteristics, such as spectral expansion, pseudorandomness, and sparsity, yield graph transformers with complexity only linear in the size of the graph, while allowing us to prove desirable theoretical properties of the resulting transformer models. We show that incorporating Exphormer into the recently-proposed GraphGPS framework produces models with competitive empirical results on a wide variety of graph datasets, including state-of-the-art results on three datasets. We also show that Exphormer can scale to larger graphs than previous graph transformer architectures have handled. Code can be found at \url{https://github.com/hamed1375/Exphormer}.
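To illustrate why the attention pattern described in the abstract is linear in the graph size, here is a minimal sketch of how such a sparse pattern could be assembled. It is an assumption-laden toy, not the paper's implementation: the expander overlay is approximated by the union of a few random permutations, the virtual global node is a single extra index, and the input graph's own edges (which Exphormer also attends over) are omitted for brevity.

```python
import random

def exphormer_attention_pairs(num_nodes, expander_degree=3, seed=0):
    """Toy Exphormer-style sparse attention pattern (sketch, not the paper's code).

    Combines (1) an expander-like overlay built from `expander_degree`
    random permutations and (2) one virtual global node (index `num_nodes`)
    that attends to and from every real node. The real architecture also
    keeps the input graph's edges; that component is omitted here.
    """
    rng = random.Random(seed)
    pairs = set()
    nodes = list(range(num_nodes))
    # Expander component: a union of random permutations gives a sparse,
    # pseudorandom-looking overlay with O(num_nodes) directed pairs.
    for _ in range(expander_degree):
        perm = nodes[:]
        rng.shuffle(perm)
        for u, v in zip(nodes, perm):
            if u != v:
                pairs.add((u, v))
                pairs.add((v, u))
    # Virtual global node: connected to every real node in both directions,
    # adding only O(num_nodes) further pairs.
    g = num_nodes
    for u in nodes:
        pairs.add((u, g))
        pairs.add((g, u))
    return pairs

pairs = exphormer_attention_pairs(100, expander_degree=3)
# Attention pairs grow linearly in num_nodes, not quadratically:
# at most 2 * expander_degree + 2 pairs per node.
assert len(pairs) <= 100 * (2 * 3 + 2)
```

The linear bound is the point: full attention over 100 nodes would need 10,000 pairs, while this pattern stays under 800.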

Hamed Shirzad, Ameya Velingker, Balaji Venkatachalam, Danica J. Sutherland, Ali Kemal Sinop • 2023

Related benchmarks

Task                 | Dataset                      | Metric        | Result | Rank
Node Classification  | Cora                         | Accuracy      | 83.29  | 885
Node Classification  | Pubmed                       | Accuracy      | 89.06  | 742
Node Classification  | ogbn-arxiv (test)            | Accuracy      | 72.44  | 382
Node Classification  | Pubmed                       | Accuracy      | 79.67  | 307
Node Classification  | Citeseer                     | Accuracy      | 71.85  | 275
Node Classification  | Actor                        | Accuracy      | 39.01  | 237
Graph Classification | ogbg-molpcba (test)          | AP            | 28.49  | 206
Node Classification  | wikiCS                       | Accuracy      | 79.38  | 198
Graph Regression     | Peptides-struct LRGB (test)  | MAE           | 0.2481 | 178
Node Classification  | Photo                        | Mean Accuracy | 95.36  | 165

Showing 10 of 94 rows

Other info

Code
