Graph Propagation Transformer for Graph Representation Learning
About
This paper presents a novel transformer architecture for graph representation learning. The core insight of our method is to fully consider the information propagation among nodes and edges in a graph when building the attention module in the transformer blocks. Specifically, we propose a new attention mechanism called Graph Propagation Attention (GPA). It explicitly passes information among nodes and edges in three ways, i.e., node-to-node, node-to-edge, and edge-to-node, which is essential for learning graph-structured data. On this basis, we design an effective transformer architecture named Graph Propagation Transformer (GPTrans) to further help learn graph data. We verify the performance of GPTrans in a wide range of graph learning experiments on several benchmark datasets. The results show that our method outperforms many state-of-the-art transformer-based graph models. The code will be released at https://github.com/czczup/GPTrans.
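The abstract does not spell out how the three propagation paths are realized, but they can be illustrated with a minimal, single-head PyTorch sketch over dense node and edge feature tensors. Everything below is an assumption for illustration: the module name `GraphPropagationAttention`, the linear-layer layout, and the bias/update formulation are not taken from the released GPTrans code.

```python
# Minimal single-head sketch of the three propagation paths named in the
# abstract. All module/variable names here are illustrative assumptions,
# not the official GPTrans implementation.
import torch
import torch.nn as nn


class GraphPropagationAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.scale = dim ** -0.5
        self.qkv = nn.Linear(dim, 3 * dim)     # node queries, keys, values
        self.edge_bias = nn.Linear(dim, 1)     # edge-to-node: edges bias the attention logits
        self.edge_update = nn.Linear(1, dim)   # node-to-edge: attention scores update edges
        self.proj = nn.Linear(dim, dim)

    def forward(self, x, e):
        # x: (B, N, dim) node features; e: (B, N, N, dim) pairwise edge features
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        # node-to-node: standard scaled dot-product attention among nodes
        logits = torch.einsum("bid,bjd->bij", q, k) * self.scale

        # edge-to-node: edge features contribute an additive bias to the logits
        logits = logits + self.edge_bias(e).squeeze(-1)

        attn = logits.softmax(dim=-1)
        x_out = self.proj(torch.einsum("bij,bjd->bid", attn, v))

        # node-to-edge: pairwise attention scores propagate back into the edges
        e_out = e + self.edge_update(attn.unsqueeze(-1))
        return x_out, e_out


# Usage with toy shapes: 2 graphs, 10 nodes each, 64-dim features.
gpa = GraphPropagationAttention(dim=64)
nodes = torch.randn(2, 10, 64)
edges = torch.randn(2, 10, 10, 64)
nodes, edges = gpa(nodes, edges)
```

Returning updated edge features alongside node features is what distinguishes this pattern from a plain transformer block, where only node states flow between layers.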
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Graph Classification | ogbg-molpcba (test) | AP 32.43 | 206 |
| Graph Regression | OGB-LSC PCQM4M v2 (val) | MAE 0.0809 | 81 |
| Graph Property Regression | PCQM4M (val) | MAE 0.1151 | 19 |
| Graph-Level Classification | MolHIV (test) | AUC 0.8126 | 19 |
| Graph Regression | PCQM4M v2 (val) | MAE 0.0809 | 13 |
| Graph Property Regression | PCQM4M v2 (test-dev) | MAE 0.0821 | 9 |
| Graph Regression | OGB-LSC PCQM4M (test) | -- | 9 |
| Graph Regression | PCQM4M v2 (test) | MAE 0.0821 | 8 |