
GRPE: Relative Positional Encoding for Graph Transformer

About

We propose a novel positional encoding for learning graphs with the Transformer architecture. Existing approaches either linearize a graph to encode absolute positions in the resulting node sequence, or encode relative positions with respect to other nodes using bias terms. The former loses the precision of relative position through linearization, while the latter loses a tight integration of node-edge and node-topology interaction. To overcome the weaknesses of previous approaches, our method encodes a graph without linearization and considers both node-topology and node-edge interaction. We name our method Graph Relative Positional Encoding (GRPE), dedicated to graph representation learning. Experiments conducted on various graph datasets show that the proposed method significantly outperforms previous approaches. Our code is publicly available at https://github.com/lenscloth/GRPE.
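The idea of combining node-topology and node-edge interaction inside attention can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the embedding tables `topo_emb` and `edge_emb`, and the structural inputs `spd` (shortest-path distance) and `etype` (edge type), are illustrative assumptions about how such relative encodings are typically indexed.

```python
import numpy as np

# Minimal sketch (not the paper's code) of graph-aware relative attention:
# each query interacts with (a) an embedding of the shortest-path distance
# to the key node (node-topology interaction) and (b) an embedding of the
# edge type connecting the two nodes (node-edge interaction), in addition
# to the usual content term.

rng = np.random.default_rng(0)

n, d = 4, 8                       # number of nodes, head dimension
max_dist, num_edge_types = 3, 2   # vocabulary sizes (assumed for the demo)

q = rng.standard_normal((n, d))   # queries
k = rng.standard_normal((n, d))   # keys
v = rng.standard_normal((n, d))   # values

# Pairwise structural labels for a toy graph.
spd = rng.integers(0, max_dist, size=(n, n))          # topology relation ids
etype = rng.integers(0, num_edge_types, size=(n, n))  # edge relation ids

# Learned relative-position embedding tables (random here).
topo_emb = rng.standard_normal((max_dist, d))
edge_emb = rng.standard_normal((num_edge_types, d))

# Attention scores: content term + query/topology term + query/edge term.
scores = q @ k.T
scores += np.einsum("id,ijd->ij", q, topo_emb[spd])
scores += np.einsum("id,ijd->ij", q, edge_emb[etype])
scores /= np.sqrt(d)

# Softmax over keys, then aggregate values.
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
out = attn @ v  # (n, d) updated node representations
```

Because the structural terms are added to the attention logits rather than to a linearized node order, no information about pairwise relative position is lost, which is the contrast with absolute-position approaches drawn in the abstract.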

Wonpyo Park, Woonggi Chang, Donggeon Lee, Juntae Kim, Seung-won Hwang • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Graph Classification | ogbg-molpcba (test) | AP | 31.5 | 206 |
| Graph Regression | ZINC 12K (test) | MAE | 0.094 | 164 |
| Node Classification | CLUSTER (test) | Test Accuracy | 81.586 | 113 |
| Graph Regression | OGB-LSC PCQM4M v2 (val) | MAE | 0.0867 | 81 |
| Quantum Chemical Prediction | PCQM4M v2 (val) | MAE | 0.0866 | 68 |
| Molecular Property Prediction | MOLPCBA OGB (test) | AP (Test) | 31.5 | 36 |
| Quantum Chemical Prediction | PCQM4M v2 (test-dev) | MAE | 0.0898 | 31 |
| Graph-level Classification | MolHIV (test) | AUC | 0.8139 | 19 |
| Graph Property Regression | PCQM4M (val) | MAE | 0.1225 | 19 |
| Graph Classification | MolHIV v1 (test) | AUC | 0.8139 | 16 |

Showing 10 of 12 rows.
