
Path-Augmented Graph Transformer Network

About

Much of the recent work on learning molecular representations has been based on graph convolutional networks (GCNs). These models rely on local aggregation operations and can therefore miss higher-order graph properties. To remedy this, we propose Path-Augmented Graph Transformer Networks (PAGTN), which are explicitly built on longer-range dependencies in graph-structured data. Specifically, we use path features in molecular graphs to create global attention layers. We compare our PAGTN model against the GCN model and show that our model consistently outperforms GCNs on molecular property prediction datasets including quantum chemistry (QM7, QM8, QM9), physical chemistry (ESOL, Lipophilicity) and biochemistry (BACE, BBBP).
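The core idea above — global attention between all node pairs, with pairwise path features biasing the attention scores — can be sketched as follows. This is a minimal illustration of the mechanism, not the authors' implementation; the function name, weight shapes, and the choice to reduce path features to a scalar logit bias are all assumptions for exposition.

```python
import numpy as np

def path_augmented_attention(node_feats, path_feats, W_q, W_k, W_v, W_p):
    """One global attention layer over a molecular graph (sketch).

    node_feats: (n, d)    per-atom feature vectors
    path_feats: (n, n, p) features of the path between each atom pair
                          (e.g. encoded bonds along the shortest path)
    W_q, W_k, W_v: (d, d_h) projection weights; W_p: (p,) path weights
    """
    q = node_feats @ W_q                                  # (n, d_h)
    k = node_feats @ W_k                                  # (n, d_h)
    v = node_feats @ W_v                                  # (n, d_h)
    # Path features contribute a pairwise bias to the attention logits,
    # letting distant atoms interact in a single layer.
    path_bias = path_feats @ W_p                          # (n, n)
    logits = (q @ k.T) / np.sqrt(q.shape[1]) + path_bias  # (n, n)
    # Softmax over ALL nodes, not just graph neighbours ("global" attention).
    weights = np.exp(logits - logits.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ v                                    # (n, d_h)
```

Because every atom attends to every other atom in one layer, higher-order structure (e.g. ring membership, functional groups several bonds apart) does not have to be built up through many rounds of local aggregation as in a GCN.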

Benson Chen, Regina Barzilay, Tommi Jaakkola • 2019

Related benchmarks

Task                            Dataset                          Result          Rank
Atomization energy prediction   QM7 (10-fold cross validation)   MAE 47.8        13
