
Hierarchical Inter-Message Passing for Learning on Molecular Graphs

About

We present a hierarchical neural message passing architecture for learning on molecular graphs. Our model operates on two complementary graph representations: the raw molecular graph and its associated junction tree, whose nodes represent meaningful clusters of the original graph, e.g., rings or bridged compounds. We learn a molecule's representation by passing messages within each graph and by exchanging messages between the two representations via a coarse-to-fine and fine-to-coarse information flow. Our method overcomes some known limitations of classical GNNs, such as the inability to detect cycles, while remaining very efficient to train. We validate its performance on the ZINC dataset and on datasets from the MoleculeNet benchmark collection.
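To make the two-level information flow concrete, here is a toy NumPy sketch of one round of hierarchical inter-message passing. It is not the paper's implementation (the actual model uses learned weight matrices and GNN layers); the mean-aggregation update, the function names, and the binary atom-to-cluster assignment matrix are illustrative assumptions.

```python
import numpy as np

def mp_step(h, adj):
    """One intra-graph message passing step: mix each node's state
    with the mean of its neighbors' states (unweighted toy update)."""
    deg = adj.sum(1, keepdims=True).clip(min=1)
    return 0.5 * h + 0.5 * (adj @ h) / deg

def hierarchical_step(h_atom, adj_atom, h_clust, adj_clust, assign):
    """One round of hierarchical inter-message passing.

    h_atom:    atom node states of the raw molecular graph
    h_clust:   cluster node states of the junction tree
    assign:    binary matrix, assign[i, c] = 1 iff atom i lies in cluster c
    """
    # Intra-graph passes on both representations.
    h_atom = mp_step(h_atom, adj_atom)      # raw molecular graph
    h_clust = mp_step(h_clust, adj_clust)   # junction tree

    # Fine-to-coarse: pool atom states into their clusters (mean).
    sizes = assign.sum(0, keepdims=True).clip(min=1)
    h_clust = h_clust + (assign.T @ h_atom) / sizes.T

    # Coarse-to-fine: broadcast cluster states back to member atoms.
    h_atom = h_atom + assign @ h_clust
    return h_atom, h_clust

# Toy molecule: a 3-ring (atoms 0-2) with a pendant atom 3 bonded to atom 0.
adj_atom = np.array([[0, 1, 1, 1],
                     [1, 0, 1, 0],
                     [1, 1, 0, 0],
                     [1, 0, 0, 0]], dtype=float)
# Junction tree: cluster 0 = the ring, cluster 1 = the bond (0, 3).
adj_clust = np.array([[0, 1],
                      [1, 0]], dtype=float)
assign = np.array([[1, 1],
                   [1, 0],
                   [1, 0],
                   [0, 1]], dtype=float)

h_atom, h_clust = hierarchical_step(np.ones((4, 2)), adj_atom,
                                    np.ones((2, 2)), adj_clust, assign)
```

Note that the ring appears as a single node in the junction tree, which is how the coarse level lets atoms "see" cycle membership that plain neighborhood aggregation on the raw graph cannot distinguish.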

Matthias Fey, Jan-Gin Yuen, Frank Weichert · 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Graph Regression | ZINC 12K (test) | MAE | 0.151 | 164 |
| Graph Classification | OGBG-MOLHIV v1 (test) | ROC-AUC | 0.788 | 88 |
| Graph Classification | MolHIV | ROC-AUC | 78.8 | 82 |
| Graph property prediction | OGBG-MOLHIV (test) | ROC-AUC | 78.8 | 61 |
| Graph Regression | ZINC subset (test) | MAE | 0.151 | 56 |
| Molecular property prediction | MUV (test) | ROC-AUC | 81.8 | 49 |
| Graph Regression | ZINC | MSE | 0.151 | 49 |
| Graph-level regression | ZINC full (test) | MAE | 0.036 | 45 |
| Graph Classification | ogbg-molhiv | ROC-AUC | 0.788 | 39 |
| Molecular property prediction | MOLPCBA OGB (test) | AP | 27.39 | 36 |

Showing 10 of 30 rows.
