
Towards Training Billion Parameter Graph Neural Networks for Atomic Simulations

About

Recent progress in Graph Neural Networks (GNNs) for modeling atomic simulations has the potential to revolutionize catalyst discovery, which is a key step in making progress towards the energy breakthroughs needed to combat climate change. However, the GNNs that have proven most effective for this task are memory intensive as they model higher-order interactions in the graphs such as those between triplets or quadruplets of atoms, making it challenging to scale these models. In this paper, we introduce Graph Parallelism, a method to distribute input graphs across multiple GPUs, enabling us to train very large GNNs with hundreds of millions or billions of parameters. We empirically evaluate our method by scaling up the number of parameters of the recently proposed DimeNet++ and GemNet models by over an order of magnitude. On the large-scale Open Catalyst 2020 (OC20) dataset, these graph-parallelized models lead to relative improvements of 1) 15% on the force MAE metric for the S2EF task and 2) 21% on the AFbT metric for the IS2RS task, establishing new state-of-the-art results.

Anuroop Sriram, Abhishek Das, Brandon M. Wood, Siddharth Goyal, C. Lawrence Zitnick • 2022

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Initial Structure to Relaxed Structure (IS2RS) | Open Catalyst OC20 (test) | AFbT | 0.308 | 32
S2EF (Structure to Energy and Forces) | OC20 average across all four splits (test) | Force MAE (meV/Å) | 20.5 | 27
Adsorption energy prediction | OC20 IS2RE (test) | MAE | 0.3712 | 16
IS2RE (Initial Structure to Relaxed Energy) | OC20 average across all four splits (test) | Energy MAE (meV) | 371 | 10
Structure to Energy and Forces | Open Catalyst Project S2EF OC20 (test) | Energy MAE (eV) | 0.2701 | 5
