
Pre-training with Fractional Denoising to Enhance Molecular Property Prediction

About

Deep learning methods are considered promising for accelerating molecular screening in drug discovery and material design. Because labelled data are scarce, various self-supervised molecular pre-training methods have been proposed. While many existing methods adopt pre-training tasks common in computer vision (CV) and natural language processing (NLP), they often overlook the fundamental physical principles governing molecules. In contrast, denoising pre-training can be interpreted as learning an equivalent force field, but the restricted noise distribution it requires biases the modeled molecular distribution. To address this issue, we introduce a molecular pre-training framework called fractional denoising (Frad), which decouples the noise design from the constraints imposed by the force-learning equivalence. The noise thus becomes customizable, allowing chemical priors to be incorporated and significantly improving molecular distribution modeling. Experiments demonstrate that our framework consistently outperforms existing methods, establishing state-of-the-art results across force prediction, quantum chemical property, and binding affinity tasks. The refined noise design improves force accuracy and sampling coverage, which together yield physically consistent molecular representations and, ultimately, superior predictive performance.
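To make the "fractional" idea concrete, here is a minimal NumPy sketch of how such a pre-training target could be constructed. All names (`frad_targets`, the sigma parameters, the optional `structural_noise_fn`) are illustrative assumptions, not the authors' actual code: a customizable structural perturbation is applied first, then a small Cartesian Gaussian noise, and only the Cartesian fraction is regressed, which preserves the force-learning interpretation.

```python
import numpy as np

rng = np.random.default_rng(0)

def frad_targets(coords, sigma_cartesian=0.04, structural_noise_fn=None):
    """Sketch of a fractional-denoising target (hypothetical helper).

    coords: (n_atoms, 3) array of conformer coordinates.
    structural_noise_fn: optional chemistry-aware perturbation (e.g. rotating
        rotatable bonds by random torsion angles); placeholder here.
    Returns the noisy coordinates and the regression target, which is only
    the Cartesian fraction of the total noise.
    """
    # Step 1: customizable structural noise (identity if none is supplied).
    if structural_noise_fn is not None:
        coords = structural_noise_fn(coords)
    # Step 2: small isotropic Cartesian noise on top.
    eps = rng.normal(scale=sigma_cartesian, size=coords.shape)
    noisy = coords + eps
    # Step 3: the model is trained to predict only eps, not the full
    # displacement from the original conformer.
    return noisy, eps
```

The design point this illustrates: because the structural noise in step 1 is never part of the regression target, it can be chosen freely (e.g. torsional perturbations reflecting chemical priors) without breaking the equivalence between denoising the Cartesian noise and learning forces.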

Yuyan Ni, Shikun Feng, Xin Hong, Yuancheng Sun, Wei-Ying Ma, Zhi-Ming Ma, Qiwei Ye, Yanyan Lan• 2024

Related benchmarks

Task | Dataset | Metric | Result | Rank
Protein-ligand binding affinity prediction | PDBbind Sequence Identity (60%) 2017 | RMSE | 1.213 | 50
Protein-ligand binding affinity prediction | PDBbind Sequence Identity (30%) 2017 | RMSE | 1.365 | 50
Force Prediction | MD17 (test) | Aspirin Force Error | 0.209 | 30
