
MolXPT: Wrapping Molecules with Text for Generative Pre-training

About

The Generative Pre-trained Transformer (GPT) has demonstrated great success in natural language processing, and related techniques have been adapted to molecular modeling. Considering that text is the most important record of scientific discovery, in this paper we propose MolXPT, a unified language model of text and molecules pre-trained on SMILES (a sequence representation of molecules) wrapped by text. Briefly, we detect the molecule names in each sequence and replace them with the corresponding SMILES. In this way, the SMILES can leverage information from the surrounding text, and vice versa. These wrapped sequences, together with text sequences from PubMed and SMILES sequences from PubChem, are fed into a language model for pre-training. Experimental results demonstrate that MolXPT outperforms strong baselines for molecular property prediction on MoleculeNet, performs comparably to the best model in text-molecule translation while using less than half of its parameters, and enables zero-shot molecular generation without finetuning.

Zequn Liu, Wei Zhang, Yingce Xia, Lijun Wu, Shufang Xie, Tao Qin, Ming Zhang, Tie-Yan Liu • 2023
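The core "wrapping" idea can be illustrated with a minimal sketch: molecule names detected in text are swapped for their SMILES strings, so text and SMILES share a single sequence for pre-training. The name-to-SMILES lookup, the regex-based name detection, and the boundary tokens below are illustrative assumptions rather than the paper's exact pipeline; the abstract only states that molecule names are detected and replaced with the corresponding SMILES.

```python
import re

# Hypothetical name-to-SMILES lookup; in practice this would come from a
# chemical entity linker backed by a database such as PubChem.
NAME_TO_SMILES = {
    "aspirin": "CC(=O)OC1=CC=CC=C1C(=O)O",
    "caffeine": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
}

# Assumed special tokens marking the start/end of a SMILES span inside text.
SOM, EOM = "<som>", "<eom>"

def wrap_with_smiles(text: str) -> str:
    """Replace known molecule names in `text` with tagged SMILES strings."""
    pattern = re.compile(
        "|".join(re.escape(name) for name in NAME_TO_SMILES), re.IGNORECASE
    )

    def substitute(match: re.Match) -> str:
        smiles = NAME_TO_SMILES[match.group(0).lower()]
        return f"{SOM} {smiles} {EOM}"

    return pattern.sub(substitute, text)

# The wrapped sequence mixes natural language and SMILES, so a single
# language model can be pre-trained on both modalities jointly.
print(wrap_with_smiles("Aspirin irreversibly inhibits COX-1."))
# <som> CC(=O)OC1=CC=CC=C1C(=O)O <eom> irreversibly inhibits COX-1.
```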

Related benchmarks

Task                               Dataset                        Metric                 Result  Rank
Molecular property prediction      MoleculeNet BBBP (scaffold)    ROC-AUC                72.9    140
Molecular property prediction      MoleculeNet SIDER (scaffold)   ROC-AUC                0.672   120
Molecule captioning                ChEBI-20 (test)                METEOR                 0.626   114
Molecular property prediction      MoleculeNet BACE (scaffold)    ROC-AUC                85.7    110
Molecular property prediction      MoleculeNet (scaffold split)   BBBP                   80      85
Molecular property prediction      MoleculeNet HIV (scaffold)     ROC-AUC                80.8    66
Molecular property prediction      BACE (test)                    ROC-AUC                88.4    65
Molecular property classification  MoleculeNet BBBP               ROC-AUC                80      56
Molecular property prediction      MoleculeNet Tox21 (scaffold)   ROC-AUC                79.6    48
Text-guided molecule generation    ChEBI-20 (test)                MACCS FTS similarity   85.9    48

Showing 10 of 34 rows.
