
SynLlama: Generating Synthesizable Molecules and Their Analogs with Large Language Models

About

Generative machine learning models for exploring chemical space have shown immense promise, but many molecules they generate are too difficult to synthesize, making them impractical for further investigation or development. In this work, we present a novel approach by fine-tuning Meta's Llama3 Large Language Models (LLMs) to create SynLlama, which generates full synthetic pathways made of commonly accessible building blocks and robust organic reaction templates. SynLlama explores a large synthesizable space using significantly less data, and offers strong performance in both forward and bottom-up synthesis planning compared to other state-of-the-art methods. We find that SynLlama, even without training on external building blocks, can effectively generalize to unseen yet purchasable building blocks, meaning that its reconstruction capabilities extend to a broader synthesizable chemical space than the training data. We also demonstrate the use of SynLlama in a pharmaceutical context for synthesis planning of analog molecules and hit expansion leads for proposed inhibitors of target proteins, offering medicinal chemists a valuable tool for discovery.
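To make the idea of a generated synthetic pathway concrete, here is a minimal sketch of how a multi-step route might be represented as purchasable building blocks plus reaction-template SMARTS. The data structures and the example amide-coupling route are purely illustrative assumptions, not the authors' actual output schema.

```python
from dataclasses import dataclass, field

@dataclass
class ReactionStep:
    # Reaction template written as reaction SMARTS (illustrative example)
    template_smarts: str
    # SMILES of the building blocks consumed at this step
    building_blocks: list

@dataclass
class SyntheticPathway:
    # SMILES of the target molecule the pathway is meant to produce
    target_smiles: str
    steps: list = field(default_factory=list)

    def depth(self) -> int:
        # Number of reaction steps in the route
        return len(self.steps)

# Hypothetical one-step route to acetanilide via amide coupling
route = SyntheticPathway(
    target_smiles="CC(=O)Nc1ccccc1",
    steps=[
        ReactionStep(
            template_smarts="[C:1](=[O:2])[OH].[N:3]>>[C:1](=[O:2])[N:3]",
            building_blocks=["CC(=O)O", "Nc1ccccc1"],
        ),
    ],
)
print(route.depth())  # prints 1
```

In practice, a cheminformatics toolkit such as RDKit would be used to validate that applying each template to the listed building blocks actually yields the target; the sketch above only captures the pathway's structure.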

Kunyang Sun, Dorian Bagni, Joseph M. Cavanagh, Yingze Wang, Jacob M. Sawyer, Bo Zhou, Andrew Gritsevskiy, Oufan Zhang, Teresa Head-Gordon • 2025

Related benchmarks

Task                      | Dataset             | Metric                  | Result | Rank
Chemical space projection | ChEMBL (test)       | Reconstruction rate (%) | 19.7   | 4
Chemical space projection | Enamine REAL (test) | Reconstruction rate (%) | 69.1   | 3
