
Crystalite: A Lightweight Transformer for Efficient Crystal Modeling

About

Generative models for crystalline materials often rely on equivariant graph neural networks, which capture geometric structure well but are costly to train and slow to sample. We present Crystalite, a lightweight diffusion Transformer for crystal modeling built around two simple inductive biases. The first is Subatomic Tokenization, a compact, chemically structured atom representation that replaces high-dimensional one-hot encodings and is better suited to continuous diffusion. The second is the Geometry Enhancement Module (GEM), which injects periodic minimum-image pair geometry directly into attention through additive geometric biases. Together, these components preserve the simplicity and efficiency of a standard Transformer while making it better matched to the structure of crystalline materials. Crystalite achieves state-of-the-art results on crystal structure prediction benchmarks and strong de novo generation performance, attaining the best S.U.N. discovery score among the evaluated baselines while sampling substantially faster than geometry-heavy alternatives.
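The abstract's exact GEM parameterization is not spelled out here, but the two ingredients it names, minimum-image pair geometry under periodic boundary conditions and an additive bias on attention logits, can be sketched. The following is a minimal NumPy illustration under assumptions: `min_image_distances` and `geometry_biased_attention` are hypothetical names, and the bias is a simple scalar penalty proportional to pair distance rather than the paper's learned form.

```python
import numpy as np

def min_image_distances(frac_coords, lattice):
    """Pairwise minimum-image distances in a periodic cell.

    frac_coords: (N, 3) fractional coordinates
    lattice:     (3, 3) matrix whose rows are lattice vectors
    """
    # Fractional pair differences, wrapped into [-0.5, 0.5) per axis:
    # the minimum-image convention for reasonably compact cells.
    diff = frac_coords[:, None, :] - frac_coords[None, :, :]
    diff -= np.round(diff)
    # Map to Cartesian coordinates and take pair norms.
    cart = diff @ lattice
    return np.linalg.norm(cart, axis=-1)

def geometry_biased_attention(q, k, v, dists, w=1.0):
    """Scaled dot-product attention with an additive geometric bias
    on the logits (illustrative GEM-style bias, not the paper's exact
    learned parameterization)."""
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)
    logits = logits - w * dists  # nearer atoms receive larger logits
    # Numerically stable softmax over the key axis.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

The wrap-then-convert order matters: rounding fractional differences before multiplying by the lattice is what selects the nearest periodic image, so atoms near opposite cell faces are correctly treated as close neighbors.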

Tin Hadži Veljković, Joshua Rosenthal, Ivor Lončarić, Jan-Willem van de Meent • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Crystal Generation | LeMat-GenBench (MP20) | Validity | 97.2 | 28 |
| De Novo Generation | MP-20 | Structural Validity | 1 | 21 |
| Crystal Structure Prediction | MP-20 | Match Rate (%) | 66.05 | 13 |
| Crystal Structure Prediction | MPTS-52 | Match Rate (MR) | 31.49 | 13 |
| Crystal Structure Generation | MP-20 (test) | Compositional Validity | 81.94 | 10 |
| Crystal Structure Prediction | Alex-MP-20 | MR (%) | 67.52 | 2 |
