
Principled Latent Diffusion for Graphs via Laplacian Autoencoders

About

Graph diffusion models achieve state-of-the-art performance in graph generation but suffer from quadratic complexity in the number of nodes -- and much of their capacity is wasted modeling the absence of edges in sparse graphs. Inspired by latent diffusion in other modalities, a natural idea is to compress graphs into a low-dimensional latent space and perform diffusion there. However, unlike images or text, graph generation requires nearly lossless reconstruction, as even a single error in decoding an adjacency matrix can render the entire sample invalid. This challenge has remained largely unaddressed. We propose LG-Flow, a latent graph diffusion framework that directly overcomes these obstacles. A permutation-equivariant autoencoder maps each node into a fixed-dimensional embedding from which the full adjacency is provably recoverable, enabling near-lossless reconstruction for both undirected graphs and DAGs. The dimensionality of this latent representation scales linearly with the number of nodes, eliminating the quadratic bottleneck and making it feasible to train larger and more expressive models. In this latent space, we train a Diffusion Transformer with flow matching, enabling efficient and expressive graph generation. Our approach matches state-of-the-art graph diffusion models while delivering up to a $1000\times$ speed-up. Our code is available at https://github.com/asiraudin/LG-Flow.
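The abstract's core claim -- per-node embeddings from which the full adjacency is provably recoverable -- has a classical analogue in spectral graph theory: the eigenvectors of the graph Laplacian $L = D - A$ form node embeddings, and with the full spectrum the adjacency is recovered exactly via $A = D - V \Lambda V^\top$. The sketch below illustrates that idea only; the paper's actual autoencoder is learned and permutation-equivariant, and the function names here are hypothetical. Note also that the decoder below is handed the node degrees, which a real autoencoder would have to store or predict.

```python
import numpy as np

def laplacian_embed(A, k):
    """Embed each node via the k smallest eigenpairs of L = D - A."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    w, V = np.linalg.eigh(L)          # eigenvalues in ascending order
    return w[:k], V[:, :k]            # node i's embedding is row i of V[:, :k]

def reconstruct_adjacency(w, V, deg):
    """Rebuild A from the (possibly truncated) spectrum: A = D - V diag(w) V^T."""
    L_hat = V @ np.diag(w) @ V.T
    A_hat = np.diag(deg) - L_hat
    return (A_hat > 0.5).astype(int)  # threshold back to a binary adjacency

# 4-cycle: with the full spectrum (k = n), recovery is exact.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
w, V = laplacian_embed(A, k=4)
A_hat = reconstruct_adjacency(w, V, A.sum(axis=1))
assert (A_hat == A).all()
```

Truncating to k < n compresses the representation at the cost of exact recovery, which is where a learned autoencoder like the paper's earns its keep.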

Antoine Siraudin, Christopher Morris • 2026
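The flow-matching objective mentioned in the abstract can be sketched in a few lines: interpolate between noise and a latent data sample, and regress the model's velocity onto the straight-line target. This is a generic rectified-flow sketch, not the paper's implementation; the Diffusion Transformer is stood in for by a placeholder, and all names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_matching_loss(v_theta, z1, t_batch):
    """Flow-matching loss on latent node embeddings:
    z_t = (1 - t) * z0 + t * z1 with noise z0 and data z1;
    the velocity target along this path is the constant (z1 - z0)."""
    z0 = rng.standard_normal(z1.shape)   # noise endpoint of the path
    t = t_batch.reshape(-1, 1, 1)        # broadcast t over nodes and dims
    zt = (1 - t) * z0 + t * z1
    target = z1 - z0
    pred = v_theta(zt, t_batch)
    return np.mean((pred - target) ** 2)

# Placeholder "model" predicting zero velocity (stands in for the DiT).
v_theta = lambda zt, t: np.zeros_like(zt)

z1 = rng.standard_normal((8, 16, 4))     # batch of 8 graphs, 16 nodes, dim-4 latents
t_batch = rng.uniform(size=8)
loss = flow_matching_loss(v_theta, z1, t_batch)
assert loss > 0
```

Because the latent lives in a fixed-dimensional space per node rather than over all node pairs, the quadratic cost of modeling the dense adjacency never appears in this loop.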

Related benchmarks

Task                       | Dataset                 | Metric   | Result | Rank
Molecular Graph Generation | MOSES                   | Validity | 88.4   | 13
Molecular Graph Generation | GuacaMol                | Validity | 93     | 6
Graph generation           | Ego (test)              | Degree   | 1.9    | 5
Graph generation           | Extended Planar (test)  | --       | --     | 3
Graph generation           | Extended Tree (test)    | --       | --     | 3
