
Are Expressive Encoders Necessary for Discrete Graph Generation?

About

Discrete graph generation has emerged as a powerful paradigm for modeling graph data, often relying on highly expressive neural backbones such as transformers or higher-order architectures. We revisit this design choice by introducing GenGNN, a modular message-passing framework for graph generation. Diffusion models built on GenGNN achieve more than 90% validity on the Tree and Planar datasets, within the margins of graph transformers, at 2-5x faster inference. For molecule generation, DiGress with a GenGNN backbone achieves 99.49% validity. A systematic ablation study quantifies the benefit of each GenGNN component, indicating that residual connections are needed to mitigate oversmoothing on complex graph structures. Through scaling analyses, we apply a principled metric-space view to investigate learned diffusion representations and assess whether GNNs can serve as expressive neural backbones for discrete diffusion.
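The abstract points to residual connections as the key ingredient for mitigating oversmoothing when message-passing layers are stacked. A minimal sketch of that idea, in NumPy (illustrative only; `mp_layer_residual` and its shapes are assumptions, not the actual GenGNN implementation):

```python
import numpy as np

def mp_layer_residual(H, A, W):
    """One message-passing layer with a residual connection.

    Illustrative sketch only -- not the GenGNN code. H is an (n, d) node
    feature matrix, A an (n, n) adjacency matrix, W a (d, d) weight matrix.
    The residual term `H + ...` preserves per-node information that would
    otherwise collapse toward the neighborhood mean (oversmoothing) as
    layers stack.
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1.0)  # guard isolated nodes
    agg = (A @ H) / deg                               # mean-aggregate neighbor features
    return H + np.maximum(agg @ W, 0.0)               # residual + ReLU update
```

Without the `H +` term, repeated application of the mean-aggregation step drives all node representations toward a common value on connected graphs; the residual keeps the layer an identity-plus-correction update.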

Jay Revolinsky, Harry Shomer, Jiliang Tang • 2026

Related benchmarks

| Task                               | Dataset  | Metric        | Result | Rank |
|------------------------------------|----------|---------------|--------|------|
| Graph generation                   | SBM      | V.U.N.        | 0.795  | 51   |
| Graph generation                   | Planar   | V.U.N.        | 93     | 48   |
| Unconditional molecular generation | MOSES    | Validity      | 91.44  | 39   |
| Molecular graph generation         | QM9      | Validity      | 99.49  | 37   |
| Graph generation                   | Tree     | Average Ratio | 1.4    | 36   |
| Molecule generation                | GuacaMol | Validity      | 93.09  | 20   |
| Molecular graph generation         | ZINC250K | Validity      | 96.24  | 9    |
| Graph generation                   | Comm20   | Average Ratio | 1.8    | 6    |
| Conditional graph generation       | TLS      | V.U.N.        | 93.75  | 2    |
