Beyond ReLU: Bifurcation, Oversmoothing, and Topological Priors

About

Graph Neural Networks (GNNs) learn node representations through iterative, graph-based message passing. While powerful, deep GNNs suffer from oversmoothing, where node features converge to a homogeneous, non-informative state. We reframe this representational collapse from a bifurcation-theory perspective, characterizing oversmoothing as convergence to a stable "homogeneous fixed point." Our central contribution is the theoretical discovery that this undesired stability can be broken by replacing standard monotone activations (e.g., ReLU) with a suitable class of activation functions. Using Lyapunov-Schmidt reduction, we prove analytically that this substitution induces a bifurcation that destabilizes the homogeneous state and creates a new pair of stable, non-homogeneous patterns that provably resist oversmoothing. Our theory predicts a precise, nontrivial scaling law for the amplitude of these emergent patterns, which we validate quantitatively in experiments. Finally, we demonstrate the practical value of the theory by deriving a closed-form, bifurcation-aware initialization and demonstrating its usefulness on real benchmarks.
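
The mechanism at stake can be illustrated with a toy experiment. The sketch below is our own illustration, not the paper's code: it iterates parameter-free GCN-style propagation on a small ring graph under a monotone (ReLU) and a non-monotone (sinusoidal) activation, and reports the Dirichlet energy of the node features, a standard oversmoothing proxy (energy near zero means the features have collapsed to a homogeneous state). The ring graph, the depth, and the choice of sin(3z) are assumptions made purely for illustration; the paper's activation class, its scaling law, and its initialization are not reproduced here. For orientation only, in a textbook supercritical pitchfork with normal form x' = mu*x - x^3 the emergent amplitude scales as sqrt(mu); the paper's own scaling law may differ.

# Minimal sketch (NOT the paper's implementation): contrast a monotone and a
# non-monotone activation under repeated message passing and measure how far
# the node features are from a homogeneous (oversmoothed) state.
import numpy as np

rng = np.random.default_rng(0)

# Toy ring graph with self-loops; A_hat = D^{-1/2} (A + I) D^{-1/2},
# the symmetrically normalized adjacency used by a vanilla GCN layer.
n = 12
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
A += np.eye(n)
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))

def dirichlet_energy(X):
    """tr(X^T L X) with L = I - A_hat: total feature variation across edges."""
    L = np.eye(n) - A_hat
    return float(np.trace(X.T @ L @ X))

activations = {
    "ReLU (monotone)": lambda z: np.maximum(z, 0.0),
    "sin(3z) (non-monotone)": lambda z: np.sin(3.0 * z),
}

X0 = rng.standard_normal((n, 4))
for name, act in activations.items():
    X = X0.copy()
    for _ in range(100):  # 100 rounds of propagation ("layers")
        X = act(A_hat @ X)
    print(f"{name:>24}: Dirichlet energy after 100 layers = "
          f"{dirichlet_energy(X):.3e}")

In typical runs of this sketch, the ReLU iteration drives the Dirichlet energy toward zero (oversmoothing), while the sinusoidal iteration keeps it bounded away from zero, consistent with the destabilization mechanism the abstract describes.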

Erkan Turan, Gaspard Abel, Maysam Behmanesh, Emery Pierson, Maks Ovsjanikov · 2026

Related benchmarks

Task                 Dataset     Accuracy (%)  Rank
Node Classification  Chameleon   70.85         640
Node Classification  Wisconsin   85.62         627
Node Classification  Texas       93.69         616
Node Classification  Squirrel    61.79         591
Node Classification  Cornell     76.87         582
Node Classification  Citeseer    78.18         393
Node Classification  Photo       95.69         139
Node Classification  Computer    91.94         89
Node Classification  CoauthorCS  95.84         31
