
Beyond ReLU: Bifurcation, Oversmoothing, and Topological Priors

About

Graph Neural Networks (GNNs) learn node representations through iterative, network-based message passing. While powerful, deep GNNs suffer from oversmoothing, where node features converge to a homogeneous, non-informative state. We re-frame this problem of representational collapse from a bifurcation-theory perspective, characterizing oversmoothing as convergence to a stable "homogeneous fixed point." Our central contribution is the theoretical discovery that this undesired stability can be broken by replacing standard monotone activations (e.g., ReLU) with a class of non-monotone functions. Using Lyapunov-Schmidt reduction, we analytically prove that this substitution induces a bifurcation that destabilizes the homogeneous state and creates a new pair of stable, non-homogeneous patterns that provably resist oversmoothing. Our theory predicts a precise, nontrivial scaling law for the amplitude of these emergent patterns, which we validate quantitatively in experiments. Finally, we demonstrate the practical utility of our theory by deriving a closed-form, bifurcation-aware initialization and showing its effectiveness on real benchmarks.
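The oversmoothing phenomenon the abstract describes is easy to reproduce. Below is a minimal sketch (not the authors' code) showing that repeated symmetric-normalized graph averaging followed by a monotone activation (ReLU) drives node features toward a homogeneous state; the graph, feature dimensions, and `spread` diagnostic are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: a ring of 8 nodes with self-loops.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
A += np.eye(n)                         # add self-loops (A + I)
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))    # D^{-1/2} (A + I) D^{-1/2}

X = rng.normal(size=(n, 4))            # random initial node features

def spread(X):
    """Mean per-feature std across nodes: near 0 means oversmoothed."""
    return X.std(axis=0).mean()

s0 = spread(X)                         # initial heterogeneity

# 30 message-passing layers with a monotone (ReLU) activation.
for _ in range(30):
    X = np.maximum(A_hat @ X, 0.0)

# Non-dominant spectral components of A_hat decay geometrically, so the
# features collapse onto the (uniform) Perron eigenvector: spread -> ~0.
print(spread(X) < 0.05 * s0)           # features have nearly homogenized
```

Swapping the ReLU for a non-monotone activation, as the paper proposes, is precisely what breaks the stability of this homogeneous fixed point.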

Erkan Turan, Gaspard Abel, Maysam Behmanesh, Emery Pierson, Maks Ovsjanikov • 2026

Related benchmarks

Task                | Dataset    | Accuracy (%) | Rank
--------------------|------------|--------------|-----
Node Classification | Chameleon  | 70.85        | 549
Node Classification | Squirrel   | 61.79        | 500
Node Classification | Cornell    | 76.87        | 426
Node Classification | Texas      | 93.69        | 410
Node Classification | Wisconsin  | 85.62        | 410
Node Classification | Citeseer   | 78.18        | 275
Node Classification | Computer   | 91.94        | 48
Node Classification | Photo      | 95.69        | 23
Node Classification | CoauthorCS | 95.84        | 11
