On the Necessity of Learnable Sheaf Laplacians

About

Sheaf Neural Networks (SNNs) were introduced as an extension of Graph Convolutional Networks to address oversmoothing on heterophilic graphs by attaching a sheaf to the input graph and replacing the adjacency-based operator with a sheaf Laplacian defined by (learnable) restriction maps. Prior work motivates this design through theoretical properties of sheaf diffusion and the kernel of the sheaf Laplacian, suggesting that suitable non-identity restriction maps can prevent representations from converging to constants across connected components. Since oversmoothing can also be mitigated through residual connections and normalization, we revisit a trivial sheaf construction to ask whether the additional complexity of learning restriction maps is necessary. We introduce an Identity Sheaf Network baseline, in which all restriction maps are fixed to the identity, and use it to ablate the empirical improvements reported for sheaf-learning architectures. Across five popular heterophilic benchmarks, the identity baseline achieves performance comparable to a range of SNN variants. Finally, we introduce the Rayleigh quotient as a normalized measure for comparing oversmoothing across models and show that, in trained networks, the behavior predicted by the diffusion-based analysis of SNNs is not reflected empirically. In particular, Identity Sheaf Networks do not appear to suffer more severe oversmoothing than their SNN counterparts.
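To make the abstract's constructions concrete, here is a minimal sketch of the block sheaf Laplacian and of the Rayleigh quotient used as an oversmoothing measure. It assumes the standard block construction of the sheaf Laplacian (diagonal blocks sum F^T F over incident edges; off-diagonal blocks are -F_u^T F_v) and PyTorch; the helper names sheaf_laplacian and rayleigh_quotient are illustrative, not taken from the paper's code. Fixing every restriction map to the identity yields the Identity Sheaf Network operator, which reduces to the ordinary graph Laplacian acting blockwise on d-dimensional stalks.

import torch

def sheaf_laplacian(edges, maps, n, d):
    # Assemble the (n*d) x (n*d) sheaf Laplacian from per-edge restriction maps.
    # maps[(e, v)] is the (d, d) restriction map of node v on edge e.
    L = torch.zeros(n * d, n * d)
    for e, (u, v) in enumerate(edges):
        Fu, Fv = maps[(e, u)], maps[(e, v)]
        L[u*d:(u+1)*d, u*d:(u+1)*d] += Fu.T @ Fu   # diagonal block of u
        L[v*d:(v+1)*d, v*d:(v+1)*d] += Fv.T @ Fv   # diagonal block of v
        L[u*d:(u+1)*d, v*d:(v+1)*d] -= Fu.T @ Fv   # off-diagonal block (u, v)
        L[v*d:(v+1)*d, u*d:(u+1)*d] -= Fv.T @ Fu   # off-diagonal block (v, u)
    return L

def rayleigh_quotient(L, x):
    # Normalized smoothness measure: x^T L x / x^T x, zero iff x is in ker(L).
    x = x.reshape(-1)
    return (x @ L @ x) / (x @ x)

# Identity sheaf: every restriction map is I_d, so the sheaf Laplacian
# coincides with the ordinary graph Laplacian applied blockwise.
n, d = 4, 2
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
identity_maps = {(e, v): torch.eye(d) for e, uv in enumerate(edges) for v in uv}
L_id = sheaf_laplacian(edges, identity_maps, n, d)

x = torch.randn(n, d)       # generic features: strictly positive quotient
x_const = torch.ones(n, d)  # node-constant features: in ker(L), quotient 0
print(rayleigh_quotient(L_id, x).item())
print(rayleigh_quotient(L_id, x_const).item())

Because the quotient is normalized by x^T x, it is comparable across models whose feature scales differ, which is what makes it usable as a cross-model oversmoothing measure: values near zero indicate features close to the kernel of the Laplacian (i.e., heavily smoothed representations).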

Ferran Hernandez Caralt, Mar Gonzàlez i Català, Adrián Bazaga, Pietro Liò • 2026

Related benchmarks

Task                 Dataset    Accuracy (%)  Rank
Node Classification  Chameleon  67.35         640
Node Classification  Wisconsin  88.82         627
Node Classification  Texas      88.01         616
Node Classification  Squirrel   53.27         591
Node Classification  Cornell    85.95         582
