
Mamba-Based Graph Convolutional Networks: Tackling Over-smoothing with Selective State Space

About

Graph Neural Networks (GNNs) have shown great success in various graph-based learning tasks. However, they often face the issue of over-smoothing as model depth increases, which causes all node representations to converge to a single value and become indistinguishable. This issue stems from an inherent limitation of GNNs: they struggle to distinguish the importance of information from different neighborhoods. In this paper, we introduce MbaGCN, a novel graph convolutional architecture that draws inspiration from the Mamba paradigm, originally designed for sequence modeling. MbaGCN presents a new backbone for GNNs, consisting of three key components: the Message Aggregation Layer, the Selective State Space Transition Layer, and the Node State Prediction Layer. These components work in tandem to adaptively aggregate neighborhood information, providing greater flexibility and scalability for deep GNN models. While MbaGCN may not consistently outperform all existing methods on every dataset, it provides a foundational framework that demonstrates the effective integration of the Mamba paradigm into graph representation learning. Through extensive experiments on benchmark datasets, we demonstrate that MbaGCN paves the way for future advancements in graph neural network research.
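The three-component pipeline described above can be sketched in code. The layer names come from the abstract, but the concrete operations below (mean neighbor aggregation, a sigmoid-gated state update standing in for the selective state space transition, and a linear readout) are illustrative assumptions, not the paper's actual equations:

```python
import numpy as np

rng = np.random.default_rng(0)

def message_aggregation(A, X):
    # Mean-aggregate neighbor features using a row-normalized
    # adjacency matrix with self-loops (a common GCN-style choice).
    A_hat = A + np.eye(A.shape[0])
    D_inv = 1.0 / A_hat.sum(axis=1, keepdims=True)
    return D_inv * (A_hat @ X)

def selective_state_transition(H, S, W_gate, W_in):
    # Simplified "selective" update: a data-dependent gate decides how much
    # of the aggregated message enters the persistent node state, in the
    # spirit of Mamba's input-dependent state space parameters.
    gate = 1.0 / (1.0 + np.exp(-(H @ W_gate)))        # sigmoid gate
    return gate * np.tanh(H @ W_in) + (1.0 - gate) * S

def node_state_prediction(S, W_out):
    # Linear readout from the node state to class logits.
    return S @ W_out

# Toy graph: 4 nodes on a path, 3 features, 2 classes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
S = np.zeros((4, 3))                                  # initial node state
W_gate, W_in = rng.normal(size=(3, 3)), rng.normal(size=(3, 3))
W_out = rng.normal(size=(3, 2))

for _ in range(4):                                    # stack several layers
    H = message_aggregation(A, X)
    S = selective_state_transition(H, S, W_gate, W_in)

logits = node_state_prediction(S, W_out)
print(logits.shape)
```

Because the gate interpolates between the new message and the previous state, deep stacks need not collapse all nodes to one value, which is the intuition behind using a selective transition against over-smoothing.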

Xin He, Yili Wang, Wenqi Fan, Xu Shen, Xin Juan, Rui Miao, Xin Wang• 2025

Related benchmarks

Task                 Dataset     Metric           Result   Rank
Node Classification  Cora        Accuracy         87.79    885
Node Classification  Citeseer    Accuracy         76.68    804
Node Classification  Pubmed      Accuracy         89.32    742
Node Classification  Wisconsin   Accuracy         86.27    410
Node Classification  Actor       Accuracy         37.97    237
Node Classification  Photo       Mean Accuracy    94.41    165
Node Classification  Computers   Mean Accuracy    90.39    143
