
COMBA: Cross Batch Aggregation for Learning Large Graphs with Context Gating State Space Models

About

State space models (SSMs) have recently emerged for modeling long-range dependencies in sequence data, with much lower computational cost than modern alternatives such as transformers. Advancing SSMs to graph-structured data, especially large graphs, is a significant challenge because SSMs are sequence models, and the sheer volume of large graphs makes it very expensive to convert them into sequences for effective learning. In this paper, we propose COMBA to tackle large graph learning using state space models, with two key innovations: graph context gating and cross-batch aggregation. Graph context refers to the different hops of neighborhood around each node, and graph context gating allows COMBA to use this context to learn the best control of neighbor aggregation. For each graph context, COMBA samples nodes into batches and trains a graph neural network (GNN), with information aggregated across batches, allowing COMBA to scale to large graphs. Our theoretical study asserts that cross-batch aggregation guarantees lower error than training a GNN without aggregation. Experiments on benchmark networks demonstrate significant performance gains compared to baseline approaches. Code and benchmark datasets will be released for public access.
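To make the two ideas concrete, here is a minimal, purely illustrative sketch of what "graph context gating" and "cross-batch aggregation" could look like. All shapes, gate values, and the momentum-style aggregation are assumptions for illustration; the abstract does not specify COMBA's actual architecture or update rules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical shapes): a per-hop aggregated neighborhood
# feature for each node -- one matrix per "graph context" (hop).
num_nodes, dim, num_hops = 100, 8, 3
hop_feats = [rng.normal(size=(num_nodes, dim)) for _ in range(num_hops)]

# Graph context gating (illustrative): a gate per hop decides how much
# each neighborhood context contributes to the node representation.
# A softmax over fixed logits stands in for the learned gates.
gate_logits = np.array([1.0, 0.5, -0.5])
gates = np.exp(gate_logits) / np.exp(gate_logits).sum()
context = sum(g * h for g, h in zip(gates, hop_feats))  # (num_nodes, dim)

# Cross-batch aggregation (illustrative): process nodes in batches while
# carrying a running aggregate across batches, rather than treating each
# batch independently -- the abstract's motivation for scaling to large
# graphs. Here the aggregate is a simple exponential moving average.
batch_size, momentum = 25, 0.9
running_state = np.zeros(dim)
for start in range(0, num_nodes, batch_size):
    batch_summary = context[start:start + batch_size].mean(axis=0)
    running_state = momentum * running_state + (1 - momentum) * batch_summary
```

In this sketch the running state is what lets information learned from earlier batches influence later ones; the paper's theoretical claim is that this kind of aggregation yields lower error than batch-independent training.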

Jiajun Shen, Yufei Jin, Yi He, Xingquan Zhu • 2026

Related benchmarks

Task                  Dataset          Metric    Result  Rank
Node Classification   Ogbn-arxiv       Accuracy  71.6    191
Node Classification   amazon-ratings   Accuracy  50.7    138
Node Classification   OGBN-Products    Accuracy  73.5    62
Node Classification   tolokers         ROC AUC   84.5    47
Node Classification   Minesweeper      ROC AUC   94.2    46
Node Classification   ROMAN EMP.       Accuracy  89.5    24
