LLM as Graph Kernel: Rethinking Message Passing on Text-Rich Graphs

About

Text-rich graphs, which integrate complex structural dependencies with abundant textual information, are ubiquitous yet remain challenging for existing learning paradigms. Conventional methods and even LLM-hybrids compress rich text into static embeddings or summaries before structural reasoning, creating an information bottleneck and detaching updates from the raw content. We argue that in text-rich graphs, the text is not merely a node attribute but the primary medium through which structural relationships are manifested. We introduce RAMP, a Raw-text Anchored Message Passing approach that moves beyond using LLMs as mere feature extractors and instead recasts the LLM itself as a graph-native aggregation operator. RAMP exploits the text-rich nature of the graph via a novel dual-representation scheme: it anchors inference on each node's raw text during each iteration while propagating dynamically optimized messages from neighbors. It further handles both discriminative and generative tasks under a single unified generative formulation. Extensive experiments show that RAMP effectively bridges the gap between graph propagation and deep text reasoning, achieving competitive performance and offering new insights into the role of LLMs as graph kernels for general-purpose graph learning.
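The abstract's core idea, re-anchoring each iteration on a node's raw text while propagating LLM-generated messages from neighbors, can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the prompt format, and the `llm` callable are all hypothetical stand-ins, and the real RAMP presumably uses optimized prompts and a unified generative head for downstream tasks.

```python
from typing import Callable

def ramp_propagate(
    raw_text: dict[str, str],          # node id -> immutable raw text (the anchor)
    edges: list[tuple[str, str]],      # undirected edges between node ids
    llm: Callable[[str], str],         # LLM call: prompt -> updated message (stub)
    num_iters: int = 2,
) -> dict[str, str]:
    """Return each node's final message after `num_iters` rounds of propagation."""
    # Build an adjacency list from the edge set.
    nbrs: dict[str, list[str]] = {v: [] for v in raw_text}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)

    # Dual representation: static raw text plus a dynamically updated message,
    # initialized from the raw text itself.
    msg = dict(raw_text)
    for _ in range(num_iters):
        new_msg = {}
        for v in raw_text:
            neighbor_block = "\n".join(msg[u] for u in nbrs[v])
            # Inference is anchored on the node's raw text at EVERY iteration,
            # avoiding the one-time embedding/summarization bottleneck.
            prompt = (
                f"Node text:\n{raw_text[v]}\n\n"
                f"Neighbor messages:\n{neighbor_block}\n\n"
                "Summarize this node in the context of its neighbors."
            )
            new_msg[v] = llm(prompt)
        msg = new_msg
    return msg
```

Here the LLM plays the role of the aggregation operator that a learned mean/sum/attention kernel plays in a conventional GNN; the raw text never leaves the loop, so later iterations can still recover details a static embedding would have discarded.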

Ying Zhang, Hang Yu, Haipeng Zhang, Peng Di • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Node Classification | Cora | Accuracy | 84.87 | 1215 |
| Node Classification | Pubmed | Accuracy | 93.68 | 396 |
| Node Classification | Photo | Mean Accuracy | 76.21 | 343 |
| Node Classification | arXiv | Accuracy | 75.38 | 219 |
| Node Classification | Citeseer | Accuracy | 74.83 | 86 |
| Graph Question Answering | ExplaGraphs | Accuracy | 93.86 | 38 |
| Node Classification | History | Accuracy | 85.09 | 11 |
