
HINormer: Representation Learning On Heterogeneous Information Networks with Graph Transformer

About

Recent studies have highlighted the limitations of message-passing graph neural networks (GNNs), e.g., limited model expressiveness, over-smoothing, and over-squashing. To alleviate these issues, Graph Transformers (GTs) have been proposed, which allow message passing over a larger coverage, even across the whole graph. Hinging on the global-range attention mechanism, GTs have shown strong performance for representation learning on homogeneous graphs. However, the investigation of GTs on heterogeneous information networks (HINs) remains under-explored. In particular, owing to their heterogeneity, HINs exhibit distinct data characteristics and thus require different treatment. To bridge this gap, in this paper we investigate representation learning on HINs with Graph Transformers, and propose a novel model named HINormer, which capitalizes on a larger-range aggregation mechanism for node representation learning. In particular, assisted by two major modules, i.e., a local structure encoder and a heterogeneous relation encoder, HINormer can capture both the structural and heterogeneous information of nodes on HINs for comprehensive node representations. We conduct extensive experiments on four HIN benchmark datasets, which demonstrate that our proposed model outperforms the state-of-the-art.
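The pipeline described above (a local structure encoder plus a heterogeneous relation encoder feeding a graph-wide attention layer) can be sketched in a minimal, self-contained NumPy example. This is an illustrative toy, not the authors' implementation: the function names, the random (untrained) weight matrices, and the simple k-hop mean aggregation are all assumptions made for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_structure_encoder(X, A, hops=2):
    # Toy local structure encoder: mean-aggregate features over k hops
    # using a row-normalized adjacency matrix.
    A_norm = A / np.maximum(A.sum(axis=1, keepdims=True), 1)
    H = X.copy()
    for _ in range(hops):
        H = A_norm @ H
    return H

def heterogeneous_relation_encoder(node_types, dim):
    # Toy heterogeneous relation encoder: look up an embedding per
    # node type (random here; learnable in a real model).
    rng = np.random.default_rng(0)
    type_table = rng.normal(size=(node_types.max() + 1, dim))
    return type_table[node_types]

def global_attention(H, d_k):
    # Single-head self-attention over ALL nodes, giving every node a
    # graph-wide receptive field (the GT "larger-range aggregation").
    rng = np.random.default_rng(1)
    Wq, Wk, Wv = (rng.normal(size=(H.shape[1], d_k)) for _ in range(3))
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = softmax(Q @ K.T / np.sqrt(d_k))
    return scores @ V

# Toy HIN: 4 nodes, 2 node types, symmetric adjacency.
X = np.eye(4)                                   # one-hot node features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
types = np.array([0, 1, 0, 1])                  # node-type labels

H_struct = local_structure_encoder(X, A)        # structural information
H_rel = heterogeneous_relation_encoder(types, dim=4)  # type information
Z = global_attention(H_struct + H_rel, d_k=8)   # fused, graph-wide attention
print(Z.shape)  # (4, 8)
```

The point of the sketch is the division of labor: structural and type signals are encoded separately, summed, and only then passed to attention, so the Transformer attends over heterogeneity-aware node representations rather than raw features.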

Qiheng Mao, Zemin Liu, Chenghao Liu, Jianling Sun • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Node Classification | IMDB | Macro F1 Score | 0.6465 | 179 |
| Node Classification | ACM | Macro F1 | 93.95 | 104 |
| Node Classification | DBLP | Micro-F1 | 94.94 | 94 |
| Node Classification | Freebase | Macro F1 | 49.94 | 43 |
| Node Classification | DBLP | Micro-F1 | 94.2 | 24 |
