
Hypergraph Convolution and Hypergraph Attention

About

Recently, graph neural networks have attracted great attention and achieved prominent performance in various research fields. Most of these algorithms assume pairwise relationships between the objects of interest. However, in many real applications, the relationships between objects are of higher order, going beyond a pairwise formulation. To efficiently learn deep embeddings on such high-order graph-structured data, we introduce two end-to-end trainable operators to the family of graph neural networks, i.e., hypergraph convolution and hypergraph attention. Whilst hypergraph convolution defines the basic formulation of performing convolution on a hypergraph, hypergraph attention further enhances the capacity of representation learning by leveraging an attention module. With the two operators, a graph neural network is readily extended to a more flexible model and applied to diverse applications where non-pairwise relationships are observed. Extensive experimental results on semi-supervised node classification demonstrate the effectiveness of hypergraph convolution and hypergraph attention.
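As a concrete illustration of the hypergraph convolution operator described above, here is a minimal NumPy sketch of one symmetric propagation step, X' = σ(D⁻¹ᐟ² H W B⁻¹ Hᵀ D⁻¹ᐟ² X Θ), where H is the vertex–hyperedge incidence matrix, W holds the hyperedge weights, D and B are the vertex- and hyperedge-degree matrices, and Θ is the learnable projection. The function name, toy shapes, and the choice of ReLU as the nonlinearity are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hypergraph_conv(X, H, w, Theta):
    """One hypergraph convolution step (illustrative sketch).

    X     : (n, d_in)     node features
    H     : (n, m)        incidence matrix; H[v, e] = 1 if vertex v lies in hyperedge e
    w     : (m,)          hyperedge weights
    Theta : (d_in, d_out) learnable projection matrix
    """
    W = np.diag(w)
    d_v = (H * w).sum(axis=1)             # vertex degrees: weighted count of incident hyperedges
    d_e = H.sum(axis=0)                   # hyperedge degrees: vertices per hyperedge
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
    B_inv = np.diag(1.0 / d_e)
    # symmetric propagation: D^{-1/2} H W B^{-1} H^T D^{-1/2} X Theta
    out = Dv_inv_sqrt @ H @ W @ B_inv @ H.T @ Dv_inv_sqrt @ X @ Theta
    return np.maximum(out, 0.0)           # ReLU nonlinearity (assumed here)
```

With unit hyperedge weights and a graph where every hyperedge joins exactly two vertices, this reduces to a normalized pairwise graph convolution, which is why the operator is described as a generalization of ordinary graph convolution.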

Song Bai, Feihu Zhang, Philip H.S. Torr • 2019

Related benchmarks

| Task                | Dataset            | Metric        | Result | Rank |
|---------------------|--------------------|---------------|--------|------|
| Node Classification | Cora               | -             | -      | 1215 |
| Node Classification | Citeseer           | Accuracy      | 72.42  | 931  |
| Node Classification | Cora (test)        | Mean Accuracy | 79.14  | 861  |
| Node Classification | Citeseer (test)    | Accuracy      | 0.7242 | 824  |
| Node Classification | Pubmed             | Accuracy      | 84.56  | 819  |
| Node Classification | PubMed (test)      | Accuracy      | 86.41  | 546  |
| Node Classification | Pubmed             | Accuracy      | 86.41  | 178  |
| Node Classification | Citeseer           | Mean Accuracy | 72.42  | 90   |
| Node Classification | DBLP CA            | Accuracy      | 90.92  | 37   |
| Node Classification | Cora HOMO. (test)  | Mean Accuracy | 79.23  | 30   |
Showing 10 of 63 rows
