From Hypergraph Energy Functions to Hypergraph Neural Networks

About

Hypergraphs are a powerful abstraction for representing higher-order interactions between entities of interest. To exploit these relationships in making downstream predictions, a variety of hypergraph neural network architectures have recently been proposed, in large part building upon precursors from the more traditional graph neural network (GNN) literature. Somewhat differently, in this paper we begin by presenting an expressive family of parameterized, hypergraph-regularized energy functions. We then demonstrate how minimizers of these energies effectively serve as node embeddings that, when paired with a parameterized classifier, can be trained end-to-end via a supervised bilevel optimization process. Later, we draw parallels between the implicit architecture of the predictive models emerging from the proposed bilevel hypergraph optimization, and existing GNN architectures in common use. Empirically, we demonstrate state-of-the-art results on various hypergraph node classification benchmarks. Code is available at https://github.com/yxzwang/PhenomNN.
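The core idea — node embeddings obtained as minimizers of a hypergraph-regularized energy — can be illustrated with a toy sketch. Note this is a simplified stand-in, not the paper's PhenomNN formulation: the quadratic energy, the clique-expansion Laplacian, and the fixed step count below are all illustrative assumptions (the paper's energy family is more expressive, and training is bilevel with a learned classifier on top).

```python
import numpy as np

def clique_expansion_laplacian(n_nodes, hyperedges):
    # Reduce the hypergraph to an ordinary graph by connecting all node
    # pairs within each hyperedge (clique expansion -- a common reduction,
    # assumed here for simplicity), then form the graph Laplacian L = D - A.
    A = np.zeros((n_nodes, n_nodes))
    for e in hyperedges:
        for i in e:
            for j in e:
                if i != j:
                    A[i, j] = 1.0
    D = np.diag(A.sum(axis=1))
    return D - A

def energy_minimizing_embeddings(X, L, lam=1.0, steps=50, lr=0.1):
    # Minimize the toy energy E(Y) = ||Y - X||_F^2 + lam * tr(Y^T L Y)
    # by gradient descent. The first term keeps embeddings close to the
    # input features; the second smooths them over the (expanded)
    # hypergraph. Unrolled descent steps play the role of network layers.
    Y = X.copy()
    for _ in range(steps):
        grad = 2.0 * (Y - X) + 2.0 * lam * (L @ Y)
        Y -= lr * grad
    return Y

# Tiny example: 4 nodes, one 3-node hyperedge and one 2-node hyperedge.
hyperedges = [[0, 1, 2], [2, 3]]
X = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
L = clique_expansion_laplacian(4, hyperedges)
Y = energy_minimizing_embeddings(X, L)
```

Because this particular energy is quadratic, the iteration converges to the closed-form minimizer `(I + lam*L)^(-1) X`; the resulting `Y` would then be fed to a classifier trained end-to-end, with the energy's parameters updated through the unrolled minimization.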

Yuxin Wang, Quan Gan, Xipeng Qiu, Xuanjing Huang, David Wipf • 2023

Related benchmarks

Task                 Dataset            Metric          Result   Rank
Node Classification  Cora               Accuracy        88.12    1215
Node Classification  Citeseer           Accuracy        74.45    931
Node Classification  Cora (test)        Mean Accuracy   88.12    861
Node Classification  Citeseer (test)    Accuracy        0.7721   824
Node Classification  Pubmed             Accuracy        78.12    819
Node Classification  Chameleon          Accuracy        43.62    640
Node Classification  Squirrel           Accuracy        39.45    591
Node Classification  Chameleon (test)   Mean Accuracy   43.62    297
Node Classification  Cornell (test)     Mean Accuracy   72.16    274
Node Classification  Texas (test)       Mean Accuracy   81.49    269

(Showing 10 of 31 rows)
