
Be More with Less: Hypergraph Attention Networks for Inductive Text Classification

About

Text classification is a critical research topic with broad applications in natural language processing. Recently, graph neural networks (GNNs) have received increasing attention in the research community and demonstrated promising results on this canonical task. Despite this success, their performance can be largely jeopardized in practice since they are: (1) unable to capture high-order interactions among words; (2) inefficient at handling large datasets and new documents. To address these issues, in this paper, we propose a principled model -- hypergraph attention networks (HyperGAT), which can obtain more expressive power with less computational consumption for text representation learning. Extensive experiments on various benchmark datasets demonstrate the efficacy of the proposed approach on the text classification task.
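To make the core idea concrete, here is a hedged, illustrative sketch of document-level hypergraph attention: words are nodes, each sentence is a hyperedge connecting all of its words (one hyperedge type discussed for HyperGAT; the paper also uses semantic hyperedges), and each layer aggregates node-to-hyperedge and then hyperedge-to-node with attention. The function names, the toy dot-product attention scoring, and the parameter shapes below are our own simplifications, not the paper's exact trainable layer.

```python
import numpy as np

def build_incidence(doc_sentences, vocab):
    """Build a node-hyperedge incidence matrix H of shape (|V|, |E|).

    Nodes are vocabulary words; each sentence is one hyperedge that
    connects all words it contains, so a hyperedge can link more than
    two words at once (the high-order interaction a pairwise graph edge
    cannot express).
    """
    H = np.zeros((len(vocab), len(doc_sentences)))
    word2idx = {w: i for i, w in enumerate(vocab)}
    for e, sent in enumerate(doc_sentences):
        for w in sent:
            H[word2idx[w], e] = 1.0
    return H

def softmax(x):
    x = x - x.max()  # numerical stability
    e = np.exp(x)
    return e / e.sum()

def hyperedge_attention_layer(X, H, W, a):
    """One simplified hypergraph attention step.

    Step 1 (node -> hyperedge): each hyperedge aggregates the projected
    features of its member nodes with attention weights.
    Step 2 (hyperedge -> node): each node aggregates the features of its
    incident hyperedges, again with attention.
    Scoring here is a toy dot product with a fixed vector `a`; the real
    HyperGAT layer learns its attention parameters.
    """
    Xp = X @ W                                  # projected node features
    n_nodes, n_edges = H.shape
    # Step 1: node -> hyperedge aggregation
    E = np.zeros((n_edges, Xp.shape[1]))
    for e in range(n_edges):
        members = np.where(H[:, e] > 0)[0]
        scores = softmax(Xp[members] @ a)       # attention over member words
        E[e] = scores @ Xp[members]
    # Step 2: hyperedge -> node aggregation
    Xn = np.zeros_like(Xp)
    for v in range(n_nodes):
        incident = np.where(H[v] > 0)[0]
        scores = softmax(E[incident] @ a)       # attention over hyperedges
        Xn[v] = scores @ E[incident]
    return Xn
```

Because each hyperedge touches many words at once, a document needs only one hyperedge per sentence rather than a dense word-word edge set, which is the "more expressive power with less computational consumption" trade-off the abstract refers to.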

Kaize Ding, Jianling Wang, Jundong Li, Dingcheng Li, Huan Liu • 2020

Related benchmarks

Task                     Dataset                        Accuracy  Rank
Graph Classification     Mutag (test)                   68.51     217
Text Classification      MR (test)                      78.32     148
Text Classification      20News                         85.16     127
Text Classification      MR                             77.33     106
Text Classification      R8                             96.13     71
Text Classification      R8 (test)                      97.97     56
Text Classification      R52                            92.94     56
Document Classification  Ohsumed (test)                 69.9      54
Text Classification      movie review dataset (test)    55.68     35
Text Classification      R52 (test)                     94.98     30
Showing 10 of 24 rows
