
Text Level Graph Neural Network for Text Classification

About

Recently, researchers have explored graph neural network (GNN) techniques for text classification, since GNNs handle complex structures well and preserve global information. However, previous GNN-based methods face two practical problems: a fixed corpus-level graph structure, which does not support online testing, and high memory consumption. To tackle these problems, we propose a new GNN-based model that builds a graph for each input text with globally shared parameters, instead of a single graph for the whole corpus. This removes the dependence of an individual text on the entire corpus, which enables online testing while still preserving global information. Besides, we build graphs from much smaller windows within the text, which not only extracts more local features but also significantly reduces the number of edges and the memory consumption. Experiments show that our model outperforms existing models on several text classification datasets while consuming less memory.
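The per-text graph construction described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes undirected word co-occurrence edges within a small sliding window, and the function name, window handling, and edge representation are all illustrative choices.

```python
from collections import defaultdict

def build_text_graph(tokens, window_size=3):
    """Build edges for a single input text: each word is connected to
    the words that follow it within a small sliding window.
    (Assumption: undirected co-occurrence edges keyed by sorted word
    pairs; the paper's model additionally shares edge/node parameters
    globally across texts, which is not shown here.)"""
    edges = defaultdict(int)
    for i, w in enumerate(tokens):
        # Only look ahead within the window, so the edge count grows
        # linearly with text length rather than with corpus size.
        for j in range(i + 1, min(i + window_size, len(tokens))):
            pair = tuple(sorted((w, tokens[j])))
            edges[pair] += 1
    return dict(edges)

# Each text gets its own small graph, so unseen texts can be
# processed at test time without rebuilding a corpus-level graph.
g = build_text_graph("the cat sat on the mat".split(), window_size=2)
```

With `window_size=2` only adjacent words are linked, yielding five distinct edges for the six-token example; enlarging the window trades memory for longer-range co-occurrence features.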

Lianzhe Huang, Dehong Ma, Sujian Li, Xiaodong Zhang, Houfeng Wang • 2019

Related benchmarks

| Task                    | Dataset        | Accuracy | Rank |
|-------------------------|----------------|----------|------|
| Text Classification     | R8 (test)      | 97.8     | 56   |
| Document Classification | Ohsumed (test) | 69.4     | 54   |
| Document Classification | R52 (test)     | 94.6     | 29   |
