
Cross-lingual Text Classification with Heterogeneous Graph Neural Network

About

Cross-lingual text classification aims to train a classifier on a source language and transfer the knowledge to target languages, which is very useful for low-resource languages. Recent multilingual pretrained language models (mPLM) achieve impressive results on cross-lingual classification tasks, but rarely consider factors beyond semantic similarity, causing performance degradation between some language pairs. In this paper, we propose a simple yet effective method that incorporates heterogeneous information within and across languages for cross-lingual text classification using graph convolutional networks (GCN). In particular, we construct a heterogeneous graph by treating documents and words as nodes and linking nodes with different relations, which include part-of-speech roles, semantic similarity, and document translations. Extensive experiments show that our graph-based method significantly outperforms state-of-the-art models on all tasks, and also achieves consistent performance gains over baselines in low-resource settings where external tools like translators are unavailable.
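The core idea — documents and words as nodes of a heterogeneous graph, propagated with a GCN — can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the toy graph, edge choices, feature dimensions, and the standard symmetric normalization are all assumptions for demonstration.

```python
import numpy as np

def normalize_adj(adj):
    """Symmetric GCN normalization: D^-1/2 (A + I) D^-1/2."""
    adj = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
    return adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(adj_norm, features, weight):
    """One GCN propagation step: ReLU(A_hat X W)."""
    return np.maximum(adj_norm @ features @ weight, 0.0)

# Toy heterogeneous graph: nodes 0-1 are documents (a source-language
# document and its translation), nodes 2-4 are words.
num_nodes = 5
adj = np.zeros((num_nodes, num_nodes))
for doc, word in [(0, 2), (0, 3), (1, 3), (1, 4)]:  # document-word edges
    adj[doc, word] = adj[word, doc] = 1.0
adj[0, 1] = adj[1, 0] = 1.0   # document-translation edge across languages

rng = np.random.default_rng(0)
features = rng.standard_normal((num_nodes, 8))  # e.g. mPLM node embeddings
weight = rng.standard_normal((8, 4))            # learnable layer weight

hidden = gcn_layer(normalize_adj(adj), features, weight)
print(hidden.shape)  # (5, 4): one 4-dim hidden vector per node
```

In the full method, edges would additionally be typed (part-of-speech roles, semantic similarity, translation links) and the document-node representations fed to a classification head.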

Ziyun Wang, Xuan Liu, Peiji Yang, Shixing Liu, Zhisheng Wang • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Sentiment Classification | Amazon FR (test) | -- | 8 |
| Intent Classification | Multilingual SLU EN-ES (test) | Accuracy: 96.81 | 6 |
| Intent Classification | Multilingual SLU EN-TH (test) | Accuracy: 89.71 | 6 |
| Sentiment Classification | Amazon Review EN → DE (test) | Accuracy (Books): 0.927 | 6 |
| Sentiment Classification | Amazon Review EN → JA (test) | Accuracy (Books): 87.21 | 6 |
| News Classification | XGLUE News Classification (test) | Accuracy (DE): 85 | 5 |

Other info

Code
