
Cross-lingual Contextualized Topic Models with Zero-shot Learning

About

Many data sets (e.g., reviews, forums, news) exist in parallel in multiple languages. They cover the same content, but the linguistic differences make it impossible to use traditional, bag-of-words-based topic models. Models have to be either single-language or suffer from a huge but extremely sparse vocabulary. Both issues can be addressed by transfer learning. In this paper, we introduce a zero-shot cross-lingual topic model. Our model learns topics in one language (here, English) and predicts them for unseen documents in different languages (here, Italian, French, German, and Portuguese). We evaluate the quality of the topic predictions for the same document in different languages. Our results show that the transferred topics are coherent and stable across languages, which suggests exciting future research directions.

Federico Bianchi, Silvia Terragni, Dirk Hovy, Debora Nozza, Elisabetta Fersini • 2020
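The key mechanism the abstract describes is that topic inference at test time depends only on a language-agnostic document embedding, not on the bag-of-words representation, so a model trained on English documents can assign topics to documents in other languages. The following is a minimal, illustrative sketch of that zero-shot prediction step, not the authors' implementation: `embed` is a deterministic fake stand-in for a multilingual sentence encoder, and the trained inference network is replaced by a random linear map followed by a softmax.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM, N_TOPICS = 16, 5  # toy sizes; a real multilingual encoder is ~768-dim

def embed(doc: str) -> np.ndarray:
    # Fake multilingual embedding: deterministic pseudo-random vector keyed
    # on the text. In the real model, parallel documents in different
    # languages map to nearby points in this shared space.
    seed = abs(hash(doc)) % (2**32)
    return np.random.default_rng(seed).normal(size=EMB_DIM)

# Stand-in for the inference network learned at training time on English
# documents only: a linear map from embedding space to topic logits.
W = rng.normal(size=(N_TOPICS, EMB_DIM))

def predict_topics(doc: str) -> np.ndarray:
    # Zero-shot step: only the embedding is needed, so the input document
    # can be in any language the encoder covers.
    z = W @ embed(doc)
    e = np.exp(z - z.max())
    return e / e.sum()  # document-topic distribution (sums to 1)

# An Italian document unseen at training time still gets a topic distribution.
theta = predict_topics("Il gatto dorme sul divano.")
print(theta.shape)
```

The design point this illustrates: because the bag-of-words input is only needed for the training-time reconstruction objective, the test-time vocabulary mismatch between languages never arises.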

Related benchmarks

Task                 Dataset        Metric  Result  Rank
Topic Modeling       20NG           NPMI    0.103   23
Topic Modeling       BBC            NPMI    0.038   17
Document Clustering  M10 (test)     NMI     0.46    13
Document Clustering  SS (test)      NMI     0.509   13
Document Clustering  Pascal (test)  NMI     0.465   13
Topic Modeling       Pascal         NPMI    0.005   13
Topic Modeling       Bio            NPMI    0.133   13
Topic Modeling       BBC            IRBO    1       13
Topic Modeling       DBLP           IRBO    100     13
Document Clustering  Bio (test)     NMI     0.406   13

Showing 10 of 31 rows
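The NPMI scores in the table measure topic coherence: for each pair of top words in a topic, normalized pointwise mutual information is estimated from document co-occurrence counts and ranges from -1 (never co-occur) to 1 (always co-occur). A small self-contained example of the computation, using a toy corpus of word sets (not any of the benchmark data sets above):

```python
import math

# Toy corpus: each document is the set of words it contains.
docs = [
    {"topic", "model"},
    {"topic", "model"},
    {"language", "word"},
    {"grammar", "word"},
]

def npmi(w1: str, w2: str, docs) -> float:
    # Estimate marginal and joint probabilities from document frequencies.
    n = len(docs)
    p1 = sum(w1 in d for d in docs) / n
    p2 = sum(w2 in d for d in docs) / n
    p12 = sum(w1 in d and w2 in d for d in docs) / n
    if p12 == 0:
        return -1.0  # conventional value when the pair never co-occurs
    # PMI normalized by -log p(w1, w2), bounding the score to [-1, 1].
    return math.log(p12 / (p1 * p2)) / -math.log(p12)

# "topic" and "model" always co-occur in this corpus -> NPMI of 1.
print(npmi("topic", "model", docs))
# "topic" and "word" never co-occur -> NPMI of -1.
print(npmi("topic", "word", docs))
```

A topic's coherence score is typically the average NPMI over all pairs of its top-N words.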
