
Cross-lingual Contextualized Topic Models with Zero-shot Learning

About

Many datasets (e.g., reviews, forums, news) exist in parallel in multiple languages. They cover the same content, but the linguistic differences make it impossible to use traditional bag-of-words topic models: models must either be single-language or suffer from a huge but extremely sparse vocabulary. Both issues can be addressed by transfer learning. In this paper, we introduce a zero-shot cross-lingual topic model. Our model learns topics in one language (here, English) and predicts them for unseen documents in different languages (here, Italian, French, German, and Portuguese). We evaluate the quality of the topic predictions for the same document in different languages. Our results show that the transferred topics are coherent and stable across languages, which suggests exciting future research directions.
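The core idea can be sketched in a few lines: the topic model's inference network consumes a language-agnostic document embedding rather than a bag-of-words vector, so a model trained on English documents can assign topic distributions to documents in unseen languages that land nearby in the shared embedding space. The toy embeddings and the single-layer inference network below are illustrative assumptions, not the paper's actual architecture (which uses a variational encoder over multilingual sentence embeddings).

```python
import numpy as np

# Toy sketch of the zero-shot transfer idea. The 4-dim vectors stand in
# for real multilingual sentence embeddings (an assumption for
# illustration); the inference network is a single linear layer +
# softmax rather than the paper's variational encoder.

rng = np.random.default_rng(0)
n_topics, emb_dim = 3, 4

# "Trained" inference-network weights (random here, for the sketch).
W = rng.normal(size=(emb_dim, n_topics))

def topic_distribution(doc_embedding):
    """Map a document embedding to a distribution over topics."""
    logits = doc_embedding @ W
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# An English document and a hypothetical aligned translation lie close
# together in the shared embedding space...
en_doc = np.array([0.9, 0.1, -0.2, 0.5])
it_doc = en_doc + rng.normal(scale=0.01, size=emb_dim)  # small noise

theta_en = topic_distribution(en_doc)
theta_it = topic_distribution(it_doc)

# ...so the model predicts nearly identical topic distributions for
# both, without ever training on the second language.
print(np.abs(theta_en - theta_it).max())
```

Because inference depends only on the embedding, not on any language-specific vocabulary, the sparse cross-lingual vocabulary problem from the abstract disappears by construction.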

Federico Bianchi, Silvia Terragni, Dirk Hovy, Debora Nozza, Elisabetta Fersini• 2020

Related benchmarks

Task                 Dataset        Metric  Result  Rank
-------------------  -------------  ------  ------  ----
Topic Modeling       20NG           NPMI     0.103    33
Topic Modeling       Jamuna News    CV       0.66     29
Topic Modeling       NCTBText       CV       0.64     29
Topic Modeling       BanFakeNews    CV       0.6      25
Topic Modeling       M10            NPMI     0.041    23
Topic Modeling       DBLP           NPMI    -0.062    23
Topic Modeling       BBC            NPMI     0.038    17
Document Clustering  M10 (test)     NMI      0.46     13
Document Clustering  SS (test)      NMI      0.509    13
Document Clustering  Pascal (test)  NMI      0.465    13

Showing 10 of 34 rows
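The NPMI column above is a topic-coherence score: for each pair of top words in a topic, NPMI = PMI / (-log p(w_i, w_j)), averaged over pairs, where probabilities come from document co-occurrence counts. It ranges from -1 (words never co-occur) to 1 (words always co-occur). A minimal sketch over a tiny illustrative corpus:

```python
import math
from itertools import combinations

# Toy corpus: each document is its set of words (illustrative only).
corpus = [
    {"market", "stock", "trade"},
    {"market", "stock", "price"},
    {"goal", "match", "team"},
    {"goal", "team", "market"},
]

def npmi(topic_words, docs, eps=1e-12):
    """Average NPMI over all pairs of the topic's top words."""
    n = len(docs)

    def p(*words):
        # Fraction of documents containing all the given words.
        return sum(all(w in d for w in words) for d in docs) / n

    scores = []
    for wi, wj in combinations(topic_words, 2):
        p_ij = p(wi, wj)
        if p_ij == 0:
            scores.append(-1.0)  # never co-occur: worst possible score
            continue
        pmi = math.log(p_ij / (p(wi) * p(wj) + eps))
        scores.append(pmi / (-math.log(p_ij) + eps))
    return sum(scores) / len(scores)

coherent = npmi(["market", "stock"], corpus)  # co-occur in half the docs
incoherent = npmi(["stock", "goal"], corpus)  # never co-occur
print(coherent, incoherent)
```

Topic pairs that frequently appear in the same documents score higher, which is why NPMI is used as a proxy for human judgments of topic quality (CV in the table is a related co-occurrence-based coherence measure).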
