
Dynamic Contextualized Word Embeddings

About

Static word embeddings that represent words by a single vector cannot capture the variability of word meaning in different linguistic and extralinguistic contexts. Building on prior work on contextualized and dynamic word embeddings, we introduce dynamic contextualized word embeddings that represent words as a function of both linguistic and extralinguistic context. Based on a pretrained language model (PLM), dynamic contextualized word embeddings model time and social space jointly, which makes them attractive for a range of NLP tasks involving semantic variability. We highlight potential application scenarios by means of qualitative and quantitative analyses on four English datasets.
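The core idea above can be sketched in a few lines: a token's representation is not a fixed vector but a base embedding plus an offset predicted from extralinguistic context (a timestamp and a social community). This is a minimal, hypothetical parameterization for illustration only, not the authors' exact architecture (the paper injects the dynamic offset into a pretrained language model rather than using random stand-in embeddings); the linear maps `W_time` and `W_social` are assumptions.

```python
import numpy as np

D = 8  # toy embedding dimension
rng = np.random.default_rng(0)

def plm_embedding(token_id: int) -> np.ndarray:
    """Stand-in for a PLM's embedding of a token: deterministic per token."""
    return np.random.default_rng(token_id).normal(size=D)

# Hypothetical dynamic component: a linear offset conditioned on
# extralinguistic context (time as a scalar, social space as a
# community index over 3 toy communities).
W_time = rng.normal(scale=0.1, size=(D,))
W_social = rng.normal(scale=0.1, size=(3, D))

def dynamic_embedding(token_id: int, t: float, community: int) -> np.ndarray:
    """Represent a word as a function of linguistic AND extralinguistic context."""
    base = plm_embedding(token_id)          # linguistic context (stand-in)
    offset = t * W_time + W_social[community]  # extralinguistic context
    return base + offset

# Same token, different time and community -> different vectors,
# capturing semantic variability across contexts.
e1 = dynamic_embedding(42, t=0.0, community=0)
e2 = dynamic_embedding(42, t=1.0, community=2)
```

Under this sketch, downstream tasks that are sensitive to semantic change over time or across communities can consume `e1`/`e2` in place of a single static vector per word.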

Valentin Hofmann, Janet B. Pierrehumbert, Hinrich Schütze • 2020

Related benchmarks

Task                      Dataset         Result            Rank
Language Modeling         Yelp (test)     PPL 4.723         35
Masked Language Modeling  Reddit (test)   Perplexity 9.555  6
Masked Language Modeling  arXiv (test)    Perplexity 3.53   6
Masked Language Modeling  Ciao (test)     Perplexity 5.91   6
