
Moving Down the Long Tail of Word Sense Disambiguation with Gloss-Informed Biencoders

About

A major obstacle in Word Sense Disambiguation (WSD) is that word senses are not uniformly distributed, causing existing models to generally perform poorly on senses that are either rare or unseen during training. We propose a bi-encoder model that independently embeds (1) the target word with its surrounding context and (2) the dictionary definition, or gloss, of each sense. The encoders are jointly optimized in the same representation space, so that sense disambiguation can be performed by finding the nearest sense embedding for each target word embedding. Our system outperforms previous state-of-the-art models on English all-words WSD; these gains predominantly come from improved performance on rare senses, leading to a 31.1% error reduction on less frequent senses over prior work. This demonstrates that rare senses can be more effectively disambiguated by modeling their definitions.
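The core inference step described above — embedding the target word in context, embedding each candidate sense's gloss, and selecting the nearest gloss embedding — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the actual system uses jointly trained BERT-based context and gloss encoders, while here the embeddings are hand-made vectors and `disambiguate` is a hypothetical helper name.

```python
import numpy as np

def disambiguate(context_vec, gloss_vecs):
    """Pick the sense whose gloss embedding scores highest against the
    target word's contextual embedding (dot-product scoring)."""
    scores = gloss_vecs @ context_vec
    return int(np.argmax(scores))

# Toy 3-d "embeddings" (illustrative only; in the paper these come
# from the trained context and gloss encoders).
context = np.array([0.9, 0.1, 0.0])  # "bank" as used in "river bank"
glosses = np.array([
    [0.1, 0.9, 0.0],  # sense 0: "a financial institution"
    [0.8, 0.2, 0.1],  # sense 1: "sloping land beside a body of water"
])
print(disambiguate(context, glosses))  # → 1 (the river-bank sense)
```

Because every sense is represented by an embedding of its definition rather than a trained output class, rare and even unseen senses can be scored the same way as frequent ones — which is where the reported error reductions come from.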

Terra Blevins, Luke Zettlemoyer • 2020

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Word Sense Disambiguation | SensEval-3 (test) | F1 Score: 77.4 | 51 |
| Word Sense Disambiguation | SemEval Task 7 (S7-T7) 2007 (test) | F1 Score: 74.5 | 29 |
| Word Sense Disambiguation | English All-Words Average (test) | -- | 19 |
| Word Sense Disambiguation | SemEval-13 (SE13) 3.0 (test) | F1 Score: 79.7 | 16 |
| Word Sense Disambiguation | Senseval-2 (SE2) 3.0 (test) | F1 Score: 79.4 | 16 |
| Word Sense Disambiguation | SemEval-15 (SE15) 3.0 (test) | F1 Score: 81.7 | 16 |
| Word Sense Disambiguation | All-Words WSD Concatenation SE2+SE3+SE13+SE15 3.0 (test) | Overall F1: 79 | 16 |
| Word Sense Disambiguation | SemEval-07 3.0 (dev) | F1 Score: 74.5 | 14 |
| Word Sense Disambiguation | SemEval English 2015 (test) | F1 (all): 80.9 | 13 |
| Word Sense Disambiguation | SemEval 2007 | F1 Score: 59.4 | 13 |
Showing 10 of 24 benchmark rows.
