
Semi-supervised Word Sense Disambiguation with Neural Models

About

Determining the intended sense of words in text - word sense disambiguation (WSD) - is a long-standing problem in natural language processing. Recently, researchers have shown promising results using word vectors extracted from a neural network language model as features in WSD algorithms. However, a simple average or concatenation of the word vectors for each word in a text loses the sequential and syntactic information of the text. In this paper, we study WSD with a sequence-learning neural network, an LSTM, to better capture the sequential and syntactic patterns of the text. To alleviate the lack of training data in all-words WSD, we employ the same LSTM in a semi-supervised label-propagation classifier. We demonstrate state-of-the-art results, especially on verbs.
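The supervised component of this approach can be sketched as nearest-neighbor sense classification over context vectors: the LSTM produces a vector for the target word's context, and the predicted sense is the one whose labeled example context is most similar. The sketch below is illustrative only - `nearest_sense`, the sense keys, and the toy 2-d vectors are assumptions for the example, not the paper's actual code; in the paper the vectors come from the trained LSTM language model.

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def nearest_sense(context_vec, labeled_examples):
    # labeled_examples: list of (sense_label, context_vector) pairs taken
    # from sense-annotated sentences. Return the sense of the most
    # similar labeled context (1-nearest-neighbor by cosine similarity).
    return max(labeled_examples, key=lambda sv: cosine(context_vec, sv[1]))[0]

# Toy illustration with hypothetical 2-d context vectors for "bank":
labeled = [("bank%finance", [1.0, 0.1]),
           ("bank%river", [0.1, 1.0])]
print(nearest_sense([0.9, 0.2], labeled))  # closest to the finance sense
```

The semi-supervised step extends this idea: label propagation spreads the sense labels from the few annotated contexts to many unlabeled ones through a similarity graph before classification.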

Dayu Yuan, Julian Richardson, Ryan Doherty, Colin Evans, Eric Altendorf · 2016

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Word Sense Disambiguation | SemEval-2007 Task 17 (test) | F1 Score | 63.7 | 36 |
| Word Sense Disambiguation | SemEval Task 7 (S7-T7) 2007 (test) | F1 Score | 84.3 | 29 |
| Word Sense Disambiguation | Senseval-3 English all-words | F1 Score | 71.8 | 14 |
| Word Sense Disambiguation | SemEval English 2015 (test) | F1 (all) | 72.6 | 13 |
| Word Sense Disambiguation | SemEval Task 17 all-words 2007 | F1 (Nouns) | 72.3 | 9 |
| Word Sense Disambiguation | Senseval-2 all-words | F1 (All Words) | 74.4 | 8 |
| Word Sense Disambiguation | SemEval Coarse-grained 2007 (all-words) | F1 (Nouns) | 83.4 | 8 |
| Word Sense Disambiguation | SemEval-2013 Task 12 (nouns) | F1 (Nouns) | 69.5 | 7 |
