
Improved Word Sense Disambiguation Using Pre-Trained Contextualized Word Representations

About

Contextualized word representations are able to give different representations for the same word in different contexts, and they have been shown to be effective in downstream natural language processing tasks, such as question answering, named entity recognition, and sentiment analysis. However, evaluation on word sense disambiguation (WSD) in prior work shows that using contextualized word representations does not outperform the state-of-the-art approach that makes use of non-contextualized word embeddings. In this paper, we explore different strategies of integrating pre-trained contextualized word representations and our best strategy achieves accuracies exceeding the best prior published accuracies by significant margins on multiple benchmark WSD datasets. We make the source code available at https://github.com/nusnlp/contextemb-wsd.
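One of the simple strategies the paper explores is nearest-neighbor matching of contextualized vectors against sense-annotated training examples. The sketch below is a minimal, hypothetical illustration of that idea (not the authors' released code): it averages the contextual vectors of each sense's training occurrences into a centroid, then assigns the sense whose centroid is most cosine-similar to the target word's vector. The toy two-dimensional vectors and sense labels are invented for the example; in practice the vectors would come from a pre-trained model such as BERT.

```python
import math
from collections import defaultdict

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def build_sense_centroids(examples):
    """examples: list of (sense_label, context_vector) pairs from a
    sense-annotated corpus. Returns one averaged vector per sense."""
    sums = {}
    counts = defaultdict(int)
    for sense, vec in examples:
        if sense not in sums:
            sums[sense] = list(vec)
        else:
            sums[sense] = [a + b for a, b in zip(sums[sense], vec)]
        counts[sense] += 1
    return {s: [x / counts[s] for x in v] for s, v in sums.items()}

def disambiguate(target_vec, centroids):
    """Pick the sense whose centroid is closest to the target vector."""
    return max(centroids, key=lambda s: cosine(target_vec, centroids[s]))

# Toy illustration: two senses of "bank" with invented 2-d context vectors.
train = [
    ("bank.finance", [0.9, 0.1]),
    ("bank.finance", [0.8, 0.2]),
    ("bank.river",   [0.1, 0.9]),
]
centroids = build_sense_centroids(train)
print(disambiguate([0.7, 0.3], centroids))  # → bank.finance
```

The same skeleton works unchanged if the toy vectors are replaced by hidden-state vectors of the target word extracted from a pre-trained contextualized encoder.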

Christian Hadiwinoto, Hwee Tou Ng, Wee Chung Gan • 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
--- | --- | --- | --- | ---
Word Sense Disambiguation | SensEval-3 (test) | F1 Score | 74 | 51
Word Sense Disambiguation | SemEval-2007 Task 17 (test) | F1 Score | 69.3 | 36
Word Sense Disambiguation | SensEval-2 (test) | F1 Score | 76.4 | 35
Word Sense Disambiguation | SemEval-2013 Task 12 (test) | F1 Score | 71.1 | 19
Word Sense Disambiguation | SemEval-2015 Task 13 (test) | F1 Score | 76.2 | 19
Word Sense Disambiguation | English All-Words Average (test) | F1 Score | 74.1 | 19
Word Sense Disambiguation | Senseval-3 English Lexical Sample (test) | Accuracy | 80 | 13
Word Sense Disambiguation | Senseval-2 English Lexical Sample (test) | Accuracy | 76.9 | 11
Word Sense Disambiguation | Chinese OntoNotes (test) | BC | 84.7 | 7
