
GlossBERT: BERT for Word Sense Disambiguation with Gloss Knowledge

About

Word Sense Disambiguation (WSD) aims to find the exact sense of an ambiguous word in a particular context. Traditional supervised methods rarely take lexical resources such as WordNet into consideration, even though these resources are widely utilized in knowledge-based methods. Recent studies have shown the effectiveness of incorporating glosses (sense definitions) into neural networks for WSD. However, compared with traditional word-expert supervised methods, they have not achieved much improvement. In this paper, we focus on how to better leverage gloss knowledge in a supervised neural WSD system. We construct context-gloss pairs and propose three BERT-based models for WSD. We fine-tune the pre-trained BERT model on the SemCor 3.0 training corpus, and the experimental results on several English all-words WSD benchmark datasets show that our approach outperforms the state-of-the-art systems.
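The construction of context-gloss pairs described above can be illustrated with a minimal sketch. This is not the authors' code: the sense keys and glosses below are hardcoded stand-ins for WordNet entries, and the pairing scheme (context as sentence A, "target : gloss" as sentence B, binary label) is one plausible reading of the paper's setup.

```python
def make_context_gloss_pairs(context, target_word, sense_glosses, gold_sense=None):
    """For one ambiguous target word, pair the context sentence with the gloss
    of each candidate sense. Each pair becomes one binary classification
    example for a BERT-style sentence-pair model: label 1 iff the gloss
    belongs to the gold (correct) sense, 0 otherwise."""
    pairs = []
    for sense_key, gloss in sense_glosses.items():
        pairs.append({
            # Would be encoded as: [CLS] context [SEP] target : gloss [SEP]
            "text_a": context,
            "text_b": f"{target_word} : {gloss}",
            "label": 1 if sense_key == gold_sense else 0,
        })
    return pairs

# Illustrative candidate senses for "bank" (stand-ins for WordNet glosses).
senses = {
    "bank%1:14:00::": "a financial institution that accepts deposits",
    "bank%1:17:01::": "sloping land beside a body of water",
}

pairs = make_context_gloss_pairs(
    context="He sat on the bank of the river and watched the current.",
    target_word="bank",
    sense_glosses=senses,
    gold_sense="bank%1:17:01::",
)
for p in pairs:
    print(p["label"], "|", p["text_b"])
```

At inference time, the same pairs are built for every candidate sense of the target word, and the sense whose pair scores highest is selected.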

Luyao Huang, Chi Sun, Xipeng Qiu, Xuanjing Huang · 2019

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Word Sense Disambiguation | SensEval-3 (test) | F1 Score 75.2 | 51 |
| Word Sense Disambiguation | English All-Words Average (test) | -- | 19 |
| Word Sense Disambiguation | Senseval-2 (SE2) 3.0 (test) | F1 Score 77.7 | 16 |
| Word Sense Disambiguation | All-Words WSD Concatenation SE2+SE3+SE13+SE15 3.0 (test) | Overall F1 77 | 16 |
| Word Sense Disambiguation | SemEval-15 (SE15) 3.0 (test) | F1 Score 80.4 | 16 |
| Word Sense Disambiguation | SemEval-13 (SE13) 3.0 (test) | F1 Score 76.1 | 16 |
| Word Sense Disambiguation | SemEval-07 3.0 (dev) | F1 Score 72.5 | 14 |
| Word Sense Disambiguation | 42D | F1 Score 45.7 | 9 |
| Word Sense Disambiguation | S10 | F1 Score 75.8 | 9 |
| Word Sense Disambiguation | softEN | F1 Score 77.1 | 9 |

Showing 10 of 11 rows

Other info

Code
