
Deep contextualized word representations

About

We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus. We show that these representations can be easily added to existing models and significantly improve the state of the art across six challenging NLP problems, including question answering, textual entailment and sentiment analysis. We also present an analysis showing that exposing the deep internals of the pre-trained network is crucial, allowing downstream models to mix different types of semi-supervision signals.
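The mechanism the abstract describes is straightforward to sketch: for each token, the per-layer hidden states of the pre-trained biLM are collapsed into a single vector by a softmax-normalized, task-specific weighted sum, scaled by a learned scalar. Below is a minimal NumPy sketch of that layer mixing; the function name, shapes, and toy inputs are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def mix_bilm_layers(layer_states, scalar_logits, gamma):
    """Collapse L biLM layers into one vector per token (hypothetical helper).

    layer_states:  array of shape (L, seq_len, dim) -- hidden states of the
                   pre-trained biLM for one sentence (toy values here).
    scalar_logits: length-L task-specific weights; softmax-normalized so the
                   per-layer weights sum to 1.
    gamma:         task-specific scalar that scales the whole vector.
    """
    s = softmax(scalar_logits)                             # per-layer weights
    return gamma * np.tensordot(s, layer_states, axes=1)   # (seq_len, dim)

# Toy usage: 3 layers (char-CNN output + 2 biLSTM layers), 5 tokens, dim 8.
rng = np.random.default_rng(0)
states = rng.standard_normal((3, 5, 8))
mixed = mix_bilm_layers(states, scalar_logits=np.zeros(3), gamma=1.0)
print(mixed.shape)  # (5, 8)
```

Because the mixing weights are learned per task, a downstream model can emphasize lower layers (which tend to capture syntax) or higher layers (which tend to capture semantics), which is why exposing all of the network's internal layers matters.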

Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, Luke Zettlemoyer • 2018

Related benchmarks

Task | Dataset | Metric | Result | Rank
Natural Language Inference | SNLI (test) | Accuracy | 89.3 | 681
Named Entity Recognition | CoNLL 2003 (test) | F1 | 92.28 | 539
Natural Language Understanding | GLUE (dev) | SST-2 Accuracy | 91.5 | 504
Natural Language Understanding | GLUE (test) | SST-2 Accuracy | 90.4 | 416
Question Answering | SQuAD v1.1 (dev) | F1 | 85.6 | 375
Question Answering | SQuAD v1.1 (test) | F1 | 87.432 | 260
Sentiment Analysis | SST-5 (test) | Accuracy | 54.7 | 173
Named Entity Recognition | CoNLL English 2003 (test) | F1 | 92.22 | 135
Coreference Resolution | CoNLL English 2012 (test) | MUC F1 | 78.6 | 114
Question Answering | SQuAD (test) | F1 | 87.4 | 111
Showing 10 of 71 rows …

Other info

Code
