
End-to-end Neural Coreference Resolution

About

We introduce the first end-to-end coreference resolution model and show that it significantly outperforms all previous work without using a syntactic parser or a hand-engineered mention detector. The key idea is to directly consider all spans in a document as potential mentions and learn distributions over possible antecedents for each. The model computes span embeddings that combine context-dependent boundary representations with a head-finding attention mechanism. It is trained to maximize the marginal likelihood of gold antecedent spans from coreference clusters and is factored to enable aggressive pruning of potential mentions. Experiments demonstrate state-of-the-art performance, with a gain of 1.5 F1 on the OntoNotes benchmark and 3.1 F1 using a 5-model ensemble, despite the fact that this is the first approach to be successfully trained with no external resources.
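The span-ranking idea above can be sketched compactly: a span embedding concatenates the two boundary token states with an attention-weighted "soft head" word, and training minimizes the negative marginal log-likelihood of the gold antecedents, where a dummy antecedent (score fixed to 0) represents "no antecedent". Below is a minimal NumPy sketch under these assumptions; all function and variable names (`span_embedding`, `antecedent_marginal_nll`, `H`, `w_attn`) are illustrative, not the authors' code, and the real model adds learned scoring networks and aggressive span pruning:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def span_embedding(H, start, end, w_attn):
    """Span representation from boundary states plus an attended head.

    H:      (T, d) context-dependent token states (e.g., BiLSTM outputs)
    w_attn: (d,)   parameters of the head-finding attention scorer
    """
    tokens = H[start:end + 1]
    alpha = softmax(tokens @ w_attn)       # attention over tokens in the span
    head = alpha @ tokens                  # soft head-word representation
    # g_i = [x_start ; x_end ; attended head]
    return np.concatenate([H[start], H[end], head])

def antecedent_marginal_nll(pair_scores, gold_mask):
    """Negative marginal log-likelihood of the gold antecedent set.

    pair_scores: (A+1,) scores; index 0 is the dummy antecedent with score 0
    gold_mask:   (A+1,) bool; True for gold antecedents (the dummy if none)
    """
    p = softmax(pair_scores)
    return -np.log(p[gold_mask].sum())     # marginalize over all gold antecedents
```

For tractability, the full model first scores every span as a candidate mention and keeps only the top-scoring ones (proportional to document length) before computing pairwise antecedent scores; the factored scoring is what makes that pruning possible.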

Kenton Lee, Luheng He, Mike Lewis, Luke Zettlemoyer • 2017

Related benchmarks

Task                       Dataset                      Result             Rank
Coreference Resolution     CoNLL English 2012 (test)    MUC F1 80.4        114
Coreference Resolution     GAP (test)                   Overall F1 64      53
Named Entity Recognition   NCBI-disease (test)          Precision 89       40
Named Entity Recognition   GENIA (test)                 --                 34
Relation Extraction        SCIERC (test)                F1 Score 34.1      23
Entity Recognition         SCIERC (test)                F1 Score 61.2      20
Coreference Resolution     CoNLL 2012                   Average F1 68.8    17
Named Entity Recognition   LivingNER (test)             Precision 95.7     9
Named Entity Recognition   SocialDisNER (test)          Precision 90.4     9
Coreference Resolution     SCIERC (test)                Precision 60.9     7

Showing 10 of 13 rows

Other info

Code
