
Coreference Resolution without Span Representations

About

The introduction of pretrained language models has reduced many complex task-specific NLP models to simple lightweight layers. An exception to this trend is coreference resolution, where a sophisticated task-specific model is appended to a pretrained transformer encoder. While highly effective, the model has a very large memory footprint -- primarily due to dynamically-constructed span and span-pair representations -- which hinders the processing of complete documents and the ability to train on multiple instances in a single batch. We introduce a lightweight end-to-end coreference model that removes the dependency on span representations, handcrafted features, and heuristics. Our model performs competitively with the current standard model, while being simpler and more efficient.
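The core idea can be illustrated with a minimal sketch: instead of materializing a vector for every candidate span, score each (start, end) token pair with a bilinear function over the two endpoint token embeddings. The dimensions, weight matrices, and scoring form below are illustrative assumptions, not the paper's exact architecture or configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration (not the paper's configuration).
n_tokens, d = 6, 8

# Stand-in for contextualized token embeddings from a pretrained encoder.
X = rng.normal(size=(n_tokens, d))

# Lightweight heads: project each token into "start" and "end" roles.
W_start = rng.normal(size=(d, d))
W_end = rng.normal(size=(d, d))
S = X @ W_start   # start-role representations, shape (n_tokens, d)
E = X @ W_end     # end-role representations, shape (n_tokens, d)

# Mention score for every (start, end) pair via a bilinear form over the
# two endpoint tokens -- no per-span vector is ever constructed.
B = rng.normal(size=(d, d))
mention_scores = S @ B @ E.T   # shape (n_tokens, n_tokens)

# Mask invalid spans (those with start > end).
valid = np.triu(np.ones((n_tokens, n_tokens), dtype=bool))
mention_scores = np.where(valid, mention_scores, -np.inf)

print(mention_scores.shape)  # (6, 6)
```

The memory saving is that the pairwise computation produces n² scalar scores rather than n² span vectors of dimension d (plus span-pair concatenations), which is what makes full-document processing and multi-document batches feasible.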

Yuval Kirstain, Ori Ram, Omer Levy • 2021

Related benchmarks

Task                    Dataset                        Metric         Result  Rank
Coreference Resolution  CoNLL English 2012 (test)      MUC F1 Score   85.8    114
Coreference Resolution  GAP (test)                     Overall F1     88.3    53
Coreference Resolution  OntoNotes                      MUC            85.8    23
Coreference Resolution  English OntoNotes 5.0 (test)   MUC Precision  86.6    18
Coreference Resolution  WikiCoref (WC) (test)          --             --      12
Coreference Resolution  Winogender (WG) (test)         Accuracy       70.5    11
Coreference Resolution  GAP 5.0 (test)                 Masc F1        90.6    4
Coreference Resolution  WinoBias (test)                Accuracy       84.3    2
Coreference Resolution  BUG (test)                     Accuracy       72.2    2

Other info

Code
