
GLiNER: Generalist Model for Named Entity Recognition using Bidirectional Transformer

About

Named Entity Recognition (NER) is essential in various Natural Language Processing (NLP) applications. Traditional NER models are effective but limited to a set of predefined entity types. In contrast, Large Language Models (LLMs) can extract arbitrary entities through natural language instructions, offering greater flexibility. However, their size and cost, particularly for those accessed via APIs like ChatGPT, make them impractical in resource-limited scenarios. In this paper, we introduce a compact NER model trained to identify any type of entity. Leveraging a bidirectional transformer encoder, our model, GLiNER, facilitates parallel entity extraction, an advantage over the slow sequential token generation of LLMs. Through comprehensive testing, GLiNER demonstrates strong performance, outperforming both ChatGPT and fine-tuned LLMs in zero-shot evaluations on various NER benchmarks.
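The key efficiency claim in the abstract is that a bidirectional encoder can score all candidate spans against all entity-type labels at once, rather than generating entities token by token as an LLM does. The following is a minimal sketch of that parallel span-label scoring idea, not the paper's actual architecture: the embeddings here are random stand-ins for encoder outputs, and the names (`label_emb`, `span_emb`, the 0.5 threshold) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for encoder outputs. In GLiNER, a bidirectional transformer
# encodes entity-type prompts and candidate text spans into a shared space;
# here we use random vectors purely to illustrate the scoring step.
dim = 8
label_names = ["person", "location"]
span_texts = ["Marie Curie", "Warsaw", "discovered"]

label_emb = rng.normal(size=(len(label_names), dim))  # (L, d)
span_emb = rng.normal(size=(len(span_texts), dim))    # (S, d)

# Every span-label pair is scored in a single matrix product -- no
# sequential token generation as in an autoregressive LLM.
scores = span_emb @ label_emb.T                       # (S, L)
probs = 1.0 / (1.0 + np.exp(-scores))                 # sigmoid per pair

# A span is tagged with a label when its probability clears a threshold
# (0.5 is an arbitrary choice for this sketch).
threshold = 0.5
predictions = [
    (span_texts[i], label_names[j])
    for i in range(len(span_texts))
    for j in range(len(label_names))
    if probs[i, j] >= threshold
]
```

Because the label set is just a list of strings embedded at inference time, swapping in new entity types requires no retraining, which is what enables the zero-shot evaluations reported below.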

Urchade Zaratiana, Nadi Tomeh, Pierre Holat, Thierry Charnois • 2023

Related benchmarks

Task                      Dataset         Metric        Result   Rank
Named Entity Recognition  CoNLL 03        F1 (Entity)   65.4     102
Named Entity Recognition  OntoNotes       F1-score      27.3     91
Named Entity Recognition  CoNLL 2003      F1 Score      92.6     86
Named Entity Recognition  BC5CDR          F1 Score      68.7     59
Named Entity Recognition  MIT Restaurant  --            --       50
Named Entity Recognition  ACE05           F1 Score      82.8     38
Named Entity Recognition  GENIA           F1 Score      55.1     37
Named Entity Recognition  CrossNER        AI Score      57.2     35
Named Entity Recognition  MIT Movie       Entity Score  57.2     28
Named Entity Recognition  NCBI            F1 Score      65.3     26

(Showing 10 of 40 rows)
