
Smarter, Better, Faster, Longer: A Modern Bidirectional Encoder for Fast, Memory Efficient, and Long Context Finetuning and Inference

About

Encoder-only transformer models such as BERT offer a great performance-size tradeoff for retrieval and classification tasks relative to larger decoder-only models. Despite being the workhorse of numerous production pipelines, BERT has seen few Pareto improvements since its release. In this paper, we introduce ModernBERT, bringing modern model optimizations to encoder-only models and representing a major Pareto improvement over older encoders. Trained on 2 trillion tokens with a native 8192 sequence length, ModernBERT models achieve state-of-the-art results on a large pool of evaluations encompassing diverse classification tasks and both single- and multi-vector retrieval across different domains (including code). In addition to strong downstream performance, ModernBERT is also the fastest and most memory-efficient encoder, and is designed for inference on common GPUs.

Benjamin Warner, Antoine Chaffin, Benjamin Clavié, Orion Weller, Oskar Hallström, Saïd Taghadouini, Alexis Gallagher, Raja Biswas, Faisal Ladhak, Tom Aarsen, Nathan Cooper, Griffin Adams, Jeremy Howard, Iacopo Poli • 2024

Related benchmarks

Task | Dataset | Metric | Result | Rank
Node Classification | Cora (test) | Mean Accuracy | 22.9 | 687
Semantic Textual Similarity | STS tasks (STS12, STS13, STS14, STS15, STS16, STS-B, SICK-R) | STS12 Score | 80.67 | 195
Link Prediction | Citeseer | -- | -- | 146
Token Classification | Amazon ESCI Product Description English (test) | Top-k Accuracy | 89.8 | 72
Token Classification | Amazon ESCI Product Title English (test) | Top-k Token Accuracy | 82.3 | 72
Natural Language Understanding | GLUE (test val) | MRPC Accuracy | 92.2 | 59
Information Retrieval | BEIR | TREC-COVID | 0.721 | 59
Information Retrieval | MS MARCO | NDCG@10 | 87.39 | 56
Mortality Prediction | MIMIC-IV (test) | AUC | 58.01 | 43
Extractive Question Answering | SQuAD 2.0 | F1 Score | 92.6 | 34
(Showing 10 of 78 rows)
...
