
GPT-NeoX-20B: An Open-Source Autoregressive Language Model

About

We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights will be made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model that has publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B's architecture and training and evaluate its performance on a range of language-understanding, mathematics, and knowledge-based tasks. We find that GPT-NeoX-20B is a particularly powerful few-shot reasoner and gains far more in performance when evaluated five-shot than similarly sized GPT-3 and FairSeq models. We open-source the training and evaluation code, as well as the model weights, at https://github.com/EleutherAI/gpt-neox.
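Few-shot (e.g. five-shot) evaluation of the kind described above typically prepends k labeled demonstrations to the test query and asks the model to complete the final answer. A minimal sketch of that prompt construction, with illustrative question/answer formatting that is an assumption rather than the paper's exact template:

```python
def build_few_shot_prompt(demonstrations, query, k=5):
    """Build a k-shot prompt by prepending k labeled demonstrations.

    `demonstrations` is a list of (question, answer) pairs; `query` is
    the unanswered question the model should complete. The "Q:/A:"
    layout here is illustrative, not the paper's exact template.
    """
    blocks = [f"Q: {q}\nA: {a}" for q, a in demonstrations[:k]]
    blocks.append(f"Q: {query}\nA:")  # model completes after the final "A:"
    return "\n\n".join(blocks)


demos = [(f"example question {i}", f"example answer {i}") for i in range(8)]
prompt = build_few_shot_prompt(demos, "held-out question", k=5)
```

The prompt is then scored by comparing the model's likelihood or generation for each candidate answer, which is how harness-style evaluations implement the zero-/five-shot settings compared in the abstract.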

Sid Black, Stella Biderman, Eric Hallahan, Quentin Anthony, Leo Gao, Laurence Golding, Horace He, Connor Leahy, Kyle McDonell, Jason Phang, Michael Pieler, USVSN Sai Prashanth, Shivanshu Purohit, Laria Reynolds, Jonathan Tow, Ben Wang, Samuel Weinbach • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Commonsense Reasoning | HellaSwag | Accuracy | 68.37 | 1460 |
| Multi-task Language Understanding | MMLU | – | – | 842 |
| Question Answering | ARC Challenge | – | – | 749 |
| Commonsense Reasoning | PIQA | Accuracy | 77.9 | 647 |
| Named Entity Recognition | CoNLL 2003 (test) | – | – | 539 |
| Question Answering | OpenBookQA | Accuracy | 32.6 | 465 |
| Code Generation | HumanEval (test) | Pass@1 | 15.4 | 444 |
| Mathematical Reasoning | MATH (test) | – | – | 433 |
| Question Answering | ARC Easy | Normalized Acc | 74.6 | 385 |
| Natural Language Inference | RTE | Accuracy | 53.79 | 367 |

Showing 10 of 86 rows
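The HumanEval row reports Pass@1, the fraction of problems for which a sampled completion passes the unit tests. Pass@k is commonly computed with the unbiased estimator introduced in the Codex paper (Chen et al., 2021); this sketch assumes that standard formula rather than anything specific to the GPT-NeoX-20B evaluation pipeline:

```python
from math import comb


def pass_at_k(n, c, k):
    """Unbiased pass@k estimator: 1 - C(n - c, k) / C(n, k).

    n: total samples drawn per problem
    c: number of those samples that pass the unit tests
    k: budget of samples the metric considers
    """
    if n - c < k:
        # Fewer than k failing samples: every size-k draw contains a pass.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)


# With k=1 the estimator reduces to the empirical pass rate c / n.
estimate = pass_at_k(n=10, c=3, k=1)  # → 0.3
```

Per-problem estimates are then averaged over the benchmark to give the reported percentage.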
