
GPT-NeoX-20B: An Open-Source Autoregressive Language Model

About

We introduce GPT-NeoX-20B, a 20 billion parameter autoregressive language model trained on the Pile, whose weights will be made freely and openly available to the public through a permissive license. It is, to the best of our knowledge, the largest dense autoregressive model that has publicly available weights at the time of submission. In this work, we describe GPT-NeoX-20B's architecture and training and evaluate its performance on a range of language-understanding, mathematics, and knowledge-based tasks. We find that GPT-NeoX-20B is a particularly powerful few-shot reasoner and gains far more in performance when evaluated five-shot than similarly sized GPT-3 and FairSeq models. We open-source the training and evaluation code, as well as the model weights, at https://github.com/EleutherAI/gpt-neox.

Sid Black, Stella Biderman, Eric Hallahan, Quentin Anthony, Leo Gao, Laurence Golding, Horace He, Connor Leahy, Kyle McDonell, Jason Phang, Michael Pieler, USVSN Sai Prashanth, Shivanshu Purohit, Laria Reynolds, Jonathan Tow, Ben Wang, Samuel Weinbach • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Commonsense Reasoning | HellaSwag | Accuracy | 68.37 | 1891 |
| Question Answering | ARC Challenge | -- | -- | 906 |
| Multi-task Language Understanding | MMLU | -- | -- | 876 |
| Commonsense Reasoning | PIQA | Accuracy | 77.9 | 751 |
| Named Entity Recognition | CoNLL 2003 (test) | -- | -- | 539 |
| Code Generation | HumanEval (test) | Pass@1 | 15.4 | 506 |
| Question Answering | OpenBookQA | Accuracy | 32.6 | 465 |
| Natural Language Inference | RTE | Accuracy | 53.79 | 448 |
| Mathematical Reasoning | MATH (test) | -- | -- | 433 |
| Question Answering | ARC Easy | Normalized Acc | 74.6 | 389 |

Showing 10 of 86 rows.
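
"Normalized Acc" (the ARC Easy metric above) refers to length-normalized multiple-choice accuracy: each answer choice is scored by its total log-likelihood divided by its length, so longer choices are not penalized simply for accumulating more negative log-probability. This matches the `acc_norm` convention in EleutherAI's lm-evaluation-harness. Below is an illustrative sketch; the byte-length normalization and the example scores are assumptions for illustration, not values from this page.

```python
# Illustrative sketch of length-normalized multiple-choice scoring
# ("acc_norm"-style). Assumes per-choice summed log-likelihoods obtained
# from some language model; normalizing by byte length keeps longer answer
# strings from being unfairly penalized.

def pick_choice(logliks, choices):
    """Return the index of the choice with the best length-normalized score."""
    scores = [ll / len(c.encode("utf-8")) for ll, c in zip(logliks, choices)]
    return max(range(len(scores)), key=scores.__getitem__)

# Hypothetical example: log-likelihoods for three candidate answers.
choices = ["the sun", "a flashlight", "photosynthesis in leaves"]
logliks = [-3.5, -12.4, -30.7]
print(pick_choice(logliks, choices))  # -> 0 (best per-byte log-likelihood)
```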
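The HumanEval Pass@1 figure is conventionally computed with the unbiased pass@k estimator of Chen et al. (2021): sample n completions per problem, count the c that pass the unit tests, and average 1 - C(n-c, k)/C(n, k) over problems. A self-contained sketch follows; the sample counts in the usage example are hypothetical.

```python
# Unbiased pass@k estimator (Chen et al., 2021): given n sampled completions
# per problem, of which c pass the unit tests, the probability that at least
# one of k random draws passes is 1 - C(n-c, k) / C(n, k).
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    if n - c < k:  # every size-k draw must contain a passing sample
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Hypothetical example: 200 samples per problem, 31 pass the tests.
print(pass_at_k(n=200, c=31, k=1))  # -> 0.155, i.e. pass@1 of 15.5%
```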

Other info

Code: https://github.com/EleutherAI/gpt-neox
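
Beyond the GitHub repository above, the weights are also distributed on the Hugging Face Hub under the `EleutherAI/gpt-neox-20b` model ID. A minimal loading-and-generation sketch using the `transformers` library; the dtype and device placement are illustrative choices, not recommendations from the paper.

```python
# Minimal sketch: load GPT-NeoX-20B from the Hugging Face Hub and sample a
# completion. Assumes the `transformers` and `accelerate` libraries are
# installed; the ~40 GB of float16 weights need correspondingly large GPU
# memory (or CPU RAM with offloading).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neox-20b",
    torch_dtype=torch.float16,  # halves memory relative to float32
    device_map="auto",          # spread layers across available devices
)

inputs = tokenizer("GPT-NeoX-20B is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```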
