
TinyLlama: An Open-Source Small Language Model

About

We present TinyLlama, a compact 1.1B language model pretrained on around 1 trillion tokens for approximately 3 epochs. Building on the architecture and tokenizer of Llama 2, TinyLlama leverages various advances contributed by the open-source community (e.g., FlashAttention and Lit-GPT), achieving better computational efficiency. Despite its relatively small size, TinyLlama demonstrates remarkable performance in a series of downstream tasks. It significantly outperforms existing open-source language models with comparable sizes. Our model checkpoints and code are publicly available on GitHub at https://github.com/jzhang38/TinyLlama.
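As a quick illustration of how the released checkpoints can be used, here is a minimal sketch that loads TinyLlama through the Hugging Face `transformers` library. The repo id below refers to the chat-tuned checkpoint published by the TinyLlama authors; the helper function and prompt are illustrative, and running it downloads roughly 2 GB of weights.

```python
# Minimal sketch: greedy text generation with a TinyLlama checkpoint
# via the Hugging Face `transformers` library (assumed installed).
from transformers import AutoModelForCausalLM, AutoTokenizer

# Chat-tuned checkpoint released by the TinyLlama authors; swap in a
# base checkpoint if you want the raw pretrained model instead.
MODEL_ID = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the 1.1B checkpoint and run greedy decoding on `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Downloads the weights on first run.
    print(generate("The capital of France is"))
```

Because the model is only 1.1B parameters, it runs comfortably on a single consumer GPU or, more slowly, on CPU.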

Peiyuan Zhang, Guangtao Zeng, Tianduo Wang, Wei Lu • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Mathematical Reasoning | GSM8K | Accuracy | 1.7 | 983 |
| Multi-task Language Understanding | MMLU | Accuracy | 32.6 | 842 |
| Commonsense Reasoning | WinoGrande | Accuracy | 50.36 | 776 |
| Mathematical Reasoning | GSM8K (test) | Accuracy | 14.19 | 751 |
| Question Answering | ARC Challenge | Accuracy | 30.1 | 749 |
| Commonsense Reasoning | PIQA | Accuracy | 73.3 | 647 |
| Question Answering | OpenBookQA | Accuracy | 23 | 465 |
| Code Generation | HumanEval (test) | - | - | 444 |
| Question Answering | ARC Easy | Normalized Acc | 55.3 | 385 |
| Boolean Question Answering | BoolQ | Accuracy | 57.8 | 307 |

Showing 10 of 64 rows.
