
Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity

About

In deep learning, models typically reuse the same parameters for all inputs. Mixture of Experts (MoE) defies this and instead selects different parameters for each incoming example. The result is a sparsely-activated model -- with an outrageous number of parameters -- but a constant computational cost. However, despite several notable successes of MoE, widespread adoption has been hindered by complexity, communication costs and training instability -- we address these with the Switch Transformer. We simplify the MoE routing algorithm and design intuitive improved models with reduced communication and computational costs. Our proposed training techniques help wrangle the instabilities, and we show that large sparse models may be trained, for the first time, with lower precision (bfloat16) formats. We design models based on T5-Base and T5-Large to obtain up to 7x increases in pre-training speed with the same computational resources. These improvements extend into multilingual settings, where we measure gains over the mT5-Base version across all 101 languages. Finally, we advance the current scale of language models by pre-training up to trillion-parameter models on the "Colossal Clean Crawled Corpus" and achieve a 4x speedup over the T5-XXL model.
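The simplified routing the abstract refers to sends each token to a single expert (top-1 routing), with a per-expert capacity limit so that no expert receives more tokens than it can process. The sketch below illustrates that idea in NumPy; the function name, shapes, and capacity handling are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def switch_route(tokens, router_weights, capacity_factor=1.0):
    """Top-1 (Switch-style) routing sketch: each token goes to one expert.

    tokens: (num_tokens, d_model); router_weights: (d_model, num_experts).
    Returns the chosen expert per token, its gate value, and a mask of
    tokens kept within expert capacity (overflow tokens are dropped).
    """
    logits = tokens @ router_weights                        # (T, E)
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)                   # softmax gate
    expert = probs.argmax(-1)                               # top-1 expert per token
    gate = probs[np.arange(len(tokens)), expert]            # scales the expert output

    num_experts = router_weights.shape[1]
    capacity = int(capacity_factor * len(tokens) / num_experts)

    # Enforce a fixed per-expert capacity; tokens beyond it are dropped.
    kept = np.zeros(len(tokens), dtype=bool)
    counts = np.zeros(num_experts, dtype=int)
    for t, e in enumerate(expert):
        if counts[e] < capacity:
            counts[e] += 1
            kept[t] = True
    return expert, gate, kept
```

Because only one expert runs per token, the compute per token stays constant no matter how many experts (and hence parameters) the model has, which is what allows parameter counts to scale into the trillions at fixed cost.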

William Fedus, Barret Zoph, Noam Shazeer • 2021

Related benchmarks

Task                              Dataset              Metric           Result   Rank
Commonsense Reasoning             HellaSwag            Accuracy         31.5     1460
Commonsense Reasoning             WinoGrande           Accuracy         51.2     776
Mathematical Reasoning            GSM8K (test)         --               --       751
Question Answering                ARC Challenge        Accuracy         20.09    749
Commonsense Reasoning             PIQA                 Accuracy         63.55    647
Language Modeling                 WikiText-103 (test)  Perplexity       19.72    524
Language Modeling                 WikiText             PPL              21.71    479
Natural Language Understanding    GLUE (test)          SST-2 Accuracy   93.81    416
Multitask Language Understanding  MMLU (test)          Accuracy         46.52    303
Reading Comprehension             RACE high            Accuracy         55.2     295

Showing 10 of 55 rows
