
ColD Fusion: Collaborative Descent for Distributed Multitask Finetuning

About

We propose a new paradigm for continually evolving pretrained models, denoted ColD Fusion. It provides the benefits of multitask learning but leverages distributed computation with limited communication and eliminates the need for shared data. Consequently, ColD Fusion can give rise to a synergistic loop, where finetuned models are recycled to continually improve the pretrained model they are based on. We show that ColD Fusion yields comparable benefits to multitask training by producing a model that (a) attains strong performance on all of the datasets it was trained on; and (b) is a better starting point for finetuning on unseen datasets. We show that ColD Fusion outperforms RoBERTa and even previous multitask models. Specifically, when training and testing on 35 diverse datasets, the ColD Fusion-based model outperforms RoBERTa by 2.33 points on average, without any changes to the architecture.

Shachar Don-Yehiya, Elad Venezian, Colin Raffel, Noam Slonim, Yoav Katz, Leshem Choshen • 2022
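The loop described in the abstract can be made concrete with a short sketch. Below is a minimal, hypothetical Python/PyTorch rendering of one ColD Fusion iteration, assuming simple parameter averaging as the fusion step and a user-supplied `local_finetune` routine; the names and the averaging choice are illustrative, not the paper's exact implementation.

```python
import copy
from typing import Callable, Iterable, List

import torch
import torch.nn as nn


def fuse(state_dicts: List[dict]) -> dict:
    # Average each parameter tensor across contributors. Plain parameter
    # averaging is the fusion operator assumed here for illustration.
    fused = copy.deepcopy(state_dicts[0])
    for key in fused:
        fused[key] = torch.stack([sd[key].float() for sd in state_dicts]).mean(dim=0)
    return fused


def cold_fusion_round(shared_model: nn.Module,
                      local_finetune: Callable[[nn.Module, object], nn.Module],
                      datasets: Iterable) -> nn.Module:
    # One collaborative iteration: each contributor finetunes a private copy
    # of the shared model on its own data (the data itself is never shared),
    # then only the resulting weights are communicated and fused back into
    # the shared model, which seeds the next round.
    finetuned = []
    for data in datasets:
        local_model = copy.deepcopy(shared_model)
        finetuned.append(local_finetune(local_model, data).state_dict())
    shared_model.load_state_dict(fuse(finetuned))
    return shared_model
```

Repeating `cold_fusion_round` over many contributor datasets is what produces the synergistic loop described above: each round's fused model becomes the pretrained starting point for the next round of finetuning.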

Related benchmarks

Task | Dataset | Metric | Result | Rank
Natural Language Inference | RTE | Accuracy | 84.48 | 367
Question Answering | BoolQ | Accuracy | 81.39 | 240
Question Classification | TREC | Accuracy | 91.04 | 205
Topic Classification | AG-News | Accuracy | 89.58 | 173
Sentiment Analysis | SST-2 | Accuracy | 95.16 | 156
Topic Classification | DBpedia | Accuracy | 78.15 | 117
Natural Language Inference | CB | Accuracy | 85.00 | 110
Coreference Resolution | WSC | Accuracy | 62.31 | 96
Paraphrase Detection | MRPC | Avg Accuracy | 89.26 | 89
Word Sense Disambiguation | WiC | Avg Accuracy | 68.12 | 84

Showing 10 of 35 rows.

Other info

Code
