
AdapterFusion: Non-Destructive Task Composition for Transfer Learning

About

Sequential fine-tuning and multi-task learning are methods aiming to incorporate knowledge from multiple tasks; however, they suffer from catastrophic forgetting and difficulties in dataset balancing. To address these shortcomings, we propose AdapterFusion, a new two-stage learning algorithm that leverages knowledge from multiple tasks. First, in the knowledge extraction stage we learn task-specific parameters called adapters that encapsulate the task-specific information. We then combine the adapters in a separate knowledge composition step. We show that by separating the two stages, i.e., knowledge extraction and knowledge composition, the classifier can effectively exploit the representations learned from multiple tasks in a non-destructive manner. We empirically evaluate AdapterFusion on 16 diverse NLU tasks, and find that it effectively combines various types of knowledge at different layers of the model. We show that our approach outperforms traditional strategies such as full fine-tuning as well as multi-task learning. Our code and adapters are available at AdapterHub.ml.
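The two stages described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: it assumes bottleneck adapters (down-projection, nonlinearity, up-projection, residual) for stage one, and models stage two as an attention mechanism in which the layer's hidden state queries the per-task adapter outputs. All dimensions, initializations, and the `adapter_fusion` function name are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, bottleneck, n_tasks = 16, 4, 3  # hidden size, adapter bottleneck, number of tasks

def adapter(h, down, up):
    # Bottleneck adapter: down-project, ReLU, up-project, residual connection.
    return h + np.maximum(h @ down, 0.0) @ up

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Stage 1 (knowledge extraction): one independently trained adapter per task.
# Here the weights are random stand-ins for trained, frozen adapter parameters.
adapters = [(rng.normal(scale=0.1, size=(d, bottleneck)),
             rng.normal(scale=0.1, size=(bottleneck, d)))
            for _ in range(n_tasks)]

# Stage 2 (knowledge composition): learn to attend over the adapter outputs.
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))

def adapter_fusion(h):
    # Each frozen adapter produces a candidate representation.
    outs = np.stack([adapter(h, down, up) for down, up in adapters])  # (n_tasks, d)
    q = h @ Wq                     # query from the layer's hidden state
    keys = outs @ Wk               # (n_tasks, d)
    vals = outs @ Wv               # (n_tasks, d)
    attn = softmax(keys @ q)       # one weight per task adapter
    return attn @ vals             # weighted combination, shape (d,)

h = rng.normal(size=d)             # a single hidden-state vector
fused = adapter_fusion(h)
print(fused.shape)
```

Because the adapter weights stay frozen in stage two, only the fusion parameters (`Wq`, `Wk`, `Wv`) are updated for the target task, which is what makes the composition non-destructive.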

Jonas Pfeiffer, Aishwarya Kamath, Andreas Rücklé, Kyunghyun Cho, Iryna Gurevych • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Natural Language Understanding | GLUE | SST-2 Accuracy | 96.1 | 531 |
| Natural Language Understanding | GLUE (test) | SST-2 Accuracy | 96.6 | 416 |
| Question Answering | SQuAD v1.1 (dev) | F1 Score | 90.86 | 380 |
| Natural Language Understanding | GLUE (val) | SST-2 Accuracy | 95.49 | 191 |
| Question Answering | SQuAD v2.0 (dev) | F1 Score | 81.84 | 163 |
| Visual Question Answering | VQA v2 (val) | - | - | 144 |
| Visual Question Answering | VQA 2.0 (val) | Accuracy (Overall) | 40.96 | 143 |
| Natural Language Understanding | GLUE (test dev) | MRPC Accuracy | 90.2 | 87 |
| Natural Language Understanding | SuperGLUE | SGLUE Score | 73.6 | 84 |
| Author Profiling | PAN16 (test) | Task Score | 87.3 | 80 |

Showing 10 of 25 rows
