Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP
About
Transfer learning, particularly in approaches that combine multi-task learning with pre-trained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
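To illustrate the flexible configuration the abstract describes, here is a hedged sketch of what a MaChAmp-style dataset configuration might look like: a JSON file pairing a training corpus with multiple tasks read from different columns of the same file. The file paths, column indices, and task names below are illustrative assumptions, not values from the paper; consult the toolkit's own documentation for the exact schema.

```json
{
  "UD_EWT": {
    "train_data_path": "data/ewt.train.conllu",
    "validation_data_path": "data/ewt.dev.conllu",
    "word_idx": 1,
    "tasks": {
      "upos": {
        "task_type": "seq",
        "column_idx": 3
      },
      "dependency": {
        "task_type": "dependency",
        "column_idx": 6
      }
    }
  }
}
```

In a setup like this, both tasks share one encoder (the fine-tuned contextualized embeddings) while each task gets its own decoder head, which is the multi-task arrangement the paper advocates.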
Rob van der Goot, Ahmet Üstün, Alan Ramponi, Ibrahim Sharaf, Barbara Plank • 2020
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Dependency Parsing | 30 unseen languages split (test) | Average LAS | 41.7 | 12 |
| Semantic Parsing | PMB English 3.0.0 (dev) | F1 Score | 88.2 | 8 |
| Semantic Parsing | PMB English 3.0.0 (test) | F1 Score | 88.9 | 8 |