DELLA-Merging: Reducing Interference in Model Merging through Magnitude-Based Sampling
About
With the proliferation of domain-specific models, model merging has emerged as a set of techniques that combine the capabilities of multiple models into one that can multitask without the cost of additional training. In this paper, we propose a new model merging technique, Drop and rEscaLe via sampLing with mAgnitude (DELLA-Merging), that employs a novel pruning technique, MAGPRUNE, which shows significant advantages over DARE and TIES. MAGPRUNE first ranks the parameters by magnitude and assigns higher dropout probabilities (p) to parameters with lower ranks, i.e., lower magnitudes. To approximate the original embeddings, MAGPRUNE rescales the parameters that survive the random dropping by 1/(1 - p). On three expert models considered for merging (LM, Math, Code) and their corresponding benchmark datasets (AlpacaEval, GSM8K, MBPP), DELLA shows an average improvement of 2.4 points over baseline methods that employ delta-parameter pruning (3.6 points over TIES, 1.2 points over DARE), and 11.1 points over the no-pruning baseline (TA). We release the source code at: https://github.com/declare-lab/della.
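The drop-and-rescale step is easiest to see in code. Below is a minimal sketch of the idea on a single tensor of delta parameters; the function name `magprune`, the bounds `p_min`/`p_max`, and the linear rank-to-probability schedule are illustrative assumptions, not the paper's exact mapping (see the repository for the actual implementation).

```python
import torch

def magprune(delta: torch.Tensor, p_min: float = 0.1, p_max: float = 0.9) -> torch.Tensor:
    """Drop-and-rescale sketch: lower-magnitude delta parameters get higher
    drop probabilities; survivors are rescaled by 1 / (1 - p)."""
    flat = delta.flatten()
    # Rank parameters by magnitude (rank 0 = smallest magnitude).
    ranks = flat.abs().argsort().argsort().float()
    # Assumed schedule: drop probability falls linearly with magnitude rank,
    # from p_max (smallest magnitude) down to p_min (largest magnitude).
    p = p_max - (p_max - p_min) * ranks / max(flat.numel() - 1, 1)
    # Randomly drop each parameter according to its individual probability.
    keep = torch.rand_like(flat) >= p
    # Rescale survivors so each parameter's expected value matches the original:
    # E[kept value] = (1 - p) * delta / (1 - p) = delta.
    pruned = torch.where(keep, flat / (1.0 - p), torch.zeros_like(flat))
    return pruned.view_as(delta)
```

Because each survivor is scaled by 1/(1 - p), the expected value of every pruned parameter equals its original value, which is what keeps the merged model's deltas close to the originals. The subsequent election and merge stages of DELLA are not shown here.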
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Named Entity Recognition | MIT Movie | Entity F1 | 66.72 | 57 |
| Relation Extraction | CoNLL04 | Relation Strict F1 | 26.6 | 52 |
| Named Entity Recognition | TweetNER7 | Entity F1 | 55.62 | 49 |
| Relation Extraction | CoNLL04 | F1 | 28.78 | 39 |
| Relation Extraction | New York Times | Precision | 88.9 | 32 |
| Entity Typing | FindVehicle | Precision | 70.42 | 32 |
| Entity Typing | FabNER | Precision | 59.71 | 32 |
| Aggregate Performance | MIT Movie, TweetNER7, New York Times, CoNLL04, FindVehicle, and FabNER | Precision | 57.59 | 13 |
| Multi-source Heterogeneous Transfer | Average of NER, RE, ET | Precision | 57.59 | 10 |
| Reasoning | AIME25 and GPQA-Diamond | HV Value | 1.3018 | 6 |