
Model Merging via Data-Free Covariance Estimation

About

Model merging provides a way of cheaply combining individual models to produce a single model that inherits each constituent's capabilities. While some merging methods can approach the performance of multitask training, they are often heuristically motivated and lack theoretical justification. A principled alternative is to pose model merging as a layer-wise optimization problem that directly minimizes interference between tasks. However, this formulation requires estimating per-layer covariance matrices from data, which may not be available at merge time. In contrast, many of the heuristically motivated methods do not require auxiliary data, making them practically advantageous. In this work, we revisit the interference minimization framework and show that, under certain conditions, covariance matrices can be estimated directly from difference matrices, eliminating the need for data while also reducing computational costs. We validate our approach across vision and language benchmarks on models ranging from 86M to 7B parameters, outperforming previous data-free state-of-the-art merging methods.
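The interference-minimization framework described above solves, for each layer, a least-squares problem whose closed-form solution weights each task's parameters by that task's input covariance. The sketch below illustrates the idea in NumPy. It assumes a RegMean-style closed form, W = (Σᵢ Cᵢ)⁻¹ Σᵢ Cᵢ Wᵢ, and uses an illustrative data-free proxy Cᵢ ≈ DᵢDᵢᵀ built from the difference matrix Dᵢ = Wᵢ − W₀; the paper's actual covariance estimator may differ, so treat the proxy as an assumption for exposition only.

```python
import numpy as np

def merge_layer(w_base, task_weights, eps=1e-3):
    """Merge one linear layer's weights (shape d_in x d_out).

    Interference-minimizing merging solves, per layer,
        min_W  sum_i || C_i^{1/2} (W - W_i) ||_F^2
    with closed form  W = (sum_i C_i)^{-1} sum_i C_i W_i,
    where C_i is the (d_in x d_in) input covariance for task i.

    Instead of estimating C_i from task data, this sketch uses the
    data-free proxy C_i ~ D_i D_i^T, where D_i = W_i - w_base is the
    task difference matrix (illustrative assumption, not the paper's
    exact estimator).
    """
    d_in = w_base.shape[0]
    numerator = np.zeros_like(w_base)
    denominator = np.zeros((d_in, d_in))
    for w_i in task_weights:
        d_i = w_i - w_base
        # Regularized proxy covariance built only from the difference matrix.
        c_i = d_i @ d_i.T + eps * np.eye(d_in)
        numerator += c_i @ w_i
        denominator += c_i
    # Solve (sum_i C_i) W = sum_i C_i W_i for the merged weights.
    return np.linalg.solve(denominator, numerator)
```

Note that because the proxy covariance never touches training data, the merge costs only a few matrix products and one linear solve per layer, which is the computational saving the abstract refers to.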

Marawan Gamal Abdel Hameed, Derek Tam, Pascal Jr Tikeng Notsawo, Colin Raffel, Guillaume Rabusseau · 2026

Related benchmarks

Task                          Dataset                  Metric             Result   Rank
Code Generation               HumanEval+               Pass@1             54.5     383
Image Classification          8 Vision Tasks (test)    Avg Accuracy       92.2     82
Natural Language Processing   7 NLP Tasks (test)       Average Accuracy   79.8     38
Model Merging                 8 Vision Tasks (test)    Accuracy           86.1     33
Model Merging                 7 NLP Tasks (test)       Accuracy           76.7     22
Instruction Following         IFEval                   Pass@1 (Strict)    47       16
Mathematical Reasoning        AIME 24                  Pass@1 Accuracy    39.9     8
General Reasoning             Average                  Avg @1 Score       45.9     8
Code Generation               HumanEval                Pass@1             58.2     8
Mathematical Reasoning        AIME 25                  Accuracy @1        29.8     8
