
MagMax: Leveraging Model Merging for Seamless Continual Learning

About

This paper introduces a continual learning approach named MagMax, which uses model merging to let large pre-trained models continuously learn from new data without forgetting previously acquired knowledge. Unlike traditional continual learning methods that aim to reduce forgetting during task training, MagMax combines sequential fine-tuning with maximum-magnitude weight selection for effective knowledge integration across tasks. Our initial contribution is an extensive examination of model merging techniques, revealing that simple approaches like weight averaging and random weight selection surprisingly hold up well in various continual learning contexts. More importantly, we present MagMax, a novel model-merging strategy that enables continual learning of large pre-trained models on successive tasks. Our thorough evaluation demonstrates the superiority of MagMax in various scenarios, including class- and domain-incremental learning settings. The code is available at https://github.com/danielm1405/magmax.
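The maximum-magnitude selection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the common task-vector formulation (each fine-tuned model minus the pre-trained weights), picks the entry with the largest absolute value per parameter across tasks, and adds the merged vector back with a scaling coefficient; the function name and `scale` parameter are assumptions for this sketch.

```python
import numpy as np

def magmax_merge(pretrained, finetuned_list, scale=1.0):
    """Merge sequentially fine-tuned models by per-parameter max-magnitude selection."""
    # Task vectors: difference between each fine-tuned model and the pre-trained one.
    task_vectors = np.stack([ft - pretrained for ft in finetuned_list])
    # For each parameter, find which task vector has the largest magnitude.
    idx = np.argmax(np.abs(task_vectors), axis=0)
    # Select that entry per parameter to form the merged task vector.
    merged = np.take_along_axis(task_vectors, idx[None, ...], axis=0)[0]
    # Apply the merged update to the pre-trained weights.
    return pretrained + scale * merged

# Toy example with two "tasks" over a 3-parameter model:
theta0 = np.zeros(3)
theta1 = np.array([1.0, -0.5, 0.2])   # fine-tuned on task 1
theta2 = np.array([0.3, 2.0, -0.1])   # fine-tuned on task 2
merged = magmax_merge(theta0, [theta1, theta2])
print(merged)  # [1.  2.  0.2] -- largest-magnitude update wins per parameter
```

In practice the same selection would be applied per tensor across a model's state dict; the toy arrays above stand in for flattened weight tensors.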

Daniel Marczak, Bartłomiej Twardowski, Tomasz Trzciński, Sebastian Cygert • 2024

Related benchmarks

| Task                  | Dataset                   | Metric            | Result | Rank |
|-----------------------|---------------------------|-------------------|--------|------|
| Image Classification  | TinyImageNet (test)       | Accuracy          | 75.98  | 440  |
| Image Classification  | Stanford Cars (test)      | Accuracy          | 88.61  | 316  |
| Semantic Segmentation | Cityscapes (val)          | mIoU              | 70.04  | 297  |
| Image Classification  | CUB-200-2011 (test)       | Top-1 Accuracy    | 70.95  | 286  |
| Domain Generalization | PACS                      | Accuracy          | 98.62  | 231  |
| Domain Generalization | OfficeHome                | Accuracy          | 91.41  | 202  |
| Image Classification  | Oxford Flowers-102 (test) | Top-1 Accuracy    | 86.51  | 192  |
| Domain Generalization | DomainNet                 | Accuracy          | 62.24  | 133  |
| Domain Generalization | TerraIncognita            | Accuracy          | 55.16  | 101  |
| Object Navigation     | HM3D                      | Success Rate (SR) | 45.1   | 85   |
Showing 10 of 46 rows
