
Learning More Generalized Experts by Merging Experts in Mixture-of-Experts

About

We observe that incorporating a shared layer in a mixture-of-experts can lead to performance degradation. This leads us to hypothesize that learning shared features is difficult in deep learning, potentially because the same feature is redundantly learned as several different features. To address this issue, we track each expert's usage frequency and merge the two most frequently selected experts. We then overwrite the least frequently selected expert with this combination. Together with the router's subsequent relearning of expert selection, this approach lets the model determine whether the most frequently selected experts had learned the same feature differently; if so, the merged expert can be trained further to learn a more general feature. Consequently, our algorithm enhances transfer learning and mitigates catastrophic forgetting when applied to multi-domain task-incremental learning.
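The merge step described above can be sketched in a few lines. This is a minimal illustration under assumptions of our own: experts are plain parameter vectors, merging is uniform weight averaging, and the function name `merge_most_used_experts` and the usage-counter layout are hypothetical, not taken from the paper.

```python
def merge_most_used_experts(expert_weights, usage_counts):
    """Average the two most frequently selected experts and
    overwrite the least frequently selected expert with the result.

    expert_weights: list of per-expert parameter vectors (lists of floats)
    usage_counts:   list of routing frequencies, one per expert
    Returns the updated weights and the index of the overwritten expert.
    """
    # Expert indices sorted by usage, ascending.
    order = sorted(range(len(usage_counts)), key=lambda i: usage_counts[i])
    least, top2, top1 = order[0], order[-2], order[-1]

    # Uniform average of the two most-used experts (assumed merge rule).
    merged = [0.5 * (a + b)
              for a, b in zip(expert_weights[top1], expert_weights[top2])]

    # Reuse the least-used expert's slot for the merged expert.
    new_weights = [list(w) for w in expert_weights]
    new_weights[least] = merged
    return new_weights, least


# Usage: 4 toy experts with 2-dimensional weights.
weights = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [9.0, 10.0]]
counts = [10, 50, 2, 40]  # expert 2 is least used; experts 1 and 3 most used
new_weights, slot = merge_most_used_experts(weights, counts)
# slot == 2, new_weights[2] == [6.0, 7.0]
```

After this overwrite, the router would continue training and decide whether to route inputs to the merged expert, which is the signal the paper uses to detect duplicated features.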

Sejik Park • 2024

Related benchmarks

Task                             Dataset                    Metric               Result   Rank
Few-shot Image Classification    DTD                        Accuracy             64       42
Few-shot Image Classification    SUN397                     Accuracy             72.5     36
Image Classification             Food (few-shot)            Accuracy             88.8     32
Image Classification             Stanford Cars (few-shot)   Score (%)            69.5     32
Image Classification             EuroSAT (few-shot)         Accuracy             82.3     32
Image Classification             CIFAR100 (few-shot)        Accuracy             74.9     32
Image Classification             Flowers (few-shot)         Score (%)            89.4     32
Image Classification             OxfordPet (few-shot)       Score (%)            89.1     32
Multi-Task Incremental Learning  MTIL (Aircraft, Caltech101, CIFAR100, DTD, EuroSAT, Flowers, Food, MNIST, OxfordPet, Cars, SUN397)  Caltech101 Accuracy  94.7  32
Image Classification             MNIST (few-shot)           Accuracy (few-shot)  89       32

Showing 10 of 25 rows.
