
MiCA Learns More Knowledge Than LoRA and Full Fine-Tuning

About

Minor Component Adaptation (MiCA) is a novel parameter-efficient fine-tuning method for large language models that adapts underutilized subspaces of model representations. Unlike conventional methods such as Low-Rank Adaptation (LoRA), which target dominant subspaces, MiCA uses singular value decomposition to identify the subspace spanned by the minor singular vectors (those associated with the smallest singular values) and constrains parameter updates during fine-tuning to those directions. This strategy yields up to a 5.9x improvement in knowledge acquisition under optimized training hyperparameters while requiring only 6-60% of the parameters used by LoRA. These results suggest that constraining adaptation to minor singular directions provides a more efficient and stable mechanism for integrating new knowledge into pre-trained language models.
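The abstract does not include an implementation, but the mechanism it describes can be sketched in a few lines of PyTorch. In the sketch below, the class name MinorComponentAdapter, the rank r, and the trainable r x r core are illustrative assumptions rather than the authors' code: we take the SVD of a frozen pre-trained weight matrix, keep the r minor singular vectors, and train only an update confined to that subspace.

```python
import torch
import torch.nn as nn


class MinorComponentAdapter(nn.Module):
    """Sketch of a MiCA-style adapter (assumed form, not the authors' code).

    The update to the frozen weight W is constrained to the subspace
    spanned by the r *minor* singular vectors of W, i.e. those with the
    smallest singular values; LoRA-style methods typically touch the
    dominant directions instead.
    """

    def __init__(self, weight: torch.Tensor, r: int = 16):
        super().__init__()
        # Frozen pre-trained weight W, shape (out_features, in_features).
        self.weight = nn.Parameter(weight, requires_grad=False)
        # SVD of W: W = U diag(S) Vh.
        U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
        # Keep the r directions with the *smallest* singular values.
        self.register_buffer("U_minor", U[:, -r:])   # (out_features, r)
        self.register_buffer("V_minor", Vh[-r:, :])  # (r, in_features)
        # Trainable r x r core, the only parameters updated during
        # fine-tuning; zero init means the adapted layer starts out
        # identical to the frozen one.
        self.core = nn.Parameter(torch.zeros(r, r))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Effective weight: W + U_minor @ core @ V_minor, so the update
        # lies entirely in the minor singular subspace of W.
        delta = self.U_minor @ self.core @ self.V_minor
        return x @ (self.weight + delta).T


# Hypothetical usage: wrap one linear layer's weight matrix.
layer = MinorComponentAdapter(torch.randn(768, 768), r=16)
out = layer(torch.randn(4, 768))  # (4, 768)
```

With this parameterization the trainable footprint is r * r per matrix, and gradients flow only into the core, which is one plausible way to realize the small parameter budget the abstract reports relative to LoRA.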

Sten R\"udiger, Sebastian Raschka• 2026

Related benchmarks

Task                               | Dataset        | Metric       | Result | Rank
Commonsense Reasoning              | HellaSwag      | Accuracy     | 61.62  | 350
Multiple-choice Question Answering | TruthfulQA MC1 | MC1 Accuracy | 43.38  | 39
Multiple-choice Question Answering | BLOGS-MC       | Accuracy     | 75.63  | 6
Multiple-choice Question Answering | HISTORY-MC     | Accuracy     | 39.22  | 4
