Mitigating Subject Dependency in EEG Decoding with Subject-Specific Low-Rank Adapters
About
Subject-specific distribution shifts represent a fundamental obstacle to developing foundation models for brain decoding. We propose the Subject-Specific Low-Rank Adapter (SuLoRA), a drop-in replacement for standard linear or convolutional layers that captures inter-subject variability by decomposing weights into a shared, subject-invariant component and a lightweight, low-rank correction unique to each subject. This explicit separation enables existing architectures to become robust to subject shifts without architectural redesign. We evaluate SuLoRA on MEG speech perception and EEG motor imagery tasks across CNN and transformer architectures. On the speech decoding task, SuLoRA exceeds baseline performance while using half as many parameters. On the motor imagery dataset, SuLoRA outperforms both subject-agnostic models and independently trained subject-specific models. SuLoRA offers a practical path towards effective cross-subject foundation models for brain signal applications.
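The weight decomposition described above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the class and parameter names (`SuLoRALinear`, `rank`, the per-subject factors `A` and `B`) are hypothetical, and the effective weight for subject *s* is assumed to take the LoRA-style form W_shared + B_s A_s, with B_s initialised to zero so every subject starts from the shared weights.

```python
import numpy as np

class SuLoRALinear:
    """Sketch of a subject-specific low-rank adapted linear layer.

    Effective weight for subject s (assumed form):
        W_eff(s) = W_shared + B[s] @ A[s]
    where A[s] has shape (rank, d_in) and B[s] has shape (d_out, rank),
    so the per-subject correction has rank at most `rank`.
    """

    def __init__(self, d_in, d_out, n_subjects, rank=4, seed=0):
        rng = np.random.default_rng(seed)
        # Shared, subject-invariant component.
        self.W_shared = rng.standard_normal((d_out, d_in)) * 0.02
        # Per-subject low-rank factors; B starts at zero, so each subject
        # initially reproduces the shared mapping exactly.
        self.A = rng.standard_normal((n_subjects, rank, d_in)) * 0.02
        self.B = np.zeros((n_subjects, d_out, rank))

    def effective_weight(self, subject):
        return self.W_shared + self.B[subject] @ self.A[subject]

    def __call__(self, x, subject):
        # x: (batch, d_in) -> (batch, d_out)
        return x @ self.effective_weight(subject).T
```

Because only `A` and `B` are subject-specific, adapting to a new subject requires training `rank * (d_in + d_out)` extra parameters per layer rather than a full `d_out * d_in` weight matrix.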
Related benchmarks
| Task | Dataset | Accuracy (%) | Rank |
|---|---|---|---|
| Motor Imagery Classification | BCI Competition IV 2a (4 classes) | 64.72 | 8 |
| Motor Imagery Classification | BCI Competition IV 2b (2 classes) | 76.48 | 8 |