EEG-based Graph-guided Domain Adaptation for Robust Cross-Session Emotion Recognition
About
Accurate recognition of human emotional states is critical for effective human-machine interaction. Electroencephalography (EEG) offers a reliable source for emotion recognition due to its high temporal resolution and its direct reflection of neural activity. Nevertheless, variations across recording sessions present a major challenge for model generalization. To address this issue, we propose EGDA, a framework that reduces cross-session discrepancies by jointly aligning the global (marginal) and class-specific (conditional) distributions, while preserving the intrinsic structure of EEG data through graph regularization. Experimental results on the SEED-IV dataset demonstrate that EGDA achieves robust cross-session performance, obtaining accuracies of 81.22%, 80.15%, and 83.27% across three transfer tasks, and surpassing several baseline methods. Furthermore, the analysis highlights the Gamma frequency band as the most discriminative and identifies the central-parietal and prefrontal brain regions as critical for reliable emotion recognition.
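The core idea above — jointly aligning the global (marginal) and class-specific (conditional) distributions while a graph regularizer preserves local structure — can be illustrated with a minimal numpy sketch. This is not the authors' implementation: it assumes a linear-kernel MMD for both alignment terms, target pseudo-labels for the conditional term, and a simple k-NN graph Laplacian; all function and variable names are illustrative.

```python
import numpy as np

def marginal_mmd(Xs, Xt):
    # Squared MMD with a linear kernel: distance between domain means
    # (aligns the global/marginal distributions).
    return float(np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2))

def conditional_mmd(Xs, ys, Xt, yt_pseudo, classes):
    # Class-wise mean discrepancy using target pseudo-labels
    # (aligns the class-specific/conditional distributions).
    total = 0.0
    for c in classes:
        Xs_c, Xt_c = Xs[ys == c], Xt[yt_pseudo == c]
        if len(Xs_c) and len(Xt_c):
            total += np.sum((Xs_c.mean(axis=0) - Xt_c.mean(axis=0)) ** 2)
    return float(total)

def graph_laplacian(X, k=2):
    # k-NN affinity graph (0/1 weights, symmetrized) and its unnormalized
    # Laplacian L = D - W; trace(X^T L X) penalizes mapping neighboring
    # EEG samples far apart, preserving intrinsic structure.
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in np.argsort(d2[i])[1:k + 1]:
            W[i, j] = W[j, i] = 1.0
    return np.diag(W.sum(axis=1)) - W

def egda_objective(Xs, ys, Xt, yt_pseudo, classes, lam=0.1):
    # Combined objective: marginal + conditional alignment
    # plus a graph-smoothness penalty weighted by lam.
    X = np.vstack([Xs, Xt])
    L = graph_laplacian(X)
    smooth = float(np.trace(X.T @ L @ X))
    return (marginal_mmd(Xs, Xt)
            + conditional_mmd(Xs, ys, Xt, yt_pseudo, classes)
            + lam * smooth)
```

In the full method this objective would be minimized over a learned feature projection, with pseudo-labels refreshed each iteration; the sketch only evaluates the three terms on fixed features.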
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Emotion Recognition | SEED-IV Session 1 → Session 3 | Accuracy | 97.69 | 198 |
| Emotion Recognition | SEED-IV Session 2 → Session 3 | Accuracy | 100 | 198 |
| Emotion Recognition | SEED-IV Session 1 → Session 2 | Accuracy | 100 | 134 |
| Emotion Recognition | SEED Session 1 → Session 2 | Accuracy | 100 | 70 |