
Learning from Brain Topography: A Hierarchical Local-Global Graph-Transformer Network for EEG Emotion Recognition

About

Understanding how local neurophysiological patterns interact with global brain dynamics is essential for decoding human emotions from EEG signals. However, existing deep learning approaches often overlook the brain's intrinsic spatial organization, failing to simultaneously capture local topological relations and global dependencies. To address these challenges, we propose Neuro-HGLN, a Neurologically-informed Hierarchical Graph-Transformer Learning Network that integrates biologically grounded priors with hierarchical representation learning. Neuro-HGLN first constructs a spatial Euclidean prior graph based on physical electrode distances to serve as an anatomically grounded inductive bias. A learnable global dynamic graph is then introduced to model functional connectivity across the entire brain. In parallel, to capture fine-grained regional dependencies, Neuro-HGLN builds region-level local graphs using a multi-head self-attention mechanism. These graphs are processed synchronously through local-constrained parallel GCN layers to produce region-specific representations. Subsequently, an iTransformer encoder aggregates these features to capture cross-region dependencies under a dimension-as-token formulation. Extensive experiments demonstrate that Neuro-HGLN achieves state-of-the-art performance on multiple benchmarks, providing enhanced interpretability grounded in neurophysiological structure. These results highlight the efficacy of unifying local topological learning with cross-region dependency modeling for robust EEG emotion recognition.
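To make the pipeline concrete, here is a minimal NumPy sketch of the first two steps the abstract describes: building a spatial Euclidean prior graph from physical electrode coordinates, and one GCN-style propagation over the normalized adjacency. The Gaussian distance kernel, the `sigma` bandwidth, and the optional thresholding are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def euclidean_prior_graph(coords, sigma=1.0, threshold=None):
    """Spatial prior adjacency from 3-D electrode coordinates.

    coords: (N, 3) array of electrode positions.
    Edge weights decay with physical distance via a Gaussian kernel
    (a common choice; the paper's exact kernel is an assumption here).
    Returns the symmetrically normalized adjacency D^{-1/2} A D^{-1/2}.
    """
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)          # (N, N) pairwise distances
    adj = np.exp(-dist ** 2 / (2 * sigma ** 2))   # Gaussian kernel weights
    np.fill_diagonal(adj, 0.0)                    # no self-loops
    if threshold is not None:
        adj = np.where(adj >= threshold, adj, 0.0)  # sparsify weak edges
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    return d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def gcn_layer(adj_norm, features, weights):
    """One graph-convolution step: ReLU(A_hat @ H @ W)."""
    return np.maximum(adj_norm @ features @ weights, 0.0)
```

A region-level variant would apply `euclidean_prior_graph` per electrode subset (one local graph per brain region) and run the parallel GCN layers on each, before the iTransformer encoder aggregates the region features.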

Yijin Zhou, Fu Li, Yi Niu, Boxun Fu, Huaning Wang, Lijian Zhang • 2026

Related benchmarks

| Task | Dataset | Accuracy | Rank |
|---|---|---|---|
| Emotion Recognition | SEED-IV Session 1 → Session 3 | 84.74 | 198 |
| Emotion Recognition | SEED-IV Session 2 → Session 3 | 85.45 | 198 |
| Emotion Recognition | SEED-IV Session 1 → Session 2 | 82.23 | 134 |
| Emotion Recognition | SEED Session 1 → Session 2 | 92.57 | 70 |
| EEG emotion recognition | SEED Subject-independent | 90.92 | 28 |
| EEG emotion recognition | SEED-IV (Subject-independent) | 79.3 | 28 |
| EEG emotion recognition | MPED (Subject-independent) | 28.56 | 18 |
| EEG emotion recognition | SEED V (subject-independent) | 78.34 | 15 |
| Emotion Recognition | SEED-V Session 1 → Session 2 | 0.768 | 6 |
| Emotion Recognition | SEED Session 1 → Session 3 | 94.64 | 6 |

Showing 10 of 14 rows.
