Abstract
Recognizing emotions from EEG signals is challenging because EEG data are non-stationary, have a low signal-to-noise ratio, and vary considerably across subjects. In this work, we present CDA-GAF (Cross-Domain Adaptive Graph Attention Fusion), a hybrid framework that combines Graph Attention Networks (GATs), Temporal Transformers, and domain adaptation to make emotion classification models more robust and generalizable. Our method first extracts functional connectivity features from EEG channels to construct brain connectivity graphs for each frequency band. A GAT module processes these graphs to capture spatial dependencies in EEG activity, and a Temporal Transformer module then models long-range dependencies across EEG sequences. To address cross-subject variation, we add a domain adaptation layer based on CORAL loss or Domain-Adversarial Training (DANN), which aligns feature distributions between source and target subjects. We also incorporate auxiliary emotion supervision signals, such as heart-rate variability (HRV) or micro-expressions, to improve label quality by anchoring the emotional state in multiple modalities. Evaluated on the standard DEAP, SEED, and WESAD datasets, the model substantially outperforms baseline methods in both within-subject and cross-subject settings. Our findings underscore the efficacy of integrating graph-based spatial encoding, temporal attention mechanisms, and domain adaptation for EEG-based emotion recognition.
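The paper does not include released code; the PyTorch sketch below only illustrates how the pipeline described above could be wired together. The single-head graph attention layer, the layer sizes, the channel pooling step, and the CORAL implementation are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch of a CDA-GAF-style pipeline: graph attention over EEG channels,
# a temporal Transformer over time steps, and CORAL-based feature alignment.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over EEG channels (spatial encoding)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (batch, channels, in_dim); adj: (channels, channels) connectivity mask
        h = self.proj(x)
        n = h.size(1)
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)        # (B, N, N, D)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)        # (B, N, N, D)
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))       # attend only along graph edges
        alpha = torch.softmax(e, dim=-1)
        return F.elu(torch.einsum("bij,bjd->bid", alpha, h))


def coral_loss(source, target):
    """Align second-order statistics of source/target features (CORAL)."""
    d = source.size(1)
    cs, ct = torch.cov(source.T), torch.cov(target.T)
    return ((cs - ct) ** 2).sum() / (4 * d * d)


class CDAGAF(nn.Module):
    def __init__(self, n_channels=32, feat_dim=5, hidden=64, n_classes=4):
        super().__init__()
        self.gat = GraphAttentionLayer(feat_dim, hidden)
        enc_layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4,
                                               batch_first=True)
        self.temporal = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, x, adj):
        # x: (batch, time, channels, feat_dim) per-band connectivity features
        b, t, c, f = x.shape
        h = self.gat(x.reshape(b * t, c, f), adj)        # spatial attention per step
        h = h.mean(dim=1).reshape(b, t, -1)              # pool channels per time step
        h = self.temporal(h).mean(dim=1)                 # long-range temporal context
        return self.classifier(h), h                     # logits + features for CORAL


# Usage: supervised loss on labelled source subjects plus CORAL alignment
# against unlabelled target subjects (shapes and data here are synthetic).
model = CDAGAF()
adj = torch.ones(32, 32)                                 # fully connected channel graph
src, tgt = torch.randn(8, 10, 32, 5), torch.randn(8, 10, 32, 5)
labels = torch.randint(0, 4, (8,))
logits, f_src = model(src, adj)
_, f_tgt = model(tgt, adj)
loss = F.cross_entropy(logits, labels) + coral_loss(f_src, f_tgt)
```

In the DANN variant mentioned in the abstract, the CORAL term would be replaced by a domain classifier trained through a gradient-reversal layer on the same pooled features.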
Keywords
CDA, EEG, GAF, DANN

