Cross-subject affective analysis based on dynamic brain functional networks
| Main Authors: | Lifeng You, Tianyu Zhong, Erheng He, Xuejie Liu, Qinghua Zhong |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Frontiers Media S.A., 2025-04-01 |
| Series: | Frontiers in Human Neuroscience |
| Subjects: | EEG; emotion recognition; dynamic brain function network; subject independence; subject and trial independence |
| Online Access: | https://www.frontiersin.org/articles/10.3389/fnhum.2025.1445763/full |
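The Methods summary in this record describes constructing dynamic brain functional networks by computing pairwise mutual information between EEG channels within sliding time windows and thresholding the result into a binary adjacency matrix. A minimal NumPy sketch of that construction follows; the histogram bin count, window length, step, and threshold are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information (in nats) between two 1-D signals."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    nz = pxy > 0                                   # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def dynamic_networks(eeg, win, step, threshold):
    """One binary adjacency matrix per sliding window.

    eeg: (channels, samples) array; an edge is kept only when the
    mutual information between two channels reaches `threshold`.
    """
    n_ch, n_s = eeg.shape
    nets = []
    for start in range(0, n_s - win + 1, step):
        seg = eeg[:, start:start + win]
        adj = np.zeros((n_ch, n_ch), dtype=int)
        for i in range(n_ch):
            for j in range(i + 1, n_ch):
                mi = mutual_information(seg[i], seg[j])
                adj[i, j] = adj[j, i] = int(mi >= threshold)
        nets.append(adj)
    return nets
```

Each window then yields one graph, so a trial becomes a sequence of networks whose attribute features can be tracked over time.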
| author | Lifeng You, Tianyu Zhong, Erheng He, Xuejie Liu, Qinghua Zhong |
|---|---|
| collection | DOAJ |
| description | Introduction: Emotion recognition is crucial in facilitating human-computer emotional interaction. To enhance the credibility and realism of emotion recognition, researchers have turned to physiological signals, particularly EEG signals, as they directly reflect cerebral cortex activity. However, due to inter-subject variability and the non-stationarity of EEG signals, the generalization performance of models across subjects remains a challenge. Methods: In this study, we propose a novel approach that combines time-frequency analysis and brain functional networks, constructing dynamic brain functional networks with sliding time windows. This integration of the time, frequency, and spatial domains captures features effectively, reduces inter-individual differences, and improves model generalization. To construct the brain functional networks, we used mutual information to quantify the correlation between EEG channels and set appropriate thresholds. We then extracted three network attribute features (global efficiency, local efficiency, and local clustering coefficient) to classify emotions from the dynamic brain network features. Results: The proposed method was evaluated on the DEAP dataset through subject-dependent (trial-independent), subject-independent, and subject- and trial-independent experiments along both the valence and arousal dimensions. The results demonstrate that the dynamic brain functional network outperforms the static brain functional network in all three experimental cases. High classification accuracies of 90.89% and 91.17% in the valence and arousal dimensions, respectively, were achieved in the subject-independent experiments, a significant advance in EEG-based emotion recognition. In addition, per-region experiments showed that the left and right temporal lobes focus on processing individual, private emotional information, whereas the remaining brain regions attend to basic emotional information. |
| format | Article |
| id | doaj-art-32bf5e42c3e043f2a6d1e1f4671fb424 |
| institution | OA Journals |
| issn | 1662-5161 |
| language | English |
| publishDate | 2025-04-01 |
| publisher | Frontiers Media S.A. |
| series | Frontiers in Human Neuroscience |
| affiliations | Lifeng You: School of Physics, South China Normal University, Guangzhou, China; Tianyu Zhong: School of Social Sciences, Nanyang Technological University, Singapore; Erheng He: School of Physics, South China Normal University, Guangzhou, China; Xuejie Liu: School of Electronic Science and Engineering (School of Microelectronics), South China Normal University, Foshan, China; Qinghua Zhong: School of Electronic Science and Engineering (School of Microelectronics), South China Normal University, Foshan, China |
| doi | 10.3389/fnhum.2025.1445763 |
| title | Cross-subject affective analysis based on dynamic brain functional networks |
| topic | EEG; emotion recognition; dynamic brain function network; subject independence; subject and trial independence |
| url | https://www.frontiersin.org/articles/10.3389/fnhum.2025.1445763/full |
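The abstract lists three network attribute features extracted from each brain network: global efficiency, local efficiency, and the local clustering coefficient. These are standard graph measures; the pure-Python sketch below illustrates their textbook definitions on a binary adjacency matrix and is not the authors' implementation.

```python
from collections import deque

def bfs_distances(adj, src):
    """Unweighted shortest-path lengths from src (-1 if unreachable)."""
    n = len(adj)
    dist = [-1] * n
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in range(n):
            if adj[u][v] and dist[v] == -1:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs."""
    n = len(adj)
    if n < 2:
        return 0.0
    total = 0.0
    for i in range(n):
        d = bfs_distances(adj, i)
        total += sum(1.0 / d[j] for j in range(n) if j != i and d[j] > 0)
    return total / (n * (n - 1))

def clustering_coefficient(adj, i):
    """Fraction of node i's neighbour pairs that are themselves linked."""
    nbrs = [j for j in range(len(adj)) if adj[i][j]]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(adj[u][v] for ui, u in enumerate(nbrs) for v in nbrs[ui + 1:])
    return 2.0 * links / (k * (k - 1))

def local_efficiency(adj, i):
    """Global efficiency of the subgraph induced by i's neighbours."""
    nbrs = [j for j in range(len(adj)) if adj[i][j]]
    sub = [[adj[u][v] for v in nbrs] for u in nbrs]
    return global_efficiency(sub)
```

Computing these per window (and per node or brain region, for the local measures) turns each dynamic network sequence into a feature vector suitable for an emotion classifier.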