Cross-subject affective analysis based on dynamic brain functional networks

Bibliographic Details
Main Authors: Lifeng You, Tianyu Zhong, Erheng He, Xuejie Liu, Qinghua Zhong
Format: Article
Language: English
Published: Frontiers Media S.A., 2025-04-01
Series: Frontiers in Human Neuroscience
Online Access: https://www.frontiersin.org/articles/10.3389/fnhum.2025.1445763/full
Description
Summary:

Introduction: Emotion recognition is crucial in facilitating human-computer emotional interaction. To enhance the credibility and realism of emotion recognition, researchers have turned to physiological signals, particularly EEG signals, as they directly reflect cerebral cortex activity. However, due to inter-subject variability and the non-stationarity of EEG signals, the generalization performance of models across subjects remains a challenge.

Methods: In this study, we proposed a novel approach that combines time-frequency analysis and brain functional networks, constructing dynamic brain functional networks using sliding time windows. This integration of the time, frequency, and spatial domains helps to capture features effectively, reducing inter-individual differences and improving model generalization. To construct the brain functional networks, we employed mutual information to quantify the correlation between EEG channels and set appropriate thresholds. We then extracted three network attribute features (global efficiency, local efficiency, and local clustering coefficient) to achieve emotion classification based on dynamic brain network features.

Results: The proposed method is evaluated on the DEAP dataset through subject-dependent (trial-independent), subject-independent, and subject- and trial-independent experiments along both the valence and arousal dimensions. The results demonstrate that our dynamic brain functional network outperforms the static brain functional network in all three experimental settings. In the subject-independent experiments, the dynamic brain functional network achieved high classification accuracies of 90.89% and 91.17% in the valence and arousal dimensions, respectively, a significant advancement in EEG-based emotion recognition.
In addition, experiments on each brain region showed that the left and right temporal lobes focus on processing individual, subject-specific emotional information, whereas the remaining brain regions process basic emotional information shared across subjects.
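The pipeline described in the Methods section — sliding a window over multichannel EEG, building a mutual-information adjacency matrix per window, thresholding it, and extracting global efficiency, local efficiency, and local clustering — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, step size, MI threshold, binning scheme, and data shapes are all hypothetical placeholders.

```python
# Sketch of a sliding-window dynamic brain functional network.
# Assumptions (not from the paper): EEG shaped (channels, samples),
# histogram-based MI with 8 bins, and illustrative window/threshold values.
import numpy as np
import networkx as nx
from sklearn.metrics import mutual_info_score

def mi_matrix(window, bins=8):
    """Pairwise mutual information between channels within one time window."""
    n = window.shape[0]
    # Discretize each channel into equal-width bins for histogram-based MI.
    digitized = [np.digitize(ch, np.histogram_bin_edges(ch, bins)[1:-1])
                 for ch in window]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi[i, j] = mi[j, i] = mutual_info_score(digitized[i], digitized[j])
    return mi

def dynamic_network_features(eeg, fs=128, win_s=2.0, step_s=1.0, thresh=0.1):
    """Slide a window over the EEG, threshold each MI matrix into a graph,
    and extract the three network attributes named in the abstract."""
    win, step = int(win_s * fs), int(step_s * fs)
    feats = []
    for start in range(0, eeg.shape[1] - win + 1, step):
        mi = mi_matrix(eeg[:, start:start + win])
        adj = (mi > thresh).astype(int)   # binarize by the MI threshold
        np.fill_diagonal(adj, 0)          # no self-connections
        g = nx.from_numpy_array(adj)
        feats.append([
            nx.global_efficiency(g),                    # global efficiency
            nx.local_efficiency(g),                     # mean local efficiency
            np.mean(list(nx.clustering(g).values())),   # mean local clustering
        ])
    return np.array(feats)                # (windows, 3) feature sequence

# Toy usage: 8 channels, 5 s of simulated EEG at 128 Hz.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 5 * 128))
f = dynamic_network_features(x)
print(f.shape)  # (4, 3): four sliding windows, three attributes each
```

The per-window feature rows form the time-resolved (dynamic) representation that is then fed to a classifier; a static network would instead compute a single MI matrix over the whole trial.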
ISSN:1662-5161