A Multi-Context Emotional EEG Dataset for Cross-Context Emotion Decoding


Bibliographic Details
Main Authors: Xin Xu, Xinke Shen, Xuyang Chen, Qingzhu Zhang, Sitian Wang, Yihan Li, Zongsheng Li, Dan Zhang, Mingming Zhang, Quanying Liu
Format: Article
Language: English
Published: Nature Portfolio 2025-07-01
Series: Scientific Data
Online Access:https://doi.org/10.1038/s41597-025-05349-2
Description
Summary: EEG-based emotion decoding is essential for unveiling neural mechanisms of emotion and has applications in mental health and human-machine interaction. However, existing datasets for EEG-based emotion decoding are limited to a single context of emotion elicitation. The ability of emotion decoding methods to generalize across different contexts remains underexplored. To address this gap, we present the Multi-Context Emotional EEG (EmoEEG-MC) dataset, featuring 64-channel EEG and peripheral physiological data from 60 participants exposed to two distinct contexts: video-induced and imagery-induced emotions. These contexts evoke seven distinct emotional categories: joy, inspiration, tenderness, fear, disgust, sadness, and neutral emotion. The emotional experience of specific emotion categories was validated through subjective reports. To validate the potential of cross-context emotion decoding, we implemented a support vector machine with L1 regularization, achieving accuracies of 66.7% for binary classification (positive vs. negative emotions) and 28.9% for seven-category emotion classification, both significantly above chance levels. The EmoEEG-MC dataset serves as a foundational resource for understanding the neural substrates of emotion and enhancing the real-world applicability of affective computing.
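The cross-context decoding approach named in the abstract (an L1-regularized linear SVM trained in one elicitation context and tested in the other) can be sketched as below. This is a minimal illustration, not the authors' code: the feature layout (64 channels times 5 frequency bands, a common EEG convention), the trial counts, and the random data are all assumptions for demonstration.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical feature matrices: trials x (channels * bands).
# 64 channels x 5 bands = 320 features (illustrative assumption).
n_train, n_test, n_feat = 200, 100, 64 * 5
X_video = rng.normal(size=(n_train, n_feat))   # video-induced context
y_video = rng.integers(0, 2, size=n_train)     # binary: positive vs. negative

X_imagery = rng.normal(size=(n_test, n_feat))  # imagery-induced context
y_imagery = rng.integers(0, 2, size=n_test)

# L1 penalty induces sparse weights; LinearSVC requires dual=False for it.
clf = LinearSVC(penalty="l1", dual=False, C=1.0, max_iter=5000)
clf.fit(X_video, y_video)                      # train in one context
acc = clf.score(X_imagery, y_imagery)          # evaluate in the other
print(f"cross-context accuracy: {acc:.3f}")
```

With real EEG features, an accuracy significantly above the 50% (binary) or 1/7 (seven-category) chance level would indicate context-general emotion information, which is what the reported 66.7% and 28.9% figures reflect.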
ISSN: 2052-4463