A study of motor imagery EEG classification based on feature fusion and attentional mechanisms

Bibliographic Details
Main Authors: Tingting Zhu, Hailin Tang, Lei Jiang, Yijia Li, Shijun Li, Zhijian Wu
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-07-01
Series: Frontiers in Human Neuroscience
Subjects: brain-computer interface; motor imagery; EEG; attention mechanism; feature fusion
Online Access: https://www.frontiersin.org/articles/10.3389/fnhum.2025.1611229/full
_version_ 1849430775005446144
author Tingting Zhu
Hailin Tang
Lei Jiang
Yijia Li
Shijun Li
Zhijian Wu
author_facet Tingting Zhu
Hailin Tang
Lei Jiang
Yijia Li
Shijun Li
Zhijian Wu
author_sort Tingting Zhu
collection DOAJ
description Introduction: Motor imagery EEG-based action recognition is an emerging field at the intersection of brain science and information science, with promising applications in neurorehabilitation and human-computer collaboration. However, existing methods face challenges including the low signal-to-noise ratio of EEG signals, inter-subject variability, and model overfitting. Methods: We propose HA-FuseNet, an end-to-end motor imagery action classification network. The model integrates feature fusion and attention mechanisms to classify left-hand, right-hand, foot, and tongue movements. Its innovations include: (1) multi-scale dense connectivity, (2) a hybrid attention mechanism, (3) a global self-attention module, and (4) a lightweight design for reduced computational overhead. Results: On BCI Competition IV Dataset 2A, HA-FuseNet achieved 77.89% average within-subject accuracy (8.42% higher than EEGNet) and 68.53% cross-subject accuracy. Conclusion: The model demonstrates robustness to spatial resolution variations and individual differences, effectively mitigating key challenges in motor imagery EEG classification.
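The description names the building blocks of HA-FuseNet but not their exact configuration. The PyTorch sketch below is purely illustrative, not the authors' implementation: it shows one plausible way to combine multi-scale temporal convolutions, a hybrid channel/spatial attention block, and a global self-attention layer for four-class motor imagery classification on 22-electrode EEG (as in BCI Competition IV Dataset 2A). All module names (MultiScaleBlock, HybridAttention, SketchNet), kernel sizes, and channel counts are assumptions.

```python
# Illustrative sketch only: NOT the authors' HA-FuseNet implementation.
# It combines the ingredients named in the abstract (multi-scale convolutions,
# hybrid channel/spatial attention, global self-attention) for 4-class MI-EEG.
# All layer names, kernel sizes, and channel counts are assumptions.
import torch
import torch.nn as nn


class MultiScaleBlock(nn.Module):
    """Parallel temporal convolutions at several kernel sizes, concatenated."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(15, 31, 63)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, (1, k), padding=(0, k // 2), bias=False),
                nn.BatchNorm2d(out_ch),
                nn.ELU(),
            )
            for k in kernel_sizes
        ])

    def forward(self, x):                      # x: (batch, in_ch, electrodes, time)
        return torch.cat([b(x) for b in self.branches], dim=1)


class HybridAttention(nn.Module):
    """Channel attention (squeeze-and-excitation style) followed by spatial attention."""
    def __init__(self, ch, reduction=8):
        super().__init__()
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(ch, ch // reduction, 1), nn.ELU(),
            nn.Conv2d(ch // reduction, ch, 1), nn.Sigmoid(),
        )
        self.spatial = nn.Sequential(nn.Conv2d(ch, 1, 7, padding=3), nn.Sigmoid())

    def forward(self, x):
        x = x * self.channel(x)                # reweight feature maps
        return x * self.spatial(x)             # reweight electrode/time locations


class SketchNet(nn.Module):
    """Toy end-to-end classifier for (batch, 1, 22 electrodes, T samples) input."""
    def __init__(self, n_classes=4, n_electrodes=22, embed=48):
        super().__init__()
        self.multiscale = MultiScaleBlock(1, 16)          # 3 branches x 16 -> 48 maps
        self.attn = HybridAttention(embed)
        self.spatial_conv = nn.Sequential(                # collapse the electrode axis
            nn.Conv2d(embed, embed, (n_electrodes, 1), groups=embed, bias=False),
            nn.BatchNorm2d(embed), nn.ELU(),
            nn.AvgPool2d((1, 8)), nn.Dropout(0.5),
        )
        self.self_attn = nn.MultiheadAttention(embed, num_heads=4, batch_first=True)
        self.head = nn.Linear(embed, n_classes)

    def forward(self, x):
        x = self.attn(self.multiscale(x))
        x = self.spatial_conv(x).squeeze(2).transpose(1, 2)   # (batch, time, embed)
        x, _ = self.self_attn(x, x, x)                         # global self-attention
        return self.head(x.mean(dim=1))                        # pool over time, classify


if __name__ == "__main__":
    logits = SketchNet()(torch.randn(2, 1, 22, 1000))  # 2 trials, 22 ch, 1000 samples
    print(logits.shape)                                 # torch.Size([2, 4])
```

The multi-head attention layer operates over the downsampled time axis, loosely mirroring the "global self-attention module" mentioned in the abstract, while mean pooling before the linear head keeps the sketch lightweight.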
format Article
id doaj-art-8a4a8508e76a419aa1b460e3343f4549
institution Kabale University
issn 1662-5161
language English
publishDate 2025-07-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Human Neuroscience
spelling doaj-art-8a4a8508e76a419aa1b460e3343f4549
2025-08-20T03:27:52Z
eng
Frontiers Media S.A.
Frontiers in Human Neuroscience
1662-5161
2025-07-01
Volume 19
10.3389/fnhum.2025.1611229
Article 1611229
A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
Tingting Zhu (School of Big Data and Computing, Guangdong Baiyun University, Guangzhou, China)
Hailin Tang (School of Big Data and Computing, Guangdong Baiyun University, Guangzhou, China)
Lei Jiang (School of Big Data and Computing, Guangdong Baiyun University, Guangzhou, China)
Yijia Li (Dropbox Inc., San Francisco, CA, United States)
Shijun Li (School of Big Data and Computing, Guangdong Baiyun University, Guangzhou, China; School of Computer Science, Wuhan University, Wuhan, China)
Zhijian Wu (School of Big Data and Computing, Guangdong Baiyun University, Guangzhou, China; School of Computer Science, Wuhan University, Wuhan, China)
https://www.frontiersin.org/articles/10.3389/fnhum.2025.1611229/full
brain-computer interface
motor imagery
EEG
attention mechanism
feature fusion
spellingShingle Tingting Zhu
Hailin Tang
Lei Jiang
Yijia Li
Shijun Li
Zhijian Wu
A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
Frontiers in Human Neuroscience
brain-computer interface
motor imagery
EEG
attention mechanism
feature fusion
title A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
title_full A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
title_fullStr A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
title_full_unstemmed A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
title_short A study of motor imagery EEG classification based on feature fusion and attentional mechanisms
title_sort study of motor imagery eeg classification based on feature fusion and attentional mechanisms
topic brain-computer interface
motor imagery
EEG
attention mechanism
feature fusion
url https://www.frontiersin.org/articles/10.3389/fnhum.2025.1611229/full
work_keys_str_mv AT tingtingzhu astudyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT hailintang astudyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT leijiang astudyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT yijiali astudyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT shijunli astudyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT zhijianwu astudyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT tingtingzhu studyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT hailintang studyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT leijiang studyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT yijiali studyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT shijunli studyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms
AT zhijianwu studyofmotorimageryeegclassificationbasedonfeaturefusionandattentionalmechanisms