Transformers in EEG Analysis: A Review of Architectures and Applications in Motor Imagery, Seizure, and Emotion Classification
Transformers have rapidly influenced research across various domains. With their superior capability to encode long sequences, they have demonstrated exceptional performance, outperforming existing machine learning methods. There has been a rapid increase in the development of transformer-based models for EEG analysis. The high volumes of recently published papers highlight the need for further studies exploring transformer architectures, key components, and models employed particularly in EEG studies. This paper aims to explore four major transformer architectures: Time Series Transformer, Vision Transformer, Graph Attention Transformer, and hybrid models, along with their variants in recent EEG analysis. We categorize transformer-based EEG studies according to the most frequent applications in motor imagery classification, emotion recognition, and seizure detection. This paper also highlights the challenges of applying transformers to EEG datasets and reviews data augmentation and transfer learning as potential solutions explored in recent years. Finally, we provide a summarized comparison of the most recent reported results. We hope this paper serves as a roadmap for researchers interested in employing transformer architectures in EEG analysis.
| Main Authors: | Elnaz Vafaei, Mohammad Hosseini |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-02-01 |
| Series: | Sensors |
| Subjects: | transformers; vision transformer; graph attention transformer; electroencephalography (EEG); brain–computer interface (BCI); motor imagery classification |
| Online Access: | https://www.mdpi.com/1424-8220/25/5/1293 |
| Field | Value |
|---|---|
| author | Elnaz Vafaei; Mohammad Hosseini |
| collection | DOAJ |
| description | Transformers have rapidly influenced research across various domains. With their superior capability to encode long sequences, they have demonstrated exceptional performance, outperforming existing machine learning methods. There has been a rapid increase in the development of transformer-based models for EEG analysis. The high volumes of recently published papers highlight the need for further studies exploring transformer architectures, key components, and models employed particularly in EEG studies. This paper aims to explore four major transformer architectures: Time Series Transformer, Vision Transformer, Graph Attention Transformer, and hybrid models, along with their variants in recent EEG analysis. We categorize transformer-based EEG studies according to the most frequent applications in motor imagery classification, emotion recognition, and seizure detection. This paper also highlights the challenges of applying transformers to EEG datasets and reviews data augmentation and transfer learning as potential solutions explored in recent years. Finally, we provide a summarized comparison of the most recent reported results. We hope this paper serves as a roadmap for researchers interested in employing transformer architectures in EEG analysis. |
| format | Article |
| id | doaj-art-e0a67566b4584bfe8c4ca97c0c35c37f |
| institution | DOAJ |
| issn | 1424-8220 |
| language | English |
| publishDate | 2025-02-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | Sensors |
| doi | 10.3390/s25051293 |
| citation | Sensors, vol. 25, no. 5, art. no. 1293 (2025-02-01) |
| author affiliations | Elnaz Vafaei: Department of Psychology, Northeastern University, Boston, MA 02115, USA; Mohammad Hosseini: Department of Biomedical Engineering, Science and Research Branch, Islamic Azad University, Tehran 1477893855, Iran |
| title | Transformers in EEG Analysis: A Review of Architectures and Applications in Motor Imagery, Seizure, and Emotion Classification |
| topic | transformers; vision transformer; graph attention transformer; electroencephalography (EEG); brain–computer interface (BCI); motor imagery classification |
| url | https://www.mdpi.com/1424-8220/25/5/1293 |