Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects

The potential and utility of inner speech are pivotal for developing practical, everyday Brain-Computer Interface (BCI) applications, as it represents a type of brain signal that operates independently of external stimuli; however, it remains largely underdeveloped due to the challenges faced in deciphering...

Bibliographic Details
Main Authors: Yousef A. Radwan, Eslam Ahmed Mohamed, Donia Metwalli, Mariam Barakat, Anas Ahmed, Antony E. Kiroles, Sahar Selim
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-01-01
Series: Frontiers in Human Neuroscience
Subjects:
Online Access: https://www.frontiersin.org/articles/10.3389/fnhum.2025.1484470/full
_version_ 1832589838141882368
author Yousef A. Radwan
Eslam Ahmed Mohamed
Donia Metwalli
Mariam Barakat
Anas Ahmed
Antony E. Kiroles
Sahar Selim
author_facet Yousef A. Radwan
Eslam Ahmed Mohamed
Donia Metwalli
Mariam Barakat
Anas Ahmed
Antony E. Kiroles
Sahar Selim
author_sort Yousef A. Radwan
collection DOAJ
description The potential and utility of inner speech are pivotal for developing practical, everyday Brain-Computer Interface (BCI) applications, as it represents a type of brain signal that operates independently of external stimuli; however, it remains largely underdeveloped due to the challenges faced in deciphering its signals. In this study, we evaluated the behavior of various Machine Learning (ML) and Deep Learning (DL) models on a publicly available dataset, employing popular preprocessing methods as feature extractors to enhance model training. We faced significant challenges such as subject-dependent variability, high noise levels, and overfitting. To address overfitting in particular, we propose “BruteExtraTree,” a new classifier that relies on moderate stochasticity inherited from its base model, the ExtraTreeClassifier. In our experiments, this model not only matches the best DL model, ShallowFBCSPNet, in the subject-independent scenario, scoring 32% accuracy, but also surpasses the state of the art by achieving 46.6% average per-subject accuracy in the subject-dependent case. Our subject-dependent results suggest the possibility of a new paradigm for using inner speech data inspired by LLM pretraining, but they also highlight the crucial need for a drastic change in data recording or noise removal methods to open the way for more practical accuracies in the subject-independent case.
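The abstract names the classifier and its base model but not how “BruteExtraTree” is assembled. Purely as a hedged, illustrative sketch (not the authors' implementation), assuming it behaves like a majority-vote ensemble of scikit-learn ExtraTreeClassifier instances whose randomized split thresholds supply the “moderate stochasticity” credited with curbing overfitting, it might be approximated as follows; the class name BruteExtraTreeSketch and every parameter below are hypothetical.

```python
# Hypothetical sketch only: the record does not specify BruteExtraTree's internals.
# Assumption: an ensemble of scikit-learn ExtraTreeClassifier instances whose random
# split thresholds inject the "moderate stochasticity" the abstract credits with
# reducing overfitting on noisy EEG feature vectors.
import numpy as np
from sklearn.base import BaseEstimator, ClassifierMixin, clone
from sklearn.tree import ExtraTreeClassifier


class BruteExtraTreeSketch(BaseEstimator, ClassifierMixin):
    """Majority-vote ensemble of extremely randomized trees (illustrative only)."""

    def __init__(self, n_estimators=100, random_state=None):
        self.n_estimators = n_estimators
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.RandomState(self.random_state)
        base = ExtraTreeClassifier(splitter="random")  # random split thresholds
        self.classes_ = np.unique(y)
        self.estimators_ = []
        for _ in range(self.n_estimators):
            tree = clone(base)
            tree.set_params(random_state=rng.randint(np.iinfo(np.int32).max))
            self.estimators_.append(tree.fit(X, y))
        return self

    def predict(self, X):
        # Average the class-probability votes of all randomized trees.
        probas = np.mean([t.predict_proba(X) for t in self.estimators_], axis=0)
        return self.classes_[np.argmax(probas, axis=1)]


# Usage on placeholder EEG feature matrices (e.g., CSP or band-power features):
# clf = BruteExtraTreeSketch(n_estimators=200, random_state=0).fit(X_train, y_train)
# print(clf.score(X_test, y_test))
```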
format Article
id doaj-art-547170b4599b4b8382465c1cfef81017
institution Kabale University
issn 1662-5161
language English
publishDate 2025-01-01
publisher Frontiers Media S.A.
record_format Article
series Frontiers in Human Neuroscience
spellingShingle Yousef A. Radwan
Eslam Ahmed Mohamed
Donia Metwalli
Mariam Barakat
Anas Ahmed
Antony E. Kiroles
Sahar Selim
Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects
Frontiers in Human Neuroscience
EEG
brain-computer interface
inner speech
machine learning
deep learning
title Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects
title_full Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects
title_fullStr Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects
title_full_unstemmed Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects
title_short Stochasticity as a solution for overfitting—A new model and comparative study on non-invasive EEG prospects
title_sort stochasticity as a solution for overfitting a new model and comparative study on non invasive eeg prospects
topic EEG
brain-computer interface
inner speech
machine learning
deep learning
url https://www.frontiersin.org/articles/10.3389/fnhum.2025.1484470/full
work_keys_str_mv AT yousefaradwan stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects
AT eslamahmedmohamed stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects
AT doniametwalli stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects
AT mariambarakat stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects
AT anasahmed stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects
AT antonyekiroles stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects
AT saharselim stochasticityasasolutionforoverfittinganewmodelandcomparativestudyonnoninvasiveeegprospects