An fNIRS dataset for Multimodal Speech Comprehension in Normal Hearing Individuals and Cochlear Implant Users

Abstract: Understanding cortical processing in cochlear implant (CI) users is crucial for improving speech outcomes. Functional near-infrared spectroscopy (fNIRS) provides a non-invasive, implant-compatible method for assessing cortical activity during speech comprehension. However, existing studies suffer from methodological heterogeneity and a lack of standardized datasets, limiting cross-study comparisons and generalizability. To address this gap, we present a multimodal fNIRS dataset comprising 46 CI users and 26 normal hearing controls. Participants completed a clinically relevant speech comprehension task using the German Matrix Sentence Test (OLSA) under speech-in-quiet, speech-in-noise, audiovisual, and visual speech (i.e., lipreading) conditions. fNIRS recordings covered key cortical regions involved in speech processing, including the prefrontal, temporal, and visual cortices. Additionally, we provide detailed metadata, including patient history, hearing tests, behavioral measures, and spatially registered probe positions. This data descriptor aims to provide a comprehensive resource for investigating multimodal speech understanding in CI users. It enables researchers to explore cortical adaptations in prosthetic hearing, contributing to the refinement of CI rehabilitation strategies and advancing the understanding of auditory neuroplasticity.

Bibliographic Details
Main Authors: András Bálint, Wilhelm Wimmer, Christian Rummel, Marco Caversaccio, Stefan Weder
Format: Article
Language: English
Published: Nature Portfolio, 2025-07-01
Series: Scientific Data
Online Access: https://doi.org/10.1038/s41597-025-05654-w
ISSN: 2052-4463
Collection: DOAJ
Record ID: doaj-art-d385249044e04f32bc6c74aec611e136
Author Affiliations:
András Bálint, Marco Caversaccio: Hearing Research Laboratory, ARTORG Center for Biomedical Engineering Research, University of Bern
Wilhelm Wimmer, Stefan Weder: Department of ENT - Head and Neck Surgery, Inselspital, Bern University Hospital, University of Bern
Christian Rummel: Support Center for Advanced Neuroimaging (SCAN), University Institute of Diagnostic and Interventional Neuroradiology, Inselspital, Bern University Hospital, University of Bern
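
The record does not describe how the recordings are packaged or loaded. As a minimal sketch, assuming the fNIRS data are distributed as SNIRF files (a common fNIRS interchange format) and that MNE-Python is available, the snippet below shows how such a recording could be opened and converted to hemoglobin concentrations. The file name, event labels, and epoch window are hypothetical placeholders, not details taken from the dataset.

```python
# Illustrative sketch only: assumes SNIRF files and MNE-Python;
# the path and condition labels below are placeholders.
import mne

# Load one hypothetical recording.
raw = mne.io.read_raw_snirf("sub-01_task-speech_nirs.snirf", preload=True)

# Convert raw intensity to optical density, then to oxy-/deoxyhemoglobin
# concentrations via the modified Beer-Lambert law.
raw_od = mne.preprocessing.nirs.optical_density(raw)
raw_haemo = mne.preprocessing.nirs.beer_lambert_law(raw_od, ppf=6.0)

# Inspect the stimulus annotations (e.g., speech-in-quiet vs. speech-in-noise
# blocks); the actual condition names depend on how the dataset is annotated.
events, event_id = mne.events_from_annotations(raw_haemo)
print(event_id)

# Epoch around block onsets for a simple block-averaged cortical response.
epochs = mne.Epochs(raw_haemo, events, event_id=event_id,
                    tmin=-5.0, tmax=30.0, baseline=(None, 0), preload=True)
print(epochs)
```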