Neural processing of naturalistic audiovisual events in space and time
Abstract: Our brain seamlessly integrates distinct sensory information to form a coherent percept. However, when real-world audiovisual events are perceived, the specific brain regions and timings for processing different levels of information remain less investigated. To address that, we curated naturalistic videos and recorded functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) data when participants viewed videos with accompanying sounds. Our findings reveal early asymmetrical cross-modal interaction, with acoustic information represented in both early visual and auditory regions, while visual information only identified in visual cortices. The visual and auditory features were processed with similar onset but different temporal dynamics. High-level categorical and semantic information emerged in multisensory association areas later in time, indicating late cross-modal integration and its distinct role in converging conceptual information. Comparing neural representations to a two-branch deep neural network model highlighted the necessity of early cross-modal connections to build a biologically plausible model of audiovisual perception. With EEG-fMRI fusion, we provided a spatiotemporally resolved account of neural activity during the processing of naturalistic audiovisual stimuli.
Main Authors: Yu Hu, Yalda Mohsenzadeh
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Communications Biology
Online Access: https://doi.org/10.1038/s42003-024-07434-5
author | Yu Hu; Yalda Mohsenzadeh
collection | DOAJ |
description | Abstract Our brain seamlessly integrates distinct sensory information to form a coherent percept. However, when real-world audiovisual events are perceived, the specific brain regions and timings for processing different levels of information remain less investigated. To address that, we curated naturalistic videos and recorded functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) data when participants viewed videos with accompanying sounds. Our findings reveal early asymmetrical cross-modal interaction, with acoustic information represented in both early visual and auditory regions, while visual information only identified in visual cortices. The visual and auditory features were processed with similar onset but different temporal dynamics. High-level categorical and semantic information emerged in multisensory association areas later in time, indicating late cross-modal integration and its distinct role in converging conceptual information. Comparing neural representations to a two-branch deep neural network model highlighted the necessity of early cross-modal connections to build a biologically plausible model of audiovisual perception. With EEG-fMRI fusion, we provided a spatiotemporally resolved account of neural activity during the processing of naturalistic audiovisual stimuli. |
format | Article |
id | doaj-art-c025bbad017349da8162fcec9bb2d813 |
institution | Kabale University |
issn | 2399-3642 |
language | English |
publishDate | 2025-01-01 |
publisher | Nature Portfolio |
record_format | Article |
series | Communications Biology |
spelling | doaj-art-c025bbad017349da8162fcec9bb2d813 2025-01-26T12:48:19Z | eng | Nature Portfolio | Communications Biology | 2399-3642 | 2025-01-01 | 8(1):1–16 | 10.1038/s42003-024-07434-5 | Neural processing of naturalistic audiovisual events in space and time | Yu Hu (Western Institute for Neuroscience, Western University) | Yalda Mohsenzadeh (Western Institute for Neuroscience, Western University) | https://doi.org/10.1038/s42003-024-07434-5
title | Neural processing of naturalistic audiovisual events in space and time |
url | https://doi.org/10.1038/s42003-024-07434-5 |