Multisensory stimuli facilitate low-level perceptual learning on a difficult global motion task in virtual reality.
The present study investigates the feasibility of inducing visual perceptual learning on a peripheral, global direction discrimination and integration task in virtual reality, and tests whether audio-visual multisensory training induces faster or greater visual learning than unisensory visual training…
| Main Authors: | Catherine A Fromm, Ross K Maddox, Melissa J Polonenko, Krystel R Huxlin, Gabriel J Diaz |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2025-01-01 |
| Series: | PLoS ONE |
| Online Access: | https://doi.org/10.1371/journal.pone.0319007 |
Similar Items
- Generalization in perceptual learning across stimuli and tasks
  by: Ravit Kahalani-Hodedany, et al.
  Published: (2024-10-01)
- The effect of dynamic stimuli on attention under different perceptual loads
  by: Yuanli Li, et al.
  Published: (2025-07-01)
- Orthogonal neural representations support perceptual judgments of natural stimuli
  by: Ramanujan Srinath, et al.
  Published: (2025-02-01)
- The effects of stimulus variability on the perceptual learning of speech and non-speech stimuli
  by: Karen Banai, et al.
  Published: (2015-01-01)
- Augmented reality-based radial and lateral motion stimuli alter aiming performance in dart throwing
  by: Yuki Ueyama, et al.
  Published: (2025-03-01)