Rendering visual events as sounds: Spatial attention capture by auditory augmented reality.
Many salient visual events tend to coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be used to create a stable percept of the stimulus. Having access to related coincident visual and auditory information can help for spatial...
Saved in:
| Main Authors: | Scott A Stone, Matthew S Tata |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2017-01-01 |
| Series: | PLoS ONE |
| Online Access: | https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0182635&type=printable |
Similar Items
- Cueing Visual Attention to Spatial Locations With Auditory Cues
  by: Matthew Kean, et al.
  Published: (2008-12-01)
- Effects of spatial and feature attention on disparity-rendered structure-from-motion stimuli in the human visual cortex
  by: Ifan Betina Ip, et al.
  Published: (2014-01-01)
- The Impact of Selective Spatial Attention on Auditory–Tactile Integration: An Event-Related Potential Study
  by: Weichao An, et al.
  Published: (2024-12-01)
- Visual Coherence for Augmented Reality
  by: A. L. Gorbunov
  Published: (2023-07-01)
- A virtual-reality (VR) cognitive pupillometry analysis of auditory and visual phonemic awareness tasks involving ‘th’ sound variations
  by: Mohsen Mahmoudi-Dehaki, et al.
  Published: (2024-09-01)