Rendering visual events as sounds: Spatial attention capture by auditory augmented reality.

Bibliographic Details
Main Authors: Scott A Stone, Matthew S Tata
Format: Article
Language:English
Published: Public Library of Science (PLoS) 2017-01-01
Series:PLoS ONE
Online Access:https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0182635&type=printable
collection DOAJ
description Many salient visual events coincide with auditory events, such as seeing and hearing a car pass by. Information from the visual and auditory senses can be combined to create a stable percept of a stimulus, and access to coincident visual and auditory information aids spatial tasks such as localization. However, not all visual information has an analogous auditory percept; viewing a computer monitor, for example, produces no sound. Here, we describe a system capable of detecting salient visual events and augmenting them into localizable auditory events. The system uses a neuromorphic camera (DAVIS 240B) to detect logarithmic changes in brightness intensity in the scene, which can be interpreted as salient visual events. Participants were blindfolded and asked to use the device to detect new objects in the scene and to determine the direction of motion of a moving visual object. Results suggest the system is robust enough to allow simple detection of new salient stimuli, as well as accurate encoding of the direction of visual motion. Future improvements are likely as neuromorphic devices become faster and smaller, making this system more feasible.
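The abstract describes rendering events from a neuromorphic camera as localizable sounds. As an illustrative sketch only (not the authors' implementation), the core idea can be shown by mapping event positions from a DVS-style camera to a stereo pan value and estimating horizontal motion direction from the time trend of event positions; the 240-pixel sensor width matches the DAVIS 240B, while the pan law and motion threshold here are assumptions.

```python
# Sketch: map event-camera events (timestamp, x, y, polarity) to a stereo
# pan position and a left/right motion estimate. Pan law and thresholds
# are illustrative assumptions, not the published system's parameters.

def events_to_pan(events, sensor_width=240):
    """Average event x-position -> pan in [-1.0 (left), +1.0 (right)]."""
    if not events:
        return 0.0
    mean_x = sum(x for _, x, _, _ in events) / len(events)
    return 2.0 * mean_x / (sensor_width - 1) - 1.0

def motion_direction(events):
    """Compare mean x of the earlier half vs the later half of the events."""
    if len(events) < 2:
        return "none"
    ordered = sorted(events, key=lambda e: e[0])
    half = len(ordered) // 2
    early = sum(x for _, x, _, _ in ordered[:half]) / half
    late = sum(x for _, x, _, _ in ordered[half:]) / (len(ordered) - half)
    if late - early > 1.0:
        return "rightward"
    if early - late > 1.0:
        return "leftward"
    return "stationary"

# Synthetic burst: a bright edge sweeping left-to-right across the sensor.
burst = [(t, 40 + 4 * t, 60, 1) for t in range(50)]
print(round(events_to_pan(burst), 2))   # 0.15 (slightly right of center)
print(motion_direction(burst))          # rightward
```

The pan value would then drive interaural level or time differences in the rendered sound so a blindfolded listener can localize the visual event.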
id doaj-art-22bad8d841a84b0087fcdb5b6e4ca264
institution OA Journals
issn 1932-6203
spelling Stone SA, Tata MS (2017) Rendering visual events as sounds: Spatial attention capture by auditory augmented reality. PLoS ONE 12(8): e0182635. doi:10.1371/journal.pone.0182635