Enhancing eyes-free interaction in virtual reality using sonification for multiple object selection

In virtual reality (VR) environments, selecting and manipulating multiple out-of-view objects is often challenging because most current VR systems lack integrated haptics. To address this limitation, we propose a sonification method that guides users’ hands to target objects outside their field of view by assigning distinct auditory parameters (pan, frequency, and amplitude) to the three spatial axes. These parameters are discretized into three exponential steps within a comfortable volume (less than 43 dB) and frequency range (150–700 Hz), determined via pilot studies to avoid listener fatigue. Our method dynamically shifts the sound source location depending on the density of the target objects: when objects are sparsely positioned, each object serves as its own sound source, whereas for dense clusters, a single sound source is placed at the cluster’s center to prevent overlapping sounds. We validated our technique through user studies involving two VR applications: a shooting game that requires rapid weapon selection and a 3D cube keyboard for text entry. Compared to a no-sound baseline, our sonification significantly improved positional accuracy in eyes-free selection tasks. In the shooting game, participants could more easily swap weapons without losing sight of on-screen action, while in the keyboard task, typing accuracy more than doubled during blind entry. These findings suggest that sonification can substantially enhance eyes-free interaction in VR without relying on haptic or visual cues, thereby offering a promising avenue for more efficient and comfortable VR experiences.


Bibliographic Details
Main Authors: Yota Takahara, Arinobu Niijima, Chanho Park, Takefumi Ogawa
Format: Article
Language: English
Published: Frontiers Media S.A. 2025-07-01
Series: Frontiers in Virtual Reality
Subjects: sonification; eyes-free interaction; virtual reality; multiple object selection; auditory feedback
Online Access: https://www.frontiersin.org/articles/10.3389/frvir.2025.1598776/full
collection DOAJ
description In virtual reality (VR) environments, selecting and manipulating multiple out-of-view objects is often challenging because most current VR systems lack integrated haptics. To address this limitation, we propose a sonification method that guides users’ hands to target objects outside their field of view by assigning distinct auditory parameters (pan, frequency, and amplitude) to the three spatial axes. These parameters are discretized into three exponential steps within a comfortable volume (less than 43 dB) and frequency range (150–700 Hz), determined via pilot studies to avoid listener fatigue. Our method dynamically shifts the sound source location depending on the density of the target objects: when objects are sparsely positioned, each object serves as its own sound source, whereas for dense clusters, a single sound source is placed at the cluster’s center to prevent overlapping sounds. We validated our technique through user studies involving two VR applications: a shooting game that requires rapid weapon selection and a 3D cube keyboard for text entry. Compared to a no-sound baseline, our sonification significantly improved positional accuracy in eyes-free selection tasks. In the shooting game, participants could more easily swap weapons without losing sight of on-screen action, while in the keyboard task, typing accuracy more than doubled during blind entry. These findings suggest that sonification can substantially enhance eyes-free interaction in VR without relying on haptic or visual cues, thereby offering a promising avenue for more efficient and comfortable VR experiences.
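The abstract's core mapping, one auditory parameter per spatial axis, discretized into three exponential steps within the stated 150–700 Hz range, can be illustrated with a minimal Python sketch. This is not the authors' implementation: the offset normalization, the step boundaries, and the three left/center/right pan levels are assumptions introduced here for illustration.

```python
# Hypothetical sketch of the axis-to-audio mapping described in the abstract.
# Each normalized hand-to-target offset drives one auditory parameter:
# x -> stereo pan, y -> frequency (150-700 Hz), z -> amplitude (not shown).

F_MIN, F_MAX = 150.0, 700.0   # comfortable frequency range from the abstract
STEPS = 3                     # three discrete steps per parameter

def discretize(offset: float, steps: int = STEPS) -> int:
    """Map a normalized offset in [0, 1] to one of `steps` discrete levels."""
    offset = min(max(offset, 0.0), 1.0)
    return min(int(offset * steps), steps - 1)

def frequency_for(offset_y: float) -> float:
    """Exponential steps: equal spacing in log-frequency between F_MIN and F_MAX."""
    level = discretize(offset_y)
    ratio = F_MAX / F_MIN
    return F_MIN * ratio ** (level / (STEPS - 1))

def pan_for(offset_x: float) -> float:
    """Three pan positions for a left/center/right cue: -1, 0, +1."""
    return {0: -1.0, 1: 0.0, 2: 1.0}[discretize((offset_x + 1.0) / 2.0)]
```

Exponential (log-spaced) frequency steps are a natural reading of the abstract's "three exponential steps", since equal log-frequency intervals sound perceptually even; the exact boundaries used in the paper may differ.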
id doaj-art-c0b4da7447f7400db2d3eb0b3c108733
institution Kabale University
issn 2673-4192
DOI: 10.3389/frvir.2025.1598776 (Frontiers in Virtual Reality, volume 6, article 1598776)
Author affiliations:
Yota Takahara: Graduate School of Engineering, The University of Tokyo, Tokyo, Japan
Arinobu Niijima: NTT Human Informatics Laboratories, NTT Corporation, Yokosuka, Kanagawa, Japan
Chanho Park: Graduate School of Engineering, The University of Tokyo, Tokyo, Japan
Takefumi Ogawa: Information Technology Center, The University of Tokyo, Chiba, Japan
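The abstract's density-dependent sound-source placement, per-object sources when targets are sparse, a single source at the cluster center when they are dense, can be sketched as follows. The distance threshold and the all-pairs density test are assumptions introduced here; the paper's actual clustering criterion is not specified in this record.

```python
# Hypothetical sketch of density-dependent sound-source placement:
# sparse targets each emit their own sound; a dense cluster is replaced
# by a single source at its centroid to prevent overlapping sounds.

from itertools import combinations

CLUSTER_RADIUS = 0.3  # assumed threshold separating "sparse" from "dense"

def centroid(points):
    """Arithmetic mean of a list of 3D points."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def sound_sources(targets):
    """Return sound-source positions: per-object when sparse, centroid when dense."""
    if len(targets) < 2:
        return list(targets)
    dense = all(
        sum((a[i] - b[i]) ** 2 for i in range(3)) ** 0.5 <= CLUSTER_RADIUS
        for a, b in combinations(targets, 2)
    )
    return [centroid(targets)] if dense else list(targets)
```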
topic sonification
eyes-free interaction
virtual reality
multiple object selection
auditory feedback