A Hybrid Gaze Distance Estimation via Cross-Reference of Vergence and Depth

To deliver an optimal Mixed Reality (MR) experience, wherein virtual elements and real-world objects are seamlessly merged, it is vital to ensure a consistent vergence-accommodation distance. This necessitates the advancement of technology to precisely estimate the user’s gaze distance. Presently, various MR devices employ small eye-tracking cameras to capture both eyes and infer the gaze distance based on vergence angle data. However, this technique faces significant challenges, as it is highly sensitive to several human errors, such as strabismus, blinking, and fatigue of the eyes due to prolonged use. To address these issues, this paper introduces an innovative hybrid algorithm for estimating gaze distances. The proposed approach concurrently utilizes an eye camera and a depth camera to conduct parallel estimations: one based on the conventional vergence angle and the other on gaze-mapped depth information. The confidence of each method is then assessed and cross-referenced, and an adaptive weighted average is computed to derive a more precise and stable gaze distance estimation. In the experiment, three challenging test scenarios designed to induce human and environmental errors were administered to 12 subjects under uniform conditions to evaluate the accuracy and stability of the proposed method. The experimental results were validated through both qualitative and quantitative analysis. The findings showed that the proposed method significantly outperformed current methods with a visual angle error of 0.132 degrees under ideal conditions. Furthermore, it consistently maintained robustness against human and environmental errors, achieving an error range of 0.14 to 0.21 degrees even in demanding environments.
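
The abstract outlines two steps that a short sketch can make concrete: inferring gaze distance from the binocular vergence angle, and fusing the two parallel estimates (vergence-based and gaze-mapped depth) with a confidence-driven adaptive weighted average. The Python below is a minimal sketch of that idea, not the authors' implementation: the symmetric-fixation geometry, the default interpupillary distance (IPD) of 63 mm, and the confidence values are illustrative assumptions, and the paper's actual confidence measures are not reproduced here.

    import math

    def vergence_distance(vergence_angle_rad: float, ipd_m: float = 0.063) -> float:
        """Gaze distance from the binocular vergence angle.

        Assumes symmetric fixation: the two visual axes and the interocular
        baseline form an isosceles triangle, so d = (IPD / 2) / tan(angle / 2).
        """
        return (ipd_m / 2.0) / math.tan(vergence_angle_rad / 2.0)

    def fused_distance(d_vergence: float, c_vergence: float,
                       d_depth: float, c_depth: float) -> float:
        """Adaptive weighted average of the two parallel estimates.

        c_vergence and c_depth are confidences in [0, 1]; how they are
        derived (e.g. from blink detection or depth-map validity) is an
        assumption left outside this sketch.
        """
        total = c_vergence + c_depth
        if total == 0.0:
            raise ValueError("both estimates have zero confidence")
        return (c_vergence * d_vergence + c_depth * d_depth) / total

    # Example: a blink degrades the vergence estimate, so its confidence is
    # lowered and the gaze-mapped depth reading dominates the fused result.
    d_v = vergence_distance(math.radians(3.6))  # about 1.0 m at a 63 mm IPD
    d_z = 0.98                                  # metres, from the depth camera
    print(fused_distance(d_v, c_vergence=0.2, d_depth=d_z, c_depth=0.9))

The weighting degenerates gracefully: when one sensor's confidence drops toward zero (a blink for the eye camera, an invalid depth pixel for the depth camera), the fused output follows the remaining estimate rather than failing outright.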

Bibliographic Details
Main Authors: Dae-Yong Cho (ORCID: 0000-0002-6685-5306), Min-Koo Kang (ORCID: 0000-0003-1109-4818)
Affiliation: Korea Institute of Science and Technology, Seoul, South Korea
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access, vol. 12, pp. 182618–182626
DOI: 10.1109/ACCESS.2024.3510357
ISSN: 2169-3536
Subjects: Augmented reality (AR); extended reality (XR); eye tracking; gaze distance estimation; mixed reality (MR); varifocal
Online Access: https://ieeexplore.ieee.org/document/10772443/