A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering

This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze, exploiting the human visual system’s limitation...


Bibliographic Details
Main Authors: Thorsten Roth, Martin Weier, André Hinkenjann, Yongmin Li, Philipp Slusallek
Format: Article
Language:English
Published: MDPI AG 2017-09-01
Series:Journal of Eye Movement Research
Subjects:
Online Access:https://bop.unibe.ch/JEMR/article/view/3729
_version_ 1849767567640494080
author Thorsten Roth
Martin Weier
André Hinkenjann
Yongmin Li
Philipp Slusallek
author_facet Thorsten Roth
Martin Weier
André Hinkenjann
Yongmin Li
Philipp Slusallek
author_sort Thorsten Roth
collection DOAJ
description This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze, exploiting the human visual system’s limitations to increase rendering performance. Foveated rendering has particularly great potential when certain requirements must be fulfilled, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), as a high level of immersion, which can only be achieved with high rendering performance and which also helps to reduce nausea, is an important factor in this field. We put things into context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. This data stems from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. These fixation tasks consisted of a combination of various scenes and fixation modes. Besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using this data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing on the displayed fixation targets. Here, we also take a look at eccentricity-dependent quality ratings. Comparing this information with the users’ quality ratings for the displayed sequences then reveals an interesting connection between fixation modes, fixation accuracy, and quality ratings.
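The abstract mentions estimating eye tracker precision and the participants’ fixation accuracy. As a rough illustration only (not necessarily the authors’ exact method), a common way to compute these from gaze samples expressed in visual degrees is the RMS of sample-to-sample angular distances for precision, and the mean angular offset from the fixation target for accuracy:

```python
import math

def angular_distance(p, q):
    # Euclidean distance in visual degrees, assuming gaze samples are
    # already expressed as (x, y) angles relative to the view direction.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def rms_s2s_precision(samples):
    """RMS of sample-to-sample angular distances: a standard
    eye-tracker precision metric (lower = steadier signal)."""
    d2 = [angular_distance(a, b) ** 2 for a, b in zip(samples, samples[1:])]
    return math.sqrt(sum(d2) / len(d2))

def fixation_accuracy(samples, target):
    """Mean angular offset between gaze samples and the fixation
    target: a standard accuracy metric (lower = better fixation)."""
    return sum(angular_distance(s, target) for s in samples) / len(samples)
```

For moving targets, `target` would be evaluated per sample along the randomized path; the per-sample offsets then feed the same mean.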
format Article
id doaj-art-2f21b2861db34c19a9f62dbb01f61cfd
institution DOAJ
issn 1995-8692
language English
publishDate 2017-09-01
publisher MDPI AG
record_format Article
series Journal of Eye Movement Research
spelling doaj-art-2f21b2861db34c19a9f62dbb01f61cfd 2025-08-20T03:04:08Z eng MDPI AG Journal of Eye Movement Research 1995-8692 2017-09-01 vol. 10, no. 5 10.16910/jemr.10.5.2 A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering. Thorsten Roth (Bonn-Rhein-Sieg University of Applied Sciences; Brunel University London); Martin Weier (Bonn-Rhein-Sieg University of Applied Sciences; Saarland University); André Hinkenjann (Bonn-Rhein-Sieg University of Applied Sciences); Yongmin Li (Brunel University London); Philipp Slusallek (Saarland University; Intel Visual Computing Institute; German Research Centre for Artificial Intelligence (DFKI)). This work presents the analysis of data recorded by an eye tracking device in the course of evaluating a foveated rendering approach for head-mounted displays (HMDs). Foveated rendering methods adapt the image synthesis process to the user’s gaze, exploiting the human visual system’s limitations to increase rendering performance. Foveated rendering has particularly great potential when certain requirements must be fulfilled, such as low-latency rendering to cope with high display refresh rates. This is crucial for virtual reality (VR), as a high level of immersion, which can only be achieved with high rendering performance and which also helps to reduce nausea, is an important factor in this field. We put things into context by first providing basic information about our rendering system, followed by a description of the user study and the collected data. This data stems from fixation tasks that subjects had to perform while being shown fly-through sequences of virtual scenes on an HMD. These fixation tasks consisted of a combination of various scenes and fixation modes. Besides static fixation targets, moving targets on randomized paths as well as a free focus mode were tested. Using this data, we estimate the precision of the utilized eye tracker and analyze the participants’ accuracy in focusing on the displayed fixation targets. Here, we also take a look at eccentricity-dependent quality ratings. Comparing this information with the users’ quality ratings for the displayed sequences then reveals an interesting connection between fixation modes, fixation accuracy, and quality ratings. https://bop.unibe.ch/JEMR/article/view/3729 Rendering; Ray tracing; data analysis; perceived quality; eye tracking; foveated rendering
spellingShingle Thorsten Roth
Martin Weier
André Hinkenjann
Yongmin Li
Philipp Slusallek
A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
Journal of Eye Movement Research
Rendering
Ray tracing
data analysis
perceived quality
eye tracking
foveated rendering
title A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_full A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_fullStr A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_full_unstemmed A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_short A Quality-Centered Analysis of Eye Tracking Data in Foveated Rendering
title_sort quality centered analysis of eye tracking data in foveated rendering
topic Rendering
Ray tracing
data analysis
perceived quality
eye tracking
foveated rendering
url https://bop.unibe.ch/JEMR/article/view/3729
work_keys_str_mv AT thorstenroth aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT martinweier aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT andrehinkenjann aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT yongminli aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT philippslusallek aqualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT thorstenroth qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT martinweier qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT andrehinkenjann qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT yongminli qualitycenteredanalysisofeyetrackingdatainfoveatedrendering
AT philippslusallek qualitycenteredanalysisofeyetrackingdatainfoveatedrendering