A dataset of paired head and eye movements during visual tasks in virtual environments
Abstract: We describe a multimodal dataset of paired head and eye movements acquired in controlled virtual reality environments. Our dataset includes head and eye movement data for n = 25 participants who interacted with four different virtual reality environments that required coordinated head and eye be...
| Main Authors: | Colin Rubow, Chia-Hsuan Tsai, Eric Brewer, Connor Mattson, Daniel S. Brown, Haohan Zhang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2024-12-01 |
| Series: | Scientific Data |
| Online Access: | https://doi.org/10.1038/s41597-024-04184-1 |
Similar Items
- Analysis of eye and head coordination in a visual peripheral recognition task
  by: Simon Schwab, et al.
  Published: (2012-05-01)
- Decoding target discriminability and time pressure using eye and head movement features in a foraging search task
  by: Anthony J. Ries, et al.
  Published: (2025-08-01)
- Effectiveness of a Virtual Reality Head-Mounted Display System-based Developmental Eye Movement Test
  by: Jung-Ho Kim, et al.
  Published: (2016-09-01)
- Task-dependent eye-movement patterns in viewing art
  by: Nino Sharvashidze, et al.
  Published: (2020-12-01)
- Diagnosis of Parkinson’s disease by eliciting trait-specific eye movements in multi-visual tasks
  by: Maosong Jiang, et al.
  Published: (2025-01-01)