Predicting spatial familiarity by exploiting head and eye movements during pedestrian navigation in the real world
| Main Authors: | , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Nature Portfolio, 2025-03-01 |
| Series: | Scientific Reports |
| Online Access: | https://doi.org/10.1038/s41598-025-92274-4 |
| Summary: | Abstract Spatial familiarity has seen a long history of interest in wayfinding research. To date, however, no studies have been done which systematically assess the behavioral correlates of spatial familiarity, including eye and body movements. In this study, we take a step towards filling this gap by reporting on the results of an in-situ, within-subject study with N = 52 pedestrian wayfinders that combines eye-tracking and body movement sensors. In our study, participants were required to walk both a familiar route and an unfamiliar route by following auditory, landmark-based route instructions. We monitored participants’ behavior using a mobile eye tracker, a high-precision Global Navigation Satellite System receiver, and a high-precision, head-mounted Inertial Measurement Unit. We conducted machine learning experiments using Gradient-Boosted Trees to perform binary classification, testing out different feature sets, i.e., gaze only, Inertial Measurement Unit data only, and a combination of the two, to classify a person as familiar or unfamiliar with a particular route. We achieve the highest accuracy of 89.9% using exclusively Inertial Measurement Unit data, exceeding gaze alone at 67.6%, and gaze and Inertial Measurement Unit data together at 85.9%. For the highest accuracy achieved, yaw and acceleration values are most important. This finding indicates that head movements (“looking around to orient oneself”) are a particularly valuable indicator to distinguish familiar and unfamiliar environments for pedestrian wayfinders. |
| ISSN: | 2045-2322 |
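The summary describes a gradient-boosted-tree binary classifier that labels a walker as familiar or unfamiliar with a route from sensor-derived features. The following is a minimal sketch of that kind of setup using scikit-learn and synthetic data; the feature names (`yaw_var`, `accel`), the data-generating assumptions, and the model configuration are all invented for illustration and are not the paper's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400

# Synthetic IMU-style features. Following the paper's intuition that
# unfamiliar walkers "look around to orient themselves", we give the
# unfamiliar class (label 0) higher yaw variance and acceleration.
familiar = rng.integers(0, 2, size=n)  # 1 = familiar route, 0 = unfamiliar
yaw_var = rng.normal(loc=1.0 + 0.8 * (1 - familiar), scale=0.4, size=n)
accel = rng.normal(loc=1.2 + 0.3 * (1 - familiar), scale=0.5, size=n)
X = np.column_stack([yaw_var, accel])

# Train a Gradient-Boosted Trees classifier and evaluate on a held-out split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, familiar, test_size=0.25, random_state=0
)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")

# Per-feature importances, analogous to the paper's finding that yaw and
# acceleration dominate for the IMU-only model.
print("feature importances (yaw_var, accel):", clf.feature_importances_)
```

On this toy data the yaw-variance feature carries most of the signal, mirroring (but not reproducing) the article's result that head-movement features were the strongest familiarity cues.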