Predicting spatial familiarity by exploiting head and eye movements during pedestrian navigation in the real world

Abstract Spatial familiarity has seen a long history of interest in wayfinding research. To date, however, no studies have been done which systematically assess the behavioral correlates of spatial familiarity, including eye and body movements. In this study, we take a step towards filling this gap by reporting on the results of an in-situ, within-subject study with N = 52 pedestrian wayfinders that combines eye-tracking and body movement sensors. In our study, participants were required to walk both a familiar route and an unfamiliar route by following auditory, landmark-based route instructions. We monitored participants’ behavior using a mobile eye tracker, a high-precision Global Navigation Satellite System receiver, and a high-precision, head-mounted Inertial Measurement Unit. We conducted machine learning experiments using Gradient-Boosted Trees to perform binary classification, testing different feature sets, i.e., gaze only, Inertial Measurement Unit data only, and a combination of the two, to classify a person as familiar or unfamiliar with a particular route. We achieve the highest accuracy of 89.9% using exclusively Inertial Measurement Unit data, exceeding gaze alone at 67.6%, and gaze and Inertial Measurement Unit data together at 85.9%. For the highest accuracy achieved, yaw and acceleration values are most important. This finding indicates that head movements (“looking around to orient oneself”) are a particularly valuable indicator to distinguish familiar and unfamiliar environments for pedestrian wayfinders.
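The classification setup the abstract describes — binary familiar/unfamiliar prediction from head-movement (IMU) features using gradient-boosted trees — can be sketched roughly as below. This is a minimal illustration with invented feature names (`yaw_std`, `accel_std`) and synthetic data, not the authors' actual pipeline or features.

```python
# Hypothetical sketch: gradient-boosted-trees binary classification of
# "familiar" vs. "unfamiliar" walkers from IMU-style summary features.
# Feature names and the synthetic data below are assumptions for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400

# Labels: 1 = familiar route, 0 = unfamiliar route.
y = rng.integers(0, 2, n)

# Synthetic premise (consistent with the abstract's finding): unfamiliar
# walkers "look around" more, so they get larger head-yaw variability and
# larger acceleration spread than familiar walkers.
yaw_std = rng.normal(loc=np.where(y == 1, 5.0, 15.0), scale=2.0)
accel_std = rng.normal(loc=np.where(y == 1, 0.5, 1.5), scale=0.3)
X = np.column_stack([yaw_std, accel_std])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

On this cleanly separated synthetic data the classifier scores near-perfectly; the paper's 89.9% reflects real, noisier sensor streams and a richer feature set.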

Bibliographic Details
Main Authors: Markus Kattenbeck, Ioannis Giannopoulos, Negar Alinaghi, Antonia Golab, Daniel R. Montello
Format: Article
Language: English
Published: Nature Portfolio, 2025-03-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-92274-4
ISSN: 2045-2322
Collection: DOAJ
Record ID: doaj-art-0e390ac886b24fa18caacb6bc5afa861
Affiliations: Research Unit Geoinformation, TU Wien (Kattenbeck, Giannopoulos, Alinaghi); Energy Economics Group, TU Wien (Golab); Department of Geography, UC Santa Barbara (Montello)