Binocular vs. monocular 3D cues in multiple object tracking: expertise differences between soccer players and non-athletes

Bibliographic Details
Main Authors: Xiang Che, Jiayue Ma, Yu Zhang, Chen Zhou, Qian Zhou, Kun Zhang, Jijun Lan, Qi Hui, Jie Li
Format: Article
Language: English
Published: SpringerOpen 2025-07-01
Series: Cognitive Research
Subjects:
Online Access: https://doi.org/10.1186/s41235-025-00658-x
Description
Summary: Classical two-dimensional multiple object tracking (2D-MOT) measures the cognitive ability to track multiple moving elements, a skill relevant to real-life scenarios. Stereo-three-dimensional MOT (S-3D-MOT), a more ecologically valid extension of 2D-MOT, yields better tracking performance in soccer players. Its distinguishing feature is the addition of binocular and monocular 3D cues relative to 2D-MOT, but the individual contributions of these cues to MOT performance remain unclear. To fill this research gap, the current study introduced a three-dimensional MOT task presented on a flat screen (F-3D-MOT) to disentangle the roles of binocular and monocular 3D cues: F-3D-MOT adds monocular 3D cues relative to classical 2D-MOT but lacks the binocular 3D cues of S-3D-MOT. Moreover, whether the effects of these 3D cues on MOT performance differ between soccer players and non-athletes was also unknown, so both groups were recruited for this study. The results showed that soccer players performed significantly better than non-athletes only in S-3D-MOT, indicating enhanced sensitivity to binocular 3D cues; neither monocular 3D cues (F-3D-MOT) nor 2D displays produced significant group differences.
ISSN:2365-7464