Predicting Landing Position Deviation in Low-Visibility and Windy Environment Using Pilots’ Eye Movement Features

Pilots' eye movement features are critical during aircraft landing, especially in low-visibility and windy conditions. This study conducts simulated flight experiments on aircraft approach and landing under low visibility combined with three wind conditions: no wind, crosswind, and tailwind. Eye movement data are collected from 30 participants after the descent transitions from the instrument approach to the visual approach, and the landing position deviation is measured. A random forest method is then used to rank the eye movement features and to construct feature sets sequentially by feature importance. Two machine learning models (SVR and RF) and four deep learning models (GRU, LSTM, CNN-GRU, and CNN-LSTM) are trained on these feature sets to predict the landing position deviation. The results show that the cumulative fixation duration on the heading indicator, altimeter, airspeed indicator, and external scenery is vital for landing position deviation under no-wind conditions, whereas the attention allocation required by crosswind and tailwind approaches is more complex. By the MAE metric, CNN-LSTM offers the best prediction performance and stability under no-wind conditions, while CNN-GRU is better in the crosswind and tailwind cases. RF also performs well on the RMSE metric, making it well suited to predicting landing position deviations in outlier cases.
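
The abstract describes a two-stage pipeline: eye movement features are first ranked by random forest importance, nested feature sets are grown in that order, and candidate regressors are scored with MAE and RMSE. The sketch below is not the authors' code; it only illustrates that workflow with scikit-learn, using hypothetical feature names (fix_heading_indicator, fix_altimeter, and so on), synthetic data in place of the simulator recordings, and just the two machine learning models, with the deep learning models (GRU, LSTM, CNN-GRU, CNN-LSTM) omitted for brevity.

    # Minimal sketch of the feature-ranking-plus-prediction workflow described in
    # the abstract. Feature names and data below are illustrative assumptions only.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    rng = np.random.default_rng(0)

    # Hypothetical eye movement features (e.g. cumulative fixation duration per
    # area of interest); the target is a synthetic landing position deviation.
    feature_names = [
        "fix_heading_indicator", "fix_altimeter", "fix_airspeed_indicator",
        "fix_external_scenery", "fix_attitude_indicator", "saccade_rate",
    ]
    X = rng.normal(size=(180, len(feature_names)))
    y = 2.0 * X[:, 0] + X[:, 3] - 0.5 * X[:, 5] + rng.normal(scale=0.3, size=180)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    # Step 1: rank features by random forest importance.
    ranker = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)
    order = np.argsort(ranker.feature_importances_)[::-1]

    # Step 2: grow feature sets in importance order, train each candidate model,
    # and report MAE and RMSE on the held-out set.
    for k in range(1, len(feature_names) + 1):
        cols = order[:k]
        for name, model in [("RF", RandomForestRegressor(n_estimators=300, random_state=0)),
                            ("SVR", SVR(kernel="rbf", C=10.0))]:
            model.fit(X_train[:, cols], y_train)
            pred = model.predict(X_test[:, cols])
            mae = mean_absolute_error(y_test, pred)
            rmse = mean_squared_error(y_test, pred) ** 0.5
            print(f"top-{k} features | {name}: MAE={mae:.3f}  RMSE={rmse:.3f}")

In the study itself, the deep learning models would take the place of the regressors in the inner loop, and the target would be the landing position deviation measured from the simulated touchdown point rather than synthetic values.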

Bibliographic Details
Main Authors: Xiuyi Li, Yue Zhou, Weiwei Zhao, Chuanyun Fu, Zhuocheng Huang, Nianqian Li, Haibo Xu
Format: Article
Language: English
Published: MDPI AG 2025-06-01
Series: Aerospace
Subjects: eye movement features; landing position deviation; deep learning; CNN-LSTM model; random forest algorithm
ISSN: 2226-4310
DOI: 10.3390/aerospace12060523
Online Access: https://www.mdpi.com/2226-4310/12/6/523

Author Affiliations:
Xiuyi Li: CAAC Academy, Civil Aviation Flight University of China, Guanghan 618307, China
Yue Zhou: Flight Technology College, Civil Aviation Flight University of China, Guanghan 618307, China
Weiwei Zhao: Flight Technology College, Civil Aviation Flight University of China, Guanghan 618307, China
Chuanyun Fu: School of Transportation Science and Engineering, Harbin Institute of Technology, Harbin 150001, China
Zhuocheng Huang: Flight Technology College, Civil Aviation Flight University of China, Guanghan 618307, China
Nianqian Li: Flight Technology College, Civil Aviation Flight University of China, Guanghan 618307, China
Haibo Xu: Guanghan Brand, Civil Aviation Flight University of China, Guanghan 618307, China