An integrated framework for eye tracking-assisted task capability recognition of air traffic controllers with machine learning

Bibliographic Details
Main Authors: Liu, Bufan, Lye, Sun Woh, Zakaria, Zainuddin
Other Authors: School of Mechanical and Aerospace Engineering
Format: Article
Language: English
Published: 2024
Subjects:
Online Access:https://hdl.handle.net/10356/180776
Institution: Nanyang Technological University
Description
Summary: To effectively address the continuously increasing demands of air transport, air traffic management (ATM) systems are evolving towards a human-artificial intelligence (AI) hybrid automation paradigm. In this paradigm, air traffic controllers (ATCOs) play a crucial role in ensuring safe and efficient operations. Recognizing ATCOs’ task capability is essential for evaluating performance, optimizing task assignments, and personalizing training strategies. Physiological signals, such as eye movements, offer objective insights into human behavior and cognitive processes, making them valuable for identifying the task capability of ATCOs. In this study, an integrated framework leveraging machine learning is proposed to achieve this goal. First, a Transformer-attentional long short-term memory (LSTM) network is developed to analyze eye movement patterns, capturing both global and long-term dependencies for precise skill-level detection. Next, manual parameters are extracted from the raw eye-tracking data, and the SHAP (SHapley Additive exPlanations) method is used to determine the importance of these parameters, aiding the selection of relevant performance metrics. Furthermore, a radar chart is implemented to intuitively visualize and compare performance metrics across different skill levels based on the selected parameters. A case study and extensive experiments are conducted to validate the effectiveness of the proposed framework. This research advances task capability recognition for ATCOs in a human-in-the-loop scenario, with a focus on expertise level detection, parameter importance identification, and performance metric comparison.
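
To illustrate the kind of sequence model the summary describes, the sketch below shows one way a Transformer encoder can be combined with an attention-pooled LSTM to classify eye-movement sequences into skill levels. The feature set, layer sizes, sequence length, and number of classes are illustrative assumptions, not the configuration reported in the article.

    # Minimal sketch: Transformer encoder + attentional LSTM classifier for
    # eye-movement sequences. All hyperparameters below are assumptions.
    import torch
    import torch.nn as nn

    class TransformerAttnLSTM(nn.Module):
        def __init__(self, n_features=4, d_model=64, n_heads=4,
                     n_layers=2, lstm_hidden=64, n_classes=3):
            super().__init__()
            self.input_proj = nn.Linear(n_features, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=n_heads, batch_first=True)
            self.transformer = nn.TransformerEncoder(encoder_layer,
                                                     num_layers=n_layers)
            self.lstm = nn.LSTM(d_model, lstm_hidden, batch_first=True)
            # Additive attention weights over the LSTM outputs (temporal pooling).
            self.attn = nn.Linear(lstm_hidden, 1)
            self.classifier = nn.Linear(lstm_hidden, n_classes)

        def forward(self, x):                        # x: (batch, time, n_features)
            h = self.transformer(self.input_proj(x)) # global dependencies
            h, _ = self.lstm(h)                      # long-term temporal dynamics
            w = torch.softmax(self.attn(h), dim=1)   # attention over time steps
            pooled = (w * h).sum(dim=1)              # weighted temporal pooling
            return self.classifier(pooled)           # skill-level logits

    # Example: 8 sequences of 500 samples with 4 gaze features
    # (e.g., x, y, pupil diameter, velocity) -> logits over 3 skill levels.
    model = TransformerAttnLSTM()
    logits = model(torch.randn(8, 500, 4))
    print(logits.shape)  # torch.Size([8, 3])

In this sketch, the Transformer layers model global dependencies across the whole sequence, the LSTM captures long-term temporal dynamics, and an additive attention layer pools the LSTM outputs into a single vector for classification, mirroring the two kinds of dependency the abstract highlights.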