Adaptive human-robot interactions for multiple unmanned aerial vehicles

Advances in unmanned aircraft systems (UAS) have paved the way for progressively higher levels of intelligence and autonomy, supporting new modes of operation, such as the one-to-many (OTM) concept, where a single human operator is responsible for monitoring and coordinating the tasks of multiple unmanned aerial vehicles (UAVs). This paper presents the development and evaluation of cognitive human-machine interfaces and interactions (CHMI²) supporting adaptive automation in OTM applications. A CHMI² system comprises a network of neurophysiological sensors and machine-learning-based models for inferring user cognitive states, as well as an adaptation engine containing a set of transition logics for control/display functions and discrete autonomy levels. Models of the user's cognitive states are trained on past performance and neurophysiological data during an offline calibration phase, and subsequently used in the online adaptation phase for real-time inference of these cognitive states. To investigate adaptive automation in OTM applications, a bushfire detection scenario was developed in which a single human operator is responsible for tasking multiple UAV platforms to search for and localize bushfires over a wide area. We present the architecture and design of the UAS simulation environment that was developed, together with various human-machine interface (HMI) formats and functions, to evaluate the feasibility of the CHMI² system through human-in-the-loop (HITL) experiments. The CHMI² module was subsequently integrated into the simulation environment, providing the sensing, inference, and adaptation capabilities needed to realise adaptive automation. HITL experiments were performed to verify the CHMI² module's functionalities in the offline calibration and online adaptation phases. In particular, results from the online adaptation phase showed that the system was able to support real-time inference and human-machine interface and interaction (HMI²) adaptation. However, the accuracy of the inferred workload varied across participants (with a root mean squared error (RMSE) ranging from 0.2 to 0.6), partly due to the reduced number of neurophysiological features available as real-time inputs and partly due to the limited training stages in the offline calibration phase. To improve the performance of the system, future work will investigate the use of alternative machine learning techniques, additional neurophysiological input features, and a more extensive training stage.
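
The adaptation engine described in the abstract maps inferred cognitive states to discrete autonomy levels through a set of transition logics. The sketch below illustrates one way such a logic could be structured; the level names, workload thresholds, and hysteresis margin are illustrative assumptions and are not taken from the paper.

```python
from enum import IntEnum


class AutonomyLevel(IntEnum):
    """Illustrative discrete autonomy levels (names are assumptions)."""
    MANUAL = 0                   # operator tasks each UAV directly
    MANAGEMENT_BY_CONSENT = 1    # automation proposes actions, operator approves
    MANAGEMENT_BY_EXCEPTION = 2  # automation acts unless the operator intervenes


class AdaptationEngine:
    """Hysteretic transition logic from inferred workload to autonomy level."""

    # Workload values (in [0, 1]) above which the engine escalates autonomy.
    # These thresholds are assumed for illustration, not reported in the paper.
    UP_THRESHOLDS = {
        AutonomyLevel.MANUAL: 0.6,
        AutonomyLevel.MANAGEMENT_BY_CONSENT: 0.8,
    }
    HYSTERESIS = 0.15  # margin that must be cleared before de-escalating

    def __init__(self, level: AutonomyLevel = AutonomyLevel.MANUAL) -> None:
        self.level = level

    def update(self, workload: float) -> AutonomyLevel:
        """Return the autonomy level after processing one workload estimate."""
        up = self.UP_THRESHOLDS.get(self.level)
        if up is not None and workload > up:
            # Workload too high for the current level: hand more tasks to automation.
            self.level = AutonomyLevel(self.level + 1)
        elif self.level > AutonomyLevel.MANUAL:
            prev = AutonomyLevel(self.level - 1)
            if workload < self.UP_THRESHOLDS[prev] - self.HYSTERESIS:
                # Workload comfortably low again: return authority to the operator.
                self.level = prev
        return self.level
```

With these assumed thresholds, a sustained workload estimate above 0.6 would escalate from manual tasking to management by consent, and a later drop below 0.45 would return control to the operator; the hysteresis margin prevents rapid toggling when the estimate hovers near a threshold.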

Bibliographic Details
Main Authors: Lim, Yixiang, Pongsakornsathien, Nichakorn, Gardi, Alessandro, Sabatini, Roberto, Kistan, Trevor, Ezer, Neta, Bursch, Daniel J.
Other Authors: School of Mechanical and Aerospace Engineering; Saab-NTU Joint Lab
Format: Article
Language: English
Published: 2021
Published in: Robotics, 10(1). ISSN 2218-6581. DOI: 10.3390/robotics10010012
Subjects: Engineering::Mechanical engineering; Aerial Robotics; Autonomous Systems
Online Access: https://hdl.handle.net/10356/146846
Citation: Lim, Y., Pongsakornsathien, N., Gardi, A., Sabatini, R., Kistan, T., Ezer, N. & Bursch, D. J. (2021). Adaptive human-robot interactions for multiple unmanned aerial vehicles. Robotics, 10(1). https://dx.doi.org/10.3390/robotics10010012
Rights: © 2021 The Authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Institution: Nanyang Technological University