An integrated highly synchronous, high resolution, real time eye tracking system for dynamic flight movement

Bibliographic Details
Main Authors: Wee, Hong Jie, Lye, Sun Woh, Pinheiro, Jean-Philippe
Other Authors: School of Mechanical and Aerospace Engineering
Format: Article
Language: English
Published: 2021
Subjects:
Online Access:https://hdl.handle.net/10356/150646
Institution: Nanyang Technological University
Description
Summary: Electronic surveillance systems are being adopted rapidly today, ranging from simple video cameras to complex biometric systems for facial patterns and intelligent computer-vision-based surveillance systems, which are applied in many fields such as home monitoring, security surveillance of important places, and mission-critical tasks like air traffic control surveillance. Such systems normally involve a computer system and a human surveillance operator, who watches a dynamic display to perform surveillance tasks. The exploitation of information shared between these physically heterogeneous data capture systems and human-operated functions is an emerging aspect of electronic surveillance that has yet to be addressed in depth. Hence, an innovative interaction interface for such knowledge extraction and representation is required. Such an interface should establish a data activity register frame that captures information depicting various surveillance activities at a specified spatial and time reference. This paper presents a real-time eye tracking system that integrates two sets of activity data in a highly dynamic and synchronous manner, with respect to both spatial and time frames, through the “Dynamic Data Alignment and Timestamp Synchronisation Model”. This model matches the timestamps of the two data streams and aligns them to the same spatial reference frame before fusing them into a data activity register frame. The Air Traffic Control (ATC) domain is used to illustrate the model: experiments were conducted under simulated radar traffic situations with participants and their radar input data. Test results revealed that the model is able to synchronise the timestamps of the eye and dynamic display data and to align both sets of data spatially, while accounting for dynamic changes in space and time on a simulated radar display.
This system can also distinguish and show variations in the monitoring behaviour of participants. As such, new knowledge can be extracted and represented through this innovative interface, which can then be applied to other applications in the field of electronic surveillance to unearth the monitoring behaviour of human surveillance operators.
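The pipeline described in the abstract — match timestamps across two data streams, relate them in a common spatial reference frame, then fuse the pair into an activity register entry — can be sketched in outline. This is a minimal illustrative sketch only, not the paper's actual model: the data types (`GazeSample`, `DisplayObject`), the nearest-timestamp tolerance, and the gaze-to-object distance threshold are all assumptions made for illustration.

```python
from dataclasses import dataclass
from bisect import bisect_left

# Hypothetical record types; the paper's real data schema is not given here.
@dataclass
class GazeSample:
    t_ms: int    # eye-tracker timestamp in milliseconds
    x: float     # gaze x on screen (pixels)
    y: float     # gaze y on screen (pixels)

@dataclass
class DisplayObject:
    t_ms: int    # timestamp of the radar display frame
    obj_id: str  # e.g. an aircraft callsign
    x: float     # object x on the display (pixels)
    y: float     # object y on the display (pixels)

def nearest_frame(frames, t_ms, tol_ms=50):
    """Find the display record whose timestamp is closest to t_ms.
    frames must be sorted by t_ms; returns None beyond the tolerance."""
    times = [f.t_ms for f in frames]
    i = bisect_left(times, t_ms)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(frames)]
    best = min(candidates, key=lambda j: abs(frames[j].t_ms - t_ms))
    return frames[best] if abs(frames[best].t_ms - t_ms) <= tol_ms else None

def fuse(gaze_stream, display_stream, radius_px=40.0, tol_ms=50):
    """Build a simple activity register: for each gaze sample, pick the
    time-matched display record, then test whether the gaze point falls
    within radius_px of the moving object's current position."""
    register = []
    for g in gaze_stream:
        frame = nearest_frame(display_stream, g.t_ms, tol_ms)
        if frame is None:
            continue  # no display data close enough in time
        dist = ((g.x - frame.x) ** 2 + (g.y - frame.y) ** 2) ** 0.5
        register.append({
            "t_ms": g.t_ms,
            "obj_id": frame.obj_id if dist <= radius_px else None,
            "gaze": (g.x, g.y),
        })
    return register
```

Because the display is dynamic, the spatial test is evaluated against the object's position in the time-matched frame rather than a fixed location, which is the essence of accounting for "dynamic changes in space and time" in this simplified form.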