Enhancing multimodal interactions with eye-tracking for virtual reality applications

The motion of dragging is a common yet imperative action in many forms of human-computer interaction, including Virtual Reality. With the growing availability of commercial eye-tracking devices, researchers have begun to investigate eye-based multimodal interactions’ performance in dragging tasks...


Bibliographic Details
Main Author: Chia, Wen Han
Other Authors: Cai Yiyu
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2021
Subjects: Engineering::Industrial engineering::Human factors engineering
Online Access:https://hdl.handle.net/10356/154171
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-154171
record_format dspace
spelling sg-ntu-dr.10356-1541712021-12-19T12:42:56Z Enhancing multimodal interactions with eye-tracking for virtual reality applications Chia, Wen Han Cai Yiyu School of Mechanical and Aerospace Engineering Defense Science Organization National Laboratories (DSO) Andrew Ho Si Yong Tan He Jiang MYYCai@ntu.edu.sg, hsiyong@dso.org.sg, thejiang@dso.org.sg Engineering::Industrial engineering::Human factors engineering The motion of dragging is a common yet imperative action in many forms of human-computer interaction, including Virtual Reality. With the growing availability of commercial eye-tracking devices, researchers have begun to investigate the performance of eye-based multimodal interactions in dragging tasks in desktop settings. However, little is known about the performance of eye-based multimodal interactions in 3D dragging tasks with Virtual Reality head-mounted displays. Thirty-one participants volunteered in the study, which compared the usability of eye-gaze with button click, eye-gaze with dwell time, and the default Vive controller for 3D dragging tasks in Virtual Reality head-mounted displays. Based on the ISO 9241-9 standard, a novel immersive 3D dragging task was designed and implemented to facilitate the experiment. The task difficulty was varied by adjusting the following variables: target width, target-destination angular distance, and direction of path curvature. An additional selection task was implemented along with the dragging task to investigate multitasking performance. Contrary to our hypothesis, the controller was the fastest, achieved the highest throughput, and was the most preferred modality among the three modalities. It also offered the highest precision and accuracy in the dragging task. Notably, gaze with click had speed and accuracy comparable to the controller. Even though both gaze with click and gaze with dwell were highly imprecise in the dragging task, they were still well-preferred by participants.
Furthermore, design guidelines were recommended for the position of visual targets in the horizontal field of view and for visual target size in the immersive 3D dragging task. In conclusion, the controller is the most usable modality for an immersive 3D dragging task. Gaze with click could still suffice as a usable modality when low precision is required in the dragging task. Bachelor of Engineering (Mechanical Engineering) 2021-12-19T12:42:55Z 2021-12-19T12:42:55Z 2021 Final Year Project (FYP) Chia, W. H. (2021). Enhancing multimodal interactions with eye-tracking for virtual reality applications. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/154171 https://hdl.handle.net/10356/154171 en A271 application/pdf Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Industrial engineering::Human factors engineering
spellingShingle Engineering::Industrial engineering::Human factors engineering
Chia, Wen Han
Enhancing multimodal interactions with eye-tracking for virtual reality applications
description The motion of dragging is a common yet imperative action in many forms of human-computer interaction, including Virtual Reality. With the growing availability of commercial eye-tracking devices, researchers have begun to investigate the performance of eye-based multimodal interactions in dragging tasks in desktop settings. However, little is known about the performance of eye-based multimodal interactions in 3D dragging tasks with Virtual Reality head-mounted displays. Thirty-one participants volunteered in the study, which compared the usability of eye-gaze with button click, eye-gaze with dwell time, and the default Vive controller for 3D dragging tasks in Virtual Reality head-mounted displays. Based on the ISO 9241-9 standard, a novel immersive 3D dragging task was designed and implemented to facilitate the experiment. The task difficulty was varied by adjusting the following variables: target width, target-destination angular distance, and direction of path curvature. An additional selection task was implemented along with the dragging task to investigate multitasking performance. Contrary to our hypothesis, the controller was the fastest, achieved the highest throughput, and was the most preferred modality among the three modalities. It also offered the highest precision and accuracy in the dragging task. Notably, gaze with click had speed and accuracy comparable to the controller. Even though both gaze with click and gaze with dwell were highly imprecise in the dragging task, they were still well-preferred by participants. Furthermore, design guidelines were recommended for the position of visual targets in the horizontal field of view and for visual target size in the immersive 3D dragging task. In conclusion, the controller is the most usable modality for an immersive 3D dragging task. Gaze with click could still suffice as a usable modality when low precision is required in the dragging task.
author2 Cai Yiyu
author_facet Cai Yiyu
Chia, Wen Han
format Final Year Project
author Chia, Wen Han
author_sort Chia, Wen Han
title Enhancing multimodal interactions with eye-tracking for virtual reality applications
title_short Enhancing multimodal interactions with eye-tracking for virtual reality applications
title_full Enhancing multimodal interactions with eye-tracking for virtual reality applications
title_fullStr Enhancing multimodal interactions with eye-tracking for virtual reality applications
title_full_unstemmed Enhancing multimodal interactions with eye-tracking for virtual reality applications
title_sort enhancing multimodal interactions with eye-tracking for virtual reality applications
publisher Nanyang Technological University
publishDate 2021
url https://hdl.handle.net/10356/154171
_version_ 1720447156842659840