Eye-gaze controlled virtual keyboard

Patients who suffer from locked-in syndrome lose their ability to communicate via speech. Often, they rely on alternative methods of communication, yet these methods can be cumbersome and require significant assistance. In this project, we aim to develop a user-friendly communication system using a virtual keyboard via eye-gaze input. By leveraging Tobii eye-tracking technology, the system offers an intuitive interface and reduces reliance on caregiver assistance for setup and maintenance. Through experimental evaluation, we investigate the effectiveness of different keyboard layouts for our system and find that familiarity with the QWERTY keyboard layout significantly influenced performance. The results demonstrate the effectiveness of the eye-gaze input method in facilitating communication for paralysed patients, contributing to improved quality of life and autonomy.


Bibliographic Details
Main Author: Quek, Sherrie Jie Ru
Other Authors: Smitha Kavallur Pisharath Gopi
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2024
Subjects:
Eye
Online Access:https://hdl.handle.net/10356/175166
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-175166
record_format dspace
spelling sg-ntu-dr.10356-1751662024-04-26T15:43:17Z Eye-gaze controlled virtual keyboard Quek, Sherrie Jie Ru Smitha Kavallur Pisharath Gopi School of Computer Science and Engineering smitha@ntu.edu.sg Computer and Information Science Eye Virtual keyboard Patients who suffer from locked-in syndrome lose their ability to communicate via speech. Often, they rely on alternative methods of communication, yet these methods can be cumbersome and require significant assistance. In this project, we aim to develop a user-friendly communication system using a virtual keyboard via eye-gaze input. By leveraging Tobii eye-tracking technology, the system offers an intuitive interface and reduces reliance on caregiver assistance for setup and maintenance. Through experimental evaluation, we investigate the effectiveness of different keyboard layouts for our system and find that familiarity with the QWERTY keyboard layout significantly influenced performance. The results demonstrate the effectiveness of the eye-gaze input method in facilitating communication for paralysed patients, contributing to improved quality of life and autonomy. Bachelor's degree 2024-04-22T05:57:48Z 2024-04-22T05:57:48Z 2024 Final Year Project (FYP) Quek, S. J. R. (2024). Eye-gaze controlled virtual keyboard. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175166 https://hdl.handle.net/10356/175166 en application/pdf Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Computer and Information Science
Eye
Virtual keyboard
spellingShingle Computer and Information Science
Eye
Virtual keyboard
Quek, Sherrie Jie Ru
Eye-gaze controlled virtual keyboard
description Patients who suffer from locked-in syndrome lose their ability to communicate via speech. Often, they rely on alternative methods of communication, yet these methods can be cumbersome and require significant assistance. In this project, we aim to develop a user-friendly communication system using a virtual keyboard via eye-gaze input. By leveraging Tobii eye-tracking technology, the system offers an intuitive interface and reduces reliance on caregiver assistance for setup and maintenance. Through experimental evaluation, we investigate the effectiveness of different keyboard layouts for our system and find that familiarity with the QWERTY keyboard layout significantly influenced performance. The results demonstrate the effectiveness of the eye-gaze input method in facilitating communication for paralysed patients, contributing to improved quality of life and autonomy.
author2 Smitha Kavallur Pisharath Gopi
author_facet Smitha Kavallur Pisharath Gopi
Quek, Sherrie Jie Ru
format Final Year Project
author Quek, Sherrie Jie Ru
author_sort Quek, Sherrie Jie Ru
title Eye-gaze controlled virtual keyboard
title_short Eye-gaze controlled virtual keyboard
title_full Eye-gaze controlled virtual keyboard
title_fullStr Eye-gaze controlled virtual keyboard
title_full_unstemmed Eye-gaze controlled virtual keyboard
title_sort eye-gaze controlled virtual keyboard
publisher Nanyang Technological University
publishDate 2024
url https://hdl.handle.net/10356/175166
_version_ 1800916223049859072