Comparing Eye-Tracking versus EEG Features for Four-Class Emotion Classification in VR Predictive Analytics
This paper presents a novel emotion recognition approach using electroencephalography (EEG) brainwave signals augmented with eye-tracking data in virtual reality (VR) to classify emotions according to the four-quadrant circumplex model. 360° videos are used as stimuli to evoke the user's emotions (happy, angry, bored, calm), presented through a VR headset and a pair of earphones. EEG signals are recorded via a wearable EEG brain-computer interfacing (BCI) device, and pupil diameter is collected from a wearable portable eye-tracker. Five frequency bands (Delta, Theta, Alpha, Beta, and Gamma) are extracted from the EEG data, while pupil diameter is obtained from the eye-tracker as the chosen eye-related feature for this investigation. A Support Vector Machine (SVM) with a Radial Basis Function (RBF) kernel is used as the classifier. The best accuracies based on EEG brainwave signals and pupil diameter are 98.44% and 58.30%, respectively.
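The pipeline described in the abstract (band-power features from EEG, classified with an RBF-kernel SVM) can be sketched roughly as follows. This is a minimal illustration on synthetic stand-in data, not the authors' implementation; the sampling rate, epoch length, and feature choice are assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Canonical EEG frequency bands (Hz), as named in the abstract.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs=128):
    """Mean spectral power in each band for one EEG epoch (1-D array)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

# Synthetic stand-in data: 200 epochs, 4 emotion classes
# (happy / angry / bored / calm in the paper's four-quadrant model).
rng = np.random.default_rng(0)
X = np.array([band_powers(rng.normal(size=256)) for _ in range(200)])
y = rng.integers(0, 4, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)  # SVM with RBF kernel
print(round(clf.score(X_te, y_te), 2))
```

On the random data above the score is near chance; the paper's 98.44% figure refers to real EEG recordings, not this sketch.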
Main Authors: Lim, Jia Zheng; James Mountstephens; Jason Teo
Format: Article
Language: English
Published: 2020
Subjects: T Technology (General); TA Engineering (General). Civil engineering (General)
Online Access:
https://eprints.ums.edu.my/id/eprint/25561/1/Comparing%20Eye-Tracking%20versus%20EEG%20Features%20for%20Four-Class%20Emotion%20Classification%20in%20VR%20Predictive%20Analytics.pdf
https://eprints.ums.edu.my/id/eprint/25561/2/Comparing%20Eye-Tracking%20versus%20EEG%20Features%20for%20Four-Class%20Emotion%20Classification%20in%20VR%20Predictive%20Analytics1.pdf
https://eprints.ums.edu.my/id/eprint/25561/
Institution: Universiti Malaysia Sabah
Citation: Lim, Jia Zheng; James Mountstephens; Jason Teo (2020). Comparing Eye-Tracking versus EEG Features for Four-Class Emotion Classification in VR Predictive Analytics. International Journal of Advanced Science and Technology, 29(6), pp. 1492-1497. (Peer-reviewed article.)