Four-class emotion classification in virtual reality using pupillometry
Background: Emotion classification remains a challenging problem in affective computing. The large majority of emotion classification studies rely on electroencephalography (EEG) and/or electrocardiography (ECG) signals and classify the emotions into only two or three classes. Moreover, the stimuli us...
Main Authors: | Lim Jia Zheng, James Mountstephens, Jason Teo |
Format: | Article |
Language: | English |
Published: | 2020 |
Online Access: | https://eprints.ums.edu.my/id/eprint/26209/1/Four.pdf https://eprints.ums.edu.my/id/eprint/26209/2/s40537-020-00322-9.pdf https://eprints.ums.edu.my/id/eprint/26209/ https://doi.org/10.1186/s40537-020-00322-9 |
Institution: | Universiti Malaysia Sabah |
Language: | English |
id |
my.ums.eprints.26209 |
record_format |
eprints |
spelling |
my.ums.eprints.26209 2021-04-08T14:36:52Z https://eprints.ums.edu.my/id/eprint/26209/ Four-class emotion classification in virtual reality using pupillometry Lim Jia Zheng James Mountstephens Jason Teo Background: Emotion classification remains a challenging problem in affective computing. The large majority of emotion classification studies rely on electroencephalography (EEG) and/or electrocardiography (ECG) signals and classify the emotions into only two or three classes. Moreover, the stimuli used in most emotion classification studies are either music or visual stimuli presented through conventional displays such as computer display screens or television screens. This study reports on a novel approach to recognizing emotions using pupillometry alone, in the form of pupil diameter data, to classify emotions into four distinct classes according to Russell's Circumplex Model of Emotions, utilizing emotional stimuli presented in a virtual reality (VR) environment. The stimuli used in this experiment are 360° videos presented using a VR headset. Using an eye-tracker, pupil diameter is acquired as the sole classification feature. Three classifiers were used for the emotion classification: Support Vector Machine (SVM), k-Nearest Neighbor (KNN), and Random Forest (RF). Findings: SVM achieved the best performance for the four-class intra-subject classification task at an average of 57.05% accuracy, which is more than twice the accuracy of a random classifier. Although the accuracy can still be significantly improved, this study reports on the first systematic study on the use of eye-tracking data alone, without any other supplementary sensor modalities, to perform human emotion classification, and demonstrates that even with the single feature of pupil diameter alone, emotions could be classified into four distinct classes to a certain level of accuracy.
Moreover, the best performance for recognizing a particular class was 70.83%, which was achieved by the KNN classifier for Quadrant 3 emotions. Conclusion: This study presents the first systematic investigation on the use of pupillometry as the sole feature to classify emotions into four distinct classes using VR stimuli. The ability to conduct emotion classification using pupil data alone represents a promising new approach to affective computing, as new applications could be developed using readily-available webcams on laptops and other camera-equipped mobile devices, without the need for specialized and costly equipment such as EEG and/or ECG as the sensor modality. 2020 Article PeerReviewed text en https://eprints.ums.edu.my/id/eprint/26209/1/Four.pdf text en https://eprints.ums.edu.my/id/eprint/26209/2/s40537-020-00322-9.pdf Lim Jia Zheng and James Mountstephens and Jason Teo (2020) Four-class emotion classification in virtual reality using pupillometry. Journal of Big Data, 7 (43). pp. 1-9. https://doi.org/10.1186/s40537-020-00322-9 |
institution |
Universiti Malaysia Sabah |
building |
UMS Library |
collection |
Institutional Repository |
continent |
Asia |
country |
Malaysia |
content_provider |
Universiti Malaysia Sabah |
content_source |
UMS Institutional Repository |
url_provider |
http://eprints.ums.edu.my/ |
language |
English |
description |
Background: Emotion classification remains a challenging problem in affective computing. The large majority of emotion classification studies rely on electroencephalography (EEG) and/or electrocardiography (ECG) signals and classify the emotions into only two or three classes. Moreover, the stimuli used in most emotion classification studies are either music or visual stimuli presented through conventional displays such as computer display screens or television screens. This study reports on a novel approach to recognizing emotions using pupillometry alone, in the form of pupil diameter data, to classify emotions into four distinct classes according to Russell's Circumplex Model of Emotions, utilizing emotional stimuli presented in a virtual reality (VR) environment. The stimuli used in this experiment are 360° videos presented using a VR headset. Using an eye-tracker, pupil diameter is acquired as the sole classification feature. Three classifiers were used for the emotion classification: Support Vector Machine (SVM), k-Nearest Neighbor (KNN), and Random Forest (RF). Findings: SVM achieved the best performance for the four-class intra-subject classification task at an average of 57.05% accuracy, which is more than twice the accuracy of a random classifier. Although the accuracy can still be significantly improved, this study reports on the first systematic study on the use of eye-tracking data alone, without any other supplementary sensor modalities, to perform human emotion classification, and demonstrates that even with the single feature of pupil diameter alone, emotions could be classified into four distinct classes to a certain level of accuracy. Moreover, the best performance for recognizing a particular class was 70.83%, which was achieved by the KNN classifier for Quadrant 3 emotions.
Conclusion: This study presents the first systematic investigation on the use of pupillometry as the sole feature to classify emotions into four distinct classes using VR stimuli. The ability to conduct emotion classification using pupil data alone represents a promising new approach to affective computing, as new applications could be developed using readily-available webcams on laptops and other camera-equipped mobile devices, without the need for specialized and costly equipment such as EEG and/or ECG as the sensor modality. |
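The abstract describes comparing SVM, KNN, and Random Forest classifiers on pupil diameter as the sole feature for four-class emotion recognition. The sketch below is illustrative only, not the authors' actual pipeline: it uses scikit-learn with synthetic data standing in for the paper's pupil-diameter recordings, and the per-trial summary features (mean, std, min, max) are an assumption, not taken from the paper.

```python
# Illustrative sketch: four-class emotion classification from pupil-diameter
# features, comparing the three classifiers named in the abstract.
# Synthetic data is used here; the study's real features came from an
# eye-tracker inside a VR headset.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Hypothetical per-trial features: e.g. mean/std/min/max of pupil diameter
X = rng.normal(size=(200, 4))
# Labels: the four quadrants of Russell's Circumplex Model (0..3)
y = rng.integers(0, 4, size=200)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, clf in classifiers.items():
    # 5-fold cross-validated accuracy; chance level for 4 classes is 25%
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2%} mean accuracy")
```

With random synthetic labels the accuracies hover near the 25% chance level; the paper's reported 57.05% intra-subject SVM accuracy is what "more than twice the accuracy of a random classifier" refers to.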
format |
Article |
author |
Lim Jia Zheng James Mountstephens Jason Teo |
spellingShingle |
Lim Jia Zheng James Mountstephens Jason Teo Four-class emotion classification in virtual reality using pupillometry |
author_facet |
Lim Jia Zheng James Mountstephens Jason Teo |
author_sort |
Lim Jia Zheng |
title |
Four-class emotion classification in virtual reality using pupillometry |
title_short |
Four-class emotion classification in virtual reality using pupillometry |
title_full |
Four-class emotion classification in virtual reality using pupillometry |
title_fullStr |
Four-class emotion classification in virtual reality using pupillometry |
title_full_unstemmed |
Four-class emotion classification in virtual reality using pupillometry |
title_sort |
four-class emotion classification in virtual reality using pupillometry |
publishDate |
2020 |
url |
https://eprints.ums.edu.my/id/eprint/26209/1/Four.pdf https://eprints.ums.edu.my/id/eprint/26209/2/s40537-020-00322-9.pdf https://eprints.ums.edu.my/id/eprint/26209/ https://doi.org/10.1186/s40537-020-00322-9 |
_version_ |
1760230469692030976 |