Consumer grade brain sensing for emotion recognition

© 2001-2012 IEEE. For several decades, electroencephalography (EEG) has been one of the most commonly used tools for emotional state recognition via monitoring of distinctive brain activities. An array of datasets has been generated using diverse emotion-eliciting stimuli, with the resulting brainwave responses conventionally captured by high-end EEG devices. However, the applicability of these devices is limited by practical constraints, and they are difficult to deploy in the highly mobile contexts of everyday life. In this study, we evaluate the potential of OpenBCI to bridge this gap, first by comparing its performance to a research-grade EEG system using the same algorithms that were applied to benchmark datasets. Moreover, for the purpose of emotion classification, we propose a novel method to facilitate the selection of audio-visual stimuli of high/low valence and arousal. We recruited 200 healthy volunteers of varying ages to identify the top 60 affective video clips from 120 candidates through standardized self-assessment, genre tags, and unsupervised machine learning. In addition, 43 participants were enrolled to watch the pre-selected clips while emotional EEG brainwaves and peripheral physiological signals were recorded. These recordings were analyzed, and the extracted features were fed into a classification model to predict whether the elicited signals were associated with a high or low level of valence and arousal. Our prediction accuracies proved comparable to those of previous studies that used more costly EEG amplifiers for data acquisition.
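
The abstract does not specify which features or classifier were used, so the following is only a minimal sketch of the kind of high/low-valence pipeline it describes, assuming band-power features and a generic SVM; the sampling rate, frequency bands, and placeholder data are likewise assumptions rather than details from the paper.

```python
# Minimal sketch, not the authors' published pipeline: band-power features
# from EEG epochs fed to a binary high/low-valence classifier. Sampling rate,
# frequency bands, classifier choice, and the placeholder data are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 250  # Hz; assumed consumer-device sampling rate
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(epoch):
    """epoch: (n_channels, n_samples) -> flattened per-channel band powers."""
    freqs, psd = welch(epoch, fs=FS, nperseg=2 * FS, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Mean PSD within the band as a simple band-power proxy.
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)

# Placeholder data: 40 one-minute trials, 8 channels; labels 0 = low, 1 = high valence.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 8, 60 * FS))
labels = rng.integers(0, 2, size=40)

X = np.stack([band_powers(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```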


Bibliographic Details
Main Authors: Payongkit Lakhan, Nannapas Banluesombatkul, Vongsagon Changniam, Ratwade Dhithijaiyratn, Pitshaporn Leelaarporn, Ekkarat Boonchieng, Supanida Hompoonsup, Theerawit Wilaiprasitporn
Format: Journal
Published: 2020
Subjects: Engineering; Physics and Astronomy
Online Access:https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85073212855&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/67808
Institution: Chiang Mai University
id th-cmuir.6653943832-67808
record_format dspace
spelling th-cmuir.6653943832-67808 2020-04-02T15:18:24Z
Consumer grade brain sensing for emotion recognition
Payongkit Lakhan; Nannapas Banluesombatkul; Vongsagon Changniam; Ratwade Dhithijaiyratn; Pitshaporn Leelaarporn; Ekkarat Boonchieng; Supanida Hompoonsup; Theerawit Wilaiprasitporn
Engineering; Physics and Astronomy
© 2001-2012 IEEE. For several decades, electroencephalography (EEG) has been one of the most commonly used tools for emotional state recognition via monitoring of distinctive brain activities. An array of datasets has been generated using diverse emotion-eliciting stimuli, with the resulting brainwave responses conventionally captured by high-end EEG devices. However, the applicability of these devices is limited by practical constraints, and they are difficult to deploy in the highly mobile contexts of everyday life. In this study, we evaluate the potential of OpenBCI to bridge this gap, first by comparing its performance to a research-grade EEG system using the same algorithms that were applied to benchmark datasets. Moreover, for the purpose of emotion classification, we propose a novel method to facilitate the selection of audio-visual stimuli of high/low valence and arousal. We recruited 200 healthy volunteers of varying ages to identify the top 60 affective video clips from 120 candidates through standardized self-assessment, genre tags, and unsupervised machine learning. In addition, 43 participants were enrolled to watch the pre-selected clips while emotional EEG brainwaves and peripheral physiological signals were recorded. These recordings were analyzed, and the extracted features were fed into a classification model to predict whether the elicited signals were associated with a high or low level of valence and arousal. Our prediction accuracies proved comparable to those of previous studies that used more costly EEG amplifiers for data acquisition.
2020-04-02T15:04:49Z 2020-04-02T15:04:49Z 2019-11-01 Journal
ISSN: 1558-1748; 1530-437X
Scopus EID: 2-s2.0-85073212855
DOI: 10.1109/JSEN.2019.2928781
https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85073212855&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/67808
institution Chiang Mai University
building Chiang Mai University Library
country Thailand
collection CMU Intellectual Repository
topic Engineering
Physics and Astronomy
spellingShingle Engineering
Physics and Astronomy
Payongkit Lakhan
Nannapas Banluesombatkul
Vongsagon Changniam
Ratwade Dhithijaiyratn
Pitshaporn Leelaarporn
Ekkarat Boonchieng
Supanida Hompoonsup
Theerawit Wilaiprasitporn
Consumer grade brain sensing for emotion recognition
description © 2001-2012 IEEE. For several decades, electroencephalography (EEG) has been one of the most commonly used tools for emotional state recognition via monitoring of distinctive brain activities. An array of datasets has been generated using diverse emotion-eliciting stimuli, with the resulting brainwave responses conventionally captured by high-end EEG devices. However, the applicability of these devices is limited by practical constraints, and they are difficult to deploy in the highly mobile contexts of everyday life. In this study, we evaluate the potential of OpenBCI to bridge this gap, first by comparing its performance to a research-grade EEG system using the same algorithms that were applied to benchmark datasets. Moreover, for the purpose of emotion classification, we propose a novel method to facilitate the selection of audio-visual stimuli of high/low valence and arousal. We recruited 200 healthy volunteers of varying ages to identify the top 60 affective video clips from 120 candidates through standardized self-assessment, genre tags, and unsupervised machine learning. In addition, 43 participants were enrolled to watch the pre-selected clips while emotional EEG brainwaves and peripheral physiological signals were recorded. These recordings were analyzed, and the extracted features were fed into a classification model to predict whether the elicited signals were associated with a high or low level of valence and arousal. Our prediction accuracies proved comparable to those of previous studies that used more costly EEG amplifiers for data acquisition.
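
For the stimulus-selection step mentioned in the description, the sketch below illustrates one plausible way to shortlist 60 of 120 clips by clustering their mean self-assessed valence/arousal ratings; the rating scale, the use of k-means, and the per-cluster selection rule are assumptions for illustration, not the authors' documented procedure.

```python
# Illustrative sketch only: shortlisting affective clips from self-assessment
# ratings with unsupervised clustering. Scale, k-means, and the selection rule
# are assumptions, not the authors' published method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Placeholder ratings: 200 raters x 120 clips x (valence, arousal) on a 1-9 scale.
ratings = rng.uniform(1, 9, size=(200, 120, 2))

# Summarise each clip by its mean rated valence and arousal.
clip_va = ratings.mean(axis=0)  # shape: (120, 2)

# Group clips into four affective clusters (roughly the HV/LV x HA/LA quadrants).
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(clip_va)

# From each cluster, keep the 15 clips closest to its centroid (4 x 15 = 60).
selected = []
for c in range(4):
    idx = np.flatnonzero(km.labels_ == c)
    dists = np.linalg.norm(clip_va[idx] - km.cluster_centers_[c], axis=1)
    selected.extend(idx[np.argsort(dists)[:15]].tolist())

print(sorted(selected))  # indices of the 60 shortlisted clips
```
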
format Journal
author Payongkit Lakhan
Nannapas Banluesombatkul
Vongsagon Changniam
Ratwade Dhithijaiyratn
Pitshaporn Leelaarporn
Ekkarat Boonchieng
Supanida Hompoonsup
Theerawit Wilaiprasitporn
author_facet Payongkit Lakhan
Nannapas Banluesombatkul
Vongsagon Changniam
Ratwade Dhithijaiyratn
Pitshaporn Leelaarporn
Ekkarat Boonchieng
Supanida Hompoonsup
Theerawit Wilaiprasitporn
author_sort Payongkit Lakhan
title Consumer grade brain sensing for emotion recognition
title_short Consumer grade brain sensing for emotion recognition
title_full Consumer grade brain sensing for emotion recognition
title_fullStr Consumer grade brain sensing for emotion recognition
title_full_unstemmed Consumer grade brain sensing for emotion recognition
title_sort consumer grade brain sensing for emotion recognition
publishDate 2020
url https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85073212855&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/67808
_version_ 1681426703571746816