EEG-based dominance level recognition for emotion-enabled interaction

Emotions recognized from the electroencephalogram (EEG) can reflect a person's real "inner" feelings. Recently, research on real-time emotion recognition has received more attention because it can be applied in games, e-learning systems, and even marketing. The EEG signal can be divided into delta, theta, alpha, beta, and gamma waves based on their frequency bands. Based on the Valence-Arousal-Dominance emotion model, we proposed a subject-dependent algorithm that uses the beta/alpha ratio to recognize high and low dominance levels of emotions from EEG. Three experiments were designed and carried out to collect EEG data labeled with emotions. Sound clips from the International Affective Digitized Sounds (IADS) database and music pieces were used to evoke emotions in the experiments. Our approach allows real-time recognition of emotions defined with different dominance levels in the Valence-Arousal-Dominance model.
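
The abstract describes recognizing high and low dominance levels from the beta/alpha band-power ratio of the EEG. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' published algorithm: it estimates alpha and beta band power with Welch's method and applies a per-subject threshold to the ratio. The band edges, Welch parameters, threshold value, and the thresholding step itself are assumptions made for illustration.

    # Minimal sketch (not the authors' implementation): estimate the beta/alpha
    # band-power ratio of a single-channel EEG epoch and threshold it to label
    # high vs. low dominance. Band edges, Welch parameters, and the per-subject
    # threshold are illustrative assumptions.
    import numpy as np
    from scipy.signal import welch

    ALPHA_BAND = (8.0, 13.0)   # Hz, assumed alpha range
    BETA_BAND = (13.0, 30.0)   # Hz, assumed beta range

    def band_power(signal, fs, band):
        """Integrate the Welch power spectral density over a frequency band."""
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * int(fs)))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return np.trapz(psd[mask], freqs[mask])

    def dominance_label(epoch, fs, subject_threshold):
        """Label an EEG epoch as high or low dominance from the beta/alpha ratio.

        subject_threshold is a per-subject (i.e. subject-dependent) cut-off that
        would be calibrated on labeled training epochs; its value is assumed here.
        """
        ratio = band_power(epoch, fs, BETA_BAND) / band_power(epoch, fs, ALPHA_BAND)
        return "high" if ratio > subject_threshold else "low"

    # Example with synthetic data: a 4-second epoch sampled at 256 Hz.
    if __name__ == "__main__":
        fs = 256
        epoch = np.random.randn(4 * fs)
        print(dominance_label(epoch, fs, subject_threshold=1.0))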

Bibliographic Details
Main Authors: Liu, Yisi; Sourina, Olga
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language: English
Published: 2013
Conference: IEEE International Conference on Multimedia and Expo (2012 : Melbourne, Australia)
DOI: 10.1109/ICME.2012.20
Online Access:https://hdl.handle.net/10356/84775
http://hdl.handle.net/10220/12944
Institution: Nanyang Technological University