Emotion recognition using a brain-computer interface
Saved in:
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: 2017
Subjects:
Online Access: http://hdl.handle.net/10356/70200
Institution: Nanyang Technological University
Summary: Emotions play an important role in our daily life, but at the same time they are complex and difficult to interpret when someone intends to conceal them. In recent years there has been growing interest in recognizing emotion from Electroencephalogram (EEG) signals, because these signals are produced continuously and are difficult to manipulate deliberately. EEG has therefore been regarded as a more efficient and effective tool than other methods for measuring and understanding the electrical activity of the human brain. The arousal-valence scale has been widely used in research on emotion recognition. However, one study using music videos as stimuli, Power Spectral Density (PSD) as features, and a Support Vector Machine (SVM) as classifier reported accuracies of 58.8% and 55.7% for valence and arousal respectively. In another experiment, using video clips as stimuli with the same PSD features and SVM classifier, the accuracies were 57% and 52.4% for valence and arousal respectively.
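As an illustration of the PSD features mentioned above (a sketch only, not the thesis's own C#/MATLAB code), band powers can be estimated in Python with Welch's method. The 128 Hz sampling rate matches the Emotiv Epoc's output rate, and the band limits are conventional assumptions:

```python
import numpy as np
from scipy.signal import welch

FS = 128  # Hz; the Emotiv Epoc delivers EEG at 128 samples per second

def band_powers(signal, fs=FS):
    """Welch PSD estimate integrated over conventional EEG bands (assumed limits)."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    df = freqs[1] - freqs[0]
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in bands.items()}

# Example: 10 s synthetic channel dominated by a 10 Hz (alpha-band) oscillation
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
p = band_powers(sig)
```

Concatenating such band powers across the headset's 14 channels yields a fixed-length feature vector per trial, which is the form a classifier expects.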
This study aims to classify four emotion states (Happy, Sad, Calm and Frightened) from EEG signals recorded while subjects watch video stimuli. The graphical user interface (GUI) for emotion elicitation was developed in C#, and the EEG processing algorithms were implemented in MATLAB. Video clips were chosen as stimuli because watching them resembles witnessing a situation and reacting emotionally in real life. The clips were selected from online sources through recommendation. The Emotiv Epoc headset, with 14 electrode channels, was used to record the EEG signals of the test subjects during the experiment. The recorded signals were stored in a .csv file and processed in MATLAB.
The processing stages include baseline filtering to remove voltage offset, bandpass filtering into the various frequency bands, conversion of the signals from the time domain to the frequency domain, feature extraction and feature selection. Finally, the processed data was classified into emotion states using a multi-class Support Vector Machine (MULTISVM). An accuracy of 80.53% was achieved for two-emotion elicitation (happy and sad), while 63.75% was achieved for four-emotion elicitation.
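The final classification step can be sketched as follows. This is a minimal illustration using scikit-learn's one-vs-one `SVC` in place of MATLAB's MULTISVM; the synthetic band-power features, trial counts and class separations are all illustrative assumptions, not the thesis's data:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
EMOTIONS = ["happy", "sad", "calm", "frightened"]  # the study's four states

# Synthetic feature matrix: 40 trials per class, 14 channels x 3 bands = 42
# features. Each class mean is shifted to mimic separable PSD features.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(40, 42))
               for i in range(len(EMOTIONS))])
y = np.repeat(np.arange(len(EMOTIONS)), 40)

# RBF-kernel SVM; decision_function_shape="ovo" trains one binary SVM per
# class pair, the usual construction behind multi-class SVM wrappers.
clf = SVC(kernel="rbf", decision_function_shape="ovo")
scores = cross_val_score(clf, X, y, cv=5)
mean_acc = scores.mean()
```

With well-separated synthetic classes the cross-validated accuracy is near 1.0; on real EEG features, overlap between emotion states pulls it down toward the figures reported above.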
This suggests that the more emotions are involved, the less accurately they can be classified, as the emotions may overlap. In addition, different people can feel differently in the same situation, which makes classification harder; taking individual preference into account could therefore yield higher accuracy. Further work could also compile video stimuli suited to the cultures of various countries, making it easier to source stimuli.