Investigation of multimodality sensors for real-time emotion assessment
Affective research using a multi-modal approach has increasingly involved the use of commercially available sensor devices. However, there is a lack of published data on the accuracy and performance of such devices in predicting emotions. The objective of the project is to analyse the performance and capabilities of commercial sensor devices in predicting human emotions.
Saved in:
Main Author: Chua, Yong Lun
Other Authors: Chan Syin; Aung Aung Phyo Wai
Format: Final Year Project
Language: English
Published: 2016
Subjects: DRNTU::Engineering::Computer science and engineering
Online Access: http://hdl.handle.net/10356/66620
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-66620
record_format: dspace
spelling: sg-ntu-dr.10356-66620 2023-03-03T20:53:13Z. Investigation of multimodality sensors for real-time emotion assessment. Chua, Yong Lun; Chan Syin; Aung Aung Phyo Wai. School of Computer Engineering; A*STAR Institute for Infocomm Research (I2R). DRNTU::Engineering::Computer science and engineering. Bachelor of Engineering (Computer Science). Final Year Project (FYP). Deposited 2016-04-19T03:20:31Z; published 2016. http://hdl.handle.net/10356/66620. English. Nanyang Technological University. 76 p.; application/pdf.
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: DRNTU::Engineering::Computer science and engineering
description:
Affective research using a multi-modal approach has increasingly involved the use of commercially available sensor devices. However, there is a lack of published data on the accuracy and performance of such devices in predicting emotions. The objective of the project is to analyse the performance and capabilities of commercial sensor devices in predicting human emotions. Two methods of quantifying emotions are introduced: Paul Ekman's six discrete basic emotions and the circumplex valence-arousal dimensional model. The project also analyses whether the emotion elicitation methodology adopted, namely the International Affective Picture System (IAPS) or the Karolinska Directed Emotional Faces (KDEF), has any effect on a device's performance.
The project is split into three phases: development, experiment and analysis. The development phase covers the experiment design and the development of the hardware and software systems. An experimental software application was developed in C# that interfaces with the Muse EEG headband, the Amped PPG sensor and the Intel RealSense camera. The software records the subject's EEG, PPG and facial data while images from the IAPS and KDEF libraries elicit emotions in two separate experiments.
The second phase was the conduct of the experiments themselves. The IAPS and KDEF experiments attracted 10 and 8 subjects respectively. Subjects were shown 20 images chosen from the IAPS library and were required to mimic the facial expressions in 18 images from the KDEF library. In both experiments, subjects answered a Self-Assessment Manikin (SAM) questionnaire after viewing each image to record their subjective valence-arousal ratings.
A preliminary analysis of the data collected from the experiments was done using Weka's J48 decision tree with 10-fold cross-validation to determine the accuracy of each sensor device in both experiments. Recommendations for future work and possible improvements are discussed towards the end of the report.
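The analysis step in the abstract maps directly onto Weka's Java API. Below is a minimal sketch of evaluating one sensor's feature set with J48 (Weka's C4.5 implementation) and 10-fold cross-validation; the file name muse_eeg_features.arff and the assumption that the emotion label is the last attribute are illustrative placeholders, not taken from the report.

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SensorAccuracy {
    public static void main(String[] args) throws Exception {
        // Load one sensor's features (hypothetical ARFF file per device).
        Instances data = DataSource.read("muse_eeg_features.arff");
        // Assume the emotion label is the last attribute.
        data.setClassIndex(data.numAttributes() - 1);

        J48 tree = new J48();                  // C4.5 decision-tree learner
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(tree, data, 10, new Random(1)); // 10 folds

        System.out.println(eval.toSummaryString());  // overall accuracy
        System.out.println(eval.toMatrixString());   // confusion matrix
    }
}

Repeating this run once per device (EEG, PPG, facial features) would yield the per-sensor accuracies the report compares across the IAPS and KDEF experiments.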
author2: Chan Syin
format: Final Year Project
author: Chua, Yong Lun
title: Investigation of multimodality sensors for real-time emotion assessment
publishDate: 2016
url: http://hdl.handle.net/10356/66620
_version_: 1759856713124544512