Planning of new experimental protocol for identifying human emotions & analysis of new Kinect data

Bibliographic Details
Main Author: Lim, Jia-Min
Other Authors: Justin Dauwels
Format: Final Year Project
Language: English
Published: 2016
Online Access: http://hdl.handle.net/10356/67743
Institution: Nanyang Technological University
Description
Summary: Humans express emotions every day; whether positive or negative, these expressions represent our feelings at that point in time. Researchers have been analysing human facial expressions for years, motivated to discover how these emotions can be used to predict people's responses in useful applications. In this project, we examine how the Kinect platform can be used to better understand humans by detecting their facial expressions and associating them with emotions. This report describes the protocol for conducting an experiment that uses a motion-sensing input device, Kinect for Windows version 1, to analyse the captured images with different classification methods. The system will be trained to recognise facial movements and to identify the associated emotions based on the differences between these movements. This project is a collaboration between two Final Year Project (FYP) students and two Masters students. The author focuses primarily on setting up the protocol for conducting new experiments and on analysing the new data captured with Kinect 1; the other students use other devices such as Kinect 2 and an eye tracker. After new experiments are conducted with new subjects, the captured images are put through different classifiers to train the system to recognise facial expressions and map them to emotions. The purpose of this project is to enable applications to use this system to better gauge people's responses and make more informed follow-up decisions.
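
The record does not include any code, but a minimal sketch may help illustrate the kind of classification step the summary describes (training a classifier on facial features extracted from Kinect 1 recordings). The feature-extraction step, file names, and label scheme below are assumptions for illustration only, not details taken from the report.

```python
# Hypothetical sketch: train and evaluate an emotion classifier on
# facial-feature vectors extracted from Kinect 1 recordings.
# Feature files, label names and the choice of classifier are assumptions,
# not part of the original report.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# X: one row per captured image/frame, columns are facial-feature values
#    (e.g. Kinect face-tracking Animation Units); y: emotion label per row.
X = np.load("kinect1_features.npy")   # hypothetical feature file
y = np.load("kinect1_labels.npy")     # e.g. "happy", "sad", "neutral", ...

# Hold out part of the data so the trained system is evaluated on unseen images.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# One possible choice among the "different methods of classification"
# mentioned in the summary; an SVM with an RBF kernel is a common baseline.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

# Report per-emotion precision and recall on the held-out images.
print(classification_report(y_test, clf.predict(X_test)))
```

Other classifiers (e.g. k-nearest neighbours or decision trees) could be swapped in at the same point in the pipeline to compare methods, which matches the summary's mention of analysing the images by different methods of classification.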