Automated emotion recognition based on extreme learning machines

Although the information in still images can already enable a computer to perform emotion recognition, moving images naturally contain even more information, empowering the computer to further improve its ability to recognize emotions. Hence, in this project, we will investigate the effectiveness of emotion recognition using visual information from videos, using dynamic Haar-like filters for feature extraction and the Extreme Learning Machine (ELM) as the classifier. The project is divided into three parts: part 1 examines static Haar-like features and part 2 examines dynamic Haar-like features. Both parts consist of three phases: pre-processing, feature extraction, and classification. In the first phase, pre-processing, facial normalization based on eye coordinates will be performed on images from the Cohn-Kanade database. In the second phase, feature extraction, static Haar-like filters will be used to extract static Haar-like features from the most expressive normalized image of each subject, while dynamic Haar-like filters will be used to extract dynamic Haar-like features from the normalized video sequence of each subject. In the third phase, classification, training and testing will be performed on the extracted features using ELM with a kernel to evaluate the accuracy of emotion classification. In part 3, the accuracy obtained with static and dynamic Haar-like features will be compared to test the effectiveness of using dynamic Haar-like features. Finally, two different classifiers will be further investigated in relation to dynamic Haar-like features, namely the Sparse Representation Classifier (SRC) and the Extreme Learning Machine (ELM).
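As context for the feature-extraction phase described above, a static Haar-like feature is the difference between pixel sums over adjacent rectangles, typically computed from an integral image. The sketch below is a generic illustration, not the project's code: the function names, the 64x64 image size, and the random stand-in image are assumptions, and the dynamic (video-based) Haar-like filters used in the project are not shown.

import numpy as np

def integral_image(img):
    # Summed-area table with a zero row/column prepended, so any
    # rectangle sum needs only four table lookups.
    return np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, x, y, w, h):
    # Sum of pixels in the rectangle with top-left corner (x, y), width w, height h.
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect_horizontal(ii, x, y, w, h):
    # Basic two-rectangle Haar-like feature: left half minus right half.
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = np.random.default_rng(0).random((64, 64))   # stand-in for a normalized face image
ii = integral_image(img)
print(haar_two_rect_horizontal(ii, x=8, y=8, w=16, h=16))

For the classification phase, the abstract names the kernel variant of the Extreme Learning Machine. The following is a minimal, generic kernel-ELM classifier in the usual Huang-style formulation (output weights beta = (Omega + I/C)^-1 T, predictions from K(x, X_train) beta); the RBF kernel, the parameter values, the class name KernelELM, and the random stand-in features are assumptions for illustration and do not reproduce the project's implementation.

import numpy as np

def rbf_kernel(A, B, gamma):
    # RBF (Gaussian) kernel matrix between the rows of A and the rows of B.
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

class KernelELM:
    # Minimal kernel Extreme Learning Machine for multi-class problems.
    def __init__(self, C=1.0, gamma=0.01):
        self.C = C          # regularization strength
        self.gamma = gamma  # RBF kernel width

    def fit(self, X, y):
        self.X_train = X
        self.classes_ = np.unique(y)
        T = (y[:, None] == self.classes_[None, :]).astype(float)   # one-hot targets
        Omega = rbf_kernel(X, X, self.gamma)                        # training kernel matrix
        # Output weights: beta = (Omega + I/C)^-1 T
        self.beta = np.linalg.solve(Omega + np.eye(X.shape[0]) / self.C, T)
        return self

    def predict(self, X):
        scores = rbf_kernel(X, self.X_train, self.gamma) @ self.beta
        return self.classes_[scores.argmax(axis=1)]

# Usage on random stand-ins for Haar-like feature vectors and six emotion labels:
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(60, 100)), rng.integers(0, 6, size=60)
X_test = rng.normal(size=(10, 100))
print(KernelELM(C=10.0, gamma=0.01).fit(X_train, y_train).predict(X_test))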


Bibliographic Details
Main Author: Tan, Yikai
Other Authors: Teoh Eam Khwang
Format: Final Year Project
Language: English
Published: 2014
Subjects: DRNTU::Engineering::Electrical and electronic engineering::Computer hardware, software and systems
School: School of Electrical and Electronic Engineering
Degree: Bachelor of Engineering
Physical Description: 87 p. (application/pdf)
Online Access: http://hdl.handle.net/10356/60112
Institution: Nanyang Technological University
Collection: DR-NTU (NTU Library)
Record ID: sg-ntu-dr.10356-60112