Real time academic emotion recognition using body gestures

Emotion plays a powerful role in students' learning process; however, many intelligent tutoring systems (ITS) only consider the knowledge models of the students. Hence, building affect models for ITSs has been an emerging research area. Researchers have proposed several approaches to affect recognition based on...

Bibliographic Details
Main Author: Cheung, Oi Hing
Format: text
Language: English
Published: Animo Repository 2012
Subjects: Emotion recognition
Online Access:https://animorepository.dlsu.edu.ph/etd_masteral/4288
Institution: De La Salle University
Language: English
id oai:animorepository.dlsu.edu.ph:etd_masteral-11126
record_format eprints
spelling oai:animorepository.dlsu.edu.ph:etd_masteral-11126 2024-08-24T02:54:59Z Real time academic emotion recognition using body gestures Cheung, Oi Hing Emotion plays a powerful role in students' learning process; however, many intelligent tutoring systems (ITS) only consider the knowledge models of the students. Hence, building affect models for ITSs has been an emerging research area. Researchers have proposed several approaches to affect recognition based on the students' facial expressions, speech, and body gestures. However, most of these works focused on gathering emotional information from the facial expressions and speech of the students. Acted emotion databases and expensive hardware have also been used to predict student emotions. In this paper, a markerless approach is proposed to detect four basic academic emotions (boredom, flow, confusion, frustration) from the student based on their body gestures. The data for this study were gathered from five students using a Microsoft Kinect camera. The features extracted were hand position and duration, arm position, speed, head tilt, leaning, and shifting. Given the extracted features, the gestures detected by the system include hands up/hands near face/hands down, arms up/arms down, scratch/steady, fast/slow, lean forward/lean backward, lean right/lean left, and shifting/not shifting. To classify the emotion of the student, rules were generated based on the student's gestures. The system generated five rules for each student and achieved accuracies of 72.57%, 91.06%, 76.42%, 77.91%, and 81.96% for the five students, respectively. 2012-01-01T08:00:00Z text https://animorepository.dlsu.edu.ph/etd_masteral/4288 Master's Theses English Animo Repository Emotion recognition
institution De La Salle University
building De La Salle University Library
continent Asia
country Philippines
Philippines
content_provider De La Salle University Library
collection DLSU Institutional Repository
language English
topic Emotion recognition
spellingShingle Emotion recognition
Cheung, Oi Hing
Real time academic emotion recognition using body gestures
description Emotion plays a powerful role in students' learning process; however, many intelligent tutoring systems (ITS) only consider the knowledge models of the students. Hence, building affect models for ITSs has been an emerging research area. Researchers have proposed several approaches to affect recognition based on the students' facial expressions, speech, and body gestures. However, most of these works focused on gathering emotional information from the facial expressions and speech of the students. Acted emotion databases and expensive hardware have also been used to predict student emotions. In this paper, a markerless approach is proposed to detect four basic academic emotions (boredom, flow, confusion, frustration) from the student based on their body gestures. The data for this study were gathered from five students using a Microsoft Kinect camera. The features extracted were hand position and duration, arm position, speed, head tilt, leaning, and shifting. Given the extracted features, the gestures detected by the system include hands up/hands near face/hands down, arms up/arms down, scratch/steady, fast/slow, lean forward/lean backward, lean right/lean left, and shifting/not shifting. To classify the emotion of the student, rules were generated based on the student's gestures. The system generated five rules for each student and achieved accuracies of 72.57%, 91.06%, 76.42%, 77.91%, and 81.96% for the five students, respectively. (An illustrative sketch of such gesture-to-emotion rules appears after this record.)
format text
author Cheung, Oi Hing
author_facet Cheung, Oi Hing
author_sort Cheung, Oi Hing
title Real time academic emotion recognition using body gestures
title_short Real time academic emotion recognition using body gestures
title_full Real time academic emotion recognition using body gestures
title_fullStr Real time academic emotion recognition using body gestures
title_full_unstemmed Real time academic emotion recognition using body gestures
title_sort real time academic emotion recognition using body gestures
publisher Animo Repository
publishDate 2012
url https://animorepository.dlsu.edu.ph/etd_masteral/4288
_version_ 1808617238077374464
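Illustrative sketch (not from the thesis): the description above reports that the system maps detected gestures (hands up/near face/down, arms up/down, scratch/steady, fast/slow, lean direction, shifting) to one of four academic emotions using rules generated per student. The Python sketch below shows what such a rule-based gesture-to-emotion mapping could look like; the GestureFrame fields, the specific rules, and the example values are assumptions made for demonstration, not the rules learned in the study.

from dataclasses import dataclass


@dataclass
class GestureFrame:
    """Gesture labels detected for one time window (assumed representation)."""
    hands: str      # "up", "near_face", or "down"
    arms: str       # "up" or "down"
    activity: str   # "scratch" or "steady"
    speed: str      # "fast" or "slow"
    lean: str       # "forward", "backward", "left", or "right"
    shifting: bool  # whether posture shifting was detected in the window


def classify_emotion(frame: GestureFrame) -> str:
    """Map a gesture frame to one of the four academic emotions.

    The rules below are invented for illustration; the thesis generates five
    rules per student from recorded Kinect data.
    """
    # Leaning back, moving slowly, hands down: treated here as boredom.
    if frame.lean == "backward" and frame.speed == "slow" and frame.hands == "down":
        return "boredom"
    # Hands near the face while steady and leaning forward: confusion.
    if frame.hands == "near_face" and frame.activity == "steady" and frame.lean == "forward":
        return "confusion"
    # Fast movement combined with scratching or frequent shifting: frustration.
    if frame.speed == "fast" and (frame.activity == "scratch" or frame.shifting):
        return "frustration"
    # Otherwise assume an engaged, on-task posture: flow.
    return "flow"


if __name__ == "__main__":
    sample = GestureFrame(hands="near_face", arms="down", activity="steady",
                          speed="slow", lean="forward", shifting=False)
    print(classify_emotion(sample))  # -> "confusion" under these assumed rules

A real system along these lines would evaluate such rules over a sliding window of Kinect skeleton frames and tune them separately for each learner, which is consistent with the per-student rules and per-student accuracies reported in the abstract.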