Real time academic emotion recognition using body gestures

Bibliographic Details
Main Author: Cheung, Oi Hing
Format: text
Language: English
Published: Animo Repository 2012
Subjects:
Online Access: https://animorepository.dlsu.edu.ph/etd_masteral/4288
Institution: De La Salle University
Description
Summary: Emotion plays a powerful role in students' learning process; however, many intelligent tutoring systems (ITSs) consider only the knowledge models of the students. Hence, building affect models for ITSs has become an emerging research area, and researchers have proposed several approaches to affect recognition based on students' facial expressions, speech, and body gestures. However, most of these works focused on gathering emotional information from the students' facial expressions and speech, and they relied on acted emotion databases and expensive hardware to predict student emotions. In this paper, a markerless approach is proposed to detect four basic academic emotions (boredom, flow, confusion, frustration) from the student based on their body gestures. The data for this study were gathered from five students using a Microsoft Kinect camera. The features extracted were hand position and duration, arm position, speed, head tilt, leaning, and shifting. From these features, the gestures detected by the system include hands up/hands near face/hands down, arms up/arms down, scratching/steady, fast/slow movement, leaning forward/backward, leaning right/left, and shifting/not shifting. To classify the emotion of the student, rules were generated based on the student's gestures; five rules were generated for each student. The system achieved accuracies of 72.57%, 91.06%, 76.42%, 77.91%, and 81.96% for the five students, respectively.
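
As a rough illustration of the rule-based classification described in the summary, the Python sketch below maps a frame of detected gesture labels to one of the four academic emotions. The GestureFrame fields mirror the gesture categories listed above, but the rule conditions themselves are hypothetical placeholders: the thesis derives five rules per student, and those rules are not reproduced in this record.

from dataclasses import dataclass

@dataclass
class GestureFrame:
    # Gesture categories listed in the summary above.
    hands: str        # "up", "near_face", or "down"
    arms: str         # "up" or "down"
    scratching: bool  # scratching vs. steady
    speed: str        # "fast" or "slow"
    lean: str         # "forward", "backward", "left", or "right"
    shifting: bool    # shifting in the seat or not

def classify_emotion(frame: GestureFrame) -> str:
    """Return one of the four academic emotions for a gesture frame.

    The conditions below are illustrative only; they are not the
    per-student rules evaluated in the thesis.
    """
    if frame.lean == "backward" and frame.speed == "slow" and not frame.shifting:
        return "boredom"
    if frame.hands == "near_face" and frame.scratching:
        return "confusion"
    if frame.speed == "fast" and frame.shifting:
        return "frustration"
    # Default: a steady, engaged posture is treated as flow.
    return "flow"

if __name__ == "__main__":
    sample = GestureFrame(hands="near_face", arms="down", scratching=True,
                          speed="slow", lean="forward", shifting=False)
    print(classify_emotion(sample))  # prints "confusion" under these placeholder rules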