Markerless gesture recognition in the context of affect modeling for intelligent tutoring systems
Main Authors:
Format: text
Language: English
Published: Animo Repository, 2011
Subjects:
Online Access: https://animorepository.dlsu.edu.ph/etd_bachelors/11169
Institution: De La Salle University
Summary: Most gesture recognition systems today use high-technology devices and markers or wires to capture the user's body movements. These systems achieve high tracking accuracy. However, high-technology devices can be expensive and difficult to deploy and replicate, and markers can be distracting and impractical. This research proposes using only the computer's web camera to track and recognize gestures; a web camera is low-cost, simple, and unobtrusive. The research presents a novel framework that aims to address some of the identified problems of gesture recognition systems.
Three participants were video-recorded while studying in front of a computer, yielding approximately seven hours of raw data. A markerless gesture recognition system was built in the context of affect modeling for intelligent tutoring systems. Each of the system's modules, namely the (1) hand-on-face detection module, (2) hand and arm detection module, and (3) posture detection module, was tested and achieved accuracies of 75%, 70%, and 95%, respectively. The system's emotion recognition module was also tested on the data of one participant. For three academic emotions, namely boredom, flow, and confusion, the emotion recognition module achieved an accuracy of 67%.