Human-virtual human interaction by upper body gesture understanding
Main Authors:
Format: Conference or Workshop Item
Language: English
Published: 2013
Online Access: https://hdl.handle.net/10356/100286
http://hdl.handle.net/10220/18125
Institution: Nanyang Technological University
Summary: In this paper, a novel human-virtual human interaction system is proposed. The system enables a real human to communicate with a virtual human using natural body language. Meanwhile, the virtual human is capable of understanding the meaning of human upper body gestures and reacting with its own personality through body action, facial expression and verbal language simultaneously. In total, 11 human upper body gestures, with and without human-object interaction, are currently supported by the system. They can be characterized by head, hand and arm posture. In our implementation, the wearable Immersion CyberGlove II captures the hand posture, while the vision-based Microsoft Kinect captures the head and arm posture. This is a new sensor solution for human-gesture capture and can be regarded as the most important contribution of this paper. Based on the posture data from the CyberGlove II and the Kinect, an effective real-time human gesture recognition algorithm is also proposed. To verify the effectiveness of the gesture recognition method, we build a human gesture sample dataset. The experiments demonstrate that our algorithm can recognize human gestures with high accuracy in real time. [This research is partially supported by an IMI Seed Grant. It was partially carried out at the BeingThere Centre with support from the Singapore National Research Foundation under its International Research Centre @ Singapore Funding Initiative, administered by the IDM Programme Office.]
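The summary describes fusing two sensor streams (hand posture from the CyberGlove II, head/arm posture from the Kinect) and feeding the combined posture data to a real-time gesture recognizer. The paper's actual algorithm is not detailed in this record, so the following is only a minimal illustrative sketch of one plausible approach: per-modality normalization, feature concatenation, and nearest-neighbor matching against gesture templates. All names, dimensions (e.g. a 22-sensor glove model, a 4-joint upper-body skeleton), and the classifier choice are assumptions, not the authors' method.

```python
import numpy as np

# Assumed dimensions (illustrative only): the CyberGlove II reports up to
# 22 joint-angle sensors; we take 3-D positions for 4 upper-body joints
# (head, shoulder, elbow, wrist) from the Kinect skeleton.
GLOVE_DIM = 22
KINECT_DIM = 12

def fuse_features(glove, kinect):
    """Normalize each modality to unit norm and concatenate, so neither
    sensor dominates the distance computation."""
    glove = np.asarray(glove, dtype=float)
    kinect = np.asarray(kinect, dtype=float)
    g = glove / (np.linalg.norm(glove) + 1e-8)
    k = kinect / (np.linalg.norm(kinect) + 1e-8)
    return np.concatenate([g, k])

def classify(sample, templates):
    """Return the label of the nearest gesture template (Euclidean)."""
    best_label, best_dist = None, float("inf")
    for label, tmpl in templates.items():
        d = np.linalg.norm(sample - tmpl)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Tiny usage example with two synthetic gesture templates.
rng = np.random.default_rng(0)
wave = fuse_features(rng.normal(size=GLOVE_DIM), rng.normal(size=KINECT_DIM))
point = fuse_features(rng.normal(size=GLOVE_DIM), rng.normal(size=KINECT_DIM))
templates = {"wave": wave, "point": point}

# A slightly noisy reading of the "wave" posture is still matched correctly.
query = wave + 0.01 * rng.normal(size=wave.size)
print(classify(query, templates))
```

A template matcher like this is only one design choice; a system recognizing 11 gestures in real time would more likely use a trained classifier over many samples per gesture, which is what the authors' gesture sample dataset would support.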