Human-virtual human interaction by upper body gesture understanding

In this paper, a novel human-virtual human interaction system is proposed. The system enables a real human to communicate with a virtual human using natural body language. Meanwhile, the virtual human is capable of understanding the meaning of human upper body gestures and reacting with its own personality by means of body action, facial expression, and verbal language simultaneously. In total, 11 human upper body gestures, with and without human-object interaction, are currently involved in the system. They can be characterized by human head, hand, and arm posture. In our system implementation, the wearable Immersion CyberGlove II captures the hand posture, while the vision-based Microsoft Kinect captures the head and arm posture. This is a new sensor solution for human-gesture capture, and can be regarded as the most important contribution of this paper. Based on the posture data from the CyberGlove II and the Kinect, an effective real-time human gesture recognition algorithm is also proposed. To verify the effectiveness of the gesture recognition method, we build a human gesture sample dataset. The experiments demonstrate that our algorithm can recognize human gestures with high accuracy in real time. [This research is partially supported by an IMI Seed Grant. It is partially carried out at the BeingThere Centre with support from the Singapore National Research Foundation under its International Research Centre @ Singapore Funding Initiative, administered by the IDM Programme Office.]
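The paper itself includes no code, but the sensor-fusion idea in the abstract (glove for hand posture, Kinect for head and arm posture) can be illustrated with a minimal, hypothetical sketch: concatenate the two posture readings into one feature vector and classify it with a nearest-centroid rule. All names, dimensions, and the classifier below are illustrative assumptions, not the authors' actual algorithm.

```python
import math

# Illustrative feature sizes (assumptions, not from the paper):
# the CyberGlove II reports 22 hand joint angles; we assume the Kinect
# contributes 3-D positions for a few upper-body joints.
GLOVE_DIM = 22
KINECT_JOINTS = 4  # e.g. head, shoulder, elbow, hand -> 12 values

def fuse(glove_angles, kinect_joints):
    """Concatenate glove joint angles and flattened Kinect joint
    positions into a single posture feature vector."""
    assert len(glove_angles) == GLOVE_DIM
    feat = list(glove_angles)
    for (x, y, z) in kinect_joints:
        feat.extend((x, y, z))
    return feat

def nearest_centroid(feat, centroids):
    """Return the gesture label whose centroid is closest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    return min(centroids, key=lambda label: dist(feat, centroids[label]))

# Toy usage with made-up centroids for two hypothetical gestures.
open_hand = fuse([0.0] * GLOVE_DIM, [(0, 0, 0)] * KINECT_JOINTS)
fist      = fuse([1.0] * GLOVE_DIM, [(0, 0, 0)] * KINECT_JOINTS)
centroids = {"wave": open_hand, "point": fist}

sample = fuse([0.9] * GLOVE_DIM, [(0, 0, 0)] * KINECT_JOINTS)
print(nearest_centroid(sample, centroids))  # closer to "point"
```

A real implementation would need per-sensor calibration, normalization across users, and a trained classifier; this sketch only shows how the two posture streams could be fused into one feature vector.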

Bibliographic Details
Main Authors: Xiao, Yang, Yuan, Junsong, Thalmann, Daniel
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language: English
Published: 2013
Subjects: Electrical and Electronic Engineering
Online Access:https://hdl.handle.net/10356/100286
http://hdl.handle.net/10220/18125
Institution: Nanyang Technological University
Conference: 19th ACM Symposium on Virtual Reality Software and Technology (VRST 2013), Singapore
Citation: Xiao, Y., Yuan, J., & Thalmann, D. (2013). Human-virtual human interaction by upper body gesture understanding. Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology, 133-142.
DOI: 10.1145/2503713.2503727
Version: Accepted version
Extent: 10 p.
Rights: © 2013 Association for Computing Machinery. This is the author-created version of a work that has been peer reviewed and accepted for publication. It incorporates the referees' comments, but changes resulting from the publishing process, such as copyediting and structural formatting, may not be reflected in this document. The published version is available at http://dx.doi.org/10.1145/2503713.2503727.
Collection: DR-NTU (NTU Library, Singapore)