Human robot interaction by understanding upper body gestures
In this paper, a human–robot interaction system based on a novel combination of sensors is proposed. It allows one person to interact with a humanoid social robot using natural body language. The robot understands the meaning of human upper body gestures and expresses itself by using a combination of body movements, facial expressions, and verbal language. A set of 12 upper body gestures, including gestures that involve human–object interactions, is used for communication. The gestures are characterized by head, arm, and hand posture information. The wearable Immersion CyberGlove II is employed to capture the hand posture, and this information is combined with the head and arm posture captured by a Microsoft Kinect, forming a new sensor solution for human gesture capture. Based on the posture data from the CyberGlove II and Kinect, an effective and real-time human gesture recognition method is proposed. This gesture understanding approach, built on an innovative combination of sensors, is the main contribution of the paper. To verify the effectiveness of the proposed gesture recognition method, a human body gesture data set is built. The experimental results demonstrate that the approach can recognize the upper body gestures with high accuracy in real time. In addition, for robot motion generation and control, a novel online motion planning method is proposed. To generate appropriate dynamic motion, a quadratic programming (QP)-based dual-arm kinematic motion generation scheme is proposed, and a simplified recurrent neural network is employed to solve the QP problem. The integration of a handshake within the HRI system illustrates the effectiveness of the proposed online motion generation method.
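The paper itself is not reproduced in this record, but the fusion the abstract describes (finger posture from the CyberGlove II concatenated with head and arm posture from the Kinect, fed to a real-time classifier) can be sketched. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the torso-relative descriptor, the 22-sensor glove layout, and the k-NN classifier are all choices made here for the sketch.

```python
# Minimal sketch of the sensor fusion described above: upper-body joint
# positions from a Kinect are concatenated with hand joint angles from a
# CyberGlove II into one per-frame feature vector, which a standard
# classifier maps to one of the 12 gestures. Illustrative reconstruction,
# not the paper's exact descriptor or classifier.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def build_feature(skeleton: np.ndarray, glove_angles: np.ndarray) -> np.ndarray:
    """skeleton: (n_joints, 3) joint positions, row 0 = torso, row 1 = head
    (an assumed layout). glove_angles: (22,) glove sensor readings in degrees."""
    # Express joints relative to the torso and scale by the torso-to-head
    # distance, so the body part is invariant to where the user stands.
    rel = skeleton - skeleton[0]
    scale = np.linalg.norm(rel[1]) or 1.0
    body_part = (rel / scale).ravel()
    hand_part = glove_angles / 180.0  # map sensor angles to roughly [0, 1]
    return np.concatenate([body_part, hand_part])

if __name__ == "__main__":
    # Smoke test on synthetic frames standing in for recorded sensor data.
    rng = np.random.default_rng(0)
    skeletons = rng.normal(size=(240, 8, 3))          # 8 upper-body joints
    gloves = rng.uniform(0.0, 180.0, size=(240, 22))  # 22 glove sensors
    X = np.stack([build_feature(s, g) for s, g in zip(skeletons, gloves)])
    y = rng.integers(0, 12, size=240)                 # 12 gesture classes
    clf = KNeighborsClassifier(n_neighbors=5).fit(X[:200], y[:200])
    print(clf.predict(X[200:]))
```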
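On the motion side, the abstract only names the technique: a QP-based dual-arm kinematic scheme solved by a simplified recurrent neural network. As a rough, generic sketch of that family, and not necessarily the paper's exact formulation, such schemes are commonly posed as a minimum-velocity-norm QP over the stacked joint velocities of both arms:

$$
\begin{aligned}
\min_{\dot{\theta}} \quad & \tfrac{1}{2}\,\dot{\theta}^{\top}\dot{\theta} \\
\text{s.t.} \quad & J(\theta)\,\dot{\theta} = \dot{r}_d, \\
& \theta^{-} \le \theta \le \theta^{+}, \qquad \dot{\theta}^{-} \le \dot{\theta} \le \dot{\theta}^{+},
\end{aligned}
$$

where $\theta$ stacks the joint angles of both arms, $J(\theta)$ is the combined end-effector Jacobian, $\dot{r}_d$ is the desired end-effector velocity (for example, along a handshake trajectory), and the bound constraints keep joints within their physical limits. Recurrent neural networks of the kind the abstract mentions solve such QPs online by evolving their state toward the problem's Karush-Kuhn-Tucker point, which is what makes real-time control feasible.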
Main Authors: Xiao, Yang; Zhang, Zhijun; Beck, Aryel; Yuan, Junsong; Thalmann, Daniel
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2014
Subjects: DRNTU::Engineering::Mechanical engineering::Robots
Online Access: https://hdl.handle.net/10356/100584 http://hdl.handle.net/10220/24134
Institution: Nanyang Technological University
Journal: Presence: Teleoperators and Virtual Environments
Citation: Xiao, Y., Zhang, Z., Beck, A., Yuan, J., & Thalmann, D. (2014). Human robot interaction by understanding upper body gestures. Presence: Teleoperators and Virtual Environments, 23(2), 133-154.
DOI: 10.1162/PRES_a_00176
Version: Published version
Collection: DR-NTU (NTU Library, Nanyang Technological University, Singapore)

© 2014 Massachusetts Institute of Technology Press. This paper was published in Presence: Teleoperators and Virtual Environments and is made available as an electronic reprint (preprint) with permission of Massachusetts Institute of Technology Press. The paper can be found at the following official DOI: http://dx.doi.org/10.1162/PRES_a_00176. One print or electronic copy may be made for personal use only. Systematic or multiple reproduction, distribution to multiple locations via electronic or other means, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper is prohibited and is subject to penalties under law.