Egocentric hand pose estimation and distance recovery in a single RGB image
Articulated hand pose recovery in egocentric vision is useful for in-air interaction with wearable devices such as Google Glass. Despite the progress obtained with depth cameras, this task remains challenging with ordinary RGB cameras. In this paper we demonstrate the possibility...
Main Authors: | Liang, Hui; Yuan, Junsong; Thalman, Daniel |
Other Authors: | School of Electrical and Electronic Engineering |
Format: | Conference or Workshop Item |
Language: | English |
Published: | 2016 |
Subjects: | egocentric vision; hand pose estimation; conditional regression forest |
Online Access: | https://hdl.handle.net/10356/80278 http://hdl.handle.net/10220/40401 |
Institution: | Nanyang Technological University |
Record ID: | sg-ntu-dr.10356-80278 |
Conference: | 2015 IEEE International Conference on Multimedia and Expo (ICME) |
Citation: | Liang, H., Yuan, J., & Thalman, D. (2015). Egocentric hand pose estimation and distance recovery in a single RGB image. 2015 IEEE International Conference on Multimedia and Expo (ICME), 1-6. |
DOI: | 10.1109/ICME.2015.7177448 |
Version: | Accepted version (6 p., application/pdf) |
Rights: | © 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: http://dx.doi.org/10.1109/ICME.2015.7177448. |
Building: | NTU Library |
Country: | Singapore |
Collection: | DR-NTU |
Description: |
Articulated hand pose recovery in egocentric vision is useful for in-air interaction with wearable devices such as Google Glass. Despite the progress obtained with depth cameras, this task remains challenging with ordinary RGB cameras. In this paper we demonstrate the possibility of recovering both the articulated hand pose and its distance from the camera with a single RGB camera in egocentric view. We address this problem by modeling the distance as a hidden variable and using a Conditional Regression Forest to infer the pose and distance jointly. In particular, we find that pose estimation accuracy can be further enhanced by incorporating hand part semantics. Experimental results show that the proposed method achieves good performance on both a synthesized dataset and several real-world color image sequences captured in different environments. In addition, our system runs in real time at more than 10 fps. |
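The abstract only outlines the inference scheme, so the following is a minimal, hypothetical Python sketch of the core idea it describes: quantize the hidden distance variable into bins, train one pose regressor per bin, and marginalize the pose prediction over the distance posterior. The class name, bin count, equal-frequency binning, and the use of scikit-learn forests are assumptions made for illustration, not the paper's implementation.

```python
# Hypothetical sketch -- NOT the authors' code. Treats camera-to-hand
# distance as a hidden variable in a conditional regression forest:
# a classifier estimates the distance-bin posterior, a per-bin regressor
# estimates pose, and the final pose marginalizes over that posterior.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

class ConditionalRegressionForest:
    def __init__(self, n_bins=5, n_trees=50):
        self.n_bins = n_bins
        # Estimates P(distance bin | image features).
        self.dist_clf = RandomForestClassifier(n_estimators=n_trees)
        # One pose regressor per bin estimates E[pose | features, bin].
        self.pose_reg = [RandomForestRegressor(n_estimators=n_trees)
                         for _ in range(n_bins)]

    def fit(self, X, poses, distances):
        # Equal-frequency binning, assumed here so every bin has samples.
        self.edges = np.quantile(distances, np.linspace(0, 1, self.n_bins + 1))
        labels = np.clip(np.digitize(distances, self.edges[1:-1]),
                         0, self.n_bins - 1)
        self.dist_clf.fit(X, labels)
        for b in range(self.n_bins):
            self.pose_reg[b].fit(X[labels == b], poses[labels == b])
        return self

    def predict(self, X):
        # Marginalize the hidden distance:
        #   pose(x) = sum_b P(bin=b | x) * E[pose | x, bin=b]
        probs = self.dist_clf.predict_proba(X)               # (N, n_bins)
        per_bin = np.stack([r.predict(X) for r in self.pose_reg],
                           axis=1)                           # (N, n_bins, D)
        pose = np.einsum('nb,nbd->nd', probs, per_bin)
        # Distance estimate: posterior-weighted bin centers.
        centers = 0.5 * (self.edges[:-1] + self.edges[1:])
        return pose, probs @ centers
```

Conditioning the pose regressors on a quantized distance is what lets a monocular RGB system separate apparent hand size from true hand distance; the sketch omits the hand part semantics that the abstract reports as a further accuracy boost.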