A methodology to model and simulate customized realistic anthropomorphic robotic hands
When building robotic hands, researchers are always faced with two main issues: how to make robotic hands look human-like and how to make them function like real hands. Most existing solutions address these issues by manually modelling the robotic hand [10-18]. However, the design processes...
Saved in:
Main Authors: | Tian, Li; Magnenat-Thalmann, Nadia; Thalmann, Daniel; Zheng, Jianmin |
---|---|
Other Authors: | School of Computer Science and Engineering |
Format: | Conference or Workshop Item |
Language: | English |
Published: | 2020 |
Subjects: | Engineering::Computer science and engineering; Robotics; Embedded Systems |
Online Access: | https://hdl.handle.net/10356/138936 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-138936 |
record_format |
dspace |
spelling |
sg-ntu-dr.10356-138936 (record updated 2020-09-26T21:53:04Z). A methodology to model and simulate customized realistic anthropomorphic robotic hands. Tian, Li; Magnenat-Thalmann, Nadia; Thalmann, Daniel; Zheng, Jianmin. School of Computer Science and Engineering; Institute for Media Innovation (IMI). CGI 2018: Proceedings of Computer Graphics International 2018. Subjects: Engineering::Computer science and engineering; Robotics; Embedded Systems. Funding: NRF (Natl Research Foundation, S’pore). Accepted version, deposited 2020-05-14. Citation: Tian, L., Magnenat-Thalmann, N., Thalmann, D., & Zheng, J. (2018). A methodology to model and simulate customized realistic anthropomorphic robotic hands. Proceedings of Computer Graphics International 2018, 153-162. doi:10.1145/3208159.3208182. https://hdl.handle.net/10356/138936. © 2018 Association for Computing Machinery. All rights reserved. This paper was published in CGI 2018: Proceedings of Computer Graphics International 2018 and is made available with permission of Association for Computing Machinery. Format: application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
country |
Singapore |
collection |
DR-NTU |
language |
English |
topic |
Engineering::Computer science and engineering; Robotics; Embedded Systems |
description |
When building robotic hands, researchers are always faced with two main issues: how to make robotic hands look human-like and how to make them function like real hands. Most existing solutions address these issues by manually modelling the robotic hand [10-18]. However, the design processes are long, and it is difficult to duplicate the geometric shape of a human hand. To solve these two issues, this paper presents a simple and effective method that combines 3D printing and digitization techniques to create a 3D-printable, cable-driven robotic hand from a scan of a physical hand. The method involves segmenting the 3D-scanned hand model, adding joints, and converting it into a 3D-printable model. Compared to other robotic solutions, our solution retains more than 90% of the geometric information of a human hand, which is obtained from 3D scanning. Our modelling process takes around 15 minutes: 10 minutes of 3D scanning and five minutes to convert the scanned model into an articulated model by running our algorithm. Compared to other articulated modelling solutions [19, 20], our solution is compatible with an actuation system, which gives our robotic hand the ability to mimic different gestures. We have also developed a way of representing hand skeletons based on hand anthropometry. As a proof of concept, we demonstrate our robotic hand's performance in grasping experiments. |
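The description above outlines a scan-segment-articulate pipeline. As an illustration only, and not the authors' published algorithm, the following minimal Python sketch shows how the segmentation step could look if implemented with the open-source `trimesh` library: a scanned hand mesh is cut at one assumed joint plane and the resulting rigid segments are exported as printable STL files. The file names, joint-plane coordinates, and the `split_at_joint` helper are placeholders introduced for this sketch.

```python
# Illustrative sketch (assumed, not the authors' code): cut a 3D-scanned
# hand mesh at one assumed joint plane and export the two rigid segments
# as printable STL files. Requires the open-source `trimesh` package;
# file names and joint coordinates are placeholders.
import numpy as np
import trimesh


def split_at_joint(mesh, plane_origin, plane_normal):
    """Cut `mesh` with a plane and return the (proximal, distal) pieces.

    cap=True closes the cut faces so each piece stays watertight for
    printing (needs a reasonably recent trimesh version)."""
    normal = np.asarray(plane_normal, dtype=float)
    distal = mesh.slice_plane(plane_origin, normal, cap=True)
    proximal = mesh.slice_plane(plane_origin, -normal, cap=True)
    return proximal, distal


if __name__ == "__main__":
    # Placeholder path to a watertight 3D scan of a hand.
    hand = trimesh.load("hand_scan.stl")

    # Placeholder joint plane (scan coordinates, millimetres) for one finger joint.
    joint_origin = [0.0, 85.0, 10.0]
    joint_normal = [0.0, 1.0, 0.0]

    proximal, distal = split_at_joint(hand, joint_origin, joint_normal)
    proximal.export("segment_proximal.stl")
    distal.export("segment_distal.stl")
```

In the full method described by the abstract, each finger would be cut at every joint and joint/cable geometry would be added before printing; this sketch only illustrates the single planar cut that underlies that segmentation step.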
author2 |
School of Computer Science and Engineering |
format |
Conference or Workshop Item |
author |
Tian, Li; Magnenat-Thalmann, Nadia; Thalmann, Daniel; Zheng, Jianmin |
title |
A methodology to model and simulate customized realistic anthropomorphic robotic hands |
publishDate |
2020 |
url |
https://hdl.handle.net/10356/138936 |