A methodology to model and simulate customized realistic anthropomorphic robotic hands

Bibliographic Details
Main Authors: Tian, Li, Magnenat-Thalmann, Nadia, Thalmann, Daniel, Zheng, Jianmin
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2020
Subjects:
Online Access:https://hdl.handle.net/10356/138936
Institution: Nanyang Technological University
Description
Summary: When building robotic hands, researchers face two main issues: how to make robotic hands look human-like and how to make them function like real hands. Most existing solutions address these issues by manually modelling the robotic hand [10-18]. However, the design processes are long, and it is difficult to duplicate the geometric shape of a human hand. To solve these two issues, this paper presents a simple and effective method that combines 3D printing and digitization techniques to create a 3D printable cable-driven robotic hand from a scan of a physical hand. The method involves segmenting the 3D scanned hand model, adding joints, and converting it into a 3D printable model. Compared to other robotic solutions, our solution retains more than 90% of the geometric information of a human hand, obtained from 3D scanning. Our modelling process takes around 15 minutes, including 10 minutes of 3D scanning and five minutes to convert the scanned model into an articulated model by running our algorithm. Compared to other articulated modelling solutions [19, 20], our solution is compatible with an actuation system, which gives our robotic hand the ability to mimic different gestures. We have also developed a way of representing hand skeletons based on hand anthropometrics. As a proof of concept, we demonstrate our robotic hand's performance in grasping experiments.
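The anthropometric hand-skeleton idea mentioned in the summary can be illustrated with a minimal sketch. The Python snippet below assumes each finger is modelled as a planar chain whose phalanx lengths are fixed fractions of the overall hand length; the class names, ratio values, and angles are illustrative assumptions, not the representation described in the paper.

```python
# Minimal sketch of an anthropometric hand-skeleton representation.
# The segment-length ratios and names below are illustrative placeholders,
# not values from the paper.
import math
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class FingerChain:
    """One finger as a planar chain of phalanges (proximal to distal)."""
    name: str
    phalanx_ratios: Tuple[float, ...]  # phalanx lengths as fractions of hand length

    def joint_positions(self, hand_length: float,
                        flexion_angles: Tuple[float, ...]) -> List[Tuple[float, float]]:
        """Return 2D joint positions for the given flexion angles (radians),
        accumulating the angle along the chain (simple planar forward kinematics)."""
        x, y, theta = 0.0, 0.0, 0.0
        positions = [(x, y)]
        for ratio, angle in zip(self.phalanx_ratios, flexion_angles):
            theta += angle
            x += ratio * hand_length * math.cos(theta)
            y += ratio * hand_length * math.sin(theta)
            positions.append((x, y))
        return positions


# Hypothetical ratios for an index finger (proximal, middle, distal phalanges).
index = FingerChain("index", (0.245, 0.143, 0.097))

# Joint positions for a lightly flexed index finger of a 190 mm hand.
print(index.joint_positions(190.0,
                            (math.radians(20), math.radians(25), math.radians(15))))
```

Scaling all segment lengths by a single measured hand length is one simple way such an anthropometric skeleton could be customized to an individual; the paper's actual scheme should be consulted for the real parameterization.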