Markerless motion capture of hand and object tracking for rehabilitation


Bibliographic Details
Main Author: Lim, Guan Ming
Other Authors: Ang Wei Tech
Format: Thesis-Doctor of Philosophy
Language: English
Published: Nanyang Technological University 2024
Subjects:
Online Access:https://hdl.handle.net/10356/178252
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-178252
record_format dspace
spelling sg-ntu-dr.10356-178252 2024-07-05T03:11:43Z Markerless motion capture of hand and object tracking for rehabilitation Lim, Guan Ming Ang Wei Tech School of Mechanical and Aerospace Engineering Robotics Research Centre Rehabilitation Research Institute of Singapore (RRIS) WTAng@ntu.edu.sg Engineering Markerless Motion capture Hand rehabilitation Capturing hand motion has a myriad of applications ranging from entertainment to healthcare. Current approaches in hand rehabilitation often involve the placement of a goniometer, wearable sensors, or markers, which can be inefficient, hinder natural hand movements, or require time-consuming setup and data post-processing. A promising alternative is markerless motion capture, which estimates hand poses directly from images. However, existing methods face challenges in real-time performance, accuracy, and robustness to hand-object interaction. This thesis therefore aims to enhance the efficiency and accuracy of markerless motion capture of hand and object tracking for rehabilitation. First, we present a minimal setup that employs an efficient neural network for real-time estimation of 3D hand pose and shape from a single color image. To address accuracy limitations stemming from depth ambiguity in a single-camera setup, we propose a simple method using a mirror-based multi-view setup to measure hand motion. This eliminates the complexity of synchronizing multiple cameras and halves joint angle errors compared to a single-view setup. Additionally, to account for hand-object interaction, we create synthetic depth images of subjects with diverse body shapes to train a neural network to segment forearms, hands, and objects. In practice, the initial pose estimate or object segmentation from the neural network (learning-based) is never perfect, but it can be refined with model fitting (optimization-based).
Therefore, we perform rigid object tracking using precomputed sparse viewpoint information, allowing real-time tracking while achieving submillimeter accuracy on synthetic datasets. The method is extended to track an articulated hand model interacting with an object on a multi-camera setup, achieving an average joint angle error of around 10 degrees when validated against a marker-based motion capture system. Finally, to analyze grasping parameters when the hand is in contact with the object, we develop a sensorized object covered with a pressure sensor array that generates a 2D pressure distribution map. This provides additional information on the grasping pattern that is not available from color or depth images. Overall, we demonstrate the potential of a markerless motion capture system to complement hand rehabilitation by providing real-time feedback on dynamic hand motion that is robust to hand-object interaction. Furthermore, the markerless setup is far more portable than marker-based systems, making hand motion capture possible in clinics or at home. Doctor of Philosophy 2024-06-13T23:27:21Z 2024-06-13T23:27:21Z 2024 Thesis-Doctor of Philosophy Lim, G. M. (2024). Markerless motion capture of hand and object tracking for rehabilitation. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/178252 https://hdl.handle.net/10356/178252 10.32657/10356/178252 en RRG2/16001 RFP/19003 RRG4/2201 This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). application/pdf Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering
Markerless
Motion capture
Hand rehabilitation
spellingShingle Engineering
Markerless
Motion capture
Hand rehabilitation
Lim, Guan Ming
Markerless motion capture of hand and object tracking for rehabilitation
description Capturing hand motion has a myriad of applications ranging from entertainment to healthcare. Current approaches in hand rehabilitation often involve the placement of a goniometer, wearable sensors, or markers, which can be inefficient, hinder natural hand movements, or require time-consuming setup and data post-processing. A promising alternative is markerless motion capture, which estimates hand poses directly from images. However, existing methods face challenges in real-time performance, accuracy, and robustness to hand-object interaction. This thesis therefore aims to enhance the efficiency and accuracy of markerless motion capture of hand and object tracking for rehabilitation. First, we present a minimal setup that employs an efficient neural network for real-time estimation of 3D hand pose and shape from a single color image. To address accuracy limitations stemming from depth ambiguity in a single-camera setup, we propose a simple method using a mirror-based multi-view setup to measure hand motion. This eliminates the complexity of synchronizing multiple cameras and halves joint angle errors compared to a single-view setup. Additionally, to account for hand-object interaction, we create synthetic depth images of subjects with diverse body shapes to train a neural network to segment forearms, hands, and objects. In practice, the initial pose estimate or object segmentation from the neural network (learning-based) is never perfect, but it can be refined with model fitting (optimization-based). Therefore, we perform rigid object tracking using precomputed sparse viewpoint information, allowing real-time tracking while achieving submillimeter accuracy on synthetic datasets. The method is extended to track an articulated hand model interacting with an object on a multi-camera setup, achieving an average joint angle error of around 10 degrees when validated against a marker-based motion capture system.
Finally, to analyze grasping parameters when the hand is in contact with the object, we develop a sensorized object covered with a pressure sensor array that generates a 2D pressure distribution map. This provides additional information on the grasping pattern that is not available from color or depth images. Overall, we demonstrate the potential of a markerless motion capture system to complement hand rehabilitation by providing real-time feedback on dynamic hand motion that is robust to hand-object interaction. Furthermore, the markerless setup is far more portable than marker-based systems, making hand motion capture possible in clinics or at home.
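As a concrete illustration of the mirror-based multi-view idea in the abstract: a planar mirror adds a "virtual" camera whose pose is the real camera's pose reflected about the mirror plane, which is why no hardware synchronization is needed. The sketch below (plain NumPy; the function name and plane parameterization are illustrative assumptions, not the thesis's actual code) builds that reflection transform.

```python
import numpy as np

def mirror_reflection_matrix(n, d):
    """4x4 homogeneous reflection about the plane n . x = d (n a unit normal).

    Composing this with the real camera's pose gives the pose of the virtual
    camera seen in the mirror, so one physical camera plus a mirror behaves
    like a synchronized two-view rig.
    """
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    M = np.eye(4)
    M[:3, :3] = np.eye(3) - 2.0 * np.outer(n, n)  # reflect directions
    M[:3, 3] = 2.0 * d * n                        # offset for plane distance
    return M

# Example: reflect a point about the plane z = 1 (normal [0, 0, 1], d = 1).
M = mirror_reflection_matrix([0.0, 0.0, 1.0], 1.0)
p = M @ np.array([0.0, 0.0, 3.0, 1.0])
# p[:3] is now [0, 0, -1]: the mirror image of z = 3 about the plane z = 1.
```

The same matrix maps real 3D points to their reflected counterparts, which is what lets reflections in the image be treated as observations from a second viewpoint.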
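The learning-then-optimization refinement described in the abstract can be sketched as a generic fitting loop: start from the network's pose estimate and descend on the keypoint error of a black-box hand/object model. All names, the toy model, and the finite-difference gradient here are illustrative assumptions, not the thesis's actual solver.

```python
import numpy as np

def refine_pose(theta_init, observed, forward, lr=0.1, iters=100, eps=1e-5):
    """Refine an initial pose estimate by minimizing squared keypoint error.

    forward(theta) maps pose parameters to predicted keypoint positions; the
    gradient is taken numerically, so any black-box model can be plugged in.
    """
    theta = np.asarray(theta_init, dtype=float).copy()
    for _ in range(iters):
        residual = forward(theta) - observed
        base = np.sum(residual ** 2)
        grad = np.zeros_like(theta)
        for i in range(theta.size):  # forward-difference gradient
            step = np.zeros_like(theta)
            step[i] = eps
            grad[i] = (np.sum((forward(theta + step) - observed) ** 2) - base) / eps
        theta -= lr * grad           # gradient-descent update
    return theta

# Toy model: the "pose" is a 2D offset applied to a fixed keypoint template.
template = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
forward = lambda t: template + t                    # predicted keypoints
observed = template + np.array([0.3, -0.2])         # ground-truth offset
theta = refine_pose(np.zeros(2), observed, forward) # converges near [0.3, -0.2]
```

In practice a real system would use analytic Jacobians and a Gauss-Newton or Levenberg-Marquardt step for speed, but the structure — network output as initialization, model fitting as refinement — is the same.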
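One rough sketch of precomputing sparse viewpoint information for rigid object tracking: sample viewing directions on a sphere offline, store a per-view template for each, and at runtime select the template nearest the current camera direction. The Fibonacci-sphere sampler below is a common choice for near-uniform directions; it is an assumption for illustration, not the thesis's exact scheme.

```python
import numpy as np

def sphere_viewpoints(n):
    """n roughly uniform unit viewing directions via a Fibonacci sphere."""
    i = np.arange(n)
    z = 1.0 - 2.0 * (i + 0.5) / n                  # even spacing in z
    phi = i * np.pi * (3.0 - np.sqrt(5.0))         # golden-angle increments
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

# Offline: precompute directions (one template per direction would be stored).
views = sphere_viewpoints(2562)

def nearest_view(cam_dir):
    """Runtime lookup: index of the precomputed viewpoint closest to cam_dir."""
    cam_dir = cam_dir / np.linalg.norm(cam_dir)
    return int(np.argmax(views @ cam_dir))         # max cosine similarity
```

The lookup is a single matrix-vector product, which is what keeps per-frame cost low enough for real-time tracking.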
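A minimal example of turning the sensorized object's pressure sensor array into a grasp parameter: the pressure-weighted centroid (center of pressure) of the 2D distribution map. The helper below is hypothetical, assuming readings arrive as a 2D array of per-taxel values.

```python
import numpy as np

def center_of_pressure(pressure_map):
    """Pressure-weighted centroid (row, col) of a 2D pressure distribution map."""
    p = np.asarray(pressure_map, dtype=float)
    total = p.sum()
    if total == 0:
        raise ValueError("no contact detected")
    rows, cols = np.indices(p.shape)               # per-taxel grid coordinates
    return (rows * p).sum() / total, (cols * p).sum() / total

# Uniform pressure on a 4x4 contact patch centered in an 8x8 sensor array.
pmap = np.zeros((8, 8))
pmap[2:6, 2:6] = 1.0
cop = center_of_pressure(pmap)                     # centroid of the patch
```

Tracking how this centroid and the total pressure evolve over a trial is one way such a map yields grasping information that color or depth images cannot provide.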
author2 Ang Wei Tech
author_facet Ang Wei Tech
Lim, Guan Ming
format Thesis-Doctor of Philosophy
author Lim, Guan Ming
author_sort Lim, Guan Ming
title Markerless motion capture of hand and object tracking for rehabilitation
title_short Markerless motion capture of hand and object tracking for rehabilitation
title_full Markerless motion capture of hand and object tracking for rehabilitation
title_fullStr Markerless motion capture of hand and object tracking for rehabilitation
title_full_unstemmed Markerless motion capture of hand and object tracking for rehabilitation
title_sort markerless motion capture of hand and object tracking for rehabilitation
publisher Nanyang Technological University
publishDate 2024
url https://hdl.handle.net/10356/178252
_version_ 1806059907125870592