Hand pose/gesture analysis using artificial intelligence
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects:
Online Access: https://hdl.handle.net/10356/177144
Institution: Nanyang Technological University
Summary: Hand gesture recognition is a crucial research topic in human-computer interaction, enabling efficient, intuitive, and natural communication between humans and computers. In short, hand gesture recognition is the task of classifying hand gestures. Despite tremendous progress, it poses significant challenges due to its distinctive characteristics and difficulties, including self-occlusion or object occlusion, the computational complexity of processing high-dimensional hand gesture data, and ambiguity in gesture interpretation when different gestures share similar hand configurations. Over the last decade, deep learning methods have achieved state-of-the-art (SoTA) results in computer vision. In this project, a skeleton-based dynamic hand gesture recognition model, the Temporal Decoupling Graph Convolutional Network (TD-GCN), is selected as the recognition model. TD-GCN is trained on two well-known hand gesture datasets: the SHREC'17 Track dataset and the DHG-14/28 dataset. Through retraining, modifications, and experiments, the trained model achieved 96.79% accuracy on the SHREC'17 Track 14-gesture setting and 92.74% on the 28-gesture setting. On the DHG-14/28 dataset, it achieved 92.14% and 89.29% accuracy on the 14-gesture and 28-gesture settings, respectively.
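
As a rough illustration of the skeleton-based approach described in the summary (not the project's actual TD-GCN implementation), the sketch below shows a single spatial graph-convolution layer applied to a sequence of hand-joint coordinates. The 22-joint count, the empty adjacency matrix, the tensor shapes, and the class name `SkeletonGraphConv` are assumptions made here for illustration only.

```python
# Minimal sketch (assumption, not the project's TD-GCN code): one spatial
# graph-convolution layer over a hand-skeleton sequence.
import torch
import torch.nn as nn


class SkeletonGraphConv(nn.Module):
    """Computes A_hat @ X @ W per frame on a (batch, frames, joints, channels) tensor."""

    def __init__(self, in_channels, out_channels, adjacency):
        super().__init__()
        # Normalized adjacency with self-loops: A_hat = D^-1 (A + I)
        a = adjacency + torch.eye(adjacency.size(0))
        d_inv = torch.diag(1.0 / a.sum(dim=1))
        self.register_buffer("a_hat", d_inv @ a)
        self.proj = nn.Linear(in_channels, out_channels)

    def forward(self, x):
        # x: (batch, frames, joints, channels) 3D joint coordinates
        x = torch.einsum("ij,btjc->btic", self.a_hat, x)  # aggregate neighboring joints
        return torch.relu(self.proj(x))                   # per-joint feature projection


# Usage with an assumed 22-joint hand skeleton and 3D coordinates per joint.
num_joints = 22
adjacency = torch.zeros(num_joints, num_joints)  # fill with hand-bone edges in practice
layer = SkeletonGraphConv(in_channels=3, out_channels=64, adjacency=adjacency)
features = layer(torch.randn(8, 100, num_joints, 3))  # batch of 8 clips, 100 frames each
print(features.shape)  # torch.Size([8, 100, 22, 64])
```

This conveys only the basic spatial aggregation over the hand skeleton; it omits the temporal modelling that the TD-GCN name implies and is not a substitute for the model evaluated in the project.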