Understanding human interaction in RGB-D videos

In this report, a human hand gesture recognition system is proposed. The system can understand both static and dynamic human hand gestures. So far, the system is able to recognize 9 static hand gestures: numbers from one to nine; and 1 dynamic hand gesture: number ten. In the system impleme...
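The abstract describes a two-stage pipeline: static poses classified from glove joint-angle data, then promoted to a dynamic gesture using hand motion from the Kinect. The thesis code is not available here; the following is a minimal illustrative sketch of that idea, with all names (`classify_static`, `classify_gesture`, the template poses, and the motion threshold) being assumptions, not the author's implementation:

```python
import math

def classify_static(joint_angles, templates):
    """Nearest-neighbour match of a glove joint-angle vector against
    template poses (one template per static gesture label)."""
    return min(templates, key=lambda label: math.dist(joint_angles, templates[label]))

def classify_gesture(joint_angles, wrist_track, templates, motion_thresh=0.05):
    """Combine the static pose with hand-joint motion: if the wrist has
    moved beyond a threshold over the tracked frames, report the dynamic
    gesture 'ten' (illustrative rule); otherwise report the static pose."""
    pose = classify_static(joint_angles, templates)
    displacement = math.dist(wrist_track[0], wrist_track[-1])
    if displacement > motion_thresh:
        return "ten"  # hand is moving: dynamic gesture
    return pose  # hand is still: static gesture
```

In this sketch the glove supplies a stable joint-angle vector per frame and the Kinect supplies a short wrist trajectory; a real system would smooth both signals and use a trained classifier rather than raw nearest-neighbour distance.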

Bibliographic Details
Main Author: Shi, Yuanyuan
Other Authors: School of Electrical and Electronic Engineering
Format: Final Year Project
Language: English
Published: 2014
Subjects: DRNTU::Engineering::Electrical and electronic engineering
Online Access:http://hdl.handle.net/10356/61366
Institution: Nanyang Technological University
id sg-ntu-dr.10356-61366
record_format dspace
spelling sg-ntu-dr.10356-613662023-07-07T18:01:05Z Understanding human interaction in RGB-D videos Shi, Yuanyuan School of Electrical and Electronic Engineering Yuan Junsong DRNTU::Engineering::Electrical and electronic engineering In this report, a human hand gesture recognition system is proposed. The system can understand both static and dynamic human hand gestures. So far, the system is able to recognize 9 static hand gestures: numbers from one to nine; and 1 dynamic hand gesture: number ten. In the system implementation, a right-hand CyberGlove II is used to obtain accurate and stable hand joint information for static hand gesture recognition. Based on the results of static classification, together with the hand joint motion information from Microsoft Kinect, dynamic hand gestures can be classified. In addition, an effective and fast human hand gesture recognition algorithm is proposed to manage the data from the sensors and achieve classification results in real time. To verify the effectiveness of the system, a human hand gesture sample dataset containing 250 samples collected from 5 people of different body sizes is constructed. The testing results show that the algorithm is able to understand human hand gestures accurately and quickly. Bachelor of Engineering 2014-06-09T07:21:27Z 2014-06-09T07:21:27Z 2014 2014 Final Year Project (FYP) http://hdl.handle.net/10356/61366 en Nanyang Technological University 57 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering::Electrical and electronic engineering
spellingShingle DRNTU::Engineering::Electrical and electronic engineering
Shi, Yuanyuan
Understanding human interaction in RGB-D videos
description In this report, a human hand gesture recognition system is proposed. The system can understand both static and dynamic human hand gestures. So far, the system is able to recognize 9 static hand gestures: numbers from one to nine; and 1 dynamic hand gesture: number ten. In the system implementation, a right-hand CyberGlove II is used to obtain accurate and stable hand joint information for static hand gesture recognition. Based on the results of static classification, together with the hand joint motion information from Microsoft Kinect, dynamic hand gestures can be classified. In addition, an effective and fast human hand gesture recognition algorithm is proposed to manage the data from the sensors and achieve classification results in real time. To verify the effectiveness of the system, a human hand gesture sample dataset containing 250 samples collected from 5 people of different body sizes is constructed. The testing results show that the algorithm is able to understand human hand gestures accurately and quickly.
author2 School of Electrical and Electronic Engineering
author_facet School of Electrical and Electronic Engineering
Shi, Yuanyuan
format Final Year Project
author Shi, Yuanyuan
author_sort Shi, Yuanyuan
title Understanding human interaction in RGB-D videos
title_short Understanding human interaction in RGB-D videos
title_full Understanding human interaction in RGB-D videos
title_fullStr Understanding human interaction in RGB-D videos
title_full_unstemmed Understanding human interaction in RGB-D videos
title_sort understanding human interaction in rgb-d videos
publishDate 2014
url http://hdl.handle.net/10356/61366
_version_ 1772829113682231296