Action recognition using machine learning techniques for robots

Recent advancements in the processing capabilities of mobile chips have opened up the possibility of developing domestic companion robots with small form factors. The most intuitive way to interact with such a robot is through human action, especially hand gestures. In this project, a robust static and dynamic gesture detector is built to achieve real-time performance on a mobile processor. The proposed framework features a novel hand hypotheses generator based on color and edge cues, a hand detector using a Convolutional Neural Network (CNN), a static gesture recognizer based on skin contour analysis, and a hypotheses-tracking system based on a Kalman filter for increased performance and consistency. The resulting system is robust to viewpoint, ambient lighting, and rotation, and produces accurate results in various real-life settings.

Index Terms: Hand Gesture Recognition, Hand Detection, Hand Tracking, Convolutional Neural Network
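The full report is only available via the handle link below. As a rough, self-contained illustration of the kind of pipeline the abstract describes (a color-based hand hypothesis, contour analysis as a static-gesture cue, and Kalman-filter tracking of the hypothesis), here is a minimal OpenCV sketch in Python. The skin-color thresholds, noise covariances, area cutoff, and the convexity-defect finger count are illustrative assumptions, not the report's actual method; in particular, the report's CNN hand detector, edge cues, and dynamic-gesture recognition are not reproduced here.

```python
# Illustrative sketch only: color-based hand hypothesis, convexity-defect
# finger count as a static-gesture cue, and a constant-velocity Kalman
# filter tracking the hand centroid. All thresholds and covariances are
# assumed demo values, not taken from the report.
import cv2
import numpy as np

# State: (x, y, vx, vy); measurement: centroid (x, y).
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1


def hand_hypothesis(frame_bgr):
    """Return the largest skin-colored contour as a crude hand hypothesis."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Commonly used YCrCb skin range; an assumption, not the report's model.
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None


def count_fingers(contour):
    """Rough static-gesture cue: deep convexity defects approximate finger gaps."""
    hull = cv2.convexHull(contour, returnPoints=False)
    if hull is None or len(hull) < 4:
        return 0
    try:
        defects = cv2.convexityDefects(contour, hull)
    except cv2.error:  # self-intersecting contours can fail here
        return 0
    if defects is None:
        return 0
    deep = int(np.sum(defects[:, 0, 3] / 256.0 > 20.0))  # defect depth in pixels
    return deep + 1 if deep else 0


cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    pred = kf.predict()  # position estimate even when detection fails
    cnt = hand_hypothesis(frame)
    if cnt is not None and cv2.contourArea(cnt) > 2000:
        m = cv2.moments(cnt)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        kf.correct(np.array([[cx], [cy]], np.float32))
        cv2.drawContours(frame, [cnt], -1, (0, 255, 0), 2)
        cv2.putText(frame, "fingers~%d" % count_fingers(cnt), (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.circle(frame, (int(pred[0, 0]), int(pred[1, 0])), 6, (0, 0, 255), -1)
    cv2.imshow("hand tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```

The Kalman predict step still yields a position estimate on frames where the color-based hypothesis fails, which is the consistency benefit the abstract attributes to hypothesis tracking.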

Bibliographic Details
Main Author: Yue, Zhongqi
Other Authors: Huang Guangbin
Format: Final Year Project
Language: English
Published: 2017
Subjects: DRNTU::Engineering::Electrical and electronic engineering
Online Access:http://hdl.handle.net/10356/72346
Institution: Nanyang Technological University
id sg-ntu-dr.10356-72346
record_format dspace
last_indexed 2023-07-07T15:59:25Z
school School of Electrical and Electronic Engineering
degree Bachelor of Engineering
date_accessioned 2017-06-12T02:05:12Z
date_available 2017-06-12T02:05:12Z
date_issued 2017
type Final Year Project (FYP)
pagination 99 p.
mimetype application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering::Electrical and electronic engineering
description Recent advancements in the processing capabilities of mobile chips have opened up the possibility of developing domestic companion robots with small form factors. The most intuitive way to interact with such a robot is through human action, especially hand gestures. In this project, a robust static and dynamic gesture detector is built to achieve real-time performance on a mobile processor. The proposed framework features a novel hand hypotheses generator based on color and edge cues, a hand detector using a Convolutional Neural Network (CNN), a static gesture recognizer based on skin contour analysis, and a hypotheses-tracking system based on a Kalman filter for increased performance and consistency. The resulting system is robust to viewpoint, ambient lighting, and rotation, and produces accurate results in various real-life settings. Index Terms: Hand Gesture Recognition, Hand Detection, Hand Tracking, Convolutional Neural Network
author2 Huang Guangbin
format Final Year Project
author Yue, Zhongqi
title Action recognition using machine learning techniques for robots
publishDate 2017
url http://hdl.handle.net/10356/72346
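The description also mentions a CNN-based hand detector that verifies the color/edge hypotheses. Since the report itself sits behind the handle link, the sketch below is a hypothetical stand-in rather than the author's network: a tiny PyTorch binary classifier that scores a cropped hypothesis as hand vs. not-hand. The layer sizes and the assumed 64x64 crop resolution are illustrative choices, not taken from the report.

```python
# Hypothetical stand-in for a CNN hand detector: a small binary classifier
# over cropped hypothesis windows. Architecture and input size are assumptions.
import torch
import torch.nn as nn

class HandVerifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, 2),  # logits: [not-hand, hand]
        )

    def forward(self, x):  # x: (N, 3, 64, 64) cropped hypothesis windows
        return self.classifier(self.features(x))

# Usage: score one 64x64 crop produced by a hypotheses generator.
logits = HandVerifier()(torch.randn(1, 3, 64, 64))
prob_hand = torch.softmax(logits, dim=1)[0, 1]
```

A shallow verifier like this keeps the per-hypothesis cost low, which matches the abstract's goal of real-time performance on a mobile processor; the actual trade-off chosen in the report is not documented in this record.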