Nao robot kinect control

This is the Final Year Project report on producing an application to control Aldebaran Robotics’ Nao Robot using Microsoft’s Kinect sensor. The process of developing this application involved learning how to program the Kinect and the Nao robot, as well as researching and designing artificial neural networks and machine learning algorithms for gesture recognition.

Bibliographic Details
Main Author: Cumming, Matthew
Other Authors: Song Qing
Format: Final Year Project
Language: English
Published: 2014
Subjects:
Online Access:http://hdl.handle.net/10356/60415
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-60415
record_format dspace
spelling sg-ntu-dr.10356-60415 2023-07-07T17:23:32Z Nao robot kinect control Cumming, Matthew Song Qing School of Electrical and Electronic Engineering Centre for Intelligent Machines DRNTU::Engineering This is the Final Year Project report on producing an application to control Aldebaran Robotics’ Nao Robot using Microsoft’s Kinect sensor. The process of developing this application involved learning how to program the Kinect and the Nao robot, as well as researching and designing artificial neural networks and machine learning algorithms for gesture recognition. The majority of the time spent working on this project involved research and learning how to use the tools necessary to build the final application. This project was an open-ended one; once an interface had been created, the robot could be controlled via the Kinect in any desired manner. A WPF application was created in Visual Studio, using C#, to take information from the Kinect about the user’s body position and output it. The outputs were analysed and used with Python scripts to record gestures from the Kinect. These recorded gestures were then used to train a multilayer perceptron artificial neural network (also programmed in Python) using a machine learning algorithm to recognise various gestures. Once the program had learned the gestures, it could perform gesture recognition in real time and control the robot accordingly. Further modes of operation were developed whereby the robot would imitate the user, moving to the same position and orientation as the user in its own environment. This project was successful, providing a great deal of research and learning opportunities while also establishing a system that can fully control the Nao robot through any interaction with the Kinect. It would be easy to develop this into a powerful tool for teaching Nao new behaviours or skills via human demonstration. Bachelor of Engineering 2014-05-27T04:16:51Z 2014-05-27T04:16:51Z 2014 2014 Final Year Project (FYP) http://hdl.handle.net/10356/60415 en Nanyang Technological University 146 p. application/pdf
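The abstract above only names the classifier, so the following is a minimal illustrative sketch, in Python with NumPy, of a one-hidden-layer multilayer perceptron of the kind described, trained by backpropagation. It is not the report's own code: the feature size (flattened Kinect joint coordinates per frame), the number of gestures and the training data are placeholder assumptions.

# Minimal sketch (not the report's code) of an MLP gesture classifier.
# Feature size, gesture count and the random training data are assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GestureMLP:
    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)      # hidden activations
        return sigmoid(self.h @ self.W2 + self.b2)   # one output per gesture

    def train_step(self, X, Y):
        out = self.forward(X)
        # squared-error backpropagation
        d_out = (out - Y) * out * (1 - out)
        d_h = (d_out @ self.W2.T) * self.h * (1 - self.h)
        self.W2 -= self.lr * self.h.T @ d_out
        self.b2 -= self.lr * d_out.sum(axis=0)
        self.W1 -= self.lr * X.T @ d_h
        self.b1 -= self.lr * d_h.sum(axis=0)
        return np.mean((out - Y) ** 2)

# Usage with placeholder data: 60 frames of 20 joint coordinates each,
# labelled as one of 3 gestures (one-hot targets).
X = np.random.rand(60, 20)
Y = np.eye(3)[np.random.randint(0, 3, 60)]
net = GestureMLP(n_in=20, n_hidden=15, n_out=3)
for epoch in range(500):
    loss = net.train_step(X, Y)
print("final training error:", loss)
print("predicted gesture for first frame:", net.forward(X[:1]).argmax())

In the report's pipeline, the training vectors would come from the gestures recorded via the WPF/Kinect application rather than from random placeholders.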
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering
spellingShingle DRNTU::Engineering
Cumming, Matthew
Nao robot kinect control
description This is the Final Year Project report on producing an application to control Aldebaran Robotics’ Nao Robot using Microsoft’s Kinect sensor. The process of developing this application involved learning how to program the Kinect and the Nao robot, as well as researching and designing artificial neural networks and machine learning algorithms for gesture recognition. The majority of the time spent working on this project involved research and learning how to use the tools necessary to build the final application. This project was an open-ended one; once an interface had been created, the robot could be controlled via the Kinect in any desired manner. A WPF application was created in Visual Studio, using C#, to take information from the Kinect about the user’s body position and output it. The outputs were analysed and used with Python scripts to record gestures from the Kinect. These recorded gestures were then used to train a multilayer perceptron artificial neural network (also programmed in Python) using a machine learning algorithm to recognise various gestures. Once the program had learned the gestures, it could perform gesture recognition in real time and control the robot accordingly. Further modes of operation were developed whereby the robot would imitate the user, moving to the same position and orientation as the user in its own environment. This project was successful, providing a great deal of research and learning opportunities while also establishing a system that can fully control the Nao robot through any interaction with the Kinect. It would be easy to develop this into a powerful tool for teaching Nao new behaviours or skills via human demonstration.
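For the real-time control stage described above, the sketch below assumes the NAOqi Python SDK (naoqi.ALProxy) and shows one plausible way to map recognised gestures onto robot motions; the robot address, the gesture names, the classify_frame() helper and the gesture-to-motion mapping are placeholders, not the report's own implementation.

# Sketch of a real-time control loop, assuming the NAOqi Python SDK is installed.
# classify_frame() is a hypothetical wrapper around the trained network above;
# IP, gesture names and the motion mapping are illustrative placeholders.
from naoqi import ALProxy

NAO_IP, NAO_PORT = "192.168.1.10", 9559   # placeholder address

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
posture = ALProxy("ALRobotPosture", NAO_IP, NAO_PORT)

# gesture label -> (x, y, theta) relative move in the robot's frame
GESTURE_TO_MOVE = {
    "step_forward": (0.2, 0.0, 0.0),
    "step_back":    (-0.2, 0.0, 0.0),
    "turn_left":    (0.0, 0.0, 0.5),
}

def control_loop(kinect_frames, classify_frame):
    """Consume Kinect skeleton frames and drive the robot on each recognised gesture."""
    motion.wakeUp()
    posture.goToPosture("StandInit", 0.5)
    for frame in kinect_frames:
        gesture = classify_frame(frame)          # e.g. argmax of the MLP outputs
        if gesture in GESTURE_TO_MOVE:
            x, y, theta = GESTURE_TO_MOVE[gesture]
            motion.moveTo(x, y, theta)           # blocking relative move
    motion.rest()

The report's imitation mode would instead translate the user's tracked position and orientation directly into moveTo targets rather than going through discrete gesture labels.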
author2 Song Qing
author_facet Song Qing
Cumming, Matthew
format Final Year Project
author Cumming, Matthew
author_sort Cumming, Matthew
title Nao robot kinect control
title_short Nao robot kinect control
title_full Nao robot kinect control
title_fullStr Nao robot kinect control
title_full_unstemmed Nao robot kinect control
title_sort nao robot kinect control
publishDate 2014
url http://hdl.handle.net/10356/60415
_version_ 1772826965066121216