Natural gesture based control for quadcopter

Bibliographic Details
Main Author: Chang, Yi Wei
Other Authors: Suresh Sundaram
Format: Final Year Project
Language: English
Published: 2015
Online Access: http://hdl.handle.net/10356/65606
Institution: Nanyang Technological University
Description
Summary: Human-Computer Interaction has progressed rapidly over the years, and the opportunities it offers are vast. Through a Natural User Interface, natural gestures can be used to control a quadcopter: the user's own body becomes the controller, and commands are sent to the quadcopter simply by changing body posture. Flying a quadcopter differs from driving a remote-controlled car because, unlike a car, a quadcopter can travel along three axes. The Microsoft Kinect is therefore used, as it can capture depth information from its surroundings. The Kinect tracks the user's skeleton data to determine the user's current posture, and each posture is classified as a specific command, which is then sent to the quadcopter, in this case a Parrot AR.Drone 2.0. Thresholds were set for each posture to provide a reasonable amount of leeway, ensuring that the user does not accidentally issue an unintended command. Commands were sent to the quadcopter over Wi-Fi using the User Datagram Protocol (UDP) to reduce bandwidth overhead and latency, allowing the quadcopter to respond to the user's commands with minimal delay.
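
As a rough illustration of the threshold-based posture classification described above, the following Python sketch compares Kinect skeleton joint heights against a leeway band. The joint names, threshold value, and posture-to-command mapping here are assumptions for illustration, not the project's actual design:

    # Hypothetical sketch of threshold-based posture classification.
    # Joint names, the threshold value, and the command labels are
    # illustrative; the project's actual postures are not listed here.

    THRESHOLD = 0.20  # metres of leeway around the neutral posture (assumed)

    def classify_posture(skeleton):
        """Map one Kinect skeleton frame to a flight command.

        `skeleton` is assumed to be a dict of joint name -> (x, y, z)
        coordinates in metres from a Kinect skeletal-tracking wrapper.
        """
        left_y = skeleton["hand_left"][1]
        right_y = skeleton["hand_right"][1]
        shoulder_y = skeleton["shoulder_center"][1]

        # Both hands raised well above the shoulders -> ascend.
        if left_y > shoulder_y + THRESHOLD and right_y > shoulder_y + THRESHOLD:
            return "up"
        # Both hands lowered well below the shoulders -> descend.
        if left_y < shoulder_y - THRESHOLD and right_y < shoulder_y - THRESHOLD:
            return "down"
        # One arm raised -> roll toward that side (illustrative mapping).
        if left_y > shoulder_y + THRESHOLD:
            return "roll_left"
        if right_y > shoulder_y + THRESHOLD:
            return "roll_right"
        # Inside the leeway band: no command, so small movements are ignored.
        return "hover"

The leeway band is what gives the thresholds their purpose: postures near neutral map to "hover" rather than to a movement command, so incidental motion does not trigger an unintended action.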
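
For the command link itself, the AR.Drone 2.0 accepts plain-text AT commands as UDP datagrams sent to port 5556 on the drone's own Wi-Fi network (192.168.1.1 by default, per the AR.Drone SDK). The sketch below shows a minimal sender under those SDK conventions; the class name and method set are assumptions for illustration. Note that AT*PCMD transmits its float arguments as their 32-bit IEEE-754 bit patterns read as signed integers:

    # Sketch of an AT-command sender for the Parrot AR.Drone 2.0.
    # Address, port, and command formats follow the AR.Drone SDK;
    # the helper class itself is an assumed name for illustration.
    import socket
    import struct

    DRONE_ADDR = ("192.168.1.1", 5556)  # drone's default IP, AT-command port

    def float_to_int32(value):
        # AT*PCMD sends floats as their IEEE-754 bit pattern read as int32.
        return struct.unpack("<i", struct.pack("<f", value))[0]

    class ATCommandSender:
        def __init__(self):
            self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            self.seq = 0  # the drone ignores commands with stale sequence numbers

        def _send(self, name, *args):
            self.seq += 1
            fields = ",".join(str(a) for a in (self.seq,) + args)
            self.sock.sendto(("AT*%s=%s\r" % (name, fields)).encode("ascii"),
                             DRONE_ADDR)

        def takeoff(self):
            self._send("REF", 290718208)  # 0x11540000 with bit 9 set

        def land(self):
            self._send("REF", 290717696)  # 0x11540000, bit 9 clear

        def move(self, roll, pitch, gaz, yaw):
            # Progressive movement; each argument is a float in [-1.0, 1.0].
            self._send("PCMD", 1, float_to_int32(roll), float_to_int32(pitch),
                       float_to_int32(gaz), float_to_int32(yaw))

Because UDP is connectionless, each command is a single datagram with no handshake or retransmission, which is what keeps latency and bandwidth overhead low; a lost datagram is simply superseded by the next posture update, and the SDK recommends refreshing commands roughly every 30 ms during flight.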