Gesture control for indoor navigation and maze exploration for drones

As drones become more popular and are commercialized around the globe, new ways of controlling them are being explored. The purpose of this study is to investigate one of these control methods, one that feels second nature to us and gives the user a natural flight experience. In this study, a motion controller is used to steer a drone via human hand gestures, with the LEAP Motion sensor as the motion controller and the Parrot AR Drone 2.0 as the aircraft. The Parrot AR Drone is a commercial quadrotor with a built-in Wi-Fi system; the drone is connected to the laptop via Wi-Fi, and the LEAP sensor is connected to the laptop through a USB port. The LEAP Motion sensor recognizes hand gestures and relays them to the laptop. The laptop, acting as the server, runs the program that serves as the platform for this implementation. JavaScript embedded in HTML is used to interact with the AR Drone and convey the simple hand gestures via a web browser. In the implementation, we wrote JavaScript code to interpret the hand gestures captured by the LEAP sensor and transmit them as commands that control the motion of the AR Drone.
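The abstract above outlines the control pipeline: the LEAP Motion sensor reads the hand pose over USB, and the laptop translates it into motion commands sent to the AR Drone over Wi-Fi. As an illustration only, the sketch below shows how such a gesture-to-command mapping could look using the Node.js packages leapjs and ar-drone; the project itself used JavaScript embedded in HTML and run through a web browser, so the package choice, thresholds, and gesture mapping here are assumptions rather than the author's code.

// Minimal sketch (assumed setup, not the project's code): map LEAP hand tilt to AR Drone motion.
const Leap = require('leapjs');        // LEAP Motion JavaScript SDK
const arDrone = require('ar-drone');   // Node.js client for the Parrot AR Drone 2.0

const client = arDrone.createClient(); // connects over the drone's own Wi-Fi network
client.takeoff();

Leap.loop({}, (frame) => {
  if (frame.hands.length === 0) {
    client.stop();                     // no hand over the sensor: hover in place
    return;
  }
  const hand = frame.hands[0];
  const pitch = hand.pitch();          // forward/backward tilt of the palm, in radians
  const roll = hand.roll();            // left/right tilt of the palm, in radians

  // Illustrative thresholds and sign conventions; 0.3 is an arbitrary speed value (0..1).
  if (pitch < -0.3)      client.front(0.3);
  else if (pitch > 0.3)  client.back(0.3);
  else if (roll > 0.3)   client.left(0.3);
  else if (roll < -0.3)  client.right(0.3);
  else                   client.stop();
});

A closed fist or removing the hand could similarly be mapped to client.land(); the same pattern extends to vertical and yaw control through client.up(), client.down(), and client.clockwise().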

Bibliographic Details
Main Author: Sazali Mohammed Ali
Other Authors: Sundaram Suresh; School of Computer Science and Engineering; Centre for Computational Intelligence
Format: Final Year Project
Language: English
Published: 2017
Degree: Bachelor of Engineering (Computer Engineering)
Physical Description: 34 p.
Subjects: DRNTU::Engineering::Computer science and engineering
Online Access: http://hdl.handle.net/10356/70371
Institution: Nanyang Technological University