Drone control via deep learning
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Online Access: https://hdl.handle.net/10356/181122
Institution: Nanyang Technological University
Summary: Drone adoption has been steadily increasing with the advent of more readily available commercial hobbyist drones. However, piloting a drone without training is a difficult task for an amateur because of its non-traditional control scheme and its aircraft principal axis system. This project aims to bridge that gap by using deep-learning computer vision to create a real-time natural user interface (NUI) program that acts as a ground control system to pilot the drone instead. Deep learning with convolutional neural networks (CNNs) has been gaining traction in computer vision in recent times and has seen large success in identifying human poses and gestures. By leveraging deep learning, we can implement dynamic human pose and gesture recognition systems and translate those natural body movements into corresponding drone commands, reducing the learning complexity of drone controls. This report discusses the implementation of the NUI program, the technology behind it, and what could be improved further. The program should also be able to run on low-end standalone embedded computing systems such as the Jetson TX2, for easier accessibility to users without powerful computing systems.
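The full report is only summarized above. As an illustration of the kind of pose-to-command mapping the abstract describes, here is a minimal sketch, assuming MediaPipe Pose for landmark detection and hypothetical command strings in place of the project's actual drone control interface; the thresholds and command names are placeholders, not the author's implementation.

```python
# Minimal pose-to-drone-command sketch, assuming MediaPipe Pose for landmark
# detection. Command strings and rules are hypothetical stand-ins for the
# project's actual ground control interface.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose


def pose_to_command(landmarks) -> str:
    """Map coarse wrist/shoulder positions to a drone command string."""
    lw = landmarks[mp_pose.PoseLandmark.LEFT_WRIST]
    rw = landmarks[mp_pose.PoseLandmark.RIGHT_WRIST]
    ls = landmarks[mp_pose.PoseLandmark.LEFT_SHOULDER]
    rs = landmarks[mp_pose.PoseLandmark.RIGHT_SHOULDER]
    # Image y grows downward, so "raised above the shoulder" means a smaller y.
    if lw.y < ls.y and rw.y < rs.y:
        return "takeoff"
    if lw.y < ls.y:
        return "move_left"
    if rw.y < rs.y:
        return "move_right"
    return "hover"


def main() -> None:
    cap = cv2.VideoCapture(0)  # webcam acting as the NUI sensor
    with mp_pose.Pose(min_detection_confidence=0.5) as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures frames in BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks:
                command = pose_to_command(results.pose_landmarks.landmark)
                print(command)  # placeholder for sending the command to the drone
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()


if __name__ == "__main__":
    main()
```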