Intelligent vision based coordinated control for quadcopter
Format: Final Year Project
Language: English
Published: 2014
Online Access: http://hdl.handle.net/10356/59049
Institution: Nanyang Technological University
Summary: Within the past decade, quadcopters have been widely used in military operations, surveillance, search missions and many other areas. They have also become a focus of aerodynamics research institutes for testing flight control theory, autonomous navigation and more. In recent years, quadcopters have been manufactured at incredibly small sizes and are known as nano quadcopters; their potential for new applications has sparked widespread interest.
The challenge with quadcopters is to make them fly stably and hold a target position. There is also growing interest in more natural ways of interacting with quadcopters. Hence the goals of this project are to (1) explore possible means of controlling a nano quadcopter, (2) implement a closed-loop control system for stable hovering, and (3) implement a vision-based system for controlling the quadcopter using gestures.
In this project, we explored and implemented several means of controlling a nano quadcopter: manual control using a joystick controller such as the Xbox 360 or PlayStation controller, control using a keyboard and mouse, and gesture-based control using the Leap Motion controller and the Microsoft Kinect.
The closed-loop control system was implemented using a PID controller, whose input is the difference between the target position and the actual position of the nano quadcopter. The actual position was determined by tracking a red ball attached on top of the quadcopter, applying image processing techniques to the RGB and depth images produced by the Microsoft Kinect.
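The closed-loop scheme described above can be sketched as a per-axis PID controller driven by the position error. The gains and the update rate below are illustrative assumptions, not values from the project:

```python
# Minimal per-axis PID sketch: input is the position error (target - actual),
# as in the summary. Gains kp/ki/kd and the 30 Hz update rate are assumed.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, actual):
        error = target - actual            # controller input: position error
        self.integral += error * self.dt   # accumulate error over time
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: altitude correction toward a 1.5 m hover target.
pid = PID(kp=1.0, ki=0.1, kd=0.05, dt=1 / 30)
thrust_correction = pid.update(target=1.5, actual=1.2)
```

In a real hover loop, one such controller would run for each axis (x, y, z) at the rate the Kinect delivers position estimates.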
The OpenNI framework and NITE middleware allowed us to develop a skeleton tracking algorithm. The gesture recognition algorithm was based on the skeleton joint data of the user's left hand, and a visualization of the skeleton joint data was also made available.
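A gesture classifier of the kind described could, for example, map the left-hand joint position relative to the torso joint to flight commands. The thresholds, joint format and command names below are hypothetical illustrations, not the project's actual mapping:

```python
# Hypothetical gesture classification from skeleton joint data: compare the
# left-hand joint to the torso joint and map the offset to a command.
# The 0.25 m margin and the command names are assumptions for illustration.

def classify_gesture(left_hand, torso, margin=0.25):
    """Map left-hand position relative to the torso to a flight command.

    left_hand, torso: (x, y, z) joint coordinates in metres,
    e.g. as provided by OpenNI/NITE skeleton tracking.
    """
    dx = left_hand[0] - torso[0]  # lateral offset
    dy = left_hand[1] - torso[1]  # vertical offset
    if dy > margin:
        return "ascend"
    if dy < -margin:
        return "descend"
    if dx > margin:
        return "move_right"
    if dx < -margin:
        return "move_left"
    return "hover"

# Hand raised well above the torso -> ascend command.
print(classify_gesture(left_hand=(0.1, 0.5, 2.0), torso=(0.0, 0.0, 2.0)))
```

A per-frame classifier like this would typically be smoothed over several frames before issuing commands, to avoid jitter from noisy joint estimates.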