Dynamic output feedback image-based visual servoing of rotorcrafts


Bibliographic Details
Main Author: Li, Jianan
Other Authors: Low Kin Huat
Format: Theses and Dissertations
Language:English
Published: 2018
Subjects:
Online Access:http://hdl.handle.net/10356/73849
Institution: Nanyang Technological University
Description
Summary: Recent advances in the field of unmanned aerial vehicles (UAVs) make camera-equipped rotorcraft suitable for performing advanced missions. One of the primary applications is tracking stationary and moving targets in GPS-denied environments or when relative positioning is needed. Visual servoing provides such an alternative. However, image-based visual servoing (IBVS) for a rotorcraft UAV is challenging due to the nonlinear perspective projection of image formation, system uncertainties, and the underactuated nonlinear dynamics of the UAV. An IBVS law that takes system dynamics into account is known as dynamic image-based visual servoing (DIBVS). Among various DIBVS approaches, this research adopts the virtual camera approach, which has been drawing increasing attention because it facilitates the estimation of the image points' depth and of the image kinematics.

This thesis presents a dynamic output feedback image-based visual servoing law for rotorcraft UAVs. The proposed control law enables a UAV to regulate its position and heading relative to a visual target consisting of multiple coplanar points. The UAV is required to carry only a minimal set of sensors, i.e., an inertial measurement unit (IMU) and a single downward-facing camera. A set of first-order image-moment features defined in the virtual camera image plane is used for visual servoing.

The proposed control law has the following characteristics. First, traditional IBVS approaches control the lateral and height subsystems separately and usually do not consider the potential loss of the image. In this research, the desired value of the image feature controlling the vertical motion of the vehicle is therefore designed as a function of the image features of the lateral motion rather than as a constant. This modification helps keep the visual target within the camera's field of view (FoV) by indirectly adjusting the height of the vehicle. Second, the dimension of the observer filter state space is reduced by removing an integral term from the observer, which further simplifies the controller's structure while preserving the asymptotic stability of the system. Third, the control law is adaptive to various unknown system parameters, including the mass of the vehicle, the thrust coefficient, and bias errors in the Euler angle measurements. Fourth, most traditional DIBVS control laws rely on measurements of the vehicle's linear velocity; in GNSS-denied situations, however, this may not be practical. In this thesis, an output feedback method is applied to DIBVS, removing the need for linear velocity measurement. The controller guarantees the asymptotic stability of the error dynamics.

Both simulation and experimental results are presented to demonstrate the performance of the proposed control law. The experimental setup includes a quadrotor equipped with an open-source Pixhawk autopilot and a CMUcam5 Pixy onboard computer vision system. The experimental results show that the errors reach steady state after about 20 seconds. At steady state, the errors oscillate within a small range, with amplitudes below 0.1 m for the position errors and 0.1 rad for the heading error.
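To make the virtual camera idea concrete, the following is a minimal sketch of how pixel points can be reprojected into a roll/pitch-compensated "virtual" camera and how first-order (centroid) image-moment features can be computed from them. The function name, the simple pinhole intrinsics, and the spread-based vertical feature are illustrative assumptions for this sketch, not the exact feature definitions used in the thesis.

```python
import math


def virtual_camera_features(points_px, f, cx, cy, roll, pitch):
    """Reproject pixel points into a roll/pitch-compensated virtual camera
    and compute first-order image-moment features (illustrative sketch).

    points_px   : list of (u, v) pixel coordinates of the coplanar target
    f, cx, cy   : pinhole intrinsics (focal length and principal point)
    roll, pitch : IMU attitude estimates [rad]
    """
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    # Rotation R = Ry(pitch) @ Rx(roll): removes roll and pitch, mapping
    # rays from the tilted real camera into the level virtual camera frame
    # (yaw is deliberately left untouched).
    R = [
        [cp, sp * sr, sp * cr],
        [0.0, cr, -sr],
        [-sp, cp * sr, cp * cr],
    ]

    virt = []
    for (u, v) in points_px:
        # Back-project the pixel to a normalized ray in the real camera frame.
        ray = [(u - cx) / f, (v - cy) / f, 1.0]
        # Rotate into the virtual frame and re-project onto its image plane.
        rx = sum(R[0][k] * ray[k] for k in range(3))
        ry = sum(R[1][k] * ray[k] for k in range(3))
        rz = sum(R[2][k] * ray[k] for k in range(3))
        virt.append((rx / rz, ry / rz))

    # First-order image moments: the centroid serves as the lateral features.
    n = len(virt)
    q_x = sum(p[0] for p in virt) / n
    q_y = sum(p[1] for p in virt) / n

    # One common choice for a vertical feature: the RMS spread of the points
    # about their centroid, which shrinks as the vehicle climbs.
    spread = math.sqrt(
        sum((p[0] - q_x) ** 2 + (p[1] - q_y) ** 2 for p in virt) / n
    )
    return q_x, q_y, spread
```

Because the reprojection cancels roll and pitch, the features depend only on the vehicle's position and yaw relative to the target, which is what makes depth and image-kinematics estimation tractable in the virtual camera formulation.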