Visual and inertial odometry for mobile robot


Saved in:
Bibliographic Details
Main Author: Leong, Jing Chen
Other Authors: Seet Gim Lee, Gerald
Format: Final Year Project
Language:English
Published: 2019
Subjects:
Online Access:http://hdl.handle.net/10356/78537
Institution: Nanyang Technological University
Description
Summary: In recent years, unmanned aerial vehicles (UAVs) have been well received in both consumer and industrial applications due to their versatility and cost-effectiveness. During the Singapore International Water Week, the national water agency (PUB) showcased its smart technologies for performing various complex tasks, such as deep tunnel sewerage system inspection [1]. Instead of sending inspection personnel, PUB deployed a UAV to perform inspections in the deep tunnel sewerage system, an environment that is hostile to humans. For the UAV to manoeuvre in such an environment, it must be able to localize itself. To localize a UAV, most developers incorporate a global positioning system (GPS) receiver. However, GPS is ineffective for determining the position of a UAV when the GPS signal is weak or absent, such as in indoor urban environments and tunnel networks. Another method to accurately determine the location of a UAV is to incorporate sensors such as an inertial measurement unit (IMU), cameras and lidar. The objective of this project is to localize a UAV with respect to features in the environment for tunnel and indoor tasks using vision and inertial sensors. In a tunnel environment, features can be scarce and poorly illuminated. A camera performs poorly when the environment is dark or the flight speed is high, due to motion blur, while an IMU is prone to drift when held stationary or travelling at low speed, because it exhibits higher noise when static. Under such conditions, the IMU complements the camera at higher flight velocities, where the IMU is effective, while the camera complements the IMU at lower speeds, where the camera is effective at determining movement changes. Throughout the project, several challenges were encountered, such as camera and IMU calibration issues and hardware synchronization issues between the camera and IMU.
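The speed-dependent complementarity of the two sensors can be illustrated with a toy weighting scheme. This is only an illustrative sketch, not the tightly coupled optimization that VINS-Mono actually performs; the crossover speed `v_switch` and the velocity values are hypothetical.

```python
def fuse_velocity(v_cam, v_imu, speed, v_switch=1.0):
    """Blend camera- and IMU-derived velocity estimates.

    At low speed the camera estimate dominates (IMU noise and drift
    are worst when nearly static); at high speed the IMU dominates
    (the camera suffers motion blur). v_switch is a hypothetical
    crossover speed in m/s.
    """
    w_imu = min(speed / v_switch, 1.0)  # IMU weight grows with speed, capped at 1
    return w_imu * v_imu + (1.0 - w_imu) * v_cam

# At hover (speed 0) the fused estimate equals the camera's
assert fuse_velocity(0.2, 0.5, speed=0.0) == 0.2
# At high speed it equals the IMU's
assert fuse_velocity(0.2, 0.5, speed=2.0) == 0.5
```

In practice VINS-Mono fuses the two sensors by jointly optimizing visual feature reprojection and IMU pre-integration residuals rather than by explicit weighting, but the sketch captures why each sensor covers the other's weak regime.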
In this project, a UAV capable of self-localization was set up, integrating an on-board computing unit, a camera and an IMU. Prior to state estimation, camera calibration, IMU calibration and hardware synchronization of the camera and IMU were carried out. The VINS-Mono state estimator was used to perform the state estimation. Subsequently, an experimental evaluation was carried out at the Motion Analysis Laboratory of Nanyang Technological University to compare the localization performance of the UAV system against ground-truth data. In addition, a comparison study was made to determine the robustness and reliability of the VINS-Mono state estimator and the UAV system under various flight velocities and environment-feature settings. Based on the results obtained, it can be concluded that the implementation is practical with a synchronized camera-IMU setup, as it is capable of localizing a UAV in a GPS-denied environment with an average root mean square error kept under 25 cm.
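The evaluation metric used above can be sketched as follows: root mean square error (RMSE) between the estimated trajectory and the motion-capture ground truth. The position values here are made up for illustration; the actual experimental data is in the thesis body.

```python
import math

def rmse(estimated, ground_truth):
    """RMSE between two equal-length sequences of 3D positions (metres)."""
    assert len(estimated) == len(ground_truth)
    sq_errors = [
        sum((e - g) ** 2 for e, g in zip(est_pt, gt_pt))
        for est_pt, gt_pt in zip(estimated, ground_truth)
    ]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# Hypothetical estimated vs. ground-truth positions (x, y, z in metres)
est = [(0.0, 0.0, 1.0), (0.5, 0.1, 1.0), (1.0, 0.2, 1.1)]
gt  = [(0.0, 0.0, 1.0), (0.4, 0.0, 1.0), (0.9, 0.1, 1.0)]

print(f"RMSE: {rmse(est, gt):.3f} m")  # this toy track stays under the 0.25 m figure
```

In the actual experiments the ground-truth positions came from the motion-capture system at the Motion Analysis Laboratory, and the reported average RMSE was under 25 cm.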