Advanced vision-based localization and mapping
Saved in:

Main Author: |
Other Authors: |
Format: | Final Year Project
Language: | English
Published: | 2019
Subjects: |
Online Access: | http://hdl.handle.net/10356/77764
Institution: | Nanyang Technological University
Summary: | With the advent of powerful modern processors, the field of robotics is advancing rapidly across a wide range of applications. One such application is Simultaneous Localization and Mapping (SLAM), a technique that allows a robot to estimate its motion from a sequence of images captured by its "eyes". Applications of SLAM range from self-driving cars to autonomous surveillance drones. Although SLAM for drones is approaching maturity, it still leaves considerable room for improvement: systems can lose track or produce unreliable pose estimates. We therefore implemented a more robust SLAM system by fusing the SLAM output with a low-cost Inertial Measurement Unit (IMU). The work is two-fold. First, several state-of-the-art SLAM algorithms were evaluated to identify the most suitable and robust choice for drone applications; we selected PL-SLAM, which produced the most consistent results. Second, to integrate the IMU, we adopted a loosely-coupled approach using an Extended Kalman Filter (EKF) to improve the system's accuracy. Overall, our SLAM system showed promising accuracy and no longer lost track while operating. |
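The loosely-coupled scheme described in the summary treats the SLAM pose output as an external measurement that corrects an IMU-propagated state. As a rough illustration of that idea (not the thesis's actual implementation; the state vector, noise parameters, and PL-SLAM interface below are assumptions, and with this linear 1-D model the EKF reduces to a standard Kalman filter), a single predict/update cycle might look like:

```python
import numpy as np

# Illustrative 1-D loosely-coupled fusion: the IMU drives the prediction
# step, and a SLAM position estimate drives the correction step.
# State x = [position, velocity]; all noise values are made up.

def ekf_predict(x, P, accel, dt, Q):
    """Propagate the state with one IMU acceleration reading."""
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])          # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])     # maps acceleration into the state
    x = F @ x + B * accel
    P = F @ P @ F.T + Q                 # propagate covariance
    return x, P

def ekf_update(x, P, z_pos, R):
    """Correct the state with one SLAM position estimate."""
    H = np.array([[1.0, 0.0]])          # SLAM observes position only
    y = z_pos - H @ x                   # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.zeros(2)                         # start at rest at the origin
P = np.eye(2)                           # uncertain initial state
Q = 1e-3 * np.eye(2)                    # assumed IMU process noise
R = np.array([[1e-2]])                  # assumed SLAM measurement noise

# One fusion cycle: IMU prediction followed by SLAM correction.
x, P = ekf_predict(x, P, accel=0.5, dt=0.1, Q=Q)
x, P = ekf_update(x, P, z_pos=np.array([0.003]), R=R)
```

The SLAM correction shrinks the position covariance, which is the mechanism that prevents the IMU-only estimate from drifting unboundedly between visual updates.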