Advanced vision-based localization and mapping

With the advent of state-of-the-art computing processors, the field of robotics is advancing rapidly across a plethora of applications. One such application is based on Simultaneous Localization and Mapping (SLAM), a technique that allows a robot to determine it...

Full description

Bibliographic Details
Main Author: Pham, Nguyen Tuan Anh
Other Authors: Xie Lihua
Format: Final Year Project
Language: English
Published: 2019
Subjects:
Online Access: http://hdl.handle.net/10356/77764
Institution: Nanyang Technological University
id sg-ntu-dr.10356-77764
record_format dspace
spelling sg-ntu-dr.10356-777642023-07-07T15:56:23Z Advanced vision-based localization and mapping Pham, Nguyen Tuan Anh Xie Lihua School of Electrical and Electronic Engineering DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision With the advent of state-of-the-art computing processors, the field of robotics is advancing rapidly across a plethora of applications. One such application is based on Simultaneous Localization and Mapping (SLAM), a technique that allows a robot to determine its movement from a sequence of images from its “eyes”. Applications of SLAM range from self-driving cars to autonomous surveillance drones. While the development of SLAM on drones is gradually maturing, there is still considerable room for improvement: problems of tracking loss and unreliable accuracy remain. Therefore, we decided to implement a more robust SLAM system by integrating SLAM results with an additional low-cost Inertial Measurement Unit (IMU). The work in this study is two-fold. Firstly, various cutting-edge SLAM algorithms are tested to find the most suitable and most robust choice for drone applications. We opted for PL-SLAM for our system as it showed the most reliable results. Secondly, to integrate the system with the IMU, we chose a loosely-coupled solution with an Extended Kalman Filter (EKF) to enhance the system’s accuracy. Overall, our SLAM system showed promising accuracy and no longer lost track during operation. Bachelor of Engineering (Electrical and Electronic Engineering) 2019-06-06T04:09:51Z 2019-06-06T04:09:51Z 2019 Final Year Project (FYP) http://hdl.handle.net/10356/77764 en Nanyang Technological University 56 p. application/pdf
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
spellingShingle DRNTU::Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Pham, Nguyen Tuan Anh
Advanced vision-based localization and mapping
description With the advent of state-of-the-art computing processors, the field of robotics is advancing rapidly across a plethora of applications. One such application is based on Simultaneous Localization and Mapping (SLAM), a technique that allows a robot to determine its movement from a sequence of images from its “eyes”. Applications of SLAM range from self-driving cars to autonomous surveillance drones. While the development of SLAM on drones is gradually maturing, there is still considerable room for improvement: problems of tracking loss and unreliable accuracy remain. Therefore, we decided to implement a more robust SLAM system by integrating SLAM results with an additional low-cost Inertial Measurement Unit (IMU). The work in this study is two-fold. Firstly, various cutting-edge SLAM algorithms are tested to find the most suitable and most robust choice for drone applications. We opted for PL-SLAM for our system as it showed the most reliable results. Secondly, to integrate the system with the IMU, we chose a loosely-coupled solution with an Extended Kalman Filter (EKF) to enhance the system’s accuracy. Overall, our SLAM system showed promising accuracy and no longer lost track during operation.
author2 Xie Lihua
author_facet Xie Lihua
Pham, Nguyen Tuan Anh
format Final Year Project
author Pham, Nguyen Tuan Anh
author_sort Pham, Nguyen Tuan Anh
title Advanced vision-based localization and mapping
title_short Advanced vision-based localization and mapping
title_full Advanced vision-based localization and mapping
title_fullStr Advanced vision-based localization and mapping
title_full_unstemmed Advanced vision-based localization and mapping
title_sort advanced vision-based localization and mapping
publishDate 2019
url http://hdl.handle.net/10356/77764
_version_ 1772826310436978688
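The loosely-coupled SLAM/IMU fusion that the abstract describes can be sketched in one dimension: the IMU's acceleration drives the filter's prediction step, and the SLAM front end's position estimate drives the update step. For this linear model the EKF reduces to a plain Kalman filter; the state layout, noise values, and function names below are illustrative assumptions, not the project's actual implementation.

```python
# Minimal 1-D sketch of loosely-coupled SLAM/IMU fusion with a Kalman filter.
# State is [position, velocity]; the IMU supplies acceleration for prediction,
# and SLAM supplies a position fix for the correction step.
import numpy as np

def predict(x, P, accel, dt, q=0.1):
    """Propagate the state using an IMU acceleration reading (process noise q)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input model
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def update(x, P, z_pos, r=0.5):
    """Correct the state with a SLAM position measurement (noise variance r)."""
    H = np.array([[1.0, 0.0]])              # SLAM observes position only
    y = z_pos - (H @ x)[0]                  # innovation
    S = (H @ P @ H.T)[0, 0] + r             # innovation variance
    K = (P @ H.T / S).flatten()             # Kalman gain
    x = x + K * y
    P = (np.eye(2) - np.outer(K, H)) @ P
    return x, P

# Simulate 1 s of motion under a constant 1 m/s^2 acceleration, with the
# "SLAM" measurement set to the true position at each step for illustration.
x, P, dt = np.zeros(2), np.eye(2), 0.1
for step in range(10):
    x, P = predict(x, P, accel=1.0, dt=dt)
    x, P = update(x, P, z_pos=0.5 * (dt * (step + 1))**2)
# x now estimates roughly [0.5 m, 1.0 m/s]
```

In a real loosely-coupled system the same structure holds, but the state carries 3-D pose and velocity, the prediction integrates gyroscope and accelerometer readings, and the update consumes the full 6-DoF pose from the SLAM pipeline.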