Visual inertial SLAM based autonomous navigation system


Bibliographic Details
Main Author: Chua, Eng Soon
Other Authors: Lam Siew Kei
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2023
Online Access: https://hdl.handle.net/10356/171846
Institution: Nanyang Technological University
Description
Summary: Development and use of Visual and Visual-Inertial SLAM for autonomous navigation have been prevalent for the past decade. Autonomous navigation systems have traditionally used SLAM with LiDAR sensors, which provide accurate depth ranging of the environment. Visual SLAM, which uses cameras as sensors, was developed as a lower-cost alternative to the more expensive LiDAR for autonomous navigation. Today, newer Visual SLAM methods that incorporate Inertial Measurement Units (IMUs), referred to as Visual-Inertial SLAM, and that support different camera types such as stereo and RGB-D cameras have been proposed and shared with the public. In this project, we aim to implement an autonomous navigation system using COVINS and Multi-Robot Coordination. The navigation stack of Multi-Robot Coordination is integrated with COVINS, and the integrated system is then deployed on an Unmanned Ground Vehicle (UGV) to perform autonomous navigation in an indoor environment.
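
As a minimal sketch of the kind of glue such an integration needs, the following Python node relays a pose estimate from the SLAM back-end into the ROS navigation stack, assuming a ROS 1 setup. The input topic name /covins/pose and the node name are hypothetical placeholders, not taken from the project report; /initialpose is the standard topic the navigation stack's localization component listens to.

#!/usr/bin/env python
# Sketch only: bridge a SLAM pose estimate into the ROS navigation stack.
# "/covins/pose" is a placeholder topic name, not an actual COVINS interface.
import rospy
from geometry_msgs.msg import PoseStamped, PoseWithCovarianceStamped

def relay_pose(msg):
    # Wrap the incoming pose for /initialpose, which expects a covariance;
    # the covariance is left at its all-zero default for this sketch.
    out = PoseWithCovarianceStamped()
    out.header = msg.header
    out.pose.pose = msg.pose
    pub.publish(out)

if __name__ == "__main__":
    rospy.init_node("covins_nav_bridge")
    # Create the publisher before subscribing so the callback can use it.
    pub = rospy.Publisher("/initialpose", PoseWithCovarianceStamped, queue_size=1)
    rospy.Subscriber("/covins/pose", PoseStamped, relay_pose)
    rospy.spin()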