Real-time visual localization system on an embedded platform

Bibliographic Details
Main Author: Do, Anh Tu
Other Authors: Lam Siew Kei
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2021
Subjects:
Online Access:https://hdl.handle.net/10356/148230
Institution: Nanyang Technological University
Description
Summary: Simultaneous localization and mapping (SLAM) of an unknown environment is an important task for mobile autonomous robots and navigation systems. Indoor SLAM, however, cannot rely on GPS signals, and existing solutions often assume static scene conditions that may not hold in real-life scenarios. In addition, the maps generated by typical SLAM systems carry no semantic meaning of the environment and are not useful for higher-level missions such as path planning and object interaction. The computation cost of such systems is another concern, as SLAM systems are often implemented on embedded platforms. In this project, two real-time visual SLAM systems are designed for the Jetson Xavier NX embedded platform that are capable of working in dynamic environments and generating 3D semantic global maps for context-aware tasks. They employ a deep neural network to obtain the semantic information used to build the global maps. A distributed computing setup is also considered in order to achieve real-time 3D semantic map generation. Quantitative and qualitative results of the proposed semantic SLAM systems are presented for comparison and evaluation.