A consistent and long-term mapping approach for navigation

Bibliographic Details
Main Authors: Zhang, Handuo, Karunasekera, Hasith, Wang, Han
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2020
Online Access:https://www.zealpress.com/ijratv5a4/
https://hdl.handle.net/10356/141288
Institution: Nanyang Technological University
Description
Summary: The construction and maintenance of a robocentric map is key to high-level mobile robotic tasks such as path planning and smart navigation. However, dynamic environments and the huge volume of dense sensor data make such maps difficult to deploy in real-world applications for long-term use. In this paper we present a novel mapping approach that incorporates semantic cuboid object detection and multi-view geometry information. The proposed system precisely describes the incrementally built 3D environment in real time and maintains a long-term map by filtering out moving objects. The map is represented as a collection of sub-volumes, over which pose graph optimization can be performed to address the challenge of building a consistent and scalable map. These sub-volumes are first aligned by the localization module and then refined by fusing the active volumes using a co-visibility graph. With the proposed framework we obtain object-level constraints and build a consistent obstacle mapping system that combines multi-view geometry with obstacle detection to produce a robust static map in complex environments. Experiments on a public dataset and self-collected data demonstrate the efficiency and consistency of the proposed approach.
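
The article's implementation is not part of this record. As a loose illustration of the sub-volume representation the summary describes, the following is a minimal Python sketch; the names (SubVolume, covisible, fuse_active_volumes), the distance-based co-visibility test, and the dynamic-mask filtering are all assumptions made for illustration, not the authors' method.

# Minimal sketch of a sub-volume map. Hypothetical throughout: it only
# illustrates storing the map as posed sub-volumes, aligning them with
# localization poses, filtering points flagged as moving objects, and
# fusing co-visible volumes into a world-frame static map.
from dataclasses import dataclass
import numpy as np

@dataclass
class SubVolume:
    """A local chunk of the map: points in the volume frame plus a pose."""
    pose: np.ndarray          # 4x4 world-from-volume transform (from localization)
    points: np.ndarray        # (N, 3) points in the volume frame
    dynamic_mask: np.ndarray  # (N,) True where a detected moving object lies

    def static_points_world(self) -> np.ndarray:
        """Drop points flagged dynamic, then transform the rest to world frame."""
        pts = self.points[~self.dynamic_mask]
        homo = np.hstack([pts, np.ones((len(pts), 1))])
        return (self.pose @ homo.T).T[:, :3]

def covisible(a: SubVolume, b: SubVolume, max_dist: float = 5.0) -> bool:
    """Crude stand-in for a co-visibility graph edge: volume origins nearby."""
    return np.linalg.norm(a.pose[:3, 3] - b.pose[:3, 3]) < max_dist

def fuse_active_volumes(volumes: list) -> np.ndarray:
    """Fuse the active (co-visible) volumes into one world-frame static cloud."""
    fused = [v.static_points_world() for v in volumes]
    return np.vstack(fused) if fused else np.empty((0, 3))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vols = []
    for i in range(3):
        pose = np.eye(4)
        pose[:3, 3] = [2.0 * i, 0.0, 0.0]   # poses supplied by the localization module
        pts = rng.uniform(-1.0, 1.0, size=(100, 3))
        mask = rng.random(100) < 0.1        # pretend 10% of points fall on moving objects
        vols.append(SubVolume(pose, pts, mask))
    active = [v for v in vols if covisible(v, vols[0])]
    print(fuse_active_volumes(active).shape)

In the actual system the co-visibility test would come from shared observations rather than pose distance, and the fused sub-volume poses would additionally be refined by pose graph optimization; this sketch only shows the data layout such a pipeline operates on.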