Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization

In multi-robot systems, two important research problems are relative localization between the robots and global localization of all robots in a common frame. Traditional methods rely on detecting inter- and intra-robot loop closures, which can be restrictive operation-wise since the robots must form loops. Ultra-wideband (UWB) sensors, which provide direct distance measurements and robot ID, can replace loop closures in many applications. However, existing research on UWB-aided multi-robot state estimation often ignores odometry drift, which leads to inaccurate global position estimates in the long run. In this work, we present a UWB-aided multi-robot localization system that does not rely on loop closures (flexible) and only requires odometry data from neighbors (resource-efficient). We propose a two-stage approach: 1) with a long sliding window, the relative transformation is refined based on range and odometry data; 2) onboard visual-inertial-range data are tightly fused in a short-term sliding window to provide more accurate local and global estimates. Simulation and real-life experiments with two quadrotors show that the system as a whole outperforms previous approaches as well as its individual parts.

Bibliographic Details
Main Authors: Nguyen, Thien Hoang, Nguyen, Thien-Minh, Xie, Lihua
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2022
Subjects: Engineering::Electrical and electronic engineering; Localization; Sensor Fusion
Online Access: https://hdl.handle.net/10356/162341
Institution: Nanyang Technological University
id sg-ntu-dr.10356-162341
record_format dspace
spelling sg-ntu-dr.10356-162341 2022-10-14T08:08:16Z
Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization
Nguyen, Thien Hoang; Nguyen, Thien-Minh; Xie, Lihua
School of Electrical and Electronic Engineering
Engineering::Electrical and electronic engineering; Localization; Sensor Fusion
Funding: National Research Foundation (NRF). This work was supported by the National Research Foundation, Singapore under its Medium Sized Center for Advanced Robotics Technology Innovation.
Published: 2021. Journal Article.
Citation: Nguyen, T. H., Nguyen, T. & Xie, L. (2021). Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization. IEEE Robotics and Automation Letters, 7(2), 928-935.
DOI: 10.1109/LRA.2021.3136286 (https://dx.doi.org/10.1109/LRA.2021.3136286). ISSN: 2377-3766. Handle: https://hdl.handle.net/10356/162341. Scopus: 2-s2.0-85121818529.
© 2021 IEEE. All rights reserved. IEEE Robotics and Automation Letters (en).
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Electrical and electronic engineering
Localization
Sensor Fusion
description In multi-robot systems, two important research problems are relative localization between the robots and global localization of all robots in a common frame. Traditional methods rely on detecting inter- and intra-robot loop closures, which can be restrictive operation-wise since the robots must form loops. Ultra-wideband (UWB) sensors, which provide direct distance measurements and robot ID, can replace loop closures in many applications. However, existing research on UWB-aided multi-robot state estimation often ignores odometry drift, which leads to inaccurate global position estimates in the long run. In this work, we present a UWB-aided multi-robot localization system that does not rely on loop closures (flexible) and only requires odometry data from neighbors (resource-efficient). We propose a two-stage approach: 1) with a long sliding window, the relative transformation is refined based on range and odometry data; 2) onboard visual-inertial-range data are tightly fused in a short-term sliding window to provide more accurate local and global estimates. Simulation and real-life experiments with two quadrotors show that the system as a whole outperforms previous approaches as well as its individual parts.
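The first stage of the two-stage approach in the abstract can be illustrated with a minimal sketch: estimate the relative transform between two robots' odometry frames using only inter-robot UWB ranges and each robot's own odometry over a window. This is not the authors' implementation; all names are hypothetical, a planar (2D) case is assumed for brevity, and SciPy's nonlinear least squares stands in for whatever solver the paper uses.

```python
# Hypothetical sketch of stage 1 (range-and-odometry relative-transform
# refinement); a 2D case is assumed, not taken from the paper's code.
import numpy as np
from scipy.optimize import least_squares

def rot2d(theta):
    """2D rotation matrix for angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def refine_relative_transform(p1, p2, ranges, x0):
    """Estimate x = (theta, tx, ty) mapping robot 2's odometry frame
    into robot 1's by minimizing squared range residuals over the window.

    p1, p2 : (N, 2) odometry positions in each robot's own frame.
    ranges : (N,) UWB distances between the robots at matching timestamps.
    x0     : initial guess, e.g. the previous window's estimate.
    """
    def residuals(x):
        theta, t = x[0], x[1:]
        p2_in_1 = p2 @ rot2d(theta).T + t   # robot 2's path seen in frame 1
        return np.linalg.norm(p1 - p2_in_1, axis=1) - ranges
    return least_squares(residuals, x0).x
```

A long window is natural here because a range-only relative transform is only observable when the trajectories are sufficiently exciting; stage 2 then fuses visual-inertial-range data in a short window for the accurate local and global estimates.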
author2 School of Electrical and Electronic Engineering
format Article
author Nguyen, Thien Hoang
Nguyen, Thien-Minh
Xie, Lihua
author_sort Nguyen, Thien Hoang
title Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization