Ultra-wideband-aided localization for autonomous robots
Achieving accurate, reliable and globally consistent localization is a fundamental challenge for autonomous mobile robots. There has been extensive research on this topic, showing impressive performance in various scenarios and with various sensor combinations. In a multi-robot setting, finding th...
Main Author: | Nguyen, Hoang Thien |
---|---|
Other Authors: | Xie Lihua |
Format: | Thesis-Doctor of Philosophy |
Language: | English |
Published: |
Nanyang Technological University
2023
|
Subjects: | Engineering::Electrical and electronic engineering::Computer hardware, software and systems |
Online Access: | https://hdl.handle.net/10356/167947 https://doi.org/10.21979/N9/X39LEK |
Institution: | Nanyang Technological University |
Language: | English |
id |
sg-ntu-dr.10356-167947 |
---|---|
record_format |
dspace |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
Engineering::Electrical and electronic engineering::Computer hardware, software and systems |
spellingShingle |
Engineering::Electrical and electronic engineering::Computer hardware, software and systems Nguyen, Hoang Thien Ultra-wideband-aided localization for autonomous robots |
description |
Achieving accurate, reliable and globally consistent localization is a fundamental challenge for autonomous mobile robots. There has been extensive research on this topic, showing impressive performance in various scenarios and with various sensor combinations.
In a multi-robot setting, finding the relative positions between robots is another essential problem that underpins higher-level tasks such as obstacle avoidance, path planning and collaborative tasking. These problems are especially critical for Micro Aerial Vehicles (MAVs), since any collision with surrounding objects or neighboring robots can be catastrophic. Furthermore, the size, weight and power constraints are much more restrictive for MAVs than for ground or legged robots. In this regard, Ultra-wideband (UWB) offers many advantages over traditional sensors such as cameras, radar or LiDAR. However, UWB has its own limitations, such as being heavily reliant on line-of-sight (LoS) and being unable to provide any information about the environment. For these reasons, designing new methods that combine UWB with other sensor modalities is a worthwhile endeavour and also the overarching goal of this thesis.
Firstly, we investigate the problem of integrating UWB and a monocular camera, specifically fusing ranging and visual odometry (VO) data to estimate both the UWB anchor position and the VO's scale factor.
We propose a loosely-coupled (LC) fusion scheme to address this problem, with the main advantage being its flexibility since both the VO algorithm and UWB sensor can be replaced easily.
Additionally, a theoretical analysis is provided, including the derivation of the Fisher Information Matrix (FIM) and its determinant, an evaluation of the singular configurations, and their geometric interpretations.
Based on these analyses, an estimation-and-trajectory-optimization framework is presented. It consists of an estimator, designed around gradient-descent or semidefinite programming (SDP) optimization, and a trajectory optimization approach that minimizes the uncertainty volume and can be easily adapted to different scenarios.
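To make the estimation step concrete, the sketch below jointly recovers an anchor position and a VO scale factor by plain gradient descent on the squared range residuals. It is a minimal illustrative sketch, not the thesis's implementation: the function name, the residual model e_i = ||a - s*v_i|| - r_i, and the step sizes are all assumptions.

```python
import numpy as np

def estimate_anchor_and_scale(vo_pos, ranges, iters=20000, lr=0.05):
    """Jointly estimate the UWB anchor position `a` and the VO scale `s`
    by gradient descent on the squared range residuals
    e_i = ||a - s * v_i|| - r_i  (hypothetical minimal model)."""
    a = vo_pos.mean(axis=0).copy()   # crude initial guess: trajectory centroid
    s = 1.0
    n = len(ranges)
    for _ in range(iters):
        d = a - s * vo_pos                    # (N, 3) anchor-to-pose vectors
        dist = np.linalg.norm(d, axis=1)
        e = dist - ranges                     # range residuals
        u = d / dist[:, None]                 # unit direction vectors
        grad_a = (e[:, None] * u).sum(axis=0)            # d(cost)/da
        grad_s = -(e * (u * vo_pos).sum(axis=1)).sum()   # d(cost)/ds
        a -= lr * grad_a / n
        s -= lr * grad_s / n
    return a, s
```

With noiseless ranges along a trajectory that excites all directions (e.g. a helix), the iteration converges to the true anchor and scale; degenerate trajectories such as straight lines correspond to the singular configurations that the FIM analysis identifies.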
Nonetheless, the VO estimate still drifts over time as errors accumulate, which the LC method cannot address. Hence, we develop a tightly-coupled (TC) version that fuses UWB and visual measurements in a joint optimization pipeline, supports any number of anchors, and incorporates various other enhancements.
Extensive simulations and experiments are carried out to demonstrate the effectiveness of the proposed LC and TC systems. Overall, our LC method is fast but susceptible to long-term drift, while our TC method is more accurate but more computationally expensive; in both cases, the estimation results can be further improved by the trajectory optimization method, even in challenging conditions.
Having established the pros and cons of both LC and TC systems, the TC variant is chosen for further research as we focus on system performance.
In our subsequent work, we extend the fusion scheme to UWB and visual-inertial odometry (VIO).
Existing methods discard most of the UWB data and ignore the spatial-temporal offsets between sensors. Our approach, in contrast, addresses these issues by leveraging the propagated information readily available from the VIO pipeline and generalizing the UWB processing model. As a result, our method uses the UWB data more effectively: the time offset between measurements is compensated and all available measurements can be used. Experimental results show that our system consistently surpasses previous works in localizing the anchor position and reducing long-term odometry drift.
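One way to realize the time-offset compensation described above is to evaluate the propagated odometry at the exact ranging instant instead of pairing the range with the nearest (stale) keyframe. The helper below is a simplified, hypothetical sketch: it linearly interpolates positions only, whereas a full VIO pipeline propagates complete states.

```python
import numpy as np

def range_residual_at(t_uwb, vio_t, vio_pos, anchor, rng):
    """Residual of a UWB range stamped at `t_uwb` against the VIO
    trajectory, with the sensor time offset compensated by linearly
    interpolating the position at the ranging instant."""
    # per-axis linear interpolation of the trajectory at t_uwb
    p = np.array([np.interp(t_uwb, vio_t, vio_pos[:, k]) for k in range(3)])
    return np.linalg.norm(p - anchor) - rng
```

Because every range is evaluated at its own timestamp, no measurement has to be dropped for falling between odometry updates.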
Lastly, we explore UWB-aided methods for two important problems in multi-robot systems:
1) onboard localization, i.e. finding the position of each robot in a global frame, and
2) relative localization, i.e. finding the relative position and heading between robots.
Through the derivation of the FIM and its determinant, an analysis of unobservable configurations and their physical interpretation is provided. We then propose optimization-based methods to estimate the relative frame transformation accurately, even without initial guesses.
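To see how an optimization-based estimate can work with no initial guess, consider a planar toy version: a coarse grid over the unknown heading, a short gradient descent on the translation for each candidate, and keeping the lowest-cost pair. This is an illustrative sketch under assumed names and parameters; the thesis addresses the full problem, not this 2D toy.

```python
import numpy as np

def relative_transform_2d(pA, qB, d, n_yaw=72, iters=2000, lr=0.05):
    """Estimate the planar transform (translation t, heading yaw) of
    robot B's frame w.r.t. robot A's frame from inter-robot UWB ranges
    d_i = ||pA_i - (R(yaw) qB_i + t)||, without any initial guess."""
    best_cost, best_t, best_yaw = np.inf, None, None
    for yaw in np.linspace(0.0, 2 * np.pi, n_yaw, endpoint=False):
        c, s = np.cos(yaw), np.sin(yaw)
        qA = qB @ np.array([[c, -s], [s, c]]).T  # B positions rotated into A
        t = (pA - qA).mean(axis=0)               # centroid initialization
        for _ in range(iters):                   # gradient descent on t
            diff = pA - (qA + t)
            dist = np.linalg.norm(diff, axis=1)
            e = dist - d                         # range residuals
            t += lr * (e[:, None] * diff / dist[:, None]).mean(axis=0)
        cost = np.mean((np.linalg.norm(pA - (qA + t), axis=1) - d) ** 2)
        if cost < best_cost:
            best_cost, best_t, best_yaw = cost, t, yaw
    return best_t, best_yaw
```

The grid resolution bounds the heading accuracy; in practice such a coarse solution would seed a joint refinement over (t, yaw).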
These estimates are then continuously refined based on the robot's motion model and online data, while the onboard sensor data are tightly fused to correct the drift. Our method works without loop closures while requiring less exchanged data, making it both flexible and resource-efficient.
Moreover, practical issues ignored in previous works are addressed, such as rejecting inter-robot UWB ranging outliers and checking for singular configurations.
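A standard way to implement such outlier rejection is a normalized-residual gate on each inter-robot range; the snippet below is a generic sketch rather than the thesis's exact test, and `sigma` and `thresh` are assumed tuning parameters.

```python
import numpy as np

def gate_uwb_ranges(pred_dists, meas_ranges, sigma=0.1, thresh=3.0):
    """Keep an inter-robot UWB range only if its residual lies within
    `thresh` standard deviations of the distance predicted from the
    current state estimates (rejects NLoS-induced range spikes)."""
    normalized = np.abs(meas_ranges - pred_dists) / sigma
    return normalized < thresh   # boolean inlier mask
```

Gating against the predicted distance lets the fusion pipeline drop NLoS spikes before they corrupt the joint optimization.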
Simulations and experiments verify that our system, as a whole and in its individual parts, outperforms previous approaches, and that the theoretical results are in line with real-world observations. |
author2 |
Xie Lihua |
author_facet |
Xie Lihua Nguyen, Hoang Thien |
format |
Thesis-Doctor of Philosophy |
author |
Nguyen, Hoang Thien |
author_sort |
Nguyen, Hoang Thien |
title |
Ultra-wideband-aided localization for autonomous robots |
title_short |
Ultra-wideband-aided localization for autonomous robots |
title_full |
Ultra-wideband-aided localization for autonomous robots |
title_fullStr |
Ultra-wideband-aided localization for autonomous robots |
title_full_unstemmed |
Ultra-wideband-aided localization for autonomous robots |
title_sort |
ultra-wideband-aided localization for autonomous robots |
publisher |
Nanyang Technological University |
publishDate |
2023 |
url |
https://hdl.handle.net/10356/167947 https://doi.org/10.21979/N9/X39LEK |
_version_ |
1772827130628931584 |
spelling |
sg-ntu-dr.10356-167947 2023-07-04T17:03:45Z Ultra-wideband-aided localization for autonomous robots Nguyen, Hoang Thien Xie Lihua School of Electrical and Electronic Engineering elhxie@ntu.edu.sg Engineering::Electrical and electronic engineering::Computer hardware, software and systems Doctor of Philosophy 2023-05-21T05:39:25Z 2023-05-21T05:39:25Z 2023 Thesis-Doctor of Philosophy Nguyen, H. T. (2023). Ultra-wideband-aided localization for autonomous robots. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/167947 10.32657/10356/167947 en https://doi.org/10.21979/N9/X39LEK This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). application/pdf Nanyang Technological University |