Extrinsic calibration between multiple 3D LiDARs for autonomous robots

Autonomous vehicles and intelligent transportation have become increasingly important in recent years. A single LiDAR has a limited field of view (FOV) and may leave blind spots; mounting multiple sensors on one system offsets these drawbacks and improves the system's perceptual capability. A practical approach is therefore to fuse the information from different LiDARs, which requires extrinsic calibration between them beforehand. Calibration recovers the spatial relationship between two sensors, namely the rotation R and translation t, and an accurate calibration is essential for good fusion. Most recent methods target short-baseline configurations. In urban intelligent transportation systems, however, LiDARs are often placed far apart, and the large baseline produces very different viewpoints under which previous methods fail. This thesis proposes a new calibration method that is independent of the baseline length. A sphere target is designed for the calibration, and the extrinsics are solved with an ICP-style registration with known correspondences. A sphere is advantageous because it is visible from all directions and its center remains fixed regardless of viewpoint; with the sphere target, no prior assumption is needed beyond the necessary overlapping FOV between the LiDARs. The method works under both short- and long-baseline conditions. Simulations with the sphere target are conducted: the rotation and translation between LiDARs are estimated with high accuracy, and noise-robustness tests show strong tolerance to input error. Quantitative and qualitative evaluation shows that the final error stays below 0.01 m in translation and 0.1 degree in rotation at a detection distance of 30 m with measurement noise of 0.03 m standard deviation.
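This record contains only the abstract, not the thesis code, but the registration step it describes — aligning sphere-center observations from two LiDARs when the point correspondences are known — reduces to the classical closed-form rigid alignment (Kabsch), which ICP with known correspondences solves in one step. The sketch below is a minimal illustration under that reading; the function names and the synthetic data are hypothetical and are not taken from the thesis.

```python
# Minimal sketch (not the thesis implementation): recovering the extrinsics R, t
# between two LiDARs from matched sphere-center observations. All names and the
# synthetic data below are hypothetical.
import numpy as np


def fit_sphere_center(points):
    """Estimate a sphere's center from LiDAR returns on its surface.

    Linear least-squares sphere fit: ||p - c||^2 = r^2 rearranges to
    2 p.c + (r^2 - ||c||^2) = ||p||^2, which is linear in the unknowns.
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((p.shape[0], 1))])
    b = np.sum(p * p, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]  # center c; radius would be sqrt(sol[3] + ||c||^2)


def align_known_correspondence(src, dst):
    """Closed-form rigid alignment (Kabsch) of matched 3D points.

    Returns R, t such that dst_i ~ R @ src_i + t.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t


if __name__ == "__main__":
    # Synthetic check: sphere centers seen by LiDAR A, re-observed by LiDAR B.
    rng = np.random.default_rng(0)
    centers_a = rng.uniform(-15.0, 15.0, size=(10, 3))   # target placements within ~30 m
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))         # orthonormal ground-truth rotation
    R_true = Q if np.linalg.det(Q) > 0 else Q * np.array([-1.0, 1.0, 1.0])
    t_true = np.array([5.0, -2.0, 0.3])
    centers_b = centers_a @ R_true.T + t_true + rng.normal(0.0, 0.03, (10, 3))  # 0.03 m noise
    R_est, t_est = align_known_correspondence(centers_a, centers_b)
    cos_angle = np.clip((np.trace(R_est.T @ R_true) - 1.0) / 2.0, -1.0, 1.0)
    print("translation error [m]:", np.linalg.norm(t_est - t_true))
    print("rotation error [deg]:", np.degrees(np.arccos(cos_angle)))
```

With sphere centers collected over several target placements, the same closed-form solve yields R and t directly; the 0.01 m / 0.1 degree accuracy reported in the abstract refers to the thesis's own simulations, not to this sketch.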


Bibliographic Details
Main Author: Lyu, Qiyang
Other Authors: Wang Dan Wei (School of Electrical and Electronic Engineering)
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University, 2021
Degree: Master of Science (Computer Control and Automation)
Subjects: Engineering::Electrical and electronic engineering
Citation: Lyu, Q. (2021). Extrinsic calibration between multiple 3D LiDARs for autonomous robots. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/152597
Online Access: https://hdl.handle.net/10356/152597
Institution: Nanyang Technological University