One-step method to calibrate a 3D LiDAR and a monocular camera
Autonomous mobile robots rely on heterogeneous sensors to enhance their perception capabilities, because different sensors provide complementary characteristics. For example, cameras and Light Detection and Ranging (LiDAR) sensors are commonly used together. To deeply fuse the i...
| Main Author: | Zhang, Ran |
|---|---|
| Other Authors: | Wang Dan Wei |
| Format: | Theses and Dissertations |
| Language: | English |
| Published: | 2019 |
| Subjects: | DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics |
| Online Access: | http://hdl.handle.net/10356/78463 |
| Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-78463 |
record_format |
dspace |
spelling |
sg-ntu-dr.10356-78463 2023-07-04T16:48:21Z One-step method to calibrate a 3D LiDAR and a monocular camera Zhang, Ran Wang Dan Wei School of Electrical and Electronic Engineering DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics Autonomous mobile robots rely on heterogeneous sensors to enhance their perception capabilities, because different sensors provide complementary characteristics. For example, cameras and Light Detection and Ranging (LiDAR) sensors are commonly used together. To deeply fuse the information collected by different sensors, a precise extrinsic calibration is necessary, from which the transformation matrix (rotation and translation) between the sensor frames is obtained. However, to calibrate a thermal camera and a sparse 3D LiDAR, the existing two-step method requires a visual camera to assist the process. In this thesis, a one-step method is proposed to calibrate a thermal camera and a sparse 3D LiDAR. The proposed method completely removes the visual camera required by the existing two-step method while maintaining calibration accuracy; the calibration process is greatly simplified and its efficiency is improved. In addition, a pre-processing step is proposed to calibrate a 2D rotating LiDAR and a monocular color camera with an off-the-shelf KITTI calibration toolbox. The 2D rotating LiDAR produces a very dense point cloud, which prevents the toolbox from obtaining a correct result directly. The method shows that, after pre-processing the original point cloud, the KITTI calibration toolbox is fully applicable to the calibration between the monocular camera and the 2D rotating LiDAR. The influence of different factors on the calibration results is also compared and discussed. Master of Science (Computer Control and Automation) 2019-06-20T06:05:22Z 2019-06-20T06:05:22Z 2019 Thesis http://hdl.handle.net/10356/78463 en 68 p. application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics |
spellingShingle |
DRNTU::Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics Zhang, Ran One-step method to calibrate a 3D LiDAR and a monocular camera |
description |
Autonomous mobile robots rely on heterogeneous sensors to enhance their perception capabilities, because different sensors provide complementary characteristics. For example, cameras and Light Detection and Ranging (LiDAR) sensors are commonly used together. To deeply fuse the information collected by different sensors, a precise extrinsic calibration is necessary, from which the transformation matrix (rotation and translation) between the sensor frames is obtained. However, to calibrate a thermal camera and a sparse 3D LiDAR, the existing two-step method requires a visual camera to assist the process. In this thesis, a one-step method is proposed to calibrate a thermal camera and a sparse 3D LiDAR. The proposed method completely removes the visual camera required by the existing two-step method while maintaining calibration accuracy; the calibration process is greatly simplified and its efficiency is improved. In addition, a pre-processing step is proposed to calibrate a 2D rotating LiDAR and a monocular color camera with an off-the-shelf KITTI calibration toolbox. The 2D rotating LiDAR produces a very dense point cloud, which prevents the toolbox from obtaining a correct result directly. The method shows that, after pre-processing the original point cloud, the KITTI calibration toolbox is fully applicable to the calibration between the monocular camera and the 2D rotating LiDAR. The influence of different factors on the calibration results is also compared and discussed. |
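The extrinsic transformation mentioned in the abstract can be illustrated with a minimal sketch: given an assumed rotation R and translation t from the LiDAR frame to the camera frame, plus an assumed pinhole intrinsic matrix K, LiDAR points are mapped into the camera frame and projected onto the image plane. This is only an illustration of the underlying geometry, not the calibration method proposed in the thesis; the function name and all numeric values below are placeholders.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Project Nx3 LiDAR points to pixel coordinates using extrinsics [R|t] and intrinsics K."""
    # Transform points from the LiDAR frame into the camera frame.
    points_cam = points_lidar @ R.T + t
    # Keep only points in front of the camera (positive depth along the optical axis).
    points_cam = points_cam[points_cam[:, 2] > 0]
    # Pinhole projection: homogeneous pixel coordinates, then normalise by depth.
    pixels_h = points_cam @ K.T
    pixels = pixels_h[:, :2] / pixels_h[:, 2:3]
    return pixels, points_cam[:, 2]

# Placeholder intrinsics and extrinsics, for illustration only.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                    # assumed LiDAR-to-camera rotation
t = np.array([0.1, -0.05, 0.2])  # assumed LiDAR-to-camera translation (metres)

# Sample points, already expressed with a camera-style axis convention (z forward).
points = np.array([[0.5, 0.2, 5.0],
                   [-1.0, 0.1, 10.0]])
uv, depth = project_lidar_to_image(points, R, t, K)
print(uv)     # pixel coordinates of each projected point
print(depth)  # corresponding depths in metres
```

Points with non-positive depth are discarded before projection, since they lie behind the camera and would otherwise yield meaningless pixel coordinates.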
author2 |
Wang Dan Wei |
author_facet |
Wang Dan Wei Zhang, Ran |
format |
Theses and Dissertations |
author |
Zhang, Ran |
author_sort |
Zhang, Ran |
title |
One-step method to calibrate a 3D LiDAR and a monocular camera |
title_short |
One-step method to calibrate a 3D LiDAR and a monocular camera |
title_full |
One-step method to calibrate a 3D LiDAR and a monocular camera |
title_fullStr |
One-step method to calibrate a 3D LiDAR and a monocular camera |
title_full_unstemmed |
One-step method to calibrate a 3D LiDAR and a monocular camera |
title_sort |
one-step method to calibrate a 3d lidar and a monocular camera |
publishDate |
2019 |
url |
http://hdl.handle.net/10356/78463 |
_version_ |
1772826989749600256 |