Online extrinsic calibration between a 3D LiDAR and a monocular camera

For autonomous mobile robots, multiple sensors are widely adopted to enhance the robot's perception ability. 3D LiDARs and cameras are the most commonly used sensors. A 3D LiDAR provides data in the form of unordered 3D coordinates of objects in the environment, while a monocular camera provides ordered color or intensity images. These two data types complement each other, and when fused they provide a much better understanding of the scene. The first step in fusing data from a 3D LiDAR and a camera is to obtain accurate calibration parameters between the two. Calibration is of two types: intrinsic and extrinsic. While the intrinsic calibration of each sensor is always done separately, this thesis focuses on the extrinsic calibration between the two sensors, i.e., determining the rotation matrix and translation vector between them. Extrinsic calibration can generally be divided into two types: offline and online. Offline methods provide good accuracy; however, they require a specific calibration target and human intervention. Online extrinsic calibration attempts to perform calibration in any unknown scene without specific targets, but its accuracy is not as good as that of offline methods. Online calibration is thus more convenient, but its accuracy needs to be improved. This thesis reviews existing methods for offline and online extrinsic calibration between a LiDAR and a camera, and proposes a new method for online extrinsic calibration between a 3D LiDAR and a monocular camera.

The idea behind the new method is to leverage deep learning to learn the extrinsic parameters from the image and point-cloud data obtained from the two sensors. The network takes the raw image and raw point cloud as input and outputs the rotation and translation parameters. The utility of the proposed method is shown through extensive experiments on the KITTI360 dataset. The proposed solution has two design variations, both of which are described and extensively tested. For miscalibrations in the range of ±0.2 m translation per axis and ±10° rotation per axis, the first design variation achieves a mean rotation error of 0.56° and a mean translation error of 4.87 cm; the second achieves a mean rotation error of 0.85° and a mean translation error of 3.97 cm. Finally, the proposed solution is adapted for a generalized use case, and its utility is demonstrated on data collected with real sensors, namely a Livox Horizon 3D LiDAR and a ZED camera.
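
The extrinsic parameters the abstract refers to (a rotation matrix R and translation vector t) map LiDAR points into the camera frame, after which the camera intrinsics project them onto the image plane. A minimal NumPy sketch of this standard pinhole projection, with an illustrative intrinsic matrix K that is not taken from the thesis:

```python
import numpy as np

def project_lidar_to_image(points, R, t, K):
    """Project N x 3 LiDAR points into pixel coordinates.

    points : (N, 3) LiDAR-frame coordinates
    R      : (3, 3) rotation matrix (LiDAR -> camera)
    t      : (3,)   translation vector (LiDAR -> camera)
    K      : (3, 3) camera intrinsic matrix
    """
    cam = points @ R.T + t           # rigid transform into the camera frame
    cam = cam[cam[:, 2] > 0]         # keep only points in front of the camera
    pix = cam @ K.T                  # apply pinhole intrinsics
    return pix[:, :2] / pix[:, 2:3]  # perspective divide -> (u, v) pixels

# Illustrative values only: identity extrinsics and a made-up intrinsic matrix.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])
uv = project_lidar_to_image(np.array([[1.0, 0.0, 5.0]]),
                            np.eye(3), np.zeros(3), K)
```

A miscalibrated R or t shifts every projected point, which is why calibration error is typically reported, as in the abstract, as a mean rotation error in degrees and a mean translation error in centimeters.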


Bibliographic Details
Main Author: Mharolkar, Sanat Rajesh
Other Authors: Wang Dan Wei (School of Electrical and Electronic Engineering)
Format: Thesis-Master by Research (Master of Engineering)
Language: English
Published: Nanyang Technological University, 2022
DOI: 10.32657/10356/155025
License: Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)
Subjects: Engineering::Electrical and electronic engineering::Control and instrumentation::Robotics
Online Access:https://hdl.handle.net/10356/155025
Institution: Nanyang Technological University