Online extrinsic calibration of multiple long baseline ranging sensors

Bibliographic Details
Main Author: Yan, Qiao
Other Authors: Wang, Dan Wei
Format: Thesis-Master by Research
Language: English
Published: Nanyang Technological University 2022
Online Access: https://hdl.handle.net/10356/155836
Institution: Nanyang Technological University
Description
Summary: With the rapid development of autonomous vehicles and vehicle-to-everything (V2X) applications, ranging sensors, typically RGBD sensors and LiDARs, are widely deployed as perception tools in smart cities. They are commonly used for object detection and tracking, trajectory estimation, and collision avoidance. A single RGBD sensor or LiDAR has a limited field of view and thus leaves many blind spots. To fully cover the surroundings and enhance the perception capability, it is necessary to fuse the information from multiple ranging sensors, and this fusion requires accurate and robust extrinsic calibration. Extrinsic calibration finds the extrinsic parameters that describe the relative pose between two sensors: a rotation matrix R and a translation vector t, such that a point p_A in one sensor frame maps to p_B = R p_A + t in the other. Current calibration solutions mainly target short-baseline scenarios; most are offline methods that require a calibration target and are usually conducted indoors. In intelligent transportation systems, however, RGBD sensors and LiDARs are typically deployed far from each other with varying perspectives, so most existing methods are impractical and inconvenient for V2X, where baselines and viewpoint differences are large. To address these problems, two methods are proposed for the extrinsic calibration of multiple RGBD sensors and LiDARs: LB-RGBD2RGBD-Calib and Object4Calib.

LB-RGBD2RGBD-Calib (Long Baseline RGBD to RGBD Calibration) performs online extrinsic calibration between multiple long-baseline RGBD sensors, exploiting the advantages of both the color images and the depth information. First, image features are extracted from the texture information and matched to build initial correspondences between the RGBD sensors, with outliers removed by geometric constraints. Second, the depth information is refined by accumulating multiple frames to smooth out noise; the refined depth is then used in a PnP RANSAC step to remove remaining outliers and obtain coarse extrinsic parameters. Third, the extrinsic parameters are refined by Iterative Closest Point (ICP) and edge alignment. Both quantitative and qualitative experiments demonstrate that LB-RGBD2RGBD-Calib is accurate: the reprojection error is around 2.01 pixels, and the point-cloud RMSE is 0.35 m and 0.38 m for ICP and edge alignment, respectively.

Object4Calib solves the extrinsic calibration of multiple LiDARs with a long baseline and a large viewpoint difference. Its main novelty is to use easy-to-obtain objects in the traffic scene, namely the vehicles, for calibration; the key insight is that the 3D bounding-box centers of the vehicles are viewpoint-invariant, so an object center is a good common feature across multiple LiDARs in V2X. Object4Calib consists of two steps: exhaustive search and global ICP refinement. In the coarse step, an exhaustive search strategy finds the optimal correspondence between the center points observed by different LiDARs. In the refinement step, ICP is carried out with the coarse parameters as the initial value. Both quantitative and qualitative experiments in different simulated and real environments demonstrate that Object4Calib is robust and accurate: even for a setup where the translation and rotation between two LiDARs exceed 30 m and 90°, Object4Calib achieves successful calibration.
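To make the coarse step of LB-RGBD2RGBD-Calib concrete, here is a minimal Python sketch, assuming OpenCV and NumPy; the function and variable names (coarse_extrinsics, pts_a_2d, depth_a, and so on) are illustrative, not taken from the thesis. Matched pixels from sensor A are back-projected into 3D with the refined depth and solved against sensor B's pixels with PnP RANSAC.

```python
# Hypothetical sketch of the coarse stage: back-project A's matched pixels
# with the (temporally smoothed) depth, then estimate the A->B pose via
# PnP RANSAC.
import cv2
import numpy as np

def coarse_extrinsics(pts_a_2d, depth_a, K_a, pts_b_2d, K_b):
    """pts_a_2d, pts_b_2d: matched pixels (N x 2) with gross outliers already
    removed by geometric constraints; depth_a: refined depth map of sensor A;
    K_a, K_b: 3x3 camera intrinsics."""
    fx, fy = K_a[0, 0], K_a[1, 1]
    cx, cy = K_a[0, 2], K_a[1, 2]
    # Depth lookup: rows are y coordinates, columns are x coordinates.
    z = depth_a[pts_a_2d[:, 1].astype(int), pts_a_2d[:, 0].astype(int)]
    valid = z > 0                                  # keep pixels with valid depth
    x = (pts_a_2d[valid, 0] - cx) * z[valid] / fx
    y = (pts_a_2d[valid, 1] - cy) * z[valid] / fy
    obj_pts = np.column_stack([x, y, z[valid]]).astype(np.float32)
    img_pts = pts_b_2d[valid].astype(np.float32)

    # PnP + RANSAC rejects the remaining outlier matches and returns the pose
    # of A's 3D points in B's camera frame, i.e. the coarse extrinsics.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K_b, None,
        iterationsCount=200, reprojectionError=3.0)
    if not ok:
        raise RuntimeError("PnP RANSAC failed to find a pose")
    R, _ = cv2.Rodrigues(rvec)                     # rotation vector -> matrix
    return R, tvec.reshape(3), inliers
```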
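The refinement stage can be sketched with an off-the-shelf point-to-point ICP, here via Open3D; the additional edge-alignment refinement from the thesis is not reproduced. This assumes both sensors provide metric point clouds, and the 0.5 m correspondence threshold is an illustrative choice, not a value from the thesis.

```python
# Hypothetical sketch of the refinement stage: polish the coarse PnP result
# by aligning sensor A's point cloud onto sensor B's with ICP.
import numpy as np
import open3d as o3d

def refine_with_icp(cloud_a, cloud_b, R_coarse, t_coarse, max_dist=0.5):
    """cloud_a, cloud_b: (N x 3) point clouds from the two sensors;
    R_coarse, t_coarse: coarse extrinsics from the PnP stage."""
    src = o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(cloud_a)
    dst = o3d.geometry.PointCloud()
    dst.points = o3d.utility.Vector3dVector(cloud_b)

    init = np.eye(4)                   # 4x4 initial guess from the coarse stage
    init[:3, :3] = R_coarse
    init[:3, 3] = t_coarse

    result = o3d.pipelines.registration.registration_icp(
        src, dst, max_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation       # refined 4x4 extrinsic matrix
```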
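Likewise, the core idea of Object4Calib's coarse step can be sketched as a brute-force search: try correspondence triplets between the vehicle-center sets seen by two LiDARs, fit a rigid transform to each hypothesis with the standard Kabsch/SVD method, and keep the hypothesis that explains the most centers. The scoring and the absence of pruning here are simplified assumptions, not the thesis's exact algorithm.

```python
# Hypothetical sketch of the exhaustive search over vehicle-center
# correspondences between two LiDARs.
import itertools
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto Q (N x 3)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                 # D guards against reflections
    return R, cQ - R @ cP

def search_centers(centers_a, centers_b, inlier_dist=1.0):
    """Exhaustively try triplet correspondences between the two center sets."""
    best = (0, None, None)             # (inlier count, R, t)
    for ia in itertools.combinations(range(len(centers_a)), 3):
        for ib in itertools.permutations(range(len(centers_b)), 3):
            R, t = kabsch(centers_a[list(ia)], centers_b[list(ib)])
            # Score: how many of A's centers land near some B center.
            moved = centers_a @ R.T + t
            d = np.linalg.norm(moved[:, None] - centers_b[None], axis=2)
            inliers = int((d.min(axis=1) < inlier_dist).sum())
            if inliers > best[0]:
                best = (inliers, R, t)
    return best
```

The winning (R, t) would then serve as the initial value for the global ICP refinement, as in the refinement step described in the abstract.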