Multi-sensor calibration for autonomous container prime mover
Main Author:
Other Authors:
Format: Thesis (Master by Coursework)
Language: English
Published: Nanyang Technological University, 2023
Subjects:
Online Access: https://hdl.handle.net/10356/165016
Institution: Nanyang Technological University
Summary: Self-driving vehicles are becoming increasingly popular, and their deployment in container ports is an emerging trend. Intelligent operation requires a multi-sensor system. In this dissertation, we propose a multi-sensor system mounted on a car, carrying several cameras and LiDARs, to simulate the operation of autonomous container prime movers at a dock. Because sensor fusion is vital for building a coherent picture of the surrounding environment, the dissertation focuses on calibrating the whole multi-sensor system, covering both multi-camera calibration and RGB camera-LiDAR calibration. Both target-based and targetless methods are used; we compare their strengths and the scenarios in which each is appropriate. The targetless method sometimes yields unstable results, which we mitigate through multi-scene calibration. In some configurations the sensors do not share a common field of view, so we propose chaining the transformations through an intermediate sensor. In addition, calibration between a blind-spot LiDAR and a camera has rarely been addressed before, and we extend the generic target-based method to handle it. A qualitative analysis of the system's calibration results is carried out, and the sensor fusion results show that the obtained calibration parameters are accurate. Finally, we compile a calibration tutorial and share our sample experimental dataset on GitHub for further research. The tutorial and dataset are available at https://github.com/ZyueRemi/Tutorial_Lidar_camera_calibration.
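The transformation-chaining idea from the summary (relating two sensors that share no common field of view by going through an intermediate sensor) can be sketched with homogeneous transforms. This is a minimal illustration only: the sensor names and numeric extrinsics below are assumptions for the example, not values from the thesis.

```python
import numpy as np

def make_transform(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibrated extrinsics (illustrative values):
# front camera -> intermediate camera, and intermediate camera -> LiDAR.
T_mid_from_cam = make_transform(np.eye(3), np.array([0.10, 0.0, 0.0]))
T_lidar_from_mid = make_transform(np.eye(3), np.array([0.0, 0.25, 0.05]))

# Chaining yields camera -> LiDAR even though these two sensors were never
# calibrated against each other directly (no shared field of view needed).
T_lidar_from_cam = T_lidar_from_mid @ T_mid_from_cam

# Map a point expressed in the camera frame into the LiDAR frame.
point_cam = np.array([1.0, 2.0, 3.0, 1.0])   # homogeneous coordinates
point_lidar = T_lidar_from_cam @ point_cam
```

Each extrinsic in the chain would come from a pairwise calibration (target-based or targetless); composing them propagates both transforms and their errors, which is why the thesis's qualitative fusion check on the final result matters.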