Sensor fusion for long-range object detection under rainy conditions


Bibliographic Details
Main Author: Chen, Junhao
Other Authors: Soong Boon Hee
Format: Thesis-Master by Coursework
Language:English
Published: Nanyang Technological University 2024
Subjects:
Online Access:https://hdl.handle.net/10356/173392
Institution: Nanyang Technological University
Description
Summary: Recently, demand for 3D object detection has been increasing. However, due to the lack of public LiDAR datasets collected under severe weather, existing works focus mainly on performance under sunny or clear conditions and seldom address a key drawback of LiDAR: sparsity. Because LiDAR signals suffer reflection, scattering, and energy loss as they travel, long-range objects are difficult to detect, and this effect is worsened under rainy conditions by the disturbance from rain droplets. Existing systems, designed for the common situation with a higher density of LiDAR points, cannot detect long-range objects under rain. To address this problem, we first generated a simulated dataset, RAINY KITTI, to compensate for the lack of public rainy-weather datasets. The dataset is derived from KITTI; instead of adding noise directly, it is simulated with the Marshall-Palmer rain-droplet size distribution and Mie scattering, which follow the underlying physics, at different rain intensities (2, 30, 50, and 100 mm/h). Detection performance degrades when training and testing on these datasets. Secondly, a novel system using decision-level fusion is developed. It introduces a Transformer-based completion network, guided by the 2D detection result, to densify the point cloud before estimating the 3D bounding box. Instead of estimating the result from the low-density input, the densified input combines the original points with the points generated by the completion network. Compared with the conventional baseline, Frustum PointNets, our work improves detection performance (3D AP) for long-range objects under rainy conditions by 10.3% and over all distances by 16.8%. Under clear conditions, the results of this work are similar to those of conventional works.
The results demonstrate the effectiveness of the completion network for detecting long-range objects under rainy conditions, and its robustness for detecting objects under sunny conditions at different distances.
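To illustrate the physically based simulation the summary describes, the sketch below samples a Marshall-Palmer drop-size distribution N(D) = N0·exp(−ΛD) with Λ = 4.1·R^−0.21 and applies the resulting two-way Beer-Lambert attenuation to LiDAR returns. This is a simplified stand-in, not the thesis's actual pipeline: it replaces the full Mie computation with the geometric-optics limit Q_ext ≈ 2 (valid when drops are much larger than the LiDAR wavelength), and all function names, the N0 = 8000 m⁻³ mm⁻¹ constant, and the detection threshold are illustrative assumptions.

```python
import numpy as np


def marshall_palmer_lambda(rain_rate_mm_h):
    """Marshall-Palmer slope parameter Lambda [1/mm] for rain rate R [mm/h]."""
    return 4.1 * rain_rate_mm_h ** -0.21


def extinction_coeff(rain_rate_mm_h, q_ext=2.0, n0=8000.0):
    """Approximate optical extinction coefficient alpha [1/m] in rain.

    Integrates Q_ext * (pi * D^2 / 4) over N(D) = N0 * exp(-Lambda * D),
    using the closed form  int_0^inf D^2 exp(-Lambda D) dD = 2 / Lambda^3.
    N0 is in m^-3 mm^-1 and D in mm, so 1e-6 converts mm^2 -> m^2.
    Q_ext ~ 2 is the large-particle (geometric-optics) limit of Mie theory.
    """
    lam = marshall_palmer_lambda(rain_rate_mm_h)
    return q_ext * (np.pi / 4.0) * n0 * 1e-6 * 2.0 / lam ** 3


def attenuate_points(points, intensities, rain_rate_mm_h, min_intensity=0.05):
    """Apply two-way attenuation exp(-2 * alpha * r) to LiDAR returns.

    Points whose attenuated intensity drops below `min_intensity` are
    discarded, mimicking how rain thins out long-range returns.
    """
    alpha = extinction_coeff(rain_rate_mm_h)
    ranges = np.linalg.norm(points[:, :3], axis=1)
    new_intensity = intensities * np.exp(-2.0 * alpha * ranges)
    keep = new_intensity >= min_intensity
    return points[keep], new_intensity[keep]


# Example: at 100 mm/h, a return at 300 m is lost while nearer ones survive.
pts = np.array([[10.0, 0.0, 0.0], [100.0, 0.0, 0.0], [300.0, 0.0, 0.0]])
intens = np.full(3, 0.5)
kept_pts, kept_intens = attenuate_points(pts, intens, rain_rate_mm_h=100.0)
```

Note how the exponential two-way attenuation makes loss grow quickly with range, which matches the summary's observation that long-range detection suffers most under rain.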