Sensor fusion for object detection under adverse weather
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2023
Online Access: https://hdl.handle.net/10356/172746
Institution: Nanyang Technological University
Summary: Autonomous vehicles (AVs) have emerged as one of the most rapidly advancing technologies in transportation. The innovative strides in AV technology, from self-driving cars to autonomous delivery drones, have captured widespread attention and are poised to reshape the future of mobility. Traditional AV camera-based object detectors fail in complicated scenarios where the lighting and weather conditions are not ideal. In recent years, there has been a rise in the use of deep learning methods relying on LiDARs and RADARs, given their long history of achieving state-of-the-art performance across different types of applications. Despite the rapid development of deep neural networks, object detection remains a persistent challenge, as some sensors experience poor perception under adverse weather.
Severe weather conditions can still impede LiDAR's performance despite its excellent capabilities: snow and water particles cause the targets detected by the LiDAR sensor to become partially occluded. To address this limitation, a RADAR sensor is introduced to compensate for LiDAR's shortcomings. In this project, a sensor fusion approach is proposed to optimize object detection performance in rainy, snowy, or foggy weather. In addition, the performance of object detectors that use only one of the proposed sensors will also be compared.
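The record does not spell out the fusion architecture, so the following is only a minimal late-fusion sketch under assumed conventions: bird's-eye-view boxes from separate LiDAR and RADAR detectors are matched by overlap, agreeing detections are merged, and unmatched ones are passed through. All names, thresholds, and the box format are illustrative assumptions rather than the project's actual method.

```python
# Hypothetical late-fusion sketch (not the project's method): merge
# bird's-eye-view detections from a LiDAR detector and a RADAR detector
# by matching overlapping boxes and combining matched pairs.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float      # box centre (m), ego frame
    y: float
    w: float      # box width (m)
    l: float      # box length (m)
    score: float  # detector confidence in [0, 1]

def iou_bev(a: Detection, b: Detection) -> float:
    """Axis-aligned IoU of two bird's-eye-view boxes."""
    ax1, ax2 = a.x - a.w / 2, a.x + a.w / 2
    ay1, ay2 = a.y - a.l / 2, a.y + a.l / 2
    bx1, bx2 = b.x - b.w / 2, b.x + b.w / 2
    by1, by2 = b.y - b.l / 2, b.y + b.l / 2
    ix = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = a.w * a.l + b.w * b.l - inter
    return inter / union if union > 0 else 0.0

def fuse(lidar_dets, radar_dets, iou_thr=0.3):
    """Merge matched pairs; pass through sensor-unique detections."""
    fused, used = [], set()
    for ld in lidar_dets:
        best_j, best_iou = None, iou_thr
        for j, rd in enumerate(radar_dets):
            if j not in used and iou_bev(ld, rd) >= best_iou:
                best_j, best_iou = j, iou_bev(ld, rd)
        if best_j is not None:
            rd = radar_dets[best_j]
            used.add(best_j)
            # Both sensors agree: average position, keep the higher confidence.
            fused.append(Detection((ld.x + rd.x) / 2, (ld.y + rd.y) / 2,
                                   ld.w, ld.l, max(ld.score, rd.score)))
        else:
            fused.append(ld)  # LiDAR-only detection
    fused += [rd for j, rd in enumerate(radar_dets) if j not in used]
    return fused

if __name__ == "__main__":
    lidar = [Detection(10.0, 2.0, 1.8, 4.5, 0.55)]
    radar = [Detection(10.3, 2.1, 1.8, 4.5, 0.70),
             Detection(25.0, -3.0, 1.8, 4.5, 0.60)]
    for d in fuse(lidar, radar):
        print(d)
```

In a scheme like this, the RADAR-only branch is what keeps targets alive when snow or spray partially occludes the LiDAR returns, which is the shortcoming the fusion is meant to cover.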
Furthermore, catastrophic forgetting is a baffling phenomenon observed in neural networks: a trained network tends to lose its ability to make predictions based on its previous training when it is fine-tuned with new data. Therefore, an investigation will be conducted into the impact of catastrophic forgetting under adverse weather conditions for both LiDAR and RADAR sensors, and a comparative assessment will be made to formulate probable resolutions.
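Purely as an illustration of the effect being investigated, and not the project's own experiment, the sketch below trains a tiny classifier on one synthetic domain ("clear weather"), fine-tunes it only on a shifted domain ("adverse weather"), and re-evaluates it on the original domain, where accuracy collapses. The data, model, and domain shift are all assumed for demonstration.

```python
# Illustrative catastrophic-forgetting sketch with synthetic data: fine-tuning
# on a shifted domain without replaying the original data degrades accuracy
# on the original domain. The setup is an assumption, not the project's.
import numpy as np

rng = np.random.default_rng(0)

def make_domain(shift, n=500):
    """Two Gaussian classes; `shift` mimics a domain change such as weather."""
    x0 = rng.normal(loc=[0.0 + shift, 0.0], scale=0.7, size=(n, 2))
    x1 = rng.normal(loc=[2.0 + shift, 2.0], scale=0.7, size=(n, 2))
    return np.vstack([x0, x1]), np.concatenate([np.zeros(n), np.ones(n)])

def train(x, y, w=None, epochs=200, lr=0.1):
    """Logistic regression trained by plain gradient descent."""
    w = np.zeros(3) if w is None else w.copy()
    xb = np.hstack([x, np.ones((len(x), 1))])   # add bias column
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-xb @ w))       # sigmoid
        w -= lr * xb.T @ (p - y) / len(y)       # gradient step
    return w

def accuracy(w, x, y):
    xb = np.hstack([x, np.ones((len(x), 1))])
    return float(np.mean((xb @ w > 0) == y))

clear_x, clear_y = make_domain(shift=0.0)   # "clear weather" data
fog_x, fog_y = make_domain(shift=4.0)       # "adverse weather" data

w = train(clear_x, clear_y)
print("clear-weather accuracy after initial training:", accuracy(w, clear_x, clear_y))

w = train(fog_x, fog_y, w=w)                # naive fine-tuning, no old data replayed
print("adverse-weather accuracy after fine-tuning:  ", accuracy(w, fog_x, fog_y))
print("clear-weather accuracy after fine-tuning:    ", accuracy(w, clear_x, clear_y))
```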