2D OBJECT DETECTION AND TRACKING BASED ON CAMERA AND LIDAR DATA FOR AN AUTONOMOUS VEHICLE
Format: Final Project
Language: Indonesian
Online Access: https://digilib.itb.ac.id/gdl/view/49979
Institution: Institut Teknologi Bandung
Summary: Environmental perception is a fundamental function of an autonomous vehicle. It requires sensing devices from which the vehicle obtains crucial information about the surrounding environment, including detections and tracks of nearby objects. However, every sensor has its own limitations, so it would be naïve to fully trust data from an individual sensor. One proposed solution is to fuse data from multiple sensors so that the extracted information carries less uncertainty.
Our study uses two different and separate sensors to detect objects: a stereo vision camera and a LiDAR. The stereo vision camera extracts depth information for every captured pixel using two imaging sensors, while the LiDAR measures distance by illuminating targets with rotating laser beams and creates a digital representation of the environment. The stereo vision camera is less accurate and has a smaller field of view than the LiDAR, but it offers a higher data rate and richer resolution. We first preprocess the camera readings so that they share the same point-cloud data structure as the LiDAR. To obtain object detections, the point clouds from both sensors are grouped using a density-based clustering method (a brief sketch of this step follows the summary). These detections are then fused, and the dynamic state of each object is estimated using a particle filter.
The results show that our detection and tracking system achieves its smallest Root Mean Square Error (RMSE) of 0.046 m, 0.012 m, 0.007 m, 0.141 m/s, and 0.099 m/s² when estimating the x position, y position, radius, speed, and acceleration of a tracked object, respectively. The sensor fusion method reduced most of the errors when estimating observable variables such as position and radius. The system was also installed on a golf cart and successfully detected and tracked a moving pedestrian and a motorbike while driving around the Bandung State Polytechnic complex.
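Implementation details are not given in this record, so below is a minimal sketch of the detection step under stated assumptions: scikit-learn's DBSCAN stands in for the density-based clustering method, the input is a 2D (x, y) point cloud already merged from the camera and LiDAR, and the function name detect_objects together with the eps and min_samples values is illustrative rather than the authors' code.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def detect_objects(points_xy, eps=0.5, min_samples=5):
        # Group a 2D point cloud into object detections. Each detection is
        # summarised by its centroid (x, y) and an enclosing radius, the same
        # observables the tracker later estimates.
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xy)
        detections = []
        for label in set(labels):
            if label == -1:  # DBSCAN labels noise points as -1; skip them
                continue
            cluster = points_xy[labels == label]
            centroid = cluster.mean(axis=0)
            radius = np.linalg.norm(cluster - centroid, axis=1).max()
            detections.append({"x": float(centroid[0]),
                               "y": float(centroid[1]),
                               "radius": float(radius)})
        return detections

    if __name__ == "__main__":
        # Synthetic merged point cloud with two well-separated objects.
        rng = np.random.default_rng(0)
        blob_a = rng.normal(loc=[2.0, 1.0], scale=0.2, size=(40, 2))
        blob_b = rng.normal(loc=[8.0, 4.0], scale=0.3, size=(40, 2))
        cloud = np.vstack([blob_a, blob_b])
        for det in detect_objects(cloud):
            print(det)

Running the example prints two detections whose centroids and radii match the two synthetic blobs; eps controls how far apart points may be while still being grouped into the same object.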