2D OBJECT DETECTION AND TRACKING BASED ON CAMERA AND LIDAR DATA FOR AN AUTONOMOUS VEHICLE

Bibliographic Details
Main Author: Nur Ramadhani, Arlin
Format: Final Project
Language: Indonesia
Online Access:https://digilib.itb.ac.id/gdl/view/49978
Institution: Institut Teknologi Bandung
Record ID: id-itb.:49978 (updated 2020-09-21)
Subjects: Object, detection, tracking, sensor fusion, stereo vision, LiDAR

Description: Environmental perception is a fundamental function of an autonomous vehicle: it requires sensing devices from which the vehicle obtains crucial information about the surrounding environment, including detections and tracks of nearby objects. Every sensor has its own limitations, however, so it would be naïve to fully trust data from a single sensor. One proposed solution is to fuse data from multiple sensors so that the extracted information carries less uncertainty. Our study uses two separate sensors to detect objects: a stereo vision camera and a LiDAR. A stereo vision camera extracts depth information for every captured pixel using two imaging sensors; a LiDAR measures distance by illuminating targets with rotating laser beams and creates a digital representation of the environment. The stereo vision camera is less accurate and has a smaller field of view than the LiDAR, but offers a higher data rate and richer resolution. We first preprocess the camera readings so that they share the same data structure as the LiDAR output. To obtain object detections, the point clouds from both sensors are grouped using a density-based clustering method. These detections are then fused, and each object's dynamic state is estimated using a particle filter. The results show that our detection and tracking system achieves smallest Root Mean Square Errors (RMSE) of 0.046 m, 0.012 m, 0.007 m, 0.141 m/s, and 0.099 m/s² when estimating the x position, y position, radius, speed, and acceleration of a tracked object, respectively. The sensor fusion method reduced most errors when estimating directly observable variables such as position and radius. The system was also installed on a golf cart and successfully detected and tracked a moving pedestrian and a motorbike while driving around the Bandung State Polytechnic complex.
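The detection step described above groups point-cloud returns with a density-based clustering method. A minimal sketch of that idea on 2D scan points, written as a small DBSCAN-style routine (the record does not name the exact algorithm or parameters; `eps`, `min_samples`, and the toy scan below are illustrative assumptions):

```python
import math

def dbscan(points, eps=0.5, min_samples=3):
    """Label 2D points with cluster ids 0, 1, ... or -1 for noise."""
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        # Indices (including i itself) within eps of point i.
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_samples:
            labels[i] = -1            # noise; may become a border point later
            continue
        cluster += 1                  # i is a core point: start a new cluster
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_samples:
                queue.extend(more)    # j is core too: keep expanding
    return labels

# Two tight groups of returns (two objects) plus one stray return.
scan = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),   # object A
        (5.0, 5.0), (5.1, 5.0), (5.0, 5.1),   # object B
        (10.0, 0.0)]                          # isolated point -> noise
print(dbscan(scan, eps=0.3, min_samples=3))   # → [0, 0, 0, 1, 1, 1, -1]
```

Each resulting cluster would correspond to one object detection; a centroid and radius can then be computed per cluster.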
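Each tracked object's dynamic state is estimated with a particle filter. A toy sketch under strong simplifying assumptions (1D constant-velocity motion, Gaussian measurement noise, and hand-picked noise levels; the thesis's actual state includes x and y position, radius, speed, and acceleration):

```python
import math
import random

random.seed(0)  # deterministic run for the example below

def particle_filter(measurements, n=500, dt=0.1, meas_std=0.2):
    """Estimate 1D position from noisy position measurements."""
    # Each particle is a (position, velocity) hypothesis.
    particles = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
    estimates = []
    for z in measurements:
        # Predict: constant-velocity motion plus process noise.
        particles = [(p + v * dt + random.gauss(0, 0.02),
                      v + random.gauss(0, 0.05)) for p, v in particles]
        # Update: weight particles by Gaussian likelihood of the measurement.
        weights = [math.exp(-0.5 * ((z - p) / meas_std) ** 2)
                   for p, _ in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: weighted mean position, then resample.
        estimates.append(sum(w * p for w, (p, _) in zip(weights, particles)))
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

# Object moving at 1 m/s, observed at 10 Hz with 0.2 m measurement noise.
truth = [0.1 * k for k in range(30)]
meas = [x + random.gauss(0, 0.2) for x in truth]
est = particle_filter(meas)
rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth))
print(f"position RMSE: {rmse:.3f} m")
```

The RMSE computed at the end mirrors the evaluation metric reported in the abstract; in this sketch the filtered estimate lands below the raw 0.2 m measurement noise.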
Collection: Digital ITB, Institut Teknologi Bandung Library