Data driven extraction of challenging situations for autonomous vehicles

Bibliographic Details
Main Author: Krishna, Sagar
Other Authors: Justin Dauwels
Format: Theses and Dissertations
Language: English
Published: 2019
Subjects:
Online Access: http://hdl.handle.net/10356/78469
Institution: Nanyang Technological University
Description
Summary: Autonomous vehicles or Self-Driven Vehicles (SDVs) are becoming increasingly common in Singapore, serving a wide variety of applications, from first-and-last-mile commutes to logistics. The areas of deployment are similarly diverse, ranging from docks to housing estates and highways. This variance in operating environments necessitates careful validation and analysis of SDVs in contextual situations before deployment. To support the Land Transport Authority's (LTA) development of test requirements and standards to deploy AVs in Singapore, NTU led the Centre of Excellence for Testing & Research of AVs – NTU (CETRAN). While CETRAN does not directly develop new technologies for AVs, it generates fundamental research on how these systems should operate, develops testing requirements, and helps establish an international standard for AVs. CETRAN houses an expert research team formed by NTU that performs testing in a computer-simulated environment representative of Singapore's traffic conditions, to complement the tests being performed in the test circuit. In order to create realistic tests which correspond as closely as possible with traffic situations occurring on Singapore roads, 15,000 km of driving data (including video and vehicle dynamics) have been collected and analyzed over the course of two years, and 20,000 labelled situations have been identified in conjunction with CETRAN. I was provided with a subset of the total driving data, containing video recorded by front and rear cameras, as well as vehicle dynamics data from the Controller Area Network bus (CANbus) with signals such as GPS location, velocity, orientation, and yaw rate. Additionally, I had access to event labels that had been manually annotated by a trained team of dedicated labelers. The goal of the project was to train machine learning algorithms to automatically extract two specific situations from the collected data. The resulting algorithms were validated by comparing their output with the manually annotated labels. This work would be of great value to CETRAN, allowing the data analysis to be scaled up to a much larger database.
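The abstract does not specify which algorithms or features were used; as a rough illustration of the detection-and-validation setup it describes (windowed CANbus signals, a trained classifier, and comparison against manual annotations), a minimal sketch in Python with synthetic data might look as follows. The feature choices, the event definition, and all numbers here are assumptions for illustration only, not taken from the thesis.

```python
# Hypothetical sketch (not the thesis code): detecting one "situation" type from
# windowed CANbus-style signals and checking the output against manual labels.
# All data is synthetic; the feature set (speed, yaw rate) is an assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for per-window CANbus statistics: speed [m/s], yaw rate [rad/s].
n_windows = 2000
speed = rng.uniform(0, 25, n_windows)
yaw_rate = rng.normal(0, 0.1, n_windows)

# Pretend the "situation" of interest is a sharp turn taken at speed (illustrative only);
# in practice these labels come from the manually annotated events.
labels = ((np.abs(yaw_rate) > 0.15) & (speed > 8)).astype(int)

# Simple per-window features; a real pipeline would also draw on video and
# richer temporal statistics computed over each window.
X = np.column_stack([speed, np.abs(yaw_rate), speed * np.abs(yaw_rate)])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

# Validation mirrors the abstract: compare detections with the manual annotations.
print(f"precision = {precision_score(y_test, pred):.2f}")
print(f"recall    = {recall_score(y_test, pred):.2f}")
```

Precision and recall against held-out manual labels are one plausible way to quantify the agreement the abstract refers to; the thesis may well use different metrics or models.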