Towards out-of-distribution detection for object detection networks
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2022
Online Access: https://hdl.handle.net/10356/157090
Institution: Nanyang Technological University
Summary: Many studies have recently been published on recognizing when a classification neural network is given data that does not fit any of the class labels learnt during training. These so-called out-of-distribution (OOD) detection approaches have the potential to improve system safety in situations where unexpected or novel inputs might cause mistakes that jeopardize human life. In particular, they could aid autonomous vehicles if they were able to detect and localize anomalous objects in a driving environment, allowing the system either to fail gracefully or to treat such objects with extreme caution.

We examine a promising OOD detection method from the image classification literature, 'Detecting Out-of-Distribution Inputs in Deep Neural Networks Using an Early-Layer Output', and explore how it can be adapted for use in object detection networks. We then apply our approach to the YOLOv3 object detector and evaluate it across multiple metrics to empirically demonstrate its effectiveness.
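The general idea behind early-layer OOD scoring is to measure how far an input's features at an intermediate layer fall from the statistics of in-distribution features. The following is a minimal sketch of that idea, not the thesis's actual implementation: the feature dimension, the Gaussian fit, and all data below are illustrative stand-ins, and a real system would extract the features from an early layer of the detector.

```python
# Hedged sketch of early-layer OOD scoring: fit in-distribution feature
# statistics, then use Mahalanobis distance as the OOD score.
import numpy as np

def fit_feature_stats(train_feats):
    """Estimate mean and inverse covariance of in-distribution features."""
    mu = train_feats.mean(axis=0)
    # Small diagonal term keeps the covariance invertible.
    cov = np.cov(train_feats, rowvar=False) + 1e-6 * np.eye(train_feats.shape[1])
    return mu, np.linalg.inv(cov)

def ood_score(feat, mu, cov_inv):
    """Mahalanobis distance of one feature vector to the fitted statistics."""
    d = feat - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Synthetic stand-ins for early-layer feature vectors.
rng = np.random.default_rng(0)
train = rng.normal(0.0, 1.0, size=(500, 8))   # in-distribution features
mu, cov_inv = fit_feature_stats(train)

in_dist = rng.normal(0.0, 1.0, size=8)        # looks like training data
far_out = np.full(8, 6.0)                     # clearly off-manifold
assert ood_score(far_out, mu, cov_inv) > ood_score(in_dist, mu, cov_inv)
```

In practice a threshold on this score (chosen on held-out data for a target false-positive rate) decides whether an input is flagged as OOD.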