Scaling object detection by transferring learning

Bibliographic Details
Main Author: Liu, Yizheng
Other Authors: Tan Yap Peng
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2020
Subjects:
Online Access:https://hdl.handle.net/10356/140699
Description
Summary: More and more datasets have grown in size and now provide ample class annotations. Although classification datasets are easy to collect, large numbers of bounding-box annotations require significant human labor and are time-consuming to produce, so the number of bounding-box annotations is usually small. Fully supervised training requires not only image-level classification labels but also object-level annotations in the detection database, which limits the number of object classes a detector can cover. Therefore, this work applies a weakly-supervised training method in which the weights of the classification network are transferred to the weights of the detection network. We call this effective and efficient network a weight transfer network (WTN). The classification weights are pre-trained on Open Images v2, while the detection network and the WTN are trained on the Objects 365 dataset, a large-scale object detection dataset well suited to feature learning. The experimental results show that the WTN improves detection performance.
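
The record does not describe the implementation, but the weight-transfer idea summarized above is commonly realized as a small network that maps each class's pre-trained classification weights to corresponding detection weights, which are then used to score region proposals. The PyTorch sketch below is illustrative only and is not taken from the thesis; the module name WeightTransferNet, the two-layer MLP, and all dimensions are assumptions.

# Illustrative sketch only: a generic weight-transfer module, assuming a
# simple MLP mapping from classification weights to detection weights.
import torch
import torch.nn as nn

class WeightTransferNet(nn.Module):
    """Maps a class's pre-trained classification weights to detection weights."""
    def __init__(self, cls_dim: int, det_dim: int, hidden_dim: int = 1024):
        super().__init__()
        # Small MLP: classification weight vector -> detection weight vector.
        self.transfer = nn.Sequential(
            nn.Linear(cls_dim, hidden_dim),
            nn.LeakyReLU(inplace=True),
            nn.Linear(hidden_dim, det_dim),
        )

    def forward(self, cls_weights: torch.Tensor) -> torch.Tensor:
        # cls_weights: (num_classes, cls_dim), e.g. rows of the final
        # fully connected layer of a classifier trained on image-level labels.
        return self.transfer(cls_weights)  # (num_classes, det_dim)

# Usage sketch: predict per-class detection weights, then score RoI features.
num_classes, cls_dim, det_dim = 365, 2048, 1024
cls_weights = torch.randn(num_classes, cls_dim)   # stand-in for pre-trained classifier weights
wtn = WeightTransferNet(cls_dim, det_dim)
det_weights = wtn(cls_weights)                    # predicted detection weights

roi_features = torch.randn(512, det_dim)          # stand-in features for 512 region proposals
scores = roi_features @ det_weights.t()           # (512, num_classes) detection scores

In such a setup, only the datasets with box annotations supervise the detection branch and the WTN, while classes that have only image-level labels still receive detection weights through the learned mapping; whether the thesis follows this exact scheme is not stated in the record.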