Training-free neural active learning with initialization robustness guarantees


Bibliographic Details
Main Author: Singh, Jasraj
Other Authors: Tong Ping
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access:https://hdl.handle.net/10356/166498
Institution: Nanyang Technological University
Description
Summary: Neural active learning techniques so far have focused on enhancing the predictive capabilities of the networks. However, safety-critical applications necessitate not only good predictive performance but also robustness to randomness in the model-fitting process. To address this, we present the Expected Variance with Gaussian Processes (EV-GP) criterion for neural active learning, which is theoretically guaranteed to choose data points that result in neural networks exhibiting both (a) good generalization capabilities and (b) robustness to initialization. Notably, our EV-GP criterion is training-free, i.e., it does not require network training during data selection, making it computationally efficient. We empirically show that our EV-GP criterion strongly correlates with initialization robustness and generalization performance. Additionally, we demonstrate that it consistently surpasses baseline methods in achieving both objectives, particularly in cases with limited initially labeled data or large batch sizes for active learning.
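The training-free selection idea described in the summary, scoring candidate points by a Gaussian-process predictive variance that depends only on the inputs (so no labels or network training are needed during selection), can be sketched as follows. This is an illustrative proxy only: it uses a plain RBF-kernel GP and greedy maximum-variance batch selection, whereas the actual EV-GP criterion is defined via the neural tangent kernel. The kernel choice, lengthscale, noise level, and greedy strategy here are all assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def posterior_variance(X_lab, X_pool, noise=1e-3, lengthscale=1.0):
    # GP posterior variance at pool points, conditioned on the labeled
    # *inputs* only: the variance is label-independent, which is what
    # makes this style of criterion training-free.
    K = rbf_kernel(X_lab, X_lab, lengthscale) + noise * np.eye(len(X_lab))
    k = rbf_kernel(X_lab, X_pool, lengthscale)
    prior = np.ones(len(X_pool))  # k(x, x) = 1 for the RBF kernel
    return prior - np.einsum("ij,ij->j", k, np.linalg.solve(K, k))

def select_batch(X_lab, X_pool, batch_size):
    # Greedy batch selection: repeatedly pick the pool point with the
    # highest posterior variance, then condition on it (still without
    # ever training a model or looking at labels).
    labeled = X_lab.copy()
    chosen, remaining = [], list(range(len(X_pool)))
    for _ in range(batch_size):
        var = posterior_variance(labeled, X_pool[remaining])
        best = remaining[int(np.argmax(var))]
        chosen.append(best)
        labeled = np.vstack([labeled, X_pool[best:best + 1]])
        remaining.remove(best)
    return chosen

rng = np.random.default_rng(0)
X_lab = rng.normal(size=(5, 2))    # already-labeled inputs
X_pool = rng.normal(size=(20, 2))  # unlabeled candidate pool
batch = select_batch(X_lab, X_pool, batch_size=3)
```

Conditioning on each chosen point before picking the next one is what encourages diversity within the batch, which matters in the large-batch regime the summary highlights.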