Removing bias for out-of-distribution generalization
Deep models have a strong capacity to fit the training data and can therefore achieve high performance when the test data is sampled from the same distribution as the training data. In practice, however, deep models fail to perform well because the test data is usually Out-of-Distribution (OOD)...
Saved in:

| Main Author: | Qi, Jiaxin |
| --- | --- |
| Other Authors: | Zhang, Hanwang |
| Format: | Thesis-Doctor of Philosophy |
| Language: | English |
| Published: | Nanyang Technological University, 2023 |
| Online Access: | https://hdl.handle.net/10356/168654 |
| Institution: | Nanyang Technological University |
Similar Items

- Class is invariant to context and vice versa: On learning invariance for out-of-distribution generalization
  by: QI, Jiaxin, et al.
  Published: (2022)
- Towards out-of-distribution detection for object detection networks
  by: Kanodia, Ritwik
  Published: (2022)
- Neural network compression techniques for out-of-distribution detection
  by: Bansal, Aditya
  Published: (2022)
- Full-spectrum out-of-distribution detection
  by: Yang, Jingkang, et al.
  Published: (2023)
- Disentangling latent space of variational autoencoder with distribution dependent guarantees for out-of-distribution detection and reasoning
  by: Rahiminasab Zahra Reza (Zahra Rahiminasab)
  Published: (2024)