An empirical study on robustness of DNNs with out-of-distribution awareness
State-of-the-art deep neural networks (DNNs) achieve impressive performance on inputs that are similar to their training data. However, they fail to make reasonable decisions on inputs that differ substantially from the training data, i.e., out-of-distribution (OOD) examples. Although many techniques hav...
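To make the OOD notion in the abstract concrete, below is a minimal sketch of one classic detection baseline, maximum softmax probability (MSP): a low top-class probability suggests the input lies outside the training distribution. This is an illustrative baseline, not the method studied in this paper; the logits and the 0.5 threshold are assumptions for the example.

```python
import numpy as np

def msp_score(logits: np.ndarray) -> np.ndarray:
    """Maximum softmax probability per input; lower scores suggest OOD."""
    shifted = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    return probs.max(axis=-1)

# Hypothetical logits for two inputs to a 3-class classifier.
logits = np.array([[6.0, 0.5, 0.2],    # confident -> likely in-distribution
                   [1.1, 1.0, 0.9]])   # near-uniform -> possibly OOD
threshold = 0.5  # assumed cutoff; in practice tuned on held-out data
print(msp_score(logits) < threshold)   # [False  True]
```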
Main Authors: ZHOU, Lingjun; YU, Bing; BEREND, David; XIE, Xiaofei; LI, Xiaohong; ZHAO, Jianjun; LIU, Xusheng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2020
Online Access: https://ink.library.smu.edu.sg/sis_research/7095
Institution: Singapore Management University
Similar Items
- Cats are not fish: Deep learning testing calls for out-of-distribution awareness
  by: BEREND, David, et al.
  Published: (2020)
- Marble: Model-based robustness analysis of stateful deep learning systems
  by: DU, Xiaoning, et al.
  Published: (2020)
- Watch out! Motion is blurring the vision of your deep neural networks
  by: GUO, Qing, et al.
  Published: (2020)
- DeepMutation++: A mutation testing framework for deep learning systems
  by: HU, Qiang, et al.
  Published: (2019)
- A quantitative analysis framework for recurrent neural network
  by: DU, Xiaoning, et al.
  Published: (2019)