Neuron coverage-guided domain generalization

This paper focuses on the domain generalization task where domain knowledge is unavailable and, even worse, only samples from a single domain can be utilized during training. Our motivation originates from recent progress in deep neural network (DNN) testing, which has shown that maximizing the neuron coverage of a DNN can help to expose possible defects of the DNN (i.e., misclassification). More specifically, by treating the DNN as a program and each neuron as a functional point of the code, we aim to improve the generalization capability during training by maximizing the neuron coverage of the DNN with a gradient similarity regularization between the original and augmented samples. As such, the decision behavior of the DNN is optimized, avoiding arbitrary neurons that are deleterious for unseen samples and leading to a trained DNN that generalizes better to out-of-distribution samples. Extensive studies on various domain generalization tasks under both single- and multiple-domain settings demonstrate the effectiveness of our proposed approach compared with state-of-the-art baseline methods. We also analyze our method by conducting visualization based on network dissection; the results provide further evidence of the rationality and effectiveness of our approach.
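For readers unfamiliar with the neuron-coverage notion the abstract builds on, a minimal sketch follows. This is not the paper's code: the two-layer MLP, the threshold `t = 0.25`, and the per-layer min-max scaling are illustrative assumptions. Coverage here is simply the fraction of hidden neurons whose scaled activation exceeds `t` for at least one input in the batch.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def neuron_coverage(hidden_acts, t=0.25):
    """Fraction of neurons 'activated' (scaled activation > t) by some input.

    hidden_acts: list of (batch, neurons) activation matrices, one per layer.
    """
    covered = total = 0
    for a in hidden_acts:
        # Min-max scale per layer so a single threshold is comparable
        # across layers with different activation magnitudes.
        scaled = (a - a.min()) / (a.max() - a.min() + 1e-8)
        # A neuron counts as covered if it fires for at least one input.
        covered += int((scaled.max(axis=0) > t).sum())
        total += a.shape[1]
    return covered / total

# Tiny two-layer ReLU MLP on random inputs, purely for illustration.
x = rng.normal(size=(32, 8))
w1, w2 = rng.normal(size=(8, 16)), rng.normal(size=(16, 4))
h1 = relu(x @ w1)
h2 = relu(h1 @ w2)
cov = neuron_coverage([h1, h2])
print(round(cov, 3))  # a value in [0, 1]
```

A coverage-guided training objective of the kind the abstract describes would add a term encouraging this fraction to grow, alongside the gradient similarity regularization between original and augmented samples.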

Bibliographic Details
Main Authors: TIAN, Chris Xing, LI, Haoliang, XIE, Xiaofei, LIU, Yang, WANG, Shiqi
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2022
Subjects: Out-of-distribution; neuron coverage; gradient similarity; Artificial Intelligence and Robotics
Online Access:https://ink.library.smu.edu.sg/sis_research/7491
https://ink.library.smu.edu.sg/context/sis_research/article/8494/viewcontent/2103.00229.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-8494
record_format dspace
record_updated 2022-11-10T07:45:37Z
publish_date 2022-03-01T08:00:00Z
format text application/pdf
doi info:doi/10.1109/TPAMI.2022.3157441
license http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Research Collection School Of Computing and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Out-of-distribution
neuron coverage
gradient similarity
Artificial Intelligence and Robotics
description This paper focuses on the domain generalization task where domain knowledge is unavailable and, even worse, only samples from a single domain can be utilized during training. Our motivation originates from recent progress in deep neural network (DNN) testing, which has shown that maximizing the neuron coverage of a DNN can help to expose possible defects of the DNN (i.e., misclassification). More specifically, by treating the DNN as a program and each neuron as a functional point of the code, we aim to improve the generalization capability during training by maximizing the neuron coverage of the DNN with a gradient similarity regularization between the original and augmented samples. As such, the decision behavior of the DNN is optimized, avoiding arbitrary neurons that are deleterious for unseen samples and leading to a trained DNN that generalizes better to out-of-distribution samples. Extensive studies on various domain generalization tasks under both single- and multiple-domain settings demonstrate the effectiveness of our proposed approach compared with state-of-the-art baseline methods. We also analyze our method by conducting visualization based on network dissection; the results provide further evidence of the rationality and effectiveness of our approach.
format text
author TIAN, Chris Xing
LI, Haoliang
XIE, Xiaofei
LIU, Yang
WANG, Shiqi
title Neuron coverage-guided domain generalization
publisher Institutional Knowledge at Singapore Management University
publishDate 2022
url https://ink.library.smu.edu.sg/sis_research/7491
https://ink.library.smu.edu.sg/context/sis_research/article/8494/viewcontent/2103.00229.pdf