Improving out-of-distribution detection with disentangled foreground and background features

Detecting out-of-distribution (OOD) inputs is a principal task for ensuring the safety of deploying deep-neural-network classifiers in open-set scenarios. OOD samples can be drawn from arbitrary distributions and exhibit deviations from in-distribution (ID) data in various dimensions, such as foreground features (e.g., objects in CIFAR100 images vs. those in CIFAR10 images) and background features (e.g., textural images vs. objects in CIFAR10). Existing methods can confound foreground and background features in training, failing to utilize the background features for OOD detection. This paper considers the importance of feature disentanglement in out-of-distribution detection and proposes the simultaneous exploitation of both foreground and background features to support the detection of OOD inputs. To this end, we propose a novel framework that first disentangles foreground and background features from ID training samples via a dense prediction approach, and then learns a new classifier that can evaluate the OOD scores of test images from both foreground and background features. It is a generic framework that allows for a seamless combination with various existing OOD detection methods. Extensive experiments show that our approach 1) can substantially enhance the performance of four different state-of-the-art (SotA) OOD detection methods on multiple widely-used OOD datasets with diverse background features, and 2) achieves new SotA performance on these benchmarks.
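The disentangle-then-score idea described in the abstract can be illustrated with a minimal sketch. All names below are hypothetical and the equal-weight fusion is an assumption for illustration, not the paper's actual scoring rule: given per-image logits from a foreground classifier and a separate background head, compute a maximum-softmax-probability (MSP) confidence from each and combine them into a single ID score.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def combined_id_score(fg_logits, bg_logits, alpha=0.5):
    """Hypothetical fusion of foreground and background confidences.

    fg_logits: (N, num_classes) logits from a foreground classifier.
    bg_logits: (N, num_bg_classes) logits from a background head.
    Returns an in-distribution score in (0, 1]; lower suggests OOD.
    The weighted average with alpha is an assumption for illustration.
    """
    fg_conf = softmax(fg_logits).max(axis=-1)  # MSP on foreground features
    bg_conf = softmax(bg_logits).max(axis=-1)  # MSP on background features
    return alpha * fg_conf + (1.0 - alpha) * bg_conf
```

Under this sketch, a confident prediction on both heads yields a score near 1, while flat (uninformative) logits on either head pull the score down, flagging the input as potentially OOD from either the foreground or the background dimension.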

Bibliographic Details
Main Authors: DING, Choubo, PANG, Guansong
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2024
Online Access:https://ink.library.smu.edu.sg/sis_research/9756
https://ink.library.smu.edu.sg/context/sis_research/article/10756/viewcontent/2303.08727v2.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-10756
record_format dspace
published 2024-10-01T07:00:00Z
format text
media application/pdf
url https://ink.library.smu.edu.sg/sis_research/9756
doi info:doi/10.1145/3664647.3681614
fulltext https://ink.library.smu.edu.sg/context/sis_research/article/10756/viewcontent/2303.08727v2.pdf
license http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Research Collection School Of Computing and Information Systems
language eng
publisher Institutional Knowledge at Singapore Management University
topic Machine learning; Computer vision; Image representation; Anomaly detection; Out-of-Distribution detection; Disentangled representations; Artificial Intelligence and Robotics; Graphics and Human Computer Interfaces
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU