Toward generalist anomaly detection via in-context residual learning with few-shot sample prompts
This paper explores the problem of Generalist Anomaly Detection (GAD), aiming to train one single detection model that can generalize to detect anomalies in diverse datasets from different application domains without any further training on the target data. Some recent studies have shown that large...
Main Authors: | ZHU, Jiawen; PANG, Guansong |
---|---|
Format: | text |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2024 |
Subjects: | Generalist anomaly detection; Machine learning; Visual-language models; Artificial Intelligence and Robotics; Computer Sciences |
Collection: | Research Collection School Of Computing and Information Systems |
DOI: | 10.1109/CVPR52733.2024.01688 |
License: | CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/) |
Online Access: | https://ink.library.smu.edu.sg/sis_research/9762 https://ink.library.smu.edu.sg/context/sis_research/article/10762/viewcontent/2403.06495v3.pdf |
Institution: | Singapore Management University |
id | sg-smu-ink.sis_research-10762
---|---
record_format | dspace
institution | Singapore Management University
building | SMU Libraries
continent | Asia
country | Singapore
content_provider | SMU Libraries
collection | InK@SMU
language | English
topic | Generalist anomaly detection; Machine learning; Visual-language models; Artificial Intelligence and Robotics; Computer Sciences
description | This paper explores the problem of Generalist Anomaly Detection (GAD), aiming to train one single detection model that can generalize to detect anomalies in diverse datasets from different application domains without any further training on the target data. Some recent studies have shown that large pre-trained Visual-Language Models (VLMs) like CLIP have strong generalization capabilities on detecting industrial defects from various datasets, but their methods rely heavily on handcrafted text prompts about defects, making them difficult to generalize to anomalies in other applications, e.g., medical image anomalies or semantic anomalies in natural images. In this work, we propose to train a GAD model with few-shot normal images as sample prompts for AD on diverse datasets on the fly. To this end, we introduce a novel approach that learns an in-context residual learning model for GAD, termed InCTRL. It is trained on an auxiliary dataset to discriminate anomalies from normal samples based on a holistic evaluation of the residuals between query images and few-shot normal sample prompts. Regardless of the datasets, per definition of anomaly, larger residuals are expected for anomalies than normal samples, thereby enabling InCTRL to generalize across different domains without further training. Comprehensive experiments on nine AD datasets are performed to establish a GAD benchmark that encapsulates the detection of industrial defect anomalies, medical anomalies, and semantic anomalies in both one-vs-all and multi-class settings, on which InCTRL is the best performer and significantly outperforms state-of-the-art competing methods.
format | text
author | ZHU, Jiawen; PANG, Guansong
author_sort | ZHU, Jiawen
title | Toward generalist anomaly detection via in-context residual learning with few-shot sample prompts
title_sort | toward generalist anomaly detection via in-context residual learning with few-shot sample prompts
publisher | Institutional Knowledge at Singapore Management University
publishDate | 2024
url | https://ink.library.smu.edu.sg/sis_research/9762 https://ink.library.smu.edu.sg/context/sis_research/article/10762/viewcontent/2403.06495v3.pdf
_version_ | 1819113131240062976
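
The abstract describes the core idea behind InCTRL: score a query image by the residuals between its features and those of a few normal sample prompts, with larger residuals indicating anomalies. The snippet below is a minimal illustrative sketch of that residual-scoring idea only, not the authors' InCTRL implementation; the `anomaly_score` function is hypothetical, and the features are assumed to come from some frozen pre-trained image encoder (e.g., CLIP), here stood in for by random placeholders.

```python
# Minimal sketch of residual-based scoring with few-shot normal prompts.
# NOTE: an illustration of the idea in the abstract, not the InCTRL model itself.
# Features are assumed to come from a frozen pre-trained image encoder (e.g., CLIP);
# random tensors are used below as placeholders for such features.
import torch
import torch.nn.functional as F


def anomaly_score(query_feat: torch.Tensor, prompt_feats: torch.Tensor) -> torch.Tensor:
    """Score a query by its residual to the closest few-shot normal prompt.

    query_feat:   (D,) feature of the query image.
    prompt_feats: (K, D) features of K few-shot normal sample prompts.
    Returns a scalar; larger means more anomalous.
    """
    q = F.normalize(query_feat, dim=-1)    # unit-norm query feature
    p = F.normalize(prompt_feats, dim=-1)  # unit-norm prompt features
    sims = p @ q                           # (K,) cosine similarities to each prompt
    residuals = 1.0 - sims                 # (K,) cosine-distance residuals
    return residuals.min()                 # residual to the most similar normal prompt


if __name__ == "__main__":
    torch.manual_seed(0)
    D, K = 512, 8
    normal_prompts = torch.randn(K, D)                         # placeholder prompt features
    normal_query = normal_prompts[0] + 0.05 * torch.randn(D)   # lies near the normal prompts
    anomalous_query = torch.randn(D)                           # lies far from the normal prompts
    print("normal score:   ", float(anomaly_score(normal_query, normal_prompts)))
    print("anomalous score:", float(anomaly_score(anomalous_query, normal_prompts)))
```

InCTRL itself goes further, learning this discrimination on an auxiliary dataset through a holistic evaluation of such residuals; the sketch only illustrates why larger residuals are expected for anomalies than for normal samples, regardless of the target domain.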