One class based feature learning approach for defect detection using deep autoencoders
Detecting defects is an integral part of any manufacturing process. Most works still rely on traditional image processing algorithms to detect defects owing to the complexity and variety of products and manufacturing environments. In this paper, we propose a deep learning approach that uses autoencoders to extract discriminative features. It can detect different defects without using any defect samples during training. This method, in which samples of only one class (i.e. defect-free samples) are available for training, is called One-Class Classification (OCC). The OCC method can also be used to train a neural network when only one golden sample is available, by generating many copies of the reference image through data augmentation. The trained model can then generate a descriptor, i.e. a unique feature vector, for an input image. A test image captured by an Automatic Optical Inspection (AOI) camera is passed through the trained model to produce a test descriptor, which is compared with a reference descriptor to obtain a similarity score. Comparing this method with SIFT, a popular traditional similarity-matching method, we find that in most cases it is more effective and more flexible than traditional image processing-based methods and can detect different types of defects with minimal customization.
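The pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' implementation: the convolutional autoencoder architecture, the 128x128 grayscale input size, the cosine-similarity comparison, and the decision threshold are all invented for the sketch. The model is trained on defect-free images only (the one-class setting), the encoder output is used as the descriptor, and a test descriptor is compared against the reference descriptor to obtain a similarity score.

```python
# Minimal sketch of the one-class descriptor pipeline described in the abstract.
# Architecture, input size (1x128x128, values in [0, 1]) and threshold are
# illustrative assumptions; the paper's actual network and settings may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConvAutoencoder(nn.Module):
    """Convolutional autoencoder; the bottleneck vector acts as the descriptor."""

    def __init__(self, descriptor_dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1),   # 128 -> 64
            nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1),  # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, descriptor_dim),    # descriptor (feature vector)
        )
        self.decoder = nn.Sequential(
            nn.Linear(descriptor_dim, 64 * 16 * 16),
            nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),
            nn.Sigmoid(),                                # images assumed in [0, 1]
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z


def train_on_defect_free(model, loader, epochs=20, lr=1e-3, device="cpu"):
    """Train with reconstruction loss on defect-free (one-class) samples only."""
    model.to(device).train()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images in loader:              # batches of shape (N, 1, 128, 128)
            images = images.to(device)
            recon, _ = model(images)
            loss = F.mse_loss(recon, images)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model


@torch.no_grad()
def descriptor(model, image):
    """Extract the descriptor (feature vector) of a single (1, 128, 128) image."""
    model.eval()
    return model.encoder(image.unsqueeze(0)).squeeze(0)


@torch.no_grad()
def is_defective(model, test_image, reference_image, threshold=0.9):
    """Compare test and reference descriptors; low cosine similarity flags a defect."""
    sim = F.cosine_similarity(descriptor(model, test_image),
                              descriptor(model, reference_image), dim=0).item()
    return sim < threshold, sim
```

In the single-golden-sample case mentioned in the abstract, the training loader would simply yield augmented copies (e.g. rotations, shifts, brightness changes) of the one reference image.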
Saved in: DR-NTU
Main Authors: Abdul Mujeeb; Dai, Wenting; Erdt, Marius; Sourin, Alexei
Other Authors: School of Computer Science and Engineering; School of Electrical and Electronic Engineering; Fraunhofer Research Center
Format: Article
Language: English
Published: 2020
Subjects: Engineering::Computer science and engineering; Automated Manufacturing Systems; Automatic Optical Inspection
Online Access: https://hdl.handle.net/10356/137979
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-137979
record_format: dspace
Citation: Abdul Mujeeb, Dai, W., Erdt, M., & Sourin, A. (2019). One class based feature learning approach for defect detection using deep autoencoders. Advanced Engineering Informatics, 42, 100933. doi:10.1016/j.aei.2019.100933
ISSN: 1474-0346
DOI: 10.1016/j.aei.2019.100933
Handle: https://hdl.handle.net/10356/137979
Type: Journal Article (accepted version)
Issued: 2019; made available in DR-NTU: 2020-04-21
Relation: SMA-RP4
Rights: © 2019 Elsevier. All rights reserved. This paper was published in Advanced Engineering Informatics and is made available with permission of Elsevier.
File format: application/pdf
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU