Development of automatic obscene images filtering using deep learning
Main Authors: , , ,
Format: Book Chapter
Language: English
Published: Springer, 2021
Online Access:
http://irep.iium.edu.my/88883/1/88883_Development%20of%20automatic%20obscene%20images.pdf
http://irep.iium.edu.my/88883/2/88883_Development%20of%20automatic%20obscene%20images_SCOPUS.pdf
http://irep.iium.edu.my/88883/
https://link.springer.com/book/10.1007%2F978-3-030-70917-4
Institution: Universiti Islam Antarabangsa Malaysia
Summary: Because of Internet availability in most societies, access to pornography has become a severe issue. On the other hand, the pornography industry has grown steadily, and its websites are becoming increasingly popular by offering potential users free passes. Filtering obscene images and video frames is essential in the big data era, where all kinds of information are available to everyone. This paper proposes a fully automated method for filtering obscene videos and images from any storage device using deep learning algorithms. The recognition process is divided into two stages: fine detection and focus detection. Fine detection includes skin color detection in the YCbCr and HSV color spaces and accurate face detection using the AdaBoost algorithm with Haar-like features. Focus detection then applies AlexNet transfer learning to identify obscene images among those that passed stage one. Results showed the effectiveness of the proposed algorithm in filtering obscene images and videos: testing accuracy reached 95.26% on 3969 test images.
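The stage-one "fine detection" described in the summary starts with skin-color thresholding in the YCbCr space. Below is a minimal NumPy sketch of that step only; the Cb/Cr ranges used here are commonly cited defaults, not necessarily the thresholds the chapter itself uses, and the function names are illustrative.

```python
import numpy as np

def skin_mask_ycbcr(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Boolean mask of skin-colored pixels in an RGB image.

    rgb: uint8 array of shape (H, W, 3).
    The Cb/Cr threshold ranges are widely used defaults (an
    assumption; the chapter's exact values are not given here).
    """
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 RGB -> YCbCr chroma components (128-centered)
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return ((cb_range[0] <= cb) & (cb <= cb_range[1]) &
            (cr_range[0] <= cr) & (cr <= cr_range[1]))

def skin_ratio(rgb):
    """Fraction of pixels flagged as skin; a simple stage-one score
    that could gate whether an image proceeds to face detection."""
    return float(skin_mask_ycbcr(rgb).mean())
```

In the full pipeline described by the abstract, an image passing this color test would then go through Haar-cascade face detection and, finally, the AlexNet-based classifier; those stages depend on trained models and are omitted from this sketch.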