Watch out! Motion is blurring the vision of your deep neural networks

State-of-the-art deep neural networks (DNNs) are vulnerable to adversarial examples carrying additive, random-noise-like perturbations. While such examples are rarely found in the physical world, the image blurring caused by object motion occurs commonly in practice, making its study especially important for widely adopted real-time image processing tasks (e.g., object detection and tracking). In this paper, we take the first step toward comprehensively investigating the potential hazards that motion-induced blur poses to DNNs. We propose a novel adversarial attack that generates visually natural motion-blurred adversarial examples, named the motion-based adversarial blur attack (ABBA). To this end, we first formulate a kernel-prediction-based attack, in which an input image is convolved with kernels in a pixel-wise way and misclassification is achieved by tuning the kernel weights. To generate visually more natural and plausible examples, we further propose saliency-regularized adversarial kernel prediction, where the salient region serves as a moving object and the predicted kernels are regularized to produce natural visual effects. The attack is further enhanced by adaptively tuning the translations of the object and the background. A comprehensive evaluation on the NeurIPS'17 adversarial competition dataset demonstrates the effectiveness of ABBA across various kernel sizes, translations, and regions. An in-depth study further confirms that our method penetrates state-of-the-art GAN-based deblurring mechanisms more effectively than other blurring methods. We release the code at https://github.com/tsingqguo/ABBA.
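The core operation the abstract describes, convolving an image with a different kernel at every pixel, is a spatially varying convolution. The sketch below is only a minimal NumPy illustration of that general idea, with a hypothetical function name and shapes chosen for clarity; it is not the authors' released implementation, which tunes the kernel weights to cause misclassification.

```python
import numpy as np

def pixelwise_blur(image, kernels):
    """Apply a different blur kernel at every pixel (spatially varying convolution).

    image:   (H, W) grayscale array.
    kernels: (H, W, K, K) per-pixel kernels; each K x K kernel should sum
             to 1 so the blur preserves overall brightness.
    """
    H, W = image.shape
    K = kernels.shape[-1]
    pad = K // 2
    # Edge-replicate padding so every pixel has a full K x K neighborhood.
    padded = np.pad(image, pad, mode="edge")
    out = np.empty((H, W), dtype=float)
    for y in range(H):
        for x in range(W):
            patch = padded[y:y + K, x:x + K]
            # Each output pixel is a weighted sum of its neighborhood,
            # weighted by that pixel's own kernel.
            out[y, x] = np.sum(patch * kernels[y, x])
    return out

# With the same uniform kernel everywhere, this reduces to an ordinary box blur.
img = np.arange(16, dtype=float).reshape(4, 4)
K = 3
uniform = np.full((4, 4, K, K), 1.0 / (K * K))
blurred = pixelwise_blur(img, uniform)
```

Because the output is linear in the kernel weights, an attack can treat those weights as the optimization variables and tune them by gradient descent, which is the differentiable-blur idea the kernel-prediction formulation relies on.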

Bibliographic Details
Main Authors: GUO, Qing, JUEFEI-XU, Felix, XIE, Xiaofei, MA, Lei, WANG, Jian, YU, Bing, FENG, Wei, LIU, Yang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2020
Subjects: OS and Networks; Software Engineering
Online Access:https://ink.library.smu.edu.sg/sis_research/7108
https://ink.library.smu.edu.sg/context/sis_research/article/8111/viewcontent/NeurIPS_2020_watch_out_motion_is_blurring_the_vision_of_your_deep_neural_networks_Paper.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-8111
Collection: InK@SMU, Research Collection School Of Computing and Information Systems (SMU Libraries)
Published Online: 2020-12-01
License: http://creativecommons.org/licenses/by-nc-nd/4.0/