Adversarial attacks on RNN-based deep learning systems
Automatic Speech Recognition (ASR) systems have been growing in prevalence alongside advances in deep learning. Built into many Intelligent Voice Control (IVC) systems such as Alexa, Siri, and Google Assistant, ASR has become an attractive target for adversarial attacks. The objective of this research project is to create a black-box, over-the-air (OTA) attack system that mutates an audio clip into an adversarial form with imperceptible differences, such that the ASR interprets it as a targeted word. In this paper, we demonstrate the feasibility and effectiveness of such an attack system in generating perturbations for the DeepSpeech ASR.
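The abstract describes a black-box setting: the attacker can only query the ASR and observe its transcription, and must find a bounded perturbation that drives the output toward a target phrase. As a rough illustration of that query-and-mutate loop (a minimal sketch, not the project's actual method), the Python below hill-climbs with small random noise, scoring candidates by edit distance to the target transcription; the `transcribe` wrapper and all parameter values are hypothetical placeholders for whatever ASR is under attack (e.g. DeepSpeech).

```python
import numpy as np

def transcribe(audio: np.ndarray) -> str:
    """Hypothetical black-box query wrapping the target ASR
    (e.g. DeepSpeech); returns its transcription of `audio`."""
    raise NotImplementedError("plug in the ASR under attack here")

def edit_distance(a: str, b: str) -> int:
    # Levenshtein distance, used as the attack's fitness signal.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def black_box_attack(audio, target, eps=0.005, iters=1000, seed=0):
    """Mutate `audio` (float32 waveform in [-1, 1]) with bounded random
    noise, keeping only mutations whose transcription moves closer to
    `target`. `eps` caps each mutation so the change stays subtle."""
    rng = np.random.default_rng(seed)
    best = audio.copy()
    best_score = edit_distance(transcribe(best), target)
    for _ in range(iters):
        noise = rng.uniform(-eps, eps, size=audio.shape).astype(audio.dtype)
        candidate = np.clip(best + noise, -1.0, 1.0)
        score = edit_distance(transcribe(candidate), target)
        if score < best_score:        # accept: closer to the target phrase
            best, best_score = candidate, score
            if best_score == 0:       # ASR now outputs the target exactly
                break
    return best
```

A practical OTA attack would additionally need to survive playback through a speaker and microphone (room reverberation, ambient noise), which this sketch does not model.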
Saved in:
Main Author: | Loi, Chii Lek |
---|---|
Other Authors: | Liu Yang |
Format: | Final Year Project (FYP) |
Language: | English |
Published: | Nanyang Technological University, 2020 |
Subjects: | Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Engineering::Computer science and engineering::Computing methodologies::Simulation and modeling |
Online Access: | https://hdl.handle.net/10356/137926 |
Institution: | Nanyang Technological University |
id | sg-ntu-dr.10356-137926 |
---|---|
record_format | dspace |
school | School of Computer Science and Engineering |
contact | yangliu@ntu.edu.sg |
degree | Bachelor of Engineering (Computer Science) |
project_code | SCSE 19-0319 |
date_deposited | 2020-04-20 |
media_type | application/pdf |
institution | Nanyang Technological University |
building | NTU Library |
country | Singapore |
collection | DR-NTU |
language | English |
topic | Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Engineering::Computer science and engineering::Computing methodologies::Simulation and modeling |
author | Loi, Chii Lek |
author2 | Liu Yang |
format | Final Year Project |
title | Adversarial attacks on RNN-based deep learning systems |
publisher | Nanyang Technological University |
publishDate | 2020 |
url | https://hdl.handle.net/10356/137926 |
_version_ | 1681057799811891200 |