Better pay attention whilst fuzzing

Fuzzing is one of the prevailing methods for vulnerability detection. However, even state-of-the-art fuzzing methods become ineffective after some period of time, i.e., the coverage hardly improves because existing methods fail to focus the attention of fuzzing on covering the hard-to-trigger...


Bibliographic Details
Main Authors: ZHU, Shunkai, WANG, Jingyi, SUN, Jun, YANG, Jie, LIN, Xingwei, ZHANG, Liyi, CHENG, Peng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Subjects: Attention Model; Codes; Computer bugs; Deep learning; Electronic mail; Fuzzing; Image edge detection; Program Analysis; Recurrent neural networks; Software Engineering
Online Access:https://ink.library.smu.edu.sg/sis_research/8550
https://ink.library.smu.edu.sg/context/sis_research/article/9553/viewcontent/Fuzzing_av.pdf
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-9553
record_format dspace
spelling sg-smu-ink.sis_research-9553 2024-01-22T14:48:34Z Better pay attention whilst fuzzing ZHU, Shunkai WANG, Jingyi SUN, Jun YANG, Jie LIN, Xingwei ZHANG, Liyi CHENG, Peng Fuzzing is one of the prevailing methods for vulnerability detection. However, even state-of-the-art fuzzing methods become ineffective after some period of time, i.e., the coverage hardly improves because existing methods fail to focus the attention of fuzzing on covering the hard-to-trigger program paths. In other words, they cannot generate inputs that break the bottleneck, due to the fundamental difficulty of capturing the complex relations between test inputs and program coverage. In particular, existing fuzzers suffer from two main limitations: 1) they lack an overall analysis of the program to identify the most “rewarding” seeds, and 2) they lack an effective mutation strategy that continuously selects and mutates the most relevant “bytes” of those seeds. In this work, we propose an approach called ATTUZZ to address these two issues systematically. First, we propose a lightweight dynamic analysis technique that estimates the “reward” of covering each basic block and selects the most rewarding seeds accordingly. Second, we mutate the selected seeds according to a neural network model that predicts whether a given “rewarding” block will be covered under specific mutations of specific bytes of a seed. The model is a deep learning model equipped with an attention mechanism, and it is learned and updated periodically whilst fuzzing. Our evaluation shows that ATTUZZ significantly outperforms 5 state-of-the-art grey-box fuzzers on 6 popular real-world programs and the MAGMA data set, achieving higher edge coverage and finding new bugs. In particular, ATTUZZ achieved 1.2X the edge coverage of AFL++ and detected 1.8X as many bugs over 24-hour runs. In addition, ATTUZZ found 4 new bugs in the latest versions of popular software, including p7zip and openUSD. 2023-12-01T08:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/8550 info:doi/10.1109/TSE.2023.3338129 https://ink.library.smu.edu.sg/context/sis_research/article/9553/viewcontent/Fuzzing_av.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Attention Model Codes Computer bugs Deep learning Electronic mail Fuzzing Image edge detection Program Analysis Recurrent neural networks Software Engineering
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Attention Model
Codes
Computer bugs
Deep learning
Electronic mail
Fuzzing
Image edge detection
Program Analysis
Recurrent neural networks
Software Engineering
spellingShingle Attention Model
Codes
Computer bugs
Deep learning
Electronic mail
Fuzzing
Image edge detection
Program Analysis
Recurrent neural networks
Software Engineering
ZHU, Shunkai
WANG, Jingyi
SUN, Jun
YANG, Jie
LIN, Xingwei
ZHANG, Liyi
CHENG, Peng
Better pay attention whilst fuzzing
description Fuzzing is one of the prevailing methods for vulnerability detection. However, even state-of-the-art fuzzing methods become ineffective after some period of time, i.e., the coverage hardly improves because existing methods fail to focus the attention of fuzzing on covering the hard-to-trigger program paths. In other words, they cannot generate inputs that break the bottleneck, due to the fundamental difficulty of capturing the complex relations between test inputs and program coverage. In particular, existing fuzzers suffer from two main limitations: 1) they lack an overall analysis of the program to identify the most “rewarding” seeds, and 2) they lack an effective mutation strategy that continuously selects and mutates the most relevant “bytes” of those seeds. In this work, we propose an approach called ATTUZZ to address these two issues systematically. First, we propose a lightweight dynamic analysis technique that estimates the “reward” of covering each basic block and selects the most rewarding seeds accordingly. Second, we mutate the selected seeds according to a neural network model that predicts whether a given “rewarding” block will be covered under specific mutations of specific bytes of a seed. The model is a deep learning model equipped with an attention mechanism, and it is learned and updated periodically whilst fuzzing. Our evaluation shows that ATTUZZ significantly outperforms 5 state-of-the-art grey-box fuzzers on 6 popular real-world programs and the MAGMA data set, achieving higher edge coverage and finding new bugs. In particular, ATTUZZ achieved 1.2X the edge coverage of AFL++ and detected 1.8X as many bugs over 24-hour runs. In addition, ATTUZZ found 4 new bugs in the latest versions of popular software, including p7zip and openUSD.
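The description above outlines two ingredients: a per-basic-block "reward" estimate used to pick seeds, and an attention-equipped model that predicts which bytes of a seed matter for covering a rewarding block. The sketch below is only a minimal illustration of that workflow, not the authors' ATTUZZ code: the block rewards are assumed to come from coverage instrumentation, and the attention network is replaced by a hypothetical per-byte score table.

```python
# Illustrative sketch (not the ATTUZZ implementation): reward-guided seed
# selection plus byte-level mutation targeting. Block rewards and byte scores
# are assumed inputs; in the paper they come from dynamic analysis and an
# attention-equipped neural network that is retrained periodically.
import random
from dataclasses import dataclass, field
from typing import Dict, List, Set


@dataclass
class Seed:
    data: bytearray
    covered_blocks: Set[int]                       # basic-block ids this seed reaches
    byte_scores: Dict[int, float] = field(default_factory=dict)  # stand-in for attention weights


def seed_reward(seed: Seed, block_reward: Dict[int, float]) -> float:
    """Estimated payoff of mutating this seed: sum of the rewards of the
    blocks it touches (a simplification of the paper's reward analysis)."""
    return sum(block_reward.get(b, 0.0) for b in seed.covered_blocks)


def pick_seed(seeds: List[Seed], block_reward: Dict[int, float]) -> Seed:
    """Select the most 'rewarding' seed, breaking ties randomly."""
    best = max(seed_reward(s, block_reward) for s in seeds)
    candidates = [s for s in seeds if seed_reward(s, block_reward) == best]
    return random.choice(candidates)


def mutate(seed: Seed, n_bytes: int = 4) -> bytearray:
    """Mutate the n_bytes positions the (stand-in) model scores highest,
    i.e. the bytes predicted to matter most for reaching rewarding blocks."""
    data = bytearray(seed.data)
    ranked = sorted(range(len(data)),
                    key=lambda i: seed.byte_scores.get(i, 0.0),
                    reverse=True)
    for i in ranked[:n_bytes]:
        data[i] = random.randrange(256)
    return data


if __name__ == "__main__":
    # Toy corpus: block 7 is assumed hard to trigger, hence highly rewarded.
    block_reward = {1: 0.1, 2: 0.1, 7: 5.0}
    corpus = [
        Seed(bytearray(b"AAAA"), {1, 2}),
        Seed(bytearray(b"BBBB"), {2, 7}, byte_scores={0: 0.9, 3: 0.7}),
    ]
    chosen = pick_seed(corpus, block_reward)
    print("selected seed:", bytes(chosen.data))
    print("mutated input:", bytes(mutate(chosen, n_bytes=2)))
```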
format text
author ZHU, Shunkai
WANG, Jingyi
SUN, Jun
YANG, Jie
LIN, Xingwei
ZHANG, Liyi
CHENG, Peng
author_facet ZHU, Shunkai
WANG, Jingyi
SUN, Jun
YANG, Jie
LIN, Xingwei
ZHANG, Liyi
CHENG, Peng
author_sort ZHU, Shunkai
title Better pay attention whilst fuzzing
title_short Better pay attention whilst fuzzing
title_full Better pay attention whilst fuzzing
title_fullStr Better pay attention whilst fuzzing
title_full_unstemmed Better pay attention whilst fuzzing
title_sort better pay attention whilst fuzzing
publisher Institutional Knowledge at Singapore Management University
publishDate 2023
url https://ink.library.smu.edu.sg/sis_research/8550
https://ink.library.smu.edu.sg/context/sis_research/article/9553/viewcontent/Fuzzing_av.pdf
_version_ 1789483263297847296