QVIP: An ILP-based formal verification approach for quantized neural networks
Deep learning has become a promising programming paradigm in software development, owing to its surprising performance in solving many challenging tasks. Deep neural networks (DNNs) are increasingly being deployed in practice, but their demand for computational power limits their use on resource-constrained devices. Quantization has emerged as a promising technique to reduce the size of DNNs while retaining accuracy comparable to their floating-point counterparts, and the resulting quantized neural networks (QNNs) can be implemented energy-efficiently. As with their floating-point counterparts, quality assurance techniques for QNNs, such as testing and formal verification, are essential but remain less explored. In this work, we propose a novel and efficient formal verification approach for QNNs. In particular, we are the first to propose an encoding that reduces the verification problem of QNNs to solving integer linear constraints, which can be handled by off-the-shelf solvers. Our encoding is both sound and complete. We demonstrate the application of our approach to local robustness verification and maximum robustness radius computation. We implement our approach in a prototype tool, QVIP, and conduct a thorough evaluation. Experimental results on QNNs with different quantization bit widths confirm the effectiveness and efficiency of our approach: for example, it is two orders of magnitude faster and solves more verification tasks within the same time limit than state-of-the-art methods.
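The abstract describes reducing QNN local robustness verification to integer linear constraints that an off-the-shelf solver can decide. The sketch below is a minimal, generic illustration of that idea, not the authors' actual QVIP encoding: it models a hypothetical 3-input, two-class quantized network with integer weights, encodes integer ReLU with a standard big-M/binary-indicator formulation, and asks a solver (PuLP with its bundled CBC backend, chosen here only for illustration) whether any integer input within a given L-infinity radius of a fixed input can flip the predicted class. All weights, the input, the radius, and variable names are made up.

```python
# Illustrative sketch only: a generic (M)ILP encoding of local robustness for a
# toy quantized network, not the QVIP encoding from the paper.
from pulp import LpProblem, LpVariable, LpMaximize, LpStatus, lpSum, value

# Toy 3-input, 2-hidden-neuron, 2-class network with integer weights and
# 8-bit inputs. A faithful QNN encoding would also model scaling and rounding.
W1 = [[2, -1, 3], [-2, 4, 1]]     # hidden-layer weights (hypothetical)
b1 = [5, -3]                      # hidden-layer biases
W2 = [[1, -2], [-1, 3]]           # output-layer weights
b2 = [0, 1]
x0 = [100, 30, 200]               # quantized input under verification
target = 0                        # class predicted on x0 by this toy network
radius = 2                        # integer L-infinity perturbation radius
M = 10**6                         # big-M constant, larger than any |pre-activation|

prob = LpProblem("qnn_local_robustness", LpMaximize)

# Integer input variables restricted to the perturbation box within [0, 255].
x = [LpVariable(f"x{i}", lowBound=max(0, x0[i] - radius),
                upBound=min(255, x0[i] + radius), cat="Integer")
     for i in range(3)]

# Hidden layer: z = W1 x + b1, h = ReLU(z), encoded exactly with one binary
# indicator per neuron (binary variables are just 0/1 integer variables).
h = []
for j in range(2):
    z = lpSum(W1[j][i] * x[i] for i in range(3)) + b1[j]
    hj = LpVariable(f"h{j}", lowBound=0, cat="Integer")
    aj = LpVariable(f"a{j}", cat="Binary")
    prob += hj >= z                 # h is at least z
    prob += hj <= z + M * (1 - aj)  # if a = 1, h = z (active neuron)
    prob += hj <= M * aj            # if a = 0, h = 0 (inactive neuron)
    h.append(hj)

# Output scores y = W2 h + b2.
y = [lpSum(W2[k][j] * h[j] for j in range(2)) + b2[k] for k in range(2)]

# Maximize the margin of the competing class over the predicted class.
other = 1 - target
prob += y[other] - y[target]
prob.solve()

margin = value(prob.objective)
print(LpStatus[prob.status], "max margin:", margin)
# Everything is integer, so the network is locally robust within the given
# radius iff the maximum margin is at most -1; a non-negative optimum yields a
# concrete adversarial input, recoverable as [v.value() for v in x].
```

Under this toy formulation, a maximum robustness radius could be estimated by re-solving the query for increasing (or binary-searched) integer radii; the paper applies its own sound and complete encoding to both local robustness verification and maximum robustness radius computation.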
Main Authors: ZHANG, Yedi; ZHAO, Zhe; CHEN, Guangke; SONG, Fu; ZHANG, Min; CHEN, Taolue; SUN, Jun
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Subjects: Quantized neural network; formal verification; integer linear programming; robustness; OS and Networks; Software Engineering
Online Access: https://ink.library.smu.edu.sg/sis_research/7281 https://ink.library.smu.edu.sg/context/sis_research/article/8284/viewcontent/ase22.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-8284
Record format: dspace
Record last updated: 2022-09-22
Date published: 2022-10-01
File format: application/pdf
DOI: 10.1145/3551349.3556916
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems (InK@SMU, SMU Libraries)
Country: Singapore
Continent: Asia