DiffChaser: Detecting disagreements for deep neural networks

Bibliographic Details
Main Authors: XIE, Xiaofei, MA, Lei, WANG, Haijun, LI, Yuekang, LIU, Yang, LI, Xiaohong
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2019
Online Access:https://ink.library.smu.edu.sg/sis_research/7105
https://ink.library.smu.edu.sg/context/sis_research/article/8108/viewcontent/0800.pdf
Institution: Singapore Management University
Description
Summary: Platform migration and customization have become an indispensable part of the deep neural network (DNN) development lifecycle. A high-precision but complex DNN trained in the cloud on massive data and powerful GPUs often goes through an optimization phase (e.g., quantization, compression) before deployment to a target device (e.g., a mobile device). A test set that effectively uncovers the disagreements between a DNN and its optimized variant provides valuable feedback for debugging and further enhancing the optimization procedure. However, the minor inconsistencies between a DNN and its optimized version are often hard to detect and easily bypass the original test set. This paper proposes DiffChaser, an automated black-box testing framework to detect untargeted/targeted disagreements between version variants of a DNN. We demonstrate 1) its effectiveness by comparing it with state-of-the-art techniques, and 2) its usefulness in real-world DNN product deployment involving quantization and optimization.
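
The sketch below is not the DiffChaser algorithm itself; it is a minimal, hypothetical illustration of what an "untargeted disagreement" between a DNN and its quantized variant means: an input on which the two versions predict different labels. It uses a toy linear classifier with naive int8 weight quantization and random sampling instead of the paper's black-box search; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "full-precision" classifier: 10-dim input, 3 classes (stand-in for the original DNN).
W = rng.normal(size=(10, 3)).astype(np.float32)

def predict_float(x):
    return np.argmax(x @ W, axis=-1)

# Naive post-training quantization of the weights to int8 with a single scale
# (stand-in for the optimized/deployed variant).
scale = np.abs(W).max() / 127.0
W_q = np.round(W / scale).astype(np.int8)

def predict_quantized(x):
    return np.argmax(x @ (W_q.astype(np.float32) * scale), axis=-1)

# Look for inputs on which the two variants disagree; here by random sampling,
# whereas DiffChaser searches for such inputs systematically in a black-box manner.
x = rng.normal(size=(10_000, 10)).astype(np.float32)
disagree = predict_float(x) != predict_quantized(x)
print(f"disagreement rate on random inputs: {disagree.mean():.4%}")
print(f"example disagreeing inputs (indices): {np.flatnonzero(disagree)[:5]}")
```

Random sampling rarely surfaces these borderline inputs, which is the point made in the abstract: minor inconsistencies easily bypass an ordinary test set, motivating a dedicated search-based testing framework.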