ACSL: Adaptive correlation-driven sparsity learning for deep neural network compression

Deep convolutional neural network compression has attracted significant attention due to the need to deploy accurate models on resource-constrained edge devices. Existing techniques mostly focus on compressing networks for image-level classification, and it is not clear whether they generalize well to network architectures for more challenging pixel-level tasks, e.g., dense crowd counting or semantic segmentation. In this paper, we propose an adaptive correlation-driven sparsity learning (ACSL) framework for channel pruning that outperforms state-of-the-art methods on both image-level and pixel-level tasks. In our ACSL framework, we first quantify the data-dependent channel correlation information with a channel affinity matrix. Next, we leverage these inter-dependencies to induce sparsity into the channels with the introduced adaptive penalty strength. After removing the redundant channels, we obtain compact and efficient models with significantly fewer parameters, while maintaining performance comparable to the original models. We demonstrate the advantages of our proposed approach on three popular vision tasks, i.e., dense crowd counting, semantic segmentation, and image-level classification. The experimental results demonstrate the superiority of our framework. In particular, for crowd counting on the Mall dataset, the proposed ACSL framework reduces up to 94% of parameters (VGG16-Decoder) and 84% of FLOPs (ResNet101), while maintaining the same performance as (at times outperforming) the original model.
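The abstract sketches a three-step pipeline: quantify inter-channel correlation in a channel affinity matrix, use it to set a per-channel adaptive penalty strength during sparsity training, and finally remove the channels driven to zero. The record does not reproduce the paper's formulas, so the following is only a minimal PyTorch-style sketch under explicit assumptions: affinity is taken as the absolute Pearson correlation between channel activations, and the adaptive penalty as a redundancy-weighted L1 term on batch-normalization scale factors (a common device in channel pruning, not necessarily the authors' exact formulation). All names (`channel_affinity`, `adaptive_sparsity_loss`, `prune_mask`) and the `base_lambda` and `threshold` values are illustrative assumptions.

```python
import torch

def channel_affinity(features: torch.Tensor) -> torch.Tensor:
    """Assumed affinity: |Pearson correlation| between channel responses.

    features: (N, C, H, W) activations of one conv layer over a data batch.
    Returns a (C, C) matrix; entry (i, j) is high when channels i and j
    respond similarly, i.e. carry redundant information.
    """
    n, c, h, w = features.shape
    x = features.permute(1, 0, 2, 3).reshape(c, -1)   # (C, N*H*W)
    x = x - x.mean(dim=1, keepdim=True)               # center each channel
    x = x / (x.norm(dim=1, keepdim=True) + 1e-8)      # unit-normalize rows
    return (x @ x.t()).abs()                          # (C, C) |correlations|

def adaptive_sparsity_loss(gamma: torch.Tensor,
                           affinity: torch.Tensor,
                           base_lambda: float = 1e-4) -> torch.Tensor:
    """Redundancy-weighted L1 penalty on batch-norm scale factors `gamma`.

    Channels that correlate strongly with the rest of the layer receive a
    larger penalty, so training drives their gamma (and hence the channel)
    toward zero first. `base_lambda` is an illustrative hyperparameter.
    """
    c = gamma.numel()
    # Mean affinity to all *other* channels: higher value => more redundant.
    redundancy = (affinity.sum(dim=1) - affinity.diagonal()) / (c - 1)
    return (base_lambda * redundancy * gamma.abs()).sum()

def prune_mask(gamma: torch.Tensor, threshold: float = 1e-2) -> torch.Tensor:
    """Keep only channels whose learned scale survived sparsity training."""
    return gamma.abs() > threshold
```

In such a setup, the training objective would be the task loss plus `adaptive_sparsity_loss` summed over layers; once training converges, the channels rejected by `prune_mask` are physically removed to yield the compact model, matching the remove-redundant-channels step the abstract describes.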


Bibliographic Details
Main Authors: He, Wei; Wu, Meiqing; Lam, Siew-Kei
Other Authors: School of Computer Science and Engineering
Format: Article
Language:English
Published: 2021
Subjects: Engineering::Computer science and engineering; Network Pruning; Channel Correlation
Online Access:https://hdl.handle.net/10356/152852
Institution: Nanyang Technological University
Journal: Neural Networks, vol. 144 (2021), pp. 465-477
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2021.09.012
Citation: He, W., Wu, M. & Lam, S. (2021). ACSL: Adaptive correlation-driven sparsity learning for deep neural network compression. Neural Networks, 144, 465-477. https://dx.doi.org/10.1016/j.neunet.2021.09.012
Funding: This research project is supported by the National Research Foundation (NRF) Singapore under its Campus for Research Excellence and Technological Enterprise (CREATE) programme with the Technical University of Munich at TUMCREATE. Prof. Siew-Kei Lam is partially supported under the RIE2020 Industry Alignment Fund – Industry Collaboration Projects (IAF-ICP) Funding Initiative, as well as cash and in-kind contributions from Singapore Telecommunications Limited (Singtel), through the Singtel Cognitive and Artificial Intelligence Lab for Enterprises (SCALE@NTU).
Rights: © 2021 Elsevier Ltd. All rights reserved.