Evaluating the merits of ranking in structured network pruning


Bibliographic Details
Main Authors: Sharma, Kuldeep, Ramakrishnan, Nirmala, Prakash, Alok, Lam, Siew-Kei, Srikanthan, Thambipillai
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2021
Subjects: Engineering::Computer science and engineering; Neural Networks; Structured Pruning
Online Access: https://hdl.handle.net/10356/147716
Institution: Nanyang Technological University
Citation: Sharma, K., Ramakrishnan, N., Prakash, A., Lam, S. & Srikanthan, T. (2020). Evaluating the merits of ranking in structured network pruning. IEEE International Conference on Distributed Computing Systems (ICDCS), 2020-November, 1389-1396.
DOI: 10.1109/ICDCS47774.2020.00183
ISBN: 9781728170022
Funding: National Research Foundation (NRF); TUMCREATE
Rights: © 2020 Institute of Electrical and Electronics Engineers (IEEE). All rights reserved.

Description: Pruning of channels in trained deep neural networks (DNNs) has been widely used to implement efficient DNNs that can be deployed on embedded/mobile devices. The majority of existing techniques employ criteria-based sorting of the channels to preserve salient channels during pruning, as well as to automatically determine the pruned network architecture. However, recent studies on widely used DNNs, such as VGG-16, have shown that selecting and preserving salient channels using pruning criteria is not necessary, since the plasticity of the network allows the accuracy to be recovered through fine-tuning. In this work, we further explore the value of ranking criteria in pruning and show that if channels are removed gradually and iteratively, alternating with fine-tuning on the target dataset, ranking criteria are indeed not necessary to select redundant channels. Experimental results confirm that even a random selection of channels for pruning achieves similar accuracy. In addition, we demonstrate that even a simple pruning technique that uniformly removes channels from all layers in the network performs similarly to existing ranking-criteria-based approaches, while leading to lower inference cost (GFLOPs). Our extensive evaluations cover embedded implementations of DNNs, specifically small networks such as SqueezeNet and aggressive pruning percentages. We leverage these insights to propose a GFLOPs-aware iterative pruning strategy that does not rely on any ranking criteria and yet further lowers inference time by 15% without sacrificing accuracy.
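The gradual, ranking-free pruning idea in the abstract can be illustrated with a minimal sketch: shrink every layer by the same small fraction, alternate with a fine-tuning step, and track an approximate FLOPs count. The layer widths, the FLOPs model, and the fine-tune placeholder below are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of uniform, ranking-free iterative channel pruning.
# All names and shapes here are illustrative, not from the paper.

def conv_flops(c_in, c_out, k, h, w):
    """Approximate multiply-accumulate count for one k x k conv layer."""
    return c_in * c_out * k * k * h * w

def uniform_prune_step(channels, fraction):
    """Remove the same fraction of channels from every layer, no ranking."""
    return [max(1, int(c * (1 - fraction))) for c in channels]

def iterative_prune(channels, fraction, steps, fine_tune=lambda ch: ch):
    """Alternate small pruning steps with fine-tuning (gradual pruning)."""
    for _ in range(steps):
        channels = uniform_prune_step(channels, fraction)
        channels = fine_tune(channels)  # placeholder: recover accuracy here
    return channels

if __name__ == "__main__":
    layers = [64, 128, 256, 512]  # illustrative VGG-like layer widths
    pruned = iterative_prune(layers, 0.10, 3)
    print("pruned widths:", pruned)
```

A GFLOPs-aware variant, as proposed in the paper, would replace the fixed step count with a loop that stops once the estimated FLOPs fall below a target budget.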