ACSL: adaptive correlation-driven sparsity learning for deep neural network compression
Deep convolutional neural network compression has attracted significant attention owing to the need to deploy accurate models on resource-constrained edge devices. Existing techniques mostly focus on compressing networks for image-level classification, and it is unclear whether they generalize well to networ...
Main Authors: He, Wei; Wu, Meiqing; Lam, Siew-Kei
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2021
Online Access: https://hdl.handle.net/10356/152852
Institution: Nanyang Technological University
Similar Items
- CRIMP: compact & reliable DNN inference on in-memory processing via crossbar-aligned compression and non-ideality adaptation
  by: Huai, Shuo, et al.
  Published: (2023)
- Multiple antenna-based THz communication system with channel correlation
  by: Sharma, Shubha, et al.
  Published: (2024)
- Crossbar-aligned & integer-only neural network compression for efficient in-memory acceleration
  by: Huai, Shuo, et al.
  Published: (2023)
- Evaluating the merits of ranking in structured network pruning
  by: Sharma, Kuldeep, et al.
  Published: (2021)
- Multi-fold correlation attention network for predicting traffic speeds with heterogeneous frequency
  by: Sun, Yidan, et al.
  Published: (2022)