Crossbar-aligned & integer-only neural network compression for efficient in-memory acceleration
Crossbar-based In-Memory Computing (IMC) accelerators preload the entire Deep Neural Network (DNN) into crossbars before inference. However, devices with a limited number of crossbars cannot run inference on increasingly complex models. IMC-pruning can reduce crossbar usage, but current methods need expensive extr...
Main Authors: Huai, Shuo; Liu, Di; Luo, Xiangzhong; Chen, Hui; Liu, Weichen; Subramaniam, Ravi
Other Authors: School of Computer Science and Engineering
Format: Conference or Workshop Item
Language: English
Published: 2023
Online Access: https://hdl.handle.net/10356/165352
Institution: Nanyang Technological University
Similar Items
- CRIMP: compact & reliable DNN inference on in-memory processing via crossbar-aligned compression and non-ideality adaptation
  By: Huai, Shuo, et al.
  Published: (2023)
- A comprehensive study on optimization techniques for AMR robots recognition models
  By: Zheng, Hao Peng
  Published: (2025)
- You only search once: on lightweight differentiable architecture search for resource-constrained embedded platforms
  By: Luo, Xiangzhong, et al.
  Published: (2023)
- EdgeCompress: coupling multi-dimensional model compression and dynamic inference for EdgeAI
  By: Kong, Hao, et al.
  Published: (2023)
- Smart scissor: coupling spatial redundancy reduction and CNN compression for embedded hardware
  By: Kong, Hao, et al.
  Published: (2023)