Stochastic downsampling for cost-adjustable inference and improved regularization in convolutional networks
It is desirable to train convolutional networks (CNNs) that run more efficiently during inference. In many cases, however, the computational budget available for inference cannot be known beforehand during training, or the inference budget depends on changing real-time resource availability.
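The core idea in the abstract, randomly choosing where in the network to downsample so a single trained model can be run at several cost points, can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's actual method: the `avg_pool2x2` helper, the elementwise stand-in "blocks", and the uniform choice of downsampling point are all illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_pool2x2(x):
    """2x2 average pooling on an (H, W, C) feature map (H, W even)."""
    h, w, c = x.shape
    return x.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def forward(x, blocks, downsample_point=None):
    """Apply `blocks` in sequence, inserting a 2x2 average pool after
    block index `downsample_point` (None = no extra downsampling).
    Every block after the chosen point runs on 1/4 as many pixels,
    so the point directly controls inference cost."""
    for i, block in enumerate(blocks):
        x = block(x)
        if downsample_point == i:
            x = avg_pool2x2(x)
    return x

# Toy "blocks": channel-preserving elementwise ops standing in for conv layers.
blocks = [lambda x: np.maximum(x, 0.0)] * 3

x = rng.standard_normal((8, 8, 4))

# Training time: draw the downsampling point at random each iteration
# (-1 meaning "no downsampling"), which also acts as a regularizer.
p = int(rng.integers(-1, len(blocks)))
_ = forward(x, blocks, downsample_point=p if p >= 0 else None)

# Inference time: fix the point to match the available budget.
full = forward(x, blocks, downsample_point=None)   # shape (8, 8, 4), full cost
cheap = forward(x, blocks, downsample_point=0)     # shape (4, 4, 4), reduced cost
```

Because only the pooling position changes, no extra parameters are needed to expose the different cost/accuracy operating points.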
Main Authors: Kuen, Jason; Kong, Xiangfei; Lin, Zhe; Wang, Gang; Yin, Jianxiong; See, Simon; Tan, Yap-Peng
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language: English
Published: 2020
Online Access: https://hdl.handle.net/10356/143626
Institution: Nanyang Technological University
Similar Items
- Convolutional neural networks with dynamic regularization
  by: Wang, Yi, et al.
  Published: (2022)
- DDPMnet: All-Digital Pulse Density-Based DNN Architecture with 228 Gate Equivalents/MAC Unit, 28-TOPS/W and 1.5-TOPS/mm2 in 40nm
  by: Animesh Gupta, et al.
  Published: (2023)
- On commutativity of multidimensional downsamplers and upsamplers
  by: Khansari, Masoud R. K., et al.
  Published: (2018)
- High Throughput, Area-Efficient, and Variation-Tolerant 3D In-memory Compute System for Deep Convolutional Neural Networks
  by: Evgeny Zamburg, et al.
  Published: (2021)
- Hardware-friendly stochastic and adaptive learning in memristor convolutional neural networks
  by: Zhang, Wei, et al.
  Published: (2022)