Stochastic downsampling for cost-adjustable inference and improved regularization in convolutional networks

It is desirable to train convolutional networks (CNNs) to run more efficiently during inference. In many cases, however, the computational budget that the system has for inference cannot be known beforehand during training, or the inference budget may depend on changing real-time resource availability. It is therefore inadequate to train only inference-efficient CNNs, whose inference costs are fixed and cannot adapt to varied inference budgets. We propose a novel approach for cost-adjustable inference in CNNs: Stochastic Downsampling Point (SDPoint). During training, SDPoint applies feature map downsampling at a random point in the layer hierarchy, with a random downsampling ratio. The different stochastic downsampling configurations, known as SDPoint instances (of the same model), have computational costs different from one another, while being trained to minimize the same prediction loss. Sharing network parameters across the different instances provides a significant regularization boost. During inference, one may handpick an SDPoint instance that best fits the inference budget. The effectiveness of SDPoint, as both a cost-adjustable inference approach and a regularizer, is validated through extensive experiments on image classification.
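
The mechanism described in the abstract lends itself to a short illustration: sample a random downsampling point and ratio during training, and fix a single (point, ratio) pair at inference to match the available budget. Below is a minimal PyTorch-style sketch of that idea; the class name SDPointNet, the (0.5, 0.75) ratio set, and the per-block downsampling granularity are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal sketch of the SDPoint idea (illustrative assumptions throughout;
# not the authors' code).
import random

import torch
import torch.nn as nn
import torch.nn.functional as F

class SDPointNet(nn.Module):
    def __init__(self, blocks, ratios=(0.5, 0.75)):
        super().__init__()
        self.blocks = nn.ModuleList(blocks)  # the "layer hierarchy": ordered conv blocks
        self.ratios = ratios                 # assumed candidate downsampling ratios

    def forward(self, x, point=None, ratio=None):
        # Training: sample a random (point, ratio) pair, i.e. one "SDPoint
        # instance". point == len(self.blocks) never triggers downsampling,
        # which corresponds to the full-cost instance.
        if self.training and point is None:
            point = random.randrange(len(self.blocks) + 1)
            ratio = random.choice(self.ratios)
        for i, block in enumerate(self.blocks):
            if point == i:
                # Downsample the feature map at the sampled point; all
                # instances share the same network parameters.
                x = F.interpolate(x, scale_factor=ratio, mode='bilinear',
                                  align_corners=False)
            x = block(x)
        return x

# Usage sketch: at inference time, handpick a fixed instance that fits the
# current budget, e.g.:
#   net = SDPointNet([nn.Conv2d(3, 16, 3, padding=1),
#                     nn.Conv2d(16, 16, 3, padding=1)])
#   net.eval()
#   y = net(torch.randn(1, 3, 32, 32), point=1, ratio=0.5)
```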

Bibliographic Details
Main Authors: Kuen, Jason, Kong, Xiangfei, Lin, Zhe, Wang, Gang, Yin, Jianxiong, See, Simon, Tan, Yap-Peng
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language: English
Published: 2020
Conference: 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 7929-7938
DOI: 10.1109/CVPR.2018.00827
ISBN: 978-1-5386-6420-9
Subjects: Engineering::Electrical and electronic engineering; Network Parameters; Neural Nets
Online Access:https://hdl.handle.net/10356/143626
Institution: Nanyang Technological University
Citation: Kuen, J., Kong, X., Lin, Z., Wang, G., Yin, J., See, S., & Tan, Y.-P. (2018). Stochastic downsampling for cost-adjustable inference and improved regularization in convolutional networks. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 7929-7938. doi:10.1109/CVPR.2018.00827
Version: Accepted version (application/pdf)
Rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/CVPR.2018.00827