Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/143626
Title: Stochastic downsampling for cost-adjustable inference and improved regularization in convolutional networks
Authors: Kuen, Jason; Kong, Xiangfei; Lin, Zhe; Wang, Gang; Yin, Jianxiong; See, Simon; Tan, Yap-Peng
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2018
Source: Kuen, J., Kong, X., Lin, Z., Wang, G., Yin, J., See, S., & Tan, Y.-P. (2018). Stochastic downsampling for cost-adjustable inference and improved regularization in convolutional networks. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, 7929-7938. doi:10.1109/CVPR.2018.00827
Abstract: It is desirable to train convolutional networks (CNNs) to run more efficiently during inference. In many cases, however, the computational budget that the system has for inference cannot be known beforehand during training, or the inference budget depends on changing real-time resource availability. Thus, it is inadequate to train just inference-efficient CNNs, whose inference costs are not adjustable and cannot adapt to varied inference budgets. We propose a novel approach for cost-adjustable inference in CNNs: Stochastic Downsampling Point (SDPoint). During training, SDPoint applies feature map downsampling at a random point in the layer hierarchy, with a random downsampling ratio. The different stochastic downsampling configurations, known as SDPoint instances (of the same model), have computational costs different from each other, while being trained to minimize the same prediction loss. Sharing network parameters across different instances provides a significant regularization boost. During inference, one may handpick an SDPoint instance that best fits the inference budget. The effectiveness of SDPoint, as both a cost-adjustable inference approach and a regularizer, is validated through extensive experiments on image classification.
URI: https://hdl.handle.net/10356/143626
ISBN: 978-1-5386-6420-9
DOI: 10.1109/CVPR.2018.00827
Rights: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/CVPR.2018.00827
Fulltext Permission: open
Fulltext Availability: With Fulltext
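The training-time procedure described in the abstract (downsample the feature map at a random point in the layer hierarchy, with a random ratio) can be sketched as follows. This is an illustrative NumPy mock-up, not the authors' implementation: the function names `sdpoint_forward` and `downsample`, the candidate ratio set, and the nearest-neighbor resizing are all assumptions made for the sketch.

```python
import numpy as np


def downsample(x, ratio):
    """Nearest-neighbor downsampling of a (C, H, W) array by `ratio`
    (a stand-in for the feature-map downsampling used in the paper)."""
    c, h, w = x.shape
    nh, nw = max(1, int(h * ratio)), max(1, int(w * ratio))
    rows = (np.arange(nh) / ratio).astype(int)
    cols = (np.arange(nw) / ratio).astype(int)
    return x[:, rows][:, :, cols]


def sdpoint_forward(x, blocks, rng):
    """One training-time forward pass of an SDPoint-style network (sketch).

    x:      input feature map, shape (C, H, W)
    blocks: list of callables standing in for the network's layer blocks
    rng:    numpy random Generator

    A random insertion point and a random ratio are drawn per pass;
    point == len(blocks) means no downsampling (the full-cost instance).
    The candidate ratios here are assumed for illustration.
    """
    point = rng.integers(0, len(blocks) + 1)
    ratio = rng.choice([0.5, 0.75])
    for i, block in enumerate(blocks):
        if i == point:
            x = downsample(x, ratio)  # stochastic downsampling point
        x = block(x)
    return x
```

At inference time, the same idea becomes deterministic: fixing `point` and `ratio` selects one SDPoint instance whose cost fits the available budget.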
Appears in Collections: EEE Conference Papers
Files in This Item:
File | Size | Format
---|---|---
Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks.pdf | 1.94 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.