Abstract

Automatically designing computationally efficient neural networks has received much attention in recent years. Existing approaches either rely on network pruning or on network architecture search methods. This paper presents a new framework named network adjustment, which considers network accuracy as a function of FLOPs, so that under each network configuration one can estimate a FLOPs utilization ratio (FUR) for each layer and use it to decide whether to increase or decrease the number of channels in that layer. Note that FUR, like the gradient of a non-linear function, is accurate only in a small neighborhood of the current network. Hence, we design an iterative mechanism in which the initial network undergoes a number of steps, each with a small 'adjusting rate' that controls the change to the network. The computational overhead of the entire search process is reasonable, i.e., comparable to that of re-training the final model from scratch. Experiments on standard image classification datasets and a wide range of base networks demonstrate the effectiveness of our approach, which consistently outperforms its pruning counterpart. The code is available at https://github.com/danczs/NetworkAdjustment.
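
To make the mechanism concrete, below is a minimal Python sketch of the iterative adjustment loop the abstract describes. The names (adjust_channels, fur_estimator) and the simple above-/below-mean update rule are illustrative assumptions rather than the authors' implementation; see the linked repository for the actual code.

    # Hypothetical sketch of the iterative "network adjustment" loop.
    # The FUR estimator is a toy stand-in; in the paper, FUR is estimated
    # per layer from the trained network under the current configuration.
    from typing import Callable, List

    def adjust_channels(
        channels: List[int],
        fur_estimator: Callable[[List[int]], List[float]],
        num_steps: int = 10,
        adjust_rate: float = 0.05,
    ) -> List[int]:
        """Iteratively grow/shrink per-layer channel counts guided by FUR.

        Like a gradient, FUR is accurate only near the current configuration,
        so each step changes any layer by at most `adjust_rate`.
        """
        for _ in range(num_steps):
            fur = fur_estimator(channels)       # per-layer FLOPs utilization ratio
            mean_fur = sum(fur) / len(fur)
            for i, f in enumerate(fur):
                if f > mean_fur:                # FLOPs well spent: add channels
                    channels[i] = round(channels[i] * (1 + adjust_rate))
                else:                           # FLOPs poorly spent: remove channels
                    channels[i] = max(1, round(channels[i] * (1 - adjust_rate)))
        return channels

    if __name__ == "__main__":
        # Toy estimator that pretends earlier layers use FLOPs more efficiently.
        toy_fur = lambda ch: [1.0 / (i + 1) for i in range(len(ch))]
        print(adjust_channels([64, 128, 256, 512], toy_fur, num_steps=3))

In this sketch, a real FUR estimator would require a short training run per step, which is why the paper reports a total search cost comparable to re-training the final model from scratch.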


Original document

The different versions of the original document can be found in:

http://dx.doi.org/10.1109/cvpr42600.2020.01067
https://openaccess.thecvf.com/content_CVPR_2020/papers/Chen_Network_Adjustment_Channel_Search_Guided_by_FLOPs_Utilization_Ratio_CVPR_2020_paper.pdf
https://openaccess.thecvf.com/content_CVPR_2020/html/Chen_Network_Adjustment_Channel_Search_Guided_by_FLOPs_Utilization_Ratio_CVPR_2020_paper.html
https://ieeexplore.ieee.org/document/9156620
https://academic.microsoft.com/#/detail/3035422913

Document information

Published on 01/01/2020

Volume 2020, 2020
DOI: 10.1109/cvpr42600.2020.01067
Licence: CC BY-NC-SA
