
Electronic data

  • 2304.00280

    Accepted author manuscript, 3.06 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Text available via DOI:


Progressive Channel-Shrinking Network

Research output: Contribution to Journal/Magazine › Journal article › peer-review

  • Jianhong Pan
  • Siyuan Yang
  • Lin Geng Foo
  • Qiuhong Ke
  • Hossein Rahmani
  • Zhipeng Fan
  • Jun Liu
Journal publication date: 1/02/2024
Journal: IEEE Transactions on Multimedia
Number of pages: 11
Pages (from-to): 2016-2026
Publication status: Published
Early online date: 30/06/23
Original language: English


Salience-based channel pruning has recently driven continuous progress in network compression. In this approach, a salience mechanism serves as a metric of channel importance to guide pruning, allowing the channel width to be adjusted dynamically at run-time and thus providing a flexible pruning scheme. However, two problems emerge: a gating function is typically needed to truncate specific salience entries to zero, which destabilizes forward propagation; and the dynamic architecture incurs extra indexing cost at inference, which bottlenecks inference speed. In this paper, we propose a Progressive Channel-Shrinking (PCS) method that compresses the selected salience entries at run-time instead of roughly approximating them to zero. We also propose a Running Shrinking Policy that provides a testing-static pruning scheme, reducing the memory access cost of filter indexing. We evaluate our method on the ImageNet and CIFAR-10 datasets over two prevalent networks, ResNet and VGG, and demonstrate that PCS outperforms all baselines and achieves a state-of-the-art compression-performance trade-off. Moreover, we observe a significant and practical acceleration of inference. The code will be released upon acceptance.
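To illustrate the contrast the abstract draws between hard gating and progressive shrinking, the following is a minimal NumPy sketch. It is not the paper's actual method: the function names, the threshold-based channel selection, and the single shrinking factor `alpha` are all assumptions for illustration; the paper's PCS operates on salience entries inside the network and its shrinking schedule is more involved.

```python
import numpy as np

def gate_prune(salience, channels, threshold):
    # Hard gating (the baseline scheme the paper critiques): channels whose
    # salience falls below the threshold are truncated to exactly zero.
    mask = (salience >= threshold).astype(channels.dtype)
    return channels * mask[:, None]

def progressive_shrink(salience, channels, threshold, alpha):
    # Illustrative shrinking (assumed form, not the paper's exact rule):
    # low-salience channels are scaled by alpha (0 < alpha < 1) rather than
    # zeroed, so their contribution decays smoothly instead of being cut off.
    scale = np.where(salience >= threshold, 1.0, alpha).astype(channels.dtype)
    return channels * scale[:, None]

# Toy example: two channels of a feature map, one salient and one not.
salience = np.array([0.9, 0.1])
features = np.ones((2, 4))

gated = gate_prune(salience, features, threshold=0.5)
shrunk = progressive_shrink(salience, features, threshold=0.5, alpha=0.5)
```

Here `gated` zeroes the low-salience channel outright, while `shrunk` halves it; repeating the shrinking step drives the channel toward zero gradually, which is the intuition behind avoiding the abrupt truncation that destabilizes forward propagation.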