
Pruning convolutional neural networks with an attention mechanism for remote sensing image classification

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Pruning convolutional neural networks with an attention mechanism for remote sensing image classification. / Zhang, S.; Wu, G.; Gu, Junhua et al.
In: Electronics (Switzerland), Vol. 9, No. 8, 1209, 27.07.2020, p. 1-19.
Vancouver

Zhang S, Wu G, Gu J, Han J. Pruning convolutional neural networks with an attention mechanism for remote sensing image classification. Electronics (Switzerland). 2020 Jul 27;9(8):1-19. 1209. doi: 10.3390/electronics9081209

Author

Zhang, S. ; Wu, G. ; Gu, Junhua et al. / Pruning convolutional neural networks with an attention mechanism for remote sensing image classification. In: Electronics (Switzerland). 2020 ; Vol. 9, No. 8. pp. 1-19.

Bibtex

@article{e6e9987fd6b8402ea9455d92e8282b6c,
title = "Pruning convolutional neural networks with an attention mechanism for remote sensing image classification",
abstract = "Despite the great success of Convolutional Neural Networks (CNNs) in various visual recognition tasks, the high computational and storage costs of such deep networks impede their deployments in real-time remote sensing tasks. To this end, considerable attention has been given to the filter pruning techniques, which enable slimming deep networks with acceptable performance drops and thus implementing them on the remote sensing devices. In this paper, we propose a new scheme, termed Pruning Filter with Attention Mechanism (PFAM), to compress and accelerate traditional CNNs. In particular, a novel correlation-based filter pruning criterion, which explores the long-range dependencies among filters via an attention module, is employed to select the to-be-pruned filters. Distinct from previous methods, the less correlated filters are first pruned after the pruning stage in the current training epoch, and they are reconstructed and updated during the next training epoch. Doing so allows manipulating input data with the maximum information preserved when executing the original training strategy such that the compressed network model can be obtained without the need for the pretrained model. The proposed method is evaluated on three public remote sensing image datasets, and the experimental results demonstrate its superiority, compared to state-of-the-art baselines. Specifically, PFAM achieves a 0.67% accuracy improvement with a 40% model-size reduction on the Aerial Image Dataset (AID) dataset, which is impressive.",
keywords = "Deep feature learning, Filter pruning, Remote sensing imagery, Self-attention",
author = "S. Zhang and G. Wu and Junhua Gu and J. Han",
year = "2020",
month = jul,
day = "27",
doi = "10.3390/electronics9081209",
language = "English",
volume = "9",
pages = "1--19",
journal = "Electronics (Switzerland)",
issn = "2079-9292",
publisher = "MDPI AG",
number = "8",
}

RIS

TY - JOUR

T1 - Pruning convolutional neural networks with an attention mechanism for remote sensing image classification

AU - Zhang, S.

AU - Wu, G.

AU - Gu, Junhua

AU - Han, J.

PY - 2020/7/27

Y1 - 2020/7/27

N2 - Despite the great success of Convolutional Neural Networks (CNNs) in various visual recognition tasks, the high computational and storage costs of such deep networks impede their deployments in real-time remote sensing tasks. To this end, considerable attention has been given to the filter pruning techniques, which enable slimming deep networks with acceptable performance drops and thus implementing them on the remote sensing devices. In this paper, we propose a new scheme, termed Pruning Filter with Attention Mechanism (PFAM), to compress and accelerate traditional CNNs. In particular, a novel correlation-based filter pruning criterion, which explores the long-range dependencies among filters via an attention module, is employed to select the to-be-pruned filters. Distinct from previous methods, the less correlated filters are first pruned after the pruning stage in the current training epoch, and they are reconstructed and updated during the next training epoch. Doing so allows manipulating input data with the maximum information preserved when executing the original training strategy such that the compressed network model can be obtained without the need for the pretrained model. The proposed method is evaluated on three public remote sensing image datasets, and the experimental results demonstrate its superiority, compared to state-of-the-art baselines. Specifically, PFAM achieves a 0.67% accuracy improvement with a 40% model-size reduction on the Aerial Image Dataset (AID) dataset, which is impressive.

AB - Despite the great success of Convolutional Neural Networks (CNNs) in various visual recognition tasks, the high computational and storage costs of such deep networks impede their deployments in real-time remote sensing tasks. To this end, considerable attention has been given to the filter pruning techniques, which enable slimming deep networks with acceptable performance drops and thus implementing them on the remote sensing devices. In this paper, we propose a new scheme, termed Pruning Filter with Attention Mechanism (PFAM), to compress and accelerate traditional CNNs. In particular, a novel correlation-based filter pruning criterion, which explores the long-range dependencies among filters via an attention module, is employed to select the to-be-pruned filters. Distinct from previous methods, the less correlated filters are first pruned after the pruning stage in the current training epoch, and they are reconstructed and updated during the next training epoch. Doing so allows manipulating input data with the maximum information preserved when executing the original training strategy such that the compressed network model can be obtained without the need for the pretrained model. The proposed method is evaluated on three public remote sensing image datasets, and the experimental results demonstrate its superiority, compared to state-of-the-art baselines. Specifically, PFAM achieves a 0.67% accuracy improvement with a 40% model-size reduction on the Aerial Image Dataset (AID) dataset, which is impressive.

KW - Deep feature learning

KW - Filter pruning

KW - Remote sensing imagery

KW - Self-attention

U2 - 10.3390/electronics9081209

DO - 10.3390/electronics9081209

M3 - Journal article

VL - 9

SP - 1

EP - 19

JO - Electronics (Switzerland)

JF - Electronics (Switzerland)

SN - 2079-9292

IS - 8

M1 - 1209

ER -
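
The abstract describes PFAM's core idea: score filters by their pairwise correlations (computed via an attention module) and prune the least-correlated ones. As a rough, hypothetical illustration of a correlation-based pruning criterion (not the authors' implementation, which uses a learned attention module and epoch-wise reconstruction), one could rank flattened filter weights by cosine similarity and mark the weakest-correlated fraction for pruning:

```python
import numpy as np

def correlation_pruning_scores(filters):
    """Score each filter by how strongly it correlates with the others.

    filters: array of shape (num_filters, fan_in), i.e. each conv filter
    flattened into a row vector.
    """
    # Normalize rows so the dot products below are cosine similarities.
    f = filters / (np.linalg.norm(filters, axis=1, keepdims=True) + 1e-8)
    sim = f @ f.T                    # pairwise similarity ("attention" scores)
    np.fill_diagonal(sim, 0.0)       # ignore self-similarity
    return np.abs(sim).sum(axis=1)   # total correlation of each filter

def select_filters_to_prune(filters, prune_ratio=0.4):
    """Return indices of the least-correlated filters (the pruning candidates)."""
    scores = correlation_pruning_scores(filters)
    k = int(len(filters) * prune_ratio)
    return np.argsort(scores)[:k]    # lowest scores are pruned first
```

In the paper's scheme, pruned filters are not discarded permanently: they are reconstructed and updated in the next training epoch, which is why no pretrained model is required. The sketch above only covers the selection step.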