
Electronic data

  • remotesensing-1245064

    Accepted author manuscript, 3.19 MB, Word document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


An Adaptive Capsule Network for Hyperspectral Remote Sensing Classification

Research output: Contribution to journal › Journal article › peer-review

Published
  • Xiaohui Ding
  • Yong Li
  • Ji Yang
  • Huapeng Li
  • Lingjia Liu
  • Yangxiaoyue Liu
  • Ce Zhang
Article number: 2445
Journal publication date: 23/06/2021
Journal: Remote Sensing
Volume: 13
Issue number: 13
Number of pages: 17
Pages (from-to): 1-17
Publication status: Published
Original language: English

Abstract

The capsule network (Caps) is a novel type of neural network with great potential for hyperspectral remote sensing classification. However, Caps suffers from vanishing gradients. To address this problem, a powered activation regularization-based adaptive capsule network (PAR-ACaps) was proposed for hyperspectral remote sensing classification, in which an adaptive routing algorithm without iteration amplifies the gradient, and powered activation regularization is used to learn a sparser and more discriminative representation. The classification performance of PAR-ACaps was evaluated on two public hyperspectral remote sensing datasets, the Pavia University (PU) and Salinas (SA) datasets. The average overall classification accuracy (OA) of PAR-ACaps with a shallow architecture was measured and compared with those of the benchmarks: random forest (RF), support vector machine (SVM), one-dimensional convolutional neural network (1DCNN), two-dimensional convolutional neural network (CNN), three-dimensional convolutional neural network (3DCNN), Caps, and the original adaptive capsule network (ACaps) with comparable network architectures. The OA of PAR-ACaps was 99.51% on the PU dataset and 94.52% on the SA dataset, higher than that of every benchmark. Moreover, PAR-ACaps with relatively deeper architectures (four and six convolutional layers in the feature extraction stage) was also evaluated to demonstrate the effectiveness of gradient amplification; its classification performance on both datasets remained superior to that of 1DCNN, CNN, 3DCNN, Caps, and ACaps with comparable neural architectures. Additionally, the training time of PAR-ACaps was significantly lower than that of Caps. The proposed PAR-ACaps is, therefore, recommended as an effective alternative for hyperspectral remote sensing classification.
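To make the two ingredients of the abstract concrete, the sketch below illustrates what a single-pass (non-iterative) capsule routing step combined with a powered squash activation *might* look like. This is not the authors' implementation: the shapes, the agreement measure, and the power parameter `p` are assumptions chosen for illustration only; the standard iterative dynamic routing loop is replaced by one softmax over agreement scores.

```python
# Hedged illustration only, NOT the paper's code: a one-shot capsule
# routing step plus a "powered" squash. The agreement measure, tensor
# shapes, and the exponent p are assumptions made for this sketch.
import numpy as np


def powered_squash(s, p=1.0, eps=1e-8):
    """Squash capsule vectors to norm < 1, raising the squash
    coefficient to a power p. p > 1 suppresses weak capsules more
    strongly (a rough stand-in for an activation-sparsifying
    regularizer); p = 1 recovers the ordinary squash."""
    norm = np.linalg.norm(s, axis=-1, keepdims=True)
    scale = (norm**2 / (1.0 + norm**2)) ** p
    return scale * s / (norm + eps)


def single_pass_routing(u_hat, p=2.0):
    """u_hat: (num_in, num_out, dim) prediction vectors from the
    lower-level capsules. Couplings are computed in ONE shot from the
    agreement of each prediction with the mean prediction, instead of
    the usual iterative dynamic-routing refinement."""
    s_mean = u_hat.mean(axis=0, keepdims=True)        # (1, num_out, dim)
    logits = (u_hat * s_mean).sum(axis=-1)            # (num_in, num_out)
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    c = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    s = (c[..., None] * u_hat).sum(axis=0)            # (num_out, dim)
    return powered_squash(s, p=p)


rng = np.random.default_rng(0)
u_hat = rng.normal(size=(8, 4, 16))   # 8 input capsules, 4 output, dim 16
v = single_pass_routing(u_hat)        # output capsules, shape (4, 16)
```

Because there is no routing loop, the backward pass traverses a single softmax rather than several unrolled iterations, which is one plausible reading of how removing iteration could ease gradient flow.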