Electronic data

  • H-HNN_TGRS

    Accepted author manuscript, 2.68 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Links

Text available via DOI:

Hard-Constrained Hopfield Neural Network for Subpixel Mapping

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Article number: 5641212
Journal publication date: 31/12/2024
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 62
Publication status: E-pub ahead of print
Early online date: 19/09/2024
Original language: English

Abstract

Subpixel mapping (SPM) can address the mixed pixel problem by producing land cover maps at a finer spatial resolution than the input images. The Hopfield neural network (HNN) method has shown great advantages in SPM, and various extended versions have been developed recently. However, a long-standing issue with the HNN, especially in the multiclass scenario, is its tendency to fall into local optima with vanishing gradients, failing to push subpixels to the hard class labels of 0 or 1. This can lead to considerable uncertainty in determining hard class labels and, moreover, to the disappearance of many small land cover features and spatial details. In this article, we propose a hard-constrained HNN (H-HNN) model that introduces hard label-based constraints at both the subpixel and coarse pixel scales. These constraints aim to increase the accuracy of SPM by guiding the optimization process fully toward hard classification maps at the subpixel level. Experimental evaluations against benchmark methods demonstrated the effectiveness of the H-HNN. The findings reveal that the H-HNN is a general and robust alternative to the HNN, increasing the overall accuracy of the SPM results by about 1%. In addition, the H-HNN can effectively reduce uncertainty by predicting more accurate hard class labels and coarse proportions (with the root-mean-square error (RMSE) of the coarse proportions decreased by about 0.015).
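To make the idea concrete, the following is a minimal, single-class sketch of an HNN-style SPM update with an added hard-label penalty. It is not the authors' formulation: the function `hnn_spm_step`, its weights, and the `v(1-v)`-style hard-label term are illustrative assumptions; the paper's actual constraints operate on multiclass maps at both subpixel and coarse pixel scales.

```python
import numpy as np

def hnn_spm_step(v, coarse_props, zoom, lr=0.1, w_goal=1.0, w_prop=1.0, w_hard=0.5):
    """One toy gradient-style update for single-class subpixel mapping.

    v            : (H, W) soft subpixel labels in [0, 1]
    coarse_props : (H//zoom, W//zoom) observed class proportions per coarse pixel
    zoom         : spatial resolution ratio between subpixel and coarse grids

    Hypothetical stand-in for the H-HNN idea: alongside the usual spatial
    goal and coarse-proportion constraint, a hard-label term pushes each
    subpixel value toward 0 or 1, so the output approaches a hard map.
    """
    H, W = v.shape

    # Spatial clustering goal: move each subpixel toward its 4-neighbour mean.
    pad = np.pad(v, 1, mode="edge")
    neigh = (pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    g_goal = neigh - v

    # Coarse pixel constraint: each zoom x zoom block should average
    # to the observed coarse proportion.
    blocks = v.reshape(H // zoom, zoom, W // zoom, zoom).mean(axis=(1, 3))
    g_prop = np.kron(coarse_props - blocks, np.ones((zoom, zoom)))

    # Hard-label penalty (assumption): drive v toward the nearer of {0, 1},
    # counteracting convergence to intermediate (soft) values.
    g_hard = np.where(v >= 0.5, 1.0 - v, -v)

    v_new = v + lr * (w_goal * g_goal + w_prop * g_prop + w_hard * g_hard)
    return np.clip(v_new, 0.0, 1.0)
```

Iterating this step on proportions derived from a coarse image would move intermediate values such as 0.9 further toward 1 and values such as 0.1 further toward 0, which is the behaviour the hard constraints are designed to encourage.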