
Electronic data

  • SSJDL_manuscript_Ce_accepted

    Rights statement: This is the author’s version of a work that was accepted for publication in Remote Sensing of Environment. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Remote Sensing of Environment, 237, 2020. DOI: 10.1016/j.rse.2019.111593

    Accepted author manuscript, 3.35 MB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License


Scale Sequence Joint Deep Learning (SS-JDL) for land use and land cover classification

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Article number: 111593
Journal publication date: 28/02/2020
Journal: Remote Sensing of Environment
Volume: 237
Number of pages: 16
Publication status: Published
Early online date: 13/12/2019
Original language: English

Abstract

Choosing appropriate scales for remotely sensed image classification is extremely important, yet remains an open question for deep convolutional neural networks (CNN), because spatial scale (i.e., input patch size) affects the recognition of ground objects. Currently, optimal scale selection is cumbersome and time-consuming, requiring repetitive trial-and-error experiments, which significantly reduces the practical utility of the corresponding classification methods. This issue is crucial when trying to classify large-scale land use (LU) and land cover (LC) jointly (Zhang et al., 2019). In this paper, a simple and parsimonious scale sequence joint deep learning (SS-JDL) method is proposed for joint LU and LC classification, in which a sequence of scales is embedded in the iterative process of fitting the joint distribution implicit in the joint deep learning (JDL) method, thus replacing the previous paradigm of scale selection. The sequence of scales, derived autonomously and used to define the CNN input patch sizes, provides consecutive information transmission from small-scale features to large-scale representations, and from simple LC states to complex LU characterisations. The effectiveness of the novel SS-JDL method was tested on aerial digital photography of three complex and heterogeneous landscapes, two in Southern England (Bournemouth and Southampton) and one in North West England (Manchester). Benchmark comparisons were provided in the form of a range of LU and LC methods, including the state-of-the-art JDL method. The experimental results demonstrated that SS-JDL consistently outperformed all of the state-of-the-art baselines in terms of both LU and LC classification accuracy, as well as computational efficiency. The proposed SS-JDL method, therefore, represents a fast and effective implementation of the state-of-the-art JDL method.
By creating a single, unifying joint distribution framework for classifying higher-order feature representations, including LU, the SS-JDL method has the potential to transform the classification paradigm in remote sensing, and in machine learning more generally.
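The core mechanism described above, a sequence of patch sizes feeding a CNN from small scales to large, can be sketched in outline. This is a minimal illustration only: the spacing rule (geometric), the bounds, and the function names are assumptions, not the autonomous derivation described in the paper, and the CNN itself is elided.

```python
import numpy as np

def scale_sequence(s_min, s_max, n):
    """Derive n patch sizes from s_min to s_max (hypothetical geometric
    spacing; the paper derives its sequence autonomously)."""
    sizes = np.geomspace(s_min, s_max, n)
    return [int(round(s)) for s in sizes]

def extract_patch(image, row, col, size):
    """Extract a size x size patch centred on (row, col), reflecting
    at the image borders so every pixel has a valid patch."""
    half = size // 2
    padded = np.pad(image, ((half, half), (half, half)), mode="reflect")
    # The centre pixel sits at (row + half, col + half) in the padded
    # array, so the patch starts at (row, col).
    return padded[row:row + size, col:col + size]

image = np.arange(100 * 100, dtype=float).reshape(100, 100)
for size in scale_sequence(8, 64, 4):          # e.g. patches of 8, 16, 32, 64
    patch = extract_patch(image, 50, 50, size)
    # In SS-JDL, each successive patch would be the CNN input at that
    # scale, passing information from LC states up to LU representations.
```

Each iteration hands the classifier a progressively wider spatial context, which is the intuition behind moving from small-scale LC features to large-scale LU characterisations without a separate scale-selection step.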
