
Electronic data

  • ABCNet_clean

    Accepted author manuscript, 7.43 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


ABCNet: Attentive bilateral contextual network for efficient semantic segmentation of Fine-Resolution remotely sensed imagery

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published
Journal publication date: 30/11/2021
Journal: ISPRS Journal of Photogrammetry and Remote Sensing
Volume: 181
Number of pages: 15
Pages (from-to): 84-98
Publication status: Published
Early online date: 16/09/2021
Original language: English

Abstract

Semantic segmentation of remotely sensed imagery plays a critical role in many real-world applications, such as environmental change monitoring, precision agriculture, environmental protection, and economic assessment. Following rapid developments in sensor technologies, vast numbers of fine-resolution satellite and airborne remote sensing images are now available, for which semantic segmentation is potentially a valuable method. However, because of the rich complexity and heterogeneity of information provided at ever-increasing spatial resolution, state-of-the-art deep learning algorithms commonly adopt complex network structures for segmentation, which often result in significant computational demand. In particular, the frequently used fully convolutional network (FCN) relies heavily on fine-grained spatial detail (fine spatial resolution) and contextual information (large receptive fields), both of which impose high computational costs. This impedes the practical utility of the FCN for real-world applications, especially those requiring real-time data processing. In this paper, we propose a novel Attentive Bilateral Contextual Network (ABCNet), a lightweight convolutional neural network (CNN) with a spatial path and a contextual path. Extensive experiments, including a comprehensive ablation study, demonstrate that ABCNet has strong discrimination capability, with accuracy competitive with state-of-the-art benchmark methods while achieving significantly greater computational efficiency. Specifically, the proposed ABCNet achieves a 91.3% overall accuracy (OA) on the Potsdam test dataset and significantly outperforms all lightweight benchmark methods. The code is freely available at https://github.com/lironui/ABCNet.
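The bilateral (two-path) idea outlined in the abstract, a shallow path that preserves fine spatial detail alongside a deep path that aggregates context before the two are fused, can be illustrated with a short sketch. The code below is a minimal, assumption-laden toy in PyTorch, not the authors' ABCNet: the module names (SpatialPath, ContextualPath, BilateralSegNet), the channel widths, and the simple squeeze-and-excitation style channel attention are illustrative stand-ins chosen for brevity; the actual architecture and attention mechanism are in the linked repository.

```python
import torch
import torch.nn as nn


def conv_bn_relu(in_ch, out_ch, stride=1):
    """3x3 convolution followed by batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class SpatialPath(nn.Module):
    """Shallow, wide path: keeps fine spatial detail at 1/8 resolution."""
    def __init__(self, out_ch=128):
        super().__init__()
        self.layers = nn.Sequential(
            conv_bn_relu(3, 64, stride=2),        # 1/2
            conv_bn_relu(64, 64, stride=2),       # 1/4
            conv_bn_relu(64, out_ch, stride=2),   # 1/8
        )

    def forward(self, x):
        return self.layers(x)


class ContextualPath(nn.Module):
    """Deeper, aggressively downsampled path with a simple channel-attention
    block (a hypothetical stand-in for the paper's attention mechanism)."""
    def __init__(self, out_ch=128):
        super().__init__()
        self.backbone = nn.Sequential(
            conv_bn_relu(3, 32, stride=2),         # 1/2
            conv_bn_relu(32, 64, stride=2),        # 1/4
            conv_bn_relu(64, 128, stride=2),       # 1/8
            conv_bn_relu(128, out_ch, stride=2),   # 1/16
        )
        # Squeeze-and-excitation style re-weighting by global context.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(out_ch, out_ch, 1),
            nn.Sigmoid(),
        )
        self.up = nn.Upsample(scale_factor=2, mode="bilinear",
                              align_corners=False)

    def forward(self, x):
        feat = self.backbone(x)
        feat = feat * self.attn(feat)   # re-weight channels by global context
        return self.up(feat)            # back to 1/8 to match the spatial path


class BilateralSegNet(nn.Module):
    """Toy bilateral segmentation network: fuse both paths, then predict."""
    def __init__(self, num_classes=6, ch=128):
        super().__init__()
        self.spatial = SpatialPath(ch)
        self.context = ContextualPath(ch)
        self.fuse = conv_bn_relu(2 * ch, ch)
        self.head = nn.Conv2d(ch, num_classes, 1)
        self.up = nn.Upsample(scale_factor=8, mode="bilinear",
                              align_corners=False)

    def forward(self, x):
        fused = torch.cat([self.spatial(x), self.context(x)], dim=1)
        return self.up(self.head(self.fuse(fused)))


if __name__ == "__main__":
    net = BilateralSegNet(num_classes=6)           # e.g. 6 ISPRS land-cover classes
    out = net(torch.randn(1, 3, 256, 256))
    print(out.shape)                               # torch.Size([1, 6, 256, 256])
```

The sketch only shows why the design is cheap: the detail-preserving path stays shallow, the context path shrinks the feature map before applying attention, and the two are fused once at 1/8 resolution before upsampling to the full image size.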