
Electronic data

  • SSL_revision1

    Rights statement: ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 7.14 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification. / Yue, Jun; Fang, Leyuan; Rahmani, Hossein et al.
In: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, 5501813, 01.01.2022.


Harvard

Yue, J, Fang, L, Rahmani, H & Ghamisi, P 2022, 'Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification', IEEE Transactions on Geoscience and Remote Sensing, vol. 60, 5501813. https://doi.org/10.1109/TGRS.2021.3057768

APA

Yue, J., Fang, L., Rahmani, H., & Ghamisi, P. (2022). Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification. IEEE Transactions on Geoscience and Remote Sensing, 60, Article 5501813. https://doi.org/10.1109/TGRS.2021.3057768

Vancouver

Yue J, Fang L, Rahmani H, Ghamisi P. Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification. IEEE Transactions on Geoscience and Remote Sensing. 2022 Jan 1;60:5501813. Epub 2021 Feb 22. doi: 10.1109/TGRS.2021.3057768

Author

Yue, Jun ; Fang, Leyuan ; Rahmani, Hossein et al. / Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification. In: IEEE Transactions on Geoscience and Remote Sensing. 2022 ; Vol. 60.

Bibtex

@article{b12a813269b346e58aa8205c4051d758,
title = "Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification",
abstract = "Hyperspectral image (HSI) classification is an important topic in the community of remote sensing, which has a wide range of applications in geoscience. Recently, deep learning-based methods have been widely used in HSI classification. However, due to the scarcity of labeled samples in HSI, the potential of deep learning-based methods has not been fully exploited. To solve this problem, a self-supervised learning (SSL) method with adaptive distillation is proposed to train the deep neural network with extensive unlabeled samples. The proposed method consists of two modules: adaptive knowledge distillation with spatial-spectral similarity and 3-D transformation on HSI cubes. The SSL with adaptive knowledge distillation uses the self-supervised information to train the network by knowledge distillation, where self-supervised knowledge is the adaptive soft label generated by spatial-spectral similarity measurement. The SSL with adaptive knowledge distillation mainly includes the following three steps. First, the similarity between unlabeled samples and object classes in HSI is generated based on the spatial-spectral joint distance (SSJD) between unlabeled samples and labeled samples. Second, the adaptive soft label of each unlabeled sample is generated to measure the probability that the unlabeled sample belongs to each object class. Third, a progressive convolutional network (PCN) is trained by minimizing the cross-entropy between the adaptive soft labels and the probabilities generated by the forward propagation of the PCN. The SSL with 3-D transformation rotates the HSI cube in both the spectral domain and the spatial domain to fully exploit the labeled samples. Experiments on three public HSI data sets have demonstrated that the proposed method can achieve better performance than existing state-of-the-art methods.",
author = "Jun Yue and Leyuan Fang and Hossein Rahmani and Pedram Ghamisi",
note = "{\textcopyright}2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. ",
year = "2022",
month = jan,
day = "1",
doi = "10.1109/TGRS.2021.3057768",
language = "English",
volume = "60",
journal = "IEEE Transactions on Geoscience and Remote Sensing",
issn = "0196-2892",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}
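The abstract's first module generates adaptive soft labels in three steps: measure a spatial-spectral joint distance (SSJD) between unlabeled and labeled samples, convert those distances into per-class probabilities, and train the network against them with cross-entropy. The sketch below illustrates the first two steps only; the SSJD weighting scheme, the distance-to-probability mapping, and all parameter names (`beta`, `tau`) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def ssjd(unlabeled, labeled, coords_u, coords_l, beta=0.5):
    """Toy spatial-spectral joint distance: a weighted sum of spectral
    Euclidean distance and spatial pixel distance (the weighting is an
    assumption, not the paper's exact SSJD definition)."""
    spec = np.linalg.norm(unlabeled[:, None, :] - labeled[None, :, :], axis=-1)
    spat = np.linalg.norm(coords_u[:, None, :].astype(float)
                          - coords_l[None, :, :].astype(float), axis=-1)
    return beta * spec + (1.0 - beta) * spat

def adaptive_soft_labels(dist, labels, n_classes, tau=1.0):
    """Turn per-labeled-sample distances into per-class probabilities:
    take the minimum distance to each class, then apply a softmax over
    negative distances with temperature tau."""
    class_dist = np.stack(
        [dist[:, labels == c].min(axis=1) for c in range(n_classes)], axis=1)
    logits = -class_dist / tau
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

# Tiny demo: 4 unlabeled and 6 labeled "pixels" with 8 spectral bands.
rng = np.random.default_rng(0)
U, L = rng.normal(size=(4, 8)), rng.normal(size=(6, 8))
cu = rng.integers(0, 10, size=(4, 2))
cl = rng.integers(0, 10, size=(6, 2))
y = np.array([0, 0, 1, 1, 2, 2])
soft = adaptive_soft_labels(ssjd(U, L, cu, cl), y, n_classes=3)
```

Each row of `soft` is a probability distribution over the three classes; in the paper these soft labels supply the targets for the cross-entropy loss that trains the progressive convolutional network (PCN).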

RIS

TY - JOUR

T1 - Self-Supervised Learning With Adaptive Distillation for Hyperspectral Image Classification

AU - Yue, Jun

AU - Fang, Leyuan

AU - Rahmani, Hossein

AU - Ghamisi, Pedram

N1 - ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PY - 2022/1/1

Y1 - 2022/1/1

N2 - Hyperspectral image (HSI) classification is an important topic in the community of remote sensing, which has a wide range of applications in geoscience. Recently, deep learning-based methods have been widely used in HSI classification. However, due to the scarcity of labeled samples in HSI, the potential of deep learning-based methods has not been fully exploited. To solve this problem, a self-supervised learning (SSL) method with adaptive distillation is proposed to train the deep neural network with extensive unlabeled samples. The proposed method consists of two modules: adaptive knowledge distillation with spatial-spectral similarity and 3-D transformation on HSI cubes. The SSL with adaptive knowledge distillation uses the self-supervised information to train the network by knowledge distillation, where self-supervised knowledge is the adaptive soft label generated by spatial-spectral similarity measurement. The SSL with adaptive knowledge distillation mainly includes the following three steps. First, the similarity between unlabeled samples and object classes in HSI is generated based on the spatial-spectral joint distance (SSJD) between unlabeled samples and labeled samples. Second, the adaptive soft label of each unlabeled sample is generated to measure the probability that the unlabeled sample belongs to each object class. Third, a progressive convolutional network (PCN) is trained by minimizing the cross-entropy between the adaptive soft labels and the probabilities generated by the forward propagation of the PCN. The SSL with 3-D transformation rotates the HSI cube in both the spectral domain and the spatial domain to fully exploit the labeled samples. Experiments on three public HSI data sets have demonstrated that the proposed method can achieve better performance than existing state-of-the-art methods.

AB - Hyperspectral image (HSI) classification is an important topic in the community of remote sensing, which has a wide range of applications in geoscience. Recently, deep learning-based methods have been widely used in HSI classification. However, due to the scarcity of labeled samples in HSI, the potential of deep learning-based methods has not been fully exploited. To solve this problem, a self-supervised learning (SSL) method with adaptive distillation is proposed to train the deep neural network with extensive unlabeled samples. The proposed method consists of two modules: adaptive knowledge distillation with spatial-spectral similarity and 3-D transformation on HSI cubes. The SSL with adaptive knowledge distillation uses the self-supervised information to train the network by knowledge distillation, where self-supervised knowledge is the adaptive soft label generated by spatial-spectral similarity measurement. The SSL with adaptive knowledge distillation mainly includes the following three steps. First, the similarity between unlabeled samples and object classes in HSI is generated based on the spatial-spectral joint distance (SSJD) between unlabeled samples and labeled samples. Second, the adaptive soft label of each unlabeled sample is generated to measure the probability that the unlabeled sample belongs to each object class. Third, a progressive convolutional network (PCN) is trained by minimizing the cross-entropy between the adaptive soft labels and the probabilities generated by the forward propagation of the PCN. The SSL with 3-D transformation rotates the HSI cube in both the spectral domain and the spatial domain to fully exploit the labeled samples. Experiments on three public HSI data sets have demonstrated that the proposed method can achieve better performance than existing state-of-the-art methods.

U2 - 10.1109/TGRS.2021.3057768

DO - 10.1109/TGRS.2021.3057768

M3 - Journal article

VL - 60

JO - IEEE Transactions on Geoscience and Remote Sensing

JF - IEEE Transactions on Geoscience and Remote Sensing

SN - 0196-2892

M1 - 5501813

ER -
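The abstract's second module is a self-supervised pretext task that rotates HSI cubes in both the spatial and spectral domains. A minimal sketch of such a transformation set is shown below; the exact transformations used in the paper may differ, and the eight-view enumeration here is an illustrative assumption.

```python
import numpy as np

def cube_transforms(cube):
    """Enumerate simple 3-D transformations of an HSI cube of shape
    (H, W, Bands): the four spatial-plane rotations, each paired with
    the band axis in forward or reversed order (an illustrative
    pretext-task set, not the paper's exact transformations)."""
    views = []
    for k in range(4):                        # spatial-domain rotations
        rot = np.rot90(cube, k, axes=(0, 1))
        views.append(rot)                     # original band order
        views.append(rot[:, :, ::-1])         # spectral-domain reversal
    return views

# Demo on a tiny 2x2 cube with 3 spectral bands.
cube = np.arange(2 * 2 * 3).reshape(2, 2, 3)
views = cube_transforms(cube)
```

In a rotation-prediction pretext task, the network would be trained to identify which of these transformations produced a given view, letting labeled cubes be reused many times over.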