OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution. / Zhao, Qihao; Zhang, Fan; Hu, Wei et al.
In: IEEE Transactions on Circuits and Systems for Video Technology, Vol. 34, No. 5, 01.05.2024, p. 3806-3818.

Harvard

Zhao, Q, Zhang, F, Hu, W, Feng, S & Liu, J 2024, 'OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution', IEEE Transactions on Circuits and Systems for Video Technology, vol. 34, no. 5, pp. 3806-3818. https://doi.org/10.1109/TCSVT.2023.3321733

APA

Zhao, Q., Zhang, F., Hu, W., Feng, S., & Liu, J. (2024). OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution. IEEE Transactions on Circuits and Systems for Video Technology, 34(5), 3806-3818. https://doi.org/10.1109/TCSVT.2023.3321733

Vancouver

Zhao Q, Zhang F, Hu W, Feng S, Liu J. OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution. IEEE Transactions on Circuits and Systems for Video Technology. 2024 May 1;34(5):3806-3818. Epub 2023 Oct 4. doi: 10.1109/TCSVT.2023.3321733

Author

Zhao, Qihao; Zhang, Fan; Hu, Wei et al. / OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution. In: IEEE Transactions on Circuits and Systems for Video Technology. 2024; Vol. 34, No. 5. pp. 3806-3818.

Bibtex

@article{cf467649b8dc4f7ba84d9527ae5f75bb,
title = "OHD: An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution",
abstract = "Recently, many effective methods have emerged to address the robustness problem of Deep Neural Networks (DNNs) trained with noisy labels. However, existing work on learning with noisy labels (LNL) mainly focuses on balanced datasets, while real-world scenarios usually also exhibit a long-tailed distribution (LTD). In this paper, we propose an online category-aware approach to mitigate the impact of noisy labels and LTD on the robustness of DNNs. First, the category frequency of clean samples used to rebalance the feature space cannot be obtained directly in the presence of noisy samples. We design a novel category-aware Online Joint Distribution to dynamically estimate the category frequency of clean samples. Second, previous LNL methods were category-agnostic. These methods would easily be confused with noisy samples and tail categories' samples under LTD. Based on this observation, we propose a Harmonizing Factor strategy to exploit more information from the category-aware online joint distribution. This strategy provides more accurate estimates of clean samples between noisy samples and samples with tail categories. Finally, we propose Dynamic Cost-sensitive Learning, which utilizes the loss and category frequency of the estimated clean samples to address both LNL and LTD. Compared to extensive state-of-the-art methods, our strategy consistently improves the generalization performance of DNNs on several synthetic datasets and two real-world datasets.",
keywords = "Deep neural networks, image classification, learning with noisy labels, long-tailed distribution",
author = "Qihao Zhao and Fan Zhang and Wei Hu and Songhe Feng and Jun Liu",
year = "2024",
month = may,
day = "1",
doi = "10.1109/TCSVT.2023.3321733",
language = "English",
volume = "34",
pages = "3806--3818",
journal = "IEEE Transactions on Circuits and Systems for Video Technology",
issn = "1051-8215",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "5",

}

RIS

TY - JOUR

T1 - OHD

T2 - An Online Category-Aware Framework for Learning with Noisy Labels under Long-Tailed Distribution

AU - Zhao, Qihao

AU - Zhang, Fan

AU - Hu, Wei

AU - Feng, Songhe

AU - Liu, Jun

PY - 2024/5/1

Y1 - 2024/5/1

N2 - Recently, many effective methods have emerged to address the robustness problem of Deep Neural Networks (DNNs) trained with noisy labels. However, existing work on learning with noisy labels (LNL) mainly focuses on balanced datasets, while real-world scenarios usually also exhibit a long-tailed distribution (LTD). In this paper, we propose an online category-aware approach to mitigate the impact of noisy labels and LTD on the robustness of DNNs. First, the category frequency of clean samples, which is used to rebalance the feature space, cannot be obtained directly in the presence of noisy samples. We therefore design a novel category-aware Online Joint Distribution to dynamically estimate the category frequency of clean samples. Second, previous LNL methods were category-agnostic and, under LTD, easily confuse noisy samples with samples from tail categories. Based on this observation, we propose a Harmonizing Factor strategy that exploits more information from the category-aware online joint distribution and identifies clean samples more accurately among noisy samples and tail-category samples. Finally, we propose Dynamic Cost-sensitive Learning, which utilizes the loss and category frequency of the estimated clean samples to address both LNL and LTD. Compared with a wide range of state-of-the-art methods, our strategy consistently improves the generalization performance of DNNs on several synthetic datasets and two real-world datasets.

AB - Recently, many effective methods have emerged to address the robustness problem of Deep Neural Networks (DNNs) trained with noisy labels. However, existing work on learning with noisy labels (LNL) mainly focuses on balanced datasets, while real-world scenarios usually also exhibit a long-tailed distribution (LTD). In this paper, we propose an online category-aware approach to mitigate the impact of noisy labels and LTD on the robustness of DNNs. First, the category frequency of clean samples, which is used to rebalance the feature space, cannot be obtained directly in the presence of noisy samples. We therefore design a novel category-aware Online Joint Distribution to dynamically estimate the category frequency of clean samples. Second, previous LNL methods were category-agnostic and, under LTD, easily confuse noisy samples with samples from tail categories. Based on this observation, we propose a Harmonizing Factor strategy that exploits more information from the category-aware online joint distribution and identifies clean samples more accurately among noisy samples and tail-category samples. Finally, we propose Dynamic Cost-sensitive Learning, which utilizes the loss and category frequency of the estimated clean samples to address both LNL and LTD. Compared with a wide range of state-of-the-art methods, our strategy consistently improves the generalization performance of DNNs on several synthetic datasets and two real-world datasets.

KW - Deep neural networks

KW - image classification

KW - learning with noisy labels

KW - long-tailed distribution

U2 - 10.1109/TCSVT.2023.3321733

DO - 10.1109/TCSVT.2023.3321733

M3 - Journal article

AN - SCOPUS:85174848345

VL - 34

SP - 3806

EP - 3818

JO - IEEE Transactions on Circuits and Systems for Video Technology

JF - IEEE Transactions on Circuits and Systems for Video Technology

SN - 1051-8215

IS - 5

ER -