
HARDer-Net: Hardness-Guided Discrimination Network for 3D Early Activity Prediction

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print

Standard

HARDer-Net: Hardness-Guided Discrimination Network for 3D Early Activity Prediction. / Li, Tianjiao; Luo, Yang; Zhang, Wei et al.
In: IEEE Transactions on Circuits and Systems for Video Technology, 16.07.2024.

Harvard

Li, T, Luo, Y, Zhang, W, Duan, L & Liu, J 2024, 'HARDer-Net: Hardness-Guided Discrimination Network for 3D Early Activity Prediction', IEEE Transactions on Circuits and Systems for Video Technology. https://doi.org/10.1109/TCSVT.2024.3429182

APA

Li, T., Luo, Y., Zhang, W., Duan, L., & Liu, J. (2024). HARDer-Net: Hardness-Guided Discrimination Network for 3D Early Activity Prediction. IEEE Transactions on Circuits and Systems for Video Technology. Advance online publication. https://doi.org/10.1109/TCSVT.2024.3429182

Vancouver

Li T, Luo Y, Zhang W, Duan L, Liu J. HARDer-Net: Hardness-Guided Discrimination Network for 3D Early Activity Prediction. IEEE Transactions on Circuits and Systems for Video Technology. 2024 Jul 16. Epub 2024 Jul 16. doi: 10.1109/TCSVT.2024.3429182

Author

Li, Tianjiao ; Luo, Yang ; Zhang, Wei et al. / HARDer-Net : Hardness-Guided Discrimination Network for 3D Early Activity Prediction. In: IEEE Transactions on Circuits and Systems for Video Technology. 2024.

Bibtex

@article{482b0d4dc585427bb281169fbea98245,
title = "HARDer-Net: Hardness-Guided Discrimination Network for 3D Early Activity Prediction",
abstract = "To predict the class label from a partially observable activity sequence can be quite challenging due to the high degree of similarity existing in early segments of different activities. In this paper, an innovative HARDness-Guided Discrimination Network (HARDer-Net) is proposed to evaluate the relationship between similar activity pairs that are extremely hard to discriminate. To train our HARDer-Net, an innovative adversarial learning scheme has been designed, providing our network with the strength to extract subtle discrimination information for the prediction of 3D early activities. Moreover, to enhance the adversarial learning scheme efficacy of our model for 3D early action prediction, we construct a Hardness-Guided bank that dynamically records the hard similar samples and conducts reward-guided selections of these recorded hard samples using a deep reinforcement learning scheme. The proposed method significantly enhances the capability of the model to discern fine-grained differences in early activity sequences. Several widely-used activity datasets are used to evaluate our proposed HARDer-Net, and we achieve state-of-the-art performance across all the evaluated datasets.",
author = "Tianjiao Li and Yang Luo and Wei Zhang and Lingyu Duan and Jun Liu",
year = "2024",
month = jul,
day = "16",
doi = "10.1109/TCSVT.2024.3429182",
language = "English",
journal = "IEEE Transactions on Circuits and Systems for Video Technology",
issn = "1051-8215",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

RIS

TY - JOUR

T1 - HARDer-Net

T2 - Hardness-Guided Discrimination Network for 3D Early Activity Prediction

AU - Li, Tianjiao

AU - Luo, Yang

AU - Zhang, Wei

AU - Duan, Lingyu

AU - Liu, Jun

PY - 2024/7/16

Y1 - 2024/7/16

N2 - Predicting the class label of a partially observed activity sequence can be quite challenging due to the high degree of similarity among the early segments of different activities. In this paper, an innovative HARDness-Guided Discrimination Network (HARDer-Net) is proposed to evaluate the relationship between similar activity pairs that are extremely hard to discriminate. To train our HARDer-Net, an innovative adversarial learning scheme is designed, providing our network with the ability to extract subtle discriminative information for the prediction of 3D early activities. Moreover, to enhance the efficacy of this adversarial learning scheme for 3D early action prediction, we construct a Hardness-Guided bank that dynamically records hard, similar samples and conducts reward-guided selection of these recorded hard samples via a deep reinforcement learning scheme. The proposed method significantly enhances the model's capability to discern fine-grained differences in early activity sequences. Several widely used activity datasets are used to evaluate our proposed HARDer-Net, and we achieve state-of-the-art performance across all the evaluated datasets.

AB - Predicting the class label of a partially observed activity sequence can be quite challenging due to the high degree of similarity among the early segments of different activities. In this paper, an innovative HARDness-Guided Discrimination Network (HARDer-Net) is proposed to evaluate the relationship between similar activity pairs that are extremely hard to discriminate. To train our HARDer-Net, an innovative adversarial learning scheme is designed, providing our network with the ability to extract subtle discriminative information for the prediction of 3D early activities. Moreover, to enhance the efficacy of this adversarial learning scheme for 3D early action prediction, we construct a Hardness-Guided bank that dynamically records hard, similar samples and conducts reward-guided selection of these recorded hard samples via a deep reinforcement learning scheme. The proposed method significantly enhances the model's capability to discern fine-grained differences in early activity sequences. Several widely used activity datasets are used to evaluate our proposed HARDer-Net, and we achieve state-of-the-art performance across all the evaluated datasets.

U2 - 10.1109/TCSVT.2024.3429182

DO - 10.1109/TCSVT.2024.3429182

M3 - Journal article

JO - IEEE Transactions on Circuits and Systems for Video Technology

JF - IEEE Transactions on Circuits and Systems for Video Technology

SN - 1051-8215

ER -
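
Note: the abstract describes a "Hardness-Guided bank" that dynamically records hard, similar samples and selects among them for training. The Python sketch below is only an illustration of that general idea, not the authors' implementation: the class name HardnessBank, its fields, and the eviction rule are invented for this example, and the paper's reward-guided reinforcement-learning selection is simplified here to hardness-proportional random sampling.

# Minimal, illustrative sketch of a hardness-guided sample bank (assumptions only).
import random
from dataclasses import dataclass, field

@dataclass
class HardnessBank:
    capacity: int = 1024
    pairs: list = field(default_factory=list)      # stored (anchor_id, confuser_id) pairs
    hardness: list = field(default_factory=list)   # one hardness score per stored pair

    def record(self, anchor_id, confuser_id, score):
        """Store a hard pair; evict the easiest pair when the bank is full."""
        if len(self.pairs) >= self.capacity:
            easiest = min(range(len(self.hardness)), key=self.hardness.__getitem__)
            self.pairs.pop(easiest)
            self.hardness.pop(easiest)
        self.pairs.append((anchor_id, confuser_id))
        self.hardness.append(score)

    def sample(self, k):
        """Draw k pairs, favouring higher hardness scores (stand-in for RL selection)."""
        k = min(k, len(self.pairs))
        return random.choices(self.pairs, weights=self.hardness, k=k)

if __name__ == "__main__":
    bank = HardnessBank(capacity=4)
    # The hardness score could be, e.g., how confusable two early activity segments are.
    bank.record("drink_water_007", "brush_teeth_012", score=0.9)
    bank.record("reading_003", "writing_019", score=0.7)
    bank.record("clapping_001", "rub_hands_005", score=0.2)
    print(bank.sample(2))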