
Electronic data

  • Human Action Recognition using Transfer Learning with Deep Representations_Revised

    Rights statement: ©2017 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 588 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Human action recognition using transfer learning with deep representations

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Human action recognition using transfer learning with deep representations. / Bux, Allah; Wang, Xiaofeng; Angelov, Plamen Parvanov et al.
2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017.


Vancouver

Bux A, Wang X, Angelov PP, Habib Z. Human action recognition using transfer learning with deep representations. In: 2017 International Joint Conference on Neural Networks (IJCNN). IEEE; 2017. doi: 10.1109/IJCNN.2017.7965890

Author

Bux, Allah; Wang, Xiaofeng; Angelov, Plamen Parvanov et al. / Human action recognition using transfer learning with deep representations. 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, 2017.

Bibtex

@inproceedings{7c04fe4136164f2981b0b2df2a639546,
title = "Human action recognition using transfer learning with deep representations",
abstract = "Human action recognition is an imperative research area in the field of computer vision due to its numerous applications. Recently, with the emergence and successful deployment of deep learning techniques for image classification, object recognition, and speech recognition, more research is directed from traditional handcrafted to deep learning techniques. This paper presents a novel method for human action recognition based on a pre-trained deep CNN model for feature extraction \& representation followed by a hybrid Support Vector Machine (SVM) and K-Nearest Neighbor (KNN) classifier for action recognition. It has been observed that already learnt CNN based representations on large-scale annotated dataset could be transferred to action recognition task with limited training dataset. The proposed method is evaluated on two well-known action datasets, i.e., UCF sports and KTH. The comparative analysis confirms that the proposed method achieves superior performance over state-of-the-art methods in terms of accuracy.",
author = "Allah Bux and Xiaofeng Wang and Angelov, {Plamen Parvanov} and Zulfiqar Habib",
note = "{\textcopyright}2017 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.",
year = "2017",
month = jul,
day = "3",
doi = "10.1109/IJCNN.2017.7965890",
language = "English",
isbn = "9781509061839",
booktitle = "2017 International Joint Conference on Neural Networks (IJCNN)",
publisher = "IEEE",
}
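The pipeline outlined in the abstract — fixed "deep representations" from a pre-trained CNN fed into a nearest-neighbour-style classifier — can be illustrated with a minimal sketch. The feature vectors below are hypothetical toy stand-ins for CNN activations (the paper does not specify them here), and only the KNN half of the paper's hybrid SVM/KNN classifier is shown, implemented from scratch so the example stays self-contained; this is not the authors' implementation.

```python
# Sketch of transfer learning for action recognition: a pre-trained CNN
# supplies fixed feature vectors, and a simple classifier is trained on them.
# Only the KNN stage of the paper's hybrid SVM/KNN classifier is sketched.
import math
from collections import Counter

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train_feats, train_labels, query, k=3):
    """Classify a query vector by majority vote among its k nearest neighbours."""
    order = sorted(range(len(train_feats)),
                   key=lambda i: euclidean(train_feats[i], query))
    votes = Counter(train_labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Hypothetical "deep representations": toy stand-ins for the activations a
# pre-trained CNN would produce for video frames of two action classes.
train_feats = [(0.9, 0.1), (1.0, 0.2), (0.8, 0.0),   # e.g. "walking"
               (0.1, 0.9), (0.0, 1.0), (0.2, 0.8)]   # e.g. "running"
train_labels = ["walking"] * 3 + ["running"] * 3

print(knn_predict(train_feats, train_labels, (0.85, 0.15)))  # → walking
print(knn_predict(train_feats, train_labels, (0.05, 0.95)))  # → running
```

The transfer-learning point is that `train_feats` would come from a network trained on a large annotated dataset, so only the lightweight classifier needs fitting on the small action-recognition dataset.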

RIS

TY - GEN

T1 - Human action recognition using transfer learning with deep representations

AU - Bux, Allah

AU - Wang, Xiaofeng

AU - Angelov, Plamen Parvanov

AU - Habib, Zulfiqar

N1 - ©2017 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PY - 2017/7/3

Y1 - 2017/7/3

N2 - Human action recognition is an imperative research area in the field of computer vision due to its numerous applications. Recently, with the emergence and successful deployment of deep learning techniques for image classification, object recognition, and speech recognition, more research is directed from traditional handcrafted to deep learning techniques. This paper presents a novel method for human action recognition based on a pre-trained deep CNN model for feature extraction & representation followed by a hybrid Support Vector Machine (SVM) and K-Nearest Neighbor (KNN) classifier for action recognition. It has been observed that already learnt CNN based representations on large-scale annotated dataset could be transferred to action recognition task with limited training dataset. The proposed method is evaluated on two well-known action datasets, i.e., UCF sports and KTH. The comparative analysis confirms that the proposed method achieves superior performance over state-of-the-art methods in terms of accuracy.

AB - Human action recognition is an imperative research area in the field of computer vision due to its numerous applications. Recently, with the emergence and successful deployment of deep learning techniques for image classification, object recognition, and speech recognition, more research is directed from traditional handcrafted to deep learning techniques. This paper presents a novel method for human action recognition based on a pre-trained deep CNN model for feature extraction & representation followed by a hybrid Support Vector Machine (SVM) and K-Nearest Neighbor (KNN) classifier for action recognition. It has been observed that already learnt CNN based representations on large-scale annotated dataset could be transferred to action recognition task with limited training dataset. The proposed method is evaluated on two well-known action datasets, i.e., UCF sports and KTH. The comparative analysis confirms that the proposed method achieves superior performance over state-of-the-art methods in terms of accuracy.

U2 - 10.1109/IJCNN.2017.7965890

DO - 10.1109/IJCNN.2017.7965890

M3 - Conference contribution/Paper

SN - 9781509061839

BT - 2017 International Joint Conference on Neural Networks (IJCNN)

PB - IEEE

ER -