Electronic data

  • Human Action Recognition using Transfer Learning with Deep Representations_Revised

    Rights statement: ©2017 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 588 KB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI:

Human action recognition using transfer learning with deep representations

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 3/07/2017
Host publication: 2017 International Joint Conference on Neural Networks (IJCNN)
Publisher: IEEE
ISBN (electronic): 9781509061822
ISBN (print): 9781509061839
Original language: English

Abstract

Human action recognition is an important research area in computer vision due to its numerous applications. With the emergence and successful deployment of deep learning techniques for image classification, object recognition, and speech recognition, research has recently shifted from traditional handcrafted features to deep learning techniques. This paper presents a novel method for human action recognition based on a pre-trained deep CNN model for feature extraction and representation, followed by a hybrid Support Vector Machine (SVM) and K-Nearest Neighbor (KNN) classifier for action recognition. We observe that CNN representations already learnt on a large-scale annotated dataset can be transferred to an action recognition task with a limited training dataset. The proposed method is evaluated on two well-known action datasets, UCF Sports and KTH. The comparative analysis confirms that the proposed method achieves superior accuracy over state-of-the-art methods.
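
To make the pipeline concrete, the sketch below shows one way to implement the idea the abstract describes: a frozen pre-trained CNN used as a fixed feature extractor, feeding a hybrid SVM/KNN classifier. The choice of backbone (ResNet-18), the feature layer, and the fusion of the two classifiers by averaging their class probabilities are illustrative assumptions; this page does not specify the paper's exact configuration.

    # Minimal sketch, assuming PyTorch/torchvision and scikit-learn.
    # Backbone choice (ResNet-18) and probability-averaging fusion are
    # illustrative assumptions, not the paper's confirmed configuration.
    import numpy as np
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier

    # Pre-trained CNN with its classification head removed -> 512-d features.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    preprocess = T.Compose([
        T.Resize(256), T.CenterCrop(224), T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    def extract_features(image_paths):
        """Run each frame through the frozen CNN and stack the feature vectors."""
        feats = []
        with torch.no_grad():
            for p in image_paths:
                x = preprocess(Image.open(p).convert("RGB")).unsqueeze(0)
                feats.append(backbone(x).squeeze(0).numpy())
        return np.stack(feats)

    # Hybrid classifier: average the class probabilities of an SVM and a KNN.
    # Both scikit-learn classifiers order predict_proba columns by the same
    # sorted class labels, so the averaged columns are aligned.
    svm = SVC(kernel="linear", probability=True)
    knn = KNeighborsClassifier(n_neighbors=5)

    def fit(train_feats, train_labels):
        svm.fit(train_feats, train_labels)
        knn.fit(train_feats, train_labels)

    def predict(test_feats):
        probs = (svm.predict_proba(test_feats) + knn.predict_proba(test_feats)) / 2.0
        return svm.classes_[probs.argmax(axis=1)]

In use, extract_features would be run over the frames of each action clip, and per-frame predictions could then be aggregated per video, for example by majority vote; how frames are aggregated is likewise not specified on this page.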
