Electronic data

  • 1-s2.0-S0031320317305198-main

    Rights statement: This is the author’s version of a work that was accepted for publication in Pattern Recognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition, 77, 2018 DOI: 10.1016/j.patcog.2017.12.023

    Accepted author manuscript, 3.27 MB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License


Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition. / Li, Ce; Xie, Chunyu; Zhang, Baochang et al.
In: Pattern Recognition, Vol. 77, 05.2018, p. 276-288.

Harvard

Li, C, Xie, C, Zhang, B, Chen, C & Han, J 2018, 'Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition', Pattern Recognition, vol. 77, pp. 276-288. https://doi.org/10.1016/j.patcog.2017.12.023

APA

Li, C., Xie, C., Zhang, B., Chen, C., & Han, J. (2018). Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition. Pattern Recognition, 77, 276-288. https://doi.org/10.1016/j.patcog.2017.12.023

Vancouver

Li C, Xie C, Zhang B, Chen C, Han J. Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition. Pattern Recognition. 2018 May;77:276-288. Epub 2018 Jan 4. doi: 10.1016/j.patcog.2017.12.023

Author

Li, Ce ; Xie, Chunyu ; Zhang, Baochang et al. / Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition. In: Pattern Recognition. 2018 ; Vol. 77. pp. 276-288.

Bibtex

@article{c4ab141f5e674450bf5bf6995899c3e1,
title = "Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition",
abstract = "Gesture recognition has become a popular analytics tool for extracting the characteristics of user movement and enables numerous practical applications in the biometrics field. Despite recent advances in this technique, complex user interaction and the limited amount of data pose serious challenges to existing methods. In this paper, we present a novel approach for hand gesture recognition based on user interaction on mobile devices. We have developed two deep models by integrating a Bidirectional Long Short-Term Memory (BiLSTM) network and a Bidirectional Gated Recurrent Unit (BiGRU) with the Fisher criterion, termed F-BiLSTM and F-BiGRU respectively. These two Fisher discriminative models can classify a user{\textquoteright}s gestures effectively by analyzing the corresponding acceleration and angular velocity data of hand motion. In addition, we build a large Mobile Gesture Database (MGD) containing 5547 sequences of 12 gestures. With extensive experiments, we demonstrate the superior performance of the proposed method compared to the state-of-the-art BiLSTM and BiGRU on the MGD database and two other benchmark databases (i.e., BUAA mobile gesture and SmartWatch gesture). The source code and MGD database will be made publicly available at https://github.com/bczhangbczhang/Fisher-Discriminant-LSTM.",
keywords = "Fisher Discriminant, Hand Gesture Recognition, Mobile Devices",
author = "Ce Li and Chunyu Xie and Baochang Zhang and Chen Chen and Jungong Han",
note = "This is the author{\textquoteright}s version of a work that was accepted for publication in Pattern Recognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition, 77, 2018 DOI: 10.1016/j.patcog.2017.12.023",
year = "2018",
month = may,
doi = "10.1016/j.patcog.2017.12.023",
language = "English",
volume = "77",
pages = "276--288",
journal = "Pattern Recognition",
issn = "0031-3203",
publisher = "Elsevier Ltd",
}
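The abstract describes coupling bidirectional recurrent networks with a Fisher criterion so that learned gesture features cluster tightly within a class and separate well between classes. As a rough illustration only (this is not the authors' implementation; the function and variable names below are hypothetical), the Fisher ratio of between-class to within-class scatter on a batch of feature vectors can be sketched in NumPy as:

```python
import numpy as np

def fisher_criterion(features, labels):
    """Fisher discriminant ratio: trace of between-class scatter over
    trace of within-class scatter. A Fisher-style loss term would push
    this ratio up (e.g. by minimizing its reciprocal) during training."""
    dim = features.shape[1]
    mu = features.mean(axis=0)                 # global mean of all features
    sw = np.zeros((dim, dim))                  # within-class scatter
    sb = np.zeros((dim, dim))                  # between-class scatter
    for c in np.unique(labels):
        fc = features[labels == c]
        mc = fc.mean(axis=0)                   # class mean
        d = fc - mc
        sw += d.T @ d                          # scatter around class mean
        dm = (mc - mu).reshape(-1, 1)
        sb += len(fc) * (dm @ dm.T)            # class mean vs. global mean
    return np.trace(sb) / np.trace(sw)

# toy example: two well-separated feature clusters yield a large ratio
rng = np.random.default_rng(0)
a = rng.normal(0.0, 0.1, size=(20, 4))
b = rng.normal(5.0, 0.1, size=(20, 4))
X = np.vstack([a, b])
y = np.array([0] * 20 + [1] * 20)
ratio = fisher_criterion(X, y)
```

In the paper's setting, `features` would be the hidden-state representations produced by the BiLSTM/BiGRU for accelerometer and gyroscope sequences, and the criterion would be combined with the usual classification loss.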

RIS

TY - JOUR

T1 - Deep Fisher Discriminant Learning for Mobile Hand Gesture Recognition

AU - Li, Ce

AU - Xie, Chunyu

AU - Zhang, Baochang

AU - Chen, Chen

AU - Han, Jungong

N1 - This is the author’s version of a work that was accepted for publication in Pattern Recognition. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition, 77, 2018 DOI: 10.1016/j.patcog.2017.12.023

PY - 2018/5

Y1 - 2018/5

N2 - Gesture recognition has become a popular analytics tool for extracting the characteristics of user movement and enables numerous practical applications in the biometrics field. Despite recent advances in this technique, complex user interaction and the limited amount of data pose serious challenges to existing methods. In this paper, we present a novel approach for hand gesture recognition based on user interaction on mobile devices. We have developed two deep models by integrating a Bidirectional Long Short-Term Memory (BiLSTM) network and a Bidirectional Gated Recurrent Unit (BiGRU) with the Fisher criterion, termed F-BiLSTM and F-BiGRU respectively. These two Fisher discriminative models can classify a user’s gestures effectively by analyzing the corresponding acceleration and angular velocity data of hand motion. In addition, we build a large Mobile Gesture Database (MGD) containing 5547 sequences of 12 gestures. With extensive experiments, we demonstrate the superior performance of the proposed method compared to the state-of-the-art BiLSTM and BiGRU on the MGD database and two other benchmark databases (i.e., BUAA mobile gesture and SmartWatch gesture). The source code and MGD database will be made publicly available at https://github.com/bczhangbczhang/Fisher-Discriminant-LSTM.

AB - Gesture recognition has become a popular analytics tool for extracting the characteristics of user movement and enables numerous practical applications in the biometrics field. Despite recent advances in this technique, complex user interaction and the limited amount of data pose serious challenges to existing methods. In this paper, we present a novel approach for hand gesture recognition based on user interaction on mobile devices. We have developed two deep models by integrating a Bidirectional Long Short-Term Memory (BiLSTM) network and a Bidirectional Gated Recurrent Unit (BiGRU) with the Fisher criterion, termed F-BiLSTM and F-BiGRU respectively. These two Fisher discriminative models can classify a user’s gestures effectively by analyzing the corresponding acceleration and angular velocity data of hand motion. In addition, we build a large Mobile Gesture Database (MGD) containing 5547 sequences of 12 gestures. With extensive experiments, we demonstrate the superior performance of the proposed method compared to the state-of-the-art BiLSTM and BiGRU on the MGD database and two other benchmark databases (i.e., BUAA mobile gesture and SmartWatch gesture). The source code and MGD database will be made publicly available at https://github.com/bczhangbczhang/Fisher-Discriminant-LSTM.

KW - Fisher Discriminant

KW - Hand Gesture Recognition

KW - Mobile Devices

U2 - 10.1016/j.patcog.2017.12.023

DO - 10.1016/j.patcog.2017.12.023

M3 - Journal article

VL - 77

SP - 276

EP - 288

JO - Pattern Recognition

JF - Pattern Recognition

SN - 0031-3203

ER -