
Electronic data

  • 1-s2.0-S0167865518301466-main

    Rights statement: This is the author’s version of a work that was accepted for publication in Pattern Recognition Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition Letters, DOI: 10.1016/j.patrec.2018.04.026

    Accepted author manuscript, 1.04 MB, PDF document

    Available under license: CC BY-NC-ND: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License

Links

Text available via DOI: https://doi.org/10.1016/j.patrec.2018.04.026


Are mid-air dynamic gestures applicable to user identification?

Research output: Contribution to Journal/Magazine › Journal article › peer-review

E-pub ahead of print

Standard

Are mid-air dynamic gestures applicable to user identification? / Liu, Heng; Dai, Liangliang; Hou, Shudong et al.
In: Pattern Recognition Letters, 18.04.2018.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Liu, H, Dai, L, Hou, S, Han, J & Liu, H 2018, 'Are mid-air dynamic gestures applicable to user identification?', Pattern Recognition Letters. https://doi.org/10.1016/j.patrec.2018.04.026

APA

Liu, H., Dai, L., Hou, S., Han, J., & Liu, H. (2018). Are mid-air dynamic gestures applicable to user identification? Pattern Recognition Letters. Advance online publication. https://doi.org/10.1016/j.patrec.2018.04.026

Vancouver

Liu H, Dai L, Hou S, Han J, Liu H. Are mid-air dynamic gestures applicable to user identification? Pattern Recognition Letters. 2018 Apr 18. Epub 2018 Apr 18. doi: 10.1016/j.patrec.2018.04.026

Author

Liu, Heng ; Dai, Liangliang ; Hou, Shudong et al. / Are mid-air dynamic gestures applicable to user identification? In: Pattern Recognition Letters. 2018.

Bibtex

@article{9f1c1d126d5446c09f1618d2af5d452f,
title = "Are mid-air dynamic gestures applicable to user identification?",
abstract = "Abstract Unlike the existing gesture related research predominantly focusing on gesture recognition (classification), this work explores the feasibility and the potential of mid-air dynamic gesture based user identification through presenting an efficient bidirectional GRU (Gated Recurrent Unit) network. From the perspective of the feature analysis from the Bi-GRU network used for different recognition tasks, we make a detailed investigation on the correlation and the difference between the gesture type features and the gesture user identity characteristics. During this process, two unsupervised feature representation methods – PCA and hash ITQ (Iterative Quantization) are fully used to perform feature reduction and feature binary coding. Experiments and analysis based on our dynamic gesture data set (60 individuals) exemplify the effectiveness of the proposed mid-air dynamic gesture based user identification approach and clearly reveal the relationship between the gesture type features and the gesture user identity characteristics.",
keywords = "Gesture based user identification, Gesture user identity characteristics, Bi-GRU, Mid-air dynamic gestures",
author = "Heng Liu and Liangliang Dai and Shudong Hou and Jungong Han and Hongshen Liu",
note = "This is the author{\textquoteright}s version of a work that was accepted for publication in Physics Reports. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Physics Reports, 373, 4-5, 2003 DOI: 10.1016/S0370-1573(02)00269-7",
year = "2018",
month = apr,
day = "18",
doi = "10.1016/j.patrec.2018.04.026",
language = "English",
journal = "Pattern Recognition Letters",
issn = "0167-8655",
publisher = "Elsevier Science B.V.",

}

RIS

TY - JOUR

T1 - Are mid-air dynamic gestures applicable to user identification?

AU - Liu, Heng

AU - Dai, Liangliang

AU - Hou, Shudong

AU - Han, Jungong

AU - Liu, Hongshen

N1 - This is the author’s version of a work that was accepted for publication in Pattern Recognition Letters. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Pattern Recognition Letters, DOI: 10.1016/j.patrec.2018.04.026

PY - 2018/4/18

Y1 - 2018/4/18

N2 - Unlike the existing gesture related research predominantly focusing on gesture recognition (classification), this work explores the feasibility and the potential of mid-air dynamic gesture based user identification through presenting an efficient bidirectional GRU (Gated Recurrent Unit) network. From the perspective of the feature analysis from the Bi-GRU network used for different recognition tasks, we make a detailed investigation on the correlation and the difference between the gesture type features and the gesture user identity characteristics. During this process, two unsupervised feature representation methods – PCA and hash ITQ (Iterative Quantization) are fully used to perform feature reduction and feature binary coding. Experiments and analysis based on our dynamic gesture data set (60 individuals) exemplify the effectiveness of the proposed mid-air dynamic gesture based user identification approach and clearly reveal the relationship between the gesture type features and the gesture user identity characteristics.

AB - Unlike the existing gesture related research predominantly focusing on gesture recognition (classification), this work explores the feasibility and the potential of mid-air dynamic gesture based user identification through presenting an efficient bidirectional GRU (Gated Recurrent Unit) network. From the perspective of the feature analysis from the Bi-GRU network used for different recognition tasks, we make a detailed investigation on the correlation and the difference between the gesture type features and the gesture user identity characteristics. During this process, two unsupervised feature representation methods – PCA and hash ITQ (Iterative Quantization) are fully used to perform feature reduction and feature binary coding. Experiments and analysis based on our dynamic gesture data set (60 individuals) exemplify the effectiveness of the proposed mid-air dynamic gesture based user identification approach and clearly reveal the relationship between the gesture type features and the gesture user identity characteristics.

KW - Gesture based user identification

KW - Gesture user identity characteristics

KW - Bi-GRU

KW - Mid-air dynamic gestures

U2 - 10.1016/j.patrec.2018.04.026

DO - 10.1016/j.patrec.2018.04.026

M3 - Journal article

JO - Pattern Recognition Letters

JF - Pattern Recognition Letters

SN - 0167-8655

ER -
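
Method sketch (illustrative)

The abstract describes a three-stage pipeline: a bidirectional GRU encodes each mid-air gesture sequence into a fixed-length feature, PCA reduces the feature dimension, and ITQ (Iterative Quantization) converts the reduced features into binary codes that are compared for user identification. The Python sketch below is one way such a pipeline could be assembled; the 3-D per-frame input, hidden size, 64-bit code length, and Hamming-distance matching are illustrative assumptions, not the configuration reported in the paper.

import numpy as np
import torch
import torch.nn as nn


class BiGRUEncoder(nn.Module):
    """Encode a variable-length gesture sequence into a fixed-length feature."""

    def __init__(self, input_dim=3, hidden_dim=128):
        super().__init__()
        self.gru = nn.GRU(input_dim, hidden_dim, batch_first=True, bidirectional=True)

    def forward(self, x):                      # x: (batch, frames, input_dim)
        _, h = self.gru(x)                     # h: (2, batch, hidden_dim)
        return torch.cat([h[0], h[1]], dim=1)  # (batch, 2 * hidden_dim)


def itq(features, n_bits=64, n_iters=50, seed=0):
    """PCA reduction followed by Iterative Quantization (ITQ) binary coding."""
    rng = np.random.default_rng(seed)
    mean = features.mean(axis=0)
    x = features - mean                               # zero-centre the features
    _, _, vt = np.linalg.svd(x, full_matrices=False)  # PCA via SVD of the data
    w_pca = vt[:n_bits].T                             # (feature_dim, n_bits)
    v = x @ w_pca
    r, _ = np.linalg.qr(rng.standard_normal((n_bits, n_bits)))  # random orthogonal rotation
    for _ in range(n_iters):
        b = np.sign(v @ r)                            # fix rotation, update codes
        u, _, wt = np.linalg.svd(v.T @ b)             # fix codes, solve orthogonal Procrustes
        r = u @ wt
    codes = (v @ r) > 0                               # boolean binary codes
    return codes, w_pca, r, mean


def hamming_match(query_code, gallery_codes):
    """Index of the enrolled code closest to the query in Hamming distance."""
    return int(np.argmin((gallery_codes != query_code).sum(axis=1)))


if __name__ == "__main__":
    encoder = BiGRUEncoder()
    # Stand-in enrolment data: 60 users x 5 gestures, 40 frames of 3-D points each.
    sequences = torch.randn(300, 40, 3)
    with torch.no_grad():
        feats = encoder(sequences).numpy()            # (300, 256)
        probe = encoder(torch.randn(1, 40, 3)).numpy()
    codes, w_pca, r, mean = itq(feats, n_bits=64)
    probe_code = ((probe - mean) @ w_pca @ r) > 0     # encode the probe the same way
    print("closest enrolled sample:", hamming_match(probe_code[0], codes))

In a full identification setting, the Bi-GRU would first be trained on labelled gesture sequences and each user would be enrolled with several gesture samples; the probe's binary code would then be matched against per-user galleries rather than individual samples.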