
Electronic data

  • 0132

    Rights statement: ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 3.91 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI:


Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition. / Vyas, Ritesh; Rahmani, Hossein; Boswell-Challand, Ricki et al.
International Joint Conference on Biometrics (IJCB-2021). Shenzhen, China: IEEE, 2021. p. 1-8 9484356 (2021 IEEE International Joint Conference on Biometrics, IJCB 2021).

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Harvard

Vyas, R, Rahmani, H, Boswell-Challand, R, Angelov, P, Black, S & Williams, B 2021, Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition. in International Joint Conference on Biometrics (IJCB-2021)., 9484356, 2021 IEEE International Joint Conference on Biometrics, IJCB 2021, IEEE, Shenzhen, China, pp. 1-8. https://doi.org/10.1109/IJCB52358.2021.9484356

APA

Vyas, R., Rahmani, H., Boswell-Challand, R., Angelov, P., Black, S., & Williams, B. (2021). Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition. In International Joint Conference on Biometrics (IJCB-2021) (pp. 1-8). Article 9484356 (2021 IEEE International Joint Conference on Biometrics, IJCB 2021). IEEE. https://doi.org/10.1109/IJCB52358.2021.9484356

Vancouver

Vyas R, Rahmani H, Boswell-Challand R, Angelov P, Black S, Williams B. Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition. In International Joint Conference on Biometrics (IJCB-2021). Shenzhen, China: IEEE. 2021. p. 1-8. 9484356. (2021 IEEE International Joint Conference on Biometrics, IJCB 2021). Epub 2021 Jul 20. doi: 10.1109/IJCB52358.2021.9484356

Author

Vyas, Ritesh ; Rahmani, Hossein ; Boswell-Challand, Ricki et al. / Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition. International Joint Conference on Biometrics (IJCB-2021). Shenzhen, China : IEEE, 2021. pp. 1-8 (2021 IEEE International Joint Conference on Biometrics, IJCB 2021).

Bibtex

@inproceedings{5e2ac3f0afd944f7b89c2f40ef100f91,
title = "Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition",
abstract = "In many cases of serious crime, images of a hand can be the only evidence available for the forensic identification of the offender. As well as placing them at the scene, such images and video evidence offer proof of the offender committing the crime. The knuckle creases of the human hand have emerged as an effective biometric trait and been used to identify the perpetrators of child abuse in forensic investigations. However, manual utilization of knuckle creases for identification is highly time consuming and can be subjective, requiring the expertise of experienced forensic anthropologists whose availability is very limited. Hence, there arises a need for an automated approach for localization and comparison of knuckle patterns. In this paper, we present a fully automatic end-to-end approach which localizes the minor, major and base knuckles in images of the hand, and effectively uses them for identification achieving state-of-the-art results. This work improves on existing approaches and allows us to strengthen cases further by objectively combining multiple knuckles and knuckle types to obtain a holistic matching result for comparing two hands. This yields a stronger and more robust multi-unit biometric and facilitates the large-scale examination of the potential of knuckle-based identification. Evaluated on two large landmark datasets, the proposed framework achieves equal error rates (EER) of 1.0-1.9%, rank-1 accuracies of 99.3-100% and decidability indices of 5.04-5.83. We make the full results available via a novel online GUI to raise awareness with the general public and forensic investigators about the identifiability of various knuckle regions. These strong results demonstrate the value of our holistic approach to hand identification from knuckle patterns and their utility in forensic investigations.",
author = "Ritesh Vyas and Hossein Rahmani and Ricki Boswell-Challand and Plamen Angelov and S. Black and Bryan Williams",
note = "{\textcopyright}2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE. ",
year = "2021",
month = aug,
day = "4",
doi = "10.1109/IJCB52358.2021.9484356",
language = "English",
isbn = "9781665437813",
series = "2021 IEEE International Joint Conference on Biometrics, IJCB 2021",
publisher = "IEEE",
pages = "1--8",
booktitle = "International Joint Conference on Biometrics (IJCB-2021)",

}
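
The abstract above reports equal error rates (EER), rank-1 accuracies and decidability indices for the proposed framework. These are standard biometric performance measures rather than anything specific to this paper; the short Python sketch below shows how EER and the decidability index are conventionally computed from genuine (same-hand) and impostor (different-hand) comparison scores. The synthetic score distributions and variable names are illustrative assumptions, not data from the paper.

# Minimal sketch of two standard biometric metrics named in the abstract.
# Assumes higher scores indicate a better match; the scores here are synthetic.
import numpy as np

def decidability_index(genuine, impostor):
    """d' = |mu_g - mu_i| / sqrt((var_g + var_i) / 2), the usual definition."""
    g, i = np.asarray(genuine, float), np.asarray(impostor, float)
    return abs(g.mean() - i.mean()) / np.sqrt((g.var() + i.var()) / 2.0)

def equal_error_rate(genuine, impostor, n_thresholds=1000):
    """EER: the operating point where the false accept and false reject rates meet."""
    genuine, impostor = np.asarray(genuine, float), np.asarray(impostor, float)
    thresholds = np.linspace(min(genuine.min(), impostor.min()),
                             max(genuine.max(), impostor.max()), n_thresholds)
    far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
    frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate
    k = np.argmin(np.abs(far - frr))
    return (far[k] + frr[k]) / 2.0

# Toy usage with hypothetical score distributions
rng = np.random.default_rng(0)
genuine = rng.normal(0.80, 0.05, 500)    # hypothetical same-hand comparison scores
impostor = rng.normal(0.40, 0.08, 5000)  # hypothetical different-hand comparison scores
print(f"EER = {equal_error_rate(genuine, impostor):.2%}, d' = {decidability_index(genuine, impostor):.2f}")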

RIS

TY - GEN

T1 - Robust End-to-End Hand Identification via Holistic Multi-Unit Knuckle Recognition

AU - Vyas, Ritesh

AU - Rahmani, Hossein

AU - Boswell-Challand, Ricki

AU - Angelov, Plamen

AU - Black, S.

AU - Williams, Bryan

N1 - ©2021 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PY - 2021/8/4

Y1 - 2021/8/4

N2 - In many cases of serious crime, images of a hand can be the only evidence available for the forensic identification of the offender. As well as placing them at the scene, such images and video evidence offer proof of the offender committing the crime. The knuckle creases of the human hand have emerged as an effective biometric trait and been used to identify the perpetrators of child abuse in forensic investigations. However, manual utilization of knuckle creases for identification is highly time consuming and can be subjective, requiring the expertise of experienced forensic anthropologists whose availability is very limited. Hence, there arises a need for an automated approach for localization and comparison of knuckle patterns. In this paper, we present a fully automatic end-to-end approach which localizes the minor, major and base knuckles in images of the hand, and effectively uses them for identification achieving state-of-the-art results. This work improves on existing approaches and allows us to strengthen cases further by objectively combining multiple knuckles and knuckle types to obtain a holistic matching result for comparing two hands. This yields a stronger and more robust multi-unit biometric and facilitates the large-scale examination of the potential of knuckle-based identification. Evaluated on two large landmark datasets, the proposed framework achieves equal error rates (EER) of 1.0-1.9%, rank-1 accuracies of 99.3-100% and decidability indices of 5.04-5.83. We make the full results available via a novel online GUI to raise awareness with the general public and forensic investigators about the identifiability of various knuckle regions. These strong results demonstrate the value of our holistic approach to hand identification from knuckle patterns and their utility in forensic investigations.

AB - In many cases of serious crime, images of a hand can be the only evidence available for the forensic identification of the offender. As well as placing them at the scene, such images and video evidence offer proof of the offender committing the crime. The knuckle creases of the human hand have emerged as an effective biometric trait and been used to identify the perpetrators of child abuse in forensic investigations. However, manual utilization of knuckle creases for identification is highly time consuming and can be subjective, requiring the expertise of experienced forensic anthropologists whose availability is very limited. Hence, there arises a need for an automated approach for localization and comparison of knuckle patterns. In this paper, we present a fully automatic end-to-end approach which localizes the minor, major and base knuckles in images of the hand, and effectively uses them for identification achieving state-of-the-art results. This work improves on existing approaches and allows us to strengthen cases further by objectively combining multiple knuckles and knuckle types to obtain a holistic matching result for comparing two hands. This yields a stronger and more robust multi-unit biometric and facilitates the large-scale examination of the potential of knuckle-based identification. Evaluated on two large landmark datasets, the proposed framework achieves equal error rates (EER) of 1.0-1.9%, rank-1 accuracies of 99.3-100% and decidability indices of 5.04-5.83. We make the full results available via a novel online GUI to raise awareness with the general public and forensic investigators about the identifiability of various knuckle regions. These strong results demonstrate the value of our holistic approach to hand identification from knuckle patterns and their utility in forensic investigations.

U2 - 10.1109/IJCB52358.2021.9484356

DO - 10.1109/IJCB52358.2021.9484356

M3 - Conference contribution/Paper

SN - 9781665437813

T3 - 2021 IEEE International Joint Conference on Biometrics, IJCB 2021

SP - 1

EP - 8

BT - International Joint Conference on Biometrics (IJCB-2021)

PB - IEEE

CY - Shenzhen, China

ER -
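
The abstract also describes objectively combining multiple knuckles and knuckle types (minor, major and base) into a holistic matching result for comparing two hands. The actual fusion rule is not given on this page, so the sketch below is only an assumed illustration of score-level fusion by weighted averaging; the unit names, weights and scores are hypothetical.

# Illustrative sketch (an assumption, not the paper's method): fusing
# per-knuckle match scores into one holistic hand-to-hand score.
from typing import Dict, Optional

def fuse_scores(unit_scores: Dict[str, float],
                unit_weights: Optional[Dict[str, float]] = None) -> float:
    """Weighted mean of the available per-unit scores; knuckle units missing
    from either hand image are simply omitted from the dictionary."""
    if not unit_scores:
        return float("nan")
    if unit_weights is None:
        unit_weights = {u: 1.0 for u in unit_scores}
    num = sum(unit_weights.get(u, 1.0) * s for u, s in unit_scores.items())
    den = sum(unit_weights.get(u, 1.0) for u in unit_scores)
    return num / den

# Toy usage: hypothetical per-unit scores for the knuckle types named in the abstract.
scores = {
    "major_index": 0.82, "major_middle": 0.79, "major_ring": 0.85,
    "minor_index": 0.74, "minor_middle": 0.71,
    "base_index": 0.66, "base_ring": 0.69,
}
print(f"Holistic match score: {fuse_scores(scores):.3f}")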