
Person Identification from Fingernails and Knuckles Images using Deep Learning Features and the Bray-Curtis Similarity Measure

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Journal publication date: 7/11/2022
Journal: Neurocomputing
Volume: 513
Number of pages: 11
Pages (from-to): 83-93
Publication status: Published
Early online date: 28/09/22
Original language: English

Abstract

In this paper, an approach that makes use of knuckle creases and fingernails for person identification is presented. It introduces a framework for automatic person identification that includes localisation of the regions of interest (ROIs) of multiple components within hand images, recognition and segmentation of the detected components using bounding boxes, and similarity matching between two different sets of segmented images. The following hand components are considered: i) the metacarpophalangeal (MCP) joint, commonly known as the base knuckle; ii) the proximal interphalangeal (PIP) joint, commonly known as the major knuckle; iii) the distal interphalangeal (DIP) joint, commonly known as the minor knuckle; iv) the interphalangeal (IP) joint, commonly known as the thumb knuckle; and v) the fingernails. Crucial elements of the proposed framework are feature extraction and similarity matching. This paper exploits different deep learning neural networks (DLNNs), which are essential for extracting discriminative high-level abstract features, and uses various similarity measures for the matching process. We validate the proposed approach on well-known benchmarks, including the 11k Hands dataset and the Hong Kong Polytechnic University Contactless Hand Dorsal Images dataset, known as PolyU. The results indicate that knuckle patterns and fingernails play a significant role in the person identification framework. On the 11k Hands dataset, the left-hand results are better than the right-hand results, and the fingernails consistently produce higher identification rates than the other hand components, with a rank-1 score of 100%. In addition, on the PolyU dataset the thumb fingernail attains a rank-1 score of 100%.
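For readers unfamiliar with the matching step, the Bray-Curtis measure named in the title can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the feature vectors here are toy arrays standing in for the high-level DLNN features the paper extracts, and the measure assumes non-negative entries (as produced, for instance, by ReLU activations).

```python
import numpy as np

def bray_curtis_similarity(u, v):
    """Bray-Curtis similarity between two non-negative feature vectors.

    The Bray-Curtis dissimilarity is sum(|u_i - v_i|) / sum(u_i + v_i),
    so the similarity is 1 minus that quantity (1.0 = identical vectors).
    """
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    return 1.0 - np.abs(u - v).sum() / (u + v).sum()

# Toy 4-dimensional "feature vectors" standing in for DLNN descriptors
a = np.array([1.0, 2.0, 0.0, 3.0])
b = np.array([1.0, 1.0, 1.0, 3.0])
# |a - b| sums to 2 and (a + b) sums to 12, so dissimilarity = 2/12
print(bray_curtis_similarity(a, b))  # ≈ 0.8333
```

In an identification pipeline of the kind the abstract describes, a probe feature vector would be compared against every enrolled gallery vector with a measure like this, and the identity of the highest-scoring gallery entry would be reported as the rank-1 match.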