Hand palm recognition

Chan Kuok Hong

Hi, it’s Rex. I have been working on a hand palm recognizer recently.

This project exists to expand the biometrics supported by our current eKYC solution. Even though identical twins look alike, their palm biometrics can differ significantly. With this model we can distinguish twins, which is something our existing eKYC solution could not achieve.


Figure 1: 11K Hands (Afifi, 2019)

There are two main approaches to training a hand/palm recognizer: (1) train a classifier class by class, which requires a large amount of data per identity; or (2) train an embedding model and compute a distance between embeddings (e.g. Euclidean distance) to determine whether two palms belong to the same person.
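The second approach can be sketched as follows. This is a minimal illustration, not the project's actual code: the embeddings are assumed to come from some trained model, and the threshold value is a made-up placeholder that would have to be tuned on a validation set.

```python
import numpy as np

def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    """L2 distance between two embedding vectors."""
    return float(np.linalg.norm(a - b))

def is_same_palm(emb_a: np.ndarray, emb_b: np.ndarray,
                 threshold: float = 1.0) -> bool:
    """Decide whether two palm embeddings belong to the same person.

    The embeddings are L2-normalised first so that the distance
    threshold does not depend on the scale of the model's outputs.
    """
    emb_a = emb_a / np.linalg.norm(emb_a)
    emb_b = emb_b / np.linalg.norm(emb_b)
    return euclidean_distance(emb_a, emb_b) < threshold
```

With normalised embeddings, a Euclidean threshold is equivalent to a cosine-similarity threshold, which is why either metric works in practice.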

In terms of practicality, the former approach needs so much data that it is impractical and unsustainable in a real-world application, so I have adopted the latter. I will be training my own ArcFace model (Deng et al., 2019) and then fitting palm images, whose distribution differs from the data my ArcFace model was originally trained on.
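At the heart of ArcFace is the additive angular margin: the logit for the ground-truth class is computed as cos(θ + m) instead of cos(θ), then scaled. A rough numpy sketch of that logit computation is below; the margin and scale values follow the paper's defaults, and the clipping of θ + m near π (which the full loss handles) is omitted for brevity.

```python
import numpy as np

def arcface_logits(embeddings: np.ndarray, weights: np.ndarray,
                   labels: np.ndarray, margin: float = 0.5,
                   scale: float = 64.0) -> np.ndarray:
    """Additive angular margin logits (Deng et al., 2019), simplified.

    embeddings: (batch, dim) feature vectors
    weights:    (dim, num_classes) class-centre weights
    labels:     (batch,) ground-truth class indices
    """
    # Normalise features and class weights so the dot product is a cosine.
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = emb @ w                                  # (batch, num_classes)
    theta = np.arccos(np.clip(cos, -1.0, 1.0))
    # Add the margin m only to each sample's ground-truth angle.
    rows = np.arange(len(labels))
    theta[rows, labels] += margin
    return scale * np.cos(theta)                   # feed to softmax cross-entropy
```

The margin widens the angular gap the model must close for the correct class, which pushes embeddings of the same identity into a tighter cluster on the hypersphere.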

At this stage, I am able to extract the palm ROI with an object detection model and determine whether a palm already exists in the database of enrolled palm images. Having built the groundwork, I am currently working on finetuning the model, in the hope of improving the accuracy of my ArcFace model so that it generalizes well to unseen data.
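The enrollment-and-lookup step can be sketched as a small in-memory gallery. This is a toy illustration only: the identity names, threshold, and gallery structure are my own placeholders, and in this sketch the embeddings are passed in directly rather than produced by the detection and embedding models described above.

```python
import numpy as np

class PalmDatabase:
    """Toy in-memory gallery mapping identity -> enrolled palm embedding."""

    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold        # placeholder; tune on validation data
        self.gallery: dict[str, np.ndarray] = {}

    @staticmethod
    def _normalise(v: np.ndarray) -> np.ndarray:
        return v / np.linalg.norm(v)

    def enroll(self, identity: str, embedding: np.ndarray) -> None:
        """Store a normalised embedding for a known palm."""
        self.gallery[identity] = self._normalise(embedding)

    def identify(self, embedding: np.ndarray):
        """Return (identity, distance) of the nearest enrolled palm,
        or (None, distance) if the best match exceeds the threshold."""
        probe = self._normalise(embedding)
        best_id, best_dist = None, float("inf")
        for identity, enrolled in self.gallery.items():
            dist = float(np.linalg.norm(probe - enrolled))
            if dist < best_dist:
                best_id, best_dist = identity, dist
        if best_dist > self.threshold:
            return None, best_dist
        return best_id, best_dist
```

A production system would replace the linear scan with an approximate-nearest-neighbour index once the gallery grows, but the decision rule (nearest enrolled embedding under a distance threshold) stays the same.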




Afifi, M., 2019. 11K Hands: Gender recognition and biometric identification using a large dataset of hand images. Multimedia Tools and Applications, 78, pp. 20835-20854. https://doi.org/10.1007/s11042-019-7424-8

Deng, J., Guo, J., Xue, N. and Zafeiriou, S., 2019. ArcFace: Additive angular margin loss for deep face recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4690-4699.