Identity Verification using Shape and Geometry of Human Hands
Expert Systems with Applications, Elsevier, 2015
Shefali Sharma1, Shiv Ram Dubey2, Satish Kumar Singh2, Rajiv Saxena3 and Rajat Kumar Singh2
1Jaypee University of Engineering and Technology Guna, 2Indian Institute of Information Technology Allahabad, 3Jaypee University Anoopshahr
Introduction
A multimodal biometric system for identity verification is designed by fusing shape and geometry features at the score level. The shape and geometry features are derived from the contour of the hand image alone. Two shape-based features are extracted using the distance and orientation of hand contour points together with wavelet decomposition. Seven distances are used to encode the geometrical information of the hand. All processing is done w.r.t. a stable reference point at the wrist line, which is more stable than the centroid [1].

Performance Evaluation
The identity verification experiments are conducted over two datasets: one created by us and the other created by IITD [3]. The IITD dataset contains contactless hand images, whereas the images in our dataset are captured by placing the hand on a simple scanner. Our dataset consists of 250 images (5 images from each of 50 subjects); 240 subjects with 5 images each are considered from the IITD dataset.
Fig.3. Robustness of the stable and rotation invariant reference point extraction over wrist boundary of the hand.
The finger points obtained in Fig.2(g) are used to extract the geometrical features as depicted in Fig.4. The ratio of each pair of distances is combined into a feature vector.
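Combining the ratio of each pair of the seven distances can be sketched as below; the point coordinates and distance values are illustrative placeholders, not the paper's exact d1–d7 measurements:

```python
import itertools

def pairwise_distance_ratios(distances):
    """Build a feature vector from the ratio of every ordered pair of distances.

    With 7 distances this yields C(7, 2) = 21 ratio features; using ratios
    instead of raw distances makes the vector invariant to uniform scaling
    of the hand image.
    """
    return [distances[i] / distances[j]
            for i, j in itertools.combinations(range(len(distances)), 2)]

# Illustrative distances d1..d7 (e.g. finger lengths/widths in pixels);
# the actual measurements come from the finger feature points of Fig.2(g).
d = [180.0, 210.0, 230.0, 205.0, 150.0, 95.0, 88.0]
features = pairwise_distance_ratios(d)
assert len(features) == 21
```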
Fig.1. Process of joining the disjoint finger to the hand: (a) original hand image, (b) hand mask generated using pre-processing, where the ring finger is disjoint from the hand, (c) the hand mask is rotated so that the ring finger becomes vertical, (d) the ring finger is extended until it attaches to the rest of the hand, and (e) the hand mask is rotated back to its original orientation.
Fig.8. Effect of score normalization method over our hand dataset: (a) ROC plot using different score normalization methods (z-score, max-division, and min-max), and (b) the min-max normalized score density distribution for genuine and imposter scores.
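The three normalization schemes compared in Fig.8 can be sketched as follows; the sum-rule fusion shown here is an illustrative assumption, since the paper only states that the modality scores are fused at score level:

```python
import numpy as np

def min_max_norm(scores):
    """Map scores linearly to [0, 1]."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def z_score_norm(scores):
    """Center to zero mean, scale to unit standard deviation."""
    s = np.asarray(scores, dtype=float)
    return (s - s.mean()) / s.std()

def max_division_norm(scores):
    """Divide by the maximum score."""
    s = np.asarray(scores, dtype=float)
    return s / s.max()

def fuse(score_sets, norm=min_max_norm):
    """Fuse per-modality match scores after normalization (sum rule,
    shown here as the mean of the normalized scores)."""
    return np.mean([norm(s) for s in score_sets], axis=0)
```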
Fig.5. Finger registration at a fixed orientation from the vertical axis: (a) contour with known finger feature points, (b) finger alignments at particular orientations, and (c) updated contour with reference point.
Fig.9. Results over IITD hand dataset: (a) ROC curve of score fusion and (b) performance comparison with varying number of subjects (240, 200, 150, 100, and 50). Here, 'd', 'o', and 'g' represent the distance, orientation, and geometrical features.
The finger registration is performed to align each finger at a fixed orientation (i.e., 60, 30, 10, -10, and -30 degrees from thumb to baby finger) from the vertical axis, as illustrated in Fig.5. The updated contour after finger alignment is used to extract the shape features in terms of the wavelet decomposition of the distance and orientation of each contour point w.r.t. the reference point. A 1-D wavelet decomposition at level 5 using the Daubechies-1 wavelet filter is used in this work [2].
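The level-5 Daubechies-1 (Haar) decomposition of a contour signature can be sketched without any wavelet library, since the db1 filter reduces to pairwise sums and differences; this hand-rolled version matches the standard transform up to boundary handling (the padding of odd-length signals here is an assumption):

```python
import numpy as np

def haar_dwt_step(signal):
    """One level of the Daubechies-1 (Haar) DWT: approximation and detail."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                      # pad odd-length signals (assumption)
        s = np.append(s, s[-1])
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    return approx, detail

def wavedec_db1(signal, level=5):
    """Level-5 1-D Haar decomposition of a contour signature, i.e. the
    distance or orientation of contour points w.r.t. the reference point.
    Returns [detail_1, ..., detail_5, approx_5]."""
    coeffs = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(level):
        approx, detail = haar_dwt_step(approx)
        coeffs.append(detail)
    coeffs.append(approx)
    return coeffs
```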
Fig.7. Analysis over our hand dataset: (a) false rejection (FR) and false acceptance (FA) rates with different thresholds, and (b) genuine and imposter scores after score fusion of all three modalities.
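The threshold sweep behind a plot like Fig.7(a) can be sketched as below; the decision rule (accept when the similarity score is at least the threshold) is a standard convention assumed here, not stated in the source:

```python
import numpy as np

def far_frr_curve(genuine, imposter, thresholds):
    """False acceptance and false rejection rates over a threshold sweep,
    for similarity scores where a claim is accepted when score >= threshold."""
    genuine = np.asarray(genuine, dtype=float)
    imposter = np.asarray(imposter, dtype=float)
    far = np.array([(imposter >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    return far, frr
```

The operating threshold is typically chosen where the two curves cross (the equal error rate) or at a target FAR.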
Method

The framework to obtain the rotation-invariant hand contour with the peaks and valleys of each finger is demonstrated in Fig.2 with the help of an example hand image. Fig.3 shows the robustness of the reference point extraction over the wrist boundary of the hand.
Fig.4. Calculation of geometrical features: (a) hand contour with finger feature points and (b) the 7 distances used as geometrical features.
Thresholding and morphological operators are used to obtain the binary hand mask from the original hand image. If any portion of the hand becomes disjoint from the rest during this process, the scheme illustrated in Fig.1 is applied to fix each disjoint portion.
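A minimal sketch of the thresholding-plus-morphology step, assuming a grayscale input in [0, 1] and using standard SciPy morphology (the threshold value and structuring element size are illustrative choices; the disjoint-finger repair of Fig.1 is not shown):

```python
import numpy as np
from scipy import ndimage

def hand_mask(gray, threshold=0.5):
    """Binary hand mask from a grayscale image: threshold, then morphological
    closing to bridge small gaps, hole filling to remove interior holes,
    and keep only the largest connected component (the hand)."""
    mask = gray > threshold
    mask = ndimage.binary_closing(mask, structure=np.ones((5, 5)))
    mask = ndimage.binary_fill_holes(mask)
    labels, n = ndimage.label(mask)
    if n > 1:
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        mask = labels == (np.argmax(sizes) + 1)
    return mask
```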
Fig.2. Localization of rotation invariant hand contour and finger feature points such as peaks, valleys and mid-point of two adjacent valleys: (a) hand image, (b) hand mask, (c) centroid and orientation, (d) hand orientation registration, (e) contour with reference point, (f) peaks & valleys, and (g) finger feature points.
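The peaks and valleys of Fig.2(f) can be sketched as local extrema of the radial distance of contour points from the reference point; this is an illustrative helper, not the paper's exact procedure, and a real contour would additionally need smoothing and minimum-prominence filtering:

```python
import numpy as np

def peaks_and_valleys(contour, reference):
    """Candidate fingertip peaks and finger-valley points as local extrema
    of the radial distance signature of a closed contour."""
    contour = np.asarray(contour, dtype=float)
    r = np.linalg.norm(contour - np.asarray(reference, dtype=float), axis=1)
    prev, nxt = np.roll(r, 1), np.roll(r, -1)   # circular neighbors
    peaks = np.where((r > prev) & (r > nxt))[0]    # fingertips
    valleys = np.where((r < prev) & (r < nxt))[0]  # between-finger valleys
    return peaks, valleys
```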
Fig.6. Example images from (a) our JUET contact dataset and (b) the IITD contactless dataset.
References

[1] E. Yoruk, E. Konukoglu, B. Sankur, and J. Darbon, "Shape-based hand recognition", IEEE TIP, 15(7): 1803-1815, 2006.
[2] S.G. Mallat, "A theory for multiresolution signal decomposition: the wavelet representation", IEEE TPAMI, 11(7): 674-693, 1989.
[3] A. Kumar, "Incorporating cohort information for reliable palmprint authentication", Sixth ICVGIP, 2009.