Adaptive and Distributed Cryptography for Signature Biometrics Protection

Patrizio Campisi, Emanuele Maiorana, Miguel Gonzalez Prats, and Alessandro Neri
Dip. Elettronica Applicata, Università degli Studi “Roma Tre”, Via della Vasca Navale 84, I-00146 Roma, Italy

Corresponding author: Patrizio Campisi, E-mail: [email protected], Telephone: +39.06.55177064, Fax: +39.06.55177026.

ABSTRACT

Biometrics is the most rapidly emerging technology for people identification and authentication. In contrast with traditional recognition approaches, biometric authentication relies on who a person is or what a person does, being based on strictly personal traits that are much more difficult to forget, lose, steal, copy or forge than traditional data. In this paper we focus on two vulnerable points of biometric systems: the database where the templates are stored and the communication channel between the stored templates and the matcher. Specifically, we propose a method, based on user adaptive error correction codes, to achieve security and cancelability of the stored templates, applied to dynamic signature features. More in detail, the employed error correction code is tailored to the intra-class variability of each user's signature features. This leads to an enhancement of the system performance expressed in terms of false acceptance rate. Moreover, in order to avoid corruption or interception of the stored templates on the transmission channels, we propose a scheme based on threshold cryptography: the distribution of the certificate authority functionality among a number of nodes provides distributed, fault-tolerant, and hierarchical key management services. Experimental results show the effectiveness of our approach when compared to traditional non-secure correlation-based classifiers.

1. INTRODUCTION

The most emerging technology for people authentication is biometrics, which can be defined as the analysis of physiological or behavioral characteristics for automatic recognition. Biometric authentication relies on who a person is or what a person does, in contrast with traditional authentication approaches, based on what a person knows (a password) or what a person has (an ID card)1,2. It is therefore based on strictly personal traits, which are much more difficult to forget, lose, steal, copy or forge than traditional data.

Biometric authentication systems consist of two subsystems: the enrollment subsystem and the authentication subsystem. In the enrollment stage the biometric data are captured from a subject and checked for their quality, then the relevant information is extracted from them and eventually stored in a database. As for the authentication subsystem, two modalities can be taken into account.
• Verification: the subject who claims an identity presents some form of identifier (like a user ID or an ATM card) and a biometric characteristic. The system extracts the features from the acquired biometric sample and compares them with the features stored in the database under the provided ID.
• Identification: the system acquires the biometric sample from the subject, extracts features from the raw measurements, and searches the entire database for matches using the extracted biometric features.

Although biometric systems can improve security over existing methods of user authentication, security leakages, unintended by the designer,3 can be introduced in any real life system. As detailed in4, attacks can be perpetrated at the sensor level, like the coercive attack, where the true biometric is presented but in some unauthorized manner, the replay attack, where a recorded version of the true data is presented to the sensor, and the impersonation attack, where an unauthorized individual changes his/her biometric to appear like an authorized individual. Moreover, attacks can be perpetrated on the channels interconnecting the different parts of the biometric system, like the channel between the sensor and the biometric system (if one has access to this point), the channel between the feature extractor and the matcher, the channel between the matcher and the application device, thus overriding the output of the matching module, and the channel between the central or distributed database and the authentication system. Also, the feature extractor could be forced to produce a pre-selected feature, the matcher can be attacked to produce an artificially high or low match score, and attacks on the database itself can be carried out.

In this paper we focus on the protection of templates, which can be stored either in a central database or in distributed databases. A malicious user can steal a template both by intercepting a communication on the channel between the database and the authentication system and by breaking into the database. In both cases the biometric system is compromised. Although it was believed that it is not possible to reconstruct the original biometric data starting from the corresponding extracted template, some concrete counterexamples, which contradict this assumption, have been provided in the recent literature for faces5 and fingerprints6. Moreover, even if it is either not possible or computationally infeasible to reconstruct the original biometric data from a stolen template, the stolen template cannot be used anymore by the user to summarize the biometric data under analysis. In fact, whereas a password can be changed as many times as the user wants, the same is not possible for biometrics, since biometric traits are limited in number and cannot be easily changed by the user. This makes the template protection issue of paramount importance in the design of a biometric system.

There exist several security techniques to thwart attacks on the templates. Template encryption can be used to make the data useless without the knowledge of the key, which must be kept secret. However, once the key is broken or the data are decrypted, they are not protected anymore. To overcome this limitation, watermarking techniques9,10 can be used: either a time-stamp can be embedded into the template in such a way that after the expiration date the template is useless, or the template can be embedded into a host signal to make its presence undetectable. In11 a fragile watermarking method for fingerprint verification is proposed in order to detect tampering without lowering verification performance. In12 a secure biometric storage based on the use of syndrome codes is described. Error correction has also been employed to protect face biometric data in13, where Reed-Solomon codes have been used. Template protection is achieved in14 by splitting the template information and partially deleting it, while enabling its restoration through error correction coding. In some approaches template protection is addressed by integrating biometrics with cryptographic techniques15. In16 the so-called fuzzy commitment scheme, where a cryptographically hashed version of the data is stored/transmitted, was introduced. In17 the cancelable biometric concept is exploited: it consists of an intentional non-invertible distortion of the biometric data both in the enrollment stage and in the authentication one. Since only the distorted data are stored in the database, even if the database is compromised, the biometric data cannot be retrieved.
In18 biometric secrecy preservation and replaceability are obtained using random tokens together with multiple-bit discretization and permutation, thus obtaining replaceable cryptographic signature keys. Based on16, a new template protection approach, called Helper Data System, was proposed in19 for acoustic ear, in20 for fingerprint, and in21 for face recognition.

In this paper we propose a user adaptive cryptosystem, based on the fuzzy commitment scheme16, for the security of the stored data and their renewability. A practical implementation of the system has been designed with reference to dynamic signatures. The paper is organized as follows. In Section 2 some details on signature recognition and on the used features are given. The biometric protection scheme is presented in Section 3, where the reliable feature selection principles are outlined as well as the user adaptiveness of the method. In Section 4 a distributed storage architecture is proposed, aimed at avoiding interception or corruption of data and providing robustness and fault tolerance to the system. Eventually, experimental results and conclusions are given in Section 5.

2. DYNAMIC SIGNATURE BIOMETRICS

People recognition based on signatures is one of the most accepted biometric verification methods, since it is perceived as a non-invasive and non-threatening process by most users. Moreover, it is actually one of the most reliable behavioral biometrics together with speech, it is characterized by a high collectibility and, due to the large experience accumulated on it, it has a high legal value22,23,24. Signature recognition can be either static or dynamic: in the first mode, also referred to as off-line, only the signature image is acquired in a digitized form through an optical scanner or a camera, while in the second mode, also called on-line, signatures are acquired by a digitizing tablet or a pen-sensitive computer display. These devices usually capture the position of the pen (the coordinates x(t) and y(t), t being the temporal coordinate) and the pressure p(t) applied by the pen. To accomplish recognition or verification, further information is extracted from these raw data, such as global or local velocity, acceleration or strokes, to cite only a few, commonly referred to as signature features. On-line signatures are therefore more difficult to forge than off-line ones, so they are more suitable for personal authentication in legal and commercial transactions.

In our work, on-line signatures are acquired using an Interlink Electronics ePad-ink, based on a resistive touchpad with 300 dpi resolution. A total of 66 features, detailed in Table 1,25,26 are extracted from the acquired signals. Statistical moments M_{rz}, defined as M_{rz} = \sum_{c=1}^{C} x_c^r y_c^z, where C is the number of acquired samples and (x_c, y_c) are the coordinates of a signature sample, are also included.27 This set of features includes both global and local features, the latter being based on a partition of each signature into five consecutive and disjoint segments with the same temporal extension: the need for local features in the analysis of dynamic signatures is well expressed in.28

ID       Description
1        Number of Strokes
2        Time Duration
3        Number of Samples
4        Aspect Ratio (y vs x)
5-6      X and Y Area
7-8      Average X and Y Velocity
9-10     Absolute Average X and Y Velocity
11-12    Average X and Y Acceleration
13-14    Initial X and Y
15-16    Final X and Y
17-21    M11, M12, M21, M30 and M03
22-31    X and Y Sub-Areas (Local)
32-41    X and Y Sub-Velocities (Local)
42-51    X and Y Sub-Accelerations (Local)
52       Height
53       Width
54-56    Mean X, Y and Pressure Value
57-59    Maximum X, Y and Pressure Value
60-62    Minimum X, Y and Pressure Value
63-64    Maximum X and Y Velocity
65-66    Maximum X and Y Acceleration

Table 1. Features extracted from on-line signatures.
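For concreteness, the following Python/NumPy sketch computes a small, illustrative subset of the global features of Table 1 together with the statistical moments M_rz from raw pen samples. The sampling step dt, the use of numerical gradients for velocity, the omission of stroke segmentation and of the five-segment local features, and the helper name global_signature_features are simplifying assumptions of this sketch, not the paper's implementation.

```python
import numpy as np

def global_signature_features(x, y, p, dt):
    """Illustrative subset of the global features of Table 1, computed from raw
    pen samples x(t), y(t), p(t) acquired with a constant sampling step dt."""
    x, y, p = (np.asarray(a, dtype=float) for a in (x, y, p))
    vx, vy = np.gradient(x, dt), np.gradient(y, dt)          # pen velocities
    feats = {
        "num_samples": x.size,
        "duration": x.size * dt,
        "aspect_ratio": (y.max() - y.min()) / (x.max() - x.min()),
        "avg_vx": vx.mean(), "avg_vy": vy.mean(),
        "abs_avg_vx": np.abs(vx).mean(), "abs_avg_vy": np.abs(vy).mean(),
        "mean_pressure": p.mean(), "max_pressure": p.max(), "min_pressure": p.min(),
    }
    for r, z in [(1, 1), (1, 2), (2, 1), (3, 0), (0, 3)]:    # statistical moments
        feats[f"M{r}{z}"] = np.sum(x**r * y**z)              # M_rz = sum_c x_c^r * y_c^z
    return feats
```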

3. BIOMETRIC PROTECTION SCHEME

In this section the proposed scheme for the protection of biometric templates is presented. It is basically based on Juels' proposal on the use of helper data and error correcting codes16. This approach is twofold, allowing the system both to manage cancelable biometrics17 and to handle the intra-class variability of signatures. As can be expected from a behavioral biometric, signatures taken from a user can exhibit a considerable variability between different realizations, mainly due to a lack of user habit and to the different conditions of execution (seated or standing position, wide or narrow area for resting the arms). Therefore, the features extracted from a subject's signatures may not always be the same. This variability is here handled by considering the obtained templates as noisy versions of the "ideal" template, where the noise power is related to their variance. The proposed enrollment and authentication procedures are illustrated in Figures 1 and 3, respectively.

3.1. Enrollment stage

In the enrollment phase, for each subject s a number I of biometric measurements is recorded. Then, for each measurement, the K = 66 features detailed in Section 2 are extracted and collected in the feature vectors f_i^s, with i = 1, ..., I. The intra-class mean vector μ^s and the inter-class mean vector μ are then estimated as:

\mu^s = \frac{1}{I} \sum_{i=1}^{I} \mathbf{f}_i^s, \qquad \mu = \frac{1}{S} \sum_{s=1}^{S} \mu^s    (1)

Figure 1. Enrollment scheme: the acquired data are analyzed, quantized and summed to error correcting codes. The stored data are μ, HD1, HD2, HD3 and h(N ).

where S is the number of enrolled subjects. Then the enrolled feature vectors f_i^s, with i = 1, ..., I, for the user s are binarized by using the inter-class mean μ and collected as row vectors in a binary matrix B^s, with I (signature samples) rows and K (features) columns, whose generic element B^s[i, k] is obtained as:

B^s[i, k] = \begin{cases} 0 & \text{if } f_i^s[k] \le \mu[k] \\ 1 & \text{if } f_i^s[k] > \mu[k] \end{cases}    (2)

A binary vector representative of the features extracted from each signature made by the user s is then obtained in two different ways. In the first approach we make use of the estimated intra-class mean μ^s for the user s, thus obtaining the representative vector b_μ^s, whose k-th element is obtained as follows:

b_\mu^s[k] = \begin{cases} 0 & \text{if } \mu^s[k] \le \mu[k] \\ 1 & \text{if } \mu^s[k] > \mu[k] \end{cases}    (3)

In the second approach the k-th element of the representative vector b_p^s is selected by considering the most probable bit among the corresponding elements of the binarized features collected in the binary matrix B^s, as follows:

b_p^s[k] = \begin{cases} 0 & \text{if } \sum_{i=1}^{I} B^s[i, k] \le I/2 \\ 1 & \text{if } \sum_{i=1}^{I} B^s[i, k] > I/2 \end{cases}    (4)

3.1.1. Reliable Feature Selection

In the proposed scheme, after having determined B^s and either b_μ^s or b_p^s, the next step aims at reducing the number of features employed in the authentication stage. Only the subject's most reliable features are selected, thus counteracting the potential instability, for the single user, of the feature vector components. This is done according to a reliability measure. In21, where features extracted from faces are considered, a reliability measure is derived by assuming a Gaussian distribution for each feature. However, the Gaussianity assumption does not apply in the scenario we have considered. Extensive tests have pointed out that the employed signature features cannot be modeled according to either a Gaussian or a generalized Gaussian distribution. In Figure 2 the behavior of three features extracted from a set of signatures (Feature 18: statistical moment M12; Feature 43: second X-segment acceleration; Feature 59: maximum pressure value) is shown as an example, together with the Gaussian and generalized Gaussian probability density functions whose parameters are estimated from the experimental data.

Figure 2. Fit of three feature distributions to the Gaussian and generalized Gaussian models: a) Feature 18; b) Feature 43; c) Feature 59.

The fit of the Gaussian and the generalized Gaussian distributions to the data has also been tested. Specifically, the Goodness-of-Fit (GOF) test, the Chi-squared test, the Cramer-von Mises test, and the Anderson-Darling test29 have been used. The test results, collected in Table 2, highlight the poor match between the experimental data and the considered distributions.

                      Feature n. 18              Feature n. 43              Feature n. 59
Fitting Results    Gaussian  Gen. Gaussian    Gaussian  Gen. Gaussian    Gaussian  Gen. Gaussian
GOF                 0.6590      0.5387         0.2355      0.2204         0.8017      0.7436
Chi-squared         0.2915      0.1555         0.2368      0.2483         0.3013      0.1215
Cramer-von Mises    0.5359      0.4454         0.1104      0.1085         0.7892      0.7071
Anderson-Darling    1.5389      1.3676         0.3552      0.3497         2.6061      1.9456

Table 2. Test of fit of a Gaussian and Generalized Gaussian distribution to the data: Goodness-of-Fit, Chi-squared, Cramer-von Mises, and Anderson-Darling.
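As an indication of how such an analysis can be carried out, the sketch below fits a Gaussian to the samples of a single feature and computes standard goodness-of-fit statistics with SciPy. It is only illustrative: the paper relies on the statistical toolkit of29, and the generalized Gaussian model could be fitted analogously (e.g., with scipy.stats.gennorm.fit).

```python
import numpy as np
from scipy import stats

def gaussian_fit_statistics(feature_samples):
    """Fits a Gaussian to the samples of one feature and returns classical
    goodness-of-fit statistics (the smaller the statistic, the better the fit)."""
    x = np.asarray(feature_samples, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)          # maximum-likelihood-style Gaussian fit
    return {
        "anderson_darling": stats.anderson(x, dist="norm").statistic,
        "cramer_von_mises": stats.cramervonmises(x, "norm", args=(mu, sigma)).statistic,
        "kolmogorov_smirnov": stats.kstest(x, "norm", args=(mu, sigma)).statistic,
    }
```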

Therefore, in our approach we introduce a reliability measure not directly related to the signature features distribution. Specifically, for the k-th bit of the feature vector, the reliability R_1^s[k] is defined as follows:

R_1^s[k] = 1 - \frac{1}{I} \sum_{i=1}^{I} \left( B^s[i,k] \oplus b_{\mu/p}^s[k] \right)    (5)

where ⊕ represents the XOR operation and b_{μ/p}^s is given by either (3) or (4). In (5) the occurrence of the k-th binary value b_{μ/p}^s[k] in the corresponding elements of the binarized features collected in the binary matrix B^s is evaluated, thus giving a measure of the representativeness of the value b_{μ/p}^s[k] with respect to its counterpart obtained from a new signature by the same user. According to this measure, components with a high reliability possess a high discrimination capability. However, the reliability measure R_1^s[k] can result in components with the same reliability value. Then, in order to further discriminate among them, we introduce a second level of screening for the features, according to the following reliability measure:

R_2^s[k] = \frac{\left| \mu[k] - \mu^s[k] \right|}{\sigma^s[k]}    (6)

with

\sigma^s[k] = \sqrt{ \frac{1}{I-1} \sum_{i=1}^{I} \left( f_i^s[k] - \mu^s[k] \right)^2 }

being the standard deviation of the k-th feature of subject s. A higher discriminating power is trusted to features with a larger difference between μ^s[k] and μ[k], relative to the standard deviation σ^s[k]. Therefore, after the application of the reliability metrics to b_{μ/p}^s[k], we end up with the binary feature vector x^s containing the K' most reliable components. As shown in Figure 1, the indexes of the most reliable features for user s are stored in the Helper Data HD1^s.
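A minimal NumPy sketch of the binarization and reliable-feature selection steps described above is given below. It assumes the 66 features have already been extracted; the tie-breaking strategy between R_1^s and R_2^s and the small constant added to σ^s are implementation choices of this sketch, not prescribed by the paper.

```python
import numpy as np

def enroll_binary_template(F, mu_inter, K_prime, use_majority=False):
    """Binarization (eq. 2), representative vector (eq. 3 or 4) and
    reliable-feature selection (eqs. 5-6) for one subject.

    F            : (I, K) array of enrollment feature vectors f_i^s
    mu_inter     : (K,) inter-class mean vector mu of eq. (1)
    K_prime      : number of reliable components to keep
    use_majority : if True use b_p^s (eq. 4), otherwise b_mu^s (eq. 3)
    """
    I, K = F.shape
    mu_s = F.mean(axis=0)                               # intra-class mean (eq. 1)
    sigma_s = F.std(axis=0, ddof=1)                     # per-feature std deviation

    B = (F > mu_inter).astype(np.uint8)                 # binarized matrix B^s (eq. 2)
    if use_majority:
        b = (B.sum(axis=0) > I / 2).astype(np.uint8)    # b_p^s (eq. 4)
    else:
        b = (mu_s > mu_inter).astype(np.uint8)          # b_mu^s (eq. 3)

    R1 = 1.0 - np.mean(B ^ b, axis=0)                   # bit stability (eq. 5)
    R2 = np.abs(mu_inter - mu_s) / (sigma_s + 1e-12)    # second-level screening (eq. 6)

    order = np.lexsort((-R2, -R1))                      # sort by R1, break ties with R2
    HD1 = np.sort(order[:K_prime])                      # indexes of the reliable features
    x = b[HD1]                                          # binary feature vector x^s
    return x, HD1, mu_s, sigma_s
```

The returned HD1 plays the role of the Helper Data HD1^s, while x is the binary vector fed to the error correcting stage described next.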

In order to achieve both template protection and renewability, our scheme uses error correcting codes (BCH codes)30, whose error correction capability (ECC) can be selected a priori, as in21 and detailed in Section 3.1.2. In this paper we propose an error correcting code selection procedure depending on the intra-class variability of each user's signature, as pointed out in Section 3.1.3.

3.1.2. A priori Selection of the Error Correction Capability

After having obtained the binary feature vector x^s, a BCH code, whose ECC is decided a priori depending on the desired False Acceptance Rate (FAR) or False Rejection Rate (FRR), is employed. Specifically, a codeword c^s is generated from a randomly selected number N^s. Then a XOR operation between the codeword c^s and x^s is performed, thus obtaining:

HD3^s = x^s \oplus c^s    (7)

A hashed version h(N^s) of N^s, created using the SHA-1 algorithm31, is then stored, together with the inter-class mean μ and the Helper Data HD3^s.

3.1.3. Adaptive Selection of the Error Correction Capability

The approach described in Section 3.1.2 is able to provide cancelability of the template simply by changing the codeword c^s (i.e., the randomly generated number N^s) associated to the user during enrollment. However, we can also provide adaptability to the user's signature variability. This implies that we manage the intra-class variability of signatures, which reflects on bit differences between the feature vector x^s representative of the user s and the feature vector x̃^s obtained in the authentication stage from the same user s (see Figure 3). This is done by choosing the BCH code and its ECC, among a set of available codes, in such a way that for users whose signatures exhibit a high intra-class variability, codes with higher error correction capabilities are selected. Therefore, in the enrollment stage, an intra-class analysis is performed as follows. Once the K' reliable features are selected, as detailed in Section 3.1.1, the matrix X^s, having I rows and K' columns, is obtained from B^s, having I rows and K columns, by dropping the non-reliable features. Then, the Hamming distances D_i^s, with i = 1, ..., I, between each reliable feature vector acquired in the enrollment stage (the rows of X^s) and x^s are evaluated. The average Avg^s of the D_i^s values,

Avg^s = \frac{1}{I} \sum_{i=1}^{I} D_i^s    (8)

is then used to characterize the intra-class variability of the user s. Specifically, after having chosen a BCH code with codeword length equal to K', its ECC is selected to be equal to Ψ[Avg^s + v], where v is a value common to all users and the operator Ψ[·] returns the nearest available ECC greater than or equal to its argument. The selected BCH ECC is stored in HD2^s. Once the BCH code has been selected, a codeword c^s is generated from a randomly selected number N^s. Then a XOR operation between the codeword c^s and x^s is performed, thus obtaining:

HD3^s = x^s \oplus c^s    (9)

A hashed version h(N^s) of N^s, created using the SHA-1 algorithm, is eventually stored. The proposed framework provides security, since it is not possible to retrieve f^s from μ, HD1^s, HD2^s, HD3^s, and h(N^s).
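The sketch below illustrates the adaptive ECC selection of eq. (8) and the commitment of eq. (9). A full BCH codec is outside the scope of this example, so a simple repetition/padding encoder stands in for the BCH encoder: the selected ECC is only recorded in HD2 and is not actually exploited by the toy code. A real implementation would select a BCH code of length K' with the chosen error correction capability.

```python
import hashlib
import secrets
import numpy as np

def toy_encode(msg_bits, n):
    # Stand-in for the BCH encoder: repeat the secret bits and zero-pad up to
    # the codeword length n = K'.  Replace with a real BCH(n, k) encoder.
    rep = max(1, n // len(msg_bits))
    cw = np.repeat(msg_bits, rep)
    return np.pad(cw, (0, max(0, n - len(cw))))[:n]

def adaptive_ecc(X, x, v, available_ecc):
    """Adaptive ECC selection of Section 3.1.3: eq. (8) plus the operator Psi[.]."""
    D = np.count_nonzero(X ^ x, axis=1)                 # Hamming distances D_i^s
    target = D.mean() + v                               # Avg^s + v
    candidates = [t for t in sorted(available_ecc) if t >= target]
    return candidates[0] if candidates else max(available_ecc)

def enroll_commitment(X, x, v, available_ecc, secret_bytes=2):
    """Builds the user record {HD2^s, HD3^s, h(N^s)} of the enrollment stage."""
    ecc = adaptive_ecc(X, x, v, available_ecc)          # stored as HD2^s
    N = secrets.token_bytes(secret_bytes)               # random secret N^s
    msg_bits = np.unpackbits(np.frombuffer(N, dtype=np.uint8))
    c = toy_encode(msg_bits, len(x))                    # codeword c^s
    HD3 = x ^ c                                         # fuzzy commitment, eq. (9)
    return {"HD2": ecc, "HD3": HD3, "hN": hashlib.sha1(N).hexdigest()}
```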

3.2. Authentication stage

Figure 3. Authentication scheme: when a subject claims his identity, a response is given using the stored data μ, HD1, HD2, HD3 and h(N ).

The authentication phase follows the same steps as the enrollment stage (see Figure 3). When a subject claims his identity, he provides his signature, which is converted into the feature vector f̃^s. The quantization is then performed using the inter-class mean μ, thus obtaining b̃^s. The reliable features x̃^s are selected using HD1^s. The codeword c̃^s results from the XOR operation

\tilde{c}^s = \tilde{x}^s \oplus HD3^s    (10)

The BCH decoder is selected depending on the encoder used in the enrollment stage, obtaining Ñ^s from c̃^s. Finally, the SHA-1 hashed version h(Ñ^s) is compared to h(N^s): if the two values are identical, the subject is authenticated.
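A matching sketch of the authentication stage is given below. Here record is assumed to bundle the quantities stored at enrollment (HD1 from the feature-selection sketch, HD2, HD3 and h(N) from the enrollment sketch), and toy_decode is the majority-vote counterpart of the toy encoder used there, standing in for the real BCH decoder.

```python
import hashlib
import numpy as np

def toy_decode(code_bits, msg_len):
    # Majority-vote decoder matching toy_encode in the enrollment sketch;
    # a real BCH decoder (chosen according to HD2^s) should replace it.
    rep = max(1, len(code_bits) // msg_len)
    blocks = np.asarray(code_bits[:msg_len * rep]).reshape(msg_len, rep)
    return (blocks.sum(axis=1) * 2 > rep).astype(np.uint8)

def authenticate(f_tilde, mu_inter, record, secret_bytes=2):
    """Authentication stage of Section 3.2 for one claimed identity."""
    b_tilde = (np.asarray(f_tilde) > mu_inter).astype(np.uint8)   # quantization with mu
    x_tilde = b_tilde[record["HD1"]]                              # reliable features via HD1^s
    c_tilde = x_tilde ^ record["HD3"]                             # eq. (10)
    N_tilde = np.packbits(toy_decode(c_tilde, 8 * secret_bytes)).tobytes()
    return hashlib.sha1(N_tilde).hexdigest() == record["hN"]      # compare SHA-1 digests
```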

4. THRESHOLD CRYPTOGRAPHY ARCHITECTURE

In the proposed approach, five different entities have to be stored for each user: the inter-class mean μ, the Helper Data HD1, HD2, HD3, and the hash h(N). None of these parameters reveals a specific user's biometrics by itself, but each of them carries a piece of information needed in the authentication phase: if any of these entities is not available at a certain time, user authentication is not possible. Moreover, the parameter μ alone can give information on general traits of the users' signatures. In addition, as for the storage of biometric templates, we have to design both a secure storage system and a secure modality to transmit the stored templates to the matcher17. Applying simple cryptographic algorithms may not be enough: even if an intruder cannot retrieve the feature values, he can nevertheless corrupt the data, making authentication impossible. Moreover, for an attack coming from the inside of the system, it is possible to retrieve the cryptographic key and decrypt the information. In order to avoid corruption, modification or interception of the stored templates, we propose a scheme based on threshold cryptography32,33. In this system, the distribution of the certificate authority functionality among a number of nodes provides distributed, fault-tolerant, and hierarchical key management services. Shamir34 was the first to highlight the possibility of sharing a secret key among different entities. The first practical threshold cryptosystem, based on ElGamal and RSA cryptographic keys, was proposed by Desmedt in32. In brief, threshold cryptography allows l entities to share the ability of performing a cryptographic operation, distributing trust and building a highly available and secure key management service. In a t-out-of-l scheme, any t parties, where t is the system threshold, can perform this operation jointly, whereas it is infeasible for any t - 1 parties (or fewer) to do so, even by colluding. In such a system, an attacker needs to break into at least t nodes in order to compromise system security. The values of t and l may be chosen according to the desired security and fault-tolerance requirements: lowering t lowers the security provided, while increasing it can be useful for more security-critical services. Threshold cryptography is widely used in computer networks to provide security in terms of availability, confidentiality, and secure key or data distribution, finding its main applications in document authorization or verification, e-commerce transactions, and distributed on-line certification authorities33-35.

The employed approach uses RSA together with a Lagrange interpolating polynomial scheme. It takes the public RSA key (n, e) and the private RSA key (n, d) and distributes the private exponent d over l different servers. When M^d (mod n), i.e., the RSA signature or decryption of a message M, has to be computed, each server outputs a partial signature S_i = M^{α_i}, instead of sending its key share, and therefore without revealing the private key to any party. The receiver selects t partial signatures, from which it computes the original signature \prod_{i=1}^{t} S_i, as shown in Figure 4.

Figure 4. Distributed Architecture based on Threshold Cryptography.
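As a purely numerical illustration of the combination step, the toy example below uses a textbook-sized RSA key and additive exponent shares produced by a trusted dealer. The paper instead derives the exponents α_i from the Lagrange-interpolated shares of eq. (12), and a key of this size offers no security; the point of the sketch is only that the product of the partial signatures equals the signature a single holder of d would produce.

```python
import secrets

# Textbook-size RSA key: p = 61, q = 53, so n = 3233, e = 17, d = 2753.
n, e, d = 3233, 17, 2753
t = 3                                         # number of combining servers

# A trusted dealer splits d into t non-negative additive shares (illustration only).
a1 = secrets.randbelow(d + 1)
a2 = secrets.randbelow(d - a1 + 1)
alphas = [a1, a2, d - a1 - a2]

M = 65                                        # message representative, coprime with n
partials = [pow(M, a, n) for a in alphas]     # each server outputs S_i = M^alpha_i mod n

signature = 1
for S_i in partials:                          # the receiver combines: prod_i S_i = M^d mod n
    signature = signature * S_i % n

assert signature == pow(M, d, n)              # same result as a single holder of d
assert pow(signature, e, n) == M              # verifies under the public key (n, e)
```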

To obtain the partial keys α_i, a polynomial h(x) = a_0 + a_1 x + ... + a_{t-1} x^{t-1} of degree t - 1, with a_0 = d the secret key, is considered. It can be shown that, given t points (x_1, K_1), ..., (x_t, K_t) with distinct x coordinates, there is a unique polynomial of degree ≤ t - 1 passing through them. The Lagrange interpolation polynomial is:

h(x) = \sum_{i=1}^{t} K_i \prod_{j=1, j \neq i}^{t} \frac{x - x_j}{x_i - x_j}    (11)

The different shares can be taken as K_i = h(x_i) for 1 ≤ i ≤ l. Given t of these shares, the Lagrange polynomial, and hence the secret, can be reconstructed, whereas any t - 1 shares are not sufficient to do so. The basic sharing procedure assumes32:

M^d = \prod_{i=1}^{t} M^{\alpha_i} = M^{\sum_{i=1}^{t} \alpha_i}, \qquad d = h(0) = \sum_{i=1}^{t} \alpha_i = \sum_{i=1}^{t} K_i \prod_{j=1, j \neq i}^{t} \frac{0 - x_j}{x_i - x_j} \;\Rightarrow\; \alpha_i = K_i \prod_{j=1, j \neq i}^{t} \frac{0 - x_j}{x_i - x_j}    (12)

where xi are the (known) indices assigned to each server (for simplicity it can be xi = i).
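A minimal sketch of the share generation K_i = h(x_i) and of the reconstruction of the secret at x = 0, i.e., eqs. (11) and (12), is given below. For exactness the arithmetic is carried out over a prime field of our choosing; applying the same mechanism to an RSA private exponent, as in the threshold scheme above, requires the interpolation to be handled in the exponent and is not reproduced here.

```python
import secrets

Q = 2**127 - 1        # prime modulus (a Mersenne prime); all arithmetic is in GF(Q)

def make_shares(secret, t, l):
    """Shamir t-out-of-l sharing: K_i = h(x_i), with h of degree t-1 and a_0 = secret."""
    coeffs = [secret] + [secrets.randbelow(Q) for _ in range(t - 1)]
    def h(x):
        return sum(a * pow(x, j, Q) for j, a in enumerate(coeffs)) % Q
    return [(x, h(x)) for x in range(1, l + 1)]

def reconstruct(shares):
    """Lagrange interpolation (eq. 11) evaluated at x = 0, i.e. d = sum_i alpha_i (eq. 12)."""
    secret = 0
    for xi, Ki in shares:
        lam = 1
        for xj, _ in shares:
            if xj != xi:
                lam = lam * (-xj) % Q * pow((xi - xj) % Q, -1, Q) % Q   # (0 - x_j)/(x_i - x_j)
        secret = (secret + Ki * lam) % Q                                # accumulate alpha_i = K_i * lam
    return secret

d = secrets.randbelow(Q)                  # the secret to protect, e.g. a private exponent
shares = make_shares(d, t=3, l=5)
assert reconstruct(shares[:3]) == d       # any t = 3 shares recover d
assert reconstruct([shares[0], shares[2], shares[4]]) == d
```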

5. EXPERIMENTAL RESULTS AND CONCLUSIONS

In this Section an extensive set of experimental results concerning the system performance is presented. Specifically, we have performed experiments in order to compare:
• the adaptive approach detailed in Section 3.1.3 with the non-adaptive one described in Section 3.1.2, which is referred to as classic;
• the use of the reliability measure R_1^s[k] alone with the use of both R_1^s[k] and R_2^s[k], as explained in Section 3.1.1;
• the use of the binary vector b_μ^s with the use of b_p^s to represent each user during enrollment.
Moreover, to understand the influence of the enrollment phase on the authentication process, we performed two tests with different numbers of signatures acquired during enrollment. Specifically, ten signatures were recorded by thirty subjects in Test A, whereas only five signatures were acquired from the same set of subjects in Test B.

BCH Code K  ECC 31 2 31 3 31 5

FRR 4,47% 2,47% 1,07%

Test A FARRF 7,53% 12,15% 23,29%

63 63 63

13,87% 11,33% 6,13%

10,45% 11,83% 15,11%

BCH Code K  ECC 31 2 31 3 31 5

FRR 4,47% 2,47% 1,07%

Test A FARRF 7,53% 12,15% 23,29%

63 63 63

8,27% 6,13% 3,20%

12,10% 13,82% 18,00%

10 11 13

10 11 13

FARSF 27,33% 41,00% 62,00%

FRR 12,53% 6,53% 2,73%

Test B FARRF 4,22% 7,56% 16,58%

FAR SF 15,00% 29,33% 49,00%

BCH Code K  ECC 31 2 31 3 31 5

FRR 3,20% 1,80% 0,73%

Test A FARRF 11,95% 15,92% 25,49%

32,33% 35,00% 40,33% (a)

14,07% 10,53% 4,06%

9,64% 14,28% 10,82%

24,33% 27,33% 36,66%

63 63 63

13,80% 11,20% 6,13%

10,48% 11,80% 15,15%

FARSF 27,33% 41,00% 62,00%

FRR 12,53% 6,53% 2,73%

Test B FARRF 4,22% 7,56% 16,58%

FAR SF 15,00% 29,33% 49,00%

BCH Code K  ECC 31 2 31 3 31 5

FRR 3,20% 1,80% 0,73%

Test A FARRF 11,95% 15,92% 25,49%

34,67% 39,00% 47,66% (c)

12,00% 9,33% 4,07%

10,85% 16,32% 12,43%

28,33% 31,00% 42,00%

63 63 63

8,47% 6,13% 3,13%

12,11% 13,83% 18,46%

10 11 13

10 11 13

FARSF 42,33% 53,66% 65,00%

FRR 6,20% 4,06% 1,80%

Test B FARRF 9,49% 14,16% 22,47%

FAR SF 36,33% 45,00% 58,00%

32,33% 34,66% 40,33% (b)

14,33% 10,86% 4,20%

9,55% 10,96% 14,36%

24,33% 27,33% 36,66%

FARSF 42,33% 53,66% 65,00%

FRR 6,20% 4,06% 1,80%

Test B FARRF 9,49% 14,16% 22,47%

FAR SF 36,33% 45,00% 58,00%

34,67% 39,00% 47,33% (d)

11,87% 9,67% 4,20%

10,73% 12,51% 16,34%

27,33% 30,66% 41,66%

Table 3. First experiment: system performance without adaptive BCH code selection, varying K' and the code ECC. The results refer to systems using: a) R_1^s[k] and b_μ^s; b) both R_1^s[k] and R_2^s[k], and b_μ^s; c) R_1^s[k] and b_p^s; d) both R_1^s[k] and R_2^s[k], and b_p^s.

The first experiment was aimed at evaluating the system performance without the use of adaptive BCH code selection. Several BCH codes with different ECC have been applied; specifically, we have used BCH codes with codeword lengths K' = 31 and K' = 63, each with different ECC values. The system performance has been assessed through the FRR and the FAR, as shown in Table 3. More in detail, for FRR estimation fifty signatures from each enrolled subject have been recorded at different times over the timespan of a week. The FAR refers both to conditions of random forgeries22, indicated as FAR_RF, and to conditions of skilled forgeries, indicated as FAR_SF. For each subject, the fifty signatures of all the remaining twenty-nine users are used as random forgeries, whereas in the case of skilled forgeries a test set of ten skilled forgeries was created for each subject, allowing a training time of ten minutes for each signature whose original was made available to the forger. Specifically, Table 3.a shows the system performance obtained using the reliability measure R_1^s[k] and b_μ^s as the subject's representative binary vector, for different values of K' and ECC. Table 3.b refers to a system using both R_1^s[k] and R_2^s[k], and b_μ^s as the subject's representative binary vector. Moreover, Table 3.c refers to a system using R_1^s[k] and b_p^s, while Table 3.d refers to a system using both R_1^s[k] and R_2^s[k], and b_p^s.

In the second experiment we tested the system using the adaptive code selection scheme proposed here, which results in improved performance, as shown in Table 4. The ECC is selected as detailed in Section 3.1.3, using the values v = 1, 2, 3 for K' = 31 and v = 3, 4, 5 for K' = 63. The obtained experimental results highlight that:
• The use of the adaptive code selection method improves the performance of the system in terms of FAR, especially when skilled forgeries are taken into account, while the performance in terms of FRR is the same as with the classic approach.
• Increasing the value of K' lowers the performance in terms of FRR, while only a small improvement of the performance in terms of FAR is achieved.
• Taking more signatures during enrollment results in a small increase of the FAR and a significant decrease of the FRR, thanks to the possibility of better estimating the statistics of each user.
• Selecting the most reliable features according to the reliability measure R_1^s[k] only leads to better FAR performance, while using both reliability measures implies better performance in terms of FRR.
• For representing each user s, the use of the binary vector b_p^s instead of b_μ^s increases the performance in terms of FAR, while it lowers the performance in terms of FRR.

K 31 31 31

BCH Code ECC Ψ[Avgs + 1] Ψ[Avgs + 2] Ψ[Avgs + 3]

FRR 8,60% 4,40% 2,40%

Test A FARRF 3,45% 7,55% 12,38%

63 63 63

Ψ[Avgs + 3] Ψ[Avgs + 4] Ψ[Avgs + 5]

10,07% 7,13% 5,27%

5,29% 7,58% 10,48%

K 31 31 31

BCH Code ECC Ψ[Avgs + 1] Ψ[Avgs + 2] Ψ[Avgs + 3]

FRR 8,60% 4,40% 2,40%

Test A FARRF 3,45% 7,55% 12,38%

63 63 63

Ψ[Avgs + 3] Ψ[Avgs + 4] Ψ[Avgs + 5]

10,00% 7,00% 5,87%

6,47% 8,71% 10,86%

FARSF 13,67% 28,00% 41,33%

FRR 25,47% 12,53% 6,53%

Test B FARRF 1,79% 4,22% 7,56%

FAR SF 5,33% 15,00% 29,33%

K 31 31 31

BCH Code ECC Ψ[Avgs + 1] Ψ[Avgs + 2] Ψ[Avgs + 3]

FRR 5,87% 3,13% 1,73%

22,00% 30,33% 37,00% (a)

13,60% 10,00% 7,13%

3,97% 6,18% 8,13%

19,33% 23,00% 27,67%

63 63 63

Ψ[Avgs + 3] Ψ[Avgs + 4] Ψ[Avgs + 5]

10,00% 6,93% 5,27%

FARSF 13,67% 28,00% 41,33%

FRR 25,47% 12,53% 6,53%

Test B FARRF 1,79% 4,22% 7,56%

FAR SF 5,33% 15,00% 29,33%

K 31 31 31

BCH Code ECC Ψ[Avgs + 1] Ψ[Avgs + 2] Ψ[Avgs + 3]

FRR 5,87% 3,13% 1,73%

22,33% 29,67% 34,67% (c)

13,93% 9,67% 7,80%

4,80% 7,03% 8,87%

19,33% 25,67% 30,67%

63 63 63

Ψ[Avgs + 3] Ψ[Avgs + 4] Ψ[Avgs + 5]

10,13% 7,00% 5,67%

Test A FARRF 7,26% 11,95% 16,03% 5,36% 7,52% 10,52% (b) Test A FARRF 7,26% 11,95% 16,03% 6,53% 8,63% 10,90% (d)

FAR SF 31,33% 43,00% 54,00%

FRR 12,67% 6,20% 4,07%

Test B FARRF 5,31% 9,49% 14,16%

FARSF 24,33% 36,33% 45,00%

21,33% 30,67% 33,73%

13,07% 9,33% 6,73%

4,00% 6,10% 8,08%

20,67% 23,00% 27,00%

FAR SF 31,33% 43,00% 54,00%

FRR 12,67% 6,20% 4,07%

Test B FARRF 5,31% 9,49% 14,16%

FARSF 24,33% 36,33% 45,00%

21,67% 29,33% 35,00%

14,53% 9,87% 8,20%

4,77% 6,95% 8,74%

20,33% 24,00% 28,67%

Table 4. Second experiment: system performance with adaptive BCH code selection, varying K' and the code ECC. The results refer to systems using: a) R_1^s[k] and b_μ^s; b) both R_1^s[k] and R_2^s[k], and b_μ^s; c) R_1^s[k] and b_p^s; d) both R_1^s[k] and R_2^s[k], and b_p^s.

Therefore, while the adoption of the adaptive code selection method gives better performance overall, the use of b_p^s instead of b_μ^s, the use of both reliability measures, the number of signatures acquired at enrollment, and the codeword length have to be chosen according to the desired FRR (for applications focused on user acceptability) or FAR (for security-focused applications).

REFERENCES
1. A.K. Jain, An introduction to biometric recognition, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 14, No. 1, 2004.
2. R.M. Bolle, J.H. Connell, S. Pankanti, N.K. Ratha, A.W. Senior, Guide to Biometrics, Springer, New York, USA, 2004.
3. S. Prabhakar, S. Pankanti, A.K. Jain, Biometric recognition: security and privacy concerns, IEEE Security & Privacy Magazine, Vol. 1, pp. 33-42, 2003.
4. N. Ratha, J.H. Connell, R.M. Bolle, An analysis of minutiae matching strength, in Proc. Int. Conf. on Audio- and Video-based Biometric Person Authentication, Halmstad, Sweden, June 2001, pp. 223-228.
5. A. Adler, Can images be regenerated from biometric templates?, Proc. Biometrics Consortium Conf., Washington, D.C., Sep. 22-24, 2003.
6. U. Uludag, A.K. Jain, Attacks on biometric systems: a case study in fingerprints, in Proc. SPIE-EI Security, Steganography and Watermarking of Multimedia Contents VI, San Jose, CA, Jan. 2004, pp. 622-633.
7. L. Chen, S. Pearson, A. Vamvakas, On enhancing biometric authentication with data protection, Fourth International Conference on Knowledge-Based Intelligent Engineering Systems and Allied Technologies, Sept. 2000.
8. N.K. Ratha, J.H. Connell, R.M. Bolle, A biometrics-based secure authentication system, Proc. of IEEE Workshop on Automatic Identification Advanced Technologies, pp. 70-73, 1999.
9. N.K. Ratha, J.H. Connell, R. Bolle, Secure data hiding in wavelet compressed fingerprint images, ACM Multimedia 2000 Workshops Proc., pp. 127-130, 2000.
10. A.K. Jain, U. Uludag, Hiding biometric data, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 25, No. 11, pp. 1494-1498, 2003.
11. S. Pankanti, M.M. Yeung, Verification watermarks on fingerprint recognition and retrieval, Proc. SPIE, Vol. 3657, pp. 66-78, 1999.
12. E. Martinian, S. Yekhanin, J. Yedidia, Secure biometrics via syndromes, Allerton Conference on Communications, Control and Computing, 2005.
13. Y. Cheng Feng, P.C. Yuen, Protecting face biometric data on smartcard with Reed-Solomon code, Proc. Computer Vision and Pattern Recognition Workshop, 2006.
14. T. Ohki, S. Akatsuka, N. Komatsu, Safety of templates in biometric person authentication using error-correcting code, Proc. SPIE, Security, Steganography, and Watermarking of Multimedia Contents VIII, 2006.
15. U. Uludag, S. Pankanti, S. Prabhakar, A.K. Jain, Biometric cryptosystems: issues and challenges, Proc. of the IEEE, Vol. 92, No. 6, pp. 948-960, June 2004.
16. A. Juels, M. Wattenberg, A fuzzy commitment scheme, 6th ACM Conf. on Computer and Communication Security, pp. 28-36, 1999.
17. N.K. Ratha, J.H. Connell, R. Bolle, Enhancing security and privacy of biometric-based authentication systems, IBM Systems Journal, Vol. 40, No. 3, 2002.
18. Y.W. Kuan, A. Goh, D. Ngo, A. Teoh, Cryptographic keys from dynamic hand-signatures with biometric secrecy preservation and replaceability, Proc. Fourth IEEE Workshop on Automatic Identification Advanced Technologies, pp. 27-32, 2005.
19. P. Tuyls, E. Verbitsky, T. Ignatenko, D. Schobben, T.H. Akkermans, Privacy protected biometric templates: acoustic ear identification, SPIE Proc., Vol. 5404, pp. 176-182, 2004.
20. P. Tuyls, A. Akkermans, T. Kevenaar, G.J. Schrijen, A. Bazen, R. Veldhuis, Practical biometric template protection system based on reliable components, AVBPA Proc., 2005.
21. M. Van der Veen, T. Kevenaar, G.-J. Schrijen, T.H. Akkermans, F. Zuo, Face biometrics with renewable templates, Security, Steganography, and Watermarking of Multimedia Contents, SPIE Proc., Vol. 6072, 2006.
22. M. Faundez-Zanuy, Signature recognition state-of-the-art, IEEE Aerospace and Electronic Systems Magazine, Vol. 20, No. 7, pp. 28-32, 2005.
23. C. Vielhauer, R. Steinmetz, A. Mayerhöfer, Biometric hash based on statistical features of online signatures, International Conference on Pattern Recognition (ICPR), IEEE Proc., Vol. 1, pp. 123-126, 2002.
24. M. Freire-Santos, J. Fierrez-Aguilar, J. Ortega-Garcia, Cryptographic key generation using handwritten signature, Defense and Security Symposium, Biometric Technologies for Human Identification, SPIE Proc., Vol. 6202, pp. 225-231, 2006.
25. F. Bauer, B. Wirtz, Parameter reduction and personalized parameter selection for automatic signature verification, International Conference on Document Analysis and Recognition, Vol. 1, pp. 183-186, 1995.
26. T.H. Rhee, S.J. Cho, J.H. Kim, On-line signature verification using model-guided segmentation and discriminative feature selection for skilled forgeries, Sixth International Conference on Document Analysis and Recognition, pp. 645-649, 2001.
27. R.M. Guest, The repeatability of signatures, Ninth International Workshop on Frontiers in Handwriting Recognition, pp. 492-497, 2004.
28. F. Ramirez Rioja, M.N. Miyatake, H. Pérez Meana, K. Toscano, Dynamic features extraction for on-line signature verification, 14th International Conference on Electronics, Communications and Computers, pp. 156-161, 2004.
29. G.A.P. Cirrone, S. Donadio, S. Guatelli, A. Mantero, B. Mascialino, S. Parlati, M.G. Pia, A. Pfeiffer, A. Ribon, P. Viarengo, A goodness-of-fit statistical toolkit, IEEE Transactions on Nuclear Science, Vol. 51, No. 5, 2004.
30. M. Purser, Introduction to Error-Correcting Codes, Artech House, Boston, 1995.
31. Federal Information Processing Standards (FIPS) Publication 180-1, Secure Hash Standard, http://www.itl.nist.gov/fipspubs/fip180-1.htm, April 1995.
32. Y. Desmedt, Y. Frankel, Threshold cryptosystems, Advances in Cryptology - Crypto '89, Lecture Notes in Computer Science 435, Springer-Verlag, pp. 307-315, 1990.
33. Y. Desmedt, Some recent research aspects of threshold cryptography, Information Security Proc., Lecture Notes in Computer Science 1396, Springer-Verlag, pp. 158-173, 1997.
34. A. Shamir, How to share a secret, Communications of the ACM, Vol. 22, pp. 612-613, 1979.
35. L. Zhou, Towards Fault-Tolerant and Secure On-line Services, Ph.D. dissertation, Dept. of Computer Science, Cornell University, Ithaca, NY, 2001.
