2015 IEEE 39th Annual International Computers, Software & Applications Conference
A Novel Touchscreen-based Authentication Scheme Using Static and Dynamic Hand Biometrics

Mengyu Qiao1, Suiyuan Zhang1, Andrew H. Sung2, and Qingzhong Liu3 1
Department of Mathematics and Computer Science, South Dakota School of Mines and Technology, Rapid City, SD 57701, USA 2 School of Computing, The University of Southern Mississippi, Hattiesburg, MS 39406, USA 3 Department of Computer Science, Sam Houston State University, Huntsville, TX 77341, USA
As the gateway to computer systems and applications, user authentication is a widely adopted and effective security measure for protecting confidential data. Without a reliable method of user authentication, data access may be granted to unauthorized individuals or groups that steal confidential information for malicious purposes, or, in the case of businesses, use the information to their financial advantage [10]. A user’s identity can be verified using: (1) something the user has, such as a driver’s license or passport; (2) something the user knows, such as a password or PIN; (3) something the user is, such as fingerprint, iris, and vein patterns; (4) something the user does, such as keystroke dynamics and touch gestures [2] [4]. In consideration of cost, convenience, and feasibility, the password has been the most widely used method for user authentication on mobile devices. There are three popular password schemes used on mobile devices: text-based passwords, shape-based passwords, and biometric passwords. A text-based password is the traditional scheme, in which a user chooses a string of digits, letters, or symbols as their password. A shape-based password is a more advanced scheme developed for devices with touchscreens, which allows the user to perform a single-touch shape-based gesture as the password. A biometric password uses additional biometric sensors to collect biometric features of the user and uses them as the password. Biometric passwords eliminate the need for users to memorize potentially many passwords and at the same time increase the security of the authentication. The problem with text-based and shape-based passwords is that they can easily be disclosed and reproduced using traces left on the screen or through shoulder-surfing attacks. A biometric password offers much higher security, but it may require an additional device or sensor to read the biometric information. The development of the touchscreen has greatly reshaped human-computer interaction.
The two main types of technology used to sense touch are resistive and capacitive sensors. Capacitive touchscreens currently dominate the smartphone and portable electronics markets. Unlike a resistive touchscreen, a capacitive touchscreen panel has no moving parts, so the risk of screen damage is much lower. The latest generation of touchscreens is equipped with
Abstract— With the boom in smartphones and high-speed wireless networks in recent years, applications and data have been shifting from desktop to mobile devices at a vigorous pace. Although mobile computing provides great convenience in daily life, it is also vulnerable to various types of emerging attacks. User authentication plays an indispensable role in protecting computer systems and applications, but the development of touchscreen hardware and changing user habits pose new requirements for authentication methods on mobile and tablet devices. In this paper, we present a robust user authentication scheme using both static and dynamic features of touch gestures. We take advantage of the pressure sensitivity of multi-touch screens to obtain irreproducible biometric patterns. Discriminative features such as distance, angle, and pressure are extracted from the touch-point data and used in statistical analysis for verification. We tested our scheme in a variety of experiments in which multiple volunteers performed various gestures. The analysis of experimental results and user feedback indicates that the proposed scheme delivers comprehensive measurements and accurate pattern classification for touch gestures. Based on these results, we conclude that the proposed scheme overcomes the limitations of existing user authentication methods and shows great potential to provide robust protection against unauthorized access.

Keywords-multi-touch gestures; biometric; authentication; password; pressure sensitivity
I. INTRODUCTION
As one of the predominant means of communication, mobile phones have become an indispensable part of our daily life. Smartphones are increasingly prevalent and versatile, handling more tasks such as web browsing, emailing, entertainment, navigation, trading, and online shopping. However, the popularity of smartphones and the vast number of applications also make these platforms more attractive to attackers. Currently, various types of attacks have been devised and carried out to gain unauthorized access on different smartphone platforms. Protecting a user’s personal data on a mobile device is becoming more crucial than ever.
0730-3157/15 $31.00 © 2015 IEEE DOI 10.1109/COMPSAC.2015.133
user’s biometric features, such as fingerprints and irises. The last type is based on “something you do”, which characterizes a user’s behavior, such as keystroke dynamics and touch gestures. Due to various limitations in design, production, and usage, three password schemes are primarily used for user authentication on mobile devices: text-based passwords, shape-based passwords, and biometric passwords.
even more sensors, such as a pressure sensor that measures the pressure of the object touching the screen [8] [12]. The growing awareness of security has increased the demand for convenient and robust user authentication methods. A new method should aim at overcoming the drawbacks of the existing methods and achieve higher security and reliability. These desired properties of user authentication may be accomplished by fusing different password schemes. Advancements in multi-touch screen technologies enable the collection of more touch-related information, such as the position, pressure, and size of multiple touch points. From these data, static and dynamic features of a touch gesture can be derived. Using this comprehensive feature set, the new approach can enhance the security of user authentication while still providing a simple user interface. In this paper, we propose a new approach to user authentication for mobile devices equipped with multi-touch screens. We extract static and dynamic features from the raw touch-point data, such as distance, angle, and pressure, which reflect the unique characteristics of each gesture pattern and the user’s hand geometry. Through statistical analysis, we found that features such as distance and angle primarily describe the physical shape of the gesture, while features such as pressure and touch-point size mainly represent attributes of the user. In other words, some features tend to distinguish gestures and others tend to distinguish persons. Thus, we combined different features to enhance the accuracy of authentication. As a proof of concept, we designed an algorithm for the proposed authentication scheme and implemented a prototype app on the Android platform. The app can register multiple users and verify users’ identities. To test our scheme, we conducted a variety of experiments that involved multiple volunteers, and analyzed various gestures.
Experimental results show that our scheme is promising for accurate user authentication on mobile and tablet devices. The rest of this paper is organized as follows. Section II briefly reviews related work on different types of passwords and their applications for user authentication. Section III describes our scheme for touchscreen-based user authentication, which includes data preprocessing, feature extraction, and similarity measures. Section IV presents the implementation of the proposed scheme on the Android platform. Section V presents experiments and discussions, followed by conclusions in Section VI.
A. Text-based Password
The traditional text-based password scheme is still the most popular way to authenticate a user. However, the weak password problem has revealed many vulnerabilities and thus undermined our trust in its security. The widespread use of touchscreen devices and the increasing number of passwords users must manage have made choosing a weak password more common. A study [3] has shown that users tend to choose weak passwords on mobile devices, because typing on a touchscreen is 31% slower than on a physical keyboard. Since unlocking a mobile device is a frequent action, most users tend to use a simpler password that is related to their personal information, such as a birthday or phone number. Another study [1] has shown that 61% of users prefer to use the same password for everything. A text-based password is also easy to steal or crack; for example, a proof-of-concept app has been developed to measure the thermal heat signatures left on the text input device, so that passwords can be recognized from the intensity patterns of the heat signatures [7].

B. Shape-based Password
New technology provides more possibilities for advanced user authentication. “Pattern Lock”, implemented on the Android platform, is one shape-based password scheme. Unlike the traditional text-based password, a shape-based password is a pattern or sequence of dots connected by lines, which a user must draw in order to gain access to a system. This authentication technique uses new features of touchscreen technology. However, the method also has some limitations. First, these passwords have been shown to be vulnerable to disclosure based on the traces left on the screen by finger oil [11]. Second, it does not provide any protection against shoulder-surfing attacks, since the password does not contain any personal traits of the user [6]. Also, the pattern layout is fixed by the program; for example, Android’s “Pattern Lock” has a nine-dot map built into the program.
Users have no control of the positions of dots, which makes the “Pattern Lock” weak against shoulder-surfing attacks [14]. Moreover, a shape-based password does not take advantage of the latest multi-touch interfaces in smart phones and tablets with big screens, where a user can use multiple fingertips to interact with the devices.
II. RELATED WORK
User authentication decides whether or not to give a specific user access to a system, an application, or data. Therefore, robust and efficient user authentication is necessary for almost every computer system. A user can be authenticated in four different ways. The first type of user authentication is based on “something you have”, i.e., a physical token or device that the user possesses and that is authorized only to this user. The second type is based on “something you know”, which is some secret information only the user knows, such as a password. The third type is based on “something you are”, which usually refers to a
C. Biometric Password
Any biometric authentication system has an enrollment phase and a verification phase. Before the system can be used for the first time, a user is required to enroll by generating a sample of biometric features, which is stored for verification. During verification, the user goes through
moving his or her fingers, the average pressures, and the angle changes from the raw data. Using these processed features can make finding a match more efficient and accurate.
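To make the captured raw data concrete, the per-sample record described here can be modeled as a simple structure. This is a minimal Python sketch; the field names and the example values are illustrative only and do not come from the paper's implementation.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    """One raw sample captured by the touchscreen (illustrative fields)."""
    x: float          # x coordinate of the touch-point
    y: float          # y coordinate of the touch-point
    timestamp: float  # capture time, e.g., in milliseconds
    label: int        # platform-assigned pointer label for the finger
    pressure: float   # normalized pressure reported by the sensor

# A touch sequence is simply the time-ordered list of samples for one finger.
sequence = [
    TouchPoint(120.0, 340.5, 0.0, 0, 0.42),
    TouchPoint(123.1, 338.2, 8.0, 0, 0.44),
]
```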
the same process to generate a new sample of features, which is compared with the stored pattern [5]. Different types of biometric authentication are classified by the types of biometric features captured during the enrollment and verification phases. Touch ID, a feature available on iPhones starting with the iPhone 5s, is a popular user authentication method using biometric features. Apple added a fingerprint sensor to the “Home” button, and users can use their fingerprint as a passcode. With just a touch of the “Home” button, the Touch ID sensor reads the user’s fingerprint and unlocks the phone. It can also be used to authorize purchases from Apple’s App Store, iTunes Store, and iBooks Store. One obvious drawback of fingerprint authentication is that it requires an additional sensor, which may not be feasible on some low-cost devices. Another example of the use of biometric features for user authentication is Face Unlock in Android. The Face Unlock feature was originally introduced in 2011 as part of Android 4.0, also known as Ice Cream Sandwich. It can recognize a person’s face and use it to unlock the system. However, Face Unlock could be bypassed by holding a static photo up to the mobile device’s camera. Using behavioral biometric features is also a possible way to authenticate a user. For example, Vatsa et al. proposed a signature verification algorithm based on static and dynamic features in [13]. Dynamic features, such as pressure and time, are captured as part of the signature and compared to stored measurements to determine a matching score. Static features, such as the number of pixels and the length and height of the signature, are generated using the data points extracted from the tablet and are analyzed as part of the matching process. In general, biometrics shows great advantages over traditional authentication techniques, because biometric traits can hardly be forgotten, stolen, misplaced, or forged.
B. Thumb Detection and Finger Numbering
The numeric labels given to touch-points by Android are assigned, in increasing order, according to the order in which fingers first touch the screen, which means the touch generated by any finger can be assigned any label from 1 to 5. Each time the same gesture is performed, the fingers do not necessarily touch the screen in the same order, so the same finger may be assigned a different label in different attempts of the same gesture. To verify a multi-touch gesture input by comparing its features with the stored pattern, all touch-point data must be labeled and ordered in a consistent manner. To achieve this, we re-order the touch sequences to produce consistent labeling. Due to differences in the palm shapes of individuals and the large diversity of the multi-touch gestures chosen by users, the numbering process needs to adaptively and stably assign correct labels to fingers. We construct a simple polygon that connects the starting point of each touch sequence. Index 1 is assigned to the touch sequence whose starting point has the largest total distance to the other four vertices. Then we number the remaining touch sequences in clockwise order starting from the position of the touch sequence with index 1. After this re-ordering, the data are re-labeled in a consistent form, so that we can compare touch sequences from different attempts. Figure 1 illustrates the detected thumb (red circle) and the assigned finger labels.
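The re-ordering described above can be sketched as follows. This is a minimal Python illustration under stated assumptions: the thumb is taken as the starting point with the largest total distance to the other four, and "clockwise" is computed around the centroid in screen coordinates (y growing downward); the function name and the 0-based return indices are ours, not the paper's.

```python
import math

def renumber_fingers(start_points):
    """Re-order five touch sequences by their starting points.

    start_points: five (x, y) tuples in the platform's arbitrary order.
    Returns indices into start_points: position 0 is the presumed thumb
    (index 1 in the paper's numbering), the rest follow clockwise.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Thumb = vertex of the starting-point polygon with the largest
    # total distance to the other four vertices.
    totals = [sum(dist(p, q) for q in start_points) for p in start_points]
    thumb = max(range(len(start_points)), key=lambda i: totals[i])

    # Number the rest clockwise around the centroid, starting from the
    # thumb's angular position. In screen coordinates (y downward),
    # increasing atan2 angle corresponds to clockwise traversal.
    cx = sum(p[0] for p in start_points) / len(start_points)
    cy = sum(p[1] for p in start_points) / len(start_points)

    def ang(i):
        return math.atan2(start_points[i][1] - cy, start_points[i][0] - cx)

    others = [i for i in range(len(start_points)) if i != thumb]
    ref = ang(thumb)
    others.sort(key=lambda i: (ang(i) - ref) % (2 * math.pi))
    return [thumb] + others
```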
III. TOUCHSCREEN-BASED AUTHENTICATION USING BIOMETRIC FEATURES
In recognition of the high demand for user authentication and the limitations of existing approaches, we propose a touchscreen-based user authentication method using static and dynamic features for mobile devices. It combines the advantages of shape-based and biometric passwords. We take advantage of the advanced multi-touch interface and authenticate a user with a five-finger gesture. We collect the position and pressure data of the touch-points while a user performs a gesture on the touchscreen multiple times, and then extract static features, such as distance and angle, and dynamic features, such as the shape of the palm and the pressure of each fingertip. Through a similarity check, a stable pattern is selected and used as the template for later verification.
Figure 1. Thumb detection and finger numbering.
C. Distance Features
The trails of the fingertips indicate the basic shape of a gesture. The length of the fingertip trails is one of the features we use to characterize a gesture. The distance feature is the accumulation of all distances between adjacent touch-points. The distance feature of each touch sequence is calculated as follows:
A. Raw Data Acquisition
Each time a touchscreen senses that an object has touched the surface of the screen, it captures the x-y coordinates of the touched points, along with time-stamps, labels, and the corresponding pressures. However, the raw data cannot be directly used to recognize the user. First, we extract information about the tracks made on the screen by the user
distance_i = \sum_{j=1}^{n-1} \sqrt{(x_{i,j+1} - x_{i,j})^2 + (y_{i,j+1} - y_{i,j})^2}   (i = 1, 2, 3, 4, 5)   (1)

where x_{i,j} is the x coordinate of the j-th point of the sequence with index i, and y_{i,j} is the y coordinate of the j-th point of the sequence with index i.
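A direct Python rendering of Eq. (1) is shown below; it is a sketch for illustration, with a touch sequence represented as a plain list of (x, y) tuples.

```python
import math

def distance_feature(seq):
    """Accumulated trail length of one touch sequence, per Eq. (1).

    seq: list of (x, y) touch-points for one finger, in time order.
    """
    return sum(
        math.hypot(x2 - x1, y2 - y1)          # segment length
        for (x1, y1), (x2, y2) in zip(seq, seq[1:])
    )
```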
Finally, the accumulated angle is calculated using the following formula:

accl_angle_i = \sum_{j=2}^{n} |angle_{i,j} - angle_{i,j-1}|   (i = 1, 2, 3, 4, 5; n = 10)   (3)
D. Angle Features
Angle is also used to describe the shape of a gesture. The angle feature we extract is the accumulation of all angle changes between adjacent segments of the selected touch-point sequence. Beyond the shape of the gesture, the angle can also represent the movement of a gesture, which helps increase the accuracy of pattern matching. Figure 2 illustrates the process of computing angle features.
E. Pressure Features
The pressures of the fingertips are typical biometric features of a multi-touch gesture because they are related to natural characteristics of the human hand and the habits of each individual. In other words, the fingertip pressures of different users performing the same gesture tend to differ. The touchscreen device can capture the pressure of each touch-point in a touch sequence, and the pressure feature of a sequence is the average pressure of all its points. The pressure feature of each touch sequence is calculated as follows:

AvgPressure_i = \frac{1}{n} \sum_{j=1}^{n} pressure_{i,j}   (i = 1, 2, 3, 4, 5)   (4)
where pressure_{i,j} is the pressure of the j-th point of the sequence with index i.

F. Similarity Measure
As mentioned in the previous section, feature data is captured with noise and distortions. Therefore, in the enrollment stage, we require users to perform the same multi-touch gesture at least three times. The program evaluates the stability of the three attempts, and if any of the attempts is similar enough to the other two, the program stores that set of data in the database as the template. Otherwise, the user is asked to keep performing the same multi-touch gesture until one attempt is accepted. The Euclidean distance is used as the matching cost between two touch sequences when checking the similarity between two sets of distance features or two sets of pressure features. A set of inequalities, described below, is applied when comparing angle features. The selection of the template to be stored in the database is based on the similarity check of these features.

Distance similarity check: Compare the Euclidean distance between the new distance features and the stored distance features with a selected threshold.
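The distance similarity check can be sketched as below. The threshold value is a placeholder of our own choosing, since the paper does not report the thresholds it used.

```python
import math

def distance_features_match(new, stored, threshold=50.0):
    """Distance similarity check: Euclidean distance between two
    five-element distance-feature vectors, compared with a threshold.
    The default threshold is an arbitrary placeholder."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(new, stored)))
    return d <= threshold
```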
Figure 2. Angle feature extraction.
The locations of the points captured by a device have different densities in different areas of the touch sequence. It is inefficient and inaccurate to calculate angles between every pair of adjacent segments, because the number of short segments is large and the oscillation over short distances is frequent and unpredictable. Therefore, we pick points for the angle calculation based on the length of the entire trail of each touch sequence. We select the points that are closest to the cumulative distances of 7.5%, 15%, 22.5%, ... of the total distance of the sequence. After point selection, we calculate the angles of the vectors between them. Due to the different sensitivities of devices, some points may be mis-captured. Moreover, the angle is very sensitive to sensor noise and small finger oscillations. To reduce the errors caused by these effects, the angle is measured for every vector connecting two points with a 2-point interval in between; in other words, we calculate the angle of the vector between points 3 indexes away from each other. For example, angle_{1,1} denotes the angle between the points with indexes 1 and 4 of touch sequence 1, and angle_{1,2} is the angle between the points with indexes 2 and 5 of touch sequence 1, so in total each touch sequence generates ten angles between the selected points. The angle is calculated as follows:

angle_{i,j} = \arctan\left(\frac{y_{i,j+3} - y_{i,j}}{x_{i,j+3} - x_{i,j}}\right)   (i = 1, 2, 3, 4, 5; j = 1, ..., 10)   (2)
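The point-selection and angle steps can be sketched in Python as follows. Two assumptions are ours: thirteen points are selected (7.5% increments up to 97.5%, which yields exactly ten 3-apart angles), and `atan2` is used in place of the paper's arctan to avoid division by zero; the function names are illustrative.

```python
import math

def select_points(seq, n_points=13):
    """Pick the points closest to cumulative distances of 7.5%, 15%, ...
    of the total trail length (13 points give the paper's ten angles)."""
    cum = [0.0]                     # cumulative arc length at each raw point
    for (x1, y1), (x2, y2) in zip(seq, seq[1:]):
        cum.append(cum[-1] + math.hypot(x2 - x1, y2 - y1))
    total = cum[-1]
    picked = []
    for k in range(1, n_points + 1):
        target = total * 0.075 * k  # 7.5% * k of the total distance
        idx = min(range(len(seq)), key=lambda i: abs(cum[i] - target))
        picked.append(seq[idx])
    return picked

def angle_features(points):
    """Eq. (2): angle of the vector between points three indexes apart."""
    return [
        math.atan2(points[j + 3][1] - points[j][1],
                   points[j + 3][0] - points[j][0])
        for j in range(len(points) - 3)
    ]

def accumulated_angle(angles):
    """Eq. (3): accumulated absolute change between adjacent angles."""
    return sum(abs(a2 - a1) for a1, a2 in zip(angles, angles[1:]))
```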
Euclidean_{(seq1, seq2)} = \sqrt{\sum_{i=1}^{5} (distance_{1,i} - distance_{2,i})^2}   (5)

where distance_{1,i} is the computed i-th distance feature of sequence 1 and distance_{2,i} is the computed i-th distance feature of sequence 2.

Angle similarity check: Compare the new angle features with the stored angle features. If the features of more than three fingers are similar, the two sets of features are considered similar.
where x_{i,j} is the x coordinate of the j-th point of the sequence with index i, and y_{i,j} is the y coordinate of the j-th point of the sequence with index i.
accl_angle_{1,i} < accl_angle_{2,i} + (accl_angle_{2,i} \times 0.1 + e)   (6)

accl_angle_{1,i} > accl_angle_{2,i} - (accl_angle_{2,i} \times 0.1 + e)   (7)
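Inequalities (6) and (7), together with the "more than three pairs" rule explained in the text, can be sketched as follows. The value of the constant e is a placeholder, since the paper does not report it.

```python
def angles_similar(accl1, accl2, e=0.1):
    """Check inequalities (6) and (7) for one pair of accumulated angles:
    accl1 must lie within a band of +/-(accl2 * 0.1 + e) around accl2.
    The value of e is a placeholder."""
    band = accl2 * 0.1 + e
    return (accl1 < accl2 + band) and (accl1 > accl2 - band)

def angle_sets_similar(set1, set2, e=0.1):
    """Two sets of five angle features are deemed similar when more than
    three pairs (i.e., at least four) pass the per-pair check."""
    matches = sum(angles_similar(a, b, e) for a, b in zip(set1, set2))
    return matches > 3
```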
username; otherwise, the username will be accepted and the program will switch to the sign-up screen. If the user chooses sign-in, the program checks whether the username exists. If not, the program pops up a window to inform the user that the username was not found. When the input username matches a username in the database, the user is forwarded to the sign-in screen. The sign-up interface is used for user registration. This interface appears when the username does not match any username in the database or the Sign-up button is tapped. Data collection begins with the user performing a multi-touch gesture. All x-y coordinates, time-stamps, labels, and pressures of the 5 touch sequences from 5 fingers are sequentially captured by the device [15]. After obtaining the data for the touch sequences of the 5 fingers, the re-order function identifies the thumb and gives all fingers new labels. Then these re-ordered data are used to calculate the distance, the average pressure, and the angle of each touch sequence. Finally, the similarity check described previously is applied after the gesture has been performed three times, and the data that passes the check is stored in the database. The screenshot below demonstrates the re-order function. The big circle represents the sequence that is re-labeled as index 1, and the remaining sequences are re-labeled as 2, 3, 4, and 5 in clockwise order. In the application, as a visualization aid, the size of a circle represents the pressure of the touch-points. Figure 4 shows a screenshot of touch gesture input.
where accl_angle_{1,i} is the i-th angle of touch sequence 1, accl_angle_{2,i} is the i-th angle of touch sequence 2, and e is a constant used to increase the accuracy of the angle measurement. When both inequalities hold, we assume accl_angle_{1,i} and accl_angle_{2,i} are similar to each other. Considering the sensitivity of the angle calculation, it is hard to match all five angles. Based on our tests, we concluded that when more than three pairs of angles are similar, the two sets of angles are deemed similar.

Pressure similarity check: Compare the Euclidean distance between the new pressure features and the stored pressure features with a selected threshold.

Euclidean_{(seq1, seq2)} = \sqrt{\sum_{i=1}^{5} (AvgPressure_{1,i} - AvgPressure_{2,i})^2}   (8)
where AvgPressure_{1,i} is the i-th pressure feature of sequence 1 and AvgPressure_{2,i} is the i-th pressure feature of sequence 2.

IV. PROTOTYPE IMPLEMENTATION
To test our scheme for user authentication on real touchscreen devices, we implemented an Android app that realizes the aforementioned algorithm. The flowchart in Figure 3 illustrates the process of the app.
Figure 4. Screenshot of the thumb detection and re-labelling.
The sign-in screen handles the verification function. It will be activated when the button Sign-in is tapped and the input username matches a stored username in the database. The data collection process used in this stage is similar to the enrollment stage. After the user performs a gesture, the program extracts distance, angle and pressure features. Then the program runs a similarity check between the input feature data and the stored template feature data. Based on the result of the similarity check, the program decides whether to accept or reject the user.
Figure 3. The flowchart of the application implementing the proposed user authentication scheme.
The application starts with a launch screen in which the user is asked to type a username in the text field. Then the user can choose to sign-in or sign-up. The username check will be performed for both choices. When the user chooses sign-up, the program will check the database for the input username. If taken, the user will be asked to choose another
498
V. EXPERIMENTS AND DISCUSSIONS
To test the effectiveness of our scheme and the applicability of the application, we recruited 10 participants to perform multi-touch gestures using the application. We used a Nexus 7 tablet [9] as the testing device, and informed all participants about our approach and the use of the application. We demonstrated five selected gestures to all the participants and let them practice all gestures until they could pass the similarity check required for registration. Then, we used the verification stage of the application to collect the feature data while they performed different gestures.
A. Gesture Selection
A multi-touch gesture study [11] tested 22 gestures with both male and female testers ranging in age from 15 to 50. It created a gestural taxonomy based on the movement of the two major components of the hand: the palm and the fingertips. Palm movement describes whether the hand itself moves during the gesture, and falls into two classes:
• Static: the palm or hand position remains static while performing the gesture; only the fingertips move, without changing the position of the hand.
• Dynamic: the center of the hand moves while performing the gesture.
Fingertip movement can be divided into four categories:
• Parallel: all fingers move in the same direction during the gesture.
• Closed: all fingers move inward toward the center of the hand.
• Open: all fingers move outward from the center of the hand.
• Circular: all fingers rotate around the center of the hand.
Based on stability and self-reported user experience, we chose the five gestures shown in Figure 5 for our experiment:
1. Close: all five fingers move toward the center of the hand.
2. FTCCW: the thumb is fixed, and the other fingers rotate in a counter-clockwise direction.
3. FTCW: the thumb is fixed, and the other fingers rotate in a clockwise direction.
4. CW: all fingers rotate in a clockwise direction.
5. CCW: all fingers rotate in a counter-clockwise direction.
Figure 5. The five selected gestures: (a) Close, (b) FTCCW, (c) FTCW, (d) CW, (e) CCW.
For all participants, we first explained the purpose of the study and the function of the application, and then trained each participant on all five multi-touch gestures several times. Next, each participant performed all five multi-touch gestures five times each, and we captured the data of these gestures for further analysis. For testing purposes, we compared the data in three scenarios: same user with the same gesture; same user with different gestures; different users with the same gesture.

B. Same User with Same Gesture
In this section, we compare features from five attempts of the same gesture by the same user.
Figure 8. The relationship of the pressure features among different fingers with the same gesture performed by the same user.
The performance of pressure is almost as stable as that of distance, as shown in figure 8. The pressures of the fingertips represent the biometric characteristics of the user, so they should be consistent when the gestures are performed by the same user. It is clear that when the same user performs the same gesture, the distance and pressure are very similar each time. As mentioned before, the distance feature describes the shape of the gesture itself and the pressure feature represents the biometric characteristics of the user. Thus, when the same user performs the same gesture, the distance and pressure are expected to be similar, while the angle feature, which is more sensitive, is not consistent for all five fingers. For this reason, we consider the angle features to be similar when more than three of the five angle features are classified as similar during the angle similarity check.
Figure 6. The relationship of the distance features among different fingers with the same gesture performed by the same user.
The five sequences represent the gesture Close performed by the user five times. Figure 6 shows the relationship of the distances of different fingers, where the distances of the same finger are almost identical. When one user performs the same gesture, it is easy for them to reproduce almost identical distances, especially with the visual aid of the application.
C. Same User with Different Gestures
In this section, we compare features from five different gestures by the same user. The following results are collected from one participant’s attempts with all five gestures.
Figure 7. The relationship of the angle features among different fingers with the same gesture performed by the same user.
Figure 9. The relationship of the distance features among different fingers with different gestures performed by the same user.
On the other hand, as a more sensitive feature, the angle’s performance is not as stable as distance as shown in figure 7. Therefore, when the similarity check is applied to the angle, there is too much variation to require that all finger features pass the similarity test before a user can be accepted.
In figure 9, it is clear that the distances vary noticeably between different gestures. However, the distances of sequences 2 and 3 are similar, because the gestures represented by sequences 2 and 3 are the same gesture performed in clockwise and counter-clockwise directions.
When performing the same gesture, different users have their own preferences for how to perform it. We gave the participants the general idea of the gesture, but we did not force them to perform exactly the same gesture. This resulted in relatively large differences in distances even for the same gesture, as shown in figure 12. The distance may also be affected by the hand size and the screen size.
Figure 10. The relationship of the angle features among different fingers with different gestures performed by the same user.
Figure 10 shows that the angle features are different from each other, but they may also be somewhat similar in some cases. This is because the gestures FTCCW, FTCW, CW and CCW have many commonalities, especially with respect to the fingers with index 2, 3, 4 and 5, which are all rotating.
Figure 13. The relationship of the angle features among different fingers with the same gesture performed by different users.
Our analysis also shows that even with the same gesture, different hand sizes and habits of hand movement resulted in relatively large differences among the angle features as illustrated in figure 13.
Figure 11. The relationship of the pressure features among different fingers with different gestures performed by the same user.
Pressure identifies users more effectively than it distinguishes gestures. Even though the gestures are different, the pressure features of different gestures performed by the same user are similar, as shown in Figure 11. In this comparison, because different gestures are used, the features that describe the physical shape of the gesture, such as distance and angle, show large differences, while the pressure, which represents the user's biometric characteristics, remains more stable than the other two features.
Figure 14. The relationship of the pressure features among different fingers with the same gesture performed by different users.
As expected, the pressure features in Figure 14 show larger differences among users even when the gesture is the same, which indicates that pressure is effective at discriminating users.
D. Different User with Same Gesture
The following analysis compares the Close gesture performed by different participants.
E. User Authentication Results
We use True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN) counts to measure accuracy. A True Positive (TP) means the same user performing the same gesture is accepted by the program; a True Negative (TN) means a different user performing the same gesture is rejected by the program; a False Positive (FP) means a different user performing the same gesture is accepted by the program; a False Negative (FN) means the same user performing the same gesture is rejected by the program. We compared the template data with the input data using the similarity checking process explained previously. We randomly chose feature data from six of the ten participants. For each gesture, a participant's first sample is used as his/her template data, so there were 4×6 = 24 comparisons used to calculate the accuracy.
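The tallying of the four outcome counts described above can be sketched as follows; the function and parameter names are illustrative, and `accept` stands in for whatever authentication decision procedure (e.g. the similarity check) is in use.

```python
def confusion_counts(attempts, accept):
    """Tally TP, TN, FP, FN over authentication attempts.
    `attempts` is a list of (is_genuine_user, feature_sample) pairs and
    `accept` is the decision function applied to each sample."""
    tp = tn = fp = fn = 0
    for is_genuine, sample in attempts:
        if accept(sample):
            if is_genuine:
                tp += 1   # genuine user accepted
            else:
                fp += 1   # impostor accepted
        else:
            if is_genuine:
                fn += 1   # genuine user rejected
            else:
                tn += 1   # impostor rejected
    return tp, tn, fp, fn
```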
Figure 12. The relationship of the distance features among different fingers with the same gesture performed by different users.
The False Positive Rate (FPR) and False Negative Rate (FNR) of user authentication are calculated as follows:

FPR = FP / (FP + TN)    (9)
FNR = FN / (FN + TP)    (10)
VI. CONCLUSIONS
In this paper, we present a new scheme for robust user authentication. It utilizes the multi-touch screen and its pressure sensing, and combines the advantages of shape-based and biometric passwords without requiring additional devices. We extract static and dynamic features from the raw touch data, and the fusion of these features enhances the reliability and stability of the scheme over any single-feature approach. The experimental results show that the proposed authentication scheme achieves high accuracy, and the user feedback indicates that users are willing to adopt it for everyday use. We find that the proposed scheme is an important improvement over existing user authentication methods and has great potential to enhance the security of mobile computing.
TABLE I. FALSE POSITIVE RATE (FPR) AND FALSE NEGATIVE RATE (FNR) OF USER AUTHENTICATION.
Gesture    FPR       FNR
Close      4.00%     12.50%
FTCCW      4.00%     16.67%
FTCW       0.00%     29.17%
CW         8.00%     20.80%
CCW        4.00%      8.33%
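The FPR and FNR definitions above can be sketched directly in code. The specific counts used in the example are hypothetical: they are one combination that would reproduce the Close row of Table I, since the paper does not report the raw counts.

```python
def false_positive_rate(fp, tn):
    """Eq. (9): FPR = FP / (FP + TN)."""
    return fp / (fp + tn)

def false_negative_rate(fn, tp):
    """Eq. (10): FNR = FN / (FN + TP)."""
    return fn / (fn + tp)

# Hypothetical counts consistent with the Close row of Table I:
# 1 impostor accepted out of 25 impostor attempts, and 3 genuine
# rejections out of 24 genuine attempts.
print(false_positive_rate(1, 24))   # 0.04  -> 4.00%
print(false_negative_rate(3, 21))   # 0.125 -> 12.50%
```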
The accuracy for each gesture is calculated as follows:

ACC = 0.5 × TP / (TP + FN) + 0.5 × TN / (TN + FP)    (11)

REFERENCES
[1]
Catone, J., Bad Form: 61% Use Same Password for Everything, http://readwrite.com/2008/01/17/majority_use_same_password, Retrieved on 8-11-2014
[2] Conti, V., Militello, C., Sorbello, F., Vitabile, S., A Frequency-based Approach for Features Fusion in Fingerprint and Iris Multimodal Biometric Identification Systems, IEEE Transactions on Biometrics Compendium, Volume: 40, Issue: 4, pp 384 – 395, 2010
[3] Faundez-Zanuy, M., On-line signature recognition based on VQ-DTW, Pattern Recognition, Volume: 40, Issue: 3, pp 981 – 992, 2007
[4] Forman, G.H., Zahorjan, J., The challenges of mobile computing, IEEE Computer, Volume: 27, Issue: 4, pp 38 – 47, 1994
[5] Jorgensen, Z. and Yu, T., On mouse dynamics as a behavioral biometric for authentication, Proceedings of the 6th ACM Symposium on Information, Computer and Communications Security, pp 476 – 482, 2011
[6] Kim, D., Dunphy, P., Briggs, P., Hook, J., Nicholson, J., Nicholson, J., and Olivier, P., Multi-touch authentication on tabletops, Proceedings of the 28th International Conference on Human Factors in Computing Systems, pp 1093 – 1102, 2010
[7] Kreft, E., The Simple Way to Thwart an Alarming New Tool to Steal Your ATM Code, http://www.theblaze.com/stories/2014/08/29/it-looks-like-an-iphone-case-but-it-helps-steal-your-atm-pin-the-simple-way-you-can-outsmart-a-thief/, Retrieved on 9-15-2014
[8] Lancet, Y., What Are the Differences between Capacitive & Resistive Touchscreens?, http://www.makeuseof.com/tag/differences-capacitive-resistive-touchscreens-si/, Retrieved on 12-3-2014
[9] Molen, B., Nexus 7 review, http://www.engadget.com/2013/07/29/nexus-7-review-2013/, Retrieved on 11-23-2014
[10] Muslukhov, I., Boshmaf, Y., Kuo, C. and Lester, J., Know your enemy: the risk of unauthorized access in smartphones by insiders, Proceedings of the 15th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp 271 – 280, 2013
[11] Napa, S., Kowsar, A., Katherine, I. and Nasir, M., Biometric-Rich Gestures: A Novel Approach to Authentication on Multi-touch Devices, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp 977 – 986, 2012
[12] Sears, A., Plaisant, C., Shneiderman, B., A new era for high-precision touchscreens, Advances in Human-Computer Interaction, Volume: 3, pp 1 – 33, 1992
TABLE II. ACCURACY OF USER AUTHENTICATION.
Gesture    Accuracy
Close      91.75%
FTCCW      89.65%
FTCW       85.40%
CW         85.69%
CCW        93.80%
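The balanced-accuracy formula, Eq. (11), can be restated in terms of the error rates: since TP/(TP+FN) = 1 − FNR and TN/(TN+FP) = 1 − FPR, we have ACC = 0.5·(1 − FNR) + 0.5·(1 − FPR). A minimal sketch (function name is illustrative):

```python
def balanced_accuracy(fpr, fnr):
    """Eq. (11), ACC = 0.5*TP/(TP+FN) + 0.5*TN/(TN+FP), rewritten in
    terms of the error rates: ACC = 0.5*(1 - FNR) + 0.5*(1 - FPR)."""
    return 0.5 * (1.0 - fnr) + 0.5 * (1.0 - fpr)

# Close gesture (Table I): FPR = 4.00%, FNR = 12.50%
print(balanced_accuracy(0.04, 0.125))  # ≈ 0.9175 -> 91.75%, matching Table II
```

This cross-check confirms that the Close row of Table II follows from its Table I rates.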
The results in Tables I and II show that the proposed scheme achieved good accuracy in authenticating different gestures and reduced false classifications to a very low level. Therefore, the proposed scheme has great potential to provide robust protection against unauthorized access.
F. Discussions
The distance and angle features distinguish different gestures better than the pressure features do, but the pressure features show promise in distinguishing users. The combined feature set greatly enhances the robustness and accuracy of the authentication process. In addition to the statistical results, users provided comments and evaluations of the different gestures. For example, the Close gesture received the highest rating because it is easy to perform; on the other hand, gestures such as the fixed-thumb counter-clockwise gesture are harder to perform for users with large hands. The users showed high interest in this new authentication method and expected to see it used in more mobile apps. Their comments also showed that they are more willing to use biometric authentication than text-based or shape-based passwords, because it reduces typing and enhances the security of mobile devices.
[13] Vatsa, M., Singh, R., Mitra, P. and Noore, A., Signature Verification Using Static and Dynamic Features, Proceedings of the 11th International Conference on Neural Information Processing, pp 350 – 355, 2004
[14] Wiedenbeck, S., Waters, J., Birget, J.-C., Brodskiy, A. and Memon, N., PassPoints: Design and Longitudinal Evaluation of a Graphical Password System, International Journal of Human-Computer Studies, Volume: 63, Issue: 1, pp 102 – 127, 2005
[15] Android APIs, API level 21, http://developer.android.com/reference/android/view/MotionEvent.html, Retrieved on 9-25-2014