Touch screen mobile application as part of testing and verification system

D. Živkov, I. Kaštelan, E. Neborovski, G. Miljković and M. Katona
RT-RK LLC, Institute for Computer Based Systems, Novi Sad, Serbia
[email protected]

Abstract - This paper presents an approach to building a testing environment for devices based on touch screens. The focus is on testing Android-based devices, specifically the software required to perform such testing. An Android application has been developed that detects touches on the screen, packs them into formatted messages and sends them to a central point where they are interpreted. In addition, the paper proposes a model of a complete testing environment that would provide a fast, inexpensive and automated solution for the verification of touch-screen devices. The basis for this paper is previous research on stimulation boards, which provide a fast and controlled way of simulating touch events based on the electrical properties of capacitive touch screens. The implemented environment has been tested using a commercial Android mobile phone to confirm software functionality and to analyze the behavior of the stimulation boards.

I. INTRODUCTION

Over the last two decades the portability of devices has improved significantly. Mobile phones have been a prime testing platform for these improvements. Greater independence, thanks to better battery technology and efficient power consumption, has made it possible for people to rely more on portable devices. Portable devices have consequently become more powerful pieces of hardware, capable of running complex software (operating systems and a wide choice of applications); in other words, they have become "smart". These days we are experiencing an enormous expansion of smart devices (smart phones, PDAs (Personal Digital Assistants), tablets, e-book readers, media players, etc.). Besides the new features provided by software components, one feature that is becoming unavoidable is how these devices interact with the user. Old-fashioned button-oriented device control is being replaced by touch screens. Market research shows that the share of mobile phones with touch-screen capabilities grew from 3% in 2008 to 23% in 2010, with predictions that by 2014 half of all mobile phones will be based on touch-screen technology [1]. Even studies performed on early touch-screen devices showed that users work faster, and with acceptable precision, using a touch-screen interface compared to a mouse [2]. With more and more devices relying on touch screens to interact with the user, new models for the GUI (Graphical User Interface) of applications need to be investigated. Many studies have analyzed the best ways to design a UI to make applications more attractive; one such study, by Balagtas-Fernandez et al., is offered as an example [3].

This work was supported in part by the Ministry of Education and Science of the Republic of Serbia under project No. 32030.

MIPRO 2012/CTS

As the number of produced devices rapidly increases, the demand for faster production of equally (or even more) reliable products is also rising. The main foundation of a solid product, besides a good production process, is a quality verification system. Verification systems fall into two major groups: manual and automated. Dustin, in his book [4], points out that a testing process should not rely completely on automated testing; although manual testing is labor-intensive and error-prone, it should not be avoided entirely. He presents a case study comparing automated testing to manual testing, concluding that automation brings significant improvements in speed [4]. Another example of automated testing is given in [5], demonstrating how the speed of verification increased by a factor of 5 compared to the manual testing previously performed. This project has been mainly driven by previous work in the field of black-box testing and automated verification systems for TV. The proposed system is based on a solution already described in [6], whose purpose is the verification of television sets. The goal of the presented system is to eventually become an integrated part of the complex testing environment presented in [5] and [7]. In today's production environments, the testing and verification of touch screens is based on mechanical stimulation, making these systems large and limited in speed [8]. Two touch-screen technologies are widely used in modern devices: capacitive and resistive. Capacitive displays are integrated in approximately 50% of produced touch-screen devices. This paper builds on the research done in [8], which demonstrates an alternative method of stimulating touch-screen displays using electrical rather than mechanical stimulation. Since it is based on electrical stimulation, the solution is limited to capacitive-based devices.
The main goal of this paper is to present a component of the proposed automated testing system whose purpose is to verify touch-screen responsiveness to stimulation. We point out the main advantages, but also the limitations, of the presented environment. The following section presents the system components developed in earlier studies and how they interact with the newly developed modules. Section III describes in detail the Android data collection application, the targeted PC application and the communication between them. Section IV presents some measurement results obtained on the system. In Section V we describe the model of the final testing environment that will be based on the results of this paper and previous ones.


II. SYSTEM OVERVIEW

The system implemented for this paper consists of four components:
• Touch source (stimulation board or real finger)
• Android phone with a touch screen
• Android application for touch detection
• PC server for data acquisition

The following subsections describe the system components in detail. Figure 1 shows a diagram of the system environment. The first goal of this system is to confirm the functionality of the stimulation board using commercial devices that have already been tested and verified (smart phones and tablets). Another purpose of this work is to become the basis for a more complex verification environment for touch-screen devices.

A. Stimulation board
Reference [8] describes in detail how the stimulation board is implemented and the physical effects it is based on. In short, it is based on introducing a grounded conductor into the environment of a capacitive touch-screen sensor. Two methods are proposed (one using a MOSFET and the other using orthogonal conductive lines), and the benefits and flaws of each approach are described in [8]. Whichever approach is used, it would be controlled by a Field Programmable Gate Array (FPGA), and the same API would be used to control the stimulation board.

B. Android application
To detect touch-screen events, a software component had to be implemented on the tested device. The approach taken relies on the system APIs already available on the device. The reason is that we wanted to detect only events that are actually reported by the system: actual touch-screen events are filtered and processed by the device vendor's driver and the general system driver, and only then presented as system-recognized touch events. An Android-based device was selected for two important reasons: first, according to research [9], at the end of 2011 43.7% of smart-phone devices were running the Android operating system; second, a good development environment exists for Android applications. The task of the implemented Android application is to collect the touch events detected by the system and report them to the server. In the current implementation, communication between the phone and the acquisition server is performed over a TCP network, using a set of predefined messages. The communication between the phone and the server is described in detail in the next section. The application supports multi-touch events; the Android system can detect up to 4 simultaneous touch events. Besides touch coordinates, some Android devices can provide information about the amount of pressure applied by the user (this value is calculated from the area covered by the finger, not by actual pressure measurement), so this data is also reported to the server. The Android application was developed in the Java programming language, using the Eclipse IDE (Integrated Development Environment) with the Android SDK (Software Development Kit). The targeted Android version is 2.2.1 and higher; this version was chosen because it is the first version optimized for tablets.

C. Data acquisition server
This software component runs on a Windows-based PC. The current implementation is a simple data acquisition and logging server: it accepts a connection from the device, collects messages from the phone and logs them to a file. This version is very basic, covering only the essential functionality needed to verify the communication workflow between the server and the phone. Further development is in progress, and the server's functionality is growing in complexity. The idea is that, besides data acquisition and logging, the server will also display events in a graphical environment more acceptable to users. The server code was developed in C++ using Microsoft Visual Studio 2008.

Figure 1. System components

III. IMPLEMENTATION

A. Software modules
Two applications were implemented for this paper: a collection application (mobile client) and an acquisition application (server). Significant attention was focused on the development of the Android application, with the imperative of efficiently processing the touch events coming from the system. The Android application is split into two independent modules:
• MultiTouchSensor – collects touch events from the system
• CoordianteSender – delivers the collected data to the server application

Since network-related tasks (connection establishment, sending, etc.) are not very time-efficient, touch detection and sending had to be separated into these two modules. A shared double-ended queue provides communication between them. The MultiTouchSensor module is optimized to process received events as fast as possible and pass them to the queue, from where they are received by the CoordianteSender, packed into protocol messages and sent to the acquisition server. The communication sequence between the modules is given in Figure 2. The Android operating system reports three different types of touch events:




• ACTION_DOWN – when the user presses the screen
• ACTION_UP – when the user lifts the finger from the screen
• ACTION_MOVE – when the user moves the finger on the screen
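The two-module split of the Android client (MultiTouchSensor producing events, CoordianteSender consuming them through a shared queue) can be sketched with a standard Java blocking queue. This is a minimal illustration under assumed names, not the paper's actual code: the `TouchPipeline` and `TouchEvent` classes are hypothetical, and a plain record stands in for Android's MotionEvent.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the producer-consumer split between the two Android modules.
// TouchEvent is a hypothetical stand-in for data taken from MotionEvent.
class TouchPipeline {
    // Event type constants mirroring Android's action codes.
    static final int DOWN = 0, UP = 1, MOVE = 2;

    static class TouchEvent {
        final int action;              // DOWN, UP or MOVE
        final float x, y, pressure;    // pixel coordinates and pressure
        TouchEvent(int action, float x, float y, float pressure) {
            this.action = action; this.x = x; this.y = y; this.pressure = pressure;
        }
    }

    // Shared queue decoupling fast event detection from slow network I/O.
    private final BlockingQueue<TouchEvent> queue = new ArrayBlockingQueue<>(256);

    // Called by the detection module (producer); must return quickly,
    // so it only enqueues and never blocks on the network.
    boolean onTouch(TouchEvent e) {
        return queue.offer(e); // drops the event if the queue is full
    }

    // Called by the sender module (consumer) on its own thread;
    // non-blocking variant for polling in a loop.
    TouchEvent poll() {
        return queue.poll();
    }
}
```

Decoupling the modules this way keeps the detection callback fast: it only enqueues, while a dedicated sender thread absorbs the TCP latency.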

These events are converted into protocol messages and sent to the server side for analysis. The Android system has a specific behavior when an ACTION_MOVE event occurs: it continues reporting the coordinates of the touch even when there is no movement. The reported coordinates are given in pixels, according to the device screen resolution. To prevent overloading the communication with the server, these events are filtered so that only move events with a change in position coordinates are sent. The event-filtering algorithm is shown in Figure 3.

B. Communication protocol
The application and the server exchange a set of defined messages that provide enough information for the server to track events on the device. Three types of messages have been defined: for touch start, for touch end and for movement. The message for the beginning of a touch carries information about the touch index (ordinal number), the touch coordinates and the pressure. The keyword TBegin identifies these messages. The expression below demonstrates the format of this message:

TBegin 2 179.23753 196.91711 0.21176472

The message describing the end of a touch event is defined by the keyword TEnd. This message has two parameters: the index of the ended touch and the number of remaining touches. An example of the message where the second finger is raised from the screen and one more finger is left touching the screen is given below:

TEnd 2 1

The most complex message type is the one describing movements. This message is sent whenever a movement of fingers is performed on the screen. The data in these messages is, in order: the number of active touches, the x coordinate of the first touch, the y coordinate of the first touch, the pressure of the first touch, the x coordinate of the second touch, the y coordinate of the second touch, the pressure of the second touch, and so on. As mentioned before, the maximal number of simultaneous touches on Android devices is 4. The movement message does not actually state which finger moved; determining this has been handed over to the server side, to minimize the logic implemented on the device and the processing on the sender side. An example of a movement message when 3 fingers are on the screen:

TMove 3 349.56012 482.9275 0.21176472 175.48387 271.5285 0.28235295 106.04105 577.43524 0.18823531

All these messages are sent as arrays of characters over a standard TCP (Transmission Control Protocol) connection between the server and the Android application.

Figure 2. Software module interaction

Figure 3. Flowchart of the event filter
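The sender-side formatting of these messages, together with the duplicate-coordinate filter applied to move events, can be sketched as follows. This is an illustrative single-touch sketch under assumed names (the `TouchProtocol` class and its methods are not from the paper's code); the real application tracks up to 4 touches and filters coordinates per finger.

```java
import java.util.Locale;

// Sketch of the sender-side protocol formatting described above.
// Class and method names are illustrative assumptions.
class TouchProtocol {
    // Last reported position, used to filter redundant move events.
    private float lastX = Float.NaN, lastY = Float.NaN;

    // "TBegin <index> <x> <y> <pressure>"
    String begin(int index, float x, float y, float pressure) {
        lastX = x; lastY = y;
        return String.format(Locale.US, "TBegin %d %s %s %s", index, x, y, pressure);
    }

    // "TEnd <index> <remaining touches>"
    String end(int index, int remaining) {
        return String.format(Locale.US, "TEnd %d %d", index, remaining);
    }

    // "TMove <count> <x1> <y1> <p1> ..." - returns null when the tracked
    // touch has not moved, mirroring the filter of Figure 3.
    String move(float x, float y, float pressure) {
        if (x == lastX && y == lastY) return null; // filtered out
        lastX = x; lastY = y;
        return String.format(Locale.US, "TMove 1 %s %s %s", x, y, pressure);
    }
}
```

A null return from `move` corresponds to the "no change in coordinates" branch of the flowchart: the event is simply dropped instead of being sent over TCP.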


IV. PERFORMANCE ANALYSIS

In this section we present some performance results measured on the current implementation. An HTC Desire S mobile phone connected to a wireless network was used for measuring, together with a PC running the server, also connected to the wireless network. The purpose of the test was to observe how fast the server can receive and process events; we also acquired the ratio of events detected by the device to events received by the server. Average data for different numbers of simultaneous touches is given in TABLE I. Besides the simple tests with multiple touches on the screen, one special test was performed with 4 simultaneous touches active and the previously mentioned event-filtering algorithm removed, to simulate maximal activity in the system. From the measured data we can see that with this communication environment we are not able to operate in real time. This is not a blocking issue, since the analysis of the data does not need to be performed in real time; what is important is that the order of the reported messages is kept, providing a correct history of event occurrences. The major reason for the communication delay is the use of the TCP protocol and a wireless network, which is not always the best solution for real-time communication.

TABLE I. AVERAGE NUMBER OF EVENTS PRODUCED BY DEVICE AND RECEIVED BY SERVER IN 30 SECONDS

                        Events generated    Events received
                        by device           by server
1 touch                 660                 278
2 touches               1047                540
3 touches               1759                1208
4 touches (no filter)   3967                2674
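As a rough reading of TABLE I, the fraction of generated events that actually reached the server and the generation rate per second can be computed directly from the table's averages; the class below is purely illustrative and not part of the paper's software.

```java
// Quick arithmetic over TABLE I: the fraction of device-generated
// events that reached the server, and the generation rate per second.
class TableOneStats {
    static double receivedRatio(int received, int generated) {
        return (double) received / generated;
    }
    static double eventsPerSecond(int events, int seconds) {
        return (double) events / seconds;
    }
}
```

For a single touch, only about 42% of the generated events (278 of 660) reached the server within the 30-second window, consistent with the observation that the TCP/wireless path cannot keep up in real time.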

Testing with the actual stimulation board has been performed, but the board did not provide stable enough behavior to allow a more detailed analysis. The board was able to stimulate the screen successfully, but there were also some false detections of touch events when there was no activity from the board. Since this research is still ongoing, we can expect more detailed results; more information about the board's performance can be found in [8].

V. FINAL VERIFICATION SYSTEM MODEL

This section describes the model of the verification system that we propose. The system relies on the work presented in this paper and previous ones. The main idea is to create a closed system that will, on one side, control the stimulation board and create activity on the touch screen, and on the other side collect the data reported by the system on the tested device. After the data is collected, it can be analyzed against the performed input, and conclusions regarding the functionality of the device and its touch screen can be made. The proposed system can easily be integrated into a serial-production environment to provide fast verification of products. Figure 4 shows the model of the proposed system and its components.

Figure 4. Proposed system model

VI. CONCLUSION

In conclusion, the components developed within the scope of this paper have been explained and their functionality confirmed. Some performance analyses have been executed to demonstrate how the system performs. From this data we concluded that the TCP protocol and a wireless network are not the best solution for communication between the device and the data acquisition server. The work in this paper provided the essential information needed to confirm the usability of the proposed system, and a good foundation for the continuation of the work. Future development will mainly depend on the results of the stimulation board research; if the results prove acceptable, further work will focus on implementing the missing components of the proposed system, mainly the PC server application. Drivers and an API for the stimulation board will be implemented, and an appropriate GUI will be developed to enable the creation of test scenarios and the analysis of test results. As an improvement in communication, a USB connection between the device and the server is being considered as an alternative that could provide efficient data transfer.

REFERENCES
[1] Duke Lee, "The State of the Touch-Screen Panel Market in 2011", Information Display Magazine, Society for Information Display, Vol. 14, No. 3, pp. 12-16, March 2011.
[2] A. Sears and B. Shneiderman, "High precision touchscreens: design strategies and comparisons with a mouse", International Journal of Man-Machine Studies, Vol. 34, No. 4, pp. 593-613, April 1991.
[3] F. Balagtas-Fernandez, J. Forrai, and H. Hussmann, "Evaluation of user interface design and input methods for applications on mobile touch screen devices", in Proceedings of the 12th IFIP TC13 International Conference on Human-Computer Interaction (INTERACT), 2009.
[4] E. Dustin, J. Rashka, and J. Paul, Automated Software Testing. Reading, Massachusetts: Addison Wesley, 1999.
[5] M. Katona, I. Kastelan, V. Pekovic, N. Teslic, and T. Tekcan, "Automatic black box testing of television systems on the final production line", IEEE Transactions on Consumer Electronics, Vol. 57, No. 1, pp. 224-231, 2011.
[6] D. Marijan, N. Teslic, V. Pekovic, and T. Tekcan, "An Approach to Achieving the Reliability in TV Embedded System", Fourth International Conference on Secure Software Integration and Reliability Improvement Companion, pp. 13-17, 2010.
[7] V. Pekovic, N. Teslic, I. Resetar, and T. Tekcan, "Test management and test execution system for automated verification of digital television systems", 2010 IEEE 14th International Symposium on Consumer Electronics (ISCE), pp. 1-6, 2010.
[8] I. Kastelan, N. Bednar, M. Katona, and D. Zivkov, "Touch-Screen Stimulation for Automated Verification of Touchscreen-based Devices", in press, 19th Annual IEEE International Conference and Workshops on the ECBS, 2012.
[9] "2012 U.S. Digital Future in Focus", comScore, 2011, http://www.comscore.com/