2015 IEEE Symposium on Service-Oriented System Engineering

Compatibility Testing Service for Mobile Applications

Tao Zhang
School of Software and Microelectronics, Northwest Polytechnical University, Xi'an, China
[email protected]

Jerry Gao
Department of Computer Engineering, San Jose State University, San Jose, USA
[email protected]

Jing Cheng
School of Software and Microelectronics, Northwest Polytechnical University, Xi'an, China
[email protected]

Tadahiro Uehara
Software Innovation Laboratories, Fujitsu Laboratories Ltd., Japan
[email protected]

Abstract—As more and more mobile applications are developed, mobile app testing and quality assurance have become very important. Due to the diversity of mobile devices and platforms, compatibility testing for mobile apps has been identified as an urgent and challenging issue. Two major factors contribute to this issue: a) the large number of mobile devices with diverse features and platforms, which are upgraded frequently; and b) the high cost and complexity of mobile app compatibility testing. This paper proposes an optimized compatibility testing strategy that uses a statistical approach to reduce test costs and improve engineers' operating efficiency. The paper provides a solution that generates an optimized compatibility test sequence for mobile apps using the K-Means statistical algorithm, and proposes a compatibility testing service for mobile apps. Moreover, two case study results are reported to demonstrate its potential application and effectiveness.

Keywords—software testing; mobile testing; compatibility testing; clustering algorithm; test coverage

I. INTRODUCTION

In recent years, more and more diverse mobile applications (mobile apps) have been developed to support different applications in social, news, tourism, health, business, and other domains [1]. According to a study by Juniper Research¹, the mobile application marketplace will reach $25 billion by 2015, and 160 billion mobile apps will be downloaded on mobile devices in 2017. In order to assure the quality of mobile apps, cost-effective testing of mobile apps has become a very hot topic. Mobile compatibility testing has been identified as an urgent and challenging issue, because each mobile app is expected to be downloaded and used on diverse mobile devices with different platforms, appliances, and features. Native apps whose features depend on specific platforms and appliances incur higher test costs and operational complexity. Hence, how to conduct cost-effective mobile app testing and compatibility testing becomes a real challenge.

There have been a number of published research papers addressing mobile app testing. However, most papers focus primarily on the following areas:
• Mobile testing concepts, issues, and challenges [2][3]
• Mobile testing modeling and analysis [4][5]
• White-box and black-box testing for mobile apps [6][7]
• Mobile GUI and automated testing [8][9]
• Mobile usability testing [10][11]
• Mobile TaaS (Mobile Testing as a Service) [12][13]

However, mobile compatibility testing research is still in its early stages. There is a lack of step-by-step testing methods and powerful testing tools that help engineers address compatibility testing for mobile apps across multiple mobile devices, platforms, and mobile environments. One typical challenge is how to find an optimal test sequence for diverse mobile devices with different platforms and appliances. This paper focuses on this topic and on cost-effective mobile compatibility testing issues. Unlike existing research work, this paper provides a statistical, step-by-step approach for engineers to obtain an optimal test sequence for diverse mobile test devices, platforms, and environments. The major contributions of this paper cover four areas. First, a tree model (known as a mobile feature tree) is used to support compatibility test modeling and analysis. Second, a statistical method is applied to cluster mobile devices with similar configurations and appliances to avoid redundant testing. Third, an optimization strategy is presented for compatibility testing that ranks mobile devices by their market shares, so that the most popular mobile devices are tested first. Fourth, a compatibility testing service is presented.

The rest of the paper is structured as follows. Section II discusses basic concepts of mobile compatibility testing. Section III proposes a mobile compatibility testing service. Section IV presents a mobile compatibility testing method. Section V presents the results of an experimental evaluation of the effectiveness of the proposed approach by comparing it with the random selection of devices. Section VI reviews and discusses the published papers and related research work on this subject. Finally, the conclusion and future research directions are given in Section VII.

¹ http://www.marketingcharts.com/wp/direct/loud-based-Mobile market-to-grow-88-12043/


II. UNDERSTANDING MOBILE COMPATIBILITY TESTING

A. What is Mobile Compatibility Testing?
Compatibility testing is one of the system testing activities for mobile apps, which aims to validate the dependencies between the under-test mobile app and its different running environments. Based on our recent literature survey, there is a lack of published papers on mobile app compatibility testing, and no detailed definition of mobile compatibility testing. Here we define it as follows.

"Mobile compatibility testing refers to validating whether the under-test mobile app can work well on different mobile devices with various platforms, appliances, and features, and in different environments."

Mobile compatibility testing focuses on three main compatibility problems: platform compatibility, device feature compatibility, and native API compatibility.

Platform compatibility refers to validating whether the under-test mobile app can work correctly on different mobile platforms with multiple versions, such as iOS, Android, and Windows Mobile. Due to differences in APIs, scheduling, and operating mechanisms across mobile platforms, some faults appear only when the under-test mobile app runs on particular platforms. In particular, the open-source Android platform may be customized and modified by device manufacturers, resulting in serious Android fragmentation problems.

Device feature compatibility refers to validating whether the under-test mobile app is compatible with different hardware features of mobile devices. Mobile devices have many hardware components, including the CPU, RAM, screen, network connection, camera, and GPS. Each component has different feature values; for example, there are various screen sizes, such as 5.5 inches, 5 inches, 4.5 inches, and 4 inches. These different hardware features may lead to compatibility problems in mobile apps.

Native API compatibility means that mobile apps should be compatible with different versions of native API programs. These API programs may be special hardware drivers or interfaces of other software development kits.

B. Why is Mobile Compatibility Testing Important?
Although the motivation for mobile compatibility testing has been clearly identified by Jerry Gao [3], there are no published papers addressing the importance of mobile compatibility testing. Here, we list its primary reasons and importance.

• Intense competition in the mobile app market - Though there are millions of mobile apps in various markets, most mobile users use only about nine apps every day. Users often abandon mobile apps with serious compatibility faults and seek alternatives.
• Higher costs of mobile compatibility testing - Due to the thousands of kinds of mobile devices, different mobile platforms, and various native APIs, engineers have to spend a lot of time and effort on mobile compatibility testing.
• Frequent changes and upgrades of mobile devices and platforms - Mobile compatibility testing often must be redone on new mobile devices, or on new versions of mobile platforms and native APIs.

C. Mobile Compatibility Testing Scope and Focus
Mobile compatibility testing focuses on testing and validating the functional features of mobile apps that depend on mobile environments. As shown in Figure 1, the scope of mobile compatibility testing includes the following six types of testing activities and focuses.

Figure 1. Mobile Compatibility Test Scope and Focus

• Function and Behavior Compatibility - The functions and behaviors that may cause compatibility faults in particular mobile environments must be identified.
• Compatibility Environment Configuration - This allows users to configure different deployment environments for mobile compatibility testing.
• Mobile Compatibility Model and Coverage - This refers to the activities for analyzing mobile compatibility factors and device features.
• Cloud-based Compatibility Testing - Cloud computing can simulate various compatibility environments and support automated mobile testing.
• Mobile Compatibility Testing Approach - This defines mobile compatibility testing processes, test tasks, and test participants.
• Mobile Compatibility Regression Testing - Mobile apps, devices, and environments are changed and upgraded frequently, so regression testing may be continuous for mobile apps.

III. MOBILE COMPATIBILITY TESTING SERVICES

Mobile testing-as-a-service (MTaaS) has been identified as a promising testing approach for mobile apps [19]. A mobile compatibility testing service offers four major advantages: a) mobile compatibility testing anytime and anywhere; b) cost reduction by sharing mobile compatibility test devices and resources; c) on-demand mobile compatibility testing services; and d) elastic mobile compatibility testing automation. A mobile compatibility testing service is therefore a better solution to meet the urgent demands for compatibility testing of upcoming mobile applications and mobile SaaS systems.

A. Infrastructure of Mobile Compatibility Testing
This section presents the proposed infrastructure for mobile compatibility testing, as shown in Figure 2. A test engineer starts a mobile compatibility test from a test terminal. The test requests are sent to the compatibility test server. The test server, test hubs, and test agents work together to perform the test tasks.

Figure 2. The Infrastructure of Mobile Compatibility Testing Service

The mobile compatibility test server supports compatibility testing services for the under-test mobile app, covering test project and process management, test task scheduling, test automation tracking and control, and billing and reporting.

The mobile test hubs run on remote computers connected to mobile devices or emulators. The test hubs can detect, control, monitor, and manage the connected mobile test devices or emulators. The test hubs include iOS test hubs, Android test hubs, and emulator test hubs.

The mobile test agents (software agents) run on the mobile devices or mobile emulators. A mobile test agent receives test commands from the connected test hub, performs the commands, and collects test track data. The test agents include iOS test agents, Android test agents, and emulator test agents.

B. Mobile Compatibility Test Server
The test server provides the compatibility testing services. As shown in Figure 3, the architecture of the test server has five layers: 1) test tenant; 2) test service; 3) compatibility testing; 4) communication; and 5) test database (a minimal sketch of these layers follows Figure 3).

• Test tenant - This layer supports tenants in interacting with the testing services. Tenants can use a web browser, or employ a programmatic approach, to interact with the testing service platform.
• Test service - This layer receives test tasks from test tenants, then schedules, controls, and monitors them. It consists of six core service components: 1) test task manager; 2) test task scheduler; 3) test task controller; 4) test task monitor; 5) test reporter; and 6) test billing.
• Compatibility testing - This layer supports mobile compatibility testing and consists of five core components: 1) test hub manager; 2) test device manager; 3) test script manager; 4) mobile compatibility testing model; and 5) test coverage analyzer.
• Communication - This layer provides the communication connections between the test server and the test hubs. In this layer, test tasks are encapsulated and sent to the test hubs, and test results and status logs are received from the test hubs.
• Test database - This layer manages all compatibility test data. We have built the task database, script database, track database, and resource database.

Figure 3. The Architecture of Mobile Compatibility Test Server
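As an illustration only (not the authors' implementation), the layering can be pictured as a thin chain of components: the test service layer accepts a task from a tenant, the compatibility testing layer selects hubs for the task, and the communication layer forwards it. All class, method, and field names below are hypothetical.

    class CommunicationLayer:
        """Encapsulates test tasks and exchanges them with the test hubs."""
        def send_task(self, hub_id, task):
            print(f"sending task {task['id']} to hub {hub_id}")  # placeholder transport

    class CompatibilityTestingLayer:
        """Picks the hubs (and, in a full system, scripts and models) for a task."""
        def __init__(self, comm):
            self.comm = comm
        def dispatch(self, task):
            for hub_id in task.get("hubs", []):
                self.comm.send_task(hub_id, task)

    class TestServiceLayer:
        """Receives test tasks from tenants, then schedules and monitors them."""
        def __init__(self, testing_layer):
            self.testing_layer = testing_layer
            self.tasks = []
        def submit(self, tenant, app, hubs):
            task = {"id": len(self.tasks) + 1, "tenant": tenant, "app": app, "hubs": hubs}
            self.tasks.append(task)
            self.testing_layer.dispatch(task)
            return task["id"]

    # A tenant submits a compatibility test task through the layered server.
    server = TestServiceLayer(CompatibilityTestingLayer(CommunicationLayer()))
    server.submit(tenant="tenant-1", app="demo.apk", hubs=["android-hub-1"])

The point of the sketch is only the separation of concerns between layers; scheduling, monitoring, billing, and the test database are omitted.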

C. Mobile Compatibility Test Hubs
The mobile compatibility test hubs run on remote computers, which can detect, connect to, and control mobile test devices or emulators. The architecture of a test hub is shown in Figure 4.

Figure 4. The Architecture of Mobile Compatibility Test Hub

• Mobile test automation manager - This component accepts test tasks from the test server and then executes them automatically.
• Mobile test script manager - This component manages the test scripts on the mobile test hub.
• Mobile test result analyzer - This component analyzes test track data from the mobile test agents and creates test result reports.
• Mobile test device controller - This component detects, controls, and manages the connected mobile test devices.
• Mobile test server connector - This component provides the communication connection with the test server.
• Mobile test device connector - This component detects mobile devices and then connects to them via Wi-Fi, USB, or Bluetooth.

D. Mobile Compatibility Test Agents
The test agents run on the mobile devices or emulators to execute test commands. Figure 5 shows the architecture of a test agent, which consists of five core components: the mobile test command executor, the mobile test script interpreter, the mobile test tracker, the mobile test environment configurator, and the mobile test device connector.

Figure 5. The Architecture of Mobile Compatibility Test Agent

• Mobile test command executor - This component executes test commands from the test hub. The main test commands include installing the under-test app, configuring the test environment, executing test scripts, and reporting test tracks.
• Mobile test script interpreter - The interpreter translates test scripts into test instructions, which are executed on the mobile test devices.
• Mobile test tracker - The test tracker collects test track data, which are sent to the connected test hub.
• Mobile test environment configurator - This component sets test environment parameters, such as changing the mobile connection from 2G to 4G.
• Mobile test device connector - This component provides the communication connection with the test hubs.

E. The Work Flow of Mobile Compatibility Testing Service
The work flow of the mobile compatibility testing service is displayed in Figure 6. The work flow mainly includes eight steps (a minimal orchestration sketch follows the list):

Figure 6. The Work Flow of Mobile Compatibility Testing Service

1) A test tenant sends a compatibility test request to the test server;
2) The test server creates test tasks and builds the compatibility testing model;
3) The test server sends the test tasks to the selected test hubs;
4) The test hubs control the connected mobile test devices to execute the test commands;
5) The test agents collect test track data and send it back to the test hubs;
6) The test hubs analyze the test results;
7) The test hubs send the test results to the test server;
8) The test server returns the test report to the tenant.
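Read as code, the eight steps form a simple request/response loop between the server, the hubs, and the agents. The sketch below is illustrative Python only, not the authors' protocol; the server and hub objects and their methods (create_tasks, execute, analyze, collect) are hypothetical placeholders for the components described above.

    def run_compatibility_test(server, hubs, request):
        tasks = server.create_tasks(request)          # steps 1-2: request in, tasks + model built
        results = []
        for task in tasks:
            hub = hubs[task["hub_id"]]
            track_data = hub.execute(task)            # steps 3-5: hub drives devices, agents track
            results.append(hub.analyze(track_data))   # step 6: hub-side result analysis
        report = server.collect(results)              # step 7: hubs report back to the server
        return report                                 # step 8: report returned to the tenant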

IV. TEST METHOD FOR MOBILE COMPATIBILITY

In this section, the K-Means algorithm is used to cluster mobile devices with similar compatibility features, and then all device clusters are ranked by market share. Using the proposed method, the selected mobile devices not only cover most of the compatibility features, but also represent the most popular user environments.

A. Feature Tree Model for Mobile Compatibility Testing
Different mobile apps have different functions, run in different environments, and have different compatibility faults. Therefore, every mobile app needs to be tested in different environments with various compatibility features. We use a feature tree to model compatibility features, where every leaf node of the feature tree represents one basic compatibility feature. One example is shown in Figure 7.

Figure 7. Mobile Application Compatibility Testing Feature Tree Model

A feature tree model CFtr can be formally defined as a 2-tuple CFtr = (N, E), where
• N is the set of feature tree nodes, including the leaf nodes, the intermediate nodes (or parent nodes), and the root node. Each leaf node represents a mobile compatibility feature, such as the camera, screen resolution, or network connections.
• E is the set of connections between nodes. Each connection includes a parent node and a child node, and describes the affiliation relationship between the parent and the child.
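To make the tree model concrete, the following sketch shows one possible in-memory representation of a feature tree. It is an illustrative Python structure, not the authors' implementation; the class name, fields, and the example values are all assumptions.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class FeatureNode:
        """A node of the compatibility feature tree CFtr = (N, E)."""
        name: str                       # e.g. "Screen Size"
        weight: float = 1.0             # weight factor w_i relative to its siblings
        values: Optional[list] = None   # leaf only: candidate feature values A_i
        numerical: bool = False         # leaf only: numerical vs. categorical feature
        children: List["FeatureNode"] = field(default_factory=list)  # edges E to child nodes

        def is_leaf(self) -> bool:
            return not self.children

    # A small fragment of a feature tree in the spirit of Figure 7 (values are illustrative).
    device_tree = FeatureNode("Device", children=[
        FeatureNode("Screen", weight=0.5, children=[
            FeatureNode("Screen Size", weight=0.6, values=[3.5, 4.5, 5.0, 5.5], numerical=True),
            FeatureNode("Screen Resolution", weight=0.4,
                        values=["1280x768", "1366x768", "1440x900"]),
        ]),
        FeatureNode("Network Link", weight=0.5, values=["2G", "3G", "4G", "Wi-Fi"]),
    ])

Note that, as required by Formula (5) below, the weights of sibling nodes in the example sum to 1.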



B. Clustering Mobile Test Devices with Similar Features
To cluster mobile test devices using the K-Means algorithm, there are two key questions: 1) how to set the initial K and the cluster centers; and 2) how to calculate the distance between a mobile device and a cluster center.

1) Setting the initial K and device cluster centers
The first step of the K-Means algorithm is to set the initial K and the device cluster centers. The initial K is the number of initial device clusters. The initial K and the device cluster centers are computed according to the feature tree model of the under-test mobile app.

In the feature tree model, every leaf node has a set of different feature values, treated as a test data set Ai. For example, the node "Screen Size" has four values: 3.5 inch, 4.5 inch, 5.0 inch, and 5.5 inch. The test data set for the leaf node "Screen Size" is therefore:

Ai = {3.5, 4.5, 5.0, 5.5}    (1)

Thus, the initial device cluster center set B can be calculated as the Cartesian product of all leaf nodes' test data sets:

B = A_1 × A_2 × ... × A_n    (2)

The initial K is the number of elements in the set B, namely the size of B, which equals the product of the sizes of the leaf nodes' test data sets:

K = |B| = |A_1 × A_2 × ... × A_n| = |A_1| × |A_2| × ... × |A_n|    (3)
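A minimal sketch of Formulas (1)-(3), reusing the hypothetical FeatureNode structure and device_tree example from the previous sketch: the leaf value sets A_1..A_n are collected from the tree, the initial cluster centers are their Cartesian product B, and K is its size.

    from itertools import product

    def leaf_value_sets(node):
        """Collect the test data sets A_1..A_n of all leaf nodes (depth-first)."""
        if node.is_leaf():
            return [(node.name, node.values)]
        sets = []
        for child in node.children:
            sets.extend(leaf_value_sets(child))
        return sets

    def initial_cluster_centers(tree):
        """B = A_1 x A_2 x ... x A_n (Formula 2); K = |B| (Formula 3)."""
        names, value_sets = zip(*leaf_value_sets(tree))
        centers = [dict(zip(names, combo)) for combo in product(*value_sets)]
        return centers, len(centers)

    centers, K = initial_cluster_centers(device_tree)
    # For the example tree above: K = 4 * 3 * 4 = 48 initial cluster centers.

Each center is represented here as a dictionary mapping leaf feature names to one concrete value, which is also the device representation assumed in the distance sketch below.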

2) Calculating the distances between mobile devices and every cluster center
To cluster mobile devices, a recursive, hierarchical calculation method is proposed to compute the distance between a selected mobile device and a cluster center from their feature values. The distance of feature F between mobile devices Dx and Dy is denoted Dis(Dx, Dy)_F, and can be recursively calculated from the distances of all of its sub-features F_si:

Dis(Dx, Dy)_F = Σ_{i=1}^{n} w_i · Dis(Dx, Dy)_{F_si}    (4)

where w_i is the weight factor of sub-feature F_si. For the same parent node, the weight factors of its child nodes must sum to 1:

w_1 + w_2 + ... + w_n = 1    (5)

Thus, the feature distance between two devices can be recursively calculated from the feature distances of all leaf nodes, which are computed according to their type. For non-numerical features, such as the network link, the computation formula is:

Dis(Dx, Dy)_F = 0 if DxF = DyF; 1 if DxF ≠ DyF    (6)

where DxF represents the value of feature F for device Dx, and DyF represents the value of feature F for device Dy. If the two values are the same, the feature distance is 0; otherwise it is 1.

For numerical features, the feature values first need to be standardized with the min-max method, which limits their value region to [0, 1]; the distance is then calculated according to Formula (7):

Dis(Dx, Dy)_F = |DxF − DyF|    (7)

For example, given the device cluster center in Figure 8 and the feature model of a mobile device in Figure 9, the feature distance between the cluster center and the mobile device is 0.6.

Figure 8. A Sample of Initial Cluster Center

Figure 9. A Sample of Device Feature Model

Finally, we calculate all feature distances between every mobile device and all cluster centers, and then assign every mobile device to the nearest device cluster. Devices in the same cluster have similar compatibility feature values, so we select mobile test devices from different clusters to avoid redundant compatibility testing.
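The sketch below combines Formulas (4)-(7): a weighted sum over child features, a 0/1 comparison for categorical leaves, and an absolute difference for numerical leaves. It assumes the hypothetical FeatureNode tree sketched earlier, that devices and cluster centers are dictionaries keyed by leaf feature name, and that numerical values have already been min-max scaled to [0, 1]; it is illustrative, not the authors' code.

    def feature_distance(node, dev_x, dev_y):
        """Dis(Dx, Dy)_F per Formulas (4)-(7); dev_x / dev_y map leaf names to values."""
        if node.is_leaf():
            vx, vy = dev_x.get(node.name), dev_y.get(node.name)
            if node.numerical:
                return abs(vx - vy)            # Formula (7), values already min-max scaled
            return 0.0 if vx == vy else 1.0    # Formula (6)
        # Formula (4): weighted sum over sub-features; sibling weights sum to 1 (Formula 5).
        return sum(child.weight * feature_distance(child, dev_x, dev_y)
                   for child in node.children)

    def assign_to_clusters(tree, devices, centers):
        """Assign each device to the nearest cluster center (the K-Means assignment step)."""
        clusters = {i: [] for i in range(len(centers))}
        for dev in devices:
            nearest = min(range(len(centers)),
                          key=lambda i: feature_distance(tree, dev, centers[i]))
            clusters[nearest].append(dev)
        return clusters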


C. Ranking Clusters in Order of Market Share
Selecting and prioritizing popular devices for mobile compatibility testing greatly reduces the impact of faults, because compatibility faults on popular devices make more users lose confidence and turn to competitors. In order to select and prioritize mobile test devices, we rank the device clusters by market share. The market share of a cluster is calculated by summing the market shares of all devices in the cluster, and all clusters are then ordered by market share. Finally, we select one popular device from every device cluster, in this order, to produce an optimized test sequence for mobile compatibility testing.
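A short sketch of this ranking step, assuming the cluster assignment produced above and that each device record also carries a "market_share" entry; the field name is hypothetical.

    def optimized_test_sequence(clusters):
        """Rank clusters by total market share, then pick the most popular
        device from each cluster in that order (Section IV.C)."""
        ranked = sorted(clusters.values(),
                        key=lambda devs: sum(d["market_share"] for d in devs),
                        reverse=True)
        return [max(devs, key=lambda d: d["market_share"])
                for devs in ranked if devs]

The returned list is the optimized test device sequence: testing it from the front covers the most popular clusters first.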

D. Evaluating and Analyzing Test Coverage
In this section, we present two sets of compatibility testing metrics for evaluating test coverage. In the feature tree model, each leaf node represents one basic device feature, and most feature values are discrete, such as the resolution values 1366 × 768, 1680 × 1050, 1440 × 900, and 1280 × 768. The feature coverage (Fcov) of a leaf node is defined as:

Fcov = Ntest / Nall    (8)

where Ntest is the number of tested feature values and Nall is the total number of feature values. From the feature coverage of every leaf node, the average feature coverage (AFcov) of a parent node can be recursively calculated using Formula (9); ultimately, the average feature test coverage of the under-test mobile app can be calculated:

AFcov = (1/n) Σ_{i=1}^{n} Fcov_i    (9)

Finally, according to the final selection of the test device set, we can compute the user market coverage (Ucov):

Ucov = Σ_{i=1}^{n} dev_share_i    (10)

where dev_share_i represents the market share of the selected test device i.
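A minimal sketch of Formulas (8)-(10), assuming the hypothetical FeatureNode tree from the earlier sketches, a mapping from leaf names to the feature values actually exercised by the selected devices, and a "market_share" entry per selected device; the names are illustrative.

    def feature_coverage(all_values, tested_values):
        """Fcov = Ntest / Nall (Formula 8)."""
        return len(set(tested_values) & set(all_values)) / len(set(all_values))

    def average_feature_coverage(node, tested):
        """AFcov of a node, recursively averaged over its children (Formula 9);
        `tested` maps leaf names to the feature values covered by the selected devices."""
        if node.is_leaf():
            return feature_coverage(node.values, tested.get(node.name, []))
        child_cov = [average_feature_coverage(child, tested) for child in node.children]
        return sum(child_cov) / len(child_cov)

    def user_market_coverage(selected_devices):
        """Ucov = sum of the market shares of the selected devices (Formula 10)."""
        return sum(d["market_share"] for d in selected_devices)

Applying average_feature_coverage to the root node yields the average feature test coverage of the under-test mobile app.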

V. CASE STUDY

To evaluate the effectiveness of the proposed approach, we used it to test two mobile apps on different mobile devices and compared the test results. We conducted this study in four steps.

a) Set up a candidate test device pool - We selected 2,500 kinds of Android smartphones as candidate test devices. These smartphones have been produced since 2012 and are sold on Zhongguancun (a well-known mobile phone sales website in China).

b) Construct feature tree models for compatibility testing - We picked two Chinese native mobile apps (Weixin and Ctrip) for compatibility testing. Weixin is a mobile app supporting instant messaging and social networking; our testing for the Weixin app mainly focuses on the Android operating system and camera capability at diverse resolutions. The Ctrip app supports mobile booking for travelers making flight and hotel reservations; the compatibility testing for Ctrip focuses on diverse screen sizes and resolutions. For the two mobile apps, we set up their feature tree models, as shown in Figure 10 and Figure 11, respectively.

Figure 10. The Compatibility Feature Tree Model for Weixin App

Figure 11. The Compatibility Feature Tree Model for Ctrip App

c) Perform statistical analysis and provide optimized test device sequences - All mobile devices were clustered for each mobile app using the K-Means statistical method. We calculated the number of devices, market share, and average price for each cluster. The clusters for the two apps are shown in Figure 12 and Figure 13, respectively. We select one popular device from every cluster, in order of market share, to provide an optimized test device sequence.

d) Compatibility test result analysis - We selected 20 popular mobile devices from the optimized test device sequence for each app as the statistical test device set. For comparative analysis, we asked the test engineers to choose the same number of mobile devices based on their experience, forming the experience-based test device set. The test coverage of the two test device sets for each app is analyzed in Table I. We executed the same set of test cases on the two test device sets for every app; the final test results are presented in Table II.


Figure 12. The Clusters Result for Weixin App

Figure 13. The Clusters Result for Ctrip App

TABLE I. TEST COVERAGE OF TEST DEVICES

Mobile app | Device set | Initial clusters | Selected devices | Fcov (Android version) | Fcov (Screen size) | Fcov (Screen resolution) | Fcov (Camera) | AFcov | Ucov
Weixin     | Set one    | 135 | 20 | 100% | 86% | 82% | 87% | 92.6% | 22.3%
Weixin     | Set two    | /   | 20 | 100% | 65% | 86% | 72% | 86%   | 9.45%
Ctrip      | Set one    | 100 | 20 | 100% | /   | 86% | 92% | 94.5% | 30.1%
Ctrip      | Set two    | /   | 20 | 100% | /   | 69% | 72% | 85.3% | 8.79%

Set one: statistical test device set; Set two: experience-based test device set.

TABLE II. MOBILE COMPATIBILITY TEST RESULTS

Mobile App | Device-independent Faults | Device-Specific Faults (Set one) | Device-Specific Faults (Set two)
Weixin     | 1 | 5 | 1
Ctrip      | 3 | 6 | 2

Set one: statistical test device set; Set two: experience-based test device set.

The experimental results show that the proposed method helps engineers choose more suitable mobile devices for compatibility testing, which can reduce test costs and time, improve test efficiency, and promote test quality.

VI. RELATED WORK

To date, many papers have been published addressing different testing areas for mobile applications and software compatibility. Model-driven testing methods have been applied to white-box testing of mobile applications, supporting automatic test case generation and testing path coverage analysis [6]. The traditional scenario-based testing method [7] and random testing [14] have been extended for black-box testing of mobile applications.

Numerous studies have discussed GUI testing techniques for mobile applications. The paper [8] proposed a structured approach to GUI testing of Android applications, and the paper [9] proposed an automated test method for Android applications. J2MEUnit [15] and the GlassJar Toolkit [16] are two automated testing tools that support mobile application testing based on functions, control flow, and data flow.

Usability testing is necessary to enhance the quality of the mobile usage experience of mobile apps. The paper [10] proposes a toolkit that embeds into mobile applications the ability to automatically collect user interface events as the user interacts with the applications. Waterson [11] conducted a remote usability study on mobile devices. T. Kallio and A. Kaikkonen present a usability testing study in two comparative environments (a laboratory and the field) based on 40 test users [17].

MTaaS (Mobile Testing-as-a-Service) provides cloud-based mobile testing services. Jerry Gao discusses its basic concepts, features, test process, and infrastructure [12]. Oleksii Starov presents a cloud-based testing framework for mobile systems, which provides the ability to run tests on a variety of remote mobile devices [18]. Al-Ahmad reviews the related areas of mobile, cloud, and mobile cloud application testing in terms of features and models [19].

Łukasz Pobereznik presents a method of measuring the effectiveness of a given software environment at discovering defects in software by introducing an environment sensitivity measure [20]. Some researchers provide models and methods for testing configurable component compatibility [21][22].

However, there are only a few publications about compatibility testing of mobile applications. Google has presented its testing strategy and developed Android compatibility tools for Android fragmentation issues. Sergiy Vilkomir [23] uses a combinatorial approach for mobile compatibility testing. Junfei Huang [24] presents an automated compatibility testing service for mobile apps.

Unlike the existing research, this paper presents a compatibility testing service for mobile apps. The testing service uses a systematic statistical approach to provide an optimized test device sequence, which helps engineers reduce test costs and improve test efficiency.

VII. CONCLUSION AND FUTURE WORK

With the fast advance of mobile computing and the large increase in mobile app downloads, compatibility testing for mobile apps has become a challenging task for engineers due to its high complexity and costs. This paper proposes a systematic and cost-effective mobile compatibility test method, based on a tree model and the K-Means algorithm, for selecting mobile devices with their diverse platforms, configurations, and appliances. The paper also presents a compatibility testing service for mobile apps. Future research directions include three areas: a) device-specific fault analysis for mobile compatibility testing, to identify the relations between compatibility faults and mobile device features; b) improvement of the optimization algorithm; and c) development of test and coverage analysis tools supporting mobile compatibility testing.

ACKNOWLEDGEMENT
This research project was supported by the National Natural Science Foundation of China (Program No. 61103003), and jointly funded by Fujitsu Research Lab on Mobile SaaS Testing from 2013 to 2015.

REFERENCES
[1] Adrian Holzer and Jan Ondrus, "Mobile application market: A developer's perspective", Telematics and Informatics, vol. 28(1), 2014, pp. 22-31.
[2] Jerry Gao, X. Bai, W. T. Tsai, and T. Uehara, "Mobile application testing: a tutorial", IEEE Computer, vol. 47(2), 2014, pp. 26-35.
[3] Klaus Haller, "Mobile testing", ACM SIGSOFT Software Engineering Notes, vol. 38(6), 2013, pp. 1-8.
[4] M. E. Delamaro, A. M. R. Vincenzi, and J. C. Maldonado, "A strategy to perform coverage testing of mobile applications", In: Proceedings of the International Workshop on Automation of Software Test, 2006, pp. 118-124.
[5] Chuanqi Tao and Jerry Gao, "Modeling mobile application test platform and environment: testing criteria and complexity analysis", In: Proceedings of the 2014 Workshop on Joining AcadeMiA and Industry Contributions to Test Automation and Model-Based Testing, 2014, pp. 28-33.
[6] R. Mahmood, N. Esfahani, T. Kacem, N. Mirzaei, S. Malek, and A. Stavrou, "A white-box approach for automated security testing of Android applications on the cloud", In: Proceedings of the 7th International Workshop on Automation of Software Test, 2012, pp. 22-28.
[7] Jiang Bo, Long Xiang, and Gao Xiaopeng, "MobileTest: A tool supporting automatic black box test for software on smart mobile devices", In: Proceedings of the 2nd International Workshop on Automation of Software Test, 2007.
[8] D. Amalfitano, A. R. Fasolino, P. Tramontana, S. De Carmine, and A. M. Memon, "Using GUI ripping for automated testing of Android applications", In: Proceedings of the 27th IEEE International Conference on Automated Software Engineering, 2012, pp. 258-261.
[9] C. Hu and I. Neamtiu, "Automating GUI testing for Android applications", In: Proceedings of the 6th International Workshop on Automation of Software Test, 2011, pp. 77-83.
[10] Xiaoxiao Ma, Bo Yan, Guanling Chen, Chunhui Zhang, Ke Huang, Jill Drury, and Linzhang Wang, "Design and implementation of a toolkit for usability testing of mobile apps", Mobile Networks and Applications, vol. 18(1), 2013, pp. 81-97.
[11] S. Waterson, J. A. Landay, and T. Matthews, "In the lab and out in the wild: remote web usability testing for mobile devices", In: Proceedings of the Conference on Human Factors in Computing Systems, 2002, pp. 296-297.
[12] Jerry Gao, Wei-Tek Tsai, Ray Paul, Xiaoying Bai, and Tadahiro Uehara, "Mobile Testing-as-a-Service (MTaaS) - Infrastructures, issues, solutions and needs", In: Proceedings of the IEEE 15th International Symposium on High-Assurance Systems Engineering, 2014, pp. 158-167.
[13] L. Murugesan and P. Balasubramanian, "Cloud based mobile application testing", In: Proceedings of the 2014 IEEE/ACIS 13th International Conference on Computer and Information Science (ICIS), 2014, pp. 287-289.
[14] Zhifang Liu, Xiaopeng Gao, and Xiang Long, "Adaptive random testing of mobile application", In: Proceedings of the 2nd International Conference on Computer Engineering and Technology, 2010, pp. 297-301.
[15] H. van der Merwe, B. van der Merwe, and Willem Visser, "Verifying Android applications using Java PathFinder", ACM SIGSOFT Software Engineering Notes, vol. 37(6), 2012, pp. 1-5.
[16] Zhenglei Wang, Zhenjun Du, and Rong Chen, "A testing method for Java ME software", In: Proceedings of the 8th International Conference on Embedded Computing, 2009, pp. 58-62.
[17] T. Kallio and A. Kaikkonen, "Usability testing of mobile applications: A comparison between laboratory and field testing", Journal of Usability Studies, vol. 1(1), 2005, pp. 23-28.
[18] Oleksii Starov, Sergiy Vilkomir, and Vyacheslav Kharchenko, "Cloud testing for mobile software systems: Concept and prototyping", In: Proceedings of the 8th International Joint Conference on Software Technologies, 2013, pp. 124-131.
[19] A. S. Al-Ahmad, S. A. Aljunid, and A. S. A. Sani, "Mobile cloud computing testing review", In: Proceedings of the 2013 International Conference on Advanced Computer Science Applications and Technologies (ACSAT), 2013, pp. 176-180.
[20] L. Pobereznik, "A method for selecting environments for software compatibility testing", In: Proceedings of the 2013 Federated Conference on Computer Science and Information Systems (FedCSIS), 2013, pp. 1355-1360.
[21] Il-Chul Yoon, Alan Sussman, Atif Memon, and Adam Porter, "Testing component compatibility in evolving configurations", Information and Software Technology, vol. 55(2), 2013, pp. 445-458.
[22] Jerry Gao, J. Guan, A. Ma, C. Q. Tao, X. Y. Bai, and D. C. Kung, "Testing configurable component-based software - Configuration test modeling and complexity analysis", In: Proceedings of the 2011 International Conference on Software Engineering and Knowledge Engineering, 2011, pp. 495-502.
[23] Sergiy Vilkomir and Brandi Amstutz, "Using combinatorial approaches for testing mobile applications", In: Proceedings of the 2014 IEEE International Conference on Software Testing, Verification, and Validation Workshops, 2014, pp. 78-83.
[24] Junfei Huang, "AppACTS: Mobile app automated compatibility testing service", In: Proceedings of the 2nd IEEE International Conference on Mobile Cloud Computing, Services, and Engineering, 2014, pp. 85-90.
