International Journal of Pure and Applied Mathematics, Volume 118, No. 20, 2018, 2735-2757
ISSN: 1314-3395 (on-line version); url: http://www.ijpam.eu
Special Issue


Investigations on Android Testing Methodologies and Strategies

Selvaraj Kesavan*1, J. Jayakumar2, Senthil Kumar3
[email protected], [email protected]
1 Accenture Solutions, Bangalore, India
2 Karunya University, Coimbatore, India
3 Dilla University, Dilla, Ethiopia

Abstract: Over the years, the usage of software and tools has increased drastically. With stiff market competition and high expectations from users, it is essential to deliver high-performance, bug-free software. End-to-end platform testing, analysis and performance monitoring in an embedded software environment is a challenging task. The many testing and debugging tools that have evolved around Android enable rapid development of solutions, from board bring-up right through to the product release stage. These tools not only aid development; they also help in studying the existing performance characteristics of the system and in carrying out further optimization. This document attempts to list various proven methodologies and tools used for debugging, system-information retrieval and performance measurement on the Android platform, along with their strengths and limitations. The analysis guides the user through the optimal selection of tools and evaluates the performance of system components against various criteria. Developers and testers can benefit from this study when picking the testing tools appropriate for their development environment.

Keywords: Android platform, benchmark, embedded system, performance, testing, debugging, optimization.

1. Introduction
An embedded system is a complex entity that comprises numerous hardware and software components. Although the performance of an embedded system depends widely on the hardware specification, the absolute, or real, performance of the system is judged by the quality of the software built into it. The efficiency of this software, and the level of optimization with which the hardware resources are consumed, are critical for the system's performance. Various test methods are available to capture the real performance: the relative performance of the system is measured by running different sets of test methods against it, which requires considerable prior experience and a deep understanding of the tools used in hardware, software and system integration.

Software testing is a key, essential process that consumes more than half the time in the software development life cycle. The quality of a software product or service depends purely on the excellence of the testing procedures, which should cover the product end to end, from a single module to complete system-integration aspects. The design of the testing methodology should take into account total cost, effort, reusability, coverage, ease of installation, and validation. As the application ecosystem grows exponentially, new software components are introduced quickly and in short timeframes to meet user expectations. This makes testing the software and hardware components a real challenge and demands adequately comprehensive testing frameworks and tools.

Manual testing is a crucial testing method widely used on software stacks in the embedded industry. However, manual testing is time-consuming, effort-intensive, error-prone and tedious when covering all cases, and it delays the time-to-market launch. Automated testing is the preferred approach in most software development environments: the quality of the product can be improved by the precise test outcomes of automated testing, it is cost-effective, it detects issues earlier, and it requires only a few skilled resources to execute the test procedures. On the other hand, automated testing needs resources with complete awareness of tool usage and requires significant initial

*Author for correspondence


investment, time, and post-processing analysis of the test outcomes. Despite these few issues, automated testing with suitable tools is preferable to manual testing.

The Android ecosystem is an extensively used and proven Linux-based software platform, primarily for smartphones and tablet devices. Android has seen numerous updates which have incrementally improved the operating system from version Cupcake to Marshmallow. Numerous tools are available on the Android platform to analyze the software stack from the application layer down to the kernel. These tools mainly provide system-wide, end-to-end logging and debugging facilities that are quite useful for setting benchmarks and carrying out performance optimization. This paper concentrates on providing deep insight into the various system-wide tools on the Android platform and their benefits to developers, consumers and manufacturers.

The contributions of this paper broadly include the following three aspects. First, the underlying Android layered architecture is described in brief. Second, we discuss the embedded testing strategy and list the test-environment possibilities for embedded software. Third, we list several testing and performance measurement tools, experiment with them on an Android device, and analyze the results.

The rest of this paper is organized as follows. Section 2 covers related work on testing methodologies and tools for embedded and Android platforms. The test-strategy analysis and Android test-environment setup are discussed in Section 3. A brief introduction to the Android architecture and its subsystems is given in Section 4. The experimental setup with a Google Nexus 4 Android device is described in Section 5. Section 6 gives an overview of the various logging, debugging, performance and system-information tools used in Android, along with the experimental test outcomes and an analysis of the results. Finally, Section 7 concludes the paper.

2. Related Work
Testing and debugging are methods for understanding the correct flow of software and finding defects in it. Numerous tools and frameworks are available to make the development and testing process more efficient, and much research has proposed innovative, novel methods that save product development time and cost. Android, as a smart-device platform, has a great market share and a lead in the consumer-electronics industry; since it is open source and has strong community support, many tools are available to extract the most from Android platform data.

The IEEE provides a standard testing approach, testing assumptions, the direction to be followed, and extensive resource-usage information [1]; it helps the worldwide testing community follow a defined procedure. The International Software Testing Qualifications Board (ISTQB) offers software-testing qualification certification covering software-testing awareness and the customary testing methodologies [2]; industry certification helps evaluate the testing skills of software testers. The authors of [3] discuss the platform-based design approach for testing and validation in embedded software development; platform-based design brings the idea of a reusable, programmable model, thus reducing development time and hiding complexity by abstracting platform-dependent implementations. The authors of [4] classify regression techniques in terms of inclusiveness, precision, efficiency and generality; the paper also identifies the strengths and weaknesses of the various techniques, guiding the selection of an appropriate test-selection technique. A detailed software-testing survey based on testing methodologies, automated testing tools, testing metrics, testing standards and testing education is presented in [5], which also proposes recommendations for software testing based on the survey observations. The authors of [6] survey coverage-based testing in terms of coverage measurement, coverage criteria and automation aspects, helping software testers pick the right coverage-testing tools for their needs and environment.


The authors of [7] survey automatic test-case generation methods and list the problems in using certain techniques. Paper [8] investigates the test-driven development approach in detail through several experiments to determine the effectiveness and limitations of the methodology. The various challenges, methodologies and issues in the usability testing of mobile applications are described in [9]. Paper [10] presents a novel tool for GUI testing of Android applications that sends sequences of user events and exposes failures of the application. Paper [14] surveys model-based testing and test-automation implementations for Android applications and describes the complete process of the different application models, test-case design and execution scenarios. Profiling is the method of exploring the true capability of the software and hardware of a device under various circumstances. In paper [18], the performance difference between Java and C/C++ implementations is computed using Android applications compiled with a native cross-compiler for ARM and a native shared library accessed through the JNI of the Android NDK. The authors of [19] propose a model to measure the performance of native Linux applications and Java applications by executing identical tasks; their analysis shows that native C applications can be up to 30 times as fast as an identical algorithm running in the Dalvik VM. Measuring Android platform performance using native C programs is proposed in [21]. Cloud-based testing models and tools have created a revolution in testing methodology; cloud testing approaches, analyses, surveys and the advantages of cloud over traditional testing methods are discussed in [23][24][25].

3. Android Test Strategy
The sheer market power of Google, the attraction of the platform's open environment, community support for developers, and the convergence of connected consumer-electronics devices give Android significant momentum in technology and industry dynamics. Beyond mobile phones, Android is being designed into devices that extend far beyond the handset, becoming a key component of the global consumer-electronics market in set-top boxes, the connected home, automotive in-vehicle infotainment, mobile internet devices, and more. As a platform, Android provides a competitive user experience in that it:
• allows users to create state-of-the-art user interfaces;
• provides a hardware-independent architecture;
• offers a robust architecture for porting third-party software components;
• is enhanced with rich middleware features;
• facilitates exhaustive interoperability and compliance testing.
Testers make every effort to devise a strong test strategy to ensure that these quality goals are achieved in their entirety. For every quality goal to be tested, various test techniques are applied, using diverse tools and following different test processes. Creating a test strategy starts at the beginning of a project; its development and implementation are then carried forward throughout the whole development and production lifecycle. To ensure the strategy is carried forward, a test strategy document is created and kept updated during the development and production stages of the project. The test strategy must give a clear vision of what the testing team will do for the whole project for its entire duration. This approach requires adequate knowledge of test methods. The following are the key testing methods used in the embedded product development cycle:

• Application product test
• Integration product test
• Performance test
• Regression test
• Assembly test
• User acceptance test
• Operational acceptance test
• Automation approach

A good test strategy is an integral part of the software test model and should consider all the key factors of the project. The following factors are the key parameters to include in a testing strategy:

• Scope and limitations
• Testing approach
• Test types
• Test coverage
• Test deliverables
• Resources
• List of software, including test, debug and performance tools
• Enumeration of the hardware, firmware, software, networks, etc. required to carry out the testing
• Configuration and build management process
• Risk management

4. Android Test Environment
The test environment is critical to perfect test execution and software release. The selection of a test environment depends on many factors, such as time to market, infrastructure cost, the functionality to be tested, and quality requirements. Android platform software can be tested using:
• device emulators;
• development hardware / real devices;
• a cloud-based remote testing approach.

5. Device Emulators
Device emulators offer a sound testing environment for developers and users, allowing manual and automated GUI, functionality, stability, regression and performance testing to be performed efficiently. Application and native features can be tested on emulators without real target hardware (a minimal launch sketch follows the lists below).
Advantages:
• Simple and cost-effective.
• Runs on the user's host machine, so testing is fast compared to a real device.
• Helps users test and debug more frequently and simulate bugs.
Limitations:
• Hardware and software configuration mismatches between the emulator and the real device.
• Real-device computation results may be incompatible with emulator results.
• It is not possible to test network-related functionality.
• Not all features can be tested on emulators.
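As a minimal illustration of bringing an emulator up for such testing (a sketch under stated assumptions, not tooling from this paper): it assumes the Android SDK's emulator and adb binaries are on the PATH, and an AVD named test_avd, which is a hypothetical name ("emulator -list-avds" prints the names actually present).

```python
import subprocess

# Boot a hypothetical AVD; "emulator -list-avds" shows the names available locally.
emulator = subprocess.Popen(["emulator", "-avd", "test_avd", "-no-snapshot"])

# Block until the emulated Android OS reports that it has finished booting.
subprocess.run(["adb", "wait-for-device"], check=True)
print("Emulator is up; tests can now be dispatched over adb.")
```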

6. Development Hardware / Real Device
A real device allows the user to execute every possible scenario without limitations and gives realistic output. End-to-end system-wide testing (hardware, software and network) can be done with real devices. It gives the actual end-user feel, so one can understand how the device behaves in all possible scenarios.
Advantages:
• Testing on a real device always gives accurate results.
• Usage on a real network shows device behavior in a live network environment.
• Provides realistic computation results for the given hardware and software capability.
• Yields the actual performance figures and hints at where further optimization is possible.


Limitations:
• It is very difficult to start initial development and debugging in a real-device environment.
• Device cost is considerably higher, which increases the overall development cost.

7. Cloud-Based Approach: Testing as a Service (TaaS)
The cloud-based mobile testing model allows users to connect remotely and virtualize resources. Test beds with real devices are set up remotely, and testers and developers can easily access the devices, which are connected to live networks, from anywhere at any time over the internet.
Advantages:
• Enables testing across a variety of modern devices and gives accurate results.
• Remote testing eliminates the need for local device presence and saves infrastructure cost.
• Easy to connect and execute test cases from the host using a web interface.
• Allows setting up and testing in both public and private cloud modes.
Limitations:
• Significant cost is involved for usage, licensing and software tools.
• Adequate skills and software are required to connect to remote devices.

8. Experimental Setup
The experimental evaluation of the different Android tools was carried out using the test setup shown in Figure 1. Executing the tools on this setup essentially requires demonstrating the usage steps and use cases of each tool. The host is an Ubuntu Linux 12.04 desktop machine with a 3.8.8 kernel, 4 GB of RAM and an Intel Core i5 CPU. A Google Nexus 4 target device flashed with Android Lollipop 5.1 is connected to the host machine via the Android Debug Bridge (ADB). Application installation, pushing and pulling files, configuration, and access to the Android shell from the host can all be done using ADB, as sketched below.
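The following minimal Python sketch shows how this host-to-target workflow can be scripted over ADB. It assumes the adb binary is on the PATH and a single device is attached; the APK and file names are hypothetical.

```python
import subprocess

def adb(*args: str) -> str:
    """Run one adb command against the attached device and return its output."""
    result = subprocess.run(["adb", *args], check=True,
                            capture_output=True, text=True)
    return result.stdout

print(adb("devices"))                                     # list attached devices/emulators
adb("install", "-r", "MyTestApp.apk")                     # (re)install a test build (hypothetical APK)
adb("push", "input.bin", "/data/local/tmp/input.bin")     # copy test data to the target
adb("pull", "/data/local/tmp/results.txt", ".")           # fetch results back to the host
print(adb("shell", "getprop", "ro.build.version.release"))  # query the Android version
```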

9. Software Testing Tool Analysis and Evaluation
Testing and debugging require adequate knowledge of the development platform and its use cases. With the variety of development platforms, short time to market and varied device hardware capabilities, manufacturers and users face numerous challenges. Various application and platform tools are used in the Android environment to help developers, users and device manufacturers with solution bring-up, debug support, collection of significant system information, and key performance benchmark metrics. It is essential for Android developers and testers to have proper knowledge of these tools and their execution methodology, to ease development and reduce effort; without the proper tool, analyzing and debugging system issues is genuinely hard and time-consuming. The tools typically evaluate and analyze device end-to-end compatibility according to a handful of criteria. The Android tools fall into the following categories:
• Android logging and debugging tools
• Android system information tools
• Android performance measurement tools
• Android compatibility test suite
Each tool provides distinct features in terms of the information it delivers, its measuring efficiency and its usage. The following sections list most of the tools used on the Android platform along with their features and limitations.


Fig. 1 Experimental setup

9.1. Android Logging and Debugging Tools

Table 1. Android logging and debugging tools

Traceview
Description: Graphically interprets the logs from an Android application in a timeline panel and a profile panel. The timeline panel shows each thread and its relevant timing information; the profile panel summarizes the total time spent inside each method.
Merits:
• Easy graphical interpretation of the logs.
• The timeline and profile panels provide significant information about threads and per-method timing.
Limitations:
• Traceview does not remove a thread's information from the chart even if the thread exits during profiling.
• The same thread ID may be reused repeatedly by the virtual machine.

Dalvik Debug Monitor Server (DDMS)
Description: Acts as the mediator between the development environment and the Dalvik processes running on the device or emulator. The Android Debug Bridge (ADB) notifies DDMS when a device or emulator is connected to the host, and DDMS creates a VM monitoring service to monitor the VMs running on it. When a debugger connects to a selected port, all debugging traffic is forwarded to the selected VM. DDMS is part of the Android distribution and is located in the tools/ directory.
Merits:
• Provides an integrated debug environment.
• Thread and heap information of the VMs running on the device.
• Memory allocation tracker.
• File explorer to view the device file system.
• View of process status.
• Screen capture on the device.
• Logcat output.
• System state information, e.g. radio state.
• Data spoofing such as incoming calls, SMS and location data.
• Application network requests can be tracked.
Limitations:
• It does not display all processes in the system, only those hosting a VM.
• On connect and disconnect, DDMS drops and reconnects the client so that the VM realizes the debugger has exited.

logcat
Description: The Android logging system provides a mechanism for collecting and viewing system debug output. Logs from various applications and portions of the system are collected in a series of circular buffers, which can then be viewed and filtered by the logcat command. It is a utility to retrieve and view all Android user-space logs stored in the log buffers. Log categories:
• Event log: records diagnostic events such as garbage collection, activity manager state, battery status and system watchdogs.
• System log.
• Radio log: records radio-related messages; the logging system automatically routes messages with specific tags (e.g. RIL or AT) to the radio log buffer.
• Main (default) log: all relevant logs except the event and radio logs.
Merits:
• System-wide debugging using the traces.
• Using the timestamps in the logs, processes taking unusually long to complete certain tasks can be discovered.
• Logs can be used to determine causes of low performance, such as poor battery life: an application may be performing some action continuously when it is not expected to, draining the battery unnecessarily. Logcat traces allow such scenarios to be spotted.
Limitations:
• Adding too many logs can slow down the system and cause time-outs of certain activities due to delayed responses.

Monkey test
Description: The Monkey is a program that runs on the emulator or device and injects thousands of rapid touchscreen/key-press events, such as clicks, touches and gestures, as well as several system-level events, providing accelerated testing of the phone's stability. It also watches the system under test for three conditions:
• When constrained to one or more specific packages, it watches for attempts to navigate to any other package and blocks them.
• If the application crashes or receives any sort of unhandled exception, the Monkey stops and reports the error.
• If the application generates an "application not responding" error, the Monkey stops and reports it.
(A usage sketch follows this table segment.)
Merits:
• Multiple devices can be connected and tests executed on all of them.
• Allows automated functional, regression and stability testing.
Limitations:
• The Monkey generates the same default set of events for the same seed number.
• Designing a Monkey experiment requires some knowledge of the application code.

Strace
Description: Strace is a useful diagnostic and debugging tool used to trace the system calls and signals crossing from user space to kernel space. It is useful for debugging and diagnosing any application running in user space, and especially for programs for which the source code is not available, such as third-party binary-only distributions.
Merits:
• Allows inspection of the system calls and signals made or received by a process and its children.
• Lets the user attach strace to an already running process via its PID (optionally including its children).
• Traces can be logged to a file, since printing them on the console could slow the application down.
• Can print the time taken to execute each system call.
• Shows the parameter values and return value of each system call.
• No instrumentation of the application code is needed to enable strace.
• Can show the program counter (PC) of the running application at the point of each system call.
• Reveals whether a system call is being made too many times and helps find the time spent in each system call.
Limitations:
• Not very useful from a Java application perspective, since the system calls occur in native user-side code and are not obvious.
• Requires a good understanding of system calls and their relevance to the program/application context and use case.
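As a hedged illustration of driving logcat, the Monkey and strace from a host script (Table 1 continues below): the sketch assumes adb is on the PATH; the package name, seed and event count are hypothetical; and the strace line assumes a strace binary is present on the device, which is not part of every stock image. The pidof applet may also be missing on older images, in which case parsing "ps" output is the usual fallback.

```python
import subprocess

def adb(*args: str) -> str:
    return subprocess.run(["adb", *args], check=True,
                          capture_output=True, text=True).stdout

# Dump the main log buffer once, with timestamps, then the radio buffer.
print(adb("logcat", "-d", "-v", "time"))
print(adb("logcat", "-b", "radio", "-d"))

# Inject 500 pseudo-random events into one hypothetical package; a fixed
# seed (-s) makes the event sequence reproducible, as noted in the table.
print(adb("shell", "monkey", "-p", "com.example.app", "-s", "42", "-v", "500"))

# Attach strace to a running process and log syscalls to a file on the device.
# (Assumes strace and pidof exist on the target; strace runs until stopped.)
pid = adb("shell", "pidof", "com.example.app").strip()
tracer = subprocess.Popen(["adb", "shell", "strace", "-f", "-p", pid,
                           "-o", "/data/local/tmp/trace.txt"])
# ... exercise the application, then:
tracer.terminate()
```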

Boot chart
Description: Boot chart is designed to provide a graphical display of the system activities during boot. It consists of a data collection tool and a graphic generator; the CPU utilization of each process is indicated by the coloring of its process bar. Things to look for are the start and stop times of the various processes and their CPU utilization; long gaps with low utilization may indicate a timeout or some other problem. Boot chart is then run to generate a graphic image from the collected data.
Merits:
• Graphical display of system activities during boot.
• Makes it easy to find and debug processes that take a long time with low utilization.
Limitations:
• It increases the system load during the boot process.
• Disk statistics are always zero on Android, since no block devices are being used.

BusyBox
Description: BusyBox combines tiny versions of many common UNIX utilities into a single small executable, providing replacements for most of the utilities in GNU fileutils, shellutils, etc. The utilities in BusyBox generally have fewer options than their full-featured GNU cousins; however, the options that are included provide the expected functionality and behave very much like their GNU counterparts. BusyBox provides a fairly complete environment for any small or embedded system. (A usage sketch follows this table segment.)
Merits:
• Provides many small-footprint tools for use on constrained embedded systems.
• Its modularity makes it easy to include or exclude commands at compile time.
• Offers more than 200 utilities to debug, log and analyze all modules of the system.
• Supports architectures such as ARM, x86, ia64, x86_64, m68k, MIPS, PowerPC, S390, CRIS, H8/300 and SPARC.
Limitations:
• Needs extensive knowledge to use and to analyze the results.

Hierarchy Viewer
Description: A PC-based tool for debugging and optimizing the user interface. It provides a visual representation of the layout's view hierarchy and a pixel-perfect inspection of the display:
• Layout View provides a tree view of all views on the screen, a wireframe drawing of the layout, and a properties inspector.
• Pixel Perfect View provides an explorer view as a tree, a normal view, and a magnified view.
Merits:
• Helps analyze the layout and streamline and optimize the user interface.
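As a small hedged example of the BusyBox entry above: if a busybox binary has been pushed to the device (it is not part of the stock image, and the install path below is a common but hypothetical choice), its applets are invoked by prefixing the command name.

```python
import subprocess

def adb_shell(cmd: str) -> str:
    return subprocess.run(["adb", "shell", cmd], check=True,
                          capture_output=True, text=True).stdout

BB = "/data/local/tmp/busybox"        # hypothetical install location on the device
print(adb_shell(f"{BB} --list"))      # enumerate the applets compiled in
print(adb_shell(f"{BB} ls -l /proc")) # GNU-like ls provided by the busybox binary
print(adb_shell(f"{BB} free"))        # quick memory summary applet
```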

GNU Debugger (GDB)
Description: GDB can be used to remotely debug native Android applications and collect information efficiently. The setup procedure is not as straightforward as on desktop Linux: it requires a server/client connection over an ADB-forwarded port. The setup procedure is as follows (a sketch of this flow appears after this table):
• Start gdbserver on the device.
• Set a port for the client to listen on.
• Attach to the process to be debugged.
• Forward the ADB port (sending port) to a localhost port (listening port).
• Use the GDB client on the host side, listening to the remote port.
• Start the debugger on the client.
Merits:
• Eases the debugging of native binaries on the Android platform.
Limitations:
• Requires gdbclient and gdbserver setup using the ADB shell.
• A timeout sometimes abruptly closes the debugging session.

Robotium
Description: Robotium is an automation test framework for the Android platform which provides complete support for automating test scenarios for native and web applications. It allows testers to write code that tests unit, system and end-to-end device functionality.
Merits:
• Gives good control over views and reports the test result of each view.
• Simplifies SDK instrumentation testing.
• All SDK-tool-based testing can be done in the same test project.
Limitations:
• Does not allow testing the complete functionality of non-UI-based applications.
• Slow to deploy and execute the test scenarios.
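A minimal sketch of the forward-and-attach GDB flow described above, assuming a gdbserver binary is present on the device (the NDK ships one), port 5039 is free, and com.example.app is a hypothetical debuggable process:

```python
import subprocess

# Forward host port 5039 to device port 5039 so the host-side GDB can connect.
subprocess.run(["adb", "forward", "tcp:5039", "tcp:5039"], check=True)

# Attach gdbserver on the device to the target process (pidof may be absent
# on older images; parsing "ps" output is the fallback).
pid = subprocess.run(["adb", "shell", "pidof", "com.example.app"],
                     capture_output=True, text=True).stdout.strip()
server = subprocess.Popen(["adb", "shell", "gdbserver", ":5039", "--attach", pid])

# On the host, inside the NDK's gdb, one would then run:
#   (gdb) target remote :5039
```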

11. System Information Tools

Table 2. Android system information tools

Dumpsys
Description: Dumpsys is used to dump information from a service to the console. Each service in Android can optionally implement the dump method to print debug information when requested by this tool (or any client). Dumpsys is beneficial for service writers and application developers and provides the list of services running on the system. It is also used by Dumpstate, which captures almost all the important debugging data from the system. (A usage sketch appears after the Procrank entry below.)
Merits:
• Captures all the relevant information in the system.
• Vital for performance and stability analysis.
• Most beneficial during application and service development, to learn whether the change affects system performance.

Dumpstate
Description: Dumpstate is the most critical tool on Android for analyzing defects related to performance and stability; such defects can be easily analyzed and fixed using the Dumpstate log. Dumpstate fetches extensive system information from the device via ADB on the host machine. It relies on other Android tools and files for the information it dumps to the console, and the output can also be redirected to a socket or a file on the system. It uses other system utilities such as dumpsys, dmesg, logcat, librank and procrank to dump the data.
Merits:
• Prints system-wide memory and process information.
• Prints details of all threads associated with processes.
• Prints system-wide trace data (including application and kernel logs).
• Prints all system properties.
• Prints application and service states.
Limitations:
• A huge amount of data is dumped, making it very difficult to analyze and pinpoint the exact problem.

Librank
Description: Librank is an Android console utility that gives a quick summary of memory (resident and non-resident pages) used per library; by default it displays the list in descending order of memory usage. Librank is used together with Dumpstate. It provides memory statistics in terms of:
• RSStot: total resident set size that the library uses across all processes that use it.
• VSS: virtual set size, the amount of memory mapped into a process space for that library, including non-resident memory.
• RSS: resident set size, the amount of memory mapped into a process space for that library counting only resident memory; this includes memory shared across processes and is therefore not very useful.
• PSS: proportional set size, the amount of memory shared with other processes, accounted so that the amount is divided evenly between the processes that share it. This memory would not be released if the process terminated, but is indicative of the amount the process is "contributing".
• USS: unique set size, the amount of memory that would be freed if the application were terminated right now.
Merits:
• Quickly identifies the amount of memory used by a library; if RSStot is very large compared to the library's code size, a memory leak in the data segment can be suspected.
• The output can be sorted by VSS, PSS, USS or RSS, and restricted to libraries from a specific system path.
Limitations:
• The output has to be interpreted together with Dumpstate information to get the exact system status.

Procrank
Description: Procrank displays a summary of process memory utilization. By default it shows a list of VSS, RSS, PSS and unique set size (USS) per process, sorted by VSS. Procrank is used with Dumpstate. It can be used to view the working-set size of each process and to reset the working-set counters. The two main variables to watch are PSS and USS. USS is the set of pages unique to a process, i.e. the amount of memory that would be freed if the application were terminated right now. PSS is the amount of memory shared with other processes, accounted so that the amount is divided evenly between the processes that share it; this memory would not be released if the process terminated, but is indicative of the amount the process is "contributing" to the overall memory load.
Merits:
• Displays a concise summary of process memory utilization.
Limitations:
• The data is not directly actionable and does not by itself give enough information to point out the exact issues.
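For illustration, the two invocations below show typical usage of the Dumpsys and Procrank entries above (the package name is hypothetical, and procrank generally requires a userdebug/eng build or root access):

```python
import subprocess

def adb_shell(*args: str) -> str:
    return subprocess.run(["adb", "shell", *args], check=True,
                          capture_output=True, text=True).stdout

# Memory report for a single application from the meminfo service.
print(adb_shell("dumpsys", "meminfo", "com.example.app"))

# Per-process VSS/RSS/PSS/USS summary, sorted by VSS by default.
print(adb_shell("procrank"))
```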

Proc and sysfs
Description: The proc and sysfs pseudo-filesystems are used to obtain vital system information and statistics such as memory, CPU, system interrupts, memory maps, power state and kernel parameters. The maps entry displays the memory map of a specific process.
Merits:
• Proc is easy to use and is a dumping ground for a whole range of system information.
• Sysfs exposes system information and control points from the kernel to user space.
Limitations:
• Proc is mainly used only to export information related to a process.

Smem
Description: Smem is a memory analyzer tool that provides several meaningful memory-usage reports. It gives the realistic memory usage of each application and library in a virtual memory system. (A usage sketch follows.)
Merits:
• Analyzes realistic memory usage of a system using the proportional set size (PSS) representation.
• Provides options for filtering, analyzing and visualizing the process data.
Limitations:
• Not compatible with older kernel variants.
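As a hedged host-side illustration (smem is a Linux tool and must be installed separately; the process filter below is a hypothetical regex):

```python
import subprocess

# -k prints human-readable size suffixes; -P filters processes by regex.
report = subprocess.run(["smem", "-k", "-P", "python"],
                        capture_output=True, text=True)
print(report.stdout)   # columns include PSS, the proportional set size
```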

addr2line
Description: Addr2line is a GNU development tool which converts addresses into file names and line numbers. Given input addresses and an executable, it resolves information from core files to point to the source file and the line at which the program stopped executing. For example, the program-counter address from a stack crash can be converted into the exact file name and line number, making it easy to figure out the exact line of code that caused the crash. (A sketch follows.)
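Since the example figure referenced here did not survive extraction, the sketch below is a hedged reconstruction of the idea; the crash address, library path and NDK toolchain prefix are all hypothetical and vary with the target ABI.

```python
import subprocess

# Resolve a crash program counter to function, file and line.
# -C demangles C++ symbols, -f prints the function name, -e names the binary.
out = subprocess.run(
    ["arm-linux-androideabi-addr2line", "-C", "-f",
     "-e", "obj/local/armeabi/libnative-lib.so", "0x0001f2a4"],
    capture_output=True, text=True)
print(out.stdout)   # e.g. a function name and "source.cpp:line" if symbols exist
```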

12. Android Profiling/Benchmarking
Benchmarking is the method of assessing the performance characteristics of the system's hardware and software components. In an embedded system, benchmarks are especially important when designing the CPU, GPU and rendering engine. For example, a display benchmark extracts output by measuring the number of frames per second; interpreting the benchmarking data shows where to look to improve the system's rendering frame rates. It also helps vendors keep to industry-standard benchmarks. Most of the available performance test suites on the Android platform are based on existing Linux test programs. Various criteria, such as CPU and memory utilization, rendering, I/O and file system, network usage, and virtual-machine performance, can be used to evaluate device performance. Benchmarking apps are useful for measuring the performance of a device and comparing the scores with similar devices. The 3D benchmarking tools usually report their scores in frames per second. The time taken to draw one frame on the screen is usually taken to be the time of the Draw() method, or equivalent, that draws the full set of graphics on the screen once, after doing all the manipulations (such as scaling by a factor or translating) on the vertices of the scene.
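The FPS statistics that tools such as Fps2D report (average and standard deviation of the frame rate) can be computed from per-frame draw durations as in this small sketch; the frame-time list is hypothetical.

```python
import statistics

def fps_stats(frame_times_s):
    """Per-frame Draw() durations in seconds -> (mean FPS, stddev of FPS)."""
    fps = [1.0 / t for t in frame_times_s if t > 0]
    return statistics.mean(fps), statistics.stdev(fps)

mean_fps, sd_fps = fps_stats([0.016, 0.017, 0.015, 0.033, 0.016])
print(f"average: {mean_fps:.1f} fps, stddev: {sd_fps:.1f}")
```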


13. Benchmarking Factors
Exceptional device performance efficiency and a better user experience are the two critical aspects in the creation of any consumer device. Table 3 contains criteria that can be used to evaluate device performance on the Android platform. These criteria serve as a starting point for identifying a representative set of performance indices and the corresponding test tools for evaluating the overall performance of a device. Some of the criteria in the table are not explicitly measured by the existing benchmark tools.

Table 3. Benchmark criteria (module / subsystem / metrics)

System processing capability
• Memory: stretching memory; copy and add; MFLOPS/sec.
• CPU: operation time; floating-point performance; integer performance.

Rendering
• 2D graphics: frames-per-second score; standard deviation.
• 3D graphics: fill rate; triangle throughput; lighting; texturing.

Audio/video playback and recording
• Audio/video playback: render and record different video/audio formats; memory, power and CPU usage.
• Audio/video recording: record different formats for a fixed duration and compute the encoding time; record a large video/audio file and compute the time required; capture voice for a fixed length and compute the encoding time.
• Image: record different formats for a fixed duration and compute the encoding time; record a large image file and compute the time required.

I/O
• File system: file creation and deletion time; data read/write operation time.
• SD card: data read/write operation time; SD card scanning (compute time needed, CPU utilization, and memory used).

Connectivity
• Network: download rate (average, max); upload rate (average, max).
• USB: download rate.
• Bluetooth: transfer rate in different modes over Asynchronous Connection-Less (ACL) and Synchronous Connection-Oriented (SCO) links.

Sensor
• GPS: GPS status; GPS signal (SNR); accuracy of a fixed location.
• Light: compute the accuracy of the light sensor at different light levels.
• Proximity: compute the accuracy and the reaction time of the proximity sensor at different proximity levels.

Miscellaneous
• Power consumption: measure or estimate power consumption for all benchmark test criteria.
• Stress tests: native-code prime calculations (duration, system load, errors).
• OS and JVM performance: gauge JVM performance (score).
• Shutdown/reboot/startup: compute the time needed.
• Switch devices on/off: compute the time needed.
• Gesture response: compute the time needed.
• Color temperature: record the color temperature.
• Heat generation: calibrate heat generation across all benchmark tests.

14. Android Benchmark/Profiling Tools Analysis

15. Oprofile
Oprofile is a tool for gathering statistical data on CPU utilization, which can be used to improve system performance. (Occasionally the Oprofile command hangs while automatically generating its results.) It provides:
• System-wide profiling
• Performance counter support
• Call-graph support
• Low overhead
• Post-profile analysis
• System support

16. Iperf
iPerf is an open-source network testing and performance measurement tool for any Linux environment, commonly used for Internet Protocol bandwidth measurement of TCP and UDP performance; it allows the tuning of various parameters and characteristics (used here only for testing Wi-Fi data-transfer performance). It can be used to (a usage sketch follows this list):
• Measure bandwidth, packet loss and delay jitter.
• Report MSS/MTU size and observed read sizes.
• Set the TCP window size via socket buffers.
• Run multi-threaded; client and server can have multiple simultaneous connections.
• Create UDP streams of a specified bandwidth from the client.
• Operate over multicast and IPv6.
• Accept options with K (kilo-) and M (mega-) suffixes.
• Run for a specified time rather than for a set amount of data to transfer.
• Pick the best units for the size of data being reported.
• Handle multiple connections on the server.
• Print periodic, intermediate bandwidth, jitter and loss reports at specified intervals.
• Run the server as a daemon.
• Use representative streams to test how link-layer compression affects the achievable bandwidth.
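A hedged usage sketch with classic iperf2 flags; the device IP address is hypothetical, and an iperf binary must exist on the device for the server side (it is not part of the stock image):

```python
import subprocess

# Server side, started on the device through adb.
server = subprocess.Popen(["adb", "shell", "iperf", "-s"])

# Host-side clients: a 30 s TCP run with 5 s interval reports,
# then a UDP run at a requested 10 Mbit/s to measure loss and jitter.
subprocess.run(["iperf", "-c", "192.168.1.50", "-t", "30", "-i", "5"])
subprocess.run(["iperf", "-c", "192.168.1.50", "-u", "-b", "10M"])

server.terminate()
```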

17. 0xBenchmark
0xBenchmark is an open-source comprehensive benchmark suite that provides various criteria to measure 2D and 3D graphics, math computation, virtual machine and native performance. Math computation is evaluated using the Linpack (a numerical test of floating-point performance) and SciMark2 criteria. 2D graphics performance is analyzed using the parameters draw canvas, circles, rectangle, arc, image and text. For 3D graphics, four different benchmarks are used: OpenGL Cube, OpenGL Blending, OpenGL Fog and Flying Teapot; these benchmarks can run with OpenGL hardware acceleration turned on and then turned off. The Android Dalvik virtual machine's ability to run multiple VM instances efficiently is measured in terms of garbage collection and memory stretching. Native-level system performance can be measured with LibMicro and UnixBench. The scores and the measurement units are compatible with most other applications and tools.

18. MemBench
MemBench provides a micro-benchmark for memory-system performance through memory-stretching and add/copy tests. It is used to analyze memory performance when complex programs are executed by the CPU, and it calculates the average rate at which data is copied or added to memory.

19. NBench
NBench is a reliable benchmarking framework which measures memory efficiency and CPU integer and floating-point performance by executing various jobs; higher numbers represent better results. Performance is evaluated across different hardware and software architectures by running tasks such as numeric sort, string sort and Huffman compression.

20. Android Benchmark
Android Benchmark is a composite tool that measures the performance of the CPU, graphics rendering, system memory and file systems. It also reports the amount of data copied to memory per second along with a score. The total scores and the measurement units of Android Benchmark are not compatible with other benchmark outputs.

21. LinPack
The Linpack benchmark is based on the library that performs numerical linear algebra computations. It measures how fast a computer solves dense systems of linear equations, i.e. the system's floating-point computing power, and the results are given in MFLOPS for easy comparison across devices. (A worked sketch of the metric follows.)
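The MFLOPS figure follows from the standard LINPACK operation count for solving a dense n×n system, roughly (2/3)n^3 + 2n^2 floating-point operations. The sketch below times NumPy's dense solver the same way; the array size is arbitrary, and this illustrates the metric rather than reproducing the benchmark itself.

```python
import time
import numpy as np

n = 1000
a = np.random.rand(n, n)
b = np.random.rand(n)

t0 = time.perf_counter()
x = np.linalg.solve(a, b)                 # LU factorization plus triangular solves
elapsed = time.perf_counter() - t0

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2   # classic LINPACK operation count
print(f"{flops / elapsed / 1e6:.1f} MFLOPS")
```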

22. Fps2D
Fps2D is a graphical benchmark tool that measures 2D graphics performance in terms of frames per second. Its output includes the average, the log10 histogram and the standard deviation of the frame rate.

23. An3D
An3D reports the fill rate in pixels per second and several frames-per-second measurements. The measurements are compatible with other graphics measurement applications.

24. CaffeineMark
CaffeineMark is a series of tests used to measure the speed of Java programs running on various hardware and software configurations. It measures the number of Java instructions executed per second and, on Android, the Dalvik VM performance.


25. GPS Test
GPS Test measures the signal-to-noise ratio (SNR) and satellite information, as well as the current location and the time required to obtain a reading from the GPS engine.

26. Quadrant Standard
Quadrant Standard measures the performance of the CPU, memory, I/O, and 2D and 3D graphics.

27. SDCard Benchmark
SDCard Benchmark measures SD card performance in terms of write and read speed in kBytes/sec. It also measures the write and read speed of the internal phone memory.

28. Speedy Android
Speedy Android measures download and upload rates (in kb/sec) for Wi-Fi. It provides upload and download statistics, network and GPS information, and map access.

29. GPU Bench
GPU Bench is an OpenGL ES 2.0 benchmarking utility that analyzes vertex and fragment shader performance. It tests graphics in two modes.
Absolute mode:
• The drawing operations are done in a 256×256 texture.
• That texture is drawn on screen in a 256×256 rectangle.
• In the current version, this mode appears to be full-screen graphics.
Relative mode:
• The drawing operations are done in a texture containing a rectangle of size width/2 × height/2.
• That texture is drawn on the screen, so the graphics fit at the centre of the screen in a rectangular canvas.
This test can currently only run when OpenGL hardware acceleration is enabled, since OpenGL ES 2.0 is not supported by the software implementation.

30. Nenamark
Nenamark is an OpenGL benchmarking tool that tests the OpenGL ES 2.0 implementation. It appears to use computationally intensive operations such as lighting and shadows, measures different graphics performance scenarios, and returns a frames-per-second score that is very useful for comparison purposes.

31. GLBenchmark
GLBenchmark evaluates 3D and CPU performance with different tests:
• Low-level 3D performance tests evaluate triangle throughput, lighting, texturing, fill rate and rendering quality.
• CPU tests evaluate floating-point performance and integer performance.
• System tests report the OpenGL ES environment, the EGL environment, the CPU vendor, clock rate and architecture, and the OS platform, vendor and version.

32. Stability Test
Stability Test is a strong, scalable stress test of the CPU and GPU that generates a report.

33. Performance Test Results and Analysis
Performance tests on the various Android modules were performed using the tools above; the results are captured in Table 4.

Table 4. Benchmark evaluation results (module / tools used / result)

Memory
• 0xBenchmark: Stretching memory, binary tree of depth 16: total memory 8265696 bytes, free memory 5158112 bytes. Creating a long-lived binary tree of depth 14 and a long-lived array of 125000 doubles: total memory 8396768, free memory 3572968. Create 37448 trees of depth 2: top-down 1767 ms, bottom-up 1910 ms; 8456 trees of depth 4: top-down 2090 ms, bottom-up 1858 ms; 2064 trees of depth 6: top-down 1799 ms, bottom-up 1996 ms; 512 trees of depth 8: top-down 1850 ms, bottom-up 1851 ms.
• MemBench: total memory 8396768, free memory 3519008; completed in 17358 ms. Copy: 5.29, 5.26, 4.69, 5.26 MB/sec; Add: 5.24, 5.23, 5.24, 5.23 MB/sec.
• Android Benchmark: total score 181.66; copy memory 165.07 MB/sec.

CPU
• LinPack: 3.89 MFLOPS/sec; time 21.51 sec.
• 0xBenchmark: LINPACK 3.77 MFLOPS/sec, norm res 1.71, precision 2.22E-16. SCIMARK2 composite 3.87: FFT 2.72, Jacobi 7.46, Monte Carlo integration 0.82, sparse matrix multiply 3.73, dense matrix LU factorization 4.61.
• Android Benchmark: total score 239.77; MFLOPS double/single precision 4.54 / 5.03.
• NBench (algorithms: numerical sort, string sort, bitfield, floating-point emulation, Fourier, etc.): duration 10 min. Bytemark results: integer index 5.454, floating-point index 0.407. Linux data: memory index 1.002, integer index 1.712, floating-point index 0.226.

2D graphics
• Android Benchmark: total score 218.29; draw opaque/transparent bitmap 97.52 / 35.99 MPixels/sec.
• Fps2D (bouncing-ball animation): average 61 FPS, stddev 10.96, over 1000 iterations.
• 0xBenchmark (average FPS): draw canvas 60.66; circle 49.00; rectangle 28.66; arc 47.50; image 45.50; text 32.33.

3D graphics
• 0xBenchmark, OpenGL HW (average FPS): Cube 68.5, Blending 68.5, Fog 69.18, Flying Teapot 28.98. OpenGL SW 1.1 (average FPS): Cube 61.66, Blending 21.52, Fog 47.61, Flying Teapot 62.66.
• NenaMark: 32 FPS.
• GLBenchmark: the tests ran satisfactorily, with visually good results.
• GPU Bench: absolute mode score 22258; relative mode score 24401.

Input/output subsystem
• File system (Android Benchmark): total score 45.20; creation of 1000 empty files 11.10 sec; deletion of 1000 empty files 12.40 sec; write/read of 1 MB to/from a file 2.44 / 88.50 MB/sec.
• SD card (SD Card Benchmark): write 35.30 kB/sec; read 37.45 kB/sec.

Connectivity
• Network (Speedy Android): download rate max 2370.9 kbps, mean 1554.2 kbps; upload rate mean 769.7 kbps.

Dalvik virtual machine
• OS and JVM performance (CaffeineMark): overall score 687.

Miscellaneous
• GPS (GPS Test): the reported information can be used to measure the accuracy of a fixed location by manually comparing the outputs.
• Stress tests (Stability Test): duration 30 min; BogoMIPS 599; system load > 8.5.

The following are the sample snapshots of a test run using different applications.

34. Memory and CPU Test Snapshots
Fig. 2 (A) Memory test (0xBenchmark, MemBench); (B) CPU test (0xBenchmark, Linpack)

35. Graphics Test Snapshots
Fig. 3 Screenshots of graphics test output (0xBenchmark, Fps2d, Neocore; 0xBench Flying Teapot with OpenGL HW and SW rendering)


36. Benchmark Test Snapshots
Fig. 4 Benchmark test: (A) SD card benchmark; (B) Dalvik VM benchmark

37. Network Test Snapshot (Speedy Android)
Fig. 5 Network test output

38. Performance Tools Limitations
Some performance tools evaluate device performance according to a single criterion, or at best a handful of criteria, and make no attempt to process the results into a form suitable for multi-objective ranking. Some tools do not explicitly specify their performance indices, basic assumptions and limitations. Since animation sequences depend on the application itself, FPS scores should not be compared across different applications.


39. Conformance Test Tools
CTS is the Compatibility Test Suite for Android. It is open-sourced under the Apache Software License 2.0, and its latest version is available for download from Google. Google runs a web-based compatibility service to certify the CTS reports generated by device manufacturers. CTS certification is compulsory if a device manufacturer wishes to use the Android brand and to license the Android Market application, another service run by Google. In other words, Google gives its approval stamp to a device only after seeing evidence that the device fulfills all the requirements listed in the specifications. CTS also acts as a control mechanism put in place by Google to ensure that application developers are always developing for a unified Android platform. It is an automated test framework that covers a wide range of tests, 21,000 to date, for source (API signature) and binary (API behavior) compatibility. CTS performance testing covers only application launch, but CTS can be extended to run internal tests during the platform development cycle.
CTS comprises a test framework with support for creating test plans, parsing the test results into HTML-friendly files, and individual tests that run within the framework. It runs on the host machine, pushing the test APK files onto the device and recording the results back on the host in a browser-friendly format. The tests run as Android application packages on the device, and additional application packages can be added to the test harness to exercise high-level, platform-specific features. CTS takes advantage of the JUnit framework, deriving from its existing classes for test cases, test suites and test runners; JUnit also provides functions to set up the context before running the test cases. The Android API includes the test package that can be used for high-level testing. The Android test package takes the following scenarios into account:
• Application use cases (categorized as application instrumentation)
• Application activities
• Application start-up time (categorized as performance testing)
• Service use cases
CTS tests include:
• Signature tests
• Platform API tests
• Dalvik VM tests
• Platform data model
Types of test cases:
• Unit tests test atomic units of code within the Android platform, e.g. a single class such as java.util.HashMap.
• Functional tests test a combination of APIs together in a higher-level use case.
• Reference application tests instrument a complete sample application to exercise a full set of APIs and Android runtime services.
• Robustness tests test the durability of the system under stress.
• Performance tests test the performance of the system against defined benchmarks, for example rendering frames per second.

Fig. 6 CTS test summary


Figure 6 and Figure 7 illustrate the test summary of each package and the test-case result summary of a single package.
Issues:
• CTS runs Java tests only, given that it is derived from JUnit.
• It is costly to integrate low-level testing for device drivers, given the heavy JNI work and the code needed to parse the test results into a format understood by CTS. CTS is a test framework for application compatibility; it has not been designed to test low-level system performance.

Fig.7 CTS Package Test

40. Conclusion
With the emergence of many mobile platforms and the growing market demand for the Android operating system, vendors are forced to create new, dynamic and innovative solutions. Considering the complexity involved in the development of modern use cases and business contexts, appropriate and efficient tools are required for testing. In this paper, we have presented a classification and deep analysis of most of the Android tools and test frameworks that are widely used for system testing and real performance measurement. The paper also lists the merits and limitations of each tool, so that software developers, system integrators and testers working on Android can benefit from the analysis and pick the right methodologies based on their needs and infrastructure. The performance measurements of the various subsystems show the system behavior under the available software and hardware capabilities. This paper is also useful to those who are new to Android development practice and software testing.

REFERENCES
[1]. IEEE Standard for Software Unit Testing, IEEE Std. 1008-1987.
[2]. "International Software Testing Qualifications Board (ISTQB)", http://www.istqb.org, Oct 20, 2016.
[3]. A. Sangiovanni-Vincentelli and G. Martin, "Platform-based Design and Software Design Methodology for Embedded Systems", IEEE Design & Test of Computers, vol. 18, pp. 23-33, Nov-Dec 2001.
[4]. G. Rothermel and M.J. Harrold, "Analyzing Regression Test Selection Techniques", IEEE Transactions on Software Engineering, vol. 22, no. 8, pp. 529-551, 1996.
[5]. S.P. Ng, T. Murnane, K. Reed, D. Grant, T.Y. Chen, "A Preliminary Survey on Software Testing Practices in Australia", IEEE Australian Software Engineering Conference, pp. 116-125, April 2004.
[6]. Q. Yang, J.J. Li, D.M. Weiss, "A Survey of Coverage-Based Testing Tools", The Computer Journal, vol. 52, no. 5, pp. 589-597, August 2009.
[7]. M. Prasanna, S.N. Sivanandam, R. Venkatesan, R. Sundarrajan, "A Survey of Automatic Test Case Generation", Academic Open Internet Journal (AOIJ), vol. 15, 2005.
[8]. B. George and L. Williams, "An Initial Investigation of Test Driven Development in Industry", Proceedings of the 2003 ACM Symposium on Applied Computing, pp. 1135-1139, Melbourne, FL, 2003.


[9]. D. Zhang, B. Adipat, "Challenges, Methodologies, and Issues in the Usability Testing of Mobile Applications", International Journal of Human-Computer Interaction, vol. 18, pp. 293-308, 2005.
[10]. D. Amalfitano, A.R. Fasolino, P. Tramontana, S. De Carmine, G. Imparato, "A Toolset for GUI Testing of Android Applications", 28th IEEE International Conference on Software Maintenance (ICSM), pp. 23-28, Sept. 2012, Trento.
[11]. "Android Developer Testing Fundamentals", http://developer.Android.com/tools/testing/testing-Android.html#Instrumentation, Oct 20, 2016.
[12]. "Emma, a free Java coverage tool", http://emma.sourceforge.net, Oct 20, 2016.
[13]. A. Memon, Q. Xie, "Studying the Fault-Detection Effectiveness of GUI Test Cases for Rapidly Evolving Software", IEEE Transactions on Software Engineering, vol. 31, no. 10, pp. 884-896, Oct. 2005.
[14]. T. Takala, M. Katara, J. Harty, "Experiences of System-Level Model-Based GUI Testing of an Android Application", Fourth IEEE International Conference on Software Testing, Verification and Validation, pp. 377-386, 21-25 March 2011, Berlin.
[15]. C. Kaner, J. Bach, B. Pettichord, Lessons Learned in Software Testing: A Context-Driven Approach, Wiley, 2001.
[16]. N. Nyman, "Using Monkey Test Tools", Software Testing and Quality Engineering Magazine, vol. 29, no. 2, pp. 18-21, 2000.
[17]. uTest, Inc., "The Essential Guide to Mobile App Testing", http://www.utest.com/resources#ebooks, March 12, 2016.
[18]. Y.-J. Kim, S.-J. Cho, K.-J. Kim, E.-H. Hwang, S.-H. Yoon, J.-W. Jeon, "Benchmarking Java Application Using JNI and Native C Application on Android", 12th International Conference on Control, Automation and Systems (ICCAS), pp. 284-288, JeJu Island, 17-21 Oct. 2012.
[19]. L. Batyuk, A.-D. Schmidt, H.-G. Schmidt, A. Camtepe and S. Albayrak, "Developing and Benchmarking Native Linux Applications on Android", Proceedings of the 2nd International Conference on Mobile Wireless Middleware, Operating Systems, and Applications (Mobilware 2009), pp. 381-390, April 28-29, 2009.
[20]. J. Park, B. Choi, "Automated Memory Leakage Detection in Android Based Systems", International Journal of Control and Automation, vol. 5, no. 2, June 2012.
[21]. S. Lee, J. Jeon, "Evaluating Performance of Android Platform Using Native C for Embedded Systems", International Conference on Control, Automation and Systems, pp. 1160-1163, 27-30 Oct. 2010.
[22]. A. Rajgarhia, A. Gehani, "Performance and Extension of User Space File Systems", ACM Symposium on Applied Computing, pp. 206-213, 2010.
[23]. O. Starov and S. Vilkomir, "Integrated TaaS Platform for Mobile Development", 8th International Workshop on Automation of Software Test (AST), pp. 1-7, May 2013.
[24]. K. Inçki, I. Ari and H. Sozer, "A Survey of Software Testing in the Cloud", IEEE Sixth International Conference on Software Security and Reliability Companion, pp. 18-23, 20-22 June 2012.
[25]. W. Tsai, X. Chen, L. Liu, Y. Zhao, L. Tang, and W. Zhao, "Testing as a Service over Cloud", IEEE International Symposium on Service Oriented System Engineering, pp. 181-188, 4-5 June 2010, Nanjing.
[26]. "Google App Engine Developers portal", https://developers.google.com/appengine, March 14, 2016.
