Implementing Software Test Management Using SpiraTeam Tool

2010 Fifth International Conference on Software Engineering Advances

Ahmed Ibrahim Safana, Suhaimi Ibrahim
Centre for Advanced Software Engineering, Universiti Teknologi Malaysia, Kuala Lumpur, Malaysia
[email protected], [email protected]

Abstract— SpiraTeam is a powerful test management tool for handling the software test process. It assists the test manager in planning and managing test activities such as project planning, releases, and iterations, and it provides the test manager with the ability to identify incidents and manage test cases, requirements, and the associated traceability. The main problem in software testing management research is how to effectively manage the whole software test process in a single environment. The Software Test Process (STP) is characterized by a decay in the number of remaining errors and in failure intensity, and by an increase in the code coverage of the tested software. It is very important for a manager to decide how to spend testing resources in order to develop quality, reliable software. This work considers several software testing management problems, such as requirement traceability, test coverage, and the control of testing resources, that is, how to spend the allocated testing resources within the expected time. This research applied SpiraTeam to manage the tests of a medium-sized embedded real-time system called OBA. The paper reports and demonstrates the capabilities of SpiraTeam in managing test activities and its contribution to test management.

Keywords- software; test management tool; SpiraTeam; On-Board Automobile System; Agile approach.

I. INTRODUCTION

Generally, in every software development project, the most important stages are specification, design, coding, and testing. In the final phase, software testing becomes more complicated. A software system is tested to detect and correct latent software errors in three successive stages: module testing, integration testing, and system testing. Each software module is tested independently during module testing, and the interfaces among these modules are tested during integration testing. The last stage, system testing, covers the whole system, including the operating system and the other hardware and software. The environment in which the software undergoes all of these tests is crucial, as all the outputs need to be captured and recorded in a database repository. During the testing phases, considerable development resources are consumed to detect and correct latent errors [1]. As software testing techniques continue to grow, the management and control of the testing process have become increasingly important. The difficulties faced in early planning for testing cause many problems for the test management processes. When the test processes are carefully planned at an early stage of system development, this serves as a key factor in project success. An early understanding of what the project has to accomplish helps the project manager plan early. Without this, it is hard to project how the testing process is to be managed and controlled, to estimate the resources needed, and to plan for applying the system [2, 3].

The large amount of data and the versioning of software test processes make test management very difficult. The testing process is a constant checking of one developed item against another (e.g., reviewing the design specification against the requirements). It involves enormous amounts of data, and this data is in a state of constant change, so that a number of versions of every data item are produced. Therefore, project managers and testers usually need to determine the state of the testing process and compare the data produced. This can be controlled only if the status of each data element is known at all times [4].

Project management is regarded as a system of management, technologies, procedures, practices, experience, and skills; successful management of any engineering project requires experience. Software project management is an umbrella activity within software engineering. It is initiated before any technical activity begins and continues throughout the stages of development until maintenance [5]. Generally, testing management is a broad term which can include management of the testing process, management of test data, and management of the test organization and resources [6, 7, 8]. SpiraTeam focuses on supporting the management and control of the test data (e.g., requirements, test cases, and incidents) produced in a project development cycle. Its support covers all levels of software testing, including unit testing, integration testing, and system testing, as well as regression testing [9, 10].

II. TEST MANAGEMENT TOOLS

A brief analysis was made by the authors to find the best test management tool to handle the exercise. During the analysis, the selected tools were tested by different test managers on the same project under different environments. The results gathered after several consultations lead to the comparison in Table 1.

Table 1: Comparison of test management tools.

Quality Center. Strength: very popular testing tool, good support, good cross-browser support, good online communication. Weakness: no Java support out of the box; a more expensive solution.

Rational Test Manager. Strength: economical, lots of supporting tools, good extensible scripting language through VB, good data creation facilities, good online communication. Weakness: the scripting language is a little basic out of the box, and a bit confusing with the tools.

SilkCentral Test Manager. Strength: built-in recovery system for unattended testing; ability to test across multiple platforms, browsers and technologies in one script. Weakness: expensive; no object identity tool is available.

Wip-CAFÉ TMS. Strength: customized templates for test specifications, defects and test cases; the tool supports scripting design management. Weakness: no object identity tool available; some limitations in the scripts used for automation.

SpiraTeam. Strength: customizable dashboard, unlimited users, real-time access, variety of report formats. Weakness: less support for a wider range of operating systems.

SpiraTeam was found to be suitable based on the analysis made. Some of the reasons for selecting SpiraTeam as the best tool for handling this project include:
1. The ability to manage the test cases and the requirements separately, and then to map each requirement to its related test cases.
2. When an incident or fault is detected, the tool allows the tester to record it against the requirement it was traced from.
3. It generates a requirement traceability report.
4. It can generate reports in different formats, for example Excel and Word.
5. It can migrate data to and from many test run tools.
6. It can integrate with many test run tools.
7. It can display the whole information of a given project in a snapshot view called the dashboard.

The remaining test management tools lack many of the features mentioned above. Some of the tools cannot manage a complete project in a single environment, unlike SpiraTeam, which provides the test manager with a complete test management environment.

III. SPIRATEAM

SpiraTeam is a test management tool with an integrated Application Lifecycle Management system. It manages the project requirements, releases, test cases, issues and tasks in one unified environment. It contains all the features provided by SpiraTest, a highly acclaimed quality assurance system, and SpiraPlan, an agile-enabled project management solution. It can work in conjunction with a variety of automated unit and functional testing tools. Some of the features of this tool are described below.

IV. FEATURES OF SPIRATEAM

A. Requirement Management

Using SpiraTeam, a tester has the ability to create, edit and delete project scope items or requirements in a hierarchical organization that resembles a typical scope matrix. Importance levels range from critical to low, and a status identifier designates where the requirement is in the development lifecycle (requested, planned, in progress or completed). In addition, each requirement is mapped to one or more test cases that can be used to validate that the functionality works as expected. This mapping is called the "Requirement Test Coverage".
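The requirement test coverage described above is essentially a many-to-many mapping between requirements and test cases. The following minimal Python sketch illustrates one way such records could be modelled; the class, field and ID names are illustrative assumptions, not SpiraTeam's actual data model or API.

```python
from dataclasses import dataclass, field
from enum import Enum


class Importance(Enum):
    CRITICAL = 1
    HIGH = 2
    MEDIUM = 3
    LOW = 4


class ReqStatus(Enum):
    REQUESTED = "requested"
    PLANNED = "planned"
    IN_PROGRESS = "in progress"
    COMPLETED = "completed"


@dataclass
class Requirement:
    req_id: str
    title: str
    importance: Importance
    status: ReqStatus
    # IDs of the test cases mapped to (i.e. covering) this requirement
    covering_tests: set[str] = field(default_factory=set)

    @property
    def is_covered(self) -> bool:
        """A requirement is covered when at least one test case is mapped to it."""
        return bool(self.covering_tests)


# Example: map one (invented) requirement to two (invented) test cases
req = Requirement("RQ-001", "Display vehicle speed", Importance.CRITICAL, ReqStatus.PLANNED)
req.covering_tests.update({"TC-010", "TC-011"})
print(req.is_covered)  # True
```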


B. Test Case Management

The test cases of the project can be managed by creating, editing and deleting them. The test cases are stored in a hierarchical folder structure. Each test case consists of a set of test steps that represent the individual actions a user must take to complete the test. These test steps also contain a description of the expected result and any sample data elements that the tester should use when performing the action. When a user executes a test case, the results are stored in a test run that contains the success/failure status of each test step as well as the actual result that the tester observed.
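The hierarchy just described (a test case made of test steps, with outcomes recorded in a test run) can be sketched as follows. This is an illustrative Python model under assumed names, not SpiraTeam's schema; the status values echo those mentioned later in the paper (pass, fail, blocked, caution), and the roll-up rule in overall_status is one plausible interpretation.

```python
from dataclasses import dataclass, field
from enum import Enum


class StepStatus(Enum):
    NOT_RUN = "not run"
    PASSED = "passed"
    FAILED = "failed"
    BLOCKED = "blocked"
    CAUTION = "caution"


@dataclass
class TestStep:
    description: str        # action the tester must take
    expected_result: str    # what should be observed
    sample_data: str = ""   # optional data element to use


@dataclass
class TestCase:
    case_id: str
    name: str
    steps: list[TestStep] = field(default_factory=list)


@dataclass
class TestRun:
    case: TestCase
    # one recorded outcome per step: (status, actual observed result)
    results: list[tuple[StepStatus, str]] = field(default_factory=list)

    def overall_status(self) -> StepStatus:
        """A run fails if any step failed, is blocked if any step was blocked, etc."""
        statuses = [status for status, _ in self.results]
        if StepStatus.FAILED in statuses:
            return StepStatus.FAILED
        if StepStatus.BLOCKED in statuses:
            return StepStatus.BLOCKED
        if StepStatus.CAUTION in statuses:
            return StepStatus.CAUTION
        return StepStatus.PASSED if statuses else StepStatus.NOT_RUN


# Invented example run with a single step
steps = [TestStep("Turn ignition on", "Dashboard lights up")]
run = TestRun(TestCase("TC-010", "Start-up check", steps),
              results=[(StepStatus.PASSED, "Dashboard lit up as expected")])
print(run.overall_status())  # StepStatus.PASSED
```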


C. Release Planning

Different versions/releases of the project can easily be tracked. Each project in the system can be decomposed into an unlimited number of specific project releases, each with a name and a version number attached. Requirements and test cases developed during the design phase can then be assigned to these different releases. When a tester executes a series of test cases, they are able to choose the version of the project being tested, and the resulting test run information is then associated with that release.

D. Iteration Planning

The individual iterations that comprise a release can also be tracked, which gives the project manager the option to manage agile-methodology projects within the SpiraTeam environment. Unlike the release planning stage, where high-level requirements are estimated and scheduled, the iteration planning phase involves assigning each of the tasks in the project backlog to a specific iteration until the available effort in the iteration has been completely allocated, as sketched below.
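The iteration planning rule described in Section D, assigning backlog tasks to an iteration until its available effort is used up, amounts to a simple greedy allocation. A hedged sketch follows; the task names and effort figures are invented for illustration.

```python
def plan_iteration(backlog, available_effort_hours):
    """Assign tasks from the backlog (assumed ordered by priority) to the iteration
    until the remaining effort can no longer accommodate a task."""
    assigned, remaining = [], available_effort_hours
    for task_name, effort in backlog:
        if effort <= remaining:
            assigned.append(task_name)
            remaining -= effort
    return assigned, remaining


# Hypothetical OBA backlog: (task, estimated effort in hours)
backlog = [("Implement speed sensor driver", 16),
           ("Write dashboard test cases", 8),
           ("Fix braking-incident defect", 12),
           ("Update traceability report", 4)]

assigned, left_over = plan_iteration(backlog, available_effort_hours=30)
print(assigned)    # tasks that fit into the 30-hour iteration
print(left_over)   # unused effort, if any
```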


E. Incident Tracking

The system provides the ability to create, edit, assign, track, manage and close incidents. Incidents are raised during the testing of the software system under development. These incidents can be categorized into bugs, enhancements, issues, training items, limitations, change requests and risks, and each type has its own specific workflow and business rules. Typically, each incident is raised initially as a 'New' item of type 'Incident'. Following the review by the project manager and the customer, it is changed to one of the other specific types, given a priority (critical, high, medium or low), and its status is changed to 'Open'. Once it is assigned to a developer for fixing, its status is changed to 'Assigned'. The developer then works to correct the incident, after which its status changes to 'Fixed' or 'Not Reproducible', depending on the actions taken.
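The incident lifecycle described above is, in effect, a small state machine. The sketch below shows how such a workflow could be checked programmatically; the transition table is inferred from the text (including the reopening rule mentioned later for the OBA risk incidents) and is not SpiraTeam's actual workflow engine.

```python
# Allowed status transitions, inferred from the workflow described in the text.
INCIDENT_WORKFLOW = {
    "New": {"Open"},
    "Open": {"Assigned"},
    "Assigned": {"Fixed", "Not Reproducible"},
    "Fixed": {"Closed", "Open"},             # reopened if a retest proves the fix wrong
    "Not Reproducible": {"Closed", "Open"},
    "Closed": {"Open"},                      # reopened when a test proves otherwise
}


def transition(current_status: str, new_status: str) -> str:
    """Return the new status if the transition is allowed, else raise an error."""
    if new_status not in INCIDENT_WORKFLOW.get(current_status, set()):
        raise ValueError(f"Illegal transition: {current_status} -> {new_status}")
    return new_status


status = "New"
status = transition(status, "Open")      # reviewed by project manager and customer
status = transition(status, "Assigned")  # handed to a developer
status = transition(status, "Fixed")     # defect corrected
```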



F. Artifact Relationships

Various artifacts are maintained in the SpiraTeam test management environment. These include projects, users, requirements and tests. To aid in understanding how this information is related, Figures 1 and 2 below illustrate the relationships between the different artifacts and entities in SpiraTeam.

Figure 1: The main entities of a SpiraTest project.

Figure 2: The relationships between the various SpiraTest entities.

V. CASE STUDY

The On-Board Automobile system (OBA) is the case study of this research. The Driving Assistance System (DAS) is defined as a potential study of the vehicle of the near future. It is also a general name given to a system controlled by a device called the "Safe Drive" (SD). The Safe Drive consists of a processor and the On-Board Automobile (OBA). The OBA couples the computer software with the control panel on the mechanical components and is intended to improve the safety of vehicle driving, especially over long trips on a motorway, and to provide luxury to the drivers. Figure 3 shows the subsystem components of the OBA. The software specification and requirements provide a way of writing a software test plan, and this test plan was used in conducting the ad-hoc test process.

Figure 3: Subsystem components of OBA

VI. APPLYING SPIRATEAM ON THE ON-BOARD AUTOMOBILE SOFTWARE

Requirement management was the first activity after getting the environment ready. The OBA requirements were imported into SpiraTeam using the Excel Importer. Figure 4 shows the requirement summary of the OBA software in SpiraTeam; the requirements are distributed based on their importance and development status.

Figure 4: Requirement summary
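The requirement summary in Figure 4 groups the imported requirements by importance and development status. A summary of that kind can be recomputed from exported requirement data with a simple count; the sample records below are invented for illustration.

```python
from collections import Counter

# Hypothetical excerpt of the OBA requirement list after import
requirements = [
    {"id": "RQ-001", "importance": "Critical", "status": "Planned"},
    {"id": "RQ-002", "importance": "High",     "status": "In Progress"},
    {"id": "RQ-003", "importance": "High",     "status": "Completed"},
    {"id": "RQ-004", "importance": "Low",      "status": "Requested"},
]

# Count requirements per (importance, status) pair, as in the summary view
summary = Counter((r["importance"], r["status"]) for r in requirements)
for (importance, status), count in sorted(summary.items()):
    print(f"{importance:8s} {status:12s} {count}")
```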

Test case management follows requirement management. The relationship between the test cases and the requirements is termed "Test Coverage": all of the requirements are mapped to one or more test cases. The requirement coverage window and chart are shown in Figures 5 and 7 below. The tester can identify the requirements left uncovered from the requirement coverage chart.
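Identifying the requirements left uncovered, as described above, reduces to listing the requirements with no mapped test cases. A minimal sketch with invented IDs:

```python
# Requirement -> set of mapped test case IDs (hypothetical OBA data)
coverage = {
    "RQ-001": {"TC-010", "TC-011"},
    "RQ-002": {"TC-012"},
    "RQ-003": set(),          # not yet covered by any test case
}

uncovered = [req for req, tests in coverage.items() if not tests]
print(uncovered)  # ['RQ-003'] -- these appear as gaps in the coverage chart
```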

Figure 5: Requirement Mapping

Figure 5 illustrates the mapping between the requirements and the test cases of the OBA. The left side of the window lists all the available test cases, while the right side lists the test cases selected for the current requirement. Selecting the check box of a test case and clicking on the "Add" button links the test case with the current requirement. A test run enables the tester to record the result of the tests on each requirement. The test run window displays the test steps entered during test case management and the possible statuses, which range from Pass, Pass All, Blocked, Caution and Fail to Pause. Figure 6 shows the test run steps; on each step, the tester can select pass, blocked, failed, caution or pause.

Figure 6: Requirement run

Figure 7: Requirement Coverage chart

The aggregate of the test steps conducted in Figure 6 is displayed in a graphical view in Figure 7. During the test run, any test result that is contrary to the expected result is recorded as a new incident. Every incident has a certain kind of workflow; the workflow shows the stages taken in solving the incident, from inception to closure. In this project, the following workflow was used to take care of risk incidents.

Figure 8: Incidents' workflow

Figure 8 shows the workflow of the steps taken in handling a risk incident. A detected risk incident is recorded as new; it is opened during the meeting between the stakeholders and the development team, assigned to be corrected, and closed when this is done. It is reopened when a test proves the fix wrong. Test execution status is one of the most important features of the SpiraTest management tool: the tester has the ability to keep the status of the tests based on the way they are executed. Figure 9 shows the test execution status of the On-Board Automobile software.


VII. EXPERIMENTAL RESULTS

The experimental results were produced in different formats, including HTML, MS-Excel, MS-Word and XML; the client can decide on the format in which the results are to be printed. At the end of the implementation, the results gathered included the requirement traceability linked with test case coverage, the test case traceability linked with requirements, the requirement coverage, and more. The tool also helps in planning the whole project using an Agile approach, although project planning is outside the scope of this paper. The full reports cannot be reproduced here because of their size and because their format does not conform to the paper format, but the graphical views that support the reports are shown in Figures 9 to 14 below.
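As a side note on the report formats mentioned above, a result summary of this kind can be serialized to spreadsheet- and XML-friendly files with a few lines of standard-library Python. This is only an illustration of the idea, not the tool's own export mechanism, and the field names are assumptions.

```python
import csv
import xml.etree.ElementTree as ET

# Hypothetical extract of OBA test results
results = [
    {"test_case": "TC-010", "requirement": "RQ-001", "status": "Passed"},
    {"test_case": "TC-012", "requirement": "RQ-002", "status": "Failed"},
]

# CSV export (opens directly in MS-Excel)
with open("oba_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["test_case", "requirement", "status"])
    writer.writeheader()
    writer.writerows(results)

# XML export
root = ET.Element("TestResults")
for record in results:
    ET.SubElement(root, "Result", attrib=record)
ET.ElementTree(root).write("oba_results.xml", encoding="utf-8", xml_declaration=True)
```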

Figure 9: Test Execution Status.

Figure 9 shows, in a graphical view, the OBA requirements that were not run, cautioned or passed during the test run. The incidents recorded during the test run can be summarized: the numbers of incidents found are summarized in tabular form with their status and priority, and the incident open count provides a graphical view of the incidents' progress, as shown in Figure 10 below.
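The tabular incident summary mentioned above can be reproduced from the raw incident list with a simple count per status (and, analogously, per priority). The records below are invented for illustration.

```python
from collections import Counter

# Hypothetical OBA incidents recorded during the test runs
incidents = [
    {"id": "IN-01", "status": "New",      "priority": "High"},
    {"id": "IN-02", "status": "Assigned", "priority": "Critical"},
    {"id": "IN-03", "status": "Fixed",    "priority": "Medium"},
    {"id": "IN-04", "status": "New",      "priority": "Low"},
]

by_status = Counter(i["status"] for i in incidents)
total = sum(by_status.values())
for status, count in by_status.items():
    print(f"{status:10s} {count:2d}  ({100 * count / total:.0f}%)")
```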


Figure 10: Incident Open Count.

Figure 10 illustrates the percentages of new incidents recorded and their status during the test run of the OBA project. Table 2 below outlines some of the benefits of using the tool compared with a manual system.

Table 2: Comparison of manual and SpiraTeam test management.

Manual test management: processes are ad-hoc and not repeatable across projects. SpiraTeam: tests are permanently recorded and reusable.

Manual: there is no visibility between test cases, requirements and defects. SpiraTeam: test cases, requirements and defects are linked together.

Manual: measuring progress and productivity during testing is time consuming and difficult. SpiraTeam: progress can be viewed in a graphical chart at a glance.

Manual: it is hard to share information across the project and to get real-time metrics on the quality of the system being tested. SpiraTeam: associativity between the processes and real-time results.

Manual: there is no central repository of test results from all sources. SpiraTeam: a central repository of all results.

Figure 11: Test run summary

Figure 11 displays the test run summary. In this version of the report, the x-axis represents the test run execution status, and the individual bars are grouped by test run type. Each data value can be viewed by positioning the mouse pointer over a bar, and a tooltip will pop up listing the actual value. Clicking on the "Display Data Grid" link displays the underlying data used to generate the graph.


Figure 12: Test run progress rate

Figure 12 shows the test run progress rate. The y-axis represents the number of test runs executed in each 24-hour period, and the x-axis represents a specific day in the time span. Each data bar can be inspected by positioning the mouse pointer over it, and a tooltip will pop up listing the actual value.

Figure 13: Incident progress rate

Figure 13 displays the total number of incidents created and closed over a particular date range. The y-axis represents the number of incidents and the x-axis represents a specific day in the time span.

Figure 14: Incident aging

Figure 14 is the incident aging chart; it displays the number of days incidents have been left open in the system. The y-axis gives the count of incidents and the x-axis gives the different age intervals. Each bar color represents a different incident priority, giving the project manager a snapshot view of the age of the open project incidents by priority.
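The incident aging view in Figure 14 can be thought of as bucketing open incidents by how long they have been open. A sketch of that computation, with invented incidents and arbitrary age intervals:

```python
from datetime import date

# Hypothetical open incidents with the date they were raised
open_incidents = [
    {"id": "IN-02", "opened": date(2010, 2, 1),  "priority": "Critical"},
    {"id": "IN-05", "opened": date(2010, 2, 20), "priority": "Medium"},
    {"id": "IN-07", "opened": date(2010, 3, 5),  "priority": "Low"},
]

AGE_BUCKETS = [(0, 7), (8, 30), (31, 90), (91, 10_000)]  # days open


def aging_report(incidents, today):
    """Count open incidents per (age interval, priority) pair."""
    report = {}
    for inc in incidents:
        age = (today - inc["opened"]).days
        for low, high in AGE_BUCKETS:
            if low <= age <= high:
                key = (f"{low}-{high} days", inc["priority"])
                report[key] = report.get(key, 0) + 1
                break
    return report


print(aging_report(open_incidents, today=date(2010, 3, 10)))
```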

VIII. CONCLUSION

In order to make the software test process flexible and changeable, this research recommended the use of an effective software testing management tool. The tool should be able to manage the whole test process in a single environment, and it should be able to migrate and integrate data with a wide range of data-processing software. Some of the shortcomings discovered in this tool include its inability to accommodate more than five users at the same time when running on the Windows Vista operating system, unlike Windows Server 2003. The tool is also more effective and responds more quickly when running on the Mozilla Firefox browser than on Google Chrome.

REFERENCES

[1] H. Ohtera and S. Yamada, "Optimal allocation and control of software testing resources," IEEE Transactions on Reliability, vol. 39, pp. 171-172, June 1999.
[2] L. Lui and D. J. Robson, "A support environment for the management of software testing," IEEE Conference on Software Engineering, 1992.
[3] N. Kicillof, W. Grieskamp, and V. Braberman, "Achieving both model and code coverage with automated gray-box testing," A-MOST '07, ACM, London, UK, July 2007, pp. 1-5.
[4] F. Li, W. M. Ma, and A. Chao, "Architecture centric approach to enhance software testing management," 2008 IEEE Eighth International Conference on Intelligent Systems Design and Applications, pp. 3-4.
[5] P. M. Kamde, V. D. Nandavadekar, and R. G. Pawar, "Value of test cases in software testing," IEEE International Conference on Management of Innovation and Technology, 2006, pp. 1-4.
[6] J. W. Cangussu, R. A. DeCarlo, and A. P. Mathur, "Monitoring the software test process using statistical process control: A logarithmic approach," ESEC/FSE '03, September 1-5, 2003, Helsinki, Finland, p. 3.
[7] Y. Jun-feng, Y. Shi, L. Ju-bo, X. Dan, and J. Xiang-yang, "Reflective architecture based software testing management model," 2006 IEEE International Conference on Management and Technology.
[8] H. Pieire, "Testing networking," vol. 39, no. 2, June 1990, p. 172.
[9] P. Farell-Vinay, "Manage software testing," Auerbach Publications, Taylor & Francis Group, vol. 2, Dec. 2008, pp. 39-89.
[10] I. Burnstein, "Practical software testing," Springer-Verlag, vol. 2, Dec. 2003, pp. 39-89.
[11] White paper, "Gain control of the chaotic software test management," Software Test Management, pp. 1-4.
[12] N. Palani, "Test management tool review," Wipro Technologies & Electronics, unpublished paper, pp. 3-6.
[13] Y. Shen and J. Liu, "Research on the application of data mining in software testing and defects analysis," Academy Publisher, pp. 15, 29.
[14] SpiraTest management tool, Inflectra Corporation [online]. Available at http://www.inflectra.com, retrieved on 20th February, 2010.
[15] Rational test management [online]. Available at http://www.aptest.com/atm2, retrieved on 10th March, 2010.
[16] SilkCentral Test documentation [online]. Available at www.01.ibm.com, retrieved on 5th March, 2010.
