OHJ-3060 / OHJ-3066 Software Testing
TIE-21200 Ohjelmistojen testaus

Master Test Plan

Periods I–II of the academic year 2013–2014
Tampere University of Technology

Version 8.1.0
VERSION HISTORY

Version  Date       Author              Changes
8.1.0    19.8.2013  Antti Jääskeläinen  Several more updates.
8.0.0    8.3.2013   Matti Vuori         Base version for academic year 2013–2014.
CONTENTS

VERSION HISTORY

1. INTRODUCTION
   1.1 The Project
   1.2 Overview
       1.2.1 Test Organisation
       1.2.2 System under Test
       1.2.3 Testing Levels
       1.2.4 Project Phases
       1.2.5 Project Execution Style
       1.2.6 Features to Be Tested and Not Tested
   1.3 Test Environment and Tools
       1.3.1 Operating System: Linux
       1.3.2 System under Test: RegexSearcher and jEdit
       1.3.3 Programming Environment: Eclipse
       1.3.4 Unit Test Framework: JUnit
       1.3.5 Coverage Measurement Tool: CodeCover
       1.3.6 Model-Based Testing Tools
       1.3.7 Other Test Automation Tools
   1.4 Structure of This Document

2. UNIT TESTING (PHASES 1 & 2)
   2.1 General
   2.2 Features to Be Tested
   2.3 Approach
   2.4 Pass / Fail Criteria
   2.5 Test Coverage
   2.6 Reporting
   2.7 Deliverables
       2.7.1 Phase 1
       2.7.2 Phase 2

3. SYSTEM TESTING (PHASES 3 & 4)
   3.1 General
       3.1.1 Phases
       3.1.2 Objectives
       3.1.3 Generic Test Strategy
       3.1.4 Generic Guidance
   3.2 Features to Be Tested
   3.3 Approach
       3.3.1 Approach A: Systematic, Planned Manual Testing
       3.3.2 Approach B: Exploratory Testing
       3.3.3 Approach C: Traditional Test Automation
       3.3.4 Approach D: Model-Based Testing
       3.3.5 Approach E: Combination of Approaches
   3.4 System under Test
   3.5 Prioritization
   3.6 Dependencies
   3.7 Pass / Fail Criteria
   3.8 Deliverables
       3.8.1 Phase 3
       3.8.2 Phase 4

4. DOCUMENTATION
   4.1 Instructions Provided by the Course
   4.2 The Documentation that the Teams Create
   4.3 Document Templates
   4.4 Other Documents

APPENDIX 1: SUBMITTING REPORTS AND OTHER FILES
APPENDIX 2: TOOL INSTRUCTIONS
APPENDIX 3: EXAMPLE OF JUNIT TEST CLASS AND SUITE
APPENDIX 4: TROUBLESHOOTING
1. INTRODUCTION
1.1 The Project

This document describes the course project for the software testing course at Tampere University of Technology, periods I–II of the academic year 2013–2014.

1.2 Overview
1.2.1 Test Organisation

Students work in teams of two. Each team is assigned an assistant who reviews the team's work and gives feedback on the execution of the project.

1.2.2 System under Test

The systems under test (SUTs) are RegexSearcher and jEdit. The former is a Java class for searching for regular expression matches in a body of text. The latter is an advanced programming text editor; see http://www.jedit.org/ for more details.

1.2.3 Testing Levels

During the project, RegexSearcher is tested at code level (unit testing) and jEdit at UI level (system testing).

1.2.4 Project Phases

The course project has four phases:
- Phase 1: Planning of unit testing
- Phase 2: Execution and reporting of unit tests
- Phase 3: Planning of system testing
- Phase 4: Execution and reporting of system tests

1.2.5 Project Execution Style

The project is executed in a traditional project style in which plans and reports are emphasized. This mirrors the practices of most companies. Some project details exist solely because of the educational purpose of the course project.

1.2.6 Features to Be Tested and Not Tested

Unit-level testing will concentrate on a single class, whereas system testing will target the most important editor functions. The testing is functional testing only; non-functional characteristics (usability, security, performance, etc.) are not tested.
1.3 Test Environment and Tools

There are some requirements for the testing environment used in this course. The following subsections explain the operating environment and the required tools.

1.3.1 Operating System: Linux

All the necessary tools are installed in the Lintula (Birdland) Linux environment. Do not try to use them on Solaris; they will not work there. If you are unsure about the operating system of a workstation or a server, run "uname" (SunOS means Solaris, which is the wrong environment). Lintula Linux workstations are located in TC217.

user_at_right_OS@haikara $ uname
Linux
user_at_wrong_OS@mustatilhi $ uname
SunOS

It is also permissible to install the tools and perform the testing in other environments. However, in that case the responsibility for getting the tools to work is yours; the course personnel can only provide full support for Lintula Linux.

1.3.2 System under Test: RegexSearcher and jEdit

The packages for the versions of RegexSearcher and jEdit under test will be available at /share/ohjcourses/testaus/packages/ in Lintula (Birdland). The jEdit package is made available for use in the course project only. Note that the testaus directory does not have read permissions enabled for everyone, but its contents can still be accessed.

jEdit stores its settings in the .jedit folder, which is placed in the user's home directory on Linux. If you use jEdit yourself, you should make a backup of this directory before you start testing in order to retain your own settings. Removing the directory resets the settings when necessary.
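For example, the backup and reset could be done with commands along these lines (a sketch; adjust the paths if your home directory is organized differently):

cp -r ~/.jedit ~/.jedit.backup   # keep a copy of your own settings
rm -rf ~/.jedit                  # reset jEdit to default settings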
1.3.3 Programming Environment: Eclipse

Unit tests may be created and executed in Eclipse (http://www.eclipse.org), but its use is not compulsory. An installation intended for use in the project can be found at /share/ohjcourses/testaus/eclipse/eclipse in Lintula. Appendix 2 has instructions for its use.

1.3.4 Unit Test Framework: JUnit

JUnit is used as the unit testing tool. Instructions for using it are included in Appendix 2 and examples in Appendix 3. More comprehensive documentation can be found at http://www.junit.org.

1.3.5 Coverage Measurement Tool: CodeCover

CodeCover can be used to measure how well the tests cover the tested code. Instructions can be found in Appendix 2 and further documentation at http://codecover.org.
1.3.6 Model-Based Testing Tools

The model-based approach in system testing requires suitable tools for test generation and execution. The tools supported by the course are OSMO (http://code.google.com/p/osmo/), ModelJUnit (http://www.cs.waikato.ac.nz/~marku/mbt/modeljunit/) and fMBT (https://01.org/fmbt/). Some information on the use of these tools is currently available at http://www.cs.tut.fi/~testaus/s2013/project/tools/; the page will be updated before phase 3. Using other model-based tools is also permitted.

1.3.7 Other Test Automation Tools

The teams are free to use any tool they find appropriate if they choose test automation as their approach to system testing. The course provides support for the Jemmy tool (described at http://www.cs.tut.fi/~testaus/s2013/project/tools/), which can be used to access jEdit GUI components. Other tools may also be used.

1.4 Structure of This Document

This document contains two parts, both of which follow (somewhat) the template for test plans in the IEEE standard for software test documentation [IEEE829]. The first part covers the requirements for the unit testing tasks and the second part those for the system testing tasks.
2. UNIT TESTING (PHASES 1 & 2)
2.1 General

In this part of the project the purpose is to create a unit test suite for RegexSearcher. As is common in testing, comprehensive documentation of the code under test is not available. The correct functionality must therefore be determined from the comments in the code and by using common sense. If necessary, you can create test cases specifically to help you observe the behaviour of the code under test.

Unit testing consists of two parts:

Phase 1: Planning of unit testing
- Plan the test structure and select the test targets
- Design appropriate test suites

Phase 2: Execution and reporting of unit tests
- Create the unit tests
- Execute the tests
- Create a test report containing a record of the errors and anomalies you found and a log of your testing activity

2.2 Features to Be Tested

The SUT is the class regexsearcher.RegexSearcher, which has been created specifically for the needs of this project. This class searches through a body of text for strings matching a specific regular expression pattern (for more information on regular expressions in Java, see http://docs.oracle.com/javase/7/docs/api/java/util/regex/Pattern.html). The implementation of the class contains multiple bugs.

Only the functionality of RegexSearcher is tested. The Java API classes it uses, in particular java.util.regex.Pattern and java.util.regex.Matcher, can be assumed to have been thoroughly tested already. As such, different regular expressions need to be used in testing only to the extent that they can be expected to interact with the features of RegexSearcher.

2.3 Approach

Unit tests are created using the generic unit testing framework JUnit. It was chosen because it is one of the most widely used unit testing tools and because it follows the "xUnit" paradigm; similar tools are available for many languages.

Each test case is a JUnit method. Multiple test cases may be placed in the same test class if they make use of the same variables or initialization. Several test classes can be grouped into a test suite. See Appendix 2 for instructions and Appendix 3 for examples.
Note: To facilitate automated execution of your tests, you must create a test suite named AllTests that executes all of your test cases, and place it in the folder test, package regexsearcher.

Each unit test case consists of four things: the setup, the sending of inputs, the checking of outputs, and the cleanup.

In the setup, all required objects are created and set to appropriate states, either through method calls or by assigning values to variables directly. These actions are counted on to succeed; it may be necessary to run other tests first to make sure of that. Setup may be partially or completely performed in the setUp() method, which JUnit automatically executes before each test case.

The sending of inputs is performed by calling the method under test. Note that each test case should test only a single feature: if a test case contained tests for multiple features, a failure in one could prevent the others from being tested. As such, test cases often contain only a single call to the method under test. However, multiple calls can be made if they are clearly related to testing the same feature.

The checking of outputs means verifying that the return value (if any) of the method under test was correct and that the test object was left in the correct state. The latter may be checked by reading variables or through method calls. All of these checks should be performed using JUnit assertions.

Finally, in the cleanup the test class is made ready for the execution of other tests. This generally involves getting rid of test objects stored in object variables, and can be performed in the tearDown() method. The sketch below marks these four parts in a single test class.
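The following is a minimal sketch of this four-part anatomy, written against the RegexSearcher interface shown in Appendix 3. The pattern, input text and expected indices are illustrative, chosen by analogy with the Appendix 3 example rather than verified against the SUT.

package regexsearcher;

import static org.junit.Assert.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class AnatomyExample {

    private RegexSearcher searcher;

    // 1. setup: bring the object under test into a known state
    @Before
    public void setUp() throws Exception {
        searcher = new RegexSearcher("abc", false, false);
    }

    @Test
    public void testSingleMatchIsFound() {
        // 2. sending of inputs: a single call to the method under test
        searcher.setText("xxabcxx");
        Match match = searcher.findNext();
        // 3. checking of outputs: verify the results with JUnit assertions
        assertEquals("begin", 2, match.begin);
        assertEquals("end", 5, match.end);
    }

    // 4. cleanup: release test objects so that other tests start fresh
    @After
    public void tearDown() throws Exception {
        searcher = null;
    }
}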
2.4 Pass / Fail Criteria

You will need to consider the criteria for passing or failing each of the methods to be tested, as well as the test suite as a whole. These criteria are based on various factors, such as the consequences of potential errors, time to market, and the available budget. Setting the bar too low may result in a fault-riddled product, while setting it too high will likely cause testing to consume too much time and money.

A simple method is to assign the tests of each priority level a required percentage of passing tests. For example, with three levels of priority you could decide that 75% of critical, 50% of medium and 90% of low priority test cases must pass (obviously these particular numbers are not a good choice).

2.5 Test Coverage

It is not sufficient to decide when the tests have been passed. It is also necessary to decide when the testing is comprehensive enough that its passing indicates that the SUT works reliably. In unit testing this decision is often based on different code coverages, such as statement or branch coverage. Coverages can be measured with the CodeCover tool in Eclipse; instructions can be found in Appendix 2.

In this project, code coverages are not emphasized, but they can provide a useful guideline for test design. However, tests should not be designed just to improve coverage; rather, improved coverage is a by-product of well-designed tests. In particular, while low coverage may indicate poor testing, high coverage is no guarantee of good testing. More information on coverage is available in the lecture notes.

2.6 Reporting

Good reporting of errors and other incidents is one of the most important aspects of testing. A good report gives an idea of which features are not working properly; provides the information relevant to reproducing the incident, such as the execution environment, the initial state of the SUT, and of course the given inputs; and explains the consequences of the incident and the reasons why the issue should be fixed.

The tester should always consider the features of the SUT faulty unless the tests clearly show otherwise. Thus, any strange or anomalous behaviour should be treated as a potential error and reported as such, rather than ignored as "that's probably how it's meant to work." Incident reports on legitimate features can be flagged as such and moved aside; ignored real errors can reach the final product and cause serious problems.

2.7 Deliverables

Unit testing is done in two phases. In the first phase you will create a unit test plan and design the unit test suite. In the second phase you will implement the unit tests, execute them and report the results. See Appendix 1 for submission instructions.

2.7.1 Phase 1

In the first phase of unit testing you will create an initial version of the unit test plan and some test cases. The plan should contain the following information:

0. Cover Page
- Course information
- Identification of the document
- Identification of the SUT
- Your names and student numbers
- The name of your supervising assistant

1. Features to Be Tested
- Which classes and methods are to be tested, and where are they implemented?

2. Test Execution Procedure
- Explain how the test execution is to be performed, including in what environment the testing should be done; what actions are needed to set up the environment; how the test cases are executed; and how the results are obtained and interpreted.
To reach the right level of detail, imagine that you are writing instructions for an unskilled tester who has access to the testing environment and your submission package.

3. Test Set Description
- Explain how you have divided your test cases into separate classes and suites. What features are tested by each of these?
- Explain what kinds of techniques you use for designing tests (boundary values, null parameters, and so on).
- Individual test cases need not be listed.

4. Pass / Fail Criteria
- Specify the criteria for deciding whether the SUT has passed testing.

Appendix: Sample Test Cases

List the code for at least five unit test cases here. For each case, the following information must be given:
- Test case identifier (the name of the test method; should be readable from the code)
- Time created (include as a comment)
- Time last modified (include as a comment; may be left out if the case has not been modified since creation)
- Priority and its rationale according to your prioritization scheme (include as a comment)
- Dependencies (include as a comment; may be left out if there aren't any)
- Setup activities (should be readable from the code)
- Input specifications (should be readable from the code)
- Output specifications (should be readable from the code)

Make sure to show not only the test methods themselves but also any other related code, such as object variables and their initialization. Pay attention to readability. This set of test cases is not meant to be comprehensive; its main purpose is to let your assistant check that you are using JUnit properly. A sketch of the required comments appears below.
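For illustration, the commented metadata could look like this inside a test class such as the one in Appendix 3 (the dates, priority and rationale here are invented):

// Created: 20.9.2013
// Modified: 1.10.2013
// Priority: critical – searching is the core functionality of the class
// Dependencies: none
@Test
public void testSimpleMatchIsFound() {
    searcher.setText("abcdefghijklmnopqrstuvwxyz");  // input
    Match match = searcher.findNext();
    assertEquals("begin", 8, match.begin);           // expected output
    assertEquals("end", 11, match.end);
}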
2.7.2 Phase 2

In the second phase of unit testing you will create the rest of your unit test cases, execute them and report the results. The report should contain the following information:

0. Cover Page
- Course information
- Identification of the document
- Identification of the SUT
- Your names and student numbers
- The name of your supervising assistant

1. Introduction
- Specify the SUT.
- Specify the testing environment, including the name and operating system of the computer and the versions of the software used in testing.
- Give a reference to your test plan.

2. Variances from the Plan
- Report and explain any differences in the testing process as compared with the plan.

3. Comprehensiveness Assessment
- Evaluate how well the testing covered the features of the SUT. Identify features that were insufficiently tested and explain the reasons for this. Code coverage metrics may also be presented and discussed, but are not required.

4. Summary of Results
- Give a short summary of the most significant errors and anomalies that you found. The purpose of this section is to give the reader an impression of the condition of the SUT at a quick glance. Try to keep it short and focus on the essentials. Full error reports will be placed in the appendices.

5. Evaluation
- Compare the results with the pass / fail criteria defined in the test plan. Does the SUT pass testing?

6. Approvals
- Specify the names of the persons (your course assistant) who must approve this report. Provide space for signatures and a date.

Appendix: Error Reports

Report here all incidents, such as errors and suspicious behaviour, detected during testing. Since the SUT is known to contain errors, you are expected to find at least one error or anomaly worth reporting. The error reports should contain the following information, where relevant:
- Title. A one-line, descriptive text that clearly states what is essential in the error. (Such titles are shown in long lists, so they need to be clear and expressive.)
- Date and time. When the error was found.
- Test case or situation where the incident occurred.
- A longer description of the incident.
- Activity that produced the incident, in sufficient detail to repeat it.
- Inputs used, in detail.
- Expected results.
- Actual results.
- Other anomalies or relevant observations.
- Severity. How severe is the error – a development blocker or just cosmetic? A verbal description is sufficient.
- Test environment.
- Whether the incident can be repeated.
- Analysis of the potential causes of the incident.
- Analysis of how the incident affected testing.
If multiple test cases fail due to the same error, it is sufficient to report it once, with all of those test cases mentioned in the report. A large number of reports for the same error only hinders readability.

You must also submit the final version of the unit test plan, in which you have made corrections according to the feedback you received on the initial version in the previous phase. Furthermore, you must submit the code for your unit tests. See Appendix 2 for instructions on how to export your tests from Eclipse.
3. SYSTEM TESTING (PHASES 3 & 4)

3.1 General
3.1.1 Phases

In this part of the project the purpose is to perform system testing on jEdit. This testing will be performed on a version of jEdit into which errors have been seeded. The system testing part of the project consists of two phases:

Phase 3: Planning of system testing
- Select an approach to the testing; feel free to select the approach that you feel most comfortable with
- Plan the test structure and how to conduct the testing

Phase 4: Execution and reporting of system tests
- Implement and execute the tests
- Create a test log to record testing activity (this may be optional, depending on your test approach)
- Create a test report from the findings
- Create a defect report

3.1.2 Objectives
- Testing whether the jEdit editor is suitable for production work, for actual everyday work
- Finding out what defects it has so they can be corrected

3.1.3 Generic Test Strategy
- Testing is done at UI level
- Black box: no access to source code
- The end user's view: the tester and test designer need to understand what is important for the user and decide based on that
  - what to test
  - how to test it – with what kinds of tests

3.1.4 Generic Guidance

Need for proof and trust
- In companies, system testing is carefully monitored by the managers and by the customers.
- We need proof that the testing (a) is well planned – the right things, with good tests; (b) is well executed – metrics, sometimes logs; and (c) has covered the right things – requirement coverage (code coverage is not usually monitored at this level).
- Documentation is therefore important.
Working with other people – providing information
- Unit testing is usually done by the developers themselves, but system testing may be done by a separate team. Tests may be planned and executed by different people. And managers watch it carefully.
- Therefore, things need to be documented so that others can understand them. Can someone else perform the testing based on the plans? Can someone else correct the errors based on the testers' reports? Can the managers understand from the reports what the state of the application is? (For example, what risks are there if it is released to market or deployed to the customer?)

Positive and negative testing
- The functions should be tested with both positive and negative tests.
- Positive testing: the function works when everything is OK (valid file, plenty of disk space, etc.).
- Negative testing: disturbances in the environment, files, etc.; user errors, even deliberate misuse; strange and exceptional inputs; all special situations. The lecture slides even suggest that 80% of tests should target error and special situations.

Keeping track of testing – test coverage
- Write a list of functions and the test cases that test them. The list tracks test coverage.
- Separate positive and negative tests so that others can see that the application has been tested sufficiently harshly. For example:

Saving files
  Positive:
  - TC1011 Save a short text file to the user directory
  - TC1013 (etc.)
  Negative:
  - TC1031 Save to a write-protected directory
  - TC1033 (etc.)

3.2 Features to Be Tested

The functionality of the system is to be tested, focusing on the most important editor functions. How to identify the features to test:
- Start with the most important features of any text editor.
- Think of how a programmer would use an editor. What would the use scenarios / use cases be? Which editor functions (UI functions) are used?
- If we had a requirement specification, it would guide us. Instead, we use the use cases and the implemented software as our guide. That is common in practice too.

3.3 Approach

Testing is carried out at user interface level.
You may choose your approach in this phase of testing according to what seems appropriate. The actual tasks and reporting will vary somewhat depending on the approach selected. There are four different approaches, plus the option to combine them:

3.3.1 Approach A: Systematic, Planned Manual Testing

The most traditional approach, and the most widely used in companies.

Phase 3:
- Identify the most essential functions of the application
- Plan a test suite
- Design test cases

Phase 4:
- Execute the test cases
- Report errors
- Report the testing

Essential:
- Good planning of test cases that test the most essential features of the application
- Good test planning, so that others can be sure that the testing has a solid basis
- Good error reporting

3.3.2 Approach B: Exploratory Testing

The approach often used in agile software development. Also used to supplement systematic approaches.

Phase 3:
- Select a set of usage scenarios
- Present an understanding of what is most important for the application to work correctly
- Present your refined approach to the exploratory testing

Phase 4:
- In the context of the scenarios, test the application using the principles of exploratory testing
- Keep a test log to document what has been tested and how
- The checklist "List of some usual things to test in an application" is available on the course project page and can be used as an aid (available also in Finnish)
- Report the errors found
- Report the testing

Essential:
- Choosing the right context for testing
- Making observations and interpretations of the behaviour of the application
- Using good test techniques at each point where something is suspected
- Documenting the work so that others can trust that you have done a good job:
  - A plan
  - The test log
  - Good error reporting

Other:
- Checklists are provided for help (see the Course Project web page).

3.3.3 Approach C: Traditional Test Automation

Designing tests manually and executing them automatically.

Phase 3:
- Identify the most essential functions of the application
- Plan how their testing can be automated: the tools, the test data, etc.
- Plan a test suite
- Design automated test cases
- Document the test system briefly

Phase 4:
- Execute the test cases using test automation
- Report errors
- Report the testing

Essential:
- Finding a good tool with which the automation can be done efficiently and in such a way that the tests are easy to maintain
- As test design needs to be done in a similar manner to approach A, we will not require a high number of test cases – just a demonstration of a good understanding of how testing can be automated
- Good error reporting

Potential test automation tools:
- Jemmy is supported by the course; see http://www.cs.tut.fi/~testaus/s2013/project/tools/
- See the list at http://java-source.net/open-source/testing-tools
- The jEdit project itself uses Marathon (http://www.marathontesting.com/)
- FEST, a library used with JUnit, is also used (http://code.google.com/p/fest/)
- UISpec4J (http://www.uispec4j.org/)
- Jameleon (http://jameleon.sourceforge.net/index.html)
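To give a feel for what approach C could look like with Jemmy, here is a minimal launch-and-click sketch. The main class org.gjt.sp.jedit.jEdit is the one used elsewhere in this plan; the window title and the menu path are unverified assumptions.

import org.netbeans.jemmy.ClassReference;
import org.netbeans.jemmy.operators.JFrameOperator;
import org.netbeans.jemmy.operators.JMenuBarOperator;

public class JEditSmokeTest {

    public static void main(String[] args) throws Exception {
        // launch jEdit in this JVM so that Jemmy can reach its components
        new ClassReference("org.gjt.sp.jedit.jEdit").startApplication();

        // wait for the main window; matching on the title substring "jEdit"
        // is an assumption about the actual window title
        JFrameOperator mainFrame = new JFrameOperator("jEdit");

        // drive the UI through the menu bar; the menu path is an assumption
        new JMenuBarOperator(mainFrame).pushMenu("File|New", "|");
    }
}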
3.3.4 Approach D: Model-Based Testing

Creating a model from which tests can be generated automatically.

Phase 3:
- Identify the most essential functions of the application
- Pick a suitable testing tool and modelling methodology
- Create a test model

Phase 4:
- Generate and execute tests
- Report errors
- Report the testing

Essential:
- Deciding what to model and designing the test model carefully
- Good error reporting

Supported model-based testing tools:
- OSMO, ModelJUnit and fMBT; see http://www.cs.tut.fi/~testaus/s2013/project/tools/ for more details
- Other tools may also be used
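As a sketch of what a test model could look like, the following uses ModelJUnit (assuming the ModelJUnit 2.x API and package names). The editor actions are placeholders; a real model would call an adapter that drives jEdit and verifies its reactions.

import nz.ac.waikato.modeljunit.Action;
import nz.ac.waikato.modeljunit.FsmModel;
import nz.ac.waikato.modeljunit.RandomTester;
import nz.ac.waikato.modeljunit.Tester;

public class EditorModel implements FsmModel {

    private boolean fileOpen = false;

    // the model state determines which actions the generator may choose
    public Object getState() { return fileOpen ? "OPEN" : "CLOSED"; }

    public void reset(boolean testing) { fileOpen = false; }

    public boolean openFileGuard() { return !fileOpen; }
    @Action public void openFile() {
        // here the adapter would open a file in jEdit and verify the result
        fileOpen = true;
    }

    public boolean closeFileGuard() { return fileOpen; }
    @Action public void closeFile() {
        // adapter: close the buffer and check that the editor returns to
        // its base state
        fileOpen = false;
    }

    public static void main(String[] args) {
        Tester tester = new RandomTester(new EditorModel());
        tester.generate(50);  // generate and execute a 50-step random walk
    }
}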
3.3.5 Approach E: Combination of Approaches

Two or more of the approaches A, B, C and D can be combined to create a balanced approach to testing. This is often the most valuable option in practical projects, and demonstrating skills in several approaches is proof of good expertise.

Examples:
- Systematic manual testing of some functions and exploratory testing of some others
- Automated testing of some functions and exploratory testing of some others
- First exploratory testing, then systematic testing, and finally some automated testing

Essential:
- Demonstrating an understanding of the key issues of the approaches used
- Good error reporting

3.4 System under Test

The jEdit version 5.1.0 distributed by the course can be used for setting up test automation in phase 3, should you use approach C or D. In phase 4 the actual testing will be performed on jEdit 5.1.0 CE (where CE stands for Course Edition), into which some errors have been seeded. The CE version will become available after phase 3 is finished.

3.5 Prioritization

Not all tests are equally important. A test for critical core functionality has a higher priority than one for an obscure special case. In practice this means that you have to consider which parts of the SUT and which kinds of inputs are most important to test. From there, priorities can be derived for individual test cases.

The method of prioritizing testing in this project is left up to you, but it must fulfil two requirements. First, there must be at least three different levels of priority (e.g. critical / medium / low).
Second, you must show how you obtained the priorities, for example by applying the probability / effect analysis or the MoSCoW method described in the lecture slides.

In general, the tests should be executed in order of descending priority. That way the most important features get tested even if testing is interrupted due to lack of time.

3.6 Dependencies

Some tests can reasonably be performed only if other tests have already passed. This may be because the setup of the test includes another feature to be tested, or because the test is a more complex variant of another test. The most straightforward example is that the runtime features of an application cannot be tested until the application has been confirmed to start up successfully. As another example, opening an exceptionally large file should be tested only after confirming that the application can successfully open any files at all. Dependencies may cause situations where a test case of lower priority should be executed before one of higher priority.

3.7 Pass / Fail Criteria

As in unit testing; see section 2.4.

3.8 Deliverables

Just as unit testing, system testing is done in two phases. In the first phase you will create a system test plan. In the second phase you will perform testing according to the plan and report the results. See Appendix 1 for submission instructions.

3.8.1 Phase 3

In the first phase of system testing you will create an initial version of the system test plan. The plan should contain the following information:

0. Cover Page
- Course information
- Identification of the document
- Identification of the SUT
- Your names and student numbers
- The name of your supervising assistant

1. Features to Be Tested
- What features are to be tested?

2. Approach
- Explain your testing approach (based on alternatives A, B, C, D or E).
- Explain why you chose this approach.
- Explain the testing process in sufficient detail that someone else could perform it as you intended.

3. Test Specification
- Explain what kinds of tests you intend to perform, and on what sets of features. Explain why you chose these specific feature sets and methods. Adjust this for your selected approach (A, B, C, D or E).

4. Prioritization
- Explain your prioritization scheme: what priorities are you using and how do you obtain them?

5. Dependencies
- Explain any dependencies relevant to your testing approach.

6. Pass / Fail Criteria
- Specify the criteria for deciding whether the tested features have passed the testing.
- Specify the criteria for the entire system testing to be considered passed.

Appendix: Test Cases

If you create specific test cases, list them here. For each case the following information must be given:
- Test case identifier. ID and title.
- Time created.
- Time last modified (leave empty if the case has not been modified since its creation).
- Priority and its rationale. How did you arrive at the priority?
- Dependencies on other test cases.
- Setup. Things you need to do before running the case to take the system into a suitable state, such as changing the environment, loading a file, or executing another test case that exits in that state.
- Test steps. The sequence of actions to be performed in order to execute the test case.
- Inputs. Specific texts, files and other data needed during execution. Kept separate from the test steps to make it easier to execute tests with varying data.
- Expected results.
- Cleanup. The actions needed to return the system to a stable base state, such as deleting temporary files, reloading a file, returning to the main display, or logging out (these are especially important if we think of automating the test cases).
- Other information that you think is important for a tester to know in order to understand how to execute this test case.

If you don't have any test cases, you can leave this part out. The description may be tailored to suit your test approach (for example, in exploratory testing planned test cases are not normally used). Include other information as needed according to your chosen approach. A sketch of a filled-in test case follows.
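For illustration, a filled-in test case entry might look like the following. TC1011 and its title come from the coverage-tracking example in section 3.1.4; the other identifiers, file names and menu path are invented.

TC1011 Save a short text file to the user directory
  Created: 1.11.2013    Last modified: –
  Priority: critical – saving is core editor functionality (probability / effect: frequent use, severe consequences)
  Dependencies: TC1001 (application starts), TC1005 (a file can be opened)
  Setup: start jEdit with default settings; open the prepared file input/short.txt
  Test steps: choose File > Save As, select the user's home directory, enter the name out.txt, confirm
  Inputs: input/short.txt (a ten-line ASCII text file)
  Expected results: out.txt appears in the home directory and its contents equal the original file
  Cleanup: delete out.txt; close the buffer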
3.8.2 Phase 4

In the second phase of system testing you will perform testing according to the plan you created in the first phase. The results will be documented in a system test report, which should contain the following information:

0. Cover Page
- Course information
- Identification of the document
- Identification of the SUT
- Your names and student numbers
- The name of your supervising assistant

1. Introduction
- Specify the SUT.
- Specify the testing environment, including the name and hardware of the computer and the versions of the software used in testing.
- Specify the test plan.

2. Variances from the Plan
- Report and explain any differences in the testing process as compared to the plan.

3. Comprehensiveness Assessment
- Evaluate how well the testing covered the features of the SUT. Identify features that were insufficiently tested and explain the reasons for this.

4. Summary of Results
- Give a short summary of the most significant errors and anomalies that you found. The purpose of this section is to give the reader an impression of the condition of the SUT at a quick glance. Try to keep it short and focus on the essentials. Full error reports will be placed in the appendices.

5. Evaluation
- Compare the results with the pass / fail criteria defined in the test plan. Does the SUT pass testing?

6. Approvals
- Specify the names of the persons (your course assistant) who must approve this report. Provide space for signatures and a date.

Appendix: Error Reports

Report here all incidents, such as errors and suspicious behaviour, detected during testing.
Since the SUT is known to contain errors, you are expected to find at least one error or anomaly worth reporting. The error reports should contain the following information, where relevant:
- Title. A one-line, descriptive text that clearly states what is essential in the error. (Such titles are shown in long lists, so they need to be clear and expressive.)
- Date and time. When the error was found.
- Test case or situation where the incident occurred.
- A longer description of the incident.
- Activity that produced the incident, in sufficient detail to repeat it.
- Inputs used, in detail.
- Expected results.
- Actual results.
- Other anomalies or relevant observations.
- Severity. How severe is the error – a development blocker or just cosmetic? A verbal description is sufficient.
- Test environment.
- Whether the incident can be repeated.
- Analysis of the potential causes of the incident.
- Analysis of how the incident affected testing.

If multiple test cases fail due to the same error, it is sufficient to report it once, with all of those test cases mentioned in the report.

Appendix: Test Log

Report your testing process. Only required for exploratory testing.

You must also submit the final version of the system test plan, in which you have made corrections according to the feedback you received on the initial version in the previous phase. If you created automated tests, include them in the package.
4. DOCUMENTATION

4.1 Instructions Provided by the Course
- The Course Project web page
- The master test plan (this document) – available on the Course Project web page
- Slide set: TIE-21200 Ohjelmistojen testaus – Introduction to the Course Project (parts 1 and 2) – will become available on the course web page
4.2 The Documentation that the Teams Create

See the chapters on the test phases for a description of the documentation required. Documentation is submitted in PDF format. See the Course Project page for details on submissions.
4.3 Document Templates

The Course Project web page contains links to document templates.
4.4 Other Documents

Other documents, such as checklists, are available on the Course Project page.
APPENDIX 1: SUBMITTING REPORTS AND OTHER FILES

General

Please follow these instructions very carefully: failing to follow them will cost points.
- The plans and reports must be in PDF format.
- The cover page of every document must include the names of both students and their student numbers, all of them correctly typed.
- Any appendices must be in the same PDF file as the main document. Do not submit them as separate files.
- In phases 1 and 3, return the PDF file as it is via the IDLE system (https://idle.cs.tut.fi/).
- In phases 2 and 4, return a package containing the report, the fixed plan and the other necessary files, again via IDLE.

Deliverables

Phase 1:
1. Unit test plan document

Phase 2:
1. Unit test summary report document
2. Fixed unit test design document
3. The unit test folder exported from Eclipse

Phase 3:
1. System test plan document

Phase 4:
1. System test report document
2. Fixed system test plan document
3. Automated tests, if you created any
4. Other files required for testing (e.g. a complex input file for a test case)

Naming and Packaging of Files

Name the plan and report files 2013.PHASEx.SNUM1.SNUM2.pdf, where
- PHASE is 1, 2, 3 or 4
- the letter x is replaced with 'a' ('approval') when submitting the initial version, and with 'f' ('fixed') when submitting the fixed plan along with the corresponding report
- SNUM1 and SNUM2 are the student numbers of the authors.

Example: 2013.1a.123456.142424.pdf.

In phases 2 and 4 you are required to return other files in addition to the reports. Pack all the files (that is, everything that is required, including the report in PDF format) into a 2013.PHASEa.SNUM1.SNUM2.tar.gz package, where
- PHASE is 2 or 4
- SNUM1 and SNUM2 are the student numbers of the authors.

Example: 2013.2a.123456.142424.tar.gz.

Package format:
- The file format must be gzipped tar. You can create the package on Linux with the command tar czf 2013.PHASEa.SNUM1.SNUM2.tar.gz * in the directory that contains the report and the other required files (only).
- Delivered items must be at the root of the package, not in an extra sub-directory.
- The report file in the package should be named 2013.PHASEa.SNUM1.SNUM2.pdf (PHASE = 2 or 4) and the fixed test design 2013.DESIGNPHASEf.SNUM1.SNUM2.pdf (DESIGNPHASE = 1 or 3). Even if there was nothing to fix in the test design document, you MUST rename that document before delivery.

The following is an example of how the submission packages must be assembled ('/' marks the root of the package):

Package: 2013.2a.123456.142424.tar.gz
  /2013.2a.123456.142424.pdf   unit test report
  /2013.1f.123456.142424.pdf   fixed unit test design
  /test/                       unit test case folder

Package: 2013.4a.123456.142424.tar.gz
  /2013.4a.123456.142424.pdf   system test report
  /2013.3f.123456.142424.pdf   fixed system test design
  (other related files, if any)
Testing the Phase 2 Submission Package

The test cases submitted in phase 2 will be executed automatically. An incorrectly formed package will cause the automated execution to fail and cost you points. To check that your submission package has the correct structure, place it in an otherwise empty directory and execute the following three commands there (with SNUM1 and SNUM2 replaced with your student numbers):

tar xzf 2013.2a.SNUM1.SNUM2.tar.gz
javac -cp /share/ohjcourses/testaus/testenv:test test/regexsearcher/AllTests.java
java -cp /share/ohjcourses/testaus/testenv:test org.junit.runner.JUnitCore regexsearcher.AllTests

Your tests should now be executed and any failures listed. Check that the reported number of tests and failures is correct.
APPENDIX 2: TOOL INSTRUCTIONS

All files mentioned here can be found at /share/ohjcourses/testaus/packages/ in Lintula. The testaus directory does not have read permissions enabled for everyone, so you will need to type this path.

Eclipse

These instructions are written for Eclipse version 4.3, but other versions may also be used.

Running Eclipse on Lintula
- Execute /share/ohjcourses/testaus/eclipse/eclipse on a Linux workstation (TC217) to run Eclipse.
- Set your workspace (the place where your project and all its files will be placed) somewhere under your home directory and click OK.
- On the welcome view, click Workbench.

Installing and Running Eclipse on Your Own Computer
- Download the Eclipse Standard package from http://www.eclipse.org/downloads/.
- Extract the contents into a directory of your choice.
- Run the executable.
- Set up the workspace and move to the workbench as above.

RegexSearcher

Importing RegexSearcher into Eclipse
- Right-click inside the package list area in Package Explorer in Eclipse and select New → Java Project from the pop-up menu.
- Type in a project name, e.g. regexsearcher, and click Finish. The RegexSearcher project now appears in Package Explorer.
- Right-click on your project and select Import....
- Select General/Archive File and click Next.
- Set the archive file to the package regexsearcher.tar.gz and click Finish.

jEdit

Importing jEdit into Eclipse

You may need to import jEdit into Eclipse if you want to perform automated system testing.
- Import a jEdit package into Eclipse the same way as RegexSearcher. There are two alternative packages: jedit5.1.0CE.tar.gz and jedit5.1.0.tar.gz. The former is the version under test during phase 4 and contains seeded errors; it will become available at the beginning of the phase. The latter does not contain seeded errors and may be used for setting up test automation during phase 3.
- Right-click on your project and select Properties.
- Select Java Build Path and the Libraries tab.
- Click Add External JARs..., select the package jsr305-2.0.1.jar and click OK.
- Still in the Libraries tab, click Add Class Folder..., select the folder classfiles and click OK and OK.

Running jEdit in Eclipse
- Open your jEdit project and, underneath it, the folder classfiles/org/gjt/sp/jedit.
- Right-click on jEdit.class and select Run As → Java Application (or Debug As → Java Application if you want to use debug features).

JUnit

Setting Up JUnit
- Right-click on your project and select Properties.
- Select Java Build Path and the Libraries tab.
- Click Add External JARs... and select the packages junit-4.11.jar and hamcrest-core-1.3.jar.
- Select the Source tab.
- Click Add Folder, then Create New Folder. Name the folder test and click Finish, OK and OK. The tests will be placed in this test folder so that they do not get mixed up with the code under test in the src folder.

Creating JUnit Test Classes
- Right-click on the class file you want to test in Package Explorer and select New → JUnit Test Case.
- Change the source folder from the src folder to the test folder. Change the name if you want.
- Check the method stubs you want to create; setUp() is executed before and tearDown() after every individual test case, and the others are not needed. The methods can also be added manually later on.
- Click Finish. The test class now appears in Package Explorer.
- Implement the test methods. See Appendix 3 for an example.
- Execute the tests by right-clicking on the test class and selecting Run As → JUnit Test (or Debug As → JUnit Test if you want to use debug features).

Creating JUnit Test Suites
- Select the test class files you want to include in the new suite (more can be added manually later), right-click and select New → Other....
- Select Java/JUnit/JUnit Test Suite and click Next.
- Name the suite (the default is AllTests, which is one suite you need to create) and click Finish.
- The suite can be executed just like an individual test class.
Exporting JUnit Tests

In phase 2 of the project you will need to export your test cases and submit them along with the report.
- Right-click on the test folder in Package Explorer and select Export....
- Select General/File System and click Next.
- Set the target directory, make sure that the choice Create only selected directories is selected, and click Finish.

CodeCover

Installing the CodeCover Eclipse Plugin

The course installation of Eclipse in Lintula already has CodeCover installed, so you only need to do this if you are using an Eclipse installation of your own.
- From the Eclipse toolbar, select Help → Install New Software.
- Click Add, fill in the name (CodeCover Update Site) and location (http://update.codecover.org/), and click OK.
- Select CodeCover, click Next and proceed through the installation.

Using CodeCover
- Right-click on your RegexSearcher project and select Properties.
- Select CodeCover, enable it, select the coverages you want to measure (StatementCoverage is a good place to start) and click OK.
- Select the application classes you want to measure in the Package Explorer, right-click and check Use For Coverage Measurement.
- Run your tests using Run As → CodeCover Measurement For JUnit.
- In the Test Sessions view at the bottom, check the test runs whose coverages you want to examine. The covered parts of the code are now highlighted in the measured files.
- From the Eclipse toolbar, click Window → Show View → Other... and select CodeCover/Coverage. The opened view shows coverage percentages for the measured test runs.
APPENDIX 3: EXAMPLE OF JUNIT TEST CLASS AND SUITE

JUnit Test Class

package regexsearcher;

// import JUnit assertions
import static org.junit.Assert.*;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class ExampleTestClass {

    private RegexSearcher searcher;

    // execute before every test case
    @Before
    public void setUp() throws Exception {
        searcher = new RegexSearcher("ijk", false, false);
    }

    // execute after every test case
    @After
    public void tearDown() throws Exception {
        searcher = null;
    }

    // a simple test case
    @Test
    public void testSimple() {
        searcher.setText("abcdefghijklmnopqrstuvwxyz");
        Match match = searcher.findNext();
        assertEquals("begin", 8, match.begin);
        assertEquals("end", 11, match.end);
    }

    // another test case with multiple method calls
    @Test
    public void testLoop() {
        searcher.setText("abcdefghijklmnopqrstuvwxyz");
        searcher.findNext();
        Match match = searcher.findNext();
        assertEquals("begin", 8, match.begin);
        assertEquals("end", 11, match.end);
    }

    // a test case where a call should cause an exception
    @Test(expected = RuntimeException.class)
    public void testNullText() {
        searcher.setText(null);
        searcher.findNext();
    }
}
JUnit Test Suite

package regexsearcher;

import org.junit.runner.RunWith;
import org.junit.runners.Suite;
import org.junit.runners.Suite.SuiteClasses;

// a test suite containing two test classes
@RunWith(Suite.class)
@SuiteClasses({ ExampleTestClass.class, AnotherTestClass.class })
public class ExampleTestSuite {
}
APPENDIX 4: TROUBLESHOOTING
Problems in Accessing the Tools or Packages in Lintula
- The /share/ohjcourses/testaus/ directory does not have read permissions enabled, but its contents can still be accessed by typing the full path.

Problems with Eclipse
- Use the course installation at /share/ohjcourses/testaus/eclipse/eclipse instead of the basic Eclipse installation in Lintula (executed simply with the command eclipse).
- Make sure you are running the course installation of Eclipse on Linux, not on Solaris. Linux workstations are available in class TC217.
- Eclipse has a GUI and cannot be executed over a text-based remote connection.

Problems with jEdit
- jEdit stores its settings in the .jedit folder, which is placed in the user's home directory on Linux. If jEdit breaks down during testing, removing this folder should restore it to working order. Launching and closing jEdit once afterwards may be required.