Specification-based class testing: A case study

Ian MacColl, Leesa Murray, Paul Strooper, David Carrington
{ianm, leesam, pstroop, davec}@csee.uq.edu.au
Software Verification Research Centre
Department of Computer Science & Electrical Engineering
The University of Queensland
Abstract. This paper contains a case study demonstrating a complete process for specification-based class testing. The process starts with an abstract specification written in Object-Z and concludes by exercising an implementation with test cases and evaluating the results. The test cases are derived using the Test Template Framework for each individual operation. They are analysed to generate a finite state machine that is used to execute test sequences within the ClassBench framework. An oracle is also derived from the Object-Z specification. The case study demonstrates how a formal specification contributes to the development of practical tests that can be executed by a testing tool. It also shows how a test oracle can be derived from a specification and used by the same testing tool to evaluate test results.
1. Introduction The purpose of specification-based testing is to derive testing information from a specification of the software under test, rather than from the implementation. Although it is possible in theory to formally refine a specification into an implementation, this rarely happens in practice. However, when the implementation is developed informally from a formal specification, the specification can play a major role in the testing process. Our method for specification-based class testing is shown in Figure 1. Testing involves three main tasks: test derivation, execution and evaluation. In our approach, test inputs are derived from a formal specification. The test inputs are analysed to develop a finite state machine, which is transformed to control test sequencing in a test execution environment. The specification is also translated into an oracle class for test evaluation in the test execution environment.
We use the Test Template Framework [24] to structure the test inputs and guide test derivation. In previous work we have extended the Test Template Framework to Object-Z [5], deriving a finite state machine for test sequencing from a class's specification [17]. The Test Template Framework provides structure for testing but is not used for actually performing tests. ClassBench [9] is an approach to automated class testing designed especially for testing collection classes. The class under test is (partially) modelled as a finite state machine. The partial model, referred to as a testgraph, represents a subset of the states of the class and the transitions between them. ClassBench executes tests by traversing the testgraph and evaluates results by comparing the state of the class under test to the expected state supplied by an oracle class. In this paper we present the first complete case study applying all the steps of our approach to an example, deliberately kept simple to ease presentation. The case study is a part of a Dependency Management System. Section 2 contains an Object-Z specification of part of the Dependency Management System. In Section 3 we use the Test Template Framework to derive test cases and expected outputs, and in Section 4 we develop a finite state machine representation of the class under test. In Section 5 we transform the finite state machine into a testgraph, and in Section 6 we translate the specification into a ClassBench test oracle class. Testing of an independently developed implementation is presented in Section 7. In Section 8 we discuss related work and Section 9 concludes. Throughout this paper we make extensive use of Z [21, 26] and Object-Z [5], an object-oriented variant of Z with its own semantics. We assume familiarity with Z and introduce Object-Z extensions as they arise. We adopt a convention of using emphasis for specification constructs and monospace for implementation constructs.
[Figure 1 depicts the process: an Object-Z specification feeds the Test Template Framework (TTF), which produces test cases and oracles; analysis of these yields a class finite state machine (FSM), which is transformed into a testgraph; oracle generation produces an oracle class, and development produces the class implementation; the testgraph, oracle class and class implementation are exercised by ClassBench to produce test results.]
Figure 1. Our approach to specification-based class testing
2. Case study The Dependency Management System (DMS) case study was a testbed for a project exploring software development methodologies [13]. The DMS is a critical component of a theorem-proving tool, tracking dependencies between theorems and assertions in a proof, thus preventing circular reasoning. However, the concept of dependency management generalises to other problems (such as revision control systems), so the DMS is a reasonably generic component. It comprises a set of items under management and the dependency graph between them. The DMS tracks dependencies between nodes. The basic DMS maintains a set of nodes, direct dependencies between nodes, and inferred transitive dependencies between nodes. The DMS provides operations for manipulating a dependency graph, from adding nodes and dependencies to calculating all the nodes dependent on some other node. In this paper we consider only the features associated with the set of nodes.
2.1. Specification We specify the features associated with the set of nodes of the DMS in Object-Z (ignoring the dependency information). The complete specification is available as part of a technical report [14]. The Items class is given below.
Items[X]
  xs : F X

  INIT
    xs = ∅

  NoXs
    result! : B
    result! ⇔ xs = ∅

  IsX
    x? : X
    result! : B
    result! ⇔ x? ∈ xs

  AddX
    Δ(xs)
    x? : X
    x? ∉ xs
    xs′ = xs ∪ {x?}

  RemoveX
    Δ(xs)
    x? : X
    x? ∈ xs
    xs′ = xs \ {x?}
An Object-Z class is represented as a named box with optional generic parameters (here class Items with generic parameter X). A class contains an unnamed state schema, an initialisation schema (INIT) and zero or more operations (four in this case). The state variable xs represents the aggregation of items stored in the system's database. Initially there are no nodes. Operation NoXs returns true if and only if xs is empty. Operation IsX returns true if and only if the input node x? is a member of xs. NoXs and IsX do not change the state of the class. In an Object-Z operation schema, state variables are implicitly equated to their primed after-state equivalents unless included in a Δ-list. For example, state variable xs is in the Δ-list of operation AddX and so can be changed. Node x? is added to xs by AddX. An Object-Z operation is disabled outside its precondition, in contrast to Z, in which a precondition violation gives an undefined result (satisfying the precondition is interpreted as a client responsibility). AddX is disabled if x? is already in xs. Operation RemoveX removes node x?. The precondition is that the input value must be an existing node, and the remaining conjunct specifies the removal of the designated node from the set of nodes.
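To make the specified behaviour concrete, the following minimal C++ sketch realises the Items operations over a std::set. It is purely illustrative: it is not the independently developed implementation tested in Section 7, and the choice of std::set and of assertions for the preconditions is ours.

// Minimal sketch of an Items-like class (illustration only; NOT the
// implementation tested in Section 7).
#include <cassert>
#include <set>

template <class X>
class Items {
public:
    bool NoXs() const { return xs.empty(); }               // result! <=> xs = {}
    bool IsX(const X& x) const { return xs.count(x) > 0; } // result! <=> x? in xs
    void AddX(const X& x) {                                 // precondition: x? not in xs
        assert(xs.count(x) == 0);
        xs.insert(x);                                       // xs' = xs union {x?}
    }
    void RemoveX(const X& x) {                              // precondition: x? in xs
        assert(xs.count(x) == 1);
        xs.erase(x);                                        // xs' = xs \ {x?}
    }
private:
    std::set<X> xs;                                         // xs : F X, initially empty (INIT)
};

int main() {
    Items<int> items;
    assert(items.NoXs());
    items.AddX(1);
    assert(items.IsX(1) && !items.NoXs());
    items.RemoveX(1);
    assert(items.NoXs());
    return 0;
}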
3. Test cases and expected outputs Our approach to specification-based class testing is based on the Test Template Framework [24]. The Framework is a formal, abstract model of testing, used to derive a hierarchy of test information from a model-based formal specification. The hierarchy includes test inputs and expected outputs, but is not used directly for test execution. The Test Template Framework provides structure to testing information and processes without mandating any particular
techniques or methods. Most of the work with the Framework has used the Z specification notation [21, 26]. In [18], we extend the Framework to accommodate object-oriented features such as those available in Object-Z. We use the AddX operation to illustrate the Test Template Framework. We expand the operation as a Z schema (including expansion of the Δ-list), and instantiate the generic parameter as the integers to simplify our presentation.
AddX
  xs, xs′ : F Z
  x? : Z
  x? ∉ xs
  xs′ = xs ∪ {x?}
The Test Template Framework uses the valid input space (VIS) of an operation as the source of all tests, since an Object-Z operation is disabled for inputs outside the valid input space. The valid input space for AddX is the precondition of the operation's schema

VIS_AX ≙ pre AddX

(here, and in the remainder of the paper, we use two-letter abbreviations of operation names as subscripts) which expands and simplifies to

VIS_AX ≙ [ xs : F Z; x? : Z | x? ∉ xs ]

The basic unit for defining test data in the Test Template Framework is a test template (TT). A test template is a constrained subset of the valid input space and is expressed as a Z schema. We define a type for all AddX test templates.

TT_AX == P VIS_AX
A strategy in the Test Template Framework identifies a particular technique for deriving test cases. The Framework encourages the use of multiple strategies to build the test template hierarchy, both traditional techniques such as input partitioning and boundary analysis [1], and specialised techniques that exploit the specification notation [23]. We take the set of all possible testing strategies as given.
[Strategy]
Test templates are organised into a hierarchy called the test template hierarchy (TTH). The root of this hierarchy is the valid input space. The hierarchy is created by applying testing strategies to existing test templates to derive additional test templates. We identify the AddX test template hierarchy.

TTH_AX : TT_AX × Strategy ⇸ P TT_AX

A common, intuitive testing strategy is 0-1-many, based on the cardinality of a container type. We refer to this as (one example of) type-based selection. We identify type-based selection (TB) as a particular testing strategy.

TB : Strategy
We use type-based selection to partition the valid input space into cases where xs is empty, a singleton, and contains more than one element, deriving three test templates as a result.
TT_AX.1 ≙ [ VIS_AX | #xs = 0 ]
TT_AX.2 ≙ [ VIS_AX | #xs = 1 ]
TT_AX.3 ≙ [ VIS_AX | #xs > 1 ]

TTH_AX (VIS_AX, TB) = { TT_AX.1, TT_AX.2, TT_AX.3 }
We distinguish the new test templates by sequentially numbering the subscript. Test templates are derived until the tester is satisfied that the subdivisions of the valid input space are adequate to achieve satisfactory testing. A test template hierarchy is usually a directed acyclic graph with the leaf nodes partitioning the valid input space. The formal specification can also be used to determine the success or failure of each test. We derive oracle templates (OT) to describe suitable output for a given input. Oracle templates can be derived at any point from test templates but are commonly used for unique instantiations of test templates (referred to as instance templates). For example, the oracle templates for the above test templates are

OT_AX.1 ≙ [ xs′ : F Z | #xs′ = 1 ]
OT_AX.2 ≙ [ xs′ : F Z | #xs′ = 2 ]
OT_AX.3 ≙ [ xs′ : F Z | #xs′ > 2 ]
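For illustration, the leaf test templates can be instantiated with concrete values. The following data table is a hypothetical sketch (the names and the chosen values are ours, not taken from the paper's test suite): one input per AddX test template, paired with the after-state cardinality its oracle template requires.

// Hypothetical instance templates for AddX (values are illustrative only).
#include <cstddef>
#include <set>

struct AddXInstance {
    std::set<int> xs;          // before-state
    int x;                     // input x?
    std::size_t expectedCard;  // cardinality required of xs' by OT_AX.i
};

const AddXInstance addXInstances[] = {
    { {},        1, 1 },  // TT_AX.1 (#xs = 0) / OT_AX.1 (#xs' = 1)
    { {1},       2, 2 },  // TT_AX.2 (#xs = 1) / OT_AX.2 (#xs' = 2)
    { {1, 2, 3}, 4, 4 },  // TT_AX.3 (#xs > 1) / OT_AX.3 (#xs' > 2)
};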
The complete test derivation of the other operations is available as part of a technical report [14]. The Test Template Framework provides a model and process for deriving testing information from a formal specification, but it does not provide mechanisms for executing tests or checking results. We use test and oracle templates to derive states and transitions for a finite state machine representing the class under test. The finite state machine is transformed for use in ClassBench, which provides test execution and result checking.
4. Finite state machine
This section details the development of a finite state machine from testing information derived from an Object-Z specification. We address the development in two steps: first the derivation of the states of the finite state machine and then the transitions.
4.1. States We use the Test Template Framework to derive test templates and oracle templates for the class's operations. We derive the states of the class's finite state machine from the class's INIT schema and from each of the operations' test and oracle templates. The states derived from test templates only guarantee that some operation of the class can be executed in each state. However, there is no guarantee that the state reached after the operation terminates is a member of the set of states derived from test templates. Therefore it is also necessary to use the oracle templates to derive states for the finite state machine. We only consider oracle templates for operations that change the state of the class, as operations that do not change the state cannot contribute any new states. The collection of states gained from the INIT schema and the test and oracle templates gives us the set of states for the class's finite state machine, and these are referred to as state templates (ST). The state template derived from the INIT schema of Items is

ST_INIT ≙ [ xs : F Z | xs = ∅ ]
We derive states from the leaf test templates of each operation's test template hierarchy by using schema hiding to restrict the signatures of the templates to only the state variables. Hiding involves removing the input variables from the template declaration and existentially quantifying them in the template predicate. For example, the state template derived from test template TT_AX.1 is

ST_TTAX.1 ≙ TT_AX.1 \ (x?) = [ xs : F Z | #xs = 0 ]
We derive ten test templates from the four operations of class Items (NoXs, IsX, AddX and RemoveX), which, in turn, provide ten state templates (subscripts NX, IX, AX and RX respectively).

ST_TTNX.1 ≙ [ xs : F Z | xs = ∅ ]
ST_TTNX.2 ≙ [ xs : F Z | xs ≠ ∅ ]

ST_TTIX.1 ≙ [ xs : F Z | xs ≠ ∅ ]
ST_TTIX.2 ≙ [ xs : F Z ]

ST_TTAX.1 ≙ [ xs : F Z | #xs = 0 ]
ST_TTAX.2 ≙ [ xs : F Z | #xs = 1 ]
ST_TTAX.3 ≙ [ xs : F Z | #xs > 1 ]

ST_TTRX.1 ≙ [ xs : F Z | #xs = 1 ]
ST_TTRX.2 ≙ [ xs : F Z | #xs = 2 ]
ST_TTRX.3 ≙ [ xs : F Z | #xs > 2 ]
We calculate oracle templates for operations that change the state of the class (AddX and RemoveX), to ensure that all post-states of operations are included in the finite state machine. After calculation of the oracle templates, state templates are formed by renaming the primed state variables to
their unprimed equivalents, and hiding any output variables. The templates are then simplified so only state variables appear in the templates' declarations.

ST_OTAX.1 ≙ [ xs : F Z | #xs = 1 ]
ST_OTAX.2 ≙ [ xs : F Z | #xs = 2 ]
ST_OTAX.3 ≙ [ xs : F Z | #xs > 2 ]

ST_OTRX.1 ≙ [ xs : F Z | #xs = 0 ]
ST_OTRX.2 ≙ [ xs : F Z | #xs = 1 ]
ST_OTRX.3 ≙ [ xs : F Z | #xs > 1 ]
Once the set of state templates is derived, we consider whether the templates are disjoint. If they are not, it is necessary to resolve any overlap of states so that we have a maximal partition¹ of the class's state space. To achieve this, we derive a canonical disjunctive normal form (DNF) to construct a partition, based on the technique described by Dick and Faivre [3], using equivalences of the form

A ∨ B ≡ (A ∧ B) ∨ (¬A ∧ B) ∨ (A ∧ ¬B)

If the INIT schema is partitioned in the process of formulating disjoint templates, the subscript INIT is added to the names of the resulting templates to track those templates that are initial states of the class's finite state machine. This does not occur in our example. We form a set of disjoint state templates by removing duplicates and considering overlap of the remaining state templates. State templates ST_TTNX.1, ST_TTIX.1, ST_TTAX.1 and ST_TTRX.1, and all those derived from oracle templates, are duplicates. For the remaining seven candidate templates we have four that form the equivalence classes of a partition, and three that can be partitioned.

ST_INIT      equivalence class
ST_TTNX.2    ⟨ ST_TTAX.2, ST_TTRX.2, ST_TTRX.3 ⟩
ST_TTIX.2    ⟨ ST_INIT, ST_TTAX.2, ST_TTRX.2, ST_TTRX.3 ⟩
ST_TTAX.2    equivalence class
ST_TTAX.3    ⟨ ST_TTRX.2, ST_TTRX.3 ⟩
ST_TTRX.2    equivalence class
ST_TTRX.3    equivalence class
We rename the four remaining state templates as the states of our finite state machine. The subscript corresponds to the cardinality of xs: 0, 1, 2 or Many.
ST0 ≙ [ xs : F Z | #xs = 0 ]
ST1 ≙ [ xs : F Z | #xs = 1 ]
ST2 ≙ [ xs : F Z | #xs = 2 ]
STM ≙ [ xs : F Z | #xs > 2 ]
¹ By maximal partition, we mean that we maximise the number of STs in the partition, or, conversely, that we make the 'granularity' of the partition as small as possible, i.e., that we minimise the amount of the state space covered by each ST.
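As a small illustration (our own helper, not part of the paper's tooling), a concrete value of xs can be classified into one of these four state templates purely by its cardinality:

// Illustrative helper: classify a concrete value of xs into one of the four
// state templates by its cardinality (not part of the paper's tooling).
#include <set>

enum StateTemplate { ST_0, ST_1, ST_2, ST_M };

inline StateTemplate classify(const std::set<int>& xs) {
    switch (xs.size()) {
        case 0:  return ST_0;   // #xs = 0
        case 1:  return ST_1;   // #xs = 1
        case 2:  return ST_2;   // #xs = 2
        default: return ST_M;   // #xs > 2
    }
}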
[Figure 2: states ST0, ST1, ST2 and STM; AX arcs ST0→ST1, ST1→ST2 and ST2→STM with an AX self-loop on STM; RX arcs in the opposite direction with an RX self-loop on STM; IX and NX self-loops on every state.]
Figure 2. Finite state machine representation of Items
4.2. Transitions In the previous section, we partitioned a class's state into state templates to be used as the nodes of its finite state machine. We derive the transitions of the finite state machine by considering each pair of states and checking whether the pair is related by an operation (see [17] for a formal model of this process). A valid transition occurs when a pair of state templates, (ST_i, ST_j′), is related by an operation. For each valid transition identified, an arc labelled with the operation's name, beginning in state ST_i and terminating in state ST_j, is added to the class's finite state machine. We choose two possible Items transitions to illustrate the derivation of valid and invalid transitions of a class. To show that (ST0, ST0′) are not related by AddX and so do not form a valid transition in the Items finite state machine, we prove²

∃ IS_AX; OS_AX • ST0 ∧ AddX ∧ ST0′
⇔ ∃ xs : F Z; x? : Z; xs′ : F Z • #xs = 0 ∧ x? ∉ xs ∧ xs′ = xs ∪ {x?} ∧ #xs′ = 0
⇔ false
Conversely, to show that (ST0, ST1′) are related by AddX, we prove

∃ IS_AX; OS_AX • ST0 ∧ AddX ∧ ST1′
⇔ ∃ xs : F Z; x? : Z; xs′ : F Z • #xs = 0 ∧ x? ∉ xs ∧ xs′ = xs ∪ {x?} ∧ #xs′ = 1
⇔ true

² The input space of an operation (IS_Op) is the operation's signature restricted to input and before-state variables. The output space of an operation (OS_Op) is the operation's signature restricted to output and after-state variables. The expression ∃ IS_Op; OS_Op makes Op's signature available within the quantification.
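These existence checks can be mirrored by a brute-force search over a small finite universe standing in for Z. The sketch below is our own illustration (the universe {1, 2, 3} is chosen arbitrarily), not the formal proof: it confirms that no AddX step leads from an empty xs to an empty xs′, while a step to a singleton xs′ exists.

// Brute-force illustration of the two transition checks above, over a small
// finite universe standing in for Z.
#include <cassert>
#include <set>

int main() {
    const int universe[] = {1, 2, 3};
    std::set<int> xs;              // the only state satisfying ST0 (#xs = 0)
    bool st0_to_st0 = false;       // does some AddX step reach #xs' = 0 ?
    bool st0_to_st1 = false;       // does some AddX step reach #xs' = 1 ?
    for (int x : universe) {
        if (xs.count(x) == 0) {            // precondition: x? not in xs
            std::set<int> xs_after = xs;
            xs_after.insert(x);            // xs' = xs union {x?}
            st0_to_st0 = st0_to_st0 || (xs_after.size() == 0);
            st0_to_st1 = st0_to_st1 || (xs_after.size() == 1);
        }
    }
    assert(!st0_to_st0);   // (ST0, ST0') not related by AddX
    assert(st0_to_st1);    // (ST0, ST1') related by AddX
    return 0;
}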
ST0 is the only initial state of Items, as it was not partitioned during the derivation. We represent initial states by an incoming arc that does not originate in a state and has no label. The resulting finite state machine of class Items is shown in Figure 2. The finite state machine derived from an Object-Z specification is now transformed into a partial model of the class under test, called a testgraph, for testing under ClassBench.
5. Testgraph ClassBench [8, 9] is an approach to automated class testing designed especially for testing C++ collection classes. For each ClassBench test suite, the tester must provide three components: a testgraph, an Oracle class, and a Driver class. The testgraph is a finite state machine that models a subset of the possible states and transitions of the class under test (or CUT). The testgraph nodes correspond to states of the CUT and the arcs correspond to the transitions; there is one distinguished node, the start node, that represents the initial state of the CUT. The Oracle class provides the same operations as the CUT, but supports only the testgraph states and transitions. The Oracle class is discussed in the next section. The Driver class is called by the framework as the testgraph is traversed. The Driver must provide two functions. The Driver::arc() function performs the state transitions associated with each testgraph arc on both the CUT and Oracle. The Driver::node() function checks that the CUT behaviour is correct for the state corresponding to the current testgraph node. At run-time, the framework automatically traverses the testgraph, calling Driver::arc() each time an arc is traversed, and calling Driver::node() each time a node is reached. To test the Items implementation with ClassBench, we must provide a testgraph. In [19] we detail the process for deriving a class's testgraph from its formally derived finite state machine. Due to the abstractness of the finite state machine, and the relative concreteness of the testgraph, many issues must be resolved for the derivation to be possible and successful. These issues and their resolutions are also presented in [19]. The derivation of a testgraph is a four-step process.
Step 1: States → Testgraph Nodes. To begin, we map each finite state machine state to a testgraph node. We choose concrete values for the class's state variables that conform with the finite state machine's state and with the types chosen for the generic parameters in the class's implementation. We map Z integers (Z) to C++ integers (int), and for the testgraph we choose the set of integers {1, 2, 3, 4, 5}. The result of transforming the Items finite state machine states (shown in Figure 2) to testgraph nodes is shown in Figure 3. The value chosen for the Items state variable, xs, is shown above each node. Note that for TGM we have chosen a set of four integers, but any set with more than two integers would have been acceptable.
[Figure 3: nodes TG0 = {}, TG1 = {1}, TG2 = {1,2} and TGM = {1,2,3,4}.]
Figure 3. Testgraph nodes for Items.
Step 2: Choose a Start Node. A testgraph has one start node. The Items finite state machine has one initial state, ST0. The testgraph node TG0 is derived from ST0 and so becomes the start node, as shown in Figure 3.

Step 3: Reachable Nodes. In a testgraph, all nodes should be reachable. By reachable, we mean that there is a path from the start node to every other node, via a sequence of arcs. We achieve this by identifying the transitions in the finite state machine that make its states reachable. For example, in Figure 2, ST1 is reachable from ST0 by way of the AddX operation. In our testgraph, we identify the nodes derived from ST0 and ST1, TG0 and TG1 respectively, and formulate a sequence of operation calls to transform TG0 to TG1. We use the transition's operation somewhere in the sequence, preferably at the end. In this case, we achieve this with a call to AddX(1), and call this arc AX1. This process is continued until all derived nodes in the testgraph are reachable. The Items testgraph is shown in Figure 4, which also gives the operation calls associated with each arc.
[Figure 4: TG0 ({}) —AddX(1)→ TG1 ({1}) —AddX(2)→ TG2 ({1,2}) —AddX(3); AddX(4)→ TGM ({1,2,3,4}).]
Figure 4. Reachable testgraph nodes for Items.
Step 4: Arc Coverage. Each transition in the finite state machine that changes the state of the class must map to a testgraph arc. We achieve this by the same process used in Step 3 to map transitions to arcs. However, there are two special cases. One is transitions that begin and end in the same state, as AddX (AX) does for STM in Figure 2. Such arcs are not permitted in testgraphs. To overcome this problem, we derive another node from the state involved. That is, for the example just stated, we derive node TGMA from STM and give it a state that permits an AddX transition from TGM to it. The second case arises when multi-arcs occur; that is, multiple transitions with different names linking the same states. Such multi-arcs are also not permitted in testgraphs. We solve this problem in the same way: introduce a new node in the testgraph derived from the destination state. The complete, derived testgraph for Items is shown in Figure 5.
[Figure 5: the complete testgraph. TG0 ({}) and TG1 ({1}) are linked by AddX(1) / RemoveX(1); TG1 and TG2 ({1,2}) by AddX(2) / RemoveX(2); TG2 and TGM ({1,2,3,4}) by AddX(3); AddX(4) / RemoveX(3); RemoveX(4); TGM and TGMA ({1,2,3,4,5}) by AddX(5) / RemoveX(5).]
Figure 5. Testgraph for Items.

Transitions in the class's finite state machine that do not change the class's state are not mapped to arcs in the derived testgraph, but to code in Driver::node(). For Items this applies to all NoXs and IsX transitions in Figure 2. This process is described in [19].
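To illustrate how the testgraph is exercised, the sketch below outlines a Driver for Items. It is hypothetical: the exact ClassBench Driver interface, the arc and node identifiers, and the CHECKTRUE stand-in are our assumptions; the CUT and Oracle template parameters are only assumed to provide the Items operations.

// Hypothetical sketch of a ClassBench Driver for Items. The real Driver
// interface and the arc/node identifiers are assumptions; ClassBench provides
// its own CHECKTRUE macro, so the one below is only a stand-in.
#include <iostream>

#ifndef CHECKTRUE
#define CHECKTRUE(cond) do { if (!(cond)) std::cerr << "check failed: " << #cond << '\n'; } while (0)
#endif

enum Arc { AX1, RX1, AX2, RX2, AXM, RXM, AXA, RXA };  // arcs of Figure 5 (assumed labels)

template <class Cut, class Oracle>
struct ItemsDriver {
    Cut cut;
    Oracle oracle;

    // Perform the operation calls attached to a testgraph arc on both the
    // class under test and the oracle (cf. Figure 5).
    void arc(Arc a) {
        switch (a) {
            case AX1: cut.AddX(1);    oracle.AddX(1);    break;  // TG0 -> TG1
            case RX1: cut.RemoveX(1); oracle.RemoveX(1); break;  // TG1 -> TG0
            case AX2: cut.AddX(2);    oracle.AddX(2);    break;  // TG1 -> TG2
            case RX2: cut.RemoveX(2); oracle.RemoveX(2); break;  // TG2 -> TG1
            default:  /* remaining arcs of Figure 5 handled similarly */ break;
        }
    }

    // Exercise the state-preserving operations (NoXs, IsX) at each node and
    // compare the CUT's results with the oracle's.
    void node(int /*nodeId*/) {
        CHECKTRUE(cut.NoXs() == oracle.NoXs());
        for (int i = 1; i <= 5; ++i)
            CHECKTRUE(cut.IsX(i) == oracle.IsX(i));
    }
};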
6. Oracle class In the original ClassBench framework, the behaviour of the implementation is checked by an Oracle class that is written by the tester. However, McDonald has developed a method for systematically deriving a test oracle from an Object-Z specification [15]. The method is sufficiently general that the generated oracle can be used in most testing frameworks, but in this section we show how to derive an oracle for Items for use in the ClassBench test suite. The oracle we generate is a passive test oracle that checks the behaviour of the implementation, rather than reproducing this behaviour. The oracle essentially provides a wrapper class around the implementation that checks, each time an operation is called, that the operation behaves correctly. The method follows the suggestion by Hörcher [10] to check the behaviour in the abstract state space of the specification rather than the concrete state space of the implementation. To accomplish this, the method relies on an abstraction function implemented by the tester. There are two stages in mapping an Object-Z specification to a passive test oracle: optimisation and oracle translation. Optimisation is the rearrangement of the specification to simplify translation to an implementation language. Oracle translation maps the optimised specification to C++ code. The optimisation step involves changes such as flattening inheritance, schema inclusions, schema enrichments (and performing any associated renamings), replacing concurrency with equivalent sequential constructs (Griffiths [6]), and replacing constructs that are not finitely evaluable with equivalent finitely evaluable constructs (where possible). At the end of the optimisation step, if any constructs cannot
be transformed into finitely evaluable constructs, then the translation fails. For Items no optimisation is necessary. To assist in producing oracles, McDonald has developed a library of classes that implement some of the standard types of the Z mathematical toolkit [15]. This library provides implementations of the set, sequence, relation and function types. The implementations are in the form of generic classes. For example, the declaration Z_Set<X> xs; in a test oracle corresponds to the declaration xs : F X in Object-Z.
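As a small illustration (assuming only the library interface suggested by the oracle code later in this section, i.e. a default constructor for the empty set, a singleton constructor, + for union and == for equality), Z_Set values might be manipulated as follows:

// Illustrative fragment only: uses just the Z_Set operations that appear in
// the oracle code later in this section; the exact library interface is assumed.
Z_Set<int> xs;                        // corresponds to xs : F Z, initially empty
Z_Set<int> ys = xs + Z_Set<int>(3);   // union with the singleton set {3}
bool equal = (ys == Z_Set<int>(3));   // toolkit equality; true here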
To translate from an optimised specification to a passive oracle implemented in C++, the oracle inherits the implementation and augments it with code to check its behaviour. We use inheritance so that the oracle gains access to the private state of the implementation, to which it applies the abstraction function. The oracle provides the following operations:
- abs(): an abstraction function that maps the implementation state to the abstract state of the specification. It is the tester's responsibility to implement the abstraction function, and the complexity of the abstraction function depends on the complexity of the state of the implementation.
- inv(): an invariant checker that is passed the abstract state calculated by the abstraction function and that checks whether that state satisfies the invariant predicate of the specification.
- class constructor: a class constructor that checks the behaviour of the implementation's constructor.
- one operation for each operation in the implementation: the operation checks the behaviour of the corresponding operation in the implementation.
For Items, the state space in the test oracle is declared as

Z_Set<X> xs;

The abstraction function copies the implementation state to a parameter of the same type as the specification state (here to Z_Set<X>) for subsequent checking. Since there is no invariant predicate in Items, the invariant checker does nothing. The oracle constructor for Items is shown below.

template <class X>
DMSOracle<X>::DMSOracle() : DepManSys<X>()
{
    Z_Set<X> xs = Z_Set<X>();
    abs(xs);
    inv(xs);
    check_init(xs);
}

The oracle inherits DepManSys and so calls the inherited constructor. It then declares a variable xs to represent the abstract specification state and calls abs(xs) to instantiate this variable to the abstract specification state corresponding to the current concrete implementation state. The call to inv() checks that the abstract state satisfies the class invariant, and check_init() evaluates whether or not the implementation satisfies the INIT schema of the specification. In this case, it simply checks that the current state represents the empty set using the == operator provided by the Z toolkit library.

template <class X>
void DMSOracle<X>::check_init(Z_Set<X> xs)
{
    CHECKTRUE(xs == Z_Set<X>());
}

CHECKTRUE is a macro provided by the ClassBench framework that checks a boolean value and outputs an error message if the value does not represent the boolean value true (here checking the assertion that the state is an empty set). The oracle replaces each of the implementation's operations by an augmented version that calls the original operation, checks its behaviour, and signals any differences between the specified behaviour and the actual behaviour. For example, the implementation of the augmented version of AddX() is shown below.

template <class X>
void DMSOracle<X>::AddX(X x)
{
    Z_Set<X> pre_xs;
    abs(pre_xs);
    inv(pre_xs);
    DepManSys<X>::AddX(x);
    Z_Set<X> post_xs;
    abs(post_xs);
    inv(post_xs);
    check_AddX(pre_xs, post_xs, x);
}
Before calling the inherited operation, the oracle instantiates the abstract pre-state in pre_xs and calls inv() to check that it satisfies the class invariant. After calling the inherited operation, it records the abstract post-state in post_xs, calls inv() to check that the post-state satisfies the class invariant, and calls check_AddX() to check that the pre- and post-states satisfy the specification of the operation. The invariant is checked both before and after calling the CUT operation because we view the invariant as an additional pre- and postcondition on each operation. The check function for AddX() is shown below. It is a straightforward translation from the Object-Z specification of AddX to the corresponding expression using the Z toolkit library.
template <class X>
void DMSOracle<X>::check_AddX(Z_Set<X> pre_xs, Z_Set<X> post_xs, X x)
{
    CHECKTRUE(post_xs == (pre_xs + Z_Set<X>(x)));
}
For operations that do not modify the state (i.e., those with no Δ-list), the operation checker additionally checks that the pre-state and post-state are the same. The checks for the other operations are similar. Note that although the oracle code for each operation is lengthy, most of this code is the same for all operations and can be generated automatically.
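For instance, a checker for NoXs (not shown in the paper; the name check_NoXs and its parameter list are our assumption) would follow this pattern, checking the specified result and, because NoXs has no Δ-list, that the state is unchanged:

// Hypothetical checker for NoXs, following the pattern described above.
// check_NoXs and its parameters are assumptions; CHECKTRUE and Z_Set are as
// used in the oracle code earlier in this section.
template <class X>
void DMSOracle<X>::check_NoXs(Z_Set<X> pre_xs, Z_Set<X> post_xs, bool result)
{
    CHECKTRUE(result == (pre_xs == Z_Set<X>()));  // result! <=> xs = {}
    CHECKTRUE(post_xs == pre_xs);                 // no Delta-list: state unchanged
}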
7. Implementation and testing In this section we describe an independently developed implementation of the Items specification, and the results of testing it.
7.1. Implementation The implementation that we tested realises the full Dependency Management System (DMS) specification as a generic C++ class with a complex inheritance structure involving nine classes. Even though we include the full implementation in our test suite, the testing described below exercises only the Items operations.
7.2. Testing To test the implementation, we instantiate the generic DMS class to integer. The implementation was tested using ClassBench, with the testgraph described in Section 5 and the passive oracle derived in Section 6. During testing, each arc in the testgraph shown in Figure 5 was traversed once, node TGMA was visited once, and the other nodes were visited twice. No errors were detected in the implementation. However, check_AddX() from the oracle initially reported a problem that was traced to an error in the oracle's abs() function. Table 1 gives the number of uncommented lines of source code in the driver, oracle and implementation classes. Note that the 752 lines for the implementation include the implementation of all of the DMS, not just Items.
8. Related work In this section, we discuss related work on object-oriented testing based on specifications. Bosman and Schmidt [2] use finite state machines to test classes. Their method uses state machines that result from
         Driver   Oracle   Implementation
Lines       108      126              752

Table 1. Lines of code for Driver, Oracle and implementation of Items.
Statecharts [7] used in object-oriented analysis and design. A design fsm for a class is generated from the state diagrams and defines the expected outcomes of test cases. A representation fsm is an abstraction of the class's implementation and is used to drive the testing of the class. It also provides a mapping between the design and implementation of the class. Their approach is similar to ours but starts from a behavioural rather than a model-based specification. Their tool support appears less developed than ClassBench. Turner and Robson's [25] state-based testing technique uses finite state machines for test case generation. Finite state machines are constructed to model the internal representation of a class, in contrast to Bosman and Schmidt, who use a state machine constructed for object-oriented analysis and design. Turner and Robson highlight the importance of considering state in object-oriented testing. In our approach, the Test Template Framework provides access to both the state of the class and the local parameters of an operation. Testing strategies can be applied to either. ASTOOT, developed by Doong and Frankl [4], provides tools for automatic driver generation from a class interface specification and semi-automatic test case generation from algebraic class specifications. Each test case contains an original message sequence and a simplified message sequence. The two sequences should leave objects of the class under test in observationally equivalent states. However, it is left to the tester to provide a means of determining whether the states are actually equivalent. ClassBench provides this functionality and reports if non-equivalence occurs. Stepney [22] describes a method for formally specifying tests based on applying abstraction to a specification. The method uses the ZEST object-oriented variant of Z, and involves systematically introducing non-determinism into the specification. A firing condition interpretation of preconditions is used and testing information is structured as an inheritance tree of specifications. The method is supported by the Zylva tool. Kung et al. [12] describe the Object-Oriented Testing and Maintenance (OOTM) environment. OOTM uses a mathematically defined test model, derived from the source code. This highlights a limitation of the system, as it only supports program-based testing. Murphy et al. [16] describe testing of classes and clusters using the Automated Class Exerciser (ACE) tool. The
ACE tool was developed to support the testing of C++ and Eiffel classes. The tool takes a test script and generates a test harness. Each test script describes how to compile the class under test and provides code for the test cases and the expected results of the tests. The test harness runs the test cases and reports the results, with any failures being entered into a problem management system for later correction. Smith and Robson [20] present the Framework for Object-Oriented Testing (FOOT). The framework allows a class to be tested either manually or automatically, using either a test strategy or a previously defined test sequence. FOOT provides several testing strategies, including exhaustive testing of method calls, testing for memory leaks, and testing that sequences of operations that should be identities do actually leave the state unchanged.
9. Conclusion Our method for testing, based on object-oriented formal specifications, has been demonstrated by a case study. The case study, although small, illustrates how the various tasks use information derived from the Object-Z specification to construct practical tests that can be executed in the ClassBench framework. The Object-Z specification is also used to derive a test oracle for ClassBench. Our method achieves a beneficial synergy between the two tasks of software testing: test generation and test execution. It links the existing Test Template Framework for test generation with the ClassBench test execution framework. Future work will investigate tool support for our method of testing from object-oriented formal specifications.
10. Acknowledgements The Dependency Management System (DMS) was originally specified in Object-Z by Gordon Rose in at least two different versions. Our Object-Z specification is based on a Z specification by Phil Stocks [23] which was derived from Gordon's. The implementation used in Section 7 was originally developed by Wendy Johnston [11] under Unix from one of the original Object-Z specifications. Minor modifications have been made to handle a change of operating system to Microsoft Windows 95 and a compiler change to Borland C++ 4.5. We also thank Jason McDonald for his assistance with ClassBench to help us execute our tests, and for reading drafts of this paper. We also thank the anonymous referees for their helpful suggestions. This work is supported by Australian Research Council Grant A4-96-00282. Ian MacColl is supported by an Australian Postgraduate Award and a Telstra Research Laboratories Postgraduate Fellowship.
References
[1] B. Beizer. Software Testing Techniques. Van Nostrand Reinhold, 2nd edition, 1990.
[2] O. Bosman and H. W. Schmidt. Object test coverage using finite state machines. In Technology of Object-Oriented Languages and Systems (TOOLS 18), pages 171–178. Prentice-Hall, 1995.
[3] J. Dick and A. Faivre. Automating the generation and sequencing of test cases from model-based specifications. In J. C. P. Woodcock and P. G. Larsen, editors, FME '93, volume 670 of LNCS, pages 268–284. Springer-Verlag, 1993.
[4] R.-K. Doong and P. G. Frankl. The ASTOOT approach to testing object-oriented programs. ACM Transactions on Software Engineering and Methodology, 3(2):101–130, 1994.
[5] R. Duke, G. Rose, and G. Smith. Object-Z: A specification language advocated for the description of standards. Computer Standards & Interfaces, 17:511–533, 1995.
[6] A. Griffiths. From Object-Z to Eiffel: Towards a rigorous development method. In Technology of Object-Oriented Languages and Systems (TOOLS 18), pages 293–307. Prentice-Hall, 1995.
[7] D. Harel. Statecharts: A visual formalism for complex systems. Science of Computer Programming, 8(3):231–274, 1987.
[8] D. Hoffman and P. Strooper. The testgraph methodology: Automated testing of collection classes. Journal of Object-Oriented Programming, 8(7):35–41, 1995.
[9] D. Hoffman and P. Strooper. ClassBench: A methodology and framework for automated class testing. Software—Practice and Experience, 27(5):573–597, 1997. Also SVRC TR96-03.
[10] H.-M. Hörcher. Improving software tests using Z specifications. In ZUM '95, volume 967 of LNCS, pages 152–166. Springer-Verlag, 1995.
[11] W. Johnston and G. Rose. Guidelines for the manual conversion of Object-Z to C++. Technical Report TR93-32, Software Verification Research Centre, The University of Queensland, 1993.
[12] D. Kung, P. Hsia, J. Z. Gao, Y. Toyoshima, Chris Chen, Young-Si Kim, and Young-Kee Song. Developing an object-oriented software testing and maintenance environment. Communications of the ACM, 38(10):75–87, 1995.
[13] P. Lindsay. The Dependency Management System case study. Technical Report 94-01, Software Verification Research Centre, The University of Queensland, 1994.
[14] I. MacColl, L. Murray, P. Strooper, and D. Carrington. Specification-based object-oriented testing: A case study. Technical Report 98-08, Software Verification Research Centre, The University of Queensland, 1998.
[15] J. McDonald. Translating Object-Z specifications to passive test oracles. In International Conference on Formal Engineering Methods (ICFEM98), 1998. Also SVRC TR98-04.
[16] G. C. Murphy, P. Townsend, and P. Wong. Experiences with class and cluster testing. Communications of the ACM, 37(9):39–47, 1994.
[17] L. Murray, D. Carrington, I. MacColl, J. McDonald, and P. Strooper. Formal derivation of finite state machines for class testing. In ZUM '98, volume 1493 of LNCS. Springer-Verlag, 1998. Also SVRC TR98-03.
[18] L. Murray, D. Carrington, I. MacColl, and P. Strooper. Extending Test Templates with inheritance. In Australian Software Engineering Conference (ASWEC97), pages 80–87. IEEE Computer Society, 1997.
[19] L. Murray, J. McDonald, and P. Strooper. Specification-based class testing with ClassBench. In Asia-Pacific Software Engineering Conference (accepted), 1998.
[20] M. D. Smith and D. J. Robson. A framework for testing object-oriented programs. Journal of Object-Oriented Programming, 5(3):45–53, 1992.
[21] J. M. Spivey. The Z Notation: A Reference Manual. Prentice Hall, 2nd edition, 1992.
[22] S. Stepney. Testing as abstraction. In ZUM '95, volume 967 of LNCS, pages 137–151. Springer-Verlag, 1995.
[23] P. Stocks. Applying formal methods to software testing. PhD thesis, Department of Computer Science, The University of Queensland, 1993.
[24] P. Stocks and D. Carrington. A framework for specification-based testing. IEEE Transactions on Software Engineering, 22(11):777–793, 1996.
[25] C. D. Turner and D. J. Robson. A state-based approach to the testing of class-based programs. Software—Concepts and Tools, 16:106–112, 1995.
[26] Z notation. ftp://ftp.comlab.ox.ac.uk/pub/Zforum/ZSTAN/versions/, 1995. Version 1.2. Committee Draft of proposed ISO standard.