A GUI and Testing Tool for SOFL∗

Shaoying Liu†, Tetsuo Fukuzaki‡, Koji Miyamoto
†Hosei University, Japan   ‡IBM Japan   Hiroshima City University, Japan
Abstract

SOFL is a formal language and method for system specification and design. As a language it is an integration of Petri nets, Data Flow Diagrams, and VDM-SL. As a method it is a combination of the structured method and the object-oriented method. In this paper we introduce a graphical user interface (GUI) for supporting the construction of SOFL specifications, and a tool for testing SOFL specifications. With these tools it becomes easy to construct a SOFL specification that consists of Condition Data Flow Diagrams and specification modules, and to ensure its consistency and validity.

Keywords: Structured Methods, Object-Oriented Methods, Formal Engineering Methods, SOFL, Data Flow Diagrams.
1 Introduction
SOFL (Structured-Object-Oriented Formal Language) is a comprehensible formal language for specification and design that combines advantages of Petri nets, Data Flow Diagrams, and VDM-SL. It is also a method that integrates the structured method [1], the object-oriented method [2], and the formal method [3] for specification construction. In general, structured methods offer an effective way to achieve functional abstraction, modularity, and a stepwise approach, while object-oriented methods are effective in achieving data abstraction, encapsulation, information hiding, and reusability. Due to the precise semantics of mathematical notation, formal methods are helpful in achieving precise definitions of data and operations. All three approaches are complementary in producing reliable, good-quality software systems.

SOFL has been applied to model a Hotel Management System [4] and a Railway Crossing Controller [5], and to develop a University Information System [6]. It has also been used to develop SOFL tools [7]. However, since specifications usually contain condition data flow diagrams (CDFDs), a graphical notation representing the overall architecture of the system, and often need to be modified and reviewed, manipulating specifications without an effective graphical user interface (GUI) can be very inefficient and error-prone. Also, we have developed a technique for testing SOFL specifications [8], but without tool support it can be very difficult to generate test cases, manage test cases, and evaluate specifications with test cases, and their correctness may not be ensured. These difficulties have motivated us to build a GUI and a testing tool for SOFL. Both are written in Java for portability.
∗This work is supported in part by the Ministry of Education of Japan under Grant-in-Aid for Scientific Research (B) (No. 11694173) and (C) (No. 11680368).
The GUI we have built provides several functions: drawing of CDFDs, editing of CDFDs, automatic generation of a module outline from a CDFD, automatic generation of the hierarchy of CDFDs, and saving, loading, and printing of CDFDs. The testing tool, called SOSTEM (SOfl Specification TEsting Machine), is designed using SOFL and implemented in Java. It currently supports five functionalities: generation of proof obligations, interactive testing, batch testing, test results analysis, and test information management. The tool can be used for both unit testing and integration testing.

The remainder of this paper is organized as follows. Section 2 gives a brief introduction to SOFL. Section 3 explains the major features of the SOFL GUI. Section 4 describes the most important functions of the testing tool SOSTEM, and Section 5 presents a small case study for evaluating its efficiency and effectiveness in supporting specification testing. Finally, Section 6 gives conclusions and outlines future work.
2 A Brief Introduction to SOFL
A SOFL specification consists of a collection of inter-related specification modules and classes. A specification module, or module for short, is a special class in the sense that it has only one instance (object). A class is a collection of objects. In a class or module, state variables (like attributes of objects in Java) and processes (like methods in Java) that access and/or change the state can be defined. Modules are mainly used to express the architecture of the system, while classes are used for data abstraction, which is necessary for defining the components of the architecture.

A module or class can have any name, but the top level module must be named in the form SYSTEM_Name, where SYSTEM is the keyword and Name is a specific name for the particular system, such as SYSTEM_Ticket. A module has a CDFD to represent its entire behaviour, and every process occurring in the CDFD must be either defined using pre and postconditions, decomposed into a next lower level CDFD, or realized as a call to a process of an existing object. An execution (or firing) of the entire system must start with the execution of the CDFD of the top level module SYSTEM_Name. Objects of classes and modules interact through their processes. Classes can have sub-classes that inherit all the state variables and processes defined in their parent classes, but SOFL only allows single inheritance for simplicity in specification and verification.

The overall structure of a SOFL specification is outlined in Figure 1. The top level CDFD given in Figure 1(a) is associated with the top level module SYSTEM_Name. Each process occurring in this CDFD, such as A1 and A2, is defined by pre and postconditions in the module SYSTEM_Name. All constant identifiers, type identifiers, and state variables necessary for the definitions of the processes must be declared in the corresponding sections within this module. A process in a module, such as A2, can be decomposed into a lower level module whose behaviour is described by a lower level CDFD, such as A2-decom. All the processes occurring in this CDFD, such as B1, B2, and B3, are defined in the same module. Some of them can also be existing processes defined in classes. In this way, an entire system can be constructed as a hierarchy of modules.

To achieve data abstraction, polymorphism, information hiding, and specification reuse, classes can be defined, such as S1 and S2. They offer services only through their processes. A process defined in a class is similar to a process defined in a module in terms of its semantics, but with some differences in usage. For example, a process defined in a class can be decomposed into a lower level module whose CDFD can only be composed of existing processes of objects. The processes in a class can be applied in the specifications of processes defined in modules or other classes.
[Figure 1 content: (a) the top level module SYSTEM_Name (with const, type, var, and inv sections and processes A1 and A2) and its CDFD containing A1 and A2; (b) the decomposed module A2-decom with processes B1, B2, and B3 and its CDFD; the classes S1 and S2, each with const, type, var, and inv sections, processes P1, P2, P3 and Q1, Q2, Q3, respectively, and behaviours given by CDFD-a and CDFD-b.]
Figure 1: The structure of a SOFL specification
The notion of decomposition is a typical concept in structured methods, while the notion of composition is a typical concept in object-oriented methods. SOFL supports the combination of these two notions.
3 Major Features of SOFL GUI
The SOFL GUI aims to support the drawing of CDFDs, the modification of CDFDs, the display of the entire hierarchy of CDFDs, and the automatic generation of the corresponding specification module for a specific CDFD, and to provide all the necessary information for other related activities such as the testing of specifications. To this end, we have designed the GUI so that it presents both a CDFD and its specification module simultaneously. It also clearly shows the component panel from which the user selects the component to draw. To support other functions of the entire SOFL supporting environment, the GUI also provides a menu bar which can be extended to integrate other important functions in the future, such as testing, reviewing, documentation, code generation, and so on. Figure 2 shows the overall structure of the current SOFL GUI.

Drawing of CDFDs includes the drawing of processes, data flows, data stores, conditional nodes, broadcasting nodes, exclusive-or nodes, merging nodes, and resolving nodes. The component panel lists the major components of a CDFD; once a component pattern is selected on the panel, that kind of component can be drawn within the CDFD window. When a process is drawn, the process node is automatically generated, the name of the process is temporarily assigned by the system, and the input and output ports are initially set to a single port, ready for any necessary changes. When a component is drawn, the correctness of its syntax and of its connections to other components (e.g., a data store is connected to a process) is automatically ensured. If these do not match the user's intention, they can be modified.

The editing of a CDFD includes changing the name of a process, a data flow, or a data store. It also includes resizing a process, creating more output ports for a process, and moving, copying, and deleting components. Figure 3 shows the feature of decomposing a high level process.

The rationale for the automatic generation of a specification module outline is to enhance productivity in specification construction and to reduce opportunities for mistakes.
Figure 2: The overall structure of the SOFL GUI
Of course, it is usually impossible to automatically generate a complete specification module from a CDFD, since no details of the function of each process are given in the CDFD. However, the generation of the specification module outline does help the user (of the GUI) understand what needs to be defined (e.g., inputs, outputs, precondition, postcondition); a sketch illustrating this idea is given below. Automatic generation of the hierarchy of the CDFDs for the entire system, or part of it, helps the user easily find the desired processes during the modification of specifications. The user can freely access any level of CDFD in the hierarchy. The idea of organizing the CDFD hierarchy is similar to that of the UNIX file system.

In addition to the above functions, the GUI also provides the usual functions such as saving, loading, and printing of CDFDs. We are still working on the generation of PostScript files for SOFL specifications, which will allow us to include SOFL specifications in LaTeX source files.

In the long run we would like to develop this GUI into an intelligent graphical user interface that supports not only the construction of CDFDs, but more importantly the SOFL method for building systems. By an intelligent tool we mean a tool that can guide the user to develop his or her system in a systematic fashion. Such a tool should possess the necessary knowledge about the specific notation, method, and perhaps application domain. It should be able to lead the user, in an interactive manner, to the final system step by step. Thus, it would reduce opportunities for mistakes in building systems and enhance productivity. Figure 4 shows the feature of displaying the CDFD hierarchy.
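To make the idea of module outline generation concrete, the following is a minimal Java sketch of how an outline might be produced from the processes drawn in a CDFD. The CdfdProcess type, the outline method, and the generated text layout are illustrative assumptions for this discussion, not the GUI's actual implementation.

import java.util.List;

public class ModuleOutlineSketch {

    // A process node as it might appear in a drawn CDFD: a name plus typed input/output ports.
    // This type is a hypothetical stand-in for the GUI's internal representation.
    record CdfdProcess(String name, List<String> inputs, List<String> outputs) {}

    // Produce a module skeleton: declaration sections plus one process outline per CDFD process,
    // with the pre and postconditions left for the user to complete.
    static String outline(String moduleName, List<CdfdProcess> processes) {
        StringBuilder sb = new StringBuilder("module " + moduleName + ";\n");
        sb.append("const\ntype\nvar\ninv\n\n");
        for (CdfdProcess p : processes) {
            sb.append("process ").append(p.name())
              .append("(").append(String.join(", ", p.inputs())).append(") ")
              .append(String.join(", ", p.outputs())).append("\n")
              .append("  pre  /* to be completed by the user */\n")
              .append("  post /* to be completed by the user */\n")
              .append("end-process;\n\n");
        }
        sb.append("end-module;\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        // Two processes as they might appear in a top level CDFD.
        List<CdfdProcess> cdfd = List.of(
                new CdfdProcess("A1", List.of("x: nat"), List.of("y: nat")),
                new CdfdProcess("A2", List.of("y: nat"), List.of("z: nat")));
        System.out.println(outline("SYSTEM_Name", cdfd));
    }
}

Running the sketch prints a module skeleton whose pre and postconditions remain to be filled in by the user, which is the kind of outline the GUI derives from a drawn CDFD.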
Figure 3: The GUI feature for decomposing a high level process
Figure 4: The GUI feature for displaying the CDFD hierarchy
4 The Testing Tool SOSTEM
The first author has developed a technique for testing formal specifications, including SOFL specifications. Such a specification is usually defined by pre and postconditions and is not executable. Therefore, traditional program testing techniques cannot be applied to specification testing without change. As pointed out in the first author's previous publication on specification testing [8], a technique for testing formal specifications, especially implicit specifications such as those in VDM and Z, should combine the principles of formal proof and program testing. The application of the proof principle leads to the derivation of a proof obligation telling what to prove, while the program testing approach offers the idea of selecting test cases to "discharge" the proof obligation. The purpose of testing specifications is to help detect faults in the specifications produced in the early phases of software development, so that costly and unnecessary effort can be avoided in the later phases.

Testing a specification involves three steps. The first is the generation of test sets (a test set is a set of test cases, and a test case is an input to the process); the next step is to evaluate the specification with the generated test sets; and the final step is to analyze the test results to determine whether faults have been detected by the test. The testing tool SOSTEM is designed to support these three activities, as well as test case management. Specifically, SOSTEM offers the following major features: generation of proof obligations, interactive testing, batch testing, test results analysis, and test information management. The tool can be used for both unit testing and integration testing. Figure 5 illustrates the overall structure of the tool.
Figure 5: The overall structure of the supporting tool
4.1 Generation of Proof Obligations
Since proof obligations for consistency properties are the targets of testing for ensuring the consistency of a specification, their generation is an essential step in the testing of formal specifications. As proof obligations can be derived from the syntax of the specification, the tool can automatically generate the proof obligation for a given invariant, condition process, or construct. Specifically, the tool reads the specification file that contains the specification to be tested and automatically generates the proof obligation. It then prompts the tester, the user of the tool, to choose the mode of testing (e.g., interactive testing or batch testing).
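As an illustration only, and not necessarily the exact formulation generated by SOSTEM, the satisfiability proof obligation for a process with input x, output y, precondition pre, and postcondition post can be sketched in LaTeX as follows; a test case supplies a concrete x (and, for an implicit specification, a candidate y) with which the two sides of the implication are evaluated.

\[
  \forall x \cdot \mathit{pre}(x) \Rightarrow \exists y \cdot \mathit{post}(x, y)
\]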
4.2 Interactive Testing
To carry out interactive testing, the tester provides both the test target and the test cases interactively through the graphical user interface. After obtaining a test case from the tester, the tool evaluates the proof obligation derived automatically from the given test target and displays the test result, including the intermediate evaluation results. The tool then prompts the tester to choose between continuing and terminating the testing. Interactive testing goes on until the tester chooses to terminate it.

While accepting a test case, the tool automatically checks whether it matches the types of the corresponding variables given in the test target. For example, the input {2, 3, 7} for a variable x of type nat will be rejected by the tool, since it is a set value rather than a natural number. In that case, the tool displays an error message and prompts the tester to provide another test case. Figure 5 shows the user interface for inputting test cases for interactive testing of the condition process Withdraw given below, which defines a simplified function for withdrawing cash from a personal bank account.

type
  Account = composed of
              name: string
              number: nat
              a-type: nat
              balance: nat
            end;
process Withdraw(passwd: nat, amount: nat)
ext wr tmp-account: Account
pre  tmp-account.name <> " " and passwd = tmp-account.number and
     tmp-account.balance >= amount
post tmp-account = modify(~tmp-account, balance -> ~tmp-account.balance - amount)
end-process;
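To show what evaluating one test case against this process amounts to, the following is a minimal, self-contained Java sketch. The Account class, the pre and post methods, and the chosen test values are illustrative assumptions; they mirror the specification above but are not SOSTEM's implementation.

public class WithdrawTestSketch {

    // Simplified Java counterpart of the SOFL composite type Account.
    static final class Account {
        String name;
        int number;   // compared with the password in the precondition, as in the specification
        int aType;
        int balance;

        Account(String name, int number, int aType, int balance) {
            this.name = name;
            this.number = number;
            this.aType = aType;
            this.balance = balance;
        }
    }

    // Precondition of Withdraw: non-blank name, matching password, sufficient balance.
    static boolean pre(Account acc, int passwd, int amount) {
        return !acc.name.trim().isEmpty()
                && passwd == acc.number
                && acc.balance >= amount;
    }

    // Postcondition of Withdraw: the balance is decreased by amount, the other fields are unchanged.
    static boolean post(Account before, Account after, int amount) {
        return after.balance == before.balance - amount
                && after.name.equals(before.name)
                && after.number == before.number
                && after.aType == before.aType;
    }

    public static void main(String[] args) {
        // One test case: the initial account state plus the two inputs.
        Account before = new Account("Smith", 1234, 1, 500);
        int passwd = 1234;
        int amount = 200;

        // Expected after-state supplied by the tester (the specification is not executable).
        Account after = new Account("Smith", 1234, 1, 300);

        boolean preHolds = pre(before, passwd, amount);
        boolean postHolds = post(before, after, amount);

        System.out.println("pre  = " + preHolds);
        System.out.println("post = " + postHolds);
        // The implication pre => post is the condition checked for this test case.
        System.out.println("test " + (!preHolds || postHolds ? "passed" : "failed"));
    }
}

For this test case both the precondition and the postcondition evaluate to true, so the implication holds and the test would be reported as successful.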
4.3 Batch Testing
When carrying out batch testing, the tester stores all the necessary test cases in a specified file, called the test case file. After receiving the command for batch testing, the tool automatically derives the proof obligation for the test target stored in the specification file and then evaluates it with the test cases in the test case file. After finishing the test, the tool puts the test result in a file, called the test result file, and then displays the content of this file through the user interface of the tool. A sketch of this flow is given below.
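The Java sketch below illustrates the batch flow under stated assumptions: the comma-separated layout of the test case file (passwd, amount, balance-before, balance-after), the file names, and the fixed account number are hypothetical choices for illustration, not SOSTEM's actual format.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class BatchTestSketch {

    public static void main(String[] args) throws IOException {
        List<String> results = new ArrayList<>();
        // Each non-blank line of the (hypothetical) test case file encodes one test case.
        for (String line : Files.readAllLines(Path.of("withdraw-testcases.txt"))) {
            if (line.isBlank()) continue;
            String[] f = line.split(",");
            int passwd    = Integer.parseInt(f[0].trim());
            int amount    = Integer.parseInt(f[1].trim());
            int balBefore = Integer.parseInt(f[2].trim());
            int balAfter  = Integer.parseInt(f[3].trim());

            // Evaluate the pre- and postcondition of Withdraw for this test case
            // (the account number 1234 is fixed here purely for the sketch).
            boolean pre  = passwd == 1234 && balBefore >= amount;
            boolean post = balAfter == balBefore - amount;
            String verdict = !pre || post ? "passed" : "failed";
            results.add(line + " -> pre=" + pre + ", post=" + post + ", " + verdict);
        }
        // The collected verdicts go into the test result file, which the tool would then display.
        Files.write(Path.of("withdraw-testresults.txt"), results);
    }
}

Keeping the test case file separate from the test result file leaves the test cases reusable, which matches the reuse facility described next.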
As this functionality is similar to interactive testing except for the way test cases are read, we do not describe it further for brevity. To support the demonstration of system functionalities (e.g., for acceptance testing), the tool also supports the reuse of test cases previously used for interactive and batch testing. That is, the tool allows the tester to reuse existing test cases to test existing test targets, either in an interactive or a batch manner.
4.4 Test Results Analysis
Test results analysis is an important step in determining whether faults have been detected by the test. The tool can automatically analyze test results by applying the proof obligation for the consistency property or the one for validity, depending on which kind of testing is conducted (consistency testing or validity testing). It displays corresponding messages to indicate the nature of the test: failed test, successful test, or no-confident test (undetermined). Figure 6 illustrates the test result of testing the condition process Withdraw given previously, together with the result of its analysis.
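One plausible way to map evaluation outcomes onto these three verdicts is sketched below in Java; the three-valued Truth type and the classification rule are assumptions made for illustration only, and SOSTEM's actual analysis of proof obligations may differ.

public class ResultAnalysisSketch {

    enum Truth { TRUE, FALSE, UNDETERMINED }
    enum Verdict { SUCCESSFUL, FAILED, NO_CONFIDENT }

    // pre and post are the evaluated truth values of the two sides of the proof
    // obligation for one test case.
    static Verdict classify(Truth pre, Truth post) {
        if (pre == Truth.UNDETERMINED || post == Truth.UNDETERMINED) {
            return Verdict.NO_CONFIDENT;   // the evaluation could not be decided
        }
        if (pre == Truth.TRUE && post == Truth.FALSE) {
            return Verdict.FAILED;         // the obligation pre => post is violated
        }
        return Verdict.SUCCESSFUL;         // the obligation holds for this test case
    }

    public static void main(String[] args) {
        System.out.println(classify(Truth.TRUE, Truth.FALSE));        // FAILED
        System.out.println(classify(Truth.TRUE, Truth.TRUE));         // SUCCESSFUL
        System.out.println(classify(Truth.UNDETERMINED, Truth.TRUE)); // NO_CONFIDENT
    }
}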
Figure 6: The test and analysis results for Withdraw
4.5 Test Information Management
Testing a complex system usually involves many test targets and needs a large number of test cases. It also generates many pieces of information concerned with the test, such as test results and analysis results. By test management we mean the management of those necessary pieces of information.
Table 1: The evaluation of the case study

System     Number of testing targets    Fault detection rate    Number of other faults
Triangle   11                           100%                    1
HMS        12                           82%                     0
To support the testing activities efficiently, the management of the test targets, test cases, test results, and the analysis results of the test results, which are together called test information, becomes a very important issue to address. We organize test information in a test information database. The database can be perceived as a set of quadruples (Tt, Tc, Tr, Ta), where Tt is a test target, Tc is a test set (or suite) for the testing of the target, Tr is a set of test results, and Ta is a set of analysis results of the test results given in Tr. As testing is conducted, the test information database may change. With the test information management functionality, the testing tool can automatically update the database to meet the needs of the corresponding testing activities (e.g., interactive testing, batch testing, consistency testing). A sketch of this database structure is given below.
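A minimal Java sketch of this quadruple view of the database follows; the record fields, the String representations, and the method names are illustrative assumptions rather than SOSTEM's actual data model.

import java.util.ArrayList;
import java.util.List;

public class TestInformationSketch {

    // One quadruple (Tt, Tc, Tr, Ta) of the test information database.
    record TestInformation(
            String target,                 // Tt: the test target (e.g., an invariant or a condition process)
            List<String> testCases,        // Tc: the test set used for the target
            List<String> testResults,      // Tr: the results of evaluating the test cases
            List<String> analysisResults   // Ta: the analysis results of the results in Tr
    ) {}

    // The database itself: a collection of quadruples that testing activities update.
    private final List<TestInformation> entries = new ArrayList<>();

    void add(TestInformation info) {
        entries.add(info);
    }

    List<TestInformation> entriesFor(String target) {
        return entries.stream().filter(e -> e.target().equals(target)).toList();
    }

    public static void main(String[] args) {
        TestInformationSketch db = new TestInformationSketch();
        db.add(new TestInformation("Withdraw",
                List.of("passwd=1234, amount=200"),
                List.of("pre=true, post=true"),
                List.of("successful test")));
        System.out.println(db.entriesFor("Withdraw"));
    }
}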
5 A Small Case Study
We have conducted a small case study to gain knowledge about the usability and effectiveness of the testing approach presented in this paper. We chose the SOFL specifications for Triangle and the Hotel Management System (HMS), described in the first author's previous publications [9] and [4], respectively, as the test targets for unit testing. In addition to detecting possible faults in the original specifications, our aim is to evaluate, albeit on a small scale, the usability and effectiveness of the testing approach and the tool. To this end, we take the mutation testing approach [10, 11]. We first create mutants (a mutant is a changed condition process specification) from the original specifications, by either changing the specification structure without creating faults or inserting faults that violate the consistency properties, and then test the mutants with the testing strategies described above. The mutants involving faults are called faulting mutants. For brevity, we do not provide the details of the specifications and test cases used in this case study; the details are described in [7].
5.1 Evaluation of the Testing Approach
In order to make the evaluation as rigorous and trustworthy as possible, the first author created all the mutants without showing the second author where the faults were inserted. The second author then applied the strategies to test the mutants and the original specifications. After finishing the testing of all the mutants and the original specifications, both authors checked together how many inserted faults had been found by the test. In addition, we also checked how many faults existing in the original specifications were found.

Table 1 shows the result of the case study. System indicates which system is tested. Number of testing targets indicates how many components (invariants or condition processes) are tested; a test target in this case can be an original specification, a mutant without faults, or a faulting mutant. Fault detection rate is the number of faulting mutants detected through the testing divided by the total number of mutants created. Number of other faults is the number of faults that exist in the original specifications and that differ from the faults involved in the created mutants.
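Restating the definition above as a formula, the fault detection rate reported in Table 1 is:

\[
  \textit{fault detection rate} \;=\;
  \frac{\textit{number of faulting mutants detected by the test}}
       {\textit{total number of mutants created}}
\]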
According to Table 1, the result of this case study is quite positive regarding the effectiveness of the testing approach proposed in [8]. We detected all of the faulting mutants created for the Triangle specification and 82% of the faulting mutants created for the HMS. In addition, we also detected one fault in the original Triangle specification related to its validity. Although this is a small scale case study, it has helped us demonstrate the usefulness of both the proposed method and the tool. As far as the efficiency of SOSTEM is concerned, we carried out an experiment of batch testing on the Triangle specification using up to one thousand test cases. The relationship between the number of test cases and the time consumed in evaluating them is illustrated in Figure 7.
Figure 7: The relationship between the number of test cases and the time consumed.
5.2 Findings
We also found some interesting points during this case study. Firstly, many faults were potentially detected during test case generation rather than after testing, because we were forced to review the specification thoroughly while generating test cases. However, these faults could not be confirmed until the specifications were tested using SOSTEM. Secondly, generating test cases is time-consuming, although not difficult. In particular, this problem becomes evident when specifications involve variables of compound data types (e.g., set of seq of composed of). In many cases, test cases could potentially be generated automatically. Finally, the use of the tool is definitely necessary for ensuring the correctness of test case generation and expected results. During the testing of the HMS, some expected faulting mutants turned out to be consistent specifications according to the decision made by SOSTEM. After careful analysis of those mutants, we found that the expected faults were actually not faults. In this sense, the use of the tool appears to be very effective in providing reliable testing.
6 Conclusions and Future Research

6.1 Conclusions
We have described a graphical user interface and a testing tool, SOSTEM, for SOFL. The former efficiently helps the user construct, modify, and review a specification, while the latter provides effective assistance to the user in testing the specification. We have conducted a small case study to evaluate the effectiveness and efficiency of the tools and the related SOFL specification construction
and testing techniques. Both tools have been used in our current Formal Engineering Methods for Software Development project supported by the Ministry of Education of Japan.

6.2 Future Research

Extending the functionalities of the current GUI and SOSTEM remains our future research. We aim to build an intelligent CASE tool for SOFL that possesses the necessary knowledge about the SOFL notation and method, and that can guide the user in developing his or her specifications in an interactive manner.
7 Acknowledgements
We would like to thank the Ministry of Education of Japan for their research grants to support this work.
References

[1] Edward Yourdon. Modern Structured Analysis. Prentice Hall International, Inc., 1989.

[2] Bertrand Meyer. Object-Oriented Software Construction. Prentice Hall International Series in Computer Science, 1988.

[3] Cliff B. Jones. Systematic Software Development Using VDM. Prentice-Hall International (UK) Ltd., 1990.

[4] Shaoying Liu, A. Jeff Offutt, Chris Ho-Stuart, Yong Sun, and Mitsuru Ohba. SOFL: A Formal Engineering Methodology for Industrial Applications. IEEE Transactions on Software Engineering, 24(1):337–344, January 1998. Special Issue on Formal Methods.

[5] Shaoying Liu, Masashi Asuka, Kiyotoshi Komaya, and Yasuaki Nakamura. An Approach to Specifying and Verifying Safety-Critical Systems with Practical Formal Method SOFL. In Proceedings of the Fourth IEEE International Conference on Engineering of Complex Computer Systems (ICECCS'98), pages 100–114, Monterey, California, USA, August 10-14, 1998. IEEE Computer Society Press.

[6] Shaoying Liu, Masaomi Shibata, and Ryuichi Sat. Applying SOFL to Develop a University Information System. In Proceedings of the 1999 Asia-Pacific Software Engineering Conference (APSEC'99), pages 404–411, Takamatsu, Japan, December 6-10, 1999. IEEE Computer Society Press.

[7] Tetsuo Fukuzaki. A Supporting Environment and Evaluation of Formal Specification Testing Techniques. M.Sc. thesis (in Japanese), Department of Computer Science, Faculty of Information Sciences, Hiroshima City University, February 2000.

[8] Shaoying Liu. Verifying Consistency and Validity of Formal Specifications by Testing. In Jeannette M. Wing, Jim Woodcock, and Jim Davies, editors, Proceedings of the World Congress on Formal Methods in the Development of Computing Systems, Lecture Notes in Computer Science, pages 896–914, Toulouse, France, September 1999. Springer-Verlag.

[9] A. Jeff Offutt and Shaoying Liu. Generating test data from SOFL specifications. The Journal of Systems and Software, 49(1):49–62, December 1999.

[10] R. A. DeMillo, R. J. Lipton, and F. G. Sayward. Hints on test data selection: Help for the practicing programmer. IEEE Computer, 11(4), April 1978.
[11] S. C. P. F. Fabbri, J. C. Maldonado, P. C. Masiero, M. E. Delamaro, and E. W. Wong. Mutation Analysis Applied to Validate Specifications based on Petri Nets. In Proceedings of the 8th International Conference on Formal Description Techniques (FORTE’95), pages 329–337, Quebec, Canada, October 1995.