Test Design Patterns for Embedded Systems

Justyna Zander-Nowicka (1), Abel Marrero Pérez (2), Ina Schieferdecker (1,3), Zhen Ru Dai (1)

(1) Fraunhofer FOKUS, Kaiserin-Augusta-Allee 31, 10589 Berlin, Germany
(2) Technische Universität Berlin, DaimlerChrysler Automotive IT Institute, Ernst-Reuter-Platz 7, 10587 Berlin, Germany
(3) Technische Universität Berlin, Faculty IV, Straße des 17. Juni 135, 10623 Berlin, Germany
Abstract

Test suites for embedded systems are typically created from scratch using different, often inadequate, methods. In consequence, industry branches dealing with software-intensive embedded systems have to cope with quality problems, even though test processes are particularly time-consuming and costly. Based on an evolving model-based testing methodology, we introduce test design patterns that simplify and accelerate the test specification process. Using patterns, complex test objectives are covered with less effort and test cases contain fewer design errors.
1 Test Specification for Embedded Systems (1)

A reasonable and potentially successful test specification method for embedded systems requires an understanding of both the system under test (SUT) and the test engineering areas. For that purpose, we first characterise the domain of embedded systems and then introduce the methods and test patterns for testing such systems. Our goal is to support dynamic black-box software testing.

1.1 Testing within the Embedded Systems Domain
Software in embedded systems is gradually becoming even more important than hardware [SZ06], due to the creation of new innovative functions based on software and to the implementation in embedded software of functions formerly integrated mechanically or electronically. The definition of embedded systems has often been a source of misunderstanding among experts. By such systems we mainly mean embedded software of a certain functional complexity, exhibiting hybrid behaviour (due to continuous and discrete signals) and constrained by temporal conditions. Additionally, in this paper we focus on systems within the automotive domain, where several software variants address the same functionality in different models. Thus, the configuration (i.e. parameterization) of the test system is a complex task. As a result, a test method for embedded systems has to cope with all these obstacles to provide efficiency and effectiveness. An automated, but systematic and consistent test specification method is needed. The test execution should proceed automatically. The proposed technique must handle discrete and continuous signals, as well as timing constraints. We assume that the requirements are available in the form of a textual specification and that an executable model of the SUT exists. Moreover, we support testing on the model level, when neither code nor hardware exists yet. Thereby, testing starts already in the design phase, which allows for resource and cost reduction during the entire software development process. Finally, test reusability is an important quality issue, especially if a number of similar SUT variants have to be tested over and over again. The demand for reusability leads to the concept of patterns. A pattern is a general solution to a specific recurring design problem. It explains the insight and good practices that have evolved to solve the problem. It provides a concise definition of common elements, context, and essential requirements for a solution, as discussed in [Ale79] [VS04] [Neu04] [PTD05]. Patterns were first used to describe constellations of structural elements in buildings and towns [Ale79]. Moreover, patterns contribute to quality characteristics such as maintainability and efficiency, due to the common and understandable vocabulary for problem solutions that they provide [Neu04].

(1) This work is partially supported by the ITEA2 within the project Deployment of Model-based Technologies to Industrial Testing, http://www.d-mint.org/.

1.2 Introduction to Test Patterns
Test patterns [VS04] represent a form of reuse in test development in which the essences of solutions and experiences gathered in testing are extracted and documented, so as to enable their application in similar contexts that might arise in the future. Test patterns aim at capturing test design knowledge from past projects in a canonical form, so that future projects can benefit from it. The author of [Bert07] considers efficacy-maximizing test engineering and argues that good testing practices need to be collected by a systematic effort, so as to organize recurring effective solutions into a catalogue of test patterns, similarly to what is now a well-established scheme for design approaches. Test pattern extraction [VS04] is the process of abstracting from existing problem-solution-benefit triples in the test development process to obtain patterns suitable for reuse in future contexts. Although going through existing test artefacts and trying to identify patterns for later reuse might appear costly and unrewarding at first sight, it pays off in the long term [VS04]. This argument especially applies to testing of embedded software, where reusability of common test procedures is appreciated. Test patterns provide means for test system developers to focus more on what to test and less on the notation itself. With this practice, they simplify the test development process, increase the level of automation and facilitate the understandability of the tests.
The novelty of this paper lies in both a new method for the test specification of embedded systems and a related concept supporting test pattern extraction and application.

1.3 Paper Outline
The paper is divided into the following sections. After the introduction, we give an overview of related work. In Section 3, the test specification method for embedded systems is provided. Subsequently, the test design patterns for test data and test behavior, including the test oracle and test control parts, are described. Section 4 evaluates the approach based on the example of a speed controller from the automotive domain. Finally, a summary is given, and an outline of future work concludes the paper.
2 Related Work
Several efforts have been undertaken to establish an approach to test specification and test generation for embedded systems, and research and industrial work in this area continues to evolve. In the following, we present the work of a number of authors related both to the application of test patterns for verifying and validating embedded software and to the test patterns themselves.

Embedded Validator [BBS04] is a model verification tool used for verifying temporal and causal safety-critical requirements. The method offers a set of test behavior patterns like “an output is set only after certain input values are observed”. Safety Checker Blockset [SCB], in contrast, verifies a set of properties (2) that combine model variables connected to a proof operator (e.g. “NeverTogether”, “NeverAfter”). A test model typically expresses an unwanted situation (e.g. the cruise control should never be active while the driver is braking). The model checking process assesses whether this situation is reachable and establishes the test outcome on this basis. Both tools support only some basic verification patterns; however, they contribute to the concept of reusability.

The Time Partition Testing [Leh03] is a dedicated method for testing continuous behaviour. It supports the selection and documentation of test data on the semantic basis of so-called testlets and the syntactic techniques Direct Definition, Time Partitioning and Data Partitioning, which are used to build testlets. Testlets facilitate an exact description of test data for a specific time frame. The non-parameterized testlets reflect the properties of test patterns. A special assessment language is used to write the test evaluation mechanisms and to assign the verdicts. The evaluation is based on the script language Python – with some syntactic extensions for typical test evaluation functions available in a library (3). Thereby, the test evaluation can be designed with a high level of flexibility, allowing dedicated complex algorithms and filters to be applied to the recorded test signals [BK06].

(2) They are understood as patterns in this paper.
(3) This library can be seen as a dedicated collection of concrete test patterns.

Referring to test patterns, [Bin99] describes object-oriented test strategy patterns. They cover several strategies for deriving test cases for testing object-oriented software. Our test method refers to data-oriented models; thus we cannot adapt this definition of test patterns.

[TYZP05] argues that developers often specify embedded systems using scenarios, and that a typical medium-size system has hundreds of thousands of scenarios. Each scenario requires at least one test case, which in turn requires individual development and debugging [TYZP05]. Verification patterns (VP) are proposed to address this problem. The VP approach classifies system scenarios into patterns. For each scenario pattern (SP), the test engineer can develop a template to test all the scenarios that belong to the same pattern. This means that the engineer can reuse the test templates to verify a large number of system scenarios with minimal incremental cost and effort. Each scenario has preconditions (causes), postconditions (effects) and optional timing constraints. An SP [TYZP05] is defined as a specific temporal template or cause-and-effect relation representing a collection of requirements with similar structures. A VP [TYZP05] is a predefined mechanism that can verify a group of behavioral requirements describing similar scenarios. A GUI-based specification tool [TYZP05] is available to facilitate the scenario specification. We focus on test specification patterns which, similarly to [TYZP05], mainly relate to the test behaviour. However, we handle both discrete and continuous signals, while [TYZP05] addresses only scenarios describing discrete behaviour.

[Neu04] presents a detailed survey of test patterns regarding a number of criteria.
According to his classification, we categorise our patterns as functional with respect to the test goals, as test design patterns in the sense of test development, and as component-level regarding the scope of testing. Our patterns support both the manual and the automated test generation approaches discussed in [Neu04] [PTD05]. In a manual development approach, a developer can reuse patterns. An automated approach may benefit from an automatic identification of patterns so as to provide further solutions for the revealed issue [Neu04]. We distinguish test data, test oracle and test control patterns, whereby the test data patterns can be retrieved automatically from the test evaluation (4). We propose an implementation of patterns which provide abstract solutions for generic problems but are instantiated in a given design language. Hence, our patterns can in fact be seen as parametrisable libraries. However, the customization possibilities make them more abstract than a library in the traditional sense (5).

(4) The terms test oracle and test evaluation both refer to the assessment of the SUT behavior and are used interchangeably.
(5) In [PTD05], a variant of patterns called idioms is considered. Idioms provide solutions for general problems arising when using a certain language. In that sense, our test patterns are such idioms; however, we call them test patterns in this paper.
We provide the test design patterns in graphical form. Thus, the textual table-form templates suggested by [Bin99] are not supported. Instead, we assume that a graphical user interface (GUI) for each pattern block is informative enough to express its meaning, context and intended application.
3 Test Design Patterns for a Selected Test Specification Method
The test process enabled by our approach consists of the following phases: test oracle specification, automatic test data generation and automatic test execution. Assuming that a SUT model with its interfaces is available, we generate the test harness (step I), as depicted in Fig. 1. Further on, the test engineer manually refines the test oracle design (step II) based on the concept of validation function patterns presented in the next subsections. Afterwards, the test data structures and concrete data are generated applying a constraint solver (step III). The test control design is added, but should be refined manually (step IV). Finally, the tests (i.e. test cases) may be executed and the test results are obtained.

In Fig. 1, the test harness used for testing the embedded software is presented. By the test harness we mean an automated test framework including the general test structure and the test execution engine. The requirements specify the functionality of the SUT and indicate the test objectives. Together with the interfaces of the SUT model, they drive both the test data generation and the test oracle. The test control may react to signals from the test evaluation unit to realize reactive testing concepts. It automatically organizes the test cases during the test execution according to predefined criteria. The details concerning the test oracle creation can be found in [ZSM06] [MP07], the test data pattern retrieval is described in [ZMS07], and the test control and reactive testing considerations are discussed in [Zan07]. Hence, in the following we explain only the essentials of our test approach, focusing on the application of test patterns rather than on the test logic.
Fig. 1: A test harness.
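Purely as an illustration of the information flow in Fig. 1, the harness skeleton can be sketched in Python. The paper realizes this harness graphically in Simulink; all function names below are our assumptions:

```python
def run_test_harness(generate_test_data, sut, evaluate, control, test_cases):
    """Skeleton of the test harness in Fig. 1: for each test case, generated
    test data stimulates the SUT, the evaluation unit assesses the SUT
    reaction, and the test control reacts to the resulting verdict."""
    verdicts = {}
    for tc in test_cases:
        stimulus = generate_test_data(tc)  # test data unit
        reaction = sut(stimulus)           # system under test
        verdict = evaluate(tc, reaction)   # test evaluation (oracle)
        verdicts[tc] = verdict
        if not control(verdict):           # test control may stop the run
            break
    return verdicts
```

The `control` callback stands for the reactive-testing idea: the verdict stream coming from the evaluation unit may stop or reorganize the remaining test cases.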
We use test patterns already when generating the test harness for a given SUT. These patterns are very abstract and can be seen as contributors to the skeleton of every test system being developed. More detailed test patterns are classified according to the complexity level of the test specification process.
3.1 Test Oracle Patterns
We start with the requirements and interfaces of the SUT model to build the test oracle design, applying already available templates and patterns. We model the test evaluation first – contrary to common practice. Afterwards, the test data patterns are retrieved automatically from the test oracle design. They are refined by setting the appropriate parameters. Finally, the SUT model fed with the previously created test data is executed and the evaluation unit supplies verdicts on the fly.

The first step during the test oracle specification is to identify the test objectives based on the SUT requirements. For that purpose, a high-level pattern is applied within the test oracle unit, where at least one requirement appears. The number of test requirements can be chosen in the graphical user interface (GUI), which updates the design and adjusts the structural changes of the test model (e.g. the number of inputs in the arbitration unit). The situation is illustrated in Fig. 2.
Fig. 2: A pattern for the test requirement specification and its GUI.
Further on, we introduce the concept of a validation function (VF) [ZSM06] to specify the test oracle in a systematic and structured way. VFs serve to evaluate the execution status of a test case by assessing the SUT observations and/or additional characteristics/parameters of the SUT. A VF is a pattern that contributes to the overall concept of test behavior (6). If testing one requirement demands several rule combinations, we implement all of them and conclude with a verdict according to an arbitration algorithm that accumulates the results of the combined rules.

Predefined verdict values are pass, fail, none and error. Pass indicates that the test behavior gives evidence for the correctness of the functional aspects of the SUT for that specific test case. Fail indicates that the purpose of the test case has been violated. An error verdict shall be used to indicate errors (exceptions) within the test system itself. When the preconditions of the VF are not fulfilled or no other verdict can be set, a none verdict is delivered. The verdict of a test case is calculated by the arbiter [TTCN3]. We provide a special Simulink arbitration block [ZSM06] to play the role of the arbiter in our approach. During test execution, each VF responsible for verdict assignment continuously reports verdicts to the arbiter. The arbiter produces the final verdict from those intermediate verdicts. A default arbitration algorithm is used when delivering the verdict for a single requirement evaluation or for the entire test case. Verdicts are ordered according to the rule: none < pass < fail < error.

(6) We define test behavior as the interactions between and inside the test data, the SUT and the validation functions.

Let us introduce the process of VF creation. A single informal requirement may imply multiple VFs, but for simplicity we focus on a single one. Given a requirement, we organize VFs following a generic conditional rule:
IF preconditions set THEN assertions set
The preconditions set consists of a number of SUT signal feature extractors related to each other by temporal, logical or quantitative dependencies (e.g. FeatureA AND after(time1) FeatureB OR NOT(FeatureC)); a comparator for every single feature or dependency; and a unit for preconditions synchronization (PS). The assertions set looks similar; however, it includes a unit for preconditions and assertions synchronization (PAS). The details concerning synchronization problems are described in [MP07]. Hence, an abstract pattern for a VF (shown in Fig. 3) consists of a preconditions block which activates an assertions block, where the comparison of actual and expected signal values takes place. The activation, and thereby the actual evaluation, proceeds only if the preconditions are fulfilled.
Fig. 3: Structure of a VF – a pattern and its GUI (7).
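The VF structure and the verdict handling described above can be mimicked in plain Python. This is our sketch of the concepts, not the paper's Simulink realization; the synchronization units (PS, PAS) are omitted and the tolerance handling is simplified:

```python
from enum import IntEnum

class Verdict(IntEnum):
    # Ordering follows the arbitration rule: none < pass < fail < error.
    NONE = 0
    PASS = 1
    FAIL = 2
    ERROR = 3

def arbitrate(verdicts):
    """Default arbitration: the final verdict is the highest-ordered
    intermediate verdict reported by the validation functions."""
    return max(verdicts, default=Verdict.NONE)

def validation_function(preconditions, assertion, samples, tol=0.0):
    """One VF over a sampled trace: the assertions part is activated only
    where all preconditions hold; it compares actual against expected
    values, allowing deviations within the permitted tolerance range."""
    verdict = Verdict.NONE  # stays none if preconditions never hold
    for sample in samples:
        if all(p(sample) for p in preconditions):
            actual, expected = assertion(sample)
            if abs(actual - expected) <= tol:
                verdict = max(verdict, Verdict.PASS)
            else:
                return Verdict.FAIL  # expected behavior violated
    return verdict
```

Running several such VFs and feeding their results to `arbitrate` mirrors how each VF continuously reports its intermediate verdict to the arbiter block.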
The simplest assertions blocks, checking time-independent features, are built following the schema shown in Fig. 4.
Fig. 4: Assertions block – a pattern.
(7) The patterns are designed in Matlab/Simulink/Stateflow [Math07].
They include a signal feature extraction part, a block comparing the actual values with the expected ones, and a PAS synchronizer. Optionally, signal deviations within a permitted tolerance range are allowed. We distinguish further schemas of preconditions and assertions blocks for triggered features, discussed in detail in [MP07]. By a feature we understand an identifiable, descriptive property of a signal. We distinguish different feature types considering the feature availability on the one hand and the identification delay on the other [ZSM06] [MP07]. However, their classification is beyond the scope of this paper.

3.2 Test Data Generation and their Patterns
As for the test oracle specification, a requirements level is generated for the test data. This, however, happens automatically, based on the knowledge gained during the test oracle specification phase. The pattern applied in this step is shown in Fig. 5.
Fig. 5: Requirements level within the test data unit – a pattern.
As mentioned before, the test data patterns can be retrieved automatically from the test oracle design [ZMS07]. During the test evaluation, features of the SUT input and output signals are extracted. Hence, knowing the features appearing in the preconditions of a VF, we are able to reconstruct the test data from them. The preconditions typically depend on the SUT inputs; however, they may relate to the SUT outputs at some points. Every time a feature extractor occurs for the assertion activation, a corresponding feature generator may be applied for the test data creation. To give a very simple example: detecting a given signal value in a precondition of a VF requires a constant signal featuring this value during a default time. Apart from the feature generation, the SUT output signals may be checked for some constraints, if necessary (see Fig. 6). The feature generation is activated by a Stateflow diagram sequencing the features in time according to the default temporal constraints (e.g. after(time1)). A switch is needed for each SUT input to handle the dependencies between generated signals. An Initialization & Stabilization block enables resetting the obtained signal so that one test case does not influence another.
Fig. 6: Preconditions and feature generation levels within the test data – a pattern.
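As a textual counterpart to these graphical blocks, feature generators can be written that produce sampled signals and sequence them in time, as the Stateflow diagram does under temporal constraints. The following sketch uses our own names and an assumed sampling period; it is not part of the paper's pattern library:

```python
def constant_feature(value, duration, dt=0.01):
    """Constant signal holding `value` for `duration` seconds, sampled
    every `dt` seconds: the test data counterpart of a 'detect constant
    value' feature extractor in a VF precondition."""
    return [value] * int(round(duration / dt))

def sequence_features(*segments):
    """Concatenate generated feature segments in time, mimicking the
    sequencing of features according to temporal constraints."""
    signal = []
    for segment in segments:
        signal.extend(segment)
    return signal
```

For instance, sequencing a constant 0 segment followed by a constant 5 segment yields a step stimulus for the SUT input.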
A similar sequencing algorithm applies to ordering the test cases on a higher hierarchy level when dealing with a number of requirements. We call this aspect test control. In a traditional understanding, the test control is responsible for the order of test cases over time [TTCN3]. It may invoke, change or stop the test case execution under the influence of the verdict values coming from the test evaluation. Thus, the test cases are sequenced according to previously specified criteria (e.g. a pass verdict). An extended definition of the test control may be found in [Zan07]. We consider this issue to be beyond the scope of this paper.

3.3 Visualization of the Selected Test Patterns
The test patterns are collected in a library and can be applied immediately during the test design phase. In Tab. 1, two patterns are presented, each of them corresponding to a selected testing activity. Since they will be applied in the example in the next section, we explain their meaning there.

Tab. 1: Selected Test Patterns and their Graphical Correspondents

Test activity             | Test pattern name         | Context                                            | Problem                                                                                  | Solution Instance
Test oracle specification | Detect signal's feature   | Test of a control unit                             | Assessment of a control unit behavior in terms of a selected signal's feature            | (graphical block)
Test data generation      | Generate signal's feature | Evaluation of a step response function is intended | How to generate an appropriate signal to stimulate a selected feature on the SUT output  | (graphical block)

4 Evaluation of the Approach Based on an Example
In the following, we present a test of a component belonging to an Adaptive Cruise Controller (ACC) [Con04] unit. The ACC controls the speed of the vehicle while maintaining a safe distance to the preceding vehicle. To simplify the test procedures, we focus on the speed controller only. The speed controller operates in a loop in conjunction with the vehicle. It measures the actual vehicle speed, compares it with the desired one and corrects any deviation by accelerating or decelerating the vehicle within a predefined time interval. If the desired velocity changes considerably, the controller should react and adapt the vehicle velocity. This happens within a certain time due to the inertial characteristics of the velocity, which is related to the vehicle dynamics. Afterwards, the vehicle velocity should be kept constant as long as the desired speed does not vary. Fig. 7 shows the speed controller being the SUT connected to a vehicle model.
Fig. 7: The SUT connected to a vehicle model via a loop.
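To make the control principle concrete, the loop can be approximated by a toy simulation: a proportional controller whose commanded acceleration the speed integrates. The gain and step sizes below are our assumptions, not parameters of the ACC model:

```python
def simulate_speed_control(v_des, v0=0.0, kp=0.5, dt=0.1, t_end=30.0):
    """Closed loop: measure the actual vehicle speed, compare it with the
    desired one, and correct the deviation by accelerating/decelerating.
    The inertial behavior comes from integrating the acceleration."""
    v = v0
    trace = [v]
    for _ in range(int(round(t_end / dt))):
        accel = kp * (v_des - v)  # correction proportional to the deviation
        v += accel * dt           # vehicle dynamics: speed integrates accel
        trace.append(v)
    return trace
```

With these values the speed approaches the desired velocity within a certain time and then stays constant, as described above; a real vehicle model would of course exhibit richer dynamics, including overshoot.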
The SUT interfaces are listed in Tab. 2. The desired velocity (v_Des) at the SUT input influences the actual vehicle velocity (v_Vhcl) at the SUT output. Selection and CCMode are internal ACC signals that indicate the state of the whole ACC system. Both should be set to a predefined constant value to activate the speed control during its test.

Tab. 2: SUT Inputs of the Speed Controller

name of SUT signal              | values range | unit
desired velocity (v_Des)        |              | m/s
selection (Selection)           | {0, 1, 2, 3} | –
cruise controller mode (CCMode) | {0, 1, 2, 3} | –
vehicle velocity (v_Vhcl)       |              | m/s

4.1 Tool Selection for Test Modelling and Test Execution
We selected the Matlab/Simulink/Stateflow [Math07] environment to show the feasibility of our solution. It provides a simulation engine which allows for the execution of tests and thus their dynamic analysis. It supports hybrid systems development and lets us use the same language for both system and test, so as to integrate test and system modeling [ZSM06]. Our test patterns belong to a Simulink library.

4.2 Test Specification by Application of the Test Patterns
In Fig. 8, the test harness for the speed controller test is presented. It implements the concept given in Fig. 1.
Fig. 8: The test harness.
An excerpt of the functional requirements on the speed controller is given in Tab. 3. We have chosen very technical and concrete requirements so as to introduce our concepts more easily.

Tab. 3: Requirements for the Speed Controller

ID | Formal Requirements on Speed Controller
1  | Speed Control – step response: When the cruise control is active, there is no other vehicle ahead of the car, the speed controller is active and a complete step within the desired velocity has been detected, then the maximum overshoot should be less than 5 and the steady state error should also be less than 5.
2  | Speed Control – velocity restoration time after simple acceleration/deceleration: When the cruise control is active, there is no other vehicle ahead of the car, the speed controller is active and a simple speed increase/decrease applying a speed control button is set, resulting in a 1 unit difference between the actual vehicle speed and the desired speed, the speed controller shall restore the difference to
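The acceptance criteria of requirement 1, maximum overshoot and steady-state error after a step in the desired velocity, can be computed from a recorded response trace. The following sketch is ours; the settling window is an assumption:

```python
def step_response_metrics(trace, v_des, tail_fraction=0.1):
    """Maximum overshoot and steady-state error of a step response towards
    the desired velocity v_des (a step from below is assumed)."""
    max_overshoot = max(max(trace) - v_des, 0.0)
    # steady-state error: mean deviation over the last part of the trace
    tail = trace[-max(1, int(len(trace) * tail_fraction)):]
    steady_state_error = abs(sum(tail) / len(tail) - v_des)
    return max_overshoot, steady_state_error

def check_requirement_1(trace, v_des, limit=5.0):
    """Requirement 1: both metrics shall be below the given limit."""
    overshoot, sse = step_response_metrics(trace, v_des)
    return "pass" if overshoot < limit and sse < limit else "fail"
```

In the paper this assessment is realized by a 'detect signal's feature' VF in Simulink; the sketch only illustrates what such an assertions block computes.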