The Adaptation of Test-Driven Software Processes to Industrial Automation Engineering

Reinhard Hametner (1), Dietmar Winkler (2), Thomas Östreicher (2), Stefan Biffl (2), Alois Zoitl (1)

(1) Automation and Control Institute, Vienna University of Technology, A-1040 Vienna, Austria, Gußhausstr. 27-29/E376, {Hametner, Zoitl}@acin.tuwien.ac.at
(2) Institute of Software Technology and Interactive Systems, Vienna University of Technology, A-1040 Vienna, Austria, Favoritenstr. 9-11/E188, {Winkler, Oestreicher, Biffl}@qse.ifs.tuwien.ac.at

Abstract - Software components provide an increasing part of the added value in automation systems and become more complex to construct and test. Test-driven development (TDD) has been used successfully for the agile development of business software systems: test cases guide the system implementation and can be executed automatically after software changes (continuous integration & build strategy). However, TDD processes need to be adapted for automation systems engineering, where real-world systems are challenging to model and to test automatically. In this paper we adapt a TDD process from the business software engineering domain to industrial automation engineering. We identify a set of UML models that enable the systematic derivation of test cases. In an initial empirical study based on an industrial use case we evaluate the adapted TDD process to identify strengths and limitations of the approach. A major result of the study is that UML models enabled effective test case derivation in the study context.

I. INTRODUCTION

Business needs for flexible automation systems, such as efficient reconfiguration of manufacturing plants and frequently changing consumer product requirements, foster the trend to shift automation functions from hardware to software components. Software components embedded within a common industry and/or consumer platform enable high flexibility through automated reconfiguration and fast response to frequently changing requirements [16]. Nevertheless, in the automation systems engineering domain we observed a strong focus on hardware development and only limited software engineering processes. Thus, there is a need to strengthen the software engineering capability by (a) introducing systematic process approaches, (b) constructive approaches for software development, and (c) efficient analytic quality assurance activities for assessing (software) product quality embedded within a well-designed hardware solution. Various requirements of automation systems (e.g., time-critical requirements, cyclic operation of controllers, safety) require a systematic evaluation of candidate techniques from business software development and of their applicability to the automation systems domain. In the automation systems engineering domain we observed a set of new challenges which can be supported by the software component concept: changes of requirements and system properties (e.g., during systems development and maintenance) can be addressed by software components. Nevertheless, changes must be tested systematically and efficiently [17]. Thus, we see the need for a testing environment that provides tool support for systematic testing.


Relevant developments outside industrial automation, such as test-driven development (TDD) [16] and continuous integration and test (CI&T) [6], seem promising for application in the automation systems domain. Nevertheless, there is a need to support automation engineers in systematically capturing system properties and expected system behavior during design and operation. Models like UML [1] and SysML [8] are promising candidates for systematically capturing the expected system behavior on various levels of detail as a foundation for test case generation [2]: (a) requirements, (b) system architecture, and (c) individual system components. Note that UML includes (a) structural diagrams to capture static system patterns (i.e., individual components and the interaction between these components) and (b) behavior and interaction diagrams to identify workflows and dynamic system behavior (e.g., use cases, interaction, communication, transition diagrams, state charts) [11]. There are several reports on the use of UML models in the automation systems domain [7]. Models can support (a) the systematic design of automation systems and components and (b) systematic test case generation on various levels based on the modeling approaches provided by UML [2]. In this paper we focus on a straightforward test case generation process adapted to the automation systems domain, i.e., using (a) the test-first concept [17] and (b) models for test case generation [3]. Some papers refer to adaptive software testing strategies, e.g., [4], for online modification of test cases during the testing process; in this paper, however, adaptation refers to the modification of common practices for test case generation derived from business software development. We introduce an adapted TDD process for test case generation in the industrial automation systems domain and identify a set of models that allows the systematic derivation of test cases that can be run automatically [9]. We evaluate the adapted TDD process in an empirical feasibility study based on an industrial use case to analyze strengths and limitations of the approach. The remainder of this paper is structured as follows: Section II summarizes related work on software testing in business software development and the automation systems domain with focus on model-based testing. Section III discusses research issues. We present a basic workflow for test case generation in the automation systems domain in Section IV and a sample application in Section V. Section VI concludes and presents future work.


II. RELATED WORK

This section summarizes related work on test-driven development and test processes in business IT, adapted to the industrial automation domain.

A. Test-First Development

Incomplete, wrong, or ambiguous requirements, which could have been identified early in the development phase, can lead to high effort for locating defects in large applications in later stages of development and operation. Test-First Development (TFD) is a strategy for shortening the cycles between test definition and test execution [16]. The goals of identifying defects in early stages of development and of defining test cases early lead to the idea of TFD in agile software development processes [6]. Test cases are defined prior to or in parallel with the design and implementation of software components and help engineers better capture the expected system behavior on different levels. Fig. 1 illustrates three design levels in automation systems and the basic concept of TFD. TFD consists of four fundamental steps: (1) Test definition: select specific requirements on every level and define test cases according to these requirements. (2) Test implementation: implement and execute the test cases; as there is no implementation of the requirement yet, the test cases must fail. (3) Implementation and test: cyclic implementation of test-case-related functionality and test case execution until successful completion. (4) Refactoring: optimize the implementation without changing functionality and re-execute the test cases. After finishing the last step, select the next batch of requirements and continue at step (1). This process is explained in more detail in [16]. A systematic application of TFD and frequent test runs lead to the concept of continuous integration and test [6][16][17].

B. Test Levels in the Automation Systems Domain

In general, we observed three testing levels in the automation systems domain [16]: (1) system-level tests at the customer and developer site, i.e., acceptance, factory, and system tests; (2) architecture & subsystem specification with focus on integration tests; and (3) component-level tests for individual components. Fig. 1 illustrates these three testing levels of automation systems applying a top-down design, i.e., starting at the business level (requirements and systems) on level 1, a more detailed view of the system with focus on components and their interaction on level 2, and a detailed view of individual components on level 3. Note that we also apply the test-first approach [16], a well-known concept from business software development [6], to enable a continuous integration strategy for systematic test case generation, code construction, and test execution.

C. Testing in the Automation Systems Domain

Systematic test cases on various levels require appropriate test execution frameworks. Test cases can require simulated system components for test execution if components are not available at testing time (so-called mock objects).
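To make the mock-object idea concrete, the following minimal Python sketch (our illustration, not part of the original paper) tests a hypothetical controller step against a mocked level sensor; the function and sensor interface are assumptions:

```python
import unittest
from unittest import mock

# Hypothetical controller step: run pump P1 while the level is below sensor H.
def control_step(sensor_h) -> str:
    return "P1_STOP" if sensor_h.read() else "P1_START"

class ControlStepTest(unittest.TestCase):
    def test_pump_starts_when_level_below_h(self):
        # The real level sensor is not available at testing time,
        # so a mock object simulates it (water below sensor H).
        sensor_h = mock.Mock()
        sensor_h.read.return_value = False
        self.assertEqual(control_step(sensor_h), "P1_START")

    def test_pump_stops_when_level_reaches_h(self):
        sensor_h = mock.Mock()
        sensor_h.read.return_value = True   # water has reached sensor H
        self.assertEqual(control_step(sensor_h), "P1_STOP")

if __name__ == "__main__":
    unittest.main()
```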



Fig. 1: Test Levels in the Automation Systems Domain.

A continuous integration approach using a test framework with focus on automation systems engineering is described in [17] (see Fig. 2 for a schematic overview). The framework enables frequent test runs (e.g., after changes) and executes a high number of test cases automatically. The test framework consists of four major building blocks: (1) the Test Suite, summarizing individual test cases; (2) the System under Test, representing the configuration of the software and system product; (3) the Test Runner, which applies the test cases to the system under test and provides fast feedback on the test results, which are presented in (4) Test Reports. Note that the test runner also provides the required mock objects for system parts which are not available at testing time.
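As a hedged illustration of these four building blocks (our sketch, not the framework from [17]; all names and the control logic are assumptions), a toy test suite, system under test, test runner, and report in Python:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class TestCase:
    name: str
    run: Callable[[object], bool]   # receives the system under test (SuT)

@dataclass
class TestSuite:
    cases: List[TestCase] = field(default_factory=list)

class WaterworksSuT:
    """Stand-in system under test; the sensor is injected so that a mock
    can replace hardware that is unavailable at testing time."""
    def __init__(self, level_at_h: Callable[[], bool]):
        self.level_at_h = level_at_h
    def pump_should_run(self) -> bool:
        return not self.level_at_h()   # fill while the level is below sensor H

def test_runner(suite: TestSuite, sut_factory: Callable[[], WaterworksSuT]) -> str:
    """Applies every test case to a fresh SuT and produces a test report."""
    report = [f"{c.name}: {'PASS' if c.run(sut_factory()) else 'FAIL'}"
              for c in suite.cases]
    return "\n".join(report)

suite = TestSuite([
    TestCase("pump runs while level below H", lambda sut: sut.pump_should_run()),
])
# A mocked sensor (always "below H") stands in for the missing hardware.
print(test_runner(suite, lambda: WaterworksSuT(level_at_h=lambda: False)))
```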

Fig. 2: Schematic overview of the Test Framework [17].

A test framework enables automated test execution including different software configurations. However, the question remains how to support efficient test case generation.

D. Model-Based Testing

Model-based testing (MBT) [2] aims at supporting (a) test case generation and (b) soft-commissioning based on models on various system levels; for example, system-level tests can be derived from requirements (see Fig. 1 for the testing levels). Additionally, models enable automated code and test case generation [9]. Based on the requirements description of the software, engineers create models. The UML diagram family [1] is well known in business software development and addresses system structure (6 diagrams), behavior (3 diagrams), and interaction (4 diagrams). System structure diagrams present a static view of the system based on components, the distribution of components, or classes. Behavior diagrams include use cases, state machines, and activity diagrams to define workflows. Interaction diagrams focus on communication and collaboration of components.


Models can (a) be used to illustrate structure and interaction and (b) support automated code and test case generation [13]. The model defines the set-behavior: during test execution the set-behavior of the model is compared with the actual behavior of the test application. The complexity of the test problem is abstracted to an intellectually manageable level, i.e., test engineers build a mental model of the system. This mental model is used to derive test cases for the implementation or system under test (SuT) [3]. The test specification is the foundation for selecting test cases.
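The comparison of set-behavior and actual behavior can be sketched as follows (a minimal Python illustration; the pump states and events are our assumptions, not taken from the paper):

```python
# Set-behavior: a simple state model of a pump.
MODEL = {("stopped", "start"): "running",
         ("running", "stop"): "stopped"}

class PumpImpl:
    """Hypothetical implementation under test (actual behavior)."""
    def __init__(self):
        self.state = "stopped"
    def handle(self, event: str) -> None:
        if event == "start" and self.state == "stopped":
            self.state = "running"
        elif event == "stop" and self.state == "running":
            self.state = "stopped"

def conforms(trace) -> bool:
    """Replay an event trace on model and implementation and compare
    the set-behavior with the actual behavior after every step."""
    sut, state = PumpImpl(), "stopped"
    for event in trace:
        state = MODEL[(state, event)]   # expected (set) state
        sut.handle(event)               # actual state of the SuT
        if sut.state != state:
            return False
    return True

print(conforms(["start", "stop", "start"]))   # True if the SuT matches the model
```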

Fig. 3: Test Case Generation with Model-Based Testing derived from the UML Diagram Family, described in [1].

Fig. 3 gives an overview of the data flow in a generic test-generation system. Model-based testing uses abstract models to generate test cases for an implementation. Test case specifications can be developed early in the cycle from requirements information. Individual test cases are summarized in test suites, i.e., bundles of test cases with focus on a set of system requirements. A test suite includes inputs, expected outputs, and the necessary infrastructure to run the tests automatically. Test generation can be especially effective for systems that are changed frequently: testers can update the data model and derive a test suite automatically, avoiding the effort of editing hand-crafted tests. For validation, the generated test cases should be checked manually to ensure that the model represents the system requirements and their specification correctly. Finally, the model is used to increase confidence in the shared understanding between customers and developers. Model-based testing depends on three key technologies [4]: (a) the notation used for the data model, e.g., UML diagrams; (b) the test-generation algorithm, i.e., deriving test cases from the models; and (c) the tools that support test case generation and can provide a framework for test execution.

III. RESEARCH ISSUES

Applying models in TFD can (a) increase the understanding of the system and its behavior for relevant stakeholders through a graphical representation, (b) enable verification and validation, (c) enable automated model checking, and (d) support (automated) test case generation based on these models. However, open questions are: (a) which sequence of steps can enable systematic test case generation in the automation systems domain, and (b) which models can address automation systems requirements with respect to test case generation. From these research questions we derived two basic research issues:

RI-1. Identification of an effective test-driven software process for model-based test case generation in industrial automation engineering. Based on experience from test case generation in business software development, we develop a candidate process approach for test case generation based on models for automation systems applications.

RI-2. Investigation of models that support test case generation in the context of automation systems design. Following the concepts of test-driven and model-driven development, we select models from the well-known UML [14] and SysML [8] families based on the basic requirements derived from a top-down systems design approach.

As research approach, we introduce an adapted TFD process workflow for test case generation in the automation systems domain, based on the TFD concept and models, in Section IV and show the application of the proposed approach in an initial feasibility study in Section V.

IV. TEST CASE GENERATION PROCESS APPROACH

Depending on the application domain in automation systems development, we see the need to (a) capture system requirements, (b) model the system structure, and (c) model the interaction and communication between components and subsystems. Following the individual levels of automation systems engineering (see Fig. 1), we focus on (a) system requirements as the foundation for product development and (b) technical design concepts, which can be refined successively on every level. Systems level (customer/developer): The systems level focuses on basic customer requirements and on system, acceptance, and factory tests. Thus, a first step is to systematically capture requirements from the user perspective, including test case generation on requirements level. Architectural level of systems, subsystems, and components: Depending on system and component complexity and risk, various models help identify and model the system structure and the expected behavior. Structural diagrams provide a static overview of the system, and interaction diagrams show individual system behavior. Note that structural and behavior diagrams can (a) focus on the overall system to illustrate the relationships between different components within a system and (b) also support the detailed modeling of individual components at a high level of detail for implementation and test case generation purposes. Based on the UML diagram family, we see two basic diagram types as the foundation for model-based test case generation: (a) models for static system design (e.g., component, deployment, and class diagrams) and (b) models for interaction and communication. Note that interaction and communication diagrams (e.g., use cases, state charts, sequence charts, activity and timing diagrams) enable the derivation of test cases directly from the models and are promising candidates for automating the test case generation process. Thus, the proposed systematic test case generation process consists of 11 steps:


1. Definition of requirements based on textual descriptions to capture basic system requirements.
2. Derivation of a structured and prioritized requirements list ("iceberg list") [5] based on customer prioritization. Prioritized requirements help focus on the most important requirements of the system and derive the most valuable test cases.
3. Modeling of UML use cases to derive scenarios, i.e., sequences of activities, based on the captured requirements. Scenarios can be reused as test suites during system and acceptance testing because they represent typical workflows of the system [17]. Note that every activity of a test scenario can be seen as an individual test case.
4. Risk assessment. An important issue is to capture critical system components (a) to focus on the most critical test cases, (b) to address error and special test cases, and (c) to enable exit criteria for test processes. Approaches such as RiskIt [10], FMEA, and FTA can support risk assessment systematically.
5. Identification of components and interfaces for collaboration, the foundation for system tests and the interaction between components. Note that UML component diagrams represent a static view of the system.
6. Deployment diagrams present a model of the physical layout and communication paths between system components, applicable for distributed control units.
7. Class diagrams show the static design of individual components and the relationships between components on a detailed level. Note that class diagrams are used to model systems and components on implementation level and are the foundation for unit tests within a TFD test strategy on component level.
8. State charts illustrate different states and the transitions between them. Depending on risk, state charts can be used on various levels, e.g., on system level to present the overall system behavior or on a detailed level to present individual (critical or risky) components.
9. Sequence charts present the temporal sequence of events within the system under construction. They show which objects communicate with which other objects and what messages trigger those communications.
10. Activity diagrams illustrate the workflow behavior of a system by detailing the decision paths in the flow of events encapsulated by the activity.
11. Timing diagrams present the temporal behavior of a system or subsystem. They show the interaction between timed events and the time and duration constraints that govern them.

The resulting models can be used to identify test cases that are executable within a testing framework [17], following a continuous integration and test strategy. Note that models can enable automated test case generation [9], for example based on executable state charts. Nevertheless, the selection and application of individual models depends on the application area and risk.
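As an illustration of steps 2 and 3 (a sketch in Python; the activity texts are our paraphrase, not from the paper), a prioritized requirements list is filtered and a scenario is expanded into individual test cases:

```python
# Requirements follow TABLE I of the study; activities are illustrative.
requirements = [
    {"id": 1, "text": "Filling of the water tank", "priority": "A"},
    {"id": 2, "text": "Tank may not be empty",     "priority": "A"},
    {"id": 3, "text": "No flooding accepted",      "priority": "A"},
]
# Step 2: focus on the most valuable (priority A) requirements first.
focus = [r for r in requirements if r["priority"] == "A"]

# Step 3: a scenario is a sequence of activities; each activity is a test case.
scenario = {"name": "Initial filling of the tank",
            "activities": ["start P1 on empty tank",
                           "stop P1 when sensor H is reached",
                           "verify no overflow at sensor HH"]}

def scenario_to_test_cases(scenario: dict) -> list:
    """The scenario is reusable as a test suite; every activity is a test case."""
    return [{"suite": scenario["name"], "case": a} for a in scenario["activities"]]

print(f"{len(focus)} priority-A requirements")
for tc in scenario_to_test_cases(scenario):
    print(tc)
```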

V. PROTOTYPE EVALUATION STUDY

This section demonstrates the application of the proposed model-based workflow for test case generation on a representative small waterworks control application.

A. Feasibility Study

We conducted an initial feasibility study to present (a) the proposed workflow, (b) the application of selected UML models, and (c) the test case generation approach based on TDD and MDD. We applied the approach to a small and well-defined waterworks system for a small irrigation system (see Section V-B). For evaluation purposes we used an open-source graphical software tool for modeling and a paper-based approach for test case generation. Note that professional UML modeling tools can enable automated code and test case generation (future work) [9]. The selection of models is based on a test-first development approach (Section II-A) according to the three engineering levels (Section II-B). Note that the selection of models depends on system complexity, project type, and project size. Nevertheless, we apply the relevant models in the initial feasibility study and present the basic steps for test case generation.

B. Definition of Requirements – Textual Description (Step 1)

Waterworks is an automated application controlling a simple irrigation system with one water tank, three pumps (handling two incoming pipes and one outgoing pipe), and four sensors (measuring the water level). The main task of the automation system is to control the pumps based on the sensor data to enable a continuous water supply of max. 15 l/min. An operator controls pump P3 according to the water needs. Fig. 4 illustrates a schematic overview of the irrigation system.
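These requirements can be captured as a minimal executable sketch (our illustration; capacity and sensor thresholds are assumed values, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class Waterworks:
    level: float = 0.0          # current water level [l]
    capacity: float = 100.0     # tank capacity [l], assumed value
    max_outflow: float = 15.0   # continuous supply limit [l/min], from the spec
    # sensor thresholds as fractions of capacity (assumed values)
    LL, L, H, HH = 0.1, 0.25, 0.8, 0.95

    def sensors(self) -> dict:
        """True means the water level has reached the respective sensor."""
        f = self.level / self.capacity
        return {s: f >= getattr(self, s) for s in ("LL", "L", "H", "HH")}

plant = Waterworks(level=30.0)
print(plant.sensors())   # {'LL': True, 'L': True, 'H': False, 'HH': False}
```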

Fig. 4: Schematic overview of the Irrigation System.

C. Requirements Structuring and Prioritization (Step 2)

Based on the textual requirements description and illustrations, a structured set of prioritized requirements is needed to capture requirements systematically and in a well-defined way. Note that the prioritization represents the added value for the customer on three levels, i.e., critical (A), important (B), and less important (C) requirements. TABLE I presents a selection of the most important requirements of the system. Note that prioritized requirements can be used for test scenario and test case generation. For example, one test scenario is the initial filling of the tank until the upper water level of the tank is reached (i.e., sensor HH).


TABLE I
Prioritized Selected Requirements of Waterworks.

No. | Requirement               | Priority
1   | Filling of the water tank | A
2   | Tank may not be empty     | A
3   | No flooding accepted      | A

D. Modeling Use Cases (Step 3)

Use cases are usually modeled to illustrate the interaction of actors, i.e., humans and machines, with respect to related tasks. The use case notation is simple and understandable and supports the discussion with stakeholders who are not familiar with technical details. Thus, use cases are well applicable in quality assurance to illustrate basic workflows. Fig. 5 presents a simple use case for waterworks.

Fig. 5: Sample Use Case for Waterworks.

Based on the textual requirements (Section V-B), the prioritized requirements list (Section V-C), and the use cases, important scenarios for implementation and testing can be derived. Note that a scenario summarizes a set of individual (more detailed) tasks representing individual test cases.

TABLE II
Sample Scenarios of Waterworks.

No. | Scenario
1   | Initial filling of the tank (from empty to full tank)
2   | Draining of the tank (from full to empty tank)
3   | Regular operation, e.g., 10 l/min flow rate for P3

Because of the general view on the system from the requirements perspective, use-case and scenario-based test cases are used in system, acceptance, and factory tests.

E. Risk Assessment (Step 4)

Capturing critical risks, which can result in an unstable and possibly unsafe overall system state, is a key issue in engineering disciplines. Thus, risk assessment approaches, e.g., RiskIt [10] or Failure Mode and Effect Analysis (FMEA) [11], help to address critical system issues properly. Results from risk assessment can be used to select test cases according to the identified risks. Note that test cases should include error cases (EC) and special test cases (SC) to address tests at system borders. Additionally, risk assessment results can be used to limit testing effort by providing exit criteria for test processes (e.g., if all risks are covered by appropriate test cases). TABLE III presents a snapshot of identified risks for the waterworks system.

TABLE III
Sample Risks of Waterworks.

No. | Risk
1   | Flooding of the tank because of inoperative sensor HH and/or pump failures
2   | Defect of component P2/P1 can result in an empty tank
3   | Defect of sensor LL / HH not detectable

F. Definition of Components and Interfaces (Step 5)

Component diagrams illustrate the static structure of the system from a technical perspective, identifying components and interfaces for system collaboration (Fig. 6). Thus, component diagrams are the foundation for testing the interaction between components because all interfaces and related signals are represented in this model. Note that these diagrams illustrate the structure of the system, but test cases cannot be defined directly from the diagram.

Fig. 6: Component Diagram of Waterworks.

G. Deployment Diagram (Step 6)

UML deployment diagrams illustrate the physical layout and identify communication paths between the different components of the system. Deployment diagrams have strong benefits in the context of distributed systems (e.g., distributed control units) because they clearly represent the layout of the overall system. Fig. 7a illustrates the deployment diagram for the waterworks system. Note that the deployment diagram supports test case generation for distributed control systems because of its clear representation of components, which is reasonable even for small applications. Nevertheless, test cases build on the deployment diagram but are not directly derivable from it.

Fig. 7: a) Sample Deployment Diagram of Waterworks, b) State Chart of Waterworks.


H. Class Diagrams (Step 7)

The technical description of all identified components is the foundation for the system implementation on a detailed level. Thus, signals, variables, and interfaces have to be designed at a high level of detail. Note that this specification is comparable to the units addressed by unit tests in business IT software development within a test-first development strategy. Class diagrams are the structural diagrams on component and unit level and the basic material for test case generation.

I. State Charts (Step 8)

Structure diagrams, i.e., component (Section V-F), deployment (Section V-G), and class diagrams (Section V-H), represent the static design of the system. Note that static models are the foundation for test case generation, but there is no direct link between the test cases and these diagrams. System dynamics (i.e., behavior and interaction) comprise sequences of events, states, and the temporal behavior of the system. Note that use cases (system interaction of various actors) also belong to the behavior diagrams (see Section V-D). Behavior and interaction diagrams enable test case derivation directly from the model. Fig. 7b shows the system state chart of the waterworks system, including the tank model and the pump control unit. Note that this representation enables a detailed view of the system (e.g., changes of states based on current system states and events). Test cases can be derived (automatically) from this state chart after applying the structural models; see TABLE IV for example test cases derived from the waterworks state chart. Depending on complexity and risk, models can be refined to identify component-level tests in more detail. Fig. 8 shows a more detailed view of an individual pump. Additional test cases can focus on individual states and the transitions between states. The second section of TABLE IV presents some sample test cases for an individual pump.

Fig. 8: Detailed State Chart of an individual Pump.

State charts present the individual states of a system and the relationships between two states (i.e., transitions). Test cases can be derived directly from state charts, which also enable test coverage investigations. Software code and test cases can be derived from state charts with automation support (i.e., using executable state charts) [12]. Nevertheless, temporal behavior and sequences of steps are typical requirements in the automation systems domain, and these are only partially supported by state charts.
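A minimal sketch of automated test derivation from the pump state chart (transition coverage); the transition set follows the pump section of TABLE IV, while the events "UpToSpeed" and "Halted" are our assumptions:

```python
# Hypothetical transition set for the detailed pump state chart (Fig. 8).
PUMP_TRANSITIONS = {
    ("stopped",  "Pump.Start"):    "starting",
    ("starting", "UpToSpeed"):     "running",   # assumed event name
    ("running",  "Pump.Stop"):     "stopping",
    ("starting", "EmergencyStop"): "stopping",
    ("running",  "EmergencyStop"): "stopping",
    ("stopping", "Halted"):        "stopped",   # assumed event name
}

def derive_transition_tests(transitions: dict) -> list:
    """One test case per transition (full transition coverage):
    pre-condition (state), action (event), expected post-condition."""
    return [{"pre": pre, "action": event, "post": post}
            for (pre, event), post in transitions.items()]

for tc in derive_transition_tests(PUMP_TRANSITIONS):
    print(f"Given {tc['pre']!r}, on {tc['action']!r} expect {tc['post']!r}")
```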

J. Sequence Charts (Step 9)

Sequence charts visualize the sequence and temporal behavior of components within an automation system. Thus, sequence charts are a promising approach for modeling the temporal behavior of automation systems. Fig. 9 shows the sequence of the waterworks components. Engineers can derive test cases directly from the sequences in order to address temporal requirements. See the third section of TABLE IV for a set of sample test cases based on sequence charts.


Fig. 9: Waterworks Sequence Chart.

K. Activity Diagrams (Step 10)

Activity diagrams describe the workflow behavior of a system. They describe the state of activities by showing the sequence of activities performed. While similar to state machine diagrams, they do not show details about how objects behave or collaborate. In the automation context they are used to describe the manufacturing process. Note that activity diagrams are usually used for modeling workflows, e.g., logistics in manufacturing plants. Thus, activity diagrams are not applicable to the waterworks application.

L. Timing Diagrams (Step 11)

Timing diagrams illustrate the behavior of a system with respect to a sequence of events or time. They are used to explore the behavior of one or more objects over a given period of time. While quite similar to state machine or sequence diagrams, timing diagrams show an exemplary run of a scenario at a higher level. Fig. 10 shows the timing diagram for the initial filling of the tank and the activation of P3 after the tank is sufficiently filled. Based on the system structure models and the dynamic behavior and interaction diagrams, test cases can be derived manually or with automation support [9]. Note that the proposed test process enables engineers to stepwise refine the system design and to derive test cases more efficiently and effectively.
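A hedged sketch of a test derived from such a timing diagram (our illustration; the fill rate, capacity, threshold, and reaction window are assumed values):

```python
def simulate(fill_rate: float, capacity: float, minutes: int) -> list:
    """Toy plant run: returns timestamped events (minute, event)."""
    level, events, h_seen = 0.0, [], False
    for t in range(minutes):
        level = min(capacity, level + fill_rate)
        if not h_seen and level >= 0.8 * capacity:   # sensor H threshold (assumed)
            h_seen = True
            events += [(t, "H_reached"), (t + 1, "P3_started")]
    return events

def check_timing(events: list, max_reaction_min: int = 2) -> str:
    """Derived timing test: P3 must start within max_reaction_min of H_reached."""
    times = {event: t for t, event in events}
    assert "H_reached" in times and "P3_started" in times, "expected events missing"
    assert 0 <= times["P3_started"] - times["H_reached"] <= max_reaction_min, \
        "P3 reaction too slow"
    return "timing constraint satisfied"

print(check_timing(simulate(fill_rate=15.0, capacity=100.0, minutes=10)))
```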

Fig. 10: Timing Diagram of Waterworks.



TABLE IV
Waterworks: Test Cases based on the System State Chart, the Detailed State Chart (Pump), and the Sequence Chart.

System state chart:
No. | Description   | Type* | Pre-Condition                     | Action  | Post-Condition         | Expected Results
1   | Start pump P1 | NC    | H=true                            | H=false | P1 running             | P1.start=true
2   | Stop pump P2  | NC    | LL=true, L,H,HH=false, P1 running | L=true  | P1 running, P2 stopped | P2 stopped, H,HH=false
3   | Start pump P2 | NC    | HH,H=false, L,LL=true             | L=false | P2 running             | P2.start=true
4   | Stop pump P1  | NC    | L,LL=true, HH,H=false             | H=true  | P1 stopped             | P1 stopped

Detailed state chart (pump):
No. | Description    | Type* | Pre-Condition    | Action         | Post-Condition   | Expected Results
1   | Start pump     | NC    | State "stopped"  | Pump.Start     | State "starting" | State "starting"
2   | Stop pump      | NC    | State "running"  | Pump.Stop      | State "stopping" | State "stopping"
3   | Emergency stop | NC    | State "starting" | Emergency Stop | State "stopping" | State "stopping"
4   | Emergency stop | NC    | State "running"  | Emergency Stop | State "stopping" | State "stopping"

Sequence chart:
No. | Description                    | Type* | Pre-Condition       | Action  | Post-Condition | Expected Results
1   | Stop pump P1 at sensor level H | NC    | P1 running, H=false | H=true  | P1 stopped     | P1 stopped
2   | Water level below H            | NC    | P1 stopped, H=true  | H=false | P1 running     | P1 running

*Type: NC = Normal Case, EC = Error Case, SC = Special Case.

VI. DISCUSSION AND CONCLUSION

The increasing complexity of software components in automation systems requires a systematic testing approach to enable efficient and effective testing in case of changes. We presented an adapted test-driven development (TDD) approach for software in automation engineering and conducted an initial feasibility study using selected models from the UML diagram family.

RI-1. Identification of an effective test-driven software process for model-based test case generation in industrial automation engineering. In a top-down systems design, models can be refined based on structure and behavior models from the UML diagram family in 11 steps.

RI-2. Investigation of models that support test case generation in the context of automation systems design. The results of the feasibility study showed that models can support static and dynamic modeling activities. Static models are required as a foundation for describing the system. Behavior and interaction diagrams allow deriving test cases directly from the models. Nevertheless, the selection of models strongly depends on project complexity and size.

Future work includes (a) applying the proposed process approach in a larger application context at our industry partners, (b) investigating automated code and test case generation with tool support based on these models, and (c) integrating the automated test case generation workflow into systematic engineering processes in automation systems development.


ACKNOWLEDGMENT


We want to thank our partners from the logi.DIAG project for their valuable discussions and feedback. Parts of this work were funded by the Austrian Research Funding Agency (FFG) grant logi.DIAG (Bridge7-196929).


REFERENCES

[1] Ambler S.: Elements of UML 2.0 Style, Cambridge University Press, 2005.
[2] Baker P., Dai Z.R., Grabowski J.: Model-Driven Testing: Using the UML Testing Profile, Springer, 2007.
[3] Broy M., Jonsson B., Katoen J.-P., Leucker M., Pretschner A.: Model-Based Testing of Reactive Systems, Springer, 2005.
[4] Cai K.-Y.: Optimal software testing and adaptive software testing in the context of software cybernetics, Elsevier, 2002.
[5] Cockburn A.: Crystal Clear: A Human-Powered Methodology for Small Teams, Addison-Wesley, 2004.
[6] Duvall P.M., Matyas S., Glover A.: Continuous Integration: Improving Software Quality and Reducing Risk, Addison-Wesley, 2007.
[7] Estevez E., Marcos M., Sarachaga I., Orive D.: A Methodology for Multidisciplinary Modeling of Industrial Control Systems using UML, INDIN, 2007.
[8] Friedenthal S., Steiner R., Moore A.C.: A Practical Guide to SysML: The Systems Modeling Language, Elsevier, 2008.
[9] Fröhlich P., Link J.: Automated Test Case Generation from Dynamic Models, ECOOP, 2000.
[10] Kontio J.: Software Engineering Risk Management: A Method, Improvement Framework, and Empirical Evaluation, PhD thesis, Helsinki University of Technology, 2001.
[11] Love J.: Process Automation Handbook – A Guide to Theory and Practice, Springer, 2007.
[12] Mellor S.J., Balcer M.J.: Executable UML: A Foundation for Model-Driven Architecture, Addison-Wesley, 2002.
[13] Nascimento F.A.M., Oliveira M.F., Wehrmeister M.A., Pereira C.E., Wagner F.R.: MDA-based Approach for Embedded Software Generation from UML/MOF Repositories, Proc. of the 19th Symposium on Integrated Circuits and Systems Design, 2006.
[14] Ritala T., Kuikka S.: UML Automation Profile: Enhancing the Efficiency of Software Development in the Automation Industry, INDIN, 2007.
[15] Thramboulidis K.C.: Using UML in Control and Automation: A Model Driven Approach, INDIN, 2004.
[16] Winkler D., Biffl S., Östreicher T.: Test-Driven Automation – Adopting Test-First Development to Improve Automation Systems Engineering Processes, EuroSPI, Madrid, Spain, 2009.
[17] Winkler D., Hametner R., Biffl S.: Automation Component Aspects for Efficient Unit Testing, ETFA, Mallorca, Spain, 2009.
