SEI Observations and Reference Model for Software Integration Labs

Robert V. Binder July 11, 2018

Software Engineering Institute Carnegie Mellon University Pittsburgh, PA 15213

[DISTRIBUTION STATEMENT A] This material has been approved for public release and unlimited distribution.


Notices

Copyright 2018 Carnegie Mellon University. All Rights Reserved.

This material is based upon work funded and supported by the Department of Defense under Contract No. FA8702-15-D-0002 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center.

The views, opinions, and/or findings contained in this material are those of the author(s) and should not be construed as an official Government position, policy, or decision, unless designated by other documentation. References herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise do not necessarily constitute or imply its endorsement, recommendation, or favoring by Carnegie Mellon University or its Software Engineering Institute.

NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

[DISTRIBUTION STATEMENT A] This material has been approved for public release and unlimited distribution. Please see Copyright notice for non-US Government use and distribution.

This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at [email protected].

Carnegie Mellon® is registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.

DM18-0843


Overview
• What should a Software Integration Lab (SIL) do?
• SIL Reference Model
• SIL configurations observed at SEI customers
• Automotive best practices
• Test automation levels and effectiveness
• Testing productivity versus effectiveness


US Army SIL for M153 CROWS System

http://www.ardec.army.mil/armamentsec/facilities/crows.aspx
Images from the web site – SEI did not visit this lab.


SIL Reference Model: the System Under Test and its Environment

[Diagram] The SUT comprises the User Interface, Software Items, Run Time, OS, and Computing Hardware. Its Environment comprises SUT Users, Friendly Actors, External Users, External Systems, the Plant, and Hostile Actors, which interact with the SUT through Actuators and Sensors.

What should a Software Integration Lab do? SIL Goals

Evaluate interoperability and stability of:
• System Under Test (SUT) software items (SIs)
• SUT and run-time stack interaction
• SUT and sensors, actuators, and peripherals
• SUT and external systems
Support testing of partial SUT configurations, including falsework
Achieve realistic environmental conditions
Check completeness with respect to requirements and architecture
Support development, QT, DT, and OT
Support rapid-cycle DevOps


What should a Software Integration Lab do? SIL Non-goals (typical)

• White-box software evaluation (maintainability, structure, etc.)
• Comprehensive functional testing of SIs, the OS, or hardware
• Comprehensive functional testing of the SUT
• Comprehensive failure/restart/recovery testing
• SUT reliability or performance testing (MTBF, response time, utilization, etc.)
• SUT usability or effectiveness testing, user documentation evaluation
• Long-duration soak testing
• Safety testing
• Testing physical aspects of mechanical, electrical, or RF components


What should a Software Integration Lab do? SIL Strategy: Testing System

The Testing System is a software-defined environment purpose-built to achieve testing goals for the SUT and its environment. It should be funded, developed, staffed, and managed as a first-class sustainment asset.

Maximal test automation:
• Test asset management system, with all test code under CM control
• Model-based test generation
• Test execution system(s)
• Test objects drive adapter objects that drive falsework and real SUT interfaces
• User-interface test suites follow the feature-action-control pattern (a sketch follows below)
• Seamless interleaving of manual and traditional test code/procedures
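The slide names a "feature-action-control" pattern for user-interface test suites. One plausible reading, sketched below in Python with hypothetical names (OperatorConsoleControls, OperatorActions, and FakeDriver are illustrative, not from the source), keeps feature-level tests free of widget details by layering them over action- and control-level objects:

```python
# Control layer: the only place that touches concrete UI widgets.
# The "driver" object stands in for whatever UI automation API is actually in use.
class OperatorConsoleControls:
    def __init__(self, driver):
        self.driver = driver
    def press(self, button_id):
        self.driver.click(button_id)
    def read_field(self, field_id):
        return self.driver.text_of(field_id)

# Action layer: user-visible operations composed from control-level steps.
class OperatorActions:
    def __init__(self, controls):
        self.controls = controls
    def arm_system(self):
        self.controls.press("btn_arm")
        return self.controls.read_field("status_banner")

# Feature layer: the test itself, written only in terms of actions.
def test_arming_updates_status(actions):
    assert actions.arm_system() == "ARMED"

if __name__ == "__main__":
    class FakeDriver:  # stand-in so the sketch runs without a real UI
        def click(self, _):
            self.armed = True
        def text_of(self, _):
            return "ARMED" if getattr(self, "armed", False) else "SAFE"

    test_arming_updates_status(OperatorActions(OperatorConsoleControls(FakeDriver())))
    print("feature-level test passed")
```

With this layering, a UI redesign touches only the control layer, and swapping the GUI driver for an API driver touches only the driver, not the feature-level tests.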


What should a Software Integration Lab do? SIL Strategy: Test Approach

Design realistic test scenarios to achieve interaction coverage (a mode/transition coverage check is sketched below):
• Exercise all modes (normal and failure), mode transitions, and duty cycles
• Exercise at least one failure of each sensed/managed mechanical, electrical, or RF interface
• Verify datastore integrity at entry/exit of each mode
Calibrate test artifacts:
• Appropriate level of rigor for test artifacts
• Test artifacts should be reusable
• Test artifacts must be maintainable
Living antecedent traceability
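A minimal sketch of the "exercise all modes and mode transitions" step: given a mode model (the modes and transitions below are hypothetical, not taken from the source) and the mode sequences observed during test runs, report anything never exercised.

```python
# Hypothetical mode/transition model; a real SIL would derive this from the
# requirements and architecture baseline.
MODES = {"OFF", "STANDBY", "OPERATE", "DEGRADED", "FAIL_SAFE"}
TRANSITIONS = {
    ("OFF", "STANDBY"), ("STANDBY", "OPERATE"), ("OPERATE", "STANDBY"),
    ("OPERATE", "DEGRADED"), ("DEGRADED", "FAIL_SAFE"), ("FAIL_SAFE", "OFF"),
}

def coverage_gaps(executed_runs):
    """Return the modes and transitions never exercised by the executed runs.

    executed_runs: iterable of mode sequences observed during test execution,
    e.g. [["OFF", "STANDBY", "OPERATE"], ...].
    """
    seen_modes, seen_transitions = set(), set()
    for run in executed_runs:
        seen_modes.update(run)
        seen_transitions.update(zip(run, run[1:]))
    return MODES - seen_modes, TRANSITIONS - seen_transitions

if __name__ == "__main__":
    runs = [["OFF", "STANDBY", "OPERATE", "DEGRADED", "FAIL_SAFE", "OFF"]]
    missed_modes, missed_transitions = coverage_gaps(runs)
    print("Modes never entered:", missed_modes or "none")
    print("Transitions never exercised:", missed_transitions or "none")
```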


What should a Software Integration Lab do? SIL Strategy: Test Coverage

Test at least once:
• Every externally triggered interaction
• Every internally triggered interaction (e.g., timer)
• Every requirement for an interaction and its observable effects
• Each mode and transition, including failure modes
Evaluate interaction coverage (end-to-end paths)
Test at least pair-wise combinations of inputs, configurations, settings, etc. (a coverage-check sketch follows)
Don’t rely on stale regression testing
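A minimal sketch of checking pair-wise coverage: given configuration parameters and a candidate test suite (both hypothetical here), list every parameter-value pair that no test case exercises. Tools such as NIST ACTS generate pair-wise suites; this sketch only verifies one.

```python
from itertools import combinations, product

# Hypothetical SUT configuration parameters, for illustration only.
PARAMETERS = {
    "radio": ["primary", "backup"],
    "gps": ["live", "simulated", "denied"],
    "power": ["vehicle", "battery"],
}

def uncovered_pairs(test_suite):
    """Return every parameter-value pair not exercised by any test case.

    test_suite: list of dicts mapping parameter name -> chosen value.
    """
    missing = set()
    for (p1, values1), (p2, values2) in combinations(PARAMETERS.items(), 2):
        for v1, v2 in product(values1, values2):
            if not any(t.get(p1) == v1 and t.get(p2) == v2 for t in test_suite):
                missing.add(((p1, v1), (p2, v2)))
    return missing

if __name__ == "__main__":
    suite = [
        {"radio": "primary", "gps": "live", "power": "vehicle"},
        {"radio": "backup", "gps": "simulated", "power": "battery"},
        {"radio": "primary", "gps": "denied", "power": "battery"},
    ]
    for pair in sorted(uncovered_pairs(suite)):
        print("Not covered:", pair)
```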


What should a Software Integration Lab do? SIL Strategy: Testing System’s Network

The Testing System’s network is isolated from the SUT network. SUT network(s) provide a passive “tee” for injection and monitoring. Configuration-as-code and containerization stage both the Testing System and the SUT.

Staging the Testing System for a classified SUT:
• Testing System development is impractical without public internet connectivity
• The Testing System is developed in an unclassified environment
• Falsework allows tests to run in the unclassified environment
• A one-way data diode or air-gap staging moves the Testing System to the classified environment
• The Testing System container is installed in the classified environment
• The SUT container is installed into the classified environment and tested


What should a Software Integration Lab do? SIL Strategy: Process

Produce specific, measured, actionable, realistic, and timely evaluation results.
Follow a quality management standard: ISO/IEC 17025, General requirements for the competence of testing and calibration laboratories.
Support upstream and downstream activities:
• Provide design-for-testability guidance and entry requirements to suppliers and developers
• Operate a SIL instance dedicated to developer continuous integration (CI)
• Gate incoming candidates: accept a new SUT version only after upstream CI passes, the smoke test passes, and the test readiness review is accepted (a sketch of this gate follows)
• Continuously evaluate and improve the downstream handoff
• Track all integration bug reports; use them to evaluate and improve test effectiveness
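The "gate incoming candidates" bullet amounts to a simple conjunction of entry criteria. A minimal sketch, with field names that are hypothetical rather than an SEI-prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class CandidateBuild:
    """Minimal stand-in for the metadata delivered with a new SUT drop."""
    version: str
    upstream_ci_passed: bool
    smoke_test_passed: bool
    trr_accepted: bool  # test readiness review

def accept_into_sil(candidate: CandidateBuild) -> bool:
    """Accept a new SUT version only if every entry criterion holds."""
    checks = {
        "upstream CI": candidate.upstream_ci_passed,
        "smoke test": candidate.smoke_test_passed,
        "test readiness review": candidate.trr_accepted,
    }
    for name, passed in checks.items():
        if not passed:
            print(f"Rejecting {candidate.version}: {name} not passed")
            return False
    return True

if __name__ == "__main__":
    accept_into_sil(CandidateBuild("2.3.1", upstream_ci_passed=True,
                                   smoke_test_passed=True, trr_accepted=False))
```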


SIL Reference Model: the System Under Test and its Environment

[Diagram, repeated from the earlier reference-model slide: the SUT (User Interface, Software Items, Run Time, OS, Computing Hardware) and its Environment (SUT Users, Friendly Actors, External Users, External Systems, Plant, Hostile Actors), connected through Actuators and Sensors.]

SIL Reference Model: Testing System Architecture

[Diagram] Testing System components: Test Asset Management, Antecedents, Configuration Management, Test Modeling, Test Generation, Test Execution, Test Evaluation, Anomaly Reporting, and DevOps Staging. The Testing System connects to the SUT and its Environment through Points of Control and Observation (PCOs).

SIL Reference Model: Adapter Framework for all PCOs

Adapters decouple test objects from physical interfaces so the same test suites can run on multiple SUT/SIL configurations.

[Diagram] Test Execution drives a Test Object, which calls an Adapter Object through an Abstract Interface; the Adapter Object drives the SUT physical interface. Setup selects the SUT configuration behind each adapter (Null (SW), Mock (SW), Simulator (SW/HW), or Emulator (HW)) and the Environment (real controlled or real uncontrolled). A sketch of the pattern follows.
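A minimal Python sketch of the adapter framework, assuming a hypothetical turret interface (TurretAdapter, NullTurret, and SimulatedTurret are illustrative names, not from the source): the test object calls only the abstract interface, and setup chooses which adapter stands behind it.

```python
from abc import ABC, abstractmethod

class TurretAdapter(ABC):
    """Abstract interface seen by test objects; one PCO in the reference model."""
    @abstractmethod
    def slew_to(self, azimuth_deg: float) -> float:
        """Command a slew and return the reported azimuth."""

class NullTurret(TurretAdapter):
    """Null falsework: placeholder for an unavailable interface; limits coverage."""
    def slew_to(self, azimuth_deg):
        return azimuth_deg

class SimulatedTurret(TurretAdapter):
    """Software simulation mimicking selected behavior (here, a slew-rate limit)."""
    def __init__(self, max_step=30.0):
        self.position, self.max_step = 0.0, max_step
    def slew_to(self, azimuth_deg):
        delta = max(-self.max_step, min(self.max_step, azimuth_deg - self.position))
        self.position += delta
        return self.position

def test_slew_command(turret: TurretAdapter):
    """The same test object runs unchanged against falsework or a real HIL adapter."""
    assert abs(turret.slew_to(15.0) - 15.0) < 1.0

if __name__ == "__main__":
    for adapter in (NullTurret(), SimulatedTurret()):
        test_slew_command(adapter)
        print(type(adapter).__name__, "passed")
```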


SIL Reference Model: DevOps for the Testing System

[Diagram] Staging flow: a Testing System (TS) release candidate moves from internet staging through a data diode into successive TS instances. The continuous integration tool chain feeds the SUT developer SIL instance; the SUT release candidate is staged into the SUT release candidate SIL instance and then into the secure SIL instance.

Profiled Software Testing Labs: Composite of multiple SEI engagements

Profile:
• 25 testing labs
• ~80% developing or sustaining weapon systems
• ~20% developing or sustaining enterprise systems
• Some SILs also used for training
• Data collected for different projects by interviews and visits
• Includes experience of standing up a SIL at SEI for the SOCOM TALOS program

Notable:
• Almost no effective shift-left
• Many challenges for development and validation of simulation falsework
• Upstream testing often superficial
• No explicit design-for-testability
• High friction moving from unclassified to classified environments
• Automated SILs have the highest defect removal efficiency


Profiled Lab Characteristics

[Table] Characteristics recorded for each profiled lab: application type (Embedded, C2, EIS, ISR), developer, test stage (QT, DT, OT), scope (SI, component, LRU, system), test run days, CI/CD use, configuration falsework (simulation, hardware-in-the-loop), environment falsework (simulation, hardware-in-the-loop), number of tests, percent automated, and effectiveness.

Application Type and Stage Used

Profiled SILs support mostly embedded apps, during QT and DT.

[Charts] Count of labs by application type (C4I, C2, ISR, EIS, Embedded) and by scope/stage (System Dev, SI Dev, LRU QT, LRU DT, System QT, System DT).

Configuration and Automation

One third use automated regression suites, one third have no automation; one quarter use both simulation and HIL for LRU integration.

[Charts] Percent of tests automated per lab (0% to 100%) and number and type of falseworks per lab (real only, or 1 to 4 falsework elements).

Test Automation Levels

• Ad Hoc Manual. Main benefits: lowest cost; exercises usability. Concerns: unknown coverage, not repeatable, inconsistent, omissions. Risk reduction: Low. Tech baseline ownership via test design: None.
• Repeatable Manual. Main benefits: requirement check-off, repeatable, low cost. Concerns: superficial, long test time. Risk reduction: Low-Med. Ownership: Some.
• Naive Automation. Main benefits: requirement check-off, repeatable, run on demand. Concerns: superficial, brittle, high maintenance cost. Risk reduction: Low-Med. Ownership: Some.
• Layered Architecture Automation. Main benefits: requirement check-off, repeatable, run on demand, lower maintenance cost. Concerns: superficial; requires software engineering standards and staffing/skilling. Risk reduction: Low-Med. Ownership: Some.
• Model-based Automation. Main benefits: maintain the model and generate tests, deep coverage, lowest cost per test. Concerns: staffing/skilling. Risk reduction: Med-High. Ownership: Deep.

Layered, model-based test automation achieves the highest ROI and effectiveness.


Automation Levels at Profiled SILs

• Naïve Automation: 52%
• No Automation: 32%
• Layered Automation: 12%
• Model-based Automation: 4%

Good news: all labs follow a regular testing process; none are ad hoc or skip testing.


Profiled Labs Productivity Comparison

[Table] Rows are falsework/real configurations (LRU simulated or real hardware-in-the-loop; LRU simulated; deployed system, all real). Columns are test execution automation (manual interaction, no assist; manual interaction, software assisted; automated test case execution). Cell values are estimated average test points evaluated per day during a test cycle; reported values range from about 20 per day for unassisted manual execution on a deployed system to about 2200 per day for automated execution against simulated or HIL LRUs.


Best Practices: Testing Strategies

Practices compared between the profiled labs and automotive best practice:
• Coverage: every requirement at least once
• Model-based test generation
• Test asset management
• Profile-based reliability testing
• Combinatorial design
• Data-driven promotion/acceptance


Best Practices: Testing System Configuration

Practices compared between the profiled labs and automotive best practice:
• SI build with controllable adapters
• Continuous integration with LRU simulation
• LRU simulation
• LRU simulation + real LRU
• Testable full-up chassis
• Manual testing on the system
• Manual testing on the system with telemetry/capture
• Field monitoring


Best Practices: Test Execution

Practices compared between the profiled labs and automotive best practice:
• Integrated component simulation/emulation
• Automated test harness
• User interface automation
• Network traffic monitoring
• Controllable fault injection


Backup


Falsework

Null · Stub · Mock · Simulation · Emulation

“Falsework consists of temporary structures used in construction to support spanning or arched structures in order to hold the component in place until its construction is sufficiently advanced to support itself.”


Falsework for a SIL

• Null (SW). Purpose: placeholder. Role in a SIL: may be part of the SUT; limits coverage; placeholder for an unavailable interface.
• Mock (SW). Purpose: implement a subset of SI behavior. Role in a SIL: may be part of the SUT; limits coverage; placeholder for an unavailable interface.
• Simulation (SW). Purpose: mimic selected behaviors of an SI, sensor, actuator, external system, or user. Role in a SIL: use a simulator to generate input or accept output.
• Emulation (SW/HW). Purpose: replicate all behaviors of a component, sensor, actuator, or external system with hard real-time constraints. Role in a SIL: use a HIL emulator to generate actuator input or sensor output using high-speed digital devices.

In software testing, falsework refers to stubs, mocks, fakes, “service virtualization,” generated or programmed simulations, and high-fidelity hardware-based emulation.
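To make the mock/simulation distinction concrete, a small sketch with a hypothetical external fire-control interface (names and behavior invented for illustration): the mock returns canned replies, the simulation mimics a selected behavior (a warm-up sequence), and both drive the same SUT-side logic.

```python
class MockFireControl:
    """Mock: implements only the subset of behavior the test needs, with canned replies."""
    def request_status(self):
        return {"ready": True}  # fixed response, no internal state

class SimulatedFireControl:
    """Simulation: mimics a selected behavior, here a warm-up state machine."""
    def __init__(self, warmup_polls=2):
        self._remaining = warmup_polls
    def request_status(self):
        if self._remaining > 0:
            self._remaining -= 1
            return {"ready": False}  # not ready until warm-up completes
        return {"ready": True}

def wait_until_ready(external_system, max_polls=5):
    """SUT-side logic under test: poll the external system until it reports ready."""
    for _ in range(max_polls):
        if external_system.request_status()["ready"]:
            return True
    return False

if __name__ == "__main__":
    assert wait_until_ready(MockFireControl())
    assert wait_until_ready(SimulatedFireControl())
    print("Both falsework variants drive the same SUT-side logic")
```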


Interface Coverage Matrix

Each cell represents a possible interaction between actor types: Software Item (SI), Run-Time Stack (RTS), Hardware (HW), and External System (XS).
• All cells are candidates because any actor can initiate an interaction
• The table shows a minimal subset for a typical system; some systems have only a few interactions, some have all
• The structure of the system must be mapped to be sure interactions are covered

Good sources for mapping interactions:
• Interface Control Documents (ICDs)
• Sequence diagrams
• Code analyzers like SciTools Understand
• Runtime logs and traffic
• User documentation

A sketch of building such a matrix from observed interactions follows.
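A minimal sketch of populating the matrix from observed interactions (the candidate cells and observed pairs below are hypothetical; in practice they would come from ICDs and runtime logs respectively):

```python
ACTOR_TYPES = ["SI", "RTS", "HW", "XS"]  # software item, run-time stack, hardware, external system

def coverage_matrix(observed, candidates):
    """Mark which candidate initiator->responder cells were exercised.

    observed:   iterable of (initiator, responder) pairs, e.g. parsed from runtime logs.
    candidates: set of (initiator, responder) pairs that the ICDs and sequence
                diagrams say can occur (not every cell exists in every system).
    """
    seen = set(observed)
    return {cell: (cell in seen) for cell in candidates}

if __name__ == "__main__":
    candidates = {("SI", "SI"), ("SI", "RTS"), ("SI", "HW"), ("SI", "XS"), ("XS", "SI")}
    observed = [("SI", "RTS"), ("SI", "XS"), ("SI", "RTS")]
    for cell, covered in sorted(coverage_matrix(observed, candidates).items()):
        print(f"{cell[0]:>3} -> {cell[1]:<3} {'covered' if covered else 'NOT covered'}")
```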


The Test Automation Tool Chain

There are many hundreds of COTS, FOSS, and GOTS software testing tools.


Testing Scope, Tooling, Focus, and Lanes

• Developer / unit / SI scope. Example tools: JUnit, SonarQube. Focus: functions.
• Component / SI / subsystem scope. Example tools: Selenium, SoapUI. Focus: features, user stories.
• System / SoS scope. Example tools: TestStand, JMeter. Focus: use cases, performance, mission threads.

Lanes: CTR SQT, DTO, OTO, PO FAT.


Which testing tool(s) are right for your job(s)?

Often dozens of tools exist in each niche. Tools are specialized for:
• Testing purpose (function, performance, security, …)
• Target interface (API, GUI), IDE, and programming language
• Application domain: transaction processing, embedded, mobile app, …
• Runtime stack(s) of the target and the tool

[Diagram] The tool space spans application domains (enterprise information systems, embedded control systems), testing purposes (function, performance, security), interfaces supported (API, GUI), and platforms supported (*nix, RTOS, Windows).


Test Automation Reference Architecture: Java/Cloud Stack

Architecture slots: Requirements Management; Model-based Systems Engineering; BDD/ATDD Support; Combinatorial Design; Model-based Testing; Dev Test Framework; Code Coverage Analyzer; Static Analyzer; Mutation Testing; Fault Injection; Test Runner; Web UI Harness; .NET UI Harness; Smartphone Harness; SOA/API Harness; Load Generation; Performance Monitor; Network Capture; Configuration Management; Continuous Integration; Continuous Deployment; Falsework; Bug Tracking; Test Asset Management; Application Life Cycle Management. Italics indicate advanced capability.

Representative tools named for this stack: Maven, Selenium, Ranorex, Appium, TestMaker, JBehave, ACTS, Smartesting, JUnit, Klocwork, Fortify, Pit, Byteman, Silk Performer, Nagios, Wireshark, Virtualize.

Representative tools are not recommendations; many others exist for each slot. Company, product, and service names are used for illustrative purposes only. All trademarks and registered trademarks are the property of their respective owners.


Test Automation Reference Architecture: C++/RTOS Stack

Architecture slots (as in the Java/Cloud stack): Requirements Management; Model-based Systems Engineering; BDD/ATDD Support; Combinatorial Design; Model-based Testing; Dev Test Framework; Code Coverage Analyzer; Static Analyzer; Mutation Testing; Fault Injection; Test Runner; UI Harnesses; SOA/API Harness; Load Generation; Performance Monitor; Network Capture; Configuration Management; Continuous Integration; Continuous Deployment; Falsework; Bug Tracking; Test Asset Management; Application Life Cycle Management. Italics indicate advanced capability.

Representative tools named for this stack: SpecFlow, ACTS, MaTeLo, GoogleTest, VectorCAST, Flawfinder, Insure++, FITH, CANoe, nmon, Link 16 Diagnostic, MATLAB, TestStand, Squish, SoapUI.

Representative tools are not recommendations; many others exist for each slot. Company, product, and service names are used for illustrative purposes only. All trademarks and registered trademarks are the property of their respective owners.


Acronyms

API: Application Programming Interface
CI: Continuous Integration
CM: Configuration Management
COTS: Commercial off the Shelf
CPU: Central Processing Unit
DT: Developmental Testing
DTO: Developmental Testing Organization
FOSS: Free and Open Source Software
FQT: Factory Qualification Test
GOTS: Government off the Shelf
GUI: Graphical User Interface
HIL: Hardware in the Loop
HW: Hardware
ICD: Interface Control Document
IDE: Integrated Development Environment
LRU: Line Replaceable Unit
MBT: Model-based Testing
MTBF: Mean Time Between Failures
OS: Operating System
OT: Operational Testing
OTO: Operational Testing Organization
QT: Qualification Testing
RCA: Root Cause Analysis
RF: Radio Frequency
RTOS: Real-Time Operating System
SI: Software Item
SIL: Software Integration Lab
SoS: System of Systems
SQT: System/Software Qualification Test
SUT: System Under Test
SW: Software
TS: Testing System
UI: User Interface


Contact Information

Robert Binder
[email protected]
+1 412-268-1549

Organization
Carnegie Mellon University
Software Engineering Institute
Client Technical Services Division

U.S. Mail
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612, USA

Web
www.sei.cmu.edu
www.sei.cmu.edu/contact.cfm

Customer Relations
Email: [email protected]
Telephone: +1 412-268-5800
SEI Phone: +1 412-268-5800
SEI Fax: +1 412-268-6257
