Metrics for Evaluation of Trustworthiness-By-Design Software Development Processes

Sandro Hartenstein, Holger Könnecke
Department of Economics, Brandenburg University of Applied Sciences, 14770 Brandenburg an der Havel, Germany
{sandro.hartenstein, koennech}@fh-brandenburg.de

Abstract

In an ideal software development, trustworthiness of software would be provided by an absolute guarantee that it will perform its required functions under all possible circumstances, do so on time, and never perform any actions that have hazardous consequences. In practice, however, this hardly ever happens, since different software products have different degrees of trustworthiness. This paper shows how to develop a trustworthiness metric model for Socio-Technical Systems by identifying trustworthiness goals and consequently deriving software development metrics based on the Goal-Question-Metric method. Based on a state-of-the-art analysis and a survey with a large number of participants, the factors that are believed to determine the trustworthiness of STS are analyzed [7]. Software development processes and practices have then been analyzed with respect to their success in enabling these factors [11]. Next, a set of metrics has been defined; repositories and projects will be identified and tested in order to gather information about their intrinsic characteristics and to check whether the previously identified factors can be measured. Finally, a number of tools will be developed to measure the trustworthiness goals for which no metrics are available.

Keywords: Metrics; Process Metrics; Software Development Process; Computation; Trustworthiness.

Introduction

Socio-Technical Systems (STS) include humans, organizations, and the information systems that they use to achieve certain goals [12]. They are increasingly relevant for society, since advances in ICT technologies, such as cloud computing, facilitate their integration into our daily life. Due to the difficulty of preventing malicious attacks, vulnerabilities, or the misuse of sensitive information, users might not trust these systems. Trustworthiness in general can be defined as the assurance that the system will perform as expected, or meets certain requirements (cf., e.g. [1]). We consider trustworthiness as a multitude of quality attributes. As a means of constructive quality assurance, development methodologies should explicitly address the different challenges of building trustworthy software as well as of evaluating trustworthiness, which current development methodologies do not support. We believe that, in order to enable trustworthiness goals in such systems, the trustworthiness of the development process itself has a major effect. A major goal of this work is to develop and propose a trustworthy methodology for creating software. One of the dimensions that satisfies this goal is the move from a qualitative to a quantitative view. Therefore, we have developed a set of metrics to increase the trustworthiness of the software development process. The following sections give an overview of how we investigated and elaborated useful process metrics for the software development process.

Related Work

In our research we use several established models and best practices. The Common Criteria approach is our starting point for determining process steps and process artefacts in software engineering [2]. To simplify the steps and artefacts, we synchronize the steps with the Microsoft Security Development Lifecycle [8]. The third standard we use is ISO/IEC 21827 [6]. It is based on the Systems Security Engineering Capability Maturity Model and supplies us with information about controls and capabilities. The best practices OWASP CLASP [9] and OpenSAMM [10] influenced our work towards good applicability and acceptance. Our recent work addressing trustworthiness and trustworthiness product metrics has also been influential on this research. Trustworthiness product metrics are described in our contribution to the 9th Future Security conference [5]. A further explanation of trustworthiness attributes in engineering was published in Cloud Computing and Services Science [4].

Method

To develop the metrics we used the Goal-Question-Metric (GQM) method [3]. GQM handles the problem of deciding what to measure in order to reach one's goals. It is a method invented by Prof. Victor R. Basili, Dr. David Weiss and Prof. Dr. Dieter Rombach. GQM is based on the idea of goal-oriented measurement and is therefore a top-down approach: one starts with improvement goals, tries to make them measurable, and finally tries to reach them [13].

Figure 1 The GQM paradigm, derived from Basili [13].

Processing GQM means performing six important steps [13]:
1. Characterize the environment
2. Identify measurement goals and develop measurement plans
3. Define data collection procedures
4. Collect, analyze and interpret data
5. Perform post-mortem analysis and interpret data
6. Package experience
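The goal-question-metric derivation used throughout this paper can be sketched as a small data structure. This is our own illustration, not part of GQM or of the paper's tooling; all class and variable names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    """A quantitative measure; in this paper always a percentage."""
    name: str

@dataclass
class Question:
    """A question that makes a goal measurable."""
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    """An improvement goal, refined top-down: goal -> questions -> metrics."""
    name: str
    questions: list = field(default_factory=list)

# Example instantiation for the "Secure IDE" governance goal (cf. Table 2):
secure_ide = Goal(
    "Only secure IDEs are used.",
    questions=[Question(
        "How much code is processed by secure IDEs?",
        metrics=[Metric("% of lines of code processed by a secure IDE")],
    )],
)
```

The top-down direction is what distinguishes GQM from simply collecting whatever numbers are easy to obtain: every metric exists only because a question, and ultimately a goal, demands it.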

According to [3], good metric values have the following characteristics:
- accurate
- precise
- valid
- correct
- consistent
- current or time stamped
- can be replicated
- can be compared to previous measurements, target values, or benchmarks
- can stand on their own ([3], p. 51)

We start with a few goals to keep the complexity relatively low. The benefit of measurement will increase with every single goal.

Our Approach in Defining Trustworthiness Goals and Metrics in the Trustworthiness-by-Design Process

In order to develop process metrics, we followed three steps. In a first step, we investigated the state of the art and recommendations for developing process metrics. We mapped existing sources already used in the OPTET project (for the identification of development capability patterns), which in some cases already contain hints for process metrics. The project team has checked them for a useful quality contribution with a focus on trustworthiness, and as a result we have mainly been guided by the following sources, which are current best-practice standards:
- Common Criteria ISO 15408
- SSE-CMM ISO 21827
- BSIMM / OpenSAMM
- Microsoft SDL
- OWASP CLASP
In a second step, we identified some similar process phases and some initial process metrics for each phase. For easier handling of these process metrics, and on the basis of the best-practice standards, the project team has identified so-called "Quality Goals" for each development phase. The quality goals play a similar role to the attributes used when developing product trustworthiness metrics, and will serve as "Goals" in the subsequent GQM methodology. Note that "Governance" is not a "real" development phase, but it is necessary in order to determine and specify requirements and guidelines across all phases. Therefore we have added "Governance" as a "meta" phase to simplify the representation of our approach. In Table 1, you can find the Quality Goals together with a short description.

Governance: Trustworthy Employees, Secure IDE, Iterative Improvement, Policies and Processes, Legal Compliance
Requirements: Completeness, Consistency, User orientation, Traceability, Legal compliance, Precision
Design: Simplicity, Traceability, Pattern orientation, Completeness
Coding: Code generation, Documentation, Correctness, Verifiability
Testing: Test coverage, Requirements coverage, Auditability, Fuzz coverage, Integration test coverage
Deployment: Documentation, Integrity, Customizability, Legal Compliance, Record
Support: User centricity, Competence, Effectiveness, Integrity, Reversibility, Traceability

Figure 2 The process phases and the corresponding trustworthiness goals.

Finally, we developed process metrics using the GQM methodology, in addition to the few already existing ones identified in step 2. The aim was to develop at least one process metric for each quality goal. We completed this step by checking whether the developed metrics indeed measure a process quality goal, thereby contributing to a trustworthy software development process. The results of the GQM method applied to the trustworthiness quality goals can be found in the next section.

Table 1 Quality Goals Description

Governance
- Trustworthy Employees: The employees need to be trustworthy.
- Secure IDE: Defined tools to develop software are used.
- Iterative Improvement: The development process needs iterative improvements.
- Policies and Processes: The development process needs clear policies.
- Legal Compliance: The development process needs to comply with legal requirements.

Requirements
- Completeness: The requirements need to be complete.
- Consistent: The requirements need to be consistent.
- User oriented: The requirements consider the end user expectations.
- Validated / Traceable: The requirements need to be traceable.
- Legal: The requirements need to cover the necessary legal regulations.
- Precise: The requirements need to be precise.

Design
- Simple: The software design needs to be as simple as possible.
- Traceable: The structure of the software is understandable and arises from the requirements.
- Pattern-based Design: The structure of the software uses established patterns.
- Completeness: The software design needs to be complete.

Coding
- Code generation: The source code should be generated automatically as far as possible.
- Documented: The code needs to be well documented.
- Error free: The code needs to be error free.
- Verifiable: The code needs to be verifiable with respect to the requirements.

Testing
- Test coverage: The tests need to cover the entire software.
- Requirements Coverage: The tests need to cover the requirements.
- Auditable: The test results need to be verifiable.
- Fuzzy Coverage: The tests need to use random inputs.
- Integration test coverage: All involved components need to be covered by integration tests; the tests need to cover the collaboration with other software components completely.

Deploy
- Documented: The deployment of the software needs to be fully documented.
- Integrity: The deployment process needs to maintain the software integrity (no subversions covering individual requirements).
- Customizable: The installation of the software needs to be adaptable to customer requirements; the customizing options need to be documented.
- Legal: The deployment of the software needs to comply with all applicable legal requirements.
- Recorded: All customer installations of the software need to be recorded.

Support
- User centricity: The user needs to be the focus during support calls.
- Competence: The support team needs to be suitably qualified.
- Effectiveness: The solution of problems needs to be suitable.
- Integrity: The support process needs to be secure against manipulation.
- Reversible: The application of a problem solution needs to be reversible.
- Traceable: The application of a problem solution needs to be documented.

Trustworthiness Goals, Questions and Metrics in the Trustworthiness-by-Design Process

For the quality goal consistency, we explain by way of example how it can be understood both as a product metric and as a process metric. As a product metric, consistency means, for example, that the requirements for a product have to be consistent. For the process metric of consistency, it is necessary that the steps in the development process are consistent with each other and can be compared. For example, the requirements for the development process should be consistent, so that developers can use metrics in the development process to validate that each phase is trustworthy.

0. Governance
Software should only be developed by trustworthy employees and needs to be supported by secure tools. Each development phase should be processed iteratively, with clear policies and defined processes. The whole development process has to be compliant with all applicable local laws.

Table 2 Derived metrics for trustworthiness goals for Governance

Quality Goal: Employees
Goal: Employees are security screened.
Question: How many employees are security screened?
Metric: % of employees that have been security screened

Quality Goal: Secure IDE
Goal: Only secure IDEs are used.
Question: How much code is processed by secure IDEs?
Metric: % of lines of code that have been processed by an IDE with the following properties: source code is protected against unauthorized change; approval requires dual control; authentication of developers; no other access to source code than through the IDE; supports change logs with assignment to developers

Quality Goal: Iterative
Goal: Development processes are used iteratively.
Question: How many process steps are executed iteratively?
Metric: % of development process steps that are executed iteratively

Quality Goal: Policies and processes
Goal: Development process steps are covered by policies.
Question: How many process steps are covered by policies?
Metric: % of development process steps covered by policies

Quality Goal: Legal
Goal: Development practices are compliant with (local) law.
Question: How many practices are in accordance with the (local) law?
Metric: % of development practices that are in accordance with the (local)¹ law

¹ This depends on the development locations, as well as customer / delivery regional / national targets. Individual metrics per country might be necessary.
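All governance metrics above are ratios of a screened, covered, or compliant subset against a total. As an illustration (not part of the paper), a minimal helper makes the convention explicit, including the edge case of an empty population; the example figures are hypothetical.

```python
def percentage(part, whole):
    """Return part/whole as a percentage; 0.0 when nothing is measured yet."""
    return 100.0 * part / whole if whole else 0.0

# "% of employees that have been security screened" from Table 2,
# with hypothetical numbers: 18 of 24 employees screened.
screened = percentage(part=18, whole=24)  # 75.0
```

Defining the empty-population case up front matters in practice: a project that has not yet started screening should report 0 %, not crash the measurement tooling.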

1. Requirements
The first goal focuses on requirement completeness, because the requirements need to be complete. The requirements also need to be consistent and must consider the end user expectations in order to be user oriented. The requirements have to cover the necessary legal regulations and need a precise formulation to prevent differing interpretations. Finally, requirements must be traceable and need to be validated.

Table 3 Derived metrics for trustworthiness goals in the Requirements Phase

Quality Goal: Completeness
Goal: All stakeholder expectations are covered by explicitly defined requirements.
Question: How many stakeholder expectations are covered by explicitly defined requirements?
Metric: % of stakeholder expectations covered by explicitly defined requirements

Quality Goal: Consistent
Goal: Requirements are free of conflicts.
Question: How many requirements are free of conflicts?
Metric: 1 - (% of conflicts in requirements vs. all requirements)

Quality Goal: User oriented
Goal: The requirements consider the end user expectations.
Question: How many requirements consider the end user expectations?
Metric: % of requirements that implement an end user expectation

Quality Goal: Validated / Traceable
Goal: The requirements are validated and traceable.
Question: How many requirements are validated?
Metric: % of requirements found in more than one source / formulated by more than one stakeholder

Quality Goal: Legal
Goal: The requirements are aligned with the (local) legal systems.
Question: How many requirements are aligned with the (local) legal systems?
Metric: % of requirements that are aligned with the (local)² legal systems

Quality Goal: Precise
Goal: The requirements allow a unique interpretation for developers.
Question: How many requirements allow a unique interpretation for developers?
Metric: % of requirements that allow a unique interpretation for developers
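The consistency metric in Table 3 is the only one stated as a complement rather than a plain ratio. A short sketch (our own illustration; the counts are hypothetical) shows the intended computation:

```python
def consistency_score(num_conflicting, num_requirements):
    """Table 3 'Consistent' metric: 1 - (conflicting / all), as a percentage.
    100 means no requirement participates in a conflict."""
    if num_requirements == 0:
        return 100.0  # an empty requirement set contains no conflicts
    return 100.0 * (1.0 - num_conflicting / num_requirements)

# Hypothetical example: 3 conflicting requirements out of 60.
score = consistency_score(num_conflicting=3, num_requirements=60)  # 95.0
```

Note that the complement form makes the target value 100 % for every metric in the table, which keeps later aggregation and benchmarking uniform.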

2. Design
The software design needs to be as simple as possible, and the structure of the software should use established patterns. The design has to be traceable; for that, the structure of the software must be understandable and must arise from the requirements. It is also important that the software design be complete.

Table 4 Derived metrics for trustworthiness goals in the Design Phase

Quality Goal: Simple
Goal: The software design is as simple as possible.
Question: How simple is the software design?
Metric: directed communication paths between design "units" relative to (number of units)²  ³

Quality Goal: Traceable
Goal: The design artifacts are traceable back to requirements.
Question: How many design artifacts can be traced back to requirements?
Metric: % of design artefacts that can be traced back to requirements

Quality Goal: Pattern based
Goal: Patterns are used.
Question: How many design artefacts use patterns that are trusted by the community?
Metric: % of design artefacts that use patterns trusted by the community⁴

Quality Goal: Completeness
Goal: Design should address all requirements.
Question: How many requirements are addressed by design?
Metric: % of requirements addressed by design

² This depends on the development locations, as well as customer / delivery regional / national targets. Individual metrics per country might be necessary.
³ Units may be different elements, e.g. classes, hardware, services, and may lead to more than one metric.
⁴ This requires defining a "trusted pattern base" (which should be an important step for developers anyway).
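The simplicity metric of Table 4 relates the directed communication paths that actually exist between design units to the maximum possible number, (number of units)². A small sketch (our own illustration; the unit names are hypothetical) computes it from an edge list:

```python
def design_simplicity(paths, units):
    """Table 4 'Simple' metric: directed communication paths between design
    units relative to the maximum possible, (number of units) squared.
    Lower values indicate a simpler, more loosely coupled design."""
    n = len(units)
    if n == 0:
        return 0.0
    return 100.0 * len(paths) / (n * n)

# Hypothetical three-unit design with two directed dependencies:
units = ["ui", "service", "storage"]
paths = [("ui", "service"), ("service", "storage")]
ratio = design_simplicity(paths, units)  # 2 of 9 possible paths, about 22.2
```

As footnote 3 notes, "units" may be classes, hardware, or services; in practice one would run this once per granularity level, yielding several simplicity metrics.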

3. Coding
Source code should, as far as possible, be created automatically by generators and tools. The code should be well documented to make it comprehensible. Error-freeness is also an important goal, and therefore the code needs to be verifiable with respect to the requirements.

Table 5 Derived metrics for trustworthiness goals in the Coding Phase

Quality Goal: Code generation
Goal: Source code is generated automatically.
Question: How many lines of code are generated automatically?
Metric: % of lines of code generated automatically

Quality Goal: Documented
Goal: The source code documentation is complete.
Question: How many functions are documented?
Metric: % of functions that contain complete documentation (purpose, parameters and return values)

Quality Goal: Error free
Goal: The source code is error free.
Question: How many lines of code are processed by regular static code analysis (including compiler settings) and automated (unit) tests?
Metric: % of lines of code processed by regular static code analysis (including compiler settings) and automated (unit) tests

Quality Goal: Verifiable
Goal: Source code is verifiable.
Question: How many lines of code can be traced back to requirements?
Metric: % of lines of code that can be traced back to requirements
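The "Documented" metric of Table 5 can be partially automated. As a sketch (our own, not the paper's tooling), the standard library `ast` module can approximate it for Python code by counting function definitions that carry a docstring. This is a simpler proxy than the table's full criterion (purpose, parameters and return values):

```python
import ast

def documented_function_ratio(source):
    """Approximate the Table 5 'Documented' metric for Python source:
    the percentage of function definitions that carry a docstring."""
    tree = ast.parse(source)
    funcs = [node for node in ast.walk(tree)
             if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))]
    if not funcs:
        return 0.0
    documented = sum(1 for f in funcs if ast.get_docstring(f) is not None)
    return 100.0 * documented / len(funcs)

# Hypothetical module: one of two functions is documented.
sample = '''
def add(a, b):
    "Return the sum of a and b."
    return a + b

def sub(a, b):
    return a - b
'''
ratio = documented_function_ratio(sample)  # 50.0
```

Checking documentation completeness (parameters and return values mentioned) would require parsing the docstring contents as well, which is exactly the kind of tool the paper's future work anticipates.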

4. Testing
The tests need to cover the requirements, and the results have to be verifiable. All test results should be auditable. To cover a wide range of possible inputs, fuzzing techniques are suitable and preferred. All involved components need to be covered by integration tests, which must cover the collaboration with other software components completely.

Table 6 Derived metrics for trustworthiness goals in the Testing Phase

Quality Goal: Test coverage
Goal: The software is completely tested.
Question: How many lines of code have been tested?
Metric: % of lines of code that have been tested

Quality Goal: Req coverage
Goal: The software is completely tested with respect to the requirements.
Question: How many requirements have been tested?
Metric: % of requirements that have been tested

Quality Goal: Auditable
Goal: The test cases are documented.
Question: How many test cases have been documented?
Metric: % of test cases which have been completely documented (functions/classes/versions tested, requirements tested against, input data used, developers and testers actions)

Quality Goal: Fuzz coverage
Goal: The software is tested using fuzzing techniques.
Question: How many tests involving input data have been executed using fuzzing techniques?
Metric: % of tests involving input data that have been executed using fuzzing techniques

Quality Goal: Integration test coverage
Goal: The integration of the software is completely tested.
Question: How many software components are covered by integration tests?
Metric: % of software components covered by integration tests
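The requirements-coverage metric of Table 6 presupposes that test cases are tagged with requirement identifiers. A sketch of the computation (our own illustration; the requirement IDs are hypothetical):

```python
def requirements_coverage(tested_req_ids, all_req_ids):
    """Table 6 'Req coverage' metric: percentage of requirements that
    at least one executed test case refers to. IDs unknown to the
    requirement set are ignored."""
    all_ids = set(all_req_ids)
    if not all_ids:
        return 0.0
    covered = all_ids & set(tested_req_ids)
    return 100.0 * len(covered) / len(all_ids)

# Hypothetical: tests reference R1 and R2 out of four requirements.
coverage = requirements_coverage(["R1", "R2"], ["R1", "R2", "R3", "R4"])  # 50.0
```

Using set intersection rather than raw counting ensures that a requirement exercised by many tests is still counted once, so the metric cannot exceed 100 %.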

5. Deploy
The deployment of the software needs to be fully documented and must preserve integrity: the deployment process needs to maintain the software integrity (no subversions covering individual requirements). The installation of the software needs to be adaptable to customer requirements, and the customizing options need to be documented. The deployment of the software needs to comply with all applicable legal requirements, and all customer installations of the software need to be recorded.

Table 7 Derived metrics for trustworthiness goals in the Deploy Phase

Quality Goal: Documented
Goal: The installation process steps are fully documented.
Questions: How many steps are described in an installation (and customization) manual? How many of the executed installation steps are documented?
Metrics: % of steps that could be used during installation that are described in an installation (and customization) manual; % of executed installation steps that are documented

Quality Goal: Integrity
Goal: The deployment preserves the integrity of the software.
Question: How much of the deployed code is identical to released code?
Metric: % of deployed code identical to released code

Quality Goal: Customizable
Goal: The software is customizable.
Question: How many parameters are customizable?
Metric: % of parameters that can be customized

Quality Goal: Legal
Goal: The deployed software satisfies the specific national legal conditions.
Question: How many deployment-specific national legal conditions are adhered to?
Metric: % of deployment-specific national legal conditions that are adhered to

Quality Goal: Recorded
Goal: All deployed product instances are fully documented.
Question: How many deployed product instances are fully documented?
Metric: % of deployed product instances fully documented (information about customer, release version, environment (OS, DB, WS))
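The deployment-integrity metric of Table 7 compares deployed artifacts against the released build. One common way to implement the comparison is content hashing; the sketch below (our own, with hypothetical file names) uses SHA-256 digests per file:

```python
import hashlib

def identical_code_ratio(released, deployed):
    """Table 7 'Integrity' metric: percentage of released files whose
    deployed counterpart has an identical content hash. Both arguments
    map file path -> file content (bytes)."""
    def digest(data):
        return hashlib.sha256(data).hexdigest()
    if not released:
        return 0.0
    matching = sum(
        1 for path, content in released.items()
        if path in deployed and digest(deployed[path]) == digest(content)
    )
    return 100.0 * matching / len(released)

# Hypothetical release of two files; the config drifted during deployment.
released = {"app.bin": b"v1.0", "conf.yml": b"debug: false"}
deployed = {"app.bin": b"v1.0", "conf.yml": b"debug: true"}
ratio = identical_code_ratio(released, deployed)  # 50.0
```

Hashing the released side once at build time and publishing the digests also gives customers an independent way to audit integrity, which fits the paper's aim of independently auditable processes.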

6. Support
For the support process step, user centricity is crucial: the user needs to be the focus during support calls, and the support team needs to be suitably qualified to provide the necessary competence. Another aspect of this phase is effectiveness; the solutions to problems need to be suitable. The support process needs to be secure against manipulation in order to preserve integrity. Finally, the application of a problem solution needs to be reversible and documented without gaps for traceability.

Table 8 Derived metrics for trustworthiness goals in the Support Phase

Quality Goal: User centricity
Goal: The user is the focus of the support process.
Question: How many support requests can be closed within one interaction with the user?
Metric: % of support requests that can be closed within one interaction with the user

Quality Goal: Competence
Goal: The support requests are solved with high competence.
Question: How many support requests can be closed within a predefined expected solution time?
Metric: % of support requests that can be closed within a predefined expected solution time

Quality Goal: Effectiveness
Goal: The support process is highly effective.
Question: How many support requests are unique in the request database?
Metric: % of support requests that are unique in the request DB

Quality Goal: Integrity
Goal: The support process preserves the integrity of deployed code and customer data.
Question: How many support request solutions do not affect deployed code or customer data?
Metric: % of support request solutions that do not affect deployed code or customer data

Quality Goal: Reversible
Goal: The support solutions (request solutions and updates/patches) are reversible.
Question: How many support request solutions allow undoing changes or patching?
Metric: % of support request solutions that allow undoing changes or patching

Quality Goal: Traceable
Goal: The support process is traceable.
Question: How many support request steps are documented?
Metric: % of support request steps documented (date, supporter, customer, problem description, solution, ...)
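The user-centricity metric of Table 8 can be computed directly from ticket records. A sketch (our own illustration; the data representation is hypothetical) counts requests closed after a single interaction:

```python
def one_touch_resolution_rate(requests):
    """Table 8 'User centricity' metric: percentage of support requests
    closed within a single interaction. Each request is represented by
    the number of user interactions it took before closing."""
    if not requests:
        return 0.0
    one_touch = sum(1 for interactions in requests if interactions == 1)
    return 100.0 * one_touch / len(requests)

# Hypothetical sample of four closed requests:
rate = one_touch_resolution_rate([1, 3, 1, 2])  # 50.0
```

The same counting pattern covers the competence metric as well, with the predicate swapped from "one interaction" to "closed within the predefined solution time".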

Conclusion and Future Work

Our work has led us to conclude that the use of metrics to evaluate process quality in relation to trustworthiness supports the TWBD process extension. In this paper we have shown a way to develop metrics that enable an independent audit. With process metrics we can prove the existence of controls for increasing trustworthiness in software development processes. In our view these results are an excellent initial step toward more process metrics and process observation in relation to trustworthiness. Future work will concentrate on metrics validation and tool support.

ACKNOWLEDGEMENTS
The research leading to these results has received funding from the European Union's 7th Framework Programme FP7/2007-2013 under grant agreement 317631 (OPTET).

REFERENCES
[1] Brillinger, D. R. 2001. Time Series: Data Analysis and Theory. Classics in Applied Mathematics 36. Society for Industrial and Applied Mathematics, Philadelphia.
[2] CCRA. 2012. Common Methodology for Information Technology Security Evaluation: Evaluation Methodology, September 2012, Revision 4. http://www.commoncriteriaportal.org/files/ccfiles/CEMV3.1R4.pdf.
[3] Debra, S. H. 2007. Complete Guide to Security and Privacy Metrics: Measuring Regulatory Compliance, Operational Resilience, and ROI. Auerbach Publications, 1st edition.
[4] Gol Mohammadi, N., Paulus, S., Bishr, M., Metzger, A., Könnecke, H., Hartenstein, S., Weyer, T., and Pohl, K. 2014. Trustworthiness Attributes and Metrics for Engineering Trusted Internet-Based Software Systems. In Cloud Computing and Services Science, M. Helfert, F. Desprez, D. Ferguson and F. Leymann, Eds. Communications in Computer and Information Science. Springer International Publishing, Cham, 19–35. DOI=10.1007/978-3-319-11561-0_2.
[5] Hartenstein, S., Könnecke, H., and Paulus, S. 2014. Trustworthiness Metrics for Socio-Technical Software. In 9th Future Security, Berlin, September 16–18, 2014; Proceedings, K. Thoma, Ed. Fraunhofer-Verlag, Stuttgart, 673–682.
[6] ISO/IEC JTC1/SC7. 2008. Information technology — Security techniques — Systems Security Engineering — Capability Maturity Model (SSE-CMM), ISO/IEC 21827. http://standards.iso.org/ittf/PubliclyAvailableStandards/index.html.
[7] IT Innovation, AUEB, IMinds, and FHB. 2013. Socio-Economic Requirements for Trust and Trustworthiness. http://www.optet.eu/wp-content/uploads/2013/09/OPTET_WP2_D2.1_SocioEconomic_Requirements_V1.1.pdf.
[8] Microsoft. Security Development Lifecycle. http://www.microsoft.com/security/sdl/default.aspx. Accessed 22 November 2014.
[9] OWASP. 2006. OWASP CLASP Project: Comprehensive, Lightweight Application Security Process. https://www.owasp.org/index.php/Category:OWASP_CLASP_Project.
[10] OWASP. 2008. Software Assurance Maturity Model. https://www.owasp.org/index.php/Category:Software_Assurance_Maturity_Model.
[11] Spais, I., Kanakakis, M., Kalogiros, C., Paulus, S., Könnecke, H., Hartenstein, S., Ioannidis, S., Hartman, A., Moffie, M., Short, S., Di Cerbo, F., Håkon Meland, P., Ahlmann Nyre, Å., Bernsmed, K., Keller, S., Mooij, M., Gol Mohammadi, N., Bishr, M., Bandyszak, T., and Nasser, B. 2013. Initial Trustworthiness-by-Design Process and Tool Support.
[12] Succi, G. P., Clapp, D., Gampert, R., Prado, G., and Carapezza, E. M. 2001. Footstep Detection and Tracking. In Aerospace/Defense Sensing, Simulation, and Controls. SPIE Proceedings. SPIE, 22–29. DOI=10.1117/12.441277.
[13] van Solingen, R., Basili, V., Caldiera, G., and Rombach, H. D. 2002. Goal Question Metric (GQM) Approach. In Encyclopedia of Software Engineering, J. J. Marciniak, Ed. John Wiley & Sons, Hoboken, NJ, USA.