A Scalable Product Quality Verifier Framework for an Outsourcing Supplier

Dileepa Jayathilake
99xTechnology, Colombo, Sri Lanka
[email protected]

Hasith Yaggahavita
99xTechnology, Colombo, Sri Lanka
[email protected]

Upul Senanayake
University of Peradeniya, Peradeniya, Sri Lanka
[email protected]

Charitha Elvitigala
University of Colombo, Colombo, Sri Lanka
[email protected]

Dhammika Sriyananda
99xTechnology, Colombo, Sri Lanka
[email protected]

Abstract— Outsourced software development is a growing business model that has proven to bring cost-effective and efficient solutions for the varying demands of a software product company. Although it has proven its capability to bring increased market value and keep companies ahead of the competition, a few inherent problems are commonly identified in practice. A prominent issue is how to verify the quality of the code and applications delivered to the customer. Given that a critical bug leaking into production can bring disastrous results, it is vital to ensure that the deliverables from the supplier conform to a defined set of quality guidelines. The work described in this paper is the design and implementation of a scalable software quality verification framework targeted towards an outsourcing supplier. The framework enables the supplier to build an industrial grade automated quality verification system, on top of which they may validate and ensure the quality of their deliverables before they reach the customer. The framework is capable of evaluating at both the software code and software application levels. Code level evaluation is done in two phases: first when the developer tries to add code to the repository (interactive commit stage), and second as a deeper analysis covering a wide range of problems offline (non-interactive backend analysis). The rules used for evaluation, the actions taken on results and the alerting can all be customized to suit the project context. At the application level, the framework provides a programming interface and a set of tools to verify the artifacts. A case study quality verification system built using this framework proved to add significant value to the deliverables of a commercial software project. An experiment done with the programming interface showed that powerful and complex analysis systems can be built to evaluate deliverables and even to aid the software due-diligence process.

Keywords—software quality; automated software testing; white box testing; black box testing; expert system

This research was sponsored by 99xTechnology, Sri Lanka

I. INTRODUCTION

Software outsourcing is a proven practice as a cost-effective and efficient solution to the demand for new and specialized applications in today's internet-based marketplace. There are numerous reasons for a product company to move towards outsourcing as a strategy, including reducing cost, speeding up development, complementing internal staff competencies and achieving resource scalability. Statistics show that outsourcing continues to be a significant business strategy for application development [1].

Despite its benefits, there are several facts the outsourcer needs to be aware of to avoid future havoc caused by the technical debt introduced by the development team. For example, an often neglected but important verification step is to ensure that the delivered code performs only the required functions, because additional unnecessary interfaces to the system may introduce vulnerabilities that can be exploited by attackers. Such aspects are often not verified by traditional functional testing. Furthermore, the conformance of the code to security best practices should be assured, with clear documentation of the same during scoping. It is also important that mechanisms are made available to detect the presence of intentional malicious code. Not only the code but also the delivered binaries need to be verified for quality in an environment nearly identical to production. Having a quality verification system that can highlight quality deficiencies at the development stage and demonstrate the high quality of deliverables gives a clear advantage to an outsourcing supplier in the long run.

This paper describes approaches and techniques for building a scalable software quality verification framework that is customizable for the exact needs of an outsourced product development. The framework is designed focusing on the requirements of software outsourcing suppliers in particular. It offers functionality for both white box and black box testing to verify quality. Existing standard tools are employed wherever possible and new components may be implemented as and when necessary.

The framework is highly customizable and integrates well even with different software development processes such as CMMI and Agile. It provides a configuration interface, a programming interface and extensive tooling. The simple yet powerful programming interface creates the ground for building a knowledge base that accumulates the experience of veterans. This is used in collaboration with modern tools to evaluate applications against their performance, security, memory and IO usage, among others.

The paper is divided into several sections. Section II outlines the importance of maintaining good quality in software. The value of quality in delivered software for a software outsourcing vendor in particular is explained in Section III. Requirements for a software quality verification system are discussed in Section IV. The architecture of the new framework is described in Section V, and Section VI identifies the users of the system and their roles. Section VII lists the various existing tools used in the framework. The implementation of the main modules of the framework is detailed in Section VIII. Section IX reports on a test run of a quality verification system built using the framework on a commercial project. Section X concludes the work.

II. VALUE OF SOFTWARE QUALITY

In today's highly competitive software product market, reputation, return on investment and, in some cases, even extreme legal risks rely on the quality of the software. There are commonly reported cases of what seems to be an insignificant error in the development process becoming severe enough to cause considerable losses in production. Similarly, there are several examples of successful brand images, built over many years of effort, being lost in seconds due to a few critical problems in production. A software quality failure impairs business predictability and may manifest itself in many ways, as discussed below.

• Operational problems

Higher numbers of bugs due to poor quality can affect the development and operational phases and may ultimately hinder the project from meeting deadlines.

• Maintainability issues

The effects of poor quality software may not appear immediately, but can cause severe problems in the long run. For example, poorly written code causes significantly higher maintenance costs.

• Long term business value decline

When customers are directly affected by quality issues, the product might lose its market share and brand equity by a significant proportion.

One of the biggest challenges is how a vendor should invest in maintaining quality, given the economic constraints of the business. Apparently, the safest way is to use proven approaches and standard tools.

III. VALUE OF SOFTWARE QUALITY VERIFICATION TO AN OUTSOURCING VENDOR

Given the importance of quality to the companies that outsource, a quality verification system that assures the conformance of deliverables to a standard set of quality measures can bring an outsourcing company a clear competitive advantage. The system may also help to improve the quality of deliverables, which will cut a certain portion of maintenance costs. In addition, the use of such a system will increase the awareness of employees on quality, leading to a more skilled workforce. As an added benefit, the quality verification system may also be used in due-diligence, a popular process practiced by many outsourcing companies when acquiring new projects.

IV. REQUIREMENTS FOR A SOFTWARE QUALITY VERIFICATION SYSTEM

Considering the challenges associated with promoting and maintaining a quality verification system, the following high level requirements are formulated.

• The system should employ standard tools that have wide acceptance in the industry. This induces greater trust in customers on the system output.

• The rules used in the system and their levels of importance, the number of tests performed and their frequency, the alerting mechanisms and the actions based on test results should be configurable at the project level. This is required because the quality verification system must be aligned with varying customer priorities in different projects.

• The system should integrate easily with the current software process in the organization. An organization may have to stick to certain practices in its process in order to maintain standards, and the quality verification system should harmonize with this existing process.

• The output of the system should address various levels of users such as customers, company management, project managers, technical leads, developers and quality assurance engineers. Information should be presented at various levels of detail.

• The system should be usable by due-diligence engineers to benchmark software developed by other vendors. Although considered a side benefit of the system, this is an important requirement from the outsourcing vendor's perspective as there is no standard product evaluation process in due-diligence.

Given this wide spectrum of requirements, the better solution, apparently, is to develop a framework which can be used to craft quality verification systems of various shapes and sizes rather than to provide a single system that tries to fulfill all the needs.

V. SYSTEM ARCHITECTURE

Fig. 1 shows the component level breakdown of the system. The three main modules of the system are the Commit Manager (CM), the Offline Quality Analyzer (OQA) and the Product Quality Analyzer (PQA). CM is responsible for verifying code quality at commit time, i.e. when the developer tries to add new code to the repository. A reasonably fast evaluation of the code quality is performed at this point. OQA performs a thorough analysis on the current version of the code taken from the repository, preferably at a time when the rest of the system is not busy (at midnight every day, for example). In addition to analyzing the code quality, it runs unit tests on the code as well. PQA performs quality checks on the binaries generated by compiling the code. Its use is two-fold: it is used for verifying the quality of applications developed inside the company before they are delivered to the customer, as well as for evaluating applications by other vendors as a part of the process of acquiring a new customer. Existing tools are employed for code and binary evaluation in each of these three modules. Bridging each tool to the other modules is done by a tool wrapper, which makes the different input/output interfaces of the tool transparent by encapsulating them beneath a homogeneous interface.
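The paper does not prescribe a concrete interface for the tool wrappers. The following is a minimal sketch of how such a homogeneous wrapper layer could look; all class, field and method names here are our own illustrative assumptions, not the framework's actual API.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Finding:
    """A single deficiency reported by an analysis tool, in a tool-neutral form."""
    rule_id: str   # tool-specific rule identifier, e.g. "CA2100"
    severity: str  # e.g. "critical", "major", "minor"
    file: str
    line: int
    message: str


class ToolWrapper(ABC):
    """Homogeneous interface hiding each tool's own input/output format."""

    @abstractmethod
    def run(self, source_root: str, rules: list[str]) -> list[Finding]:
        """Run the wrapped tool on the given source tree and normalize its output."""


class FxCopWrapper(ToolWrapper):
    def run(self, source_root: str, rules: list[str]) -> list[Finding]:
        # Would invoke FxCop on the built assemblies, parse its report and
        # map each issue to a Finding. Details omitted in this sketch.
        raise NotImplementedError
```

With every tool behind the same `run` signature, the CM, OQA and PQA modules can aggregate findings without knowing which tool produced them.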

The rules used for quality evaluation in each module, and their relative importance, vary across projects. The same is true for the actions triggered based on the results of the evaluation. These need to be specified in the form of a profile as a configuration step, and a rule database is used to store them; a hypothetical profile is sketched below. The tasks involved in PQA range from parsing log files to expressing inference rules. The diversity of these requirements brings the need for a more generic solution that provides a unified approach for handling them, and the scripting language provides this generic solution. The hook is implemented using the development interface provided by the code repository management system. The pre-commit event is hooked to take control when the developer tries to add new code to the repository.
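The paper leaves the profile format open. The following is a minimal sketch assuming a simple declarative structure; every field name, rule identifier, weight and action below is an illustrative assumption, not the framework's actual schema.

```python
# A hypothetical quality profile as it might be stored in the rule database.
profile = {
    "project": "example-project",
    "rules": [
        # Each rule carries a relative importance and the module that runs it.
        {"id": "FxCop:CA2100", "importance": "critical", "module": "CM"},
        {"id": "StyleCop:SA1200", "importance": "minor", "module": "OQA"},
    ],
    "actions": {
        # Actions triggered based on evaluation results, per importance level.
        "critical": ["reject_commit", "email_tech_lead"],
        "minor": ["report_on_dashboard"],
    },
}
```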

Figure 1. System components: input for the system comes from the code repository (e.g. Subversion) and the output is a set of reports.

VI. USERS OF THE SYSTEM

Fig. 2 shows the users of the system and the use cases they are associated with. The users are named by their role in the project; however, the actual categorization is based on their activities on the system.

Figure 2. Users of the system and use cases: four user roles are identified.

A. Technical Lead

This role creates the quality profiles to be used in all three main modules after analyzing the project requirements. Either creating a new profile or reusing an existing profile with minor changes is possible. In addition, the technical lead is expected to write scripts for the PQA targeting the tasks involved. A further set of tasks can come under the responsibility of this role, depending on the nature of the system usage in the project:

• Observing the dashboard of OQA periodically to track critical deficiencies

• Helping developers to fix the deficiencies reported

• Identifying false positives reported by the system

• Analyzing technical reports generated by PQA

B. Project Manager

The main responsibility of this role is to make sure that the project deliverables are in good health, using the information provided by the system. The project manager may also have to assist the technical lead in creating quality profiles. Negotiating time allocations for fixing the deficiencies can be another task.

C. Developer

The developer deals with the CM as his/her code is evaluated at each commit attempt. Therefore this role must be given sufficient knowledge to decode the error messages displayed by CM and to fix the deficiencies. In addition, he/she may be a frequent user of the OQA dashboard. It is the developer who ultimately has to fix the problems pointed out by the system. In most projects the developer may also be a receiver of the reports generated by PQA, which may help him/her to track the deficiencies introduced by his/her code into the product. The developer can also be an author of the scripts used by PQA; for example, the developer is the one who implements logging in applications and so is the best person to write the script for log file parsing.

D. Due-Diligence Engineer

This role interacts with the system when evaluating products of other vendors (potential customers). He/she is interested in revealing as many loopholes and pitfalls of the product being analyzed as possible. The script(s) used for the analysis are probably authored by this role. The due-diligence engineer may use the reports generated by PQA to evaluate the product.

VII. TOOLS USED

The tools need to be carefully selected considering the current requirements and possible future maintenance issues. Therefore a detailed study was done on candidate tools before selecting them. The selection was carried out taking the following factors into account.

• The tool should be widely used in the industry.

• The tool should be able to carry out evaluation according to standard measures.

• It should offer a fair amount of configurability.

• It should generate zero or an acceptably low number of false positives.

• Support should be available for the tool (for a free tool, there should be an active community).

• It should perform acceptably fast.

According to these criteria, the following tools were selected to be used in the framework for various purposes.

A. Code Quality Evaluation Tools

• FxCop: A free tool from Microsoft that can be used to identify deficiencies in .NET code.

• StyleCop: An open source tool from Microsoft that analyzes C# source code against a set of coding style and consistency rules.

• Gendarme: A free extensible tool to analyze potential problems in .NET applications and library code.

• CppCheck: An open source static code analysis tool for C/C++.

• CheckStyle: A free static code analysis tool for Java. It defines a set of available modules, each of which provides rule checking with a configurable level of strictness (mandatory, optional, etc.). Each rule can raise notifications, warnings and errors.

• PMD: A static rule-set based Java source code analyzer.

B. Binary Quality Evaluation Tools

• Apache JMeter: An open source Java tool that can be used to test performance on both static and dynamic resources such as files, servlets, Perl scripts, Java objects, database queries and FTP servers.

• Microsoft Application Verifier: A handy tool for detecting problems incurred at the system level when running an application.

• LeakDiag & LDGrapher: A pair of tools that can be used in conjunction to detect memory leaks in an application. Both tools are provided by Microsoft and are free.

• Process Monitor: Another free analysis tool from Microsoft. It is an advanced Windows logging utility which collects and records most of the system activity regarding the file system, registry and process/thread activity.

• XPerf: Part of the Microsoft Windows Performance Toolkit, XPerf is a free performance profiling tool for Windows applications. It can be used for tracking performance bottlenecks in applications as well as for comparing the time taken for various operations in an application.

• Application Compatibility Toolkit: An application lifecycle management toolset that assists in identifying and managing the overall application portfolio. The toolset is free and can be used to reduce the time and cost involved in resolving application compatibility issues.

C. Other Tools

• Cruise Control: An open source tool used for continuous integration. It provides an extensible framework for creating a custom continuous build process and includes plugins for various source controls, build technologies and notification schemes such as email and instant messaging. Though it is implemented in Java, Cruise Control allows one to perform continuous integration for any software development process.

• Apache Maven: A software project management and comprehension tool. It can manage a project's build, reporting and documentation from a central piece of information. While primarily used for Java programming, it can also be used to build and manage projects written in C#, Ruby, Scala and other languages.

• Sonar: An open source platform built for managing source code quality in projects in collaboration with Apache Maven. While Maven manages the build of a project, Sonar provides the facility to integrate source code quality evaluation into that process and to generate reports that can be analyzed later. It comes with a set of code analyzers, reporting tools, defect hunting modules and time machine functionality. Sonar is built as a plug-in for Maven and can be configured to execute inside a Maven goal. It collects analysis results from code analysis tools and uses them to prepare reports and trigger various actions.

VIII. IMPLEMENTATION

This section briefly describes the implementation of each of the three main modules of the system.

A. Commit Manager

The commit time code quality verifier hooks the pre-commit event in the code repository system, which is triggered just before an update is performed on the code. Information on the code update is passed into the Commit Manager, which then queries any further information required for the analysis from the repository engine and performs a code quality analysis employing the tools. The communication between the Commit Manager and the tools takes place via the interface implemented by the tool wrappers. The results of the analysis are passed back to the Commit Manager, which uses that information to decide whether the commit should be allowed. The decision is sent to the repository hook, which completes the hook cycle with the appropriate return code. If the code evaluates to be deficiency-free according to the rules, the repository hook component allows the commit to happen in the normal way: the code change is added to the repository and the developer sees the usual success message in the repository client interface. However, if deficiencies are found in the code, the repository hook component rejects the commit so that the code change is not added to the repository (the whole change is rejected in a transaction-based repository). It further sends the appropriate error messages to the repository client for the developer to track the violations in his/her code. Attention is given to the responsiveness of the system, as the developer has to wait for the output of the system to complete the commit. Only a restricted set of rules with high importance is selected for this system, to minimize the time taken for the analysis. Other, less important rules are used in the offline code quality verification system, which is allowed to consume a relatively longer time for analysis. As the violations detected in this system stop the developer from adding source code to the repository, it is ensured that the violation descriptions are clear and descriptive enough for the developer to fix the violations. A sketch of such a hook is given below.
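The paper does not include the hook source. The following is a minimal sketch of a Subversion pre-commit hook in the spirit of the Commit Manager; in place of the real tool-based analysis it applies a trivial stand-in rule, which is our own assumption for illustration. Rejection works through Subversion's standard hook contract: anything written to stderr is relayed to the developer's client, and a non-zero exit code rejects the whole transaction.

```python
#!/usr/bin/env python3
"""Minimal Subversion pre-commit hook sketch.

Subversion calls this script with the repository path and transaction id.
"""
import subprocess
import sys

REPO, TXN = sys.argv[1], sys.argv[2]


def svnlook(cmd: str, *args: str) -> str:
    """Run an svnlook subcommand against the in-flight transaction."""
    return subprocess.run(
        ["svnlook", cmd, "--transaction", TXN, REPO, *args],
        capture_output=True, text=True, check=True,
    ).stdout


def main() -> int:
    violations = []
    for line in svnlook("changed").splitlines():
        if line.startswith("D"):
            continue  # deleted paths have no content to inspect
        path = line[4:].strip()  # svnlook prints e.g. "U   trunk/Foo.cs"
        if not path.endswith(".cs"):
            continue
        # Stand-in for the real analysis: a production hook would call the
        # tool wrappers here with the restricted high-importance rule set.
        if "NotImplementedException" in svnlook("cat", path):
            violations.append(f"{path}: unimplemented code is not allowed")
    for v in violations:
        print(v, file=sys.stderr)  # shown to the developer by the client
    return 1 if violations else 0


if __name__ == "__main__":
    sys.exit(main())
```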

B. Offline Quality Analyzer

OQA is implemented by utilizing and integrating several pieces of software [2]-[4]: (a) Java Development Kit, (b) Apache Maven, (c) Maven .NET Plugin, (d) Sonar, (e) code quality evaluation tools (Gendarme, Gallio, FxCop, StyleCop, Metric C++, Checkstyle), (f) code quality tool plugins for Sonar and (g) Cruise Control. The above-mentioned code quality evaluation tools are configured as Sonar extensions, so that Sonar uses their output when generating the information to be displayed on the dashboard and sent with emails. The tools are registered with Maven (using its configuration file) to make sure that they are executed inside the Maven goal. Cruise Control, through its daemon runner, periodically checks the repository for new code. If new code has been added to the repository, it fetches the code and performs a build, after which the build status is published on the Cruise Control front end. There are two possible ways that the code quality evaluation can be performed on the new code.

• Add a post-build action to Cruise Control to trigger a Maven goal. This Maven goal is then responsible for carrying out the rest of the process.

• Instead of starting a code quality evaluation each time the source code is built, have a scheduled operating system task perform it at given time(s) of the day (e.g. a nightly execution), as sketched below.

When the Maven goal is initiated, it reads the project information from the Project Object Model XML file (pom.xml) and builds the source code. Then it runs each code quality evaluation tool on the source code, each of which generates an output consumable by Sonar. After running all the tools, Maven initiates Sonar, which reads the output of the tools, generates the information to be displayed on its dashboard and sends the required emails if it is configured to do so.
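As an illustration of the second option, the following is a minimal sketch of a driver that could be registered as a scheduled operating system task (cron or Windows Task Scheduler). The `sonar:sonar` goal is the usual Sonar-under-Maven convention; the checkout location is our own assumption.

```python
#!/usr/bin/env python3
"""Nightly OQA driver sketch: build the project and push results to Sonar."""
import subprocess
import sys

PROJECT_DIR = "/srv/checkouts/example-project"  # assumed checkout location


def mvn(*goals: str) -> None:
    # Fail loudly so the scheduler records a non-zero exit on a broken build.
    subprocess.run(["mvn", *goals], cwd=PROJECT_DIR, check=True)


if __name__ == "__main__":
    try:
        mvn("clean", "install")  # build the source as described by pom.xml
        mvn("sonar:sonar")       # run the quality tools and feed Sonar
    except subprocess.CalledProcessError as err:
        sys.exit(f"nightly analysis failed: {err}")
```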

C. Product Quality Analyzer

PQA is implemented using the framework described in [5], which is a previous work by the same author. That framework provides a scripting engine and a set of tools to process the log files generated by various tools and to analyze them using scripts, after which customized reports can be generated using the same framework. A set of binary analysis tools is first used to monitor the execution of binaries under near-production conditions. Then the log files generated by the tools, and the binaries themselves (if any), are correlated and analyzed using scripts written by experts to generate a set of reports.
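The scripting engine of [5] has its own language; purely as a language-neutral illustration of the kind of analysis such a script performs, the following sketch parses an application log and applies a simple inference rule. The log line format and the threshold are illustrative assumptions only.

```python
import re
from collections import Counter

# Hypothetical log line: "2010-11-18 23:59:59 ERROR OrderService - timeout"
LINE = re.compile(r"^\S+ \S+ (?P<level>\w+) (?P<component>\S+)")


def findings(log_path: str, error_threshold: int = 10) -> list[str]:
    """Flag components whose ERROR count under near-production load exceeds
    a threshold - a simple example of an expert-authored inference rule."""
    errors: Counter[str] = Counter()
    with open(log_path) as log:
        for line in log:
            m = LINE.match(line)
            if m and m.group("level") == "ERROR":
                errors[m.group("component")] += 1
    return [f"{comp}: {n} errors exceeds threshold {error_threshold}"
            for comp, n in errors.items() if n > error_threshold]
```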

IX. PILOT PROJECT RUN

The system was put to the test in a relatively short term commercial software project. The project team consisted of one project manager, one technical lead and two developers. The rules and their relative weights were configured according to general guidelines on coding best practices [6]-[8]. The system was used in the project for about one month. The code was written on the Microsoft .NET platform and the above-mentioned code quality evaluation tools for .NET were activated in the system. However, developer restrictions were not imposed due to the limited time frame of the project; this means the developers were able to see a detailed analysis of their code but were free to continue without fixing the violations. Nevertheless, the technical lead did examine the system dashboard to keep track of critical violations and followed up with developers to make sure that they were fixed. This process proved moderately successful, and a few problems were also identified:



• Not making the customer aware of the code quality verification system

• Lack of knowledge among developers about the rule semantics

• Lack of developer knowledge on how to fix certain critical violations

• Not integrating the code quality verification process with the existing process practiced by the company

• Lack of knowledge among technical leads in creating rule profiles according to project scale and customer expectations

Using these lessons, the following guidelines were created for using the system in commercial projects.

• Sell the idea of code quality verification to customers and convince them of its long term benefits

• Add quality verification as a step in the existing process of the organization

• Conduct more organization-wide awareness sessions on quality improvements

• Provide developer guidelines on the rules used for code quality evaluation and on fixing deficiencies

• Evolve and maintain a set of rule configuration templates for different categories of projects

X. CONCLUSIONS

The framework implemented in this project provides a solid foundation for an automated software quality verification system that evaluates software product artifacts during various important stages of the development life cycle. The framework is flexible enough to meet the vastly different quality verification requirements that can arise in different types of projects. The selected tooling of the framework offers a rich interface for the technical experts in a project to craft a quality verification system that is custom made to suit the project. In addition, it is capable of keeping the quality verification in line with the various software manufacturing processes practiced across organizations. The framework is scalable, so new concepts and modules can easily be plugged in to improve the verification process. By employing state-of-the-art standard tools at various stages in the quality verification process, the framework ensures the validity of results, which can be used as a strong differentiator against the competition.

As discussed in the paper, the flexibility and extendibility of the proposed framework enable the implementer to select and plug in the verification tools that are best suited for the project. But as experienced in some of the implementations, the differences between these tools may cause difficulties in uniform reporting. For example, although each of the tools can report back with independent findings, generating a single report or dashboard requires some additional work which is not standardized under the scope of this paper.

In the current design, the different verification tools run in isolation. But there are interesting outcomes possible if tools were able to use each other's findings to enhance their verification techniques. Such a possibility could allow a specialized tool to execute targeted, expensive, deep verification against a vulnerable code piece identified by a shallow verification tool. Therefore, the implementation possibilities of such an inter-module communication bus should be further studied.
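Such a bus is left as future work in the paper. Purely as an illustration of the idea, the following toy sketch shows shallow findings being republished so that a deeper, more expensive analyzer can subscribe to them; all names are our own assumptions.

```python
from collections import defaultdict
from typing import Callable


class FindingsBus:
    """Toy publish/subscribe bus letting one tool react to another's findings."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, finding: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(finding)


bus = FindingsBus()
# A deep (expensive) analyzer runs only on code a shallow scan flagged.
bus.subscribe("shallow.vulnerability",
              lambda f: print(f"deep scan queued for {f['file']}"))
bus.publish("shallow.vulnerability", {"file": "src/Orders.cs", "rule": "CA2100"})
```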

REFERENCES

[1] R. Berg, "Trust, but verify: how to manage risks in outsourced applications", Rational Software, 2009.
[2] Apache Software Foundation, "Apache Maven Project", http://maven.apache.org, Nov. 26, 2010.
[3] Codehaus, "Sonar: put your technical debt under control", http://www.sonarsource.org, Nov. 18, 2010.
[4] J. Chillan and A. Victoor, "Sonar .Net plugin", http://docs.codehaus.org/display/SONAR/.Net+plugin, Nov. 18, 2010.
[5] D. Jayathilake, "A mind map based framework for automated software log file analysis", in International Conference on Software and Computer Applications, Kathmandu, 2011, pp. 1-6.
[6] A. Oram and G. Wilson, Beautiful Code. O'Reilly, 2007, ch. 3.
[7] S. Maguire, Writing Solid Code. Microsoft Press, 1993, ch. 3.
[8] S. McConnell, Code Complete. Microsoft Press, 1993, ch. 28-29.
