Submission to Techniques, tools and formalisms for capturing and assessing architectural quality in OO software
Object-oriented frameworks: architecture adaptability

Paolo Predonzani†, Giancarlo Succi‡, Andrea Valerio†

† Dipartimento di Informatica, Sistemistica e Telematica, Università di Genova, Genova, Italy
‡ Department of Electrical and Computer Engineering, The University of Calgary, Calgary, Canada
Abstract

Quality and economic aspects of software development are closely related. Adaptability, as a feature of quality, fosters reuse and the resulting economies of scale. Domain analysis is an effective way to improve adaptability, yet domain analysis is expensive. To reduce the cost of domain analysis without reducing its effectiveness, we use a set of metrics to measure adaptability during design, i.e., when the cost of improving adaptability is still low. The measures are based on interfaces and assume object-oriented software development.
1 Introduction

The quality of components and architectures is frequently related to their capability to meet the needs of several situations, integrating into different software systems. Improperly designed architectures lead to architectural mismatch [Garlan et al., 1995]. The result of this poor quality is increased cost in the adaptation of the components. Domain analysis addresses these problems by performing a broad-scope analysis of several applications in a domain. Applications can be real or potential. Domain analysis focuses on the requirements of these applications. Its output is a framework that can be reused to produce several applications in the domain.

We base domain analysis on the identification of commonality and variability. Commonality is the common part of several applications. Variability is the specific, variable part of the applications. Common and specific parts are connected: variability is hooked to commonality through a set of variation points. Variation points express features of the common part that can be customized in a specific way specified in the variable part. Variation points are the conceptual tool to express adaptability.

Domain analysis has a cost too. A deeper understanding of the domain produces higher levels of adaptability but also higher costs. Without jeopardizing the adaptability of the framework, we try to reduce the cost of quality through the detection, at the design stage, of adaptability. We exploit the power of interfaces to implement variation points in a way that fosters adaptability. Interfaces are becoming increasingly popular and are supported in many object-oriented languages. Interfaces can be fully specified during design, which makes them perfectly suitable for early estimation of adaptability. We provide a set of metrics to assess adaptability, applicable to designs of domain frameworks.

The paper is structured as follows: section 2 presents the state of the art in this field, section 3 presents our approach to domain analysis and object-oriented architectures, section 4 presents a set of metrics to assess the adaptability of a design, and section 5 draws some conclusions.
2 State of the art

Several proposals exist to measure the reusability and adaptability of software. Most of them are based on the analysis of some properties of code. Code is not available until the implementation phase, which compromises the applicability of these measures to design.
Address for correspondence: Paolo Predonzani, DIST – Università di Genova, via Opera Pia 13, 16145 Genova, Italy. Email: [email protected]
Prieto-Diaz and Freeman developed one of the earliest approaches to software reusability assessment [Prieto-Diaz and Freeman, 1987]. Among the program attributes identified as important for software reusability, they list program size expressed in lines of code and cyclomatic complexity [McCabe, 1976]. The underlying hypothesis is that reuse is correlated with small module size and low complexity, as indicated by the two metrics. The claim that small size is an indication of high reusability is also embraced by Selby [Selby, 1989], who proposes size as one of the reusability metrics. Other metrics include the complexity of interfaces, the number of calls to other modules, and the number of function and procedure parameters. The stress is on information hiding, cohesion, and coupling. Chen and Lee found a strong statistical correlation between complexity and reusability in a controlled experiment involving several students [Chen and Lee, 1993]. There is a general belief that complexity lowers reusability. In the reusability model proposed by STARS (Software Technology for Adaptable, Reliable Systems), some of the most interesting attributes of reusable software are information hiding, high cohesion, and low coupling [STARS, 1989].

Design patterns [Gamma et al., 1995] address adaptability from a different perspective. First, they are design concepts applicable to several situations. Second, they do not measure the result of a design phase; rather, they actively affect design with solutions (or template solutions) that have proved to be adaptable.
3 Domain analysis and OO frameworks

Prieto-Diaz defines domain analysis as “a process by which information used in developing software systems is identified, captured and organized with the purpose of making it reusable when creating new systems” [Prieto-Diaz, 1990]. Domain analysis explores a domain to synthesize new applications. A domain is a collection of applications. Domain analysis has a broader view than traditional application analysis: application analysis focuses only on the application under development, while domain analysis considers several existing and potential applications. The output of domain analysis is a domain framework: a collection of software components and architectures to support the development of several applications in the domain.

Domain analysis is useful when software is developed with reuse in mind. Traditional application analysis can produce software for specific requirements but may be insufficient if the developed software needs to be reused and adapted to new requirements. Domain analysis concentrates the effort in the analysis stage, so that the resulting framework is better designed to satisfy a broad range of requirements. Domain analysis has high initial costs but promises low application development costs by exploiting reuse. From a commercial point of view, domain analysis works in markets that allow the development of multiple applications with a common reused part. From a technical point of view, domain analysis works if the resulting framework is actually reusable with minimal adaptation effort under different requirements. We need to maximize the effectiveness of the initial effort. Unfortunately, effectiveness can be measured only when the framework is actually written and reused. In our approach we design the architecture of the framework using tools that potentially improve adaptability, and then we estimate the adaptability with a set of metrics used as a proxy for real adaptability.

3.1 Developing architectures with variation points: commonality and variability

Applications in a domain should be similar for some features but can differ for others. A variation point is a feature that is not fully specified: the feature can be implemented in several ways. A variant is a specific implementation of the feature. A variation point is a generic placeholder for a set of variants; it retains the common part of the variants and leaves unspecified what is specific to single variants (Figure 1). For instance, a variation point in the domain of operating systems is the kind of file system supported: the variation point tells us that operating systems "have" file systems but leaves the kind of file system unspecified. Specific applications in the domain, i.e., real operating systems, support real file systems, e.g., DOS, Minix, Ext2fs, etc. An application in the domain can choose any combination of variants. Designing an application means picking the variants most suitable for the requirements.
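To make this concrete, here is a minimal Java sketch of the file-system variation point; all names are ours and purely illustrative:

```java
// A variation point expressed as an interface: the common part states that
// an operating system "has" a file system but leaves its kind unspecified.
interface FileSystem {
    boolean mount(String device);
    byte[] read(String path);
}

// Variants: concrete implementations hooked to the variation point.
class Ext2FileSystem implements FileSystem {
    public boolean mount(String device) { /* ext2-specific mounting */ return true; }
    public byte[] read(String path) { /* ext2-specific reading */ return new byte[0]; }
}

class MinixFileSystem implements FileSystem {
    public boolean mount(String device) { /* Minix-specific mounting */ return true; }
    public byte[] read(String path) { /* Minix-specific reading */ return new byte[0]; }
}

// A specific application in the domain picks the variant that suits it.
class OperatingSystem {
    private final FileSystem fs;               // the variation point
    OperatingSystem(FileSystem fs) { this.fs = fs; }
    void boot() { fs.mount("/dev/hda1"); }     // uses only the common part
}
```

Selecting a variant is then a single decision at construction time, e.g., new OperatingSystem(new Ext2FileSystem()).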
[Figure 1 diagram: several applications drawing variants from variation points in the domain model and framework architecture]
Figure 1: Commonality and variability

On one hand, the result of this process is a deep understanding of how the domain works and what the domain requires (the domain model); on the other hand, it is a guide to design the software components that support the development of several applications (the framework architecture).

3.2 Variation points as interfaces

When we design frameworks and applications, we should take into consideration what programming languages offer to support our design. Variability is, at different levels, supported in any programming language. Some of the tools that support it are the following: parameters, method overloading, inheritance, virtual methods, interfaces, multiple inheritance, templates, aggregation and delegation, and network combination. Many of these tools require an object-oriented language. C++ and Java provide direct support for most of them.

In object-oriented languages, an interface is a collection of method declarations. In our approach, interfaces are the core tool to support variability: each variation point is expressed through an interface. This does not mean that the other tools are not used: they are allowed as long as they are encapsulated in an interface. Two reasons favor this approach:
• Interfaces are a design concept: they can be written and thoroughly specified before code is written.
• Interfaces can specify the use of all the other variation tools. Some tools, such as parameters, method overloading, inheritance, virtual methods, multiple inheritance, and templates, have language features to support them and are evident from the definition of the interface. Other tools, such as aggregation, delegation, and network combination, usually do not have language features to support them, but the use of interfaces (and the resulting omission of the implementation) forces the designer to make these tools evident at the interface level, as the sketch below illustrates.
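As a brief Java sketch of the second point (with hypothetical names), the delegation between two modules is evident from the interface declarations alone, before any implementation exists:

```java
// Both parties of the collaboration are declared as interfaces, so the
// delegation structure is visible at design time, with no code written yet.
interface Renderer {
    void draw(Shape s);
}

interface Shape {
    // A Shape delegates its rendering to a Renderer: omitting the
    // implementation forces this design decision to surface here.
    void render(Renderer r);
}
```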
4 Measuring the adaptability of frameworks

Reuse of frameworks is the justification of the costs of domain analysis. Reuse is possible only if frameworks are adaptable to different situations. We have shown that the starting point of domain analysis is the analysis of existing and potential applications. Frameworks should prove to be adaptable enough to support any application in the domain. Even if domain analysis considers many applications, it cannot consider all feasible applications: many applications are not yet known when domain analysis is performed. We need a way to assess adaptability a priori, through some properties of the framework. Moreover, we need this assessment as soon as possible to reduce the cost of rework in case we discover poor adaptability of the framework. We consider generality and modularity as two useful properties influencing adaptability and we propose three metrics (External Interfaces, Variation points as Interfaces, and Interface Coupling) to assess them quantitatively. The metrics are based on interfaces and are applicable during design.
4.1 Generality

The generality of a framework is the degree to which the framework is independent of its environment. To measure generality we consider the framework and its environment as separate parts. Then, we analyze the interactions between these two parts. The components of the environment usually belong to specific software platforms. The environment usually has an architecture of its own, which is fixed before we design the framework. If a framework is designed for only one environment, it may be difficult to port to other environments. This problem is known as architectural mismatch [Garlan et al., 1995] and occurs when the assumptions a framework makes about its environment are no longer valid in a new environment. We cannot avoid assumptions about the environment, but we can make assumptions explicitly visible at the design level. We express assumptions through interfaces: the framework accesses its environment's features through the interfaces, and concrete classes implement the interfaces for a specific environment. The cost of adapting a framework to a new environment consists of the following:
• Cost of identifying the components to change. This cost is low because the identification requires only examining the list of interfaces.
• Cost of changing the components. This cost is low because changes are localized in the concrete classes implementing the interfaces.
The metric to express generality is External Interfaces (EI) and is defined as follows:
EI = \frac{m}{n}
where m is the number of interfaces defined for external dependencies and n is the total number of external dependencies. Figure 2 depicts a sample scenario with two external dependencies (Access1 and Access2), of which one (Access1) goes through an interface; in this scenario EI = 1/2.

[Figure 2 diagram: a component in the framework accessing the environment through an interface (Access1) and directly through a concrete class (Access2)]

Figure 2: External dependencies
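The pattern behind EI can be sketched in Java as follows; the persistence dependency and all names (PersistentStore, FrameworkComponent, UnixFileStore) are hypothetical, chosen only for illustration:

```java
// Framework side: the assumption about the environment is an explicit
// interface, analogous to the Access1 dependency of Figure 2.
interface PersistentStore {
    void save(String key, byte[] data);
}

class FrameworkComponent {
    private final PersistentStore store;   // external dependency through an interface
    FrameworkComponent(PersistentStore store) { this.store = store; }
    void checkpoint() { store.save("state", new byte[0]); }
}

// Environment side: adapting the framework to a new platform is localized
// here; FrameworkComponent itself never changes.
class UnixFileStore implements PersistentStore {
    public void save(String key, byte[] data) {
        // platform-specific code to write the data to the local file system
    }
}
```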
4.2 Modularity

Modularity is an internal property of a framework. A framework is modular if it is split into various modules, if the modules perform non-overlapping functionality, and if the communication between the modules is clear. Interfaces are related to modularity because interfaces can be used to express the variable features of a module. We assume that the variable features are also the most relevant ones. We propose two metrics for modularity: Variation points as Interfaces (VI) and Interface Coupling (IC). VI is defined as follows:
VI = \frac{v_i}{v}
where v_i is the number of variation points implemented through interfaces and v is the total number of variation points. VI is meant to assess whether modules depend on each other in a controlled way, i.e., through interfaces. VI is the equivalent of External Interfaces for the variable features within the framework: for instance, a framework with ten variation points, seven of which are expressed as interfaces, has VI = 0.7. Interfaces are a clean technique to represent and group the variable features of modules. Interfaces can interact with each other through the parameters of their methods. This interaction is safe if the parameter types are interfaces too,
but it can be dangerous for modularity if the parameter types are concrete classes. The Interface Coupling (IC) metric takes into account the negative effects of using concrete classes as parameters in interfaces. The metric is defined as follows:
IC = \frac{\sum_{\{c_i\}} cc_i}{\sum_{\{p_i\}} np_i}
where {p_i} is the set of parameter types and {c_i} is the set of concrete classes used as parameter types; cc_i is the number of times concrete class c_i is used as a parameter, and np_i is the number of times parameter type p_i is used as a parameter. All these sets and values are measured over the interface definitions only. High values of IC indicate high coupling and poor modularity.
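As a hedged illustration of how IC is counted (all names are ours), consider the following Java definitions:

```java
class Pixel { int x, y; }            // a concrete class

interface Shape { }                   // an interface

interface Canvas {
    void draw(Shape s);               // safe: the parameter type is an interface
    void plot(Pixel p);               // couples Canvas to the concrete class Pixel
    void move(Pixel p);               // Pixel used again as a parameter type
}
```

Counting over the interface definitions only: the concrete class Pixel appears twice among three parameter uses in total, so IC = 2/3. Replacing Pixel with an interface would lower IC and improve modularity.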
5 Conclusions

We have addressed the issue of quality in object-oriented architectures from the perspective of adaptability. Adaptability fosters the reusability of software components and architectures in different situations. Our approach to adaptability is based on domain analysis as a tool to assess the requirements of several applications in a domain. Domain analysis decomposes applications according to commonality and variability, expressed through variation points. Object-oriented languages offer several tools to support variation points, but interfaces are the most effective as they can express design concepts. To assess how domain analysis and interfaces influence adaptability, we have proposed three metrics (External Interfaces, Variation points as Interfaces, and Interface Coupling) which are applicable during design.
6 References

Chen, D., P. J. Lee, On the Study of Software Reuse Using Reusable C++ Components, Journal of Systems and Software, Vol. 20, No. 1, January 1993.
Gamma, E., R. Helm, R. Johnson, J. Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software, Addison-Wesley, 1995.
Garlan, D., R. Allen, J. Ockerbloom, Architectural Mismatch: Why Reuse Is So Hard, IEEE Software, 12(6): 17-26, 1995.
McCabe, T. J., A Complexity Measure, IEEE Transactions on Software Engineering, Vol. SE-2, No. 4, December 1976.
Prieto-Diaz, R., Domain Analysis: An Introduction, ACM SIGSOFT Software Engineering Notes, 15(2), April 1990.
Prieto-Diaz, R., P. Freeman, Classifying Software for Reusability, IEEE Software, Vol. 4, No. 1, January 1987.
Selby, R. W., Quantitative Studies of Software Reuse, in Software Reusability Volume II, Biggerstaff, T. J., A. J. Perlis (eds.), Addison-Wesley, Reading, MA, 1989.
STARS, Repository Guidelines for the Software Technology for Adaptable, Reliable Systems (STARS) Program, CDRL Sequence Number 0460, 15 March 1989.