A Process Metrics based Framework for Aspect Oriented Software to Predict Software Bugs and Maintenance

Pradeep Kumar Singh
Amity University, Noida, Uttar Pradesh, India

Om Prakash Sangwan
Gautam Buddha University, Gr. Noida, India

Abstract: The quality evaluation of software products, e.g., defect measurement and identification approaches for individual versions, gains importance with the growing use of software applications. Numerous empirical studies consider process metrics to be the main predictors of defects and of software maintenance for a software product. In addition, a few models also consider process metrics as indicators of software quality. The prime objective of this paper is to design a process metrics framework for Aspect Oriented Software. This framework can be extended for defect prediction and quality evaluation of aspect oriented software.

Keywords: Software Metrics, Process Metrics, Aspect Oriented Software Development (AOSD), Separation of Concerns (SoC), Software Maintenance, Software Testability, Defect Prediction, Software Quality and Quality Characteristics, Aspect Oriented Programming (AOP).
I. INTRODUCTION
Software development has come a long way, from machine-level languages to procedural programming, to OOP, to Component Based Software, and on to Aspect Oriented Software [6,7]. However, even the software methodologies used in industry leave a significant gap between knowing the system goals and implementing them. Based on the fundamental principles of software engineering, many researchers agree that the best way to create manageable systems is to identify and separate the system concerns. In a 1972 paper [9], David Parnas argued that the best way to implement SoC is modularization, a method of creating modules that hide their design decisions from each other. Numerous software methodologies, including generative programming, meta-programming, reflective programming, compositional filtering, adaptive programming, subject-oriented programming, aspect-oriented programming, and intentional programming, have emerged as possible solutions for modularizing crosscutting concerns [6]. In some of these cases, separation of concerns (SoC) has emerged as a solution through the increasing use of AOP based on concern abstraction. AOP is considered the better solution for features such as logging, exception handling and security [6-11], and Aspect Oriented Programming (AOP) is the most popular choice among software professionals where crosscutting features need to be implemented. Merits and demerits of the popular software methodologies are reported by Singh et al. [7]. Cristina Lopes and Gregor Kiczales of the Palo Alto Research Center (PARC), a subsidiary of Xerox Corporation, were among the initial proposers of AOP; Kiczales presented the term "AOP" in 1996 [10]. In the late 1990s, he led the team at Xerox that developed AspectJ, one of the first languages with a practical implementation of AOP.
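As an illustration of the crosscutting concerns discussed above, the following minimal sketch uses AspectJ's annotation style to modularize logging and exception reporting into a single aspect instead of scattering them across every business class. The package and class names (com.example.service, TraceAspect) are hypothetical and used only for illustration.

```java
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.annotation.AfterThrowing;

// Logging, normally scattered across every business class, is
// modularized here in a single aspect (separation of concerns).
@Aspect
public class TraceAspect {

    // Advice applied before every public method of the (hypothetical) service layer.
    @Before("execution(public * com.example.service..*.*(..))")
    public void logEntry(JoinPoint jp) {
        System.out.println("Entering " + jp.getSignature().toShortString());
    }

    // Exception-handling concern: report any exception thrown from the same join points.
    @AfterThrowing(pointcut = "execution(public * com.example.service..*.*(..))",
                   throwing = "ex")
    public void logFailure(JoinPoint jp, Throwable ex) {
        System.out.println("Exception in " + jp.getSignature().toShortString()
                + ": " + ex.getMessage());
    }
}
```

With AspectJ weaving enabled, the two advice methods apply at every matching join point, so the logging concern lives in one module, which is exactly the separation of concerns argued for above.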
Software metrics have been considered the best predictors of quality characteristics. Numerous quantitative and qualitative assessments based on software metrics have been reported to date for module oriented, object oriented and component oriented software, but very few for aspect oriented software, for the overall quality evaluation of different kinds of systems. Metrics are calculated so that process and product indicators can be ascertained [1]. Software metrics are broadly classified into two major categories: (i) process metrics and (ii) product metrics. Process metrics are measured across all projects and over long periods of time [1]. Their main objective is to provide a set of process indicators that lead to long-term process improvement. A number of indirect measures have been derived to measure process quality in software, such as errors uncovered before release, defects delivered to and reported by end users, and schedule. Process metrics mainly help in improving the overall level of process maturity. Process indicators enable a software engineering organization to gain insight into the efficacy of an existing process, and they allow software practitioners to assess what works and what does not. In [26], Henderson-Sellers defines a product metric in terms of a software "snapshot" at a particular point in time, while a process metric relates to changes over time, e.g. the number of code changes. Product metrics are divided into two classes: (a) dynamic metrics, which are measured at the time of program execution, and (b) static metrics, which are measured from the system representations. This classification of software metrics is not always unambiguous, because some metrics are used to evaluate both products and processes [2]. Dynamic metrics are useful for assessing efficiency and reliability, while static metrics mainly contribute to the assessment of quality attributes such as complexity, understandability and maintainability. Published research indicates that dynamic metrics are more closely related to software quality attributes; researchers have used dynamic metrics, such as response time or number of failures, as direct measurements, whereas static metrics have shown only an indirect measurement relationship with quality attributes. Product metrics can further be classified as (i) external metrics (related to functionality or cost) and (ii) internal metrics (related to size or complexity). In addition, in some reported works, similar to quality metrics, product metrics are also classified into (i) direct metrics (related to quality attributes) and (ii) indirect metrics (related to quality criteria). The most popular software product metrics are size, defect density and system complexity, while effort, project defect density, duration and efficiency are the most widely used process measurements. This paper is divided into four sections. The introduction to software process metrics and aspect oriented software development is followed by related work in the second section. In the third section we present the proposed process metrics based framework with metric definitions, and then discuss the applicability of process metrics for the prediction of bugs and for quality evaluation based on a maintenance model specific to
aspect oriented software. Finally, the conclusion and future scope of the proposed work are reported.

II. RELATED WORK ON PROCESS METRICS

In order to derive the proposed framework we carried out an exhaustive literature review on software process metrics, using a review process similar to the flowchart-based process presented by Kumar et al. [5] and Singh et al. [12]. To find the most relevant studies and to limit the review to defect identification based on process metrics for software systems, we first decided the search string according to the objective, then searched the databases (Google, IEEE Xplore, ACM, Springer) and collected the papers; thereafter we selected the appropriate research papers based on title, abstract and conclusion and discarded the irrelevant studies. A number of studies have considered process metrics as indicators for defect prediction and maintenance [3]. However, the process metrics designed to date consider only modular and object oriented software. In [3], Jureczko et al. present a review of research studies that investigate process metrics in defect prediction. The authors of [3] already analyzed more than twenty research papers to survey the above mentioned process metrics; because of space limitations we have not re-examined those papers, and readers may refer to the work by Jureczko et al. [3] for further detail. In [4], Sisman et al. presented an approach for incorporating the version histories of software files into an Information Retrieval based framework for bug localization. Their approach uses the information stored in software versioning tools, in particular the frequency with which a file has been associated with defects and with modifications, to estimate the prior probability of any given file being the source of a bug; this significantly improves retrieval performance for bug localization. The authors used two different algorithms for retrieval, one based on Bayesian reasoning and the other on Divergence from Randomness. However, no process metrics framework has been reported for aspect oriented software to date. Our proposed framework is an extension, with respect to process metrics, of the work done for object oriented software.
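To illustrate the general intuition behind such history-based priors, the sketch below assigns each file a smoothed prior proportional to how often past defect-fixing commits touched it. This is only a simplified sketch under our own assumptions, not Sisman et al.'s actual formulation, for which the reader should consult [4].

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: a smoothed prior P(file is buggy) from past defect-fix counts.
public class DefectPrior {

    /** defectFixCounts: how many past defect-fixing commits touched each file. */
    public static Map<String, Double> priors(Map<String, Integer> defectFixCounts) {
        double total = defectFixCounts.size(); // add-one smoothing mass
        for (int c : defectFixCounts.values()) {
            total += c;
        }
        Map<String, Double> prior = new HashMap<>();
        for (Map.Entry<String, Integer> e : defectFixCounts.entrySet()) {
            prior.put(e.getKey(), (e.getValue() + 1) / total);
        }
        return prior;
    }

    public static void main(String[] args) {
        Map<String, Integer> counts = new HashMap<>();
        counts.put("Account.java", 4);
        counts.put("Logging.aj", 1);
        counts.put("Transfer.java", 0);
        // Files fixed more often in the past receive a higher prior.
        System.out.println(priors(counts));
    }
}
```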
The proposed metrics suite extends the applicability of process metrics for bug identification to aspect oriented software, taking into account its specific AOP features.

III. PROCESS METRICS FRAMEWORK FOR AO SOFTWARE

In [3], Jureczko et al. surveyed the broadly classified process metrics and presented mainly five categories specific to a class, defined as follows: (i) Number of Revisions (NR): the number of revisions of a given class during development of the examined release of a software system. (ii) Number of Distinct Committers (NDC): the number of distinct authors, usually developers, who committed changes to a given Java class during development of the examined release. (iii) Number of Modified Lines (NML): the sum of all modified lines of source code. (iv) Is New Class (IN): whether the given class existed in the previous version of the examined system or is a new class. (v) Number of Defects in Previous Version (NDPV): the number of defects repaired in a given class during development of the previous release of a software system.

Our prime objective is to derive a metrics framework for aspect oriented software specific to process metrics, by considering aspect oriented constructs such as aspects, pointcuts, join points, advice and introductions. We have derived a total of eight process metrics for aspect oriented software, as listed in Table 1. The proposed framework is based on an extension of the object oriented process metrics, which have already been reported and proven to be good indicators of defects across a number of data sets, as reported by Jureczko et al. [3]. The eight new metrics are specific to Aspect Oriented Software Development and to the features of aspect oriented technology. A description of each metric is given in Table 1.

A. Process Metrics Definitions

In this section we present a detailed description of all the proposed process metrics for Aspect Oriented Software.
Table 1. Process metrics description for Aspect Oriented Software Development (AOSD)

1. Number of Revisions for Aspects (NRA): counts the number of revisions of a given aspect during development of the examined release of a software system.
2. Number of Distinct Committers for Aspect (NDCA): the number of distinct authors, usually developers, who committed changes to a given aspect during development of the examined release of a software system.
3. Number of Modified Lines in Aspect (NMLA): the sum of all lines of source code which were added or removed for the given aspect; NMLA = NMLJ + NMLAC + NMLP + NMLI.
4. Is New Aspect (INA): indicates whether the given aspect existed in the previous version of the examined system or is a new one.
5. Is New Advice (INAE): indicates whether the given advice (before, after, around or throws advice) existed in the previous version of the investigated system or is a new one.
6. Is New Pointcut (INPT): indicates whether the given pointcut existed in the previous version of the investigated system or is a new one.
7. Is New Introduction (ININ): indicates whether the given introduction existed in the previous version of the examined system or is a new one.
8. Number of Defects in Previous Version for Aspect (NDPVA): counts the number of defects which were repaired in a given aspect during development of the previous release of a software system; NDPVA = NDPVJ + NDPVAC + NDPVP + NDPVI.

Metric abbreviations: NMLJ: Number of Modified Lines in Join Points; NMLAC: Number of Modified Lines in Advice; NMLP: Number of Modified Lines in Pointcuts; NMLI: Number of Modified Lines in Introductions; NDPVJ: Number of Defects in Previous Version for Join Points; NDPVAC: Number of Defects in Previous Version for Advice; NDPVP: Number of Defects in Previous Version for Pointcuts; NDPVI: Number of Defects in Previous Version for Introductions.
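As a sketch of how the revision-oriented metrics in Table 1 could be collected in practice, the code below walks a list of commit records extracted from a versioning tool and computes NRA and NDCA per aspect. The CommitRecord type and its fields are hypothetical placeholders for data that would actually be extracted from a repository such as Git or SVN.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical commit record: author, touched aspect file, release in which the commit occurred.
record CommitRecord(String author, String aspectFile, String release) {}

public class RevisionMetrics {

    /** NRA: number of revisions of each aspect during the examined release. */
    public static Map<String, Integer> nra(List<CommitRecord> commits, String release) {
        Map<String, Integer> revisions = new HashMap<>();
        for (CommitRecord c : commits) {
            if (c.release().equals(release)) {
                revisions.merge(c.aspectFile(), 1, Integer::sum);
            }
        }
        return revisions;
    }

    /** NDCA: number of distinct committers of each aspect during the examined release. */
    public static Map<String, Integer> ndca(List<CommitRecord> commits, String release) {
        Map<String, Set<String>> committers = new HashMap<>();
        for (CommitRecord c : commits) {
            if (c.release().equals(release)) {
                committers.computeIfAbsent(c.aspectFile(), k -> new HashSet<>())
                          .add(c.author());
            }
        }
        Map<String, Integer> counts = new HashMap<>();
        committers.forEach((file, authors) -> counts.put(file, authors.size()));
        return counts;
    }
}
```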
From the eight process metrics shown in Table 1, NMLA and NDPVA are composite process metrics for AOSD. Number of Modified Lines in Aspect (NMLA) is collected by measuring the sum of the Number of Modified Lines in Join Points (NMLJ), Number of Modified Lines in Advice (NMLAC), Number of Modified Lines in Pointcuts (NMLP), and Number of Modified Lines in Introductions (NMLI):

NMLA = NMLJ + NMLAC + NMLP + NMLI

Similarly, the Number of Defects in Previous Version for Aspect (NDPVA) is measured by summing the Number of Defects in Previous Version for Join Points (NDPVJ), for Advice (NDPVAC), for Pointcuts (NDPVP), and for Introductions (NDPVI). For AO software, the NDPVA metric can therefore be collected using the following equation:

NDPVA = NDPVJ + NDPVAC + NDPVP + NDPVI

B. Maintenance Assessment Framework for AO Software using Process Metrics

We present an aspect oriented framework based on process metrics to estimate maintenance effort. Based on the framework for AO software shown in Figure 1, process metrics can be collected from AO software artifacts. Sant'Anna et al. have previously used a similar assessment framework based on product metrics [13]; that work provided the input for designing a similar framework based on process metrics to evaluate maintenance. The collected process metrics can be used to interpret results in terms of maintenance effort, as shown in Figure 2, and in terms of other quality attributes of AO software.
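Given per-construct counts, the two composite metrics reduce to simple sums. The sketch below assumes the per-construct values for join points, advice, pointcuts and introductions have already been extracted for one aspect; the AspectChangeCounts type is hypothetical.

```java
// Per-construct counts for one aspect; field names mirror the metric abbreviations above.
record AspectChangeCounts(int nmlj, int nmlac, int nmlp, int nmli,
                          int ndpvj, int ndpvac, int ndpvp, int ndpvi) {

    // NMLA = NMLJ + NMLAC + NMLP + NMLI
    int nmla() {
        return nmlj + nmlac + nmlp + nmli;
    }

    // NDPVA = NDPVJ + NDPVAC + NDPVP + NDPVI
    int ndpva() {
        return ndpvj + ndpvac + ndpvp + ndpvi;
    }
}
```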
Figure 1. Assessment framework for AOS maintenance from process metrics. AOS artifacts (design and source code of current and past releases, concern descriptions with UML diagrams, log files, and bug history with additional details) are the inputs from which the AOS process metrics are collected; the metrics then feed the maintenance model.
C. Aspect Oriented Maintenance Model using Process Metrics

In 2003, Sant'Anna et al. proposed an assessment framework based on product metrics [13]. The authors used product metrics to assess maintenance and reusability for aspect oriented software in the context of two different empirical studies. In 2005, Figueiredo et al. reported another assessment scheme based on aspect oriented product metrics [15]. These works motivated us to consider dynamic metrics as well for the assessment of maintenance in aspect oriented software, and we proposed a model in a similar direction in our previous work [14]. In this paper, extending our previous work from the process metrics point of view, we propose an aspect oriented maintenance model based on the process metrics framework. The proposed model and framework follow a design similar to those adopted by Sant'Anna et al., based on product metrics [13], and by Singh et al. [14] to estimate maintenance. However, this model takes the process metrics point of view, which has not been considered in any study of aspect oriented software to date.
In this proposed work we consider aspect oriented process metrics and their dependency on the number of bugs. Bugs are an indirect measure of maintenance and have been used as such in several previous studies [16-19]. Eaddy et al. showed, based on empirical analysis, that scattering is strongly correlated with defects [16,17]; in their work the authors correlated concern metrics with defects. In [18], Burrows et al. presented a study focusing on the analysis of how faults are injected during maintenance activities involving aspects. In [19], Nagappan et al. reported the results of an empirical study assessing the relationship between software dependencies, code churn and post-release failures observed in a large-scale software system. In addition to identifying the relationship between faults and process metrics, fault characterization is also an important criterion for analysis. A number of researchers have characterized AOP-specific fault types [20-22], and in [23] Ferrari et al. presented a fault taxonomy for AO software. According to our proposed model, after identifying the bugs we can also assign severities in order to predict more accurate and effective maintenance index values for AO software. In [24], Wahyudin et al. proposed a study for defect prediction using combined product and project metrics, with data from an open source project family used for the analysis. In [25], Piveta et al. discuss the strengths and weaknesses of aspect oriented software metrics and how these metrics can be used to predict quality attributes. In a similar fashion, our proposed framework and model, shown in Fig. 1 and Fig. 2, can be used to estimate a maintenance index for AO software. In addition to the number of bugs, their severity may be a better and more accurate predictor of the maintenance index.
Figure 2. Aspect oriented maintenance model using the process metrics framework. The process metrics (NR and NRA; NDC and NDCA; NML and NMLA; IN and INA, INAE, INPT, ININ; NDPV and NDPVA) drive the number of bugs with their severity, which in turn determines the maintenance index.
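As a sketch of how the model in Figure 2 might be operationalized, the code below combines the bugs identified or predicted for an aspect with severity weights into a simple maintenance index. The severity weights and the additive form are illustrative assumptions, not part of a validated model; empirical validation remains future work (Section IV).

```java
import java.util.List;

public class MaintenanceIndex {

    // Hypothetical severity levels; higher severity contributes more to maintenance effort.
    enum Severity { LOW, MEDIUM, HIGH, CRITICAL }

    static double weight(Severity s) {
        switch (s) {
            case LOW:      return 1.0;
            case MEDIUM:   return 2.0;
            case HIGH:     return 4.0;
            case CRITICAL: return 8.0;
            default:       return 1.0;
        }
    }

    /** Severity-weighted index for one aspect: the sum of the weights of its bugs. */
    static double indexFor(List<Severity> bugSeverities) {
        double weighted = 0.0;
        for (Severity s : bugSeverities) {
            weighted += weight(s);
        }
        return weighted;
    }

    public static void main(String[] args) {
        // One HIGH, one LOW and one MEDIUM bug for a single aspect.
        double idx = indexFor(List.of(Severity.HIGH, Severity.LOW, Severity.MEDIUM));
        System.out.println("Maintenance index: " + idx);
    }
}
```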
IV. CONCLUSION & FUTURE SCOPE

The proposed framework and model consider process metrics of aspect oriented software to predict bugs and a maintenance index. Based on AO artifacts, process metrics can be collected and then used to evaluate maintenance. Empirical evaluation and validation of the process metrics based framework and model is taken up as future work. Each process metric can be measured separately, and its correlation with bugs may be used to estimate the maintenance index for aspect oriented software. In addition to maintenance estimation for AO software, other quality attributes may also be assessed from aspect oriented process metrics, in the way Singh et al. [27] proposed models from product metrics, and the automation of these models can be carried out using fuzzy or other soft computing techniques in a manner similar to that presented by Singh et al. [28].
V. ACKNOWLEDGMENT
We would like to express our sincere gratitude to the faculties of Amity University and Gautam Buddha University for their help during this research work, and to the management of both universities for providing the research environment and facilities. We would also like to thank Dr. Arun Sharma, Professor and Head, KIET Ghaziabad, India, for his valuable suggestions and review comments on this manuscript.
REFERENCES

1. R.S. Pressman (2001). Software Engineering: A Practitioner's Approach, fifth edition, McGraw Hill, pp. 82-83.
2. M. Bundschuh and C. Dekkers (2008). Product and Process Metrics, Springer, pp. 207-239.
3. M. Jureczko and L. Madeyski (2011). A review of process metrics in defect prediction studies, Metody Informatyki Stosowanej, pp. 133-145.
4. B. Sisman and A.C. Kak (2012). Incorporating version histories in information retrieval based bug localization, In Mining Software Repositories (MSR), 9th IEEE Working Conference, Zurich, Switzerland, IEEE, pp. 50-59.
5. V. Kumar, A. Sharma, R. Kumar and P.S. Grover (2012). Quality aspects for component-based systems: A metrics based approach, Software - Practice and Experience, Vol. 42, Issue 12, pp. 1531-1548.
6. R. Laddad (2003). AspectJ in Action: Practical Aspect Oriented Programming, Manning Publications Co., Greenwich, pp. 6-26.
7. P.K. Singh, P. Mittal, L. Batra and U. Mittal (2014). A Perception on Programming Methodologies for Software Development, IJCA Proceedings on 4th International IT Summit Confluence 2013 - The Next Generation Information Technology Summit, pp. 1-6.
8. J. Viega, J.T. Bloch and P. Chandra (2001). Applying Aspect Oriented Programming to Security, Cutter IT Journal, Vol. 14, No. 2, pp. 31-39.
9. D.L. Parnas (1972). On the criteria to be used in decomposing systems into modules, Communications of the ACM, 15(12), pp. 1053-1058.
10. G. Kiczales (1996). Aspect-oriented programming, ACM Computing Surveys (CSUR), Vol. 28(4es), 154.
11. G. Kiczales, E. Hilsdale, J. Hugunin, M. Kersten, J. Palm and W.G. Griswold (2001). An overview of AspectJ, In ECOOP 2001 - Object-Oriented Programming, Springer Berlin Heidelberg, pp. 327-35.
12. P.K. Singh, O.P. Sangwan and A. Sharma (2013). A Systematic Review on Fault Based Mutation Testing Techniques and Tools for Aspect-J Programs, In 3rd IEEE International Advance Computing Conference (IACC-2013), AKGEC Ghaziabad, India, IEEE Xplore, pp. 1455-1461.
13. C. Sant'Anna, A. Garcia, C. Chavez, C. Lucena and A. von Staa (2003). On the reuse and maintenance of aspect-oriented software: An assessment framework, In Proceedings of the Brazilian Symposium on Software Engineering, pp. 19-34.
14. P.K. Singh and O.P. Sangwan (2013). Aspect Oriented Software Metrics Based Maintainability Assessment: Framework and Model, In Proceedings of Confluence 2013, The Next Generation Information Technology Summit, 26-27 September, Amity University, Noida, India, IET Digital Library, pp. 1-7.
15. E. Figueiredo, A. Garcia, C. Sant'Anna, U. Kulesza and C. Lucena (2005). Assessing aspect-oriented artifacts: Towards a tool-supported quantitative method, In Proc. of the 9th ECOOP Workshop on Quantitative Approaches in OO Software Engineering (QAOOSE'05), Glasgow.
16. M. Eaddy, V. Garg, A. Aho, N. Nagappan and K.D. Sherwood. On the relationship between crosscutting concerns and defects: An empirical investigation.
17. M. Eaddy, T. Zimmermann, K.D. Sherwood, V. Garg, G.C. Murphy, N. Nagappan and A.V. Aho (2008). Do Crosscutting Concerns Cause Defects?, IEEE Transactions on Software Engineering, Vol. 34, No. 4, pp. 497-515.
18. R. Burrows, F. Taïani, A. Garcia and F.C. Ferrari (2011). Reasoning about faults in aspect-oriented programs: a metrics-based evaluation, In Program Comprehension (ICPC), IEEE 19th International Conference, pp. 131-140.
19. N. Nagappan and T. Ball (2007). Using software dependencies and churn metrics to predict field failures: An empirical case study, In ESEM'07, IEEE, pp. 364-373.
20. R.T. Alexander, J.M. Bieman and A.A. Andrews (2004). Towards the systematic testing of aspect-oriented programs, Dept. of Computer Science, Colorado State University, Fort Collins, Colorado, USA, Tech. Report CS-04-105.
21. M. Ceccato, P. Tonella and F. Ricca (2005). Is AOP code easier or harder to test than OOP code?, In WTAOP'05.
22. J.S. Bækken and R.T. Alexander (2006). A candidate fault model for AspectJ pointcuts, In ISSRE'06, IEEE, pp. 169-178.
23. F.C. Ferrari, R. Burrows, O.A.L. Lemos, A. Garcia and J.C. Maldonado (2010). Characterising faults in aspect-oriented programs: Towards filling the gap between theory and practice, In SBES'10, IEEE, pp. 50-59.
24. D. Wahyudin, A. Schatten, D. Winkler, A.M. Tjoa and S. Biffl (2008). Defect Prediction using Combined Product and Project Metrics: A Case Study from the Open Source "Apache" MyFaces Project Family, In Software Engineering and Advanced Applications (SEAA'08), 34th Euromicro Conference, pp. 207-215.
25. E.K. Piveta, A. Moreira, M.S. Pimenta, J. Araújo, P. Guerreiro and R.T. Price (2012). An empirical study of aspect-oriented metrics, Science of Computer Programming, 78(1), pp. 117-144.
26. B. Henderson-Sellers (1996). Object-Oriented Metrics: Measures of Complexity, Prentice-Hall, Upper Saddle River, NJ, USA.
27. P.K. Singh, O.P. Sangwan, A. Pratap and A.P. Singh (2014). Testability Assessment of Aspect Oriented Software Using Multicriteria Decision Making Approaches, World Applied Sciences Journal, 32(4), pp. 718-730.
28. Y. Singh, P.K. Bhatia and O.P. Sangwan (2009). Predicting software maintenance using fuzzy model, ACM SIGSOFT Software Engineering Notes, 34(4), pp. 1-6.