Verification of Open Source Web Frameworks for Java Platform

Dariusz Król and Jacek Panachida

Abstract A comparative analysis of the two most popular open source web frameworks for the Java platform is presented. The aim of the paper is to describe modern software environments designed for implementing web applications and to make a final recommendation on which web framework should be used when developing a web application. The subjects of the analysis are Spring MVC and JavaServer Faces. The comparison rests upon a theoretical analysis of the available framework features and upon empirical studies of an implemented application designed to support managing a pet clinic.

Dariusz Król
Wrocław University of Technology, Institute of Informatics, Wyb. Wyspiańskiego 27, 50-370 Wrocław, Poland, e-mail: [email protected]

Jacek Panachida
Wrocław University of Technology, Faculty of Computer Science and Management

1 Introduction

At the moment there exist over 40 web frameworks for the Java platform, e.g. Cocoon, Echo, JavaServer Faces, Maverick, Spring, Struts, Tapestry, Turbine and WebWork. Selecting the appropriate framework is a very difficult task, and this paper can help to avoid potential problems. The intrinsic quality of a software system in the real world is its ability to evolve and adapt. Software is continually modified to meet new requirements, and new frameworks are created that can augment or enhance an existing system. Because of this, programmers face a dilemma about which one should be used. Existing software tools do not solve this problem because no universal verification method exists; for each new framework a new verification process must be developed. In this study, we investigate the two most popular environments: Spring MVC and JavaServer Faces. It is clear that the obtained results cannot be directly transferred
to any other framework, but we believe that this method could be applied in a similar fashion to other systems from the list of existing web frameworks. The remainder of this paper is structured as follows. Section 2 provides an overview of the application performance study for three selected user groups. Section 3 discusses essential issues from the point of view of Java metrics. The page templates and Trinidad components studies are described in Sections 4 and 5. Finally, Section 6 draws conclusions and outlines future work.

2 Application Performance Study

The performance study was conducted on the final version of the Petclinic application [9]. Each test involved three different groups of users: users that mostly browse data (Group 1), users that execute tasks connected with updating (Group 2), and a mixed group combining the previous two in a proportion of 4 to 1 (Group 3).

Table 1 Numbers of requests according to user group

Measurement                      Group 1   Group 2   Group 3
Max. no of requests              478/190   263/172   317/184
Max. no of requests in 3 sec.    54/38     52/35     37/26

The numerator refers to Spring MVC, the denominator to JavaServer Faces.

Table 1 presents the numbers of requests according to user group. The value in the numerator describes the result for Spring MVC [1, 12] and in the denominator for JavaServer Faces [2, 6]. The data in the first row give the maximal measured requests per second; the second row gives requests within a fixed response time (3 sec.). A test was finished when a response failed. The main observations are the following:
• For the first user group there is a significant difference. It results from the nature of JavaServer Faces applications, which attempt to mimic the behaviour of desktop applications. Each request directed to a JavaServer Faces application carries a cookie with an average size of 5 kB, which has a significant impact on performance. The cookie includes information about the state of the page components.
• The second group consists of users that perform operations involving the database, e.g. adding a new visit or changing an owner's information. This type of operation reduced performance nearly two-fold. The decrease in requests is most noticeable for the Spring MVC application, which dropped from 478 to 263. For JavaServer Faces the values are almost the same, probably as a result of the cookie size and the higher web server load due to database operations.
• The third group was created to simulate ordinary system usage, in which most users only browse data. There was only a small difference between the second and the last group. The reason is that database operations are far more resource-consuming than common browsing actions.

• The second row presents similar results but for requests within a fixed response time. In this case the Spring MVC application was also better: it handles more requests per second than the JavaServer Faces application with the same group of users. In each group the ratio of Spring MVC to JavaServer Faces requests is nearly the same, approximately 1.4.
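The roughly constant ratio claimed above can be checked directly against the second row of Table 1. The following sketch recomputes it; the class and helper names are ours, not part of the original study.

```java
// Sketch: Spring MVC / JavaServer Faces throughput ratios from Table 1
// (requests served within the fixed 3-sec. response time).
public class ThroughputRatio {
    static double ratio(int springMvc, int jsf) {
        return (double) springMvc / jsf;
    }

    public static void main(String[] args) {
        int[][] groups = { {54, 38}, {52, 35}, {37, 26} }; // Groups 1-3
        for (int i = 0; i < groups.length; i++) {
            System.out.printf("Group %d: %.2f%n", i + 1,
                    ratio(groups[i][0], groups[i][1]));
        }
    }
}
```

The three ratios come out between 1.42 and 1.49, consistent with the approximate value of 1.4 quoted in the text.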

Table 2 Average response time (sec.) according to user group

Measurement                         Group 1    Group 2    Group 3
At max. no of requests              0.4/1.9    0.6/0.8    0.54/1.4
At max. no of requests in 3 sec.    0.25/0.5   0.12/0.27  0.19/0.36

The numerator refers to Spring MVC, the denominator to JavaServer Faces.

Table 2 presents the average response time depending on the user group. The value in the numerator describes the result for Spring MVC and in the denominator for JavaServer Faces. The biggest difference is for the first group, to the advantage of the Spring MVC framework; the lowest difference is for the second group. The values for the third group differ by about 50% between the two frameworks [11].

3 Java Metrics Study

The aim of the code metrics measurement [3] was to compare the size and quality of each project.

Table 3 Metrics of code for application packages

Metric             Data   Core   Spring MVC   JSF
LOC                1048    450         1195   977
LOC of methods      259     93          477   407
No of classes        23     10           29    21
No of interfaces      9      9            0     1
No of packages        5      3            7     7
No of methods       137     47          115   130
No of fields         29     20           29    41

Table 3 shows the size of each project. The analysis also shows data for the common modules, data and core. The data module is responsible for data storage and the core module for business logic. The main observations are the following:
• The application created with Spring MVC has more lines of code than the JavaServer Faces application. The difference in the Method Lines of Code metric is only insignificantly smaller. The smallest module is core, which does not have to implement complex business logic beyond what is necessary for a standard CRUD (create, retrieve, update, delete) application.
• The numbers of packages and interfaces are nearly the same. The exception is the number of classes, which is bigger in the Spring MVC application. Again, the core module is the smallest. There is only one interface, used for the implementation of utility beans in the JSF application.
• The only metrics for which JSF has higher values are the numbers of fields and methods. This is a result of the component model used in the JavaServer Faces application. Each component class is a Plain Old Java Object (POJO), which encapsulates its state. Rich components encapsulate many fields to preserve their state, which also results in a higher number of getter and setter methods per class.
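The component model described in the last observation can be sketched as a plain POJO backing bean. The class and field names below are hypothetical, chosen to match the pet-clinic domain; they are not taken from the original application.

```java
// Minimal sketch of a JSF-style backing bean: a POJO whose private fields
// hold component state, exposed through getter/setter pairs.
public class OwnerBean {
    private String lastName;   // bound to a text input component
    private int pageIndex;     // state of a paged-list component

    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    public int getPageIndex() { return pageIndex; }
    public void setPageIndex(int pageIndex) { this.pageIndex = pageIndex; }
}
```

Because every piece of component state needs such an accessor pair, the field and method counts in Table 3 grow with the number of rich components, which is consistent with the higher JSF values reported above.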

Table 4 Metrics of code with functional classification of packages

Metric             VAL     WEB       UT       CON      LST       FORM
LOC                59/–    233/109   67/149   150/63   93/336    573/300
LOC of methods     34/–    91/31     30/68    64/28    32/156    223/121
No of classes      2/–     6/4       2/3      3/2      7/6       9/5
No of interfaces   0/–     0/0       0/1      0/0      0/0       0/0
No of methods      4/–     22/22     0/3      20/4     0/54      60/45
No of fields       0/–     7/4       0/1      7/0      7/17      15/19

The numerator refers to Spring MVC, the denominator to JavaServer Faces.

Another comparison takes into consideration the functional aspects of the application packages. These were divided into six groups: code responsible for validation (VAL), code of controllers (WEB), code of utility classes (UT), code of converters (CON), code for displaying lists of objects (LST) and code for handling forms (FORM). Table 4 presents the metrics of code with this functional classification of packages. The value in the numerator describes the result for Spring MVC and in the denominator for JavaServer Faces. The main observations are the following:
• The JavaServer Faces application does not have validators. All necessary validation is performed using Trinidad validation tags in the web page templates.
• In the case of utility classes Spring MVC is better, as it has a two times lower number of lines of code. The difference is a result of the code that was necessary for the implementation of bean management and of a class for storing the language version used by the application.
• The process of data conversion was easier in the JavaServer Faces application. There was no necessity to implement a converter for the list of medicines.
• Comparison of the other values is hard because of the difference in the models used in each application. The WEB module of the Spring MVC application is a set of traditional controllers. In the JSF application the role of controllers is concentrated in the code of components, which also have other responsibilities; there, the controllers of the WEB group are a facade for the business logic code.
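The validator classes that make up the VAL group can be illustrated with a short sketch. In the real Spring MVC application this logic would implement the org.springframework.validation.Validator interface; to keep the example self-contained we use a dependency-free stand-in, and the form class and field rules are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Dependency-free sketch of the validator-class pattern behind the VAL
// column of Table 4. The form fields and rules below are illustrative.
public class OwnerFormValidator {
    public static class OwnerForm {
        public String lastName;
        public String telephone;
    }

    // Returns a list of error codes; empty when the form is valid.
    public List<String> validate(OwnerForm form) {
        List<String> errors = new ArrayList<>();
        if (form.lastName == null || form.lastName.isEmpty()) {
            errors.add("lastName.required");
        }
        if (form.telephone != null && !form.telephone.matches("\\d{6,}")) {
            errors.add("telephone.invalid");
        }
        return errors;
    }
}
```

In the JSF application the same checks would instead be declared with Trinidad validation tags inside the page template, which is why the VAL column for JavaServer Faces in Table 4 is empty.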

• The last element is the code related to web forms. The approach of the JSF application, which is similar to that of a desktop application, results in a two times smaller value of the total-lines-of-code metric. In fact, part of the code responsible for form handling is located in the LST group.
• Other metrics, like the number of children, the depth of the inheritance tree or the number of overridden methods, point out some interesting places in the applications. In the Spring MVC application the form code has the biggest values of depth of inheritance tree (on average 8.78) and number of overridden methods (3.11). For the JSF application the average depth of the inheritance tree is 1, and the metrics for the number of children and the number of overridden methods equal 0. This is a result of there being no necessity to extend the classes of the JavaServer Faces framework.

Table 5 Complex metrics of code for the Spring MVC and JavaServer Faces applications

Metric                 Spring MVC    JSF
Specialization index        1.862      0
Afferent coupling           0.857  2.429
Efferent coupling           4.143  2.714
Abstractness                0.016  0.036
Instability                 0.829  0.664
Normalized Distance         0.187  0.3

The results of the complex metrics are presented in Table 5. Each value is the average over the packages of the whole application. We use the following metrics:
• The specialization index is based on the proportion of the number of overridden methods to the depth of the inheritance tree. In connection with the lack of overridden methods in the JavaServer Faces application, the value of this metric equals 0.
• Afferent coupling indicates the average responsibility of a package. The code of the application created with Spring MVC has less than half the responsibility of the JavaServer Faces application. In the Spring MVC application the biggest responsibility lies with the utility classes and converters, in the JSF application with the utility classes and controllers.
• Efferent coupling is the number of other packages that the package being measured depends upon. In Spring MVC the higher values of this metric belong to the code of forms and controllers, in the JSF application to the code of controllers and lists of objects. The ratio of efferent to afferent coupling is approximately 5 for the Spring MVC application and approximately 1 for the JSF application. This means that in the Spring MVC application the majority of packages depend on a minority of other packages. The situation is different for the JSF application, where the values of afferent and efferent coupling are almost the same.
• The abstractness of both applications is comparably low; there are many more concrete classes than abstract ones.
• Instability, the quotient of efferent coupling and the sum of all coupling, is bigger for the Spring MVC application. Most of the instability is due to the utility packages.


• The normalized distance from the main sequence represents the balance between the abstractness and stability of a package; the smaller it is, the better the balance. In this respect Spring MVC is better, as a result of its bigger instability combined with smaller abstractness.
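The last two metrics in the list above follow standard formulas, which can be sketched as below. Note that applying them to the application-level averages of Table 5 is only an approximation: the published values are averages of per-package metrics, not metrics computed from averaged coupling, so the results do not exactly reproduce every table entry.

```java
// Sketch of the package-coupling metrics behind Table 5:
//   instability          I = Ce / (Ca + Ce)
//   normalized distance  D = |A + I - 1|
// where Ca is afferent coupling, Ce efferent coupling, A abstractness.
public class CouplingMetrics {
    static double instability(double ca, double ce) {
        return ce / (ca + ce);
    }

    static double normalizedDistance(double abstractness, double instability) {
        return Math.abs(abstractness + instability - 1.0);
    }

    public static void main(String[] args) {
        // Spring MVC averages from Table 5: Ca = 0.857, Ce = 4.143, A = 0.016
        double i = instability(0.857, 4.143);
        System.out.printf("I = %.3f, D = %.3f%n", i,
                normalizedDistance(0.016, i));
    }
}
```

For the Spring MVC averages the instability comes out at 0.829, matching Table 5 exactly.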

Fig. 1 Changes of code metrics during project development

Figure 1 presents the changes in the size of the project during the implementation of the web module. The second phase, related to the addition of web forms, made the biggest changes in the code base. The third phase, connected with the localization of the application [10], was insignificant for the size of both projects.

4 Page Templates Study

In order to provide a comparison, metrics for the page templates were introduced. There are five types of views: a paged list, details of an object, a form for object modification, a form for adding a visit (wizard style), and a simple list with a fixed number of rows.

Fig. 2 Number of tags according to view type
Fig. 3 The size of generated HTML page

Figure 2 presents the average number of tags of dynamic content per view type. The application created with JavaServer Faces uses more tags [5]. The biggest difference is in the case of the simple list, which was implemented in the JSF application using the paged list, because the Trinidad library does not have a dedicated solution for simple types of lists. Another difference concerns the paged list: the list in the JavaServer Faces application is more complex. The smallest difference was in the case of the modification form, despite the presence of validation tags in the JavaServer Faces application. Figure 3 presents the size of the generated HTML page. The size of a page template is almost the same for both frameworks and equals about 7 kB, but after the transformation of the dynamic tags into HTML code the difference increases up to four times. The main reason, across all types of views, is the additional JavaScript in the JavaServer Faces application [7]; the biggest difference is in the case of the paged list, because of the complexity of the list component.

5 Accessibility and Maintainability Study of Trinidad Components

The accessibility [4] of the Trinidad components was measured by validation of the HTML code and by verification of the correctness of the functioning of the components in modern internet browsers. This evaluation applies only to visual components.

Table 6 Correctness of functioning of Trinidad components according to internet browser

Internet browser   No of points
IE 6.x                       62
IE 7.x                       62
Firefox 3.x                  59
Opera 9.x                    50

Fig. 4 Correctness of Trinidad components
Fig. 5 Types of errors


Table 7 Maintainability of the Trinidad applications: number of changes with regard to project files

Function   List   Form   Localization
Add        2/2    3/1    1/2
Append     2/1    2/1    1/1
Modify     0/0    0/0    0/0

The numerator refers to Spring MVC, the denominator to JavaServer Faces.

In the first test each browser obtained points depending on the correctness of the functioning of the tested components: two points if the component was fully functional, one point if the component was partly functional, and no points if the component did not work. The results are presented in Table 6. Figure 4 presents the correctness of the components according to internet browser. The components work best in the Internet Explorer browsers. The results for Firefox are slightly worse, but all components still work; there are only insignificant problems with the chooseDate, inputData and selectBooleanCheckbox components. The worst was the Opera browser: three components (chooseDate, inputData and selectBooleanCheckbox) did not work, and another two (navigationTree and inputNumberSpinbox) had problems with proper rendering.

The next test was performed using the W3C validator for the HTML 4.01 Transitional specification. During testing some components returned an internal error with the message "the server encountered an internal error() that prevented it from fulfilling this request", and the selectOneChoice component was returning the message "javax.faces.FacesException: SelectItem with no value". In the end, 31 components took part in the test. The test showed that over 95% of the errors belong to one of four groups. Figure 5 presents the percentage share of errors according to their types. The most common error concerned an invalid value of the id attribute, one starting with an underscore or dollar sign. The second category was the lack of a type attribute in a script tag, the third concerned invalid tag placement, and the last the lack of a closing tag for the caption and table tags. Over 1/3 of the components did not have errors, 13% had one error and 20% had two errors. About 1/4 of the components had between three and seven errors, and over 6% had more than seven errors. The components with the largest numbers of errors are inputColor with 19 errors, chooseColor with 13 errors and selectManyCheckbox with 7 errors.
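The per-browser scoring scheme described above can be sketched as follows; the sample component statuses in the usage are illustrative, not the study's full data set.

```java
import java.util.List;

// Sketch of the Table 6 scoring scheme: 2 points for a fully functional
// component, 1 point for a partly functional one, 0 if it does not work.
public class BrowserScore {
    enum Status { FULL, PARTIAL, BROKEN }

    static int score(List<Status> components) {
        int points = 0;
        for (Status s : components) {
            switch (s) {
                case FULL:    points += 2; break;
                case PARTIAL: points += 1; break;
                case BROKEN:  break; // no points
            }
        }
        return points;
    }

    public static void main(String[] args) {
        // A hypothetical browser where 29 of the 31 tested components are
        // fully functional and 2 are only partly functional.
        List<Status> statuses = new java.util.ArrayList<>();
        for (int i = 0; i < 29; i++) statuses.add(Status.FULL);
        statuses.add(Status.PARTIAL);
        statuses.add(Status.PARTIAL);
        System.out.println(score(statuses)); // 29*2 + 2*1 = 60
    }
}
```

With all 31 components fully functional this scheme yields the 62 points recorded for the Internet Explorer browsers in Table 6.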
The measure of the maintainability [8] of an application is based on the number of files that have to be added, modified or appended in order to introduce some new element into the application. The action add means adding a new file, modify means changing the content of a file that already exists, and append means changing a file without changing its old content. Table 7 presents the maintainability results for the Spring MVC (numerator) and JavaServer Faces (denominator) applications. The measured values relate to adding a new paged list, a web form and the localization of the application. The main observations are the following:
• To add a new paged list in Spring MVC it is necessary to update (append) petclinic-servlet.xml, adding a new mapping for the controller and a handler method. Then a new file with the controller is added and the handler is implemented. The next actions involve creating the page template and adding a new mapping for the logical view name (file views.properties).


• In the case of the JSF application, first the declaration of a managed bean must be made (faces-config.xml). There is no necessity to map any new declarations. Next comes the creation of the managed bean implementation, and the last action involves the creation of the page template.
• Comparing the addition of paged lists, the JavaServer Faces application requires less work: only one modification is required and there is no need to define new URL mappings.
• The creation of a web form in the Spring MVC application involves, as before, adding a mapping and a new controller method, but additionally a validator class is required.
• For the JSF application it is only necessary to add a new component method and a new page template. In the case of simple forms it is not necessary to add custom validators; often the Trinidad validator tags are enough.
• Form creation using JavaServer Faces is easier. In the Spring MVC application it is required to repeat the steps known from the previous action and to add a new validator, whereas in the JavaServer Faces application it is often enough to add a new template and method.
• The localization of the application in Spring MVC involves adding special beans to the configuration file and adding the message translations in separate files for the different languages. Language switching is done by beans.
• In the JSF application, first the information about the languages used must be added. The next action involves adding the files with the message translations. Trinidad has out-of-the-box translations for various languages. To add a language switcher, new classes must be implemented in the code base.
• JavaServer Faces is better at message translation. Spring is better in the case of language switching; JavaServer Faces does not have support for this feature.
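The observations above can be tallied against Table 7 with a short sketch: the per-action counts are copied from the table, while the aggregation into totals is ours, not from the paper.

```java
// Total number of project files touched per function, summed from Table 7
// (add + append + modify). The first value of each pair is Spring MVC,
// the second JavaServer Faces.
public class MaintainabilityTotals {
    static int total(int add, int append, int modify) {
        return add + append + modify;
    }

    public static void main(String[] args) {
        System.out.println("List:         " + total(2, 2, 0) + " vs " + total(2, 1, 0));
        System.out.println("Form:         " + total(3, 2, 0) + " vs " + total(1, 1, 0));
        System.out.println("Localization: " + total(1, 1, 0) + " vs " + total(2, 1, 0));
    }
}
```

The totals (4 vs 3 for lists, 5 vs 2 for forms, 2 vs 3 for localization) mirror the observations: JSF needs fewer file changes for lists and forms, while Spring MVC needs fewer for localization.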

6 Conclusions and Future Work

The selection of the appropriate framework is a complex and important task in software development. From our empirical study, we found that:
1. Spring MVC represents the standard approach to developing web applications, well known from development using the Struts framework. JavaServer Faces uses a component- and event-based model, similar to Microsoft ASP.NET.
2. The Spring MVC based application has better performance than the JavaServer Faces application. This is a consequence of the JSF components used in the presentation layer.
3. The project made with Spring MVC was bigger in the sense of code metrics such as lines of code and other artifacts. The classes of the JSF project had more methods and fields as a consequence of the component model.
4. The measurement of the complex metrics indicated a better design of the Spring MVC application with regard to package dependencies and OOP techniques.
5. Pages generated with JavaServer Faces are up to four times bigger than similar pages of the Spring MVC application.
6. The JavaServer Faces framework is better at the creation of web forms but worse at the comprehensive localization of an application.


7. The Trinidad components work properly in most of the modern web browsers.
Although our study investigated only two systems, we think that the observations from the study provide a reasonable basis for a further validation process. Further research is currently being undertaken to extend the application with additional functionality, such as reporting or authentication, and to evaluate open source Ajax frameworks: Google Web Toolkit versus Direct Web Remoting.

Acknowledgements The authors would like to thank Mariusz Nowostawski and the anonymous reviewers for their comments. Mariusz Nowostawski from Otago University offered valuable suggestions on an early version of the manuscript.

References

1. Arthur J, Azadegan S (2005) Spring Framework for Rapid Open Source J2EE Web Application Development: A Case Study. In: Proceedings of the Sixth International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing and First ACIS International Workshop on Self-Assembling Wireless Networks. doi: 10.1109/SNPD-SAWN.2005.74
2. Bergsten H (2004) JavaServer Faces. O'Reilly Media, Sebastopol
3. Chidamber SR, Kemerer CF (1994) A metrics suite for object oriented design. IEEE Trans Software Eng 20:476–493
4. Chisholm W, Vanderheiden G, Jacobs I (2000) Core Techniques for Web Content Accessibility Guidelines 1.0. http://www.w3.org/TR/WCAG10-CORE-TECHS/. Accessed 1 June 2010
5. Chusho T, Ishigure H, Konda N, Iwata T (2000) Component-based application development on architecture of a model, UI and components. In: Proceedings of the Seventh Asia-Pacific Software Engineering Conference. doi: 10.1109/APSEC.2000.896719
6. Deugo D (2006) Techniques for Handling JSF Exceptions, Messages and Contexts. In: Proceedings of the International Conference on Internet Computing, CSREA Press
7. Dunkel J, Bruns R, Holitschke A (2004) Comparison of JavaServer Pages and XSLT: a software engineering perspective. Software—Practice & Experience 34:1–13
8. International Standard ISO/IEC 9126-1 (2001) Software Engineering – Product Quality. Part 1: Quality model. Technical report, Geneva
9. Krebs K, Leau C, Brannen S (2007) The Spring Petclinic Application. http://static.springsource.org/docs/petclinic.html. Accessed 15 June 2010
10. Parr TJ (2006) Web application internationalization and localization in action. In: Proceedings of the 6th International Conference on Web Engineering. doi: 10.1145/1145581.1145650
11. Selfa DM, Carrillo M, Del Rocio Boone M (2006) A Database and Web Application Based on MVC Architecture. In: Proceedings of the 16th IEEE International Conference on Electronics, Communications and Computers. doi: 10.1109/CONIELECOMP.2006.6
12. Seth L, Darren D, Steven D, Colin Y (2006) Expert Spring MVC and Web Flow. Apress, Berkeley
