Understanding Interoperability Issues of Web Service Frameworks

Ivano Alessandro Elia, Nuno Laranjeiro, Marco Vieira
CISUC, Department of Informatics Engineering
University of Coimbra, Portugal
{ivanoe, cnl, mvieira}@dei.uc.pt

Abstract—Web Services are a set of technologies designed to support the invocation of remote services by client applications, with the key goal of providing interoperable application-to-application interaction while supporting vendor and platform independence. The goal of this work is to study the real level of interoperability provided by these technologies through a massive experimental campaign involving a wide set of very popular frameworks for web services, implemented using seven different programming languages. We have tested the inter-operation of eleven client-side framework subsystems with three of the most widely used server-side implementations, each one hosting thousands of different services. The results highlight numerous situations where the goal of interoperability between different frameworks is not met, due to problems on both the client and the server side. Moreover, we have identified issues also affecting interactions between the client and server subsystems of the same framework.

Keywords—web service; interoperability; WS-I Basic Profile; web service framework

I. INTRODUCTION

Web services (WS) are frequently deployed in environments where application-level interoperability (i.e., the ability of making systems operate in conjunction [1]) is a critical feature. Typical deployment environments include large-scale business-to-business collaborations, safety-critical environments, and industrial manufacturing, just to name a few. In these environments, interoperability issues may result in severe financial and reputation costs for the service providers [2]. The WS technology is based on open XML standards and consists of self-describing components that can be used by other applications across the web in a platform-independent manner, being supported by standard protocols such as SOAP and WSDL [3]. Platform independence is the key goal of the WS technology, which defines a set of mechanisms to assure that two applications are able to exchange information, even if they have been built using different languages (e.g., Java, C#, Python) or deployed on top of different WS frameworks (platforms for creating and deploying web services, such as Axis 2 or JBossWS for Java, gSOAP or Axis2/C for C, etc.). In theory, WS frameworks include all the mechanisms needed to assure interoperable interaction, namely: 1) server-side interface description generation tools that generate a WSDL document, which is programming language agnostic

and describes the interface of a service (i.e., it describes the list of available operations, input and output parameters, among other aspects); and 2) client-side artifact generation tools that, in general, use the service description to produce code that in turn can be used by developers to invoke the service. In general, both the server-side and client-side tools referred to above are subsystems of a given software framework, which also provides facilities for runtime communication between client and server (e.g., processing of SOAP requests and responses after deployment). Although interoperability is the major goal of WS, field experience and previous research studies [4]–[8] suggest, in a quite sparse manner, that full interoperability is not only difficult to reach, but has also not yet been achieved by current WS frameworks. This is supported by the Web Services Interoperability Organization (WS-I) [9], which has been working, for many years now, on solving or mitigating interoperability problems by polishing the WS specifications. Despite the WS-I's major effort, experience suggests that even web services that conform to WS-I profiles may present interoperability issues. The problem is that developers often create and deploy their web services assuming that the underlying framework they choose provides full interoperability. Thus, programmers are frequently unaware that by choosing a specific framework to deploy their services they might be introducing interoperability issues and thus excluding some frameworks from inter-operating with their service. To the best of our knowledge, a practical perspective that allows us to systematize this kind of knowledge is still missing. In this practical experience report we present an experimental study that, from a pragmatic perspective, allows getting insights on how interoperable web service frameworks are.
The approach used in this study consists of two phases: 1) a Preparation Phase, where we select a set of WS frameworks for the server and client sides and create a set of services for testing; and 2) a Testing Phase, where we generate the service description documents and then use a set of frameworks at the client side to generate and also compile (when required) client artifacts, based on those service descriptions. The goal of this work is to understand if the client-side subsystem of all client frameworks selected for testing can actually inter-operate with the server-side subsystem of all server frameworks participating in the tests. For the time being, we assess the frameworks' interoperability for the generation of service interface descriptions, generation of

II. BACKGROUND AND RELATED WORK

In a typical web services environment, the provider (i.e., the server) offers a well-defined interface to consumers (i.e., clients), which includes a set of operations and typed input/output parameters. Clients and servers typically interact by using a web service framework, which provides a set of mechanisms to guarantee that: i) the service can be deployed at the server, along with a service interface description (i.e., a WSDL is published); ii) the client developer can generate (and compile when required) client-side artifacts to easily invoke the service operations; iii) both client and server applications can communicate, and they do so by exchanging SOAP messages that are produced by the framework on behalf of the client and the server [3].
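The SOAP messages exchanged in point iii) are plain XML envelopes. As a minimal illustration of the kind of request a framework produces on behalf of a client, the sketch below builds a SOAP 1.1 envelope with Python's standard library; the `echoString` operation name and the service namespace are hypothetical, not taken from any specific framework:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.org/echo"  # hypothetical service namespace

def build_request(operation, value):
    """Build a SOAP 1.1 request envelope for a one-parameter operation."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}{operation}")
    arg = ET.SubElement(op, f"{{{SVC_NS}}}arg0")  # single input parameter
    arg.text = value
    return ET.tostring(env, encoding="unicode")

request = build_request("echoString", "hello")
# The envelope carries the operation name and argument inside the Body.
assert "echoString" in request and "hello" in request
```

In a real deployment, the client-side framework subsystem generates and parses such envelopes transparently; interoperability problems arise when the two endpoints disagree on how the WSDL maps operations and types to this XML.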

The outline of this paper is as follows. The next section presents background and related work on web services and interoperability. Section III presents the interoperability assessment approach. Section IV presents and discusses the results of the experimental evaluation and Section V concludes this paper.

Fig. 1 represents a typical web services inter-operation scenario using two different frameworks at the client and server sides, which are placed at each endpoint of the interaction. In the scenario represented, the client uses the client-side subsystem of framework B, while the server uses the server-side subsystem of framework A; however, these could also be two subsystems of the same framework. As shown, a set of five steps needs to be performed for a client to invoke a remote web service operation. These steps represent the critical points where platform-level interoperability issues can arise. In this paper we focus on the first three, in which problems may prevent any further inter-operation (the analysis of the Communication and Execution steps is out of the scope of this paper). The Service Description Generation step (1) typically occurs automatically when the service is deployed at the server, although it can also be executed manually, by directly using the WS framework tools with the service code as input. The result of this step is a WSDL document that provides the information needed by a client using framework B to invoke the service provided by framework A. During the Client Artifact Generation step (2), the client's artifacts are produced using the artifact generation tool provided by framework B. These artifacts are pieces of code that translate application-level calls at the client into SOAP messages that are delivered to the service. The Client Artifact Compilation step (3) is necessary only for platforms implemented in programming languages that require compilation of the code before execution (e.g., Java, C#); in some languages/frameworks (e.g., Python) it may not be required, as the client-side artifacts are generated dynamically at runtime. Obviously, the artifacts (which after this step may be compiled) only provide the methods for a client application to invoke the remote service, but do not actually invoke it.
It is up to the developer to create the client application and then invoke the methods that allow the client to communicate with the server. This is represented by the Communication step (4) in Fig. 1, which ultimately sends a SOAP message to the server. This message will be processed by the server during the Execution step (5), and a response SOAP message will be produced to deliver the result of the operation. Despite being created to support interoperable operations [3], research and field experience suggest that web services interoperability has been an issue since the inception of this technology. In fact, since 2002 the WS-I organization [9] has


client artifacts, and compilation of artifacts, which are key steps in which problems may prevent any further inter-operation; we do not account for later possible communication issues between the client and the server. This latter type of issue is out of the scope of this paper and will be tackled as future work. We carried out a massive experimental evaluation based on more than seven thousand services deployed on top of the server-side subsystems of three major WS frameworks: Oracle Metro 2.3, JBossWS CXF 4.2.3, and Microsoft WCF .NET 4.0.30319.17929 (C#) [10]–[12]. The services were hosted in major application servers, respectively GlassFish 4, JBoss AS 7.2, and Microsoft IIS 8.0.8418.0 (Express) [13]–[15]. We used a total of eleven client-side subsystems to understand how interoperable these frameworks are, accounting for a total of 79629 executed tests. The client-side subsystems used are: Oracle Metro 2.3; Apache Axis1 1.4; Apache Axis2 1.6.2; Apache CXF 2.7.6; JBoss 6.1 JBossWS 4.2.3; .NET Framework 4.0.30319.17929 (used for three languages: C#, Visual Basic .NET, and JScript .NET); gSOAP 2.8.16 (C++); Zend Framework 1.9 (PHP); and Python suds 0.4 [10]–[12], [16]–[21]. The results clearly show the presence of severe interoperability issues, even in very popular and WS-I compliant service interfaces, which require urgent attention from the industry and research communities. The main contributions of the paper are as follows:
• A programming language agnostic approach for assessing the interoperability of web service frameworks;
• A large experimental evaluation highlighting critical interoperability issues, even in widely used and very popular frameworks;
• A free tool [22] that implements the interoperability assessment approach and can be used by developers and researchers to extend this study.

Figure 1. A typical web services environment.

III. APPROACH AND EXPERIMENTAL SETUP

In this section we present the approach designed to test the interoperability of web service frameworks. In practice, the goal is to verify the interoperability between the client and server subsystems of a set of widely used web service frameworks. As referred in the previous section, we focus on the three typical steps of a web services inter-operation scenario: Service Description Generation, Client Artifact Generation, and Client Artifact Compilation. The following points outline the two key phases of our approach, each composed of several steps (see Fig. 2):

Preparation Phase:
a) Selection of server frameworks: choosing the web service frameworks that will act as service containers at the server side;
b) Selection of client frameworks: choosing the web service frameworks that will be used at the client side (the frameworks selected in step a) can/should be included);
c) Service creation: selecting input and output parameters and generating web service implementations that accept and return those parameters.

Testing Phase:
a) Service description generation: using the server-side subsystem of a given framework to generate a WSDL document for each service created;
b) Artifact generation: using the client-side subsystem of a given framework to generate client artifacts that can be used to invoke the remote service operations;
c) Artifact compilation: compiling (when required) the client artifacts generated in the previous step;
d) Results classification: carried out after each of the previous three steps; it consists of observing the outcome of each step in order to identify interoperability problems.

Fig. 2 presents a graphical representation of the execution of the seven steps of the approach. The following subsections describe each phase and the corresponding steps in detail.

A. Preparation Phase

The Preparation Phase involves selecting frameworks to be used at the server and client sides and creating the testing


been promoting best practices (i.e., Profiles) for web services interoperability for the major web services standards, across platforms, operating systems, and programming languages. A set of interoperability issues in the web services technology is analyzed in [4]. The authors identify some situations that commonly lead to interoperability issues: truncation of decimal data types, conversion to and from native data types, and the representation of sequences of elements and namespaces in WSDL files. The analysis is limited to a high-level description of the different interoperability issues, and no concrete examples of errors are detailed in the text. The authors also analyze how the WS-I Basic Profile 1.0 [9], [23] tries to address the raised issues, and highlight the limitations of the WS-I set of interoperability recommendations (Basic Profile). These limitations are more related to aspects that impact the business logic of the applications (such as float or date and time precision) than to issues in the supporting platforms. Conclusions include the fact that adhering to the WS-I recommendations helps in reducing web services interoperability issues. However, a more extensive practical view is still missing, as the paper does not cover the support for interoperability provided by current WS frameworks. The authors of [5] trace back the interoperability issues to the use of a non-formal language in the specifications, but also recognize that problems can occur much later, with improper implementations of the protocols. The authors propose an approach based on models (and model checking) and on message conformance checking (at runtime). No practical insight is provided on the interoperability of the different frameworks currently being used in industry. In [6] the authors propose a technique to improve the interoperability of web services.
The approach is quite complex, making use of an enriched information model as support for testing, Protocol State Machines, and UML, and also involves interaction with a UDDI repository. In general, the approach tries to improve interoperability, but does not focus on understanding the interoperability issues of specific web service frameworks. A broad view of the interoperability of technologies frequently used in SOA environments, including web services, is presented in [7]. The authors propose a conceptual framework for analyzing web service interoperability issues, and recognize that it is a basis for studying standards and specifications and for identifying interoperability improvement opportunities. The proposal is broad and focuses on high-level interaction (e.g., business processes, human consumers); there is no indication of the practical interoperability properties provided by supporting frameworks, and no specific information is given on how to detect interoperability issues. Interoperability problems can occur when, for instance, native data types or specific language constructs of the service implementation are present at the interface. The problem is that, to the best of our knowledge, there is no standard indicating which types are, or are not, adequate for use in a service interface. The authors of [8] confirm precisely that it is difficult to identify the right construct to express a data type that is fully supported by all WS frameworks.

Figure 2. Outline of the main phases of the approach.

services. In our experiments we use a total of eleven different client artifact generation tools (provided by the client subsystems of WS frameworks). The artifacts created by the tools inter-operate with the server subsystems of three WS frameworks (deployed in three application servers), each populated with more than two thousand services. Fig. 3 provides an overall view of the experimental environment created. As shown, the combination of subsystems results in a massive testing campaign, as the goal is to test the inter-operation of each client-side subsystem of each framework with the selected server-side subsystems (each server-side subsystem is populated with thousands of services). The following sections describe the selection of systems and the creation of services in detail.

a) Selecting server frameworks

The server selection was based on market relevance and, as a result, we selected three very well-known servers that lead the enterprise application server market, according to a 2011 report from Gartner [24]. The servers are GlassFish 4, JBoss AS 7.2, and Microsoft IIS 8.0.8418.0 (Express) [13]–[15]. Each server uses a specific web service framework: GlassFish and JBoss AS support the deployment of Java web services, and IIS supports the deployment of C# web services. Table I presents the selected systems and frameworks.

b) Selecting client frameworks

Table II presents the WS frameworks selected to be used at the client side (for inter-operation with the server side), including the specific tool that is bundled with each framework (and that will be used later on to produce client artifacts). As we can see, we selected 11 client-side WS framework subsystems (we used the .NET framework to produce different artifacts for 3 programming languages). The selection includes the three frameworks already selected for the server side, but also many other popular frameworks (e.g., Axis2, Apache CXF) for several different languages (e.g., C++, Python, PHP).
c) Creating test services

Since our aim is to test inter-operation, the focus is on the input and output parameters of the web services, which are the critical inter-operation points in this technology (and not on the business logic executed inside each service). Therefore, our first batch of test services is composed of simple web service implementations. Each service has a single operation with one input and one output variable of the same type.
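Such single-operation echo services can be produced by straightforward template instantiation, one service per candidate parameter type. The sketch below illustrates the idea; the Java template follows the common JAX-WS annotation style and is our own illustration, not the authors' actual generation script:

```python
# Sketch of an echo-service generator: one JAX-WS-style Java service
# per candidate parameter type. Template and names are illustrative.
JAVA_TEMPLATE = """\
import javax.jws.WebService;

@WebService
public class EchoService_{safe_name} {{
    public {type_name} echo({type_name} value) {{
        return value;  // echo the input without further processing
    }}
}}
"""

def generate_service(type_name):
    """Emit Java source for a one-operation echo service of the given type."""
    safe_name = type_name.replace(".", "_")
    return JAVA_TEMPLATE.format(safe_name=safe_name, type_name=type_name)

source = generate_service("java.text.SimpleDateFormat")
```

Iterating a generator of this shape over every native class of the platform yields the thousands of candidate services used in the campaign.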

Figure 3. An overview of the prepared experimental environment.

TABLE I. SERVER PLATFORMS

Server                                    Framework                       Language
GlassFish 4.0 [13]                        Metro 2.3 [10]                  Java
JBoss AS 7.2 [14]                         JBossWS CXF 4.2.3 [11]          Java
Microsoft IIS 8.0.8418.0 (Express) [15]   WCF .NET 4.0.30319.17929 [12]   C#

The operation simply returns the provided input without further processing. We used a code generation script [22] to automatically create thousands of services that follow the structure described in the previous paragraph, each of them having as input/output one of the different native classes of the programming language supported by the framework. In our case, we generated 3971 Java services for GlassFish, 3971 Java services for JBoss AS, and 14082 C# services for the .NET framework. To gather a list of all classes in the two languages we used scripts [22] based on wget [25] to crawl the online documentation of the two languages [26], [27]. Obviously, not all classes can be used as input or output of a service. The first step of the next approach phase is used as a filter to exclude the services that cannot be used.

B. Testing Phase

After the Preparation Phase, the Testing Phase follows. The key steps are: a) generation of service description documents; b) generation of client-side artifacts; c) compilation of client-side artifacts; and d) classification of the results. The execution of the first three steps in this phase is interleaved with the execution of the classification step. The Testing Phase is described in the next paragraphs.

a) Service Description Generation

The first step is to generate a WSDL document for each service written. For this purpose, we can use a WSDL generation tool (which is typically part of the WS frameworks), or use the automatic WSDL generation approach, which is executed by the application server when the service is deployed. We used this latter approach, since it is a typical technique for generating the WSDL (and in theory should have the same effect as running the WSDL generation tool manually). Since, in the service creation step, we tried to use all

TABLE II. CLIENT-SIDE FRAMEWORKS

Framework                                           Tool                          Language                  Compilation
Oracle Metro 2.3 [10]                               wsimport                      Java                      Yes
Apache Axis1 1.4 [16]                               wsdl2java                     Java                      Yes 1)
Apache Axis2 1.6.2 [17]                             wsdl2java                     Java                      Yes 2)
Apache CXF 2.7.6 [18]                               wsdl2java                     Java                      Yes
JBossWS CXF 4.2.3 [11]                              wsconsume                     Java                      Yes
Microsoft WCF .NET Framework 4.0.30319.17929 [12]   wsdl.exe                      C#/VB .NET/JScript .NET   Yes 1)
gSOAP Toolkit 2.8.16 [19]                           wsdl2h.exe and soapcpp2.exe   C++                       Yes 1)
Zend Framework [20]                                 Zend_Soap_Client              PHP                       N/A 3)
suds Python 0.4 [21]                                suds Python client            Python                    N/A 3)

1) The tool does not compile automatically; a script was added to perform the task.
2) Compilation is performed via an ant task generated by the tool.
3) Compilation is not possible; client object instantiation was checked instead.

classes available in each platform language (Java and C#) as input/output parameters of individual services, we used this step to exclude (from the evaluation) the services for which the frameworks were not able to generate a WSDL document. In the end, we were able to create 2489 services in GlassFish, 2248 in JBoss AS, and 2502 in IIS. For the sake of the experiments, we optimistically assume that if the platform is unable to create the service, it is because the class cannot be used as part of the service interface, as the server platform is unable to handle the specific data type and cannot bind it to any XSD data type in the WSDL document.

b) Client artifact generation

After generating the WSDL documents for all services, we then try to generate client artifacts for each WSDL produced. This procedure is executed not only with the client-side artifact generation tool of the framework used at the server side, but with all client-side frameworks selected for testing. The goal is to uncover potential artifact generation issues that can prevent developers from creating a functional web service client.

c) Client artifact compilation

In this step we try to compile the artifacts generated in the previous step. Erratic generation tools, or tools using an incorrectly produced WSDL, might silently reach this phase, having already produced the artifact code without signaling a failure. If the artifacts cannot be compiled, it is a clear sign that something is not correct and must be analyzed in the classification step.

d) Results classification

As referred, the analysis of the results targets the output produced at each of the three previous steps. In general, we verify the existence of errors and warnings. An error occurs when no output is produced after executing a given approach step (the error may be signaled by the tool being used, or not).
A warning occurs when the tool produces output but also points out some issue (e.g., unsafe operations being used) during its execution. Besides errors, a simple tool warning in one of the three steps may result in a potentially severe interoperability problem later on. Since the current WSDL generation tools did not produce a single warning during the service description generation step in our experimental evaluation, we additionally used a WS-I tool to check the service interfaces for WS-I Basic Profile 1.1 compliance [9]. The goal is to understand which service interfaces might generate a future problem, and to give us an easier way of tracing that future problem back to this early step. As such, we optimistically assume, at this point, that a problematic WSDL can proceed through testing. A relevant aspect is that some frameworks do not support artifact compilation (i.e., the service proxies are dynamically generated at runtime). In these cases, and to keep the communication step excluded from the evaluation, the results classification phase should test if the created objects

can be instantiated without any error (in order to verify if this step produced the right output). Notice that errors in any of the first three steps of the Testing Phase (i.e., Service Description Generation, Client Artifact Generation, and Client Artifact Compilation) are quite disruptive. If one step results in an error, the next step will not be performed. This means that if a WSDL for a given service cannot be created during the Service Description Generation Step, a client developer will not be able to generate artifacts. In the same manner, a failure in the generation of artifacts will obviously result in the impossibility of compiling artifacts. In turn, a failure during artifact compilation can also prevent the creation of the client and, in consequence, the invocation and execution of the service. Thus, these steps clearly represent interoperability-critical points that must be analyzed. Our goal for now is to detect and mark potential issues in the very first critical steps of the typical web services development procedure (which, in case of error, prevent the posterior Communication and Execution steps from being carried out successfully).
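For the frameworks without a compilation step (the PHP and Python clients), the classification reduces to checking that the dynamically generated client proxy can be instantiated. A minimal sketch of that check (the factory-based shape is our own illustration):

```python
def check_instantiation(factory, *args):
    """Return ("OK", obj) if the client proxy can be created,
    otherwise ("ERROR", exc) — the dynamic-language analogue of
    the artifact-compilation step."""
    try:
        return "OK", factory(*args)
    except Exception as exc:  # any failure counts as an error
        return "ERROR", exc
```

With suds, for example, the factory would be the suds client constructor and the argument the WSDL URL; a failure here flags the same kind of interoperability problem that a compilation error flags for Java or C#.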

IV. RESULTS AND DISCUSSION

In this section we present the results of our experimental evaluation. We performed interoperability tests under the conditions described in Section III, leading to a total of 79629 executed tests. The tests disclosed a quite large number of errors and warnings, scattered across the three different steps of the Testing Phase. Fig. 4 provides an overview of the warnings and errors that we found in each step of the Testing Phase and for each server framework. Table III presents a more detailed view of the number and type of issues detected in each step for each tool combination used. Please refer to Table III for the details on the overall numbers presented in the following paragraphs. Also, full details can be found in [22]. As we can see in Fig. 4, in the Service Description Generation Step the error count is zero, since we do not use the 14785 services (out of a total of 22024) for which the frameworks were unable to produce a WSDL. Among the remaining 7239 services (for which the frameworks were able to produce a WSDL), we encountered a total of 86 warnings (see also Table III), corresponding to services that do not pass the WS-I compliance test [9]. From these 86 services, 80 belong to the .NET tests, and only 4 services (of

Figure 4. Overview of the experimental results.

the 86) will reach the final step of the study without showing some kind of error. In the Client Artifact Generation Step, 4763 tests produced at least one warning and 287 resulted in errors (out of the total 79629). The JScript .NET clients are responsible for about 99.8% of all warnings, due to an incompatibility with the Java platforms (Metro and JBossWS) that generates warnings at every execution of the tool. The remaining warnings correspond to other tools trying to generate artifacts from the WSDL documents that failed the WS-I check. About 97% of the errors in this step are produced when using WSDL documents that failed the WS-I check. The remaining 3% are actually the result of processing WS-I compliant service interfaces. During the Client Artifact Compilation Step the tools produced 14478 warnings. All of these are due to compiling the artifacts produced by Axis1 and Axis2 (see also Table III) and are non-disruptive warnings, which refer to the use of "unchecked or unsafe operations". Regarding the errors in this step, we detected 1301 tests resulting in compilation errors. These were, in most cases, due to the generation of code carrying naming collisions or missing parameters, and even to crashes of the compilation tool (the JScript .NET compilation tool crashed in some tests, with the message "131 INTERNAL COMPILER CRASH"). The compilation of the Axis1 and JScript .NET artifacts accounts for the vast majority of the compilation errors (approximately 68% and 30%, respectively). The remaining errors occur when compiling the Visual Basic .NET and Axis2 artifacts.

A. Main findings and trends

In this section we summarize the main findings observed during the experimental evaluation. In the Service Description Generation Step we noticed that, under certain conditions, the server platforms actually publish service descriptions even when the service interfaces do not pass the WS-I compliance check.
In our view, this should not occur, as the servers are publishing a potentially problematic WSDL. This is confirmed by the fact that about 95.3% of the services that


did not pass the WS-I compliance check also did not reach the final approach step without showing some kind of error. Despite its utility, the WS-I compliance check is not sufficient to detect all types of problematic WSDL documents. As an example, we encountered 2 WSDL documents published by JBossWS that passed the WS-I tests and still were unusable, since they did not describe any operation to be invoked by clients. A WSDL generation tool should fail, or at least show a warning, when a service description document is created that misses critical information (such as operations). In fact, we advocate that the minimum occurrence of operation elements in the WSDL XML Schema definition [28] should be changed to 1 (it is currently 0). This would allow tools to stop when such a problematic WSDL is created. Note that, despite this issue in JBossWS, the Metro framework in the GlassFish server signaled the problem by refusing to deploy these same services, which is a more adequate behavior when the goal is reaching interoperability. The Client Artifact Generation Step is not very prone to errors, as visible in Fig. 4 and Table III. As referred, the vast majority of the errors were caused by the unusable WSDL documents mentioned in the previous paragraph. This is probably due to the fact that, in most cases, the generation of the clients is a very automated task, where each statement is translated to a specific piece of code, and this will obviously fail when something quite unexpected is being processed (such as a WSDL without operations). Despite this, Axis1, Apache CXF, and JBossWS did not signal any problem when using the WSDL documents without operations, which is obviously not the right behavior towards the users of the tools (since the tools are silently failing to produce usable code), and which also shows the silent propagation of a severe issue to the client side.
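The minimum check we advocate above — refusing a WSDL that declares no operations — is cheap to implement. A sketch with the standard library, using the WSDL 1.1 namespace:

```python
import xml.etree.ElementTree as ET

WSDL_NS = "http://schemas.xmlsoap.org/wsdl/"

def has_operations(wsdl_text):
    """True if the WSDL declares at least one portType operation."""
    root = ET.fromstring(wsdl_text)
    op = root.find(f".//{{{WSDL_NS}}}portType/{{{WSDL_NS}}}operation")
    return op is not None
```

A guard of this kind would have rejected the two unusable JBossWS documents even though they pass WS-I validation.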
The Client Artifact Compilation Step appears to be more error-prone, as it is where we found the majority of errors (refer to Table III and Fig. 4). On the other hand, if we exclude the Axis platforms' warnings (which are all of the same type and refer to the use of unchecked operations), the total number of warnings found in this step drops to zero.

TABLE III. WS-I Warnings

Services failing the WS-I compliance check, per server-side platform: Metro, 2 out of 2489 services; JBossWS CXF, 4 out of 2248 services; WCF .NET, 80 out of 2502 services. For each client-side framework (Metro, Apache Axis1, Apache Axis2, Apache CXF, JBoss, .NET C#, .NET Visual Basic, .NET JScript, gSOAP, Zend Framework, Suds Python), the table reports the warnings and errors observed in the Client Artifact Generation and Compilation steps against each server-side platform.

a) WSDL for the service based on the class javax.xml.ws.wsaddressing.W3CEndpointReference fails the WS-I check; b) WSDL for the service based on the class java.text.SimpleDateFormat fails the WS-I check; c) services based on java.util.concurrent.Future and javax.xml.ws.Response are WS-I compliant but do not provide operations that can be invoked; d) WSDL for the service based on the class javax.xml.ws.wsaddressing.W3CEndpointReference fails the WS-I check; e) WSDL for the service based on the class java.text.SimpleDateFormat fails the WS-I check; f) 77 .NET services that fail the WS-I check; g) WS-I compliant services based on System.Data.DataTable and System.Data.DataTableCollection; h) WS-I compliant services based on System.Net.Sockets.SocketError.

This shows that either the generation process goes perfectly (and without warnings), or it fails as a result of serious problems in the artifact code. Most of the compilation failures detected are due to wrong variable names, naming collisions, or duplicate variables within specific groups of services (see the next section for examples of disclosed issues). We would also like to emphasize the huge behavioral differences shown by the different tools.

The client artifact generation tools provided with the tested versions of Metro, JBossWS, Apache CXF, gSOAP, and C# .NET appear to be quite mature, as they fail almost exclusively in the presence of non-WS-I-compliant WSDL documents. Moreover, failures always occur in the generation phase: these tools never produced code that later results in compilation errors or warnings. In addition, the two client artifact generation tools for PHP and Python show a similar behavior, but they lack the compilation step. Their behavior will be more extensively tested when we assess the Communication and Execution steps (see the future work section for details).

Some of the tested tools appear to be at a quite immature stage (although they are being used in production environments). One of the most problematic tools was the JScript .NET client artifact generator, which in many cases did not produce the functions necessary to communicate with the server. The interesting aspect is that we observed this behavior when generating artifacts for some of the JBoss and GlassFish services, but also for the .NET framework services (including tool crashes in this latter case). This indicates that inter-operation problems can occur even within the same platform (albeit using different languages). Similarly, the Visual Basic .NET tool had issues with 4 services of its own platform, generating code with variable naming collisions.
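Many of the collision-type failures described above are avoidable with straightforward bookkeeping in the generator. The following sketch is our own illustration (none of the tested tools is claimed to work this way): an allocator for generated identifiers that sanitizes raw names and uniquifies them against the names already emitted in the current scope.

```python
# Illustrative identifier allocator for a code generator: sanitizes raw
# names into legal identifiers and guarantees uniqueness per scope, so
# duplicate variables and naming collisions cannot be emitted.
import keyword
import re

def make_identifier_allocator():
    seen = set()  # identifiers already handed out in this scope

    def allocate(raw_name):
        # Sanitize to a legal identifier first.
        name = re.sub(r"\W", "_", raw_name) or "_"
        if name[0].isdigit() or keyword.iskeyword(name):
            name = "_" + name
        # Uniquify on demand: result, result_2, result_3, ...
        candidate, n = name, 1
        while candidate in seen:
            n += 1
            candidate = f"{name}_{n}"
        seen.add(candidate)
        return candidate

    return allocate
```

A generator that routed every emitted parameter and field name through such an allocator could not, by construction, produce two identical identifiers in one scope, which is precisely the duplicate-variable failure mode we observed.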
We would expect good inter-operation between the client subsystem and the server subsystem of the same framework, but this is not always the case. The gSOAP client artifact generator, along with the JScript .NET generation tool, are the only tools that produce generation errors for WSDL documents that pass the WS-I check. In the case of gSOAP, the error is due to inconsistent inter-operation between its two client artifact generation tools. On the other hand, the artifacts generated by this framework always compile without errors or warnings. The Axis frameworks are quite mature and well known among researchers and practitioners. Axis 1 appears to be among the least interoperable client generation tools, probably due to the lack of recent updates. The Axis 2 platform shows some compilation errors that occur with services that have successfully passed the WS-I tests and are consumed by other platforms without similar problems. These errors are due to the incorrect generation of code containing duplicate variables.

B. Technical Examples of Disclosed Issues

In addition to the issues described in the previous section, we now describe a few technical examples of problems found during the 3 key steps of our experimental evaluation.

1) WSDL generation

GlassFish and JBoss successfully deploy two services that do not pass the WS-I check. All the client artifact generators produce warnings or errors with these two services, with the exception of Zend (which nevertheless produces an uncommon data structure in the generated client, which may be problematic in the following steps of the inter-operation). JBoss also deploys two other services that pass the WS-I check but provide no operations to be invoked. These two services are unusable by Metro, Axis2, .NET (for C#, Visual Basic, and JScript), and gSOAP, while the Zend and Suds platforms generated client objects without methods. It is worth noting that GlassFish refused to deploy these two services.

2) Client artifact generation

Eighty of the services deployed in the .NET platform fail the WS-I compliance test (all services based on classes from the same packages). Seventy-six of these services produce errors that prevent the generation of clients for Metro, Apache CXF, and JBoss (while Suds only has problems with one of these services). These tools have problems in the Client Artifact Generation Step because some XML tags used in the WSDL (s:schema, s:lang) are not recognized. It is interesting to note that two other services that pass the WS-I tests produce very similar errors due to the use of the s:any tag. We also highlight that all the errors in this group (those resulting from the 76 non-WS-I-compliant services and also from 3 WS-I-compliant services) can be solved through manual customization of the data type bindings [29]; however, the client developer has to know precisely which binding to define, which may again result in an interoperability problem.

3) Client artifact compilation

Axis1 artifacts generated for Metro and JBossWS services resulted in 889 artifact compilation errors.
The services that use the Java Exception and Error classes result in a compilation issue that appears to be caused by the incorrect naming of an attribute inside the generated class that should wrap the Error or Exception. Renaming the attribute fixes the compilation issue (although we would obviously need the Communication and Execution steps to verify the correctness of the bug fix). The Axis2 platform shows 5 compilation errors, of which 2 account for the services that use the javax.xml.datatype.XMLGregorianCalendar class. Also in this case, the error seems to be caused by the wrong naming of a parameter: parameters in artifacts linked to classes in the same package follow the naming convention “local_suffixName”, while in this case the parameter is missing the suffix. Similarly, the Visual Basic .NET client artifacts fail to compile for 4 services on the .NET framework (services that use classes from the System.Web.UI.WebControls namespace). The problem seems to be erroneous generation of the artifact code, where a parameter and a method share the same name, leading to a collision.
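The effect of such a member/method collision is easy to reproduce. The toy class below is a deliberately broken Python analogue of this kind of generated artifact (no real generated code is reproduced here): a field and an accessor share one name, so the field silently shadows the method on every instance.

```python
# Deliberately broken sketch of a generated stub in which a field and a
# method collide on the same name.
class GeneratedStub:
    def __init__(self):
        self.status = "ready"   # generated field named 'status'

    def status(self):           # generated accessor with the same name
        # Unreachable through an instance: the field above shadows it.
        return self.status
```

Python accepts the class definition, so the defect only surfaces when client code calls `stub.status()` and finds a string where it expected a method; a statically compiled language such as Visual Basic .NET instead rejects the collision at compile time, which is the failure mode observed in our experiments.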

V. CONCLUSION AND FUTURE WORK

The main motivation behind web service technologies is the inter-operation of different service providers and consumers. In this work we test the inter-operation of different web service frameworks through a massive experimental campaign involving three major server-side WS framework subsystems and eleven different client-side framework subsystems. We deployed thousands of different services covering the widest possible range of native data types for the input and output variables of the services, resulting in 79,629 tests covering interoperability-critical steps in typical WS environments.

Results show that inter-operation between different frameworks is not yet fully achieved and that there are many cases where inter-operation is not possible due to errors on both the server and client sides of the frameworks. Even though our tests use only services with a simple structure, we encountered 1,583 situations that led to interoperability errors. We also encountered many cases (307) where even the interaction between clients and services of the same framework was not possible. It is very important to emphasize that each of the errors we have identified represents a serious failure of the claimed interoperability of the web service technologies. In this context, even a single interoperability error should be considered unacceptable, as it represents a situation where a client is prevented from using a service because of a flaw in one of the involved frameworks. Moreover, we observed that all three server platforms under test deployed services that do not pass WS-I compliance checks and that, among those that pass, some still present interoperability issues.

In future work we intend to test WS frameworks during the communication and execution phases, to cover the whole inter-operation lifecycle. Moreover, we plan to widen our setup by increasing the number of server-side frameworks and by using services with a higher level of complexity, to cover more elaborate patterns of inter-operation.

VI. ACKNOWLEDGEMENT

This work has been partially supported by the Project ICIS - Intelligent Computing in the Internet of Services, number 4774, of the program CENTRO-SCT-2011-02, and by the projects CErtification of CRItical Systems (CECRIS), Marie Curie Industry-Academia Partnerships and Pathways (IAPP) number 324334; and DEsign, Verification and VAlidation of large-scale, dynamic Service SystEmS (DEVASSES), Marie Curie International Research Staff Exchange Scheme (IRSES) number 612569, both within the context of the EU Seventh Framework Programme (FP7).

REFERENCES

[1] A. Stevenson and M. Waite, Eds., Concise Oxford English Dictionary. Oxford; New York: Oxford University Press, 2011.
[2] S. Szykman, S. J. Fenves, W. Keirouz, and S. B. Shooter, “A foundation for interoperability in next-generation product development systems,” Comput.-Aided Des., vol. 33, no. 7, pp. 545–559, Jun. 2001.
[3] F. Curbera, M. Duftler, R. Khalaf, W. Nagy, N. Mukhi, and S. Weerawarana, “Unraveling the Web services web: an introduction to SOAP, WSDL, and UDDI,” IEEE Internet Comput., vol. 6, no. 2, pp. 86–93, Apr. 2002.
[4] K. M. Senthil Kumar, A. S. Das, and S. Padmanabhuni, “WS-I Basic Profile: a practitioner’s view,” in IEEE International Conference on Web Services (ICWS 2004), 2004, pp. 17–24.
[5] P. Ramsokul and A. Sowmya, “A Sniffer Based Approach to WS Protocols Conformance Checking,” in The Fifth International Symposium on Parallel and Distributed Computing (ISPDC ’06), 2006, pp. 58–65.
[6] A. Bertolino and A. Polini, “The audition framework for testing Web services interoperability,” in 31st EUROMICRO Conference on Software Engineering and Advanced Applications, 2005, pp. 134–142.
[7] H. R. M. Nezhad, B. Benatallah, F. Casati, and F. Toumani, “Web services interoperability specifications,” Computer, vol. 39, no. 5, pp. 24–32, 2006.
[8] C. Pautasso, O. Zimmermann, and F. Leymann, “Restful Web Services vs. ‘Big’ Web Services: Making the Right Architectural Decision,” in Proceedings of the 17th International Conference on World Wide Web, New York, NY, USA, 2008, pp. 805–814.
[9] “Web Services Interoperability Organization (WS-I),” 2002. [Online]. Available: http://www.ws-i.org/.
[10] Oracle, “Metro,” 2013. [Online]. Available: https://metro.java.net/.
[11] Red Hat Middleware, “JBoss Web Services (JBossWS).” [Online]. Available: http://www.jboss.org/jbossws.
[12] D. Chappell, “Introducing Windows Communication Foundation in .NET Framework 4,” 2009. [Online]. Available: http://msdn.microsoft.com/en-us/library/ee958158.aspx.
[13] “GlassFish.” [Online]. Available: https://glassfish.java.net.
[14] Red Hat Middleware, “JBoss Application Server,” 2013. [Online]. Available: http://www.jboss.org/jbossas/.
[15] Microsoft Corporation, “IIS 8.0 Express,” 2013. [Online]. Available: http://www.microsoft.com/en-us/download/details.aspx?id=34679.
[16] Apache Software Foundation, “Apache Axis,” 2008. [Online]. Available: http://ws.apache.org/axis/.
[17] Apache Software Foundation, “Apache Axis2,” 2012. [Online]. Available: http://axis.apache.org/axis2/java/core/. [Accessed: 07-Dec-2013].
[18] Apache Software Foundation, “Apache CXF,” 2013. [Online]. Available: http://cxf.apache.org/. [Accessed: 07-Dec-2013].
[19] R. van Engelen, “gSOAP: SOAP C++ Web Services,” 2013. [Online]. Available: http://www.cs.fsu.edu/~engelen/soap.html. [Accessed: 07-Dec-2013].
[20] Zend Technologies, “Zend Framework,” 2013. [Online]. Available: http://framework.zend.com/.
[21] J. Ortel, “Suds - Lightweight SOAP client.” [Online]. Available: https://fedorahosted.org/suds.
[22] I. A. Elia, N. Laranjeiro, and M. Vieira, “Interoperability Tool and Results.” [Online]. Available: http://eden.dei.uc.pt/~ivanoe/DSN14/.
[23] K. Ballinger, D. Ehnebuske, M. Gudgin, M. Nottingham, and P. Yendluri, “WS-I Basic Profile - Version 1.0,” 2004. [Online]. Available: http://www.ws-i.org/Profiles/BasicProfile-1.0.html.
[24] M. Pezzini, Y. V. Natis, K. Iijima, D. Sholler, and R. Favata, “Magic Quadrant for Enterprise Application Servers,” Gartner, Sep. 2011. [Online]. Available: https://www.gartner.com/doc/1804114/magic-quadrant-enterprise-application-servers.
[25] “GNU Wget.” [Online]. Available: http://www.gnu.org/software/wget/manual/wget.html.
[26] Oracle, “Java SE 7 API Specification,” 2013. [Online]. Available: http://docs.oracle.com/javase/7/docs/api/.
[27] “.NET Framework Class Library (C#).” [Online]. Available: http://msdn.microsoft.com/en-us/library/gg145045(v=vs.100).aspx.
[28] E. Christensen, F. Curbera, G. Meredith, and S. Weerawarana, “Web Service Definition Language (WSDL),” 2001. [Online]. Available: http://schemas.xmlsoap.org/wsdl/.
[29] V. Pandey, “How to deal with unresolved xs:schema references in WSDL.” [Online]. Available: https://weblogs.java.net/blog/vivekp/archive/2007/05/how_to_deal_wit.html.