An Interoperability Testing Study: Automotive Inventory Visibility and Interoperability

Nenad IVEZIC1, Boonserm KULVATUNYOU1, Simon FRECHETTE1, Albert JONES1
1 NIST, 100 Bureau Drive, Gaithersburg, MD 20817, USA
Tel: +1 301 975 3536, Fax: +1 301 975 4482, Email: {nivezic|serm|frechette|[email protected]}

Hyunbo CHO2, Buhwan JEONG2
2 POSTECH, South Korea
Tel: +82 54 279 2204, Fax: +82 54 279 2870, Email: {hcho|[email protected]}

Jaeman LEE3
3 KATS, South Korea
Tel: +82 2 509 7272, Fax: +82 2 507 1924, Email: [email protected]

Abstract: This paper describes a collaborative effort between the NIST and Korean Business-to-Business Interoperability Test Beds to support a global, automotive industry interoperability project. The purpose of the collaboration is to develop a methodology for validating interoperable data content standards implemented within inventory visibility tools under an internationally adopted testing framework. Towards that goal, we formulate methods to help vendors consistently implement the interoperable message standards within inventory visibility tools. We develop conformance testing methods to assess compliance of the vendor tools' implementations with the prescribed data content standards. We illustrate these developments in support of an initial proof of concept for an international automotive industry interoperability project.

1. Introduction

The NIST Business-to-Business (B2B) Interoperability Testbed and the Korean B2B Interoperability Testbed (KorBIT) collaboratively develop infrastructure for testing interoperable information systems in the extended manufacturing enterprise [1, 2]. The ultimate goal of this collaboration is to establish a common, international testing framework. Currently, the NIST test bed focuses on data content standards testing, while the KorBIT test bed focuses on messaging testing approaches. The intent is to exchange the results of the two test beds within a common testing framework. With its focus on data content standards, the NIST testbed supports the Inventory Visibility and Interoperability (IV&I) project initiated by the Automotive Industry Action Group (AIAG) [3]. AIAG is a globally recognized organization with 1,600 member companies from the automotive supply chain that cooperate to develop common business solutions. The IV&I project addresses interoperability among inventory visibility (IV) tools in the automotive supply chain. This project is perhaps the first large-scale, open, industry-led, international interoperability project utilizing the emerging B2B standards.

Currently, automotive customer companies require their suppliers to monitor customer inventory status and replenish parts using IV tools that rely on proprietary data exchange standards. Consequently, suppliers use multiple IV tools to communicate with multiple customers. Once interoperable IV&I data exchange standards are in place, however, suppliers will be able to use a single tool of their choosing to communicate with multiple customer tools and thereby cut the costs of managing and operating these tools. The test bed team is developing a testing methodology for IV&I data content standards validation to be used in an internationally adopted testing framework. This paper describes the testing methodology and illustrates its usage in support of the IV&I data content standards validation.

2. Objectives

The objective of this paper is to describe a testing methodology developed for data content standards used to specify message interchange among applications in a business-to-business (B2B) setting. Data content standards for B2B message interchange are currently defined using some industry-adopted, XML-based content specification. The IV&I project has adopted the OAGIS content standard, which makes use of the W3C XML Schema standard [4]. We use this content standard to illustrate the intended application of the testing methodology. This selection, however, does not limit the generality of the described testing methodology.

3. Methodology

Our data content testing methodology is developed to help deploy and assess message content standards within a B2B setting. It consists of the following four components:
• Methods to help vendors implement the content standards;
• Conformance testing methods for data content standards;
• Methods to develop test case generation procedures that drive test framework execution; and
• An execution framework to run the test cases within a specific interoperability project.
In this paper, we focus on the first three components of the methodology. The last component comprises the monitoring and testing tools that are described elsewhere [1].

4. Developments

4.1 The Inventory Visibility and Interoperability (IV&I) Project

Before we describe the specifics of the testing methodology, we describe the IV&I interoperability project artefacts that drive the methodology development. We focus on the artefacts driving the proposed IV&I Proof of Concept (POC). Figure 1 shows the POC interoperability requirements diagram, in which the customer systems communicate inventory data to their suppliers. A supplier, with a single tool of its choice, gathers inventory data coming from different customers. This is possible because all inventory visibility (IV) tools use interoperable message standards. The inventory data is exchanged using the Quantity On Hand (QOH) message. Figure 2 shows the QOH document model that a customer sends to its suppliers, as envisioned by the IV&I business analysts for the POC. The model contains header information about the receiver and the sender, along with item, inventory, shipment, and storage location data. Figure 3 illustrates a portion of the IV&I QOH Business Object Document (BOD) XML Schema definition, based on the QOH document model and following the OAGIS 8.0 specifications for message content definition. The QOH BOD reuses a number of OAGIS

8.0 business concept definitions, such as the item and inventory. Other business concepts, such as the storage location, have been added with the OAGIS IV&I overlay to support the QOH BOD definition.

Figure 1: The IV&I POC interoperability requirements diagram
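To make the exchanged content concrete, the following fragment sketches a possible SyncQOH BOD instance. It is a hypothetical illustration only: the structure follows general OAGIS 8.0 conventions (an ApplicationArea and a DataArea pairing the Sync verb with the QOH noun), and the element names and values (PART-4711, DOCK-03, and so on) are invented for this example rather than quoted from the normative IV&I schema.

  <SyncQuantityOnHand>
    <!-- Hypothetical instance: element names are illustrative, not normative -->
    <ApplicationArea>
      <Sender>
        <LogicalId>CustomerA.IVTool1</LogicalId>
      </Sender>
      <CreationDateTime>2004-04-01T09:30:00Z</CreationDateTime>
      <BODId>QOH-000123</BODId>
    </ApplicationArea>
    <DataArea>
      <Sync/>
      <!-- Item, inventory, and storage location data per the QOH document model -->
      <QuantityOnHand>
        <Item>
          <CustomerItemId>
            <Id>PART-4711</Id>
          </CustomerItemId>
        </Item>
        <Quantity uom="EA">350</Quantity>
        <StorageLocation>DOCK-03</StorageLocation>
      </QuantityOnHand>
    </DataArea>
  </SyncQuantityOnHand>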

Figure 2: The QOH document model

4.2 Methods to Support a Content Standard Implementation

To ensure consistent implementation of the content standard, we develop three types of facilities: Document Validation Rules, Business Case Definitions, and Mapping Tables.

4.2.1 Document Validation Rules

Document Validation Rules explicate the structural, syntactic, and semantic constraints that must hold for any instance of an XML-based message definition. Table 1 shows examples of such validation rules. The three right-most columns indicate whether a rule constrains a specific element/attribute, a range of values for an element/attribute, or some relationship among elements/attributes or their values.

Figure 3: A Portion of the SyncQOH BOD XML Schema

Document Validation Rule                                                        | element | value | relationship
CustomerPartyId and its qualifier are required                                  |    *    |       |
The syncExpression value must be QuantityOnHand                                 |         |   *   |
EffectivePeriod is mutually exclusive with EndEffectiveQuantity                 |         |       |      *
If the LineNumber element is present then the DocumentIds must also be present  |         |       |      *
The ReceivedAsOfDate must be the same as or later than the ivi:StartDate        |         |   *   |      *
Customer and supplier must be different parties                                 |         |   *   |      *

Table 1: Examples of Document Validation Rules

The first two rules are syntactic validation rules. The first indicates that a specific element and its attribute must appear in the document. The second is an example of a constraint imposed on an element/attribute value. While both types of constraints may be defined using XML Schema definitions, it is often desirable or necessary (for reuse reasons) to keep such constraints in external validation rules. The next two rules are structural validation rules. The first indicates that two elements are mutually exclusive; the second expresses a co-existence constraint on two elements. While the first may be specified in the XML Schema definition language, the second cannot be expressed that way. The last two rules are semantic validation rules. The first constrains the values of two elements by imposing an ordering relation; the second also constrains the values of two elements, this time with a non-equality relation. Neither rule can be expressed in the XML Schema definition language alone.
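Section 5 reports that such rules were encoded in the Schematron language [10]. As a minimal sketch, the two structural rules from Table 1 might be expressed as follows, assuming illustrative element paths within the SyncQOH document rather than the normative IV&I schema.

  <sch:schema xmlns:sch="http://www.ascc.net/xml/schematron">
    <sch:pattern name="SyncQOH structural rules">
      <!-- Mutual exclusion: EffectivePeriod vs. EndEffectiveQuantity -->
      <sch:rule context="Item">
        <sch:assert test="not(EffectivePeriod and EndEffectiveQuantity)">
          EffectivePeriod is mutually exclusive with EndEffectiveQuantity.
        </sch:assert>
      </sch:rule>
      <!-- Co-existence: LineNumber requires DocumentIds in the same parent -->
      <sch:rule context="*[LineNumber]">
        <sch:assert test="DocumentIds">
          If LineNumber is present, DocumentIds must also be present.
        </sch:assert>
      </sch:rule>
    </sch:pattern>
  </sch:schema>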

4.2.2 Business Case Definitions

Business Case Definitions explicate the requisite constraints among the message elements/attributes in terms of both usage occurrence and tool support indicators. The 'usage occurrence' for a BOD indicates the minimum and maximum occurrence of each element/attribute required for an instance of the BOD to be valid in a particular data exchange context (e.g., the IV&I project). These occurrence constraints differ from those expressed in the BOD schema because they reflect additional requirements. The occurrence of each element/attribute is conditioned on its parent element. For example, within a SyncQOH document schema, the ItemStatus (parent) element may have usage occurrence O, while the ItemStatus/Code (child) element has usage occurrence 1. The meaning is that the ItemStatus/Code element must occur if the ItemStatus element occurs; otherwise, the ItemStatus/Code element must not occur (a Schematron sketch of this constraint appears at the end of this subsection). The following notation applies:
• O indicates an optional element/attribute, which may occur 0 or 1 time.
• C indicates a conditionally optional element/attribute; it may occur 0 or 1 time, depending on conditions involving elements/attributes other than its ancestors.
• 1 indicates an element/attribute that occurs once and only once.
• 0+ indicates an element/attribute that may occur zero or more times.
• C+ is similar to C, except that the element/attribute may occur multiple times.
• 1+ indicates an element/attribute that must occur at least once.
The 'tool support' indicator expresses the optionality of elements/attributes from a functional perspective and drives the definition of the business cases for testing purposes. If a field's usage occurrence is required (1 or 1+), that field always has mandatory tool support (M). If a field's usage occurrence is optional or conditionally optional (O, 0+, C, C+), the tool support indicator states whether the tool must be able to process the field if it occurs. The following notation applies:
• M indicates that mandatory tool support is required for the field, i.e., the tool must be able to store, process, or interpret the field.
• O indicates optional tool support for the field, i.e., the field may be disregarded by the tool.
The M and O tool support indicators are interpreted conditionally on the parent of the element/attribute in the same way as the usage occurrence. All fields with mandatory tool support constitute the Base Business Case, while different combinations of optional tool support fields constitute additional business cases. As described in Section 4.4, the business case definitions drive the test case requirements.
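A minimal sketch of the ItemStatus/Code usage occurrence constraint, again in Schematron and again assuming an illustrative context path:

  <sch:pattern xmlns:sch="http://www.ascc.net/xml/schematron"
               name="Usage occurrence: ItemStatus/Code">
    <!-- Code (usage occurrence 1) must occur whenever its parent ItemStatus occurs -->
    <sch:rule context="ItemStatus">
      <sch:assert test="Code">
        ItemStatus/Code must occur when ItemStatus occurs.
      </sch:assert>
    </sch:rule>
  </sch:pattern>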

4.2.3 Mapping Tables

Mapping Tables explicate the mappings between each XML-based message element/attribute and the intended vendor tool interface. Table 2 shows an example mapping table with usage occurrence and tool support definitions. The 'Vendor Support' column shows a vendor's support for the selected document schema elements. For example, the vendor's support for the From and To components of EffectivePeriod, but not for the Duration component, is a potential problem, since the 'Tool Support' column indicates that all three elements are mandatory. A hypothetical transform realizing such a mapping is sketched after Table 2.

Element                       | Description                      | Usage Occurrence | Tool Support | Vendor Support
Item/CustomerItemId/Id        | Customer part number             | 1                | M            | 1
Item/CustomerItemId/Revision  | Part revision number             | O                | M            | 1
Item/EffectivePeriod/From     | Start date of part production    | C                | M            | 1
Item/EffectivePeriod/To       | Planned end date of production   | C                | M            | 1
Item/EffectivePeriod/Duration | Planned duration of production   | C                | M            | 0
Item/EndEffectiveQuantity     | Planned part cumulative quantity | O                | O            | 0

Table 2: Example Mapping Table with Usage Occurrence and Tool Support Definitions
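As an illustration of how a vendor might realize the input side of such a mapping, the following XSLT sketch pulls the supported Table 2 fields into a vendor-local record. It is hypothetical: VendorPart and its child elements are invented names, and the source paths assume the element structure shown in Table 2.

  <xsl:stylesheet version="1.0"
      xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <!-- Pull each supported SyncQOH item into a vendor-local record -->
    <xsl:template match="Item">
      <VendorPart>
        <PartNo><xsl:value-of select="CustomerItemId/Id"/></PartNo>
        <Revision><xsl:value-of select="CustomerItemId/Revision"/></Revision>
        <EffectiveFrom><xsl:value-of select="EffectivePeriod/From"/></EffectiveFrom>
        <EffectiveTo><xsl:value-of select="EffectivePeriod/To"/></EffectiveTo>
        <!-- EffectivePeriod/Duration is omitted: Vendor Support is 0 although
             Tool Support is M; this is exactly the gap the mapping table exposes -->
      </VendorPart>
    </xsl:template>
  </xsl:stylesheet>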

4.3 Conformance Testing Methods for Message Content Standards

Once the vendor tools (either an IV application or a transform tool) implement a message content specification, such as the SyncQOH BOD, a number of conformance testing methods may be applied to validate the compliance of the tool with the IV&I requirements and the capacity of the tool to manage content data. The conformance testing methods may be explained using the reference function architecture for the IV&I tool shown in Figure 4.

Figure 4: Reference Function Architecture for IV&I Conformant Tool

The conceptual reference architecture states that an IV&I-enabled tool is capable of supporting the following functions:
• Receive a message (over a transport) and retrieve a BOD from the message envelope;
• Validate the BOD content;
• Create a Confirm BOD instance;
• Store a BOD instance;
• Render a BOD instance for viewing via a GUI;
• Generate a BOD instance from locally stored data; and
• Prepare a BOD instance for transmission and send the instance within a message.
As indicated in the figure by the dotted lines, the first and last functions are not of concern for the content conformance tests; both are assumed correct. The following are descriptions of each conformance phase and its positive test claim.
• The BOD Validation Phase tests the capability of a tool to generate a valid BOD instance. An IV&I compliant tool, upon completion of this test phase, is claimed to successfully generate a SyncQOH BOD instance based on correctly stored QOH data.
• The Information Input Mapping Test Phase tests the capability of a participating tool to correctly map an input BOD instance onto the tool's local store. An IV&I compliant tool, upon completion of this test phase, is claimed to successfully map an input SyncQOH BOD instance based on a provided valid SyncQOH instance that was (1) correctly validated by the tool, (2) correctly rendered by the tool user interface, and (3) correctly interpreted by a technician using the tool user interface.
• The Information Output Mapping Test Phase tests the capability of a participating tool to correctly transfer a received valid BOD instance. An IV&I compliant tool, upon completion of this test phase, is claimed to successfully generate a SyncQOH BOD instance based on a provided valid SyncQOH instance that was (1) correctly validated by the tool and (2) correctly mapped onto the tool data store.
• The Behavioral Test Phase tests the transactional behavior (i.e., request/response) of a participating tool with respect to boundary QOH BOD instances (e.g., an invalid QOH BOD instance). An IV&I compliant tool, upon completion of this test phase, is claimed to successfully generate a Confirm BOD instance based on a valid or invalid SyncQOH instance that was correctly determined to be valid or invalid by the tool. A hypothetical Confirm BOD of this kind is sketched after this list.
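The following fragment suggests what such a Confirm BOD might look like when reporting a failed validation. It is a hypothetical sketch: the element names (ConfirmBOD, OriginalBODId, ConfirmMessage) follow general OAGIS conventions but are not quoted from the normative OAGIS 8.0 schema.

  <ConfirmBOD>
    <ApplicationArea>
      <Sender>
        <LogicalId>SupplierA.IVTool1</LogicalId>
      </Sender>
      <CreationDateTime>2004-04-01T10:15:00Z</CreationDateTime>
    </ApplicationArea>
    <DataArea>
      <Confirm/>
      <BOD>
        <!-- Echoes the identifier of the received SyncQOH BOD -->
        <OriginalBODId>QOH-000123</OriginalBODId>
        <ConfirmMessage>
          <Description>EffectivePeriod is mutually exclusive with
            EndEffectiveQuantity; both were present.</Description>
        </ConfirmMessage>
      </BOD>
    </DataArea>
  </ConfirmBOD>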

4.4 Test Case Generation Procedures

As mentioned previously, the mandatory tool support specification defines the Base Business Case, while different element combinations appearing in the optional tool support specification constitute additional business cases. The specification of business cases defines the testing requirements for the IV&I conformance tests. In addition, the presence of conditional fields drives different test requirements for each business case because some of those fields cannot co-occur. Prior to generating test case requirements, it is desirable to specify the possible IV&I profiles (i.e., valid combinations of IV tool support and conditional fields). The IV&I profiles determine which individual message interfaces (i.e., business cases) make sense to support from the business logic standpoint. Once the IV&I profiles are determined, test requirements are created to indicate the data elements/attributes that must appear in test cases. The test requirements (together with the IV&I profiles) guide test data selection, which matches sample application data with test requirements to form test data. The test data are then assembled into abstract (i.e., format-independent) test cases that match the test requirements. The semantic validation rules ensure valid abstract test cases. Before the executable test cases are generated, conformance level statements are created to aggregate abstract test cases that match some conformance testing strategy. Such a strategy identifies possible aggregations of IV&I profiles and the corresponding business cases. Finally, the executable test cases are generated using the IV&I conformance level statements, the abstract test cases, and the artefacts created during the IV&I BOD development process: the actual IV&I BOD schemas and the external syntactic, structural, and semantic validation rules.

5. Results

The initial phase of the IV&I POC focused on the QOH document in order to create a methodology and tools supporting the implementation and testing of this and additional documents in the IV&I project (e.g., Advanced Shipment Notice and Delivery Notice). Within this phase, the test bed team has been working with an initial group of IV vendors and service providers, including QAD [5], Covisint [6], and iConnect [7], to develop and verify the methodology and tools. The most notable results of this phase are the following:
• On the order of 120 syntactic, structural, and semantic validation rules were identified for the QOH document and encoded in the Schematron language. The encoding was done using a rule authoring environment developed at NIST specifically for XML content validation rule definition [8]. The encoded rules are executed within a testbed node that consists of a Reflector tool and its associated message logging repository [9], and a Schematron processor [10]. Using this installation, the initial group of IV vendors successfully tested their QOH interface implementations against this validation suite.
• In addition to the Base Business Case, we have identified six more business cases that define different uses of the QOH document. For example, one business case is the Base Business Case plus the EffectivePeriod element with either the From-Duration or the From-To element combination. Another business case is the Base Business Case plus the EndEffectiveQuantity element.
• Currently, we are using the Base Business Case to test input mapping with the initial vendor group and are identifying significant inconsistent interpretations of the QOH specification using the conformance test suite. This will be followed by output mapping tests using the Base and other business cases.



• We have completed the formulation of behavioral test cases in collaboration with Drake Certivo [11]. The behavioral test case scenarios include simulation of exception conditions such as min/max inventory boundaries. We are starting the behavioral testing in parallel with the input and output mapping conformance tests.

6. Business Value

The essential value of developing a testing methodology within a shared, international testing framework to validate data content standards implementations is twofold:
• It increases confidence in the data exchange standards and in the readiness of the vendors to support those standards for global supply chains; and
• It aligns international standards testing approaches to ensure that interoperability may be achieved in a common testing framework and in support of global supply chains.

7. Conclusions

This paper described a testing methodology developed to help vendors implement data content standards, to perform conformance testing that validates these implementations, and to generate test cases that drive test execution. Two areas of further work appear most essential. First, the IV&I testing efforts will involve a large group of participants and will require a capable collaborative environment to capture testing requirements, plan and execute testing campaigns, and track the history of interactions with participants. Second, automated testing methods with workflow capabilities will be required to plan and execute repeatable testing campaigns efficiently. The IV&I POC project has adopted an iterative, incremental approach to developing a testing framework in support of validating data exchange standards, with close involvement of both software vendors and users. This model of work has been shown to provide the necessary foundation for large-scale interoperability project development.

8. Acknowledgements

The authors would like to acknowledge the help of Tony Blazej, David Connelly, Deborah Hamill, Bill Harrelson, Jae-man Lee, Steve Ray, Tim Sakach, Pat Snack, Ron White, and the other individuals participating in, and supporting, our collaborative testbed developments.

9. Disclaimer

Certain commercial software products are identified in this paper. These products were used only for demonstration purposes. This use does not imply approval or endorsement by NIST, nor does it imply that these products are necessarily the best available for the purpose.

References

[1] The NIST B2B Interoperability Testbed Homepage, accessed April 2004. Available at http://www.mel.nist.gov/msid/b2btestbed/
[2] Korean B2B/A2A Interoperability Testbed Homepage, accessed April 2004. Available at http://www.korbit.org/
[3] Automotive Industry Action Group (AIAG) Web Site, accessed April 2004. Available at http://www.aiag.org/
[4] Open Applications Group Web Site, OAGIS version 8.0, accessed March 2004. Available at http://www.openapplications.org/downloads/oagidownloads.htm
[5] QAD Web Site, accessed April 2004. Available at http://www.qad.com
[6] Covisint Web Site, accessed April 2004. Available at http://www.covisint.com
[7] iConnect Web Site, accessed April 2004. Available at http://www.iconnect-corp.com

[8] Schematron Editor Tool, accessed April 2004. Available at http://sourceforge.net/projects/cs-wizard
[9] Accordare Web Site, accessed April 2004. Available at http://www.accordare.com
[10] Resource Directory for Schematron 1.5, accessed April 2004. Available at http://xml.ascc.net/schematron/
[11] Drake Certivo Web Site, accessed April 2004. Available at http://www.certivo.com