DIMP: an Interoperable Solution for Software Integration and Product Data Exchange

Xi Vincent Wang, Xun W Xu*

Department of Mechanical Engineering, University of Auckland, Auckland, New Zealand

* Corresponding Author. Tel.: +64 9 373 7599 ext. 84527; fax: +64 9 373 7479. Private Bag 92019, Auckland, New Zealand. Email: [email protected] (Xun W Xu)

Abstract: Today, globalization has become one of the main trends in the manufacturing business, leading to a world-wide decentralization of resources not only amongst individual departments within one company but also amongst business partners. However, despite the developments and improvements of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This paper is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the Distributed Interoperable Manufacturing Platform (DIMP), which is based on a module-based, Service-Oriented Architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

Key words: interoperability, STEP/STEP-NC, software integration, data exchange, collaborative manufacturing

1 Introduction

In the modern manufacturing business, collaboration exists not only amongst departments within the same enterprise, but also among business partners and contractors. The Virtual Enterprise concept was thus introduced to describe a consortium of different departments and companies that come together to quickly exploit fast-changing, worldwide product manufacturing opportunities (Zhang et al., 2000). The connections between these partners are required to be highly collaborative, distributed and agile. In such an environment, a tight data flow between different organisations and processes is a must, and a better way of sharing and exchanging product data is essential. In today's industry, manufacturing and the related business processes are based on various individual software applications that form a "CAD-CAM-CNC" chain. Product information is created by CAD (Computer-aided Design) systems, while CAM (Computer-aided Manufacturing) systems enable users to add manufacturing information, such as machining processes, tool paths and fixtures, to the design. The output of a CAM system is usually an NC (Numerical Control) part program. Such a program is read by a CNC (Computer Numerical Control) system as input and transformed into motion signals. Besides the systems mentioned

above, there exist other software products such as Computer-aided Engineering (CAE) and Computer-aided Process Planning (CAPP) tools. Product Data Management (PDM) and Product Lifecycle Management (PLM) systems are used to deal with product data from design and manufacture through to service and disposal. All of these systems are developed using different software tools, data formats, interfaces and databases, thus forming a highly heterogeneous data environment. Based on a survey among 251 executive officers of German-speaking enterprises (Konstruktion, 2006), the main interoperation difficulties of systems in the CAD-CAM-CNC chain are illustrated in Figure 1. It is evident that more than 75% of the major problems are directly related to different CAD versions or systems, different file formats and conversions.

Figure 1. Main interoperation difficulties of systems in the CAD-CAM-CNC chain (Konstruktion, 2006)

There are even more obstacles to achieving interoperability and portability at the tail end of the CAD-CAM-CNC chain. Today there are more than 4500 configurations just for post-processing data from a CAM system to machine tools (Figure 2). Despite the advancements in design and manufacturing, the current situation of NC programming is still compartmentalized and isolated. Though machine tools themselves have improved immensely over the years, the programming language is still essentially based on G & M codes, which only document instructional and procedural data, leaving most of the design and process information behind. Such a machine-specific part program, generated by a post-processor, supports only a one-way data flow from CAM to CNC.

Figure 2. Status of post-processing in the CAD-CAM-CNC chain

Data translation and conversion are common ways of coping with data heterogeneity in the CAD-CAM-CNC chain, but these exercises often lead to substantial data loss. The next section of this paper reviews the existing research work and commercial solutions for achieving interoperability and portability. Discussions and summaries are presented toward the end.

2 Literature Review

To achieve interoperability and product data sharing and exchange, there are two broad categories of approaches: the data-centric approach and the process-centric approach. With a data-centric approach, all software applications use the same data syntax, whereas with a process-centric approach, integration is achieved at the process level and multiple data formats are allowed.

2.1 Data-centric Approaches

As mentioned above, syntax harmonization is the prerequisite of this approach. To achieve such harmonization, researchers and software vendors have used a number of methods, e.g. proprietary data formats, neutral data formats, and international standard formats.

2.1.1 Use of Proprietary Data Formats

The use of proprietary data formats has been considered by both researchers and software vendors; it may be referred to as the "Big Boy's Games". CATIA® and its PLM system ENOVIA® provide a production-proven SOA (Service-Oriented Architecture) to support collaborative manufacturing. Siemens NX™, the Teamcenter® PLM solution and the Sinumerik® 840D NC solution aim to offer a complete solution for industries such as automotive manufacturing, from design all the way to manufacturing. Likewise, PTC's Pro/ENGINEER® together with its PLM solution Windchill® caters for product information throughout all engineering processes, with associative CAD, CAM and CAE applications spanning conceptual design to NC tool-path generation. Proprietary data formats are dependent on the solid modelling kernels and their wrappers; data portability is readily achieved across different CAD/CAM systems only when the same kernel is deployed.

2.1.2 Use of Neutral Data Formats

During the last few decades, many neutral data formats were developed to enable data exchange. The most commonly used formats for geometric data exchange were the Drawing Exchange Format (DXF) (Autodesk, 2010), the Initial Graphics Exchange Specification (IGES) (Y2K, 1999) and the Product Data Exchange Specification (PDES) (Ssemakula and Satsangi, 1990). In 1994, the Standard for the Exchange of Product data (STEP, ISO 10303) (ISO, 1994) was published for industrial automation systems and integration. Unlike its predecessors, STEP provides a neutral mechanism capable of describing the entire product data throughout the life cycle of a product. STEP consists of different Application Protocols (APs) which provide data models relevant to individual targeted applications, activities or environments. As illustrated in Figure 3, different types of information are stored in a STEP file, and individual software suites can extract the information relevant to the application itself. For instance, a CAM solution is able to extract the information required for a single manufacturing process and pass it on to a CNC machine without the need of other data.

Figure 3. Manufacturing data chain within STEP

In practice, a STEP-compliant PDM framework can be built on the Standard Data Access Interface (SDAI). SDAI defines an abstract Application Programming Interface (API) for working on application data according to a given data model defined in EXPRESS (ISO, 1998), with language bindings for C++ (ISO, 2003a), C (ISO, 2001) and Java (ISO, 2000). Detailed interfaces are defined to achieve portability from one implementation to another. Recently, Mokhtar and Houshmand (2010) studied a similar manufacturing platform combined with axiomatic design theory to realise interoperability along the CAx chain. Two basic approaches are considered: utilizing interfaces, and utilizing a neutral format based on STEP. The axiomatic design methodology is proposed to generate a systematic roadmap towards an optimum combination of data exchange via a direct (using the STEP neutral format) or indirect (using bidirectional interfaces) solution in the CAx environment.

2.1.3 STEP-NC to Complete the Loop

In 2003, the data model for computerized numerical controllers, also known as STEP-NC (ISO, 2003c) (ISO, 2007), was formally published as an international standard. As a data model connecting CAD/CAM systems with CNC machines, STEP-NC completes the integrated CAD/CAM/CNC loop. It remedies the shortcomings of ISO 6983 (ISO, 1982) by specifying machining processes instead of machine tool movements: task-level data (what-to-do) is defined in a STEP-NC file rather than method-level data (how-to-do). Thanks to the vendor-independent nature of a STEP-NC file, the portability of this kind of data is guaranteed.
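To make the task-level versus method-level distinction concrete, the fragments below contrast the two styles. Both are simplified illustrations written for this comparison, not excerpts from a specific controller or from the ISO documents:

```
; Method-level (ISO 6983): axis motions and switching codes only
N10 T05 M06
N20 S3000 M03
N30 G81 X10.0 Y20.0 Z-25.0 F150

/* Task-level (STEP-NC, simplified Part 21): the machining task is explicit */
#10=MACHINING_WORKINGSTEP('DRILL HOLE1',#20,#30,#40,$);
#30=ROUND_HOLE(...);   /* the feature being machined */
#40=DRILLING(...);     /* the operation, carrying strategy and tool data */
```

The G-code fragment tells the machine where to move; the STEP-NC fragment tells it what is being made, leaving the controller free to decide how.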

Nassehi et al. (2008) proposed a framework to combat the incompatibility problem among CAx systems. In this framework, the STEP-NC data model is utilized as the basis for representing manufacturing knowledge, augmented with an XML schema, while a comprehensive data warehouse is used to store CNC information. The platform is further explained in (Newman and Nassehi, 2007): its main components are a manufacturing data warehouse, a manufacturing knowledge base, an intercommunication bus, and diverse CAx interfaces. Mobile agent technology is used to support the intercommunication bus and the CAx interfaces. UbiDM®, a concept of design and manufacture via ubiquitous computing technology, has also been proposed (Suh et al., 2008). The key aspect of UbiDM® is the utilization of the entire product lifecycle information, obtained via ubiquitous computing technologies, for the design and manufacture of a product. A unified product life cycle data model, compliant with existing international standards (i.e. STEP and STEP-NC), is utilized for data exchange. The model represents the input and output information used in the life cycle activities, divided into Begin-Of-Life (BOL), Middle-Of-Life (MOL) and End-Of-Life (EOL). To support the concept, a Ubiquitous Product Life Cycle Support (UPLS) system has been developed as well (Lee and Suh, 2009).

2.1.4 Recap

The data-centric approach enables total compatibility and communication in the design-manufacturing chain. It is easy to maintain data integrity and realize high-level synchronisation. Because data consistency is maintained at the data level, it is easy to consolidate all the data for the product, and the need for coordination or supervision is minimized. However, the approach suffers from a few drawbacks. It is not a trivial task to consolidate a large number of heterogeneous applications, and semantic issues still exist. In many data-centric systems, software applications are encapsulated in a highly closed environment, which goes against modern business practice that is becoming more and more decentralised. Furthermore, there are still synchronization and confidentiality issues with the data-centric approach. As illustrated in Figure 4, product data may experience a parallel or serial flow through the system. When a product is being manufactured by different suppliers or contractors at the same time, a parallel data flow occurs; communication between the suppliers, if any, is usually minimal, and so is data synchronization. On the other hand, a serial data flow occurs when multiple suppliers work on the job one after another. The requirement of passing on a complete data model from one supplier to another may infringe on a supplier's confidentiality requirements. This means that data-centric approaches make it hard to deal with intellectual property matters.

Figure 4. Synchronization and confidentiality issues for data-centric approaches

2.2 Process-centric Approach

In the CAD/CAM/CNC chain, systems may also be integrated and consolidated at the process level. The process-centric approach utilizes the data flow in a process as a common thread to "string" various data entities together; in this way, no common data format is required. There are four dominant types of approaches: direct data translation, two-way data conversion, dual kernel solutions and service-oriented solutions.

2.2.1 Direct Data Translation

Direct data translators are used between each pair of existing software applications. For example, Teamcenter® (Siemens) is utilized by General Dynamics Land Systems (GDLS) in the U.S. to manage data exchange among systems such as NX™ and Pro/E® as well as Siemens PLM components, e.g. PLM Vis, Parasolid® and the NX™ Nastran® SDK. A CAD conversion engine called CrossCAD (UnifiedCAD) provides not only stand-alone translators and plug-ins for CAD systems, but also components that can be integrated by software companies. Additional tools for importing, analyzing, healing and exporting models are available in the toolkit as well.

In 2009, Choi et al. proposed a framework for virtual engineering, described as a Middleware for Exchanging Machinery and Product Data in Highly Immersive Systems (MEMPHIS) (Choi et al., 2009). As part of MEMPHIS, a lightweight CAE middleware is developed for exchanging data residing in heterogeneous CAE systems (Song et al., 2009). A generic CAE kernel is designed to store analysis data from various CAE systems and translate it into different formats.

2.2.2 Two-way Data Conversion

In this approach, data conversion goes through a common data format, which can be a proprietary format or a neutral format, e.g. STEP or IGES. Using its internal conversion modules, CATIA® can insert, assemble, and modify design models of a non-CATIA format; automatic updates in the context of non-CATIA data are also possible. As another commercial example, a multi-CAD solution developed by Theorem utilises Generic Collaboration Objects (GCO) (Theorem®) to represent and hold all forms of data. Besides direct CAD conversion and visualization, Theorem also provides STEP processors for AP203 (ISO, 2005) and AP214 (ISO, 2003b). Targeting interoperability solutions for the digital design and PLM markets, Elysium's portal-based multi-CAD systems provide four major functionalities with four sub-modules (Elysiuminc):

- CADporter™: CAD-to-CAD translation and exchange of parts, assemblies, and Product Manufacturing Information (PMI)
- CADdoctor™: geometry checking, healing and verification, as well as geometry simplification for CAE, CAM, and PLM
- CADfeature™: feature-based design exchange for re-mastering CAD and legacy CAD files, and process control
- Integrated Systems: integrated solutions for data translation and exchange, managing design data, and integrating the supply chain

Based on a technology called Universal Product Representation (Proficiency®), Proficiency's feature-based translation solution enables the transfer of complete design intelligence, such as geometry, features, sketches, manufacturing information, metadata, assembly information and drawings, between major CAD systems in the conversion process.

2.2.3 Dual Kernel Solutions

Two-way data conversion may also take place in a system in which two different modelling kernels co-exist, even though such conversions are hardly noticeable to the user. Visionary Design Systems' IronCAD® (IronCAD), formerly an ACIS®-only system, has incorporated Parasolid® to become the first dual-kernel CAD system. Additionally, CAXA® also utilizes both the ACIS® and Parasolid® kernels (CAXA).

2.2.4 Service-oriented Approach

Applications in an information integration environment can be organized in a service-oriented way. Brecher et al. proposed a module-based, configurable platform for interoperable CAD-CAM-CNC planning (Brecher et al., 2009). The goal is to combat the problems of software inhomogeneity along the CAD-CAM-CNC chain. The approach is called open Computer-Based Manufacturing (openCBM) and supports co-operative process planning (Figure 5). To implement the architecture and integrate inspection tasks into a sequence of machining operations, the STEP standard is utilized to preserve the results of the manufacturing process and feed them back to the planning process (Brecher et al., 2006).

Figure 5. Module users and providers of the openCBM approach (Brecher et al., 2009)

The openCBM platform is organized through a service-oriented architecture providing the abstractions and tools to model the information and connect the models (Huhns and Singh, 2005). Applications are realised not as monolithic programs but as sets of services that are loosely linked to each other, which guarantees the modularity and reusability of the system. To achieve a run-time configurable integration environment, van de Velde reported a plug-and-play framework for the construction of modular simulation software (van de Velde, 2009). In this framework (Figure 6), stakeholders select a target of simulation and assign the performer of the simulation, called a component, before running the selected components; after the simulation, the output is post-processed through the components. In such an architecture, modules are detected, loaded and used at run-time, and the framework needs no prior knowledge of the type and availability of components, thus providing true plug-and-play capabilities.

Figure 6. The main elements of the plug-and-play framework

2.2.5 Recap of the Process-centric Approaches

The process-centric approach respects existing vendor-specific software suites in that each activity in a business process can use its native data format. It supports the "Best of the Best" concept, and the data confidentiality problem is alleviated. However, numerous data translators (e.g. plug-ins and convertors) may be needed to harmonize the system, and data modification may lead to difficult and complex updates. According to calculations by Parasolid's business development manager, approximately 20% of the models imported from a different kernel still contain errors that have to be manually fixed, not to mention the data loss during conversions among different software applications (Anonymous, 2000).

2.3 Discussion & Summary

Table 1 summarizes the different methods in terms of user-friendliness, practicality, and efficiency. Regardless of the approach (data-centric or process-centric), data translation always takes place, albeit to varying degrees. The software packages in a modern manufacturing enterprise are normally provided by multiple vendors using different development toolkits and programming languages, so use of software with the same kernel is not a practical solution. Use of common file formats, e.g. IGES and STEP, is still the most common approach; the problem lies in the common data formats themselves, which are often incapable of capturing the complete package of the data being dealt with. For example, current STEP files do not yet fully support tolerance data. Applications that operate on dual kernels perform well, but limitations still exist. Use of a direct data translator appears to be more user-friendly, though its practicality and efficiency are low, hence more research is needed.

Table 1. Comparison between data-centric and process-centric approaches (ratings from X = low to XXXX = high)

Approaches       Data Translation Method                    User Friendly   Practical   Efficient
Data-Centric     Use of software with the same kernels      XXXX            X           XXXX
Data-Centric     Use of common file formats                 XXXX            XXX         XXX
Process-Centric  Applications that operate on dual kernels  XX              XX          XXXX
Process-Centric  Use of a direct data translator            XXX             XX          XX

It is clear that neither the data-centric nor the process-centric approach alone provides a feasible solution. The authors therefore propose a hybrid architecture combining both data-centric and process-centric concepts: a module-based, neutral-data-centric and service-oriented architecture for the development of a Distributed Interoperable Manufacturing Platform (DIMP).

3 Distributed Interoperable Manufacturing Platform

The criteria for a contemporary manufacturing information system can be summarised as portability, interoperability, visibility and longevity. Portability requires that data or applications can easily be moved from one platform to another, while interoperability means that software, systems and organizations can work together regardless of the platform upon which they are deployed or built; the two concepts are clearly different. For instance, a Java application (Oracle) can run on different platforms alongside other applications, while PostScript (Adobe Systems Incorporated, 1999) is a control language that can be understood by many printers; the former is a case of interoperability whereas the latter is one of portability. Data visibility means that the right type of data is made available to the right people at the right time and in the right manner. Data longevity means that data outlives the software and hardware on which it originated, and that both data and applications can be extended to take advantage of new and future techniques. In an interoperable manufacturing environment, devices with a "Plug & Play" feature are advantageous. Furthermore, so-called "what-to-do" data, i.e. manufacturing information at the task level, is more easily interchangeable along the CAD/CAM/CNC chain than the traditional "how-to-do" data describing the method.

To meet these criteria, the Distributed Interoperable Manufacturing Platform is proposed to achieve an integrative environment among existing and future CAD/CAM/CNC applications (Wang et al., 2010). To integrate the software suites based on the requests and tasks from the user, a service-oriented architecture is used. In DIMP, users' requests are collected and organized as a series of software services; from the service point of view, heterogeneous software tools are integrated as "Virtual Service Combinations" and provided to the user. In this way, software suites are embedded into operational processes. Moreover, in the data domain of DIMP, both the STEP and STEP-NC data models are utilized as the central data schema to take advantage of the data-centric approach. With "coupling" technologies, the STEP and STEP-NC data models can be connected to commercial CAD/CAM software suites, giving the system the much-needed portability: STEP caters for CAD data and STEP-NC for manufacturing data. However, when working with existing CAD/CAM systems, the semantics and syntax of domain-specific data need to be represented. Thus, a data localization/integration technology has been developed to support communications at the data

subset level; this is defined as the "data-packet" concept in DIMP. At the application level, heterogeneity is maintained in DIMP (e.g. through different data models, tools, and programming languages), while homogeneity exists at the platform level based on STEP and STEP-NC. A subset of data is separated from the whole project and treated as a data packet, based on both the project requirements and the user's authority level, and the database is updated after the operation. DIMP therefore adopts a combined approach, i.e. a process-centric approach based on a task-oriented architecture combined with a data-centric approach using the STEP/STEP-NC data models as the neutral file format. In this section, the software integration mechanism is presented first, followed by the product data exchange technologies based on STEP and STEP-NC.

3.1 Interoperable Software Integration Environment

As mentioned above, DIMP aims to integrate software applications based on the requests from the users. The platform is capable of taking in user requests, such as task objectives, software functionality and input/output requirements, and then organizing a series of software services, forming a "request-find-combine-provide" loop. As depicted in Figure 7, DIMP is designed with a three-layer architecture. Under the platform layer at the top, modules such as decision makers and supervisors are provided to aid the data and service flows. At the bottom layer of the system, the software tools are packaged by individual agents acting as plug-ins in DIMP. The packaged software suites are the basic units in the platform for fulfilling the requests from the users.

Figure 7. Three-layer architecture

3.1.1 Service-Oriented Supervision Mechanism

The modules at the Module Layer offer service and supervision (Figure 8). Consisting of an Interface Agent, a Broker Agent and a Supervision Agent, the Supervisory Module is connected directly with the other modules, e.g. the Database Module and the Application Warehouse Module.

Figure 8. Service-oriented DIMP architecture

The Supervisory Module is the decision maker and core supervisor for the whole platform. At the Agent Layer, besides serving as a Human-Machine Interface, the Interface Agent collects requirements from users. After proposing a set of tasks to the user, the Interface Agent gives the user access to a list of available Application Modules and translates the result into a service description in a predefined format before passing it to the Broker Agent. The Broker Agent then selects Application Modules at the upper layer to fulfil the service requested: as soon as the request is read in, it searches for available modules in the Application Warehouse and chooses the best Application Modules to which it can allocate the request. In this way, global optimisation can be realized either automatically or manually at the Module Layer. As an approach to scenario generation, an optimized list of tasks and events is generated and passed to the Supervision Agent. Based on this list, the selected software tools are packaged and provided to the user; from the user's point of view, DIMP delivers a combination of software services that fulfils his/her requirements. In order to model the service list, a data model is needed. To be compliant with the STEP and STEP-NC data models, the service data is modelled in EXPRESS-G diagrams (Figure 9). The service model links to ISO 14649 Part 10 through Entity project. The top level of the service model has attributes such as its_id, its_discription, its_object, its_event_flow, its_data_flow, its_related_connection, its_relevant_application_module, and time_stamp.
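Before turning to the data model, the "request-find-combine-provide" matching performed by the Broker Agent can be sketched in a few lines of Python. This is a minimal illustration under assumed structures: the module registry, the format-matching rule and the rating field are hypothetical, not part of DIMP's published implementation.

```python
# Sketch of the Broker Agent's module selection: match a requested service
# against the Application Warehouse by input/output format, best-rated first.
from dataclasses import dataclass

@dataclass
class ApplicationModule:
    name: str
    data_in: set        # input formats the module accepts
    data_out: set       # output formats it produces
    rating: float = 1.0 # assumed quality/cost score kept by the platform

@dataclass
class ServiceRequest:
    source_format: str
    target_format: str

def find_modules(request: ServiceRequest, warehouse: list) -> list:
    """Return the modules able to serve the request, best-rated first."""
    candidates = [m for m in warehouse
                  if request.source_format in m.data_in
                  and request.target_format in m.data_out]
    return sorted(candidates, key=lambda m: m.rating, reverse=True)

warehouse = [
    ApplicationModule("IIMS STEP-NC Adapter", {"generic STEP-NC"}, {"native STEP-NC"}),
    ApplicationModule("IIMS STEP-NC Converter", {"native STEP-NC"}, {"G-code"}),
]
best = find_modules(ServiceRequest("native STEP-NC", "G-code"), warehouse)
print([m.name for m in best])   # -> ['IIMS STEP-NC Converter']
```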


Figure 9. Service data model

Entity service_id stores an id for every single service so that a service can be tagged easily. Since every service is saved in the database, the service and version history can be traced thanks to service_id. Entity service_type describes the type of service, via its_types, as a creating, modifying, or deleting service, based on the request from the user. Entity object defines the data to be processed through its_line_number and its_identifier; depending on the task being processed, the top line of the data portion is mapped to the entity by the line number and the identifier of that line. In this way, a service is linked to the existing sub-model of the project or product. Entity event is a supertype of the entities event_in and event_out, which support the event-driven type of activities. Entity data_flow defines the input and output data of each service: a copy of the input data is saved and linked to attribute data_in after the service is initialized, while an output copy is saved after the service terminates. Entity connection defines and records both the relationships within a data subset and the connections around it; based on the service description mentioned above, the information requested by the user is tagged and extracted according to the type of connections. A more detailed explanation of Entity connection is given later in the paper. Entity application_module defines which modules are needed to fulfil the user's request; the module list can be a combination of various modules or a single software tool. Entity time_stamp records the times when the service commences and terminates; with the attribute date_and_time, which is defined in ISO 10303-41, the history of a service and project can be traced from the chronological data. In order to monitor and control the information flow, the Supervision Agent is charged with connecting and terminating Application Modules and databases directly: the database is extracted or updated accordingly, while the inputs and outputs of Application Modules are monitored as mentioned above.
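For illustration, the SERVICE entity can be mirrored in a small Python structure. This is a minimal sketch only: the attribute names follow the EXPRESS-G model above (including the model's own spelling of its_discription), while the Python types and the usage example are assumptions.

```python
# Illustrative Python mirror of the SERVICE entity in Figure 9.
# Attribute names follow the EXPRESS-G model; all types are assumptions.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple

class ServiceType(Enum):
    CREATE = "create"   # 'creat', 'modify', 'delete' in the diagram
    MODIFY = "modify"
    DELETE = "delete"

@dataclass
class Service:
    its_id: int                      # service_id: unique, enables version tracing
    its_discription: str             # spelling as in the model
    its_object: Tuple[int, str]      # (its_line_number, its_identifier)
    service_type: ServiceType
    its_event_flow: List[str] = field(default_factory=list)  # event_in/event_out log
    its_data_flow: Tuple[Optional[str], Optional[str]] = (None, None)  # (data_in, data_out)
    its_related_connection: List[str] = field(default_factory=list)
    its_relevant_application_module: List[str] = field(default_factory=list)
    time_stamp: Optional[Tuple[str, str]] = None  # (time_of_start, time_of_end)

# A 'modify' service on the hole entity used later in Case Study II:
svc = Service(1, "change hole diameter", (108, "HOLE2_FLAT_BOTTOM"),
              ServiceType.MODIFY,
              its_relevant_application_module=["IIMS STEP-NC Adapter"])
```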

3.1.2 Software Integration Technology

As soon as the service list is generated by the Supervisory Module, the Application Module Warehouse carries out one or several tasks at the request of a user. Software applications are defined as Application Modules, such as CAD file converters, CAM software suites and tool-path generators. Such modules are developed to be self-contained and to execute autonomously. Moreover, modules are packaged for individual applications with an event/data-controlled layer, and new tools and systems can be included through newly encapsulated modules; this is how the platform's extendibility is guaranteed. Agent technology is used to implement this integration as it provides desired properties such as autonomy, responsiveness, redundancy, distributiveness, and openness (Monostori et al., 2006). The National Institute of Standards and Technology (NIST) has developed a prototype agent-based platform supporting the integration of manufacturing planning, predictive machining models, and manufacturing control (Feng et al., 2005). In DIMP, agents are in charge of monitoring the data/event flow and controlling the activity of modules.

The structure of each module is inspired by the function block concept of IEC 61499, a standard for distributed industrial-process measurement and control systems (IEC, 2005). By providing an explicit event-driven model for data flow and finite-state-automata-based control, function blocks offer features such as robustness and modularity. In DIMP, a function block combined with agent technology can be re-used in a wide range of applications and is also easy to implement. Application software tools are packaged using function blocks (Figure 10), so that individual modules can work autonomously and be considered as "black boxes"; usually, there is no need to modify a module after it is defined for the first time. Mechanisms to trigger the data flows are also integrated into the function blocks. The running of each function block is controlled by the event_in and event_out variables defined in the architecture: a module can be opened by an event_in variable and made available to the users through the Human-Machine Interface, and when the output files are saved, an event_out variable is sent to the Supervisory Module and the next step is activated.
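A minimal Python sketch of this event-driven wrapping is given below. The class layout, the event dictionaries and the external command are illustrative assumptions; IEC 61499 itself defines function blocks declaratively rather than through such an API.

```python
# Sketch of a function-block-style wrapper: the module runs only when an
# event_in arrives, and emits an event_out once its output is saved.
# The tool command and file names are hypothetical placeholders.
import queue
import subprocess

class WrapperModule:
    def __init__(self, name, tool_command):
        self.name = name
        self.tool_command = tool_command   # external application to encapsulate
        self.event_in = queue.Queue()      # events from the Supervisory Module
        self.event_out = queue.Queue()     # events back to the Supervisory Module

    def run(self):
        event = self.event_in.get()        # block until the module is triggered
        if event["type"] == "input_ready":
            # A pre-processor would extract/translate the data packet here.
            subprocess.run(self.tool_command + [event["input_file"]], check=True)
            # A post-processor would re-assemble the data packet here.
            self.event_out.put({"type": "output_saved", "module": self.name})

adapter = WrapperModule("IIMS STEP-NC Adapter", ["stepnc_adapter"])
adapter.event_in.put({"type": "input_ready", "input_file": "project.stpnc"})
# adapter.run()  # would launch the encapsulated tool, then emit event_out
```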

Figure 10. Software tool encapsulated by a function block

Inside each module, the software application is packaged by the Wrapper Agent, in which an agent core, a pre-processor and a post-processor are encapsulated. Based on the request from the user, different pre-/post-processors are selected and integrated for different purposes; the processors are pre-defined and pre-stored when the software application is enrolled in the platform. Furthermore, depending on its functionality, a software tool can itself be defined and utilized as a processor for another software tool, for example as a data translator or converter. In practice, the Broker Agent makes an optimized choice of processors based on the user's request. The agent core monitors and controls the activity of the whole module and reacts to the control statements issued by the Supervisory Module.

3.1.3 Application Module Database

DIMP contains two main databases, the Product/Project Database and the Module Database. While the STEP/STEP-NC files of products and manufacturing projects are stored in the Product/Project Database, the descriptions of the software modules are contained in the Module Database. To model the information of these software applications, Figure 11 shows an EXPRESS-G diagram of the Application Modules.

Figure 11. MODULE data model

The top level of the MODULE model has an its_module definition to describe modules at the functionality level. Note that one single module can have multiple specifications or software applications integrated across the platform; thus, both multi-functionality and reusability are guaranteed in DIMP. Entity module_definition describes the detailed module information through the attributes its_name, its_type, its_discription, its_event_flow and its_data_flow. Entity module_type defines the kind of service provided by the module, i.e. creating, modifying or deleting a data packet. Entities event and data_flow are defined with the same structure as in the entity service: the event-trigger information is stored and the input/output information is defined. With this shared definition, communications between the Supervisory Module and the databases are streamlined, and the Broker Agent can access the Module Database directly, without the help of interfaces, to make an optimal selection with the information gathered there. Thanks to the Product/Project Database, all the data files utilized by the Application Modules are stored in one place, which means the input and output history of each module is traceable and the development record of a product/project is kept in the database. As well as the native files, the input/output data are also saved in the STEP and STEP-NC neutral data formats for archiving purposes.

3.1.4 Case Study I

In this case study, a neutral STEP-NC file is used as the input and a G-code file is produced as the output. Two software applications are nominated and integrated: the IIMS STEP-NC Adapter and the IIMS STEP-NC Converter (Wang et al., 2007). The former maps local manufacturing resource information to the STEP-NC data and generates a native STEP-NC file; the latter translates the native STEP-NC file into a corresponding G-code program as output. Table 2 shows this workflow and the event-triggering information controlling the agents encapsulated in the function blocks.

Table 2. Task list of the case study

Task                                         Event_In Trigger                         Event_Out Trigger            Application Module
Start Work                                   Interface Message Displayed              File Loaded                  -
Generic STEP-NC file => Native STEP-NC file  Generic STEP-NC file loaded              Native STEP-NC file created  IIMS STEP-NC Adapter
Native STEP-NC file => G-code file           Native STEP-NC file loaded               G-code file created          IIMS STEP-NC Converter
Arrived at Destination                       G-code file present in Product Data Hub  Interface Message displayed  -

According to the service list, the project is launched (Figure 12). The appointed software applications are encapsulated in Application Modules and provided to the users as a series of services. The user can work on the project directly, the proper procedure having been arranged before it starts. At the Module Layer, the Supervisory Module, the Database Module and the software applications are integrated to provide a service combination, while from the data-flow point of view all the input and output files are saved in the database along with the neutral database files.

Figure 12. Running of DIMP case study I

As shown in this case study, the modular concept provides DIMP with a flexible and tailorable manufacturing environment. Software applications, convertors and translators can easily be integrated into this environment as new Application Modules. Meanwhile, control is exercised at the input/output level of each module, which guarantees the autonomy of each Application Module, and the development of these modules is relatively independent. Furthermore, this kind of architecture is also suitable for distributed and networked applications.

3.2 Collaborative Product Information Exchanging Mechanism

As mentioned above, exchanging the entire product data set between collaborators, as in a data-centric approach, may lead to intellectual property protection issues; when an enterprise works with a number of contractors and sub-contractors, the issue becomes even more acute. With the STEP/STEP-NC data models chosen as the neutral data formats, the drawbacks of data-centric solutions, namely the synchronization and confidentiality issues discussed above, need to be overcome. In this research, the concept of the data packet is therefore conceived. A data packet is a self-contained, mobile cluster of data. Data packets are handled by the DIMP Data Localization Mechanism; once processed, they may be reassembled into the original data source, which is defined as the Data Integration Mechanism in DIMP. In general, the goal is to develop an identification algorithm defining rules for logical connections amongst data subsets, so that the stand-alone file containing the extracted data packet can be "reassembled" back into, or "re-connected" with, the original data source after it has been operated on.

As one of the key concepts within DIMP, the data packet provides an enabling tool for collaborative manufacturing. Users are allowed to work with product components at different levels of abstraction using different representations, and heterogeneous engineering tools can be integrated to support collaborative work in a distributed environment. After the product is modified by one expert, the changes can be integrated with concurrent modifications made by other experts in the system. Once the integration mechanism has completed its task, a validation mechanism is used to validate the

updated data. This mechanism is responsible for detecting ill-defined and illogical parameters, redundant data and data conflicts. The realization of the data-packet concept rests on the fact that STEP/STEP-NC describes product information from an object-oriented perspective, which makes it possible to process a subset of the whole project at the task level. This provides an interoperable environment for users who choose to work on the data in a confined scope, which relieves the synchronization issue and resolves the confidentiality problems discussed above.

3.2.1 Data-Localization Mechanism

The data-packet concept is realized by the processors encapsulated in the Wrapper Agents (Figure 10). Since STEP/STEP-NC is utilized as the neutral data format, the pre-/post-processors are designed to extract data packets from a data source and reassemble them afterwards. Before the software application is launched, the pre-processor searches the database and locates the data packet whose top-level information is assigned in Entity object. After the data packet is located, the related information is extracted from the data source and translated into an input file for the targeted application. While the software is processing the data packet, the pre-processor terminates and the Wrapper Agent keeps monitoring the software activity, including the input and output triggering events. After the output file is saved on the hard drive, the Wrapper Agent detects the trigger event and the post-processor is launched.

To realize the data-packet concept, it is necessary to locate and process a data subset from a STEP physical file. However, because the instances of entities are defined in a text-based structure, it is difficult to program and process them directly. To overcome this problem and generate data packets efficiently, a meta-model is designed to "re-describe" the product information in the STEP file. Since the STEP/STEP-NC data models describe the information in a task-oriented way using entities and reference relationships, the logic within such a data file can be described as nodes and edges: the instances of entities stored in a STEP file are denoted as "nodes" and the relationships between them as "edges". As depicted in Table 3, one line representing a Workplan entity is used as an example. Before the data packet is generated, each character is scanned and mapped to a specific syntax describer; these describers are then used to fill in a meta-model consisting of nodes and edges.

Table 3. Syntax interpreting for STEP/STEP-NC: #2=WORKPLAN('MAIN WORKPLAN',(#4,#5,#6),$,#8,$);

No.  Word              Syntax Describer
1    #2                node start
2    =                 stack 1
3    WORKPLAN          stack 2
4    (                 stack 3
5    'MAIN WORKPLAN'   stack 4
6    (                 pointer_list start
7    #4,#5,#6          pointers to #4,#5,#6
8    )                 pointer_list end
9    $                 stack 5
10   #8                pointer to #8
11   $                 stack 6
12   )                 stack 7
13   ;                 node end
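The scan-and-map step of Table 3 can be sketched as a small tokenizer. The following Python fragment is a simplified illustration only: it handles the constructs in the example line rather than the full Part 21 grammar, and the record layout is an assumption.

```python
# Map one Part 21 line to a node record in the spirit of Table 3: the '#n'
# before '=' opens the node, '#m' tokens inside become pointers (edges), and
# every other token goes on the stack so the line can be regenerated.
import re

def scan_entity_line(line: str) -> dict:
    node_id, body = line.strip().rstrip(";").split("=", 1)
    node = {"id": node_id.strip(), "stack": [], "pointers": []}
    # Tokenise: quoted strings, entity references, words, or single punctuation.
    for token in re.findall(r"'[^']*'|#\d+|[A-Za-z_]\w*|[(),$]", body):
        if token.startswith("#"):
            node["pointers"].append(token)   # edge to another node
        else:
            node["stack"].append(token)      # ordinary character/word
    return node

node = scan_entity_line("#2=WORKPLAN('MAIN WORKPLAN',(#4,#5,#6),$,#8,$);")
print(node["id"], node["pointers"])   # -> #2 ['#4', '#5', '#6', '#8']
```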

When all the entities in the Part 21 file have been mapped to nodes, the relationships between the nodes are represented as edges. The Node data model is developed in EXPRESS-G to be compliant with the original data source (Figure 13). In addition to the node start and node end markers, there are three main data types: stack, pointer, and pointer_list. Type stack holds the ordinary characters of an entity, from which the Part 21 line can be regenerated in reverse. Type pointer is recognized through the character #, which is the keyword defining a pointer that connects two different nodes. Additionally, the Edge data model maintains all the logical information between nodes, such as type, status and direction. When a task is recalled by the user, the pre-processor locates the top level of the information, and the related information is then analysed and recorded in the Node and Edge data models; in this way, the connection between the data packet and the rest of the product data is always traceable and measurable. Accordingly, all the required information is found to generate a data packet before it is passed to the user via the pre-processor. Furthermore, an entity named "depth" is defined in the mechanism: the agent core evaluates the appropriate amount of data according to "how deep" the user wants to go from the top level of the data packet down. For instance, a stakeholder may access all the information related to a workplan, or just display the basic parameters of a feature.
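With the node/edge meta-model in place, extracting a data packet down to a given depth amounts to a bounded graph traversal. Below is a minimal sketch, assuming the node records produced by the tokenizer above; DIMP's actual data structures may differ.

```python
from collections import deque

def extract_packet(nodes: dict, root: str, depth: int) -> dict:
    """Collect the sub-graph reachable from `root` within `depth` pointer hops.
    `nodes` maps ids such as '#2' to records with a 'pointers' list
    (see scan_entity_line above)."""
    packet, frontier = {}, deque([(root, 0)])
    while frontier:
        node_id, level = frontier.popleft()
        if node_id in packet or level > depth or node_id not in nodes:
            continue
        packet[node_id] = nodes[node_id]
        for target in nodes[node_id]["pointers"]:
            frontier.append((target, level + 1))
    return packet

# Depth 0 returns only the workplan node itself; larger depths pull in the
# workingsteps, operations and features that it references.
```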

Figure 13. Edge and Node to interpret STEP/STEP-NC

3.2.2 Data-Integration Mechanism

After the user finishes with the software application, the agent core detects the output results and the post-processor is initialized, after which the data packets are re-assembled into the data source. Since STEP/STEP-NC stores the product data in an object-oriented way, it helps maintain the integrity of the data and reconnect the broken links between the data packet and the original data source. To maintain consistent data semantics and syntax, a validation mechanism is in place. As part of the post-processor, the validation mechanism detects unreasonable parameters, such as excessively large or small dimensions for a manufacturing feature, and ill-defined manufacturing information (e.g. a negative drill diameter). Furthermore, the harmonization between the data packet and the data source is validated too: for instance, if the depth of a pocket is changed in the data packet while the machining depth remains unchanged in the data source, the validation system detects the conflict and sends a warning message to the user. Thanks to the object-oriented nature of STEP/STEP-NC, it is possible to develop the validation mechanism on top of the data-packet concept.
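These validation rules reduce to simple predicate checks over the re-assembled data. The sketch below illustrates the two checks just described; the dictionaries and parameter keys are hypothetical names chosen for illustration, whereas real checks would walk the re-assembled STEP-NC entities.

```python
def validate(feature: dict, operation: dict) -> list:
    """Illustrative checks in the spirit of the DIMP validation mechanism:
    flag ill-defined parameters and packet/source conflicts."""
    warnings = []
    if feature.get("diameter", 1.0) <= 0:
        warnings.append("ill-defined data: feature diameter must be positive")
    if feature.get("depth") != operation.get("machining_depth"):
        warnings.append("conflict: feature depth differs from machining depth")
    return warnings

# A pocket deepened in the data packet while the operation was left untouched:
print(validate({"diameter": 22.0, "depth": 30.0}, {"machining_depth": 25.0}))
```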

At the completion of the validation process, the post-processor shuts down and the agent core sends a service-end message to the Supervisory Module before a new service is launched. To store both the project/product information and the state of the Application Modules, the Module Database is situated in the database domain along with the Project/Product Database.

3.2.3 Case Study II

This case study demonstrates the process of extracting and re-assembling a data packet. The product information is stored in the STEP-NC file of ISO 14649-11 example 2 (Figure 14) (ISO, 2004).

Figure 14. Component for Case Study II (ISO, 2004)

The first section of the part program is the header section marked by the keyword “HEADER” (Figure 15). In this section, some general information and comments concerning the project are given, e.g. filename, author, date and organization. The second and main section of the program file is the data section marked by the keyword “DATA”. This section contains all the information about manufacturing tasks and geometries.

Figure 15. STEP-NC data structure

Suppose a design engineer with the needed authority wants to change the diameter of a hole named HOLE2_FLAT_BOTTOM (shown in line #108 in Figure 15) from 22 mm to 25 mm. According to the service list based on the customer's request, this piece of feature information is extracted from the original database file. The pre-processor first identifies the data concerned and tags the reference relationships between this data (i.e. the hole feature) and the rest of the file. The unconcerned reference relationships, e.g. to the workpiece and operations, are cut and tagged. The entity that defines the HOLE2 feature is tagged, and all the tagged information is saved in a temporary file, i.e. the data packet; the original data file remains intact. Figure 16 shows the tagging process and the data packet formed. With the assistance of the Application Module, the diameter is changed from 22 mm to 25 mm. The post-processor then reads in both the original product data file and the revised data packet. The re-assembled data file is shown in Figure 17.

Figure 16. Original database file

Figure 17. Modified database file

4 Conclusions

The contemporary manufacturing business is heterogeneous in data representation and increasingly decentralised in infrastructure. A tight data flow between different organizations is required, and a better way to share and exchange product data is needed by the participants. DIMP is designed to meet these needs.

This paper reviews the latest data integration research and commercial solutions. The data-centric approach is idealistic but unrealistic, while the process-centric approach is rational but heavy-handed. Thus, the authors have adopted a combined approach to achieve an efficient and interoperable integration in the design-to-manufacturing chain. The proposed Distributed Interoperable Manufacturing Platform consists of a Supervisory Module, the Module and Product/Project Databases, and an Application Module Warehouse. It provides a coherent collaboration environment across the CAD/CAM/CNC chain: software solutions are encapsulated as Application Modules that are integrated into a virtual service station for users. In this DIMP solution, STEP and STEP-NC are used as the neutral data formats enabling data portability, while a multi-level product data exchange mechanism realized through the data-packet technology enables an efficient and advanced manufacturing environment.

Acknowledgement

The authors would like to thank Dr. Matthieu Rauch at the Institut de Recherche en Communications et Cybernétique de Nantes (IRCCyN), École Centrale de Nantes, France, for his contribution to this research.

References

Adobe Systems Incorporated, 1999. PostScript® Language Reference. Addison-Wesley Publishing Company.
Anonymous, 2000. Healing the wounds of data conversion. CAD User AEC Magazine, 13(3).
Autodesk, 2010. AutoCAD 2011 - DXF Reference. San Rafael, USA.
Brecher, C., Lohse, W. and Vitr, M., 2009. Module-based platform for seamless interoperable CAD-CAM-CNC planning. In: X.W. Xu and A.Y.C. Nee (Editors), Advanced Design and Manufacturing Based on STEP. Springer Series in Advanced Manufacturing. Springer, London, pp. 439-462.
Brecher, C., Vitr, M. and Wolf, J., 2006. Closed-loop CAPP/CAM/CNC process chain based on STEP and STEP-NC inspection tasks. International Journal of Computer Integrated Manufacturing, 19(6): 570-580.
CAXA. CAD/CAM/PLM Provider - CAXA.
Choi, S.S., Herter, J., Bruening, A. and Noh, S.D., 2009. MEMPHIS: New framework for realistic virtual engineering. Concurrent Engineering Research and Applications, 17(1): 21-33.
Elysiuminc. Products Overview.
Feng, S.C., Stouffer, K.A. and Jurrens, K.K., 2005. Manufacturing planning and predictive process model integration using software agents. Advanced Engineering Informatics, 19(2): 135-142.
Huhns, M.N. and Singh, M.P., 2005. Service-oriented computing: Key concepts and principles. IEEE Internet Computing, 9(1): 75-81.
IEC, 2005. IEC 61499. Function blocks for industrial-process measurement and control systems - Part 1: Architecture. International Electrotechnical Commission, Geneva, Switzerland.
IronCAD. 3D Design Software for Mechanical Design and Product Design.
ISO, 1982. ISO 6983-1. Numerical control of machines - Program format and definition of address words - Part 1: Data format for positioning, line motion and contouring control systems. International Organization for Standardization, Geneva, Switzerland.
ISO, 1994. ISO 10303-1. Industrial automation systems and integration - Product data representation and exchange - Part 1: Overview and fundamental principles. International Organization for Standardization, Geneva, Switzerland.
ISO, 1998. ISO 10303-22. Industrial automation systems and integration - Product data representation and exchange - Part 22: Implementation methods: Standard data access interface. International Organization for Standardization, Geneva, Switzerland.
ISO, 2000. ISO 10303-27. Industrial automation systems and integration - Product data representation and exchange - Part 27: Implementation methods: Java™ programming language binding to the standard data access interface with Internet/Intranet extensions. International Organization for Standardization, Geneva, Switzerland.
ISO, 2001. ISO 10303-24. Industrial automation systems and integration - Product data representation and exchange - Part 24: Implementation methods: C language binding of standard data access interface. International Organization for Standardization, Geneva, Switzerland.
ISO, 2003a. ISO 10303-23. Industrial automation systems and integration - Product data representation and exchange - Part 23: Implementation methods: C++ language binding to the standard data access interface. International Organization for Standardization, Geneva, Switzerland.
ISO, 2003b. ISO 10303-214. Industrial automation systems and integration - Product data representation and exchange - Part 214: Application protocol: Core data for automotive mechanical design processes. International Organization for Standardization, Geneva, Switzerland.
ISO, 2003c. ISO 14649-1. Industrial automation systems and integration - Physical device control - Data model for computerized numerical controllers - Part 1: Overview and fundamental principles. International Organization for Standardization, Geneva, Switzerland.
ISO, 2004. ISO 14649-11. Industrial automation systems and integration - Physical device control - Data model for computerized numerical controllers - Part 11: Process data for milling. International Organization for Standardization, Geneva, Switzerland.
ISO, 2005. ISO/TS 10303-203. Industrial automation systems and integration - Product data representation and exchange - Part 203: Application protocol: Configuration controlled 3D design of mechanical parts and assemblies (modular version). International Organization for Standardization, Geneva, Switzerland.
ISO, 2007. ISO 10303-238. Industrial automation systems and integration - Product data representation and exchange - Part 238: Application protocol: Application interpreted model for computerized numerical controllers. International Organization for Standardization, Geneva, Switzerland.
Konstruktion, N.N.W., 2006. Ungenutztes Potential im Engineering: Status, Trends und Herausforderungen bei CAD und PDM. Autodesk, München.
Lee, B.-E. and Suh, S.-H., 2009. An architecture for ubiquitous product life cycle support system and its extension to machine tools with product data model. International Journal of Advanced Manufacturing Technology, 42: 606-620.
Mokhtar, A. and Houshmand, M., 2010. Introducing a roadmap to implement the universal manufacturing platform using axiomatic design theory. International Journal of Manufacturing Research, 5(2): 252-269.
Monostori, L., Váncza, J. and Kumara, S.R.T., 2006. Agent-based systems for manufacturing. Annals of the CIRP, 55(2): 697-720.
Nassehi, A., Newman, S.T., Xu, X.W. and Rosso Jr, R.S.U., 2008. Toward interoperable CNC manufacturing. International Journal of Computer Integrated Manufacturing, 21(2): 222-230.
Newman, S.T. and Nassehi, A., 2007. Universal manufacturing platform for CNC machining. Annals of the CIRP, 56(1): 459.
Oracle. Oracle Technology Network for Java Developers.
Proficiency®. Feature Based CAD Translation.
Siemens. Teamcenter PLM - Product Lifecycle Management, Siemens PLM Software.
Song, I.-H., Yang, J., Jo, H. and Choi, I., 2009. Development of a lightweight CAE middleware for CAE data exchange. International Journal of Computer Integrated Manufacturing, 22(9): 823-835.
Ssemakula, M.E. and Satsangi, A., 1990. Application of PDES to CAD/CAPP integration. Computers & Industrial Engineering, 18(4): 435-444.
Suh, S.H., Shin, S.J., Yoon, J.S. and Um, J.M., 2008. UbiDM: A new paradigm for product design and manufacturing via ubiquitous computing technology. International Journal of Computer Integrated Manufacturing, 21(5): 540-549.
Theorem®. Technology Overview: Theorem Solutions.
UnifiedCAD. Unified CAD Solutions.
van de Velde, P.J.M.C., 2009. Runtime configurable systems for computational fluid dynamics simulations. Graduation Thesis, University of Auckland, Auckland, 243 pp.
Wang, H., Xu, X. and Tedford, J., 2007. An adaptable CNC system based on STEP-NC and function blocks. International Journal of Production Research, 45(17): 3809-3829.
Wang, X.V., Xu, X. and Hämmerle, E., 2010. Distributed Interoperable Manufacturing Platform based on STEP-NC. In: The 20th International Flexible Automation and Intelligent Manufacturing Conference (FAIM 2010), California State University, California, USA, pp. 153-160.
Y2K, 1999. Digital Representation for Communication of Product Data: IGES Application Subsets and IGES Application Protocols. Interpharm Press, Wheeling, USA.
Zhang, Y., Zhang, C. and Wang, H.P., 2000. An Internet based STEP data exchange framework for virtual enterprises. Computers in Industry, 41(1): 51-63.
