On a Data-Driven Environment for Multiphysics Applications

J. Michopoulos a,∗,†, P. Tsompanopoulou b, E. Houstis c,b, C. Farhat d, M. Lesoinne d, J. Rice c, A. Joshi e

a U.S. Naval Research Laboratory, Special Projects Group, Code 6303, Washington, DC 20375, U.S.A.
b University of Thessaly, Dept. of Comp. Eng. and Telecommunications, 38221 Volos, Greece
c Purdue University, Computer Sciences Department, W. Lafayette, IN 47906, U.S.A.
d University of Colorado at Boulder, Dept. of Aerospace Eng. Sciences, Boulder, CO 80309-0429, U.S.A.
e University of Maryland Baltimore County, Dept. of Comp. Sci. and Electr. Eng., Baltimore, MD 21250, U.S.A.

∗ Corresponding author.
† E-mail address: [email protected]

The combination of the recent advances in computational and distributed sensor network technologies provides a unique opportunity for focused efforts on low-uncertainty modelling and simulation of multiphysics systems. Responding to this opportunity, we present in this paper the architecture of a data-driven environment for multiphysics applications (DDEMA) as a multidisciplinary problem solving environment (MPSE). The goal of this environment is to support the automated identification and efficient prediction of the behavioral response of multiphysics continuous interacting systems. The design takes into consideration heterogeneous and distributed information technologies, coupled multiphysics sciences, and sensor-supported data-driveness to steer adaptive modelling and simulation of the underlying systemic behavior. The design objectives and proposed software architecture are described in the context of two multidisciplinary applications related to material-structure design of supersonic platforms and fire/material/environment interaction monitoring, assessment and management. These applications of DDEMA will be distributed over highly heterogeneous networks that extend from light and ubiquitous resources (thin portable devices/clients) to heavy GRID-based computational infrastructure.
1. Introduction

Continuous multi-domain interacting systems under the influence of coupled multi-field loading can exhibit static and dynamic behavior that is highly variable in space and time, spans multiple scales, and sometimes involves intense nonlinear interaction and response. This is especially true for material/structural systems embedded in host continua, as in the case of structures interacting with surrounding fluids under generalized loading conditions. The development of the science, engineering and technology that allows predicting the behavior of such interacting systems is of utmost importance when it comes to their applications in contexts driven by human needs. An example would be the case of designing materials and structures for the civil and defense infrastructure from the perspectives of life extension and maintainability. Candidate technologies for this application would be smart materials and systems that can self-adjust when a scheduled or unexpected restriction of their state space occurs, like structural composites that can change their mechanical
properties when exposed to sudden temperature gradients due to explosion or mechanical damage. Furthermore, specific vertical industries must comply with consumer-driven quality standards and/or demands that introduce implicit requirements for reducing the uncertainty of simulated systemic behavior. This has increased the demand for validation and verification of the predictions rendered by the computational environments used for this purpose. A focal motivating point of our efforts is that the automated utilization of experimental and monitoring data makes it possible to pursue modelling and simulation activities that intrinsically contain validation of the simulated behavior.

In response to these needs, the main objective of our project is the creation of a framework for developing and applying specific multidisciplinary problem solving environment (MPSE) technologies to applications focused on the multi-field behavior of interacting continuum systems. Accurate behavior prediction of fully identified systems will be achieved as a consequence of applying inverse approaches along with massive automated data acquisition during tests of continuous systems; these tests correspond to exhaustive scanning of the multidimensional stimulus-response spaces of the systems at hand. Such a data-driven framework will allow the experimental capturing of the multiphysics behavioral characteristics of the continuous system, thus intrinsically ensuring the validity of simulations based on massive and automated testing.

The present paper describes the current status of the architecture of DDEMA as it has evolved from its initially described state [1,2]. The structure of the present paper is organized as follows. In section 2, we present the motivating effects that data-driven methodologies can have on the qualification, validation and verification (QV&V) of the modelling and simulation (M&S) process. Additionally, in this section we characterize data-driven modelling methodologies in terms of their attributes and present an overview of our MPSE efforts, along with a description of the system specifications and requirements of
the proposed DDEMA environment, followed by a discussion of the existing infrastructure to be incorporated in the implementation of DDEMA. Section 3 presents the proposed architectures as the means to satisfy the requirements for the two time modalities of prior-time and real-time contexts. Section 4 provides some detail on the middleware implementation involved in the architecture. Section 5 contains a description of two applications (one for each time modality) of DDEMA that will be utilized to validate its feasibility. Finally, the conclusions are presented in section 6.

2. MPSE Overview and DDEMA Specifications

2.1. Qualification, Validation and Verification of Data-Driven Modelling and Simulation

Data-driven modelling and simulation (M&S) can have an entirely different nature when compared to traditional M&S practices from the perspective of uncertainty. Extensive utilization of data, as promoted by this and other similar efforts, has important motivating characteristics. In order to address the issues of simulation fidelity, accuracy, and in general high-confidence predictions of physical system behavior, contemporary M&S practices are subject to the necessary model qualification, verification and validation (QV&V) activities. There are many descriptions of how QV&V relates to modelling and simulation. Figure 1 depicts the modelling and simulation process along with the associated QV&V activities, and represents a unification of the abstractions defined by organizations such as the AIAA [3], the ASME [4], DoD's Defense Modelling and Simulation Office (DMSO) [5] and the DOE Defense Programs (DOE/DP) Accelerated Strategic Computing Initiative (ASCI) [6]. The dotted arrows represent human activities that are implemented with various degrees of computational automation and allow transitioning from the physical system to the system's conceptual model through analysis, to the computational system model through programming, and back to the physical system through simulation.

The conceptual model representation is constructed through analysis of the observable behavioral structure of the physical system within the application context. The derived conceptual model can be a set of partial differential equations (PDEs) combined with the necessary constitutive equations. For many investigators this equational representation of the model corresponds to what is known as an analytical model, and it encapsulates the conceptual model that can reproduce the behavior of the physical system. This model introduces the uncertainty or error associated with the differences between the implied idealization of the analytical/mathematical representation of the conceptual model and the actual physical system. The computational model is encapsulated by the software that implements the conceptual model on the computational infrastructure. It is constructed by a variety of programming techniques with various degrees of computational automation, ranging from manual code development [7,8], to automated code synthesis that exploits the ability of some of the commercial tools (e.g. Mathematica) used in the conceptual model building process to automatically generate code or code stubs and extends to custom approaches [9,10], and to problem solving environment (PSE) based compositional approaches [11,12]. The computational model introduces additional uncertainty, associated with the space and time discretization errors introduced by converting the differential operators within the involved PDEs to systems of algebraic equations, and with the corresponding algorithms for carrying out the solution schemes of these systems.

The formal definitions for QV&V used in the diagram of Fig. 1 are as follows [3]:

• Qualification is the process of determining that a conceptual model implementation represents correctly a real physical system.

• Verification is the process of determining that a computational model implementation represents correctly a conceptual model of the physical system.

• Validation is the process of determining the degree to which a computer model is an accurate representation of the real physical system from the perspective of the intended uses of the model.

Figure 1. Traditional QV&V for the M&S process.

Both validation and qualification attempt to answer the question of the representational fidelity of the conceptual (qualification) and computational (validation) models relative to the real physical system. For this reason, qualification can be thought of as another type of validation. This consideration explains why most of the literature has narrowed the discussion to only the verification and validation (V&V) notions. Verification attempts to answer the question of closeness (error, accuracy) between two models (conceptual and computational). Figure 1 depicts the QV&V concepts in terms of comparisons among the behavioral data corresponding to the systemic behaviors of the physical system (through experimentation) and the conceptual and computational systems (through simulation prediction). The most critical issues associated with this data-driven behavioral comparison among the various systemic representations are those of complexity and weak termination, as described below.

Complexity: The comparison is usually performed on data instantiating behavioral field state variables defined over vector spaces spanned
by the input-output vectors (e.g. load-displacement, stress-strain, etc.) of the modelled system, which often correspond to vector fields defined over the volumes, areas or lines associated with the geometrical models of the involved continua. In multi-field, multi-continua applications (coupled and/or uncoupled) the dimensionality of these spaces can be high, especially when electromagnetic fields are included along with those of classical continuum thermomechanics, thus leading to increased problem complexity.

Weak Termination: Unfortunately, the recursive process of minimizing the differences between model-predicted datasets and physical system behavior datasets is not guaranteed to terminate for a user-defined accuracy (as a measure of fidelity of simulation) in finite time, nor is it guaranteed that there is a unique model producing a predicted behavior dataset that converges to the physical system behavior dataset with a desired speed of convergence. This situation forces users, and domain experts in general, to employ “engineering approximation” attitudes of accepting models within “acceptable bounds” for specific applications, which consequently leads to a plethora of low-confidence specialized models and solutions that vary according to the personal modelling assumptions of the respective author/expert.

The complexity issue has been one of the main motivators for the successful development of various MPSE frameworks and corresponding applications. The pluralism of implementation methodologies and domain-of-application sensitivities has led to a long list of such systems that end up being used mostly by those who developed them. This is a secondary yet critical issue associated with PSE technology. To address the weak termination issue, as well as to move towards an approach that guarantees high-confidence reality approximation with high fidelity in a built-in, embedded-in-the-model manner, we are not following the approach of finding a solution that alleviates the issue; rather, we establish the conditions that give rise to it and subsequently remove them, so that the issue ceases to exist.
The pursued alternative is to implement an M&S methodology that utilizes the following three elements: (1) data-streams of controlled continuous system behavior (stimulus-response pairs of all observables, regardless of their field or non-field nature); (2) analytical representations of the model that can accurately reproduce a subset of the acquired data via system identification, through successive dynamic model adaptation with the help of optimization techniques, thus intrinsically encoding the validity of the derived models; and (3) the derived models, to simulate the predictive response of the originally modelled system or of any other system that shares the identified continuous behavior with the original one.

Figure 2 shows the proposed data-driven M&S process as a modification of the original Fig. 1. The modules inside the dashed-line area constitute an alternative conceptual system model that is embodied by the optimization scheme described. This approach effectively moves the physical data of the experimentally observed behavior of the actual system inside the conceptual model of the system, thus guaranteeing the intrinsic validity of the conceptual system. For continuous systems, this model usually refers to the set of algebraic and partial differential or integral equations representing the constitutive behavior along with the conservation laws involved. Thus, this methodology eliminates the weak termination problem. Termination is strong and embedded, since the adaptively computed model is trained on the behavioral data and has to terminate as soon as it satisfies the optimization criteria, including the acceptable error tolerance between actual and simulated data; this occurs prior to using the model for prediction of behavior that has not been used in the training phase.
Figure 2. Evolved QV&V for data-driven M&S process.
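As a minimal illustrative sketch of the adaptive identification loop just described (the Model interface, all names, and the least-squares discrepancy measure are our own assumptions, not DDEMA code), the optimization-driven training with embedded termination could take the following form in Java:

public final class ModelIdentifier {

    /** A parametric conceptual model; names and signatures are illustrative. */
    public interface Model {
        double[] respond(double[] stimulus);                  // predicted response
        void adapt(double[][] stimuli, double[][] residuals); // one optimizer step
    }

    /**
     * Trains the model on measured stimulus-response pairs until the discrepancy
     * falls below tol, so that validation is embedded in model formation.
     */
    public static Model identify(Model m, double[][] stimuli, double[][] measured,
                                 double tol, int maxIter) {
        for (int it = 0; it < maxIter; it++) {
            double[][] residuals = new double[stimuli.length][];
            double err = 0.0;
            for (int k = 0; k < stimuli.length; k++) {
                double[] predicted = m.respond(stimuli[k]);
                residuals[k] = new double[predicted.length];
                for (int j = 0; j < predicted.length; j++) {
                    residuals[k][j] = predicted[j] - measured[k][j];
                    err += residuals[k][j] * residuals[k][j];
                }
            }
            if (Math.sqrt(err) < tol) return m;   // strong, embedded termination
            m.adapt(stimuli, residuals);          // e.g. a gradient or least-squares step
        }
        throw new IllegalStateException("tolerance not met within iteration budget");
    }
}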
2.2. Attribute Space of Data-Driven Environments

The main feature of DDEMA that differentiates it from existing MPSE efforts will be its utilization of data and its data-driven attributes in general. The attribute space of a data-driven information technology application/environment such as DDEMA can be viewed as a four-dimensional space spanned by four bases: (1) time modality: time of utilization vs. time of acquisition of data (static vs. dynamic, or prior-time vs. real-time); (2) place of data utilization: onboard (computational operations and decision support environment (CODSE)) vs. outboard (computational design environment (CDE)); (3) implementation topology: localized vs. distributed; and (4) building-block implementation economy: custom vs. commercial off-the-shelf (COTS) technologies.

Figure 3 shows a two-dimensional projection of this space, spanned by the basis attributes of time modality and place of data utilization.

Figure 3. Special cases of data-driven environments defined in the “place of data utilization” vs. “time modality” context space.

The two values along each basis define four particular cases of a data-driven environment. They are described counterclockwise, starting from the outboard-static case. When pre-existing data have been captured and are used for constructing a (mathematical and computational) model of the physical system that has been systematically observed, while the computational environment that utilizes them is located outside the modelled physical system (outboard), then our data-driven environment can be utilized as a computational design environment (CDE). When the environment just described is actually located within (onboard) the physical system
it models, then our data-driven environment can be utilized as a computational operation and decision support environment (CODSE) to aid the crew within the system (e.g. submarine, destroyer, aircraft carrier). To the extent that we allow the previous CODSE to be driven by live real-time data originating from sensor networks, it becomes a dynamic CODSE (D-CODSE). Finally, if we take some or all of its implementation modules and distribute them to remote locations, so that they are no longer onboard, then our data-driven environment becomes a remotely assisted dynamic CODSE (RAD-CODSE). Special emphasis will be given to the transition from a prior-time (static) to a real-time (dynamic) environment. All other cases are expected to be almost trivially producible from these two.

2.3. Multiphysics Problem Solving Environments Background

Progress towards the solution of coupled multiphysics and multi-continua interaction, from both the constitutive and the field equation perspectives, has been scarce. There is significant progress, however, when focus has been given to computational implementations for a small
number (mostly three) of interacting fields, and a number of active projects exist. Among them one can include our development of the GasTurbnLab [13,14] multidisciplinary problem solving environment (MPSE), which addresses the interactions between a stator, rotor, and combustor in a gas turbine engine, and which has been implemented on a mobile agent platform utilizing a collaborating partial differential equations (PDEs) methodology [15,16]. In the area of fluid-structure-control interaction, we can mention the advanced AERO computational platform. This platform comprises the AERO-F, AERO-S, AERO-C and MATCHER suite of codes [7,8], developed at the University of Colorado for the solution of nonlinear transient aeroelastic problems. They are portable, and run on a large variety of computing platforms ranging from Unix workstations to shared as well as distributed memory massively parallel computers. A generalization of AERO to an aerothermoelastic implementation was recently initiated [17]. The case of aerostructural coupling with an energy-dissipative nonlinear material response under a virtual wind tunnel environment has also been initiated recently [18,19]. The DDEMA system will provide high-confidence prediction capability through its data-driven system modelling capability.

2.4. Requirements

The DDEMA system will use network-based distributed computing infrastructures and data, and will provide a web-accessible user environment. The envisioned computing infrastructure will have to respond to the following requirements: (1) Computation should be partitioned into coarse-grain and fine-grain components, executed in a loosely coupled and distributed manner for the lightweight implementation, and in a tightly coupled fashion for the heavyweight implementation, perhaps by exploiting web services technologies. (2) The system should be able to distribute the various components dynamically in order to adapt to resource availability and performance variations and/or demands. (3) Automatic adaptive modelling should be
available, and the system should minimize the difference between stored behavioral datasets and predicted behavioral datasets in a design optimization fashion, where the objective function and the potential constraints are user or system definable. (4) To enable optimal user customization and history maintenance of user-system interaction, a visual problem definition editor (VPDE) should be developed, and its user-invoked states should be archivable and unarchivable upon request. (5) To enable collective and dynamic knowledge transfer among distributed points of entry, and to achieve auto-enhancement of simulation due to historical collective user experience, abstractions of successful simulations and/or problem solutions, as well as their least upper bound (as a representation of the non-conflicting historically asserted common knowledge), should be maintainable for all users with appropriate authorization/authentication. (6) The system should be able to deal with machine and programming language heterogeneity while preserving platform-independent component communication, by leveraging efficient and ubiquitous data exchange formats and methodologies.

2.5. Existing Technologies Integration

The work under development borrows design concepts from, or builds on top of: (1) existing agent systems [20,21]; (2) communication tools and protocols that can provide interoperability between different platforms (e.g. Java-RMI-IIOP [22]); (3) symbolic algebra systems and subsystems for automating the field equation generation process (e.g. MathTensor [23]); (4) flexible APIs for visual editing of user-controlled problem definition as well as 2D and 3D visualization (e.g. PTOLEMY-II [24]); (5) high performance Java implementations for parallel computation over distributed and shared memory architectures focused on intra-paradigm compatibility (e.g. ProactivePDC [25]) or message passing efficiency (e.g. Hyperion [26]); (6) resources allowing GRID integration (e.g. web service XML-based metadata integration [27] and GSiB [28,29]); and finally (7) heterogeneous device (thin and
heavyweight) implementations to support a ubiquitous computing infrastructure [30]. Many of these technologies have already been proven appropriate for rapid prototyping involving traditional multiphysics and multi-scale models developed by different disciplines and organizations. What is crucially important for our project is the clean, simple and flexible programming model of mobile or static component systems, which is expected to facilitate the development of the overall envisioned system. Besides making necessary adjustments/extensions to such systems, our IT development work will also focus on transforming proven but largely monolithic legacy codes into a set of well integrated, interoperable components (via wrappers based on the Java Native Interface (JNI) [31]) that can be distributed over the network, as sketched below. Very recently we have established that the web-services-based approaches followed by a few groups [27-29] share a common perspective with our team regarding the utilization of computational technologies and the philosophy of integration, especially the GSiB project [28], which has created a crucially important resource for automatic JNI wrapping of legacy codes, JACAW [32]. From the data-driven perspective, we have also recently established that other groups [33,34] are following a similar philosophy and approach.
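As a rough sketch of this JNI-based wrapping under our own naming assumptions (the ddemaSolver library and its entry points are hypothetical; JACAW [32] can generate such declarations automatically from legacy headers), a wrapped solver component might look as follows:

public final class LegacySolverWrapper {
    static {
        System.loadLibrary("ddemaSolver"); // hypothetical native library name
    }
    // Each native declaration maps onto an entry point of the legacy code.
    public native void initialize(String inputDeck);
    public native void solveStep(double dt);
    public native double[] fieldState(String fieldName);
}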
3. Design Specifications of DDEMA for Two Data-Driveness Application Scenarios

The application domain for the validation of DDEMA's utility will be (a) composite material structures in their linear and nonlinear behavioral regimes when they interact with surrounding fluids under multiphysics loading conditions, primarily for the material/structural design of a supersonic platform, and (b) a fire-material-environment interaction monitoring, assessment and management system. The two application scenarios considered can be characterized by the prior-time and real-time modalities identified in the discussion of Fig. 3 in the overview section. In the following, we present the user and system architectural design objectives of the DDEMA framework for the two identified application scenarios.

3.1. Real-Time System Design Objectives

In this system configuration, we assume that the on-board user has many operational objectives, some of which are:

• Monitor the health state of the area of interest (e.g. entire platform, substructure, component) by observing continuous field distributions guided by incoming sensor datastreams originating from a distributed sensor network that can be monitored from multiple points of entry.

• Simulate potential future states (“what-if” scenarios) of the system given certain hypothetical battle or situational escalation scenarios. This requires automatic conversion of actual operating conditions to corresponding boundary/initial conditions, and exploitation of pre-computed field states to quickly synthesize potential field evolutions for “what-if” scenarios.

• Based on input from the previous two items, the on-board users will be able to make decisions and implement them by activating actuation mechanisms. This requires a dynamic analysis of simulated scenarios based on the controllability of the systems, an impact analysis via a system that implements heuristic knowledge management techniques, and an archival capability for future re-use and for recording evaluative experiences of implemented decisions.

There are two general approaches that can be utilized in the real-time modality context, depending on computational resource availability and on modelling complexity and size. One is to use real-time data to dynamically steer simulation based on real-time solution of the corresponding PDEs. The other is to use real-time data streams to select appropriate proportions of precomputed basis solutions, provided the stimulus space of the system at hand is parameterizable and decomposable into basis cases, and that linear superposition
can be exploited, due to the fact that the inherent stimulus-response state variables are linearly related or the system can be linearized. Though we plan to explore a solution for the first case, here we present a high-level schematic view of the so-called real-time system architecture for the second case of solution synthesis in Fig. 4, where the coarse-grain functionality modules are shown from a data association perspective only. This diagram should be implementable within the VPDE.

Figure 4. A real-time system architecture.

A short description of each of the modules presented is as follows.

Solution Generator: This module encompasses the functionality required for preparing the data for the codes performing fluid-structural analysis. It is responsible for converting mission-specific events to finite element specifications and data.

Multiphysics/Multidomain Solvers: These modules accept fluid properties, flight conditions, and structural properties, and can compute any fluid- or structure-related field results. They are the solvers of the multi-field and multi-domain PDEs involved in the modelling of the corresponding physical systems. The aero-structural problem will utilize the AERO suite of codes, while an additional suite for reactive flow has to be incorporated in order to address the fire-related problems.

Precomputed Basis Case Solutions: This module is a database that stores all basis-case field solutions that the user generates through the corresponding analysis module for the domain(s) involved.

Stimulus Specification Controller (Middleware 2): This module establishes the stimulus vector space and its bases, and produces parametric representations of desired loading paths applied onto the structure via explicit loading conditions, or implicit ones via loading event parameterization. It also establishes the material and geometry specification for the system at hand as a function of both user decisions and sensor data.

Labor Distributor (Middleware 1): This module distributes labor among the various solvers in a manner consistent with the restrictions imposed by the rest of the modules that are
connected to it, and provides its output to the solution composer module.

Solution Specification Composer: This module combines all the selections from the controllers for loading, material and geometry path definitions, and synthesizes the desired combined solution from the basis-case solutions database.

Simulation Composer: This module combines the output specification from the solution composer module with the precomputed fields stored in the precomputed basis-case solutions database module to synthesize the solution consistent with the user/sensor data specification.

Display Module: The user interaction and visualization module.

The focal application domain of this real-time approach will mainly be the fire assessment, evaluation and control scenario depicted in Fig. 5. This scenario involves utilizing sensor network data, along with a user-specified query processed by the routing software in a base station environment, that is capable of requesting a “GRID”-implemented solution of the corresponding PDE solvers to formulate a “microfuture” prediction of a “what-if” specification (i.e. what is going to happen if the fire personnel opens a door).
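A minimal sketch of the solution synthesis performed by the composer modules above, under the linear superposition premise stated earlier (class and method names are our own illustrative assumptions, not DDEMA's):

import java.util.List;

public final class SolutionComposer {
    /**
     * Synthesizes a combined field solution u = sum_i alpha_i * u_i from
     * precomputed basis-case solutions u_i, where the coefficients alpha_i
     * are derived from sensor data or user-specified loading parameters.
     */
    public static double[] compose(List<double[]> basisSolutions, double[] alpha) {
        if (basisSolutions.size() != alpha.length)
            throw new IllegalArgumentException("one coefficient per basis case");
        int n = basisSolutions.get(0).length;       // nodal values of one field
        double[] u = new double[n];
        for (int i = 0; i < alpha.length; i++) {
            double[] ui = basisSolutions.get(i);
            for (int j = 0; j < n; j++)
                u[j] += alpha[i] * ui[j];           // linear superposition
        }
        return u;
    }
}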
Figure 5. A fire scenario implementation of DDEMA’s architecture.
3.2. Prior-Time System Design Objectives

In this system configuration, we assume that the laboratory or off-board user, who can be a structural designer or a system qualifier needing accurate systemic state behavior prediction, has operational objectives similar to those of the real-time modality. The only difference is that the real-time component will not be present at the time of simulation, but it will be present at the time of model formation. The role of the incoming data will be played by the multidimensional material testing data-streams for identifying the constitutive behavior of materials, which may also guide model selection and implementation. This requirements specification represents the structural system design and qualification utilization of the system. The high-level schematic view of the so-called prior-time (static) system architecture depicted in Fig. 6 was produced by appropriate adjustments of Fig. 4.

4. Implementation Issues for the DDEMA Environment

In this section we discuss the implementation of DDEMA for the two identified application scenarios.
Figure 6. A prior-time implementation of DDEMA’s architecture for a CDE application.
4.1. Design Overview

The plethora of available choices of computational infrastructure (hardware, software, networking; see Sec. 2), along with the specific domain of application and user-specific backgrounds and needs, has introduced a large number of research issues associated with the computational implementation of any MPSE, and particularly of DDEMA. The major issues are: (1) ability and optimization for computation over distributed resources, (2) ability for dynamic migrating component distribution, (3) adaptive modelling capability, (4) user-dependent customization, (5) collaborative knowledge representation and interactivity, (6) dealing with heterogeneity of legacy and new code, and finally (7) ability to sustain component fault tolerance. To address as many of the above issues as possible, in the context of data-driveness and the two time-modality scenarios considered, the three-layered design shown in Fig. 7 is implemented.

Figure 7. Three-layered abstractions of DDEMA in terms of their distance from the user’s perspective and the abstract functional refactoring.

The surfaceware layer implements the user interface view of DDEMA. It consists of actors with a visual abstraction in a two-dimensional visual language context that allows the user to use various semantics (e.g. data flow, flow of control,
etc.) to implement problem-solving composites of schemes and workflows. The middleware layer serves as the interface between the surfaceware and the deepware; it is essentially capable of encapsulating instantiations of the deepware modules and projecting them onto their visual incarnations available at the surfaceware level. The deepware layer corresponds to the runtime environment where the legacy codes run on “GRID” or conventional resources. The symbolic modules associated with the PDE models themselves are also implemented at this level. From a logical instantiation perspective, an actor can contain the functionality of one or more agents, while an agent can contain the functionality of one or more deepware units. From an implementation perspective, an actor is a wrapper providing a view onto an agent composite, while an agent is a wrapper on a deepware composite. A sketch of these containment relations is given below, followed by a more detailed description of the three layers.
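The containment relations just stated might be sketched with the following illustrative Java interfaces (our own assumption, not DDEMA's actual abstractions):

import java.util.List;

interface DeepwareUnit { void execute(); }           // e.g. a wrapped legacy solver

interface Agent {                                    // middleware-level wrapper
    List<DeepwareUnit> units();
    default void run() { units().forEach(DeepwareUnit::execute); }
}

interface Actor {                                    // surfaceware visual wrapper
    List<Agent> agents();
    default void fire() { agents().forEach(Agent::run); }
}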
4.2. Surfaceware

DDEMA's realization will be based on two main operational modes: the application design mode and the application use mode. In the application design mode the designer will utilize the VPDE for designing the actual application architecture in terms of a data flow and message passing diagram. In particular, a visual representation of the linguistic components available for composition through their visual incarnations within the VPDE will be used. This is exactly why we plan to utilize an appropriately modified version of the “Vergil” visual editor paradigm provided by the Ptolemy-II system [24], which already contains a well documented specification for creating new actors and directors. In this mode of usage, DDEMA's GUI will be that of the VPDE based on Ptolemy-II. It is important to realize that the Ptolemy-II infrastructure guarantees that actors are composable but static (in the sense that the runtime JVM environment exploited to create the corresponding composition diagram resides on the machine used by the user to author it).

In the application use mode, the user will take action to initiate the automatic generation of Java source code or bytecode that implements the intended application functionality. Upon execution of the bytecode, the user will be presented with a stand-alone custom-made application, the result of the previous mode, to perform the activities associated with the corresponding usage requirements. During this phase secondary processes will be spawned on the appropriate lightweight and heavyweight computational infrastructure available at hand. In order to address issues of parallelism, distribution, concurrency, security and process mobility, we will first attempt to introduce an intermediate layer between the actual mission functional and actor view layers that provides a unified set of resources for this goal, such as Inria's ProActive library [25]. In this case the GUI will be the one defined by the user in terms of the previous stage.

Based on the issues, requirements, and data associations between functional abstractions described above, we have decided on proposing the following set of actors to
capture, as an example, the finite element tearing and interconnecting (FETI) methodology for solving field PDEs: a Director (or General Control Actor (GCA)), a Finite Element Tear and Interconnect Actor (FETIA), a Communication Actor (CA), Domain Actors (DA), a Database Actor (DBA), and several I/O Actors (e.g. Display Actor, Printer Actor, Sensor Actor, User Actor); see Fig. 8.

Figure 8. Example of surfaceware utilization of a FETI model for DDEMA in Ptolemy’s Vergil used as a VPDE.

Each actor has specific tasks and interactions with other actors. However, we will not discuss the individual responsibilities of each actor/agent here due to lack of space. Instead, the reader can infer some of this functionality through the associations depicted in Fig. 8. In many cases the actors to be developed will have to be constructed such that they encapsulate existing codes like the AERO suite [7,8], or their aggregations and/or subcomponents, through the corresponding intermediate agent-middleware wrapping, for instance following the pattern sketched below.
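As an illustration of this actor-construction pattern, the following hedged skeleton follows Ptolemy-II's documented recipe for new actors [24]; the DomainActor name, its ports, and its pass-through behavior are illustrative stand-ins for DDEMA's Domain Actor, not its actual implementation:

import ptolemy.actor.TypedAtomicActor;
import ptolemy.actor.TypedIOPort;
import ptolemy.data.DoubleToken;
import ptolemy.data.type.BaseType;
import ptolemy.kernel.CompositeEntity;
import ptolemy.kernel.util.IllegalActionException;
import ptolemy.kernel.util.NameDuplicationException;

public class DomainActor extends TypedAtomicActor {
    public TypedIOPort boundaryIn;   // incoming interface/boundary data
    public TypedIOPort fieldOut;     // computed field value for this domain

    public DomainActor(CompositeEntity container, String name)
            throws IllegalActionException, NameDuplicationException {
        super(container, name);
        boundaryIn = new TypedIOPort(this, "boundaryIn", true, false);
        fieldOut = new TypedIOPort(this, "fieldOut", false, true);
        boundaryIn.setTypeEquals(BaseType.DOUBLE);
        fieldOut.setTypeEquals(BaseType.DOUBLE);
    }

    /** Consume one boundary token, delegate to the wrapped agent, emit a result. */
    public void fire() throws IllegalActionException {
        super.fire();
        if (boundaryIn.hasToken(0)) {
            double b = ((DoubleToken) boundaryIn.get(0)).doubleValue();
            // ... here the actor would forward b to its middleware agent ...
            fieldOut.send(0, new DoubleToken(b)); // placeholder pass-through
        }
    }
}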
4.3. Middleware

The design of the middleware implementation for DDEMA is based on our experience from previous project implementations, such as GasTurbnLab [13]. It is presented in the context of an agent-based middleware in order to be able to address the large spectrum of hardware heterogeneity involved, which ranges from portable PDAs and small-processing-power intelligent sensors to large-scale shared memory machines, GRID-based computing and web-services-based computing resources. Agent systems are composable by default, in the sense that they provide a container abstraction, which is the most common characteristic of such systems. However, they can also be mobile, in that they can migrate to various runtime environments hosted by various physical hosts. This feature is particularly desirable for cases where, for unforeseen reasons, the hardware may fail (e.g. fire damage) while the agents' runtime presence migrates onto healthy hardware.

Since there is a plethora of agent-based systems, we needed to establish an informed perspective as to which features were most important for DDEMA's design needs specification, in order to be able to downselect a particular agent middleware infrastructure. We followed a strategy of identifying a set of knockout and performance criteria similar to that of [35]. The systems that survived the knockout criteria evaluation were further evaluated against a set of performance criteria pertinent not only to their general behavior but also to their ability to be integrated under the needs of the DDEMA project. In a forthcoming publication we describe the process and criteria, along with the evaluation results, in detail. This effort concluded that there are three potential systems that can be used for DDEMA: the Java Agent Development framework (JADE) [20], the GRASSHOPPER mobile agent system [21], and the ProActive Java library for parallel, distributed, concurrent computing with security and mobility [25]. At the present stage it has not been decided which of the three systems will be utilized for the needs of DDEMA. However, there is an effort to use the FIPA [36] compatibility feature of JADE and GRASSHOPPER to use both of them in conjunction with ProActive.
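To make the agent-wrapping concrete, here is a minimal sketch of a JADE [20] agent that could wrap a deepware unit; the SolverAgent name and its message handling are our own illustrative assumptions, not DDEMA code:

import jade.core.Agent;
import jade.core.behaviours.CyclicBehaviour;
import jade.lang.acl.ACLMessage;

public class SolverAgent extends Agent {
    protected void setup() {
        addBehaviour(new CyclicBehaviour(this) {
            public void action() {
                ACLMessage msg = receive();          // surfaceware request, if any
                if (msg != null) {
                    // ... dispatch msg content to the wrapped legacy solver ...
                    ACLMessage reply = msg.createReply();
                    reply.setPerformative(ACLMessage.INFORM);
                    reply.setContent("solution-ready"); // placeholder payload
                    send(reply);
                } else {
                    block();                          // wait for the next message
                }
            }
        });
    }
}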
4.4. Deepware

The main components of DDEMA's deepware are the AERO suite modules, capable of handling fluid-structure interaction in conjunction with heat transfer and fluid mesh deformation, as described elsewhere [17]. Additional components will be the codes capable of solving the reactive flow problem associated with the fire conditions in one of the two demonstration applications of DDEMA. Potential candidates for this task are NIST's "Fire Dynamics Simulator" [37] or a general-purpose PDE solver such as freeFEM [38] or FlexPDE [39]. Finally, the following modules that are essential for our project applications will be developed as part of the deepware layer of DDEMA: (1) a symbolic constitutive theory and PDE model creator infrastructure; (2) a data-driven automatic model generator based on information theory and/or standard optimization techniques; and (3) a real-time solution constructor based on data-driven approaches, either for steering real-time computation for simulation purposes, or for computing solutions based on precomputed basis solution approaches. These applications will be projected onto the surfaceware level through the appropriate agent-based middleware wrapping, and will be available for user graphical synthesis via the actor wrapping of the corresponding agents.

5. Validation Case Studies

5.1. Coupled Multiphysics of Continua: Material/Structural Design of a Supersonic Platform

Our experience with continuous multiphysics field theories [40,41] suggests that the design process of a structure often requires the usage of a tool such as DDEMA for establishing optimal material characteristics, like fiber-matrix properties, fiber orientation, and laminate layup properties, and the shape tailoring for an aircraft - or an aircraft component such as a fuselage, a wing, or a control surface - under the high temperature and mechanical loading conditions inflicted by a supersonic mission requirement.

Intrinsic to the validity and confidence of any aero-structural simulation are the utilized continuous models of material behavior. It is exactly
this area where our robotic testing capability, along with historical exhaustive data pertaining to the identification of material behavior, will be used as the springboard to launch an effort for automating model selection, implementation and verification methodologies. This will be done by considering all intrinsic and extrinsic, wanted and unwanted factors (uncontrolled biasing, noise, repeatability, history dependence, etc.) affecting data quality for the intended usage of model formation. Information-theoretic, statistical and certainly deterministic techniques for model selection and/or generation in the context of continuous system identification and inverse approaches will be considered, and ultimately some of them will be encapsulated in DDEMA. This will exercise the prior-time data-driven aspect of the system.

We have already demonstrated that it is possible to deal with most of these issues from a single-physics rapid modelling perspective [18,19], in conjunction with the development of a supersonic aircraft shaping technology for reducing the initial shock pressure rise characterizing the ground signature. However, such a Quiet Supersonic Platform (QSP) requires a lightweight airframe, and will exhibit an aero-structural behavior. Its feasibility depends not only on the ability to design a shape that generates a low-boom ground signature, but most importantly on the ability to build a lightweight, high-stiffness and damage-tolerant structure that can withstand the expected aerodynamic and thermal loadings for long-range missions at a sustained rate. Therefore, the final design of a QSP will have to rely on a multidisciplinary aero-thermo-structural optimization.

5.2. Monitoring, Assessment and Management of Fire-Environment Interaction

The ability to deal with the consequences of fire extends over a wide range of application areas that have a direct effect on the survivability, reparability, maintainability, life extension and mission goal attainment of the environments and structural platforms affected by it. Some of these application areas can involve time-critical situations that, when they arise, demand decision-making support for an accurate
monitoring capability supplemented by fire damage assessment and by management and control countermeasure capabilities. A case in point are navy vessels, built with a great variety of materials, that operate under extremely demanding threat conditions, such as possible catastrophic events due to fuel tank proximity, limited oxygen supply (e.g. submarines), fire byproduct toxicity, and structural and atmospheric damage.

To demonstrate the value of the core ideas behind the development of DDEMA, we plan to develop a proof-of-concept instantiation of DDEMA within the context of managing all required activities of an accidental-fire-induced crisis scenario, as depicted in Fig. 5. More specifically, this application of DDEMA should be able to provide assistance to the users by employing the following features: (1) ability to receive real-time datastreams from multiple redundant distributed sensor networks that allow capturing of multimodal point or continuous distributions of various field quantities, such as area/volume of live flames, temperature, chemical byproduct concentrations, etc.; (2) ability for multi-point-of-entry monitoring, where this can be accomplished in multiple locations synchronously or asynchronously; (3) co-presentation of reactive flow and reactive phase transformation simulation capability with multiphysics fluid-structure interaction simulation capability, allowing “what-if” prediction exploration in order to be able to evaluate the validity of decisions and alternative countermeasures; (4) a decision support subsystem, in order to combine sensor inputs, simulation results analysis, and user choices based on experience and knowledge, and to form a course of countermeasure actions that can also be simulated; (5) an interface to the control system of an existing distributed countermeasure actuation network, in order to implement the decided management strategy. A system with these capabilities allows for portability and wide applicability.

6. Conclusions

We have presented an overview of the data-driven motivations for creating DDEMA as a system where data representing discrete factual
encapsulations are used for embedding validation and qualification within the modelling and simulation aspects of systemic behavior prediction. The requirements of such an environment have been established, and the major details of the abstract architecture of the software abstractions required for DDEMA's implementation have been determined. Two application scenarios for validating the architecture, implementation and utility of this system have also been described. The first is the usage of DDEMA as a computational design environment for a nonlinear multiphysics structure such as a Quiet Supersonic Platform, and the second is the usage of DDEMA as a computational decision support system for fire monitoring and control on various platforms.

Acknowledgements. The authors acknowledge the support of the National Science Foundation under grants EIA-0205663 and EIA-0203958.

REFERENCES

1. J. Michopoulos, P. Tsompanopoulou, E. Houstis, J. Rice, C. Farhat, M. Lesoinne, F. Lechenault, DDEMA: A Data Driven Environment for Multiphysics Applications, in: Proceedings of the International Conference on Computational Science - ICCS'03, P.M.A. Sloot et al. (Eds.), Melbourne, Australia, June 2-4, LNCS 2660, Part IV, Springer-Verlag, Heidelberg, (2003) 309-318.
2. J. Michopoulos, P. Tsompanopoulou, E. Houstis, J. Rice, C. Farhat, M. Lesoinne, F. Lechenault, Design Architecture of a Data Driven Environment for Multiphysics Applications, in: Proceedings of DETC'03, ASME DETC2003/CIE, Chicago IL, Sept. 2-6 2003, Paper No. DETC2003/CIE-48268, (2003).
3. American Institute of Aeronautics and Astronautics Standards Program, Guide for the Verification and Validation of Computational Fluid Dynamics Simulations, AIAA report G-077-1998, 1998.
4. ASME-JFE, Journal of Fluids Engineering Editorial Policy Statement on the Control of Numerical Accuracy, Jnl. of Fluids Engineering, Vol. 115, 3, 1993, pp. 339-340.
5. DoD, Verification, Validation, and Accreditation (VV&A) Recommended Practices Guide, Defense Modeling and Simulation Office, Office of the Director of Defense Research and Engr., available: www.dmso.mil/docslib.
6. M. Pilch, T. Trucano, J. Moya, G. Froelich, A. Hodges, D. Peercy, Guidelines for Sandia ASCI Verification and Validation Plans - Content and Format: Version 2.0, Sandia Report SAND2000-3101, January 2001.
7. C. Farhat, M. Lesoinne, Two Efficient Staggered Procedures for the Serial and Parallel Solution of Three-Dimensional Nonlinear Transient Aeroelastic Problems, Computer Methods in Applied Mechanics and Engineering, 182 (2000) 499-516.
8. C. Farhat, M. Lesoinne, P. Stern, S. Lanteri, High Performance Solution of Three-Dimensional Nonlinear Aeroelastic Problems Via Parallel Partitioned Algorithms: Methodology and Preliminary Results, Advances in Engineering Software, 28 (1997) 43-61.
9. E. Kant, F. Daube, W. MacGregor, J. Wald, Scientific Programming by Automated Synthesis, Chapter 8 in Automatic Software Design, M. Lowry and R. McCartney (Eds.), AAAI Press/MIT Press, Menlo Park, CA, (1991) 169-205.
10. R. van Engelen, L. Wolters, G. Cats, Ctadel: A Generator of Efficient Code for PDE-based Scientific Applications, Technical Report 95-26, Department of Computer Science, Leiden University, 1995.
11. H. Fujio, S. Doi, FEEL: A Problem-Solving Environment for Finite Element Analysis, NEC Research and Development, 39/4 (1998) 491-496.
12. E.N. Houstis, J.R. Rice, Parallel ELLPACK: A development and problem solving environment for high performance computing machines, in: Programming Environments for High-Level Scientific Problem Solving, P. Gaffney and E. Houstis (Eds.), North-Holland, Amsterdam, (1992) 229-241.
13. S. Fleeter, E. Houstis, J. Rice, C. Zhou, A. Catlin, GasTurbnLab: A Problem Solving Environment for Simulating Gas Turbines, in: Proceedings of the 16th IMACS World Congress, (2000) No. 104-5.
14. E.N. Houstis, A.C. Catlin, P. Tsompanopoulou, D. Gottfried, G. Balakrishnan, K. Su, J.R. Rice, GasTurbnLab: A Multidisciplinary Problem Solving Environment for Gas Turbine Engine Design on a Network of Non-Homogeneous Machines, J. of Computational Engineering and Mathematics, 149 (2002) 83-100.
15. J.R. Rice, P. Tsompanopoulou, E.A. Vavalis, Interface Relaxation Methods for Elliptic Differential Equations, Applied Numerical Mathematics 32 (1999) 219-245.
16. P. Tsompanopoulou, Collaborative PDEs Solvers: Theory and Practice, PhD thesis, Mathematics Department, University of Crete, Greece (2000).
17. H. Tran, C. Farhat, An Integrated Platform for the Simulation of Fluid-Structure-Thermal Interaction Problems, AIAA J. (in press).
18. J. Michopoulos, R. Badaliance, T. Chwastyk, L. Gause, P. Mast, C. Farhat, M. Lesoinne, Coupled Multiphysics Simulation of Composite Material Softening in a Virtual Wind Tunnel Environment, in: Proceedings of the Sixth U.S. National Congress on Computational Mechanics, U.S. Association for Computational Mechanics, Dearborn MI, (2001) pp. 521.
19. J. Michopoulos, C. Farhat, M. Lesoinne, P. Mast, R. Badaliance, T. Chwastyk, L. Gause, Material Softening Issues in a Multiphysics Virtual Wind Tunnel Environment, AIAA Paper 2002-1095, 40th Aerospace Sciences Meeting and Exhibit, Reno, Nevada, (2002).
20. JADE, Java Agent Development framework, http://jade.cselt.it
21. The Grasshopper Agent Platform, IKV++ GmbH, Kurfurstendamm 173-174, D-10707 Berlin, Germany, http://www.ikv.de
22. Java-RMI-IIOP, http://java.sun.com/products/rmi-iiop/
23. L. Parker, S.M. Christensen, MathTensor: A System for Doing Tensor Analysis by Computer, Addison-Wesley, (1994).
24. J. Davis II, C. Hylands, B. Kienhuis, E.A. Lee, J. Liu, X. Liu, L. Muliadi, S. Neuendorffer, J. Tsay, B. Vogel, Y. Xiong, Heterogeneous Concurrent Modeling and Design in Java, Memorandum UCB/ERL M01/12, EECS, University of California, Berkeley, CA USA 94720, March 15, 2001, http://ptolemy.eecs.berkeley.edu/ptolemyII/
25. D. Caromel, W. Klauser, J. Vayssiere, Towards Seamless Computing and Metacomputing in Java, Concurrency Practice and Experience, 10(11-13), September-November 1998, G.C. Fox (Ed.), Wiley & Sons, Ltd., pp. 1043-1061, http://www-sop.inria.fr/oasis/proactive/
26. G.L. Antoniu, L. Bouge, P. Hatcher, M. MacBeth, K. McGuigan, R. Namyst, Compiling multithreaded Java bytecode for distributed execution, in: Euro-Par 2000: Parallel Processing, Lecture Notes in Comp. Science, Vol. 1900, Springer-Verlag, Munchen, Germany, (2000) 1039-1052.
27. C. Youn, M. Pierce, G. Fox, Building Problem Solving Environments with Application Web Service Toolkit, in: Proceedings of the International Conference on Computational Science - ICCS'03, P.M.A. Sloot et al. (Eds.), Melbourne, Australia, June 2-4, LNCS 2660, Part IV, Springer-Verlag, Heidelberg, (2003) 403-412.
28. Y. Huang, D.W. Walker, JSFL: An Extension to WSFL for Composing Web Services, in: Proceedings of the 3rd International Workshop on Grid Computing, Baltimore, MD. Submitted 8th June, 2002.
29. Y. Huang, GSiB: PSE Infrastructure for Dynamic Service-Oriented Grid Applications, in: Proceedings of the International Conference on Computational Science - ICCS'03, P.M.A. Sloot et al. (Eds.), Melbourne, Australia, June 2-4, LNCS 2660, Part IV, Springer-Verlag, Heidelberg, (2003) 430-439.
30. T. Drashansky, S. Weerawarana, A. Joshi, R. Weerasinghe, E. Houstis, Software Architecture of Ubiquitous Scientific Computing Environments, ACM - Baltzer Mobile Networks and Nomadic Applications (MONET), 1/4 (1996).
31. Java Native Interface Specification, http://web2.java.sun.com/products/jdk/1.1/docs/guide/jni.
32. JACAW: A Java-C Automatic Wrapper Tool, http://www.wesc.ac.uk/projects/jacaw/
33. C. Douglas, Y. Efendiev, R. Ewing, R. Lazarov, M. Cole, C. Jones, C. Johnson, Virtual Telemetry for Dynamic Data-Driven Application Simulations, in: Proceedings of the International Conference on Computational Science - ICCS'03, P.M.A. Sloot et al. (Eds.), Melbourne, Australia, June 2-4, LNCS 2660, Part IV, Springer-Verlag, Heidelberg, (2003) 279-288.
34. D. Knight, Data Driven Design Optimization Methodology, A Dynamic Data Driven Application System, in: Proceedings of the International Conference on Computational Science - ICCS'03, P.M.A. Sloot et al. (Eds.), Melbourne, Australia, June 2-4, LNCS 2660, Part IV, Springer-Verlag, Heidelberg, (2003) 329-336.
35. M. Grabner, F. Gruber, L. Klug, W. Stockner, J. Altmann, W. Esmayr, EvalAgents: Evaluation Report, Software Competence Center Hagenberg Technical Report SCCH-TR-64/2000, 2000.
36. The Foundation for Intelligent Physical Agents, FIPA Specifications, http://www.fipa.org/specifications/index.html
37. NIST Fire Dynamics Simulator and Smokeview, http://fire.nist.gov/fds/
38. freeFEM family of codes, http://www.freefem.org/
39. FlexPDE, http://www.pdesolutions.com/index.html
40. G.C. Sih, J.G. Michopoulos, S.C. Chou, Hygrothermoelasticity, Martinus Nijhoff Publishers (now Kluwer Academic) (1986).
41. J.G. Michopoulos, G.C. Sih, Coupled Theory of Temperature Moisture Deformation and Electromagnetic Fields, Institute of Fracture and Solid Mechanics report IFSM-84123, Lehigh University, (1984).