JDMS Special issue article
Supporting Network Enabled Capability by extending the Levels of Conceptual Interoperability Model to an interoperability maturity model
Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 10(2) 145–160 © 2012 The Society for Modeling and Simulation International DOI: 10.1177/1548512911428457 dms.sagepub.com
Andreas Tolk1, Lisa Jean Bair2 and Saikou Y Diallo3
Abstract
The North Atlantic Treaty Organization (NATO) Network Enabled Capability (NNEC) addresses the technical and cognitive abilities of NATO, requiring technical and operational interoperability standards and targets for adaptation. Net-enabled Modeling and Simulation (M&S) can provide support in all life cycle phases. This paper evaluates the contribution of four example technical activities of NATO conducted by different bodies of the Research and Technology Organization: net-enabled M&S by the Modeling and Simulation Group, semantic interoperability by the Information Systems Technology panel, new command and control concepts by the System Analysis & Studies panel, and human factors for NNEC by the Human Factors & Medicine panel. The results call for a framework that includes both technical and operational aspects. The technical challenges can be supported by the Levels of Conceptual Interoperability Model, which is extended to an Interoperability Maturity Model. The operational challenges can be supported by the NNEC Command and Control Maturity Model. Both models are aligned to provide the necessary support to address NATO's technical and cognitive abilities by M&S services.
1. Introduction
The North Atlantic Treaty Organization (NATO) Network Enabled Capability (NNEC) Vision and Concept1 from April 2006 defines NNEC as follows:

NATO Network Enabled Capability is the Alliance cognitive and technical ability to federate the various components of the operational environment from the strategic level down to the tactical level through a networking and information infrastructure.
The guiding question addressed in this paper is: how can Modeling and Simulation (M&S) contribute to implementing this vision? First, it needs to be recognized that NNEC is not a technical capability alone. The technical connectivity of NATO components and their ability to exchange data is an important enabler, but NNEC needs to contribute to the cognitive aspects of collaboration as well: how can the data that have been shared be better understood? How can it be ensured that NNEC is not just implementing a technically very sophisticated version of a 'Chinese whisper'? Can the work of the M&S community and their insights on composable M&S services2 help and contribute? How can the various contributions of technical activities of the various panels and groups of NATO's Research and Technology Organization (RTO) be consolidated into a common framework supporting management and technology enabling NNEC? Regarding these questions, the director of NATO's Command and Control (C2) Centre of Excellence (CoE) observes that

…consequently resides in NNEC a coherent approach to the development of technical and operational interoperability
1 Old Dominion University, Norfolk, USA
2 Weisel Science and Technology, Corp., Norfolk, USA
3 Virginia Modeling, Analysis, and Simulation Center, Suffolk, USA
Corresponding author:
Andreas Tolk, Old Dominion University, 241 Kaufman Hall, Norfolk, VA 23529, USA
Email: [email protected]
standards and targets for adaptation. NNEC also aims to align national NEC related programs and not only technical interoperability but also operational interoperability, like training, doctrine, etc. (Kanis and van Ettinger,3 pp.21-22)
In other words, the framework needs to support technical standards for interoperability as well as pragmatic aspects enabling the operational evaluation. These ideas can be found in the NATO Code of Best Practice for C2 Assessment4 as well. The Code recommends an orchestrated set of tools and the evaluation of the results based on operationally justified measures of merit ranging from political effectiveness down to dimensional technical parameters, similar to the hierarchical structures recommended by the Military Operations Research Society (MORS).5 In order to address similar issues for international M&S federation development, the Levels of Conceptual Interoperability Model (LCIM) was developed.6 Originally targeting the M&S community, the LCIM has been widely adapted to other application domains requiring a similar approach to address the challenges of integratability of communication networks, interoperability of implemented solutions, and composability of underlying conceptualizations derived from the business cases to be supported. One example of the successful adaptation of these ideas is the Interoperability Framework for the Smart Power Grid.7 As such, the idea behind the LCIM is to provide metrics on the various levels of interoperation, enhancing cross-collaboration throughout the business organization and making the resulting federation flexible, adaptable, and agile by applying operational guidelines as well as technical standards. Applying these metrics on the various levels leads to an Interoperability Maturity Model (IMM) that allows measuring the degree of alignment of two solutions regarding their ability to support a common NNEC operation. It also supports the execution of coalition operations. It is the objective of this paper to introduce such an IMM based on an extension of the LCIM, utilizing the recent research results of related RTO tasks.
It will also be shown that this IMM is an enhancement of the current approaches to measure Net Enabled Capability Maturity,8 not a replacement. To reach these objectives, the paper is structured as follows: In Section 2, a short overview of the RTO and its panel visions will be given. From these visions, the necessity to align their findings in support of NNEC is derived, along with several activities of the Modeling and Simulation Group (MSG), the Information Systems and Technology (IST) panel, and the Systems, Analyses, and Studies (SAS) panel, supported by some overarching insights from the panel of Human Factors and Medicine (HFM). The result of this evaluation of the related work of NATO will be the identification of technical activities that can and must
support technical and operational metrics for maturity on the respective levels. In Section 3, related work on the composability of model-based solutions in support of complex systems and systems of systems will be given. These contributions enrich the findings of NATO experts by providing the underlying mathematics needed to justify the use of metrics in several areas. Section 4 introduces the LCIM in more detail and compares the LCIM with the NATO Net Enabled Capability Command and Control Maturity Model (N2C2M2). Section 5 introduces the methods to generate metrics applicable to extend the LCIM into an IMM, extending the mathematics developed in recent academic research to capture the operational and technical constraints derived from related NATO work and providing the technical interoperability foundations that are the basis for operational interoperability and the resulting capabilities.
2. Related NATO work
As pointed out in the introduction, as well as in the vision statement, NNEC addresses the technical and cognitive abilities of NATO components. Therefore, exclusively addressing technical abilities cannot be sufficient. Accordingly, insights from technical and non-technical research groups supported by NATO's RTO need to be taken into consideration. To this end, this section gives a short overview of the panels and groups conducting related research.
2.1 NATO's Research and Technology Organization
NATO has organized its RTO into six technical panels and the M&S group. The System Analysis & Studies (SAS) panel conducts studies and analyses of an operational and technological nature and promotes the exchange and development of methods and tools for operational analysis as applied to defense problems. The Human Factors & Medicine (HFM) panel provides the science and technology base for optimizing health, human protection, well-being, and performance of the human in operational environments with consideration of affordability. The Information Systems Technology (IST) panel identifies and reviews areas of research of common interest and recommends the establishment of activities in these areas. The Applied Vehicle Technology (AVT) panel improves the performance, affordability, and safety of vehicles. The Systems Concepts & Integration (SCI) panel advances knowledge concerning advanced systems, concepts, integration, engineering techniques, and technologies across the spectrum of platforms and operating environments to assure cost-effective mission area capabilities. Finally, the Sensors & Electronics Technology (SET) panel advances technology in electronics and passive/active sensors and
enhances sensor capabilities through multi-sensor integration/fusion in order to improve operating capability and contribute to fulfilling strategic military results. In addition, NATO established the NATO M&S Group (MSG) to promote co-operation among Alliance bodies and NATO member nations and Partnership for Peace nations to maximize the effective utilization of M&S across all technical panels. Each panel and the MSG have a steering group that coordinates the various activities within the panel or group. The actual research is conducted within expert teams that conduct technical activities, normally limited to a period of two years, although exceptions are possible. Within these technical tasks, special research questions are addressed. In the scope of this paper, several activities that support different aspects of NNEC metrics will be addressed. The focus will be on SAS, IST, HFM, and the MSG: SAS conducts operational analysis studies, directly addressing the needs and requirements for NNEC; IST is the home for C2 information systems and communications, so its studies have to provide technical connectivity, but also connect these technical aspects to the challenges of C2; HFM addresses the cognitive aspect; and the MSG is the M&S backbone of NATO, focusing not only on M&S applications, but also on theoretical aspects of interoperability and composability. NATO-supported NNEC research goes far beyond this selection; the list is neither complete nor exclusive. However, the focus is to show how the results can contribute to addressing particular issues within the common framework by giving a short evaluation of four technical activities representing the four RTO bodies. The interested reader is invited to apply the respective ideas to additional research results to broaden the applicability of the solution proposed in Section 5.
2.2 MSG-062: M&S for NNEC
The recent study on M&S for NNEC9 should be of particular interest when addressing the guiding question of this paper. The tenet of the final report is that M&S can be applied to net-enabled capabilities in all life cycle phases of such components: from concept development and experimentation (CD&E), via acquisition, training, and exercises, into the phase of operations. In its seven main chapters, the report addresses M&S support for the following:
• architectural constraints and evaluations of through-life management;
• CD&E;
• acquisition, test and evaluation, and logistics;
• training and exercises;
• agile operations and C2;
• the human dimension; and
• the evolution of M&S.
Each chapter summarizes its findings with a set of key M&S principles, such as that NNEC is a System-of-Systems (SoS) problem that can be understood through effective use of enterprise architectures. As in this paper, the idea is to address technical and cognitive abilities through technical and operational interoperability standards. However, the majority of the contributions focus on the positive aspects of M&S in all life cycle phases without giving detailed technical recommendations or operational constraints. Nevertheless, the need for common approaches in all life cycle phases is recognized and emphasized in the report. The contributions on M&S using system architectures in support of through-life management for NNEC and on the evolution of M&S in the NNEC context are of particular interest with regard to the objective of this paper. Both of the chapters addressed above show the necessity for an aligned approach of NNEC components and net-enabled M&S. In the operational context, architectural frameworks are applied to describe functional and physical aspects of the capabilities of systems and show how these capabilities can fulfill the set of operational requirements that justifies the use of the respective systems in the operational context from which the requirements were drawn. Systems described in such a framework can be exercised through the execution of M&S. Furthermore, M&S is evolving more and more into an operational tool, characterized as an 'operational pull for M&S' in the study on M&S for NNEC9 (p.8-2):

At one time, M&S was considered to be a method of idea development away from the actual battle in both time and space. However, the nature of M&S, as it is seen by the operational community, is changing.
If both of these aspects grow together, a commander in the future can reconfigure his or her systems on the fly and communicate the new capabilities, with accompanying M&S explanations, to partners in the battle sphere. MSG-062 shows several use cases that clearly establish operational requirements for future developments and gives examples of successful experiments that demonstrate the feasibility and applicability of these ideas. It also shows the fundamental need to address interoperability on several levels in heterogeneous environments. Both aspects are addressed in this paper.
2.3 IST-075: semantic interoperability
The need for unambiguous information exchange in heterogeneous information technology (IT) solutions has been addressed in a recent IST technical activity that just published its findings in its final report.10 The main contribution of IST-075 regarding the objectives of this paper is the introduction of the Semantic Interoperability Logical Framework (SILF). The report defines the SILF10 as
…a set of ontology operation tools, translation engines and guidelines aimed at ensuring the provision of information exchange services for military operations in a NATO context. (p.2-1)

It prescribes four steps to increase semantic interoperability between NATO C2 systems:
(1) describe and expose the semantic descriptions of each system;
(2) refer the local descriptions to previously described and shared concepts;
(3) align the descriptions (ontologies) of the two systems; and
(4) transform information according to definition rules from the aligned ontologies.

Figure 1. High-level view of the Semantic Interoperability Logical Framework (RTO Technical Report,10 p.2-3).

Figure 1 visualizes this approach. The necessary requirement is the existence of a common conceptual representation of exchangeable information, which is referred to as 'common ground' in the report. If such a common world view exists that comprises all concepts represented in at least two of the participating systems (if a concept resides in only one system, it does not need to be shared), this common ground can serve as a unifier between the systems. This represents the common reference model (CRM) idea that will be evaluated in more detail in Section 3 of this paper. Another valuable contribution of IST-075 is a description of the state-of-the-art technologies for the ontological alignment of systems, but a detailed description goes beyond the scope of this paper.

2.4 SAS-050: exploring new command and control concepts and capabilities
The experts of this operational analysis group identify in their final report11 that

…whether it is called Network-Enabled Capability (as it is in NATO), Network Centric Operations, Network Enabled Defense, or Edge Organizations, this transformation is predicated upon a set of network-centric tenets. The tenets that form the intellectual foundation for these ongoing transformations are (a) a robustly networked force (enterprise) enables the widespread sharing of information; (b) widespread information sharing and collaboration in the information domain improves the quality of awareness, shared awareness, and collaboration (C2 and operations processes); (c) this, in turn, enables self-synchronization, and (d) this results in a dramatic improvement in operational effectiveness and agility. (p.1-1)
Like its sister activity in IST, this report identifies the need for a common representation of concepts. Consequently, the development of a common conceptual model identifying the key variables of NNEC and the relationships among them was the first task of this group. The report addresses in detail the following:
• C2 approach;
• information domain;
• individual characteristics and behaviors;
• team characteristics and behaviors;
• decision making, actions, effects, and consequences; and
• value view.
Of particular interest for this paper is the definition of the key terms and relations describing the C2 approach. Figure 2 shows the conceptual view of C2 as applied by SAS-050. The conceptual model developed within SAS-050 can provide the foundation for the common ground requested in IST-075, but needs to be extended with more detailed concepts as represented in the IT systems. It should be pointed out that IT systems are just one contributing factor to C2 within SAS studies; the different scopes of SAS and IST reflect the different application domains of the respective technical activities. This work also led to the development of the N2C2M2, which will be described in more detail in Section 4 of this paper.
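To make the notion of such a conceptual model concrete, the sketch below represents key variables and their relationships as a small directed graph and performs a basic consistency check. The variable names are illustrative only, loosely inspired by the network-centric tenets, and are not taken from the actual SAS-050 model.

```python
# Illustrative sketch of a conceptual model as a set of variables plus
# directed (influences) relationships; the names are hypothetical, not
# the actual SAS-050 variable set.

variables = {"C2 approach", "information sharing", "shared awareness",
             "self-synchronization", "operational effectiveness"}

relationships = [
    ("C2 approach", "information sharing"),
    ("information sharing", "shared awareness"),
    ("shared awareness", "self-synchronization"),
    ("self-synchronization", "operational effectiveness"),
]

def undeclared(variables, relationships):
    """Return variables referenced in relationships but never declared,
    a basic consistency check for a conceptual model."""
    referenced = {v for pair in relationships for v in pair}
    return referenced - variables

# A consistent model references only declared variables:
assert undeclared(variables, relationships) == set()
```

Even such a trivial check illustrates why a shared, machine-readable conceptual model is a prerequisite for aligning descriptions across systems.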
2.5 HFM-155: human systems integration for network centric warfare
While IST focuses on IT systems and SAS extends this to the operational level, HFM adds another level of complexity to NNEC by addressing the human component. Information in the NNEC environment may be semantically aligned, but, due to its inherent nature as so many sources become available, these data will be incomplete,
Figure 2. The Command and Control approach of SAS-050 (RTO Technical Report,11 p.2-4).
inconsistent, and may even be contradictory, resulting in massive uncertainty. The challenge for the commander is to use this information and the new options of NNEC effectively. However, using these data and new options accurately and in a timely manner stretches the limits of human ability. Human Systems Integration (HSI) addresses these issues in the recently published report:12

While many believe that NNEC should provide wider and deeper information availability that will dramatically improve military operations information availability, it does not necessarily mean human operators can or will use it effectively. (p.1-1)
A main objective of HSI was the integration of human capabilities and human limitations into system definition, design, development, and evaluation to optimize total system performance in operational environments. It should be noted that the system is the C2 system, comprising not only the IT systems, but also humans, resources, sensors, and all other elements identified in SAS-050 and used later by Alberts et al.8 as the foundation for NNEC maturity. To support decision makers, the architecture frameworks already addressed in MSG-062 were extended with artifacts for representing human components of a complex socio-technical system, focusing on, but not limited to, systems engineering purposes. This also closes the circle: the selected research conducted in four different bodies of the RTO – MSG, IST, SAS, and HFM – shows the mutual dependencies in supporting the general vision of NNEC. Interoperability as required for NNEC needs to be embedded into the
Figure 3. Principles of Network Enabled Capability and supporting results.
operational context in order to support a better understanding. The technical integration of NATO components is only a first enabler. The exchangeable data need to be embedded into a common view representing the operation to be supported: the common ground. However, the SAS work has shown that today's military operations have to support a multitude of missions that may require different methods and tools based on different world views. Presenting such contradictory data to human decision makers carries the danger of decreasing rather than increasing decision quality. Figure 3 qualitatively shows the four studies and their relations to the four driving principles for NNEC. Clearly, a common framework is needed that allows measuring the NNEC interoperability maturity to ensure that only appropriate tools are composed and applied.
Technical maturity and operational appropriateness become the key factors for success and must be the target measures of merit against which potential solutions are evaluated.
3. Related academic work
The need to address technical and cognitive abilities, requiring technical and operational interoperability standards and targets for adaptation, was also recognized in recent academic contributions of two doctoral theses.
• In his work 'On the role of assertions for conceptual modeling as enablers of composable simulation solutions',13 King shows the need for and feasibility of explicitly dealing with assumptions and constraints that have to be taken into consideration when composing two individually developed model-based solutions in support of common operations.
• Diallo14 contributed 'Towards a formal theory of interoperability,' generalizing the work of Page and Opper15 and Yilmaz and Oren.16 His dissertation provides the foundation of the mathematics used in Section 5 of this paper.
Both dissertations deal with M&S interoperability challenges directly applicable to the NNEC discussion.
3.1 Capturing assertions
In the M&S community it is often observed that undetected conflicts exist between components that result in hidden, unintended behaviors. King13 observed that there is a need for a method to detect these conflicts. As shown by Tolk et al.,2,6 interoperability of systems requires composability of models to ensure the correctness of the composed models. Furthermore, if meta-data is used to capture this information, such an approach permits software agents to reason about model concepts in an unambiguous, machine-understandable form. King's13 main contribution is therefore a method for the standardized representation and use of assertions, so that a conceptual model can be annotated with a list of critical assertions that the system relies upon. In his proof of feasibility, he implemented simple software agents that detect mismatches in conceptual models with respect to the listed assertions of a well-known problem: the physical forces required to describe the falling body problem. King applied Davis and Anderson's17 definition of composability in his work: 'Composability is the capability to select and assemble components in various combinations to satisfy specific user requirements meaningfully'. King's research supports Davis and Anderson's assumption that, in practice, when designing and creating models, it is the analysts who decide what to ignore and what to include, as well as how to model what is included. This process often results in incompatibilities between models and implementations thereof. King's proposal is to model assertions that constrain the validity of composable solutions. When composing such annotated solutions, computing the assertions derives the valid solution sphere of the composition. If two assertions are mutually exclusive, the solution space is empty. Capturing the assertions using ontological means allows reasoning over them. That capturing the assertions governing the validity of a model is not easy was exemplified by Spiegel et al.18 They published their results from conducting a small case study in order to clarify the role that model context plays in simulation composability and reusability. Using the task of computing the position and velocity of a falling body, they collected assumptions and constraints regarding this problem. An ad hoc study with subject matter expert colleagues led to a non-exhaustive list of 29 assumptions. Furthermore, they observed that failure to appreciate the importance of various constraints when selecting a model can lead to unacceptable results. If a simple model, such as the falling body problem, already leads to so many constraints, more complex models will likely require support by machines to trace the validity. The method defined by King comprises the following four steps.
• Capturing the assertion propositions for the model, system, and environment as natural language statements about the problem, one or more of its components, or a particular solution.
• Encoding the assertions in a knowledge representation language so that they can be used by a software agent, such as a logic reasoner.
• Comparing the assertions of the model of the operation to be supported with those of the systems that shall be composed. The task of comparing lists may require a multi-level strategy to be effective, which is described in more detail by King.13
• Adjudicating and resolving conflicts and engineering a sufficient solution (if possible under the constraints).
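The first three steps can be illustrated with a deliberately simplified sketch. This is not King's actual implementation, which encodes assertions in a knowledge representation language for a logic reasoner; here assertions are reduced to attribute/value propositions, and the attribute names and values for the two falling-body models are hypothetical.

```python
# Illustrative sketch only: assertions reduced to (attribute, value)
# propositions; two models conflict if they assert different values
# for the same attribute.

def find_conflicts(assertions_a, assertions_b):
    """Return the attributes on which two assertion lists disagree."""
    conflicts = {}
    for attribute, value in assertions_a.items():
        if attribute in assertions_b and assertions_b[attribute] != value:
            conflicts[attribute] = (value, assertions_b[attribute])
    return conflicts

# Hypothetical assertion lists for two falling-body models
model_a = {"air_resistance": "neglected", "gravity": "constant 9.81 m/s^2"}
model_b = {"air_resistance": "quadratic drag", "gravity": "constant 9.81 m/s^2"}

conflicts = find_conflicts(model_a, model_b)
# A non-empty conflict list signals that the valid solution sphere of the
# composition may be empty, even if the components integrate technically.
```

In this sketch the two models agree on gravity but conflict on air resistance, so a composition would only be valid where the differing drag assumptions do not matter.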
It may be of interest that King was able to show that even open source components that lined up perfectly could be based on mutually exclusive constraints. King showed in his work that two components can be composed technically without any difficulty and executed without any errors although they are conceptually not aligned. Although technically both components could exchange information, conceptually this information exchange made no sense. If a commander relies on the results of such a conceptually misaligned federation, this will lead to bad decisions. These results also showed that re-engineering assertions from solutions is not always possible and may lead to incomplete assertion lists. Finally, although the approach cannot assure valid solutions, at least it can
protect users from compositions providing invalid solutions under the captured assertions.
3.2 Towards mathematical foundations of interoperability
While King's13 approach was driven more from the top down, Diallo's work14 was more bottom up, starting with IT-driven observations and resulting in significant contributions to a theory of interoperability. Starting from the established theory of data modeling, Diallo applied the results to M&S interoperability challenges. By generalizing the various M&S paradigms to a common denominator, he started with a set of simple observations on models that can be captured as tuples. In data modeling theory, the three terms 'entity', 'property', and 'value domain' constitute the core of a data model. For M&S, we need to add the notion of a domain, which is similar to the traditional view of the value domain but is generalized to encompass the domain of discourse. Finally, the term 'element' is used to mean anything real or imagined that can be described and/or observed. The five terms are defined as follows.
• Elements are real or imaginary things.
• Entities are abstractions of elements. It is worth noting that, by this definition, any abstraction, including processes and relationships, is considered an entity.
• Properties are the characteristics of an element.
• Symbols are the representations of elements. Symbols can be numbers, strings, images, text, or a combination of symbols.
• A domain is a collection of unique symbols. The domain is the set of elements that belong to a context. Every element is uniquely identifiable within a given domain.
Having these definitions of terms, a formal definition of a conceptualization is now possible. Let S be the set of elements, E the set of entities, Π the set of properties, and V the set of symbols. A conceptualization of S is formally captured by the categorization of elements into entities, properties, or symbols. The sets of entities, properties, and symbols are mutually disjoint, and an element within S maps to only one of these sets; in other words, E, Π, and V build a partition of S. A conceptualization is formally defined using partial functions F(s) that provide this mapping:

    Conceptualization := F(s), where
        F: S → E  if s ∈ S is an entity
        F: S → Π  if s ∈ S is a property
        F: S → V  if s ∈ S is a symbol
    and A_i ∩ A_j = ∅ for all A_i, A_j ∈ {E, Π, V}, i ≠ j.

Given the set S of elements and a non-empty set of domains D, every element in S is associated with a domain. Mathematically, we define the tuples:
• α, a subset of E × D, the Cartesian product of entities and domains;
• β, a subset of Π × D, the Cartesian product of properties and domains;
• γ, a subset of V × D, the Cartesian product of symbols and domains.
Given the set of domains, we define the relation ρ as a subset of D × D, the Cartesian product of domains. A model M of a conceptualization of S, denoted M_S, is the relation (α, β, γ, ρ). The formal approach allows one to capture various modeling paradigms using the same conceptualization, exclusively focusing on a model of entities, properties, and relations; they may be captured using different symbols or domains. The quadruple (α, β, γ, ρ) defines all these models and supports similarity comparisons and metrics for alignment in the generalized form required in support of formal composability. The approach is currently being extended to Axioms of Interoperability. Some preliminary results of its implications have recently been published.19,20
• Interoperability of two systems implies the existence of a CRM, and vice versa. This CRM must be consistent, including dependencies of comprised entities over all participating federates. It is worth mentioning that the common ground approach described by IST-075 postulates the elements of this 'common ground' as the CRM on the conceptual model level of the interoperability basis. SAS-050 defines a conceptual model that can serve as common ground for C2 in the NNEC context.
• Interoperability of two systems implies mathematical equivalency of their conceptualizations. In other words, interoperability is only given in the intersection of two systems. This is counterintuitive to many current views that assume that through interoperability the union of the provided capabilities becomes available. We therefore need an operational frame that helps to orchestrate individual and independent technical solutions.
• Current interoperability standards are insufficient to capture the data needed to support the new required approaches.20
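A minimal sketch of this formalization may help to make the quadruple concrete. It assumes that entities, properties, symbols, and domains can be represented as strings, and the example concepts (a land unit with a position, reported in different coordinate symbols) are hypothetical, not taken from the cited work.

```python
# Illustrative sketch: a model M_S as the quadruple (alpha, beta, gamma, rho)
# of relations over domains, following the formalization above. The shared
# part of two models is taken as the intersection of their relations,
# reflecting the axiom that interoperability is given in the intersection,
# not the union, of two conceptualizations.
from typing import NamedTuple, FrozenSet, Tuple

class Model(NamedTuple):
    alpha: FrozenSet[Tuple[str, str]]  # entities x domains
    beta: FrozenSet[Tuple[str, str]]   # properties x domains
    gamma: FrozenSet[Tuple[str, str]]  # symbols x domains
    rho: FrozenSet[Tuple[str, str]]    # domains x domains

def common_reference(m1: Model, m2: Model) -> Model:
    """The shared conceptualization (a candidate CRM) of two models."""
    return Model(m1.alpha & m2.alpha, m1.beta & m2.beta,
                 m1.gamma & m2.gamma, m1.rho & m2.rho)

# Hypothetical example: two C2 models sharing the 'unit' entity and its
# 'position' property, but using different coordinate symbols.
m1 = Model(frozenset({("unit", "land"), ("sensor", "land")}),
           frozenset({("position", "land")}),
           frozenset({("MGRS", "land")}),
           frozenset())
m2 = Model(frozenset({("unit", "land")}),
           frozenset({("position", "land")}),
           frozenset({("lat/lon", "land")}),
           frozenset())

crm = common_reference(m1, m2)
# Only the shared concepts survive; the disjoint symbol sets show where
# mediation (e.g. coordinate conversion) is still required.
```

The empty symbol intersection in the example signals that, although the two systems conceptually share a unit with a position, information cannot yet be exchanged without transforming between the two coordinate representations.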
This mathematical approach can be used to capture the syntax and semantics of data, the definition of input and output parameters and their categorization, the definition of the capability(ies) providing functionality, the role of
hidden state parameters, and assertions for all of these elements. Namespaces and data models become comparable, data mediation is supported, and model-based data engineering as envisioned by Tolk and Diallo21 is directly supported. Furthermore, the mathematics has been used to recommend specifications for languages for ambassador agents by Tolk and Diallo.22 Ambassador agents represent simulation services that are candidates to contribute to the solution of a problem. They need to know and express enough about the simulations to negotiate with other ambassador agents whether the represented simulation systems can be composed to contribute to the solution. The interoperability concepts that need to be addressed by engineers when creating a federation need to be addressed by ambassador agents as well; hence, the theory needs to be integrated into a language supporting their task. Currently, the approach is applied on a broad scale for the definition and implementation of Coalition Battle Management Services (CBMS), a technical infrastructure that enables the exchange of orders, reports, and requests between C2 systems, simulation systems, and robotic forces.23 CBMS is a collection of composable web services that can be orchestrated to support the needs of a particular federation. CBMS is implemented as a service-oriented architecture with interrupt mechanisms, filtering mechanisms, and data distribution mechanisms that can be used to support the validation, storage, search, and exchange of Extensible Markup Language (XML)-based languages. These languages include but are not limited to the Coalition Battle Management Language (C-BML)24 and Military Scenario Definition Language (MSDL).25 Extensions to this mathematical approach are presented in Section 5 of this paper.
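To illustrate the kind of XML-based information exchange such web services validate and mediate, the following sketch parses a simple order message. The element and attribute names are hypothetical and are not taken from the actual C-BML or MSDL specifications.

```python
# Minimal illustration of validating and reading an XML-based order message.
# Element and attribute names are hypothetical, NOT actual C-BML or MSDL tags.
import xml.etree.ElementTree as ET

order_xml = """
<OrderMessage>
  <TaskedUnit id="BLUE-1"/>
  <Task action="move"/>
  <Destination lat="52.52" lon="13.40"/>
</OrderMessage>
"""

def parse_order(xml_text):
    """Parse an order and check that the required elements are present."""
    root = ET.fromstring(xml_text)
    required = ["TaskedUnit", "Task", "Destination"]
    missing = [tag for tag in required if root.find(tag) is None]
    if missing:
        raise ValueError(f"invalid order, missing elements: {missing}")
    return {
        "unit": root.find("TaskedUnit").get("id"),
        "action": root.find("Task").get("action"),
        "destination": (float(root.find("Destination").get("lat")),
                        float(root.find("Destination").get("lon"))),
    }

order = parse_order(order_xml)
```

In a real CBMS setting, such validation would be performed against the governing XML schema of the standardized language rather than a hard-coded element list.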
4. Current Network Enabled Capability interoperability and maturity models

In Sections 2 and 3, we established research results and recommendations needed for NNEC in general and net-enabled M&S in particular. Before these different contributions can be merged into the desired IMM, two maturity models that are applied within NATO and that have mutually complementary characteristics need to be introduced first: the LCIM used in MSG activities and the N2C2M2 used in SAS and HFM activities. This paper uses the LCIM to focus on the technical interoperability aspects and the N2C2M2 to address the operational interoperability aspects. It will furthermore show that both models are actually two views on the same challenge: enabling NNEC in general and applying net-enabled M&S in support thereof.

4.1 Levels of Conceptual Interoperability Model

Computer science and engineering in particular have a tradition of using layered models to better understand the concepts underlying successful interoperation. One very successful example is the International Organization for Standardization (ISO)/Open System Interconnect (OSI) reference model that introduced seven layers of interconnection, each with well-defined protocols and responsibilities.26 Based on experiences in using these models, several researchers introduced layered models to better understand the theoretical underpinnings of interoperation for NNEC as well. The LCIM has been introduced as a reference model with well-defined layers of interoperation to better deal with challenges of interoperability of simulation systems and composability of simulation models. The LCIM evolved over several years and various applications in different application domains, including ontology alignment27 and intelligent power systems.7 The current version of the LCIM exposes six layers of interoperation as follows.

• The technical layer deals with infrastructure and network challenges, enabling systems to exchange carriers of information. This is the domain of integratability.
• The syntactic layer deals with challenges to interpret and structure the information to form symbols within protocols. This layer belongs to the domain of interoperability.
• The semantic layer provides a common understanding of the information exchange. On this level, the pieces of information that can be composed to objects, messages, and other higher structures are identified. It represents the aligned static data.
• The pragmatic layer recognizes the patterns in which data are organized for the information exchange, which are, in particular, the inputs and outputs of procedures and methods to be called. This is the context in which data are exchanged as applicable information. These groups are often referred to as (business) objects. It represents the aligned dynamic data.
• The dynamic layer recognizes various system states, including the possibility for agile and adaptive systems. The same business object exchanged with different systems can trigger very different state changes. It is also possible that the same information sent to the same system at different times can trigger different responses.
• The conceptual layer captures assumptions, constraints, and simplifications. This layer represents the harmonized data.
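Because the LCIM layers build on one another, they can be treated as an ordered scale. The following sketch, under that assumption, encodes Level 0 plus the six layers as an ordered enumeration so that an achieved level can be compared against a required one; the function name is illustrative.

```python
# Illustrative sketch: the LCIM as an ordered scale (Level 0 plus six layers).
from enum import IntEnum

class LCIM(IntEnum):
    NONE = 0        # no interoperability
    TECHNICAL = 1   # carriers of information exchanged
    SYNTACTIC = 2   # common structure and protocols
    SEMANTIC = 3    # shared meaning (aligned static data)
    PRAGMATIC = 4   # shared context of use (aligned dynamic data)
    DYNAMIC = 5     # shared understanding of state changes
    CONCEPTUAL = 6  # assumptions and constraints captured (harmonized data)

def meets(achieved: LCIM, required: LCIM) -> bool:
    # LCIM levels accumulate: reaching a layer implies the layers below it.
    return achieved >= required

print(meets(LCIM.SEMANTIC, LCIM.SYNTACTIC))  # True
```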
Tolk et al.
Figure 4. The Levels of Conceptual Interoperability Model.
Contributing to the improvement of an earlier version of the LCIM that was very data centric, Page et al.28 proposed to clearly distinguish between the three governing concepts of interoperation as follows.

• Integratability contends with the physical/technical realms of connections between systems, which include hardware and firmware, protocols, networks, etc.
• Interoperability contends with the software and implementation details of interoperations; this includes exchange of data elements via interfaces, the use of middleware, mapping to common information exchange models, etc.
• Composability contends with the alignment of issues on the modeling level. The underlying models are purposeful abstractions of reality used for the conceptualization being implemented by the resulting systems.
The current technical activity MSG-086 on Simulation Interoperability explicitly mentions the use of the LCIM as a reference model. Figure 4 shows the LCIM, including the three governing concepts. In summary, successful interoperation of solutions requires integratability of infrastructures, interoperability of systems, and composability of models. While integratability and interoperability address the technical interoperability challenges, composability addresses the operational interoperability aspects, but needs to be enhanced by the pragmatics of the systems' application, as supported by the N2C2M2 framework.
4.2 NNEC Command and Control Maturity Model

While the ISO/OSI model is the prototypical example of layered models supporting technical interoperability questions, the Capability Maturity Model (CMM), developed by Carnegie Mellon University,29 is the example for maturity modeling. The approach is the definition of metrics, providing a progression of measures. The general idea behind maturity models is that providing a certain capability to the enterprise based on the alignment and orchestration of individually and independently developed partial solutions is not a cookie-cutter function. The process of aligning and orchestrating these solutions regarding the business model to be supported can happen gradually, passing different levels of maturity. For NNEC, the Allied Command Transformation defined transformational maturity levels to measure to what degree the vision of NNEC is already supported by a given solution.3 While the LCIM measures the technical interoperability maturity, the transformational maturity levels measure the operational interoperability maturity. Each maturity level is associated with a set of required capabilities that can be observed – if the maturity model is used to describe a certain state – or that can be required – if the maturity model is used to prescribe certain abilities – similar to the use of the LCIM described by Wang et al.30 The maturity levels identified and applied to measure NNEC maturity are measured by the resulting category of NNEC capability levels supporting certain kinds of operations, which are:

• stand alone or disjoint operations;
• de-conflicted operations;
• coordinated operations;
• collaborative or integrated operations;
• coherent or transformed operations.
While the NNEC capability levels define the operational outcome or observations, the C2 maturity level measures the C2 approaches. They were the focus of another NATO RTO SAS technical activity – SAS-065 – whose results are summarized by Alberts et al.8 These levels are also characterized by exposed C2 capabilities, but the model also addresses the ability to move between the levels as required. The identified C2 approaches and their characteristics are shown in Figure 5. The number of C2 approaches within each of the NNEC maturity levels and the ability to transition between possible C2 approaches are the two main contributors to the N2C2M2. Therefore, the N2C2M2 helps one to understand what kind of operations can be supported by what kind of C2 approaches that are available or can be transformed accordingly. To better understand this concept, the NNEC maturity levels shall be evaluated in greater detail.

Figure 5. Command and Control (C2) approaches (Alberts et al.,8 p.64).

• Stand alone operations are conducted when each participant exclusively focuses on their own resources and capabilities to reach their own objectives, as if no other participant were present. Therefore, conflicts between participants are the rule. Their objectives are potentially mutually exclusive. Plans and execution will compete with each other.
• De-conflicted operations ensure that organizations avoid interfering with one another. Areas of responsibility and other geographic organization features support this way of doing business. The orchestration is supported by special personnel at intersection points, liaison officers, and the like. C2 is still very hierarchical; decision rights are well defined and connected to the responsibility sections. Orchestration is limited to synchronizing operations in space and time, using phase lines, synchronization points, etc. The potentials of NNEC are not utilized.
• Coordinated operations require joint planning driven by a shared intent. The synchronized plan allows decision making on lower levels, which is supported by horizontal links between the various C2 components. However, each component is still acting on the participant's behalf. The execution of the plan remains the responsibility of the local component. Coordinated operations require a common intent and a common awareness, supported by broader information access to shared sensors and a common picture.
• Collaboration requires sharing not only the planning process, but also the execution process. Shared situational awareness supported by joint common operational pictures requires a unifying infrastructure that integrates the heterogeneous contributions of all partners. Information shall be shared seamlessly. While the execution process is still limited to the component in the coordinated approach, the collaborative approach synchronizes planning and execution vertically and horizontally.
• Coherent operations are characterized by rapid and agile decision-making processes based on seamless and transparent information sharing. Decision makers have access to all information they need to make the decision, regardless of where they are, which systems they use to gain access to the information, or where the information came from. Future 'Edge Command and Control' will support these operations.
As already pointed out in the work of MSG-062,9 net-enabled M&S services can support reaching higher levels of maturity. In particular, the increased use of simulation to provide evaluation of alternative courses of action helps to support shared planning and execution.
4.3 LCIM and NNEC maturity levels

The different NNEC maturity levels address different levels of operational and cognitive capability and require different levels of technical and operational interoperability standards and targets for adaptation. SAS-065 mapped the C2 maturity levels to NNEC capability levels based on the respective transformational maturity level. The LCIM can be used to address the technical interoperability standards and targets for adaptation required for the NNEC
capability. While it is unfortunate that the LCIM uses the term technical interoperability in a formally descriptive way for its Level 1, in support of NNEC capabilities the term is used more colloquially, and its use in this paper should be clear from context. The mapping described in this section is qualitative, but using the approach discussed in the final section of this paper, quantitative metrics can and should be applied as well.

• Stand alone or disjointed operations do not require any coordination of C2 and therefore also no interoperability of participating or supporting systems. Therefore, conflicted C2 is defined by LCIM Level 0: no interoperability.
• De-conflicted operations require de-conflicted C2 support. To support de-conflicted C2 on the IT side, input and output parameters describing the operational responsibilities need to be shared, which is normally done via messages. This requires technical and syntactic interoperability with the support of semantic interoperability for the common elements that describe the domains of responsibility (including the spatio-temporal constraints). In other words, the syntax and semantics of the messages shared via a common technical protocol must be understood on the sending and receiving sides, so that de-conflicted C2 is defined by LCIM Level 1: technical interoperability, LCIM Level 2: syntactic interoperability, and LCIM Level 3: semantic interoperability for the shared information.
• Coordinated operations require coordinated C2, which means the coordination of processes and plans. On the IT side, this translates into the requirement for semantic interoperability, so that linked plans can be communicated unambiguously, and pragmatic interoperability to understand the connection between desired capability and required input parameters – addressed as business objects in the LCIM. It is not sufficient to understand the meaning of the shared information; the context in which it is shared and how it is utilized must be understood as well. This represents the 'common ground' described in Section 2.3. Coordinated C2 is therefore defined by LCIM Level 3: semantic interoperability and LCIM Level 4: pragmatic interoperability.
• Collaborative or integrated operations require collaborative C2 that supports shared planning and execution. On the IT side, this requires full pragmatic interoperability and dynamic interoperability for shared functions. A common conceptual model of how the systems and the embedding situation change is required, such as that defined by the C2 approach described in Section 2.4. Collaborative C2 is therefore defined by LCIM Level 4: pragmatic interoperability and LCIM Level 5: dynamic interoperability.
• Coherent or transformed operations require the Edge C2 system8 that supports all options and horizontal and vertical interoperability. On the IT side, all levels of the LCIM need to be fully supported, including the annotation of services with assertions, as described in Section 3 of this paper. Edge C2 requires LCIM Level 6: conceptual interoperability, representing a common world view, given by the world knowledge defined by IST-075, represented with the same rigor as the conceptual model defined by SAS-050.
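The qualitative mapping above can be written down directly as a lookup. The following sketch pairs each C2 approach with the LCIM level numbers the text assigns to it; the dictionary and function names are assumptions for illustration.

```python
# Sketch of the qualitative C2-approach-to-LCIM mapping described above.
# Level numbers are taken from the bullets in this section.
C2_TO_LCIM = {
    "Conflicted C2":    [0],
    "De-Conflicted C2": [1, 2, 3],
    "Coordinated C2":   [3, 4],
    "Collaborative C2": [4, 5],
    "Edge C2":          [6],
}

def required_levels(c2_approach):
    """Return the LCIM levels a given C2 approach requires."""
    return C2_TO_LCIM[c2_approach]

print(max(required_levels("Collaborative C2")))  # 5
```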
Based on this result, a formal approach for the LCIM directly supports NNEC maturity as well, at least on the technical side of capabilities and interoperability challenges. Figure 6 exemplifies the relation between the C2 approaches captured by the N2C2M2 and the LCIM. The interested reader is furthermore referred to Winters et al.,31 where the LCIM is related in a similar manner to the Levels of Information Systems Interoperability (LISI)32 and the Organization Interoperability Maturity Model (OIMM).33
5. From layers of interoperation to interoperability maturity metrics

Having established that various NNEC capability levels require different interoperability standards that can be measured qualitatively by the LCIM, this last section of the paper presents a method, based on an extension of the mathematics presented in Section 3 and described in detail by Diallo,14 to establish metrics that measure the alignment of two systems quantitatively as well.
5.1 A formal approach to describe the LCIM

To frame the proposed formal approach, we will first provide an informal description that highlights key elements for consideration. Consider the generic model representation shown in Figure 7. This representation shows artifacts common to these systems:

(1) the service/function at any given moment exists in a certain state (Sin) that is needed to provide the desired capability;
(2) the service/function receives inputs (IN) from outside itself (e.g. the environment, other systems, etc.);
Figure 6. Mapping Net Enabled Capability Command and Control Maturity Model levels and Levels of Conceptual Interoperability Model levels (C2: Command and Control): Conflicted C2 – Level 0: No Interoperability; De-Conflicted C2 – Level 1: Technical Interoperability and Level 2: Syntactic Interoperability; Coordinated C2 – Level 3: Semantic Interoperability; Collaborative C2 – Level 4: Pragmatic Interoperability; Edge C2 – Level 5: Dynamic Interoperability and Level 6: Conceptual Interoperability.
Figure 7. Generic model representation. M&S: Modeling and Simulation.
(3) the state changes (Sout) in response to internal methods and procedures, Sin and IN;
(4) output (OUT) is produced in response to internal methods and procedures, Sin (possibly Sout) and IN.
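The four artifacts can be sketched as a tiny class: a service holds a state (Sin), and each invocation consumes inputs (IN), produces outputs (OUT), and yields the successor state (Sout). The class and field names are illustrative assumptions, not part of the formalism itself.

```python
# Minimal sketch of the generic service/function representation of Figure 7:
# state in (Sin), inputs (IN), state out (Sout), outputs (OUT).
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Service:
    state: Any                          # Sin, the current state
    step: Callable[[Any, Any], tuple]   # (Sin, IN) -> (Sout, OUT), artifacts (3) and (4)

    def invoke(self, inputs):
        self.state, out = self.step(self.state, inputs)
        return out

# assumed toy example: state counts invocations, output echoes input plus state
svc = Service(state=0, step=lambda s, i: (s + 1, s + i))
print(svc.invoke(10))  # 10 (Sout becomes 1)
print(svc.invoke(10))  # 11 (the same input yields a different output)
```

The second call returning a different output for the same input is exactly the behavior the dynamic layer of the LCIM must capture.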
Recognizing the use in practice of these distinct symbols (Sin, Sout, IN, and OUT) identifies several domains, D, of interest that must be explicitly identified, namely D(S*), D(I*), and D(O*). For practitioners, it may be of value to mention that in web-based environments such domains are often implemented as namespaces. It also is obvious that where S* is the state set, I* is the input set, and O* is the output set, the values Sin, Sout, IN, and OUT exist within the domains of their respective sets.

Figure 8. Constraint model validity.

Using these extensions to describe a system and its functionality beyond represented entities, Diallo's formalism
now can be used to formulate the system as a tuple. It is an implicit assumption that the state set of a service will map onto itself within this context. However, recognizing that the state does and will change (referenced in the pragmatic layer), the obvious question is: how? In the realm of computer science, it is through the service implemented by computable functions. The set of functions, F*, that implement the service also has a domain, D(F*), comprising the labels used to address the functions. Armed with this fundamental description of artifacts, consider key components embedded within the LCIM. Level 1 deals with technical feasibility. The syntactical layer (Level 2) deals with understanding the data format and structure. It is at the semantic level (Level 3) that understanding of that data is exchanged, that is, terms are defined. At the pragmatic level (Level 4) the context of the data is understood; in particular, methods, processes, and the dynamic data are understood. At the dynamic level (Level 5), behavior is understood through the states the system can take. Finally, at the conceptual level (Level 6), all assumptions, constraints, and simplifications are captured. The work of Diallo provides a starting place for the discourse. In the theoretical framework he provided, a model M of a conceptualization T, denoted MT, is the relation (α, β, γ, ρ). Each of the tuple members α, β, and γ belongs to a set of symbols (entity, property, and symbol, respectively). Syntactic interoperability ensures that the data that represent the symbols are passed. In general, the set of symbols is Σ*. The semantic level requires that these data be understood. Therefore, their meaning and relationships to the entities and properties within the model are required. Thus, Diallo provides the foundation for semantic interoperability through his definition of a model. In particular, I*, O*, D(I*), and D(O*) must all be known.
It is at the pragmatic and dynamic levels that Diallo's work must be extended. At these layers, it is not enough to understand the model with respect to its entities, properties, and symbols and the relations between them. Which of these elements change, and the manner and mechanisms of those changes, must also be understood. This motivates the needed addition of F* and D(F*) at the pragmatic layer and of S* and D(S*) at the dynamic layer. Finally, the work of King13 demonstrates that the constraints on each of (Σ*, D*, I*, O*, F*, S*) must be explicit for a complete picture. This is summarized in Table 1. These concepts can be illustrated with two simple examples. While not a formal proof for representation of the concepts identified in Table 1, these examples can motivate the need for the proposed extensions and some of the implications to be discovered.
5.2 Motivating constraints and states

For the purposes of discussion, consider the set of real numbers, R. Define two sets X ⊂ R and Y ⊂ R, where
Table 1. Representation requirements.

LCIM level: Formal representation required*
Technical level: No formal representation
Syntactical level: Σ* set of symbols
Semantic level: D* set of domains (I, O); I* set of inputs; O* set of outputs
Pragmatic level: D* set of domains (F); F* set of functions
Dynamic level: D* set of domains (S); S* set of states
Conceptual level: Constraints on (Σ*, D*, I*, O*, F*, S*)

*Levels of Conceptual Interoperability Model (LCIM) levels accumulate the formal representation required.
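The accumulating artifacts of Table 1 can be collected into one record per model-based service, so that two services can be compared level by level. In this sketch the field names, the example content, and the use of Σ and D as the symbol and domain sets are assumptions for illustration.

```python
# Hedged sketch: one record holding the formal artifacts of Table 1
# for a model-based service, enabling level-by-level comparison.
from dataclasses import dataclass, field

@dataclass
class ServiceDescription:
    symbols: set = field(default_factory=set)      # syntactic level (symbol set)
    inputs: set = field(default_factory=set)       # semantic level, I*
    outputs: set = field(default_factory=set)      # semantic level, O*
    functions: set = field(default_factory=set)    # pragmatic level, F*
    states: set = field(default_factory=set)       # dynamic level, S*
    constraints: set = field(default_factory=set)  # conceptual level

# assumed example descriptions of two services
a = ServiceDescription(symbols={"track", "order"}, inputs={"track"})
b = ServiceDescription(symbols={"track"}, inputs={"track"})
print(a.symbols & b.symbols)  # {'track'}: the shared syntactic ground
```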
X = {x | 0 ≤ x ≤ 100} and Y = {y | 0 ≤ y ≤ 100}. Further, define a real valued function, y = f(x). In this context, the set of inputs and the set of outputs is R. The domain of inputs is represented by X and the domain of outputs is represented by Y. Considering various choices for f(x), we can motivate considerations for and the importance of elements proposed for the pragmatic, dynamic, and conceptual levels. As a first example, to motivate the necessity for constraints, consider the three functions shown in Figure 8: y = f(x) = x, y = g(x) = 2x, and y = h(x) = x/2. Each of these choices has a different constraining effect on the domain of either the input or the output sets beyond that already prescribed.

• It is clear that f(x) has no constraining effect on either the input or output sets. That is, all outputs y ∈ Y can be realized from the input set (and in fact from one and only one value x ∈ X); further, all inputs x ∈ X result in an output y that exists within the domain of Y.
• In contrast, y = g(x) = 2x results in either a constraint on the allowable input set or necessitates a change in the function (perhaps through error checking control mechanisms) to prevent outputs that are outside of the acceptable domain. That is, only the inputs within Xv = {x | 0 ≤ x ≤ 50} ⊂ X result in outputs y that exist within the domain of Y. For x ∈ ¬Xv = {x | 50 < x ≤ 100}, outputs are outside the domain of Y. This could be mitigated by revising g(x) to gv(x) = {2x for 0 ≤ x ≤ 50; 100 for 50 < x ≤ 100}. It is only through explicit representation of this revision that the lack of differentiation in output response to inputs x ∈ ¬Xv is exposed.
• The function y = h(x) = x/2 creates a constraining effect on the output, which is perhaps more compelling. In this example the set of outputs that is accessible within the model is restricted. That is, for all x ∈ X the output exists in y ∈ Ya = {y | 0 ≤ y ≤ 50} ⊂ Y. The implication of this is that no matter which inputs x ∈ X are chosen, y ∈ ¬Ya = {y | 50 < y ≤ 100} cannot be realized, even if the model subscribes to these outputs. This poses issues if the model is chosen for its ability to subscribe to these outputs. Thus, mere subscription within a domain of output is insufficient.
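The constraining effects of the three example functions can be checked directly. This sketch uses integer samples of X and Y and assumes h is the halving function implied by the restricted output set; the variable names follow the text.

```python
# The example's domains sampled over the integers.
X = range(0, 101)          # X = {x | 0 <= x <= 100}
Y = set(range(0, 101))     # Y = {y | 0 <= y <= 100}

f = lambda x: x            # no constraining effect
g = lambda x: 2 * x        # constrains the allowable inputs
h = lambda x: x // 2       # restricts the reachable outputs (integer halving)

print(all(f(x) in Y for x in X))            # True: f maps X into Y everywhere
print([x for x in X if g(x) not in Y][:3])  # [51, 52, 53]: inputs g cannot accept
print(max(h(x) for x in X))                 # 50: outputs above 50 are unreachable
```

The second and third printed results are precisely the input constraint on g and the output restriction of h discussed above.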
The implication of these observations is significant when implementing models. Conceptually identifying, even formally, the desirable domains for inputs and outputs is insufficient to understand whether the full set of inputs can be used or the full set of outputs is realizable in the implementation of the model. Knowledge of the functions acting on these inputs, from which the outputs are derived, is necessary. The problems of not exposing the functionality are exacerbated when combining models is required. As a second example, to motivate the addition of states into the formalism, consider a recursive function on S = {s | 0 ≤ s ≤ 100}, s_{n+1} = f(s_n). Defining s_{n+1} = f(s_n) = exp(−s_n) eventually converges to a single element within S for all s_0 ∈ S. Defining s_{n+1} = f(s_n) = 1 + s_n, clearly for all s_0 ∈ S there exists an n that will lead to s_{n+1} ∉ S. This observation illustrates that feasibility imposes constraints on the set of f(s_n) and that the selection of f(s_n) may constrain the domain of S based on accessibility or initialization values.
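The two recursive state functions can be iterated numerically. This sketch, with an assumed step budget and helper name, shows exp(−s) settling on a fixed point inside S while 1 + s eventually leaves S.

```python
# Iterate s_{n+1} = f(s_n) on S = {s | 0 <= s <= 100}, reporting None
# if the state ever leaves S.
import math

def iterate(f, s0, s_max=100, steps=200):
    s = s0
    for _ in range(steps):
        s = f(s)
        if not (0 <= s <= s_max):
            return None  # state left the domain S
    return s

print(round(iterate(lambda s: math.exp(-s), 100.0), 3))  # 0.567: converges in S
print(iterate(lambda s: 1 + s, 0.0))                     # None: leaves S
```

The fixed point near 0.567 (where s = exp(−s)) illustrates convergence; the second call shows the feasibility constraint the text describes.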
5.3 Metrics for levels of interoperability

These simple examples demonstrate the need to expand the formalism developed by Diallo14 and some of the possible directions. Further, it has been shown that interoperability only exists in the intersection, as proved for at least the semantic level by Diallo.14 In addition, King13 showed that overlapping constraints are necessary for valid interoperability, although they may not be sufficient. Therefore, once we have formalized our models, we can measure whether the intersection is non-empty. If this is the case, the cardinality of the intersection compared to the cardinality of the sets themselves, as well as other assessments, determines how well aligned our models are. Further, in extending the formalism of Diallo to include the functional and state aspects of our services/functions, we can hopefully glean additional insights into the limits of the use of those systems. It has already been shown that it is not sufficient to specify a common data exchange model. We need standards that fully align our conceptualizations in machine-readable form on all levels of interoperation.20 The main challenge is neither the LCIM nor the formalism but annotating the model-based services with metadata. For a given problem, such as answering a research question, supporting a training exercise, or comparing alternative courses of action, these metadata shall allow one to identify applicable solutions that can contribute to solving the problem, to select the best set of solutions, to compose these solutions meaningfully, and to orchestrate their execution when solving the problem. The LCIM supports all phases.

• For all model-based solutions, the elements listed in Table 1 need to be captured. This includes symbols used, domains used, inputs and outputs, functions, and states.
• Namespaces capture not only the terms used for inputs, outputs, functions, and states, but also their conceptual structure, as exemplified using model-based data engineering by Tolk et al.19 for inputs and outputs.
• If communities of interest can agree on common terms and standardize them, including the conceptual structure of data and the algorithmic structure of functions, the metadata are reduced significantly to references to these terms.
• The evaluation of model-based solutions is then reduced to the comparison of tuples as defined in Table 1.
• The complexity of these formal system descriptions can still be significant, as already discussed by Page and Opper.15 However, even if no general solutions are possible, as some of the mapping challenges are generally not decidable, the formal description builds a solid foundation for heuristics that can support the engineering of solutions for solvable cases.
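The proposed quantitative measure, the cardinality of the intersection relative to the sets themselves, can be sketched as a simple ratio. The Jaccard-style normalization and the example parameter names are assumptions; the paper prescribes only that an empty intersection means no valid interoperation and that larger overlap means better alignment.

```python
# Hedged sketch of an alignment metric for two formalized artifact sets:
# 0.0 for an empty intersection, up to 1.0 for identical sets.
def alignment(set_a, set_b):
    if not set_a and not set_b:
        return 1.0  # nothing to align
    common = set_a & set_b
    if not common:
        return 0.0  # empty intersection: no valid interoperation
    return len(common) / len(set_a | set_b)

# assumed example: input sets of two services to be federated
inputs_a = {"position", "velocity", "id"}
inputs_b = {"position", "id", "heading"}
print(alignment(inputs_a, inputs_b))  # 0.5: two shared of four distinct terms
```

Applying the same ratio per row of Table 1 yields one score per LCIM level, which is the quantitative refinement of the qualitative mapping in Section 4.3.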
The approach described in this section needs to be embedded as the technical interoperability foundation into the N2C2M2, as described in Section 4, to cover the whole spectrum of technical and operational interoperability challenges identified previously.
6. Summary

NNEC addresses technical and cognitive abilities of NATO requiring technical and operational interoperability standards and targets for adaptation. Neither stove-piped approaches, nor monopolies, nor overly rigid mandates can support the ideas presented in this paper. Technical proposals cannot solve conceptual problems, but conceptual solutions require a solid technical foundation. Only a framework that addresses all levels of interoperation required for successful NNEC can address the challenges successfully. This paper showed that solutions on different levels already exist and can be merged into the LCIM using mathematics recently developed in academic research. The resulting IMM enhances the N2C2M2 by tying conceptual ideas and technical foundations closer together,
Tolk et al. allowing the community to address all challenges and accordingly enabling flexible, adaptable, and agile solutions for NATO and partners. The research group is currently working on embedding all of these related concepts into one coherent theory rooted in the mathematics of model theory. First results are well aligned with the engineering principles described in this paper. Funding
159
14
15
16
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors. References 1 MCM-0032-2006. NATO Network Enabled Capability (NNEC) vision and concept. 19 April 2006. 2 Tolk A, Diallo SY, Turnitsa CD and Winters LS. Composable M&S web services for net-centric applications. J Defense Model Simulat 2006; 3: 27-44. 3 Kanis GJ and van Ettinger MR. Operational assessment of a NATO response force. In: Proceedings of Human Factors and Medicine Panel Symposium in Adaptability in Coalition Teamwork, RTO-MP-HFM-142, Paper 21, Copenhagen, Denmark, 2008. 4 NATO code of best practice for C2 assessment. Revised ed. Washington, DC: Command and Control Research Program (CCRP) Press, 2002, 5 Pawlowski TJ (ed.). Command, control, communications, intelligence, electronic warfare measures of effectiveness. Military Operations Research Society Workshop Final Report, Fort Leavenworth, KS, 1992. 6 Tolk A, Diallo SY and Turnitsa CD. Applying the levels of conceptual interoperability model in support of integratability, interoperability, and composability for system-of-systems engineering. J Syst Cybern Inf 2007; 5: 65-74. 7 Ambrosio R and Widergren SE. A framework for addressing interoperability issues. In: Proceedings of the 2007 IEEE Power and Energy Society (PES) General Meeting, Tampa, FL, 2007. 8 Alberts DS, Huber RK and Moffat J. NATO NEC C2 maturity model. Washington, DC: Command and Control Research Program (CCRP) Press, 2010. 9 Guide to modelling & simulation (M&S) for NATO networkenabled capability (‘‘M&S for NNEC’’). RTO Technical Report, TR-MSG-062 Final Report Brussels, Belgium, February 2010. 10 Semantic interoperability. RTO Technical Report, TR-IST075 Final Report, Brussels, Belgium, July 2010. 11 Exploring new command and control concepts and capabilities. RTO Technical Report, TR-SAS-050 Final Report, Brussels, Belgium, April 2007. 12 Human systems integration for network centric warfare. 
RTO Technical Report, TR-HFM-155 Final Report, Brussels, Belgium, February 2010. 13 King RD. On the role of assertions for conceptual modeling as enablers of composable simulation solutions. PhD
17
dissertation, Batten College of Engineering, Old Dominion University, Norfolk, VA, 2009.
14 Diallo SY. Towards a formal theory of interoperability. PhD dissertation, Batten College of Engineering, Old Dominion University, Norfolk, VA, 2010.
15 Page EH and Opper JM. Observations on the complexity of composable simulation. In: Farrington PA, Nembhard HB, Sturrock DT and Evans GW (eds) Proceedings of the 1999 Winter Simulation Conference, ACM Press, New York, 1999, pp.553–560.
16 Yilmaz L and Oren T. Exploring agent-supported simulation brokering on the semantic web: foundations for a dynamic composability approach. In: Ingalls RG, Rosetti MD, Smith JS and Peters BA (eds) Proceedings of the 2004 Winter Simulation Conference, ACM Press, New York, 2004, pp.766–773.
17 Davis PK and Anderson RA. Improving the composability of DoD models and simulations. J Defense Model Simulat 2004; 1: 5–17.
18 Spiegel M, Reynolds PF and Brogan DC. A case study of model context for simulation composability and reusability. In: Proceedings of the 2005 Winter Simulation Conference, ACM Press, New York, 2005, pp.437–444.
19 Tolk A, Diallo SY, King RD, Turnitsa CD and Padilla JJ. Conceptual modeling for composition of model-based complex systems. In: Robinson S, Brooks R, Kotiadis K and van der Zee D-J (eds) Conceptual modeling for discrete-event simulation. Boca Raton, FL: CRC Press, 2010, pp.355–381.
20 Diallo SY, Padilla JJ and Tolk A. Why is interoperability bad: towards a paradigm shift in simulation composition. In: Proceedings of the Fall Simulation Interoperability Workshop, Orlando, FL, 20–24 September 2010.
21 Tolk A and Diallo SY. Model-based data engineering for web services. IEEE Internet Comput 2005; 9: 65–70.
22 Tolk A and Diallo SY. Using a formal approach to simulation interoperability to specify languages for ambassador agents. In: Proceedings of the Winter Simulation Conference, Baltimore, MD, December 2010, pp.359–370.
23 Diallo SY, Tolk A, Graff J and Barraco A. Using the levels of conceptual interoperability model and model-based data engineering to develop a modular interoperability framework. In: Proceedings of the Winter Simulation Conference, Phoenix, AZ, December 2011 [in press].
24 Simulation Interoperability Standards Organization (SISO). Coalition Battle Management Language (Trial Use Version), 26 January 2011.
25 Simulation Interoperability Standards Organization (SISO). Military Scenario Definition Language, Released Standard SISO-STD-007-2008, 14 October 2008.
26 International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) 10731:1994. Information technology – open systems interconnection – basic reference model – conventions for the definition of OSI services. Geneva: ISO Press, 1994.
27 Dobrev P, Kalaydjiev O and Angelova G. From conceptual structures to semantic interoperability of content. In: Conceptual structures: knowledge architectures for smart applications. Berlin: Springer, 2007, pp.192–205.
28 Page EH, Briggs R and Tufarolo JA. Toward a family of maturity models for the simulation interconnection problem. In: Proceedings of the Spring Simulation Interoperability Workshop, IEEE CS Press, 2004.
29 Paulk MC, Weber CV, Curtis B and Chrissis MB. The capability maturity model: guidelines for improving the software process. Reading, MA: Addison-Wesley Professional, 1994.
30 Wang W, Tolk A and Wang W. The levels of conceptual interoperability model: applying systems engineering principles to M&S. In: Proceedings of the Spring Simulation Multiconference, San Diego, CA, ACM Simulation Series 41, 2009, pp.375–384.
31 Winters LS, Gorman MG and Tolk A. Next generation data interoperability: it’s all about metadata. In: Proceedings of the Fall Simulation Interoperability Workshop, Paper 06FSIW-059, Orlando, FL, September 2006.
32 US DoD, OSD (C3I), CIO, Director for Architecture and Interoperability. Levels of information systems interoperability. C4ISR Architectures Working Group, 30 March 1998.
33 Clark T and Jones R. Organisational interoperability maturity model for C2. In: Proceedings of the 4th Command and Control Research and Technology Symposium, US Naval War College, Rhode Island, 1999.
Author Biographies

Andreas Tolk is Professor of Engineering Management and Systems Engineering at Old Dominion University, Norfolk, Virginia. He holds a PhD and an MS, both in Computer Science. He served in several NATO technical working groups, mainly supporting the NATO Modeling and Simulation Group (MSG) and the System Analysis and Studies (SAS) Panel. He also supports the Command and Control Research Program. His research interests are interoperability and composability in systems of systems and M&S-based systems engineering.

Lisa Jean Bair is Chief Architect at Weisel Science and Technology, Corp. She is a PhD candidate in the Modeling, Simulation, and Visualization Engineering (MSVE) program of Old Dominion University. Her PhD research focuses on developing mathematics to better understand the concepts of interoperability and composability of M&S. She holds an MS in Operations Research.

Saikou Y Diallo is Research Assistant Professor at the Virginia Modeling, Analysis and Simulation Center (VMASC) in Suffolk, Virginia. He chairs the Interoperability and Composability track of the research center. He holds a PhD in Modeling and Simulation and an MS in Computer Engineering. His research deals with complexity and computability challenges of mathematics-based tools supporting the evaluation of interoperability of model-based systems.