Services Oriented Architecture (SOA) based Persistent ISR Simulation System

Genshe Chen∗a, Erik Blaschb, Dan Shena, Huimin Chenc, Khanh Phamd

aDCM Research Resources, LLC, 14163 Furlong Way, Germantown, MD, USA 20874
bAir Force Research Laboratory, Sensors Directorate, Wright-Patterson AFB, OH, USA 45433
cUniversity of New Orleans, New Orleans, LA, USA 70148
dAir Force Research Laboratory, Space Vehicles Directorate, Kirtland AFB, NM, USA 87117

ABSTRACT

In the modern networked battlefield, network centric warfare (NCW) scenarios need to interoperate between shared resources and data assets such as sensors, UAVs, satellites, ground vehicles, and command and control (C2/C4I) systems. By linking and fusing platform routing information, sensor exploitation results, and databases (e.g., Geospatial Information Systems [GIS]), shared situation awareness and mission effectiveness can be improved. Within the information fusion community, various research efforts are looking at open-standard approaches to composing heterogeneous network components under one framework for future modeling and simulation applications. By utilizing open source services oriented architecture (SOA) based sensor web services and GIS visualization services, we propose a framework that enables fast prototyping of intelligence, surveillance, and reconnaissance (ISR) system simulations to determine an asset mix for a desired mission effectiveness, performance modeling for sensor management and prediction, and user testing of various scenarios.

Keywords: SOA web services, information fusion, performance metrics, 3D simulation, open source, ISR, GIS visualization.

1. INTRODUCTION

Future advances in Intelligence, Surveillance and Reconnaissance (ISR) for multi-layered defense depend largely on the foundation of Network Centric Warfare (NCW) that enables collaborative usage and control of distributed warfare assets for time-critical operations. Selecting the best sensor/platform assets among geographically distributed ISR units can improve timely situation interpretation (e.g. target tracks) and achieve effective utilization of ISR constellation resources. Various complex dynamic simulators exist today, such as Satellite Toolkit; however, most often the inputs and outputs of the simulation are confined to the particular settings of the developer software tools and not standardized with cross-platform interoperability. In addition, the evaluation metrics generally do not consider constellations of different types of sensors on different types of platforms. As the military deploys many different types of platforms (high-altitude UAVs, manned low-flying RC-12s, unattended stationary sensors, etc.) with different types of sensors, the current methods to examine how to most effectively employ ISR assets (which are generally targeted at specific sensors or platforms) may become inadequate. In this paper, we present an open-standard framework to provide the user with capabilities for developing ISR simulations with the following objectives:
• Build the test bed for different ISR scenarios.
• Build the open source ISR system simulation environment.
• Build the virtual 3-D graphical simulation system in real time.
• Build an interactive virtual feedback system during scenario display.
• Build the simulation from the space satellite to the ground-based battlefield environment.

∗ [email protected]. This work was supported in part by the US Air Force under contracts FA9453-09-M-0061 and FA8650-09-M-1552.

Ground/Air Multi-Sensor Interoperability, Integration, and Networking for Persistent ISR, edited by Michael A. Kolodny, Proc. of SPIE Vol. 7694, 76941D · © 2010 SPIE · CCC code: 0277-786X/10/$18 · doi: 10.1117/12.849783

First, we have developed the overall system architecture of our proposed ISR simulation environment. The system integrates various sensor models, sensor web based components, and ISR algorithms, and provides 3D simulation and visualization. The initial prototype of the simulator can create, update, and maintain all the relevant components of an integrated airborne picture. Second, by following the services oriented architecture (SOA) standard, an open-standard sensor web system is designed with the necessary software components, programming language, and sensor web query access for cross-platform simulation. We adopted the Sensor Web Enablement (SWE) standard defined by the OpenGIS Consortium (OGC), including a set of specifications such as SensorML, Sensor Observation Service (SOS), and Sensor Planning Service (SPS). Finally, we constructed a 3D visualization tool by integrating sensor constellation and data fusion algorithms, performance metrics, and a geospatially enabled performance score display. The current simulator is able to provide target detection and data retrieval via the SWE, the suite of standards that enables the realization of sensor webs. Our simulator can be easily expanded, upgraded, and transitioned to large-scale human-in-the-loop ISR mission studies. Each algorithm or component works as a plug-and-play unit of the simulator, and each algorithm that we have developed so far can be modified or replaced by a third-party software developer following the same SOA standard.

This paper starts with a description of the SOA system, its components, and an overview of their interaction. Section 3 addresses the simulation conceptual model and demonstrates the GUI interface for the different system components. Finally, Section 4 discusses the advantages of the proposed simulation framework and future research objectives. For each section, we give a high-level description of the SOA-based ISR simulation system; we refer to the cited references for the technical details.

2. SERVICES ORIENTED ARCHITECTURE (SOA) SYSTEM

2.1 Overview of System Architecture

We have proposed and implemented a sensor web enabled, reconfigurable ISR mission simulator which integrates innovative target detection, recognition, tracking, data fusion, and sensor management algorithms. The primary goals are: (1) to enhance the operational capability of the existing ISR simulation system, (2) to develop meaningful performance evaluation metrics for tasking and managing different platforms and sensors, and (3) to implement the visualized simulation results with user-friendly interfaces according to open source standards. Figure 1 is the flowchart of the simulator prototype.

Figure 1. System architecture of the proposed constellation sensor simulation environment for airborne ISR.

2.2 ISR Simulation Scenario

We used the scenario shown in Figure 2 to demonstrate the performance of our proposed coordinated search and tracking algorithms. The simulation setting is a fictitious urban environment where two forces are engaged before the battle. The constellation assets include surrogate UAVs and UGSs. In the urban environment, the blue force's mission is to capture and secure two bridges which are guarded by the red force (yellow teams in the 3D visualization). The red force includes armed vehicles, fighters, and missiles. The blue force consists of a few fighters supported by several unmanned aerial vehicles (UAVs), such as small weapon UAVs and small sensor UAVs, whose main task is to perform ISR. We assume the total offense force and total defense force are almost at the same level. The wide-body UAV operates at high altitude performing wide-area surveillance. The UAV groups, working together, offer the potential for comprehensive persistent coverage of tracking targets on the battlefield. The white cars (police cars) on the back road act as civilian vehicles. There are several choices for the red force to guard these objectives efficiently; for example, they can deploy all red units to protect one location. The blue force, in turn, must arbitrate which places to capture and protect. So the main challenge for the blue force is to perform cooperative ISR, provide situation awareness, and predict the intent of the opponent under the "believed" war situation. The main part of this simulation is the cooperative ISR.

Figure 2. An ISR Scenario.

2.3 SOA Prototype Design

We adopted a software prototype of the ISR simulation system that integrates the SOA architecture with visualization clients using Google Earth (which can be replaced by another geo-spatial visualization tool). The dynamic sensor tracking algorithms were incorporated into the SWE framework of the middle tier, which is shown in Figure 3. We have developed a sensor constellation and management algorithm and a visualization tool that (a) captures sensor surveillance video and images, (b) uses the 3D visualization system to display sensor overlay information on top of the terrain, including in-situ sensors, sensor footprints, and sensor capabilities, (c) provides a display of various situation effectiveness metrics, and (d) provides a user-friendly GUI to handle mission planning, asset management, and sensor-task matching. In conjunction with the available OGC services, such as SOS and SPS, we also constructed the Service Oriented Architecture (SOA) for sensor data discovery, transport, visualization, and retrieval. We emphasize the Sensor Web Enablement (SWE) standard defined by the OpenGIS Consortium (OGC), which includes a set of specifications such as SensorML and TransducerML.

2.4 Model Design

The high-level design of the system is shown in Figure 3. The components of the system include:
• Google Earth client 3D visualization system
• Sensor web services
• Backend knowledge database
• Backend PostGIS database
• Fusion algorithm
• Sensor web registry services

In general, two types of transactions occur between a client and the sensor web service. One example is the interaction which takes place between a client and a registry service during the sensor publishing and binding process. The Register Sensor operation allows the client to register a new sensor with the SPS/SOS services as part of the transactional profile; sensor observations can only be registered for a sensor that is available (a minimal client-side sketch of this registration step is given below). Another example is a query request, which is formulated with the SensorML schema. We will give a couple of examples to explain the communication data flow in the next section. As the centerpiece of our architecture, the sensor web framework provides SOAP and XML-RPC interfaces for clients to consume. As shown in Figure 3, the sensor web, composed of different web servers, forms a network cloud. The sensor web keeps track of all available sensors.
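To make the registration transaction concrete, the following Java sketch posts a pre-built RegisterSensor document to a local SOS endpoint over HTTP. The endpoint URL, the request file name, and the use of a plain HttpURLConnection are illustrative assumptions rather than the exact code of our prototype.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

/** Minimal sketch: POST a RegisterSensor document to a (hypothetical) local SOS endpoint. */
public class RegisterSensorClient {
    public static void main(String[] args) throws Exception {
        // Assumed endpoint of a 52 North SOS running on Tomcat (localhost:8080).
        URL sos = new URL("http://localhost:8080/52n-sos/sos");

        // RegisterSensor request: a SensorML description plus an observation template (hypothetical file).
        byte[] registerSensorXml = Files.readAllBytes(Paths.get("RegisterSensor_UAV1.xml"));

        HttpURLConnection conn = (HttpURLConnection) sos.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(registerSensorXml);
        }

        // A successful registration returns the assigned sensor ID in the XML response body.
        System.out.println("SOS response code: " + conn.getResponseCode());
    }
}
```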



O&M-OWL. The knowledge base receives the query and evaluates the result as a set of triples representing an RDF [10] graph. The result is returned to the fusion algorithm (FA), which finally composites the data graph into SensorML and returns the result to the Data Layer.

Figure 4. Fusion Algorithms with Knowledge Base.

2.6 Data Flows

The following scenario was used in the simulation: the engaged enemy (red force) is hidden somewhere in the battlefield. At the pre-planning phase, the commander specifies an area of interest (AOI) as the danger zone and allocates the UAVs (blue force) to fly to the region to detect the enemy's movements. This persistent surveillance example explains how the framework's components collaborate to achieve the mission planning goal. We omit the OGC-standard SPS and SOS communication process, which can be found on the OpenGIS Standards and Specifications web site [6].

The UML sequence diagram in Figure 5 illustrates the steps during the SPS interaction for an AOI surveillance request. The Google Earth clients reside in the command theatre. An actor (the commander, via a client GUI) enters the query to look for the available UAVs for task matching. The commander draws the AOI on the GE web browser and submits the query. The Asynchronous JavaScript and XML (AJAX) application programming interface (API) parses and encodes the AOI coordinates into SensorML and sends the XML-based query via a proxy server to the sensor web; the HTTP POST protocol can be used to send the request. When the sensor web SPS service detects the query event, it communicates with the fusion algorithm component in order to find a solution. The fusion algorithm transforms the query into SPARQL and uses the Jena framework to communicate with the backend knowledge base (a rough sketch of this step follows Figure 5). The knowledge base performs logical reasoning, does a metric-analysis comparison, and matches UAVs with the given task. The UAV sensorID is retrieved from the backend knowledge base, and the feasibility is checked inside the SPS service. The final response is composited into a SensorML document and sent back to the client. The client parses the SensorML document and populates the related UAV information on the browser. The complete information flow, metric analysis, and geo-spatial solution are presented to the user.

Figure 5. UML Sequence Diagram for AOI Query Request.
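As a rough illustration of the knowledge-base step in this flow, the sketch below issues a SPARQL query through the Jena API (package names follow recent Apache Jena releases; the version contemporary with this work used the com.hp.hpl.jena packages). The ontology file and the property names (isr:hasCapability, isr:coversRegion, isr:sensorId) are hypothetical placeholders for the actual O&M-OWL terms.

```java
import org.apache.jena.query.*;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

/** Sketch: match available UAV sensors to an AOI task via a SPARQL query over the knowledge base. */
public class SensorTaskMatcher {
    public static void main(String[] args) {
        // Knowledge base loaded from a (hypothetical) O&M-OWL ontology file.
        Model kb = ModelFactory.createDefaultModel();
        kb.read("file:sensor_knowledge_base.owl");

        // Illustrative query: find sensors with imaging capability that cover the requested AOI.
        String sparql =
            "PREFIX isr: <http://example.org/isr#> " +
            "SELECT ?sensorId WHERE { " +
            "  ?sensor isr:hasCapability isr:Imaging ; " +
            "          isr:coversRegion  isr:AOI_1 ; " +
            "          isr:sensorId      ?sensorId . }";

        try (QueryExecution qe = QueryExecutionFactory.create(QueryFactory.create(sparql), kb)) {
            ResultSet results = qe.execSelect();
            while (results.hasNext()) {
                QuerySolution row = results.next();
                System.out.println("Candidate UAV sensor: " + row.getLiteral("sensorId").getString());
            }
        }
    }
}
```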


Another example is image retrieval from the SOS service, as shown in Figure 6 (a minimal client-side sketch follows Figure 6). The image query is initiated by the Google Earth client to request UAV-captured images. First, the client requests the UAV image in an OGC-compliant way by passing the desired UAV name and all other required GetImage parameters to the proxy server. The proxy launches the OGC-compliant GetImage request to the SOS. The SOS service consults the sensor database to retrieve the image; the physical image file is stored in a web server directory. The backend PostGIS database returns the URL of the image to the proxy server, and the proxy passes this information back to the client. The client retrieves and displays the stored image from the specified URL.

Future developments will involve: (1) the integration of more complex networked sensors, including imaging sensors such as video, infrared, and other sensors that provide rich, high-resolution data, (2) exploitation of the sensor data, such as image segmentation and target recognition, and (3) cross-modality fusion evaluation (e.g., metrics) of target identification and localization for constellation effectiveness, assuming that each constellation asset contains one or more multimodality sensors.

Figure 6. UML Sequence Diagram for Image Retrieval.
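A minimal client-side sketch of this image-retrieval flow is shown below. The proxy URL, the GetImage parameter names, and the assumption that the proxy returns the image URL as plain text are illustrative; they stand in for the OGC-compliant request used in our prototype.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

/** Sketch of the client side of the image-retrieval flow: ask the proxy for a UAV image URL, then fetch it. */
public class ImageRetrievalClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical proxy endpoint and parameter names; the real GetImage parameters follow the OGC KVP style.
        String query = "service=SOS&request=GetImage&uavName=" + URLEncoder.encode("UAV-1", "UTF-8");
        URL proxy = new URL("http://localhost:8080/isr-proxy/getImage?" + query);

        HttpURLConnection conn = (HttpURLConnection) proxy.openConnection();
        // The PostGIS-backed SOS returns the URL of the stored image rather than the image bytes themselves.
        String imageUrl;
        try (InputStream in = conn.getInputStream()) {
            imageUrl = new String(in.readAllBytes(), "UTF-8").trim();
        }

        // The client then downloads and displays the image from the returned web-server directory.
        try (InputStream img = new URL(imageUrl).openStream()) {
            Files.copy(img, Paths.get("uav1_latest.jpg"), StandardCopyOption.REPLACE_EXISTING);
        }
        System.out.println("Saved image from " + imageUrl);
    }
}
```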

2.7 Sensor Web Framework

We developed a sensor constellation management and evaluation visualization (SMEV) system prototype on top of the standard sensor web services. The overall block diagram of the operational system in Figure 7 shows the key elements. The real sensors include in-situ/remote and weather sensors within the SWE architecture. The sensor web services provide the bidirectional transport of sensor tasking and data to the sensors, with their sensor registry and publishing services reachable through the HTTP protocol over the net-centric standard network for the purpose of sensor discovery and planning. The registry services embedded within the sensor web form the net-centric web service tier. Through a number of sensor web services developed according to the OGC standards and SensorML, the sensor data is sent to the command theatre and disseminated to the various networks. The Google Earth clients send requests to the SWE module to display and control sensors in real time.

Figure 7. Sensor Web System Block Diagram.


2.8 Framework Components Implementation

We implement the sensor web (which follows the prototype SOA architecture) using the 52 North Sensor Web services. 52 North is open source software focused on interoperable web services, and it includes web services that provide sensor asset management functionality. The Tomcat web server is used to emulate the SWE clustering on a localhost. Each sensor web service is hosted on localhost port 8080 and listens for incoming sensor query events. The backend sensor database is a PostgreSQL/PostGIS database hosted on localhost port 5432. The SOS and SPS services connect to the PostGIS database to update and query the sensor data; Java Database Connectivity (JDBC) is used to access the backend database (a sketch of such a query is given at the end of this section).

Google Earth is the simulation environment of our framework. Google Earth displays the 3D models, terrain, and battlefield environment. When users click on a 3D model, the corresponding metadata (UAV info, captured images, and video) is displayed. For the current development, AJAX was chosen as the means of implementing our sensor web framework; this choice is dictated by the Google Earth web service, since the Google Earth plug-in only supports access through the AJAX API. For the next development phase, we will utilize Ruby on Rails to enhance our framework. Ruby on Rails offers many of the flexible features needed in a service oriented architecture, such as SOAP and XML-RPC (remote procedure call) based web services, easy database integration, and an AJAX web interface. Our current web interface uses Cascading Style Sheets (CSS), HTML, and JavaScript to render information in the web browser.

To demonstrate the technologies discussed earlier, we have developed a Google Earth based interface leveraging AJAX techniques. AJAX, or Asynchronous JavaScript and XML, is a web development technique for creating highly interactive web interfaces. AJAX combines a collection of well-known technologies: it employs XHTML or HTML, JavaScript, the document object model (DOM), and XMLHttpRequest, where XMLHttpRequest is used to exchange data with the web server asynchronously.
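The sketch below illustrates how a sensor web service might query the backend PostGIS database over JDBC for the sensors inside an AOI bounding box. The database name, table, column names, and credentials are assumptions; only the localhost:5432 PostGIS host and the use of JDBC follow the description above (the PostgreSQL JDBC driver must be on the classpath).

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

/** Sketch: query the backend PostGIS database for sensors whose location falls inside an AOI bounding box. */
public class SensorAoiQuery {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/sensorweb"; // hypothetical database name
        try (Connection db = DriverManager.getConnection(url, "sos_user", "sos_pass")) {
            String sql = "SELECT sensor_id, sensor_type "
                       + "FROM sensors "
                       + "WHERE ST_Within(location, ST_MakeEnvelope(?, ?, ?, ?, 4326))";
            try (PreparedStatement ps = db.prepareStatement(sql)) {
                // AOI bounding box: min lon, min lat, max lon, max lat (WGS84).
                ps.setDouble(1, -106.70); ps.setDouble(2, 35.00);
                ps.setDouble(3, -106.50); ps.setDouble(4, 35.20);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString("sensor_id") + " : " + rs.getString("sensor_type"));
                    }
                }
            }
        }
    }
}
```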

3. SIMULATION SYSTEM

3.1 3D Visualization Based On Google Earth

Based on Google Earth and our previously researched scenario [2], we designed our visualization engine (Figure 8) with layered sensing assets such as satellite tracking [1], UAV target detection, and UGSs with a 3D performance metrics display. Additionally, we implemented a GUI for SOA based sensor web access.

Figure 8. 3D Visualization based on Google Earth.

Note that performance metric displays for classification, estimation, filtering, and tracking are presented on the GUI, while the target ID and localization are presented inside the geo-spatial display. Further refinements include constellation numbers, utilization, scheduling, and cost to show the spatial, temporal, and frequency allocation usage for full-spectrum dominance over a situation awareness analysis.

3.2 Simulator Conceptual 3D Model Design

In this section, the overall 3D model design of the visualization system is described. All the 3D objects in the simulation environment can be categorized into the following components: soldiers, ground vehicles, UAVs, satellites, terrain, communication units, and sensing assets.
• Soldiers. The soldiers in the simulation environment consist of red and blue forces. The blue force's mission is to follow the commander's guidance to track and observe the red force movements. Each blue unit, equipped with sensors, communicates with the command theatre to receive real-time recommendations. The red force is hidden somewhere in the battlefield and is enabled with target detection. When a red unit is detected, it switches from hidden mode to engaged mode.
• Ground vehicles. All the ground vehicles belong to the red force and are equipped with various types of sensors.
• UAVs. The UAVs are attached to the blue force. Their mission is to track the red force movement. The UAVs include image and video sensors to capture the red force's current status. Each UAV unit continuously feeds the red force status to the blue theater commander. When red units are detected, the UAVs send a suggestion to the theater commander to allocate the resources of the blue units.
• Satellites. The satellites gather ground information and generate the surveillance data from the space point of view for the blue forces.
• Terrain. The terrain model is integrated with Google Earth. 2D maps and high-resolution 3D terrain models are available for certain regions, and the terrain model of the demo battlefield environment is available.
• Communication units. The communication protocol is the open source SOA based sensor web framework; detailed information is provided in the SOA system architecture section. The sensor web framework performs the dynamic remote control between a set of sensors and the command theater.
• Sensing assets. Various image, video, radar, and SIGINT sensors are assumed on the UAV and satellite platforms.

3.3 UML Diagram of 3D Models

For the Google Earth based simulation environment, we defined the ISR conceptual model using object oriented design. The design ensures that the whole ISR scenario is modular and fully customizable. Basically, each 3D object in the scene is an entity. We define the 3D object as a base class, SimulatorModel (Figure 9). Each 3D model object contains the 3D model geometry, heading direction, speed, path, and step size (delta). The path defines the points along each time instance; the speed, step size (delta), and heading are automatically calculated for the given time window. Each object moves along the path generated by the planning algorithm. The UAV model is defined in the SimulatorModelUAV class, which derives from the base SimulatorModel class and adds the field of view (FOV) calculation and display. Two types of FOVs are defined, namely, a pyramid type and a circle type. The UAV is able to trace a 3D object's movement, such as a soldier's, and the FOV ground range is able to track each soldier's movement during run time.
In future development, each 3D object will have its own properties; for example, a missile object should be able to track its target, and a UAV object should be able to change its bank and pitch angles in order to adjust the FOV ground range. Additionally, we implemented the view inheritance relationship for multiple views. The View class defines the basic battlefield viewing environment. From the multiple-view perspective, each viewing screen has its own focus point. SatelliteView, UAVView, and VehicleView inherit from the View class and define their own FOVs. We will adopt the multiple views in the SOA based environment; for the time being, we can simply customize the 3D model and View classes within this design paradigm. A minimal sketch of this class hierarchy is given below.
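The following Java sketch mirrors the inheritance relation of Figure 9. The member names, the flat-earth circle-type FOV test, and the numeric constants are illustrative assumptions; the actual models are rendered through the Google Earth AJAX API rather than this standalone code.

```java
/** Sketch of the 3D-model class hierarchy described above; field and method names are illustrative. */
class SimulatorModel {
    double latitude, longitude, altitude; // current position of the 3D model
    double heading, speed, delta;         // heading, speed, and step size derived from the path
    double[][] path;                      // waypoints (lat, lon, alt) for each time instance

    // Advance to the waypoint for time step t along the predefined path.
    void step(int t) {
        latitude = path[t][0];
        longitude = path[t][1];
        altitude = path[t][2];
    }
}

/** UAV model: extends the base model with a circle-type field-of-view (FOV) ground footprint. */
class SimulatorModelUAV extends SimulatorModel {
    double fovRadiusMeters; // radius of the circular FOV ground footprint

    // True if a ground object (e.g., a soldier) lies inside the FOV footprint (flat-earth approximation).
    boolean inFieldOfView(SimulatorModel target) {
        double metersPerDegLat = 111_320.0;
        double dLat = (target.latitude - latitude) * metersPerDegLat;
        double dLon = (target.longitude - longitude) * metersPerDegLat * Math.cos(Math.toRadians(latitude));
        return Math.hypot(dLat, dLon) <= fovRadiusMeters;
    }
}

/** Basic battlefield view; specialized views carry their own camera focus and FOV. */
class View {
    double focusLat, focusLon, range;
}
class UAVView extends View { }
class SatelliteView extends View { }
class VehicleView extends View { }
```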


Figure 9. 3D model and View inheritance relation.

3.4 Multiple Views

The multi-viewer centralizes the different battlefield viewers in one browser, with the client accessing the Google Earth (GE) server through the Google AJAX API. JavaScript passes the location of the 3D models on the web server to the GE server via the Google Earth load() method to load the 3D models. The predefined paths and the current 3D object positions are sent back and forth between the clients and the GE server, and the view is updated at the end of each frame. Additionally, individual target viewers can reside on different clients to view the same battlefield environment. Figure 10 demonstrates the basic work flow of the multi-viewer.

Figure 10. Multi-viewer work flow.

Figure 11 shows the target positions in the multi-viewers, including the UAV view, vehicle view, satellite view, and battlefield view. Targets outside a view are not detected; targets inside a view are detected, and the multi-resolution classification is inversely associated with the pixel resolution of the sensor. The UAV and vehicle views are predefined as sightseeing tours, where the user can trace the targeted object with the camera focus. The satellite view and battlefield view allow the user to drag the terrain and change the field of view (FOV).

Figure 11. Multiple Viewers of 3D Visualization.

3.5 User-Friendly Enhancements

Our visualization engine has the following two user-friendly features, as shown in Figure 12 and Figure 13.


Figure 12. Illustration of icon change after an unknown target is detected.

Figure 13. Performance score display in a snapshot of the simulation.

In Figure 12, we included an icon-change feature to help the user visually identify which target has been detected. We also include a 3D performance metrics display feature (Figure 13) to show the performance scores calculated by our performance metrics algorithm. The user can turn the performance evaluation feature on or off; alternatively, as the user scrolls over a target, the metrics pop up for the associated target.

3.6 GUI and Simulation of Sensor Web

Figure 14 displays the user interface view of the software application we are developing to enable the integration of the sensor web with Google Earth. The metadata visible in the popup window shows the UAV's latitude, longitude, altitude, and range at the particular location. Users can also view the captured images and videos of the particular intersection, which are updated through the web proxy server residing on the sensor web. A minimal sketch of such a metadata popup, expressed as a KML placemark, is given below.
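As a rough sketch of how such a metadata popup can be produced, the Java snippet below builds a KML Placemark whose description balloon carries the UAV metadata; the field layout and image URL are illustrative assumptions, not the exact KML generated by our application.

```java
/** Sketch: build a KML Placemark whose balloon shows the UAV metadata popup described above. */
public class UavPlacemarkKml {
    // Field names and the image URL are illustrative; Google Earth renders the description as a popup balloon.
    static String uavPlacemark(String name, double lat, double lon, double alt,
                               double rangeKm, String imageUrl) {
        return "<Placemark>"
             + "<name>" + name + "</name>"
             + "<description><![CDATA["
             + "Latitude: " + lat + "<br/>Longitude: " + lon + "<br/>Altitude: " + alt + " m"
             + "<br/>Range: " + rangeKm + " km"
             + "<br/><img src=\"" + imageUrl + "\"/>"
             + "]]></description>"
             // KML coordinates are ordered lon,lat,alt.
             + "<Point><coordinates>" + lon + "," + lat + "," + alt + "</coordinates></Point>"
             + "</Placemark>";
    }

    public static void main(String[] args) {
        System.out.println(uavPlacemark("UAV-1", 35.05, -106.60, 1500.0, 12.4,
                "http://localhost:8080/images/uav1_latest.jpg"));
    }
}
```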

Figure 14. Metadata feedback with sensor web access.

Figure 15 shows a screenshot of the interface for the sensor web query controls. The window embedded inside the Google Earth web browser contains several areas. The right portion displays the GE 3D simulation environment. The top row of the interface contains several buttons and checkboxes with which users can control the mission simulation replay. The left portion holds the main controls for the sensor web services: the top pane enables the users to select different types of missions, such as weather, UAV, and sensor planning related missions. One example is the UAV mission. The operator draws an AOI on the Google Earth terrain and clicks the "getQuery" button. The sensors available inside the AOI are retrieved from the SPS web service, and the related UAV information is populated into the table of the middle pane. In addition, users can enter the HTTP web service address and copy/paste a SensorML request into the bottom pane's text area; when submitting the request, the user gets the response SensorML document displayed in a pop-up web browser. The interactive GUI provides more flexibility to the user for selecting among different sensor assets and tasks based on the presented constellation metrics. For the current development, we have provided a basic prototype that can handle the interaction between the user interface and the sensor web services. For future development, we will design and implement a detailed SensorML parsing mechanism to deal with complex sensor task assignments.

Figure 15. Sensor web query control.

4. DISCUSSION

The preliminary yet promising results obtained in the current study demonstrate that our sensor constellation simulation tool is flexible, intuitive, and efficient for realistically simulating and evaluating methodologies for effectively managing constellations of airborne ISR assets. Based on the design and evaluation of the sensor constellation simulator prototype, we developed an executable software product that can load, visualize, evaluate, and compare various ISR constellations over various scenarios. By using the open source SOA-based sensor web architecture and other OpenGIS web services, the ISR simulation environment is tailorable to different mission scenarios. Future developments include a NASA WorldWind [12] based simulation system exploiting massively parallel processing on modern multi-core, multi-threaded computer hardware. We are currently investigating two major migration processes: first, converting the Google Earth based simulation environment into a WorldWind based simulation environment; and second, converting the PostGIS sensor web backend database system into an Oracle database system. We believe that the proposed SOA based ISR simulation system will be useful for both military and civilian applications in the future.

REFERENCES

[1] Y. Bar-Shalom and H. Chen, "IMM estimator with out-of-sequence measurements", IEEE Trans. Aerospace and Electronic Systems, 41(1), pp. 90-98, 2005.
[2] G. Chen, D. Shen, C. Kwan, J. B. Cruz, Jr., M. Kruger, and E. Blasch, "Game Theoretic Approach to Threat Prediction and Situation Awareness", Journal of Advances in Information Fusion, vol. 2, no. 1, pp. 35-48, June 2007.
[3] C. Mezzetti, "Mechanism design with interdependent valuations: Surplus extraction", Economic Theory, 31(3), pp. 473-488, 2007.
[4] A. Preece, G. de Mel, M. Gomez, W. Vasconcelos, D. Sleeman, S. Colley, and T. La Porta, "Matching sensors to missions using a knowledge-based approach", SPIE Defense Transformation and Net-Centric Systems 2008, Orlando, Florida, 2008.
[5] M. Gomez, A. Preece, M. P. Johnson, G. de Mel, W. Vasconcelos, C. Gibson, A. Bar-Noy, K. Borowiecki, T. La Porta, D. Pizzocaro, H. Rowaihy, G. Pearson, and T. Pham, "An Ontology-Centric Approach to Sensor-Mission Assignment", 16th International Conf. on Knowledge Engineering and Knowledge Management, 2008.
[6] OpenGIS Standards and Specifications, http://www.opengeospatial.org/standards
[7] A. Sheth, C. Henson, and S. Sahoo, "Semantic Sensor Web", IEEE Internet Computing, July/August 2008, pp. 78-83.
[8] Jena Semantic Web Framework, http://jena.sourceforge.net/
[9] SPARQL Query Language for RDF, http://www.w3.org/TR/rdf-sparql-query/
[10] RDF Schema (RDF-S), http://www.w3.org/TR/rdf-schema/
[11] Web Ontology Language (OWL), http://www.w3.org/TR/owl-ref/
[12] WorldWind, http://worldwind.arc.nasa.gov/java

