Issues in Modelling Sensor and Data Fusion in Agent Based Simulation of Air Operations

Clint Heinze, Michael Papasimeon and Simon Goss

Department of Computer Science and Software Engineering, The University of Melbourne, Australia
Air Operations Division, Defence Science and Technology Organisation (DSTO), Australia
[email protected], [email protected], [email protected]

Abstract – Sensor fusion is often not the focus of operational analysis, but the increased connectivity and information sharing of a network enabled warfare environment, together with platforms carrying sophisticated on-board sensor fusion capabilities, make it an important consideration for military analysts. Modelling sensor fusion within a simulation of large-scale military operations is complicated because the sensor fusion processes under consideration are not confined to the mission systems of the various platforms: the cognitive processing of sensor data by aircrews and by the entire command and control chain is just as important. Agent based techniques offer architectural modelling options that, as well as providing flexibility in meeting the requirements of simulation developers, can address issues raised by the wider sensor fusion community. This paper broadens accepted definitions of sensor fusion to characterise it as a socio-technical problem requiring that attention be given to the human users of sensor fusion systems as well as to the technical aspects internal to the mission systems. A set of models that illustrates this approach is presented. This modelling approach supports the adoption of intelligent agent techniques and technologies. Intelligent agent architectures for modelling sensor fusion are presented, together with insights into particular implementations that show promise in addressing the needs of the simulation community and may offer insights for developers of sensor fusion systems.

Keywords: Sensor Fusion, Intelligent Agents, Patterns

1

Sensor and Data Fusion in Simulation of Air Operations

Sensor and data fusion, in the context of this paper, is not limited to the processing that occurs within the fusion software of airborne mission systems. It includes the component of fusion that occurs within the heads of aircrews and of the people in the command and control chain. For the purposes of this paper these are differentiated as computational data fusion and cognitive data fusion.

Having extended the definition of sensor fusion to cater for the processing that occurs within the head of the pilot, it is possible to go a step further and consider the impact of sensor fusion across a team. If a team of aircraft is considered as a single system, then the definition of sensor fusion within that system encapsulates the processing and sharing of information between aircraft via tactical data links, as well as the cognitive processing of the situation and the sharing of information by voice.

The phrase "air operations simulation" tends to conjure mental images of the graphics intensive, possibly moving base, flight training simulators that characterise the sexy end of simulation. These simulators provide the trainee with a cognitively engrossing perceptual experience at the appropriately high level of fidelity necessary to realise a training benefit. The early engineering of these simulators was often concerned primarily with the visual stimulus, but as image generator technology improved the focus shifted to other areas. Some time ago it was recognised that computer generated forces were a necessary addition to these systems, to offset the large manpower requirement of supporting simulator operations. More recently the distributed mission training community has also recognised the role that virtual entities can play in populating simulated battlespaces. Constructive simulations1 used for operational analysis have always required virtual pilots to fly their virtual aircraft and so have, at least in some areas, pioneered technologies for simulating the human component [6]. Regardless of the heritage, there are clear requirements to provide intelligent entities for virtual environments.

In the battlespace of the twenty-first century, where emerging concepts like network centric warfare are driving an information focussed view of combat, it is inevitable, and indeed desirable, that sensor fusion becomes an increasingly important aspect of simulation. With that in mind, sensor fusion, whilst an important consideration in the development of simulations, is seldom the primary focus. Models of sensor fusion must be tailored to meet specific simulation needs, and this often requires models where fidelity, accuracy, and detail play second fiddle to software engineering features: performance, simplicity, and reuse. The simulation community refers to a model that sacrifices an accurate internal representation for the appearance of functioning correctly as an effects model.

In the real world, the line between human and machine is the concern of those interested in human computer interaction, the design of crew environments, and the provision of mission system functions that support crew activities, the obvious two functions in this context being situation and threat assessment. In virtual worlds the line between simulated aircraft and simulated pilots is less clear. Simulated processing of mission system sensor fusion can be intertwined with simulated human reasoning, resulting in a "holistic" model of sensor fusion that makes no distinction about where the processing actually occurs. This paper presents the several varieties of sensor fusion that must be modelled, from the traditional sensor fusion issues managed by systems engineers to the less tangible aspects of sensor fusion that occur within the heads of military personnel and are the domain of cognitive scientists. By relaxing the requirement to model the detailed internal functioning of physical sensor fusion systems, it will be shown that the modern fighter aircraft with its crew can be treated as a complex socio-technical system, and the line between on-board mission system and pilot can be blurred.

1 Constructive simulations are those that involve no human participation. The simulation in its entirety is played out in a virtual space.
Not only does this result in software engineering advantages for the construction of simulations but it suggests intriguing research directions for the future of sensor fusion research.

2

Modelling Sensor Fusion

This section describes modelling multi-sensor data fusion from both a systems and a cognitive modelling perspective. It explores the issues and similarities between systems and cognitive level models and shows how they can be grounded in a common theory of agency.

2.1

System and Cognitive Modelling

The modelling of air operations for operational analysis purposes is often a complex and demanding task for both analysts and engineers. In studies where the modelling of an operator's tactical decision making is important (for example, modelling fighter pilots in air combat), the task is particularly difficult. Not only does the modeller have to be concerned with how to represent the pilot's reasoning and decision making abilities, but he must also be aware of how the pilot agent interacts with a virtual environment, through perception of different sensory information and through the possible actions that can be undertaken in the world. Modelling reasoning processes is truly an inter-disciplinary task, the modeller having to draw on ideas from artificial intelligence, agent research, software engineering and the cognitive sciences. When considering how to present perceptual information from modelled sensors to the agents, the modeller must draw on ideas from the sensor fusion community. The modelling of sensor fusion processes in this domain is partly cognitive modelling and partly systems modelling.

To highlight the issues involved, two cases will be considered. First, consider the case of modelling a modern fourth or fifth generation fighter aircraft together with a cognitive model of the fighter pilot operating such an aircraft. In a modern fighter such as the F-35 Joint Strike Fighter, the pilot will in many cases be presented with a fused picture of the battlespace. The onboard mission system will fuse the data from onboard sensors (such as Radar, RWR2 and EOTS3) and offboard sensors such as AEW&C and other fighters, through the use of tactical datalinks. Depending on the requirements of a particular study, not only will the individual on and off board sensors need to be modelled, but the parts of the F-35 JSF's mission system which are responsible for sensor fusion will also need to be modelled, to provide a fused picture to the computational cognitive model representing the fighter pilot. In this situation the sensor fusion process is treated more from a systems modelling perspective.

Consider another case, in which the requirements of an analytical study call for modelling a less sophisticated aircraft: one with multiple sensors but with less sophisticated avionics, without the provision of onboard sensor fusion. Although the information from the aircraft's sensors can be treated separately, this information is then passed to the pilot agent, who has to deal with it in some manner. The pilot agent is a cognitive model, and it is the cognitive process of perception which must be modelled for the pilot to "fuse" information from his own senses with that being provided by the aircraft's sensors, to form beliefs about the world and the entities in the world that can be perceived. In some regards this is the cognitive equivalent of the sensor fusion found in some aircraft mission systems. Although the human brain and a mission system process and fuse information in different ways, when we create models of these processes the models begin to look very similar. In operational analysis, where the focus is often on tactical development and not specifically on sensor fusion, the sensor fusion models at the cognitive and mission system levels are often very similar, if not one and the same model. The two cases presented are in some respects the extreme

2 Radar Warning Receiver
3 Electro-Optical Tactical Sensor

positions. The majority of the time, the models that need to be created will require a combination of both approaches. For example, in most fighter aircraft some of the sensors may be fused to provide a single picture to the pilot, while others may not. Not only must the pilot fuse this information at a cognitive level, it must also be combined with other information he is receiving through his own senses of vision, hearing, touch, taste and smell. The information a fighter pilot will need to fuse at the cognitive level includes:

• The fused picture from multiple sensors.
• Non-fused individual sensors.
• The pilot's own biological sensors.

Figure 1: The multi-sensory demands placed on a modern fighter pilot. Common to modern fighter aircraft are Radar, IRST (Infra-Red Search & Track) and FLIR (Forward Looking Infra-Red) for searching for and detecting targets; RWR (Radar Warning Receiver), ESM (Electronic Support Measures) and MAW (Missile Approach Warner) for detecting threats; datalinks for sharing information with other aircraft or units securely and efficiently; radio for communication; and a suite of sensory inputs that inform the pilot about the state of various systems on board the aircraft, ranging from weapons, to electronic-warfare suites, to the state of the aircraft itself. In addition, the pilot has to fuse this information with information obtained through his hearing and the view outside the cockpit.

For example, the brain must be able to cognitively process that an aircraft the pilot can see with his eyes outside the cockpit is the same one displayed on his radar screen. When modelling these cognitive and system level sensor fusion processes we can create a variety of models that interact in different ways. In the human factors domain, where issues like workload and information overload are important, it is perhaps useful to have separate sensor fusion models to represent the mission system and the pilot's cognitive perceptual systems. However, in some applications, such as operational analysis, the focus is on tactical decision making. The models created are by their very nature much simpler than the actual systems themselves, and are at a higher level of abstraction. It is often the case that a single sensor fusion model will be used to represent the capabilities of both the avionics and the pilot's cognitive and perceptual processes. There are a number of advantages in such an approach. First, from an operational analysis perspective it simplifies the modelling. It is also a powerful abstraction mechanism: taking this approach allows the analyst to view the pilot-aircraft as a single entity. The pilot is truly embodied in the aircraft and situated in the environment. We begin to blur the line between man and machine, considering the capabilities of the system as a whole. This socio-technical perspective provides significant value to the operational analyst.

2.2

Models of Computational Data Fusion

The modelling of cognitive sensor fusion has many analogues with sensor fusion for avionics and other mission systems. For example, the Data Fusion Subpanel [5, 8] has adopted a three level approach to sensor fusion. The levels are briefly summarised as follows:

Level 1 Processing: Fuses data from sensors to provide the position, velocity and identity of low level entities. This processing is divided into four parts: (1) data alignment, (2) association, (3) tracking, and (4) identification.

Level 2 Processing: Is concerned with situation assessment; that is, providing meaning to, and finding patterns in, the level one data with respect to the environment and the relationships between different entities.

Level 3 Processing: Is concerned with threat assessment. It assesses the data from an adversarial perspective.

Although later approaches to the definition of fusion levels [4] include a fourth level which performs process refinement, for the purposes of this paper only the first three levels are of interest.
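The Level 1 steps above (alignment, association, tracking, identification) can be caricatured as a toy pipeline. This is an illustrative sketch only, not the Data Fusion Subpanel's specification: the `Report`, `Track`, `associate` and `level1_fuse` names are hypothetical, alignment into a common coordinate frame is assumed to be already done, association is simple nearest-neighbour gating, and tracking is a running average rather than a proper filter.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Report:
    """One sensor report, already aligned into a common coordinate frame."""
    x: float
    y: float
    sensor: str

@dataclass
class Track:
    x: float
    y: float
    hits: int = 1
    identity: str = "unknown"

def associate(track: Track, report: Report, gate: float = 5.0) -> bool:
    """Association step: nearest-neighbour gating on position."""
    return hypot(track.x - report.x, track.y - report.y) <= gate

def level1_fuse(tracks: list, reports: list) -> list:
    """Toy Level 1 cycle: associate each report to a track, update or spawn."""
    for r in reports:
        for t in tracks:
            if associate(t, r):
                # Tracking step: running-average position update
                # (identification would refine t.identity here; omitted).
                t.x = (t.x * t.hits + r.x) / (t.hits + 1)
                t.y = (t.y * t.hits + r.y) / (t.hits + 1)
                t.hits += 1
                break
        else:
            tracks.append(Track(r.x, r.y))
    return tracks
```

Two reports inside the gate merge into one track; a distant report spawns a second track. A realistic Level 1 module would replace each of these steps with substantially more sophisticated algorithms.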

2.3

Models of Cognitive Data Fusion

There are many parallels to be drawn between the levels of sensor fusion processing used in combat or mission systems and the approach taken to model the cognitive or decision making processes of military decision makers such as pilots, fighter controllers and mission commanders. For example, John Boyd's well known model of military decision making, the OODA loop [2], can be compared to the three levels of sensor fusion described above. The OODA loop theory of military decision making involves the following four steps:

Observe (O): The fighter pilot observes the world through his sensors.

Orient (O): He orients himself with respect to what he observes (assesses the situation).

Decide (D): He decides what he is going to do based on his assessment of the situation.

Act (A): He carries out his decision by undertaking the relevant action.

The OODA loop model was adopted, modified and used as a basis for cognitively modelling military operators with agent technology in Air Operations Division, DSTO, in the SWARMM project [9]. This model, known as the AOD 4-Box Cognitive Architecture, was implemented on a number of projects using agent languages such as dMARS [3] and Jack [1]. Both these languages are suitable for modelling reasoning in some domains as they implement a computational representation of the BDI (Beliefs, Desires and Intentions) folk-psychological model of human reasoning [7]. The 4-Box Cognitive Architecture was developed on top of the BDI model and consists of the following four steps:
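A minimal agent cycle built on the four OODA steps might look like the following sketch. The class and its toy threat logic are hypothetical, not the SWARMM or AOD implementations; observe and orient are deliberately aligned with the Level 1 and Level 2 fusion processing described above.

```python
class OODAAgent:
    """Toy OODA decision cycle; an illustrative caricature only."""

    def observe(self, world):
        # Observe: read the raw sensor picture from the environment.
        return world["contacts"]

    def orient(self, contacts):
        # Orient: assess the situation (roughly Level 2 fusion).
        return "threat" if any(c["hostile"] for c in contacts) else "clear"

    def decide(self, assessment):
        # Decide: choose a course of action from the assessment.
        return "engage" if assessment == "threat" else "patrol"

    def act(self, decision):
        # Act: carry the decision out in the world.
        return decision

    def step(self, world):
        # One pass around the loop: observe -> orient -> decide -> act.
        return self.act(self.decide(self.orient(self.observe(world))))
```

The point of the sketch is the shape of the loop, not the logic inside each step: each box consumes the previous box's output, exactly as Level 2 fusion consumes Level 1 tracks.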

• Situation Awareness
• Situation Assessment
• Tactics Selection
• Tactics Implementation

The first two steps, Situation Awareness and Situation Assessment, are roughly analogous to the Level 1 and Level 2 sensor fusion processing discussed earlier. However, this model was often combined with a lower level processing module responsible for modelling the computationally expensive Level 1 sensor fusion functions, such as data alignment and association.
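The four boxes, with a separate lower-level module handling the expensive Level 1 functions, can be sketched as below. The `FourBoxPilot` class and its range-based threat heuristic are hypothetical illustrations, not the dMARS or Jack implementations; the Level 1 module is injected so that the cognitive model only ever sees a fused track picture.

```python
class FourBoxPilot:
    """Hypothetical sketch of a 4-Box cycle over a pre-fused track picture."""

    def __init__(self, level1):
        # level1: a computational module doing alignment/association/tracking.
        self.level1 = level1

    def situation_awareness(self, raw_reports):
        # Beliefs are built from the fused picture, not raw sensor data.
        return self.level1(raw_reports)

    def situation_assessment(self, tracks):
        # Assign meaning: which tracks are close enough to matter?
        return {"threats": [t for t in tracks if t["range"] < 20]}

    def tactics_selection(self, assessment):
        # Choose a tactic from the assessed situation.
        return "defensive" if assessment["threats"] else "sweep"

    def tactics_implementation(self, tactic):
        # Turn the chosen tactic into an action in the simulated world.
        return f"execute:{tactic}"

    def step(self, raw_reports):
        awareness = self.situation_awareness(raw_reports)
        assessment = self.situation_assessment(awareness)
        tactic = self.tactics_selection(assessment)
        return self.tactics_implementation(tactic)
```

Passing an identity function as the Level 1 module illustrates the degenerate case where the sensor picture arrives pre-fused; substituting a heavier computational module leaves the cognitive boxes untouched, which is the point of the separation.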

2.4

Models of Agency

There are many challenges in modelling sensor fusion processes. However, in the case of operational analysis it is possible in many cases to rely on the commonality between cognitive and system modelling of sensor fusion to develop a single model which covers both. The advantage and strength of this approach is that the models are not only representationally adequate but are also grounded in models of agency. Furthermore, they allow us to merge some system and cognitive processes, providing an embodied and socio-technical perspective on air operations. The next section explores in more detail some of the design patterns used to address the issues discussed here.

3

Sensor Fusion Patterns for Agents

Based on the discussion in the previous section regarding ways to model sensor fusion for operational analysis purposes, the next step is to look at the corresponding design patterns. Design patterns refer to common, widely used software architectures and designs. The cataloguing of patterns facilitates reuse of a particular design and, more importantly, serves as a means of documenting these designs. This section presents a number of schematic diagrams representing high level designs of how the different types of sensor fusion modelling issues might be realised in an operational analysis simulation system, for the modelling of both mission and cognitive systems.

Figure 2 presents the sensor fusion problem in perhaps its most basic form. In the domain of air combat, a fighter pilot has access to multiple sensors: those related to his aircraft and its systems, presented to him by instruments or other mission system displays, and his biological sensors. The issue is how to combine all this sensory information in a way that minimises information overload yet maximises the pilot's tactical potential.

Figure 2: A fighter pilot uses information from multiple sensors to obtain information about the environment he is situated in.
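In the figure 2 pattern every sensor's output reaches the pilot model unfused, so the cognitive model carries the whole fusion burden. A minimal sketch of that burden, here reduced to deduplicating contacts that several sensors report at roughly the same position (the `Pilot` class and its rounding heuristic are hypothetical):

```python
class Pilot:
    """Figure 2 pattern: the pilot model receives every sensor's output raw."""

    def __init__(self):
        self.beliefs = []

    def perceive(self, percepts):
        # Cognitive fusion burden: decide which reports describe one entity.
        seen = set()
        for p in percepts:
            key = (round(p["x"]), round(p["y"]))  # crude spatial bucketing
            if key not in seen:
                seen.add(key)
                self.beliefs.append(p)
```

Even this toy version shows why the pattern scales poorly: every new sensor adds another raw stream for the cognitive model to reconcile.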

In modern fighter aircraft the onboard mission systems are often sophisticated enough to provide a fused picture to the pilot, combining sensors such as radar, radar warning receivers and infrared sensors to provide the pilot with a unified view of the environment. The pilot then has to fuse this information cognitively with additional input from his own senses (vision, hearing, etc.). Figure 3 shows a design pattern that can be used to model such a situation. In this case a model of the mission system is used to fuse data from all the aircraft's systems, with only the fused picture going to the pilot. Additionally, the pilot model must fuse this information internally with information obtained from the computational models of biological sensors such as eyesight.

Figure 3: The role of a modern multi-sensor data fusion capable mission system in modern air combat.
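A sketch of the figure 3 pattern, with assumed function names: the mission system model merges all onboard sensor pictures into a single fused picture, and the pilot model then combines that picture with modelled biological sensors (here just vision).

```python
def mission_system_fuse(sensor_pictures):
    """Figure 3 sketch: merge every onboard sensor picture into one fused
    picture before the pilot model sees anything."""
    fused = {}
    for picture in sensor_pictures:
        for contact_id, data in picture.items():
            # Contacts with the same id are taken to be the same entity.
            fused.setdefault(contact_id, {}).update(data)
    return fused

def pilot_perceive(fused_picture, eyeball_contacts):
    """The pilot model fuses the single mission-system picture with
    modelled biological sensors (vision only, for brevity)."""
    beliefs = dict(fused_picture)
    for contact_id in eyeball_contacts:
        beliefs.setdefault(contact_id, {})["visual"] = True
    return beliefs
```

The division of labour mirrors the pattern: computational fusion happens once in `mission_system_fuse`, and the residual cognitive fusion in `pilot_perceive` is small.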

Another approach is presented in figure 4. Here the approach is similar to figure 3, but the design explicitly shows a second level sensor fusion model inside the reasoning model of the pilot. This secondary model undertakes what has been referred to as cognitive sensor fusion. From a cognitive psychology perspective, this component of a pilot model's reasoning abilities would correspond to a perception module, or basically the brain's perception function. As can be seen from the figure, the cognitive model of perception fuses information incoming from the mission system fusion model as well as from the models of the biological sensors.

Figure 4: A mission system and a cognitive system work together to provide a fused picture, or percepts of the environment, for the fighter pilot.

Although a modern fighter aircraft might contain a sophisticated mission system capable of sensor fusion, for the purposes of modelling it is possible to treat the cognitive and mission systems as a single sensor fusion system providing percepts for the model of the pilot. Figure 5 presents a design pattern that follows such an approach. In the case of many operational analysis applications, a single model (whether it is the mission system or the cognitive perceptual system) can be used to fuse information from all sensor sources. This is because when undertaking operational analysis of air operations the issue of sensor fusion is not primary. It is important, and must be modelled, but some liberties and approximations which do not affect the outcome or the results may be taken. The design pattern in figure 5 represents such an approach.

Figure 5: A design pattern where a single model of the fusion system is used to represent both the cognitive and mission systems.

The patterns presented so far have been from the point of view of a single military operator such as a fighter pilot. However, if the future battlespace is considered, there are wider sensor fusion issues at stake. Not only is the individual important, but teams are important as well. A military operator must be able to fuse information not only originating from sensors attached to his own platform, but also information from fellow team members, whether they be a wingman, off-board sensors such as ground or airborne radar, ships and land based units in joint operations, or friendly assets in coalition operations. The modelling of such complex fusion issues will no doubt be the focus of much future research. Figure 6 illustrates a pattern that can be used in one of the simpler cases of teamed fusion. The figure represents the fusion processes in place for a fighter two-ship: a leader and a wingman. As can be seen from the figure, not only does each pilot have to deal with fusion of data from his own physical and biological sensors, but also with information provided by his team mate via datalink and radio communications.

Figure 6: Sensor fusion occurs in both mission systems and in the heads of both pilots. Information is shared via data links and radios.
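One of the simpler teamed cases in figure 6 can be sketched as each pilot merging his own track picture with the wingman's datalinked tracks. The function name, and the assumption that shared contact identifiers denote the same entity, are illustrative only.

```python
def share_over_datalink(own_tracks, mate_tracks):
    """Figure 6 sketch: merge a wingman's datalinked tracks into the
    pilot's own picture; matching contact ids are taken to be one entity."""
    # Copy the own-ship picture so the caller's data is not mutated.
    merged = {cid: dict(data) for cid, data in own_tracks.items()}
    for cid, data in mate_tracks.items():
        # A shared contact gains the mate's attributes; a new contact
        # appears in the merged picture with the mate's data.
        merged.setdefault(cid, {}).update(data)
    return merged
```

In the full two-ship pattern each pilot would run this merge, so both pictures converge on a shared view; resolving conflicting attributes for the same contact is the hard part this sketch deliberately omits.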

4


Conclusions

This paper presented models of fusion that blur the line between the cognitive processing occurring in the head of the pilot and the computational processing occurring inside modern mission systems. For some applications, notably simulation, where sensor and data fusion must be represented but the detail is unimportant, it is possible to develop a "one model fits all" approach. This approach yields software engineering advantages for simulation developers: architectures are simpler, reuse is facilitated, and performance is improved. A number of interesting questions and research directions suggested as a result include:


• The emerging focus on concepts such as network centric warfare suggests that modelling and simulation may have to deal with a shift in focus from the specifics of hardware and platforms onto networks, information and data flows. In this context, a set of modelling choices and architectures that flexibly account for cognitive and computational fusion is likely to be useful.

• Preliminary designs and implementations, whilst not comprehensive enough to qualify as a cognitive work analysis or a cognitive task analysis [10], strongly suggest that modelling approaches that integrate the human and the computer might offer insights for the design of sensor fusion systems, or of the interfaces that they present to the user.

• The possibility exists to use sensor fusion algorithms inside intelligent agents to model the cognitive (or often sub-cognitive) aspects of data fusion. That is not to suggest that there is any psychological plausibility to standard data fusion models: psychological plausibility is less important in this context than software engineering expediency.

• The possibility exists to use models of cognitive situation awareness and assessment that are grounded in psychology to model the processing of a mission system. This idea could be extended to the actual physical system: sensor and data fusion systems could use psychologically grounded theories as the basis for their operation.4

4 Naturalistic decision making is a broad school of theories and approaches that has been applied in military contexts and might have implications for the design of sensor and data fusion systems.

References

[1] P. Busetta, R. Ronnquist, A. Hodgson, and A. Lucas. Jack intelligent agents: components for intelligent agents in Java, 1999.

[2] Robert Coram. Boyd: The Fighter Pilot Who Changed the Art of War. Little Brown and Company, 2002.

[3] Mark d'Inverno, David Kinny, Michael Luck, and Michael Wooldridge. A formal specification of dMARS. In Agent Theories, Architectures, and Languages, pages 155-176, 1997.

[4] DSTO Data Fusion Special Interest Group. Data fusion lexicon. Technical report, DSTO, Department of Defence, Australia, 1994.

[5] David L. Hall. Mathematical Techniques in Multisensor Data Fusion. Artech House, 1992.

[6] C. Heinze, B. Smith, and M. Cross. Thinking quickly: Agents for modeling air warfare. In Proceedings of the 4th Australian Joint Conference on Artificial Intelligence, AI'98, 1998.

[7] A. S. Rao and M. P. Georgeff. Modeling rational agents within a BDI-architecture. In J. Allen, R. Fikes, and E. Sandewall, editors, Proceedings of the Second International Conference on Principles of Knowledge Representation and Reasoning, pages 473-484. Morgan Kaufmann Publishers, San Mateo, CA, 1991.

[8] Data Fusion Subpanel of the Joint Directors of Laboratories Technical Panel for C3. Data fusion lexicon. Technical report, U.S. Department of Defense, 1991.

[9] Gil Tidhar, Clinton Heinze, and Mario C. Selvestrel. Flying together: Modelling air mission teams. Applied Intelligence, 8(3):195-218, 1998.

[10] Kim J. Vicente. Cognitive Work Analysis: Toward Safe, Productive, and Healthy Computer-Based Work. Lawrence Erlbaum Associates, 1999.
