Modeling and Simulation for Training and Support of Emergency Management Teams: The Development of CrisisKit

Alma Schaafstal, Ph.D., & Wilfried Post, Ph.D.
TNO Human Factors
P.O. Box 23, 3769 ZG Soesterberg

The Netherlands
e-mail: [email protected], [email protected]

Abstract Emergency management (EM), the decision making involved in directing the relief operation after a disaster or otherwise catastrophic accident, is an issue of great public and private concern because of the high stakes involved. It is important to consider what can be done to optimize the training and support of EM teams, both in terms of the structure of training and support and in terms of the use of simulation technology. On the technology side, there have been developments that would be worthwhile applying to the domain of EM, such as automated instructor and observer aids, and intelligent automated cognitive performance diagnosis. CrisisKit is introduced as an example of a simulated environment in which training and support are combined. We are currently exploring how the CrisisKit concept could be augmented with intelligent synthetic agents acting as team members and intelligent controllers, which, on the basis of performance measurement, are able to flexibly adapt the scenarios to the team's current level of knowledge and skill.

Introduction Emergency management (EM), the decision making involved in directing relief operations after a disaster, is an issue of great public and private concern because of the potentially catastrophic losses (e.g., human lives and property) involved. Mass emergencies involve the contribution and interaction of many EM disciplines (e.g., fire fighting, rescue, police, sheriff) and of local, state, and federal governments, resulting in novel and distributed communications that require rapid adaptation to the exigencies of the emergency (Dowell, Smith, & Pidgeon, 1997). The nature of mass emergencies requires EM teams to make decisions in stressful situations involving information overload and a significant level of uncertainty. Such situations often call for nonroutine, complex problem solving. The potential for multiple failures in the EM response is influenced by several factors: disasters consist of a sequence of "unpredictable" events; potential top decision makers (e.g., mayors, county/city managers, and school and university officials) are rarely involved with EM as their primary profession; and several large organizations (consisting of many internal teams), with different goals and cultures, have to work together to minimize the effect of the emergency. Consequently, a lack of distributed team training makes it difficult to build up a solid level of expertise. As a consequence, EM requires good coordination and communication not just within, but also among, the various organizations. Coordination among multiple, complex teams in EM should, therefore, be a key focus for team training. EM training has received wide attention and is regarded as a high-priority issue, reflected, for example, in the work of the American Federal Emergency Management Agency (FEMA). FEMA makes it clear that the quality of its response and recovery efforts is directly linked to the knowledge and skills possessed by staff working at disaster sites. Current EM training practices focus on the procedures involved and on how to scale up the EM organizations when needed (Stroomer, Schaafstal, & Riemersma, 2000). Most of the courses focus on establishing individual knowledge and skills. Consequently, the large number of people required and the resultant costs of team training mean that few opportunities exist to conduct distributed team training. To summarize, the major problems and challenges for developing EM team training include limited training time, the large scale of the exercises, and limited funds. Therefore, team training must be effective the very first time and every time. Currently, training exercises are

focused primarily on training within teams, rather than on training among a team of teams. However, there is a growing awareness that, to enable proficiency in crisis management, training should also address interdisciplinary teams (police, fire fighters, and medical staff, together with local representatives) at all levels: operational, tactical, and strategic. There is also a growing awareness that the training of EM teams should not focus only on individual knowledge and skills (although those are a prerequisite for good team performance): team skills, the ability to perform efficiently and effectively as a team, are crucial. Therefore, training should focus on those team skills as well as on individual skills.

Training EM teams: challenges for modelling and simulation Since few opportunities exist for a team of teams to practice together, it is very important to use the limited time as efficiently as possible. Training distributed teams is a complex endeavor; it involves a large number of role players, exercise controllers, and observers, who require job aids to enhance the quality of training through structured observations, briefs, and debriefs. Recent progress in advanced training technologies has produced methods implemented on hand-held commercial-off-the-shelf computers. These aid observers during complex exercises by supporting structured note-taking, relating those notes to particular scenario and communication segments, sharing observations with other observers where necessary, and finally enabling a structured debrief (Fowlkes, Dwyer, Milham, Burns, & Pierce, 1999; Pruitt, Burns, Wetteland, & Dumestre, 1997). In addition, automated intelligent performance measurement and diagnosis at the cognitive level, for the training of tactical decision making, could eventually reduce the number of personnel needed to run EM exercises. Zachary, Cannon-Bowers, Bilazarian, Krecker, Lardieri, and Burns (1999) describe how the Advanced Embedded Training System demonstrated on-line, real-time performance feedback and reduced instructor requirements for team evaluation and team debrief. Another example of instructor support for performance measurement and debriefing of field exercises can be found in the work by Jenvald (1999).
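The kind of observer aid described above can be sketched as a small data model: structured notes tied to scenario segments, grouped per segment for a structured debrief. The class and field names below are our own illustration, not the actual tools cited (Fowlkes et al., 1999; Pruitt et al., 1997).

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    observer: str
    scenario_segment: str   # e.g. "alerting", "evacuation"
    note: str
    rating: int             # e.g. 1 (poor) .. 5 (excellent)

@dataclass
class ObservationLog:
    observations: list = field(default_factory=list)

    def record(self, obs):
        self.observations.append(obs)

    def debrief_outline(self):
        """Group notes by scenario segment for a structured debrief."""
        outline = {}
        for obs in self.observations:
            outline.setdefault(obs.scenario_segment, []).append(
                f"[{obs.observer}] {obs.note} (rating {obs.rating})")
        return outline

log = ObservationLog()
log.record(Observation("observer-1", "alerting", "fire chief notified late", 2))
log.record(Observation("observer-2", "alerting", "clear radio discipline", 4))
log.record(Observation("observer-1", "evacuation", "routes well coordinated", 5))
print(log.debrief_outline()["alerting"])
```

Because every note carries its scenario segment, notes from multiple observers can be merged and replayed segment by segment during the debrief, rather than as one undifferentiated list.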

CrisisKit: a simulation environment for training and support of EM teams As outlined above, an important challenge for training technology lies in the development of training support tools. However, there are more challenges for modelling and simulation in the area of EM, one of them being the development of simulated environments in which the various emergency management professionals and teams can be trained. On the one hand, there is a tendency for team training to take place increasingly in synthetic environments. However, such training is often still modeled after live team training, resulting in a very cost-intensive process in terms of both the time and resources required. Specifically, gathering all of the team members, instructors, and support personnel to conduct the training is very costly, and that does not even include the lost productivity associated with the other tasks those personnel are not doing while helping other teammates practice or while practicing themselves (Hicinbothom & Lyons, 1999). This issue becomes even more salient where large-scale training of distributed teams is involved: the large numbers of observers/controllers, role players, team members, and other training staff required for complex EM training exercises are simply prohibitive in today's climate of decreasing budgets and manpower. As part of background research, TNO Human Factors developed CrisisKit, a simulation tool aimed initially at understanding the team processes in emergency management teams. It was developed with a number of goals:

• Understanding decision making in emergency management teams
• Developing appropriate decision support tools for the various teams (strategic, tactical, operational)
• The integration of training and support tools in one simulated environment
• The development of agents that could support the training process.

The Event-Based Approach to Training A core concept in the development of CrisisKit as a training tool is the Event-Based Approach to Training (EBAT), a strategy that has demonstrated considerable applicability for training teams that must perform in complex, dynamic environments. The primary goal of EBAT is to systematically provide opportunities for a training audience to develop important competencies. The EBAT strategy is to let teams practice in simulated environments that are representative of realistic operational conditions, and to provide teams with performance-based feedback that is linked to specific events occurring during the training scenario. The EBAT approach to scenario design emerged from more traditional and general approaches to training design (e.g., Instructional Systems Development; NAVEDTRA 110A, 1981). While other frameworks possess similar characteristics, EBAT differs from the traditional approaches to training in three important ways. First, EBAT focuses on training in which the audience is immersed in a simulated operational environment. In contrast, the traditional approaches provide generic frameworks that can be applied to any

aspect of training. Most traditional training settings involve a set of instructions (e.g., lessons, lectures, computer-based training) for the curriculum. Second, EBAT research has focused on the operational and technical requirements of training with complex technology. EBAT success has been demonstrated in aviation environments (Fowlkes, Lane, Salas, Franz, & Oser, 1994), command and control, multiservice environments, and virtual environments. More recently, EBAT practices have been used to support the evaluation of oil spill management response teams and the design of large-scale training systems. Some of the competencies trained with EBAT have included situation assessment, decision making, planning, and resource management (Oser, Cannon-Bowers, Dwyer, & Salas, 1997). Third, EBAT methods and tools have been researched and tailored to meet the specific needs found in team training environments (e.g., performance measurement, feedback) (e.g., Johnston, Smith-Jentsch, & Cannon-Bowers, 1997). Figure 1 shows that EBAT links trainee needs, critical tasks, learning objectives, scenario design, performance measurement, and feedback. The general assumption of EBAT is that without a systematic linkage among these components there is no way of knowing or ensuring--with any degree of certainty--that the training will have its intended effect. Figure 1 also shows that the EBAT framework supports the design, development, execution, and evaluation of training scenarios. The EBAT framework was derived from observations of team exercises, a review of the team training and performance literature, and ongoing team training research and development in both military and non-military settings (e.g., Federal Aviation Administration, 1990; Hall, Dwyer, Cannon-Bowers, Salas, & Volpe, 1993; Johnston et al., 1997; Moses, 1997). Because of the linkages in EBAT, performance can be traced directly back to specific objectives via events and performance measures.
Performance related to a given objective can then be assessed and communicated to the training audience. The linkages are critical, and therefore must not be viewed as a set of options. If properly implemented, EBAT has the potential to result in effective learning environments. We believe the EBAT framework has considerable application to training EM teams. As stated above, CrisisKit was developed with a number of goals in mind. We wanted to develop a flexible tool for researching a number of issues relating to emergency management teams, and we wanted to develop an integrated concept for training and support.
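The EBAT linkage can be made concrete with a minimal data sketch: each scripted scenario event is tied to one learning objective and one performance measure, so event-level scores can be traced back to objectives for the debrief. The objective texts, event triggers, and scoring scheme below are our own illustrative assumptions, not part of EBAT itself.

```python
# Learning objectives derived from trainee needs and critical tasks.
objectives = {
    "O1": "Assess the situation shortly after the alert",
    "O2": "Allocate rescue resources across incident sites",
}

# Scripted scenario events, each linked to exactly one objective.
events = [
    {"id": "E1", "objective": "O1", "trigger": "simulated emergency call"},
    {"id": "E2", "objective": "O2", "trigger": "second incident site reported"},
]

# Measured outcome per event against its standard (1 = met, 0 = not met).
measures = {"E1": 1, "E2": 0}

def feedback_per_objective(objectives, events, measures):
    """Trace event-level scores back to objectives for the debrief."""
    scores = {}
    for ev in events:
        scores.setdefault(ev["objective"], []).append(measures[ev["id"]])
    return {obj: sum(s) / len(s) for obj, s in scores.items()}

print(feedback_per_objective(objectives, events, measures))
```

In this toy run, objective O2 scores 0.0 and would become a debrief focus; without the explicit event-to-objective links, the same observed behavior could not be attributed to any particular training objective.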

[Figure 1 diagram: linked boxes for Task Lists (e.g., METLs, JMETLs) / Skill Inventory / Historical Performance Data; Learning Objectives / Competencies; Scenario Events / Scripts; Performance Measures / Standards; Performance Diagnosis; and Feedback and Debrief.]

Figure 1: EBAT links trainee needs, critical tasks, learning objectives, scenario design, performance measurement, and feedback (Adapted from Zachary, Cannon-Bowers, Burns, Bilazarian, & Krecker, 1998)

Figure 2: High-level task analysis of EM

This tool has since developed into a (nearly) operational training and support tool for the Royal Netherlands Air Force, and it serves as the basis for the current development of the Netherlands Interactive Disaster Management Trainer, a simulation tool for the training of EM teams in the Netherlands, initiated and supported by the Dutch Ministry of Internal Affairs. TNO Human Factors is heavily involved in the development of this national training tool. CrisisKit mainly focuses on the decision making in teams, and as such it was not developed to support field exercises. As a basis on which to develop CrisisKit, a thorough analysis was carried out into the tasks and information needs of the various teams involved in EM and of the individuals in those teams. This resulted in a task analysis and a number of representations of the information flow within and between teams (see Figure 2).

Figure 3: The CrisisKit configuration

Figure 6: An example support screen in CrisisKit

Based on this analysis, a flexible multi-player system emerged, implemented on a number of (wireless) networked laptop computers coupled with large-screen displays and optional Smartboards. The primary CrisisKit user interface consists of a number of displays: a map of the terrain that the team can adapt to the situation (e.g., marking the location of the emergency and the locations of the different key players and supporting institutions), an overview of events in the scenario, and an e-mail system through which the different players may interact, in addition to normal voice communication.

Figure 4: The user interface of CrisisKit

The developer of training scenarios has a number of supporting tools: first, a tool to assign players to the game; second, a tool for assigning information events to each of the different players, in line with the information distribution in real EM situations, in which not every team member gets the same information; and finally, an editor to create information events. CrisisKit is not only a training tool, but integrates learning and decision support. Our initial analyses revealed a number of aspects of EM that cause difficulties for many EM teams, such as keeping an exact count of the number of (classes of) victims and maintaining an overview of the different decisions to be made; CrisisKit therefore includes several decision aids for such tasks. These difficulties have since been validated in a number of real-life field exercises in which we were invited to act as observers.
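The per-player assignment of information events can be sketched as follows. This is our own minimal illustration of the idea that not every team member receives the same information; the event texts, roles, and field names are assumptions, not the actual CrisisKit scenario format.

```python
from dataclasses import dataclass

@dataclass
class InfoEvent:
    time_min: int      # minutes into the scenario
    text: str
    recipients: list   # player roles that receive this event

# A tiny example scenario with deliberately asymmetric information.
scenario = [
    InfoEvent(0,  "Explosion reported at chemical plant", ["fire_chief"]),
    InfoEvent(5,  "Crowd gathering near the plant gate",  ["police_chief"]),
    InfoEvent(12, "Press requests a statement",           ["mayor"]),
    InfoEvent(15, "Wind shifting toward residential area",
              ["fire_chief", "mayor"]),
]

def events_for(role, scenario):
    """The event stream one particular player would see."""
    return [e for e in scenario if role in e.recipients]

for e in events_for("mayor", scenario):
    print(f"t+{e.time_min:02d} {e.text}")
```

Because each player sees only a partial event stream, the team must communicate to build a shared picture of the situation, which is exactly the team process CrisisKit was built to study.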

Figure 5: The editor for scenario events

The integration of SYNTHERS in CrisisKit

We believe that the development of synthetic teammates, or ‘SYNTHERS’, is a promising alternative to training with all-human teammates and role players. SYNTHERS are cognitive agents that perform the job of an unavailable team member, or any role external to the team that normally requires a human role player or trainer to enable training or teamwork practice. We believe that they are promising, since SYNTHERS may always be available, may be modeled after experienced training personnel, and may therefore be more cost effective in the long run.

The research challenge lies in keeping the advantages associated with human teammates while minimizing or eliminating the disadvantages. Thus, SYNTHERS should display the same collaborative and cooperative behavior typically associated with human teammates, while reducing some of the negative aspects of training with live teammates, such as the relative unpredictability of their behavior and their fluctuating availability for team training. When these research challenges are met, we believe that SYNTHERS will not only reduce the cost of training, but will even result in increased training effectiveness.

However, there is a gap to be bridged between this conception of SYNTHERS and current implementations of computer generated forces (CGFs), which is essentially what a SYNTHER is. A recent National Research Council report (Pew and Mavor, 1998) identified the lack of behavioral realism in CGFs as a shortcoming in the area of military simulations. Most synthetic force models within military simulations have been constructed using relatively primitive human models in which the richness of behavior and decision making is represented in only a coarse and brittle manner. This has produced simulated forces with unrealistic behavior and simplistic responses that do not correspond to the behavior of real individuals and teams (Pew and Mavor, 1998; Lyons and Hawkins, 1999). Therefore, in order to fulfill the objectives of our SYNTHERS, the behavioral representation of CGFs needs to improve. Part of this improvement will come through the development of new architectures for cognitive modeling. However, even if very powerful architectures existed for the modeling of SYNTHERS, the question would remain what behaviors they should exhibit. This question probably cannot be answered in general; Chandrasekaran and Josephson (1999) argue that what a cognitive model needs to contain is vitally affected by the kinds of questions one needs to answer, i.e., the goals of the simulation. Therefore, the fidelity of the model must be judged by the needs of the model user. They describe a research strategy in which the requirements of models are empirically investigated. Our approach towards the definition of SYNTHERS as teammates for team training has been somewhat along those lines. The goals of this research are threefold:

• To define empirically validated requirements for SYNTHERS for team training. The emphasis is not just on defining requirements but also on their empirical validation: we would like to know the relative contribution of particular SYNTHER requirements in terms of training effectiveness.
• To develop guidelines for the development of plausible cognitive models.
• To develop guidelines for scenario development including role players and/or SYNTHERS.

The results that we have so far with SYNTHERS show that they can be very valuable teammates in team training, and that teams trained with synthetic teammates (in this case very thoroughly scripted role players) have better teamwork skills than teams trained in the "classical way" (with other teammates) (Schaafstal, Hoeft, & van Schaik, 2002). Since the results are this promising, we are working towards defining SYNTHERS in the CrisisKit environment, too.
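The contrast between a scripted synthetic teammate and a variable human role player can be illustrated with a deliberately simple sketch. The rule format and replies below are our own assumptions, much closer to the "thoroughly scripted role-players" of the study than to a full cognitive model of the kind the CGF literature calls for.

```python
class Synther:
    """A minimal scripted synthetic teammate ('SYNTHER')."""

    def __init__(self, role, rules):
        self.role = role
        self.rules = rules  # keyword -> scripted reply

    def respond(self, message):
        """Reply predictably to the first matching keyword; a human
        role player's replies may vary between training sessions."""
        lower = message.lower()
        for keyword, reply in self.rules.items():
            if keyword in lower:
                return f"{self.role}: {reply}"
        return f"{self.role}: acknowledged"

fire_chief = Synther("fire_chief", {
    "evacuation": "Units available to support evacuation of sector 3",
    "casualties": "Two injured, both transported to hospital",
})
print(fire_chief.respond("What is the current status of casualties?"))
```

A keyword table is obviously far from the behavioral realism discussed above, but it makes the key training property explicit: the SYNTHER is always available and behaves identically in every session, so differences in team performance can be attributed to the trainees rather than to the role player.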

Conclusion Although CrisisKit was initially developed as a research tool for understanding EM team performance, it has evolved into an operational system used for EM training, and it forms the basis for the current development of a national, generic tool for EM training. There are still plenty of opportunities for improving the system and its application in real life. Future publications will detail some of those developments.

References

Chandrasekaran, B., & Josephson, J. R. (1999). Cognitive modeling for simulation goals: A research strategy for computer-generated forces. In Proceedings of the Eighth Conference on Computer Generated Forces and Behavioral Representation, May 11-13, 1999, Orlando, FL, USA.

Dowell, J., Smith, W., & Pidgeon, N. (1997). Design of the natural: An engineering process for naturalistic decision making. In R. Flin, E. Salas, M. Strub, & L. Martin (Eds.), Decision making under stress: Emerging themes and applications (pp. 126-136). Ashgate, Hants, UK.

Federal Aviation Administration (1990). Line operational simulation: Line-oriented flight training, special purpose operational training, live operational evaluation (Advisory Circular No. 120-53B). Washington, DC: Department of Transportation.

Fowlkes, J. E., Dwyer, D. J., Milham, L. M., Burns, J. J., & Pierce, L. G. (1999). Team skills assessment: A test and evaluation component for emerging weapon systems. Proceedings of the 1999 Interservice/Industry Training, Simulation, and Education Conference (CD-ROM).

Fowlkes, J. E., Lane, N. E., Salas, E., Franz, T., & Oser, R. (1994). Improving the measurement of team performance: The TARGETs methodology. Military Psychology, 6, 47-61.

Hall, J. K., Dwyer, D. J., Cannon-Bowers, J. A., Salas, E., & Volpe, C. E. (1993). Towards assessing team tactical decision making under stress: The development of a methodology for structuring team training exercises. Proceedings of the 15th Annual Interservice/Industry Training Systems and Education Conference, pp. 89-98, Washington, DC, USA.

Hicinbothom, J., & Lyons, D. M. (1999). Synthetic forces for team training. In Proceedings of the Eighth Conference on Computer Generated Forces and Behavioral Representation, May 11-13, 1999, Orlando, FL, USA.

Jenvald, J. (1999). Methods and tools in computer-supported taskforce training. Linköping University, Studies in Science and Technology, Dissertation No. 598.

Johnston, J. H., Smith-Jentsch, K. A., & Cannon-Bowers, J. A. (1997). Performance measurement tools for enhancing team decision making. In M. T. Brannick, E. Salas, & C. Prince (Eds.), Assessment and measurement of team performance: Theory, research, and applications (pp. 45-62). Erlbaum, Hillsdale, NJ, USA.

Lyons, D. M., & Hawkins, H. (1999). Cognitive and behavioral modeling techniques for CGF: A new initiative. In Proceedings of the Eighth Conference on Computer Generated Forces and Behavioral Representation, May 11-13, 1999, Orlando, FL, USA.

Moses, F. L. (Ed.) (1997). Guidelines for using virtual simulation and distributed interactive simulation (DIS) to train multi-service tactical operations. TAPSTEM, Washington, DC, USA.

NAVEDTRA 110A (1981). Procedures for instructional system development (Report 0502-LP-00000-5510). Pensacola, FL: Department of the Navy, Chief of Naval Education and Training.

Oser, R. L., Cannon-Bowers, J. A., Dwyer, D. J., & Salas, E. (1997). Establishing a learning environment for JSIMS: Challenges and considerations [CD-ROM]. Proceedings of the 19th Annual Interservice/Industry Training, Simulation and Education Conference, pp. 144-153, Orlando, FL, USA.

Pew, R. W., & Mavor, A. S. (Eds.) (1998). Modeling human and organizational behavior: Application to military simulations. Washington, DC: National Academy Press.

Pruitt, J. S., Burns, J. J., Wetteland, C. R., & Dumestre, T. L. (1997). Shipboard mobile aid for training and evaluation. Proceedings of the 41st Annual Meeting of the Human Factors and Ergonomics Society, pp. 1113-1117, Santa Monica, CA, USA.

Schaafstal, A. M., Hoeft, R. M., & van Schaik, M. (2002, in press). Training a team with simulated team members. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, October 2002, Baltimore, USA.

Schaafstal, A. M., Lyons, D. M., & Reynols, R. (2001). Teammates and trainers: The fusion of SAFs and ITSs. Proceedings of the Conference on Computer Generated Forces and Behavioral Representation.

Stroomer, S., Schaafstal, A. M., & Riemersma, J. B. J. (2000). Training of crisis teams: A task analysis and general didactic model (Report TNO-TM-00-D003) (in Dutch).

Zachary, W., Cannon-Bowers, J., Bilazarian, P., Krecker, D., Lardieri, P., & Burns, J. (1999). The Advanced Embedded Training System (AETS): An intelligent embedded tutoring system for tactical team training. International Journal of Artificial Intelligence in Education, 10, 257-277.

Zachary, W., Cannon-Bowers, J. A., Burns, J., Bilazarian, P., & Krecker, D. (1998). An Advanced Embedded Training System (AETS) for tactical team training. Paper presented at the Fourth International Conference on Intelligent Tutoring Systems, August 1998, San Antonio, TX, USA.