Simulation in Graduate Medical Education 2008: A Review for Emergency Medicine Steve McLaughlin, MD, Michael T. Fitch, MD, PhD, Deepi G. Goyal, MD, Emily Hayden, MD, Christine Yang Kauh, MD, Torrey A. Laack, MD, Thomas Nowicki, MD, Yasuharu Okuda, MD, Ken Palm, MD, Charles N. Pozner, MD, John Vozenilek, MD, Ernest Wang, MD, James A. Gordon, MD, MPA, on behalf of the SAEM Technology in Medical Education Committee and the Simulation Interest Group
Abstract Health care simulation includes a variety of educational techniques used to complement actual patient experiences with realistic yet artificial exercises. This field is rapidly growing and is widely used in emergency medicine (EM) graduate medical education (GME) programs. We describe the state of simulation in EM resident education, including its role in learning and assessment. The use of medical simulation in GME is increasing for a number of reasons, including the limitations of the 80-hour resident work week, patient dissatisfaction with being "practiced on," a greater emphasis on patient safety, and the importance of early acquisition of complex clinical skills. Simulation-based assessment (SBA) is advancing to the point where it can revolutionize the way clinical competence is assessed in residency training programs. This article also discusses the design of simulation centers and the resources available for developing simulation programs in graduate EM education. The level of interest in these resources is evident in the numerous national EM organizations with internal working groups focusing on simulation. In the future, the health care system will likely follow the example of the airline industry, nuclear power plants, and the military, making rigorous simulation-based training and evaluation a routine part of education and practice. ACADEMIC EMERGENCY MEDICINE 2008; 15:1117–1129 © 2008 by the Society for Academic Emergency Medicine Keywords: graduate medical education, simulation
From the Department of Emergency Medicine, University of New Mexico (SM), Albuquerque, NM; the Department of Emergency Medicine, Wake Forest University (MTF), Winston-Salem, NC; the Department of Emergency Medicine, Mayo Clinic Rochester (DGG, TAL), Rochester, MN; Gilbert Program in Medical Simulation, Harvard Medical School and the Department of Emergency Medicine, Massachusetts General Hospital (EH, JAG), Boston, MA; New York Methodist Hospital (CYK), Brooklyn, NY; the Integrated Residency in Emergency Medicine at the University of Connecticut (TN), Farmington, CT; the Department of Emergency Medicine, Mount Sinai School of Medicine (YO), New York, NY; the Department of Emergency Medicine, Vanderbilt University Medical Center (KP), Nashville, TN; the STRATUS Center for Medical Simulation, Brigham and Women's Hospital (Emergency Medicine), Harvard Medical School (CNP), Boston, MA; the Northwestern McGaw Simulation Network, Medical Education, Feinberg School of Medicine, Northwestern University, Division of Emergency Medicine, Evanston Northwestern Healthcare (JV), Evanston, IL; and the Division of Emergency Medicine, McGaw Medical Center of Northwestern University, Evanston Hospital (EW), Evanston, IL. Received March 29, 2008; accepted March 31, 2008. Address for correspondence and reprints: Steve McLaughlin, MD; e-mail:
[email protected].
© 2008 by the Society for Academic Emergency Medicine doi: 10.1111/j.1553-2712.2008.00188.x
The field of health care simulation is rapidly growing, and these techniques are widely used in emergency medicine (EM) graduate medical education (GME) programs. We describe the current state of simulation in EM resident education, including its role in learning and assessment. We focus on four main areas: simulation as a teaching tool, its role in assessment, faculty development, and guidance on developing a simulation program. We conclude with a list of resources and thoughts on the future of health care simulation.

History

Simulation is a technique used in health care education to replace or amplify real patient experiences with contrived scenarios designed to replicate real clinical encounters. These experiential learning sessions are designed to evoke or replicate substantial aspects of the real world in a fully interactive manner.1 In 1959, Marx et al.2 described the first MEDLINE-referenced use of a cardiovascular simulator for the evaluation of prosthetic aortic valves. The first mannequin-based systems were developed by Denson and Abrahamson at USC3 in the 1960s, and later by Gaba and DeAnda at Stanford.4 Gaba and DeAnda describe an anesthesia simulation to recreate the operating room environment
Table 1
Types of Simulation Technology

Mannequin-based simulators: Mannequin-based, or "high-fidelity," simulators use sophisticated computer-driven electronic and pneumatic mannequins to provide health care professionals with realistic patients that breathe, respond to drugs, talk, and have vital sign outputs into the clinical monitoring equipment in the treatment room.
Partial or complex task trainers: Provide a highly realistic yet focused experience for the learner and are designed for a specific procedure, such as central line placement, bronchoscopy, or airway management.
Screen-based computer simulators: Programs that run on personal computers or the Internet that allow learners to work through cases using clinical knowledge and critical decision-making skills.
Standardized patients: "Actors" specifically trained to present their medical histories, simulate physical symptoms, and portray emotions as specified by each case.
VR: VR is a simulated, immersive environment, created by a combination of computer-based images and interface devices. A VR environment may include visual stimuli, sound, motion, and smell.

VR = virtual reality.
that could be used for physician training and research.4 Using this model, anesthesiology educators pioneered the use of immersive simulation in GME. In the 20 years since Gaba and DeAnda's original work, the technique of simulation has grown to encompass a variety of tools for providing an augmented learning experience (Table 1). Some of the initial published descriptions of the use of simulation in EM education include a description of team training principles,5 a discussion of human responses to the simulation environment,6 and a description of a simulation-based medical education service.7 There has been tremendous growth in simulation within EM since 2003, highlighted by the creation of interest groups and committees focused on simulation within the major EM organizations. The Society for Simulation in Healthcare (SSH) was established in January 2004 as an umbrella organization for all specialties. Its mission is ". . . (to) lead in facilitating excellence in (multispecialty) health care education, practice, and research through simulation modalities."8 SSH has approximately 2,000 members representing a multispecialty, international membership. The leadership and membership of SSH include numerous emergency physicians (EPs).

Recent Changes in GME

Initiated in 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcomes Project was the product of a collaborative effort between the ACGME and the American Board of Medical Specialties (ABMS). To ensure the quality of GME, one of its primary goals was to place "increasing emphasis on educational outcomes in the accreditation of residency education programs."9 The project is rooted in six general core competencies that can be applied across the spectrum of medical specialties. Each specialty and individual program is permitted to tailor the competencies to the learning requirements of its respective practice environment.
There are four phases to this project with a general timeline of implementation. Residency programs are now in Phase 3, which includes the full integration of the competencies and their assessment into their training programs. Residencies are charged with employing resident performance data as the basis for promotion and using external measures to
ensure that residents are performing at levels consistent with their educational objectives. Simulation is listed by the ACGME as one of the key assessment tools for a number of the core competencies.10

SIMULATION-BASED TEACHING IN GME

The use of medical simulation in GME is increasing in part because of limitations of the 80-hour resident work week, patient dissatisfaction regarding being "practiced on," a greater emphasis on patient safety, and the importance of early acquisition of complex skills before actual operative or procedural practice. While a few programs have transformed their residency curriculum to fully integrate medical simulation,11 most have employed simulation less comprehensively. Over the past decade, there has been a major paradigm shift in the format of medical teaching, as exemplified by the incorporation of problem-based learning within our medical schools.12 GME has also shifted, out of necessity, from traditional apprenticeship to more directed clinical skills training.13 Medical information is growing exponentially, yet the length of medical training remains static. Furthermore, with work hour restrictions, it is unrealistic to expect that residents will have broad exposure to all necessary clinical entities before graduation. Simulation can bridge the gap between the classroom and bedside patient care by deliberately ensuring that all trainees are exposed to core clinical problems.
Using simulation, the teacher can apply many of the principles of adult learning by creating a risk-free environment for residents to learn practical material that is relevant to their day-to-day patient encounters.14 Instead of sitting in a lecture imagining how to intubate a patient, or learning on-the-job in the emergency department (ED) with a rapidly desaturating patient with chronic obstructive pulmonary disease, the learner can be taught the indications of airway management, practice procedural techniques, and receive immediate feedback, all in a controlled setting. This mode of teaching allows trainees to integrate and apply knowledge for clinical decision-making in immersive environments. Simulation has been successfully used with emergency medical services (EMS) providers,15 nurses,16
medical students,17,18,19 resident physicians,19 practicing physicians,21 and even high school and college students.22 The anticipated learning goals for the exercise, the expectations for learner performance, and the subject matter for the simulation all depend on the intended audience. Once the target audience has been identified, the main subject matter and the learning goals for the session should be determined. These may center on clinical management principles, such as an approach to airway management, or a specific disease process, like the management of congestive heart failure or acute myocardial infarction. Other objectives may include crisis resource management and team training,23,24 or basic science principles that underlie a clinical scenario.18,25 Secondary goals may include illustration of the clinical reasoning process, teamwork and communication skills, review of clinical algorithms such as advanced cardiac life support (ACLS),26 or demonstration of clinical skills and procedures. Simulation allows learners the opportunity to practice critical, time-sensitive skills without risk to patient or learner. Patients do not want their own clinical care to be used as an opportunity for skill training. A patient preference survey showed that if given the choice, only 42% would let a medical student perform their first venipuncture on them, and only 7% would allow a first-time lumbar puncture. Approximately 50% of patients would never let a medical student perform a lumbar puncture, central line, or intubation on them at all.27 Skills such as ACLS can be taught to a trainee away from the distractions of the clinical environment and can allow time for rehearsal before application to a patient encounter.
Instructional science research has shown that to ensure the acquisition and maintenance of skills at an expert level, the learner must engage upon deliberate practice of the desired educational outcome.28 Deliberate practice must consist of repetition and feedback in a structured environment with a rigorous skills assessment where the learning objective is appropriate for the level of the learner.29,30 Simulation is an ideal modality to allow deliberate practice in a wide variety of clinical scenarios, with opportunities for debriefing after the scenario.26 Debriefing sessions are important for the success of simulation experiences and may be most effective if presented in a structured format to discuss specific aspects important for participant learning.31,32 While simulation may not be a replacement for actual patient encounters, it can offer a much more protected opportunity for review and reflection of clinical care. Despite close supervision, some aspects of clinical care will always be carried out independently by learners. Some, such as the establishment of rapport, efficient history-taking, informed consent, and discharge instructions, are seldom observed by faculty. Simulation allows for the direct observation of these skills by educators, and allows learners to self-reflect and receive external feedback.33 Pervasive throughout simulation literature are surveys and feedback statements from learners that simulation is helpful and that learners typically enjoy it as a training modality. Any teaching adjunct that can inspire and engage a student should be strongly considered as part of a teacher’s repertoire. Through an in-depth review of
simulation education research literature, Issenberg et al.34 suggest features of simulation that lead to effective learning. Some of these features include providing feedback during the learning experience, allowing repetitive practice, providing increasing levels of difficulty, creating clinical variation, and carefully controlling the environment.34 Issenberg et al.34 also discuss the importance of integrating simulation into the overall curriculum, providing team learning opportunities, and clearly defining benchmarks and outcomes. We next present an overview of some of the evidence to support the various simulation technologies as effective teaching methods.

Evidence

Like medicine, other industries that have utilized simulation are inherently complex and high stakes and rely on expert skill acquisition to reflexively deal with the unexpected. There is limited but growing evidence that simulation training or improved performance on a simulator can translate to improved overall patient care.35 Some of this limitation in evidence is due to the relatively new focus on simulation as an educational tool. In addition, many of the important questions in simulation-based education and assessment cannot be answered by traditional biomedical randomized controlled trials. Poor patient outcomes are relatively rare and rest on such an inherently complex and individualized set of factors that power and feasibility issues often preclude a definitive study. Novel research techniques, including those developed and used extensively in social and educational science, will need to inform simulation research when important questions cannot be answered using the routine biomedical paradigm. Many studies in the medical simulation literature focus on skills training. Surrogate endpoints, such as knowledge retention or subjective improvement, are used to show that simulation can be effective in teaching specific skills or protocols.
Success, as rated by task performance, error reduction, reduced training time, or decreased response time, has also been used to quantify the efficacy of training. In one study, paramedic students trained in intubation on a simulator were equally able to intubate in the operating room as students trained in the operating room.36 Simulation training has also been shown to improve adherence to safety concerns in pediatric procedural sedation by nonanesthesiologists.37 One randomized controlled trial does suggest that fourth-year medical students performed better in simulated scenarios after training on the simulator than students who participated in problem-based learning sessions.38 A curriculum featuring deliberate practice dramatically increased the skills of residents in ACLS scenarios.26 Some of the most compelling analyses of real-world advantages of simulation training come from the surgical literature. As part of a "VR to OR" project, a randomized, controlled, double-blinded trial found that residents trained using a moderate-fidelity virtual reality (VR) trainer for laparoscopic cholecystectomy were 29% faster in gallbladder dissection. Importantly, VR-trained subjects were nine times less likely to transiently falter and five times less likely to injure the gallbladder or burn nontarget tissue than their counterparts who received standard programmatic training.
Mean errors were also six times less likely for the VR-trained group.39 Multiple studies have followed, showing that VR training improves performance on minimally invasive surgery and various endoscopies. A milestone in the progress of simulation-based training was the 2004 Food and Drug Administration mandate for VR training for carotid stent placement.40 Banks et al.41 also reported improved resident performance of laparoscopic tubal ligation compared to controls, when employing laparoscopic simulators in an obstetrics/gynecology residency. Other procedural simulations used in residency training include central venous access,42 cystoscopy,43 and airway management.44 One study of 45 fellows in gastroenterology training attempted to answer the question about the relationship of simulation training to clinical performance with actual colonoscopy.45 This randomized, controlled, blinded, multicenter trial compared 10 hours of colonoscopy simulator training to no training and found greater objective competency in the first 80 colonoscopy procedures on real patients. Interestingly, this increase in competency was only seen in the early phase of training, as the median number of cases to reach 90% competency was the same (160 patients) for both groups. This supports the idea that doing simulation training first may make additional training on real patients safer. Crisis or crew resource management and teamwork skills have been another focus of simulation training. Many institutions are now instituting crew resource management courses or rapid response teams that train using high-fidelity simulation. Although there is little rigorous evidence, multiple survey studies report perceived improvement and successful real-life application of learned skills soon after completing the course.
In two small controlled studies, team behaviors seemed to improve after simulator training.46,47 Although high-fidelity simulation in GME is becoming commonplace, superiority over other innovative and interactive teaching modalities, such as case-based learning,48 computer-based learning,49 or video-assisted modalities,19 for scenario training is unclear. Simulation, like all educational techniques, needs to be matched to an appropriate set of learning objectives. Clearly there is a need for further research to validate the utility of simulation and its correlation to performance in a clinical setting and the overall care of the patient.

Examples in EM

A survey conducted in 2002 to 2003 revealed that 60% of EM training programs had either "no formal curriculum" or only the "initial development" of a simulation curriculum.20 In addition, less than half of the ACGME-approved EM residency programs possessed a high-fidelity mannequin-based simulation training center, and only 18% of programs with institutional simulation training centers used them to train EM residents.20 Over the past 5 years, however, use of simulation training in EM has grown tremendously, with over 80% of residency programs now using mannequin-based simulations.49a Of the programs offering simulation-based teaching, most have added select simulation modalities to their
existing curriculum. A few programs have redesigned their educational curriculum to fully incorporate medical simulation. Binstadt et al.11 describe a revamped EM curriculum utilizing the full spectrum of simulation-based teaching. Computer-based cases, actors, high-fidelity mannequins, and an advanced skills laboratory with a variety of task trainers are incorporated. Traditional lectures and seminar-based teaching are reserved for content better suited to that modality. McLaughlin et al.50 describe a 3-year curriculum that involves 15 simulated patient encounters of graduated complexity. The ACGME core competencies are incorporated into the cases, with formative evaluations of the learners. Another simulation-based curriculum specifically addresses the systems-based practice ACGME core competency.51 In contrast to the prior examples, the EM residency at the Mayo Clinic has transitioned 20% of the core curriculum to simulation-based teaching without segregating junior and senior residents for the cases or debriefing sessions.52 Caring for multiple patients simultaneously is essential to the practice of EM and represents a particularly high-risk environment. Simulation scenarios with two or more simultaneous patients are being used to develop multitasking, crew resource management, and decision-making skills without risk to actual patients.53 High-fidelity simulation has even been used to replicate patient encounters during morbidity and mortality conferences.54 Internationally, simulation has been used to challenge the traditional techniques of medical education on an even larger scale. Ziv et al.55 describe a model for cultural change in medical education using simulation-based teaching.
Matching Learning Objectives to the Educational Approach

High-fidelity simulation-based teaching has been compared to a theatrical production, with the need for actors, props, scripts, and people to work behind the scenes.56 It can be very resource-intensive, which is often viewed as its primary limitation. However, simple or low-tech simulation approaches can often be just as effective.57 Binstadt et al.11 describe a "clinical performance pyramid" based on a hierarchical approach to teaching. At its base is knowledge, from which higher-level decision-making develops. Decision-making guides the choice of appropriate actions, leading to a need for procedural competence. Finally, the highest level of competence is achieved when one functions successfully as a member of the health care team.11 Simulation may be an inefficient method for teaching simple facts in isolation, but it is quite an effective tool for anchoring knowledge within the context of higher-level skills like decision-making, procedural skills, and teamwork training.56 Screen-based computer simulations58 or VR systems59 can effectively complement high-fidelity simulation. The goal of the educator is to choose the educational method best suited to the learning objectives to be achieved. As training programs incorporate simulation-based education into their curricula,5,24,50 they are working to define which of the ACGME core competencies are most amenable to simulation-based education.
"Patient care," for example, is highly compatible with high-fidelity mannequin-based simulation, which offers a robust platform for performing a history and physical exam, creating a differential diagnosis, and initiating therapy. Crisis or crew resource management and teamwork skills can be learned and assessed through simulated patient encounters. "Communication and interpersonal skills" can be practiced through cases that incorporate elements such as end-of-life discussions with family or dynamic team resuscitations. If learners are allowed to access a realistic practice environment as part of the simulated exercise, then they can also develop the competencies of "practice-based learning" and "systems-based practice." Such resources may include the imagined ability to send a patient to the cardiac lab or real-time access to on-line medical support. Multiple-patient scenarios can also present challenges in resource or systems management. Similarly, repetitive scenarios can allow instructors to assess students' ability to incorporate previously learned knowledge. While "medical knowledge" has historically been learned through reading and attending formal lectures, anchoring this basic knowledge into medical decision-making may be enhanced through simulation. The ability to engage the student in complex dialogue allows for communication encounters similar to those developed as part of standardized actor-patient exercises. Therefore, even "professionalism" can be taught effectively using simulation. One obvious and widely accepted use of medical simulation is in the area of procedural skill development. Public demand for patient safety is helping to drive the concept that health care providers should be competent with invasive procedural skills before live patient care. Task training with specialized units can allow procedures to be repetitively performed with no patient risk.
Detailed instruction can occur throughout the initial learning process, allowing the trainee to explore each step and its potential complications. Repetition, linked with corrective feedback from an expert, can solidify a student's competence. Many task trainers are being produced for specific procedures, such as ultrasound-guided central venous catheter insertion, airway management techniques, lumbar puncture, and vaginal delivery. As with carotid stent placement, it may soon become standard practice to require demonstration of procedural competence and proof of skill maintenance on a task trainer to acquire credentialing for actual patient care.

SIMULATION-BASED ASSESSMENT IN GME

Simulation-based assessment (SBA) has been used in health care since the introduction of the Objective Structured Clinical Examination (OSCE) with simulated actor-patients in the early 1980s. High-fidelity mannequin-based simulation now offers the potential to assess learners on more diverse and complex aspects of clinical care. There are numerous potential applications for SBA in EM. As the ACGME Outcomes Project moves into its third phase, full integration, SBA offers promise as a tool for objectively assessing some of the competencies that are more difficult to evaluate via traditional
means.10 As alluded to earlier, a module that requires a resident to discuss advance directives with the family member of a critically ill patient would allow assessment of professionalism, interpersonal and communication skills, systems-based practice, and patient care. As part of a crisis resource management course, Gisondi et al.60 assessed management of ethical dilemmas by trainees. Their performance assessment tool was able to discriminate between experienced and inexperienced residents using several elements of professionalism. Although staffing of EDs by faculty offers an opportunity to observe behaviors more closely than in many other specialties, important aspects of care are not routinely observed. SBA offers the added benefit of allowing the learner to develop and implement his or her own plans without the need for faculty intervention to ensure real-time patient safety.

Evidence

Simulation-based assessment using modern tools and techniques has the potential to revolutionize the manner in which competence is assessed and may serve as a critical tool to accomplish the long-term objectives of the ACGME Outcomes Project. Although studies validating assessment tools for use in SBA in EM are increasing,60–63 the majority of the work to date has addressed graduate and undergraduate learners in anesthesiology as well as the surgical and procedural specialties.60,61,63–74 To move forward, much more work must be done to validate these tools for EM. Simulation training has been demonstrated to be a useful tool for skill assessment and training.
A recent study of physicians in a pediatric training program found that high-fidelity medical simulation can assess a resident's ability to manage a pediatric airway.75 Assessment of skill development for managing shoulder dystocia found that training with mannequins improved physician and midwife management of simulated dystocia.76,77 Interestingly, both traditional low-fidelity training and computerized high-fidelity simulation were effective in producing some improvement. The challenge remains to determine whether or not these increases in simulated performance translate into improvements in real patient outcomes. Teams of resident physicians from multiple specialties (medicine, surgery, and anesthesiology) were assessed in another study for their ability to follow practice guidelines in the management of sepsis.78 This retrospective review of videotapes identified adequate and inadequate levels of performance in the simulated scenario based on established guidelines. The use of these consensus practice guidelines as a benchmark for performance assessment demonstrates one approach to standardized assessment when customized metrics have not been developed or validated. At a minimum, simulation assessments should reliably be able to discriminate between novice and experienced clinicians. Evaluation tools previously developed for EM oral examinations appear to retain the ability to discriminate among skill levels when used in a simulator-based testing environment.62 One study used a novel rating scale to assess crisis resource management of critically ill patients among 60 residents and found significant differences between first-year and third-year
residents.23 Another study of 54 residents in a pediatric training program found that simulation can reliably measure and discriminate competence.64 Expert consensus was used to develop and validate four simulation case scenarios, and three of the four cases demonstrated statistically and educationally significant differences between first- and second-year residents. In a study of 44 EM residents who were tested on a patient care competency using time-based goals for decision-making, significant differences were found between novice and experienced resident physicians.61 An alternative approach to evaluating clinical competency was demonstrated in a study of internal medicine residents learning ACLS who were required to achieve mastery of the skill set.26 The investigators assessed second-year residents' proficiency in ACLS scenarios and required residents who did not achieve competency after initial training to continue with additional practice time until competency was achieved. While residents required differing amounts of training time to achieve acceptable results, all 41 residents ultimately met the mastery learning goals. For essential components of clinical patient care, such as resuscitation algorithms, these assessment methods may be a step toward ensuring and documenting learner competence. Guidelines for training can then be geared toward outcomes (e.g., competence) rather than processes (e.g., course completion). Critical to ensuring that assessments fairly and accurately evaluate skills are the scoring rubrics used to determine competence.
Studies vary on the superiority of checklist (CL) or global rating scale (GRS)-based scoring models.63,65,67,72,74 While CLs seem to be particularly useful at rating technical actions, GRSs seem to be better for assessing complex behaviors and decision-making.72–74 In cases where successful management hinges not only on performance of specific tasks, but also the order in which those tasks are completed, GRSs are better able to capture the aggregate performance.79 Critical action-based global ratings, akin to those used on the American Board of Emergency Medicine Oral Certification examination, may combine aspects of both CL- and GR-based ratings. This hybrid approach provides concrete guidance when rating complex tasks and offers specific criteria for various performance levels.68,71,73,74 Also critical to accurate assessment is the number of observations that are necessary to reliably rate learners. Most studies have found good correlation between independent raters assessing individual performances. Although one study suggested that at least two raters were required to provide reliable ratings, this study also was one of the few that suggested that behavioral ratings were significantly less reliable than CLs.63 This raises the possibility that raters disagreed because criteria were not established for various ratings a priori. Most other anesthesia studies found that there was little gained in terms of reliability by adding more than one rater.66,67,69,74 Most studies have found that learners may perform well on one case but poorly on other cases when being assessed on the management of several cases.66–69,73 As noted above, internal consistency may be an unrealistic goal. Murray found that 6–8 performance samples yielded moderately
reliable scores, but felt that more samples would further increase reliability.71,73 The optimal number of samples required for EM assessment is probably in the range of 6–12 cases.69 The American Board of Emergency Medicine uses a total of seven patient encounters, five single-patient and two multiple-patient, for its high-stakes oral exam.80 Trainees must not be penalized for poor performance on a single case, because performance in one scenario is not a good predictor of performance in another.69 This suggests that it is important to ensure that characteristics pertinent to a specific case not be generalized to a subject’s performance at large. Unfortunately, the feasibility of SBA is inversely proportional to the number of cases, time, and resources necessary to perform the assessment. Development of highly reliable cases is difficult and time-consuming.64 It will therefore be critical to determine the minimal number of cases required to provide accurate data.

Matching Assessment Objectives to the Evaluation Approach

Opportunities for SBA to satisfy specific requirements of the Residency Review Committee for Emergency Medicine (RRC-EM) include, but are not limited to, satisfying the guidelines for patient evaluation, resuscitation, and procedural care.

Chief Complaint Competency. By having a series of cases that demonstrate a range of pathology for a given chief complaint, it is possible to assess a resident’s ability to obtain data, develop a differential diagnosis, interpret diagnostic studies, and develop treatment plans for a range of conditions related to a given chief complaint.

Resuscitation Competency. Simulation offers the opportunity to allow a resident to manage a resuscitation from start to finish without the need for intervention on the part of supervising faculty. This allows one to introduce assessment of critical thinking and management strategies that are typically unavailable.
Simulated environments also offer an opportunity to assess the nonmedical skills required to optimally lead a resuscitation, such as prioritization, communication, and team management.

Procedural Competency. The development of realistic procedural simulators may allow for the assessment of resident competency in commonly performed EM procedures.

Annual Competency Assessment. By controlling the cases being assessed, programs can define expectations for learners based on their year of training to assess readiness for year-to-year progression. This can be done by defining minimal expectations for the management of a given case, based on year of training. Alternatively, one could introduce more complex problems to advanced trainees. Although SBA is potentially well suited to such high-stakes assessment, until validated, it should only be used as one tool to gauge readiness to progress and must be used in conjunction with other established modalities.
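The earlier observation that single-scenario performance predicts little, and that roughly 6–12 cases are needed for stable scores, can be illustrated with the Spearman–Brown prophecy formula from classical test theory. The single-case reliability used below is a hypothetical value chosen for illustration, not a figure taken from the cited studies.

```python
def spearman_brown(r_single, k):
    """Projected reliability of a score averaged over k parallel cases,
    given reliability r_single for a single case (classical test theory)."""
    return k * r_single / (1 + (k - 1) * r_single)

# Assuming (hypothetically) that one simulated case has reliability 0.25,
# moderate overall reliability emerges only after many cases -- consistent
# with the 6-12 case range discussed above.
for k in (1, 4, 8, 12):
    print(f"{k:2d} cases -> projected reliability {spearman_brown(0.25, k):.2f}")
```

Under this assumption, eight cases project to roughly 0.73 and twelve to 0.80, which squares with the characterization of 6–8 samples as yielding moderately reliable scores.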
ACAD EMERG MED • November 2008, Vol. 15, No. 11 • www.aemj.org
In its current form, SBA can be used effectively for formative or summative assessment. When used formatively, SBA can provide a medium by which faculty can objectively identify areas in which a learner is particularly weak or strong. If videotaped, this can be particularly powerful by allowing learners to self-identify behaviors and develop strategies to optimize performance. When used for formative feedback, where the stakes are low and the goal is to improve performance, the content of the session is more important than the structure. In contrast, when used summatively, the logistic requirements are more stringent, particularly if the stakes are high. Before SBA can be incorporated into high-stakes examinations, several criteria must be met to ensure its validity.63–66,81 Specifically, the tool must be reproducible, reliable, valid, and feasible. Before developing an SBA, the purpose of the activity must be clearly defined.81 An assessment of minimal competency for credentialing would be expected to have tightly defined performance standards with specific competency requirements. In Israel, high-fidelity simulation has been incorporated into the anesthesiology board certification process.67 Because of the purpose of such an examination, the pass/fail standard must understandably be set at a level that establishes the minimal bar for performance. Meanwhile, sessions designed for formative feedback may incorporate behavioral ratings indicative of a wider range of performance levels to allow for the identification of characteristics demonstrating excellence, in addition to those indicative of minimal performance standards. Feedback would be aimed at remediating suboptimal behaviors and identifying and reinforcing positive ones.

FACULTY TRAINING IN SIMULATION-BASED EDUCATION

Emergency medicine faculty should be familiar with the advantages and disadvantages of simulation, just as with any other educational tool.
While some faculty may naturally transition from bedside teaching to simulator-based instruction, dedicated training in simulation as a teaching tool is becoming more common in faculty development courses.82 Expertise in this area will become even more important for the current and future generations of EM educators as simulation becomes a more prevalent part of our residency programs. Simulation in the absence of a carefully thought-out scenario with defined educational goals is rarely as effective as a well-planned and well-executed event.34 Faculty who are interested in providing educational sessions for physicians in residency training must, therefore, be familiar with an approach to creating stimulating and appropriate experiences using high-fidelity simulation. General educational competencies such as objective writing, feedback, and assessment fully apply to simulation-based activities, but alone are not adequate preparation. Faculty who lead teaching initiatives or who are involved in simulation-based assessment will usually require additional focused training with these techniques. This training is available through a growing number of courses that are listed in the
resources section of this article. The training can also be provided at the institutional level if local expertise is available. Some elements important to include in faculty education programs in simulation are detailed below. As a foundation, simulation faculty leadership should have knowledge and skills in adult learning theory,83 objective writing,84 and curriculum design.85 They also need expertise in the clinical content area. The second level of educational expertise in simulation includes basic skills in assessment, such as knowing the advantages and disadvantages of various performance assessment tools and being able to match learner level to objectives and to teaching method. Experience with standard setting and giving feedback is also important. The third level of expertise includes simulation-specific skills such as scenario design, debriefing of high-fidelity mannequin-based simulation, and some technical knowledge about simulator operation, capabilities, limitations, and programming. In the ‘‘Train the Trainer’’ course at the University of New Mexico (UNM), these three levels of skills are taught in four half-day sessions comprising 16 total hours of contact time. One area of particular focus is scenario design. UNM’s faculty development course86 recommends an eight-step technique for developing simulation scenarios, as well as the use of a standardized reporting format. The eight steps of simulation design are listed in Table 2. Many EPs have also received training at the Institute for Medical Simulation, an intensive faculty development program sponsored by the Center for Medical Simulation in collaboration with the Harvard–MIT Division of Health Sciences and Technology. In summary, training faculty to be competent users of simulation-based technology requires time and is critical to the success of the program. There are a number of resources available for faculty development in simulation.
Simulation skills should be seen as a core competency for education faculty, along with the more traditional teaching techniques.

RESIDENT TRAINING IN SIMULATION-BASED EDUCATION

The major educational organizations in medicine have identified residents’ roles as teachers as an important part of the educational environment. The Liaison Committee on Medical Education (LCME) states that ‘‘Residents must be fully informed about the educational objectives of the clerkships and be prepared for their role as teachers and evaluators of medical students.’’87 The core competencies from the ACGME list the following objective under practice-based learning and improvement: ‘‘Residents must be able to facilitate the learning of students and other health care professionals.’’ Finally, Morrison et al.88 state that ‘‘. . . in the GME Core Curriculum of the AAMC, residents’ teaching skills are vitally important, particularly for those residents who teach third-year medical students in the ‘core’ clinical clerkships.’’ Data indicate that residents spend 20%–25% of their time supervising, teaching, and evaluating medical students and other residents.89–92 From the perspective of the students, residents have a significant role as teachers in their first clinical year,93
Table 2. The Eight Steps of Scenario Design (from S.A. McLaughlin)
1. OBJECTIVES: Create learning/assessment objectives.
2. LEARNERS: Incorporate background/needs of learners.
3. PATIENT: Create a patient vignette to meet objectives that also must elicit the performance you want to observe.
4. FLOW: Develop flow of simulation scenario including initial parameters, planned events/transitions, and response to anticipated interventions.
5. ENVIRONMENT: Design room, props, and script and determine simulator requirements.
6. ASSESSMENT: Develop assessment tools and methods.
7. DEBRIEFING: Determine debriefing issues and mislearning opportunities.
8. DEBUGGING: Test the scenario, equipment, learner responses, timing, assessment tools, and methods through extensive pilot testing.
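The standardized reporting format recommended above is not specified in this article; as one purely hypothetical way to encode it, the eight steps in Table 2 map naturally onto a simple record type. All field names and the sample case below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ScenarioTemplate:
    """Hypothetical record mirroring the eight steps of scenario design."""
    objectives: list   # 1. OBJECTIVES: learning/assessment objectives
    learners: str      # 2. LEARNERS: background and needs of the learners
    patient: str       # 3. PATIENT: vignette eliciting the target performance
    flow: list         # 4. FLOW: initial parameters, events, transitions
    environment: str   # 5. ENVIRONMENT: room, props, script, simulator needs
    assessment: str    # 6. ASSESSMENT: assessment tools and methods
    debriefing: list   # 7. DEBRIEFING: debriefing issues, mislearning points
    debugging: str     # 8. DEBUGGING: pilot-testing notes

# An invented sample case showing how a completed template might look.
bradycardia_case = ScenarioTemplate(
    objectives=["Recognize unstable bradycardia", "Initiate transcutaneous pacing"],
    learners="PGY-1 emergency medicine residents",
    patient="68-year-old with syncope and a heart rate of 32",
    flow=["HR 32 at start", "BP falls if pacing not started within 5 minutes"],
    environment="Resuscitation bay, high-fidelity mannequin, pacing-capable monitor",
    assessment="Critical-action checklist plus global rating scale",
    debriefing=["Timing of pacing", "Closed-loop communication"],
    debugging="Piloted with two faculty before the first learner session",
)
print(bradycardia_case.patient)
```

A shared structure of this kind is what makes case exchange between institutions practical: a scenario written at one center can be run at another without guessing what was left out.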
and they estimate that up to 30% of the teaching they receive is from residents.94 Simulation-based teaching is becoming an integral part of the educational skill set that we expect residents to obtain during their training. As the utilization of simulation expands, some programs have begun to involve residents as simulation facilitators. The Harvard Affiliated Emergency Medicine Residency at Brigham and Women’s Hospital and Massachusetts General Hospital has incorporated simulation as a component of the senior resident ‘‘teaching’’ rotation. Residents work closely with simulation faculty and staff to develop training modules for rotating medical students and junior residents. This program offers a more robust simulation component for medical student education while providing theoretical, technical, and instructional experience to the residents. It is clear that residents are critical members of our teaching teams, and as simulation becomes a more prevalent teaching method, they will need skills in teaching with this modality.

FELLOWSHIP TRAINING IN MEDICAL SIMULATION

For those residents who want preparation for academic careers in medical simulation, fellowship training is now becoming more widely available. Currently, simulation fellowship programs are 1 to 2 years in length at institutions with well-developed simulation programs. The fellowship experience can focus on a variety of simulation topic areas depending on the area of interest, typically including medical education, educational research, patient safety, program administration, or technology development. These fellowship programs can include master’s degree coursework or other faculty development courses in the relevant topics. Emergency medicine is well suited for simulation fellowship opportunities.
As an example, newly graduated EPs have had opportunities to train as fellows in Harvard Medical School’s simulation program since 2003, a position administered in collaboration with the Department of Emergency Medicine at Massachusetts General Hospital. Many departments of EM already offer fellowships in medical education or research, and a simulation fellowship can be constructed and offered as a natural extension of such efforts; more such fellowships are being offered every year. Because of the interdisciplinary nature of simulation, mentors and collaborators can be found from across departments in
the institution. Funding for the fellow is typically provided by a combination of sources, including the fellow’s part-time clinical work, departmental research and development funds, dedicated fellowship stipends, extramural grants, and institutional budgets.

DEVELOPING A SIMULATION CENTER

The number of simulation centers continues to increase, as has the body of experience on how best to develop a simulation program. At the same time, the relevant technologies have become more compact, and the increased use of wireless devices has improved usability. This combination of increased experience and new technologies has revolutionized simulation education spaces. It is essential that the simulation location and space fit the goals of the program. It is not entirely clear that a fixed center is the best solution for all programs. Developing a high-fidelity simulation laboratory requires significant upfront costs for equipment. Over the long term, however, even greater expenses are incurred for faculty, actors, content development, technical professionals, and administrative staff.95 Several active efforts are under way by seasoned simulation educators that focus on in situ simulation. These projects use simple storage space to house their devices, which are then deployed in clinical or educational areas on demand. Conducting scenarios in nontraditional locations, such as the back of an ambulance or in a medical helicopter,96 may also be highly effective. Large groups can experience and benefit from simulation in a lecture hall setting with some creative modifications to cases and presentation formats in both live18 and prerecorded54 simulation scenarios. If we follow the best science for the effective use of simulation,19,97 our simulation events will create optimum conditions for deliberate practice and feedback. These should be the guiding principles as a simulation center is planned.
The location of the center, the rooms, the layout, and the equipment will follow, and the specific local goals will further fine-tune the design effort. Location is a paramount consideration. If the facility is not located in close proximity to where the trainees and educators typically spend their time, a program may be severely limited.98 A simulation center generally has four main areas: the simulation area, the control room, the debriefing area, and an area for the storage of equipment. These
four spaces should be laid out so that participants and those touring the facility may easily pass between rooms. Facilities with an observation area for additional participants to watch and critique performance during an ongoing simulation can be effective in allowing a large group to participate by observing a smaller group of learners at the bedside. The simulation area (or stage) requires sufficient soundproofing that ambient noise and discussion in adjacent rooms do not penetrate, while remaining acoustically friendly to the recording devices essential to debriefing sessions. It is recommended that a direct line of sight be present between the control area and the simulation area, usually via one-way glass. Even though a center may be equipped with video recording and playback, this line of sight will assist the simulation director when the action of the participants blocks the camera’s view. The number and type of cables that pass between the control and simulation areas are variable, but suffice it to say that despite the prevalence of wireless controls and audio-video devices, the number of cables should not be underestimated. The audiovisual support that these cables bring to the debriefing area is a nearly essential component of simulation and facilitates one of the pillars of simulation education: feedback. An excellent resource has recently become available to those interested in planning a center. Kyle and Murray99 have published a compendium of simulation resources in their text, Clinical Simulation: Operations, Engineering, and Management. This volume includes the plans and schematics of several centers of various sizes. The common central element remains that the best centers are designed to support the educational goals of their faculty.

RESOURCES

Over the past several years, a number of resources have become available to help develop programs in simulation for GME.
In addition, most of the major EM organizations have created internal working groups on simulation. The Society for Academic Emergency Medicine (SAEM) has a new standing committee called the SAEM Technology in Medical Education Committee, which currently focuses on simulation-related activity as directed by the SAEM board of directors. In addition, SAEM has a Simulation Interest Group, which has a member-driven agenda. Finally, SAEM provides a simulation consult service for a nominal fee, which will send experienced faculty to institutions to assist with the development of simulation programs. All of these resources can be found on the SAEM Web site under the education section: http://www.saem.org/saemdnn/Education/Simulation/tabid/73/Default.aspx. The Harvard–MIT affiliated Center for Medical Simulation (CMS) offers one of the most complete courses to train faculty in using simulation. More information can be found at http://www.harvardmedsim.org/. The CMS course is well known for its rigorous approach and development of a solid foundation in the underlying science of simulation. The American College of
Emergency Physicians (ACEP) offers both a basic and an advanced teaching fellowship for faculty who are interested in becoming better educators. The ACEP Advanced Teaching Fellowship includes one of the only hands-on faculty development courses in the country for simulation. Additional information is available from the ACEP Web site: http://www.acep.org/cme.aspx?id=22382. The Society for Simulation in Healthcare (SSH), based in the United States, is the largest multispecialty organization in the world dedicated to the science of simulation. The SSH holds an annual meeting and has an active EM special interest group. More information is available at http://www.ssih.org/public/. SSH sponsors Simulation in Healthcare, the first peer-reviewed academic journal dedicated to the field, launched in January 2006 by Lippincott Williams & Wilkins. In Europe, the Society in Europe for Simulation Applied to Medicine (SESAM) is the major simulation organization. SESAM was organized in 1994 and has close ties with SSH. Both organizations share the journal Simulation in Healthcare. Information on SESAM is available at http://www.sesam-web.org/sesam_home.html. There are a number of excellent simulation Web sites supported by academic and private institutions around the country. Links to many of these can be accessed from the simulation section of the SAEM Web site. The SAEM Simulation Case Bank is an excellent resource to help develop simulation scenarios for GME (http://www.emedu.org/simlibrary). Faculty are encouraged to use this standardized case format to facilitate sharing of cases between institutions. Published cases are also available via this online resource for faculty interested in using materials that have been developed and tested at other institutions. Opportunities are also available for peer review of these types of emergency simulation materials via the AAMC’s online MedEdPORTAL (http://www.aamc.org/mededportal), which collaboratively sponsors the SAEM-AAMC Simulation Case Collection.
Faculty are encouraged to post their own teaching materials via these mechanisms to disseminate their scholarly products and share simulation cases with faculty at other centers.

THE FUTURE OF MEDICAL SIMULATION

In the future, a safer health care system will likely follow the example of the airline industry, nuclear power plants, and the military, all of which make rigorous simulation-based training and evaluation a routine part of education and practice. Imagine a health care system where it will be uncommon to practice on a patient before practicing on a simulator. Proponents will argue for such an approach as both an ethical100 and a regulatory imperative. Some accreditation bodies67 and hospital committees101 are already beginning to favor simulation-based demonstration of skill, and some insurance companies offer premium incentives for simulation training.102 That may become the norm. The government has already begun to experiment with a future where simulation is a component of both regulatory and reimbursement considerations.40
In the future, deans and divisions of medical simulation within schools and hospitals may be much more commonplace. EM as a specialty can and should play an important role in helping explore the field as a collaborative platform for integrating traditional and modern teaching techniques across health care. Imagine a learning environment where immersive simulation is easily compatible across a broad range of technologies and approaches, where costs decline and fidelity improves as industry, academia, and government all move to advance the field. Given the excitement that surrounds medical simulation, the topic might even help catalyze innovation in the life sciences, much like the Moon shot helped invigorate the field of engineering.22 Legislative initiatives like a bill currently before the U.S. House of Representatives, ‘‘Enhancing Safety in Medicine Utilizing Leading Advanced Simulation Technologies to Improve Outcomes Now (SIMULATION) Act of 2007,’’103 may assist in building federal infrastructure to help support the field.

References
1. Gaba DM. The future vision of simulation in healthcare. Qual Saf Health Care. 2004; 13(Suppl 1):i2–i10.
2. Marx T, Baldwin BR, Kittle CF. A cardiovascular simulator for the evaluation of prosthetic aortic valves. J Thorac Cardiovasc Surg. 1959; 38:412–8.
3. Denson J, Abrahamson S. A computer-controlled patient simulator. JAMA. 1969; 208:504–8.
4. Gaba DM, DeAnda A. A comprehensive anesthesia simulation environment: re-creating the operating room for research and training. Anesthesiology. 1988; 69:387–94.
5. Small S, Wuerz RC, Simon R, Shapiro N, Conn A, Setnik G. Demonstration of high-fidelity simulation team training for emergency medicine. Acad Emerg Med. 1999; 6:312–23.
6. Gordon JA, Wilkerson WM, Shaffer DW, Armstrong EG. ‘‘Practicing’’ medicine without risk: students’ and educators’ responses to high-fidelity patient simulation. Acad Med. 2001; 76:469–72.
7. Gordon JA, Pawlowski J. Education on-demand: the development of a simulator-based medical education service. Acad Med. 2002; 77:751–2.
8. Society for Simulation in Healthcare. Bylaws. Available at: http://www.ssih.org/public/. Accessed May 31, 2008.
9. Accreditation Council for Graduate Medical Education. Outcome Project. Available at: http://www.acgme.org/outcome/project/proHome.asp. Accessed May 18, 2008.
10. Accreditation Council for Graduate Medical Education (ACGME) and American Board of Medical Specialties (ABMS). Toolbox of Assessment Methods, Version 1.1. Sept. 2000. Available at: http://www.acgme.org/Outcome/assess/Toolbox.pdf. Accessed May 18, 2008.
11. Binstadt E, Walls RM, White B, et al. A comprehensive medical simulation education curriculum for emergency medicine residents. Ann Emerg Med. 2007; 49:495–504.
12. Neville AJ, Norman GR. PBL in the undergraduate MD program at McMaster University: three iterations in three decades. Acad Med. 2007; 82:370–4.
13. Maran NJ, Glavin RJ. Low- to high-fidelity simulation–a continuum of medical education? Med Educ. 2003; 37(Suppl 1):22–8.
14. Knowles M. The Adult Learner: A Neglected Species. Houston, TX: Gulf Publishing, 1990.
15. Moyer M. The simulation story: how patient simulation technology is raising the bar for EMS education. Emerg Med Serv. 2006; 35:47–53.
16. Beyea SC, von Reyn LK, Slattery MJ. A nurse residency program for competency development using human patient simulation. J Nurses Staff Dev. 2007; 23:77–82.
17. Gordon JA, Oriol NE, Cooper JB. Bringing good teaching cases ‘‘to life’’: a simulator-based medical education service. Acad Med. 2004; 79:23–7.
18. Fitch MT. Using high-fidelity emergency simulation with large groups of preclinical medical students in a basic science course. Med Teacher. 2007; 29:261–3.
19. Morgan PJ, Cleave-Hogg D, McIlroy J, Devitt JH. Simulation technology: a comparison of experiential and visual learning for undergraduate medical students. Anesthesiology. 2002; 96:10–6.
20. McLaughlin S, Bond W, Promes S, Spillane L. The status of human simulation training in emergency medicine residency programs. Simul Healthc. 2006; 1:18–21.
21. Sinz EH. Anesthesiology national CME program and ASA activities in simulation. Anesthesiol Clin. 2007; 25:209–23.
22. Gordon JA, Oriol NE. Perspective: fostering biomedical literacy among America’s youth: how medical simulation reshapes the agenda. Acad Med. 2008; 83:521–3.
23. Kim J, Neilipovitz D, Cardinal P, Chiu M, Clinch J. A pilot study using high-fidelity simulation to formally evaluate performance in the resuscitation of critically ill patients: The University of Ottawa Critical Care Medicine, High-Fidelity Simulation, and Crisis Resource Management I Study. Crit Care Med. 2006; 34:2167–74.
24. Reznek M, Smith-Coggins R, Howard S, et al. Emergency medicine crisis resource management (EMCRM): pilot study of a simulation-based crisis management course for emergency medicine. Acad Emerg Med. 2003; 10:386–9.
25. Gordon JA, Brown DFM, Armstrong EG. Can a simulated critical care encounter accelerate basic science learning among preclinical medical students? A pilot study. Simul Healthc. 2006; 1:13–7.
26. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006; 21:251–6.
27. Graber MA, Pierre J, Charlton M. Patient opinions and attitudes toward medical student procedures in the emergency department. Acad Emerg Med. 2003; 10:1329–33.
28. Issenberg SB, McGaghie WC. Clinical skills training–practice makes perfect. Med Educ. 2002; 36:210.
29. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993; 100:363–406.
30. Moulaert V, Verwijnen MGM, Rikers R, Scherpbier A. The effects of deliberate practice in undergraduate medical education. Med Educ. 2004; 38:1044–52.
31. Owen H, Follows V. GREAT simulation debriefing. Med Educ. 2006; 40:488–9.
32. Bond WF, Deitrick LM, Eberhardt M, et al. Cognitive versus technical debriefing after simulation training. Acad Emerg Med. 2006; 13:276–83.
33. Rudolph J, Simon R, Rivard P, Dufresne R, Raemer D. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesth Clin. 2007; 25:361–76.
34. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teacher. 2005; 27:10–28.
35. Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008; 133:56–61.
36. Hall RE, Plant JR, Bands CJ, Wall AR, Kang J, Hall CA. Human patient simulation is effective for teaching paramedic students endotracheal intubation. Acad Emerg Med. 2005; 12:850–5.
37. Shavit I, Keidan I, Hoffman Y, Mishuk L, Rubin O, Ziv A, Steiner IP. Enhancing patient safety during pediatric sedation: the impact of simulation-based training of nonanaesthesiologists. Arch Ped Adol Med. 2007; 161:740–3.
38. Steadman RH, Coates WC, Huang YM, et al. Simulation-based training is superior to problem-based learning for the acquisition of critical assessment and management skills. Crit Care Med. 2006; 34:151–7.
39. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002; 236:458–64.
40. Gallagher AG, Cates CU. Approval of virtual reality training for carotid stenting: what this means for procedural-based medicine. JAMA. 2004; 292:3024–6.
41. Banks EH, Chunoff S, Karmin I, et al. Does a surgical simulator improve resident operative performance of laparoscopic tubal ligation? Am J Obstet Gynecol. 2007; 197:541.
42. Britt RC, Reed SF, Britt LD. Central line simulation: a new training algorithm. Am Surg. 2007; 73:682–3.
43. Le CQ, Lightner DJ, VanderLei L, et al. The current role of medical simulation in American urological residency training programs: an assessment by program directors. J Urol. 2007; 177:288–91.
44. Kory PD, Eisen LA, Adachi M, Ribaudo VA, Rosenthal ME, Mayo PH. Initial airway management skills of senior residents: simulation training compared with traditional training. Chest. 2007; 132:1927–31.
45. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006; 64:361–8.
46. Shapiro MJ, Morey JC, Small SD, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care. 2004; 13:417–21.
47. Rudy SJ, Polomano R, Murray WB, Henry J, Marine R. Team management training using crisis resource management results in perceived benefits by healthcare workers. J Contin Educ Nurs. 2007; 38:219–26.
48. Schwartz LR, Fernandez R, Kouyoumjian SR, Jones KA, Compton S. A randomized comparison trial of case-based learning versus human patient simulation in medical student education. Acad Emerg Med. 2007; 14:131–7.
49. Nyssen AS, Larbuisson R, Janssens M, et al. A comparison of the training value of two types of anesthesia simulators: computer screen-based and mannequin-based simulators. Anesth Analg. 2002; 94:1560–5.
49a. Okuda Y, et al. National growth in simulation training within emergency medicine residency programs, 2003-2008. Acad Emerg Med. 2008; 15:1113–16.
50. McLaughlin SA, Doezema D, Sklar DP. Human simulation in emergency medicine training: a model curriculum. Acad Emerg Med. 2002; 9:1310–8.
51. Wang EE, Vozenilek JA. Addressing the systems-based practice core competency: a simulation-based curriculum. Acad Emerg Med. 2005; 12:1191–4.
52. Goyal DG, Cabrera DT, Laack TA, Luke A, Sadosty AT. Back to the bedside: a redesign of emergency medicine core curriculum content delivery. Poster presentation at ACGME Educational meeting in Orlando, FL, March 2007.
53. Kobayashi L, Shapiro MJ, Gutman DC, et al. Multiple encounter simulation for high-acuity multipatient environment training. Acad Emerg Med. 2007; 14:1141–8.
54. Vozenilek J, Wang E, Kharasch M, et al. Simulation-based morbidity and mortality conference: new technologies augmenting traditional case-based presentations. Acad Emerg Med. 2006; 13:48–53.
55. Ziv A, Erez D, Munz Y, et al. The Israel Center for Medical Simulation: a paradigm for cultural change in medical education. Acad Med. 2006; 81:1091–7.
56. Lammers RL. Simulation: the new teaching tool. Ann Emerg Med. 2007; 49:505–7.
57. Keyser EJ, Derossis AM, Antoniuk M, Sigman HH, Fried GM. A simplified simulator for the training and evaluation of laparoscopic skills. Surg Endosc. 2000; 14:149–53.
58. Wang F, Duratti L, Samur E, Spaelter U, Bleuler H. A computer-based real-time simulation of interventional radiology. Conf Proc IEEE Eng Med Biol Soc. 2007; 1:1742–5.
59. Van Herzeele I, Aggarwal R, Choong A, Brightwell R, Vermassen FE, Cheshire NJ. Virtual reality simulation objectively differentiates level of
1128
60.
61.
62.
63.
64.
65.
66.
67.
68.
69.
70.
71.
72.
73.
74.
McLaughlin et al.
carotid stent experience in experienced interventionalists. J Vasc Surg. 2007; 46:855–63. Gisondi MA, Smith-Coggins R, Harter PM, Soltysik RC, Yarnold PR. Assessment of resident professionalism using high-fidelity simulation of ethical dilemmas. Acad Emerg Med. 2004; 11:931–7. Girzadas DV Jr, Clay L, Caris J, Rzechula K, Harwood R. High fidelity simulation can discriminate between novice and experienced residents when assessing competency in patient care. Med Teach. 2007; 29:472–6. Gordon JA, Tancredi DN, Binder WD, Wilkerson WM, Shaffer DW. Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study. Acad Med. 2003; 78:S45–47. Gaba DM, Howard SK, Flanagan B, Smith BE, Fish KJ, Botney R. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology. 1998; 89:8–18. Adler MD, Trainor JL, Siddall VJ, McGaghie WC. Development and evaluation of high-fidelity simulation case scenarios for pediatric resident education. Ambul Pediatr. 2007; 7:182–6. Morgan PJ, Cleave-Hogg D, Guest CB. A comparison of global ratings and checklist scores from an undergraduate assessment using an anesthesia simulator. Acad Med. 2001; 76:1053–5. Schwid HA, Rooke GA, Carline J, et al. Evaluation of anesthesia residents using mannequin-based simulation: a multi-institutional study. Anesthesiology. 2002; 97:1434–44. Berkenstadt H, Ziv A, Gafni N, Sidi A. Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg. 2006; 102:853–8. Weller JM, Robinson BJ, Jolly B, et al. Psychometric characteristics of simulation-based assessment in anaesthesia and accuracy of self-assessed scores. Anaesthesia. 2005; 60:245–50. Savoldelli GL, Naik VN, Joo HS, et al. Evaluation of patient simulator performance as an adjunct to the oral examination for senior anesthesia residents. Anesthesiology. 2006; 104:475–81. 
Morgan PJ, Cleave-Hogg D, DeSousa S, Tarshis J. High-fidelity patient simulation: validation of performance checklists. Br J Anaesth. 2004; 92:388–92. Murray D, Boulet J, Avidan M, et al. Performance of residents and anesthesiologists in a simulationbased skill assessment. Anesthesiology. 2007; 107:705–13. Murray D, Boulet J, Ziv A, Woodhouse J, Kras J, McAllister J. An acute care skills evaluation for graduating medical students: a pilot study using clinical simulation. Med Educ. 2002; 36:833–41. Murray DJ, Boulet JR, Kras JF, McAllister JD, Cox TE. A simulation-based acute skills performance assessment for anesthesia training. Anesth Analg. 2005; 101:1127–34. Weller JM, Bloch M, Young S, et al. Evaluation of high fidelity patient simulator in assessment of per-
75.
76.
77.
78.
79.
80.
81.
82.
83.
84. 85.
86.
87.
88.
89.
90.
•
SIMULATION IN GRADUATE MEDICAL EDUCATION
formance of anaesthetists. Br J Anaesth. 2003; 90:43–7. Overly FL, Sudikoff SN, Shapiro MJ. High-fidelity medical simulation as an assessment tool for pediatric residents’airway management skills. PediatrEmerg Care. 2007; 23:11–5. Crofts JF, Bartlett C, Ellis D, Hunt LP, Fox R, Draycott TJ. Training for shoulder dystocia: a trial of simulation using low-fidelity and high-fidelity mannequins. Obstet Gynecol. 2006; 108:1477–85. Crofts JF, Bartlett C, Ellis D, Hunt LP, Fox R, Draycott TJ. Management of shoulder dystocia: skill retention 6 and 12 months after training. Obstet Gynecol. 2007; 110:1069–74. Ottestad E, Boulet JR, Lighthall GK. Evaluating the management of septic shock using patient simulation. Crit Care Med. 2007; 35:769–75. Regher G, MacRae H, Reznick R, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance in an OSCE-format examination. Acad Med. 1998; 73:993–7. American Board of Emergency Medicine. Available at: http://www.abem.org/public/. Accessed Mar 11, 2008. Boulet JR, De Champlain AF, McKinley DW. Setting defensible performance standards on OSCEs and standardized patient examinations. Med Teach. 2003; 25:245–9. Gordon JA, Cooper J, Simon R, Raemer D, Rudolph J, Gray M. The Institute for Medical Simulation: a new resource for medical educators worldwide [abstract]. Int Meeting Med Simul. 2005. Bransford JD, Brown AL, Cocking RC (eds). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academy Press, 2000. Mager RF. Preparing instructional objectives. 2nd ed. Belmont, CA: David S. Lake, 1984. Fink LD. Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. San Francisco, CA: John Wiley and Sons, 2003. University of New Mexico Health Sciences Center. Basic Advanced Trauma Computer Assisted Virtual Experience. Available at: http://hsc.unm.edu/som/ gme/batcave/. Accessed Mar 1, 2008. 
Liaison Committee on Medical Education. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the M.D. Degree. Available at: http:// www.lcme.org/functions2007jun.pdf. Accessed Feb 26, 2008. Morrison EH, Garman KA, Friedland JA. A national Web site for residents as teachers. Acad Med. 2001; 76:544. Wilkerson L, Lesky L, Medio FJ. The resident as teacher during work rounds. J Med Educ. 1986; 61:823–9. Armstrong E, Ashford I, Freeman J, et al. Developing the Teaching Skills of Residents Through Interactive Resident-as-Teacher Workshops. AAMC IME. Washington, DC, 2001.
91. Dunnington GL, DaRosa D. A prospective randomized trial of a residents-as-teachers training program. Acad Med. 1998; 73:696–700.
92. Spickard A III, Wenger M, Corbett EC Jr. Three essential features of a workshop to improve resident teaching skills. Teach Learn Med. 1996; 9:170–3.
93. Bing-You RG, Sproul MS. Medical students' perceptions of themselves and residents as teachers. Med Teacher. 1992; 14:133.
94. Barrows MV. Medical students' opinions of the house officer as a medical educator. J Med Educ. 1966; 41:807–10.
95. Kurrek MM, Devitt JH. The cost for construction and operation of a simulation centre. Can J Anesth. 1997; 44:1191–5.
96. Wright SW, Lindsell CJ, Hinckley WR, et al. High fidelity medical simulation in the difficult environment of a helicopter: feasibility, self-efficacy and cost. BMC Med Educ. 2006; 6:49.
97. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. Effect of practice on standardised learning outcomes in simulation-based medical education. Med Educ. 2006; 40:792–7.
98. Dunn WF. Simulators in Critical Care and Beyond. Des Plaines, IL: Society of Critical Care Medicine, 2004.
99. Kyle R, Murray DB. Clinical Simulation: Operations, Engineering, and Management. San Diego, CA: Elsevier, 2008.
100. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003; 78:783–8.
101. Cooney E. Hospital testing laparoscopic surgeons' motor skills. The Boston Globe, Jan 4, 2008. Available at: http://www.boston.com/news/health/blog/2008/01/surgeons_who_pe.html. Accessed Feb 11, 2008.
102. McCarthy J, Cooper JB. Malpractice insurance carrier provides premium incentive for simulation-based training and believes it has made a difference. Newslett Anesth Patient Safety Foundation. 2007; 22(1):17.
103. Forbes R, Kennedy P. Enhancing Safety in Medicine Utilizing Leading Advanced Simulation Technologies to Improve Outcomes Now (SIMULATION) Act of 2007. HR#4321, December 6, 2007. Available at: http://www.medsim.org/documents/HR4321_000.pdf. Accessed Feb 11, 2008.