Research Summit Article
Research Regarding Debriefing as Part of the Learning Process
Daniel Raemer, PhD; Mindi Anderson, PhD, RN, CPNP-PC, ANEF; Adam Cheng, MD, FRCPC; Ruth Fanning, MB, MRCPI, FFARCSI; Vinay Nadkarni, MD; Georges Savoldelli, MD, MEd
Introduction: Debriefing is a process involving the active participation of learners, guided by a facilitator or instructor, whose primary goal is to identify and close gaps in knowledge and skills. A review of existing research and a process for identifying future opportunities were undertaken. Methods: A selective critical review of the literature on debriefing in simulation-based education was conducted. An iterative process of analysis, gathering input from audience participants, and consensus-based synthesis followed. Results: For all important topic areas in which debriefing is a primary variable, research is sparse and its presentation is limited. The importance of a format for reporting data on debriefing in a research context was recognized, and a "who, what, when, where, why" approach was proposed. A graphical representation of the characteristics of debriefing studies (Sim-PICO) was also developed to help guide simulation researchers in appropriate experimental design and reporting. Conclusion: A few areas of debriefing practice with obvious gaps that deserve study were identified, such as comparing debriefing techniques, comparing trained versus untrained debriefers, and comparing the effect of different debriefing venues and times. A model for publication of research data was developed and presented that should help researchers clarify methodology in future work. (Sim Healthcare 6:S52–S57, 2011)
Key Words: Debriefing, Research, Learning, Simulation, Education.
From the Center for Medical Simulation (D.R.), Cambridge, MA; Department of Anesthesia, Critical Care, and Pain Management (D.R.), Massachusetts General Hospital, Cambridge, MA; Department of Anaesthesia (D.R.), Harvard Medical School, Cambridge, MA; The University of Texas at Arlington College of Nursing (M.A.), Arlington, TX; KidSIM-Aspire Simulation Research Program (A.C.), Alberta Children's Hospital, University of Calgary, Calgary, Canada; Stanford University Medical School (R.F.), Stanford, CA; Center for Simulation, Advanced Education and Innovation (V.N.), The Children's Hospital of Philadelphia, University of Pennsylvania School of Medicine, Philadelphia, PA; Department of Anesthesiology, Pharmacology and Intensive Care (G.S.), Geneva University Hospitals, Geneva, Switzerland; and Unit for Development and Research in Medical Education (G.S.), University of Geneva, Geneva, Switzerland.

Disclosures: D.R.: salaried instructor for simulation-based instructor courses at the Center for Medical Simulation. M.A.: author of the online course Debriefing and Guided Reflection on the Simulation Innovation Resource Center site; she also teaches a continuing education course on simulation, as well as a simulation course for the Master's program at the UT Arlington College of Nursing; she has done some consultation related to simulation and has received grant funding from The Laerdal Foundation for Acute Medicine, The Association of Standardized Patient Educators, and the UT System. A.C.: instructor for simulation-based instructor courses through KidSIM at Alberta Children's Hospital; receives grant support from the Laerdal Foundation for Acute Medicine and the Heart and Stroke Foundation of Canada. R.F.: instructor for simulation-based instructor courses at Stanford University. V.N.: Endowed Chair, Anesthesia and Critical Care Medicine, The Children's Hospital of Philadelphia; has received unrestricted research grants from the NIH (cardiac arrest and resuscitation, glycemic control) and the Laerdal Foundation for Acute Care Medicine (education). G.S.: instructor for simulation-based instructor courses through SimulHUG at the University Hospitals of Geneva, Switzerland, and the University of Paris Descartes, France.

Reprints: Daniel Raemer, PhD, Center for Medical Simulation, 65 Landsdowne St., Cambridge, MA 02139 (e-mail: [email protected]).

Copyright © 2011 Society for Simulation in Healthcare. DOI: 10.1097/SIH.0b013e31822724d0

The critical role of debriefing has been widely demonstrated in simulation-based education (SBE) and was recently described in two systematic reviews of SBE.1,2 However, despite debriefing's critical importance to the field of simulation, it is surprising that few comprehensive studies focus on debriefing.3 Although various definitions of debriefing have appeared in the literature, for the purposes of this article we use the definition that Fanning and Gaba3 presented in their 2007 review: a "facilitated or guided reflection in the cycle of experiential learning." This process involves the active participation of learners, who are guided by a facilitator or instructor whose primary goal is to help learners identify and close gaps in knowledge and skills. In this article, we provide an overview of the current literature related to postsimulation debriefing, organize the existing research into topics relative to debriefing, identify some of the gaps in the literature, describe future opportunities for research, and, most importantly, make recommendations on how studies of debriefing should be reported in the future. We intend this article to provide a roadmap for educators and researchers looking to conduct future research related to simulation-based debriefing.
METHODS
The authors completed a selective critical review of the relevant literature on debriefing in SBE. A brainstorming session was then conducted to identify, by consensus, the most important topics to review more extensively. Authors were assigned to, and independently examined, the following research areas in debriefing: characterization of debriefing; philosophies of debriefing; scripted versus unscripted debriefing; individuals versus teams; length of the debriefing; assessment of debriefing; instructor training; video versus no-video facilitated debriefing; transcultural, transdisciplinary, and transspecialty debriefing; timing of debriefing; role of standardized patients (SPs) and confederates; and technical, medical, and behavioral aspects of debriefing. Each author attempted to find a highly representative sample of the current literature (approximately 10–20 references), although they did not necessarily conduct a complete literature review. While reviewing the literature, each author mapped the references within their topic area to the PICO framework (ie, P = population; I = intervention; C = comparator; O = outcome) commonly used in the evidence-based medical literature.4 This framework was chosen to standardize the analysis between reviewers. The authors combined and harmonized the results of their searches, and the results were presented during a plenary session of the research summit. Audience participants of the summit, approximately 40 people with a particular interest in debriefing, were solicited for new ideas, comments, and clarifications. The authors then met to analyze and consider this input and to generate an organized review of the important literature on debriefing. At this point, it was agreed that research in many of the important topic areas involving debriefing as a variable was sparse, that serious limitations in the presentation of existing studies were common, and that it would therefore be difficult to set specific priorities for future research topics. Despite these limitations, it would be possible to organize the existing literature in a rational manner that could serve as a template for research reporting in the future. The authors also realized that a paradigm placing debriefing research in a graphical format based on PICO would be helpful to future researchers and educators. Finally, some interesting and seemingly important questions became apparent that can be used to generate topics for future study.
State of the Science
A critical review of the literature exploring debriefing as a study intervention illustrated a growing body of research describing, quantifying, and assessing debriefing.5–10 There are a number of studies of modalities or aids to debriefing, such as the use of video,11–13 structured14–17 or scripted6 debriefing, and approaches to debriefing, whether technical or behavioral.18,19 In addition, studies exploring the underlying learning theories on which debriefing techniques are based were uncovered.17,20–23 When applying the construct of translational science (the progression from bench to bedside) to debriefing in SBE, where T1 studies measure process or outcomes in the simulated environment, T2 studies measure processes at the real-world level, and T3 studies measure outcomes for the individual patient or public health, we found that current studies of debriefing lie firmly in the T1 category.24 Assessment of debriefing is a critical issue. We believe that studies looking at the characteristics of debriefing and feedback that convey the greatest advantages in terms of learning outcomes are clearly achievable not only at the T1 level but also at the T2 level. To generate rich and useful knowledge, these studies should be conducted using both quantitative and qualitative methodologies. The knowledge and science generated by these studies should then be translated into evidence-based debriefing practices. Because of the complexity and expense, conducting studies at the T3 level examining debriefing as the solitary variable is challenging. However, T3 and T2 level studies assessing the efficacy and cost-effectiveness of SBE compared with more traditional methods should aim to incorporate proven methods of debriefing that have been previously validated in lower-level studies.

Opportunities and Recommendations
We identified a critical need to better characterize the debriefing process and to develop recommendations for publications. The authors agreed that many characteristics of the debriefing process might affect its efficacy. However, given the current state of the science, important aspects of the process may still be poorly understood or even undiscovered. Therefore, the recommendations of the authors are not meant to be exhaustive. These general guidelines are meant to fulfill the needs expressed by members of our community. We strongly recommend using them when designing and developing sound SBE activities and when reporting results of scholarly work. The recommendations are organized around five questions that must be answered to characterize the debriefing process:

• Who: who is debriefing?
• What: what are the content and methods of the debriefing?
• When: what is the timing of the debriefing?
• Where: what is the environment of the debriefing?
• Why: what theoretical framework supports the debriefing?
Table 1 provides detailed examples of the characteristics of the debriefing process pertaining to each of the five questions.

Table 1. The 5 Ws of Debriefing Research*

Who (debriefer): Number and characteristics of the debriefers: expertise/training in debriefing, experience with simulation, demographics, clinical experience, same discipline vs. multidiscipline, clinician with clinician coinstructor, clinician with psychologist or educator, peer or not, specialty training, culture, confederate or SP feedback, etc. Number and characteristics of the participants: discipline and specialty, students or practitioners, student level, active participants and observers, one-time or repeated experience, familiarity with simulation.

What (content and methods of debriefing): Purpose: CRM or technical skills, formative or summative assessment; individual vs. team debriefing; model of debriefing: advocacy/inquiry, plus/minus, plus/delta, nonjudgmental; mechanics (length, video or no video); instructor vs. facilitator style; scripting; structure; self-assessment; cueing; debriefing software.

When (timing): Prebriefing, chronology, duration, pause and discuss, immediate or delayed postsimulation debriefing.

Where (environment): In situ, in a debriefing room, in the simulation area, in a clinical room, at the institutional level, etc.

Why (theory): Why was the type of debriefing chosen? Theoretical framework: experiential learning, mastery learning, corrective feedback, reflective practice, andragogy, constructivism, behaviorism, mental frameworks, modeling, human performance.

The "5 Ws" table details the information that should be provided for each of the five questions that characterize the debriefing process. When designing a debriefing or reporting the results of a study, authors should consider each of these five questions. When reporting research on debriefing, the level of detail should be sufficient to allow repetition of the study by other groups of researchers. The elements shown in the table are not meant to be exhaustive, as many aspects of the debriefing process may still be poorly understood or even undiscovered.
*Items listed for each element are intended as examples, not an exhaustive or prioritized listing.

Who
Debriefing by competent instructors is considered important to maximize the learning opportunities arising from simulated events. Instructors require both structure and specific techniques to optimize learning during debriefing.25 Despite the increasing prevalence of simulation instructor training courses, research from Dieckmann et al26 suggests that considerable variation exists between the perceived ideal role of the debriefer and what is actually executed during real debriefing sessions. As simulation becomes more widely used in healthcare as a means of both formative and summative assessment, a reliable and valid way to assess the efficacy and quality of the debriefing becomes more important.3,20,27 McDonnell et al22 did some pioneering work in this area in the aviation industry, describing a "debriefing assessment battery" designed to facilitate debriefing of aviation-related training. Unfortunately, this tool does not translate well to healthcare simulation and debriefing. To address this gap, Rudolph et al28 developed a debriefing assessment tool entitled Debriefing Assessment for Simulation in Healthcare (DASH), which includes six debriefing elements crucial to facilitating an effective debriefing session. With the caveat that this assessment tool requires training of raters, early unpublished research has shown that it has strong interrater reliability (M. Brett-Fleegler, personal communication, 2011). Future research should explore the impact of debriefing training programs on the quality of debriefing and on learner outcomes. In addition, we should identify the optimal frequency of retraining for simulation instructors. Furthermore, the optimal way to perform debriefing assessment (ie, self vs. peer vs. student) should be explored.

What
It has been suggested that further research is needed on various debriefing methods to see whether one type has specific advantages over another.3 For example, video-assisted debriefing permits both individuals and teams to review their skills by providing an objective record of performance. To date, relatively few empirical studies have specifically addressed video-facilitated feedback during SBE.11–13,29 These studies have yielded mixed results. Savoldelli et al showed that, when teaching Crisis Resource Management (CRM) principles to anesthesia residents, participants' nontechnical skills improved equally after oral feedback or video-assisted oral feedback but did not improve without any debriefing. However, the relatively small sample size and the absence of a retention test are clear limitations of this study.12 Other studies have shown no significant difference in performance outcomes when debriefing with and without video playback, and these also suffer from methodological and reporting limitations.11,13 Scripted debriefing is a different method, in which a debriefing tool is used to help the instructor incorporate questions and statements that guide the debriefing.6 Early unpublished research has shown that scripted debriefing may be beneficial, particularly for those who have little experience in debriefing (A. Cheng, personal communication, 2011). Future research might focus on learning outcomes as a function of the debriefer's level of expertise, script versus no script, or video playback versus no video playback during debriefing. The issue of individual versus team debriefing needs to be addressed as well, specifically assessing how effectively learners in a team debriefing are able to close their own performance gaps. Finally, more work needs to be done exploring the relative benefit of behavioral debriefing, with an emphasis on teamwork and crisis resource management, versus technical/medical debriefing, with an emphasis on skills, tasks, or medical treatment plans.
When
Timing is one element of the debriefing that needs study, such as outcomes related to the length of time spent in a debriefing.30 A variety of debriefing lengths have been suggested in the literature, including 20 minutes,21,31 30 minutes,21,32 and up to 1 hour.22 Outcomes such as learner satisfaction, knowledge, confidence, technical skills, critical thinking,31 and teamwork principles have not been compared across varied lengths of debriefing. In addition to the total length of time, the amount of time devoted to each phase3 or aspect of the debriefing process has not been studied. Another important question is whether learning is more effective if the debriefing occurs entirely after the scenario experience or if it is interspersed during the simulation episode. Yet another interesting direction is to study whether spaced learning improves concept acquisition and retention.33

Why
The rationale for using particular debriefing methods, whether chosen for practical or logistical reasons, is barely addressed in the current literature. The theoretical framework on which a chosen debriefing method is based is rarely mentioned. While many studies refer to the importance of reflection, a key concept in experiential learning,34–36 only a minority describe this process in detail.20 Theoretical constructs such as CRM,37 previous experience,38 mastery learning and deliberate practice,39,40 constructivism and behaviorism,41 and corrective or solution-focused feedback,42,43 when explicitly used in simulation studies, rarely isolate the contribution debriefing makes within the theoretical framework. Study is needed to define explicit models of debriefing as they pertain to underlying learning theories: which models suit different modalities of simulation, learning objectives, and participant populations.
In addition, the learning curves and the level of expertise required of SBE instructors using particular models, within and across institutions, disciplines, and cultures, are other important concepts to address. The ultimate aim is to foster research pertaining to T3 outcomes, exploring how debriefing models based on established or novel theoretical frameworks impact patient care.

A Generalizable Framework
The literature review and subsequent analysis of debriefing research led the authors to the conclusion that a generalizable framework for guiding future studies and reporting results would be helpful. Using the PICO paradigm during the analysis led us to suggest this model as the foundation for such a framework. Figure 1 shows a generic version of the Sim-PICO graphic. Each of the dimensions of research on debriefing identified by the authors is plotted as a column. Studies completed, contemplated, or reported can be described according to the population studied, intervention applied, comparisons tested, and outcomes determined. These variables are plotted as bars across the dimensions. Figure 2 shows an example of how this graphical representation would be applied to an existing study. Figure 3 shows how the Sim-PICO framework might be used to design a new research study. Setting the expectation that each study regarding debriefing address one or more dimensions and reveal each of the PICO elements would be a strong step forward for the field.

Figure 1. Graphical concept of how the PICO (Population, Intervention, Comparator, Outcome) approach could be adapted in a generic fashion to simulation studies (Sim-PICO). The vertical columns in this matrix represent the generic dimensions of debriefing (eg, the 5 Ws from Table 1: Who, What, When, Where, and Why). Each simulation study might target one or more columns as the intervention. The overview and characterization of the simulation trial (eg, approach, scenario and manikin definition, level of evidence, and randomization approach) would be the first methodological descriptor, crossing all vertical dimensions. The P, I, C, and O horizontal rows could be as wide or narrow as appropriate for a given study (see specific examples in Figs. 2 and 3). When multiple simulation studies are plotted on the matrix, it becomes easy to see where gaps in studies or findings lie, helping identify critical gaps where time, effort, and resources are most needed.

Figure 2. Specific example of how the Sim-PICO (Population, Intervention, Comparator, Outcome) approach could be applied to a specific simulation study, in this case the study by Savoldelli et al.12 The vertical columns in this matrix continue to represent the generic dimensions of debriefing (from Table 1). The I (Intervention) and C (Comparator) rows now shrink to focus on the "What (methods/content)" dimension of debriefing. In this study, the intervention was oral or video-assisted oral debriefing, and the comparator was no debriefing; the study thus answers a question related to the "what" of debriefing. The outcome was the delta ANTS score, a validated anesthesia nontechnical skills score for these scenarios. Note that the outcome crosses multiple dimensions (columns), because the other factors can influence that outcome, but it is clear from the matrix that the "what" column is what is altered and tested in this study.

Figure 3. Hypothetical example of how the Sim-PICO (Population, Intervention, Comparator, Outcome) approach could be applied to a new simulation study. In this example, the researcher wishes to compare trained versus untrained nonclinical debriefers, the "who" dimension. The population ("P") involved in the simulation might be nurse and ICU fellow pairs doing an airway management case. The training of the subject debriefers might consist of a 4-hour workshop of didactics and role-play practice ("what"). The untrained subjects would serve as the comparison group and would need to be exposed to some experience of equal duration, perhaps reviewing airway management algorithms. After the training or nontraining intervention, the subjects would conduct an immediate postscenario ("when") debriefing of pairs of clinicians after a standardized in situ simulation scenario ("where"). The "why" dimension would describe the theoretical method of debriefing in which subjects were trained, such as the advocacy/inquiry technique. Outcomes ("O"), such as scores on a behaviorally anchored rating scale (BARS), could be compared between the trained and untrained subjects.
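For readers who maintain electronic registries of debriefing studies, the Sim-PICO matrix can also be treated as a simple data structure, which makes the idea of a study's "tested column" concrete: a study tests exactly those dimensions in which its intervention and comparator differ. The sketch below is an illustrative assumption, not part of the published framework; the class, field names, and encoding of the Savoldelli et al study are hypothetical.

```python
from dataclasses import dataclass

# The five generic dimensions of debriefing (the 5 Ws of Table 1).
DIMENSIONS = ("who", "what", "when", "where", "why")

@dataclass
class SimPicoStudy:
    """One debriefing study plotted on a hypothetical Sim-PICO matrix.

    Each PICO element maps a subset of the 5-W dimensions (columns)
    to a free-text description of how the study fills that column.
    """
    population: dict
    intervention: dict
    comparator: dict
    outcome: dict

    def columns_tested(self):
        """Dimensions in which intervention and comparator differ,
        ie, the columns this study actually tests."""
        return [d for d in DIMENSIONS
                if self.intervention.get(d) != self.comparator.get(d)]

# Hypothetical encoding of the study in Figure 2: only the "what"
# dimension differs between intervention and comparator.
savoldelli = SimPicoStudy(
    population={"who": "anesthesia residents"},
    intervention={"what": "oral or video-assisted oral feedback"},
    comparator={"what": "no debriefing"},
    outcome={"what": "delta ANTS score"},
)

print(savoldelli.columns_tested())  # the single tested column: ['what']
```

Plotting many studies encoded this way and tallying which columns are rarely tested would surface the same gaps the graphical matrix is designed to reveal.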
DISCUSSION
At the outset of this review process, our aim was to provide an overview of the state of the science of research regarding debriefing, identify gaps, and make recommendations on how studies of debriefing might be conducted and reported in the future. While reviewing the relevant literature, we found that the characteristics of the debriefing were rarely reported in depth and lacked consistency, leaving little opportunity for reproducibility between studies or populations. This was one of the main concerns of attendees of the research summit, who stressed the need for a structure or framework to guide simulation research in which debriefing plays a role. To address this concern, we articulated five basic tenets of debriefing, Who, What, When, Where, and Why (Table 1), and created a matrix incorporating the PICO construct (Population, Intervention, Comparator, Outcome). Our aim in creating this matrix was not to produce the definitive template for describing or conducting studies but to add clarity to the process.
CONCLUSION
The ultimate aim of debriefing-related research is to systematically assess the role debriefing plays not only at the level of the learner (T1, T2) but also at the level of patient care outcomes (T3). The introduction of a construct such as the one described in this article represents a starting point from which we might address this important goal.

REFERENCES
1. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28.
2. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010;44:50–63.
3. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–125.
4. Richardson WS, Wilson MC, Nishikawa J, Hayward RS. The well-built clinical question: a key to evidence-based decisions. ACP J Club 1995;123:A12–A13.
5. Lederman L. Debriefing: toward a systematic assessment of theory and practice. Simul Gaming 1992;23:145–160.
6. Cheng A, Hunt EA, Donoghue A, et al. EXPRESS–Examining Pediatric Resuscitation Education Using Simulation and Scripting. The birth of an international pediatric simulation research collaborative–from concept to reality. Simul Healthc 2011;6:34–41.
7. Donoghue A, Ventre K, Boulet J, et al. Design, implementation, and psychometric analysis of a scoring instrument for simulated pediatric resuscitation: a report from the EXPRESS pediatric investigators. Simul Healthc 2011;6:71–77.
8. Mort TC, Donahue SP. Debriefing: the basics. In: Dunn WF, ed. Simulators in Critical Care and Beyond. Des Plaines, IL: Society of Critical Care Medicine; 2004:76–83.
9. Flanagan B. Debriefing: theory and techniques. In: Riley RH, ed. Simulation in Healthcare. New York, NY: Oxford University Press; 2008:155–170.
10. Johnson-Russell J, Bailey C. Facilitated debriefing. In: Nehring WM, Lashley FR, eds. High-Fidelity Patient Simulation in Nursing Education. Sudbury, MA: Jones and Bartlett; 2010:369–385.
11. Byrne AJ, Sellen AJ, Jones JG, et al. Effect of videotape feedback on anaesthetists' performance while managing simulated anaesthetic crises: a multicentre study. Anaesthesia 2002;57:176–179.
12. Savoldelli GL, Naik VN, Park J, Joo HS, Chow R, Hamstra SJ. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology 2006;105:279–285.
13. Grant J, Moss J, Epps C, Watts P. Using video-facilitated feedback to improve student performance following high-fidelity simulation. Clin Simul Nurs 2010;6:177–184.
14. Bordley WC, Travers D, Scanlon P, Frush K, Hohenhaus S. Office preparedness for pediatric emergencies: a randomized, controlled trial of an office-based training program. Pediatrics 2003;112:291–295.
15. Fox-Robichaud AE, Nimmo GR. Education and simulation techniques for improving reliability of care. Curr Opin Crit Care 2007;13:737–741.
16. Keene EA, Hutton N, Hall B, Rushton C. Bereavement debriefing sessions: an intervention to support health care professionals in managing their grief after the death of a patient. Pediatr Nurs 2010;36:185–189; quiz 190.
17. Kuiper R, Heinrich C, Matthias A, Graham M, Bell-Kotwall L. Debriefing with the OPT model of clinical reasoning during high fidelity patient simulation. Int J Nurs Educ Scholarsh 2008;5:17.
18. Gaba DM, Howard SK, Flanagan B, Smith BE, Fish KJ, Botney R. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology 1998;89:8–18.
19. Bond WF, Deitrick LM, Eberhardt M, et al. Cognitive versus technical debriefing after simulation training. Acad Emerg Med 2006;13:276–283.
20. Rudolph J, Simon R, Dufresne RL, Raemer D. There's no such thing as "non-judgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
21. Decker S. Integrating guided reflection into simulated learning experiences. In: Jeffries PR, ed. Simulation in Nursing Education: From Conceptualization to Evaluation. New York, NY: National League for Nursing; 2007:73–85.
22. McDonnell L, Jobe K, Dismukes R. Facilitating LOFT Debriefings: A Training Manual. Moffett Field, CA: National Aeronautics and Space Administration; 1997:35–57.
23. Arafeh JM, Hansen SS, Nichols A. Debriefing in simulated-based learning: facilitating a reflective discussion. J Perinat Neonatal Nurs 2010;24:302–309; quiz 310–311.
24. McGaghie WC. Medical education research as translational science. Sci Transl Med 2010;2:19cm8.
25. Dismukes RK, Gaba DM, Howard SK. So many roads: facilitated debriefing in healthcare. Simul Healthc 2006;1:23–25.
26. Dieckmann P, Molin Friis S, Lippert A, Ostergaard D. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:e287–e294.
27. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med 2008;15:1010–1016.
28. Rudolph J, Raemer D, Simon R. Debriefing Assessment for Simulation in Healthcare (DASH). Cambridge, MA: Center for Medical Simulation; 2010.
29. Cantrell MJ. The importance of debriefing in clinical simulations. Clin Simul Nurs 2008;4:e19–e23.
30. Kardong-Edgren S. Guidelines for Simulation Research. New York, NY: National League for Nursing; 2010.
31. Jeffries PR, Rogers KJ. Theoretical framework for simulation design. In: Jeffries PR, ed. Simulation in Nursing Education: From Conceptualization to Evaluation. New York, NY: National League for Nursing; 2007:21–33.
32. Hravnak M, Tuite P, Baldisseri M. Expanding acute care nurse practitioner and clinical nurse specialist education: invasive procedure training and human simulation in critical care. AACN Clin Issues 2005;16:89–104.
33. Kelley P. Making Minds: What's Wrong with Education—And What Should We Do About It? New York, NY: Routledge; 2008.
34. Gibbs G. Learning by Doing: A Guide to Teaching and Learning Methods. London: Further Education Unit; 1988.
35. Kolb D. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall; 1984.
36. Grant J, Marsden P, King RC. Senior house officers and their training. II. Perceptions of service and training. BMJ 1989;299:1265–1268.
37. Gaba D, Fish K, Howard S. Crisis Management in Anesthesiology. Philadelphia, PA: Churchill Livingstone; 1994.
38. Crookall D, Zhou M. Medical and healthcare simulation: symposium overview. Simul Gaming 2001;32:142–146.
39. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med 2006;21:251–256.
40. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005;80:549–553.
41. Parker B, Myrick F. A critical examination of high-fidelity human patient simulation within the context of nursing pedagogy. Nurse Educ Today 2009;29:322–329.
42. O'Hare D, Roscoe S. Flightdeck Performance: The Human Factor. Ames, IA: Iowa State Press; 1990.
43. Priest S. eXperientia. Available at: http://www.tarrak.com/EXP/exp.htm. Accessed January 10, 2011.