Downloaded from pmj.bmj.com on September 19, 2014 - Published by group.bmj.com
PGMJ Online First, published on September 8, 2014 as 10.1136/postgradmedj-2012-131676

Original article
Development of a tool to improve performance debriefing and learning: the paediatric Objective Structured Assessment of Debriefing (OSAD) tool

Jane Runnacles,1 Libby Thomas,2 Nick Sevdalis,3 Roger Kneebone,3 Sonal Arora3

▸ Additional material is published online only. To view please visit the journal online (http://dx.doi.org/10.1136/postgradmedj-2012-131676).

1 Department of Paediatrics, Royal Free London NHS Foundation Trust, London, UK
2 Simulated and Interactive Learning Centre, King’s College, London, UK
3 Department of Surgery and Cancer, Imperial College, London, UK

Correspondence to Dr Jane Runnacles, Department of Paediatrics, Royal Free London NHS Foundation Trust, Pond Street, London NW3 2QG, UK; [email protected]

Received 4 December 2012; Revised 29 July 2014; Accepted 1 August 2014
ABSTRACT

Background Simulation is an important educational tool to improve medical training and patient safety. Debriefing after simulation is crucial to maximise learning and to translate the lessons learnt to improve real clinical performance, and thus to reduce medical error. Currently there are few tools to improve performance debriefing and learning after simulations of serious paediatric situations.

Purpose The purpose of this study was to develop a tool to guide and assess debriefings after simulations of serious paediatric situations, applying the current evidence base and user-based research.

Study design A literature review and semistructured interviews (performed in 2010) were used to identify important features of a paediatric simulation debriefing. Emergent theme analysis was used to identify key components of an effective debriefing which could be used as a tool for assessing debriefing effectiveness.

Results The literature review identified 34 relevant studies. Interviews were carried out with 16 paediatricians, both debriefing facilitators and learners. In total, 307 features of a debriefing were identified. These were grouped into eight dimensions representing the key components of a paediatric debriefing: the facilitator’s approach, learning environment, engagement of learners, reaction, descriptive reflection, analysis, diagnosis and application. These eight dimensions were used to create a tool, the Objective Structured Assessment of Debriefing (OSAD). Each dimension can be scored on a five-point Likert scale containing descriptions for scores 1, 3 and 5 to serve as anchors and aid scoring.

Conclusions The study identified the important features of a paediatric simulation debriefing, which were developed into the OSAD tool. OSAD offers a structured approach to paediatric simulation debriefing, and is based on evidence from published literature and views of simulation facilitators and learners.
OSAD may be used as a guide or assessment tool to improve the quality of debriefing after paediatric simulation.
INTRODUCTION
To cite: Runnacles J, Thomas L, Sevdalis N, et al. Postgrad Med J Published Online First: [please include Day Month Year]. doi:10.1136/postgradmedj-2012-131676
Simulation is a powerful learning tool which can improve patient safety and reduce the incidence of adverse events.1 It can be used to teach crisis management skills, but can also help the learner develop the key communication, team-working and decision-making abilities required to effectively manage a seriously ill patient.2 These skills are especially important for paediatric emergencies, which are serious and challenging situations that are rarely experienced. It is not appropriate for
trainees to learn these skills on real children3; alternative training strategies must be sought. Simulation is one strategy that offers paediatric trainees the opportunity of practised experience within a safe learning environment—without exposing patients to preventable harm.3 The use of simulation in paediatric curricula is increasing; it is an exciting and evolving educational tool with a developing evidence base supporting its use.4 5 It can impact on individual and team performance through learning which is both experiential and immersive.6 Evidence suggests that the greatest benefit of simulation is the ability to provide training with a focus on non-technical skills such as communication and leadership.4 The quality of team behaviour has also been shown to improve following simulation, and this can lead to further reductions in medical errors.7 With studies of adverse events around the world suggesting that it is a failure of non-technical and team skills that leads to patient harm,8 simulation-based training provides an opportunity to address these gaps. There is evidence to confirm that participation in such training improves clinical performance, culture and patient outcomes.1 9 Despite these benefits of simulation, it is crucial that the learning experience within the simulated environment is maximised. Feedback or debriefing to the learners through a post-scenario performance debriefing is critical in optimising learning after a simulation.10 Debriefing is defined as a social practice during which people purposely interact with each other and the environment to reflect upon a recently shared common experience.11 Effective debriefing is one where the learning opportunity for the learner is maximised, within a psychologically ‘safe’ environment (ie, an environment where the learner feels they can explore their performance, reflect on it, and freely express views on it). 
Effective debriefing provides formative feedback to the trainee through reflection on a training experience. It identifies learning needs and translates lessons learned to improve future clinical practice. This is particularly important in paediatric training because simulation scenarios of seriously ill children can be stressful for participants who rarely encounter such situations. Debriefing is thus essential to build confidence, and to identify and explore gaps in performance. Serious paediatric scenarios can be complex, with parent and team interactions, and therefore postsimulation debriefing can provide a ‘safe’, less emotive setting to reflect on behaviours and openly discuss ways to improve patient safety.
Copyright Article author (or their employer) 2014. Produced by BMJ Publishing Group Ltd under licence.
Although acknowledged to be one of the most important aspects of simulation-based training,10 there is little guidance on how debriefing should take place in paediatrics.12 There are many different approaches to debriefing,13 14 but few studies provide evidence-based guidelines on the constituents of an effective debrief or methods of assessing the quality of debriefing. Authors experienced in simulation have produced practical points for debriefing,15 but these are based only on their own (albeit expert) beliefs and do not take into consideration the wider literature or end-user opinion. We took the view that the above represents a gap in current paediatric medical education and training. We felt that a set of evidence-based guidelines could be developed into a tool and used either to measure the quality of debriefings following a paediatric simulation or as a set of best practice guidelines for novice debriefers to study and develop their debriefing skills. Such a tool could have several key implications for current paediatric simulation training and could help to address gaps in research with regard to debriefing practices.12 It could be used as an assessment of the quality of paediatric debriefings (ie, the skills of the facilitator who conducts the debriefing) so as to ensure that faculty provide optimal post-training feedback. It could also be used by faculty for self-evaluation of their own debriefing practice and as a means by which they can reflect upon their performance afterwards. A tool could also be used for more formal training of debriefing facilitators (ie, a ‘train-the-trainers’ course). Finally, it could be used to compare the relative effectiveness of different debriefing techniques, thereby ensuring that best practices are identified. The purpose of this study was to develop an evidence-based tool to guide and assess debriefings after a simulation of a situation involving the management of a seriously ill child.
METHODS

We applied a two-part methodology to this study: a review of the evidence base, followed by a qualitative descriptive study in which we collected data prospectively using a semistructured interview approach. Both the literature review and the interview study were performed in 2010 and aimed to identify the important features of a paediatric debriefing as viewed by experts in the field and practising paediatricians. The existing literature was reviewed to address the question of what constitutes an effective debriefing, so as to ensure the tool was evidence-based. The qualitative descriptive study, in which we used an interview method to collect data from experienced colleagues, was subsequently carried out to elicit the opinions of paediatricians experienced in giving or receiving feedback after simulations involving the management of a seriously ill child. This ensured that the debriefing assessment tool directly reflected the needs of practising doctors. Ethics approval for this study was sought and obtained from the Institute of Education, University of London and the London School of Paediatrics Simulation Committee.
Literature review

The purpose of the literature review was to identify the evidence base on paediatric debriefing. We included medical education, non-medical simulation, psychology, healthcare, education and business publications. We searched the following databases: PubMed, Embase, ERIC, OVID, PsycINFO and Google Scholar, using the keywords ‘debrief*’ and ‘feedback’ (linked by ‘OR’) combined with the terms ‘simulation’ and ‘p(a)ediatric’ (linked by ‘AND’).
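As an illustration of the search logic just described, the Boolean structure can be sketched as below; the exact syntax differed between databases, so this combined string is an assumption for illustration rather than the query submitted to any one platform.

```python
# Sketch of the Boolean search logic described above. The exact syntax
# varied by database (PubMed, Embase, ERIC, OVID, PsycINFO, Google Scholar);
# this combined string is illustrative only.
debrief_terms = ['debrief*', 'feedback']        # linked by 'OR'
context_terms = ['simulation', 'p(a)ediatric']  # linked by 'AND'

query = "({}) AND {}".format(
    " OR ".join(debrief_terms),
    " AND ".join(context_terms),
)
print(query)  # (debrief* OR feedback) AND simulation AND p(a)ediatric
```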
An initial title screen excluded any irrelevant papers. We reviewed all other abstracts to identify relevant papers. The full text of these papers was retrieved for data extraction. Given the small number of retrieved papers (ie, the limited evidence base), we decided not to use a critical appraisal to score and subsequently exclude low-scoring articles, so that as much evidence as we could identify could be used in the tool-development process. In addition, we hand-searched reference lists of included papers and the grey literature, and contacted experts in the field for any additional papers or studies that were not in the public evidence base. The features of an effective paediatric debriefing were abstracted from the reviewed papers by two clinical reviewers (JR: paediatrician; SA: simulation expert) (see online supplementary appendix for data abstraction table).
Interview study

Participants

We performed a semistructured interview study to identify features of a paediatric debriefing from the expert and user perspective. Eight paediatric registrars and eight consultants from acute paediatric specialties (emergency medicine, intensive care and general paediatrics), working at eight London hospitals, were sampled purposively and interviewed in May 2010. For consultants, the inclusion criterion was having facilitated over 100 debriefings of paediatric simulation scenarios (involving the management of a seriously ill child) as an instructor. For paediatric registrars (trainees), the inclusion criterion was having been on the receiving end of debriefings after a simulation of a seriously ill child, so that they could comment on the aspects of a debriefing that made it effective from a learner perspective.
Study procedure

We designed a semistructured interview topic guide based on findings from the literature. Input from two independent paediatricians (JR, LT) ensured that the topic guide was relevant and appropriate for paediatrics. We then piloted it with four consultants and four registrars to ensure comprehension and relevance. After we had refined the topic guide to ensure clarity and brevity, a clinical researcher (JR, a paediatrician trained in interviewing) carried out interviews with 16 participants, by which point thematic saturation was reached (ie, similar themes on what constitutes a paediatric debriefing were emerging in participants’ interviews). The interview asked paediatricians about their views on the components of effective and ineffective debriefing, as well as strategies for improvement. An example of a question to a paediatric registrar was: ‘Can you think about a time when you received a good debrief or effective feedback? Why did you find it effective?’, and to a consultant: ‘Can you recall when you gave a trainee a good debrief or effective feedback? What was it that made it effective?’ (the full interview schedule is available from the corresponding author). The interviews were carried out face to face, at a time and location convenient to the participants within their own hospitals, and lasted approximately 20 min. They were recorded and then transcribed verbatim. Transcripts were cross-checked with the original recordings to ensure accuracy.
Analysis

We extracted and listed features of an effective paediatric debriefing from the papers in the literature review. Emergent theme analysis of the interviews by the interviewer (JR) and a second independent blinded coder with a background in
medical education (LT) was performed to reliably identify views on an effective paediatric debriefing according to the paediatricians interviewed. After the first three interviews were coded, the coding results were calibrated to ensure agreement. From the tenth interview onwards, each interview was coded individually as it was completed, to confirm thematic saturation. The features of effective paediatric debriefing as viewed by the paediatricians were listed alongside those identified from the literature review. We tabulated the emergent components of a paediatric debriefing that were consistently identified as important across both the review and the interviews. This was an iterative process in which two researchers (JR, LT) first independently reviewed the long lists from the review and interviews and grouped them into themes. Any disagreements were resolved by consensus with a third researcher with expertise in surgery and simulation (SA). The final outcome was reviewed for consistency by a senior psychologist with expertise in qualitative methodology and patient safety (NS).
Tool development

The emergent components of an effective paediatric debriefing identified from the literature review and interview study were listed as the main dimensions of the tool. The tool was designed as a table with these dimensions in the first column and a Likert rating scale for each dimension across the rows. On the basis of a range of other extensively validated tools16 and the need for a relatively simple scale for ease of use, a five-point Likert rating scale was chosen (1=minimum, 5=maximum score; total score 40). We wrote descriptions of observable behaviours that could be assessed objectively for scores 1, 3 and 5, based on the findings of the literature review and interviews; that is, we provided anchors for these scores so that an evaluator could allocate them. This was to allow reliable ratings, without extensive training, in a format that would be easy for doctors to use.16
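The scoring arithmetic described above (eight dimensions, each rated on a 1–5 Likert scale, summing to a maximum of 40) can be sketched as follows. The dimension names come from the study's results; the function itself is only an illustrative aid, not part of the published tool.

```python
# Sketch of the OSAD scoring arithmetic: eight dimensions, each rated on a
# five-point Likert scale (1 = minimum, 5 = maximum), total score out of 40.
# Dimension names are from the paper; the function is illustrative only.
OSAD_DIMENSIONS = [
    "Approach of the facilitator",
    "Establishes learning environment",
    "Engagement of learners",
    "Reaction",
    "Descriptive reflection",
    "Analysis",
    "Diagnosis",
    "Application",
]

def osad_total(ratings):
    """Sum per-dimension ratings after checking each is on the 1-5 scale."""
    if set(ratings) != set(OSAD_DIMENSIONS):
        raise ValueError("ratings must cover exactly the eight OSAD dimensions")
    for dim, score in ratings.items():
        if not 1 <= score <= 5:
            raise ValueError(f"{dim}: score {score} outside the 1-5 Likert scale")
    return sum(ratings.values())

# A debriefing rated 5 on every dimension reaches the maximum total of 40.
assert osad_total({d: 5 for d in OSAD_DIMENSIONS}) == 40
```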
RESULTS

Literature review

The literature review of effective debriefing yielded 32 relevant publications. A further two publications were identified from consulting experts in the field (identified in table 1 with an asterisk),17 18 producing a total of 34 papers which were included in the final analysis (table 1). Twenty-one papers were from the setting of medical simulation, three on the use of debriefing in healthcare education, six from the broader educational literature and four from the business/management literature. The majority of papers were secondary research articles, mostly reviews and expert opinions. Only two randomised trials, two observational studies and one survey were identified, highlighting the paucity of empirical research in this area. Fifteen papers highlighted the importance of the facilitator’s approach to the debriefing, with eight papers confirming the importance of a safe learning environment for conducting a sensitive debriefing. Learner engagement was specifically mentioned in 13 papers, with an emphasis on including more passive members in a group setting. Seven papers described the importance of gauging a learner’s reaction to the scenario at the start of a debriefing, while 17 papers focused on a descriptive reflection as a critical component. Regarding analysis of what happened and why, 12 papers discussed the necessity of exploring trainees’ reasons for their actions in order to embed deep learning. Fourteen papers described making a diagnosis of performance gaps as a significant component of debriefing. Finally, 13 papers highlighted the importance of a discussion on applying lessons learnt to future practice as being critical to closing a debriefing.
Interview study

Eight senior registrars (three male, five female, 3–8 years postgraduate experience) and eight consultants (two male, six female, 12–20 years postgraduate experience) were interviewed. They came from the acute paediatric specialties of emergency medicine, intensive care and acute general paediatrics (n=4, 4 and 8, respectively). After initial coding of the interview transcripts, the codes were grouped thematically. For example, ‘assurance, non-blaming, non-threatening, open approach/listening, non-critical but constructive, skilled facilitator, and gentle pace’ were all grouped into ‘Approach of the facilitator’. Other examples of how the codes were grouped can be found in table 2. The features of effective debriefing that emerged from these interviews are discussed below and illustrated with verbatim quotes (the code letter suffixed to each quotation refers to the participant’s level of expertise as registrar (R) or consultant (C)). Within each verbatim quote, the key feature/s highlighted as judged by the coders are shown in parentheses to exemplify how we drew these from the interviews; these features, across interviews, are then presented in detail in table 2. With regard to the ‘facilitator’s approach’, participants said that debriefing should emphasise positive aspects and provide constructive (not overly critical) feedback:

I think people have to be quite careful about how they deliver feedback (Skilled facilitator). There are always things that can be learnt and sometimes people are either extremely positive and just say you did all of it very well, and then equally that doesn’t help you learn, but I think if you’re going to give constructive feedback for things people could have done better (Not critical but constructive), I think you have to be very wary about sounding very negative (Non-blaming, non-threatening) (R5).
A ‘safe environment and learner engagement’ was identified by half of the participants, who described how debriefing should involve the use of open questions and good listening:

when you try and debrief someone you don’t know what they’re thinking…so I think it should be an open question first, like how did you feel (Appropriate choice of questions) and they’ll probably just start talking and talking, and you can actually guide the actual debrief (Learner-centred, allowed for personal reflection) (R2).
Three other interviewees also emphasised the emotional support that debriefing provides and the necessity to elicit ‘reaction’:

in paediatric acute cases, there’s always a lot of emotion involved, so it’s quite important to make sure that people do leave feeling that their confidence hasn’t been completely undermined (Addressed emotions, emotional support) (R6).
A ‘detailed reflection’ was also identified as crucial to an effective debriefing. For example:

talking through what we actually did at each step (Step-by-step description) is always helpful and encourages good reflective practice (Allowed for personal reflection) (C1).
‘Analysis’ was another key component of the debrief, including an opportunity for improved insight and awareness. One participant stated:

give time to ask people why they did what they did, in a non-confrontational way (Analysis of event)… because if you don’t ask people why they’ve done things, then you’re not going to influence their behaviour later on… (Improved insight/awareness) (C8).
Table 1 Results of the literature review

Authors, year | Methodology | Components of an effective debriefing
Bishop, 2000 [19] | Expert opinion | Approach; Establishes learning environment; Engagement of learners; Analysis
*Brett-Fleegler et al, 2009 [17] | Expert opinion | Establishes learning environment; Engagement of learners; Reaction; Analysis; Diagnosis
Dieckmann et al, 2009 [11] | Observation study | Approach; Engagement of learners
Dismukes et al, 2006 [20] | Editorial | Approach; Engagement of learners
Domuracki et al, 2009 [21] | Randomised controlled trial | Establishes learning environment
Dreifuerst, 2009 [22] | Case studies | Approach; Engagement of learners; Application
Edelson et al, 2008 [23] | Case control | Descriptive reflection; Analysis; Diagnosis
Fanning and Gaba, 2007 [24] | Literature review | Approach; Reaction; Analysis; Application
Folkman, 2006 [25] | Expert opinion | Diagnosis; Application
Gaba, 2004 [26] | Expert opinion | Descriptive reflection; Analysis; Diagnosis
*Gururaja et al, 2009 [18] | Video-based observational study | Approach; Engagement of learners; Descriptive reflection; Application
Harvard Business School, 2007 [27] | Expert opinion | Establishes learning environment; Diagnosis; Application
Issenberg et al, 1999 [28] | Selective narrative review | Analysis; Descriptive reflection
Issenberg et al, 2005 [10] | Systematic review | Establishes learning environment
Kilbourn, 1990 [29] | Case study | Engagement of learners; Reaction; Descriptive reflection; Analysis
Kyle and Murray, 2008 [30] | Expert opinion | Approach; Diagnosis
Lederman, 1984 [31] | Critical review | Analysis; Diagnosis; Application
Lederman, 1992 [13] | Literature review | Engagement of learners; Descriptive reflection; Analysis; Application
McGaghie et al, 2006 [32] | Review | Descriptive reflection; Application
McGaghie et al, 2010 [33] | Critical review | Engagement of learners
Morgan et al, 2009 [34] | Prospective randomised controlled trial | Descriptive reflection; Diagnosis
Pearson and Smith, 1986 [35] | Expert opinion | Approach; Establishes learning environment; Reaction; Descriptive reflection; Diagnosis
Owen and Follows, 2006 [36] | Expert opinion | Descriptive reflection; Analysis; Application
Petranek, 2000 [37] | Case study | Descriptive reflection
(Continued)
Table 1 Continued

Authors, year | Methodology | Components of an effective debriefing
Porter, 1999 [38] | Case study | Approach; Establishes learning environment; Diagnosis
Rall et al, 2000 [39] | Descriptive survey | Approach; Engagement of learners; Descriptive reflection; Application
Rubin and Campbell, 1997 [40] | Expert opinion | Reaction; Descriptive reflection
Rudolph et al, 2006 [14] | Expert opinion | Approach; Engagement of learners; Descriptive reflection; Diagnosis
Rudolph et al, 2007 [41] | Expert opinion | Approach; Engagement of learners; Descriptive reflection
Rudolph et al, 2008 [42] | Expert opinion | Approach; Establishes learning environment; Analysis; Application
Salas et al, 2008 [15] | Expert opinion | Approach; Diagnosis; Application
Steinwachs, 1992 [43] | Expert opinion | Approach; Engagement of learners; Reaction; Descriptive reflection; Analysis; Application
van de Ridder et al, 2008 [44] | Review | Diagnosis
Westberg, 2001 [45] | Expert opinion | Reaction; Descriptive reflection; Diagnosis
The majority of participants raised the importance of feedback on teamwork and non-technical skills such as communication and leadership. These elements are part of the ‘diagnosis’ of the scenario outcome:

teamwork is, for me, the thing that I find the debrief is really useful for. I think you can adjust one person’s performance but we generally don’t resuscitate children individually, we look after them as a team (Feedback on team management, learning points) (C2).
As an example of ‘application’, which focused on strategies for future improvements, one participant simply stated:

you can come to an agreement about an action plan about what you might do differently (Strategies for future improvement) (R3).
When asked to suggest ways of improving the quality of debriefing or feedback, there was a widely shared view of the importance of developing a culture of feedback and reflective practice in paediatric training:

develop a culture where people find that (Feedback) easy to do and easy to take is really key (C1).
The most commonly identified of these components were descriptive reflection, analysis, diagnosing learning points and application (strategies for future improvement). A few interviewees suggested that there should be ways of increasing awareness of debriefing, and many felt that debriefing should be formalised in some way or incorporated into portfolios as written reflections. Nearly half of the
participants mentioned the importance of training facilitators to be skilful at debriefing.
Synthesis of findings from the literature review and interview study

The eight thematic coding groups from the interview study were cross-referenced with the findings of the literature review. The features that were consistently identified, common both within the evidence base (review) and across end users (interviews), resulted in eight components of an effective debriefing that make up the core dimensions of the final Objective Structured Assessment of Debriefing (OSAD) tool. These components are outlined in table 2, with the relevant studies that mention them, alongside examples from the interview study. The themes extracted from the review and interviews are relevant to both one-to-one debriefings and group/team debriefing scenarios.
Developing the ‘OSAD’ tool

End-usability and ease of quick reference were key to the design of the OSAD tool. A six-by-nine grid was selected that would fit, with all the data, on to one side of A4 paper. A five-point Likert scale was chosen for ‘marking’ each debrief, from 1=‘done very poorly’ to 5=‘done very well’. The grid showed the eight components of an effective debriefing down the left-hand column. The subsequent five parallel columns represent scores 1–5. For each of the eight components, scores 1, 3 and 5 were anchored with specific
Table 2 Components of effective debriefing identified from literature review and interview study

Component | Studies | No of interview participants who mentioned component (registrars; consultants) | Features of effective debriefing from interview study

Approach of the facilitator | Bishop, 2000 [19]; Dieckmann et al, 2009 [11]; Dismukes et al, 2006 [20]; Dreifuerst, 2009 [22]; Fanning and Gaba, 2007 [24]; Gururaja et al, 2009 [18]; Kyle and Murray, 2008 [30]; Pearson and Smith, 1986 [35]; Porter, 1999 [38]; Rall, 2000 [39]; Rudolph et al, 2006 [14]; Rudolph et al, 2007 [41]; Rudolph et al, 2008 [42]; Salas et al, 2008 [15]; Steinwachs, 1992 [43] | 8; 7 | Assurance, non-blaming, non-threatening; Open approach/listening; Non-critical but constructive; Skilled facilitator; Gentle pace

Establishes learning environment | Bishop, 2000 [19]; Brett-Fleegler et al, 2009 [17]; Domuracki et al, 2009 [21]; Harvard Business School, 2007 [27]; Issenberg et al, 2005 [10]; Pearson and Smith, 1986 [35]; Porter, 1999 [38]; Rudolph et al, 2008 [42] | 8; 6 | Dedicated time; Structure to the debrief; Choice of environment: quiet/uninterrupted; Correct timing of session; Ground rules at the start; Team approach

Engagement of learners | Bishop, 2000 [19]; Brett-Fleegler et al, 2009 [17]; Dieckmann et al, 2009 [11]; Dismukes et al, 2006 [20]; Dreifuerst, 2009 [22]; Gururaja et al, 2009 [18]; Kilbourn, 1990 [29]; Lederman, 1992 [13]; McGaghie et al, 2010 [33]; Rall et al, 2000 [39]; Rudolph et al, 2006 [14]; Rudolph et al, 2007 [41]; Steinwachs, 1992 [43] | 6; 5 | Learner-centred; Appropriate choice of questions; Use of silence

Reaction | Brett-Fleegler et al, 2009 [17]; Fanning and Gaba, 2007 [24]; Kilbourn, 1990 [29]; Pearson and Smith, 1986 [35]; Rubin and Campbell, 1997 [40]; Steinwachs, 1992 [43]; Westberg, 2001 [45] | 3; 6 | Addressed emotions; Emotional support

Descriptive reflection | Edelson et al, 2008 [23]; Gaba, 2004 [26]; Gururaja et al, 2009 [18]; Issenberg et al, 1999 [28]; Kilbourn, 1990 [29]; Lederman, 1992 [13]; McGaghie et al, 2006 [32]; Morgan et al, 2009 [34]; Pearson and Smith, 1986 [35]; Owen and Follows, 2006 [36]; Petranek, 2000 [37]; Rall, 2000 [39]; Rubin and Campbell, 1997 [40]; Rudolph et al, 2006 [14]; Rudolph et al, 2007 [41]; Steinwachs, 1992 [43]; Westberg, 2001 [45] | 8; 8 | Step-by-step description; Allowed for personal reflection

Analysis | Bishop, 2000 [19]; Brett-Fleegler et al, 2009 [17]; Edelson et al, 2008 [23]; Fanning and Gaba, 2007 [24]; Gaba, 2004 [26]; Issenberg et al, 1999 [28]; Kilbourn, 1990 [29]; Lederman, 1984 [31]; Lederman, 1992 [13]; Owen and Follows, 2006 [36]; Rudolph et al, 2008 [42]; Steinwachs, 1992 [43] | 6; 7 | Analysis of event; Improved insight/awareness

Diagnosis | Brett-Fleegler et al, 2009 [17]; Edelson et al, 2008 [23]; Folkman, 2006 [25]; Gaba, 2004 [26]; Harvard Business School, 2007 [27]; Kyle and Murray, 2008 [30]; Lederman, 1984 [31]; Morgan et al, 2009 [34]; Pearson and Smith, 1986 [35]; Porter, 1999 [38]; Rudolph et al, 2006 [14]; Salas et al, 2008 [15]; van de Ridder et al, 2008 [44]; Westberg, 2001 [45] | 8; 8 | Positive feedback; Feedback on team management; Learning points; Feedback on leadership; Feedback on communication

Application | Dreifuerst, 2009 [22]; Fanning and Gaba, 2007 [24]; Folkman, 2006 [25]; Gururaja et al, 2009 [18]; Harvard Business School, 2007 [27]; Lederman, 1984 [31]; Lederman, 1992 [13]; McGaghie et al, 2006 [32]; Owen and Follows, 2006 [36]; Rall, 2000 [39]; Rudolph et al, 2008 [42]; Salas et al, 2008 [15]; Steinwachs, 1992 [43] | 8; 7 | Strategies for future improvement

descriptions of what would be expected of the debriefer to achieve that score. For example, in the row pertaining to the component ‘Reaction’:
▸ 1=No acknowledgment of reactions of learners, or emotional impact of the experience.
▸ 3=Asks the learners about their feelings, but does not fully explore their reaction to the events.
▸ 5=Fully explores reactions of learners to the event, dealing appropriately with learners who are unhappy.
Score anchors were not added for scores 2 and 4, as it was felt that some leeway needed to be left in the system for users to exercise their own discretion and further grade debriefings. In scales with such scoring systems, the description of performance for ‘done very well’ may be used as a guide for best practice; this is what we aimed to achieve with the specific anchors we allocated to scores of 5 for debriefing. To further facilitate implementation, a short manual has been produced to accompany the tool and is available online and from the corresponding author. The final eight components of effective debriefing included within the OSAD tool (figure 1) are outlined below.
1. Approach of the facilitator: the manner in which the facilitator conducts the debriefing session, their level of enthusiasm and positivity when appropriate, showing interest in the learners by establishing and maintaining rapport, and finishing the session on an upbeat note.
2. Establishing a learning environment: introduction of the simulation/learning session to the learners by clarifying what is expected of them during the debriefing, emphasising ground rules of confidentiality and respect for others, and encouraging the learners to identify their own learning objectives.
3. Engagement of the learners: active involvement of all learners in the debriefing discussions, by asking open questions to explore their thinking and using silence to encourage their input, without the facilitator talking for most of the debriefing, to ensure that deep rather than surface learning occurs.
4. Reaction of the learners: establishing how the simulation/learning session impacted emotionally on the learners.
5. Description of the scenario through reflection: self-reflection on the events that occurred in the simulation/learning session in a step-by-step factual manner, clarifying any technical clinical issues at the start, to allow ongoing reflection from all learners throughout the analysis and application phases, linking to previous experiences.
6. Analysis of events: eliciting the thought processes that drove learners' actions, using specific examples of observable behaviours, to allow learners to make sense of the events in the simulation/learning session.
7. Diagnosis: enabling learners to identify their performance gaps and strategies for improvement, targeting only behaviours that can be changed, and thus providing structured and objective feedback on the simulation/learning session.
8. Application to future practice: a summary of the learning points and strategies for improvement identified by the learners during the debrief, and how these could be applied to change their future clinical practice.
We further comment on the content of these eight dimensions in the Discussion.
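To make the structure of the tool concrete, the eight dimensions and the five-point Likert scoring described above can be sketched as a simple data structure. This is an illustrative sketch only: the dimension names are taken from the paper, but the function and variable names, and the summing of dimension scores into a total, are our own assumptions rather than part of the published tool.

```python
# Illustrative sketch of the OSAD rubric as a data structure.
# The eight dimension labels come from the paper; everything else
# (function name, validation, totalling) is a hypothetical example.

OSAD_DIMENSIONS = [
    "Approach of the facilitator",
    "Establishing a learning environment",
    "Engagement of the learners",
    "Reaction of the learners",
    "Description of the scenario through reflection",
    "Analysis of events",
    "Diagnosis",
    "Application to future practice",
]

def score_debriefing(ratings):
    """Validate per-dimension Likert ratings (1-5) and return a total.

    `ratings` maps each OSAD dimension to an integer score. Anchors are
    defined at 1, 3 and 5, but raters may also award 2 or 4.
    """
    missing = set(OSAD_DIMENSIONS) - set(ratings)
    if missing:
        raise ValueError(f"Unscored dimensions: {sorted(missing)}")
    for dim, score in ratings.items():
        if dim not in OSAD_DIMENSIONS:
            raise ValueError(f"Unknown dimension: {dim}")
        if not 1 <= score <= 5:
            raise ValueError(f"{dim}: score must be 1-5, got {score}")
    # If totals are wanted, the possible range is 8 (all 1s) to 40 (all 5s).
    return sum(ratings.values())
```

For instance, a debriefing scored at the mid anchor (3) on every dimension would total 24 under this hypothetical totalling scheme.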
DISCUSSION This study identified eight key components of effective debriefing, which were then used to develop the OSAD tool to guide and assess debriefings of simulations of serious paediatric situations. The eight components of effective debriefing included in this tool are: Approach of the facilitator; Establishing a learning environment; Engagement of the learners; Reaction/emotional impact on the learners; Description of the scenario through reflection; Analysis of events; Diagnosis; and Application to future practice. The final tool is illustrated in figure 1. The literature review and the interview study identified similar important features of a paediatric debriefing, and these informed the components of OSAD. Many sources referred to the facilitator's approach, and in particular to the importance of its being non-threatening yet open and constructive. These concepts have parallels with debriefing with 'good judgement'.14 Interestingly, more registrars than consultants mentioned the importance of an uninterrupted environment, dedicated time for debriefing and ground rules at the start. This suggests that registrars struggle to receive feedback in the conditions of a busy clinical environment yet recognise its importance.
Most of the interviews, and indeed the available evidence base we reviewed, referred to the importance of descriptive or step-by-step personal reflection on the experience. The importance of reflection has been well described in the medical education literature.46 Reflection is particularly important after paediatric simulation: children can deteriorate rapidly, and scenarios of seriously ill children are stressful and fast-moving, so it is difficult to recall and analyse one's behaviours while immersed in the simulation. Analysis helps uncover the true reasons for the learners' actions, improving their insight so that constructive feedback can be given. There was also discussion, in the literature and in many of the interviews, of feedback on both technical and 'non-technical' aspects (such as leadership and communication). This is especially relevant in paediatrics, where communication within the team treating a seriously ill child is critical.47 The development of the OSAD tool has strengths in its clarity of methods and its evidence base. The methodological approach taken (a review of the evidence base, followed by the end-users' perspective) is well established to provide evidence for content validity as well as relevance to the clinical audience for which OSAD is intended.48 Importantly, opinions were captured from both registrars, who often receive feedback, and consultants, who are experienced in providing it. The fact that saturation of emergent themes was reached in all the enquiry areas provides confidence that the key points have been captured. Moreover, parallel research by our team on performance debriefings in adult surgical settings identified the same features of debriefing,49 and hence we now have some evidence of the applicability of the OSAD components across paediatric and adult clinical settings.50 Further testing is required to formally establish the generalisability of these elements of debriefing.
Limitations of this study relate to the paucity of published literature on the subject of paediatric debriefing, which makes it difficult to obtain high-quality evidence on what is truly effective. Ideally, we would have carried out formal critical appraisals of the reviewed evidence, so as to base our initial evaluation of elements of an effective debrief on better-quality evidence; however, this was not possible at the current early stage of development of the evidence base. The interviews may have introduced bias, in that participants could have reconstructed examples and events according to their own perception and insight, and portrayed the opinion they thought they should present to the interviewer rather than their true thoughts on the subject. Furthermore, the sample of paediatricians interviewed may not reflect the paediatric community as a whole. Nonetheless, thematic saturation was achieved, lending credibility to our findings.
Further research should seek to ascertain the psychometric properties of OSAD. Although there is evidence to support the validity of its content, its reliability and feasibility of use in a simulated setting remain to be tested. It is also important to establish whether the OSAD tool measures the quality of debriefings consistently. The fact that the tool consists of observable behaviours is important in achieving this (ie, the amount of inference required on the part of the assessor is minimised), but inter-rater reliability should be evaluated. Further research should also evaluate the meaningfulness of the scoring system and the way OSAD scores correlate with externally derived criteria; that is, further validation of OSAD ought to be carried out. This should be hypothesis driven; for instance, we could hypothesise that more effective debriefings (as assessed via higher OSAD scores) lead to better transfer of learning from simulation-based training to the clinical setting, or from one training scenario to another.
Figure 1 Objective Structured Assessment of Debriefing (OSAD) in paediatrics.
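As one concrete way of evaluating the inter-rater reliability discussed above, two raters' scores on a single OSAD dimension could be compared using a linearly weighted Cohen's kappa. This is an illustrative sketch only: the paper does not prescribe a statistic, and the function below is our own assumption about how such an analysis might look.

```python
# Illustrative sketch (not from the paper): linearly weighted Cohen's kappa
# for two raters who each scored the same set of debriefings on one OSAD
# dimension using the 1-5 scale. The choice of statistic is an assumption.
from collections import Counter

def weighted_kappa(rater_a, rater_b, categories=(1, 2, 3, 4, 5)):
    """Chance-corrected agreement with linear weights (1.0 = perfect)."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must score the same non-empty set of debriefings")
    n, k = len(rater_a), len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Linear agreement weights: 1 on the diagonal, shrinking with score distance.
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    pairs = Counter(zip(rater_a, rater_b))
    ma, mb = Counter(rater_a), Counter(rater_b)
    # Observed weighted agreement vs. agreement expected from the marginals.
    p_obs = sum(w[idx[a]][idx[b]] * c for (a, b), c in pairs.items()) / n
    p_exp = sum(w[idx[a]][idx[b]] * ma[a] * mb[b]
                for a in categories for b in categories) / (n * n)
    if p_exp == 1:  # both raters gave one identical score throughout
        return 1.0
    return (p_obs - p_exp) / (1 - p_exp)
```

Perfect agreement yields a kappa of 1.0, values near 0 indicate agreement no better than chance, and systematic disagreement drives the statistic negative.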
OSAD provides a model for debriefing, ensuring a level of standardisation across multiple sites; this is particularly pertinent for groups such as the Examining Paediatric Resuscitation Education Using Simulation and Scripting (EXPRESS) network.5 Although developed for paediatric simulation debriefing, OSAD has the potential to be used by other specialties, and it may also be useful for providing feedback in the real-life clinical environment. As highlighted by the present study, the majority of registrars felt that they did not routinely receive feedback after managing a seriously ill child, leaving them with feelings of uncertainty and unanswered questions. If tested further in a real clinical environment, OSAD has the potential to address this deficit by providing a structured method of feedback, encouraging workplace-based learning and improving the quality of patient care and safety.
CONCLUSION OSAD has been developed using the evidence base in the literature and an interview study to create a tool to guide and assess debriefings of simulations of serious paediatric situations. The tool provides a structured approach to debriefing and has initial evidence supporting its validity. Pending further psychometric testing, it may be used to improve the quality of debriefing after paediatric simulation.
POST-SCRIPT The OSAD tool for paediatrics is currently used by the London School of Paediatrics Simulation Committee both for faculty development (to guide and assess novice debriefers) and to help simulation centres standardise the structure of the debrief for the regional ST3 specialty training simulation programme (the third year of the 7–8-year residency programme in the UK paediatric training system). Feedback received to date from simulation facilitators in London suggests that the OSAD tool is particularly useful for novice debriefers, as an aide-memoire immediately before and during a debriefing, and to guide reflection on their debriefing practices with their mentor after facilitating these debriefings. Although feedback has also suggested that the tool can appear rather complicated and 'wordy' on first impression, users have commented on its ease of use once they have read through it and become familiar with its format and dimensions. Following the presentation of OSAD at national and international conferences, other simulation centres in the UK and overseas (including the Scottish Centre for Simulation and Clinical Human Factors, the University of Miami and Manchester Metropolitan University) have reported using the instrument to help structure or evaluate their debriefs, or as part of 'train-the-trainers' courses for debriefing in any sub-speciality.
Main messages
▸ Performance debriefing after simulations of serious paediatric situations is crucial to maximise the learning experience and improve patient safety.
▸ A literature review and an interview study of paediatricians identified the most important features of an effective paediatric debriefing.
▸ A newly developed, user-informed tool based on the current evidence, the Objective Structured Assessment of Debriefing (OSAD) tool for paediatrics, was produced from this research.
▸ OSAD may be used to guide and assess debriefings after simulations of serious paediatric situations.
Current research questions
▸ Does using the Objective Structured Assessment of Debriefing (OSAD) in paediatrics improve the quality of debriefings in simulated and clinical settings?
▸ What is the optimal method of using OSAD in paediatrics to train novice facilitators to improve their debriefing techniques?
▸ What are the psychometric properties of OSAD in paediatrics as an assessment tool?
Key references
▸ Issenberg SB, McGaghie W, Petrusa E, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28.
▸ Dieckmann P, Molin Friis S, Lippert A, et al. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:e287–94.
▸ Raemer D, Anderson M, Cheng A, et al. Research regarding debriefing as part of the learning process. Simul Healthc 2011;6:S52–7.
▸ Fanning R, Gaba D. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25.
▸ Rudolph J, Simon R, Dufresne R, et al. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
Acknowledgements The authors would like to thank Dr Mehrengise Cooper, the London School of Paediatrics and all participants who agreed to be interviewed for this study.
Collaborators Dr Mehrengise Cooper, London School of Paediatrics Simulation Network.
Contributors All authors listed contributed to the revision and final approval of this article. Study design: JR, SA, NS, RK. Data collection: JR, LT. Data analysis and interpretation: JR, LT, SA, NS. Drafting and revising article: JR, SA, NS, RK, LT. Final approval of version to be published: JR, SA, NS, RK, LT.
Funding The London Deanery Educational Fellowship Programme provided funding for this work. SA and NS are affiliated with the Imperial Patient Safety Translational Research Centre (http://www.cpssq.org), which is funded by the National Institute for Health Research, UK.
Competing interests None.
Ethics approval Institute of Education, University of London.
Provenance and peer review Not commissioned; externally peer reviewed.
REFERENCES
1 Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care 2010;19(Suppl 2):i34–43.
2 Undre S, Koutantji M, Sevdalis N, et al. Multidisciplinary crisis simulations: the way forward for training surgical teams. World J Surg 2007;31:1843–53.
3 Eppich W, Adler M, McGaghie W. Emergency and critical care paediatrics: use of medical simulation for training in acute paediatric emergencies. Curr Opin Pediatr 2006;18:266–71.
4 Cheng A, Duff J, Grant E, et al. Simulation in paediatrics: an educational revolution. Paediatr Child Health 2007;12:465–8.
5 Cheng A, Hunt E, Donoghue A, et al. EXPRESS: Examining Pediatric Resuscitation Education Using Simulation and Scripting. The birth of an international pediatric simulation research collaborative: from concept to reality. Simul Healthc 2011;6:34–41.
6 Kneebone R, Nestel D. Learning clinical skills: the place of simulation and feedback. Clin Teach 2005;2:86–90.
7 Shapiro MJ, Morey JC, Small SD, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care 2004;13:417–21.
8 Reason J. Understanding adverse events: human factors. Qual Health Care 1995;4:80–9.
9 Neily J, Mills PD, Young-Xu Y, et al. Association between implementation of a medical team training program and surgical mortality. JAMA 2010;304:1693–700.
10 Issenberg SB, McGaghie W, Petrusa E, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27:10–28.
11 Dieckmann P, Molin Friis S, Lippert A, et al. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009;31:e287–94.
12 Raemer D, Anderson M, Cheng A, et al. Research regarding debriefing as part of the learning process. Simul Healthc 2011;6:S52–7.
13 Lederman L. Debriefing: towards a systematic assessment of theory and practice. Simul Gaming 1992;23:145–60.
14 Rudolph J, Simon R, Dufresne R, et al. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
15 Salas E, Klein C, King H, et al. Debriefing medical teams: 12 evidence-based best practices and tips. Jt Comm J Qual Patient Saf 2008;34:518–27.
16 Martin J, Regehr G, Reznick H, et al. Objective structured assessment of technical skills (OSATS) for surgical residents. Br J Surg 1997;84:273–8.
17 Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc 2012;7:288–94.
18 Gururaja R, Yang T, Paige J, et al. Examining the effectiveness of debriefing at the point of care in simulation-based operating room team training. 2009 [online]. http://www.ahrq.gov/downloads/pub/advances2/vol3/Advances-Gururaja_7.pdf
19 Bishop S. The complete feedback skills training book. Farnham: Gower, 2000.
20 Dismukes R, Gaba D, Howard S. So many roads: facilitated debriefing in healthcare. Simul Healthc 2006;1:23–5.
21 Domuracki KJ, Moule CJ, Owen H, et al. Learning on a simulator does transfer to clinical practice. Resuscitation 2009;80:346–9.
22 Dreifuerst K. The essentials of debriefing in simulation learning: a concept analysis. Nurs Educ Perspect 2009;30:109–14.
23 Edelson D, Litzinger B, Arora V, et al. Improving in-hospital cardiac arrest process and outcomes with performance debriefing. Arch Intern Med 2008;168:1063–9.
24 Fanning R, Gaba D. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25.
25 Folkman J. The power of feedback: 35 principles for turning feedback from others into personal and professional change. New Jersey: John Wiley & Sons, 2006.
26 Gaba D. The future vision of simulation in healthcare. Qual Saf Health Care 2004;13:i2–10.
27 Harvard Business School. Giving feedback: expert solutions to everyday challenges. Boston: Harvard Business Press, 2007.
28 Issenberg SB, McGaghie W, Hart I, et al. Simulation technology for health care professional skills training and assessment. JAMA 1999;282:861–6.
29 Kilbourn B. Constructive feedback: learning the art. Virginia: OISE Press, 1990.
30 Kyle R, Murray WB. Clinical simulation: operations, engineering and management. Burlington, MA: Academic Press, 2008.
31 Lederman L. Debriefing: a critical re-examination of the postexperience analytic process with implications for its effective use. Simul Gaming 1984;15:415–31.
32 McGaghie W, Issenberg B, Petrusa E, et al. Effect of practice on standardised learning outcomes in simulation-based medical education. Med Educ 2006;40:792–7.
33 McGaghie W, Issenberg B, Petrusa E, et al. A critical review of simulation based medical education research: 2003–2009. Med Educ 2010;44:50–63.
34 Morgan P, Tarshis J, LeBlanc V, et al. Efficacy of high-fidelity simulation debriefing on the performance of practicing anaesthetists in simulated scenarios. Br J Anaesth 2009;103:531–7.
35 Pearson M, Smith D. Debriefing in experience-based learning. Simul Games Learn 1986;16:155–72.
36 Owen H, Follows V. GREAT simulation debriefing. Med Educ 2006;40:488–9.
37 Petranek C. Written debriefings: the next vital step in learning with simulations. Simul Gaming 2000;31:108–18.
38 Porter T. Beyond metaphor: applying a new paradigm of change to experiential debriefing. JEE 1999;22:85–90.
39 Rall M, Manser T, Howard S. Key elements of debriefing for simulator training. Eur J Anaesthesiol 2000;17:515–26.
40 Rubin I, Campbell T. The ABCs of effective feedback: a guide for caring professionals. San Francisco, CA: Jossey-Bass, 1997.
41 Rudolph J, Simon R, Rivard P, et al. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesth Clin 2007;25:361–76.
42 Rudolph J, Simon R, Raemer D, et al. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med 2008;15:1–7.
43 Steinwachs B. How to facilitate a debriefing. Simul Gaming 1992;23:186–95.
44 van de Ridder J, Stokking K, McGaghie W, et al. What is feedback in clinical education? Med Educ 2008;42:189–97.
45 Westberg J. Fostering reflection and providing feedback: helping others learn from experiences. New York: Springer Publishing Company, 2001.
46 Schön D. Educating the reflective practitioner. San Francisco, CA: Jossey-Bass, 1987.
47 Lambden S, DeMunter C, Dowson A, et al. The Imperial Paediatric Emergency Training Toolkit (IPETT) for use in paediatric emergency training: development and evaluation of feasibility and validity. Resuscitation 2013;84:831–6.
48 Abell N, Springer D, Kamata A. Developing and validating rapid assessment instruments. New York, NY: Oxford University Press, 2009.
49 Ahmed M, Sevdalis N, Nestel D, et al. Identifying best practice guidelines for debriefing in surgery: a tri-continental study. Am J Surg 2012;203:523–9.
50 Arora S, Ahmed M, Paige J, et al. Objective structured assessment of debriefing (OSAD): bringing science to the art of debriefing in surgery. Ann Surg 2012;256:982–8.