…clinical encounters evoking aspects of the patient–doctor relationship and then engage in a group discussion to identify themes and learning points. Reflection triggers are clerkship-specific and thus may entail reflecting on clinical encounters that have involved intense emotional responses (psychiatry), unpredictable outcomes (surgery) or unexpected aspects of the doctor role (medicine), but all have bearing on aspects of the relationship between patient and doctor or on aspects of the professional role.

Why the idea was necessary
There is an emerging literature emphasising the potential utility of tools for: (i) assessing the quality of reflection, and (ii) providing feedback aimed at enhancing learning through systematic reflection. We sought to develop a simple and user-friendly rubric for providing feedback on reflective essays written in the context of our clerkship-based course.

What was done
We adapted and expanded a typology of reflection developed by Jay and Johnson for use in teacher education.1 Our rubric includes four dimensions, each of which presents prompts to guide assessment and feedback.

Dimension 1 is descriptive and is concerned with the clarity of the elaboration of the reflection topic as a problem or question for enquiry. Prompts are: Is it clear what triggered the reflection? Can you complete the following sentence from the writer’s standpoint: ‘In the course of this reflection, I would like to learn more about…’? What is the writer’s dilemma or puzzle?

Dimension 2 is comparative and refers to the consideration of relevant alternative perspectives on the problem. Prompts are: Does the writer include all relevant personal perspectives, including her or his own? Are perspectives justified by data? Are perspectives juxtaposed in a way that promotes additional reflection?

Dimension 3 is personal and is concerned with the expression of personal intellectual and emotional engagement in the reflection. Prompts are: Is there evidence of personal struggle on cognitive and emotional levels? Is it apparent why the writer chose this particular incident for reflection? What is at stake for the writer?

Dimension 4 is critical and relates to commitment to strengthen or alter one’s personal understanding and subsequent related behaviours. Prompts are: Is there an explicit statement of what was learned? Is there evidence of movement from previously held assumptions or of the deepening of beliefs? Is there a plan for action or commitment towards personal or systemic change?

Evaluation of results and impact
Feedback to date on the rubric includes: (i) presentations at a local faculty development seminar in March 2008 and at the Association of American Medical Colleges (AAMC) Annual Meeting in November 2008 (n ≈ 10 and n ≈ 30, respectively); (ii) the experience of faculty staff in using the rubric to critique reflections and give feedback to nine students in spring 2009; and (iii) a focus group of five of these students conducted in April 2009, which was recorded and transcribed for review. In these settings, students and faculty members using the rubric overwhelmingly found it to be a conceptually straightforward and helpful ‘roadmap’, as one respondent termed it, for communicating about reflective writings.

REFERENCE

1 Jay JK, Johnson KL. Capturing complexity: a typology of reflective practice for teacher education. Teaching and Teacher Education 2002;18:73–85.

Correspondence: Michael J Devlin MD, New York State Psychiatric Institute, Unit 116, 1051 Riverside Drive, New York, New York 10032, USA. Tel: 00 1 212 543 5748; Fax: 00 1 212 543 5607; E-mail: [email protected]

doi: 10.1111/j.1365-2923.2010.03815.x

Using peer feedback in a formative objective structured clinical examination
Annie M Cushing & Olwyn M R Westwood

Context and setting
With the increasing emphasis on learner autonomy, the use of self- and peer feedback may be viewed as preparation for lifelong learning and as an essential generic professional skill. Its potential in the learning setting implies a need to explore ways of training people to provide immediate feedback that is constructive, is delivered with sensitivity and can encourage insight into personal performance.

Why the idea was necessary
Key needs identified were: (i) to support student practice and learning in professional communication skills and feedback, and (ii) to maximise feedback opportunities even in a culture that might thwart such attempts owing to the pressures of a high student : staff ratio.

What was done
Students in graduate-entry programmes in medicine and nursing took part in a formative objective structured clinical examination (OSCE) consisting of three 5-minute stations (based on their problem-based learning cases) in which actors were used as simulated patients. Students developed their own marking schemes using a constructivist approach. These were compared with pre-prepared mark sheets and any variations were discussed. Each student took part in three iterations of the three-station OSCE as either the ‘candidate’, ‘examiner’ or ‘observer’. In the introductory session, students watched a DVD on giving feedback, and then rehearsed ways of phrasing responses on good practice and areas for improvement. Students in the role of candidate rotated through each of the OSCE stations; those in the roles of either examiner or observer remained at the same station for the duration of the circuit. Students in the role of candidate received immediate feedback that they could use in the other two stations on the circuit, where they received more feedback from different peers. The defined criteria within the OSCE format offered guidance to feedback novices on the elements to consider in order to convey constructive feedback to their peers.

Evaluation of results and impact
To evaluate the perceived merit of this method, a questionnaire (containing 20 statement items scored on a 4-point Likert scale) and focus group sessions (using the nominal group technique) were used. The participants agreed/strongly agreed (range 82–100%) with positively framed statements about feedback, and generally disagreed/strongly disagreed with negatively framed statements about feedback (range 88–100%). A question on whether they valued peer feedback as much as tutor feedback produced divided opinion (48% versus 52%). Although 94% of participants felt they could be honest, they found it difficult to give negative feedback owing to relationship issues and the perceived potential adverse impact on the recipient. Hence, students sought ‘permission’ from faculty members to give opinions on areas to be commended and specifically on areas in which improvement was required. Particular educational features of these sessions were the transparency of learner goals through the OSCE marking schemes and the opportunity for multiple practices with immediate feedback, as well as the active engagement of students and the fact that they were made responsible for giving feedback to peers. The project resulted in a robust method by which health professional students can learn how to give constructive feedback on performance and which also supports the development of clinical communication skills.

Correspondence: Professor Olwyn M R Westwood, Centre for Medical Education, Institute of Health Sciences Education, Barts and The London School of Medicine and Dentistry, London E1 2AD, UK. Tel: 00 44 20 7882 2219; Fax: 00 44 20 7882 7814; E-mail: [email protected]

doi: 10.1111/j.1365-2923.2010.03832.x

A way to assess students’ clinical reasoning
Ieda M B Aleluia, Paulo M Carvalho Jr & Marta S Menezes

Context and setting
Knowledge integration is a fundamental skill needed by all doctors. Techniques that facilitate the development of this skill have therefore gained in importance for both teachers and students. To improve the integration of semiology (the meaning behind linguistic signs and signals, especially in patients) with pharmacology during the sixth semester of the medicine course at our medical school in Brazil, semiology teaching staff encouraged the active involvement of pharmacology faculty members in curriculum planning and educational activities. Our school uses a Moodle platform as a virtual educational environment. This has facilitated the involvement of students and faculty members during clinical case discussions. It also allows student assessments and involvement to be tracked and supports the sharing of online educational documents. Knowledge integration can be assessed through faculty members’ observation of students on the Moodle platform.

Why the idea was necessary
We considered that it would be advantageous to strengthen the existing integration between the two disciplines of semiology and pharmacology in a way that would stimulate interdisciplinary discussion, improve students’ clinical reasoning skills and enhance the interaction between students and teaching staff.

What was done
To improve the measurement of student development, teachers in the two disciplines actively participated in four meetings to construct 13 clinical cases for use on the Moodle platform. To measure clinical reasoning, three integrated tests with discursive questions based on short clinical cases were prepared by these faculty members. A questionnaire assessing student satisfaction was developed for administration at the end of the first semester in 2009. This project was approved by our institutional review board.

Evaluation of results and impact
The joint work resulted in greater integration between the two disciplines. It permitted the production of new educational material and facilitated wide discussion of various clinical cases. The Moodle platform was helpful during the process and was easily navigated by faculty members and students. The online discussion was stimulated by faculty members and attended by students. This interaction also gave teaching staff the opportunity to follow the development of students’ clinical reasoning skills by using a structured skills…
