Medical Teacher 2011; 33: e333–e339

WEB PAPER

Using a Delphi process to establish consensus on emergency medicine clerkship competencies

RICK PENCINER(1), TREVOR LANGHAN(2), RICHARD LEE(3), JILL MCEWEN(4), ROBERT A. WOODS(5) & GLEN BANDIERA(1)

(1) University of Toronto, Canada; (2) University of Calgary, Canada; (3) University of Alberta, Canada; (4) University of British Columbia, Canada; (5) University of Saskatchewan, Canada

Correspondence: R. Penciner, North York General Hospital, 4001 Leslie Street, 630N Toronto, ON M2K 1E1, Canada. Tel: 416 756 6615; fax: 416 756 6916; email: [email protected]. ISSN 0142-159X print/ISSN 1466-187X online. © 2011 Informa UK Ltd. DOI: 10.3109/0142159X.2011.575903

Abstract

Background: Currently, there is no consensus on the core competencies required for emergency medicine (EM) clerkships in Canada. Existing EM curricula have been developed through informal consensus or local efforts. The Delphi process has been used extensively as a means for establishing consensus.

Aim: The purpose of this project was to define core competencies for EM clerkships in Canada, to validate a Delphi process in the context of national curriculum development, and to demonstrate the adoption of the CanMEDS physician competency paradigm in the undergraduate medical education realm.

Methods: Using a modified Delphi process, we developed a consensus amongst a panel of expert emergency physicians from across Canada utilizing the CanMEDS 2005 Physician Competency Framework.

Results: Thirty experts from nine different medical schools across Canada participated on the panel. The initial list consisted of 152 competencies organized in the seven domains of the CanMEDS 2005 Physician Competency Framework. After the second round of the Delphi process, the list of competencies was reduced to 62 (a 59% reduction).

Conclusion: This study demonstrated that a modified Delphi process can result in a strong consensus around a realistic number of core competencies for EM clerkships. We propose that such a method could be used by other medical specialties and health professions to develop rotation-specific core competencies.

Introduction

Practice points

- There is no consensus on the core competencies required for EM clerkships.
- The Delphi process is an effective method for establishing consensus.
- The Delphi process is a valid method to inform competency and curriculum development.
- A consensus was established in the development of EM clerkship core competencies in Canada.
- Physician competency frameworks such as CanMEDS 2005 can be used to structure undergraduate curricula even though they were developed to define practice at the consultant level.

Undergraduate emergency medicine (EM) teaching in Canada is heterogeneous, and there have been recommendations for national standards in this area (Frank et al. 2008). Recommendations for a fourth-year medical student EM curriculum have been published in the United States (Manthey et al. 2006), and the International Federation for Emergency Medicine recently described an international generic model curriculum for medical student education in EM (Hobgood et al. 2009). However, neither of these sets of recommendations was developed using validated methods defined a priori, and both prescribe EM curricula for an overall training program rather than for a specific clinical experience. Canadian programs employ a clinical clerkship model wherein students rotate through various clinical placements in their senior year(s), and there is an imperative to select relevant learning objectives from the overall program curriculum for application in each specific, time-limited clinical rotation. We are unaware of any systematically derived curriculum for such a rotation in EM. A core competency is defined as the essential knowledge, skill, or attitude needed to succeed in a given field. Defining learning objectives and core competencies for clinical clerkships has become a priority in medical education (Burke & Brodkey 2006).

The Liaison Committee on Medical Education (LCME) is the accrediting authority for programs leading to the MD degree in the United States and Canada. The LCME requires medical schools to specify the objectives of their educational program and to ensure that all those responsible for teaching medical students are aware of these objectives. These objectives are to be stated in terms that allow assessment of student progress in developing the competencies that the profession and the public expect of a physician (Liaison Committee on Medical Education 2010). There are already widely recognized definitions of the knowledge, skills, and attitudinal attributes appropriate for a physician, including those described in the ACGME Outcome Project (Accreditation Council for Graduate Medical Education 2010) and the CanMEDS 2005 Physician Competency Framework (Frank 2005). The Association of American Medical Colleges has defined clinical skills curricula for undergraduate medical education (Association of American Medical Colleges 2005). Implementing these recommendations requires educators to define, from these general lists, what is appropriate for each educational unit (such as a course or clinical rotation). Most undergraduate and postgraduate medical education programs have developed national core competencies for their respective overall programs, if not for specific experiences within them. Although all programs are encouraged to design curricula around local resources and constraints, a national reference list of core competencies for specific rotations can help to ensure that nothing is missed, that realistic priorities are set, and that political discussions are well informed.

The CanMEDS 2005 Physician Competency Framework was developed by the Royal College of Physicians and Surgeons of Canada as a guide to the essential abilities physicians need for optimal patient outcomes. It consists of seven roles, or thematic groups of competencies: medical expert, communicator, collaborator, manager, health advocate, scholar, and professional (Frank 2005). Each role can be further broken down into smaller components for teaching, learning, and assessment.

The Delphi process has been used extensively in the social sciences and in health-related research. It is a group facilitation technique that seeks to obtain consensus on the opinions of experts through a series of structured questionnaires (rounds) in an iterative, multistage process designed to transform opinion into group consensus (Hasson et al. 2000).

The purpose of this project was to define the core competencies for EM clerkships in Canada, to validate a Delphi process in the context of national curriculum development, and to demonstrate the adoption of the CanMEDS physician competency paradigm in the undergraduate medical education realm.

Methods

Using a modified Delphi process, we developed a consensus amongst a panel of expert EM undergraduate educators from across Canada utilizing the CanMEDS 2005 Physician Competency Framework (Frank 2005).

Questionnaire development

A convenience sample of experts in EM undergraduate education (the project investigators, n = 6) developed a comprehensive list of competencies for the EM clerkship. This list was developed by consulting multiple sources, including a review of the published literature and the gray literature (internet) for existing EM curricula and course objectives, and by reviewing the course curricula of nine Canadian medical school EM clerkships until saturation was reached (Appendix 1). The list of competencies was organized into the domains described in the CanMEDS 2005 Physician Competency Framework (Frank 2005) by consensus of two study investigators (Rick Penciner and Glen Bandiera). This comprehensive list underwent a series of edits and additions, with a view to inclusivity, until there was consensus amongst the expert panel.

A questionnaire was developed using a web-based survey tool (SurveyMonkey, available at www.surveymonkey.com). Each item consisted of a description of the competency followed by a 7-point Likert scale indicating the strength of agreement that the competency should be included in a 4-week EM clerkship rotation (as distinct from what students should know about EM upon graduation). Detailed instructions for panel members on what to consider when rating each competency were developed (Appendix 2). In addition, panel members were given the opportunity to add competencies at their discretion. The questionnaire was piloted with nine EM educators, and final edits were made based on the feedback received.

Round 1

A larger, representative panel of expert EM undergraduate educators in Canada was identified. The EM undergraduate coordinator/director at each English-speaking medical school in Canada (n = 13) was invited by email to participate. Using a "snowball" technique (Valente & Pumpuang 2007), we asked each EM undergraduate coordinator/director to identify four additional "experts" at their university and to provide the principal investigator (PI) with their names and email addresses. An expert was defined as an "emergency medicine physician with expertise and/or interest in emergency medicine undergraduate education". The PI contacted potential participants by email to confirm interest and agreement to participate in the project. Two reminder invitations were sent at weekly intervals. Participation in the panel was voluntary. The process was conducted in a quasi-anonymous manner: respondents' identities were known only to the PI, to allow for reminders and provision of feedback in subsequent rounds, and participants' judgments and opinions remained strictly anonymous to the other members of the expert panel. On March 4, 2010, an email was sent to each member of the panel with a link to an online questionnaire. Two reminder emails were sent at weekly intervals.

Round 2

Competencies rated 6 or 7 were categorized as "must include", 4 or 5 as "for consideration", and 1, 2, or 3 as "not include". The PI ranked the competencies from the highest to the lowest percentage of "must include" responses within each CanMEDS competency domain. Competencies rated "not include" by 75% or more of the respondents were to be eliminated from the list. Panel members were subsequently invited by email, 22 days after the first questionnaire, to complete a second online questionnaire. Each panel member was asked to rate each competency using a scale of only "must include" or "not include". The panel's aggregate responses for each competency from round 1 were shown beside each competency as the percentage of "not include", "for consideration", and "must include" responses. Two reminder emails were sent at weekly intervals.

Evaluation method

Using descriptive statistics, we rank-ordered the competencies based on the percentage of "must include" responses. The criterion for inclusion in the final list was that 75% or more of respondents rated the competency "must include".
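The rating and threshold rules described above lend themselves to a simple computation. The following is a minimal sketch in Python of how round 1 categorization, the a priori elimination rule, and the final 75% inclusion criterion could be applied; the function names, data structures, and example ratings are illustrative assumptions and are not taken from the original study.

```python
from collections import Counter

# Hypothetical helper: map a 7-point Likert rating to its round 1 category.
def categorize(rating: int) -> str:
    if rating >= 6:
        return "must include"
    if rating >= 4:
        return "for consideration"
    return "not include"

def category_percentages(ratings: list) -> dict:
    """Percentage of respondents in each category for one competency."""
    counts = Counter(categorize(r) for r in ratings)
    total = len(ratings)
    return {cat: 100 * counts.get(cat, 0) / total
            for cat in ("not include", "for consideration", "must include")}

def eliminate_after_round1(percentages: dict) -> bool:
    # A priori rule: drop a competency if 75% or more rated it "not include".
    return percentages["not include"] >= 75

def include_after_round2(must_include_votes: int, respondents: int) -> bool:
    # Final inclusion rule: 75% or more of respondents chose "must include".
    return 100 * must_include_votes / respondents >= 75

# Hypothetical example: 30 panellists rating one competency in round 1.
round1_ratings = [7] * 20 + [6] * 5 + [4] * 3 + [2] * 2
pct = category_percentages(round1_ratings)
print(pct)                           # aggregate feedback shown to the panel in round 2
print(eliminate_after_round1(pct))   # False: the competency stays on the list
print(include_after_round2(27, 29))  # True: 93% >= 75%, so it is retained
```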

External review

Upon completion of data collection, the project was reviewed by four external reviewers. The reviewers (none of whom participated in the study, and who were selected by the investigators based on their national reputations as experts in curriculum development) included three EM educators/researchers and one family medicine undergraduate educator/researcher. Reviewers were asked to provide a brief narrative commenting on whether the project results were valid, useful, and applicable to EM clerkships, and whether the methodology was appropriate to inform competency and curriculum development.

Results

Delphi process

Nine clerkship directors responded to the initial invitation, providing the names of 33 potential participants. Of these potential participants, 30 consented to participate on the expert panel. One panel member did not respond to the second questionnaire. There was representation from nine different medical schools from across Canada (range = 2–5 participants per school), with six provinces represented. The expert panel (n = 30) was diverse in terms of educational background and expertise (Table 1). The initial list consisted of 152 competencies organized in the seven domains of the CanMEDS framework (Table 2). After round 2 of the Delphi process, the list of competencies was reduced from 152 to 62 (a 59% reduction). The seven domains of the CanMEDS 2005 framework are: medical expert, communicator, collaborator, manager, health advocate, scholar, and professional. A sample of the "Medical Expert" competencies can be found in Tables 3 and 4.

Table 1. Profile of expert panel (n = 30).

Characteristic | No. (%)
Years in practice
  Less than 5 years | 5 (17)
  5–10 years | 8 (27)
  10–15 years | 9 (30)
  Greater than 15 years | 8 (26)
Certification
  CCFP(EM) | 18 (60)
  FRCP(EM) | 16 (53)
  Dual certification | 4 (13)
University rank
  Lecturer | 6 (20)
  Assistant professor | 12 (40)
  Associate professor | 11 (37)
  Professor | 1 (3)
Practice setting
  Academic Health Science Centre | 26 (87)
  Community Teaching Hospital | 3 (10)
  Community Hospital | 1 (3)
Practice mix
  Almost exclusively adult patients | 15 (50)
  Mix of both adult and pediatric patients | 15 (50)
Involvement in EM clerkship teaching in past 5 years
  Clinical teaching (e.g., teaching during shifts) | 29 (97)
  Non-clinical teaching (e.g., seminars) | 27 (90)
  Workshop/seminar coordination/administration | 19 (63)
  Curriculum development | 16 (53)
  Course committee | 16 (53)
  Course director | 15 (50)

External review

The external reviewers commented on four areas of the project: face validity of the results, applicability of the results, usefulness of the results, and whether the methodology was appropriate to inform competency and curriculum development. The reviewers believed the project results had face validity, given the comprehensive literature search and the identification of the right experts for the panel. One reviewer commented that "the unique educational considerations of the emergency department learning environment have been considered". Two reviewers commented that there was no representation on the panel from outside EM educators (such as students and other medical educators). The results were considered applicable to EM clerkships across Canada. The inclusion of educators from across Canada and from a variety of educational environments and backgrounds helped ensure the findings were generalizable. There was concern regarding the absence of panel members from the province of Quebec, given its unique language and culture within Canada, and regarding the limited representation on the panel from community emergency departments. The reviewers commented that the results were useful since they were mapped onto the CanMEDS 2005 Framework, which is increasingly informing undergraduate medical education. The face validity of the construct of adapting consultant-level competency models to design undergraduate curricula specific to given rotations thus seems robust. The reviewers believed that the methodology employed was appropriate to inform competency and curriculum development, and that a Delphi process was ideal for developing consensus amongst experts. One reviewer commented that "there was no representation from learners or recent graduates from medical school."

Table 2. Number of competencies by Delphi round.

Competency domain | Initial list | After last round (75% agreement of "must include") | Percent reduction
Medical expert (general) | 16 | 9 | 44
Medical expert (presenting problems) | 41 | 17 | 59
Medical expert (procedures) | 36 | 9 | 75
Medical expert (interpretive skills) | 13 | 8 | 39
Communicator | 7 | 4 | 43
Collaborator | 6 | 2 | 67
Manager | 8 | 0 | 100
Health advocate | 10 | 2 | 80
Scholar | 5 | 2 | 60
Professional | 10 | 9 | 10
Total | 152 | 62 | 59

Table 3. Example of "Medical Expert" (general) competencies.

Competency | No. of experts who chose "must include" (%)
Demonstrate the ability to rapidly recognize and initiate basic management of acute life- or limb-threatening illness or injury | 29 (100)
Describe a basic differential diagnosis including the significant worst-case diagnosis for every patient assessed | 29 (100)
Demonstrate a basic systematic, prioritized approach to resuscitation and stabilization of emergencies | 29 (100)
Demonstrate a basic ability to distinguish seriously ill or injured patients from those with minor conditions | 28 (97)
Demonstrate a focused history and physical examination | 28 (97)
Distinguish which conditions are life-threatening or emergent from those that are less urgent | 28 (97)
Demonstrate the ability to evaluate and initiate treatment of the undifferentiated patient | 27 (93)
Describe the concept of triage and prioritization of care | 27 (93)
Recognize that certain groups of patients require a high index of suspicion for serious illness (e.g., immune-compromised, chronic renal failure, transplant, extremes of age, intoxicated, and diabetes) | 27 (93)
Monitor the response to therapeutic interventions | 20 (69)
Describe the risks and benefits of investigations and treatments utilized in the ED for the common presenting problems | 17 (59)
Recognize and initiate management of a presentation related to domestic violence | 17 (59)
Demonstrate an understanding of the concept of triage and prioritization of care in the management of multiple patients simultaneously | 15 (52)
List which areas in the ED are most appropriate for various triage categories and types of patients | 14 (50)
Describe the five levels of the Canadian Triage and Acuity Scale | 11 (38)
Perform a rapid triage assessment | 8 (28)

Table 4. Example of "Medical Expert" (presenting problem) competencies: demonstrate an approach to patients presenting to the ED with the following problems (including basic differential diagnosis, initial investigations, and initial treatments).

Competency | No. of experts who chose "must include" (%)
Chest pain | 29 (100)
Shortness of breath | 29 (100)
Altered level of consciousness | 29 (100)
Anaphylaxis/severe allergic reaction | 28 (97)
Abdominal pain | 28 (97)
Loss of consciousness (syncope) | 28 (97)
Shock | 28 (97)
Seizure | 27 (93)
Cardio-respiratory arrest | 27 (93)
Headache | 27 (93)
Minor trauma (including fracture/dislocation/sprain) | 27 (93)
Abnormal behavior (e.g., psychosis, delirium, intoxication, violence) | 25 (86)
Head injury – minor | 25 (86)
Fever | 23 (79)
Dizziness/vertigo | 23 (79)
Cardiac dysrhythmias | 22 (76)
Vaginal bleeding – pregnant | 22 (76)
Poisoning | 21 (72)
Gastrointestinal bleeding | 21 (72)
Eye pain (including red eye) | 21 (72)
Vomiting | 20 (69)
Back pain | 20 (69)
Urinary symptoms (e.g., dysuria, hematuria) | 20 (69)
Major trauma | 19 (68)
Intoxication/withdrawal | 19 (66)
Burn – minor | 19 (66)
Depression/suicidal | 17 (59)
Epistaxis | 17 (59)
Diarrhea | 17 (59)
Burn – major | 17 (59)
Weakness | 16 (55)
Pelvic pain | 14 (48)
Bites (animal/insects) | 14 (48)
Ear pain | 13 (45)
Heat injury | 12 (41)
Vaginal bleeding – non-pregnant | 11 (38)
Vision change | 11 (38)
Cold injury | 11 (38)
Urinary retention | 9 (31)
Rash | 7 (24)
Foreign body (cavitary, ENT) | 6 (21)

Discussion

This study demonstrated that a modified Delphi process is an effective method to establish a national consensus in the development of core competencies for EM clerkships. Consensus methods in medical and health services research provide a means of harnessing the insights of appropriate experts to enable decisions to be made. Commonly used consensus methods include brainstorming, the nominal group technique, consensus development conferences, and the Delphi technique (Jones & Hunter 1995). We employed the Delphi process as it was considered the most rigorous and practical means of achieving consensus amongst a geographically dispersed group of experts.

The Delphi process has been used extensively in social science and health-related research. The process allows researchers to seek out information that may generate a consensus on the part of the respondent group. It is increasingly being used in health professional education research, including curriculum and competency development in nursing (Barton et al. 2009), dentistry (Fried & Leao 2007), and pharmacology (Walley & Webb 1997).

education (Flynn & Verma 2008). It has also been used in undergraduate medical education including palliative medicine (Paes & Wee 2008), dermatology (Clayton et al. 2006), critical care (Perkins et al. 2005) and family medicine (Hueston et al. 2004). We found no evidence in the English language literature of the Delphi process being used in curriculum and competency development in undergraduate EM. There is no universally accepted uniform process for the use of a Delphi technique (Hasson et al. 2000). We chose to develop a questionnaire with a finite list of options. This list of options was based on the literature and existing curricula in Canada and was developed by consensus of a smaller group of experts. The initial list of competencies was overly inclusive by design. The first questionnaire employed a 7-point Likert scale which forced participants to rate each competency indicating the strength of agreement whether the competency should be included. This allowed for a response by each expert without having to make a final commitment on inclusion or exclusion of a given competency. We decided a priori to eliminate from the second questionnaire any competency that was rated 1, 2,

Emergency medicine clerkship competencies

Table 4. Example of ‘‘Medical Expert’’ ( presenting problem) competencies. This domain demonstrates an approach to patients presenting to the ED with the following problems (including basic differential diagnosis, initial investigations, and initial treatments).

Competency Chest pain Shortness of breath Altered level of consciousness Anaphylaxis/severe allergic reaction Abdominal pain Loss of consciousness (syncope) Shock Seizure Cardio-respiratory arrest Headache Minor trauma (including fracture/dislocation/sprain) Abnormal behavior (e.g., psychosis, delirium, intoxication, violence) Head injury – minor Fever Dizziness/vertigo Cardiac dysrhythmias Vaginal bleeding – pregnant Poisoning Gastrointestinal bleeding Eye pain (including red eye) Vomiting Back pain Urinary symptoms (e.g., dysuria, hematuria) Major trauma Intoxication/withdrawal Burn – minor Depression/suicidal Epistaxis Diarrhea Burn – major Weakness Pelvic pain Bites (animal/insects) Ear pain Heat injury Vaginal bleeding – non-pregnant Vision change Cold injury Urinary retention Rash Foreign body (cavitary, ENT)

No. of experts who chose ‘‘must include’’ (%) 29 29 29 28 28 28 28 27 27 27 27 25

(100) (100) (100) (97) (97) (97) (97) (93) (93) (93) (93) (86)

25 23 23 22 22 21 21 21 20 20 20 19 19 19 17 17 17 17 16 14 14 13 12 11 11 11 9 7 6

(86) (79) (79) (76) (76) (72) (72) (72) (69) (69) (69) (68) (66) (66) (59) (59) (59) (59) (55) (48) (48) (45) (41) (38) (38) (38) (31) (24) (21)

We decided a priori to eliminate from the second questionnaire any competency rated 1, 2, or 3 by greater than 75% of the respondents. No competencies met this criterion, and all were included in the final questionnaire. However, the 7-point Likert scale ratings resulted in a range of responses, allowing for initial rankings of the competencies. These aggregate rankings were fed back to the panel on the second questionnaire. One of the key elements of the Delphi process is to provide the current status of the group's collective opinion after each round; the purpose is to allow each panel member to consider revising their opinion in light of the forming group opinion. On the second questionnaire, the rating scale was changed to "not include" or "must include"; ultimately, we wanted a commitment from each expert on this simple binary question. We believe that changing the rating scale allowed for a significant reduction of the list of competencies, from 152 to 62 (a 59% reduction). The classic Delphi technique used four rounds; however, some evidence suggests that two or three rounds are preferable (Hasson et al. 2000). Due to limitations of time and resources, we limited the Delphi process to two rounds.

There is no clear definition in the literature of an "expert" for the purposes of a Delphi process. We defined our experts as EM physicians with interest and/or expertise in EM undergraduate education. We employed purposive sampling to ensure we had a panel of experts from all regions of Canada with medical schools. By utilizing a "snowball" technique for recruitment of the expert panel, we ensured a representative panel was formed. Each participant was identified as an expert either by means of their formal position (EM undergraduate education director at their respective medical school) or by being identified as an expert by a colleague. Furthermore, by agreeing to participate in the panel, participants demonstrated a level of interest in the topic. An expert panel usually consists of 15–30 participants (Linstone & Turoff 1975), and increasing the group size beyond 30 has seldom been found to improve the results (Fink et al. 1984). Our study employed 30 experts. By pre-recruiting interested individuals prior to administration of the questionnaires, we had a 100% response rate to the first questionnaire, and only one panel member did not respond to the second questionnaire.

The Delphi process is an effective method of collecting opinion and determining consensus. However, a universally agreed level of consensus does not exist; the level used depends upon the sample size, the aim of the research, and the available resources. The reported level of consensus in the literature ranges from 51% to 80% (Hasson et al. 2000). The level of consensus that we sought was 75% of respondents supporting that the competency "must be included". We chose a high level of consensus to ensure that a manageable core list of competencies was developed, one that a medical student could realistically encounter during a time-limited EM clerkship rotation.

There has been an active debate in the literature on the validity of the Delphi process (Jones & Hunter 1995). The validity of our study was enhanced by recruiting a representative panel of experts from across Canada. Our experts represented every province with a medical school, with the exception of two provinces, and their profiles were diverse in terms of educational background. Our high response rates and use of successive rounds also increased the validity of the results.

There are several widely recognized definitions of the knowledge, skills, and attitudinal attributes appropriate for a physician, including those described in the AAMC's Medical School Objectives Project, the general competencies of physicians resulting from the collaborative efforts of the ACGME, and the physician roles summarized in the CanMEDS 2005 Physician Competency Framework (Liaison Committee on Medical Education 2010). In Canada, the CanMEDS 2005 Physician Competency Framework describes the generic abilities required for a practicing physician. This framework of competencies was originally adapted for use at the postgraduate medical specialist level. We chose to utilize the CanMEDS Framework because it is increasingly being used to inform undergraduate medical education and is familiar to most educators in Canada. We demonstrated that the CanMEDS Framework is a valid and practical framework with which to structure competencies and curricula at the undergraduate level.

It is no surprise that the number of competencies for the "Medical Expert" role (43) far exceeded those for any other role and, in fact, significantly outnumbered all the other roles combined (19). This reflects the traditional teaching paradigm of medical schools and the fact that medical knowledge and skills are still seen by teachers as the core of the curriculum for medical students. More programs are being developed at medical schools that address how to incorporate the non-"Medical Expert" roles into the curriculum. Of note, the expert panel did not include a single competency from the "Manager" role in the final list. This may reflect a belief that the role is not appropriate for students in undergraduate education; it may also reflect an exercise of priorities in deciding what should be included in a time-limited EM clerkship.

There are a number of limitations to this study. The Delphi process itself has limitations, including concerns about a lack of reliability (Hasson et al. 2000). Inherent in the anonymity of the process is a danger of a lack of accountability for the opinions expressed. Our expert panel had no representation from two provinces that have medical schools and did not include representation from French-language medical schools in Canada. The panel included only four physicians from community hospitals (teaching and non-teaching). This panel format was designed to include community representation while recognizing that most EM clerkships in Canada (and expertise in the practicalities of curriculum delivery) are concentrated in Academic Health Science Centres. Furthermore, since undergraduate training in EM is typically oriented toward acquiring general fundamental competencies, our panel included physicians who practiced in a variety of adult and adult/pediatric settings. Since emergency physicians practicing exclusively pediatric EM are almost always specialty or sub-specialty certified in this area and see a unique practice population, we chose to exclude them from this undergraduate study. Our study utilized the CanMEDS 2005 Physician Competency Framework (Frank 2005) to organize the competencies. Although this framework is increasingly informing undergraduate education, it has not been adopted by all medical schools in Canada. Our study provides evidence of the face validity of such an approach.

Conclusions

This study demonstrated that a modified Delphi process is an effective method to establish a national consensus in the development of core competencies for EM clerkships. These results are applicable and useful to EM clerkship programs in Canada. We propose that such a method could be used by other medical specialties and health professions to develop rotation-specific core competencies.

Acknowledgments

The authors are grateful to the external reviewers: Jonathan Sherbino, McMaster University; Farhan Banji, McGill University; Eddy Lang, University of Calgary; and Risa Freeman, University of Toronto. They also gratefully acknowledge the members of the expert panel: Simon Field, Connie LeBlanc, John Ross, Dalhousie University; Danielle Blouin, Jaelyn Caudle, Jim Landine, Queen's University; Randy Cunningham, Ryan Oland, Curtis Rabuka, Andrew Stagg, University of Alberta; Nancy Austin, Mike Mostrenko, Patrick Rowe, University of British Columbia; Laurie-Ann Baker, David Lendrum, University of Calgary; Albert Buchel, Zoe Oliver, University of Manitoba; Jason Frank, Brian Weitzman, University of Ottawa; Nadim Lalani, Patrick Ling, University of Saskatchewan; Shirley Lee, Rahim Valani, Stella Yiu, University of Toronto.

Declaration of interest: The authors report no conflicts of interest.

Notes on contributors

RICK PENCINER, MD, is the Director of Medical Education and an Emergency Physician at North York General Hospital in Toronto. He is an Associate Professor in the Division of Emergency Medicine, Department of Family and Community Medicine at the University of Toronto.

TREVOR LANGHAN, MD, is an Emergency Physician and a Clinical Assistant Professor at the University of Calgary, where he is the Clerkship Coordinator for Emergency Medicine.

RICHARD LEE, MD, is an Emergency Physician, a Clinical Associate Professor, and the past undergraduate Program Director for Emergency Medicine, and is now the Assistant Dean of Clinical Undergraduate Education with the University of Alberta, Faculty of Medicine and Dentistry.

JILL MCEWEN, MD, is the undergraduate Program Director, Department of Emergency Medicine, and the Vancouver-Fraser Clerkship Site Director, Faculty of Medicine, University of British Columbia. She is an Associate Clinical Professor at the University of British Columbia and an Emergency Physician at the Vancouver General Hospital.

ROBERT A. WOODS, MD, is the Emergency Medicine Clerkship Director and a Clinical Assistant Professor at the University of Saskatchewan.

GLEN BANDIERA, MD, MEd, is the Chief of Emergency Medicine at St. Michael's Hospital in Toronto. He is an Associate Professor in the Division of Emergency Medicine, and Director of Postgraduate Programs, Department of Medicine at the University of Toronto.

References

Accreditation Council for Graduate Medical Education. 2010. ACGME Outcome Project. [Retrieved 2010 August 16]. Available from: http://www.acgme.org/outcome
Association of American Medical Colleges. 2005. Recommendations for clinical skills curricula for undergraduate medical education. [Retrieved 2010 August 9]. Available from: https://services.aamc.org/Publications/showFile.cfm?file=version56.pdf&prd_id=141&prv_id=165&pdf_id=56
Barton AJ, Armstrong G, Preheim G, Gelmon SB, Andrus LC. 2009. A national Delphi to determine developmental progression of quality and safety competencies in nursing education. Nurs Outlook 57(6):313–322.
Burke MJ, Brodkey AC. 2006. Trends in undergraduate medical education clinical clerkship learning objectives. Acad Psychiatry 30(2):158–165.
Clayton R, Perera R, Burge S. 2006. Defining the dermatological content of the undergraduate medical curriculum: A modified Delphi study. Br J Dermatol 155(1):137–144.
Esmaily HM, Savage C, Vahidi R, Amini A, Zarrintan MH, Wahlstrom R. 2008. Identifying outcome-based indicators and developing a curriculum for a continuing medical education programme on rational prescribing using a modified Delphi process. BMC Med Educ 8:33.
Fink A, Kosecoff J, Chassin M, Brook RH. 1984. Consensus methods: Characteristics and guidelines for use. Am J Public Health 74:979–983.
Flynn L, Verma S. 2008. Fundamental components of a curriculum for residents in health advocacy. Med Teach 30(7):e178–e183.
Frank JR, editor. 2005. The CanMEDS 2005 physician competency framework: Better physicians, better care. Ottawa: The Royal College of Physicians and Surgeons of Canada. [Retrieved 2010 August 9]. Available from: http://rcpsc.medical.org/canmeds/CanMEDS2005/CanMEDS2005_e.pdf
Frank JR, Penciner R, Upadhye S, Nuth J, Lee C. 2008. State of the nation: A profile of Canadian emergency medicine clerkships 2007. Can J Emerg Med 10(3):266.
Fried H, Leao AT. 2007. Using Delphi technique in a consensual curriculum for periodontics. J Dent Educ 71(11):1441–1446.
Hasson F, Keeney S, McKenna H. 2000. Research guidelines for the Delphi survey technique. J Adv Nurs 32(4):1008–1015.
Hobgood C, Anantharaman V, Bandiera G, Cameron P, Halperin P, Holliman J, Jouriles N, Kilroy D, Mulligan T, Singer A. 2009. International Federation for Emergency Medicine model curriculum for medical student education in emergency medicine. Can J Emerg Med 11(4):349–354.
Hueston WJ, Koopman RJ, Chessman AW. 2004. A suggested fourth-year curriculum for medical students planning on entering family medicine. Fam Med 36(2):118–122.
Jones J, Hunter D. 1995. Consensus methods for medical and health services research. BMJ 311:376–380.
Liaison Committee on Medical Education. 2010. LCME accreditation standards: Educational program for the MD degree: Educational objectives. [Retrieved 2010 August 9]. Available from: http://www.lcme.org/functionslist.htm#educational program
Linstone HA, Turoff M. 1975. The Delphi method: Techniques and applications. Boston, MA: Addison-Wesley.
Manthey DE, Coates WC, Ander DS, Ankel FK, Blumstein H, Christopher TA, Courtney JM, Hamilton GC, Kaiyala EK, Rodger K, et al. 2006. Report of the task force on national fourth year medical student emergency medicine curriculum guide. Ann Emerg Med 47:e1–e7.
Paes P, Wee B. 2008. A Delphi study to develop the Association for Palliative Medicine consensus syllabus for undergraduate palliative medicine in Great Britain and Ireland. Palliat Med 22(4):360–364.
Perkins GD, Barrett H, Bullock I, Gabbott DA, Nolan JP, Mitchell S, Short A, Smith CM, Smith GB, Todd S, et al. 2005. The acute care undergraduate teaching (ACUTE) initiative: Consensus development of core competencies in acute care for undergraduates in the United Kingdom. Intensive Care Med 31(12):1627–1633.
Valente TW, Pumpuang P. 2007. Identifying opinion leaders to promote behaviour change. Health Educ Behav 34(6):881–896.
Walley T, Webb DJ. 1997. Developing a core curriculum in clinical pharmacology and therapeutics: A Delphi study. Br J Clin Pharmacol 44(2):167–170.

Appendix 1

Sources for first questionnaire

(1) Dalhousie University, Department of Emergency Medicine, Clinical Competencies/Encounters Documentation Form.
(2) Dalhousie University, Emergency Medicine Clerkship Objectives.
(3) Queen's University, Emergency Medicine Clerkship Objectives.
(4) University of Alberta, Medical Student Emergency Medicine Rotation Course Goals.
(5) University of Alberta, Medical Student Emergency Medicine Rotation Course Goals.
(6) University of British Columbia, Emergency Medicine Core Year 3 Clerkship Objectives.
(7) University of Calgary, Emergency Medicine Clerkship, Rotation Objectives.
(8) University of Manitoba, Emergency Medicine Clerkship Academic Objectives.
(9) University of Ottawa, Emergency Medicine Clerkship Goals and Objectives.
(10) University of Saskatchewan, Emergency Medicine Clerkship Core Content.
(11) University of Western Ontario, Emergency Medicine Clerkship Objectives.
(12) Ref. Manthey et al. (2006).
(13) Ref. Hobgood et al. (2009).

Appendix 2

Instructions to expert panel

WHAT TO CONSIDER WHILE COMPLETING THE QUESTIONNAIRE:

As you rate each competency, consider the following:

(1) That the level of training of the learner is a senior medical student, NOT a resident.
(2) Which competencies should be taught DURING a time-limited 4-week Emergency Medicine clerkship rotation (as opposed to other rotations).
(3) Which competencies should be acquired during an Emergency Medicine clerkship rotation that ALL medical students should have upon graduation regardless of chosen specialty or career path.
(4) This rating is not what is currently done at your site, but rather what you think should be included in a national curriculum.
(5) Competencies may be acquired through exposure to real or simulated patients (e.g., clinical activities, workshops, simulation, online learning).
