BREAKOUT SESSION
Assessing Professionalism: Summary of the Working Group on Assessment of Observable Learner Performance

Elliot Rodriguez, MD, Jeffrey Siegelman, MD, Katrina Leone, MD, and Chad Kessler, MD, MHPE
Abstract

Professionalism is one of the six Accreditation Council for Graduate Medical Education (ACGME) core competencies on which emergency medicine (EM) residents are assessed. However, very few assessment tools have been rigorously evaluated in this population. One goal of the 2012 Academic Emergency Medicine consensus conference on education research in EM was to develop a research agenda for testing and developing tools to assess professionalism in EM residents. A literature review was performed to identify existing assessment tools. Recommendations on future research directions were presented at the consensus conference, and an agenda was developed.

ACADEMIC EMERGENCY MEDICINE 2012; 19:1372–1378 © 2012 by the Society for Academic Emergency Medicine doi: 10.1111/acem.12031
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) shifted to an outcomes-based assessment model with emphasis on six core competencies, one of which is professionalism. The ACGME requires that these competencies be systematically assessed for purposes of both formative and summative assessment.1 The ACGME lists several characteristics of the professionalism competency that residents are expected to demonstrate, including compassion, integrity, respect, altruism, accountability, and sensitivity.2 Other international medical education governing bodies have variably delineated the elements of professionalism.
From the Department of Emergency Medicine, SUNY Upstate Medical University (ER), Syracuse, NY; the Department of Emergency Medicine, Emory University (JS), Atlanta, GA; the Department of Emergency Medicine, Oregon Health & Science University (KL), Portland, OR; and the Department of Emergency Medicine, University of Illinois at Chicago (CK), Chicago, IL. Received June 29, 2012; accepted July 1, 2012. The list of breakout session participants can be found as the appendix of a related article on page 1486. This paper reports on a breakout workshop session of the 2012 Academic Emergency Medicine consensus conference, "Education Research in Emergency Medicine: Opportunities, Challenges, and Strategies for Success," May 9, 2012, Chicago, IL. The authors have no relevant financial information or potential conflicts of interest to disclose. Supervising Editor: John Burton, MD. Address for correspondence and reprints: Elliot Rodriguez, MD; e-mail: [email protected].
The Royal College of Physicians and Surgeons of Canada organized the elements of physician competency into the CanMEDS framework, in which the domain of professionalism contains 17 elements.3 Objectively assessing professional behavior, however, requires an operational definition consisting of observable behaviors rather than conceptual elements. In 2008, the Council of Emergency Medicine Residency Directors (CORD) Workgroup on Outcomes Assessment published a list of behaviors specific to the assessment of professionalism in emergency medicine (EM) residents.4

The ACGME recently introduced the Next Accreditation System (NAS).5 The NAS includes specialty-specific milestones within each competency that residents must meet. EM will implement the NAS in 2013 and has recently published its list of milestones, two of which address professionalism6 (Data Supplement S1, available as supporting information in the online version of this paper).

The 2012 Academic Emergency Medicine consensus conference, "Education Research in Emergency Medicine: Opportunities, Challenges, and Strategies for Success," had as one of its goals the development and advancement of a consensus-based research agenda on the assessment of learner competency in the six ACGME core competencies. The objective of this article is to summarize those findings and recommendations as they apply specifically to the assessment of professionalism.

METHODS

Prior to the consensus conference, a comprehensive review of the published literature on professionalism assessment was performed.
The Medline database was searched using the terms "professionalism" and "humanism" independently combined with the term "assessment," yielding 464 and 211 references, respectively. Combining "professionalism" and "evaluation" identified 509 references. The abstract and/or full text of each of these references was reviewed by one or more of the authors to determine relevance, and the bibliographies of all relevant articles were reviewed for additional, previously unidentified references. Additionally, the MedEdPORTAL database (https://www.mededportal.org) was searched using the term "professionalism," which returned 216 results, but only six remained when the term "assessment" was added to the search strategy. A comprehensive systematic review of this topic is beyond the scope of this article; the reader is directed to the reviews by Lynch et al.7 and Veloski et al.8 The summaries cited in the article by Veloski et al. can be obtained directly from ABIM
([email protected]). The results of our review were then discussed among educational experts at a breakout session of the consensus conference. Their input was incorporated into the final recommendations and agenda.

RESULTS

Identified assessment tools were classified into several categories: ethics knowledge and moral reasoning tests; direct observation assessment tools; ratings- and survey-based assessment tools (a category that includes global rating instruments and multisource feedback [MSF] tools); critical incident reporting systems; portfolios and narratives; and simulated encounter observations, including the objective structured clinical examination (OSCE), unannounced standardized patients (SPs), and mannequin-based simulation.

Ethics Knowledge and Moral Reasoning Tests

Several instruments have been developed to assess ethics knowledge and moral reasoning using selected response questions, modified essay questions, and simulation-based cases.9–12 The most widely studied tool for the assessment of moral reasoning is the Defining Issues Test (DIT). The DIT is a multiple-choice test that is easy to administer and score, and so has good feasibility. It has been extensively used, demonstrating good acceptability, and has good reliability and validity.13 The DIT was revised and is now available as the DIT2, which was found to have better reliability and validity than the original.14 While the reliability, validity, and feasibility of the DIT are good, limitations and questions remain for these types of instruments. Some studies question how well ethics knowledge and reasoning assessments correlate with actual professional behavior.15 The DIT has been studied in other specialties,16 but there are no published studies evaluating these types of tools in EM resident populations.

Direct Observation Assessment Tools

Direct observation tools, for the purposes of this discussion, refer to instruments used to assess learner performance in the workplace during real clinical encounters. It has been argued that actual clinical encounters are necessary to fully capture the complexity of context in ethical conflicts.17
The Professionalism Mini-Evaluation Exercise (P-MEX) is a professionalism-specific direct observation tool. It demonstrated good reliability and both content and construct validity when evaluated in Canadian medical students.18 Most recently, the Standardized Direct Observation Assessment Tool (SDOT) was described by Shayne et al.19 The tool was developed for EM residents and, while not professionalism-specific, it incorporates several elements of professional behavior, including a global rating for that competency. In a simulated resident–patient setting, it showed good interrater agreement, but validity was not assessed. The SDOT has also shown good reliability in real resident–patient encounters.20

The need for a validated, EM-specific direct observation tool for professionalism remains. The P-MEX may be adaptable for this purpose, but its reliability and validity must be evaluated in EM residents. The instrument is brief and appears to have good feasibility, although the barriers to implementing a program for its use should also be explored. The SDOT, while not professionalism-specific, has demonstrated good reliability in assessing EM residents, but its validity has yet to be assessed. These tools have potential for educational impact, as they can help provide timely formative feedback and evaluative assessment.

Ratings- and Survey-based Assessment Tools

Ratings and surveys can be used to assess professionalism. They allow for evaluation of observed behaviors in both direct and indirect patient care activities, as well as in other settings such as educational activities. Global rating forms or summary rating forms are fairly common tools used in the evaluation of EM residents' professionalism,21 often combining the assessment of multiple core competencies into a single form. Because the core competencies each represent a unique construct, they may not be accurately assessed with a single instrument.22 Reisdorff et al.23 attempted to validate a global rating tool for EM residents. Principal component analysis revealed professionalism to be a complex, multidimensional phenomenon to measure, and as such, comprehensive global rating tools may not be ideal for assessing professionalism in EM residents.

When ratings from two or more rater groups are combined, the approach is termed MSF, or 360-degree evaluation. Raters may include self, peers, nurses, faculty, and patients. While some MSF instruments attempt to rate multiple core competencies simultaneously, MSF seems best suited to the assessment of professionalism and interpersonal and communication skills.24 The main advantage of MSF is that it allows multiple raters from multiple vantage points to provide feedback on observed behaviors. A review by Lockyer25 found MSF to have good acceptability and potential for educational impact. More robust evidence of the educational impact of MSF on communication skills and professional behavior was demonstrated in pediatric residents.26
Several MSF instruments have been studied in various populations. Tsugawa et al.27 modified the P-MEX into an MSF tool in a multispecialty study of Japanese residents and fellows. The modified P-MEX demonstrated good validity and overall reliability, although the peer rater subgroup had poor reliability. MedEdPORTAL contains a professionalism-specific MSF tool used to assess medical students; however, its reliability and validity are not reported.28 The National Board of Medical Examiners has developed an MSF tool based on observable professional behaviors delineated in its Assessment of Professional Behaviors program.29 This tool was developed as part of a proprietary longitudinal professionalism evaluation program. The reliability and validity data for the tool are still being compiled (M. Richmond, personal communication, January 19, 2012), but a description of feasibility and implementation issues for this MSF program has been published.30 Another MSF instrument was developed by the College of Physicians and Surgeons of Alberta's Physician Achievement Program (CPSA PAP) for physicians who provide episodic care, including emergency physicians (EPs). It is not professionalism-specific, but aspects of professionalism make up a large part of the assessment. Lockyer et al.31 evaluated the tool in a population of practicing EPs. Their protocol required ratings from eight coworkers, eight colleagues, and 25 patients, all selected by the physician being assessed. The tool was found to be feasible, reliable, and valid. Garra et al.32 studied an MSF tool, the EM-Humanism Scale, which evaluated U.S. EM residents' professionalism and interpersonal and communication skills. The instrument had good reliability for nurse and faculty ratings, but only fair reliability for patient ratings. The study was not designed to assess validity.

There are some general concerns with all of these types of rating instruments. Their susceptibility to systematic bias (e.g., halo/horns effect, severity/leniency bias, and range restriction) is an issue.33 Validity concerns, especially when physicians choose their own assessors, have also been raised.34 A structured approach to the development of the evaluation items is important to assure validity.25 Rater training is important, as raters may not consistently assess residents solely on observed behaviors.35 The number of evaluations needed per rater category to optimize reliability merits further study as well.

Critical Incident Reporting Systems

Incident reports, comment cards, and other similar mechanisms for reporting lapses in professionalism have been developed. These types of tools, when implemented systematically, allow for the longitudinal collection of snapshot assessments of professional behavior. Some systems allow these tools to be used not only to identify unprofessional behavior, but also to commend exemplary professional behavior. The system should be transparent, non-anonymous, and behavior-focused, and it should allow for formative feedback and remediation (unless a lapse is so egregious that it triggers mandated disciplinary action).36 In a survey of EM residency directors, Sullivan et al.21 reported that the majority of professionalism lapses are discovered through informal "curbside" complaints from faculty or nurses, and less than 8% come from ED evaluations.
The superiority of face-to-face communication in revealing concerns about learners' professionalism is supported by other researchers.37 Critical incident reporting systems may help overcome some of these barriers. The Physicianship Evaluation Form was developed for the reporting of unprofessional behavior of medical students by course and clerkship directors. The form was indirectly validated in a 2004 study by Papadakis et al.,38 who found an increased likelihood of state medical board disciplinary action if a student had negative excerpts in his or her medical school record significant enough to have warranted submission of a Physicianship Evaluation Form (odds ratio = 2.15). The American Board of Internal Medicine has developed comment cards that have been used for the assessment of professionalism in internal medicine residents, but we were unable to identify any studies evaluating the reliability or validity of this tool.39 We were also unable to identify any literature documenting the use and evaluation of this type of tool in EM residents.

One weakness of these systems is that they tend to focus on the extremes of behavior and therefore are not ideal tools for assessing all levels of professionalism. Another concern is that physicians have demonstrated difficulty identifying professionalism issues and providing feedback on them, making faculty education and training an important determinant of success for these systems.40

Portfolios and Narratives

A portfolio can take many forms but, in general terms, comprises the work that a student feels demonstrates his or her progress and success in a given area; it can include essays, presentations, patient notes, letters, and any other evidence the student finds important to demonstrate his or her achievement.41,42 Portfolios have been utilized and assessed in EM residency programs as well as in other disciplines and health professions.43 Narratives have also been employed in the teaching and assessment of professionalism.44–46 A workshop for EM residents was created to teach the concepts of altruism, excellence, and duty and to encourage residents to incorporate those values into future job performance. Residents were subsequently assessed on the timely completion of residency-related duties, with most participants demonstrating improvement.47 To evaluate the ability of fourth-year medical students in an EM clerkship to reflect on and interpret aspects of professionalism during an isolated event, Baernstein and Fryer-Edwards48 utilized the critical incident report (CIR), a brief written account of an event judged meaningful to the learner. In that study, the CIR was found to be inferior to a one-on-one interview with a faculty member in helping the learner identify and explore issues of professionalism arising in the chosen event.

Assessment of these tools is made difficult by the diversity of forms they take and by the complexity of converting narrative data into quantitative metrics. Tochel et al.43 performed a systematic review of articles assessing portfolios for postgraduate assessment and education. They found few studies of high enough quality to support generalized statements about the tool's effectiveness.
The reliability of portfolios for assessment has been shown to be improved by increasing the number of reviewers, training those reviewers, and using the portfolio as one component among other metrics used to assess competence. Portfolios have a theoretical, though not conclusively demonstrated, utility for formative assessment. Other concerns about portfolios exist as well. Some authors argue that portfolios are insufficient to assess professionalism because they are limited by the scope of the material submitted by the learner.49 There is no convincing evidence that portfolios promote reflection, and some groups have expressed a decreased willingness to share reflections when concerned that the information could be used against them.43 One psychiatry residency program compiled a portfolio of documents created during patient care to assess residents (though not explicitly their professionalism), which reduced the burden on residents to create new work to contribute.42 This approach may not translate to EM residencies, because the patient care documents created do not typically lend themselves to robust assessments of professionalism.

Simulation

Simulation in all its forms has been increasingly used in medical education. The evidence supporting its use for professionalism assessment, though, is lacking. Standardized patients have been employed as both patient and rater to assess professionalism. Zabar and colleagues50 used unannounced SPs in a real clinical setting to evaluate EM residents' communication and professionalism competence. They determined that this approach was feasible and noted no difference in scores between residents who identified the SP during the encounter and those who did not. Professionalism has also been assessed with OSCEs by multiple researchers, who have reported success in multiple populations: in medical students51 and in residents, both in EM52,53 and in other specialties,54 although rigorous reliability and validity analyses are lacking. Finally, one group evaluated professionalism in EM residents using an unvalidated checklist during a high-fidelity simulation crisis resource management course and was able to demonstrate construct validity of their tool for the specific scenarios they created.55
While OSCEs have gained acceptance for their reliability and validity in measuring many of the competencies, there are scant data on how effective they are at assessing professionalism.56 In particular, concerns remain regarding interrater reliability, interstation consistency, and construct validity, depending on the scenarios created.51,54,57,58 Implementation of simulation for assessment is also limited by the cost and faculty time required.

DISCUSSION

We have identified a multitude of tools used in the assessment of professionalism. When evaluating the utility of an assessment tool, it is helpful to consider five elements: reliability, validity, educational impact, acceptability, and cost (feasibility).59 Unfortunately, none of the identified instruments has been evaluated rigorously enough to define all five of these factors in an EM residency setting (Table 1). While reliability and validity may be generalizable within similar populations, other factors such as acceptability and feasibility may be residency program dependent. The potential for educational impact was often implicit but rarely explicitly evaluated in the studies reviewed. None of the assessment tools we identified has yet been evaluated rigorously enough in EM residents for us to confidently recommend its use for summative evaluation purposes.

While only a minority of the instruments have been evaluated in EM residency settings, many may still be appropriate for that use. However, professionalism is culture- and context-specific, so tools developed in other environments should be reevaluated in U.S. EM settings prior to their adoption.60 Because professionalism is a complex construct, it is recommended that it be assessed using a variety of tools so that all of its important elements are included.49,60–62 This multimethod assessment concept is called triangulation, and it is useful when attempting to assess complex human behavior.63
Table 1
Potential Professionalism Assessment Toolbox for EM Residents

Tool                             Reliability   Validity   Feasibility   Studied in EM   Reference
Ethical/moral reasoning tests    +             +          +             –               Rest et al.14
Direct observation               +             –          +/–           +               Shayne et al.19
Direct observation               +             +          +/–           –               Cruess et al.18
MSF                              +             –          +             +               Garra et al.32
MSF                              +             +          +             +               Lockyer et al.31
CIRs                             NA            +          +             –               Papadakis et al.38
Portfolios                       –             –          +/–           +               Baernstein et al.48
Simulation: high-fidelity        –             +          +/–           +               Gisondi et al.55
Simulation: SP                   –             –          +/–           +               Zabar et al.50
Simulation: OSCE                 –             –          +/–           +               Wallenstein et al.56

CIRs = critical incident reports; MSF = multisource feedback; NA = not applicable; OSCE = objective structured clinical exam; SP = standardized patient.
Wilkinson et al.49 developed a framework for assessing professionalism that involved deconstructing professionalism into themes and subthemes. They identified assessment gaps in seven subthemes, six of which fell under the theme of commitment to autonomous maintenance and continuous improvement of competence in self, others, and systems. Many of these subthemes may be assessable using well-designed qualitative assessment tools such as portfolios. At this point, however, those types of tools still need to be developed for EM residents.

SUMMARY AND RECOMMENDATIONS

Professionalism is a complex construct that defies comprehensive assessment with a single instrument. The assessment of professionalism should be programmatic, multimodal, and longitudinal. Qualitative assessments will probably need to be included to avoid gaps in assessing the myriad elements that comprise the construct of professionalism. The consensus process has led us to develop the following research agenda:

1. Because multiple experts advocate a multimodal approach to professionalism assessment, develop and evaluate triangulation strategies that combine accuracy with efficiency.
   A. Determine whether certain modalities or tools have sufficient psychometric rigor to allow for summative assessment.
2. Evaluate whether non-EM tools (e.g., DIT2, P-MEX) can produce data that are valid and reliable reflections of professionalism in EM residents.
3. Evaluate qualitative tools, such as portfolios, for EM professionalism assessment.
   A. Conduct a needs assessment of current practices among EM residencies.
   B. Determine the essential elements of a portfolio so that it provides a meaningful representation of an EM resident's professional development.
   C. Develop and evaluate portfolio assessment rubrics.
4. Evaluate existing MSF instruments.
   A. Determine the number of evaluations needed to achieve adequate reliability, especially for patient evaluations.
   B. Confirm the validity of these instruments in EM residents.
   C. Evaluate strategies, such as rater training, to minimize biases that may compromise the validity of MSF assessments.
5. Direct observation may be an ideal method to assess professionalism; however, certain issues merit further investigation.
   A. Evaluate implementation strategies that improve feasibility.
   B. Evaluate the SDOT's validity specifically for professionalism.
   C. Create and validate a direct observation tool incorporating the ACGME NAS EM Milestones.
6. Evaluate the role of simulation in the assessment of professionalism, especially considering the importance of context in professional behaviors.
   A. Explore which, if any, aspects of professionalism are best suited to evaluation with simulation.
7. Explore the format in which critical professionalism lapses should best be reported.
   A. Evaluate the utility of existing critical incident reporting tools.
   B. Determine the role of these reports in formative and summative assessment.

References

1. ACGME. Common Program Requirements. Available at: http://www.acgme.org/acgmeweb/Portals/0/dh_dutyhoursCommonPR07012007.pdf. Accessed Nov 9, 2012.
2. ACGME. ACGME Competencies. Educational Program. Available at: http://www.acgme.org/acgmeweb/Portals/0/PDFs/commonguide/IVA5e_EducationalProgram_ACGMECompetencies_Professionalism_Documentation.pdf. Accessed Nov 9, 2012.
3. RCPSC. The CanMEDS 2005 Physician Competency Framework. Available at: http://www.ub.edu/medicina_unitateducaciomedica/documentos/CanMeds.pdf. Accessed Sep 9, 2012.
4. Hobgood C, Promes S, Wang E, et al. Outcome assessment in emergency medicine–a beginning: results of the Council of Emergency Medicine Residency Directors (CORD) emergency medicine consensus workgroup on outcome assessment. Acad Emerg Med. 2008; 15:267–77.
5. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system–rationale and benefits. N Engl J Med. 2012; 366:1051–6.
6. ACGME. Emergency Medicine Milestones. Available at: http://www.acgme.org/acgmeweb/Portals/0/PDFs/EMMilestonesMeeting4_Final1092012.pdf. Accessed Nov 8, 2012.
7. Lynch DC, Surdyk PM, Eiser AR. Assessing professionalism: a review of the literature. Med Teach. 2004; 26:366–73.
8. Veloski JJ, Fields SK, Boex JR, Blank LL. Measuring professionalism: a review of studies with instruments reported in the literature between 1982 and 2002. Acad Med. 2005; 80:366–70.
9. Christie RJ, Hoffmaster CB, Stewart MA. Ethical decision making by Canadian family physicians. CMAJ. 1987; 137:891–7.
10. Patterson F, Baron H, Carr V, Plint S, Lane P. Evaluation of three short-listing methodologies for selection into postgraduate training in general practice. Med Educ. 2009; 43:50–7.
11. Rezler AG, Schwartz RL, Obenshain SS, Lambert P, Gibson JM, Bennahum DA. Assessment of ethical decisions and values. Med Educ. 1992; 26:7–16.
12. Sulmasy DP, Dwyer M, Marx E. Knowledge, confidence, and attitudes regarding medical ethics: how do faculty and housestaff compare? Acad Med. 1995; 70:1038–40.
13. Baldwin DC. The assessment of moral reasoning and professionalism in medical education and practice. In: Stern DT (ed.). Measuring Medical Professionalism. New York, NY: Oxford University Press, 2006.
14. Rest JR, Narvaez D, Thoma SJ, Bebeau MJ. DIT2: devising and testing a revised instrument of moral judgment. J Educ Psychol. 1999; 91:644–59.
15. Tiffin PA, Finn GM, McLachlan JC. Evaluating professionalism in medical undergraduates using selected response questions: findings from an item response modelling study. BMC Med Educ. 2011; 11:43.
16. Baldwin DC Jr, Bunch WH. Moral reasoning, professionalism, and the teaching of ethics to orthopaedic surgeons. Clin Orthop Relat Res. 2000; 378:97–103.
17. Ginsburg S, Regehr G, Mylopoulos M. From behaviours to attributions: further concerns regarding the evaluation of professionalism. Med Educ. 2009; 43:414–25.
18. Cruess R, McIlroy JH, Cruess S, Ginsburg S, Steinert Y. The professionalism mini-evaluation exercise: a preliminary investigation. Acad Med. 2006; 10(Suppl):S74–8.
19. Shayne P, Gallahue F, Rinnert S, et al. Reliability of a core competency checklist assessment in the emergency department: the Standardized Direct Observation Assessment Tool. Acad Emerg Med. 2006; 13:727–32.
20. LaMantia J, Kane B, Yarris L, et al. Real-time interrater reliability of the Council of Emergency Medicine Residency Directors Standardized Direct Observation Assessment Tool. Acad Emerg Med. 2009; 16(Suppl 2):S51–7.
21. Sullivan C, Murano T, Comes J, Smith JL, Katz ED. Emergency medicine directors' perceptions on professionalism: a Council of Emergency Medicine Residency Directors survey. Acad Emerg Med. 2011; 18(Suppl 2):S97–103.
22. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med. 2009; 84:301–9.
23. Reisdorff EJ, Carlson DJ, Reeves M, Walker G, Hayes OW, Reynolds B. Quantitative validation of a general competency composite assessment evaluation. Acad Emerg Med. 2004; 11:881–4.
24. Rodgers KG, Manifold C. 360-degree feedback: possibilities for assessment of the ACGME core competencies for emergency medicine residents. Acad Emerg Med. 2002; 9:1300–4.
25. Lockyer J. Multisource feedback in the assessment of physician competencies. J Contin Educ Health Prof. 2003; 23:4–12.
26. Brinkman WB, Geraghty SR, Lanphear BP, et al. Effect of multisource feedback on resident communication skills and professionalism: a randomized controlled trial. Arch Pediatr Adolesc Med. 2007; 161:44–9.
27. Tsugawa Y, Ohbu S, Cruess S, et al. Introducing the professionalism mini-evaluation exercise (P-MEX) in Japan: results from a multicenter, cross-sectional study. Acad Med. 2011; 86:1026–31.
28. Crow S. 360-degree Professionalism Assessment Instrument. Available at: www.mededportal.org/publication/236. Accessed Sep 9, 2012.
29. National Board of Medical Examiners. Assessment of Professional Behaviors. Available at: http://www.nbme.org/Schools/OA/APB/elements.html. Accessed Nov 9, 2012.
30. Richmond M, Canavan C, Holtman MC, Katsufrakis PJ. Feasibility of implementing a standardized multisource feedback program in the graduate medical education environment. J Grad Med Educ. 2011; 3:511–6.
31. Lockyer JM, Violato C, Fidler H. The assessment of emergency physicians by a regulatory authority. Acad Emerg Med. 2006; 13:1296–303.
32. Garra G, Wackett A, Thode H Jr. Feasibility and reliability of a multisource feedback tool for emergency medicine residents. J Grad Med Educ. 2011; 3:356–9.
33. Swing SR. Assessing the ACGME general competencies: general considerations and assessment methods. Acad Emerg Med. 2002; 9:1278–88.
34. Archer J, McGraw M, Davies H. Assuring validity of multisource feedback in a national programme. Arch Dis Child. 2010; 95:330–5.
35. Mazor KM, Canavan C, Farrell M, Margolis MJ, Clauser BE. Collecting validity evidence for an assessment of professionalism: findings from think-aloud interviews. Acad Med. 2008; 10(Suppl):S9–12.
36. Papadakis MA, Loeser H. Using critical incident reports and longitudinal observations to assess professionalism. In: Stern DT (ed.). Measuring Medical Professionalism. New York, NY: Oxford University Press, 2006.
37. Hemmer PA, Hawkins R, Jackson JL, Pangaro LN. Assessing how well three evaluation methods detect deficiencies in medical students' professionalism in two settings of an internal medicine clerkship. Acad Med. 2000; 75:167–73.
38. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004; 79:244–9.
39. Markakis KM, Beckman HB, Suchman AL, Frankel RM. The path to professionalism: cultivating humanistic values and attitudes in residency training. Acad Med. 2000; 75:141–50.
40. Ginsburg S, Lingard L, Regehr G, Underwood K. Know when to rock the boat: how faculty rationalize students' behaviors. J Gen Intern Med. 2008; 23:942–7.
41. O'Sullivan P, Greene C. Portfolios: possibilities for addressing emergency medicine resident competencies. Acad Emerg Med. 2002; 9:1305–9.
42. O'Sullivan PS, Reckase MD, McClain T, Savidge MA, Clardy JA. Demonstration of portfolios to assess competency of residents. Adv Health Sci Educ Theory Pract. 2004; 9:309–23.
43. Tochel C, et al. The effectiveness of portfolios for postgraduate assessment and education: BEME Guide No 12. Med Teach. 2009; 31:299–318.
44. Gordon J. Assessing students' personal and professional development using portfolios and interviews. Med Educ. 2003; 37:335–40.
45. Santen SA, Hemphill RR. A window on professionalism in the emergency department through medical student narratives. Ann Emerg Med. 2011; 58:288–94.
46. Sklar DP, Doezema D, McLaughlin S, Helitzer D. Teaching communications and professionalism through writing and humanities: reflections of ten years of experience. Acad Emerg Med. 2002; 9:1360–4.
47. Callahan E, Marcdante K. AED: Altruism Excellence and Duty: A Professionalism Module for Emergency Medicine. Available at: www.mededportal.org/publication/8145. Accessed Sep 23, 2012.
48. Baernstein A, Fryer-Edwards K. Promoting reflection on professionalism: a comparison trial of educational interventions for medical students. Acad Med. 2003; 78:742–7.
49. Wilkinson TJ, Wade WB, Knock LD. A blueprint to assess professionalism: results of a systematic review. Acad Med. 2009; 84:551–8.
50. Zabar S, Ark T, Gillespie C, et al. Can unannounced standardized patients assess professionalism and communication skills in the emergency department? Acad Emerg Med. 2009; 16:915–8.
51. Singer PA, Robb A, Cohen R, Norman G, Turnbill J. Performance-based assessment of clinical ethics using an objective structured clinical examination. Acad Med. 1996; 71:495–8.
52. Larkin GL. Evaluating professionalism in emergency medicine: clinical ethical competence. Acad Emerg Med. 1999; 6:302–11.
53. Larkin GL, Binder L, Houry D, Adams J. Defining and evaluating professionalism: a core competency for graduate emergency medicine education. Acad Emerg Med. 2002; 9:1249–56.
54. Chipman JG, Webb TP, Shabahang M, et al. A multi-institutional study of the Family Conference objective structured clinical exam: a reliable assessment of professional communication. Am J Surg. 2011; 201:492–7.
55. Gisondi MA, Smith-Coggins R, Harter PM, Soltysik RC, Yarnold PR. Assessment of resident professionalism using high-fidelity simulation of ethical dilemmas. Acad Emerg Med. 2004; 11:931–7.
56. Wallenstein J, Heron S, Santen S, Shayne P, Ander D. A core competency-based objective structured clinical examination (OSCE) can predict future resident performance. Acad Emerg Med. 2010; 17(Suppl 2):S67–71.
57. Mazor KM, Zanetti ML, Alper EJ, et al. Assessing professionalism in the context of an objective structured clinical examination: an in-depth study of the rating process. Med Educ. 2007; 41:331–40.
58. Van Mook WN, Gorter SL, O'Sullivan H, Wass V, Schuwirth LW, van der Vleuten CP. Approaches to professional behaviour assessment: tools in the professionalism toolbox. Eur J Intern Med. 2009; 20:e153–7.
59. Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996; 1:41–67.
60. Hodges BD, Ginsburg S, Cruess R, et al. Assessment of professionalism: recommendations from the Ottawa 2010 Conference. Med Teach. 2011; 33:354–63.
61. Arnold L. Assessing professional behavior: yesterday, today, and tomorrow. Acad Med. 2002; 77:502–15.
62. Stern DT, Frohna AZ, Gruppen LD. The prediction of professional behaviour. Med Educ. 2005; 39:75–82.
63. Cohen L, Manion L, Morrison K. Research Methods in Education. 6th ed. New York, NY: Routledge, 2007.
Supporting Information

The following supporting information is available in the online version of this paper:

Data S1. Next Accreditation System (NAS)5 Milestones: Professionalism. The document is in PDF format.

Please note: Wiley Periodicals Inc. is not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.