BREAKOUT SESSION

Assessing Systems-based Practice

Esther H. Chen, MD, Patricia S. O’Sullivan, EdD, Camiron L. Pfennig, MD, Katrina Leone, MD, and Chad S. Kessler, MD, MHPE

Abstract The conceptual definition of systems-based practice (SBP) does not easily translate into directly observable actions or behaviors that can be easily assessed. At the Academic Emergency Medicine consensus conference on education research in emergency medicine (EM), a breakout group presented a review of the literature on existing assessment tools for SBP, discussed the recommendations for research tool development during breakout sessions, and developed a research agenda based on this discussion. ACADEMIC EMERGENCY MEDICINE 19:1366–1371 © 2012 by the Society for Academic Emergency Medicine

The inclusion of systems-based practice (SBP) as one of the Accreditation Council for Graduate Medical Education (ACGME) core competencies highlights the importance of the health care system to a physician’s ability to provide competent and effective patient care.1 Physicians must be able to collaborate with other members of the health care team, consider costs when weighing risks and benefits, improve system performance by identifying system errors and implementing potential solutions, and continue to advocate for quality patient care. This set of skills is also described in the CanMEDS Physician Competency Framework as elements of the physician roles of collaborator, manager, and health advocate.2

From the Department of Emergency Medicine (EHC) and the Office of Medical Education (PSO), University of California San Francisco/San Francisco General Hospital, San Francisco, CA; the Department of Emergency Medicine, Vanderbilt University Medical Center (CLP), Nashville, TN; the Department of Emergency Medicine, Oregon Health and Science University (KL), Portland, OR; and the Department of Emergency Medicine, Jesse Brown VA Hospital (CSK), Chicago, IL. The list of breakout session participants can be found as the appendix of a related article on page 1486. Received June 28, 2012; accepted June 28, 2012. This paper reports on a workshop session of the 2012 Academic Emergency Medicine consensus conference, “Education Research in Emergency Medicine: Opportunities, Challenges, and Strategies for Success,” May 9, 2012, Chicago, IL. The authors have no relevant financial information or potential conflicts of interest to disclose. Supervising Editor: Terry Kowalenko, MD. Address for correspondence and reprints: Esther H. Chen, MD; e-mail: [email protected].


Despite a recent shift in the ACGME assessment system to the Next Accreditation System, the framework for assessing milestones within the competencies still requires specific tools to assess SBP.3 Although the conceptual definition of SBP does not easily translate into observable actions or behaviors, the Council of Emergency Medicine Residency Directors (CORD) was able to develop emergency medicine (EM)-specific evaluation domains for SBP.4 These included specific actions and behaviors that are directly observable and that facilitate real-time and summative feedback. However, this set of observable behaviors did overlook some pertinent areas, such as situational awareness and participation in systems improvement. Similarly, by using the CanMEDS roles of collaborator, manager, and health advocate, Graham et al. also developed a comprehensive list of observable actions that readily translated into assessments.5

This article summarizes the authors’ review of the current assessment tools for the SBP competency used in EM and non-EM residencies, both within and outside the United States. As a result of several small group discussions during the breakout session on the assessment of observable learner performance at the 2012 Academic Emergency Medicine (AEM) consensus conference on education research in EM, we developed a research plan for assessment tool development.

PROCESS AND CONSENSUS

We reviewed the literature on SBP assessment, searching the Medline and PubMed databases by combining the terms “systems based” with “evaluat* OR assess*” and “competenc*,” as well as combining “simulation” or “portfolio” with “systems based practice,” yielding 156 and 309 references, respectively. An additional search was performed in the MedEdPORTAL database (https://www.mededportal.org) with the terms “systems based practice” and “assessment,” which resulted in 47 resources. The bibliographies of all relevant articles were reviewed for additional citations.
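For readers who wish to update or reproduce the PubMed portion of this search, the strategy above can be approximated programmatically. The sketch below is purely illustrative and assumes Biopython’s Entrez utilities and a placeholder contact address; the original searches were run through the standard web interfaces around the time of the conference, so current hit counts will differ, and MedEdPORTAL offers no comparable programmatic interface.

# Minimal sketch (illustrative only): reproduce the PubMed queries described above.
from Bio import Entrez  # Biopython is assumed to be installed

Entrez.email = "researcher@example.org"  # placeholder; NCBI requires a contact address

# Query strings approximating the search terms reported in the text.
queries = {
    "systems based + evaluat*/assess* + competenc*":
        '"systems based" AND (evaluat* OR assess*) AND competenc*',
    "simulation/portfolio + systems based practice":
        '(simulation OR portfolio) AND "systems based practice"',
}

for label, term in queries.items():
    handle = Entrez.esearch(db="pubmed", term=term, retmax=0)  # counts only
    result = Entrez.read(handle)
    handle.close()
    print(label, "->", result["Count"], "citations")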

The most prevalent studies in the current education literature describe innovative approaches to teaching SBP and assessment of the learners’ knowledge based on those educational interventions. Some examples include resident involvement in quality improvement projects, implementation of an SBP curriculum (e.g., managed care, root cause analysis, the economic and business elements of the health care system), and resident participation in a patient panel conference where patients discuss their experiences with the health care system.6–10 Other educational innovations included round-table discussions about nonclinical skills (e.g., prioritization, efficiency, accountability), an interspecialty airway course, and a Health Advocacy Day that showcases community health advocates.11 There are fewer studies of competency assessments that either reflect a learner’s ability and skill or measure actual patient outcomes. These assessment instruments are described in the following sections.

Situational Awareness and Teamwork Scoring Systems

Undergraduate and graduate medical educators frequently use low- and high-fidelity mannequins or standardized patients to assess teamwork and communication skills.12–14 This format incorporates direct observation of the learner by faculty using standardized checklists with scenario-specific SBP competency criteria or standardized instruments (e.g., the Behaviorally Anchored Rating Scale, the Teamwork and Patient Safety Attitudes Questionnaire, the Perceived Stress Scale, the Anaesthetists’ Non-Technical Skills System, and the Ottawa Crisis Resource Management Global Rating Scale), followed by debriefing and feedback.12–18 One study showed that simulated teamwork training as an educational strategy improved the quality of team behavior (as observed in the actual clinical setting) when compared with simply working together as a team in the clinical area without the simulation.12 Otherwise, there is little evidence that simulation training can directly improve a resident’s clinical performance.

Direct Observation Assessment Tools

Perhaps the best representation of resident behavior is direct observation of residents providing clinical care in their practice environment, known as workplace-based assessment. Some educators argue that this is the ideal method of assessing competency because it provides the context of professional practice rather than a simulated or standardized encounter.19,20 A systematic review of direct observation assessment tools showed that the mini-clinical evaluation exercise (mini-CEX) had the strongest validity evidence of all the instruments.21,22 A widely used instrument in EM is the Standardized Direct Observation Assessment Tool (SDOT), which provides an objective, efficient way to assess learners in the clinical environment.23,24 Although this tool has good inter-rater reliability when used by faculty from different institutions, validity evidence is lacking.24,25

Direct observation of residents in simulated environments can also be used to assess behavior. Simulated cases using standardized patients, formally called the objective structured clinical examination (OSCE), can create authentic interactions for the learner. Performance in OSCEs has been shown to correlate with future clinical performance in medical students.26 A study of first-year EM residents, however, showed no correlation between the SBP competency scores on the SDOT for five OSCE cases and the cumulative composite SBP score on the ACGME global resident competency form completed by faculty for each resident during the EM rotation over 18 months.27,28 However, the low reliability of global evaluations, rather than a failure of the OSCE to measure SBP, may account for this finding. Other studies using OSCEs have demonstrated reliable measurement of clinical skills in all the ACGME general competencies and the CanMEDS physician competencies.29,30 An internal medicine program developed a 12-station Objective Structured System-Interaction Examination (OSSIE), with situations involving patient handoffs, complicated discharges, consultations, cost-effective diagnostics, evidence-based health promotion, and informed consent.31 This work provides more evidence that simulation may be a valid approach to assessing SBP.32 While it is difficult to draw definitive conclusions from these studies, they do highlight the need for more research on the efficacy of using simulation, considering the time and personnel resources involved, in assessing clinical performance.

Ratings and Survey-based Instruments

Multisource feedback and ratings of residents offer different perspectives into a learner’s interactions with “the village” (the integral players in the health care system) and theoretically can provide a more comprehensive picture of resident performance.33 In one internal medicine program, nursing evaluations of residents, patient surveys, chart self-audits, and program director evaluation of learning portfolios were all used to assess SBP competency. On self-assessment, residents reported an improvement in their ability to access and utilize resources, providers, and systems to provide optimal patient care.33 While they do not solely assess the SBP competency, global rating scales are widely used to evaluate EM residents and therefore are included here. Eight EM residency programs participated in a study to develop, implement, and obtain evidence to support the validity of a global rating tool.34,35 This instrument had acceptable reliability statistics and demonstrated the progressive acquisition of the SBP competency across three years of residency training.

Quality Improvement Projects

Several medical disciplines teach and assess SBP through development of and participation in quality improvement projects.6,36–39

Some studies use objective quality measures or process outcomes for competency assessment, such as a decrease in laboratory fees for pediatric inpatients achieved by altering the ordering system or the development of clinical guidelines to improve clinical care in an intensive care unit.6,36 Others use resident self-assessment or a faculty member’s global assessment of a group’s performance, rather than an individual’s performance.37–39

One particular study is worth highlighting for its innovative assessment process. Although not an SBP-specific evaluation tool, the health care matrix is a diagnostic tool that incorporates all of the ACGME competencies and the Institute of Medicine aims for improvement.40 In this study, each internal medicine resident used the matrix to analyze the care provided for one patient and presented this self-analysis to peers, who provided feedback. The matrix assists the resident in identifying the systems issues that prevented the patient from receiving optimal care and in generating ideas for quality improvement projects. It provides an analytical framework that enables the resident to critically reflect on his or her own performance using the competency language, while also identifying ways to improve care based on evidence. A critical step in this process is a faculty facilitator or “expert” who understands the case and the systems issues and can provide formative feedback on the work.

Portfolio

The portfolio can be used for formative and summative assessment of the competencies, although its reliability as an evaluation tool improves with regular feedback from a mentor and well-designed entries that address specific competencies and encourage reflection.41,42 Because portfolios are based on actual work, they are considered to be a more accurate representation of learner performance, and therefore a more valid measure of the competencies, than other assessment tools. Following several proposed suggestions for implementation into the EM curriculum, the EM literature currently reports individual resident and faculty reflections on cases that address systems issues (e.g., multicasualty incident, resource utilization, and medical error).43–45 Achievement of the competency is illustrated by evidence of self-assessment (inherent in the reflection) as well as by the subjective assessment from a faculty member who provides feedback on the entry.

Other programs have coupled portfolio entries with a particular activity that focuses on systems issues. For example, after presenting in morbidity and mortality conference, surgery residents must also complete a form in their portfolio that describes the factors that contributed to the complication and/or mortality, the opportunities for systems improvement, and the specific plan for improvement.46 The resident then receives feedback during the group discussion and from the surgery residency director, who regularly reviews the portfolio entries. In a psychiatry program where portfolios were well integrated into the resident evaluation system, the residents were required to submit their best work demonstrating acquisition of 13 essential psychiatric skills, 10 of which met the SBP competency.47 Two faculty members evaluated each portfolio and assessed whether the residents satisfied the competency.

Portfolios have also been linked to an active learning experience in which a resident assumes the role of a parent faced with complex life situations while the resident’s colleagues (acting as the physicians) prioritize the problems and access community resources to address them.48 The resident then documents the scenario experiences in his or her portfolio to demonstrate competency.

Finally, the portfolio can be used to track a resident’s progress on a quality improvement project and provide a summative assessment of learner performance. At one institution, a multidisciplinary group of radiology and EM faculty and residents worked together to improve the efficiency of the radiology process, from the ordering of a diagnostic study to the receipt of the final reading.49 A needs assessment was performed, cases were discussed jointly during conferences, and suggestions for improvement were made. The outcomes achieved by this project included systems changes to improve communication between the two departments, decreased patient transport time, and improved understanding of the difficulties of providing clinical services in both departments. Participation in this problem-solving project was documented by self-reflection entries in the resident portfolio, which was regularly reviewed by the program directors. This “plan-do-study-act” approach to systems improvement taught SBP concepts, improved the system with measurable clinical outcomes, and reflected a learner’s performance.50

Although we have described several assessment methods for SBP, if we use the hierarchy of competencies described by Miller51 as the conceptual framework by which we evaluate these instruments, we find several limitations. First and foremost, many SBP studies focus on effective instructional methods or program environments rather than on determining the ability of an individual learner.33,46,48,52 We need to link assessment to the skills of individual learners when performing or applying the knowledge in clinical practice, which is the highest level of competency in Miller’s schema. This perspective supports the use of workplace-based assessments. Therefore, a good assessment process (Figure 1) should be context-specific (i.e., using a sample of cases rather than a single case); require sophisticated judgment by physicians to account for context rather than checklists; occur in the real workplace rather than a standardized, controlled environment; and use a rating scale with constructs of developing clinical sophistication and independence rather than conventional gradations of performance (e.g., unsatisfactory to superior, below expectations to above expectations).53 More research on developing scales using anchors with observable behaviors to demonstrate progressive clinical independence is necessary for this assessment process. This measurement and achievement of developing behaviors, or milestones, is now a key component of the ACGME’s Next Accreditation System.3

A second limitation of the current evaluation instruments is that only a few have shown that their scores are valid in EM. Data obtained from tools that have not undergone rigorous testing may not actually measure the constructs of interest.23,54 We are only assuming that instruments that have been tested elsewhere may be generalizable to the EM residency setting.


Figure 1. The assessment of competency in medical education. The arrow indicates the goal of a good assessment process, which should move toward the highest level of competency, as assessed by physician judgment based on direct observation of the learner in the actual workplace.

Finally, some educators believe that each competency should not be assessed independently of the other competencies, because these behaviors cannot be isolated and must be evaluated in context with interpersonal and communication skills and practice-based learning and improvement.4,55 For example, residents who are good at managing a team during resuscitations of acutely ill patients are good communicators, demonstrate good patient care and problem-solving skills, and understand how to use resources effectively. However, it is unclear whether a more global view adequately evaluates all the content domains of the SBP competency.

RECOMMENDATIONS

1. Situational awareness and teamwork scoring systems: Currently, only a few EM studies of simulation use rating systems other than checklists of critical actions to assess resident performance.
Proposed research agenda:
A. Collect evidence for validity of the various teamwork and situational awareness rating scales using simulation in EM education.
B. Demonstrate an association between performance in simulation and patient care.

2. Direct observation assessment tools: Of the many available direct observation assessment tools, the SDOT is the most widely used in EM.
Proposed research agenda:
A. Refine the current direct observation assessment tools (e.g., mini-CEX, SDOT) using scales with progressively developing observable behaviors or milestones.
B. Determine evidence of validity for these assessments, including the optimal number of observations and association with other performance outcomes.

3. Ratings and survey-based instruments: Currently, one global rating scale has validity evidence for the assessment of EM resident performance.35 Patient surveys and nursing surveys used to provide feedback on resident behavior have not been rigorously tested.
Proposed research agenda:
A. Develop forms to be used by multiple evaluators (multisource feedback) to measure EM resident performance in the SBP domains using scales with progressively developing observable behaviors or milestones.
B. Determine evidence of validity for the effect of multisource feedback on resident performance.

4. Quality improvement projects: Currently, self-assessment, global assessment by faculty, and achievement of the clinical outcome for quality improvement projects are the tools used to assess resident competency.
Proposed research agenda:
A. Develop structured self-assessment tools that describe residents with different engagement, abilities, and skills in quality improvement.
B. Develop tools for assessing quality improvement projects (e.g., a checklist for chart review) using objective quality measures for SBP competency.

5. Portfolio: Portfolios are learner-generated documentation of residency-specific activities that reflect specific aspects of the ACGME competencies.
Proposed research agenda:
A. Develop self-assessment tools that encourage informed self-assessment and reflection on specific SBP domains and behaviors.56
B. Develop structured scoring rubrics for faculty to use in assessing portfolio reflections.

SUMMARY

Systems-based practice is a complex concept that highlights the importance of the health care system to a physician’s ability to provide good patient care and requires the physician to be a good manager, collaborator, and patient advocate. While it may be challenging to assess SBP independently of the other competencies, a comprehensive assessment should be multimodal and include direct observation by expert clinicians in the actual workplace.

References

1. ACGME. Common Program Requirements. Available at: http://www.acgme.org/acgmeweb/tabid/83/ProgramandInstitutionalGuidelines.aspx. Accessed Oct 31, 2012.
2. Frank J. The CanMEDS 2005 Physician Competency Framework. Better standards. Better physicians. Better care. Available at: http://medical-imaging.utoronto.ca/Assets/Medical+Imaging+Digital+Assets/resident/manual/canmeds/tbl.pdf. Accessed Sep 19, 2012.


3. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system–rationale and benefits. N Engl J Med. 2012; 366:1051–6.
4. Wang EE, Dyne PL, Du H. Systems-based practice: summary of the 2010 Council of Emergency Medicine Residency Directors Academic Assembly Consensus Workgroup–teaching and evaluating the difficult-to-teach competencies. Acad Emerg Med. 2011; 18(Suppl 2):S110–20.
5. Graham MJ, Naqvi Z, Encandela JA, et al. What indicates competency in systems based practice? An analysis of perspective consistency among healthcare team members. Adv Health Sci Educ Theory Pract. 2009; 14:187–203.
6. Marinaro J, Tawil I, Nelson MT. Resident guideline development to standardize intensive care unit care delivery: a competency-based educational method. J Surg Educ. 2008; 65:109–11.
7. David RA, Reich LM. The creation and evaluation of a systems-based practice/managed care curriculum in a primary care internal medicine residency program. Mt Sinai J Med. 2005; 72:296–9.
8. Peters AS, Kimura J, Ladden MD, March E, Moore GT. A self-instructional model to teach systems-based practice and practice-based learning and improvement. J Gen Intern Med. 2008; 23:931–6.
9. Turley CB, Roach R, Marx M. Systems survivor: a program for house staff in systems-based practice. Teach Learn Med. 2007; 19:128–38.
10. Colbert CY, Mirkes C, Cable CT, Sibbitt SJ, VanZyl GO, Ogden PE. The patient panel conference experience: what patients can teach our residents about competency issues. Acad Med. 2009; 84:1833–9.
11. Royal College of Physicians and Surgeons in Canada. International Conference on Residency Education. Summary. Ottawa, Canada, September 23–25, 2010.
12. Shapiro MJ, Morey JC, Small SD, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? Qual Saf Health Care. 2004; 13:417–21.
13. Wang EE, Vozenilek JA. Addressing the systems-based practice core competency: a simulation-based curriculum. Acad Emerg Med. 2005; 12:1191–4.
14. Opar SP, Short MW, Jorgensen JE, Blankenship RB, Roth BJ. Acute coronary syndrome and cardiac arrest: using simulation to assess resident performance and program outcomes. J Grad Med Educ. 2010; 2:404–9.
15. Kaissi A, Johnson T, Kirschbaum MS. Measuring teamwork and patient safety attitudes of high-risk areas. Nurs Econ. 2003; 21:211–8.
16. Cohen S, Kamarck T, Mermelstein R. A global measure of perceived stress. J Health Soc Behav. 1983; 24:385–96.
17. Fletcher G, Flin R, McGeorge P, Glavin R, Maran N, Patey R. Anaesthetists’ Non-Technical Skills (ANTS): evaluation of a behavioural marker system. Br J Anaesth. 2003; 90:580–8.

18. Kim J, Neilipovitz D, Cardinal P, Chiu M, Clinch J. A pilot study using high-fidelity simulation to formally evaluate performance in the resuscitation of critically ill patients: the University of Ottawa Critical Care Medicine, High-Fidelity Simulation, and Crisis Resource Management I Study. Crit Care Med. 2006; 34:2167–74.
19. Govaerts MJ. Educational competencies or education for professional competence? Med Educ. 2008; 42:234–6.
20. Govaerts MJ, van der Vleuten CP, Schuwirth LW, Muijtjens AM. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ Theory Pract. 2007; 12:239–60.
21. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003; 138:476–81.
22. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009; 302:1316–26.
23. Shayne P, Heilpern K, Ander D, Palmer-Smith V. Protected clinical teaching time and a bedside clinical evaluation instrument in an emergency medicine training program. Acad Emerg Med. 2002; 9:1342–9.
24. Shayne P, Gallahue F, Rinnert S, Anderson CL, Hern G, Katz E. Reliability of a core competency checklist assessment in the emergency department: the Standardized Direct Observation Assessment Tool. Acad Emerg Med. 2006; 13:727–32.
25. LaMantia J, Kane B, Yarris L, et al. Real-time inter-rater reliability of the Council of Emergency Medicine Residency Directors Standardized Direct Observation Assessment Tool. Acad Emerg Med. 2009; 16(Suppl 2):S51–7.
26. Probert CS, Cahill DJ, McCann GL, Ben-Shlomo Y. Traditional finals and OSCEs in predicting consultant and self-reported clinical skills of PRHOs: a pilot study. Med Educ. 2003; 37:597–602.
27. ACGME. Global Resident Competency Rating Form. Available at: www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/999_GlobalResidencyCompetencyForm.pdf. Accessed Oct 31, 2012.
28. Wallenstein J, Heron S, Santen S, Shayne P, Ander D. A core competency-based objective structured clinical examination (OSCE) can predict future resident performance. Acad Emerg Med. 2010; 17(Suppl 2):S67–71.
29. Short MW, Jorgensen JE, Edwards JA, Blankenship RB, Roth BJ. Assessing intern core competencies with an objective structured clinical examination. J Grad Med Educ. 2009; 1:30–6.
30. Jefferies A, Simmons B, Tabak D, et al. Using an objective structured clinical examination (OSCE) to assess multiple physician competencies in postgraduate training. Med Teach. 2007; 29:183–91.
31. Hingle S, Rosher RB, Robinson S, McCann-Stone N, Todd C, Clark M. Development of the objective structured system-interaction examination. J Grad Med Educ. 2009; 1:82–8.
32. Hingle ST, Robinson S, Colliver JA, Rosher RB, McCann-Stone N. Systems-based practice assessed with a performance-based examination simulated and scored by standardized participants in the health care system: feasibility and psychometric properties. Teach Learn Med. 2011; 23:148–54.
33. Ziegelstein RC, Fiebach NH. “The mirror” and “the village”: a new method for teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004; 79:83–8.
34. Reisdorff EJ, Hayes OW, Reynolds B, et al. General competencies are intrinsic to emergency medicine training: a multicenter study. Acad Emerg Med. 2003; 10:1049–53.
35. Reisdorff EJ, Carlson DJ, Reeves M, Walker G, Hayes OW, Reynolds B. Quantitative validation of a general competency composite assessment evaluation. Acad Emerg Med. 2004; 11:881–4.
36. Englander R, Agostinucci W, Zalneraiti E, Carraccio CL. Teaching residents systems-based practice through a hospital cost-reduction program: a “win-win” situation. Teach Learn Med. 2006; 18:150–2.
37. Voss JD, May NB, Schorling JB, et al. Changing conversations: teaching safety and quality in residency training. Acad Med. 2008; 83:1080–7.
38. Siri J, Reed AI, Flynn TC, Silver M, Behrns KE. A multidisciplinary systems-based practice learning experience and its impact on surgical residency education. J Surg Educ. 2007; 64:328–32.
39. Delphin E, Davidson M. Teaching and evaluating group competency in systems-based practice in anesthesiology. Anesth Analg. 2008; 106:1837–43.
40. Quinn DC, Bingham JW, Garriss GW, Dozier EA. Residents learn to improve care using the ACGME core competencies and Institute of Medicine aims for improvement: the health care matrix. J Grad Med Educ. 2009; 1:119–26.
41. O’Sullivan PS, Reckase MD, McClain T, Savidge MA, Clardy JA. Demonstration of portfolios to assess competency of residents. Adv Health Sci Educ Theory Pract. 2004; 9:309–23.
42. Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME Guide No 12. Med Teach. 2009; 31:299–318.
43. O’Sullivan P, Greene C. Portfolios: possibilities for addressing emergency medicine resident competencies. Acad Emerg Med. 2002; 9:1305–9.


44. Acciani J. Resident portfolio: breaking trust–a reflection on confidentiality and minors. Acad Emerg Med. 2006; 13:1339–40.
45. Chisholm CD, Croskerry P. A case study in medical error: the use of the portfolio entry. Acad Emerg Med. 2004; 11:388–92.
46. Rosenfeld JC. Using the morbidity and mortality conference to teach and assess the ACGME general competencies. Curr Surg. 2005; 62:664–9.
47. Jarvis RM, O’Sullivan PS, McClain T, Clardy JA. Can one portfolio measure the six ACGME general competencies? Acad Psychiatry. 2004; 28:190–6.
48. Zenni EA, Ravago L, Ewart C, Livingood W, Wood D, Goldhagen J. A walk in the patients’ shoes: a step toward competency development in systems-based practice. Ambul Pediatr. 2006; 6:54–7.
49. Panek RC, Deloney LA, Park J, Goodwin W, Klein S, Ferris EJ. Interdepartmental problem-solving as a method for teaching and learning systems-based practice. Acad Radiol. 2006; 13:1150–4.
50. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med. 1998; 128:651–6.
51. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990; 65(9 Suppl):S63–7.
52. Larkin AC, Cahan MA, Whalen G, et al. Human emotion and response in surgery (HEARS): a simulation-based curriculum for communication skills, systems-based practice, and professionalism in surgical residency training. J Am Coll Surg. 2010; 211:285–92.
53. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales. Med Educ. 2011; 45:560–9.
54. Tomolo A, Caron A, Perz ML, Fultz T, Aron DC. The outcomes card. Development of a systems-based practice educational tool. J Gen Intern Med. 2005; 20:769–71.
55. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med. 2009; 84:301–9.
56. Sargeant J, Eva KW, Armson H, et al. Features of assessment learners use to make informed self-assessments of clinical performance. Med Educ. 2011; 45:636–47.