Development and Initial Validation of a Project-Based Rubric to Assess the Systems-Based Practice Competency of Residents in the Clinical Chemistry Rotation of a Pathology Residency

Carolyn R. Vitek, MS; Jane C. Dale, MD; Henry A. Homburger, MD; Sandra C. Bryant, MS; Amy K. Saenger, PhD; Brad S. Karon, MD, PhD
Context.—Systems-based practice (SBP) is 1 of 6 core competencies required in all resident training programs accredited by the Accreditation Council for Graduate Medical Education. Reliable methods of assessing resident competency in SBP have not been described in the medical literature.

Objective.—To develop and validate an analytic grading rubric to assess pathology residents' analyses of SBP problems in clinical chemistry.

Design.—Residents were assigned an SBP project based upon unmet clinical needs in the clinical chemistry laboratories. Using an iterative method, we created an analytic grading rubric based on critical thinking principles. Four faculty raters used the SBP project evaluation rubric to independently grade 11 residents' projects during their clinical chemistry rotations. Interrater reliability and Cronbach α were calculated to determine the reliability and validity of the rubric. Project mean scores and ranges were also assessed to determine whether the rubric differentiated resident critical thinking skills related to the SBP projects.

Results.—Overall project scores ranged from 6.56 to 16.50 out of a possible 20 points. Cronbach α ranged from 0.91 to 0.96, indicating that the 4 rubric categories were internally consistent without significant overlap. Intraclass correlation coefficients ranged from 0.63 to 0.81, indicating moderate to strong interrater reliability.

Conclusions.—We report development and statistical analysis of a novel SBP project evaluation rubric. The results indicate the rubric can be used to reliably assess pathology residents' critical thinking skills in SBP.

(Arch Pathol Lab Med. 2014;138:809–813; doi: 10.5858/arpa.2013-0046-OA)
Accepted for publication July 10, 2013.
From the Center for Individualized Medicine (Ms Vitek), Emeritus Faculty (Drs Dale and Homburger), the Division of Biostatistics and Informatics (Ms Bryant), and the Department of Laboratory Medicine and Pathology (Drs Saenger and Karon), Mayo Clinic, Rochester, Minnesota.
The authors have no relevant financial interest in the products or companies described in this article.
Supplemental digital content is available for this article at www.archivesofpathology.org in the June 2014 table of contents.
Reprints: Brad S. Karon, MD, PhD, Department of Laboratory Medicine and Pathology, Mayo Clinic, 200 First St SW, Rochester, MN 55905 (e-mail: [email protected]).

The Outcome Project of the Accreditation Council for Graduate Medical Education (ACGME) defined 6 core competencies required by all physicians. Among these, systems-based practice (SBP) has been considered conceptually confusing and difficult to teach and assess.1–5 ACGME SBP common program requirements stipulate that residents must demonstrate an awareness of and responsiveness to the larger context and system of health care, as well as the ability to call effectively on other resources in the system to provide optimal health care.6 They are expected to work effectively in various health care delivery settings and systems relevant to their clinical specialty; coordinate patient care within the health care system relevant to their clinical specialty; incorporate considerations of cost awareness and risk-benefit analysis in patient and/or population-based care as appropriate; advocate for quality patient care and optimal patient care systems; work in interprofessional teams to enhance patient safety and improve patient care quality; and participate in identifying system errors and implementing potential systems solutions.

Implementation of the competencies and linking them to outcomes has been challenging.7 To address these and other challenges, the ACGME has instituted educational "Milestones" as part of the Next Accreditation System (NAS).8,9 Educational Milestones are developmentally based, specialty-specific achievements that demonstrate a resident's natural progression of professional development for each of the 6 competencies. Resident progression is measured and monitored at regular, established intervals throughout training. The success of this model depends upon the use of reliable and valid tools to evaluate resident progress. The lack of tools to assess pathology-specific SBP competencies poses a challenge to meeting Milestone and NAS requirements.

As part of their effort to address the challenge of maintenance of certification, the College of American Pathologists (CAP) commissioned a study10 to define
pathology-specific competencies within the 6 ACGME core competencies. Systems-based practice is principally concerned with understanding how a pathology practice integrates with other medical specialties in a system of integrated health care delivery. Using a combination of small group sessions, surveys, and solicited feedback from dedicated pathology educators and unselected practicing pathologists, the working group assembled a list of pathology-specific SBP competencies in 3 general categories:

1. Practice and system integration (ie, demonstrate awareness of interdependencies between pathology/laboratory medicine practices and the system; use performance indicators to improve delivery effectiveness; and apply leadership and management principles to effect change).

2. Medical practice and delivery systems (ie, evaluate cost effectiveness and resource allocation for different types of medical practice or delivery systems; evaluate the utility of new technology or analytes and assess the feasibility of their adoption; and identify and address various types of medical practice, delivery system, and patient safety deficiencies).

3. Practice economics (ie, control practice expenses, allocate resources, and manage work and demand; properly code and bill; apply contracting and negotiating skills; apply knowledge of health care trends; and use financial performance indicators in decisions).

A white paper11 published jointly by the CAP and the Association of Pathology Chairs in 2009 stressed clinical pathology consultation, laboratory medical direction, and laboratory management as areas of deficiency that needed to be addressed to better prepare pathologists for practice. Because elements of the role of the laboratory director (test selection, clinical consultation, interaction with the wider health care system) have been identified as crucial elements of pathology-specific SBP competencies, we sought to improve both teaching and assessment of pathology resident SBP competency as related to these duties.

The clinical chemistry curriculum of our 4-year combined anatomic and clinical pathology residency includes a mandatory SBP project assigned to each resident. The projects are based upon actual, recognized, and unmet clinical needs. Residents are asked to investigate the clinical need, define options for solving any unmet needs, and support one option as a best solution. In some instances, laboratory data (including estimates of sensitivity, specificity, and turnaround times) are gathered and analyzed, while in other instances the resident is asked to identify the data needed to perform the analysis. Each resident meets with a variety of stakeholders, including clinicians who have a need not currently met by the core or stat laboratories; laboratory technologists who have relevant data; and supervisors or managers who understand the regulatory, cost, and workflow implications of potential solutions.

To reduce subjectivity in the evaluation process and provide reliable and consistent feedback to residents, we designed and validated a new assessment rubric for SBP projects. Because many of the pathology-specific SBP competency skills identified by the CAP require critical thinking skills, we relied upon the elements of thought and intellectual standards of reasoning in critical thinking of Paul and Elder12 as the framework for our analytic rubric. In this article we document the validity and reliability of the SBP project evaluation rubric.
MATERIALS AND METHODS

Sample and Setting

Our combined anatomic and clinical pathology residency consists of 5 residents in each of 4 years, for a total of 20 residents. Residents complete the clinical chemistry rotation during their third year (1 or 2 residents rotate through clinical chemistry at a time) and are assigned an SBP project during their rotation. The project assignment consists of a 1-page synopsis that includes a brief background of the problem, key questions for the medical director, a list of resources to contact, and a description of expectations for the summary report (see Supplemental Digital Content A, project example [see supplemental material file at www.archivesofpathology.org in the June 2014 table of contents]).

The projects evaluated herein were assigned during the first week of the 3-month clinical chemistry rotation, while residents were rotating through the core and stat laboratories. Owing to the short duration of the core and stat laboratory rotation (3 weeks), appointments were made in advance with key stakeholders. Key stakeholders for each project included at least 1 clinician proponent bringing forth an unmet clinical need; 1 laboratory technologist or scientist with existing knowledge of or data relating to the assigned problem; and laboratory management personnel, such as supervisors or quality specialists, who understood the workflow, regulatory, or cost implications of potential solutions. Residents had the entire 3 months of the clinical chemistry rotation to complete the project and were encouraged to seek out and speak to additional health care and laboratory staff. In addition, residents were encouraged to use literature review; obtain cost, patient charge, or other information from management staff; and discuss regulatory issues with quality management staff within the department.

Examples of problems assigned as projects include standardization of reporting units for urinalysis tests performed at the point of care and at different laboratory sites, options to support stat lactate testing for an institutional sepsis initiative, and the practice implications of reporting A1C-derived average glucose with hemoglobin A1C results (see Supplemental Digital Content A, project example).
Instrument Development

The SBP rubric we developed is based on the elements of thought and intellectual standards of reasoning in critical thinking of Paul and Elder.12 The rubric includes the following elements: (1) definition of the question at issue and purpose of the project; (2) identification of key stakeholders and their operating assumptions; (3) elucidation of concepts, evidence, and information; and (4) presentation of conclusion(s) and implications of the recommended solution. To allow for more objective and granular scores for each category, each of the major categories is further subdivided into 3 subcomponents, as shown in Supplemental Digital Content B (see supplemental material file at www.archivesofpathology.org in the June 2014 table of contents). For instance, a rater could determine that a project response met criteria for a score of 4 on 2 of the subcomponents of "question at issue" but only 3 for the last subcomponent, resulting in a score of 3.7 for that category (see Supplemental Digital Content B).

For each of the 4 major categories evaluated (question at issue; key stakeholders and operating assumptions; concepts and information; and conclusions and implications), we developed a 5-point scale, using an iterative process with reviewers to establish standard definitions of what constituted performance at each of the levels from 1 to 5 (1 = unacceptable, 2 = marginal, 3 = proficient, 4 = good, 5 = exceptional) for each subcomponent of the 4 categories (Supplemental Digital Content B, rubric). We then developed a standard set of questions for each resident to answer as part of the SBP project (see Supplemental Digital Content A, sample project).

Four reviewers (J.C.D., H.A.H., A.K.S., B.S.K.), all practicing clinical pathologists or clinical chemists, were initially trained on the rubric, using 3 residents' projects and iterative discussion until agreement was reached. Two reviewers (A.K.S., B.S.K.) worked directly with residents during the chemistry rotation, while the other 2 reviewers (J.C.D., H.A.H.) are content experts but did not participate in residency training activities and thus did not know the individuals being evaluated. We then scored resident SBP projects for statistical analysis during a period of 3 years. Faculty raters reconvened periodically for retraining before scoring new projects, using previously evaluated SBP projects to calibrate scoring among reviewers.
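As an illustration of the scoring arithmetic described above, a minimal Python sketch follows. The subcomponent values shown and the one-decimal rounding convention are assumptions inferred from the 3.7 example; this is illustrative code, not the scoring tool used in the study.

```python
from statistics import mean

# Rubric structure: 4 categories, each with 3 subcomponents rated 1-5.
# (Subcomponent labels and values below are illustrative; the published
# rubric defines them in Supplemental Digital Content B.)

def category_score(subcomponent_ratings):
    """Category score = mean of the three 1-5 subcomponent ratings."""
    return round(mean(subcomponent_ratings), 1)

def total_score(ratings_by_category):
    """Overall project score = sum of the 4 category scores (maximum 20)."""
    return round(sum(category_score(r) for r in ratings_by_category.values()), 1)

# Example from the text: ratings of 4, 4, and 3 on the three subcomponents
# of "question at issue" yield a category score of 3.7.
example = {
    "question at issue and purpose":      [4, 4, 3],
    "stakeholders and assumptions":       [3, 3, 4],
    "concepts and information":           [4, 3, 3],
    "conclusions and implications":       [3, 4, 4],
}

for name, ratings in example.items():
    print(f"{name}: {category_score(ratings)}")
print(f"total (out of 20): {total_score(example)}")
```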
Evaluation Methods

Four reviewers independently evaluated 11 SBP projects. In one instance, 2 residents were assigned to investigate the same issue, though their projects were independently written and submitted. All reviewers evaluated all 4 categories for each project, yielding a total of 176 evaluable items (4 reviewers × 4 categories × 11 projects). In addition, the total score (sum of the 4 categories) was evaluated for interrater reliability. Cronbach α was used to measure internal consistency reliability for the 4 categories in the SBP rubric.13 Interrater reliability14 was assessed by the intraclass correlation coefficient (ICC). Analysis of variance (ANOVA) P values using an F statistic were calculated for each major category and for the total score to determine whether a significant difference existed between raters in the evaluation of either the major categories or the overall project scores. A P value <.05 would indicate that a statistically significant difference exists between individual raters for scoring either a major category or the overall project score. A power analysis demonstrated that, assuming a 2-sided ANOVA with a level of significance of .05 and a standard deviation among the reviewers of 3.89, a sample size of 10 projects was needed to detect a minimum difference of 5 points in the total score among the reviewers with 80% power. The study design was determined to be exempt from review by the Mayo Clinic Institutional Review Board.
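To make these reliability analyses concrete, the sketch below computes Cronbach α, a two-way random-effects single-rater ICC (Shrout and Fleiss14), and the between-rater ANOVA F test from a projects-by-raters score matrix. The data are simulated for illustration, and the specific ICC form is an assumption consistent with the Table 2 footnote; this is not the study's analysis code.

```python
import numpy as np
from scipy.stats import f as f_dist

def cronbach_alpha(items):
    """Cronbach alpha for an (observations x items) matrix; here the 4 rubric
    categories are the items and each (project, rater) pair is one observation."""
    items = np.asarray(items, dtype=float)
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed score
    k = items.shape[1]
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def icc_and_rater_anova(scores):
    """Two-way random-effects, single-rater ICC (Shrout & Fleiss ICC(2,1))
    plus the F test for a systematic rater (column) effect.
    `scores` is an (n projects x k raters) matrix with no missing cells."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # projects
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    ss_total = ((x - grand) ** 2).sum()
    ss_err = ss_total - ms_rows * (n - 1) - ms_cols * (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
    f_raters = ms_cols / ms_err
    p_raters = f_dist.sf(f_raters, k - 1, (n - 1) * (k - 1))
    return icc, f_raters, p_raters

# Illustrative data only: total scores (0-20) for 11 projects from 4 raters.
rng = np.random.default_rng(0)
true_quality = rng.uniform(7, 17, size=11)
total_scores = true_quality[:, None] + rng.normal(0, 1.0, size=(11, 4))

icc, f_val, p_val = icc_and_rater_anova(total_scores)
print(f"ICC(2,1) = {icc:.2f}, rater F = {f_val:.2f}, P = {p_val:.2f}")

# Cronbach alpha over 4 simulated category scores, one row per (project, rater) pair.
base = rng.uniform(1.5, 4.5, size=(44, 1))
category_scores = np.clip(base + rng.normal(0, 0.3, size=(44, 4)), 1, 5)
print(f"Cronbach alpha = {cronbach_alpha(category_scores):.2f}")
```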
RESULTS

Overall Scores

The total overall scores (sum of the 4 categories) given by the 4 raters ranged from 5 to 17 of a possible 20 points. A scatter plot of total scores by rater and project is shown in the Figure. Some projects were graded more consistently by raters (Figure, projects 4 and 9), while other projects showed wider variation between raters (Figure, projects 2, 10, and 11). The study was powered to detect a difference in total score between reviewers of 5 points or greater. Total scores assigned by all 4 reviewers were within 5 points of each other for all 11 projects (Figure).

Figure. Scatter plot of overall (total) score for 11 systems-based practice projects reviewed independently by 4 different raters. Individual rater scores are displayed for each of the 11 projects graded.

Differentiation Among Resident Projects

Category mean scores (averaged across all raters on a 5-point scale) ranged from 2.19 to 4.00 for question at issue and purpose; from 1.50 to 4.13 for stakeholders and assumptions; from 1.50 to 4.13 for concepts and information; and from 1.38 to 4.25 for conclusions and implications (Table 1). Average total scores across all raters for SBP projects ranged from 6.56 to 16.50 (Table 1). The rubric differentiated the performance of residents both within categories of critical thinking and for the overall project score.

Table 1. Resident Systems-Based Practice (SBP) Project Rubric Scores: Mean (Range) Category Scores and Mean (Range) Overall (Total) Score When 4 Raters Reviewed 11 SBP Projects(a)

| Resident Project | Question at Issue and Purpose, Mean (Min–Max) | Stakeholders and Assumptions, Mean (Min–Max) | Concepts and Information, Mean (Min–Max) | Conclusions and Implications, Mean (Min–Max) | Total, Mean (Min–Max) |
|---|---|---|---|---|---|
| 1 | 4.00 (3.70–4.30) | 3.55 (3.30–4.30) | 3.08 (2.30–4.00) | 3.35 (3.00–3.70) | 13.98 (12.30–16.30) |
| 2 | 3.43 (2.70–4.00) | 2.73 (2.70–3.00) | 2.50 (2.00–3.00) | 2.83 (2.30–3.30) | 11.48 (9.60–13.60) |
| 3 | 3.68 (3.30–4.00) | 3.58 (3.30–4.00) | 3.65 (3.00–4.30) | 3.60 (3.00–4.00) | 14.50 (13.00–16.00) |
| 4 | 3.48 (3.30–4.00) | 3.23 (3.00–3.30) | 3.50 (3.30–3.70) | 3.60 (3.30–3.70) | 13.80 (13.20–14.70) |
| 5 | 2.58 (2.30–3.00) | 2.18 (1.00–3.00) | 1.83 (1.00–2.30) | 2.00 (1.00–2.70) | 8.58 (6.00–9.70) |
| 6 | 3.65 (3.30–4.70) | 4.03 (3.70–4.70) | 3.65 (3.30–4.00) | 3.95 (3.70–4.00) | 15.28 (14.00–16.70) |
| 7 | 3.58 (3.30–4.00) | 3.35 (3.00–3.70) | 3.85 (3.70–4.00) | 4.10 (3.70–5.00) | 14.88 (13.70–16.40) |
| 8 | 3.13 (3.00–3.50) | 2.44 (2.00–3.00) | 2.25 (2.00–2.50) | 2.50 (2.00–3.00) | 10.31 (9.00–12.00) |
| 9 | 4.00 (4.00) | 4.13 (4.00–4.50) | 4.13 (4.00–4.50) | 4.25 (4.00–4.50) | 16.50 (16.00–17.00) |
| 10(b) | 2.19 (2.00–2.75) | 1.50 (1.00–2.00) | 1.50 (1.00–2.00) | 1.38 (1.00–2.00) | 6.56 (5.00–8.25) |
| 11(b) | 3.00 (2.50–3.50) | 2.38 (1.50–3.00) | 2.25 (2.00–3.00) | 2.13 (1.50–3.00) | 9.75 (8.00–12.00) |

(a) Min–Max is the minimum and maximum scores for projects scored in each category.
(b) Indicates that 2 residents were assigned the same project to investigate, though written projects were submitted and graded independently.
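For readers who wish to reproduce summaries like Table 1 from rater-level data, a minimal pandas sketch follows. The long-format layout, column names, and scores are hypothetical, not the study's dataset.

```python
import pandas as pd

# Hypothetical long-format rater data: one row per (project, rater, category).
raw = pd.DataFrame({
    "project":  [1, 1, 1, 1, 2, 2, 2, 2],
    "rater":    ["A", "B", "A", "B", "A", "B", "A", "B"],
    "category": ["question", "question", "conclusions", "conclusions",
                 "question", "question", "conclusions", "conclusions"],
    "score":    [4.0, 3.7, 3.3, 3.0, 2.7, 3.0, 2.3, 2.7],
})

# Per-project, per-category mean and range across raters (Table 1 body).
category_summary = (raw.groupby(["project", "category"])["score"]
                       .agg(["mean", "min", "max"])
                       .round(2))

# Per-project total score for each rater, then mean and range across raters
# (the Total column of Table 1).
totals = (raw.pivot_table(index=["project", "rater"], columns="category",
                          values="score", aggfunc="sum")
             .sum(axis=1)
             .groupby(level="project")
             .agg(["mean", "min", "max"]))

print(category_summary)
print(totals)
```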
Table 2. Systems-Based Practice Project Rubric: Cronbach α and Interrater Reliability Results

| Critical Thinking Categories | Cronbach α | ICC (95% CI) | ANOVA P Value |
|---|---|---|---|
| Question at issue and purpose | 0.96 | 0.63 (0.35–0.86) | 0.82 |
| Stakeholders and assumptions | 0.92 | 0.69 (0.43–0.89) | 0.75 |
| Concepts and information | 0.91 | 0.77 (0.56–0.92) | 0.82 |
| Conclusions and implications | 0.91 | 0.78 (0.57–0.93) | 0.77 |
| Total | | 0.81 (0.62–0.94) | 0.80 |
| Overall measure | 0.95 | | |

Abbreviations: ANOVA, analysis of variance; CI, confidence interval; ICC, intraclass correlation coefficient, assuming the 4 reviewers are a random set of 4 reviewers from a larger population of reviewers.
Reliability

Cronbach α was used to measure internal consistency for the 4 categories in the SBP rubric.13 The Cronbach α scores ranged from 0.91 to 0.96, with an overall value of 0.95 (Table 2). In general, Cronbach α scores above 0.7 are considered acceptable; scores in the 0.8 to 0.9 range are considered ideal and reflect internal consistency between questions without significant duplication or overlap. Interrater reliability14 was assessed by ICC and varied from 0.63 to 0.81, with an overall ICC of 0.81 (Table 2). Intraclass correlation coefficient scores between 0 and 0.2 indicate poor agreement; 0.3 to 0.4, fair agreement; 0.5 to 0.6, moderate agreement; 0.7 to 0.8, strong agreement; and scores above 0.8, almost perfect agreement. Intraclass correlation coefficient scores for each category and for the overall total score indicated moderate to strong agreement between raters. ANOVA P values varied from .75 to .82, indicating no statistically significant difference between raters for scoring either the major categories of the rubric or the overall (total) score.

COMMENT

Though it is an ACGME requirement15 to ensure that residents can demonstrate competency in SBP, multiple groups report that SBP is difficult to assess and that valid, reliable assessment tools are lacking.4,10,11,16 Even so, the ACGME Outcome Project stipulates that ". . . programs are expected to phase-in assessment tools that provide useful and increasingly valid, reliable evidence that residents achieve competency-based educational objectives,"6 which support the natural progression toward achieving educational milestones.9 The rubric offers an opportunity for objective assessment of SBP competency at a fixed point in time during resident training. Tools for continuous assessment of SBP competency over the course of the 4-year pathology residency will require further development.

The ACGME Toolbox of Assessment Methods suggests that 360° global ratings, objective structured clinical examinations, portfolios, multiple-choice examinations, and record review may be used to assess various aspects of SBP.17 Though several residency training programs describe using a project-based approach to teach SBP principles18–20 or evaluate quality improvement proposals,21 none have described the use of rubrics to evaluate residents' writing and thinking skills in their approach to solving SBP problems. Previously published evaluation methods to assess resident competency in SBP included surveys and questionnaires of self-reported improvement,22 observation using objective structured clinical examinations,23,24 simulated cases with examinations,25 Web-based tools,26–28 and
simulation.29 Two prior studies18,30 described resident projects, yet did not use validated instruments for assessment. None of these tools are easily amenable to objective evaluation of projects in which a resident is asked to address clinical practice problems as a laboratory director within the context of pathology training.

There were several benefits to assigning residents SBP projects in the clinical chemistry rotation. The process of investigation fostered dialogue with clinicians and exposed residents to representative, real-world problems that laboratory medical directors and pathologists face in practice. The residents also had the opportunity to explore one area of clinical chemistry testing in depth and to obtain some level of mastery of the testing techniques and principles in that area. Residents were also required to identify additional available resources (literature, content experts within the laboratory, external content experts) and had the opportunity to learn to reach out for help and information.

Rubrics are used widely in education as reliable, valid methods of performance assessment.31–33 A rubric is a set of clear criteria or expectations, often using descriptions, to develop a universal understanding of desired performance.33 In designing effective rubrics for teaching and learning, 5 elements are considered: essential elements or criteria, number of achievement levels, clear descriptions of performance, consequences of performance at each level, and rating scheme. Advantages of using analytic rubrics include the increased reliability of scores from multiple raters rendering independent judgments on the same item; increased learning and instruction; and clarified, enhanced expectations for both learners and educators. Rubrics have the added benefit of providing a structured approach to giving specific feedback to learners. Challenges of developing and using rubrics to assess written work are the time needed to develop a common understanding of the various skill levels of performance and the evaluator training necessary to obtain reliable scores.33 Once implemented, rubrics can streamline the grading and feedback process and provide more reliable assessments.

In this study we describe a process for assigning residents SBP projects based upon real problems facing laboratory directors. The residents are expected to interact with stakeholders outside of pathology to acquire information and test hypotheses related to potential solutions to the problem. After gathering input, the residents prepare a written report that includes answers to a set of standardized questions. Reports are evaluated by use of a novel rubric based on principles of critical thinking. We demonstrated that the SBP project evaluation rubric achieved high scores for interrater reliability when used by multiple faculty members during a period of 3 years.

This study does have some limitations. The study was confined to pathology residents in 1 program within 1 rotation (clinical chemistry). Residents were not required to identify the issues to be analyzed or to complete follow-up analysis or implementation of their recommended solutions. Additional challenges were encountered in project identification and development. Not all projects were of equal difficulty, and some had more obvious solutions or options for exploration than others.
Developing SBP projects of appropriate scope that can be investigated within the specified time frame is difficult, given the timeliness and variability of actual issues in the laboratories; this variability creates a "project effect" that may lead to lower scores for some projects. Despite
these limitations, our approach mitigates some challenges described when using independent study projects to teach SBP. For example, Allen et al19 reported that residents tend to select traditional scientific topics for investigation over SBP projects, had difficulty selecting projects or topics, and had problems identifying and arranging meetings with stakeholders.

In conclusion, we describe a process for assigning residents SBP projects during the clinical chemistry rotation of a 4-year pathology residency program. We developed and assessed a novel analytic rubric for evaluating the critical thinking skills of residents in the context of SBP projects based upon actual problems that laboratory directors routinely encounter. The SBP project and associated evaluation rubric can be used reliably to objectively measure residents' competency in SBP. Further studies are needed to fully validate the rubric and extend its use to the assessment of other core competencies in pathology.

References

1. Dyne PL, Strauss RW, Rinnert S. Systems-based practice: the sixth core competency. Acad Emerg Med. 2002;9(11):1270–1277.
2. Ziegelstein RC, Fiebach NH. "The mirror" and "the village": a new method for teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004;79(1):83–88.
3. Folberg R, Antonioli DA, Alexander CB. Competency-based residency training in pathology: challenges and opportunities. Hum Pathol. 2002;33(1):3–6.
4. Wang EE, Dyne PL, Du H. Systems-based practice: summary of the 2010 Council of Emergency Medicine Residency Directors Academic Assembly Consensus Workgroup—teaching and evaluating the difficult-to-teach competencies. Acad Emerg Med. 2011;18(suppl 2):S110–S120.
5. Lee AG, Beaver HA, Greenlee E, et al. Teaching and assessing systems-based competency in ophthalmology residency training programs. Surv Ophthalmol. 2007;52(6):680–689.
6. ACGME Outcome Project. 2009. Accreditation Council for Graduate Medical Education. http://www.acgme.org/outcome/comp/compFull.asp. Accessed October 10, 2009.
7. Jones MD Jr, Rosenberg AA, Gilhooly JT, Carraccio CL. Perspective: competencies, outcomes, and controversy—linking professional activities to competencies to improve resident education and practice. Acad Med. 2011;86(2):161–165.
8. Nasca TJ. Where will the "milestones" take us: the next accreditation system. ACGME Bulletin. September 3–5, 2008.
9. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051–1056.
10. Hammond ME, Filling CM, Neumann AR, Homburger HA. Addressing the maintenance of certification challenge. Arch Pathol Lab Med. 2005;129(5):666–675.
11. Talbert ML, Ashwood ER, Brownlee NA, et al. Resident preparation for practice: a white paper from the College of American Pathologists and Association of Pathology Chairs. Arch Pathol Lab Med. 2009;133(7):1139–1147.
12. Paul R, Elder L. Critical Thinking. Upper Saddle River, New Jersey: Financial Times Prentice Hall; 2002:1–342.
13. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16:297–334.
14. Shrout PE, Fleiss JL. Intraclass correlations: uses in assessing rater reliability. Psychol Bull. 1979;86:420–428.
15. ACGME Common Program Requirements. 2011. http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/Common_Program_Requirements_07012011[1].pdf. Accessed December 8, 2012.
16. Prak ET, Young DS, Kamoun M, et al. 2008 ACLPS panel discussion on resident education in clinical pathology. Am J Clin Pathol. 2009;131(5):618–622.
17. ACGME Toolbox of Assessment Methods. 2000. http://www.acgme.org/Outcome/assess/ToolTable.pdf. Accessed October 10, 2009.
18. Buchmann RF, Deloney LA, Donepudi SK, Mitchell CM, Klein SG. Development and implementation of a systems-based practice project requirement for radiology residents. Acad Radiol. 2008;15(8):1040–1045.
19. Allen E, Zerzan J, Choo C, Shenson D, Saha S. Teaching systems-based practice to residents by using independent study projects. Acad Med. 2005;80(2):125–128.
20. Delphin E, Davidson M. Teaching and evaluating group competency in systems-based practice in anesthesiology. Anesth Analg. 2008;106(6):1837–1843.
21. Leenstra JL, Beckman TJ, Reed DA, et al. Validation of a method for assessing resident physicians' quality improvement proposals. J Gen Intern Med. 2007;22(9):1330–1334.
22. Gakhar B, Spencer AL. Using direct observation, formal evaluation, and an interactive curriculum to improve the sign-out practices of internal medicine interns. Acad Med. 2010;85(7):1182–1188.
23. Davis D, Lee G. The use of standardized patients in the plastic surgery residency curriculum: teaching core competencies with objective structured clinical examinations. Plast Reconstr Surg. 2011;128(1):291–298.
24. Garstang S, Altschuler EL, Jain S, Delisa JA. Designing the objective structured clinical examination to cover all major areas of physical medicine and rehabilitation over 3 yrs. Am J Phys Med Rehabil. 2012;91(6):519–527.
25. Hingle ST, Robinson S, Colliver JA, Rosher RB, McCann-Stone N. Systems-based practice assessed with a performance-based examination simulated and scored by standardized participants in the health care system: feasibility and psychometric properties. Teach Learn Med. 2011;23(2):148–154.
26. Eskildsen MA. Review of web-based module to train and assess competency in systems-based practice. J Am Geriatr Soc. 2010;58(12):2412–2413.
27. Hauge LS, Frischknecht AC, Gauger PG, et al. Web-based curriculum improves residents' knowledge of health care business. J Am Coll Surg. 2010;211(6):777–783.
28. Kerfoot BP, Conlin PR, Travison T, McMahon GT. Web-based education in systems-based practice: a randomized trial. Arch Intern Med. 2007;167(4):361–366.
29. Issenberg SB, Chung HS, Devine LA. Patient safety training simulations based on competency criteria of the Accreditation Council for Graduate Medical Education. Mt Sinai J Med. 2011;78(6):842–853.
30. Relyea-Chew A, Talner LB. A dedicated general competencies curriculum for radiology residents: development and implementation. Acad Radiol. 2011;18(5):650–654.
31. Jonsson A, Svingby G. The use of scoring rubrics: reliability, validity, and educational consequences. Educ Res Rev. 2007;2:130–144.
32. Moskal BM, Leydens JA. Scoring rubric development: validity and reliability. Practical Assess Res Eval. 2000;7(10). http://PAREonline.net/getvn.asp?v=7&n=10. Accessed July 22, 2012.
33. Taggart G, Phifer SJ, Nixon JA, Wood M. Rubrics: A Handbook for Construction and Use. Lancaster, PA: Technomic Publishing Company; 1998:58–74.