
Education and training

An educational improvement project to track patient encounters: toward a more complete understanding of third-year medical students' experiences

K G Hoffman, M D Griggs, C A Kerber, M Wakefield, E Garrett, C Kersten, M C Hosokawa, L A Headrick

University of Missouri-Columbia School of Medicine, Columbia, Missouri, USA

Correspondence to: Dr K G Hoffman, Offices of Medical Education, MA213 Medical Science Building, Columbia, MO 65212, USA; [email protected]

Accepted 13 April 2009

Qual Saf Health Care 2009;18:278-282. doi:10.1136/qshc.2008.028720

ABSTRACT
Background: At the University of Missouri School of Medicine (MUSOM), "commitment to improving quality and safety in healthcare" is one of eight key characteristics set as goals for our graduates. As educators, we have modelled this commitment to continuous improvement by creating a method to monitor and analyse patient encounters in the third year of medical school. This educational improvement project allowed course directors to (1) confirm adequate clinical exposure, (2) obtain prompt information on student experiences, (3) adjust individual student rotations to meet requirements and (4) ascertain the range of clinical experiences available to students.
Discussion: Data illustrate high levels of use of, and satisfaction with, the educational innovation. We are in our second year of using the new Patient Log (PLOG) process and are now considering expanding PLOG into the fourth year of medical school.

Problem
National voices are calling for medical school graduates able to improve quality and safety in healthcare.1 2 By applying continuous improvement principles to educational innovations, educators reinforce and model this trait and enhance education programmes. At the University of Missouri School of Medicine (MUSOM), "commitment to improving quality and safety in healthcare" is one of eight outcomes we set for our graduates.3 We present an example of continuous improvement applied to our education work: the creation of a Patient Log (PLOG) system to track medical students' clinical experiences.

Background
Advances in technology and biomedical sciences, demand for increased public accountability, increased patient complexity with shorter lengths of stay, and the financing of healthcare have dramatically changed the environment in which our students learn. The Institute of Medicine defines quality healthcare as safe, effective, patient-centred, timely, efficient and equitable.2 Important, unresolved challenges facing medical education are to provide clinical experiences in which students care for the same types of patients they will encounter in professional practice, and to provide meaningful continuity experiences with patients.4 5 Clinical education has been criticised for being disjointed, inconsistent, less than rigorous and lacking uniformity in teaching skills.6 7 To optimise student learning, educators must recognise that learning is closely coupled with the number and type of patients encountered. Patient logs are an electronic or paper record of the number and types of patients seen. Logs provide data to (1) identify gaps in individual students' clinical experiences; (2) promote student reflection on their practice; (3) better understand learning venues (including site differences); (4) improve the overall course; (5) compare experiences across third-year courses within a single institution; and (6) compare experiences with similar programmes at other institutions.8 Recognising the importance of the quantity and type of clinical exposure in the development of professional competence, US medical school accreditation now requires documentation of: (1) types of patients, (2) appropriate clinical settings, (3) expected level of student responsibility and (4) revision of students' experiences, when necessary, to ensure students meet the requirements set by the faculty.9

In the absence of a standardised, timely reporting system, course directors are challenged to monitor the specific types and numbers of patients available to students. Lack of standardisation results in duplication of effort and an inability to view third-year experiences across courses.

Purpose of change
The aims of this educational improvement project were to: (1) support timely monitoring of student clinical experiences, providing mid-course feedback and corrections when necessary; (2) systematically collect data across third-year courses; (3) inform clinical leaders of the students' view of the third-year experience; (4) improve efficiency, decrease waste and improve the use of scarce resources; and (5) implement the educational improvement project expeditiously.

METHODS
Setting
MUSOM is located on the main campus of the University of Missouri, a public research university supported by the state and designated Doctoral Research University-Extensive. MUSOM ranks 69th of 76 publicly supported medical schools in state funding per student, so efficient use of scarce resources is vital. The School accepts 96 students each year. The programme leading to the MD degree consists of a problem-based learning curriculum in the first 2 years, seven core clinical courses in the third year and additional fourth-year courses. Students may complete their third year locally or in rural communities. We began this educational improvement project in January 2006, with projected implementation in June 2006.


Box 1 Institute for Healthcare Improvement's knowledge domains for the improvement of healthcare

- Knowledge of the needs and preferences of those we serve.
- Work as a process, system.
- Variation and measurement.
- Leading, following and making changes.
- Collaboration.
- Developing new, locally useful knowledge.
- Social context and accountability.
- Professional subject matter.

Function


The PLOG Committee was formed from the larger Clinical Curriculum Steering Committee that provides leadership and oversight for the clinical curriculum. Recognising the need for collaboration between people who would use the educational innovation and those who would build the new process, we created the work group illustrated in table 1.


Intervention
The Institute for Healthcare Improvement's eight knowledge domains for the improvement of healthcare (box 1) and Langley et al's Model for Improvement10 11 helped frame the educational innovation. This model encouraged the team to: (1) better understand the needs and preferences of those who would work in the new process; (2) view the newly created process as part of a larger system; (3) standardise data collection to decrease variation and enable comparisons; (4) enhance collaboration across courses; and (5) model process improvement and thereby develop new team knowledge.

Organising the content
To better understand the quantity and mix of patient encounters required for each student's experience, course directors considered both discipline-specific national guidelines (eg, paediatrics, obstetrics-gynaecology) and MUSOM curricular goals.3 In the absence of national criteria for a minimum number of patients, course faculty developed local requirements, with variation across courses. Family Medicine and Obstetrics & Gynaecology required students to document a large percentage of their patient encounters; the Internal Medicine and Surgery courses required students to document fewer total encounters. Once course directors had a clear understanding of their own course needs, the committee defined categories for the collection of patient encounters across all third-year courses. Three broad categories emerged (table 2). Box 2 illustrates the data elements collected for all patient encounters. Participation in PLOG was required for each course, and data were shared across courses. The PLOG committee developed a standard set of instructions for use. Although each course director could add to the common instructions, these core "rules" were followed across all third-year courses.
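To make the shared record concrete, here is a minimal sketch of how a single PLOG entry might be modelled, using the three broad categories from table 2 and the data elements from box 2. The class and field names are our illustrative assumptions, not the actual system's schema.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class BroadCategory(Enum):
    """The three broad categories defined by the PLOG committee (table 2)."""
    DIAGNOSIS_SYMPTOM = "diagnosis/symptom"
    SKILL = "skill"
    SPECIAL_DOMAIN = "special domain"

@dataclass
class PlogEntry:
    """One logged patient encounter (data elements from box 2).

    Hypothetical schema for illustration only.
    """
    student_id: str
    course: str                  # eg, "Family Medicine"
    encounter_date: date
    location: str                # location of clinical encounter
    site_of_care: str            # ambulatory/inpatient/urgent care, etc
    patient_age: int
    patient_gender: str
    category: BroadCategory      # diagnosis/symptom, skill or special domain
    descriptor: str              # eg, "breast cancer", "obtaining a history"
    simulation: bool             # whether the encounter used simulation
    participation_level: str     # level of student participation with patient
```

Because every course records the same fields, entries can be pooled and compared across all seven third-year courses, which is the standardisation the committee was after.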

Defining guidelines for use
Students used the same template for documenting their patient encounters across all seven courses; however, specific parts of the template were activated for a given course. For example, students enrolled in Child Health could document congenital heart disease, while students enrolled in Internal Medicine could view but not document congenital heart disease.
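The paper does not describe how this activation was implemented; the sketch below is one minimal way to express course-specific documenting rights. Apart from the congenital heart disease example, which comes from the text, the dictionary contents are invented placeholders.

```python
# Which template items each course may document; items not listed are
# view-only for that course. Hypothetical configuration for illustration.
DOCUMENTABLE_ITEMS = {
    "Child Health": {"congenital heart disease", "preventive care"},
    "Internal Medicine": {"peripheral vascular disease", "dyslipidaemia"},
}

def can_document(course: str, item: str) -> bool:
    """Return True if students in `course` may log `item` as an entry."""
    return item in DOCUMENTABLE_ITEMS.get(course, set())

# Students in Child Health can document congenital heart disease...
assert can_document("Child Health", "congenital heart disease")
# ...while students in Internal Medicine can view but not document it.
assert not can_document("Internal Medicine", "congenital heart disease")
```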

Leveraging technology in support of a new process
The PLOG committee considered two questions: (1) In what areas can automation maximise efficiency in the process? (2) How can we leverage the newly developed system to eliminate redundancy? To meet the needs of users in different locations, we decided to use a web-based system. Once we were clear on how the educational innovation should be used, we moved from illustrations of how the PLOG website might work to a functioning prototype. Faculty, staff and students reviewed the new process and recommended changes to the displayed content and to the PLOG user interface.

Measures
Prior to the educational improvement project, data on students' educational experience were not systematically collected across all courses; thus, baseline measures were not available. We monitored student entries for (1) number of patients logged, (2) location and site of the patient encounter, (3) patient gender and age, (4) use of the broad categories, (5) number of students not meeting requirements each week and (6) number of weeks required to meet course requirements. Data were analysed at both the individual course level and across all courses in the third year. Data were collected on how frequently course directors used the system to provide feedback to students, the timeliness of student data entry and student progress in meeting course requirements. Surveys at 6 months and 1 year solicited course directors', course coordinators' and students' perceptions of satisfaction, time required to make entries, technical difficulties and data use. To ensure anonymity, responses were not linked to role or department. Descriptive statistics were used to analyse PLOG data and survey results. We analysed open-ended survey questions for themes.
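As an illustration of two of these measures, the sketch below computes, from records shaped like the PlogEntry sketch above, which students have not yet met their course's minimum and how many weeks a student needed to meet it. The function names and the `required_counts` parameter are hypothetical; the paper notes only that minima varied by course.

```python
from collections import defaultdict

def students_below_requirement(entries, required_counts):
    """Return (student, course) pairs whose logged encounters fall short of
    the course's required minimum. `required_counts` maps course name to the
    local minimum set by course faculty (a hypothetical parameter).
    Covers students with at least one entry; students with no entries at all
    would need to be seeded separately."""
    logged = defaultdict(int)
    for e in entries:
        logged[(e.student_id, e.course)] += 1
    return [
        (student, course)
        for (student, course), n in logged.items()
        if n < required_counts.get(course, 0)
    ]

def weeks_to_meet_requirement(entries, course, student_id, required):
    """Number of distinct ISO weeks of logging before the student's cumulative
    count in `course` first reaches `required`, or None if never reached.
    Assumes a course falls within one calendar year."""
    weekly = defaultdict(int)
    for e in entries:
        if e.course == course and e.student_id == student_id:
            weekly[e.encounter_date.isocalendar()[1]] += 1
    total = 0
    for i, week in enumerate(sorted(weekly), start=1):
        total += weekly[week]
        if total >= required:
            return i
    return None
```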

Table 1 Composition and roles of the process-improvement team

Position | Role in team
Course Directors for Child Health, Surgery, Family Medicine, Internal Medicine | Contributed detailed knowledge of course requirements, clinical settings and clinical material
Information Technology Group | Gained an understanding of business rules; provided immediate feedback on the feasibility of proposed changes
Business Technology Analyst | Provided detailed knowledge of existing information systems
Associate Dean, Educational Evaluation and Improvement | Provided guidance and leadership in quality improvement
Instructional Design Specialist | Provided curriculum and evaluation expertise


Table 2 Broad categories for collecting patient encounters

Broad category | Examples
Diagnosis/symptom | Breast cancer; cardiovascular dyslipidaemia; peripheral vascular disease
Skill | Obtaining a history; wound/trauma care; maternity care; preventive care
Special domain | Caring for a patient from a culture other than your own


Box 2 Standardised data collection

Instructions:
- Not all patient encounters will be entered.
- Each patient encounter may generate up to three entries.
- Each encounter must have at least one entry.
- Each patient encounter can only be counted once per site.
- Once entered, a patient encounter cannot be edited.
- Entry of encounter data is encouraged on a daily basis and required on a weekly basis, by 07:00 each Monday.
- Data may be audited for accuracy; falsification will be considered an Honour Code violation.
- Students should monitor their progress on a regular basis.

Data elements:
- location of clinical encounter;
- patient age and gender;
- site of care (ambulatory/inpatient/urgent care, etc);
- use of simulation;
- level of student participation with patient.

RESULTS
Situation analysis
Prior to participation in the educational improvement project, the models for process improvement were not well known to faculty or the PLOG committee. Most participants were familiar with web-based applications. The situational analysis brought into focus the importance of: (1) creating a user-friendly interface, (2) online examples, (3) one-on-one training and (4) timely, high-quality technical assistance. We developed locally useful process-improvement knowledge by explicitly planning for small changes as we implemented the new process. Our decisions were informed by data collected across the project, and we linked decisions to improvement models.
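Several of the entry rules in box 2 are mechanically checkable, which a web-based system could enforce at the point of entry; a minimal sketch under that assumption (the function names and field choices are ours, not the system's):

```python
from datetime import datetime, timedelta

MAX_ENTRIES_PER_ENCOUNTER = 3  # "Each patient encounter may generate up to three entries"

def entry_allowed(existing_entries_for_encounter: int) -> bool:
    """Enforce the one-to-three entries rule for a single patient encounter."""
    return existing_entries_for_encounter + 1 <= MAX_ENTRIES_PER_ENCOUNTER

def weekly_deadline(after: datetime) -> datetime:
    """Return the first Monday 07:00 deadline following `after`
    ("required on a weekly basis by 07:00 each Monday")."""
    days_ahead = (0 - after.weekday()) % 7  # Monday is weekday 0
    candidate = (after + timedelta(days=days_ahead)).replace(
        hour=7, minute=0, second=0, microsecond=0)
    if candidate <= after:  # already past this Monday 07:00
        candidate += timedelta(days=7)
    return candidate

def is_timely(entered_at: datetime, encounter_at: datetime) -> bool:
    """An entry is timely if made before the first Monday 07:00 deadline
    following the encounter."""
    return entered_at <= weekly_deadline(encounter_at)
```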

Patient encounters
During 2006-7, the average number of clinical encounters over an 8-week course ranged from 28.4 (Surgery) to 140 (Family Medicine) (table 3). Fewer encounters were expected in courses of shorter duration (Neurology 2 weeks, Psychiatry 6 weeks). Individual course requirements influenced the average number of encounters students logged. Figure 1 illustrates the use of the broad categories in each course. These data reflect differences in the course experiences and illustrate the system's flexibility. Data in each broad category were further analysed within a course and across the third year. For example, analysis identified that 12% of patients logged across all courses had cardiovascular disease, and 12% of skills logged were for education and prevention activities.

Figure 1 Use of broad categories of patients logged by clerkship.

Using the PLOG system, course directors provided prompt feedback to students on the timeliness of their logs and their progress in meeting course requirements (table 4). In 2006-7, the majority of students entered data in a timely manner and made adequate progress across the course. For example, of the 90 students matriculating in the Family Medicine course, four demonstrated inadequate progress at mid-course, and rotation changes were made to ensure completion of course objectives. Students' ability to meet minimum requirements for each course was monitored weekly by the course directors when the system was first implemented, and minor adjustments were made in the Child Health and Psychiatry courses. Most students needed 6-7 weeks to complete course requirements across all courses.
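The three progress categories in table 4 suggest a simple mid-course rule that a course director's view might apply. The sketch below is illustrative only: the proportional expectation and the 75% cut-off are invented, as the paper does not state how the categories were assigned.

```python
def classify_progress(logged: int, required: int, weeks_elapsed: int,
                      course_weeks: int = 8) -> str:
    """Map a student's mid-course log count to the table 4 categories.
    The proportional expectation and 75% cut-off are illustrative assumptions."""
    expected_by_now = required * weeks_elapsed / course_weeks
    if logged >= expected_by_now:
        return "on target to meet clerkship requirements"
    if logged >= 0.75 * expected_by_now:
        return "some progress; needs to increase participation/documentation"
    return "at risk of not meeting clerkship requirements"

# eg, at mid-course (week 4 of 8) in a course requiring 100 logged encounters:
print(classify_progress(55, 100, 4))   # on target
print(classify_progress(40, 100, 4))   # some progress
print(classify_progress(20, 100, 4))   # at risk
```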

Usability and feedback
Table 5 details the students' feedback from the 2006-7 end-of-year survey. Table 6 summarises the course directors' and coordinators' responses to an end-of-year survey. Timely data encouraged more conversations with students regarding clinical experiences. For example, PLOG data were used in Internal Medicine to determine whether students would have adequate exposure to adult patients in a Medicine/Paediatrics practice. PLOG data also enhanced supervision. For example, an atypically low number of patients logged for a Family Medicine rotation promptly identified issues with the preceptor and site.

Table 3 Patient encounters logged for 2006-7

Age of patients: range birth to >80 years; 40% of patients 16-64 years; 15% of patients >65 years; 7% of patients <1 year
Care settings: 83% community encounters in the Family Medicine course; 43% rural settings in the Obstetrics & Gynaecology course; 62% ambulatory across all courses
Average number of patients logged per course: Child Health = 90.2; Family Medicine = 140; Internal Medicine = 48.9; Neurology = 22.5; Obstetrics & Gynaecology = 107.5; Psychiatry = 37.9; Surgery = 28.4

DISCUSSION

The educational improvement project is part of our school's commitment to continuously improving the quality of our work. The PLOG project supported monitoring of, and timely feedback on, students' clinical experiences; enabled course directors to better understand the clinical experiences of our third-year medical students; and provided comparative course data across the third year. This project illustrates the use of several of the Institute for Healthcare Improvement's knowledge domains as they relate to improvements in medical education (box 1).10 We explore four of these below.


Customer/beneficiary knowledge
The PLOG project used a common problem to assess the needs and preferences of the persons who would use the improved process.

Table 4 Timeliness of data entry and progress in meeting course requirements: number of students with patient log reviews, 2006-7

Professionalism/timeliness: (a) student is demonstrating professionalism by entering data in a timely and thoughtful manner; (b) student is not demonstrating professionalism by not entering data in a timely and thoughtful manner. Student progress and documentation of clerkship requirements: (c) student is making appropriate progress and is on target to meet clerkship requirements, no problems anticipated; (d) student has made some progress but needs to increase participation and/or documentation to meet clerkship requirements; (e) student has not demonstrated adequate progress and is at risk of not meeting clerkship requirements.

Clerkship | (a) | (b) | (c) | (d) | (e)
Child Health | 88 | 4 | 87 | 3 | 5
Family Medicine | 90 | 2 | 90 | 3 | 1
Internal Medicine | 87 | 4 | 86 | 3 | 1
Neurology | Not applicable | Not applicable | Not applicable | Not applicable | Not applicable
Obstetrics & Gynaecology | 35 | 9 | 31 | 15 | 5
Psychiatry | 31 | 2 | 29 | 3 | 1
Surgery | 86 | 0 | 85 | 4 | 0
Grand total | 417 | 21 | 408 | 31 | 13

Students needed (1) ease of use, (2) system access from multiple locations, (3) prompt feedback on progress during the course and (4) the ability to monitor clinical encounters in light of course goals. Faculty needed (1) a better understanding of the students' third-year clinical experiences, (2) more efficient methods to manage data, (3) a work flow to accommodate busy schedules and (4) the ability to view students' clinical experience longitudinally across the third year.

Education as process/system
A system is an interdependent group of people, procedures, activities and technologies with a common purpose or aim.10 Historically, schools have not looked at third-year clinical subject matter as a whole. Those that have done so found overlapping content, isolation of course directors and a need for better collaboration and coordination.12-14 The PLOG educational improvement project helped to dissolve discipline-specific boundaries and to integrate data on students' clinical encounters across the third year, aiding course directors in viewing the educational experiences in total and from the point of view of the student. Course directors gained a better understanding of how their course content fitted within the third year, as well as of their unique contribution to the School's educational outcomes. This project helped to clarify the desired outcomes for our third-year students. Efficiency was gained by moving data collection from a labour-intensive, idiosyncratic paper process to an integrated web-based process. Duplication of effort was reduced, as the same data were used in the assessment of school outcomes and to address multiple accreditation requirements.

Table 5 Student usability survey results (93.3% response rate)

Time spent entering data: 74% <1 h/week; 9.6% >1.5 h/week
Locations of data entry: hospital/clinics; library; home
Student use of patient log data: monitor progress 36%; develop learning objectives 9%; inform reading goals 2%; discuss progress 10%; seek additional experiences during course 34%; seek experiences across courses 9%


Collaboration
The project fostered skill in collaboration by bringing together persons with diverse skills and perspectives. Collaboration among third-year courses ensures that experiences and skill acquisition are optimally sequenced,14 allows pooling of intellectual and material resources to support innovations in evaluation tools, enhances recruitment and development of faculty, and increases collegial support and scholarly efforts.12 15 Burke et al16 found that a collaborative effort to develop a process for peer review of courses led to greater sharing of information and better understanding across third-year courses. In our educational improvement project, transparency of the data across the seven courses increased collegiality and collaboration, as course directors came to better understand the structure of each course as well as the differences among disciplines. Participants gained experience in an effective team with clearly defined roles and responsibilities, which they have applied to other collaborative projects. The project not only enhanced communication and collaboration among medical specialties but also fostered collaboration between content experts and technology experts, which greatly enhanced the work. We believe the work succeeded because participants gained an appreciation for multiple perspectives and developed a common language to advance the project goals.

Developing new, locally useful knowledge
Educational leaders analysed current processes, looked for opportunities to improve them, solicited actionable feedback at 6 and 12 months, and used this feedback to further improve PLOG. Knowledge gained through this educational innovation has advanced other cross-discipline projects (an end-of-year clinical exam; student perceptions of the quality of clinical teaching). We are in our second year of using the new PLOG process.

Table 6 Director and coordinator survey results (77% response rate)

Time required: 90% <1 h/week
Satisfaction: 80% fully satisfied
Reported uses of patient log:
- confirm adequate clinical exposure
- obtain real-time information on student experiences
- readjust individual student rotations to meet requirements
- integrate simulated cases into courses
- better understand range of clinical experiences in community sites


CONTEXT


Extant research contains multiple reports of the use of patient logs within discipline-specific courses.17-20 Authors have debated the benefits of paper, handheld devices and web-based systems.21-23 Although patient logs have many advantages, Dolmans et al24 found that clinical course directors did not make course changes based on patient logs, and many students did not feel it was worthwhile to complete the logs. The data presented in this paper demonstrate high compliance from both course directors and students, and both groups reported making adjustments to education experiences based on PLOG data. Recently, interest in professionalism, outcomes and competency-based medical education has grown.25 Wimmers et al26 found that increases in the number of patient encounters did not necessarily lead to increased competence; rather, quality of supervision more directly affected student learning. We found that PLOG helped to increase both the timeliness and quality of student supervision. Discussions about the application of quality improvement to health professions education have occurred since the early days of quality improvement in healthcare,27-32 but there are relatively few examples in the literature. Armstrong et al33 suggested a systems approach to improving medical education: (1) defining an organisation's objectives by the characteristics (volume, variety, features) of the services provided; (2) designating who should do specific work, and in what sequence, for others to achieve organisational objectives; (3) designating characteristics for the services one person provides, given the capabilities and needs of the processes that precede and follow; and (4) determining how individual work assignments should be done, given their relationship to the system overall.33 Participation in the PLOG educational innovation aided course directors in placing their educational offerings into the bigger picture of medical student education. The process promoted enhanced clarity in individual course objectives and illuminated how each course helped students achieve the school's desired outcomes. PLOG data have opened discussions on learning activities throughout medical school.


LIMITATIONS


This study is limited in that it reports a process-improvement activity in one medical school. There was no opportunity to create a control group or a comparison group. Because each course independently and idiosyncratically collected and processed its information, historical comparisons are not possible.






CONCLUSIONS
This project used a common problem to which all participants could relate: documentation of medical students' patient encounters for the purpose of improving learning experiences. The common problem was accentuated by new accreditation standards. By engaging the individuals primarily involved in the clinical courses, the development of the PLOG brought together a team and provided a collaborative opportunity to learn about quality improvement in education. Together, the team learnt by doing. The relationships and skills gained in this initiative have led to additional initiatives to improve medical education.


Competing interests: None.
Ethics approval: Granted by the University of Missouri Health Sciences Institutional Review Board (project no 1078237).



REFERENCES
1. Accreditation Council for Graduate Medical Education. Outcomes project. Chicago: Accreditation Council for Graduate Medical Education. http://www.acgme.org/acWebsite/home/home.asp (accessed 3 Mar 2008).
2. Institute of Medicine (US), Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington: National Academy Press, 2001.
3. The University of Missouri-Columbia School of Medicine. MU2020 mission statement for medical education (updated 22 Apr 2005). Columbia: The University of Missouri-Columbia School of Medicine. https://somis.umh.edu/visions/ (accessed 1 Feb 2008).
4. Whitcomb ME. More on medical education reform. Acad Med 2004;79:1-2.
5. Hirsh DA, Ogur B, Thibault GE, et al. "Continuity" as an organizing principle for clinical education reform. N Engl J Med 2007;356:858-66.
6. Spencer J. ABC of learning and teaching in medicine: learning and teaching in the clinical environment. BMJ 2003;326:591-4.
7. Nutter D, Whitcomb M. The AAMC project on the clinical education of medical students. http://www.aamc.org/meded/clinicalskills/AAMC Project (accessed 1 Feb 2008).
8. Bridge PD, Ginsburg KA. An integrated approach for evaluating students' achievement of clinical objectives. Med Educ Online 2001;6:9. http://www.med-edorg (accessed 1 Feb 2008).
9. Liaison Committee on Medical Education. Functions and structure of a medical school: standards for accreditation of medical education programs leading to the M.D. degree. Washington: Liaison Committee on Medical Education, June 2007:1-31.
10. Institute for Healthcare Improvement. Eight knowledge domains for health professional students. Health Professions Education. http://www.ihi.org/IHI/Topics/HealthProfessionsEducation/EducationGeneral/EmergingContent/EightKnowledgeDomainsforHealthProfessionalStudents.html (accessed 1 Feb 2008).
11. Langley GJ, Nolan KM, Nolan TW, et al. The improvement guide: a practical approach to enhancing organizational performance. San Francisco: Jossey-Bass, 1996.
12. Pipas CF, Peltier DA, Fall LH, et al. Collaborating to integrate curriculum in primary care medical education: successes and challenges from three US medical schools. Fam Med 2004;36:126-32.
13. Prystowsky JB, DaRosa DA, Thompson JA. Promoting collaborative teaching in clinical education. Teach Learn Med 2001;13:148-52.
14. Magrane D, Ephgrave K, Jacobs MB, et al. Weaving women's health across clinical clerkships. Acad Med 2000;75:1066-70.
15. Kutner JS, Morrison EH, Beach MC, et al. Facilitating collaboration among academic generalist disciplines: a call to action. Ann Fam Med 2006;4:172-6.
16. Burke MJ, Bonaminio MJ, Walling A. Implementing a systematic course/clerkship peer review process. Acad Med 2002;77:930-1.
17. Alderson TS, Oswald NT. Clinical experience of medical students in primary care: use of an electronic log in monitoring experience and in guiding education in the Cambridge Community Based Clinical Course. Med Educ 1999;33:429-33.
18. Benjamin S, Robbins LI, Kung S, et al. Online resources for assessment and evaluation. Acad Psychiatry 2006;30:498-504.
19. Bennett AJ, Arnold LM. Use of a computerized evaluation system in a psychiatry clerkship. Acad Psychiatry 2004;28:197-203.
20. Carney PA, Pipas CF, Eliassen S, et al. An analysis of students' clinical experiences in an integrated primary care clerkship. Acad Med 2002;77:681-7.
21. Fischer S, Stewart TE, Mehta S, et al. Handheld computing in medicine. J Am Med Inform Assoc 2003;10:139-49.
22. Johnson VK, Michener JL. Tracking medical students' clinical experiences with a computerized medical records system. Fam Med 1994;26:425-7.
23. Kurth R, Silenzio V, Irigoyen MM, et al. Use of personal digital assistants to enhance educational evaluation in a primary care clerkship. Med Teach 2002;24:488-90.
24. Dolmans D, Schmidt A, van der Beek J, et al. Does a student log provide a means to better structure clinical education? Med Educ 1999;33:89-94.
25. Musick DW. Emerging trends in medical education: what are they? And why are they important? N C Med J 2005;66:244-8.
26. Wimmers PF, Schmidt HG, Splinter TAW, et al. Influence of clerkship experiences on clinical competence. Med Educ 2006;40:450-8.
27. Ellrodt A. Introduction of total quality management (TQM) into an internal medicine training program. Acad Med 1993;68:817-23.
28. Headrick L, Knapp M, Neuhauser D, et al. Working from upstream to improve health care: the IHI interdisciplinary professional education collaborative. Jt Comm J Qual Improv 1996;22:149-64.
29. Moore S, Alemi F, Headrick L, et al. Using learning cycles to build an interdisciplinary curriculum in CI for health professions students in Cleveland. Jt Comm J Qual Improv 1996;22:165-71.
30. Cleghorn G, Headrick L. The PDSA cycle at the core of learning in health professions education. Jt Comm J Qual Improv 1996;22:206-12.
31. Baker G, Gelmon S, Headrick L. Collaborating for improvement in health professions education. Qual Manag Health Care 1998;6:1-11.
32. Coleman M, Headrick L, Langley A, et al. Teaching medical faculty how to apply continuous quality improvement to medical education. Jt Comm J Qual Improv 1998;24:640-52.
33. Armstrong E, Mackey M, Spear S. Medical education as a process management problem. Acad Med 2004;79:721-8.
