
ACAD EMERG MED • January 2003, Vol. 10, No. 1 • www.aemj.org

Patient Safety: A Curriculum for Teaching Patient Safety in Emergency Medicine

Karen S. Cosby, MD, Pat Croskerry, MD, PhD

Abstract: The last decade has witnessed a growing awareness of medical error and the inadequacies of our health care delivery systems. The Harvard Practice Study and subsequent Institute of Medicine reports brought national attention to long-overlooked problems with health care quality and patient safety. The Committee on Quality of Health Care in America challenged professional societies to develop curricula on patient safety and adopt patient safety teaching into their training and certification requirements. The Patient Safety Task Force of the Society for Academic Emergency Medicine (SAEM) was charged with that mission. The curriculum presented here offers an approach to teaching patient safety in emergency medicine. Key words: patient safety; curriculum; emergency medicine. ACADEMIC EMERGENCY MEDICINE 2003; 10:69–78.

Patient safety is not a well-developed discipline. Definitions and concepts are evolving as experts debate error models and solutions to system problems. There are, in fact, more questions than answers. However, there is a growing "error" knowledge base, and there are resources that can make us better and safer. There are also overwhelming system problems that cannot wait for the science to mature. Simply enough, the imperative is to improve and to teach improvement. While we attempt to reform existing systems, significant change can begin by defining a curriculum to inform and educate. Once a knowledge base is defined and basic principles established, a foundation is laid to foster discussion and debate, encourage innovation, and advance ideas and shared goals.

We propose a variety of approaches to teaching patient safety. The curriculum begins with the traditional role of physician as diagnostician and decision-maker and focuses on complex decision making, cognitive science theory, and evidence-based medicine. The student is then directed to step away from the bedside and take a system-wide look at problems in health care delivery, to consider how system problems contribute to risk and harm. We challenge educators to step beyond their usual domain of expertise to develop skills in human factors engineering, information technology, and high-fidelity patient simulation.

This curriculum is intended for those actively engaged in the practice and teaching of emergency medicine. Although it was designed to guide the teaching of emergency medicine residents and medical students, we hope that it will be shared with others in the wider health care community, including nurses, pharmacists, paramedics, and administrators. The content is meant to be taught across disciplines and job categories. The outline of the curriculum is presented here. Supplemental teaching materials with case examples are available on the SAEM website (www.saem.org). Guidelines for a curriculum for error prevention were proposed by the AEM consensus conference on errors in emergency medicine in 2000.1 This curriculum is the result of that conference and subsequent work by the Patient Safety Task Force.

From the Department of Emergency Medicine, Cook County Hospital/Rush Medical School, Chicago, IL (KSC); and the Department of Emergency Medicine, Dalhousie University, Halifax, Nova Scotia, Canada (PC). This paper is a product of the Society for Academic Emergency Medicine (SAEM) Patient Safety Task Force, and was approved by the SAEM Board of Directors on July 5, 2002. Received June 5, 2002; accepted July 5, 2002. Section editors: Pat Croskerry, MD, PhD, Department of Emergency Medicine, Dartmouth General Hospital Site, Dalhousie University, Halifax, Nova Scotia, Canada; and Marc J. Shapiro, MD, Department of Emergency Medicine, Rhode Island Hospital, Brown University School of Medicine, Providence, Rhode Island. Address for correspondence and reprints: Karen S. Cosby, MD, Cook County Hospital, Department of Emergency Medicine, 1900 West Polk Street, Chicago, IL 60612. Fax: 312-633-8189; e-mail: [email protected].

CURRICULUM

I. Awareness of Medical Error: Bringing a Safety-conscious Culture to Medicine A. Concepts 1. Patients frequently incur harm from the very system meant to help. 2. Awareness, acknowledgment, and acceptance of the reality of error can help avoid the tendency to blame individuals and lead to a culture that more effectively addresses the inadequacies of the system as a whole. B. Content 1. The scope and magnitude of error and harm in medicine.2–8 a) As many as 98,000 deaths each year may be attributed to medical errors.4 b) Deaths from medical errors probably exceed the number of deaths each year from motor vehicle collisions.4 c) The incidence of iatrogenic injury in the United States alone may be the equivalent of three jumbo jet crashes every two days.9 d) One study found an average of 1.7 errors per patient day in the intensive care unit.10 e) Iatrogenic injuries are a common cause for harm, including cardiac arrests.11 2. The emergency department (ED) is an environment of great risk. a) Of adverse events reported in the ED by the Harvard Practice Study, more than 70% were judged to be secondary to negligence and more than 90% were judged to be "preventable."3,12 b) The ED setting itself poses risk.13 (1) Undifferentiated problems of varying acuity. (2) High degree of uncertainty. (3) Need for rapid intervention in the face of incomplete information. 3. The hidden nature of error in the ED. a) Lack of feedback.14 b) Lack of ownership; many people involved in each patient visit. c) Frequent distractions.15 d) Fragmentation in the delivery of care.16 4. The contribution of medical education to error.17 a) Presents medicine in an authoritarian and hierarchical structure. b) Emphasizes individual performance with little attention to teamwork.

c) Tends to view doubt and indecision as a weakness; fails to acknowledge uncertainty in medical decision making. d) Fails to view the clinician as a part of a larger system. 5. The need to reform. a) Error and iatrogenic harm are as serious as many of the diseases we face. b) We need to begin to understand the many factors that contribute to harm. C. Teaching Methods 1. Facts concerning the statistics of medical error can be presented in a lecture format. 2. Small-group discussions can center on how to acknowledge and address error. 3. These concepts can be introduced by viewing the video: Beyond Blame. Solutions to America's other drug problem [videotape]. Solana Beach, CA: Bridge Medical, Inc., 1997. Refer to: http://www.bridgemedical.com/beyond_blame.shtml. D. Recommended Reading 1. Special Issue: Errors in Emergency Medicine. Acad Emerg Med. 2000; 7:1173–340. 2. Reducing Error. BMJ. 2000; 320:725–814. 3. Leape LL. Error in medicine. JAMA. 1994; 272:1851–7. 4. Blumenthal D. Making medical errors into "medical treasures" [editorial]. JAMA. 1994; 272:1867–8. 5. Additional resources can be found at the website for the National Patient Safety Foundation at: http://www.npsf.org. II. Definitions and Models of Error A. Concepts 1. Basic definitions of error are provided to give a common language for discussing medical errors and iatrogenic harm. 2. A variety of models for incident analysis are presented to guide investigations, detect causes for harm, and help find solutions.
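The incident-analysis models introduced under Content below (notably Reason's "Swiss-cheese" model) share a common intuition: harm reaches the patient only when a hole lines up in every layer of defense. The arithmetic behind that intuition can be sketched in a few lines. This is an illustrative toy only — the probabilities are invented, and it assumes the defense layers fail independently, an assumption that latent system weaknesses routinely violate:

```python
# Toy sketch of the layered-defenses ("Swiss-cheese") intuition.
# All probabilities are invented for illustration; layers are assumed
# to fail independently, which real systems often do not.

def residual_risk(layer_failure_probs):
    """Probability that an error penetrates every defense layer."""
    risk = 1.0
    for p in layer_failure_probs:
        risk *= p
    return risk

# Three defenses (e.g., prescriber check, pharmacist check, bedside check),
# each missing the error 10% of the time:
print(round(residual_risk([0.10, 0.10, 0.10]), 6))        # 0.001

# Adding a fourth independent check cuts residual risk another tenfold:
print(round(residual_risk([0.10, 0.10, 0.10, 0.10]), 6))  # 0.0001
```

The sketch also shows why latent error is so dangerous: a shared weakness that correlates the layers' failures (the same understaffed shift, the same confusing label) makes the true residual risk far higher than this independent-layer multiplication suggests.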


B. Content 1. Definitions of error.18–21 a) Error, error of execution, error of planning. b) Active versus latent error. c) Adverse event. d) Near miss. 2. Models of error and incident analysis. a) Reason's "Swiss-cheese" model.22,23 (1) Describes active and latent error. (2) Recognizes human factors distinct from system problems. (3) Describes high-reliability organizations. b) Root cause analysis24: tracks back to find the most basic (root) cause of adverse events. c) Vincent's organizational accident model25: evaluates the task, the team, the work environment, and the organization. d) Haddon's injury prevention matrix26,27: examines the host, the vector, and the environment in three phases: pre-event, event, and post-event. e) Helmreich's crew resource management28,29: emphasizes teamwork and team management principles. f) Failure modes and effects analysis (FMEA)30–32: prospectively examines the likelihood of failure and designs (or redesigns) the system to minimize the risk of failure and harm. g) Industries outside medicine employ many different safety models. Medicine has largely evolved without a safety structure. No single model has been accepted for medical injury. Features of many different models have something to offer. C. Teaching Methods 1. Basic definitions and models can be provided and used to evaluate an actual case. Students can be encouraged to compare approaches and different models to find as many corrective actions as possible. D. Recommended Reading 1. Reason J. Human error: models and management. BMJ. 2000; 320:768–70.

2. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998; 316:1154–7. 3. Brasel KJ, Layde PM, Hargarten S. Evaluation of error in medicine: application of a public health model. Acad Emerg Med. 2000; 7:1298–302.

III. Cognitive Error and Medical Decision Making A. Concepts 1. Error is common in medicine. The ED is the third most likely site for significant errors, after the operating room and the patient's hospital room.3 Compared with the total time spent in each arena, it is clear that the risk of harm in the ED is disproportionate to that in other settings. 2. The emergency physician is forced to act in the face of great uncertainty and within tight time constraints. Most of our time is spent making decisions. How we make decisions is essential to how well we perform. 3. Thinking about how we think and improving our decisions can improve the quality of care we provide.33 B. Content 1. Models for medical decision making. a) Hypothetico–deductive method.34,35 (1) Hypothesis generation. (2) Hypothesis refinement. (3) Testing the hypothesis. (4) Causal reasoning. (5) Diagnostic verification. b) Normative principles and basic statistics.36,37 (1) Disease prevalence. (2) Sensitivity, specificity, positive and negative predictive values, 2 × 2 tables. (3) Pretest probability, posterior probability, Bayes' theorem. c) Evidence-based medicine38,39: skills to apply the best possible information to the unique situation of each patient. d) Specialty bias: the distinct approach applied in emergency medicine to "rule out the worst" rather than accept the most likely. e) Thresholds34: thresholds to test, thresholds to treat, thresholds to admit. f) Heuristics.33,34,40–42 (1) "If, then" rules, rules of thumb. (2) Diagnostic and treatment algorithms. (3) Clinical protocols. g) Cognitive science: understanding how we think.33,35,40,43 2. How we err in decision making. a) Making the wrong diagnosis. (1) Knowledge gaps or inexperience. (2) Faulty information gathering. (3) Faulty pattern recognition. (4) Misuse or misinterpretation of tests. b) Elstein's error classification.40 (1) Inaccurate estimation of pretest probability, an inaccurate estimate of disease prevalence. (2) An inaccurate estimate of the strength of the evidence. c) Cognitive bias.33,35,40 (1) Availability.33,40–42 (2) Representativeness.33,40–42 (3) Confirmation.33,40 (4) Search-satisficing.33,44 (5) Anchoring.33,42 (6) Premature diagnostic closure.34 (7) Zebra retreat.35,45 (8) Omission bias.40,46 (9) Outcome bias.40,47 (10) Prevalence bias.33,48 (11) Conjunction fallacy.33,41 d) Factors influencing human performance. (1) Affective errors. (2) Personal impairment: fatigue, stress, interpersonal conflicts. 3. How can we prevent cognitive error?49 a) Reduce cognitive load. (1) Simplify diagnostic and treatment protocols. (a) Clinical decision rules and practice guidelines. (b) Shared approach to care: Advanced Cardiac Life Support (ACLS), Advanced Trauma Life Support (ATLS), Pediatric Advanced Life Support (PALS). (2) Electronic templates to prompt desired responses. (3) Memory devices. (4) Resources such as computer textbooks and online consultation. b) Shared responsibility and accountability: teamwork principles. c) Cognitive forcing strategies: e.g., locking functions, understanding cognitive bias. d) Optimize human performance. (1) Minimize the effect of circadian cycles and sleep deprivation on individuals. (2) Recognize impaired team members. e) Address system problems.
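The normative concepts listed under Content above (prevalence, sensitivity, specificity, predictive values, Bayes' theorem) can be made concrete with a few lines of arithmetic. The sketch below uses the standard 2 × 2 relationships; the test characteristics and prevalence are hypothetical numbers chosen only to illustrate the effect of pretest probability:

```python
# Post-test probability via Bayes' theorem, using the standard 2 x 2
# relationships. Sensitivity, specificity, and prevalence are hypothetical.

def positive_predictive_value(sens, spec, prevalence):
    """P(disease | positive test) = true positives / all positives."""
    true_pos = sens * prevalence
    false_pos = (1 - spec) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

def negative_predictive_value(sens, spec, prevalence):
    """P(no disease | negative test) = true negatives / all negatives."""
    true_neg = spec * (1 - prevalence)
    false_neg = (1 - sens) * prevalence
    return true_neg / (true_neg + false_neg)

# A "90% accurate" test (sens 0.90, spec 0.90) for a 1%-prevalence condition:
ppv = positive_predictive_value(sens=0.90, spec=0.90, prevalence=0.01)
print(round(ppv, 3))  # 0.083
```

Even a seemingly accurate test yields mostly false positives when pretest probability is low — the quantitative core of the threshold-to-test and threshold-to-treat discussion above.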

C. Teaching Methods 1. This is best taught in the context of actual cases. Students should walk through cases in real time to demonstrate common errors. Errors can be pointed out in mortality and morbidity conferences. Bedside instruction can help identify errors before they occur. 2. Students should have the opportunity to participate in the development of strategies to solve problems in the ED. This can include finding simplified approaches to common problems. D. Recommended Reading 1. Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med. 2000; 7:1223–31. 2. Kuhn GJ. Diagnostic errors. Acad Emerg Med. 2002; 9:740–50. 3. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of biases. Acad Emerg Med. 2002; 9:1184–204. 4. Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med. 2003 (in press). 5. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999; 6: 947–52. 6. Elstein AS. Heuristics and biases: Selected errors in clinical reasoning. Acad Med. 1999; 74:791–4.
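The "if, then" rules and clinical protocols listed under heuristics lend themselves to explicit encoding, which makes their thresholds visible, auditable, and teachable. The rule below is deliberately simplified and entirely hypothetical — the criteria, cutoffs, and function name are invented for illustration and are not a validated clinical decision rule:

```python
# A hypothetical "if, then" heuristic encoded as a function. The criteria
# below are invented for teaching purposes, NOT a validated clinical rule.

def needs_expedited_workup(age, abnormal_vitals, high_risk_history):
    """Return True if the (hypothetical) rule says to expedite the workup."""
    if abnormal_vitals:
        return True   # any unstable vital sign triggers the rule outright
    if age >= 65 and high_risk_history:
        return True   # advanced age plus risk factors also triggers it
    return False

print(needs_expedited_workup(age=70, abnormal_vitals=False, high_risk_history=True))   # True
print(needs_expedited_workup(age=30, abnormal_vitals=False, high_risk_history=False))  # False
```

Writing a rule down this explicitly is itself a cognitive forcing strategy of sorts: the cutoffs can be challenged, tested against cases, and revised, rather than living implicitly in individual judgment.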


IV. Learning from the Experience of Others A. Concepts 1. Error is a part of our everyday practice. We should learn from the errors of others to avoid repeating them. 2. Cases where error is common, and perhaps avoidable, are provided. 3. With a few exceptions, this knowledge base is largely anecdotal, although surprisingly uniform across many institutions and over time. This information base is not established by any scientific method but is true to our experience. It is, in fact, part of the secret knowledge that fellow residents, medical students, and clinicians share in their private circles. This list is not intended to be complete but can serve as a starting point for discussion. B. Contents 1. High-risk diagnoses: diagnoses we can't afford to miss in the ED. 2. Critical actions we can't afford to miss. 3. High-risk patients. 4. High-risk moments in the ED. 5. Common injury patterns. 6. Commonly missed radiographic findings. 7. Common malpractice cases.50 8. Causes of preventable iatrogenic cardiac arrest.11 9. Textbook guidelines on pitfalls to avoid.51 10. Mistakes from physicians' perspective.52 11. Tools to avoid errors: memory aids, practice guidelines, mortality and morbidity conferences. C. Teaching Methodology 1. Story-telling is an effective way to share this common knowledge in emergency medicine. 2. Mortality and morbidity conferences provide excellent material to teach common error scenarios. D. Recommended Reading 1. Lancet "Uses of Error" series. 2. Academic Emergency Medicine "Profiles in Patient Safety" series.

V. Complications from Invasive Procedures A. Concepts 1. Complications from invasive procedures are a leading cause of iatrogenic injury.3 2. Understanding risk, and how to avoid complications, should be a part of the general education of every clinician. B. Content 1. Every treatment and procedure has risks and benefits. 2. Approach to minimizing risk. a) Know the indications and alternative options. b) Know the contraindications, relative and absolute. c) Know the possible side effects and complications. d) Prepare the patient to optimize result. e) Modify technique whenever possible to minimize risk. f) Monitor the patient for side effects or complications. g) Be able to recognize a complication should one occur. h) Be prepared and able to treat any side effects and complications. i) The system should support appropriate monitoring and post-procedure care. 3. Risk can be modified by optimizing individual skills and improving system design. C. Teaching Methodology 1. Students should be allowed to analyze an adverse event from a procedure complication, forming ideas about how to avoid, monitor, detect, and rescue from harm. 2. This information can be taught in a small-group format. It can also be applied in the laboratory setting or simulation where procedures are taught and demonstrated. VI. Medical Error from a Systems Perspective A. Concepts 1. Medical care begins with individual care providers but eventually relies on the integrated efforts of a complex network of people and support services.


2. Much of medical error can be attributed to an inadequate infrastructure to support high-tech medical care. System problems probably contribute to most medical errors, even when human error is a factor.22,23,53 3. Improvements in system design may help improve care.22,23,54,55 4. Teamwork failure has been implicated in a significant percent of cases of preventable death and disability.56 5. Medical systems need better information networks for improved system performance. 6. Medication errors and equipment failures contribute to iatrogenic injuries. System reforms can reduce risk.57

B. Content 1. Error models. a) Reason's "Swiss-cheese" model; latent error.22,23 b) System error is random. Use Deming's red bead demonstration.58 2. The system defined. a) The setting (ED, hospital). b) The people (physicians, nurses, pharmacists, technicians, phlebotomists, administrators, etc.). c) Support services (radiology, blood bank, laboratory, etc.). d) The organization (policies and procedures). 3. The system's contribution to error: "latent" error.22,23 In contrast to human error, latent errors: a) Are less visible and less proximate, and thus frequently not seen as causal to errors. b) Affect more patients over a longer period of time, and thus are likely to be a greater contributor to error. 4. The ED as a risk-prone system.45 a) Frequently overloaded, understaffed. b) High acuity. c) Rapid decision making in the face of uncertainty. d) Need for rapid interventions. 5. "High-risk industries" and "high-reliability design."22,23,54,55 a) Aware of risks. b) Designed with risk in mind.

c) Anticipate system failures and provide backup plans. d) Make error "visible." e) Designed for surveillance, rapid detection, and recovery from error. f) Prioritize communication and teamwork. g) High-reliability design features standardization, simplification, and automation, and accounts for human factors such as fatigue and boredom. 6. Teamwork as a component of the system. a) Teamwork failure contributes to poor outcomes.56,59 b) Medical training segregates doctors from nurses, pharmacists from technicians. Graduate medical education emphasizes individual performance with little regard for the real nature of medical delivery.60 c) System reform needs to improve coordination of care and teamwork. 7. Teamwork principles to optimize system performance.28,61,62 a) Less hierarchy. b) Sharing of tasks. c) Cross-checking. d) Emphasis on communication and information sharing. 8. Information networks as a part of the system. a) Need to convey information between teams and across shifts. b) Timely access to patient information and exchange of patient information are essential. c) Lack of an effective information network may contribute to risk. 9. Medication errors account for a significant amount of iatrogenic harm. a) The majority are due to system failures. b) Solutions will require system changes.56,63 10. Equipment failure.64 a) Sophisticated equipment in high-risk settings can pose risk.65 b) Human factors engineering is one approach. (1) Improved design: easy to learn, intuitive to use. (2) "Usability testing."


11. Potential strategies to address the system component of medical errors.56,57,66,67 a) First, recognize that latent error is a significant factor in many medical errors. b) Encourage reporting to identify system flaws. c) Promote teamwork. Teamwork drills and simulated patient settings can be used to improve performance.68 d) Prevent medication errors. e) Improve information technology. f) Address equipment problems. g) Use human factors engineering to optimize human–equipment interfaces.69 h) Future challenges70: the need to report, investigate, innovate, and improve. C. Teaching Methods: This section can be introduced briefly; then students should be encouraged to find a specific system problem to solve. D. Recommended Reading 1. Vaughan D. The dark side of organizations: mistake, misconduct, and disaster. Annu Rev Sociol. 1999; 25:271–305. 2. Adams JG, Bohan JS. System contributions to error. Acad Emerg Med. 2000; 7:1189–93. 3. Leape LL. A systems analysis approach to medical error. J Eval Clin Pract. 1997; 3:213–22. 4. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, 2000.

VII. Living with the Reality of Medical Error A. Concepts 1. The real-life difficulties of dealing with error are addressed.71–74 B. Content 1. The need to acknowledge the error. 2. The need to inform the patient. 3. The need to report. 4. How to cope with error.

C. Teaching Methodology: General group discussion.

D. Recommended Reading 1. Christensen JF, Levinson W, Dunn PM. The heart of darkness: the impact of perceived mistakes on physicians. J Gen Intern Med. 1992; 7:424–31. (An excellent account of how physicians cope with errors, much of it written in physicians' own words. Contrasts those who believe they should be perfect and work harder and longer with those who deny error and attribute it to lack of control over illness.) 2. Wu AW, Folkman S, McPhee SJ, et al. How house officers cope with their mistakes. West J Med. 1993; 159:565–9. 3. Wusthoff CJ. MSJAMA: medical mistakes and disclosure: the role of the medical student. JAMA. 2001; 286:1080–1. 4. American Medical Association. Principles of Medical Ethics. Available at: www.ama-assn.org/ama/pub/category/2512.html. Accessed Aug 2001.

DISCUSSION

This is not a traditional curriculum based on a well-defined or widely accepted body of knowledge. This curriculum primarily teaches concepts and philosophies. We focus less on facts (the few that we have) and more on how to find and apply medical knowledge. We offer less didactic material and focus more on finding strategies for effective change. We promote the learner as a problem-solver. We prioritize teamwork over individual performance. The ideas in this curriculum encourage innovation from both the instructor and the students.

The goal of the curriculum is not so much to educate as to enlighten and motivate. We rely on true insight coming from within the student. If we succeed in training students who can recognize and acknowledge the faults of our system, and help them develop the capacity to solve problems, they may be the ones who ultimately find the solutions we seek.

This curriculum is not intended to simply add to the growing body of medical knowledge and content required in training. Teaching and applying these concepts need not add a burden to those struggling to meet curriculum requirements. The content can easily be incorporated into existing areas of the curriculum. Established lectures on pathophysiology, diagnosis, and treatment can be expanded to include discussion of how to achieve optimal care in a safe environment as free as possible from harm. The principles can be introduced in case conferences, applied in daily bedside decision making, and reinforced in mortality and morbidity conferences and journal clubs. Although a few basic lectures are needed to define terms and introduce safety models, the most essential content can be accomplished by adopting a practice and educational philosophy focused on quality, awareness of the potential for harm, and openness to creative change. Much of the content is best suited to alternative teaching methods such as problem-based learning, team projects, and patient simulation. We recommend expanding the audience to include other specialties and disciplines to cross-fertilize ideas as well as promote communication and teamwork. We encourage the student to find solutions to complex problems. Students should be urged to discover their own fallibilities as well as the weaknesses of the systems in which they work.

The fact that much of this can be accomplished outside traditional didactic formats does not mean that it will be easy. An investment of time and energy, as well as innovative thinking, will be necessary to incorporate these ideas. Additional financial resources may be required to introduce improved methods of instruction such as simulation. Changes in attitude will take time. However, the intent of the curriculum will be accomplished in part if role models and leaders demonstrate commitment to improvement and acceptance of change. Our goals are consistent with the need to teach and demonstrate core competency in practice-based learning and improvement as required by the Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee for Emergency Medicine (RRC-EM).

Although authorities argue that system problems ultimately contribute the most to medical errors, this curriculum is weighted disproportionately toward human error.
This is simply a reflection of what we as physicians are most familiar and comfortable with, and what is best represented in the current medical literature. We acknowledge the role of systems. Beyond that realization, we lack much insight as to which solutions will effect significant change. We have ideas, largely untested and not yet validated. As the field matures, it is likely that our understanding of system flaws and their solutions will grow. We know that teamwork and information technology will likely play major roles in system reform, and we introduce those ideas.

This curriculum is a product of debate. The content will undoubtedly change and evolve as theories are tested and improvements found. What we propose today may be rejected tomorrow. The extent to which this document changes may well reflect our success in achieving meaningful progress. We offer this content as a beginning, as Patient Safety 101, with much anticipation for the next version.

CONCLUSIONS

A curriculum for teaching patient safety and medical error in emergency medicine is presented. Past educational goals have largely focused on the scientific aspects of medicine with less regard for the difficulties in delivering optimal quality. This curriculum offers a basic approach to understanding the nature of medical error and medical injury.

References

1. Croskerry P, Wears RL, Binder LS. Setting the educational agenda and curriculum for error prevention in emergency medicine. Acad Emerg Med. 2000; 7:1194–200.
2. Brennan TA, Leape LL, Laird NM, Hebert L, Localio AR, Lawthers AG. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991; 324:370–6.
3. Leape LL, Brennan TA, Laird N, Lawthers AG, Localio AR, Barnes BA. The nature of adverse events in hospitalized patients: results of the Harvard Medical Practice Study II. N Engl J Med. 1991; 324:377–84.
4. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health System. Institute of Medicine. Washington, DC: National Academy Press, 2000.
5. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press, 2001.
6. Gawande AA, Thomas EJ, Zinner MJ, Brennan TA. The incidence and nature of surgical adverse events in Colorado and Utah in 1992. Surgery. 1999; 126(1):66–75.
7. Fleming ST. Complications, adverse events, and iatrogenesis: classifications and quality of care measurement issues. Clin Perform Qual Health Care. 1996; 4:137–47.
8. Wilson RM, Runciman WB, Gibberd RW, Harrison BT, Newby L, Hamilton JD. The Quality in Australian Health Care Study. Med J Aust. 1995; 163:458–71.
9. Leape LL. Error in medicine. JAMA. 1994; 272:1851–7.
10. Donchin Y, Gopher D, Olin M, et al. A look into the nature and causes of human errors in the intensive care unit. Crit Care Med. 1995; 23:294–300.
11. Bedell SE, Deitz DC, Leeman D, Delbanco TL. Incidence and characteristics of preventable iatrogenic cardiac arrests. JAMA. 1991; 265:2815–20.
12. Leape LL. The preventability of medical injury. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, pp 13–25.
13. Croskerry P, Sinclair D. Emergency medicine: a practice prone to error? Can J Emerg Med. 2001; 3:271–6.
14. Croskerry P. The feedback sanction. Acad Emerg Med. 2000; 7:1232–8.
15. Chisholm CD, Collison EK, Nelson DR, Cordell WH. Emergency department workplace interruptions: are emergency physicians "interrupt-driven" and "multitasking"? Acad Emerg Med. 2000; 7:1239–43.
16. Shepherd A, Kostopoulou O. Fragmentation in care and the potential for human error. In: Johnson C (ed). Proceedings of the First Workshop on Human Error and Clinical Systems. Glasgow Accident Analysis Group Technical Report G99–1. Glasgow: Glasgow Accident Analysis Group, 1999.
17. Pilpel D, Schor R, Benbassat J. Barriers to acceptance of medical error: the case for a teaching programme. Med Educ. 1998; 32:3–7.
18. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health System. Institute of Medicine. Washington, DC: National Academy Press, 2000, pp 210–1. Appendix B: Glossary and Acronyms.
19. Handler JA, Gillam M, Sanders AB, Klasco R. Defining, identifying, and measuring error in emergency medicine. Acad Emerg Med. 2000; 7:1183–8.
20. Meurer S. Patient Safety Term Glossary [e-mail attachment]. The Patient Safety Discussion Forum, item #1728 (Aug 22, 2001). Available at: http://[email protected]/SCRIPTS/WA-NPSF.EXE?A2=ind0108&L=PATIENTSAFETY-L&P=R11365. Accessed Jun 2002.
21. Senders JW, Moray NP. Human Error: Cause, Prediction, and Reduction. Hillsdale, NJ: Lawrence Erlbaum Associates, 1991.
22. Reason J. Human error: models and management. BMJ. 2000; 320:768–70.
23. Reason J. Managing the Risks of Organizational Accidents. Burlington, VT: Ashgate Publishing Company, 1997.
24. Joint Commission on Accreditation of Healthcare Organizations. What Every Hospital Should Know About Sentinel Events. Oakbrook, IL: JCAHO, 2000.
25. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998; 316:1154–7.
26. Brasel KJ, Layde PM, Hargarten S. Evaluation of error in medicine: application of a public health model. Acad Emerg Med. 2000; 7:1298–302.
27. Haddon W Jr. A logical framework for categorizing highway safety phenomena and activity. J Trauma. 1972; 12:193–207.
28. Helmreich RL, Schaefer H. Team performance in the operating room. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, pp 225–53.
29. Helmreich RL. On error management: lessons from aviation. BMJ. 2000; 320:781–5.
30. Leveson NG. Hazard analysis models and techniques. In: Safeware: System Safety and Computers: A Guide to Preventing Accidents and Losses Caused by Technology. Reading, MA: Addison-Wesley Publishing Company, 1995, pp 313–58.
31. VA National Center for Patient Safety. Healthcare Failure Mode and Effects Analysis Course Materials (HFMEA). Available at: http://www.patientsafety.gov/HFMEA.html. Accessed June 2002.
32. DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using healthcare failure mode and effects analysis: the VA National Center for Patient Safety's prospective risk analysis system. Jt Comm J Qual Improv. 2002; 27:248–67.
33. Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med. 2000; 7:1223–31.
34. Kassirer JP, Kopelman RI. Learning Clinical Reasoning. Baltimore: Williams & Wilkins, 1991.
35. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999; 6:947–52.
36. Kassirer JP, Kopelman RI. Use and interpretation of diagnostic tests. In: Learning Clinical Reasoning. Baltimore: Williams & Wilkins, 1991, pp 17–28.
37. McNeil BJ, Keeler E, Adelstein SJ. Primer on certain elements of medical decision making. N Engl J Med. 1975; 293:211–5.
38. Sackett DL, Straus SE, Richardson WS, Rosenberg W, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. 2nd ed. New York, NY: Churchill Livingstone, 2000.
39. Corrall CJ, Wyer PC, Zick LS, Bockrath CR. Evidence-based emergency medicine. How to find evidence when you need it, Part 1: databases, search programs, and strategies. Ann Emerg Med. 2002; 39:302–6.
40. Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med. 1999; 74:791–4.
41. Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press, 1982.
42. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974; 185:1124–31.
43. Redelmeier DA, Ferris LE, Tu JV, Hux JE, Schull MJ. Problems for clinical judgment: introducing cognitive psychology as one more basic science. Can Med Assoc J. 2001; 164:358–60.
44. Simon HA. Reason in Human Affairs. London: Basil Blackwell, 1983.
45. Croskerry P. Avoiding pitfalls in the emergency room. Can J Contin Med Educ. 1996; Apr:1–10.
46. Elstein AS, Holzman GB, Ravitch MM, et al. Comparison of physicians' decisions regarding estrogen replacement therapy for menopausal women and decisions derived from a decision analytic model. Am J Med. 1986; 80:246–58.
47. Gruppen LD, Margolin J, Wisdom K, Grum CM. Outcome bias and cognitive dissonance in evaluating treatment decisions. Acad Med. 1994; 69(10 suppl):S57–S59.
48. Tversky A, Kahneman D. Availability: a heuristic for judging frequency and probability. Cogn Psychol. 1973; 5:207–32.
49. Croskerry P. Cognitive forcing strategies in clinical decision making. Presented at the annual meeting of the Society for Academic Emergency Medicine, Philadelphia, PA, May 2001.
50. Freeman L, Antill T. Ten things emergency physicians should not do, unless they want to become defendants. American College of Emergency Physicians, Foresight: Risk Management for Emergency Physicians. Sept 2000; 49:1–11.
56. Harwood-Nuss A, Wolfson AB, Linden CH, Shepherd SM, Stenklyft PH (eds). The Clinical Practice of Emergency Medicine. 3rd ed. Philadelphia: Lippincott Williams and Wilkins, 2001.
57. Ely JW, Levinson W, Elder NC, Mainous AG 3rd, Vinson DC. Perceived causes of family physicians' errors. J Fam Pract. 1995; 40:337–44.
58. McClanahan S, Goodwin ST, Houser F. A formula for errors: good people + bad systems. In: Spath PL (ed). Error Reduction in Health Care: A Systems Approach to Improving Patient Safety. San Francisco, CA: Jossey-Bass, 2000, pp 1–15.
59. Sagan SD. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, NJ: Princeton University Press, 1993.
60. Kohn LT, Corrigan JM, Donaldson MS. Creating safety systems in health care organizations. In: Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, 2000, pp 155–201.
61. Risser DT, Rice MM, Salisbury ML, Simon R, Jay GD, Berns SD. The potential for improved teamwork to reduce medical errors in the emergency department. Ann Emerg Med. 1999; 34:373–83.
62. Spath PL. Reducing errors through work system improvements. In: Spath PL (ed). Error Reduction in Health Care: A Systems Approach to Improving Patient Safety. San Francisco, CA: Jossey-Bass, 2000, pp 199–234.
63. Lightning Calculator. The Red Bead Experiment. Available at: http://www.qualitytng.com/REDBEAD.HTM. Accessed Feb 2002.
64. Bhasale A. The wrong diagnosis: identifying causes of potentially adverse events in general practice using incident monitoring. Fam Pract. 1998; 15:308–18.
65. West E. Organisational sources of safety and danger: sociological contributions to the study of adverse events. Qual Health Care. 2000; 9:120–6.
66. Risser DT, Simon R, Rice MM, Salisbury ML. A structured teamwork system to reduce clinical errors. In: Spath PL (ed). Error Reduction in Health Care: A Systems Approach to Improving Patient Safety. San Francisco, CA: Jossey-Bass, 2000, pp 235–78.
67. Marsch SCU, Harms C, Scheidegger DH. Enhancing team performance. In: Vincent C, De Mol B (eds). Safety in Medicine. Oxford, UK: Elsevier Science Ltd., 2000, pp 139–53.
68. Bates DW. Using information technology to reduce rates of medication errors in hospitals. BMJ. 2000; 320:788–91.
69. Hyman WA. Errors in the use of medical equipment. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, pp 327–47.
70. Casey S. Set Phasers on Stun and Other True Tales of Design, Technology, and Human Error. 2nd ed. Santa Barbara, CA: Aegean Publishing Company, 1998.

71. Hale A. Approaching safety in healthcare: from medical errors to healthy organizations. In: Vincent C, de Mol B (eds). Safety in Medicine. Amsterdam: Pergamon, 2000, pp 247–61.
72. Nolan TW. System changes to improve patient safety. BMJ. 2000; 320:771–3.
73. Small SD, Wuerz RC, Simon R, Shapiro N, Conn A, Setnik G. Demonstration of high-fidelity simulation team training for emergency medicine. Acad Emerg Med. 1999; 6:312–23.
74. Stahlhut RW, Gosbee JW, Gardner-Bonneau DJ. A human-centered approach to medical informatics for medical students, residents, and practicing clinicians. Acad Med. 1997; 72:881–7.
75. Woods D. Moving Forward on Patient Safety: Inquiry, Innovation, and Learning. Available at: http://www.uth.tmc.edu/schools/sahs/SpeakerSeries/woods2-99.html. Accessed Feb 2002.
76. Christensen JF, Levinson W, Dunn PM. The heart of darkness: the impact of perceived mistakes on physicians. J Gen Intern Med. 1992; 7:424–31.
77. Wu AW, Folkman S, McPhee SJ, Lo B. How house officers cope with their mistakes. West J Med. 1993; 159:565–9.
78. Wusthoff CJ. MSJAMA: medical mistakes and disclosure: the role of the medical student. JAMA. 2001; 286:1080–1.
79. American Medical Association. Principles of Medical Ethics. Available at: www.ama-assn.org/ama/pub/category/2512.html. Accessed Aug 2001.