ACADEMIC EMERGENCY MEDICINE • November 2000, Volume 7, Number 11
Beyond Error
The publication of the Institute of Medicine report on the problem of medical errors in November 1999 [1] unleashed a dramatic reaction in the press, in government, and within the health professions. Much of the shrillness of this reaction was due to the use of the term "error," an emotional and value-laden word that obscures more than it reveals when applied to the analysis of adverse events in health care (or any other field). It could be argued that this was a tactical necessity—that the report would have sunk unnoticed, like so many before it, had it not been expressed in a dramatic form. And some very good things have resulted from the report. Federal funding for research and implementation efforts in patient safety seems likely to increase, and professional bodies, including SAEM and ACEP, have taken steps to begin paying systematic and ongoing attention to safety issues in emergency care. Academic Emergency Medicine hosted its first consensus conference on the topic this year, bringing leaders in teaching and research in emergency care together with experts from the worlds of cognitive psychology, group and team dynamics, and organizational behavior for the first time (a list of conference participants appears at the end of this issue of AEM).

However, it is also apparent that the dramatic character of the report and the emotional impact of the term "medical errors" have led to controversy and a backlash [2–4]. It is therefore now time to move beyond the concept of "human error." It has served a limited purpose in drawing attention to a hitherto neglected problem, but it will become an impediment to progress if we do not drop it from our lexicon and purge it from our discourse.
The word "error" is a retrospective judgment about human performance based on outcome, not an identifiable, prospective, scientifically separable, or qualitatively distinct category of performance. This judgment is seldom considered for the same acts when the outcome is good; in fact, there is evidence that acts (or omissions) are labeled inappropriate when associated with a bad outcome and not so labeled when the outcome is good [5]. In complex systems such as health care, it is invariably the case that multiple factors participate in the "incubation chain" leading to an adverse event [6]. Frequently, five to ten factors interact, none of which alone is sufficient to produce a poor outcome, but all of which are necessary. We choose to call some of these factors "errors," to call some "contributing conditions," and to ignore others by a process of custom and social attribution, not because of the intrinsic nature of the acts or omissions.

How does calling some aspects of human performance "errors" help us understand failures in complex systems? The evidence from other fields suggests it does not. Cook and Woods [7] have argued that such labeling "... retards rather than advances our understanding of how complex systems fail and the role of human practitioners in both successful and unsuccessful operations." Reason, and Vincent and colleagues, have pointed out that frontline workers are "the inheritors, not the instigators" of errors; thus, human error is a symptom or, even more strongly, a consequence of system problems, not a cause of undesired outcomes [8–10]. Their view is that systems call forth from workers behaviors that are later called errors, not
the other way around [11]. To paraphrase Rochlin [12], for a caregiver in an emergency, unsure of context and pressed into action only when something has already gone wrong, with an overabundance of some data but missing the rest, and under pressure to act quickly, avoiding a mistake may be as much a matter of good luck as good training.

Two corollaries follow from these arguments. The first is that a conclusion that "human error" caused an adverse event is itself evidence of an inadequate investigation. The second is that solutions amounting to admonitions to "be more careful" or to "have a high index of suspicion" simply guarantee that the same bad outcome will occur again, only with different people the next time.

What is needed is a shift in focus. Instead of talking about reducing errors in emergency care, we should concentrate on reducing adverse events by enhancing human performance in the emergency department. While it is true that human error can lead to adverse outcomes, those acts or omissions we sometimes call errors are themselves the evil spawn of poor systems, information overload, task ambiguity, conflicting goals, cognitive biases, awkward automation, resource or staffing shortages, fatigue, tradeoffs, and a host of other factors. This shift has the additional advantage of turning our attention to protecting patients from injury and away from protecting doctors from making mistakes. After all, patients are not so much interested in avoiding medical errors as in avoiding adverse events of any kind, whatever their cause. It also allows us to illuminate a hitherto neglected aspect of caregiver performance: how people make things "go right" by interrupting the propagation of the multiple small failures and latent errors that are invariably
present and trying to make things "go wrong" [13].

Health care is fond of aviation analogies, but Gaba has pointed out an important distinction between the two fields, in that physicians and nurses, in contrast to pilots, are by definition working on flawed systems—the pathologic physiology of the sick and injured [14]. This makes Cook and Woods' observation even more remarkable: "It is not surprising that human operators occasionally should be unable to extract good outcomes from the conflicted and contradictory circumstances in which they work. The surprise is that they are able to produce good outcomes as often as they do" [7]. By moving beyond the outmoded concept of error and concentrating on enhancing performance, we can make that less of a surprise.

— ROBERT L. WEARS,
MD, MS ([email protected]), Department of Emergency Medicine, University of Florida, Jacksonville, FL

Key words: errors; emergency medicine; outcomes; adverse events.
References

1. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health System. Report of the Institute of Medicine. Washington, DC: National Academy Press, 1999.
2. Brennan TA. The Institute of Medicine report on medical error—could it do harm? N Engl J Med. 2000; 342:1123–5.
3. McDonald CJ, Weiner M, Hui SL. Deaths due to medical errors are exaggerated in Institute of Medicine report. JAMA. 2000; 284:93–5.
4. Leape LL. Institute of Medicine medical error figures are not exaggerated. JAMA. 2000; 284:95–7.
5. Caplan RA, Posner KL, Cheney FW. Effect of outcome on physician judgments of appropriateness of care. JAMA. 1991; 265:1957–60.
6. Turner B. Man-made Disasters. London, UK: Wykeham Publications, 1978.
7. Cook RI, Woods DD. Operating at the sharp end: the complexity of human error. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, pp 255–310.
8. Reason J. Managing the Risks of Organizational Accidents. Aldershot, UK: Ashgate Publishing, 1997.
9. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998; 316:1154–7.
10. Vincent C, Taylor-Adams S, Chapman EJ, et al. How to investigate and analyse clinical incidents: clinical risk unit and association of litigation and risk management protocol. BMJ. 2000; 320:777–81.
11. Moray N. Error reduction as a systems problem. In: Bogner MS (ed). Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates, 1994, pp 67–91.
12. Rochlin GI. Trapped in the Net: The Unanticipated Consequences of Computerization. Princeton, NJ: Princeton University Press, 1997.
13. Perrow C. Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press, 1999.
14. Gaba D. Re: Aviation fatality data. http://www.bmj.com/cgi/eletters/319/7203/136#EL5, accessed 1/21/2000.
Erratum

Four authors' names were inadvertently omitted from abstract 045 in the May 2000 issue of Academic Emergency Medicine (Graeme KA. Dimethyl sulfoxide (DMSO) does not prevent α-amanitin-induced hepatic injury in mice [abstract]. Acad Emerg Med. 2000; 7:440–1). The missing names are Thomas Higgins, Steven Curry, Christine Reagan, and Jana Lee.