Human Error
Tobias Schwarz
Siemens AG, Corporate Technology, CT T DE IT2, Otto-Hahn-Ring 6, 81739 Munich, Germany
Tel.: +49 (89) 636-49653, Fax: +49 (89) 636-49428
[email protected]

Flavius Kehr
University of Koblenz-Landau, Campus Landau, Fachbereich 8: Psychologie, Fortstraße 7, 76829 Landau, Germany
[email protected]
Working student, Siemens AG
Version 1.0
1 Contents

1 Contents
2 Introduction
3 Demands on the operator
4 Classification of Human Error
5 Human Error and incidents
6 Conclusion
7 Literature
2 Introduction
Particularly in the context of serious incidents with fatal consequences, operators are often portrayed as the cause of the disaster (e.g. Dambeck, 2006). Yet what does the word "mistake" actually imply, and which conditions have to be met for an operator's mistake to contribute to an incident? After all, errors are part of human nature (Noyes & Bransby, 2001). Psychological research on action planning and execution has dealt with the causes and consequences of human error in the workplace for a long time. In the following, its findings are briefly outlined.
3 Demands on the operator
The main task of the operator is to monitor the system's mode of operation and to intervene if it does not work correctly. In order to recognize and eliminate dangerous situations early, the operator should act in a planning, foresighted way. Additionally, because complex dynamic systems behave autonomously even without intervention, the operator needs high flexibility, actions adapted to the situation, and the capacity to distribute cognitive resources across multiple, simultaneously occurring tasks and subtasks (Bainbridge, 1997). Especially in fault situations, not least because the data provided by the system can be contradictory or uncertain, the operator must be able to solve complex problems. A rationally acting operator would, considering his knowledge and his own conceptual understanding of the system, choose the option with the lowest risk (Wittenberg, 2000). However, improper workplace design, information overload in a concrete situation, and faulty cognitive processes (according to Dörner (2003), e.g. monocausal thinking, reduction of complexity, the illusion of extrapolability) can contribute to fallible actions (Grams, 1998). It is therefore essential to design workplaces and systems that do not add to "cognitive overload" but instead intercept fallible cognitive processes. For this, it is necessary to be able to understand and classify defective cognitive processes.
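To make the notion of the "rationally acting operator" more concrete, the following minimal sketch treats each possible intervention as a set of outcomes with estimated probabilities and damage costs and picks the option with the lowest expected risk. This is an illustrative assumption added here, not a model taken from Wittenberg (2000); all option names and numbers are invented.

# Minimal illustration of "choose the option with the lowest risk":
# expected risk = sum(probability * damage) over the possible outcomes of an
# option. All option names, probabilities and costs below are invented.

def expected_risk(outcomes):
    """outcomes: list of (probability, damage_cost) pairs for one option."""
    return sum(p * damage for p, damage in outcomes)

def pick_lowest_risk(options):
    """Return the name of the option with the smallest expected risk."""
    return min(options, key=lambda name: expected_risk(options[name]))

if __name__ == "__main__":
    options = {
        "shut down the plant":   [(1.0, 50_000)],                 # certain production loss
        "reduce load and check": [(0.9, 5_000), (0.1, 200_000)],  # usually cheap, small chance of damage
        "ignore the alarm":      [(0.7, 0), (0.3, 1_000_000)],    # gamble on a false alarm
    }
    # Expected risks: 50 000, 24 500 and 300 000 respectively.
    print(pick_lowest_risk(options))  # -> "reduce load and check"

In practice, of course, the probabilities and damages are exactly the quantities that are uncertain or contradictory in a fault situation, which is why the factors discussed above can push the operator away from this idealized calculation.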
4 Classification of Human Error
In the model of Norman (1986), which divides action planning and execution into seven steps (see fig. 1), cognitive or motor misinterpretations are theoretically conceivable at every stage. The literature, however, fundamentally distinguishes three types of failure, occurring at different stages of the action cycle. Following Reason (1990), these are slips, lapses and mistakes. Slips and lapses are failures on the level of action execution (failures of attention, and thus faulty perception of the environment, or failures in recalling action sequences). Norman (1988) gives the following scenario as a typical "slip":
"My office phone rang. I picked up the receiver and bellowed 'Come in' at it." In contrast, failures in the planning of actions are referred to as "mistakes": although the planned action is carried out correctly, the cognitive steps that preceded it were flawed.
fig. 1. Planning and course of action (Norman, 1986)
The SRK framework elaborated by Rasmussen (1987) describes the kinds of actions operators have to be able to execute at work. "Skill-based behavior" comprises all actions that run semi- or fully automatically because the skills needed for their execution are available and internalized by the operator. While performing such behavior, cognitive resources can be freed for more demanding operations (e.g. problem solving). "Rule-based behavior", in contrast, describes actions based on externally prescribed rules; the operator does not need any knowledge of his own in order to execute them. Emergency plans, which have to be worked through step by step in a dangerous situation, fit perfectly into this category. Finally, "knowledge-based behavior" refers to the generation of action plans based on implicit and explicit knowledge about the system and the process. Especially in new and unknown situations, the operator has to generate actions from his existing knowledge in order to eliminate difficulties; this cluster of actions therefore requires the most cognitive resources. Reason (1990) maps the error types introduced above (slips, lapses, mistakes) onto the operator actions described in Rasmussen's (1987) model. In this mapping, slips and lapses are assigned to skill-based behavior.
Thus, errors occurring during the execution of internalized actions belong here: although the person in the previous example has a script available that encodes how to answer a phone call in the commonly accepted way, an error occurs while executing the action. Norman (1988) likewise describes slips as errors in the execution of semi- or fully automatic activities, i.e. actions that do not have to be controlled consciously. "Rule-based behavior" and "knowledge-based behavior" in Rasmussen's (1987) model, by contrast, are associated with mistakes in Reason's (1990) classification: if the execution of actions depends on rules (rule-based behavior) or knowledge (knowledge-based behavior), errors can occur in the planning processes. Here, common biases and heuristics known from cognitive science and psychology play an important role. Reason's (1990) classification of mistakes is shown in fig. 2.
fig. 2. Classification of mistakes (Reason, 1990)
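As a compact illustration of this mapping, the following sketch pairs each SRK behavior level with the error types assigned to it in the text above. It is a simplified summary added here for illustration, not a reproduction of fig. 2.

# Simplified summary of the mapping described above: Reason's (1990) error
# types assigned to Rasmussen's (1987) SRK behavior levels. Illustrative only.

ERRORS_BY_BEHAVIOR_LEVEL = {
    "skill-based":     ("slip", "lapse"),  # failures while executing internalized actions
    "rule-based":      ("mistake",),       # wrong or misapplied rule during planning
    "knowledge-based": ("mistake",),       # faulty plan generated from (incomplete) knowledge
}

def error_types(behavior_level):
    """Return the error types associated with an SRK behavior level."""
    try:
        return ERRORS_BY_BEHAVIOR_LEVEL[behavior_level]
    except KeyError as exc:
        raise ValueError(f"unknown SRK level: {behavior_level!r}") from exc

if __name__ == "__main__":
    print(error_types("skill-based"))  # -> ('slip', 'lapse')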
Deeper causes of errors in action planning and execution may also lie in other psychological variables: the efficiency of an action certainly depends on the personality traits and current mood of the operating person. Particularly for errors caused by the behavior of multiple persons, their relationships and social interactions contribute substantially as well (Reason, 1990). In everyday life, however, the most common errors are slips (Norman, 1988).
5 Human Error and incidents
It has become clear that errors in action planning and execution emerge particularly in complex situations in which uncertain, possibly contradictory information is presented. In the context of control rooms, however, not every error causes an incident (Noyes, 2001). When considering human errors and their consequences in enterprises, it is not only the operator's error in a concrete situation that matters: latent failures, e.g. at management level, can cause or facilitate fallible actions in operating cycles. They should therefore also be kept in mind when dealing with the concept of a holistic workspace (Reason, 1990). Reason (1990) captures this multi-determined character of incidents in a "swiss cheese model" (fig. 3): an incident only occurs when latent or active failures emerge at every level of organizational, inter- and intraindividual structures; hence, a single error of an operator can never be the sole cause of an accident.
fig. 3. How errors lead to accidents (Reason, 1990)
All phases ("loop holes") have to be traversed for an accident to happen, and each phase offers only a minimal loop hole with a low probability of occurrence. In detail, the phases are:
- Fallible decisions: erroneous decisions at management level, e.g. spending too little on safety issues.
- Line management deficiencies: fallible decisions at management level manifest themselves at the level of team leaders, foremen, department heads etc., leading to an accumulation of latent erroneous leadership behavior in the enterprise.
- Psychological precursors of unsafe acts: leadership problems can put staff into a negative psychological state. Deficits in human resources development can, for example, amplify excessive time pressure, high workload or motivational problems. Psychological precursors of unsafe acts can, however, also be influenced by other factors (for instance a life crisis of an employee). Distinctive leadership shows itself in the competence to anticipate such precursors and to react to them adequately in advance.
- Unsafe acts: in a concrete, potentially dangerous situation, an error is committed. The probability of an unsafe act rises if latent failures exist on the previous levels.
- Inadequate defenses: unsafe acts can only lead to an incident if they are defended against inadequately. An accident only occurs if all defense strategies collapse, i.e. if defense possibilities are non-existent or non-functional, or if they fail in the concrete situation.

A holistic view of the topic "control rooms" should take these correlations and interactions into account in order to create a system that is able to anticipate and react adequately to different situations, particularly on the level of defenses; a simple probabilistic reading of this layered structure is sketched below.
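As a rough quantitative reading of the layered picture above, one can think of each layer as having a small probability of being breached; an accident then requires all layers to fail at once. This is an illustrative assumption added here (Reason, 1990, presents the model qualitatively), and the independence of the layers as well as all numbers below are invented for the sketch.

# Illustrative reading of the "swiss cheese" picture: an accident requires a
# breach at every layer. Assuming, for illustration only, that breaches are
# independent, the accident probability is the product of the per-layer breach
# probabilities. Layer names follow fig. 3; the numbers are invented.

from math import prod

LAYER_BREACH_PROBABILITY = {
    "fallible decisions (management)":         0.05,
    "line management deficiencies":            0.10,
    "psychological precursors of unsafe acts": 0.20,
    "unsafe act in the concrete situation":    0.10,
    "inadequate or failed defenses":           0.02,
}

def accident_probability(layers):
    """All layers must be breached at once for an accident to occur."""
    return prod(layers.values())

if __name__ == "__main__":
    p = accident_probability(LAYER_BREACH_PROBABILITY)
    print(f"accident probability: {p:.2e}")  # 2.00e-06 with the numbers above
    # Closing even one latent hole (e.g. reducing the line-management value to
    # 0.01) lowers the overall risk by another order of magnitude.

The sketch also makes the closing point of the list tangible: strengthening any single layer, especially the defenses, multiplies down the overall accident probability, whereas accumulating latent failures multiplies it up.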
6 Conclusion
Errors in the planning and execution of operators' actions can lead to "human error". The probability of such an error increases when external factors (e.g. information overload, improper workplace design) come into play. However, further failures at the organizational, inter- and intraindividual level are necessary for an operator's error to have severe consequences. A system based on a holistic view of the topic "control rooms" should therefore offer adequate defense strategies in order to prevent accidents.
7 Literature
Bainbridge, L. (1997). The change in concepts needed to account for human behavior in complex dynamic tasks. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 27, 351-359.

Dambeck, H. (2006, April 21). Tschernobyl-GAU: In der Atomkraft gilt Murphys Gesetz. Retrieved December 11, 2009, from http://www.spiegel.de/wissenschaft/mensch/0,1518,409579,00.html

Dörner, D. (2003). Die Logik des Mißlingens. Strategisches Denken in komplexen Situationen (8th ed.). Hamburg: Rowohlt.

Grams, T. (1998). Bedienfehler und ihre Ursachen (Teil 1). Automatisierungstechnische Praxis (atp), 40(3), 53-56.

Norman, D. A. (1986). Cognitive Engineering. In S. W. Draper & D. A. Norman (Eds.), User Centered System Design: New Perspectives on Human-Computer Interaction (pp. 31-61). Hillsdale, NJ: Lawrence Erlbaum Associates.

Norman, D. A. (1988). The Design of Everyday Things. New York: Basic Books.

Noyes, J. M., & Bransby, M. (2001). People in Control. London: Institution of Engineering and Technology.

Rasmussen, J. (1987). Skills, rules, and knowledge; signals, signs, and symbols, and other distinctions in human performance models. In System Design for Human Interaction (pp. 291-300). New York: IEEE Press.

Reason, J. (1990). Human Error. Cambridge: Cambridge University Press.

Wittenberg, C. (2000). Virtuelle Prozessvisualisierung am Beispiel eines verfahrenstechnischen Prozesses. Düsseldorf: VDI.