Special Issue

Situation Awareness: Its Applications Value and Its Fuzzy Dichotomies

Christopher D. Wickens, Alion Science, Boulder, and Department of Psychology, Colorado State University

This article provides a strong endorsement of the utility of the situation awareness (SA) construct in applications and agreement with the fallacies about the construct identified by Endsley. I then highlight three aspects of the construct for discussion: the distinction between SA and decision/choice/action, the distinction between long-term and working memory (related to bandwidth of change), and the importance and challenges of Level 3 SA.

Keywords: situation awareness, topics, information processing, memory

The situation awareness (SA) construct is, in my opinion, one of the most important constructs in engineering/applied psychology to emerge in the 65 years since our discipline began after World War II. While it invokes theory and theoretical constructs, it is less firmly anchored as a theory than as a construct model that has a very important role to play in system design, mishap analysis, and training. Of course, the very breadth of the construct sometimes endangers its theoretical status, in that theories tend to be narrow and focused. But that very breadth calls for delineations, and I provide two of these below. To recapitulate Endsley's classic definition, SA is "knowing what's going on" and, more formally, "the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future." To paraphrase Tenney and Pew (2007), it is "what? so what? and now what?"

Address correspondence to Christopher D. Wickens, Alion Science, 4949 Pearl E. Circle, Boulder, CO 80301, [email protected]. Journal of Cognitive Engineering and Decision Making, 2015, Volume 9, Number 1, March 2015, pp. 90-94. DOI: 10.1177/1555343414564571. Copyright © 2015, Human Factors and Ergonomics Society.

I would like to make two general points about the construct that relate closely to the focus of Endsley's article in stating fallacies and showing their fallacious nature. First, SA cannot easily be defined or discussed in the abstract, devoid of context, any more than we can talk about "a situation" without saying what that situation is. This is in contrast to the workload construct (with which SA is often compared; Wickens, 2000, 2002). One can speak of workload being high without necessarily describing the source of that workload, because the impacts of high workload on other tasks, or on mental stress, are exerted somewhat independently of the source. In contrast, when one speaks of SA as being low, it is far more necessary to specify "awareness of what"; otherwise the implications of this observation for performance are somewhat meaningless. Second, because of its functional value, the focus should not be on "proving" or "disproving" a particular theory of SA (as many of the critics reviewed by Endsley endeavor to do), but rather on establishing its degree of applicability to real-world problems, and particularly the factors that may mitigate that degree of applicability. Good applied psychology should not impose standards of "right or wrong" but rather guidelines for greater or lesser relevance to a particular context. Allowing a certain fuzziness redirects attention away from proving the construct right or wrong and toward its utility in applications.

Two Fuzzy Dichotomies

Endsley rightly and appropriately identifies the fallacies of critics of SA and offers compelling counterarguments. I shall not repeat those here, other than the brief references above. Rather, I would like to elaborate on two of the "fuzzy dichotomies" below, in some sense repeating points made in Wickens (2008). I do so because I firmly believe that these points bear repetition or, in the case of the third, bear highlighting.

Figure 1. The SA construct is positioned at the intersection of the two fuzzy dichotomies of memory types and of stages of information processing.

Figure 1 presents my two-dimensional representation of SA, consisting of two "fuzzy dichotomies": SA versus action choice, and high- versus low-bandwidth changes. Representing the first of these, across the top is an abstracted "stages of processing" representation (Wickens & Carswell, 2012). Although several have criticized this as an "information processing approach," I am quite confident in advocating such an approach, and in doing so there is no reason at all to assume that stages must run in a strictly left-to-right fashion; as in many real-world renderings, sometimes action clearly does drive perception (even though I have not depicted the feedback loop here). But it is still the case in many cognitive systems that the assessment of a situation does, more often than not, precede the decision of what to do about it (Hoffman, Crandall, & Shadbolt, 1998). I argue that SA is directly lodged within perception-cognition, not within action selection, choice, or decision making. Human performance is a product of both stages, but SA is not.

I highlight this distinction because I have found that many people conflate the concepts in speaking of SA as "performance." I offer three examples of how this distinction between awareness or understanding, on the one hand, and action, on the other, appears in nature and in human-systems interaction. The dichotomy of stages between "what is" and "what should be done" is well validated in a number of arenas: (a) There are distinct brain regions associated with memory and understanding, different from those associated with action. In speech, Wernicke's area, aft of the central sulcus, is associated with the former, whereas Broca's area, forward of it, is associated with the latter. (b) In decision analysis, situation assessment or diagnosis is typically viewed as separate from choice. The distinction is critical because situation assessment may be viewed as relatively "value free." That is, there is often a defined existing situation, and the human's assessment can be objectively evaluated as "right or wrong" relative to that standard. In contrast, this is far from true of action choice because, in nearly every model of choice, accuracy depends upon the values imposed by the decider. And values are usually subjective, prohibiting one from establishing any clear scale of correctness (unless it is defined with respect to a particular set of expressed values).


(c) Extending from decision analysis to automation, there is a clear distinction between automation as a situation assessor and automation as a "course of action" recommender. Nowhere is this seen more clearly than in medicine (Garg et al., 2005; Morrow, Wickens, & North, 2006), but it is also quite evident in aviation, where conflict assessment systems are quite distinct from conflict resolution advisories. Furthermore, the consequences of imperfect automation are more severe for systems that recommend action than for those that assess a situation (Onnasch, Wickens, Li, & Manzey, 2014). Finally, it is important to note that, although SA relates only to the product of perception/cognition, this does not mean that the product of updating SA is not reliant upon the process of action. As Endsley highlights, one can decide whether or not to seek more information (to improve SA), and one can certainly execute that decision by the actions of information seeking. Thus, action can support SA, just as, typically, SA supports action. This is the cycle of which Endsley writes.

The second dichotomy, between static and dynamic knowledge, is somewhat fuzzier but is closely related to the distinction between long-term memory (LTM) and long-term working memory (LTWM) (Durso, Rawson, & Girotto, 2007; Ericsson & Kintsch, 1995). This "dichotomy" is obviously fuzzy because the rate of change of a situation (its bandwidth) can range continuously from seconds (as in the attitude of an aircraft) to minutes (the progress of a forest fire or the movement of a tornado) to hours (the track of a hurricane) and even days. The dichotomization is imposed by the two different memory systems, LTM and LTWM. At the extreme static end, we can say that the knowledge about a situation is really stored in LTM and needs to be altered little if at all. Plans can often be made on the basis of this static situation assessment and carried out without concern for the evolution of the situation.
(Of course, even here there are exceptions, a fact that dictates tolerance for fuzziness.) Thus, the necessary knowledge for situation assessment can be eminently retrievable from LTM if necessary; and if this knowledge is wrong or unavailable, these phenomena are quite different from a "loss of SA" and belong more in the domains of classic memory, forgetting, schema, and static mental models. (Note, however, that this statement does not imply that mental models and schema are not used in the process of updating SA, a point Endsley clearly makes.) In fact, the closest application of SA to static knowledge is in the initial acquisition of knowledge about a new situation: for example, a company auditor coming on board to rapidly assess the financial health of the company. But once obtained, this knowledge is likely to remain static. Hence, this may be better described as situation assessment than as maintaining SA.

At the other, hyperdynamic end of the bandwidth continuum, SA is highly applicable: As elements change, the changes must be noticed (Level 1), and both their present meaning (Level 2) and future implications (Level 3) should be understood. Three cognitive or memory constructs have different roles to play here. First, working memory is obviously critical, and it is vital for both novices and experts. Second, as noted, LTM is of little immediate value other than the permanent knowledge it contains, which embodies the mental models and schema that can facilitate integration of arriving information. LTM of the situation itself is of lesser value because the fast-changing status of a situation outstrips both the slow time it takes for long-term memories to be acquired and the long time it takes to forget or "erase" them. Third, and most vital for the SA concept, is LTWM (Ericsson & Kintsch, 1995). Although Endsley does not explicitly call out this construct, her writing clearly alludes to it as a vehicle for rapid retrieval of information that may not truly exist in rehearsable working memory and an "…integrated relationship between short term working memory and LTM systems that SA relies upon." Thus, Endsley's development of the SA concept is fully consistent with the fuzzy dichotomy between the two sets of memory systems (LTM, on the one hand, and the joint system of LTWM and WM, on the other), as this dichotomy correlates in its relevance with the bandwidth of the environment.

Levels of SA and the Importance of Projection

Much has been written about the three levels of SA, and I wish only to add a brief observation about Level 1 and to highlight my thoughts on Level 3 in more detail.

Regarding Level 1, I prefer and often use the more focal term "noticing" rather than "perception." Noticing connotes perception of events that occur over time and hence is more descriptive of the dynamic world in which SA resides as a critical function. Perception is much less specific and can be thought of as an ongoing function that takes place almost whenever we are conscious, whether the environment around us is static or dynamic. (We perceive when we read a book, look at a picture, or admire a scene, but these hardly seem to be exemplars of SA.)

Regarding Level 3, I want to highlight its vital but often neglected and underappreciated role. In dynamic systems, SA determines the effectiveness of a response to environmental disturbances, particularly those that are unexpected (Wickens, 2000). But if that response takes time (even a few seconds), as it does in many systems (the delay in cooling an overheated tank, in moving personnel around to combat a threat, or in an aircraft's response to an ATC instruction), then the response to the current, understood state (Level 2 SA) may be of little relevance. What counts entirely is the response to the future, predicted, or projected state (Level 3 SA), because by the time that response has had time to play out (through the system lag) and change the system state, the current state at which it was issued will be irrelevant; that is, what will be is more important than what is. Hence, in many systems, Level 3 SA has been found to be more predictive of performance than the lower levels (Ma & Kaber, 2007; O'Brien & O'Hare, 2007; Sulistyawati, Wickens, & Chui, 2011). Furthermore, other researchers have found that Level 3 questions are harder to answer (lower accuracy; Durso, Bleckley, & Dattel, 2006; Sulistyawati et al., 2011).
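One quantitative way to see why projection is both hard to do and hard to score is that, in a stochastic process, the correlation between the present state and a future state decays as the look-ahead time grows, so even an optimal predictor degrades at longer lags. The following sketch is my own illustration, not from the article: it simulates a first-order autoregressive process (an assumed toy model of a dynamic environment), where the lag-k autocorrelation is theoretically phi**k.

```python
import random

def simulate_ar1(phi, n, seed=0):
    """Generate n samples of an AR(1) process: x[t+1] = phi * x[t] + noise."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, 1.0)
        series.append(x)
    return series

def lag_correlation(series, lag):
    """Pearson correlation between series[t] and series[t + lag]."""
    a, b = series[:-lag], series[lag:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = sum((u - ma) ** 2 for u in a)
    vb = sum((v - mb) ** 2 for v in b)
    return cov / (va * vb) ** 0.5

series = simulate_ar1(phi=0.9, n=50_000)
for lag in (1, 5, 20):
    # Empirical values should track the theoretical phi**lag,
    # i.e. roughly 0.9, 0.59, and 0.12: predictability falls with lag.
    print(lag, round(lag_correlation(series, lag), 2))
```

The decay with lag is the point: any "gold standard" for scoring a Level 3 prediction must be conditioned on the look-ahead time at which the prediction is made.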
Thus, given its importance in the dynamic systems in which SA is relevant, and given how relatively poor people are at making accurate predictions in other walks of life (Kahneman, 2011; Tetlock, 2005), it is surprising that more research is not done on this topic. Certainly one reason for this paucity is that prediction, although hard for an operator to do, is also harder for an investigator to study and objectively measure. This is because, unlike Levels 1 and 2 with their clearly defined "gold standards" of correctness, it is harder to evaluate the correctness of a prediction in a dynamic environment because of that very dynamism. To the extent that these dynamics are random, it is unfair to penalize imperfect prediction (as even optimal prediction could not be perfect), and the gold standard of prediction must instead be based on statistical models of how well the future state is predicted by the current state. Complicating matters further, this prediction, or autocorrelation between future and present state, is itself typically highly dependent upon the "look-ahead time" (LAT) of prediction (higher with shorter times), which raises the question of what LAT should be used to assess prediction. All of these questions and uncertainties render the assessment of prediction (Level 3 SA) a challenging undertaking and perhaps explain its underrepresentation in research, despite its critical importance.

Conclusions

It seems that SA, as Endsley has defined and operationalized the concept, is quite valuable in its applications to human-system design. It allows human factors analysts to focus on different interface characteristics (in particular, displays), different sources of error (misinterpreting the presence, understanding, and implications of dynamic changes), and different training techniques (focusing on visual scanning and metacognition) than is the case for action selection or general "human performance." Its uncertain status as a "testable theory" is outweighed by its value as a model in applications.

Acknowledgments

The author acknowledges the helpful comments of Dr. Patty McDermott.

References

Durso, F. T., Bleckley, M. K., & Dattel, A. R. (2006). Does SA add to the validity of cognitive tests? Human Factors, 48, 721-733.

Durso, F. T., Rawson, K. A., & Girotto, S. (2007). Comprehension and situation awareness. In F. T. Durso, R. S. Nickerson, S. Dumais, S. Lewandowsky, & T. J. Perfect (Eds.), Handbook of applied cognition (2nd ed., pp. 163-194). Hoboken, NJ: Wiley.

Ericsson, K. A., & Kintsch, W. (1995). Long-term working memory. Psychological Review, 102, 211-245.

Garg, A. X., Adhikari, N. K., McDonald, H., Rosas-Arellano, M. P., Devereaux, P., & Beyene, J. (2005). Effects of computerized clinical decision support systems on practitioner performance and patient outcomes. Journal of the American Medical Association, 293, 1223-1238.

Hoffman, R. R., Crandall, B., & Shadbolt, N. (1998). Use of the critical decision method to elicit expert knowledge: A case study in the methodology of cognitive task analysis. Human Factors, 40, 254-276.

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus & Giroux.

Ma, R., & Kaber, D. B. (2007). Situation awareness and driving performance in a simulated navigation task. Ergonomics, 50, 1351-1364.

Morrow, D. G., Wickens, C. D., & North, R. (2006). Reducing and mitigating human error in medicine. Annual Review of Human Factors and Ergonomics, 1, 254-296.

O'Brien, K. S., & O'Hare, D. (2007). Situational awareness ability and cognitive skills training in a complex real-world task. Ergonomics, 50(7), 1064-1091.

Onnasch, L., Wickens, C. D., Li, H., & Manzey, D. (2014). Human performance consequences of stages and levels of automation: An integrated meta-analysis. Human Factors, 56, 476-488.

Sulistyawati, K., Wickens, C. D., & Chui, Y. P. (2011). Prediction in situation awareness: Confidence bias and underlying cognitive abilities. International Journal of Aviation Psychology, 21, 153-174.

Tenney, Y. J., & Pew, R. W. (2007). Situation awareness catches on: What? So what? What now? In R. C. Williges (Ed.), Reviews of human factors and ergonomics (Vol. 2, pp. 89-129). Santa Monica, CA: Human Factors and Ergonomics Society.

Tetlock, P. E. (2005). Expert political judgment: How good is it? How can we know? Princeton, NJ: Princeton University Press.

Wickens, C. D. (2000). The tradeoff of design for routine and unexpected performance: Implications of situation awareness. In D. J. Garland & M. R. Endsley (Eds.), Situation awareness analysis and measurement (pp. 211-226). Mahwah, NJ: Lawrence Erlbaum.

Wickens, C. D. (2002). Situation awareness and workload in aviation. Current Directions in Psychological Science, 11(4), 128-133.

Wickens, C. D. (2008). Situation awareness: Review of Mica Endsley's articles on situation awareness. Human Factors, 50 (Golden Anniversary Special Issue), 397-403.

Wickens, C. D., & Carswell, C. M. (2012). Information processing. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (4th ed., pp. 117-161). New York: Wiley.

Christopher D. Wickens is a senior scientist at Alion Science Corporation, Micro Analysis and Design Operations, in Boulder, Colorado; a professor emeritus at the University of Illinois at Urbana-Champaign; and a visiting professor at Colorado State University. He received his PhD in psychology from the University of Michigan in 1974.