Liang & Xue/Avoidance of IT Threats

RESEARCH ARTICLE

AVOIDANCE OF INFORMATION TECHNOLOGY THREATS: A THEORETICAL PERSPECTIVE¹

By: Huigang Liang
Department of MIS, College of Business
East Carolina University
Greenville, NC 27858 U.S.A.
[email protected]

Yajiong Xue
Department of MIS, College of Business
East Carolina University
Greenville, NC 27858 U.S.A.
[email protected]

Abstract

This paper describes the development of the technology threat avoidance theory (TTAT), which explains individual IT users’ behavior of avoiding the threat of malicious information technologies. We articulate that avoidance and adoption are two qualitatively different phenomena and contend that technology acceptance theories provide a valuable, but incomplete, understanding of users’ IT threat avoidance behavior. Drawing from cybernetic theory and coping theory, TTAT delineates the avoidance behavior as a dynamic positive feedback loop in which users go through two cognitive processes, threat appraisal and coping appraisal, to decide how to cope with IT threats. In the threat appraisal, users will perceive an IT threat if they believe that they are susceptible

Alan Dennis was the accepting senior editor for this paper.

to malicious IT and that the negative consequences are severe. The threat perception leads to coping appraisal, in which users assess the degree to which the IT threat can be avoided by taking safeguarding measures, based on the perceived effectiveness and costs of the safeguarding measure and their self-efficacy of taking the safeguarding measure. TTAT posits that users are motivated to avoid malicious IT when they perceive a threat and believe that the threat is avoidable by taking safeguarding measures; if users believe that the threat cannot be fully avoided by taking safeguarding measures, they would engage in emotion-focused coping. Integrating process theory and variance theory, TTAT enhances our understanding of human behavior under IT threats and makes an important contribution to IT security research and practice.

Keywords: Technology threat avoidance theory, cybernetics, coping, malicious IT, virtuous IT, safeguarding measure, threat, susceptibility, severity, avoidability, effectiveness, costs, self-efficacy

Introduction

Information technology is a double-edged sword. When its power is properly harnessed to serve virtuous purposes, it has tremendous potential to improve human and organizational performance. However, when it is exploited for malicious purposes, it can pose huge threats to individuals, organizations, and society. Many forms of malicious IT such as viruses, worms, e-mail spam, spyware, adware, and Trojan horses can affect personal computers and even enterprise IT infrastructure, causing large-scale productivity and financial losses (Bagchi and Udo 2003; Stafford and Urbaczewski 2004). According to a CSI/FBI survey (Gordon et al. 2006),

MIS Quarterly Vol. 33 No. 1, pp. 71-90/March 2009


the 313 participating U.S. organizations lost $52.5 million in 2006 due to computer crime and security problems, of which $15.7 million was caused by virus attacks. The worldwide financial cost of virus attacks in 2006 was $13.3 billion (Computer Economics 2007). Given this gigantic impact, IT security has drawn great attention from researchers and practitioners (Baskerville 1993; Dhillon and Backhouse 2000; Loch et al. 1992; Straub and Welke 1998). To prevent potential harm and losses, a critical IT security issue is that end users need to perform the tasks that are necessary to effectively cope with IT threats. Thus, it is imperative to have a profound understanding of IT users’ threat avoidance behavior.

Given that safeguarding IT (e.g., anti-virus and anti-spyware software) can reduce the threat of malicious IT, many information systems researchers have applied technology acceptance theories to investigate people’s adoption of safeguarding IT. A commonly held belief is that adoption of safeguarding IT is the same as avoidance of malicious IT. This is not surprising given the preponderance of acceptance theories coupled with the very limited reliance on avoidance theories in the IS literature. Indeed, it appears quite reasonable to apply acceptance theories in the IT security context because it is important to understand individuals’ adoption of safeguarding IT. However, strong theoretical and empirical evidence shows that there are fundamental differences between adoption and avoidance behaviors (Carver and White 1994; Elliot 2006; Elliot and Covington 2001). While studying adoption of safeguarding IT provides some useful findings, this approach tends to draw an incomplete picture of the phenomenon of IT threat avoidance. For example, to avoid a virus spreading through e-mail, users must first perceive the virus as a threat; they can then take several actions such as enabling a firewall, updating their anti-virus software, or ceasing to check e-mail.
If only adoption of the anti-virus software is studied, we can at best partially understand the avoidance phenomenon, because that approach fails to consider the evaluation of the threat and of alternative avoidance actions. Moreover, in IT security practice, the ultimate goal is to avoid IT threats rather than to adopt a specific safeguarding IT. Adoption of safeguarding IT is only one means that may lead to the goal. Therefore, a broader approach focused on avoidance needs to be taken to gain a complete understanding of the phenomenon. Since no theory exists in the IS literature to guide this broader approach, it is the objective of this paper to develop a theory that explains IT users’ threat avoidance behavior.

Although a number of studies have investigated IT security at the organizational level (e.g., Baskerville 1988; Goodhue and Straub 1991; Straub 1990; Straub and Nance 1990; Straub and Welke 1998), few efforts have been made to establish an overarching paradigm to guide theory development and to
provide a common frame of reference at the individual user level. Discerning the lack of theorization regarding IT threats, we synthesize a large body of literature from psychology, health care, risk analysis, and information systems to develop the technology threat avoidance theory (TTAT)² to explain individual IT users’ threat avoidance behavior. Drawing on cybernetic theory (Carver and Scheier 1982; Edwards 1992), TTAT posits that users’ IT threat avoidance is represented by a dynamic positive feedback loop that intends to enlarge the discrepancy between the current state and the undesired end state. Drawing on coping theory (Lazarus 1966; Lazarus and Folkman 1984), TTAT suggests that users go through two cognitive processes, threat appraisal and coping appraisal, to assess the threat they are facing and decide how to avoid the threat by performing problem- and/or emotion-focused coping. Drawing on the literature of risk analysis (Baskerville 1991a, 1991b) and health psychology (Janz and Becker 1984; Rogers 1983; Weinstein 2000), TTAT proposes that users’ threat perception is determined by the perceived probability of the threat’s occurrence and the perceived severity of the threat’s negative consequences. Drawing on prior research on health protective behavior (Janz and Becker 1984; Maddux and Rogers 1983) and self-efficacy (Bandura 1982; Compeau and Higgins 1995), TTAT submits that users consider three factors to evaluate how avoidable the threat can be made by a safeguarding measure: the effectiveness of the measure, the costs of the measure, and users’ self-efficacy of taking the measure.
The central theme of TTAT is that when users perceive an IT threat, they are motivated to actively avoid the threat by taking a safeguarding measure if they perceive the threat to be avoidable by that measure, or they will passively avoid the threat by performing emotion-focused coping if they believe that the threat is not avoidable by any safeguarding measure available to them.

The validity of TTAT hinges on the assumption that human beings’ avoidance and adoption behaviors are qualitatively different. This difference underlies the necessity of our development of TTAT. Naturally, human beings avoid negative stimuli and approach positive stimuli (Freud 1915; James 1890; Pavlov 1927; Skinner 1953; Thorndike 1911). In the IT context, the stimuli are various information technologies.

Thus, in the rest of this paper we first define malicious vis-à-vis virtuous IT, describe cybernetic theory, and explain the approach–avoidance distinction to set the stage for the development of TTAT. Then, TTAT is formally developed, following which we discuss implications for IT security practice. Finally, we highlight the contributions of TTAT to conclude the paper.

Following Whetten (1989), we do not differentiate between theory and model in this article.


Malicious Versus Virtuous IT

We distinguish between malicious IT and virtuous IT so that their different consequences can be delineated and IT users’ different reactions to them can be understood. We propose that the differentiation between malicious IT and virtuous IT can be based on designer intention or user perception. Based on designer intention, malicious IT refers to computer programs designed to cause system dysfunction or security and privacy breaches, and virtuous IT refers to computer systems designed to provide communicational, computational, or decisional aids to users to increase their performance. However, an IT designed to be virtuous might be perceived by users as malicious due to contextual complexities and interest conflicts. For example, advertising e-mail is designed to help sellers market their products. For the sellers and the consumers who are interested in the products, the e-mail is virtuous. Yet for the consumers who are not interested in the products, the e-mail is malicious spam. Therefore, users’ perspective needs to be taken into account to provide a clear conceptualization.

As Figure 1 shows, designer intention and user perception converge in quadrant 1, in which IT is designed to be virtuous and achieves its design objective from the user’s perspective, and in quadrant 3, in which IT is designed to be malicious and is perceived as such. Quadrant 2 includes the IT that is designed to be virtuous but fails to achieve its design purpose from the user’s perspective. Quadrant 4 is empty because IT designed to be malicious is highly unlikely to produce positive outcomes for users. Regardless of designer intention, users react to a given IT based on their perceptions of the IT’s characteristics and its potential impact on them. Thus, in this paper we choose to identify IT maliciousness and virtuousness based on user perception.
Specifically, malicious IT is defined as systems perceived by users to be repulsive and to cause negative outcomes, and virtuous IT is defined as systems perceived by users to be attractive and to cause positive outcomes.

Traditionally, theory-building efforts in the IS discipline have focused on virtuous IT. Several theories have been imported from other disciplines or developed within the IS discipline to explain why a certain IT is (or is not) adopted, given that such adoption is a good thing to do. This can be seen from a voluminous body of literature applying innovation diffusion theory (IDT) (Rogers 1995), the theory of reasoned action (TRA) (Ajzen and Fishbein 1980; Fishbein and Ajzen 1975), the theory of planned behavior (TPB) (Ajzen 1991), and the technology acceptance model (TAM) (Davis 1989; Davis et al. 1989; Venkatesh and Davis 2000). Substantial empirical evidence shows that these theories provide a convincing account of individual users’ acceptance of a wide range of IT in various settings (Adams et al. 1992; Karahanna et al. 1999; Moore

and Benbasat 1991; Taylor and Todd 1995; van der Heijden 2004; Venkatesh and Brown 2001; Venkatesh et al. 2003). In contrast, few theories are centered on malicious IT. Avoidance of malicious IT is often simplified as adoption of safeguarding IT so that existing acceptance theories can be applied. However, adoption of safeguarding IT is only a part of the malicious IT avoidance phenomenon. This phenomenon tends to be underrepresented when IT acceptance theories are applied to study safeguarding IT adoption, because IT acceptance theories are not intended to explain avoidance behavior. Following the expectancy paradigm (Steers et al. 2004; Vroom 1964), IT acceptance theories assume that human behavior is purposeful and goal directed and that users go through a cognitive process to choose the behavior that will lead to their most valued rewards. However, the goal that directs human behavior and the process through which the goal is achieved are not explicitly considered in these theories, at least as they are applied in extant IS research. This omission limits the explanatory potency of existing IT acceptance theories in the context of IT threat avoidance.³ For example, if users do not perceive spyware as a threat, they may choose not to install anti-spyware even though they think it is useful for counteracting spyware and easy to use. Based on this observation, we might falsely reject TAM. Yet this conclusion is unfair to TAM because it was not developed to explain avoidance behavior. Drawing on cybernetic theory, we illustrate this point in detail next.

Cybernetic Theory

We use cybernetic theory (Wiener 1948) as a framework to show why IT acceptance theories cannot fully explain people’s IT threat avoidance behavior and to build a new theory that is more appropriate for explaining this behavior. Cybernetic theory is chosen because it is consistent with expectancy theory (Vroom 1964) and is widely accepted as a theoretical framework for understanding human behavior (Edwards 1992). It has been argued to be a general systems theory because cybernetic processes are ubiquitous, identifiable in virtually any self-regulating system (Carver and Scheier 1982). The principles of cybernetics have been widely applied in social and health psychology and in organizational behavior theories (Carver and Scheier 1982; Edwards 1992; Green and Welsh 1988; Klein 1989).

It should be noted that we have no intention to discredit technology acceptance theories. The problem is not about acceptance theories per se. Rather, it is about how to apply these theories appropriately. No theory is capable of explaining every phenomenon.


[Figure 1. A Grid Categorizing IT Maliciousness Versus Virtuousness. A 2 × 2 grid crossing designer intention (virtuous vs. malicious) with user perception (virtuous vs. malicious). Designed virtuous and perceived virtuous (quadrant 1): DSS, CAD, etc. Designed virtuous but perceived malicious (quadrant 2): e-mail spam, spyware, adware, etc. Designed malicious and perceived malicious (quadrant 3): Trojan horses, denial of service, viruses, worms, etc. Designed malicious but perceived virtuous (quadrant 4): empty.]

The central idea of cybernetics is that human beings self-regulate their behaviors by feedback loops (Carver and Scheier 1982). The most commonly discussed cybernetic process is the negative feedback loop. As shown in Figure 2, a self-regulating system consists of a goal, a comparator, an input function, and an output function (Carver and Scheier 1982). A cybernetic loop is triggered by disturbance in the environment or by goal setting. The input function senses the environment and sends signals to the comparator, which compares the present state with a reference value specified by the goal. If a discrepancy is detected, the output function activates a behavior to eliminate or reduce the discrepancy. The behavior generates an impact on the environment, thus altering the present condition, which in turn is compared with the reference value to determine continuance or discontinuance of the behavior. This process is called a negative feedback loop because it functions to decrease the discrepancy between the present state and the desired end state.

All of the contemporary technology acceptance theories such as TAM, TRA, TPB, and IDT seem to reflect the principle of the negative feedback loop. Implicit in these theories is that there is a positive goal to be attained and individuals need to evaluate the extent to which the technology of interest can help them attain the goal. The focus of these theories is to explain what happens after a discrepancy is detected between the goal and the current state, that is, the link between the comparator and the output function in Figure 2. Goal, input
function, and impact on environment are not explicitly considered in these theories.⁴ Some IS researchers (Bhattacherjee 2001; Bhattacherjee and Premkumar 2004) use confirmation or disconfirmation to predict IS continuation, which contributes to completing the negative feedback loop by incorporating the “impact on environment” element. Yet the picture is still incomplete without considering the goal. When the user’s goal of IT usage differs from the goal assumed by acceptance theories, the theories’ explanatory power may be weakened. For example, Brown et al. (2002) find that TAM and TPB do not work well in explaining mandatory IT usage because the user’s goal is to avoid punishment, which differs from the assumed goal of improving performance. They suggest that the valued outcome (goal) of IT usage should be taken into account in future IT acceptance research.

There is also a positive feedback loop in cybernetics. While the negative feedback loop concerns discrepancy reduction, the positive feedback loop acts to enlarge the discrepancy (Carver 2006). That is, it increases the distance between the present state and the reference value specified by the goal. The goal of this loop specifies an undesired end state that individuals try not to achieve. To differentiate it from the goal of the negative feedback loop, it is called an “anti-goal”

Rogers (2003) updated IDT by adding a confirmation stage. Thus, relative to other acceptance theories IDT has a closer representation of the negative feedback loop.


[Figure 2. The Cybernetic Loop. A goal (or anti-goal) feeds the comparator, which compares the reference value with the present state sensed by the input function (perception). A detected discrepancy activates the output function (behavior), whose impact on the environment, together with external disturbance, is sensed again by the input function.]

by Carver (2006). In the positive feedback loop, a behavior is activated when the present state is too close to the undesired end state, and the behavior discontinues when the discrepancy is sufficiently large.

The positive feedback loop nicely describes the phenomenon of individuals avoiding malicious IT. Being infected by a malicious IT can be thought of as the anti-goal, which users try to avoid. When the malicious IT is perceived from the environment by the input function, the comparator finds that the current state is close to the anti-goal. An avoidance behavior is then activated to drive the current state away from the undesired state.

Within the cybernetic framework, adoption of virtuous IT is explained by the negative feedback loop, whereas avoidance of malicious IT is explained by the positive feedback loop. Since adoption involves approaching a desired end state and avoidance involves moving away from an undesired end state, these phenomena are qualitatively different. We examine the approach–avoidance distinction in detail next.
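The two loops can be contrasted in a minimal sketch. This is offered purely as an illustration, not as part of cybernetic theory: the one-dimensional state, the step size, and the thresholds are all hypothetical simplifications of what are, in reality, rich perceptual and behavioral processes.

```python
# Illustrative sketch only: a scalar "state" stands in for the present
# condition sensed by the input function. All names and numbers are
# hypothetical.

def negative_feedback(state, goal, step=1.0, tolerance=0.5):
    """Approach loop: behave until the discrepancy to the goal is small."""
    while abs(goal - state) > tolerance:          # comparator detects a discrepancy
        state += step if goal > state else -step  # output function reduces it
    return state

def positive_feedback(state, anti_goal, step=1.0, safe_distance=5.0):
    """Avoidance loop: behave until the discrepancy to the anti-goal is large."""
    while abs(state - anti_goal) < safe_distance:       # too close to the undesired end state
        state += step if state >= anti_goal else -step  # move away, in either direction
    return state
```

Note the asymmetry: the negative loop settles on one determinate value near the goal, whereas the positive loop can escape in either direction, ending anywhere sufficiently far from the anti-goal.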

The Approach–Avoidance Distinction

The two feedback loops of cybernetics are conceptually and theoretically different (Carver 2006). Obviously, they push the present state in different directions with respect to the reference value set by the goal or anti-goal. The negative feedback loop involves an approach behavior that is instigated

by a positive, desirable end state, while the positive feedback loop involves an avoidance behavior that is instigated by a negative, undesirable end state. The approach–avoidance distinction has been recognized as a fundamental concept in motivation and decision theories (Carver and White 1994; Elliot 2006; Elliot and Covington 2001; Steel and Konig 2006; Tversky and Kahneman 1992). Abundant evidence supports this distinction.

First, the distinction has a long and rich history in intellectual thought, dating back more than a century, and can be found in many major psychology theories. For example, hedonism posits that humans’ motivational foundation is to seek pleasure and avoid pain (Freud 1915; James 1890). In his seminal work on classical conditioning, Pavlov (1927) identifies two types of reflexive responses: an orienting response toward the stimulus and a defensive response away from the stimulus. Reinforcement theorists (Skinner 1953; Thorndike 1911) contend that rewards increase the likelihood of subsequent behavior while punishments decrease it.

Second, all forms of living organisms besides human beings exhibit the approach–avoidance distinction (Elliot and Covington 2001). Being able to approach beneficial stimuli and withdraw from harmful stimuli has significant survival value for organisms; throughout their evolutionary history, they have had to constantly adjust to their environments by making approach or withdrawal decisions (Tooby and Cosmides 1990). Given the importance of approach versus avoidance behaviors, it has been asserted that the approach–avoidance distinction is the


[Figure 3. Differences between the Positive and Negative Feedback Loops. The approach behavior moves the present state toward the desired end state along one determinate direction that reduces the discrepancy; the avoidance behavior moves the present state away from the undesired end state along any of many possible directions that enlarge the discrepancy.]

most elemental reaction of organisms to external stimuli (Zajonc 1998).

Third, given that human behavior has biological bases (Nebylitsyn and Gray 1972), if the approach and avoidance behaviors are distinct, they should be associated with different biological brain structures. Prior neurophysiological research shows that identifiable substrates in the brain underlie approach and avoidance motivation processes, respectively (Davidson 1995; Gray 1982; Lang et al. 1990). Clinical and laboratory observations suggest that the approach and avoidance motivation systems are located in different cerebral hemispheres: the left prefrontal cortex is associated with approach behavior, whereas the right prefrontal cortex is associated with avoidance behavior (Sutton and Davidson 1997).

Fourth, cumulative prospect theory (Tversky and Kahneman 1992), one of the leading decision theories (Steel and Konig 2006), posits that people derive values from gains and losses by following different functions. The value function is concave for gains, convex for losses, and steeper for losses than for gains (Kahneman and Tversky 1979). That is, losses loom larger than gains. Since human beings tend to approach gains and avoid losses, cumulative prospect theory suggests that approach and avoidance behaviors are associated with different evaluative processes.

Finally, a subtle difference between the two behaviors is that the approach behavior always moves the current state toward
the desired end state, while the avoidance behavior has no affirmative direction as long as it separates the current state from the undesired end state (Carver 2006). As Figure 3 shows, approach behavior follows a determinate path to reduce the discrepancy. In contrast, avoidance behavior can take many alternative paths to enlarge the discrepancy.

Now that we have articulated the approach–avoidance distinction, it should be clear that avoidance of malicious IT should not be treated the same as adoption of safeguarding IT. While studying adoption of safeguarding IT contributes to our knowledge, it tends to provide a fragmented understanding of the focal phenomenon of IT threat avoidance. As the earlier example shows, if the threat of spyware is not explicitly considered, applying IT acceptance theories to study adoption of anti-spyware may lead to inconsistent, even false, findings. To fully understand the behavior of IT users under the threat of malicious IT, we develop the technology threat avoidance theory (TTAT).
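The value-function asymmetry invoked in the fourth point can be written explicitly. The functional form and median parameter estimates below are Tversky and Kahneman’s (1992), reproduced here as a sketch:

```latex
% Cumulative prospect theory value function (Tversky and Kahneman 1992)
v(x) =
\begin{cases}
  x^{\alpha}            & x \ge 0 \quad \text{(gains: concave, } \alpha \approx 0.88\text{)} \\
  -\lambda (-x)^{\beta} & x < 0   \quad \text{(losses: convex, } \beta \approx 0.88,\ \lambda \approx 2.25\text{)}
\end{cases}
```

Since the loss-aversion coefficient satisfies λ > 1, a loss is weighted roughly 2.25 times as heavily as a gain of equal magnitude, which is the formal sense in which losses loom larger than gains.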

Technology Threat Avoidance Theory

TTAT intends to explain the process and determinants of IT threat avoidance behavior across a broad range of IT threats and user populations. At the same time, TTAT should be both parsimonious and theoretically justified. To attain these objectives, we attempt to explicate human cognitive processes under threat and identify a small number of fundamental variables that are assessed in the cognitive processes, as suggested by extant research and theories. Hence, TTAT integrates the advantages of both process theory and variance theory.

[Figure 4. The Process of IT Threat Avoidance. A positive feedback loop: the emergence of malicious IT in the environment triggers threat appraisal against the anti-goal (being harmed by malicious IT); threat appraisal leads to coping appraisal (P2), which leads to coping (P3) in the form of problem-focused coping (P3a) and/or emotion-focused coping (P3b); coping generates an impact on the environment, which is sensed again by the appraisal process.]

TTAT as Process Theory

As process theory, TTAT posits that IT users’ avoidance behavior is explained by the positive feedback loop (Carver 2006; Carver and Scheier 1982). The loop starts with the emergence of malicious IT in the environment (see Figure 4). Users then become aware of the existence of this malicious IT and develop a perception of their current state. They set being harmed by the malicious IT as the anti-goal (undesired end state) and compare it with their current state. They develop a sense of threat when they find that their current state is too close to the anti-goal. The threat motivates them to engage in coping behavior to avoid it. That is, the coping behavior is performed to increase the discrepancy between the current state and the undesired end state, objectively or subjectively. The behavior continues until the discrepancy is sufficiently large that the threat disappears.
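The loop just described can be summarized in an illustrative sketch. This is not part of TTAT itself; the scalar `distance`, the threshold, and the way safeguard effectiveness is scored are hypothetical simplifications introduced only for illustration.

```python
# Illustrative sketch of the IT threat avoidance loop (Figure 4).
# All names, numbers, and the scalar abstraction are hypothetical.

SAFE_DISTANCE = 5.0  # discrepancy at which the threat is no longer perceived

def appraise_threat(distance):
    """Threat (primary) appraisal: a threat is perceived when the current
    state is too close to the anti-goal (being harmed by malicious IT)."""
    return distance < SAFE_DISTANCE

def appraise_coping(safeguards):
    """Coping (secondary) appraisal: pick the most effective available
    safeguarding measure, or None if the repertoire is exhausted."""
    return max(safeguards) if safeguards else None

def avoidance_loop(distance, safeguards):
    """Run the positive feedback loop; return the final distance and
    whether the user fell back on emotion-focused coping."""
    emotion_focused = False
    while appraise_threat(distance):
        gain = appraise_coping(safeguards)
        if gain is not None and gain > 0:
            distance += gain        # problem-focused coping enlarges the discrepancy
        else:
            emotion_focused = True  # no effective safeguard: cope subjectively
            break
    return distance, emotion_focused
```

For instance, a user close to the anti-goal with an effective safeguard keeps applying it until the threat disappears, while a user with no effective safeguard exits the loop through emotion-focused coping.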

Proposition 1: Users’ avoidance behavior under threat of malicious IT is a dynamic process that intends to enlarge the discrepancy between users’ current state and their undesired end state.

Based on coping theory (Lazarus 1966; Lazarus and Folkman 1984), we submit that in the positive feedback loop users go through two cognitive processes to determine their responses to malicious IT: threat (primary) appraisal and coping (secondary) appraisal. These cognitive processes evaluate the influence of environmental factors such as social norms, available information, and personal experience. This two-process model has been widely applied in explaining health protective behavior such as smoking cessation, having a mammogram, using condoms, and fastening seatbelts (Flynn et al. 1995; Ho 1998; Maddux and Rogers 1983; Pechmann et al. 2003; Rippetoe and Rogers 1987; Rogers 1983; Tanner et al. 1989; Tanner et al. 1991). It has also been applied in IS research to investigate users’ cognitive and behavioral efforts to cope with significant IT events taking place in their work environment (Beaudry and Pinsonneault 2005).

In the threat appraisal, users evaluate the potential negative consequences of being attacked by malicious IT. The threat appraisal is analogous to the comparator in cybernetic theory


(Carver and Scheier 1982). Conceptually, the distance between the current state and the undesired end state is inversely proportional to the strength of the potential negative consequences. A threat is perceived when the distance decreases to a certain value. The threat perception activates the coping appraisal, in which users assess available action options and decide what to do to cope with the threat. The threat appraisal must occur before the coping appraisal (Lazarus and Folkman 1984). Only after a threat is identified do users develop a sense of urgency and become motivated to search for and evaluate information related to coping. Thus, the perception that a threat exists is a necessary condition for seeking coping methods. For example, residents of Ohio do not worry about installing hurricane shutters because, for them, the threat of hurricanes does not exist.

theory (Vroom 1964), they will evaluate a series of safeguarding options and choose the one that will most likely reduce the threat of malicious IT. When an effective safeguarding measure is identified, the avoidance loop leads to an adoption behavior. Carver (2006) suggests that many cases of active avoidance of a threat also involve approach of an incentive. Cybernetic theory posits that feedback loops have a hierarchical organization in which achievement of a goal requires achieving a hierarchy of subgoals (Carver and Scheier 1982; Elliot 2006; Klein 1989). In the IT threat context, adoption of safeguarding measures is the subgoal that serves the superordinate goal of avoiding IT threats. Therefore, adoption of a safeguard is an important part of the threat avoidance loop, but it cannot represent the entire phenomenon of IT threat avoidance.

Proposition 2: Users appraise how to cope with malicious IT only after they appraise the threat of malicious IT.

If users exhaust their repertoire of safeguarding measures and fail to find one that can help them avoid the threat, they need to practice emotion-focused coping so that their psychological well-being is maintained (Beaudry and Pinsonneault 2005). Emotion-based coping can also be activated when users adopt ineffective safeguarding measures. Since malicious IT constantly evolve by producing new variants, users often find that adoption of safeguarding measures cannot completely eliminate the IT threat. For example, the SoBig worm had five different variants before the notorious SoBig.F caused devastating losses in 2003. Even users who had installed antivirus software against SoBig.A through SoBig.E became victims of SoBig.F. Therefore, problem-focused coping itself is not likely to be enough to remove the threat of malicious IT and emotion-focused coping is needed as well.

Two types of coping can be performed to deal with the threat: problem-focused and emotion-focused (Lazarus and Folkman 1984). Problem-focused coping refers to adaptive behaviors that take a problem-solving approach to attempt to change objective reality. It deals directly with the source of the threat by taking safeguarding measures (e.g., installing safeguarding IT, disabling cookies, updating passwords regularly). After the measures take effect, users’ perception of their current state is further away from the undesired end state, thus reducing the threat. In contrast, emotion-focused coping is oriented toward creating a false perception of the environment without actually changing it or adjusting one’s desires or importance of desires so that negative emotions related to threat (e.g., fear and stress) are mitigated. This coping reduces perceived threat or motivation of coping with the threat without changing objective reality. It includes various modes such as religious faith (beliefs in God will to remove danger), fatalism (acceptance of a dangerous situation), denial (denial of the presence of danger), and helplessness (internalization of blame or resignation for not being able to control the danger) (Lazarus and Folkman 1984; Rippetoe and Rogers 1987). Given that the avoidance behavior can take many possible directions as long as it enlarges the discrepancy between the current state and the undesired end state, when users try to avoid malicious IT their behavior is not predetermined. They can perform problem-focused coping, emotion-focused coping, or both. Rippetoe and Rogers (1987) find that highthreat situations give rise to both problem- and emotionfocused coping. To the degree that IT users are rational, they will try problem-focused coping first. Based on expectancy

MIS Quarterly Vol. 33 No. 1/March 2009

Proposition 3: Users may employ either problem-focused or emotion-focused coping to reduce the threat of malicious IT.

Proposition 3a: Users perform problem-focused coping to objectively mitigate the danger of malicious IT, thus reducing the threat.

Proposition 3b: Users perform emotion-focused coping to subjectively reduce the threat of malicious IT.

TTAT as Variance Theory

While the process-oriented view of TTAT delineates clearly how IT users dynamically perform cognitive appraisals and engage in coping behaviors, it does not identify key influencing factors of this process. To further understand IT users' avoidance behavior, we integrate protection motivation theory

(Rogers 1975, 1983), the health belief model (Janz and Becker 1984; Rosenstock 1974), and risk analysis research (Baskerville 1991a, 1991b) into TTAT to identify a set of key factors and their relationships, thereby creating a variance theory view of TTAT. This variance theory view describes a cross-sectional snapshot of the avoidance process: at a certain point in time, the key variables generated by the avoidance process coexist, as Figure 5 shows.

[Figure 5. The Variance Theory View of TTAT. The diagram shows perceived susceptibility (P4a) and perceived severity (P4b) determining perceived threat, with their interaction (P4c); perceived effectiveness (P5a), perceived costs (P5b), and self-efficacy (P5c) determining perceived avoidability; perceived threat (P6) and perceived avoidability (P7) driving emotion-focused coping and avoidance motivation (P8, P9); avoidance motivation leading to avoidance behavior (P10); risk tolerance influencing perceived threat (P11); and social influence affecting the appraisals and avoidance motivation (P12).]

In accord with the process theory view, the variance theory view consists of variables that are relevant to threat appraisal, coping appraisal, and coping. Variables from these three processes sufficiently reflect the entire feedback loop, because users' perceptions of the environment and the undesired end state are evaluated in threat appraisal, and the impact on the environment is sensed through users' perceptions. Thus, all of the elements of the feedback loop are taken into account. Furthermore, we divide problem-focused coping into two variables to capture the transition from motivation to behavior. We contend that two beliefs, perceived threat and perceived avoidability, determine both the avoidance motivation of using safeguarding measures and emotion-focused coping. Perceived threat is the outcome of the threat appraisal, and perceived avoidability is concerned with the safeguarding measure that is used to cope with malicious IT. We theorize that both beliefs are determined by several antecedents.

We draw heavily on theories in health psychology (Janz and Becker 1984; Oliver and Berger 1979; Rogers 1975, 1983; Rosenstock 1974; Weinstein 1993, 2000) in identifying key variables and their relationships in IT users' avoidance process because of the similarities between malicious IT and diseases. We contend that malicious IT is analogous to disease in three respects. First, both are rogue agents that invade a system to cause malevolent changes: malicious IT impairs computer functionality and compromises information security and privacy, while disease deteriorates human health. Second, people make different subjective assessments of the threat imposed by malicious IT or disease; due to personal idiosyncrasies, some people are more sensitive to a given threat than others. Finally, coping measures are available for both: while diseases can be treated medically, malicious IT can be avoided through various protective measures. Given these similarities, we contend that the cognitive reactions aroused by both sources of threat are similar. In addition, IT has become an integral part of many people's work and everyday lives (Hoffman et al. 2004), and we argue that people may come to view personal computers as an indispensable component of their quality of life. This mindset tends to make people experience similar cognitive processes in response to either health threats or IT threats.
Therefore, the key variables defining these processes may also be similar.

Threat Appraisal

IT users develop a perception of threat after assessing potential dangers in their computing environment. Within the cybernetic framework, perceived threat reflects the proximity between users' current state and the undesired end state. Thus, we define perceived threat as the extent to which an individual perceives malicious IT as dangerous. Based on the extant literature in health psychology and risk analysis, we propose that the threat perception is shaped by two antecedents: perceived susceptibility and perceived severity. Perceived susceptibility is defined as an individual's subjective probability that the malicious IT will negatively affect him or her, and perceived severity is defined as the extent to which an individual perceives that the negative consequences caused by the malicious IT are severe. Only when users believe both that they are vulnerable to malicious IT and that the consequences of being attacked are serious will they perceive a threat. Failing to consider either factor may lead to a misunderstanding of the threat perception.

Proposition 4: Users' IT threat perception is determined by their perceived susceptibility and severity of malicious IT.

The relationships between susceptibility, severity, and threat are well supported by the health psychology and risk analysis literature. First, most prior research on health protective behavior has proposed that the perceived probability that harm will occur if no action is taken and the perceived severity of that harm are two defining characteristics of health threat (Weinstein 2000). For example, the health belief model and protection motivation theory posit that the perceived probability and severity of a health threat motivate people to take protective measures. Second, the widely accepted risk analysis method asserts that the probability of a threat occurring and the cost or loss attributed to the threat are the two critical elements of risk (Baskerville 1991a, 1991b, 1993). This conceptualization of risk is also reflected in previous IS security research (Goodhue and Straub 1991; Straub and Welke 1998). For example, Straub and Welke (1998) state that "technically, [risk] is the probability associated with losses (or failure) of a system multiplied by the dollar loss if the risk is realized" (p. 442). To examine people's appraisal of malicious IT, it is thus necessary to take both susceptibility and severity into account. These perceptions vary across different malicious IT and different individuals, and it is unrealistic to assume that either is invariant across technologies and individuals.
For example, some viruses can reformat the computer hard drive and destroy all the files, while most spyware only monitors computer activities and collects user information silently, without the user’s awareness. Conceivably, an individual will perceive the virus attack as devastating and think the spyware infection is innocuous, given equal chances of occurring. As a result, an individual considers the virus attack as a huge threat and the spyware as, probably, only a slight nuisance. Even for the same malicious IT, different individuals would perceive different levels of severity. This is because people’s judgments are strongly influenced by a reference point that is determined by personal expectations and knowledge (Kahneman and Tversky 1979; Tversky and Kahneman 1974). For example, of two students who got a “B” for a course, one is happy because he expects to get a “C” whereas the other is
upset because she expects to get an "A." Similarly, people develop different subjective perceptions even toward the same degree of objective severity of malicious IT.

Proposition 4a: The more users believe that they are susceptible to malicious IT, the stronger their threat perception will be.

Proposition 4b: The more users believe that the negative consequences caused by malicious IT are severe, the stronger their threat perception will be.

While perceived susceptibility and severity each have a main effect on perceived threat, their influences can also be multiplicative. Prior research supports the multiplicative nature of these variables. According to the risk analysis method, risk is calculated by multiplying the probability of a threat occurring by the cost of the damage it causes. Health psychology theories also suggest an interaction between perceived susceptibility and perceived severity. Because of this interaction, if either variable takes a value of zero, the other variable has no relationship with perceived threat. For example, some people know that AIDS is deadly, but they do not perceive AIDS as a threat because they see no possibility of being infected. Likewise, some people know that they are likely to become obese due to overeating, but they do not view overeating as a threat because they do not consider obesity a serious problem. Malicious IT poses threats in the same manner. If an IT threat is perceived to have no chance of occurring, there should be no interest in acting against it, regardless of how serious it might be. Similarly, it would be irrational to take precautions if the outcome is not at all severe, no matter how likely it is.

Besides the zero-value boundary conditions, perceived susceptibility and perceived severity positively moderate each other. That is, the relationship between perceived susceptibility and threat is a function of perceived severity: the higher the perceived severity, the stronger the susceptibility–threat relationship. For example, suppose Internet user A is likely to be infected by a malignant computer virus and user B is likely to be infected by adware that only creates annoying pop-up windows. Given the same change in the probability of being infected, the increase in user A's threat perception will be greater than user B's. The same logic applies to the role of perceived susceptibility in moderating the severity–threat relationship.

Proposition 4c: Perceived severity and susceptibility have a positive interaction effect on perceived threat. The boundary condition of this interaction is that if either variable takes a value of zero, the other will have no relationship with perceived threat.

Coping Appraisal

Perceived threat, resulting from the threat appraisal process, provides the initial motivation to commence avoidance behavior; users then start the coping appraisal process to seek safeguarding measures and pursue active threat avoidance (Figure 5). An important judgment that people must make in this process is how avoidable the IT threat is if a particular safeguarding measure is taken. Perceived avoidability is defined as an individual's assessment of the likelihood that he or she will be able to avoid malicious IT by using a specific safeguarding measure. Based on prior research (Bandura 1982; Compeau and Higgins 1995; Maddux and Rogers 1983; Weinstein 1993), we contend that perceived avoidability is formed by considering how effectively the safeguarding measure avoids the IT threat, what costs are associated with the measure, and how confident an individual feels about taking it. Hence, three constructs (perceived effectiveness, perceived costs, and self-efficacy) are appraised in the coping appraisal process to give rise to perceived avoidability.

Perceived effectiveness is derived from Bandura's (1982) outcome judgment, which concerns the extent to which a behavior, if executed, is believed to lead to specific outcomes. In the context of IT threat avoidance, the behavior is taking a safeguarding measure and the outcome is reducing the IT threat. Thus, in the purview of TTAT, perceived effectiveness refers to an individual's subjective assessment that the safeguarding measure can effectively avoid the threat of malicious IT. Based on this assessment, users can appraise whether the threat is avoidable.
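The multiplicative logic of Propositions 4a through 4c can be sketched numerically. The code below is an illustrative model, not part of TTAT itself: the 0-to-1 rating scale and the simple product form are our assumptions, chosen because a product mirrors the risk-analysis formula (probability multiplied by loss) and reproduces the zero-value boundary condition.

```python
def perceived_threat(susceptibility: float, severity: float) -> float:
    """Illustrative threat appraisal: both inputs are subjective ratings
    on a 0-1 scale (an assumed scale, not from TTAT). The product form
    captures the boundary condition: if either rating is zero, the
    other has no effect on perceived threat."""
    if not (0.0 <= susceptibility <= 1.0 and 0.0 <= severity <= 1.0):
        raise ValueError("ratings must lie in [0, 1]")
    return susceptibility * severity

# The AIDS example: a maximally severe threat with zero perceived
# susceptibility produces zero perceived threat.
assert perceived_threat(0.0, 1.0) == 0.0

# Proposition 4c: the same rise in susceptibility (0.4 -> 0.6) raises
# perceived threat more when severity is high than when it is low.
gain_low = perceived_threat(0.6, 0.2) - perceived_threat(0.4, 0.2)
gain_high = perceived_threat(0.6, 0.9) - perceived_threat(0.4, 0.9)
assert gain_high > gain_low
```

Any monotone combination with the same zero boundary (e.g., a weighted product) would serve equally well; the point is only that susceptibility and severity combine multiplicatively rather than additively.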
Perceived effectiveness has some nexus with perceived usefulness, a major determinant of users' attitude and behavioral intention in TAM (Davis 1989; Davis et al. 1989). Both constructs are based on outcome expectancy. Perceived usefulness is intended to measure how the technology increases users' job performance; this intention is particularly pronounced in a revised version of TAM, in which perceived usefulness is renamed performance expectancy (Venkatesh et al. 2003). Perceived effectiveness in TTAT measures the usefulness of the safeguard in terms of its ability to objectively avoid the threat of malicious IT. The theoretical nuance between the two constructs lies in how the goal of adoption is treated: perceived usefulness is conceptualized with the assumption that users' goal is to improve performance, whereas perceived effectiveness does not have an assumed goal. For example, a Mac user who perceives the Windows firewall to be effective in protecting computers running Windows may not perceive the Windows firewall to be useful because she does not use Windows.

When people select a safeguarding measure, they consider not only its effectiveness but also its costs. Perceived costs refer to the physical and cognitive efforts needed to use the safeguarding measure, such as time, money, inconvenience, and comprehension (Weinstein 1993). These efforts tend to create barriers to behavior. Hence, an individual's motivation to take the safeguarding measure has to be weighed against the expected costs. If the costs outweigh the benefits (i.e., perceived effectiveness), the individual is unlikely to take the safeguarding measure. Thus, perceived costs negatively influence users' appraisal of the safeguard's avoidability.

Besides perceived effectiveness and costs, individuals' confidence in taking the safeguarding measure, referred to as self-efficacy, is also an important determinant of avoidance motivation. The inclusion of self-efficacy in TTAT is consistent with Bandura's argument that "in any given instance, behavior would be best predicted by considering both self-efficacy and outcome beliefs" (p. 140). Self-efficacy has been examined in numerous studies in the IS literature, and its relationship with intention to adopt IT is well established (Agarwal et al. 2000; Bandura 1977; Bandura 1982; Chau 2001; Compeau et al. 1999; Venkatesh 2000). In the context of IT threat avoidance, the safeguarding measure is often computer software or a computer-related behavior (e.g., installing anti-virus software, turning off cookies, editing the computer registry file). Users' self-efficacy to a large extent affects their perceptions of the overall benefits provided by the safeguarding measure (i.e., perceived avoidability).
For example, people might think that a safeguarding measure cannot reduce an IT threat because they are not confident in their ability to take that measure, even though it is effective and has low costs. Jointly, these three antecedents shape the perception of whether the IT threat is avoidable if the safeguarding measure is taken. While perceived effectiveness and self-efficacy tend to increase perceived avoidability, perceived costs are likely to weaken it. In coping appraisal, the perceived avoidability of many safeguarding measures may be assessed and compared. The safeguarding measure with the highest perceived avoidability will be performed by users to reduce the threat.

Proposition 5: A safeguarding measure's effectiveness, costs, and users' self-efficacy toward it influence users' perceptions of the avoidability of the IT threat.

Proposition 5a: Users are more likely to perceive an IT threat as avoidable by taking a safeguarding measure when this measure can effectively reduce the IT threat.

Proposition 5b: Users are more likely to perceive an IT threat as avoidable by taking a safeguarding measure when the costs associated with this measure decrease.

Proposition 5c: Users are more likely to perceive an IT threat as avoidable by taking a safeguarding measure when they feel confident in taking this measure.
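Propositions 5a through 5c can likewise be sketched as a scoring rule. This is a hypothetical illustration: the weights, the clamping to [0, 1], and the example measures are all invented for the sketch; TTAT specifies only the signs (effectiveness and self-efficacy raise avoidability, costs lower it) and that the measure with the highest perceived avoidability is selected.

```python
def perceived_avoidability(effectiveness: float, costs: float,
                           self_efficacy: float) -> float:
    """Illustrative coping appraisal: effectiveness and self-efficacy
    raise perceived avoidability, costs lower it (signs per TTAT);
    the 0-1 inputs and the weights are assumptions of this sketch."""
    raw = 0.6 * effectiveness + 0.4 * self_efficacy - 0.5 * costs
    return max(0.0, min(1.0, raw))

def select_safeguard(measures: list) -> dict:
    """TTAT: the measure with the highest perceived avoidability is chosen."""
    return max(measures, key=lambda m: perceived_avoidability(
        m["effectiveness"], m["costs"], m["self_efficacy"]))

# Hypothetical candidate measures against the same threat.
measures = [
    {"name": "install anti-virus software",
     "effectiveness": 0.90, "costs": 0.20, "self_efficacy": 0.80},
    {"name": "edit the registry by hand",
     "effectiveness": 0.95, "costs": 0.70, "self_efficacy": 0.30},
]

# The hand-edit is slightly more effective, but its high costs and low
# self-efficacy make the anti-virus option the one users would adopt.
assert select_safeguard(measures)["name"] == "install anti-virus software"
```

The comparison step mirrors the text's claim that users assess several safeguards and perform the one with the highest perceived avoidability; any scoring rule with the stated signs yields the same qualitative behavior.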

Coping

According to TTAT, when users find that they are threatened by malicious IT, they will be motivated to take actions to protect themselves. This avoidance tendency is driven by human nature: according to the hedonic principle, people are motivated to approach pleasure and avoid pain (Freud 1915; Higgins 1997; James 1890). Health psychology research shows that health problems give rise to perceived threats, which increase the likelihood of performing protective health behavior (Janz and Becker 1984; Oliver and Berger 1979; Rosenstock 1974). Deductively, we suggest that the threat of malicious IT also leads to protective behavior. In the IT context, the negative impact caused by malicious IT is associated with users' computer-related well-being, which includes two dimensions: computer performance and information privacy.

First, computer performance can be affected by malicious IT in many ways. An intentional virus or worm attack initiated by a hacker can cause serious damage to computer systems. As a result, software applications may malfunction and important documents may be stolen or deleted, which may lead to tremendous financial losses. For example, in 2003 the SoBig.F worm led to a worldwide loss of $1.2 billion and the Slammer worm caused a loss of $1.1 billion (Computer Economics 2003). For less malignant IT threats such as spyware, computer performance is less severely affected, but the consequences can still be exceedingly annoying. Spyware can crash operating systems, steal CPU cycles and bandwidth by making unauthorized connections with other computers, change browser settings, and pop up unwanted advertisements periodically (Schultz 2003; Shaw 2003). According to Dell, 12 percent of all technical support calls made to its consumer hardware division request help to remove spyware, and Microsoft claims that spyware is responsible for 50 percent of the Windows system crashes reported over the Internet (Hunter 2004).

Second, information privacy invasion is another problem caused by malicious IT. Privacy refers to an individual's ability to control how his or her personal information is acquired and used (Stone et al. 1983; Westin 1967). Privacy draws increasing attention from IT researchers and practitioners because a variety of malicious IT, such as viruses, keyboard trackers, Trojan horses, and spyware, can acquire and misuse personal information, and these programs spread at an amazing speed on the Internet. When users' personal information is collected by malicious IT, two types of privacy problems result from their inability to control the use of this information (Culnan and Armstrong 1999): one relates to improper access to personal information (Smith et al. 1996), and the other pertains to secondary use of personal information for unknown purposes (Culnan 1993). For example, spyware, usually carelessly downloaded onto personal computers, covertly gathers user information through the user's Internet connection without the user's knowledge (Stafford and Urbaczewski 2004). It can steal financial data, credit card information, passwords, private pictures, and videos. This personal information is sent by the spyware to the party who created it for marketing or fraudulent purposes (Sipior et al. 2005).

In summary, malicious IT can impair users' computer performance and/or compromise their information privacy, thus creating a negative stimulus that motivates users to avoid the threats. Recall that avoidance behavior can take many possible directions as long as it enlarges the discrepancy between the current state and the undesired end state (Figure 3). The perceived threat alone cannot determine what coping measure will be taken.
It leads to a sense of urgency that motivates users to take safeguarding measures and/or activates emotion-focused coping.

Proposition 6: Users who perceive the threat of malicious IT will be motivated to avoid the threat by employing safeguarding measures and/or performing emotion-focused coping.

Based on expectancy theory (Steers et al. 2004; Vroom 1964), users are motivated to take the safeguarding measure that can bring them the most valued outcome. Hence, their intention to adopt a certain safeguard is a function of the degree to which the safeguard makes the threat of malicious IT avoidable. On the other hand, emotion-focused coping is unlikely to be performed when the safeguard can sufficiently reduce the threat. Coping theory suggests that individuals are inclined to adopt problem-focused coping when they feel that they are in control of the situation, and emotion-focused coping when they feel that they have limited control over the situation (Lazarus and Folkman 1984). In the IT context, Beaudry and Pinsonneault (2005) find that in a threatening IT event, individuals' choice of problem- or emotion-focused coping depends on their perceived control over self, work, and technology. Perceived avoidability reflects how much control users have over the threat. Thus, as perceived avoidability increases, users are more motivated to adopt the safeguard and less likely to perform emotion-focused coping, and vice versa.

Proposition 7: Users who perceive an IT threat to be avoidable will be motivated to take safeguarding measures and will be less likely to perform emotion-focused coping. In contrast, users who perceive an IT threat to be unavoidable will likely perform emotion-focused coping.

Problem-focused coping and emotion-focused coping are both competing and complementary behavioral strategies (Lazarus and Folkman 1984). For example, many diseases cannot be fully cured (e.g., AIDS, diabetes, multiple sclerosis, and many chronic diseases), although their progress can be slowed by medical treatment. Patients need both to follow therapeutic regimens and to make psychological adjustments to maintain their overall well-being. Similarly, not every IT threat can be totally avoided by using safeguards. To the degree that the threat is not avoided by the safeguard, users need to perform emotion-focused coping to further reduce the threat. Therefore, as perceived threat increases, both problem- and emotion-focused coping will be performed. For example, if a new breed of virus is found to be spreading on the Internet and can only be partially controlled by existing anti-virus software, users would activate their anti-virus software to avoid, to a certain degree, being infected by the virus.
In the meantime, they would perform emotion-focused coping, such as hoping that the virus will not infect their computers or accepting the fact that they will probably be infected. According to cybernetic theory, emotion-focused coping helps users restore emotional stability without actively controlling the source of the threat: the malicious IT. When emotion-focused coping is performed, people become relatively complacent under threat and feel less urgency to actively avoid the harm. Consequently, users' motivation to avoid the threat in general, and to take a safeguarding measure in particular, becomes insensitive to changes in threat. This leads to two insights. First, the relationship between perceived threat and avoidance motivation is a convex curve (see Figure 6). As the threat
level increases, the slope of the threat–motivation relationship becomes flatter. Second, the impact of perceived avoidability on avoidance motivation is negatively moderated by perceived threat: as the threat level increases, the slope of the avoidability–motivation relationship becomes less steep (see Figure 7).

Our theorization of these relationships is consistent with prior research. Weinstein (2000) shows that people are relatively unresponsive to health threats that have a moderate to high probability of occurrence. Studies on persuasion find that when fear resulting from threat reaches a certain level, people start to become insensitive to changes in fear levels (Ray and Wilkie 1970). Janis (1967) proposes that the fear–persuasion relationship is represented by an inverted U-shaped curve. Whereas most of this prior research was conducted in the health psychology arena, the logic applies in the IT context because of the similarities between IT threats and health threats.

Proposition 8: The relationship between perceived threat and avoidance motivation is a convex curve such that it becomes weaker as perceived threat increases.

Proposition 9: The relationship between perceived avoidability and avoidance motivation is negatively moderated by perceived threat, so that it is weaker when perceived threat increases. The boundary condition is that if perceived threat is zero, there is no relationship between perceived avoidability and avoidance motivation.

In TTAT, avoidance motivation is defined as the degree to which IT users are motivated to avoid IT threats by taking safeguarding measures. The relationship between motivation and action has long been established in psychology and organization research. According to the law of effect (Thorndike 1911), human beings are motivated to repeat past actions that lead to positive outcomes and to reduce past actions that produce negative outcomes.
Expectancy theory posits that people tend to choose those behaviors that, if accomplished, will lead to their most valued rewards (Vroom 1964). The existing literature offers substantial evidence that people are likely to take action when they are motivated (Steers et al. 2004). In the context of IT threats, people's avoidance motivation is expected to have a close relationship with their actual avoidance behavior.

Proposition 10: Users' avoidance motivation leads to their avoidance behavior, that is, taking safeguarding measures to reduce the IT threat.
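To make Propositions 8 and 9 concrete, the sketch below assigns one possible functional form to avoidance motivation. The exponential shape and the coefficients are our assumptions, not the paper's; the form is chosen only so that the documented properties hold: motivation flattens as threat rises (P8), and the avoidability slope is zero at zero threat and weaker at high threat than at moderate threat (P9).

```python
import math

def avoidance_motivation(threat: float, avoidability: float) -> float:
    """Hypothetical functional form, not from the paper.
    base: concave in threat, so equal threat increments yield
    shrinking motivation gains (Proposition 8).
    weight: multiplies avoidability; it equals 0 at zero threat (the
    Proposition 9 boundary condition) and decays at high threat,
    weakening the avoidability-motivation slope."""
    base = 1.0 - math.exp(-2.0 * threat)
    weight = 0.5 * threat * math.exp(-2.0 * threat)
    return base + avoidability * weight

# Proposition 9 boundary: at zero threat, avoidability is irrelevant.
assert avoidance_motivation(0.0, 0.1) == avoidance_motivation(0.0, 0.9) == 0.0

# Proposition 8: equal steps in threat yield diminishing gains in motivation.
step1 = avoidance_motivation(0.3, 0.5) - avoidance_motivation(0.0, 0.5)
step2 = avoidance_motivation(0.6, 0.5) - avoidance_motivation(0.3, 0.5)
assert step1 > step2

# Proposition 9: the avoidability slope is flatter under high threat (0.9)
# than under moderate threat (0.3), as in Figure 7.
slope_mod = avoidance_motivation(0.3, 1.0) - avoidance_motivation(0.3, 0.0)
slope_high = avoidance_motivation(0.9, 1.0) - avoidance_motivation(0.9, 0.0)
assert slope_mod > slope_high
```

A caveat on this choice: the avoidability weight first rises from zero before decaying, so the slope comparison holds for moderate versus high threat; any smooth form satisfying both the zero boundary and the weakening at high threat would be equally serviceable.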


Risk Tolerance and Social Influence

So far our discussion of TTAT has assumed that IT users respond uniformly to IT threats without being influenced by their social environment. In reality, different people rarely have the same degree of responsiveness to a given threat, and their responses are hardly immune to influences from their surrounding environment. To account for these effects, we include two variables in TTAT: risk tolerance and social influence.

Risk tolerance mainly influences users' threat perceptions. In the finance literature, risk tolerance refers to the maximum amount of uncertainty that one is willing to accept when making financial decisions (Grable 2000). Due to the ubiquity of risk in financial decisions, many studies have focused on risk tolerance and found that it is positively related to risky financial behaviors (Grable et al. 2006; Hallahan et al. 2004; Yao et al. 2004). Although risk tolerance is a finance concept, Barsky et al. (1997) suggest that individuals tend to show similarly tolerant responses to risky situations across settings. They find that risk tolerance predicts a set of diverse risky behaviors such as smoking, drinking, not having insurance, choosing risky employment, and holding risky assets. In addition, ample evidence demonstrates that risk tolerance is a personal trait related to demographic variables including age, gender, marital status, race, religion, education, and income (Filbeck et al. 2005; Grable 2000; Hallahan et al. 2004). Therefore, we contend that risk tolerance also influences IT users' responses to IT threats. Based on cybernetics, we define risk tolerance as the minimum discrepancy between the undesired end state (being attacked by malicious IT) and the current state that users are able to tolerate. A more risk-tolerant user will be able to endure more (objectively) threatening malicious IT than a less risk-tolerant user.
Facing the same malicious IT, the former will perceive a lower degree of threat than the latter. Thus, users' risk tolerance has a negative effect on their perceived threat.

Proposition 11: Users' risk tolerance negatively influences their perception of IT threat.

Social influence has a more general effect on users' IT threat avoidance behavior. Given that an individual user is almost always part of a group, an organization, or a society, his or her behaviors are inevitably influenced by other people. Research on the diffusion of innovations finds that interpersonal networks and mass media play a critical role in convincing people to adopt innovations (Rogers 2003). Similarly, IT users' threat avoidance behavior is also subject to the influence of their social environment. Based on psychology theory, we contend that there are two types of social influence: informational influence and normative influence (Burnkrant and Cousineau 1975; Deutsch and Gerard 1955). First, the social environment provides information to individual users that helps them appraise malicious IT and safeguarding measures. This is especially important when users lack knowledge and experience regarding IT threat avoidance. Second, the social environment generates normative influence through three processes: compliance, internalization, and identification (Kelman 1974). Compliance is based on the need for approval, acceptance, or rewards, or on the fear of punishment. Internalization is realized by developing congruence between one's own and a group's shared values or goals. Identification is based on an individual's self-defining relationship with another person or group. The consequence of these processes is that an individual will perform behaviors that are socially desirable. For example, users might take actions to avoid IT threats because doing so is a requirement of their company, even though they do not perceive the threats themselves. In summary, social influence both provides information to and imposes normative pressures on IT users. While the informational influence is mediated by the threat and coping appraisals, the normative influence has a direct impact on avoidance motivation.

Proposition 12: Social influence affects users' evaluation of IT threats and safeguarding measures as well as their motivation to avoid IT threats.

[Figure 6. The Curvilinear Relationship between Perceived Threat and Avoidance Motivation: avoidance motivation rises with perceived threat along a curve whose slope flattens at higher threat levels.]

[Figure 7. The Moderating Role of Perceived Threat: avoidance motivation plotted against perceived avoidability, with a steeper slope under low threat than under high threat.]

In summary, TTAT postulates that IT threat avoidance can be represented by a positive feedback loop in which users try to get away from being harmed by malicious IT. The threat is determined by users' susceptibility to the malicious IT and the severity

of being infected by the malicious IT. To reduce the threat, users have two options: problem-focused and emotion-focused coping. Problem-focused coping involves using safeguarding measures; whether a safeguard is adopted is influenced by its effectiveness and costs and by users' self-efficacy. If no safeguarding measures exist to counter the malicious IT, or if the existing measures cannot completely reduce the threat, users are likely to perform emotion-focused coping, which helps them reduce the IT threat subjectively. Integrating process theory and variance theory, TTAT draws from cybernetic theory and coping theory to delineate IT users' threat avoidance behavior, which has been understudied in the existing IS literature, thus making a significant contribution to the advancement of IS theory.

Discussion

TTAT helps explain IT phenomena that cannot be accounted for by existing theories in the IS literature. For example, after the well-known Blaster worm attack in 2003, Microsoft released service patches to fix the security holes in the Windows XP operating system. Yet in 2005, Microsoft was still finding 500 to 800 cases of Blaster on Windows machines every day (Naraine 2005). These machines had neither the service patches nor the latest anti-virus software installed. Given that the users might fully understand that Blaster was devastating and that the security tools were effective in removing it, it seemed a mystery that they did not install the tools. With TTAT, we can go beyond the conventional wisdom by examining the users' threat perceptions.

MIS Quarterly Vol. 33 No. 1/March 2009


It is plausible that these users thought Blaster had been totally eradicated after the 2003 outbreak, or that Blaster could not get through the company firewall. That is, their perceived susceptibility to being infected by Blaster was zero. According to TTAT, their perceived threat would then also be zero, and they would not be motivated to take safeguarding measures. This reasonably explains why they did not install the security patches and the latest anti-virus software.

Another example shows how TTAT explains users' inaction under threat. We check e-mail every day. We know there is always a possibility that an e-mail contains malware that could lead to damage to or loss of important information stored on our computers, since many well-known security attacks have been launched through e-mail (e.g., the SoBig worm). In this case, we perceive the threat but can do nothing to eliminate it, because we know nothing about a particular attack before it happens. Based on TTAT, the avoidability of this lurking threat is close to zero. Given that there are no countermeasures, we tend to rely on emotional coping to maintain psychological balance. We think, "it won't happen to me" (denial), or "nobody would be interested in stealing my information" (wishful thinking). As a result, the perceived threat level is lowered and checking e-mail remains a non-stressful task.

As these examples demonstrate, TTAT reveals insights that help us understand both active and passive user responses to IT threats, thus providing important implications for IT security practice. As the world is swarmed by a variety of malicious IT that, much like an epidemic, constantly evolves to find new ways of causing damage (Householder et al. 2002), it is imperative for IT executives and managers to understand how users cope with such threats. TTAT provides a framework that explains the cognitive processes people use to appraise threats, seek solutions, and ultimately avoid IT threats by applying safeguarding measures.
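The two cases can be restated numerically. The belief values below are hypothetical, and the multiplicative combination of susceptibility and severity is an assumption of this sketch, but they show how TTAT's logic plays out: zero perceived susceptibility wipes out the perceived threat entirely, while a perceived threat with near-zero avoidability leaves only emotion-focused coping.

```python
# Hypothetical 0..1 belief values mirroring the two cases above.

# Blaster case: severity is believed to be high, but susceptibility is
# zero ("Blaster was eradicated" / "the firewall blocks it").
susceptibility, severity = 0.0, 1.0
perceived_threat = susceptibility * severity  # both beliefs are needed
# perceived_threat is 0.0, so there is no motivation to install patches.

# E-mail case: a threat is perceived, but no safeguard exists beforehand.
email_threat, avoidability = 0.6, 0.0
if email_threat > 0.0 and avoidability == 0.0:
    coping = "emotion-focused"  # denial, wishful thinking
else:
    coping = "problem-focused"
```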
Equipped with TTAT, IT managers will be able to design more effective organizational mechanisms to educate employees about IT threats, thus increasing the likelihood of successfully enforcing firm-level IT security policies. TTAT suggests that to motivate employees to avoid malicious IT, firms should attend to the two cognitive processes through which employees evaluate the IT threat and the safeguarding measure. Raising employees' perception of IT threats will complement efforts that stress the usefulness and ease of use of safeguarding measures, facilitating security policy compliance at the organizational level. Extending prior research on IT security that has stressed the importance of awareness or recognition of security problems (Goodhue and Straub 1991; Straub and Welke 1998), our theory offers a deeper understanding of how to raise security awareness. We suggest that IT users need to be educated about not only the likelihood of being attacked by malicious IT but also the negative outcomes once they become victims. According to TTAT, they need both wires connected to become sensitized; otherwise, they will remain ignorant of the threat and take no action even when they are in jeopardy.

In addition, TTAT provides prescriptive guidelines for IT practice. It suggests that perceived threat can be deliberately leveraged to motivate users, if users are helped to perceive how devastating the negative consequences of IT threats could be and how likely those consequences are to occur. Communication and educational programs can be designed based on the framework provided by TTAT. To promote employees' avoidance of malicious IT, firms should raise the level of their employees' perceived threat. They also need to advocate the effectiveness of the safeguarding measures they plan to use, offer support to decrease perceived costs, and provide training to increase employees' self-efficacy. It should be noted that overly strong threat perceptions can increase emotion-focused coping, which neutralizes employees' desire to cope with threats and hinders their adoption of safeguarding measures. Thus, firms need to pay attention to the "dosage" of their educational programs to ensure that they produce appropriate levels of threat perception in their employees.
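The "dosage" caveat can be illustrated with a toy model. Nothing here is specified by TTAT: the linear motivation term, the quadratic emotion-focused damping term, and all of the numbers are assumptions chosen only to reproduce the qualitative shape the paragraph describes, in which motivation peaks at a moderate threat level and declines when fear appeals are pushed to the extreme.

```python
# Toy model of the "dosage" effect: motivation rises with perceived
# threat but is dampened by emotion-focused coping, which (by
# assumption) grows fastest at high threat levels.

def avoidance_motivation(threat: float, avoidability: float = 0.8) -> float:
    emotional_damping = 0.5 * threat ** 2  # emotion-focused coping term
    return threat * avoidability - emotional_damping

moderate = avoidance_motivation(0.8)  # moderate fear appeal
extreme = avoidance_motivation(1.0)   # maximal fear appeal
# Under these assumptions, the moderate appeal motivates more than the
# maximal one: moderate > extreme.
```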

Conclusion

In this paper we develop TTAT by synthesizing a multitude of theories and relevant literature from psychology, health psychology, information systems, management, marketing, and finance. To the best of our knowledge, this is the first attempt to build a theory that explains individual IT users' threat avoidance behavior. In parting, we highlight the contributions of this paper. First, we articulate the approach–avoidance distinction that has long been overlooked by IS researchers, which helps to clarify the confusion between adoption of safeguarding IT and avoidance of malicious IT. Second, we describe IT users' avoidance behavior as a dynamic positive feedback loop, which allows a more complete understanding of IT threat avoidance to be achieved. Third, the threat appraisal and coping appraisal precisely delineate users' cognitive processes under IT threats. Fourth, we integrate a process theory view and a variance theory view of TTAT, thus allowing TTAT to be tested by both process research and variance research. Finally, we propose that users can adopt both problem- and emotion-focused coping to reduce IT threats. In the IT threat context, emotion-focused coping cannot be overlooked, and the interaction between the two coping strategies leads to interesting relationships that have important implications for IS research and practice. This paper breaks ground for future research to empirically test or theoretically extend TTAT, providing exciting opportunities to enhance our understanding of IT security phenomena.

Acknowledgments

We would like to thank the three anonymous reviewers, the associate editor, and the senior editor for their encouraging and constructive comments, which helped us considerably improve the paper. An early version of this paper was presented at the JAIS Theory Development Workshop at the International Conference on Information Systems in Las Vegas. We thank Vicki McKinney, Paul Pavlou, Nava Pliskin, Andrew Schwarz, and Viswanath Venkatesh for their valuable feedback.

References

Adams, D. A., Nelson, R. R., and Todd, P. A. 1992. “Perceived Usefulness, Ease of Use, and Usage of Information Technology: A Replication,” MIS Quarterly (16:2), pp. 227-247.
Agarwal, R., Sambamurthy, V., and Stair, R. M. 2000. “Research Report: The Evolving Relationship Between General and Specific Computer Self-Efficacy—An Empirical Assessment,” Information Systems Research (11:4), pp. 418-430.
Ajzen, I. 1991. “The Theory of Planned Behavior,” Organizational Behavior and Human Decision Processes (50), pp. 179-211.
Ajzen, I., and Fishbein, M. 1980. Understanding Attitudes and Predicting Behavior, Englewood Cliffs, NJ: Prentice Hall.
Bagchi, K., and Udo, G. 2003. “An Analysis of the Growth of Computer and Internet Security Breaches,” Communications of the AIS (12), pp. 684-700.
Bandura, A. 1977. “Self-Efficacy: Toward a Unifying Theory of Behavior Change,” Psychological Review (84), pp. 191-215.
Bandura, A. 1982. “Self-Efficacy Mechanism in Human Agency,” American Psychologist (37), pp. 122-147.
Barsky, R. B., Juster, F. T., Kimball, M. S., and Shapiro, M. D. 1997. “Preference Parameters and Behavioral Heterogeneity: An Experimental Approach in the Health and Retirement Study,” The Quarterly Journal of Economics (112:2), pp. 537-579.
Baskerville, R. 1988. Designing Information Systems Security, Chichester, UK: John Wiley.
Baskerville, R. 1991a. “Risk Analysis: An Interpretive Feasibility Tool in Justifying Information Systems Security,” European Journal of Information Systems (1:2), pp. 121-130.
Baskerville, R. 1991b. “Risk Analysis as a Source of Professional Knowledge,” Computers & Security (10:8), pp. 749-764.
Baskerville, R. 1993. “Information Systems Security Design Methods: Implications for Information Systems Development,” ACM Computing Surveys (25:4), pp. 375-414.
Beaudry, A., and Pinsonneault, A. 2005. “Understanding User Responses to Information Technology: A Coping Model of User Adaptation,” MIS Quarterly (29:3), pp. 493-524.

Bhattacherjee, A. 2001. “Understanding Information Systems Continuance: An Expectation-Confirmation Model,” MIS Quarterly (25:3), pp. 351-370.
Bhattacherjee, A., and Premkumar, G. 2004. “Understanding Changes in Belief and Attitude Toward Information Technology Usage: A Theoretical Model and Longitudinal Test,” MIS Quarterly (28:2), pp. 229-254.
Brown, S. A., Massey, A. P., Montoya-Weiss, M. M., and Burkman, J. P. 2002. “Do I Really Have To? User Acceptance of Mandated Technology,” European Journal of Information Systems (11), pp. 283-295.
Burnkrant, R. E., and Cousineau, A. 1975. “Informational and Normative Social Influence in Buyer Behavior,” Journal of Consumer Research (2), pp. 206-215.
Carver, C. S. 2006. “Approach, Avoidance, and the Self-Regulation of Affect and Action,” Motivation and Emotion (30), pp. 105-110.
Carver, C. S., and Scheier, M. F. 1982. “Control Theory: A Useful Conceptual Framework for Personality-Social, Clinical, and Health Psychology,” Psychological Bulletin (92:1), pp. 111-135.
Carver, C. S., and White, T. L. 1994. “Behavioral Inhibition, Behavioral Activation, and Affective Responses to Impending Reward and Punishment: The BIS/BAS Scales,” Journal of Personality and Social Psychology (67), pp. 319-333.
Chau, P. Y. K. 2001. “Influence of Computer Attitude and Self-Efficacy on IT Usage Behavior,” Journal of End User Computing (13:1), pp. 26-33.
Compeau, D. R., and Higgins, C. A. 1995. “Computer Self-Efficacy: Development of a Measure and Initial Test,” MIS Quarterly (19), pp. 189-211.
Compeau, D. R., Higgins, C. A., and Huff, S. 1999. “Social Cognitive Theory and Individual Reactions to Computing Technology: A Longitudinal Study,” MIS Quarterly (23:2), pp. 145-158.
Computer Economics. 2003. “Virus Attack Costs on the Rise–Again,” September (http://www.computereconomics.com/article.cfm?id=873).
Computer Economics. 2007. “Annual Worldwide Economic Damages from Malware Exceed $13 Billion,” June (http://www.computereconomics.com/article.cfm?id=1225).
Culnan, M. 1993. “‘How Did They Get My Name?’ An Exploratory Investigation of Consumer Attitudes Toward Secondary Information Use,” MIS Quarterly (17:3), pp. 341-363.
Culnan, M., and Armstrong, P. 1999. “Information Privacy Concerns, Procedural Fairness, and Impersonal Trust: An Empirical Investigation,” Organization Science (10:1), pp. 104-115.
Davidson, R. 1995. “Cerebral Asymmetry, Emotion, and Affective Style,” in Brain Asymmetry, K. Hugdahl (ed.), Cambridge, MA: MIT Press.
Davis, F. D. 1989. “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology,” MIS Quarterly (13:3), pp. 319-338.
Davis, F. D., Bagozzi, R. P., and Warshaw, P. R. 1989. “User Acceptance of Computer Technology: A Comparison of Two Theoretical Models,” Management Science (35:8), pp. 982-1003.


Deutsch, M., and Gerard, H. B. 1955. “A Study of Normative and Informational Social Influences upon Individual Judgment,” Journal of Abnormal and Social Psychology (51), pp. 629-636.
Dhillon, G., and Backhouse, J. 2000. “Information System Security Management in the New Millennium,” Communications of the ACM (43:7), pp. 125-128.
Edwards, J. 1992. “A Cybernetic Theory of Stress, Coping, and Well-Being in Organizations,” Academy of Management Review (17:2), pp. 238-274.
Elliot, A. J. 2006. “The Hierarchical Model of Approach–Avoidance Motivation,” Motivation and Emotion (30), pp. 111-116.
Elliot, A. J., and Covington, M. V. 2001. “Approach and Avoidance Motivation,” Educational Psychology Review (13:2), pp. 73-92.
Filbeck, G., Hatfield, P., and Horvath, P. 2005. “Risk Aversion and Personality Type,” Journal of Behavioral Finance (6:4), pp. 170-180.
Fishbein, M., and Ajzen, I. 1975. Belief, Attitude, Intention and Behavior: An Introduction to Theory and Research, Reading, MA: Addison-Wesley.
Flynn, M. F., Lyman, R. D., and Prentice-Dunn, S. 1995. “Protection Motivation Theory and Adherence to Medical Treatment Regimens for Muscular Dystrophy,” Journal of Social and Clinical Psychology, pp. 61-75.
Freud, S. 1915. “Repression,” in Complete Psychological Works of Sigmund Freud, London: Hogarth.
Goodhue, D., and Straub, D. 1991. “Security Concerns of System Users: A Study of Perceptions of the Adequacy of Security,” Information & Management (20), pp. 13-27.
Gordon, L. A., Loeb, M. P., Lucyshyn, W., and Richardson, R. 2006. “2006 CSI/FBI Computer Crime and Security Survey,” Computer Security Institute (http://i.cmpnet.com/gocsi/db_area/pdfs/fbi/FBI2006.pdf).
Grable, J. 2000. “Financial Risk Tolerance and Additional Factors That Affect Risk Taking in Everyday Money Matters,” Journal of Business and Psychology (14:4), pp. 625-630.
Grable, J., Lytton, R. H., O’Neill, B., Joo, S., and Klock, D. 2006. “Risk Tolerance, Projection Bias, Vividness, and Equity Prices,” Journal of Investing (15:2), pp. 68-74.
Gray, J. A. 1982. The Neuropsychology of Anxiety, New York: Oxford University Press.
Green, S. G., and Welsh, M. A. 1988. “Cybernetics and Dependence: Reframing the Control Concept,” Academy of Management Review (13:2), pp. 287-301.
Hallahan, T. A., Faff, R. W., and McKenzie, M. D. 2004. “An Empirical Investigation of Personal Financial Risk Tolerance,” Financial Services Review (13:1), pp. 57-78.
Higgins, E. T. 1997. “Beyond Pleasure and Pain,” American Psychologist (52:12), pp. 1280-1300.
Ho, R. 1998. “The Intention to Give up Smoking: Disease Versus Social Dimensions,” Journal of Social Psychology (138), pp. 368-380.
Hoffman, D. L., Novak, T. P., and Venkatesh, A. 2004. “Has the Internet Become Indispensable?,” Communications of the ACM (47:7), pp. 37-42.


Householder, A., Houle, K., and Dougherty, C. 2002. “Computer Attack Trends Challenge Internet Security,” IEEE Computer (Supplement) (35:4), pp. 5-7.
Hunter, P. 2004. “New Threats This Year,” Computer Fraud & Security (2004:11), pp. 12-13.
James, W. 1890. The Principles of Psychology, New York: Henry Holt & Co.
Janis, I. L. 1967. “Effects of Fear Arousal on Attitude Change: Recent Developments in Theory and Experimental Research,” in Advances in Experimental Social Psychology, L. Berkowitz (ed.), New York: Academic Press, pp. 166-224.
Janz, N. K., and Becker, M. H. 1984. “The Health Belief Model: A Decade Later,” Health Education Quarterly (11:1), pp. 1-45.
Kahneman, D., and Tversky, A. 1979. “Prospect Theory: An Analysis of Decision Under Risk,” Econometrica (47:2), pp. 263-291.
Karahanna, E., Straub, D., and Chervany, N. 1999. “Information Technology Adoption Across Time: A Cross-Sectional Comparison of Pre-Adoption and Post-Adoption Beliefs,” MIS Quarterly (23:2), pp. 183-213.
Kelman, H. C. 1974. “Further Thoughts on the Processes of Compliance, Identification, and Internalization,” in Perspectives on Social Power, J. T. Tedeschi (ed.), Chicago: Aldine, pp. 126-171.
Klein, H. J. 1989. “An Integrated Control Theory Model of Work Motivation,” Academy of Management Review (14:2), pp. 150-172.
Lang, P., Bradley, M., and Cuthbert, B. 1990. “Emotion, Attention, and the Startle Reflex,” Psychological Review (97), pp. 377-395.
Lazarus, R. 1966. Psychological Stress and the Coping Process, New York: McGraw-Hill.
Lazarus, R., and Folkman, S. 1984. Stress, Appraisal, and Coping, New York: Springer.
Loch, K. D., Carr, H. H., and Warkentin, M. E. 1992. “Threats to Information Systems: Today’s Reality, Yesterday’s Understanding,” MIS Quarterly (16:2), pp. 173-186.
Maddux, J. E., and Rogers, R. W. 1983. “Protection Motivation and Self-Efficacy: A Revised Theory of Fear Appeals and Attitude Change,” Journal of Experimental Social Psychology (19), pp. 469-479.
Moore, G. C., and Benbasat, I. 1991. “Development of an Instrument to Measure the Perceptions of Adopting an Information Technology Innovation,” Information Systems Research (2:3), pp. 192-239.
Naraine, R. 2005. “Two Years Later, Blaster Worm Still Squirming,” eWeek.com, December 5 (http://www.eweek.com/c/a/Security/Two-Years-Later-Blaster-Worm-Still-Squirming/).
Nebylitsyn, V. D., and Gray, J. A. 1972. Biological Bases of Individual Behavior, New York: Academic Press.
Oliver, R. L., and Berger, P. K. 1979. “A Path Analysis of Preventive Health Care Decision Models,” Journal of Consumer Research (6:2), pp. 113-122.
Pavlov, I. 1927. Conditioned Reflexes: An Investigation into the Physiological Activity of the Cortex, New York: Dover.
Pechmann, C., Zhao, G., Goldberg, M. E., and Reibling, E. T. 2003. “What to Convey in Antismoking Advertisements for Adolescents: The Use of Protection Motivation Theory to Identify Effective Message Themes,” Journal of Marketing (67), pp. 1-18.
Ray, M., and Wilkie, W. 1970. “Fear: The Potential of an Appeal Neglected by Marketing,” Journal of Marketing (34:1), pp. 54-62.
Rippetoe, P. A., and Rogers, R. W. 1987. “Effects of Components of Protection-Motivation Theory on Adaptive and Maladaptive Coping with a Health Threat,” Journal of Personality and Social Psychology (52:3), pp. 596-604.
Rogers, E. M. 1995. Diffusion of Innovations (4th ed.), New York: The Free Press.
Rogers, E. M. 2003. Diffusion of Innovations (5th ed.), New York: The Free Press.
Rogers, R. W. 1975. “A Protection Motivation Theory of Fear Appeals and Attitude Change,” Journal of Psychology (91), pp. 93-114.
Rogers, R. W. 1983. “Cognitive and Physiological Process in Fear Appeals and Attitude Change: A Revised Theory of Protection Motivation,” in Social Psychophysiology: A Source Book, R. Petty (ed.), New York: Guilford Press, pp. 153-176.
Rosenstock, I. M. 1974. “The Health Belief Model and Preventive Health Behavior,” Health Education Monographs (2), pp. 354-386.
Schultz, E. 2003. “Pandora’s Box: Spyware, Adware, Autoexecution, and NGSCB,” Computers & Security (22:5), pp. 366-367.
Shaw, G. 2003. “Spyware & Adware: The Risks Facing Businesses,” Network Security (2003:9), pp. 12-14.
Sipior, J. C., Ward, B. T., and Roselli, G. R. 2005. “The Ethical and Legal Concerns of Spyware,” Information Systems Management (22:2), pp. 39-49.
Skinner, B. F. 1953. Science and Human Behavior, New York: Macmillan.
Smith, H., Milberg, S., and Burke, S. 1996. “Information Privacy: Measuring Individuals’ Concerns about Organizational Practices,” MIS Quarterly (20:2), pp. 167-196.
Stafford, T. F., and Urbaczewski, A. 2004. “Spyware: The Ghost in the Machine,” Communications of the AIS (14), pp. 291-306.
Steel, P., and Konig, C. J. 2006. “Integrating Theories of Motivation,” Academy of Management Review (31:4), pp. 889-913.
Steers, R. M., Mowday, R. T., and Shapiro, D. L. 2004. “The Future of Work Motivation Theory,” Academy of Management Review (29:3), pp. 379-387.
Stone, E., Gardner, D., Gueutal, H., and McClure, S. 1983. “A Field Experiment Comparing Information-Privacy Values, Beliefs, and Attitudes across Several Types of Organizations,” Journal of Applied Psychology (68:3), pp. 459-468.
Straub, D. 1990. “Effective IS Security: An Empirical Study,” Information Systems Research (1:3), pp. 255-276.
Straub, D., and Nance, W. 1990. “Discovering and Disciplining Computer Abuse in Organizations: A Field Study,” MIS Quarterly (14:1), pp. 45-62.
Straub, D., and Welke, R. 1998. “Coping with Systems Risk: Security Planning Models for Management Decision Making,” MIS Quarterly (22:4), pp. 441-469.

Sutton, S. K., and Davidson, R. 1997. “Prefrontal Brain Asymmetry: A Biological Substrate of the Behavioral Approach and Inhibition Systems,” Psychological Science (8:3), pp. 204-210.
Tanner, J. F., Day, E., and Crask, M. R. 1989. “Protection Motivation Theory: An Extension of Fear Appeals Theory in Communication,” Journal of Business Research (19), pp. 267-276.
Tanner, J. F., Hunt, J. B., and Eppright, D. R. 1991. “The Protection Motivation Model: A Normative Model of Fear Appeals,” Journal of Marketing (55), pp. 36-45.
Taylor, S., and Todd, P. A. 1995. “Understanding Information Technology Usage: A Test of Competing Models,” Information Systems Research (6:2), pp. 144-177.
Thorndike, E. L. 1911. Animal Intelligence, New York: Macmillan.
Tooby, J., and Cosmides, L. 1990. “The Past Explains the Present: Emotional Adaptations and the Structure of Ancestral Environments,” Ethology and Sociobiology (11), pp. 375-424.
Tversky, A., and Kahneman, D. 1974. “Judgment under Uncertainty: Heuristics and Biases,” Science (185:4157), pp. 1124-1131.
Tversky, A., and Kahneman, D. 1992. “Advances in Prospect Theory: Cumulative Representation of Uncertainty,” Journal of Risk and Uncertainty (5), pp. 297-323.
van der Heijden, H. 2004. “User Acceptance of Hedonic Information Systems,” MIS Quarterly (28:4), pp. 695-704.
Venkatesh, V. 2000. “Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation, and Emotion into the Technology Acceptance Model,” Information Systems Research (11:4), pp. 342-365.
Venkatesh, V., and Brown, S. 2001. “A Longitudinal Investigation of Personal Computers in Homes: Adoption Determinants and Emerging Challenges,” MIS Quarterly (25:1), pp. 71-102.
Venkatesh, V., and Davis, F. D. 2000. “A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies,” Management Science (46:2), pp. 186-204.
Venkatesh, V., Morris, M. G., Davis, G. B., and Davis, F. D. 2003. “User Acceptance of Information Technology: Toward a Unified View,” MIS Quarterly (27:3), pp. 425-478.
Vroom, V. H. 1964. Work and Motivation, New York: Wiley.
Weinstein, N. D. 1993. “Testing Four Competing Theories of Health-Protective Behavior,” Health Psychology (12:4), pp. 324-333.
Weinstein, N. D. 2000. “Perceived Probability, Perceived Severity, and Health-Protective Behavior,” Health Psychology (19:1), pp. 65-74.
Westin, A. 1967. Privacy and Freedom, New York: Atheneum.
Whetten, D. 1989. “What Constitutes a Theoretical Contribution?,” Academy of Management Review (14:4), pp. 490-495.
Wiener, N. 1948. Cybernetics: Control and Communication in the Animal and the Machine, Cambridge, MA: MIT Press.
Yao, R., Hanna, S. D., and Lindamood, S. 2004. “Changes in Financial Risk Tolerance, 1983-2001,” Financial Services Review (13:4), pp. 249-266.
Zajonc, R. 1998. “Emotion,” in The Handbook of Social Psychology, G. Lindzey (ed.), New York: McGraw-Hill.


About the Authors

Huigang Liang is an assistant professor in the College of Business at East Carolina University. His current research interests focus on IT issues at both the individual and organizational levels, including IT avoidance, adoption, assimilation, decision processes, and healthcare informatics. His research has appeared in scholarly journals including MIS Quarterly, Communications of the ACM, Decision Support Systems, Journal of Strategic Information Systems, IEEE Transactions on Information Technology in Biomedicine, Communications of the AIS, International Journal of Production Economics, International Journal of Medical Informatics, and Journal of the American Pharmacists Association. He is currently working as co-principal investigator on a $1.3 million NIH grant. He received his Ph.D. from Auburn University.


Yajiong Xue is an assistant professor at East Carolina University. She was an assistant professor of Information Systems at the University of Rhode Island. She received her Ph.D. from Auburn University in 2004. Her research appears in such journals as MIS Quarterly, Communications of the ACM, Communications of the AIS, Decision Support Systems, IEEE Transactions on Information Technology in Biomedicine, Journal of Strategic Information Systems, International Journal of Production Economics, and International Journal of Medical Informatics. Her current research interests include the strategic management of information technology, IT governance, and healthcare information systems. She worked for Pharmacia & Upjohn and Kirsch Pharma GmbH for several years. She served as a senior editor of Harvard China Review in 2005-2006.