STRATEGIC DECEPTION IN THE AGE OF ‘TRUTHINESS’
Sergei A. Samoilenko
Department of Communication, George Mason University, United States of America
Email: [email protected]
ABSTRACT
This paper seeks to contribute to the growing body of knowledge on deceptive communication by attempting to reconcile a relative lack of agreement among different areas of the literature. It examines the notion of deceptive communication in functional terms from the perspective of strategic deception. Specifically, it sees deceptive communication as a set of stratagems that are deliberately employed by pragmatic actors to attain their goals within the current media ecosystem. In addition, it examines special issues related to the strategic use of ambiguity, equivocation, and impression management. It also provides insight into how manipulation and deception are used in psychological warfare and corporate campaigns. Finally, this paper calls for more comprehensive scholarly inquiry aiming to address the complexity of deceptive communication strategies and tactics.

Keywords: deceptive communication, communication competence, impression management, communication campaigns, social media, anonymity.
INTRODUCTION

During the 2016 presidential election, a conspiracy theory spread throughout numerous social media groups and 4chan.org, an image board website for anonymous posting, claiming that Democratic candidate Hillary Clinton was involved in a child sex ring and satanic rituals. This rumor was then circulated via decoy internet sites which specialized in sensationalistic false content. It was also retweeted by the son of Trump’s newly appointed national security adviser Mike Flynn. Soon after, WikiLeaks published hacked emails from Clinton’s campaign, in which her campaign chair and the owner of Comet Ping Pong, a Washington, DC pizza restaurant, discussed the details of a Clinton fundraiser set to take place there. The restaurant was soon branded a place of ill repute, as internet audiences believed the resulting “Pizzagate” story, which suggested the restaurant was the headquarters of the purported child sex ring and that the word “pizza” was a code name for “pedophilia.” The owner and employees of the pizza restaurant soon became the targets of repeated harassment. On December 4, 2016, Edgar Maddison Welch entered the restaurant and fired an assault rifle, claiming he wanted to investigate the story himself (Siddiqui and Svrluga 2016). He had read about the story online and had also heard about it on far-right radio shows. The resulting incident garnered media coverage, giving the conspiracy theory new life and a new wave of exposure. Four months later, annoyed protestors gathered outside the White House to demand further investigation of the story.

Marwick and Lewis (2017) illustrate how a conspiracy theory born within online networked communities becomes “fake news” and spreads virally through social media. In this instance, because of continuous exposure to misinformation, an ordinary citizen was moved to action. The mainstream media coverage that amplified the theory left many believers of the story feeling confused and deprived of the right to know “the truth.” The incident also demonstrated that physical harassment and violence can easily emerge as a direct result of media manipulation and online deception.

The process of “believing” can be viewed as the construction of an individual perception of reality. “Beliefs represent people’s information about themselves and about their social and nonsocial environment, be that information accurate or inaccurate” (Ajzen 1995, 88-89). As such, our belief systems are not universal and fixed, but rather comparative and flexible. “Whether a statement is true is an entirely different question from
whether you or anybody believes it. [...] Someone might use the expression ‘true for me’ to express the idea that each of us makes our own reality and that our beliefs constitute that reality” (Sober 1991, 15-16). Scholars argue that modern-day reality is becoming split as a result of new technologies and the Internet, which largely contribute to fragmented media, audience segmentation, and widening ideological polarization (Bonfadelli 2002; Fonseca 2014; Lee 2009). According to Manjoo (2008), modern communication technologies have altered the very understanding of truth and shifted it to “truthiness.” This word was coined by Stephen Colbert in 2005 to describe America as being divided between two camps of people: those who “think with their head” and those who “know with their heart” (The Colbert Report).

The Internet has changed the media landscape, in which user-generated content and speculative politics have become the norm. The 2016 U.S. election season demonstrated the power of factually inaccurate and misleading information, reflected in the popularity of new coinages such as “fake news” and “alternative facts,” as well as in many Cold War archaisms, including “disinformation” and “kompromat,” that reemerged as hot topics. As the above “Pizzagate” case suggests, many falsehoods are the products of rumors and conspiracy theories coming from venues like online chat rooms, digital news shows, various social media platforms, and “gated” online communities. Moreover, there are many examples of pragmatic deception in which communication is intended to deceive or to let the whole truth become or remain opaque (McCornack and Levine 1990). Miller (1983) conceives of deception as an instrumental strategy: a means to an end that is relevant to the goals of the communicators in a particular situation. For the purpose of this chapter, I define “strategic deception” as a deliberate communicative attempt that intends, “whether successful or not, to conceal, fabricate, and/or manipulate in any other way, factual and/or emotional information, by verbal and/or nonverbal means, in order to create or maintain in another or others a belief that the communicator himself or herself considers false” (Masip, Garrido, and Herrero 2004). Clearly, deception is also used to alter a preexisting belief (Buller and Burgoon 1994).

This chapter intends to accomplish several objectives. First, it seeks to contribute to the growing body of knowledge on deceptive communication by attempting to reconcile a relative lack of agreement among different areas of the literature. Second, it examines the notion of deceptive communication in functional terms from the perspective of strategic deception. Specifically, it sees deceptive communication as a set of stratagems that are deliberately employed by pragmatic actors to attain their goals within the current media ecosystem. Third, this chapter also discusses special issues related to the strategic use of ambiguity, equivocation, and impression management. Both scholars and practitioners will benefit from its insights into how manipulation and deception are used in psychological warfare and corporate campaigns. Importantly, this chapter views the Internet environment as highly conducive to deception and susceptible to state and corporate communication campaigns. Finally, it calls for more comprehensive scholarly inquiry aiming to address the complexity of deceptive communication strategies and tactics.
THEORETICAL FRAMEWORK

Deceptive Communication Research
The social world is manifested through individual and collective struggles that “seek to impose the legitimate definition of reality” (Bourdieu 1990, 141). Players compete with each other for power by playing “games of strategy” (Foucault 1997), trying to obtain or protect a desired resource or capital. Their plans and decisions are realized through stratagems, defined by the Oxford English Dictionary as devices or schemes for obtaining an advantage. Many social actors also resort to manipulation, which involves the exertion of social control or influence over a person or a situation. It usually implies the use of underhanded tactics in a clever, skillful, or unscrupulous manner. In comparison to persuasion, which is seen by Guth and Marsh (2011) as an ethical process through which consensus and approval are gained, manipulation selects alternatives and information that make the receiver behave in a way that benefits the manipulator.

Deceptive communication traditionally refers to the conscious misrepresentation of information, with the deceiver strategically choosing between two major forms of lying: concealment, or leaving out true information, and falsification, or presenting false information as if it were true (Ekman 1985, 41). However, the category of what counts as deceptive communication is broad and includes various forms of non-straightforward communication, such as indirect speech acts, strategic ambiguity, equivocal communication, and others. According to Masip, Garrido, and Herrero (2004), most scholarship on deception includes at least one of the
three elements that Coleman and Kay (1981) identified as components of the prototypical lie: the objective falsity of the proposition, the sender’s belief in this falsity, and the intention of the sender to deceive the receiver. Knapp and Comadena (1979) identify three noninformational criteria for deception: 1) the actor’s motivation; 2) the degree to which the actor was aware of what he or she was doing; and 3) the effects of the act on the parties involved. Thus, deceptive communication is an intentional act, which involves the deliberate manipulation of verbal and/or nonverbal messages, behavior, or image to lead a target person to a false belief or conclusion without providing a proper forewarning (Buller and Burgoon 1994; Dunbar 2009). Deception can also occur when: a) a sender is less informative; b) the message is syntactically incomplete and does not meet the minimal requirements for a response; or c) the message is semantically incomplete or does not contain pertinent information (Buller and Burgoon 1994).

In addition, scholars debate the status of various forms of manipulative communication such as white lies, cover-ups, bluffing, euphemisms, masks, pretenses, tall tales, put-ons, and hoaxes (Knapp and Comadena 1979). Some scholars (Bavelas et al. 1990; Hopper and Bell 1984) argue that the aspect of pretending both encompasses and differentiates among different types of deception. Also, simply communicating a false message does not necessarily constitute an act of deception. For example, intentionally transparent lies (e.g., jokes) and mistaken lies (such as unknowingly providing faulty instructions) are, respectively, neither expected to confuse the target nor intended to deceive him or her.

Deception research is largely concerned with issues of deception detection (Burgoon 2009; Dunbar 2009), ambiguity and message distortion (Huckfeldt 1998), as well as moral agency and communication ethics (Bok 1983), among other issues. Specifically, deception detection issues are primarily addressed by Grice’s notion of conversational implicature (Grice 1975), Information Manipulation Theory (McCornack 1992), and Interpersonal Deception Theory (Buller and Burgoon 1996). Grice’s maxim of quality urges speakers to refrain from saying anything they know to be false or for which they have inadequate grounds for belief. McCornack raises pragmatic concerns about the violation of the maxim of quality in regard to content, possible detection, situational context, and relational consequences associated with deceptive messages. Buller and Burgoon seek to predict and explain deception in the context of interpersonal verbal and nonverbal interactions as influenced by personal goals and the meaning of an unfolding interaction.

Scholarship on fallacies arising from ambiguities is especially instrumental for understanding the subtleties of deceptive communication. According to Walton (1996, 2), there are three leading factors that may lead to such fallacies: vagueness, ambiguity, and obscurity. An expression is vague when it has no definite borderlines clearly indicating whether something is included under that expression or not. An ambiguous message has more than one meaning. An expression is obscure (unclear) if the receiver does not understand its meaning. While these factors are not inherently erroneous, they can naturally lead to misunderstanding, misdirection, and logical errors.
Clearly, these intricacies of discourse can be conveniently exploited by skillful communicators to influence political or social discourse. Essentially, strategic deception is a mind game between the communicator and his or her target. Botan (2017) refers to strategic communication as planned campaigns that grow out of first understanding what publics think and want. In other words, campaigns based on anything else cannot be strategic because their relationship to what publics are thinking is unclear; what their planners call “strategies” may in fact not be strategic at all. Thus, crafting a deceptive message requires a careful examination of the situation, the personal attributes of the target, and the target’s relational closeness with the deceiver. Clearly, the target’s perceptions, biases, and stereotypes equally determine the effectiveness of deceptive communication. Naturally, the sender first attempts to evaluate the target’s knowledge and then to generate a false belief in the receiver. However, if the receiver already has a false belief, the communicator may opt not to correct that impression (Masip, Garrido, and Herrero 2004). Any relevant feedback from the receiver is particularly critical, as it helps the deceiver identify strategies to reduce incredulity or minimize the risks of possible detection. Buller and Burgoon (1994) argue that intentional deception requires greater cognitive exertion than truthful communication regardless of the deception type. The deceiver may choose to manipulate several message characteristics with the aim of decreasing detectability or disassociating the sender from the message to provide deniability.

The goals and motivations of deceptive communication may vary across three categories suggested by Buller and Burgoon (1996): (a) instrumental, to avoid punishment or to protect resources; (b) relational, to maintain interpersonal relationships or bonds; and (c) identity, to preserve face or the self-image. The strategy choice model (Seibold, Cantrill, and Meyers 1985) posits that communicators are not limited to lying but can draw on a variety of possible messages that fit their deception strategy. The choice of deceptive tactics
depends on multiple factors, such as the time available to plan, the consequences of being detected, the chances of escaping possible detection, and so on (Hopper and Bell 1984). Strategic communicators can select from a list of deceptive acts: a) to fabricate false information (lies); b) to conceal or omit truthful or relevant information (concealment/half-truths); c) to exaggerate or overstate truthful information (exaggerations); d) to minimize or downplay aspects of the truth (understatements); e) to mix truthful and deceptive information in order to mislead and misdirect attention (implicit falsifications/misdirections); and f) to create false beliefs by skirting issues, changing the subject, or responding indirectly (equivocations), among others (Bavelas et al. 1990; Buller and Burgoon 1994). Importantly, deception, like other planned and overlearned behavior, can be strategic without being highly conscious (Buller and Burgoon 1996). In other words, a skillful communicator can learn how to manage information, behavior, and image to create a believable communication performance.

In contemporary academic research, the public conception of deceptive communication is frequently gauged in terms of a value-laden moral compass, and deception is mostly viewed as dishonest, deviant, and aberrant compared to open and clear communication. Bavelas et al. (1990) suggest that this view often overlooks the subtlety and situational characteristics of communication discourse. Under certain circumstances, deceptive communication is exempted from reprehension and becomes perceived as socially acceptable and competent.
Deception as Communication Competence
In modern society, the content of most conversations is assumed to be truthful (Goffman 1959). In actual practice, however, veracity appears to be just one of many strategies to achieve and secure a desired outcome (Ekman 1985; Wolk and Henley 1970). Strategic deception has evolutionary origins (Kraut 1980) and has been essential for human survival for many centuries (Sun Tzu 1963). Buller and Burgoon (1994) argue that the ability to deceive successfully is considered a socially competent and appropriate communication strategy if it helps to spare the target the experience of a painful truth. The hallmarks of successful interpersonal deception are perceived as “the ability to modulate one’s performance so as to create false beliefs in the receiver, to control and mask spontaneous expressions that might betray one’s true feelings, and to monitor and adjust one’s performances in response to any indications of skepticism or suspicion by receivers” (Burgoon and Bacue 2003, 207). Various forms of manipulative communication, such as topic avoidance and ambiguous messages, are frequently employed by relational partners to avoid greater degrees of relational uncertainty during divorce and other times of stressful change (Golish and Caughlin 2002; McManus and Nussbaum 2011). Goffman (1959) argues that, because of the cooperative principle dictated by social dramaturgy, people agree tacitly to support each other’s performance, or face, to avoid disrupting the entire social scene. No one can continue in performance when others are embarrassed or shamed (Cupach and Metts 1994). Thus, deceptive communication is often perceived as a social skill and placed in the same domain with topic avoidance, saving face, and politeness. The situational approach suggests a closer examination of equivocation, ambiguity, and impression management in strategic discourse.
Equivocal Communication
Equivocation is a popular technique that lends deceptive cogency to an argument; it occurs when the same word or phrase undergoes an unsignaled shift in meaning (Johnson and Blair 2006, 154). In 1982, a dissatisfied fan brought a suit against the Chicago Bears charging them with false advertising: “they advertise themselves as a professional team, but they do not play a very professional game.” Equivocation exploiting double meaning is frequently used in transparent deception (e.g., humorous arguments, jokes, and puns) that can be easily recognized by the audience. Equivocation is one of the most popular deceptive communication strategies because it creates a false belief without being dishonest (Bavelas et al. 1990). As Walton (1996, 68) suggests, because genuine cases of equivocation arise from a shift in context, they have a special capability to deceive in a given context of dialogue. Bavelas et al. (1990) argue that equivocation is neither a false message nor a clear truth, but rather an alternative used in complex situations. It is often used to resolve situations that present an avoidance-avoidance conflict, in which clear or direct messages would lead to negative consequences. Thus, equivocation is a tactic of avoidance, a means of “leaving the field,” used as an alternative when all other communicative choices appear to exacerbate the conflict or lead to lose-lose outcomes.
Strategic Ambiguity
Clarity and openness are not always seen as the only criteria of communication effectiveness. According to Eisenberg (1984), in organizational settings, successful strategy is frequently associated with the subtle roles of ambiguity, tact, politeness, white lies, and agenda control. Strategic ambiguity ranges in its opacity and must be viewed as a continuum, from most clear to most ambiguous. Strategic ambiguity serves several functions in formal organizations. For example, it helps facilitate organizational change and relational development. Effective leaders use ambiguity against groupthink and passive conformity to encourage creativity among team members. It is often used to generate multiple viewpoints and orient individuals toward multiple goals. Athos and Pascale (1981) argue that in some cases vague communication serves to hold strained relations together and reduce unnecessary conflict. Strategic ambiguity provides leaders with “character insurance,” helping them maintain their power and standing in the organization (Williams and Goss 1975). Clarity is risky for highly credible leaders because new information brings additional challenges, which can potentially result in a negative reevaluation of character by subordinates and the public. Organizational and political actors with low credibility benefit more from clear communication when it helps to make a positive impression and improve their image.

It is impossible for political actors to exercise and retain power without strategic ambiguity (Yoder 1983). It gives them room to maneuver, conceals tactical information from their opponents, and complicates the opponents’ sense-making of the situation. It also allows individuals to protect confidentiality, avoid dysfunctional conflicts, and, if necessary, save face gracefully. Strategic ambiguity creates an option of deniability and helps preserve privileged positions (Eisenberg 1984). In other words, the more ambiguous the messaging, the easier it is to deny specific interpretations. Strategic ambiguity in task-related activities also helps to preserve future options. For example, disclosure of information in unequivocal terms limits options and may prematurely endanger plans (Bok 1983, as cited in Eisenberg 1984).

Constructive ambiguity, a term often credited to former United States Secretary of State Henry Kissinger (Keller 2012), refers to a negotiating tactic used to disguise an inability to resolve a contentious issue. It is similar to Carl Schmitt’s notion of the “dilatorischer Formelkompromiss” (disingenuous compromise), which produces a constitution that, in Schmitt’s words, “satisfies all contradictory demands and leaves, in an ambiguous turn of phrase, the actual points of controversy undecided, and therefore provides nothing but a semantic jumble of substantively irreconcilable matters” (Versteeg 2015, 721). Specifically, this is the deliberate use of ambiguous language on a sensitive issue in order to allow each party to claim that it obtained some concession on it.
Impression Management
“Managing an impression” refers to the presentation and maintenance of social identity (Metts 2009). Impression management does not imply setting an artificial or manipulative agenda, but rather refers to the image that a person presents during social interactions (e.g., a job interview or a first date). People cannot display all aspects of their private selves when interacting with others. Instead, they adopt various forms of strategic self-presentation (Jones 1964) to perform and manage the impression perceived as most appropriate in a particular situation. The notion of impression management has been applied in the professional practices of image or reputation management (Coombs and Holladay 2006). Professional communicators apply two different processes of impression management, impression motivation and impression construction, to bolster the image of their organizations or political candidates. According to Leary and Kowalski (1990, 35), impression motivation is the desire to create particular impressions in others’ minds, while impression construction explains how people may alter their behaviors to affect others’ impressions of them.

Clearly, there is a very subtle difference between understanding deceptive communication in terms of plain lies and distortion and understanding it in terms of skills and competence. This distinction becomes even more nebulous when attempting to study strategic deception campaigns run by state or corporate actors.
STRATEGIC DECEPTION CAMPAIGNS

Strategic Deception in Psychological Warfare

The most famous quotation of Sun Tzu is that “all warfare is based upon deception.” Many political conflicts entail psychological manipulation through media (Libicki 1995). This is most obvious in ideological conflicts, when adversarial political agendas compete for influence through the means of information/psychological warfare. In the area of information and psychological warfare, applied deception is seen as a strategic art that helps a nation to impose its will on an adversary without using military force (MacDonald 2007). This type of warfare is critical, as the best success in war is achieved by the destruction of the enemy’s will to resist, and with a minimum annihilation of fighting capacity (Lasswell 1951, 261). Strategic deception helps achieve a competitive advantage derived from the ability to exploit a superior information position (Alberts, Gartska, and Stein 1999, 32).

Military intelligence often refers to strategic deception as denial and deception (Godson and Wirtz 2000). A U.S. Air Force (2005) doctrine on information operations lists several such practices: psychological operations, military deception, operations security, counter-intelligence, public affairs, and counterpropaganda. The denial component refers to efforts to conceal facts and block information channels, thus preventing the adversary from obtaining intelligence about the real situation. Apart from concealing the truth, it is equally important to reveal a distorted reality. For instance, strategic activities that deliberately exaggerate capabilities help to deter a stronger adversary. During the Gulf War, for example, the U.S. military downplayed its lack of armored forces by focusing reporters’ attention on the arrival in Saudi Arabia of the 82nd Airborne, F-15C Eagles, and the Marines. The goal was to make Hussein believe that the United States had sufficient forces in the Kingdom to defend it effectively against an Iraqi invasion (Hallion 1992).

Military deception is based on stratagems, defined in the Oxford English Dictionary as “acts of generalship, usually an artifice or a trick designed to outwit or surprise the enemy.” According to MacDonald (2007), the primary deceptive stratagems include camouflage, decoys, mimicry, dazzle, conditioning, disinformation, and so on. These stratagems help to fulfill different objectives. For instance, camouflage increases ambiguity through hiding or masking an object, person, or activity. A decoy is designed to show a false target, which the deceiver wants his enemy to attack. Mimicry seeks to make one thing appear to be something else. Dazzle involves increasing ambiguity by overwhelming an adversary with unimportant information. Through conditioning, communicators may try to create a pattern of behavior so that the target develops an expectation that can later be exploited. For instance, a source may choose to condition the target by providing valid information for a sustained period of time before finally distributing the single key piece of disinformation. Finally, disinformation is a well-known category of deception that reinforces an existing belief so that ambiguity decreases, making an enemy certain, but wrong. “Disinformation” is different from “misinformation,” which is unintentionally false (Jowett and O'Donnell 2011). Disinformation may appear as news stories deliberately designed and planted in news outlets.
The propagandist may create a deflective source, such as a “fake” independent group, to disseminate information in the form of factoids or negative information. It is often used in the form of character assassination aimed at discrediting adversaries and spreading rumors to mask the true intentions of the perpetrator (Samoilenko 2016). For example, in 1937 the German secret police fed disinformation about Soviet officers to the Soviets through a double agent. As a result, the 1937–8 purge that followed the supposed military conspiracy against Stalin cost the lives of about 20,000 Soviet officers, more than one-third of those on active duty, and crippled the Red Army (Epstein 1989). In the 1980s, the Reagan Administration engaged in a disinformation campaign against then-leader of Libya, Muammar Gaddafi (Biagi 2014).

The origins of disinformation are often traced to the phony towns constructed by Russian military leader Grigory Potyomkin in Crimea to impress Catherine the Great during her 1783 journey to the region. This inventive strategy is often referred to as Potemkin villages (Pacepa and Rychlak 2013). Disinformation was also a crucial tactic in the Soviet political warfare known as active measures (Manning and Romerstein 2004). Soviet intelligence used the term maskirovka for a combination of tactics including disinformation, simulation, camouflage, and concealment. Prototypical
instances of disinformation include deceptive advertising, government propaganda, forged documents, manipulated Wikipedia entries, doctored photographs, and so on (MacDonald 2007). A notable example of disinformation occurred in February 2015, when U.S. Senator James Inhofe presented photographic “proof” of Russia’s invasion of Ukraine in the Senate. Inhofe had been provided with the photographic “evidence” of a Russian invasion by a delegation from Kiev in print form, as if the images came “directly from a camera.” Internet users and bloggers immediately identified some of the photos as dating from the 2008 conflict between Georgia and South Ossetia, in which Russia became involved (Mackey 2015).

Throughout history, propaganda has been an essential element of psychological warfare and subversion used for waging wars and promoting political ideologies (Jowett and O'Donnell 2011; Samoilenko 2018). Young and Launer (1988) define propaganda as discourse that “attempts to conceal evidence and subvert rational processes” (272). Black propaganda is primarily associated with creative deceit in which the source is concealed. Joseph Goebbels, Hitler’s propaganda minister, claimed that outrageous charges evoke more belief than milder statements that merely twist the truth slightly (Bogart 1995, xii). Dehumanization and demonization of the enemy are the most important elements of black propaganda. These are particularly common forms of moral disengagement (Bandura 1999), which refers to the cognitive restructuring of inhumane conduct into worthy conduct by using moral justification, sanitizing language, and attribution of blame to those who are victimized. Every nation’s propaganda attempts to present the other side as a fundamental threat to national values and beliefs, frequently reducing the target to a lesser, more animalistic status. Common tactics present the enemy as embodying the exact opposite of what is valued in the society and characterize the target as subhuman or inhuman without redeeming values (Johnson-Cartee and Copeland 2004). For example, Lasswell (1927) notes that Allied propaganda during World War I relied on “simple Satanism” as an important media manipulation strategy. It was primarily achieved through the personalization of Kaiser Wilhelm II, depicting him as a madman and a warmonger. The adversaries created posters of their enemy as a subhuman monster, blamed it for starting the war, and accused it of unspeakable atrocities. In 1930s Nazi Germany, educational propaganda shaped the beliefs of school children through the reading of assigned texts in which the Jews were compared to poisonous mushrooms. During the Gulf War, Saddam Hussein, “the Butcher of Baghdad,” was compared to Hitler and demonized as a voracious spider in cartoons. Most recently, during the 2014 Ukrainian crisis, headlines continuously focused on the personality of Russian President Vladimir Putin, vilifying him as the latest reincarnation of Hitler and Nazi Germany, questioning his mental state, and presenting him as the main threat to Europe and the rest of the civilized world (Johnson 2014).

Research on propaganda and deception overlaps to a degree, just as some propaganda deceives and some deception operations employ propaganda techniques (Jowett and O'Donnell 2011; MacDonald 2007). However, these concepts should not be confused or used synonymously, as different types of propaganda (e.g., white propaganda) are often used in everyday life in the areas of cultural diplomacy, nation-building, and so on.
This type of propaganda is similar to informative communication, as it comes from an identifiable source and tends to be accurate (Jowett and O'Donnell 2011, 31). Yet it still can be used for ideological purposes by news broadcasting services targeting foreign nations, such as the Voice of America (VOA). Corporate propaganda is a gray area, as it often involves symbolic manipulation and perception management which may or may not be deceptive, and therefore should be discussed separately.
Strategic Deception in Corporate Practices

Numerous propaganda stratagems that were originally designed for psychological warfare have migrated into the world of corporate communications and public relations (PR). Edward Bernays, a public relations pioneer, noted that wartime propaganda methods could also be applied to launching peacetime products and services (Bernays 1965). Corporate propaganda is regarded as “communications where the form and content are selected with the single-minded purpose of bringing some target audience to adopt attitudes and beliefs chosen in advance by the sponsors of the communications” (Carey 1997, 21). Corporate public relations and lobbying agencies frequently resort to the means of stealth communication intended to influence public perceptions and protect corporate power. The nature of corporate propaganda implies that “the propagandist may appear to have a clear purpose …, but the true purpose of communication is likely to be concealed” (Jowett and O'Donnell 2011, 45). Although stealth communication (Morris and Goldsworthy 2008, as cited in Jansen 2017) represents approximately 30 percent of activities compared to other forms of publicity and
commercial promotion, it is largely responsible for the shadowy reputation of the PR industry. Despite their manipulative potential, many ingenious techniques, such as framing, are considered normal and accepted when used in an ethical manner (Guth and Marsh 2012). Hence, the process of setting the proper context is essential to ethical framing, as it rules out wrong interpretations.
Spinning and Framing in Corporate Propaganda
According to Jansen (2017), the public relations industry rationalized and justified the techniques of what Edward Bernays once described as “semantic tyranny.” Similar to ideological propaganda, this is demonstrated in attempts at “manipulative publicity” (Habermas 1991) intended to control information flow and shape public perceptions and opinions through news management and spin (Jowett and O'Donnell 2011). Spin is a coordinated strategy to minimize negative information and present in a favorable light news that could be damaging to corporate interests. In public relations and politics, spin is the selective assembly of facts and other nuances to support a particular view of a story. Standard spinning tactics involve reframing, repositioning, or altering the public perception of an issue or a person.

Every story has an angle, or lens, through which the events can be interpreted. Multiple “discourse engineers” advise the public to pay attention to a specific frame. Thus, we often negotiate, manage, and comprehend information through the frames created for us by professional communicators (Goffman 1974). Entman (1993, 52) argues that ‘‘to frame is to select some aspects of a perceived reality and make them more salient in a communicating text, in such a way as to promote a particular problem definition, causal interpretation, moral evaluation, and/or treatment recommendation for the item described.’’ Public relations professionals frequently interfere with individual sense-making processes and “put a painted screen where there should be a window to the world” (Lippmann 1920, 14). In other words, they insert “pseudo-environments” to manipulate the pictures people form in their heads about the world.

Corporations and politicians constantly engage in framing wars to dictate how sensitive issues (e.g., abortion, climate change, etc.) should be framed. For example, bridging language refers to a popular strategy of answering questions with specific terms in order to shift the discourse from an uncomfortable topic to a more comfortable one (Rampton and Stauber 2002). The expression “mistakes were made” is commonly used when acknowledging that a situation was handled inappropriately, with no direct reference to personal responsibility or implied intent. The “if apology” is also popular among corporate CEOs, with lines such as “I apologize if I offended anyone,” which shifts the blame onto the offended party and denies personal acceptance of wrongdoing. In September 2015, actor Matt Damon used an “if apology” after appearing to downplay the importance of diversity in film while judging an HBO reality show. He said, “I am sorry that they offended some people, but, at the very least, I am happy that they started a conversation about diversity in Hollywood.” Another popular framing type is doublespeak, or language that deliberately obscures, disguises, or reverses the meaning of words. Doublespeak may take the form of euphemisms such as “downsizing” for “layoffs,” “servicing the target” for “bombing,” or “exploring for energy” instead of “drilling for oil” (Bai 2005). Misguided framing based on limited information is especially dangerous because it causes “problems with fabricated stories that spread quickly through the grapevine” (Johansson 2009, 124). It may be further exacerbated by numerous rhetorical artifices, which are especially popular in the advertising business.
These involve the use of double entendre language and imagery (Key 1993), open-ended meanings, weasel words, suggestio falsi, empty comparisons, parity claims, appeals to the psyche, paraphernalia, and many others (Johnson and Blair 2006, 227-234). These concepts are beyond the primary scope of this paper and should be attentively addressed in future studies. Yet they are perfectly compatible with the disposition and motives of corporate propaganda.
Spin in Corporate Campaigns
In general terms, spin today is publicly perceived as propaganda: disingenuous and highly manipulative corporate campaigning intended to suggest a biased interpretation of events and sway public opinion (Bodrunova 2010; Miller and Dinan 2007; Safire 1996). Corporate campaigns of science bending (McGarity and Wagner 2012) target science and scientific communities in ideological or economic attacks on research. Lewandowsky et al. (2012) use the term “seepage” to describe the infiltration and influence of non-scientific claims into scientific work and discourse. According to Cook (2015), climate deniers rely on a common set of techniques to dispute the science and attack climate scientists, including
“fake” experts, fallacious arguments and convenient frames, support for global conspiracy theories, cherry-picking of scientific data, and so on. For example, climate change contrarians support a persistent myth that global warming stopped in recent decades by focusing only on a short time period. This ignores the long-term trend as well as the many warming indicators telling us that our planet continues to build up heat.

Another popular form of spin is greenwashing, which refers to the promotion of the perception that corporate products or policies are environmentally friendly even when they are not. Greenwashing efforts range from changing the name or label of a product to massive advertising campaigns portraying highly polluting energy companies as eco-friendly (Karliner 2001). For example, “Clean Coal,” an initiative adopted by several platforms for the 2008 U.S. presidential elections, became known as an example of political greenwashing and the “ultimate climate change oxymoron” (Pearce 2009). At the same time, studies demonstrate that the reputational value of environmental stewardship gives firms incentives to greenwash. For example, negative consumer reaction following the BP oil spill was reduced by pre-spill exposure to BP advertising. Barrage, Chyn, and Hastings (2014, 5) argue that “green advertising plays more of a persuasive role than an informative role, shifting beliefs rather than providing information about and commitment to environmental quality.”

An increasingly popular practice of corporate propaganda rooted in psychological warfare and disinformation campaigns is astroturfing. It refers to fake grassroots campaigns, or a lobbying effort to artificially create the impression of widespread public support for a policy, cause, or product where little or no support in fact exists (Bailey and Samoilenko 2017). A typical pre-internet form of astroturfing was paid-for letter-writing campaigns to convince political representatives that their cause enjoyed greater public support than was in fact the case (Lyon and Maxwell 2004, 563-4). These fake grassroots efforts imitate public advocacy started by individuals or local communities and frequently withhold information about the source’s financial connections. The core type of deception here is identity-based deceit, a false representation of the identity of the author or supporter. Other forms of astroturfing also involve message-based deceit, the delivery of false or misleading information (Zhang, Carpenter, and Ko 2013, 3). Astroturfing involving message-based deceit is often employed to generate positive consumer reviews for one’s product or service, or to generate negative reviews for that of a rival.

Professional “spin doctors” exploit various forms of information control, such as withholding information, controlling the media as a source of information distribution, or even presenting distorted information from a “credible” source. Other spin techniques include “burying” potentially damaging information by releasing it during the graveyard slot, when the television audience is very small. Another popular technique involves spreading disinformation about a whistle-blower to divert public attention from the issue. For example, Jeffrey Wigand revealed, at great cost to his personal life and safety, that cigarette companies had engaged in campaigns to hide from the public that smoking was highly addictive and caused lung cancer.
Cigarette maker Brown and Williamson retaliated with a ruthless smear campaign that publicized exaggerated claims of Wigand being a raging alcoholic, a wife beater, and a pathological liar (Samoilenko 2016).

Despite the utopian notion of the Internet as a genuinely democratic space (Shirky 2008), this environment is highly susceptible to ideological and corporate propaganda. Ironically, the very ‘democracy’ and accessibility of the Internet have made it “the most potent force for the spreading of disinformation yet devised” (Jowett and O'Donnell 2011, 159). Morozov (2011) argues that spin is now the preferred tool of many authoritarian states, producing what he calls the “spinternet.” Therefore, it is critical to address the notion of the Internet as an environment conducive to the spread of deceptive communication.
THE INTERNET AS A TERRITORY OF DECEPTION

Internet anonymity is a double-edged sword. On one hand, in situations of limited press freedom or suppressed public expression, securing the anonymity of the source is a prerequisite for safety, privacy, and open discussion. Anonymity also provides a cover for whistle-blowing and investigative journalism (Kte’pi 2012) and an additional layer of personal freedom “afforded by a lack of attribution” (Davison 2012, 132). The process of creating the virtual self is often seen as an escape from social and psychological constraints (Turkle 2011). On the other hand, Internet anonymity is easily abused through unrestrained, impulsive, or manipulative behaviors such as trolling, harassment, and fraud. Oftentimes, antisocial behavior results from disinhibition, when people act without regard for social norms and consequences (Lapidot-Lefler and Barak 2012).
Anonymity can be conveniently exploited by online users who tend to post negative comments or incorrect information without identification or supporting evidence. It also encourages various forms of deception, ranging from idealized self-presentation on employment-oriented networking sites (e.g., LinkedIn) to massive astroturfing campaigns. As a consequence, the Internet has transformed how we, as a society, decide what constitutes fact (Katz 1998), and “opened the door to misinformed reactions and . . . chaotic behavior’’ (Ayres 1999, 141, as cited in Garrett 2011). Both individual users and online communities take advantage of the current media ecosystem to make profit, manipulate news frames, and propagate ideas (Marwick and Lewis 2017).

One popular form of online deception is the decoy (Reynard 2014). Some decoy election sites pose as fund-raising arms of the major parties to siphon funds from voters. For example, in 2012, two sites, DemocraticNationalCommittee.org and RepublicanNationalCommittee.org, were identified by news sources as fund-raising scams. The owner of these two domain names was a Massachusetts truck driver who operated dozens of similar sites.

The convenience and affordability of media production, editing, and distribution software have led to a large volume of user-generated misinformation. Technology provides the ability to alter images perfectly with little or no chance of detection, especially if the time frame for analysis of the image is short or not enough context is available (MacDonald 2007). For example, during the 2004 U.S. presidential campaign, an image surfaced depicting Democratic presidential nominee John Kerry sharing a speaking platform at a protest rally with Jane Fonda in 1971. It originated with a conservative group and was falsely attributed to the Associated Press. The image was circulated widely on the Internet and in a number of media outlets for several days before it was revealed to be a fabrication. Amateur filmmakers can just as easily distribute and promote their own conspiracy documentaries on YouTube and social media. Manjoo (2008) features the story of Dylan Avery, a young man who in December 2005 released the popular conspiracy film “Loose Change,” challenging the official story of the 9/11 attacks. In a matter of months, the documentary was viewed on the Internet more than 10 million times, by 20,000 people per day (Spies 2014). It is one of several 9/11 “truth” documentaries that now circulate on the Internet.

The Internet environment amplifies the threat of manipulation through hearsay and falsehoods (Garrett 2011). Several internet platforms have become fertile ground for the growth of conspiracy theories. For example, the notorious “Pizzagate” case demonstrates the power of sites like 4chan.org that allow anonymous contributions from users with no registration process. Marwick and Lewis (2017) refer to the preservation of ambiguity (Poe’s Law) as a distinctive feature of trolling developed by 4chan users: “Without a clear indication of the author’s intent, it is difficult or impossible to tell the difference between an expression of sincere extremism and a parody of extremism” (Gelman 2014). Since the Internet favors simplicity over complexity, many “experts” and opinion leaders are always ready to offer simple explanations for complex issues and policies. Various ideologues and conspiracy theorists regularly contribute to media manipulation and get mainstream coverage (Beauchamp 2017; Marwick and Lewis 2017).
Niche media facilitates connections among people, but it also makes them more vulnerable to misinformation that corresponds well with their basic worldview. Online social networks can be “as much if not more segregated as social networks in the physical world” (Jenkins, Ford, and Green 2013, 192), operating as the digital equivalent of “gated communities.” These homogenous communities are primarily composed of like-minded people with similar political beliefs, education levels, and socioeconomic status. The tendency of online users to follow like-minded people leads to the creation of echo chambers and filter bubbles, which exacerbate polarization. “With no conflicting information to counter the falsehoods, the end result is a lack of shared reality, which may be divisive and dangerous to society” (Benkler et al. 2017, as cited in Lazer et al. 2017; see also Fisher 2017).

News sources frequently report on falsehoods and inaccuracies they find to be newsworthy, thus unintentionally giving them more exposure (Gessen 2017). That explains the recent phenomenon of fake news, which refers to “a wide range of disinformation and misinformation circulating online and in the media” (Marwick and Lewis 2017). In their analysis of the rise in far-right online activity, Marwick and Lewis note that it would be less significant if the mainstream media had not amplified its messaging. According to them, the mainstream media was susceptible to manipulation from the far-right press due to a number of factors, including low public trust in media; a proclivity for sensationalism and novelty over newsworthiness; a lack of resources for fact-checking and investigative reporting; and clickbait and corporate consolidation resulting in the replacement of local publications with hegemonic media brands. Holiday (2013) explains how fake news
gets propelled by influential blogs such as Gawker, Business Insider, Buzzfeed, Drudge Report, and many others. These blogs become vehicles from which mass media reporters and our most informed friends discover and borrow the news: “Radio DJs and news anchors once filled their broadcasts with newspaper headlines; today they repeat what they read on blogs - certain blogs more than others. Stories from blogs also filter into real conversations and rumors that spread from person to person through word of mouth. [...] Then it gives birth to the memes that become our cultural references, the budding stars who become our celebrities, the thinkers who become our gurus, and the news that become our news” (13).

Online deception benefits from the fast pace of today’s media ecosystem. Individuals contribute directly to the distribution of fake news by sharing it themselves, both knowingly and unknowingly (Barthel, Mitchell, and Holcomb 2016). Rumors and conspiracy theories coming from anonymous sources are especially dangerous, as the original source is lost or no longer responsible for verifying the proposition. In rumors, an allegation is typically distorted and blown out of proportion as it is passed along from one source to another (Walton 1996, 195). According to Garrett (2011), rumors e-mailed to friends and family members are more likely to be believed and shared with others. The high volume of information sources online leads individuals to rely on heuristics and social cues in order to determine the credibility of information. More information, coupled with decreasing gatekeeping, means that “there are too many unverified claims to evaluate” (Garrett 2011, 256). Scholars raise concerns about “the vulnerability of democratic societies to fake news and the public’s limited ability to contain it” (Lazer et al. 2017). The impact of deceptive communication online is significant, as falsehoods and fabricated stories not only lead to public confusion and political polarization but also contribute to distrust in major institutions.

The Internet is often praised as a free and democratic environment that enables open and frank discussion of key issues that affect society (Castells 2003). At the same time, when power rests in the hands of governments or corporations, these technologies can be harnessed to enhance control of society (Simons and Samoilenko 2017). In 2013, an American public relations company was caught allegedly fluffing up Wikipedia pages for clients, including powerful governments from around the world (Morris 2013). In 2017, Facebook publicly stated that its site had been exploited by governments for the manipulation of public opinion in other countries (Weedon, Nuland, and Stamos 2017). Political campaigns are increasingly threatened by online astroturfing in the form of paid pollsters, trolls, and social bots. These imposters pose as autonomous individuals with the intent of promoting a specific agenda. For example, the Chinese state employed an army of paid online commentators to spread pro-regime propaganda on online forums. They became known as “the fifty-cent army” after the amount they are supposedly paid per post (Bailey and Samoilenko 2017; Han 2015).
In the spring of 2017, Russian social media erupted after Moscow Mayor Sergei Sobyanin unveiled his grand plan to demolish thousands of cheap Soviet-era apartment buildings. Almost simultaneously, dozens of community pages emerged on the social network Vkontakte in support of the mayor and his plans. Roughly 10,000 accounts joined one of these pages in enthusiastic support of Sobyanin’s program, churning out dozens of posts and comments on social media. Armies of online activists and trolls then flooded Russian social media with posts praising Sobyanin and promoting his latest cause. Astroturfers also bought ads on social media that hounded Moscow Internet users for weeks (Kovalyov 2017).

Similarly, corporations design and maintain millions of false accounts to create the illusion of popularity or public support. A 2013 study by Barracuda Labs found that the average consumer of fake Twitter followers purchased 50,000 followers. Social media accounts of celebrities and politicians were reported to have numerous fake followers, identified as such because the accounts had acquired or lost a large number of followers in one day (a simple version of this heuristic is sketched below). Major brands accused of purchasing fake social media support include Pepsi, Louis Vuitton, Sean “Diddy” Combs, and Mercedes Benz (Finn 2013; Kte’pi 2014).
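The one-day follower-swing heuristic mentioned above is straightforward to operationalize. The sketch below is a minimal illustration, not a method drawn from the sources cited in this chapter; the data format, the function name, and the 20 percent threshold are assumptions chosen purely for demonstration.

```python
# Minimal sketch of the one-day follower-swing heuristic.
# The input format and SWING_THRESHOLD are illustrative assumptions,
# not an established detection standard.

SWING_THRESHOLD = 0.20  # flag accounts whose followers change >20% in a day

def flag_suspicious_accounts(daily_counts):
    """daily_counts maps an account handle to a chronological list of
    daily follower totals; returns handles with suspicious one-day swings."""
    flagged = set()
    for handle, counts in daily_counts.items():
        for yesterday, today in zip(counts, counts[1:]):
            if yesterday == 0:
                continue
            change = abs(today - yesterday) / yesterday
            if change > SWING_THRESHOLD:
                flagged.add(handle)
                break
    return flagged

# A sudden overnight jump of purchased followers is flagged,
# while steady organic growth is not.
history = {
    "organic_account": [10000, 10120, 10250, 10400],
    "boosted_account": [10000, 10100, 61000, 60500],
}
print(flag_suspicious_accounts(history))  # {'boosted_account'}
```

In practice, such a rule would serve only as a first-pass filter to be combined with other signals before any account is labeled fake.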
Fake websites range from those promoting a product to those discrediting a person or company. There are sites explicitly designed to deceive people, publishing provably false claims. These sites generally appear like reputable news sources or even impersonate specific outlets. They post clickbait content consisting of sensationalist, partisan stories and frequently misleading headlines designed to increase article views. Fake blogs contain enthusiastic articles about new products that usually do not warrant positive reviews. Some are health related and focus on supplements, procedures, or other health enhancements. These blogs can be a part of corporate astroturfing. In 2006, a fake blog titled “Walmarting Across America” began to chronicle the adventures of a married couple who posted stories about Walmart while travelling across the country in a recreational vehicle. Every post featured Walmart employees who loved their jobs and praised their company. The couple’s trip was supported by Working Families for Walmart, an organization created by the public relations firm Edelman to counter criticism from union-funded groups (Skene 2014).

Indeed, although the Internet has given more people the ability to produce their own frames about complex issues, gatekeepers such as politicians and corporations still play a pivotal role by legitimizing some frames over others. Jenkins, Ford, and Green (2013) support this view by admitting that powerful institutions have a great investment in the institutions and practices of networked culture (163) and create “brand communities” that become vehicles for promoting particular corporate messages. In this vein, Facebook and Google, which are themselves linked to powerful media institutions, often promote elaborate forms of “produsage” that further benefit the development of frames approved and promoted by corporations.
CONCLUSION

The classical Greek philosophers Socrates, Plato, and Aristotle believed that the truth or falsity of a statement is determined by how it relates to the world, whether it accurately describes that world (Hanna and Harrison 2004), and thus corresponds with reality. Many centuries later, on January 22, 2017, Kellyanne Conway, U.S. Counselor to President Donald J. Trump, told Meet the Press host Chuck Todd that White House Press Secretary Sean Spicer had used “alternative facts,” referring to his false statement to the press corps about the attendance numbers at Trump’s presidential inauguration. Todd responded, “Look, alternative facts are not facts. They’re falsehoods.” Two days after the Todd interview, Conway defended Trump’s travel ban by talking about a nonexistent “Bowling Green massacre” and by falsely claiming that President Obama in 2011 had “banned visas for refugees from Iraq for six months” (Hoefer 2017). This ignited a massive viral response on social media, in which the phrase “alternative facts” was discussed in terms of Newspeak, the ideologically engineered language from George Orwell’s dystopian novel 1984. Within days of the interview, sales of the book increased by more than 9,500 percent, making it the number one best-selling book on Amazon.com (de Freytas-Tamura 2017).

This chapter addressed the rise of deceptive communication as prompted by the new mediated reality, particularly social media. The use of strategic deception in communication practices is multifaceted and mainly defined by the contextual characteristics and pragmatic aims of the strategic communicator. Importantly, the study of strategic deceptive communication should not be limited to the moral agency of social actors; it should also be examined in terms of their social skills and competencies. Primarily, it should include a detailed analysis of the functional use of deceptive communication, including ambiguity, equivocation, and impression management. Manipulation and deception stratagems typical of psychological warfare have been conveniently adopted by corporations and used for propaganda purposes, spin, and information control. A detailed overview of information warfare stratagems helps us better understand the roots of many contemporary internet campaigns, including numerous hybrid forms of disinformation, spin, decoy, rumor, and word-of-mouth based campaigns. Most importantly, the Internet environment is highly vulnerable to such stratagems exploited by government and corporate actors due to the nature of the online environment and the polarizing state of the current infosphere.

Finally, there needs to be more comprehensive inquiry aiming to address the complexity of deceptive communication strategies and tactics. Currently, there is a tendency to study strategic deceptive communication only through the lenses of moral justification and transparency. I argue that this approach is unidimensional and sets very narrow empirical boundaries. In addition, it adds very little to our understanding of the mechanics and complexity of strategic deception in the current media environment.

Future research should address numerous issues related to strategic deceptive communication. First, we need to examine strategic deception as a social phenomenon which appears to be inherent to human communication and manifests itself in different forms and different cultures.
It is not limited to lying, but can appear in many shapes and forms, including concealment, fabrication, ambiguity, equivocation, impression management, and many others. In addition, goal-oriented manipulative behavior is typical online, where it is displayed through various presentations of the virtual self. Future studies also need to address the circumstances under which deceptive communication becomes socially acceptable and competent. Specifically, we need to further understand to what extent strategic deception is appropriate in international relations, public diplomacy, and related fields whose scenarios are often intentionally ambiguous. Since deception is determined not solely by human agency but also by contextual conventions and exogenous factors, we should further explore the conditions under which it thrives and becomes prevalent and sustainable. We especially need to address the psychology and motivations of the influencers responsible for framing discourses in niche online communities and for disseminating conspiracy theories. By identifying the highest-ranking opinion leaders and their most popular frames, we can analyze how conversations shift over time and which topics gain the most recognition from online users (a minimal sketch of such an analysis follows this conclusion).

Clearly, we also need to focus more on the receivers of deceptive communication. Since social media changes news consumption preferences, researchers need to further explore the content preferences of digital consumers. For example, scholars should further explore the visual aspect of framing. The Internet is now saturated with fabricated visual distortion, including photoshopped images, memes, and looping video clips (“coubs”) of all kinds. Another important issue is the cognitive and emotional impact of memes, interactive graphics, and YouTube videos on the public’s opinions and behaviors (Powell et al. 2015). Different forms of visual distortion illustrating deceptive content should be examined more closely to better understand how they add another layer of spin to frames about complex issues (e.g., public policies, science, etc.).

Communication scholarship is often evaluated by the extent to which it generates new insights and produces new thinking. Strategic deception provides multiple opportunities for academic scholarship in areas that are relatively new or not yet sufficiently addressed. If we agree that strategic communication has a dark side, then it is time for scholars to pay closer attention to deception as persuasive communication by which social actors seek to manipulate each other.
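The opinion-leader analysis proposed above can be prototyped with standard network-analysis tools. The following is a minimal, hypothetical sketch in Python: it ranks opinion leaders in a sharing network by in-degree centrality and counts how often each frame circulates per week. The sample records, field layout, frame labels, and the use of the networkx library are illustrative assumptions, not data or methods from any study cited here.

```python
# A minimal, hypothetical sketch of the opinion-leader / frame analysis
# discussed above. The records and frame labels are illustrative only.
from collections import Counter, defaultdict
from datetime import date

import networkx as nx  # third-party library: pip install networkx

# Each record: (sharer, account_amplified, frame_label, posting_date)
interactions = [
    ("user_a", "user_b", "alternative facts", date(2017, 1, 22)),
    ("user_c", "user_b", "alternative facts", date(2017, 1, 23)),
    ("user_d", "user_b", "travel ban", date(2017, 1, 24)),
    ("user_a", "user_c", "travel ban", date(2017, 1, 25)),
]

# Build a directed graph: an edge points from the sharer to the amplified account.
graph = nx.DiGraph()
graph.add_edges_from((src, dst) for src, dst, _, _ in interactions)

# In-degree centrality approximates who is most amplified, i.e., the
# highest-ranking opinion leaders in this (toy) network.
leaders = sorted(nx.in_degree_centrality(graph).items(),
                 key=lambda item: item[1], reverse=True)
print("Top opinion leaders:", leaders[:3])

# Count how often each frame circulates per ISO week to see how the
# conversation shifts over time.
frames_by_week = defaultdict(Counter)
for _, _, frame, day in interactions:
    frames_by_week[day.isocalendar()[1]][frame] += 1
for week, counts in sorted(frames_by_week.items()):
    print(f"Week {week}:", counts.most_common())
```

In practice, the interaction records would come from platform data and the frame labels from manual coding or topic modeling; the sketch only illustrates the mechanics of ranking opinion leaders and tracking frame frequency over time.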
REFERENCES

Ajzen, Icek. 1995. “Beliefs.” In The Blackwell Encyclopedia of Social Psychology, edited by A. S. R. Manstead and M. Hewstone, 88-89. Oxford: Blackwell.
Alberts, David S., John J. Garstka, and Frederick Stein. 1999. Network Centric Warfare: Developing and Leveraging Information Superiority. Washington, DC: Command and Control Research Program.
Athos, Anthony G., and Richard Tanner Pascale. 1981. The Art of Japanese Management. New York: Simon and Schuster.
Ayres, Jeffrey M. 1999. “From the Streets to the Internet: The Cyber-Diffusion of Contention.” Annals of the American Academy of Political and Social Science 566(1): 132–143.
Bai, Matt. 2005. “The Framing Wars.” The New York Times Magazine, July 17. Accessed July 30, 2017. http://www.nytimes.com/2005/07/17/magazine/the-framing-wars.html
Bailey, Anna, and Sergei Samoilenko. n.d. “Astroturfing.” In The Global Encyclopaedia of Informality, edited by A. V. Ledeneva and International Board. London, UK: UCL Press. http://in-formality.com/wiki/index.php?title=Astroturfing
Bandura, Albert. 1999. “Moral Disengagement in the Perpetration of Inhumanities.” Personality and Social Psychology Review 3(3): 193–209.
Barrage, Lint, Eric Chyn, and Justine Hastings. 2014. “Advertising as Insurance or Commitment? Evidence from the BP Oil Spill.” Working Paper No. 19838. National Bureau of Economic Research. Accessed July 30, 2017. http://www.nber.org/papers/w19838
Barthel, Michael, Amy Mitchell, and Jesse Holcomb. 2016. “Many Americans Believe Fake News Is Sowing Confusion.” Pew Research Center. Accessed July 30, 2017. http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/
Bavelas, Janet B., A. Black, N. Chovil, and J. Mullett. 1990. Equivocal Communication. Newbury Park, CA: Sage.
Beauchamp, Zack. 2017. “Democrats are Falling for Fake News about Russia.” Vox, May 19. Accessed July 30, 2017. https://www.vox.com/world/2017/5/19/15561842/trump-russia-louise-mensch
Benkler, Yochai, Robert Faris, Hal Roberts, and Ethan Zuckerman. 2017. “Study: Breitbart-led Right-Wing Media Ecosystem Altered Broader Media Agenda.” Columbia Journalism Review, March 3. Accessed July 30, 2017. http://www.cjr.org/analysis/breitbart-media-trump-harvard-study.php
Bernays, Edward L. 1965. Biography of an Idea: Memoirs of Public Relations Counsel Edward L. Bernays. New York: Simon and Schuster.
Biagi, Shirley. 2014. “Disinformation.” In Media/Impact: An Introduction to Mass Media. Boston, MA: Cengage Learning.
Blair, Anthony J., and Ralph H. Johnson. 2006. Logical Self-Defense. New York: International Debate Education Association.
Bodrunova, Svetlana. 2010. Sovremennye Strategii Britanskoi Politicheskoi Kommunikacii [Contemporary Strategies of British Political Communication]. Moscow: Tovarishestvo Nauchnyh Izdani KMK.
Bogart, Leo. 1995. Cool Words, Cold War. Washington, DC: The American University Press.
Bok, Sissela. 1983. Secrets: On the Ethics of Concealment and Revelation. New York: Pantheon Books.
Bonfadelli, Heinz. 2002. “The Internet and Knowledge Gaps: A Theoretical and Empirical Investigation.” European Journal of Communication 17(1): 65-84.
Botan, Carl. 2017. Strategic Communication Theory and Practice: The Cocreational Model. Hoboken, NJ: Wiley-Blackwell.
Bourdieu, Pierre. 1990. The Logic of Practice. Cambridge, UK: Polity.
Buller, David B., Judee K. Burgoon, A. Buslig, and J. Roiger. 1996. “Testing Interpersonal Deception Theory: The Language of Interpersonal Deception.” Communication Theory 6: 268-289.
Buller, David B., and Judee K. Burgoon. 1994. “Deception: Strategic and Nonstrategic Communication.” In Strategic Interpersonal Communication, edited by J. A. Daly and J. M. Wiemann, 191-223. Hillsdale, NJ: Erlbaum.
Buller, David B., and Judee K. Burgoon. 1996. “Interpersonal Deception Theory.” Communication Theory 6: 203-242.
Burgoon, Judee K. 2009. “Interpersonal Deception Theory.” In Encyclopedia of Communication Theory, edited by Stephen W. Littlejohn and Karen A. Foss, 551-553. Thousand Oaks, CA: Sage.
Burgoon, Judee K., and Aaron E. Bacue. 2003. “Nonverbal Communication Skills.” In Handbook of Communication and Social Interaction Skills, edited by J. O. Greene and B. R. Burleson, 179–219. Mahwah, NJ: Erlbaum.
Carey, Alex. 1996. Taking the Risk Out of Democracy: Corporate Propaganda versus Freedom and Liberty. Champaign: University of Illinois Press.
Castells, Manuel. 2003. The Internet Galaxy: Reflections on the Internet, Business, and Society. New York: Oxford University Press.
Child, Ben. 2015. “Matt Damon Apologises for Diversity in Film Gaffe as #damonsplaining Trends.” The Guardian, September 17. Accessed July 30, 2017. https://www.theguardian.com/film/2015/sep/17/matt-damon-damonsplaining-apology-diversity-race
The Colbert Report. 2005. “Videos: The Word (Truthiness).” October 17. Accessed July 30, 2017. http://www.cc.com/shows/the-colbert-report
Cook, John. 2015. “The 5 Telltale Techniques of Climate Change Denial.” CNN, July 22. Accessed July 30, 2017. http://www.cnn.com/2015/07/22/opinions/cook-techniques-climate-change-denial/index.html
Coombs, W. Timothy, and Sherry J. Holladay. 2006. “Halo or Reputational Capital: Reputation and Crisis Management.” Journal of Communication Management 10(2): 123-137.
Cupach, William R., and Sandra Metts. 1994. Facework. Thousand Oaks, CA: Sage.
Davison, Patrick. 2012. “The Language of Internet Memes.” In The Social Media Reader, edited by Michael Mandiberg, 120-137. New York: New York University Press.
de Freytas-Tamura, Kimiko. 2017. “George Orwell's '1984' Is Suddenly a Best-Seller.” The New York Times, January 25. Accessed July 30, 2017. https://www.nytimes.com/2017/01/25/books/1984-george-orwell-donald-trump.html
Donovan, Pamela. 2007. “How Idle is Idle Talk? One Hundred Years of Rumor Research.” Diogenes 54(1): 59–82.
Dunbar, Norah E. 2009. “Deception Detection.” In Encyclopedia of Communication Theory, edited by Stephen W. Littlejohn and Karen A. Foss, 291-292. Thousand Oaks, CA: Sage.
Eisenberg, Eric M. 1984. “Ambiguity as Strategy in Organizational Communication.” Communication Monographs 51: 227-242.
Ekman, Paul. 2009. Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. New York: W. W. Norton and Company.
Entman, Robert M. 1993. “Framing: Toward Clarification of a Fractured Paradigm.” Journal of Communication 43(4): 51–58.
Epstein, Edward J. 1989. Deception: The Invisible War between the KGB and the CIA. New York: Simon and Schuster.
Finn, Greg. 2013. “Purchasing Popularity: Fake Followers and Accounts Still Plague Social Networks.” Marketing Land, April 26. Accessed July 30, 2017. http://marketingland.com/purchasing-popularity-fake-followers-accounts-still-plague-social-networks-41537
Fischer, Sara. 2017. “Americans Trust Their Friends, not Media or Government.” Axios, April 7. Accessed July 30, 2017. https://www.axios.com/americans-dont-know-what-to-believe-anymore-2348143613.html
Fonseca, Jaime R. S. 2014. “Audience Fragmentation/Segmentation.” In Encyclopedia of Social Media and Politics, edited by Kerric Harvey, 90-93. Thousand Oaks, CA: Sage.
Foucault, Michel. 1997. “The Ethics of the Concern for Self as a Practice of Freedom.” In Ethics, Subjectivity and Truth: Essential Works of Foucault, Vol. 1, edited by P. Rabinow, 281-301. New York: New Press.
Garrett, Kelly. 2011. “Troubling Consequences of Online Political Rumoring.” Human Communication Research 37: 255–274.
Gelman, Andrew. 2014. “Poe’s Law in Action.” The Washington Post, February 4. Accessed July 30, 2017. https://www.washingtonpost.com/news/monkey-cage/wp/2014/02/04/poes-law-in-action/
Gessen, Masha. 2017. “Verify Everything, Don’t Publish Rumors.” The New York Times, January 14.
Accessed July 30, 2017. https://www.nytimes.com/2017/01/14/opinion/sunday/lessons-from-russia-verify-everything-dont-publish-rumors.html
Godson, Roy, and James J. Wirtz. 2000. “Strategic Denial and Deception.” International Journal of Intelligence and Counterintelligence 13: 424-437.
Goffman, Erving. 1959. The Presentation of Self in Everyday Life. New York: Doubleday.
Goffman, Erving. 1974. Frame Analysis: An Essay on the Organization of Experience. New York: Harper and Row.
Golish, Tamara, and John Caughlin. 2002. “‘I’d Rather Not Talk about It’: Adolescents’ and Young Adults’ Use of Topic Avoidance in Stepfamilies.” Journal of Applied Communication Research 30(1): 78-106.
Grice, Paul H. 1975. “Logic and Conversation.” In Syntax and Semantics, Vol. 3, edited by P. Cole and J. Morgan, 41–58. New York: Academic Press.
Guth, David W., and Charles Marsh. 2011. Public Relations: A Value Driven Approach. New York: Pearson.
Habermas, Jürgen. 1991. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society. Cambridge, MA: MIT Press.
Hallion, Richard P. 1992. Storm over Iraq. Washington, DC: Smithsonian Institution Press.
Han, Rongbin. 2015. “Manufacturing Consent in Cyberspace: China’s ‘Fifty-Cent Army’.” Journal of Current Chinese Affairs 44(2): 105-134.
Hanna, Patricia, and Bernard Harrison. 2004. Word and World: Practices and the Foundation of Language. Cambridge: Cambridge University Press.
Hoefer, Hayley. 2017. “Kellyanne Conway’s 'Alternative Facts'.” U.S. News & World Report, February 3. https://www.usnews.com/opinion/views-you-can-use/articles/2017-02-03/kellyanne-conway-gets-facts-wrong-on-bowling-green-massacre
Holiday, Ryan. 2012. Trust Me, I’m Lying. New York: Portfolio/Penguin.
Hopper, Robert, and Robert Bell. 1984. “Broadening the Deception Construct.” Quarterly Journal of Speech 70(3): 288-302.
Huckfeldt, Robert, Paul Allen Beck, Russell J. Dalton, Jeffrey Levine, and William Morgan. 1998. “Ambiguity, Distorted Messages, and Nested Environmental Effects on Political Communication.” The Journal of Politics 60(4): 996-1030.
Jansen, Sue Curry. 2017. Stealth Communications. Cambridge, UK: Polity Press.
Jenkins, Henry, Sam Ford, and Joshua Green. 2013. Spreadable Media: Creating Value and Meaning in a Networked Culture. New York: New York University Press.
Johansson, Catrin. 2009. “On Goffman: Researching Relations with Erving Goffman as Pathfinder.” In Public Relations and Social Theory: Key Figures and Concepts, edited by Ø. Ihlen, B. van Ruler, and M. Fredriksson, 71–91. New York: Routledge.
Johnson, Paul. 2014. “Is Vladimir Putin another Adolf Hitler?” Forbes, April 16. Accessed July 30, 2017. https://www.forbes.com/sites/currentevents/2014/04/16/is-vladimir-putin-another-adolf-hitler/#413b41c9237a
Jones, Edward E. 1964. Ingratiation: A Social Psychological Analysis. New York: Meredith.
Karliner, Joshua. 2001. “CorpWatch: A Brief History of Greenwash.” CorpWatch, March 21. Accessed July 30, 2017. http://www.corpwatch.org/article.php?id=243
Katz, James E. 1998. “Struggle in Cyberspace: Fact and Friction in the World Wide Web.” Annals of the American Academy of Political and Social Science 560(1): 194–199.
Keller, Bill. 2012. “Mitt and Bibi: Diplomacy as Demolition Derby.” New York Times, September 12. Accessed July 30, 2017. https://keller.blogs.nytimes.com/2012/09/12/mitt-and-bibi-diplomacy-as-demolition-derby/?hp
Key, Wilson B. 1993. The Age of Manipulation: The Con in Confidence, the Sin in Sincere. Lanham, MD: Madison Books.
Knapp, Mark L., and Mark A. Comadena. 1979.
“Telling It Like It Isn’t: A Review of Theory and Research on Deceptive Communications.” Human Communication Research 5: 270–285.
Kovalev, Alexey. 2017. “The City of Moscow Has Its Own Propaganda Empire.” The Moscow Times, May 16. Accessed July 30, 2017. https://themoscowtimes.com/articles/the-city-of-moscow-has-its-own-propaganda-empire-58005
Kraut, Robert. 1980. “Humans as Lie Detectors: Some Second Thoughts.” Journal of Communication 30: 209-216.
Kte’pi, Bill. 2014. “Deception in Political Social Media.” In Encyclopedia of Social Media and Politics, edited by Kerric Harvey, 356-358. Thousand Oaks, CA: Sage.
Lapidot-Lefler, Noam, and Azy Barak. 2012. “Effects of Anonymity, Invisibility, and Lack of Eye-Contact on Toxic Online Disinhibition.” Computers in Human Behavior 28(2): 434–443.
Lasswell, Harold D. 1927. Propaganda Technique in World War I. Cambridge, MA: MIT Press.
Lasswell, Harold D. 1951. Political Writings of Harold Lasswell. Philadelphia: University of Pennsylvania.
Lazer, David, Matthew Baum, Nir Grinberg, Lisa Friedland, Kenneth Joseph, Will Hobbs, and Carolina Mattsson. 2017. “Combating Fake News: An Agenda for Research and Action.” Conference report. https://shorensteincenter.org/combating-fake-news-agenda-for-research/
Leary, Mark, and Robin M. Kowalski. 1990. “Impression Management: A Literature Review and Two-Component Model.” Psychological Bulletin 107: 34-47.
Lee, Jae Kook. 2009. Incidental Exposure to News: Limiting Fragmentation in the New Media Environment. Austin: University of Texas at Austin. https://repositories.lib.utexas.edu/handle/2152/6686
Lewandowsky, Stephan, Naomi Oreskes, James S. Risbey, Ben R. Newell, and Michael Smithson. 2015. “Seepage: Climate Change Denial and Its Effect on the Scientific Community.” Global Environmental Change 33: 1-13.
Libicki, Martin C. 1995. What Is Information Warfare? Washington, DC: National Defense University.
Lippmann, Walter. 1920. Liberty and the News. New Brunswick, NJ: Transaction Publishers.
Macdonald, Scot. 2007. Propaganda and Information Warfare in the Twenty-First Century: Altered Images and Deception Operations. New York: Routledge.
Mackey, Robert. 2015. “Sifting Ukrainian Fact from Ukrainian Fiction.” New York Times, February 13. Accessed July 30, 2017. https://www.nytimes.com/2015/02/14/world/europe/sifting-ukrainian-fact-from-ukrainian-fiction.html
Manjoo, Farhad. 2008. True Enough: Learning to Live in a Post-Fact Society. Hoboken, NJ: Wiley.
Manning, Martin, and Herbert Romerstein. 2004. Historical Dictionary of American Propaganda. Greenwood Press.
Marwick, Alice, and Rebecca Lewis. 2017. Media Manipulation and Disinformation Online. New York: Data and Society Research Institute. Accessed July 30, 2017. https://datasociety.net/output/media-manipulation-and-disinfo-online/
Masip, Jaume, Eugenio Garrido, and Carmen Herrero. 2004. “Defining Deception.” Anales de Psicología 20(1): 147-171.
McCornack, Steve A. 1992. “Information Manipulation Theory.” Communication Monographs 59: 203–242.
McCornack, Steve A., and T. R. Levine. 1990. “When Lies are Discovered: Emotional and Relational Outcomes of Discovered Deception.” Communication Monographs 57: 119–138.
McGarity, Thomas O., and Wendy E. Wagner. 2012. Bending Science: How Special Interests Corrupt Public Health Research. Cambridge, MA: Harvard University Press.
McManus, Tara J., and Jon Nussbaum. 2011. “Ambiguous Divorce-Related Communication, Relational Closeness, Relational Satisfaction, and Communication Satisfaction.” Western Journal of Communication 75(5): 500-522.
Metts, Sandra. 2009. “Impression Management.” In Encyclopedia of Communication Theory, edited by Stephen W.
Littlejohn and Karen A. Foss, 506-509. Thousand Oaks, CA: Sage.
Miller, David, and William Dinan. 2007. Thinker, Faker, Spinner, Spy: Corporate PR and the Assault on Democracy. London: Pluto Press.
Miller, G. R. 1983. “Telling It Like It Isn't and Not Telling It Like It Is: Some Thoughts on Deceptive Communication.” In The Jensen Lectures, edited by J. I. Sisco, 91-116. Tampa, FL: University of South Florida.
Morozov, Evgeny. 2011. The Net Delusion: The Dark Side of Internet Freedom. New York: PublicAffairs.
Morris, Kevin. 2013. “PR Firm Accused of Editing Wikipedia for Government Clients.” The Daily Dot, March 8. Accessed July 30, 2017. https://www.dailydot.com/news/qorvis-lauer-wikipedia-paid-editing-scandal/
Morris, Trevor, and Simon Goldsworthy. 2008. PR – A Persuasive Industry? Spin, Public Relations and the Shaping of the Modern Media. London: Palgrave.
Pacepa, Ion Mihai, and Ronald J. Rychlak. 2013. Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism. WND Books.
Pearce, Fred. 2009. “Greenwash: Why ‘Clean Coal’ is the Ultimate Climate-Change Oxymoron.” The Guardian, February 26. Accessed July 30, 2017. https://www.theguardian.com/environment/2009/feb/26/greenwash-clean-coal
Powell, Thomas E., Hajo G. Boomgaarden, Knut De Swert, and Claes H. de Vreese. 2015. “A Clearer Picture: The Contribution of Visuals and Text to Framing Effects.” Journal of Communication 65(6): 997-1017.
Rampton, Sheldon, and John Stauber. 2002. Trust Us, We're Experts! New York, NY: Putnam Publishing.
Reynard, Leslie. 2014. “Decoy Campaign Web Sites.” In Encyclopedia of Social Media and Politics, edited by Kerric Harvey, 358-360. Thousand Oaks, CA: Sage.
Safire, William. 1996. “The Spinner Spun.” The New York Times Magazine, December 22. Accessed July 30, 2017. http://www.nytimes.com/1996/12/22/magazine/the-spinner-spun.html
Samoilenko, Sergei. 2014. “Campaigns, Grassroots.” In Encyclopedia of Social Media and Politics, edited by Kerric Harvey, 189-193. Thousand Oaks, CA: Sage.
Samoilenko, Sergei A. 2018. “Subversion: From Coercion to Attraction.” In Experiencing Public Relations: International Voices, edited by Elizabeth Bridgen and Dejan Verčič, 174-193. Oxon: Routledge.
Schmitt, Carl. 2008. Constitutional Theory. Duke University Press.
Seibold, David, James G. Cantrill, and Renee A. Meyers. 1985. “Communication and Interpersonal Influence.” In Handbook of Interpersonal Communication, edited by M. L. Knapp and G. R. Miller, 551-614. Beverly Hills, CA: Sage.
Shirky, Clay. 2008. Here Comes Everybody: The Power of Organizing Without Organizations. New York: Penguin Press.
Siddiqui, Faiz, and Susan Svrluga. 2016. “N.C. Man Told Police He Went to D.C. Pizzeria with Gun to Investigate Conspiracy Theory.” The Washington Post, December 5. Accessed July 30, 2017. https://www.washingtonpost.com/news/local/wp/2016/12/04/d-c-police-respond-to-report-of-a-man-with-a-gun-at-comet-ping-pong-restaurant/
Simons, Greg, and Sergei A. Samoilenko. 2017. “The Effects of Social Media in the Context of Public Sphere Insularity in Russia.” In Civil Society and Democracy in the Age of Social Media, edited by Fatima Roumate and Amaro La Ros, 189-223. Marrakech, Morocco: Institut International de la Recherche Scientifique.
Skene, Kiley. 2014. “A PR Case Study: Wal-Marting Across America.” News Generation, April 4. http://www.newsgeneration.com/2014/04/04/pr-case-study-walmarting-across-america/
Sober, Elliott. 1991. Core Questions in Philosophy: A Text with Readings. New York: Macmillan.
Spies, Mike. 2014. “The Rapid Rise and Fall of Dylan Avery.” Vocativ, April 25. Accessed July 30, 2017. http://www.vocativ.com/usa/us-politics/rapid-rise-fall-dylan-avery/
Sun Tzu. 1963. The Art of War. Oxford: Oxford University Press.
Treadwell, D. F., and Teresa M. Harrison. 1994. “Conceptualizing and Assessing Organizational Image: Model Images, Commitment, and Communication.” Communication Monographs 61: 63-85.
Turkle, Sherry. 2011. Life on the Screen. New York: Simon and Schuster.
The U.S. Air Force. 2005. “Information Operations: Air Force Doctrine Document 2-5.” Accessed July 30, 2017. http://www.dtic.mil/dtic/tr/fulltext/u2/b311353.pdf
Versteeg, Mila. 2015. “The Politics of Takings Clauses.” Northwestern University Law Review 109(3). Accessed July 30, 2017. http://scholarlycommons.law.northwestern.edu/nulr/vol109/iss3/5
Walton, Douglas. 1996. Fallacies Arising from Ambiguity. Dordrecht, The Netherlands: Kluwer Academic Publishers.
Weedon, Jen, William Nuland, and Alex Stamos. 2017. “Information Operations and Facebook.” Facebook. Accessed July 30, 2017. https://fbnewsroomus.files.wordpress.com/2017/04/facebook-and-information-operations-v1.pdf
Williams, M. Lee, and Blaine Goss. 1975. “Equivocation: Character Insurance.” Human Communication Research 1: 265–270.
Wolk, Robert L., and Arthur Henley. 1970. The Right to Lie. New York: Peter Wyden.
Yoder, E. M. 1983. “Foreign Policy Needs Ambiguity.” Philadelphia Inquirer, August 2.
Young, Marilyn J., and Michael K. Launer. 1988. Flights of Fancy, Flight of Doom: KAL 007 and Soviet-American Rhetoric. Lanham, MD: University Press of America.
Zhang, Jerry, Darrell Carpenter, and Myung Ko. 2013. “Online Astroturfing: A Theoretical Perspective.” Proceedings of the Nineteenth Americas Conference on Information Systems, Chicago, Illinois. Accessed July 30, 2017. http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1620&context=amcis2013