Computers and Composition 28 (2011) 14–27
Using I, Robot in the Technical Writing Classroom: Developing a Critical Technological Awareness

Aaron A. Toscano ∗

280F Fretwell, Department of English, University of North Carolina at Charlotte, 9201 University City Blvd, Charlotte, NC 28223-0001

Received 6 May 2008; received in revised form 26 October 2010; accepted 19 December 2010
Abstract

This article calls for technical writing courses to be more engaged in discussions on critical technological awareness. Being critically aware of technology means looking beyond a socially constructed artifact’s assumed practical benefit and critiquing its effects and development. All discourse surrounding technology should be the purview of the field of technical writing. Because much technical writing pedagogy ignores cultural issues related to technology, this article promotes student engagement in discussions about social constructions of technology to foster critical thinking. This article concludes with a discussion of student responses to an essay assignment based on Isaac Asimov’s novel I, Robot. Asimov envisions a high-tech world where technologies of the 1940s are amplified and new ones imagined to create stories where humans must interact with not-so-perfect robots. The novel offers a chance for students to reflect on how contemporary technologies, such as computers, are enmeshed into the social fabric of twenty-first-century life. Additionally, I, Robot generates classroom discussions that bolster student engagement and highlight the impact of contemporary (and future) technologies on workplace practices and the culture at large. Some topics that I, Robot addresses are issues of gender in technological fields, military beginnings for consumer technologies, and labor issues.

© 2010 Elsevier Inc. All rights reserved.

Keywords: Isaac Asimov; Critical Technological Awareness; Critical Thinking; Literature; Pedagogy; Positivism; Science Fiction; Technical Writing; Technology
1. Introduction

Technical writing or technical communication should focus on all discourse surrounding technology. The ways in which we engage with, absorb information about, and communicate through technology mean that a discipline focused on communicating technical information has a stake in analyzing the critical, rhetorical, and pedagogical value of technological discourse. Claiming that technical writing encompasses all discourse surrounding technology will probably surprise many readers. A typical view of the technical writing course is that it trains engineering, science, and information technology majors to communicate effectively. Another typical view is that assignments focus solely, or perhaps mainly, on instructions, procedures, and reports—the instrumental forms for conveying technical information. This article aims not to reconfigure the entire technical writing course, but to demonstrate how students can engage in critical technological awareness in technical writing courses that want to stress critical cultural analysis.
∗ Corresponding author. Tel.: +1 704 687 6562; fax: +1 704 687 3961. E-mail address: [email protected]
By expanding the definition of technical writing beyond simply communicating technical information, I hope to encourage students to engage in discussions that promote a critical understanding of technology. To embrace such a pedagogy, we need a paradigm shift away from creating products—final reports and such that showcase a student’s semester-long “acquisition” of effective technical writing strategies. Professional writing textbooks often present technical communication formats as universal formulae to be applied across industries. However, the documents and contexts that students will encounter on the job will be varied and even unknown to many students because career paths can change. Instead of formats and attempts to codify formulae, we ought to privilege rhetorical situations for both general and specific professional communication contexts. Analyzing writing about technology, whether it be specifications or science fiction narratives, offers teachers the chance to have students focus on ideologies surrounding textual and, therefore, material production. Such practices ask students to explore audience and purpose from a culturally aware position. The professional environments to which students aspire (or in which they participate) are politically constructed communities—firms, organizations, government agencies—that embody the ideologies of an economic system driven by producing or sustaining technology. In order to effect change, however, more teachers and students must adopt the paradigm that ideological discussions are not tangential to technical writing pedagogy (or other technology education), but germane to systemic analyses of technologies. After all, technologies reflect the cultures from which they come.

William E. Rivers (1994) suggested that technical writing teachers ought to incorporate “literature and literary criticism” into their pedagogy (p. 45). The field of technical writing has yet to consider seriously a broader pedagogy in light of Rivers’ call for looking toward non-technical discourse. This article is an attempt to consider what literature might have to offer technical writing, a field often assumed to be diametrically opposed to traditional assumptions about literature.

2. Critical Thinking in the Technical Writing Classroom

My goal for using Isaac Asimov’s novel I, Robot is to move students toward being aware that there are sub(altern) discourses about technologies—and not just computers. These discourses do not consider technology inherently progressive or essentially good. The novel’s setup fits in well with technical writing courses because, although it is 60 years old, it analyzes high-tech culture. Readers understand it as fiction, but they can be moved to read the novel’s subtext, which illuminates the ideologies carried out through the dominant culture’s beliefs and discourses about technology. When we adhere to strictly instrumentalist activities and lessons without critical analysis of the technical communicator-audience relationship, we may miss a major opportunity to engage our student population, a group that most likely—coming from engineering and sciences—has an uncritical view of technology: The mantra “bigger/smaller, faster, stronger, better” is not always accurate. In fact, it promotes a modernist paradigm for technical writing.
Such a paradigm privileges the technologies over the communicators, much like system-centered design suggests that “the documentation is written to reflect the image of the system designer” (Johnson, 1998/2003, p. 295). Incorporating I, Robot into the technical writing classroom advances a postmodern pedagogy that privileges student agency and asks students to be critical of their own career goals when exploring the wider cultural forces that may shape their decisions and the products that their cultures produce. Whether students become engineers, scientists, IT professionals, technical writers, or general communication specialists, we cannot deny that current and future students must contend with rapidly changing technologies on the job and in the larger culture. Providing students with avenues for understanding their agency in the workplace appears to be a desirable goal. Tony Scott (2006) observed that “technical writing operates within a murky, troubled border between a largely intellectual and politically progressive humanism that is concerned with ideology, ethics and social responsibility, and the more narrowly profit-driven marketplace” (p. 229). He goes on to argue that

    broad social and political critique is especially problematic in professional and technical writing programs where students expect to learn the kinds of skills that can most directly help them get and keep jobs in a competitive marketplace that tends not to value intellectual inquiry. (p. 229)

Assuming Scott’s analysis is correct, the dominant technical writing pedagogy of American colleges and universities aims for technological literacy—meaning the use of tools. Scott also argues that conditioning technical writing students to adjust to the concerns of the workplace is “the dominant scholarly metanarrative in the field” (p. 235) and that the problem of a “pedagogy that seek[s] to create a technical writer who is most advantaged in the new economy” is that it conditions students ultimately to conform to a coercive, non-socially conscious post-industrial career (pp. 238-239).
In other words, we must reflect critically on pedagogies that espouse agency for real change and not simply for serving the whims of the marketplace.

Readers of this journal no doubt understand the importance of the intellectual and social value of developing critical, culturally informed understandings of technologies. My approach is part of the scholarship that calls for students to have critical computer literacy. In her study of student technology narratives related to computers, Barbara Blakely Duffelmeyer (2000) finds that students can be uncritically accepting of dominant values regarding technology. Duffelmeyer wants students to be more critically literate but notes that some “students are clearly aware of society’s influence and don’t see it as necessarily natural, but as nonetheless unavoidable and inevitable” (p. 299). I hope to move students to consider not just computers but all technologies as texts for uncovering cultural values. The computer is a fleeting technology of our time—consider computers in relation to houses, automobiles, or even clothing—but an enduring symbolic reference of contemporary technology. Composition scholars often equate technology with computers and vice versa. However, technology is more than silicon-based artifacts, and recognizing that is the beginning of critical cultural awareness. Before students can acquire critical technological literacy, they must understand that technologies are not merely tools but products of the culture that reflect social values.

Stuart Selber’s (2004) thorough, large-scale literacy model asks students to engage in functional, critical, and rhetorical discussions about computers. His overall goal is a reimagining of computer literacy from a humanist perspective in order, ultimately, to challenge the view that technological literacy concerns are not the purview of the Humanities in general or English specifically (p. 235). The overall focus of his work is on computers used in the classroom. One does not have to teach in a computer lab to understand why the computer is a main focus for critical technological literacy. Computers are seemingly ubiquitous, and they are an important technology to analyze because of the cultural narratives that hold them up as the pinnacle of high-tech society. If we want to change technology to be more of a humanistic endeavor, we have to begin to challenge the unconscious apotheosis given to technology. Selber’s goal is to effect change in the classroom, department, institution, and industry with his approach to reimagining computer literacy. I present I, Robot as a way to generate the discussions Selber envisions “to consider . . . the conditions that construct perceptions of computers and their boundaries” (p. 95). Before pursuing technological change, which I agree with Selber cannot happen in a semester (p. 212), we must help students become aware that their attitudes about technology are socially constructed. To address such a task completely requires more space than an article provides. This article, however, attempts to demonstrate a component of a class that advocates critical technological awareness.

3. Critical Technological Awareness Pedagogy

I define critical technological awareness as understanding how technology fits into our lives beyond merely using tools.
This awareness, which initiates a student’s acquisition of critical technological literacy, requires a social analysis of the impacts and demands of technologies and a personal exploration regarding one’s uses and desires for technologies. In other words, having a critical technological awareness does not mean knowing how to use technology, but rather knowing how technology fits into our lives and carries ideologies that our culture embraces. By focusing on critical technological awareness, a concept similar to a critical technological literacy approach (Gurak, 2001; Kastman Breuch, 2002; Selfe, 1999), we remove the awe of new technologies’ functionalities (or bells and whistles) as the primary focus of technology studies.

The approach I advocate for technology-oriented classes, which are not limited to technical writing, asks students to be aware of the fact that technology is not neutral. However, teachers might find it difficult to get students to think critically about common technologies like computers. When we discuss common technologies, I often ask students about clothing, and, because clothing is not electronic and functions as a second skin, they often do not consider it a technology. Similarly, students identify computers as just tools that do not need any reflection. The more integrated a technology is into one’s life, the harder it is to perceive that artifact as a technology. A topic even more difficult for students to consider is that technologies such as computers are political. Selber (2004) understands the political nature of technology when he explains that “a technology must be located in an environment that is sympathetic to its politics” (p. 101), and computers (as well as other silicon-based products) are located in quite sympathetic political situations. Planned obsolescence, which drives consumer technology development, dominates computer narratives covering both hardware and software. Like Selber, I borrow from Langdon Winner’s (1986) discussion on the politics of technology to situate technology as a non-neutral artifact: “[T]he designs of technologies have served purposes of domination” (p. 89).
There is an obvious corporate interest in getting consumers to purchase the “latest and greatest” versions of their products: profit. Making students aware of the perpetual lifecycle of updates and updated computer products demonstrates that we do not always buy new computers because they break but because we think we need something new. This topic has parallels with fashion, mobile devices, and other short shelf-life products.

Using I, Robot in a course promoting discussions about critical technological awareness minimizes instruction leading to technological skills acquisition by privileging a cultural, humanistic understanding of technology. The novel’s robots are not real, so readers do not assume that there is an instrumentalist purpose for the text. The implications for I, Robot in technical writing classes are exciting. Although an exegesis of the novel is beyond the scope of the technical writing class, Asimov’s stories offer ample opportunities to encourage discussions about the role of technology in society. Additionally, relating science fiction technology to contemporary technologies such as e-mail, faxes, mobile phones, etc. facilitates discussion about how technology mediates communication. The ways in which an instructor incorporates discussions of the novel will vary, but it is important to weave the novel into the class goals as opposed to simply having it act as an “escape” from technical writing study. Without adopting a critical technological awareness paradigm, students would consider the text as separate from course learning goals and might have the tendency to view it as a temporary distraction and not as an assignment germane to traditional technical writing objectives.

Early in the semester, I begin having students discuss their attitudes about technologies in order to introduce the social construction of technology theories. I give the I, Robot essay assignment that I describe below about midway through the semester. I find the essay and discussions surrounding the novel help boost critical thinking by making students develop abstract ideas about abstract technologies—humanoid robots—which leads students to consider the invisible social forces that push for technological advances in their own culture. Although I do not use I, Robot in all of the technical writing courses I teach, I always use it in my department’s “Introduction to Technical Communication” course and in upper-level theoretically-oriented courses. The next section discusses the particulars of the graduate course from which I collected student data, but I will first present the approach that I take in the more general introductory course.

Besides foregrounding the class with ideas about the social construction of technology, students in the courses I teach create various technical documents—reports, procedures, résumés, sets of instructions, etc.—but I ask them to explore how those documents communicate on a cultural level just as I ask them to consider how culture mediates technologies. For instance, one typical assignment, the procedure, asks students to create a sequence of steps or an explanation of something technical in order to communicate technical information to a lay audience. Why do we need such a document? What purpose does such a document serve? What does such a document say about the culture overall?
Although there are no exact answers, the procedure reflects our consumerist culture (among other features) because we desire to consume technologies, but, unfortunately, they are not necessarily intuitive products. A product’s affordances do not become affordances until the public understands their use. We know that we can turn a doorknob because that is ingrained in our collective consciousness; we use doorknobs many times a day throughout our lives. In fact, nearly 30 years ago in his history of technical writing instruction, Robert J. Connors (1982/2004) noted that in addition to the “military influence of [World War II],” technical writing “became a popular skill to learn” because of “the increasing number of technically-based consumer products America was turning out” (p. 89). Discussions of technologies do not have to be profound in order to demonstrate that there are overarching cultural values that technical documents serve. Besides providing students with a background in genre theory, explaining the cultural work that technical documents do is germane to a critical technological awareness pedagogy. I, Robot helps students identify narratives of the cultural work that technologies do by comparing the experiences that Asimov’s characters have with their own experiences in order to understand the ways in which technologies adhere to and are mediated by culture.

From a strict social-constructivist position, technology comes to be because society “demands” it. Also, this position claims that no technology gets produced without adhering to social values. In contrast, a technological deterministic position claims that technologies get produced and then cause social changes. Winner (1986) describes the view as “the idea that technological innovation is the basic cause of changes in society and that human beings have little choice other than to sit back and watch this ineluctable process unfold” (pp. 9-10). Andrew Feenberg (2002) defines technological determinism similarly in his critique of modernization by noting that technological determinism claims “technology is an invariant element that, once introduced, bends the recipient social system to its imperatives” as if “[it] has its own autonomous logic of development” (p. 138). The problem with this view is that it assumes that technology advances outside of human endeavors (Feenberg, p. 139). Although technologies can change practices, it is difficult to claim that technologies change values—the values already permeate society.
For example, mobile communication devices altered where and when people communicated, but they did not cause people to want the ability to have remote, instant communication. The demand or value for instant communication already existed; mobile communication devices fulfill a social desire. A longer discussion regarding the social construction of technology is beyond the scope of this article, but it is important to note that critical technological awareness privileges a social constructivist position over technological determinism.

I, Robot provides an opportunity to discuss the values or themes related to socially constructed technologies. Although there are more themes in the book, the following list is representative of the themes related to critical technological awareness: labor issues, women in science and engineering, the military-industrial complex, and machine predictability (see Appendix for descriptions of these themes). None of these themes claims that the use of technologies affects social issues, as a technological deterministic stance might claim; instead, the themes represent cultural situations that embed technology with dominant capitalist and militaristic ideologies. For instance, many technologies in the Western world are created for their military applications. The first half of the twentieth century shows how war drove technological advancement, from Marconi’s wireless to the atomic bomb. Knowing how to use weapons is a technological literacy; knowing that the weapons reflect a militaristic culture is critical technological awareness.

Unlike technological literacy, which can be assessed through field-specific tests that identify whether users can perform functions, critical technological awareness cannot be quantified. However, students’ responses give clues to their critical awareness. As mentioned above, I informally assess how well students develop critical technological awareness through class-wide discussions and an essay on I, Robot. I provide suggestions to help foster student thinking about the novel as more than just a “fun read” that falls outside of “normal” technical writing instruction. Ideally, I have students write an essay prior to a class-wide discussion of the novel. After that discussion, students should have a chance to workshop and turn in revised essays later. The essay topic suggestions are often followed, which is to be expected because students view them as “approved” subjects, but occasionally students create their own topics.

4. Class Response to I, Robot

A graduate seminar that I taught had the most distinctive variations on the essay topic suggestions that I provided, and that seminar’s students are the focus of this student voices section. Most essays showed very sophisticated understandings of technology from a social perspective. The overall course goal for this graduate seminar was to explore New Media as a socially constructed phenomenon. The course’s main text was Lev Manovich’s The Language of New Media. All but one student enrolled in the course declared technical writing as his or her graduate emphasis.1 Although each of the eight students addressed different parts of the I, Robot text, the following three themes dominated essay topics: 1) Technology as Life/Work Necessity, 2) Technological Dis-ease, and 3) Machine Infallibility/Predictability. I have chosen five student essays to discuss. The students’ work shows critical technological awareness, but their writing should not be considered an end of their thinking.
One thing I hope to impart to students is that critical technological awareness is a lifelong pursuit (as education should be) because we will never be divorced from technology or society, and technology is not static. I will now turn to discussing the student work specifically by showing how students addressed the three themes above.

5. Student Theme #1: Technology as Life/Work Necessity

Not surprisingly, students equate the robots in I, Robot with contemporary computers and view the robots as metaphors for our reliance on computers for everyday tasks. One student, whom I will call Eve,2 notes that “we are willing to struggle with the problems that arise when we interact with these evolving technologies because we have become reliant on them for functioning in our society.”3 As a technical communicator and citizen of the twenty-first century, Eve recognizes our dependence on technology; her statement above provides a glimpse of her critical technological awareness.

1 All but one of the students in the course were pursuing either a Graduate Certificate in Technical/Professional Communication or a Master of Arts in English with a concentration in Technical/Professional Communication.
2 All student names are pseudonyms.
3 I have chosen not to use brackets to identify where I altered the syntax of a student’s original work. I also do not use “[sic]” to identify students’ non-conventional discourse choices. Readers should assume all errors are mine.
Eve is also aware of her limitations with computer-based products but expresses that she has to get through the hurdles in order to accomplish her tasks:

    When I am using a word processor, Windows Vista, or Adobe software, I might not be able to get every function to work the way I need it to, but I work around this issue, and I am able to produce a product or complete an activity that will fulfill my needs nonetheless.

Eve relates this situation—being able to use but not fully know the technology as a programmer would—to an instance in I, Robot where two robot engineers stop trying to understand why a robot behaves the way it does; instead, the two engineers are content that the robot simply does what it is supposed to do. They, and Eve, work around the technology. Although these technical writing students are not necessarily going to work as programmers or engineers in the future, they believe that technological literacy is important. Knowing how to use software is beneficial, but the goal of critical technological awareness is to have students notice how technology is implicated with cultural values. Eve believes that “we yield to the technology because we have to in order to compete and exist in a technology-inundated culture.” Such an observation extends to Eve’s view of getting a job because she explains that “computer and software skills are extremely important in obtaining employment in a technology-driven market.” Understanding the currency of technological literacy demonstrates Eve’s critical technological awareness: computers are not just tools to get the job done; they are skills to get her the job.

Students often come to the class with the thought that they will acquire skills that they can then use in their careers. It is difficult to convince students that cultural analyses of technology are more important than skills acquisition, but teachers can effect a change in that thinking if they consistently emphasize critical technological awareness. The realization does not happen overnight, but possibly over time.

Another important feature of Eve’s discussion demonstrating critical technological awareness is her argument that without technological literacy, people will fall behind. Eve claims “the fear . . . in our culture for not being up to date . . . makes it easy to market expensive software.” Additionally, Eve feels that the need to attain the currency of technological literacy derives from the belief that “technology has created a paranoid culture of people who live at the mercy of being able to operate and interact with the latest versions of software or devices that are available.” The above observations are excellent insights into our consumerist culture in general and planned obsolescence in particular. Consumer demand, in part, fuels the creation of new gadgets, and planned obsolescence maintains a short usage life for disposable technologies (mobile phones, computers, software, etc.), so consumers keep buying the latest updates. Not only must a consumer know how to use new technology, she must own the latest technology or risk falling behind and becoming unemployable. In class-wide discussions, I introduce the concept of planned obsolescence, which follows Eve’s discussion of marketing new products. Eve goes on to argue that “technology also provides us with a sense of security that makes it more necessary for some people to possess it,” and she brings up cell phones as a major example.
Cell phones, or more accurately today “mobile communication devices,” are great ways to open discussion on critical technological awareness. They certainly appear to be ubiquitous, and it is rare that a student does not have one.4 Eve believes our society is dependent on mobile communication devices. They appear to be what Marshall McLuhan (1964) considers “fixed charges” in society (p. 21). Much like clothing, leaving home without one’s mobile phone could make a person feel naked, vulnerable. Societies have commodities that are, for lack of a better term, givens; the community accepts (it does not have to be a conscious decision) these commodities as givens, which “create the unique cultural flavor of any society” (McLuhan, p. 21). Not having a technology that many feel everyone should have can make the non-user appear backward. Backwardness and out-datedness are common projections for citizens who do not adopt popular technologies. After all, if the technologies are thought to be necessary by members of a culture, a person should not be able to operate in the world without these devices. Believing certain technologies or technological changes (e.g., planned obsolescence) are inevitable is absorbing the dominant cultural narrative that we must consume technology and keep up with it. Duffelmeyer’s (2000) study aimed to determine whether students are supportive of or hostile toward computers “or even [if they] demonstrate an awareness of prevailing discourses about technology” (p. 290), and she finds there are various student perspectives.
4 I do not own a mobile phone and have not for almost a decade. This fact is quite shocking to students. I hear cries like “how do you live?” or “what about for emergencies?” My apparent backwardness generates much conversation about the mobile communication device being a fixed charge in society.
Additionally, Duffelmeyer’s definition of “critical literacy” is having “an awareness of the forces that affect the micro- and macrolevel conditions within which we acquire literacy and how we view the uses and meaning of literacy” (p. 290). She concludes that “students are still very likely to see digital technology as something over which they have no control, to believe that their concerns and hesitance about technology are due to their lack of experience” (p. 303). Although effecting socio-technological change might be the overall goal for critical technological literacy, students must first be aware that there are other views of technology. Students will continue to consider technological ineptness as their problem unless they are asked to challenge the assumptions of givens—the technology itself or the socially constructed positive/positivistic narrative. Our goal as teachers is to show students that so-called technology “givens” are not organic but culturally constructed.

6. Student Theme #2: Technological Dis-ease

Another theme students address when comparing I, Robot’s technologies to contemporary real-world technologies is a sense of dis-ease caused by technologies that can monitor us without our knowledge, leading us to feel that we lack control over parts of our lives. Eve’s discussion suggests that consumers might not feel good without the security of having a particular technology or knowing how to use one, but her classmate Barbara is emphatic “that we lack control over machines and media that we now depend on.” Many students point out that society’s dependence causes us to give up assumed values, which disturbs some students. Technologies, or more likely their effects, become invisible when individuals cannot separate the technology from everyday activities. Therefore, when Eve mentions our society is dependent on mobile communication devices, she is being critically aware of a dependence on a tool. Because technologies are prostheses that extend our capabilities, without them—and specifically, without ones that are considered fixed—how can we possibly get by? Mobile communication devices, clothing, and automobiles are extensions of humans, and many would feel lost without access to them. In Eve’s analysis, she recognizes that employers want future employees to have skills; those skills are necessary for one to be considered worthy of a job.

The idea that technologies are prostheses of humans is further complicated by N. Katherine Hayles’ (1999) observation that humans function “in systems whose cognitive capacity exceeds [one’s] individual knowledge” (p. 289). For example, in order to cook food in a microwave, the user need not understand the science behind microwaves; instead, he or she just pushes buttons. Similarly, computer users rarely understand the engineering of software and hardware but use these technologies to perform countless tasks. This situation—having machines become prostheses that humans do not understand—exists because of a dialectical relationship between machines and humans. Hayles calls this relationship “reflexivity” and defines the concept as “the movement whereby that which has been used to generate a system is made, through a changed perspective, to become part of the system it generates” (p. 8). Therefore, humans created an industrialized world full of high-tech efficient artifacts only to become instruments in the industrial, economic, and technological systems they created.
Hayles also claims, “Reflexivity tends notoriously toward infinite regress” (p. 9), meaning that technological control is a simulacrum with a beginning that is difficult, if not impossible, to find. I, Robot leads readers to a similar conclusion because the novel’s human-machine situations enhance analyses of social constructions of technology. Human beings are real in I, Robot, but so are the robots because they become masters of “the system” (the world economy). By the novel’s conclusion, readers understand that the world’s technologies have basically taken over. Asimov shows that humans are the new robots; robots are no longer under control but in control. These robots have taken over, and humanity does not understand how the system works—humans may even believe they are still in charge.

In class discussions, the concept of technology exceeding user capacity can be explored by asking students how a common technology works. Although a few might know, for instance, the science behind microwave-oven technology, most will know only how to push the right buttons to use the machine. This example highlights Hayles’ theory on technology exceeding users’ capacity and reflexivity because students may begin to recognize their dependence on these machines. Additionally, students might not remember a time before microwaves and may have difficulty understanding how people cooked before their existence. This is a good time to remind them that microwaves are a standard American technology but not universally embraced across the world. Recognizing the cultural situation of a technology is critical technological awareness.

Another student named Karé explores the surveillance aspects of the World Wide Web and how some users are unaware of privacy violations. Her inspiration for this topic comes from a vignette in I, Robot about a mind-reading robot, Herbie, who lies to its human handlers in order not to hurt their feelings.
Herbie “reads” their desires and creates stories, telling his handlers what they want to hear. In her essay, Karé compares Herbie’s mind reading to the data collected by online providers and search engines. She claims that the “anonymous and private exchanges” we assume we have “are easily accessible by robots online.”5 She goes on to say, “as we build profiles on Facebook, gather RSS feeds, download music, and explore Google Earth, these companies are keeping tabs on our information.” Many undergraduates have discussed a similar topic related to online file sharing and piracy, but Karé takes the discussion a step further and relates this “mind-reading” to the cultural value of profit. A capitalist society needs to gather demographic information in order to market products successfully. Karé demonstrates her critical technological awareness by providing research on the goal for all this data mining: advertising. The inundation of media requires companies to advertise to niche markets because there is no one media outlet that dominates users’ attention. Instead of a 15-to-30-second commercial on prime time television, companies must advertise online to consumers who are rapidly clicking through cyberspace.

Karé provides research to support her argument that Google, in particular, collects information on users. She goes on to ask “why so much personal information needs to be stored indefinitely.” As a critically technologically aware user, Karé suggests, much like her classmates, that we are just used to the surveillance: “Perhaps most people find this practice acceptable because we are becoming so accustomed to personalized service we don’t think about how it is fine-tuned just for us.”6 Karé advises readers that “it appears the profits are being put before the protection of their [the company’s] customers.” Her insight shows she understands the cultural or, specifically, economic reason for gathering personal information—to make the company profitable. She also seems to be advocating that this information gathering is a violation of privacy and that consumers ought to be protected against it.

Although we might assume many users hope their online privacy is protected, critically technologically aware users understand their actions online may have repercussions. From embarrassing e-mails one wishes could be taken back to less-than-appropriate pictures of oneself posted online, users have to negotiate the near impossibility of complete privacy in the very public technology of the Internet. The Internet paradox is best described by Karé’s observation that it is “a double-edged sword” for users wanting access to virtually everything and those same users wanting their personal spaces online to be private. She explains that the Internet “has become an essential part of everyday life for most of us, and we expect to be able to find information about any topic at any time.” What she notices that others might not be conscious of is that users “willingly create digital reflections” on YouTube.com and Facebook.com, but they “still expect them to be personal and private.” Noticing the contradictions, paradoxes, and conditioning that technology presents is the mark of a critically technologically aware user and one of “[t]he cornerstones of critical thinking at the college level,” which Duffelmeyer (2000) claims “are the related abilities to tolerate ambiguity and contradictions and the willingness to consider an issue from many sides” (p. 291).

7. Student Theme #3: Machine Infallibility/Predictability
Karé also brings up a point that her other classmates develop further: the myth of machine predictability or infallibility. She notes that “we are entrusting robots and computers with more and more autonomous tasks” that cannot possibly replace human analysis. After all, as Karé asks, “how is a robot supposed to understand sarcasm or a joke? It can’t, so it simply treats everything literally.” The critically technologically aware user recognizes limitations that programmers and engineers—experts at building and assembling technology—might not recognize because she or he does not focus on code, switches, or other components of a system, but instead focuses on how the technology is embedded into a culture. A technical writer is in a good position to explain to programmers and engineers that general users do not come to technology with the exact same assumptions, attitudes, and skill levels as experts do. Such an understanding may help effect change in the development phase of a technology because being critically aware of user expectations and behaviors provides a user-centered approach to development as opposed to a system-centered (expert-oriented) approach. The distinction between user-centered and system-centered is similar to the difference between reader-based prose and writer-based prose that Linda Flower (1979) identifies in student writers:

5 In this particular quotation, Karé is referring to automated systems and contemporary robots (e.g., assembly-line robots) and not to Asimov’s robot characters.
6 Karé most likely refers to the way search engines tailor advertisements to a user’s search criteria. Also, Google’s web mail system, Gmail, displays advertisements around an e-mail that correspond to the content of that particular e-mail. For instance, if Karé mentions her new TV in an e-mail on her Gmail account, the receiver will probably have advertisements showing up around the message for new HDTVs from big-box stores or directly from manufacturers.
Just as writer-based prose “fail[s] to transform private thought into a public, reader-based expression” (p. 19), system-centered approaches do not effectively take the user’s (the general, non-expert public’s) knowledge into account because the simplest tasks are supposedly intuitive to the expert. Considering machines to be intuitive is an expert orientation to technology and relates to Karé’s point that machines are literal. As the critically technologically aware student recognizes, computers do not operate based on ambiguous instructions. Karé asserts human agency by pointing out that humans are needed for analysis. Machines return errors when confronted with ambiguity; critical-thinking humans have strategies for dealing with non-black-and-white issues. Different users (like different readers) bring various experiences to technologies, which may affect a technological response, an outcome that could demonstrate that the technology does not behave in the predictable or infallible way the designers assumed.

Pointing to the disconnect between experts and general users is a fundamental concern in technical writing. Experts have shortcuts in their own heads because their use of technology is second nature; oftentimes, they do not remember what it was like to experience a technology for the first time. When technologies do not work for users, experts will often blame the user and not the technology, which they feel is infallible. Most students recognize that technology is fallible—it does not always work the way we assume it should. There are glitches. Even Asimov’s robots had provisions to make sure they behaved predictably. His robots are governed by the three laws of robotics:

1. “A robot may not injure a human being, or, through inaction, allow a human being to come to harm,”
2. “a robot must obey the orders given to it by human beings except where such orders conflict with the First Law,” and
3. “a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws” (Asimov, 1950/2004, pp. 44-45).

Of course, as readers find out, the robots are hardly predictable and cause the human characters much stress when they do not behave according to design. The critically technologically aware student also draws comparisons between the robots’ glitches and their own experiences with computer problems.

The myth of technological infallibility and predictability is addressed by two students whom I call Winona and Belinda. Belinda observes a condition that reflects the situation of many desktop users at work: “People sit in front of their computers predicting that they will instantaneously be able to connect to the Internet, conduct transactions, create graphs or reports, and do many other tasks.” However, as she goes on to explain, “every day we will experience a computer glitch or disruption of our work.” Although it is inaccurate to claim that the majority of users are unaware of computer glitches, the idea that computers should always work seems to underscore our society’s emphasis on demanding technological solutions to a host of contemporary issues: education, environment, energy, terrorism, and transportation, to name several hot topics on the national register. Our belief that technology can solve those issues presumes an attitude that technological solutions are worth pursuing. They will not all work, but society believes in their promise of working.
Along with Belinda’s (and several other students’) observations that contemporary technologies are not perfect, Winona’s essay critiques the assumptions that computers are predictable and humans are inherently fallible. She compares I, Robot and Arthur C. Clarke’s 2001: A Space Odyssey to contemporary culture’s insistence on “looking to technology to counteract (or perhaps even solve the problem of) the human element.” Being a critically technologically aware observer, Winona notices that our culture seeks to “stabilize the human element with technology.” Automation seeks to reduce human error in production and has its roots in Fordist/Taylorist goals of efficiency. Winona appears to be aware of the cultural practice of giving up control of tasks to computers, which “can ruin an entire day or more when they malfunction.” Computers store information, help us communicate, and carry out various tasks. When they malfunction, productivity is lost and, inevitably, users become frustrated. These machines that help us accomplish so many tasks are expected to work. However, as Winona points out, we cannot separate ourselves from technologies because they “are presented to us as solutions to the ever growing need for speed and efficiency,” and “this handoff of control is required of us in order to see the full benefits of any given technology.” Furthermore, she notes that we lose autonomy as we gain efficiency, but our consumer culture accepts that. Such a point is consistent with the earlier discussion that contemporary users must have new technology because it is a fixed charge in society. The critically technologically aware user understands her lack of control over the tools needed to compete in the high-tech job market.
Even though technologies do not always behave the way we expect them to, Winona notes that “humans seem to carry an expectation of utter predictability when approaching technology in general.” Perhaps, as Winona suggests, our “expectation is a natural coping mechanism when faced with something more powerful and less comprehensible than ourselves.” Whether or not we can prove Winona’s previous statement is irrelevant to the fact that we do operate in a world of technology that we do not fully comprehend. Such an observation definitely marks her critical technological awareness because that understanding allows her to recognize that we carry on activities through systems that we have absorbed. Much like language usage is not about knowing how to explain the grammatical functions of words, operating technology is not about knowing the science and engineering (or programming) behind the technology’s operations: A user presses the power button, turns the key, or flips the switch and does not need to know how the technology works in order to use it. An instrumentalist approach to technical writing focuses on how to use technologies for immediate or artificial tasks such as assignments, but a critical technological awareness approach takes more effort to move students to identify how technologies we do not truly understand carry implicit values of the culture from which they come.

8. Conclusion

Ultimately, the above student observations continue the approach of working from a humanist perspective to influence technical writing pedagogy. As mentioned previously, Selber’s (2004) book is a thorough blueprint for a humanist “reimagining,” but providing assignments like the I, Robot essay or just discussing alternatives to the dominant positivistic, technology-is-always-good attitude brings students to be critically aware of technology. In line with Carolyn R. Miller’s (1979) rationale for technical writing as a humanistic endeavor, I agree that we should not unwittingly privilege a “covert acceptance of positivism” in our pedagogy (p. 616). A positivistic pedagogy privileges expediency over critical technological awareness, promoting a skills-oriented, product-driven class. Such a course might overemphasize learning new software, which may be obsolete in 12-18 months, instead of emphasizing understanding of the society that uses such perpetually obsolete technologies. Again, learning formulae that do not translate to all situations is just as much an instrumentalist exercise as software instruction. Asking students to create traditional technical writing assignments—résumés, reports, procedures, etc.—while discussing how those genres reflect professional cultures fits within a critical pedagogy framework and allows teachers to engage in the why behind genres as opposed to simply thrusting formats onto students to regurgitate. In the courses I teach, I ask for as much critical analysis of the traditional genres we produce as I ask about critical technological awareness. A class advocating a critical technological awareness perspective is compatible with traditional technical writing instruction if the course encourages students to analyze the rhetorical situations behind communicating with standard technical texts.

There is great potential for advocating a critical technological awareness stance using I, Robot. When they approach the novel, students seem to immediately identify and explain systems with which they are familiar.
Because technologies surround us, students should look beyond the surface of a technology’s usage and examine it as a cultural product that is embedded in society. Consumerism, surveillance, and planned obsolescence are just a few of the areas students address in class, but I, Robot clearly offers much more. The appendix shows that a class using I, Robot can cover everything from outsourcing jobs to redhead stereotypes to the role of women in science and engineering. The novel is a way of approaching critical technological awareness, but I do not consider it to be the way. After all, there are other texts that address human-computer interaction (HCI) and complex socio-technical issues related to common “problem” technologies (e.g., the irksome fax machine in Office Space) or far-out concepts like entirely automated spaceships (e.g., the murderous HAL from 2001: A Space Odyssey). I, Robot does not accurately predict the future, but labeling the novel as an outdated, science-fiction artifact ignores the fact that it is a valuable resource for opening up discussions of critical technological awareness.

I have promoted a view of technical writing that advocates a less instrumental focus. However, we cannot entirely abandon instrumental techniques in the classroom. Obviously, tools instruction helps technical writers to apply their classroom exercises directly to their jobs. There is value in such instruction, but what happens when the tools change or the students’ career paths change? The critical technological awareness paradigm I propose challenges teachers and students to broaden their focus on technology by analyzing the social, political, and historical aspects of technologies. Certainly, this is not a new approach: The examples I have given are simply a new way to continue rethinking technical writing pedagogy from a humanist perspective. I believe it is a more humanist endeavor not to give control to technologies; instead, student agency becomes the focus, and, without ways to discuss how agency disappears, our students will continue
to become victims of a non-socially conscious economic system that views workers as products to be utilized and discarded much like obsolete technologies. I encourage our discussions on technological literacy not to emphasize solely skills training. More research should be done on how we may bring technological literacy discussions into lessons dealing with how to use technology effectively. For instance, a class building web pages helps students learn a critical skill—web page development—but such an assignment may also spark discussions on planned obsolescence and the industry’s “need” for updated software and hardware so that users can access Web 2.0 applications quickly. Even though our outsourcing economy encourages contractual labor, late-capitalism still needs consumers. Even if postmodernity is our current “condition,” the modernist values of progress, efficiency, and, unfortunately, dehumanization have not gone away. Borrowing from others across the disciplines (and across English Departments) will help us be critically aware scholars who are not just talking to one another but engaging in conversations within a larger community. This article is an attempt to incorporate more critical thinking into technology education. Whether the course is a technical writing class or a composition class, having a critical awareness of technology is paramount for the twentyfirst-century citizen. It is not enough for us to know how to use technologies; we must understand how technologies use us. Acknowledgements I am grateful to the students whose work I reviewed for this article as well as to the students who participated in all the I, Robot assignments and discussions I’ve had over the last several years. I also want to thank Greg Wickliff for his thoughtful responses to an earlier draft of this article and Aaron Jaffe for his encouragement to pursue this unique approach to technical writing. Additionally, I thank the editors and anonymous reviewers at Computers and Composition for their constructive feedback.. Appendix A. A Sample of Discussion topics for Critical Technological Awareness from Isaac Asimov’s I, Robot. A.1. Robot/human vignettes The following examples match the first eight chapters in Asimov’s I, Robot; of course, they are not exhaustive. Also, the examples may appear to stress a tools approach, but as moments for discussion, these examples can easily be incorporated into critical technological awareness or critical literacy. Although the examples stress ethics in technical writing, I hope readers experiment with the multiple topics that may arise from the following situations. 1. Robot marketing. I, Robot helps students to consider how they would sell twenty-first-century technology. Obviously, the text allows for a reading of the audience, and in this case, Asimov’s audience is that of the hyper-industrial future. Space exploration is not merely for expanding humanity’s knowledge of the galaxy; it is for colonizing far-off places and extracting valuable resources. A New-Historical approach would capitalize on the mercantile, neo-colonial nature of such a world system and offer critiques of globalization. The discussion may even highlight differences between a manufacturing economy and an information economy. 2. Labor issues surrounding robots. Because robots are ultra-efficient in the novel, humans almost banned their use on Earth (Asimov, p. 35). 
Because our contemporary technologies continue to put jobs in jeopardy, students may benefit from examining how robots are used mainly for jobs humans cannot do. By looking at contemporary labor disputes regarding illegal immigration, the novel may be compared to political positions in favor of allowing undocumented workers to obtain visas—these workers do jobs that Americans will not—and positions contrary to allowing undocumented workers to acquire jobs in the United States. Furthermore, the practice of outsourcing manufacturing to lower-paying countries (e. g., sweatshops) may provide an interesting parallel to the novel’s “outsourcing” to robots. Such a conversation may spur students to think about the precarious nature of technical jobs that “disappear” because of either automation or a cheaper labor source. 3. QT, the existential robot. A rather peculiar robot named QT or “Cutie” is able to rationalize his existence as superior to that of humans. Because he feels that his kind is bigger, faster, and stronger than humans, he reasons that the “Creator” must have made him in His own image and not the weaker humans (Asimov, p. 63). Because
QT does his job well enough, the humans decide not to disassemble him; instead, they allow the peculiar robot to go about believing that he is superior even though that condition is a minor nuisance for the robot engineers. Alan Cooper (2004) calls users who accept minor nuisances of technology apologists: An apologist learns to work around software bugs because she or he accepts that one must bend to the rules of technology (p. 30). This may lead to more discussion on user-friendly versus user-centered designs in documentation. After all, what service are technical communicators performing if they assume their audience will conform to their expectations? Audience analysis is fundamental to all communication courses, and a more humanist approach would adapt documentation to users instead of conditioning users to just “deal with” the idiosyncratic problems that arise with software or other technologies.

4. Redhead stereotypes. Interestingly, the future of I, Robot still portrays redheads as ill-tempered individuals. From Judas Iscariot to the redheaded robot engineer Mike Donovan, popular culture portrays redheads as ill-tempered or simply suspect (Roach, 2005). Because science and technology are often considered truth and not socially constructed “beliefs,” Donovan’s attitude helps foster discussions about how experts’ attitudes may affect how they carry out their work. Scientists and engineers are often considered cold, calculating, objective professionals, so Donovan’s disposition, even though a stereotype, helps humanize the experts. Too often science and technology are thought to be purely objective, practical endeavors devoid of human subjectivity. The politics of science and technology are well known (Latour, 1987; Winner, 1986). A discussion on the role of scientists’ and engineers’ attitudes goes well with ethical discussions that arise when scientists and engineers have to bow to the pressures of management. For example, the often-cited instance of “when Jerald Mason asked Robert Lund to ‘take off his engineer’s hat and put on his manager’s hat”’ and approve the Challenger’s ill-fated launch to conform to managerial wishes (Walzer & Gross, 1994, p. 425) supports the idea that attitudes (or, simply, human factors) affect technological policy decisions.

5. Views of women in engineering and science. When the engineers are confronted with a robot that tells the humans what they want to hear, Dr. Calvin is tricked into believing that an object of her affection, Milton Ashe, is attracted to her. An analysis of women in science is quite appropriate at this point because Asimov portrays Dr. Calvin as an asexual being for most of the novel; however, after she is lied to about Ashe’s affection (p. 118), Dr. Calvin begins “using lipstick . . . [r]ouge, powder, and eyeshadow, too” (p. 121). Her new cosmetic look, combined with the fact that Dr. Calvin never marries, seems to suggest that a woman either has her job or her man—never both. Although such sexist ideology might be refuted as outdated, we cannot ignore that women are overwhelmingly underrepresented in “physical science, engineering, [and] technology field[s]” (Burger, Creamer, & Meszaros, 2007, p. 6). Discussions on gender are important in technical writing to begin addressing barriers to women entering these careers. Also, drawing on a humanist rationale, the class discussion may help introduce students to feminist critiques of gender, topics that traditionally get left out of science and engineering courses but are integral to humanities courses. Although Dr. Calvin has a prominent role in the novel—she is the narrator—she inhabits a subtle stereotypical role: In the universe of hard-science engineers, Dr. Calvin is a “soft” robopsychologist. That persona reinforces the stereotypes surrounding women’s understanding of emotions but not cold logic or facts—the supposed realm of male science and engineering.

6. Government support of technology and war. Even though the world is at peace in Asimov’s novel, the military still invests in research and development. The first law of robotics, a governing principle impressioned into the “brains” of all robots, states, “a robot may not injure a human being, or, through inaction, allow a human being to come to harm” (Asimov, pp. 44–45). In order to do a job that the military wants, a few robots are not impressioned with the first law (Asimov, p. 140). Class discussion can begin with observations on how technologies such as the Internet, or Humvees (later sold commercially as Hummers), began as military applications before becoming consumer products. Besides the possible military-industrial complex critique, this particular vignette supports discussion on ethics when one non-first-law-impressioned robot gets loose and mixes in with the fully impressioned robots. Dr. Calvin says to destroy all of the robots because it is too dangerous to have a non-first-law-impressioned robot on the loose. However, at thirty thousand dollars per robot (in 1940s dollars), a cost-benefit analysis has to be done, raising the question, “When exactly do you take the metaphorical Ford Pinto off the market?” Classes should use this example to explore the ethics of both technology and documentation of technologies. Who is responsible for harmful technologies? (A brief worked example of this cost-benefit arithmetic appears after this list.)
7. Scientific and technological races. A rival company wants to sabotage U.S. Robots’ “brain robot” in order to keep the firm from coming up with a secret to creating the Hyperatomic Drive, an interstellar engine that, if created, “will be the biggest thing in the world” according to a company manager (Asimov, p. 179). The brain robot is a supercomputer that does calculations faster than any other robot, and the brain robot can think. The rival company’s brain computer has failed, which puts them back “six years at least to build another [super robot]” (Asimov, p. 176). Because competition is so fierce, the rival company wants to slow U.S. Robots down. This scenario allows for a fruitful discussion examining competition in technology and science. After all, why do we call Watson and Crick’s work on DNA “the race for the double helix” or the United States’ and the Soviet Union’s post-Sputnik attempts to travel into outer space “the space race”? With growing litigation and concern for intellectual property violations—whether they be downloading MP3 audio files or cannibalizing another firm’s proprietary software code—discussions on workplace ethics are germane to any technical writing course. The way that the rival company tries to sabotage U.S. Robots’ brain is an interesting example of ethics and withholding information.

8. The fallacy of machine predictability. At the novel’s conclusion, the reader learns that the robots “run the show.” Although utopia is not quite reached, Asimov’s world does look good. More accurately, though, the world only looks good from a modernist perspective that believes “science and technology” lead humanity on “the path of human progress and efficiency” (Wilson, 2001, p. 73). If robots—beings that think in mathematical formulae—can run the world, what does that say about our society? Is life so predictable that one can punch in numbers and get perfect results? Hardly. Science and technology are not perfect, but the ideology of the industrialized world insists that science and technology will solve our problems. This is not to say science and technology have not improved life; instead, the discussion should explore the social values and practices that stem from holding the view of technology as panacea. Are all technologies equally valuable? Technologies need to be analyzed just as we analyze communication—from the surface level to the systemic level. A pacemaker “update” is a bit more significant than a software update when we think of the pacemaker’s direct benefit for human life. Then again, software updates do make our lives easier if they fix bugs or improve functionality. Introducing this topic may also support critical discussions on planned obsolescence—the dominant model for most, if not all, software.
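To make the cost-benefit arithmetic of vignette 6 concrete for students, the following minimal sketch (in Python) shows how a bare expected-value calculation can rationalize keeping a dangerous product on the market. Only the thirty-thousand-dollar unit price comes from the novel; the batch size, probability of harm, and liability figure are hypothetical numbers chosen purely for illustration.

    # A minimal sketch of the "Ford Pinto" cost-benefit question in vignette 6.
    # The per-robot price comes from the novel; every other figure is hypothetical.

    ROBOT_COST = 30_000    # per-robot price in 1940s dollars (from the novel)
    NUM_ROBOTS = 60        # hypothetical size of the suspect batch

    # Option A: destroy the whole batch, as Dr. Calvin recommends.
    cost_destroy = ROBOT_COST * NUM_ROBOTS          # $1,800,000

    # Option B: keep the robots and absorb the expected cost of an accident.
    p_harm = 0.02          # hypothetical probability the rogue robot harms someone
    liability = 5_000_000  # hypothetical damages if it does

    expected_cost_keep = p_harm * liability         # $100,000

    print(f"Cost to destroy the batch:   ${cost_destroy:,.0f}")
    print(f"Expected cost of keeping it: ${expected_cost_keep:,.0f}")

    # The spreadsheet logic favors keeping a dangerous product in service,
    # which is precisely the ethical trap the vignette asks students to examine.
    if expected_cost_keep < cost_destroy:
        print("The cost-benefit analysis says: keep the robots.")

The point of the exercise is not the arithmetic itself but what the model omits, such as the value of human life and the duty to warn users, which is where the classroom discussion should go.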
Aaron A. Toscano is an assistant professor of English at the University of North Carolina at Charlotte, where he teaches courses in technical communication, rhetoric/composition, new media studies, and women’s and gender studies. His teaching and research interests also include rhetoric of technology, science and technology studies (STS), and popular culture studies.

References

Asimov, Isaac. (2004). I, robot. New York: Spectra-Bantam. (Original work published 1950 by Gnome Press).
Burger, Carol J., Creamer, Elizabeth G., & Meszaros, Peggy S. (Eds.). (2007). Reconfiguring the firewall: Recruiting women to information technology across cultures and continents. Wellesley, MA: A K Peters.
Connors, Robert J. (2004). The rise of technical writing instruction in America. In J. Dubinsky (Ed.), Teaching technical communication: Critical issues for the classroom (pp. 77–98). Boston: Bedford/St. Martin’s. (Original work published 1982).
Cooper, Alan. (2004). The inmates are running the asylum: Why high tech products drive us crazy and how to restore the sanity. Indianapolis: Sams.
Duffelmeyer, Barbara B. (2000). Critical computer literacy: Computers in first-year composition as topic and environment. Computers and Composition, 17(3), 289–307.
Feenberg, Andrew. (2002). Transforming technology: A critical theory revisited. Oxford: Oxford University Press.
Flower, Linda. (1979). Writer-based prose: A cognitive basis for problems in writing. College English, 41(1), 19–37.
Gurak, Laura J. (2001). Cyberliteracy: Navigating the Internet with awareness. London: Yale University Press.
Hayles, N. Katherine. (1999). How we became posthuman: Virtual bodies in cybernetics, literature, and informatics. Chicago: University of Chicago Press.
Johnson, Robert J. (2003). When all else fails, use the instructions: Local knowledge, negotiation, and the construction of user-centered computer documentation. In T. Peeples (Ed.), Professional writing and rhetoric (pp. 287–316). New York: Longman. (Original work published 1998).
Kastman Breuch, Lee-Ann. (2002). Thinking critically about technological literacy: Developing a framework to guide computer pedagogy in technical communication. Technical Communication Quarterly, 11(3), 267–288.
Latour, Bruno. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, MA: Harvard University Press.
McLuhan, Marshall. (1964). Understanding media: The extensions of man. New York: Signet.
Miller, Carolyn R. (1979). A humanistic rationale for technical writing. College English, 40(6), 610–617.
Rivers, William E. (1994). Studies in the history of business and technical writing: A bibliographic essay. Journal of Business and Technical Communication, 8(1), 6–57.
Roach, Marion. (2005). The roots of desire: The myth, meaning, and sexual power of red hair. New York: Bloomsbury.
Scott, Tony. (2006). Writing work, technology, and pedagogy in the era of late capitalism. Computers and Composition, 23(2), 228–243.
Selber, Stuart. (2004). Multiliteracies for a digital age. Carbondale, IL: Southern Illinois University Press.
Selfe, Cynthia L. (1999). Technology and literacy in the twenty-first century: The importance of paying attention. Carbondale, IL: Southern Illinois University Press.
Walzer, Arthur E., & Gross, Alan. (1994). Positivists, postmodernists, Aristotelians, and the Challenger disaster. College English, 56(4), 420–433.
Wilson, Greg. (2001). Technical communication in late capitalism: Considering a postmodern technical communication pedagogy. Journal of Business and Technical Communication, 15(2), 72–99.
Winner, Langdon. (1986). The whale and the reactor: A search for limits in the age of high technology. Chicago: The University of Chicago Press.