
Individual Autonomy, Law, and Technology: Should Soft Determinism Guide Legal Analysis?

Bulletin of Science, Technology & Society 30(1) 4–8
© 2010 SAGE Publications
DOI: 10.1177/0270467609357452
http://bsts.sagepub.com

Arthur J. Cockfield
Queen's University, Kingston, Ontario, Canada

Corresponding Author: Arthur J. Cockfield, Faculty of Law, Queen's University, Macdonald Hall, 128 Union Street, Kingston, Ontario K7L 3N6, Canada. Email: [email protected]

Abstract

How one thinks about the relationship between individual autonomy (sometimes referred to as individual willpower or human agency) and technology can influence the way legal thinkers develop policy at the intersection of law and technology. Perspectives that fall toward the "machines control us" end of the spectrum may support more interventionist legal policies, while those that identify more closely with the "we are in charge of machines" position may refuse to interfere with technological developments. The concept of soft determinism charts a middle ground between these two positions and could assist in the formulation of a general theory of the relationship between law and technology. Soft determinism maintains that technological developments are embedded in social, political, economic, and other processes, and serve to guide and, potentially, configure future actions and relationships with these technologies, their users, and their subjects: although past technological developments shape the present, individuals and groups can still exert control over these technological developments.

Keywords: law, technology, theory, soft determinism

1. Humans Versus Machines

Do machines control us or do we control machines? Do we live in a Matrix-like environment, oblivious to the fact that technologies structure our individual lives as well as the societies in which we live? Or are we in charge of these technologies? How one thinks about the relationship between individual autonomy (sometimes referred to as individual willpower or human agency) and technology can influence the way legal thinkers develop policy at the intersection of law and technology. This article describes how views on this topic can assist legal thinkers in developing optimal laws and policies. By way of background, technology thinkers are sometimes broken down into two groups: instrumental and substantive theorists of technology (Feenberg, 2002, pp. 5-8). Generally speaking, instrumental theories (or perspectives) tend to treat technology as a neutral tool without examining its broader social, cultural, and political impacts.1 The instrumentalists are often identified with strains of thought that respect individual autonomy (or agency) in matters of technology, in part because technology itself is perceived to be neutral in its impact on human affairs, and in part because of the emphasis on human willpower to decide whether to adopt technologies. Technological transformation in this framework is a matter of purposeful action, a matter of rational and utilitarian choice.

A number of theorists, particularly in communications, management, and economics, hold that technology is simply a tool, that is, an instrument of the group or individual that chooses to develop and use a certain technology.2 This perspective maintains that technology solely serves the intended purposes held for it by its users. For the instrumentalists, human beings can and do direct the use of technology, and fears of technological tyranny overcoming individual autonomy are unfounded. Many of these conceptions of technology rely on the optimistic view that technology change produces largely beneficent results for individuals and their communities. In contrast, substantive theories emphasize the ways in which technological systems (or "structure") can have a substantive impact on individual and community interests that may differ from the technologies' intended impact: Substantive theorists sometimes emphasize how technological structure can overcome human willpower or even institutional action (Winner, 1980, p. 122). The perspective of Jacques Ellul, for instance, suggests that technology is far more than a tool.
In his estimation, technology is imbued with the power of the social structure and, as such, has rendered the actions of human agents insignificant. In our technological society, all of life is being subsumed by "technique," described as "the totality of methods rationally arrived at and having absolute efficiency . . . in every field of human activity" (Ellul, 1964, p. xxv).

How can instrumental and substantive theories influence legal analysis?3 Instrumentalists, such as Alfred E. Neuman, tend to say, "What, me worry?" After all, an instrumentalist might maintain, individuals or governments can choose to adopt (or not adopt) new technologies, and hence there is no real policy problem. A substantive theorist, on the other hand, might worry that, as Heidegger (1977) suggested, we believe we are masters of technology, but this is largely an illusion because we have become part of technological processes and are forced to keep up with and adapt to these processes (p. 5). Hence the adoption of certain technologies, or the lack of regulation of these technologies by government or judicial actors, may lead us to a place we did not necessarily expect or want to go.

In summary, legal thinkers who fall toward the "machines control us" end of the spectrum may support more interventionist legal policies, whereas those who identify more closely with the "we are in charge of machines" position may refuse to interfere with technological developments, in part because of a reluctance to change the legal status quo in an aggressive manner that could unduly upset traditional legal interests. Relatedly, law and technology perspectives can be similarly influenced by views on whether modern technological developments form part of a continuum of ongoing developments or instead represent a substantial break from prior technological eras. Commentators who view new technologies as discontinuous from earlier times may emphasize the deficiencies of traditional approaches (Johnson & Post, 1996, p. 1400). Others who see historic continuities in the development of technology may claim that traditional legal mechanisms can properly address legal issues (Goldsmith, 1998).

2. Big Brother Is Watching You

To take one example, how would the two theories inform views on governments' increasing deployment of surveillance technologies? George Orwell's famed novel 1984 warned of a future totalitarian state in which the government watched its citizens through "telescreens" to ensure they followed the law and to tamp down possible political dissent. The United Kingdom today is widely regarded as an emerging "surveillance society" because of its enthusiastic adoption of private sector and government surveillance technologies to fight crime and terrorism (Lyon, 1994; Lyon & Zureik, 1996). For example, over 4 million closed-circuit television (CCTV) cameras scrutinize public and private spaces in Great Britain, and certain London residents can be caught on camera up to 300 times per day (McCahill & Norris, 2003).

Legal processes may be encouraging the rise of surveillance societies because legal analysis traditionally emphasizes the individual rights aspects of privacy in the context of police investigations (Cockfield, 2007). This view has sometimes led to the notion that privacy is a competing interest with security, and hence that privacy must be diluted to protect the public against criminals and terrorists. From an instrumental perspective, governments, operating through legitimate democratic means, should adopt these technologies to fight crime and terrorism as long as they are an efficient and effective means to achieve these ends. A substantive perspective, on the other hand, could maintain that we are heading down a road where our governments are deploying powerful new technologies to keep us safe, with the unintended effect of undermining democratic values and, at least in the long run, our safety and security. It is not so much that the machines are controlling us, but rather that we are adopting technologies (ostensibly for sound policy reasons) that may end up backfiring.

In an era where governments are deploying powerful information, communication, and genetic technologies that greatly enhance their ability to collect, use, and share personal information as part of their investigations, a broader consideration of the privacy interests at stake is required. Under this view, legal analysis should recognize the "public" or "social" aspect of privacy, which is society's interest in preserving privacy apart from the interests of any particular individual. Priscilla Regan (1995), for instance, argues that privacy serves purposes beyond those that it performs for a particular individual. She notes that one aspect of the social value of privacy is that it sets boundaries for the state's exercise of power. Such boundaries, for example, underlie freedom of speech and association within a liberal democratic political system. Hence even if privacy becomes less important to certain individuals, it continues to serve other critical interests in a free and democratic state (e.g., the need to protect political dissent) beyond those that it performs for a particular person.

Consistent with this view, research by sociologists, political scientists, and others explores how technological advances in surveillance heighten the risk of unanticipated adverse social consequences.4 These outcomes include the repression of political dissent as surveillance technologies are used to target members of identifiable groups despite no evidence of individual wrongdoing (Cockfield, 2007, pp. 51-53). This sort of profiling tends to lead to social alienation of individuals within the targeted group, which increasingly fosters an "us versus them" mentality in which members of the targeted group refuse to assist authorities with their investigations.

Further, pervasive and unseen scrutiny by state agents carries the potential to inhibit freedom of expression, as individuals fear their speech and actions could be monitored by the police. Finally, nations become less democratic when citizens have greater difficulty holding state agents accountable for their actions: Technological and legal developments increase the risk that police and intelligence officers will abuse their new surveillance powers without being detected. The erosion of these traditional democratic values may reduce our safety and security, at least in the long run. As such, the preservation of the social value of privacy can be portrayed as consistent—and not competing—with security interests.

3. Soft Determinism as the Way Forward

The question of whether machines control us is ultimately a way to help understand how the relationship between law and technology affects the outcomes of legal and policy decisions. In the previous example, an instrumental perspective tends to underappreciate the complex interaction between law, technology, and human institutions that can lead to unanticipated and adverse social policy outcomes. Yet substantive theories often seem to pay insufficient heed to the importance of human agency. They may be too quick to assume that technological structures overwhelm the wills of technology producers and consumers. In any event, attempts such as the one above to parse instrumental and substantive approaches in terms of their positions on agency/structure are recognized as reductionist interpretations of complex theories. Each of these approaches is far more complex than this dichotomy suggests, and most works note how agency affects structure and vice versa.

For these reasons, there have been ongoing attempts outside the legal academy to chart a middle ground between the two approaches. By way of example, Manuel Castells's (2000) work places people and their artifacts in a mutually bound relationship (pp. 500-502). In his view, neither can one remove a technology or a conception of technology from the networks of relations in which it is bound, nor can one extract the relationships of human beings with technology from the network in which they are bound. Hence both technology and humanity are necessarily implicated in and bound together within complex social relationships. Similarly, rather than suggesting that technology drives individuals (or vice versa), science and technology studies seek this middle ground, seeing, for instance, history and technological development as intertwined (Sismondo, 2004). These views appear to track the "soft determinism" perspective articulated by some technology theorists (Sismondo, 2004, p. 81). Unlike strict versions of technological determinism, soft determinism maintains that technological developments are embedded in social, political, economic, and other processes, and serve to guide and, potentially, configure future actions and relationships with these technologies, their users, and their subjects.

Although past technological developments shape the present, individuals and groups can still exert control over these technological developments. Soft determinism, at least initially, may appear to offer a comforting story about possible law and technology analysis. It accepts that technology exerts a determining influence on human affairs, but argues that insightful legal analysis can somehow help to resist the harmful aspects of this determinism. Under this view, governments could wisely deploy CCTV to watch individuals in limited ways to promote security while preserving important civil liberties.

The fact that the story appears comforting should give one pause. In fact, because of the complex, nonlinear relationship between law and technology, and because human constructs such as law and technology are embedded within changing social, political, economic, and other processes, it will always remain difficult to gauge whether legal rules and policies will produce desired policy outcomes. Moreover, soft determinism is related to the philosophical notion of compatibilism, which holds out the prospect of free will in a deterministic universe (i.e., a universe where every event is causally related to past events). Compatibilism accepts that everything outside the mind (the natural environment, parental and peer influences, etc.) and everything inside the mind (genetics) constrains individual decision making. Yet compatibilism suggests we can still exert free will as long as we do not act out of compulsion by another person. This account of free will has been attacked on various grounds by schools of thought called incompatibilism or hard determinism, which suggest that genuine free will is impossible when one accounts for all the possible influences on decision making (Pereboom, 2001). A law and technology theory informed by soft determinism could similarly be subjected to critiques that argue it is an illusion to think we can truly resist the determining power of technology through better informed legal and policy analysis.

Nevertheless, soft determinism provides the most helpful way to reconcile the tension between instrumental and substantive theories and perspectives on technology. Legal analysis informed by soft determinism could seek to balance the potentials for restrictive and beneficial forms of structure against the limitations and potentials of individual autonomy. The challenge is to account for these complexities to the greatest extent possible while recognizing that prescriptions for "optimal" laws and policies will forever remain contentious. Indeed, soft determinism recognizes that the past often exerts control over the present, much as common law lawyers understand that the law is tied to prior developments.


The "always looking back" element of the common law, for instance, promotes stability in the law while simultaneously "always looking forward" to adapt to changing circumstances:

The law embodies the story of a nation's development through many centuries, and it cannot be dealt with as if it contained only the axioms and corollaries of a book of mathematics. In order to know what it is, we must know what it has been, and what it tends to become. (Holmes, 1881/1991, p. 1)

A synthesis of instrumental and substantive theoretical strands that adopts the soft determinism approach could help to define the ambit and scope of a general theory of law and technology (Cockfield & Pridmore, 2007, pp. 494-500). This synthetic theory of law and technology could inform frameworks or guiding principles to help legal thinkers address situations where technology change is destabilizing values and interests traditionally protected by law. The approach, like the common law, investigates the tension between past and present developments and could ultimately "illuminate the entire law" (Easterbrook, 1996). Under the proposed framework, the instrumentalist approach reminds legal analysts that each situation must be carefully scrutinized on its own facts and circumstances to determine whether technology is unduly subverting interests that the law has traditionally protected. Once a determination is made that technology change is in fact harming traditional interests, the substantive approach can inform analysis that seeks a broader contextual (i.e., less deferential to precedent) understanding of potential legal solutions that will preserve the traditional interests. These solutions include investigations into whether the law can mold technological developments to indirectly influence individual and group behavior (Lessig, 1999, p. 502). This perspective does not seek to present a radical reconception of legal analysis involving law and technology matters. Rather, the theory informs a framework that calls for a more explicit consideration of the complex interplay between law and technology, and the ways technology can have a substantive impact on individuals and their legal interests apart from the technology's initial intended use. The critiques of soft determinism remind us that legal analysis in this area is fraught with difficulties, in part because the determining power of technological structure may be masked within other processes.

Declaration of Conflicting Interests

The author declared no conflicts of interest with respect to the authorship and/or publication of this article.

Funding

The author received no financial support for the research and/or authorship of this article.

Notes

1. I examined the relationship between individual autonomy, law, and technology in earlier works, which this article draws from to a certain extent (Cockfield, 2004, pp. 385-386, 398-399; Cockfield & Pridmore, 2007, pp. 478-495).
2. Some instrumental approaches ignore questions of individual autonomy because they are exclusively focused on enhancing efficiency, leaving the social questions to other disciplines (van Wyk, 2002).
3. A number of legal scholars have previously, either implicitly or explicitly, accounted for views on human willpower/technological structure when examining legal responses to innovations (Bennett Moses, 2005; Bernstein, 2004; Chandler, 2007; Tranter, 2005).
4. Our research team's comparative survey of over 7,000 individuals in nine countries (Canada, the United States, France, Hungary, Spain, Mexico, Japan, China, and Brazil) broadly suggests that individuals in these countries believe their governments have not struck the right balance in protecting their security and privacy, leading to views that privacy interests may be undermined by a combination of legal reforms and the use of powerful new surveillance technologies. See The Surveillance Project (2008), The Globalization of Personal Data Project: An International Survey on Privacy and Surveillance, Summary of Findings.

References

Bennett Moses, L. (2005). Understanding legal responses to technological change: The example of in vitro fertilization. Minnesota Journal of Law, Science & Technology, 6, 505-618.
Bernstein, G. (2004). Accommodating technological innovation: Identity, genetic testing and the Internet. Vanderbilt Law Review, 57, 965.
Castells, M. (2000). The rise of the network society (2nd ed.). Oxford, UK: Blackwell.
Chandler, J. (2007). The autonomy of technology: Do courts control technology or do they just legitimize its social acceptance? Bulletin of Science, Technology & Society, 27, 339-348.
Cockfield, A. J. (2004). Towards a law and technology theory. Manitoba Law Journal, 30, 383-415.
Cockfield, A. J. (2007). Protecting the social value of privacy in the context of state investigations using new technologies. University of British Columbia Law Review, 40, 41-67.
Cockfield, A. J., & Pridmore, J. (2007). A synthetic theory of law and technology. Minnesota Journal of Law, Science & Technology, 8, 475-513.
Easterbrook, F. H. (1996). Cyberspace and the law of the horse. University of Chicago Legal Forum, 1996, 207.
Ellul, J. (1964). The technological society (J. Wilkinson, Trans.). New York: Vintage Press.
Feenberg, A. (2002). Transforming technology: A critical theory revisited. Oxford, UK: Oxford University Press.
Holmes, O. W., Jr. (1991). The common law. New York: Dover. (Original work published 1881)


Goldsmith, J. L. (1998). Against cyberanarchy. University of Chicago Law Review, 65, 1199.
Heidegger, M. (1977). The question concerning technology and other essays (W. Lovitt, Trans.). New York: HarperCollins.
Johnson, D., & Post, D. (1996). Law and borders: The rise of law in cyberspace. Stanford Law Review, 48, 1367.
Lessig, L. (1999). The law of the horse: What cyberlaw might teach. Harvard Law Review, 113, 501.
Lyon, D. (1994). The electronic eye: The rise of surveillance society. Minneapolis: University of Minnesota Press.
Lyon, D., & Zureik, E. (1996). Surveillance, privacy, and the new technology. In D. Lyon & E. Zureik (Eds.), Computers, surveillance, and privacy. Minneapolis: University of Minnesota Press.
McCahill, M., & Norris, C. (2003). Estimating the extent, sophistication and legality of CCTV in London. In M. Gill (Ed.), CCTV (pp. 51-66). Basingstoke, UK: Perpetuity Press.
Pereboom, D. (2001). Living without free will. Cambridge, UK: Cambridge University Press.

Regan, P. (1995). Legislating privacy: Technology, social values, and public policy. Chapel Hill: University of North Carolina Press.
Sismondo, S. (2004). An introduction to science and technology studies. Oxford, UK: Blackwell.
The Surveillance Project. (2008). The globalization of personal data project: An international survey on privacy and surveillance, summary of findings. Kingston, Ontario, Canada: Author.
Tranter, K. (2005). The history of the haste wagons: Motor car 1909 (VIC), emergent technology and the call for law. Melbourne University Law Review, 29, 843.
van Wyk, R. J. (2002). Technology: A fundamental structure? Knowledge, Technology & Policy, 15, 14-35.
Winner, L. (1980). Do artifacts have politics? Daedalus, Winter, 109.

Bio

Arthur J. Cockfield, BA (University of Western Ontario), LL.B (Queen's University), JSM and JSD (Stanford University), is an Associate Professor at Queen's University Faculty of Law.
