GOLUMBIA, COMPUTATIONAL OBJECT
1
The Computational Object: A Poststructuralist Approach

DAVID GOLUMBIA
Independent scholar, New York, NY ([email protected])
Abstract. To consider the development environment of modern computing means to take seriously the metaphoric operation that links language in general with programming languages. On close examination very little of the common-sense structure of this metaphor can be said to hold. Programming languages are not much like so-called natural languages, making the overt cultural power of the metaphor that much stronger. Programming languages in this sense are explicitly a product of Western language practices and the Western orientation toward logic, including the deep metaphor according to which the mind is like a machine, and thought is like computation. To the degree that they are like natural languages, programming languages represent an extraordinarily successful extension of standard written versions of English, whose spread is deeply problematic in world linguistic and cultural terms. The modern Object-Oriented Paradigm is shown to move both toward and away from these metaphors, perhaps less fully than might be possible.

Keywords: Computer environment; cultural diversity; deconstruction; linguistic diversity; linguistic imperialism; metaphor; multilinguality; Object-Oriented Paradigm; programming languages; standardization.
Computers and the Humanities xx: nnn-nnn, yyyy © yyyy. Kluwer Academic Publishers. Printed in the Netherlands.

1. “Programming Languages” as Metaphor

Some aspects of the computer environment have come into sharp focus within contemporary philosophy and cultural studies. In general, this focus has been concentrated on issues of content and the end-user experience, and to a lesser extent on end-user application design. The areas of scholarly inquiry one thinks of most
often in this area include “new media studies,” which focuses very much on content, and “informatics,” which is in some ways a kind of general orientation toward the computer revolution. In both approaches, as in the social understanding more generally, we see the general acceptance and deployment of a series of metaphorical strategies in which communication is the proper transfer of a given amount of information from point A to point B, the points referred to being somewhat interchangeably human beings or computers.1

An area to which less attention has been paid is the means of construction of the computer environment: this is to say, what we know as programming languages, though I will suggest that there are serious problems with understanding the environment this simply. In truth, the object under consideration extends from the actual hardware on which computer software runs, through the multiple layers of software that exist between the hardware and the programmer, each of which has been designed in a more or less systematic way for any number of reasons. Our modern understanding of this process as one controlled by programming languages is part of what has not yet been brought into sustained critical or philosophical perspective (see Floridi 1999). It is unclear where the linguistic and conceptual isomorphisms arose that inserted the term language into programming languages, although it is completely clear that the term has persisted as a central metaphor from the beginning of the information age. (COBOL, for example, one of the first languages to be written by a corporate-military-government committee, was explicitly written in terms of nouns and verbs; see Sammet 1981, 207.) But on the surface, programming languages have very little in common with so-called natural languages.
Natural languages are highly variable, possessing any number of interlocking structural plans that remain partly intractable even to today’s most sophisticated grammatical theorists (Chomsky 1986, 1995). Programming languages, on the contrary, are explicit formalisms, derived from a small handful of core structural models, designed from the beginning to be reducible to logic and mathematics (Gelernter and Jagannathan 1990, Knuth 1992). Around 6,000 human languages are spoken in the world today; there are only about
120 active writing systems of any sort, in which about 300-400 of the 6,000 languages are commonly expressed.2 Despite the Western analytic bias toward writing, it is clear that understanding human language as a user has to do with conversational interaction and oral performance: these are the fundamental grounds from which natural languages are learned (Abram 1992, Illich 1980, Linell 1982). Even so-called illiterate persons generally have full command of every aspect of human linguistic competence (which raises interesting questions about what is being taught when print literacy is socially enforced).

Programming languages, on the other hand, are solely written; only about 20 or so have played important roles in constructing the computational infrastructure to which we entrust much of our economic and geopolitical relations. They are all constructed out of English words and English orthography, and they are all standardized (to greater or lesser degrees), whereas few human languages are standardized (see Bex and Watts 1999, Crowley 1989, de Grazia 1990 and Parakrama 1995). (At the same time, it is fair to say that programming languages display a higher degree of variation than do written natural languages; no new written natural languages have emerged at all over the period in which all programming languages have appeared.) Although it is possible to imagine understanding spoken programming languages, it is hard to imagine this being done for any practicable human purpose. In other words, programming languages do not seem to do what natural languages do.

The ease with which we adapt to the metaphor of programming languages as languages has several interesting antecedents and consequences.3 It helps to underwrite the dominant paradigm in much contemporary philosophy of language and linguistics, suggesting that the structure of programming languages might be (or must be) something like the structure of human languages.
In part this stems from an even deeper, and arguably uniquely Western, belief: that the mind and its linguistic expression are best understood as a kind of logical machine, ultimately founded on a kind of digital or binary logic in which all decisions are, ultimately, true or false; on or off; black or white; right or wrong; and so on (a core belief that has been the target no less of critical linguistics than of deconstruction and of cultural materialism).4 In the case of the computer environment, the “language machine” story underwrites a
fact pattern that has not been interrogated closely enough: the assumption that it does require significant aspects of the human language processing mechanism to manage programming languages. That is to say, programming languages are taken to constitute an interface between the precisely binary qualities of the underlying hardware and the ability of most human beings to stretch their language skills only so far away from natural language while still retaining sufficient fluency to manipulate software with dexterity.
2. Computation and Competence

Along with language, mathematics is often understood to be a universal, formal system of propositions and values, not a skill set exactly but a kind of core component of human rationality that nevertheless requires significant environmental prompting to be realized. Arguably this view is mistaken: arguably mathematical competence is culturally-specific in precisely the sense that what we call math is culturally-specific, and is learned only in the context of education in Western-style schools. In other words, most cultures maintain a wide variety of methods with which to manage quantitative structures and practices, both in an empirical sense and abstractly; but few cultures insist that abstracted algebra, geometry, calculus, and trigonometry must be taught to the members of the ruling and bureaucratic classes as a system of universal principles.

In this respect it is interesting to reflect on just what is the object that transformationalists have in mind when they talk about linguistic competence, which is the hallmark of Universal Grammar. It is a key Chomskyan assumption that every spoken language manifests the entirety — or nearly the entirety — of the core grammatical architecture. In some important sense, and surprisingly since Chomsky himself rarely stresses the point, a non-standardized language like so-called Black English represents the mechanisms of Universal Grammar at least as much if not more so than does a language practice like standard written English, which must be enforced through continual training and education. This shows another strain in the programming/language analogy: for we have little doubt that even an adult who has been wholly unexposed to written language can
master its ways with some amount of tutoring and study; one could see a Chomskyan using this fact to suggest that some parameters of the universal architecture are revealed in written practice as well, requiring environmental stimulus to be awakened (although other construals are possible as well). But this is not the case with programming languages. Programming languages seem to align much more readily with other mathematically-based skills: a definable subset of the population has talent for them, and only the subset of that subset that goes on to train these skills can truly be said to be fluent in them. This is curious because it seems to suggest directly, contra much contemporary philosophy, that mathematical skills are not universal, at least not in the sense that language skills are. This raises questions about how much of an “unstacked deck” social structure can be, when it is run by principles that are not understood or even manipulable equally by all members of the population.

The critical problem with the lack of analogy between computational competence and linguistic competence is that it is precisely the computational aspects of natural language that, on Chomsky’s account, define competence. Thus an illustration of the “fundamental problem” of linguistics is that “without instruction or direct evidence, children unerringly use computationally complex structure-dependent rules rather than computationally simple rules that involve only the predicate ‘leftmost’ in a linear sequence of words” (Chomsky 1986, 7-8). Universal Grammar is supposed to be a kind of computer in the head; it parses (in an unspecified order) strings of natural language, decomposing them into lexical items and combinatorial rules. On the surface this is just what programming languages seem to do. A programming language, especially a very logically-oriented one like Basic or Pascal, also decomposes into natural language strings and logical rules.
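The decomposition just described can be made concrete in a few lines of Java (used for the sketches in this essay because it is one of the languages under discussion). The statement and the splitting rule below are illustrative assumptions only, not a real compiler front end:

```java
import java.util.Arrays;
import java.util.List;

// A toy illustration of "decomposing into strings and rules": a
// Pascal-like statement is split into its lexical items. A real
// lexer applies formal rules (regular expressions, a finite
// automaton) rather than simple whitespace splitting.
public class LexicalSketch {
    static List<String> tokenize(String statement) {
        // Split on whitespace into English-derived keywords,
        // identifiers, and operator symbols.
        return Arrays.asList(statement.trim().split("\\s+"));
    }

    public static void main(String[] args) {
        // Note that every keyword is an English word or abbreviation.
        System.out.println(tokenize("if x > 0 then writeln ( x )"));
    }
}
```

The lexical items here are English words (“if,” “then,” the abbreviation “writeln”) and logical symbols; the “logical rules” of the language would then be applied to this token stream by a parser.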
When the program runs, these rules and propositions are executed, and this execution is called “computation.” This sounds reasonable but on reflection it falls apart. Chomskyan computation approaches a sentence like a logical proof; the Universal Grammar “computer” takes strings of natural language as input, applying automatic rules of syntactic organization to derive — generate — meaning. This process is applied once (or
recursively), essentially transforming a proposition into a meaning: “The bicycle was moved by me” is a transformation from “I moved the bicycle” via the passivization transformation. That “passivization” rule is applied right at the level of natural language: it operates directly on the pronoun “I” to move it to the end of the sentence and change its case to accusative.

The computer that does this operates nothing at all like what we understand as a physical computer. A physical computer does not take strings of natural language as input; when a physical computer interprets what appears to be a string of logical propositions, it is precisely because the processor cannot understand them that the program must be translated into an entirely different language, machine language, that is almost entirely uninterpretable (at the level of entire programs) by human beings. There is no sense in which the binary processing of machine language can be said to consist in taking strings of natural language as input, applying automatic rules of syntactic organization to derive meaning. In fact, this almost sounds like a description precisely of what computers cannot do: it sounds like an explanation for why we have had to build programming languages, precisely to translate our own, highly idealized and abstracted, natural-language thoughts into a form of expression that a computer can understand. Indeed, if the architecture of a computer were much like the architecture of the human language processor, one would expect to see a far higher degree of harmony between human skill with computers and human skill with language.5 The reason we do not is that computers are fundamentally mathematical and logical; they understand and manipulate a certain kind of abstract symbol, in purely logical terms, in a way no human being can.
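The contrast can be made concrete. In the Java sketch below (an illustration of the general point, not of any particular processor), the example sentence above is reduced to the numeric character codes that are all the hardware ever processes; nothing at this level marks “I” as a pronoun or “moved” as a verb:

```java
import java.nio.charset.StandardCharsets;

// At the machine level a natural-language sentence is only numbers.
// These byte values are ASCII character codes; the hardware
// manipulates them with no access to grammatical structure.
public class MachineView {
    public static void main(String[] args) {
        String sentence = "I moved the bicycle";
        for (byte b : sentence.getBytes(StandardCharsets.US_ASCII)) {
            System.out.print(b + " ");  // 73 32 109 111 ...
        }
        System.out.println();
    }
}
```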
Instead, programming languages are generally concentrated on a limited set of functions that map precisely to their hardware underpinnings and logico-mathematical nature. “Because mathematics provides no traditional way for expressing [other kinds of] objects and operations, they were left out of programming’s notational scheme at the beginning; and by and large, they remain left out to this day” (Gelernter and Jagannathan 1990, 9).
Human beings understand natural language in a way no computer appears to be able to. This suggests that the phrase “programming languages” forms a kind of cultural crux (or “catachresis” in the terms of Spivak 1999), a stance toward the world that is part and parcel of other forms of ideology, and one whose remarkable underexamination comports disturbingly with its almost unmatched power, in historical terms, over human subjectivity and the definition of social relations.
3. Computer Languages and Linguistic Diversity

Another tempting but inaccurate analogy between programming languages and natural languages can be found along the axis of linguistic diversity. It is no accident, and no less remarkable, that insofar as something akin to natural language “runs” computers, that language would have to be identified with contemporary standard written English. Contemporary standard written English is remarkably effective at projecting itself as typical of “the way language works”; it is also remarkably atypical, both from an historical perspective and from a synchronic perspective, that is, as one out of the world’s 6,000 or so languages. Of the approximately 120 writing systems in active use, only a handful can be said to have developed English-style alphabetic orthography “internally” — that is, much prior to the imposition of direct colonial administration by European countries (Coulmas 1989, 1996; Skutnabb-Kangas 2000; also see Derrida 1976, 1979, 1996; Golumbia 1999; Ong 1977, 1982). Many societies have found themselves compelled to re-write their languages so as either to comport with Western standards or, simply, to become Romanized or quasi-English (Skutnabb-Kangas 2000). Few have been standardized to anything like the Greco-Roman degree (in other words, most non-Western writing systems include many “left over” characters, alternate forms, competing systems, and so on); each of them presents unique difficulties for computers, such that native speakers of any of the world’s non-European major languages experience a routine degree of difficulty in using these languages in the computational environment.6

Few English-only speakers realize the implications of the fact that almost all programming languages consist entirely of English orthographic phrases, and that most operating systems are structured around command-line interfaces that take
English writing, and specifically imperative statements, as their input (Lawler 1999). The extraordinary success of software engineers in India, Russia, Japan and Hong Kong (among other places) maps directly onto the megalopolises created and maintained by world empires, and correlates no less with the spread of English-style education and enforced English language and orthography (Skutnabb-Kangas 2000). It seems no accident that computers rely on the availability of standardized text, and that the persons who are fluent in computer engineering emerge from cultures where English-style standardization is produced and often enforced. This has not resulted in the wide availability of native-language resources, even in the widely-distributed alternate orthographies, most especially at the software development level. One might say that, despite the ability of computers to produce documents in Hindi or Japanese, computers and networks themselves speak and in some sense “think” only a fragmentary, math- and logic-oriented chunk of English.

This is visible most readily on the Internet. In at least three connected ways, the web looks like an instrument of multilinguality, but on closer examination seems largely to be organized around very Westernized, English-based categories and language concepts. First, the HTML for most web documents, the markup which surrounds the document’s content (along with JavaScript and several other non-compiled script languages), is fundamentally in English, so that it is necessary to understand the English meaning of certain words and abbreviations in order to read a document’s source (and in a critical sense, to interpret the document).
Second, web servers and web software are themselves usually confined solely to English, necessitating (among other things) English-based CGI programs and directories, meaning that the URLs of most web documents are largely in English (in fact, it is only very recently that it has been proposed for browsers to allow any non-Roman characters in URLs).7 Related to this is the fact that the entire operating-system world is run by English products and with English directory structures, such that most web-based categorization tools (like Yahoo!) are organized around profoundly Western-style categories and category systems — often the exactly equivalent English titles, viewable as the HTML link for any category on any non-English Yahoo! website, including most of the pages that have Roman character representations and those that do not.
From the perspective of world linguistic history, programming languages represent not a diversity of approaches so much as a remarkable extension of an already highly-standardized phenomenon: English. This might seem a strong claim, were it not exactly in line with one of the most remarkable features of contemporary world culture, namely the global replacement of local languages with English and, to an important but lesser degree, other standardized languages (Nettle and Romaine 2000, Illich 1980). We have been taught to think of the computer revolution as a fundamental extension of human thinking power, but in a significant way mass computerization may be more accurately thought of as a vehicle for the accelerated spread of a dominant standard written language. (Computer-based writing may only be less successful, that is to say, than mass media such as television and pop music as a medium for spreading prestige English as a spoken tongue.)

Fortunately or not, one of the things computers have also done is to help expose and call into question the spread of English and other standardized languages, just as this spread has taken on new power exactly through the spread of computers and the bureaucratic systems of control they bring with them (Illich 1980, Phillipson 1992, Skutnabb-Kangas 2000, Spivak 1999). The problem with this spread, which must always be presented as instrumental, and is therefore always profoundly ideological, is that it arises in suspicious proximity to the other phenomena of cultural domination toward which recent critical work has made us especially sensitive. I am thinking here of the strong tendency in the West (although not at all unique to us) to dismiss alternative forms of subjectivity, sexuality, racial identity, gender, kinship, family structure and so on, in favor of a relatively singular model or small set of models.
Of course some of this is due to the cultural oppression that goes along with every empire; what in particular seems to be at issue the world over is a loss of cultural diversity and a coordinate loss of linguistic diversity, and the coordination between these two is complex. By “cultural diversity” here I mean something broader than what is sometimes understood by the phrase — not merely styles of dress or manners but the principles and guides by which the whole of social structure is elaborated, and within the matrix
of which identity is crafted. Today we radically underestimate the power of these forces, despite their clear implication in the production of modern society, itself far more uncharacteristic than typical in the tens of thousands of years of human history (Mander 1992). This has become so much the case that we have encased in terms like globality and modernity the apparently inevitable spread of written English and other European languages (Spivak 1999). We have convinced ourselves that, because the people on the ground seem to want to learn these languages, then they must only constitute tools to attain economic status, rather than a deep part of one cultural position that has historically spread as much through explicit imposition as through mutual agreement. Mass standardization and geographically-wide linguistic and cultural uniformity are not necessities. In world linguistic history, there have been any number of periods of deep, long-standing linguistic interchange between high numbers of linguistically diverse groups, where high levels of structural diversity correlate with low cultural diversity over small geographic areas, but higher cultural diversity over larger areas (Dixon 1997, Nettle and Romaine 2000, Nichols 1992, Sapir 1921). In other words, there are many areas where high linguistic diversity over a wide geographic area correlates with a high level of linguistic and cultural adaptation to local geographic conditions, and a great deal of linguistic variation and change over relatively short periods of time. There even seems to be anecdotal evidence promoting the usefulness of the kind of ecological management exercised by humans in such geographic areas, coupled with any number of attitudes toward technological progress that are quite different from ours (Abram 1996, Bragdon 1996, Nettle and Romaine 2000).
4. Objects

The Object-Oriented Paradigm (OOP) has been the dominant model for modern programming languages since the 1980s, and has grown to special prominence lately with the wide acceptance of Java. Object-Oriented (OO) languages re-orient the programmer to think in terms of abstract objects, rather than in terms of the step-by-step ordering associated with more widely-known procedural languages like Basic, Fortran, COBOL and Pascal (Gelernter and Jagannathan 1990, Hathaway 1997, Hu 1990,
Weisfeld 2000). It is associated with many modern software applications such as CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning). OO languages are often presented to the programmer as a set of visual tools that are conceptually and practically related to the GUIs found on personal computers today (the tools themselves are often Windows applications, much like standard Windows programs). Part of the question I want to raise, in fact, has to do with what might be thought of as the change in vantage from the user to the programmer, and the interesting lack of full fit, over the course of computer history, between the system as the user interacts with it and the system with which the programmer interacts (for Vantage Theory see Hill and MacLaury 1995).

It is an interesting paradox that these visual tools and development environments have direct and (sometimes) understood linguistic interpretations that are notationally equivalent to their visual representations. We are accustomed to thinking of programming languages according to the procedural model associated with non-Object-Oriented varieties. A programming language is supposed to be something like a natural language in that it combines a lexicon with a set of rules. While this is a serious misunderstanding of natural language, it is actually a fairly reasonable way to approach Basic or Pascal. But C++ and Java are much more difficult to understand directly in these terms: instead it is necessary to understand the conceptual processes according to which these languages have been built, which are both deeply conceptual and impressively central for our culture and its systems of control. In fact, the Object-Oriented Paradigm represents a conscious attempt by engineers to take control of the language development process at a high conceptual level.
OOP advocates make enthusiastic claims:

Objects provide a canonical focus throughout analysis, design, and implementation by emphasizing the state, behavior, and interaction of objects in models, providing the desirable property of seamlessness between activities. … Object-orientation offers a better paradigm, a pattern of practice and thought through which we apply the traditions of the discipline, a means through which we can view the world, and a fundamental shift in the philosophies of computer science and engineering, replacing
the older paradigm of structured techniques by providing fundamental improvement and superior solution spaces. (Hathaway 1997, 1)
The OOP shift entails an interesting extension of the programming language-as-language metaphor. In OOP, as the name implies, all significant entities are understood as objects, a concept which is said to transcend the digital divide. Things in both the real world and the computer world are said to be objects, and objects are understood via their methods and behaviors. Objects can have state and they can have class membership, an extension of the logical type/token distinction. The standard way to represent these concepts to the user is to map them onto a supposedly natural real-world metaphysics. This leads to relatively straightforwardly metaphorical descriptions like: “when you look at a person, you see the person as an object. The person has attributes, such as eye color, behavior, and a certain way of walking. In its basic definition, an object is an entity that contains both data and behavior” (Weisfeld 2000, 9). Emphasis in OO languages is on the ability to realize complex conceptual structures, utilizing relational concepts such as inheritance and polymorphism. Some OO languages — though not the ones in widespread commercial or institutional use — are said to implement prototyping in a flexible way, drawing on prototype theory (Hathaway 1997, 1.1).8

In real-world implementations using languages like Java and C++, one finds a constant tension between the idealized abstractions of objects and their practical consequences, such as the speed and memory usage of various messages and methods (in other words, what goes into an object and what goes into a class, what inherits what from what, and what is left out of the OOP-designed system, are all highly variable and situation-dependent) (Hu 1990). In some ways, then, the OOP represents a highly adaptive and potentially localizable system, and in other ways it is a highly abstracted one that reinforces the programmer’s background beliefs in abstract properties and methods.
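Weisfeld’s person-as-object description translates almost word for word into Java. The sketch below is a minimal, hypothetical illustration (the Programmer subclass and all names are inventions for this example, not drawn from the cited texts) of state, behavior, class membership, inheritance and polymorphism:

```java
// State ("data") and behavior bundled in a class, per Weisfeld's
// description of a person as an object.
class Person {
    String eyeColor;                       // an attribute: state
    Person(String eyeColor) { this.eyeColor = eyeColor; }
    String walk() { return "a certain way of walking"; }  // behavior
}

// Inheritance: a Programmer is a kind of Person.
class Programmer extends Person {
    Programmer(String eyeColor) { super(eyeColor); }
    @Override
    String walk() { return "walking while thinking in objects"; }
}

public class OopSketch {
    public static void main(String[] args) {
        // Polymorphism: a Person-typed variable holds a Programmer;
        // the overridden behavior is selected at run time.
        Person p = new Programmer("brown");
        System.out.println(p.eyeColor + ": " + p.walk());
    }
}
```

Class membership here is the type/token relation in miniature: Programmer is a type, and each `new Programmer(...)` a token of it.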
Interestingly, the whole system is essentially abstract with regard to the hardware itself: that is to say that objects do not exist in the hardware. Rather, they are implemented in high-level compilers and runtime environments such as the Java Virtual Machine, and do not have direct representations in machine
language (Gelernter and Jagannathan 1990). Objects, much like other objects in the cultural sphere, remain largely idealized, both in their conception and their use.

It is clear, indeed explicit, that the OOP is structured around a conceptual metaphor that begs a fundamental question. In the human environment, objects co-occur with subjects. In most fields that take on this conceptual structure (linguistics, philosophy, deconstruction, psychoanalysis), one does not find extensive consideration of the effects of one part of the conceptual structure without consideration of the other. The OOP is explicitly a theory of objects without a theory of subjects: as such it is hard not to take seriously its construal of subjects as objects, as interpellations within the apparently externalized sphere of physical being (see Hill and MacLaury 1995 for some alternate linguistic-cultural construals of person and object according to subject vantage; also see Lacan 1981). Deconstruction and cultural studies have been at the forefront of always calling such efforts into question, and of noting which practices have tended to include balanced and/or interactive subject/object conceptualizations (and, indeed, deep cultural diversity) and which have not. The mathematical subject postulated not just by graphical development environments but even by computer games and VR simulations maps too neatly onto the worldview of the language machine, where the other is construed as object to the exclusion of third-party subjectivity (and this seems to be much the direction in which the ideal programmer is said to be heading, as envisioned, for example, by Garner and Crawford 1979, Gelernter 1998, Knuth 1992, and Weisfeld 2000; also see De Landa 1991). Logic is said to define the machine, but somehow also defines the person, at the expense of the unconscious impulses and forms that we know shadow all our actions.
If a computer has an unconscious, its language is massively binary; the human unconscious, on the contrary, is largely linguistic, very much made irreducibly of the language in which subjectivity is made (Kress and Hodge 1979, Lacan 1981). To a human being, language is almost entirely metaphor, and almost entirely a means of social connection between interacting persons (despite our persistent faith in ideologies like communication).
Were we to take language seriously, as Harris suggests, we might end up somewhere much more like the place Tim Berners-Lee (1999) originally pointed the web (the original idea was to have pages be always editable, an equitability of input and output that has seemed unimplementable in terms of user acceptance). But technological extensions of linguistic technology can hardly be seen as inherently positive. The technologies of mass communication that do allow oral languages to be transmitted, free of markup or even content control (such as some forms of radio and television, and the telephone), do not necessarily even then serve a salutary purpose. But one can imagine forms of inter-group communication and interaction in which both the subject and object (and notions of person that fall outside the subject/object dichotomy) are the only stuff available within which language itself can do its interactive work of constructing authentic social alternatives.

It seems possible that multilinguality, orality and nonstandardization (to use three shortcuts for values not stressed in the computational environment) could be emphasized on purpose in future computer projects. (That values like nonstandardization are useful can be seen in the wide variability of email and chat constructions, often going as far afield as the keyboard will allow.) The increasing promise of audiovisual-based interaction suggests that the use of aural and other nonwritten forms of language could allow some of the 5,500 or so unwritten languages to be stressed on the web (although the experience of television suggests otherwise). Efforts to deliberately provide indigenous cultures with network access, websites, language resources and other relevant materials also seem to hold promise. Multilingual technologies like Unicode, although fraught with ideological issues of their own, seem at least to offer the potential for mass intra-group understanding of linguistic complexity.
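Java strings, for instance, are Unicode sequences by specification, so a single string type can carry Devanagari alongside Roman text. The sketch below (the sample word is illustrative) shows that the underlying units are numeric code points rather than English letters:

```java
// A single Unicode string mixing Devanagari and Roman text. The
// Devanagari word is "Hindi" written in its own script.
public class UnicodeSketch {
    public static void main(String[] args) {
        String hindi = "\u0939\u093F\u0928\u094D\u0926\u0940"; // हिन्दी
        String mixed = hindi + " / Hindi";
        // The first unit is code point U+0939 (decimal 2361), not a
        // letter of any Roman alphabet.
        System.out.println(hindi.codePointAt(0)); // prints 2361
        System.out.println(mixed);
    }
}
```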
It is even possible that a recognition of the degree to which computers do not “use language” might allow us to develop computers that respond in more creative and useful ways to how human beings actually use language. Perhaps a spread of this sort of natural-language creativity would be useful in stemming the tide of unquestioned, digitally-necessitated standardization.
Acknowledgments
I appreciate advice, ideas and input from Suzanne Daly, Sonali Perera, Jodi Lynn Melamed, Elliott R. Trice, Lisa R. Henderson, Jen Leibhart and Peter R. Mahnke.
Notes
1. Golumbia (in press) attempts to pinpoint this orientation and to outline some misunderstandings that feed into it. The most widely-cited origin point for this critique in contemporary philosophical and linguistic thought is Reddy (1979). For new media studies see, for example, Lunenfeld (1999) and Bolter and Grusin (1999). Robins and Webster (1999) is an excellent introduction to what might be understood as critical informatics; for related perspectives see Golumbia (1996) and Lessig (1999).
2. This information is derived from Grimes (2000), Coulmas (1989, 1996), Illich (1980), Nichols (1992) and Nettle (1999). For a more thorough summary of conceptual issues in world language history see Golumbia (1999).
3. On the connections between central metaphors and conceptual practice more generally, see Lakoff and Johnson (1980, 1999) and Derrida (1976, 1982, 1987).
4. For critical discussion of the mind-machine hypothesis, see, for example, Harris (1980), Ong (1977, 1982) and Preston (1995). For more general works in cultural studies and deconstruction that target such central belief systems and their structure, see Foucault (1971, 1972), Illich (1980) and De Landa (1991, 1997).
5. On less restrictive accounts than Chomsky’s, this is clear; see e.g. Jackendoff (1997).
6. It is difficult enough simply to identify languages in terms appropriate for computers; see Constable and Simons (2000) for a metadata proposal that shows how far the computer environment is from being truly multilingual. (Today, most computers cannot recognize most of the world’s 6000 languages.)
7. Only very recently has a proposal been adopted to alter this situation, such that some providers are now experimenting with allowing non-Western characters in domain names (see www.networksolutions.com; www.register.com). Of course these are by necessity limited to the major standardized languages.
8. For prototype theory see Lakoff (1987).
References
Abram, D. 1996. The Spell of the Sensuous: Perception and Language in a More-than-Human World. New York: Pantheon.
Adam, A. 1998. Artificial Knowing: Gender and the Thinking Machine. New York: Routledge.
Berners-Lee, T. 1999. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor. New York: HarperCollins Publishers.
Bex, T., and Watts, R. J. (eds.) 1999. Standard English: The Widening Debate. London and New York: Routledge.
Bolter, J. D., and Grusin, R. 1999. Remediation: Understanding New Media. Cambridge, MA: The MIT Press.
Bragdon, K. J. 1996. Native People of Southern New England, 1500-1650. Norman, OK: U of Oklahoma P.
Chomsky, N. 1986. Knowledge of Language: Its Nature, Origin and Use. Westport, CT and London: Praeger.
Chomsky, N. 1995. The Minimalist Program. Cambridge, MA: The MIT Press.
Constable, P. and Simons, G. 2000. Language Identification and IT: Addressing Problems of Linguistic Diversity on a Global Scale. SIL Electronic Working Papers (SILEWP). Dallas, TX: SIL International. [http://www.sil.org/silewp/2000/]
Coulmas, F. 1989. The Writing Systems of the World. Oxford: Blackwell.
Coulmas, F. 1996. The Blackwell Encyclopedia of Writing Systems. Oxford: Blackwell.
Crowley, T. 1989. Standard English and the Politics of Language. Urbana: U of Illinois P.
de Grazia, M. 1990. “Homonyms Before and After Lexical Standardization.” Deutsche Shakespeare-Gesellschaft West Jahrbuch. 143-156.
De Landa, M. 1991. War in the Age of Intelligent Machines. Cambridge, MA: Swerve Editions/Zone Books/The MIT Press.
De Landa, M. 1997. A Thousand Years of Nonlinear History. New York: Swerve Editions/Zone Books/The MIT Press.
Derrida, J. 1976. Of Grammatology. Spivak, G. C. (trans.) Baltimore: Johns Hopkins UP.
Derrida, J. 1979. Scribble (writing-power). Plotkin, C. (trans.) In Derrida (1998). 50-73.
Derrida, J. 1982. Margins of Philosophy. Bass, A. (trans.) Chicago: U of Chicago P.
Derrida, J. 1987. The Retrait of Metaphor. Gasdner, F. (trans.) In Derrida (1998). 102-129.
Derrida, J. 1996. Monolingualism of the Other; or, The Prosthesis of Origin. Mensah, P. (trans.) Stanford, CA: Stanford UP.
Derrida, J. 1998. The Derrida Reader: Writing Performances. Wolfreys, J. (ed.) Lincoln, NB: U of Nebraska P.
Dixon, R. M. W. 1997. The Rise and Fall of Languages. Cambridge and New York: Cambridge UP.
Floridi, L. 1999. Philosophy and Computing: An Introduction. London: Routledge.
Foucault, M. 1971. The Order of Things: An Archaeology of the Human Sciences. Sheridan, A. (trans.) New York: Vintage Books.
Foucault, M. 1972. The Archaeology of Knowledge and the Discourse on Language. Sheridan, A. (trans.) New York: Pantheon Books.
Garner, R., and Crawford, M. 1979. The COBOL Environment. Englewood Cliffs, NJ: Prentice-Hall.
Gelernter, D. 1998. Machine Beauty: Elegance and the Heart of Technology. New York: Basic Books.
Gelernter, D., and Jagannathan, S. 1990. Programming Linguistics. Cambridge, MA and London: The MIT Press.
Golumbia, D. 1996. Hypercapital. Postmodern Culture 7(1). 40 paragraphs. [http://www.mindspring.com/~dgolumbi/docs/hycap/hypercapital.html]
Golumbia, D. 1999. Toward a History of ‘Language’: Ong and Derrida. Oxford Literary Review 21(1-2). 73-90.
Golumbia, D. In press. Metadiversity: On the Unavailability of Alternatives to Information. In Bousquet, M. and Wills, K. (eds.) Informatics of Resistance.
Grimes, B. (ed.) 2000. Ethnologue. 14th Edition. CD-ROM. Dallas, TX: SIL International. [http://www.sil.org/ethnologue/]
Harris, R. 1987. The Language Machine. Ithaca, NY: Cornell UP.
Hathaway, R. J. 1997. Object FAQ. [http://www.cyberdyne-object-sys.com/oofaq2]
Hill, J. H., and MacLaury, R. E. 1995. The Terror of Montezuma: Aztec History, Vantage Theory, and the Category of ‘Person.’ In Taylor, J. R. and MacLaury, R. E. (eds.) Language and the Cognitive Construal of the World. Berlin: Mouton. 277-329.
Hu, D. 1990. Object-Oriented Environment in C++: A User-Friendly Interface. Portland, OR: MIS (Management Information Source) Press.
Illich, I. 1980. Vernacular Values. CoEvolution Quarterly. [http://www.preservenet.com/theory/Illich/Vernacular.html]
Jackendoff, R. 1997. The Architecture of the Language Faculty. Cambridge, MA: The MIT Press.
Knuth, D. E. 1992. Literate Programming. Stanford, CA: CSLI.
Kress, G. and Hodge, R. 1979. Language as Ideology. London: Routledge and Kegan Paul.
Lacan, J. 1981. The Four Fundamental Concepts of Psychoanalysis. Sheridan, A. (trans.) New York and London: Norton.
Lakoff, G. 1987. Women, Fire, and Dangerous Things: What Categories Reveal about the Mind. Chicago and London: U of Chicago P.
Lakoff, G. and Johnson, M. 1980. Metaphors We Live By. Chicago: U of Chicago P.
Lakoff, G. and Johnson, M. 1999. Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought. New York: Basic Books.
Lawler, J. M. 1999. Metaphors We Compute By. In Hickey, D. (ed.) Figures of Thought: For College Writers. Mayfield Publishing. [http://www-personal.umich.edu/~jlawler/meta4compute.html]
Lessig, L. 1999. Code, and Other Laws of Cyberspace. New York: Basic Books.
Linell, P. 1982. The Written Language Bias in Linguistics. Linköping: U of Linköping P.
Lunenfeld, P. (ed.) 1999. The Digital Dialectic: New Essays on New Media. Cambridge, MA: The MIT Press.
Mander, J. 1992. In the Absence of the Sacred: The Failure of Technology and the Survival of the Indian Nations. San Francisco, CA: Sierra Club Books.
Nettle, D. 1999. Linguistic Diversity. Oxford: Oxford UP.
Nettle, D., and Romaine, S. 2000. Vanishing Voices: The Extinction of the World’s Languages. New York and London: Oxford UP.
Nichols, J. 1992. Linguistic Diversity in Space and Time. Chicago: U of Chicago P.
Ong, W. J. 1977. Interfaces of the Word: Studies in the Evolution of Consciousness and Culture. Ithaca, NY: Cornell UP.
Ong, W. J. 1982. Orality and Literacy: The Technologizing of the Word. New York and London: Routledge.
Parakrama, A. 1995. De-Hegemonizing Language Standards: Learning from (Post)Colonial Englishes about “English.” New York: St. Martin’s Press.
Phillipson, R. 1992. Linguistic Imperialism. Oxford: Oxford UP.
Preston, B. 1995. The Ontological Argument Against the Mind-Machine Hypothesis. Philosophical Studies 80(2). 131-157.
Reddy, M. 1979. The Conduit Metaphor: A Case of Frame Conflict in Our Language about Language. In Ortony, A. (ed.) Metaphor and Thought. Cambridge and New York: Cambridge UP. 284-324.
Robins, K., and Webster, F. 1999. Times of the Technoculture: From the Information Society to the Virtual Life. London and New York: Routledge.
Sammet, J. E. 1981. The Early History of COBOL. In Wexelblat, R. L. (ed.) History of Programming Languages. New York and London: ACM/Academic Press. 199-243.
Sapir, E. 1921. Language: An Introduction to the Study of Speech. London: Granada, 1978.
Skutnabb-Kangas, T. 2000. Linguistic Genocide in Education — Or Worldwide Diversity and Human Rights? Mahwah, NJ and London: Lawrence Erlbaum.
Weisfeld, M. 2000. The Object-Oriented Thought Process: The Authoritative Solution. Indianapolis, IN: SAMS.
Wilson, E. A. 1998. Neural Geographies: Feminism and the Microstructure of Cognition. New York and London: Routledge.