PRIVACY PROTECTION AND COMMUNICATIVE RESPECT
Hans Weigand (Infolab, Tilburg University, The Netherlands), [email protected]
Cate Heeney (CCSR, Manchester University, UK), [email protected]

ABSTRACT Privacy is an important rationale for data protection. However, the concept of privacy is not without problems. In this paper, we argue for a new perspective on data protection. This perspective considers data in the context of a communicative setting. It explores the norms for communicative action and their consequences for data protection. The paper starts with a specific data protection problem that is not easily dealt with from a privacy perspective, that is, the problem of the dissemination of statistical data by National Statistical Organizations. The communicative approach is worked out in a number of guiding principles: (a) forestalling a focus on expressive information, (b) true informed consent, (c) recognition of discourse spheres.

KEYWORDS Privacy, communicative action, statistical data, data mining

The copyright of this paper belongs to the paper’s authors. Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage. Proceedings of the 8th International Working Conference on the Language-Action Perspective on Communication Modelling (LAP 2003) Tilburg, The Netherlands, July 1-2, 2003 (H.Weigand, G. Goldkuhl, A. de Moor, eds.) http://www.uvt.nl/lap2003


1. INTRODUCTION In the post-9/11 environment, data surveillance has become a serious topic of discussion [Gandy, 2002]. For many years already, data protection has found its way into European directives and national laws. In this paper, we argue for a new perspective on data protection. This perspective considers data in the context of a communicative setting. It explores the norms for communicative action and their consequences for data protection. The paper is built around a specific data protection problem that is not easily dealt with from a privacy perspective, namely the dissemination of statistical data by National Statistical Organizations. Although these organizations are usually very careful in their handling of data, the data can be refined so that, in combination with other data and with the use of data mining techniques, they allow the construction of quite precise profiles of individuals or small groups. The aim of this paper is twofold: first, it briefly introduces the privacy problems associated with statistical data, based on [Heeney & Weigand, 2003]. Secondly, it introduces a communication perspective on privacy, which is then applied to the case.

2. PRIVACY PROTECTION AND STATISTICAL DATA This section explores the nature of the responsibility of National Statistical Organizations (NSOs) to data subjects as regards the dissemination of statistical data. Three main issues will be discussed. The first is how secondary uses of data released by statistical organizations can affect individual privacy. The second is what sort of protection is offered to individuals with regard to secondary uses of statistical data. The third is the extent to which statistical organizations can protect the data they release from particular secondary uses. 2.1 How NSOs Tackle the Problem of Privacy The approach of National Statistical Organizations to protecting privacy tends to put most emphasis on direct disclosure, or the re-identification of a particular individual in the data set. Much work is done on simulating the motivations and actions of 'data intruders' who wish to match a given individual from the population with one in the data set [Elliot, 1996]. This could entail identifying a person in the data set whom one knows from the population or, conversely, finding a person in the population after selecting her from the data set. The notion of attribute disclosure is also a concern, but it tends to be much more difficult to define and protect against. Attribute disclosure can be said to have definitely happened, however, where a particular characteristic of an individual can be inferred with a high degree of certainty. An example would be a data set showing that all single women of 35 in a given area owned their own home. Upon
meeting a 35-year-old woman from this area, it could then be inferred with a good degree of certainty that she was a homeowner. Knowledge Discovery and Data Mining (KDDM) are technologies for analysing data in such a way as to find patterns and relationships between variables. Another relevant technology is Geographical Information Systems (GIS). 2.2 Uses of data from NSOs The data produced by NSOs tend to be released into the public domain in a heavily protected form. However, the fact that the data can then be used in combination with other data sources can have implications for individuals despite their not being directly identified. The importance of the data disseminated by NSOs in building up a profile of the people living in a given area is clearly recognised by marketers. In the UK, the data for small areas, Small Area Statistics (SAS), are acknowledged as fundamental to the creation of GIS. SAS data also help to ascertain the homogeneity or heterogeneity of an area. Marketers are eager to point out that they take a probabilistic rather than a direct approach to the characteristics of individuals. This remains unproblematic from the perspective of NSOs, as it does not aim to re-identify individuals and therefore does not breach direct disclosure pledges. However, direct marketers, for example, are able to study identified data sets for 'look-alikes' who fit profiles discovered in the non-identified data sets [Sleight, 1993].
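The homeowner example and the 'look-alike' technique described above can be made concrete with a small sketch. All records, field names and values below are invented for illustration; they do not come from any real NSO data set.

```python
from collections import defaultdict

# A released (non-identified) small-area data set: records without names.
released = [
    {"area": "A1", "sex": "F", "age": 35, "marital": "single", "tenure": "owner"},
    {"area": "A1", "sex": "F", "age": 35, "marital": "single", "tenure": "owner"},
    {"area": "A1", "sex": "M", "age": 35, "marital": "single", "tenure": "renter"},
]

def attribute_disclosure(records, key_fields, target_field):
    """Find key combinations that fully determine the target attribute.
    If every record sharing the same key values has the same target value,
    that attribute is disclosed for anyone known to match the key."""
    groups = defaultdict(set)
    for r in records:
        key = tuple(r[f] for f in key_fields)
        groups[key].add(r[target_field])
    return {key: vals.pop() for key, vals in groups.items() if len(vals) == 1}

disclosed = attribute_disclosure(released, ["area", "sex", "age", "marital"], "tenure")
# ("A1", "F", 35, "single") maps to "owner": any single 35-year-old woman from
# A1 can now be inferred to be a homeowner, without re-identifying anyone.

# "Look-alike" matching: an identified marketing list is scanned for people
# who fit a profile discovered in the non-identified data.
identified = [
    {"name": "Ms. X", "area": "A1", "sex": "F", "age": 35, "marital": "single"},
    {"name": "Mr. Y", "area": "A2", "sex": "M", "age": 40, "marital": "married"},
]

def look_alikes(people, profile):
    return [p for p in people if all(p.get(k) == v for k, v in profile.items())]

matches = look_alikes(identified, {"area": "A1", "sex": "F", "age": 35, "marital": "single"})
```

Note that neither step breaches a direct disclosure pledge: the released data stay non-identified, yet the combination with an identified source yields targeted individuals.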

2.3 The importance of accuracy NSOs are concerned to prevent direct disclosure of the identity of individuals and the possibility of accurately inferring attributes about individuals. There are two problems with this approach in terms of its application to the reality of secondary uses of data. First, technologies such as data mining and GIS allow the incorporation of data of different types from many other sources, and analysis that reduces the possible values of data and allows the isolation of smaller groups within the data. Second, the links between variables and the patterns that they constitute are sometimes used to ground decisions as if the links were causal rather than probabilistic. The fact that statistical organizations take care to disguise the data or make them safe against accurate re-identification does not protect individuals against uses of data where accuracy is not seen as of primary importance. Rather, the use of data by marketers and other speculative organizations is to improve the odds in their favor [Sleight, 1993]. The motivation is to segment the population in order to make the most efficient use of resources. So while such organizations are likely to wish to reduce the margin of error as much as possible for their purposes, direct re-identification of a given individual is not always necessary.
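The 'improving the odds' logic can be illustrated with a minimal sketch: areas are ranked by the observed rate of some desired trait, and only the best-scoring areas are targeted. The area codes and rates below are invented, not real SAS figures.

```python
area_stats = {
    # area code: (households, households with trait of interest)
    "A1": (1000, 300),
    "A2": (1000, 120),
    "A3": (1000, 40),
}

def target_areas(stats, min_rate):
    """Return areas whose trait rate beats the threshold, best first.
    No individual is identified; the marketer only improves the odds."""
    rates = {area: hits / total for area, (total, hits) in stats.items()}
    return sorted((a for a, r in rates.items() if r >= min_rate),
                  key=lambda a: rates[a], reverse=True)

selected = target_areas(area_stats, min_rate=0.10)  # ["A1", "A2"]; A3 is excluded
```

The exclusion of A3 is exactly the redlining-style effect discussed in section 2.4: accuracy about any one household is irrelevant, so disclosure control offers no protection here.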

2.4 Problems with statistical data in GIS and KDDM GIS and KDDM aim, like statistics, to distinguish groups in order to learn more about these groups in terms of their characteristics and behaviour. These technologies can be used to complement the activity of statistics in general; for example, data mining can be used to identify risky records so as to combat attribute and direct disclosure. GIS can be used to highlight areas of deprivation in order to call for political intervention. Here the incorporation of more data allows the relationships between more variables to be measured, potentially providing new tools to fight poverty and social deprivation. So the technologies can be used in ways that could be justified in similar terms to those used for statistics: increased knowledge means a fairer distribution of social and economic goods. However, the same data and the same technologies can be used to target some areas for potential customers and to exclude areas where the odds of finding potential customers are not so great. At one university in the UK, area-level classifications were used to pinpoint areas of deprivation in order to highlight those areas where most help was needed. Marketers were quick to recognise the potential of such classifications, and they are now acknowledged as being very useful in targeting promising areas and in deciding which areas are not promising and should be excluded from offers of goods and services.

2.5 Legal Protection The Organization for Economic Cooperation and Development (OECD) published the 'Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data' in 1980. The guidelines were intended as 'minimum standards', permitting national legislators to supplement them in whatever way was thought necessary to protect privacy interests. These guidelines have influenced national data protection legislation in many countries, where the principles as well as the definitions of relevant concepts have been adopted. There are eight principles. Use Limitation indicates that identifying data should not be disclosed by the collecting organization unless such disclosure is permitted by legislation or consented to by the data subject. The Principle of Security Safeguards calls for adequate disclosure control to prevent access by parties other than those to which the data was supplied or those for whom disclosure was consented to by the data subject. The Purpose Specification Principle states that at the time of data collection the data subject should be informed of the current and future purposes intended for their data, and informed if any changes to these purposes occur. The Principle of Openness recommends transparency with regard to the processing of identifying data. This notion includes enabling verification of the type of data being processed, the policies that govern processing, and the purposes of use. The Individual Participation Principle states that individuals “should have the
right” [OECD 1980:4] to inspect data relating to themselves that is in the possession of the data processor, and to challenge the data and have it erased or rectified. Finally, the Principle of Accountability recommends that the data controller provide a basis of responsibility for compliance with the principles. There are clear problems with applying these principles to the kind of issues discussed here. Although the principles of security and purpose specification are adhered to by NSOs, national and European legislation tends to give broad exemptions to statistical data, making the other principles partially irrelevant in tackling the problem of unwanted uses of aggregated data. There are other issues with the legal approach to understanding the nature of the responsibility of data processors to data subjects. The legal approach to fulfilling responsibilities to data subjects, such as respecting their privacy and allowing them to consent or not to uses of their data, might be only loosely related to an ethical approach to the same issues. A distinction can be drawn between what is legally binding and what is morally desirable, and this distinction can be described in terms of focus [Faden & Beauchamp, 1986]. On the one hand, the legal approach tends to concentrate on the liability of the data processor. The ethical approach, on the other, would be more concerned with whether the notions of privacy and informed consent had been respected in a way that protected the autonomy of the individual data subject. In terms of the liability of the data processor, as long as personal data is not at issue and it can be claimed that there are no clearly direct and personal consequences for an individual, the duty of responsibility is considered fulfilled. This may explain the concern with disclosure control and with ensuring that individuals and attributes cannot be accurately determined.
However, the use of aggregated data clearly does throw up problems for both privacy and informed consent from a moral perspective. 2.6 Conclusions The situation as regards secondary uses of statistical data does not appear to be one that can be tackled by statistical organizations alone, but must be seen as part of a much bigger problem requiring a number of solutions. The disclosure control issue is a real problem, as it would be undesirable if data provided to statistical organizations were passed outside this context in ways that allowed individuals to be easily recognised. Once these controls are in place, however, it is difficult to see what more NSOs could do to combat uses made possible by technology and driven by motives such as profit and efficiency. What seems to be needed is not so much more control measures as a new, broader perspective on how the data are put to use and what that may mean for individuals. The governmental use of data is always related to policy making, a process that is already subject to political control and, most importantly, does not combine the
statistical data with direct identifiers of individuals. The problem with secondary uses of the statistical data in non-governmental organizations is that this combination can be made easily.

3. A Communication Perspective Data protection is usually motivated by an appeal to privacy rights. However, privacy is still not a well-understood concept. In its current form, it fails to articulate a compelling rationale for protecting statistical data. In this section, we will first consider briefly the contemporary privacy discourse, including recent semiotic approaches, and then propose a communicative perspective. 3.1 Limitations of the "right to be left alone" argument One description of the concept of privacy is "the right to be left alone and the right to be free of unreasonable personal intrusions". Informational privacy has been defined by [Agranoff, 1993] as "the claim of individuals, groups, or institutions to determine for themselves when, and to what extent, information about them is communicated to others." The first definition embodies the view that privacy is the ability to defend against intrusion on a private sphere; the definition of informational privacy reflects the issue of control that is popular in informational privacy debates. Both the general and the more specific definition have some problems. One problem with the "right to be left alone" argument is that it is rather vague and does not distinguish normal ways of human interaction from intrusive ones. Its underlying assumption seems to be that people build their lives individually, and interaction with others is seen as a hindrance at best. This is a bit simplistic. People engage voluntarily in interactions and often need the attention of others for a satisfactory life. In practice, the "right to be left alone" argument often leads authors to the standpoint that any bit of information about you is a privacy concern and that in the ideal world no information about you would be collected at all. This is not a very practical standpoint, as it fails to distinguish between reasonably innocent situations and really intrusive ones. There are also problems with the control-based paradigm of informational privacy.
Apart from practical problems (it is far from easy for individuals to exercise this control, because they are not aware of what happens with their data or because of the costs involved), the paradigm also has problems of principle. For one thing, it would mean that I should be able to control all information spread about me. Does this include opinions that others express about me? If the paradigm is taken literally, then [Posner, 1978] is right in rejecting the principle altogether (but cf. [Nissenbaum, 1998]). Apparently, some distinctions must be made, for example between confidential and non-confidential data. Another point is that the protection of privacy serves values that in some cases go beyond or even against the interest of a particular data subject
[Blok, 2001]. A good example of such a non-individualistic value is equality. When it is up to the individual, he will reveal certain personal information when it serves his interests in a given situation. A data collector could take advantage of this and ask for voluntary disclosure; in that case, the people who have an interest in keeping that information hidden are filtered out automatically. 3.2 Privacy as control over expressive information Recently, Stan Karas has argued for a new "semiotic" perspective on privacy. His case is the use of consumer data. When consumer data, for example based on internet surfing behavior, are aggregated, very detailed profiles of individuals can be made which reveal a lot about that individual's identity. As he states, "an aggregation of isolated transaction records often amount[s] to a personality profile that can be used to predict consumption patterns" [Karas, 2002]. One of the loci of privacy in consumer information is in the expressiveness of consumer behavior itself. What consumers buy is how they present themselves to the outside world. Therefore, what consumers buy is, to a large extent, who they are. Consequently, the act of choosing one product over another is an expressive one. When consumers choose to purchase products of certain brands that are identified with a certain personality trait, for instance insecurity about body image or promiscuity, a sophisticated examiner of those brands may get a blurry, but strikingly accurate, glance at private lives. What is important here is not the record of our purchases, as some privacy theorists claim, but the inferences that an examiner may make about our preference for one brand over another. The choice of brands, and not the choice whether to buy a product or not, is what is expressive, revealing, and useful to marketers.
The consumer profiles that Karas talks about are comparable to the statistical data that we discussed above, with the difference that the consumer data may link the profiles to individuals, whereas this is in principle not possible with the statistical data. However, in both cases it becomes possible to make quite precise inferences about personal identities. In the "semiotic" approach of Karas, it is not sufficient to concentrate on data protection as a means towards privacy protection. As he says, the records themselves are not important. The question is whether someone builds a picture out of the data that reveals the expressed identities. This depends on the amount of data available and their aggregation. A possible weakness is that Karas does not offer clear criteria for deciding on that. Nevertheless, we think his point is very relevant to the privacy problem of consumer data and statistical data.

According to [Goffman, 1956], people generally adopt "fronts" depending on the circumstances, and social life is not unlike a series of performance stages for the benefit of others. People continuously express themselves in these roles or personas. In the post-modern world, the consumer roles seem to be most prominent. As Jean Baudrillard observed, "today what we are experiencing is the absorption of all modes of expression into that of advertising" [Baudrillard, 1994]. Products appeal through the figurative meaning they express. This situation is gratefully exploited by business and the marketers [Sternberg, 1999]. It is also exploited by consumers themselves. Pierre Bourdieu argued that individuals use consumption to construct social status and identity through distinctions in taste, so that the choice of a particular group of products "classifies the classifier" [Bourdieu, 1984]. Alison Lurie, in her book "The Language of Clothes" [Lurie, 1981], argues that a person's choices in clothes give valuable clues to his or her "occupation, origin, personality, opinions, taste and sexual desires". The choices of color schemes (red: intense emotions; blue: ease and trustworthiness; etc.), style and fabric are as telling as one's brand preferences. 3.3 A communication perspective In our opinion, many privacy issues can be handled better when we take a communication perspective rather than a perspective focused on isolated individuals. A good starting point is the communicative action theory of Habermas [Habermas, 1981; Outhwaite, 1996]. In this view, communication is a form of joint action. It is an effective means of coordination because it aims at building a shared understanding of a situation. Since it is oriented towards shared understanding, it should allow participants to challenge any claim made.
This can be a truth claim, as in an assertion, but it can also be a claim to rightness, as in a request, or a sincerity claim, when the subject reveals something of himself (the expressive). Interestingly, the communicative setting presupposes certain basic rules, even if in practice we sometimes see these rules violated. The basic rule is that participants must respect each other as communication subjects. It means that you must be able to give valid reasons for the communicative actions that you perform; gathering or using information about another person, for example, is valid only when you are able to give convincing reasons for it. Sincerity demands that you are accountable for using the information only for these reasons; otherwise, your communication would have been manipulative, and manipulation is contrary to the basic principle of communication. Discourse ethics also implies that within the communicative setting one is not using the other as an object. The communication perspective can be used to explain the problem identified in the previous section, that is, the problem of focused attention towards the expressive information of consumers. Focused attention on the expressed self, without being engaged and being willing to engage in communication, violates the rule that
within a communicative setting the other must be treated as a communicative subject, and not as an object. This also holds when the person is not aware of the attention at all, as in the panoptical situation described by Bentham. According to [Johnson, 2001], all privacy concerns can be brought together in a model that is based on respect for the other as a person, realized as immunity from the focused attention of others. What we would like to add is that this respect for the other as a person may very well be grounded in a communicative ethic. In this way, it may also be possible to back up the distinction that Johnson makes between appropriate focused attention within genuine social relations and excessive attention that precludes freedom and autonomy, as the use of words like "appropriate" and "excessive" begs the question why they are appropriate or excessive. There are more ethical consequences that can be drawn from the communication situation. One is the principle of non-repudiation, an important principle in e-commerce. Confidentiality is another important consequence. It has to do with respecting the boundaries of the discourse setting. If X communicates something to Y in the context of discourse setting D, then this communication should be kept within context D. An eavesdropper Z violates the communication boundary: he enters D without introducing himself and without being allowed in. If Y forwarded the information to Z, Y would also violate the boundaries of D. What level of confidentiality can be expected depends very much on the situation, the kind of data in question and the cultural background. There is no universal rule for that. What should be universal is the communicative right of each subject to talk about it: X should be able to challenge Y when he breaches confidentiality, and ask for valid reasons. If Y is able to give these reasons, then there is no problem.
Put differently, Y is accountable for breaching confidentiality only in cases where he cannot justify it towards X. Of course, there must be a framework on the basis of which a challenge or a justification can be made. In traditional society, this framework was largely implicit. In modern global society, this is no longer feasible; at the least, the framework should be made explicit. An advantage of the communication perspective is that it avoids the pitfalls of both the traditional "right to be left alone" argument and the control paradigm. People do not necessarily have a right to be left alone, but they have a right to be respected as communicative subjects. People do not have a right to control everything somebody else communicates about them, but they can expect some level of confidentiality and the possibility to engage in a rational discourse when confidentiality seems to be breached. Drawing on the theory of Habermas, Judith Perrolle [Perrolle, 2002] argued for a form of "negotiated privacy". In her study of privacy issues in group work, she concluded that attention should be given not only to the (dangers of) surveillance by the formal organization, but also to the possibility of people interacting with each other in groups to mark out their own privacy, in an open communication atmosphere.
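The confidentiality rule discussed above can be sketched in code: a statement made within discourse setting D stays within D unless the forwarding party records a justification that the original speaker could later challenge. The names, fields and the justification mechanism below are invented for illustration; this is a sketch of the idea, not a prescribed design.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Statement:
    speaker: str                      # X, who communicated the content
    content: str
    setting: str                      # the discourse setting D
    justifications: list = field(default_factory=list)

def forward(stmt: Statement, holder: str, target_setting: str,
            reason: Optional[str] = None) -> bool:
    """Y (the holder) may pass the statement on within D unconditionally;
    crossing the boundary of D without a recorded reason is refused."""
    if target_setting == stmt.setting:
        return True
    if reason is None:
        return False                  # boundary crossing without valid reasons
    # The justification is recorded so that X can challenge it later.
    stmt.justifications.append((holder, target_setting, reason))
    return True

s = Statement(speaker="X", content="medical history", setting="doctor-patient")
assert forward(s, "Y", "doctor-patient")      # within the setting: fine
assert not forward(s, "Y", "insurance")       # no reasons given: refused
assert forward(s, "Y", "insurance",
               reason="billing required by the treatment contract")
```

The recorded justifications are the explicit framework the text calls for: they make a breach of confidentiality something the data subject can challenge rather than something that passes unnoticed.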

3.4 Informed consent At the heart of liberal arguments concerning the individual's claims to some degree of self-governance is the notion of autonomy. This grounds the idea of individual privacy and the doctrine of informed consent: "These are matters of the protection of self-determination" [Faden & Beauchamp, 1986:40]. In our opinion, the communication perspective can give more substance to the principle of "informed consent" in the current information climate. Within LAP, this principle was investigated by [Verheggen & Widdershoven, 1996]. Their paper discusses the problem of informed consent in hospitals. This is usually seen as a two-stage process in which the doctor gives information and the patient makes a decision. According to the authors, this picture is too simple. Rather, informed consent should be seen as a communicative process of deliberation about a proposal for treatment or a request to participate in a trial: "The communication process should be structured in such a way that motivation and agreement are enhanced. Emphasis should be both on facts and norms and values". In the same vein, we have argued for a communicative approach to data protection, which goes beyond the recognition of "rights" and the duty to "inform". Note that such a communicative approach is hard to combine with Lessig's suggestion to delegate privacy negotiations to smart agents [Lessig, 1999]. A similar conclusion has been expressed by Bernhard Debatin [Debatin, 2001]. He argues that "instead of offering a mere opt-out choice, all data collecting bodies should be obliged to offer an opt-in choice as their default setting. At the same time, they must make sufficiently clear to users what they agree to when they opt in". He refers to the principle of informed consent: the intended and possible uses by the collecting body and third parties must be clearly and comprehensibly specified.
Debatin adds that this approach can only become successful when there is sufficient pressure from the public discourse. The issue should not be understood as a problem of the individual whose privacy has been infringed upon.

3.5 Discourse as sphere A discourse boundary can be viewed as a "sphere". The notion of a sphere of thought and action over which one has dominion is one of the major distinguishing traits of liberalism, as opposed to totalitarianism for example. In liberal democratic discourse the individual's private sphere is usually seen as a basic component. From a communication perspective, it is not so much the individual's private sphere as the discourse settings in which individuals are involved that are seen as basic components. Schoeman writes, "Part of what people care about when others know about them is that these things are to be understood in a certain light, or with a particular kind of appreciation for the meaning these have for the subject" [Schoeman, 1984:11]. In our words: what
people care about is that information about them is understood within the context of the discourse setting, its hermeneutic horizon (cf. [Nissenbaum, 1998]). Schoeman talks about social spheres, such as the commercial sphere, the religious sphere, the medical sphere, etc. We would like to extend this notion to any particular discourse setting with its own hermeneutic horizon. Note that such a sphere also brings with it its own set of shared norms, and in some sense is constituted by the force of these norms [Stamper, 2000]. Discourse boundaries should be respected. When X talks to Y as a representative of company Z, then Z is included in the discourse, and Z is free to store and distribute these data internally, but not externally. In the case of big organizations or government agencies, it is not always clear where the boundaries are. We suggest that it should become part of Information System design to indicate the spheres in which an IS works, and when an IS crosses different spheres (e.g. communication between the company and the tax agency, or between the company and other companies). An important question in e-government is whether it is possible to indicate a priori some minimal sphere boundaries within the governmental domain. 3.6 Privacy in Public "Privacy in public" is a phrase coined by Nissenbaum to advance the notion that while an individual may have willingly disclosed personal information in one context, it should not be assumed that she thereby gave permission for it to be used in any subsequent context. Nissenbaum develops the notion of contextual integrity, which she argues must be recognised in order to protect individual privacy in the information age. She maintains that although individuals may have willingly disclosed information, this does not mean that they ought thereby to lose all control over its future use and dissemination.
Another important point is that even fairly innocuous information about a person can become problematic if it can be combined with other information and accessed very easily [Nissenbaum, 1998]. The control approach to informational privacy does not adequately capture the power discrepancies between large organisations and individuals. From the individual control perspective it is argued that in a free market individuals can choose whether or not to do business with one company or another depending on their privacy preferences. As we mentioned above, however, the costs to the individual are often prohibitive, as they involve being aware of the consequences of giving data in a particular setting, including the possible outcomes of inferences. Without information about how data will be used, the choice the individual makes is arguably not autonomous. As Nissenbaum states: "they have neither implicitly or explicitly agreed to others collecting information and selling it to third, fourth etc. parties so that their data may be warehoused and mined and assembled, so that their behaviour may be modelled and manipulated" [ibid: 595].


For this reason Nissenbaum calls not only for the individual to have control over her information, but also for an underlying framework that protects data after they have been disclosed. Such a framework can then be appealed to by an individual in the communicative setting in order to negotiate privacy in a more realistic way than the direct control paradigm allows.

3.7 Privacy and discrimination

Sometimes, publishing profiles is deemed wrong because it may lead to violations of the principle of equality (e.g. [Lessig, 1999]). [Pierik, 2001] argues that publishing data about the job productivity of men and women could lead to job discrimination. However, this is a separate issue. If we think that men and women must be treated equally in the labor market, then this holds irrespective of whether profiles would suggest that women are less productive. The point is not that this information must be suppressed; the principle of equality does not state that men and women are equal in all respects, but that they should be treated equally. The principle of equality bears on our behavior, not on facts. For that reason, we do not agree with [Gandy, 2002], who discusses the privacy issues related to data mining and motivates his concern by pointing to the potential discriminating effects. In that case, the problem is better framed as a discrimination issue. Discriminating effects (actions such as "redlining") should be judged on the basis of anti-discrimination law. Creating or publishing profiles should not be judged on that basis, but on privacy principles. Particularly relevant here is the principle of communicative respect.

Gandy recalls the classic case of Claire Cherry, a White woman in Georgia, who claimed that she had been the victim of discrimination because Amoco denied her a gasoline credit card. The denial decision was based on a complicated scoring system that included information about the zip code. It is said that it was hard for Ms. Cherry to indicate the impact of that zip code data on the final score. In our perspective, this is not Ms. Cherry's responsibility either. It is the company that has the communicative obligation to be able to explain its decision, and to make clear that this decision did not violate anti-discrimination law.

Karas (ibid) defined privacy as control over information that is expressive of one's identity. By "identity", he means information that distinguishes an individual and is expressive. Note the two related concepts in this definition. Distinguishing has to do with differences. We generally consider information about one's homosexuality as private, because it distinguishes an individual from the "norm" of heterosexuality. Someone who is seen as different is at risk of being discriminated against. But distinguishing information is not necessarily private information: to be private, it should also be expressive. As we said, expressive information should be protected on the basis of communicative respect. Therefore, information control with respect to expressive information and anti-discrimination
law are two complementary means to protect privacy; sometimes they concur, sometimes one of the two is relevant.
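The communicative obligation to explain a decision, as in the Cherry case, is easy to meet for an additive scoring model: each factor's contribution to the score can be reported alongside the outcome. A minimal sketch (the weights and feature names are invented for illustration; real credit-scoring systems are more complex):

```python
# Hypothetical weights of an additive credit-scoring model.
WEIGHTS = {"income": 0.5, "payment_history": 0.4, "zip_code": 0.1}

def score_with_explanation(features):
    """Return the total score together with each feature's
    contribution, so the decision can be explained to the applicant."""
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    return sum(contributions.values()), contributions

total, parts = score_with_explanation(
    {"income": 60, "payment_history": 80, "zip_code": 20})
for name, part in sorted(parts.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {part:.1f}")   # contributions, largest first
print(f"total: {total:.1f}")       # prints "total: 64.0"
```

With such a breakdown, the applicant can see directly how much the zip code weighed in the final score, and the company can demonstrate that the decision did not hinge on a legally suspect factor.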

4. CONCLUSIONS

The communication perspective suggested in this paper offers two good reasons why statistical data should be handled with care. One is the issue of communicative respect, which excludes focused attention towards expressive information. The other is the issue of respecting the boundaries of discourse spheres. The practical implications are in line with the well-known OECD principles, although we think that more attention must be given to informed consent and to the determination of the boundaries of the discourse spheres (or what the OECD calls "party"), particularly in the area of e-government. Also, the OECD principles fail to recognize the problem of focused attention towards expressive information.

REFERENCES

[Agranoff, 1993] Agranoff, P., Controlling the threat to individual privacy. Journal of Information Systems Management, Summer 1993.
[Baudrillard, 1994] Baudrillard, J., Simulacra and Simulation, 1994.
[Blok, 2001] Blok, P., The limits of Informational Self-Determination. In: Vedder, A. (ed.), Ethics and the Internet. Intersentia, Antwerpen, 2001.
[Bourdieu, 1984] Bourdieu, P., Distinction: A Social Critique of the Judgement of Taste, 1984.
[Debatin, 2002] Debatin, B., From public/private to public privacy: a critical perspective on the infosphere. Proc. DIAC-02 Symposium, Seattle, May 2002. http://www.cpsr.org/conferences/diac02/
[Eisenberg, 1999] Eisenberg, E., The Economy of Icons: How Business Manufactures Meaning, 1999.
[Elliot, 1996] Elliot, M., Attacks on census confidentiality using the sample of anonymised records: an analysis draft. Presentation at the 3rd Int. Seminar on Statistical Confidentiality, Bled, Slovenia, October 1996.
[Faden and Beauchamp, 1986] Faden, R.R., and Beauchamp, T.L., in collaboration with King, N.P., A History and Theory of Informed Consent. Oxford University Press, 1986.
[Gandy, 2002] Gandy, O., Data mining and surveillance in the post-9/11 environment. Presentation to the Political Economy Section, IAMCR, Barcelona, July 2002.
[Goffman, 1956] Goffman, E., The Presentation of Self in Everyday Life, 1956.
[Habermas, 1981] Habermas, J., Theorie des kommunikativen Handelns (2 volumes). Suhrkamp, Frankfurt, 1981.
[Heeney & Weigand, 2003] Heeney, C., Weigand, H., Data protection from a communicative perspective – the case of statistical data. Proc. IASTED Conf. on E-society, Lisbon, June 2003.
[Johnson, 2001] Johnson, J., Immunity from the illegitimate focused attention of others: an explanation of our thinking and talking about privacy. In: Vedder, A. (ed.), Ethics and the Internet. Intersentia, Antwerpen, 2001.
[Karas, 2002a] Karas, S., Enhancing the privacy discourse: consumer information gathering as surveillance. Journal of Technology, Law & Policy, Vol. 7, Issue 1, Spring 2002.
[Karas, 2002b] Karas, S., Privacy, Identity, Databases: Toward a New Conception of the Consumer Privacy Discourse. Stanford Technology Law Review (working paper), 2002.
[Laurie, 1981] Laurie, A., The Language of Clothes, 1981.
[Lessig, 1999] Lessig, L., Code and Other Laws of Cyberspace. Basic Books, New York, 1999.
[Nissenbaum, 1998] Nissenbaum, H., Protecting Privacy in an Information Age: The Problem of Privacy in Public. Law and Philosophy, Vol. 17, 1998, pp. 559–596.
[OECD, 1980] OECD, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data: Recommendations of the Council Concerning Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, 1980. Available at http://www.oecd.org/
[Outhwaite, 1996] Outhwaite, W. (ed.), The Habermas Reader. Polity Press, Cambridge UK, 1996.
[Perrolle, 1995] Perrolle, J.A., Surveillance and Privacy in Computer Supported Cooperative Work. In: Lyon, D., Zureik, E. (eds.), New Technology, Surveillance and Social Control. Univ. of Minnesota Press, 1995.
[Pierik, 2001] Pierik, R., Group profiles, Equality, and the Power of Numbers. In: Vedder, A. (ed.), Ethics and the Internet. Intersentia, Antwerpen, 2001.
[Posner, 1978] Posner, R., The Right of Privacy. 12 Georgia Law Review 393, 408, 1978.
[Schoeman, 1984] Schoeman, F.D. (ed.), Philosophical Dimensions of Privacy: An Anthology. Cambridge University Press, Cambridge UK, 1984.
[Sleight, 1993] Sleight, P., Targeting Customers: How to Use Geodemographic and Lifestyle Data in Your Business. NTC Publications, 1993.
[Stamper, 2000] Stamper, R., New Directions for Systems Analysis and Design. In: Filipe, J. (ed.), Enterprise Information Systems. Kluwer Academic Publishers, London, pp. 14-39, 2000.
[Verheggen & Widdershoven, 1996] Verheggen, F., Widdershoven, G., Informed consent: implementing shared decision-making in health care. Proc. LAP'96 – Int. Workshop on Communication Modelling, Tilburg University, 1996.
