Proceedings of the 37th Hawaii International Conference on System Sciences - 2004

Evaluating Personalization and Customization from an Ethical Point of View: An Empirical Study

Horst Treiblmaier, Maria Madlberger, Nicolas Knotzer, Irene Pollach
Vienna University of Economics and Business Administration, Austria
{horst.treiblmaier, maria.madlberger, nicolas.knotzer, irene.pollach}@wu-wien.ac.at

Abstract

This paper examines whether classic ethical theories can solve the ethical dilemmas associated with user-controlled customization and system-driven personalization of Web sites. Based on the notion that data sensitivity is not a universal concept but comes in different levels of intensity, we conducted an Internet-based survey among consumers to determine their level of data sensitivity and their attitudes towards personalization and customization. Our results have shown that users can be classified into different groups who differ significantly in terms of data sensitivity. Applying ethical theories to personalization and customization has led to conflicting conclusions, but they are in line with the findings from the survey, suggesting that customization is ethically less questionable than personalization.
1. Introduction

In this paper we investigate whether classic ethical theories can solve the ethical dilemmas associated with the customization and personalization of Web sites with respect to user privacy. The terms customization and personalization are often used interchangeably in both academic and non-academic literature. The resulting ambiguities arise from different views on personalization and customization. Table 1 gives an overview of different notions of personalization and customization found in academic literature.
Table 1: Personalization and customization in academic literature

Source | Used term(s) | Context | Meaning/Application
Fink et al., 2002 [1] | Personalization | Marketing, Communication | One-to-one relationships with customers; direct access to personally relevant news, seamlessly integrating user preferences into the existing infrastructure, collecting information about user interests
Chiasson et al., 2002 [2] | Personalization | Information Systems (IS), Human-Computer Interaction (HCI) | Personalization of information in order to customize interactions with end-users and reduce interaction complexity
Kalyanam and McIntyre, 2002 [3] | Personalization | Marketing | One of the instruments of the e-marketing mix, aspect of segmentation
Billsus et al., 2002 [4] | Personalization | IS | Personalization as a result of adaptive technologies
Ardissono et al., 2002 [5] | Personalization | IS | One-to-one recommendation of products
Ansari and Mela, 2003 [6] | Customization | Marketing | Customization of communications by means of clickstream data
Sundbo, 2002 [7] | Customization | Services | Customization of services as the opposite of service standardization
Lampel and Mintzberg, 1996 [8] | Customization | |
Nielsen, 1998 [9] | Personalization, Customization | IS, HCI | Customization: under direct user control; the user explicitly selects between certain options. Personalization: driven by computers which try to serve individualized pages to users based on some kind of model of their individual needs
Zahay and Griffin, 2003 [10] | Customization, Personalization | Marketing, B2B sector | Customization activity: use information from the value-added chain to create products for individual customers; Personalization capability: respond to customers by taking into account their individual responses to communication
Coener, 2003 [11] | Customization, Personalization | Services | Customization: Web site users can actively dictate the information on the site, match of categorized content to profiled users; Personalization: more passive role, content is filtered for users
Hirsh, Basu and Davison, 2000 [12] | Customization, Personalization | IS | Personalization as "self-customizing" software, systems are automatically customized to the personal characteristics of the user

In this paper the information systems (IS) perspective towards personalization and customization seems to be the most appropriate since we are investigating personalized Web sites. In the context of information systems both terms can be defined as modifications of the functionality, interface, information content or distinctiveness of an information system with a view to increasing its personal relevance to an individual user [13]. The difference between personalization and customization lies in the control of the adaptation process [9] [14]. Customization is a user-initiated and user-driven process. It uses adaptable system components which users can tailor to their specific needs. Adaptable systems use static profiles, which may be changed by the user, such as the Web portal DailyRoutine (www.dailyroutine.com), which enables users to adapt the content and layout to their preferences. By contrast, personalization is system-initiated, system-driven and requires adaptive components. In order to make modifications appropriate to the needs of the individual, both approaches require detailed information about the user. Personalization, however, additionally requires the system to monitor user behavior in order to adapt automatically, and users are thus unable to control how the system adjusts to their behavior. Personalized systems employ user profiles, which are changed dynamically by the system. Amazon (http://www.amazon.com), for example, monitors buying behavior and clickstream data of customers in order to suggest products which may be of interest to the customer.

From an ethical perspective, the distinction between personalization and customization is of major importance, as both personalization and customization raise privacy concerns, yet to varying extents. Personalization raises more serious ethical concerns than mere customization, as the latter requires users to explicitly control the adaptation process. Personalization, by contrast, tracks user behavior on Web sites, which conflicts more strongly with the users' right
to data privacy and security [15]. Although online vendors which track user activity might exploit this information for the sole purpose of increasing the system's convenience for users, without being aware of the ethical dimension of this practice, they may just as well misuse the data to harass users with personalized advertising material or pass this information on to third parties [16].

The ethical implications of personalization have already been the focus of academic study (cf. [17] [18] [19]). Sheehan and Hoy surveyed a consumer sample about their online privacy concerns and found that consumers consider unsolicited e-mails from companies a minor privacy issue if they have done business with this company before, and that they are more willing to divulge personal information if they are provided with something in return. However, their study averages all findings and fails to acknowledge that data sensitivity is not a universal concept but comes in different levels of intensity [20]. Culnan found that consumers differ in their attitudes towards the secondary use of personal information in direct marketing and thus perceive this invasion of privacy with different levels of intensity [21].

This paper explores whether Internet users can be classified into different groups according to their data sensitivity on the Web. It also examines whether classic ethical theories can help to solve the ethical dilemmas associated with personalization and customization, based on the hypothesis that ethical dilemmas might not be experienced with the same intensity by all users due to their potentially differing levels of data sensitivity.
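To make the distinction between user-driven (adaptable) and system-driven (adaptive) profiles drawn above more tangible, the following minimal Python sketch contrasts the two update mechanisms. It is purely illustrative and not taken from any of the systems cited above; all class and method names are hypothetical.

```python
# Illustrative sketch (hypothetical names): an adaptable profile changes only through
# explicit user actions, whereas an adaptive profile is updated by the system from
# observed behavior such as clickstream events.
from collections import Counter

class AdaptableProfile:
    """Customization: the user explicitly sets static preferences."""
    def __init__(self):
        self.preferences = {}

    def set_preference(self, key, value):
        # Called only when the user actively changes a setting.
        self.preferences[key] = value

class AdaptiveProfile:
    """Personalization: the system infers preferences from monitored behavior."""
    def __init__(self):
        self.click_counts = Counter()

    def observe_click(self, category):
        # Called automatically for every tracked page view.
        self.click_counts[category] += 1

    def recommend(self, n=3):
        # Suggest the categories the user has visited most often.
        return [c for c, _ in self.click_counts.most_common(n)]

if __name__ == "__main__":
    user = AdaptableProfile()
    user.set_preference("layout", "two-column")        # user-driven change

    system = AdaptiveProfile()
    for page in ["books", "books", "music", "books"]:
        system.observe_click(page)                      # system-driven tracking
    print(user.preferences, system.recommend())
```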
2. Background

Normative ethical theories seek to arrive at conclusions about whether an action is morally good or bad rather than merely describe the ethical dilemma associated with an action [22]. However, the moral
intensity attributed to the potential privacy invasion varies widely across normative ethical theories, which may lead to divergent views of the ethicality of customization and personalization. This section outlines the main foci of the most important normative ethical theories and juxtaposes them with relevant aspects of stakeholder theory, all of which may be helpful in determining whether personalization and customization are ethically problematic.
2.1. Normative ethical theories

Normative ethical theories can be classified into (1) deontological theories, which evaluate the act as such, (2) teleological theories, which take into account the consequences of an act, and (3) virtue ethics, which concentrates on the agent's character [23].

Deontologism focuses on the rightness or wrongness of an act, irrespective of whether the consequences of the act are morally good or bad [24]. Deontological theories are based on rights and duties and have universal maxims [25]. An example of a deontological theory is Kant's categorical imperative, according to which people should always be treated with dignity, and never be used as mere instruments to achieve one's own goals [26]. The two major criteria incorporated in the categorical imperative are universality and reversibility. Thus, an action is considered morally right if the agent would also want all other people to act the same way in a similar situation [22]. The downside of this approach is that it focuses solely on duties and the intrinsic values of actions, while it completely ignores the consequences of actions for others [26]. When applying deontologism to issues of customization and personalization, only the process is analyzed while the results — which may or may not benefit the user — are ignored.

In his classic paper "Four Ethical Issues of the Information Age", Richard Mason developed the PAPA (Privacy, Accuracy, Property, and Accessibility) model, which is rooted in the deontological tradition. Similar to Kant, Mason postulates the universal maxim that information technology and the information it handles must be used to enhance the dignity of mankind. Privacy, accuracy, property, and accessibility are the key factors to consider when designing information systems. Mason further holds that the main questions concerning privacy are to what extent a person must reveal personal information to others, under what circumstances, and how preventive measures can be taken to guarantee privacy. Further problems in connection with accuracy are how to ensure authenticity, fidelity and accuracy of information, and how to determine who is responsible
for information accuracy. Concerning property, it has to be clarified who owns the information and what prices are fair for the information divulged. In this context it is also of central interest who should own the channels for transmitting information. The last relevant field is accessibility; the fundamental questions are who should have access to information and under what conditions [27]. These four issues of the "Information Age" are of high relevance especially for the design of personalized and customized information systems, where a lot of confidential information is stored, processed and used to create personal profiles.

Teleological theories, by contrast, focus primarily on the consequences, results, goals and purposes of actions [28]. Social contract theory is one such teleological normative theory, stating that every exchange should be based on reciprocity and equality [29]. Utilitarianism, also referred to as consequentialism, is another teleological theory, evaluating the morality of an action in terms of costs and benefits to society [23]. If the outcome of an action is a surplus of benefits over costs, utilitarianism endorses the goodness of this action [28]. The downside of utilitarianism is that it seeks to produce the greatest good for the greatest number, favoring the interests of the majority, while a minority may suffer from the consequences of the action [26]. When teleological theories are applied to issues of customization and personalization, the overall outcome is just one single criterion, whereas the process is rather irrelevant. This has interesting implications for the use of personal data, especially when the expected outcome exceeds the sum of all inputs.

Virtue ethics focuses on the traits of an ethical subject, e.g. character, motivation or intention. It investigates the agent-intrinsic reasons for responding one way or another to external or internal forces [23]. Virtues are understood as "an acquired disposition that is valued as part of the character of a morally good human being" [22]. Examples of virtues include honesty, unselfishness, or fairness. The famous virtue ethicist Aristotle claimed that all virtues are at the center of a continuum between deficiency and excess. He stressed that the uppermost goal of all human behavior should be to find a balance between the two extremes of the continuum by striving for the middle ground, which he referred to as the "golden mean" [23]. Virtue theory argues that an action is morally good if a morally virtuous person would exhibit this behavior habitually. Also, actions that make a person more honest, more caring or more responsible are morally good, while actions that achieve the opposite are morally bad. This theory stresses the nature of the organization rather than its goals or processes, which means that the usage of personal information by
"responsible organizations" could serve as a kind of benchmark for the whole industry.
2.2. Stakeholder theory

When examining the ethical aspects of personalization and customization, stakeholder theory appears to be useful as well. In this context stakeholder theory focuses on (1) the stakeholders, (2) their perspectives concerning privacy issues, (3) their interests underlying these perspectives, and (4) the values affecting their attitudes [30]. Although stakeholder theory is not a purely ethical theory, it is still useful in resolving ethical dilemmas and conflicts. A careful analysis of a company's stakeholders may help decision makers to see all ethical implications of their business conduct and may prevent unintended ramifications for a company's stakeholders [31].

For privacy claims it is important that the following three principles be taken into consideration [32]: First, the access principle holds that stakeholders should have access only to the information which is necessary in a given situation. Second, the representation principle refers to the presence of all relevant stakeholders, i.e. the stakeholders should have the possibility to communicate their values and interests. Finally, the power principle states that all stakeholders should have equal power to protect their interests. If one of these principles is violated, the risk of stakeholder claims being unacknowledged or underrepresented will rise. Obviously, there are no situations where all three principles are abided by, but taking these principles into consideration is an important step towards the understanding, identification, and analysis of privacy-related problems.

Caudill and Murphy hold that ethical conflicts among stakeholder groups can be resolved by requiring each group to trade off certain rights to other groups and by ensuring that these trade-offs are dealt with in a fair and just manner, balancing each stakeholder group's benefits and harms [29]. This smacks of utilitarianism, but differs substantially from utilitarianism in that it considers individual stakeholder groups rather than society as a whole. Due to this individualistic and somewhat pragmatic approach, we considered stakeholder theory quite suitable for assessing ethical issues pertaining to privacy and data sensitivity.
3. Research method

Our research concentrates on determining which ethical theory best fits the ethical issues arising from personalization and customization. After briefly
describing the methods used, we discuss the empirical data and then consider the ethical implications of our findings (Section 5) by pointing out the pros and cons of each theory.

With personalization and customization being increasingly used on Web sites, we decided to conduct an online survey among Internet users to determine their attitudes towards adaptive and adaptable systems. The geographic scope of this investigation is Austria, where Internet users still differ demographically from the total population. Only 50% of Austrians older than 14 years use the Internet. Strikingly, 87% of Austrians between 14 and 19 years do so, but only 10% of the 60+ population. Overall, 62% of men but only 40% of women use the Internet in Austria [33].

Clearly, online surveys can never achieve representativeness unless the survey intends to study only Internet users, which is the case in our study. Another problem of online surveys is the self-selection of respondents, which may result in a bias, as Internet users decide themselves whether or not to fill out the questionnaire. Thus, users who are highly involved with the topic are more likely to complete the survey than uninvolved users. Furthermore, Internet users who navigate the Internet intensively may be over-represented in the sample, as they might participate in surveys more often than others. For our study, highly involved and intensive Internet users were of special interest, since this survey covers the sensitive topics of privacy and data protection. Therefore, we conducted an Internet-based survey, mindful of the above-mentioned limitations and drawbacks of online studies.

First, a pretest was conducted in order to ensure the understandability and usability of the questionnaire. Ten persons of different age groups with varying levels of computer experience were asked to fill out the survey. Subsequently, the questionnaire was adapted and finally posted on the Internet in March 2003. To obtain a satisfactory sample size, the questionnaire was linked to several online forums popular in Austria. At the end of the inquiry period, 250 completed forms had been collected. The questionnaire contained a total of 42 questions pertaining to demographics, Internet use, data sensitivity, personalization and customization.

Table 2 contains a demographic overview of the sample. The sample turned out to be representative of Austrian Internet users with respect to sex and profession, but was slightly biased in terms of education and frequency of Internet use due to the above-mentioned biases inherent in online surveys.
Table 2: Characteristics of Respondents

Sex: Male 52.8%; Female 47.2%
Age: 14–19 years 5.2%; 20–29 years 44.0%; 30–39 years 24.8%; 40–49 years 16.4%; 50+ 9.6%
Education: Secondary school 6.8%; Apprenticeship/Vocational school 22.0%; High school graduate 40.0%; Technical college 8.0%; University 23.2%
Occupation: Management 14.4%; Administrative/technical 42.0%; Self-employed 12.4%; House-wife or husband 1.6%; Retired 2.0%; Student 22.0%; Other 5.6%
Experience on the Internet: Less than 6 months 2.8%; 6–12 months 2.0%; 1–2 years 9.6%; 2–4 years 29.2%; More than 4 years 56.4%
Frequency of Internet use: Daily 89.6%; Several times per week 9.2%; Several times per month 1.2%; Less than once a month 0.0%

To study the sample's levels of data sensitivity, we transformed those questions referring to data sensitivity into 10 items with three different values each. These items are used to measure the evaluative dimension of data sensitivity, which is appropriate when conducting a survey over the Internet. The items pertain to data collection and data sharing, and their three values represent approval, indifference or disapproval.
Table 3 summarizes the variables for data sensitivity and their three possible answers. In order to ensure reliability we used the Kuder-Richardson formula 20 for dichotomous variables (α = .76). An expert rating was used to establish content validity, i.e. whether the items represent all situations we sought to measure. Based on this rating the following variables were chosen:
Table 3: Indicators of Data Sensitivity

Variable | Description | Answer categories
VAR20 | Storage of personal data by online shops to avoid re-entering data in the future | negative / indifferent / positive
VAR26 | Use of cookies by online shops | negative / indifferent / positive
VAR27 | Web sites storing data of their visitors | negative / indifferent / positive
VAR28 | Fears of "big brother" in the context of the Internet | justified / indifferent / not justified
VAR30 | Companies gathering navigational data without the user's knowledge | objectionable / indifferent / not objectionable
VAR31 | Receiving newsletters or other messages from Web sites at which the user is registered | annoying / indifferent / not annoying
VAR36 | Readiness to divulge personal data when a personal advantage is offered and the data are treated confidentially | no / indifferent / yes
VAR37 | Storage of personal data in order to facilitate the finding of products and information | negative / indifferent / positive
VAR38 | Positive experience with an online shop enhances readiness to divulge personal data | no / indifferent / yes
VAR40 | Sharing of personal data with third parties so that users receive potentially interesting advertisements | negative / indifferent / positive
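For reference, the Kuder-Richardson formula 20 used above for the reliability estimate (α = .76) can be written for k dichotomously scored items as

$$\mathrm{KR\text{-}20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i\,(1 - p_i)}{\sigma_X^{2}}\right)$$

where $p_i$ is the proportion of respondents answering item $i$ in the keyed direction and $\sigma_X^{2}$ is the variance of the total scores. Note that KR-20 presupposes a dichotomous coding; how the three-valued indicators above were dichotomized for this check is not detailed in the text, so this formula is given only as a reminder of the statistic itself.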
4. Empirical Investigation

We have conducted a cluster analysis in order to identify different groups of Internet users with respect to their levels of data sensitivity. As can be seen from Table 3, we used three answer categories that were considered to be equally spaced. Our analysis has shown that these variables are multi-dimensionally heterogeneous, and so a cluster analysis can be conducted. A single-linkage analysis has shown that the sample does not contain any outliers. Therefore, we have decided to apply the Ward method (squared Euclidean distance), which uses an analysis of variance approach to evaluate the distances between clusters. According to the elbow criterion, four clusters (groups) have been identified, which differ in their levels of data sensitivity and form the basis for our further analyses. The size of each group in absolute and relative terms is shown in Table 4.

Table 4: Size of groups

Group | Absolute | Relative
A | 108 | 43.2%
B | 68 | 27.2%
C | 39 | 15.6%
D | 35 | 14.0%
Total | 250 | 100.0%
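A minimal sketch of the clustering steps just described (single-linkage check for outliers, Ward's method, elbow criterion, four-cluster solution) is given below. It uses synthetic placeholder data and standard SciPy routines; it only illustrates the procedure and is not the authors' analysis code.

```python
# Sketch of the cluster-analysis steps on a respondents x items matrix X,
# assuming the ten data-sensitivity items are coded numerically (e.g. 0/1/2).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(250, 10)).astype(float)  # placeholder for the survey data

# Single-linkage run: isolated, very late merges would indicate outliers.
single = linkage(X, method="single")

# Ward's method (SciPy works on Euclidean distances, which enter the
# Ward objective in squared form).
ward = linkage(X, method="ward")

# "Elbow" criterion: look for a pronounced jump in the merge distances;
# a large jump when going from k to k-1 clusters suggests stopping at k.
merge_heights = ward[:, 2]
print(np.diff(merge_heights)[-10:])

# Cut the dendrogram into four groups, as in the paper.
groups = fcluster(ward, t=4, criterion="maxclust")
print(np.bincount(groups)[1:])  # cluster sizes
```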
A discriminant analysis has led to three functions that have significant discriminatory power (Wilks' Lambda = .036, Chi-squared = 805.232, df = 30, p < .001). The centroids of these clusters are shown in Figure 1. The three functions are made up of variables that lead to an ideal grouping. We decided not to name them due to their inherent variety; they could best be described as "amount of distrust" (function 1), "nescience of data collection and usage" (function 2) and "facilitation of Internet surfing" (function 3).
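The discriminant analysis and the structure matrix reported in Table 5 below could be reproduced along the following lines. This is a sketch with synthetic placeholder data rather than the authors' code; it computes total-sample correlations between items and discriminant scores as a simplification (the paper reports pooled within-groups correlations) and omits the Wilks' Lambda significance test.

```python
# Sketch of a descriptive discriminant analysis: fit linear discriminant functions
# on the cluster membership and correlate each item with the discriminant scores.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(250, 10)).astype(float)   # placeholder survey items
groups = rng.integers(1, 5, size=250)                   # placeholder cluster labels A-D

lda = LinearDiscriminantAnalysis(n_components=3).fit(X, groups)
scores = lda.transform(X)  # discriminant function scores (250 x 3)

# Structure matrix: correlation of each original variable with each function
# (total-sample correlations here, as a simplification).
structure = np.array([
    [np.corrcoef(X[:, j], scores[:, k])[0, 1] for k in range(3)]
    for j in range(X.shape[1])
])
print(np.round(structure, 3))
```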
Figure 1: Centroids of clusters (centroids of groups A, B, C and D plotted against discriminant Functions 1 and 2)
Table 5 gives the pooled within-groups correlations between the discriminating variables and the standardized canonical discriminant functions. The variables are ranked according to their absolute level of correlation with the function. (An asterisk marks the largest absolute correlation between the variable and any of the three functions.)

Table 5: Structure matrix

Variable | Function 1 | Function 2 | Function 3
VAR38 | .685* | -.384 | -.572
VAR37 | .358* | .047 | .108
VAR36 | .350* | .040 | -.058
VAR26 | .248* | .117 | .226
VAR28 | .141* | .037 | .042
VAR31 | .259 | .806* | -.373
VAR30 | .123 | .147* | .117
VAR20 | .456 | .055 | .593*
VAR27 | .294 | .181 | .304*
VAR40 | .177 | .061 | .296*
Figure 2 depicts a polarity profile, illustrating the differences between the individual groups. We used this univariate interpretation in order to highlight the characteristics of the four groups. Based on this profile, Group C can be characterized as chiefly data sensitive and Group B as rather data insensitive, whereas the profiles of Groups A and D are heterogeneous. Members of Group B have a low aversion to providing online shops with personal information, while members of Group C are very reluctant to do so, even if this would increase their level of convenience (VAR20). Interestingly, Group B ranks only second in terms of data insensitivity for VAR31 (receiving unsolicited newsletters) and VAR38 (enhanced readiness to divulge personal data because of previous positive experience). It is also striking that the values for VAR20, VAR26, and VAR27, which are questions pertaining to data collection in general, are very similar for Groups A, D and C, whereas those of Group B are markedly different. Although Group B is apparently not afraid of the collection of personal data in general, it strongly objects to unethical practices such as the unauthorized collection of data (VAR30) or the sharing of their data with third parties (VAR40), similar to Group C. However, Group B appreciates the convenience of personalization and customization (VAR20, VAR36, VAR37), while Group C does not. It is also interesting to note that Group C is not even willing to divulge personal information to online shops if it has done business successfully with the shop before (VAR38).
Figure 2: Polarity profile of the four groups (0 = high data sensitivity, 1 = low data sensitivity)

In a next step we tested whether the groups differ significantly regarding their attitudes towards customization and personalization. We used 10 different indicators, which are summarized in Table 6. The questions included in the questionnaire did not bear direct resemblance to the variables; rather, the variables were hidden in examples and paraphrases to make sure all respondents could relate to the questions and fully understood them. The comprehensibility of the questions was also tested in the pilot survey. The nominal-scale answer categories call for a chi-square test for interrelations (a minimal sketch of such a test is given after Table 6). The results reveal that all groups differ highly significantly in their answers to almost all questions (p < .01); only variable 14 showed a significance of .015.

Table 6: Indicators of Attitude towards Customization and Personalization

Attitude towards Customization
Variable | Description | Answer categories
VAR14 | Attitude towards a customized homepage | neg. / indiff. / pos.
VAR15 | Previous use of customized homepages | no / indiff. / yes
VAR17 | Attitude towards customized news | neg. / indiff. / pos.
VAR33 | Wish for more customizable Web sites | no / indiff. / yes
VAR35 | Wish to be asked for personal preferences | no / indiff. / yes

Attitude towards Personalization
Variable | Description | Answer categories
VAR21 | Attitude towards a personalized homepage | neg. / indiff. / pos.
VAR22 | Previous use of personalized homepages | no / indiff. / yes
VAR23 | Quality of personalized offers | inferior / indiff. / superior
VAR32 | Wish for more personalized offers | no / indiff. / yes
VAR34 | Wish for more adaptive Web sites | no / indiff. / yes

The results for the variables pertaining to customization have shown that Group B has the most positive attitude towards customization among the four groups, while Group C has the most negative attitude towards customization on all five variables. The substantial discrepancies between the attitudes of Group C and the attitudes of Group B are evident from the chart in Figure 3. Especially VAR33 and VAR35 clearly show that Group C does not wish for more customized Web sites in the future either. This suggests that user perceptions of customization are impacted by the users' general level of data sensitivity rather than by the quality of the customized Web sites.
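The group comparisons reported above rest on chi-square tests of independence between group membership and answer category. The following sketch shows how one such test could be run; it uses synthetic placeholder data and is an illustration only, not the authors' analysis code.

```python
# Sketch of a chi-square test of independence for one indicator:
# cross-tabulate the four groups against the three answer categories.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2)
groups = rng.integers(1, 5, size=250)     # placeholder cluster labels A-D
answers = rng.integers(0, 3, size=250)    # placeholder answers: neg./indiff./pos.

# Build the 4 x 3 contingency table of group membership against answer category.
table = np.zeros((4, 3), dtype=int)
for g, a in zip(groups, answers):
    table[g - 1, a] += 1

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```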
Figure 3: Positive attitude towards customization

Figure 4 depicts the results obtained for the variables pertaining to personalization. They are consistent with those obtained for the customization variables, but differ in that the discrepancies between the results for Groups C and B are considerably higher. Thus, Group C opposes personalization even more strongly than customization. Also, the results obtained for VAR32 and VAR34 indicate that Group C does not intend to use personalized sites in the future either. Hence, similar to customization, the quality of a personalized Web site is apparently unable to convince data-sensitive users of the convenience provided by personalization.
Figure 4: Positive attitude towards personalization
5. Implications of findings

As the results above have shown, different user groups have varying attitudes towards customization and personalization, which can be put down to the ethical issues associated with tracking user behavior and adapting Web sites to users without their consent. The ethical theories introduced in Section 2 are well-suited for determining whether personalization and customization are morally good or bad.

From a Kantian perspective, for example, the collection of consumer information without the consumers' consent can never be ethically justified, simply because the act as such would be considered ethically wrong and not because the online merchant may misuse the information collected. Deontologism would not even permit the monitoring of members of Group B, although they would not mind being monitored. Thus, deontological theories deem personalization of any kind as morally wrong, while customization as a user-initiated adaptation process is ethically acceptable because it does not impose anything on users without their prior consent.
According to social contract theory, personalized Web sites could be seen as the benefits consumers obtain for divulging personal information to companies, which the companies could use to make their offerings more appealing and increase their sales. From this perspective, personalization creates a win-win situation. However, considering that consumers differ in terms of data sensitivity, these benefits would have to come in different forms to fully compensate all consumers for the data they have traded off and to abide by the principle of equality and reciprocity. As this is hardly feasible, social contract theory would endorse customization as ethically good but would disapprove of personalization, since it cannot ensure an equal exchange.

From a utilitarian perspective, personalization in general is ethically acceptable if the potential benefits for the users, e.g. higher usability or more targeted offers, exceed the harms resulting from the potential violation of privacy which some users may experience. However, taking into account the four consumer groups identified above, personalization cannot be justified for members of Group C, for whom harms would exceed benefits, but is absolutely justified for Group B, and to a certain extent for A and D as well. Customization, in turn, would meet the requirement of benefits over costs for all four groups, as it does not impose anything on users and hence does not cause any harm.

According to virtue ethics, a balance is to be achieved between company goals (better customer relationship management) and consumer goals (data privacy). This balance is found at the center of a moral continuum. One extreme of the continuum would be for companies not to personalize Web sites at all and the other extreme would be to use all information they can possibly get hold of, maybe even without the users' consent, to personalize their Web sites as much as possible. The "virtuous" solution would be some middle ground that exhibits the virtues of honesty, responsibility and caring, e.g. full disclosure of all data collection practices, strict confidentiality concerning all data obtained from consumers, and opt-in rather than opt-out facilities. Since virtue ethics focuses exclusively on the agent, the four consumer groups identified above do not affect the conclusion reached about personalization at all. Customization, in turn, is user-driven and thus not covered by virtue ethics at all. Hence, virtue ethics is not very helpful in determining the ethicality of personalization and customization as to the four consumer groups identified above.

Stakeholder theory is well-suited to provide a starting point for resolving the ethical dilemmas encountered in personalizing and customizing Web sites. Its strength lies in the fact that it pays attention to
individual groups rather than society as a whole. Thus, the four groups of consumers identified above could be regarded as individual stakeholder groups whose interests need to be considered when making decisions in ethically ambiguous situations. According to stakeholder theory, customization would not be an ethical problem, as it does not involve any trade-offs on the part of the consumers, as opposed to personalization. To make for a fair distribution of trade-offs among stakeholder groups, these consumer groups should be offered different levels of personalization rather than a one-size-fits-all approach to personalization.
6. Conclusion

Our results have shown that there are different groups of online customers who differ significantly in terms of data sensitivity. Thus, the identification of user Groups A to D provides new insights into the implications ethical theories have for personalization and customization. The ethical theories discussed above have led to conflicting views on the ethical problems associated with adapting Web sites to user behavior. Also, the fact that users exhibit differing levels of data sensitivity influences the conclusions reached by those ethical theories focusing on aspects other than the agent. Notably, stakeholder theory appears to be a viable solution to resolve this ethical issue, although it is not a purely ethical theory. Our study supports the conclusion that customization is ethically less questionable than personalization, as it does not impose features on users but is entirely controlled by users. In practical terms, this means that companies tracking user behavior need to disclose their data-gathering practices and should offer opt-in rather than opt-out facilities to fulfill their duty of telling the truth and respecting others.
Acknowledgements

The authors wish to thank Armin Gegenbauer, who assisted in the collection of the data, and Andreas Strebinger, who helped with the statistical analysis.
7. References

[1] Fink, J., Koenemann, J., Noller, S., and Schwab, I. (2002): Putting personalization into practice, Communications of the ACM, 45 (5), 41-42.

[2] Chiasson, T., Hawkey, K., McAllister, M., and Slonim, J. (2002): An architecture in support of universal access to electronic commerce, Information and Software Technology, 44 (5), 279-289.
[3] Kalyanam, K. and McIntyre, S. (2002): The e-marketing mix: A contribution of the e-tailing wars, Journal of the Academy of Marketing Science, 30 (4), 487-499.

[4] Billsus, D., Brunk, C.A., Evans, C., Gladish, B., and Pazzani, M. (2002): Adaptive interfaces for ubiquitous Web access, Communications of the ACM, 45 (5), 34-38.

[5] Ardissono, L., Goy, A., Petrone, G., and Segnan, M. (2002): Personalization in business-to-customer interaction, Communications of the ACM, 45 (5), 52-53.

[6] Ansari, A. and Mela, C.F. (2003): E-customization, Journal of Marketing Research, 40 (2), 131-145.

[7] Sundbo, J. (2002): The service economy: Standardisation or customization? The Service Industries Journal, 22 (4), 93-116.

[8] Lampel, J. and Mintzberg, H. (1996): Customizing Customization, Sloan Management Review, 38 (1), 21-30.

[9] Nielsen, J. (1998): Personalization is Over-Rated, Alertbox, Available from http://www.useit.com/alertbox/981004.html, Accessed 2003-05-23.

[10] Zahay, D. and Griffin, A. (2003): Information antecedents of personalization and customization in business-to-business service markets, Journal of Database Marketing, 10 (3), 255-271.

[11] Coener, A. (2003): Personalization and Customization in Financial Portals, The Journal of American Academy of Business, 2 (2), 498-504.

[12] Hirsh, H., Basu, C., and Davison, B.D. (2000): Learning to Personalize: Recognizing patterns of behavior helps systems predict your next move, Communications of the ACM, 43 (8), 102-106.

[13] Blom, J. (2000): "Personalization – A taxonomy", Extended Abstracts of the CHI '00 Conference on Human Factors in Computing Systems, The Hague, Netherlands.

[14] Nunes, P. and Kambil, A. (2001): Personalization? No Thanks, Harvard Business Review, 79, 32-34.

[15] Stead, B. A. and Gilbert, J. (2001): Ethical Issues in Electronic Commerce, Journal of Business Ethics, 34 (2), 75-85.

[16] Sama, L. M. and Shoaf, V. (2002): Ethics on the Web: Applying Moral Decision-Making to the New Media, Journal of Business Ethics, 36 (1-2), 93-103.

[17] Chellappa, R.K. and Sin, R. (2002): "Personalization versus Privacy: An Empirical Examination of the Online Consumer's Dilemma", 7th Conference on Information Systems & Technology (CIST) at INFORMS 2002, San Jose, USA.

[18] Mabley, K. (2000): Privacy vs. Personalization, Available from http://www.cyberdialogue.com/library/pdfs/wp-cd-2000-privacy.pdf, Accessed 2003-05-10.
[19] Volokh, E. (2000): Personalization and Privacy, Communications of the ACM, 43 (8).

[20] Sheehan, K. B. and Hoy, M. G. (2000): Dimensions of Privacy Concerns among Online Consumers, Journal of Public Policy & Marketing, 19 (1), 62-73.

[21] Culnan, M. (1993): "How Did they Get my Name?": An Exploratory Investigation of Consumer Attitudes Toward Secondary Information Use, Management Information Systems Quarterly, 17 (3), 341-361.

[22] Velasquez, M. G. (2002): Business Ethics: Concepts and Cases, 5th ed., Upper Saddle River.

[23] Mason, R. O., Mason, F. M., and Culnan, M. J. (1995): Ethics of Information Management, Thousand Oaks: Sage.

[24] Brennan, A. (2002): Environmental Ethics, Stanford Encyclopedia of Philosophy, Available from http://plato.stanford.edu/, Accessed 2003-05-24.

[25] Floridi, L. (1999): Information Ethics: On the Philosophical Foundation of Computer Ethics, Ethics and Information Technology, 1 (1), 37-56.

[26] Buchholz, R. (1995): Business Environment and Public Policy: Implications for Management, 5th ed., Englewood Cliffs: Prentice Hall.

[27] Mason, R. O. (1986): Four Ethical Issues of the Information Age, Management Information Systems Quarterly, 10 (1), 5-12.

[28] Petrick, J. A. and Quinn, J. F. (2001): Nature and Value of Management Ethics, In: Malachowski, A. (ed.), Business Ethics: Critical Perspectives on Business and Management, London, New York, 55-75.

[29] Caudill, E. M. and Murphy, P. E. (2000): Consumer Online Privacy: Legal and Ethical Issues, Journal of Public Policy & Marketing, 19 (1), 7-19.

[30] Donaldson, T. and Preston, L. (1995): The Stakeholder Theory of the Corporation: Concepts, Evidence, and Implications, Academy of Management Review, 20 (1), 65-91.

[31] Josephson, M. (2000): Teaching Ethical Decision Making and Principled Reasoning, In: Hofmann, M. W., Frederick, R.E. and Schwartz, M. S. (eds.), Business Ethics, Boston: McGraw-Hill, 87-94.

[32] Introna, L. D. and Pouloudi, A. (1999): Privacy in the Information Age: Stakeholders, Interests and Values, Journal of Business Ethics, 22 (1), 27-38.

[33] Austrian Internet Monitor (2002): "AIM Internet-Entwicklung 4. Quartal 2002", Available from http://www.integral.co.at/dImages/AIMK%204.%20Qu.%2002.ppt, Accessed 2003-05-26.