Transparency Tools for User-Controlled Identity Management

Marit HANSEN1, Simone FISCHER-HÜBNER2, John Sören PETTERSSON2, Mike BERGMANN3

1 Unabhängiges Landeszentrum für Datenschutz, Holstenstr. 98, 24103 Kiel, Germany
Tel: +49 431 988 1214, Fax: +49 431 988 1223, Email: [email protected]
2 Karlstads Universitet, 651 88 Karlstad, Sweden
Tel: +46 54 700 1723/2553, Fax: +46 54 700 1828/1446, Email: [email protected], [email protected]
3 Technische Universität Dresden, 01062 Dresden, Germany
Tel: +49 351 463 38005, Fax: +49 351 463 38255, Email: [email protected]

Abstract: Transparency is an important precondition for users' control over their privacy. It can increase users' trust in the accurate and secure processing of their personal data. This paper presents concepts and implementations of different transparency tools employed in the user-controlled identity management system of the project PRIME – Privacy and Identity Management for Europe. The focus lies on presenting the concepts and their visualization via user interfaces. Considerations on human-computer interaction are highlighted for different alternatives, and the motivation for design choices is explained. Moreover, results from user tests are reported and analyzed, and related work is discussed. The paper concludes that transparency tools rarely belong to the standard functionality of today's information and communication technologies, although much improvement would already be achievable.

1. Introduction

In the online world, every person has to handle numerous accounts and data sets. These so-called "partial identities" will increasingly play a key role in future electronic services as well as in public security. They may well convey sensitive personal data, such as patient health data, employee data, or credit card data. The European project "PRIME – Privacy and Identity Management for Europe"[i] is developing a working prototype of a user-controlled identity management system. While there are manifold identity management tools, among them Microsoft CardSpace as part of the operating system Vista, which let users select between different partial identities, the objectives of PRIME go far beyond this: the project focuses on solutions for identity management that support end-users' sovereignty over their private sphere as well as enterprises' privacy-compliant data processing. In these systems, the user has control of personal information and discloses it only after agreeing to the service's privacy policy. Such privacy policies describe the data processing, e.g., its purpose, possible transfers to other parties, and retention. All parties act within the strict bounds of law, under anonymity, pseudonymity, or on the basis of explicitly agreed terms between the parties. In all cases, technology supports accountability and recourse.

Transparency-enhancing technologies have been discussed as part of, or as a supplement to, privacy-enhancing technologies (PETs), e.g., in the Network of Excellence "FIDIS – Future of Identity in the Information Society" (www.fidis.net). Thus, transparency is not only a real and legally stated necessity for users' control over their private spheres, but it also helps to increase trust: with PETs, users should feel in control of the technologies concerning them, which may be accomplished by transparent and reversible procedures [1].

2. PRIME's Transparency Tools: Technology and Test Results

This section describes methods to increase transparency (in the sense of clear visibility) of privacy-relevant data processing for users, so that they can maintain control over their private sphere:
• log functionality on the user's side,
• warn functions signalling whether a service matches the user's preferences,
• tutorials and demo tools,
• a security feed to report and react to vulnerabilities, and
• online help functions which enable users to exercise their rights and to keep control over personal data they have released.

These transparency tools are being implemented and evaluated in the user-controlled identity management prototypes within the project "PRIME – Privacy and Identity Management for Europe".

2.1 Data Track: Log Functionality on the User's Side

The Data Track allows users to look up what personal data they have released to other sites via data transaction records logged at the end user's side. It contains user-friendly functions for searching these logged records, support for worried users, and functions for exercising online the rights to access one's data, rectify them, or request their deletion (see Section 2.5). The transaction records stored in the Data Track additionally contain the date of transmission, the purpose of data collection, the recipient, and all further details of the privacy policy that the user and the service have agreed upon. It would not be sufficient to rely on the privacy policy shown at the service's website, as this may change without notice or change management. Pseudonyms used and credentials disclosed or received are also logged, so that users can later look up what was revealed to which service. The logged pseudonyms further allow a user to identify himself as a previous contact in case he wants to exercise his rights to access, rectification, blocking or deletion while still remaining pseudonymous.

Different search functionalities for finding relevant transaction records have been elaborated within the PRIME project to serve beginners and experts best. Two search methods are quite straightforward and might appear to be the obvious choices: (1) sorting step-wise by categories, such as 'Personal data' and 'Receivers'; and (2) a simple search box. However, these two approaches seem unsatisfactory because, as user tests performed by the PRIME group revealed, users are unaware of what the system does. More suitable methods that are currently being pilot-tested include: (3) template sentences which put search boxes within meaningful frames, such as "Who has received my [drop-down list with data]?"; and (4) a scrollable transaction track that shows all records at once. In the latter, the records are shown in abbreviated form as small pages stacked along a timeline (see Fig. 1), and a slider makes it possible to highlight an individual page in the stack. In this way, users can browse through the records without having to understand sorting or to articulate refined search requests. This method seems more appropriate for beginners, whose number of transaction records will still be limited; with an increasing number of transactions it becomes more and more difficult to find the desired record(s). For the more advanced user, combinations of methods have to be explored and developed (see also [2]).

In a usability test, 45 high school students were instructed to answer questions by searching the Data Track using (2), (3), and (5), an advanced search (not in Fig. 1). Only a few persons scored less than half correct answers. When asked in a post-test question whether they found the search functions of the Data Track easy to understand and use, only five said "No" (interestingly, this did not correlate with a low number of correct answers).

Figure 1: Data Track Window Including Template Sentences and Scrollable Tracks
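To make the Data Track concept more concrete, the following minimal sketch models a transaction record and the template-sentence search "Who has received my [data item]?". All names and fields here (TransactionRecord, who_has_received, etc.) are illustrative assumptions, not the PRIME prototype's actual data model or API.

```python
# Sketch only: a minimal model of a Data Track transaction record and the
# "template sentence" search described above. Names are hypothetical.
from dataclasses import dataclass
from datetime import date


@dataclass
class TransactionRecord:
    """One logged data disclosure, stored at the user's side."""
    when: date                  # date of transmission
    recipient: str              # the service that received the data
    purpose: str                # purpose of data collection
    data_items: dict[str, str]  # e.g. {"email": "alice@example.org"}
    pseudonym: str              # pseudonym used towards this service
    policy: str = ""            # agreed privacy policy (or a reference to it)


def who_has_received(track: list[TransactionRecord],
                     data_item: str) -> list[TransactionRecord]:
    """Answer the template sentence 'Who has received my [data item]?'."""
    return [r for r in track if data_item in r.data_items]


if __name__ == "__main__":
    track = [
        TransactionRecord(date(2007, 2, 1), "bookshop.example", "delivery",
                          {"name": "Alice", "address": "..."}, "pseudo-17"),
        TransactionRecord(date(2007, 3, 5), "news.example", "newsletter",
                          {"email": "alice@example.org"}, "pseudo-42"),
    ]
    for rec in who_has_received(track, "email"):
        print(f"{rec.when}: e-mail address sent to {rec.recipient} "
              f"under pseudonym {rec.pseudonym}")
```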

2.2 Signals If Matching the User's Preferences

For a user performing online transactions, much information can be relevant for influencing their behaviour, e.g., whether the service seems to be trustworthy and whether its privacy policy matches their own preferences. For providing information on trustworthiness, several aspects play a role and must be signalled to the user on demand:
• the reputation based on past experiences of the user herself or of peers,
• estimations by organizations trusted by the user (in particular whether the service appears on black lists, e.g., from consumer organizations),
• audits of the data processing, e.g., shown as audit seals or trust marks issued by data protection authorities, or
• technical guarantees by specific trusted computing systems, which may help enforce stated obligations.

A simplified compliance check which relies on information from data protection commissioners, consumer organisations, and trusted platform schemata was designed and tested in a mock-up test with 12 persons who use the Internet daily or regularly. If this function gave an alert, the user would see the window in Figure 2; the user could also invoke this window manually [3]. Test users had to work through five scenarios with both unknown and known domestic and foreign web sites. There were not many usability issues with this design, except that the concept of a privacy seal was somewhat problematic for some participants and, judging from a few responses, the local help function (see Figure 2) was necessary. In future tests it may be worth adding more criteria, such as "Consumers' ratings", and letting participants rank the individual entries or manually select criteria during each scenario walk-through.

Figure 2: Mock-up of the Privacy Functionality Check and Its Local Help Window

When asked whether they would have a use for the functionality check (and, if not, why not), only one participant answered that she does not purchase anything via the Internet because she is unsure about how secure it really is, but added that if there were such a program as in the test, she might start to do Internet shopping. All the others were positive to very positive.
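As an illustration of how such a simplified compliance check might combine the criteria listed above into an alert, consider the following sketch. The criteria, names and warning logic are assumptions for illustration only, not the tested mock-up's implementation.

```python
# Sketch only (hypothetical names): a simplified compliance check that
# combines several trust criteria into a warn/no-warn signal.
from dataclasses import dataclass


@dataclass
class SiteAssessment:
    domain: str
    blacklisted: bool           # e.g. listed by a consumer organization
    has_privacy_seal: bool      # audit seal / trust mark present and valid
    policy_matches_prefs: bool  # policy satisfies the user's preferences


def should_warn(a: SiteAssessment) -> list[str]:
    """Return the reasons for an alert; an empty list means no warning."""
    reasons = []
    if a.blacklisted:
        reasons.append(f"{a.domain} appears on a black list")
    if not a.has_privacy_seal:
        reasons.append("no recognized privacy seal or trust mark")
    if not a.policy_matches_prefs:
        reasons.append("privacy policy does not match your preferences")
    return reasons


if __name__ == "__main__":
    site = SiteAssessment("shop.example", blacklisted=False,
                          has_privacy_seal=False, policy_matches_prefs=True)
    for reason in should_warn(site):
        print("Warning:", reason)
```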

2.3 Tutorials and Demos

Awareness of risks, in particular in the online world, and education on concepts and the use of tools for privacy protection are necessary if people are to be able to maintain their privacy. The PRIME project has elaborated tutorials for two main target groups, "ordinary" users and specialists such as developers or privacy commissioners (https://www.prime-project.eu/tutorials/). They inform in an entertaining way about risks and countermeasures in both the offline and the online world. The knowledge is deepened by interactive components or – for specialists and also for teaching purposes – by self-tests after each unit. Integrated into the user interface of the PRIME identity management system is the "Quick demo" window, which realizes a similar idea: educating users in a crisp and easily understandable way on their rights and on how to exercise these rights offline or online via the Data Track, and giving contact addresses of consumer organizations or data protection authorities for help.

2.4 Security Feed

For user-controlled identity management systems, we envision mechanisms for users to be informed about security and privacy incidents, especially if these might influence the (re-)use of partial identities. We think of specific information services offered by feed providers, e.g., via RSS, dealing with security and privacy information on incidents concerning protocols, applications, cryptographic algorithms, communication partners, or PRIME-enabled software itself. A demonstrator of this system has been developed: users can subscribe to one or multiple RSS feeds, which are regularly polled. Information from the feeds which is relevant for the user is stored at the user's side and displayed
• when the user is going to disclose data,
• in the "Data Track" dialogue, to help understand potential risks related to former transactions, and
• immediately when the PRIME-enabled system being used is itself vulnerable.

For convenience, related warnings are grouped, and the priorities assigned to the feed items are evaluated together with the user's estimation of the reliability of the respective feed provider (i.e., a trust mark with the value "low", "medium" or "high").

Figure 3: Feed Information on a Hacking Attack

In addition, if the information items contain the dates when the vulnerability started and when it was discovered, this is helpful for interpreting former transactions which happened before the incident was known (see Fig. 3). Further, the warnings should not only comprise mere information on the incident, but also ways to overcome or at least deal with the vulnerability. To ensure the authenticity of the provider's feed items, they are digitally signed, and the signatures are checked in the polling process.
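The following sketch illustrates one way the described evaluation could work: feed items carry a provider-assigned priority, which is weighted by the user's trust mark for that provider, and the vulnerability's start date is compared against former transactions. The names, the scoring rule and the date check are assumptions, not the demonstrator's code; signature verification of feed items is omitted here.

```python
# Sketch only: ranking security-feed items by provider trust mark and item
# priority, and flagging former transactions that fall into the vulnerable
# period. All names and the scoring rule are hypothetical.
from dataclasses import dataclass
from datetime import date

TRUST = {"low": 1, "medium": 2, "high": 3}  # user's trust mark per provider


@dataclass
class FeedItem:
    provider: str
    priority: int     # priority assigned by the feed provider (1..3)
    started: date     # when the vulnerability started
    discovered: date  # when it was discovered
    summary: str
    remedy: str       # how to overcome or deal with the vulnerability


def relevance(item: FeedItem, provider_trust: dict[str, str]) -> int:
    """One possible rule: weight the item's priority by provider trust."""
    return item.priority * TRUST[provider_trust.get(item.provider, "low")]


def transaction_affected(item: FeedItem, transaction_date: date) -> bool:
    """A past transaction may be at risk if it happened after the
    vulnerability started, even if before it was discovered."""
    return transaction_date >= item.started


if __name__ == "__main__":
    trust = {"cert.example": "high"}
    item = FeedItem("cert.example", priority=3,
                    started=date(2007, 1, 10), discovered=date(2007, 2, 20),
                    summary="Hash collision in algorithm X",
                    remedy="Re-issue affected credentials")
    print("relevance:", relevance(item, trust))
    print("2007-02-01 transaction affected:",
          transaction_affected(item, date(2007, 2, 1)))
```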

2.5 Online Functions for Exercising Rights

The right to access means that individuals can request information on their own personal data processed by a data controller, e.g., a company or a governmental authority. Further, individuals can request rectification of their data to correct them, or even request erasure, e.g., if the data are illegally stored.

For exercising one's privacy rights, requests have to be sent to the data controller. In case of no reply or no satisfying reply, the supervisory authority can be addressed, usually a national or regional data protection authority. Many individuals are not aware of their privacy rights, but even if they are, they rarely exercise them, because it takes considerable bureaucratic effort to find out whom to address, to compile a letter (often to be personally signed on paper), to send it, to wait for an answer, to write reminders, etc. When using pseudonyms (e.g., from an identity management system), this may be even more complicated, because the data controller needs proof that it is communicating with the specific pseudonym holder.

With PRIME-enabled systems at both the user's and the service's side, the rights to access, rectification etc. could be handled entirely online, even under pseudonyms. Otherwise, semi-automatic solutions can be realized: the user's identity management system could support the user in finding the address of the data controller (from the privacy policy), generating request letters, authenticating the user (even for pseudonymous access), monitoring the complaint status, compiling reminders, and – if necessary – drafting requests to the supervisory authority in charge (see Figure 4). Of course, additional personal information disclosed via such requests should also be minimised.

Figure 4: Workflow for Online Functions for Exercising Rights
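To illustrate the semi-automatic workflow, the sketch below generates a pseudonymous request letter from data stored in the Data Track and checks whether a reminder is due. The Request class, the template wording and the 30-day deadline are hypothetical assumptions, not the PRIME implementation.

```python
# Sketch only: generating a rights-exercise request and monitoring its
# status, following the semi-automatic workflow described above.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class Request:
    controller: str   # address taken from the agreed privacy policy
    pseudonym: str    # lets the controller verify the pseudonym holder
    right: str        # "access", "rectification", "blocking" or "deletion"
    sent: date
    answered: bool = False

    def letter(self) -> str:
        """Draft a request letter from the stored transaction data."""
        return (f"To: {self.controller}\n"
                f"I am the holder of pseudonym '{self.pseudonym}' and hereby "
                f"exercise my right to {self.right} of my personal data.")

    def reminder_due(self, today: date, deadline_days: int = 30) -> bool:
        """A reminder should be drafted if no reply arrived in time."""
        return (not self.answered
                and today > self.sent + timedelta(days=deadline_days))


if __name__ == "__main__":
    req = Request("privacy@shop.example", "pseudo-17", "deletion",
                  sent=date(2007, 3, 1))
    print(req.letter())
    print("reminder due:", req.reminder_due(date(2007, 4, 15)))
```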

We conducted a pilot user test in which 10 participants ("ordinary Internet users") were asked to exercise their personal data rights. Of the ten participants, 8 received pre-configured e-mails, filled in by the Data Track, to facilitate the withdrawal of information, but 3 of the 8 saw no need to use this aid function, as they declared they only buy from companies they trust. They did not seem to consider that this makes them vulnerable to 'spoofing' attacks, where web pages are made to look like pages from reputable companies [3, 4].

3. Related Work

In the PISA project on "Privacy Incorporated Software Agents", Patrick et al. [5] derived from the privacy principle of user control the human-computer interaction requirement that users be conscious of their rights and able to exercise control over their data, with the ability to erase, block, rectify or supplement them. This was in turn mapped to the proposed user interface design solution that data tracking functions be displayed prominently and that commands for exercising user rights be associated with the tracking logs and obvious to use. In contrast to our work, however, the PISA project neither elaborated in more detail how such tracking functions and associated search functions could be designed in a user-friendly manner, nor researched in detail how commands for exercising rights can be implemented and designed.

Weitzner et al. present a basic architecture for transparent and accountable data mining that consists of general-purpose inferencing components to be installed at the services' side, providing transparency of inferencing steps and accountability of rules [6]. Transparency in this context means that the history of data manipulations and inferences is maintained and can be examined by authorized parties; accountability means that one can check whether the policies that govern data manipulations were adhered to. Sackmann et al. present a secure logging mechanism for creating privacy evidence for ex-post enforcement of privacy policies [7]. Both approaches comprise transparency tools to be installed at the services' side which allow checking whether personal data there were processed in a way compliant with privacy legislation; in contrast to our approach, however, they were not designed as end-user transparency tools.

4. Conclusions

Transparency of personal data processing is a prerequisite for users to manage their privacy. Many tools can increase transparency and give feedback to the user, but most of them are not state of the art yet. User-controlled identity management systems are ideal for combining the selection of partial identities with transparency functionality and thereby supporting users in their privacy management. User interface design in particular is not trivial, as users should suffer neither from information overflow nor from a lack of appropriate information. In addition, the mere facts on who processes which data and how may not be informative enough, because the interpretation by those making use of the data, potentially joining them with data from other sources, might not be guessable by the user. For current technologies this is work in progress; for future and emerging technologies possibly new concepts have to be developed.

To calculate the benefit for organisations, a cost-benefit analysis of the provision of transparency tools has to be elaborated. Here the type of user feedback (which content via which channel) is relevant to consider, because the effort for the organisation as well as the potential effects on its reputation can vary. Moreover, privacy law and freedom of information law demand giving information to users, and in several US states Security Breach Notification Acts force companies to inform the people concerned in case of hacking attacks. Here a proactive approach, supported by technical transparency tools and accompanied by organisational procedures, can enhance an organisation's reputation and its trustworthiness.

Further research is needed on the question whether users will behave differently regarding their right to privacy if they are provided with transparency-enhancing technologies in the online – or even ambient – world. The trend of increasing complexity and the growing difficulty for users to stay in control in a world of sensors, video surveillance, bugging devices or smart dust, operated by a plenitude of actors, calls for retrofitting transparency wherever possible and for educating users to understand what actually is, or potentially might be, happening to their personal data. To avoid a widening gap between the haves and the have-nots with respect to understanding the digital world and the competency to protect one's own privacy, nation states or the European Union might offer transparency tools to their citizens (and others) and teach their usage at school. This may even be a precondition for a prospering information society encompassing all users, who could then revive the image of responsible-minded citizens able to exercise their rights.

References

[1] C. Andersson, J. Camenisch, S. Crane, S. Fischer-Hübner, R. Leenes, S. Pearson, J.S. Pettersson, D. Sommer, Trust in PRIME. In: Proceedings of the 5th IEEE Int. Symposium on Signal Processing and IT, December 18-21, 2005, Athens, Greece.
[2] J.S. Pettersson, S. Fischer-Hübner, M. Bergmann, Outlining Data Track: Privacy-friendly Data Maintenance for End-users. In: Proceedings of the 15th International Conference on Information Systems Development (ISD 2006), Budapest, 31 August - 2 September 2006, Springer Scientific Publishers.
[3] J.S. Pettersson, R1 – First Report from the Pilot Study on Privacy Technology in the Framework of Consumer Support Infrastructure. Working Report (VINNOVA-financed pilot study P29664-1), December 2006. Dept. of Information Systems and Centre for HumanIT, Karlstad University. http://www.humanit.org/pdf/R1_Privacy_rights_infra_December_2006.pdf
[4] R. Dhamija, J.D. Tygar, M. Hearst, Why Phishing Works. In: CHI 2006 Proceedings. ACM Press, 2006.
[5] A. Patrick, S. Kenny, C. Holmes, M. van Breuken, Human Computer Interaction. Chapter 12 in: G.W. van Blarkom, J.J. Borking, J.G.E. Olk (Eds.), Handbook of Privacy and Privacy-Enhancing Technologies. College bescherming persoonsgegevens, The Hague, 2003. http://www.andrewpatrick.ca/pisa/handbook/handbook.html
[6] D. Weitzner, H. Abelson, T. Berners-Lee, C. Hanson, J. Hendler, L. Kagal, D. McGuinness, G. Sussman, K. Krasnow Waterman, Transparent Accountable Data Mining: New Strategies for Privacy Protection. MIT Computer Science and Artificial Intelligence Laboratory Technical Report MIT-CSAIL-TR-2006-007, January 27, 2006.
[7] S. Sackmann, J. Strüker, R. Accorsi, Personalization in Privacy-Aware Highly Dynamic Systems. Communications of the ACM, Vol. 49(9), September 2006.

[i] www.prime-project.eu; the PRIME project receives research funding from the European Union's Sixth Framework Programme and the Swiss Federal Office for Education and Science.