VOLUME 11 ISSUE 1

Journal of Technologies and Human Usability

The Prevention and Mitigation of Breaches of Personal Information Databases: A Theoretical Framework

SUSAN SPROULE AND FRANCINE VACHON

techandsoc.com

JOURNAL OF TECHNOLOGIES AND HUMAN USABILITY
www.techandsoc.com

First published in 2015 in Champaign, Illinois, USA by Common Ground Publishing LLC
www.commongroundpublishing.com

ISSN: 2381-9227

© 2015 (individual papers), the author(s)
© 2015 (selection and editorial matter) Common Ground

All rights reserved. Apart from fair dealing for the purposes of study, research, criticism or review as permitted under the applicable copyright legislation, no part of this work may be reproduced by any process without written permission from the publisher. For permissions and other inquiries, please contact [email protected].

Journal of Technologies and Human Usability is peer-reviewed, supported by rigorous processes of criterion-referenced article ranking and qualitative commentary, ensuring that only intellectual work of the greatest substance and highest significance is published.

The Prevention and Mitigation of Breaches of Personal Information Databases: A Theoretical Framework

Susan Sproule, Brock University, Canada
Francine Vachon, Brock University, Canada

Abstract: It seems that everything—from our cars to the search engines we use—collects information about us and our activities. Breaches compromising personal information are an unfortunate by-product of our modern information-intensive society. In this paper, we examine previous research on this problem from diverse perspectives. We then develop an interdisciplinary framework that examines the parties involved, the benefits derived by each party, the losses from a breach, and the ability of each party to mitigate losses. Within this framework, appropriate economic theories are extended to other types of breaches and frauds.

Keywords: Cyber-law, Data Breach, Database, Negative Externalities, Liability, Loss Allocation Rules, Public Policy, Privacy, Privacy by Design, Data Security

Introduction

Data is collected about us daily from our computers, phones, cars and other devices. Unauthorized access to these data can result from any of the following: insider carelessness, malicious behaviour, external attacks such as hacking or social engineering, as well as the transfer of personal data between organizations. Until now, concerns have concentrated on identifying data (name, address, telephone number, social security number, driver's license) linked with financial data (credit and bank accounts, utilities, etc.). Internet privacy represents the level of control a person has over the exchange of personal data online (Garber 2013; Langheinrich 2001). This control is eroded by the collection and storage of data such as communication metadata, location, camera, RFID, surveillance and health data (Lee et al. 2012; Rosemann 2013; Al-Kassab, Thiesse, and Buckel 2013), which raises concerns about the consequences of data breaches (Acquisti 2013; Braid 2014; Chen 2014). This paper synthesizes an interdisciplinary review of the literature and presents a theoretical framework to address the following questions:

• Who are the main stakeholders?
• What are the risks related to breaches of personal information?
• Who among the stakeholders can mitigate these risks?
• What are possible remedies to these data breaches?

Review of Research

Prior research concentrated on specific aspects of the problem. Statistical studies classified data breaches (hereafter breaches) by size, location, types of data involved, precipitating incident, victim, and institution (Gupta and Sharman 2012; Garrison and Ncube 2011) to assess the risks involved when organizations collect and store personal information. Event studies found that breaches have short-term negative effects on stock prices, but no discernible long-term effects (Acquisti, Friedman, and Telang 2006; Campbell et al. 2003; Cavusoglu, Mishra, and Raghunathan 2004; Gatzlaff and McCullough 2010).

Journal of Technologies and Human Usability Volume 11, Issue 1, 2015, www.techandsoc.com, ISSN 2381-9227 © Common Ground, Susan Sproule, Francine Vachon, All Rights Reserved Permissions: [email protected]


Romanosky, Telang and Acquisti (2011) found that the adoption of notification laws has reduced identity theft from breaches reported to the Federal Trade Commission (FTC) by 6.1 percent. Romanosky, Hoffman and Acquisti (2012) examined a dataset of breaches from the Open Security Foundation and court records of breach litigation. They conclude that evidence of financial harm increases the chances of a lawsuit, while an offer of free credit monitoring reduces them. Other legal studies examine present laws and their impact in Canada (Lawford and Lo 2011), in the U.S. (Payton 2006) and in Europe (Vamialis 2013).

Stakeholders

We classify stakeholders as three parties: data subjects, data holders and regulators:

• Data subjects (hereafter subjects) are persons about whom data is collected.
• Data holders (hereafter holders) are organizations that collect financial, health-related, work-related or personal data. They include businesses or institutions conducting digital surveillance using closed circuit video, automated license plate recognition systems, automotive or other forms of telemetry, and electronic communication monitoring/recording (Newell 2014; Odella 2010; Yassine et al. 2012).
• Regulators play a role by adopting laws and regulations, establishing policies, assigning liabilities and entitlements, or signing treaties (Bauer and van Eeten 2009).

Benefits of Data Collection

The digital collection and storage of information provide efficiencies in administration, transactions and communication, thus improving services and reducing costs for subjects and holders (Acquisti 2010). For example, sensor-equipped devices provide users with safer and more comfortable homes (Rosemann 2013; Lee et al. 2012). New business models are supported by collecting data about customers (Acquisti 2010). To perform more efficient Web searches, subjects allow search engines to collect information about their queries (Google 2013; 2014). Businesses use personal data to build customer profiles to improve marketing and forecasting capabilities, lower advertising costs, and improve customer loyalty (Acquisti 2010).

However, data disclosed and collected today go beyond what is required for efficiency. Subjects trade information for benefits such as discounts, personalization or customization of services, improved services, and targeted advertising (Acquisti 2010, 7). New business models and peer pressure may compel subjects to provide information (Langheinrich 2001; Acquisti 2010). To use search engines or social media, subjects must allow the collection of associated data (Google 2013; 2014; Facebook 2013). Even when explicit consent is required, subjects are offered a “take-it-or-leave-it” proposition (Solove and Hoofnagle 2006). Businesses have been criticized for collecting data without subjects’ permission (Dingman 2014; Presse Canadienne 2014).

Under certain conditions, law enforcement agencies are legally permitted to use databases and surveillance technologies for crime prevention and resolution. Within certain limits, citizens accept these activities in exchange for added security, but involuntary disclosure may occur (Bellovin 2010; Langheinrich 2001; Weston, Greenwald, and Gallagher 2014).


Costs of Data Collection

Costs of Collection

Systems such as Customer Relationship Management (CRM) systems must be purchased, implemented and maintained. Providing data security and developing data privacy and security policies is costly as well (Acquisti 2010).

Costs of Breaches

Following a breach, holders are often liable for direct costs, whereas subjects risk being targeted for identity theft and fraud (IDTF). In the USA, out-of-pocket losses for IDTF victims approach 5B USD. Total fraud losses for 2012, primarily borne by holders, amounted to 20.9B USD (Javelin Strategy & Research 2013).

Costs of Externalities

Data collection spawns both tangible and intangible negative externalities (hereafter externalities). Spam costs an estimated 20B USD annually, with an additional 5B USD for inefficiencies caused by an email-advertising “stigma” (Rao and Reilly 2012). For their part, negative feelings and discomfort resulting from privacy-related uncertainty are harder to price (Acquisti 2010).

Economic Theories

Several economists have studied the mitigation of externalities. Pigou (as cited by Coleman 1980) advocated taxation to compel industry to adopt mitigation measures. However, Coase (1960, 40) argued that when parties are sufficiently informed, government actions are not required and may “produce more harm than the original deficiency”. Coleman (1980) countered that the state may decide to intervene on noneconomic grounds: “The allocation of scarce, lifesaving medical resources, for example, may be best left to nonmarket allocative devices” (Coleman 1980, 245).

Applying Coase’s theorem to breaches, whether or not data remain protected depends on the “relative valuations of the parties interested in that data. If the consumer values her privacy more than the data marketing firm values acquiring that data, the data will remain protected because – even in the absence of a law regulating that protection – the consumer would willingly pay for the right to protect that data” (Acquisti 2010, 4).

However, holder-subject agreements are not costless. Information asymmetry and disproportionate negotiation costs result in inefficient markets (Cooter and Rubin 1987). Despite technology’s omnipresence, most subjects have low information literacy (Sakai 2014). Moreover, privacy notices and terms of use are difficult for non-experts to understand (Bahr and Allen 2013), and most subjects do not read them (Beales and Muris 2008). “Individual account holders have span of control problems. How can they keep track of their myriad accounts?” (Facciolo 2008, 628). Behavioural economics tells us that consumers act myopically rather than rationally when faced with short-term benefits and long-term costs (Acquisti 2010).
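The Coasean logic, and the way transaction costs undermine it, can be made concrete with a stylized numerical sketch. The dollar values and the helper function below are illustrative assumptions for this paper's argument, not figures from the cited literature.

```python
def data_stays_protected(subject_value: float,
                         holder_value: float,
                         transaction_cost: float = 0.0) -> bool:
    """Stylized Coasean bargain over one piece of personal data.

    The subject 'buys back' protection only if her valuation of privacy
    exceeds the holder's valuation of the data plus the cost of striking
    the deal (negotiation, reading notices, monitoring compliance).
    """
    return subject_value > holder_value + transaction_cost


# Frictionless world (Coase): the data stay protected because the subject
# values privacy ($50) more than the marketer values the data ($30).
print(data_stays_protected(subject_value=50, holder_value=30))    # True

# Same valuations, but information asymmetry and negotiation effort add a
# $25 transaction cost, so the efficient outcome is no longer reached.
print(data_stays_protected(subject_value=50, holder_value=30,
                           transaction_cost=25))                  # False
```

The second call illustrates the point made above: even when the subject values privacy more than the holder values the data, bargaining frictions can leave the data unprotected.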

Risks

Risk is defined as the probabilistic costs associated with privacy breaches (Archer et al. 2012). Whenever data are collected, they risk ending up in unauthorized hands. Risk has two dimensions: first, the probability of an event happening and second, its impact, should it occur. Risk can be mitigated by influencing either dimension.


If the potential loss is large but the probability of occurrence is very low, then there is no need to take more than minimal precautions against loss…If the probability of occurrence is higher but the accompanying loss is low…the potential event should be of less concern, with only normal precautions taken. However, if the probability is moderately high and loss is also moderately high, then this becomes an event which needs considerable attention, with care taken to reduce either or both the probability and the loss due to that event. (Archer et al. 2012, 46)

Figure 1 illustrates this concept. At both extremes, privacy costs “may be dismissed as unimportant at the individual level – even if, in the aggregate, they may amount to significant societal damage” (Acquisti 2010, 16).

Figure 1: Risk Mitigation Graph Source: Archer et al. 2012
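The classification logic behind Figure 1 can be expressed as a small decision rule. The thresholds and dollar amounts below are illustrative assumptions only; Archer et al. (2012) present the idea graphically rather than numerically.

```python
def mitigation_advice(probability: float, impact: float,
                      p_threshold: float = 0.3,
                      impact_threshold: float = 1000.0) -> str:
    """Classify a privacy-risk event on the two dimensions discussed above.

    probability: chance that the event occurs (0..1)
    impact:      loss if it occurs (e.g., dollars)
    The thresholds are arbitrary placeholders for illustration.
    """
    if probability >= p_threshold and impact >= impact_threshold:
        # Moderately high on both dimensions: reduce probability, impact, or both.
        return "considerable attention"
    if probability < p_threshold and impact >= impact_threshold:
        # Large potential loss but very unlikely: minimal precautions.
        return "minimal precautions"
    # Frequent but cheap, or rare and cheap: normal precautions.
    return "normal precautions"


print(mitigation_advice(probability=0.05, impact=50_000))  # minimal precautions
print(mitigation_advice(probability=0.6, impact=200))      # normal precautions
print(mitigation_advice(probability=0.6, impact=5_000))    # considerable attention
```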

It is usually suggested that both subjects and holders share responsibility for risk mitigation. Probability mitigations include actions to prevent breaches or reduce their likelihood. Impact mitigations include actions to minimize resulting fraud losses. Subjects are held to a duty of “normal care” and holders are responsible for providing appropriate authorization processes (card design, password controls, etc.) and appropriate levels of security (Facciolo 2008). To mitigate losses, subjects are held responsible for monitoring accounts and reporting any problems promptly while holders are responsible for monitoring and closing affected accounts (Facciolo 2008).

Financial Risks

The first risk is identity theft and fraud (IDTF). Surveys indicate that 22.5% of US consumers notified of a privacy breach experienced some form of identity fraud, compared to 5.3% of all consumers (Javelin Strategy & Research 2013). IDTF cases cause both tangible losses, like credit-history damage and subsequent loss of purchasing power (Anderson, Durbin, and Salinger 2008; Marron 2008), and intangible losses like anxiety, embarrassment and fear (Calo 2011).


Most IDTF cases involve credit or debit card fraud, where victim liability is virtually nil. However, holders incur substantial losses related to payment card fraud (Morea et al. 2011), which should encourage the adoption of prevention measures.

Personal and Property Risks

Criminals may use an individual’s location information to target property for theft (WMUR 2010) and to target persons, with death threats, scare tactics or harassment (Newell 2014).

Socio-economic Risks

Aggregating data from various sources, such as GPS, Facebook “likes” and browsing history, can identify personal traits and habits (Kosinski, Stillwell, and Graepel 2013; Newell 2014; Marwick 2014). “People may choose not to reveal certain pieces of information about their lives (…), yet [these] might be revealed in a statistical sense from other aspects of their lives they do reveal” (Kosinski, Stillwell, and Graepel 2013, 5802). Involuntary digital exposure may have grave consequences. Employers use social media to discriminate between equally qualified job candidates (Acquisti 2013). An unmarried woman may not want to reveal her pregnancy (Kosinski, Stillwell, and Graepel 2013; Marwick 2014).
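To illustrate the kind of statistical inference described by Kosinski, Stillwell, and Graepel, the sketch below fits a simple logistic regression that predicts an undisclosed trait from binary “like” indicators. The toy data, labels and model choice are invented for illustration; this is not the authors’ actual model or dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: rows are users, columns are binary "like" indicators for three
# hypothetical pages; the label is a trait the users never stated directly.
likes = np.array([
    [1, 0, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 0, 0],
    [0, 1, 1],
])
trait = np.array([1, 1, 0, 0, 1, 0])  # invented labels for illustration

model = LogisticRegression().fit(likes, trait)

# A new user reveals only page likes, yet the model assigns a probability
# to the trait they chose not to disclose.
new_user = np.array([[1, 0, 1]])
print(model.predict_proba(new_user)[0, 1])
```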

Externalities

Spam-related externalities affect almost everyone and have large aggregate societal costs (Cavoukian 2009). Numerous studies also indicate that fear of IDTF is associated with a drop in confidence in e-commerce, leading 20% to 40% of consumers to change their behaviour (Anderson, Durbin, and Salinger 2008; Archer et al. 2012). The resulting loss of online sales amounts to billions (Acquisti 2010). Anticipating privacy breaches, subjects may provide false information to holders, thus reducing the benefits of data collection (Hoffman, Novak, and Peralta 1999).

Current Remedies

Self-Regulation and Quasi-Regulation

Payment card fraud is covered by contracts between subjects and holders. These are classified as quasi-regulations because of the threat of government intervention (Productivity Commission 1998). For example, the Uniform Commercial Code (UCC) outlines recommended federal standards for the state-governed financial industry.

In payment systems, losses from fraud and errors are attributed from the perspective of loss allocation rules (Cooter and Rubin 1987; Facciolo 2008). Losses should be assigned to the party able to achieve risk neutrality at the lowest cost (loss spreading) or to the party capable of preventing the loss at the least cost. Most payment system losses are attributed to holders, who spread these costs among their customers and can adopt new security-heightening technology, such as chip-and-pin technology or biometrics. Enforcement costs must also be considered (loss imposition) (Cooter and Rubin 1987); a stylized comparison of these rules is sketched at the end of this subsection.

A second remedy available to holders is purchasing insurance. “Mass investment in insurance indicates that many businesses are out of their depth and are starting to realise that they now need to view cyber security as a whole package” (Patel, as quoted in Drinkwater 2014). The presence of high-occurrence, low-impact events prompted financial service providers to offer customers alerting services and identity theft insurance products. “Where there are potential losses with a relatively low per incident cost, the bean counters are not far behind” (Mercuri 2006, 18).
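The stylized comparison referred to above reduces the loss allocation rules to a single question: which party can prevent the loss, or absorb and spread it, at the lowest cost? The figures and helper below are illustrative assumptions, not values from Cooter and Rubin (1987).

```python
def assign_loss(prevention_costs: dict, spreading_costs: dict) -> str:
    """Return the party with the cheapest way of dealing with an expected loss.

    prevention_costs: cost for each party to prevent the loss (least-cost avoider)
    spreading_costs:  cost for each party to bear and spread the loss (loss spreading)
    Enforcement (loss imposition) costs are ignored in this toy version.
    """
    cheapest = {party: min(prevention_costs[party], spreading_costs[party])
                for party in prevention_costs}
    return min(cheapest, key=cheapest.get)


# Holders can deploy network-wide controls (e.g., chip-and-pin) for less than
# subjects could collectively spend guarding their cards, so the rule points
# at holders, consistent with how payment system losses are attributed.
print(assign_loss(prevention_costs={"subject": 400, "holder": 120},
                  spreading_costs={"subject": 900, "holder": 60}))   # holder
```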

Explicit Government Regulation

Government intervention may be indicated when private remedies fail.

Criminal Law

The Identity Theft and Assumption Deterrence Act in the US gave subjects standing in cases and increased penalties associated with IDTF. In Canada, Bill S-4 created new criminal offenses related to identity theft (Holmes and Valiquet 2009).

Statute Law

Statute law may be used to address holders’ responsibilities and may prescribe remedies in the event of a breach. Civil cases need only show that the holder failed to fulfill statute-imposed duties, or “negligence per se”. In the US, such regulations are written specifically for sectors where lawmakers believe intervention is necessary (the Gramm-Leach-Bliley Act for financial institutions, the Health Insurance Portability and Accountability Act [HIPAA] and the Fair Credit Reporting Act [FCRA]).

Common Law

In the absence of statute, breach victims must show that four questions can be answered in the affirmative (Payton 2006):

• Was there a duty of care?
• Was that duty of care breached?
• Is there a recognized injury?
• Did the breach cause the injury?

Privacy Legislation

US legislation imposes responsibilities by industrial sector. Canada and the European Union (EU) have more general privacy legislation, based in large part on widely accepted “fair information practices.” In Canada, commercial holders are regulated by the Personal Information Protection and Electronic Documents Act, while public-sector holders adhere to the Privacy Act (Charney 2013). The Privacy Commissioner of Canada will assist subjects in resolving complaints under either act; if resolution fails, the Commissioner may pursue court action (OPC 2012).

In the EU, the General Data Protection Regulation sets out technology-neutral security standards and requires notification of breaches (Vamialis 2013). Technology-neutral laws do not prescribe specific secure storage and data transmission technologies. EU regulations require the appointment of a data protection officer and give the Commission the ability to intervene if the technology in use is considered inadequate. However, in the absence of specific requirements, some argue that holders are really self-regulating (Ibid.). In 2014, a decision by the EU Court of Justice reinforced a subject’s “right to be forgotten”, i.e., the right to have links to “inaccurate, inadequate, irrelevant or excessive” information erased from search engines (EC 2014, 1). This right is “not absolute”, as a “case-by-case assessment of the information is needed” (EC 2014, 2).


Notification Laws

EU regulations require that subjects be notified of breaches, but Canada has no such regulation (Charney 2013). The Privacy Commissioner has, however, developed a voluntary code (OPC 2008). US notification laws are enacted by state legislatures and differ in their requirements and the remedies they prescribe (Faulkner 2007; Stevens 2009; Joerling 2010). In the wake of recent large breaches, Congress is considering imposing a common federal notification standard (Volz 2014).

Remedies under these laws can include requiring credit-monitoring coverage for a prescribed period of time or, in some cases, requiring credit bureaus to offer a credit freeze to victims. Holder-paid monitoring for breach victims addresses the problem of future harms and provides an added incentive for holders to prevent breaches (Rode 2007). Legal precedent can be found in cases where courts have ordered companies to pay for medical monitoring of toxic-spill victims (Payton 2006). Others argue that monitoring and credit freezes still put an unfair burden on victims (Anderson, Durbin, and Salinger 2008; Whitson and Haggerty 2008).

Consumer Protection

The Federal Trade Commission (FTC) has successfully imposed large fines in some high-profile breaches under its authority to control “unfair and deceptive practices”. For example, ChoicePoint settled with the FTC for $10M in civil penalties and $5M in consumer redress (Payton 2006). The FTC focuses on the consequences of a breach and on whether expected levels of protection were provided (Beales and Muris 2008).

Consumer protection agencies have attempted to correct some extensive data collection externalities through “do-not-call” lists (Beales and Muris 2008) and anti-spam legislation (Rao and Reilly 2012), with questionable effectiveness. Some legitimate activities were limited, and data was unintentionally made available to fraudsters. The US CAN-SPAM Act of 2003 proved ineffective because the spam marketplace is global. Pigouvian measures such as email “postage” or “attention bonds” have been suggested, but they may also have undesirable consequences (Rao and Reilly 2012).

Problems with Current Remedies

Moral Hazard

A moral hazard occurs when one party to an agreement is protected against risks and knows that the other party will bear the cost. In the absence of legal liabilities or contractual obligations, firms do not internalize breach-associated costs (Acquisti, Friedman, and Telang 2006). Losses can be written off for taxes, are covered by insurance, or are charged back to the consumer (Mercuri 2006). Furthermore, short-term effects in the stock market do not provide a strong incentive to protect data (Acquisti, Friedman, and Telang 2006). In many jurisdictions, including most of Canada, customers and shareholders may never be informed of a breach (Lawford and Lo 2011; Charney 2013). These factors leave holders poorly incentivized to protect data.

Without government intervention, holders invest in mitigation only in proportion to their own potential losses. For example, US adoption of chip-and-pin technology for payment cards was delayed because holders felt that implementation costs exceeded the reduction in potential fraud costs (Morea et al. 2011). Businesses have no competitive incentive to invest because new technology must be introduced on an industry-wide basis. It is also questionable whether banks and credit bureaus should be allowed to offer insurance products covering breaches that they are responsible for preventing (Sovern 2004). Whitson and Haggerty (2008) argue that these insurance packages lead to a bureaucratization of victim services.


A Systemic Problem

“The technical view of information security creates a biased outlook” (Colobran and Basart 2014). In many ways, current remedies reflect the following attitude: “the crime is just a simple risk to be managed” (Whitson and Haggerty 2008, 577) and “the problem becomes pitched not as one of systemic institutional culpability, but as lack of awareness on the part of individuals” (Marron 2008, 29). Breach-related news articles suggest paying in cash only and warn that “any password older than 30 days is ripe for theft” (Buffett 2014). Shifting responsibility to subjects is unfair (Facciolo 2008; Solove 2003; Marron 2008; Sovern 2004; Whitson and Haggerty 2008).

Legal remedies are ineffective for subjects: less than 10% of cases are solved (Payton 2006). Should a case be successfully prosecuted, the thief often does not have the money to make restitution. Complications also arise when a third party is involved (Payton 2006), and “personal data are at the mercy of the least careful holder” (Anderson, Durbin, and Salinger 2008, 187). Civil law only addresses “current tangible injury”, with no consideration of future harm (Payton 2006). Holders bear most direct losses; however, indirect losses continue to be borne by the subjects (Marron 2008). Class action suits may reduce the costs of loss imposition (Cooter and Rubin 1987); however, courts rarely certify them in the case of breaches (Vamialis 2013). In the current credit system, subjects have no ability to prevent breaches. After a problem occurs, lenders will file default reports rather than investigate frauds (Sovern 2004). “In consequence, then, the risk of identity theft is conceived of as an incalculable systemic risk stemming from the productiveness of the market itself” (Marron 2008, 23).

Because of this, Daniel Solove (2003) believes that we live in an “architecture of vulnerability.” For Solove, this is the risk we face as a society. We know information about us is being collected, processed and used – sometimes against our interest. But we have no choice about, or understanding of, the underlying processes.

Privacy harm in the contemporary world is less a function of top-down surveillance by a known entity for a reasonably clear if controversial purpose. It is characterized by an absence of understanding, a vague discomfort punctuated by the occasional act of disruption, unfairness, or the occasional violence…. Privacy harm is not merely individual but can lead to societal harms. The absence of privacy creates unhealthy power imbalances and interferes with citizen self-actualization. These harms go to the very architecture or structure of our society. (Calo 2011, 1158)

To date, nothing has been done to address the chilling effect that fear of IDTF poses to electronic commerce. Regulators have followed a neo-liberal policy of “let the market take care of social risks” (Whitson and Haggerty 2008, 577). With many subjects affected and large economic impacts, intervention is highly recommended (see Figure 1).

New Types of Data and Data Collection

New forms of data collection and types of data pose additional risks and externalities.

Location Data

Our vehicles and mobile phones collect location data from embedded GPS systems. Unauthorized access to someone’s location data has the potential to result in theft, robbery and stalking (Shilton 2009). With the aggregation of location data, it is possible for Google to predict with high probability where a subject will be on a given day up to five years in the future (Salidek and Krumm 2012). The potential for intrusive location-based advertising is an externality associated with location data (Acquisti 2013).


Health Data

With the advent of electronic health records, and the extensive interchange of data required within complex e-health systems, there is a large potential for health-related breaches. Insurance fraud and access to prescription drugs provide strong incentives for breaches. The consequences of a health-related fraud can be life-threatening if the fraudster’s medical data is co-mingled with that of the victim (Kierkegaard 2012). A recent Canadian breach of the health records of 620,000 patients was not reported for more than four months (Braid 2014). Health records also contain sensitive information such as HIV status or genetic profiles. This data could lead to discrimination or to extortion targeting either subjects or holders (Kierkegaard 2012). Victims of a health-related breach report difficulties in restoring or maintaining health insurance coverage. “Compared to a breach of financial records (credit cards), it is difficult to restore the medical records” (Kierkegaard 2012, 170).

Surveillance

Video surveillance in public spaces is now commonplace, police vehicles are equipped with license plate recognition software (Newell 2014), and, after the revelations of Edward Snowden, both the US and Canadian governments admit to monitoring their own citizens’ communications metadata (Weston, Greenwald, and Gallagher 2014). Our mobile phones are “embedded surveillance tools” that help track our movements through the day (Shilton 2009, 48). Beyond security purposes, a small Toronto company sells data gleaned from free Wi-Fi transmissions to neighbourhood retailers (Dwoskin 2014). People may prefer not to disclose their location or activities for many reasons, such as getting out of social obligations, organising a surprise, or participating in stigmatized activities (Shilton 2009). Moreover, there are legal questions regarding whether such surveillance constitutes unlawful search and what kinds of data should be subject to subpoena (Newell 2014; Hodge 2006; Petrashek 2010).

Internet of Things

The “Internet of things” refers to the increasing use of telemetry in everyday objects. Each of these Internet-connected devices is at risk of breach. Recently, smart appliances were ensnared as part of a spamming botnet (Proofpoint 2014), and security researchers reported that hacked pacemakers could deliver deadly jolts (Kirk 2012). As these two examples show, the Internet of things poses new personal risks and can contribute to existing externalities. One of the challenges posed by these devices is that they do not contain enough processing power to handle traditional security measures such as encryption (Higginbotham 2014).

Aggregation and Profiling

There is great potential for aggregating the data from these various sources into very detailed personal profiles (Kosinski, Stillwell, and Graepel 2013). Subjects commonly use social media sites to “broadcast” personal details, and this same information is often used for account verification (e.g., “What is the name of your favourite pet?”) (Shay et al. 2014).

Profiles can be used for price discrimination. Some industries have adopted quasi-regulatory controls on such discrimination. For example, vehicle owners in Canada can ask to have their rates determined using vehicle monitoring data, but insurance companies cannot use this data to price insurance packages without customer consent.

In addition to advertising, search engine profiles are used to personalize the presentation of results, leading to what Pariser (2011) calls “filter bubbles.” Similarly, Facebook sifts the information presented in users’ newsfeeds based on which of their friends they interact with most (McGee 2013). De facto, these filter bubbles prevent subjects from viewing information and constitute externalities resulting from excessive profiling.

New Remedies

This problem must be attacked with a systemic perspective on three fronts: policy change, privacy by design, and new approaches to usable security.

Policy Change

Acquisti (2013) commented on the limits of present policy mechanisms: “One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient.” Our analysis shows that the current state of laissez-faire is inefficient (Anderson, Durbin, and Salinger 2008; Kshetri 2009; Cavoukian 2009; Gordon and Liedtke 2013). Holders should be held liable for breaches beyond cases of negligence. In Canada, shifting responsibility from individuals to lenders produced excellent results in the case of mortgage and credit card frauds: in 2008, Ontario enacted stricter rules and higher requirements of diligent care for real estate lawyers and lending institutions, de facto shifting liability away from innocent homeowners to lenders (LSUC 2010).

Much discussion also concerns the penalties handed down in IDTF cases. Should punishment be proportional to harm or act as a disincentive (Acquisti 2010, 14)? The resources devoted to detection and punishment depend on the frequency and cost of this crime and the effectiveness of enforcement (Anderson, Durbin, and Salinger 2008, 171-192). Courts should recognize the general harm and impose much higher penalties. Legislation should be technology-explicit rather than technology-neutral. A precedent can be found in environmental protection legislation, where standards are set and testing procedures are implemented. In their review of US legislation, Solove and Hoofnagle (2006) suggest major changes to the credit reporting industry, such as having records frozen by default (where access would require subjects’ pre-authorization), the ability to request notification of credit record inquiries, and free credit monitoring.

Privacy by Design

Privacy by design should be built into holders’ systems (Dix 2010). At present, systems are built in a way that facilitates data sharing without subjects’ say. For example, date of birth is a piece of personal identifying information sought by identity thieves (RCMP 2013); this datum is required to open a Facebook account and its default status is “public” (Facebook 2013). Shaar (2010) presented six principles of privacy by design. We explain how they apply to subjects:

• Data sovereignty: Subjects should have extensive control over their own data.
• Voluntary basis: Data should be collected on a voluntary basis.
• Extent of data: Subjects should be able to decide which data to provide and also when to delete it.
• Data access: The subjects decide themselves which holders or service providers are allowed to view or use which data.
• Right to information: The subjects have the right to read all data concerning them and the right to know its uses and by whom.
• Ability to check: The subjects must “be able to use logs to check who accessed which data and when” (Ibid., 269).
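As a concrete illustration of the “ability to check” principle, the sketch below shows one minimal way a holder could expose an access log to subjects. The record layout, field names and example entries are assumptions made for illustration, not a design prescribed by Shaar (2010).

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class AccessRecord:
    """One entry in a subject-readable audit log."""
    subject_id: str    # whose data was touched
    accessor: str      # which holder, employee, or service provider
    data_item: str     # which field or record
    timestamp: datetime
    purpose: str       # declared reason for the access


def accesses_for_subject(log: List[AccessRecord], subject_id: str) -> List[AccessRecord]:
    """Let a subject check who accessed which of their data, and when."""
    return [rec for rec in log if rec.subject_id == subject_id]


log = [
    AccessRecord("alice", "crm-service", "date_of_birth",
                 datetime(2015, 3, 2, 9, 15), "account verification"),
    AccessRecord("bob", "marketing-team", "purchase_history",
                 datetime(2015, 3, 2, 10, 40), "campaign targeting"),
]

for rec in accesses_for_subject(log, "alice"):
    print(rec.accessor, rec.data_item, rec.timestamp, rec.purpose)
```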

The current rate of innovation in the mobile phone application environment is high and as a result “data is no longer gathered solely by large organizations or governments with established data practices” (Shilton 2009, 51). This leads to an increasing need for privacy protections to be built into mobile apps. Shilton (2009) suggests three additional principles to guide application developers:

• Participant primacy: Subjects must have as much control over their data as possible, including use of a personal data vault to collect all sensing data.
• Data legibility: Methods are required to help subjects visualize large amounts of granular data.
• Longitudinal engagement: Subjects’ information needs change over time, and developers should enable changing preferences, including retention and deletion of data.

Handling practices should also be adopted for surveillance data such as license plate recognition scans. Policies should limit the retention period for data that was not matched to a database when first scanned; after this time, the data should be deleted or anonymized (Newell 2014), as in the sketch below.
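A minimal sketch of such a retention rule is shown here, assuming a simple in-memory list of scan records. The field layout and the 90-day window are illustrative assumptions, not requirements drawn from Newell (2014).

```python
from datetime import datetime, timedelta

# Each scan: (plate, scanned_at, matched_to_watchlist)
scans = [
    ("ABCD 123", datetime(2015, 1, 5), False),
    ("WXYZ 987", datetime(2015, 1, 5), True),
    ("QRST 456", datetime(2015, 4, 1), False),
]

RETENTION = timedelta(days=90)  # illustrative retention window
now = datetime(2015, 4, 15)

# Keep matched hits for investigations; drop unmatched scans once the
# retention window has elapsed (deletion; anonymization is an alternative).
retained = [s for s in scans
            if s[2] or now - s[1] <= RETENTION]

print(retained)  # the stale, unmatched scan from January is gone
```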

Usable Security

Human beings constitute the weakest link in any security system. Cognitive limitations must be recognized when designing user interfaces and writing terms of use and privacy settings (Bahr and Allen 2013). Usable security requires adequate user models of secure systems and “an ecosystem that makes it easy for senders to become accountable and for receivers to demand it” (Lampson 2009, 27). When users require more freedom, virtualization may be used to segregate risky activities. However, because the true costs of secure systems, including the time that users spend trying to comply with or bypass them, are unknown, Lampson (2009) argues that security vendors have no incentive to make their products usable.

Conclusion

Our research shows that private market remedies present a moral hazard and do not allocate losses in a way that would reduce overall social costs. This laissez-faire approach to breach-related risks ignores some significant externalities. Potential breaches of personal information databases are a complex problem that must be confronted; resolving it is in everyone's best interest. However, the remedies suggested here are limited. Further interdisciplinary studies should examine the implications of the increasing privacy intrusiveness of information technology, as well as additional remedies worth exploring such as litigation and holder governance, auditing and monitoring.

Acknowledgements

This research was made possible by the financial support of the Goodman School of Business. We would like to thank Jean Vachon, Ph.D., and Thérèse Villeneuve, M.A., for reading and commenting on the first version of this paper, as well as the two anonymous scholars who kindly reviewed this article.


REFERENCES Acquisti, Alessandro. "The Economics of Personal Data and the Economics of Privacy." The Economics of Personal Data and Privacy: 30 Years after the OECD Privacy Guidelines. The Organisation for Economic Co-operation and Development, last modified 2010/12/01, accessed 2014/03/01, http://www.oecd.org/sti/ieconomy/46968784.pdf. ———. "Why Privacy Matters?" TEDGlobal 2013. TED, last modified 2013/10/14, accessed 2014/02/23, http://new.ted.com/talks/alessandro_acquisti_why_privacy_matters#. Acquisti, Alessandro, Allan Friedman, and Rahul Telang. 2006. "Is there A Cost to Privacy Breaches? An Event Study." In Twenty Seventh International Conference on Information Systems, 1563-1580. Milwaukee: Association for Information Systems. Al-Kassab, Jasser, Frederic Thiesse, and Thomas Buckel. 2013. "RFID-Enabled Business Process Intelligence in Retail Stores: A Case Report." Journal of Theoretical and Applied Electronic Commerce Research 8 (2): 112-137. doi:10.4067/S071818762013000200010. Anderson, Keith B., Erik Durbin, and Michael A. Salinger. 2008. "Identity Theft." Journal of Economic Perspectives 22 (2): 171-192. doi:10.1257/jep.22.2.171. Archer, Norm, Susan Sproule, Yufei Yuan, Ken Guo, and Junlian Xiang. 2012. Identity Theft and Fraud: Evaluating and Managing Risk. Ottawa, ON, Canada: University of Ottawa Press. Bahr, Gisela S. and William H. Allen. 2013. "Rational Interfaces for Effective Security Software: Polite Interaction Guidelines for Secondary Tasks." In Universal Access in HumanComputer Interaction. Design Methods, Tools, and Interaction Techniques for eInclusion, edited by Margherita Antona Constantine Stephanidis, 165-174. Berlin Heidelberg: Springer. Bauer, Johannes M. and Michael J. G. van Eeten. 2009. "Cybersecurity: Stakeholder Incentives, Externalities, and Policy Options." Telecommunications Policy 33 (10-11): 706-719. doi:10.1016/j.telpol.2009.09.001. Beales, J. Howard and Timothy J. Muris. 2008. "Choice Or Consequences: Protecting Privacy in Commercial Information." The University of Chicago Law Review 75 (1): 109-135. doi:10.2307/20141902. Bellovin, Steven M. 2010. "Identity and Security." IEEE Security & Privacy Magazine 8 (2): 8888. doi:10.1109/MSP.2010.71. Braid, Don. "Law that Hides Massive Health Privacy Breach from Patients is Useless." Calgary Herald., last modified 2014/01/23, accessed 2014/02/04, http://www.calgaryherald.com/that+hides+massive+health+privacy+breach+from+patie nts+useless/9423777/story.html. Buffett, Mary. "Big Data Breaches - the Shape of Things to Come." Huff Post Business., last modified 2014/02/04, accessed 2014/02/14, http://www.huffingtonpost.com/marybuffeet/big-data-breachesthe-shap_b_4726105.html. Calo, M. Ryan. 2011. "The Boundaries of Privacy Harm." Indiana Law Journal 86 (3): 11311162. Campbell, Katherine, Lawrence A. Gordon, Martin P. Loeb, and Lei Zhou. 2003. "The Economic Cost of Publicly Announced Information Security Breaches: Empirical Evidence from the Stock Market." Journal of Computer Security 11 (3): 431-448. Cavoukian, Ann. "A Discussion Paper on Privacy Externalities, Security Breach Notification and the Role of Independent Oversight." Discussion Papers. Information and Privacy Commissioner, Ontario, Canada, last modified 2009/06/24, accessed 2014/03/01, http://www.ipc.on.ca/english/Resources/Discussion-Papers/Discussion-PapersSummary/?id=916.


Cavusoglu, Huseyin, Birendra Mishra, and Srinivasan Raghunathan. 2004. "The Effect of Internet Security Breach Announcements on Market Value: Capital Market Reactions for Breached Firms and Internet Security Developers." International Journal of Electronic Commerce 9 (1): 69-104. Charney, Theodore P. "Privacy Law in Canada: Regulatory Framework, Claims for Damages and Class Actions." Cyber Risk & Privacy Liability Forum. Litigation Conferences, last modified 2013/06/07, accessed 2014/02/28, http://litigationconferences.com/wpcontent/uploads/2013/06/Fri-1045-Draft-of-Paper-for-NetDiligence-PrivacyConference-FINAL1.pdf. Chen, Christina. "Expiring Data, Better Privacy, on-Line Trust – Consumers Demand Change!" reform. Reform Digital Limited, last modified 2014/2/5, accessed 2014/02/10, http://www.reformdigital.com/blog/expiring-data-better-privacy-on-line-trustconsumers-demand-change. Coase, R. H. 1960. "The Problem of Social Cost." Journal of Law and Economics 3 (Oct.1960): 1-44. Coleman, Jules L. 1980. "Efficiency, Exchange, and Auction: Philosophic Aspects of the Economic Approach to Law." California Law Review 68 (2): 221-249. Colobran, Miquel and Josep M. Basart. 2014. "Computer Security Crisis: Framework Proposal." Paper presented at the 10th International Conference on Technology, Knowledge and Society, Universidad Complutense de Madrid, Spain, Common Ground Publishing, 6-7 February 2014. Cooter, Robert D. and Edward L. Rubin. 1987. "A Theory of Loss Allocation for Consumer Payments." Texas Law Review 66 (1): 63-130. Dingman, Shane. "Why Your Smartphone is Telling this Toronto Tech Firm all about You." The Globe and Mail Tech News., last modified 2014/01/14, accessed 2014/03/14, http://www.theglobeandmail.com/technology/tech-news/why-your-smartphone-istelling-this-toronto-tech-firm-all-about-you/article16327257/. Drinkwater, Doug. "AIG: Cyber Insurance Sales have Risen by 30%." SC Magazine. Haymarket Media, last modified 2014/01/16, accessed 2014/02/27, http://www.scmagazineuk.com/ aig-cyber-insurance-sales-have-risen-by-30/article/329623/. Dwoskin, Elizabeth. "What Secrets Your Phone is Sharing about You." The Wall Street Journal. Dow Jones and Company, last modified 2014/01/14, accessed 2014/02/27, http://online.wsj.com/news/articles/SB100014240527023034530045792906321289291 94. EC. "Factsheet on the "Right to be Forgotten" Ruling (C-131/12)." Justice - Protection of Personal Data. European Commission, last modified 2014/06/02, accessed 2014/06/06, http://ec.europa.eu/justice/dataprotection/files/factsheets/factsheet_data_protection_en.pdf. Facciolo, Francis J. 2008. "Unauthorized Payment Transactions and Who should Bear the Losses." Chicago-Kent Law Review 83 (2): 605-631. Facebook. "Why do I Need to Provide My Birthday?" Welcome to Facebook - Sign-up., last modified 2013/11/15, accessed 2014/01/27, https://www.facebook.com/. Faulkner, Brandon. 2007. "Hacking into Data Breach Notification Laws." Florida Law Review 59 (5): 1097-1125. Garber, Lee. 2013. "Security, Privacy, Policy, and Dependability Roundup." IEEE Security & Privacy 11 (2): 6-7. doi:10.1109/MSP.2013.40. Garrison, Chlotia Posey and Matoteng Ncube. 2011. "A Longitudinal Analysis of Data Breaches." Information Management & Computer Security 19 (4): 216-230. doi:10.1108/09685221111173049. Gatzlaff, Kevin M. and Kathleen A. McCullough. 2010. "The Effect of Data Breaches on Shareholder Wealth." Risk Management and Insurance Review 13 (1): 61-83.


Google. "About Google Web History." Accounts Help. Google, last modified 2014/01/01, accessed 2014/01/23, https://support.google.com/accounts/answer/54068?hl=en&rd=1. ———. "Google Terms of Service." Policies and Principles. Google, last modified 2013/11/21, accessed 2014/01/23, www.google.com/intl/en/policies/terms/. Gordon, Marcy and Liedtke, Micheal. "Google, Facebook, Twitter Send Letter to Obama about Government Snooping." Huff Post Business., last modified 2013/12/09, accessed 2014/01/22, http://www.huffingtonpost.ca/2013/12/09/google-facebook-letter-obamasnooping_n_4411208.html. Gupta, Manish and Raj Sharman. 2012. "Determinants of Data Breaches: A CategorizationBased Empirical Investigation." Journal of Applied Security Research 7 (3): 375-395. doi:10.1080/19361610.2012.686098. Higginbotham, Stacey. "The Internet of Things Needs a New Security Model. Which One Will Win?" Gigaom., last modified 2014/01/22, accessed 2014/02/27, http://gigaom.com/2014/01/22/the-internet-of-things-needs-a-new-security-modelwhich-one-will-win/. Hodge, Matthew J. 2006. "The Fourth Amendment and Privacy Issues on the “New” Internet: Facebook.Com and Myspace.Com." Southern Illinois University Law Journal 3 (1): 95123. Hoffman, Donna L., Thomas P. Novak, and Marcos Peralta. 1999. "Building Consumer Trust Online." Communications of the ACM 42 (4): 80-85. doi:10.1145/299157.299175. Holmes, Nancy and Valiquet, Dominique. "Legislative Summary of Bill S-4: An Act to Amend the Criminal Code (Identity Theft and Related Misconduct)." Library of Parliament Research Publications. Parliament of Canada, last modified 2009/06/05, accessed 2014/03/03, http://www.parl.gc.ca/Content/LOP/LegislativeSummaries/40/2/s4-e.pdf. Javelin Strategy & Research. "2013 Identity Fraud Report: Data Breaches Becoming a Treasure Trove for Fraudsters." Greenwich Associates, last modified 2013/02/28, accessed 2014/02/19, https://www.javelinstrategy.com/uploads/web_brochure/1303.R_2013 IdentityFraudBrochure.pdf;. Joerling, Jill. 2010. "Data Breach Notification Laws: An Argument for a Comprehensive Federal Law to Protect Consumer Data." Washington University Journal of Law & Policy 32: 467-488. Kierkegaard, Patrick. 2012. "Medical Data Breaches: Notification Delayed is Notification Denied." Computer Law & Security Review 28 (2): 163-163-183. doi:10.1016/j.clsr.2012.01.003. Kirk, Jeremy. "Pacemaker Hack can Deliver Deadly 830-Volt Jolt." Computerworld. IDG Enterprise, last modified 2012/10/17, accessed 2014/02/27, http://www.computerworld.com/s/article/9232477/Pacemaker_hack_can_deliver_deadly _830_volt_jolt. Kosinski, Michal, David Stillwell, and Thore Graepel. 2013. "Private Traits and Attributes are Predictable from Digital Records of Human Behavior." Proceedings of the National Academy of Sciences 110 (8): 5802-5805. doi:10.1073/pnas.1218772110. Kshetri, Nir. 2009. "Positive Externality, Increasing Returns, and the Rise in Cybercrimes." Communications of the ACM 52 (12): 141-144. doi:10.1145/1610252.1610288. Lampson, Butler. 2009. "Privacy and Security: Usable Security: How to Get It." Communications of the ACM 52 (11): 25-27. doi:10.1145/1592761.1592773. Langheinrich, M. 2001. "Privacy by Design - Principles of Privacy-Aware Ubiquitous Systems." In Ubicomp 2001: Ubiquitous Computing, edited by G. D. Abowd, B. Brumitt and S. Shafer, 273-291. Berlin Heindelberg: Springer. Lawford, John and Lo, Janet. "Data Breaches: Worth Noticing?" PIAC.ca. The Public Interest Advocacy Centre, last modified 2012/01/31, accessed 2013/06/17,


http://www.piac.ca/files/data_breaches_worth_noticing_publication_version_final_final .pdf. Lee, Hyensoo, Sung Jun Park, Mi Jeong Kim, Ji Yea Jung, Hae Won Lim, and Jeag Tai Kim. 2012. "The Service Pattern-Oriented Smart Bedroom Based on Elderly Spatial Behaviour Patterns." Indoor and Built Environment 22 (1): 299-308. doi:10.1177/1420326X12469712. LSUC. "Real Estate Practice Guide for Lawyers." Lawyer Practice Guides. The Law Society of Upper Canada, last modified 2010/06/01, accessed 2014/03/03, http://www.lsuc.on.ca/WorkArea/DownloadAsset.aspx?id=2147491160. Marron, Donncha. 2008. "‘Alter Reality’: Governing the Risk of Identity Theft." British Journal of Criminology 48 (1): 20-38. doi:10.1093/bjc/azm041. Marwick, Alice E. 2014. "How Your Data are being Deeply Mined." The New York Review of Books 61 (1): 1-7. McGee, Matt. "The Filter Bubble: What the Internet is Hiding from You." Marketing Land. Third Door Media, Inc., last modified 2013/08/16, accessed 2014/02/24, http://marketingland.com/edgerank-is-dead-facebooks-news-feed-algorithm-now-hasclose-to-100k-weight-factors-55908. Mercuri, Rebecca T. 2006. "Scoping Identity Theft." Communications of the ACM 49 (5): 17-21. doi:10.1145/1125944.1125961. Morea, Dom, Christiansen, Philip, Dragt, Bruce and Randolf, G. R. "EMV in the U.S.: Putting it into Perspective for Merchants and Financial Institutions." Insights. First Data, last modified 2011/11/18, accessed 2014/03/03, http://files.firstdata.com/downloads/ thought-leadership/EMV_US.pdf. Newell, Bryce Clayton. 2014. "Local Law Enforcement Jumps on the Big Data Bandwagon: Automated License Plate Recognition Systems, Information Privacy, and Access to Government Information." Maine Law Review 66 (2): 398-436. Odella, Francesca. 2010. "Exploring the Realm of Privacy: The Institutional Process of the European Personal Data Privacy Directive." The International Journal of Technology, Knowledge and Society 6 (6): 203-218. OPC. "About the Office of the Privacy Commissioner." Office of the Privacy Commissioner of Canada, last modified 2012/12/05, accessed 2014/03/06, http://www.priv.gc.ca/auans/index_e.asp. ———. "Information about Privacy Breaches and how to Respond." Resources. Office of the Privacy Commissioner of Canada, last modified 2008/11/05, accessed 2014/03/06, http://www.priv.gc.ca/resource/pb-avp/pb-avp_intro_e.asp. Pariser, Eli. 2011. The Filter Bubble: What the Internet is Hiding from You. New York NY: The Penguin Press. Payton, Anne M. 2006. "Data Security Breach: Seeking a Prescription for Adequate Remedy." InfoSecCD '06 Proceedings of the 3rd Annual Conference on Information: 162-167. doi:10.1145/1231047.1231084. Petrashek, Nathan. 2010. "The Fourth Amendment and the Brave New World of Online Social Networking." Marquette Law Review 93 (4): 1494-1528. Presse Canadienne. "Bell Canada Pointé Du Doigt Pour La Surveillance De Ses Clients." La Presse: Techno. Groupe Gesca, last modified 2014/01/27, accessed 2014/02/26, http://techno.lapresse.ca/nouvelles/mobilite/201401/27/01-4732866-bell-canada-pointedu-doigt-pour-la-surveillance-de-ses-clients.php. Productivity Commission. 1998. "Quasi-Regulation." Chap. 5, In Regulation and its Review. Vol. 1997-98, 49-58. Canberra: AusInfo. http://www.pc.gov.au/__data/assets/pdf_file/ 0006/99150/07-chapter5.pdf.


Proofpoint. "Proofpoint Uncovers Internet of Things (IoT) Cyberattack." News and Events: Press Releases. Proofpoint Inc., last modified 2014/01/16, accessed 2014/01/23, http://www.proofpoint.com/about-us/press-releases/01162014.php. Rao, Justin M. and David H. Reilly. 2012. "The Economics of Spam." Journal of Economic Perspectives 26 (3): 87-110. doi:10.1257/jep.26.3.87. RCMP. "Identity Theft and Fraud." Scams and Fraud. Royal Canadian Mounted Police, last modified 2013/10/01, accessed 2014/01/25, http://www.rcmp-grc.gc.ca/scamsfraudes/id-theft-vol-eng.htm. Rode, Lilia. 2007. "Database Security Breach Notification Statutes: Does Placing the Responsibility on the True Victim Increase Data Security?" Houston Law Review 43 (Spring): 1597-1634. Romanosky, Sasha, David Hoffman, and Alessandro Acquisti. 2012. "Empirical Analysis of Data Breach Litigation." In Workshop on the Economics of Information Security WEIS 2012, edited by Rainer Böhme, Gert G. Wagner and Nicola Jentzsch. Vol. 11, 1-31. Berlin: DIW. http://weis2012.econinfosec.org/papers/Romanosky_WEIS2012.pdf. Romanosky, Sasha, Rahul Telang, and Alessandro Acquisti. 2011. "Do Data Breach Disclosure Laws Reduce Identity Theft?" Journal of Policy Analysis and Management 30 (2): 256286. doi:10.1002/pam.20567. Rosemann, Micheal. 2013. "The Internet of Things: New Digital Capital in the Hands of Customers." Business Transformation Journal 2013 (9): 6-25. Sakai, Makoto. 2014. "Regulation and Democratization of the Information Society." Paper presented at the 10th International Conference on Technology, Knowledge and Society, Universidad Complutense de Madrid, Spain, 6-7 February 2014. Salidek, Adam and John Krumm. 2012. "Far Out: Predicting Long-Term Human Mobility." In Proceedings of the Twenty-Sixth AAAI Conference on Artificial Intelligence, edited by Dieter Fox, Jorg Hoffman and Bart Selman. Toronto, Canada: AAAI Press. Shaar, Peter. 2010. "Privacy by Design." Identity in the Information Society 3 (2): 267-274. doi:10.1007/s12394-010-0055-x. Shay, Richard, Iulia Ion, Robert W. Reeder, and Sunny Consolvo. 2014. "'My Religious Aunt Asked Why I was Trying to Sell Her Viagra': Experiences with Account Hijacking." CHI '14 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: 2657-2666. doi:10.1145/2556288.2557330. Shilton, Katie. 2009. "Four Billion Little Brothers? Privacy, Mobile Phones, and Ubiquitous Data Collection." Communications of the ACM 52 (11): 48-53. doi:10.1145/1592761.1592778. Solove, Daniel J. 2003. "Identity Theft, Privacy, and the Architecture of Vulnerability." Hastings Law Journal 54 (April): 1-46. doi:10.2139/ssrn.416740. Solove, Daniel J. and Chris Jay Hoofnagle. 2006. "Model Regime of Privacy Protection, A." University of Illinois Law Review 2006 (2): 357-404. Sovern, Jeff. 2004. "Stopping Identity Theft." Journal of Consumer Affairs 38 (2): 233-243. doi:10.1111/j.1745-6606.2004.tb00866.x. Stevens, Gina. "Federal Information Security and Data Breach Notification Laws." DTIC Online: Information for the Defense Community. Congressional Research Service, Library of Congress, last modified 2009/01/29, accessed 2013/06/17, http://www.dtic.mil/get-trdoc/pdf?AD=ADA509890. Vamialis, Anna. 2013. "Online Service Providers and Liability for Data Security Breaches." Journal of Internet Law 16 (11): 23-33. Volz, Dustin. "Target Data Breach has Congress Eying Data-Security Alternatives." National Journal-Technology. 
National Journal Group Inc., last modified 2014/02/05, accessed 2014/03/06, http://www.nationaljournal.com/technology/target-data-breach-hascongress-eying-data-security-alternatives-20140205.


Weston, Greg, Greenwald, Glenn and Gallagher, Ryan. "CSEC used Airport Wi-Fi to Track Canadian Travellers: Edward Snowden Documents." CBCNews - Politics. CBC RadioCanada, last modified 2014/01/31, accessed 2014/02/04, http://www.cbc.ca/news/politics/csec-used-airport-wi-fi-to-track-canadian-travellersedward-snowden-documents-1.2517881. Whitson, Jennifer R. and Kevin D. Haggerty. 2008. "Identity Theft and the Care of the Virtual Self." Economy & Society 37 (4): 572-594. doi:10.1080/03085140802357950. WMUR. "Police: Thieves Robbed Homes Based on Facebook, Social Media Sites." WMUR New Hampshire 9. Manchester Hearst Properties Inc., last modified 2010/09/23, accessed 2014/02/24, http://www.wmur.com/Police-Thieves-Robbed-Homes-Based-OnFacebook-Social-Media-Sites/11861116. Yassine, Abdulsalam, Ali A. N. Shirehjini, Shervin Shirmohammadi, and Thomas T. Tran. 2012. "Knowledge-Empowered Agent Information System for Privacy Payoff in eCommerce." Knowledge Information Systems 32 (2): 445-473. doi:10.1007/s10115011-0415-3.

ABOUT THE AUTHORS

Dr. Susan Sproule: Assistant Professor, Goodman School of Business, Brock University, St. Catharines, Ontario, Canada

Dr. Francine Vachon: Associate Professor, Goodman School of Business, Brock University, St. Catharines, Ontario, Canada


Journal of Technologies and Human Usability is one of the four thematically focused journals that comprise the Technology Collection and support the Technology, Knowledge, and Society knowledge community—its journals, book series, and online community. The journal focuses on re-examining the connections between technology, knowledge and society. It looks at human-technology interactions, informatics, cybernetics, artificial intelligence, new media, and new communications channels.

Journal of Technologies and Human Usability is a peer-reviewed scholarly journal.

ISSN 2381-9227