Wearable devices and healthcare: Data sharing and privacy

Syagnik (Sy) Banerjee, Thomas Hemphill, and Phil Longstreet

School of Management, University of Michigan-Flint, Flint, Michigan, USA

The Information Society, 2018, Vol. 34, No. 1, 1-9. https://doi.org/10.1080/01972243.2017.1391912. Published online: 27 Dec 2017.

CONTACT: Syagnik (Sy) Banerjee, syban@umflint.edu, School of Management, University of Michigan-Flint, 2120 Riverfront Center, 303 East Kearsley Street, Flint, Michigan 48502-1950, USA.


ABSTRACT

Wearable devices introduce many new capabilities to the delivery of healthcare. But wearables also pose grave privacy risks. Furthermore, information overload gets in the way of informed consent by the patient. To better protect American patients in an increasingly digital world, the U.S. Congress passed the Health Insurance Portability and Accountability Act of 1996 (HIPAA). This article examines the adequacy of HIPAA vis-à-vis issues raised by wearable technologies in the Internet of Things environment and identifies policy gaps and factors that drive health data exposure. It presents a 2 × 2 Partnership-Identity Exposure Matrix, illustrates implications in four different contexts, and provides recommendations for improving privacy protection.

ARTICLE HISTORY: Received 28 October 2016; Accepted 14 September 2017

KEYWORDS: HIPAA; informed consent; Internet of Things; privacy; regulatory policy; wearables

Introduction

Wearable technologies are anticipated to have a major impact in the health sector. With wearables like Vital Connect's HealthPatch MD, physicians can remotely receive updates on patients' vitals (Terry 2014). For patients with amyotrophic lateral sclerosis (ALS) or other neurodegenerative diseases, wearables under development could scan their brainwaves, thoughts, feelings, and expressions and generate alerts and commands to electronic devices in the room (e.g., televisions, lighting) (Blum and Dare 2015). Wearable devices are also changing healthcare for obesity, cardiovascular disease, asthma, and Alzheimer's disease, as well as in-hospital monitoring. They enable better patient monitoring, drug management, asset monitoring and tracking, and early medical interventions. In general, physicians, insurers, patients, and caregivers are anticipated to have unparalleled access to information (FTC Staff 2015). But wearables also pose grave privacy risks, with potentially very serious consequences, including reduction in life chances. On the one hand, companies can gather and trade data gleaned from smartphone sensors and wearables to learn of moods, stress levels, habits, well-being, sleep patterns, exercise, and movement, and use these insights to make credit, insurance, and employment decisions, compromising privacy and possibly even creating barriers to healthcare. On the other hand, information overload gets in the way of informed consent (Cate and Mayer-Schönberger 2013). The value of personal
information is often not known when it is collected (i.e., when notice and consent are normally given); moreover, the relationship between users and processors of personal data has become increasingly complicated as datasets are combined, transferred, shared, or sold. Consent notices that do not disclose the identity of third parties who can access user data forestall consumers' ability to provide genuinely "informed" consent. Additionally, consent notices are often written so broadly, or in such voluminous detail, that they inhibit the user's comprehension and thus render "conscious choice" meaningless. Such notices, which create the illusion of consent, often distract users from their own privacy protection (Cate 2010). In effect, consumers do not really share but "surrender" information (Walker 2016). Data collected by healthcare wearables are subject to various legal protections. There are sector-specific laws, such as the Americans with Disabilities Act, the Children's Online Privacy Protection Act, and the Fair Credit Reporting Act, not to mention federal and state laws on access to healthcare insurance and consumer discrimination. On the regulatory front, the U.S. Food and Drug Administration (FDA) regulates medical devices but not wearable devices, although there have been calls to include wearable devices in the FDA's regulatory purview (Future of Privacy Forum 2016). To better protect patients in an increasingly digital world, the U.S. Congress passed the Health Insurance Portability and Accountability Act of 1996 (HIPAA). This article examines the adequacy of
HIPAA vis-à-vis issues raised by wearable technologies in the Internet of Things (IoT) environment.1 The rest of the article proceeds as follows: first, a taxonomy of healthcare IoT technologies is presented; second, the interests of different stakeholders in the IoT ecosystem are identified and explained; third, the factors driving health data exposure are analyzed; fourth, implications in different types of scenarios are drawn; and last, recommendations are offered to improve privacy protection.


Wearables in healthcare

Applying the framework proposed by Healey, Pollard, and Woods (2015), healthcare devices are categorized as wearable health monitoring devices, medical wearable devices, medical embedded devices, and stationary medical devices (see Figure 1 below). Wearable health monitoring devices are consumer products (e.g., Fitbit, Fuelband). Medical wearable devices are prescribed devices (e.g., insulin pumps). Medical embedded devices are implanted into the body (e.g., pacemakers) (Taylor 2016). Stationary medical devices are designed for use in a specific physical location (e.g., chemotherapy dispensing stations for home-based healthcare). Each of the devices in these categories has some of the following attributes: identification (a serial number providing positive identification of the user), location (geospatial coordinates), sensing (sensors that produce alert warning signals), and connectivity (the ability to connect with other devices and share information) (Dohr et al. 2010). Healthcare wearables are forecast to save 1.3 million lives by 2020 (Dolan 2014b), and the market is projected to grow to $72.7 billion by 2021 (Malik 2016). As demand for these IoT wearable devices has increased in both consumer and medical markets, the number of entities interested in them has also increased. They are broadly categorized as Healthcare, Bystander Beneficiaries,

Figure 1. Healthcare IoT Typology.

Hardware and Infrastructure, and Watchdogs. Healthcare IoT entities are those directly involved in the use of devices in medical treatments. They include healthcare providers (e.g., doctors, hospitals, and laboratories) as well as patients who consent to the use of these devices. Bystander Beneficiary IoT entities are those that seek to capture market value by utilizing advances in healthcare and well-being related IoT technologies without investing in the delivery process (e.g., employers, insurance companies, data brokers, and marketers). In other words, they are users of the information provided by the IoT devices but are neither involved in manufacturing, selling, or supporting the devices, nor in the information creation or delivery process. Infrastructure IoT entities are those that make, sell, support, facilitate, or connect the devices (e.g., device manufacturers, infrastructure providers, tech support providers, and app developers). Finally, Watchdog IoT entities are primarily interested in ensuring that technology specifications and ethical practices are met in the use of IoT technologies (e.g., the Food and Drug Administration, the Federal Trade Commission, and the Future of Privacy Forum).

Several organizations have initiated corporate wellness programs that include the use of wearable devices. Oscar Health Insurance has offered its customers a $1 credit toward an Amazon gift card every time they reach a daily step count goal (Bertoni 2014). Cigna has distributed armbands to thousands of its employees. Self-insured employers, like Autodesk, have purchased Fitbit trackers and distributed them to their employees (Olson 2014). Oil giant BP has provided incentives to employees for wearing a Fitbit and permitting the company to track their daily work steps (Olson 2014). Some employers are using Wildflower, an application for pregnant women that measures weight gain at pregnancy milestones, to reduce maternity-related medical costs (Olson 2014). Conversely, some companies are using punitive measures to change behavior. For example, stickK believes that taking away wellness points, a disincentive, is more effective than offering rewards (Olson 2014). Some self-insured companies are planning to add a $50 surcharge to the insurance premiums of employees who smoke, and CVS has been charging a $50 monthly fee to employees who refuse to report their weight, body fat, cholesterol, blood pressure, and blood sugar levels. Overall, improved wellness compliance is emerging, but it raises vexing ethical questions about the use of the data being captured.

Such efforts rest on the assumption that increasing health costs are a product of individual lifestyle choices. In contrast, there exists another perspective that takes a systemic view. In this perspective, supposed personal
“choice” is viewed by scholars as a product of the environment (Alkon and Agyeman 2011; Langston 2010) and of ecologies that are “inescapable” (e.g., prenatal exposure to toxins can lead to outcomes like obesity) (Nash 2006). Guthman (2011) argues that obesity is a symptom of a greater problem, one that can only be solved by challenging the cultural values or economic interests of powerful entities invested in existing socioeconomic systems. Biltekoff (2007) goes a step further, treating the very injunction that individuals make responsible choices to protect their own health, instead of relying on government protections and services, as a product of neoliberal ideologies. In practice, however, the individual lifestyle choices narrative informs most analyses and recommendations (Bipartisan Policy Center 2012; The Physicians Foundation 2012).

Health insurance portability and accountability act

The Health Insurance Portability and Accountability Act of 1996, or HIPAA (U.S. Government Publishing Office 2013), Public Law 104-191, was enacted on August 21, 1996. Sections 261 through 264 of HIPAA require the U.S. Secretary of Health and Human Services (HHS) to publicize standards (also known as the Administrative Simplification provisions) for the electronic exchange, privacy, and security of patient health information. The HHS Standards for Privacy of Individually Identifiable Health Information administrative rule ("Privacy Rule") was published on August 14, 2002, and codified at 45 CFR Parts 160 and 164. The primary goal of the Privacy Rule is to protect patients' health information while allowing the flow of such information to "covered entities," i.e., healthcare providers, health plans, healthcare clearinghouses, and relevant associates (U.S. Department of Health and Human Services 2003). In addition, HHS published the Security Standards for the Protection of Electronic Protected Health Information ("Security Rule"), which establishes a national set of security standards for protecting certain health information that is held or transferred in electronic form (Scholl et al. 2008). A major goal of the Security Rule is to protect the privacy of patients' health information while allowing covered entities to adopt new technologies that improve the quality and efficiency of patient care. Obtaining written permission from individuals to use and disclose their protected health information for treatment, payment, and healthcare operations is optional under the Privacy Rule for all covered entities (U.S. Department of Health and Human Services 2003). However, HIPAA, and specifically the Privacy Rule, requires
that a covered entity obtain the patient's written authorization for any use or disclosure of protected health information that is not for treatment, payment, or healthcare operations, or otherwise permitted or required by the Privacy Rule (U.S. Department of Health and Human Services 2003). On the surface, it seems that for patients whose healthcare provider uses wearable devices to monitor their health, any data shared with business associates are well protected under the Privacy Rule (45 CFR 164.504(e)). Business associate functions and activities include claims processing or administration; data analysis, processing, or administration; utilization review; quality assurance; billing; benefit management; practice management; and repricing. Business associate services include legal, actuarial, accounting, consulting, data aggregation, management, administrative, accreditation, and financial services. Similarly, under Title XIII of the American Recovery and Reinvestment Act of 2009, the Health Information Technology for Economic and Clinical Health (HITECH) Act imposes security requirements: it requires business associates to comply directly with Security Rule provisions directing the implementation of administrative, physical, and technical standards for electronic protected health information, to develop and enforce policies and documentation standards, and to report noncompliance and terminate contracts with noncompliant entities; it also requires HHS to conduct compliance audits. However, different problems emerge as far as wearable devices and the adequacy of HIPAA are concerned. Much of the data leakage occurs due to policy gaps and explicit policy violations, as well as legal but self-interest-driven marketplace behavior by covered entities.

Factors increasing health data exposure

In this section, we identify and discuss the factors that drive the exposure of personal health data. This understanding is necessary for developing effective remedies.

1. HIPAA

i) Conditionality of protection: According to the Privacy Rule (45 CFR 160.103), a business associate is defined as a person or organization that performs certain functions or activities involving the use or disclosure of protected health information on behalf of, or in providing services to, a covered entity. In other words, a wearables manufacturer or distributor that neither provides services to any covered entity, nor uses or discloses information on behalf of a covered entity, cannot be subject to the
compliance requirements of a business associate as defined under HIPAA. Although the healthcare sector has initiated additional regulatory safeguards, such as the Omnibus Rule, increased penalties, breach notification standards, and incorporation of additional protected health information (PHI) types (U.S. Government Publishing Office 2013), these safeguards continue to apply only to covered entities and business associates, and mHealth technologies remain "noncovered entities" (NCEs). Even if a doctor endorses a particular app, or even if healthcare providers and app developers enter into interoperability arrangements, HIPAA does not protect consumer data. The status of a wearable manufacturer-distributor depends on whether its participation has been contracted by the healthcare provider for patient management services. If the contract indicates that the wearable device manufacturer engages in transmission of data on behalf of the healthcare provider, the manufacturer is considered a covered entity. However, if a patient purchases a wearable device from a retailer, uses it, and voluntarily shares some results with his or her doctor, who then responds with some advisory comments, the company manufacturing the device is not a HIPAA covered entity; consequently, the data collected by the manufacturer are neither bound by nor protected under HIPAA. This situation is similar to sharing data after an automobile accident, when both the healthcare insurer and the automobile insurer receive medical bills: one can be bound by HIPAA while the other may not be, because the health insurer has not partnered with and endorsed the automobile insurer under its own line of business. When the HIPAA guidelines were developed, the increasing diffusion of mobile wearables and the subsequent proliferation of third-party entities were not foreseen as a potential complication. The only way to make wearable device manufacturers or distributors conform to HIPAA is to have them partner with covered entities.

ii) Protected data types: The applicability of HIPAA depends on what fields of information are being collected and shared. The HIPAA Privacy Rule protects most "individually identifiable health information," such as medical records, laboratory reports, and hospital bills, because they can identify the patient by name. The University of California, San Francisco, uses eighteen criteria to define healthcare data covered under HIPAA (Lee 2015), which include names; geographical markers smaller than a state (street address, county, and zip code); elements of dates directly

related to a person (birth, admission, and discharge); phone, fax, and e-mail contacts; social security numbers; numbers related to medical records, health plan beneficiaries, accounts, and certificates and licenses; vehicle identifiers; license plates; IP addresses; device identifiers; web URLs; biometric identifiers; and photographic images. However, many fields of information that appear to be private, such as blood pressure or sleep statistics, can be shared separately as long as they are not tied to unique personal information.

iii) Definition of unique identifiers: Collected health and behavior data can become unique identifiers themselves, even though they are not recognized or categorized as PHI under HIPAA. At a data conference in New York City, Ira "Gus" Hunt, the CIA's chief technology officer, said that a person can be identified with one hundred percent certainty simply by their gait, i.e., how they walk (Kelley 2013). In other words, gradually evolving fields of behavioral data that we merely consider patterns could be as unique as fingerprints, and therefore personally identifiable.

iv) Ability to bypass identification: Even if the concerned entity qualifies as a business associate that is required to comply with HIPAA, in recognition of the potential usefulness of health information when not identifiable, §164.502(d) of the Privacy Rule allows a covered entity or its business associate to create information that is not individually identifiable by following the de-identification standard and implementation specifications in §164.514(a)-(b). Even so, identification can be achieved post-sharing, by creating and tracking unique proxy combination patterns, in effect bypassing the Privacy Rule.
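To make this bypass concrete, the sketch below is a hypothetical Python illustration (the field names, values, and datasets are our own assumptions, not part of HIPAA or any cited study). It shows how a record stripped of Safe Harbor-style identifiers can still be re-identified by matching the combination of remaining behavioral quasi-identifiers, such as gait cadence, against an auxiliary dataset.

```python
# Hypothetical sketch: stripping explicit identifiers does not prevent
# re-identification via combinations of remaining behavioral fields.

# Fields resembling HIPAA Safe Harbor identifiers (illustrative subset).
SAFE_HARBOR_FIELDS = {"name", "street_address", "zip_code", "birth_date",
                      "phone", "email", "ssn", "device_id", "ip_address"}

def deidentify(record: dict) -> dict:
    """Drop explicit identifiers; keep apparently 'harmless' health fields."""
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

def fingerprint(record: dict,
                keys=("gait_cadence", "avg_sleep_hrs", "resting_hr")) -> tuple:
    """A proxy identifier built from behavioral patterns alone."""
    return tuple(record.get(k) for k in keys)

# A de-identified wearable record released to a data broker.
released = deidentify({"name": "J. Doe", "ssn": "000-00-0000",
                       "gait_cadence": 112.4, "avg_sleep_hrs": 5.9,
                       "resting_hr": 71})

# An auxiliary dataset the broker already holds, with identities attached.
auxiliary = [
    {"name": "J. Doe", "gait_cadence": 112.4, "avg_sleep_hrs": 5.9, "resting_hr": 71},
    {"name": "A. Roe", "gait_cadence": 98.1, "avg_sleep_hrs": 7.4, "resting_hr": 62},
]

# Linkage attack: matching the behavioral fingerprint re-identifies the user.
matches = [p["name"] for p in auxiliary if fingerprint(p) == fingerprint(released)]
print(matches)  # ['J. Doe'] -- the "de-identified" record is identifiable again
```

In practice the matching would tolerate measurement noise (e.g., nearest-neighbor search rather than exact equality), but the principle is the same: the combination of patterns is itself the identifier.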
2. E-discovery

Personal health data can be made accessible for legal purposes. In 2014, a law firm in Calgary used activity data from a Fitbit to assess the effect of an accident on its client in a personal injury case (Olson 2014). Similarly, in 2015, a woman's fitness watch disproved her rape report (Snyder 2015). The entry of wearable device data into judicial proceedings through e-discovery has been facilitated by analytics platforms like Vivametrica, which use public health records to compare a specific person's records with those of the general population. Once used in judicial proceedings, this personal health information can become part of publicly searchable information under open records policies or the Freedom of Information Act. Though there can be exemptions to preserve security and privacy, the provisions vary from state to state (Solove 2002).

3. Data sharing modalities

i. Ambiguous legal status of shared data: The existing public policy environment is ambiguous about the future of shared data. If an entity legitimately shares data with a firm (neutral, third party, or otherwise) that undergoes a change in ownership or enters bankruptcy, how is the collected data to be treated? Should it be treated as a firm-owned asset? The terms and conditions in the end user license agreements (EULAs) of wearable devices often indicate that, in the event of the sale of the company or of bankruptcy proceedings, the company reserves the right to sell the data (Maddox 2014). When RadioShack underwent bankruptcy, it asked the court to allow it to sell data to pay off its creditors; AT&T and Apple filed a motion to prevent the sale of data concerning customers who had purchased Apple or AT&T devices (Isidore 2015). Similar ambiguities exist regarding the status of federal government data, which become very apparent when there is a change in administration.

ii. Encryption and hardware connectivity: The existing encryption of wearable device and hardware identifiers by manufacturers is inadequate. Wearable devices are usually synced via Bluetooth Low Energy to wirelessly connect to mobile devices or computers. Security firm Symantec conducted an experiment in which it built portable Bluetooth scanners using Raspberry Pi minicomputers and readily available retail components, which could be purchased and assembled for $75 by any person with basic IT skills. Using the scanners, Symantec employees were able to scan airwaves for signals broadcast from devices and extract device hardware addresses and locations as well as other personal data (Barcena, Wueest, and Lau 2014). While location by itself is not a data field covered by HIPAA, once tied to an identifier it can be combined with air quality and pollution estimates to inform audiences about an individual's potential health risks.
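As a rough illustration of how exposed these broadcasts are, the sketch below uses the third-party Python library bleak (our choice for illustration; Symantec's scanners were custom Raspberry Pi builds, and this is a minimal approximation of the idea, not their tooling) to passively enumerate nearby BLE advertisements.

```python
# Minimal sketch of a passive BLE survey, assuming the third-party
# "bleak" library (pip install bleak). It merely lists what nearby
# devices already broadcast to any listener within radio range.
import asyncio
from bleak import BleakScanner

async def survey(seconds: float = 10.0) -> None:
    # discover() collects advertisement packets for the given window.
    devices = await BleakScanner.discover(timeout=seconds)
    for d in devices:
        # The hardware address and advertised name are visible to anyone;
        # a fixed address lets an observer track the wearer over time.
        print(f"address={d.address}  name={d.name or '(unnamed)'}")

if __name__ == "__main__":
    asyncio.run(survey())
```

The point is not the tooling but the exposure: anything broadcast in the clear can be collected with commodity hardware, which is why randomized, rotating hardware addresses matter.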
iii. Cloud storage vulnerability: Most wearable data transferred between wearables and mobile devices are stored in "clouds." Owing to the lack of an associated physical space and of direct control, cloud storage can be vulnerable to breaches by hackers. In the first quarter of 2015, approximately one hundred million patient records were compromised in the top six attacks, as measured by the volume of compromised patient records (Munro 2015).

iv. Data sovereignty: Since most wearable device applications store the generated data in the cloud, a wide variety of national laws determine how protected the data are and how safely they are stored. Earlier, "safe harbor" regulatory compliance with the European Union's data protection framework required data processing entities to provide sufficient data protection guarantees "irrespective of their location" (Gibbs 2015). However, following the Snowden revelations regarding U.S. intelligence agencies' surveillance of commercial Internet services, the Court of Justice of the European Union struck down the "safe harbor" provision, and the consequent move toward adhering to domestic privacy policies could impede the movement of data across borders (Meyer 2016). Similarly, Indonesia introduced Regulation PP 82/2012, which requires data service providers to store the personal data of its citizens within the country. Australia has also instituted laws and regulations for data carried overseas. Moreover, more explicit rules are expected under the EU General Data Protection Regulation (Bradbury 2015). The increasing push for data sovereignty and adherence to domestic laws, as opposed to international frameworks, could create data vulnerabilities (Berry and Reisman 2012). For instance, start-up application developers operating within tight budgets could sign up with cloud storage providers operating from countries with more "flexible," i.e., less rigorous, regulations.

v. Data brokers: A large, disorganized data broker industry has evolved in the pursuit of profits. In 2012, nine data broker firms analyzed by the FTC generated $426 million in annual revenue (Federal Trade Commission 2014). Unique personal information is a high-priced commodity; it is estimated that the information from wearables stored on a smartphone or in a "cloud" is worth ten times as much as credit card information on the black market (Maddox 2015). A study by the FTC found that twelve health applications that were tested transmitted information to seventy-six different third parties (Dolan 2014a), some of which were data brokers, who aggregate and sell consumer data in the marketplace (Addonizio 2015). What is disturbing is that much of this data consisted of unique device identifiers, social
security numbers, demographics, court and public records, travel, running routes, dietary habits, sleep patterns, addresses, financial information, full names, health information, location, date of birth, and other similar personally identifiable information (Privacy Rights Clearinghouse 2013).

vi. Corporate alliances: Organizations and insurers demand so-called "granular," detailed information, because more detailed behavioral profiles offer insurers more accurate impressions of individual health risks. There have been reports of regional hospitals, insurers, and grocery retailers collaborating to transform buyers' grocery purchase data into health risk profiles (Sherbit 2015).

vii. Liability minimization: Employers who implement healthy work practices monitor employees to ensure improved health and to justify reduced insurance premiums. To mitigate liabilities related to the storage and potential misuse of data accrued while monitoring, employers often introduce neutral third-party firms, such as Staywell or Welltok, to help remove personal health information from employee records (Firestone 2014). There is a potential threat that such neutral third-party firms can "feed" or join the data broker industry, exacerbating misuse concerns and liability.

4. End user license agreements

End user license agreements (EULAs), whose density alone has a significant negative effect on users' willingness to read and comprehend them (Böhme and Köpsell 2010), often overreach, as they are designed without considering the legal enforceability of their terms and conditions. In the process, a wide variety of terms that are not legally enforceable are included in the EULA, further undermining users' ability to understand what they are consenting to when they start using the device.

Health data exposure contexts

In this section, we use Nissenbaum's (2004) contextual integrity framework to detect ethical violations, identify policy gaps, and provide recommendations. To assess whether the storage or transfer of information violates contextual integrity, one must examine four factors (Nissenbaum and Patterson 2016): (1) the prevailing context in which data is generated; (2) the key actors, their roles, and their responsibilities; (3) key information attributes;

and (4) the principles of data transmission between parties. These factors determine the norms of appropriateness (whether it is appropriate for the concerned entities to access and store the information at hand) and distribution (when it is acceptable to share the same information with external parties). We employ two factors from Nissenbaum and Patterson (2016), varying roles and information attributes, to design a simple 2 × 2 Partnership-Identity Exposure Matrix (see Table 1 below) that maps different scenarios of health data exposure.

We start by defining the family of terms included under the umbrella of health information. Health information (HI) is a broad term that includes both protected health information (PHI), which contains unique identifiers that can be directly linked back to the patient, and de-identified health information (DHI), which is not readily identifiable. It is important to clarify that DHI is not necessarily anonymous data; data can vary from completely anonymous to completely identifiable, differing in the extent to which identification is preserved, masked, or eliminated, based on the effort, skill, time, and cost necessary for complete identification (Nelson 2015). Often, a minimal level of identifiability is desired in the data so that health researchers can combine datasets or conduct effectiveness or policy analysis for greater societal benefits. To define this safety standard of identifiability, HIPAA's Privacy Rule provides specific guidelines for alternative methods of de-identification (HHS 2012), thus mitigating privacy risks for patients and enabling safe data sharing practices. Hence, data processed through these methods of de-identification, i.e., DHI, is considered safer to share than PHI. Also, as described earlier, vendors and device manufacturers who have partnered with healthcare service providers are subject to HIPAA regulations as covered entities (CEs) or business associates (BAs); those who have not partnered are not subject to HIPAA regulations and are considered noncovered entities (NCEs). The following scenarios depict contexts where a wearable device manufacturer stores and shares HI with third-party data brokers. The scenarios vary on two factors.

Table 1. Partnership-Identity Exposure Matrix.

                                     Partnership of Device Manufacturer with
                                     Healthcare Service Provider
Identity Exposure Level of
Shared Health Information            Partnered (CE)     Not Partnered (NCE)
Shared DHI                           A (CE, DHI)        C (NCE, DHI)
Shared PHI                           B (CE, PHI)        D (NCE, PHI)


“Partnership” indicates whether or not the wearable device manufacturer has been contracted by a healthcare service provider for patient management services and, as a result, plays the role of a CE or an NCE; “identity exposure” indicates whether or not the shared data were processed through a de-identification method prescribed under the HIPAA privacy rules. The identity exposure level (DHI or PHI) indicates whether or not a user is vulnerable through identifiable information, and the partnership status (CE or NCE) indicates whether or not the user has legal recourse in the event of harm. Given these conditions, scenarios A and C are relatively low risk, since the user is not vulnerable in either case, whether or not legal recourse exists. Scenario B is concerning, as PHI has been shared with a third-party data broker; however, since the device manufacturer is a CE, the user has legal recourse for any violation. Scenario D is the most concerning: user PHI has been shared with a third-party data broker, making the user potentially vulnerable to targeting or manipulation, and the device manufacturer is an NCE, so the user has no legal recourse for any potential harm.

We identify the device manufacturer as the primary party and examine whether it should have access to stored user HI, and then analyze whether it should be allowed to share HI with third-party data brokers. If a device manufacturer is a CE, as a partner it functions as an entity instrumental to healthcare service delivery, which makes it appropriate for it to possess or store user HI. In the case of an NCE, by contrast, the manufacturer is not required to be HIPAA compliant, because the NCE strictly sells or maintains medical devices but does not create, transmit, or receive HI on behalf of any CE. Further, an NCE does not necessarily share the missions and visions of healthcare providers and consequently need not comply with the norms of a healthcare industry to which it does not belong. Thus, an NCE device manufacturer can provide lifestyle-related services that involve storage of users' health and fitness related information, and its access to user HI does not violate the norms of appropriateness because it is not subject to HIPAA. While it may be appropriate for a device manufacturer to store HI, this does not imply that the manufacturer has the right to further distribute stored HI. HIPAA allows covered entities to share HI only when it is not identifiable. In other words, contextual integrity is violated when PHI, which is identifiable, is shared. Such violation of contextual integrity can lead to informational harms (e.g., targeting of vulnerable populations based on sensitive information), informational inequality (e.g., data brokers wielding significant power over the fates of individual users), and disruption of relationships
(e.g., between a service provider and a client), among others (Nissenbaum 2004). Overall, scenario B illustrates a clear violation of contextual integrity, since the device manufacturer is a covered entity (CE) that has shared identifiable PHI in contravention of the requirements of HIPAA. But the most dangerous scenario is D, which can lead to user vulnerability without offering legal recourse, making it an ethical, but not a legal, violation. When HIPAA was enacted, healthcare service providers' medical record documents were the primary, if not the only, sources of health information; this is why nonhealthcare entities were not included in the purview of the law. However, the increased penetration of technologies capable of generating PHI, the lack of laws protecting user data held by NCEs, and the increasing diversity of nonhealthcare providers with access to such information have together increased the risks of consumer data breaches and misuse. In that light, the transfer of PHI from NCEs requires greater scrutiny.
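The exposure logic of Table 1 can be summarized programmatically. The sketch below is our own illustrative encoding (the scenario labels A-D follow the article; the function and field names are our assumptions), mapping a manufacturer's partnership status and the identifiability of the shared data to a scenario and its risk profile.

```python
# Illustrative encoding of the Partnership-Identity Exposure Matrix (Table 1).
# Scenario labels A-D follow the article; names and structure are ours.
from dataclasses import dataclass

@dataclass
class Exposure:
    scenario: str         # A, B, C, or D
    vulnerable: bool      # identifiable PHI was shared
    legal_recourse: bool  # manufacturer is a HIPAA covered entity / associate

def classify(partnered: bool, identifiable: bool) -> Exposure:
    """partnered: manufacturer contracted by a healthcare provider (CE/BA);
    identifiable: the shared data is PHI rather than de-identified DHI."""
    scenario = {
        (True, False): "A",   # CE shares DHI  -> low risk
        (True, True): "B",    # CE shares PHI  -> violation, but recourse exists
        (False, False): "C",  # NCE shares DHI -> low risk
        (False, True): "D",   # NCE shares PHI -> most concerning, no recourse
    }[(partnered, identifiable)]
    return Exposure(scenario, vulnerable=identifiable, legal_recourse=partnered)

print(classify(partnered=False, identifiable=True))
# Exposure(scenario='D', vulnerable=True, legal_recourse=False)
```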

Recommendations

Ideally, many of the problems with the sharing of health data would be addressed by the industry itself. Forms of industry self-regulation range from "pure" self-regulation, where there is no government or other stakeholder involvement in the private regulatory process, to public standard setting or pricing and output setting, where there is a high level of direct government oversight (Garvin 1983). However, the greatest potential for industry self-regulation resides in a mixed form of industry rule-making and government oversight (Gupta and Lad 1983). We therefore make the following recommendations:

1. The U.S. Department of Health and Human Services should create a "watchdog" unit charged with identifying and monitoring the types of new behavioral data that can be captured by wearable technology. Such a unit should examine the extent to which a type of data enables user identification; if that potential is high, the data type should be added to HIPAA's list of unique identifiers. This list should be updated at least annually.

2. Based on Section 5 of the FTC Act, which charges the agency to "prevent unfair or deceptive acts or practices in or affecting commerce," the FTC should require NCE device manufacturers to record and track any transactions involving the sharing, sale, or distribution of information that would be considered PHI under HIPAA, and should appoint third-party agencies or experts to assess and certify parties buying data from NCE device manufacturers.

3. While the EEOC already provides guidelines for operating employee wellness programs and for voluntary (rather than mandatory) transmission of data from employees to employers, it should also monitor the aggregate insurance premiums of participants over time to ensure that employees are not discriminated against.

4. Manufacturers should develop identification protocols for detecting sensitive information and give it enhanced protection in routing, usage, and storage using encryption and other techniques, along the lines sketched below. They should also develop mechanisms for alerting the user.
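As a minimal sketch of what such field-level protection could look like, the example below encrypts designated sensitive fields before storage or routing, assuming the third-party Python cryptography package; the field list and function names are our own assumptions, not a prescribed standard.

```python
# Minimal sketch of field-level encryption for sensitive wearable data,
# assuming the third-party "cryptography" package (pip install cryptography).
# Which fields count as sensitive is an illustrative assumption.
from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"heart_rate", "location", "sleep_hours"}  # illustrative

def protect(record: dict, fernet: Fernet) -> dict:
    """Encrypt sensitive fields; leave the rest readable for routine use."""
    out = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            out[field] = fernet.encrypt(str(value).encode())  # ciphertext bytes
        else:
            out[field] = value
    return out

key = Fernet.generate_key()  # in practice, held by a key service, never inline
f = Fernet(key)
stored = protect({"device_id": "abc123", "heart_rate": 71,
                  "location": "42.0,-83.7"}, f)
print(stored["heart_rate"])                      # opaque token without the key
print(f.decrypt(stored["heart_rate"]).decode())  # '71', only with the key
```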

Note

1. The Internet of Things consists of billions of smart devices communicating with each other. It comprises "'things' such as devices or sensors—other than computers, smartphones, or tablets—that connect, communicate, or transmit information with or between each other through the Internet" (FTC Staff 2015). This networked system of conversing devices and products is growing exponentially, from an estimated two billion devices in 2006 to a forecast fifty billion by 2020 (Evans 2011).

References

Addonizio, G. 2015. The privacy risks surrounding consumer health and fitness apps, associated wearable devices, and HIPAA's limitations. South Orange, NJ: Seton Hall Law (Paper 861). Retrieved from http://scholarship.shu.edu/student_scholarship/861
Alkon, A. H., and J. Agyeman. 2011. Cultivating food justice: Race, class, and sustainability. Cambridge, MA: MIT Press.
Barcena, M. B., C. Wueest, and H. Lau. 2014. How safe is your quantified self. Mountain View, CA: Symantec. Retrieved from https://www.symantec.com/content/dam/symantec/docs/white-papers/how-safe-is-your-quantified-self-en.pdf
Berry, R., and M. Reisman. 2012. Policy challenges of cross-border cloud computing. United States International Trade Commission, Journal of International Commerce and Economics, web version, 1-38. Retrieved from https://www.usitc.gov/journals/policy_challenges_of_cross-border_cloud_computing.pdf
Bertoni, S. 2014, December 8. Oscar Health using Misfit wearables to reward fit customers. Retrieved April 23, 2017, from http://www.forbes.com/sites/stevenbertoni/2014/12/08/oscar-health-using-misfit-wearables-to-reward-fit-customers/
Biltekoff, C. 2007. The terror within: Obesity in post 9/11 U.S. life. American Studies, 48(3), 29-48.
Bipartisan Policy Center. 2012. What is driving U.S. health care spending: America's unsustainable health care cost growth. Washington, DC: Bipartisan Policy Center.
Blum, B., and F. Dare. 2015. Enhancing clinical practice with wearables: Innovations and implications. Arlington, VA: Accenture.
Böhme, R., and S. Köpsell. 2010. Trained to accept? A field experiment on consent dialogs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2403-6. New York, NY: ACM.
Bradbury, D. 2015, December 17. After safe harbour: Navigating data sovereignty. Retrieved October 6, 2016, from http://www.theregister.co.uk/2015/12/17/navigating_data_sovereignty/
Cate, F. H. 2010. Protecting privacy in health research: The limits of individual choice. California Law Review, 96(2), 1765-803.
Cate, F. H., and V. Mayer-Schönberger. 2013. Notice and consent in a world of Big Data. International Data Privacy Law, 3(2), 67-73.
Dohr, A., R. Modre-Osprian, M. Drobics, D. Hayn, and G. Schreier. 2010. The Internet of Things for ambient assisted living. ITNG, 10, 804-9.
Dolan, B. 2014a, May 23. In-depth: Consumer health and data privacy issues beyond HIPAA. Retrieved October 24, 2016, from http://www.mobihealthnews.com/33393/in-depth-consumer-health-and-data-privacy-issues-beyond-hipaa
Dolan, B. 2014b, December 16. Prediction: Health wearables to save 1.3 million lives by 2020. Retrieved October 24, 2016, from http://www.mobihealthnews.com/39062/prediction-health-wearables-to-save-1-3-million-lives-by-2020
Evans, D. 2011, April. The Internet of Things: How the next evolution of the Internet is changing everything. Cisco IBSG. Retrieved from http://www.cisco.com/c/dam/en_us/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf
Federal Trade Commission. 2014. Data brokers: A call for transparency and accountability. Washington, DC. Retrieved from https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf
Firestone, C. M. 2014. Developing policies for the Internet of Things, background materials. Twenty-Ninth Annual Aspen Institute Conference on Communications Policy.
FTC Staff. 2015. Internet of Things: Privacy and security in a connected world. Technical report. Washington, DC: Federal Trade Commission.
Future of Privacy Forum. 2016. FPF mobile apps study. Washington, DC: Future of Privacy Forum. Retrieved from https://fpf.org/wp-content/uploads/2016/08/2016-FPF-Mobile-Apps-Study_final.pdf
Garvin, D. A. 1983. Can industry self-regulation work? California Management Review, 25(4), 37-52.
Gibbs, S. 2015, October 6. What is "safe harbour" and why did the EUCJ just declare it invalid? The Guardian. Retrieved from https://www.theguardian.com/technology/2015/oct/06/safe-harbour-european-court-declare-invalid-data-protection
Gupta, A. K., and L. J. Lad. 1983. Industry self-regulation: An economic, organizational, and political analysis. Academy of Management Review, 8(3), 416-25.
Guthman, J. 2011. Weighing in: Obesity, food justice, and the limits of capitalism (Vol. 32). Oakland, CA: University of California Press.
Healey, J., N. Pollard, and B. Woods. 2015. The healthcare Internet of Things: Rewards and risks. Atlantic Council. Retrieved from http://www.atlanticcouncil.org/publications/reports/the-healthcare-internet-of-things-rewards-and-risks
HHS. 2012. Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Retrieved April 28, 2017, from https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html
Isidore, C. 2015, May 15. Apple, AT&T fight sale of RadioShack customer data. Retrieved October 23, 2016, from http://money.cnn.com/2015/05/15/news/companies/radioshack-apple-att/index.html
Kelley, M. B. 2013, March 21. CIA chief tech officer: Big data is the future and we own it. Retrieved October 23, 2016, from http://www.businessinsider.com/cia-presentation-on-big-data-2013-3
Langston, N. 2010. Toxic bodies: Hormone disruptors and the legacy of DES. New Haven, CT: Yale University Press.
Lee, K. 2015, July. Wearable health technology and HIPAA: What is and isn't covered. Retrieved October 23, 2016, from http://searchhealthit.techtarget.com/feature/Wearable-health-technology-and-HIPAA-What-is-and-isnt-covered
Maddox, T. 2014, July 3. The scary truth about data security with wearables. Retrieved October 23, 2016, from http://www.techrepublic.com/article/the-scary-truth-about-data-security-with-wearables/
Maddox, T. 2015, October 7. The dark side of wearables: How they're secretly jeopardizing your security and privacy. Retrieved October 23, 2016, from http://www.techrepublic.com/article/the-dark-side-of-wearables-how-theyre-secretly-jeopardizing-your-security-and-privacy/
Malik, M. A. 2016. Internet of Things (IoT) healthcare market by component (implantable sensor devices, wearable sensor devices, system and software), application (patient monitoring, clinical operation and workflow optimization, clinical imaging, fitness and wellness measurement): Global opportunity analysis and industry forecast, 2014-2021. Allied Market Research. Retrieved from https://www.alliedmarketresearch.com/iot-healthcare-market
Meyer, D. 2016, February 25. Here comes the post-safe harbor EU privacy crackdown. Retrieved October 24, 2016, from http://fortune.com/2016/02/25/safe-harbor-crackdown/
Munro, D. 2015, December 31. Data breaches in healthcare totaled over 112 million records in 2015. Retrieved October 23, 2016, from http://www.forbes.com/sites/danmunro/2015/12/31/data-breaches-in-healthcare-total-over-112-million-records-in-2015/
Nash, L. L. 2006. Inescapable ecologies: A history of environment, disease, and knowledge. Oakland, CA: University of California Press.
Nelson, G. S. 2015. Practical implications of sharing data: A primer on data privacy, anonymization, and de-identification. Semantic Scholar. Retrieved from https://www.semanticscholar.org/paper/Practical-Implications-of-Sharing-Data-A-Primer-on-Nelson/8a091b5cc4d3f861c0080d7b3ddf51b717244e6c
Nissenbaum, H. 2004. Privacy as contextual integrity. Washington Law Review, 79, 119.
Nissenbaum, H., and H. Patterson. 2016. Biosensing in context: Health privacy in a connected world. In Quantified: Biosensing technologies in everyday life, ed. Dawn Nafus, 79-100. Cambridge, MA: MIT Press.
Olson, P. 2014, June 19. Wearable tech is plugging into health insurance. Retrieved October 23, 2016, from http://www.forbes.com/sites/parmyolson/2014/06/19/wearable-tech-health-insurance/
Privacy Rights Clearinghouse. 2013, July 15. Privacy Rights Clearinghouse releases study: Mobile health and fitness apps: What are the privacy risks? Retrieved October 23, 2016, from https://www.privacyrights.org/blog/privacy-rights-clearinghouse-releases-study-mobile-health-and-fitness-apps-what-are-privacy
Scholl, M. A., K. M. Stine, J. Hash, P. Bowen, L. A. Johnson, C. D. Smith, and D. I. Steinberg. 2008. An introductory resource guide for implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule. Gaithersburg, MD: National Institute of Standards & Technology, U.S. Department of Commerce.
Sherbit. 2015, April 6. How insurance companies profit from "wearables." Retrieved October 23, 2016, from https://www.sherbit.io/the-insurance-industry-and-the-quantified-self/
Snyder, M. 2015, June 19. Police: Woman's fitness watch disproved rape report. Retrieved October 23, 2016, from http://abc27.com/2015/06/19/police-womans-fitness-watch-disproved-rape-report/
Solove, D. 2002. Access and aggregation: Privacy, public records, and the Constitution. Minnesota Law Review, 86. Retrieved from http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2079&context=faculty_publications
Taylor, H. 2016, March 4. How the "Internet of Things" could be fatal. Retrieved September 22, 2016, from http://www.cnbc.com/2016/03/04/how-the-internet-of-things-could-be-fatal.html
Terry, K. 2014. Mobile polysensors offer new potential for patient monitoring. Medscape Medical News. Retrieved from http://www.medscape.com/viewarticle/828637
The Physicians Foundation. 2012, November. Drivers of healthcare costs white paper. Boston, MA. Retrieved from http://www.physiciansfoundation.org/focus-areas/drivers-of-healthcare-costs-white-paper
U.S. Department of Health and Human Services. 2003. Summary of the HIPAA Privacy Rule. Washington, DC: Office for Civil Rights. Retrieved from http://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/
U.S. Government Publishing Office. 2013, January 25. Federal Register, Vol. 78, Issue 17. Office of the Federal Register, National Archives and Records Administration. Retrieved from https://www.gpo.gov/fdsys/granule/FR-2013-01-25/2013-01073/content-detail.html
Walker, K. L. 2016. Surrendering information through the looking glass: Transparency, trust, and protection. Journal of Public Policy & Marketing, 35(1), 144-58.