Is IOT a Threat to Consumer Consent? The Perils of Wearable Devices’ Health Data Exposure

Syagnik Banerjee (Corresponding author), Associate Professor, University of Michigan-Flint
Thomas A. Hemphill, Professor, University of Michigan-Flint
Phil Longstreet, Assistant Professor, University of Michigan-Flint

ABSTRACT

The ubiquitous diffusion of wearable Internet of Things (IOT) devices is leading to growing concerns regarding the privacy, security, and potential misuse of the large volume of generated data. In this manuscript, we examine how wearable IOT devices and related businesses can violate or bypass protections of sensitive personal data. We discuss the IOT-Healthcare-Wearable Ecosystem, identify roles played by different entities, explain wearables’ connectivity with other entities such as employers and insurance companies, illustrate related HIPAA regulations and factors driving health data exposure, and develop a taxonomy that indicates varying levels of end-user vulnerability for different use-case scenarios. The implications of this research are relevant to lifestyle technologies, healthcare providers, pharma-retailers, application developers, hardware manufacturers, market researchers, and policy makers who are involved in facilitating the manufacture, endorsement, distribution, and regulation of instruments that aid in the collection, mining, access, storage, and redistribution of data. Finally, we offer recommendations for different ecosystem entities to contain health data exposure.

Keywords: Internet of Things, Wearables, Informed Consent, HIPAA, Privacy, Regulatory Policy

INTRODUCTION

The Internet of Things (IOT) consists of billions of smart devices, ranging from digital chips to large electromechanical machines using wireless technology, all communicating with each other. It is a connected world, growing exponentially from an estimated two billion devices in 2006 to a forecasted 50 billion by 2020 (Evans, 2011). This networked system of conversing devices and products is forecasted to expand to smart homes, cities, cars, health and fitness, to name a few.
The IOT includes the connections of physical objects to the Internet and among themselves through embedded sensors, utilizing both wired and wireless technologies, creating an ecosystem of ubiquitous computing. Generally, however, the "IOT" does not include such personal devices as computers, smartphones, tablets and other closely related devices, although these types of devices are often used to manage or communicate with sensors or devices included


in the definition of "Things" (Federal Trade Commission Staff, 2015). While the growth potential is endless for such a vast domain, in this paper we focus our study on one aspect: wearable devices in the United States (U.S.) healthcare sector. The newly enriched information environment involving collection, storage, and dissemination of sensor-stored data will influence industries to improve system efficiencies and deepen insight into delivering services that better suit customer needs. For the healthcare sector, the global IOT market is expected to reach over $136.8 billion annually by 2021 (Mohd, 2016). Significant consumer health care improvements are achievable through adjustable patient monitoring, enhanced drug management, asset monitoring, tracking, and early medical interventions. Physicians, insurers, patients and caregivers are anticipated to have unparalleled access to all information collected via the ubiquitous interconnectivity of the IOT environment (Federal Trade Commission Staff, 2015). Insulin pumps and blood pressure cuffs connected to mobile apps can enable the recording, tracking, and monitoring of health metrics, not only reporting critical data on the patient’s medical condition back to the physician, but also helping patients monitor and engage in their treatment remotely, thus allowing the physician to make decisions regarding preventative care without requiring the physical presence of the patient. These connected devices, through utilization of real-time monitoring, diagnostics, alerts, and recording of behavior patterns, can improve the patient’s quality of life and safety by providing physicians a richer data source for diagnosis and treatment, and help the healthcare insurance industry better understand risk profiles to customize their consumer offerings. Despite significant advantages, the IOT environment also poses its own share of risks and concerns.
Several commentators (Federal Trade Commission, 2015) have identified potential security risks of information “hacking” related to storage of ubiquitously collected information, such as from baby monitors and insulin pumps, leading to greater vulnerability and exposure of patients to physical harm, thus compromising safety and personal control. Recently, Johnson & Johnson warned diabetic patients and their physicians that one of its insulin pumps is vulnerable to cyberhacking. A hacker in close proximity

to the insulin pump system could find unencrypted radio signals used by the device and re-program the functioning of the pump (Rockoff, 2016). Just as widespread commercial availability of credit cards, made possible through technological innovation, opened up the consumer frontier for electronic purchases and financial transactions, this same technological innovation also created new high-technology opportunities for more nefarious objectives. Similarly, smartphone sensors and wearables can also be used to learn of individual moods, stress levels, habits, well-being, sleep patterns, exercise, physical activity or movement, which, when shared, can be used by different entities to make credit, insurance, and employment decisions, thus compromising the privacy of individual users. However, exposure of health data elicits concerns deeper than privacy and security issues. Insurance companies’ access and usage of such unregulated data to modify premiums may potentially even affect individual life chances based on unknown demographic (such as age) or behavioral (habit-related) factors. Though issues regarding security and privacy have been identified in prior literature (Stankovic, 2014; Weber, 2010), in this paper the authors identify a different potential research gap: Does IOT pose a threat to consumer consent? Wearable devices share health data which may or may not be covered by laws pertaining to the healthcare sector, leading to potential violations of existing healthcare laws, or suggesting public policy gaps. Self-interest-maximizing marketplace behaviors of different entities in the ecosystem can also lead to unwanted data exposure.
The concerns go beyond privacy, which is usually defined by specific fields of personal data that users are explicitly familiar with, to mixed arguments regarding whether compromised data belongs to the category of protected healthcare information, who it is protected by, and who it is compromised by. As a result, the concerns regarding data collection, storage and sharing pertain to the domain of consent, which consumers must provide with full information, e.g., of terms of sale or use, before continuing in a transactional relationship with their service provider. Consent is not a new concept; it was used by the Greek philosopher Plato to distinguish between enslavement and freedom (Kapp, 2006). It is the basis for the consumer’s legal rights, standing on the premise of individual autonomy. However, information overload gets in the way of informed consent. In the age of Big Data, it is difficult to provide conscious consent for several reasons (Cate & Mayer-Schönberger, 2013, pp. 67–68). The value of personal information is not known when it is collected (and when notice and consent are normally given); also, the relationship between users and processors of personal data has become increasingly complicated, as datasets are combined and processors and users change. Further, Cate (2010) concludes that there are numerous conceptual objections to relying on the traditional approach to individual choice (and by proxy “consent”) because the notices that seek consent often simply disclose the absence of choice, or are written so broadly that they make choice meaningless. By seeking consent and enhancing the illusion of choice, such notices often distract users from their own privacy needs. Consequently, linking privacy to individual choice can impose significant cognitive burdens on individuals, especially as the volume and pace of data flows increase, as in the age of Big Data. Further, by automating machine-to-machine communication based on intellectual property protocols and overloading consumers who have limited processing capabilities with excess information, IOT also makes consumers less conscious of collected data types, data storage locations, parties shared with, terms of access, and what other data it can be triangulated with. As suggested by Walker (2016), consumers are not really sharing but “surrendering” information over the Internet due to the overwhelming amount of information at hand, and the consent they provide may not be truly “informed.” This situation presents several challenges at the policy level. First, device users do not clearly know or understand the potential consequences of terms and conditions they consent to, which may compromise some of their consumer rights. Second, for patients in the U.S.
healthcare sector, where informed consent is explicitly required by the Health Insurance Portability and Accountability Act of 1996 to deliver treatment, such surrender results in a loss of individual autonomy, leading to a violation of the caregivers’ contractual terms of provision or delivery of service to the patient. In this paper, the authors first articulate an IOT healthcare taxonomy; second, identify and explain the interests of different stakeholders in the IOT ecosystem; third, elucidate the factors driving health data exposure; fourth, draw implications in different types of scenarios; and lastly, offer recommendations for


the appropriate entities in the IOT system to effectively improve the alignment of the multi-stakeholder goals of permission, privacy, and profit.

IOT IN HEALTHCARE

As mentioned earlier, the IOT healthcare ecosystem encompasses billions of devices and transactions. Applying the framework proposed by Healey et al. (2015), healthcare devices are categorized as Wearable Health Monitoring Devices, Medical Wearable Devices, Medical Embedded Devices, and Stationary Medical Devices (see Figure 1 below). Wearable Health Monitoring Devices are those that consumers voluntarily acquire with or without endorsement from healthcare providers (e.g., Fitbit, Fuelband, etc.) and that collect information about daily user activities, lifestyle, and well-being. This information may then be synchronized to other devices, such as smartphones, for personal use. Medical Wearable Devices are worn externally and actively deliver some form of treatment, e.g., insulin pumps. Medical Embedded Devices are implanted into the body for the express purpose of monitoring vitals, and allowing intervention, when necessary, for the continued good health of the patient, e.g., modern pacemaker devices (Taylor, 2016). Stationary Medical Devices are designed to be used in a specific physical location, such as chemotherapy dispensing stations in home-based healthcare. Further, each of the devices in the aforementioned categories must also exhibit some degree of the following attributes: Identification, Location, Sensing, and Connectivity (Dohr, Modre-Osprian, Drobics, Hayn, & Schreier, 2010), as shown in Figure 1 below. Devices meeting the Identification attribute use their unique serialization to provide positive identification of the user. This attribute could be especially beneficial in the case of a patient who is brought into the hospital unconscious, as the device would aid in identification.
The Location attribute is met when the device is able to provide geospatial coordinates. The Sensing attribute describes sensors involved with the device that produce an alert in the event of an abnormal situation. Finally, the Connectivity attribute is the ability of the device to connect with other devices and transmit information to them. When examining medical devices, these attributes are especially crucial as they allow for the identification of the device, which then provides identification of the patient. Further, devices containing these attributes can provide the location where an abnormal event occurred and can immediately transmit all relevant details to the proper medical personnel to aid in maintaining the well-being of the patient.

< Insert Figure 1 Here >

As in any new market, the penetration of healthcare IOT devices has not reached its full potential. At present, the IOT healthcare market has experienced the most expansive growth within the Wearable Health Monitoring Devices and Medical Wearable Devices categories. In fact, recent reports have shown that the healthcare IOT market is currently worth over $60.4 billion and is expected to reach $136.8 billion by 2021; of this $136.8 billion, patient monitoring will encompass the largest sub-segment at an estimated $72.7 billion by 2021 (Mohd, 2016). This economic impact has largely been accomplished by several manufacturers (e.g., Samsung, Apple, Nike, and Fitbit) capitalizing on the latest capabilities in miniaturization of circuitry, which has allowed IOT devices to capture, store, and transmit a wearer’s vitals on a continuous basis while still being small enough to fit on the wrist. This technological capability has captured the public’s interest primarily in the consumer health sector, where users are interested in tracking their vitals as they go about their daily routines. Consumers and healthcare providers have become increasingly interested in utilizing these devices and capabilities as they allow for unprecedented access to real-time patient information (Doukas & Maglogiannis, 2012).
For instance, previously a hospital and physician would be limited to collecting patient vitals on an interval rotation basis; however, with current technologies, the same healthcare providers have the capability to receive a continuous stream of information, allowing for instantaneous monitoring of the patient whether the patient is physically at the hospital or at home. Moreover, this sector is evolving towards incorporating many more wearable IOT devices, such as fully interconnected insulin pumps, blood pressure cuffs, and health monitors (Rohokale, Prasad, & Prasad, 2011).
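The four-category taxonomy and four device attributes described above lend themselves to a simple data model. The sketch below (a minimal Python illustration; the class and field names are ours, not from the paper) encodes the Healey et al. (2015) categories and the Dohr et al. (2010) attributes so that any device can report which attributes it meets:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Category(Enum):
    WEARABLE_HEALTH_MONITOR = auto()  # e.g., Fitbit-style activity trackers
    MEDICAL_WEARABLE = auto()         # e.g., externally worn insulin pumps
    MEDICAL_EMBEDDED = auto()         # e.g., implanted pacemakers
    STATIONARY_MEDICAL = auto()       # e.g., home chemotherapy stations

@dataclass
class Device:
    name: str
    category: Category
    # The four attributes of Dohr et al. (2010), modeled as capabilities.
    has_identification: bool  # unique serialization identifying the user
    has_location: bool        # can report geospatial coordinates
    has_sensing: bool         # can alert on abnormal readings
    has_connectivity: bool    # can transmit to other devices

    def attributes(self):
        """Return the subset of the four attributes this device meets."""
        flags = {
            "Identification": self.has_identification,
            "Location": self.has_location,
            "Sensing": self.has_sensing,
            "Connectivity": self.has_connectivity,
        }
        return [name for name, present in flags.items() if present]

tracker = Device("fitness band", Category.WEARABLE_HEALTH_MONITOR,
                 has_identification=True, has_location=True,
                 has_sensing=True, has_connectivity=True)
print(tracker.attributes())  # ['Identification', 'Location', 'Sensing', 'Connectivity']
```

A consumer fitness band typically meets all four attributes, which is precisely what makes its data stream both clinically useful and, as later sections argue, a vector for exposure.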


As the demand for these IOT wearable devices has increased in both consumer and medical markets, so has the proliferation of types of entities interested in these technologies. These entities can be broadly categorized as Healthcare, Bystander Beneficiaries, Hardware and Infrastructure, and Regulatory (see Table 1 below). The Healthcare IOT entities are those that are directly involved (and possess capability) in using the devices and/or facilitating treatment and the continued good health of the user. These entities include healthcare service providers, such as medical doctors, who may partner and use data from the devices to provide more accurate and timely healthcare diagnosis and intervention, leading to better quality and longevity of life. It also includes patients who consent to the partnered use of these devices for improved health outcomes. Bystander Beneficiaries utilize healthcare devices as a way to improve or expand their own enterprises. For instance, employers improve productivity by monitoring employee health and subsequently save on insurance premiums. Healthcare insurance companies, by using this same information, may gain a better understanding of the risk profiles of their clients. In general, the Bystander Beneficiary is characterized by the utilization of the information provided by the IOT device, and is not invested in the information creation or delivery process, nor involved in the business of manufacturing, selling, or supporting the IOT device. Infrastructure IOT entities are those that make, sell, support, facilitate, or connect the devices, thereby being directly invested in the information delivery process. Examples of entities in this category would be device manufacturers, application developers, and entities that provide connectivity infrastructure for the device, such as Internet Service Providers (ISPs).
Finally, Regulator IOT entities are those entrusted by the public to ensure that these devices are performing the functions for which they are intended. These entities come in the form of governmental agencies, who administratively review the device to ensure compliance with the specifications outlined by the manufacturer, or privacy interest groups, who are focused on determining the extent of the information sharing from said device.

Table 1 below provides a tabular summation of this categorization. < Insert Table 1 Here >

The following section explains how the devices and the data from the devices are currently being utilized and suggests some challenging issues pertaining to their usage.

HOW WEARABLE DEVICES CONNECT WITH HEALTH CARE AND INSURANCE

As previously described, the various entities present in the IOT ecosystem include employers, employees, patients, insurers, data brokers, and healthcare service providers, among others. Hospitals and clinics are using wearable medical devices to improve emergency medicine communication. Wearable healthcare devices are forecasted to save 1.3 million lives (Dolan, 2014b), and the market is expected to grow to $72.7 billion by 2021 (Mohd, 2016). With wearables like Vital Connect's HealthPatch MD (Terry, 2014), physicians can receive updates on patients’ vital signs without directly observing the patient. Similarly, patients with amyotrophic lateral sclerosis (ALS), or other neurodegenerative diseases, can be connected via wearables so that emotive technology scans of their brainwaves, thoughts, feelings and expressions can be used to generate alerts and commands to electronic devices in the room, including televisions and lighting (Blum & Dare, 2015). Wearable devices can also apply to other scenarios including diabetes, obesity, cardiovascular diseases, asthma, and Alzheimer’s, as well as in-hospital monitoring. Wearable application developers also approach non-medical corporations to purchase large volumes of wearable devices and distribute the products among employees. The regular use of fitness applications has two primary values: first, if the usage drives a positive change in behavior, it makes the employees healthier and more productive; second, the data generated from the fitness tracker, if accessed by healthcare insurers, can help them customize more accurate coverage packages relevant to the health of the user.
The fundamental premise for the above value perceptions rests on the assumptions that individual lifestyle choices and behaviors, if consciously controlled, can a) drive better health outcomes and b) significantly lower healthcare costs for employers and insurers. A variety of perspectives further enrich a debate on the validity of these assumptions. Traditional reasoning assumes lifestyle choices contribute to obesity and chronic illnesses, which lower productivity and raise healthcare costs. What we address as conscious personal choice is also viewed by scholars as environmental injustice (Alkon & Agyeman, 2011): some have suggested that a separation between the individual and the environment is impossible to sustain (Langston, 2010, p. 147), and that ecologies are “inescapable” and prenatal exposure to toxins can lead to outcomes like obesity (Nash, 2006), thus overriding the role of conscious personal choices. Guthman (2011) attributes such health outcomes to broader processes, indicating obesity is a symptom of a greater problem which can only be solved by challenging the cultural values or economic interests of powerful entities who are personally invested in existing socio-economic systems. Similarly, Biltekoff (2007) suggests that urging individuals to make responsible choices for protecting their own health, rather than rely on government protections and services to do the same, is a reflection of neoliberal ideologies, and hence a product of ideological bias. Despite this diversity of perspectives, the salience of terms such as “lifestyle factors” and “personal health choices” among the array of healthcare cost drivers found in policy research, such as administrative burdens, aging populations, medical technology, insurance design, cultural biases, the state of competition in the market environment, end-of-life costs, and legislative mandates (Bipartisan Policy Center, 2012, pp. 6–7; The Physicians Foundation, 2012, p. 24), encourages corporations to drive an effort of using wearable devices to improve productivity, well-being, and insurance costs for their workforce.
Several organizations have initiated corporate wellness programs, which include wearable devices to monitor and improve fitness. Oscar has offered its customers a $1 credit every time they reach a daily step-count goal, which they can cash out at $20 for an Amazon gift card (Bertoni, 2014). Cigna has distributed armbands (manufactured by Bodymedia) to thousands of its employees, while other self-insured employers, like Autodesk, have purchased Fitbit trackers and distributed them to their employees (Olson, 2014). Oil giant BP has provided positive incentives to 14,000 employees who opted to wear a Fitbit in return for letting

the company track their daily work steps over the 2013 calendar year (Olson, 2014). BP employees who exceed one million steps gain points that go towards a lower insurance premium. Some employers are using Wildflower, an application for pregnant women to measure weight gain at pregnancy milestones, as maternity is a high-expense item for them (Olson, 2014). Contrarily, some companies are using punitive measures to change behavior. For example, a company called StickK believes that taking away wellness points, a disincentive, is more effective than offering rewards (Olson, 2014). Some self-insured companies are planning to add a $50 surcharge to the insurance premiums of employees who smoke. US retail pharmacy CVS has been charging a $50 monthly fee to employees who refuse to report weight, body fat, cholesterol, blood pressure and sugar levels. Overall, improved wellness compliance is emerging, but it raises vexing ethical questions about the use of the data being captured. From the above examples, it is evident that in the process of encouraging healthy behavior and improving the productivity of employees, two types of entities enter the IOT-Wearable ecosystem. One is the wearable application provider/developer, who makes it possible for the data to be captured and shared for constant monitoring. The other is the healthcare insurance provider, who is invested in the uncertainties of health-related outcomes and can provide incentives and motivation for regular usage of wearable devices. On occasion, the application developer and the company employing the users bring in an additional, “neutral” third party to ensure the individual’s data is stored and shared from a third-party platform, thereby removing any of their own liabilities related to misuse of employees’ personal information. This creates additional entities, such as the data broker, which is an emerging service industry.
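The incentive schemes above reduce to simple arithmetic over tracker data. As a hedged illustration (the function and parameter names are ours, modeled loosely on the reported $1-per-day, $20-cashout terms of the Oscar program), a step-linked credit might accrue like this:

```python
def step_goal_credit(days_goal_met, rate=1.00, cashout_threshold=20.00):
    """Accrue $1 per day the step goal is met; credits are redeemable
    in $20 gift-card increments, with the remainder still accruing."""
    earned = days_goal_met * rate
    gift_cards = int(earned // cashout_threshold)
    remainder = earned - gift_cards * cashout_threshold
    return gift_cards, remainder

# An employee meeting the goal on 45 days has earned two $20 gift cards,
# with $5 still accruing toward the next one.
cards, leftover = step_goal_credit(days_goal_met=45)
print(cards, leftover)  # 2 5.0
```

The same shape of computation, applied to a million-step threshold or a smoker surcharge, is what turns a continuous behavioral data stream into a financial consequence for the wearer.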
To better explain why this expanded ecosystem is problematic, in the following section we introduce the Health Insurance Portability and Accountability Act of 1996 and explain the issues this law faces with patient health data sharing.

PRIVACY, SECURITY AND POLICY

Data collected by health care wearables are subject to various legal protections. There is sector-specific legislation, such as the Americans with Disabilities Act, Children’s Online Privacy Protection Act and Fair Credit

Reporting Act, not to mention federal or state laws related to access to healthcare insurance and consumer discrimination. The U.S. Food and Drug Administration (FDA), under its legislative authority, regulates medical devices; however, wearable devices are not regulated by the FDA, although some commentators recommend that they should be included under the agency's regulatory oversight responsibilities (Future of Privacy Forum, 2016). Much lifestyle and well-being data generated by the devices often flies under the radar because of the broader societal benefits and lack of awareness concerning the nature of data collection, storage and sharing. Health service providers protect data from sharing and misuse by requiring informed consent, because traditionally they were the primary sources for generating health-related data on their patients. It is only with ubiquitous technological diffusion that one acknowledges every individual is a potential patient, and non-medical entities such as application developers are able to access, collect, store and share such personal health-related data using non-invasive procedures. This causes a further separation between the existence of medical records and the physical location of health information. Informed consent is implicit in the consumer decision-making and choice process, because selection, which happens after alternative evaluation, assumes the decision-maker’s consent to the terms of acquisition, which may include terms of service, use, refund, collection, storage, and sharing of personal or transactional data in the purchase or use process. Further, such consent and consequent selection are assumed to be arrived at under complete knowledge, i.e., the consumer is aware, conscious and informed regarding the consequences (such as hidden costs and terms) of the purchase or sale as elaborated above.
In the spirit of the above premise, the Health Insurance Portability and Accountability Act of 1996, or HIPAA (U.S. Government Publishing Office, 2013), Public Law 104-191, was enacted on August 21, 1996. Within this legislation, Sections 261 through 264 of HIPAA require the U.S. Secretary of Health and Human Services (HHS) to publish standards (also known as the Administrative Simplification provisions) for the electronic exchange, privacy and security of patient health information. The HHS Standards for Privacy


of Individually Identifiable Health Information administrative rule (“Privacy Rule”) was published on August 14, 2002 and codified at 45 CFR Parts 160 and 164. The primary goal of the Privacy Rule is to protect patients’ health information while allowing the flow of such information to “covered entities”, i.e., health care providers, health care clearinghouses, and business associates, that provide and promote health care services, all the while protecting the patient’s health and security (US Department of Health and Human Services, 2003). In addition, HHS also published the Security Standards for the Protection of Electronic Protected Health Information (“Security Rule”), which establishes a national set of security standards for protecting certain health information that is held or transferred in electronic form (Scholl et al., 2008). A major goal of the Security Rule is to protect the privacy of patients’ health information while allowing covered entities to adopt new technologies to improve the quality and efficiency of patient care. Obtaining “consent”, i.e., written permission from individuals to use and disclose their protected health information for treatment, payment, and health care operations, is optional under the Privacy Rule for all covered entities (US Department of Health and Human Services, 2003). However, HIPAA, and specifically the Privacy Rule, requires that a covered entity, which includes health plans, health care providers, health care clearinghouses, and business associates, must obtain the patient’s written authorization for any use or disclosure of protected health information that is not for treatment, payment, or health care operations or otherwise permitted or required by the Privacy Rule (US Department of Health and Human Services, 2003).
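The consent-versus-authorization logic just described can be summarized as a decision rule. The sketch below is a deliberately simplified reading of the Privacy Rule for illustration only (the entity and purpose labels are our own shorthand, and real determinations are far more nuanced):

```python
# Entity types the paper treats as covered by the Privacy Rule.
COVERED_ENTITIES = {"health plan", "health care provider",
                    "health care clearinghouse", "business associate"}

# Uses permitted without written authorization under the Privacy Rule
# ("TPO": treatment, payment, and health care operations).
PERMITTED_PURPOSES = {"treatment", "payment", "health care operations"}

def authorization_required(entity_type, purpose):
    """Return whether written patient authorization is needed for a disclosure.

    Simplified rule: non-covered entities fall outside HIPAA entirely (None);
    covered entities may use PHI for TPO without authorization (False);
    any other purpose requires written authorization (True).
    """
    if entity_type not in COVERED_ENTITIES:
        return None  # HIPAA does not apply at all
    return purpose not in PERMITTED_PURPOSES

print(authorization_required("health plan", "treatment"))         # False
print(authorization_required("health plan", "marketing"))         # True
print(authorization_required("wearable manufacturer", "resale"))  # None
```

The third case is the crux of the paper's argument: for a manufacturer that is neither a covered entity nor a business associate, the question of authorization never even arises.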
On the surface it appears that, for patients receiving treatment from a healthcare provider using wearable devices to monitor their health, any data shared with business associates are well protected, as the Privacy Rule (45 CFR 164.504(e)) requires business associate contracts to describe the permitted and required uses of protected health information, ensure that business associates will not use or further disclose the protected health information other than as permitted, and use appropriate safeguards to prevent disclosure or use beyond permission. Business associate functions and activities include: claims processing or administration; data analysis, processing or administration; utilization review; quality assurance; billing; benefit management; practice

management; and repricing. Moreover, business associate services include legal, actuarial, accounting, consulting, data aggregation, management, administrative, accreditation, and financial services. Similarly, under Title XIII of the American Recovery and Reinvestment Act of 2009, the Health Information Technology for Economic and Clinical Health (“HITECH”) Act adds security requirements. For example, it requires Business Associates to comply directly with Security Rule provisions directing implementation of administrative, physical and technical standards for electronic protected health information; develop and enforce policies and documentation standards; and report non-compliance and terminate contracts with non-compliant entities; and it requires HHS to conduct compliance audits. However, different problems emerge as far as wearable devices and the adequacy of HIPAA are concerned. Much of the data leakage occurs due to policy gaps, explicit policy violations, as well as legal, self-interest-driven marketplace behavior by covered entities.

FACTORS DRIVING HEALTH DATA EXPOSURE

In this section, we elaborate on a list of factors that drive the exposure of personal health data (see Figure 2 below). Understanding these factors is necessary to assist in identifying future remedies and policy recommendations to contain their effects.

I. Policy Gaps

a. HIPAA

i)

Conditionality of Protection: According to the Privacy Rule (45 CFR 160.103), a business associate is defined as a person or organization that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity. In other words, a wearables manufacturer or distributor that neither provides services to any covered entity, nor uses or discloses information on behalf of a covered entity, cannot be subject to the compliance requirements of a business associate as defined under

HIPAA. Despite additional regulatory safeguards initiated by the healthcare sector, such as the Omnibus Rule, increased penalties, breach notification standards and the incorporation of added Protected Health Information, or PHI (U.S. Government Publishing Office, 2013), these continue to apply only to Business Associates, and mHealth technologies remain “non-covered entities” (NCEs). Even if a doctor endorses a particular app, or even if health care providers and app developers enter into interoperability arrangements, HIPAA does not protect consumer data. The status of a wearable manufacturer-distributor depends on whether its participation has been contracted by the healthcare provider for patient management services. If the contract indicates that the wearable device manufacturer engages in transmission of data on behalf of the healthcare provider, the manufacturer is considered a covered entity. However, if a patient purchases a wearable device from a retailer, uses it, and voluntarily shares some results with his or her doctor, who then responds with some advisory comments, the company manufacturing the device is not a HIPAA covered entity; consequently, the data collected by the manufacturer are neither bound by nor protected under HIPAA. This situation is similar to sharing data after an automobile accident, when both the healthcare insurer and the automobile insurer receive medical bills. One can be bound by HIPAA while the other may not be, because the health insurer has not partnered with and endorsed the automobile insurer under its own line of business. When the HIPAA guidelines were developed, the increasing diffusion of mobile wearables and the subsequent proliferation of third-party entities were not foreseen as a potential complication. The only way to make wearable device manufacturers or distributors conform to HIPAA is to have them partner with covered entities.

ii)

ii) Protected Data Types: Applicability of HIPAA depends on what fields of information are being collected and shared. The HIPAA Privacy Rule protects most “individually identifiable health information”, such as medical records, laboratory reports, and hospital bills, because they can identify the patient by name. The University of California, San Francisco, uses 18 criteria defining healthcare data covered under HIPAA (Lee, 2015), which include names; geographical markers smaller than a state (street address, county, and zip code); elements of dates directly related to a person (birth, admission, and discharge); phone, fax and email; social security numbers; numbers related to medical records, health plan beneficiaries, accounts, certificates and licenses; vehicle identifiers and license plates; IP addresses; device identifiers; web URLs; biometric identifiers; and photographic images. However, many fields of information that appear to be private, such as blood pressure or sleep statistics, can be shared separately as long as they are not linked to unique personal information.

iii) Definition of Unique Identifiers: Collected health or behavioral data can become unique identifiers themselves, though they are not recognized or categorized as PHI under HIPAA. At a data conference in New York City, Ira Hunt, the CIA’s Chief Technology Officer, said that a person can be identified with 100 percent certainty simply by their gait, i.e., how they walk (Kelley, 2013). In other words, gradually evolving fields of behavioral data that we merely consider patterns can be as unique as fingerprints, and therefore personally identifiable.

iv) Ability to Bypass Identification: Even if the concerned entity qualifies as a business associate required to comply with HIPAA, §164.502(d) of the Privacy Rule, in recognition of the potential usefulness of health information when not identifiable, allows a covered entity or its business associate to create information that is not individually identifiable by following the de-identification standard and implementation specifications in §164.514(a)-(b). Even so, identification can be achieved after sharing, by creating and tracking unique proxy combination patterns, thereby in effect bypassing the Privacy Rule.
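The de-identification route in §164.514 can be illustrated with a minimal sketch. The field names here are our own illustrations, not the regulatory list; a real implementation would cover all eighteen Safe Harbor identifiers. As noted above, even output of this kind can sometimes be re-identified through proxy combinations of the remaining fields.

```python
# Sketch of Safe-Harbor-style de-identification: strip a configurable set of
# direct identifiers from a record and coarsen quasi-identifiers.
# Field names are illustrative, not the full 18-item regulatory list.

DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_no", "device_id", "ip_address", "photo_url",
}

def deidentify(record):
    """Return a copy of `record` with direct identifiers removed and
    quasi-identifiers generalized (zip -> 3-digit prefix, birth date -> year)."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "00"          # keep only the ZIP3 prefix
    if "birth_date" in out:
        out["birth_date"] = out["birth_date"][:4]   # keep year only
    return out

patient = {
    "name": "Jane Doe", "ssn": "123-45-6789", "zip": "48502",
    "birth_date": "1980-06-15", "avg_heart_rate": 72, "sleep_hours": 6.5,
}
print(deidentify(patient))
```

The health measurements themselves (heart rate, sleep) survive, which is exactly what makes the shared data useful and, in combination, potentially re-identifiable.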

b. Company Policy Gaps

Lack of Policy Clarity on the Future of Stored Data: The existing public policy environment is ambiguous about the future of shared data. If an entity legitimately shares data with a firm (neutral, third party, or otherwise) that undergoes a change in ownership or enters bankruptcy, how is the collected data to be treated? Should it be treated as a firm-owned asset? The terms and conditions in the End User License Agreements (EULAs) of wearable devices often indicate that, in the event of a sale of the company or of bankruptcy proceedings, the company reserves the right to sell the data (Maddox, 2014). When RadioShack underwent bankruptcy, it asked the court to allow it to sell customer data to pay off its creditors; AT&T and Apple filed a motion to prevent the sale of data concerning customers who had purchased Apple or AT&T devices (Isidore, 2015). Similar concerns exist regarding federal government enforcement policy when federal executive administrations change.

c. E-Discovery Policy Gaps

Legal Demands: Personal health data can be made accessible for legal purposes. In 2014, a law firm in Calgary used activity data from a Fitbit to assess the effect of an accident on its client in a known personal injury case (Olson, 2014). Similarly, in 2015, a woman’s fitness watch disproved her rape report (Snyder, 2015). Wearable device data have entered judicial proceedings through e-discovery, facilitated by analytics platforms like Vivametrica, which use public health records to compare a specific person’s records with those of the general population. Once used in judicial proceedings, this personal health information can become part of publicly searchable information under the Open Records Policy or the Freedom of Information Act. Though there can be exemptions to preserve security and privacy, such exemptions vary from state to state (Solove, 2002).

d. Sovereignty of Cloud Data and International Policy Jurisdictions

Since most wearable device applications store the generated data in the cloud¹, a wide variety of national laws determine how protected the data is and how safely it is stored.
Previously, “safe harbor” regulatory compliance with the European Union’s data protection framework required data processing entities to provide sufficient data protection guarantees “irrespective of their location” (Gibbs, 2015). However, following the Snowden revelations regarding U.S. intelligence agencies’ surveillance of commercial Internet services, the European Union struck down the “safe harbor” arrangement and emphasized the need to adhere to domestic privacy policies, which could potentially inhibit the free movement of data across borders (Meyer, 2016). Similarly, Indonesia introduced Regulation PP 82/2012, which demands that providers of data services to its citizens maintain this personal data within its territory. Australia has imposed its own laws and regulations on data carried overseas. Moreover, more explicit rules are expected under the EU General Data Protection Regulation (Bradbury, 2015). With the increasing push for data sovereignty and adherence to domestic laws, it may become difficult for US-based companies to utilize global opportunities to store data in international locations while demanding more secure protection standards. As a result, a company can store data more cheaply with a cloud computing service that has deployed fewer resources to secure the data (Berry & Reisman, 2012). Since a majority of start-up application developers operate within tight budgets, this economic reality incentivizes them to sign up with cloud storage providers operating from countries with more “flexible”, i.e., less rigorous, security standards in order to minimize their operational expenses. Hence, data sovereignty can lead to data vulnerability and subsequent exposure.

¹ Commercially available platforms for storing, distributing and processing data.

II. Policy Violations Due to Technological Factors

a. Hardware Connectivity Gaps: The existing encryption regime applied by manufacturers to wearable device and hardware identifiers is inadequate. Sports-tracking wearable devices generally sync via Bluetooth Low Energy to connect wirelessly to mobile devices or computers. Security firm Symantec conducted an experiment in which it built portable Bluetooth scanners using Raspberry Pi minicomputers and readily available retail components, which could be purchased and assembled for $75 by any person with basic IT skills. Using the scanner, Symantec employees were able to scan airwaves for signals broadcast from devices and extract device hardware addresses and location, as well as other personal data (Barcena, Wueest, & Lau, 2014). While location in itself is


not a data field regulated by HIPAA, when connected to an identifier it enables air quality and pollution estimates that can inform audiences about the potential health risks of the individual.

b. Cloud Storage Vulnerability: Most wearable data transferred between wearables and mobile devices are stored in the cloud. Due to the lack of an associated physical space or control, cloud storage can be vulnerable to breaches by hackers. In the first quarter of 2015, approximately 100 million patient records were compromised in the top six attacks, ranked by volume of compromised patient records (Munro, 2015).

III. Marketplace Behaviors

a. Premium and Rising Market Value of Health Data

i) Data Broker Industry of Identifiable Health Data: There is a large, disorganized data broker industry that has evolved in the pursuit of profits. In 2012, nine data broker firms analyzed by the FTC generated $426 million in annual revenue (Federal Trade Commission, 2014). Unique personal information is a high-priced commodity: it is estimated that the information from wearables stored on a smartphone or in a “cloud” is worth ten times that of credit cards on the black market (Maddox, 2015). Credit card companies have developed best practices for fraud protection, e.g., they can cancel a customer’s compromised card and invalidate previous transaction data. Unique identifier data, however, cannot be recalled once collected, which enhances the life and value of compromised identifiers. A report by the FTC found that twelve health applications that were tested transmitted information to seventy-six different third parties (Dolan, 2014a), some of which were data brokers, who aggregate and sell consumer data in the marketplace (Addonizio, 2015). What is disturbing is that much of this data contained unique device identifiers, social security numbers, demographics, court and public records, travel, running routes, dietary habits, sleep patterns, addresses, financial


information, full names, health information, location, date of birth and other similar personally identifiable information (Privacy Rights Clearinghouse, 2013).

ii) Demand for Profiling and Risk Estimation: Organizations and insurers demand so-called “granular”, detailed information, because more detailed behavioral profiles give insurers more accurate impressions of individual health risks. There have been reports of regional hospitals, insurers and grocery retailers collaborating to transform buyers’ grocery purchase data into health risk profiles (Sherbit, 2015).

b. Liability Minimization

Employers who try to implement healthy work practices have to monitor employees to verify improved health and justify reduced insurance premiums. In order to mitigate liabilities related to the storage and potential misuse of data accrued while monitoring, employers often introduce neutral third-party firms, such as Staywell or Welltok, to help remove personal health information from employee records (Firestone, 2014). There is a potential threat that such neutral third-party firms can “feed” or join the data broker industry, exacerbating misuse concerns and liability.

c. End User License Agreement Design

End User License Agreements (EULAs), whose density alone has a significant negative effect on users’ willingness to read and comprehend them (Böhme & Köpsell, 2010), often overreach because they are designed without considering the legal enforceability of their terms and conditions. In the process, a wide variety of terms that are not legally enforceable are included in the EULA, reducing the users’ likelihood of being aware or conscious of the potential consequences of using the wearable device. End users’ lack of interest and vigilance encourages IOT-connected firms to continue sharing personal data. In conclusion, such diverse access to personal data raises questions about a wide variety of policy gaps and marketplace behaviors. Data brokers collect and sell information for many reasons, including credit risk


assessment, fraud prevention and marketing. They operate with minimal transparency, and in most cases consumers are not aware that their data are being collected, shared, or sold. Overall, careless storage, data breaches, poor encryption, policy gaps and unscrupulous organizations are all responsible for the current state of affairs.

LEGAL AND POLICY IMPLICATIONS

In this section we use Nissenbaum’s (2004) Contextual Integrity framework to detect ethical violations and the existence of a policy gap, and to provide recommendations for improvement. To assess whether the storage or transfer of information violates Contextual Integrity, one must examine four factors (Nissenbaum & Patterson, 2016): first, the prevailing context in which data is generated; second, the key actors, their roles and responsibilities; third, key information attributes; and fourth, the principles of data transmission between parties. These factors determine the norms of appropriateness and distribution: the former suggests whether it is appropriate for concerned entities to access and store the information at hand, whereas the latter decides when it is acceptable to share the same information with external parties. We employ two factors, i.e., varying roles and information attributes from Nissenbaum and Patterson (2016), to design a simple 2 x 2 Partnership-Anonymity Matrix (see Table 2 below) to visualize different scenarios of health data exposure. To provide clarity to this Partnership-Anonymity Matrix, we must first provide a working definition and background for the family of terms included under the umbrella of Health Information. Health Information (HI) is a broad term that includes both Protected Health Information (PHI), which contains unique identifiers that can be directly linked back to the patient, and De-Identified Health Information (DHI), which is not readily identifiable.
Here it is important to clarify that DHI is not necessarily anonymous data; in fact, data can vary from completely anonymous to completely identifiable, differing in the extent to which identification is preserved, masked, or eliminated, based on the effort, skill, time and cost necessary for complete identification (Nelson, 2015). Often a minimal level of identifiability is desired in the data, so that health researchers can combine datasets, or conduct effectiveness or policy analyses, for greater societal benefits. To define this safety standard of identifiability, HIPAA’s Privacy Rule provides specific guidelines for alternative methods of de-identification (HHS, 2012), thus mitigating privacy risks for patients and enabling safe data sharing practices. Hence, data processed through the above methods of de-identification, i.e., DHI, is considered safer to share than PHI. Also, as described earlier, vendors and device manufacturers who have partnered with healthcare service providers are subject to HIPAA regulations as Covered Entities (CEs) or Business Associates (BAs). Those who have not partnered are not subject to HIPAA regulations, and are considered Non-Covered Entities (NCEs). The following scenarios depict contexts in which a wearable device manufacturer stores and shares HI with third-party data brokers. The scenarios vary on two factors: “Partnership” indicates whether the wearable device manufacturer has been contracted by a healthcare service provider for patient management services and, as a result, plays the role of a CE or NCE; and “Anonymity” indicates whether the shared data were processed through a de-identification method prescribed in the HIPAA privacy laws, thereby implying DHI or PHI. In terms of implications, the level of Anonymity (DHI or PHI) implies whether a user is vulnerable to being targeted, manipulated or exploited with identifiable information, and Partnership status (CE or NCE) implies whether the user has legal recourse in the event of harm. < Insert Table 2 Here >
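The Partnership-Anonymity Matrix can also be expressed as a small lookup table. The scenario letters for the two PHI cells follow the discussion of scenarios B and D in the text, while the assignment of A and C to the two DHI cells is our own labeling assumption.

```python
# The 2x2 Partnership-Anonymity Matrix as a lookup. Letters for B and D
# follow the text; the A/C assignment to the DHI cells is an assumption.

def classify(partnership, anonymity):
    """partnership: 'CE' or 'NCE'; anonymity: 'DHI' or 'PHI'.
    Returns (scenario, user_vulnerable, legal_recourse)."""
    vulnerable = anonymity == "PHI"     # identifiable data can be misused
    recourse = partnership == "CE"      # HIPAA applies only to CEs/BAs
    scenario = {("CE", "DHI"): "A", ("CE", "PHI"): "B",
                ("NCE", "DHI"): "C", ("NCE", "PHI"): "D"}[(partnership, anonymity)]
    return scenario, vulnerable, recourse

print(classify("NCE", "PHI"))  # -> ('D', True, False): vulnerable, no recourse
```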

Given these conditions, both scenarios A and C are relatively low risk, since the user is not vulnerable in either case, whether they have legal recourse or not. Scenario B is concerning, because PHI has been shared with a third-party data broker; however, since the device manufacturer is a CE, the user has legal recourse for the violation. Scenario D is the most concerning, because user PHI has been shared with a third-party data broker, making the user potentially vulnerable to targeting or manipulation, and the device manufacturer is an NCE, so the user has no legal recourse for any potential harm. How do the above factors of partnership and anonymity decide whether it is appropriate for an entity to have access to HI, and to distribute the same information to other parties? We identify the device manufacturer as the primary party and examine whether it should have access to stored user HI, and then analyze whether it should be allowed to share HI with third-party data brokers. Examining the application of HIPAA, if a device manufacturer is a CE, as a partner it functions as an entity instrumental to healthcare service delivery, which makes it appropriate for it to possess or store user HI. An NCE manufacturer, however, is not required to be HIPAA compliant, because the NCE strictly sells or maintains medical devices, and does not create, transmit, or receive HI on behalf of any CE. Further, an NCE does not share the missions and visions of healthcare providers, and consequently does not need to comply with the operating rules of an industry to which it does not belong. Thus, the NCE device manufacturer can provide lifestyle-related services, which involve the storage of user health- and fitness-related information, and its access to user HI does not violate the norms of appropriateness because it is not subject to HIPAA. Hence, the partnership (or lack thereof) does not affect the appropriateness of the manufacturer’s access to user HI. The second norm, distribution, regulates the flow of information to other parties. It may be appropriate for a device manufacturer to store HI, but that does not imply the manufacturer has the right to further distribute stored HI. HIPAA indicates that whether covered entities can share HI depends on whether it is identifiable. In other words, since PHI is identifiable, contextual integrity is violated when PHI is shared.
Such violation of contextual integrity can lead to informational harms (e.g., targeting vulnerable populations based on sensitive information), informational inequality (e.g., data aggregators such as data brokers wielding significant power over the fates of individual users), and disruption of relationships (e.g., between service provider and client), among others (Nissenbaum, 2004). Overall, scenario B demonstrates a clear violation of contextual integrity and HIPAA law, since the device manufacturer is a covered entity (CE) that has shared identifiable PHI. But the most dangerous scenario is D, which can lead to user vulnerability without offering legal recourse, thus becoming an ethical, but not a legal, violation. Scenarios A and C do not pose any known threat. The above scenarios can be further enriched (and made increasingly more complex) using Nissenbaum (2004) by introducing variations in prevailing contexts, or variations in principles of transmission. One potential limitation of using norm violation as a tool of assessment is that norms are subjective and often dated. When the HIPAA laws were created, healthcare service providers’ medical record documents were the primary, if not the only, sources of health information, which is why non-healthcare entities were considered exceptions to the law. However, the increased penetration of technologies capable of generating PHI, the lack of laws to protect user data from NCEs, and the increasing diversity of non-healthcare providers with access to such information have together increased the risks of consumer data breaches and misuse. In that light, the transfer of PHI from NCEs requires greater scrutiny.

RECOMMENDATIONS

To summarize, the different types of issues illustrated by the scenarios in the previous section concern different entities in the IOT ecosystem. Our recommendations to Regulators and Healthcare Providers address public policy gaps; those to Bystander Beneficiaries relate to marketplace behaviors and the process of developing private governance solutions; and those to Hardware and Infrastructure Providers relate to technological inadequacies and appropriate company responses.

Regulators and Healthcare Providers

First, there is a need to create a federal regulatory “watchdog” unit in the Department of Health and Human Services charged with monitoring i) what new behavioral data can be captured by wearable technology, and ii) to what extent the new behavioral data can be considered unique and identifiable. An example of this type of data is the measurement of a person’s gait and its accuracy in predicting the user’s identity. If more variables such as gait can be identified, HIPAA’s list of unique identifiers needs to be updated. We name these data fields Newly Discovered Unique Identifiers (NDUIs).
This step will help to clarify and update our knowledge of relevant PHI, keeping pace with technological change. Second, for CEs, the NDUIs have to be accounted for in the standardized de-identification process deemed acceptable under HIPAA. According to HIPAA Privacy Rule 164.514(a)-(b), either the expert determination method or the Safe Harbor Rules need to be followed. If the expert determination method is engaged, several principles and considerations, such as replicability, data source availability and distinguishability, are followed to ascertain individuals’ identification risks. One specific consideration, distinguishability, identifies the extent to which selecting a combination of different data fields can make individual users uniquely identifiable. For example, year of birth, gender and the first three digits of the zip code are unique for only 0.04% of US residents, whereas date of birth, gender and the 5-digit zip code can be unique for more than 50% of US residents (HHS, 2012). The NDUIs captured through wearable devices need to be included in these lists, and tested for distinguishability using combinations of other variables. If the Safe Harbor Rules are used for de-identification, the NDUIs will have to be added to the list of identifiers that need removal. Third, though HIPAA rules are not applicable to NCEs, they are held accountable by the Federal Trade Commission (FTC), based on section 5 of the FTC Act, to “prevent unfair or deceptive acts or practices in or affecting commerce”. Hence, in the spirit of encouraging transparency, accountability and security (from the Fair Information Practice Principles), the FTC can i) require NCE device manufacturers to record and track any transactions involving the sharing, sale or distribution of information that would be considered PHI under HIPAA, and ii) appoint third-party agencies or experts to assess and certify parties who buy data from NCE device manufacturers, in order to ensure that the acquisition is not motivated by potential misuse of PHI. The above measures should not prevent any analysis of health data that may lead to medical breakthroughs, but they will add more transparency and vigilance to the existing process.
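The transaction-tracking idea in the third recommendation can be sketched as an append-only, hash-chained log of sharing events, so that audits can later reconstruct who shared what with whom. All entity and field names here are hypothetical.

```python
# Sketch of FTC-style transaction tracking: an append-only log of
# PHI-sharing events. Entries are chain-hashed so tampering is detectable.
# All entity and field names are hypothetical.
import hashlib, json, time

class SharingLedger:
    def __init__(self):
        self.entries = []

    def record(self, seller, buyer, data_fields, purpose):
        entry = {"seller": seller, "buyer": buyer, "fields": sorted(data_fields),
                 "purpose": purpose, "ts": time.time()}
        # Hash the entry together with the previous digest to form a chain.
        prev = self.entries[-1]["digest"] if self.entries else ""
        entry["digest"] = hashlib.sha256(
            (prev + json.dumps(entry, sort_keys=True)).encode()).hexdigest()
        self.entries.append(entry)
        return entry["digest"]

    def trail_for(self, field):
        """Return every recorded transfer that included a given data field."""
        return [e for e in self.entries if field in e["fields"]]

ledger = SharingLedger()
ledger.record("AcmeWearables", "DataBrokerX", ["gait_pattern", "zip3"], "analytics")
print(len(ledger.trail_for("gait_pattern")))  # -> 1
```

A record trail of this kind does not itself prevent misuse, but it makes the parties to each transfer auditable after the fact.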
Even if not adequate to prevent misuse by data brokers, this process will create record trails that can help identify the involved entities in the event of data-related misuse or harm. Fourth, the U.S. Equal Employment Opportunity Commission (EEOC), which lists rules and requirements for operating employee wellness programs, should require employers to monitor the aggregate insurance premiums of participants over time to ensure that employees are not discriminated against based on demographics, such as age. Concerns regarding information misuse have been voiced by the American Association of Retired Persons in the recent past (Abelson, 2016). While the EEOC already provides guidelines enabling fair principles of voluntary (and not mandatory) transmission of data from employees to employers (Nissenbaum & Patterson, 2016), requiring the monitoring of wellness participation effects on insurance premiums will help it enforce compliance with nondiscriminatory work practices.

Bystander Beneficiaries

“Bystanders”, referring to general employers, health care insurance companies, and professional market researchers, need to continuously measure and monitor the effectiveness of wearable devices in corporate wellness programs on employee well-being and healthcare expenditure savings. Since these initiatives rest entirely on the assumption that conscious personal choices can make differences in health outcomes, productivity, and costs, it is important for both employers and insurers to test the validity of that assumption. If the assumption were violated, it would make the multi-party investment in the process of data generation, collection and protection largely futile. The beneficiaries need to be actively involved in self-regulating their own market behaviors as they pertain to personal health privacy, security, and informed consent issues. Such “private governance”, or self-regulation of market behavior (Stringham, 2015; in contrast to public regulation by government entities), exists when a firm, an industry or the business community establishes its own standards of behavior (1) when no public statutory or regulatory requirement exists, or (2) when such standards assist in complying with or exceeding existing statutory or regulatory requirements (Hemphill, 1992). In the following paragraphs we discuss the types of motivations that can drive such self-regulatory behavior, which business communities or third parties should be involved, and which decisions require industry norms or self-regulation.
Industry association members can choose to voluntarily create a code of conduct or standards regime that is generic in its language and scope of practices, in order to accommodate diverse practices among industry members (Wotruba, 1997). There are various motivations for establishing industry codes of conduct and standards regimes. The first is an expression or implication that the standards to be upheld are above and beyond those required by law. An example is the variety of sources of personal information used by insurance companies to assess risk profiles, and their need to set higher standards for the acceptability of data sources. The second is a desire to influence public regulators or discourage potential industry entrants. This is relevant to the data broker industry, which has until now operated below the radar and exponentially increased data sharing without control. Third, self-regulation is more efficient and effective at inducing desirable behavior by those regulated, because companies may be assured a more prominent voice when standards are set (and updated) by non-governmental bodies, such as industry associations, than when they are established through public regulation (Haufler, 2001). This can apply to employers, who can design as well as share best practices on common safety protocols for neutrally storing or transmitting personal data without incurring liabilities. Fourth, it offers a basis for controlling the behavior of company employees whose dealings with customers offer tempting opportunities for ethically questionable actions (Wotruba, 1997). This can prevent neutral employee data storage firms from selling personal data to data brokers for profit.

To institute such self-regulation regimes, it is recommended that key stakeholders addressing personal health privacy, security, and informed consent issues, including general employers, insurance companies, data brokers and professional market researchers, develop market behavior standards through their respective U.S.-based industry associations. Social pressure or political-legal incentives from third parties, i.e., those outside the business-customer transaction, may also induce a company to comply with standards that are seen as embodying “best practice.” As to general employers, the nation’s peak industry association is the U.S. Chamber of Commerce (“Chamber”). The Chamber is the world’s largest business organization, representing the interests of more than three million businesses of all sizes, sectors, and regions (U.S. Chamber of Commerce, 2016). For healthcare insurers, the primary national industry association is America’s Health Insurance Plans (AHIP). Representing nearly 1,300 members, AHIP provides health insurance coverage to more than 200 million Americans (America’s Health Insurance Plans, 2016). Lastly, for professional market researchers, the largest and leading U.S. professional association is the Marketing Research Association (Marketing Research Association, 2016).

Is IOT a Threat to Consumer Consent? Standards need to be developed regarding desirable behaviors which can be connected to types or fields of data that are shared, what variables are considered identifiers, terms of sharing with third parties, restrictions on types of parties that data can be shared with, rights of data ownership, access, usage, transfer and sale, EULA designs, and methods of storing and sharing stakeholder data without triggering conflicts of interest or creation of potential liabilities. Forms of industry self-regulation range from “pure” self-regulation, where there is no government or other stakeholder involvement in the private regulatory process, to public standard-setting or pricing and output setting, where there is a high level of direct government oversight (Garvin, 1983). It is recommended that the greatest potential for industry self-regulation resides with a mixed form of industry rule-making and government oversight (Gupta & Lad, 1983). To emphasize the government oversight aspect of effective industry regulation, we identify three agencies which could have oversight responsibilities (based on their respective scope of regulatory authority): Department of Health and Human Services, Federal Trade Commission, and the Equal Employment Opportunity Commission. As to the aforementioned “policy” gaps noted in the HIPAA legislation and regulations as well as company policy gaps, some form of industry self-regulation needs to be designed to both assist in compliance with this law as it pertains to protecting the privacy, security and informed consent of persons who are wearing personal medical or health devices, and to exceed (when and where applicable as deemed by participating industry associations) the requirements of the law to further ensure protection of data privacy, security, and informed consent for patients and consumers. 
Hardware and Infrastructure Providers

Hardware and infrastructure providers should be aware that their devices, networks and servers may be used to collect, store and disseminate protected, or unprotected, sensitive information. Devices generally do not store or transmit completely anonymous data, both because doing so would preclude using the information within smartphone or computer applications, and because some form of identifier is usually embedded. As such, these providers need first to establish proper identification protocols that accurately detect whether information entering, traveling through, or being stored within their systems is protected or unprotected sensitive information. Second, once identified as sensitive, such information should inherit a degree of protection in routing, usage, and storage, usually accomplished through encryption. Furthermore, with the introduction of this identifier, the collected information must also be assessed to determine its nature and whether it can, in any way, be linked to the user. If such linkage is possible, then either the hardware developer or the application developer needs to ensure that users are informed of the nature of the collected and transmitted data, and that the data are safeguarded to the appropriate extent. However, under current regulation it is possible that this information is neither detected as sensitive nor protected. Hence it is preferable that providers take the extra step of building best-practice guidelines or self-regulatory industry standards for safeguarding the data regardless of regulatory compliance.
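The two steps above (detecting sensitive information, then protecting linkable identifiers) can be sketched as follows. The field lists are illustrative assumptions, and a keyed hash (HMAC) stands in for whatever encryption or tokenization scheme a provider would actually deploy.

```python
# Sketch of sensitive-data handling for an infrastructure provider:
# (1) detect whether incoming data contains sensitive or linkable fields, and
# (2) pseudonymize linkable identifiers with a keyed hash before storage.
# Field names are illustrative assumptions, not a standard taxonomy.
import hmac, hashlib

SENSITIVE_FIELDS = {"device_id", "heart_rate", "gps_trace", "sleep_stats"}
LINKABLE_FIELDS = {"device_id"}  # fields that can tie a record to a user

def is_sensitive(record):
    return bool(SENSITIVE_FIELDS & record.keys())

def protect(record, key: bytes):
    """Replace linkable identifiers with a keyed pseudonym (HMAC-SHA256),
    leaving non-linkable measurements usable by applications."""
    out = dict(record)
    for field in LINKABLE_FIELDS & out.keys():
        out[field] = hmac.new(key, str(out[field]).encode(),
                              hashlib.sha256).hexdigest()[:16]
    return out

reading = {"device_id": "AB:CD:EF:01", "heart_rate": 71}
if is_sensitive(reading):
    stored = protect(reading, key=b"rotate-me-regularly")
```

Because the pseudonym is keyed, only a party holding the key can re-link records to a device, which is one way to give sensitive data the inherited protection described above.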

SUMMARY AND CONCLUSIONS

As an important component of the emerging IOT, the healthcare sector faces serious issues that challenge how its various entities collect, store, and disseminate sensitive medical and health data on patients and health-conscious consumers. These issues concern data privacy and security and the acquisition of "informed" consent from the patients and consumers in question. In this paper, the authors focus on two categories of healthcare technology, wearable medical devices and wearable health monitoring devices, and evaluate the existing federal regulatory environment on privacy, security, and informed consent concerning medical information, which is found in HIPAA. After evaluating the entities involved in the wearable healthcare ecosystems, the analysis revealed several policy "gaps" in multiple policy frameworks (including HIPAA), as well as self-interest-maximizing, legal marketplace behaviors by multiple entities that threaten the consent of the final consumer of wearable devices. In the last section of the paper, the authors address these policy "gaps" by offering practical, implementable recommendations in three categories: regulators and healthcare providers; bystander beneficiaries; and hardware and infrastructure providers.


For regulators and healthcare providers, the authors recommend the creation of a research and discovery "watchdog" agency within HHS that continuously monitors what new behavioral data can be captured by wearable technology. They further suggest measuring unique identifiability, involving the FTC in certification and monitoring of transactions, and involving the EEOC in tracking the use of wearable-generated information. For bystander beneficiaries, such as general employers, health insurance companies, and professional market researchers, the authors recommend active involvement in self-regulation of their own market behavior as it pertains to personal health privacy, security, and informed consent: these stakeholders should develop market behavior standards through their respective U.S.-based industry associations, and association members should voluntarily create a code of conduct or standards regime that is narrow in focus yet flexible enough to accommodate diverse practices among industry members. For hardware and infrastructure providers, the authors recommend establishing identification protocols for sensitive data and building encryption standards to protect it.

The greater challenge in this context goes beyond the mere protection of privacy and the validation of consent: it is to strike a delicate balance between protecting sensitive information and enabling true consent through processes that do not stifle innovation or the potentially path-breaking discoveries of research efforts employing health-related data. The Privacy Rule does establish the legal basis for how healthcare data may be employed in larger research analyses to assist future medical breakthroughs.
However, the diffusion of sensors capable of capturing health-related information, coupled with the advent of "cloud" computing, has further exacerbated healthcare issues with data integrity. These changes require greater effort on the part of the healthcare information sector to further develop both the transaction (codification) standards and the vocabulary standards that ensure data integrity (in this case, anonymity in research projects) for healthcare information standards operating within the boundaries established by the Privacy Rule (Rode, 2012). In essence, the discovery of new identifiable data will require parties in the existing ecosystem to decide on the changes in encryption and storage needed to optimize both discovery and anonymity. The recommendations in this paper offer a departure point for organizations directly involved in the wearable medical device and wearable health monitoring device industries, and also for the industries and firms that are potential users of this sensitive information.

REFERENCES

Abelson, R. (2016, October 24). AARP sues U.S. over rules for wellness programs. The New York Times. Retrieved from https://www.nytimes.com/2016/10/25/business/employee-wellness-programsprompt-aarp-lawsuit.html

Addonizio, G. (2015). The privacy risks surrounding consumer health and fitness apps, associated wearable devices, and HIPAA's limitations. Seton Hall Law (Paper 861). Retrieved from http://scholarship.shu.edu/student_scholarship/861

Alkon, A. H., & Agyeman, J. (2011). Cultivating food justice: Race, class, and sustainability. MIT Press.

America's Health Insurance Plans. (2016). About America's Health Insurance Plans. Retrieved October 23, 2016, from https://www.ahip.org/about-us/

Barcena, M. B., Wueest, C., & Lau, H. (2014). How safe is your quantified self. Symantec. Retrieved from https://www.symantec.com/content/dam/symantec/docs/white-papers/how-safe-is-yourquantified-self-en.pdf

Berry, R., & Reisman, M. (2012). Policy challenges of cross-border cloud computing. United States International Trade Commission, Journal of International Commerce and Economics, web version. Retrieved from https://www.usitc.gov/journals/policy_challenges_of_crossborder_cloud_computing.pdf

Bertoni, S. (2014, December 8). Oscar Health using Misfit wearables to reward fit customers. Retrieved April 23, 2017, from http://www.forbes.com/sites/stevenbertoni/2014/12/08/oscarhealth-using-misfit-wearables-to-reward-fit-customers/

Biltekoff, C. (2007). The terror within: Obesity in post 9/11 US life. American Studies, 48(3), 29–48.

Bipartisan Policy Center. (2012). What is driving US health care spending. America's Unsustainable Health Care Cost Growth. Washington, DC: Bipartisan Policy Center.

Blum, B., & Dare, F. (2015). Enhancing clinical practice with wearables: Innovations and implications. Accenture.

Böhme, R., & Köpsell, S. (2010). Trained to accept? A field experiment on consent dialogs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2403–2406). ACM.

Bradbury, D. (2015, December 17). After safe harbour: Navigating data sovereignty. Retrieved October 6, 2016, from http://www.theregister.co.uk/2015/12/17/navigating_data_sovereignty/

Cate, F. H. (2010). Protecting privacy in health research: The limits of individual choice. California Law Review, 1765–1803.

Cate, F. H., & Mayer-Schönberger, V. (2013). Notice and consent in a world of Big Data. International Data Privacy Law, 3(2), 67–73.

Dohr, A., Modre-Osprian, R., Drobics, M., Hayn, D., & Schreier, G. (2010). The Internet of Things for Ambient Assisted Living. ITNG, 10, 804–809.

Dolan, B. (2014a, May 23). In-depth: Consumer health and data privacy issues beyond HIPAA. Retrieved October 24, 2016, from http://www.mobihealthnews.com/33393/in-depth-consumer-healthand-data-privacy-issues-beyond-hipaa

Dolan, B. (2014b, December 16). Prediction: Health wearables to save 1.3 million lives by 2020. Retrieved October 24, 2016, from http://www.mobihealthnews.com/39062/prediction-healthwearables-to-save-1-3-million-lives-by-2020


Doukas, C., & Maglogiannis, I. (2012). Bringing IoT and cloud computing towards pervasive healthcare. In 2012 Sixth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS) (pp. 922–926). https://doi.org/10.1109/IMIS.2012.26

Evans, D. (2011, April). The Internet of Things: How the next evolution of the Internet is changing everything. Cisco, IBSG. Retrieved from http://www.cisco.com/c/dam/en_us/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf

Federal Trade Commission. (2014). Data brokers: A call for transparency and accountability. Retrieved from https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparencyaccountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf

Firestone, C. M. (2014). Developing policies for the Internet of Things, background materials. Twenty-Ninth Annual Aspen Institute Conference on Communications Policy.

FTC Staff. (2015). Internet of Things: Privacy and security in a connected world. Technical report, Federal Trade Commission.

Future of Privacy Forum. (2016). FPF mobile apps study. Retrieved from https://fpf.org/wp-content/uploads/2016/08/2016-FPF-Mobile-Apps-Study_final.pdf

Garvin, D. A. (1983). Can industry self-regulation work? California Management Review, 25(4), 37–52.

Gibbs, S. (2015, October 6). What is "safe harbour" and why did the EUCJ just declare it invalid? The Guardian. Retrieved from https://www.theguardian.com/technology/2015/oct/06/safeharbour-european-court-declare-invalid-data-protection

Gupta, A. K., & Lad, L. J. (1983). Industry self-regulation: An economic, organizational, and political analysis. Academy of Management Review, 8(3), 416–425.

Guthman, J. (2011). Weighing in: Obesity, food justice, and the limits of capitalism (Vol. 32). Univ of California Press.


Haufler, V. (2001). Public role for the private sector: Industry self-regulation in a global economy. Washington, DC: Carnegie Endowment for International Peace. Retrieved from http://carnegieendowment.org/2001/08/14/public-role-for-private-sector-industry-selfregulation-in-global-economy-pub-679

Healey, J., Pollard, N., & Woods, B. (2015). The healthcare Internet of Things: Rewards and risks. Atlantic Council. Retrieved from http://www.atlanticcouncil.org/publications/reports/the-healthcareinternet-of-things-rewards-and-risks

Hemphill, T. A. (1992). Self-regulating industry behavior: Antitrust limitations and trade association codes of conduct. Journal of Business Ethics, 11(12), 915–920.

HHS. (2012). Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Retrieved April 28, 2017, from https://www.hhs.gov/hipaa/for-professionals/privacy/specialtopics/de-identification/index.html

Isidore, C. (2015, May 15). Apple, AT&T fight sale of RadioShack customer data. Retrieved October 23, 2016, from http://money.cnn.com/2015/05/15/news/companies/radioshack-appleatt/index.html

Kapp, M. B. (2006). Patient autonomy in the age of consumer-driven health care: Informed consent and informed choices. Journal of Health and Biomedical Law, 2, 1, 2.

Kelley, M. B. (2013, March 21). CIA chief tech officer: Big data is the future and we own it. Retrieved October 23, 2016, from http://www.businessinsider.com/cia-presentation-on-big-data-2013-3

Langston, N. (2010). Toxic bodies: Hormone disruptors and the legacy of DES. Yale University Press.

Lee, K. (2015, July). Wearable health technology and HIPAA: What is and isn't covered. Retrieved October 23, 2016, from http://searchhealthit.techtarget.com/feature/Wearable-healthtechnology-and-HIPAA-What-is-and-isnt-covered


Maddox, T. (2014, July 3). The scary truth about data security with wearables. Retrieved October 23, 2016, from http://www.techrepublic.com/article/the-scary-truth-about-data-security-withwearables/

Maddox, T. (2015, October 7). The dark side of wearables: How they're secretly jeopardizing your security and privacy. Retrieved October 23, 2016, from http://www.techrepublic.com/article/the-dark-side-of-wearables-how-theyre-secretlyjeopardizing-your-security-and-privacy/

Marketing Research Association. (2016). About MRA. Retrieved October 24, 2016, from http://www.marketingresearch.org/about

marketsandmarkets.com. (2015, October). IoT healthcare market by components & application - 2020. Retrieved October 24, 2016, from http://www.marketsandmarkets.com/Market-Reports/iothealthcare-market-160082804.html

Mohd, A. M. (2016). Internet of Things (IoT) healthcare market by component (implantable sensor devices, wearable sensor devices, system and software), application (patient monitoring, clinical operation and workflow optimization, clinical imaging, fitness and wellness measurement) - Global opportunity analysis and industry forecast, 2014-2021. Allied Market Research. Retrieved from https://www.alliedmarketresearch.com/iot-healthcare-market

Munro, D. (2015, December 31). Data breaches in healthcare totaled over 112 million records in 2015. Retrieved October 23, 2016, from http://www.forbes.com/sites/danmunro/2015/12/31/databreaches-in-healthcare-total-over-112-million-records-in-2015/

Nash, L. L. (2006). Inescapable ecologies: A history of environment, disease, and knowledge. Univ of California Press.


Nelson, G. S. (2015). Practical implications of sharing data: A primer on data privacy, anonymization, and de-identification. Semantic Scholar. Retrieved from https://www.semanticscholar.org/paper/Practical-Implications-of-Sharing-Data-A-Primer-onNelson/8a091b5cc4d3f861c0080d7b3ddf51b717244e6c

Nissenbaum, H. (2004). Privacy as contextual integrity. Wash. L. Rev., 79, 119.

Nissenbaum, H., & Patterson, H. (2016). Biosensing in context: Health privacy in a connected world. Quantified: Biosensing Technologies in Everyday Life, 79.

Olson, P. (2014, June 19). Wearable tech is plugging into health insurance. Retrieved October 23, 2016, from http://www.forbes.com/sites/parmyolson/2014/06/19/wearable-tech-health-insurance/

Privacy Rights Clearinghouse. (2013, July 15). Privacy Rights Clearinghouse releases study: Mobile health and fitness apps: What are the privacy risks? Retrieved October 23, 2016, from https://www.privacyrights.org/blog/privacy-rights-clearinghousereleases-study-mobile-health-and-fitness-apps-what-are-privacy

Rockoff, J. D. (2016, October 4). J&J warns insulin pump vulnerable to cyber hacking. Wall Street Journal. Retrieved from http://www.wsj.com/articles/j-j-warns-insulin-pump-vulnerable-tocyber-hacking-1475610989

Rode, D. (2012). Data integrity in an era of EHRs, HIEs, and HIPAA: A health information management perspective. Office for Civil Rights, Department of Health and Human Services, National Institute of Standards and Technology Conference, Safeguarding Health Information: Building Assurance through HIPAA Security, June 6-7, Washington, DC.

Rohokale, V. M., Prasad, N. R., & Prasad, R. (2011). A cooperative Internet of Things (IoT) for rural healthcare monitoring and control. In 2011 2nd International Conference on Wireless Communication, Vehicular Technology, Information Theory and Aerospace Electronic Systems Technology (Wireless VITAE) (pp. 1–6). https://doi.org/10.1109/WIRELESSVITAE.2011.5940920


Scholl, M. A., Stine, K. M., Hash, J., Bowen, P., Johnson, L. A., Smith, C. D., & Steinberg, D. I. (2008). An introductory resource guide for implementing the Health Insurance Portability and Accountability Act (HIPAA) Security Rule. Gaithersburg, MD: National Institute of Standards & Technology, U.S. Department of Commerce.

Sherbit. (2015, April 6). How insurance companies profit from "wearables." Retrieved October 23, 2016, from https://www.sherbit.io/the-insurance-industry-and-the-quantified-self/

Snyder, M. (2015, June 19). Police: Woman's fitness watch disproved rape report. Retrieved October 23, 2016, from http://abc27.com/2015/06/19/police-womans-fitness-watch-disproved-rape-report/

Solove, D. (2002). Access and aggregation: Privacy, public records, and the Constitution. Minnesota Law Review, 86. Retrieved from http://scholarship.law.gwu.edu/cgi/viewcontent.cgi?article=2079&context=faculty_publications

Stankovic, J. A. (2014). Research directions for the Internet of Things. IEEE Internet of Things Journal, 1(1), 3–9.

Stringham, E. P. (2015). Private governance: Creating order in economic and social life. Oxford University Press, USA.

Taylor, H. (2016, March 4). How the "Internet of Things" could be fatal. Retrieved September 22, 2016, from http://www.cnbc.com/2016/03/04/how-the-internet-of-things-could-be-fatal.html

Terry, K. (2014). Mobile polysensors offer new potential for patient monitoring. Medscape Medical News. Retrieved from http://www.medscape.com/viewarticle/828637

The Physicians Foundation. (2012, November). Drivers of healthcare costs white paper. Boston, MA. Retrieved from http://www.physiciansfoundation.org/focus-areas/drivers-of-healthcare-costswhite-paper

U.S. Chamber of Commerce. (2016). About the U.S. Chamber. Retrieved October 24, 2016, from https://www.uschamber.com/about-us/about-the-us-chamber


US Department of Health and Human Services. (2003). Summary of the HIPAA Privacy Rule. Washington, DC: Office for Civil Rights. Retrieved from http://www.hhs.gov/hipaa/forprofessionals/privacy/laws-regulations/

U.S. Government Publishing Office. (2013, January 25). Federal Register, Volume 78, Issue 17. Office of the Federal Register, National Archives and Records Administration. Retrieved from https://www.gpo.gov/fdsys/granule/FR-2013-01-25/2013-01073/content-detail.html

Walker, K. L. (2016). Surrendering information through the looking glass: Transparency, trust, and protection. Journal of Public Policy & Marketing, 35(1), 144–158.

Weber, R. H. (2010). Internet of Things – New security and privacy challenges. Computer Law & Security Review, 26(1), 23–30.

Wotruba, T. R. (1997). Industry self-regulation: A review and extension to a global setting. Journal of Public Policy & Marketing, 38–54.


FIGURES

Figure 1: Healthcare IOT Typology

[Figure 1 presents a typology of healthcare IOT devices, comprising Wearable Health Monitoring Devices, Medical Wearable Devices, Medical Embedded Devices, and Stationary Medical Devices, characterized by the attributes Identification, Location Sensing, and Connectivity.]

Figure 2: Factors Driving Personal Health Data Exposure


TABLES

Table 1: Entities Involved in the Wearable Healthcare Ecosystems

IOT Entity: Healthcare
Description: Service providers delivering treatment to ensure the continued well-being of individuals.
Types of Entities: Healthcare providers, hospitals, laboratories, patients
Examples of Entities: Presbyterian Hospital, family clinics, Quest Diagnostics

IOT Entity: Bystander Beneficiaries
Description: Entities that seek to capture market value by utilizing advances in healthcare or well-being related IOT technologies without investing in the delivery process.
Types of Entities: Insurance risk assessors, employers, marketers, data brokers
Examples of Entities: Oscar, Autodesk, BP, Blue Cross Blue Shield, CVS, Vivametrica

IOT Entity: Hardware and Infrastructure
Description: Entities invested in the delivery process by manufacturing, connecting, selling, or supporting IOT devices, or by developing networking infrastructure.
Types of Entities: Device manufacturers, infrastructure providers, tech support providers, app developers
Examples of Entities: Fitbit, Nike Fuel, Apple, Samsung, Wildflower, Bluetooth

IOT Entity: Regulatory
Description: Entities primarily interested in ensuring that technology specifications and ethical practices are met and employed in the use of IOT technologies.
Types of Entities: Federal and state agencies, "watchdog" privacy interest groups
Examples of Entities: Federal Trade Commission, Future of Privacy Forum


Table 2: Partnership-Anonymity Taxonomy

                                          Anonymity of Shared Health Information
Partnership of Device Manufacturer        Shared DHI          Shared PHI
with Health Care Service Provider
  Partnered (CEs)                         A (CE, DHI)         B (CE, PHI)
  Not Partnered (NCEs)                    C (NCE, DHI)        D (NCE, PHI)
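The four cells of Table 2 amount to a two-dimensional lookup, which can be made explicit in a few lines of code. This is our illustrative encoding, assuming CE/NCE denote covered and non-covered entities under HIPAA and that DHI/PHI distinguish de-identified from identifiable health information, as the table's anonymity axis suggests.

```python
# Illustrative encoding of Table 2's Partnership-Anonymity Taxonomy.
# Assumptions: "CE"/"NCE" = covered / non-covered entity (manufacturer
# partnered or not with a healthcare provider); "DHI"/"PHI" = de-identified
# vs. protected (identifiable) health information. Quadrants A-D follow the table.
def taxonomy_cell(partnered: bool, identifiable: bool) -> str:
    entity = "CE" if partnered else "NCE"
    data = "PHI" if identifiable else "DHI"
    quadrant = {("CE", "DHI"): "A", ("CE", "PHI"): "B",
                ("NCE", "DHI"): "C", ("NCE", "PHI"): "D"}[(entity, data)]
    return f"{quadrant} ({entity}, {data})"

print(taxonomy_cell(partnered=True, identifiable=False))   # A (CE, DHI)
print(taxonomy_cell(partnered=False, identifiable=True))   # D (NCE, PHI)
```

Quadrant D, identifiable data held by a non-covered entity, is the combination the paper flags as leaving end users most vulnerable, since HIPAA's protections do not attach.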