When brain computer interfaces move from research to commercial use

Christian B. J. Hansen
De Montfort University
The Gateway, Leicester LE1 9BH
+44 7400 714241

[email protected]

ABSTRACT
This paper explores how ethical concerns change when brain computer interfaces move from a research setting into a commercial setting. It argues that the transition from research to commercial settings may change the intentions for the artefact, and explores hypotheses about what this change might affect. The paper discusses how possible intentions for brain computer interfaces in commercial settings will shape the products developed, and what consequences this might have for individuals and society. The ethical concerns discussed include privacy, enhancement and the digital divide. The paper also presents possible future research that could help investigate both the hypotheses put forward and, more generally, the topic of brain computer interfaces moving from research to commercial settings.

Categories and Subject Descriptors
K.4.1 [Computers and Society]: Public Policy Issues – Ethics, Privacy, Regulation

General Terms
Human Factors

Keywords
Brain Computer Interfaces, BCI, Responsible Research and Innovation, RRI, Ethics, Privacy, Enhancement, Equity and Digital Divide.

1. INTRODUCTION
The European Commission has made responsible research and innovation a top priority on its agenda, which in turn has created considerable focus on how to make research and innovation responsible [20]. While there has been a focus on how to do so, there seems to be a lack of information on what happens to ethical concerns when technology makes the transition from research to commercially available products. Therefore, this paper explores how ethical concerns change for brain computer interfaces (BCI) when they make the transition from research to commercial usage. Specifically, this paper will focus

on how intentions for the BCI will change the product that ends up being developed, and what consequences this will have for the end user and society. It is this focus on the relationship between developers, the BCI and the consequences of both that is of interest in this article.

Figure 1: Diagram of change from research to commercial development

This paper explores the current literature dealing with the ethics of BCI in research settings, and provides hypotheses about what occurs when the intentions for the developed BCI change. The focus in this paper is on the change in intentions for the BCI, and not on the people or organizations behind the technology. The change in intentions for the artefact changes the impact BCI will make on individuals and society. Therefore, by exploring how the intention of BCI changes the consequences it might have, we can identify what further research could answer the questions that arise from this change.

2. BACKGROUND
This section describes two different discourses: the ethics of neuroscience [23], which will be referred to as neuroethics, and responsible research and innovation (RRI) [25]. First, however, a short description of BCI is in order. Brain computer interfaces take many forms, such as invasive, non-invasive, wet and dry BCI. An invasive BCI uses interfaces that are implanted directly onto the brain. These devices are rarely used by healthy individuals, as they require surgery and are not biologically sustainable, which introduces significant risk. A non-invasive BCI instead reads signals on the surface of the scalp rather than directly from the brain. This technology has more potential usage for healthy individuals, as using it can be compared to wearing a smart watch, or using a keyboard or even a computer mouse [16]. This paper focuses specifically on non-invasive BCI, as non-invasive BCI are now emerging as commercial products. This paper will

particularly focus on dry BCI, as these are the easiest of the wide variety of BCI to commercialize: they require the least preparation from users and are non-invasive. Dry BCI differ from wet BCI by not using any gel, which reduces preparation time; in return, their accuracy and ability to read signals are reduced [16]. There are also good reasons to believe that dry non-invasive BCIs in particular will form the most prominent BCI market in the near future. This is based on the BNCI Horizon reports, which report 51% of surveyed industries using some sort of electroencephalography (EEG), and only 6% using invasive BCI.

Current BCIs work by reading the electrical activity across the scalp. By doing so it is possible to create a model of where activity is present in the brain. Various techniques are then used to give meaning to this data, such as algorithms that measure the difference between a resting state and an active state and provide actions or feedback based on these two states. Companies are using this technology to provide users with commercial products for various tasks such as therapy (including meditation), entertainment or research [3, 10].

The history of neuroethics has dealt with ethical concerns regarding research into the brain and brain computer interfaces. Topics addressed range from privacy to enhancement [14, 18, 29]; however, there is a lack of research looking at the different stages of neurotechnology development (such as that of brain computer interfaces) and what impact these stages might have on the ethical concerns. On privacy specifically, some in neuroethics suggest that the information collected is no different from the information collected in psychological research [1]. Others point to the predictive nature of neuroimaging as similar to genetic information and suggest that the privacy laws used to regulate genetic information be used as a basis for laws covering neuroimaging data [24]. Whether or not the information is classified at the same privacy level as genetic information, the information collected by these devices is nonetheless private and must be treated as such. What both sides in this debate do not cover is what the change from a research setting to a commercial setting means for privacy concerns. The current literature discusses potential issues, or issues related to keeping data private, within a research setting, a commercial setting or both; this leaves a gap in knowledge about how ethical concerns change when new steps in development are taken. This gap in knowledge is not isolated to privacy concerns but applies to most ethical concerns.

The discourse of responsible research and innovation is mainly focused on developing and specifying what RRI is, or on how RRI can make an impact on current research and industry. This can be attributed to the fact that RRI is a fairly new term, although it leans upon more settled discourses such as technology assessment and computer ethics. While these discussions are interesting, they leave out an important element: the differences between research and commercial innovation. Where the discourse is interested in the differences between research and commercial settings, it is largely about how research can be adopted in commercial settings and society [13, 15]. By not focusing on this difference there is a risk of missing important aspects of technology assessment and computer ethics. So while good effort is being put into asking how to shape research and innovation, there seems to be a lack of discussion of what the differences are between commercial innovation and research. Due to this lack of focus on the topic, areas which are specific to either innovation or research might be missed.
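As a brief technical aside, the resting-versus-active classification described in this background can be illustrated with a minimal sketch. This is a toy example only, not any vendor's actual pipeline: the band choice (alpha, 8–13 Hz), the suppression-ratio threshold and the synthetic signals are all assumptions made purely for illustration.

```python
import numpy as np

def band_power(signal, fs, band):
    """Average spectral power of `signal` within a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def classify_state(signal, fs, rest_baseline, band=(8.0, 13.0), ratio=0.7):
    """Label a window 'active' when alpha-band power drops well below the
    resting baseline (alpha suppression), otherwise 'resting'."""
    power = band_power(signal, fs, band)
    return "active" if power < ratio * rest_baseline else "resting"

# Hypothetical usage: calibrate on a resting window, then classify new windows.
fs = 256  # samples per second (assumed rate)
rng = np.random.default_rng(0)
t = np.arange(fs) / fs
# Synthetic signals: strong 10 Hz alpha at rest, suppressed alpha when active.
resting = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
active = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
baseline = band_power(resting, fs, (8.0, 13.0))
print(classify_state(active, fs, baseline))  # -> active
```

Real systems add filtering, artifact rejection and trained classifiers, but the principle — mapping the difference between two brain states to an action or feedback signal — is the same.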

3. INTENTION
While human intention is a philosophical topic that has been discussed at great length [8, 21, 27], in this article intentions mean the purpose for which a BCI is designed and its intended use. It is in this sense that intentions are to be understood throughout the article. As in Figure 1 in the introduction, the intentions of interest here concern the relationship between the BCI developer and the product. The intentions discussed are therefore the intentions for the artefact, not the intentions of the artefact or of its developer. If the intention for the artefact is to provide wheelchair users with another interface for controlling their wheelchair, the device will look different than if it were designed for playing a video game. It is these shifts in emphasis that this paper examines, asking what they might mean for the ethical consequences for individual users and society.

One could argue that the actual consequences of an artefact may not be predictable, or possible to design for or against. While this is true, it is still a worthwhile exercise to speculate about what might happen when technology moves from research to commercial use. The consequences brought forth in the following sections are therefore not complete, and there is good reason to believe new concerns will emerge once this change is complete. The reason intentions are worth looking at is that the values embedded in artefacts change with the intentions for the artefact, as described in the value-based design literature [12, 19]. For example, the Intel chip case brought up by Nissenbaum (2001) shows that the intention of making a chip more secure against hardware theft raised privacy concerns. In a similar way, the intentions for a commercial BCI might raise other ethical concerns than a BCI developed for research. It is this change in intentions, and thereby in consequences, that the following sections discuss.

4. CONSEQUENCES
The following sections discuss how the values and intentions embedded into a commercial brain computer interface might affect the consequences for individuals and society, compared to BCI developed in a research setting. The ethical concerns discussed are privacy, enhancement, and the digital divide. While these are not a comprehensive list of concerns, they are among those most likely to be raised when brain computer interfaces move from research to commercial settings. As mentioned previously, the consequences described in the following sections are hypotheses about what might happen, and future research will be required to evaluate whether these hypotheses hold.

4.1 Privacy
In research, privacy might not always be a value directly in focus when developing BCI products, because privacy is partly handled by organizational protocols such as ethical reviews, restrictions on data ownership, and other means of protecting users and society from gathered data being misused or disclosed. When the technology moves into commercial usage there are various interesting possibilities for the value of privacy in the brain computer interface, and this paper will hypothesise about three of them: developing a BCI with privacy in mind, with share-ability in mind, and finally without privacy, or the lack thereof, in mind.

If the value of privacy is embedded into the BCI, one consequence could be that individual users who value privacy would be more likely to adopt the technology. The same privacy could, however, mean that society has less ability to police what data is being collected and for what purpose, as it would be harder to gain insight. The worry about not having easy access to user data generated with BCI devices should not be too great, though, because the data that should interest law enforcement is data that is already collectable. This is data closer to the output than to the input of the BCI, just as keyboard inputs are interesting but not whether the user is typing with fingers or another limb. One could argue that law enforcement would be interested in direct BCI inputs, in the form of brain wave data, as these could be used to identify certain states of mind. This is, however, still a future scenario, as research is not yet able to use BCI data in such a way. It is nonetheless an interesting topic, one fortunately already under discussion, and something we as a society need to take a stance on [24, 26]. It might also be that while more products would be sold, there would be fewer options for companies to make a profit, as there are fewer ways to use user data as a product. This could slow down commercialization, as there would be fewer incentives for companies to develop BCI products.

If, however, the intention for the BCI is to make a product where the BCI itself is not the main source of income, we could see BCI products that focus on gathering user data and sharing it with commercial third-party companies. This could increase the number of BCIs sold, as it would make it possible to sell products more cheaply; in return, it would raise privacy issues. Whether people would be willing to give up their rights to information about what is going on in their brain is a question worth asking. Companies such as Google and Facebook have been successful in providing products to the world in exchange for personal data; whether this will be a viable business model for BCI developers is yet to be seen. The consequence of this course of development would be that more people could gain access to these devices, which could be argued to be a good thing. The question remains, however, whether too much privacy is traded for a cheaper product. Commercial products such as those provided by Facebook seem to have made the concept of digital privacy fuzzy [30]. A similar change could be the impact of BCI on the notion that brain activity is private.

Lastly, it might be that there is no focus on privacy in the BCI device at all. This is the most difficult case to predict, as it leaves both possibilities open and could lead to devices that raise privacy concerns without that being the intent. Having a position on whether a BCI should be developed with the intent of being privacy enhancing or not is therefore important, as it at least forces commercial developers to take a stance on what they want their devices to be used for.
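The output-versus-input distinction drawn in this section — sharing only the derived command, like a keypress, while raw brain data never leaves the device — can be sketched as a small design pattern. This is a hypothetical illustration of a privacy-embedding design, not any real product's API; the class and the stand-in classifier are invented for the example.

```python
class PrivacyPreservingBCI:
    """Hypothetical privacy-by-design wrapper: raw EEG windows stay inside
    this object, and only the derived command (the 'output-like' data,
    analogous to a keypress) is ever exposed to applications."""

    def __init__(self, classifier):
        self._classifier = classifier  # maps a raw EEG window -> command string
        self._raw_buffer = []          # raw data; never returned to callers

    def ingest(self, eeg_window):
        """Accept a raw EEG window from the headset."""
        self._raw_buffer.append(eeg_window)

    def next_command(self):
        """Return only the classification result; raw data is consumed here."""
        if not self._raw_buffer:
            return None
        window = self._raw_buffer.pop(0)
        return self._classifier(window)

# Hypothetical usage with a trivial stand-in classifier.
bci = PrivacyPreservingBCI(lambda w: "select" if sum(w) > 0 else "rest")
bci.ingest([0.2, 0.5, -0.1])
print(bci.next_command())  # -> select
```

Under such a design, the data available to third parties (or law enforcement) is the same kind of data a keyboard already produces, which is the point the section makes about output-like versus input-like data.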

4.2 Enhancement
The consequences for enhancement when the intention for the BCI is commercial viability are very hard to evaluate, as there are many different notions of what the concept of enhancement covers. The change in effect, however, might be the most noticeable to society, as companies have large potential to reach many people. BNCI Horizon (2015) mentions that there are over 100 million students in the EU alone; even a small percentage of these students using BCI would constitute a large market [22]. The ethical concern with this type of introduction of enhancement is that there is no oversight, either of the way people enhance themselves when it is a commercial product, or of who is able to access these devices. This creates a large set of issues that were also a concern in research, but there these issues were regulated, just as with privacy concerns. The major change is that while enhancement BCI in research settings is focused on gathering new knowledge and progressing research, enhancement BCI in a commercial setting would be focused on marketability. In a research setting, highly accurate results are a major concern, whereas for commercial products accuracy may matter less as long as the results provided are marketable.

The major issue regarding enhancement will be how BCI is defined. There are two directions this definition could take: BCI as a training device, or as an enhancement device. If BCI is defined as a training device, its effects would be categorised like a treadmill's effect on muscle development. Under this definition one could argue that there are no ethical concerns regarding enhancement, as the digital divide concerns could be solved in the same way as with physical training devices such as treadmills and exercise bikes in training centres. Due to the relatively low hardware costs such a solution could be viable, and such centres could offer relatively low fees, making this form of training accessible to most. Whether this definition is the most appropriate is, however, still unclear, and further research is needed to determine whether it is.

If BCI devices are instead considered enhancement devices, this would indicate that they should be categorized as medical devices to be used only by trained therapists. This definition would have implications for current commercial BCI devices, many of which are sold as self-therapeutic or training devices [3]. They would fall under stricter regulations, which could make the transition from research to commercial BCI devices more difficult. It could also increase the prices of BCI devices as enhancement devices, making the digital divide in enhancement technology more of a concern. Another question for BCI as enhancement or training devices is whether the training or enhancement translates into other tasks. While the effect of BCI training has been documented in various tests, further research is needed to investigate whether this effect translates into other settings [6, 28]. Further research is therefore required into possible future enhancement and training settings for BCI, to investigate whether BCI devices should be defined as enhancement devices or as training devices.

4.3 Equity and digital divide
Concerns regarding the digital divide have in some ways been reduced by BCI technology moving from research to commercial settings. The concern about everyone having access to the technology is being addressed by commercial developers making the technology available to everyone, not only to researchers or specialised technicians. The transition from research to commercial availability does not solve all digital divide concerns, though. Three other digital divide concerns are having access to updated technology, having motivation to use the technology, and having the ability to use it [2, 7, 9]. While access to technology can be addressed by commercial competition leading to lower prices and by organisations such as libraries making the technology available, the other concerns are not as easily solved.

At the moment, the problem of motivation to use the technology is a focus for both researchers and commercial developers, who look at usability concerns and make sure there is potential usage for the average user [4, 5, 11]. By doing this, the intention for the BCI is to be usable and to give consumers motivation to buy and use it. The digital divide concern regarding motivation should therefore focus on motivating those who currently have no motivation for using BCI. This seems a natural concern for commercial developers, however, as they are always trying to get as many users as possible to adopt and use the technology they sell.

The concern about outdated hardware and software is, for BCI, similar and parallel to concerns about the digital divide in general. BCI devices could, however, be developed with this concern in mind, by making the devices more modular and thereby making it easier and cheaper to update selected pieces of the hardware. If a BCI were developed with exchangeable electrodes and components, users could easily update the electrodes when better ones were released; and if there were a standard for how these electrodes connect to the main interface, electrodes from different companies could be swapped in to provide cheaper alternatives while keeping the hardware updated.

The last concern discussed in this paper is the concern of having the ability to use the technology. This requires the aforementioned concerns to be considered and dealt with, as it is difficult to solve without users who are motivated to use the technology and who have access to updated technology. In research, this concern is in some cases boiled down to the question of whether a paralyzed patient is able to operate a BCI [17]. When the technology moves into commercial usage, the concern instead becomes how to develop a product that deals with this concern in general. This could be framed as a usability concern and dealt with in those terms: if a BCI is easy to use, more people will be able to operate it. This does not necessarily solve the digital divide concern, though, as there will still be a difference between users who have much exposure to the technology and those who are exposed to it only in limited ways. If it is possible to develop a BCI with the digital divide in mind, it could solve some of the divide concerns that currently exist with human-computer interfaces. If a BCI is developed with the intention of making interaction with computers more intuitive and easier than the standard keyboard and mouse, it could introduce members of society to technology and information in a way that the keyboard and mouse could not. By increasing the ways of interacting with technology, it would allow people to seek information and use technology to solve problems, thereby reducing the digital divide. If, however, the intention for the BCI is to be an addition to the traditional keyboard and mouse, the digital divide could increase further, as the complexity of human-computer interaction would increase. The intentions for the BCI with regard to the digital divide could thus either help reduce the gap or widen it.
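Returning to the modular-hardware idea raised earlier in this section: the notion of a shared electrode standard could be expressed as a minimal interface contract. This is purely hypothetical — no such industry standard exists, and every name below is invented for illustration — but it shows how an agreed contract would let electrodes from different vendors be swapped without replacing the headset.

```python
from typing import Protocol

class Electrode(Protocol):
    """Hypothetical standard electrode interface. If vendors agreed on a
    contract like this, individual electrodes could be upgraded or replaced
    independently of the rest of the hardware."""
    def read_microvolts(self) -> float: ...
    def position_label(self) -> str: ...  # e.g. a 10-20 system site like "Cz"

class DryElectrode:
    """One vendor's implementation of the assumed standard."""
    def __init__(self, site: str, source):
        self._site = site
        self._source = source  # callable returning the latest sample

    def read_microvolts(self) -> float:
        return self._source()

    def position_label(self) -> str:
        return self._site

# A headset accepts any object satisfying the Electrode protocol, so a
# cheaper or newer third-party electrode drops in without other changes.
headset = {e.position_label(): e for e in [DryElectrode("Cz", lambda: 4.2)]}
print(headset["Cz"].read_microvolts())  # -> 4.2
```

The design point is the same one the text makes: a standard connection between electrode and main interface decouples the upgrade cycle of the parts from the cost of the whole device.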

5. FUTURE RESEARCH
There is a lack of research on the intentions for specifically commercial BCI technology. The change in intentions for the BCI is therefore an interesting research topic worth looking into, as it might reveal differences and similarities between the intentions of research and commercial BCI technology. Future research should focus on answering what kinds of changes in ethical concerns stakeholders see as possible, and what accounts for the differences in attributes between a commercial BCI and a research BCI. There is also a need for research into methods of preventing or dealing with ethical concerns regarding commercial brain computer interfaces. Currently there is also a lack of knowledge about the intentions for the BCI and how these intentions change the consequences of BCI development for society and individuals. Once this research has been done, the next step would be to investigate how these concerns relate to each other, and to what degree different stakeholders find them important. Overall there is much further research to be done in this field: there is a lack of knowledge about what happens to ethical concerns when the setting changes, but also specifically about brain computer interfaces and their development in both research and commercial settings.

6. CONCLUSION
This paper explored the current discourses in neuroethics and responsible research and innovation. A gap in knowledge regarding technology moving from research to commercial settings was identified. Hypotheses were then explored about what happens when the intentions for brain computer interfaces change with the transition from research to commercial settings, covering privacy, enhancement and the digital divide. Hypotheses about what would happen if the BCI were designed with various degrees of privacy enhancement were explored. Enhancement concerns, such as the definition of BCI as enhancement or training technology, were discussed. The digital divide was explored, specifically concerns about access, skills and motivation. Finally, directions for further research were considered, such as research exploring the hypotheses discussed throughout the paper.

7. ACKNOWLEDGMENTS Thanks to Prof. Bernd Stahl and Dr. Catherine Flick for your support with this paper.

8. REFERENCES
[1] Arstila, V. and Scott, F. 2011. Brain Reading and Mental Privacy. Trames: Journal of the Humanities and Social Sciences. 15, 2 (2011), 204–212.
[2] Ball, J.W. 2011. Addressing and overcoming the digital divide in schools. The Health Education Monograph. 28, 3 (2011), 56–59.
[3] Biosensor Technology | NeuroSky: http://neurosky.com/. Accessed: 2014-11-28.
[4] Bonaci, T., Calo, R. and Chizeck, H.J. 2014. App stores for the brain: Privacy & security in Brain-Computer Interfaces. 2014 IEEE International Symposium on Ethics in Science, Technology and Engineering (May 2014), 1–7.
[5] Bos, D.P. 2014. Improving usability through post-processing.
[6] Corralejo, R., Álvarez, D. and Hornero, R. 2014. Assessment of Neurofeedback Training by means of Motor Imagery based BCI for Cognitive Rehabilitation. (2014), 3630–3633.
[7] Crossing the Digital Divide: Bridges and Barriers to Digital Inclusion: 2011. http://www.edutopia.org/digital-divide-technology-access-inclusion. Accessed: 2015-06-02.
[8] Davidson, D. 1963. Actions, Reasons, and Causes. The Journal of Philosophy. 60, 23 (1963), 685–700.
[9] DiMaggio, P., Hargittai, E., Celeste, C. and Shafer, S. 2004. Digital Inequality: From Unequal Access to Differentiated Use. Social Inequality. K. Neckerman, ed. Russell Sage Foundation. 355–400.
[10] Emotiv | EEG System | Electroencephalography: http://emotiv.com/. Accessed: 2014-11-28.
[11] Van Erp, J., Lotte, F. and Tangermann, M. 2012. Brain-Computer Interfaces: Beyond Medical Applications. Computer. 45, 4 (2012), 26–34.
[12] Friedman, B. and Kahn, Jr., P.H. 2003. Human Values, Ethics, and Design. The Human-Computer Interaction Handbook.
[13] Grimpe, B., Hartswood, M. and Jirotka, M. 2014. Towards a Closer Dialogue between Policy and Practice: Responsible Design in HCI. (2014), 2965–2974.
[14] Haselager, P., Vlek, R., Hill, J. and Nijboer, F. 2009. A note on ethical aspects of BCI. Neural Networks. 22, 9 (Nov. 2009), 1352–1357.
[15] Hempel, L., Ostermeier, L., Schaaf, T. and Vedder, D. 2013. Towards a social impact assessment of security technologies: A bottom-up approach. Science and Public Policy. 40, 6 (Dec. 2013), 740–754.
[16] Introduction to Modern Brain-Computer Interface Design: 2013. https://www.youtube.com/watch?v=Wlwvgm3AHvc&index=1&list=PLbbCsk7MUIGcO_lZMbyymWU2UezVHNaMq. Accessed: 2015-04-27.
[17] Kübler, A., Nijboer, F., Mellinger, J., Vaughan, T.M., Pawelzik, H., Schalk, G., McFarland, D.J., Birbaumer, N. and Wolpaw, J.R. 2005. Patients with ALS can use sensorimotor rhythms to operate a brain-computer interface. Neurology. 64 (2005), 1775–1777.
[18] Nijboer, F., Clausen, J., Allison, B.Z. and Haselager, P. 2013. The Asilomar survey: Stakeholders' opinions on ethical issues related to brain-computer interfacing. Neuroethics. 6 (2013), 541–578.
[19] Nissenbaum, H. 2001. How computer systems embody values. Computer. 34, 3 (2001).
[20] Owen, R., Macnaghten, P. and Stilgoe, J. 2012. Responsible research and innovation: From science in society to science for society, with society. Science and Public Policy. 39, 6 (Dec. 2012), 751–760.
[21] Rawls, J. 2012. Two Concepts of Rules. Interpretation: A Journal of Bible and Theology. 64, 1 (2012), 3–32.
[22] Roadmap 2020: 2015. http://bnci-horizon-2020.eu/roadmap. Accessed: 2015-01-08.
[23] Roskies, A. 2002. Neuroethics for the New Millennium. Neuron. 35 (2002), 21–23.
[24] Safire, W. 2005. Are Your Thoughts Your Own? "Neuroprivacy" and the Legal Implications of Brain Imaging. The Committee on Science and Law. (2005).
[25] Schomberg, R. von 2013. A Vision of Responsible Research and Innovation. Responsible Innovation: Managing the Responsible Emergence of Science and Innovation in Society. R. Owen, J. Bessant and M. Heintz, eds. Wiley. 51–74.
[26] Schreiber, D. 2012. On social attribution: implications of recent cognitive neuroscience research for race, law, and politics. Science and Engineering Ethics. 18, 3 (Sep. 2012), 557–566.
[27] Searle, J.R. 1980. Minds, brains, and programs. (1980), 417–457.
[28] Toppi, J., Risetti, M., Quitadamo, L.R., Petti, M., Bianchi, L., Salinari, S., Babiloni, F., Cincotti, F., Mattia, D. and Astolfi, L. 2014. Investigating the effects of a sensorimotor rhythm-based BCI training on the cortical activity elicited by mental imagery. Journal of Neural Engineering. 11 (2014), 035010.
[29] Wahlstrom, K. 2013. Privacy and Brain-Computer Interfaces: clarifying the risks. AiCE 2013 (Melbourne, 2013), 1–8.
[30] West, A., Lewis, J. and Currie, P. 2009. Students' Facebook "friends": public and private spheres. (2009), 37–41.
