Robots Ethical by Design

PHIBOT11, Workshop on Philosophy, Computer Science, and Robotics at the Carl von Linde-Akademie in Cooperation with the Research Cluster of Excellence CoTeSys (Cognition in Technical Systems) at TU München

Gordana Dodig-Crnkovic
Head of the Department of Computer Science and Networks
School of Innovation, Design and Engineering, Mälardalen University, Sweden
http://www.idt.mdh.se/personal/gdc
http://www.mrtc.mdh.se/~gdc/work/RobotsEticalByDesign-20110801.pdf

Previous work

- Margaryta Georgieva (student) and Gordana Dodig-Crnkovic, "Who Will Have Irresponsible, Untrustworthy, Immoral Intelligent Robot?", Proceedings of IACAP 2011, The Computational Turn: Past, Presents, Futures?, p. 129, Mv-Wissenschaft, Münster; Aarhus University, Denmark, Eds. Ess and Hagengruber, July 2011
- Baran Çürüklü, Gordana Dodig-Crnkovic, Batu Akan, "Towards Industrial Robots with Human Like Moral Responsibilities", 5th ACM/IEEE International Conference on Human-Robot Interaction, Osaka, Japan, March 2010
- Gordana Dodig-Crnkovic, "Information Ethics for Robotic Agents", European Computing and Philosophy Conference ECAP 2010, Technische Universität München, 4-6 October 2010

Previous work

- Gordana Dodig-Crnkovic, "Delegating Responsibilities to Intelligent Robots", ICRA 2009 IEEE International Conference on Robotics and Automation, Workshop on Roboethics, Kobe, Japan, May 17, 2009
- Gordana Dodig-Crnkovic and Daniel Persson (student), "Sharing Moral Responsibility with Robots: A Pragmatic Approach", Tenth Scandinavian Conference on Artificial Intelligence, SCAI 2008, Volume 173, Frontiers in Artificial Intelligence and Applications, Eds. A. Holst, P. Kreuger and P. Funk, 2008
- Gordana Dodig-Crnkovic and Daniel Persson (student), "Towards Trustworthy Intelligent Robots", NA-CAP@IU 2008, North American Computing and Philosophy Conference, Indiana University, Bloomington, July 10-12, 2008

Current presentation

Gordana Dodig-Crnkovic and Baran Çürüklü, "Robots - Ethical by Design", Ethics and Information Technology, Special Issue on Requirements Engineering, DOI: 10.1007/s10676-011-9278-2 (2011)

Basis for the argument

- Roboethics
- Information Ethics
- Machine Morality
- High Reliability Organization Approach
- Requirements Engineering

Views of morality

Classical view

“A person who is a morally responsible agent is not merely a person who is able to do moral right or wrong. Beyond this, she is accountable for her morally significant conduct. Hence, she is an apt target of moral praise or blame, as well as reward or punishment”. The Stanford Encyclopedia of Philosophy (McKenna 2009)

Classical view: Human + reward or punishment*
* sometimes even seen as revenge

Pragmatic view

“Regardless of whether artificial morality is genuine morality, artificial agents act in ways that have moral consequences. This is not simply to say that they may cause harm – even falling trees do that. Rather, it is to draw attention to the fact that the harms caused by artificial agents may be monitored and regulated by the agents themselves”. (Allen, Smit & Wallach 2005)

Pragmatic view: Regulatory mechanism - an intelligent, self-regulating agent

Views of responsibility

Classical view: Causality + (human) intention

Pragmatic view: Regulatory mechanism
Moral responsibility not as an individual duty, but as a role defined by externalist pragmatic norms of a group. (Dennett 1973) (Strawson 1974)

Socio-technological network Delegation & distribution

Responsible agents: “natural” & “artificial”

Safety barriers
- Physical
- Organizational

Defense in depth (figure: nested safety levels)
- Severe accident
- Design basis accident
- Abnormal operation
- Normal condition

Artifactual intelligence: behavior that in a human would require intelligence.

Artifactual morality: behavior that in a human would require morality.

RESPONSIBILITY ASCRIPTION
Artifacts are ascribed artifactual responsibility for a task in planning operations.

Proactive approach - requirements engineering in the design and production of robots, softbots and other autonomous intelligent systems

- preparedness for different scenarios of malfunction
- prediction
- prevention
- learning organization

Legal aspects
Analysis of the technosocial system is done on a case-by-case basis.
Ugo Pagallo (2011), "Robots of Just War: A Legal Perspective", Philosophy and Technology 24(3):307-323.

CONVERGENCE
(figure: artifactual agency and human agency converging)

Wide-range consequences

Wide-range, long-standing consequences of the deployment of intelligent systems in human societies must be discussed on a democratic basis, as intelligent systems have the potential to radically transform the future of humanity.

The Rules: moral responsibility of engineers

“The Rules, Moral Responsibility for Computing Artifacts”, an initiative led by Keith W. Miller (https://edocs.uis.edu/kmill2/www/TheRules), aims to “(…) reaffirm the importance of moral responsibility for (computing) artifacts, and to encourage individuals and institutions to carefully examine their own responsibilities as they produce and use them”. (Miller 2011)

Education in Ethics for Engineers

Education in professional ethics for engineers is a fundamental factor for building a socio-technological system of responsibility.

Engineering as Social Experimentation

“All products of technology present some potential dangers, and thus engineering is an inherently risky activity. In order to underscore this fact and help in exploring its ethical implications, we suggest that engineering should be viewed as an experimental process. It is not, of course, an experiment conducted solely in a laboratory under controlled conditions. Rather, it is an experiment on a social scale involving human subjects.” Martin & Schinzinger, Ethics in Engineering, McGraw-Hill, 1996


Ethics Contexts (figure: the engineer at the centre of nested contexts)
- Engineer
- Colleagues
- Managers
- Engineering firm
- Profession (Societies)
- Industry (Other firms), Clients, Consumers
- Family (Private Sphere)
- Global Environment: Society/Nature

A framework for ethical decision making

Responsible ethical decision making implies the ability to:
- recognize a moral issue
- evaluate the alternative actions from various moral perspectives
- make a decision
- act
- reflect on the results of the decision afterwards


High reliability organizations: responsibility distribution

Safety in modern High Reliability Organizations* is assured through responsibility distribution. In the same way that the concept of agency is generalized in the case of artificial agents, the concept of moral agency, including responsibility, is generalized too.

* E.g. aviation, healthcare, nuclear industry

High reliability organizations: responsibility distribution

We propose to look at artificial moral agents as having functional responsibilities within a network of distributed responsibilities in a socio-technological system. This does not take away the responsibilities of the other stakeholders in the system, but facilitates an understanding and regulation of such networks.


High reliability organizations: responsibility distribution

It should be pointed out that the process of development must assume an evolutionary form with a number of iterations, because the emergent properties of artifacts must be tested in real-world situations with agents of increasing intelligence and moral competence. We see this study as a contribution to macro-level Requirements Engineering through discussion and analysis of general requirements for the design of ethical robots.

