Emotions in Robots

Muhammad Mansoor Azeem 1,*, Jamshed Iqbal 2, Pekka Toivanen 1, and Abdul Samad 3

1 Faculty of Science and Forestry, University of Eastern Finland (UEF), Kuopio, Finland
2 COMSATS Institute of Information Technology (CIIT), Islamabad, Pakistan
3 Center for Advanced Studies in Engineering (CASE), Islamabad, Pakistan
[email protected]

Abstract. This article gives a general overview of emotions in robotics. It sheds light on the composition of emotions in human beings and how this information can be used as a blueprint for the creation of emotional robots. It discusses the different factors involved in the creation and detection of emotions. The article briefly explains the social aspects of having emotional robots in society. A system architecture for building the emotional system of a robot is presented, along with a description of how its modules work.

Keywords: Emotions in robotics, Future robots, Robot learning, Social robots, Knowledgebase.

1 Introduction

With the advancements in technology, machines with emotions are no longer science fiction. As robots integrate into human society, it is important to have better Human-Robot Interaction (HRI). This can be achieved only when both sides understand each other. Making robots capable of understanding natural human language and responding in a natural way is one plausible route to natural HRI. For that, robots should be able to recognize human emotions and give responses that humans can understand and interpret. The aim of the present work is to identify techniques through which we can build self-learning intelligent robots. We will look at the development of emotions in human beings. Our goal is to develop a system architecture that implements the process of emotion generation in robots. We present a literature review of the different factors involved in the creation of an emotion and discuss how these factors can be used for the development of emotions in robots. We also briefly discuss the social and ethical aspects involved in the emergence of robots as autonomous decision-making machines.

* Corresponding author.

B.S. Chowdhry et al. (Eds.): IMTIC 2012, CCIS 281, pp. 144–153, 2012. © Springer-Verlag Berlin Heidelberg 2012


2 Implementation

2.1 What Are Emotions?

Emotions consist of feelings, thoughts and behavior. Emotions are always private and subjective. There is a large number of emotional states which humans can exhibit, and many emotions are only blends of different basic emotions. On the other hand, an emotion can be considered a response or a psychological arousal driven by an act of defense or attack. From the theories presented, it can be concluded that emotions are driving factors of human decision-making. They originate from external environmental factors and the internal memory state. In fact, emotions make human beings "autonomous" in their decision-making. Humans take decisions based on their feelings and emotions. When humans analyze a situation and make a decision, they assume that it is correct under the current circumstances. Therefore, we believe that the same logic can be used to make our robots autonomous in their decision-making.

2.2 Emotions in Robots

Robots have sensors corresponding to human senses. Humans sense their environment through the natural senses they have; robots can be equipped with a number of comparable sensors to sense theirs. Extensive research is going on into recognizing human emotions on the basis of different parameters such as voice, facial expression and body language. For recognizing emotions from voice, VRA technology seeks to interpret the complete emotional structure evident in individually assessed conversations (excitement, pain, fear, cognitive activity and more) [1]. The correlation among these various emotional stress characteristics makes it possible to determine whether a statement was, among other things, truthful, deceptive, uncertain or inaccurate. The speech of a person can provide a lot of information about his emotional state. This information is present in acoustic properties like fundamental frequency, pitch, loudness, speech rate and frequency range [2].

Emotions in a robot will help achieve more life-like Human-Computer Interaction (HCI) [3-5]. The emotions of a robot provide information about many aspects of its current state. They can tell about the intentions of the robot and its internal and external state, and they may reveal what the robot is "up to" [6,7]. Emotions provide feedback in the interaction with the robot about how the robot feels about its environment. They also show how the robot is affected by the people around it, what kinds of emotions it understands and how it is adapting to the ever-changing world around it [8-10].
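To make the use of such acoustic properties concrete, the sketch below extracts a fundamental-frequency contour, a loudness proxy and a crude speech-rate proxy from a recording. This is only a minimal illustration using the librosa audio library; the file name, parameter values and choice of features are our own assumptions, not part of any system described above.

```python
# Sketch: extracting some of the acoustic properties named above
# (fundamental frequency, loudness, speech rate) with librosa.
import numpy as np
import librosa

def acoustic_features(wav_path: str) -> dict:
    y, sr = librosa.load(wav_path, sr=16000)

    # Fundamental frequency (F0) contour; unvoiced frames come back as NaN.
    f0, _, _ = librosa.pyin(y, fmin=librosa.note_to_hz("C2"),
                            fmax=librosa.note_to_hz("C7"), sr=sr)
    f0 = f0[~np.isnan(f0)]

    # Short-time energy as a loudness proxy.
    rms = librosa.feature.rms(y=y)[0]

    # Onset rate as a crude speech-rate proxy.
    onsets = librosa.onset.onset_detect(y=y, sr=sr)
    duration = len(y) / sr

    return {
        "f0_mean": float(np.mean(f0)) if f0.size else 0.0,
        "f0_range": float(np.ptp(f0)) if f0.size else 0.0,
        "loudness_mean": float(np.mean(rms)),
        "speech_rate": len(onsets) / duration,  # onsets per second
    }
```

Feature statistics such as these would then be fed to a classifier trained to map feature vectors to emotional states.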

2.3 Emotion Expression

Expression of emotions is an important part of emotional conversation. For two-way communication it is important that the robot not only recognizes the human emotion but is also able to express emotion itself. The design and technology used to develop the capability of a robot to show emotions will greatly affect how humans perceive the robot. An important part of emotion expression is non-verbal communication, in which emotions are expressed through body gestures and facial expressions. Usually a category-based model is used to map emotions, where a number of emotions are mapped to a number of actions. Nevertheless, this approach is somewhat closed because it leaves no room for overlapping emotions. We are introducing an Emotion Expression System, which takes a different approach. As feedback is the main source of learning for a robot, it will also help refine the precision of the emotions expressed. The use of an emotion is one aspect, but its use in the right context is also important. We will not ignore the effect of an expression, which can be seen in the immediate reaction of the human. In the ideal case, the human will understand the robotic expression and continue the communication. In the worst case, however, he may misunderstand the expression demonstrated by the robot. There could be many reasons for this misunderstanding, two possibilities being that the robot did not learn the emotion well before using it, or that it used the right emotion in the wrong context.
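The contrast between the closed category-based model and one that allows overlapping emotions can be sketched as follows. The emotion labels, action names and weighting scheme are hypothetical; a real Emotion Expression System would drive face, voice and body actuators instead of printing.

```python
# Sketch contrasting a closed category-based mapping with a blended
# representation that allows overlapping emotions.
from dataclasses import dataclass

# Closed, category-based model: one action per discrete emotion.
CATEGORY_ACTIONS = {
    "joy": "smile",
    "anger": "frown",
    "surprise": "raise_eyebrows",
}

@dataclass
class BlendedExpression:
    """Weighted mix of basic emotions, e.g. 0.6 joy + 0.4 surprise."""
    weights: dict  # emotion name -> intensity in [0, 1]

    def dominant(self) -> str:
        return max(self.weights, key=self.weights.get)

    def to_actuator_commands(self) -> dict:
        # Each basic emotion contributes to the output channels in
        # proportion to its weight, so mixtures overlap smoothly.
        commands = {}
        for emotion, w in self.weights.items():
            commands[CATEGORY_ACTIONS.get(emotion, "neutral")] = w
        return commands

delighted = BlendedExpression({"joy": 0.6, "surprise": 0.4})
print(delighted.dominant())              # joy
print(delighted.to_actuator_commands())  # {'smile': 0.6, 'raise_eyebrows': 0.4}
```

The point of the blended representation is that a state such as delight need not be forced into a single category: it can be expressed as a weighted mix of joy and surprise.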

2.4 Emotion Recognition

When we talk about recognition of emotions, we have to consider all the factors that are involved in an emotion. For proper recognition we need to recognize facial expressions, body movements, speech and some internal factors like blood pressure and heartbeat. The results from these factors are then combined to recognize an emotion. For correct recognition it is also important to consider the environment and circumstances at the particular moment, as there could be a history behind an emotion. For a robot, it is important to know all the possible aspects of a human emotion. Work has been done on recognizing emotions on the basis of speech and on the basis of facial expressions. However, not much work has been done on combining these approaches to get a more accurate and robust recognition result. It is obvious that if the system has more information, the decision-making will be more precise. Information from speech and facial expressions, for example, is related.
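A minimal sketch of such a combination is late fusion: each modality's recognizer outputs a probability distribution over emotions, and a weighted sum gives the joint estimate. The emotion labels, modality weights and probabilities below are illustrative assumptions.

```python
# Sketch: late fusion of per-modality emotion estimates, combining
# speech, face and physiological cues into one more robust result.
import numpy as np

EMOTIONS = ["joy", "anger", "sadness", "fear", "neutral"]

def fuse(estimates: dict[str, np.ndarray],
         weights: dict[str, float]) -> tuple[str, float]:
    """estimates maps a modality name ('speech', 'face', 'physio') to a
    probability distribution over EMOTIONS; weights reflect how much we
    trust each modality in the current environment."""
    combined = np.zeros(len(EMOTIONS))
    for modality, probs in estimates.items():
        combined += weights.get(modality, 0.0) * np.asarray(probs)
    combined /= combined.sum()  # renormalize
    best = int(np.argmax(combined))
    return EMOTIONS[best], float(combined[best])

# Example: the face looks neutral but the voice strongly suggests anger.
label, confidence = fuse(
    {"speech": np.array([0.05, 0.70, 0.05, 0.05, 0.15]),
     "face":   np.array([0.10, 0.25, 0.05, 0.05, 0.55])},
    {"speech": 0.6, "face": 0.4},
)
print(label, round(confidence, 2))  # anger 0.52
```

A fused estimate like this also degrades gracefully: when one modality is unavailable or unreliable, its weight can simply be reduced.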

Speech. Speech contains a lot of information about the emotional state of human beings. When humans hear a certain voice, they can easily tell the emotion of the person, the stress level and the emotional state [11]. Humans change the tone of their voice to express different emotions and the extent of their state. Researchers have found a relationship between voice and emotions. Three types of information are contained in human speech: who the speaker is, what is said and how it is said [11,12]. Which of these matters depends on what kind of information the robot requires from the speech. The robot can use this information for emotion recognition or even for identifying the person. Researchers have worked on recognizing natural language so that robots can be made more efficient in communicating with humans [13-17].

There are no hard and fast rules developed so far about which features of voice should be measured to recognize emotions. Acoustic parameters vary in their function and the kind of information they carry. These parameters and their intensity have been researched [11].

For natural conversation, a robot should be able to recognize commands independently of the speaker. One approach for achieving this is a two-step process: first, audio signals are transformed into feature vectors, and then utterances are matched to a vocabulary. Many current systems use Hidden Markov Models to determine the nearest match from the vocabulary [11]. A comprehensive introduction to speech recognition is given in [16].

We want our robot to produce human-like emotions. For that purpose the robot should be able to speak natural human language. This is important because the absence of a response in natural human language would defeat the purpose. So, it should have as natural a voice as possible.

Facial Expressions. Facial expressions convey emotions, but some emotions can barely be shown using one's face alone. The face gives a personality to a living thing and conveys the identity of the person. Humans immediately recognize a person and show their expressions according to their relationship with that person. Before recognition, a robot will likewise identify the person. Therefore, it can act according to a previous relationship if one exists. There are three basic approaches to facial expression recognition [17]. In the first approach, facial muscle actions are identified in a sequence of images. In the second approach, facial features, for example the distance between nose and lips, are tracked. In the third approach, Principal Component Analysis (PCA) is used to reduce image-based representations of faces into principal components such as eigenfaces. A good overview of the techniques used to recognize facial expressions is presented in [5,18,19].

Gesture. When humans communicate, they often use their body parts to deliver their ideas effectively. For example, a person might say "the cup of coffee is there" while also pointing in the direction of the cup with his hand. This makes sure that the other person understands him well. Humans mostly use hand movements, their shoulders and movements of the head. Gestures clarify the speech further. There are vision-based techniques used to recognize gestures [11,16,20]. Furthermore, Kortenkamp et al. [21], Waldherr et al. [22] and Xu et al. [23] have developed systems that recognize gestures for HCI.

Social Aspects. With the increase in the population of emotional robots, they will become a major part of human society. They will therefore make up a completely new species residing on our planet Earth. Many questions arise about their presence in human society, because people are not used to interacting with machines. Some may think of one as a dumb machine even if it possesses emotions and has the ability to interact with humans. Emotional robots can be an integral part of society, and we can think of certain moral and social issues related to them. Edmund Furse at the University of Glamorgan gave a presentation, "Theology of Robots", where he proposed some ideas about the future life of emotional and intelligent robots in our society. According to him, robots will have legal rights, a system of life and death, and religious and moral issues. But the discussion of what emotional robots can and cannot have is still very open.


The human acceptance of emotional robots in society is an interesting discussion. The first thing that comes to mind, and is of great concern for humans, is the increase in unemployment. As intelligent robots are produced on a large scale, they will take over more and more jobs previously done by humans. Nevertheless, as can be seen around us today, robots are making the lives of humans easier. They are easier to train in certain skills and they can take a lot of workload off humans. This gives humans more time for relaxation. Calculators are very simple machines, and still they have not taken the jobs of mathematicians but have saved their valuable time [24].

Emotional robots can be great pets, especially for people who want to have pets but cannot, because of busy city life or allergies; they can have robotic pets instead. Robotic pets have been developed in the shape of dogs, fish, lion cubs, cats and many other animals. An example is Paro, a baby harp seal robot designed by Takanori Shibata of the Intelligent System Research Institute of Japan (AIST). It was developed to replicate animal-assisted therapy but without its negative effects. The robot can show emotions of happiness, surprise and anger [25-27].

As we know, robots are acquiring their own intelligence as technology advances, and a day might come when they will also be able to think. In that case robots will try to decide for themselves and may make choices unacceptable to humans. We have to make sure that robots follow strict rules, just as humans do for a peaceful society. They should have clearly defined limits so that they can never go out of human control and bring to reality the fantasies shown by movies such as "I, Robot". The movie is set in a future where humans have become completely dependent on robots; the robots, however, break the laws made for them and try to take over from humans.

Moral rights and duties will be an important characteristic of emotional robots. It depends on how they are programmed: either they can be programmed to exhibit only acceptable human behavior, or they can be set free to learn whatever they want [28]. Our goal should be to make them as useful for humans as possible while reducing their disadvantages. Presentation of a positive image of them in society will be key to their successful integration into human society.

Learning Emotions. An emotional system learns from experience, so before showing emotions it has to learn them. We could make a robot which simply copies emotions and then reproduces them exactly, but it would not recognize the state of the person, and thus the communication would be non-emotional. We not only wish robots to learn emotions but also want them to learn why and how each emotion is produced. There are various learning techniques that robots can use to learn emotions. Imitation is one of them. There is no single widely accepted definition of imitation; Thorpe [29] defines it as "copying of a novel or otherwise improbable act or utterance, or some act for which there is clearly no instinctive tendency".

Observation and continuous feedback from humans is the idea that we want to present. In this technique, the robot observes and recognizes an emotion. If it does not already know the emotion, it will try to repeat it. The whole context is also recorded, because the same emotion used in a different context can have a different meaning. The new emotion recorded by the robot will then be used by it, and humans can assist by giving feedback, resulting in the learning of the new emotion. This scheme requires great sensor and motor abilities to accomplish.
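A minimal sketch of this observe-imitate-feedback loop is given below, assuming a simplified representation in which an emotion and its context form a pair whose appropriateness is tracked as a confidence score; all names and the feedback scale are hypothetical.

```python
# Sketch: the observe-imitate-feedback loop described above.
from typing import Optional

class EmotionLearner:
    def __init__(self):
        # (emotion, context) -> confidence that the pairing is appropriate
        self.knowledge: dict[tuple[str, str], float] = {}

    def observe(self, emotion: str, context: str) -> None:
        # Record a newly observed emotion together with its context.
        self.knowledge.setdefault((emotion, context), 0.5)

    def imitate(self, context: str) -> Optional[str]:
        # Try the known emotion whose pairing with this context we trust most.
        candidates = {e: c for (e, ctx), c in self.knowledge.items()
                      if ctx == context}
        return max(candidates, key=candidates.get) if candidates else None

    def feedback(self, emotion: str, context: str, reward: float) -> None:
        # Human feedback in [-1, 1] nudges the confidence up or down.
        key = (emotion, context)
        if key in self.knowledge:
            c = self.knowledge[key] + 0.1 * reward
            self.knowledge[key] = min(1.0, max(0.0, c))

learner = EmotionLearner()
learner.observe("laughter", "friends_having_fun")
shown = learner.imitate("friends_having_fun")        # robot tries it out
if shown:
    learner.feedback(shown, "friends_having_fun", 1.0)  # humans approved
```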

3 Proposed Emotional System

Based on the research carried out for this work, we can safely assume that it is possible to create robots with natural human emotional capabilities. Figure 1 presents a proposed theoretical system architecture which could be used as a basis for creating the emotional system of an intelligent robot.

Fig. 1. Modules of proposed emotional system

Below is a description of the various entities of the emotional system presented in Figure 1.

Communication System is an important part, as it is responsible for communication with other devices. Its function is to send and retrieve information required by humans.

Learning System is responsible for learning. The learning of skills and emotions is carried out by this module.

Emotion Expression System is responsible for expressing emotions. It processes information and sends instructions to the other parts, i.e. face, voice and body structure, that participate in expressing an emotion.


Sensing is responsible for the collection of all the information needed for recognizing an emotion.

Knowledgebase acts like a "mind" for the robot. It is a comprehensive repository of information and is also able to communicate with other sources for more information. The driving feature of a robot that directs it to use the emotional system is "curiosity". Action plans related to curiosity are bootstrapped into the knowledgebase so that the system can decide to proceed further.
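A skeleton of how these modules could be wired together is sketched below. All class and method names are our own illustrative choices; the real subsystems would of course be far more elaborate.

```python
# Sketch: a skeleton of the architecture in Figure 1, showing how the
# modules could be wired together.
class Knowledgebase:
    def __init__(self):
        # Curiosity plans are bootstrapped so the system can act at all.
        self.store: dict[str, object] = {"curiosity_plans": ["explore_new_stimulus"]}

    def get(self, key): return self.store.get(key)
    def put(self, key, value): self.store[key] = value

class Sensing:
    def percepts(self) -> dict:
        return {"speech": None, "face": None, "gesture": None}  # stub sensors

class LearningSystem:
    def __init__(self, kb: Knowledgebase): self.kb = kb
    def learn(self, percepts: dict): self.kb.put("last_episode", percepts)

class EmotionExpressionSystem:
    def express(self, emotion: str):
        print(f"face/voice/body -> {emotion}")  # would drive actuators

class CommunicationSystem:
    def send(self, info): print("to external device:", info)

class EmotionalRobot:
    def __init__(self):
        self.kb = Knowledgebase()
        self.sensing = Sensing()
        self.learning = LearningSystem(self.kb)
        self.expression = EmotionExpressionSystem()
        self.comms = CommunicationSystem()

    def step(self):
        percepts = self.sensing.percepts()
        self.learning.learn(percepts)         # learn from what was sensed
        self.expression.express("curiosity")  # curiosity bootstraps action

EmotionalRobot().step()
```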

4 Detailed Working of Emotional System

Learning System. The first step here is to choose the learning technique. There are several possibilities, such as supervised learning, unsupervised learning and reinforcement learning. The system decides on the learning technique on the basis of the time and resources available. By resources we mean the guidance which the system can get from humans, the knowledgebase or other systems. We assume that there exists an online robotic community where robotic systems can share their knowledge and skills. When the robot wants to learn something, it first gathers all the information about it. For example, the robot has seen a person saying something and then, as a reaction, has seen people laughing. This is analogous to the situation where a person makes a joke and other people laugh at it. The robot records what the person said and the situation in which it was said, e.g. an office meeting or just a group of friends having fun. The robot adds this information to the knowledgebase as an action that can be used to make people laugh. The next step for the robot is to imitate the same thing. The robot can imitate it while working at home or sitting with its human friends. The feedback from its surroundings will help it improve its perception of the emotion.
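As a sketch of this step, the fragment below shows one hypothetical heuristic for choosing a learning technique from the available resources, and the kind of episode record the joke example would add to the knowledgebase; every field and threshold is an assumption for illustration.

```python
# Sketch: how the Learning System might pick a technique and record the
# joke-learning episode from the text.
def choose_technique(human_available: bool, labeled_examples: int,
                     time_budget_s: float) -> str:
    # Guidance from humans or the knowledgebase favors supervised learning;
    # plenty of time with only environment feedback favors reinforcement.
    if human_available and labeled_examples > 0:
        return "supervised"
    if time_budget_s > 60.0:
        return "reinforcement"
    return "unsupervised"

episode = {
    "observed_action": "person said X",   # what was said (placeholder)
    "reaction": "people laughed",
    "context": "friends_having_fun",      # vs. "office_meeting"
    "usable_for": "make_people_laugh",
    "technique": choose_technique(human_available=True,
                                  labeled_examples=3,
                                  time_budget_s=120.0),
}
knowledgebase = []            # stands in for the shared knowledgebase
knowledgebase.append(episode)
```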

Knowledgebase. The knowledgebase can be implemented as a multifunction distributed database providing a set of services. It is accessed by almost all subsystems of the emotional system at some point of processing. The functions that the knowledgebase provides in the emotional system include the storage of information, the fetching of information from distributed locations and the provision of information needed by the subsystems. The knowledgebase also optimizes data acquisition by moving frequently accessed data to closer locations and by "dumping" obsolete data. The knowledgebase is also responsible for recovering an earlier stable state in case of a processing failure or an inappropriate action.
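The services listed above could be sketched as follows: promotion of frequently fetched items to a local cache, dumping of data not used for a long time, and checkpoint/rollback for recovering a stable state. The thresholds and the flat key-value model are assumptions.

```python
# Sketch: the knowledgebase services described above — local caching,
# dumping of obsolete data, and rollback to an earlier stable state.
import copy
import time

class Knowledgebase:
    def __init__(self):
        self.remote: dict[str, object] = {}   # distributed locations (stub)
        self.local: dict[str, object] = {}    # "closer" cache
        self.hits: dict[str, int] = {}
        self.last_used: dict[str, float] = {}
        self.checkpoints: list[dict] = []

    def fetch(self, key):
        self.hits[key] = self.hits.get(key, 0) + 1
        self.last_used[key] = time.time()
        if key in self.local:
            return self.local[key]
        value = self.remote.get(key)
        if self.hits[key] >= 3:               # frequently accessed -> cache
            self.local[key] = value
        return value

    def dump_obsolete(self, max_age_s: float = 3600.0):
        now = time.time()
        for key in list(self.local):
            if now - self.last_used.get(key, 0) > max_age_s:
                del self.local[key]           # "dumping" obsolete data

    def checkpoint(self):
        self.checkpoints.append(copy.deepcopy(self.remote))

    def recover(self):
        # Restore the earlier stable state after a failure or bad action.
        if self.checkpoints:
            self.remote = self.checkpoints.pop()
```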

5 Conclusion

We have given a general overview of emotions in robots. Our aim was to outline the possibilities of learning through emotions in robots. Different factors have been described on the basis of which a robot can recognize emotions. The issues related to the integration of emotional robots into society have been discussed. The research done in this work suggests that it is quite possible to have human-like emotions in robots. We have proposed a system architecture for the emotional system of an intelligent robot. This architecture can be used for the development of an emotional system in the real world. Although advancements in information technology, mechanical engineering and electronics have made it possible to build humanoid robots like KISMET and Cog, they still lack natural language and a human-like face. Furthermore, their esthetics need much improvement. We hope that as researchers have more advanced technology at their disposal, these issues will be addressed.


References

1. Durrant, C., Winter, E., Yaxley, D.: Local Authority Omnibus Survey – Wave 18. Research report No. 590, Department of Work and Pensions (DWP), http://research.dwp.gov.uk/asd/asd5/rports2009-2010/rrep590-ch7-vra.pdf
2. Nass, C., Brave, S.: Wired for Speech: How Voice Activates and Enhances the Human-Computer Relationship. MIT Press, Cambridge (2005)
3. Cañamero, L., Fredslund, J.: I show you how I like you - can you read it in my face? IEEE Transactions on Systems, Man and Cybernetics 31(5), 454–459 (2001)
4. Ogata, T., Sugano, S.: Emotional communication robot: WAMOEBA-2R emotion model and evaluation experiments. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (2000)
5. Chibelushi, C.C., Bourel, F.: Facial Expression Recognition: A Brief Tutorial Overview. On-Line Compendium of Computer Vision (2003)
6. Bartneck, C., Okada, M.: Robotic user interfaces. In: Proceedings of the Human and Computer Conference (2001)
7. Breazeal, C.: A motivational system for regulating human–robot interaction. In: Proceedings of the National Conference on Artificial Intelligence, Madison, WI, pp. 54–61 (1998)
8. Michaud, F., Pirjanian, P., Audet, J., Létourneau, D.: Artificial emotion and social robotics. In: Proceedings of the International Symposium on Distributed Autonomous Robotic Systems (2000)
9. Velasquez, J.: A computational framework for emotion-based control. In: Proceedings of the Workshop on Grounding Emotions in Adaptive Systems, International Conference on SAB (1998)
10. Billard, A.: Robota: clever toy and educational tool. Robotics and Autonomous Systems 42, 259–269 (2003)
11. Breazeal, C.: Designing Sociable Robots. MIT Press, Cambridge (2002)
12. Adams, B., Breazeal, C., Brooks, R.A., Scassellati, B.: Humanoid robots: a new kind of tool. IEEE Intelligent Systems 15(4), 25–31 (2000)
13. Lauria, S., Bugmann, G., Kyriacou, T., Klein, E.: Mobile robot programming using natural language. Robotics and Autonomous Systems 38, 171–181 (2002)
14. Okuno, H., Nakadai, K., Hidai, K., Mizoguchi, H., Kitano, H.: Human–robot interaction through real-time auditory and visual multiple-talker tracking. In: Proceedings of the IEEE International Conference on Intelligent Robots and Systems (IROS), pp. 1402–1409 (2001)
15. Spiliotopoulos, D., Androutsopoulos, I., Spyropoulos, C.D.: Human–robot interaction based on spoken natural language dialogue. In: Proceedings of the European Workshop on Service and Humanoid Robots, pp. 25–27 (2001)
16. Rabiner, L., Juang, B.: Fundamentals of Speech Recognition. Prentice-Hall, Englewood Cliffs (1993)
17. Lisetti, C., Schiano, D.: Automatic facial expression interpretation: Where human–computer interaction, artificial intelligence, and cognitive science intersect. Pragmatics and Cognition 8(1) (2000)
18. Chellappa, R., Wilson, C.L., Sirohey, S.: Human and machine recognition of faces: A survey. Proceedings of the IEEE 83(5), 705–740 (1995)
19. Fromherz, T., Stucki, P., Bichsel, M.: A survey of face recognition. MML Technical Report No. 97.01, Department of Computer Science, University of Zurich (1997)


20. Wu, Y., Huang, T.S.: Vision-Based Gesture Recognition: A Review. In: Braffort, A., Gibet, S., Teil, D., Gherbi, R., Richardson, J. (eds.) GW 1999. LNCS (LNAI), vol. 1739, pp. 103–115. Springer, Heidelberg (2000)
21. Kortenkamp, D., Huber, E., Bonasso, P.: Recognizing and interpreting gestures on a mobile robot. In: Proceedings of the AAAI, Portland, OR, pp. 915–921 (1996)
22. Waldherr, S., Romero, R., Thrun, S.: A gesture-based interface for human–robot interaction. Autonomous Robots 9 (2000)
23. Xu, G., et al.: Toward robot guidance by hand gestures using monocular vision. In: Proceedings of the Hong Kong Symposium on Robot Control (1999)
24. Idaho National Laboratory: Humanoid Robotics, http://www.inl.gov/adaptiverobotics/humanoidrobotics/
25. Walton, M.: Meet Paro - The therapeutic robot seal, http://www.cnn.com/2003/TECH/ptech/11/20/comdex.bestof/index.html
26. Wikipedia: Paro (robot), http://en.wikipedia.org/wiki/Paro_(robot)
27. Ackerman, E.: Paro Robot Therapy Seal Just Wants Your Love, http://www.botjunkie.com/2008/04/10/paro-robot-therapy-seal-just-wants-your-love/
28. Duffy, B.R.: Fundamental Issues in Social Robotics. International Review of Information Ethics 6 (2006)
29. Heyes, C., Galef, B.: Social Learning in Animals: The Roots of Culture. Academic Press (1996)
