The 15th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN06), Hatfield, UK, September 6-8, 2006
Incorporating guidelines for health assistance into a socially intelligent robot

Rosemarijn Looije, Fokie Cnossen, Mark A. Neerincx
Abstract— The world population is getting older, and more and more people suffer from a chronic disease such as diabetes. The need for medical (self-)care therefore increases, and we think a personal (robot) assistant could help. This paper presents guidelines for supporting self-care and shows how they can be incorporated in an (embodied) personal assistant. These guidelines were derived from motivational interviewing, persuasive technology, and from existing guidelines for personal assistants. Questions this paper addresses include: Is it possible to incorporate them in a personal assistant? Can a robot have the same kind of dialogs as a text interface? A first experiment, conducted with young participants, showed that the guidelines were best expressed in a socially intelligent iCat, compared with a non-socially intelligent iCat, a socially and a non-socially intelligent Tiggie, and a text interface. Furthermore, it showed that people indeed preferred the iCat over the text interface, possibly because of the iCat's social intelligence.

I. INTRODUCTION

In the year 2000, one in every ten individuals in the world was 60 years or older and one in every fourteen was at least 65. It is expected that by 2050 these numbers will have increased to one in every five persons being 60 or older and nearly one in six being 65 or older [1]. As a result, the elderly, and especially the chronically ill, are expected to have to be more self-sufficient, i.e., they should be involved in their self-care at home. The World Health Organization (WHO) estimated that among the chronically ill, adherence to a long-term treatment is only about 50% [2]. Improving this adherence could therefore signify a large improvement in the health of the chronically ill, such as diabetics.
Manuscript received March 15, 2006. The SuperAssist project is partially funded by IOP-MMI Senter, a program of the Dutch Ministry of Economic Affairs.
R. Looije is with the Department of Artificial Intelligence, Rijksuniversiteit Groningen, Grote Kruisstraat 2/1, 9712 TS Groningen, the Netherlands (phone: +31 (0)346 356 364; e-mail: [email protected]).
F. Cnossen is with the Department of Artificial Intelligence, Rijksuniversiteit Groningen, Grote Kruisstraat 2/1, 9712 TS Groningen, the Netherlands (phone: +31 (0)50 363 6336; e-mail: [email protected]).
M. A. Neerincx is with the Man-Machine Interaction Group, Delft University of Technology, Mekelweg 4, 2628 CD Delft, the Netherlands, and with the TNO Human Factors Research Institute, P.O. Box 23, 3769 ZG Soesterberg, the Netherlands (phone: +31 (0)346 356 298; e-mail: [email protected]).
A. Diabetes Mellitus

Among the elderly, diabetics are a large group. People need glucose as an energy source. Normally the body maintains a stable blood glucose level through the hormones insulin and glucagon: insulin lowers the blood glucose level and glucagon raises it. In diabetes, however, the blood glucose level is not stable. In type 1 diabetes, the insulin-producing cells in the pancreas are destroyed, so insulin is functionally absent or significantly diminished. Type 1 diabetes is usually acquired at a young age, and the only treatment is to inject the patient regularly with insulin. Type 2 diabetes, on the other hand, typically occurs in individuals older than 40. The pancreas does not produce enough insulin and the body has become more resistant to insulin. Genetic factors, obesity, and lack of exercise play a role in acquiring type 2 diabetes. The treatment of type 2 diabetes is aimed at changing the lifestyle of patients: changing their diet, quitting smoking, taking medication, and exercising regularly. Without a change in lifestyle, diabetics can acquire a number of conditions, ranging from heart disease, vessel disease, and kidney damage to blindness and cognitive problems [3].

Persuasive technology could help with supporting diabetics, because it is aimed at changing the lifestyle of a person, which is often needed in diabetics. It is a tool not only for treatment but also for prevention [4]. Persuasive technology is based on the theory that to change behavior, people have to be motivated to change it. Two methods are generally used: just-in-time messages, where people receive a simple message at an appropriate time and place, using a non-irritating strategy [4]; and highlighting the benefits of a particular behavior [5]. Two studies [6], [7] showed a positive effect on the behavior of participants who were using an application with persuasive technology.

In the United States and recently in the Netherlands, there have been experiments with a text interface to change the lifestyle of chronically ill patients: the Health Buddy® from the Health Hero Network®. It makes use of a persuasive technique, motivational interviewing, in its dialogs, and it has already accomplished positive results in changing lifestyle and improving quality of life [8], [9].
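The just-in-time strategy described above can be made concrete with a small sketch: a benefit-oriented message is delivered only inside an appropriate daily time window and at most once per day (the non-irritating part). All rule names, time windows, and message texts below are our own illustrative assumptions, not part of the Health Buddy or any existing system.

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class JustInTimeRule:
    name: str
    window_start: time   # deliver only inside this daily window
    window_end: time
    message: str         # benefit-oriented, never reproachful

def due_messages(rules, now: datetime, already_sent: set) -> list:
    """Return messages whose time window contains `now` and that have
    not been sent yet today (non-irritating: at most once per day)."""
    due = []
    for rule in rules:
        if rule.name in already_sent:
            continue
        if rule.window_start <= now.time() <= rule.window_end:
            due.append(rule.message)
            already_sent.add(rule.name)
    return due

# Hypothetical rules for a diabetic user.
rules = [
    JustInTimeRule("glucose_morning", time(7, 30), time(9, 0),
                   "Measuring your glucose now makes today easier to plan."),
    JustInTimeRule("walk_afternoon", time(15, 0), time(17, 0),
                   "A short walk now helps keep your glucose level stable."),
]
print(due_messages(rules, datetime.now(), set()))
```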
Within the SuperAssist project [10], TNO, Delft University of Technology, and Leiden University Medical Center are developing models for the supervision of distributed personal assistants for tele- and self-care. A personal assistant could support diabetics in their daily routine of measuring their blood glucose, taking their medication, and eating appropriately. This paper addresses two questions: Can we find guidelines for personal assistants and health devices? And would incorporating these guidelines affect users' preference for an interface?

II. GUIDELINES

A. Personal Assistants

Many diabetics do not follow the advice they get from their physician, and many diabetics need reassurance when in doubt about their health. When a diabetic regularly forgets medication, finds it hard to follow the diet and take exercise, or does not self-check his or her feet as the physician advised, a personal assistant could be of help: it could help the diabetic remember the physician's advice and reassure the patient. For a personal assistant to be successful, several guidelines must be followed.

1) People must like to use the assistant; if not, they will not use it. Advice must therefore be given in a positive manner. When a diabetic has, for example, eaten too much sugar, he or she already knows that; an assistant pointing this out will not help. What could help is pointing out why it is healthier for the diabetic to eat less sugar. Also, the patient will like the assistant better if it gives emotional support, that is, shows sympathy and compassion; this has been shown to lead to less frustration and longer interaction times [11]. By looking at the user and showing empathy, a robot with the ability to express emotions may be liked better than a text interface.

2) The user must trust the assistant. For trust, the interaction between user and system must be acceptable to the human user and perhaps be adapted to the state of the user [12]. Trust can be built by good advice from the assistant and a good interface; the Health Buddy®, for example, has only four buttons and its use is self-explanatory. Wærn and Ramberg [13] showed that people tend to trust computers less than human beings. A robotic personal assistant may be trusted more than a text-interface-based one because its interface is more natural; on the other hand, a text interface may be trusted more because many people are used to getting information from a computer screen rather than receiving it as spoken text from an emotionally expressive robot.

3) A last guideline is that the user should follow the advice the assistant gives [2].
Advice following can be improved by using an embodied character: the facilitation effects of an embodied character are stronger than those of a virtual character [14]. It is therefore possible that people are more likely to follow advice given by an embodied personal assistant than by a virtual, non-embodied one.

The use of techniques from psychology could improve both the trust in the personal assistant and the amount of advice that is followed. In our research we used techniques derived from motivational interviewing [15], the same technique that is used in the Health Buddy®. We discuss motivational interviewing next.

B. Motivational interviewing

Motivational interviewing is a technique grounded in Rogers' [16] client-centered approach; the main difference between motivational interviewing and Rogers' theory is that motivational interviewing is directive instead of non-directive [17]. The technique is linked to the Transtheoretical Model (TTM) [18] and to cognitive consistency theory [19]. The key principle is that a patient's self-knowledge about the effects of his or her behavior, combined with self-efficacy, results in a positive behavior change. Our research focuses on the question whether a socially intelligent robot is able to change the behavior and lifestyle of a diabetic. The abilities to express empathy, communicate respect, and listen rather than tell (key guidelines for anything that applies motivational interviewing) are very hard to implement with a text interface. A socially intelligent robot (see below), on the contrary, can express emotions through facial gestures and speech and is therefore able to satisfy these requirements.
Fig. 1 The iCat

C. Social Robots

A robot that uses social rules to interact is called a social robot. A social robot exhibits humanlike social characteristics; some important guidelines for human-robot interaction are [20]:
1) Express and/or perceive emotions
2) Communicate with high-level dialogue
3) Learn/recognize models of other agents
4) Use natural cues (gaze, gestures, etc.)
5) Exhibit distinctive personality and character
6) If possible, learn/develop social competencies
A robot with the ability to express believable emotions probably makes the interaction more enjoyable for the user [14]. It thereby satisfies one of the guidelines for a personal assistant, namely that the user will want to use it [11]. For a pleasant interaction, the communication and emotion skills of the robot are very important [20], [21], [22]. It is important that the robot has good synthesized speech, because a voice that is hard to understand is irritating for the user. For elderly people (and most type 2 diabetics are elderly), a clear and well-articulated voice is even more important. If both a human and a robot use speech to interact, a dialog emerges. For it to be effective, a dialog must be fluent, so good turn-taking is necessary. Body and facial expressions are also important, and perhaps even the personality (e.g., outgoing or shy) of the robot.

Because the emotional state of a user might be reflected in posture and gestures, it is preferable that the robot recognizes some body and facial expressions. However, Nehaniv, Dautenhahn, Kubacki, Haegele, and Parlitz [23] showed that it is difficult to recognize different gestures: many gestures are ambiguous, and it is therefore necessary to disambiguate them as much as possible. At present there is no robot available that reliably recognizes gestures.

Another important feature for good interaction is gaze and face tracking. Two studies [22], [24] showed a strong improvement of interaction when the robot tracked the face and gaze of the participant. Bruce, Nourbakhsh, and Simmons [22] showed an improvement of interaction when the robot had a face to communicate with, compared with a robot without a face; people had difficulties directing their speech to a faceless robot. A robot with both a face and tracking abilities gave a roughly additive increase in performance. Sidner, Kidd, Lee, and Lesh [24] compared a robot penguin that used only lip-synchronization with a penguin that also used face tracking and gestures while communicating; participants rated the latter robot as more reliable.

D. Conclusion

We were able to identify guidelines for a socially intelligent robot that acts as a personal assistant giving health advice. Most of these guidelines can be incorporated in the iCat (fig. 1), which we used as the personal assistant. The iCat is a research platform for studying human-robot interaction with a socially intelligent robot. It looks like a yellow cat with a face and a body; it can follow a person and can express emotions by moving its lips, eyebrows, eyes, eyelids, head, and body. To make its movements believable, the iCat uses the principles of animation [25], [26]. The iCat has a speaker, microphones, a webcam, a proximity sensor, and touch sensors; with these, it can speak, hear, see, and feel. Our iCat uses the Dutch male voice from Loquendo [27].
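Face following of the kind discussed above can be sketched with standard computer-vision tools. The snippet below is a minimal illustration assuming OpenCV's stock Haar face detector; it is not the iCat's actual tracking software, and all function names are ours.

```python
import cv2

# Detect a face in a webcam image and derive a horizontal offset that a
# robot could use to turn its head and eyes toward the user.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def head_turn_offset(frame) -> float:
    """Return the horizontal position of the largest detected face,
    normalized to [-1, 1] (0 = centered), or 0.0 if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0.0
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    face_center = x + w / 2
    return 2 * face_center / frame.shape[1] - 1

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("turn head by", head_turn_offset(frame))
cap.release()
```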
Motivational interviewing can be incorporated in the iCat's dialog; whether the guidelines for a personal assistant can be incorporated in the iCat had to be shown experimentally. The present experiment tested how people like working with an iCat that incorporates as many guidelines as possible. Would they like it better than a text interface? In this first exploratory experiment we asked young people to empathize with a diabetic and have "diabetic interactions" with the iCat, to determine whether the acceptance of a socially intelligent iCat is higher than that of a text interface.

III. DESIGN OF PERSONAL ASSISTANTS

Three scenarios were written about diabetics with self-care problems: one diabetic had difficulties with following the diet and doing exercises, another with checking his or her feet for wounds, and the last with taking medication. The scenarios were given in this order to every participant, but the order of the experimental conditions was varied. In each scenario the physician had asked the patient to try a personal assistant for a week. It was explained to the patient that the assistant would ask questions on Monday, Wednesday, Friday, and Sunday.

The questions, and the reactions to the participants' responses, were based on motivational interviewing. Two kinds of questions were asked: health questions, and quiz questions related to the illness. Examples of health questions are: "How are you feeling today?" and "What is your blood glucose level?" The reaction of the personal assistant was attuned to the answer of the participant: if the participant was positive, the interface said it was happy for the participant. If the participant was working with a social interface, the facial expression was in line with its reaction. Examples of quiz questions are: "Is a blood glucose level of 8 healthy? A) yes B) no C) I do not know" and "People with diabetes have to eat a lot of sugars. A) yes B) no C) I do not know." If the answer was wrong, the interface did not say that the participant was wrong, but gave the explanation of the correct answer. If the participant gave the correct answer, the interface said it was correct and explained why. When the interface was socially intelligent, it was happy or sad depending on whether the answer was correct or not (see the sketch at the end of this section).

A. Text interface

The text interface was a chat program through which the experimenter could ask the questions and the participant could answer them with a keyboard (fig. 2).

B. iCat

The social robot used in this experiment is the iCat from Philips (fig. 1). In the socially intelligent condition, the robot followed the participant with its eyes and head, blinked, and expressed emotions. In the non-socially intelligent condition, it did not follow the participant with its eyes and head, did not blink, and did not express emotions.
In both the social and the non-social condition, the robot spoke with lip-synchronization and could recognize speech. The movements we used for the iCat's expressions came either from the animation library of the iCat software or were made in the animation editor. Examples of movements are: happy, compassionate, and understanding.

C. Tiggie

For the screen character we used an agent based on the MS Agent technology: Tiggie (fig. 2), an agent developed by Doellgroup. This agent was chosen because it had almost the same expressive abilities as the iCat and, like the iCat, was cat-like. The social and non-social conditions of Tiggie were made to resemble the social and non-social conditions of the iCat; the biggest difference was that Tiggie did not follow the participant in the social condition. Tiggie used the same voice as the iCat, namely the male Dutch voice from Loquendo. Tiggie's movements were standard movements already incorporated in the Tiggie software, chosen to resemble the movements of the iCat as much as possible.
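The attuned reaction logic described in this section can be summarized in a short sketch, as it could be scripted for the Wizard-of-Oz interfaces. The answer key, the explanation text, and the expression names are illustrative assumptions, not the actual experiment script.

```python
# One quiz question mapped to (correct option, explanation text).
QUIZ = {
    "Is a blood glucose level of 8 healthy? A) yes B) no C) I do not know":
        ("B", "Illustrative explanation of why the correct answer is B."),
}

def react_to_quiz(question: str, answer: str, social: bool):
    """Return (speech, facial expression or None). Following motivational
    interviewing, a wrong answer is never called wrong: the assistant
    only gives the explanation of the correct answer [15]."""
    correct, explanation = QUIZ[question]
    if answer.strip().upper().startswith(correct):
        speech = "That is correct. " + explanation
        expression = "happy" if social else None   # non-social: no expression
    else:
        speech = explanation                       # no reproach, only the explanation
        expression = "sad" if social else None
    return speech, expression

question = next(iter(QUIZ))
print(react_to_quiz(question, "A", social=True))
```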
Fig. 2 Tiggie and the text interface

IV. EVALUATION OF PERSONAL ASSISTANTS

The experiment was an explorative study done in a Wizard of Oz setting. In a Wizard of Oz experiment, the participants think they are interacting with an autonomous system, but the system is partly or completely operated by the experimenter. In the experiment, a week was simulated by four blocks of questions with a pause between blocks. A block consisted of eight questions: four about the participant's health and four quiz questions about diabetes. Three of the quiz questions were repeated in every block to see whether people learned faster in a certain condition. After this simulated week, questions were asked about the participants' impressions of the interface.
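Under our reading of this design (four quiz questions per block, three of them recurring, so one new quiz question per block), the simulated week can be sketched as follows; the placeholder question texts are ours.

```python
# Four health questions asked in every block.
HEALTH = ["health question 1", "health question 2",
          "health question 3", "health question 4"]
REPEATED_QUIZ = ["repeated quiz 1", "repeated quiz 2", "repeated quiz 3"]
NEW_QUIZ = ["new quiz Mon", "new quiz Wed", "new quiz Fri", "new quiz Sun"]

def build_week():
    """Return a list of (day, questions) blocks; each block has the four
    health questions plus three repeated and one new quiz question."""
    days = ["Monday", "Wednesday", "Friday", "Sunday"]
    return [(day, HEALTH + REPEATED_QUIZ + [new])
            for day, new in zip(days, NEW_QUIZ)]

for day, questions in build_week():
    print(day, ":", len(questions), "questions")
```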
Six participants (students who were doing an internship at our TNO institute, unrelated to the present study) volunteered to participate in the experiment: two female and four male, aged 22-29 (M = 24.17, SD = 2.56). The participants were randomly assigned to one of two groups: one group (N = 4) worked with the iCat and the other group (N = 2) worked with the onscreen agent Tiggie. The latter group was smaller due to technical problems with Tiggie. There were three agent conditions: text interface, social agent, and non-social agent. The text interface condition was the same for both groups; the social and non-social agent conditions were performed with either the iCat or Tiggie, depending on the participant's group.

A. Procedure

The experiment was conducted in a room that resembled a sitting room. Each participant was told that the goal of the experiment was to see which interface they would like if they had diabetes. It was emphasized that the interfaces were specifically designed to ask questions and react to the answers to those questions, and not to do anything more. Participants were told they would work with three different interfaces and that in each condition they would receive four blocks of questions.

The experimental session started with the questionnaires about personal data, personality, and attitude towards robots. After the questionnaires there was a short animation about diabetes and an introductory movie about diabetes. The participants then received the first scenario about a diabetic, followed by the four blocks of questions with a pause between blocks. After the fourth block, they filled out the questions about acceptance of and trust in the interface and about its empathic abilities. After the three scenarios were completed, a questionnaire measured the overall impression of the different interfaces.

B. Data collection and analysis

The following variables were measured: the attitude of the participant towards robots, the extent to which participants empathized with the subject of the scenario, the level of acceptance, the perception of empathic abilities, the level of trust participants had in the interface, the overall impression of all interfaces, and the amount of emotional expression in speech during the interaction with an interface.

- Attitude: To measure the attitude towards robots we used a questionnaire based on the one used by [28]. Our questionnaire consisted of five pictures of robots: the iCat, and robots no. 3, no. 28, no. 102, and no. 97 from [28]. These robots were chosen because they were evaluated in [28] as pure animal, pure machine, 80% human/20% machine, and 50% human/50% machine (fig. 3). We also measured the position of the iCat in the uncanny valley [29]. This was done by asking the participants to say whether the robots were human, machine, or animal, and then positioning the robots, according to the
reactions of the participants, on a line that ranged from machine to animal to human. To explain the uncanny valley, Mori [29] gives an example of people being repulsed by something that is almost perfect: a prosthetic hand can look indistinguishable from a real hand, but it does not feel like one. When shaking such a hand there is a difference between what you expect to feel and what you actually feel, which gives a feeling of discomfort.
- Acceptance: To measure the acceptance level, an adapted version of the Unified Theory of Acceptance and Use of Technology (UTAUT) questionnaire [30] was used.
- Empathic abilities: The empathic abilities were measured by a personality questionnaire, which the participants had to fill out for the interface, and some questions about their impression of the emotional expressions of the interface. The personality questionnaire was the same one they had filled out about themselves; it was based on the big-five questionnaire [31], which distinguishes five important personality traits: extroversion, openness to experience, emotional stability, agreeableness, and conscientiousness. The higher the overall score, the more social the interface is perceived to be. We used a shortened version of this questionnaire consisting of fifteen questions, divided into five groups of three; this shortened version of the big-five was validated at TNO [32]. (A scoring sketch follows this list.)
- Trust: Three questions were asked about level of trust, credibility, and expertise.
- Overall: For the overall impression, participants were asked to rate the interfaces: which they liked most, how much they liked each interface, and which interface they found the most reliable, believable, and professional.
- Emotional expressions: The experimenter recorded the emotional expressions in the speech of the participant during the experiment; an example of an emotional expression is a participant saying "I am happy" to the interface.
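The following sketch shows how the shortened big-five questionnaire could be scored: fifteen items in five groups of three, one group per trait. The item ordering and the 9-point response scale are our assumptions (the results section reports scores out of 9); the validated TNO version [32] may differ.

```python
TRAITS = ["extroversion", "openness to experience", "emotional stability",
          "agreeableness", "conscientiousness"]

def score_big_five(responses):
    """`responses`: 15 ratings on a 1-9 scale, ordered so that items
    3k..3k+2 belong to trait k. Returns (per-trait means, overall mean);
    a higher overall mean means the interface is perceived as more social."""
    assert len(responses) == 15, "expected 15 items (5 traits x 3 items)"
    per_trait = {trait: sum(responses[3 * k: 3 * k + 3]) / 3
                 for k, trait in enumerate(TRAITS)}
    overall = sum(responses) / 15
    return per_trait, overall

per_trait, overall = score_big_five([7, 8, 6, 7, 7, 7, 6, 7, 8, 7, 7, 7, 6, 8, 7])
print(per_trait, overall)
```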
Fig. 3 Pictures of the robots, besides the iCat, that were used in the questionnaire about the attitude towards robots. Figure taken from [28]
V. RESULTS & CONCLUSIONS FIRST EXPERIMENT

Tiggie did not prove to be a suitable tool for this experiment: because of software problems it was not possible to complete the experiments that used Tiggie, so there are only a few results concerning Tiggie. In a next experiment we will not use Tiggie but a virtual iCat instead. The two participants in the Tiggie condition did not notice any difference between the social and non-social Tiggie. They liked Tiggie better than the text interface because of its more natural way of interaction.

Participants did notice a difference between the social and non-social iCat. In the personality questionnaire, the social iCat received a mean score of 7 (out of 9), while the text interface and the non-social iCat both scored around 6, indicating that the social iCat was perceived as more social. The UTAUT score was 3.02 for the text interface and 2.83 for the social iCat, while the non-social iCat scored almost a point less. The empathy scores showed the same trend as the personality questionnaire: the social iCat scored 2.65, the text interface 2.17, and the non-social iCat 2.09. Trust, credibility, and expertise were scored immediately after completing each condition. The social iCat and the text interface scored more or less the same on the trust questions (4.08 and 4.25, respectively); the non-social iCat scored more than a point lower, namely 3.00.

With regard to emotional expressions, there were no differences in the emotional expressions in the speech of the participants across conditions. There were, however, differences in the facial expressions of the participants and in their attitude towards the interface: with the social iCat, for example, participants leaned towards the iCat and directed their conversation at it, while in the non-social condition participants were hanging back in their seats. In our future experiments we will record facial and body expressions of the participants, because they may indicate how enjoyable it is to work with a personal assistant.

When positioned on the animal-human scale, the iCat was positioned as an animal by all participants, except one who thought it looked human. All participants said the iCat had a complete face, in contrast with the robot animal, of which half of the participants said it had a complete face. On average, the other robots were positioned as 100% animal, 66% human/33% machine, 33% human/66% machine, and 100% machine. Only one of the participants thought that the robots could have feelings, and none of the participants classified a robot as being really aggressive or unfriendly. In the final questionnaire, all four participants in the iCat condition indicated that they would like a personal assistant if they had diabetes; three of them would like to use the social iCat at home and one the text interface.

When we compare our results for attitude towards robots with those of Woods et al. [28], we find that the children in Woods et al. attributed feelings to a robot more easily and, in contrast to our participants, found some robots
aggressive and unfriendly. These differences might be caused by the difference in age. There were, however, no differences in how participants positioned the robots on the animal-machine-human scale: our participants placed the robots at about the same positions as the children did. In summary, our experiment showed that the socially intelligent iCat had the highest preference. It also showed that it is possible to have the same dialogs with different interfaces.
VI. CONCLUSIONS

The aims of this exploratory study were threefold: (1) to develop and test guidelines for a socially intelligent robot that can act as a personal assistant; (2) to find out whether a robot can have the same kind of dialogs as a text interface; and (3) to investigate whether the likeability of a socially intelligent robot is higher than that of a text interface. We successfully derived guidelines from motivational interviewing, persuasive technology, and from existing guidelines for personal assistants, and we were able to incorporate these guidelines in a socially intelligent robot. It proved possible to have the same kind of dialog with the text interface and the iCat, so a fair comparison between the different interfaces was possible. The experiment we conducted showed that the socially intelligent iCat had the highest preference.

Obviously, the present explorative study was limited in that it involved only six participants who were not diabetics. We are presently conducting a larger study to substantiate our findings. It will, however, be necessary to repeat the experiment with the target patient group, preferably in a long-term study within a natural context, i.e., at home. We hope that socially intelligent agents may support diabetic patients in coping with their illness better.
REFERENCES

[1] United Nations, "World population ageing 1950-2050," 2002. Available: http://www.un.org/esa/population/publications/worldageing19502050
[2] World Health Organization, "Adherence to long-term therapies," 2003. Available: http://www.who.int/chronic_conditions/en/adherence_introduction.pdf
[3] Diabetes fonds, "Wat is diabetes?" Houten: Brink Omega, 2005.
[4] S. S. Intille, "A new research challenge: Persuasive technology to motivate healthy aging," IEEE Transactions on Information Technology in Biomedicine, vol. 8, pp. 235-237, 2004.
[5] S. S. Intille, "Designing a home of the future," IEEE Pervasive Computing, pp. 80-86, 2002.
[6] J. P. Nawyn, "A persuasive television remote control for the promotion of health and well-being," M.S. thesis, Program in Media Arts and Sciences, Massachusetts Institute of Technology, Boston, MA, 2005.
[7] P. Kaushik, "The design and evaluation of a mobile handheld intervention for providing context-sensitive medication reminders," M.S. thesis, Program in Media Arts and Sciences, Massachusetts Institute of Technology, Boston, MA, 2005.
[8] J. H. Bigalow et al., "Patient compliance with and attitudes towards Health Buddy," Santa Monica, CA: Rand, 2000.
[9] G. D. Van Dijken, A. Niesink and A. J. P. Schrijvers, "Telehealth in de Verenigde Staten," Utrecht: Julius Centrum UMC Utrecht, 2005.
[10] G. De Haan, O. Blanson Henkemans and A. Aluwalia, "Personal assistants for healthcare treatment at home," in Proceedings of EACE'05, pp. 217-223, 2005.
[11] J. Klein, Y. Moon and R. Picard, "This computer responds to user frustration: Theory, design and results," Interacting with Computers, vol. 14, pp. 119-140, 2002.
[12] M. A. Neerincx and J. W. Streefkerk, "Interacting in desktop and mobile context: Emotion, trust and task performance," in Proceedings of the First European Symposium on Ambient Intelligence (EUSAI), pp. 119-132, 2003.
[13] Y. Wærn and R. Ramberg, "People's perception of human and computer advice," Computers in Human Behavior, vol. 12, pp. 17-27, 1996.
[14] C. Bartneck, "Interacting with an embodied emotional character," in Proceedings of the DPPI 2003 Conference, pp. 55-60, 2003.
[15] W. R. Miller and S. Rollnick, "Motivational Interviewing," New York, NY: The Guilford Press, 1991.
[16] C. R. Rogers, "Client-centered therapy," Boston, MA: Houghton Mifflin, 1951.
[17] W. F. Velicer, J. O. Prochaska, J. L. Fava, G. J. Norman and C. A. Redding, "Smoking cessation and stress management: Applications of the Transtheoretical model of behaviour change," Homeostasis, vol. 38, pp. 216-233, 1998.
[18] J. O. Prochaska and C. C. DiClemente, "Transtheoretical therapy: Toward a more integrative model of change," Psychotherapy: Theory, Research and Practice, vol. 19, pp. 276-287, 1982.
[19] L. Festinger, "A theory of cognitive dissonance," Stanford, CA: Stanford University Press, 1957.
[20] T. Fong, I. Nourbakhsh and K. Dautenhahn, "A survey of socially interactive robots," Robotics and Autonomous Systems, vol. 42, pp. 143-166, 2003.
[21] B. R. Duffy, "Anthropomorphism and the social robot," Robotics and Autonomous Systems, vol. 42, pp. 177-190, 2003.
[22] A. Bruce, I. Nourbakhsh and R. Simmons, "The role of expressiveness and attention in human-robot interaction," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA'02), pp. 4138-4142, 2002.
[23] C. Nehaniv, K. Dautenhahn, J. Kubacki, M. Haegele and C. Parlitz, "A methodological approach in relating the classification of gesture to identification of human intent in the context of human-robot interaction," in Proceedings of IEEE RO-MAN 2005, pp. 371-377, 2005.
[24] C. L. Sidner, C. D. Kidd, C. Lee and N. Lesh, "Where to look: A study of human-robot engagement," in Proceedings of the ACM International Conference on Intelligent User Interfaces (IUI), pp. 78-84, 2004.
[25] A. J. N. Van Breemen, "Bringing robots to life: Applying principles of animation to robots," unpublished.
[26] A. J. N. Van Breemen, "Animation engine for believable interactive user-interface robots," in Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2873-2878, 2004.
[27] Loquendo, www.loquendo.com
[28] S. Woods, K. Dautenhahn and J. Schulz, "The design space of robots: Investigating children's views," in Proceedings of IEEE RO-MAN 2004, pp. 47-52, 2004.
[29] M. Mori, "The uncanny valley," Energy, vol. 7, pp. 33-35, 1970.
[30] V. Venkatesh, M. G. Morris, G. B. Davis and F. D. Davis, "User acceptance of information technology: Toward a unified view," MIS Quarterly, vol. 27, pp. 425-478, 2003.
[31] L. R. Goldberg, "The development of markers for the Big-Five factor structure," Psychological Assessment, vol. 4, pp. 26-42, 1992.
[32] A. J. van Vliet, "Effecten van teamsamenstelling op het moreel van uitgezonden militairen," TM-01-A040, Soesterberg: TNO Technische Menskunde.