Dynamic Emotion Representation system

Emmanuel Tanguy
Department of Computer Science, University of Bath

Six-month report, 14 April 2003

1 Introduction

My motivation is to participate in the creation of Virtual Humans as described in Gratch et al (2002). Research into Virtual Humans brings together many different fields, such as computer graphics, Artificial Intelligence, speech recognition, speech synthesis, psychology and philosophy. My own interest is in emotions and personality. Much work has been carried out to give emotions and personality to Virtual Humans, or agents as I will call them in the remainder of this report. But no one has implemented a complete emotional system, from environment appraisal to action selection, including all types of emotions. I am not proposing to solve this whole problem; instead I propose to separate the huge task into smaller ones. I will concentrate on a dynamic representation of emotion, or a Dynamic Emotion Representation, as the title of this report suggests. Through the remainder of the report I will explain why it is possible to dissociate the representation of emotions from the generation of emotions and from action selection.

I will first give an overview of a theory of emotions in section 2, followed by a presentation of two theories of mood in section 3. A personality model is briefly introduced in section 4. I will then review implementations of different emotional and personality models in section 5. In section 6, I will present my arguments for the implementation of a Dynamic Emotion Representation system. I will present my first attempt at this implementation and its results in section 7. Finally, section 8 will describe possible future developments and section 9 will conclude.

2 Three layers for a theory of emotions

In this section, I give an overview of Aaron Sloman's theory of emotions based on his CogAff architecture schema (Sloman, 2001a, b). It is not the only theory of emotions, but it seems highly interesting for computer applications. The attraction of the CogAff architecture schema is due to its completeness, its support

by other theories and its modularity. Picard (1997) says that Sloman's theory has never been implemented, but I believe that many emotional computer applications use at least a part of this architecture; I will come back to this point in section 5. The CogAff architecture is based on a combination of two concepts. First, Sloman separates a system into three parts: the inputs, called perception; the central processing; and the outputs, called action. Each part is then divided into three layers that give the system different levels of abstraction. Sloman argues that these layers come from the evolutionary process and that different animals (and robots) may or may not have all three layers (Sloman, 2001a, b).

2.1 Reactive mechanisms

The first and oldest layer is where Reactive mechanisms are found. These mechanisms map external or internal states to behaviours. They are described as fast and dirty mechanisms because they enable fast reactions to certain patterns, but these reactions can be unjustified. For instance, these mechanisms could make a person jump at the recognition of a snake shape when in fact it is just an electric cable or a branch. The generated reactions, described as Proto-emotions, are not based on context analysis or plan making, as no cognitive process is involved. The cost of these mechanisms is the amount of space needed to memorise the mapping rules, which is probably why evolution produced the second layer. Proto-emotions are also called Primary Emotions by Picard (1997) and Damasio.

2.2 Deliberative mechanisms

The second layer is composed of Deliberative mechanisms, which enable reasoning about situations, plan making, and understanding of the consequences of actions. In other words, these mechanisms generate emotions using cognitive processes with respect to goals, standards and expectations. They are also responsible for creating hypotheses and abstract concepts. An example of a cognitive emotion could be hope. The emotions developed by this layer are also

called Secondary Emotions by Picard (1997) and Damasio, and Paradigm Emotions by Laura Sizer (2000).

2.3 Metamanagement mechanisms

The third layer is where the self-monitoring, or Metamanagement, mechanisms sit. These mechanisms enable the awareness and appraisal of various internal states. Sloman argues that interactions and competition for control between layers, in particular between the second and third layers, can result in the complex internal states typical of human emotions. During my research, I came across two interesting works on the theory of mood. The definitions of mood given by Laura Sizer (2000) and Robert Thayer (1996) are strikingly close to the definition of Metamanagement mechanisms. In the next two sections, I give an overview of their work to understand how this layer could be implemented in computer applications.

3 Mechanisms of mood as a third layer

In this section, I present two theories of mood, which I believe are complementary: Laura Sizer takes a computational view, whereas Robert Thayer takes a practical, human point of view.

3.1 Towards a computational theory of mood

Many people agree that moods and emotions differ on at least two points: emotions are object-oriented and have a short lifetime, whereas moods take everything or anything as their object and last longer than emotions. Despite these known differences, the same theories are used to explain both emotions and moods. Laura Sizer (2000) argues that it is not possible to use "conventional" cognitive theory to understand mood, and that it is better to see mood as a "cognitive functional architecture". A functional architecture is a list of rules that regulate how information is learned and recalled from memory, how information is categorised and how other resources are used. She argues that mood is "cognitively impenetrable", in the sense of Pylyshyn's cognitive penetrable/impenetrable distinction. In other words, mood is not influenced directly by cognitive processes; instead it represents an overview of the person's internal state. That also means that moods are not influenced by particular beliefs

or contexts, but by changes in the overall state of a person.

3.2 The origin of everyday moods

Robert Thayer (1996), a professor of psychology, describes mood using two arousal continua: energy to tiredness, and tension to calmness. Using these two dimensions he proposes four different states:
• Energy-calm, a state of high energy and low tension, is the optimum mood. It is the mood in which a person will say that he/she is in a good mood.
• Energy-tense, a state of high energy and high tension, is a mood that enables people to be active and to do what has to be done.
• Tiredness-calm, a state of low energy and low tension, is a relaxing state, such as before sleep, but a person in this state is also very sensitive to tension.
• Tiredness-tense, a state of low energy and high tension, is the worst state: the energy is not sufficient to do what has to be done. A person in this state will say that she/he is in a very bad mood.
Tense arousal appears when a danger or threat has been detected; the person is on the alert, nervous or anxious, ready to act if she/he can. Energy arousal is subjective energy, which includes mental energy as well as energy depending on physical resources. Thayer bases his description of energy on the fact that body and mind cannot be separated. This explains why mood, through energy, varies through the day following the same pattern every day, similar to the patterns of heart rate, respiration or blood sugar. Thayer describes a model of interactions between energy and tension: when tension increases, energy increases too, but only up to a point after which the energy decreases; the same pattern appears when the energy increases. Thayer argues that cognitive processes affect the mood through tension, which goes against Laura Sizer's opinion; I favour the view proposed by Sizer due to the specificity of her work on this point. Thayer also describes mood as a self-monitoring mechanism, which supports Sloman's theory.
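To make Thayer's description concrete, here is a minimal sketch (my own illustration, not Thayer's) that maps the two dimensions onto his four states. The [0, 1] ranges, the 0.5 thresholds and the shape of the interaction curve are all assumptions, since Thayer describes the dimensions qualitatively:

```python
# A minimal sketch of Thayer's two-dimensional mood space. The [0, 1]
# ranges, the 0.5 thresholds and the interaction curve are assumptions.

def classify_mood(energy: float, tension: float, threshold: float = 0.5) -> str:
    """Map (energy, tension) levels to one of Thayer's four states."""
    if energy >= threshold:
        return "energy-calm" if tension < threshold else "energy-tense"
    return "tiredness-calm" if tension < threshold else "tiredness-tense"

def energy_response(tension: float, peak: float = 0.5) -> float:
    """Assumed inverted-U interaction: rising tension first raises
    energy, then depletes it past a turning point."""
    return max(0.0, 1.0 - abs(tension - peak) / (1.0 - peak))

print(classify_mood(0.8, 0.2))   # -> energy-calm, the optimum mood
print(classify_mood(0.3, 0.9))   # -> tiredness-tense, the worst state
```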

4 Personality and personality models

I did some preliminary investigation of personality because of the close relationship between emotions and personality.

4.1 What is the personality of somebody?

I chose two definitions of the word personality from a dictionary (Merriam-Webster Online): "1 a: the quality or state of being a person"; "3: the complex of characteristics that distinguishes an individual or a nation or group; especially: the totality of an individual's behavioural and emotional characteristics". This definition brings into focus two important terms: behavioural and emotional characteristics.

Personality plays an important part in the creation of a believable virtual human or emotional agent. In fact, personality is expressed through the constancy of behaviours and emotional expressions. Personality is also what makes people unique: unique in the way they act and react, in a consistent manner, when faced with a specific situation. I realised one important characteristic of personality: it is not possible to see personality in a snapshot of a person. The personality of a person is apparent only through time, by observing that person for a while. Personality is everywhere within a person's actions; it transpires through the way he/she walks, talks, smiles, etc.

4.2 Personality traits

Psychologists who study personality are trying to find a set of traits, stable through time and across cultures, that could describe every possible personality type. At the moment there is no official consensus that defines this set of personality traits, and psychologists are still debating the number of traits needed to describe the range of personalities. Hans Eysenck described personality using two basic factors: Neuroticism and Extraversion. Later on, he also added a third factor: Psychoticism (Carver, 1992; Baldes; Eysenck, 1995). The most recognised and used model is the Big Five (or OCEAN) model, which describes personality using five factors: Extraversion, Agreeableness, Neuroticism, Conscientiousness and Openness to experience (McCrae and John, 1992). This model has been created by analysing English vocabulary related to personality and also through the use of questionnaires. Even if many people argue that the Big Five is not a theory, it is still a good basis for describing personality and a useful tool for working on personality.

Eysenck proposes a theory to explain the differences between extravert and introvert behaviours. He argues that extraverts have a habitually low level of cortical arousal, whereas introverts have a high level. He also states that people are comfortable at a certain level of cortical arousal. So, to achieve this level, extraverts will seek sensations and contact with others, whereas introverts will try to escape these types of situations because they may become over-aroused and uncomfortable in these contexts.
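As a minimal illustration of how the Big Five model above might be held in an agent (my own sketch; the model itself only names the five factors, so the numeric [0, 1] scale is an assumption):

```python
from dataclasses import dataclass

# A sketch of a Big Five (OCEAN) personality profile. Representing
# each factor as a number in [0, 1] is an assumption; the model itself
# only names the five factors.

@dataclass
class BigFivePersonality:
    openness: float            # Openness to experience
    conscientiousness: float
    extraversion: float
    agreeableness: float
    neuroticism: float

anxious_introvert = BigFivePersonality(openness=0.6, conscientiousness=0.7,
                                       extraversion=0.2, agreeableness=0.5,
                                       neuroticism=0.8)
```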

5 Emotional computer applications

In this section, I give an overview of different implementations of emotional computer applications and explain how they relate to the CogAff architecture schema. Sloman's theory is interesting because many people have worked on implementations of its different parts: some implemented only the second layer, while others created applications with the first two or the last two layers.

5.1 Emotion generation (layers 1 & 2)

The generation of cognitive emotions (layer 2 in CogAff) is the layer where most of the work has been carried out. The most commonly used model for generating Secondary Emotions is the OCC model, developed by Ortony et al (1988). The OCC model takes its name from its creators: Ortony, Clore and Collins. It was developed to enable AI researchers to reason about emotions, but it is now mainly used to synthesise emotions (Picard, 1997). Twenty-two emotions are classified using a hierarchical structure that appraises "valenced" reactions to the consequences of events, the actions of agents and aspects of objects. The first level of this hierarchy describes three classes of emotion: 1) pleasure or displeasure at the consequences of an event with respect to an agent's goals and plans, 2) approval or disapproval of the actions of agents, and 3) liking or disliking regarding an agent's attitude towards an object. The next layers in the hierarchy ask questions about the focus of the concern (self or other) and the desirability of the event consequences or actions. The structure ends with a selection of a few emotions, depending on the branch followed. More details can be found in Ortony et al (1988).
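The sketch below is a drastic simplification of this hierarchy (the function and label names are mine, and only a handful of the 22 emotions appear); it illustrates only how the first-level branches and the self/other focus select an emotion:

```python
# A drastically simplified sketch of the OCC appraisal hierarchy.
# Function and label names are mine; the real model distinguishes 22
# emotions through further branches (prospects, standards, etc.).

def occ_appraise(kind: str, valence: str, focus: str = "self") -> str:
    """Return an emotion label for a valenced reaction.

    kind    -- "event" (consequences), "action" (of an agent), "object"
    valence -- "positive" or "negative"
    focus   -- "self" or "other"
    """
    if kind == "event":        # pleased/displeased about consequences
        if focus == "self":
            return "joy" if valence == "positive" else "distress"
        return "happy-for" if valence == "positive" else "pity"
    if kind == "action":       # approving/disapproving of an action
        if focus == "self":
            return "pride" if valence == "positive" else "shame"
        return "admiration" if valence == "positive" else "reproach"
    if kind == "object":       # liking/disliking an object
        return "love" if valence == "positive" else "hate"
    raise ValueError(f"unknown appraisal kind: {kind}")

print(occ_appraise("event", "negative"))              # -> distress
print(occ_appraise("action", "positive", "other"))    # -> admiration
```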

Computer applications like those developed by André et al (1999), Badler et al (2002), Gratch (2000), Marcella et al (2000), Reilly et al (1992) and Reilly (1996) use the OCC model, but none of them implemented it completely. We can note an interesting initiative to simulate Primary Emotions made by Reilly (1996), but the implementation was the same as the one used for Secondary Emotions (Picard, 1997). André et al (1999) also simulate Primary Emotions, using simple reactive heuristics. Picard (1997) cites another model for synthesising emotion, Roseman's Cognitive Appraisal Model, but little work has been carried out using it.

5.2 Emotion representation (layers 1 & 2)

The current way to represent emotional states is with numerical values, also described as "buckets". The variation of emotional intensity is represented by a mathematical function that gives a rapid increase in intensity after the arrival of an emotional stimulus, followed by an exponential decay. Picard (1997) draws a parallel between the emotional intensity curve and the sound intensity curve of a struck bell: when a bell is repeatedly struck, the sound gets louder and louder, even if it is struck with the same intensity each time. This kind of emotional representation has been implemented in Em, the emotion module developed by Reilly (1996), and by Kshirsagar et al (2002), Gratch (2000, 2002) and Marcella and Gratch (2002). Note that Sloman (2001a) explains that the fact that self-monitoring systems detect emotions generated by other layers does not mean that these emotions exist as real states. In other words, the emotions detected by the Metamanagement mechanisms could be an interpretation of complex processes in which the emotional states do not really exist. So the use of states to represent emotions in computer applications could make the simulation of emotions misleading. I believe that no other solution has been developed at the moment and that more research needs to be carried out to analyse this problem.
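A minimal sketch of such a bucket representation (my own illustration of the rise-and-decay behaviour described above; the decay constant and stimulus strengths are arbitrary assumptions):

```python
import math

# Sketch of a "bucket" emotion intensity with a fast rise on each
# stimulus and an exponential decay between stimuli, reproducing the
# struck-bell behaviour. The decay constant and strengths are arbitrary.

class EmotionBucket:
    def __init__(self, decay_rate: float = 0.5):
        self.intensity = 0.0
        self.decay_rate = decay_rate       # per-second decay (assumed)

    def stimulate(self, strength: float) -> None:
        self.intensity += strength         # rapid cumulative increase

    def update(self, dt: float) -> None:
        self.intensity *= math.exp(-self.decay_rate * dt)

anger = EmotionBucket()
for _ in range(3):                         # three identical "strikes"
    anger.stimulate(0.4)
    anger.update(dt=0.5)
    print(round(anger.intensity, 2))       # climbs: 0.31, 0.55, 0.74
```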

5.3 Mood representation (layer 3)

Little research has been carried out on the third layer. Most of the time the mood can take two values, bad and good, as in the work of Reilly (1992 and 1996). In this work the mood is used to influence the behaviour of the character, because it is simpler to make the behaviours depend on the mood than on each emotion. Kshirsagar et al (2002) added a possible Neutral value for the mood and use the mood to influence the choice of displayed emotions. Gratch et al do not discuss mood, but they developed a system for coping with strong emotions. This system reduces the intensity of an emotion by carrying out an action or by creating another emotion, for example blaming somebody else for the consequences of an event. This mechanism seems to fit the definition of the third layer due to its role in emotion management. In these examples, the actions taken are influenced by the implementation of the third layer, but the cognitive processes are not. The emotion theory argues that the third layer should affect the resources of cognitive processes and the Perception mechanisms. We can also note that none of these implementations shows a variation of mood over time, as argued for by Thayer (1996); they represent the mood as just a snapshot of the emotional state of the character.

5.4 Personality

The uniqueness that personality gives to a character is a problem that has not received much attention in computer applications. Mostly, people increase the probability of certain behaviours or map behaviours to certain emotions to give a personality to their characters. Even when the system is based on a model such as the Big Five, there is no real structure for how the personality affects the behaviour (André et al, 1999; Reilly et al, 1992; Reilly, 1996). The only work I found where personality influences emotions is Kshirsagar et al (2002). In this work, the personality is used to decide whether the mood will change and consequently which emotion will be expressed. Badler et al (2002) used the personality as two filters: one that filters the perceived information, which indirectly influences the generation of emotions, and one that influences the emotional display. No one has implemented a system where the personality directly influences the dynamic representation of emotions. For instance, a neurotic personality would be more sensitive to anxiety, which could be represented by the level of tension in Thayer's model.

6 Discussion

6.1 Emotional agents

There are two types of emotional agents: one tries to pass information to the user through emotionally communicative channels, and the other simulates emotional human behaviour. The first type of agent expresses chosen emotions to facilitate the assimilation of the information or to obtain the desired user reactions. This approach is communication-driven (Gratch et al, 2002). The problem with these agents is the inconsistency of their emotional expressions and behaviours. This is because the agent does not assess its environment, or, if it does, does not keep track of events. It can therefore express emotions that do not fit the situation. For instance, after asking the user his/her date of birth 100 times, such an agent will ask yet again with a big smile on its face: "Please, could you tell me what your date of birth is?" The inconsistency can also derive from the absence of smooth changes in the emotional state of the agent, due to the lack of a dynamic emotional representation. For instance, the same agent will ask the user with a big smile: "Please, could you tell me what your date of birth is?" and then, when the user does not answer the question, the agent may look angry and say: "I asked you your date of birth, not how the weather is!!" But afterwards it will come back with its smiling face and ask: "Please, could you tell me what your date of birth is?" This means that the agent can switch from one emotion to an opposite emotion unbelievably quickly.

Agents of the second type are generally called intelligent or believable agents and are based on cognitive processes and action selection with respect to their goals, standards and attitudes. They also have a dynamic representation of emotions, and the affective state is used to orientate decision-making and action selection. The agent expresses its own cognitively generated emotions with no guiding intention other than its own goals, which may change over time. This type of agent follows a simulation-based approach (Gratch et al, 2002). I believe that both approaches try to solve the same implementation problems and that agents of the first type are a subset of the second type.

The creation of an emotionally intelligent and believable agent is very difficult due to the many different types of tasks and representations that need to be implemented (a rough sketch of these modules as code stubs is given after this subsection's opening discussion):
• Perception mechanisms that interpret the inputs of the system.
• Pattern detection mechanisms working from the perception outputs. These mechanisms can be referred to as the first layer of the CogAff architecture schema.
• Resource mechanisms making recalled information available, such as beliefs, plans and other memories. These resources are influenced by the third layer of the CogAff architecture schema, or typically by the mood.
• Cognitive processes, involved in the second layer of the CogAff architecture schema. These processes use the results of the perception mechanisms, the character's standards and attitudes, and resources such as the memory where goals, plans and beliefs are kept.
• Metamanagement mechanisms that give an overview of the emotional and resource state of the character.
• A dynamic representation of the state of Primary Emotions.
• A dynamic representation of the state of Secondary Emotions.
• Action selection mechanisms with respect to goals and plans, the Primary Emotional state, the Secondary Emotional state and the mood.
With the diagram in Figure 1, I try to make explicit the different parts of the complete emotional system and the relations between its sub-parts.

6.2 Representation of emotions

What I propose to develop is the part common to the communication-driven and simulation-driven approaches, namely dynamic representations of Primary and Secondary Emotions, and a dynamic mood representation. Why concentrate on these representations? At the moment, Primary Emotions are often not simulated. Aaron Sloman believes that the uncontrolled reactions generated by Primary Emotions are indispensable in a "society of intelligent agents" because they are truthful reactions. If these reactions did not exist, people would have difficulty placing confidence in other people on the basis of "controlled" emotions.
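The stubs below sketch the decomposition listed above in code (the class names follow the list and Figure 1, but every signature and type is an assumption made purely for illustration):

```python
# Stub interfaces for the sub-systems of a complete emotional agent.
# All signatures are assumptions for illustration only.

class Percept: ...
class Action: ...

class PerceptionMechanisms:
    def interpret(self, raw_input) -> Percept: ...

class ReactiveProcesses:                          # CogAff layer 1
    def detect_patterns(self, percept: Percept) -> list: ...

class ResourceMechanisms:                         # beliefs, plans, memories
    def recall(self, cue, mood) -> list: ...

class CognitiveProcesses:                         # CogAff layer 2
    def appraise(self, percept: Percept,
                 resources: ResourceMechanisms) -> list: ...

class MoodState:                                  # CogAff layer 3 overview
    def update(self, primary_state, secondary_state) -> None: ...

class ActionSelection:
    def select(self, goals, emotional_state, mood: MoodState) -> Action: ...

class ActionComposer:
    def compose(self, primary: Action, secondary: Action) -> Action: ...
```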

One of the problems with the implementation of Secondary Emotions is that the cognitive mechanisms and the dynamic representation of emotions are all mixed together within one system. The generation of Secondary Emotions and their dynamic changes could be separated, thus giving the possibility of testing different cognitive models; indeed, the simulation of cognitive processes is a topic of research in itself. Nowadays, mood is simulated either with a cognitive theory, which is not right, or with a very simple model. Mood needs to be simulated without cognitive processes because it is cognitively impenetrable. It also needs to be implemented in close relation to the Primary and Secondary Emotions, due to its role of giving an overview of the emotional state. In a complete emotional system, mood plays an important role by influencing the cognitive processes and the action selection. Many experiments show that we recall information from memory more easily if we are in the same mood as when we memorised this information (Teasdale, 1993). By creating the right

interfaces to the Dynamic Emotion Representation system, the system could be used by different implementations of cognitive emotion generators or action selection mechanisms. Research needs to be carried out to design these interfaces, but XML documents seem a good starting point due to their platform and language independence. I also propose to study the effects of personality on the dynamics of emotions using certain factors of the Big Five. Using Eysenck's theory explaining the differences between extravert and introvert behaviours, I believe it is possible to generate an emotion like boredom that appears in different situations depending on the character's personality type. As I explained in a previous section, the level of energy in Thayer's mood theory varies by following a pattern during the day. I would like to use this interesting characteristic to create a dynamic emotion representation that varies with the time of day. This could be used to create emotional characters that react differently depending on the time of day.
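As a sketch of what such an XML interface might exchange (the element and attribute names are invented for illustration, not a proposed standard):

```python
import xml.etree.ElementTree as ET

# Sketch of an XML message carrying an emotional stimulus between an
# emotion generator and the DER system. All element and attribute
# names are assumptions for illustration.

stimulus = ET.Element("emotional-stimulus",
                      emotion="anger", intensity="0.4", valence="negative")
ET.SubElement(stimulus, "source").text = "user-utterance"

message = ET.tostring(stimulus, encoding="unicode")
print(message)
# -> <emotional-stimulus emotion="anger" intensity="0.4"
#    valence="negative"><source>user-utterance</source></emotional-stimulus>

parsed = ET.fromstring(message)   # any module, in any language, can parse it
print(parsed.get("emotion"), float(parsed.get("intensity")))
```

Because the message is plain text, a cognitive emotion generator written in one language could drive a DER system written in another, which is the platform independence argued for above.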

Figure 1: Diagram of a complete emotional system. [The diagram is not reproduced here. It shows the Emotional Agent system in its Environment, with the following modules: Perception mechanisms; Resources (beliefs, goals, plans, desires, energy, etc.); Reactive Processes feeding a Primary Emotional State and Primary Action Selection; Cognitive Processes feeding a Secondary Emotional State and Secondary Action Selection; a Mood State; and an Action Composer. The Mood State and the Primary and Secondary Emotional States form the Dynamic Emotion Representation system. Key: arrows represent the flow of information between modules or the influence of one module on another; boxes represent modules processing information or representing a state of the system.]

7 Dynamic Emotion Representation system

7.1 Description of the system

As a first implementation of the Dynamic Emotion Representation system, I developed only layers 2 and 3 of the CogAff architecture schema. The representation of Secondary Emotions is composed of six emotional levels: anger, happiness, sadness, surprise, disgust and fear. These levels represent the intensity of each emotion, which increases cumulatively when an emotional stimulus arrives and decays slowly otherwise. In the present implementation the emotional stimuli are filtered by the mood module, which contains a level of energy and a level of tension. The level of energy is fixed, but the tension level changes depending on the type of emotional stimulus presented and on the level of energy. From the point of view of the mood module, emotional stimuli are classified as either positive or negative. A negative stimulus will increase the tension by an amount depending on the level of energy, and a positive stimulus will decrease the tension level. The tension level is supposed to decay with time, but this has not yet been implemented.
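The following sketch illustrates the mood module just described (the exact update rules and constants of the implementation are not reproduced here; those in the sketch are plausible assumptions):

```python
# Sketch of the mood module filtering emotional stimuli. The energy
# level is fixed; the tension level moves with the valence of each
# stimulus. The scaling rules below are assumptions.

class MoodFilter:
    def __init__(self, energy: float):
        self.energy = energy              # fixed in this first implementation
        self.tension = 0.0

    def filter(self, valence: str, strength: float) -> float:
        """Update the tension and return the (assumed) filtered strength."""
        if valence == "negative":
            # a negative stimulus raises tension, more so when energy is low
            self.tension += strength * (1.0 - self.energy)
        else:
            # a positive stimulus lowers tension
            self.tension -= strength
        self.tension = min(max(self.tension, 0.0), 1.0)
        # assumption: a tense mood amplifies stimuli passed to the
        # emotion levels
        return strength * (1.0 + self.tension)

mood = MoodFilter(energy=0.3)             # low energy: disposed to anxiety
print(mood.filter("negative", 0.4))       # tension rises, stimulus amplified
```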

To simulate personalities I introduce two parameters that influence the changes of mood. The first parameter sets the energy level and represents the disposition to anxiety. The second parameter influences the variation of tension and represents the emotional stability. To visualise the dynamics of the system, the mood state is shown through a facial expression that varies with the intensity of tension, and the emotional state is shown through a facial expression that is a composition of the two highest emotional intensities. I added a third personality parameter that controls the expressiveness of the character by modifying how the facial expression representing the emotional state is created. Finally, the two facial expressions, representing the mood and the emotional states, are composed to create the final facial expression. Figure 2 gives an overview of this first DER system implementation.
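The composition of expressions might be sketched as follows, assuming each facial expression is a vector of animation parameters such as morph-target weights (the linear blending rules are my own assumptions, not necessarily those of the implementation):

```python
# Sketch of composing the final facial expression from the mood and
# emotion states, assuming every expression is a vector of facial
# animation parameters. The linear blending rules are assumptions.

def emotion_expression(levels, targets, expressiveness):
    """Blend the facial targets of the two strongest emotions."""
    top_two = sorted(levels, key=levels.get, reverse=True)[:2]
    n = len(targets[top_two[0]])
    face = [0.0] * n
    for name in top_two:
        for i in range(n):
            face[i] += expressiveness * levels[name] * targets[name][i]
    return face

def final_expression(mood_face, emotion_face, mood_weight=0.5):
    """Combine the mood (tension) and emotion expressions."""
    return [mood_weight * m + (1.0 - mood_weight) * e
            for m, e in zip(mood_face, emotion_face)]

levels = {"anger": 0.7, "fear": 0.4, "joy": 0.1,
          "sadness": 0.0, "surprise": 0.2, "disgust": 0.05}
targets = {"anger": [1.0, 0.0], "fear": [0.0, 1.0]}   # toy 2-parameter faces
print(emotion_expression(levels, targets, expressiveness=0.8))  # [0.56, 0.32]
```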

Figure 2: Diagram of the first DER system implementation. [The diagram is not reproduced here. It shows Emotional Stimuli passing through the mood emotional-stimuli filter into the Mood state, which holds the Energy and Tension levels and is parameterised by the Disposition to anxiety and the Emotional Stability, and into the Emotional state, which holds the Emotion Levels for Anger, Surprise, Joy, Fear, Sadness and Disgust. The Tension Level drives a facial expression representing the tension (Facial Mood); the emotion levels, modulated by the Expressiveness parameter, are composed into a facial expression representing the emotion (Facial Emotion); the tension and emotional expressions are then combined into the final expression. Key: arrows represent the flow of information between modules or the influence of one module on another; boxes represent modules processing information or representing a state of the system; variable states are marked with a separate symbol.]

7.2 Results

The first two results show the differences in facial expression due to two different personality settings but with the same level of expressiveness.

Differences in mood state with the same expressiveness: Top left: neutral face. Bottom left: low disposition to anxiety. Bottom right: high disposition to anxiety (after the same amount of anger stimuli). On these pictures, we can see that after the arousal of the agents with the same amount of anger stimuli, one (bottom right) is tenser than the other one (bottom left).

Differences in emotional state with the same expressiveness: Top left: neutral face. Bottom left: low disposition to anxiety. Bottom right: high disposition to anxiety (after the same amount of anger stimuli). On these pictures, we can see that after the arousal of the agents with the same amount of anger stimuli, one (bottom right) is angrier than the other one (bottom left).

The third result shows the differences in facial expression due to two different levels of expressiveness (after the same amount of happiness stimuli). On these pictures, we can see that after the arousal of the agents with the same amount of happiness stimuli, one (bottom left) shows a more intense happy expression than the other one (bottom right).

Finally, the fourth result shows an interesting combination of the facial expressions due to the mood and the emotional states: Top left: pure mood state facial expression. Bottom left: pure emotional state facial expression. Bottom right: composition of the emotional and mood state facial expressions.

I like this example because it really shows that mood expression is an important component of emotional expression. If you had to make a deal with one of these two characters, the one at the bottom left or the one at the bottom right, which one would have your confidence? The differences are small, but they play a crucial role in the expressiveness of a person. An animator could give a certain expression to her/his character by first describing its overall emotional state (mood), and then by applying a shorter but highly expressive emotion. By decomposing the creation of emotions into these two stages, we facilitate the animator's work in creating complex emotional expressions. It is similar to the work of actors, who first assimilate the state of mind and the personality of the character before playing a scene.

One important problem with these results is the small perceptual difference between the pictures. This is partly due to a loss of quality in the pictures, but it could be improved by adding eyebrows to the character to emphasise the facial expressions. Another problem is the non-neutrality of the face when it is supposed to be neutral: the eyes are wide open and the mouth expresses a slight smile.

8 Future developments

A lot more research and development needs to be carried out before this system can be used. To develop a valuable Dynamic Emotion Representation (DER) system, I would like to investigate more deeply the influences of emotions on each other and the relationship between mood and emotions. I also need a better understanding of mood mechanisms and of what participates in the generation of different moods. The first layer of Sloman's architecture schema was not implemented in this first version of the DER system. However, the representation of Primary Emotions is important due to the role these emotions play in emotional communication, and the next version of the DER system should probably include this type of emotion representation. To make the DER system more flexible, the number of Secondary Emotions should not be fixed within the implementation. In fact, psychologists have not yet settled the number of Secondary Emotions, so it seems sensible to let the user decide how many and which emotions should exist in the model. The lifetimes of emotions are not all the same, which means that the functions controlling the rise and decay of emotions must also be customisable. One of the interesting points in Thayer's theory is the possibility of making the mood change over time using the variation of energy through the day. This variation could also be used to create a state of boredom in the character, for instance when the energy level is high and the level of emotional arousal is low.

The difference between these two levels could also be used as motivation for certain behaviours.
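A sketch of this idea (the linear form and the threshold are my own assumptions):

```python
# Sketch: boredom as the unused gap between available energy and the
# current emotional arousal. The linear form and the threshold are
# assumptions.

def boredom(energy: float, arousal: float) -> float:
    return max(0.0, energy - arousal)

def seek_stimulation(energy: float, arousal: float,
                     threshold: float = 0.5) -> bool:
    """Motivate stimulation-seeking behaviour when boredom is high."""
    return boredom(energy, arousal) > threshold

print(seek_stimulation(energy=0.9, arousal=0.2))   # True: bored character
```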

As personality traits are closely related to emotions and to their changes, it could also be interesting to look more closely into the relation between emotions and personality.

Finally, one of the most important developments will be the design of user interfaces for the DER system. These interfaces must be developed both for an animation application using the communication-driven approach and for an AI application using the simulation-driven approach. The design of these interfaces will determine the usability of the system.

9 Conclusion

In conclusion, the discussion showed how the implementation of a Dynamic Emotion Representation (DER) system could be a good tool for both approaches to emotional agents: communication-driven and simulation-driven. To be so, it must be strongly based on theories of emotions and mood. I chose Sloman's architecture schema as the basis of the emotion theory for the DER system, but I have also been inspired by Sizer's and Thayer's models of mood, which are easier to implement. The first implementation of the DER system is composed of only two of the three layers described in Sloman's model. This implementation already gave interesting results, suggesting that different emotion layers are needed to express more detailed and subtle facial expressions. A DER system could help an animator to define complex emotional expressions by decomposing these expressions into a state of mind (mood) and a pure emotion. The creation of a DER system could also be useful to AI systems, as cognitive processes are influenced by emotions and mood. This influence may only be indirect, but many experiments show that mood modifies how information is recalled from memory. It will be interesting to carry out more research on models of emotions and on the influence of personality traits on the dynamics of emotions.

References

André E., Klesen M., Gebhard P., Allen S. and Rist T., 1999, Integrating Models of Personality and Emotions into Lifelike Characters, Proceedings of the workshop on Affect in Interactions: Towards a New Generation of Interfaces, in conjunction with the 3rd i3 Annual Conference, Siena, Italy, pp 136-149.

Badler N., Allbeck J., Zhao L. and Byun M., 2002, Center for Human Modeling and Simulation, Computer and Information Science, University of Pennsylvania.

Baldes S., Literature survey about personality and emotions, DFKI GmbH.

Ball G. and Breese J., 1998, Emotion and Personality, Computer Animation, workshop Embodied …, 12-15, pp 83-87.

Brooks R. A., 1991, Intelligence without representation, Artificial Intelligence 47, 139-159.

Carver C. S., 1992, Perspectives on Personality, Boston; London: Allyn and Bacon.

Eysenck H. J., 1995, Trait theory of personality, in "Individual Differences and Personality", edited by Sarah E. Hampson and Andrew M. Colman, Longman Essential Psychology.

Eysenck M. and Keane M., 2002, Cognitive Psychology: A Student's Handbook, fourth edition, Psychology Press Ltd.

Gratch J., 2000, Émile: Marshalling Passions in Training and Education, 4th International Conference on Autonomous Agents.

Gratch J., Rickel J., André E., Badler N., Cassell J. and Petajan E., 2002, Creating Interactive Virtual Humans: Some Assembly Required, IEEE Intelligent Systems.

Klesen M., 2002, Report on affective reasoning and cultural diversity, DFKI, NECA, IST-2000-28580.

Kshirsagar S. and Magnenat-Thalmann N., 2002, A Multilayer Personality Model, Proceedings of the 2nd International Symposium on Smart Graphics, June 2002, pp 107-115.

Marcella S. and Gratch J., 2002, A step toward irrationality: using emotion to change belief, First International Joint Conference on Autonomous Agents and Multiagent Systems, Bologna, Italy.

McCrae R. R. and John O. P., 1992, An Introduction to the Five-Factor Model and Its Applications, Journal of Personality 60, 175-215.

Merriam-Webster Online dictionary, http://www.m-w.com/.

Ortony A., Clore G. L. and Collins A., 1988, The Cognitive Structure of Emotions, Cambridge University Press, Cambridge.

Picard R. W., 1997, Affective Computing, The MIT Press, Cambridge, Massachusetts; London, England.

Reilly S., 1996, Believable Social and Emotional Agents, Ph.D. thesis, School of Computer Science, Carnegie Mellon University, Pittsburgh.

Reilly S. and Bates J., 1992, Building Emotional Agents.

Sizer L., 2000, Towards a computational theory of mood, British Journal for the Philosophy of Science, 51, 743-769.

Sloman A., 1999, Review of: Affective Computing, AI Magazine.

Sloman A., 2001a, Varieties of Affect and the CogAff Architecture Schema, http://www.cs.bham.ac.uk/research/cogaff/.

Sloman A., 2001b, Beyond Shallow Models of Emotions, Cognitive Processing, Vol. 2, No. 1, pp 177-198.

Teasdale J. D. and Barnard P. J., 1993, Affect, Cognition and Change: Re-modelling Depressive Thought, Medical Research Council Applied Psychology Unit, Cambridge, England, LEA Publishers.

Thayer R. E., 1996, The Origin of Everyday Moods, Oxford University Press.
