The International Journal of Virtual Reality, 2007, 7(2):27-32


The Potential of Human Haptic Emotion as Technique for Virtual Human Characters Movement to Augment Interactivity in Virtual Reality Game

Ahmad Hoirul Basori, Daut Daman, Mohd Shahrizal Sunar and Abdullah Bade
Department of Computer Graphics and Multimedia, Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia

Abstract—Virtual human character movement is essential in virtual reality games for producing natural, emotionally expressive motion. Most existing work drives movement from input devices such as joysticks, mice, and keyboards, or from artificial intelligence techniques aimed at improving interactivity. This paper proposes a different method of controlling virtual human character movement that combines human emotion with haptic sensation, which we call human haptic emotion. The scheme uses human haptic emotion to control character movement, bringing the simulation closer to reality and making it more immersive. Information about the characters' movements is also sent to the players through a haptic device during the play session. This approach is expected to help virtual reality games achieve smoother, more natural movement.

Index Terms—Virtual human character movement, human haptic emotion, virtual reality game.

I.

INTRODUCTION

Virtual character movement is a feature of virtual reality games that is usually controlled by mouse or keyboard, although recently it has also been driven by artificial intelligence built from particular rules [1]. Virtual reality games (VRG) depend on interactive movement as well, because each interaction needs natural motion [1]. In the VRG context this is useful for developing applications such as flight simulation, surgery simulation, and military training simulation [2, 3, 4]. Moreover, serious games embed tasks that require specific knowledge into game modules; they also offer additional information through messages, provide tutorials, and deliver experiences similar to real life [2]. From an educational perspective, VRG need to be more interactive in order to deliver the study material. This interaction can come from interaction among virtual characters in the virtual environment as well as interaction between the VRG application and the players [5, 6].

Human haptic emotion is a way of applying human emotion to virtual human characters and transferring those emotions to players through haptic devices, so that the players sense the emotion of the virtual human characters during the playing session. This paper recommends a different approach to handling virtual human character movement using human haptic emotion, intended as a feature for virtual reality games that augments interactivity both among the virtual characters themselves and between the characters and the players.

The paper is organized as follows: Section I introduces this research and the background of the idea; Section II reviews current and previous research on character movement, haptics, virtual reality games, and human perception; Section III describes the potential of human haptic emotion and the proposed theory; Section IV details the methodology; Section V presents testing and evaluation; and Section VI concludes with future work.

Manuscript received on April 10th, 2008. E-Mail: [email protected].

II. RELATED WORKS

Character movement has already become an important issue in virtual reality and game development. Many advanced game engines treat it as an essential aspect of simulating real-life cases, especially when navigation is used to combine game entertainment with educational content such as military theory, biology, or surgery and medical theory [1], [5, 6, 7, 8]. Other research has looked into solving the navigation problem with interfaces that guide players to a target place and help them trace objects in the virtual environment [9], while further work has concentrated on navigation tasks such as touring and path finding [10]. Character movement is very useful for supporting interactivity, especially for attracting the player's attention during a game session. Researchers have also focused on enabling communication between virtual characters over computer networks in order to create interactivity and shape human perception in virtual reality games or real-life simulations [2], [3], [7], [11], [12], [13].

On the other hand, haptics (the sense of touch) has already been implemented in several virtual reality and training simulator projects, among them a Virtual Billiard Game (VBG) [14] and the Virtual Haptic Back (VHB) [15]. A different approach has also been proposed to explore and analyze human emotion through a haptic device [16]. Games have already reached the stage of involving the sense of touch, which can be felt during the play session [14]. Haptic (force) feedback, or shock feedback, is derived from physical actions on objects such as grabbing, pressing, stretching, and rubbing [17]. The cheapest haptic devices are shock-feedback or vibrating joysticks, while glove-like input devices are very expensive, so researchers continue to work toward haptic devices that are both high quality and cheap [15, 17].

While characters are being simulated in a virtual environment, each character needs to act together with the others according to rules applied from the beginning. Virtual human characters in a virtual reality game can perform several kinds of interaction:

• Interaction between virtual human characters and the virtual world, such as a walkthrough of the virtual environment.
• Interaction between virtual human characters and other virtual human characters.
• Interaction between virtual human characters and virtual animal characters or other moving objects in the virtual environment.

Additionally, previous research has tried to combine feeling or emotion with social aspects to control virtual characters so that they respond and move according to the emotional condition of each character [1]. Another feature adds a social artificial-intelligence agent in order to manage interaction between human and animal characters [1]. Furthermore, involving human emotion promises great benefits for developing artificial life simulation [13]. Our research proposes a collaborative approach that improves on previous work in handling virtual human character movement, combining the angle of direction generated from emotion with haptic sensation to convey the sense of movement.

III. THE POTENTIAL OF HUMAN HAPTIC EMOTION

Human behavior has become a consideration for researchers bringing virtual reality games to better visualization. This includes applying complex textures and animation to 3D models, so that some models can imitate real human emotional expression, such as facial mimicry, lip representation, and mouth appearance, in order to convey the emotion of virtual human characters. Researchers have also worked to improve the acoustics of virtual reality games, classifying human emotion into different levels of decibel power; this lets players experience the full range of emotions with different sound frequencies and devices. This section describes the potential of human haptic emotion as a technique for virtual character movement to augment interactivity in virtual reality games. Section 3.1 explains how human haptic emotion can serve as a technique for virtual human character movement, and Section 3.2 describes the integration of human haptic emotion into virtual reality game applications.

3.1 Human Haptic Emotion as a Technique for Virtual Human Character Movement

In [16], the human emotions observed in the experimental results are classified into seven categories: anger, fear, disgust, sadness, joy, interest, and surprise. These emotions are expressed through haptic joystick devices, and the application records every interaction of the human characters for documentation. Furthermore, intelligent virtual agents can be applied so that a character is able to learn and adapt to its digital environment [11]. The proposed method is a collaboration between a virtual reality game and human haptic emotion: the aim is to construct virtual characters that are navigated using human emotion, while the position of each character during its transitions is transferred to the users through haptic devices. Following Bailenson [16], each emotion of a virtual human character is assigned a different approximate frequency, as shown in TABLE 1:

TABLE 1: THE CLASSIFICATION OF HUMAN EMOTION AND FREQUENCIES OF HAPTIC DEVICES.

Human Emotion    Frequency of Haptic Device
Anger            I
Fear             II
Disgust          III
Sadness          IV
Joy              V
Interest         VI
Surprise         VII
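The emotion-to-frequency mapping of TABLE 1 amounts to a simple lookup. The sketch below is illustrative only: the integer levels 1-7 stand in for the table's Roman numerals I-VII, and the names are our own, not part of the original system.

```python
# Sketch: TABLE 1 as a lookup from emotion to haptic frequency level.
# The integer levels 1-7 represent the paper's Roman numerals I-VII;
# identifiers here are illustrative assumptions.
HAPTIC_LEVELS = {
    "anger": 1,
    "fear": 2,
    "disgust": 3,
    "sadness": 4,
    "joy": 5,
    "interest": 6,
    "surprise": 7,
}

def haptic_level(emotion: str) -> int:
    """Return the haptic frequency level (I-VII) for a given emotion."""
    return HAPTIC_LEVELS[emotion.lower()]
```

A lookup table like this makes it trivial for the game loop to translate a character's current emotional state into a device command.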

All of these emotions stimulate different movements in the virtual human characters, and each also produces a different vibration frequency so that the players are told about the characters' position and emotion during interaction in the virtual environment. The movements of the virtual human characters are driven by scenarios that involve a particular emotion; for example, a teacher may be interested in the programming ability of his or her students, or may instead fear their current programming abilities, while students may feel fear or joy when facing their final exam results.

3.1.1 Virtual Human Character Movement

Mocholí et al., as cited in [1], classify character movement along the axes "fear-courage", "aggressiveness-calmness", "happiness-sadness", and "faithfulness-disloyalty". The proposed idea instead uses seven emotions to control character movement: characters take actions and react to particular scenarios.

Case 1: Fig. 1 illustrates the movement of VHC1 while in "fear" mode. VHC1 performs a "fear" action and then moves from one position to another along the path of the "fear" emotion mode. In this case VHC1 only performs an action and does not affect any other VHC, which is why VHC1 alone moves to the given position without being followed by the others.

Case 2: Case 1 showed a VHC performing an action on its own; this case shows how one VHC can act while other VHCs react. VHC1 interacts with VHC2, with VHC1 in "anger" mode and VHC2 in "fear" mode. There are two alternatives. First, VHC1 stays in its position while VHC2 moves close to VHC1 in order to perform a particular action, such as apologizing or entertaining. Second, VHC1 moves to a specific position and VHC2 reacts by moving away from VHC1, because VHC2 feels "fear" and tries to reach a safe position to avoid VHC1. This case illustrates that all character movements are intended to be controlled as natural movement. Human haptic emotion controls the movements and conveys the position of the virtual human characters (VHC) in the virtual environment during the playing session. The details are illustrated in Fig. 1.

Fig. 1. Illustration of the VHC1 movement path in "fear" emotion mode.

In case 2, the virtual human characters change their positions through the act-and-react behavior generated by the scenario. This behavior is summarized in TABLE 2:

TABLE 2: THE ACTION AND REACTION OF VHC FOR CASE 2.

Virtual Human Character   Emotion   Movement                                          Frequency
VHC1                      anger     Does not move / moves to a particular position    I
VHC2                      fear      Moves close to VHC1 / moves away to avoid VHC1    II
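The act/react behavior of TABLE 2 can be read as a small rule set. A minimal sketch, assuming our own encoding of the two alternatives (approach to appease, or flee to a safe position); the paper describes this behavior only in prose:

```python
import random

# Sketch of the case 2 act/react rules (TABLE 2): an "anger" actor and a
# "fear" reactor. The rule encoding and names are illustrative
# assumptions, not the authors' implementation.
def react(actor_emotion: str, reactor_emotion: str, rng=random) -> str:
    """Choose the reacting character's movement for a given emotion pair."""
    if actor_emotion == "anger" and reactor_emotion == "fear":
        # TABLE 2's two alternatives: move close to VHC1 to apologize or
        # entertain, or move away from VHC1 to a safe position.
        return rng.choice(["approach_vhc1", "avoid_vhc1"])
    return "idle"  # pairs outside the case 2 scenario trigger no reaction
```

More emotion pairs would be added as further `if` branches or as a lookup keyed on the (actor, reactor) tuple.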

Case studies 1 and 2 show that human haptic emotion has great capability for producing natural, emotion-based movement that augments interactivity in virtual reality games. This is essential when virtual reality games are designed for real-life simulation, because such simulation needs natural aspects to make the virtual reality believable. Human haptic emotion can therefore be very useful for the future of virtual reality games and edutainment, especially in increasing the natural effects of virtual characters.

3.1.2 Augmenting Interactivity

According to [6], interactivity has three effects on players: psychological effects, actions, and reactions during the virtual reality game session. From the psychological perspective, human haptic emotion acts as a stimulator for every interaction caused by the virtual human characters (VHC); the VHC are thus able to learn and adapt to the situation from every interaction they are involved in, across every emotion. Moreover, as the previous case studies showed, the VHC can interact naturally with movements controlled by each character's emotional condition. Interactivity increases because the VHC act and react naturally based on human emotions, and every action or reaction is sent to the players through haptic devices, keeping the players aware throughout the virtual reality game session.

Fig. 2. Illustration of how human haptic emotion contributes to interaction during a game session.

3.2 Integrating Into Virtual Reality Games

Game engines now provide tools to link VRG systems with haptic devices, so there is high potential for human haptic emotion to be applied in virtual reality games to enhance interactivity through natural movement. The method is developed through communication between the game engine and the haptic devices: each emotion of a human character can be transferred to the players through haptic devices that produce different frequencies depending on the simulation scenario. The scenario stimulates the haptic device to execute a particular frequency as the scenario unfolds. This feature will improve the immersiveness and interactivity of virtual reality games. The human haptic emotion approach works best when implemented with advanced game engines such as XNA or Torque and advanced haptic devices such as glove-like input devices or head-mounted displays. The overall system architecture is illustrated in Fig. 2.
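The engine-to-device communication described above can be sketched as a per-frame dispatch. Everything named here is hypothetical scaffolding: `HapticDevice` stands in for a real force-feedback API (such as DirectInput effects), and the example magnitudes echo TABLE 3.

```python
# Sketch of coupling a game-engine loop to a haptic device: each frame,
# a character's active emotion is translated into a vibration command.
# HapticDevice is a hypothetical stand-in for a real force-feedback API.
class HapticDevice:
    def __init__(self):
        self.sent = []  # record of (magnitude, duration_us) commands

    def vibrate(self, magnitude: int, duration_us: int):
        self.sent.append((magnitude, duration_us))

# Illustrative per-emotion commands; TABLE 3 gives the full mapping.
EMOTION_COMMANDS = {"anger": (9000, 3_000_000), "fear": (8000, 1_000_000)}

def game_frame(emotion: str, device: HapticDevice):
    """Dispatch one frame's haptic feedback for the active emotion."""
    if emotion in EMOTION_COMMANDS:
        device.vibrate(*EMOTION_COMMANDS[emotion])

dev = HapticDevice()
game_frame("anger", dev)  # the player would feel the "anger" vibration
```

Keeping the emotion-to-command mapping in data rather than code makes it easy for a scenario designer to tune the feedback without touching the loop.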


IV. METHODOLOGY

The previous sections described the general idea of this paper; this section describes human haptic emotion in detail. Bailenson [16] classifies the measurement of human emotion into several measurement points: distance, mean speed, standard deviation of speed, mean acceleration, standard deviation of acceleration, angle, standard deviation of position, and standard deviation of the major axis. The detailed measurements are listed in Fig. 3.

Measurement    Disgust   Anger        Sadness   Joy      Fear   Interest   Surprise
Distance       Long      Short        Long      Short    .      .          Short
Speed(M)       Fast      Slow         Fast      .        .      .          .
Speed(SD)      Jerky     Steady       Jerky     .        .      Jerky      .
Accel(M)       Faster    Slower       Faster    .        .      .          .
Angle(SD)      .         .            .         .        .      .          .
Position(SD)   .         .            .         .        .      .          .
Major(SD)      .         .            .         Short    .      .          .
Minor(SD)      .         .            .         .        .      .          .
Major(%)       .         Rectangular  Square    Square   .      .          Square

Fig. 3. List of experimental results on human emotion measurement [16].
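For concreteness, the first few measurements in Fig. 3 (distance, mean speed, speed SD, mean acceleration) could be computed from a recorded joystick trajectory roughly as below. The unit time step and the function shape are our simplifying assumptions, not Bailenson et al.'s exact procedure.

```python
import math

# Sketch: computing distance, mean speed, speed SD and mean acceleration
# from a trajectory of (x, y) samples taken at unit time steps. This is a
# simplified reading of the Fig. 3 measurements, not the original code.
def trajectory_stats(points):
    steps = [math.dist(a, b) for a, b in zip(points, points[1:])]
    distance = sum(steps)               # total path length
    mean_speed = distance / len(steps)  # distance per unit time
    speed_sd = math.sqrt(
        sum((s - mean_speed) ** 2 for s in steps) / len(steps))
    accels = [b - a for a, b in zip(steps, steps[1:])]  # speed changes
    mean_accel = sum(accels) / len(accels) if accels else 0.0
    return distance, mean_speed, speed_sd, mean_accel
```

With measurements in this numeric form, qualitative labels such as "Long" or "Jerky" in Fig. 3 correspond to thresholds on distance and speed SD.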

Fig. 4. Angle direction recording result.

Bailenson [16] recorded the angle of joystick direction from players during the experiment. The experiment used a scaling method and axes to measure and decide the direction of the object, and then calculated the angle of the joystick; the complete angle recording is shown in Fig. 4. Furthermore, N. Zagalo [18] models the story of emotion on four quadrants separated into Happiness, Anxiety, Sadness, and Relaxation: the X axis distinguishes positive from negative, and the Y axis distinguishes high from low. This emotion grouping by N. Zagalo [18] is illustrated in Fig. 5.

Fig. 5. Model of story viewer emotion [18].

Building on these previous methods, we propose a methodology called human haptic emotion. It combines the earlier methods so that natural movement and natural behavior can be conveyed through the vibration magnitude of a haptic device. We model the movement of virtual human characters in two ways: first, based on paired emotions (anger-fear), as in the scenario of case 2 above; second, by combining the models of Bailenson [16] and N. Zagalo [18] with the force-feedback capabilities of DirectX and the XNA framework. The proposed model uses a direction for each emotion together with a vibration magnitude from the haptic device, so that every vibration is expressed at a certain frequency. The details of this methodology are given in Fig. 6 and TABLE 3.

Fig. 6. Direction in 3D virtual environment.

The movement of human characters in a virtual environment remains fundamentally on the X-Y-Z axes. In this experiment, the movement of the virtual human characters is controlled by a directional angle for each emotion. We assume each emotion has an approximate degree for its path according to the angles from Bailenson [16]. For example, Anger: 50% Right - 50% Left means that this emotion has the same chance of taking the right or the left route. Other emotions generate different movement paths; surprise, for instance, has the composition 70% Right - 30% Left, indicating that the surprise emotion tends to move toward the right with approximately 70% probability.

TABLE 3: DIRECTION AND MAGNITUDE VIBRATION TABLE FOR EACH HUMAN EMOTION.

Emotion    Direction of Movement    Magnitude Vibration   Magnitude Duration
Disgust    100% Left                10000                 2,000,000 µs
Anger      50% Right - 50% Left     9000                  3,000,000 µs
Sadness    80% Right - 20% Left     1000                  3,000,000 µs
Joy        75% Right - 5% Left      5000                  1,000,000 µs
Fear       20% Right - 80% Left     8000                  1,000,000 µs
Interest   70% Right - 30% Left     4000                  2,000,000 µs
Surprise   70% Right - 30% Left     6000                  3,000,000 µs
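TABLE 3 can be read directly as data: a right-turn probability plus a vibration magnitude and duration per emotion. A sketch under that reading (the sampling helper is our own assumption; the joy row's percentages are copied as printed in the table):

```python
import random

# TABLE 3 as data: (probability of turning right, vibration magnitude,
# vibration duration in microseconds). Values are transcribed from the
# table; the sampling helper is an illustrative assumption.
EMOTION_MODEL = {
    "disgust":  (0.00, 10000, 2_000_000),
    "anger":    (0.50,  9000, 3_000_000),
    "sadness":  (0.80,  1000, 3_000_000),
    "joy":      (0.75,  5000, 1_000_000),
    "fear":     (0.20,  8000, 1_000_000),
    "interest": (0.70,  4000, 2_000_000),
    "surprise": (0.70,  6000, 3_000_000),
}

def sample_step(emotion: str, rng=random):
    """Sample one movement step: turn direction plus haptic command."""
    p_right, magnitude, duration_us = EMOTION_MODEL[emotion]
    direction = "right" if rng.random() < p_right else "left"
    return direction, magnitude, duration_us
```

Each call yields both halves of the proposed model: the direction driving the character's path and the vibration parameters sent to the player's haptic device.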

V. TESTING AND EVALUATION

For the preliminary testing we used DirectX as the library to communicate with the force-feedback haptic device, a Tesun joystick (see Fig. 7), while the XNA framework was used to visualize and demonstrate the methodology. The human model in this experiment (dude.fbx) was taken from XNA Creators.


Fig. 7. TESUN force-feedback joystick.

We separate the process into three main phases:

• First, change the direction of the virtual human characters according to each emotion's angle during the playing session.
• Second, classify the magnitude frequencies according to each emotion as stimulated by the game scenario or by the players.
• Third, compute the delay time of the haptic transfer.

We use the DirectX library to control the haptic device with a standard DirectX statement, as shown in Fig. 8.

Forces.Add(ForceType.Anger, InitializeForce(Dev, EffectType.ConstantForce, axis, 1000, EffectFlags.ObjectOffsets | EffectFlags.Spherical, 2000000));

Fig. 8. Force feedback in DirectX.

The vibration magnitude ranges from 0 to 10000, but in our experiments with the Tesun joystick it is very hard to feel vibrations below 1000. A vibration only feels distinctly different above 1000; moreover, successive vibrations must differ by increments of 1000 to show the variation. Therefore, in TABLE 3, the magnitude frequencies of the vibrations are separated by steps of 1000.

The visualization of virtual human character movement is still a preliminary finding consisting of one character, as shown in Fig. 9. The experiment revealed a delay between the visual movement and the haptic sensation; the delay shows that the graphics are influenced by changes in the virtual human character's angle of direction. The approximate result is illustrated by the graphs in Fig. 10 and Fig. 11.

Fig. 9. Anger mode movement and haptic sensation.

Fig. 10. Anger mode movement and haptic sensation: angle changed [16].

These graphs come from an experiment using a simple haptic device, the Tesun force-feedback USB joystick. The measured delay would differ with another haptic device, such as an Xbox controller, a glove-type haptic device, or others. In future work we expect to reduce the delay of the haptic transfer in order to give a real-time sensation to the players.

Fig. 11. Delay time during transfer of haptic sense via haptic device [16].
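The third phase, computing the haptic transfer delay, can be sketched as a timestamp difference between the visual update and the haptic dispatch. `send_haptic_command` is a stub for the real DirectX call, and the choice of clock is our own assumption.

```python
import time

def send_haptic_command(magnitude: int):
    """Stub standing in for the real force-feedback call (e.g. DirectX)."""
    pass

# Sketch of phase three: measure the delay between the moment the frame's
# visual update completes and the moment the haptic command is dispatched.
def measure_haptic_delay(magnitude: int) -> float:
    visual_update_time = time.perf_counter()    # frame rendered here
    send_haptic_command(magnitude)
    haptic_dispatch_time = time.perf_counter()  # command handed to device
    return haptic_dispatch_time - visual_update_time  # delay in seconds

delay = measure_haptic_delay(9000)
```

In a real measurement the second timestamp would be taken when the device acknowledges the effect, so the figure captures device latency rather than just function-call overhead.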

VI. CONCLUSION

This paper has illustrated and demonstrated the potential of human haptic emotion as a control for virtual human character movement in virtual reality games. The experimental results are still preliminary findings based on a small amount of data; nevertheless, they show the potential of human haptic emotion to act as a trigger that stimulates virtual human character movement, so that characters act and react naturally in a digital environment. In addition, human haptic emotion affects the interactivity and immersiveness of virtual reality games. It is expected to augment interactivity in virtual reality games by producing natural, emotional character movement derived from human emotion and sent through haptic devices. Further research will integrate human haptic emotion with artificial intelligence in order to achieve more realistic virtual reality games.

ACKNOWLEDGEMENT

The authors would like to thank the Ministry of Science, Technology and Innovation Malaysia (MOSTI) and Universiti Teknologi Malaysia (UTM) for their financial support under e-science fund vot no. 79223.

REFERENCES

[1] J. A. Mocholí, J. M. Esteve, J. Jaén, R. Acosta and P. Louis Xech. An Emotional Path Finding Mechanism for Augmented Reality Applications, Proceedings of the 5th International Conference on Entertainment Computing (ICEC 2006), pp. 13-24, September 2006.
[2] D. Michael and S. Chen. Serious Games: Games that Educate, Train, and Inform, Thomson Course Technology PTR, Boston, USA, 2006.
[3] R. S. Aylett, C. Delgado, J. H. Serna, R. Stockdale, H. Clarke, M. Estebanez, P. Goillau and T. Lynam. REMOTE: Desk-top Virtual Reality for Future Command and Control, Springer-Verlag London Limited, pp. 131-146, 2005.
[4] S. Marks, J. Windsor and B. Wunsche. Evaluation of Game Engines for Simulated Surgical Training, Proceedings of the 5th International Conference on Computer Graphics and Interactive Techniques in Australia and Southeast Asia (GRAPHITE), pp. 273-318, December 2007.
[5] M. Rossou. Learning by Doing and Learning through Play: An Exploration of Interactivity in Virtual Environments for Children, ACM Computers in Entertainment, vol. 2, no. 1, pp. 1-23, 2004.
[6] F. L. Greitzer, O. A. Kuchar and K. Huston. Cognitive Science Implications for Enhancing Training Effectiveness in a Serious Gaming Context, ACM Journal of Educational Resources in Computing, vol. 7, no. 3, art. 2, 2007.
[7] M. Virvou and G. Katsionis. On the Usability and Likeability of Virtual Reality Games for Education: The Case of VR-ENGAGE, Computers and Education, vol. 50, pp. 154-178, 2006.
[8] Y. Cai, B. Lu, Z. Fan, C. Indhumathi, K. T. Lim, C. W. Chan, Y. Jiang and L. Li. Bio-edutainment: Learning Life Science through X Gaming, Computers & Graphics, vol. 30, pp. 3-9, 2006.
[9] M. J. Abasolo and J. M. Della. Magallanes: 3D Navigation for Everybody, Proceedings of GRAPHITE 2007, Perth, Western Australia, pp. 135-142, December 2007.
[10] C. Geiger, R. Fritze, A. Lehmann and J. Stocklein. HYUI: A Visual Framework for Prototyping Hybrid User Interfaces, Proceedings of the Second International Conference on Tangible and Embedded Interaction (TEI '08), pp. 63-70, February 2008.
[11] P. Herrero, C. Greenhalgh and A. De Antonio. Modelling the Sensory Abilities of Intelligent Virtual Agents, Springer Science+Business Media, pp. 361-385, 2005.
[12] T. Y. Li, M.-Y. Liao and P.-C. Tao. IMNET: An Experimental Testbed for Extensible Multi-user Virtual Environment Systems, International Conference on Computational Science and its Applications (ICCSA 2005), pp. 957-966, May 2005.
[13] F. R. Miranda, J. E. Kogler Jr., E. M. Hernandez and M. L. Netto. An Artificial Life Approach for the Animation of Cognitive Characters, Computers and Graphics, pp. 955-964, 2001.
[14] Y. Takamura, N. Abe, K. Tanaka, H. Taki and S. He. A Virtual Billiard Game with Visual, Auditory and Haptic Sensation, Proceedings of the International Conference on E-learning and Games (EDUTAINMENT 2006), LNCS 3942, pp. 700-705, April 2006.
[15] R. L. Williams II, J. N. Howell and D. C. Eland. The Virtual Haptic Back for Palpatory Training, Proceedings of the 6th IEEE International Conference on Multimodal Interfaces (ICMI), pp. 191-197, October 2004.
[16] J. N. Bailenson, N. Yee, S. Brave, D. Merget and D. Koslov. Virtual Interpersonal Touch: Expressing and Recognizing Emotions Through Haptic Devices, Human-Computer Interaction, vol. 22, pp. 325-353, 2007.
[17] N. Magnenat, U. Bonanni and S. Boll. Haptics in Virtual Reality and Multimedia, IEEE MultiMedia, pp. 6-11, 2006.
[18] N. Zagalo, A. Barker and V. Branco. Story Reaction Structures to Emotion Detection, Proceedings of SRMC '04, ACM, pp. 33-38, 2004.

Ahmad Hoirul Basori received the B.Sc. degree from the Department of Informatics, Institut Teknologi Sepuluh Nopember Surabaya, Indonesia (2004). He is currently a Master's student at the Department of Computer Graphics and Multimedia, Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, supervised by Associate Professor Daut Daman, Mohd Shahrizal Sunar, and Abdullah Bade. His research interests include virtual reality, game development, edutainment, and applications using haptic devices.

Daut Daman received his M.Sc. in 1984 from Cranfield Institute of Technology, United Kingdom. He is currently an Associate Professor at the Department of Computer Graphics and Multimedia, Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, and has headed the department since 2006. He received a Silver Medal at INATEX in 2004. Mr. Daman is an active professional member of the IEEE, the IEEE Computer Society, the ACM, ACM SIGGRAPH, and the Computer Graphics Society. His research interests include virtual reality, brain mapping visualization, and game environments.

Mohd Shahrizal Sunar received the B.Sc. degree in Computer Science majoring in Computer Graphics (1999) from Universiti Teknologi Malaysia and the M.Sc. in Computer Graphics and Virtual Environments (2001) from The University of Hull, UK. In 2008 he obtained his Ph.D. from the National University of Malaysia. His major field of study is real-time and interactive computer graphics and virtual reality. He has been a faculty member at the Department of Computer Graphics and Multimedia, Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia since 1999, and has published numerous articles in international and national journals, conference proceedings, technical papers, and magazines. Dr. Shahrizal is an active professional member of the ACM SIGGRAPH KL Chapter and a member of the Malaysian Society of Mathematics and Science.

Abdullah Bade received the B.Sc. degree in Computer Science (2000) and the M.Sc. (Research) in Computer Graphics (2003) from Universiti Teknologi Malaysia. His research interests include collision detection, articulated figure animation, and game environments. He received a Bronze Medal at INATEX 2004 (Industrial Art and Technology Exhibition, Universiti Teknologi Malaysia) as well as the Excellent Service Award 2004, Universiti Teknologi Malaysia.