Displaying Expression in Musical Performance by Means of a Mobile Robot

Birgitta Burger(1,2) and Roberto Bresin(1)

(1) KTH, CSC School of Computer Science and Communication, Dept. of Speech Music and Hearing, Stockholm, Sweden
(2) University of Cologne, Dept. of Systematic Musicology, Germany
1 Introduction
In recent times several attempts have been made to give a robot, or more broadly a computer, some kind of feelings in order to understand and model human capacities. The main idea of our work was the design of expressive robot movements for displaying the emotional content embedded in the audio layer of both live and recorded music performance. Starting from results of studies on musicians' body movements in emotionally expressive music performance (see [3]), we mapped different movement cues (e.g. speed, fluency) to movements of a small mobile robot. Since the robot was constrained by its sensors and motors, the emotions were implemented taking into account only the main characteristics of musicians' movements. We implemented movements for the three emotions happiness, anger and sadness. Subjects were asked to judge in a perceptual test which emotional intentions were communicated by the movements.
2 The M[E]X Robot and Its Emotional Behaviour
We used the Lego Mindstorms NXT system to build our robot, called M[E]X (short for Musical EXpression); Fig. 1 shows the final version of M[E]X. The implementation of the emotional behaviour is based on literature on how emotion is communicated by acoustical cues [4], on how musicians communicate emotional intentions by body movements during performance [3], and on how emotional intention is expressed by shapes of 3-D objects [5]. The movements for happiness consisted of two interleaved circular structures: one big circle interrupted by smaller circles (round shapes convey a positive emotion [5]). The movements for anger were very fast, jerky, irregular, asymmetrical and unpredictable (jerky movements in music performance communicate anger [3]; sharp and spiky objects convey a negative emotion [5]). The movements for sadness were very slow and smooth (i.e. neither jerky nor irregular), and the amount of gesture was small.

Fig. 1. M[E]X

A. Paiva, R. Prada, and R.W. Picard (Eds.): ACII 2007, LNCS 4738, pp. 753-754, 2007.
(c) Springer-Verlag Berlin Heidelberg 2007
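As a purely illustrative sketch (not M[E]X's actual NXT code), the emotion-to-movement mapping described above could be represented as a lookup from emotional intention to movement cues; all parameter names and values below are assumptions introduced for this example:

```python
# Hypothetical sketch of the emotion-to-movement-cue mapping described above.
# Parameter names and numeric values are illustrative assumptions, not the
# actual M[E]X implementation.

EMOTION_CUES = {
    # happiness: interleaved circular structures; round shapes convey
    # a positive emotion [5]
    "happiness": {"speed": 0.6, "smoothness": 0.8, "shape": "circles"},
    # anger: very fast, jerky, irregular, unpredictable movements [3];
    # sharp, spiky shapes convey a negative emotion [5]
    "anger": {"speed": 1.0, "smoothness": 0.1, "shape": "spiky"},
    # sadness: very slow, smooth movements with a small amount of gesture
    "sadness": {"speed": 0.2, "smoothness": 0.9, "shape": "small"},
}

def movement_profile(emotion: str) -> dict:
    """Return the movement-cue profile for an emotional intention."""
    try:
        return EMOTION_CUES[emotion]
    except KeyError:
        raise ValueError(f"no movement profile for emotion: {emotion!r}")
```

Such a table makes the contrast between the three implemented emotions explicit: anger maximizes speed and minimizes smoothness, while sadness does the opposite.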
3 Experiments
We conducted two experiments, one in Stockholm and one in Cologne. The movement stimuli were presented both with and without a musical background conveying the same emotional intention; the aim was to investigate whether a musical input supports the perception of the movements. For more information about the musical stimuli see [1]. Each experiment was divided into two sections: section 1 contained only the movements of the robot, while in section 2 the movements were combined with the musical stimuli. Subjects rated each stimulus on happy, angry, sad, neutral and expressive scales on a questionnaire, each scale divided into 7 steps (from not . . . to very . . . ).

In the first experiment, with 15 subjects, the intended emotion happiness was better recognized when presented with music; when presented without music the subjects could not distinguish between happy and angry movements. The movements for displaying anger were not perceived as angry, independently of the musical input. Sad movements were perceived well both with and without music.

36 subjects took part in the second experiment. For happiness the same result as in Experiment 1 occurred: better recognition in combination with music, and no distinction between happy and angry stimuli when presented without music. The intended emotion anger was better recognized when presented without music; with music the subjects rated the movements as rather happy. Sad stimuli were again perceived well both with and without music. (For a detailed analysis of both experiments see [2].)
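The rating procedure can be illustrated with a minimal aggregation sketch. The rating values below are invented for illustration only (the real analysis is reported in [2]); they merely mimic the pattern found for happy stimuli, which were recognized with music but confused with anger without it:

```python
# Minimal sketch of aggregating 7-step questionnaire ratings per stimulus
# condition. All rating values are invented for illustration; they are NOT
# the experimental data, which is analysed in [2].

from statistics import mean

# ratings[condition][scale] -> one rating per subject (1 = "not", 7 = "very")
ratings = {
    "happy_with_music":    {"happy": [6, 5, 7], "angry": [2, 1, 2]},
    "happy_without_music": {"happy": [4, 3, 5], "angry": [4, 5, 3]},
}

def mean_ratings(condition: str) -> dict:
    """Average each emotion scale over subjects for one stimulus condition."""
    return {scale: mean(values) for scale, values in ratings[condition].items()}
```

With the invented values above, the happy scale clearly dominates in the with-music condition, while the happy and angry means coincide without music, mirroring the confusion between happiness and anger described in the text.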
4 Discussion and Future Work
Different musical stimuli could be tested in further experiments, e.g. different melodies for each emotion. Further work could be done on the distinction between happiness and anger, since the subjects rated high on the expressive scale but seemed to have problems distinguishing between happiness and anger. The experiment could also be repeated with another kind of robot, one capable of directly translating the emotion in the music into expressive movements instead of using predefined movements.
References

1. Bresin, R.: What is the color of that music performance? In: ICMC, pp. 367-370 (2005)
2. Burger, B.: Communication of Musical Expression from Mobile Robots to Humans. Master's Thesis (in preparation)
3. Dahl, S., Friberg, A.: Visual perception of expressiveness in musicians' body movements. Music Perception 24(5) (in press)
4. Friberg, A., Schoonderwaldt, E., Juslin, P.N.: CUEX: An algorithm for extracting expressive tone variables from audio recordings. Acta Acustica united with Acustica 93(3), 411-420 (2007)
5. Isbister, K., Höök, K., Sharp, M., Laaksolahti, J.: The Sensual Evaluation Instrument: Developing an Affective Evaluation Tool. In: CHI, pp. 1163-1172 (2006)