Peña, A., Rangel, N., Muñoz, M., Mejia, J., & Lara, G. (2016). Affective Behavior and Nonverbal Interaction in Collaborative Virtual Environments. Educational Technology & Society, 19 (2), 29–41.

Affective Behavior and Nonverbal Interaction in Collaborative Virtual Environments

Adriana Peña1*, Nora Rangel2, Mirna Muñoz3, Jezreel Mejia3 and Graciela Lara1

1CUCEI, Universidad de Guadalajara, México // 2Centro de Estudios e Investigaciones en Comportamiento, Universidad de Guadalajara, México // 3Centro de Investigación en Matemáticas, Unidad Zacatecas, México // [email protected] // [email protected] // [email protected] // [email protected] // [email protected]
* Corresponding author

ABSTRACT
While a person's internal state might not be easily inferred through an automatic computer system, within a group, people express themselves through their interaction with others. The group members' interaction can then be helpful to understand, to a certain extent, the members' affective behavior toward the task at hand. In this context, a Collaborative Virtual Environment for learning represents a resourceful technology where students can interact with the environment and with each other through their graphical representation, i.e., their avatar, while at the same time each participant's actions can be automatically tracked. We propose the analysis of nonverbal interaction to identify affective behavior during the accomplishment of collaborative tasks, an approach that can be extended to verbal content. An exploratory study was conducted to understand the possibilities of this approach; the results showed behavior patterns that can be associated with task-focused affective states.

Keywords
Affective learning, Collaborative Virtual Environments, Nonverbal interaction, Collaborative Learning

Introduction

Emotions influence our behavior (Kantor, 1921a; Kantor, 1921b). Accordingly, during the last two decades, interest in the affective component of computing has grown under the research area of affective computing, aimed at the study of the relation between emotions and computers (Picard, 1997, p. 50). For learning, a special effort has been made to automate an important capability of human tutors: taking the emotions of their students into account during the teaching process (Heyward, 2010; Moridis & Economides, 2008), that is, affective learning. However, due to the complexity of measuring emotions, their automated sensing remains an open issue.

According to Picard and Daily (2005), the methods to evaluate affect in computer users include the classic questionnaires. This self-report on emotions presents inconveniences such as interruption, and the fact that the emotional state of a person can change from moment to moment. The evaluation of affective behavior also includes body measures, based either on physiological signals captured through sensors such as body-worn devices, EEG (electroencephalogram) or ECG (electrocardiogram) (e.g., Agrafioti, Hatzinakos, & Anderson, 2012; Bamidis, Papadelis, Kourtidou-Papadeli, Pappas, & Vivas, 2004), or on body activity such as facial expressions, posture, hand tension, gestures, or vocal expressions (e.g., Ammar, Neji, Alimi, & Gouardèresc, 2010; see Picard & Daily, 2005). Typically, this type of evaluation requires special equipment and makes the user highly aware of it. A third approach is based on task measures, which, as indicated by Picard and Daily (2005), is an indirect evaluation that assumes that our affective state influences our behavior on subsequent tasks (e.g., Lerner, Small, & Loewenstein, 2004; Liu, Lieberman, & Selker, 2003). Although this type of measure is not intrusive, because task measures are indirect their results are usually applicable to populations rather than to individuals (Picard & Daily, 2005).

On the other hand, what a person does is not an expression of internal or innate entities, but a direct effect of what is happening in the environment (Kantor, 1921a; Kantor, 1921b). Almost a hundred years ago, Kantor (1921a; 1921b) was already concerned about the recurrent causal connection between mental and physiological states in the study of emotions. In this regard, Boehner, DePaula, Dourish and Sengers (2007) recommended that affective computing incorporate an interactionist approach, treating emotions as constructed through interaction and expression rather than as an objective natural fact. Boehner et al.'s (2007) proposal is mainly intended for human-computer interaction (HCI), not particularly for computer-mediated human interaction. For Kantor (1929), the elements to consider in the analysis of affective behavior are: the stimulus or events that the individuals face, the individual's behavioral repertoire, the reaction speed, the physiological conditions of the person, the familiarity with the stimulus object, the context and the interaction circumstances, and the presence of certain persons in the situation to be analyzed. These last elements turn out to be particularly important when the aim is to analyze the affective component of individuals' behavior, within a group, while they solve a collaborative task.

People express themselves through their interaction with others, and in computer-mediated interactions, every user intervention can be recorded and analyzed over time. However, automatic understanding of unstructured dialogue has not been completely accomplished yet (Jermann, Soller, & Lesgold, 2004), or it carries a high computational cost. Nevertheless, in Collaborative Virtual Environments (CVEs) users interact via their graphical representation, their avatar, which includes the capability of displaying nonverbal interaction. Nonverbal behavior comprises most of what we do apart from the meaning of the words we say, including patterns of verbal interchange such as gaps or pauses (Heldner & Edlund, 2010), the objects when they are part of the task at hand, or proxemic behavior, among other nonverbal cues.

Nonverbal behavior has been widely studied in the real world (see Knapp & Hall, 2010, pp. 18-21) and in the creation of artificial behavior in robots and animation (e.g., Breazeal, Kidd, Thomaz, Hoffman, & Berlin, 2005; Guye-Vuillème, Capin, Pandzic, Thalmann, & Thalmann, 1998); however, there are few studies of the nonverbal cues people display in CVEs through their avatars. In 1998, Guye-Vuillème, Capin, Pandzic, Thalmann, and Thalmann recognized the importance of converting nonverbal communication to an equivalent in virtual worlds. Presenting a study of the influence of complex embodiments, they emphasized the avatar's functions in collaborative situations, such as perception, localization, and identification, among others. Particularly for object-focused interactions in virtual environments, Hindmarsh et al. (2002) studied how an immersive desktop CVE system provided participants with the ability to refer to and discuss features, as well as their interaction around objects. In this regard, according to Schroeder (2011, p. 105), for instrumental tasks in CVEs it is important to facilitate joint orientation. Schroeder (2011, p. 105) also argues that in such cases the users seem able to ignore the absence of many nonverbal cues by using those available.

The observation of nonverbal behavior to identify affective behavior represents a task-measure approach based on interaction. However, because the participants' own nonverbal cues serve as the parameter for the measures, these measures are individualized. It is important to point out, though, that the features of CVEs, which are predominantly visual, are proper for collaborative tasks in which people need to center their attention on the space and the objects in it, spatial tasks where co-presence is desirable; otherwise this technology might not be required (Spante, Heldal, Steed, Axelsson, & Schroeder, 2003). Also, unlike in real life, the focus in a CVE will be narrowed to a few things and constantly engaged because there is an ongoing reason for being in it (Schroeder, 2011, p. 45).

Nonverbal interaction cues and affective behavior in learning

As Ben Ammar et al. (2010) pointed out, affect is inextricably bound to learning. Expert teachers are accustomed to recognizing and addressing the emotional state of the learners and, based on that observation, to taking actions to impact their learning; placing this skill in computers should result in better computer assistance for the learner (Kort & Reilly, 2002).

A number of phenomena have been included under the analysis of the term "emotion": emotions, passion, sentiments, affective conduct, motivations, and moods, among others (Rodríguez, 2008). Kantor (1921a; 1921b; 1929) defined emotions as a "no answer" moment, like an action interruption caused by an overwhelming stimulus; by the time the individual responds, the emotional conduct might be gone. In contrast, sentiments are organized reactions produced by experience; they functionally correspond with the stimulus or events that the individuals face (Kantor, 1921a; Kantor, 1921b; Kantor, 1929; Ryle, 1949). While sentiments are the affective component of effective behavior, emotions interfere with that behavior; and both facilitate certain activities, not as a cause but as a disposition (Rodríguez, 2008). Sentiments produce changes in the person and can create attitude changes toward the stimulus, increasing or decreasing the general function of the person, retarding or accelerating his/her activity, and generating less or more interest in something in particular (Kantor, 1921a; Kantor, 1921b; Kantor, 1929).

Learning is not the exception; emotions are related to learning, affecting the students' performance (Kort & Reilly, 2002). Kort and Reilly (2002) described the dynamics of emotional states in the areas of science, math, engineering and technology, which naturally involve failure and a host of associated affective responses. The Kort and Reilly (2002) model is composed of four quadrants associated with affective states as follows:
- Quadrant I: "awe, satisfaction, and curiosity";
- Quadrant II: "disappointment, puzzlement, and confusion";
- Quadrant III: "frustration and discard of misconceptions"; and
- Quadrant IV: "hopefulness and fresh research".

According to Kort and Reilly (2002), students ideally begin in Quadrant I or II, because they might be curious or fascinated about a new topic of interest (Quadrant I) or they might be puzzled and motivated to reduce confusion (Quadrant II). Then, as the students construct or test knowledge, if they fail, they will be in Quadrant III, where changes are required and they work to understand what might need to be done and what does not; while making progress, they may move to Quadrant IV with fresh ideas, and then they may move to any of the other quadrants, until they solve the problem or complete the task. The students could also be simultaneously in multiple quadrants; for example, they might be frustrated but also curious about how to proceed. It is worth mentioning that it is out of the scope of this paper to theorize about the axes in the quadrants of Kort and Reilly's model, that is, their labels of negative to positive affect and of unlearning to constructive learning. Nevertheless, this model handles states that can be recognized through changes in the group's nonverbal interaction, and it fits the accomplishment of an open-ended task, that is, a task with different solution possibilities, causing, in some cases, the participants to fail before they find a right solution. In such a task, the flow (Csikszentmihályi, 1991) during collaboration can be interrupted when the participants find difficulties in carrying out the task or when they disagree on the course of action. This flow interruption can be associated with frustration, puzzlement or confusion, as opposed to satisfaction or the determination to finish the task, as in the Kort and Reilly (2002) model.

We therefore assume that in an engaging situation, due to the emotional state, the implementation of the task can be momentarily interrupted when one of the participants feels that something might be wrong and therefore feels frustrated, puzzled or confused. Since this is a spatial task, the interruption in the implementation could also be due to a change in the location of the participant, related to, for example, an overview of the scenario; therefore a navigation interruption has to be considered as well. Afterwards, longer than usual utterances could represent the moment when the participant explains or discusses his/her point of view in this regard, a reaction to the emotional moment. To our knowledge this alternative has not been explored elsewhere, and therefore we decided to go ahead with an exploratory study to understand its implications.

Exploratory study

As mentioned above, we propose the use of certain nonverbal behavior cues as an alternative means to measure affective behavior such as frustration, confusion or puzzlement during the accomplishment of an engaging collaborative task. In this first approximation, for simplicity, dyads were observed as the minimal expression of a group. The nonverbal cues selected were the duration of utterances and object-manipulation inactivity linked with navigation inactivity. These atypical behaviors of the participants are taken as indicating an affective moment; these data were then contrasted with what the participants verbally expressed to verify, at least to a certain point, whether they actually correspond to affective behavior.

Participants

Eight undergraduate students who were classmates, seven male and one female, aged 18 to 23 years, voluntarily participated in the study in exchange for desktop objects of their choice (i.e., pencils, text markers or pens).

Materials, devices and experimental situation

The sessions took place in an illuminated room, free of distractions. A Dell™ Alienware X51 computer was placed on each of two desks facing in opposite directions; for oral communication, microphones with earphones and the TeamSpeak™ application were used. Both computers were connected to the Internet, and the participants' manipulation of objects, navigation in the environment, and the start and end of utterances were automatically registered in a text file. With the OpenSim™ software and the CtrlAltStudio™ viewer, a three-dimensional (3D) CVE was adapted in which the participants worked in dyads. Both participants could see and interact in the same virtual scenario, but each from the point of view that matched his/her place in the virtual world at any given moment. The sessions were recorded with the Fraps™ application, which captures the screen as the user sees it; the screens of both participants were recorded.

Design and procedure

The implemented design allowed comparisons within and among participants and groups. The participants were randomly assigned to four dyads. The dyads first went through an instructional session for the software application, a demonstration session. Afterwards, each dyad took part in a session to solve the experimental task, which consisted of assembling several pieces to form a geometric figure like the one in Figure 1. During the sessions, the participants could see that same model formed by plastic pieces.

Figure 1. The figure to be assembled in plastic pieces

The scenario was an island in which a number of pieces of different colors were placed around a rectangular plane, as shown in Figure 2. In the CVE, each user was graphically represented by a humanoid avatar in first person; that is to say, participants could not see themselves in the virtual world although they could see their partner. The avatar corresponded to the participant's gender and displayed the participant's name above it to facilitate identification and communication. The human-computer interaction was through key combinations and the mouse. The participants could navigate by earth, for example by walking or running, or by air, flying; and they could select, move or rotate each piece or object. Navigation is not always necessary for object manipulation; objects can be manipulated from a distance if they are in sight. During the session, a sheet of paper with the key combinations was available on the desks. There was no time limit to solve the task.

During the demonstration, the participants were allowed to familiarize themselves with the application for around 3 to 5 minutes before they started the session. Then the following instructions were given to them: "To communicate with each other use the microphone with the earphones. The two of you are going to work together to assemble a figure like this (the plastic figure was then shown to them and also verbally explained), a cross with two levels; each bar of the cross has two cubes. Not all the pieces are necessary to assemble the figure; please assemble it within the white plane. We are here in the next room, give us a call when you finish."


Figure 2. Set of pieces for session 1

Data

The software application saves the "X," "Y" and "Z" coordinates of the position of the objects and the avatars whenever the pieces or the avatars are moved or rotated, and it records when the microphone is activated or deactivated by the voice of the user. The data were processed to calculate the duration of the utterances and the inactivity periods for navigation and for the accomplishment of the task, that is, the elapsed time between the moment each participant released a piece and the moment he/she grabbed another one.
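As an illustration of this processing step, the sketch below computes utterance durations and manipulation-inactivity gaps from a list of timestamped events. The event representation (tuples with an event type such as "utterance_start" or "release") is a hypothetical one assumed only for the example; the actual format of the text file logged by the adapted environment is not detailed in the paper.

```python
from collections import defaultdict

def durations_and_gaps(events):
    """Compute utterance durations and object-manipulation inactivity gaps.

    `events` is a list of (timestamp_seconds, participant, event_type) tuples,
    a hypothetical representation of the text file logged by the environment.
    Recognized event types here: "utterance_start", "utterance_end",
    "release" (a piece is released) and "grab" (a piece is grabbed).
    """
    utterance_durations = defaultdict(list)  # participant -> [seconds, ...]
    manipulation_gaps = defaultdict(list)    # participant -> [seconds, ...]
    utterance_started = {}
    last_release = {}

    for t, who, kind in sorted(events):
        if kind == "utterance_start":
            utterance_started[who] = t
        elif kind == "utterance_end" and who in utterance_started:
            utterance_durations[who].append(t - utterance_started.pop(who))
        elif kind == "release":
            last_release[who] = t
        elif kind == "grab" and who in last_release:
            # Task inactivity: elapsed time between releasing a piece
            # and grabbing the next one.
            manipulation_gaps[who].append(t - last_release.pop(who))
    return utterance_durations, manipulation_gaps
```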

Figure 3. Utterances' duration of one participant

The outstanding points, those with the longest duration, were calculated for each session for the three indicators: utterance duration, and inactivity in navigation and in manipulation of objects. The calculation was made with the mean and the standard deviation of these indicators, where the outstanding points or atypical data are those values with a 5% probability of occurring in the upper limit, one-tailed. As an example, Figure 3 shows the scatter chart of the utterance duration of one of the participants of Dyad 1. The points mark the initiation of an utterance (X-axis) and its duration (Y-axis). The longest utterances are the points at the top of the scatter chart, marked within a black rectangle.
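A minimal sketch of this thresholding is shown below. It assumes the upper one-tailed 5% limit is approximated as the mean plus roughly 1.645 standard deviations (the normal z value for the 95th percentile); the authors do not spell out the exact formula beyond the description above, so this is an illustrative reading.

```python
import statistics

def atypical_points(values, z=1.645):
    """Return the values falling in the upper one-tailed 5% region,
    i.e., above mean + z * standard deviation of the indicator."""
    if len(values) < 2:
        return []
    threshold = statistics.mean(values) + z * statistics.stdev(values)
    return [v for v in values if v > threshold]

# Illustrative usage: utterance durations (in seconds) for one participant.
durations = [2.3, 1.8, 2.7, 3.1, 2.4, 2.2, 9.5, 2.6, 2.9, 11.2]
print(atypical_points(durations))  # flags the unusually long utterances
```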

The data were also examined to see when more than one piece was taken out of the workspace, which represents a moment when the figure was partially disassembled. The verbally expressed difficulties found during the accomplishment of the task were identified and classified as follows:
- Doubts about the figure shape, when at least one of the participants verbally expressed not being sure about the final shape of the figure.
- Doubts about the pieces, when at least one of the participants verbally expressed his/her concern about not having enough pieces, or the proper ones, to finish the figure. Sometimes this difficulty was related to doubts about the figure shape.
- Doubts about the course of action, when at least one of the participants verbally expressed his/her concern about what his/her partner was doing. Here, expressions such as "What are you doing?" or "Why are you doing this or that?" were observed, or one of them expressed not being sure whether what they were doing was correct.
- Problems handling the pieces, when at least one of the participants verbally expressed that he/she could not, for example, lift or rotate a piece. This difficulty might be more related to frustration than to puzzlement or confusion.

Other difficulties found were:
- Not dialoging yet, when the atypical behavior took place before the participants started to talk to each other. Dialogue is fundamental for collaboration; although people might collaborate in a spatial task without talking to each other, that represents a very uncommon situation.
- They asked for help, in this particular case the participants were not sure about the final shape of the figure and decided to ask for help instead of reaching an agreement.
- Disassemble of part of the figure, the time when the dyad disassembled part of the figure. No dyad disassembled the whole figure.

Results

All dyads completed the task; Dyads 1, 3 and 4 did so successfully, while Dyad 2 built the figure using three cubes, instead of two, for each bar of the cross. The time each session took is shown in minutes and seconds in the second column of Table 1. All the dyads started the session, once both participants were connected, by moving pieces. The dialogue started later in all cases, as shown in the third column, also in minutes and seconds.

Table 1. Sessions duration
Dyad | Session duration | Dialogue started at
1 | 13:53 | 01:45
2 | 17:19 | 05:18
3 | 15:18 | 01:55
4 | 08:17 | 00:07

The times of all the atypical behaviors and the verbally expressed difficulties are shown in Table 2 for Dyad 1, Table 3 for Dyad 2, Table 4 for Dyad 3, and Table 5 for Dyad 4. In these tables, the first column (Elapsed time) is the elapsed session time, in minutes and seconds, at which the atypical behavior took place. The second column (Participant) gives the number that identifies each participant (1 or 2) in a dyad. The third column (Duration) is the duration of the atypical behavior in minutes, seconds and tenths of a second. The fourth column (Atypical behavior) gives the type of atypical data: OM for inactivity in manipulation of objects, NA for inactivity in navigation, and UT for utterance. When the inactivity in object manipulation corresponds to inactivity also in navigation, or vice versa, the atypical behavior is marked with an asterisk "*" on its right side. The fifth column (Related difficulty) contains the verbally externalized difficulty that took place in the environment around that time, according to the audio of the session. When the related difficulty was "Problems handling the pieces," the participant expressing this problem is indicated in parentheses. When there is no atypical behavior, only this column has data (this happened only once; see Table 4 at 09:54). When there was no verbally expressed difficulty, it is denoted with a short dash "-". A row with the corresponding time was added to indicate when the dyad disassembled part of the figure. By calculating these data, an automatic analysis can be performed during the collaborative session.
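As a sketch of how the asterisk marking could be computed automatically, the snippet below flags an object-manipulation inactivity period when it overlaps a navigation inactivity period of the same participant. The interval representation is an assumption made for the example; the paper only states that the two kinds of inactivity must correspond, so simple overlap is used here as an illustrative criterion.

```python
def mark_co_occurring(om_periods, na_periods):
    """Mark object-manipulation inactivity that overlaps navigation inactivity
    of the same participant (the asterisk "*" used in Tables 2-5).

    Each period is a hypothetical (participant, start_s, end_s) tuple.
    """
    marked = []
    for who, om_start, om_end in om_periods:
        overlaps = any(
            p == who and na_start < om_end and om_start < na_end
            for p, na_start, na_end in na_periods
        )
        marked.append((who, om_start, om_end, "OM*" if overlaps else "OM"))
    return marked
```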

Table 2. Atypical behavior of Dyad 1
Elapsed time | Participant | Duration | Atypical behavior | Related difficulty
01:45 | 1 | 00:04.70 | UT | Doubts about the figure shape
01:47 | 2 | 05:11.85 | NA | Doubts about the figure shape
02:01 | 1 | 00:13.93 | OM* | Doubts about the course of action
03:12 | 1 | 00:16.12 | OM* | Doubts about the pieces
03:16 | 1 | 00:02.31 | UT | Doubts about the pieces
03:17 | 1 | 03:56.705 | NA | Doubts about the pieces
03:27 | 1 | 00:02.29 | UT | Doubts about the pieces
03:40 | 1 | 00:02.57 | UT | Doubts about the pieces
03:55 | 2 | 00:21.49 | OM* | Doubts about the pieces
04:05 | 1 | 00:28.90 | OM* | Doubts about the pieces
04:53 | 2 | 00:02.68 | UT | -
05:14 | 1 | 00:02.68 | UT | Doubts about the figure shape
05:19 | 2 | 00:45.50 | OM | Doubts about the figure shape
05:28 | 1 | 00:23.53 | OM* | Doubts about the figure shape
06:28 | 1 | 00:02.37 | UT | Doubts about the figure shape
06:36 | 1 | 00:02.58 | UT | Doubts about the figure shape
06:47 | 2 | 00:02.92 | UT | Doubts about the figure shape
06:53 | 2 | 00:02.83 | UT | Doubts about the figure shape
07:07 | 1 | 00:14.52 | OM* | They asked for help
07:14 | 2 | 00:34.69 | OM | They asked for help
07:52 | 1 | 00:39.95 | OM* | They asked for help
07:57 |  |  |  | Disassemble of part of the figure
08:05 | 2 | 00:32.56 | OM* | Doubts about the course of action
09:05 | 2 | 00:15.88 | OM* | Problems handling the pieces (2)
09:59 | 2 | 00:17.99 | OM* | Problems handling the pieces (2)
10:31 | 1 | 10:41.22 | NA | Problems handling the pieces (1)
10:36 | 2 | 00:02.76 | UT | Problems handling the pieces (1)
11:04 | 2 | 00:02.44 | UT | Doubts about the course of action
11:05 | 1 | 00:15.77 | OM* | Doubts about the course of action
11:11 | 2 | 00:17.70 | OM* | Doubts about the course of action
11:14 | 1 | 00:02.98 | UT | Doubts about the course of action
12:10 | 2 | 00:03.43 | UT | Problems handling the pieces (1)
12:18 | 1 | 00:02.49 | UT | Problems handling the pieces (1)
12:54 | 1 | 00:02.50 | UT | Problems handling the pieces (1)
13:10 | 1 | 00:02.26 | UT | Problems handling the pieces (1)
13:30 | 1 | 00:02.65 | UT | Problems handling the pieces (1)
Note. "*" = inactivity in both object manipulation and navigation; "-" = no verbally expressed difficulty.

Table 3. Atypical behavior of Dyad 2
Elapsed time | Participant | Duration | Atypical behavior | Related difficulty
01:00 | 1 | 01:44.28 | NA | Not dialoging yet
03:16 | 2 | 00:54.85 | OM* | Not dialoging yet
03:57 | 1 | 00:23.38 | OM* | Not dialoging yet
04:41 | 1 | 01:49.64 | NA | Not dialoging yet
05:18 | 2 | 00:03.30 | UT | Doubts about the course of action
05:29 | 2 | 02:03.92 | NA | Doubts about the course of action
06:39 | 1 | 02:35.58 | NA | -
09:24 | 1 | 00:24.22 | OM* | -
10:00 | 1 | 00:48.91 | OM | -
10:22 | 2 | 03:59.30 | NA | -
10:25 | 2 | 00:27.27 | OM* | Doubts about the course of action
10:52 |  |  |  | Disassemble of part of the figure
11:25 | 1 | 00:02.04 | UT | Doubts about the course of action
12:40 | 1 | 00:02.41 | UT | Doubts about the course of action
12:46 | 2 | 00:03.63 | UT | Doubts about the course of action
15:01 | 1 | 00:41.17 | OM | -
15:57 | 1 | 00:47.53 | OM* | Doubts about the pieces
16:28 | 2 | 00:03.12 | UT | Doubts about the pieces
16:41 |  |  |  | Disassemble of part of the figure
Note. "*" = inactivity in both object manipulation and navigation; "-" = no verbally expressed difficulty.

Table 4. Atypical behavior of Dyad 3
Elapsed time | Participant | Duration | Atypical behavior | Related difficulty
00:39 | 1 | 00:27.43 | OM* | Not dialoging yet
01:04 | 2 | 05:09.18 | NA | Not dialoging yet
01:52 | 2 | 01:33.72 | OM* | Not dialoging yet
01:53 | 1 | 00:18.77 | OM* | Doubts about the course of action
01:56 | 1 | 00:05.28 | UT | Doubts about the course of action
02:26 | 1 | 01:06.10 | NA | -
02:35 | 1 | 00:19.10 | OM* | -
03:29 | 2 | 00:03.06 | UT | Doubts about the figure shape
03:36 | 2 | 00:02.84 | UT | Doubts about the figure shape
03:40 | 1 | 01:22.40 | NA | Doubts about the figure shape
03:49 | 1 | 00:24.35 | OM* | Doubts about the figure shape
04:00 |  |  |  | Disassemble of part of the figure
04:31 | 1 | 00:18.14 | OM* | Doubts about the figure shape
04:32 | 1 | 00:04.88 | UT | Doubts about the figure shape
04:38 |  |  |  | Disassemble of part of the figure
04:47 | 1 | 00:07.76 | UT | Doubts about the figure shape
04:56 | 1 | 00:27.52 | OM* | -
05:05 | 1 | 00:05.30 | UT | Doubts about the figure shape
06:06 | 1 | 00:06.60 | UT | Doubts about the course of action
06:16 | 1 | 00:15.98 | OM | Doubts about the course of action
06:16 | 1 | 00:08.54 | UT | Doubts about the course of action
06:50 | 2 | 01:18.04 | OM* | Doubts about the course of action
08:27 | 2 | 00:43.47 | OM* | -
08:53 | 1 | 00:04.78 | UT | -
08:54 | 1 | 00:25.10 | OM | -
08:59 | 1 | 00:06.10 | UT | -
09:54 | 2 |  |  | Problems handling the pieces (2)
10:51 | 1 | 00:04.95 | UT | Doubts about the course of action
11:16 | 2 | 00:15.29 | OM* | -
12:34 | 2 | 00:54.61 | OM* | Doubts about the pieces
12:35 | 2 | 00:03.29 | UT | Doubts about the pieces
12:58 | 1 | 00:40.37 | NA | -
13:17 | 2 | 00:08.09 | UT | -
Note. "*" = inactivity in both object manipulation and navigation; "-" = no verbally expressed difficulty.

Table 5. Atypical behavior of Dyad 4
Elapsed time | Participant | Duration | Atypical behavior | Related difficulty
00:07 | 1 | 00:15.48 | NA* | Doubts about the figure shape
00:36 | 2 | 00:16.48 | NA* | Doubts about the figure shape
01:38 | 2 | 00:19.80 | OM* | Doubts about the course of action
01:58 | 2 | 00:23.46 | NA* | Doubts about the course of action
02:06 | 1 | 00:23.79 | OM* | Doubts about the course of action
02:29 |  |  |  | Disassemble of part of the figure
03:26 | 2 | 00:02.67 | UT | Doubts about the figure shape
03:59 |  |  |  | Disassemble of part of the figure
04:11 | 1 | 00:03.08 | UT | Doubts about the figure shape
04:22 | 1 | 00:43.07 | OM | Doubts about the figure shape
04:31 | 2 | 00:21.39 | OM* | Doubts about the pieces
04:31 | 1 | 00:02.88 | UT | Doubts about the pieces
04:50 | 2 | 00:04.15 | UT | -
05:11 | 1 | 00:03.52 | UT | -
05:55 | 2 | 00:15.35 | NA | Problems handling the pieces (2)
05:57 | 1 | 00:02.97 | UT | Problems handling the pieces (2)
06:10 | 2 | 00:13.42 | OM* | Problems handling the pieces (2)
06:14 | 1 | 00:15.47 | NA* | Problems handling the pieces (1)
06:49 | 1 | 00:04.72 | UT | Problems handling the pieces (1)
07:31 | 2 | 00:16.35 | OM | -
07:36 | 1 | 00:27.86 | NA* | -
07:49 | 2 | 00:16.80 | OM* | -
08:06 | 1 | 00:03.95 | UT | -

The total number of atypical behaviors in the four dyads is 103. As mentioned, only one verbally expressed difficulty is not related to any atypical behavior; it occurred in Dyad 3 at 09:54 (see Table 4). Conversely, in 22 of the 103 occasions (21.4%), the atypical behavior was not related to any verbally expressed difficulty (those with a short dash in the Related difficulty column of Tables 2, 3, 4 and 5). Longer utterances did not always follow long periods of inactivity as expected, but rather occurred during the verbally expressed difficulty. Also, an expressed type of doubt always preceded the disassembling of part of the figure, and this was also always related to an atypical behavior (see Tables 2, 3, 4 and 5).

From the 103 atypical behaviors, those that occurred in the following circumstances were removed: when the dyads had not yet established dialogue, because there is no way to corroborate whether they correspond to a difficulty (seven of them, those labeled "Not dialoging yet" in the Related difficulty column of Tables 3 and 4); when Dyad 1 asked for help and was waiting for an answer (three of them, in Table 2); and those of inactivity in object manipulation or navigation that did not correspond to inactivity in both (17 of them, those not marked with an asterisk "*" in the Atypical behavior column of Tables 2, 3, 4 and 5). Table 6 shows the results by dyad: the total of atypical behaviors retained (76, the total of the second column) and the total of verbally expressed difficulties around the time the atypical behavior took place (62, the total of the third column). In the same Table 6, the total of atypical behaviors is broken down by utterances (UT), inactivity in object manipulation (OM) and inactivity in navigation (NA), together with their related difficulty when there was one. Difficulties were separated into doubts of any kind (probably more related to puzzlement and confusion) and problems handling the pieces (probably more related to frustration).

Table 6. Verbally expressed difficulties related to nonverbal cues
Dyad | Total of atypical behavior | Verbally expressed difficulties | UT | UT related to a doubt | UT related to handling pieces | OM | OM related to a doubt | OM related to handling pieces | NA | NA related to a doubt | NA related to handling pieces
1 | 28 | 27 | 18 | 11 | 6 | 10 | 8 | 2 | 0 | 0 | 0
2 | 8 | 7 | 5 | 5 | 0 | 3 | 2 | 0 | 0 | 0 | 0
3 | 22 | 15 | 13 | 10 | 3 | 9 | 5 | 0 | 0 | 0 | 0
4 | 18 | 13 | 8 | 3 | 2 | 5 | 3 | 1 | 5 | 3 | 1
Total | 76 | 62 | 44 | 29 | 11 | 27 | 18 | 3 | 5 | 3 | 1

Regarding utterances, Table 6 shows that of the 44 longest utterances, 29 are related to a kind of doubt and 11 to problems handling the pieces (the remaining five are not associated with a verbally expressed difficulty). No matter which participant had the problems handling the pieces, the longest utterances were made by either of them indistinctly (see Tables 2, 3, 4 and 5). The number of utterances by each participant was counted to corroborate whether the one who spoke the most was also the one with the longest utterances (see Table 7); only in two dyads was this true (Dyad 1 and Dyad 4).

Table 7. Utterances by participant
Dyad | Atypical utterances, Participant 1 | Atypical utterances, Participant 2 | Total of utterances, Participant 1 | Total of utterances, Participant 2
1 | 12 | 6 | 176 | 80
2 | 2 | 3 | 45 | 26
3 | 9 | 4 | 104 | 122
4 | 6 | 2 | 101 | 53

Atypical inactivity in the manipulation of objects (the OM column in Table 6) occurred 27 times: 18 related to different types of doubts and three to problems handling the pieces; therefore, six are not related to an expressed difficulty. Atypical navigation inactivity was present only in Dyad 4, five times: three related to a type of doubt and one to handling the pieces, so one is not related to an expressed difficulty. The two dyads that had atypical behavior before they started the dialogue (Dyad 2 and Dyad 3) expressed fewer problems handling the pieces. The problems handling the pieces appeared at the end of the session, when the dyads were about to finish the figure, trying to fit the last pieces and managing the second layer of the figure. Table 8 presents the number of atypical behaviors from all the dyads, broken down by type of difficulty.

Table 8. Number of atypical behaviors in dyads by difficulty
Atypical behavior | Doubts about the figure shape | Doubts about the course of action | Doubts about the pieces | Problems handling the pieces | Not related to a difficulty
OM | 3 | 9 | 6 | 3 | 5
NA | 2 | 1 | 0 | 1 | 1
UT | 13 | 10 | 6 | 8 | 7

Evaluation by dyad

Taking a closer look at each dyad's data, it can be observed that in Dyad 1 only one atypical behavior, a long utterance, could not be explained by a verbally expressed difficulty (see Table 2 at 04:53). A number of long utterances are very representative of the moments when they had doubts about the figure shape, which ended in asking for help and eventually in disassembling part of the figure. In this particular dyad we found a peculiarity in their navigation: they navigated first by air, then switched to earth navigation, and by the end of the session Participant 1 switched again to air navigation. Figure 4 shows the scatter charts of the navigation of Dyad 1. The points mark the duration of navigation, a red circle for air navigation and a blue triangle for earth navigation; the upper part of the figure corresponds to Participant 1 and the bottom to Participant 2. Through the video, we realized that they talked about switching to earth navigation to facilitate the manipulation of the pieces.

For Dyad 2 (as shown in Table 3), a larger number of atypical behaviors than in the other dyads seem not to be explained by any verbally expressed difficulty. However, only the one at 09:24 represents inactivity in both object manipulation and navigation. This dyad presented a lack of communication during roughly the first half of the session, as shown in Figure 5; they started to communicate, and actually to collaborate, toward the end of the session. They also presented long periods of object-manipulation inactivity compared with the other dyads. Doubts about the course of action were mainly expressed with long utterances. This was the dyad that ended up building the figure incorrectly.

Dyad 3 (see Table 4) had long periods of inactivity at the beginning of the session, while they were not talking to each other. This is the dyad with the most atypical behavior not associated with any expressed difficulty, around a third of all they had (seven of 22). From the video it was apparent that the participants decided to make one layer of the figure each, and therefore they were not collaborating but dividing the labor.


Figure 4. Earth and air navigation of Dyad 1

Figure 5. Utterances of both participants of Dyad 2

Dyad 4 was the only dyad that started almost immediately with a dialogue, and it was also the only one that had atypical navigation inactivity periods coinciding with inactivity in object manipulation, in both participants (see Table 5).

Conclusions and future work

We proposed the observation of nonverbal behavior during the accomplishment of a collaborative task in a CVE for the automatic understanding of affective behavior relevant to learning, such as frustration, puzzlement or confusion, in contrast to satisfaction or the determination to finish the task (Kort & Reilly, 2002). This is an indirect, non-intrusive task measure, but one based on individual measures. As a novel approach, we decided to carry out an exploratory study on dyads. Following Kantor (1921a; 1921b), we assumed an emotional "no answer" moment followed by its expression in the interaction with the others.

In the exploratory study, atypical inactivity behaviors of the participants were collected by measuring when they were not manipulating objects or navigating; unusually long utterances were also collected. Afterwards, the atypical behaviors were associated with verbally expressed difficulties (i.e., doubts about the course of action, about the pieces to create the figure, and about the final shape of the figure to assemble, as well as problems handling the pieces). There is no way to interpret the atypical behavior when the participants did not express themselves verbally, but that does not mean that they were not facing a frustrating, puzzling or confusing moment.

Through the exploratory study we found, among other things, a significant pattern: 62 times out of 76 (a high rate for human behavior, 81.6%), atypical behaviors occurred around the time that the participants verbally expressed a type of difficulty related to frustration, puzzlement or confusion; although neither long utterances nor inactivity could be associated with a particular type of difficulty. In a closer observation of each dyad, we found that Dyad 3 followed a division-of-labor approach; half of all the atypical behaviors not related to a verbally expressed difficulty occurred in this dyad.

We are aware that the use of nonverbal behavior patterns is a holistic approach to understanding affective behavior, and that this is just one particular situation for its application. In addition, the small number of trials, due to the fact that this is an exploratory study, does not allow conclusive assertions to be made. However, the findings encourage us to follow this path, exploring other nonverbal cues, such as gaze direction, or other patterns in utterances, such as the shorter ones. A probable way to better comprehend affective behavior is to also collect data related to voice inflections during the utterances. Nonverbal behavior has also been found helpful to explore, in the first place, whether collaboration is actually taking place (Peña & de Antonio, 2009). In a CVE, the collaboration phenomena can be easily collected and analyzed over time through nonverbal cues. This represents a significant resource for automating affective computing and a way to automatically give feedback to apprentices/students, or to aid a human or a virtual tutor in supporting learning. Some type of verbal analysis can complement this approach. The automatic analysis in affective learning should result in better scaffolding for the students in computer-supported collaborative learning.

References

Agrafioti, F., Hatzinakos, D., & Anderson, A. K. (2012). ECG pattern analysis for emotion detection. IEEE Transactions on Affective Computing, 3(1), 102-115.

Ben Ammar, M., Neji, M., Alimi, A. M., & Gouardèresc, G. (2010). The affective tutoring system. Expert Systems with Applications, 37(4), 3013-3023.

Bamidis, P. D., Papadelis, C., Kourtidou-Papadeli, C., Pappas, C., & Vivas, A. B. (2004). Affective computing in the era of contemporary neurophysiology and health informatics. Interacting with Computers, 16(4), 715-721.

Boehner, K., DePaula, R., Dourish, P., & Sengers, P. (2007). How emotion is made and measured. International Journal of Human-Computer Studies, 65(4), 275-291.

Breazeal, C., Kidd, C. D., Thomaz, A. L., Hoffman, G., & Berlin, M. (2005). Effects of nonverbal communication on efficiency and robustness in human-robot teamwork. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 708-713). doi:10.1109/IROS.2005.1545011

Csikszentmihályi, M. (1991). Flow: The psychology of optimal experience. New York, NY: Harper Collins.

Guye-Vuillème, A., Capin, T. K., Pandzic, I. S., Thalmann, N. M., & Thalmann, D. (1998). Nonverbal communication interface for collaborative virtual environments. In D. Snowdon & E. Churchill (Eds.), Proceedings of the Collaborative Virtual Environments (pp. 105-112). Manchester, UK: University of Manchester.

Heldner, M., & Edlund, J. (2010). Pauses, gaps and overlaps in conversations. Journal of Phonetics, 38, 555-568.

Heyward, P. (2010). Emotional engagement through drama: Strategies to assist learning through role-play. International Journal of Teaching and Learning in Higher Education, 22(2), 197-203.

Hindmarsh, J., Fraser, M., Heath, C., & Benford, S. (2002). Virtually missing the point: Configuring CVEs for object focused interaction. In E. F. Churchill, D. Snowdon & A. Munro (Eds.), Collaborative virtual environments: Digital places and spaces for interaction (pp. 115-139). London, UK: Springer.

Jermann, P., Soller, A., & Lesgold, A. (2004). Computer software support for collaborative learning. In J. W. Strijbos, P. A. Kirschner & R. L. Martens (Eds.), What we know about CSCL in higher education (pp. 141-166). Amsterdam, The Netherlands: Kluwer. doi:10.1007/1-4020-7921-4_6

Kantor, J. R. (1921a). An attempt toward a naturalistic description of emotions (I). Psychological Review, 28(1), 19-42.

Kantor, J. R. (1921b). An attempt toward a naturalistic description of emotions (II). Psychological Review, 28(2), 120-140.

Kantor, J. R. (1929). Principles of psychology (Vol. 2). New York, NY: The Principia Press.

Knapp, M., & Hall, J. (2010). Nonverbal communication in human interaction (7th ed.). Belmont, CA: Thomson Wadsworth.

Kort, B., & Reilly, R. (2002). An affective module for an intelligent tutoring system. In S. A. Cerri, G. Gouardères, & F. Paraguaçu (Eds.), Proceedings of the 6th International Conference on Intelligent Tutoring Systems (ITS) 2002 (pp. 955-962). doi:10.1007/3-540-47987-2_95

Lerner, J. S., Small, D. A., & Loewenstein, G. (2004). Heart strings and purse strings: Carryover effects of emotions on economic decisions. Psychological Science, 15(5), 337-341.

Liu, H., Lieberman, H., & Selker, T. (2003). A model of textual affect sensing using real-world knowledge. In Proceedings of the 8th International Conference on Intelligent User Interfaces (pp. 125-132). New York, NY: ACM.

Moridis, C., & Economides, A. A. (2008). Towards computer-aided affective learning systems: A literature review. Journal of Educational Computing Research, 39(4), 313-337.

Peña, A., & de Antonio, A. (2009). Nonverbal communication as a means to support collaborative interaction assessment in 3D virtual environments for learning. In A. A. Juan, T. Daradoumis, F. Xhafa, S. Caballe & J. Faulin (Eds.), Monitoring and assessment in online collaborative environments: Emergent computational technologies for e-learning support (pp. 172-197). Hershey, PA: IGI Global.

Picard, R. W. (1997). Affective computing. Cambridge, MA: MIT Press.

Picard, R. W., & Daily, S. B. (2005). Evaluating affective interactions: Alternatives to asking what users feel. In CHI Workshop on Evaluating Affective Interfaces: Innovative Approaches (pp. 2219-2122). New York, NY: ACM.

Rodríguez, C. M. L. (2008). Emociones y salud: Algunas consideraciones [Emotions and health: Some considerations]. Revista Psicología Científica, 10(5). Retrieved from http://www.psicologiacientifica.com/emociones-y-salud/

Ryle, G. (1949). The concept of mind. New York, NY: Barnes & Noble.

Schroeder, R. (2011). Being there together: Social interaction in shared virtual environments. New York, NY: Oxford University Press.

Spante, M., Heldal, I., Steed, A., Axelsson, A., & Schroeder, R. (2003). Strangers and friends in networked immersive environments: Virtual spaces for future living. In Proceedings of the Home Oriented Informatics and Telematics (HOIT).
