Sign language interpreters’ use of haptic signs in interpreted meetings with deafblind persons Eli Raanes *, Sigrid Slettebakk Berge NTNU -- Norwegian University of Science and Technology, 7491 Trondheim, Norway

Abstract

This article presents a study of how sign language interpreters use haptic signs in interpreter-mediated meetings with deafblind persons. The analysis illustrates how this kind of interpreting is organized and how interactional space is reconfigured through embodied haptic signs. The material comes from an authentic meeting among five deafblind board members of a Norwegian association for the deafblind. Although the members can see and hear one another only partially or not at all, the dialogue among them flows. Based on an analysis of video recordings from the meeting, this article provides insight into the interpreters’ actions as well as their interaction with each other and with their deafblind interlocutors. In particular, the article draws attention to how the interpreters alternate between mediating spoken utterances, describing the meeting context and producing different kinds of haptic signs. Haptic signs are conventional signals produced on a deafblind person's body that provide contextualizing information about the environment where the interaction is taking place. They also convey other participants’ nonverbal expressions, such as turn-taking signals, minimal-response signals and emotional expressions. As such, haptic signs provide information that deafblind persons can use to frame their interaction and to regulate their own self-presentation. In producing haptic signs, the interpreters support involvement in the ongoing meeting by selecting effective signals and by timing and adjusting their production in response to the feedback given by their deafblind interlocutors. An interpreter's actions are based on a situated, moment-by-moment evaluation of the participation framework in which all the participants, both the interpreters and the deafblind persons, operate.

© 2016 Elsevier B.V. All rights reserved.

Keywords: Haptic signs; Embodied interaction; Tactile sign language; Interpreting; Deafblind; Turn-taking

1. Introduction

This article presents a study of an authentic meeting among five deafblind board members of a Norwegian association for the deafblind. Despite the members’ inability to see and hear one another, their dialogue and meeting are effective. The interaction is facilitated by seven sign language interpreters, who work to interpret the meeting and its interactional context. In particular, this study focuses on the interpreters’ use of haptic signs, that is, different kinds of signals produced on the deafblind person's body that provide information about the interactional environment as well as about the other participants’ turn-taking signals and emotional expressions, such as laughter and confusion (Lahtinen, 2008). The analysis focuses on the sequential order of the interaction and the way the interpreters successfully deploy haptic signs to provide their deafblind interlocutors with information about the environment and the other participants’ emotional expressions.

* Corresponding author. Tel.: +47 73559892. E-mail addresses: [email protected] (E. Raanes), [email protected] (S.S. Berge). http://dx.doi.org/10.1016/j.pragma.2016.09.013 0378-2166/© 2016 Elsevier B.V. All rights reserved.


1.1. Deafblindness: a combined sensory loss

A person can be defined as deafblind when s/he has a combined visual and auditory disability. In Norway, it has been estimated that approximately 0.1% of the population is deafblind (Raanes, 2006). While some deafblind persons are both completely deaf and blind, others have retained some hearing and/or some vision. The deafblind can be further divided into two main groups with regard to when the sensory reduction occurred and their preferred way of communicating: first, those who were born with the disability (congenital deafblindness); and second, those who have become deafblind as adults (acquired deafblindness). Most deafblind persons belong to the second group, which can in turn be divided into those who are primarily deaf, those who are primarily blind, and those who were born with normal sight and hearing and have developed both visual and auditory disabilities in adult life.

Based on the type and degree of their sensory loss, persons with a combined sensory loss use a variety of communication methods. Some deafblind people find it most efficient to communicate through tactile sign language, while others prefer the codified national sign language. In the latter case, the speaker must often adjust the size and the location of signs to the deafblind person's sight. Some communicate through spoken language, as they can hear rather well, especially if the speaker adjusts his or her voice and/or the receiver uses a technical hearing aid. Interpreter-mediated group discussions among deafblind persons can thus involve several different communication methods and require several interpreters.

1.2. Tactile sign language and haptic signs

Communication by touch is characteristic of deafblind communication. In the Nordic countries, a distinction is made between haptic signs and tactile sign language. Tactile and haptic are two terms for ‘‘touch’’, derived from the Latin word ‘‘tactilis’’ and the Greek term ‘‘haptikos’’, respectively. Tactile sign language refers to the traditional use of sign language in deafblind communication, and its use goes back to the first meetings between deafblind people in schools and families. Tactile communication has been studied since the early works of Birdwhistell (1952, 1970), which focused on the bodily and kinesthetic modality of human communication, and later studies have been conducted on material from French, Swedish, Norwegian, British and American tactile sign languages (Mesch, 1998; Collins, 2004; Raanes, 2006; Petronio and Dively, 2006; Schwartz, 2009; Edwards, 2014).

Haptic signs, on the other hand, are a relatively new system of tactile signals that provide both environmental and interactional information to deafblind people. The system has been developed in the deafblind communities of the Nordic countries over the last 20 years. The signals were originally created by Trine Næss, a deafblind woman from Norway who was experiencing progressive visual sensory loss. Together with her sign language interpreters, she started to experiment with different techniques of embodied signals that would provide her with information about the context, the ongoing conversation and the emotional expressions of the others involved (Næss, 2002). Unfortunately, Næss passed away before her work was published, but three of her interpreters have continued her work, resulting in a handbook for haptic communication (Bjorge et al., 2013; Bjørge and Rehder, 2015).
Other important contributors to the development of haptic signs in the Nordic countries include Riitta Lahtinen and Russ Palmer, whose work has been presented in a number of workshops in the Nordic countries. Lahtinen has also published a case study of how she and Palmer use haptic signs in their communication (Lahtinen, 2003). Today, the use of haptic signs has become an integral part of interpreting for the deafblind in the Nordic countries (Lahtinen et al., 2010; Skåren, 2011), and there are approximately 200 conventional signs for directions, emotions, feedback signals and activities (Bjorge et al., 2013). To illustrate this sign system, Fig. 1 below presents the signs for ‘‘room’’ and ‘‘direction -- straight ahead’’.


Fig. 1. Haptic signs for room and direction -- straight ahead (reproduced with permission from Bjorge et al., 2013: 78, 122).


Both signs are made on the deafblind person's back. When combining the information from the two signs, the deafblind person can understand that s/he is on one side of the room and is going to walk straight across to the other side. An alternative meaning is that something or someone else is moving to the other side of the room. The meaning of a haptic sign therefore depends on the mutual construction of meaning between the interpreter and the deafblind person, and the signs must be conveyed in such a way that they refer to the particular activity in which the participants are currently engaged.
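To make this context dependence concrete, here is a small illustrative sketch in Python (ours, not part of the haptic sign system or of the paper's analysis; the function and frame labels are invented) of how the same sign pair can receive different readings under different activity frames:

```python
# Toy model (ours): a haptic sign sequence underdetermines its reading;
# the activity frame negotiated between interpreter and deafblind
# interlocutor selects among the candidate meanings.

ROOM = "ROOM"                      # drawn on the back: outline of the room
STRAIGHT_AHEAD = "STRAIGHT-AHEAD"  # drawn on the back: movement across it

def interpret(signs: tuple, activity_frame: str) -> str:
    """Map a haptic sign sequence plus an activity frame to a reading."""
    if signs == (ROOM, STRAIGHT_AHEAD):
        if activity_frame == "guiding":      # the deafblind person will move
            return "You will walk straight across to the other side of the room."
        if activity_frame == "describing":   # someone/something else moves
            return "Someone or something is moving to the other side of the room."
    return "Unresolved: the reading must be negotiated with the interlocutor."

print(interpret((ROOM, STRAIGHT_AHEAD), "guiding"))
print(interpret((ROOM, STRAIGHT_AHEAD), "describing"))
```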
However, while haptic signs can be understood as mutually constructed, we currently know little about interpreters’ professional practice and their use of haptic signs in everyday authentic meetings. The current paper seeks to fill this gap by studying interpreters’ actions and interactions; as far as we know, this is one of the first studies of haptic signs in an interpreter-mediated group discussion amongst deafblind persons. In this study, we investigate how an interpreter uses haptic signs during a meeting to convey information about the environment and the other participants.

1.3. Interpreting services for the deafblind

In the Nordic countries, the need for interpreting services for the deafblind has been recognized since the 1980s (Petrén, 1980), and such services have been part of the public service for the deaf and the deafblind since the early 1990s (Raanes and Berge, 2011). The main goal of public interpreting services is to ensure equal participation in society. Over time, the need for qualified interpreters in this domain has increased, reflecting the growing activity levels of the deafblind population (Hjort, 2008). Deafblind interpreting in Norway is provided for a variety of situations, from meetings in health care to education, work and social activities.

The provision of interpreting services to the deafblind entails several activities. First, deafblind persons may need to be guided to the place where they have an engagement. Secondly, an interpreter must interpret the ongoing interaction using a variety of communication means. Thirdly, the interpreter must describe the spatial and social environment where the interaction is taking place. This description should include information about, for example, who is present, who is talking and what the interlocutors’ reactions are. Without this description it is difficult for the deafblind person to understand the context of the interaction and to take part in the dialogue. As such, interpreting for the deafblind requires a combination of skills.

Per Linell (1997, 1998, 2009) has put forward a meta-theory of ‘‘dialogism’’ that frames conversations within the sociocultural connection between human cognition, activities and language use. These perspectives are followed up in Wadensjö’s (1995, 1998) work on the interactional dynamics of interpreter-mediated dialogues. This research gives insight into the meaning-making that takes place between the interpreter and the interlocutors. The interpreter's role, and his or her responsibility for choosing efficient mediation strategies, is shaped by how the interpreted talk is framed in context: ‘‘what is going on’’ and ‘‘who the persons are’’ are part of the framework of participation that the interpreter constructs together with the other participants (Goffman, 1974).

Wadensjö (1995) describes this process as follows:

The sense-making work carried out by anyone in interaction can be described as based on different aspects of meaning, basically the propositional meaning of talk and the interactional or situated meaning of the words spoken. Moreover, in a conversation involving three or more persons, sense is also arguably made on the basis of the participation framework (Goffman, 1981) continuously negotiated in and by talk (Wadensjö, 1995:111).

This article contributes to this theoretical understanding through a conversational analysis of the dialogue in an interpreted meeting between five deafblind people. The next part of the paper presents the methods used in conducting the study.

2. Methods

The empirical data for this study are video recordings of an authentic board meeting of an association for deafblind persons living in Norway. All the participants were informed about the study, signed a consent form and gave us permission to use the video recordings from the meeting in research publications. The meeting, lasting 52 min and 30 s, was filmed from different angles with four cameras. Three of the cameras were placed on tripods, while one was handheld. One camera was used prior to the meeting and filmed the preparation between the board's chairwoman and her interpreter. In total, there are approximately 5 h of video recordings from the meeting. In addition, observation notes were taken by the two researchers, who were present for the whole duration of the meeting. Two months after the board meeting, two focus group interviews were conducted with the interpreters, lasting 2 h and 50 min. Six months after the meeting, three additional individual interviews were conducted with representatives of the board members. These interviews lasted 1 h, 1 h and 45 min, and 1 h and 55 min, respectively. All the interviews were video-recorded and later transcribed into written Norwegian. Before the interviews, a preliminary analysis of the video recordings from the meeting was undertaken.


In this phase, the researchers mapped the video recordings, transcribing what was said and done by the different interlocutors and then grouping similar kinds of actions together. This preliminary analysis of the material highlighted certain frequently used interpreting strategies. Some of the video clips were also shown to the interpreters during the focus group interviews.

In analysing the participants’ conversation and interactions, we have used conversation analysis (CA). This method of enquiry investigates how people use language in everyday life, focusing in particular on the sequential order of interaction among the participants and the mutual dynamics between them as they orient towards each other and the interactional environment. Central to this method are, for instance, the mutual negotiation of turn-taking and the participants’ exchange of responses and initiatives (Sacks et al., 1974, 1992). Conversation analysis has had a great impact on the development of discourse analysis in general, and of multimodal analysis in particular (Heath, 2002; Mondada, 2009, 2014). The basis for this kind of research is detailed transcription of the conversation. The approach has also been applied to the study of interpreter-mediated interaction between different spoken languages (Wadensjö, 1998), between spoken and signed language (Roy, 2000; Metzger, 1999; Coates and Sutton-Spence, 2001) and between spoken and tactile sign language (Frankel, 2002; Metzger et al., 2004; Berge and Raanes, 2013; Berge, 2014).

Video recordings provide suitable data for developing microanalyses of human interaction (Marková et al., 2007; Broth et al., 2014). Transcribing sign language is, compared to transcribing spoken language, far more complex, since the amount of input is more extensive (Erlenkamp, 2003). In the board meeting, different kinds of interpreting methods were used by the five interpreter-interpretee pairs, and side-talks ran simultaneously with the main discussion around the table. To annotate what the interlocutors said and the embodied language they used, we developed a transcription model inspired by Goodwin (2000), Linell and Gustavsson (1987) and Knoblauch et al. (2006). Our transcription includes the spoken words, the signs and the embodied gestures produced. In the analysis that follows, all haptic signs are marked with h*. To reduce complexity, the analysis below focuses on the haptic signs mediated by the chairwoman's interpreter, on the one hand, and the exchange of initiatives and responses between the chairwoman and two of the other board members, on the other. To contextualize the conversation and the embodied interactions, a detailed description of the context is given. Here, the spatial arrangement of the seating was found to be a key element in how the interpreting was organized and proceeded at the meeting (Berge, 2014; Berge and Raanes, 2013; Raanes and Berge, 2011).
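As an illustration of the kind of multi-tier annotation this transcription model produces, the following sketch (our own; the field names are invented and do not reproduce the study's actual annotation scheme) encodes each turn with separate frontstage, backstage and haptic tiers, mirroring the columns used in the extracts below:

```python
from dataclasses import dataclass, field

@dataclass
class Turn:
    """One numbered turn in a multi-tier CA-style transcript."""
    number: int
    who: str                  # e.g. "Rebecca (db1)", "Interpreter (1)"
    frontstage: str = ""      # utterances shared with the whole group
    backstage: str = ""       # private interpreter-interpretee exchange
    haptic: list = field(default_factory=list)  # h* signs, in order

# Turns 2-3 of Extract 1 below, re-encoded in this scheme:
extract1 = [
    Turn(2, "Interpreter (1)",
         backstage='"They are still interpreting what you just said"',
         haptic=["h*WAIT (signed on Rebecca's back)"]),
    Turn(3, "Interpreter (1)",
         haptic=["h*GO-ON (signed on Rebecca's back)"]),
]

for turn in extract1:
    print(turn.number, turn.who, turn.haptic)
```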
3. Analysis

The video analysis in this study focused on embodied actions and the interpreters’ use of haptic signs. The analysis of the haptic signs and their function in the sequential order of interaction among the interpreters and the prime participants reveals that these kinds of embodied signs have three key functions: (1) addressing the next speaker; (2) mediating minimal-response signals; and (3) expressing other participants’ emotions. However, a key finding of the analysis was that the meaning of haptic signs is situated, in that it is connected to an awareness of the seating arrangement around the board table.

3.1. Spatial organization of the board meeting

Assisted by the interpreters, the board members chose their places around the board table. The seating arrangement was decided on the basis of the board members’ sensory loss and the most efficient way of mediating the conversation during the meeting. The chairwoman (here given the pseudonym Rebecca) was seated at the head of the table. The two board members (db2 and db5) who could capture some information through their hearing sat next to the chairwoman. The two board members who were completely deaf and blind and communicated in tactile Norwegian sign language (here given the pseudonyms Inger (db4) and Ivar (db3)) sat at the end of the table, Inger to the left and Ivar to the right. Next to each board member sat one sign language interpreter, or, in Inger's and Ivar's case, two. Fig. 2 presents the members’ seating arrangement.

Fig. 2. The seating arrangement at the board meeting.

This spatial organization constructs a ‘‘cognitive figure’’ (Goodwin, 2000) that was used by both the interpreters and the board members when constructing a mutual meaning of the haptic signs and when orienting their bodies towards each other. Constructing a cognitive understanding of the seating arrangement requires a detailed description of the context (Berge and Raanes, 2012). Before the formal meeting, the sign language interpreters described basic environmental information to each of the prime participants. All of the interpreters used embodied techniques to provide information about the room, who was present and where they were seated. Fig. 3 shows how Rebecca's interpreter provides her with information about the positions of the other board members: she touches Rebecca's hand and moves it towards the place around the table where each board member is seated, simultaneously saying their names. This can be defined as multimodal action (Goodwin, 2000; Mondada, 2009), that is, action in which the interpretation combines spoken words and tactile gestures.


Fig. 3. Haptic and vocal descriptions of the seating arrangement.

The interview material gives insight into the board members’ cognitive understanding of the seating arrangement. The chairwoman, Rebecca, said that she remembered exactly who was seated where: ‘‘I know what was happening around the table. I know where the others were seated and what the table looked like. It was just as if I could see it’’. While speaking, Rebecca pointed (in the air) to where each of the four other board members had been seated, in relation to herself, around the table. This can be seen as an example of how environmental information had become integrated knowledge through the interpreter's descriptions.
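One way to picture this cognitive figure is as a mapping from positions around the table to locations on the listener's back, so that a haptic sign produced at a given spot can index the person seated in the corresponding direction. The following toy sketch is our own illustration (the coordinate labels are invented, and the near-seat assignments for db2 and db5 are hypothetical, as the paper does not specify their sides):

```python
# Toy model (ours): the seating plan as a cognitive map. A haptic sign
# placed at a spot on Rebecca's back indexes the board member seated in
# the corresponding direction from her perspective.

seating_from_rebecca = {
    ("far", "left"): "Ivar (db3)",    # end of the table, Rebecca's left
    ("far", "right"): "Inger (db4)",  # end of the table, Rebecca's right
    ("near", "left"): "db2",          # hypothetical side assignment
    ("near", "right"): "db5",         # hypothetical side assignment
}

def person_at(back_location: tuple) -> str:
    """Resolve a location signed on the back to the person it indexes."""
    return seating_from_rebecca.get(back_location, "unknown")

# Extract 1, lines 7-8 below: a hand placed on the left side of the back
# and moved up and to the left indexes the far-left seat; combined with
# h*RAISING-HAND it conveys "the person at the far left asks for a turn".
print(person_at(("far", "left")))  # -> Ivar (db3)
```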


The following conversational analysis explores how the interpreters use the seating arrangement as a basis for constructing an informative mediation and meaningful haptic signs.

3.2. Haptic signs for addressing the next speaker

In human dialogue, turn-taking is arranged through a negotiation process in which each participant contributes to and is involved in deciding who has the others’ attention and when the turn to speak passes to someone else. Such negotiation processes are maintained with vocal signals or embodied gestures (Sacks et al., 1974; Mesch, 1998). In our situation, the chairwoman could not see the others’ turn-taking signals, meaning that she was dependent on interpreter-mediated information to allocate turns to speak and thus to perform one of her main duties as chairwoman.


The analysis below presents how Rebecca's interpreter (1) uses haptic signs to mediate information about which of the other four board members is presenting him/herself as the next speaker. The topic of discussion in the extract below was finding and deciding the date for the annual meeting.

Extract 1

01 Rebecca (db1), frontstage: ‘‘I have sent in a request for this venue for the coming year.’’ (Jeg har søkt lokalene her for neste år)
02 Interpreter (1), backstage: [‘‘They are still interpreting what you just said’’] (De oversetter hva du nettopp sa); haptic: [h*WAIT] (signed on Rebecca's back)
03 Interpreter (1), haptic: h*GO-ON (signed on Rebecca's back)
04 Rebecca (db1), frontstage: ‘‘Yes, and we . . .’’ (Ja, og så . . .)
05 Ivar (db3): (Raises his hand: calls for a turn to speak)
06 Interpreter (3), backstage: WAIT
07 Interpreter (1), haptic: h*PERSON: IVAR (signed on Rebecca's back)
08 Interpreter (1), haptic: h*RAISING-HAND; frontstage: ‘‘Ivar asks for a turn’’ (Ivar ber om ordet)
09 Rebecca (db1), frontstage: ‘‘Yes, proceed . . .’’ (Ja, værsegod . . .)
10 Interpreter (1), backstage: (Gazes and nods to Interpreter 3)
11 Interpreter (3), backstage: YES GO ON YOU (Yes, proceed)
12 Ivar (db3), frontstage: WAIT WAIT I DEAF CLUB JOURNAL: ABOUT THEIR MEETING PLAN (I am still waiting for the bulletin where the deaf association posts their meeting plan)

(video recording camera 3: 01:30--02:06)

Four haptic signs (h*) are signed on Rebecca's back in Extract 1 (lines 2, 3, 7 and 8). During Rebecca's turn to speak, her interpreter realized that Ivar's interpreter needed more time to convey Rebecca's question to Ivar (db3). This information was mediated to Rebecca through a multimodal expression: the interpreter constructed the haptic sign for ‘‘WAIT’’ on Rebecca's back at the same time as she gave a verbal description of the activity and of the fact that some of the interpreters needed more time to interpret the information (line 2). Rebecca's interpreter kept her hand on Rebecca's shoulder until everybody was ready and then produced a series of small taps on Rebecca's back (line 3). These embodied signals informed Rebecca that the interpretation was finished and that the board members were ready to focus on new information. Rebecca could therefore go on, and in line 4 we see that she uses this information and continues to speak.

Later in the sequence (lines 7 and 8) there is another example of how haptic signs are used to address the next speaker. During the meeting, when Ivar (db3) wanted to say something to the other board members, he often used the conventional turn-taking signal of raising his hand (line 5). This information was not visually accessible to Rebecca, so her interpreter constructed a haptic sign to convey it, placing her hand on the left side of Rebecca's back and moving it up and to the left (lines 7 and 8). This sign informed Rebecca that the board member seated at the end of the table to the left was giving a turn-taking signal. This cognitive insight could only be constructed because of the situated description of the board members’ placement around the table.


Fig. 4. Haptic signs for addressing the next speaker.

Rebecca knows that it is Ivar and not Inger (db4) who is seated in front of her to her left. The facilitation of haptic signs enables Rebecca to respond to Ivar's initiative (line 9): the interpreted information is effectively used to maintain her role as chairwoman and to keep track of the discussion in the meeting. Rebecca and Ivar cannot see each other, but they are still able to orient their bodies and faces towards each other during the discussion and to make their embodied gestures more visually accessible to the other participants (Goodwin, 2000). Fig. 4 shows the sequence in which the interpreter mediated embodied turn-taking signals: Rebecca is first asked to wait with a sign on her arm and is then given, on her back, the haptic signs for ‘‘Ivar (db3) is raising his hand’’.

The analysis of Extract 1, as illustrated in Fig. 4, documents the multi-layered interaction in the participation framework of the board meeting, where the ‘‘footing’’ of the chairwoman and her interpreter alternates between backstage and frontstage activities (Goffman, 1959). For instance, when Ivar (db3) asks for a turn to speak, his interpreter constructs a backstage signal asking him to wait (line 6), but this information is not described frontstage to the other board members; only his initiative of wanting to be addressed as the next speaker is shared frontstage (lines 8 and 9). On the other side of the table, similar backstage dialogues take place between Rebecca and her interpreter (lines 2 and 3). One common feature of these backstage dialogues is that they are connected to the frontstage activity around the board table, in that they provide information the board members use to maintain their frontstage roles as board members (see also Berge and Raanes, 2013).
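Read schematically, the coordination signals in Extract 1 behave like a small signaling protocol between interpreter and chairwoman. The sketch below is our own restatement of that sequence (the state names are invented; the signal names are taken from the glosses in the extract), not an implementation of anything proposed in the paper:

```python
from enum import Enum, auto

class ChairState(Enum):
    SPEAKING = auto()    # Rebecca holds the floor
    HOLDING = auto()     # h*WAIT: interpreters are still relaying her talk
    ALLOCATING = auto()  # h*PERSON + h*RAISING-HAND: a turn is requested

# Haptic signal -> next state, following the order of signs in Extract 1.
# After ALLOCATING, Rebecca reassigns the floor verbally (line 9).
TRANSITIONS = {
    (ChairState.SPEAKING, "h*WAIT"): ChairState.HOLDING,
    (ChairState.HOLDING, "h*GO-ON"): ChairState.SPEAKING,
    (ChairState.SPEAKING, "h*PERSON+h*RAISING-HAND"): ChairState.ALLOCATING,
}

state = ChairState.SPEAKING
for signal in ("h*WAIT", "h*GO-ON", "h*PERSON+h*RAISING-HAND"):
    state = TRANSITIONS[(state, signal)]
    print(signal, "->", state.name)
```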

3.3. Haptic minimal-response signals

Minimal-response signals are important in all conversations (Linell, 2009). For those with a combined sensory loss, it is challenging to respond to minimal-response signals that are initiated by visual gestures or vocal changes. One way of interpreting such information is to facilitate tactile minimal-response signals (Berge and Raanes, 2013). Extract 2 presents four sequences of tactile minimal-response signals, all exchanged between Inger (db4) and her interpreter (4).

Extract 2

01 Rebecca (db1), frontstage: ‘‘Well, is there agreement on the date for the annual meeting, the 5th of March?’’ (Er vi enige om at årsmøtet blir 15 mars?)
02 Ivar (db3): (starts answering the question)
03 Inger (db4): (adjusts her chair)
04 Interpreter (4), haptic: h*tapping-response-signs on Inger's knee
05 Interpreter (4), backstage: ARE YOU COMFORTABLE?
06 Inger (db4), backstage: YES. FINE FINE
07 Interpreter (4), backstage: FIVE MARCH ANNUAL MEETING, ALL AGREE. WHAT YOU? (The annual meeting can be held on the 5th of March. All the other board members agree. Do you agree?)
08 Inger (db4): AGREE. YES. WEEK BEFORE EASTER. (I agree. It will be fine, the week before Easter.)
09 Interpreter (4), haptic: h*tapping-response-signs on Inger's knee
10 Inger (db4), haptic: tapping response signals on the interpreter's hand
11 Interpreter (4), haptic: tapping response signs on Inger's hand
12 Interpreter (1), frontstage: ‘‘Inger agrees as well. So everybody agrees on the suggested date for the meeting.’’ (Inger er også enig. Da er alle enige om datoen for møtet.)

(video recording camera 1: 15:30--16:24)

In Extract 2, Rebecca has asked the other board members a question. Inger (db4) (the lady sitting on the right in the foreground of the pictures) has just switched interpreters (two interpreters are working with her). Inger starts to move her chair, trying to find a more comfortable position, and releases the embodied contact with her interpreter. Rebecca's question concerning the date of the upcoming annual meeting is therefore not immediately translated to Inger (lines 2--6). When Inger is ready, she and her interpreter have a short exchange of haptic response signals constructed as a series of taps on Inger's knee (lines 4--6). This signal functions as a short clarification of the seating arrangement. In the next sequence Inger's interpreter lifts her hands towards Inger, first touching her, and then continuing to translate Rebecca's question and the discussion around the table (line 7). Here we see how Inger's interpreter alternates between consecutive and simultaneous interpreting. First she mediates the question, then she describes the other board members’ reactions, and then she directs the question to Inger. To point out that Rebecca is waiting for an answer from her, and that this is a possible moment to get involved in the dialogue, the interpreter coordinates the interaction by constructing a pointing gesture towards Inger's chest. This can be seen as a coordinating action (Wadensjö, 1998), a clarification that a response is required. The pointing gesture towards Inger's chest physically extends the signing space between Inger and her interpreter (see also Raanes, 2011). Inger (db4) raises her hands and answers (line 8), and her answer is interpreted to the others. However, as Inger cannot see the other board members’ responses, her interpreter constructs a new series of tactile minimal-response signals on Inger's knee -- the others are listening to our answer (line 9). The next exchange of haptic minimal-response signals takes place when Inger's interpreter again moves her hands into position for tactile signing, and both the interpreter and Inger make a number of new tactile response signals on each other's hands (lines 10 and 11). Their actions can be understood as a mutual construction of embodied minimal-response signals. In this case, the signals may have multiple purposes: they may describe the dialogue around the table, but they may also indicate that the two are exchanging information about how they experience each other and the situation at hand. Fig. 5 illustrates where on Inger's and the interpreter's hands the haptic signs are constructed.

The minimal-response signals in Extract 2 are related to the main conversation, but they are not described to the other board members. This indicates that the tactile minimal-response signals between Inger and her interpreter have a backstage character (Goffman, 1959, 1974) and that their interaction consists of several layers. In this way the two participants alternate between backstage and frontstage orientations: they maintain a private framework of participation while they are also involved in upholding the public framework between Inger, the chairwoman and the other board members.


Fig. 5. Haptic minimal-response signs facilitated on knee and arm.

This is also an interesting example of how Rebecca's interpreter takes responsibility for coordinating the utterances in the general frontstage discussion. This is seen in line 12 of Extract 2, where the interpreter (1) sums up and links together several pieces of information that have been either lost or delayed due to the overlapping talk and the time-lag in the interpreting process. Another finding in Extract 2 is that the haptic minimal-response signals are expressed in almost the same way -- as a tapping tactile movement -- but their meaning varies across the different sequences. The co-construction of embodied actions (Goodwin, 2000), where Inger and her interpreter simultaneously construct haptic signs on each other's hands, indicates the mutual and situated character of meaning-making in human dialogue (Linell, 2009). In the situation described in Extract 2, the understanding of the embodied signals is connected to how Inger and her interpreter frame the ongoing activity and define their roles towards each other (Goffman, 1959, 1974).

The haptic signs presented in Extract 2 are easy to observe. However, there are other situations where tactile response signals are hardly noticeable. During the interviews, Inger described how she makes small movements with her thumb on the interpreter's hands, a sign that means ‘‘I understand and I am following what you say’’. Such response signals, constructed through modifications of touch while holding hands, are not easily accessible in video material recorded to study visual input (Raanes, 2006:255).

3.4. Haptic signs for emotions

Modifications and nuances in meaning are often conveyed by means of small facial expressions, body language or changes in pitch and tone. It can therefore be difficult for deafblind listeners to get access to the emotional information in spoken utterances. It may also be difficult to capture whether a speaker's utterances are produced with a serious, aggressive or humorous intention. Haptic signs for emotions are therefore commonly used among sign language interpreters in Norway (Skåren, 2011). During the board meeting a variety of haptic signs for emotions were observed, and Extract 3 presents one of these situations.

One of the topics on the agenda was to set the date for the annual meeting, and Rebecca, the chairwoman (db1), needed to find a suitable date. To do so, her interpreter describes the available dates in her calendar. Their activity and dialogue are interpreted to the other board members. Finally, two possible dates are identified, one in early February and one in late March. Rebecca presents these two alternatives to the others. Several members state that the first date is suitable. However, Rebecca hesitates to make a decision. At the upcoming meeting there will be an election, and what the four other board members do not know is that Rebecca wants to step down as chairwoman. Instead of telling them this directly, she constructs a question, saying: ‘‘Well, but do you want me to finish that soon? Do you not want to wait until late March?’’, and then she smiles (line 1). This way of talking can be seen as a way of introducing serious information disguised as humour. Her comment immediately creates an intense dialogue around the table. Two of the board members could capture the information through their hearing (db2 and db5), and they responded straight away (line 3). Inger (db4) also follows up with a response (line 5).
Ivar (db3) and his interpreter (3), however, have had a long sequence of clarification about one of the signs and the utterance. Their backstage sequence of language repair has created a 92-second time delay, and Rebecca (db1) and the rest of the group are waiting for his response. When Ivar (db3) has captured the message and understands that Rebecca (db1) does not want to continue as their chairwoman, he orients his face and body towards the place where she is seated and begs her, with an embodied visual expression, to continue as their chairwoman: he illustrates how he goes down on his knees, raises and folds his hands and begs her to continue (line 8). His embodied gesture is intense and emotional. His utterance is interpreted to the other board members, and they start to laugh as they agree, feeling that Ivar's gesture (folding his hands and praying) has expressed their mutual opinion.


To mediate the meaning of Ivar's embodied and gestural utterance, Rebecca's interpreter creates two haptic signs for emotion on Rebecca's shoulder. First she facilitates the haptic sign for ‘‘smile’’ and then the sign for ‘‘yes’’. The first haptic sign is meant to highlight the meaning of Ivar's utterance, while the second describes the general response from the other board members. Rebecca (db1) reacts with a spontaneous gesture in which she lifts her hands up in the air and brings them down again on the table. Then she starts to laugh as well (line 12). This leads to a situation where all the board members, as well as the interpreters, laugh together. It seems that they have all been caught up in the dialogical flow established around the table. Extract 3 presents a transcription of this sequence:

Extract 3

01 Rebecca (db1), frontstage: ‘‘Well, but do you want me to finish that early? Are you sure you don't want to extend the time until the end of March?’’ (Men vil dere at jeg skal slutte så tidlig? Vil dere ikke dryge det ut til slutten av mars?)
02 Interpreter (1), haptic: h*WAIT
03 Ann (db5), frontstage: ‘‘You are not allowed to quit!’’ (Slutte? Nei! Du får ikke lov!)
04 Interpreter (1), frontstage: ‘‘Ann says: you are not allowed to quit!’’ (Du får ikke lov å slutte, sier Ann)
05 Inger (db4): OH NO! THAT WAS BAD NEWS!
06 Interpreter (1), frontstage: ‘‘Inger says -- that was sad news.’’ (Inger sier at det var trist å høre); haptic: h*marking-person DB4 + keeps her hand on Rebecca's back whilst translating
07 Ivar (db3) and his interpreter (3), backstage: (Interpreting and clarifying for 92 sec.)
08 Ivar (db3): NO! PLEASE: I-BEG-YOU ON MY KNEES: CONTINUE TWO MORE YEARS
09 Interpreter (1), backstage: (Gazing and nodding to Interpreter 4)
10 Interpreter (4), backstage: (Gazing and nodding to Interpreter 1)
11 Interpreter (1), frontstage: ‘‘Ivar says this was sad news. I beg you on my knees to continue.’’ (Nå sier Ivar: det var synd, jeg ber på mine knær -- fortsett); haptic: h*SMILE
12 Rebecca (db1): @@ (she laughs loudly, lifts her hand and then hits the table with her hand, laughing)
13 Interpreter (1), haptic: h*tapping response signs on Rebecca's back
14 All participants: @@ (laughing loudly)
15 Interpreter (1), frontstage: ‘‘People around the table are smiling, but they seem sad as well.’’ (Folk smiler her, men de er samtidig lei seg)

(video recording camera 5: 15:51--17:28)


Fig. 6. Haptic sign for smiling.

Fig. 6 shows a picture of the haptic sign for smiling and a picture of the related situation around the table.

The sequence in Extract 3 turned out to be a mutual moment of emotion. The board members entered the sequence with different frames of what was taking place, but they were able to construct a communicative project (Linell, 2009) in which they together protected and took care of Rebecca's utterance or, to use Goffman's terms, protected her face and status (Goffman, 1959). When Rebecca announced her plan to end her term as chairwoman, she was probably aware that her decision would create a big change in the association. It was therefore not an easy message to present to the other board members. It could be experienced as a face-threatening situation (Goffman, 1959), as there was a chance that the other board members would not mind if she left her position. Rebecca's interpreter uses haptic signs to highlight Ivar's (db3) and the other board members’ (db2, db4 and db5) reactions (line 11). Referring to his signs and gestures, in combination with describing the others’ facial expressions (line 15), creates an emotional utterance that is literally reflected in Rebecca's body and embodied actions: she laughs and hits the table with her hand. This embodied reaction indicates that her interpreter has constructed an interpretation that gave her access to the others’ emotions.

As one board member after another understands Rebecca's utterance and the practical consequences of this decision for their small association, the room is filled with overlapping comments. However, sign language interpreting and tactile communication require that the participants speak one by one. Thus, when providing interpreting services for deafblind persons, the critical information must be made clear and the less important information set aside (Raanes and Berge, 2011). In the sequence in Extract 3, Rebecca's interpreter focused on Rebecca's communicative project when deciding what to omit from the interpreting. Based on a situated understanding of the context, she decided which information was critical for Rebecca to understand what was happening and what the others’ utterances meant. One responsibility of the chairwoman's interpreter was therefore to coordinate the interaction (Wadensjö, 1998) and to ‘‘clear the table’’ (Sacks et al., 1974) so that each member could have the others’ attention for a moment.

Extract 3 also includes examples of how the interpreters ‘‘clear the table’’ as a team. After Rebecca (db1) had posed her question and three of the board members (db2, db4 and db5) had responded to her, she was waiting for Ivar's (db3) reaction, as was the rest of the group. The interpreters as a team turned their attention to Ivar and his interpreter, and as soon as Ivar and his interpreter's backstage sequence of clarification was finished, the group of interpreters highlighted and interpreted Ivar's utterances and embodied gestures. This indicates that another layer of interaction was present, namely the backstage dialogue shared among the interpreters, often expressed via glances and eye contact. In this way, the interpreters construct mutual agreements on how to handle the overlapping talk of different contributions to the meeting and coordinate the turn-taking between the board members. An earlier study by Berge and Raanes (2013) found a similar layer of interaction.
Interpreter-mediated dialogues consist of contributions from all the participants present (Wadensjö, 1998; Metzger, 1999; Roy, 2000; Napier, 2007). In interpreting for deafblind persons, especially in group discussions, the conditions for mediating ‘‘all that is said and done’’ are not present, due to overlapping sequences of talk, response and action. There is an (unspoken) agreement that the interpreters must, to some extent, make omissions and select which information they pass on and which they do not. The interpreters’ omissions are, as in this situation, based on a situated understanding of which information is critical for the different persons, not only at the individual level, but also because they are part of a group discussion with the other board members. The interpreters’ understanding of ‘‘what is going on’’ influences their decisions concerning what to omit, their interpreting, their descriptions and their use of haptic signs. Such personal, or team-based, omissions influence the board members’ self-presentation: the others’ initiatives, as well as their own responses, are shaped in the interpreting process. This kind of dialogical cooperation indicates the need for trusting relationships (Marková et al., 2008), both among the interpreters as a team and between the interpreter and the deafblind person. However, it seems that the interpreters’ role performance and mediation techniques in the meeting analyzed here were effective: Ivar's prayer was heard by Rebecca, and she continued as chairwoman for one more year.


4. Findings and conclusions

The analysis in this study has shown how interpreters’ use of haptic signs provides information that is critical for deafblind persons in regulating their self-presentation and participating in a group discussion. Most importantly, haptic signs may reinforce the feeling of having access to information about environmental and interactional conditions. Secondly, the study illustrates that haptic signs must be situated and contextualized in order to be understood. Information about the environment, such as who is present and how they are positioned in relation to each other, must be described. This environmental cognitive map lays the ground for understanding the haptic signs. Thirdly, the timing of a haptic sign appears highly significant if the deafblind person is to understand what the signal means. It may be difficult to capture a particular sign if it is not presented in accordance with the communicative project of the situation. Haptic signs therefore need to be linked to ‘‘the situated chain of utterances’’ (Linell, 2009). Indeed, if the signals are not framed or timed well, there is a risk of confusion or misunderstanding.

Bearing the sequential and contextual aspects of human communication in mind (Linell, 2009), a single haptic sign may not be sufficient to provide a nuanced meaning. As described above, this is demonstrated in Extract 3, line 13, when Rebecca tells the other board members that she will not continue as their leader. The board members’ reactions were first interpreted by the haptic sign for ‘‘smile’’; then a contextualized meaning was constructed by Rebecca's interpreter's vocal description of the others’ emotional expressions: interpreter 1 first said that the other board members were smiling, and then specified her explanation by continuing that the ‘‘people around the table are smiling, but they seem sad as well’’. In this way, Rebecca's interpreter added information that was not expressed in words but was present as a mutual understanding of the situation. Haptic signs cannot be understood as static signs providing complete access to information about any situation. One sign can have several meanings and can only give access to parts of the others’ reactions and the ongoing communication. Moreover, the risk of over-interpretation is always present when non-verbal expressions are described with rather simplified signs for complex emotions. Haptic signs are therefore language tools that need to be used carefully and in cooperation with the prime participants.

The present analysis has also documented some of the advantages of using haptic signs. Most importantly, haptic signs can effectively add a variety of contextual and interactional information to the interpreting, as they allow several pieces of information to be mediated simultaneously. For example, at the same time as the interpreter mediates the spoken utterances, s/he can mediate haptic backstage information about the other participants’ embodied expressions, speaking styles, emotions, minimal-response signals and turn-taking signals. This clearly saves time, which can be critical, especially in deafblind communication.
The extracts presented above demonstrate that small haptic signs provide quick and effective access to others’ feedback and response signals. Access to such information is significant for ensuring that all interlocutors recognize and consider themselves members of the ongoing activity and can build relations with the people they meet in face-to-face interaction. Haptic signs can thus be seen as highly efficient means of communication, since deafblind persons do not have visual access to the others’ facial or embodied expressions and do not hear the vocal changes when they speak. Haptic signs also give the interpreter a way to express dimensions of the others’ appearance that can be difficult to say openly and in public, which is often the case when speaking to a person with a hearing loss using a loud and clear voice. Haptic signs can therefore give access to descriptions that lead to more dynamic participation. However, it is also important to note that, in the context of the current case study, the interpreters did not accomplish their role or fulfil their responsibilities as interpreters without influencing the board members’ participation in the discussion. In the interviews, the interpreters described the situated aspect of their work by saying that ‘‘we are coming into their meeting’’. The chairwoman's interpreter added that she took a special responsibility for coordinating the interaction, as she was interpreting for the person who was expected to lead the meeting. This can be seen as an alternation between different footings in the interpreters’ work. The chairwoman's interpreter also described her perspective by saying: ‘‘I put on the leader's hat’’. Thus, the interpreter's role is framed in conjunction with the person s/he is working for. This indicates a situated role-space in which the framework of participation is co-created by the team of interpreters and the prime participants. The current analysis shows that most of the interpreted utterances are shared frontstage among the board members attending the meeting, but that there is also an interactional layer between the interpreter and the person receiving the interpreting service, as well as among the interpreters as a professional team. All these layers were in play during the meeting, which demonstrates that the participation framework in the meeting was not static but was constantly adjusted in interaction, moment by moment, depending on the communicative project in the participants’ focus.

When introducing haptic signs in communication for the first time, one must keep in mind that simultaneous mediation creates complex communicative patterns. The haptic method of communicating is still new, and interpreters cannot expect that all deafblind persons are able to apply a broad repertoire of haptic signs.


The interpreters also need to respect that some deafblind persons may not appreciate being touched, or that they are not used to simultaneous language use. The use of haptic signs therefore needs to be clarified: the interpreter and the deafblind person, whether individually or in a group, need time to explore how they understand and use haptic signs in different communicative settings. Moreover, if the haptic signs are not appreciated, it is always possible to provide additional descriptions and needed information in words or in sign language.

This paper has given new insight into turn-taking, the exchange of minimal-response signals and the expression of emotive utterances in conversations in a particular context. It has highlighted how face-to-face interaction requires the sharing of attention and how interaction needs sequential ordering. Awareness of, and response to, these structures is of general importance. Marková et al. (2008) describe this aspect as important for building trust and sociocultural understanding in meetings between several participants. This is especially true of interpreter-mediated meetings where people of different backgrounds use different communication methods. The examples described above from interpreter-mediated deafblind interaction demonstrate how haptic signs can be effective communicative tools supporting interaction and joint attention. Communicative flow in dialogue and group settings depends on access to signals that indicate awareness of moment-by-moment actions in the ongoing interaction.

Appendix A. Transcription conventions

Rebecca (db1) -- Chairwoman
Interpreter (1) -- Chairwoman's interpreter
db2 -- Board member 2, numbered in accordance with the seating
Interpreter (2) -- Interpreter for board member 2
h* -- Haptic sign
WAIT -- Meaning of the sign in capital letters
[ ] -- Overlapping talk
( ) -- Description of interactional activities
@ -- Laughter
. . . -- Pause
++ -- Eye contact
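As a worked example of these conventions, a transcript line can be tokenized mechanically by its markers. The following sketch is our own illustration (the regular expressions and the function are invented, not tooling used in the study):

```python
import re

# Patterns derived from the transcription conventions listed above.
MARKERS = {
    "haptic": re.compile(r"h\*[A-Z-]+(?::\s*[A-Z-]+)?"),  # h*WAIT, h*PERSON: IVAR
    "overlap": re.compile(r"\[[^\]]*\]"),                 # [ ] overlapping talk
    "activity": re.compile(r"\([^)]*\)"),                 # ( ) interactional activity
    "laughter": re.compile(r"@+"),                        # @ laughter
}

def annotate(line: str) -> dict:
    """Collect the convention-marked segments found in one transcript line."""
    return {name: rx.findall(line) for name, rx in MARKERS.items()}

example = "h*PERSON: IVAR (signed on Rebecca's back) [Ivar asks for a turn] @@"
print(annotate(example))
```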

References

Berge, Sigrid S., 2014. Social and private speech in an interpreted meeting of deafblind persons. Interpreting 16 (1), 81--105. http://dx.doi.org/10.1075/intp.16.1.05ber
Berge, Sigrid Slettebakk, Raanes, Eli, 2012. Koordinering av turtaking i en tolkemediert gruppesamtale mellom døvblinde samtalepartnere (Coordination of turn-taking in an interpreted group dialogue for deafblind). Sosiologisk Tidsskrift (J. Sociol.) 20 (01), 47--72.
Berge, Sigrid S., Raanes, Eli, 2013. Coordinating the chain of utterances: an analysis of communicative flow and turn-taking in an interpreted group dialogue for deaf-blind persons. Sign Lang. Stud. 13 (3), 350--371. http://dx.doi.org/10.1353/sls.2013.0007
Birdwhistell, Ray L., 1952. Introduction to Kinesics: An Annotation System for Analysis of Body Motion and Gesture. Department of State, Washington, DC.
Birdwhistell, Ray L., 1970. Kinesics and Context: Essays on Body Motion Communication. University of Pennsylvania Press, Philadelphia.
Bjørge, Hildebjorg K., Rehder, Kathrine G., 2015. Haptic Communication: The American Edition of the Original Title Haptisk Kommunikasjon [Kindle DX version]. Retrieved from Amazon.com
Bjorge, Hildebjorg K., Rehder, Katrine G., Overaas, Magni, 2013. Haptisk kommunikasjon (Haptic Communication). Abstrakt forlag, Oslo.
Broth, Mathias, Mondada, Lorenza, Laurier, Eric, 2014. Studies of Video Practices: Video at Work. Routledge, New York.
Coates, Jennifer, Sutton-Spence, Rachel, 2001. Turn-taking patterns in deaf conversation. J. Sociolinguist. 5 (4), 507--529. http://dx.doi.org/10.1111/1467-9481.00162
Collins, Steven, 2004. Adverbial morphemes in tactile American Sign Language. Dissertation. Union Institute and University, Cincinnati, OH.
Edwards, Terra, 2014. From compensation to integration: effects of the pro-tactile movement in the sublexical structure of tactile American Sign Language. J. Pragmat. 69, 22--41. http://dx.doi.org/10.1016/j.pragma.2014.05.005
Erlenkamp, Sonja, 2003. Informanter i tegnspråkforskning: problemer og utfordringer (Informants in sign language research: problems and challenges). In: Johannessen, J.B. (Ed.), På språkjakt: problemer og utfordringer i språkvitenskapelig datainnsamling (In Search for Language: Problems and Challenges). Unipub, Oslo, pp. 83--132.
Frankel, M.A., 2002. Deaf-blind interpreting: interpreters’ use of negation in tactile American Sign Language. Sign Lang. Stud. 2, 169--181. http://dx.doi.org/10.1353/sls.2002.0004
Goffman, Erving, 1959. The Presentation of Self in Everyday Life. Penguin Books, London.
Goffman, Erving, 1974. Frame Analysis: An Essay on the Organization of Experience. Harper and Row, New York.
Goffman, Erving, 1981. Forms of Talk. University of Pennsylvania Publications in Conduct and Communication. University of Pennsylvania Press, Philadelphia.
Goodwin, Charles, 2000. Action and embodiment within situated human interaction. J. Pragmat. 32, 1489--1522. http://dx.doi.org/10.1016/S0378-2166(99)00096-X
Heath, Christian, 2002. Demonstrative suffering: the gestural (re)embodiment of symptoms. J. Commun. 52 (3), 597--616. http://dx.doi.org/10.1111/j.1460-2466.2002.tb02564.x
Hjort, Peter, 2008. Framtidens tolke- og kommunikasjonstjenester for døve, døvblinde og hørselshemmede (Official Norwegian report on interpreting: the future of interpreter and communication services for the deaf, deafblind and hearing impaired). Rikstrygdeverket [The National Social Security System], Oslo.
Knoblauch, Hubert, Soeffner, Hans-Georg, Schnettler, Bernt, Raab, Jürgen (Eds.), 2006. Video Analysis: Methodology and Methods. Peter Lang, Frankfurt.
Lahtinen, Riitta, 2003. Development of the Holistic Social-Haptic Confirmation System: A Case Study of the Yes & No Feedback Signals and How They Become More Commonly and Frequently Used in a Family with an Acquired Deafblind Person. University of Helsinki, Department of Teacher Education, Helsinki.
Lahtinen, Riitta, 2008. Haptices and Haptemes: A Case Study of Developmental Process in Social-Haptic Communication of Acquired Deafblind People. University of Helsinki. A1 Management, Essex.
Lahtinen, Riitta, Lahtinen, Merja, Palmer, Russ, 2010. Environmental Description for Visually and Dual Sensory Impaired People. A1 Management, Essex.
Linell, Per, 1997. Interpreting as communication. In: Gambier, Y., Gile, D., Taylor, C. (Eds.), Conference Interpreting: Current Trends in Research. Proceedings of the International Conference on Interpreting: What Do We Know and How? John Benjamins, Amsterdam, pp. 49--67. http://dx.doi.org/10.1075/btl.23.04lin
Linell, Per, 1998. Approaching Dialogue: Talk, Interaction and Contexts in Dialogical Perspective. John Benjamins, Amsterdam. http://dx.doi.org/10.1075/impact.3
Linell, Per, 2009. Rethinking Language, Mind, and World Dialogically. Information Age Publishing, Charlotte, NC.
Linell, Per, Gustavsson, Lennart, 1987. Initiativ och respons: om dialogens dynamik, dominans och koherens (Initiative and Response: On the Dynamics, Dominance and Coherence of Dialogue). Tema K, Linköping.
Marková, Ivana, Linell, Per, Grossen, Michèle, 2007. Dialogue in Focus Groups: Exploring Socially Shared Knowledge. Studies in Language and Communication. Equinox Publishing, Bristol, CT.
Marková, Ivana, Linell, Per, Gillespie, Alex, 2008. Trust and distrust in society. In: Marková, I., Gillespie, A. (Eds.), Trust and Distrust: Sociocultural Perspectives. Information Age, Charlotte, pp. 3--27.
Mesch, Johanna, 1998. Teckenspråk i taktil form: turtagning och frågor i dövblindas samtal på teckenspråk (Tactile sign language: turn-taking and questions in signed conversations of deafblind people). Dissertation. Department of Linguistics, Stockholm University, Stockholm.
Metzger, Melanie, 1999. Sign Language Interpreting: Deconstructing the Myth of Neutrality. Gallaudet University Press, Washington, DC.
Metzger, Melanie, Fleetwood, Earl, Collins, Steven D., 2004. Discourse genre and linguistic mode: interpreter influence in visual and tactile interpreted interaction. Sign Lang. Stud. 4 (2), 118--216. http://dx.doi.org/10.1353/sls.2004.0004
Mondada, Lorenza, 2009. Video recording practices and the reflexive constitution of the interactional order: some systematic uses of the split-screen technique. Hum. Stud. 32 (1), 67--99.
Mondada, Lorenza, 2014. Shooting video as a research activity: video making as a form of proto-analysis. In: Broth, M., Mondada, L., Laurier, E. (Eds.), Studies of Video Practices: Video at Work. Routledge, New York, pp. 33--62.
Næss, Trine, 2002. En trangsynt erfaring (From a narrow perspective). Tolkeavisa (J. Interpret.) 4, 15--18.
Napier, Jemina, 2007. Cooperation in interpreter-mediated monologic talk. Discourse Commun. 1 (4), 407--432. http://dx.doi.org/10.1177/1750481307082206
Petrén, Finn, 1980. Bättre livsvillkor för dövblinda i Norden: förslag från Nordiska arbetsgruppen för dövblinda (Better Life for Deafblind in the Nordic Countries). Nordiska nämnden för handikappfrågor, Bromma.
Petronio, Karen, Dively, Valerie, 2006. Yes, #no: visibility and variation in ASL and tactile ASL. Sign Lang. Stud. 7 (1), 57--98. http://dx.doi.org/10.1353/sls.2006.0032
Raanes, Eli, 2006. Å gripe inntrykk og uttrykk: interaksjon og meningsdanning i døvblindes samtaler: en studie av et utvalg dialoger på taktilt norsk tegnspråk (To catch impressions and expressions: interaction and meaning construction in deafblind people's conversation: a study of a selection of dialogues in tactile Norwegian sign language). Dissertation. Norwegian University of Science and Technology, Trondheim.
Raanes, Eli, 2011. Tegnrom og taktilt tegnspråk (Signing space in tactile sign language). Norsk lingvistisk tidsskrift (J. Nor. Linguist.) 29 (1), 54--86.
Raanes, Eli, Berge, Sigrid S., 2011. Tolketjenesten: avgjørende for døvblindes deltagelse (Interpreter services: decisive for the participation of the deaf-blind). Fontene Forskning 1 (11), 4--17.
Roy, Cynthia B., 2000. Interpreting as a Discourse Process. Oxford University Press, New York.
Sacks, Harvey, Schegloff, Emanuel A., Jefferson, Gail, 1974. A simplest systematics for the organization of turn-taking for conversation. Language 50 (4), 696--735. http://dx.doi.org/10.1353/lan.1974.0010
Sacks, Harvey, Jefferson, Gail, Schegloff, Emanuel A., 1992. Lectures on Conversation, Vol. 1. Blackwell, Oxford.
Schwartz, Sandrine, 2009. Stratégies de synchronisation interactionnelle: alternance conversationnelle et rétroaction en cours de discours chez des locuteurs sourdaveugles pratiquant la Langue des Signes Française tactile (Interactional synchronization strategies: conversational turn-taking and feedback in the ongoing discourse of deafblind signers using tactile French Sign Language). Dissertation. Sciences du langage, Université Paris 8, Paris.
Skåren, Anne-Lise, 2011. Det øynene ikke ser og ørene ikke hører (What the Eyes Don't See and the Ears Don't Hear). MA thesis. Norwegian University of Science and Technology, Trondheim.
Wadensjö, Cecilia, 1995. Dialogue interpreting and the distribution of responsibility. J. Linguist. 14, 111--129. http://download1.hermes.asb.dk/archive/download/H14_07.pdf
Wadensjö, Cecilia, 1998. Interpreting as Interaction. Longman, New York.
