Jinni A. Harrigan
3 Methodology: coding and studying nonverbal behavior

Abstract: This chapter presents material on methodological issues in the study of nonverbal behavior. Areas included are facial and vocal behavior, kinesics, proxemics, and gaze behavior. The focus within these dimensions of nonverbal behavior is on strategies used for coding, counting, and recording the various units of analysis. Thorough treatments of specific methodological practices can be found elsewhere, especially for facial and vocal behavior. The complex conceptual issues in cataloguing nonverbal behavioral units for analysis cannot be overlooked, and this chapter elucidates some of these issues, particularly for kinesics, proxemics, and gaze behavior. Future researchers will build on what has been learned about nonverbal behavior by further developing coding and recording methods, particularly for kinesics, proxemics, and gaze behavior.

Keywords: nonverbal behavior methodology, coding methods, facial, vocal, gaze, kinesics, proxemics
1 Introduction

The methodology for studying nonverbal behavior is as broad and dense as the field itself. A comprehensive description of these methods is not within the scope of this chapter; rather, the focus is on the coding methods used for nonverbal behaviors, as the coding methods themselves are essential both for studies where behaviors are quantified (e.g., frequency of head nods) and for studies where behaviors are manipulated (e.g., types of gestures). Gray and Ambady (2006) outlined strategies for studying nonverbal behaviors as independent or dependent variables, but the focus here is on accepted coding practices for recording face and body actions and vocal content, as these methods are essential for replicability and for comparison of behaviors across studies. Advances have been made in nonverbal domains where the behavioral codes have been systematically developed and tested, and are used by researchers from different laboratories to denote the same behaviors. A coding system requires conceptualization, segmentation, and classification of behaviors as mutually exclusive units. Segmentation involves decisions for separating and identifying a behavioral unit from the stream of behavior, and is based on conceptual distinctions (e.g., a head nod versus a head shake) and temporal parameters (i.e., beginning and end points); symbols are often used to represent the resulting behavioral units (e.g., Facial Action Unit 12). Both micro and macro levels of coding are used (Burgoon
Brought to you by | Portland State University Authenticated Download Date | 4/28/15 10:58 PM
and Baseler 1991). A micro level may be used to code minute movements of the head, eyes, and fingers in speaker turn-switching studies, whereas a macro level may be used in studies of affiliation where smiles, head nods, and distance are coded. The present chapter is divided into the five historically distinguished nonverbal domains: facial actions, vocal cues, proxemics (use and perception of space), gaze, and kinesics (head, body, arm, and leg movement). Coding techniques and sample research methodologies are discussed for each. The notion of separate domains is a handy tool for research, but should not obscure the importance of the congruency that exists across nonverbal, vocal, and verbal behaviors in social interaction. (Additional methodological information on these five domains is available in Chapters 6–11 of the present volume.) Research on the face and voice has evolved considerably over the last 50 years, due in large part to sound conceptual and theoretical perspectives that resulted in distinct coding methods based on anatomy, evolution, and social function. Research in these two areas was pursued meticulously by investigators who worked consistently to develop and hone reliable and valid coding techniques that are used by other researchers, permitting comparison across laboratories and, ultimately, accelerating our understanding of facial and vocal behavior. With the exception of some technical innovations, strategies for coding proxemics and gaze have changed little since the 1980s, and systematic research on kinesics, begun more than a half century ago, is even less advanced. Investigations in the latter three domains have received less unified attention, and include a range of foci and individual coding methods by researchers from a variety of disciplines.
This state of disjointedness in the development of coding systems for proxemics, gaze, and kinesics has hampered the elucidation of theoretical constructs and research methodologies in these areas, and for this reason a wider discussion is provided for these three to outline present coding frameworks which can be further developed in future research. Several issues are relevant for coding nonverbal behavior. First, despite what pop-psychology writers have artfully suggested (Fast 1970), the great majority of nonverbal behaviors do not convey unambiguous meaning as verbal behavior generally does. A word itself usually has no direct relationship to its referent, other than the defined meaning attributed to it by those who use it. The primary carriers of affect, the face and voice, are the closest to being able to represent specific meaning (e.g., eye and mouth configuration and vocal contour in sadness). There are a few body actions (e.g., insulting hand movements, head nodding) that can be interpreted like language by those within or between cultures, but even these do not reveal the same meaning each time they are displayed (e.g., a head nod to signal Yes, or as a listener response acknowledging the speaker’s comments). Ekman and Friesen (1969) provided a thorough articulation of the differences between idiosyncratic and shared meanings, and between informative and communicative behaviors; see also Rosenfeld (1987). Second, while a verbalization may be disjointed or blurted out, it implies an intention to send a message. Intention is far less discernible with nonverbal behaviors (Dittmann 1987; Ekman and Friesen 1969). Compare the utterance, “I don’t agree,” with a silent person who simultaneously purses the lips, and looks and turns away from the speaker. In both cases information is revealed, and in the latter, an attitude or feeling may be surmised, but was it intentional or a spontaneous reaction? The encoded (i.e., displayed) nonverbal behaviors may range from conscious and deliberate to automatic and unintended. Lastly, the terminology for coded behaviors needs to be descriptive, rather than inferential, to avoid bias resulting from inferred meanings, e.g., reporting positive relationships between personality variables (e.g., agreeableness and extraversion) and body positions characterized as “open,” without defining “open.” Investigations may be focused on behaviors within a single domain (e.g., gaze patterns in teacher-student interactions) or on behaviors from several domains (e.g., gaze, kinesics, face). The latter are referred to as channel studies and are ideal for obtaining a wealth of information about a construct (e.g., listener feedback), or for examining the interactions of behaviors from different domains. Channel studies are plentiful (Bugental, Kaswan, and Love 1970; Gallois and Callan 1986). For example, a meta-analysis showed the influence of channel on ratings of state and trait anxiety (Harrigan, Wilson, and Rosenthal 2004). While coding nonverbal behaviors may seem an ideal method to determine the relationship between specific actions and a particular concept (e.g., gaze and dominance), a less time-consuming method is to use adjective ratings (e.g., warmth, dominance) of the behaviors of interest.
For example, global assessments of overall behavior with respect to arousal (e.g., calm, reticent, attentive) were shown to be highly correlated in the psychotherapy context (Burgoon et al. 1992).
2 Facial behavior

2.1 Anatomical coding of facial behavior

Of the five broad channels of nonverbal behavior identified above, the face has received the most research attention, and a great deal of appreciation in literature and the media as an important vehicle for conveying emotion. For example, reading “he sneered” in a novel connotes a very different impression than “he smiled”; this is even more powerfully conveyed when we see an actor sneer or smile in a movie. When we see these displays we have an idea what they might indicate, but the research question might be what muscles move to provide this information. These simple examples of facial expressions belie the difficulty in measuring such complex, rapid, and sometimes confusing movements. More than 80 years have passed since the earliest attempts to systematically measure facial actions. Techniques for measuring facial actions are presented here, but more thorough treatments exist (Cohn and Ekman 2005). One of the oldest research questions regarding facial cues, cultural specificity in emotion recognition, has been elucidated in Elfenbein and Ambady’s (2002) thorough meta-analysis of studies on the universality of emotion recognition. The oldest, most developed, and most used method for enumerating facial actions is manual coding, in which a trained coder counts specific facial movements based on a systematic inventory of facial muscle movements. Thorough inventories of facial actions are anatomically based (Ekman, Friesen, and Hager 2002; Ermiane and Gergerian 1978), but other perspectives include linguistic (Birdwhistell 1970) and ethological (Blurton Jones 1971) approaches. The extensive and comprehensive work of Ekman and colleagues has resulted in a detailed and specific system (the Facial Action Coding System or FACS; Ekman et al. 2002) presently used by many researchers (Ekman and Rosenberg 1997, 2005). Like Ermiane and Gergerian (1978), the FACS describes actions of all the facial anatomy, and includes details regarding intensity and timing. For example, the intensity of the contraction of the zygomatic major, which pulls the mouth corners upward, was correlated with self-reported happiness (Ekman, Friesen, and Ancoli 1980). With respect to timing, the various “actions that compose a facial expression do not all start, reach an apex, and stop simultaneously” (Cohn and Ekman 2005: 23). Both factors are crucial for determining the exact facial configuration and thus the expression. Facial affect typically involves more than one muscle movement, and coding systems need to account for changes with respect to speaking, eye movement, and developmental variations (e.g., infants, children, elders). The FACS was developed methodically and rigorously over more than a decade of research (Ekman and Friesen 1978; Ekman et al.
2002), by carefully determining which muscles contracted in various facially-displayed emotions and labeling these individual muscle movements (Action Units or AUs) by number rather than by inference (e.g., AUs 9+12 rather than “sneer”) to avoid bias associated with terminology. In addition to studying which facial muscles were contracted in each emotion, electrical stimulation of facial muscles was used to determine their specific actions (Ekman, Schwartz, and Friesen 1978). Finally, observers were able to accurately differentiate the facial action units, i.e., the visible changes produced by different facial muscles (Ekman and Friesen 1978). The EMFACS (Friesen and Ekman 1984), which is based on the FACS, scores only those movements related to the seven universal facial expressions (i.e., happiness, sadness, anger, fear, disgust, contempt, and surprise) delineated by Ekman and Friesen (1971) and Izard (1971). It was developed to reduce the time and cost involved in scoring facial data using the FACS, which can be time consuming because all visible movements are scored, and the quantity or compactness (i.e., density of muscle movements) of the data may require many hours of coding. Not all facial coding systems that have been developed considered all muscle movements (Frois-Wittmann 1930; Landis 1924); some included only those related
to emotion (Izard 1983). Based on the Differential Emotions Theory, Izard (1971) developed the Maximally Discriminative Facial Movement Coding System or MAX (Izard 1979, 1983) for scoring infant facial expressiveness, and later a system for identifying affect expressions by holistic judgments, the AFFEX (Izard and Dougherty 1980). Disadvantages of the MAX are that it does not include all possible facial muscle movements, it is based on posed, prototypical adult facial expressions (Oster, Hegley, and Nagel 1992), and it cannot “distinguish among facial actions that have different anatomical bases” (Cohn and Ekman 2005: 17).
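The descriptive, inference-free logic of AU coding can be sketched in a few lines of code: coded AU numbers, not emotion words, are the recorded data, and any emotion label is a separate interpretive step. The AU combinations below follow commonly cited FACS-based interpretive tables (e.g., AU6 + AU12 for happiness), but they are illustrative only and are no substitute for the FACS or EMFACS manuals:

```python
# Illustrative sketch: interpreting coded Action Units (AUs) after the fact.
# The AU-to-emotion combinations below follow commonly cited FACS-based
# interpretive tables; real EMFACS scoring is considerably more nuanced.

AU_PREDICTIONS = {
    frozenset({6, 12}): "happiness",       # cheek raiser + lip corner puller
    frozenset({1, 4, 15}): "sadness",      # inner brow raiser + brow lowerer + lip corner depressor
    frozenset({1, 2, 5, 26}): "surprise",  # brow raisers + upper lid raiser + jaw drop
}

def classify_aus(coded_aus):
    """Return the emotion label whose AU combination is contained in the coded set."""
    coded = frozenset(coded_aus)
    for combo, label in AU_PREDICTIONS.items():
        if combo <= coded:
            return label
    return "unclassified"
```

Note that the coder's record (e.g., {6, 12}) stays purely descriptive; the emotion label is produced only by the lookup, mirroring the separation of coding from inference described above.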
2.1.1 Reliability and validity of anatomical coding of facial movements

Most of the coding techniques cited above either did not report reliability data or reported it only for selected facial actions, and did not include reliabilities for different age groups, for spontaneous versus posed facial actions, with respect to intensity and timing, or for differences in coder experience (Cohn and Ekman 2005). An index of agreement can be used to show which actions are more reliably coded than others; and, particularly for small actions, it can help to establish a minimum threshold for an action to be coded (Cohn and Ekman 2005). Reliabilities (e.g., percent agreement) reported for FACS and MAX are very good. Coders must achieve a reliability of .83 to be considered a valid FACS coder, and the mean agreement ratio for a sample of action units was .82 (Ekman et al. 2002: 18). Very little information is available regarding the validity of the various coding strategies. While some other coding techniques purport to predict developmental changes (Frois-Wittmann 1930), severity of mental illness, and individual differences (Landis 1924), little data was provided. An exception is the work of Ekman and colleagues, who demonstrated validity for the FACS codes and muscle movements through: a) requested performances of various facial actions by trained individuals (Ekman and Friesen 1978), and b) electrical stimulation of facial muscles (Ekman et al. 1978). The bulk of studies on facial movement have been directed toward recognition of emotion in others, and facial EMG studies showed associations between FACS codes and the muscles active during an emotional experience: (a) autonomic nervous system responses associated with the experience of spontaneous emotions (Ancoli 1979), and (b) subjects’ retrospective reports of experienced emotion (Ekman et al. 1980).
Izard found that MAX coders could reliably identify infant facial actions which corresponded to posed adult emotions, and infant vocalization and movement patterns (Izard 1983).
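The agreement index used in FACS certification can be made concrete. A commonly described formulation is twice the number of action units on which two coders agree, divided by the total number of action units scored by both; the sketch below assumes that formulation, with hypothetical coder data:

```python
def agreement_ratio(coder_a, coder_b):
    """FACS-style agreement ratio for one facial event: twice the number of
    AUs both coders scored, divided by the total AUs scored by either coder."""
    a, b = set(coder_a), set(coder_b)
    if not a and not b:
        return 1.0  # trivially perfect agreement on an event neither coder scored
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical example: two coders scoring the same facial event
ratio = agreement_ratio({1, 2, 5, 26}, {1, 2, 5, 27})
print(round(ratio, 2))  # 3 shared AUs, 8 scored in total -> 0.75
```

Averaged over many events, a ratio of this kind yields summary figures comparable to the .82 sample mean cited above.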
2.2 Other facial movement measurement strategies

Facial electromyography (EMG) measures the electrical activity of facial muscles by detecting the electrical impulses generated during contraction, including activity that is not visible to the naked eye. Measures of facial muscle activity using EMG have become more sensitive and precise over time. Two methods can be used: electrodes placed on the surface of the face, or fine wires inserted into the muscle (Cacioppo, Gardner, and Berntson 1999); for detailed descriptions see Cacioppo et al. (2000). While there are some difficulties with facial EMG (i.e., anatomical alignment of muscles over one another, restricted head movement, asymmetry of the facial musculature, training of researchers), it was used in the development of the FACS and to establish the reliability of the FACS (Cohn et al. 1999; Ekman and Friesen 1978). Reliability for facial EMG has improved since guidelines were developed (Fridlund and Cacioppo 1986), and FACS codes and EMG recordings were shown to be highly correlated (r = .85) (Ekman et al. 1978). Typically, EMG is used to study not specific emotions but which muscles contract during facial displays of positive or negative affect (Cacioppo et al. 2000). However, EMG recordings have been correlated with self- and observer-reported emotion (Dimberg, Thunberg, and Grunedal 2002), and with the prediction of treatment outcome in depression (Carney et al. 1981). An emerging method of representing facial configurations is automatic image analysis of facial expressions using computer vision. Such analysis recognizes arrangements of facial actions in digitized images, culminating in the extraction of relevant patterns representing various emotions (Cohn 2010; Cohn and Ekman 2005). Face recognition is an example of automated facial image analysis, and commercial applications are available; however, this technique is complex and requires significant expertise and equipment (Cohn and Kanade 2007). Use of this technique for facial expression research is described in detail by Cohn and Ekman (2005).
3 Vocal behavior

The domain of vocal behavior, also referred to as ‘paralanguage’ (Street 1990; Trager 1958), includes acoustic features of the voice (e.g., pitch), and speech disruptions and nonlinguistic sounds (e.g., stutters). Phonation characteristics of the voice (e.g., movement of the vocal folds; tongue position) are analyzed in relation to the speech process, but have not yet been examined in social interaction studies. The acoustic features most commonly extracted are pitch (i.e., fundamental frequency), tempo (i.e., speech rate), and loudness (i.e., amplitude or intensity), but other measures include lip and articulation control, rhythm, breathiness, nasality, and resonance (Poyatos 1993; K.R. Scherer 1979). Several voice features that describe phonation changes (i.e., tension, perturbation) have recently been analyzed (Patel et al. 2011); see also http://www.ncvs.org/ncvs/tutorials/voiceprod/tutorial/quality.html. Speech disruptions include stutters, repetitions, sentence changes and incompletions, filled and unfilled pauses, word omissions, and nonlinguistic sounds such
as sighing or clearing the throat (Poyatos 1993; Siegman 1979). While verbalizations are generally intentional and deliberate with regard to communication, nonlinguistic sounds are less specific and less premeditated, but offer important cues regarding the speaker’s identity, personality, emotion, and conversational turn intentions. For example, vocal stress, a nonlinguistic characteristic, helps untangle the meaning of an utterance (Chomsky 1965) or aids the recognition of sarcasm (Rockwell 2006). Like facial actions, vocal behavior has a long history of study, focused on personal characteristics, conversational cues, and affect communication. Both acoustic cues and speech disruptions have been considered in these areas.
3.1 Vocal behavior methodologies

Speech samples are typically audio- or videotaped, but unlike facial actions, which are immediately apparent and capable of being analyzed in a still state (e.g., a paused videotape), vocalizations are dynamic. A still state cannot capture vocal characteristics, which transpire over time and are additive, i.e., “each cue is neither necessary nor sufficient, but the larger the number of cues used, the more reliable the communication” (Juslin and Scherer 2005: 84). The collection of speech samples poses several potential problems. Speech samples necessarily involve verbal behavior, and its impact can be reduced in several ways: by using standard content (e.g., the alphabet, stock passages, or pseudolanguage), or by masking techniques such as low-pass filtering (content filtering; Rogers, Scherer, and Rosenthal 1971) or randomized splicing (K.R. Scherer 1971), which remove content and voice quality cues and have been used in many studies (Rosenthal et al. 1979; K.R. Scherer and Wallbott 1985). Still other masking techniques mask acoustic as well as linguistic cues, using “sine-wave replication” to filter out voice quality and intonation (Remez, Fellowes, and Rubin 1997; Schiller and Koster 1998); phonetic cues in voice segments have also been distorted by playing them forwards and backwards (Van Lancker, Kreiman, and Emmorey 1985). Examples of successful masking using content-filtered speech showed that physicians’ voices correlated with their success at referring alcoholic patients to treatment programs (Milmoe et al. 1967), with patients’ satisfaction (J.A. Hall, Roter, and Rand 1981), and with physicians’ malpractice history (Ambady et al. 2002). In addition to concerns regarding verbal content, other issues to address relate to the sound quality and analysis of acoustic cues. Attention to sound quality at the time of recording and playback is critical.
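The principle behind content filtering can be illustrated numerically: a low-pass filter preserves low-frequency information (standing in for prosodic cues such as pitch contour) while attenuating the higher frequencies that carry phonetic content. The sampling rate, tone frequencies, and simple moving-average filter below are illustrative choices, not the analog filters used in the cited studies:

```python
import numpy as np

fs = 8000                      # sampling rate (Hz), an illustrative choice
t = np.arange(0, 0.5, 1 / fs)  # half a second of signal

low = np.sin(2 * np.pi * 200 * t)    # low-frequency component (prosody-like)
high = np.sin(2 * np.pi * 2000 * t)  # high-frequency component (content-like)

# Crude moving-average low-pass filter; first spectral null at fs/N = 800 Hz
N = 10
kernel = np.ones(N) / N

low_out = np.convolve(low, kernel, mode="same")
high_out = np.convolve(high, kernel, mode="same")

# Steady-state output amplitude, skipping the filter's edge transients
gain_low = np.abs(low_out[N:-N]).max()
gain_high = np.abs(high_out[N:-N]).max()
print(f"200 Hz gain ~{gain_low:.2f}, 2000 Hz gain ~{gain_high:.2f}")
```

The low component passes largely intact while the high component is strongly attenuated, which is why content-filtered speech keeps its affective "tone" while the words become unintelligible.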
Research conducted on acoustic cues requires knowledge about speech acoustics and vocal parameters. Coding procedures have evolved for analyzing vocal expression, with commercially available computer software designed to measure acoustic cues. Commercial products are available via the internet (e.g., Audacity), and these are described by Juslin
and Scherer (2005), who recommend PRAAT, available for download at http://www.fon.hum.uva.nl/praat/. There are a number of technical devices which may be of service for recording vocal cues: the spectrogram (i.e., a visual picture of a speech sample similar to a sonogram), EMG (electromyographic measurement of muscle actions during speech with surface or needle electrodes), and thermistors to measure air temperature variations during inhalation and exhalation (Juslin and Scherer 2005). Speech disturbances traditionally have been coded by trained coders who tally disturbances from meticulously transcribed speech, including all nonlinguistic sounds and incomplete speech (Kasl and Mahl 1965; Rosenfeld 1987). Speech rate can be determined by word or syllable counts (Buller 2005). Pauses (i.e., silent and filled) in the speech stream have been measured using a stopwatch (Goldman-Eisler 1968; Matarazzo and Wiens 1977), but more elaborate software exists to objectively and automatically assess pausing and speech disruptions (Patel and Shrivastav 2007). There are many applications for measuring speech characteristics, and some of these show potential diagnostic benefit. For example, researchers aim to model the vocal quality of dysphonic voices (i.e., voices affected by involuntary movements of the laryngeal muscles) during speech to serve as a diagnostic tool in clinical settings (Shrivastav et al. 2011), and efforts to infer personality characteristics from normal speakers have the potential to be used with populations suffering psychological disorders (Mohammadi, Vinciarelli, and Mortillaro 2010).
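As a minimal illustration of the kind of computation pitch-analysis software performs, the sketch below estimates fundamental frequency by the autocorrelation method on a synthetic 220 Hz tone. The pure-tone input and the parameter choices are assumptions for illustration; real speech requires the far more robust algorithms implemented in tools such as PRAAT:

```python
import numpy as np

fs = 16000                     # sampling rate (Hz)
t = np.arange(0, 0.25, 1 / fs)
signal = np.sin(2 * np.pi * 220 * t)  # synthetic 220 Hz "voice"

def estimate_f0(x, fs, fmin=50.0, fmax=500.0):
    """Estimate fundamental frequency by the autocorrelation method: the lag
    with maximal autocorrelation within the plausible pitch range corresponds
    to one pitch period."""
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    lag_min = int(fs / fmax)   # shortest plausible pitch period (in samples)
    lag_max = int(fs / fmin)   # longest plausible pitch period
    best_lag = lag_min + np.argmax(r[lag_min:lag_max])
    return fs / best_lag

f0 = estimate_f0(signal, fs)
print(f"estimated F0 ~{f0:.1f} Hz")  # close to 220 Hz
```

For voiced speech the same idea applies frame by frame, yielding the pitch contour that figures so prominently in studies of vocal affect.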
3.2 Research considerations

As with other areas of nonverbal behavioral research, crucial questions regarding vocal behavior depend on the research focus: on the encoder (individuals emitting vocal behavior) or on the decoder (individuals evaluating vocal behavior). Several methodological decisions need to be considered: 1) determining the type of speech samples (e.g., portrayals, natural or induced expressions) and how these will be recorded; 2) deciding how to segment speech samples (i.e., establishing boundaries between units of analysis in the speech stream); 3) selecting vocal behaviors (e.g., acoustic parameters, speech disruptions); and 4) determining the number of participants. Judgment studies require decisions regarding the choice of judges, rating methods, the format for and type of ratings, and establishing reliabilities among judges and ratings. Suggestions for overcoming methodological difficulties are discussed in detail by Juslin and Scherer (2005) and Tusing (2005).
4 Proxemics

The field of proxemics encompasses the perception, use, and framing of space. Contexts include conversations among intimates or strangers, employee interface
in business settings, teacher-student collaborations, and approach or crowding by others. Researchers come from many disciplines and differing perspectives, ranging from naturalistic observational studies to experimental manipulations of proxemic cues. Historically, E.T. Hall (1963, 1966, 1973) and Sommer (1959, 1961) were the first to study proxemics (E.T. Hall) and personal space (Sommer), and their ideas reflect their theoretical backgrounds. An anthropologist, E.T. Hall primarily emphasized features of our mammalian sensory equipment (i.e., thermal receptors, vision, olfaction, touch, kinesthesia [head, body, limbs]) in relation to others; kinesthesia refers to an awareness of body position and movement via proprioceptors in muscles and joints. For example, intimates (mothers and babies; lovers) often inhabit the close phase of E.T. Hall’s intimate distance, zero to 18 inches, where touch, smell, body heat, and even faint sounds are perceived, but vision is distorted. Watson and Graves (1966) operationalized E.T. Hall’s dimensional codes: e.g., from holding and caressing touch through spot touching to no physical contact. E.T. Hall (1974) also discussed environmental and cultural effects on our use of space: sociopetal arrangements (which encourage communication) and sociofugal ones (which afford solitude). Sommer, a psychologist, observed individuals’ alignments in “semi-fixed” space (around tables, chairs), and how these reflected and were affected by affiliation, status, leadership, and productivity (1967, 1969). He found intriguing results on intrusions into another’s personal space (Sommer 2002; Sommer and Becker 1969). Both E.T. Hall’s and Sommer’s findings have been corroborated by others (e.g., Altman and Vinsel 1977). Research also has focused on concepts of territoriality, defense, crowding, boundary markers, and maneuvers for maintaining personal space in public settings (Goffman 1971).
While these proxemic content issues are of great interest, the methodology for coding these variables often is not clearly defined, and conceptual categories have been shown to be considerably complex. For example, crowding, a psychological experience, involves not just population density, but time spent in the encounter, attention paid to oneself or others (Zlutnick and Altman 1972), social stimulation (Desor 1972), and room size (Ross et al. 1973). Altman made significant contributions in research on crowding, territoriality, and interpersonal relations (Altman 1975). He described primary (i.e., owner’s domain; home, bedroom), secondary (i.e., not exclusive to owner; neighborhood sites), and public (i.e., available to anyone; parks, public transportation seats) territories, and discussed how privacy is upheld through physical barriers, place markers, and verbal and nonverbal adjustments to discourage interaction (Altman 1975). Lyman and Scott (1967) developed a classification system based on the degree of personal autonomy in various settings, and delineated types of territorial incursion (i.e., violation, invasion, and contamination). Attempts to apply the defining features of these territories to real life settings are not clear-cut. Lines defining interactional, public, and secondary territories are often fuzzy, with considerable overlap on critical variables such as density, use of boundary markers, status, degree of acquaintanceship, and other relevant factors. Sommer’s studies
reveal the impact of markers (e.g., a coat over a chair) in defending personal space in public and reducing incursions by others while the owner is absent (Sommer and Becker 1969). An additional area of inquiry that has received extensive study is approach distance, i.e., approach toward or being approached by another person (Hayduk 1981a). Aiello (1987) reviewed more than 100 studies that have used the popular “stop-distance procedure,” in which a participant signals “stop” to indicate his/her level of discomfort with regard to an approaching individual or when approaching another (Hayduk 1983). The space between interactants after “stop” is presumably measured precisely, but often only an estimate is reported, e.g., an “arm’s length” (Aiello 1987). Hayduk (1981b) altered the frontal body angle of approach, and conducted the most elaborate study of reactions to intrusions. Bailenson et al. (2003) showed that participants approached by a virtual human in a virtual room behaved in a manner similar to human-to-human approach. An interesting study of approach distance shows the interface between neurology and proxemics: a patient with bilateral damage to the amygdala (which plays a key role in social cognition) displayed an absence of personal space boundaries (Kennedy et al. 2009).
4.1 Proxemic variables

The primary proxemic variable has been the distance between interactants, and it has garnered the most investigative attention despite being a crude and limited measure. Other proxemic variables add to measurement precision: frontal orientation, posture, and sensory input (e.g., vision, olfaction, touch). Distance would appear to be an unequivocal measure to establish, but the actual physical measurement (i.e., in inches or cm) of the distance between interactants rarely has been used, as it is highly intrusive. Most often distance is estimated between interactants’ foreheads, noses, chins, knees, chests, feet, or chair edges. In some studies the number of floor tiles separating individuals has been used, though with ambiguous accuracy. Barnard and Bell (1982) developed the Interpersonal Distance Mat, in which tension on wires embedded within a carpet mat registers pressure from a person walking or standing and is displayed via an LED device. Trained coders have estimated distances between participants in field settings such as playgrounds (Aiello and Jones 1971) or physicians’ offices (Noesjirwan 1977) using E.T. Hall’s proxemic scales. Videotaped records permit greater accuracy in measuring distance using predetermined calculations; e.g., the distance between participants’ heads and torsos was estimated in three-inch intervals from field-recorded videotapes (Remland, Jones, and Brinkman 1995). Calibrated grids have been used in several studies to guide trained coders in establishing distance (Madden 1999). Similarly, photographs made in field settings (i.e., shopping malls, sidewalks) were projected onto a calibrated grid to estimate distance (Burgess 1983).
Jones and Aiello (1973) developed a coding system to adjust for participant height variations, and Shuter (1976) designed a method for coding frontal body orientation.
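The grid-calibration logic used in the videotape and photograph studies above reduces to a scale factor: an object of known real-world size visible in the frame establishes centimeters per pixel, and interpersonal distance is then estimated from coded image coordinates. All coordinates and the reference length below are hypothetical, and the sketch ignores the camera-angle distortions that photogrammetric corrections address:

```python
import math

def calibrate(reference_cm, reference_px):
    """Centimeters per pixel, from an object of known size visible in the frame."""
    return reference_cm / reference_px

def interpersonal_distance(p1, p2, cm_per_px):
    """Estimated real-world distance between two coded image points (in pixels)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy) * cm_per_px

# Hypothetical example: a 100 cm floor-grid square spans 50 pixels in the frame,
# and coders marked the two participants' positions at these pixel coordinates.
scale = calibrate(100.0, 50.0)                      # 2 cm per pixel
d = interpersonal_distance((120, 80), (180, 80), scale)
print(d)  # 60 px * 2 cm/px = 120.0 cm
```

A single scale factor of this kind is only valid when both participants stand roughly in the calibrated plane, which is one reason angle-of-view corrections were needed in field photography.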
4.2 Methodologies for proxemic studies

Studies have employed projective and experimental techniques, as well as field investigations in public spaces and observations in labs, classrooms, or elevators. Projective techniques ask participants to indicate a comfortable standing or seating distance from another person by manipulating felt figures, dolls, or silhouettes (Aronow, Reznikoff, and Troyon 1975; Strayer and Roberts 1997), or by marking photographs (Meisels and Guardo 1969), drawings (Ashton and Shaw 1980), or a questionnaire depicting various scenarios (Hogh-Olesen 2008; Pederson 1973). Duke and Nowicki (1972) designed the Comfortable Interpersonal Distance Scale, which has been used frequently, as have Kuethe’s (1962) felt figures. Holmes (1992) adopted a unique approach with children, whereby they drew a picture of themselves with a stranger and with a friend, and the distances between each were measured. Hayduk (1983) and Aiello (1987) contended that projective techniques are a poor measure of personal space because they do not parallel life-size differences, and correlations between projective and real-life studies are very low; such problems are especially true in studies of approach distance. Experimental strategies require subjects to choose a seat with confederates or other participants (Latta 1978). Beaulieu (2004) allowed participants to position their chair, and used taped floor measures to estimate distance. In other studies, the participant’s chair was in the same position for all participants and could not be moved (Patterson, Roth, and Schenk 1979), or the distance between a participant and interviewer was manipulated (Sundstrom 1975). Participants’ attitudes (Marshall and Heslin 1975) or physiological responses (McBride, King, and James 1965) were assessed in relation to room density or distance from the experimenter. Field investigations are many, and include a variety of public settings: transportation terminals (Remland et al.
1995), outdoor benches (Leibman 1970), playgrounds (S.E. Scherer 1974), sidewalks (Sobel and Lillith 1975), movie and bank lines (Kaya and Erkip 1999), and shopping malls (Brown 1981). Spatial organization with respect to walking in public spaces also has been considered (Costa 2010). In public settings, unobtrusive observations were conducted by trained coders (Greenbaum and Rosenfeld 1980), or photographs and videotapes were made using unseen or disguised cameras (Gilmour and Walkey 1981). While video recordings with slow motion and digital counters offer more precise distance estimates than paper and pencil tallies in the field, other problems can occur (e.g., angle of participants to the camera). S.E. Scherer (1974) developed photogrammetry, a mathematical formula to account for errors in coding distance resulting from participants’ angle to the camera. Recently, proxemic studies have been conducted with
robots (Mumm and Mutlu 2011; Van Oosterhout and Visser 2008), and in virtual environments (Llobera et al. 2010). For example, customers interacted with a financial advisor at either a close or far distance using a video-mediated device (Grayson and Coventry 1998).
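The kind of correction that photogrammetry supplies can be illustrated with a toy trigonometric sketch. This is an illustrative reconstruction under simple assumptions (a calibrated image scale and pure foreshortening), not Scherer’s actual formula; the function name and parameters are hypothetical:

```python
import math

def corrected_distance(pixel_gap: float, pixels_per_meter: float,
                       axis_angle_deg: float) -> float:
    """Estimate true interpersonal distance from an image measurement.

    pixel_gap        -- measured gap between the two people in the frame (pixels)
    pixels_per_meter -- scale calibrated from a reference object in the scene
    axis_angle_deg   -- angle between the dyad's axis and the image plane;
                        0 means the pair stands parallel to the camera

    Foreshortening shrinks the apparent gap by cos(angle), so the sketch
    simply divides it back out.  (Toy model only; real photogrammetric
    correction is more elaborate.)
    """
    apparent_m = pixel_gap / pixels_per_meter
    return apparent_m / math.cos(math.radians(axis_angle_deg))
```

A pair standing at 60 degrees to the image plane appears only half as far apart as they really are, so the correction doubles the apparent distance.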
4.3 Research considerations

In general, there are no universally adopted methods for precisely measuring proxemic variables, and few improvements have been made since E.T. Hall’s (1973) notation system. Researchers have developed their own procedures, but without procedural specifics, implementation and comparison across studies are difficult. Hayduk’s (1983) and Aiello’s (1987) thoughtful and comprehensive reviews of more than 700 studies will benefit proxemic researchers. They discussed measures and methodological issues, theoretical interpretations, problem areas, and detailed findings on spatial behavior. Future research will benefit from continued development and precise description of methods and measurement techniques. A few coding suggestions can be proposed. The research question will direct decisions about which proxemic variables to include. In studies targeting how one uses space (e.g., finding solitude in a public setting) and other variables of interest (e.g., age, gender), it would be useful to employ E.T. Hall’s measures with the modifications by Watson and Graves (1966): posture, distance, orientation, touch, vision, audition, olfaction, and thermal detection. Each of these needs to be operationalized. For example, examining children’s age and culture in relation to approach by peers, strangers, or authority figures will require the specifics of distance, posture, orientation, touch, and vision. Body positional cues (i.e., trunk lean, arm/leg/head positions) impact proxemics: for example, leaning forward reduces the distance between participants and makes touching, olfaction, and thermal detection possible. When proxemic cues are secondary to the research question (e.g., an encounter between bank teller and customer), distance and orientation may be sufficient, while other variables such as facial expression and eye contact may be more important.
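One way to operationalize such a multi-variable observation is a structured code sheet with one record per observed unit. The sketch below is purely illustrative: the field names and types are assumptions loosely following Hall’s categories as modified by Watson and Graves, not a published instrument:

```python
from dataclasses import dataclass

@dataclass
class ProxemicRecord:
    """One coded observation unit.  Field names are illustrative
    stand-ins for Hall's categories (posture, distance, orientation,
    touch, vision, audition, olfaction, thermal detection)."""
    posture: str                  # e.g., "standing", "seated"
    distance_cm: float            # operationalized, e.g., shoulder-to-shoulder
    orientation_deg: float        # 0 = directly facing, 90 = right angle
    touch: bool = False
    mutual_vision: bool = False
    within_audition: bool = True
    olfaction_possible: bool = False
    thermal_possible: bool = False

# Example: a standing dyad at 75 cm, angled 30 degrees, with mutual gaze
rec = ProxemicRecord(posture="standing", distance_cm=75.0,
                     orientation_deg=30.0, mutual_vision=True)
```

Forcing each category into an explicit field makes omissions visible at coding time and keeps records comparable across coders and studies.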
Grahe and Bernieri (1999) rated the degree of mutual eye contact, in addition to proximity and orientation. Proxemic cues are a significant part of the Intimacy Equilibrium Model (Argyle and Dean 1965; Argyle and Cook 1976) and show the importance of studying co-occurring cues rather than isolated ones; this model is discussed in the section on gaze.
5 Gaze behavior

Gaze is unique among nonverbal channels in that it is used to both receive and send information. Receiving visual information refers to the “monitoring” function
described by Kendon (1967), a pioneer in gaze studies. We gather information about our environment and others by looking to determine others’ motivations and intentions. Von Cranach (1971) described gaze as one component of “orienting behavior,” which humans share with other animals. The sending function is exercised both in “regulating” speaker-listener turn switches in conversation, and in “expressing” or revealing interest, emotion, and attitude (Kendon 1967). Preliminary research decisions depend on which function will provide the answers sought in the research question, and these decisions in turn dictate where and how the study will be conducted. Monitoring functions involve studies on gaze fixation with precise measurement of pupil direction and movement over stimuli, or studies examining the information gleaned by the observer through looking (Faraday and Sutcliffe 1998). Substantial research exists on the role of gaze in coordinating speaking turns and indicating listener responsiveness (Duncan and Fiske 1977). Visual behavior is also an element in expressing emotion, information, or attitude. Norms for appropriate gaze have been delineated, e.g., “civil inattention” (i.e., not gazing at strangers in public) or “cutting” (i.e., visually ignoring another) (Goffman 1963).
5.1 Gaze variables

Gaze behaviors include: eye direction (left/right, up/down); eye contact or mutual gaze between interactants; “one-sided gaze” (one person looks at another who does not return the gaze); glancing (brief looks toward and away from another person or object); staring (continual gaze at another); and gaze aversion (looking away from another person) (Kleinke 1986; Noller 2005). Gaze variables are recorded as frequencies or durations, and the most commonly studied are: mutual gaze, glance frequency, gaze duration at partner, and proportion of looking during a specified activity (e.g., listening, speaking); many of these are intercorrelated (Duncan and Fiske 1977). The difficulties encountered in precise measurement have led researchers to record “face gaze” (looking toward another’s face) rather than “eye gaze” (looking into another’s eyes). Typically, one moves the head when redirecting gaze, and interactants tend to look at each other or well away (Exline 1972; Kendon 1967). Von Cranach (1971) and others (Exline and Fehr 1982; Guerrero 2005) have suggested that the precise determination of eye-to-eye contact may be less critical than the direction of one’s head in relation to another person. It may be more useful to record the extent to which interactants direct their face toward one another, rather than suffer cumbersome intrusions with eye-tracking devices. In their seminal work on speaker-listener turn switching, Duncan and Fiske (1977: 43) recorded gaze based “largely from the movement and orientation of the actor’s head.” The Intimacy Equilibrium Model (Argyle and Cook 1976) suggests that approach-avoidance forces underlie eye contact and are held in check by components of intimacy (i.e., degree of eye contact, physical distance, topic intimacy, body orientation, smiling, etc.); see Julien (2005) and Patterson (1991). Eye direction when answering thought-provoking questions is purported to reflect brain hemisphere involvement during cognitive processing. Looking left indicates retrieval of emotional or spatial information from the right hemisphere, whereas looking right reflects left-hemisphere linguistic and analytic thinking (Weisz and Adam 1993). Finally, while not directly related to gaze, other parts of the eye region, the eyebrows and eyelids, are important in studies of emotion: e.g., raised brows in surprise, lowered, drawn-together brows in anger (Ekman and Friesen 1977); blinking in anxiety (Harrigan and O’Connell 1996); and decreased gaze in embarrassment (Edelmann and Hampson 1979). One distinguishing feature among smile types is eye muscle involvement (Ekman and Friesen 1982). Facial affect processing is greatly influenced by direct versus averted gaze (Adams and Kleck 2003).
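Once each interactant’s gaze has been coded as onset/offset intervals, mutual gaze falls out as the overlap of the two records. A minimal sketch (the function name and interval format are my own, not a published coding tool):

```python
def mutual_gaze_seconds(a_gaze, b_gaze):
    """Total seconds of mutual gaze, given each person's gaze-at-partner
    episodes as (onset, offset) pairs in seconds.  Assumes each person's
    own intervals do not overlap; illustrative helper only."""
    total = 0.0
    for a_on, a_off in a_gaze:
        for b_on, b_off in b_gaze:
            # Overlap of two intervals is positive only when they intersect
            overlap = min(a_off, b_off) - max(a_on, b_on)
            if overlap > 0:
                total += overlap
    return total

# A looks at B during [0, 5] and [8, 12]; B looks at A during [3, 10]:
# the overlaps are [3, 5] and [8, 10], i.e., 4.0 s of mutual gaze.
```

The same coded intervals also yield glance frequency (number of episodes) and proportion of looking (summed durations over interaction length), which is why interval-level records are more reusable than raw frequency tallies.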
5.2 Methodologies for gaze studies

After determining the variables that will provide the desired data, the next question is where to conduct the study. Field studies elucidate naturally occurring gaze in public settings, while laboratory experiments are designed for greater precision of measurement and experimenter manipulation of relevant variables. Determining gaze in the field (e.g., in cars, at airports) is imprecise compared with measures using technical equipment, such as video recording, in laboratories. These difficulties have encouraged researchers to manipulate gaze patterns in the field rather than measure spontaneous gaze there. Such manipulation studies include participants helping a victim (Ellsworth and Langer 1976), giving money (Kleinke 1977), or complying with a request (Snyder, Grather, and Keller 1974). Field studies may offer better external validity than laboratory studies, but several difficulties must be overcome. LaFrance and Mayo (1976) had observers position themselves in public settings so that they could clearly tally gaze data from both dyad members. While inter-observer reliabilities may be sufficient using such crude measures, the introduction of recording equipment has enhanced reliability considerably. As in studies of other nonverbal behaviors, camera obtrusiveness can significantly alter gaze (Exline and Fehr 1982), but its effect can be partially remedied by creatively placed cameras or deceptively directed camera apertures (Eibl-Eibesfeldt 1989). The limited scope of observation can be partially compensated for by a wide-angle lens, and a telephoto lens can help resolve distance issues. A telephoto record of the image provides more detail in the region of interest; zooming in, on the computer, on an image taken with a standard lens compromises clarity because fewer pixels cover the desired region. Other potential problems need attention: visual acuity, illumination, subject movement out of camera range, and blocking of the subject by other individuals or objects. Laboratory studies reduce many disadvantages of field studies. In early investigations, gaze was coded by an observer who sat behind a one-way mirror directly behind the participant’s interlocutor (Argyle and Cook 1976). This “over the shoulder” approach permitted direct visualization of the participant’s gaze (Exline 1972). Gaze variables were tallied on paper, or with event recorders (Dovidio et al. 1988). While coder reaction time may introduce measurement error, reliabilities were quite substantial (Exline and Fehr 1982). Another important consideration is the number of participants. Most often, research has been conducted with dyads, but there are studies with three or more participants (Harrigan and Steffen 1983). A separate coder is assigned to each participant in live interactions (Guerrero 2005), or video recordings are made of each participant (Duncan and Fiske 1977), with split-screen technology to precisely synchronize gaze and speech (Argyle and Cook 1976). As in studies of other nonverbal behaviors, confederates have been used to present the same demeanor and visual patterns to each participant; they can be trained to control their gaze with remarkable precision (Exline 1972), or can be cued with unseen, slight shocks to the hand (Ellyson et al. 1980). Employing a confederate introduces the possibility that other confederate behaviors (e.g., smiling, nodding) can systematically bias the results (Guerrero and Le Poire 2005), and confederates have reported “affective reactions” when altering normal gaze by modifying head nods, gestures, and orientation (Exline and Fehr 1982). Monitoring can help reduce the potential effects of confederate discomfort, and the attendant arousal experienced by looking at or being looked at by another (Exline and Fehr 1982).
Using confederates permits greater control, but reduces spontaneity and introduces artificiality, distinct disadvantages for some research questions.
5.3 Reliability and validity of gaze measurement

Reliability estimates for gaze variables are often quite high for both live interactions and videotaped records (Argyle and Cook 1976; Exline and Fehr 1982). High reliability estimates are more likely when coding is based on videotape rather than live encounters because of the advantages of replay, slow-motion viewing, and resolution of measurement errors. Validity is more difficult to establish, as people do not typically fixate on one part of the face (Yarbus 1967). “Humans are not as accurate as desired in determining when others look them directly (italics added) in the eye(s)” (Exline and Fehr 1982: 122). Validity estimates for eye-directed gaze are considerably worse than for face-directed gaze (Exline and Fehr 1982). Judgment errors increase as the head deviates from a straight-on position, as distance between interactants increases, or as gaze duration decreases (Argyle and Cook 1976). There seems to
be a wide margin to the left and right of one’s face (“off-the-face gazes”) that is interpreted as gaze by another person. The issue of validity in determining eye-to-eye contact by a participant may be of little importance, as Kendon (1970) and others (Argyle 1970) have strongly suggested that participants in an interaction tend to look at each other’s faces or clearly look away. The meaning attributed to another’s gaze can be assessed by ratings or questionnaires completed by interacting confederates or other observers (Kleinke, Meeker, and La Fong 1974). Researchers are encouraged to peruse the stellar review of gaze methodology by Fehr and Exline (1987).
5.4 Technology in gaze measurement

When precise measurement of gaze is required, the setting will likely be a laboratory where devices can be implemented to accurately pinpoint gaze. Eye-tracking devices shine infrared light into the participant’s eye, and an infrared-sensitive camera records the reflections from the lens and cornea boundaries. The reflection off the retina pinpoints the position of the pupil, which can be videotaped; “bright pupil” reflection provides the highest accuracy. Since head movement tends to occur when people change their gaze, a stable head position is maintained using a chin rest and forehead bar to prevent movement. Pupil reflection is recorded by either a headset with tiny cameras (Moukheiber et al. 2010) or a table-mounted camera system (Talmi and Liu 1999). Fortunately, newer headset models are less restrictive, permitting a full range of motion (Nadig et al. 2010), and pupil-tracking algorithms calibrate eye orientation when head movement is unrestricted (Ronsse, White, and Lefevre 2007). Many eye-tracking studies recorded the gaze of one participant as he or she viewed a specific target. Eizenman and colleagues (2003) developed instrumentation and software analyses which showed remarkable specificity in fixations and glance durations using a high-resolution eye tracker placed behind a participant who receives the gaze of another participant. Nadig and colleagues (2010) measured gaze between autistic children and an interacting adult, and Vertegaal et al. (2001) measured gaze in groups of three to four participants using calibrated circles around interactants’ videotaped faces. An interactive computer system is needed to synchronize coordinated gaze and speech cues (Richardson and Dale 2005).
Elaborate devices may be superfluous, however, in light of earlier reports that in natural settings, interactants look at another’s face or well away; “where the receiver thinks the sender looks is more important than where the sender does precisely focus” (Exline 1972: 204). Strongman and Champness (1968) developed a “chance model” to predict the degree of mutual gaze using a probabilistic formula based on individual gaze patterns. While these predictions have been substantiated (Rutter et al. 1977), Exline and Fehr (1982: 115) comment that “another human is a socially significant event that captures far more of our attention…than a truly
chance model would predict,” and thus, the degree of mutual gaze is greater than one would predict by random confluence.
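Strongman and Champness’s chance baseline can be written down directly: under independence, the expected proportion of mutual gaze is the product of each partner’s individual looking proportions. A small sketch (function name mine):

```python
def expected_mutual_proportion(p_a: float, p_b: float) -> float:
    """Chance-model expectation: if persons A and B gaze at each other
    independently for proportions p_a and p_b of an interaction, mutual
    gaze should occupy roughly p_a * p_b of it.  Observed proportions
    above this baseline suggest coordinated, non-random gaze."""
    return p_a * p_b

# If each partner looks about 60% of the time, independence predicts
# mutual gaze about 36% of the time; observed values reliably exceed this.
```

Comparing an observed mutual-gaze proportion against this product is what allows the conclusion quoted above, that real interactants exceed the chance prediction.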
6 Kinesics

Body movement research methodology suffers from the lack of a well-defined, organized coding method that fits a conceptual and theoretical framework. A few coding systems have been proposed. Labanotation (Laban 1975) is a movement notation system designed specifically for dance, but it has been used to record body actions over time. Birdwhistell (1970), a pioneer in the study of body movement, patterned his coding system on linguistic principles in which “kinemes,” the most elementary units (like phonemes), are combined into “kinemorphs” (analogous to morphemes), and then into “kinemorphic constructions” (analogous to sentences). One coding system that has received further development since its inception is the Bernese system (Frey and Pool 1976), which uses the Cartesian axes of horizontal, vertical, and depth as spatial parameters to code movement from moment to moment. It purports to cover all possible movements by assigning a numerical code for each movement or “deviation from normal”; for example, a head tilt to the left and down is coded as the deviation from head upright and facing forward (i.e., “normal”). Of these three coding strategies, the Bernese system shows the best reliability, but each suffers considerable coding challenges: a large number of arbitrary, non-intuitive symbols; the time-consuming nature of the coding process; and the isolation of body movements that may function as a unit (e.g., crossed arms, lean away, and less direct frontal orientation to connote “rejection”). One reason for the failure to develop an adequate coding strategy for head and body movement may be the intimidating fact that humans display a rich mosaic of body movements in a rather constant state of change. In addition to the sheer number of possible body actions and postures, there are the versatility, subtlety, and speed of movement, and the interactive quality of various actions and positions. Coding is manageable, however, because of three key features of body movement.
First, the “body tableau” is composed of a modest number of moveable parts. Legs, arms, and trunk are primarily involved in body positioning and may reflect affect or attitude; actions of the shoulder or elbow may be relevant more specifically to the display of affect (e.g., shoulder shrugging, elbow jabbing toward another person). The head and hands are responsible for the most movement, and have received the most research attention. A second feature that helps make coding manageable is that while many actions and positions are anatomically possible, some rarely, if ever, occur (e.g., conversing with others from an extreme backward trunk lean). Social conventions, “display rules,” guide our behavior through culturally learned rules that govern “…when it is appropriate to express an emotion and to whom one can reveal one’s feelings”
(Ekman and Rosenberg 1997: 10). This maxim applies to body movements as atypical actions and positions have been regarded as diagnostic with respect to mental stability (e.g., catatonic positioning in schizophrenia) or level of intellectual functioning (e.g., body rocking in mental retardation) (APA 2000; Goffman 1963). A final feature that modifies coding intricacy is that body movements often occur simultaneously or in sequence, and many co-occur with facial and vocal behaviors. Movements with such a temporal relationship are more visible than single movements, thus reducing omission errors (i.e., not coding a behavior that occurred), and provide information on the function of movement patterns.
6.1 Kinesic variables

Body movements can be distinguished as actions and positions (Harrigan 2005). Coding strategies have been focused primarily on action behaviors with a relatively distinct onset (i.e., beginning) and offset (i.e., end). Actions involve the head, hands, shoulders, and feet (e.g., nodding, gesturing, kicking) and often are considered expressive of affect, attitude, or intention. Positions are associated with aligning the body and are recorded as a beginning position and whenever a change in configuration occurs; one’s body is always in a position, with torso, arms, and legs arranged. Positions are larger units, change less frequently, can be more easily codified, and usually are described with reference to an interactant. Individual position changes tend not to occur in isolation, and often can be considered as a unit. For example, a shift in trunk lean usually affects arm, and sometimes leg, positions. Self-synchrony assumes the coordinated interaction of an individual’s body movements, and interactional synchrony describes coincident movements between interactants. Considerable research exists on both types of synchrony (Bernieri, Reznick, and Rosenthal 1988; Condon and Sander 1974). For a more thorough explication of synchrony measurements see also Chapter 18 (Lakin, this volume). Positions provide information regarding attention, interest, and attitude. They may reflect the degree of tension an individual is experiencing and something about emotion intensity, but carry little information about specific affect (Ekman and Friesen 1974).
6.1.1 Body actions

Ekman and Friesen (1969) described several categories of nonverbal behavior based on a theoretical framework; these have been widely adopted by researchers. The first category is emblems, a term adopted from Efron’s (1941) impressive study of hand actions. Emblems are symbolic actions with a “specific verbal translation known to most members of a subculture, and [are] typically intended to send a message” (Ekman and Friesen 1977: 38). They include head nods and hand movements like pointing, waving, the “OK” and other signs, and shoulder shrugs.
A second category is illustrators, “movements directly tied to speech, serving to illustrate what is being said verbally” (Ekman and Friesen 1969: 68). These hand actions accent or emphasize a word or phrase (batons), draw the shape of the referent (pictographs), sketch a path or direction of thought (ideographs), depict a bodily action (kinetographs), point to objects (deictics), or show a spatial relationship (spatials). Illustrators are displayed with little direct awareness or intention, and generally have negligible meaning separate from speech (Krauss, Morrel-Samuels, and Colasante 1991). Much research has been focused on the function of illustrators in speech, where they are most often exhibited by speakers to aid listener comprehension (Cohen 1977; Kendon 1994), but this notion is far from clear (Rimé and Schiaratura 1991). Interestingly, illustrators were exhibited spontaneously by congenitally blind individuals while speaking (Blass, Freedman, and Steingart 1974). Krauss and colleagues have conducted extensive studies on these movements in relation to semantic representations, word retrieval, and hesitation phenomena (Hadar et al. 1998; Krauss, Chen, and Chawla 1996). Bavelas and colleagues (Bavelas and Gerwing 2007) offer an intriguing model of the function of gestures which emphasizes the role of the addressee in affecting the form of the speaker’s gesture: e.g., the degree of shared information (Gerwing and Bavelas 2004), or redundancy in linguistic content (Bavelas et al. 2011). Finally, illustrators affect observers’ impressions of the encoder, and their frequency is related to psychopathology, deception, and personality ratings (Ekman and Friesen 1977). The third category is self-adaptors, in which one part of the body contacts another body part, such as when grooming, scratching one’s head, or rubbing the hands together (Ekman 1977).
The function of self-adaptors is considered an affective one: an attempt to cope with feelings, relieve self or bodily needs, or comfort, irritate, or release emotional arousal (Ekman and Friesen 1969). These are usually displayed “with little awareness, without the deliberate intent to communicate a message” (Ekman and Friesen 1977: 39), and are thought to convey some diffuse information about the encoder’s emotional state, pathology, deceptiveness, and general personality traits (Ekman and Friesen 1977). Self-adaptors reveal unintended “emotional leakage” betraying aroused affect, and have been associated with anxiety, guilt, hostility, and suspiciousness (Ekman and Friesen 1974). However, those who display self-adaptors also have been rated very positively (Harrigan et al. 1987). A critical point regarding self-adaptors is the importance of the location, temporal patterning, and type of self-adaptor (Goldberg and Rosenthal 1986). For instance, hand-to-hand rubbing by job interviewees and patients was judged as more appropriate than by friends or strangers (Harrigan et al. 1991). Freedman and colleagues (Barroso et al. 1978) developed a compelling cognitive-processing theoretical view for both illustrators and self-adaptors: illustrators help “buttress the clarity of the image,” connecting the image and word, while self-adaptors help maintain the focus of attention and organization of thought (Freedman 1977). The final category, regulators, includes actions with no explicit meaning in themselves that help “maintain and regulate the back-and-forth” flow of conversation between speakers and listeners (Ekman and Friesen 1969: 82). Regulators include listener responses such as head nods, eye contact, postural shifts, eyebrow movements, and hand movements as “floor holders” (Ekman 2004). Conversational exchange behaviors have received much study; see Feldstein and Welkowitz (1987) and Rosenfeld (1987) for reviews, as well as Chapter 16 (Patterson, this volume). While work by Bavelas and colleagues has contributed much to our knowledge of illustrators and their interaction with speech and facial actions, the area of hand movements in general has evolved slowly, and much remains rudimentary with respect to description, coding systems, specific methodology, function, and conceptualization. Another important type of body action, in addition to those mentioned above, is touch, where one person touches another. Research studies on touch are many and include greetings and farewells, intimate encounters, providing comfort or service, and aggressive conflicts (Knapp and Hall 2010; Jones 1994). Touch studies include gender differences (J.A. Hall and Veccia 1990), cultural differences (Nail, Harton, and Decker 2003), and therapeutic contact (Stenzel and Rupert 2004). For example, increased touch to premature infants resulted in profound changes in weight gain and developmental advances (Field 2001). Andersen and Guerrero (2005) created the Body Chart for recording touch in natural and lab settings. For touch studies, several factors need to be considered: the types of touching (Argyle 1975; Heslin and Alper 1983); the location of touch (Morris 1977); and the meanings of touch and the characteristics of the toucher and touchee (Knapp and Hall 2010).
6.1.2 Body positions

Unfortunately, most studies on body position offer incomplete descriptions of the coded behavior, and much work remains on specifying body positions. These include: overall posture (i.e., sitting, standing, lying), trunk or frontal orientation (i.e., facing, turned away), trunk lean (i.e., forward, straight, backward, sideways), and arm, leg, and foot positions (e.g., folded arms, uncrossed legs, feet under chair). Trunk lean refers to the angle of the trunk with respect to the hips, and is most often coded from a seated posture, although “upper body lean” has been noted for standing postures (Mehrabian 1968). Trunk lean is referenced with respect to an interactant, and can be upright or erect (i.e., head and shoulders in a vertical line over the hips), forward (i.e., head and shoulders forward of upright relative to the hips), or backward (i.e., head and shoulders backward of upright relative to the hips). Lean has been further defined using a range from a five- to a 45-degree angle from upright (Fairbanks, McGuire, and Harris 1982). Some researchers have included a sideways turn of the trunk in which the shoulders are turned to the left or right (Vrij 1994). Trunk orientation is most often coded as a range in which the encoder is directly facing (i.e., zero degrees) or facing away from (i.e., turned at a right angle, 90 degrees) another person (Cappella and Green 1984). Orientation
also has been based on the shoulders’ alignment with the plane of the encoder’s seat edge, or with the plane of the interactant’s shoulders (Bernieri and Gillis 1995). A lack of definition for arm and leg positions is typical of many studies: coding of the arms, legs, and feet focuses on movement frequencies, or notes arm and leg positions as open, symmetrical, or relaxed without defining these terms. Some researchers coded specific types of arm positions such as arms akimbo or folded arms; Mabry (1989) defined five specific arm positions. Bente (1989) described arm movements with respect to horizontal, vertical, and forward or backward axes. Harrigan and Carney (2005) coded arm and leg positions with symbols representing the configuration, e.g., arms folded, hands resting together in lap, legs crossed ankle on knee, feet beneath the chair, etc. Frequencies of postural shifts also have been reported, but definitional specificity was lacking; these are sometimes defined as any change in posture, or only as leg movements (Vrij 1994). Distinct posture definitions are represented by Hewes (1957), who exhaustively described the world distribution of postures for sitting and standing.
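As an illustration of how such position definitions can be operationalized, the following sketch codes trunk lean from a measured angle. The five-degree “upright” band and the sign convention (positive = toward the interactant) are assumptions for the example, not a published standard:

```python
def code_trunk_lean(angle_deg: float, threshold: float = 5.0) -> str:
    """Code trunk lean from the trunk's angle relative to upright.

    angle_deg -- trunk angle from vertical over the hips;
                 positive = toward the interactant (assumed convention)
    threshold -- half-width of the band still coded as 'upright'
                 (5 degrees here, an illustrative cutoff)
    """
    if angle_deg > threshold:
        return "forward"
    if angle_deg < -threshold:
        return "backward"
    return "upright"
```

Making the cutoff an explicit parameter is the point: coders applying a verbal definition like “forward lean” will disagree precisely in the boundary zone that a numeric threshold settles.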
6.1.3 Head movements

Not surprisingly, the most typical action counted when coding head movement is nodding, but others are shaking, tilting (i.e., head drawn toward shoulder), and the turning movements associated with gaze changes or the slight movements which occur when speaking or listening (Kendon 1970). Still other behaviors have been counted, though ambiguously defined: dipping, bobbing, tossing, thrusting, and dropping. Researchers define nodding based on the type and direction of movement: cyclical or continuous, up/downward or forward/backward motions on the vertical or sagittal plane (Noller 2005). Definitional reference points for nods, shakes, and tilts can be based on imaginary lines drawn horizontally across the tip of the nose, and vertically from the top of the face to the chin, bisecting the nose; a nod is coded when the nose crosses the horizontal line and a shake when the nose crosses the vertical line (Harrigan and Carney 2005). A head tilt draws the head toward the shoulder, a head dip draws the chin toward the chest without the upward lift of a nod, and a head toss is an abrupt upward lift of the chin without the subsequent downward movement of a nod. All of these actions vary in intensity, breadth, and frequency, ranging from fast, vigorous, long head nods to slow, subtle, narrow head shakes. Counting only the frequency of these behaviors may not capture the qualitative variations which may provide valuable information about the function, meaning, or intent of the behavior. While a review of the functions of head movements is not within the scope of this chapter, a few highlighted studies may show the range of research questions that have addressed head movements. A substantial literature demonstrates the powerful reinforcing relationship between interviewer head nodding and information provided by clients (Matarazzo et al. 1964). Nodding has been displayed in
both giving and seeking approval, and in persuasion (Rosenfeld 1987). Head shaking was related to memory for negatively valenced words (Förster and Strack 1996), and produced prosocial feelings toward an individual who described a negative event (Tamir et al. 2004). Rhythmic head movements improved speech perception (Munhall et al. 2004), and were linked to suprasegmental features of speech (e.g., stress, amplitude) (Hadar et al. 1983). Studies on head tilting are rare (Noller and Callan 1989).
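The reference-line rule for distinguishing nods from shakes described above can be sketched computationally. The sketch assumes a coder or tracking system supplies frame-by-frame nose-tip coordinates expressed relative to the two imaginary lines; that data format is an assumption for illustration, not part of the published coding scheme:

```python
def classify_head_movement(nose_y, nose_x, h_line=0.0, v_line=0.0):
    """Classify a head movement from a nose-tip trajectory.

    A nod is coded when the nose crosses the horizontal reference line;
    a shake when it crosses the vertical line (Harrigan and Carney 2005).
    Coordinates are assumed to be relative to the reference lines.
    """
    def crossings(series, line):
        # Count sign changes of the nose position relative to the line.
        signs = [p > line for p in series]
        return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

    nods = crossings(nose_y, h_line)
    shakes = crossings(nose_x, v_line)
    if nods > shakes:
        return "nod"
    if shakes > nods:
        return "shake"
    return "ambiguous"

# Nose dips below and returns above the horizontal line: a nod.
print(classify_head_movement(nose_y=[1, -1, 1, -1, 1], nose_x=[0.1] * 5))
# nod
```

Counting crossings rather than single threshold events also gives a crude handle on the breadth and repetition of the movement, which pure frequency counts discard.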
6.2 Training coders and determining reliability
The most common method of coding body movements is the use of trained human observers. The researcher begins with clear definitions and parameters for each behavior to be coded, and trains coders to recognize the behaviors and code their occurrence. Coders view samples of the participants' behavior and record the designated behaviors as frequencies or durations, by time of occurrence, or in relation to another feature of the interaction (e.g., speaking turn, greeting). Most often behavior is coded from videotaped interactions, which permit the viewing and reviewing necessary to establish a high level of accuracy and confidence in the coded behaviors. After initial training and practice, the coders' data are checked for reliability and, if necessary, clarification and re-training are instituted. When acceptable reliabilities are established, each coder completes the coding independently. Continued reliability checks throughout the coding are encouraged to maintain a high level of accuracy. Acceptable reliability thresholds are often set at .80 or better, whether based on interrater agreement, rho, percent agreement, or Pearson or kappa coefficients (Baesler and Burgoon 1987). In some studies two or more coders recorded all behaviors for all participants, but typically only 10 to 25% of the sample is coded by the same two coders (Duncan and Fiske 1977). Reliability can be ascertained in several ways: percent agreement, Spearman's rho, Pearson's r, Ebel's or Winer's interrater analysis using the intraclass correlation, and Rosenthal's (1987) application of the Spearman-Brown formula. Cohen's kappa corrects for chance agreement and is preferred over percent agreement (Bakeman 2005). Baesler and Burgoon (1987) conducted a thorough evaluation of reliability measures for nonverbal behavior and found very high median reliabilities for all categories of movement.
Important considerations regarding reliability are thoroughly discussed in Rosenthal (2005), and are relevant for all the categories of nonverbal behavior.
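For two coders who have each assigned one code per behavioral unit, percent agreement and Cohen's kappa can be computed as follows. This is a generic sketch of the standard formulas, not a particular published implementation; the example codes are invented:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of units on which two coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement corrected for chance (Bakeman 2005)."""
    n = len(coder_a)
    p_observed = percent_agreement(coder_a, coder_b)
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: summed products of each code's marginal proportions.
    p_chance = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (p_observed - p_chance) / (1 - p_chance)

a = ["nod", "nod", "shake", "tilt", "nod", "shake"]
b = ["nod", "nod", "shake", "nod",  "nod", "tilt"]
print(round(percent_agreement(a, b), 2))  # 0.67
print(round(cohens_kappa(a, b), 2))       # 0.43
```

The gap between the two values (.67 versus .43) illustrates why kappa is preferred: part of the raw agreement here is attributable to both coders using "nod" frequently.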
6.3 Technology for kinesics
Frey and Pool (1976) inserted a cross-hair device onto a videotape recording for coders to locate various movements with reference to vertical and horizontal lines.
The Bernese system has been adapted for computer application; using it, trained coders could draw 3-D characters based on data protocols of models' body movement coded from the original videotaped interactions with the Bernese system (Bente et al. 2001). Transducers (i.e., small ultrasonic devices) attached to parts of the body can show receiver-transducer distances and plot three-dimensional positions or movement (Altorfer et al. 2000). Blascovich et al. (2002) and others (Guye-Vuillème et al. 1999) studied behavior using computer-generated "immersive virtual environments" (IVEs). Technology for coding head movements also is available. Hadar et al. (1985) used a polarized light goniometer (i.e., a device that measures relationships among moving body parts) to systematically record up/down and left/right cycles in head nodding and shaking, and related these to conversational behaviors (i.e., listener responses, speaker-turn attempts, speech stress). Older methods for synchronizing speech, body movement, and acoustic data required aligning videotaped information and speech transcripts together with a superimposed time clock. Almost unimaginably, with reel-to-reel videotapes a researcher had to turn the reels slowly back and forth to see the exact movement onset and offset. Stopwatches had been used to record the length of a movement or utterance (Duncan and Fiske 1977). A great boon to coding nonverbal behavior and speech is Kipp's (2003) ANVIL, a software framework for digitized audiovisual data, which uses time-anchored embedded slots for the coded data (e.g., linguistic, acoustic, gestural) that can subsequently be analyzed across channels (i.e., vocal, kinesic, verbal). This system permits ease in transcribing human behavior in temporal alignment with speech and other audio cues.
Current computer software allows researchers to build a coding scheme with defined behaviors, collected from video recordings, and synchronized with other data (e.g., verbalizations, physiological measures) which can then be analyzed (e.g., Noldus Information Technology 2010; http://www.noldus.com/files/actions/2010_observer/observer_xt_hu.html).
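The time-anchored, multi-channel annotation idea behind tools such as ANVIL and The Observer can be sketched as a simple data structure. The track names and codes below are illustrative only and do not reproduce the actual file format of either tool:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    track: str      # channel, e.g. "kinesic", "vocal", "verbal"
    code: str       # coded behavior, e.g. "head_nod"
    start: float    # onset, in seconds from recording start
    end: float      # offset, in seconds

def cooccurring(annotations, track_a, track_b):
    """Yield pairs of codes on two tracks whose time intervals overlap,
    allowing coded behavior to be analyzed across channels."""
    for a in annotations:
        if a.track != track_a:
            continue
        for b in annotations:
            if b.track == track_b and a.start < b.end and b.start < a.end:
                yield a.code, b.code

data = [
    Annotation("kinesic", "head_nod", 2.0, 2.8),
    Annotation("vocal", "listener_response", 2.5, 3.1),
    Annotation("verbal", "mm-hmm", 2.5, 2.9),
]
print(list(cooccurring(data, "kinesic", "vocal")))
# [('head_nod', 'listener_response')]
```

Anchoring every code to onset and offset times is what makes cross-channel queries like this one possible, in contrast to simple frequency tallies.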
6.4 Research considerations
The usual encounter for coding body movements is a dyad, but some studies involve groups (Altorfer et al. 1992). Settings have included therapy interactions, employment interviews, and conversations between friends or strangers in laboratory, educational, or public environments. The critical determinant of which movements are manipulated or quantified is the research question. Whether intentional or not, many body actions are expressive and can reveal personal attributes (e.g., warm, impulsive) and motivations (e.g., interest, competitiveness). Body positions offer information about attitude, status, and degree of affiliation based on position in relation to another person: e.g., leaning forward versus sitting turned away. One category of movement may be more important than another: e.g., turn-taking studies include head and hand actions, while those concerned with cultural similarity may involve emblems of the hands and head. A guide for the selection of actions and positions is suggested: 1) when research questions are focused on the more enduring qualities of the interaction (e.g., status), positions provide initial impressions, and 2) for questions regarding characteristics that change from moment to moment (i.e., dynamic qualities), expressive hand and head actions may be most evocative of affect, attitude, attention, and other social interactive behaviors. The critical determination is how the specific research question is reflected in movement. Positions can be coded individually, but may contribute more information when treated as a unit (e.g., change in trunk lean and repositioning of arms and legs), as positions and actions often work together (Costa et al. 2001).
References
Adams, R. B., Jr. and R. E. Kleck 2003. Perceived gaze direction and the processing of facial displays of emotion. Psychological Science 14: 644–647.
Aiello, J. R. 1987. Human spatial behavior. In: D. Stokols and I. Altman (eds.), Handbook of Environmental Psychology, Vol. 1: 389–504. New York: Wiley.
Aiello, J. R. and S. E. Jones 1971. Field study of the proxemic behavior of young children in three subcultural groups. Journal of Personality and Social Psychology 19: 351–356.
Altman, I. 1975. The Environment and Social Behavior. Monterey, CA: Brooks Cole.
Altman, I. and A. M. Vinsel 1977. Personal space: An analysis of E. T. Hall's proxemics framework. In: I. Altman and J. F. Wohlwill (eds.), Human Behavior and the Environment: Advances in Theory and Research, Vol. 2: 181–259. New York: Plenum.
Altorfer, A., M. J. Goldstein, D. J. Miklowitz, and K. H. Nuechterlein 1992. Stress-indicative patterns of non-verbal behaviour: Their role in family interaction. British Journal of Psychiatry 161: 103–113.
Altorfer, A., S. Jossen, O. Wurmle, M. L. Kasermann, K. Foppa, and H. Zimmerman 2000. Measurement and meaning of head movements in everyday face-to-face communicative interaction. Behavior Research Methods, Instruments, and Computers 32: 17–32.
Ambady, N., D. A. LaPlante, T. Nguyen, R. Rosenthal, N. Chaumeton, and W. Levinson 2002. Surgeons' tone of voice: A clue to malpractice history. Surgery 132: 5–9.
American Psychiatric Association 2000. Diagnostic and Statistical Manual of Mental Disorders (4th ed.). Washington, DC: American Psychiatric Association.
Ancoli, S. 1979. Psychophysiological Response Patterns of Emotion. San Francisco, CA: University of California San Francisco.
Andersen, P. A. and L. K. Guerrero 2005. Measuring live tactile interaction: The Body Chart Coding Approach. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 83–92. Mahwah, NJ: Lawrence Erlbaum Associates.
Argyle, M. 1970. Eye-contact and distance: A reply to Stephenson and Rutter. British Journal of Psychology 61: 395–396.
Argyle, M. 1975. Bodily Communication. New York: International Universities Press.
Argyle, M. and J. Dean 1965. Eye-contact, distance and affiliation. Sociometry 28: 289–304.
Argyle, M. and M. Cook 1976. Gaze and Mutual Gaze. Cambridge, UK: Cambridge University Press.
Aronow, E., M. Reznikoff, and W. W. Tryon 1975. The interpersonal distance of process and reactive schizophrenics. Journal of Consulting and Clinical Psychology 43: 94.
Ashton, N. L. and M. E. Shaw 1980. Empirical investigations of a reconceptualized personal space. Bulletin of the Psychonomic Society 15: 309–312.
Baesler, E. J. and J. K. Burgoon 1987. Measurement and reliability of nonverbal behavior. Journal of Nonverbal Behavior 11: 205–233.
Bailenson, J. N., J. Blascovich, A. C. Beall, and J. M. Loomis 2003. Interpersonal distance in immersive virtual environments. Personality and Social Psychology Bulletin 29: 819–833.
Bakeman, R. 2005. Analysis of coded nonverbal behavior. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 375–381. Mahwah, NJ: Lawrence Erlbaum Associates.
Barnard, W. A. and P. A. Bell 1982. An unobtrusive apparatus for measuring interpersonal distance. Journal of General Psychology 107: 85–90.
Barroso, F., N. Freedman, S. Grand, and J. V. Meel 1978. Evocation of two types of hand movements in information processing. Journal of Experimental Psychology: Human Perception and Performance 4: 321–329.
Bavelas, J. and J. Gerwing 2007. Conversational hand gestures and facial displays in face-to-face dialogue. In: K. Fiedler (ed.), Social Communication, 283–308. New York: Psychology Press.
Bavelas, J., J. Gerwing, M. Allison, and C. Sutton 2011. Dyadic evidence for grounding with abstract deictic gestures. In: G. Stam and M. Ishino (eds.), Integrating Gestures: The Interdisciplinary Nature of Gesture, 49–60. Amsterdam: John Benjamins Publishing Co.
Beaulieu, C. M. J. 2004. Intercultural study of personal space: A case study. Journal of Applied Social Psychology 34: 794–805.
Bente, G. 1989. Facilities for the graphical computer simulation of head and body movements. Behavior Research Methods, Instruments, and Computers 21: 455–462.
Bente, G., N. C. Kramer, A. Petersen, and J. P. de Ruiter 2001. Computer animated movement and person perception: Methodological advances in nonverbal behavioral research. Journal of Nonverbal Behavior 25: 151–166.
Bernieri, F. J. and J. A. Gillis 1995. The judgment of rapport: A cross-cultural comparison between Americans and Greeks. Journal of Nonverbal Behavior 19: 115–130.
Bernieri, F. J., S. Reznick, and R. Rosenthal 1988. Synchrony, pseudosynchrony, and dissynchrony: Measuring the entrainment process in mother-infant interactions. Journal of Personality and Social Psychology 54: 243–253.
Birdwhistell, R. L. 1970. Introduction to Kinesics. Louisville, KY: University of Louisville.
Blascovich, J. J., J. M. Loomis, A. Beall, K. R. Swinth, C. L. Hoyt, and J. N. Bailenson 2002. Immersive virtual environment technology as a methodological tool for social psychology. Psychological Inquiry 13: 103–124.
Blass, T., N. Freedman, and I. Steingart 1974. Body movement and verbal encoding in the congenitally blind. Perceptual and Motor Skills 39: 279–293.
Blurton Jones, N. G. 1971. Criteria for use in describing facial expression in children. Human Biology 41: 365–413.
Brown, C. E. 1981. Shared space invasion and race. Personality and Social Psychology Bulletin 7: 103–108.
Bugental, D. E., J. W. Kaswan, and L. R. Love 1970. Perception of contradictory meanings conveyed by verbal and nonverbal channels. Journal of Personality and Social Psychology 16: 647–650.
Buller, D. B. 2005. Methods for measuring speech rate. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 317–323. Mahwah, NJ: Lawrence Erlbaum Associates.
Burgess, W. J. 1983. Developmental trends in proxemic spacing behavior between surrounding companions and strangers in casual groups. Journal of Nonverbal Behavior 7: 158–169.
Burgoon, J. K. and E. J. Baesler 1991. Choosing between micro and macro nonverbal measurement: Application to selected vocalic and kinesic indices. Journal of Nonverbal Behavior 15: 57–78.
Burgoon, J. K., B. A. Le Poire, L. E. Beutler, J. J. Bergan, and D. Engle 1992. Nonverbal behaviors as indices of arousal: Extension to the psychotherapy context. Journal of Nonverbal Behavior 16: 159–178.
Cacioppo, J. T., G. G. Berntson, J. L. Larsen, K. M. Poehlmann, and T. A. Ito 2000. The physiology of emotion. In: M. Lewis and J. M. Haviland-Jones (eds.), Handbook of Emotions, 173–191. New York: The Guilford Press.
Cacioppo, J. T., W. L. Gardner, and G. G. Berntson 1999. The affect system has parallel and integrative processing components: Form follows function. Journal of Personality and Social Psychology 76: 839–855.
Cappella, J. and J. O. Green 1984. The effects of distance and individual differences in arousability on nonverbal involvement: A test of Discrepancy-Arousal Theory. Journal of Nonverbal Behavior 8: 259–286.
Carney, R. M., B. A. Hong, M. F. O'Connell, and H. Amado 1981. Facial electromyography as a predictor of treatment outcome in depression. British Journal of Psychiatry 138: 485–489.
Chomsky, N. 1965. Aspects of the Theory of Syntax. Cambridge, MA: M.I.T. Press.
Cohen, A. A. 1977. The communicative functions of hand illustrators. Journal of Communication 27: 54–63.
Cohn, J. F. 2010. Advances in behavioral science using automated facial image analysis and synthesis. IEEE Social Signal Processing Magazine 128: 128–133.
Cohn, J. F. and P. Ekman 2005. Measuring facial action. In: J. A. Harrigan, R. Rosenthal, and K. R. Scherer (eds.), The New Handbook of Methods in Nonverbal Behavior Research, 9–64. Oxford, UK: Oxford University Press.
Cohn, J. F. and T. Kanade 2007. Use of automated facial image analysis for measurement of emotion expression. In: J. A. Coan and J. J. B. Allen (eds.), The Handbook of Emotion Elicitation and Assessment, 222–238. New York: Oxford University Press.
Cohn, J. F., A. J. Zlochower, J. J. J. Lien, and T. Kanade 1999. Automated facial analysis by feature point tracking has high concurrent validity with manual FACS coding. Psychophysiology 36: 35–43.
Condon, W. S. and L. W. Sander 1974. Synchrony demonstrated between movements of the neonate and adult speech. Child Development 45: 456–462.
Costa, M. 2010. Interpersonal distances in group walking. Journal of Nonverbal Behavior 34: 15–26.
Costa, M., W. Dinsbach, A. S. Mansfield, and P. E. Ricci Bitti 2001. Social presence, embarrassment, and nonverbal behavior. Journal of Nonverbal Behavior 25: 225–240.
Desor, J. A. 1972. Toward a psychological theory of crowding. Journal of Personality and Social Psychology 21: 79–83.
Dimberg, U., M. Thunberg, and S. Grundel 2002. Facial reactions to emotional stimuli: Automatically controlled emotional responses. Cognition and Emotion 16: 449–471.
Dittmann, A. T. 1987. Conversational control functions of nonverbal behavior. In: A. W. Siegman and S. Feldstein (eds.), Nonverbal Behavior and Communication, 563–601. Hillsdale, NJ: Lawrence Erlbaum Associates.
Dovidio, J. F., S. L. Ellyson, C. F. Keating, K. Heltman, and C. E. Brown 1988. The relationship of social power to visual displays of dominance between men and women. Journal of Personality and Social Psychology 54: 233–242.
Duke, M. P. and S. Nowicki, Jr. 1972. A new measure and social-learning model for interpersonal distance. Journal of Experimental Research in Personality 6: 119–132.
Duncan, S. D., Jr. and D. W. Fiske 1977. Face-to-Face Interaction: Research, Methods, and Theory. Hillsdale, NJ: Lawrence Erlbaum Associates.
Edelman, R. and S. Hampson 1979. Changes in non-verbal behavior during embarrassment. British Journal of Social and Clinical Psychology 18: 385–390.
Efron, D. 1941. Gesture and Environment. New York: King's Crown Press. (Republished as Gesture, Race and Culture. 1972. The Hague: Mouton.)
Eibl-Eibesfeldt, I. 1989. Human Ethology. New York: Aldine de Gruyter.
Eizenman, M., L. H. Yu, L. Grupp, E. Eizenman, M. Ellenbogen, M. Gemar, and R. D. Levitan 2003. A naturalistic visual scanning approach to assess selective attention in major depressive disorder. Psychiatry Research 118: 117–128.
Ekman, P. 2004. Emotional and conversational nonverbal signals. In: J. M. Larrazabal and L. A. Pérez (eds.), Language, Knowledge, and Representation, 39–50. Netherlands: Kluwer Academic Publishers.
Ekman, P. and W. V. Friesen 1969. The repertoire of nonverbal behavior: Categories, origins, usage, and coding. Semiotica 1: 49–98.
Ekman, P. and W. V. Friesen 1971. Constants across cultures in the face and emotion. Journal of Personality and Social Psychology 17: 124–129.
Ekman, P. and W. V. Friesen 1974. Nonverbal behavior and psychopathology. In: R. J. Friedman and M. M. Katz (eds.), The Psychology of Depression: Contemporary Theory and Research, 203–232. Washington, DC: Winston & Sons.
Ekman, P. and W. V. Friesen 1977. Nonverbal behavior. In: P. F. Ostwald (ed.), Communication and Social Interaction, 37–45. New York: Grune and Stratton.
Ekman, P. and W. V. Friesen 1978. Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, CA: Consulting Psychologists Press.
Ekman, P. and W. V. Friesen 1982. Felt, false, and miserable smiles. Journal of Nonverbal Behavior 6: 238–252.
Ekman, P., W. V. Friesen, and I. S. Ancoli 1980. Facial signs of emotional experience. Journal of Personality and Social Psychology 39: 1125–1134.
Ekman, P., W. V. Friesen, and J. C. Hager 2002. Facial Action Coding System. Salt Lake City, UT: Research Nexus, Network Research Information.
Ekman, P. and E. L. Rosenberg 1997. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). New York: Oxford University Press.
Ekman, P. and E. L. Rosenberg 2005. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS) (2nd ed.). New York: Oxford University Press.
Ekman, P., G. Schwartz, and W. V. Friesen 1978. Electrical and visible signs of facial action. San Francisco, CA: Human Interaction Laboratory, University of California San Francisco.
Elfenbein, H. A. and N. Ambady 2002. On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin 128: 203–235.
Ellsworth, P. C. and E. J. Langer 1976. Staring and approach: An interpretation of the stare as a nonspecific activator. Journal of Personality and Social Psychology 33: 117–122.
Ellyson, S. L., J. F. Dovidio, R. L. Corson, and D. L. Vinicur 1980. Visual dominance behavior in female dyads: Situational and personality factors. Social Psychology Quarterly 43: 328–336.
Ermiane, R. and E. Gergerian 1978. Atlas of Facial Expressions (Album des expressions du visage). Paris: La Pensée Universelle.
Exline, R. V. 1972. The glances of power and preference. In: J. K. Cole (ed.), Nebraska Symposium on Motivation, Vol. 19: 163–206. Lincoln, NE: University of Nebraska Press.
Exline, R. V. and B. J. Fehr 1982. The assessment of gaze. In: K. R. Scherer and P. Ekman (eds.), Handbook of Methods in Nonverbal Behavior Research, 91–135. Cambridge, UK: Cambridge University Press.
Fairbanks, L. A., M. T. McGuire, and C. J. Harris 1982. Nonverbal interaction of patients and therapists during psychiatric interviews. Journal of Abnormal Psychology 91: 109–119.
Faraday, P. and A. Sutcliffe 1998. Making contact points between text and images. ACM Multimedia 1998, 29–37.
Fast, J. 1970. Body Language. New York: Pocket Books.
Fehr, B. J. and R. V. Exline 1987. Social visual interaction. In: A. W. Siegman and S. Feldstein (eds.), Nonverbal Behavior and Communication (2nd ed.), 225–326. Hillsdale, NJ: Lawrence Erlbaum Associates.
Feldstein, S. and J. Welkowitz 1987. A chronology of conversation: In defense of an objective approach. In: A. W. Siegman and S. Feldstein (eds.), Nonverbal Behavior and Communication (2nd ed.), 435–499. Hillsdale, NJ: Lawrence Erlbaum Associates.
Field, T. 2001. Touch. Cambridge, MA: MIT Press.
Förster, J. A. and F. Strack 1996. Influence of overt head movements on memory for valenced words: A case of conceptual-motor compatibility. Journal of Personality and Social Psychology 71: 421–430.
Freedman, N. 1977. Hands, words, and mind: On the structuralization of body movements during discourse and the capacity for verbal representation. In: N. Freedman and S. Grand (eds.), Communicative Structures and Psychic Structures, 109–132. New York: Plenum.
Friesen, W. V. and P. Ekman 1984. EMFACS-7: Emotional Facial Action Coding System. Unpublished manuscript. San Francisco, CA: University of California.
Frey, S. and J. Pool 1976. A new approach to the analysis of visible behavior. Berne: Research Reports from the Department of Psychology at the University of Berne.
Fridlund, A. J. and J. T. Cacioppo 1986. Guidelines for human electromyographic research. Psychophysiology 23: 567–589.
Frois-Wittmann, J. 1930. The judgment of facial expression. Journal of Experimental Psychology 13: 113–151.
Gallois, C. and V. J. Callan 1986. Decoding emotional messages: Influence of ethnicity, sex, message type, and channel. Journal of Personality and Social Psychology 51: 755–762.
Gerwing, J. and J. Bavelas 2004. Linguistic influences on gesture's form. Gesture 2: 157–195.
Gilmour, R. D. and F. H. Walkey 1981. Identifying violent offenders using a video measure of interpersonal distance. Journal of Consulting and Clinical Psychology 49: 287–291.
Goffman, E. 1963. Behavior in Public Places. New York: The Free Press.
Goffman, E. 1971. Relations in Public. New York: Harper Colophon Books.
Goldberg, S. and R. Rosenthal 1986. Self-touching behavior in the job interview: Antecedents and consequences. Journal of Nonverbal Behavior 10: 65–80.
Goldman-Eisler, F. 1968. Psycholinguistics: Experiments in Spontaneous Speech. New York: Academic Press.
Grahe, J. E. and F. J. Bernieri 1999. The importance of nonverbal cues in judging rapport. Journal of Nonverbal Behavior 23: 253–269.
Gray, H. M. and N. Ambady 2006. Methods for the study of nonverbal communication. In: V. Manusov and M. L. Patterson (eds.), The Sage Handbook of Nonverbal Communication, 41–58. Thousand Oaks, CA: Sage Publications.
Grayson, D. and L. Coventry 1998. The effects of visual proxemic information in video mediated communication. SIGCHI Bulletin 30: 30–39.
Greenbaum, P. E. and H. M. Rosenfeld 1980. Varieties of touching in greetings: Sequential structure and sex-related differences. Journal of Nonverbal Behavior 5: 13–25.
Guerrero, L. K. 2005. Observer ratings of nonverbal involvement and immediacy. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 221–237. Mahwah, NJ: Lawrence Erlbaum Associates.
Guerrero, L. K. and B. A. Le Poire 2005. Nonverbal research involving experimental manipulations by confederates. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 507–522. Mahwah, NJ: Lawrence Erlbaum Associates.
Guye-Vuillème, A., T. K. Capin, I. S. Pandzic, N. Magnenat-Thalmann, and D. Thalmann 1999. Nonverbal communication interface for collaborative virtual environments. Virtual Reality Journal 4: 49–59.
Hadar, U., T. J. Steiner, E. C. Grant, and C. F. Rose 1983. Kinematics of head movements accompanying speech during conversation. Human Movement Science 2: 35–46.
Hadar, U., T. J. Steiner, E. C. Grant, and C. F. Rose 1985. Head movement during listening turns in conversation. Journal of Nonverbal Behavior 9: 214–228.
Hadar, U., D. Wenkert-Olenik, R. Krauss, and N. Soroker 1998. Gesture and the processing of speech: Neuropsychological evidence. Brain and Language 62: 107–126.
Hall, E. T. 1963. A system for notation of proxemic behavior. American Anthropologist 65: 1003–1026.
Hall, E. T. 1966. The Hidden Dimension. New York: Doubleday.
Hall, E. T. 1973. Handbook for Proxemic Research. Washington, DC: Society for the Anthropology of Visual Communication.
Hall, E. T. 1974. Proxemics. In: S. Weitz (ed.), Nonverbal Communication, 205–229. New York: Oxford University Press.
Hall, J. A., D. L. Roter, and C. S. Rand 1981. Communication of affect between patient and physician. Journal of Health and Social Behavior 22: 18–30.
Hall, J. A. and E. M. Veccia 1990. More "touching" observations: New insights on men, women, and interpersonal touch. Journal of Personality and Social Psychology 59: 1155–1162.
Harrigan, J. A. 2005. Proxemics, kinesics, and gaze. In: J. A. Harrigan, R. Rosenthal, and K. R. Scherer (eds.), The New Handbook of Methods in Nonverbal Behavior Research, 137–198. Oxford, UK: Oxford University Press.
Harrigan, J. A. and D. R. Carney 2005. A coding method for body/head positions and actions: Laboratory manual. Fullerton, CA: California State University.
Harrigan, J. A., J. R. Kues, J. J. Steffen, and R. Rosenthal 1987. Self-touching and impressions of others. Personality and Social Psychology Bulletin 13: 497–512.
Harrigan, J. A., K. S. Lucic, D. Kay, A. M. McLaney, and R. Rosenthal 1991. Effects of expresser role and body location of self-touching on observers' perceptions. Journal of Applied Social Psychology 21: 585–609.
Harrigan, J. A. and D. O'Connell 1996. How do you look when feeling anxious?: Facial displays of anxiety. Personality and Individual Differences 32: 851–864.
Harrigan, J. A. and J. J. Steffen 1983. Gaze as a turn-exchange signal in group conversations. British Journal of Social Psychology 22: 167–168.
Harrigan, J. A., K. Wilson, and R. Rosenthal 2004. Detecting state and trait anxiety from auditory and visual cues: A meta-analysis. Personality and Social Psychology Bulletin 30: 56–66.
Hayduk, L. A. 1981a. The permeability of personal space. Canadian Journal of Behavioral Science 13: 274–287.
Hayduk, L. A. 1981b. The shape of personal space: An experimental investigation. Canadian Journal of Behavioral Science 13: 87–93.
Hayduk, L. A. 1983. Personal space: Where we now stand. Psychological Bulletin 94: 293–335.
Heslin, R. and T. Alper 1983. Touch: A bonding gesture. In: J. M. Wiemann and R. P. Harrison (eds.), Nonverbal Interaction, 47–75. Beverly Hills, CA: Sage.
Hewes, G. W. 1957. The anthropology of posture. Scientific American 196: 123–132.
Hogh-Olesen, H. 2008. Human spatial behavior: The spacing of people, objects and animals in six cross-cultural samples. Journal of Cognition and Culture 8: 245–280.
Holmes, R. M. 1992. Children's artwork and nonverbal communication. Child Study Journal 22: 157–166.
Izard, C. E. 1971. The Face of Emotion. New York: Appleton-Century-Crofts.
Izard, C. E. 1979. Facial Expression Scoring Manual (FESM). Newark, DE: University of Delaware.
Izard, C. E. 1983. Maximally Discriminative Facial Movement Coding System (MAX). Unpublished manuscript. Newark, DE: University of Delaware.
Izard, C. E. and L. M. Dougherty 1980. System for Identifying Affect Expressions by Holistic Judgments (AFFEX). Unpublished manuscript. Newark, DE: University of Delaware.
Jones, S. E. 1994. The Right Touch: Understanding and Using the Language of Physical Contact. Cresskill, NJ: Hampton Press.
Jones, S. E. and J. R. Aiello 1973. Proxemic behavior of black and white first, third, and fifth-grade children. Journal of Personality and Social Psychology 25: 21–27.
Julien, D. 2005. A procedure to measure interactional synchrony in the context of satisfied and dissatisfied couples' communication. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 199–208. Mahwah, NJ: Lawrence Erlbaum Associates.
Juslin, P. N. and K. R. Scherer 2005. Vocal expression of affect. In: J. A. Harrigan, R. Rosenthal, and K. R. Scherer (eds.), The New Handbook of Methods in Nonverbal Behavior Research, 65–135. Oxford, UK: Oxford University Press.
Kasl, S. V. and G. F. Mahl 1965. The relationship of disturbances and hesitations in spontaneous speech to anxiety. Journal of Personality and Social Psychology 1: 425–433.
Kaya, N. and F. Erkíp 1999. Invasion of personal space under the condition of short-term crowding: A case study on an automatic teller machine. Journal of Environmental Psychology 19: 183–189.
Kendon, A. 1967. Some functions of gaze direction in social interaction. Acta Psychologica 26: 22–63.
Kendon, A. 1970. Movement coordination in social interaction: Some examples described. Acta Psychologica 32: 1–25.
Kendon, A. 1994. Do gestures communicate?: A review. Research on Language and Social Interaction 27: 175–200.
Kennedy, D. P., J. Glaescher, M. Tyszka, and R. Adolphs 2009. Personal space regulation by the human amygdala. Nature Neuroscience 12: 1226–1227.
Kipp, M. 2003. Anvil 4.0 Annotation of Video and Spoken Language User Manual. http://www.dfki.de/~kipp/anvil.
Kleinke, C. L. 1977. Compliance to requests made by gazing and touching experimenter in field settings. Journal of Experimental Social Psychology 13: 218–223.
Kleinke, C. L. 1986. Gaze and eye contact: A research review. Psychological Bulletin 100: 78–100.
Kleinke, C. L., F. B. Meeker, and C. La Fong 1974. Effects of gaze, touch, and use of name on evaluation of "engaged" couples. Journal of Research in Personality 1: 368–373.
Knapp, M. L. and J. A. Hall 2010. Nonverbal Communication in Human Interaction (7th ed.). Belmont, CA: Cengage Learning.
Krauss, R. M., Y. Chen, and P. Chawla 1996. Nonverbal behavior and nonverbal communication: What do conversational hand gestures tell us? In: M. P. Zanna (ed.), Advances in Experimental Social Psychology, Vol. 28: 389–450. San Diego, CA: Academic Press.
Krauss, R. M., P. Morrel-Samuels, and C. Colasante 1991. Do conversational hand gestures communicate? Journal of Personality and Social Psychology 61: 743–754.
Kuethe, J. L. 1962. Social schemas. Journal of Abnormal and Social Psychology 64: 31–38.
Laban, R. 1975. Laban's Principles of Dance and Movement Notation. Princeton, NJ: Princeton Book Company.
LaFrance, M. and C. Mayo 1976. Racial differences in gaze behavior during conversation: Two systematic observational studies. Journal of Personality and Social Psychology 33: 547–552.
Landis, C. 1924. Studies of emotional reactions: II. General behavior and facial expression. Journal of Comparative Psychology 4: 447–509.
Latta, R. M. 1978. Relation of status incongruence to personal space. Personality and Social Psychology Bulletin 4: 143–146.
Brought to you by | Portland State University Authenticated Download Date | 4/28/15 10:58 PM
Leibman, M. 1970. The effects of sex and race norms on personal space. Environment and Behavior 2: 208–246.
Llobera, J., B. Spanlang, G. Ruffini, and M. Slater 2010. Proxemics with multiple dynamic characters in an immersive virtual environment. ACM Transactions on Applied Perception 8(1), article 3.
Lyman, S. M. and M. B. Scott 1967. Territoriality: A neglected sociological dimension. Social Problems 15: 236–249.
Mabry, E. A. 1989. Developmental aspects of nonverbal behavior in small group settings. Small Group Behavior 20: 190–202.
Madden, S. J. 1999. Proxemics and gender: Where’s the spatial gap? North Dakota Journal of Speech and Theater 12: 1–8.
Marshall, J. and R. Heslin 1975. Boys and girls together: Sexual composition and the effect of density and group size on cohesiveness. Journal of Personality and Social Psychology 31: 952–961.
Matarazzo, J. D., G. Saslow, A. N. Wiens, M. Weitman, and B. Allen 1964. Interviewer head nodding and interviewee speech durations. Psychotherapy: Theory, Research, and Practice 1: 54–63.
Matarazzo, J. D. and A. N. Wiens 1977. The Interview: Research on its Anatomy and Structure. Chicago: Aldine-Atherton.
McBride, G. M., G. King, and J. W. James 1965. Social proximity effects on galvanic skin responses in adult humans. Journal of Psychology 61: 153–157.
Mehrabian, A. 1968. Inference of attitudes from the posture, orientation, and distance of a communicator. Journal of Consulting and Clinical Psychology 32: 296–308.
Meisels, M. and C. J. Guardo 1969. Development of personal space schemata. Child Development 49: 1167–1178.
Milmoe, S. E., R. Rosenthal, H. T. Blane, M. E. Chafetz, and I. Wolf 1967. The doctor’s voice: Postdictor of successful referral of alcoholic patients. Journal of Abnormal Psychology 72: 78–84.
Mohammadi, G., A. Vinciarelli, and M. Mortillaro 2010. The voice of personality: Mapping nonverbal vocal behavior into trait attributions. Proceedings of the International Workshop on Social Signal Processing, 17–20. Firenze, Italy. http://www.dcs.gla.ac.uk/~vincia/papers/persossp.pdf
Morris, D. 1977. Manwatching. New York: Abrams.
Moukheiber, A., G. Rautureau, F. Perez-Diaz, R. Soussignan, S. Dubal, R. Jouvent, and A. Pelissolo 2010. Gaze avoidance in social phobia: Objective measure and correlates. Behaviour Research and Therapy 48: 147–151.
Mumm, J. and B. Mutlu 2011. Human-robot proxemics: Physical and psychological distancing in human-robot interaction. In: Proceedings of the 6th ACM/IEEE Conference on Human-Robot Interaction. Lausanne, Switzerland.
Munhall, K. G., J. A. Jones, D. E. Callan, T. Kuratate, and E. Vatikiotis-Bateson 2004. Visual prosody and speech intelligibility: Head movement improves auditory speech perception. Psychological Science 15: 133–137.
Nadig, A., I. Lee, L. Singh, K. Bosshart, and S. Ozonoff 2010. How does the topic of conversation affect verbal exchange and eye gaze? A comparison between typical development and high-functioning autism. Neuropsychologia 48: 2730–2739.
Nail, P. R., H. C. Harton, and B. P. Decker 2003. Political orientation and modern versus aversive racism: Tests of Dovidio and Gaertner’s (1998) integrated model. Journal of Personality and Social Psychology 84: 754–770.
Noesjirwan, J. 1977. Contrasting cultural patterns of interpersonal closeness in doctors’ waiting rooms in Sydney and Jakarta. Journal of Cross-Cultural Psychology 8: 357–368.
Noldus Information Technology 2011. http://www.noldus.com/
Noller, P. 2005. Behavioral coding of visual affect behavior. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 141–150. Mahwah, NJ: Lawrence Erlbaum Associates.
Noller, P. and V. J. Callan 1989. Nonverbal behavior in families with adolescents. Journal of Nonverbal Behavior 13: 47–64.
Oster, H., D. Hegley, and L. Nagel 1992. Adult judgments and fine-grained analysis of infant facial expressions: Testing the validity of a priori coding formulas. Developmental Psychology 28: 1115–1131.
Patel, S. and R. Shrivastav 2007. Perception of dysphonic vocal quality: Some thoughts and research update. Voice and Voice Disorders 17: 3–7.
Patterson, M. L. 1991. A functional approach to nonverbal exchange. In: R. S. Feldman and B. Rimé (eds.), Fundamentals of Nonverbal Behavior, 458–495. Cambridge, UK: Cambridge University Press.
Patterson, M. L., C. P. Roth, and C. Schenk 1979. Seating arrangement, activity, and sex differences in small group crowding. Personality and Social Psychology Bulletin 34: 114–121.
Pederson, D. M. 1973. Prediction of behavioral personal space from simulated personal space. Perceptual and Motor Skills 37: 803–813.
Poyatos, F. 1993. Paralanguage: A Linguistic and Interdisciplinary Approach to Speech and Sound. Amsterdam: John Benjamins.
Remez, R. E., J. M. Fellowes, and P. Rubin 1997. Talker identification based on phonetic information. Journal of Experimental Psychology: Human Perception and Performance 23: 651–666.
Remland, M. S., T. S. Jones, and H. Brinkman 1995. Interpersonal distance, body orientation and touch: Effects of culture, gender, and age. The Journal of Social Psychology 135: 281–297.
Richardson, D. C. and R. Dale 2005. Looking to understand: The coupling between speakers’ and listeners’ eye movements and its relationship to discourse comprehension. Cognitive Science 29: 1045–1060.
Rimé, B. and L. Schiaratura 1991. Gesture and speech. In: R. S. Feldman and B. Rimé (eds.), Fundamentals of Nonverbal Behavior, 239–284. Cambridge, UK: Cambridge University Press.
Rockwell, P. 2006. “Yeah, right!”: A linguistic analysis of self-reported sarcastic messages and their contexts. Paper presented to the Language and Social Interaction Division of the Southern States Communication Association. Dallas, TX.
Rogers, P. L., K. R. Scherer, and R. Rosenthal 1971. Content filtering human speech: A simple electronic system. Behavior Research Methods and Instrumentation 3: 16–18.
Ronsse, R., O. White, and P. Lefèvre 2007. Computation of gaze orientation under unrestrained head movements. Journal of Neuroscience Methods 159: 158–169.
Rosenfeld, H. M. 1987. Conversational control functions of nonverbal behavior. In: A. W. Siegman and S. Feldstein (eds.), Nonverbal Behavior and Communication, 563–601. Hillsdale, NJ: Lawrence Erlbaum Associates.
Rosenthal, R. 1987. Judgment Studies: Design, Analysis, and Meta-analysis. New York: Cambridge University Press.
Rosenthal, R. 2005. Conducting judgment studies: Some methodological issues. In: J. A. Harrigan, R. Rosenthal, and K. R. Scherer (eds.), The New Handbook of Methods in Nonverbal Behavior Research, 199–234. Oxford, UK: Oxford University Press.
Rosenthal, R., J. A. Hall, M. R. DiMatteo, P. L. Rogers, and D. Archer 1979. Sensitivity to Nonverbal Communication: The PONS Test. Baltimore: The Johns Hopkins University Press.
Ross, M., B. Layton, B. Erickson, and J. Schopler 1973. Affect, facial regard and reactions to crowding. Journal of Personality and Social Psychology 28: 69–76.
Rutter, D. R., G. M. Stephenson, A. J. Lazzerini, K. Ayling, and P. A. White 1977. Eye contact: A chance product of individual looking? British Journal of Social and Clinical Psychology 16: 191–192.
Scherer, K. R. 1971. Randomized splicing: A note on a simple technique for masking speech content. Journal of Experimental Research in Personality 5: 155–159.
Scherer, K. R. 1979. Nonlinguistic vocal indicators of emotion and psychopathology. In: C. E. Izard (ed.), Emotions in Personality and Psychopathology, 493–529. New York: Plenum.
Scherer, K. R. and H. G. Wallbott 1985. Analysis of nonverbal behavior. In: T. A. Van Dijk (ed.), Handbook of Discourse Analysis, 199–230. London: Academic Press.
Scherer, S. E. 1974. Proxemic behavior of primary school children as a function of their socioeconomic class and subculture. Journal of Personality and Social Psychology 26: 800–805.
Schiller, N. O. and O. Köster 1998. The ability of expert witnesses to identify voices: A comparison between trained and untrained listeners. Forensic Linguistics 5: 1–9.
Shrivastav, R., A. Camacho, S. Patel, and D. A. Eddins 2011. A model for the prediction of breathiness in vowels. Journal of the Acoustical Society of America 129: 1605–1615.
Shuter, R. 1976. Proxemics and tactility in Latin America. Journal of Communication 26: 46–52.
Siegman, A. W. 1979. Cognition and hesitation in speech. In: A. W. Siegman and S. Feldstein (eds.), Of Speech and Time: Temporal Speech Patterns in Interpersonal Contexts, 151–178. Hillsdale, NJ: Lawrence Erlbaum Associates.
Snyder, M., J. Grether, and K. Keller 1974. Staring and compliance: A field experiment on hitchhiking. Journal of Applied Social Psychology 4: 165–170.
Sobel, R. S. and N. Lillith 1975. Determinants of non-stationary personal space invasion. Journal of Social Psychology 97: 39–45.
Sommer, R. 1959. Studies in personal space. Sociometry 22: 247–260.
Sommer, R. 1961. Leadership and group geography. Sociometry 24: 99–110.
Sommer, R. 1967. Sociofugal space. American Journal of Sociology 72: 654–660.
Sommer, R. 1969. Personal Space: The Behavioral Basis of Design. Englewood Cliffs, NJ: Prentice-Hall.
Sommer, R. 2002. Personal space in a digital age. In: R. B. Bechtel and A. Churchman (eds.), Handbook of Environmental Psychology, 647–660. New York: Wiley.
Sommer, R. and F. D. Becker 1969. Territorial defense and the good neighbor. Journal of Personality and Social Psychology 11: 85–92.
Stenzel, C. L. and P. A. Rupert 2004. Psychologists’ use of touch in individual psychotherapy. Psychotherapy: Theory, Research, Practice, Training 41: 332–345.
Strayer, J. and W. Roberts 1997. Children’s personal distance and their empathy: Indices of interpersonal closeness. International Journal of Behavioral Development 20: 385–403.
Street, R. L. Jr. 1990. The communicative functions of paralanguage and prosody. In: H. Giles and W. P. Robinson (eds.), Handbook of Language and Social Psychology, 121–140. Chichester, UK: Wiley.
Strongman, K. T. and B. G. Champness 1968. Dominance hierarchies and conflict in eye contact. Acta Psychologica 28: 376–386.
Sundstrom, E. 1975. An experimental study of crowding: Effects of room size, intrusion, and goal blocking on nonverbal behavior, self-disclosure, and self-reported stress. Journal of Personality and Social Psychology 32: 645–654.
Talmi, K. and J. Liu 1999. Eye and gaze tracking for visually controlled interactive stereoscopic displays. Signal Processing: Image Communication 14: 799–810.
Tamir, M., M. D. Robinson, G. L. Clore, L. L. Martin, and D. J. Whitaker 2004. Are we puppets on a string? The contextual meaning of unconscious expressive cues. Personality and Social Psychology Bulletin 30: 237–249.
Trager, G. L. 1958. Paralanguage: A first approximation. Studies in Linguistics 13: 1–12.
Tusing, K. J. 2005. Objective measurement of vocal signals. In: V. Manusov (ed.), The Sourcebook of Nonverbal Measures, 393–402. Mahwah, NJ: Lawrence Erlbaum Associates.
Van Lancker, D., J. Kreiman, and K. Emmorey 1985. Familiar voice recognition: Patterns and parameters. Journal of Phonetics 13: 19–38.
Van Oosterhout, T. and A. Visser 2008. A visual method for robot proxemics measurements. In: C. R. Burghart and A. Steinfeld (eds.), Proceedings of Metrics for Human Robot Interaction, 61–68. Workshop at ACM/IEEE HRI.
Vertegaal, R., R. Slagter, G. van der Veer, and A. Nijholt 2001. Eye gaze patterns in conversations: There is more to conversational agents than meets the eyes. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Seattle, WA. doi:10.1145/365024.365119.
Von Cranach, M. 1971. The role of orienting behavior in human interaction. In: A. H. Esser (ed.), Behavior and Environment: The Use of Space by Animals and Man, 217–237. New York: Plenum.
Vrij, A. 1994. The impact of information and setting on detection of deception by police officers. Journal of Nonverbal Behavior 18: 117–137.
Watson, M. O. and T. D. Graves 1966. Quantitative research in proxemic behavior. American Anthropologist 68: 971–985.
Weisz, J. and G. Adam 1993. Hemispheric preference and lateral eye movements evoked by bilateral visual stimuli. Neuropsychologia 31: 1299–1306.
Yarbus, A. L. 1967. Eye Movements and Vision. New York: Plenum.
Zlutnick, S. and I. Altman 1972. Crowding and human behavior. In: J. F. Wohlwill and D. H. Carson (eds.), Environment and the Social Sciences: Perspectives and Applications, 44–60. Oxford, UK: American Psychological Association.