10 Visual Attention and Emotion: Studying Influence on a Two-Way Street

Allison A. Brennan and James T. Enns

Introduction It is widely understood that we do not attend to the mundane. We direct our attention to the ripened fruit, the broken glass on the sidewalk, and the awe-inspiring mountains. In an art gallery, we admire the paintings and observe others as they admire the paintings. Our attention is not usually pulled to the walls on which the paintings hang. What is less widely understood is that the attentional processes that pull us toward things of interest, and push us away from the mundane, have a direct influence on our emotional experience. We are happy when we easily locate the product we are searching for at the supermarket, we become frustrated when roadways are not clearly marked and difficult to navigate, and we best appreciate the paintings in a gallery when we view them from the appropriate distance: close enough to see relevant detail, but not so close that we cannot take in the entire scene. In this chapter, we will consider both directions in this two-way relationship of attention and emotion. This will include considering how the mental and motoric processes of attention influence our emotional states and how these states influence our attentional processes. In doing so, we will highlight several different methods we have developed in order to study these relationships in dynamic and complex real-world environments. Along the way, we will provide theoretical context for each of these four lines of research by summarizing the important background that guided us in our research. We have organized this chapter into three main sections, based on the different ways that current researchers and theorists use the concept of emotion. In the first main section, we consider objects and events that convey emotional messages to individuals. This means we will examine the influence of these signals as stimuli on the observer’s attentional processes. 
As a methodological highlight, we will look most closely at a study of the attentional consequences of viewing emotional pictures while driving an automobile. In the second section we examine emotion as a feature of the participant’s experience, both as a temporary state that may be induced by a specific context, and as a longer-term trait of an individual. Here we discuss how
a participant’s emotional state can influence, and be influenced by, the processes of attentional selection. Two methodological highlights are presented in this section, one examining the consequences of various temporary mood states on attentional selection in the widely used attentional blink task, and one measuring the consequences of a particular pattern of selective looking on visual preferences for works of art. In the third section, we consider the ways in which participants make overt expressions of emotion in experimental settings and how these emotional displays can be measured by researchers. The methodology we highlight here involves a recent study where we harnessed one group of participants’ people watching skills to assess the emotional experiences of a second group of visual search participants. Video recordings of searchers were shown to “people watchers” who were blind to the conditions under which the recordings had been made, in order to see how much information about performance was conveyed through bodily displays of emotion. In each section, we will first orient the reader to the ways in which the constructs of emotion and attention are being used in the background research, before considering the two-way interplay that has been documented between them. Finally, we will highlight one or two studies in order to examine the methodology of studying attention-emotion interactions.

Emotional Signals Influence Visual Attention An evolutionary perspective on selective attention suggests that objects and events critical to the survival of the human species should generally attract our attention, so that we may give these objects and events preferential processing when they are detected. As William James so famously said in his chapter on attention (James, 1890), our immediate attention is pulled instinctively by “strange things, moving things, wild animals, bright things, pretty things, metallic things, words, blows, blood, etc.” The prehistoric individuals for whom this was true were more likely to thrive and survive, and as a result we have continued to prioritize these critical objects and events in our environment. But what marks these objects and events as important? Important things are those that elicit a strong emotional response and memory. Our affective system imbues the world with value and meaning and allows us to make decisions quickly and easily (Lehrer, 2009; Mikels et al., 2011). Thus these emotional objects and events, the ones that have proven important in our evolutionary past, are today still influencing the selective mechanisms of the visual system so that they receive preferential processing when detected. One way this is accomplished by the visual neural pathways connecting the eye with the brain is that there is a fast low road to the amygdala via the subcortical brain structures known as the superior colliculus and pulvinar
(Tamietto and de Gelder, 2010), in addition to the better-known but slower high road to the cerebral cortex (Van Essen, 2004). This means that the prioritized processing given to stimuli with strong emotional signals can occur without the intervention of any conscious construal by “us,” the cortical storyteller in our brain, the one who constructs narratives and makes conscious decisions about the allocation of attentional processing resources (Gazzaniga, 1998). Thus, even when the conscious awareness of stimuli is prevented, for example, by using backward visual masking, classically conditioned fear-relevant stimuli still produce differential skin conductance responses compared with non-fear-relevant stimuli (Esteves, Parra, Dimberg, and Öhman, 1994), and amygdala responses measured with functional magnetic resonance imaging (fMRI) discriminate between unseen emotional and unseen nonemotional stimuli (Morris, Öhman, and Dolan, 1998). The same conclusion is supported by research with blindsight patients (i.e., patients with striate cortex lesions), who demonstrate nonconscious processing of emotional stimuli in their blind field, despite a lack of conscious awareness (de Gelder, Vroomen, Pourtois, and Weiskrantz, 1999). In support of these low-level, instinctual influences on attention, there is now an extensive body of research on how the emotional nature of visual signals influences information processing. Within this literature, considerably more research has been conducted with negatively valenced stimuli than with positively valenced ones (Fredrickson, 2003). Here, we will briefly review this research (see Yiend, 2010 for a more comprehensive review), before taking a closer look at the methodology of a study on how emotional pictures influence performance in the attentionally challenging and emotion-filled everyday task of driving a motor vehicle.
Considerable research has documented the fact that emotion-laden stimuli draw attention quickly and automatically, including images of spiders (Kindt and Brosschot, 1999), guns (Fox, Griggs, and Mouchlianitis, 2007), and faces with positive and negative emotional expressions (Öhman, Lundqvist, and Esteves, 2001). The most widely used task to demonstrate this in the laboratory is visual search, and the most commonly used theoretical framework for interpreting the results is feature integration theory (Treisman and Gelade, 1980). This theory states that when the target of search is defined by a feature that is basic to the visual system, search will be influenced very little by the number of nontargets (distractors) also present in a display. The target simply seems to pop out to the observer, and so search times increase little with set size (the total number of items on display). Conversely, search for a target that is not defined by a basic visual feature demonstrates increases in search times as the number of distractors (and set size) increases. Consistent with this interpretation, visual search times for feared objects are often reported to increase less with set size than equivalent images that convey less fearful or even positive signals. Research building on this basic finding has shown further that the traits of the participant with respect to specific fears can help
amplify or gate this basic instinctual tendency. For example, participants who were generally fearful of snakes, though not spiders, were able to search for snakes more rapidly than other participants who were generally fearful of spiders, but not snakes (Öhman, Flykt, and Esteves, 2001). Research combining a visual search task with an involuntary spatial cue is often referred to as the study of attentional capture. In these tasks, participants have a clearly defined (emotionally neutral) target to search for, but this target may be preceded in the display by, or presented simultaneously with, a task-irrelevant item that appears suddenly in the visual periphery. This item is often called a “salient singleton,” and the extent to which search is slowed by this salient singleton is a measure of attentional capture. Research with fearful images indicates that the mere expectation of a feared object such as a spider on any given visual search display is sufficient for a spider-phobic individual to show exaggerated attentional capture, not only for images of spiders, but also for neutral images such as leaves and butterflies (Devue, Belopolsky, and Theeuwes, 2011). These findings do not only apply to feared objects: Positive nurturance-inducing images such as baby faces also elicit attentional capture, demonstrating that positive instincts can also exert an involuntary influence on attentional allocation (Brosch, Sander, Pourtois, and Scherer, 2008). Related research, using an involuntary spatial orienting task, shows that participants are faster to orient to a location in the visual periphery when the orienting cue conveys an emotional signal than when it is emotionally neutral (Armony and Dolan, 2002). This seems to be because, in addition to drawing visual attention involuntarily to a spatial location, emotional signals also hold attention in that location for a longer period.
For example, detecting a target that appears in the same location as an immediately prior emotional cue (e.g., face, spider, threat word) is accomplished faster than detecting the same target following a neutral cue or the same target in a different location (Fox, Russo, Bowles, and Dutton, 2001). When an emotional signal automatically draws and holds attention to a specific spatial location, privileged processing occurs at this selected location at the expense of other possible locations – attention is a spatially limited resource. Similarly, when attentional resources are allocated to one aspect or feature of a task, this is at the expense of processing other possible features. In the Stroop task, where participants are required to name the color in which words are written, color naming is slowed more by emotional words (and in particular negative emotional words) than by neutral words (Watts, McKenna, Sharrock, and Trezise, 1986). In a change detection task, where participants are asked to report the difference between two images that are alternately presented (such that they “flicker”), change detection in fear-related emotional images is impaired relative to the detection of a change in neutral images (McGlynn, Wheeler, Wilamowska, and Katz, 2008). Together, these findings show that because emotional signals monopolize attentional resources, they result in performance deficits in concurrent tasks that also require attention.
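The set-size logic of feature integration theory described earlier amounts to a linear model of search time. Here is a minimal sketch; the intercept and slope values are illustrative assumptions, not estimates from any particular study:

```python
def predicted_search_rt(set_size, base_rt_ms, slope_ms_per_item):
    """Linear visual search model: RT = base + slope * set size."""
    return base_rt_ms + slope_ms_per_item * set_size

# Hypothetical slopes: feature ("pop-out") search is nearly flat,
# while conjunction search costs time for every added distractor.
POP_OUT_SLOPE = 2       # ms per item
CONJUNCTION_SLOPE = 40  # ms per item

for set_size in (4, 8, 16):
    print(set_size,
          predicted_search_rt(set_size, 450, POP_OUT_SLOPE),
          predicted_search_rt(set_size, 450, CONJUNCTION_SLOPE))
```

On this toy model, doubling the display from 8 to 16 items adds only 16 ms to pop-out search but 320 ms to conjunction search, which is the qualitative pattern behind the finding that feared objects are found with shallower set-size slopes than neutral ones.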


Attention is also limited with respect to time: When attention is allocated to a given moment, there is an associated information processing cost at a second moment in time. An attentional blink task taps into this temporally limited aspect of attention: When attentional processing resources are devoted to a first target in a rapid serial visual presentation stream of possible targets, the identification of a second target is impaired (i.e., an increased attentional blink) when it is presented 200–500 milliseconds after the first. Because emotional signals automatically draw and hold attention, they reduce the attentional blink (i.e., improve second target detection) when presented as the second target (Keil and Ihssen, 2004), but increase the attentional blink when presented as the first target (Mathewson, Arnell, and Mansfield, 2008). We will again draw upon this background literature on the attentional blink when we highlight the methodology we developed to study the two-way attention-emotion relationship through the presentation of emotional images while driving, as well as how an individual’s mood following a mood induction procedure influences the time course of information processing. From the literature reviewed previously, we know that emotional signals influence the allocation of attention. However, researchers have not yet resolved which aspect of the emotional signal affects attentional allocation. Emotion theorists generally agree that the core emotions are composed of two independent dimensions, valence and arousal (Lang, 1995), but attention researchers have largely ignored this distinction. For example, most studies have compared negative high arousal stimuli (threatening stimuli, e.g., snakes, guns) and neutral or positive stimuli that are lower in arousal (e.g., flowers, tools), thus confounding valence and arousal comparisons.
Where emotion theorists differ in opinion is in whether they label the dimensions “pleasure-misery” and “sleep-arousal” (Russell and Barrett, 1999) or “pleasant-unpleasant” and “activation” (Larsen and Diener, 1992), and whether they label the cardinal axes or axes rotated by 45 degrees (Thayer, 1989; Watson and Tellegen, 1985). According to Yik, Russell, and Barrett (1999), these models can be integrated using a common space in which core states are differentiated, but assignment of axes is arbitrary. It is important to note that this framework does not imply that all emotions corresponding to the same pleasure-arousal coordinates produce the same experience. For example, anger, fear, jealousy, grief, and contempt are all affectively negative, high-arousal states, but this does not imply that they feel the same. Emotional experiences are differentiated by the attributions individuals make to interpret the cause of a given mood state (Russell, 2003). To disentangle these two distinct dimensions of emotion, researchers must also consider how negative low arousal (e.g., sad) and positive high arousal (e.g., happy) stimuli, alongside the negative high arousal and positive low arousal stimuli already utilized, influence attentional allocation.
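The quadrant structure of this valence-arousal space can be sketched as a lookup that maps a coordinate to a mood label; the four labels used here match common usage (calm, happy, sad, anxious), and the scale and zero cutoffs are illustrative assumptions, not part of any published model:

```python
def affect_quadrant(valence, arousal):
    """Map a (valence, arousal) coordinate to a mood quadrant.

    Inputs range from -1 (unpleasant / low energy) to +1
    (pleasant / high energy); the zero cutoffs are an assumption
    made for illustration only.
    """
    if valence >= 0:
        return "happy" if arousal >= 0 else "calm"
    return "anxious" if arousal >= 0 else "sad"

# Fear and anger are negative, high-arousal states; sadness is
# negative but low in arousal.
print(affect_quadrant(-0.8, 0.9))   # anxious
print(affect_quadrant(-0.6, -0.7))  # sad
```

As the text stresses, a shared quadrant does not imply a shared experience: anger, fear, and jealousy all land in the same negative, high-arousal cell yet feel quite different.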


The literature contains several hints that valence and arousal have separable influences on attention. Effects of arousal can be seen in impairments of immediate memory associated with traumatic stress (Nadel and Jacobs, 1998) and in distractions caused by task-irrelevant stimuli (Schimmack, 2005); influences of valence can be seen in the way threatening stimuli attract attention (Öhman, Flykt, and Esteves, 2001) and in the greater efficiency of processing when the emotional valence of the stimuli is relevant to the participant (Shapiro, Caldwell, and Sorensen, 1997). But it is also possible that valence and arousal interact to produce unique outcomes. For example, individuals who are sad or depressed (low arousal, negative affect) tend to process the fine-grained details of a scene at the expense of gist, whereas individuals who are happy (high arousal, positive affect) tend to focus on the gist at the expense of details (Gasper and Clore, 2002). The International Affective Picture System (Lang, Bradley, and Cuthbert, 2008) – a set of normative emotional stimuli – is an excellent resource for researchers investigating the independent contributions of arousal and valence to emotion. The picture system contains more than nine hundred images, each rated along three emotional dimensions: affective valence (ranging from pleasant to unpleasant), arousal (from calm to excited), and dominance (from in control to dominated). Similarly, the International Affective Digitized Sounds, Affective Norms for English Words, and Affective Norms for English Text are also available for researchers requiring sounds, words, or brief texts, respectively.

Methodological Highlight: The Consequences of Emotional Images on Driving

To distinguish between the effects of the two importantly different dimensions of emotion, namely, arousal and valence, we conducted a study of how driving performance in a simulator is influenced by the processing of Affective Picture System images that vary independently in their arousal and valence (Trick, Brandigampola, and Enns, 2012). We chose this method because driving is a real-world task that many of us engage in, where emotion and attention can both be readily seen at play. For example, emotions range from the exhilaration of the open road to annoyance at the slow-moving car ahead. As shown in Figure 10.1, we used a four-door Saturn model simulator, surrounded by visual screens that provided a 250° wraparound virtual environment. The task of the participants was to follow a lead vehicle while maintaining safe driving. The simulator was equipped with a standard vehicle interior, augmented with audio speakers, vibration transducers, and force feedback to simulate the sounds and sensations of driving. The simulations involved scenic drives through the country on a straight two-lane roadway with


Figure 10.1. A study of the influence of emotional images while driving. Images were presented on an in-vehicle display (to the right of the steering wheel) in a Saturn-model driving simulator. Response buttons were located on the steering wheel. Four types of images were displayed (high arousal positive, high arousal negative, low arousal positive, and low arousal negative). An example of a high arousal negative image is presented here. (See Plate 2.)

a 90 kilometers per hour (kph) posted speed limit. Drivers were required to maintain the speed limit, and if they went outside an 82–100 kph range, the vehicle “labored,” providing auditory and haptic feedback. There was a lead car in front of the driver that was programmed to stay 30 meters ahead except during braking events, when brake lights appeared before it decelerated suddenly. Drivers had to brake to avoid hitting the lead car. Our main measure of attention was a modification of the standard attentional blink task. While participants in this study were following the lead vehicle at a safe distance, they were asked to indicate that they noticed the occasional images that were projected on the 20-centimeter in-vehicle display to the right of the steering wheel. We selected these images to model the provocative or dramatic images that drivers routinely see using modern technologies such as smart phones, onboard computers, infotainment systems, and video billboards. Drivers indicated they noticed the images by pressing one of two buttons on the steering wheel, to indicate whether the valence of
each image was positive or negative. The discrimination of these images was thus akin to first target identification in an attentional blink task. The driver’s response to a braking event – the sudden deceleration of the lead vehicle that necessitated braking to prevent a collision – served as the second target. These events sometimes occurred without any preceding images, 250 milliseconds after the driver responded to an image, or 500 milliseconds after the driver responded to an image. Importantly, images also occurred frequently without any subsequent braking event. Thus, first target images were not a good predictor of second target braking events, and vice versa. In addition to investigating the independent contributions of emotional arousal and valence to attentional allocation while driving, we capitalized on the distinction between focal and ambient vision in order to determine whether the effects of emotional images would differ depending on the attentional system involved. Focal vision is important in object recognition and visual search for targets defined by clusters of features (not simple features, as in pop out). Because focal vision requires high levels of visual acuity, its success depends on excellent foveal vision and the ability to use eye movements to direct foveal gaze to the task-appropriate locations. Ambient vision, by way of contrast, is used for postural control and locomotion when we are walking, and for guiding tools, including cars, to task-appropriate locations. Its success depends on the analysis of optic flow (wide field of view motion signals) and on linking proprioception (one’s bodily position in space) with these environmental signals. Previous research on attention and driving has linked focal vision to the ability to recognize and respond to hazards in the path of the vehicle (Summala, Lamble, and Laakso, 1998) and ambient vision to the ability to steer the car when in motion (Summala, Nieminen, and Punto, 1996).
As further support of this dissociation, discrepancies between focal and ambient vision are correlated with dissociations between hazard response and steering (Horrey, Wickens, and Consalus, 2006). For example, when drivers wore goggles that severely limited their visual acuity, their hazard response to obstacles was compromised but their lane keeping was not (Higgins, Wood, and Tait, 1998). Conversely, reducing the peripheral field of view impaired steering but left hazard response unaffected (Owens and Tyrrell, 1999). To test whether the effects of emotional images differed, depending on whether focal or ambient vision was involved, we measured two critical dependent variables. One was response time and accuracy to a sudden hazard, which in our case was the onset of brake lights on the car participants were instructed to follow at a safe distance. This was designed to assess focal vision. The other measure was variability in lane keeping, as indexed by the variance in the car’s lateral position on the road. This was designed to assess ambient vision. The results showed that valence and arousal had interactive effects on both hazard detection and steering, and that these effects varied in their
time course. The level of arousal in the emotional images had the greatest influence on braking response time. Specifically, high arousal positive images led to the fastest braking times when the braking event followed the emotional image by 250 milliseconds. However, this effect was short-lived and had diminished by the 500 millisecond delay. Such an immediate influence on response readiness likely stems from autonomic nervous system activation by these highly arousing images. The emotional valence (as opposed to arousal) of the images had a stronger influence on drivers’ steering performance. High arousal positive images were associated with significantly better steering performance than other images in the short time window that followed. Conversely, low arousal negative images were associated with significantly poorer steering performance, presumably because these images required the greatest amount of cognitive processing to identify (as confirmed by other measures in the study). Both of these findings are consistent with reports that experiencing positive emotion prompts individuals to broaden their focus of attention and thus to devote more attention to global aspects of the display, while experiencing negative emotion leads to a narrowing of attention and a focus on details (Fredrickson, 2003). They are also consistent with previously established links between ambient vision and steering. In summary, this study shows that the emotional content of images seen while driving can have different effects, depending both on the dimension of emotional experience that is targeted by the images (i.e., arousal vs. valence) and on the type of attention that is required for the driving operation (i.e., hazard detection via focal vision vs. steering via ambient vision). This study also represents an important proof of concept, namely, that the study of the attentional blink in a laboratory setting can be applied to a dynamic real-world environment.
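The ambient-vision measure described above, variability in lane keeping, is simply the variance of the car's lateral position sampled over time. A minimal sketch, with invented sample values for illustration:

```python
from statistics import pvariance

def lane_keeping_variability(lateral_positions_m):
    """Population variance of lateral lane position (m^2); higher
    values indicate poorer steering, the ambient-vision measure."""
    return pvariance(lateral_positions_m)

# Invented samples of lateral offset from lane center, in meters:
steady = [0.05, -0.02, 0.03, -0.04, 0.01]   # tight lane keeping
weaving = [0.30, -0.25, 0.40, -0.35, 0.20]  # poor lane keeping

assert lane_keeping_variability(weaving) > lane_keeping_variability(steady)
```

In practice the simulator logs lateral position many times per second, and the variance is computed per trial or per time window; the focal-vision measure (braking response time) is logged separately.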

Emotional Temporary States and Enduring Traits

Personality and social psychologists have long acknowledged the influence of one’s emotional disposition on behavior and attitudes. More recently, cognitive psychologists have begun to explore these interactions. Here we review research focused on how mood can influence the allocation of attention and review a study from our lab on how an individual’s mood influences the time course of information processing using mood induction and an attentional blink task. We then examine the other direction: how the allocation of attentional processing resources affects emotional states. We conclude this section with a real-world application of this research: a study of how master artists have utilized painting techniques, such as selective blurring and sharpening, to direct observers’ gaze and enhance their aesthetic experience in viewing portraits.


Emotion Influences Attention

Positive emotions tend to be associated with processing the global rather than the local aspects of displays (for review see Fredrickson, 2003). This has been demonstrated using global-local visual tasks, in which participants judge which of two comparison figures – small triangles arranged to form a large triangle or small squares arranged to form a large square – is more similar to a test figure of small squares arranged to form a large triangle. There is no correct answer, but the first comparison figure resembles the test figure in global configuration and is more frequently selected by people experiencing positive emotions. On the other hand, the second comparison figure resembles the test figure in its local elements; people experiencing negative emotions select this as the more similar shape more frequently. Individuals with high levels of positive emotion and low levels of negative emotion also demonstrated an attentional bias in an emotional Stroop task. They took longer to name the color of high-intensity happiness words, indicating that attention was automatically allocated to reading the positive words, even though this impaired performance in the color-naming task (Strauss and Allen, 2006). Positive affect prompted individuals to broaden their focus of attention in a conceptual domain, with an increased scope of semantic access in a remote associates task (Rowe, Hirsh, and Anderson, 2007). Relaxation, achieved through either explicit instructions to relax (Smilek, Enns, Eastwood, and Merikle, 2006) or listening to music (Olivers and Nieuwenhuis, 2006), has been shown to improve performance on attention-demanding tasks. However, when stimuli are threatening, positive mood actually leads to risk aversion and cautiousness (Isen, 2000). Eye movement research supports this distinction: Optimists gazed less at negative, unpleasant images than pessimists and showed selective inattention to negative emotional images (Isaacowitz, 2006).
Similarly, individuals induced into positive moods fixated more on emotionally positive peripheral stimuli and made more frequent saccades to neutral and positively valenced items than control participants (Wadlinger and Isaacowitz, 2006). Becker and Leinenger (2011) suggest that the congruence of temporary mood states and the emotional nature of information available for processing is critical; people were more likely to notice an emotional face – positive, negative, or neutral – when it was congruent with their mood. While this research demonstrates that emotional states influence the allocation of attentional processing resources, it does not consider which aspect of emotion, namely, valence (positive, negative) or arousal (high, low), influences the control of attention (Lang, 1995). As in our previous study that teased apart the independent influences of valence and arousal to attentional allocation while driving, here we review a study where we investigated how the valence and arousal of an individual’s mood influence the time course of
information processing using a laboratory attentional blink task (Jefferies, Smilek, Eich, and Enns, 2008). We utilized music for mood induction not only because it is effective, but because it possesses high ecological validity. People often listen to music while performing everyday activities, whether it be arousing music while jogging or tranquil music while writing in a café.

Methodological Highlight: Mood Induction Influences the Time Course of Attention

Participants in this study completed a mood induction prior to the start of the attentional task. They listened to 10 minutes of music (validated to promote a particular mood) while recalling in detail mood-appropriate events from the past (see http://www2.psych.ubc.ca/~ennslab/Vision_Lab/Mood_Induction_Procedures.html for complete mood induction instructions). This elicited one of four emotional states: calm (low arousal, positive affect), happy (high arousal, positive affect), sad (low arousal, negative affect), or anxious (high arousal, negative affect; see Eich, Macaulay, and Ryan, 1994). This mood induction was verified using participant self-reports on a 9 × 9 grid representing affect (extremely unpleasant at the left to extremely pleasant at the right) and arousal (extremely high energy at the top to extremely low energy at the bottom). In the attentional task, participants reported the identity of two letter targets, in the order in which they appeared, among distractor digits. Each stream contained between 11 and 23 items, and each item was on screen for 82 milliseconds. Instead of finding separate influences of arousal or valence, we found that specific combinations of these dimensions best predicted the control of attention. That is, sadness (low arousal, negative affect) produced the highest levels of performance, anxiety (high arousal, negative affect) led to the lowest levels of performance, and calm and happy states (low and high arousal, positive affect) were associated with intermediate performance. It is important to note that these differences were not in overall performance (first target accuracy was more or less the same for all groups), but rather in second target accuracy (i.e., attentional control). This points to a direct link between mood and the prioritization of items for visual attention. We posit that sadness is linked to improved control over visual attention, as described by the overinvestment hypothesis (Olivers and Nieuwenhuis, 2006).
In this view, sadness was more successful than the other mood inductions at keeping nontarget items from gaining entry to the limited-capacity processes of target identification. Another finding consistent with this account is that calm and happy participants performed better than anxious ones. Whereas anxious participants may have been overly focused on the task, calm and happy participants may have found their emotional states helpful in avoiding full engagement in the attentional blink task.
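The structure of such a rapid serial visual presentation (RSVP) trial can be sketched in code. Only the stream lengths (11–23 items), the 82-millisecond item duration, and the letter-targets-among-digit-distractors design come from the study; the specific target-position constraints below are illustrative assumptions, and the function names are ours.

```python
import random
import string

SOA_MS = 82  # each item was on screen for 82 milliseconds

def make_rsvp_stream(rng):
    """Build one illustrative RSVP trial: two letter targets embedded in a
    stream of digit distractors. The rule that T1 appears early and T2 at a
    random later position is an assumption, not the study's exact design."""
    n_items = rng.randint(11, 23)
    stream = [rng.choice(string.digits) for _ in range(n_items)]
    t1_pos = rng.randint(2, n_items - 4)           # assumed T1 window
    t2_pos = rng.randint(t1_pos + 1, n_items - 1)  # T2 follows at a random lag
    t1, t2 = rng.sample(string.ascii_uppercase, 2)
    stream[t1_pos], stream[t2_pos] = t1, t2
    return stream, (t1_pos, t2_pos)

def trial_duration_ms(stream):
    """Total presentation time of the stream at the fixed item duration."""
    return len(stream) * SOA_MS
```

Because second-target accuracy (the attentional blink) varies with the T1–T2 lag, a harness built on this sketch would bin accuracy by `t2_pos - t1_pos`.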

9781107096400c10_p175-198.indd 185

4/10/2014 1:24:50 PM


Attention Influences Emotion

Not only do emotional states and traits influence the allocation of attention; the allocation of attention also influences liking, mood, and affect. At the most basic level, one’s emotional evaluation of stimuli can be manipulated purely by repetition (for review see Bornstein, 1989). Hundreds of studies have demonstrated that stimuli are evaluated more positively following repeated passive exposure to them. Examples include printed English words, Chinese characters, paintings, people, faces, geometric figures, and sounds. Explanations for this mere exposure effect include that perceptual encoding facilitates subsequent perception, which is then interpreted as preference (Bornstein and D’Agostino, 1994), and that the preference from exposure results from classical conditioning in which stimuli are passively viewed in the absence of noxious events (Zajonc, 2001).

Perceptual fluency can be manipulated by means other than sheer repetition. When fluency is enhanced through a single prior exposure (priming), increased figure-ground contrast, or longer presentation durations, stimuli are evaluated more positively (Reber, Winkielman, and Schwarz, 1998). These effects can also be measured using facial electromyography immediately following stimulus presentation and prior to explicit judgments. High fluency was associated with stronger activity over the zygomaticus region (the “smiling muscle” activated during positive affective responses), but was not associated with the activity of the corrugator region (the “frowning muscle” activated during negative affective responses) (Winkielman and Cacioppo, 2001).

These experiments on mere exposure and perceptual fluency do not specifically manipulate attention – in all cases it is directed toward the experimental stimuli. When selective attention is manipulated, by requiring that participants attend to certain stimuli and thereby ignore others, we learn that not any exposure is good exposure!
In a visual search task, where half of the distractors appeared before the other half of distractors and the target, a mere exposure account would predict more positive evaluations of previewed than nonpreviewed distractors, as they are seen for longer. However, they were actually devalued relative to nonpreviewed distractors (Fenske and Raymond, 2006). Because distractors compete for attention in search, attentional inhibition is applied and stored with the mental representation of that stimulus. When the previously distracting stimulus is encountered again in an affective evaluation task, the inhibition leads to affective devaluation. This pattern of results is true for both complex, but meaningless visual patterns (judged cheery or not-cheery; Raymond, Fenske, and Tavassoli, 2003), and photographs of human faces (judged trustworthy or not trustworthy; Fenske et al., 2005). Together, this research demonstrates that the allocation of attention has important influences on liking, mood, and affect in the laboratory. However, it is also critical to understand whether these effects occur in real-world
environments. Research on mere exposure and perceptual fluency typically employs very short presentation durations (e.g., 100–2,000 milliseconds in Reber, Winkielman, and Schwarz, 1998). While research on inhibition resulting in the affective devaluation employs longer presentation durations, the experiments are still conducted using paradigms that are far from the realm of everyday human experience. Our lab group has recently investigated how the allocation of attention influences emotional evaluation in a more naturalistic task – we explored how artists’ use of painterly techniques first utilized by Rembrandt during the seventeenth century influenced the attentional allocation of individuals viewing portraits, which in turn influenced their liking of the portraits (DiPaola, Riebe, and Enns, 2010).

Methodological Highlight: Looking Influences Liking of Art

We systematically examined how textural agency – the selective application of detail to different regions of portraits – influences the allocation of attention to, and the affective evaluation of, Rembrandt-like portraits. When a painter selects one region of the canvas for increased detail over another, the two regions usually also differ from one another in their meaningful content, in relative degree of lighting, and in relative spatial location. Thus, we photographed human models posing, dressed, and lit in a similar way to four of Rembrandt’s most famous late portraits. We then rendered the photographs in the style of Rembrandt by using a knowledge-based computer painterly rendering system. In each rendered portrait we selected four regions for selective manipulation of textural detail (either blurring or sharpening): one region was centered above each eye, and one was centered on each side of the chin, where the material of the collar meets the skin of the neck (see Figure 10.2a). This selective blurring and sharpening using coarse and fine brushwork created “lost and found edges.” The eye prefers to seek out edges (regions of strong contrast in tone and color), and when those edges disappear or are “lost,” the eye looks to find new edges. Specifically, Rembrandt’s coarse brushwork contained low spatial frequency information that was rapidly transmitted to many brain regions of the visual system to help orient the eyes to points of possible interest (“lost edges”), whereas his fine brushwork contained higher spatial frequency information that was transmitted more slowly to the centers involved in detailed and prolonged inspection (“found edges”).

In our systematic exploration of painting detail and eye movements, we first found that manipulating textural detail altered viewers’ gaze patterns (see Figure 10.2b for the fixation locations of a representative participant to a Rembrandt-styled portrait).
Figure 10.2. A study of eye movements in portraits. (a) Detailed crops of a painterly rendered portrait, showing right eye and neck region in greater detail (left image) versus left eye and neck region in greater detail (right image). (b) The same painterly rendered portrait with circles depicting the locations at which eight different participants (shown in different colors) have fixated. (See Plate 3.)

During eye tracking, participants inspected fewer locations overall, and their gaze rested longer at each location when viewing the Rembrandt-styled portraits with “lost and found edges” than photographs or other filler portraits without textural agency manipulations. There were also more fixations to the eyes in Rembrandt-like portraits compared to photographs and filler portraits. Although there were almost no fixations to the neck regions themselves, changes to the detail of the neck influenced overall viewing patterns; there was an increased likelihood of fixation to a detailed eye when it was on the same side as a less detailed neck region. Second, we found that these changes to viewers’ gaze patterns altered their emotional evaluation of the portraits. Participants reported that they liked the Rembrandt-styled portraits the “very best” at a rate significantly greater than chance. These were the portraits with detailed eye and neck regions on the same side of the image, compared to the portraits with a detailed right eye and left collar, for example. These findings demonstrate a direct connection between attention and emotion in a real-world setting. Researchers did not design these painterly techniques for an experiment on attention and emotion; we systematically manipulated a painterly technique first utilized by artists such as Rembrandt in the seventeenth century. Through doing so we showed that the artist’s selection of regions of a portrait for more or less detail has a direct influence on viewing behavior and aesthetic experience.
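The contrast between coarse and fine brushwork described above is, at bottom, a spatial frequency decomposition. A minimal one-dimensional sketch (not the knowledge-based rendering system used in the study) separates a luminance profile into a low-frequency component, analogous to coarse brushwork, and a high-frequency residual, analogous to fine brushwork; all names here are ours.

```python
def box_blur(signal, k=5):
    """Moving-average low-pass filter: a stand-in for coarse brushwork,
    which carries low spatial frequency information."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

def split_frequencies(signal, k=5):
    """Decompose a 1-D luminance profile into low frequencies ('lost edges',
    coarse brushwork) and a high-frequency residual ('found edges', fine
    brushwork). The two components sum back to the original signal."""
    low = box_blur(signal, k)
    high = [s - l for s, l in zip(signal, low)]
    return low, high

# A step edge in luminance: the high-frequency residual peaks at the edge,
# where fine brushwork would invite detailed inspection.
edge = [0.0] * 10 + [1.0] * 10
low, high = split_frequencies(edge)
```

Running the same decomposition over rows of a portrait image would localize exactly the regions that blurring (removing the high band) or sharpening (boosting it) manipulated in the experiment.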

Emotional Expressions

In this section we discuss how the allocation of attention influences emotional expression, a topic of comparatively little previous research. We begin with a discussion of several techniques for measuring both overt and covert emotional expressions in order to encourage researchers to utilize this available information. We conclude this section with a review of recent research from our lab demonstrating that performance in visual search, one of the most frequently studied attentional tasks, results in the outward expression of positive emotion by participants. In fact, we can utilize these emotional displays to understand search performance itself better.

Emotions are expressed both covertly and overtly, and there are techniques available to measure both. Electromyography (EMG; Tassinary and Cacioppo, 2000) detects changes in the electrical activity of facial muscles, even when people show no visible change in facial activity. Different emotions result in different patterns of muscle activity; for example, easy-to-process pictures elicited higher activity over the region of zygomaticus major (cheek), indicating positive affect, than harder-to-process pictures (Winkielman and Cacioppo, 2001). Overall, EMG is better at differentiating emotional valence (positive vs. negative) than distinguishing between specific emotional experiences (anger vs. fear), while autonomic nervous system activity can be used to differentiate between particular emotions of a given valence. Several skin and cardiovascular responses can be used to measure autonomic activity, including electrodermal (skin conductance) responses, which reflect the level of sweat at the surface of the skin, as well as heart rate, blood pressure, total peripheral resistance, cardiac output, preejection period, and heart rate variability (see Lang, Greenwald, Bradley, and Hamm, 1993 for autonomic responses to the International Affective Picture System (IAPS)).


Like covert expressions, overt facial expressions can be used to assess emotional experiences. They are elicited automatically and often involuntarily (Dimberg, Thunberg, and Grunedal, 2002), many are universally expressed and understood across cultures (Darwin, 1872; Ekman, Sorenson, and Friesen, 1969), and they can be quickly recognized and interpreted (Tracy and Robins, 2008). While there is contention surrounding which facial expressions of emotion belong to the “basic set” of distinct universal emotions, researchers generally agree on the inclusion of at least six: happiness, sadness, surprise, fear, disgust, and anger. People are very good at interpreting these facial expressions and are in fact able to discern a genuine “Duchenne” smile from a false smile by activity in the orbicularis oculi, the muscle that orbits the eyes (Ekman, Davidson, and Friesen, 1990).

Recent research suggests that bodily expressions also play an important role in communicating and interpreting emotion. In fact, we can recognize the basic set of emotions from bodily expressions in the absence of facial expressions (Van den Stock, Righart, and de Gelder, 2007), and independently of cultural factors (Rozin et al., 2005). When emotional expressions are considered in the context of heads and bodies, research suggests that the list of distinct and universally recognized emotions may need to be expanded to include pride (expanded posture and upward head tilt; Tracy and Matsumoto, 2008) and shame (hunched posture and downward head tilt; Keltner, 1995). Recognition of dynamic whole-body expressions is easier than recognition of static expressions (Atkinson, Dittrich, Gemmell, and Young, 2004), and these dynamic expressions not only provide information about the emotional state of the producer, but also signal his or her action intentions.
In an effort to harness the wealth of information available in facial and bodily expressions of emotion, we recently conducted a set of experiments investigating how visual search performance influences the emotional state of study participants (Brennan, Watson, Kingstone, and Enns, 2011). We developed a new methodology for this study, one in which we used people’s everyday expertise in “people watching” to measure participants’ emotional experience. We then used what we learned from the people watchers to understand searchers’ cognition better. This decision was spurred by the many studies in social psychology that have demonstrated the surprising reliability and validity of thin-slicing – the ability of persons to make rapid evaluations of the personality, disposition, and intent of others from very small samples of their behavior (e.g., Ambady, Bernieri, and Richeson, 2000; Borkenau et al., 2004).

Methodological Highlight: Harnessing Emotional Expression to Understand Visual Cognition

In Phase 1 of the experiments, participants completed a standard visual search task and we measured their performance. Figure 10.3a shows a photograph of the office in which participants searched. We videotaped the experimental session, and in Phase 2, a second sample of participants used their everyday people watching skills to observe and judge the behavior of Phase 1 participants, as seen in the video recorded searches. Figure 10.3b illustrates participants performing searches in the office. Specifically, Phase 2 participants rated searchers’ level of interest (bored vs. interested) and pleasure and satisfaction (unhappy/dissatisfied vs. very happy/satisfied). Each video corresponded to a trial in the search task of Phase 1, beginning with the onset of the stimulus display (which only the Phase 1 participant could see) and ending with the Phase 1 participant responding where the target was located in the display. A random sample of each searcher’s video clips was presented in a random order. Thus, both the experimenter and the Phase 2 participants were blind to the conditions under which the search behavior shown in the clips had been obtained.

Figure 10.3. A study of emotional expressions during visual search. (a) The participant’s view of the cluttered office and (b) representative still frames from a video of participants searching in the office. Actors are posed to respect the privacy of study participants.

We began by instructing the Phase 1 participants to search for common objects, depicted in a photograph of the office, while they were seated in front of a computer screen with their hands on the keyboard. This is typical of how visual search has been studied in the laboratory, and as such, it provided us with hundreds of potential search trials to sample from. Next, we ventured out from this standard computer-based search task to examine visual search for items hidden in the actual office that had previously been depicted. In Experiment 2, practical considerations reduced the opportunity to acquire as many trials as in Experiment 1, but at the same time the search behavior we were able to record for person perception in Phase 2 was potentially richer, given that it more closely approximated the conditions under which people perform searches in everyday life.
In both Experiments 1 and 2, we randomly assigned searchers to receive one of two sets of strategy instructions: either actively to direct their attention in search of the target or passively to allow the target to pop into their mind. Post hoc, we characterized searchers’ proficiency as either fast or slow on the basis of whether they were below or above the median search time, respectively. When searching in a photograph, participants who were instructed to direct their attention actively showed more positive emotion and interest than participants instructed to allow the target to pop into their mind passively. As a group, active searchers were also faster than passive searchers. Faster searchers, irrespective of the search strategy instructions they received, were rated as showing more positive emotion upon finding the search target and more interest in searching than slower searchers. We also found that positive emotional expression contributed uniquely to how fast people searched; it accounted for 33 percent of the variance in search time on its own, and together with ratings of eye movement activity it accounted for more than 60 percent of the individual differences in search performance. When searching in an actual cluttered office, ratings of positive emotion and interest showed an interaction between search strategy (active vs. passive) and proficiency (fast vs. slow). Searchers were judged as expressing
the highest degree of positive emotion and interest when their search strategy matched their search proficiency (i.e., high-proficiency active searchers and low-proficiency passive searchers expressed the most positive emotion, whereas low-proficiency active searchers and high-proficiency passive searchers expressed the least). Such a congruency effect on emotional expression is reminiscent of fluency theory, whereby more fluent processing produces more positive responses (Reber, Winkielman, and Schwarz, 1998). In this framework, searchers experienced maximum enjoyment when their proficiency and strategy were aligned, and importantly, this increased enjoyment was visible in searchers’ overt behaviors and expressions.

This study demonstrates that emotion plays an integral role in a cognitive task previously considered to be devoid of emotion. While researchers had examined how emotional stimuli influence search performance, they had not considered that the allocation of attention during search itself would have emotional consequences for the searcher.
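Two of the analysis steps described above can be sketched directly: the post hoc median split on search time, and the share of variance a single rated predictor explains (the squared Pearson correlation, the statistic behind a figure such as the 33 percent reported for positive emotional expression). This is a minimal sketch; the tie-handling rule and the function names are our assumptions, not the study’s.

```python
import statistics

def proficiency_labels(search_times):
    """Post hoc median split as in the study: below-median searchers are
    'fast', the rest 'slow'. How ties at the median were resolved is an
    assumption here (they go to 'slow')."""
    med = statistics.median(search_times)
    return ['fast' if t < med else 'slow' for t in search_times]

def variance_explained(x, y):
    """Proportion of variance in y accounted for by a single linear
    predictor x: the squared Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)
```

For instance, a predictor correlating with search time at r ≈ ±0.57 accounts for roughly a third of the variance, since 0.57² ≈ 0.33; adding a second predictor (as with the eye movement ratings) requires a multiple regression rather than this single-predictor sketch.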

Conclusion

We have demonstrated that the relationship between visual attention and emotion runs both ways. Not only do emotional messages in our environment, and our own internal emotional states and traits, influence the allocation of attention while performing cognitive tasks, but the very act of allocating attention itself alters our internal emotional experience and overt expressions of emotion. By reviewing recent research and theory on both directions of this relationship, we hope to have illuminated the importance of considering both possible paths of influence for behavioral research on attention and emotion. This is particularly valuable and timely in light of recent neuroscientific evidence concerning overlap in the brain systems involved in emotion and attention, including the prefrontal cortex (orbitofrontal cortex in particular), anterior cingulate cortex, striatum, thalamus, and cholinergic basal forebrain nuclei (Vuilleumier, Armony, and Dolan, 2003). In highlighting the reciprocal nature of the relationship between emotion and attention, we hope to encourage researchers to continue to tease apart the connections between attention and emotion where understanding is currently lacking.

References

Ambady, N., Bernieri, F. J. and Richeson, J. A. (2000). Toward a histology of social behavior: Judgmental accuracy from thin slices of the behavioral stream. In M. P. Zanna (Ed.), Advances in experimental social psychology, Vol. 32 (pp. 201–271). San Diego, CA: Academic Press.


Armony, J. L. and Dolan, R. J. (2002). Modulation of spatial attention by fear conditioned stimuli: An event-related fMRI study. Neuropsychologia, 40, 817–826.
Atkinson, A. P., Dittrich, W. H., Gemmell, A. J. and Young, A. W. (2004). Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception, 33, 717–746.
Becker, M. W. and Leinenger, M. (2011). Attentional selection is biased toward mood-congruent stimuli. Emotion, 11(5), 1248–1254.
Borkenau, P., Mauer, N., Riemann, R., Spinath, F. M. and Angleitner, A. (2004). Thin slices of behavior as cues of personality and intelligence. Journal of Personality and Social Psychology, 86, 599–614.
Bornstein, R. F. (1989). Exposure and affect: Overview and meta-analysis of research, 1968–1987. Psychological Bulletin, 106, 265–289.
Bornstein, R. F. and D’Agostino, P. R. (1994). The attribution and discounting of perceptual fluency: Preliminary tests of a perceptual fluency/attributional model of the mere exposure effect. Social Cognition, 9, 103–128.
Brennan, A. A., Watson, M. R., Kingstone, A. and Enns, J. T. (2011). Person perception informs understanding of cognition during visual search. Attention, Perception, and Psychophysics, 73, 1672–1693. doi: 10.3758/s13414-011-0141-7.
Brosch, T., Sander, D., Pourtois, G. and Scherer, K. R. (2008). Beyond fear: Rapid spatial orienting toward positive emotional stimuli. Psychological Science, 19(4), 362–370.
Darwin, C. (1872). The expression of the emotions in man and animals. London: Murray. (Reprinted, Chicago: University of Chicago Press, 1965.)
de Gelder, B., Vroomen, J., Pourtois, G. and Weiskrantz, L. (1999). Non-conscious recognition of affect in the absence of striate cortex. NeuroReport, 10, 3759–3763.
Devue, C., Belopolsky, A. V. and Theeuwes, J. (2011). The role of fear and expectancies in capture of covert attention by spiders. Emotion, 11, 768–775.
Dimberg, U., Thunberg, M. and Grunedal, S. (2002). Facial reactions to emotional stimuli: Automatically controlled emotional responses. Cognition and Emotion, 16(4), 449–471.
DiPaola, S., Riebe, C. and Enns, J. T. (2010). Rembrandt’s textural agency: A shared perspective in visual art and science. Leonardo, 43(2), 145–151.
Eich, E., Macaulay, D. and Ryan, L. (1994). Mood dependent memory for events of the personal past. Journal of Experimental Psychology: General, 123, 201–215.
Ekman, P., Davidson, R. J. and Friesen, W. V. (1990). Emotional expression and brain physiology II: The Duchenne smile. Journal of Personality and Social Psychology, 58, 342–353.
Ekman, P., Sorenson, E. R. and Friesen, W. V. (1969). Pan-cultural elements in facial displays of emotion. Science, 164(3875), 86–88.
Esteves, F., Parra, C., Dimberg, U. and Öhman, A. (1994). Nonconscious associative learning: Pavlovian conditioning of skin conductance responses to masked fear-relevant facial stimuli. Psychophysiology, 31, 375–385.
Fenske, M. J. and Raymond, J. E. (2006). Affective influences of selective attention. Current Directions in Psychological Science, 15(6), 312–316.


Fenske, M. J., Raymond, J. E., Kessler, K., Westoby, N. and Tipper, S. P. (2005). Attentional inhibition has social-emotional consequences for unfamiliar faces. Psychological Science, 16(10), 753–758.
Fox, E., Griggs, L. and Mouchlianitis, E. (2007). The detection of fear-relevant stimuli: Are guns noticed as quickly as snakes? Emotion, 7, 691–696.
Fox, E., Russo, R., Bowles, R. and Dutton, K. (2001). Do threatening stimuli draw or hold visual attention in subclinical anxiety? Journal of Experimental Psychology: General, 130(4), 681–700.
Fredrickson, B. L. (2003). The value of positive emotions. American Scientist, 91, 330–335.
Gasper, K. and Clore, G. L. (2002). Attending to the big picture: Mood and global versus local processing of visual information. Psychological Science, 13, 34–40.
Gazzaniga, M. S. (1998). The split brain revisited. Scientific American, 279(1), 35–39.
Higgins, K. E., Wood, J. and Tait, A. (1998). Vision and driving: Selective effect of optical blur on different driving tasks. Human Factors, 40, 224–233.
Horrey, W. J., Wickens, C. D. and Consalus, K. P. (2006). Modeling drivers’ visual attention allocation while interacting with in-vehicle technologies. Journal of Experimental Psychology: Applied, 12(2), 67–78.
Isaacowitz, D. M. (2006). Positive psychology and measurement. In A. Ong and M. van Dulmen (Eds.), The Oxford Handbook of Methods in Positive Psychology. New York: Oxford University Press.
Isen, A. M. (2000). Positive affect and decision making. In M. Lewis and J. Haviland-Jones (Eds.), Handbook of Emotions (2nd ed., pp. 417–435). New York: Guilford.
James, W. (1890). The principles of psychology, Vol. 1. New York: Holt.
Jefferies, L. N., Smilek, D., Eich, E. and Enns, J. T. (2008). Emotional valence and arousal interact in attentional control. Psychological Science, 19(3), 290–295.
Keil, A. and Ihssen, N. (2004). Identification facilitation for emotionally arousing verbs during the attentional blink. Emotion, 4, 23–35.
Keltner, D. (1995). Signs of appeasement: Evidence for the distinct displays of embarrassment, amusement, and shame. Journal of Personality and Social Psychology, 68, 441–454.
Kindt, M. and Brosschot, J. F. (1999). Cognitive bias in spider-phobic children: Comparison of a pictorial and a linguistic spider Stroop. Journal of Psychopathology and Behavioral Assessment, 21(3), 207–220.
Lang, P. J. (1995). The emotion probe. American Psychologist, 50, 372–385.
Lang, P. J., Bradley, M. M. and Cuthbert, B. N. (2008). International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8. University of Florida, Gainesville, FL.
Lang, P. J., Greenwald, M. K., Bradley, M. M. and Hamm, A. O. (1993). Looking at pictures: Affective, facial, visceral, and behavioral reactions. Psychophysiology, 30, 261–273.
Larsen, R. J. and Diener, E. (1992). Promises and problems with the circumplex model of emotion. In M. S. Clark (Ed.), Review of personality and social psychology, Vol. 13: Emotion (pp. 25–59). Newbury Park, CA: Sage.


Lehrer, J. (2009). How we decide. Boston: Houghton Mifflin Harcourt.
Mathewson, K. M., Arnell, K. M. and Mansfield, C. (2008). Capturing and holding attention: The impact of emotional words in rapid serial visual presentation. Memory and Cognition, 36, 182–200.
McGlynn, F. D., Wheeler, S. A., Wilamowska, Z. A. and Katz, J. S. (2008). Detection of change in threat-related and innocuous scenes among snake-fearful and snake-tolerant participants: Data from the flicker task. Journal of Anxiety Disorders, 22, 515–523.
Mikels, J. A., Maglio, S. J., Reed, A. E. and Kaplowitz, L. J. (2011). Should I go with my gut? Investigating the benefits of emotion-focused decision making. Emotion, 11(4), 743–753.
Morris, J. S., Öhman, A. and Dolan, R. J. (1998). Conscious and unconscious learning in the human amygdala. Nature, 393, 467–470.
Nadel, L. and Jacobs, W. J. (1998). Traumatic memory is special. Current Directions in Psychological Science, 7, 154–157.
Öhman, A., Flykt, A. and Esteves, F. (2001). Emotion drives attention: Detecting the snake in the grass. Journal of Experimental Psychology: General, 130(3), 466–478.
Öhman, A., Lundqvist, D. and Esteves, F. (2001). The face in the crowd revisited: A threat advantage with schematic stimuli. Journal of Personality and Social Psychology, 80(3), 381–396.
Olivers, C. N. L. and Nieuwenhuis, S. (2006). The beneficial effects of additional task load, positive affect, and instruction on the attentional blink. Journal of Experimental Psychology: Human Perception and Performance, 32(2), 364–379.
Owens, D. A. and Tyrrell, R. A. (1999). Effects of luminance, blur, and age on nighttime visual guidance: A test of the visual degradation hypothesis. Journal of Experimental Psychology: Applied, 5(2), 115–128.
Raymond, J. E., Fenske, M. J. and Tavassoli, N. T. (2003). Selective attention determines emotional responses to novel visual stimuli. Psychological Science, 14, 537–542.
Reber, R., Winkielman, P. and Schwarz, N. (1998). Effects of perceptual fluency on affective judgments. Psychological Science, 9(1), 45–48.
Rowe, G., Hirsh, J. B., Anderson, A. K., Smith, E. E. (2007). Positive affect increases the breadth of attentional selection. Proceedings of the National Academy of Sciences of the United States of America, 104(1), 383–388.
Rozin, P., Taylor, C., Ross, L., Bennett, G. and Hejmadi, A. (2005). General and specific abilities to recognize negative emotions, especially disgust, as portrayed in the face and the body. Cognition and Emotion, 19, 397–412.
Russell, J. A. (2003). Core affect and the psychological construction of emotion. Psychological Review, 110, 145–172.
Russell, J. A. and Barrett, L. F. (1999). Core affect, prototypical emotional episodes and other things called emotion: Dissecting the elephant. Journal of Personality and Social Psychology, 76, 805–819.
Schimmack, U. (2005). Attentional interference effects of emotional pictures: Threat, negativity, or arousal? Emotion, 5, 55–66.


Shapiro, K. L., Caldwell, J. and Sorensen, R. E. (1997). Personal names and the attentional blink: A “visual” cocktail party effect. Journal of Experimental Psychology: Human Perception and Performance, 23, 504–514.
Smilek, D., Enns, J. T., Eastwood, J. D. and Merikle, P. M. (2006). Relax! Cognitive strategy influences visual search. Visual Cognition, 14, 543–564.
Strauss, G. P. and Allen, D. N. (2006). The experience of positive emotion is associated with the automatic processing of positive emotional words. The Journal of Positive Psychology, 1(3), 150–159.
Summala, H., Lamble, D. and Laakso, M. (1998). Driving experience and perception of the lead car’s braking when looking at in-car targets. Accident Analysis and Prevention, 30, 401–407.
Summala, H., Nieminen, T. and Punto, M. (1996). Maintaining lane position with peripheral vision during in-vehicle tasks. Human Factors, 38, 442–451.
Tamietto, M. and de Gelder, B. (2010). Neural bases of the non-conscious perception of emotional signals. Nature Reviews Neuroscience, 11, 697–709.
Tassinary, L. G. and Cacioppo, J. T. (2000). The skeletomuscular system: Surface electromyography. In J. T. Cacioppo, L. G. Tassinary and G. G. Berntson (Eds.), Handbook of psychophysiology (2nd ed., pp. 163–199). New York: Cambridge University Press.
Thayer, R. E. (1989). The biopsychology of mood and activation. New York: Oxford University Press.
Tracy, J. L. and Matsumoto, D. (2008). The spontaneous expression of pride and shame: Evidence for biologically innate nonverbal displays. Proceedings of the National Academy of Sciences, 105(33), 11655–11660.
Tracy, J. L. and Robins, R. W. (2008). The automaticity of emotion recognition. Emotion, 8(1), 81–95.
Treisman, A. M. and Gelade, G. (1980). A feature integration theory of attention. Cognitive Psychology, 12, 97–136.
Trick, L. M., Brandigampola, S. and Enns, J. T. (2012). How fleeting emotions affect hazard perception and steering while driving: The impact of image arousal and valence. Accident Analysis and Prevention, 45, 222–229. doi: 10.1016/j.aap.2011.07.006.
Van den Stock, J., Righart, R. and de Gelder, B. (2007). Body expressions influence recognition of emotions in the face and voice. Emotion, 7(3), 487–494.
Van Essen, D. C. (2004). Organization of visual areas in macaque and human cerebral cortex. In L. Chalupa and J. S. Werner (Eds.), The Visual Neurosciences (pp. 507–521). Cambridge, MA: MIT Press.
Vuilleumier, P., Armony, J. and Dolan, R. (2003). Reciprocal links between emotion and attention. In R. S. J. Frackowiak et al. (Eds.), Human Brain Function (2nd ed., pp. 419–444). San Diego: Academic Press.
Wadlinger, H. A. and Isaacowitz, D. M. (2006). Positive mood broadens visual attention to positive stimuli. Motivation and Emotion, 30, 87–99.
Watson, D. and Tellegen, A. (1985). Toward a consensual structure of mood. Psychological Bulletin, 98, 219–235.
Watts, F. N., McKenna, F. P., Sharrock, R. and Trezise, L. (1986). Colour naming of phobia-related words. British Journal of Psychology, 77(1), 97–108.


Winkielman, P. and Cacioppo, J. T. (2001). Mind at ease puts a smile on the face: Psychophysiological evidence that processing facilitation elicits positive affect. Journal of Personality and Social Psychology, 81(6), 989–1000.
Yiend, J. (2010). The effects of emotion on attention: A review of attentional processing of emotional information. Cognition and Emotion, 24, 3–47.
Yik, M. S. M., Russell, J. A. and Barrett, L. F. (1999). Structure of self-reported current affect: Integration and beyond. Journal of Personality and Social Psychology, 77, 600–619.
Zajonc, R. B. (2001). Mere exposure: A gateway to the subliminal. Current Directions in Psychological Science, 10, 224–228.
