NeuroImage 60 (2012) 130–138
Capture of lexical but not visual resources by task-irrelevant emotional words: A combined ERP and steady-state visual evoked potential study

Sophie M. Trauer a, Søren K. Andersen a,b, Sonja A. Kotz c, Matthias M. Müller a,⁎

a Institute of Psychology, University of Leipzig, Leipzig, Germany
b Department of Neurosciences, University of California at San Diego, La Jolla, CA, USA
c Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
Article info

Article history: Received 1 August 2011; Revised 1 November 2011; Accepted 9 December 2011; Available online 20 December 2011

Keywords: Visual attention; Emotional word processing; Human EEG; Steady-state visual evoked potential; P2 and N400 component
Abstract

Numerous studies have found that emotionally arousing faces or scenes capture visual processing resources. Here we investigated whether emotional distractor words capture attention in an analogous way. Participants detected brief intervals of coherent motion in an array of otherwise randomly moving squares superimposed on words of positive, neutral or negative valence. Processing of the foreground task was assessed by behavioural responses and steady-state visual evoked potentials (SSVEPs) elicited by the squares flickering at 15 Hz. Although words were task-irrelevant, P2 and N400 deflections to negative words were enhanced, indicating that emotionally negative word content modulated lexico-semantic processing and that emotional significance was detected. In contrast, the time course of behavioural data and SSVEP amplitudes revealed no interference with the task regardless of the emotional connotation of distractor words. This dissociation of emotion effects on early perceptual versus lexical stages of processing suggests that written emotional words do not inevitably lead to attentional modulation in early visual areas. Prior studies have shown a distraction effect of emotional pictures on a similar task. Thus, our results indicate the specificity of emotion effects on sensory processing and semantic encoding, depending on the information channel from which emotional significance is derived.
Introduction

Emotional words and visual attention

It is of evolutionary advantage that our attentional focus does not exclusively follow perceptually striking items or current intentions. Objects that are dangerous or desirable, and thus potentially relevant for an organism's survival, should be detected and processed rapidly and prioritised over present task goals in many situations. Accordingly, emotionally significant signals are favoured in perception and memory, as reflected in behavioural performance and corresponding cortical activity (for reviews, see Lang and Bradley, 2009; Vuilleumier, 2005). Arousing stimuli are detected quickly amongst other objects (Hodsoll et al., 2011; Öhman et al., 2001), are less prone to be missed when processing capacities are limited (e.g. Maratos et al., 2008), are more likely to be remembered (Versace et al., 2010), and boost perceptual processing as indicated by enhanced cortical activation along the ventral visual path (Sabatinelli et al., 2005). In spite of empirical evidence that affective stimuli are prioritised, it is still a matter of debate how invariably sensory processes
are shaped by emotionally arousing signals (Vuilleumier and Huang, 2009). Here we examined how symbolic stimuli such as written words, whose meaning is not necessarily linked to their physical features, influence early visual processing. Written words are acquired rather late in life. They can be ambiguous when presented out of sentence context (Fischler and Bradley, 2006), suggesting that the emotional arousal of words may be variable, if not too weak to capture attention. However, words still exert emotion effects on brain activity when presented subliminally (Bernat et al., 2001; Ortigue et al., 2004) and their valence can be guessed above chance even without awareness of the word (Nasrallah et al., 2009). Arousing verbal stimuli can lead to amygdala activation similar to that induced by emotional faces, pictures, or conditioned stimuli (Baas et al., 2004). Some studies found emotion effects on the ERP to written words as early as 100 ms (Begleiter and Platz, 1969; Landis, 2006; Scott et al., 2009; Skrandies, 1998; for a review see Kissler et al., 2006). Thus emotional word content can be detected rapidly, but how does this influence further processing? Attentional effects of words have been investigated with a variety of paradigms. For example, Attentional Blink (AB) studies repeatedly reported identification advantages for emotional compared to neutral words. Keil and Ihssen (2004) presented verbs in Rapid Serial Visual Presentation (RSVP) streams containing one or two colour target words. Whilst
after the first target (T1) identification rates for a second target (T2) decreased for a short time (i.e. an AB was observed), T2 words of high arousal were identified more often than low arousing ones. Vice versa, emotional words, even as irrelevant distractors, increase the AB in subsequent target identification (Arnell et al., 2007; Mathewson et al., 2008), indicating the enhanced capture of processing resources by emotional words. As another example, in emotional Stroop tasks the meaning of a word serves as distractor whilst its ink colour has to be named. Several studies showed that arousing word content interferes more with colour naming, as indicated by slowed response latencies (McKenna and Sharma, 1995; Pratto and John, 1991; Williams et al., 1996). These findings support the assumption that arousing word content captures attention. Note that in these studies the task was word identification or naming a colour. Therefore, it remains unclear whether emotional words are prioritised at perceptual or lexical stages of processing. However, there is evidence that affective words also compete with non-lexical stimuli and tasks. Dot-probe experiments require reactions towards neutral stimuli after task-irrelevant words are presented at possible target locations. Low-anxious participants have been shown to react more slowly to targets at locations where negative compared to neutral words were presented before, whereas anxiety correlates with accelerated detection of targets replacing threat words (Amir et al., 2003; MacLeod et al., 1986; Mogg et al., 1997). These findings demonstrate the influence of emotional words on visuospatial attention, but they also emphasise that behavioural effects of emotional words, being the result of a number of processing steps, can be strongly mediated by cognitive-motivational biases. Imaging studies, which offer a more direct approach to study the influence of emotional words on activity in visual areas, have not yet drawn a consistent picture. During silent reading of emotionally positive compared to neutral and negative adjectives, an enhanced BOLD response in left extrastriate regions was found that was positively correlated with left amygdala activation (Herbert et al., 2009). However, using a lexical decision task, Kuchinke et al. (2005) observed no stronger activation for emotional as compared to neutral nouns in occipital regions. Thus, it remains unclear whether emotional words capture attention in terms of enhanced early visual processing or interfere with concurrent tasks only at later stages of attentional selection.
The present study

Ongoing amplification of visual responses to emotional pictures and faces compared to neutral ones has been demonstrated by frequency-tagging the sensory processing of those stimuli using steady-state visual evoked potentials (SSVEPs; Bakardjian et al., 2011; Keil et al., 2003, 2010; McTeague et al., 2011). In a distraction paradigm, Müller et al. (2008; Hindi Attar et al., 2010a) presented neutral and emotional background pictures whilst participants performed a motion detection task on an array of randomly moving squares superimposed on the pictures. The onset of any recognisable picture led to decreased target detection rates and SSVEP amplitudes. This decrease was stronger for positive and negative pictures between 400 and 1000 ms after picture onset, indicating that emotional arousal of stimuli additionally biases competition for visual processing resources. This notion was supported further by an imaging study where activity in the motion-sensitive area V5, induced by the task array of moving dots, was reduced when emotional as compared to neutral background faces were presented (Hindi Attar et al., 2010b). Here we investigated whether task-irrelevant emotional words capture visual processing resources as has been observed for emotional pictures and faces (Hindi Attar et al., 2010a,b; Müller et al., 2008). On the one hand, words are not very complex visual stimuli. On the other hand, their emotional connotation may be derived later than that of faces or scenes, as indicated by early ERP effects for emotional pictorial stimuli but not for words in studies that compared both stimulus types (Frühholz et al., 2011; Hinojosa et al., 2009). Therefore, previous results do not allow for more specific hypotheses about the timing and direction of effects. Thus, we examined the time course of emotion effects on ongoing visual processing by means of the SSVEP to a foreground task. To our knowledge, only two previous studies have investigated the processing of emotional words with steady-state potentials. In an RSVP study, emotional verbs led to a transient early increase of the SSVEP amplitude elicited by the presentation rate of 8.6 Hz (Keil et al., 2006). This facilitation correlated with more accurate word identification. The authors concluded that the SSVEP effect reflected a facilitation of early stages of visual word processing by affective content, especially so in situations of limited resources as during the attentional blink. In contrast, Koban et al. (2010) presented neutral and emotional nouns flickering at 7.5 Hz in a passive viewing paradigm and found a late and ongoing decrease in SSVEP amplitudes following positive words. Given that words are symbolic and perceptually simple stimuli, the finding was interpreted as a potential “shift of attention to internal processes rather than external stimulation” [p. 10]. The authors argued that associative rather than perceptual encoding may be amplified by word affect, resulting in decreased sensory processes as indicated by decreased SSVEP amplitudes. Neither study provided behavioural measures of selective attention. We therefore used a visual foreground task to investigate distraction effects of emotional words on steady-state and behavioural responses to the task. To simultaneously examine encoding of the task-irrelevant words we chose a higher flicker frequency (15 Hz), allowing for the concurrent analysis of ERPs elicited by word onset (see Müller and Hillyard, 2000). We hypothesised that the emotional meaning of background words, even though they were task-irrelevant, may modulate early stages of lexico-semantic analysis (i.e. ERP effects in the P2 time window, see e.g. Kanske and Kotz, 2007; Kissler et al., 2009) and later on during the N400 time window that reflects lexico-semantic integration (for a review, see Kutas and Federmeier, 2011). Furthermore, effects on the Late Positive Complex (LPC) were reported in most studies manipulating emotional word content (e.g. Fischler and Bradley, 2006; Herbert et al., 2008; Hinojosa et al., 2009).

Methods

Participants
Twenty-three right-handed subjects (12 female) with reported normal or corrected-to-normal visual acuity and a mean age of 23.4 years (SD: 2.8 years) took part in the experiment after giving written consent. All were native speakers of German and reported no prior difficulties in reading or orthography. Participants received course credit or monetary compensation. The experiment was conducted in accordance with the ethical provisions of the Declaration of Helsinki.

Stimuli

The task array (see Fig. 1) consisted of 180 randomly moving yellow squares (subtending approximately 0.3° × 0.3° of visual angle each) superimposed on a grey rectangle (15° × 7.5°) containing black background letter strings (10° × 2°). A yellow fixation cross was presented at the centre of the screen throughout each trial. 60 neutral, 60 negative, and 60 positive German nouns were selected from a word pool rated in a prior study (Kanske and Kotz, 2010). Words differed significantly in terms of valence and arousal (see Table 1) but were matched for print frequency, word length in letters and syllables, as well as concreteness, as the latter factor has been shown to interact with emotion effects (Kanske and Kotz, 2007). 120 additional words served as filler items in training and
catch trials (see Procedure). Words were paired with equally long random consonant strings serving as baseline stimuli. Consonant strings and words were stretched horizontally to have a constant number of pixels and hence equal luminance. To control for a general effect of pronounceable words on task performance and the SSVEP, an additional condition was created by pairing the baseline consonant string with a matched second consonant string in one fourth of all trials.

Fig. 1. Schematic illustration of stimulation. Yellow squares moved randomly and flickered at 15 Hz. Targets consisted of coherent (20%) movements of the squares for 200 ms.

Procedure

Participants were instructed to maintain gaze at the fixation cross whilst attending the array of squares in order to detect brief (200 ms) movements of 20% coherence in one of the four cardinal directions. Such targets could occur randomly 0 to 3 times per trial and had to be indicated as fast and accurately as possible by pressing the space bar. Participants were instructed that background stimuli were task-irrelevant. Stimuli were presented on a 19 inch CRT monitor at a viewing distance of 80 cm. The squares flickered at 15 Hz, realised by a presentation cycle of 2 frames on and 2 frames off at a 60 Hz screen refresh rate. Three to six training blocks of 36 trials each were performed in order to become acquainted with the task. The recording consisted of 8 blocks of 64 trials each. Each trial of 3733 ms began with the onset of the flickering squares superimposed on a consonant string that switched to a neutral, negative or positive word or to a second consonant string after a baseline of 867 to 1600 ms in 75% of the trials. Catch trials (25%) with earlier (200 to 800 ms) or later (1667 to 3600 ms) background switches rendered the timing of display changes unpredictable for participants but were not analysed. Each trial was followed by a blank screen for 1 to 1.5 s. The same word valence category was never shown for more than three consecutive trials; otherwise, the occurrence of the conditions was randomised across the experiment. Sixty percent of the background stimuli, randomly chosen per individual, were repeated once throughout the experiment so that 96 trials per condition were recorded.
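To illustrate the timing regime described above, the following minimal Python sketch generates the per-frame on/off schedule of the 15 Hz flicker and draws a word-onset latency from the stated baseline range. All names, and the quantisation of onsets to whole flicker cycles, are illustrative assumptions and not taken from the original stimulation code.

import numpy as np

REFRESH_HZ = 60                       # CRT refresh rate
FRAMES_ON, FRAMES_OFF = 2, 2          # 2 frames on, 2 frames off -> 15 Hz flicker
TRIAL_MS = 3733                       # trial duration

def flicker_schedule(trial_ms=TRIAL_MS, refresh_hz=REFRESH_HZ):
    """Boolean per-frame schedule: True = squares drawn on this frame."""
    n_frames = int(round(trial_ms / 1000 * refresh_hz))
    cycle = [True] * FRAMES_ON + [False] * FRAMES_OFF
    return np.resize(cycle, n_frames)

frames = flicker_schedule()
flicker_hz = REFRESH_HZ / (FRAMES_ON + FRAMES_OFF)
print(flicker_hz)        # 15.0 Hz driving frequency
print(frames[:8])        # [ True  True False False  True  True False False]

# The background switched to a word 867-1600 ms after trial onset in the main
# trials; here the onset is drawn as a whole number of 66.7 ms flicker cycles
# (an illustrative assumption).
rng = np.random.default_rng(0)
cycle_ms = 1000 / flicker_hz
word_onset_ms = rng.integers(13, 25) * cycle_ms   # 13*66.7 ≈ 867 ms ... 24*66.7 = 1600 ms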
Table 1. Stimulus word properties (M ± SD) for the three emotional word categories. Rating data were derived from Kanske and Kotz (2010). Frequency is given according to the Wortschatz Lexikon of the University of Leipzig (http://wortschatz.uni-leipzig.de/); a frequency class of x indicates that the German article "der" occurs approximately 2^x times as often as the selected word. Concreteness was rated on a 9-point scale; arousal and valence were measured with the 9-point Self-Assessment Manikin scale (Bradley and Lang, 1994). Note that valence and arousal ratings differed between all three word categories.

                Neutral         Negative        Positive        Pairwise t-tests
Letters         5.38 ± 0.74     5.38 ± 0.76     5.25 ± 0.73     n.s., all p > 0.1
Syllables       1.80 ± 0.40     1.75 ± 0.44     1.72 ± 0.45     n.s., all p > 0.1
Frequency       11.50 ± 1.96    11.87 ± 2.06    11.45 ± 1.61    n.s., all p > 0.1
Concreteness    5.74 ± 1.3      5.79 ± 1.54     5.72 ± 1.96     n.s., all p > 0.1
Arousal         2.65 ± 0.67     6.56 ± 0.65     5.60 ± 0.83     *, all p < 0.0001
Valence         5.33 ± 0.39     2.56 ± 0.44     7.21 ± 0.31     *, all p < 0.0001

Behavioural data
Onsets of target events were uniformly distributed over time. Across the entire experiment, three targets per condition (neutral, negative, positive, consonant string) occurred in each cycle of the flicker (i.e. 67 ms time window). Responses occurring between 200 and 1000 ms after a target onset were considered correct. In order to increase the number of responses per data point, we averaged the response data across four short time bins of 67 ms, resulting in 9 time windows of 267 ms each (two before and seven after the background switch). Hits and reaction times to targets having an onset from 533 ms before to 1867 ms after the background switch were subjected to two separate two-factorial repeated-measurement analyses of variance (ANOVAs) with the factors Emotion (neutral, negative, positive) and Time Window (1–9) and, secondly, Lexicality (neutral, consonant string) and Time Window.
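The scoring of motion targets can be summarised in a short Python sketch: a response counts as a hit if it falls 200-1000 ms after a target onset, and each target is assigned to one of the nine 267 ms analysis windows around the background switch. Constant and function names are illustrative, and the handling of multiple responses per target is a simplifying assumption rather than the original analysis code.

import numpy as np

HIT_MIN_MS, HIT_MAX_MS = 200, 1000        # responses in this range after a target count as hits
BIN_MS = 4 * 1000 / 15                    # four 67 ms flicker bins -> ~267 ms analysis windows
WIN_EDGES = (np.arange(10) - 2) * BIN_MS  # nine windows: two before, seven after word onset

def score_trial(target_onsets_ms, response_times_ms):
    """Per target: analysis window index (0-8), hit or miss, and reaction time.

    Times are in ms relative to the background (word) switch. A toy scorer for
    illustration only; the original analysis may have treated multiple
    responses per target differently.
    """
    results = []
    for t in target_onsets_ms:
        rts = [r - t for r in response_times_ms if HIT_MIN_MS <= r - t <= HIT_MAX_MS]
        window = int(np.digitize(t, WIN_EDGES)) - 1   # -1 or 9 = outside the analysed span
        results.append((window, bool(rts), min(rts) if rts else None))
    return results

# Example: one target 100 ms after word onset, answered 450 ms later
print(score_trial([100], [550]))   # [(2, True, 450)]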
EEG data recording and analysis

The electroencephalogram (EEG) was recorded from 64 Ag/AgCl scalp electrodes at a sampling rate of 256 Hz using an ActiveTwo amplifier system (BioSemi, Amsterdam). Four additional electrodes recorded the horizontal and vertical electrooculogram. Epochs were extracted from 900 ms prior to 2500 ms after the switch of the background stimuli, and the mean amplitude was subtracted from each epoch. Trials with eye movements or blinks were excluded. Artifacts such as noisy electrodes were corrected using a combination of channel approximation and epoch exclusion based on statistical parameters of the data with the 'statistical control of artifacts in dense array EEG/MEG studies' (SCADS, Junghöfer et al., 2000). 89% of trials were retained; their number did not differ significantly between conditions. Data were then algebraically transformed to average reference. For each subject and channel all epochs of one condition were averaged. These averaged epochs were the basis for SSVEP as well as ERP analyses.
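A plain-NumPy sketch of the epoching and re-referencing steps described above (the SCADS artifact correction is omitted, and all names are illustrative assumptions):

import numpy as np

FS = 256                       # sampling rate in Hz
PRE_MS, POST_MS = 900, 2500    # epoch borders around the background switch

def epoch_and_rereference(eeg, switch_samples):
    """Cut epochs around each background switch, subtract the epoch mean and
    apply an average reference. `eeg` has shape (n_channels, n_samples)."""
    pre = int(PRE_MS / 1000 * FS)
    post = int(POST_MS / 1000 * FS)
    epochs = []
    for s in switch_samples:
        ep = eeg[:, s - pre:s + post].astype(float)
        ep -= ep.mean(axis=1, keepdims=True)   # remove the mean of each epoch, per channel
        ep -= ep.mean(axis=0, keepdims=True)   # average reference: subtract the mean across channels
        epochs.append(ep)
    return np.stack(epochs)                    # (n_trials, n_channels, n_times)

# Condition averages, the basis of both ERP and SSVEP analyses:
# avg_erp = epoch_and_rereference(eeg, switches_of_one_condition).mean(axis=0)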
Event-related potential

Epochs were baseline-corrected to a time window from −100 to 0 ms for ERP analysis. A measure of Global Field Power (Lehmann and Skrandies, 1984; calculated here as the standard deviation of deflections across all 64 electrodes; see Murray et al., 2008 for details) of the grand average across conditions was inspected visually across the whole epoch to define components of interest (see Fig. 2). Time windows for further analyses contained multiples of a full cycle length of 15 Hz (i.e. 1/15 s) to minimise influences of the fast oscillation of the steady-state potential by averaging over time. The following three time windows were chosen: P2 (220 to 287 ms), N400 (360 to 494 ms) and LPC (640 to 908 ms). Channels were grouped into 4 regions of interest (ROIs, see Fig. 4A), covering most of the recorded channels. Amplitudes were averaged across electrodes in these ROIs and across each time window and were then entered into a repeated-measurement ANOVA with the three factors Region (anterior, posterior), Laterality (left, right), and Emotion (neutral, negative, positive).
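Global field power and the time-window averaging can likewise be expressed compactly; note that the chosen windows (one 15 Hz cycle for the P2, two for the N400, four for the LPC) span whole flicker cycles so that the steady-state oscillation cancels out. The sketch below uses hypothetical helper names and is not the original analysis code.

import numpy as np

FS = 256
CYCLE_S = 1 / 15                     # one cycle of the 15 Hz flicker

def global_field_power(erp):
    """GFP as used here: standard deviation across electrodes at each time
    point; `erp` has shape (n_channels, n_times)."""
    return erp.std(axis=0)

def window_mean(erp, t0_ms, n_cycles, epoch_start_ms=-900):
    """Mean amplitude per channel in a window spanning an integer number of
    15 Hz cycles, starting at t0_ms relative to word onset."""
    start = int((t0_ms - epoch_start_ms) / 1000 * FS)
    length = int(round(n_cycles * CYCLE_S * FS))
    return erp[:, start:start + length].mean(axis=1)

# Example: the P2 window of the paper (220-287 ms, i.e. one flicker cycle)
# p2_amplitude = window_mean(avg_erp, 220, n_cycles=1)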
Fig. 2. (A) Global field power (GFP) across the 3 background word conditions (black line: lowpass filtered at 12 Hz, grey line: unfiltered GFP containing the ongoing steady-state potential). Dashed boxes depict the three time windows for further analyses. Note that no clear peaks were observable for the P100 or N170. (B) Scalp topographies of the grand mean across conditions during the three time windows.
Steady-state visual evoked potential

SSVEP amplitudes were extracted from the individual averaged epochs by means of a Gabor filter centred at 15 Hz with a frequency resolution of ±0.85 Hz full-width at half-maximum (FWHM) and a temporal resolution of ±260 ms. The amplitude value for statistical analyses was derived from the averaged signal at an occipital cluster of 6 electrodes containing the SSVEP amplitude maximum across all conditions (see Fig. 3). To account for differences in individual SSVEP signal strength, amplitudes were normalised by dividing them by the average amplitude during a time window from 600 to 260 ms prior to background word onset. Uncorrected paired t-tests did not reveal any significant difference in average baseline amplitude between conditions (all t22 < 1, all p > 0.3). Time courses of SSVEP amplitude were then analysed at each sampling point from 0 to 2000 ms with two separate one-factorial repeated-measurement ANOVAs with the factor Emotion (neutral, negative, positive) or Lexicality (neutral, string). In all analyses p values were Greenhouse–Geisser corrected where appropriate.
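The Gabor-filter amplitude extraction can be sketched as a convolution with a complex Gaussian-windowed carrier at 15 Hz. A minimal sketch follows, assuming that "±0.85 Hz" denotes the half-width of the spectral FWHM (i.e. a full width of 1.7 Hz), which indeed implies a temporal resolution of roughly ±260 ms; this reading, the kernel scaling and all names are assumptions, not the original analysis code.

import numpy as np

FS = 256
F0 = 15.0                    # SSVEP driving frequency (Hz)
FWHM_HZ = 2 * 0.85           # spectral full width at half maximum (assumption: ±0.85 Hz = half-width)

def gabor_amplitude(signal, fs=FS, f0=F0, fwhm_hz=FWHM_HZ):
    """SSVEP amplitude over time: convolve with a complex Gabor kernel centred
    at f0 and take the magnitude of the result."""
    sigma_f = fwhm_hz / (2 * np.sqrt(2 * np.log(2)))   # FWHM -> Gaussian sigma (frequency domain)
    sigma_t = 1 / (2 * np.pi * sigma_f)                # reciprocal spread in time (~0.22 s here)
    t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
    kernel = np.exp(-t**2 / (2 * sigma_t**2)) * np.exp(2j * np.pi * f0 * t)
    kernel /= np.abs(kernel).sum() / 2                 # so a unit-amplitude sine at f0 yields ~1
    return np.abs(np.convolve(signal, kernel, mode='same'))

# Normalisation as described in the text: divide by the mean amplitude in the
# 600-260 ms interval before word onset.
# amp = gabor_amplitude(occipital_cluster_mean)
# amp_norm = amp / amp[idx_minus_600ms:idx_minus_260ms].mean()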
Fig. 3. Scalp distribution of the (absolute) SSVEP amplitude from −500 to 2000 ms around word onset. Six occipital electrodes (white box) were chosen for analysis.

Results

Behavioural data
On average, participants detected 56.4% (SD 9.4%) of all targets, and accuracy, defined as the ratio of correct responses to the total number of responses, was 92.3% (SD 4.0%). Accuracy was well above the chance level of 27% (t22 = 76.2, p < 0.001), which was calculated as the ratio of time windows for correct responses in relation to the total time for responses, i.e. the probability of a random reaction being correct. Hit rates were not influenced by the emotional connotation of background words (main effect Emotion: F2,44 = 0.1, p > 0.8; interaction Emotion × Time Window: F16,352 = 0.7, p > 0.8) but they did vary across different time windows (see Fig. 4C, main effect Time Window: F8,176 = 8.5, p < 0.001, η² = 0.28). Post-hoc t-tests of hit rates, averaged across the three conditions, between subsequent time windows revealed a significant increase of detection rate from the second (−267 to 0 ms) to the third time window (0 to 267 ms) and a decrease between the last two time windows (1333 to 1600 ms compared to 1600 to 1867 ms, all t22 > 3, all p < 0.01). Hit rates also differed across time windows when neutral words and consonant strings were analysed (F8,176 = 6.2, p < 0.001, η² = 0.22), but did not differ between background stimuli (Lexicality: F1,22 = 1.2, p > 0.3; Lexicality × Time Window: F8,176 = 0.8, p > 0.5). Post-hoc comparisons of hit rates, averaged across the two conditions, between subsequent time windows also revealed a decrease for the last time window (1600–1867 ms) compared to the preceding one (t22 = 4.2, p < 0.001). The omnibus ANOVA on reaction times including the three emotional word categories did not reveal any significant main effects or interactions of time windows or background stimuli. The same held true for neutral words compared to consonant strings (all F < 1.9, all p > 0.1).

ERPs

P2 time window (220–287 ms)

The frontal P2 was reflected in a main effect of Region (F1,22 = 49.6, p < 0.001, η² = 0.69). Amplitudes tended to be overall more positive with negative background words (Emotion: F2,44 = 2.6, p = 0.09, η² = 0.11), and the interaction of Region by Emotion was significant (F2,44 = 3.3, p < 0.05, η² = 0.13). Post-hoc paired t-tests confirmed more positive amplitudes for negative compared to neutral words at right anterior (t22 = 3.4, p < 0.01, see Fig. 4B) and more negative amplitudes at left posterior electrodes (t22 = 4.2, p < 0.001). Negative words also tended to lead to larger deflections compared to positive words (right anterior: t22 = 2.0, p = 0.05; left posterior: t22 = 2.0, p = 0.06).

N400 time window (360–494 ms)

The centroparietal negativity was reflected in a main effect of Region (F1,22 = 10.9, p < 0.01, η² = 0.33); amplitudes over the left hemisphere were more negative than over the right one (Laterality: F1,22 = 12.5, p < 0.01, η² = 0.36). Negative background words led to more positive amplitudes across the scalp (Emotion: F2,44 = 4.3, p < 0.05, η² = 0.16), but this effect was driven by anterior sites, as indicated by the interaction of Emotion by Region (F2,44 = 4.2, p < 0.05, η² = 0.16). Post-hoc paired t-tests confirmed, for negative compared to neutral words, more positive-going deflections at anterior electrodes (left: t22 = 2.9, p < 0.01; right: t22 = 3.0, p < 0.01), a more negative deflection at left posterior sites (t22 = 2.6, p < 0.05), and a trend for a more negative deflection at right posterior electrodes (t22 = 2.0, p = 0.06).
Negative background words also differed from positive words; they led to significantly more positive amplitudes at right anterior electrodes (t22 = 2.4, p < 0.05), and the left posterior cluster revealed a trend for a more negative deflection (t22 = 2.0, p < 0.07).
Fig. 4. (A) ERP (including the SSVEP oscillation) averaged separately over the electrodes of the four regions of interest (ROIs of 11 electrodes each, indicated by grey shades). Analysed time windows are indicated by grey boxes. Time and amplitude scales are given in the lower left panel. (B) Difference maps of the emotion effect during the three time windows. (C) Time courses of the normalised SSVEP amplitudes, hit rates and reaction times around word onset. Behavioural data are shown in 267 ms time bins (see Methods). Time before word onset refers to the presentation of a consonant string.
LPC time window (640–906 ms)

A posterior negativity in this time window was reflected in a main effect of Region (F1,22 = 10.9, p < 0.01, η² = 0.33); amplitudes over the right compared to the left hemisphere were more positive (F1,22 = 7.6, p < 0.05, η² = 0.26). No effects or interactions concerning the emotional content of background words were significant.

SSVEP amplitudes

SSVEP amplitudes elicited by the foreground task stimuli did not differ between neutral, negative and positive background words at any sampling point (see Fig. 4C, Emotion: all F2,44 < 0.6, all p > 0.5), nor was there a difference between background consonant strings and neutral words (Lexicality: all F1,22 < 1.1, all p > 0.2).

Discussion

The aim of the present study was to simultaneously assess the influence of distracting emotional words on ongoing early visual processing and on correlates of lexico-semantic encoding. Negative background words modulated the ERP, indicating that they led to more activation during lexico-semantic processing (P2, N400). Thus, we can conclude that the emotional connotation of background words was processed. Nonetheless, no interference of emotional word content with the visual foreground task was evidenced by task-related SSVEP amplitudes or behavioural data. One may argue that the words were not legible due to the superimposed squares or were ignored because of the demanding task — however, their effects on the ERP clearly contradict this possibility. The present pattern of results suggests a dissociation between emotion effects on semantic encoding reflected in ERP modulations (e.g. Herbert et al., 2006; Kanske and Kotz, 2007) versus attentional effects of emotional stimuli on early visual processing, which have been observed frequently for pictorial stimuli (e.g. Keil et al., 2005; Müller et al., 2008) but not for words in the present study. Hence the recognition of emotional content may not inevitably modulate ongoing visual sensory processing. The present results rather indicate an enhanced lexico-semantic analysis of task-irrelevant emotional written words.

Emotional word content may not modulate early visual processing

Neither lexicality nor emotional content of background words had any effect on the task-related SSVEP amplitude (see Fig. 4C). Behavioural results reveal that the task was difficult but solved above chance level. Thus, no ceiling or floor effects could have obscured potential influences of distractors. Nonetheless, there were no effects of background stimuli on task performance, in line with the pattern of SSVEP amplitudes. The spatial overlap of task and distractors in the present study is a crucial aspect of our paradigm. The SSVEP as well as attention effects on SSVEP amplitudes elicited by stimulus displays similar to that in the present experiment have been localised to striate and early extrastriate cortices (Andersen et al., 2009; Di Russo et al., 2007). Competition for neural representation of spatially overlapping or proximal stimuli in these early visual areas is reflected in the SSVEP (Fuchs et al., 2008; Keitel et al., 2010). When attention biases this competition, the neural response to the unattended stimulus is suppressed (e.g. Andersen and Müller, 2010). In the present study the SSVEP signature of the foreground task remained unaltered after a spatially overlapping consonant string was replaced by a word. Thus, results clearly indicate that the emotional content of background words did not additionally capture processing resources from the task at early stages of visual perception, although ERP effects indicate that the words were processed semantically. Unlike many experiments on visual attention and emotional words (see Introduction), in the present study the task did not
concern word content or word form. Therefore, our task should have tapped into functionally different processing paths than the background words after initial visual processing. Hence the finding that words did not interfere with the task renders it quite likely that previously reported attention effects of emotional words arise from processing stages subsequent to perceptual analysis. In combination with the ERP effects obtained from the same dataset, the present study demonstrates that emotional word content modulates language processing but does not necessarily feed back to early visual areas. This is in line with a model of emotional speech processing proposed by Schirmer and Kotz (2006), which was recently extended to the visual modality (Kotz and Paulmann, 2011). The authors state three stages of verbal emotion perception: [1] sensory analysis, [2] derivation of emotional meaning based on perceptual cues and [3] evaluative processes. According to the model, our results indicate [1] no modulation of sensory analysis (SSVEP) in spite of [2] effects during early meaning derivation (P2) and [3] no ongoing evaluation (LPC) of written emotional words when they are task-irrelevant. Other evidence that written emotional words do not inevitably interfere with tasks that do not involve lexical processing comes from a digit-parity study (Harris and Pashler, 2004) in which target numbers and irrelevant emotional words were presented in one display. Only the first occurrence of a negative distractor word in a row of trials affected task performance, and no interference was evident during subsequent trials, demonstrating that verbal emotional stimuli can be prevented from affecting visual attention. Some imaging studies are in line with the notion that the meaning of written words may not necessarily modulate initial steps of visual processing, as they report effects of emotional word content in temporal and frontal cortices but not in inferior occipital areas (Kuchinke et al., 2005; Posner et al., 2009). The finding that positive words lead to an ongoing SSVEP decrease (Koban et al., 2010) does not contradict this assumption. Koban et al. reported a deactivation in early visual areas when positive words were watched passively. Due to its timing (>1000 ms), the effect most likely reflects processes subsequent to the encoding of word meaning. Such a late and ongoing modulation of visual areas by emotional word content may not have occurred in the present experiment because participants focused on the task.
Early (<300 ms) emotion effect in the event-related potential
We found an enhanced frontocentral P2 in response to negative compared to neutral background words. Interestingly, the effect was not restricted to frontal sites but comprised a simultaneous occipital negativity corresponding to the early posterior negativity (EPN) that is frequently observed in the difference wave of emotional compared to neutral words (e.g. Kissler et al., 2009; Scott et al., 2009). In fact, some studies reporting an emotion effect on the EPN also found an enhanced frontal positivity for emotional words (Herbert et al., 2008; Kissler et al., 2009) but did not report statistics on anterior electrodes. Future research should disentangle whether frontal and posterior emotion effects in the P2 time window reflect separate aspects of early lexical processing or whether they represent polar counterparts of a common underlying process. The effect of negative compared to neutral words also corresponds to an enhanced recognition potential (RP; Hinojosa et al., 2004; Martín-Loeches et al., 2001b), which peaks around 250 ms and comprises a frontal positivity and a posterior negativity. The recognition potential differentiates meaningful from meaningless word forms as well as pictures and is thought to reflect semantic access. It is also modulated by selective attention to a target category of words (Martín-Loeches et al., 2001b; Rudell and Hua, 1996). Thus, the early ERP effect of negative words in the current study may reflect attentional capture of resources during early stages of meaning encoding. Taken together, studies on P2 as well as EPN and RP effects of written words suggest that semantic encoding shapes brain
responses from 200 ms onwards (e.g. Martín-Loeches, 2007). Effects of semantic categories (e.g. living–nonliving [Marí-Beffa et al., 2005], animals [Martín-Loeches et al., 2001b]), concreteness (Martín-Loeches et al., 2001a) as well as valence (Kanske and Kotz, 2007) and arousal (Kissler et al., 2009) have been reported at this latency. This indicates that task-irrelevant words in the present study were encoded at this early processing stage; negative words appeared to draw more resources during early semantic analysis. Unfortunately, we cannot reinforce this assumption with behavioural measures such as a subsequent surprise recognition test. Future studies should address this issue. Whilst emotion effects on the P2 have been reported repeatedly (Bernat et al., 2001; Herbert et al., 2006), in some studies the effect was driven by positive rather than negative words (Kanske and Kotz, 2007; Schapkin et al., 2000). Perhaps this depends on top-down control exerted over lexical processing, such as a bias towards positive information. When words are presented subliminally (Bernat et al., 2001), interspersed with aversive noise bursts (Herbert et al., 2006) or presented unpredictably as irrelevant distractors as in the present study, threat or high arousal (both associated with the negative word category) may be more potent cues for enhanced lexico-semantic analysis. However, our results are in line with the notion that arousal rather than valence drives early emotion effects in the ERP (Herbert et al., 2008; Kissler et al., 2007).
Late (>300 ms) emotion effect in the event-related potential Negative words also led to an enhanced N400. Modulations of the N400 amplitude by emotional written and spoken words were found previously (e.g. Herbert et al., 2008; Kanske and Kotz, 2007; Schirmer et al., 2005) but in some of these experiments affective words elicited smaller N400 deflections (Herbert et al., 2008; Kanske and Kotz, 2007) interpreted as facilitated semantic integration. Why did we obtain the opposite effect? In all of these studies words had to be attended to or were read passively without distraction, whereas in the present study attention was focussed on separate stimuli and words were task-irrelevant. Thus, in line with previous studies reporting enhanced N400-like deflections for negative words (Herbert et al., 2008; Schirmer et al., 2005) enhanced amplitudes may reflect an enhanced capture of lexico-semantic processing resources by negative words. In line with reports that the size of N400 effects depends on task demands (Chwilla et al., 1995; Fischler and Bradley, 2006), selective attention indeed seems to play a crucial role for effects in the N400 time window. When attention is not directed to the words eliciting an N400 but to other objects, N400 effects are attenuated even more than when words are presented subliminally (reviewed in Deacon and Shelley-Tremblay, 2000). For example, it is a common finding that the N400 amplitude decreases with the repetition of a word, but the opposite effect was found when both presentations of the word occurred in the unattended of two simultaneous streams (Otten et al., 1993). These puzzling results for task-irrelevant words could also be explained by a semantic inhibition theory of the N400: Debruille (2007) suggested that the N400 reflects suppression of activated but inappropriate knowledge. In support of this hypothesis Debruille et al. (2008) found larger N400 amplitudes to distracting compared to attended words. The N400 amplitude was also enhanced in subjects well able to ignore irrelevant information compared to “poor ignorers”. In the present study negative words may have been harder to ignore after they initially received more processing resources, indicated by enhanced P2 deflections and, thus, have led to more inhibitory activity. In any case, the results of Otten et al. (1993) and Debruille et al. (2008) as well as the present study emphasise that more research is needed to understand how language processing is influenced by selective attention (as reviewed by Kutas
and Federmeier, 2011) and how incidental and attentive word processing may differ. The foreground task may also explain why N400 amplitudes were driven by arousal rather than by valence (e.g. Herbert et al., 2008). Resources to exert a positivity bias via cognitive top-down control may have been depleted. An influence of task demands on valence and arousal effects during the N400 has already been suggested by Kanske and Kotz (2007), who found that positive words did not modulate the N400 when processing depth was decreased. The foreground task in our experiment may also be the reason why we did not obtain a typical parietal LPC (see Fig. 4A) or any effects of emotional words in the LPC time window. The latter have been shown to reflect task-dependent stimulus processing (Fischler and Bradley, 2006; Hinojosa et al., 2010; Kissler et al., 2009; Schacht and Sommer, 2009b). However, in the present study the task was dissociated from the words and possibly hindered late elaborate language processing. Furthermore, task-related activity was not temporally aligned to word onset.
Caveats

Based on our SSVEP findings we conclude that early visual processing is not modulated by emotional word content. However, given the temporal resolution of this measure (here ±260 ms, the time range from which the amplitude of a given frequency is extracted), very short-lived effects on early visual areas may not have been detected. Furthermore, situations of diminished visual input may constitute an exception to our assumption, as indicated by emotion effects on the visual P1 when words were presented at subliminal presentation times (Pauli et al., 2005). Due to the ongoing flicker we could not observe a clear P1 to the background switch. Still, most investigations of emotional word reading do not find effects on the ERP prior to 200 ms (Kissler et al., 2006). Given that words place few demands on visual-structural analysis (Martín-Loeches, 2007), it seems plausible that attentional feedback to early sensory processes does not occur in a preset manner when word content is already processed and found to be emotionally significant. Accordingly, fast presentation rates, rendering word reading perceptually more challenging, could explain the SSVEP amplification for emotional words reported by Keil et al. (2006). Since SSVEP amplitudes were predictive of subsequent word identification in that study, the effect seems to reflect feedback of lexical encoding to visual areas, which the emotional content of words in our experiment did not systematically produce. On the other hand, the presentation frequency of 8.6 Hz, the latency of the effect (120–270 ms) as well as the lateral occipital electrode sites analysed render it possible that the N1-P2 complex to word onset may have influenced the extracted SSVEP amplitudes. If this were the case, the emotion effect could have arisen from areas subsequent to the early visual sources of the SSVEP (Andersen et al., 2009; Di Russo et al., 2007). In line with this option, Keil (2006) considered that SSVEP modulations by emotional words were clearly more transient than those found for emotional pictures and concluded that emotion effects during object recognition might arise earlier along the visual processing path compared to the spatiotemporally different effects when viewing emotional words. Different SSVEP modulations by verbal compared to pictorial emotional stimuli are also evidenced by contrasting our findings with previous analogous studies that reported a clear interference of emotional pictures with the foreground task (Hindi Attar et al., 2010a; Müller et al., 2008). Compared to the perception of scenes, during which continuous visual inspection can reveal more and more detailed information over time, words may generally not exert such sustained effects on early visual areas.
Only a few direct comparisons of emotion effects of different stimulus classes have been made so far: Hinojosa et al. (2009) assessed ERPs to words and pictures in two largely comparable experiments and reported early differentiations of the brain response only for pictures. As EPN differentiations to emotional words are not found as consistently as for pictures, the authors concluded that enhanced early processing of emotional material cannot generally be assumed for verbal information. In contrast, Schacht and Sommer (2009a) reported similar ERP effects for emotional faces and words. They found a striking similarity of the emotion effect topographies during the face-N170 and word-N400 and suggested this may reflect shared neural systems. However, given the different timing of these effects, they are still in line with the notion that emotion effects on early processing are more pronounced for pictorial than verbal material. Recently, Frühholz et al. (2011) compared words and faces in a modified Stroop task. Emotional faces enhanced the EPN across different tasks, whereas words only elicited an early emotion effect in the ERP when valence had to be judged explicitly. A possible task dependence of the EPN is supported by other studies (Hinojosa et al., 2010; Kanske and Kotz, 2007) where emotion effects prior to 300 ms were observed only under certain task requirements. In contrast, we found an early ERP effect for words that were completely irrelevant for the task at hand. This finding is more in line with the notion that the early semantic access reflected in the P2 time window occurs rather automatically (Hinojosa et al., 2004) and that the EPN reflects an involuntary shift of attention towards emotional stimuli (Franken et al., 2009). However, the role of task requirements for the early access of meaning needs more investigation.

Conclusion

We found emotion effects of task-irrelevant words on the ERP before 300 ms. This is in line with prior evidence that processes in the P2 time window reflect initial lexico-semantic encoding (Kissler et al., 2006; Martín-Loeches, 2007) and indicates that negative words are differentiated at this early semantic processing stage. Emotional background words also evoked enhanced N400 amplitudes, which may reflect the enhanced capture of lexico-semantic processing resources by emotional distractor words. The emotion effects reflected in the ERP were not associated with the capture of visual processing resources from the foreground task, whereas analogous studies have shown that emotional pictures do interfere with the task we used here (Hindi Attar et al., 2010a; Müller et al., 2008). Our results therefore suggest that feedback to sensory processes is not a universal mechanism following the detection of emotional relevance, but depends on the information channel from which emotional significance is conveyed. Future studies have to shed more light on how emotion shapes perception and recognition in possibly separable ways.

Acknowledgments

We thank Renate Zahn, Elizabeth Lafrentz and Christopher Gundlach for help in data recording. Research was supported by the Deutsche Forschungsgemeinschaft, graduate program "Function of Attention in Cognition".

References

Amir, N., Elias, J., Klumpp, H., Przeworski, A., 2003. Attentional bias to threat in social phobia: facilitated processing of threat or difficulty disengaging attention from threat? Behav. Res. Ther. 41, 1325–1335.
Andersen, S.K., Müller, M.M., 2010.
Behavioral performance follows the time course of neural facilitation and suppression during cued shifts of feature-selective attention. Proc. Natl. Acad. Sci. U. S. A. 107 (31), 13878–13882. Andersen, S.K., Müller, M.M., Hillyard, S.A., 2009. Color-selective attention need not be mediated by spatial attention. J. Vis. 9 (6), 1–7.
Arnell, K.M., Killman, K.V., Fijavz, D., 2007. Blinded by emotion: misses follow attention capture by arousing distractors in RSVP. Emotion 7 (3), 465–477. Baas, D., Aleman, A., Kahn, R.S., 2004. Lateralization of amygdala activation: a systematic review of functional neuroimaging studies. Brain Res. Rev. 45, 96–103. Bakardjian, H., Tanaka, T., Cichocki, A., 2011. Emotional faces boost up steady-state visual responses for brain–computer interface. Neuroreport 22 (3), 121–125. Begleiter, H., Platz, A., 1969. Cortical evoked potentials to semantic stimuli. Psychophysiology 6 (1), 91–100. Bernat, E., Bunce, S., Shevrin, H., 2001. Event-related brain potentials differentiate positive and negative mood adjectives during both supraliminal and subliminal visual processing. Int. J. Psychophysiol. 42, 11–34. Bradley, M.M., Lang, P.J., 1994. Measuring emotion: the self-assessment-manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 25 (1), 49–59. Chwilla, D.J., Brown, C.M., Hagoort, P., 1995. The N400 as a function of the level of processing. Psychophysiology 32 (3), 274–285. Deacon, D., Shelley-Tremblay, J., 2000. How automatically is meaning accessed: a review of the effects of attention on semantic processing. Front. Biosci. 5, e82–e944. Debruille, J.B., 2007. The N400 potential could index a semantic inhibition. Brain Res. Rev. 56, 472–477. Debruille, J.B., Ramirez, D., Wolf, Y., Schaefer, A., Nguyen, T.V., et al., 2008. Knowledge inhibition and N400: A within- and a between-subjects study with distractor words. Brain Res. 1187, 167–183. Di Russo, F., Pitzalis, S., Aprile, T., Spitoni, G., Patria, F., et al., 2007. Spatiotemporal analysis of the cortical sources of the steady-state visual evoked potential. Hum. Brain Mapp. 28, 323–334. Fischler, I., Bradley, M.M., 2006. Event-related potential studies of language and emotion: words, phrases and task effects. Prog. Brain Res. 156, 185–203. Franken, I.H.A., Gootjes, L., van Strien, J.W., 2009. Automatic processing of emotional words during an emotional Stroop task. Neuroreport 20 (8), 776–781. Frühholz, S., Jellinghaus, A., Herrmann, M., 2011. Time course of implicit processing and explicit processing of emotional faces and emotional words. Biol. Psychol. 87, 265–274. Fuchs, S., Andersen, S.K., Gruber, T., Müller, M.M., 2008. Attentional bias of competitive interactions in neuronal networks of early visual processing in the human brain. NeuroImage 41, 1086–1101. Harris, C.R., Pashler, H., 2004. Attention and the processing of emotional words and names: not so special after all. Psychol. Sci. 15 (3), 171–178. Herbert, C., Kissler, J., Junghöfer, M., Peyk, P., Rockstroh, B., 2006. Processing of emotional adjectives: evidence from startle EMG and ERPs. Psychophysiology 43, 197–206. Herbert, C., Junghöfer, M., Kissler, J., 2008. Event-related potentials to emotional adjectives during reading. Psychophysiology 45, 487–498. Herbert, C., Ethofer, T., Anders, S., Junghöfer, M., Wildgruber, D., et al., 2009. Amygdala activation during reading of emotional adjectives — an advantage for pleasant content. Soc. Cogn. Affect. Neurosci. 4, 35–49. Hindi Attar, C., Andersen, S.K., Müller, M.M., 2010a. Time course of affective bias in visual attention: convergent evidence from steady-state visual evoked potentials and behavioral data. NeuroImage 53, 1326–1333. Hindi Attar, C., Müller, M.M., Andersen, S.K., Büchel, C., Rose, M., 2010b. 
Emotional processing in a salient motion context: integration of motion and emotion in both V5/hMT + and the amygdala. J. Neurosci. 30 (15), 5204–5210. Hinojosa, J.A., Martín-Loeches, M., Munoz, F., Casado, P., Pozo, M.A., 2004. Electrophysiological evidence of automatic early semantic processing. Brain Lang. 88, 39–46. Hinojosa, J.A., Carretié, L., Valcárcel, M.A., Méndez-Bértolo, C., Pozo, M.A., 2009. Electrophysiological differences in the processing of affective information in words and pictures. Cogn. Affect. Behav. Neurosci. 9 (2), 173–189. Hinojosa, J.A., Méndez-Bértolo, C., Pozo, M.A., 2010. Looking at emotional words is not the same as reading emotional words: behavioral and neuronal correlates. Psychophysiology 47 (4), 48–57. Hodsoll, S., Viding, E., Lavie, N., 2011. Attentional capture by irrelevant emotional distractor faces. Emotion 11 (2), 346–353. Junghöfer, M., Elbert, T., Tucker, D.M., Rockstroh, B., 2000. Statistical control of artifacts in dense array EEG/MEG studies. Psychophysiology 37 (44), 523–532. Kanske, P., Kotz, S.A., 2007. Concreteness in emotional words: ERP evidence from a hemifield study. Brain Res. 1148, 138–148. Kanske, P., Kotz, S.A., 2010. Modulation of early conflict processing: N200 responses to emotional words in a flanker task. Neuropsychologia 48, 3661–3664. Keil, A., 2006. Macroscopic brain dynamics during verbal and pictorial processing of affective stimuli. Prog. Brain Res. 156, 217–232. Keil, A., Ihssen, N., 2004. Identification facilitation for emotionally arousing verbs during the attentional blink. Emotion 4 (1), 23–35. Keil, A., Gruber, T., Müller, M.M., Moratti, S., Stolarova, M., et al., 2003. Early modulation of visual perception by emotional arousal: evidence from steady-state visual evoked brain potentials. Cogn. Affect. Behav. Neurosci. 3 (3), 195–206. Keil, A., Moratti, S., Sabatinelli, D., Bradley, M.M., Lang, P.J., 2005. Additive effects of emotional content and spatial attention on electrocortical facilitation. Cereb. Cortex 15, 1187–1197. Keil, A., Ihssen, N., Heim, S., 2006. Early cortical facilitation for emotionally arousing targets during the attentional blink. BMC Biol. 4 (23). doi:10.1186/1741-7007-4-23. Keil, A., Bradley, M.M., Ihssen, N., Heim, S., Vila, J., et al., 2010. Defensive engagement and perceptual enhancement. Neuropsychologia 48, 3580–3584. Keitel, C., Andersen, S.K., Müller, M.M., 2010. Competitive effects on steady-state visual evoked potentials with frequencies in- and outside the alpha band. Exp. Brain Res. 205, 489–495. Kissler, J., Assadollahi, R., Herbert, C., 2006. Emotional and semantic networks in visual word processing: insights from ERP studies. Prog. Brain Res. 156, 147–183. Kissler, J., Herbert, C., Peyk, P., Junghöfer, M., 2007. Buzzwords: early cortical responses to emotional words during reading. Psychol. Sci. 18 (6), 475–480.
Kissler, J., Herbert, C., Winkler, I., Junghöfer, M., 2009. Emotion and attention in visual word processing — an ERP study. Biol. Psychol. 80, 75–83. Koban, L., Ninck, M., Gisler, T., Kissler, J., 2010. Processing of emotional words measured simultaneously with steady-state visually evoked potentials and near-infrared diffusing-wave spectroscopy. BMC Neurosci. 11 (85), 1–12. Kotz, S.A., Paulmann, S., 2011. Emotion, language, and the brain. Lang. Linguist. Compass 5 (3), 108–125. Kuchinke, L., Jacobs, A.M., Grubich, C., Vo, M.L.H., Conrad, M., Herrmann, M., 2005. Incidental efffects of emotional valence in single word processing: an fMRI study. NeuroImage 28, 1022–1032. Kutas, M., Federmeier, K.D., 2011. Thirty years and counting: finding meaning in the N400 component of the event-related brain potential. Annu. Rev. Psychol. 62, 621–647. Landis, T., 2006. Emotional words: what's so different from just words? Cortex 42 (6), 823–830. Lang, P.J., Bradley, M.M., 2009. Emotion and the motivational brain. Biol. Psychol. 84 (3), 437–450. Lehmann, D., Skrandies, W., 1984. Spatial analysis of evoked potentials in man — a review. Prog. Neurobiol. 23, 227–250. MacLeod, C., Mathews, A., Tata, P., 1986. Attentional bias in emotional disorders. J. Abnorm. Psychol. 95 (1), 15–20. Maratos, F.A., Mogg, K., Bradley, B.P., 2008. Identification of angry faces in the attentional blink. Cogn. Emot. 22 (7), 1340–1352. Marí-Beffa, P., Valdés, B., Cullen, D.J.D., Catena, A., Houghton, G., 2005. ERP analyses of task effects on semantic processing from words. Brain Res. Cogn. Brain Res. 23, 293–305. Martín-Loeches, M., 2007. The gate to reading: reflections on the recognition potential. Brain Res. Rev. 53, 89–97. Martín-Loeches, M., Hinojosa, J.A., Fernández-Frías, C., Rubia, F.J., 2001a. Functional differences in the semantic processing of concrete and abstract words. Neuropsychologia 39, 1086–1096. Martín-Loeches, M., Hinojosa, J.A., Gómez-Jarabo, G., Rubia, F.J., 2001b. An early electrophysiological sign of semantic processing in basal extrastriate areas. Psychophysiology 38, 114–124. Mathewson, K.J., Arnell, K.M., Mansfield, C.A., 2008. Capturing and holding attention: the impact of emotional words in rapid serial visual presentation. Mem. Cognit. 36 (1), 182–200. McKenna, F.P., Sharma, D., 1995. Intrusive cognitions: an investigation of the emotional Stroop task. J. Exp. Psychol. Learn. Mem. Cogn. 21 (6), 1595–1607. McTeague, L.M., Shumen, J.R., Wieser, M.J., Lang, P.J., Keil, A., 2011. Social vision: sustained perceptual enhancement of affective facial cues in social anxiety. NeuroImage 54, 1615–1624. Mogg, K., Bradley, B.P., de Bono, J., Painter, M., 1997. Time course of attentional bias for threat information in non-clinical anxiety. Behav. Res. Ther. 35 (4), 297–303. Müller, M.M., Hillyard, S.A., 2000. Concurrent recording of steady-state and transient eventrelated potentials as indices of visual-spatial selective attention. Clin. Neurophysiol. 111, 1544–1552. Müller, M.M., Andersen, S.K., Keil, A., 2008. Time course of competition for visual processing resources between emotional pictures and foreground task. Cereb. Cortex 18, 1892–1899.
Murray, M.M., Brunet, D., Michel, C.M., 2008. Topographic ERP analyses: a step-by-step tutorial review. Brain Topogr. 20, 249–264. Nasrallah, M., Carmel, D., Lavie, N., 2009. Murder, she wrote: enhanced sensitivity to negative word valence. Emotion 9 (5), 609–618. Öhman, A., Flykt, A., Esteves, F., 2001. Emotion drives attention: detecting the snake in the grass. J. Exp. Psychol. Gen. 130 (3), 466–478. Ortigue, S., Michel, C.M., Murray, M.M., Mohr, C., Carbonnel, S., Landis, T., 2004. Electrical neuroimaging reveals early generator modulation to emotional words. NeuroImage 21, 1242–1251. Otten, L.J., Rugg, M.D., Doyle, M.C., 1993. Modulation of event-related potentials by word repetition: the role of selective attention. Psychophysiology 30 (6), 559–571. Pauli, P., Amrhein, C., Mühlberger, A., Dengler, W., Wiedemann, G., 2005. Electrocortical evidence for an early abnormal processing of panic-related words in panic disorder patients. Int. J. Psychophysiol. 57, 33–41. Posner, J., Russell, J.A., Gerber, A., Gorman, D., Colibazzi, T., et al., 2009. The neurophysiological bases of emotion: an fMRI study of the affective cirumplex using emotion-denoting words. Hum. Brain Mapp. 30, 883–895. Pratto, F., John, O.P., 1991. Automatic vigilance: the attention-grabbing power of negative social information. J. Pers. Soc. Psychol. 61 (3), 380–391. Rudell, A.P., Hua, J., 1996. The recognition potential and conscious awareness. Electroencephalogr. Clin. Neurophysiol. 98, 309–318. Sabatinelli, D., Bradley, M.M., Fitzsimmons, J.R., Lang, P.J., 2005. Parallel amygdala and inferotemporal activation reflect emotional intensity and fear relevance. NeuroImage 24, 1265–1270. Schacht, A., Sommer, W., 2009a. Emotions in word and face processing: early and late cortical responses. Brain Cogn. 69, 538–550. Schacht, A., Sommer, W., 2009b. Time course and task dependence of emotion effects in word processing. Cogn. Affect. Behav. Neurosci. 9 (1), 28–43. Schapkin, S.A., Gusev, A.N., Kuhl, J., 2000. Categorization of unilaterally presented emotional words: an ERP analysis. Acta Neurobiol. Exp. (Wars) 60, 17–28. Schirmer, A., Kotz, S.A., 2006. Beyond the right hemisphere: brain mechanisms mediating vocal emotional processing. Trends Cogn. Sci. 10 (1), 24–30. Schirmer, A., Kotz, S.A., Friederici, A.D., 2005. On the role of attention for the processing of emotions in speech: sex differences revisited. Brain Res. Cogn. Brain Res. 24, 422–452. Scott, G.G., O'Donnell, P., Leuthold, H., Sereno, S.C., 2009. Early emotion word processing: evidence from event-related potentials. Biol. Psychol. 80, 95–104. Skrandies, W., 1998. Evoked potential correlates of semantic meaning — a brain mapping study. Brain Res. Cogn. Brain Res. 6, 173–183. Versace, F., Bradley, M.M., Lang, P.J., 2010. Memory and event-related potentials for rapidly presented emotional pictures. Exp. Brain Res. 205, 223–233. Vuilleumier, P., 2005. How brains beware: neural mechanisms of emotional attention. Trends Cogn. Sci. 9 (12), 585–594. Vuilleumier, P., Huang, Y.M., 2009. Emotional attention: uncovering the mechanisms of affective biases in perception. Curr. Dir. Psychol. Sci. 18 (3), 148–152. Williams, J.M.G., Mathews, A., MacLeod, C., 1996. The emotional Stroop task and psychopathology. Psychol. Bull. 120 (1), 3–24.