Emotion 2010, Vol. 10, No. 5, 678–687
© 2010 American Psychological Association 1528-3542/10/$12.00 DOI: 10.1037/a0019175
Gender Differences in Implicit and Explicit Processing of Emotional Facial Expressions as Revealed by Event-Related Theta Synchronization

Gennady G. Knyazev, Jaroslav Y. Slobodskoj-Plusnin, and Andrey V. Bocharov
Siberian Branch of the Russian Academy of Medical Sciences, Novosibirsk, Russia

Emotion information processing may occur in 2 modes that are differently represented in conscious awareness. Fast online processing involves coarse-grained analysis of salient features and is not represented in conscious awareness; offline processing takes hundreds of milliseconds to generate fine-grained analysis and is represented in conscious awareness. These processing modes may be studied using event-related electroencephalogram theta synchronization as a marker of emotion processing. Two experiments were conducted that differed in the mode of emotional information presentation. In the explicit mode, subjects were explicitly instructed to evaluate the emotional content of presented stimuli; in the implicit mode, their attention was directed to other features of the stimulus. In the implicit mode, theta synchronization was most pronounced in the early processing stage, whereas in the explicit mode, it was more pronounced in the late processing stage. The early processing stage was more pronounced in men, whereas the late processing stage was more pronounced in women. Implications of these gender differences in emotion processing for well-documented differences in social behavior are discussed.

Keywords: EEG, emotion, gender differences, theta oscillations, unconscious processing
Gennady G. Knyazev, Jaroslav Y. Slobodskoj-Plusnin, and Andrey V. Bocharov, Institute of Physiology, Siberian Branch of the Russian Academy of Medical Sciences, Novosibirsk, Russia.

This study was supported by Russian Foundation for Basic Research Grants 08-06-00016-a and 08-06-00011-a. The authors are grateful to an anonymous reviewer for helpful comments and ideas regarding possible implications of the study findings.

Correspondence concerning this article should be addressed to Gennady G. Knyazev, Institute of Physiology, SB RAMS, Timakova str., 4, Novosibirsk 630117, Russia. E-mail: [email protected]

For humans, as social beings, the ability to understand emotional information conveyed by the facial expressions of other people is crucial for optimal interpersonal functioning. This ability is present at birth, but it may also develop through the life span in the process of social learning. A substantial literature indicates that women are more interested than men in conspecifics and are better at decoding facial expressions of emotion (Biele & Grabowska, 2006; Hampson, van Anders, & Mullin, 2006; Proverbio, Brignone, Matarazzo, Del Zotto, & Zani, 2006; Rotter & Rotter, 1988; Thayer & Johnsen, 2000). It is difficult to determine whether these differences arise entirely from cultural factors or depend on genetically determined biological factors. The “affective education” received by females differs from a very young age from that given to males. It appears that, in compliance with near-universal gender role stereotypes, women are more encouraged than men to develop the ability to understand emotion (Eagly & Johnson, 1990; Kuebli & Fivush, 1992; Langlois & Downs, 1990; Leppänen & Hietanen, 2001). On the other hand, there is evidence that preferences for sexually differentiated objects arose early in human evolution, prior to the emergence of a distinct hominid lineage (Alexander & Hines, 2002). Whatever the cause, substantial differences exist between the ways in which men and women perceive, process, express, and experience emotions. Generally speaking, women seem more able, as well as more inclined, to express their own emotions to conspecifics (Dimberg & Lundquist, 1990). Furthermore, they show greater ease in decoding nonverbal indicators connected to the expression of emotions. These data suggest that females are superior in explicit, conscious recognition of emotional cues. However, emotion information processing may occur in two modes that are differently represented in conscious awareness. Various lines of evidence indicate that it takes some 300–400 ms of brain activity for consciousness to occur (Libet, 2003; Milner & Goodale, 1995; Treisman & Kanwisher, 1998; Velmans, 1991). However, motor actions may have already been initiated long before 300 ms. In the first 100–200 ms after the sense organs first receive stimuli from the environment, the brain is capable of a high degree of perceptual analysis, extraction of meaning, cognitive processing, and organization of action, all of which remain entirely unconscious (Velmans, 1991). This online processing is very fast, involving coarse-grained analysis of salient features, and is not represented in conscious awareness; in contrast, offline processing is slow, taking hundreds of milliseconds to generate fine-grained analysis, and is represented in conscious awareness (Milner & Goodale, 1995; Toates, 1998). These two processing modes may differ not only in their temporal scales and the brain areas involved, but also in their specific biases toward particular kinds of information. The main function of the online system, which acts beyond conscious awareness, is to ensure a fast, reflex-like reaction to important cues. Therefore, this system must attribute most salience to biologically relevant stimuli. The offline system, on the other hand, attributes salience taking into account awareness of the situational context, personal goals, and so on. The above-noted superiority of women in understanding emotion information does not necessarily imply that they are also
superior in fast online processing. Actually, on the basis of evolutionary ideas, it could be speculated that this kind of processing might be more pronounced in men. Indeed, sexual selection theory locates the origins of gender differences in overt behavior in human evolutionary history, as a consequence of unequal parental investment leading to greater male than female reproductive competition and, therefore, overt aggression (Trivers, 1972). Several evolutionary analyses have identified the degree of risk an individual is prepared to take during a conflict as the crucial difference between the sexes. The greater variation in male than female reproductive success that is typical of mammals leads to more intense male competition (Archer, 2004). An inclination to aggressive interactions with conspecifics would not encourage thorough conscious analysis of a competitor’s emotional state; rather, it would encourage fast behavioral responses to emotional cues that are perceived as signs of a potential threat.

Research on emotional information processing in humans is hampered by the absence of a reliable marker of such processing. Neuroimaging techniques, such as positron emission tomography and functional MRI, which currently dominate the study of human brain functions, do not allow the investigation of fast processes that occur in the first 500 ms after stimulus presentation. For such processes, the electroencephalogram (EEG) and magnetoencephalogram remain the only available options. In the past decade, event-related potential (ERP) studies have been very useful in the temporal localization of brain processes underlying recognition of emotional facial expressions. These studies have shown that facial expressions are probably recognized and differentiated within the first 200–250 ms after their presentation (Allison et al., 1994; Balconi & Pozzoli, 2003; Leppänen et al., 2007). Reviewing the results of ERP studies investigating brain processes involved in the detection and analysis of emotional facial expression, Eimer and Holmes (2007) point out that in all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was remarkably early, ranging from 120 to 180 ms poststimulus. Similar emotional expression effects were found for six basic emotions, suggesting that these effects are not primarily generated within neural structures specialized for the automatic detection of specific emotions. Thus, the time of occurrence of these phenomena and their lack of specificity suggest that they are related to coarse online processing.

ERPs represent one very specific kind of cortical response to stimuli, labeled evoked responses. These responses are phase-locked to stimulus onset, and most existing evidence indicates that they participate in specific cognitive operations necessary for stimulus perception, recognition, and the preparation of motor programs. Another kind of cortical response, the so-called induced response, is time-locked but not phase-locked to stimulus onset. Induced responses probably reflect a nonspecific accompaniment to these specific reactions. It seems reasonable to suggest that emotional processing should be associated to a greater extent with induced than with evoked responses. Induced responses are oscillatory by nature. They are measured as the degree of stimulus-related increase or decrease of spectral power in a particular frequency band (Pfurtscheller & Aranibar, 1977).
It is becoming increasingly clear that oscillations may have a special and very important role in the integration of brain processes. They are now viewed as the critical “middle ground” linking single-neuron activity to behavior (Buzsaki & Draguhn, 2004). Growing evidence
suggests that different frequency oscillations of the human brain may play a key role in the emergence of percepts, memories, emotions, thoughts, and actions (Cantero & Atienza, 2005; Knyazev, 2007; Nunez, 2000; Varela, Lachaux, Rodriguez, & Martinerie, 2001). If we look for an oscillation that may play a role in the processing of emotion information, the theta rhythm seems the obvious candidate. It has been suggested that midfrontal theta may serve a gating function on the flow of information processing in limbic regions (Pizzagalli, Oakes, & Davidson, 2003; Vinogradova, 1995) that specialize in emotion processing. Considerable evidence confirms a link between theta activity and emotional states in animals (Mitchell, McNaughton, Flanagan, & Kirk, 2008; Moita, Rosis, Zhou, LeDoux, & Blair, 2003; Pare, 2003; Pare & Collins, 2000; Sainsbury & Montoya, 1984; Seidenbecher, Laxmi, Stork, & Pape, 2003) and human beings (Aftanas, Varlamov, Pavlov, Makhnev, & Reva, 2001; Aftanas, Reva, Varlamov, Pavlov, & Makhnev, 2004; Başar, Güntekin, & Öniz, 2006; Doppelmayr, Stadler, Sauseng, Rachbauer, & Klimesch, 2002; Krause, Viemero, Rosenqvist, Sillanmaki, & Astrom, 2000; Nishitani, 2003). Theta synchronization during presentation of emotional facial expressions has been noted in several studies (Başar et al., 2006; Başar, Schmiedt-Fehr, Öniz, & Başar-Eroğlu, 2008; Güntekin & Başar, 2007a; Knyazev, Bocharov, Levin, Savostyanov, & Slobodskoj-Plusnin, 2008). Existing evidence shows that the emotion-related increase of spectral power in low frequencies discriminates between “emotional” and “neutral” stimuli, but it is less effective in discriminating the valence of emotional stimuli (Aftanas et al., 2001, 2004; Başar et al., 2006; Knyazev et al., 2008; Kamarajan et al., 2008). This implies that theta synchronization probably does not manifest a particular emotion per se. Rather, it may manifest complex operations (including memory operations) involved in the processing of emotion information. Indeed, if theta oscillations serve the purpose of integrating the brain circuits needed for the processing of motivationally relevant emotional stimuli, an increase in their power should be observed in response to any emotion-arousing stimulus. The degree of this increase should correlate with the motivational salience of the stimulus and with subjectively experienced emotional involvement. It has recently been shown that the degree of event-related theta synchronization correlates with subjectively experienced emotional involvement and with individual sensitivity to the emotional content of the stimulus (Knyazev, Slobodskoj-Plusnin, & Bocharov, 2009).

In this study, we investigated gender differences in implicit and explicit processing of emotional facial expressions using event-related theta synchronization as a marker. Two experiments were conducted that differed in the mode of emotion information presentation. In the explicit mode, subjects were explicitly instructed to evaluate the emotional content of the presented stimuli; in the implicit mode, subjects performed a gender discrimination task, so their attention was diverted away from the emotional content of the stimuli. We expected that in the former experiment, emotional processing would be mostly conscious and would prevail during the late “conscious” stage (that is, after 300 ms), whereas in the latter experiment, fast unconscious processing (before 300 ms) would be more prominent.
Moreover, we expected that the late “conscious” stage would be more prominent in women, whereas the early unconscious processing would prevail in men.
Method

Subjects
The experiments with the explicit and implicit tasks were conducted in two samples. In the explicit emotion recognition task, the sample included 40 participants (19 men and 21 women; age range = 17 to 32 years). In the gender categorization task, the sample included 49 participants (27 men and 22 women; age range = 18 to 30 years). Both samples consisted of healthy, right-handed volunteers with normal or corrected-to-normal vision who received a sum equivalent to about $5 (U.S.) for participation. All applicable subject protection guidelines and regulations were followed in conducting the research in accordance with the Declaration of Helsinki. All participants gave informed consent, and the study was approved by the Institute of Physiology ethics committee.
Instruments and Procedure
As stimuli, we used a set of photographs from Ekman and Friesen (1976). We selected 30 photographs: five different female and five different male faces, each with three different facial expressions (angry, happy, and neutral). The pictures were presented in black and white (17 × 17 cm) and displayed on a screen at a distance of 120 cm from the subjects. The subjects sat in a soundproof and dimly illuminated room. In the explicit experimental mode, after about 8 min of spontaneous EEG registration, they were instructed to evaluate the emotional expression of each presented face on an analog scale ranging from −100 (very hostile) to 100 (very friendly). First, a fixation cross appeared at the center of the screen for 1 s. Then a face picture was presented for 4 s, followed by presentation of the evaluative scale (see Figure 1). In the implicit experimental mode, the participants were instructed to press 1 or 2 on presentation of a male or a female face, respectively. Angry, happy, and neutral faces were delivered randomly, and the interstimulus interval varied randomly between 4 and 7 s. The total number of face presentations was 150 for each subject, including 50 faces of each emotional category.

Figure 1. Scheme of one trial. After a fixation cross appeared for 1,000 ms, a target stimulus (i.e., an angry, neutral, or happy face picture) was presented for 4,000 ms. Thereafter, an evaluation scale appeared and remained on the screen until the subject marked the degree of hostility–friendliness of the presented face. The interstimulus interval varied randomly between 4 and 7 s.
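To make the trial structure concrete, the following sketch assembles a randomized trial list with the parameters reported above (30 pictures presented 150 times in total, i.e., 50 presentations per emotional category; a 1-s fixation cross; a 4-s face presentation; and a 4–7-s jittered interstimulus interval). This is a hypothetical Python illustration of the design, not the presentation software actually used; all names are ours, and the assumption that each of the 30 pictures was shown five times is inferred from the reported totals.

```python
import random

random.seed(0)

EMOTIONS = ("angry", "happy", "neutral")
FACE_GENDERS = ("male", "female")

# 5 identities x 2 genders x 3 expressions = 30 pictures; 150 trials / 30 pictures
# implies 5 repetitions of each picture (50 presentations per emotional category).
trials = [
    {"identity": ident, "face_gender": gender, "emotion": emotion}
    for _ in range(5)
    for emotion in EMOTIONS
    for gender in FACE_GENDERS
    for ident in range(5)
]
random.shuffle(trials)

# Timing reported in the Method section (values in seconds).
for trial in trials:
    trial["fixation_s"] = 1.0                  # fixation cross
    trial["face_s"] = 4.0                      # face picture
    trial["isi_s"] = random.uniform(4.0, 7.0)  # jittered interstimulus interval

print(len(trials), sum(t["emotion"] == "angry" for t in trials))  # 150 50
```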
EEG Recording

EEG was recorded using a 32-channel PC-based system via silver–silver chloride electrodes. A midforehead electrode served as the ground. The signals were amplified with a multichannel biosignal amplifier (bandpass 0.05–70 Hz, −6 dB/octave) and continuously digitized at 300 Hz. The electrodes were placed at 30 head sites according to the International 10–20 system and referred to linked mastoids. The horizontal and vertical electro-oculograms were registered simultaneously. EEG data were artifact-corrected using independent component analysis via the EEGLAB toolbox (http://www.sccn.ucsd.edu/eeglab/), with additional visual rejection of artifact-contaminated data offline. The 1,000 ms after face presentation were used as the test interval, whereas the 1,000 ms prior to the fixation cross presentation served as the prestimulus baseline.
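Purely as an illustration of the epoching just described (a 1,000-ms post-stimulus test window and a 1,000-ms baseline window ending at the fixation-cross onset, sampled at 300 Hz), a NumPy sketch might look as follows. The original analysis was done in EEGLAB; the array layout and event format used here are assumptions.

```python
import numpy as np

FS = 300          # sampling rate (Hz) reported above
WIN = FS          # 1,000 ms = 300 samples
FIX = FS          # the fixation cross preceded the face by 1,000 ms

def extract_epochs(eeg, face_onsets):
    """Cut test and baseline epochs from a continuous, artifact-corrected recording.

    eeg: array of shape (n_channels, n_samples); face_onsets: sample indices of face onsets.
    Returns two arrays of shape (n_trials, n_channels, WIN).
    """
    test, baseline = [], []
    for onset in face_onsets:
        fix_onset = onset - FIX                             # fixation-cross onset
        test.append(eeg[:, onset:onset + WIN])              # 0-1,000 ms after the face
        baseline.append(eeg[:, fix_onset - WIN:fix_onset])  # 1,000 ms before the fixation cross
    return np.stack(test), np.stack(baseline)

# toy usage: 30 channels, 60 s of noise, three dummy face onsets
rng = np.random.default_rng(0)
eeg = rng.standard_normal((30, 60 * FS))
test, base = extract_epochs(eeg, face_onsets=[5 * FS, 20 * FS, 40 * FS])
print(test.shape, base.shape)  # (3, 30, 300) (3, 30, 300)
```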
Event-Related Spectral Perturbations

To assess face-evoked changes in spectral power, event-related spectral perturbations (ERSP) were calculated using the timef function of the EEGLAB toolbox (http://www.sccn.ucsd.edu/eeglab/). The ERSP (Makeig, 1993) shows mean log event-locked deviations from the baseline mean power at each frequency. The mean value of the spectral power E at a frequency f during the 1,000 ms prior to fixation cross presentation was taken as the baseline level and was subtracted from E(time, f) after face stimulus onset. The method of ERSP calculation implemented in the EEGLAB toolbox is described in Delorme and Makeig (2004). Time–frequency representations were calculated using Morlet wavelets. The obtained frequency spacing was 0.29 Hz; the time resolution was 6.7 ms.
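The ERSP values in this study were computed with EEGLAB's timef function. Solely to illustrate the underlying idea (trial-averaged Morlet-wavelet power, log-transformed and referenced to the mean baseline power at each frequency), a hand-rolled NumPy/SciPy sketch is given below. The wavelet width (three cycles), the dB scaling, and the toy data are assumptions made for clarity and do not reproduce the exact EEGLAB settings.

```python
import numpy as np
from scipy.signal import fftconvolve

FS = 300  # sampling rate (Hz)

def morlet(freq, n_cycles=3, fs=FS):
    """Complex Morlet wavelet at `freq` Hz with an n_cycles-wide Gaussian envelope."""
    sigma_t = n_cycles / (2 * np.pi * freq)
    t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
    return np.exp(2j * np.pi * freq * t) * np.exp(-t ** 2 / (2 * sigma_t ** 2))

def ersp(test_epochs, baseline_epochs, freqs, fs=FS):
    """Mean change of log power (dB) relative to the prestimulus baseline.

    test_epochs, baseline_epochs: arrays of shape (n_trials, n_samples) for one channel.
    Returns an array of shape (n_freqs, n_samples).
    """
    rows = []
    for f in freqs:
        w = morlet(f, fs=fs)
        # trial-averaged, time-resolved wavelet power in the test window
        p_test = np.mean([np.abs(fftconvolve(ep, w, mode="same")) ** 2
                          for ep in test_epochs], axis=0)
        # mean baseline power at this frequency (averaged over trials and time)
        p_base = np.mean([np.abs(fftconvolve(ep, w, mode="same")) ** 2
                          for ep in baseline_epochs])
        rows.append(10 * np.log10(p_test / p_base))  # dB relative to baseline
    return np.stack(rows)

# toy usage: 50 trials x 300 samples of noise for one channel and its baseline
rng = np.random.default_rng(1)
tf = ersp(rng.standard_normal((50, 300)), rng.standard_normal((50, 300)),
          freqs=np.arange(2.3, 7.7, 0.3))
print(tf.shape)  # (n_freqs, 300); real analyses use longer epochs to limit edge effects
```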
Data Analyses and Statistics

Behavioral data. For the explicit experiment, for each subject, the estimates of hostility–friendliness of the presented faces were averaged across all face presentations within each face gender (male vs. female faces) and each emotional category (angry vs. neutral vs. happy) and were used as dependent variables. For the implicit experiment, reaction times (RTs) and error rates (5.4% of trials on average) were calculated within each emotional category and each face gender. Error rates were computed for each participant by dividing the number of incorrect responses by the number of trials within each category.

EEG data. It is known that multivariate approaches have low sensitivity to regionally specific effects (Friston, 1997). For this reason, an alternative to conventional analysis of variance (ANOVA), the so-called mass-univariate approach, is most frequently used for the analysis of neuroimaging data (Worsley et al., 1996). Being more sensitive to local effects, however, the mass-univariate approach is less suitable for the analysis of interactions between several factors or for the analysis of the effects of one factor controlling for the effects of others. Hence, a parallel use of both approaches may combine their strengths while avoiding their weaknesses. Here, we used both the mass-univariate approach implemented in the statcond function of the EEGLAB toolbox and conventional ANOVA. The one-sample Kolmogorov–Smirnov test was used to check that the EEG variables were normally distributed. This test showed no deviations from normality for the ERSP measures, which enabled the use of parametric tests for hypothesis testing. For the mass-univariate analysis, we divided the sample into groups on the basis of experimental paradigm and subject's gender. An ANOVA with group as a between-subjects factor was fitted for each point of the time–frequency–cortical surface matrix. The false discovery rate correction for multiple comparisons
was applied to reveal areas with significant effects (Holm, 1979). For the false discovery rate correction, we employed a q-value threshold of .05. For this analysis, we chose times from stimulus presentation onset to 700 ms poststimulus and frequencies from 2.3 to 7.6 Hz. For the conventional ANOVA, for each subject, we averaged the net ERSP values across the 4.7–7.6 Hz frequency band (to provide a time-varying measure of theta activity) and across the time windows of 150–250 and 250–350 ms after stimulus onset (for the rationale of this choice of times, see the Results section). The ERSP values from 30 derivations were averaged for nine regions to reduce the number of statistical comparisons. Because our previous data (Knyazev, Slobodskoj-Plusnin, & Bocharov, 2009) showed that theta responses associated with emotion processing are most prominent in the frontal and central cortical regions, only these were retained for the ANOVA: the left frontal (Fp1, F7, F3, FT7, FC3), midline frontal (Fz, FCz), right frontal (Fp2, F8, F4, FT8, FC4), left central (T7, C3, TP7, CP3), midline central (Cz, CPz), and right central (T8, C4, TP8, CP4) regions. Repeated measures ANOVA was conducted to test the effects of gender and experimental paradigm as between-subjects factors, and condition (angry vs. neutral vs. happy faces), laterality (left hemisphere vs. midline region vs. right hemisphere), sagittality (frontal vs. central region), and time (two levels) as within-subject factors. The Greenhouse–Geisser correction was used where the sphericity assumption was violated.
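To illustrate the mass-univariate logic, the sketch below runs a between-group one-way F test at every time–frequency point of an ERSP matrix and thresholds the resulting p-value map with a Benjamini–Hochberg false discovery rate procedure at q = .05. This is a generic reconstruction for illustration only, not the statcond-based pipeline used here; the group labels, array shapes, and the specific FDR variant are assumptions.

```python
import numpy as np
from scipy import stats

def mass_univariate_fdr(groups, q=0.05):
    """Point-wise one-way ANOVA across groups with Benjamini-Hochberg FDR thresholding.

    groups: list of arrays, each of shape (n_subjects, n_freqs, n_times), holding ERSP
    values for one group (e.g., men vs. women, or implicit vs. explicit presentation).
    Returns per-point F values, p values, and a boolean significance mask.
    """
    ns = np.array([g.shape[0] for g in groups], dtype=float)
    k, N = len(groups), ns.sum()
    means = np.stack([g.mean(axis=0) for g in groups])        # (k, n_freqs, n_times)
    grand = np.tensordot(ns, means, axes=1) / N               # weighted grand mean
    ss_between = np.tensordot(ns, (means - grand) ** 2, axes=1)
    ss_within = sum(((g - mu) ** 2).sum(axis=0) for g, mu in zip(groups, means))
    F = (ss_between / (k - 1)) / (ss_within / (N - k))
    p = stats.f.sf(F, k - 1, N - k)                           # upper-tail p values

    # Benjamini-Hochberg step-up procedure on the flattened p-value map
    flat = p.ravel()
    order = np.argsort(flat)
    n_tests = flat.size
    passed = flat[order] <= q * np.arange(1, n_tests + 1) / n_tests
    n_rej = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    sig = np.zeros(n_tests, dtype=bool)
    sig[order[:n_rej]] = True
    return F, p, sig.reshape(p.shape)

# toy usage: two groups of 20 subjects, 19 frequencies x 210 time points (0-700 ms at 300 Hz)
rng = np.random.default_rng(2)
men = rng.standard_normal((20, 19, 210))
women = rng.standard_normal((20, 19, 210)) + 0.3   # small artificial group difference
F, p, sig = mass_univariate_fdr([men, women])
print(F.shape, sig.mean())                          # (19, 210) and the fraction of significant points
```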
Results

Behavioral Data

For the explicit experiment, we used a general linear model analysis to test the effects of subject's gender as a between-subjects factor, emotional category and face's gender as within-subject factors, and face estimates as dependent variables. The main effect of emotional category was significant, F(1.26) = 283.64, p < .001, ηp² = .88; repeated contrasts confirmed that neutral faces were evaluated as more friendly than angry faces, F(1) = 276.07, p < .001, ηp² = .88, and happy faces were evaluated as more friendly than neutral faces, F(1) = 189.83, p < .001, ηp² = .83. Estimated marginal means for angry, neutral, and happy faces were −49.4 (SE = 2.9), 3.6 (SE = 1.6), and 50.2 (SE = 3.2), respectively. Thus, participants had no difficulty identifying the emotional category of the presented faces. All effects of subject's gender were nonsignificant. For the implicit experiment, when RTs were used as dependent variables, the same general linear model analysis showed significant main effects of emotional category, F(1.61) = 7.51, p = .002, ηp² = .14, and face's gender, F(1) = 21.01, p < .001, ηp² = .31. Their interaction was also significant, F(1.92) = 4.55, p = .014, ηp² = .09. The Emotional Category × Subject's Gender interaction was marginal, F(1.61) = 2.88, p = .074, ηp² = .06; hence, the ANOVA was run separately for male and female subjects. The main effect of emotional category was significant in men, F(1.42) = 7.95, p = .004, ηp² = .23, but not in women, F(1.92) = 0.95, p = .392, ηp² = .04. A test of repeated contrasts showed that in men, RTs were longer for angry than for neutral faces, F(1) = 8.95, p = .006, ηp² = .26, with no difference between neutral and happy faces, F(1) = 0.09, p = .765, ηp² = .00. This effect was more pronounced for female than for male faces. For
error rates, main effects of emotional category, F(1.67) = 9.14, p = .001, ηp² = .16, face's gender, F(1) = 17.28, p < .001, ηp² = .27, and their interaction, F(1.60) = 10.56, p < .001, ηp² = .18, were also significant. A test of repeated contrasts showed that error rates were higher for angry than for neutral faces, F(1) = 13.05, p = .001, ηp² = .22, with no difference between neutral and happy faces, F(1) = 1.65, p = .205, ηp² = .03. This effect again was more pronounced for female than for male faces. Effects of subject's gender were not significant.
ERSP Results

Figure 2 shows significant differences in theta responses between the implicit and the explicit experimental modes. Face presentation evoked two waves of low-frequency synchronization that differed significantly between the two experiments. The first wave occurred between 150 and 230 ms at frontal and central cortical locations and was significantly stronger in the first (implicit) than in the second (explicit) experiment. The second wave started at approximately 270 ms and was most prominent in the frontal cortex. It could be seen until about 450 ms and was much stronger during the explicit experiment. There was also lower alpha (7–8 Hz, alpha 1 in Klimesch's, 1999, terminology) desynchronization, which occurred in the implicit but not in the explicit experiment. The latter is probably associated with preparation for the button press, which was absent in the experiment with explicit presentation of emotional stimuli. On the basis of the observed timing and cortical localization of the most pronounced effects, the ANOVA model was fitted for theta ERSP values in the 150–250 ms versus 250–350 ms time windows and in the frontal and central cortical regions. For the sake of brevity, only effects relevant to the study aims are described. There was a significant main effect of emotional category, F(1.89) = 5.92, p = .004, ηp² = .07. The highest event-related theta synchronization was observed for angry faces and the lowest for neutral faces. There was also a significant Emotional Category × Gender × Experimental Paradigm interaction, F(1.89) = 3.77, p = .027, ηp² = .04, which is depicted in Figure 3. It appears that in both experimental paradigms, and in both male and female subjects, angry faces evoked stronger theta synchronization than neutral faces. However, compared with neutral faces, happy faces evoked stronger theta synchronization during implicit presentation only in men, whereas during explicit presentation, they evoked stronger theta synchronization only in women. A test of repeated contrasts showed that in men, only the effect of angry versus neutral faces during explicit presentation was significant, F(1) = 6.0, p = .025, ηp² = .25. In women, explicit presentation of angry versus neutral, F(1) = 5.87, p = .025, ηp² = .23, and happy versus neutral, F(1) = 4.89, p = .039, ηp² = .20, faces yielded significant effects, whereas implicit presentation of angry versus neutral faces yielded a marginal effect, F(1) = 3.69, p = .068, ηp² = .15. There was a highly significant Time × Experimental Paradigm interaction, F(1) = 36.5, p < .001, ηp² = .30, which has been described earlier (see Figure 2).
Figure 2. Time–frequency distribution of significant differences between the implicit (left panel) and the explicit (right panel) experimental tasks in face presentation-related spectral perturbations (p < .05 after false discovery rate correction for multiple comparisons). Areas with no significant differences are zeroed out. For presentation only, the results are averaged across face categories and cortical sites. Cortical maps at the top of the figure show the cortical distribution of the most pronounced effects.
There was also a significant Time × Gender interaction, F(1) = 7.6, p = .007, ηp² = .08, which showed that in men, as compared with women, the face presentation-related increase of theta power was higher in the early and lower in the late time window. Mass-univariate analysis with a group variable that included four levels (implicit presentation in men, implicit presentation in women, explicit presentation in men, and explicit presentation in women) showed that during implicit presentation, in the early time window, theta synchronization was more pronounced in men than in women across all face categories, but this effect was more prominent during presentation of happy and neutral faces than during presentation of angry faces (see Figure 4). In the late time window, women also showed less theta synchronization than men, particularly during presentation of happy faces. During explicit presentation, theta synchronization was clearly stronger in women than in men, and this effect was limited to times after approximately 280 ms poststimulus onset.
Discussion

In the two experimental paradigms in this study, subjects were presented with the same stimuli; however, they received different instructions. In one case, they were asked to pay particular attention to the emotional content of the stimuli; their behavioral outcome depended on conscious evaluation of this content. In the other case, they had to evaluate other features of the picture; the emotional content served as a distracter and had to be ignored. This latter paradigm is similar to the well-known emotional Stroop task (Williams, Mathews, & MacLeod, 1996), which is used to study affective processing biases. Consistent with the notion that individuals have processing biases toward threat, evidence suggests that individuals are slower and less accurate in naming the color of threatening as compared with neutral stimuli on this task (Eschenbeck, Kohlmann, Heim-Dreger, Koller, & Lesser, 2005; McKenna & Sharma, 1995). This phenomenon, labeled the emotional interference effect, may occur because individuals are distracted by the affective content of threatening
stimuli, which disrupts color-naming performance. Emotional interference effects are associated with relevant individual difference characteristics (e.g., trait anxiety; Williams et al., 1996) and appear to signify a clinically relevant phenomenon, as they are modulated by treatment (Cooper & Fairburn, 1994; Dawkins, Powell, West, Powell, & Pickering, 2006; Sieswerda, Arntz, & Kindt, 2007) and predict symptom reduction (Mogg, Bradley, Millar, & White, 1995). This evidence suggests that despite the absence of apparent conscious perception of affective content, it is nevertheless perceived and influences performance on the primary task. The ecological validity of the emotional Stroop task has been questioned because it relies on lexical stimuli, which are different from and less potent than the types of emotional cues encountered in real life (Kindt & Brosschot, 1997). Using more realistic stimuli, such as photographs of human faces with varying emotional expressions (Ashwin, Wheelwright, & Baron-Cohen, 2006; van Honk, Tuiten, de Haan, van den Hout, & Stam, 2001), can be more useful because socioaffective face processing is one of the most basic motivational functions (Ekman, 1993). It also has been argued that the gender discrimination task is less vulnerable to participants' strategic efforts to suppress the distracting effect of emotional content than the original color discrimination task (Kolassa & Miltner, 2006). In line with previously reported findings (Eschenbeck et al., 2005; Kolassa & Miltner, 2006; McKenna & Sharma, 1995), in the gender discrimination task, angry faces were processed more slowly and less accurately than neutral faces, with no difference between neutral and happy faces. This means that fast unconscious processing is biased toward threat, consistent with the notion that its main purpose is to ensure a timely reaction to danger. It is interesting that this bias appears to be more pronounced in men than in women, because it was observed for error rates and RTs only in men.
Figure 3. Estimated marginal means and standard errors of face presentation-related spectral perturbations (ERSP) in the frontal and central cortical regions within the theta frequency band in men and women on presentation of angry, neutral, and happy faces during the implicit and the explicit experimental tasks.
Analysis of face presentation-related spectral perturbations showed that "emotional" faces tended to evoke stronger theta synchronization than neutral faces. This corresponds to earlier observations (Aftanas et al., 2001, 2004) and to more recent data reported by Balconi and Pozzoli (2007, 2009), who found increased theta responses to emotional relative to neutral facial expressions. In line with existing evidence on temporal dynamics of unconscious and conscious processing (Libet, 2003; Milner & Goodale, 1995; Treisman & Kanwisher, 1998; Velmans, 1991), the two experimental paradigms yielded significantly distinct temporal dynamics of theta synchronization. In the implicit experiment, this synchronization peaked at the early (before 300 ms) processing stage, whereas in the explicit experiment, it peaked at the late (after 300 ms) processing stage. Given the time of occurrence
and the experimental context that favors the appearance of each of these two waves, they most likely reflect the online unconscious and the offline conscious stages of information processing, respectively. Many studies have shown that females have higher P300 amplitude, particularly for emotional stimuli (see, e.g., Garcia-Garcia, Domínguez-Borràs, SanMiguel, & Escera, 2008; Gasbarri et al., 2006; Shen, 2005). Increased event-related delta and theta power in females has also been noted (Güntekin & Başar, 2007; Kamarajan et al., 2008). Basically, these data could be explained by the idea linking delta and theta oscillations with motivation and emotion (Knyazev, 2007; Knyazev, Slobodskoj-Plusnin, & Bocharov, 2009).
Figure 4. Time–frequency distribution of significant differences between men and women in angry, neutral, and happy face presentation-related spectral perturbations during the implicit and the explicit experimental tasks (p < .05 after false discovery rate correction for multiple comparisons). Areas with no significant differences are zeroed out. For presentation only, the results are averaged across face categories and cortical sites. Cortical maps at the top of the figure show the cortical distribution of the most pronounced effects.
Indeed, ample evidence shows higher emotionality in females than in males (LaFrance & Banaji, 1992); therefore, it seems reasonable that the reactivity of oscillatory systems presumably involved in emotion processing should also be higher in females. These same systems make a major contribution to the generation of the P300 wave (Başar, Başar-Eroğlu, Rosen, & Schutt, 1984; Intriligator & Polich, 1995; Roschke & Fell, 1997). However, this study shows that at the early processing stage, theta synchronization was more pronounced in men, whereas at the late processing stage, it was more pronounced in women. This evidence implies that, on average, men devote more resources to fast unconscious emotion processing, whereas women devote more resources to thorough conscious emotion processing. A straightforward interpretation of this result is that the more unconscious, covert emotional processing in men may be related to covert forms of behavioral patterns, whereas the more conscious, overt emotional processing in women is related to overt forms of behavioral patterns. Within the present study's experimental paradigms, this should show up as a more prominent affective processing bias during implicit presentation in men than in women, and as more contrasting estimates of emotional faces during explicit presentation in women than in men. The former appears to be indeed the case, as discussed earlier. As for the latter, there was no significant Emotional Category × Gender interaction in this study sample, but earlier, in a larger sample, we
have shown that female subjects indeed tend to evaluate happy faces as more friendly and angry faces as more hostile than male subjects do (Knyazev, Bocharov, Slobodskaya, & Ryabichenko, 2008). Although nonsignificant, the same trend was observed in this study sample. Because the main purpose of online processing is to ensure a timely behavioral response to motivationally important (mostly aversive) stimuli, a relative prevalence of this processing may predispose to rash defensive fight–flight responses to ambiguous cues. Given pronounced sex differences in fear, especially fear of physical danger (Campbell, 2005), aggressive responses should be more typical of males. That may be one reason (among others) for substantial gender differences in the structure of aggressive behavior. Indeed, most studies indicate that reactive (particularly physical) aggression is more frequent in men than in women (Bailey & Ostrov, 2008; Carré & McCormick, 2008; Connor, Steingard, Anderson, & Melloni, 2008; Lento-Zwolinski, 2007). Women, on the other hand, show more relational aggression (or related forms such as social or indirect aggression; Campbell, Sapochnik, & Muncer, 1997; Crick, Casas, & Mosher, 1997; Hess & Hagen, 2006; Ostrov & Keating, 2004). It is clear that these two forms of aggressive behavior differ considerably from each other in the degree of conscious awareness of the causes and consequences of the behavior.
Whereas reactive aggressive outbursts most frequently start spontaneously, without awareness, and the person may afterward regret the loss of self-control (Campbell & Muncer, 2008), relational aggression requires premeditation and planning. Another reason for lower reactive physical aggression in women, which could actually be a consequence of the first one, is that women may in general have better brakes to stop a violent impulse (see, e.g., Knyazev, Bocharov, & Slobodskoj-Plusnin, 2009). Thus, being aware of one's emotional state may dampen the overt behavioral manifestation of this state. The downside of this awareness, however, may be a heightened burden of negative emotional feelings. It is well known that, starting from childhood and projecting well into adulthood, externalizing problems rooted in poor impulse control are more common in males, whereas internalizing problems are more common in females (Rutter & Taylor, 2006). As self-report is the only valid source of information about internalizing problems, higher reporting of these problems by females actually evidences their higher awareness of the respective emotional states. Thus, as paradoxical as it may look, the more unconscious (covert) emotional processing in males could be related to overt, externalizing pathology, whereas the more conscious (overt) emotional processing in females could be related to covert, internalizing disorders. Another possible implication of gender differences in conscious and unconscious processing of emotion relates to health behavior. It is frequently observed in contemporary industrialized societies that although women live longer than men, they report higher rates of morbidity, disability, and health care use (Gijsbers van Wijk, van Vliet, Kolk, & Everaerd, 1991; Kroenke & Spitzer, 1998; Ladwig, Marten-Mittag, Formanek, & Dammann, 2000; Lahelma, Martikainen, Rahkonen, & Silventoinen, 1999). One common explanation is that there are gender differences in the way that symptoms are perceived, evaluated, and acted on (Macintyre, Ford, & Hunt, 1999). If women are more inclined to consciously perceive physical and psychological problems and to seek help through social networks, this would result in higher symptom reporting as well as in a higher probability of a timely cure of the problem. Conversely, a predominantly unconscious perception of these problems would not result in rational behavior, but may be a source of subjective ill-being and of attempts to self-medicate it by means of psychoactive drugs. Indeed, although substance use in girls has increased over past decades (Kumpfer, Smith, & Summerhays, 2008), it still remains much higher in males than in females, with women typically beginning to use substances later and entering treatment earlier in the course of their illnesses than men do (Brady & Randall, 1999). Besides this indirect association, there could also be a more direct causal link between health and subconscious versus conscious processing of emotion information. Thus, it has been suggested that some disorders (such as somatoform disorder) are linked to a diminished capacity to consciously experience and differentiate affects and to express them in an adequate or healthy way (Waller & Scheidt, 2006).
In summary, this study shows that during implicit presentation of emotional stimuli, event-related theta band synchronization is more pronounced in times associated with unconscious processing (before 300 ms poststimulus), whereas in the explicit mode, it is more pronounced in a later processing stage, which is presumably associated with conscious processing. The unconscious processing is more pronounced in males, whereas conscious processing is
more pronounced in females. Although the prevalence of fast processing could be beneficial in certain circumstances, in modern society it may be associated with problem behaviors, which are more frequently observed in men than in women.
References

Aftanas, L. I., Reva, N. V., Varlamov, A. A., Pavlov, S. V., & Makhnev, V. P. (2004). Analysis of evoked EEG synchronization and desynchronization in conditions of emotional activation in humans: Temporal and topographic characteristics. Neuroscience and Behavioral Physiology, 34, 859–867.
Aftanas, L. I., Varlamov, A. A., Pavlov, S. V., Makhnev, V. P., & Reva, N. V. (2001). Affective picture processing: Event-related synchronization within individually defined human theta band is modulated by valence dimension. Neuroscience Letters, 303, 115–118.
Alexander, G. M., & Hines, M. (2002). Sex differences in response to children's toys in nonhuman primates (Cercopithecus aethiops sabaeus). Evolution and Human Behavior, 23, 467–479.
Allison, T., Ginter, H., McCarthy, G., Nobre, A. C., Puce, A., Luby, M., & Spencer, D. D. (1994). Face recognition in human extrastriate cortex. Journal of Neurophysiology, 71, 821–825.
Archer, J. (2004). Sex differences in aggression in real-world settings: A meta-analytic review. Reviews in General Psychology, 8, 291–322.
Ashwin, C., Wheelwright, S., & Baron-Cohen, S. (2006). Attention bias to faces in Asperger syndrome: A pictorial emotion Stroop study. Psychological Medicine, 36, 835–843.
Bailey, C. A., & Ostrov, J. M. (2008). Differentiating forms and functions of aggression in emerging adults: Associations with hostile attribution biases and normative beliefs. Journal of Youth and Adolescence, 37, 713–722.
Balconi, M., & Pozzoli, U. (2003). Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. International Journal of Psychophysiology, 49, 67–74.
Balconi, M., & Pozzoli, U. (2007). Event-related oscillations (EROs) and event-related potentials (ERPs) comparison in facial expression recognition. Journal of Neuropsychology, 1, 283–294.
Balconi, M., & Pozzoli, U. (2009). Arousal effect on emotional face comprehension: Frequency band changes in different time intervals. Physiology & Behavior, 97, 455–462.
Başar, E., Başar-Eroğlu, C., Rosen, B., & Schutt, A. (1984). A new approach to endogenous event-related potentials in man: Relation between EEG and P300 wave. International Journal of Neuroscience, 26, 161–180.
Başar, E., Güntekin, B., & Öniz, A. (2006). Principles of oscillatory brain dynamics and a treatise of recognition of faces and facial expressions. Progress in Brain Research, 159, 43–62.
Başar, E., Schmiedt-Fehr, C., Öniz, A., & Başar-Eroğlu, C. (2008). Brain oscillations evoked by the face of a loved person. Brain Research, 12, 105–115.
Biele, C., & Grabowska, A. (2006). Sex differences in perception of emotion intensity in dynamic and static facial expressions. Experimental Brain Research, 171, 1–6.
Brady, K. T., & Randall, C. L. (1999). Gender differences in substance use disorders. Psychiatric Clinics of North America, 22, 241–252.
Buzsaki, G., & Draguhn, A. (2004, June 25). Neuronal oscillations in cortical networks. Science, 304, 1926–1929.
Campbell, A. (2005). Sex differences in direct aggression: What are the psychological mediators? Aggressive and Violent Behavior, 11, 237–264.
Campbell, A., & Muncer, S. (2008). Intent to harm or injure? Gender and the expression of anger. Aggressive Behavior, 34, 282–293.
Campbell, A., Sapochnik, M., & Muncer, S. (1997). Sex differences in aggression: Does social representation mediate form of aggression? British Journal of Social Psychology, 36, 161–171.
Cantero, J. L., & Atienza, M. (2005). The role of neural synchronization in the emergence of cognition across the wake–sleep cycle. Reviews in Neuroscience, 16, 69–83.
Carré, J. M., & McCormick, C. M. (2008). In your face: Facial metrics predict aggressive behaviour in the laboratory and in varsity and professional hockey players. Proceedings of Royal Society B: Social and Biological Sciences, 275, 2651–2656.
Connor, D. F., Steingard, R. J., Anderson, J. J., & Melloni, J. R. H. (2008). Gender differences in reactive and proactive aggression. Child Psychiatry and Human Development, 33, 279–294.
Cooper, M. J., & Fairburn, C. G. (1994). Changes in selective information processing with three psychological treatments for bulimia nervosa. British Journal of Clinical Psychology, 33, 353–356.
Crick, N. R., Casas, J. F., & Mosher, M. (1997). Relational and overt aggression in preschool. Developmental Psychology, 33, 579–588.
Dawkins, L., Powell, J. H., West, R., Powell, J., & Pickering, A. (2006). A double-blind placebo controlled experimental study of nicotine: I. Effects on incentive motivation. Psychopharmacology, 189, 355–367.
Delorme, A., & Makeig, S. (2004). EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods, 134, 9–21.
Dimberg, U., & Lundquist, L. O. (1990). Gender differences in facial reactions to facial expressions. Biological Psychology, 30, 151–159.
Doppelmayr, M., Stadler, W., Sauseng, P., Rachbauer, D., & Klimesch, W. (2002, March). Gender-related differences in theta bandpower changes of the EEG during the presentation of erotic and child related stimuli. Presented at the 12th Annual Conference Emotions and the Brain, Toronto, Canada.
Eagly, A., & Johnson, B. T. (1990). Gender and leadership style: A meta-analysis. Psychological Bulletin, 108, 233–256.
Eimer, M., & Holmes, A. (2007). Event-related brain potential correlates of emotional face processing. Neuropsychologia, 45, 15–31.
Ekman, P. (1993). Facial expression and emotion. American Psychologist, 48, 384–392.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologist Press.
Eschenbeck, H., Kohlmann, C., Heim-Dreger, U., Koller, D., & Lesser, M. (2005). Processing bias and anxiety in primary school children: A modified emotional Stroop colour-naming task using pictorial facial expressions. Psychology Science, 46, 451–465.
Friston, K. J. (1997). Testing for anatomically specified regional effects. Human Brain Mapping, 5, 133–136.
Garcia-Garcia, M., Domínguez-Borràs, J., SanMiguel, I., & Escera, C. (2008). Electrophysiological and behavioral evidence of gender differences in the modulation of distraction by the emotional context. Biological Psychology, 79, 307–316.
Gasbarri, A., Arnone, B., Pompili, A., Marchetti, A., Pacitti, F., Calil, S. S., . . . Tomaz, C. (2006). Sex-related lateralized effect of emotional content on declarative memory: An event-related potential study. Behavior and Brain Research, 168, 177–184.
Gijsbers van Wijk, C. M., van Vliet, K. P., Kolk, A. M., & Everaerd, W. T. (1991). Symptom sensitivity and sex differences in physical morbidity: A review of health surveys in the United States and the Netherlands. Women and Health, 17, 91–124.
Güntekin, B., & Başar, E. (2007). Emotional face expressions are differentiated with brain oscillations. International Journal of Psychophysiology, 64, 91–100.
Hampson, E., van Anders, S. M., & Mullin, L. I. (2006). A female advantage in the recognition of emotional facial expressions: Test of an evolutionary hypothesis. Evolution and Human Behavior, 27, 401–416.
Hess, N. H., & Hagen, E. H. (2006). Sex differences in indirect aggression: Psychological evidence from young adults. Evolution and Human Behavior, 27, 231–245.
Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6, 65–70.
Intriligator, J., & Polich, J. (1995). On the relationship between EEG and ERP variability. International Journal of Psychophysiology, 20, 59–74.
Kamarajan, C., Rangaswamy, M., Chorlian, D. B., Manz, N., Tang, Y., Pandey, A. K., . . . Porjesz, B. (2008). Theta oscillations during the processing of monetary loss and gain: A perspective on gender and impulsivity. Brain Research, 1235, 45–62.
Kindt, M., & Brosschot, J. F. (1997). Phobia-related cognitive bias for pictorial and linguistic stimuli. Journal of Abnormal Psychology, 106, 644–648.
Klimesch, W. (1999). EEG alpha and theta oscillations reflect cognitive and memory performance: A review and analysis. Brain Research Reviews, 29, 169–195.
Knyazev, G. G. (2007). Motivation, emotion, and their inhibitory control mirrored in brain oscillations. Neuroscience and Biobehavioral Reviews, 31, 377–395.
Knyazev, G. G., Bocharov, A. V., Levin, E. A., Savostyanov, A. N., & Slobodskoj-Plusnin, J. Y. (2008). Anxiety and oscillatory responses to emotional facial expressions. Brain Research, 1227, 174–188.
Knyazev, G. G., Bocharov, A. V., Slobodskaya, H. R., & Ryabichenko, T. I. (2008). Personality-linked biases in perception of emotional facial expressions. Personality and Individual Differences, 44, 1093–1104.
Knyazev, G. G., Bocharov, A. V., & Slobodskoj-Plusnin, J. Y. (2009). Hostility- and gender-related differences in oscillatory responses to emotional facial expressions. Aggressive Behavior, 35, 502–513.
Knyazev, G. G., Slobodskoj-Plusnin, J. Y., & Bocharov, A. V. (2009). Event-related delta and theta synchronization during explicit and implicit emotion processing. Neuroscience, 164, 1588–1600. doi:10.1016/j.neuroscience.2009.09.057
Kolassa, I. T., & Miltner, W. H. (2006). Psychophysiological correlates of face processing in social phobia. Brain Research, 1118, 130–141.
Krause, C. M., Viemero, V., Rosenqvist, A., Sillanmaki, L., & Astrom, T. (2000). Relative electroencephalographic desynchronization and synchronization in humans to emotional film content: An analysis of the 4–6, 6–8, 8–10 and 10–12 Hz frequency bands. Neuroscience Letters, 286, 9–12.
Kroenke, K., & Spitzer, R. L. (1998). Gender differences in the reporting of physical and somatoform symptoms. Psychosomatic Medicine, 60, 150–155.
Kuebli, J., & Fivush, R. (1992). Gender differences in parent–child conversations about past emotions. Sex Roles, 27, 683.
Kumpfer, K. L., Smith, P., & Summerhays, J. F. (2008). A wakeup call to the prevention field: Are prevention programs for substance use effective for girls? Substance Use & Misuse, 43, 978–1001.
Ladwig, K. H., Marten-Mittag, B., Formanek, B., & Dammann, G. (2000). Gender differences of symptom reporting and medical health care utilization in the German population. European Journal of Epidemiology, 16, 511–518.
LaFrance, M., & Banaji, M. (1992). Toward a reconsideration of the gender–emotion relationship. Review of Personality and Social Psychology, 14, 178–201.
Lahelma, E., Martikainen, P., Rahkonen, O., & Silventoinen, K. (1999). Gender differences in ill health in Finland: Patterns, magnitude and change. Social Science & Medicine, 48, 7–19.
Langlois, J. H., & Downs, A. C. (1990). Mothers, fathers, and peers as socialization agents of sex-typed play behaviors in young children. Child Development, 51, 1237–1247.
Lento-Zwolinski, J. (2007). College students' self-report of psychosocial factors in reactive forms of relational and physical aggression. Journal of Social and Personal Relationships, 24, 407–421.
Leppänen, J. M., & Hietanen, J. K. (2001). Emotion recognition and social adjustment in school-aged girls and boys. Scandinavian Journal of Psychology, 42, 429–435.
Leppänen, J. M., Moulson, M. C., Vogel-Farley, V. K., & Nelson, C. A. (2007). An ERP study of emotional face processing in the adult and infant brain. Child Development, 78, 232–245.
Libet, B. (2003). Timing of conscious experience: Reply to the 2002 commentaries on Libet's findings. Consciousness and Cognition, 12, 321–331.
Macintyre, S., Ford, G., & Hunt, K. (1999). Do women "over-report" morbidity? Men's and women's responses to structured prompting on a standard question on long standing illness. Social Science & Medicine, 48, 89–98.
Makeig, S. (1993). Auditory event-related dynamics of the EEG spectrum and effects of exposure to tones. Electroencephalography and Clinical Neurophysiology, 86, 283–293.
McKenna, F., & Sharma, D. (1995). Intrusive cognitions: An investigation of the emotional Stroop task. Journal of Experimental Psychology: Learning, Memory, and Cognition, 21, 1595–1607.
Milner, A. D., & Goodale, M. A. (1995). The visual brain in action. Oxford, England: Oxford University Press.
Mitchell, D. J., McNaughton, N., Flanagan, D., & Kirk, I. J. (2008). Frontal-midline theta from the perspective of hippocampal "theta." Progress in Neurobiology, 86, 156–185.
Mogg, K., Bradley, B. P., Millar, N., & White, J. (1995). A follow-up study of cognitive bias in generalized anxiety disorder. Behavior Research and Therapy, 33, 927–935.
Moita, M. A., Rosis, S., Zhou, Y., LeDoux, J. E., & Blair, H. T. (2003). Hippocampal place cells acquire location-specific responses to the conditioned stimulus during auditory fear conditioning. Neuron, 37, 372–374.
Nishitani, N. (2003). Dynamics of cognitive processing in the human hippocampus by neuromagnetic and neurochemical assessments. NeuroImage, 20, 561–571.
Nunez, P. L. (2000). Toward a quantitative description of large-scale neocortical dynamic function and EEG. Behavioral and Brain Sciences, 23, 371–398.
Ostrov, J. M., & Keating, C. F. (2004). Gender differences in preschool aggression during free play and structured interactions: An observational study. Social Development, 13, 255–277.
Pare, P. (2003). Role of the basolateral amygdala in memory consolidation. Progress in Neurobiology, 70, 409–420.
Pare, P., & Collins, D. R. (2000). Neuronal correlates of fear in the lateral amygdala: Multiple extracellular recordings in conscious cats. Journal of Neuroscience, 20, 2701–2710.
Pfurtscheller, G., & Aranibar, A. (1977). Event-related cortical desynchronization detected by power measurement of scalp EEG. Electroencephalography and Clinical Neurophysiology, 42, 817–826.
Pizzagalli, D. A., Oakes, T. R., & Davidson, R. J. (2003). Coupling of theta activity and glucose metabolism in the human rostral anterior cingulate cortex: An EEG/PET study of normal and depressed subjects. Psychophysiology, 40, 939–949.
Proverbio, A. M., Brignone, V., Matarazzo, S., Del Zotto, M., & Zani, A. (2006). Gender and parental status affect the visual cortical response to infant facial expression. Neuropsychologia, 44, 2987–2999.
Roschke, J., & Fell, J. (1997). Spectral analysis of P300 generation in depression and schizophrenia. Neuropsychobiology, 35, 108–114.
Rotter, N. G., & Rotter, G. S. (1988). Sex differences in the encoding and decoding of negative facial emotions. Journal of Nonverbal Behavior, 12, 139–145.
Rutter, M., & Taylor, E. (2006). Child and adolescent psychiatry (4th ed.). Malden, MA: Blackwell Science.
Sainsbury, R. S., & Montoya, P. (1984). The relationship between type 2 theta and behavior. Physiology & Behavior, 33, 621–626.
Seidenbecher, T., Laxmi, T. R., Stork, O., & Pape, H. C. (2003, August 8). Amygdalar and hippocampal theta rhythm synchronization during fear memory retrieval. Science, 301, 846–850.
Shen, X. (2005). Sex differences in perceptual processing: Performance on the color-Kanji Stroop task of visual stimuli. International Journal of Neuroscience, 115, 1631–1641.
Sieswerda, A., Arntz, A., & Kindt, M. (2007). Successful psychotherapy reduces hypervigilance in borderline personality disorder. Behavioural and Cognitive Psychotherapy, 35, 387–482.
Thayer, B. J., & Johnsen, F. H. (2000). Sex differences in judgment of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology, 41, 243–246.
Toates, F. (1998). The interaction of cognitive and stimulus–response processes in the control of behaviour. Neuroscience & Biobehavioral Reviews, 22, 59–83.
Treisman, A. M., & Kanwisher, N. G. (1998). Perceiving visually presented objects: Recognition, awareness, and modularity. Current Opinion in Neurobiology, 8, 218–226.
Trivers, R. (1972). Parental investment and sexual selection. In B. B. Campbell (Ed.), Sexual selection and the descent of man (pp. 136–179). Chicago: Aldine.
van Honk, J., Tuiten, A., de Haan, E., van den Hout, M., & Stam, H. (2001). Attentional biases for angry faces: Relationships to trait anger and anxiety. Cognition & Emotion, 15, 279–297.
Varela, F., Lachaux, J. P., Rodriguez, E., & Martinerie, J. (2001). The brainweb: Phase synchronization and large-scale integration. Nature Reviews Neuroscience, 2, 229–239.
Velmans, M. (1991). Is human information processing conscious? Behavioral and Brain Sciences, 14, 651–726.
Vinogradova, O. S. (1995). Expression, control and probable functional significance of the neuronal theta-rhythm. Progress in Neurobiology, 45, 523–583.
Waller, E., & Scheidt, C. E. (2006). Somatoform disorders as disorders of affect regulation: A development perspective. International Review of Psychiatry, 18, 13–24.
Williams, J. M., Mathews, A., & MacLeod, C. (1996). The emotional Stroop task and psychopathology. Psychological Bulletin, 120, 3–24.
Worsley, K. J., Marrett, S., Neelin, P., Vandal, A. C., Friston, K. J., & Evans, A. C. (1996). A unified statistical approach for determining significant signals in images of cerebral activation. Human Brain Mapping, 4, 58–73.
Received June 15, 2009
Revision received December 18, 2009
Accepted January 9, 2010