Human recognition of emotions in voices: a fNIRS study

Gruber, T.1,2*, Frühholz, S.3*, Debracque, C.1,2, Igloi, K.2,4,5, Marin Bosch, B.4 & Grandjean, D.1,2

1 Neuroscience of Emotion and Affective Dynamics Lab, Department of Psychology, University of Geneva, Geneva, Switzerland; 2 Swiss Center for Affective Sciences, University of Geneva, Geneva, Switzerland; 3 Department of Psychology, University of Zürich, Zürich, Switzerland; 4 Department of Neuroscience, Faculty of Medicine, University of Geneva, Geneva, Switzerland; 5 Geneva Neuroscience Center, University of Geneva, Geneva, Switzerland; * shared first authorship

[email protected]

Introduction
Human vocalizations convey several kinds of information: in particular, variations in vocal tone, known as prosody, can signal the emotional state of the speaker. In recent years, the emergence of affective neuroscience has made it possible to understand how emotions are decoded by the human brain, particularly through functional imaging. Previous studies have pointed to a role of the right inferior frontal cortex (IFC) in the attentive decoding and cognitive evaluation of emotional cues in human vocalizations [1, 2]. Bilateral IFC activations may depend on the nature of the emotional vocalizations (emotional prosody versus nonverbal expressions) and on the level of attentive processing (explicit versus implicit), suggesting that several IFC subregions integrate different acoustic information in order to attribute implicit or explicit meanings [3, 4, 5, 6, 7, 8]. In this study, our goal was to investigate the frontal lateralization of the processing of human emotional vocalizations during explicit and implicit categorization and discrimination. To this end, we developed an experimental protocol to collect behavioral and functional brain data using functional near-infrared spectroscopy (fNIRS).

Experimental design and data analysis
• N = 30 healthy subjects
• 15 males and 15 females
• 26.3 ± 4.7 years old

Stimuli:
• 36 pseudo-words ("no-words", e.g. Bilam, Molem, Namil)
• 2 male and 2 female speakers
• Neutral, angry and fearful tones (figure 1)

Protocol: 10 blocks covering five tasks: passive listening, word categorization, word discrimination, emotion categorization and emotion discrimination (figure 1).

Data recording:
• fNIRS OXYMON from Artinis
• 8 channels: 2 blocks of 4 channels on right and left PFC (figure 2)
• 765–855 nm laser sources
• Fast data collection at 250 Hz

Figure 2: Optode positions for the right hemisphere
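For context, the two laser wavelengths are what allow the OXYMON to separate oxy- from deoxy-haemoglobin: optical-density changes at 765 and 855 nm are converted into concentration changes via the modified Beer-Lambert law. Below is a minimal Python sketch of that conversion; the extinction coefficients, source-detector distance and pathlength factor are placeholder values for illustration, since the poster does not specify the conversion parameters.

```python
import numpy as np

# Placeholder extinction coefficients [1/(mM*cm)] for HbO and HbR at the two
# wavelengths -- illustrative values only; real pipelines use tabulated spectra.
EPS = np.array([[0.6, 1.6],   # 765 nm: [eps_HbO, eps_HbR]
                [1.1, 0.7]])  # 855 nm: [eps_HbO, eps_HbR]
DIST = 3.0   # source-detector separation in cm (assumed)
DPF = 6.0    # differential pathlength factor (assumed)

def od_to_concentration(d_od_765, d_od_855):
    """Modified Beer-Lambert law: solve
    dOD(lambda) = (eps_HbO * dHbO + eps_HbR * dHbR) * DIST * DPF
    for the concentration changes [dHbO, dHbR] in mM."""
    rhs = np.array([d_od_765, d_od_855])
    d_hbo, d_hbr = np.linalg.solve(EPS * DIST * DPF, rhs)
    return d_hbo, d_hbr
```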


Figure 1: Experimental protocol. Each block consisted of mini blocks in which the pseudo-words (Bilam, Molem, Namil) were presented in neutral, angry or fearful tones for 10 sec each, followed by a motor response.

Data analysis:
• 26 subjects retained for analysis
• MATLAB version 2012a (MathWorks, Inc., Natick, Massachusetts, USA)
• Butterworth band-pass filter: high pass 0.02 Hz, low pass 0.2 Hz (see the sketch below)
• SPSS (IBM Corp., Armonk, NY)
• Generalized linear models (GLMs) with repeated measures: modality x task x optode x laterality
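As a concrete illustration of the band-pass step listed above, the filtering could look like the following sketch. The poster reports that this was done in MATLAB; the Python/SciPy version here is an equivalent reconstruction, and the function name and simulated input are hypothetical.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # sampling rate reported above (Hz)

def bandpass_filter(trace, low_hz=0.02, high_hz=0.2, order=3, fs=FS):
    """Zero-phase Butterworth band-pass: removes slow drifts (< 0.02 Hz)
    and faster physiological noise (> 0.2 Hz) from a haemoglobin trace."""
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="bandpass")
    return filtfilt(b, a, trace)

# Example: filter one simulated 60-s channel recording
raw = np.random.randn(int(60 * FS))
clean = bandpass_filter(raw)
```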

Results
• Oxy-haemoglobin data: significant three-way Task x Laterality x Optode interaction (General Linear Model, within-subject effects, F = 2.938, p = 0.039, with sphericity assumed; Mauchly's test of sphericity: W = 0.683, p = 0.107). This effect was driven by significantly more activation during categorisation at the lower optode on the left side (paired t-test, t = 2.268, p = 0.032).
• Deoxy-haemoglobin data: significant Modality x Task interaction (General Linear Model, within-subject effects, F = 5.870, p = 0.023, with sphericity assumed; Mauchly's test of sphericity: W = 1.0). There was more activity during categorisation of words than during discrimination of emotions (figure 3).
• Total haemoglobin data: significant Task x Optode interaction (General Linear Model, within-subject effects, F = 2.919, p = 0.040, with sphericity assumed; Mauchly's test of sphericity: W = 0.969, p = 0.980). There was more activity during categorisation than during discrimination tasks at the lower optodes (figure 4).
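For readers who want to reproduce this kind of modality x task x optode x laterality repeated-measures analysis outside SPSS, a minimal sketch with statsmodels is shown below. The data frame layout, column names and file name are hypothetical, and the Mauchly tests and follow-up paired t-tests reported above are not included.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format table: one row per subject x condition cell,
# with the mean haemoglobin change in column "hb".
df = pd.read_csv("fnirs_channel_means.csv")  # hypothetical file name

anova = AnovaRM(
    df, depvar="hb", subject="subject",
    within=["modality", "task", "optode", "laterality"],
    aggregate_func="mean",  # average any replicate rows per cell
).fit()
print(anova.anova_table)  # F and p values for main effects and interactions
```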

Figure 3: Deoxy data illustrating the Modality x Task interaction

Figure 4: Total haemoglobin data illustrating the Task x Optode interaction

Conclusion and future work In this study, we showed that near-infrared spectroscopy is a suitable method for studying cognitive paradigms related to emotions in humans, in particular categorization and discrimination. We demonstrated significant differences in frontal activations during our tasks, with a marked effect of categorization compared to discrimination. This effect was strongest at the lower optode, corresponding to the inferior frontal gyrus region already highlighted in previous studies of emotional prosody categorization [9]. Interestingly, our study suggests that high-order categorization processes occur more in the left hemisphere than in the right. Whether this result is connected to nearby centers related to language abilities (e.g. Broca's area) and to the ability to verbally represent and categorize emotions will need to be investigated in further studies.

References
1. Schirmer, A. & Kotz, S.A., Beyond the right hemisphere: brain mechanisms mediating vocal emotional processing. Trends Cogn. Sci., 2006. 10: p. 24-30.
2. Wildgruber, D., et al., A cerebral network model of speech prosody comprehension. Int. J. Speech Lang. Pathol., 2009. 11: p. 277-281.
3. Bach, D.R., et al., The effect of appraisal level on processing of emotional prosody in meaningless speech. Neuroimage, 2008. 42: p. 919-927.
4. Ethofer, T., et al., Differential influences of emotion, task, and novelty on brain regions underlying the processing of speech melody. J. Cogn. Neurosci., 2009. 21: p. 1255-1268.
5. Mitchell, R.L., et al., The neural response to emotional prosody, as revealed by functional magnetic resonance imaging. Neuropsychologia, 2003. 41: p. 1410-1421.
6. Sander, D., et al., Emotion and attention interactions in social cognition: brain regions involved in processing anger prosody. Neuroimage, 2005. 28: p. 848-858.
7. Buchanan, T.W., et al., Recognition of emotional prosody and verbal components of spoken language: an fMRI study. Cogn. Brain Res., 2000. 9: p. 227-238.
8. Wildgruber, D., et al., Distinct frontal regions subserve evaluation of linguistic and emotional aspects of speech intonation. Cereb. Cortex, 2004. 14: p. 1384-1389.
9. Frühholz, S. & Grandjean, D., Processing of emotional vocalizations in bilateral inferior frontal cortex. Neurosci. Biobehav. Rev., 2013. 37: p. 2847-2855.

Neuroscience of Emotion and Affective Dynamics Lab cms.unige.ch/fapse/neuroemo/
