Sample records for emotional faces processing

  1. Face to face with emotion: holistic face processing is modulated by emotional state.

    PubMed

    Curby, Kim M; Johnson, Kareem J; Tyson, Alyssa

    2012-01-01

    Negative emotions are linked with a local, rather than global, visual processing style, which may preferentially facilitate feature-based, relative to holistic, processing mechanisms. Because faces are typically processed holistically, and because social contexts are prime elicitors of emotions, we examined whether negative emotions decrease holistic processing of faces. We induced positive, negative, or neutral emotions via film clips and measured holistic processing before and after the induction: participants made judgements about cued parts of chimeric faces, and holistic processing was indexed by the interference caused by task-irrelevant face parts. Emotional state significantly modulated face-processing style, with the negative emotion induction leading to decreased holistic processing. Furthermore, self-reported change in emotional state correlated with changes in holistic processing. These results contrast with general assumptions that holistic processing of faces is automatic and immune to outside influences, and they illustrate emotion's power to modulate socially relevant aspects of visual perception.

  2. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    PubMed

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  3. Rapid communication: Global-local processing affects recognition of distractor emotional faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2011-03-01

    Recent studies have shown links between happy faces and global, distributed attention, as well as between sad faces and local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with a broad scope of attention facilitates recognition of happy faces, and local processing associated with a narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results, along with earlier complementary results on the effect of emotion on global-local processing, support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.

  4. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    PubMed

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  5. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    PubMed Central

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  6. Face-to-face: Perceived personal relevance amplifies face processing

    PubMed Central

    Bublatzky, Florian; Pittig, Andre; Schupp, Harald T.; Alpers, Georg W.

    2017-01-01

    The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. In order to model a group situation, two faces displaying happy, neutral or angry expressions were presented. Importantly, faces were either facing the observer, or they were presented in profile view directed towards, or looking away from, each other. In Experiment 1 (n = 64), face pairs were rated regarding perceived relevance, wish-to-interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral) and face orientation (facing observer > towards > away), and interactions showed that evaluation of emotional faces strongly varies with their orientation. Experiment 2 (n = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer, conveyed by facial expression and face direction, amplifies emotional face processing within triadic group situations. PMID:28158672

  7. Improved emotional conflict control triggered by the processing priority of negative emotion.

    PubMed

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas involved in emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas coupled negatively with the dorsolateral prefrontal cortex (DLPFC). However, a significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC coupled negatively mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operated differently for negative and positive faces: it was implemented more efficiently in the negative face condition, whereas it was more devoted to inhibiting internal interference in the positive face condition. This study thus suggests a plausible mechanism of emotional conflict resolution whereby the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preemptively resolve emotional conflict.

  8. Improved emotional conflict control triggered by the processing priority of negative emotion

    PubMed Central

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-01-01

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas involved in emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas coupled negatively with the dorsolateral prefrontal cortex (DLPFC). However, a significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC coupled negatively mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operated differently for negative and positive faces: it was implemented more efficiently in the negative face condition, whereas it was more devoted to inhibiting internal interference in the positive face condition. This study thus suggests a plausible mechanism of emotional conflict resolution whereby the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preemptively resolve emotional conflict. PMID:27086908

  9. Differential emotion attribution to neutral faces of own and other races.

    PubMed

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with no significant difference in frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differences in visual experience, culture, racial stereotypes, or expectations about the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed that the interaction between emotion attribution and face race had a significant effect on face-processing strategy, as indexed by the fixation proportion on the eyes and by saccade amplitude. Additionally, pupil size was larger while processing Caucasian faces than while processing Chinese faces.

  10. Dissociable patterns of medial prefrontal and amygdala activity to face identity versus emotion in bipolar disorder.

    PubMed

    Keener, M T; Fournier, J C; Mullin, B C; Kronhaus, D; Perlman, S B; LaBarbara, E; Almeida, J C; Phillips, M L

    2012-09-01

    Individuals with bipolar disorder demonstrate abnormal social function. Neuroimaging studies in bipolar disorder have shown functional abnormalities in neural circuitry supporting face emotion processing, but have not examined face identity processing, a key component of social function. We aimed to elucidate functional abnormalities in neural circuitry supporting face emotion and face identity processing in bipolar disorder. Twenty-seven currently euthymic individuals with bipolar disorder type I and 27 healthy controls participated in an implicit face processing, block-design paradigm. Participants labeled color flashes that were superimposed on dynamically changing background faces comprising morphs either from neutral to prototypical emotion (happy, sad, angry and fearful) or from one identity to another identity depicting a neutral face. Whole-brain and amygdala region-of-interest (ROI) activities were compared between groups. Looking across both emerging face emotion and identity, there was no significant between-group difference. During processing of all emerging emotions, euthymic individuals with bipolar disorder showed significantly greater amygdala activity. During facial identity and also happy face processing, euthymic individuals with bipolar disorder showed significantly greater amygdala and medial prefrontal cortical activity compared with controls. This is the first study to examine neural circuitry supporting face identity and face emotion processing in bipolar disorder. Our findings of abnormally elevated activity in the amygdala and medial prefrontal cortex (mPFC) during face identity and happy face emotion processing suggest functional abnormalities in key regions previously implicated in social processing. This may be of future importance for examining the abnormal self-related processing, grandiosity and social dysfunction seen in bipolar disorder.

  11. Face-to-face: Perceived personal relevance amplifies face processing.

    PubMed

    Bublatzky, Florian; Pittig, Andre; Schupp, Harald T; Alpers, Georg W

    2017-05-01

    The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. In order to model a group situation, two faces displaying happy, neutral or angry expressions were presented. Importantly, faces were either facing the observer, or they were presented in profile view directed towards, or looking away from, each other. In Experiment 1 (n = 64), face pairs were rated regarding perceived relevance, wish-to-interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral) and face orientation (facing observer > towards > away), and interactions showed that evaluation of emotional faces strongly varies with their orientation. Experiment 2 (n = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer, conveyed by facial expression and face direction, amplifies emotional face processing within triadic group situations.

  12. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    PubMed

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.

  13. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    PubMed

    Innes, Bobby R; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  14. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    PubMed

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as indexed by the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions.

  15. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    PubMed Central

    Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643

  16. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    PubMed

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  17. Increased heart rate after exercise facilitates the processing of fearful but not disgusted faces.

    PubMed

    Pezzulo, G; Iodice, P; Barca, L; Chausse, P; Monceau, S; Mermillod, M

    2018-01-10

    Embodied theories of emotion assume that emotional processing is grounded in bodily and affective processes. Accordingly, the perception of an emotion re-enacts congruent sensory and affective states; and conversely, bodily states congruent with a specific emotion facilitate emotional processing. This study tests whether the ability to process facial expressions (faces having a neutral expression, expressing fear, or disgust) can be influenced by making the participants' body state congruent with the expressed emotion (e.g., high heart rate in the case of faces expressing fear). We designed a task requiring participants to categorize pictures of male and female faces that either had a neutral expression (neutral), or expressed emotions whose linkage with high heart rate is strong (fear) or significantly weaker or absent (disgust). Critically, participants were tested in two conditions: with experimentally induced high heart rate (Exercise) and with normal heart rate (Normal). Participants processed fearful faces (but not disgusted or neutral faces) faster when they were in the Exercise condition than in the Normal condition. These results support the idea that an emotionally congruent body state facilitates the automatic processing of emotionally-charged stimuli and this effect is emotion-specific rather than due to generic factors such as arousal.

  18. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    PubMed

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

    Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical discrimination task with emotional faces, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by increased attention to negative stimuli.

  19. Subliminal Face Emotion Processing: A Comparison of Fearful and Disgusted Faces

    PubMed Central

    Khalid, Shah; Ansorge, Ulrich

    2017-01-01

    Prior research has provided evidence for (1) subcortical processing of subliminal facial expressions of emotion and (2) the emotion-specificity of these processes. Here, we investigated whether this is also true for the processing of the subliminal facial display of disgust. In Experiment 1, we used differently filtered masked prime faces portraying emotionally neutral or disgusted expressions, presented prior to clearly visible target faces, to test whether the masked primes nonetheless influenced target processing. Whereas we found evidence for subliminal face congruence or priming effects, in particular reverse priming by low-spatial-frequency disgusted face primes, we did not find any support for a subcortical origin of the effect. In Experiment 2, we compared the influence of subliminal disgusted faces with that of subliminal fearful faces and demonstrated a behavioral performance difference between the two, pointing to emotion-specific processing of the disgusted facial expressions. In both experiments, we also tested for the dependence of subliminal emotional face processing on spatial attention (with mixed results, suggesting attention-independence in Experiment 1 but not in Experiment 2), and we found perfect masking of the face primes, that is, proof of the subliminality of the prime faces. Based on our findings, we speculate that subliminal facial expressions of disgust could afford easy avoidance of these faces. This could be a unique effect of disgusted faces as compared to other emotional facial displays, at least under the conditions studied here. PMID:28680413

  20. Subliminal Face Emotion Processing: A Comparison of Fearful and Disgusted Faces.

    PubMed

    Khalid, Shah; Ansorge, Ulrich

    2017-01-01

    Prior research has provided evidence for (1) subcortical processing of subliminal facial expressions of emotion and (2) the emotion-specificity of these processes. Here, we investigated whether this is also true for the processing of the subliminal facial display of disgust. In Experiment 1, we used differently filtered masked prime faces portraying emotionally neutral or disgusted expressions, presented prior to clearly visible target faces, to test whether the masked primes nonetheless influenced target processing. Whereas we found evidence for subliminal face congruence or priming effects, in particular reverse priming by low-spatial-frequency disgusted face primes, we did not find any support for a subcortical origin of the effect. In Experiment 2, we compared the influence of subliminal disgusted faces with that of subliminal fearful faces and demonstrated a behavioral performance difference between the two, pointing to emotion-specific processing of the disgusted facial expressions. In both experiments, we also tested for the dependence of subliminal emotional face processing on spatial attention (with mixed results, suggesting attention-independence in Experiment 1 but not in Experiment 2), and we found perfect masking of the face primes, that is, proof of the subliminality of the prime faces. Based on our findings, we speculate that subliminal facial expressions of disgust could afford easy avoidance of these faces. This could be a unique effect of disgusted faces as compared to other emotional facial displays, at least under the conditions studied here.

  1. Positive and negative emotion enhances the processing of famous faces in a semantic judgment task.

    PubMed

    Bate, Sarah; Haslam, Catherine; Hodgson, Timothy L; Jansari, Ashok; Gregory, Nicola; Kay, Janice

    2010-01-01

    Previous work has consistently reported a facilitatory influence of positive emotion in face recognition (e.g., D'Argembeau, Van der Linden, Comblain, & Etienne, 2003). However, those studies asked participants to make recognition judgments in response to faces, and it is unknown whether emotional valence may influence other stages of processing, such as the level of semantics. Furthermore, other evidence suggests that negative rather than positive emotion facilitates higher-level judgments when processing nonfacial stimuli (e.g., Mickley & Kensinger, 2008), and it is possible that negative emotion also influences later stages of face processing. The present study addressed this issue, examining the influence of emotional valence while participants made semantic judgments in response to a set of famous faces. Eye movements were monitored while participants performed this task, and analyses revealed a reduction in information extraction for the faces of liked and disliked celebrities compared with those of emotionally neutral celebrities. Thus, in contrast to work using familiarity judgments, both positive and negative emotion facilitated processing in this semantic-based task. This pattern of findings is discussed in relation to current models of face processing. Copyright 2009 APA, all rights reserved.

  2. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    PubMed Central

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby reduce the cognitive resources required to inhibit behavior. PMID:25759472

  3. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    PubMed Central

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2014-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but the second task found that face-sensitive ERPs varied with emotional expressions only in TD. Further, ASD showed enhanced neural responding to non-social stimuli. In TD only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in ASD, a significant positive association was found between autonomic activity and attention to mouths. Overall, ASD showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525

  4. Configural and Featural Face Processing Influences on Emotion Recognition in Schizophrenia and Bipolar Disorder.

    PubMed

    Van Rheenen, Tamsyn E; Joshua, Nicole; Castle, David J; Rossell, Susan L

    2017-03-01

    Emotion recognition impairments have been demonstrated in schizophrenia (Sz), but are less consistent and smaller in magnitude in bipolar disorder (BD). This may be related to the extent to which different face processing strategies are engaged during emotion recognition in each of these disorders. We recently showed that Sz patients had impairments in the use of both featural and configural face processing strategies, whereas BD patients were impaired only in the use of the latter. Here we examine the influence that these impairments have on facial emotion recognition in these cohorts. Twenty-eight individuals with Sz, 28 individuals with BD, and 28 healthy controls completed a facial emotion labeling task with two conditions designed to separate the use of featural and configural face processing strategies: part-based and whole-face emotion recognition. Sz patients performed worse than controls on both conditions, and worse than BD patients on the whole-face condition. BD patients performed worse than controls on the whole-face condition only. Configural processing deficits appear to influence the recognition of facial emotions in BD, whereas both configural and featural processing abnormalities impair emotion recognition in Sz. This may explain discrepancies in the profiles of emotion recognition between the disorders. (JINS, 2017, 23, 287-291).

  5. Emotion Perception or Social Cognitive Complexity: What Drives Face Processing Deficits in Autism Spectrum Disorder?

    ERIC Educational Resources Information Center

    Walsh, Jennifer A.; Creighton, Sarah E.; Rutherford, M. D.

    2016-01-01

    Some, but not all, relevant studies have revealed face processing deficits among those with autism spectrum disorder (ASD). In particular, deficits are revealed in face processing tasks that involve emotion perception. The current study examined whether either deficits in processing emotional expression or deficits in processing social cognitive…

  6. Age-Related Changes in Amygdala-Frontal Connectivity during Emotional Face Processing from Childhood into Young Adulthood

    PubMed Central

    Wu, Minjie; Kujawa, Autumn; Lu, Lisa H.; Fitzgerald, Daniel A.; Klumpp, Heide; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan

    2016-01-01

    The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful and happy faces) in 61 healthy subjects aged 7–25 years. We found age-related decreases in ventral medial prefrontal cortex (vmPFC) activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. PMID:26931629

  7. Multimodal processing of emotional information in 9-month-old infants I: emotional faces and voices.

    PubMed

    Otte, R A; Donkers, F C L; Braeken, M A K A; Van den Bergh, B R H

    2015-04-01

    Making sense of emotions manifesting in human voice is an important social skill which is influenced by emotions in other modalities, such as that of the corresponding face. Although processing emotional information from voices and faces simultaneously has been studied in adults, little is known about the neural mechanisms underlying the development of this ability in infancy. Here we investigated multimodal processing of fearful and happy face/voice pairs using event-related potential (ERP) measures in a group of 84 9-month-olds. Infants were presented with emotional vocalisations (fearful/happy) preceded by the same or a different facial expression (fearful/happy). The ERP data revealed that the processing of emotional information appearing in human voice was modulated by the emotional expression appearing on the corresponding face: Infants responded with larger auditory ERPs after fearful compared to happy facial primes. This finding suggests that infants dedicate more processing capacities to potentially threatening than to non-threatening stimuli. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Age-related changes in emotional face processing across childhood and into young adulthood: evidence from event-related potentials

    PubMed Central

    MacNamara, Annmarie; Vergés, Alvaro; Kujawa, Autumn; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan

    2016-01-01

    Socio-emotional processing is an essential part of development, and age-related changes in its neural correlates can be observed. The late positive potential (LPP) is a measure of motivated attention that can be used to assess emotional processing; however, changes in the LPP elicited by emotional faces have not been assessed across a wide age range in childhood and young adulthood. We used an emotional face matching task to examine behavior and event-related potentials (ERPs) in 33 youth aged 7 to 19 years old. Younger children were slower when performing the matching task. The LPP elicited by emotional faces but not control stimuli (geometric shapes) decreased with age; by contrast, an earlier ERP (the P1) decreased with age for both faces and shapes, suggesting increased efficiency of early visual processing. Results indicate age-related attenuation in emotional processing that may stem from increased efficiency and regulatory control when performing a socio-emotional task. PMID:26220144

  9. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions

    PubMed Central

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600–700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression. PMID:25206321

  10. A brain-computer interface for potential non-verbal facial communication based on EEG signals related to specific emotions.

    PubMed

    Kashihara, Koji

    2014-01-01

    Unlike assistive technology for verbal communication, the brain-machine or brain-computer interface (BMI/BCI) has not been established as a non-verbal communication tool for amyotrophic lateral sclerosis (ALS) patients. Face-to-face communication enables access to rich emotional information, but individuals suffering from neurological disorders, such as ALS and autism, may not express their emotions or communicate their negative feelings. Although emotions may be inferred by looking at facial expressions, emotional prediction for neutral faces necessitates advanced judgment. The process that underlies brain neuronal responses to neutral faces and causes emotional changes remains unknown. To address this problem, therefore, this study attempted to decode conditioned emotional reactions to neutral face stimuli. This direction was motivated by the assumption that if electroencephalogram (EEG) signals can be used to detect patients' emotional responses to specific inexpressive faces, the results could be incorporated into the design and development of BMI/BCI-based non-verbal communication tools. To these ends, this study investigated how a neutral face associated with a negative emotion modulates rapid central responses in face processing and then identified cortical activities. The conditioned neutral face-triggered event-related potentials that originated from the posterior temporal lobe statistically significantly changed during late face processing (600-700 ms) after stimulus, rather than in early face processing activities, such as P1 and N170 responses. Source localization revealed that the conditioned neutral faces increased activity in the right fusiform gyrus (FG). This study also developed an efficient method for detecting implicit negative emotional responses to specific faces by using EEG signals. A classification method based on a support vector machine enables the easy classification of neutral faces that trigger specific individual emotions. In accordance with this classification, a face on a computer morphs into a sad or displeased countenance. The proposed method could be incorporated as a part of non-verbal communication tools to enable emotional expression.
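
    The abstract above describes classifying neutral faces by the emotion they have been conditioned to, using a support vector machine on EEG features. A minimal sketch of that kind of decoder with scikit-learn on simulated single-trial features (trial counts, feature dimensionality, and effect size are illustrative assumptions, not the study's data):

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Simulated single-trial ERP features (e.g., mean amplitudes in a late
    # 600-700 ms window at posterior-temporal channels).
    # Labels: 0 = unconditioned neutral face, 1 = neutral face conditioned
    # to a negative emotion.
    X = rng.normal(size=(120, 8))
    y = rng.integers(0, 2, size=120)
    X[y == 1] += 0.8  # conditioned faces shift the late component

    # Standardize features, then fit a linear-kernel SVM, scored by
    # 5-fold cross-validation.
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```

    With real data, the features would be extracted per trial from the EEG epochs, and cross-validation guards against overfitting the small trial counts typical of ERP experiments.
    
    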

  11. Cognitive Bias by Gender Interaction on N170 Response to Emotional Facial Expressions in Major and Minor Depression.

    PubMed

    Wu, Xingqu; Chen, Jiu; Jia, Ting; Ma, Wentao; Zhang, Yan; Deng, Zihe; Yang, Laiqi

    2016-03-01

    States of depression are considered to relate to a cognitive bias in reactivity to emotional events. Moreover, gender may influence differences in emotional processing. The current study investigated whether there is a cognitive bias by gender interaction on emotional processing in minor depression (MiD) and major depression (MaD). The N170 component was obtained during a visual emotional oddball paradigm to manipulate the processing of emotional information in 33 MiD patients, 36 MaD patients, and 32 controls (CN). Compared with CN, in males, both MiD and MaD had lower N170 amplitudes for happy faces, but MaD had higher N170 amplitudes for sad faces; in females, both MiD and MaD had lower N170 amplitudes for happy and neutral faces, but higher N170 amplitudes for sad faces. Compared with MaD, in males, MiD had higher N170 amplitudes for happy faces and lower N170 amplitudes for sad faces; in females, MiD only had higher N170 amplitudes for sad faces. Interestingly, N170 amplitude was negatively correlated with the HDRS score for identification of happy faces in depressed patients, whereas it was positively correlated with the HDRS score for identification of sad faces. These results provide novel evidence for the mood-brightening effect, with a cognitive bias by gender interaction on emotional processing. They further suggest that women with depression may be more vulnerable than men during emotional face processing, with an unconscious negative cognitive bias, and that depressive syndromes may exist on a spectrum of severity in emotional face processing.
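
    The N170 amplitudes compared above are commonly quantified as the mean voltage in a fixed post-stimulus window of the averaged waveform. A minimal sketch of this mean-amplitude measure on a simulated waveform (sampling rate, window, channel, and amplitude values are illustrative assumptions, not taken from the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    sfreq = 500                              # sampling rate (Hz)
    times = np.arange(-0.1, 0.5, 1 / sfreq)  # epoch from -100 to 500 ms
    # Simulated averaged waveform at an occipito-temporal electrode (µV),
    # with a face-evoked negative deflection around 170 ms.
    erp = rng.normal(0.0, 0.5, size=times.size)
    win = (times >= 0.15) & (times <= 0.19)  # 150-190 ms analysis window
    erp[win] -= 4.0                          # inject the N170-like deflection

    # N170 quantified as the mean amplitude within the window.
    n170 = erp[win].mean()
    print(f"N170 mean amplitude: {n170:.1f} µV")
    ```

    Group or condition contrasts (e.g., happy vs. sad faces) then compare these per-participant mean amplitudes statistically.
    
    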

  12. Event-Related Brain Potential Correlates of Emotional Face Processing

    ERIC Educational Resources Information Center

    Eimer, Martin; Holmes, Amanda

    2007-01-01

    Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was…

  13. Time course of implicit processing and explicit processing of emotional faces and emotional words.

    PubMed

    Frühholz, Sascha; Jellinghaus, Anne; Herrmann, Manfred

    2011-05-01

    Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    PubMed

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby reduce the cognitive resources required to inhibit behavior. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
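
    The stop-signal task used in this study yields a behavioral estimate of inhibitory control, conventionally summarized as the stop-signal reaction time (SSRT). A minimal sketch of the standard integration-method estimate on simulated data (the abstract does not report SSRTs; all values here are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    go_rt = rng.normal(450, 60, size=500)  # go-trial RTs (ms)
    p_respond = 0.45                       # proportion of failed inhibitions
    mean_ssd = 220.0                       # mean stop-signal delay (ms)

    # Integration method (horse-race model): the go-RT quantile at
    # p(respond | stop signal) estimates when the stop process finishes;
    # subtracting the mean stop-signal delay gives the SSRT.
    ssrt = np.quantile(go_rt, p_respond) - mean_ssd
    print(f"estimated SSRT: {ssrt:.0f} ms")
    ```

    Longer SSRTs indicate slower, less effective inhibition; the study's frontal N200/P300 effects concern the neural side of this same inhibition process.
    
    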

  15. The development of emotion perception in face and voice during infancy.

    PubMed

    Grossmann, Tobias

    2010-01-01

    Interacting with others by reading their emotional expressions is an essential social skill in humans. How this ability develops during infancy, and what brain processes underpin infants' perception of emotion in different modalities, are the questions dealt with in this review. The first part provides a systematic review of behavioral findings on infants' developing emotion-reading abilities. The second part presents a set of new electrophysiological studies that provide insights into the brain processes underlying infants' developing abilities. Throughout, evidence from unimodal (face or voice) and multimodal (face and voice) processing of emotion is considered. The implications of the reviewed findings for our understanding of developmental models of emotion processing are discussed. The reviewed infant data suggest that (a) early in development, emotion enhances the sensory processing of faces and voices, (b) infants' ability to allocate increased attentional resources to negative emotional information develops earlier in the vocal domain than in the facial domain, and (c) at least by the age of 7 months, infants reliably match and recognize emotional information across face and voice.

  16. The Effect of Self-Referential Expectation on Emotional Face Processing

    PubMed Central

    McKendrick, Mel; Butler, Stephen H.; Grealy, Madeleine A.

    2016-01-01

    The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face. PMID:27175487

  17. The Effect of Self-Referential Expectation on Emotional Face Processing.

    PubMed

    McKendrick, Mel; Butler, Stephen H; Grealy, Madeleine A

    2016-01-01

    The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  18. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    PubMed

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

    Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence suggests that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type, and emotion intensity in face processing for individuals with and without ASD. Twelve-year-olds and 18- to 22-year-olds with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to typically developing (TD) peers, at 12 years of age and during adulthood, individuals with ASD showed a slower N170 to emotional faces. While the TD groups' P1 latency was significantly shorter in adults than in 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or of stimuli more broadly) among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion-processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.

    PubMed

    Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus

    2013-12-01

    Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity to examine perceptual and cognitive processing of facial expressions. Higher order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding for the motion-based intensity of facial expressions. The comparison of emotion with gender discrimination task revealed increased activation of inferior parietal lobule, which highlights the involvement of parietal areas in processing of high level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.

  20. Age-related changes in amygdala-frontal connectivity during emotional face processing from childhood into young adulthood.

    PubMed

    Wu, Minjie; Kujawa, Autumn; Lu, Lisa H; Fitzgerald, Daniel A; Klumpp, Heide; Fitzgerald, Kate D; Monk, Christopher S; Phan, K Luan

    2016-05-01

    The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful, and happy faces) in 61 healthy subjects aged 7-25 years. We found age-related decreases in ventral medial prefrontal cortex activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. Hum Brain Mapp 37:1684-1695, 2016. © 2016 Wiley Periodicals, Inc.

  1. Prolonged Interruption of Cognitive Control of Conflict Processing Over Human Faces by Task-Irrelevant Emotion Expression

    PubMed Central

    Kim, Jinyoung; Kang, Min-Suk; Cho, Yang Seok; Lee, Sang-Hun

    2017-01-01

    As documented by Darwin 150 years ago, emotion expressed in human faces readily draws our attention and promotes sympathetic emotional reactions. How do such reactions to the expression of emotion affect our goal-directed actions? Despite the substantial advances made in understanding the neural mechanisms of both cognitive control and emotional processing, it is not yet well known how these two systems interact. Here, we studied how emotion expressed in human faces influences cognitive control of conflict processing, spatial selective attention and inhibitory control in particular, using the Eriksen flanker paradigm. In this task, participants viewed displays of a central target face flanked by peripheral faces and were asked to judge the gender of the target face; task-irrelevant emotion expressions were embedded in the target face, the flanking faces, or both. We also monitored how emotion expression affects gender judgment performance while varying the relative timing between the target and flanker faces. As previously reported, we found robust gender congruency effects, namely slower responses to target faces whose gender was incongruent with that of the flanker faces, when the flankers preceded the target by 0.1 s. When the flankers preceded the target by 0.3 s, however, the congruency effect vanished in most of the viewing conditions, except when emotion was expressed only in the flanking faces or when congruent emotion was expressed in the target and flanking faces. These results suggest that emotional saliency can prolong a substantial degree of conflict by diverting bottom-up attention away from the target, and that inhibitory control over task-irrelevant information from flanking stimuli is weakened by emotional congruency between target and flanking stimuli. PMID:28676780
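
    The congruency effect reported above is simply the mean reaction-time cost of incongruent relative to congruent flankers. A minimal sketch of that computation on simulated gender-judgment RTs (the means, spread, and trial counts are illustrative assumptions, not the study's data):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Simulated gender-judgment RTs (ms): incongruent flankers slow responses.
    rt_congruent = rng.normal(520, 40, size=200)
    rt_incongruent = rng.normal(560, 40, size=200)

    # Flanker congruency effect: mean RT cost of incongruent flankers.
    congruency_effect = rt_incongruent.mean() - rt_congruent.mean()
    print(f"congruency effect: {congruency_effect:.1f} ms")
    ```

    In the study's design, this difference is computed separately per timing condition (e.g., 0.1 s vs. 0.3 s flanker lead) and per emotion-placement condition, and a vanishing difference indicates that the flanker conflict has been resolved.
    
    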

  2. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    PubMed Central

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

    The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in the early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, representing its evolutional significance (survival). These findings demonstrate the asymmetric engagement of bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  3. The effects of familiarity and emotional expression on face processing examined by ERPs in patients with schizophrenia.

    PubMed

    Caharel, Stéphanie; Bernard, Christian; Thibaut, Florence; Haouzir, Sadec; Di Maggio-Clozel, Carole; Allio, Gabrielle; Fouldrin, Gaël; Petit, Michel; Lalonde, Robert; Rebaï, Mohamed

    2007-09-01

    The main objective of the study was to determine whether patients with schizophrenia are deficient relative to controls in the processing of faces at different levels of familiarity and types of emotion, and at which stage such differences may occur. ERPs from 18 patients with schizophrenia and 18 controls were compared in a face identification task at three levels of familiarity (unknown, familiar, subject's own) and for three types of emotion (disgust, smiling, neutral). The schizophrenic group was less accurate than controls in face processing, especially for unknown faces and those expressing negative emotions such as disgust. P1 and N170 amplitudes were lower, and P1, N170 and P250 components had longer latencies, in patients with schizophrenia. N170 and P250 amplitudes were modulated by familiarity and face expression in a different manner in patients than in controls. Schizophrenia is associated with a generalized deficit in face processing, both in terms of familiarity and emotional expression, attributable to deficient processing at sensory (P1) and perceptual (N170) stages. These patients appear to have difficulty encoding the structure of a face and thereby do not correctly evaluate familiarity and emotion.

  4. Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies

    PubMed Central

    Fusar-Poli, Paolo; Placentino, Anna; Carletti, Francesco; Landi, Paola; Allen, Paul; Surguladze, Simon; Benedetti, Francesco; Abbamonte, Marta; Gasparotti, Roberto; Barale, Francesco; Perez, Jorge; McGuire, Philip; Politi, Pierluigi

    2009-01-01

    Background Most of our social interactions involve perception of emotional information from the faces of other people. Furthermore, such emotional processes are thought to be aberrant in a range of clinical disorders, including psychosis and depression. However, the exact neurofunctional maps underlying emotional facial processing are not well defined. Methods Two independent researchers conducted separate comprehensive PubMed (1990 to May 2008) searches to find all functional magnetic resonance imaging (fMRI) studies using a variant of the emotional faces paradigm in healthy participants. The search terms were: “fMRI AND happy faces,” “fMRI AND sad faces,” “fMRI AND fearful faces,” “fMRI AND angry faces,” “fMRI AND disgusted faces” and “fMRI AND neutral faces.” We extracted spatial coordinates and inserted them in an electronic database. We performed activation likelihood estimation analysis for voxel-based meta-analyses. Results Of the originally identified studies, 105 met our inclusion criteria. The overall database consisted of 1785 brain coordinates that yielded an overall sample of 1600 healthy participants. Quantitative voxel-based meta-analysis of brain activation provided neurofunctional maps for 1) main effect of human faces; 2) main effect of emotional valence; and 3) modulatory effect of age, sex, explicit versus implicit processing and magnetic field strength. Processing of emotional faces was associated with increased activation in a number of visual, limbic, temporoparietal and prefrontal areas; the putamen; and the cerebellum. Happy, fearful and sad faces specifically activated the amygdala, whereas angry or disgusted faces had no effect on this brain region. Furthermore, amygdala sensitivity was greater for fearful than for happy or sad faces. Insular activation was selectively reported during processing of disgusted and angry faces. However, insular sensitivity was greater for disgusted than for angry faces. 
Conversely, neural response in the visual cortex and cerebellum was observable across all emotional conditions. Limitations Although the activation likelihood estimation approach is currently one of the most powerful and reliable meta-analytical methods in neuroimaging research, it is insensitive to effect sizes. Conclusion Our study has detailed neurofunctional maps to use as normative references in future fMRI studies of emotional facial processing in psychiatric populations. We found selective differences between neural networks underlying the basic emotions in limbic and insular brain regions. PMID:19949718
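
The activation likelihood estimation (ALE) method named above models each reported focus as a Gaussian probability blob and asks, per voxel, how likely it is that at least one study activates there. A toy sketch of that idea on a 1-D grid (the coordinates, smoothing width, and grid are arbitrary assumptions for illustration, not parameters from this meta-analysis):

```python
# Illustrative 1-D sketch of the activation likelihood estimation (ALE) idea.
import math

def gaussian(d, sigma):
    """Unnormalized Gaussian falloff with distance d."""
    return math.exp(-(d * d) / (2 * sigma * sigma))

def ale_map(studies, grid, sigma=2.0):
    """Each study contributes a modeled-activation (MA) value at each grid
    point (peak Gaussian over its foci); studies are combined as the
    probability that at least one activates there: ALE = 1 - prod(1 - MA)."""
    ale = []
    for x in grid:
        prod = 1.0
        for foci in studies:
            ma = max(gaussian(x - f, sigma) for f in foci)
            prod *= (1.0 - ma)
        ale.append(1.0 - prod)
    return ale

# Hypothetical peak coordinates from three "studies"; two converge near 10-12.
studies = [[10.0, 30.0], [12.0], [11.0, 50.0]]
grid = [float(x) for x in range(0, 60, 2)]
ale = ale_map(studies, grid)
```

Grid points where several studies report nearby foci accumulate high ALE values, which is what yields the convergence maps (e.g., amygdala for fearful faces) described in the abstract; the real method additionally thresholds these values against a null distribution.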

  5. Early neural activation during facial affect processing in adolescents with Autism Spectrum Disorder.

    PubMed

    Leung, Rachel C; Pang, Elizabeth W; Cassel, Daniel; Brian, Jessica A; Smith, Mary Lou; Taylor, Margot J

    2015-01-01

    Impaired social interaction is one of the hallmarks of Autism Spectrum Disorder (ASD). Emotional faces are arguably the most critical visual social stimuli, and the ability to perceive, recognize, and interpret emotions is central to social interaction and communication and, consequently, to healthy social development. However, our understanding of the neural and cognitive mechanisms underlying emotional face processing in adolescents with ASD is limited. We recruited 48 adolescents: 24 with high-functioning ASD and 24 typically developing controls. Participants completed an implicit emotional face processing task during magnetoencephalography (MEG). We examined spatiotemporal differences in neural activation between the groups during implicit angry and happy face processing. While there were no differences in response latencies between groups across emotions, adolescents with ASD had lower accuracy on the implicit emotional face processing task when the trials included angry faces. MEG data showed atypical neural activity in adolescents with ASD during angry and happy face processing, including atypical activity in the insula, anterior and posterior cingulate, and temporal and orbitofrontal regions. Our findings demonstrate differences in neural activity during happy and angry face processing between adolescents with and without ASD. These differences in activation in social cognitive regions may index the difficulties in face processing and in comprehension of social reward and punishment in the ASD group. Thus, our results suggest that atypical neural activation contributes to impaired affect processing, and thus social cognition, in adolescents with ASD.

  6. Social anhedonia is associated with neural abnormalities during face emotion processing.

    PubMed

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. 
The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a starting point for understanding the neural basis of social motivation and our drive to seek social affiliation. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.

    PubMed

    Karl, Christian; Hewig, Johannes; Osinsky, Roman

    2016-10-01

    There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded the electroencephalogram (EEG) from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.
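
ERP components such as the EPN (200-300 ms) and LPP (400-600 ms) named above are typically quantified as the mean voltage within a post-stimulus time window. A minimal sketch of that extraction; the epoch, sampling rate, and simulated deflection are illustrative assumptions, not data from the study:

```python
# Hypothetical sketch: mean ERP amplitude within a post-stimulus time window.
def mean_amplitude(epoch, srate, onset, win_start, win_end):
    """Average voltage of `epoch` (list of samples) between win_start and
    win_end seconds after stimulus onset; `onset` is the onset sample index."""
    i0 = onset + round(win_start * srate)
    i1 = onset + round(win_end * srate)
    window = epoch[i0:i1]
    return sum(window) / len(window)

srate = 1000                       # samples per second (assumed)
onset = 200                        # stimulus onset sample (200 ms baseline)
epoch = [0.0] * 1200               # one fake single-trial epoch, in microvolts
for i in range(400, 500):          # 200-300 ms post-stimulus
    epoch[i] = -3.0                # simulated EPN-like negativity

epn = mean_amplitude(epoch, srate, onset, 0.200, 0.300)  # -> -3.0
lpp = mean_amplitude(epoch, srate, onset, 0.400, 0.600)  # ->  0.0
```

In practice such amplitudes are averaged over many trials and electrodes per condition before conditions (e.g., angry vs. neutral, or different face sequences) are compared statistically.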

  8. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia.

    PubMed

    Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue

    2009-06-15

    Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.

  9. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    PubMed

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

    It is well known that hemispheric asymmetries exist for both the analyses of low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of the above factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, when the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed on it. Despite hybrid faces being perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions also with lateral presentations. Happy faces were rated as more friendly and angry faces as less friendly relative to neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis in the specific domain of unaware (subcortical) emotion processing.

  10. Risk for Bipolar Disorder is Associated with Face-Processing Deficits across Emotions

    ERIC Educational Resources Information Center

    Brotman, Melissa A.; Skup, Martha; Rich, Brendan A.; Blair, Karina S.; Pine, Daniel S.; Blair, James R.; Leibenluft, Ellen

    2008-01-01

    The relationship between the risks for face-emotion labeling deficits and bipolar disorder (BD) among youths is examined. Findings show that youths at risk for BD did not show specific face-emotion recognition deficits. The need to provide more intense emotional information for face-emotion labeling of patients and at-risk youths is also discussed.

  11. Orienting asymmetries and physiological reactivity in dogs' response to human emotional faces.

    PubMed

    Siniscalchi, Marcello; d'Ingeo, Serenella; Quaranta, Angelo

    2018-06-19

    Recent scientific literature shows that emotional cues conveyed by human vocalizations and odours are processed in an asymmetrical way by the canine brain. In the present study, during feeding behaviour, dogs were suddenly presented with 2-D stimuli depicting human faces expressing Ekman's six basic emotions (anger, fear, happiness, sadness, surprise, and disgust) as well as a neutral expression, simultaneously in the left and right visual hemifields. A bias to turn the head towards the left (right hemisphere) rather than the right side was observed for human faces expressing anger, fear, and happiness, but an opposite bias (left hemisphere) was observed for human faces expressing surprise. Furthermore, dogs displayed higher behavioural and cardiac activity to pictures of human faces expressing a clearly arousing emotional state. Overall, the results demonstrate that dogs are sensitive to emotional cues conveyed by human faces, supporting the existence of an asymmetrical emotional modulation of the canine brain in processing basic human emotions.

  12. Neural activity and emotional processing following military deployment: Effects of mild traumatic brain injury and posttraumatic stress disorder.

    PubMed

    Zuj, Daniel V; Felmingham, Kim L; Palmer, Matthew A; Lawrence-Wood, Ellie; Van Hooff, Miranda; Lawrence, Andrew J; Bryant, Richard A; McFarlane, Alexander C

    2017-11-01

    Posttraumatic Stress Disorder (PTSD) and mild traumatic brain injury (mTBI) are common comorbidities during military deployment that affect emotional brain processing, yet few studies have examined the independent effects of mTBI and PTSD. The purpose of this study was to examine distinct differences in neural responses to emotional faces in mTBI and PTSD. Twenty-one soldiers reporting high PTSD symptoms were compared to 21 soldiers with low symptoms, and 16 soldiers who reported mTBI-consistent injury and symptoms were compared with 16 soldiers who did not sustain an mTBI. Participants viewed emotional face expressions while their neural activity was recorded (via event-related potentials) prior to and following deployment. The high-PTSD group displayed increased P1 and P2 amplitudes to threatening faces at post-deployment compared to the low-PTSD group. In contrast, the mTBI group displayed reduced face-specific processing (N170 amplitude) to all facial expressions compared to the no-mTBI group. Here, we identified distinctive neural patterns of emotional face processing, with attentional biases towards threatening faces in PTSD, and reduced emotional face processing in mTBI. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Identity-expression interaction in face perception: sex, visual field, and psychophysical factors.

    PubMed

    Godard, Ornella; Baudouin, Jean-Yves; Bonnet, Philippe; Fiori, Nicole

    2013-01-01

    We investigated the psychophysical factors underlying the identity-emotion interaction in face perception. Visual field and sex were also taken into account. Participants had to judge whether a probe face, presented in either the left or the right visual field, and a central target face belonged to the same person while emotional expression varied (Experiment 1), or to judge whether probe and target faces expressed the same emotion while identity was manipulated (Experiment 2). For accuracy, we replicated the mutual facilitation effect between identity and emotion; no sex or hemispheric differences were found. Processing speed measurements, however, showed a lesser degree of interference in women than in men, especially when matching identity while faces expressed different emotions after a left visual field probe-face presentation. Psychophysical indices can be used to determine whether these effects are perceptual (A') or instead arise at a post-perceptual decision-making stage (B"). The influence of identity on the processing of facial emotion seems to be due to perceptual factors, whereas the influence of emotion changes on identity processing seems to be related to decisional factors. In addition, men seem to be more "conservative" after a LVF/RH probe-face presentation when processing identity. Women seem to benefit from better abilities to extract invariant facial aspects related to identity.

  14. Neurobiological correlates of emotional intelligence in voice and face perception networks

    PubMed Central

    Karle, Kathrin N; Ethofer, Thomas; Jacob, Heike; Brück, Carolin; Erb, Michael; Lotze, Martin; Nizielski, Sophia; Schütz, Astrid; Wildgruber, Dirk; Kreifelts, Benjamin

    2018-01-01

    Facial expressions and voice modulations are among the most important communicational signals to convey emotional information. The ability to correctly interpret this information is highly relevant for successful social interaction and represents an integral component of emotional competencies that have been conceptualized under the term emotional intelligence. Here, we investigated the relationship of emotional intelligence as measured with the Salovey-Caruso-Emotional-Intelligence-Test (MSCEIT) with cerebral voice and face processing using functional and structural magnetic resonance imaging. MSCEIT scores were positively correlated with increased voice-sensitivity and gray matter volume of the insula accompanied by voice-sensitivity enhanced connectivity between the insula and the temporal voice area, indicating generally increased salience of voices. Conversely, in the face processing system, higher MSCEIT scores were associated with decreased face-sensitivity and gray matter volume of the fusiform face area. Taken together, these findings point to an alteration in the balance of cerebral voice and face processing systems in the form of an attenuated face-vs-voice bias as one potential factor underpinning emotional intelligence. PMID:29365199

  16. Visual Search for Faces with Emotional Expressions

    ERIC Educational Resources Information Center

    Frischen, Alexandra; Eastwood, John D.; Smilek, Daniel

    2008-01-01

    The goal of this review is to critically examine contradictory findings in the study of visual search for emotionally expressive faces. Several key issues are addressed: Can emotional faces be processed preattentively and guide attention? What properties of these faces influence search efficiency? Is search moderated by the emotional state of the…

  17. Age-related differences in event-related potentials for early visual processing of emotional faces.

    PubMed

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

    With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  18. Neural correlates of emotional intelligence in a visual emotional oddball task: an ERP study.

    PubMed

    Raz, Sivan; Dan, Orrie; Zysberg, Leehu

    2014-11-01

    The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Emotional Processing of Personally Familiar Faces in the Vegetative State

    PubMed Central

    Sharon, Haggai; Pasternak, Yotam; Ben Simon, Eti; Gruberger, Michal; Giladi, Nir; Krimchanski, Ben Zion; Hassin, David; Hendler, Talma

    2013-01-01

    Background The Vegetative State (VS) is a severe disorder of consciousness in which patients are awake but display no signs of awareness. Yet, recent functional magnetic resonance imaging (fMRI) studies have demonstrated evidence for covert awareness in VS patients by recording specific brain activations during a cognitive task. However, the possible existence of incommunicable subjective emotional experiences in VS patients remains largely unexplored. This study aimed to probe whether VS patients retain the ability to selectively process external stimuli according to their emotional value, and to look for evidence of covert emotional awareness in patients. Methods and Findings In order to explore these questions, we employed the emotive impact of observing personally familiar faces, known to provoke specific perceptual as well as emotional brain activations. Four VS patients and thirteen healthy controls first underwent an fMRI scan while viewing pictures of non-familiar faces, personally familiar faces and pictures of themselves. In a subsequent imagery task, participants were asked to actively imagine one of their parent's faces. Analyses focused on face- and familiarity-selective regional brain activations and inter-regional functional connectivity. Similar to controls, all patients displayed face-selective brain responses, with further limbic and cortical activations elicited by familiar faces. In patients as well as controls, connectivity was observed between emotional, visual and face-specific areas, suggesting aware emotional perception. This connectivity was strongest in the two patients who later recovered. Notably, these two patients also displayed selective amygdala activation during familiar face imagery, with one further exhibiting face-selective activations, indistinguishable from healthy controls. 
Conclusions Taken together, these results show that selective emotional processing can be elicited in VS patients both by external emotionally salient stimuli and by internal cognitive processes, suggesting the ability for covert emotional awareness of self and the environment in VS patients. PMID:24086365

  20. Social categories shape the neural representation of emotion: evidence from a visual face adaptation task

    PubMed Central

    Otten, Marte; Banaji, Mahzarin R.

    2012-01-01

    A number of recent behavioral studies have shown that emotional expressions are differently perceived depending on the race of a face, and that perception of race cues is influenced by emotional expressions. However, neural processes related to the perception of invariant cues that indicate the identity of a face (such as race) are often described to proceed independently of processes related to the perception of cues that can vary over time (such as emotion). Using a visual face adaptation paradigm, we tested whether these behavioral interactions between emotion and race also reflect interdependent neural representation of emotion and race. We compared visual emotion aftereffects when the adapting face and ambiguous test face differed in race or not. Emotion aftereffects were much smaller in different race (DR) trials than same race (SR) trials, indicating that the neural representation of a facial expression is significantly different depending on whether the emotional face is black or white. It thus seems that invariant cues such as race interact with variable face cues such as emotion not just at a response level, but also at the level of perception and neural representation. PMID:22403531

  1. Modulation of the composite face effect by unintended emotion cues.

    PubMed

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grows stronger as the strength of distractor emotion increases. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which distractor regions had been rated either high or low for perceived emotion. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  2. Effortful versus automatic emotional processing in schizophrenia: Insights from a face-vignette task.

    PubMed

    Patrick, Regan E; Rastogi, Anuj; Christensen, Bruce K

    2015-01-01

    Adaptive emotional responding relies on dual automatic and effortful processing streams. Dual-stream models of schizophrenia (SCZ) posit a selective deficit in neural circuits that govern goal-directed, effortful processes versus reactive, automatic processes. This imbalance suggests that when patients are confronted with competing automatic and effortful emotional response cues, they will exhibit diminished effortful responding and intact, possibly elevated, automatic responding compared to controls. This prediction was evaluated using a modified version of the face-vignette task (FVT). Participants viewed emotional faces (automatic response cue) paired with vignettes (effortful response cue) that signalled a different emotion category and were instructed to discriminate the manifest emotion. Patients made fewer vignette responses and more face responses than controls. However, the relationship between group and FVT responding was moderated by IQ and reading comprehension ability. These results replicate and extend previous research and provide tentative support for the abnormal conflict resolution between automatic and effortful emotional processing predicted by dual-stream models of SCZ.

  3. Neural circuitry of emotional face processing in autism spectrum disorders.

    PubMed

    Monk, Christopher S; Weng, Shih-Jen; Wiggins, Jillian Lee; Kurapati, Nikhil; Louro, Hugo M C; Carrasco, Melisa; Maslowsky, Julie; Risi, Susan; Lord, Catherine

    2010-03-01

    Autism spectrum disorders (ASD) are associated with severe impairments in social functioning. Because faces provide nonverbal cues that support social interactions, many studies of ASD have examined neural structures that process faces, including the amygdala, ventromedial prefrontal cortex and superior and middle temporal gyri. However, increases or decreases in activation are often contingent on the cognitive task. Specifically, the cognitive domain of attention influences group differences in brain activation. We investigated brain function abnormalities in participants with ASD using a task that monitored attention bias to emotional faces. Twenty-four participants (12 with ASD, 12 controls) completed a functional magnetic resonance imaging study while performing an attention cuing task with emotional (happy, sad, angry) and neutral faces. In response to emotional faces, those in the ASD group showed greater right amygdala activation than those in the control group. A preliminary psychophysiological connectivity analysis showed that ASD participants had stronger positive right amygdala and ventromedial prefrontal cortex coupling and weaker positive right amygdala and temporal lobe coupling than controls. There were no group differences in the behavioural measure of attention bias to the emotional faces. The small sample size may have affected our ability to detect additional group differences. When attention bias to emotional faces was equivalent between ASD and control groups, ASD was associated with greater amygdala activation. 
Preliminary analyses showed that ASD participants had stronger connectivity between the amygdala and ventromedial prefrontal cortex (a network implicated in emotional modulation) and weaker connectivity between the amygdala and temporal lobe (a pathway involved in the identification of facial expressions, although areas of group differences were generally in a more anterior region of the temporal lobe than what is typically reported for emotional face processing). These alterations in connectivity are consistent with emotion and face processing disturbances in ASD.

  4. Alcoholism and dampened temporal limbic activation to emotional faces.

    PubMed

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be part of the wide array of behavioral problems, including disinhibition, consistent with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  5. Down Syndrome and Automatic Processing of Familiar and Unfamiliar Emotional Faces

    ERIC Educational Resources Information Center

    Morales, Guadalupe E.; Lopez, Ernesto O.

    2010-01-01

    Participants with Down syndrome (DS) took part in a face recognition experiment requiring them to recognize familiar (DS faces) and unfamiliar (non-DS faces) emotional faces, using an affective priming paradigm. Pairs of emotional facial stimuli were presented (one face after another) with a short Stimulus Onset Asynchrony of 300…

  6. Different brain activity in response to emotional faces alone and augmented by contextual information.

    PubMed

    Lee, Kyung Hwa; Siegle, Greg J

    2014-11-01

    This study examined the extent to which the neural reactivity associated with emotional face stimuli differs from that associated with more ecological, contextually augmented stimuli. Participants were scanned while they viewed contextually rich pictures depicting both emotional faces and context, and pictures of emotional faces presented alone. Emotional faces alone were more strongly associated with brain activity in paralimbic and social information processing regions, whereas emotional faces augmented by context were associated with increased and sustained activity in regions potentially representing increased complexity and subjective emotional experience. Furthermore, context effects were modulated by emotional intensity and valence. These findings suggest that cortical elaboration that is apparent in contextually augmented stimuli may be missed in studies of emotional faces alone, whereas emotional faces may more selectively recruit limbic reactivity. Copyright © 2014 Society for Psychophysiological Research.

  7. Attentional Modulation of Emotional Conflict Processing with Flanker Tasks

    PubMed Central

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive, largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degree attention modulates the processing of emotion, using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color-scale faces with neutral expressions or gray-scale faces with emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion), and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to that of color and face identity (gender). However, task modulation of the color SRC effect was significantly greater than that of the gender or emotion SRC effect, indicating that processing of salient information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time the salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than for non-emotional stimuli. PMID:23544155

  8. Attentional modulation of emotional conflict processing with flanker tasks.

    PubMed

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive, largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degree attention modulates the processing of emotion, using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color-scale faces with neutral expressions or gray-scale faces with emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion), and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to that of color and face identity (gender). However, task modulation of the color SRC effect was significantly greater than that of the gender or emotion SRC effect, indicating that processing of salient information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time the salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than for non-emotional stimuli.
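
The conflict measures in the flanker records above reduce to simple difference scores: an SRC effect is the reaction-time cost of incongruent relative to congruent trials, and task modulation is the difference between that effect under task-relevant versus task-irrelevant conditions. A minimal sketch of that arithmetic (all reaction times below are made-up illustrative values, not data from the study):

```python
# Hypothetical illustration of stimulus-response-compatibility (SRC) scoring.
# All reaction times (ms) are invented for illustration only.

def src_effect(rt_incongruent: float, rt_congruent: float) -> float:
    """SRC effect: RT cost of incongruent relative to congruent trials."""
    return rt_incongruent - rt_congruent

# Mean RTs for the emotion dimension, split by whether the conflicting
# emotion cue was task relevant or task irrelevant (hypothetical values).
emotion_relevant = src_effect(rt_incongruent=620.0, rt_congruent=575.0)
emotion_irrelevant = src_effect(rt_incongruent=600.0, rt_congruent=585.0)

# Task modulation: how much larger the SRC effect is when the conflicting
# dimension is task relevant (a difference of differences).
task_modulation = emotion_relevant - emotion_irrelevant

print(emotion_relevant, emotion_irrelevant, task_modulation)
```

With these invented numbers the task-relevant SRC effect (45 ms) exceeds the task-irrelevant one (15 ms), yielding a 30 ms task-modulation score; the study's claim is that this modulation score was smaller for emotion than for color.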

  9. The effects of early institutionalization on emotional face processing: evidence for sparing via an experience-dependent mechanism.

    PubMed

    Young, Audrey; Luyster, Rhiannon J; Fox, Nathan A; Zeanah, Charles H; Nelson, Charles A

    2017-09-01

    Early psychosocial deprivation has profound adverse effects on children's brain and behavioural development, including abnormalities in physical growth, intellectual function, social cognition, and emotional development. Nevertheless, the domain of emotional face processing has appeared in previous research to be relatively spared; here, we test for possible sleeper effects emerging in early adolescence. This study employed event-related potentials (ERPs) to examine the neural correlates of facial emotion processing in 12-year-old children who took part in a randomized controlled trial of foster care as an intervention for early institutionalization. Results revealed no significant group differences in two face and emotion-sensitive ERP components (P1 and N170), nor any association with age at placement or per cent of lifetime spent in an institution. These results converged with previous evidence from this population supporting relative sparing of facial emotion processing. We hypothesize that this sparing is due to an experience-dependent mechanism in which the amount of exposure to faces and facial expressions of emotion children received was sufficient to meet the low threshold required for cortical specialization of structures critical to emotion processing. Statement of contribution What is already known on this subject? Early psychosocial deprivation leads to profoundly detrimental effects on children's brain and behavioural development. With respect to children's emotional face processing abilities, few adverse effects of institutionalized rearing have previously been reported. Recent studies suggest that 'sleeper effects' may emerge many years later, especially in the domain of face processing. What does this study add? Examining a cumulative 12 years of data, we found only minimal group differences and no evidence of a sleeper effect in this particular domain. 
These findings identify emotional face processing as a unique ability in which relative sparing can be found. We propose an experience-dependent mechanism in which the amount of social interaction children received met the low threshold required for cortical specialization. © 2017 The British Psychological Society.

  10. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    PubMed

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.

  11. Effect of distracting faces on visual selective attention in the monkey.

    PubMed

    Landman, Rogier; Sharma, Jitendra; Sur, Mriganka; Desimone, Robert

    2014-12-16

    In primates, visual stimuli with social and emotional content tend to attract attention. Attention might be captured through rapid, automatic, subcortical processing or guided by slower, more voluntary cortical processing. Here we examined whether irrelevant faces with varied emotional expressions interfere with a covert attention task in macaque monkeys. In the task, the monkeys monitored a target grating in the periphery for a subtle color change while ignoring distracters that included faces appearing elsewhere on the screen. The onset time of distracter faces before the target change, as well as their spatial proximity to the target, was varied from trial to trial. The presence of faces, especially faces with emotional expressions, interfered with the task, indicating a competition for attentional resources between the task and the face stimuli. However, this interference was significant only when faces were presented for more than 200 ms. Emotional faces also affected saccade velocity and reduced the pupillary reflex. Our results indicate that the attraction of attention by emotional faces in the monkey takes a considerable amount of processing time, possibly involving cortical-subcortical interactions. Intranasal application of the hormone oxytocin ameliorated the interfering effects of faces. Together these results provide evidence for slow modulation of attention by emotional distracters, which likely involves oxytocinergic brain circuits.

  12. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    PubMed Central

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454

  13. No fear, no panic: probing negation as a means for emotion regulation

    PubMed Central

    Deutsch, Roland; Platte, Petra; Pauli, Paul

    2013-01-01

    This electroencephalographic study investigated whether negating one’s emotion results in paradoxical effects or leads to effective emotional downregulation. Healthy participants were asked to downregulate their emotions to happy and fearful faces by using negated emotional cue words (e.g. no fun, no fear). Cue words were congruent with the emotion depicted in the face and presented prior to each face. Stimuli were presented in blocks of happy and fearful faces. Blocks of passive stimulus viewing served as a control condition. Active regulation reduced amplitudes of early event-related brain potentials (early posterior negativity, but not N170) and the late positive potential for fearful faces. A fronto-central negativity peaking at about 250 ms after target face onset showed larger amplitude modulations during downregulation of fearful and happy faces. Behaviorally, negating was more associated with reappraisal than with suppression. Our results suggest that in an emotional context, negation processing could be quite effective for emotional downregulation, but that its effects depend on the type of negated emotion (pleasant vs unpleasant). Results are discussed in the context of dual process models of cognition and emotion regulation. PMID:22490924

  14. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    PubMed

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of top-down control. © The Author 2013. Published by Oxford University Press. All rights reserved.

  15. Emotional conflict occurs at an early stage: evidence from the emotional face-word Stroop task.

    PubMed

    Zhu, Xiang-ru; Zhang, Hui-jun; Wu, Ting-ting; Luo, Wen-bo; Luo, Yue-jia

    2010-06-30

    The perceptual processing of emotional conflict was studied using electrophysiological techniques to measure event-related potentials (ERPs). The emotional face-word Stroop task, in which emotion words are written in prominent red color across a face, was used to study emotional conflict. In each trial, the emotion word and facial expression were either congruent or incongruent (in conflict). When subjects were asked to identify the expression of the face during a trial, the incongruent condition evoked a more negative N170 ERP component at posterior lateral sites than did the congruent condition. In contrast, when subjects were asked to identify the word during a trial, the incongruent condition evoked a less negative N170 component than the congruent condition. The present findings extend our understanding of the control processes involved in emotional conflict by demonstrating that differentiation of emotional congruency begins at an early perceptual processing stage. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  16. Task relevance regulates the interaction between reward expectation and emotion.

    PubMed

    Wei, Ping; Kang, Guanlan

    2014-06-01

    In the present study, we investigated the impact of reward expectation on the processing of emotional facial expression using a cue-target paradigm. A cue indicating the reward condition of each trial (incentive vs. non-incentive) was followed by the presentation of a picture of an emotional face, the target. Participants were asked to discriminate the emotional expression of the target face in Experiment 1, to discriminate the gender of the target face in Experiment 2, and to judge a number superimposed on the center of the target face as even or odd in Experiment 3, rendering the emotional expression of the target face task relevant in Experiment 1 but task irrelevant in Experiments 2 and 3. Faster reaction times (RTs) were observed in the monetary incentive condition than in the non-incentive condition, demonstrating the effect of reward on facilitating task concentration. Moreover, the reward effect (i.e., the RT difference between non-incentive and incentive conditions) was larger for emotional faces than for neutral faces when emotional expression was task relevant, but not when it was task irrelevant. The findings suggest that top-down incentive motivation biased attentional processing toward task-relevant stimuli, and that task relevance played an important role in regulating the influence of reward expectation on the processing of emotional stimuli.

  17. The Relationship between Processing Facial Identity and Emotional Expression in 8-Month-Old Infants

    ERIC Educational Resources Information Center

    Schwarzer, Gudrun; Jovanovic, Bianca

    2010-01-01

    In Experiment 1, it was investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a…

  18. Emotional facial expressions reduce neural adaptation to face identity.

    PubMed

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing appears to be sustained for emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested the extent to which the ssVEP amplitude adapted to same as compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.

  19. Alcoholism and Dampened Temporal Limbic Activation to Emotional Faces

    PubMed Central

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O’Reilly, Cara E.; Howard, Julie A.; Sawyer, Kayle; Harris, Gordon J.

    2013-01-01

    Background Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Methods Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Results Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Conclusions Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be part of the wide array of behavioral problems, including disinhibition, consistent with previously documented interpersonal difficulties in this population. 
Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations. PMID:19673745

  20. Flexible and inflexible task sets: asymmetric interference when switching between emotional expression, sex, and age classification of perceived faces.

    PubMed

    Schuch, Stefanie; Werheid, Katja; Koch, Iring

    2012-01-01

    The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set "inertia") than processing the age or sex of a face.

  1. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    PubMed

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Association with emotional information alters subsequent processing of neutral faces

    PubMed Central

    Riggs, Lily; Fujioka, Takako; Chan, Jessica; McQuiggan, Douglas A.; Anderson, Adam K.; Ryan, Jennifer D.

    2014-01-01

    The processing of emotional as compared to neutral information is associated with different patterns in eye movement and neural activity. However, the ‘emotionality’ of a stimulus can be conveyed not only by its physical properties, but also by the information that is presented with it. There is very limited work examining how emotional information may influence the immediate perceptual processing of otherwise neutral information. We examined how presenting an emotion label for a neutral face may influence subsequent processing by using eye movement monitoring (EMM) and magnetoencephalography (MEG) simultaneously. Participants viewed a series of faces with neutral expressions. Each face was followed by a unique negative or neutral sentence to describe that person, and then the same face was presented in isolation again. Viewing of faces paired with a negative sentence was associated with increased early viewing of the eye region and increased neural activity between 600 and 1200 ms in emotion processing regions such as the cingulate, medial prefrontal cortex, and amygdala, as well as posterior regions such as the precuneus and occipital cortex. Viewing of faces paired with a neutral sentence was associated with increased activity in the parahippocampal gyrus during the same time window. By monitoring behavior and neural activity within the same paradigm, these findings demonstrate that emotional information alters subsequent visual scanning and the neural systems that are presumably invoked to maintain a representation of the neutral information along with its emotional details. PMID:25566024

  3. Reading Faces: Differential Lateral Gaze Bias in Processing Canine and Human Facial Expressions in Dogs and 4-Year-Old Children

    PubMed Central

    Racca, Anaïs; Guo, Kun; Meints, Kerstin; Mills, Daniel S.

    2012-01-01

    Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children, and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions. PMID:22558335

  4. Mere social categorization modulates identification of facial expressions of emotion.

    PubMed

    Young, Steven G; Hugenberg, Kurt

    2010-12-01

    The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  5. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    PubMed

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether it is possible to assign a single brain operation or psychological function for facial emotion decoding to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study brain oscillations were analyzed in response to facial expression of emotions. Specifically, GBA modulation was measured when twenty subjects looked at emotional (angry, fearful, happy, and sad faces) or neutral faces in two different conditions: subliminal (10 ms) vs supraliminal (150 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and significance of the stimulus in terms of arousal can modulate the power synchronization (ERD decrease) during the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced by supraliminal more than subliminal elaboration, as well as more by high arousal (anger and fear) than low arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in the processing of emotional faces compared with neutral faces.

  6. Perception of Emotional Facial Expressions in Amyotrophic Lateral Sclerosis (ALS) at Behavioural and Brain Metabolic Level.

    PubMed

    Aho-Özhan, Helena E A; Keller, Jürgen; Heimrath, Johanna; Uttner, Ingo; Kassubek, Jan; Birbaumer, Niels; Ludolph, Albert C; Lulé, Dorothée

    2016-01-01

    Amyotrophic lateral sclerosis (ALS) primarily impairs motor abilities but also affects cognition and emotional processing. We hypothesise that subjective ratings of emotional stimuli depicting social interactions and facial expressions are changed in ALS. It was found that recognition of negative emotions and the ability to mentalize others' intentions are reduced. Processing of emotions in faces was investigated. A behavioural test of Ekman faces expressing six basic emotions was presented to 30 ALS patients and 29 age-, gender-, and education-matched healthy controls. Additionally, a subgroup of 15 ALS patients that were able to lie supine in the scanner and 14 matched healthy controls viewed the Ekman faces during functional magnetic resonance imaging (fMRI). Affective state and the number of daily social contacts were measured. ALS patients recognized disgust and fear less accurately than healthy controls. In fMRI, reduced brain activity was seen in areas involved in processing of negative emotions, replicating our previous results. During processing of sad faces, increased brain activity was seen in areas associated with social emotions in the right inferior frontal gyrus and reduced activity in the hippocampus bilaterally. No differences in brain activity were seen for any of the other emotional expressions. Inferior frontal gyrus activity for sad faces was associated with an increased number of social contacts of ALS patients. ALS patients showed decreased brain and behavioural responses in processing of disgust and fear and an altered brain response pattern for sadness. The negative consequences of neurodegenerative processes in the course of ALS might be counteracted by positive emotional activity and positive social interactions.

  7. [Emotional intelligence and oscillatory responses on the emotional facial expressions].

    PubMed

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy and neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This was evident both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high emotional intelligence subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but it was lower in the anterior cingulate cortex upon presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative ones.

  8. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    PubMed Central

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426

  9. Emotions in word and face processing: early and late cortical responses.

    PubMed

    Schacht, Annekathrin; Sommer, Werner

    2009-04-01

    Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence, differing mainly in the speed of meaning access, which is more direct and faster for facial expressions than for words.

  10. Men appear more lateralized when noticing emotion in male faces.

    PubMed

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

    Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it. PsycINFO Database Record (c) 2012 APA, all rights reserved
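
    The "laterality quotient" used in chimeric-face tests like the one above is typically a simple ratio of lateralized choices. As an illustration only (the scoring convention, function name, and trial counts below are assumptions, not details taken from this study), one common form is LQ = (L − R) / (L + R), where L counts trials on which the chimera with the emotional half-face in the viewer's left visual field was judged more emotive:

    ```python
    # Hypothetical sketch of chimeric-face laterality scoring; sign
    # conventions differ between studies, so treat this as one option.

    def laterality_quotient(left_choices: int, right_choices: int) -> float:
        """Return LQ in [-1, 1]; positive values indicate a left-visual-field
        (right-hemisphere) bias under this convention."""
        total = left_choices + right_choices
        if total == 0:
            raise ValueError("no trials scored")
        return (left_choices - right_choices) / total

    # Example: 27 of 36 trials favored the chimera with the emotional
    # half on the viewer's left.
    lq = laterality_quotient(27, 9)  # -> 0.5
    ```

    Under this convention, sex or emotion differences in lateralization show up as group differences in mean LQ rather than in raw accuracy.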

  11. Non-verbal emotion communication training induces specific changes in brain function and structure

    PubMed Central

    Kreifelts, Benjamin; Jacob, Heike; Brück, Carolin; Erb, Michael; Ethofer, Thomas; Wildgruber, Dirk

    2013-01-01

    The perception of emotional cues from voice and face is essential for social interaction. However, this process is altered in various psychiatric conditions along with impaired social functioning. Emotion communication trainings have been demonstrated to improve social interaction in healthy individuals and to reduce emotional communication deficits in psychiatric patients. Here, we investigated the impact of a non-verbal emotion communication training (NECT) on cerebral activation and brain structure in a controlled and combined functional magnetic resonance imaging (fMRI) and voxel-based morphometry study. NECT-specific reductions in brain activity occurred in a distributed set of brain regions including face and voice processing regions as well as emotion processing- and motor-related regions presumably reflecting training-induced familiarization with the evaluation of face/voice stimuli. Training-induced changes in non-verbal emotion sensitivity at the behavioral level and the respective cerebral activation patterns were correlated in the face-selective cortical areas in the posterior superior temporal sulcus and fusiform gyrus for valence ratings and in the temporal pole, lateral prefrontal cortex and midbrain/thalamus for the response times. A NECT-induced increase in gray matter (GM) volume was observed in the fusiform face area. Thus, NECT induces both functional and structural plasticity in the face processing system as well as functional plasticity in the emotion perception and evaluation system. We propose that functional alterations are presumably related to changes in sensory tuning in the decoding of emotional expressions. Taken together, these findings highlight that the present experimental design may serve as a valuable tool to investigate the altered behavioral and neuronal processing of emotional cues in psychiatric disorders as well as the impact of therapeutic interventions on brain function and structure. PMID:24146641

  13. The influence of variations in eating disorder-related symptoms on processing of emotional faces in a non-clinical female sample: An eye-tracking study.

    PubMed

    Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan

    2016-06-30

    This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Serotonergic neurotransmission in emotional processing: New evidence from long-term recreational poly-drug ecstasy use.

    PubMed

    Laursen, Helle Ruff; Henningsson, Susanne; Macoveanu, Julian; Jernigan, Terry L; Siebner, Hartwig R; Holst, Klaus K; Skimminge, Arnold; Knudsen, Gitte M; Ramsoy, Thomas Z; Erritzoe, David

    2016-12-01

    The brain's serotonergic system plays a crucial role in the processing of emotional stimuli, and several studies have shown that a reduced serotonergic neurotransmission is associated with an increase in amygdala activity during emotional face processing. Prolonged recreational use of ecstasy (3,4-methylene-dioxymethamphetamine [MDMA]) induces alterations in serotonergic neurotransmission that are comparable to those observed in a depleted state. In this functional magnetic resonance imaging (fMRI) study, we investigated the responsiveness of the amygdala to emotional face stimuli in recreational ecstasy users as a model of long-term serotonin depletion. Fourteen ecstasy users and 12 non-using controls underwent fMRI to measure the regional neural activity elicited in the amygdala by male or female faces expressing anger, disgust, fear, sadness, or no emotion. During fMRI, participants made a sex judgement on each face stimulus. Positron emission tomography with [11C]DASB was additionally performed to assess serotonin transporter (SERT) binding in the brain. In the ecstasy users, SERT binding correlated negatively with amygdala activity, and accumulated lifetime intake of ecstasy tablets was associated with an increase in amygdala activity during angry face processing. Conversely, time since the last ecstasy intake was associated with a trend toward a decrease in amygdala activity during angry and sad face processing. These results indicate that the effects of long-term serotonin depletion resulting from ecstasy use are dose-dependent, affecting the functional neural basis of emotional face processing. © The Author(s) 2016.

  15. Distinct spatial frequency sensitivities for processing faces and emotional expressions.

    PubMed

    Vuilleumier, Patrik; Armony, Jorge L; Driver, Jon; Dolan, Raymond J

    2003-06-01

    High and low spatial frequency information in visual images is processed by distinct neural channels. Using event-related functional magnetic resonance imaging (fMRI) in humans, we show dissociable roles of such visual channels for processing faces and emotional fearful expressions. Neural responses in fusiform cortex, and effects of repeating the same face identity upon fusiform activity, were greater with intact or high-spatial-frequency face stimuli than with low-frequency faces, regardless of emotional expression. In contrast, amygdala responses to fearful expressions were greater for intact or low-frequency faces than for high-frequency faces. An activation of pulvinar and superior colliculus by fearful expressions occurred specifically with low-frequency faces, suggesting that these subcortical pathways may provide coarse fear-related inputs to the amygdala.

  16. Automatic Processing of Emotional Faces in High-Functioning Pervasive Developmental Disorders: An Affective Priming Study

    ERIC Educational Resources Information Center

    Kamio, Yoko; Wolf, Julie; Fein, Deborah

    2006-01-01

    This study examined automatic processing of emotional faces in individuals with high-functioning Pervasive Developmental Disorders (HFPDD) using an affective priming paradigm. Sixteen participants (HFPDD and matched controls) were presented with happy faces, fearful faces or objects in both subliminal and supraliminal exposure conditions, followed…

  17. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    PubMed

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  18. Looking to the eyes influences the processing of emotion on face-sensitive event-related potentials in 7-month-old infants.

    PubMed

    Vanderwert, Ross E; Westerlund, Alissa; Montoya, Lina; McCormick, Sarah A; Miguel, Helga O; Nelson, Charles A

    2015-10-01

    Previous studies in infants have shown that face-sensitive components of the ongoing electroencephalogram (the event-related potential, or ERP) are larger in amplitude to negative emotions (e.g., fear, anger) versus positive emotions (e.g., happy). However, it is still unclear whether the negative emotions linked with the face or the negative emotions alone contribute to these amplitude differences. We simultaneously recorded infant looking behaviors (via eye-tracking) and face-sensitive ERPs while 7-month-old infants viewed human faces or animals displaying happy, fear, or angry expressions. We observed that the amplitude of the N290 was greater (i.e., more negative) to angry animals compared to happy or fearful animals; no such differences were obtained for human faces. Eye-tracking data highlighted the importance of the eye region in processing emotional human faces. Infants that spent more time looking to the eye region of human faces showing fearful or angry expressions had greater N290 or P400 amplitudes, respectively. © 2014 Wiley Periodicals, Inc.

  19. An ERP Study of Emotional Face Processing in the Adult and Infant Brain

    ERIC Educational Resources Information Center

    Leppanen, Jukka M.; Moulson, Margaret C.; Vogel-Farley, Vanessa K.; Nelson, Charles A.

    2007-01-01

    To examine the ontogeny of emotional face processing, event-related potentials (ERPs) were recorded from adults and 7-month-old infants while viewing pictures of fearful, happy, and neutral faces. Face-sensitive ERPs at occipital-temporal scalp regions differentiated between fearful and neutral/happy faces in both adults (N170 was larger for fear)…

  20. Human sex differences in emotional processing of own-race and other-race faces.

    PubMed

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

    There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in emotional processing of own-race and other-race faces. The current study addressed this issue using high time resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. It was found that women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over the right hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces showing negative emotions (fear or disgust), which may reflect an evolutionary adaptation. However, the current data showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over the left hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces showing positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

  1. Effects of facial color on the subliminal processing of fearful faces.

    PubMed

    Nakajima, K; Minami, T; Nakauchi, S

    2015-12-03

    Recent studies have suggested that both configural information, such as face shape, and surface information are important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases: "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. On the other hand, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated that there was a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that the bluish-colored faces increased the latency effect of facial expressions compared to the natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  2. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    PubMed Central

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

    The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet, the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7- to 12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and emotional regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  3. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    PubMed

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. The cortical current density mapping of the N170 yielded a main neural source at the posterior section of the fusiform gyrus (maximal in the left hemisphere for words and the right hemisphere for faces and simultaneous stimuli). Neural generators of the N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and the N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  4. The beneficial effect of oxytocin on avoidance-related facial emotion recognition depends on early life stress experience.

    PubMed

    Feeser, Melanie; Fan, Yan; Weigand, Anne; Hahn, Adam; Gärtner, Matti; Aust, Sabine; Böker, Heinz; Bajbouj, Malek; Grimm, Simone

    2014-12-01

Previous studies have shown that oxytocin (OXT) enhances social cognitive processes. It has also been demonstrated that OXT does not uniformly facilitate social cognition. The effects of OXT administration strongly depend on the exposure to stressful experiences in early life. Emotional facial recognition is crucial for social cognition. However, no study has yet examined how the effects of OXT on the ability to identify emotional faces are altered by early life stress (ELS) experiences. Given the role of OXT in modulating social motivational processes, we specifically aimed to investigate its effects on the recognition of approach- and avoidance-related facial emotions. In a double-blind, between-subjects, placebo-controlled design, 82 male participants performed an emotion recognition task with faces taken from the "Karolinska Directed Emotional Faces" set. We clustered the six basic emotions along the dimensions approach (happy, surprise, anger) and avoidance (fear, sadness, disgust). ELS was assessed with the Childhood Trauma Questionnaire (CTQ). Our results showed that OXT improved the ability to recognize avoidance-related emotional faces as compared to approach-related emotional faces. Whereas the performance for avoidance-related emotions in participants with higher ELS scores was comparable in both the OXT and placebo conditions, OXT enhanced emotion recognition in participants with lower ELS scores. Independent of OXT administration, we observed increased emotion recognition for avoidance-related faces in participants with high ELS scores. Our findings suggest that the investigation of OXT's effects on social recognition requires a broad approach that takes ELS experiences as well as motivational processes into account.

  5. Superior Recognition Performance for Happy Masked and Unmasked Faces in Both Younger and Older Adults

    PubMed Central

    Svärd, Joakim; Wiens, Stefan; Fischer, Håkan

    2012-01-01

In the aging literature it has been shown that even though emotion recognition performance decreases with age, the decrease is less for happiness than for other facial expressions. Studies in younger adults have also revealed that happy faces are more strongly attended to and better recognized than other emotional facial expressions. Thus, there might be a more age-independent happy face advantage in facial expression recognition. By using a backward masking paradigm and varying stimulus onset asynchronies (17–267 ms), the temporal development of a happy face advantage, on a continuum from low to high levels of visibility, was examined in younger and older adults. Results showed that across age groups, recognition performance for happy faces was better than for neutral and fearful faces at durations longer than 50 ms. Importantly, the results showed a happy face advantage already during early processing of emotional faces in both younger and older adults. This advantage is discussed in terms of processing of salient perceptual features and elaborative processing of the happy face. We also investigated the combined effect of age and neuroticism on emotional face processing. The rationale was based on previous findings of age-related differences in physiological arousal to emotional pictures and a relation between arousal and neuroticism. Across all durations, there was an interaction between age and neuroticism, showing that being high in neuroticism might be disadvantageous for younger, but not older, adults' emotion recognition performance during arousal-enhancing tasks. These results indicate that there is a relation between aging, neuroticism, and performance, potentially related to physiological arousal. PMID:23226135

  6. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    NASA Astrophysics Data System (ADS)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon-face-based model was developed and used to evaluate the emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted, and new cartoon faces based on these parameters were generated. The subjects evaluated the emotion of these cartoon faces again, and we confirmed that these parameters were suitable. To confirm how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, and we then compared them with the cartoon face parameters. The cartoon face demonstrates that emotions can be expressed with very small amounts of information, and real and cartoon faces correspond to each other. It is also shown that emotion could be extracted from still and dynamic real face images using these cartoon-based features.
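The analysis pipeline described in this record, principal component analysis of face feature parameters followed by Mahalanobis-distance comparison, can be sketched as follows. The feature values, class means, and two-emotion setup here are illustrative assumptions, not the paper's actual data or parameters:

```python
import numpy as np

# Hypothetical feature matrix: each row is one cartoon-face sample described by
# geometric parameters (e.g. eyebrow slope, mouth curvature, eye openness).
rng = np.random.default_rng(0)
happy = rng.normal([0.8, 0.9, 0.1], 0.1, size=(20, 3))
sad = rng.normal([0.2, 0.1, 0.7], 0.1, size=(20, 3))
X = np.vstack([happy, sad])

# Principal component analysis: center the data and keep the top components.
mean = X.mean(axis=0)
Xc = X - mean
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Vt[:2]                      # top two principal axes
scores = Xc @ pcs.T               # samples projected into PC space

def mahalanobis(x, group):
    """Mahalanobis distance from point x to a cluster of PC-space samples."""
    mu = group.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(group, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Label a new face sample by its nearest emotion cluster in PC space.
new_face = (np.array([0.75, 0.85, 0.15]) - mean) @ pcs.T
d_happy = mahalanobis(new_face, scores[:20])
d_sad = mahalanobis(new_face, scores[20:])
label = "happy" if d_happy < d_sad else "sad"
```

Because the Mahalanobis distance normalizes by each cluster's covariance, it rewards proximity along directions where the cluster varies little, which is why it is a common choice over plain Euclidean distance for feature-based classification of this kind.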

  7. The processing of facial identity and expression is interactive, but dependent on task and experience

    PubMed Central

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

Facial identity and emotional expression are two important sources of information for daily social interaction. However, the link between these two aspects of face processing has been the focus of an unresolved debate for the past three decades. Three views have been advocated: (1) separate and parallel processing of identity and emotional expression signals derived from faces; (2) asymmetric processing, with the computation of emotion in faces depending on facial identity coding but not vice versa; and (3) integrated processing of facial identity and emotion. We present studies with healthy participants that primarily apply methods from mathematical psychology, formally testing the relations between the processing of facial identity and emotion. Specifically, we focused on the "Garner" paradigm, the composite face effect, and divided attention tasks. We further ask whether the architecture of face-related processes is fixed or flexible and whether (and how) it can be shaped by experience. We conclude that formal methods of testing the relations between processes show that the processing of facial identity and expressions interact, and hence are not fully independent. We further demonstrate that the architecture of the relations depends on experience, with experience leading to a higher degree of inter-dependence in the processing of identity and expressions. We propose that this change occurs because integrative processes are more efficient than parallel ones. Finally, we argue that the dynamic aspects of face processing need to be incorporated into theories in this field. PMID:25452722

  8. On the Automaticity of Emotion Processing in Words and Faces: Event-Related Brain Potentials Evidence from a Superficial Task

    ERIC Educational Resources Information Center

    Rellecke, Julian; Palazova, Marina; Sommer, Werner; Schacht, Annekathrin

    2011-01-01

    The degree to which emotional aspects of stimuli are processed automatically is controversial. Here, we assessed the automatic elicitation of emotion-related brain potentials (ERPs) to positive, negative, and neutral words and facial expressions in an easy and superficial face-word discrimination task, for which the emotional valence was…

  9. The effect of acute citalopram on face emotion processing in remitted depression: a pharmacoMRI study.

    PubMed

    Anderson, Ian M; Juhasz, Gabriella; Thomas, Emma; Downey, Darragh; McKie, Shane; Deakin, J F William; Elliott, Rebecca

    2011-01-01

Both reduced serotonergic (5-HT) function and negative emotional biases have been associated with vulnerability to depression. In order to investigate whether these might be related, we examined 5-HT modulation of affective processing in 14 remitted depressed subjects compared with 12 never-depressed controls matched for age and sex. Participants underwent functional magnetic resonance imaging (fMRI) during a covert face emotion task with and without intravenous citalopram (7.5 mg) pretreatment. Compared with viewing neutral faces, and irrespective of group, citalopram enhanced left anterior cingulate blood oxygen level dependent (BOLD) response to happy faces, right posterior insula and right lateral orbitofrontal responses to sad faces, and reduced amygdala responses bilaterally to fearful faces. In controls, relative to remitted depressed subjects, citalopram increased bilateral hippocampal responses to happy faces and increased right anterior insula response to sad faces. These findings were not accounted for by changes in BOLD responses to viewing neutral faces. These results are consistent with previous findings showing 5-HT modulation of affective processing; differences found in previously depressed participants compared with controls may contribute to emotional processing biases underlying vulnerability to depressive relapse. Copyright © 2010 Elsevier B.V. and ECNP. All rights reserved.

  10. Perception of Emotional Facial Expressions in Amyotrophic Lateral Sclerosis (ALS) at Behavioural and Brain Metabolic Level

    PubMed Central

    Aho-Özhan, Helena E. A.; Keller, Jürgen; Heimrath, Johanna; Uttner, Ingo; Kassubek, Jan; Birbaumer, Niels; Ludolph, Albert C.; Lulé, Dorothée

    2016-01-01

Introduction Amyotrophic lateral sclerosis (ALS) primarily impairs motor abilities but also affects cognition and emotional processing. We hypothesise that subjective ratings of emotional stimuli depicting social interactions and facial expressions are changed in ALS. It has been found that recognition of negative emotions and the ability to mentalize others' intentions are reduced. Methods Processing of emotions in faces was investigated. A behavioural test of Ekman faces expressing six basic emotions was presented to 30 ALS patients and 29 age-, gender-, and education-matched healthy controls. Additionally, a subgroup of 15 ALS patients who were able to lie supine in the scanner and 14 matched healthy controls viewed the Ekman faces during functional magnetic resonance imaging (fMRI). Affective state and the number of daily social contacts were measured. Results ALS patients recognized disgust and fear less accurately than healthy controls. In fMRI, reduced brain activity was seen in areas involved in processing of negative emotions, replicating our previous results. During processing of sad faces, increased brain activity was seen in areas associated with social emotions in the right inferior frontal gyrus, and reduced activity in the hippocampus bilaterally. No differences in brain activity were seen for any of the other emotional expressions. Inferior frontal gyrus activity for sad faces was associated with an increased number of social contacts of ALS patients. Conclusion ALS patients showed decreased brain and behavioural responses in processing of disgust and fear and an altered brain response pattern for sadness. The negative consequences of neurodegenerative processes in the course of ALS might be counteracted by positive emotional activity and positive social interactions. PMID:27741285

  11. Face-Memory and Emotion: Associations with Major Depression in Children and Adolescents

    ERIC Educational Resources Information Center

    Pine, Daniel S.; Lissek, Shmuel; Klein, Rachel G.; Mannuzza, Salvatore; Moulton, John L., III; Guardino, Mary; Woldehawariat, Girma

    2004-01-01

    Background: Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and…

  12. The Development of Emotional Face Processing during Childhood

    ERIC Educational Resources Information Center

    Batty, Magali; Taylor, Margot J.

    2006-01-01

    Our facial expressions give others the opportunity to access our feelings, and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal…

  13. Emotional face processing in pediatric bipolar disorder: evidence for functional impairments in the fusiform gyrus.

    PubMed

    Perlman, Susan B; Fournier, Jay C; Bebko, Genna; Bertocci, Michele A; Hinze, Amanda K; Bonar, Lisa; Almeida, Jorge R C; Versace, Amelia; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Demeter, Christine; Diwadkar, Vaibhav A; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Horwitz, Sarah M; Arnold, L Eugene; Fristad, Mary A; Youngstrom, Eric A; Findling, Robert L; Phillips, Mary L

    2013-12-01

Pediatric bipolar disorder involves poor social functioning, but the neural mechanisms underlying these deficits are not well understood. Previous neuroimaging studies have found deficits in emotional face processing localized to emotional brain regions. However, few studies have examined dysfunction in other regions of the face processing circuit. This study assessed hypoactivation in key face processing regions of the brain in pediatric bipolar disorder. Youth with a bipolar spectrum diagnosis (n = 20) were matched to a nonbipolar clinical group (n = 20), with similar demographics and comorbid diagnoses, and a healthy control group (n = 20). Youth underwent functional magnetic resonance imaging (fMRI) scanning, which employed a task-irrelevant emotion processing design in which processing of facial emotions was not germane to task performance. Hypoactivation, isolated to the fusiform gyrus, was found when viewing animated, emerging facial expressions of happiness, sadness, fearfulness, and especially anger in pediatric bipolar participants relative to matched clinical and healthy control groups. The results of the study imply that differences exist in visual regions of the brain's face processing system and are not solely isolated to emotional brain regions such as the amygdala. Findings are discussed in relation to facial emotion recognition and fusiform gyrus deficits previously reported in the autism literature. Behavioral interventions targeting attention to facial stimuli might be explored as possible treatments for bipolar disorder in youth. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  14. Interdependent Mechanisms for Processing Gender and Emotion: The Special Status of Angry Male Faces

    PubMed Central

    Harris, Daniel A.; Ciaramitaro, Vivian M.

    2016-01-01

While some models of how various attributes of a face are processed have posited that face features (invariant physical cues such as gender or ethnicity, as well as variant social cues such as emotion) may be processed independently (e.g., Bruce and Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby et al., 2000). Here, we use a contingent adaptation paradigm to investigate whether mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy–angry emotional continuum, regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1, we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased toward angry while female faces were biased toward happy. Interestingly, in the complementary Experiment 2, we did not find evidence for contingent adaptation, with both male and female faces biased toward angry. Our results highlight that evidence for contingent adaptation, and the underlying interdependent face processing mechanisms that would allow for it, may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues, given how maladaptive it may be to stop responding to threatening information, with male angry faces considered to be the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated. PMID:27471482

  15. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    PubMed

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.
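The hybrid-face construction described in this record, one emotional face half combined with a neutral other half, amounts to a simple array composite. A minimal sketch, in which random arrays stand in for real grayscale face photographs (purely illustrative, not the study's stimuli):

```python
import numpy as np

# Hypothetical 8-bit grayscale face images of identical size:
# one showing an emotional expression, one showing a neutral expression.
rng = np.random.default_rng(1)
emotional = rng.integers(0, 256, size=(100, 80), dtype=np.uint8)
neutral = rng.integers(0, 256, size=(100, 80), dtype=np.uint8)

def make_hybrid(emotional_img, neutral_img, emotional_half="upper"):
    """Combine one emotional face half with a neutral other half."""
    h = emotional_img.shape[0] // 2
    hybrid = neutral_img.copy()
    if emotional_half == "upper":
        hybrid[:h] = emotional_img[:h]   # emotional eyes region, neutral mouth
    else:
        hybrid[h:] = emotional_img[h:]   # emotional mouth region, neutral eyes
    return hybrid

# Upper-half hybrid: emotional eyes region over a neutral lower face.
top_hybrid = make_hybrid(emotional, neutral, "upper")
```

Comparing labelling accuracy for upper-half versus lower-half hybrids of the same emotion is what lets such a design separate reliance on the eyes region from reliance on the mouth region.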

  16. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential

    PubMed Central

    Bublatzky, Florian; Gerdes, Antje B. M.; White, Andrew J.; Riemer, Martin; Alpers, Georg W.

    2014-01-01

Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting the faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task, all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental task. Whereas task effects were observed for the P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and the relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing, here at a late stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories. PMID:25076881

  17. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    PubMed

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 event-related potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main finding of the study was that musicians' behavioral responses and N170 were more affected by the emotional value of the music administered in the emotional go/no-go task, and this bias was also apparent in responses to the non-target emotional faces. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the emotional salience of the stimuli and the listener's appraisal.

  18. Effects of speaker emotional facial expression and listener age on incremental sentence processing.

    PubMed

    Carminati, Maria Nella; Knoeferle, Pia

    2013-01-01

    We report two visual-world eye-tracking experiments that investigated how and with which time course emotional information from a speaker's face affects younger (N = 32, Mean age  = 23) and older (N = 32, Mean age  = 64) listeners' visual attention and language comprehension as they processed emotional sentences in a visual context. The age manipulation tested predictions by socio-emotional selectivity theory of a positivity effect in older adults. After viewing the emotional face of a speaker (happy or sad) on a computer display, participants were presented simultaneously with two pictures depicting opposite-valence events (positive and negative; IAPS database) while they listened to a sentence referring to one of the events. Participants' eye fixations on the pictures while processing the sentence were increased when the speaker's face was (vs. wasn't) emotionally congruent with the sentence. The enhancement occurred from the early stages of referential disambiguation and was modulated by age. For the older adults it was more pronounced with positive faces, and for the younger ones with negative faces. These findings demonstrate for the first time that emotional facial expressions, similarly to previously-studied speaker cues such as eye gaze and gestures, are rapidly integrated into sentence processing. They also provide new evidence for positivity effects in older adults during situated sentence processing.

  19. Seeing Life through Positive-Tinted Glasses: Color–Meaning Associations

    PubMed Central

    Gil, Sandrine; Le Bigot, Ludovic

    2014-01-01

    There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning. PMID:25098167

  1. Face-memory and emotion: associations with major depression in children and adolescents.

    PubMed

    Pine, Daniel S; Lissek, Shmuel; Klein, Rachel G; Mannuzza, Salvatore; Moulton, John L; Guardino, Mary; Woldehawariat, Girma

    2004-10-01

    Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and memory performance in offspring. Subjects were 152 offspring (ages 9-19) of adults with either MDD, anxiety disorders, both MDD and anxiety, or no disorder. Parents and offspring were assessed for mental disorders. Collection of face-memory data was blind to offspring and parent diagnosis. A computerized task was developed that required rating of facial photographs depicting 'happy,' 'fearful,' or 'angry' emotions followed by a memory recall test. Recall accuracy was examined as a function of face-emotion type. Age and gender independently predicted memory, with better recall in older and female subjects. Controlling for age and gender, offspring with a history of MDD (n = 19) demonstrated significant deficits in memory selectively for fearful faces, but not happy or angry faces. Parental MDD was not associated with face-memory accuracy. This study found an association between MDD in childhood or adolescence and perturbed encoding of fearful faces. MDD in young individuals may predispose to subtle anomalies in a neural circuit encompassing the amygdala, a brain region implicated in the processing of fearful facial expressions. These findings suggest that brain imaging studies using similar face-emotion paradigms should test whether deficits in processing of fearful faces relate to amygdala dysfunction in children and adolescents with MDD.

  2. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    PubMed

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

  3. Lateralization of Visuospatial Attention across Face Regions Varies with Emotional Prosody

    ERIC Educational Resources Information Center

    Thompson, Laura A.; Malloy, Daniel M.; LeBlanc, Katya L.

    2009-01-01

    It is well-established that linguistic processing is primarily a left-hemisphere activity, while emotional prosody processing is lateralized to the right hemisphere. Does attention, directed at different regions of the talker's face, reflect this pattern of lateralization? We investigated visuospatial attention across a talker's face with a…

  4. Parametric modulation of neural activity by emotion in youth with bipolar disorder, severe mood dysregulation, and healthy subjects

    PubMed Central

    Thomas, Laura A.; Brotman, Melissa A.; Muhrer, Eli M.; Rosen, Brooke H.; Bones, Brian L.; Reynolds, Richard C.; Deveney, Christen; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Context Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show amygdala dysfunction during face emotion processing. However, studies have not compared such patients to each other and to comparison subjects in neural responsiveness to subtle changes in face emotion; the ability to process such changes is important for social cognition. We employed a novel, parametrically designed faces paradigm. Objective Using a parametrically morphed emotional faces task, we compared activation in the amygdala and across the brain in BD, SMD, and healthy volunteers (HV). Design Case-control study. Setting Government research institute. Participants 57 youths (19 BD, 15 SMD, 23 HV). Main Outcome Measure Blood oxygen level-dependent (BOLD) data. Neutral faces were morphed with angry and happy faces in 25% intervals; static face stimuli appeared for 3000 ms. Subjects rated either hostility or a non-emotional facial feature (i.e., nose width). The slope of BOLD activity was calculated across neutral-to-angry (N→A) and neutral-to-happy (N→H) face stimuli. Results In HV, but not BD or SMD, there was a positive association between left amygdala activity and the degree of anger on the face. In the N→H whole-brain analysis, BD and SMD modulated parietal, temporal, and medial-frontal areas differently from each other and from HV; with increasing facial happiness, SMD showed increasing, and BD decreasing, activity in parietal, temporal, and frontal regions. Conclusions Youth with BD or SMD differ from HV in modulation of amygdala activity in response to small changes in facial anger displays. In contrast, BD and SMD show distinct perturbations in regions mediating attention and face processing in association with changes in the emotional intensity of facial happiness displays. These findings demonstrate similarities and differences in the neural correlates of face emotion processing in BD and SMD, suggesting that these distinct clinical presentations may reflect differing pathologies along a mood disorders spectrum. PMID:23026912
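
    The slope measure described in this record can be illustrated with a short, self-contained sketch. This is not the study's analysis pipeline; it simply shows a least-squares slope fit over hypothetical per-voxel BOLD estimates at the five neutral-to-angry morph levels (0%, 25%, 50%, 75%, 100% anger). All names and numbers below are invented.

    ```python
    import numpy as np

    # Neutral-to-angry morph levels: 0%, 25%, 50%, 75%, 100% anger.
    morph_levels = np.array([0.0, 0.25, 0.50, 0.75, 1.0])

    def emotion_slope(betas: np.ndarray) -> float:
        """Least-squares slope of the BOLD response across morph intensity."""
        slope, _intercept = np.polyfit(morph_levels, betas, deg=1)
        return float(slope)

    # Hypothetical per-voxel BOLD estimates at each morph level.
    betas = np.array([0.1, 0.3, 0.5, 0.6, 0.9])
    print(emotion_slope(betas))  # positive: activity scales with facial anger
    ```

    A positive slope indicates activity that increases with the amount of anger in the face, the pattern the abstract reports for left amygdala activity in healthy volunteers.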

  5. Early visual ERPs are influenced by individual emotional skills

    PubMed Central

    Roux, Sylvie; Batty, Magali

    2014-01-01

    Processing information from faces is crucial to understanding others and to adapting to social life. Many studies have investigated responses to facial emotions to provide a better understanding of the processes and neural networks involved. Moreover, several studies have revealed abnormalities of emotional face processing, and of its neural correlates, in affective disorders. The aim of this study was to investigate whether early visual event-related potentials (ERPs) are affected by the emotional skills of healthy adults. Unfamiliar faces expressing the six basic emotions were presented to 28 young adults while visual ERPs were recorded. No specific task was required during the recording. Participants also completed the Social Skills Inventory (SSI), which measures social and emotional skills. The results confirmed that early visual ERPs (P1, N170) are affected by the emotion expressed by a face and also demonstrated that the N170 and P2 correlate with the emotional skills of healthy subjects. Whereas the N170 is sensitive to the subject’s emotional sensitivity and expressivity, the P2 is modulated by the subject’s ability to control their emotions. We therefore suggest that the N170 and P2 could be used as individual markers to assess strengths and weaknesses in emotional areas and could inform further investigations of affective disorders. PMID:23720573

  6. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    PubMed

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster-acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorder. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic scopolamine. Healthy participants (n=15) and unmedicated patients with major depressive disorder (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, the blood oxygen level-dependent (BOLD) signal was measured using functional MRI during a selective attention task. Two stimuli, each composed of superimposed pictures of faces and houses, were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad), creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in BOLD response to happy versus sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate (P<.02) and middle occipital cortices (P<.02). The magnitude and direction of the differential BOLD response to implicitly processed emotional faces prior to treatment reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  7. Individual differences in emotion lateralisation and the processing of emotional information arising from social interactions.

    PubMed

    Bourne, Victoria J; Watling, Dawn

    2015-01-01

    Previous research examining the possible association between emotion lateralisation and social anxiety has found conflicting results. In this paper, two studies are presented to assess two features of social anxiety: fear of negative evaluation (FNE) and emotion regulation. Lateralisation for the processing of facial emotion was measured using the chimeric faces test. Individuals with greater FNE were more strongly lateralised to the right hemisphere for the processing of anger, happiness, and sadness; for the processing of fearful faces, this relationship was found in females only. Emotion regulation strategies were reduced to two factors: positive strategies and negative strategies. For males, but not females, greater reported use of negative emotion strategies was associated with stronger right-hemisphere lateralisation for processing negative emotions. The implications for further understanding the neuropsychological processing of emotion in individuals with social anxiety are discussed.

  8. Selective attention modulates early human evoked potentials during emotional face-voice processing.

    PubMed

    Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A

    2015-04-01

    Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitudes by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 time window showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective: one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.

  9. Neural correlates of emotional face processing in bipolar disorder: an event-related potential study.

    PubMed

    Degabriele, Racheal; Lagopoulos, Jim; Malhi, Gin

    2011-09-01

    Behavioural and imaging studies report that individuals with bipolar disorder (BD) exhibit impairments in emotional face processing. However, few studies have examined the temporal characteristics of these impairments, and event-related potential (ERP) studies that investigate emotion perception in BD are rare. The aim of our study was to explore these processes as indexed by the face-specific P100 and N170 ERP components in a BD cohort. Eighteen subjects diagnosed with BD and 18 age- and sex-matched healthy volunteers completed an emotional go/no-go inhibition task during electroencephalogram (EEG) and ERP acquisition. Patients demonstrated faster responses to happy than to sad faces, whereas control data revealed no emotional discrimination. Errors of omission were more frequent in the BD group in both emotion conditions, but there were no between-group differences in commission errors. Significant differences were found between groups in P100 amplitude variation across levels of affect, with the BD group exhibiting greater responses to happy than to sad faces. Conversely, the control cohort showed no differentiation between emotions. A statistically significant between-group effect was also found for N170 amplitudes, indicating reduced responses in the BD group. Future studies should ideally recruit BD patients across all three mood states (manic, depressive, and euthymic), with greater scrutiny of the effects of psychotropic medication. These ERP results primarily suggest an emotion-sensitive face processing impairment in BD whereby patients are initially more attuned to positive emotions, as indicated by the P100 ERP component, and this may contribute to the emergence of bipolar-like symptoms. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Amygdala excitability to subliminally presented emotional faces distinguishes unipolar and bipolar depression: an fMRI and pattern classification study.

    PubMed

    Grotegerd, Dominik; Stuhrmann, Anja; Kugel, Harald; Schmidt, Simone; Redlich, Ronny; Zwanzger, Peter; Rauch, Astrid Veronika; Heindel, Walter; Zwitserlood, Pienie; Arolt, Volker; Suslow, Thomas; Dannlowski, Udo

    2014-07-01

    Bipolar disorder and major depressive disorder are difficult to differentiate during depressive episodes, motivating the search for differentiating neurobiological markers. Dysfunctional amygdala responsiveness during emotion processing has been implicated in both disorders, but the important rapid and automatic stages of emotion processing in the amygdala had not previously been investigated in bipolar patients. fMRI data from 22 bipolar depressed patients (BD), 22 matched unipolar depressed patients (MDD), and 22 healthy controls (HC) were obtained during processing of subliminal sad, happy, and neutral faces. Amygdala responsiveness was investigated using standard univariate analyses as well as pattern-recognition techniques to differentiate the two clinical groups. Furthermore, medication effects on amygdala responsiveness were explored. All subjects were unaware of the emotional faces. Univariate analysis revealed a significant group × emotion interaction within the left amygdala. Amygdala responsiveness to sad>neutral faces was increased in MDD relative to BD. In contrast, responsiveness to happy>neutral faces showed the opposite pattern, with higher amygdala activity in BD than in MDD. Most of the activation patterns in both clinical groups differed significantly from those of HC, and therefore represent abnormalities. Furthermore, pattern classification on amygdala activation to sad>happy faces yielded almost 80% accuracy in differentiating MDD and BD patients. Medication had no significant effect on these findings. Distinct amygdala excitability during automatic stages of the processing of emotional faces may reflect differential pathophysiological processes in BD versus MDD depression, potentially representing diagnosis-specific neural markers mostly unaffected by current psychotropic medication. Copyright © 2013 Wiley Periodicals, Inc.
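
    The pattern-classification result reported above (almost 80% accuracy) can be illustrated schematically. The abstract does not name the classifier, so the sketch below substitutes a simple nearest-centroid rule with leave-one-out cross-validation on simulated amygdala features; the group sizes match the study, but the data and their separation are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical amygdala responses (sad > happy contrast) for two simulated
    # groups; in a real analysis these would be voxel-wise activation estimates.
    mdd = rng.normal(0.5, 0.5, size=(22, 10))   # unipolar depressed
    bd = rng.normal(-0.5, 0.5, size=(22, 10))   # bipolar depressed
    X = np.vstack([mdd, bd])
    y = np.array([0] * 22 + [1] * 22)           # 0 = MDD, 1 = BD

    def loo_nearest_centroid(X, y):
        """Leave-one-out accuracy of a nearest-class-centroid classifier."""
        hits = 0
        for i in range(len(y)):
            mask = np.arange(len(y)) != i       # hold out sample i
            c0 = X[mask & (y == 0)].mean(axis=0)
            c1 = X[mask & (y == 1)].mean(axis=0)
            pred = 0 if np.linalg.norm(X[i] - c0) < np.linalg.norm(X[i] - c1) else 1
            hits += pred == y[i]
        return hits / len(y)

    print(f"leave-one-out accuracy: {loo_nearest_centroid(X, y):.2f}")
    ```

    Leave-one-out cross-validation is a common choice in small clinical samples like this one, since it makes maximal use of the available subjects.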

  11. Ambulatory orthopaedic surgery patients' emotions when using different patient education methods.

    PubMed

    Heikkinen, Katja; Salanterä, Sanna; Leppänen, Tiina; Vahlberg, Tero; Leino-Kilpi, Helena

    2012-07-01

    A randomised controlled trial was used to evaluate elective ambulatory orthopaedic surgery patients' emotions during internet-based patient education or face-to-face education with a nurse. The internet-based patient education was designed for this study, and patients used the websites individually, based on their needs. Patients in the control group participated individually in face-to-face patient education with a nurse in the ambulatory surgery unit. The theoretical basis for both types of education was the same. Ambulatory orthopaedic surgery patients scored their emotions rather low at intervals throughout the surgical process, though their scores changed over its course. Emotion scores did not decrease after patient education, and no differences in patients' emotions were found between the two education methods.

  12. Emotional responses associated with self-face processing in individuals with autism spectrum disorders: an fMRI study.

    PubMed

    Morita, Tomoyo; Kosaka, Hirotaka; Saito, Daisuke N; Ishitobi, Makoto; Munesue, Toshio; Itakura, Shoji; Omori, Masao; Okazawa, Hidehiko; Wada, Yuji; Sadato, Norihiro

    2012-01-01

    Individuals with autism spectrum disorders (ASD) show impaired emotional responses to self-face processing, but the underlying neural bases are unclear. Using functional magnetic resonance imaging, we investigated brain activity when 15 individuals with high-functioning ASD and 15 controls rated the photogenicity of self-face images and photographs of others' faces. Controls showed a strong correlation between photogenicity ratings and extent of embarrassment evoked by self-face images; this correlation was weaker among ASD individuals, indicating a decoupling between the cognitive evaluation of self-face images and emotional responses. Individuals with ASD demonstrated relatively low self-related activity in the posterior cingulate cortex (PCC), which was related to specific autistic traits. There were significant group differences in the modulation of activity by embarrassment ratings in the right insular (IC) and lateral orbitofrontal cortices. Task-related activity in the right IC was lower in the ASD group. The reduced activity in the right IC for self-face images was associated with weak coupling between cognitive evaluation and emotional responses to self-face images. The PCC is responsible for self-referential processing, and the IC plays a role in emotional experience. Dysfunction in these areas could contribute to the lack of self-conscious behaviors in response to self-reflection in ASD individuals.

  13. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    ERIC Educational Resources Information Center

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  14. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    PubMed

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement (better long-term memory for emotional than for neutral stimuli) and the emotion-induced recognition bias (a more liberal response criterion for emotional than for neutral stimuli). Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role in the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also affects (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus-features account. The double dissociation in the results favors the latter account, that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
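
    The HSF/LSF stimulus manipulation used in this kind of study can be sketched as a Fourier-domain split. This is only an illustrative decomposition, not the authors' stimulus-preparation procedure; the cutoff frequency and the synthetic image stand-in are arbitrary.

    ```python
    import numpy as np

    def split_spatial_frequencies(img: np.ndarray, cutoff: float):
        """Split an image into low- and high-spatial-frequency components
        using a hard circular cutoff in the Fourier domain (cycles/image)."""
        f = np.fft.fftshift(np.fft.fft2(img))
        h, w = img.shape
        yy, xx = np.ogrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
        radius = np.sqrt(xx**2 + yy**2)
        lsf = np.fft.ifft2(np.fft.ifftshift(f * (radius <= cutoff))).real
        hsf = img - lsf  # residual carries the high spatial frequencies
        return lsf, hsf

    # Synthetic stand-in for a face image: smooth gradient plus fine noise.
    rng = np.random.default_rng(1)
    img = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
    img = img + 0.1 * rng.standard_normal((64, 64))

    lsf, hsf = split_spatial_frequencies(img, cutoff=8)
    # The LSF and HSF components sum back to the original image exactly.
    print(np.allclose(lsf + hsf, img))
    ```

    A hard cutoff is the simplest choice for illustration; published face-perception studies often use smoother (e.g. Gaussian or Butterworth) filters to avoid ringing artifacts.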

  15. Startling similarity: Effects of facial self-resemblance and familiarity on the processing of emotional faces

    PubMed Central

    Larra, Mauro F.; Merz, Martina U.; Schächinger, Hartmut

    2017-01-01

    Facial self-resemblance has been associated with positive emotional evaluations, but this effect may be biased by self-face familiarity. Here we report two experiments utilizing startle modulation to investigate how the processing of facial expressions of emotion is affected by subtle resemblance to the self as well as to familiar faces. Participants in the first experiment (N = 39) were presented with morphed faces showing happy, neutral, and fearful expressions, which were manipulated to resemble either their own or unknown faces. At SOAs of either 300 ms or 3500–4500 ms after picture onset, startle responses were elicited by binaural bursts of white noise (50 ms, 105 dB) and recorded from the orbicularis oculi via EMG. Manual reaction time was measured in a simple emotion discrimination paradigm. Pictures preceding noise bursts at the short SOA inhibited startle (prepulse inhibition, PPI). Both affective modulation and PPI of startle in response to emotional faces were altered by physical similarity to the self. As indexed both by relative facilitation of startle and by faster manual responses, self-resemblance apparently induced deeper processing of facial affect, particularly for happy faces. Experiment II (N = 54) produced similar findings using morphs of famous faces, yet showed no impact of mere familiarity on PPI effects (or on response times). The results are discussed with respect to differential (presumably pre-attentive) effects of self-specific versus familiar information in face processing. PMID:29216226

  16. Automatic emotion processing as a function of trait emotional awareness: an fMRI study

    PubMed Central

    Lichev, Vladimir; Sacher, Julia; Ihme, Klas; Rosenberg, Nicole; Quirin, Markus; Lepsien, Jöran; Pampel, André; Rufer, Michael; Grabe, Hans-Jörgen; Kugel, Harald; Kersting, Anette; Villringer, Arno; Lane, Richard D.

    2015-01-01

    It is unclear whether reflective awareness of emotions is related to extent and intensity of implicit affective reactions. This study is the first to investigate automatic brain reactivity to emotional stimuli as a function of trait emotional awareness. To assess emotional awareness the Levels of Emotional Awareness Scale (LEAS) was administered. During scanning, masked happy, angry, fearful and neutral facial expressions were presented to 46 healthy subjects, who had to rate the fit between artificial and emotional words. The rating procedure allowed assessment of shifts in implicit affectivity due to emotion faces. Trait emotional awareness was associated with increased activation in the primary somatosensory cortex, inferior parietal lobule, anterior cingulate gyrus, middle frontal and cerebellar areas, thalamus, putamen and amygdala in response to masked happy faces. LEAS correlated positively with shifts in implicit affect caused by masked happy faces. According to our findings, people with high emotional awareness show stronger affective reactivity and more activation in brain areas involved in emotion processing and simulation during the perception of masked happy facial expression than people with low emotional awareness. High emotional awareness appears to be characterized by an enhanced positive affective resonance to others at an automatic processing level. PMID:25140051

  17. Attention to emotion modulates fMRI activity in human right superior temporal sulcus.

    PubMed

    Narumoto, J; Okada, T; Sadato, N; Fukui, K; Yonekura, Y

    2001-10-01

    A parallel neural network has been proposed for processing the various types of information conveyed by faces, including emotion. Using functional magnetic resonance imaging (fMRI), we tested the effect of explicit attention to the emotional expression of faces on the neuronal activity of face-responsive regions. A delayed match-to-sample procedure was adopted. Subjects were required to match visually presented pictures with respect to the contour of the face pictures, facial identity, or emotional expression by valence (happy and fearful expressions) or arousal (fearful and sad expressions). Contour matching of non-face scrambled pictures was used as a control condition. The face-responsive regions that responded more to faces than to non-face stimuli were the bilateral lateral fusiform gyrus (LFG), the right superior temporal sulcus (STS), and the bilateral intraparietal sulcus (IPS). In these regions, general attention to the face enhanced the activity of the bilateral LFG, the right STS, and the left IPS compared with attention to the contour of the facial image. Selective attention to facial emotion specifically enhanced the activity of the right STS compared with attention to the face per se. The results suggest that the right STS region plays a special role in facial emotion recognition within distributed face-processing systems. This finding may support the notion that the STS is involved in social perception.

  18. The Relationship between Early Neural Responses to Emotional Faces at Age 3 and Later Autism and Anxiety Symptoms in Adolescents with Autism

    ERIC Educational Resources Information Center

    Neuhaus, Emily; Jones, Emily J. H.; Barnes, Karen; Sterling, Lindsey; Estes, Annette; Munson, Jeff; Dawson, Geraldine; Webb, Sara J.

    2016-01-01

    Both autism spectrum (ASD) and anxiety disorders are associated with atypical neural and attentional responses to emotional faces, differing in affective face processing from typically developing peers. Within a longitudinal study of children with ASD (23 male, 3 female), we hypothesized that early ERPs to emotional faces would predict concurrent…

  19. Do you see what I see? Sex differences in the discrimination of facial emotions during adolescence.

    PubMed

    Lee, Nikki C; Krabbendam, Lydia; White, Thomas P; Meeter, Martijn; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Büchel, Christian; Conrod, Patricia; Flor, Herta; Frouin, Vincent; Heinz, Andreas; Garavan, Hugh; Gowland, Penny; Ittermann, Bernd; Mann, Karl; Paillère Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Rietschel, Marcella; Robbins, Trevor; Fauth-Bühler, Mira; Smolka, Michael N; Gallinat, Juergen; Schumann, Gunther; Shergill, Sukhi S

    2013-12-01

    During adolescence social relationships become increasingly important. Establishing and maintaining these relationships requires understanding of emotional stimuli, such as facial emotions. A failure to adequately interpret emotional facial expressions has previously been associated with various mental disorders that emerge during adolescence. The current study examined sex differences in emotional face processing during adolescence. Participants were adolescents (n = 1951) with a target age of 14, who completed a forced-choice emotion discrimination task. The stimuli used comprised morphed faces that contained a blend of two emotions in varying intensities (11 stimuli per set of emotions). Adolescent girls showed faster and more sensitive perception of facial emotions than boys. However, both adolescent boys and girls were most sensitive to variations in emotion intensity in faces combining happiness and sadness, and least sensitive to changes in faces comprising fear and anger. Furthermore, both sexes overidentified happiness and anger. However, the overidentification of happiness was stronger in boys. These findings were not influenced by individual differences in the level of pubertal maturation. These results indicate that male and female adolescents differ in their ability to identify emotions in morphed faces containing emotional blends. The findings provide information for clinical studies examining whether sex differences in emotional processing are related to sex differences in the prevalence of psychiatric disorders within this age group.

  20. Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.

    PubMed

    Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha

    2015-04-01

    According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  1. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    PubMed

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

    Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contributions to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious and conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind, within-subject, placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded via the face-specific P100 and N170 ERP components. Both S-ketamine and psilocybin impaired the encoding of fearful faces, as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow the detection of pharmacologically induced changes in emotional processing biases and thus provides a framework for studying the pathophysiology of dysfunctional emotional biases.
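
    The signal-detection measures this abstract relies on can be computed from hit and false-alarm rates in a few lines. Below is a minimal sketch assuming the standard equal-variance Gaussian model of signal detection theory; the example rates are invented and have no relation to the study's data.

    ```python
    from statistics import NormalDist

    def dprime_criterion(hit_rate: float, fa_rate: float):
        """Sensitivity (d') and response criterion (c) from hit and
        false-alarm rates under the equal-variance Gaussian SDT model."""
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF
        d = z(hit_rate) - z(fa_rate)
        c = -0.5 * (z(hit_rate) + z(fa_rate))
        return d, c

    # Example: near-threshold discrimination of fearful vs. neutral faces.
    d, c = dprime_criterion(hit_rate=0.69, fa_rate=0.31)
    print(f"d' = {d:.2f}, c = {c:.2f}")
    ```

    In practice, hit or false-alarm rates of exactly 0 or 1 must be corrected (e.g. with a log-linear adjustment) before applying the inverse CDF, since z(0) and z(1) are undefined.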

  2. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion.

    PubMed

    Xiu, Daiming; Geiger, Maximilian J; Klaver, Peter

    2015-01-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive ("happy"), neutral and negative ("angry" or "fearful") faces. Dynamic Causal Modeling (DCM) was applied on the functional magnetic resonance imaging (fMRI) data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  3. Topographic brain mapping of emotion-related hemisphere asymmetries.

    PubMed

    Roschmann, R; Wittling, W

    1992-03-01

    The study used topographic brain mapping of visual evoked potentials to investigate emotion-related hemisphere asymmetries. The stimulus material consisted of color photographs of human faces, grouped into two emotion-related categories: normal faces (neutral stimuli) and faces deformed by dermatological diseases (emotional stimuli). The pictures were presented tachistoscopically to 20 adult right-handed subjects. Brain activity was recorded by 30 EEG electrodes with linked ears as reference. The waveforms were averaged separately with respect to each of the two stimulus conditions. Statistical analysis by means of significance probability mapping revealed significant differences between stimulus conditions for two periods of time, indicating right hemisphere superiority in emotion-related processing. The results are discussed in terms of a 2-stage-model of emotional processing in the cerebral hemispheres.

  4. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    PubMed

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

    The powerful emotion-inducing properties of music are well known, yet music may evoke differing emotional responses depending on environmental factors. We hypothesized that the neural mechanisms involved in listening to music may differ when the music is presented together with visual stimuli that convey the same emotion as the music, compared with visual stimuli of incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces), congruent or incongruent in emotional content, on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment used a conventional block design. A block consisted of three emotional ON periods: music alone (happy or sad music), faces alone (happy or sad faces), and music combined with faces, in which the music excerpt was played while either congruent or incongruent emotional faces were presented. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in the STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy and sad congruence: the activation of the STG when happy music was presented with happy faces was greater than when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in the STG and elicited greater signal change in the bilateral FG. Behavioral testing supplemented these findings by showing that subjects' ratings of emotion in faces were influenced by the emotion in the music. When presented with happy music, happy faces were rated as happier (p=0.051) and sad faces as less sad (p=0.030). When presented with sad music, happy faces were rated as less happy (p=0.008) and sad faces as sadder (p=0.002). Happy-sad congruence across modalities may enhance activity in auditory regions, while incongruence appears to affect the perception of visual affect, leading to increased activation in face-processing regions such as the FG. We suggest that a greater understanding of the neural bases of happy-sad congruence across modalities can shed light on basic mechanisms of affective perception and experience and may lead to novel insights into the study of emotion regulation and the therapeutic use of music. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2013-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of…

  6. Orienting and maintenance of attention to threatening facial expressions in anxiety--an eye movement study.

    PubMed

    Holas, Pawel; Krejtz, Izabela; Cypryanska, Marzena; Nezlek, John B

    2014-12-15

    Cognitive models posit that anxiety disorders stem in part from underlying attentional biases to threat. Consistent with this, studies have found that the attentional bias to threat-related stimuli is greater in high- than in low-anxious individuals. Nevertheless, it is not clear whether similar biases exist for different threatening emotions or for any emotional facial stimulus. In the present study, we used eye-tracking to measure orienting and maintenance of attention to faces displaying anger, fear, and disgust as threats, and to faces displaying happiness and sadness. Using a free-viewing task, we examined differences between low and high trait anxious (HTA) individuals in the attention they paid to each of these emotional faces (each paired with a neutral face). We found that initial orienting was faster for angry and happy faces, and that HTA participants were more vigilant to fearful and disgusted faces. Results for attentional maintenance were not consistent. Together, these findings suggest that attentional processes may be more emotion-specific than previously believed: attention to different threatening emotions may not operate in the same way, while attention to some negative and some positive emotions may be similar. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
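    A free-viewing orienting index of the kind described above is commonly scored as the proportion of trials on which the first fixation lands on the emotional face of the pair. The sketch below is a generic illustration of that index, not necessarily the authors' exact computation, and the trial data are invented.

```python
def orienting_bias(first_fixations):
    """Proportion of trials on which the first fixation lands on the emotional
    member of an emotional-neutral face pair. Values above 0.5 indicate
    vigilance toward the emotional face. Generic free-viewing bias index;
    the study's exact scoring procedure is not specified here."""
    if not first_fixations:
        raise ValueError("no trials")
    hits = sum(1 for target in first_fixations if target == "emotional")
    return hits / len(first_fixations)

# Hypothetical trial record: which face drew the first fixation on each trial
trials = ["emotional", "neutral", "emotional", "emotional"]
```

    A per-emotion bias (e.g., for fearful-neutral pairs only) follows by filtering the trial list by emotion before scoring, which is how group differences such as the HTA vigilance effect above would be compared.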

  7. Acute tryptophan depletion attenuates conscious appraisal of social emotional signals in healthy female volunteers.

    PubMed

    Beacher, Felix D C C; Gray, Marcus A; Minati, Ludovico; Whale, Richard; Harrison, Neil A; Critchley, Hugo D

    2011-02-01

    Acute tryptophan depletion (ATD) decreases levels of central serotonin. ATD thus enables the cognitive effects of serotonin to be studied, with implications for the understanding of psychiatric conditions, including depression. Our aim was to determine the role of serotonin in conscious (explicit) and unconscious/incidental processing of emotional information. A randomized, double-blind, cross-over design was used with 15 healthy female participants. Subjective mood was recorded at baseline and after 4 h, when participants performed an explicit emotional face processing task and a task eliciting unconscious processing of emotionally aversive and neutral images presented subliminally using backward masking. ATD was associated with a robust reduction in plasma tryptophan at 4 h but had no effect on mood or autonomic physiology. ATD was associated with significantly lower attractiveness ratings for happy faces and attenuation of intensity/arousal ratings of angry faces. ATD also reduced overall reaction times on the unconscious perception task, but there was no interaction with the emotional content of masked stimuli. ATD did not affect breakthrough perception (accuracy in identification) of masked images. ATD attenuates the attractiveness of positive faces and the negative intensity of threatening faces, suggesting that serotonin contributes specifically to the appraisal of both positive and negative socially salient emotional cues. We found no evidence that serotonin affects unconscious processing of negative emotional stimuli. These novel findings implicate serotonin in conscious aspects of active social and behavioural engagement and extend knowledge regarding the effects of ATD on emotional perception.

  8. Development of Emotional Face Processing in Premature and Full-Term Infants.

    PubMed

    Carbajal-Valenzuela, Cintli Carolina; Santiago-Rodríguez, Efraín; Quirarte, Gina L; Harmony, Thalía

    2017-03-01

    The rate of premature births has increased over the past two decades. Ten percent of premature birth survivors develop motor impairment, but almost half exhibit later sensory, cognitive, and emotional disabilities attributed to white matter injury and decreased volume of neuronal structures. The aim of this study was to test the hypothesis that premature and full-term infants differ in their development of emotional face processing. A comparative longitudinal study was conducted in premature and full-term infants at 4 and 8 months of age. The absolute power of the electroencephalogram was analyzed in both groups during 5 conditions of an emotional face processing task: positive, negative, and neutral faces, non-face, and rest. Differences between the conditions of the task at 4 months were limited to rest versus non-rest comparisons in both groups. Eight-month-old term infants showed increases (P ≤ .05) in absolute power in the left occipital region at 10.1 Hz and in the right occipital region at 3.5, 12.8, and 16.0 Hz when shown a positive face in comparison with a neutral face. They also showed increases in absolute power in the left occipital region at 1.9 Hz and in the right occipital region at 2.3 and 3.5 Hz for positive faces compared with non-face stimuli. In contrast, positive, negative, and neutral faces elicited the same responses in premature infants. In conclusion, our study provides electrophysiological evidence that emotional face processing develops differently in premature than in full-term infants, suggesting that premature birth alters mechanisms of brain development, such as the myelination process, and consequently affects complex cognitive functions.
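    The absolute-power measure used above can be sketched as a narrow-band spectral power estimate at the frequency of interest. The following is a minimal Welch-style averaged-periodogram illustration; the study's actual EEG pipeline (window lengths, artifact rejection, reference scheme) is not specified here, and the simulated signal is invented.

```python
import numpy as np

def absolute_power(eeg, fs, freq, nperseg=512):
    """Absolute spectral power of one EEG channel in the frequency bin
    nearest `freq`, via a Hann-windowed averaged periodogram (Welch-style).
    Illustrative sketch only, not the authors' exact analysis."""
    segs = [eeg[i:i + nperseg]
            for i in range(0, len(eeg) - nperseg + 1, nperseg // 2)]
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    win = np.hanning(nperseg)
    psd = np.zeros(len(freqs))
    for s in segs:
        spec = np.fft.rfft(s * win)
        psd += (np.abs(spec) ** 2) / (fs * np.sum(win ** 2))
    psd /= len(segs)                         # average across segments
    k = int(np.argmin(np.abs(freqs - freq)))  # nearest frequency bin
    return psd[k] * (freqs[1] - freqs[0])     # power in that bin

# Simulated channel: a 10.1 Hz oscillation in background noise
np.random.seed(0)
fs = 256
t = np.arange(0, 8, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10.1 * t) + 0.1 * np.random.randn(len(t))
```

    Comparing `absolute_power` at, say, 10.1 Hz versus 16.0 Hz across conditions (positive vs. neutral face) is the shape of the contrast reported above.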

  9. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability to identify, process, and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI denotes a performance-based measure of individual skill at perceiving, using, understanding, and managing emotions. Previous models suggest that a brain “somatic marker circuitry” (SMC) sustains the emotional sub-processes included in EI. This circuitry comprises three primary brain regions: the amygdala, the insula, and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e., approach or avoid) concerning faces with different emotional expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific. PMID:26859495

  10. The effects of social anxiety on emotional face discrimination and its modulation by mouth salience.

    PubMed

    du Rocher, Andrew R; Pickering, Alan D

    2018-05-21

    People high in social anxiety experience fear of social situations because of the likelihood of social evaluation. Whereas happy faces are generally processed very quickly, this advantage is reduced by high social anxiety. Mouth regions are implicated in emotional face processing; therefore, differences in mouth salience might affect how social anxiety relates to emotional face discrimination. We designed an emotional facial expression recognition task to reveal how varying levels of sub-clinical social anxiety (measured by questionnaire) related to the discrimination of happy and fearful faces, and of happy and angry faces. We also categorised the facial expressions by the salience of the mouth region (i.e., high [open mouth] vs. low [closed mouth]). In a sample of 90 participants, higher social anxiety (relative to lower social anxiety) was associated with a reduced happy-face reaction time advantage. However, this effect was driven mainly by the faces with less salient, closed mouths. Our results are consistent with theories of anxiety that incorporate an oversensitive valence evaluation system.

  11. "Blindsight" and subjective awareness of fearful faces: Inversion reverses the deficits in fear perception associated with core psychopathic traits.

    PubMed

    Oliver, Lindsay D; Mao, Alexander; Mitchell, Derek G V

    2015-01-01

    Emotional faces preferentially reach awareness; the present study utilised both objective and subjective indices of awareness to determine whether emotion also enhances subjective awareness and "blindsight". Under continuous flash suppression, participants localised a disgusted, fearful, or neutral face (objective index) and rated their confidence (subjective index). Psychopathic traits were also measured to investigate their influence on emotion perception. As predicted, fear increased localisation accuracy, subjective awareness, and "blindsight" for upright faces. Coldhearted traits were inversely related to subjective awareness, but not "blindsight", of upright fearful faces. In a follow-up experiment using inverted faces, increased localisation accuracy and awareness, but not "blindsight", were observed for fear. Surprisingly, awareness of inverted fearful faces was positively correlated with coldheartedness. These results suggest that emotion enhances both pre-conscious processing and the qualitative experience of awareness, but that pre-conscious and conscious processing of emotional faces rely on at least partially dissociable cognitive mechanisms.

  12. Event-related potentials reveal preserved attention allocation but impaired emotion regulation in patients with epilepsy and comorbid negative affect.

    PubMed

    De Taeye, Leen; Pourtois, Gilles; Meurs, Alfred; Boon, Paul; Vonck, Kristl; Carrette, Evelien; Raedt, Robrecht

    2015-01-01

    Patients with epilepsy have a high prevalence of comorbid mood disorders. This study aims to evaluate whether negative affect in epilepsy is associated with dysfunction of emotion regulation. Event-related potentials (ERPs) are used to unravel the exact electrophysiological time course and to investigate whether a possible dysfunction arises during early (attention) and/or late (regulation) stages of emotion control. Fifty patients with epilepsy, with (n = 25) or without (n = 25) comorbid negative affect, and twenty-five matched controls were recruited. ERPs were recorded while subjects performed a face- or house-matching task in which fearful, sad, or neutral faces were presented at either attended or unattended spatial locations. Two ERP components were analyzed: the early vertex positive potential (VPP), which is normally enhanced for faces, and the late positive potential (LPP), which is typically larger for emotional stimuli. All participants had larger amplitude of the early face-sensitive VPP for attended faces compared to houses, regardless of their emotional content. By contrast, in patients with negative affect only, the amplitude of the LPP was significantly increased for unattended negative emotional expressions. These VPP results indicate that epilepsy, with or without negative affect, does not interfere with the early structural encoding and attentional selection of faces. However, the LPP results suggest abnormal regulation processes during the processing of unattended emotional faces in patients with epilepsy and comorbid negative affect. In conclusion, this ERP study reveals that early object-based attention processes are not compromised by epilepsy but that, combined with negative affect, this neurological disease is associated with dysfunction during the later stages of emotion regulation. As such, these new neurophysiological findings shed light on the complex interplay of epilepsy and negative affect during the processing of emotional visual stimuli and in turn might help to better understand the etiology and maintenance of mood disorders in epilepsy.

  13. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    PubMed

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In the aware condition, a face (angry, fearful, happy, neutral, or blank oval) appeared for 187 ms before the shape. In the non-aware condition, a face appeared for 17 ms, followed by a mask (170 ms) and then the shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware trials; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the left precentral gyrus, right posterior cingulate, right superior temporal gyrus, right middle occipital gyrus, and left medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and from each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. The effect of age on memory for emotional faces.

    PubMed

    Grady, Cheryl L; Hongwanishkul, Donaya; Keightley, Michelle; Lee, Wendy; Hasher, Lynn

    2007-05-01

    Prior studies of emotion suggest that young adults should have enhanced memory for negative faces and that this enhancement should be reduced in older adults. Several studies have not shown these effects but were conducted with procedures different from those used with other emotional stimuli. In this study, researchers examined age differences in recognition of faces with emotional or neutral expressions, using trial-unique stimuli, as is typically done with other types of emotional stimuli. They also assessed the influence of personality traits and mood on memory. Enhanced recognition for negative faces was found in young adults but not in older adults. Recognition of faces was not influenced by mood or personality traits in young adults, but lower levels of extraversion and better emotional sensitivity predicted better negative face memory in older adults. These results suggest that negative expressions enhance memory for faces in young adults, as negative valence enhances memory for words and scenes. This enhancement is absent in older adults, but memory for emotional faces is modulated in older adults by personality traits that are relevant to emotional processing. (c) 2007 APA, all rights reserved

  15. Cholinergic enhancement modulates neural correlates of selective attention and emotional processing.

    PubMed

    Bentley, Paul; Vuilleumier, Patrik; Thiel, Christiane M; Driver, Jon; Dolan, Raymond J

    2003-09-01

    Neocortical cholinergic afferents are proposed to influence both selective attention and emotional processing. In a study of healthy adults we used event-related fMRI while orthogonally manipulating attention and emotionality to examine regions showing effects of cholinergic modulation by the anticholinesterase physostigmine. Either face or house pictures appeared at task-relevant locations, with the alternative picture type at irrelevant locations. Faces had either neutral or fearful expressions. Physostigmine increased relative activity within the anterior fusiform gyrus for faces at attended, versus unattended, locations, but decreased relative activity within the posterolateral occipital cortex for houses in attended, versus unattended, locations. A similar pattern of regional differences in the effect of physostigmine on cue-evoked responses was also present in the absence of stimuli. Cholinergic enhancement augmented the relative neuronal response within the middle fusiform gyrus to fearful faces, whether at attended or unattended locations. By contrast, physostigmine influenced responses in the orbitofrontal, intraparietal and cingulate cortices to fearful faces when faces occupied task-irrelevant locations. These findings suggest that acetylcholine may modulate both selective attention and emotional processes through independent, region-specific effects within the extrastriate cortex. Furthermore, cholinergic inputs to the frontoparietal cortex may influence the allocation of attention to emotional information.

  16. Exploring the Role of Spatial Frequency Information during Neural Emotion Processing in Human Infants.

    PubMed

    Jessen, Sarah; Grossmann, Tobias

    2017-01-01

    Enhanced attention to fear expressions in adults is primarily driven by information from low, as opposed to high, spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants (N = 26). Our results revealed that infants' brains discriminated between emotional facial expressions containing high, but not low, spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and happy and fearful faces containing low spatial frequencies. Our results demonstrate that already in infancy, spatial frequency content influences the processing of facial emotions. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent upon frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.

  17. Early visual ERPs are influenced by individual emotional skills.

    PubMed

    Meaux, Emilie; Roux, Sylvie; Batty, Magali

    2014-08-01

    Processing information from faces is crucial to understanding others and to adapting to social life. Many studies have investigated responses to facial emotions to provide a better understanding of the processes and the neural networks involved. Moreover, several studies have revealed abnormalities of emotional face processing and their neural correlates in affective disorders. The aim of this study was to investigate whether early visual event-related potentials (ERPs) are affected by the emotional skills of healthy adults. Unfamiliar faces expressing the six basic emotions were presented to 28 young adults while recording visual ERPs. No specific task was required during the recording. Participants also completed the Social Skills Inventory (SSI) which measures social and emotional skills. The results confirmed that early visual ERPs (P1, N170) are affected by the emotions expressed by a face and also demonstrated that N170 and P2 are correlated to the emotional skills of healthy subjects. While N170 is sensitive to the subject's emotional sensitivity and expressivity, P2 is modulated by the ability of the subjects to control their emotions. We therefore suggest that N170 and P2 could be used as individual markers to assess strengths and weaknesses in emotional areas and could provide information for further investigations of affective disorders. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  18. Nonconscious emotional activation colors first impressions: a regulatory role for conscious awareness.

    PubMed

    Lapate, Regina C; Rokers, Bas; Li, Tianyi; Davidson, Richard J

    2014-02-01

    Emotions can color people's attitudes toward unrelated objects in the environment. Existing evidence suggests that such emotional coloring is particularly strong when emotion-triggering information escapes conscious awareness. But is emotional reactivity stronger after nonconscious emotional provocation than after conscious emotional provocation, or does conscious processing specifically change the association between emotional reactivity and evaluations of unrelated objects? In this study, we independently indexed emotional reactivity and coloring as a function of emotional-stimulus awareness to disentangle these accounts. Specifically, we recorded skin-conductance responses to spiders and fearful faces, along with subsequent preferences for novel neutral faces during visually aware and unaware states. Fearful faces increased skin-conductance responses comparably in both stimulus-aware and stimulus-unaware conditions. Yet only when visual awareness was precluded did skin-conductance responses to fearful faces predict decreased likability of neutral faces. These findings suggest a regulatory role for conscious awareness in breaking otherwise automatic associations between physiological reactivity and evaluative emotional responses.

  19. The Perception of Dynamic and Static Facial Expressions of Happiness and Disgust Investigated by ERPs and fMRI Constrained Source Analysis

    PubMed Central

    Trautmann-Lengsfeld, Sina Alexa; Domínguez-Borràs, Judith; Escera, Carles; Herrmann, Manfred; Fehr, Thorsten

    2013-01-01

    A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study aimed to investigate the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERPs), and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis of static emotional face stimuli indicated the spatio-temporal modulation of predominantly posterior regional brain activation related to the visual processing stream for both emotional valences when compared to the neutral condition in the fusiform gyrus. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays convey more information, reflected in the engagement of complex neural networks, in particular because their changing features potentially trigger sustained activation related to a continuing evaluation of those faces. A combined fMRI and EEG approach thus provides deeper insight into the spatio-temporal characteristics of emotional face processing by revealing additional neural generators not identifiable with fMRI alone. PMID:23818974

  20. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    PubMed

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top + happy bottom), incongruent composite configurations (e.g., angry top + happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part and ignored information in the other, non-target part. Results indicate that the recognition of happy and angry expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and angry expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information: fusiform and inferior occipital areas and the amygdala were predominantly activated when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when features were seen in isolation. Collectively, these findings suggest that facial emotion recognition recruits separate but interactive dorsal and ventral routes within the face processing network, whose engagement may be shaped by reciprocal interactions and modulated by task demands. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. What’s in a Face? How Face Gender and Current Affect Influence Perceived Emotion

    PubMed Central

    Harris, Daniel A.; Hayes-Skelton, Sarah A.; Ciaramitaro, Vivian M.

    2016-01-01

    Faces drive our social interactions. A vast literature suggests an interaction between gender and emotional face perception, with studies using different methodologies demonstrating that the gender of a face can affect how emotions are processed. However, how different is our perception of affective male and female faces? Furthermore, how does our current affective state when viewing faces influence our perceptual biases? We presented participants with a series of faces morphed along an emotional continuum from happy to angry. Participants judged each face morph as either happy or angry. We determined each participant’s unique emotional ‘neutral’ point, defined as the face morph judged to be perceived equally happy and angry, separately for male and female faces. We also assessed how current state affect influenced these perceptual neutral points. Our results indicate that, for both male and female participants, the emotional neutral point for male faces is perceptually biased to be happier than for female faces. This bias suggests that more happiness is required to perceive a male face as emotionally neutral, i.e., we are biased to perceive a male face as more negative. Interestingly, we also find that perceptual biases in perceiving female faces are correlated with current mood, such that positive state affect correlates with perceiving female faces as happier, while we find no significant correlation between negative state affect and the perception of facial emotion. Furthermore, we find reaction time biases, with slower responses for angry male faces compared to angry female faces. PMID:27733839
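    The "neutral point" described above is a point of subjective equality on the happy-angry morph continuum. Below is a minimal sketch of one way to estimate it by linear interpolation of the psychometric data; the authors' actual fitting procedure is not specified here, and the data are invented for illustration.

```python
import numpy as np

def neutral_point(morph_levels, prop_angry):
    """Morph level (0 = fully happy, 100 = fully angry) at which a face is
    judged angry on 50% of trials (point of subjective equality), estimated
    by linear interpolation. Assumes prop_angry increases monotonically.
    Illustrative sketch only, not the study's actual fitting method."""
    return float(np.interp(0.5,
                           np.asarray(prop_angry, dtype=float),
                           np.asarray(morph_levels, dtype=float)))

# Invented data: proportion of "angry" judgements at each morph level
levels   = [0, 20, 40, 60, 80, 100]
male_p   = [0.05, 0.20, 0.55, 0.85, 0.96, 0.99]
female_p = [0.02, 0.10, 0.30, 0.70, 0.92, 0.99]

# A lower neutral point for male faces means less physical anger is needed
# to tip the judgement, i.e., a neutral-looking male face must contain more
# happiness, which is the direction of bias reported above.
```

    Computing the male and female neutral points separately per participant, as the study describes, yields the paired values on which the perceptual bias comparison rests.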

  2. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    PubMed Central

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals. PMID:28868041

  4. Efficacy of identifying neural components in the face and emotion processing system in schizophrenia using a dynamic functional localizer.

    PubMed

    Arnold, Aiden E G F; Iaria, Giuseppe; Goghari, Vina M

    2016-02-28

    Schizophrenia is associated with deficits in face perception and emotion recognition. Despite consistent behavioural results, the neural mechanisms underlying these cognitive abilities have been difficult to isolate, in part due to differences in neuroimaging methods used between studies for identifying regions in the face processing system. Given this problem, we aimed to validate a recently developed fMRI-based dynamic functional localizer task for use in studies of psychiatric populations and specifically schizophrenia. Previously, this functional localizer successfully identified each of the core face processing regions (i.e. fusiform face area, occipital face area, superior temporal sulcus), and regions within an extended system (e.g. amygdala) in healthy individuals. In this study, we tested the functional localizer success rate in 27 schizophrenia patients and 24 community controls. Overall, the core face processing regions were localized equally between the schizophrenia and control groups. Additionally, the amygdala, a candidate brain region from the extended system, was identified in nearly half the participants from both groups. These results indicate the effectiveness of a dynamic functional localizer at identifying regions of interest associated with face perception and emotion recognition in schizophrenia. The use of dynamic functional localizers may help standardize the investigation of the facial and emotion processing system in this and other clinical populations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. Categorising intersectional targets: An "either/and" approach to race- and gender-emotion congruity.

    PubMed

    Smith, Jacqueline S; LaFrance, Marianne; Dovidio, John F

    2017-01-01

    Research on the interaction of emotional expressions with social category cues in face processing has focused on whether specific emotions are associated with single-category identities, thus overlooking the influence of intersectional identities. Instead, we examined how quickly people categorise intersectional targets by their race, gender, or emotional expression. In Experiment 1, participants categorised Black and White faces displaying angry, happy, or neutral expressions by either race or gender. Emotion influenced responses to men versus women only when gender was made salient by the task. Similarly, emotion influenced responses to Black versus White targets only when participants categorised by race. In Experiment 2, participants categorised faces by emotion so that neither category was more salient. As predicted, responses to Black women differed from those to both Black men and White women. Thus, examining race and gender separately is insufficient for understanding how emotion and social category cues are processed.

  6. Hemispheric differences in recognizing upper and lower facial displays of emotion.

    PubMed

    Prodan, C I; Orbelo, D M; Testa, J A; Ross, E D

    2001-01-01

    To determine if there are hemispheric differences in processing upper versus lower facial displays of emotion. Recent evidence suggests that there are two broad classes of emotions with differential hemispheric lateralization. Primary emotions (e.g. anger, fear) and associated displays are innate, are recognized across all cultures, and are thought to be modulated by the right hemisphere. Social emotions (e.g., guilt, jealousy) and associated "display rules" are learned during early child development, vary across cultures, and are thought to be modulated by the left hemisphere. Display rules are used by persons to alter, suppress or enhance primary emotional displays for social purposes. During deceitful behaviors, a subject's true emotional state is often leaked through upper rather than lower facial displays, giving rise to facial blends of emotion. We hypothesized that upper facial displays are processed preferentially by the right hemisphere, as part of the primary emotional system, while lower facial displays are processed preferentially by the left hemisphere, as part of the social emotional system. Thirty strongly right-handed adult volunteers were tested tachistoscopically by randomly flashing facial displays of emotion to the right and left visual fields. The stimuli were line drawings of facial blends with different emotions displayed on the upper versus lower face. The subjects were tested under two conditions: 1) without instructions and 2) with instructions to attend to the upper face. Without instructions, the subjects robustly identified the emotion displayed on the lower face, regardless of visual field presentation. With instructions to attend to the upper face, for the left visual field they robustly identified the emotion displayed on the upper face. For the right visual field, they continued to identify the emotion displayed on the lower face, but to a lesser degree.
Our results support the hypothesis that hemispheric differences exist in the ability to process upper versus lower facial displays of emotion. Attention appears to enhance the ability to explore these hemispheric differences under experimental conditions. Our data also support the recent observation that the right hemisphere has a greater ability to recognize deceitful behaviors compared with the left hemisphere. This may be attributable to the different roles the hemispheres play in modulating social versus primary emotions and related behaviors.

  7. Emotional facial expressions evoke faster orienting responses, but weaker emotional responses at neural and behavioural levels compared to scenes: A simultaneous EEG and facial EMG study.

    PubMed

    Mavratzakis, Aimee; Herbert, Cornelia; Walla, Peter

    2016-01-01

    In the current study, electroencephalography (EEG) was recorded simultaneously with facial electromyography (fEMG) to determine whether emotional faces and emotional scenes are processed differently at the neural level. In addition, it was investigated whether these differences can be observed at the behavioural level via spontaneous facial muscle activity. Emotional content of the stimuli did not affect early P1 activity. Emotional faces elicited enhanced amplitudes of the face-sensitive N170 component, while its counterpart, the scene-related N100, was not sensitive to emotional content of scenes. At 220–280 ms, the early posterior negativity (EPN) was enhanced only slightly for fearful as compared to neutral or happy faces. However, its amplitudes were significantly enhanced during processing of scenes with positive content, particularly over the right hemisphere. Scenes of positive content also elicited enhanced spontaneous zygomatic activity from 500–750 ms onwards, while happy faces elicited no such changes. Contrastingly, both fearful faces and negative scenes elicited enhanced spontaneous corrugator activity at 500–750 ms after stimulus onset. However, relative to baseline, EMG changes occurred earlier for faces (250 ms) than for scenes (500 ms), whereas for scenes the activity changes were more pronounced over the whole viewing period. Taking into account all effects, the data suggest that emotional facial expressions evoke faster attentional orienting, but weaker affective neural activity and emotional behavioural responses compared to emotional scenes. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  8. Neurofunctional Underpinnings of Audiovisual Emotion Processing in Teens with Autism Spectrum Disorders

    PubMed Central

    Doyle-Thomas, Krissy A.R.; Goldberg, Jeremy; Szatmari, Peter; Hall, Geoffrey B.C.

    2013-01-01

    Despite successful performance on some audiovisual emotion tasks, hypoactivity has been observed in frontal and temporal integration cortices in individuals with autism spectrum disorders (ASD). Little is understood about the neurofunctional network underlying this ability in individuals with ASD. Research suggests that there may be processing biases in individuals with ASD, based on their ability to obtain meaningful information from the face and/or the voice. This functional magnetic resonance imaging study examined brain activity in teens with ASD (n = 18) and typically developing controls (n = 16) during audiovisual and unimodal emotion processing. Teens with ASD had a significantly lower accuracy when matching an emotional face to an emotion label. However, no differences in accuracy were observed between groups when matching an emotional voice or face-voice pair to an emotion label. In both groups brain activity during audiovisual emotion matching differed significantly from activity during unimodal emotion matching. Between-group analyses of audiovisual processing revealed significantly greater activation in teens with ASD in a parietofrontal network believed to be implicated in attention, goal-directed behaviors, and semantic processing. In contrast, controls showed greater activity in frontal and temporal association cortices during this task. These results suggest that in the absence of engaging integrative emotional networks during audiovisual emotion matching, teens with ASD may have recruited the parietofrontal network as an alternate compensatory system. PMID:23750139

  9. Rapid processing of emotional expressions without conscious awareness.

    PubMed

    Smith, Marie L

    2012-08-01

    Rapid accurate categorization of the emotional state of our peers is of critical importance and as such many have proposed that facial expressions of emotion can be processed without conscious awareness. Typically, studies focus selectively on fearful expressions due to their evolutionary significance, leaving the subliminal processing of other facial expressions largely unexplored. Here, I investigated the time course of processing of 3 facial expressions (fearful, disgusted, and happy) plus an emotionally neutral face, during objectively unaware and aware perception. Participants completed the challenging "which expression?" task in response to briefly presented backward-masked expressive faces. Although participants' behavioral responses did not differentiate between the emotional content of the stimuli in the unaware condition, activity over frontal and occipitotemporal (OT) brain regions indicated an emotional modulation of the neuronal response. Over frontal regions this was driven by negative facial expressions and was present on all emotional trials independent of later categorization, whereas the N170 component, recorded on lateral OT electrodes, was enhanced for all facial expressions but only on trials that would later be categorized as emotional. The results indicate that emotional faces, not only fearful, are processed without conscious awareness at an early stage and highlight the critical importance of considering categorization response when studying subliminal perception.
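
    Component measurements like the N170 effect described above are typically obtained by averaging single-trial EEG epochs into an ERP and reading out the peak amplitude within a latency window. A minimal NumPy sketch, where the window bounds, sampling rate, and synthetic data are illustrative assumptions rather than this study's actual parameters:

```python
import numpy as np

def erp_component_amplitude(epochs, times, t_min, t_max, mode="min"):
    """Average single-trial epochs into an ERP and measure a component's
    peak amplitude within a latency window.

    epochs : array (n_trials, n_samples) -- one electrode's data, in microvolts
    times  : array (n_samples,)          -- sample times in seconds
    mode   : "min" for negative-going components (e.g. N170), "max" otherwise
    """
    erp = epochs.mean(axis=0)                      # grand average across trials
    window = (times >= t_min) & (times <= t_max)   # latency window of interest
    segment = erp[window]
    return segment.min() if mode == "min" else segment.max()

# Synthetic example: 50 trials of noise plus a negative deflection near 170 ms
rng = np.random.default_rng(0)
times = np.linspace(-0.1, 0.5, 601)                # -100 to 500 ms at 1 kHz
signal = -5.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.01 ** 2))
epochs = signal + rng.normal(0, 2.0, size=(50, times.size))

n170 = erp_component_amplitude(epochs, times, 0.15, 0.20, mode="min")
```

Averaging across trials suppresses the noise by roughly the square root of the trial count, which is why the ~5 µV deflection survives 2 µV single-trial noise here.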

  10. One size does not fit all: face emotion processing impairments in semantic dementia, behavioural-variant frontotemporal dementia and Alzheimer's disease are mediated by distinct cognitive deficits.

    PubMed

    Miller, Laurie A; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R; Piguet, Olivier

    2012-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and Emotion Selection) and found that all three patient groups were similarly impaired. Analyses of covariance employed to partial out the influences of language and perceptual impairments, which frequently co-occur in these patients, provided evidence of different underlying cognitive mechanisms. These analyses revealed that language impairments explained the original poor scores obtained by the SD patients on the Ekman 60 and Emotion Selection tasks, which involve verbal labels. Perceptual deficits contributed to Emotion Matching performance in the bvFTD and AD patients. Importantly, all groups remained impaired on one task or more following these analyses, denoting a primary emotion processing disturbance in these dementia syndromes. These findings highlight the multifactorial nature of emotion processing deficits in patients with dementia.

  11. Non-Conscious Emotional Activation Colors First Impressions: A Regulatory Role for Conscious Awareness

    PubMed Central

    Lapate, R.C.; Rokers, B.; Li, T.; Davidson, R.J.

    2014-01-01

    Emotions can color our attitudes toward unrelated objects in the environment. Prior evidence suggests that such emotional coloring is particularly strong when emotion-triggering information escapes conscious awareness. But, is emotional reactivity stronger following non-conscious versus conscious emotional provocation? Or does conscious processing specifically change the association between emotional reactivity and evaluations of unrelated objects? In this study, we independently indexed emotional reactivity and coloring as a function of emotional-stimulus awareness to disentangle these accounts. Specifically, we recorded skin conductance responses (SCRs) to spiders and fearful faces, along with subsequent preferences for novel neutral faces during visually aware and unaware states. Fearful faces increased SCRs comparably in both aware and unaware conditions. Yet, only when visual awareness was precluded did SCRs to fearful faces predict decreased likeability of neutral faces. These findings suggest a regulatory role for conscious awareness in breaking otherwise automatic associations between physiological reactivity and evaluative emotional responses. PMID:24317420

  12. Differential Brain Activation to Angry Faces by Elite Warfighters: Neural Processing Evidence for Enhanced Threat Detection

    PubMed Central

    Paulus, Martin P.; Simmons, Alan N.; Fitzpatrick, Summer N.; Potterat, Eric G.; Van Orden, Karl F.; Bauman, James; Swain, Judith L.

    2010-01-01

    Background Little is known about the neural basis of elite performers and their optimal performance in extreme environments. The purpose of this study was to examine brain processing differences between elite warfighters and comparison subjects in brain structures that are important for emotion processing and interoception. Methodology/Principal Findings Navy Sea, Air, and Land Forces (SEALs) while off duty (n = 11) were compared with n = 23 healthy male volunteers while performing a simple emotion face-processing task during functional magnetic resonance imaging. Irrespective of the target emotion, elite warfighters relative to comparison subjects showed relatively greater right-sided insula, but attenuated left-sided insula, activation. Navy SEALs showed selectively greater activation to angry target faces relative to fearful or happy target faces bilaterally in the insula. This was not accounted for by contrasting positive versus negative emotions. Finally, these individuals also showed slower response latencies to fearful and happy target faces than did comparison subjects. Conclusions/Significance These findings support the hypothesis that elite warfighters deploy greater processing resources toward potential threat-related facial expressions and reduced processing resources to non-threat-related facial expressions. Moreover, rather than expending more effort in general, elite warfighters show more focused neural and performance tuning. In other words, greater neural processing resources are directed toward threat stimuli and processing resources are conserved when facing a nonthreat stimulus situation. PMID:20418943

  13. Disrupted neural processing of emotional faces in psychopathy.

    PubMed

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb the development of interpersonal relationships and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but a disruption in the processing of emotional faces exists involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.
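
    The psychophysiological interaction (PPI) analysis named above tests whether coupling between a seed region (here the amygdala) and a target region changes with task condition, by adding a seed-by-task product regressor to the regression model. A hypothetical sketch with simulated data (regressor construction is simplified; real PPI analyses also deconvolve the hemodynamic response before forming the interaction term):

```python
import numpy as np

# Regress a target region's timecourse on (1) the task regressor,
# (2) the seed timecourse, and (3) their product -- the PPI term.
# A nonzero interaction weight indicates task-dependent coupling.
rng = np.random.default_rng(1)
n = 200
task = (np.arange(n) // 20) % 2 * 1.0 - 0.5      # alternating blocks, mean-centered
seed = rng.normal(size=n)                         # seed-region signal (e.g. amygdala)

# Simulate a target whose coupling with the seed strengthens during task blocks
target = 0.5 * seed + 1.0 * (task * seed) + rng.normal(0, 0.5, size=n)

X = np.column_stack([np.ones(n), task, seed, task * seed])  # design matrix
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
ppi_weight = beta[3]                              # task-by-seed interaction weight
```

With the simulated coupling change of 1.0, the recovered interaction weight lands near 1; a reduced PPI weight in one group, as reported above, would correspond to weaker task-dependent seed-target coupling.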

  14. Culture modulates the brain response to human expressions of emotion: electrophysiological evidence.

    PubMed

    Liu, Pan; Rigoulot, Simon; Pell, Marc D

    2015-01-01

    To understand how culture modulates on-line neural responses to social information, this study compared how individuals from two distinct cultural groups, English-speaking North Americans and Chinese, process emotional meanings of multi-sensory stimuli as indexed by both behaviour (accuracy) and event-related potential (N400) measures. In an emotional Stroop-like task, participants were presented with face-voice pairs expressing congruent or incongruent emotions in conditions where they judged the emotion of one modality while ignoring the other (face or voice focus task). Results indicated that while both groups were sensitive to emotional differences between channels (with lower accuracy and higher N400 amplitudes for incongruent face-voice pairs), there were marked group differences in how intruding facial or vocal cues affected accuracy and N400 amplitudes, with English participants showing greater interference from irrelevant faces than Chinese. Our data illuminate distinct biases in how adults from East Asian versus Western cultures process socio-emotional cues, supplying new evidence that cultural learning modulates not only behaviour, but the neurocognitive response to different features of multi-channel emotion expressions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    PubMed

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  16. Real-Time Functional Magnetic Resonance Imaging Amygdala Neurofeedback Changes Positive Information Processing in Major Depressive Disorder.

    PubMed

    Young, Kymberly D; Misaki, Masaya; Harmer, Catherine J; Victor, Teresa; Zotev, Vadim; Phillips, Raquel; Siegle, Greg J; Drevets, Wayne C; Bodurka, Jerzy

    2017-10-15

    In participants with major depressive disorder who are trained to upregulate their amygdalar hemodynamic responses during positive autobiographical memory recall with real-time functional magnetic resonance imaging neurofeedback (rtfMRI-nf) training, depressive symptoms diminish. This study tested whether amygdalar rtfMRI-nf also changes emotional processing of positive and negative stimuli in a variety of behavioral and imaging tasks. Patients with major depressive disorder completed two rtfMRI-nf sessions (18 received amygdalar rtfMRI-nf, 16 received control parietal rtfMRI-nf). One week before and following rtfMRI-nf training, participants performed tasks measuring responses to emotionally valenced stimuli including a backward-masking task, which measures the amygdalar hemodynamic response to emotional faces presented for traditionally subliminal duration and followed by a mask, and the Emotional Test Battery in which reaction times and performance accuracy are measured during tasks involving emotional faces and words. During the backward-masking task, amygdalar responses increased while viewing masked happy faces but decreased to masked sad faces in the experimental versus control group following rtfMRI-nf. During the Emotional Test Battery, reaction times decreased during identification of positive faces and during self-identification with positive words, and vigilance scores in the faces dot-probe task increased for positive faces and decreased for negative faces in the experimental versus control group following rtfMRI-nf. rtfMRI-nf training to increase the amygdalar hemodynamic response to positive memories was associated with changes in amygdalar responses to happy and sad faces and improved processing of positive stimuli during performance of the Emotional Test Battery. These results may suggest that amygdalar rtfMRI-nf training alters responses to emotional stimuli in a manner similar to antidepressant pharmacotherapy.
Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  17. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    ERIC Educational Resources Information Center

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ and age matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity or by facial identity…

  18. Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.

    PubMed

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-01-01

    Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. Methods: In a double-blind, within-subjects, randomized control experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT. This effect was marginally significant (p<.06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhanced facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).

  19. Association between Amygdala Response to Emotional Faces and Social Anxiety in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Kleinhans, Natalia M.; Richards, Todd; Weaver, Kurt; Johnson, L. Clark; Greenson, Jessica; Dawson, Geraldine; Aylward, Elizabeth

    2010-01-01

    Difficulty interpreting facial expressions has been reported in autism spectrum disorders (ASD) and is thought to be associated with amygdala abnormalities. To further explore the neural basis of abnormal emotional face processing in ASD, we conducted an fMRI study of emotional face matching in high-functioning adults with ASD and age, IQ, and…

  20. Acute tryptophan depletion attenuates conscious appraisal of social emotional signals in healthy female volunteers

    PubMed Central

    Gray, Marcus A.; Minati, Ludovico; Whale, Richard; Harrison, Neil A.; Critchley, Hugo D.

    2010-01-01

    Rationale: Acute tryptophan depletion (ATD) decreases levels of central serotonin. ATD thus enables the cognitive effects of serotonin to be studied, with implications for the understanding of psychiatric conditions, including depression. Objective: To determine the role of serotonin in conscious (explicit) and unconscious/incidental processing of emotional information. Materials and methods: A randomized, double-blind, cross-over design was used with 15 healthy female participants. Subjective mood was recorded at baseline and after 4 h, when participants performed an explicit emotional face processing task, and a task eliciting unconscious processing of emotionally aversive and neutral images presented subliminally using backward masking. Results: ATD was associated with a robust reduction in plasma tryptophan at 4 h but had no effect on mood or autonomic physiology. ATD was associated with significantly lower attractiveness ratings for happy faces and attenuation of intensity/arousal ratings of angry faces. ATD also reduced overall reaction times on the unconscious perception task, but there was no interaction with emotional content of masked stimuli. ATD did not affect breakthrough perception (accuracy in identification) of masked images. Conclusions: ATD attenuates the attractiveness of positive faces and the negative intensity of threatening faces, suggesting that serotonin contributes specifically to the appraisal of the social salience of both positive and negative salient social emotional cues. We found no evidence that serotonin affects unconscious processing of negative emotional stimuli. These novel findings implicate serotonin in conscious aspects of active social and behavioural engagement and extend knowledge regarding the effects of ATD on emotional perception. PMID:20596858

  1. Reduced beta connectivity during emotional face processing in adolescents with autism.

    PubMed

    Leung, Rachel C; Ye, Annette X; Wong, Simeon M; Taylor, Margot J; Doesburg, Sam M

    2014-01-01

    Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by impairments in social cognition. The biological basis of deficits in social cognition in ASD, and their difficulty in processing emotional face information in particular, remains unclear. Atypical communication within and between brain regions has been reported in ASD. Interregional phase-locking is a neurophysiological mechanism mediating communication among brain areas and is understood to support cognitive functions. In the present study we investigated interregional magnetoencephalographic phase synchronization during the perception of emotional faces in adolescents with ASD. A total of 22 adolescents with ASD (18 males, mean age = 14.2 ± 1.15 years, 22 right-handed) with mild to no cognitive delay and 17 healthy controls (14 males, mean age = 14.4 ± 0.33 years, 16 right-handed) performed an implicit emotional processing task requiring perception of happy, angry and neutral faces while we recorded neuromagnetic signals. The faces were presented rapidly (80 ms duration) to the left or right of a central fixation cross and participants responded to a scrambled pattern that was presented concurrently on the opposite side of the fixation point. Task-dependent interregional phase-locking was calculated among source-resolved brain regions. Task-dependent increases in interregional beta synchronization were observed. Beta-band interregional phase-locking in adolescents with ASD was reduced, relative to controls, during the perception of angry faces in a distributed network involving the right fusiform gyrus and insula. No significant group differences were found for happy or neutral faces, or other analyzed frequency ranges. Significant reductions in task-dependent beta connectivity strength, clustering and eigenvector centrality (all P < 0.001) in the right insula were found in adolescents with ASD, relative to controls.
Reduced beta synchronization may reflect inadequate recruitment of task-relevant networks during emotional face processing in ASD. The right insula, specifically, was a hub of reduced functional connectivity and may play a prominent role in the inability to effectively extract emotional information from faces. These findings suggest that functional disconnection in brain networks mediating emotional processes may contribute to deficits in social cognition in this population.
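
    Interregional phase synchronization of the kind analyzed above is commonly quantified with the phase-locking value (PLV): the magnitude of the mean unit phasor of the instantaneous phase difference between two signals, ranging from 0 (no consistent phase relation) to 1 (perfect locking). A small illustrative sketch on synthetic beta-band signals (the study's source-resolved MEG pipeline is far more involved):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two band-limited signals: magnitude of
    the mean unit phasor of their instantaneous phase difference."""
    phase_x = np.angle(hilbert(x))     # instantaneous phase via analytic signal
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic 20 Hz (beta-band) example: a phase-lagged copy vs. an
# unrelated oscillation at a different frequency
rng = np.random.default_rng(2)
t = np.arange(0, 2, 1 / 500)                       # 2 s at 500 Hz
beta = np.sin(2 * np.pi * 20 * t)
locked = np.sin(2 * np.pi * 20 * t + 0.8)          # constant phase lag
unlocked = np.sin(2 * np.pi * 23 * t + rng.uniform(0, 2 * np.pi))

plv_locked = phase_locking_value(beta, locked)     # near 1
plv_unlocked = phase_locking_value(beta, unlocked) # well below 1
```

A consistent phase lag, even a nonzero one, yields a high PLV, whereas drifting phase relations average toward zero; the graph metrics reported above (strength, clustering, eigenvector centrality) are then computed on the matrix of pairwise PLVs between regions.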

  2. Auditory to Visual Cross-Modal Adaptation for Emotion: Psychophysical and Neural Correlates.

    PubMed

    Wang, Xiaodong; Guo, Xiaotao; Chen, Lin; Liu, Yijun; Goldberg, Michael E; Xu, Hong

    2017-02-01

    Adaptation is fundamental in sensory processing and has been studied extensively within the same sensory modality. However, little is known about adaptation across sensory modalities, especially in the context of high-level processing, such as the perception of emotion. Previous studies have shown that prolonged exposure to a face exhibiting one emotion, such as happiness, leads to contrastive biases in the perception of subsequently presented faces toward the opposite emotion, such as sadness. Such work has shown the importance of adaptation in calibrating face perception based on prior visual exposure. In the present study, we showed for the first time that emotion-laden sounds, like laughter, adapt the visual perception of emotional faces, that is, subjects more frequently perceived faces as sad after listening to a happy sound. Furthermore, via electroencephalography recordings and event-related potential analysis, we showed that there was a neural correlate underlying the perceptual bias: There was an attenuated response occurring at ∼ 400 ms to happy test faces and a quickened response to sad test faces, after exposure to a happy sound. Our results provide the first direct evidence for a behavioral cross-modal adaptation effect on the perception of facial emotion, and its neural correlate.

  3. Implicit reward associations impact face processing: Time-resolved evidence from event-related brain potentials and pupil dilations.

    PubMed

    Hammerschmidt, Wiebke; Kagan, Igor; Kulke, Louisa; Schacht, Annekathrin

    2018-06-22

    The present study aimed at investigating whether associated motivational salience causes preferential processing of inherently neutral faces similar to emotional expressions by means of event-related brain potentials (ERPs) and changes of the pupil size. To this aim, neutral faces were implicitly associated with monetary outcome, while participants (N = 44) performed a masked prime face-matching task that ensured performance around chance level and thus an equal proportion of gain, loss, and zero outcomes. Motivational context strongly impacted the processing of the fixation, prime and mask stimuli prior to the target face, indicated by enhanced amplitudes of subsequent ERP components and increased pupil size. In a separate test session, previously associated faces as well as novel faces with emotional expressions were presented within the same task but without motivational context and performance feedback. Most importantly, previously gain-associated faces amplified the LPC, although the individually contingent face-outcome assignments were not made explicit during the learning session. Emotional expressions impacted the N170 and EPN components. Modulations of the pupil size were absent in both motivationally-associated and emotional conditions. Our findings demonstrate that neural representations of neutral stimuli can acquire increased salience via implicit learning, with an advantage for gain over loss associations.

  4. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    PubMed Central

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2014-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder (ASD), yet little is known about how individuals with ASD misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with ASD and 30 age- and gender-matched volunteers without ASD to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with ASD. In particular, adults with ASD uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to non-emotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with ASD. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in ASD. PMID:24535689

  5. Preschool negative emotionality predicts activity and connectivity of the fusiform face area and amygdala in later childhood.

    PubMed

    Kann, Sarah J; O'Rawe, Jonathan F; Huang, Anna S; Klein, Daniel N; Leung, Hoi-Chung

    2017-09-01

    Negative emotionality (NE) refers to individual differences in the propensity to experience and react with negative emotions and is associated with increased risk of psychological disorder. However, research on the neural bases of NE has focused almost exclusively on amygdala activity during emotional face processing. This study broadened this framework by examining the relationship between observed NE in early childhood and subsequent neural responses to emotional faces in both the amygdala and the fusiform face area (FFA) in a late childhood/early adolescent sample. Measures of NE were obtained from children at age 3 using laboratory observations, and functional magnetic resonance imaging (fMRI) data were collected when these children were between the ages of 9 and 12 while performing a visual stimulus identity matching task with houses and emotional faces as stimuli. Multiple regression analyses revealed that higher NE at age 3 is associated with significantly greater activation in the left amygdala and left FFA but lower functional connectivity between these two regions during the face conditions. These findings suggest that those with higher early NE have subsequent alterations in both activity and connectivity within an extended network during face processing.

  6. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    ERIC Educational Resources Information Center

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  7. Preliminary evidence that different mechanisms underlie the anger superiority effect in children with and without Autism Spectrum Disorders

    PubMed Central

    Isomura, Tomoko; Ogawa, Shino; Yamada, Satoko; Shibasaki, Masahiro; Masataka, Nobuo

    2014-01-01

    Previous studies have demonstrated that angry faces capture humans' attention more rapidly than emotionally positive faces. This phenomenon is referred to as the anger superiority effect (ASE). Despite atypical emotional processing, adults and children with Autism Spectrum Disorders (ASD) have been reported to show ASE as well as typically developed (TD) individuals. So far, however, few studies have clarified whether or not the mechanisms underlying ASE are the same for both TD and ASD individuals. Here, we tested how TD and ASD children process schematic emotional faces during detection by employing a recognition task in combination with a face-in-the-crowd task. Results of the face-in-the-crowd task revealed the prevalence of ASE both in TD and ASD children. However, the results of the recognition task revealed group differences: In TD children, detection of angry faces required more configural face processing and disrupted the processing of local features. In ASD children, on the other hand, it required more feature-based processing rather than configural processing. Despite the small sample sizes, these findings provide preliminary evidence that children with ASD, in contrast to TD children, show quick detection of angry faces by extracting local features in faces. PMID:24904477

  8. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    PubMed

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences.

  9. The human body odor compound androstadienone increases neural conflict coupled to higher behavioral costs during an emotional Stroop task.

    PubMed

    Hornung, Jonas; Kogler, Lydia; Erb, Michael; Freiherr, Jessica; Derntl, Birgit

    2018-05-01

    The androgen derivative androstadienone (AND) is a substance found in human sweat and thus may act as human chemosignal. With the current experiment, we aimed to explore in which way AND affects interference processing during an emotional Stroop task which used human faces as target and emotional words as distractor stimuli. This was complemented by functional magnetic resonance imaging (fMRI) to unravel the neural mechanism of AND-action. Based on previous accounts we expected AND to increase neural activation in areas commonly implicated in evaluation of emotional face processing and to change neural activation in brain regions linked to interference processing. For this aim, a total of 80 healthy individuals (oral contraceptive users, luteal women, men) were tested twice on two consecutive days with an emotional Stroop task using fMRI. Our results suggest that AND increases interference processing in brain areas that are heavily recruited during emotional conflict. At the same time, correlation analyses revealed that this neural interference processing was paralleled by higher behavioral costs (response times) with higher interference related brain activation under AND. Furthermore, AND elicited higher activation in regions implicated in emotional face processing including right fusiform gyrus, inferior frontal gyrus and dorsomedial cortex. In this connection, neural activation was not coupled to behavioral outcome. Furthermore, despite previous accounts of increased hypothalamic activation under AND, we were not able to replicate this finding and discuss possible reasons for this discrepancy. To conclude, AND increased interference processing in regions heavily recruited during emotional conflict which was coupled to higher costs in resolving emotional conflicts with stronger interference-related brain activation under AND. At the moment it remains unclear whether these effects are due to changes in conflict detection or resolution. 
However, evidence most consistently suggests that AND does not draw attention to the most potent socio-emotional information (human faces) but rather highlights representations of emotional words.
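
The behavioral interference cost referred to above is conventionally computed as the response-time difference between incongruent and congruent trials. A minimal sketch with made-up response times (the numbers are purely illustrative, not data from this study):

```python
# Hypothetical per-trial response times in milliseconds.
rts_ms = {
    "congruent":   [540, 565, 510, 598, 552],
    "incongruent": [601, 640, 588, 655, 610],
}

def mean(xs):
    return sum(xs) / len(xs)

# Interference cost: slowing caused by a conflicting distractor word.
interference_cost = mean(rts_ms["incongruent"]) - mean(rts_ms["congruent"])
# interference cost ~ 65.8 ms for these toy numbers
```

The brain-behavior correlations reported in the abstract relate per-participant costs of this kind to interference-related activation under AND versus placebo.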

  10. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    PubMed

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.
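
The vMMN reported here is, generically, a deviant-minus-standard difference wave evaluated in a latency window. A schematic sketch of that computation (synthetic arrays, not the study's MEG data; the window is illustrative):

```python
import numpy as np

def difference_wave(deviant, standard):
    """Average each condition over trials, then subtract.
    deviant, standard: (n_trials, n_samples) epoch arrays."""
    return deviant.mean(axis=0) - standard.mean(axis=0)

def mean_amplitude(wave, times, lo, hi):
    """Mean amplitude of a waveform within a latency window (s)."""
    mask = (times >= lo) & (times <= hi)
    return wave[mask].mean()

# Synthetic demo: deviant epochs carry an extra deflection of 0.5
# (arbitrary units) relative to standards.
times = np.linspace(-0.1, 0.5, 601)
standard = np.ones((40, 601))
deviant = np.ones((40, 601)) + 0.5
vmmn = difference_wave(deviant, standard)
window_amp = mean_amplitude(vmmn, times, 0.25, 0.35)  # e.g. an M300-like window
```

Group effects such as the missing vMMN in the dysphoric group correspond to this difference-wave amplitude not differing reliably from zero in the analyzed window.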

  11. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    PubMed Central

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments with varying experience of humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, indicating an influence of the amount of exposure to humans. In addition, there was some evidence for influences of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  12. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    PubMed

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  13. Emotion-attention interactions in recognition memory for distractor faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2010-04-01

    Effective filtering of distractor information has been shown to be dependent on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we wanted to explore the recognition memory for emotional distractors especially as a function of focused attention and distributed attention by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for distractor neutral and emotional faces. Participants performed a color discrimination task (low-load) or letter identification task (high-load) with a letter string display in Experiment 1 and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under conditions of less focused or distributed attention. When attention is more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information like sad or happy is associated with focused or distributed attention respectively. Distractor processing with emotional information also has implications for theories of attention.
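
Recognition memory performance of the kind measured here is commonly summarized with the signal-detection sensitivity index d'. A sketch of that computation (the abstract does not state the authors' exact scoring, so the correction and counts below are illustrative):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

    A log-linear correction (add 0.5 to each count) keeps the
    z-transform finite when a rate would otherwise be 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Toy comparison: better distractor memory for one face category
# than the other, as in the focused- vs distributed-attention contrast.
d_happy = d_prime(hits=18, misses=2, false_alarms=4, correct_rejections=16)
d_sad = d_prime(hits=12, misses=8, false_alarms=6, correct_rejections=14)
```

With such scores, the reported interaction amounts to the category ordering (happy > sad or the reverse) flipping between attention conditions.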

  14. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    PubMed Central

    Aviezer, Hillel; Hassin, Ran R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  15. Amygdala and whole-brain activity to emotional faces distinguishes major depressive disorder and bipolar disorder.

    PubMed

    Fournier, Jay C; Keener, Matthew T; Almeida, Jorge; Kronhaus, Dina M; Phillips, Mary L

    2013-11-01

    It can be clinically difficult to distinguish depressed individuals with bipolar disorder (BD) and major depressive disorder (MDD). To examine potential biomarkers of difference between the two disorders, the current study examined differences in the functioning of emotion-processing neural regions during a dynamic emotional faces task. During functional magnetic resonance imaging, healthy control adults (HC) (n = 29) and depressed adults with MDD (n = 30) and BD (n = 22) performed an implicit emotional-faces task in which they identified a color label superimposed on neutral faces that dynamically morphed into one of four emotional faces (angry, fearful, sad, happy). We compared neural activation between the groups in an amygdala region-of-interest and at the whole-brain level. Adults with MDD showed significantly greater activity than adults with BD in the left amygdala to the anger condition (p = 0.01). Results of whole-brain analyses (at p < 0.005, k ≥ 20) revealed that adults with BD showed greater activity to sad faces in temporoparietal regions, primarily in the left hemisphere, whereas individuals with MDD demonstrated greater activity than those with BD to displays of anger, fear, and happiness. Many of the observed BD-MDD differences represented abnormalities in functioning compared to HC. We observed a dissociation between depressed adults with BD and MDD in the processing of emerging emotional faces. Those with BD showed greater activity during mood-congruent (i.e., sad) faces, whereas those with MDD showed greater activity for mood-incongruent (i.e., fear, anger, and happy) faces. Such findings may reflect markers of differences between BD and MDD depression in underlying pathophysiological processes.

  16. Misinterpretation of facial expressions of emotion in verbal adults with autism spectrum disorder.

    PubMed

    Eack, Shaun M; Mazefsky, Carla A; Minshew, Nancy J

    2015-04-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum disorder and 30 age- and gender-matched volunteers without autism spectrum disorder to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with autism spectrum disorder. In particular, adults with autism spectrum disorder uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to nonemotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with autism spectrum disorder. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in autism spectrum disorder.

  17. Electrocortical processing of social signals of threat in combat-related post-traumatic stress disorder.

    PubMed

    MacNamara, Annmarie; Post, David; Kennedy, Amy E; Rabinak, Christine A; Phan, K Luan

    2013-10-01

    Post-traumatic stress disorder (PTSD) is characterized by avoidance, emotional numbing, increased arousal and hypervigilance for threat following a trauma. Thirty-three veterans (19 with PTSD, 14 without PTSD) who had experienced combat trauma while on deployment in Iraq and/or Afghanistan completed an emotional faces matching task while electroencephalography was recorded. Vertex positive potentials (VPPs) elicited by happy, angry and fearful faces were smaller in veterans with versus without PTSD. In addition, veterans with PTSD exhibited smaller late positive potentials (LPPs) to angry faces and greater intrusive symptoms predicted smaller LPPs to fearful faces in the PTSD group. Veterans with PTSD were also less accurate at identifying angry faces, and accuracy decreased in the PTSD group as hyperarousal symptoms increased. These findings show reduced early processing of emotional faces, irrespective of valence, and blunted prolonged processing of social signals of threat in conjunction with impaired perception for angry faces in PTSD.

  18. The role of the cannabinoid receptor in adolescents' processing of facial expressions.

    PubMed

    Ewald, Anais; Becker, Susanne; Heinrich, Angela; Banaschewski, Tobias; Poustka, Luise; Bokde, Arun; Büchel, Christian; Bromberg, Uli; Cattrell, Anna; Conrod, Patricia; Desrivières, Sylvane; Frouin, Vincent; Papadopoulos-Orfanos, Dimitri; Gallinat, Jürgen; Garavan, Hugh; Heinz, Andreas; Walter, Henrik; Ittermann, Bernd; Gowland, Penny; Paus, Tomáš; Martinot, Jean-Luc; Paillère Martinot, Marie-Laure; Smolka, Michael N; Vetter, Nora; Whelan, Rob; Schumann, Gunter; Flor, Herta; Nees, Frauke

    2016-01-01

    The processing of emotional faces is an important prerequisite for adequate social interactions in daily life, and might thus specifically be altered in adolescence, a period marked by significant changes in social emotional processing. Previous research has shown that the cannabinoid receptor CB1R is associated with longer gaze duration and increased brain responses in the striatum to happy faces in adults, yet, for adolescents, it is not clear whether an association between CB1R and face processing exists. In the present study we investigated genetic effects of the two CB1R polymorphisms, rs1049353 and rs806377, on the processing of emotional faces in healthy adolescents. They participated in functional magnetic resonance imaging during a Faces Task, watching blocks of video clips with angry and neutral facial expressions, and completed a Morphed Faces Task in the laboratory where they looked at different facial expressions that switched from anger to fear or sadness or from happiness to fear or sadness, and labelled them according to these four emotional expressions. A-allele versus GG-carriers in rs1049353 displayed earlier recognition of facial expressions changing from anger to sadness or fear, but not for expressions changing from happiness to sadness or fear, and higher brain responses to angry, but not neutral, faces in the amygdala and insula. For rs806377 no significant effects emerged. This suggests that rs1049353 is involved in the processing of negative facial expressions with relation to anger in adolescence. These findings add to our understanding of social emotion-related mechanisms in this life period.

  19. Relation between Amygdala Structure and Function in Adolescents with Bipolar Disorder

    ERIC Educational Resources Information Center

    Kalmar, Jessica H.; Wang, Fei; Chepenik, Lara G.; Womer, Fay Y.; Jones, Monique M.; Pittman, Brian; Shah, Maulik P.; Martin, Andres; Constable, R. Todd; Blumberg, Hilary P.

    2009-01-01

    Adolescents with bipolar disorder showed decreased amygdala volume and increased amygdala response to emotional faces. Amygdala volume is inversely related to activation during emotional face processing.

  20. Event-Related Potentials of Bottom-Up and Top-Down Processing of Emotional Faces

    PubMed Central

    Moradi, Afsane; Mehrinejad, Seyed Abolghasem; Ghadiri, Mohammad; Rezaei, Farzin

    2017-01-01

    Introduction: Emotional stimulus is processed automatically in a bottom-up way or can be processed voluntarily in a top-down way. Imaging studies have indicated that bottom-up and top-down processing are mediated through different neural systems. However, temporal differentiation of top-down versus bottom-up processing of facial emotional expressions has remained to be clarified. The present study aimed to explore the time course of these processes as indexed by the emotion-specific P100 and late positive potential (LPP) event-related potential (ERP) components in a group of healthy women. Methods: Fourteen female students of Alzahra University, Tehran, Iran aged 18–30 years, voluntarily participated in the study. The subjects completed 2 overt and covert emotional tasks during ERP acquisition. Results: The results indicated that fearful expressions significantly produced greater P100 amplitude compared to other expressions. Moreover, the P100 findings showed an interaction between emotion and processing conditions. Further analysis indicated that within the overt condition, fearful expressions elicited more P100 amplitude compared to other emotional expressions. Also, overt conditions created significantly more LPP latencies and amplitudes compared to covert conditions. Conclusion: Based on the results, early perceptual processing of fearful face expressions is enhanced in top-down way compared to bottom-up way. It also suggests that P100 may reflect an attentional bias toward fearful emotions. However, no such differentiation was observed within later processing stages of face expressions, as indexed by the ERP LPP component, in a top-down versus bottom-up way. Overall, this study provides a basis for further exploring of bottom-up and top-down processes underlying emotion and may be typically helpful for investigating the temporal characteristics associated with impaired emotional processing in psychiatric disorders. PMID:28446947
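
Component effects like the P100 and LPP results above are typically quantified as peak or mean amplitudes inside fixed latency windows of the averaged ERP. A minimal sketch of window-based peak scoring (synthetic waveform; the windows are illustrative, not the authors' exact analysis parameters):

```python
import numpy as np

def peak_in_window(erp, times, lo, hi):
    """Return (peak latency, peak amplitude) of an averaged ERP
    within a latency window given in seconds."""
    mask = (times >= lo) & (times <= hi)
    i = np.argmax(erp[mask])
    return times[mask][i], erp[mask][i]

# Synthetic averaged ERP: a P100-like positivity peaking near 100 ms.
times = np.linspace(0.0, 0.5, 501)  # 0-500 ms in 1 ms steps
erp = np.exp(-((times - 0.100) ** 2) / (2 * 0.015 ** 2))
p100_lat, p100_amp = peak_in_window(erp, times, 0.080, 0.130)
```

An LPP effect would be scored the same way, usually as a mean amplitude over a later window (e.g. several hundred ms post-stimulus), and condition differences (overt vs. covert, fearful vs. other expressions) are then tested on these per-participant measures.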

  1. Right hemisphere or valence hypothesis, or both? The processing of hybrid faces in the intact and callosotomized brain.

    PubMed

    Prete, Giulia; Laeng, Bruno; Fabri, Mara; Foschi, Nicoletta; Tommasi, Luca

    2015-02-01

    The valence hypothesis and the right hemisphere hypothesis in emotion processing have been alternatively supported. To better disentangle the two accounts, we carried out two studies, presenting healthy participants and an anterior callosotomized patient with 'hybrid faces', stimuli created by superimposing the low spatial frequencies of an emotional face to the high spatial frequencies of the same face in a neutral expression. In both studies we asked participants to judge the friendliness level of stimuli, which is an indirect measure of the processing of emotional information, despite this remaining "invisible". In Experiment 1 we presented hybrid faces in a divided visual field paradigm using different tachistoscopic presentation times; in Experiment 2 we presented hybrid chimeric faces in canonical view and upside-down. In Experiments 3 and 4 we tested a callosotomized patient, with spared splenium, in similar paradigms as those used in Experiments 1 and 2. Results from Experiments 1 and 3 were consistent with the valence hypothesis, whereas results of Experiments 2 and 4 were consistent with the right hemisphere hypothesis. This study confirms that the low spatial frequencies of emotional faces influence the social judgments of observers, even when seen for 28 ms (Experiment 1), possibly by means of configural analysis (Experiment 2). The possible roles of the cortical and subcortical emotional routes in these tasks are discussed in the light of the results obtained in the callosotomized patient. We propose that the right hemisphere and the valence accounts are not mutually exclusive, at least in the case of subliminal emotion processing.

  2. Developmental changes in the primacy of facial cues for emotion recognition.

    PubMed

    Leitzke, Brian T; Pollak, Seth D

    2016-04-01

    There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to reconcile these opposing views. The current study tested for age-related differences in how individuals prioritize viewing emotional faces versus contexts when making emotion judgments. To do so, we asked 4-, 8-, and 12-year-old children as well as college students to categorize facial expressions of emotion that were presented with scenes that were either congruent or incongruent with the facial displays. During this time, we recorded participants' gaze patterns via eye tracking. College students directed their visual attention primarily to the face, regardless of contextual information. Children, however, divided their attention between both the face and the context as sources of emotional information depending on the valence of the context. These findings reveal a developmental shift in how individuals process and integrate emotional cues. (c) 2016 APA, all rights reserved.

  3. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    PubMed Central

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol consumption, but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of orbitofrontal cortex (OFC)-amygdala connectivity. While the importance of emotional processing for social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed, and we outline a research agenda to address gaps in the literature. PMID:24920135

  4. Face emotion recognition is related to individual differences in psychosis-proneness.

    PubMed

    Germine, L T; Hooker, C I

    2011-05-01

    Deficits in face emotion recognition (FER) in schizophrenia are well documented, and have been proposed as a potential intermediate phenotype for schizophrenia liability. However, research on the relationship between psychosis vulnerability and FER has mixed findings and methodological limitations. Moreover, no study has yet characterized the relationship between FER ability and level of psychosis-proneness. If FER ability varies continuously with psychosis-proneness, this suggests a relationship between FER and polygenic risk factors. We tested two large internet samples to see whether psychometric psychosis-proneness, as measured by the Schizotypal Personality Questionnaire-Brief (SPQ-B), is related to differences in face emotion identification and discrimination or other face processing abilities. Experiment 1 (n=2332) showed that psychosis-proneness predicts face emotion identification ability but not face gender identification ability. Experiment 2 (n=1514) demonstrated that psychosis-proneness also predicts performance on face emotion but not face identity discrimination. The tasks in Experiment 2 used identical stimuli and task parameters, differing only in emotion/identity judgment. Notably, the relationships demonstrated in Experiments 1 and 2 persisted even when individuals with the highest psychosis-proneness levels (the putative high-risk group) were excluded from analysis. Our data suggest that FER ability is related to individual differences in psychosis-like characteristics in the normal population, and that these differences cannot be accounted for by differences in face processing and/or visual perception. Our results suggest that FER may provide a useful candidate intermediate phenotype.

  5. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    PubMed

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    PubMed

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Emotion processing in chimeric faces: hemispheric asymmetries in expression and recognition of emotions.

    PubMed

    Indersmitten, Tim; Gur, Ruben C

    2003-05-01

    Since the discovery of facial asymmetries in emotional expressions of humans and other primates, hypotheses have related the greater left-hemiface intensity to right-hemispheric dominance in emotion processing. However, the difficulty of creating true frontal views of facial expressions in two-dimensional photographs has confounded efforts to better understand the phenomenon. We have recently described a method for obtaining three-dimensional photographs of posed and evoked emotional expressions and used these stimuli to investigate both intensity of expression and accuracy of recognizing emotion in chimeric faces constructed from only left- or right-side composites. The participant population included 38 (19 male, 19 female) African-American, Caucasian, and Asian adults. They were presented with chimeric composites generated from faces of eight actors and eight actresses showing four emotions: happiness, sadness, anger, and fear, each in posed and evoked conditions. We replicated the finding that emotions are expressed more intensely in the left hemiface for all emotions and conditions, with the exception of evoked anger, which was expressed more intensely in the right hemiface. In contrast, the results indicated that emotional expressions are recognized more efficiently in the right hemiface, indicating that the right hemiface expresses emotions more accurately. The double dissociation between the laterality of expression intensity and that of recognition efficiency supports the notion that the two kinds of processes may have distinct neural substrates. Evoked anger is uniquely expressed more intensely and accurately on the side of the face that projects to the viewer's right hemisphere, dominant in emotion recognition.

  8. Sex differences in functional activation patterns revealed by increased emotion processing demands.

    PubMed

    Hall, Geoffrey B C; Witelson, Sandra F; Szechtman, Henry; Nahmias, Claude

    2004-02-09

    Two [(15)O] PET studies assessed sex differences in regional brain activation during the recognition of emotional stimuli. Study I revealed that the recognition of emotion in visual faces resulted in bilateral frontal activation in women and unilateral right-sided activation in men. In Study II, the complexity of the emotional face task was increased through the addition of associated auditory emotional stimuli. Men again showed unilateral frontal activation, in this case to the left, whereas women did not show bilateral frontal activation but showed greater limbic activity. These results suggest that when processing broader cross-modal emotional stimuli, men engage more in associative cognitive strategies while women draw more on primary emotional references.

  9. Gender differences in brain networks supporting empathy.

    PubMed

    Schulte-Rüther, Martin; Markowitsch, Hans J; Shah, N Jon; Fink, Gereon R; Piefke, Martina

    2008-08-01

    Females frequently score higher on standard tests of empathy, social sensitivity, and emotion recognition than do males. It remains to be clarified, however, whether these gender differences are associated with gender-specific neural mechanisms of emotional social cognition. We investigated gender differences in an emotion attribution task using functional magnetic resonance imaging. Subjects either focused on their own emotional response to emotion-expressing faces (SELF-task) or evaluated the emotional state expressed by the faces (OTHER-task). Behaviorally, females rated SELF-related emotions significantly more strongly than males. Across the sexes, SELF- and OTHER-related processing of facial expressions activated a network of medial and lateral prefrontal, temporal, and parietal brain regions involved in emotional perspective taking. During SELF-related processing, females recruited the right inferior frontal cortex and superior temporal sulcus more strongly than males. In contrast, there was increased neural activity in the left temporoparietal junction in males (relative to females). When performing the OTHER-task, females showed increased activation of the right inferior frontal cortex, whereas there were no differential activations in males. The data suggest that females recruit areas containing mirror neurons to a higher degree than males during both SELF- and OTHER-related processing in empathic face-to-face interactions. This may underlie facilitated emotional "contagion" in females. Together with the observation that males differentially rely on the left temporoparietal junction (an area mediating the distinction between the SELF and OTHERS), the data suggest that females and males rely on different strategies when assessing their own emotions in response to other people.

  10. Emotional content modulates response inhibition and perceptual processing.

    PubMed

    Yang, Suyong; Luo, Wenbo; Zhu, Xiangru; Broster, Lucas S; Chen, Taolin; Li, Jinzhen; Luo, Yuejia

    2014-11-01

    In this study, event-related potentials were used to investigate the effect of emotion on response inhibition. Participants performed an emotional go/no-go task that required responses to human faces associated with a "go" valence (i.e., emotional, neutral) and response inhibition to human faces associated with a "no-go" valence. Emotional content impaired response inhibition, as evidenced by decreased response accuracy and N2 amplitudes in no-go trials. More importantly, emotional expressions elicited larger N170 amplitudes than neutral expressions, and this effect was larger in no-go than in go trials, indicating that the perceptual processing of emotional expression had priority in inhibitory trials. In no-go trials, correlation analysis showed that increased N170 amplitudes were associated with decreased N2 amplitudes. Taken together, our findings suggest that emotional content impairs response inhibition due to the prioritization of emotional content processing. Copyright © 2014 Society for Psychophysiological Research.

  11. Using an emotional saccade task to characterize executive functioning and emotion processing in attention-deficit hyperactivity disorder and bipolar disorder.

    PubMed

    Yep, Rachel; Soncin, Stephen; Brien, Donald C; Coe, Brian C; Marin, Alina; Munoz, Douglas P

    2018-04-23

    Despite distinct diagnostic criteria, attention-deficit hyperactivity disorder (ADHD) and bipolar disorder (BD) share cognitive and emotion processing deficits that complicate diagnoses. The goal of this study was to use an emotional saccade task to characterize executive functioning and emotion processing in adult ADHD and BD. Participants (21 control, 20 ADHD, 20 BD) performed an interleaved pro/antisaccade task (look toward vs. look away from a visual target, respectively) in which the sex of emotional face stimuli acted as the cue to perform either the pro- or antisaccade. Both patient groups made more direction (erroneous prosaccades on antisaccade trials) and anticipatory (saccades made before cue processing) errors than controls. Controls exhibited lower microsaccade rates preceding correct anti- vs. prosaccade initiation, but this task-related modulation was absent in both patient groups. Regarding emotion processing, the ADHD group performed worse than controls on neutral face trials, while the BD group performed worse than controls on trials presenting faces of all valences. These findings support the role of fronto-striatal circuitry in mediating response inhibition deficits in both ADHD and BD, and suggest that such deficits are exacerbated in BD during emotion processing, presumably via dysregulated limbic system circuitry involving the anterior cingulate and orbitofrontal cortex. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Emotion based attentional priority for storage in visual short-term memory.

    PubMed

    Simione, Luca; Calabrese, Lucia; Marucci, Francesco S; Belardinelli, Marta Olivetti; Raffone, Antonino; Maratos, Frances A

    2014-01-01

    A plethora of research demonstrates that the processing of emotional faces is prioritised over non-emotive stimuli when cognitive resources are limited (this is known as 'emotional superiority'). However, there is debate as to whether competition for processing resources results in emotional superiority per se, or more specifically, threat superiority. Therefore, to investigate prioritisation of emotional stimuli for storage in visual short-term memory (VSTM), we devised an original VSTM report procedure using schematic (angry, happy, neutral) faces in which processing competition was manipulated. In Experiment 1, display exposure time was manipulated to create competition between stimuli. Participants (n = 20) had to recall a probed stimulus from a set size of four under high (150 ms array exposure duration) and low (400 ms array exposure duration) perceptual processing competition. For the high competition condition (i.e. 150 ms exposure), results revealed an emotional superiority effect per se. In Experiment 2 (n = 20), we increased competition by manipulating set size (three versus five stimuli), whilst maintaining a constrained array exposure duration of 150 ms. Here, for the five-stimulus set size (i.e. maximal competition) only threat superiority emerged. These findings demonstrate attentional prioritisation for storage in VSTM for emotional faces. We argue that task demands modulated the availability of processing resources and consequently the relative magnitude of the emotional/threat superiority effect, with only threatening stimuli prioritised for storage in VSTM under more demanding processing conditions. Our results are discussed in light of models and theories of visual selection; they not only combine the two strands of research (i.e. visual selection and emotion) but also highlight that a critical factor in the processing of emotional stimuli is the availability of processing resources, which is further constrained by task demands.

  13. Social incentives improve deliberative but not procedural learning in older adults.

    PubMed

    Gorlick, Marissa A; Maddox, W Todd

    2015-01-01

    Age-related deficits are seen across tasks where learning depends on asocial feedback processing; however, plasticity has been observed in some of the same tasks in social contexts, suggesting a novel way to attenuate deficits. Socioemotional selectivity theory suggests this plasticity is due to a deliberative motivational shift toward achieving well-being with age (positivity effect) that reverses when executive processes are limited (negativity effect). The present study examined the interaction of feedback valence (positive, negative) and social salience (emotional face feedback: happy, angry; asocial point feedback: gain, loss) on learning in a deliberative task that challenges executive processes and a procedural task that does not. We predicted that angry face feedback would improve learning in the deliberative task when executive function is challenged. We tested two competing hypotheses regarding the interactive effects of deliberative emotional biases on automatic feedback processing: (1) if deliberative emotion regulation and automatic feedback processing are interactive, we expect happy face feedback to improve learning and angry face feedback to impair learning in older adults because cognitive control is available; (2) if they are not interactive, we predict that emotional face feedback will not affect procedural learning regardless of valence. Results demonstrate that older adults show persistent deficits relative to younger adults during procedural category learning, suggesting that deliberative emotional biases do not interact with automatic feedback processing. Interestingly, a subgroup of older adults identified as potentially using deliberative strategies tended to learn as well as younger adults with angry relative to happy feedback, matching the pattern observed in the deliberative task. Results suggest that deliberative emotional biases can improve deliberative learning but have no effect on procedural learning.

  14. Passive and Motivated Perception of Emotional Faces: Qualitative and Quantitative Changes in the Face Processing Network

    PubMed Central

    Skelly, Laurie R.; Decety, Jean

    2012-01-01

    Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains. PMID:22768287

  15. Visual scanning behavior during processing of emotional faces in older adults with major depression.

    PubMed

    Noiret, Nicolas; Carvalho, Nicolas; Laurent, Éric; Vulliez, Lauriane; Bennabi, Djamila; Chopard, Gilles; Haffen, Emmanuel; Nicolier, Magali; Monnin, Julie; Vandel, Pierre

    2015-01-01

    Although several studies have suggested that younger adults with depression display depression-related biases during the processing of emotional faces, there remains a lack of data concerning these biases in older adults. The aim of our study was to assess scanning behavior during the processing of emotional faces in depressed older adults. Older adults with and without depression viewed happy, neutral, or sad portraits while their eye movements were recorded. Depressed older adults spent less time, with fewer fixations, on emotional features than healthy older adults, but only for sad and neutral portraits, with no significant difference for happy portraits. These results suggest disengagement from sad and neutral faces in depressed older adults, which is not consistent with standard theoretical proposals on congruence biases in depression. Aging and the associated changes in emotional regulation may explain this expression of depression-related biases. Our preliminary results suggest that information processing in depression is a more complex phenomenon than merely a general search for mood-congruent stimuli or a general disengagement from all kinds of stimuli. These findings underline that care must be taken when evaluating potential variables, such as aging, which interact with depression and selectively influence the choice of relevant stimulus dimensions.
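The fixation measures reported in this record (time spent and number of fixations on emotional features) reduce to simple area-of-interest (AOI) statistics over an eye-tracking record. A minimal sketch, assuming rectangular AOIs and a hypothetical `Fixation` record type; the study's actual AOI definitions and data format are not specified here.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Fixation:
    x: float           # gaze position in screen pixels
    y: float
    duration_ms: float

def aoi_metrics(fixations: List[Fixation],
                aoi: Tuple[float, float, float, float]) -> Tuple[float, int]:
    """Total dwell time (ms) and fixation count inside a rectangular AOI.

    `aoi` is (x0, y0, x1, y1) in the same pixel coordinates as the gaze data.
    """
    x0, y0, x1, y1 = aoi
    inside = [f for f in fixations if x0 <= f.x <= x1 and y0 <= f.y <= y1]
    return sum(f.duration_ms for f in inside), len(inside)

# Example: two of three fixations land inside a hypothetical "eyes" region
fixations = [Fixation(120, 80, 250), Fixation(130, 85, 310), Fixation(400, 300, 180)]
print(aoi_metrics(fixations, (100, 60, 200, 100)))
```

Group comparisons like the one above then amount to comparing these per-portrait dwell times and counts between depressed and healthy viewers.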

  16. Enhanced processing of threat stimuli under limited attentional resources.

    PubMed

    De Martino, Benedetto; Kalisch, Raffael; Rees, Geraint; Dolan, Raymond J

    2009-01-01

    The ability to process stimuli that convey potential threat under conditions of limited attentional resources confers adaptive advantages. This study examined the neurobiological underpinnings of this capacity. Employing an attentional blink paradigm in conjunction with functional magnetic resonance imaging, we manipulated the salience of the second of two face target stimuli (T2) by varying its emotionality. Behaviorally, fearful T2 faces were identified significantly more often than neutral faces. Activity in the fusiform face area increased with correct identification of T2 faces. Enhanced activity in rostral anterior cingulate cortex (rACC) accounted for the benefit in detection of fearful stimuli, reflected in a significant interaction between target valence and correct identification. Thus, under conditions of limited attentional resources, activation in rACC correlated with enhanced processing of emotional stimuli. We suggest that these data support a model in which a prefrontal "gate" mechanism controls conscious access to emotional information under conditions of limited attentional resources.

  17. Reduced specificity in emotion judgment in people with autism spectrum disorder

    PubMed Central

    Wang, Shuo; Adolphs, Ralph

    2017-01-01

    There is a conflicting literature on facial emotion processing in autism spectrum disorder (ASD): both typical and atypical performance have been reported, and inconsistencies in the literature may stem from different processes examined (emotion judgment, face perception, fixations) as well as differences in participant populations. Here we conducted a detailed investigation of the ability to discriminate graded emotions shown in morphs of fear-happy faces, in a well-characterized high-functioning sample of participants with ASD and matched controls. Signal detection approaches were used in the analyses, and concurrent high-resolution eye-tracking was collected. Although people with ASD had typical thresholds for categorical fear and confidence judgments, their psychometric specificity to detect emotions across the entire range of intensities was reduced. However, fixation patterns onto the stimuli were typical and could not account for the reduced specificity of emotion judgment. Together, our results argue for a subtle and specific deficit in emotion perception in ASD that, from a signal detection perspective, is best understood as a reduced specificity due to increased noise in central processing of the face stimuli. PMID:28343960
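The signal detection approach mentioned in this record can be illustrated with the standard sensitivity index d′, which separates discrimination ability from response bias. A minimal sketch, not the study's actual analysis pipeline; the log-linear correction and the example trial counts are illustrative assumptions.

```python
from statistics import NormalDist

def d_prime(hits: int, misses: int,
            false_alarms: int, correct_rejections: int) -> float:
    """Sensitivity index d' for a two-alternative emotion judgment.

    Uses the log-linear correction (add 0.5 per cell) so that extreme
    hit or false-alarm rates of 0 or 1 remain finite under the z-transform.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf   # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Illustrative counts: 45/50 "fear" responses to fearful morphs,
# 10/50 "fear" responses to happy morphs
print(round(d_prime(45, 5, 10, 40), 2))
```

Higher d′ indicates better discrimination of the emotion signal from noise; the reduced specificity reported above would appear as lower d′ at intermediate morph intensities despite typical categorical thresholds.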

  18. Effects of touch on emotional face processing: A study of event-related potentials, facial EMG and cardiac activity.

    PubMed

    Spapé, M M; Harjunen, Ville; Ravaja, N

    2017-03-01

    Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity. Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional stimuli, suggesting altered affective stimulus processing. However, as these studies approached emotion from a purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more general feature such as salience. Here, we investigated how simple tactile primes modulate event-related potentials (ERPs), facial EMG, and cardiac responses to pictures of facial expressions of emotion. All measures replicated known effects of emotional face processing: disgust and fear modulated early ERPs, anger increased the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch may additively affect general stimulus processing, but it does not bias or modulate immediate affective evaluation. Copyright © 2017. Published by Elsevier B.V.

  19. Guanfacine modulates the influence of emotional cues on prefrontal cortex activation for cognitive control.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Fan, Jin; Halperin, Jeffrey M; Newcorn, Jeffrey H

    2013-03-01

    Functional interactions between limbic regions that process emotions and frontal networks that guide response functions provide a substrate for emotional cues to influence behavior. Stimulation of postsynaptic α₂ adrenoceptors enhances the function of prefrontal regions in these networks. However, the impact of this stimulation on the emotional biasing of behavior has not been established. This study tested the effect of the postsynaptic α₂ adrenoceptor agonist guanfacine on the emotional biasing of response execution and inhibition in prefrontal cortex. Fifteen healthy young adults were scanned twice with functional magnetic resonance imaging while performing a face emotion go/no-go task following counterbalanced administration of single doses of oral guanfacine (1 mg) and placebo in a double-blind, cross-over design. Lower perceptual sensitivity and less response bias for sad faces resulted in fewer correct responses compared to happy and neutral faces but had no effect on correct inhibitions. Guanfacine increased the sensitivity and bias selectively for sad faces, resulting in response accuracy comparable to happy and neutral faces, and reversed the valence-dependent variation in response-related activation in left dorsolateral prefrontal cortex (DLPFC), resulting in enhanced activation for response execution cued by sad faces relative to happy and neutral faces, in line with other frontoparietal regions. These results provide evidence that guanfacine stimulation of postsynaptic α₂ adrenoceptors moderates DLPFC activation associated with the emotional biasing of response execution processes. The findings have implications for the α₂ adrenoceptor agonist treatment of attention-deficit hyperactivity disorder.

  20. Implicit Recognition of Familiar and Unfamiliar Faces in Schizophrenia: A Study of the Skin Conductance Response in Familiarity Disorders.

    PubMed

    Ameller, Aurely; Picard, Aline; D'Hondt, Fabien; Vaiva, Guillaume; Thomas, Pierre; Pins, Delphine

    2017-01-01

    Familiarity is a subjective sensation that contributes to person recognition. This process is described as an emotion-based memory trace of previous meetings and could be disrupted in schizophrenia. Consequently, familiarity disorders could be involved in the impaired social interactions observed in patients with schizophrenia. Previous studies have primarily focused on famous people recognition. Our aim was to identify underlying features, such as emotional disturbances, that may contribute to familiarity disorders in schizophrenia. We hypothesized that patients with familiarity disorders would exhibit a lack of familiarity that could be detected by a flattened skin conductance response (SCR). The SCR was recorded to test the hypothesis that emotional reactivity disturbances occur in patients with schizophrenia during the categorization of specific familiar, famous, and unknown faces as male or female. Forty-eight subjects were divided into 3 matched groups of 16 subjects each: control subjects, schizophrenic people with familiarity disorders, and schizophrenic people without familiarity disorders. Emotional arousal was reflected by the skin conductance measures. The control subjects and the patients without familiarity disorders showed a differential emotional response to the specific familiar faces compared with the unknown faces. Nevertheless, overall, the schizophrenic patients without familiarity disorders showed a weaker response across conditions than the control subjects. In contrast, the patients with familiarity disorders did not show any significant differences in their emotional response to the faces, regardless of the condition. Only patients with familiarity disorders fail to exhibit a difference in emotional response between familiar and non-familiar faces; these patients likely process familiar faces emotionally in the same way as unknown faces. Hence, lower feelings of familiarity in schizophrenia may be a premise enabling the emergence of familiarity disorders.

  1. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    PubMed Central

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent accounts in social cognition suggest a dual-process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in the respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures further supported the tasks' external validity. Between-group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions. PMID:23805122
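The abstract above reports a "speed-accuracy composite measure" without giving its formula. One common way to combine the two is an inverse-efficiency score (mean correct reaction time divided by proportion correct); the sketch below uses that formulation as an illustrative assumption, not as the authors' actual computation:

```python
def inverse_efficiency(rts_ms, correct):
    """Inverse-efficiency score: mean RT on correct trials / proportion correct.

    Lower scores indicate better (faster and more accurate) performance.
    NOTE: this is a hypothetical sketch; the abstract does not specify
    which speed-accuracy composite was actually used.
    """
    correct_rts = [rt for rt, ok in zip(rts_ms, correct) if ok]
    accuracy = sum(correct) / len(correct)
    if accuracy == 0:
        raise ValueError("accuracy is zero; the score is undefined")
    return (sum(correct_rts) / len(correct_rts)) / accuracy

# Example: four trials, three answered correctly.
score = inverse_efficiency([520, 610, 480, 700], [True, True, False, True])
```

Dividing by accuracy penalizes participants who trade accuracy for speed, so a single number can be compared across groups.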

  2. Recognition of facial emotions among maltreated children with high rates of post-traumatic stress disorder

    PubMed Central

    Masten, Carrie L.; Guyer, Amanda E.; Hodgdon, Hilary B.; McClure, Erin B.; Charney, Dennis S.; Ernst, Monique; Kaufman, Joan; Pine, Daniel S.; Monk, Christopher S.

    2008-01-01

    Objective The purpose of this study is to examine processing of facial emotions in a sample of maltreated children showing high rates of post-traumatic stress disorder (PTSD). Maltreatment during childhood has been associated independently with both atypical processing of emotion and the development of PTSD. However, research has provided little evidence indicating how high rates of PTSD might relate to maltreated children's processing of emotions. Method Participants' reaction time and labeling of emotions were measured using a morphed facial emotion identification task. Participants included a diverse sample of maltreated children with and without PTSD and controls ranging in age from 8 to 15 years. Maltreated children had been removed from their homes and placed in state custody following experiences of maltreatment. Diagnoses of PTSD and other disorders were determined through a combination of parent, child, and teacher reports. Results Maltreated children displayed faster reaction times than controls when labeling emotional facial expressions, and this result was most pronounced for fearful faces. Relative to children who were not maltreated, maltreated children both with and without PTSD showed faster response times when identifying fearful faces. There was no group difference in the accuracy of labeling the different facial emotions. Conclusions Maltreated children show a heightened ability to identify fearful faces, evidenced by faster reaction times relative to controls. This association between maltreatment and atypical processing of emotion is independent of PTSD diagnosis. PMID:18155144

  3. Deficits of unconscious emotional processing in patients with major depression: An ERP study.

    PubMed

    Zhang, Dandan; He, Zhenhong; Chen, Yuming; Wei, Zhaoguo

    2016-07-15

    Major depressive disorder (MDD) is associated with behavioral and neurobiological evidence of negative bias in unconscious emotional processing. However, little is known about the time course of this deficit. The current study aimed to explore the unconscious processing of emotional facial expressions in MDD patients by means of event-related potentials (ERPs). The ERP responses to subliminally presented happy/neutral/sad faces were recorded in 26 medication-free patients and 26 healthy controls in a backward masking task. Three ERP components were compared between patients and controls. Detection accuracy was at chance level for both groups, suggesting that the task was performed in the absence of conscious awareness of the emotional stimuli. Robust emotion×group interactions were observed in P1, N170 and P3. Compared with the neutral faces, 1) the patients showed larger P1 for sad and smaller P1 for happy faces, whereas the controls showed the inverse P1 pattern; 2) the controls exhibited larger N170 in the happy but not in the sad trials, whereas the patients had comparably larger N170 amplitudes in both sad and happy trials; 3) although both groups exhibited larger P3 for emotional faces, the patients showed a priority for sad trials while the controls showed a priority for happy trials. Our data suggest that a negative processing bias exists at the unconscious level in individuals with MDD. The ERP measures indicate that unconscious emotional processing in MDD patients deviates across three successive processing stages. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Access of emotional information to visual awareness in patients with major depressive disorder.

    PubMed

    Sterzer, P; Hilgenfeldt, T; Freudenberg, P; Bermpohl, F; Adli, M

    2011-08-01

    According to cognitive theories of depression, negative biases affect most cognitive processes including perception. Such depressive perception may result not only from biased cognitive appraisal but also from automatic processing biases that influence the access of sensory information to awareness. Twenty patients with major depressive disorder (MDD) and 20 healthy control participants underwent behavioural testing with a variant of binocular rivalry, continuous flash suppression (CFS), to investigate the potency of emotional visual stimuli to gain access to awareness. While a neutral, fearful, happy or sad emotional face was presented to one eye, high-contrast dynamic patterns were presented to the other eye, resulting in initial suppression of the face from awareness. Participants indicated the location of the face with a key press as soon as it became visible. The modulation of suppression time by emotional expression was taken as an index of unconscious emotion processing. We found a significant difference in the emotional modulation of suppression time between MDD patients and controls. This difference was due to relatively shorter suppression of sad faces and, to a lesser degree, to longer suppression of happy faces in MDD. Suppression time modulation by sad expression correlated with change in self-reported severity of depression after 4 weeks. Our finding of preferential access to awareness for mood-congruent stimuli supports the notion that depressive perception may be related to altered sensory information processing even at automatic processing stages. Such perceptual biases towards mood-congruent information may reinforce depressed mood and contribute to negative cognitive biases. © Cambridge University Press 2011
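The study above takes "the modulation of suppression time by emotional expression" as its index of unconscious emotion processing. A minimal sketch of such an index is to average suppression times per expression and subtract the neutral baseline; the function and variable names here are illustrative assumptions, not taken from the study:

```python
from statistics import mean

def emotional_modulation(suppression_times):
    """Per-expression modulation of CFS suppression time relative to neutral.

    suppression_times: dict mapping expression -> list of suppression
    times (seconds). Negative values mean that expression broke through
    to awareness faster than neutral faces did.
    (Hypothetical sketch; the study's exact index may differ.)
    """
    baseline = mean(suppression_times["neutral"])
    return {emo: mean(times) - baseline
            for emo, times in suppression_times.items()
            if emo != "neutral"}

mod = emotional_modulation({
    "neutral": [2.0, 2.4, 2.2],
    "sad":     [1.6, 1.8, 1.7],   # shorter suppression: preferential access
    "happy":   [2.6, 2.8, 2.7],   # longer suppression
})
```

In this toy example, a negative value for sad faces would mirror the mood-congruent pattern the MDD patients showed.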

  5. Recognition memory for emotional and neutral faces: an event-related potential study.

    PubMed

    Johansson, Mikael; Mecklinger, Axel; Treese, Anne-Cécile

    2004-12-01

    This study examined emotional influences on the hypothesized event-related potential (ERP) correlates of familiarity and recollection (Experiment 1) and the states of awareness (Experiment 2) accompanying recognition memory for faces differing in facial affect. Participants made gender judgments to positive, negative, and neutral faces at study and, in the test phase, were instructed to discriminate between studied and nonstudied faces. Whereas old-new discrimination was unaffected by facial expression, negative faces were recollected to a greater extent than both positive and neutral faces, as reflected in the parietal ERP old-new effect and in the proportion of remember judgments. Moreover, emotion-specific modulations were observed in frontally recorded ERPs elicited by correctly rejected new faces, which concurred with a more liberal response criterion for emotional as compared to neutral faces. Taken together, the results are consistent with the view that processes promoting recollection are facilitated for negative events and that emotion may affect recognition performance by influencing criterion setting mediated by the prefrontal cortex.

  6. A Normalization Framework for Emotional Attention

    PubMed Central

    Zhang, Xilin; Japee, Shruti; Safiullah, Zaid; Ungerleider, Leslie G.

    2016-01-01

    The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects’ attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention. PMID:27870851

  7. A Normalization Framework for Emotional Attention.

    PubMed

    Zhang, Xilin; Japee, Shruti; Safiullah, Zaid; Mlynaryk, Nicole; Ungerleider, Leslie G

    2016-11-01

    The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects' attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention.

  8. Faces in context: a review and systematization of contextual influences on affective face processing.

    PubMed

    Wieser, Matthias J; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  9. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    PubMed

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  10. Cultural immersion alters emotion perception: Neurophysiological evidence from Chinese immigrants to Canada.

    PubMed

    Liu, Pan; Rigoulot, Simon; Pell, Marc D

    2017-12-01

    To explore how cultural immersion modulates emotion processing, this study examined how Chinese immigrants to Canada process multisensory emotional expressions, which were compared to existing data from two groups, Chinese and North Americans. Stroop and Oddball paradigms were employed to examine different stages of emotion processing. The Stroop task presented face-voice pairs expressing congruent/incongruent emotions, and participants actively judged the emotion of one modality while ignoring the other. A significant effect of cultural immersion was observed in the immigrants' behavioral performance, which showed greater interference from to-be-ignored faces, comparable with what was observed in North Americans. However, this effect was absent in their N400 data, which retained the same pattern as the Chinese. In the Oddball task, where immigrants passively viewed facial expressions with/without simultaneous vocal emotions, they exhibited a larger visual MMN for faces accompanied by voices, again mirroring patterns observed in Chinese. Correlation analyses indicated that the immigrants' duration of residence in Canada was associated with neural patterns (N400 and visual mismatch negativity) more closely resembling North Americans. Our data suggest that in multisensory emotion processing, adapting to a new culture first leads to behavioral accommodation followed by alterations in brain activities, providing new evidence on humans' neurocognitive plasticity in communication.

  11. Implicit Processing of Visual Emotions Is Affected by Sound-Induced Affective States and Individual Affective Traits

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira

    2014-01-01

    The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by a) a therapeutic music sequence (MusiCure), b) a noise sequence or c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper and pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals. PMID:25072162

  12. Major depression is associated with impaired processing of emotion in music as well as in facial and vocal stimuli.

    PubMed

    Naranjo, C; Kornreich, C; Campanella, S; Noël, X; Vandriette, Y; Gillain, B; de Longueville, X; Delatte, B; Verbanck, P; Constant, E

    2011-02-01

    The processing of emotional stimuli is thought to be negatively biased in major depression. This study investigates this issue using musical, vocal and facial affective stimuli. 23 depressed in-patients and 23 matched healthy controls were recruited. Affective information processing was assessed through musical, vocal and facial emotion recognition tasks. Depression, anxiety level and attention capacity were controlled. The depressed participants demonstrated less accurate identification of emotions than the control group in all three sorts of emotion-recognition tasks. The depressed group also gave higher intensity ratings than the controls when scoring negative emotions, and they were more likely to attribute negative emotions to neutral voices and faces. Our in-patient group might differ from the more general population of depressed adults. They were all taking anti-depressant medication, which may have had an influence on their emotional information processing. Major depression is associated with a general negative bias in the processing of emotional stimuli. Emotional processing impairment in depression is not confined to interpersonal stimuli (faces and voices), but extends to the recognition of emotion in music. © 2010 Elsevier B.V. All rights reserved.

  13. The Development of Emotional Face and Eye Gaze Processing

    ERIC Educational Resources Information Center

    Hoehl, Stefanie; Striano, Tricia

    2010-01-01

    Recent research has demonstrated that infants' attention towards novel objects is affected by an adult's emotional expression and eye gaze toward the object. The current event-related potential (ERP) study investigated how infants at 3, 6, and 9 months of age process fearful compared to neutral faces looking toward objects or averting gaze away…

  14. The neural representation of emotionally neutral faces and places in patients with panic disorder with agoraphobia.

    PubMed

    Petrowski, Katja; Wintermann, Gloria; Smolka, Michael N; Huebner, Thomas; Donix, Markus

    2014-01-01

    Panic disorder with agoraphobia (PD-A) has been associated with abnormal neural activity for threat-related stimuli (faces, places). Recent findings suggest disturbed neural processing of emotionally neutral stimuli at a more general level. Using functional magnetic resonance imaging (fMRI), we investigated the neural processing of emotionally neutral faces and places in PD-A. Fifteen patients with PD-A and fifteen healthy subjects participated in the study. When they perceived neutral faces and places, the patients with PD-A showed significantly less brain activity in the fusiform gyrus, the inferior occipital gyrus, the calcarine gyrus, the cerebellum, and the cuneus compared with the healthy controls. However, the patients with PD-A showed significantly more brain activity in the precuneus compared with control subjects. It was not possible to distinguish the agoraphobia-associated effects from possible contributions due to general anxiety induced by fMRI. For future investigations, an additional clinical control group with patients suffering from panic disorder without agoraphobia would be of interest. In addition, the psychopathology concerning the agoraphobic symptoms needs to be investigated in more detail. The findings suggest altered neural processing of emotionally neutral faces and places in patients with PD-A. Reduced neural activity in different brain regions may indicate difficulties in recognizing the emotional content in face and place stimuli due to anxiety-related hyper-arousal. © 2013 Published by Elsevier B.V.

  15. The face and its emotion: right N170 deficits in structural processing and early emotional discrimination in schizophrenic patients and relatives.

    PubMed

    Ibáñez, Agustín; Riveros, Rodrigo; Hurtado, Esteban; Gleichgerrcht, Ezequiel; Urquina, Hugo; Herrera, Eduar; Amoruso, Lucía; Reyes, Migdyrai Martin; Manes, Facundo

    2012-01-30

    Previous studies have reported facial emotion recognition impairments in schizophrenic patients, as well as abnormalities in the N170 component of the event-related potential. Current research on schizophrenia highlights the importance of complexly-inherited brain-based deficits. In order to examine the N170 markers of face structural and emotional processing, DSM-IV diagnosed schizophrenia probands (n=13), unaffected first-degree relatives from multiplex families (n=13), and control subjects (n=13) matched by age, gender and educational level, performed a categorization task which involved words and faces with positive and negative valence. The N170 component, while present in relatives and control subjects, was reduced in patients, not only for faces, but also for face-word differences, suggesting a deficit in structural processing of stimuli. Control subjects showed N170 modulation according to the valence of facial stimuli. However, this discrimination effect was found to be reduced both in patients and relatives. This is the first report showing N170 valence deficits in relatives. Our results suggest a generalized deficit affecting the structural encoding of faces in patients, as well as the emotion discrimination both in patients and relatives. Finally, these findings lend support to the notion that cortical markers of facial discrimination can be validly considered as vulnerability markers. © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Behavioral and neural indices of affective coloring for neutral social stimuli

    PubMed Central

    Schaefer, Stacey M; Lapate, Regina C; Schoen, Andrew J; Gresham, Lauren K; Mumford, Jeanette A; Davidson, Richard J

    2018-01-01

    Emotional processing often continues beyond the presentation of emotionally evocative stimuli, which can result in affective biasing or coloring of subsequently encountered events. Here, we describe neural correlates of affective coloring and examine how individual differences in affective style impact the magnitude of affective coloring. We conducted functional magnetic resonance imaging in 117 adults who passively viewed negative, neutral and positive pictures presented 2 s prior to neutral faces. Brain responses to neutral faces were modulated by the valence of preceding pictures, with greater activation for faces following negative (vs positive) pictures in the amygdala, dorsomedial and lateral prefrontal cortex, ventral visual cortices, posterior superior temporal sulcus, and angular gyrus. Three days after the magnetic resonance imaging scan, participants rated their memory and liking of previously encountered neutral faces. Individuals higher in trait positive affect and emotional reappraisal rated faces as more likable when preceded by emotionally arousing (negative or positive) pictures. In addition, greater amygdala responses to neutral faces preceded by positively valenced pictures were associated with greater memory for these faces 3 days later. Collectively, these results reveal individual differences in how emotions spill over onto the processing of unrelated social stimuli, resulting in persistent and affectively biased evaluations of such stimuli. PMID:29447377

  17. Autistic traits and social anxiety predict differential performance on social cognitive tasks in typically developing young adults

    PubMed Central

    Burk, Joshua A.; Fleckenstein, Katarina; Kozikowski, C. Teal

    2018-01-01

    The current work examined the unique contributions that autistic traits and social anxiety make to performance on tasks examining attention and emotion processing. In Study 1, 119 typically-developing college students completed a flanker task assessing the control of attention to target faces and away from distracting faces during emotion identification. In Study 2, 208 typically-developing college students performed a visual search task which required identification of whether a series of 8 or 16 emotional faces depicted the same or different emotions. Participants with more self-reported autistic traits performed more slowly on the flanker task in Study 1 than those with fewer autistic traits when stimuli depicted complex emotions. In Study 2, participants higher in social anxiety performed less accurately on trials showing all complex faces, whereas autistic traits showed no such association. These studies suggest that traits related to autism and to social anxiety differentially impact social cognitive processing. PMID:29596523

  18. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    PubMed Central

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  19. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals.

    PubMed

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-12-29

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task.

  20. A Role for the Motor System in Binding Abstract Emotional Meaning

    PubMed Central

    Carota, Francesca; Hauk, Olaf; Mohr, Bettina; Pulvermüller, Friedemann

    2012-01-01

    Sensorimotor areas activate to action- and object-related words, but their role in abstract meaning processing is still debated. Abstract emotion words denoting body internal states are a critical test case because they lack referential links to objects. If actions expressing emotion are crucial for learning correspondences between word forms and emotions, emotion word–evoked activity should emerge in motor brain systems controlling the face and arms, which typically express emotions. To test this hypothesis, we recruited 18 native speakers and used event-related functional magnetic resonance imaging to compare brain activation evoked by abstract emotion words to that by face- and arm-related action words. In addition to limbic regions, emotion words indeed sparked precentral cortex, including body-part–specific areas activated somatotopically by face words or arm words. Control items, including hash mark strings and animal words, failed to activate precentral areas. We conclude that, similar to their role in action word processing, activation of frontocentral motor systems in the dorsal stream reflects the semantic binding of sign and meaning of abstract words denoting emotions and possibly other body internal states. PMID:21914634

  1. Attentional Bias in Psychopathy: An Examination of the Emotional Dot-Probe Task in Male Jail Inmates.

    PubMed

    Edalati, Hanie; Walsh, Zach; Kosson, David S

    2016-08-01

    Numerous studies have identified differences in the identification of emotional displays between psychopaths and non-psychopaths; however, results have been equivocal regarding the nature of these differences. The present study investigated an alternative approach to examining the association between psychopathy and emotion processing by examining attentional bias to emotional faces; we used a modified dot-probe task to measure attentional bias toward emotional faces in comparison with neutral faces, among a sample of male jail inmates assessed using the Psychopathy Checklist-Revised (PCL-R). Results indicated a positive association between psychopathy and attention toward happy versus neutral faces, and that this association was attributable to Factor 1 of the psychopathy construct. © The Author(s) 2015.
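    The attentional bias index in a modified dot-probe task is conventionally the mean reaction-time difference between trials where the probe replaces the neutral face and trials where it replaces the emotional face, with positive scores indicating attention toward the emotional face. A minimal sketch of that computation (the study's exact scoring details are an assumption):

```python
def attentional_bias(trials):
    """Compute a dot-probe attentional bias score in milliseconds.

    Each trial is a (probe_location, reaction_time_ms) pair, where
    probe_location is "emotional" if the probe replaced the emotional
    face and "neutral" if it replaced the neutral face. Positive
    scores mean faster responses when the probe appears at the
    emotional face's location, i.e. attention toward emotion.
    """
    rt_neutral = [rt for loc, rt in trials if loc == "neutral"]
    rt_emotional = [rt for loc, rt in trials if loc == "emotional"]
    return (sum(rt_neutral) / len(rt_neutral)
            - sum(rt_emotional) / len(rt_emotional))

# Example: responses are 20 ms faster when the probe replaces a happy face.
trials = [("neutral", 520), ("neutral", 540),
          ("emotional", 510), ("emotional", 510)]
bias = attentional_bias(trials)  # (530 - 510) = 20.0
```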

  2. Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics.

    PubMed

    Reinl, Maren; Bartels, Andreas

    2014-11-15

    Facial movement conveys important information for social interactions, yet its neural processing is poorly understood. Computational models propose that shape-sensitive and temporal-sequence-sensitive mechanisms interact in processing dynamic faces. While face processing regions are known to respond to facial movement, their sensitivity to particular temporal sequences has barely been studied. Here we used fMRI to examine the sensitivity of human face-processing regions to two aspects of directionality in facial movement trajectories. We presented genuine movie recordings of increasing and decreasing fear expressions, each of which was played in natural or reversed frame order. This two-by-two factorial design, with the factors emotion-direction (increasing or decreasing emotion) and timeline (natural versus artificial frame order), matched low-level visual properties, static content and motion energy within each factor. The results showed sensitivity for emotion-direction in the fusiform face area (FFA), which was timeline-dependent as it only occurred within the natural frame order, and sensitivity to timeline in the superior temporal sulcus (STS), which was emotion-direction-dependent as it only occurred for decreased fear. The occipital face area (OFA) was sensitive to the factor timeline. These findings reveal interacting temporal-sequence-sensitive mechanisms that are responsive both to ecological meaning and to the prototypical unfolding of facial dynamics. These mechanisms are temporally directional, provide socially relevant information regarding emotional state or naturalness of behavior, and agree with predictions from modeling and predictive coding theory. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
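    In a two-by-two factorial design like this one (emotion-direction × timeline), the main effects and the interaction are contrasts over the four cell means: the interaction is a difference of differences. A toy sketch with hypothetical cell values, purely for illustration (these numbers are not the study's data):

```python
# Hypothetical cell means (e.g., response magnitudes) for a 2x2 design:
# factor 1 = emotion-direction (increase, decrease),
# factor 2 = timeline (natural, reversed).
cells = {
    ("increase", "natural"): 1.2, ("increase", "reversed"): 1.0,
    ("decrease", "natural"): 0.9, ("decrease", "reversed"): 1.1,
}

# Main effect of emotion-direction: difference of row means, averaged over timeline.
main_direction = ((cells[("increase", "natural")] + cells[("increase", "reversed")]) / 2
                  - (cells[("decrease", "natural")] + cells[("decrease", "reversed")]) / 2)

# Interaction: the direction effect under natural order minus under reversed order.
interaction = ((cells[("increase", "natural")] - cells[("decrease", "natural")])
               - (cells[("increase", "reversed")] - cells[("decrease", "reversed")]))

print(main_direction, interaction)  # 0.1 and 0.4, up to float rounding
```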

  3. Association between autistic traits and emotion adaptation to partially occluded faces.

    PubMed

    Luo, Chengwen; Burns, Edwin; Xu, Hong

    2017-04-01

    Prolonged exposure to a happy face makes subsequently presented faces appear sadder: the facial emotion aftereffect (FEA). People with autism spectrum disorders and their relatives have diminished holistic perception of faces. Levels of autism can be measured continuously in the general population as autistic traits using the autism quotient (AQ). Prior work has not found any association between AQ and FEA in adults, possibly due to non-holistic processing strategies employed by those at the higher end of the spectrum. In the present study, we tested whether AQ was associated with FEA to partially occluded faces. We hypothesized that inferring emotion from such faces would require participants to process their viewable parts as a gestalt percept; we therefore anticipated this ability would diminish as autistic traits increased. In Experiment 1, we partially occluded the adapting faces with aligned or misaligned opaque bars. Both conditions produced significant FEAs, with aftereffects and AQ negatively correlated. In Experiment 2, we adapted participants to obscured faces flickering in luminance, and manipulated the facilitation of holistic perception by varying the synchronization of this flickering. We found significant FEAs in all conditions, but this manipulation abolished the association with AQ. In Experiment 3, we showed that the association between AQ and FEA in the occluded conditions of Experiment 1 was not due to the recognizability or perceived emotional intensity of our adaptors, although the overall FEAs were linked to emotional intensity. We propose that increasing autistic traits are associated with diminishing abilities in perceiving emotional faces as a gestalt percept. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Lateralization for Processing Facial Emotions in Gay Men, Heterosexual Men, and Heterosexual Women.

    PubMed

    Rahman, Qazi; Yusuf, Sifat

    2015-07-01

    This study tested whether male sexual orientation and gender nonconformity influenced functional cerebral lateralization for the processing of facial emotions. We also tested for the effects of sex of poser and emotion displayed on putative differences. Thirty heterosexual men, 30 heterosexual women, and 40 gay men completed measures of demographic variables, recalled childhood gender nonconformity (CGN), IQ, and the Chimeric Faces Test (CFT). The CFT depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression, and performance is measured using a "laterality quotient" (LQ) score. We found that heterosexual men were significantly more right-lateralized when viewing female faces compared to heterosexual women and gay men, who did not differ significantly from each other. Heterosexual women and gay men were more left-lateralized for processing female faces. There were no significant group differences in lateralization for male faces. These results remained when controlling for age and IQ scores. There was no significant effect of CGN on LQ scores. These data suggest that gay men are feminized in some aspects of functional cerebral lateralization for facial emotion. The results were discussed in relation to the selectivity of functional lateralization and putative brain mechanisms underlying sexual attraction towards opposite-sex and same-sex targets.
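    A laterality quotient of this kind is conventionally computed as (R − L) / (R + L) over the counts of trials on which the chosen chimera had its emotive half in the right versus the left visual field; the study's exact scoring convention is an assumption here. A minimal sketch:

```python
def laterality_quotient(right_field_choices, left_field_choices):
    """Laterality quotient for a chimeric faces task.

    right_field_choices: trials where the chimera judged more emotive
        had its emotional half in the right visual field.
    left_field_choices: trials where it was in the left visual field.
    Returns a value in [-1, 1]; by the usual convention, negative
    scores indicate a left-visual-field (right-hemisphere) bias.
    """
    total = right_field_choices + left_field_choices
    return (right_field_choices - left_field_choices) / total

# Example: 12 right-field vs 24 left-field choices -> LQ of -1/3,
# i.e. a left-visual-field (right-hemisphere) bias.
lq = laterality_quotient(12, 24)
```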

  5. The effect of background music on episodic memory and autonomic responses: listening to emotionally touching music enhances facial memory capacity.

    PubMed

    Proverbio, Alice Mado; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto

    2015-10-15

    The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding.

  6. The effect of background music on episodic memory and autonomic responses: listening to emotionally touching music enhances facial memory capacity

    PubMed Central

    Mado Proverbio, C.A. Alice; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto

    2015-01-01

    The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding. PMID:26469712

  7. Neural Mechanisms of Reading Facial Emotions in Young and Older Adults

    PubMed Central

    Ebner, Natalie C.; Johnson, Marcia K.; Fischer, Håkan

    2012-01-01

    The ability to read and appropriately respond to emotions in others is central for successful social interaction. Young and older adults are better at identifying positive than negative facial expressions and also expressions of young than of older faces. Little, however, is known about the neural processes associated with reading different emotions, particularly in faces of different ages, in samples of young and older adults. During fMRI, young and older participants identified expressions in happy, neutral, and angry young and older faces. The results suggest a functional dissociation of ventromedial prefrontal cortex (vmPFC) and dorsomedial prefrontal cortex (dmPFC) in reading facial emotions that is largely comparable in young and older adults: Both age groups showed greater vmPFC activity to happy compared to angry or neutral faces, which was positively correlated with expression identification for happy compared to angry faces. In contrast, both age groups showed greater activity in dmPFC to neutral or angry than happy faces, which was negatively correlated with expression identification for neutral compared to happy faces. A similar region of dmPFC showed greater activity for older than young faces, but no brain-behavior correlations. Greater vmPFC activity in the present study may reflect greater affective processing involved in reading happy compared to neutral or angry faces. Greater dmPFC activity may reflect more cognitive control involved in decoding and/or regulating negative emotions associated with neutral or angry than happy, and older than young, faces. PMID:22798953

  8. The changing face of emotion: age-related patterns of amygdala activation to salient faces.

    PubMed

    Todd, Rebecca M; Evans, Jennifer W; Morris, Drew; Lewis, Marc D; Taylor, Margot J

    2011-01-01

    The present study investigated age-related differences in the amygdala and other nodes of face-processing networks in response to facial expression and familiarity. fMRI data were analyzed from 31 children (3.5-8.5 years) and 14 young adults (18-33 years) who viewed pictures of familiar (mothers) and unfamiliar emotional faces. Results showed that amygdala activation for faces over a scrambled image baseline increased with age. Children, but not adults, showed greater amygdala activation to happy than angry faces; in addition, amygdala activation for angry faces increased with age. In keeping with growing evidence of a positivity bias in young children, our data suggest that children find happy faces to be more salient or meaningful than angry faces. Both children and adults showed preferential activation to mothers' over strangers' faces in a region of rostral anterior cingulate cortex associated with self-evaluation, suggesting that some nodes in frontal evaluative networks are active early in development. This study presents novel data on neural correlates of face processing in childhood and indicates that preferential amygdala activation for emotional expressions changes with age.

  9. Visual processing of emotional expressions in mixed anxious-depressed subclinical state: an event-related potential study on a female sample.

    PubMed

    Rossignol, M; Philippot, P; Crommelinck, M; Campanella, S

    2008-10-01

    Controversy remains about the existence and the nature of a specific bias in emotional facial expression processing in the mixed anxious-depressed state (MAD). Event-related potentials were recorded in three groups defined by the Spielberger state and trait anxiety inventory (STAI) and the Beck depression inventory (BDI): anxious participants (n=12), participants with depressive and anxious tendencies (n=12), and a control group (n=12). Participants were confronted with a visual oddball task in which they had to detect, as quickly as possible, deviant faces amongst a train of standard neutral faces. Deviant stimuli changed either in identity or in emotion (happy or sad expression). Anxiety facilitated emotional processing, and the two anxious groups produced quicker responses than control participants; these effects were correlated with an earlier decisional wave (P3b) for anxious participants. Mixed anxious-depressed participants showed enhanced visual processing of deviant stimuli and produced higher amplitudes in the attentional complex (N2b/P3a), for both identity and emotional trials. P3a was also particularly increased for emotional faces in this group. Anxious state mainly influenced later decision processes (shorter latency of P3b), whereas mixed anxious-depressed state acted on earlier steps of emotional processing (enhanced N2b/P3a complex). Mixed anxious-depressed individuals seemed more reactive to any visual change, particularly emotional change, without displaying any valence bias.

  10. Neurocognitive mechanisms of gaze-expression interactions in face processing and social attention

    PubMed Central

    Graham, Reiko; LaBar, Kevin S.

    2012-01-01

    The face conveys a rich source of non-verbal information used during social communication. While research has revealed how specific facial channels such as emotional expression are processed, little is known about the prioritization and integration of multiple cues in the face during dyadic exchanges. Classic models of face perception have emphasized the segregation of dynamic versus static facial features along independent information processing pathways. Here we review recent behavioral and neuroscientific evidence suggesting that within the dynamic stream, concurrent changes in eye gaze and emotional expression can yield early independent effects on face judgments and covert shifts of visuospatial attention. These effects are partially segregated within initial visual afferent processing volleys, but are subsequently integrated in limbic regions such as the amygdala or via reentrant visual processing volleys. This spatiotemporal pattern may help to resolve otherwise perplexing discrepancies across behavioral studies of emotional influences on gaze-directed attentional cueing. Theoretical explanations of gaze-expression interactions are discussed, with special consideration of speed-of-processing (discriminability) and contextual (ambiguity) accounts. Future research in this area promises to reveal the mental chronometry of face processing and interpersonal attention, with implications for understanding how social referencing develops in infancy and is impaired in autism and other disorders of social cognition. PMID:22285906

  11. Emotion processing in joint hypermobility: A potential link to the neural bases of anxiety and related somatic symptoms in collagen anomalies.

    PubMed

    Mallorquí-Bagué, N; Bulbena, A; Roé-Vellvé, N; Hoekzema, E; Carmona, S; Barba-Müller, E; Fauquet, J; Pailhez, G; Vilarroya, O

    2015-06-01

    Joint hypermobility syndrome (JHS) has repeatedly been associated with anxiety and anxiety disorders, fibromyalgia, irritable bowel syndrome and temporomandibular joint disorder. However, the neural underpinnings of these associations remain unclear. This study explored brain responses to facial visual stimuli with emotional cues using fMRI in a general-population sample spanning a wide range of hypermobility. Fifty-one non-clinical volunteers (33 women) completed state and trait anxiety questionnaire measures, were assessed with a clinical examination for hypermobility (Beighton system) and performed an emotional face processing paradigm during functional neuroimaging. Trait anxiety scores correlated significantly with both state anxiety and hypermobility scores. In ROI analyses, BOLD signal in the hippocampus correlated positively with hypermobility scores for the crying faces versus neutral faces contrast; no effects were found for any of the other studied ROIs. Additionally, hypermobility scores were associated with other key affective processing areas (i.e., the middle and anterior cingulate gyrus, fusiform gyrus, parahippocampal region, orbitofrontal cortex and cerebellum) in the whole-brain analysis. Hypermobility scores are thus associated with trait anxiety and with higher responses to emotional faces in emotion processing brain areas (including the hippocampus) described as linked to anxiety and somatic symptoms. These findings increase our understanding of emotion processing in people bearing this heritable variant of collagen and of the mechanisms through which vulnerability to anxiety and somatic symptoms arises in this population. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  12. Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

    PubMed Central

    Liu, Taosheng; Pinheiro, Ana; Zhao, Zhongxin; Nestor, Paul G.; McCarley, Robert W.; Niznikiewicz, Margaret A.

    2012-01-01

    Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170 and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that this effect is mainly distributed in the frontal-central region. PMID:22383987

  13. State-dependent alteration in face emotion recognition in depression.

    PubMed

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.
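    Accuracy, discrimination and response bias of the kind compared above can be separated with standard signal-detection indices: d′ for discrimination and criterion c for bias, both computed from z-transformed hit and false-alarm rates. Whether this study used exactly these indices is an assumption; a minimal sketch using Python's statistics.NormalDist:

```python
from statistics import NormalDist

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Return (d_prime, criterion) from raw trial counts.

    d' = z(hit rate) - z(false-alarm rate)        -- discrimination
    c  = -0.5 * (z(hit rate) + z(false-alarm rate)) -- response bias
    A log-linear correction (+0.5 per cell) guards against hit or
    false-alarm rates of exactly 0 or 1, where z is undefined.
    """
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical counts: moderate discrimination, slightly liberal bias.
d, c = sdt_indices(hits=40, misses=10, false_alarms=15, correct_rejections=35)
```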

  14. Reduced embodied simulation in psychopathy.

    PubMed

    Mier, Daniela; Haddad, Leila; Diers, Kersten; Dressing, Harald; Meyer-Lindenberg, Andreas; Kirsch, Peter

    2014-08-01

    Psychopathy is characterized by severe deficits in emotion processing and empathy. These emotional deficits might not only affect the feeling of own emotions, but also the understanding of others' emotional and mental states. The present study aims on identifying the neurobiological correlates of social-cognitive related alterations in psychopathy. We applied a social-cognitive paradigm for the investigation of face processing, emotion recognition, and affective Theory of Mind (ToM) to 11 imprisoned psychopaths and 18 healthy controls. Functional magnetic resonance imaging was used to measure task-related brain activation. While showing no overall behavioural deficit, psychopathy was associated with altered brain activation. Psychopaths had reduced fusiform activation related to face processing. Related to affective ToM, psychopaths had hypoactivation in amygdala, inferior prefrontal gyrus and superior temporal sulcus, areas associated with embodied simulation of emotions and intentions. Furthermore, psychopaths lacked connectivity between superior temporal sulcus and amygdala during affective ToM. These results replicate findings of alterations in basal face processing in psychopathy. In addition, they provide evidence for reduced embodied simulation in psychopathy in concert with a lack of communication between motor areas and amygdala which might provide the neural substrate of reduced feeling with others during social cognition.

  15. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces

    PubMed Central

    Voelkle, Manuel C.; Ebner, Natalie C.; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect. PMID:25018740

  16. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    PubMed

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular older adults were more likely to perceive happiness in faces when being in a positive mood and less likely to do so when being in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  17. Older Adults' Trait Impressions of Faces Are Sensitive to Subtle Resemblance to Emotions

    PubMed Central

    Zebrowitz, Leslie A.

    2013-01-01

    Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a ‘pop-out’ effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions. PMID:24058225

  18. Dissimilar processing of emotional facial expressions in human and monkey temporal cortex

    PubMed Central

    Zhu, Qi; Nelissen, Koen; Van den Stock, Jan; De Winter, François-Laurent; Pauwels, Karl; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu

    2013-01-01

    Emotional facial expressions play an important role in social communication across primates. Despite major progress made in our understanding of categorical information processing such as for objects and faces, little is known, however, about how the primate brain evolved to process emotional cues. In this study, we used functional magnetic resonance imaging (fMRI) to compare the processing of emotional facial expressions between monkeys and humans. We used a 2 × 2 × 2 factorial design with species (human and monkey), expression (fear and chewing) and configuration (intact versus scrambled) as factors. At the whole brain level, selective neural responses to conspecific emotional expressions were anatomically confined to the superior temporal sulcus (STS) in humans. Within the human STS, we found functional subdivisions with a face-selective right posterior STS area that also responded selectively to emotional expressions of other species and a more anterior area in the right middle STS that responded specifically to human emotions. Hence, we argue that the latter region does not show a mere emotion-dependent modulation of activity but is primarily driven by human emotional facial expressions. Conversely, in monkeys, emotional responses appeared in earlier visual cortex and outside face-selective regions in inferior temporal cortex that responded also to multiple visual categories. Within monkey IT, we also found areas that were more responsive to conspecific than to non-conspecific emotional expressions but these responses were not as specific as in human middle STS. Overall, our results indicate that human STS may have developed unique properties to deal with social cues such as emotional expressions. PMID:23142071

  19. Increased amygdala responses to sad but not fearful faces in major depression: relation to mood state and pharmacological treatment.

    PubMed

    Arnone, Danilo; McKie, Shane; Elliott, Rebecca; Thomas, Emma J; Downey, Darragh; Juhasz, Gabriella; Williams, Steve R; Deakin, J F William; Anderson, Ian M

    2012-08-01

    Increased amygdala response to negative emotions seen in functional MRI (fMRI) has been proposed as a biomarker for a negative emotion processing bias underlying depressive symptoms and vulnerability to depressive relapse that is normalized by antidepressant drug treatment. The purpose of this study was to determine whether abnormal amygdala responses to face emotions in depression are related to specific emotions or change in response to antidepressant treatment and whether they are present as a stable trait in medication-free patients in remission. Sixty-two medication-free unipolar depressed patients (38 were currently depressed, and 24 were in remission) and 54 healthy comparison subjects underwent an indirect face-emotion processing task during fMRI. Thirty-two currently depressed patients were treated with the antidepressant citalopram for 8 weeks. Adherence to treatment was evaluated by measuring citalopram plasma concentrations. Patients with current depression had increased bilateral amygdala responses specific to sad faces relative to healthy comparison subjects and nonmedicated patients in stable remission. Treatment with citalopram abolished the abnormal amygdala responses to sad faces in currently depressed patients but did not alter responses to fearful faces. Aberrant amygdala activation in response to sad facial emotions is specific to the depressed state and is a potential biomarker for a negative affective bias during a depressive episode.

  20. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    PubMed

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of the socio-emotional abilities that characterize humans as highly developed social beings. However, molecular genetic studies investigating gene-behavior associations in these trait domains on the basis of well-founded phenotype definitions are still rare. We examined the relationship between the 5-HTTLPR/rs25531 polymorphisms - related to serotonin reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained using a comprehensive, multivariate battery of maximal-effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception: carriers of two long (L) alleles outperformed carriers of one or two short (S) alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating the discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  1. Empathy costs: Negative emotional bias in high empathisers.

    PubMed

    Chikovani, George; Babuadze, Lasha; Iashvili, Nino; Gvalia, Tamar; Surguladze, Simon

    2015-09-30

    Excessive empathy has been associated with compassion fatigue in health professionals and caregivers. We investigated the effect of empathy on emotion processing in 137 healthy individuals of both sexes, testing the hypothesis that high empathy underlies increased sensitivity to negative emotion recognition and that this sensitivity may interact with gender. Facial emotion stimuli comprised happy, angry, fearful, and sad faces presented at different intensities (mild and prototypical) and for different durations (500 ms and 2000 ms). Emotion processing was characterized by discrimination accuracy, response bias, and reaction time. We found that higher empathy was associated with better recognition of all emotions. We also demonstrated that higher empathy was associated with a response bias towards sad and fearful faces. The reaction-time analysis revealed that higher empathy in females was associated with faster (compared with males) recognition of mildly sad faces of brief duration. We conclude that although empathic abilities conferred an advantage in recognizing all facial emotional expressions, the bias towards emotional negativity may carry a risk of empathic distress. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
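
    The abstract does not give formulas, but discrimination accuracy and response bias of the kind reported here are commonly computed with signal detection theory; a minimal, hypothetical sketch (the function name and example rates are ours, not the authors'):

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Return (d-prime, criterion c) from hit and false-alarm rates.

    d' indexes discrimination accuracy; criterion c indexes response bias
    (a negative c reflects a liberal bias toward reporting the emotion).
    """
    z = NormalDist().inv_cdf  # z-transform of a proportion
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

# Hypothetical rates for recognizing, say, sad faces:
d, c = sdt_measures(0.80, 0.20)
```

    With symmetric hit and false-alarm rates (0.80 vs. 0.20), d' is about 1.68 and the criterion is zero, i.e. good discrimination with no bias.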

  2. Memory for faces: the effect of facial appearance and the context in which the face is encountered.

    PubMed

    Mattarozzi, Katia; Todorov, Alexander; Codispoti, Maurizio

    2015-03-01

    We investigated the effects of appearance of emotionally neutral faces and the context in which the faces are encountered on incidental face memory. To approximate real-life situations as closely as possible, faces were embedded in a newspaper article, with a headline that specified an action performed by the person pictured. We found that facial appearance affected memory so that faces perceived as trustworthy or untrustworthy were remembered better than neutral ones. Furthermore, the memory of untrustworthy faces was slightly better than that of trustworthy faces. The emotional context of encoding affected the details of face memory. Faces encountered in a neutral context were more likely to be recognized as only familiar. In contrast, emotionally relevant contexts of encoding, whether pleasant or unpleasant, increased the likelihood of remembering semantic and even episodic details associated with faces. These findings suggest that facial appearance (i.e., perceived trustworthiness) affects face memory. Moreover, the findings support prior evidence that the engagement of emotion processing during memory encoding increases the likelihood that events are not only recognized but also remembered.

  3. Perceptual integration of faces and voices depends on the interaction of emotional content and spatial frequency.

    PubMed

    Kokinous, Jenny; Tavano, Alessandro; Kotz, Sonja A; Schröger, Erich

    2017-02-01

    The role of spatial frequencies (SFs) in emotion perception is highly debated, but previous work suggests the importance of low SFs for detecting emotion in faces. Furthermore, emotion perception relies essentially on the rapid integration of multimodal information from faces and voices. We used electroencephalography (EEG) to test the functional relevance of SFs in the integration of emotional and non-emotional audiovisual stimuli. While viewing dynamic face-voice pairs, participants were asked to identify auditory interjections. Audiovisual integration was measured as auditory facilitation, indexed by the extent of the auditory N1 amplitude suppression in the audiovisual compared to an auditory-only condition. We found an interaction of SF filtering and emotion in the auditory response suppression. For neutral faces, larger N1 suppression ensued in the unfiltered and high-SF conditions compared to the low-SF condition. Angry face perception led to larger N1 suppression in the low-SF condition. While the results for neutral faces indicate that perceptual quality, in terms of SF content, plays a major role in audiovisual integration, the results for angry faces suggest that early multisensory integration of emotional information favors low-SF neural processing pathways, overruling the predictive value of the visual signal per se. Copyright © 2016 Elsevier B.V. All rights reserved.
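
    As a hedged illustration of the suppression index described above, a minimal sketch; the amplitudes below are invented, chosen only to mimic the reported pattern, and are not data from the study:

```python
# Hypothetical mean auditory N1 amplitudes in microvolts (N1 is a
# negative-going component): (auditory-only, audiovisual) per condition.
n1_amplitude = {
    "neutral_unfiltered": (-6.0, -4.2),
    "neutral_low_sf":     (-6.1, -5.4),
    "angry_low_sf":       (-5.9, -3.8),
}

def n1_suppression(a_only, av):
    """Suppression = reduction of N1 magnitude in the audiovisual
    condition relative to the auditory-only condition."""
    return abs(a_only) - abs(av)

suppression = {cond: n1_suppression(a, av)
               for cond, (a, av) in n1_amplitude.items()}
```

    In this toy example, suppression is larger for unfiltered than low-SF neutral faces, but largest for low-SF angry faces, matching the interaction the abstract reports.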

  4. Emotion perception, but not affect perception, is impaired with semantic memory loss.

    PubMed

    Lindquist, Kristen A; Gendron, Maria; Barrett, Lisa Feldman; Dickerson, Bradford C

    2014-04-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others' faces is inborn, prelinguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this article, we report findings from 3 patients with semantic dementia that cannot be explained by this "basic emotion" view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear, and sadness. These findings have important consequences for understanding the processes supporting emotion perception.

  5. Own-sex effects in emotional memory for faces.

    PubMed

    Armony, Jorge L; Sergerie, Karine

    2007-10-09

    The amygdala is known to be critical for the enhancement of memory for emotional, especially negative, material. Importantly, some researchers have suggested a sex-specific hemispheric lateralization in this process. In the case of facial expressions, another important factor that could influence memory success is the sex of the face, which could interact with the emotion depicted as well as with the sex of the perceiver. Whether this is the case remains unknown, as all previous studies of sex differences in emotional memory have employed affective pictures. Here we directly explored this question using functional magnetic resonance imaging in a subsequent memory paradigm for facial expressions (fearful, happy and neutral). Consistent with our hypothesis, we found that the hemispheric laterality of the amygdala involvement in successful memory for emotional material was influenced not only by the sex of the subjects, as previously proposed, but also by the sex of the faces being remembered. Namely, the left amygdala was more active for successfully remembered female fearful faces in women, whereas in men the right amygdala was more involved in memory for male fearful faces. These results confirm the existence of sex differences in amygdala lateralization in emotional memory but also demonstrate a subtle relationship between the observer and the stimulus in this process.

  6. Time course of influence on the allocation of attentional resources caused by unconscious fearful faces.

    PubMed

    Jiang, Yunpeng; Wu, Xia; Saab, Rami; Xiao, Yi; Gao, Xiaorong

    2018-05-01

    Emotionally affective stimuli have priority in our visual processing even in the absence of conscious processing. However, the influence of unconscious emotional stimuli on our attentional resources remains unclear. Using the continuous flash suppression (CFS) paradigm, we concurrently recorded and analyzed visual event-related potential (ERP) components evoked by the images of suppressed fearful and neutral faces, and the steady-state visual evoked potential (SSVEP) elicited by dynamic Mondrian pictures. Fearful faces, relative to neutral faces, elicited larger late ERP components on parietal electrodes, indicating emotional expression processing without consciousness. More importantly, the presentation of a suppressed fearful face in the CFS resulted in a significantly greater decrease in SSVEP amplitude which started about 1-1.2 s after the face images first appeared. This suggests that the time course of the attentional bias occurs at about 1 s after the appearance of the fearful face and demonstrates that unconscious fearful faces may influence attentional resource allocation. Moreover, we proposed a new method that could eliminate the interaction of ERPs and SSVEPs when recorded concurrently. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Spatial frequency filtered images reveal differences between masked and unmasked processing of emotional information.

    PubMed

    Rohr, Michaela; Wentura, Dirk

    2014-10-01

    High and low spatial frequency information has been shown to contribute differently to the processing of emotional information. We investigated this issue further in three priming studies using spatial-frequency-filtered emotional face primes, emotional face targets, and an emotion categorization task. Differences emerged in the pattern of results across brief masked, brief unmasked, and long unmasked presentation conditions. Given long, unmasked prime presentation, high- and low-frequency primes triggered emotion-specific priming effects. Given brief, masked prime presentation in Experiment 2, we found a dissociation: high-frequency primes caused a valence priming effect, whereas low-frequency primes yielded a differentiation between low- and high-arousing information within the negative domain. Brief, unmasked prime presentation in Experiment 3 revealed that subliminal processing of the primes was responsible for the pattern observed in Experiment 2. The implications of these findings for theories of early emotional information processing are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Fixation to features and neural processing of facial expressions in a gender discrimination task

    PubMed Central

    Neath, Karly N.; Itier, Roxane J.

    2017-01-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected a general sensitivity to face position. An early effect of emotion (~120 ms) for happy faces was seen at occipital sites and was sustained until ~350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect from ~150 ms until ~300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. PMID:26277653

  9. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    PubMed

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  10. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load

    PubMed Central

    Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925

  11. Neural Processing of Facial Identity and Emotion in Infants at High-Risk for Autism Spectrum Disorders

    PubMed Central

    Fox, Sharon E.; Wagner, Jennifer B.; Shrock, Christine L.; Tager-Flusberg, Helen; Nelson, Charles A.

    2013-01-01

    Deficits in face processing and social impairment are core characteristics of autism spectrum disorder. The present work examined 7-month-old infants at high risk for developing autism and typically developing low-risk controls, using a face perception task designed to differentiate between the effects of face identity and facial emotion on the neural response measured with functional near-infrared spectroscopy. In addition, we employed independent component analysis, as well as a novel method of condition-related component selection and classification, to identify group differences in hemodynamic waveforms and response distributions associated with face and emotion processing. The results indicate similarities of waveforms, but differences in the magnitude, spatial distribution, and timing of responses between groups. These early differences in local cortical regions and the hemodynamic response may, in turn, contribute to differences in patterns of functional connectivity. PMID:23576966

  12. Alterations in neural processing of emotional faces in adolescent anorexia nervosa patients - an event-related potential study.

    PubMed

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Bartling, Jürgen; Schulte-Körne, Gerd; Dieler, Alica C

    2016-09-01

    The present study explored the neurophysiological correlates of perception and recognition of emotional facial expressions in adolescent anorexia nervosa (AN) patients using event-related potentials (ERPs). We included 20 adolescent girls with AN and 24 healthy girls and recorded ERPs during a passive viewing task and three active tasks requiring processing of emotional faces at varying depths; one of the tasks also assessed emotion recognition abilities behaviourally. Despite the absence of behavioural differences, we found that across all tasks AN patients exhibited a less pronounced early posterior negativity (EPN) in response to all facial expressions compared to controls. The EPN is an ERP component reflecting an automatic, perceptual processing stage that is modulated by the intrinsic salience of a stimulus. Hence, the less pronounced EPN in anorexic girls suggests that they might perceive other people's faces as less intrinsically relevant, i.e. as less "important", than do healthy girls. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    PubMed

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, diverging from previous findings). Maternal love withdrawal affects the processing of emotional faces presented with performance feedback differently at different stages of neural processing.

  14. Gaze Dynamics in the Recognition of Facial Expressions of Emotion.

    PubMed

    Barabanschikov, Vladimir A

    2015-01-01

    We studied which parts and features of the human face are preferentially fixated during the recognition of facial expressions of emotion. Photographs of facial expressions were used; participants categorized these as basic emotions while their eye movements were recorded. We found that variation in the intensity of an expression is mirrored in the accuracy of emotion recognition and is also reflected in several indices of oculomotor function: the duration of inspection of certain areas of the face (its upper and bottom parts, right and left sides), and the location, number and duration of fixations, as well as the viewing trajectory. In particular, for low-intensity expressions, the right side of the face was attended to predominantly (right-side dominance); this right-side dominance effect was, however, absent for high-intensity expressions. For both low- and high-intensity expressions, the upper part of the face was predominantly fixated, with greater fixation for high-intensity expressions. The majority of trials (70%), in line with findings of previous studies, revealed a V-shaped inspection trajectory. No relationship was found, however, between the accuracy of recognition of emotional expressions and either the location and duration of fixations or the pattern of gaze directedness on the face. © The Author(s) 2015.

  15. Age-related emotional bias in processing two emotionally valenced tasks.

    PubMed

    Allen, Philip A; Lien, Mei-Ching; Jardin, Elliott

    2017-01-01

    Previous studies suggest that older adults process positive emotions more efficiently than negative emotions, whereas younger adults show the reverse effect. We examined whether this age-related difference in emotional bias still occurs when attention is engaged in two emotional tasks. We used a psychological refractory period paradigm and varied the emotional valence of Task 1 and Task 2. In both experiments, Task 1 was emotional face discrimination (happy vs. angry faces) and Task 2 was sound discrimination (laugh vs. punch vs. cork pop in Experiment 1, and laugh vs. scream in Experiment 2). The backward emotional correspondence effect of a positively or negatively valenced Task 2 on Task 1 was measured. In both experiments, younger adults showed a backward correspondence effect from a negatively valenced Task 2, suggesting parallel processing of negatively valenced stimuli. Older adults showed a similar negativity bias in Experiment 2 with a more salient negative sound ("scream" relative to "punch"). These results are consistent with the arousal-biased competition model [Mather and Sutherland (Perspectives on Psychological Science 6:114-133, 2011)], suggesting that emotional arousal modulates top-down attentional control settings (emotional regulation) with age.

  16. Conscious and unconscious processing of facial expressions: evidence from two split-brain patients.

    PubMed

    Prete, Giulia; D'Ascenzo, Stefania; Laeng, Bruno; Fabri, Mara; Foschi, Nicoletta; Tommasi, Luca

    2015-03-01

    We investigated how the brain's hemispheres process explicit and implicit facial expressions in two 'split-brain' patients (one with a complete and one with a partial anterior resection). Photographs of faces expressing positive, negative or neutral emotions were shown either centrally or bilaterally. The task consisted in judging the friendliness of each person in the photographs. Half of the photograph stimuli were 'hybrid faces', that is, amalgamations of filtered images containing emotional information only in the low range of spatial frequencies, blended with a neutral expression of the same individual in the rest of the spatial frequencies. The other half of the images contained unfiltered faces. With the hybrid faces, the patients and a matched control group were more influenced in their social judgements by the emotional expression of the face shown in the left visual field (LVF). When the expressions were shown explicitly, that is without filtering, the control group and the partially callosotomized patient based their judgement on the face shown in the LVF, whereas the complete split-brain patient based his ratings mainly on the face presented in the right visual field. We conclude that the processing of implicit emotions does not require the integrity of callosal fibres and can take place within subcortical routes lateralized in the right hemisphere. © 2013 The British Psychological Society.

  17. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments.

    PubMed

    Murphy, Karen; Ward, Zoe

    2017-01-01

    Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross-task comparisons revealed larger complete RB effects for the expression and gender judgments than for the identity decisions. There was a larger emotion RB effect for the expression than gender judgments, and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and the relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition.

  18. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments

    PubMed Central

    Murphy, Karen; Ward, Zoe

    2017-01-01

    Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross-task comparisons revealed larger complete RB effects for the expression and gender judgments than for the identity decisions. There was a larger emotion RB effect for the expression than gender judgments, and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and the relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition. PMID:29038663

  19. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    PubMed

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

    Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces where features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness), emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance in the whole-face condition. We argue that in SC, interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  20. Multiple mechanisms of consciousness: the neural correlates of emotional awareness.

    PubMed

    Amting, Jayna M; Greening, Steven G; Mitchell, Derek G V

    2010-07-28

    Emotional stimuli, including facial expressions, are thought to gain rapid and privileged access to processing resources in the brain. Despite this access, we are conscious of only a fraction of the myriad emotion-related cues we face every day. It remains unclear, therefore, what the relationship is between activity in neural regions associated with emotional representation and the phenomenological experience of emotional awareness. We used functional magnetic resonance imaging and binocular rivalry to delineate the neural correlates of awareness of conflicting emotional expressions in humans. Behaviorally, fearful faces were significantly more likely to be perceived than disgusted or neutral faces. Functionally, increased activity was observed in regions associated with facial expression processing, including the amygdala and fusiform gyrus, during emotional awareness. In contrast, awareness of neutral faces and suppression of fearful faces were associated with increased activity in dorsolateral prefrontal and inferior parietal cortices. The amygdala showed increased functional connectivity with ventral visual system regions during fear awareness and increased connectivity with perigenual prefrontal cortex (pgPFC; Brodmann's area 32/10) when fear was suppressed. Despite being prioritized for awareness, emotional items were associated with reduced activity in areas considered critical for consciousness. Contributions to consciousness from bottom-up and top-down neural regions may be additive, such that increased activity in specialized regions within the extended ventral visual system may reduce demands on a frontoparietal system important for awareness. This raises the possibility that interactions between pgPFC and the amygdala, previously implicated in extinction, may also influence whether or not an emotional stimulus is accessible to consciousness.

  1. Emotion perception, but not affect perception, is impaired with semantic memory loss

    PubMed Central

    Lindquist, Kristen A.; Gendron, Maria; Feldman Barrett, Lisa; Dickerson, Bradford C.

    2014-01-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others’ faces is inborn, pre-linguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this paper, we report findings from three patients with semantic dementia that cannot be explained by this “basic emotion” view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear and sadness. These findings have important consequences for understanding the processes supporting emotion perception. PMID:24512242

  2. White matter fiber compromise contributes differentially to attention and emotion processing impairment in alcoholism, HIV-infection, and their comorbidity.

    PubMed

    Schulte, T; Müller-Oehring, E M; Sullivan, E V; Pfefferbaum, A

    2012-10-01

Alcoholism (ALC) and HIV-1 infection (HIV) each affects emotional and attentional processes and the integrity of brain white matter fibers, likely contributing to functional compromise. The highly prevalent ALC+HIV comorbidity may exacerbate this compromise. We used diffusion tensor imaging (DTI) and an emotional Stroop Match-to-Sample task in 19 ALC, 16 HIV, 15 ALC+HIV, and 15 control participants to investigate whether disruption of fiber system integrity accounts for compromised attentional and emotional processing. The task required matching a cue color to that of an emotional word, with faces appearing between the color cue and the Stroop word in half of the trials. Nonmatched cue-word color pairs assessed selective attention, and face-word pairs assessed emotion. Relative to controls, DTI-based fiber tracking revealed lower inferior longitudinal fasciculus (ilf) integrity in HIV and ALC+HIV and lower uncinate fasciculus (uf) integrity in all three patient groups. Controls exhibited Stroop effects to positive face-word emotion, and greater interference was related to greater callosal, cingulum, and ilf integrity. By contrast, HIV showed greater interference from negative Stroop words during color-nonmatch trials, correlating with greater uf compromise. For face trials, ALC and ALC+HIV showed greater Stroop-word interference, correlating with lower cingulate and callosal integrity. Thus, in HIV, conflict resolution was diminished when challenging conditions usurped resources needed to manage interference from negative emotion and to disengage attention from wrongly cued colors (nonmatch). In ALC and ALC+HIV, poorer callosal integrity was related to enhanced emotional interference, suggesting curtailed interhemispheric exchange between preferentially right-hemispheric emotion processing and left-hemispheric Stroop-word functions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. The Effect of Observers’ Mood on the Local Processing of Emotional Faces: Evidence from Short-Lived and Prolonged Mood States

    PubMed Central

    Mokhtari, Setareh; Buttle, Heather

    2015-01-01

We examined the effect of induced mood, varying in valence and longevity, on local processing of emotional faces. It was found that negative facial expression conveyed by the global level of the face interferes with efficient processing of the local features. The results also showed that the duration of involvement with a mood influenced local processing. We observed that attending to the local level of faces does not differ between short-lived happy and sad mood states. However, when the mood state was experienced for a longer period, local processing was impaired in a happy mood compared with a sad mood. Taken together, we concluded that both facial expressions and affective states influence processing of the local parts of faces. Moreover, we suggest that mediating factors, such as the duration of involvement with the mood, play a role in the interrelation between mood, attention, and perception. PMID:25883696

  4. Adaptation to Emotional Conflict: Evidence from a Novel Face Emotion Paradigm

    PubMed Central

    Clayson, Peter E.; Larson, Michael J.

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words. PMID:24073278

  6. Emotional priming with facial exposures in euthymic patients with bipolar disorder.

    PubMed

    Kim, Taek Su; Lee, Su Young; Ha, Ra Yeon; Kim, Eosu; An, Suk Kyoon; Ha, Kyooseob; Cho, Hyun-Sang

    2011-12-01

People with bipolar disorder have abnormal emotional processing. We investigated automatic and controlled emotional processing via a priming paradigm with subliminal and supraliminal facial exposure. We compared 20 euthymic bipolar patients and 20 healthy subjects on their performance in subliminal and supraliminal tasks. Priming tasks consisted of three different primes according to facial emotion (happy, sad, and neutral), followed by a neutral face as the target stimulus. The prime stimuli were presented subliminally (17 msec) or supraliminally (1000 msec). In subliminal tasks, both patients and controls judged the neutral target face as significantly more unpleasant (a negative judgment shift) when it was presented with negative emotion primes compared with positive primes. In supraliminal tasks, bipolar subjects showed a significant negative judgment shift, whereas healthy subjects did not. There was a significant group × emotion interaction for the judgment rate in supraliminal tasks. Our finding of persistent affective priming even at conscious awareness may suggest that bipolar patients have impaired cognitive control over emotional processing rather than an automatic spreading activation of emotion.

  7. More than mere mimicry? The influence of emotion on rapid facial reactions to faces.

    PubMed

    Moody, Eric J; McIntosh, Daniel N; Mann, Laura J; Weisser, Kimberly R

    2007-05-01

    Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed.

  8. Attentional bias for emotional faces in paediatric anxiety disorders: an investigation using the emotional Go/No Go task.

    PubMed

    Waters, Allison M; Valvoi, Jaya S

    2009-06-01

The present study examined contextual modulation of attentional control processes in paediatric anxiety disorders. Anxious children (N = 20) and non-anxious controls (N = 20) completed an emotional Go/No Go task. In some blocks, children responded to neutral faces (Go trials) presented amongst angry or happy faces to which they withheld responses (No Go trials); in other blocks, angry and happy faces were the Go trials and children withheld responses to neutral faces. Anxious girls were slower to respond to neutral faces with embedded angry compared with happy face No Go trials, whereas non-anxious girls were slower to respond to neutral faces with embedded happy versus angry face No Go trials. Anxious and non-anxious boys showed the same basic pattern as non-anxious girls. There were no significant group differences on No Go trials or when the emotional faces were presented as Go trials. Results are discussed in terms of selective interference by angry faces in the control of attention in anxious girls.

  9. The valence-specific laterality effect in free viewing conditions: The influence of sex, handedness, and response bias.

    PubMed

    Rodway, Paul; Wright, Lynn; Hardie, Scott

    2003-12-01

The right hemisphere has often been viewed as having a dominant role in the processing of emotional information. Other evidence indicates that both hemispheres process emotional information but their involvement is valence specific, with the right hemisphere dealing with negative emotions and the left hemisphere preferentially processing positive emotions. This has been found under both restricted (Reuter-Lorenz & Davidson, 1981) and free viewing conditions (Jansari, Tranel, & Adolphs, 2000). It remains unclear whether the valence-specific laterality effect is also sex specific or is influenced by the handedness of participants. To explore this issue we repeated Jansari et al.'s free-viewing laterality task with 78 participants. We found a valence-specific laterality effect in women but not men, with women discriminating negative emotional expressions more accurately when the face was presented on the left-hand side and discriminating positive emotions more accurately when those faces were presented on the right-hand side. These results indicate that under free viewing conditions women are more lateralised for the processing of facial emotion than are men. Handedness did not affect the lateralised processing of facial emotion. Finally, participants demonstrated a response bias on control trials, where facial emotion did not differ between the faces. Participants selected the left-hand side more frequently when they believed the expression was negative and the right-hand side more frequently when they believed the expression was positive. This response bias can cause a spurious valence-specific laterality effect, which might have contributed to the conflicting findings within the literature.

  10. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    ERIC Educational Resources Information Center

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  11. Attention and memory bias to facial emotions underlying negative symptoms of schizophrenia.

    PubMed

    Jang, Seon-Kyeong; Park, Seon-Cheol; Lee, Seung-Hwan; Cho, Yang Seok; Choi, Kee-Hong

    2016-01-01

This study assessed bias in selective attention to facial emotions in negative symptoms of schizophrenia and its influence on subsequent memory for facial emotions. Thirty people with schizophrenia who had high or low levels of negative symptoms (n = 15 each) and 21 healthy controls completed a visual probe detection task investigating selective attention bias (happy, sad, and angry faces randomly presented for 50, 500, or 1000 ms). A yes/no incidental facial memory task was then completed. Attention bias scores and recognition errors were calculated. Those with high negative symptoms exhibited reduced attention to emotional faces relative to neutral faces; those with low negative symptoms showed the opposite pattern when faces were presented for 500 ms, regardless of valence. Compared to healthy controls, those with high negative symptoms made more errors for happy faces in the memory task. Reduced attention to emotional faces in the probe detection task was significantly associated with less pleasure and motivation and more recognition errors for happy faces in the schizophrenia group only. Attention bias away from emotional information relatively early in the attentional process, and the associated diminished positive memory, may relate to pathological mechanisms for negative symptoms.
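
In probe-detection paradigms like the one above, an attention bias score is conventionally the difference in mean reaction time between trials where the probe replaces the neutral face and trials where it replaces the emotional face: positive values index attention toward emotion, negative values attention away from it. A minimal sketch of that convention (the function name and reaction times are hypothetical, not taken from the study):

```python
import statistics

def attention_bias_score(rt_probe_at_neutral_ms, rt_probe_at_emotional_ms):
    """Positive score: faster when the probe replaces the emotional face
    (attention toward emotion); negative score: attention away from it."""
    return (statistics.mean(rt_probe_at_neutral_ms)
            - statistics.mean(rt_probe_at_emotional_ms))

# Hypothetical reaction times (ms) for one participant at one exposure duration
bias = attention_bias_score([512, 530, 498], [470, 455, 462])  # positive: toward emotion
```

Under this convention, the "attention bias away from emotional information" reported for high-negative-symptom patients would appear as scores below zero.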

  12. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    PubMed

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in right PFC predicted memory for faces, regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of right dorsolateral PFC (DLPFC) in successful encoding of nonverbal material, but also suggest that left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of PFC in memory encoding.

  13. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces.

    PubMed

    Javanbakht, Arash; King, Anthony P; Evans, Gary W; Swain, James E; Angstadt, Michael; Phan, K Luan; Liberzon, Israel

    2015-01-01

Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task. Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and medial prefrontal cortical (mPFC) responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  14. The changing face of emotion: age-related patterns of amygdala activation to salient faces

    PubMed Central

    Evans, Jennifer W.; Morris, Drew; Lewis, Marc D.; Taylor, Margot J.

    2011-01-01

    The present study investigated age-related differences in the amygdala and other nodes of face-processing networks in response to facial expression and familiarity. fMRI data were analyzed from 31 children (3.5–8.5 years) and 14 young adults (18–33 years) who viewed pictures of familiar (mothers) and unfamiliar emotional faces. Results showed that amygdala activation for faces over a scrambled image baseline increased with age. Children, but not adults, showed greater amygdala activation to happy than angry faces; in addition, amygdala activation for angry faces increased with age. In keeping with growing evidence of a positivity bias in young children, our data suggest that children find happy faces to be more salient or meaningful than angry faces. Both children and adults showed preferential activation to mothers’ over strangers’ faces in a region of rostral anterior cingulate cortex associated with self-evaluation, suggesting that some nodes in frontal evaluative networks are active early in development. This study presents novel data on neural correlates of face processing in childhood and indicates that preferential amygdala activation for emotional expressions changes with age. PMID:20194512

  15. [Attentional bias and emotional suppression in borderline personality disorder].

    PubMed

    Fernando, Silvia Carvalho; Griepenstroh, Julia; Urban, Sabine; Driessen, Martin; Beblo, Thomas

    2014-01-01

Emotion regulation dysfunctions marked by negative affectivity are a core feature of borderline personality disorder (BPD). In addition, patients with BPD show disturbed attentional processes, which become particularly apparent in the domain of selective attention when emotional stimuli are presented (negative attentional bias). Assuming that emotion regulation is linked to attentional deployment processes, this study aimed (1) to determine whether a negative attentional bias can be established using film clips of fearful faces and (2) to investigate the association between dysfunctional emotion regulation strategies (emotional suppression) and negative attentional bias in BPD. We investigated 18 inpatients with BPD and 18 healthy control participants using a modified version of the fearful face paradigm to assess the inhibition of emotional stimuli. We also administered self-report emotion regulation questionnaires. Compared with the healthy controls, patients with BPD showed significantly longer reaction times during the emotional versus the neutral film stimuli in the modified fearful face paradigm. With regard to the second hypothesis, we failed to find an association between the negative attentional bias and the habitual use of emotional suppression in BPD. In this study, we confirmed an attentional bias for negative stimuli using complex, dynamic material. Future studies need to address the impact of confounding variables (e.g., comorbid disorders) on the relationship between maladaptive emotion regulation and selective attentional bias.

  16. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    PubMed

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls (p < .001). After psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  17. Risk for bipolar disorder is associated with face-processing deficits across emotions.

    PubMed

    Brotman, Melissa A; Skup, Martha; Rich, Brendan A; Blair, Karina S; Pine, Daniel S; Blair, James R; Leibenluft, Ellen

    2008-12-01

    Youths with euthymic bipolar disorder (BD) have a deficit in face-emotion labeling that is present across multiple emotions. Recent research indicates that youths at familial risk for BD, but without a history of mood disorder, also have a deficit in face-emotion labeling, suggesting that such impairments may be an endophenotype for BD. It is unclear whether this deficit in at-risk youths is present across all emotions or if the impairment presents initially as an emotion-specific dysfunction that then generalizes to other emotions as the symptoms of BD become manifest. Thirty-seven patients with pediatric BD, 25 unaffected children with a first-degree relative with BD, and 36 typically developing youths were administered the Emotional Expression Multimorph Task, a computerized behavioral task, which presents gradations of facial emotions from 100% neutrality to 100% emotional expression (happiness, surprise, fear, sadness, anger, and disgust). Repeated-measures analysis of covariance revealed that, compared with the control youths, the patients and the at-risk youths required significantly more intense emotional information to identify and correctly label face emotions. The patients with BD and the at-risk youths did not differ from each other. Group-by-emotion interactions were not significant, indicating that the group effects did not differ based on the facial emotion. The youths at risk for BD demonstrate nonspecific deficits in face-emotion recognition, similar to patients with the illness. Further research is needed to determine whether such deficits meet all the criteria for an endophenotype.

  18. Culture, gaze and the neural processing of fear expressions

    PubMed Central

    Franklin, Robert G.; Rule, Nicholas O.; Freeman, Jonathan B.; Kveraga, Kestutis; Hadjikhani, Nouchine; Yoshikawa, Sakiko; Ambady, Nalini

    2010-01-01

    The direction of others’ eye gaze has important influences on how we perceive their emotional expressions. Here, we examined differences in neural activation to direct- versus averted-gaze fear faces as a function of culture of the participant (Japanese versus US Caucasian), culture of the stimulus face (Japanese versus US Caucasian), and the relation between the two. We employed a previously validated paradigm to examine differences in neural activation in response to rapidly presented direct- versus averted-fear expressions, finding clear evidence for a culturally determined role of gaze in the processing of fear. Greater neural responsivity was apparent to averted- versus direct-gaze fear in several regions related to face and emotion processing, including bilateral amygdalae, when posed on same-culture faces, whereas greater response to direct- versus averted-gaze fear was apparent in these same regions when posed on other-culture faces. We also found preliminary evidence for intercultural variation including differential responses across participants to Japanese versus US Caucasian stimuli, and to a lesser degree differences in how Japanese and US Caucasian participants responded to these stimuli. These findings reveal a meaningful role of culture in the processing of eye gaze and emotion, and highlight their interactive influences in neural processing. PMID:20019073

  19. Seeing the mean: ensemble coding for sets of faces.

    PubMed

    Haberman, Jason; Whitney, David

    2009-06-01

We frequently encounter groups of similar objects in our visual environment: a bed of flowers, a basket of oranges, a crowd of people. How does the visual system process such redundancy? Research shows that rather than code every element in a texture, the visual system favors a summary statistical representation of all the elements. The authors demonstrate that although it may facilitate texture perception, ensemble coding also occurs for faces, a level of processing well beyond that of textures. Observers viewed sets of faces varying in emotionality (e.g., happy to sad) and assessed the mean emotion of each set. Although observers retained little information about the individual set members, they had a remarkably precise representation of the mean emotion. Observers continued to discriminate the mean emotion accurately even when they viewed sets of 16 faces for 500 ms or less. Modeling revealed that perceiving the average facial expression in groups of faces was not due to noisy representation or noisy discrimination. These findings support the hypothesis that ensemble coding occurs extremely fast at multiple levels of visual analysis. (c) 2009 APA, all rights reserved.

  20. The role of spatial frequency information for ERP components sensitive to faces and emotional facial expression.

    PubMed

    Holmes, Amanda; Winston, Joel S; Eimer, Martin

    2005-10-01

    To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, houses, and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was neither affected by emotional facial expression nor by spatial frequency information.

  1. Parametric modulation of neural activity during face emotion processing in unaffected youth at familial risk for bipolar disorder.

    PubMed

    Brotman, Melissa A; Deveney, Christen M; Thomas, Laura A; Hinton, Kendra E; Yi, Jennifer Y; Pine, Daniel S; Leibenluft, Ellen

    2014-11-01

Both patients with pediatric bipolar disorder (BD) and unaffected youth at familial risk (AR) for the illness show impairments in face emotion labeling. Few studies, however, have examined brain regions engaged in AR youth when processing emotional faces. Moreover, studies have yet to explore neural responsiveness to subtle changes in face emotion in AR youth. Sixty-four unrelated youth, including 20 patients with BD, 15 unaffected AR youth, and 29 healthy comparisons (HC), completed functional magnetic resonance imaging. Neutral faces were morphed with angry or happy faces in 25% intervals. In specific phases of the task, youth alternately made explicit (hostility) or implicit (nose width) ratings of the faces. The slope of blood oxygenation level-dependent (BOLD) activity was calculated across neutral to angry and neutral to happy face stimuli. Behaviorally, both subjects with BD (p ≤ 0.001) and AR youth (p ≤ 0.05) rated faces as less hostile relative to HC. Consistent with this, in response to increasing anger on the face, patients with BD and AR youth showed decreased modulation in the amygdala and inferior frontal gyrus (IFG; BA 46) compared to HC (all p ≤ 0.05). Amygdala dysfunction was present across both implicit and explicit rating conditions, but IFG modulation deficits were specific to the explicit condition. With increasing happiness, AR youth showed aberrant modulation in the IFG, which was also sensitive to task demands (all p ≤ 0.05). Decreased amygdala and IFG modulation in patients with BD and AR youth may be pathophysiological risk markers for BD, and may underlie the social cognition and face emotion labeling deficits observed in BD and AR youth. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
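
The parametric logic of the design above can be illustrated with a toy example: morph levels step from 100% neutral to 100% angry in 25% intervals, and a per-region slope of activity across levels indexes modulation. A minimal sketch with entirely hypothetical BOLD estimates (not the study's data):

```python
import numpy as np

# Morph levels from 100% neutral (0.0) to 100% angry (1.0) in 25% steps
morph_levels = np.arange(0.0, 1.01, 0.25)

# Hypothetical BOLD estimates at each morph level for one region of interest
bold = np.array([0.10, 0.18, 0.24, 0.33, 0.41])

# Slope of activity across increasing anger, via a degree-1 least-squares fit
slope, intercept = np.polyfit(morph_levels, bold, 1)
```

In this framework, the decreased modulation reported for the amygdala and IFG in BD and AR youth would correspond to a shallower fitted slope than in healthy comparisons.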

  2. Emotional Devaluation of Distracting Patterns and Faces: A Consequence of Attentional Inhibition during Visual Search?

    ERIC Educational Resources Information Center

    Raymond, Jane E.; Fenske, Mark J.; Westoby, Nikki

    2005-01-01

    Visual search has been studied extensively, yet little is known about how its constituent processes affect subsequent emotional evaluation of searched-for and searched-through items. In 3 experiments, the authors asked observers to locate a colored pattern or tinted face in an array of other patterns or faces. Shortly thereafter, either the target…

  3. Intrinsic functional connectivity underlying successful emotion regulation of angry faces

    PubMed Central

    Morawetz, Carmen; Kellermann, Tanja; Kogler, Lydia; Radke, Sina; Blechert, Jens; Derntl, Birgit

    2016-01-01

    Most of our social interaction is naturally based on emotional information derived from the perception of other people's faces. Negative facial expressions of a counterpart might trigger negative emotions and initiate emotion regulatory efforts to reduce the impact of the received emotional message in a perceiver. Despite the high adaptive value of emotion regulation in social interaction, its neural underpinnings are largely unknown. To address this, the present study investigated how individual differences in emotion regulation effectiveness during the reappraisal of angry faces relate to underlying functional activity, measured with functional magnetic resonance imaging (fMRI), and to underlying functional connectivity, measured with resting-state fMRI. Greater emotion regulation ability was associated with greater functional activity in the ventromedial prefrontal cortex. Furthermore, greater functional coupling between activity in the ventrolateral prefrontal cortex and the amygdala was associated with emotion regulation success. Our findings provide a first link between prefrontal cognitive control and subcortical emotion processing systems during successful emotion regulation in an explicitly social context. PMID:27510495

  4. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits

    PubMed Central

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving-body point-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to recognize: in line with previous studies we found a happy-face advantage for faces, which for the first time was also found for bodies (a happy-body advantage). Furthermore, LAT participants recognized sadness better from static faces and fear better from PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by the LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals. PMID:26557101

  5. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    PubMed

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving-body point-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to recognize: in line with previous studies we found a happy-face advantage for faces, which for the first time was also found for bodies (a happy-body advantage). Furthermore, LAT participants recognized sadness better from static faces and fear better from PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, and (ii) the cues exploited for emotion recognition by the LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  6. Decoding facial blends of emotion: visual field, attentional and hemispheric biases.

    PubMed

    Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I

    2013-12-01

    Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention when facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower facial emotions, more so than upper facial emotions, are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.

  7. Musical chords and emotion: major and minor triads are processed for emotion.

    PubMed

    Bakker, David Radford; Martin, Frances Heritage

    2015-03-01

    Musical chords are arguably the smallest building blocks of music that retain emotional information. Major chords are generally perceived as positive- and minor chords as negative-sounding, but there has been debate concerning how early these emotional connotations may be processed. To investigate this, emotional facial stimuli and musical chord stimuli were simultaneously presented to participants, and facilitation of processing was measured via event-related potential (ERP) amplitudes. Decreased amplitudes of the P1 and N2 ERP components have been found to index the facilitation of early processing. If simultaneously presented musical chords and facial stimuli are perceived at early stages as belonging to the same emotional category, then early processing should be facilitated for these congruent pairs, and ERP amplitudes should therefore be decreased as compared to the incongruent pairs. ERPs were recorded from 30 musically naive participants as they viewed happy, sad, and neutral faces presented simultaneously with a major or minor chord. When faces and chords were presented that contained congruent emotional information (happy-major or sad-minor), processing was facilitated, as indexed by decreased N2 ERP amplitudes. This suggests that musical chords do possess emotional connotations that can be processed as early as 200 ms in naive listeners. The early stages of processing that are involved suggest that major and minor chords have deeply connected emotional meanings, rather than superficially attributed ones, indicating that minor triads possess negative emotional connotations and major triads possess positive emotional connotations.

  8. An Event-Related Potential Study on the Effects of Cannabis on Emotion Processing

    PubMed Central

    Troup, Lucy J.; Bastidas, Stephanie; Nguyen, Maia T.; Andrzejewski, Jeremy A.; Bowers, Matthew; Nomi, Jason S.

    2016-01-01

    The effect of cannabis on emotional processing was investigated using event-related potential (ERP) paradigms. ERPs associated with emotional processing in cannabis users and non-using controls were recorded and compared during an implicit and explicit emotional expression recognition and empathy task. Comparisons in P3 component mean amplitudes were made between cannabis users and controls. Results showed a significant decrease in the P3 amplitude in cannabis users compared to controls. Specifically, cannabis users showed reduced P3 amplitudes for implicit compared to explicit processing over centro-parietal sites, a pattern that reversed, and was enhanced, at fronto-central sites. Cannabis users also showed a decreased P3 to happy faces, with an increase to angry faces, compared to controls. These effects appeared to increase with self-reported cannabis consumption: those cannabis users with the greatest consumption rates showed the largest P3 deficits for explicit processing and negative emotions. These data suggest that there is a complex relationship between cannabis consumption and emotion processing that appears to be modulated by attention. PMID:26926868

  9. Human versus Non-Human Face Processing: Evidence from Williams Syndrome

    ERIC Educational Resources Information Center

    Santos, Andreia; Rosset, Delphine; Deruelle, Christine

    2009-01-01

    Increased motivation towards social stimuli in Williams syndrome (WS) led us to hypothesize that a face's human status would have greater impact than face's orientation on WS' face processing abilities. Twenty-nine individuals with WS were asked to categorize facial emotion expressions in real, human cartoon and non-human cartoon faces presented…

  10. Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing

    PubMed Central

    Wieser, Matthias J.; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model of face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues for future research. PMID:23130011

  11. Task-irrelevant emotion facilitates face discrimination learning.

    PubMed

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Anterior cingulate activation is related to a positivity bias and emotional stability in successful aging.

    PubMed

    Brassen, Stefanie; Gamer, Matthias; Büchel, Christian

    2011-07-15

    Behavioral studies have consistently reported an increased preference for positive experiences in older adults. The socio-emotional selectivity theory explains this positivity effect with a motivated goal shift in emotion regulation, which probably depends on available cognitive resources. The present study investigates the neurobiological mechanism underlying this hypothesis. Functional magnetic resonance imaging data were acquired in 21 older and 22 young subjects while they performed a spatial-cueing paradigm that manipulates attentional load on emotional face distracters. We focused our analyses on the anterior cingulate cortex as a key structure in the cognitive control of emotion. Elderly subjects showed a specifically increased distractibility by happy faces when more attentional resources were available for face processing. This effect was paralleled by an increased engagement of the rostral anterior cingulate cortex, and this frontal engagement was significantly correlated with emotional stability. The current study highlights how the brain might mediate the tendency to preferentially engage in positive information processing in healthy aging. The finding of a resource-dependency of this positivity effect suggests demanding self-regulating processes that are related to emotional well-being. These findings are of particular relevance regarding implications for the understanding, treatment, and prevention of unsuccessful aging, such as highly prevalent late-life depression. Copyright © 2011 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  13. Emotional intelligence is associated with reduced insula responses to masked angry faces.

    PubMed

    Alkozei, Anna; Killgore, William D S

    2015-07-08

    High levels of emotional intelligence (EI) have been associated with increased success in the workplace, greater quality of personal relationships, and enhanced wellbeing. Evidence suggests that EI is mediated extensively by the interplay of key emotion regions including the amygdala, insula, and ventromedial prefrontal cortex, among others. The insula, in particular, is important for processing interoceptive and somatic cues that are interpreted as emotional responses. We investigated the association between EI and functional brain responses within the aforementioned neurocircuitry in response to subliminal presentations of social threat. Fifty-four healthy adults completed the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) and underwent functional magnetic resonance imaging while viewing subliminal presentations of faces displaying anger, using a backward masked facial affect paradigm to minimize conscious awareness of the expressed emotion. In response to masked angry faces, total MSCEIT scores correlated negatively with a cluster of activation located within the left insula, but not with activation in any other region of interest. Considering the insula's role in the processing of interoceptive emotional cues, the results suggest that greater EI is associated with reduced emotional visceral reactivity and/or more accurate interoceptive prediction when confronted with stimuli indicative of social threat.

  14. Age-Related Developmental and Individual Differences in the Influence of Social and Non-social Distractors on Cognitive Performance.

    PubMed

    Tan, Patricia Z; Silk, Jennifer S; Dahl, Ronald E; Kronhaus, Dina; Ladouceur, Cecile D

    2018-01-01

    This study sought to examine age-related differences in the influence of social (neutral, emotional faces) and non-social/non-emotional (shapes) distractor stimuli in children, adolescents, and adults. To assess the degree to which distractor, or task-irrelevant, stimuli of varying social and emotional salience interfere with cognitive performance, children (N = 12; 8-12y), adolescents (N = 17; 13-17y), and adults (N = 17; 18-52y) completed the Emotional Identification and Dynamic Faces (EIDF) task. This task included three types of dynamically-changing distractors: (1) neutral-social (a neutral face changing into another face); (2) emotional-social (a face changing from 0% emotional to 100% emotional); and (3) non-social/non-emotional (shapes changing from small to large) to index the influence of task-irrelevant social and emotional information on cognition. Results yielded no age-related differences in accuracy but showed an age-related linear reduction in correct reaction times across distractor conditions. An age-related effect on interference was observed, such that children and adults showed slower response times on correct trials with socially-salient distractors, whereas adolescents exhibited faster responses on trials with distractors that included faces rather than shapes. A secondary study goal was to explore individual differences in cognitive interference. Results suggested that regardless of age, low trait anxiety and high effortful control were associated with interference to angry faces. Implications for developmental differences in affective processing, notably the importance of considering the contexts in which purportedly irrelevant social and emotional information might impair vs. improve cognitive control, are discussed.

  15. Laterality Biases to Chimeric Faces in Asperger Syndrome: What Is Right about Face-Processing?

    ERIC Educational Resources Information Center

    Ashwin, Chris; Wheelwright, Sally; Baron-Cohen, Simon

    2005-01-01

    People show a left visual field (LVF) bias for faces, i.e., involving the right hemisphere of the brain. Lesion and neuroimaging studies confirm the importance of the right-hemisphere and suggest separable neural pathways for processing facial identity vs. emotions. We investigated the hemispheric processing of faces in adults with and without…

  16. Pupillary responses reveal infants' discrimination of facial emotions independent of conscious perception.

    PubMed

    Jessen, Sarah; Altvater-Mackensen, Nicole; Grossmann, Tobias

    2016-05-01

    Sensitive responding to others' emotions is essential during social interactions among humans. There is evidence for the existence of subcortically mediated emotion discrimination processes that occur independent of conscious perception in adults. However, only recently work has begun to examine the development of automatic emotion processing systems during infancy. In particular, it is unclear whether emotional expressions impact infants' autonomic nervous system regardless of conscious perception. We examined this question by measuring pupillary responses while subliminally and supraliminally presenting 7-month-old infants with happy and fearful faces. Our results show greater pupil dilation, indexing enhanced autonomic arousal, in response to happy compared to fearful faces regardless of conscious perception. Our findings suggest that, early in ontogeny, emotion discrimination occurs independent of conscious perception and is associated with differential autonomic responses. This provides evidence for the view that automatic emotion processing systems are an early-developing building block of human social functioning. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. The Impact of Emotional States on Cognitive Control Circuitry and Function.

    PubMed

    Cohen, Alexandra O; Dellarco, Danielle V; Breiner, Kaitlyn; Helion, Chelsea; Heller, Aaron S; Rahdar, Ahrareh; Pedersen, Gloria; Chein, Jason; Dyke, Jonathan P; Galvan, Adriana; Casey, B J

    2016-03-01

    Typically in the laboratory, cognitive and emotional processes are studied separately or as a stream of fleeting emotional stimuli embedded within a cognitive task. Yet in life, thoughts and actions often occur in more lasting emotional states of arousal. The current study examines the impact of emotions on actions using a novel behavioral paradigm and functional neuroimaging to assess cognitive control under sustained states of threat (anticipation of an aversive noise) and excitement (anticipation of winning money). Thirty-eight healthy adult participants were scanned while performing an emotional go/no-go task with positive (happy faces), negative (fearful faces), and neutral (calm faces) emotional cues, under threat or excitement. Cognitive control performance was enhanced during the excited state relative to a nonarousing control condition. This enhanced performance was paralleled by heightened activity of frontoparietal and frontostriatal circuitry. In contrast, under persistent threat, cognitive control was diminished when the valence of the emotional cue conflicted with the emotional state. Successful task performance in this conflicting emotional condition was associated with increased activity in the posterior cingulate cortex, a default mode network region implicated in complex processes such as processing emotions in the context of self and monitoring performance. This region showed positive coupling with frontoparietal circuitry implicated in cognitive control, providing support for a role of the posterior cingulate cortex in mobilizing cognitive resources to improve performance. These findings suggest that emotional states of arousal differentially modulate cognitive control and point to the potential utility of this paradigm for understanding effects of situational and pathological states of arousal on behavior.

  18. From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome

    ERIC Educational Resources Information Center

    Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques

    2009-01-01

    Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…

  19. Processing of Emotional Faces in Patients with Chronic Pain Disorder: An Eye-Tracking Study.

    PubMed

    Giel, Katrin Elisabeth; Paganini, Sarah; Schank, Irena; Enck, Paul; Zipfel, Stephan; Junne, Florian

    2018-01-01

    Problems in emotion processing potentially contribute to the development and maintenance of chronic pain. Theories focusing on attentional processing have suggested that dysfunctional attention deployment toward emotional information, i.e., attentional biases for negative emotions, might constitute one potential developmental and/or maintenance factor of chronic pain. We assessed self-reported alexithymia and attentional orienting to and maintenance on emotional stimuli, using eye tracking, in 17 patients with chronic pain disorder (CP) and two age- and sex-matched control groups: 17 healthy individuals (HC) and 17 individuals matched to CP according to depressive symptoms (DC). In a choice viewing paradigm, a dot indicated the position of the emotional picture in the next trial to allow for strategic attention deployment. Picture pairs consisted of a happy or sad facial expression and a neutral facial expression of the same individual. Participants were asked to explore picture pairs freely. The CP and DC groups reported higher alexithymia than the HC group. HC participants showed a previously reported emotionality bias by preferentially orienting to the emotional face and preferentially maintaining on the happy face. CP and DC participants showed no facilitated early attention to sad facial expressions, and DC participants showed no facilitated early attention to happy facial expressions, while CP and HC participants did. We found no group differences in attentional maintenance. Our findings are in line with the large clinical overlap between pain and depression. The blunted initial reaction to sadness could be interpreted as a failure of the attentional system to attend to evolutionarily salient emotional stimuli or as an attempt to suppress negative emotions. These difficulties in emotion processing might contribute to the etiology or maintenance of chronic pain and depression.

  20. Perception of face and body expressions using electromyography, pupillometry and gaze measures.

    PubMed

    Kret, Mariska E; Stekelenburg, Jeroen J; Roelofs, Karin; de Gelder, Beatrice

    2013-01-01

    Traditional emotion theories stress the importance of the face in the expression of emotions, but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants' fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions were measured with electromyography. The results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, vice versa as well. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As we predicted and found, angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body facilitates recognition of the emotion.

  1. Perception of Face and Body Expressions Using Electromyography, Pupillometry and Gaze Measures

    PubMed Central

    Kret, Mariska E.; Stekelenburg, Jeroen J.; Roelofs, Karin; de Gelder, Beatrice

    2013-01-01

    Traditional emotion theories stress the importance of the face in the expression of emotions, but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants’ fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions were measured with electromyography. The results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, vice versa as well. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As we predicted and found, angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body facilitates recognition of the emotion. PMID:23403886

  2. Fixation to features and neural processing of facial expressions in a gender discrimination task.

    PubMed

    Neath, Karly N; Itier, Roxane J

    2015-10-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected a general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    PubMed

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms that underlie, and differentiate, suicide attempts and non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the Diagnostic Assessment of Nonverbal Accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps < 0.05). No differences in emotional face recognition were found between the NSSI and SA groups. Secondary analyses showed that compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p < 0.05). Further, compared to inpatients without generalized anxiety, those with generalized anxiety made fewer recognition errors on adult happy faces even when controlling for group status (p < 0.05). Adolescent inpatients engaged in NSSI thus showed greater deficits in emotional face recognition than TDC youth, but did not differ from inpatient adolescents who had attempted suicide. These results further suggest that psychopathology plays an important role in emotional face recognition. Replication of these preliminary results and examination of the role of context-dependent emotional processing are needed moving forward.

  4. Early and late temporo-spatial effects of contextual interference during perception of facial affect.

    PubMed

    Frühholz, Sascha; Fehr, Thorsten; Herrmann, Manfred

    2009-10-01

    Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color combinations as in the first run (congruent trials) or with different face-color combinations (incongruent trials). Incongruent trials induced significantly increased response latencies and significantly decreased performance accuracy. Contextual incongruent information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN), both localized in occipito-temporal areas. Contextual congruent information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. The highest N170 amplitude was found for negative expressions in a negatively associated context, and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity, which was localized to superior parietal cortex and most likely represents a posterior manifestation of the N450, an indicator of conflict processing. A sustained activation of the late positive potential (LPP) over parietal cortex for all incongruent trials might reflect enhanced engagement with the facial expression under task conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expressions in specific contexts, late components seem to subserve interference resolution during emotional face processing.

  5. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    PubMed

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions, including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied, for the first time, psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind, balanced, cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. Compared with placebo, psilocybin increased reaction times for all three categories of affective stimuli. Psilocybin decreased the connectivity between the amygdala and the striatum during angry face discrimination, and between the amygdala and the frontal pole during happy face discrimination. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala: psilocybin decreases the connectivity between important nodes linked to emotion processing, such as the frontal pole and the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.

  6. Preschoolers' real-time coordination of vocal and facial emotional information.

    PubMed

    Berman, Jared M J; Chambers, Craig G; Graham, Susan A

    2016-02-01

    An eye-tracking methodology was used to examine the time course of 3- and 5-year-olds' ability to link speech bearing different acoustic cues to emotion (i.e., happy-sounding, neutral, and sad-sounding intonation) to photographs of faces reflecting different emotional expressions. Analyses of saccadic eye movement patterns indicated that, for both 3- and 5-year-olds, sad-sounding speech triggered gaze shifts to a matching (sad-looking) face from the earliest moments of speech processing. However, it was not until approximately 800 ms into a happy-sounding utterance that preschoolers began to use the emotional cues from speech to identify a matching (happy-looking) face. Complementary analyses based on conscious/controlled behaviors (children's explicit points toward the faces) indicated that 5-year-olds, but not 3-year-olds, could successfully match happy-sounding and sad-sounding vocal affect to a corresponding emotional face. Together, the findings clarify developmental patterns in preschoolers' implicit versus explicit ability to coordinate emotional cues across modalities and highlight preschoolers' greater sensitivity to sad-sounding speech as the auditory signal unfolds in time. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Neural signatures of conscious and unconscious emotional face processing in human infants.

    PubMed

    Jessen, Sarah; Grossmann, Tobias

    2015-03-01

    Human adults can process emotional information both with and without conscious awareness, and it has been suggested that the two processes rely on partly distinct brain mechanisms. However, the developmental origins of these brain processes are unknown. In the present event-related brain potential (ERP) study, we examined the brain responses of 7-month-old infants to subliminally (50 and 100 msec) and supraliminally (500 msec) presented happy and fearful facial expressions. Our results revealed that infants' brain responses (Pb and Nc) over central electrodes distinguished between emotions irrespective of stimulus duration, whereas the discrimination between emotions at occipital electrodes (N290 and P400) only occurred when faces were presented supraliminally (above threshold). This suggests that early in development the human brain not only discriminates between happy and fearful facial expressions irrespective of conscious perception, but also that, similar to adults, supraliminal and subliminal emotion processing rely on distinct neural processes. Our data further suggest that the processing of emotional facial expressions differs across infants depending on their behaviorally shown perceptual sensitivity. The current ERP findings suggest that distinct brain processes underpinning conscious and unconscious emotion perception emerge early in ontogeny and can therefore be seen as a key feature of human social functioning. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Ventromedial prefrontal cortex mediates visual attention during facial emotion recognition.

    PubMed

    Wolf, Richard C; Philippi, Carissa L; Motzkin, Julian C; Baskaya, Mustafa K; Koenigs, Michael

    2014-06-01

    The ventromedial prefrontal cortex is known to play a crucial role in regulating human social and emotional behaviour, yet the precise mechanisms by which it subserves this broad function remain unclear. Whereas previous neuropsychological studies have largely focused on the role of the ventromedial prefrontal cortex in higher-order deliberative processes related to valuation and decision-making, here we test whether the ventromedial prefrontal cortex may also be critical for more basic aspects of orienting attention to socially and emotionally meaningful stimuli. Using eye tracking during a test of facial emotion recognition in a sample of lesion patients, we show that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. This finding demonstrates a heretofore unrecognized function of the ventromedial prefrontal cortex: the basic attentional process of controlling eye movements to faces expressing emotion. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved.

  9. Emotion and sex of facial stimuli modulate conditional automaticity in behavioral and neuronal interference in healthy men.

    PubMed

    Kohn, Nils; Fernández, Guillén

    2017-12-06

    Our surroundings provide a host of sensory input, which we cannot fully process without streamlining and automatic processing. Levels of automaticity differ for different cognitive and affective processes, and situational and contextual interactions between cognitive and affective processes in turn influence the level of automaticity. Automaticity can be measured by interference in Stroop tasks. We applied an emotional version of the Stroop task to investigate how stress, as a contextual factor, influences the affective valence-dependent level of automaticity. 120 young, healthy men were investigated for behavioral and brain interference following a stress induction or a control procedure in a counterbalanced cross-over design. Although Stroop interference was always observed, the sex and emotion of the face strongly modulated interference, which was larger for fearful and male faces. These effects suggest higher automaticity when processing happy and female faces. Supporting the behavioral patterns, brain data show lower interference-related brain activity in executive-control-related regions in response to happy and female faces. In the absence of behavioral stress effects, congruent compared to incongruent trials (reverse interference) showed little to no deactivation under stress in response to happy female and fearful male trials. These congruency effects are potentially based on altered context- and stress-related facial processing that interacts with sex-emotion stereotypes. Results indicate that sex and facial emotion modulate Stroop interference in brain and behavior. These effects can be explained by altered response difficulty as a consequence of the contextual and stereotype-related modulation of automaticity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    PubMed

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. Our aim was to identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral expressions, presented using a backward masking design. The blood-oxygen-level-dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants thus showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  11. MEG Evidence for Dynamic Amygdala Modulations by Gaze and Facial Emotions

    PubMed Central

    Dumas, Thibaud; Dubal, Stéphanie; Attal, Yohan; Chupin, Marie; Jouvent, Roland; Morel, Shasha; George, Nathalie

    2013-01-01

    Background The amygdala is a key brain region for face perception. While the role of the amygdala in the perception of facial emotion and gaze has been extensively highlighted with fMRI, the unfolding in time of amygdala responses to emotional versus neutral faces with different gaze directions is scarcely known. Methodology/Principal Findings Here we addressed this question in healthy subjects using MEG combined with an original source imaging method based on individual amygdala volume segmentation and the localization of sources in the amygdala volume. We found an early peak of amygdala activity that was enhanced for fearful relative to neutral faces between 130 and 170 ms. The effect of emotion was again significant in a later time range (310–350 ms). Moreover, the amygdala response was greater for direct relative to averted gaze between 190 and 350 ms, and this effect was selective of fearful faces in the right amygdala. Conclusion Altogether, our results show that the amygdala is involved in the processing and integration of emotion and gaze cues from faces in different time ranges, thus underlining its role in multiple stages of face perception. PMID:24040190

  12. Association between amygdala response to emotional faces and social anxiety in autism spectrum disorders.

    PubMed

    Kleinhans, Natalia M; Richards, Todd; Weaver, Kurt; Johnson, L Clark; Greenson, Jessica; Dawson, Geraldine; Aylward, Elizabeth

    2010-10-01

    Difficulty interpreting facial expressions has been reported in autism spectrum disorders (ASD) and is thought to be associated with amygdala abnormalities. To further explore the neural basis of abnormal emotional face processing in ASD, we conducted an fMRI study of emotional face matching in high-functioning adults with ASD and age, IQ, and gender matched controls. In addition, we investigated whether there was a relationship between self-reported social anxiety and fMRI activation. During fMRI scanning, study participants were instructed to match facial expressions depicting fear or anger. The control condition was a comparable shape-matching task. The control group evidenced significantly increased left prefrontal activation and decreased activation in the occipital lobes compared to the ASD group during emotional face matching. Further, within the ASD group, greater social anxiety was associated with increased activation in right amygdala and left middle temporal gyrus, and decreased activation in the fusiform face area. These results indicate that level of social anxiety mediates the neural response to emotional face perception in ASD. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Functional MRI of facial emotion processing in left temporal lobe epilepsy.

    PubMed

    Szaflarski, Jerzy P; Allendorfer, Jane B; Heyse, Heidi; Mendoza, Lucy; Szaflarski, Basia A; Cohen, Nancy

    2014-03-01

    Temporal lobe epilepsy (TLE) may negatively affect the ability to recognize emotions. This study aimed to determine the cortical correlates of facial emotion processing (happy, sad, fearful, and neutral) in patients with well-characterized left TLE (LTLE) and to examine the effect of seizure control on emotion processing. We enrolled 34 consecutive patients with LTLE and 30 matched healthy control (HC) subjects. Participants underwent functional MRI (fMRI) with an event-related facial emotion recognition task. Seventeen patients had controlled seizures (no seizure in at least 3 months; LTLE-sz), and 17 continued to experience frequent seizures (LTLE+sz). Mood was assessed with the Beck Depression Inventory (BDI) and the Profile of Mood States (POMS). There were no differences in demographic characteristics and measures of mood between HC subjects and patients with LTLE. In patients with LTLE, fMRI showed decreased blood-oxygenation-level-dependent (BOLD) signal in the hippocampus/parahippocampus and cerebellum in processing of happy faces and increased BOLD signal in occipital regions in response to fearful faces. Comparison of the LTLE+sz and LTLE-sz groups showed worse BDI and POMS scores in LTLE+sz (all p<0.05) except for POMS tension/anxiety (p=0.067). Functional MRI revealed increased BOLD signal in patients with LTLE+sz in the left precuneus and left parahippocampus for "fearful" faces and in the left periarcheocortex for "neutral" faces. There was a correlation between the fMRI signal and Total Mood Disturbance in the left precuneus in LTLE-sz (p=0.019) and in LTLE+sz (p=0.018). Overall, LTLE appears to have a relatively minor effect on the cortical underpinnings of facial emotion processing, while the effect of seizure state (controlled vs. not controlled) is more pronounced, indicating a significant relationship between seizure control and emotion processing. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. Neural correlates of text-based emoticons: a preliminary fMRI study.

    PubMed

    Kim, Ko Woon; Lee, Sang Won; Choi, Jeewook; Kim, Tae Min; Jeong, Bumseok

    2016-08-01

    Like nonverbal cues in oral interactions, text-based emoticons, which are textual portrayals of a writer's facial expressions, are commonly used in electronic device-mediated communication. Little is known, however, about how text-based emoticons are processed in the human brain. In this study, we used fMRI to investigate whether text-based emoticons are processed as facial expressions. During the fMRI scan, subjects were asked to respond by pressing a button, indicating whether text-based emoticons represented positive or negative emotions. Voxel-wise analyses were performed to compare the responses to emotional versus scrambled emoticons and among emoticons with different emotions. To explore processing strategies for text-based emoticons, brain activity in the bilateral occipital and fusiform face areas was compared. In the voxel-wise analysis, both emotional and scrambled emoticons were processed mainly in the bilateral fusiform gyri, the inferior division of the lateral occipital cortex, the inferior frontal gyri, the dorsolateral prefrontal cortex (DLPFC), the dorsal anterior cingulate cortex (dACC), and the parietal cortex. In a percent-signal-change analysis, the right occipital and fusiform face areas showed significantly higher activation than the left ones. In comparisons among emoticons, the sad emoticon showed a significant BOLD signal decrease in the dACC, the left anterior insular cortex (AIC), the bilateral thalamus, and the precuneus compared with the other conditions. The results of this study imply that people recognize text-based emoticons as pictures representing facial expressions. Even though text-based emoticons carry emotional meaning, they were not associated with amygdala activation, whereas previous studies using emotional stimuli documented amygdala activation.

  15. Cultural differences in on-line sensitivity to emotional voices: comparing East and West

    PubMed Central

    Liu, Pan; Rigoulot, Simon; Pell, Marc D.

    2015-01-01

    Evidence that culture modulates on-line neural responses to the emotional meanings encoded by vocal and facial expressions was demonstrated recently in a study comparing English North Americans and Chinese (Liu et al., 2015). Here, we compared how individuals from these two cultures passively respond to emotional cues from faces and voices using an Oddball task. Participants viewed in-group emotional faces, with or without simultaneous vocal expressions, while performing a face-irrelevant visual task as the EEG was recorded. A significantly larger visual Mismatch Negativity (vMMN) was observed for Chinese vs. English participants when faces were accompanied by voices, suggesting that Chinese were influenced to a larger extent by task-irrelevant vocal cues. These data highlight further differences in how adults from East Asian vs. Western cultures process socio-emotional cues, arguing that distinct cultural practices in communication (e.g., display rules) shape neurocognitive activity associated with the early perception and integration of multi-sensory emotional cues. PMID:26074808

  16. Emotional reactivity and its impact on neural circuitry for attention-emotion interaction in childhood and adolescence

    PubMed Central

    Perlman, Susan B.; Hein, Tyler C.; Stepp, Stephanie D.

    2013-01-01

    Attention modulation when confronted with emotional stimuli is considered a critical aspect of executive function, yet it is rarely studied during childhood and adolescence, a developmental period marked by changes in these processes. We employed a novel, child-friendly fMRI task that used emotional faces to investigate the neural underpinnings of the attention-emotion interaction in a child and adolescent sample (n=23, Age m=13.46, sd=2.86, range=8.05–16.93 years). Results implied modulation of activation in the orbitofrontal cortex (OFC) by emotional distractor valence, which marginally correlated with participant age. Additionally, parent-reported emotional reactivity predicted the trajectory of BOLD signal increase for fearful emotional face distractors, such that participants low in emotional reactivity had a steeper latency to peak activation. Results imply that the use of the OFC to modulate attention in the face of social/emotional stimuli may mature with age and may be tightly coupled with adaptive emotional functioning. Findings are discussed in the context of risk for the development of psychiatric disorders, in which increased emotional reactivity is particularly apparent. PMID:24055416

  17. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    PubMed

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  18. Facial expressions perceived by the adolescent brain: Towards the proficient use of low spatial frequency information.

    PubMed

    Peters, Judith C; Kemner, Chantal

    2017-10-01

    Rapid decoding of emotional expressions is essential for social communication. Fast processing of facial expressions depends on the adequate (subcortical) processing of important global face cues in the low spatial frequency (LSF) ranges. However, children below 9 years of age extract fearful expression information from local details represented by high SF (HSF) image content. Our ERP study investigated at which developmental stage this ineffective HSF-driven processing is replaced by the proficient and rapid LSF-driven perception of fearful faces, in which adults are highly skilled. We examined behavioral and neural responses to high- and low-pass filtered faces with a fearful or neutral expression in groups of children on the verge of pre-adolescence (9-10 years), adolescents (14-15 years), and young adults (20-28 years). Our results suggest that the neural emotional face processing network has a protracted maturational course into adolescence, which is related to changes in SF processing. In mid-adolescence, increased sensitivity to emotional LSF cues is developed, which aids the fast and adequate processing of fearful expressions that might signal impending danger. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. “Who said that?” Matching of low- and high-intensity emotional prosody to facial expressions by adolescents with ASD

    PubMed Central

    Grossman, Ruth B; Tager-Flusberg, Helen

    2012-01-01

    Data on emotion processing by individuals with ASD suggest both intact abilities and significant deficits. Signal intensity may be a contributing factor to this discrepancy. We presented low- and high-intensity emotional stimuli in a face-voice matching task to 22 adolescents with ASD and 22 typically developing (TD) peers. Participants heard semantically neutral sentences with happy, surprised, angry, and sad prosody presented at two intensity levels (low, high) and matched them to emotional faces. The facial expression choice was either across- or within-valence. Both groups were less accurate for low-intensity emotions, but the ASD participants' accuracy levels dropped off more sharply. ASD participants were significantly less accurate than their TD peers for trials involving low-intensity emotions and within-valence face contrasts. PMID:22450703

  20. Visual search for facial expressions of emotions: a comparison of dynamic and static faces.

    PubMed

    Horstmann, Gernot; Ansorge, Ulrich

    2009-02-01

    A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (c) 2009 APA, all rights reserved

  1. Distinct Facial Processing Related Negative Cognitive Bias in First-Episode and Recurrent Major Depression: Evidence from the N170 ERP Component

    PubMed Central

    Chen, Jiu; Ma, Wentao; Zhang, Yan; Wu, Xingqu; Wei, Dunhong; Liu, Guangxiong; Deng, Zihe; Yang, Laiqi; Zhang, Zhijun

    2014-01-01

    Background States of depression are associated with increased sensitivity to negative events. In this study, we assessed the relationship between the number of depressive episodes and the dysfunctional processing of emotional facial expressions. Methodology/Principal Findings We used a visual emotional oddball paradigm to manipulate the processing of emotional information while event-related brain potentials were recorded in 45 patients with first-episode major depression (F-MD), 40 patients with recurrent major depression (R-MD), and 46 healthy controls (HC). Compared with the HC group, F-MD patients had lower N170 amplitudes when identifying happy, neutral, and sad faces; R-MD patients had lower N170 amplitudes when identifying happy and neutral faces, but higher N170 amplitudes when identifying sad faces. F-MD patients had longer N170 latencies when identifying happy, neutral, and sad faces relative to the HC group, and R-MD patients had longer N170 latencies when identifying happy and neutral faces, but shorter N170 latencies when identifying sad faces compared with F-MD patients. Interestingly, N170 amplitude was negatively correlated with depressive severity score for identification of happy faces in R-MD patients, while N170 amplitude was positively correlated with depressive severity score for identification of sad faces in F-MD and R-MD patients. Additionally, the N170 amplitude deficits for sad faces positively correlated with the number of depressive episodes in R-MD patients. Conclusion/Significance These results provide new evidence that more recurrent depressive episodes and more severe depressive states are likely to aggravate the already abnormal processing of emotional facial expressions in patients with depression. They further suggest that impaired processing, as indexed by N170 amplitude, for positive face identification may be a useful biomarker for predicting the progression of depression, while N170 amplitude for negative face identification could be a potential biomarker for depression recurrence. PMID:25314024

  2. The perception of emotion in body expressions.

    PubMed

    de Gelder, B; de Borst, A W; Watson, R

    2015-01-01

    During communication, we perceive and express emotional information through many different channels, including facial expressions, prosody, body motion, and posture. Although historically the human body has been perceived primarily as a tool for actions, there is now increased understanding that the body is also an important medium for emotional expression. Indeed, research on emotional body language is rapidly emerging as a new field in cognitive and affective neuroscience. This article reviews how whole-body signals are processed and understood, at the behavioral and neural levels, with specific reference to their role in emotional communication. The first part of this review outlines the brain regions and spectrotemporal dynamics underlying perception of isolated neutral and affective bodies, the second part details contextual effects on body emotion recognition, and the final part discusses body processing at the subconscious level. More specifically, research has shown that body expressions, compared with neutral bodies, draw upon a larger network of regions responsible for action observation and preparation, emotion processing, body processing, and integrative processes. Results from neurotypical populations and masking paradigms suggest that subconscious processing of affective bodies relies on a specific subset of these regions. Moreover, recent evidence has shown that emotional information from the face, voice, and body all interact, with body motion and posture often highlighting and intensifying the emotion expressed in the face and voice. © 2014 John Wiley & Sons, Ltd.

  3. Neural processing of emotional facial and semantic expressions in euthymic bipolar disorder (BD) and its association with theory of mind (ToM).

    PubMed

    Ibanez, Agustin; Urquina, Hugo; Petroni, Agustín; Baez, Sandra; Lopez, Vladimir; do Nascimento, Micaela; Herrera, Eduar; Guex, Raphael; Hurtado, Esteban; Blenkmann, Alejandro; Beltrachini, Leandro; Gelormini, Carlos; Sigman, Mariano; Lischinsky, Alicia; Torralva, Teresa; Torrente, Fernando; Cetkovich, Marcelo; Manes, Facundo

    2012-01-01

    Adults with bipolar disorder (BD) have cognitive impairments that affect face processing and social cognition. However, it remains unknown whether these deficits in euthymic BD are accompanied by impaired brain markers of emotional processing. We recruited twenty-six participants: 13 control subjects and 13 euthymic BD patients. We used an event-related potential (ERP) assessment of a dual valence task (DVT), in which faces (angry and happy), words (pleasant and unpleasant), and simultaneous face-word combinations are presented to test the effects of stimulus type (face vs. word) and valence (positive vs. negative). All participants received clinical, neuropsychological, and social cognition evaluations. ERP analysis revealed that both groups showed N170 modulation by stimulus type (face > word). BD patients exhibited reduced and enhanced N170 to facial and semantic valence, respectively. The neural source of the N170 was estimated in a posterior section of the fusiform gyrus (FG), including the fusiform face area (FFA). Neural generators of the N170 for faces (FG and FFA) were reduced in BD. In these patients, N170 modulation was associated with social cognition (theory of mind). This is the first report of euthymic BD patients exhibiting abnormal N170 emotional discrimination associated with theory of mind impairments.

  4. The serotonin transporter gene polymorphism and the effect of baseline on amygdala response to emotional faces.

    PubMed

    von dem Hagen, Elisabeth A H; Passamonti, Luca; Nutland, Sarah; Sambrook, Jennifer; Calder, Andrew J

    2011-03-01

    Previous research has found that a common polymorphism in the serotonin transporter gene (5-HTTLPR) is an important mediator of individual differences in brain responses associated with emotional behaviour. In particular, relative to individuals homozygous for the l-allele, carriers of the s-allele display heightened amygdala activation to emotional compared to non-emotional stimuli. However, there is some debate as to whether this difference is driven by increased activation to emotional stimuli, resting baseline differences between the groups, or decreased activation to neutral stimuli. We performed functional imaging during an implicit facial expression processing task in which participants viewed angry, sad and neutral faces. In addition to neutral faces, we included two further baseline conditions, houses and fixation. We found increased amygdala activation in s-allele carriers relative to l-homozygotes in response to angry faces compared to neutral faces, houses and fixation. When comparing neutral faces to houses or fixation, we found no significant difference in amygdala response between the two groups. In addition, there was no significant difference between the groups in response to fixation when compared with a houses baseline. Overall, these results suggest that the increased amygdala response observed in s-allele carriers to emotional faces is primarily driven by an increased response to emotional faces rather than a decreased response to neutral faces or an increased resting baseline. The results are discussed in relation to the tonic and phasic hypotheses of 5-HTTLPR-mediated modulation of amygdala activity. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    PubMed

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies of facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of athletes' transient peak-intensity expressions at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds while their eye movements were recorded. The results revealed that isolated bodies and face-body congruent images were better recognized than isolated faces and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, again indicating unconscious perception of peak facial expressions. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level.

  7. Developmental differences in the neural mechanisms of facial emotion labeling

    PubMed Central

    Adleman, Nancy E.; Kim, Pilyoung; Oakes, Allison H.; Hsu, Derek; Reynolds, Richard C.; Chen, Gang; Pine, Daniel S.; Brotman, Melissa A.; Leibenluft, Ellen

    2016-01-01

    Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several ‘ventral stream’ brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. PMID:26245836

  8. Emotional face processing deficit in schizophrenia: A replication study in a South African Xhosa population.

    PubMed

    Leppänen, J M; Niehaus, D J H; Koen, L; Du Toit, E; Schoeman, R; Emsley, R

    2006-06-01

    Schizophrenia is associated with a deficit in the recognition of negative emotions from facial expressions. The present study examined the universality of this finding by studying facial expression recognition in an African Xhosa population. Forty-four Xhosa patients with schizophrenia and forty healthy controls were tested with a computerized task requiring rapid perceptual discrimination of matched positive (i.e., happy), negative (i.e., angry), and neutral faces. Patients were as accurate as controls in recognizing happy faces but showed a marked impairment in the recognition of angry faces. The impairment was particularly pronounced for high-intensity (open-mouth) angry faces. Patients also exhibited more false happy and angry responses to neutral faces than controls. No correlation between level of education or illness duration and emotion recognition was found, but the deficit in the recognition of negative emotions was more pronounced in familial than in non-familial cases of schizophrenia. These findings suggest that the deficit in the recognition of negative facial expressions may constitute a universal neurocognitive marker of schizophrenia.

  9. In the face of emotions: event-related potentials in supraliminal and subliminal facial expression recognition.

    PubMed

    Balconi, Michela; Lucchiari, Claudio

    2005-02-01

    Is facial expression recognition marked by specific event-related potential (ERP) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded while 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing proved quite similar to aware processing for peak amplitude but not for latency. In fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.

  10. How does context affect assessments of facial emotion? The role of culture and age

    PubMed Central

    Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara

    2010-01-01

    People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. PMID:21038967

  11. Intranasal oxytocin impedes the ability to ignore task-irrelevant facial expressions of sadness in students with depressive symptoms.

    PubMed

    Ellenbogen, Mark A; Linnen, Anne-Marie; Cardoso, Christopher; Joober, Ridha

    2013-03-01

    The administration of oxytocin promotes prosocial behavior in humans. The mechanism by which this occurs is unknown, but it likely involves changes in social information processing. In a randomized placebo-controlled study, we examined the influence of intranasal oxytocin and placebo on the interference control component of inhibition (i.e. ability to ignore task-irrelevant information) in 102 participants using a negative affective priming task with sad, angry, and happy faces. In this task, participants are instructed to respond to a facial expression of emotion while simultaneously ignoring another emotional face. On the subsequent trial, the previously-ignored emotional valence may become the emotional valence of the target face. Inhibition is operationalized as the differential delay between responding to a previously-ignored emotional valence and responding to an emotional valence unrelated to the previous one. Although no main effect of drug administration on inhibition was observed, a drug × depressive symptom interaction (β = -0.25; t = -2.6, p < 0.05) predicted the inhibition of sad faces. Relative to placebo, participants with high depression scores who were administered oxytocin were unable to inhibit the processing of sad faces. There was no relationship between drug administration and inhibition among those with low depression scores. These findings are consistent with increasing evidence that oxytocin alters social information processing in ways that have both positive and negative social outcomes. Because elevated depression scores are associated with an increased risk for major depressive disorder, difficulties inhibiting mood-congruent stimuli following oxytocin administration may be associated with risk for depression. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.

    PubMed

    Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M

    2014-11-01

    Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were administered in order to assess potentially confounding visuo-perceptual functioning and facial processing. Nineteen participants with paranoid SZ, 22 with HFASD, and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD in all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is a SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing and emotion recognition, but does show visuo-perceptual impairments. They also confirm the notion of a general facial and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of the two disorders. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Neurocognitive mechanisms behind emotional attention: Inverse effects of anodal tDCS over the left and right DLPFC on gaze disengagement from emotional faces.

    PubMed

    Sanchez-Lopez, Alvaro; Vanderhasselt, Marie-Anne; Allaert, Jens; Baeken, Chris; De Raedt, Rudi

    2018-06-01

    Attention to relevant emotional information in the environment is an important process related to vulnerability and resilience for mood and anxiety disorders. In the present study, the effects of left and right dorsolateral prefrontal cortex (DLPFC) stimulation on attentional mechanisms of emotional processing were tested and contrasted. A sample of 54 healthy participants received 20 min of active and sham anodal transcranial direct current stimulation (tDCS) of either the left (n = 27) or the right DLPFC (n = 27) on two separate days. The anode electrode was placed over the left or the right DLPFC, the cathode over the corresponding contralateral supraorbital area. After each neurostimulation session, participants completed an eye-tracking task assessing direct processes of attentional engagement towards and attentional disengagement away from emotional faces (happy, disgusted, and sad expressions). Compared to sham, active tDCS over the left DLPFC led to faster gaze disengagement, whereas active tDCS over the right DLPFC led to slower gaze disengagement from emotional faces. Between-group comparisons showed that these inverse change patterns were significantly different and generalized across all emotion types. Our findings support a lateralized role of left and right DLPFC activity in enhancing/worsening the top-down regulation of emotional attention processing. These results support the rationale of new therapies for affective disorders aimed at increasing the activation of the left over the right DLPFC in combination with attentional control training, and identify specific target attention mechanisms to be trained.

  14. The ties to unbind: age-related differences in feature (un)binding in working memory for emotional faces.

    PubMed

    Pehlivanoglu, Didem; Jain, Shivangi; Ariel, Robert; Verhaeghen, Paul

    2014-01-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyper-binding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both identity and expression of the face (both condition). The two age groups performed more slowly and with lower accuracy in the expression condition than in the both condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory. Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/STM and working memory. Pupil dilation data confirmed that the attention/STM version of the task (1-Back) is more effortful for older adults than younger adults.

  15. Age-Related Changes in the Processing of Emotional Faces in a Dual-Task Paradigm.

    PubMed

    Casares-Guillén, Carmen; García-Rodríguez, Beatriz; Delgado, Marisa; Ellgring, Heiner

    2016-01-01

    Background/Study Context: Age-related changes appear to affect the ability to identify emotional facial expressions in dual-task conditions (i.e., while simultaneously performing a second visual task). The level of interference generated by the secondary task depends on the phase of emotional processing affected by the interference and the nature of the secondary task. The aim of the present study was to investigate the effect of these variables on age-related changes in the processing of emotional faces. The identification of emotional facial expressions (EFEs) was assessed in a dual-task paradigm using the following variables: (a) the phase during which interference was applied (encoding vs. retrieval phase); and (b) the nature of the interfering stimulus (visuospatial vs. verbal). The sample consisted of 24 healthy older adults (mean age = 75.38) and 40 younger adults (mean age = 26.90). The accuracy of EFE identification was calculated for all experimental conditions. Consistent with our hypothesis, the performance of the older group was poorer than that of the younger group in all experimental conditions. Dual-task performance was poorer when the interference occurred during the encoding phase of emotional face processing and when both tasks were of the same nature (i.e., when the experimental condition was more demanding in terms of attention). These results provide empirical evidence of age-related deficits in the identification of emotional facial expressions, which may be partially explained by the impairment of cognitive resources specific to this task. These findings may account for the difficulties experienced by the elderly during social interactions that require the concomitant processing of emotional and environmental information.

  16. Guanfacine Modulates the Emotional Biasing of Amygdala-Prefrontal Connectivity for Cognitive Control

    PubMed Central

    Schulz, Kurt P.; Clerkin, Suzanne M.; Newcorn, Jeffrey H.; Halperin, Jeffrey M.; Fan, Jin

    2014-01-01

    Functional interactions between amygdala and prefrontal cortex provide a cortical entry point for emotional cues to bias cognitive control. Stimulation of α2 adrenoceptors enhances prefrontal control functions and blocks the amygdala-dependent encoding of emotional cues. However, the impact of this stimulation on amygdala-prefrontal interactions and the emotional biasing of cognitive control has not been established. We tested the effect of the α2 adrenoceptor agonist guanfacine on psychophysiological interactions of the amygdala with prefrontal cortex during the emotional biasing of response execution and inhibition. Fifteen healthy adults were scanned twice with event-related functional magnetic resonance imaging while performing an emotional go/no-go task following administration of oral guanfacine (1 mg) and placebo in a double-blind, counterbalanced design. Happy, sad, and neutral faces served as trial cues. Guanfacine moderated the effect of face emotion on the task-related functional connectivity of the left and right amygdala with the left inferior frontal gyrus compared to placebo, by selectively reversing the functional co-activation of the two regions for response execution cued by sad faces. This shift from positively to negatively correlated activation for guanfacine was associated with selective improvements in the relatively low accuracy of responses to sad faces seen for placebo. These results demonstrate the importance of functional interactions between amygdala and inferior frontal gyrus to both the bottom-up biasing of cognitive control and the top-down control of emotional processing, as well as for the α2 adrenoceptor-mediated modulation of these processes. These mechanisms offer a possible method to address the emotional reactivity that is common to several psychiatric disorders. PMID:25059532

  17. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks.

    PubMed

    Grady, Cheryl L; Siebner, Hartwig R; Hornboll, Bettina; Macoveanu, Julian; Paulson, Olaf B; Knudsen, Gitte M

    2013-05-01

    Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify distributed brain responses identified two brain networks with modulations of activity related to face emotion and serotonin level. The first network included the left amygdala, bilateral striatum, and fusiform gyri. During the Control session this network responded only to fearful faces; increasing serotonin decreased this response to fear, whereas reducing serotonin enhanced the response of this network to angry faces. The second network involved bilateral amygdala and ventrolateral prefrontal cortex, and these regions also showed increased activity to fear during the Control session. Both drug challenges enhanced the neural response of this set of regions to angry faces, relative to Control, and CIT also enhanced activity for neutral faces. The net effect of these changes in both networks was to abolish the selective response to fearful expressions. These results suggest that a normal level of serotonin is critical for maintaining a differentiated brain response to threatening face emotions. Lower serotonin leads to a broadening of a normally fear-specific response to anger, and higher levels reduce the differentiated brain response to aversive face emotions. Copyright © 2012 Elsevier B.V. and ECNP. All rights reserved.

  18. What Facial Appearance Reveals Over Time: When Perceived Expressions in Neutral Faces Reveal Stable Emotion Dispositions

    PubMed Central

    Adams, Reginald B.; Garrido, Carlos O.; Albohn, Daniel N.; Hess, Ursula; Kleck, Robert E.

    2016-01-01

    It might seem a reasonable assumption that when we are not actively using our faces to express ourselves (i.e., when we display nonexpressive, or neutral faces), those around us will not be able to read our emotions. Herein, using a variety of expression-related ratings, we examined whether age-related changes in the face can accurately reveal one’s innermost affective dispositions. In each study, we found that expressive ratings of neutral facial displays predicted self-reported positive/negative dispositional affect, but only for elderly women, and only for positive affect. These findings meaningfully replicate and extend earlier work examining age-related emotion cues in the face of elderly women (Malatesta et al., 1987a). We discuss these findings in light of evidence that women are expected to, and do, smile more than men, and that the quality of their smiles predicts their life satisfaction. Although ratings of old male faces did not significantly predict self-reported affective dispositions, the trend was similar to that found for old female faces. A plausible explanation for this gender difference is that in the process of attenuating emotional expressions over their lifetimes, old men reveal less evidence of their total emotional experiences in their faces than do old women. PMID:27445944

  19. Neural bases of different cognitive strategies for facial affect processing in schizophrenia.

    PubMed

    Fakra, Eric; Salgado-Pineda, Pilar; Delaveau, Pauline; Hariri, Ahmad R; Blin, Olivier

    2008-03-01

    To examine the neural basis and dynamics of facial affect processing in schizophrenic patients as compared to healthy controls, fourteen schizophrenic patients and fourteen matched controls performed a facial affect identification task during fMRI acquisition. The emotional task included an intuitive emotional condition (matching emotional faces) and a more cognitively demanding condition (labeling emotional faces). Individual analyses for each emotional condition, and second-level t-tests examining both within- and between-group differences, were carried out using a random effects approach. Psychophysiological interactions (PPI) were tested for variations in functional connectivity between the amygdala and other brain regions as a function of changes in experimental condition (labeling versus matching). During the labeling condition, both groups engaged similar networks. During the matching condition, schizophrenic patients failed to activate regions of the limbic system implicated in the automatic processing of emotions. PPI revealed an inverse functional connectivity between prefrontal regions and the left amygdala in healthy volunteers, but no such change in patients. Furthermore, during the matching condition, and compared to controls, patients showed decreased activation of regions involved in holistic face processing (fusiform gyrus) and increased activation of regions associated with feature analysis (inferior parietal cortex, left middle temporal lobe, right precuneus). Our findings suggest that schizophrenic patients invariably adopt a cognitive approach when identifying facial affect. The distributed neocortical network observed during the intuitive condition indicates that patients may resort to feature-based, rather than configuration-based, processing, which may constitute a compensatory strategy for limbic dysfunction.

  20. Visual short-term memory load modulates the early attention and perception of task-irrelevant emotional faces

    PubMed Central

    Yang, Ping; Wang, Min; Jin, Zhenlan; Li, Ling

    2015-01-01

    The ability to focus on task-relevant information while suppressing distraction is critical for human cognition and behavior. Using a delayed-match-to-sample (DMS) task, we investigated the effects of emotional face distractors (positive, negative, and neutral faces) presented during early and late phases of the visual short-term memory (VSTM) maintenance interval, under low and high VSTM loads. Behavioral results showed decreased accuracy and delayed reaction times (RTs) for high vs. low VSTM load. Event-related potentials (ERPs) showed enhanced frontal N1 and occipital P1 amplitudes for negative faces vs. neutral or positive faces, implying rapid attentional alerting effects and early perceptual processing of negative distractors. However, high VSTM load appeared to inhibit face processing in general, as shown by decreased N1 amplitudes and delayed P1 latencies. An inverse correlation between the N1 activation difference (high-load minus low-load) and RT costs (high-load minus low-load) was found at left frontal areas when viewing negative distractors, suggesting that the greater the inhibition, the lower the RT cost for negative faces. No emotional interference effect was found in the late VSTM-related parietal P300, frontal positive slow wave (PSW), or occipital negative slow wave (NSW) components. In general, our findings suggest that VSTM load modulates the early attention and perception of emotional distractors. PMID:26388763

  1. Problems of Face Recognition in Patients with Behavioral Variant Frontotemporal Dementia.

    PubMed

    Chandra, Sadanandavalli Retnaswami; Patwardhan, Ketaki; Pai, Anupama Ramakanth

    2017-01-01

    Faces are very special, as they are essential for social cognition in humans. It is partly understood that face processing, in its abstractness, involves several extrastriate areas. One of the most important causes of caregiver suffering in patients with anterior dementia is lack of empathy, which, apart from being a behavioral disorder, could also be due to a failure to categorize the emotions of the people around them. Inclusion criteria: DSM-IV criteria for bvFTD; patients were tested for prosopagnosia (familiar faces, famous face, smiling face, crying face and reflected face) using a simple picture card (figure 1). Exclusion criteria: advanced illness and mixed causes. Of 46 patients (15 females, 31 males; mean age 51.5), 24 had defective face recognition: 10/15 females (66.7%) and 14/31 males (45.2%). A familiar face recognition defect was found in 6/10 females and 6/14 males; in total, 40% (6/15) of females and 19.35% (6/31) of males with FTD had a familiar face recognition defect. Famous face: 9/10 females and 7/14 males; in total, 60% (9/15) of females with FTD had a famous face recognition defect, as against 22.6% (7/31) of males. Smiling face defects were found in 8/10 females and no males, i.e. 53.33% (8/15) of females. A crying face recognition defect was found in 3/10 females and 2/14 males; in total, 20% (3/15) of females and 6.5% (2/31) of males. A reflected face recognition defect was found in 4 females. Famous face recognition and positive emotion recognition were defective in 80% of affected females; only 20% comprehended positive emotions. Face recognition defects were found in only 45% of males and were more common in females; face recognition is thus more affected in females with FTD. The differential involvement of different aspects of face recognition could be one of the important factors underlying the decline in the emotional and social behavior of these patients. Understanding these pathological processes will give more insight into patient behavior.

  2. Inhibition of Lateral Prefrontal Cortex Produces Emotionally Biased First Impressions: A Transcranial Magnetic Stimulation and Electroencephalography Study.

    PubMed

    Lapate, Regina C; Samaha, Jason; Rokers, Bas; Hamzah, Hamdi; Postle, Bradley R; Davidson, Richard J

    2017-07-01

    Optimal functioning in everyday life requires the ability to override reflexive emotional responses and prevent affective spillover to situations or people unrelated to the source of emotion. In the current study, we investigated whether the lateral prefrontal cortex (lPFC) causally regulates the influence of emotional information on subsequent judgments. We disrupted left lPFC function using transcranial magnetic stimulation (TMS) and recorded electroencephalography (EEG) before and after. Subjects evaluated the likeability of novel neutral faces after a brief exposure to a happy or fearful face. We found that lPFC inhibition biased evaluations of novel faces according to the previously processed emotional expression. Greater frontal EEG alpha power, reflecting increased inhibition by TMS, predicted increased behavioral bias. TMS-induced affective misattribution was long-lasting: Emotionally biased first impressions formed during lPFC inhibition were still detectable outside of the laboratory 3 days later. These findings indicate that lPFC serves an important emotion-regulation function by preventing incidental emotional encoding from automatically biasing subsequent appraisals.

  3. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    PubMed

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize facial emotions. With an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing, in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD group, relative to the control group, showed greater activation in the amygdala, vPFC and striatum (p < .05, small volume corrected), particularly to sad faces. Moreover, in the ASD group, there was a negative correlation between developmental variables (age and pubertal status) and mean activation across the whole bilateral amygdala; younger adolescents showed greater activation than older adolescents. There were no group differences in accuracy or reaction time in the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  4. The endocannabinoid system and emotional processing: a pharmacological fMRI study with ∆9-tetrahydrocannabinol.

    PubMed

    Bossong, Matthijs G; van Hell, Hendrika H; Jager, Gerry; Kahn, René S; Ramsey, Nick F; Jansma, J Martijn

    2013-12-01

    Various psychiatric disorders such as major depression are associated with abnormalities in emotional processing. Evidence indicating involvement of the endocannabinoid system in emotional processing, and thus potentially in related abnormalities, is increasing. In the present study, we examined the role of the endocannabinoid system in processing of stimuli with a positive and negative emotional content in healthy volunteers. A pharmacological functional magnetic resonance imaging (fMRI) study was conducted with a placebo-controlled, cross-over design, investigating effects of the endocannabinoid agonist ∆9-tetrahydrocannabinol (THC) on brain function related to emotional processing in 11 healthy subjects. Performance and brain activity during matching of stimuli with a negative ('fearful faces') or a positive content ('happy faces') were assessed after placebo and THC administration. After THC administration, performance accuracy was decreased for stimuli with a negative but not for stimuli with a positive emotional content. Our task activated a network of brain regions including amygdala, orbital frontal gyrus, hippocampus, parietal gyrus, prefrontal cortex, and regions in the occipital cortex. THC interacted with emotional content, as activity in this network was reduced for negative content, while activity for positive content was increased. These results indicate that THC administration reduces the negative bias in emotional processing. This adds human evidence to support the hypothesis that the endocannabinoid system is involved in modulation of emotional processing. Our findings also suggest a possible role for the endocannabinoid system in abnormal emotional processing, and may thus be relevant for psychiatric disorders such as major depression. Copyright © 2013 Elsevier B.V. and ECNP. All rights reserved.

  5. Adolescents' ability to read different emotional faces relates to their history of maltreatment and type of psychopathology.

    PubMed

    Leist, Tatyana; Dadds, Mark R

    2009-04-01

    Emotional processing styles appear to characterize various forms of psychopathology and environmental adversity in children. For example, autistic, anxious, high- and low-emotion conduct problem children, and children who have been maltreated, all appear to show specific deficits and strengths in recognizing the facial expressions of emotions. Until now, the relationships between emotion recognition, antisocial behaviour, emotional problems, callous-unemotional (CU) traits and early maltreatment have never been assessed simultaneously in one study, and the specific associations of emotion recognition to maltreatment and child characteristics are therefore unknown. We examined facial-emotion processing in a sample of 23 adolescents selected for high-risk status on the variables of interest. As expected, maltreatment and child characteristics showed unique associations. CU traits were uniquely related to impairments in fear recognition. Antisocial behaviour was uniquely associated with better fear recognition, but impaired anger recognition. Emotional problems were associated with better recognition of anger and sadness, but lower recognition of neutral faces. Maltreatment was predictive of superior recognition of fear and sadness. The findings are considered in terms of social information-processing theories of psychopathology. Implications for clinical interventions are discussed.

  6. Audiovisual emotional processing and neurocognitive functioning in patients with depression

    PubMed Central

    Doose-Grünefeld, Sophie; Eickhoff, Simon B.; Müller, Veronika I.

    2015-01-01

    Alterations in the processing of emotional stimuli (e.g., facial expressions, prosody, music) have repeatedly been reported in patients with major depression. Such impairments may result from the likewise prevalent executive deficits in these patients. However, studies investigating this relationship are rare. Moreover, most studies to date have only assessed impairments in unimodal emotional processing, whereas in real life, emotions are primarily conveyed through more than just one sensory channel. The current study therefore aimed to investigate multimodal emotional processing in patients with depression and to assess the relationship between emotional and neurocognitive impairments. Forty-one patients suffering from major depression and 41 never-depressed healthy controls participated in an audiovisual (faces-sounds) emotional integration paradigm as well as a neurocognitive test battery. Our results showed that depressed patients were specifically impaired in the processing of positive auditory stimuli, as they rated faces as significantly more fearful when presented with happy than with neutral sounds; such an effect was absent in controls. Findings in emotional processing in patients did not correlate with Beck Depression Inventory scores. Furthermore, neurocognitive findings revealed significant group differences for two of the tests. The effects found in audiovisual emotional processing, however, did not correlate with performance in the neurocognitive tests. In summary, our results underline the diversity of impairments accompanying depression and indicate that deficits found for unimodal emotional processing cannot trivially be generalized to deficits in a multimodal setting. The mechanisms of impairment may therefore be far more complex than previously thought. Our findings furthermore contradict the assumption that emotional processing deficits in major depression are associated with impaired attention or inhibitory functioning. PMID:25688188

  7. Don't make me angry, you wouldn't like me when I'm angry: Volitional choices to act or inhibit are modulated by subliminal perception of emotional faces.

    PubMed

    Parkinson, Jim; Garfinkel, Sarah; Critchley, Hugo; Dienes, Zoltan; Seth, Anil K

    2017-04-01

    Volitional action and self-control-feelings of acting according to one's own intentions and of being in control of one's own actions-are fundamental aspects of human conscious experience. However, it is unknown whether high-level cognitive control mechanisms are affected by socially salient but nonconscious emotional cues. In this study, we manipulated free choice decisions to act or withhold an action by subliminally presenting emotional faces: In a novel version of the Go/NoGo paradigm, participants made speeded button-press responses to Go targets, withheld responses to NoGo targets, and made spontaneous, free choices to execute or withhold the response for Choice targets. Before each target, we presented emotional faces, backwards masked to render them nonconscious. In Intentional trials, subliminal angry faces made participants more likely to voluntarily withhold the action, whereas fearful and happy faces had no effects. In a second experiment, the faces were made supraliminal, which eliminated the effects of angry faces on volitional choices. A third experiment measured neural correlates of the effects of subliminal angry faces on intentional choice using EEG. After replicating the behavioural results found in Experiment 1, we identified a frontal-midline theta component-associated with cognitive control processes-which is present for volitional decisions, and is modulated by subliminal angry faces. This suggests a mechanism whereby subliminally presented "threat" stimuli affect conscious control processes. In summary, nonconscious perception of angry faces increases choices to inhibit, and subliminal influences on volitional action are deep-seated and ecologically embedded.

  8. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing

    PubMed Central

    Hsu, Chun-Wei; Goh, Joshua O. S.

    2016-01-01

    When comparing the values of different choices, human beings can rely either on more cognitive processes, such as mathematical computation, or on more affective processes, such as emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466
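
The whole-brain spatial correlation analysis this record reports compares activation maps voxel-by-voxel, typically as a Pearson correlation over in-mask voxels. A minimal sketch (the function name, array layout, and toy maps are assumptions for illustration, not the study's pipeline):

```python
import numpy as np

def spatial_correlation(map_a, map_b, mask=None):
    """Pearson correlation between two activation maps, computed across voxels."""
    a = np.asarray(map_a, dtype=float).ravel()
    b = np.asarray(map_b, dtype=float).ravel()
    if mask is not None:                      # restrict to in-brain voxels
        m = np.asarray(mask, dtype=bool).ravel()
        a, b = a[m], b[m]
    a = a - a.mean()                          # mean-center both maps
    b = b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# toy 2x2 "maps": one linearly related map and one anticorrelated map
value_map = np.array([[1.0, 2.0], [3.0, 4.0]])
math_map = 2.0 * value_map + 1.0              # perfectly linearly related -> r = 1
emotion_map = -value_map                      # perfectly anticorrelated  -> r = -1
```

A higher correlation between two maps indicates more similar spatial activation patterns, which is the sense in which valuative processing was reported as "more similar" to mathematical than to emotional processing.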

  10. Is a neutral expression also a neutral stimulus? A study with functional magnetic resonance.

    PubMed

    Carvajal, Fernando; Rubio, Sandra; Serrano, Juan M; Ríos-Lago, Marcos; Alvarez-Linera, Juan; Pacheco, Lara; Martín, Pilar

    2013-08-01

    Although neutral faces do not initially convey an explicit emotional message, it has been found that individuals tend to assign them an affective content. Moreover, previous research has shown that affective judgments are mediated by the task participants have to perform. Using functional magnetic resonance imaging in 21 healthy participants, we focus this study on the cerebral activity patterns triggered by neutral and emotional faces in two different tasks (social or gender judgments). Results obtained using conjunction analyses indicated that viewing both emotional and neutral faces evokes activity in several similar brain areas, indicating a common neural substrate. Moreover, neutral faces specifically elicit activation of cerebellum, frontal and temporal areas, while emotional faces involve the cuneus, anterior cingulate gyrus, medial orbitofrontal cortex, posterior superior temporal gyrus, precentral/postcentral gyrus and insula. The task selected was also found to influence brain activity, in that the social task recruited frontal areas while the gender task involved the posterior cingulate, inferior parietal lobule and middle temporal gyrus to a greater extent. Specifically, in the social task viewing neutral faces was associated with longer reaction times and increased activity of left dorsolateral frontal cortex compared with viewing facial expressions of emotions. In contrast, in the same task emotional expressions distinctively activated the left amygdala. The results are discussed taking into consideration the fact that, like other facial expressions, neutral expressions are usually assigned some emotional significance. However, neutral faces evoke a greater activation of circuits probably involved in more elaborate cognitive processing.

  11. Emotion improves and impairs early vision.

    PubMed

    Bocanegra, Bruno R; Zeelenberg, René

    2009-06-01

    Recent studies indicate that emotion enhances early vision, but the generality of this finding remains unknown. Do the benefits of emotion extend to all basic aspects of vision, or are they limited in scope? Our results show that the brief presentation of a fearful face, compared with a neutral face, enhances sensitivity for the orientation of subsequently presented low-spatial-frequency stimuli, but diminishes orientation sensitivity for high-spatial-frequency stimuli. This is the first demonstration that emotion not only improves but also impairs low-level vision. The selective low-spatial-frequency benefits are consistent with the idea that emotion enhances magnocellular processing. Additionally, we suggest that the high-spatial-frequency deficits are due to inhibitory interactions between magnocellular and parvocellular pathways. Our results suggest an emotion-induced trade-off in visual processing, rather than a general improvement. This trade-off may benefit perceptual dimensions that are relevant for survival at the expense of those that are less relevant.

  12. Feedback from the heart: Emotional learning and memory is controlled by cardiac cycle, interoceptive accuracy and personality.

    PubMed

    Pfeifer, Gaby; Garfinkel, Sarah N; Gould van Praag, Cassandra D; Sahota, Kuljit; Betka, Sophie; Critchley, Hugo D

    2017-05-01

    Feedback processing is critical to trial-and-error learning. Here, we examined whether interoceptive signals concerning the state of cardiovascular arousal influence the processing of reinforcing feedback during the learning of 'emotional' face-name pairs, with subsequent effects on retrieval. Participants (N=29) engaged in a learning task of face-name pairs (fearful, neutral, happy faces). Correct and incorrect learning decisions were reinforced by auditory feedback, which was delivered either at cardiac systole (on the heartbeat, when baroreceptors signal the contraction of the heart to the brain), or at diastole (between heartbeats during baroreceptor quiescence). We discovered a cardiac influence on feedback processing that enhanced the learning of fearful faces in people with heightened interoceptive ability. Individuals with enhanced accuracy on a heartbeat counting task learned fearful face-name pairs better when feedback was given at systole than at diastole. This effect was not present for neutral and happy faces. At retrieval, we also observed related effects of personality: First, individuals scoring higher for extraversion showed poorer retrieval accuracy. These individuals additionally manifested lower resting heart rate and lower state anxiety, suggesting that attenuated levels of cardiovascular arousal in extraverts underlies poorer performance. Second, higher extraversion scores predicted higher emotional intensity ratings of fearful faces reinforced at systole. Third, individuals scoring higher for neuroticism showed higher retrieval confidence for fearful faces reinforced at diastole. Our results show that cardiac signals shape feedback processing to influence learning of fearful faces, an effect underpinned by personality differences linked to psychophysiological arousal. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    PubMed

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, on a fine-grained level, which physical features are most relied upon when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and to assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (reliance on the mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the Facial Action Coding System. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  15. Modulation of neuronal oscillatory activity in the beta- and gamma-band is associated with current individual anxiety levels.

    PubMed

    Schneider, Till R; Hipp, Joerg F; Domnick, Claudia; Carl, Christine; Büchel, Christian; Engel, Andreas K

    2018-05-26

    Human faces are among the most salient visual stimuli and act as both socially and emotionally relevant signals. Faces, and especially faces with emotional expressions, receive prioritized processing in the human brain and activate a distributed network of brain areas, reflected, e.g., in enhanced oscillatory neuronal activity. However, an inconsistent picture has emerged so far regarding neuronal oscillatory activity across different frequency bands modulated by emotionally and socially relevant stimuli. The individual level of anxiety among healthy populations might be one explanation for these inconsistent findings. We therefore tested whether oscillatory neuronal activity during perception of faces with neutral and fearful facial expressions is associated with individual anxiety levels. We recorded neuronal activity using magnetoencephalography (MEG) in 27 healthy participants and determined their individual state anxiety levels. Images of human faces with neutral and fearful expressions, and physically matched visual control stimuli, were presented while participants performed a simple color detection task. Spectral analyses revealed that face processing, and in particular processing of fearful faces, was characterized by enhanced neuronal activity in the theta- and gamma-bands and decreased activity in the beta-band in early visual cortex and the fusiform gyrus (FFG). Moreover, individuals' state anxiety levels correlated positively with the gamma-band response and negatively with the beta-band response in the FFG and the amygdala. Our results suggest that oscillatory neuronal activity plays an important role in affective face processing and depends on the individual level of state anxiety. Our work provides new insights into the role of oscillatory neuronal activity underlying the processing of faces. Copyright © 2018. Published by Elsevier Inc.

  16. Visual body recognition in a prosopagnosic patient.

    PubMed

    Moro, V; Pernigo, S; Avesani, R; Bulgarelli, C; Urgesi, C; Candidi, M; Aglioti, S M

    2012-01-01

    Conspicuous deficits in face recognition characterize prosopagnosia. Information on whether agnosic deficits may extend to non-facial body parts is lacking. Here we report the neuropsychological description of FM, a patient affected by a complete deficit in face recognition in the presence of mild clinical signs of visual object agnosia. His deficit involves both overt and covert recognition of faces (i.e. recognition of familiar faces, but also categorization of faces for gender or age) as well as the visual mental imagery of faces. By means of a series of matching-to-sample tasks we investigated: (i) a possible association between prosopagnosia and disorders in visual body perception; (ii) the effect of the emotional content of stimuli on the visual discrimination of faces, bodies and objects; (iii) the existence of a dissociation between identity recognition and the emotional discrimination of faces and bodies. Our results document, for the first time, the co-occurrence of body agnosia, i.e. the visual inability to discriminate body forms and body actions, and prosopagnosia. Moreover, the results show better performance in the discrimination of emotional face and body expressions with respect to body identity and neutral actions. Since FM's lesions involve bilateral fusiform areas, it is unlikely that the amygdala-temporal projections explain the relative sparing of emotion discrimination performance. Indeed, the emotional content of the stimuli did not improve the discrimination of their identity. The results hint at the existence of two segregated brain networks involved in identity and emotional discrimination that are at least partially shared by face and body processing. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. How Children Use Emotional Prosody: Crossmodal Emotional Integration?

    ERIC Educational Resources Information Center

    Gil, Sandrine; Hattouti, Jamila; Laval, Virginie

    2016-01-01

    A crossmodal effect has been observed in the processing of facial and vocal emotion in adults and infants. For the first time, we assessed whether this effect is present in childhood by administering a crossmodal task similar to those used in seminal studies featuring emotional faces (i.e., a continuum of emotional expressions running from…

  18. Identity modulates short-term memory for facial emotion.

    PubMed

    Galster, Murray; Kahana, Michael J; Wilson, Hugh R; Sekuler, Robert

    2009-12-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced short-term recognition memory for emotional expression. In Experiment 2, subjects' similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces' perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces' perceptual representations may parallel their representation in short-term memory, and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.
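    The MDS step described above, mapping pairwise dissimilarity judgments into a low-dimensional spatial representation, can be sketched with classical (Torgerson) MDS. This is a minimal illustrative implementation, not the authors' analysis pipeline; the toy dissimilarity matrix and the choice of classical (metric) MDS are assumptions for the sketch.

    ```python
    import numpy as np

    def classical_mds(d, n_components=2):
        """Classical (Torgerson) MDS: embed a symmetric dissimilarity
        matrix d into n_components Euclidean dimensions."""
        n = d.shape[0]
        # Double-center the squared dissimilarities: B = -1/2 * J D^2 J
        j = np.eye(n) - np.ones((n, n)) / n
        b = -0.5 * j @ (d ** 2) @ j
        # Eigendecomposition; keep the largest non-negative eigenvalues
        vals, vecs = np.linalg.eigh(b)
        order = np.argsort(vals)[::-1][:n_components]
        scale = np.sqrt(np.maximum(vals[order], 0.0))
        return vecs[:, order] * scale  # n x n_components coordinates

    # Toy dissimilarity matrix for four hypothetical face stimuli:
    # faces 0-1 share an identity, faces 2-3 share another.
    d = np.array([[0.0, 1.0, 4.0, 4.2],
                  [1.0, 0.0, 4.1, 4.0],
                  [4.0, 4.1, 0.0, 1.1],
                  [4.2, 4.0, 1.1, 0.0]])
    coords = classical_mds(d)
    # Distances in the 2-D embedding approximate the input dissimilarities,
    # so same-identity faces land closer together than different-identity faces.
    ```

    In the study's logic, inter-point distances in such an embedding served as predictors of recognition performance.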

  19. The effects of mothers' past infant-holding preferences on their adult children's face processing lateralisation.

    PubMed

    Vervloed, Mathijs P J; Hendriks, Angélique W; van den Eijnde, Esther

    2011-04-01

    Face processing development is negatively affected when infants have not been exposed to faces for some time because of congenital cataract blocking all vision (Le Grand, Mondloch, Maurer, & Brent, 2001). It is not clear, however, whether more subtle differences in face exposure may also have an influence. The present study looked at the effect of the mother's preferred side of holding an infant on her adult child's face processing lateralisation. Adults with a mother who had a left-arm preference for holding infants were compared with adults with a mother who had a right-arm holding preference. All participants were right-handed and had been exclusively bottle-fed during infancy. The participants were presented with two chimeric faces tests, one involving emotion and the other gender. The left-arm held individuals showed a normal left-bias on the chimeric face tests, whereas the right-arm held individuals showed a significantly decreased left-bias. The results might suggest that reduced exposure to high quality emotional information on faces in infancy results in diminished right-hemisphere lateralisation for face processing. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Emotional faces influence evaluation of natural and transformed food.

    PubMed

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence showed the presence of a direct relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli composed of a neutral, happy, or disgusted face (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted in judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response based on the transformation level of food. In general, the evaluation of natural foods was more rapid compared with transformed foods, possibly because of their simplicity and healthier perception. In addition, transformed foods yielded incongruent responses with respect to the preceding emotional face, whereas natural foods yielded congruent responses with respect to it. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on pleasant food stimuli salience.

  1. Sex-dependent neural effect of oxytocin during subliminal processing of negative emotion faces.

    PubMed

    Luo, Lizhu; Becker, Benjamin; Geng, Yayuan; Zhao, Zhiying; Gao, Shan; Zhao, Weihua; Yao, Shuxia; Zheng, Xiaoxiao; Ma, Xiaole; Gao, Zhao; Hu, Jiehui; Kendrick, Keith M

    2017-11-15

    In line with animal models indicating sexually dimorphic effects of oxytocin (OXT) on social-emotional processing, a growing number of OXT-administration studies in humans have also reported sex-dependent effects during social information processing. To explore whether sex-dependent effects already occur during early, subliminal, processing stages, the present pharmacological fMRI study combined the intranasal application of either OXT or placebo (n = 86; 43 males) with a backward-masking emotional face paradigm. Results showed that while OXT suppressed inferior frontal gyrus, dorsal anterior cingulate and anterior insula responses to threatening face stimuli in men, it increased them in women. In women, increased anterior cingulate reactivity during subliminal threat processing was also positively associated with trait anxiety. On the network level, sex-dependent effects were observed on amygdala, anterior cingulate and inferior frontal gyrus functional connectivity that were mainly driven by reduced coupling in women following OXT. Our findings demonstrate that OXT produces sex-dependent effects even at the early stages of social-emotional processing, and suggest that while it attenuates neural responses to threatening social stimuli in men, it increases them in women. Thus, in a therapeutic context, OXT may potentially produce different effects on anxiety disorders in men and women. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Still feeling it: the time course of emotional recovery from an attentional perspective

    PubMed Central

    Morriss, Jayne; Taylor, Alexander N. W.; Roesch, Etienne B.; van Reekum, Carien M.

    2013-01-01

    Emotional reactivity and the time taken to recover, particularly from negative, stressful events, are inextricably linked, and both are crucial for maintaining well-being. It is unclear, however, to what extent emotional reactivity during stimulus onset predicts the time course of recovery after stimulus offset. To address this question, 25 participants viewed arousing (negative and positive) and neutral pictures from the International Affective Picture System (IAPS) followed by task-relevant face targets, which were to be gender categorized. Faces were presented early (400–1500 ms) or late (2400–3500 ms) after picture offset to capture the time course of recovery from emotional stimuli. Measures of reaction time (RT), as well as face-locked N170 and P3 components, were taken as indicators of the impact of lingering emotion on attentional facilitation or interference. Electrophysiological effects revealed that negative and positive images facilitated face-target processing on the P3 component, regardless of temporal interval. At the individual level, increased reactivity to (1) negative pictures, quantified as the IAPS picture-locked Late Positive Potential (LPP), predicted larger attentional interference on the face-locked P3 component for faces presented in the late time window after picture offset, and (2) positive pictures, likewise indexed by the LPP, predicted larger facilitation on the face-locked P3 component for faces presented in the earlier time window after picture offset. These results suggest that subsequent processing is still impacted up to 3500 ms after the offset of negative pictures and 1500 ms after the offset of positive pictures for individuals reacting more strongly to these pictures. Such findings emphasize the importance of individual differences in reactivity when predicting the temporality of emotional recovery. The current experimental model provides a novel basis for future research aiming to identify profiles of adaptive and maladaptive recovery. PMID:23734116

  3. Perceived differences between chimpanzee (Pan troglodytes) and human (Homo sapiens) facial expressions are related to emotional interpretation.

    PubMed

    Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C

    2007-11-01

    Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.

  4. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    PubMed

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.
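    The P1/N2/P3 comparisons described above rest on a standard ERP scoring step: averaging a component's amplitude within a fixed post-stimulus time window. A minimal sketch, assuming simulated data; the window boundaries and the simulated waveform are illustrative choices, not the authors' exact parameters.

    ```python
    import numpy as np

    def mean_amplitude(erp, times, window):
        """Mean amplitude of an ERP trace within a time window (ms)."""
        lo, hi = window
        mask = (times >= lo) & (times <= hi)
        return erp[mask].mean()

    times = np.arange(-100, 600)               # ms relative to stimulus onset
    erp = np.zeros_like(times, dtype=float)    # simulated single-condition average
    erp[(times >= 80) & (times <= 130)] = 2.0  # simulated P1 deflection (microvolts)

    # Score two illustrative components; condition contrasts (e.g. fearful vs
    # happy bodies on P1, incongruent vs congruent on N2) would compare these
    # window means across conditions.
    p1 = mean_amplitude(erp, times, (80, 130))
    n2 = mean_amplitude(erp, times, (200, 300))
    ```

    An "enhanced P1 for fearful bodies" then simply means a larger window mean in that condition than in the happy-body condition.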

  5. Emotional bias of cognitive control in adults with childhood attention-deficit/hyperactivity disorder

    PubMed Central

    Schulz, Kurt P.; Bédard, Anne-Claude V.; Fan, Jin; Clerkin, Suzanne M.; Dima, Danai; Newcorn, Jeffrey H.; Halperin, Jeffrey M.

    2014-01-01

    Affect recognition deficits found in individuals with attention-deficit/hyperactivity disorder (ADHD) across the lifespan may bias the development of cognitive control processes implicated in the pathophysiology of the disorder. This study aimed to determine the mechanism through which facial expressions influence cognitive control in young adults diagnosed with ADHD in childhood. Fourteen probands with childhood ADHD and 14 comparison subjects with no history of ADHD were scanned with functional magnetic resonance imaging while performing a face emotion go/no-go task. Event-related analyses contrasted activation and functional connectivity for cognitive control collapsed over face valence and tested for variations in activation for response execution and inhibition as a function of face valence. Probands with childhood ADHD made fewer correct responses and inhibitions overall than comparison subjects, but demonstrated comparable effects of face emotion on response execution and inhibition. The two groups showed similar frontotemporal activation for cognitive control collapsed across face valence, but differed in the functional connectivity of the right dorsolateral prefrontal cortex, with fewer interactions with the subgenual cingulate cortex, inferior frontal gyrus, and putamen in probands than in comparison subjects. Further, valence-dependent activation for response execution was seen in the amygdala, ventral striatum, subgenual cingulate cortex, and orbitofrontal cortex in comparison subjects but not in probands. The findings point to functional anomalies in limbic networks for both the valence-dependent biasing of cognitive control and the valence-independent cognitive control of face emotion processing in probands with childhood ADHD. This limbic dysfunction could impact cognitive control in emotional contexts and may contribute to the social and emotional problems associated with ADHD. PMID:24918067

  6. Emotional bias of cognitive control in adults with childhood attention-deficit/hyperactivity disorder.

    PubMed

    Schulz, Kurt P; Bédard, Anne-Claude V; Fan, Jin; Clerkin, Suzanne M; Dima, Danai; Newcorn, Jeffrey H; Halperin, Jeffrey M

    2014-01-01

    Affect recognition deficits found in individuals with attention-deficit/hyperactivity disorder (ADHD) across the lifespan may bias the development of cognitive control processes implicated in the pathophysiology of the disorder. This study aimed to determine the mechanism through which facial expressions influence cognitive control in young adults diagnosed with ADHD in childhood. Fourteen probands with childhood ADHD and 14 comparison subjects with no history of ADHD were scanned with functional magnetic resonance imaging while performing a face emotion go/no-go task. Event-related analyses contrasted activation and functional connectivity for cognitive control collapsed over face valence and tested for variations in activation for response execution and inhibition as a function of face valence. Probands with childhood ADHD made fewer correct responses and inhibitions overall than comparison subjects, but demonstrated comparable effects of face emotion on response execution and inhibition. The two groups showed similar frontotemporal activation for cognitive control collapsed across face valence, but differed in the functional connectivity of the right dorsolateral prefrontal cortex, with fewer interactions with the subgenual cingulate cortex, inferior frontal gyrus, and putamen in probands than in comparison subjects. Further, valence-dependent activation for response execution was seen in the amygdala, ventral striatum, subgenual cingulate cortex, and orbitofrontal cortex in comparison subjects but not in probands. The findings point to functional anomalies in limbic networks for both the valence-dependent biasing of cognitive control and the valence-independent cognitive control of face emotion processing in probands with childhood ADHD. This limbic dysfunction could impact cognitive control in emotional contexts and may contribute to the social and emotional problems associated with ADHD.

  7. Do Bodily Expressions Compete with Facial Expressions? Time Course of Integration of Emotional Signals from the Face and the Body

    PubMed Central

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes. PMID:23935825

  8. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    PubMed

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation instead of habituation of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted more slowly than those from the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Detecting emotion in others: increased insula and decreased medial prefrontal cortex activation during emotion processing in elite adventure racers

    PubMed Central

    Johnson, Douglas C.; Flagan, Taru; Simmons, Alan N.; Kotturi, Sante A.; Van Orden, Karl F.; Potterat, Eric G.; Swain, Judith L.; Paulus, Martin P.

    2014-01-01

    Understanding the neural processes that characterize elite performers is a first step toward developing a neuroscience model that can be used to improve performance in stressful circumstances. Adventure racers are elite athletes who operate in small teams in the context of environmental and physical extremes. In particular, awareness of team members' emotional status is critical to the team's ability to navigate high-magnitude stressors. Thus, this functional magnetic resonance imaging (fMRI) study examined the hypothesis that adventure racers would show altered emotion processing in brain areas that are important for resilience and social awareness. Elite adventure racers (n = 10) were compared with healthy volunteers (n = 12) while performing a simple emotion face-processing (modified Hariri) task during fMRI. Across three types of emotional faces, adventure racers showed greater activation in right insula, left amygdala and dorsal anterior cingulate. Additionally, compared with healthy controls, adventure racers showed attenuated right medial prefrontal cortex activation. These results are consistent with previous studies showing that elite performers differentially activate neural substrates underlying interoception. Thus, adventure racers differentially deploy brain resources in an effort to recognize and process the internal sensations associated with emotions in others, which could be advantageous for team-based performance under stress. PMID:23171614

  10. From neural signatures of emotional modulation to social cognition: individual differences in healthy volunteers and psychiatric participants

    PubMed Central

    Aguado, Jaume; Baez, Sandra; Huepe, David; Lopez, Vladimir; Ortega, Rodrigo; Sigman, Mariano; Mikulan, Ezequiel; Lischinsky, Alicia; Torrente, Fernando; Cetkovich, Marcelo; Torralva, Teresa; Bekinschtein, Tristan; Manes, Facundo

    2014-01-01

    It is commonly assumed that early emotional signals provide relevant information for social cognition tasks. The goal of this study was to test the association between (a) cortical markers of face emotion processing and (b) social-cognitive measures, and also to build a model which can predict this association in healthy volunteers as well as in different groups of psychiatric patients. Thus, we investigated the early cortical processing of emotional stimuli (N170, using a face and word valence task) and its relationship with the social-cognitive profiles (SCPs, indexed by measures of theory of mind, fluid intelligence, processing speed and executive functions). Group comparisons and individual differences were assessed among schizophrenia (SCZ) patients and their relatives, individuals with attention deficit hyperactivity disorder (ADHD), individuals with euthymic bipolar disorder (BD) and healthy participants (matched for educational level, handedness, age and gender). Our results provide evidence of emotional N170 impairments in the affected groups (SCZ and relatives, ADHD and BD) as well as subtle group differences. Importantly, cortical processing of emotional stimuli predicted the SCP, as evidenced by a structural equation model analysis. This is the first study to report an association model of brain markers of emotional processing and SCP. PMID:23685775

  11. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and associated activation for correct no-go trials in inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of responses to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.

  12. Facial decoding in schizophrenia is underpinned by basic visual processing impairments.

    PubMed

    Belge, Jan-Baptist; Maurage, Pierre; Mangelinckx, Camille; Leleux, Dominique; Delatte, Benoît; Constant, Eric

    2017-09-01

    Schizophrenia is associated with a strong deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific for emotions or due to a more general impairment for any type of facial processing. This study was designed to clarify this issue. Thirty patients suffering from schizophrenia and 30 matched healthy controls performed several tasks evaluating the recognition of both changeable (i.e. eyes orientation and emotions) and stable (i.e. gender, age) facial characteristics. Accuracy and reaction times were recorded. Schizophrenic patients presented a performance deficit (accuracy and reaction times) in the perception of both changeable and stable aspects of faces, without any specific deficit for emotional decoding. Our results demonstrate a generalized face recognition deficit in schizophrenic patients, probably caused by a perceptual deficit in basic visual processing. It seems that the deficit in the decoding of emotional facial expression (EFE) is not a specific deficit of emotion processing, but is at least partly related to a generalized perceptual deficit in lower-level perceptual processing, occurring before the stage of emotion processing, and underlying more complex cognitive dysfunctions. These findings should encourage future investigations to explore the neurophysiologic background of these generalized perceptual deficits, and stimulate a clinical approach focusing on more basic visual processing. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  13. Disruption of emotion and conflict processing in HIV infection with and without alcoholism comorbidity.

    PubMed

    Schulte, Tilman; Müller-Oehring, Eva M; Sullivan, Edith V; Pfefferbaum, Adolf

    2011-05-01

    Alcoholism and HIV-1 infection each affect components of selective attention and cognitive control that may contribute to deficits in emotion processing based on closely interacting fronto-parietal attention and frontal-subcortical emotion systems. Here, we investigated whether patients with alcoholism, HIV-1 infection, or both diseases have greater difficulty than healthy controls in resolving conflict from emotional words with different valences. Accordingly, patients with alcoholism (ALC, n = 20), HIV-1 infection (HIV, n = 20), ALC + HIV comorbidity (n = 22), and controls (CTL, n = 16) performed an emotional Stroop Match-to-Sample task, which assessed the contribution of emotion (happy, angry) to cognitive control (Stroop conflict processing). ALC + HIV showed greater Stroop effects than HIV, ALC, or CTL for negative (ANGRY) but not for positive (HAPPY) words, and also when the cue color did not match the Stroop stimulus color; the comorbid group performed similarly to the others when cue and word colors matched. Furthermore, emotionally salient face cues prolonged color-matching responses in all groups. HIV alone, compared with the other three groups, showed disproportionately slowed color-matching time when trials featured angry faces. The enhanced Stroop effects prominent in ALC + HIV suggest difficulty in exercising attentional top-down control on processes that consume attentional capacity, especially when cognitive effort is required to ignore negative emotions.
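    The Stroop effects compared across groups above reduce to a simple reaction-time contrast: mean RT on conflict (incongruent) trials minus mean RT on non-conflict (congruent) trials, computed separately per word valence. A minimal sketch with invented numbers; the condition labels and RT values are illustrative assumptions, not data from the study.

    ```python
    import numpy as np

    # Hypothetical per-trial reaction times (ms) for one participant,
    # keyed by (word valence, trial congruency).
    rts = {
        ("angry", "incongruent"): np.array([720.0, 735.0, 748.0]),
        ("angry", "congruent"):   np.array([655.0, 662.0, 670.0]),
        ("happy", "incongruent"): np.array([668.0, 674.0, 671.0]),
        ("happy", "congruent"):   np.array([660.0, 665.0, 662.0]),
    }

    def stroop_effect(valence):
        """Stroop effect = mean RT(incongruent) - mean RT(congruent)."""
        return (rts[(valence, "incongruent")].mean()
                - rts[(valence, "congruent")].mean())

    angry_effect = stroop_effect("angry")  # enhanced effect for negative words
    happy_effect = stroop_effect("happy")  # would mirror the ALC + HIV pattern
    ```

    A group-level analysis would then compare these per-valence effects across the ALC, HIV, ALC + HIV, and control groups.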

  14. Neural dynamics underlying emotional transmissions between individuals

    PubMed Central

    Levit-Binnun, Nava; Hendler, Talma; Lerner, Yulia

    2017-01-01

    Emotional experiences are frequently shaped by the emotional responses of co-present others. Research has shown that people constantly monitor and adapt to the incoming social–emotional signals, even without face-to-face interaction. And yet, the neural processes underlying such emotional transmissions have not been directly studied. Here, we investigated how the human brain processes emotional cues which arrive from another, co-attending individual. We presented continuous emotional feedback to participants who viewed a movie in the scanner. Participants in the social group (but not in the control group) believed that the feedback was coming from another person who was co-viewing the same movie. We found that social–emotional feedback significantly affected the neural dynamics both in the core affect and in the medial pre-frontal regions. Specifically, the response time-courses in those regions exhibited increased similarity across recipients and increased neural alignment with the timeline of the feedback in the social compared with control group. Taken in conjunction with previous research, this study suggests that emotional cues from others shape the neural dynamics across the whole neural continuum of emotional processing in the brain. Moreover, it demonstrates that interpersonal neural alignment can serve as a neural mechanism through which affective information is conveyed between individuals. PMID:28575520

  15. Cyclical Grieving: Reocurring Emotions Experienced by Parents Who Have Children with Disabilities.

    ERIC Educational Resources Information Center

    Blaska, Joan K.

    This paper discusses cyclical grieving, which is described as an intermittent reoccurrence of one or more emotions that are part of the grieving process experienced by parents who have children with disabilities. A study to support the concept of cyclical grieving used a naturalistic approach with face-to-face interviews to explore ten parents'…

  16. Infants' Temperament and Mothers' and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    PubMed

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before and after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing towards (vs. away from) objects. Additionally, the associations of infants' temperament and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods, may influence infants' attention to emotion-object associations in social learning contexts.

  17. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    PubMed

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex and individual sensitivity to smell androstadienone. To do so, we investigated 56 healthy individuals (thereof 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone, once under placebo exposure in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing, while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time, women were also slightly affected by smelling androstadienone, as they classified angry faces correctly more often under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  19. Neural circuitry of emotion regulation: Effects of appraisal, attention, and cortisol administration.

    PubMed

    Ma, Sean T; Abelson, James L; Okada, Go; Taylor, Stephan F; Liberzon, Israel

    2017-04-01

    Psychosocial well-being requires effective regulation of emotional responding in the context of threat or stress. Neuroimaging studies have focused on instructed, volitional regulation (e.g., reappraisal or distancing), largely ignoring implicit regulation that does not involve purposeful effort to alter emotional experience. These implicit processes may or may not involve the same neural pathways as explicit regulatory strategies. We examined the neurobiology of implicit emotional regulation processes and the impact of the stress hormone cortisol on these processes. Our study task employed composite pictures of faces and places to examine neural activity during implicit emotional processing (of emotional faces), while these responses were implicitly regulated by attention shift away from the emotionally evocative stimuli, and while subjects reflectively appraised their own emotional response to them. Subjects completed the task in an fMRI scanner after random assignment to receive placebo or hydrocortisone (HCT), an orally administered version of cortisol. Implicit emotional processing activated insula/IFG, dACC/dMPFC, midbrain and amygdala. With attention shifting, we saw diminished signal in emotion generating/response regions (e.g., amygdala) and increased activations in task-specific attention regions like parahippocampus. With appraisal of emotions, we observed robust activations in medial prefrontal areas, where activation is also seen in instructed reappraisal studies. We observed no main effects of HCT administration on brain activity, but males and females showed opposing neural effects in prefrontal areas. The data suggest that different types of emotion regulation utilize overlapping circuits, but with some strategy-specific activation. Further study of the dimorphic sex response to cortisol is needed.

  20. The influence of stimulus sex and emotional expression on the attentional blink.

    PubMed

    Stebbins, Hilary E; Vanous, Jesse B

    2015-08-01

    Past studies have demonstrated that angry faces used as the first target (T1) in an attentional blink paradigm interfere with processing of a second, neutral target (T2). However, despite research that suggests that the sex and emotional expression of a face are confounded, no study has investigated whether the sex of a stimulus might interact with emotional expression to influence the attentional blink. In the current study, both the sex and emotional expression of a T1 stimulus were manipulated to assess participants' ability to report the presence of a subsequent neutral target. Although the findings revealed limited evidence to support an interaction between sex and emotion, both the sex and emotional expression of the T1 stimulus were found to independently affect reporting of T2. These findings suggest that both emotional expression and stimulus sex are important in the temporal allocation of attentional resources to faces. (c) 2015 APA, all rights reserved.

  1. The impact of oxytocin administration and maternal love withdrawal on event-related potential (ERP) responses to emotional faces with performance feedback.

    PubMed

    Huffmeijer, Renske; Alink, Lenneke R A; Tops, Mattie; Grewen, Karen M; Light, Kathleen C; Bakermans-Kranenburg, Marian J; van Ijzendoorn, Marinus H

    2013-03-01

    This is the first experimental study on the effect of oxytocin administration on the neural processing of facial stimuli conducted with female participants that uses event-related potentials (ERPs). Using a double-blind, placebo-controlled within-subjects design, we studied the effects of 16 IU of intranasal oxytocin on ERPs to pictures combining performance feedback with emotional facial expressions in 48 female undergraduate students. Participants also reported on the amount of love withdrawal they experienced from their mothers. Vertex positive potential (VPP) and late positive potential (LPP) amplitudes were more positive after oxytocin compared to placebo administration. This suggests that oxytocin increased attention to the feedback stimuli (LPP) and enhanced the processing of emotional faces (VPP). Oxytocin heightened processing of the happy and disgusted faces primarily for those reporting less love withdrawal. Significant associations with LPP amplitude suggest that more maternal love withdrawal relates to the allocation of attention toward the motivationally relevant combination of negative feedback with a disgusted face. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. Neural Processing of Emotional Facial and Semantic Expressions in Euthymic Bipolar Disorder (BD) and Its Association with Theory of Mind (ToM)

    PubMed Central

    Petroni, Agustín; Baez, Sandra; Lopez, Vladimir; do Nascimento, Micaela; Herrera, Eduar; Guex, Raphael; Hurtado, Esteban; Blenkmann, Alejandro; Beltrachini, Leandro; Gelormini, Carlos; Sigman, Mariano; Lischinsky, Alicia; Torralva, Teresa; Torrente, Fernando; Cetkovich, Marcelo; Manes, Facundo

    2012-01-01

    Background: Adults with bipolar disorder (BD) have cognitive impairments that affect face processing and social cognition. However, it remains unknown whether these deficits in euthymic BD are accompanied by impaired brain markers of emotional processing. Methodology/Principal Findings: We recruited twenty-six participants: 13 control subjects and an equal number of euthymic BD participants. We used an event-related potential (ERP) assessment of a dual valence task (DVT), in which faces (angry and happy), words (pleasant and unpleasant), and face-word simultaneous combinations are presented to test the effects of the stimulus type (face vs. word) and valence (positive vs. negative). All participants received clinical, neuropsychological and social cognition evaluations. ERP analysis revealed that both groups showed N170 modulation of stimulus type effects (face > word). BD patients exhibited reduced and enhanced N170 to facial and semantic valence, respectively. The neural source estimation of N170 was a posterior section of the fusiform gyrus (FG), including the face fusiform area (FFA). Neural generators of N170 for faces (FG and FFA) were reduced in BD. In these patients, N170 modulation was associated with social cognition (theory of mind). Conclusions/Significance: This is the first report of euthymic BD exhibiting abnormal N170 emotional discrimination associated with theory of mind impairments. PMID:23056505

  3. Rapid facial reactions to emotional facial expressions in typically developing children and children with autism spectrum disorder.

    PubMed

    Beall, Paula M; Moody, Eric J; McIntosh, Daniel N; Hepburn, Susan L; Reed, Catherine L

    2008-11-01

    Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.

  4. Evaluating faces on trustworthiness: an extension of systems for recognition of emotions signaling approach/avoidance behaviors.

    PubMed

    Todorov, Alexander

    2008-03-01

    People routinely make various trait judgments from facial appearance, and such judgments affect important social outcomes. These judgments are highly correlated with each other, reflecting the fact that valence evaluation permeates trait judgments from faces. Trustworthiness judgments best approximate this evaluation, consistent with evidence about the involvement of the amygdala in the implicit evaluation of face trustworthiness. Based on computer modeling and behavioral experiments, I argue that face evaluation is an extension of functionally adaptive systems for understanding the communicative meaning of emotional expressions. Specifically, in the absence of diagnostic emotional cues, trustworthiness judgments are an attempt to infer behavioral intentions signaling approach/avoidance behaviors. Correspondingly, these judgments are derived from facial features that resemble emotional expressions signaling such behaviors: happiness and anger for the positive and negative ends of the trustworthiness continuum, respectively. The emotion overgeneralization hypothesis can explain highly efficient but not necessarily accurate trait judgments from faces, a pattern that appears puzzling from an evolutionary point of view and also generates novel predictions about brain responses to faces. Specifically, this hypothesis predicts a nonlinear response in the amygdala to face trustworthiness, confirmed in functional magnetic resonance imaging (fMRI) studies, and dissociations between processing of facial identity and face evaluation, confirmed in studies with developmental prosopagnosics. I conclude with some methodological implications for the study of face evaluation, focusing on the advantages of formally modeling representation of faces on social dimensions.

  5. Amygdala activity and prefrontal cortex-amygdala effective connectivity to emerging emotional faces distinguish remitted and depressed mood states in bipolar disorder.

    PubMed

    Perlman, Susan B; Almeida, Jorge R C; Kronhaus, Dina M; Versace, Amelia; Labarbara, Edmund J; Klein, Crystal R; Phillips, Mary L

    2012-03-01

    Few studies have employed effective connectivity (EC) to examine the functional integrity of neural circuitry supporting abnormal emotion processing in bipolar disorder (BD), a key feature of the illness. We used Granger Causality Mapping (GCM) to map EC between the prefrontal cortex (PFC) and bilateral amygdala and a novel paradigm to assess emotion processing in adults with BD. Thirty-one remitted adults with BD [(remitted BD), mean age = 32 years], 21 adults with BD in a depressed episode [(depressed BD), mean age = 33 years], and 25 healthy control participants [(HC), mean age = 31 years] performed a block-design emotion processing task requiring color-labeling of a color flash superimposed on a task-irrelevant face morphing from neutral to emotional (happy, sad, angry, or fearful). GCM measured EC preceding (top-down) and following (bottom-up) activity between the PFC and the left and right amygdalae. Our findings indicated patterns of abnormally elevated bilateral amygdala activity in response to emerging fearful, sad, and angry facial expressions in remitted-BD subjects versus HC, and abnormally elevated right amygdala activity to emerging fearful faces in depressed-BD subjects versus HC. We also showed distinguishable patterns of abnormal EC between the amygdala and dorsomedial and ventrolateral PFC, especially to emerging happy and sad facial expressions in remitted-BD and depressed-BD subjects. EC measures of neural system level functioning can further understanding of neural mechanisms associated with abnormal emotion processing and regulation in BD. Our findings suggest major differences in recruitment of amygdala-PFC circuitry, supporting implicit emotion processing between remitted-BD and depressed-BD subjects, which may underlie changes from remission to depression in BD. © 2012 John Wiley and Sons A/S.

  6. Preferential responses in amygdala and insula during presentation of facial contempt and disgust.

    PubMed

    Sambataro, Fabio; Dimalta, Savino; Di Giorgio, Annabella; Taurisano, Paolo; Blasi, Giuseppe; Scarabino, Tommaso; Giannatempo, Giuseppe; Nardini, Marcello; Bertolino, Alessandro

    2006-10-01

    Some authors consider contempt to be a basic emotion while others consider it a variant of disgust. The neural correlates of contempt have not so far been specifically contrasted with disgust. Using functional magnetic resonance imaging (fMRI), we investigated the neural networks involved in the processing of facial contempt and disgust in 24 healthy subjects. Facial recognition of contempt was lower than that of disgust and of neutral faces. The imaging data indicated significant activity in the amygdala and in globus pallidus and putamen during processing of contemptuous faces. Bilateral insula and caudate nuclei and left as well as right inferior frontal gyrus were engaged during processing of disgusted faces. Moreover, direct comparisons of contempt vs. disgust yielded significantly different activations in the amygdala. On the other hand, disgusted faces elicited greater activation than contemptuous faces in the right insula and caudate. Our findings suggest preferential involvement of different neural substrates in the processing of facial emotional expressions of contempt and disgust.

  7. Guanfacine modulates the emotional biasing of amygdala-prefrontal connectivity for cognitive control.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Newcorn, Jeffrey H; Halperin, Jeffrey M; Fan, Jin

    2014-09-01

    Functional interactions between amygdala and prefrontal cortex provide a cortical entry point for emotional cues to bias cognitive control. Stimulation of α2 adrenoceptors enhances the prefrontal control functions and blocks the amygdala-dependent encoding of emotional cues. However, the impact of this stimulation on amygdala-prefrontal interactions and the emotional biasing of cognitive control have not been established. We tested the effect of the α2 adrenoceptor agonist guanfacine on psychophysiological interactions of amygdala with prefrontal cortex for the emotional biasing of response execution and inhibition. Fifteen healthy adults were scanned twice with event-related functional magnetic resonance imaging while performing an emotional go/no-go task following administration of oral guanfacine (1 mg) and placebo in a double-blind, counterbalanced design. Happy, sad, and neutral faces served as trial cues. Guanfacine moderated the effect of face emotion on the task-related functional connectivity of left and right amygdala with left inferior frontal gyrus compared to placebo, by selectively reversing the functional co-activation of the two regions for response execution cued by sad faces. This shift from positively to negatively correlated activation for guanfacine was associated with selective improvements in the relatively low accuracy of responses to sad faces seen for placebo. These results demonstrate the importance of functional interactions between amygdala and inferior frontal gyrus to both bottom-up biasing of cognitive control and top-down control of emotional processing, as well as for the α2 adrenoceptor-mediated modulation of these processes. These mechanisms offer a possible method to address the emotional reactivity that is common to several psychiatric disorders. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.

  8. Age and gender modulate the neural circuitry supporting facial emotion processing in adults with major depressive disorder.

    PubMed

    Briceño, Emily M; Rapport, Lisa J; Kassel, Michelle T; Bieliauskas, Linas A; Zubieta, Jon-Kar; Weisenbach, Sara L; Langenecker, Scott A

    2015-03-01

    Emotion processing, supported by frontolimbic circuitry known to be sensitive to the effects of aging, is a relatively understudied cognitive-emotional domain in geriatric depression. Some evidence suggests that the neurophysiological disruption observed in emotion processing among adults with major depressive disorder (MDD) may be modulated by both gender and age. Therefore, the present study investigated the effects of gender and age on the neural circuitry supporting emotion processing in MDD. Cross-sectional comparison of fMRI signal during performance of an emotion processing task. Outpatient university setting. One hundred adults recruited by MDD status, gender, and age. Participants underwent fMRI while completing the Facial Emotion Perception Test. They viewed photographs of faces and categorized the emotion perceived. Contrast for fMRI was of face perception minus animal identification blocks. Effects of depression were observed in precuneus and effects of age in a number of frontolimbic regions. Three-way interactions were present between MDD status, gender, and age in regions pertinent to emotion processing, including frontal, limbic, and basal ganglia. Young women with MDD and older men with MDD exhibited hyperactivation in these regions compared with their respective same-gender healthy comparison (HC) counterparts. In contrast, older women and younger men with MDD exhibited hypoactivation compared to their respective same-gender HC counterparts. This is the first study to report gender- and age-specific differences in emotion processing circuitry in MDD. Gender-differential mechanisms may underlie cognitive-emotional disruption in older adults with MDD. The present findings have implications for improved probes into the heterogeneity of the MDD syndrome. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  9. Sleepiness induced by sleep-debt enhanced amygdala activity for subliminal signals of fear.

    PubMed

    Motomura, Yuki; Kitamura, Shingo; Oba, Kentaro; Terasawa, Yuri; Enomoto, Minori; Katayose, Yasuko; Hida, Akiko; Moriguchi, Yoshiya; Higuchi, Shigekazu; Mishima, Kazuo

    2014-08-19

    Emotional information is frequently processed below the level of consciousness, where subcortical regions of the brain are thought to play an important role. In the absence of conscious visual experience, patients with visual cortex damage discriminate the valence of emotional expression. Even in healthy individuals, a subliminal mechanism can be utilized to compensate for a functional decline in visual cognition of various causes such as strong sleepiness. In this study, sleep deprivation was simulated in healthy individuals to investigate functional alterations in the subliminal processing of emotional information caused by reduced conscious visual cognition and attention due to an increase in subjective sleepiness. Fourteen healthy adult men participated in a within-subject crossover study consisting of a 5-day session of sleep debt (SD, 4-h sleep) and a 5-day session of sleep control (SC, 8-h sleep). On the last day of each session, participants performed an emotional face-viewing task that included backward masking of nonconscious presentations during magnetic resonance scanning. Finally, data from eleven participants who were unaware of nonconscious face presentations were analyzed. In fear contrasts, subjective sleepiness was significantly positively correlated with activity in the amygdala, ventromedial prefrontal cortex, hippocampus, and insular cortex, and was significantly negatively correlated with the secondary and tertiary visual areas and the fusiform face area. In fear-neutral contrasts, subjective sleepiness was significantly positively correlated with activity of the bilateral amygdala. Further, changes in subjective sleepiness (the difference between the SC and SD sessions) were correlated with both changes in amygdala activity and functional connectivity between the amygdala and superior colliculus in response to subliminal fearful faces. 
Sleepiness induced a functional decline in the brain areas involved in conscious visual cognition of facial expressions, but it also enhanced subliminal emotional processing via the superior colliculus, as represented by activity in the amygdala. These findings suggest that an evolutionarily old and auxiliary subliminal hazard perception system is activated as a compensatory mechanism when conscious visual cognition is impaired. In addition, enhancement of subliminal emotional processing might cause involuntary emotional instability during sleep debt through changes in emotional response to or emotional evaluation of external stimuli.

  10. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    ERIC Educational Resources Information Center

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  11. Perceptual Biases in Processing Facial Identity and Emotion

    ERIC Educational Resources Information Center

    Coolican, Jamesie; Eskes, Gail A.; McMullen, Patricia A.; Lecky, Erin

    2008-01-01

    Normal observers demonstrate a bias to process the left sides of faces during perceptual judgments about identity or emotion. This effect suggests a right cerebral hemisphere processing bias. To test the role of the right hemisphere and the involvement of configural processing underlying this effect, young and older control observers and patients…

  12. Amygdala hyperactivation to angry faces in intermittent explosive disorder.

    PubMed

    McCloskey, Michael S; Phan, K Luan; Angstadt, Mike; Fettich, Karla C; Keedy, Sarah; Coccaro, Emil F

    2016-08-01

    Individuals with intermittent explosive disorder (IED) were previously found to exhibit amygdala hyperactivation and relatively reduced orbital medial prefrontal cortex (OMPFC) activation to angry faces while performing an implicit emotion information processing task during functional magnetic resonance imaging (fMRI). This study examines the neural substrates associated with explicit encoding of facial emotions among individuals with IED. Twenty unmedicated IED subjects and twenty healthy, matched comparison subjects (HC) underwent fMRI while viewing blocks of angry, happy, and neutral faces and identifying the emotional valence of each face (positive, negative or neutral). We compared amygdala and OMPFC reactivity to faces between IED and HC subjects. We also examined the relationship between amygdala/OMPFC activation and aggression severity. Compared to controls, the IED group exhibited greater amygdala response to angry (vs. neutral) facial expressions. In contrast, IED and control groups did not differ in OMPFC activation to angry faces. Across subjects amygdala activation to angry faces was correlated with number of prior aggressive acts. These findings extend previous evidence of amygdala dysfunction in response to the identification of an ecologically-valid social threat signal (processing angry faces) among individuals with IED, further substantiating a link between amygdala hyperactivity to social signals of direct threat and aggression. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Developmental differences in the neural mechanisms of facial emotion labeling.

    PubMed

    Wiggins, Jillian Lee; Adleman, Nancy E; Kim, Pilyoung; Oakes, Allison H; Hsu, Derek; Reynolds, Richard C; Chen, Gang; Pine, Daniel S; Brotman, Melissa A; Leibenluft, Ellen

    2016-01-01

    Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several 'ventral stream' brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  14. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    PubMed

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussion athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional face expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across sexes. (JINS, 2018, 24, 1-11).

  15. Changes in the neural correlates of implicit emotional face processing during antidepressant treatment in major depressive disorder.

    PubMed

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Öhman, Arne; Drevets, Wayne C

    2013-11-01

    An emerging hypothesis regarding the mechanisms underlying antidepressant pharmacotherapy suggests that these agents benefit depressed patients by reversing negative emotional processing biases (Harmer, 2008). Neuropsychological indices and functional neuroimaging measures of the amygdala response show that antidepressant drugs shift implicit and explicit processing biases away from the negative valence and toward the positive valence. However, few studies have explored such biases in regions extensively connected with the amygdala, such as the pregenual anterior cingulate cortex (pgACC) area, where pre-treatment activity consistently has predicted clinical outcome during antidepressant treatment. We used functional magnetic resonance imaging (fMRI) to investigate changes in haemodynamic response patterns to positive vs. negative stimuli in patients with major depressive disorder (MDD) under antidepressant treatment. Participants with MDD (n = 10) underwent fMRI before and after 8 wk sertraline treatment; healthy controls (n = 10) were imaged across an equivalent interval. A backward masking task was used to elicit non-conscious neural responses to sad, happy and neutral face expressions. Haemodynamic responses to emotional face stimuli were compared between conditions and groups in the pgACC. The response to masked-sad vs. masked-happy faces (SN-HN) in pgACC in the depressed subjects was higher in the pre-treatment condition than in the post-treatment condition and this difference was significantly greater than the corresponding change across time in the controls. The treatment-associated difference was attributable to an attenuated response to sad faces and an enhanced response to happy faces. Pre-treatment pgACC responses to SN-HN correlated positively with clinical improvement during treatment. The pgACC participates with the amygdala in processing the salience of emotional stimuli. Treatment-associated functional changes in this limbic network may influence the non-conscious processing of such stimuli by reversing the negative processing bias extant in MDD.

  16. Selective attention to emotional cues and emotion recognition in healthy subjects: the role of mineralocorticoid receptor stimulation.

    PubMed

    Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja

    2016-09-01

    Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80%). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and a shift in selective attention toward sad faces after fludrocortisone intake compared to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.
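    The dot-probe bias reported in this record is conventionally quantified as the reaction-time difference between trials where the probe replaces the neutral face and trials where it replaces the emotional face; positive scores indicate attention toward the emotion. A minimal sketch of that standard formula, using invented reaction times (this is not the authors' analysis code):

```python
# Attention bias score from a dot-probe task.
# Congruent trials: the probe appears where the emotional (e.g. sad) face was.
# Incongruent trials: the probe appears on the opposite side.
# Bias = mean RT(incongruent) - mean RT(congruent); positive values indicate
# attention oriented toward the emotional face. Trial data below are invented.

def bias_score(rt_incongruent, rt_congruent):
    """Return the attentional bias score in ms."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_incongruent) - mean(rt_congruent)

rt_congruent = [512.0, 499.0, 505.0, 520.0]    # ms, probe at sad face
rt_incongruent = [540.0, 531.0, 525.0, 544.0]  # ms, probe opposite sad face

print(bias_score(rt_incongruent, rt_congruent))  # prints 26.0 (bias toward sad faces)
```

    In a placebo-vs-fludrocortisone design like the one above, such scores would be computed per participant and condition and then compared between treatment groups.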

  17. Altered insular activation and increased insular functional connectivity during sad and happy face processing in adolescent major depressive disorder.

    PubMed

    Henje Blom, Eva; Connolly, Colm G; Ho, Tiffany C; LeWinn, Kaja Z; Mobayed, Nisreen; Han, Laura; Paulus, Martin P; Wu, Jing; Simmons, Alan N; Yang, Tony T

    2015-06-01

    Major depressive disorder (MDD) is a leading cause of disability worldwide and commonly first occurs during adolescence. The insular cortex (IC) plays an important role in integrating emotion processing with interoception and has recently been implicated in the pathophysiology of adult and adolescent MDD. However, no studies have yet specifically examined the IC in adolescent MDD during processing of faces along the sad-happy continuum. Thus, the aim of the present study is to investigate the IC during sad and happy face processing in adolescents with MDD compared to healthy controls (HCL). Thirty-one adolescents (22 female) with MDD and 36 (23 female) HCL underwent a well-validated emotional processing fMRI paradigm that included sad and happy face stimuli. The MDD group showed significantly less differential activation of the anterior/middle insular cortex (AMIC) in response to sad versus happy faces compared to the HCL group. AMIC also showed greater functional connectivity with right fusiform gyrus, left middle frontal gyrus, and right amygdala/parahippocampal gyrus in the MDD compared to HCL group. Moreover, differential activation to sad and happy faces in AMIC correlated negatively with depression severity within the MDD group. The small age range and cross-sectional design precluded assessment of the development of the AMIC in adolescent depression. Given the role of the IC in integrating bodily stimuli with conscious cognitive and emotional processes, our findings of aberrant AMIC function in adolescent MDD provide a neuroscientific rationale for targeting the AMIC in the development of new treatment modalities. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Impaired recognition of body expressions in the behavioral variant of frontotemporal dementia.

    PubMed

    Van den Stock, Jan; De Winter, François-Laurent; de Gelder, Beatrice; Rangarajan, Janaki Raman; Cypers, Gert; Maes, Frederik; Sunaert, Stefan; Goffin, Karolien; Vandenberghe, Rik; Vandenbulcke, Mathieu

    2015-08-01

    Progressive deterioration of social cognition and emotion processing are core symptoms of the behavioral variant of frontotemporal dementia (bvFTD). Here we investigate whether bvFTD is also associated with impaired recognition of static (Experiment 1) and dynamic (Experiment 2) bodily expressions. In addition, we compared body expression processing with processing of static (Experiment 3) and dynamic (Experiment 4) facial expressions, as well as with face identity processing (Experiment 5). The results reveal that bvFTD is associated with impaired recognition of static and dynamic bodily and facial expressions, while identity processing was intact. No differential impairments were observed regarding motion (static vs. dynamic) or category (body vs. face). Within the bvFTD group, we observed a significant partial correlation between body and face expression recognition, when controlling for performance on the identity task. Voxel-Based Morphometry (VBM) analysis revealed that body emotion recognition was positively associated with gray matter volume in a region of the inferior frontal gyrus (pars orbitalis/triangularis). The results are in line with a supramodal emotion recognition deficit in bvFTD. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. The impact of the stimulus features and task instructions on facial processing in social anxiety: an ERP investigation.

    PubMed

    Peschard, Virginie; Philippot, Pierre; Joassin, Frédéric; Rossignol, Mandy

    2013-04-01

    Social anxiety has been characterized by an attentional bias towards threatening faces. Electrophysiological studies have demonstrated modulations of cognitive processing from 100 ms after stimulus presentation. However, the impact of the stimulus features and task instructions on facial processing remains unclear. Event-related potentials were recorded while high and low socially anxious individuals performed an adapted Stroop paradigm that included a colour-naming task with non-emotional stimuli, an emotion-naming task (the explicit task) and a colour-naming task (the implicit task) on happy, angry and neutral faces. Whereas the impact of task factors was examined by contrasting an explicit and an implicit emotional task, the effects of perceptual changes on facial processing were explored by including upright and inverted faces. The findings showed an enhanced P1 in social anxiety during the three tasks, without a moderating effect of the type of task or stimulus. These results suggest a global modulation of attentional processing in performance situations. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. The Older Adult Positivity Effect in Evaluations of Trustworthiness: Emotion Regulation or Cognitive Capacity?

    PubMed

    Zebrowitz, Leslie A; Boshyan, Jasmine; Ward, Noreen; Gutchess, Angela; Hadjikhani, Nouchine

    2017-01-01

    An older adult positivity effect, i.e., the tendency for older adults to favor positive over negative stimulus information more than do younger adults, has been previously shown in attention, memory, and evaluations. This effect has been attributed to greater emotion regulation in older adults. In the case of attention and memory, this explanation has been supported by some evidence that the older adult positivity effect is most pronounced for negative stimuli, which would motivate emotion regulation, and that it is reduced by cognitive load, which would impede emotion regulation. We investigated whether greater older adult positivity in the case of evaluative responses to faces is also enhanced for negative stimuli and attenuated by cognitive load, as an emotion regulation explanation would predict. In two studies, younger and older adults rated trustworthiness of faces that varied in valence both under low and high cognitive load, with the latter manipulated by a distracting backwards counting task. In Study 1, face valence was manipulated by attractiveness (low/disfigured faces, medium, high/fashion models' faces). In Study 2, face valence was manipulated by trustworthiness (low, medium, high). Both studies revealed a significant older adult positivity effect. However, contrary to an emotion regulation account, this effect was not stronger for more negative faces, and cognitive load increased rather than decreased the rated trustworthiness of negatively valenced faces. Although inconsistent with emotion regulation, the latter effect is consistent with theory and research arguing that more cognitive resources are required to process negative stimuli, because they are more cognitively elaborated than positive ones. The finding that increased age and increased cognitive load both enhanced the positivity of trustworthiness ratings suggests that the older adult positivity effect in evaluative ratings of faces may reflect age-related declines in cognitive capacity rather than increases in the regulation of negative emotions.

  1. Perception of emotion on faces in frontotemporal dementia and Alzheimer's disease: a longitudinal study.

    PubMed

    Lavenu, I; Pasquier, F

    2005-01-01

    Frontotemporal dementia (FTD) is a neurodegenerative disease characterised by behavioural disorders that suggest abnormalities of emotional processing. In a previous study, we showed that patients with Alzheimer's disease (AD) and with FTD were equally able to distinguish a face displaying affect from one not displaying affect. However, recognition of emotion was worse in patients with FTD than in patients with AD, who did not differ significantly from controls. The aim of this study was to follow up the perception of emotions on faces in these patients. The poor perception of emotion could worsen differently in AD and in FTD, with the progression of atrophy of the amygdala, the anterior temporal cortex and the orbital frontal cortex, structures that are components of the brain's emotional processing systems. Patients with AD or with FTD had to recognise and point out the name of one of seven basic emotions (anger, disgust, happiness, fear, sadness, surprise and contempt) on a set of 28 faces presented on slides at the first visit and 3 years later. Thirty-seven patients (AD = 19, FTD = 18) performed the tests initially. The two patient groups did not differ for age, sex and duration of the disease. During the follow-up, 12 patients died, 4 patients refused to perform the tests and 8 could not be tested because of the severity of the disease. Finally, 7 patients with AD and 6 patients with FTD performed the two tests at a mean delay of 40 months. All patients with AD had worse results at follow-up on the perception of emotion despite the prescription of inhibitors of cholinesterase in all patients and of selective serotonin reuptake inhibitors (SSRIs) in 4 patients. As a whole, patients with FTD had better results in the second than in the first assessment (however, 3 of them had worse results) independently of the prescription of trazodone (n = 2), other SSRIs (n = 2), or the absence of treatment (n = 2), and of possible cognitive change. Recognition of emotion on faces in AD decreases with the progression of dementia and could be related to the progression of the degeneration of the structures implicated in emotional processing systems. Inconsistency of the results in FTD may be related to impulsiveness, lack of consistency of the patients and to heterogeneity of the progression of the lesions. Copyright 2005 S. Karger AG, Basel.

  2. How does context affect assessments of facial emotion? The role of culture and age.

    PubMed

    Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara

    2011-03-01

    People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. (c) 2011 APA, all rights reserved.

  3. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    NASA Astrophysics Data System (ADS)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a U-shaped modulation, with stronger reactions towards both most abstract and most realistic compared to medium-stylized faces. For abstract faces, N170 was generated more occipitally than for real faces, implying stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, realism and expression interacted on the N170. Finally, LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.

  4. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    PubMed Central

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-01-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a U-shaped modulation, with stronger reactions towards both most abstract and most realistic compared to medium-stylized faces. For abstract faces, N170 was generated more occipitally than for real faces, implying stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, realism and expression interacted on the N170. Finally, LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics. PMID:28332557

  5. Influence of emotional processing on working memory in schizophrenia.

    PubMed

    Becerril, Karla; Barch, Deanna

    2011-09-01

    Research on emotional processing in schizophrenia suggests relatively intact subjective responses to affective stimuli "in the moment." However, neuroimaging evidence suggests diminished activation in brain regions associated with emotional processing in schizophrenia. We asked whether, given a more vulnerable cognitive system in schizophrenia, individuals with this disorder would show increased or decreased modulation of working memory (WM) as a function of the emotional content of stimuli compared with healthy control subjects. In addition, we examined whether higher anhedonia levels were associated with a diminished impact of emotion on behavioral and brain activation responses. In the present study, 38 individuals with schizophrenia and 32 healthy individuals completed blocks of a 2-back WM task in a functional magnetic resonance imaging scanning session. Blocks contained faces displaying either only neutral stimuli or neutral and emotional stimuli (happy or fearful faces), randomly intermixed and occurring both as targets and non-targets. Both groups showed higher accuracy but slower reaction time for negative compared to neutral stimuli. Individuals with schizophrenia showed intact amygdala activity in response to emotionally evocative stimuli, but demonstrated altered dorsolateral prefrontal cortex (DLPFC) and hippocampal activity while performing an emotionally loaded WM task. Higher levels of social anhedonia were associated with diminished amygdala responses to emotional stimuli and increased DLPFC activity in individuals with schizophrenia. Emotional arousal may challenge dorsal-frontal control systems, which may have both beneficial and detrimental influences. Our findings suggest that disturbances in emotional processing in schizophrenia relate to alterations in emotion-cognition interactions rather than to the perception and subjective experience of emotion per se.

  6. Long-Lasting Effects of Subliminal Affective Priming from Facial Expressions

    PubMed Central

    Sweeny, Timothy D.; Grabowecky, Marcia; Suzuki, Satoru; Paller, Ken A.

    2009-01-01

    Unconscious processing of stimuli with emotional content can bias affective judgments. Is this subliminal affective priming merely a transient phenomenon manifested in fleeting perceptual changes, or are long-lasting effects also induced? To address this question, we investigated memory for surprise faces 24 hours after they had been shown with 30-ms fearful, happy, or neutral faces. Surprise faces subliminally primed by happy faces were initially rated as more positive, and were later remembered better, than those primed by fearful or neutral faces. Participants likely to have processed primes supraliminally did not respond differentially as a function of expression. These results converge with findings showing memory advantages with happy expressions, though here the expressions were displayed on the face of a different person, perceived subliminally, and not present at test. We conclude that behavioral biases induced by masked emotional expressions are not ephemeral, but rather can last at least 24 hours. PMID:19695907

  7. Long-lasting effects of subliminal affective priming from facial expressions.

    PubMed

    Sweeny, Timothy D; Grabowecky, Marcia; Suzuki, Satoru; Paller, Ken A

    2009-12-01

    Unconscious processing of stimuli with emotional content can bias affective judgments. Is this subliminal affective priming merely a transient phenomenon manifested in fleeting perceptual changes, or are long-lasting effects also induced? To address this question, we investigated memory for surprise faces 24 h after they had been shown with 30-ms fearful, happy, or neutral faces. Surprise faces subliminally primed by happy faces were initially rated as more positive, and were later remembered better, than those primed by fearful or neutral faces. Participants likely to have processed primes supraliminally did not respond differentially as a function of expression. These results converge with findings showing memory advantages with happy expressions, though here the expressions were displayed on the face of a different person, perceived subliminally, and not present at test. We conclude that behavioral biases induced by masked emotional expressions are not ephemeral, but rather can last at least 24 h.

  8. Emotion Words, Regardless of Polarity, Have a Processing Advantage over Neutral Words

    ERIC Educational Resources Information Center

    Kousta, Stavroula-Thaleia; Vinson, David P.; Vigliocco, Gabriella

    2009-01-01

    Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat.…

  9. When Emotions Matter: Focusing on Emotion Improves Working Memory Updating in Older Adults

    PubMed Central

    Berger, Natalie; Richards, Anne; Davelaar, Eddy J.

    2017-01-01

    Research indicates that emotion can affect the ability to monitor and replace content in working memory, an executive function that is usually referred to as updating. However, it is less clear if the effects of emotion on updating vary with its relevance for the task and with age. Here, 25 younger (20–34 years of age) and 25 older adults (63–80 years of age) performed a 1-back and a 2-back task, in which they responded to younger, middle-aged, and older faces showing neutral, happy or angry expressions. The relevance of emotion for the task was manipulated through instructions to make match/non-match judgments based on the emotion (i.e., emotion was task-relevant) or the age (i.e., emotion was task-irrelevant) of the face. It was found that only older adults updated emotional faces more readily compared to neutral faces as evidenced by faster RTs on non-match trials. This emotion benefit was observed under low-load conditions (1-back task) but not under high-load conditions (2-back task) and only if emotion was task-relevant. In contrast, task-irrelevant emotion did not impair updating performance in either age group. These findings suggest that older adults can benefit from task-relevant emotional information to a greater extent than younger adults when sufficient cognitive resources are available. They also highlight that emotional processing can buffer age-related decline in WM tasks that require not only maintenance but also manipulation of material. PMID:28966602
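    The 1-back/2-back manipulation in this record can be made concrete: a trial counts as a "match" when the current face equals the one presented n trials earlier on the task-relevant dimension (emotion or age). A minimal sketch of that match logic, with invented stimulus tuples (not the authors' experiment code):

```python
# n-back match judgments on a stream of face stimuli.
# Each stimulus is a (age_group, emotion) tuple; the task-relevant
# dimension selects which field is compared. Illustrative only.

AGE, EMOTION = 0, 1

def nback_responses(stream, n, dimension):
    """Return the correct match/non-match response for each trial
    from index n onward, comparing the task-relevant dimension."""
    return [stream[i][dimension] == stream[i - n][dimension]
            for i in range(n, len(stream))]

faces = [("young", "happy"), ("old", "happy"),
         ("young", "angry"), ("old", "angry")]

print(nback_responses(faces, 1, EMOTION))  # emotion task-relevant, 1-back
print(nback_responses(faces, 2, AGE))      # age task-relevant, 2-back
```

    The same stimulus stream thus yields different correct responses under the two instruction conditions, which is how the study renders emotion task-relevant or task-irrelevant without changing the stimuli.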

  10. Facing changes and changing faces in adolescence: A new model for investigating adolescent-specific interactions between pubertal, brain and behavioral development

    PubMed Central

    Scherf, K. Suzanne; Behrmann, Marlene; Dahl, Ronald E.

    2015-01-01

    Adolescence is a time of dramatic physical, cognitive, emotional, and social changes as well as a time for the development of many social-emotional problems. These characteristics raise compelling questions about accompanying neural changes that are unique to this period of development. Here, we propose that studying adolescent-specific changes in face processing and its underlying neural circuitry provides an ideal model for addressing these questions. We also use this model to formulate new hypotheses. Specifically, pubertal hormones are likely to increase motivation to master new peer-oriented developmental tasks, which will, in turn, instigate the emergence of new social/affective components of face processing. We also predict that pubertal hormones have a fundamental impact on the reorganization of neural circuitry supporting face processing and propose, in particular, that the functional connectivity, or temporal synchrony, between regions of the face-processing network will change with the emergence of these new components of face processing in adolescence. Finally, we show how this approach will help reveal why adolescence may be a period of vulnerability in brain development and suggest how it could lead to prevention and intervention strategies that facilitate more adaptive functional interactions between regions within the broader social information processing network. PMID:22483070

  11. Normative data on development of neural and behavioral mechanisms underlying attention orienting toward social-emotional stimuli: An exploratory study

    PubMed Central

    Lindstrom, Kara; Guyer, Amanda E.; Mogg, Karin; Bradley, Brendan P.; Fox, Nathan A.; Ernst, Monique; Nelson, Eric E.; Leibenluft, Ellen; Britton, Jennifer C.; Monk, Christopher S.; Pine, Daniel S.; Bar-Haim, Yair

    2009-01-01

    The ability of positive and negative facial signals to influence attention orienting is crucial to social functioning. Given the dramatic developmental change in neural architecture supporting social function, positive and negative facial cues may influence attention orienting differently in relatively young or old individuals. However, virtually no research examines such age-related differences in the neural circuitry supporting attention orienting to emotional faces. We examined age-related correlations in attention-orienting biases to positive and negative face emotions in a healthy sample (N=37; 9-40 years old) using functional magnetic resonance imaging and a dot-probe task. The dot-probe task in an fMRI setting yields both behavioral and neural indices of attention biases towards or away from an emotional cue (happy or angry face). In the full sample, angry-face attention bias scores did not correlate with age, and age did not correlate with brain activation to angry faces. However, age did positively correlate with attention bias towards happy faces; age also negatively correlated with left cuneus and left caudate activation to a happy-bias fMRI contrast. Secondary analyses suggested age-related changes in attention bias to happy faces. The tendency in younger children to direct attention away from happy faces (relative to neutral faces) was diminished in the older age groups, in tandem with increasing neural deactivation. Implications for future work on developmental changes in attention-emotion processing are discussed. PMID:19631626

  12. Exploring the unconscious using faces.

    PubMed

    Axelrod, Vadim; Bar, Moshe; Rees, Geraint

    2015-01-01

    Understanding the mechanisms of unconscious processing is one of the most substantial endeavors of cognitive science. While there are many different empirical ways to address this question, the use of faces in such research has proven exceptionally fruitful. We review here what has been learned about unconscious processing through the use of faces and face-selective neural correlates. A large number of cognitive systems can be explored with faces, including emotions, social cueing and evaluation, attention, multisensory integration, and various aspects of face processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Divergent Patterns of Social Cognition Performance in Autism and 22q11.2 Deletion Syndrome (22q11DS)

    ERIC Educational Resources Information Center

    McCabe, Kathryn L.; Melville, Jessica L.; Rich, Dominique; Strutt, Paul A.; Cooper, Gavin; Loughland, Carmel M.; Schall, Ulrich; Campbell, Linda E.

    2013-01-01

    Individuals with developmental disorders frequently report a range of social cognition deficits including difficulties identifying facial displays of emotion. This study examined the specificity of face emotion processing deficits in adolescents with either autism or 22q11DS compared to typically developing (TD) controls. Two tasks (face emotion…

  14. Gender Differences in Neural Responses to Perceptually Invisible Fearful Face—An ERP Study

    PubMed Central

    Lee, Seung A.; Kim, Chai-Youn; Shim, Miseon; Lee, Seung-Hwan

    2017-01-01

    Women tend to respond to emotional stimuli differently from men. This study investigated whether neural responses to perceptually “invisible” emotional stimuli differ between men and women, using event-related potentials (ERPs). Forty healthy participants (21 women) were recruited for the main experiment. A control experiment was conducted by excluding nine participants (7 women) from the main experiment and replacing them with an additional ten (6 women), for a total of 41 participants, with Beck's Anxiety Inventory (BAI) and Beck's Depression Inventory (BDI) scores controlled. Using the visual backward masking paradigm, either a fearful or a neutral face stimulus was presented for varied durations (subthreshold, near-threshold, or suprathreshold), followed by a mask. Participants performed a two-alternative forced choice (2-AFC) emotion discrimination task on each face. Behavioral analysis showed that participants were unaware of the masked stimuli presented at the shortest duration, which were therefore processed at subthreshold. Nevertheless, women showed significantly larger P100 amplitudes in response to subthreshold fearful faces than men. This result remained consistent in the control experiment. Our findings indicate gender differences in the neural response to subthreshold emotional faces, reflected at an early processing stage. PMID:28184189

  15. Recognition of facial, auditory, and bodily emotions in older adults.

    PubMed

    Ruffman, Ted; Halberstadt, Jamin; Murray, Janice

    2009-11-01

    Understanding older adults' social functioning difficulties requires insight into their recognition of emotion in voices and bodies, not just faces, the focus of most prior research. We examined 60 young and 61 older adults' recognition of basic emotions in facial, vocal, and bodily expressions, and when matching faces and bodies to voices, using 120 emotion items. Older adults were worse than young adults in 17 of 30 comparisons, with consistent difficulties in recognizing both positive (happy) and negative (angry and sad) vocal and bodily expressions. Nearly three quarters of older adults functioned at a level similar to the lowest one fourth of young adults, suggesting that age-related changes are common. In addition, we found that older adults' difficulty in matching emotions was not explained by difficulty with the component sources (i.e., faces or voices on their own), suggesting an additional problem of integration.

  16. MEG demonstrates a supra-additive response to facial and vocal emotion in the right superior temporal sulcus.

    PubMed

    Hagan, Cindy C; Woods, Will; Johnson, Sam; Calder, Andrew J; Green, Gary G R; Young, Andrew W

    2009-11-24

    An influential neural model of face perception suggests that the posterior superior temporal sulcus (STS) is sensitive to those aspects of faces that produce transient visual changes, including facial expression. Other researchers note that recognition of expression involves multiple sensory modalities and suggest that the STS also may respond to crossmodal facial signals that change transiently. Indeed, many studies of audiovisual (AV) speech perception show STS involvement in AV speech integration. Here we examine whether these findings extend to AV emotion. We used magnetoencephalography to measure the neural responses of participants as they viewed and heard emotionally congruent fear and minimally congruent neutral face and voice stimuli. We demonstrate significant supra-additive responses (i.e., where AV > [unimodal auditory + unimodal visual]) in the posterior STS within the first 250 ms for emotionally congruent AV stimuli. These findings show a role for the STS in processing crossmodal emotive signals.
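    The supra-additivity criterion reported above (AV > unimodal auditory + unimodal visual) reduces to a simple comparison over response amplitudes. A minimal sketch; the values are illustrative, not measurements from the study:

```python
# Supra-additivity check on hypothetical evoked-response amplitudes.
# Criterion from the abstract: AV > (unimodal auditory + unimodal visual).

def is_supra_additive(av, a, v):
    """True when the audiovisual response exceeds the sum of unimodal responses."""
    return av > (a + v)

# Illustrative amplitudes (arbitrary units), not values from the study.
print(is_supra_additive(av=9.5, a=4.0, v=4.5))  # True: 9.5 > 8.5
print(is_supra_additive(av=8.0, a=4.0, v=4.5))  # False: 8.0 <= 8.5
```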

  17. Brief report: Representational momentum for dynamic facial expressions in pervasive developmental disorder.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2010-03-01

    Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of expressed emotion in 13 individuals with PDD and 13 typically developing controls. We presented dynamic and static emotional (fearful and happy) expressions. Participants were asked to match a changeable emotional face display with the last presented image. The results showed that both groups perceived the last image of dynamic facial expression to be more emotionally exaggerated than the static facial expression. This finding suggests that individuals with PDD have an intact perceptual mechanism for processing dynamic information in another individual's face.

  18. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    PubMed

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  19. Emotional Processing of Infants Displays in Eating Disorders

    PubMed Central

    Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet

    2014-01-01

    Aim The aim of this study was to examine emotional processing of infant displays in people with Eating Disorders (EDs). Background Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have mainly been studied in relation to adult interactions, with less attention given to interactions with infants. Method A sample of 138 women was recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions in response to video clips depicting sad, happy and frustrated infants were also recorded. Results No significant differences between groups were observed in the attentional response to infant photographs, although there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed significantly less production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip. Conclusion People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do show reduced facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets. PMID:25463051

  20. Amygdala atrophy affects emotion-related activity in face-responsive regions in frontotemporal degeneration.

    PubMed

    De Winter, François-Laurent; Van den Stock, Jan; de Gelder, Beatrice; Peeters, Ronald; Jastorff, Jan; Sunaert, Stefan; Vanduffel, Wim; Vandenberghe, Rik; Vandenbulcke, Mathieu

    2016-09-01

    In the healthy brain, modulatory influences from the amygdala commonly explain enhanced activation in face-responsive areas by emotional facial expressions relative to neutral expressions. In behavioral variant frontotemporal dementia (bvFTD), facial emotion recognition is impaired and has been associated with atrophy of the amygdala. By combining structural and functional MRI in 19 patients with bvFTD and 20 controls, we investigated the neural effects of emotion in face-responsive cortex and its relationship with amygdalar gray matter (GM) volume in neurodegeneration. Voxel-based morphometry revealed decreased GM volume in anterior medio-temporal regions, including the amygdala, in patients compared to controls. During fMRI, we presented dynamic facial expressions (fear and chewing) and their spatiotemporally scrambled versions. We found enhanced activation for fearful compared to neutral faces in ventral temporal cortex and superior temporal sulcus in controls, but not in patients. In the bvFTD group, left amygdalar GM volume correlated positively with emotion-related activity in the left fusiform face area (FFA). This correlation was amygdala-specific and driven by GM in the superficial and basolateral (BLA) subnuclei, consistent with reported amygdalar-cortical networks. These data suggest that anterior medio-temporal atrophy in bvFTD affects emotion processing in distant posterior areas. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Influence of emotional expression on memory recognition bias in schizophrenia as revealed by fMRI.

    PubMed

    Sergerie, Karine; Armony, Jorge L; Menear, Matthew; Sutton, Hazel; Lepage, Martin

    2010-07-01

    We recently showed that, in healthy individuals, emotional expression influences memory for faces both in terms of accuracy and, critically, in memory response bias (the tendency to classify stimuli as previously seen or not, regardless of whether this was the case). Although schizophrenia has been shown to be associated with deficits in episodic memory and emotional processing, the relation between these processes in this population remains unclear. Here, we used our previously validated paradigm to directly investigate the modulation of memory recognition by emotion. Twenty patients with schizophrenia and matched healthy controls completed a functional magnetic resonance imaging (fMRI) study of recognition memory for happy, sad, and neutral faces. Brain activity associated with the response bias was obtained by correlating this measure with the contrast subjective old (i.e., hits and false alarms) minus subjective new (i.e., misses and correct rejections) for sad and happy expressions. Although patients exhibited overall lower memory performance than controls, they showed the same effects of emotion on memory, both in terms of accuracy and bias. For sad faces, the similar behavioral pattern between groups was mirrored by a largely overlapping neural network, mostly involved in familiarity-based judgments (e.g., parahippocampal gyrus). In contrast, controls activated a much larger set of regions for happy faces, including areas thought to underlie recollection-based memory retrieval (e.g., superior frontal gyrus and hippocampus) and novelty detection (e.g., amygdala). This study demonstrates that, despite overall lower memory accuracy, emotional memory is intact in schizophrenia, although emotion-specific differences in brain activation exist, possibly reflecting different strategies.
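    The subjective old/new contrast described above maps onto standard signal-detection quantities, in which response bias (criterion c) is separated from accuracy (d'). A minimal sketch of that computation from hit and false-alarm counts; the trial counts are hypothetical, not data from the study:

```python
from statistics import NormalDist

# Signal-detection indices from recognition-memory counts (hypothetical data).
# Response bias is criterion c: negative c indicates a liberal tendency to
# respond "old" regardless of whether the face was actually seen before.

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    h = hits / (hits + misses)                              # hit rate
    f = false_alarms / (false_alarms + correct_rejections)  # false-alarm rate
    d_prime = z(h) - z(f)      # sensitivity (accuracy)
    c = -0.5 * (z(h) + z(f))   # response bias (criterion)
    return d_prime, c

d, c = sdt_indices(hits=35, misses=15, false_alarms=20, correct_rejections=30)
print(f"d'={d:.2f}, c={c:.2f}")
```

    In practice, rates of exactly 0 or 1 need a correction before the z-transform; the sketch omits this for brevity.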

  2. Impaired neural processing of dynamic faces in left-onset Parkinson's disease.

    PubMed

    Garrido-Vásquez, Patricia; Pell, Marc D; Paulmann, Silke; Sehm, Bernhard; Kotz, Sonja A

    2016-02-01

    Parkinson's disease (PD) affects patients beyond the motor domain. According to previous evidence, one mechanism that may be impaired in the disease is face processing. However, few studies have investigated this process at the neural level in PD. Moreover, research using dynamic facial displays rather than static pictures is scarce, but highly warranted due to the higher ecological validity of dynamic stimuli. In the present study we aimed to investigate how PD patients process emotional and non-emotional dynamic face stimuli at the neural level using event-related potentials. Since the literature has revealed a predominantly right-lateralized network for dynamic face processing, we divided the group into patients with left (LPD) and right (RPD) motor symptom onset (right versus left cerebral hemisphere predominantly affected, respectively). Participants watched short video clips of happy, angry, and neutral expressions and engaged in a shallow gender decision task in order to avoid confounds of task difficulty in the data. In line with our expectations, the LPD group showed significant face processing deficits compared to controls. While there were no group differences in early, sensory-driven processing (fronto-central N1 and posterior P1), the vertex positive potential, which is considered the fronto-central counterpart of the face-specific posterior N170 component, had a reduced amplitude and delayed latency in the LPD group. This may indicate disturbances of structural face processing in LPD. Furthermore, the effect was independent of the emotional content of the videos. In contrast, static facial identity recognition performance in LPD was not significantly different from controls, and comprehensive testing of cognitive functions did not reveal any deficits in this group. We therefore conclude that PD, and more specifically the predominant right-hemispheric involvement in left-onset PD, is associated with impaired processing of dynamic facial expressions, which could be one of the mechanisms behind the often reported problems of PD patients in their social lives. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    PubMed

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Postural Control in Children with Dyslexia: Effects of Emotional Stimuli in a Dual-Task Environment.

    PubMed

    Goulème, Nathalie; Gerard, Christophe-Loïc; Bucci, Maria Pia

    2017-08-01

    The aim of this study was to compare the visual exploration strategies used during a postural control task by children with and without dyslexia. We simultaneously recorded eye movements and postural control while children viewed different types of emotional faces. Twenty-two children with dyslexia and twenty-two age-matched children without dyslexia participated in the study. We analysed the surface area, length, and mean velocity of the centre of pressure for balance, in parallel with saccadic latency, the number of saccades, and the time spent in regions of interest. Our results showed that postural stability in children with dyslexia was weaker, and the surface area of their centre of pressure increased significantly when they viewed an unpleasant face. Moreover, children with dyslexia used different visual exploration strategies from those used by children without dyslexia, particularly when viewing unpleasant emotional faces. We suggest that the lower performance in emotional face processing in children with dyslexia could be due to a difference in their visual strategies, linked to their identification of unpleasant emotional faces. Copyright © 2017 John Wiley & Sons, Ltd.

  5. The ties to unbind: age-related differences in feature (un)binding in working memory for emotional faces

    PubMed Central

    Pehlivanoglu, Didem; Jain, Shivangi; Ariel, Robert; Verhaeghen, Paul

    2014-01-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyper-binding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face’s emotional expression (expression condition), or both identity and expression of the face (both condition). The two age groups performed more slowly and with lower accuracy in the expression condition than in the both condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory. Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/STM and working memory. Pupil dilation data confirmed that the attention/STM version of the task (1-Back) is more effortful for older adults than younger adults. PMID:24795660

  6. The impact of facial emotional expressions on behavioral tendencies in females and males

    PubMed Central

    Seidel, Eva-Maria; Habel, Ute; Kirschner, Michaela; Gur, Ruben C.; Derntl, Birgit

    2010-01-01

    Emotional faces communicate both the emotional state and the behavioral intentions of an individual. They also activate behavioral tendencies in the perceiver, namely approach or avoidance. Here, we compared more automatic motor responses with more conscious rating responses to happy, sad, angry and disgusted faces in a healthy student sample. Happiness was associated with approach and anger with avoidance. However, behavioral tendencies in response to sadness and disgust were more complex. Sadness produced automatic approach but conscious withdrawal, probably influenced by interpersonal relations or personality. Disgust elicited withdrawal in the rating task, whereas no significant tendency emerged in the joystick task, probably driven by expression style. Given these results, it is highly relevant to further explore actual reactions to emotional expressions and to differentiate between automatic and controlled processes, since emotional faces are used in many kinds of studies. Moreover, our results highlight the importance of gender-of-poser effects when applying emotional expressions as stimuli. PMID:20364933

  7. Gaze-cueing effect depends on facial expression of emotion in 9- to 12-month-old infants

    PubMed Central

    Niedźwiecka, Alicja; Tomalski, Przemysław

    2015-01-01

    Efficient processing of gaze direction and facial expression of emotion is crucial for early social and emotional development. Toward the end of the first year of life infants begin to pay more attention to negative expressions, but it remains unclear to what extent emotion expression is processed jointly with gaze direction at this age. This study sought to establish the interactions of gaze direction and emotion expression in visual orienting in 9- to 12-month-olds. In particular, we tested whether these interactions can be explained by the negativity bias hypothesis and the shared signal hypothesis. We measured saccadic latencies in response to peripheral targets in a gaze-cueing paradigm with happy, angry, and fearful female faces. In the Pilot Experiment three gaze directions were used (direct, congruent with target location, incongruent with target location). In the Main Experiment we sought to replicate the results of the Pilot Experiment using a simpler design without the direct gaze condition. In both experiments we found a robust gaze-cueing effect for happy faces, i.e., facilitation of orienting toward the target in the gaze-cued location, compared with the gaze-incongruent location. We found more rapid orienting to targets cued by happy relative to angry and fearful faces. We did not find any gaze-cueing effect for angry or fearful faces. These results are not consistent with the shared signal hypothesis. While our results show differential processing of positive and negative emotions, they do not support a general negativity bias. On the contrary, they indicate that toward the age of 12 months infants show a positivity bias in gaze-cueing tasks. PMID:25713555

  8. Aging and Emotion Recognition: Not Just a Losing Matter

    PubMed Central

    Sze, Jocelyn A.; Goodkind, Madeleine S.; Gyurak, Anett; Levenson, Robert W.

    2013-01-01

    Past studies on emotion recognition and aging have found evidence of age-related decline when emotion recognition was assessed by having participants detect single emotions depicted in static images of full or partial (e.g., eye region) faces. These tests afford good experimental control but do not capture the dynamic nature of real-world emotion recognition, which is often characterized by continuous emotional judgments and dynamic multi-modal stimuli. Research suggests that older adults often perform better under conditions that better mimic real-world social contexts. We assessed emotion recognition in young, middle-aged, and older adults using two traditional methods (single emotion judgments of static images of faces and eyes) and an additional method in which participants made continuous emotion judgments of dynamic, multi-modal stimuli (videotaped interactions between young, middle-aged, and older couples). Results revealed an age by test interaction. Largely consistent with prior research, we found some evidence that older adults performed worse than young adults when judging single emotions from images of faces (for sad and disgust faces only) and eyes (for older eyes only), with middle-aged adults falling in between. In contrast, older adults did better than young adults on the test involving continuous emotion judgments of dyadic interactions, with middle-aged adults falling in between. In tests in which target stimuli differed in age, emotion recognition was not facilitated by an age match between participant and target. These findings are discussed in terms of theoretical and methodological implications for the study of aging and emotional processing. PMID:22823183

  9. Children can discriminate the authenticity of happy but not sad or fearful facial expressions, and use an immature intensity-only strategy.

    PubMed

    Dawel, Amy; Palermo, Romina; O'Kearney, Richard; McKone, Elinor

    2015-01-01

    Much is known about development of the ability to label facial expressions of emotion (e.g., as happy or sad), but rather less is known about the emergence of more complex emotional face processing skills. The present study investigates one such advanced skill: the ability to tell if someone is genuinely feeling an emotion or just pretending (i.e., authenticity discrimination). Previous studies have shown that children can discriminate authenticity of happy faces, using expression intensity as an important cue, but have not tested the negative emotions of sadness or fear. Here, children aged 8-12 years (n = 85) and adults (n = 57) viewed pairs of faces in which one face showed a genuinely-felt emotional expression (happy, sad, or scared) and the other face showed a pretend version. For happy faces, children discriminated authenticity above chance, although they performed more poorly than adults. For sad faces, for which our pretend and genuine images were equal in intensity, adults could discriminate authenticity, but children could not. Neither age group could discriminate authenticity of the fear faces. Results also showed that children judged authenticity based on intensity information alone for all three expressions tested, while adults used a combination of intensity and other factor/s. In addition, novel results show that individual differences in empathy (both cognitive and affective) correlated with authenticity discrimination for happy faces in adults, but not children. Overall, our results indicate late maturity of skills needed to accurately determine the authenticity of emotions from facial information alone, and raise questions about how this might affect social interactions in late childhood and the teenage years.

  11. Electrophysiological Correlates of Subliminal Perception of Facial Expressions in Individuals with Autistic Traits: A Backward Masking Study

    PubMed Central

    Vukusic, Svjetlana; Ciorciari, Joseph; Crewther, David P.

    2017-01-01

    People with Autism spectrum disorder (ASD) show difficulty in social communication, especially in the rapid assessment of emotion in faces. This study examined the processing of emotional faces in typically developing adults with high and low levels of autistic traits (measured using the Autism Spectrum Quotient—AQ). Event-related potentials (ERPs) were recorded during viewing of backward-masked neutral, fearful and happy faces presented under two conditions: subliminal (16 ms, below the level of visual conscious awareness) and supraliminal (166 ms, above the time required for visual conscious awareness). Individuals with low and high AQ differed in the processing of subliminal faces, with the low AQ group showing an enhanced N2 amplitude for subliminal happy faces. Some group differences were found in the condition effects, with the Low AQ showing shorter frontal P3b and N4 latencies for subliminal vs. supraliminal condition. Although results did not show any group differences on the face-specific N170 component, there were shorter N170 latencies for supraliminal vs. subliminal conditions across groups. The results observed on the N2, showing group differences in subliminal emotion processing, suggest that decreased sensitivity to the reward value of social stimuli is a common feature both of people with ASD as well as people with high autistic traits from the normal population. PMID:28588465

  13. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    PubMed

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  14. Impairment in face processing in autism spectrum disorder: a developmental perspective.

    PubMed

    Greimel, Ellen; Schulte-Rüther, Martin; Kamp-Becker, Inge; Remschmidt, Helmut; Herpertz-Dahlmann, Beate; Konrad, Kerstin

    2014-09-01

    Findings on face identity and facial emotion recognition in autism spectrum disorder (ASD) are inconclusive. Moreover, little is known about the developmental trajectory of face processing skills in ASD. Taking a developmental perspective, the aim of this study was to extend previous findings on face processing skills in a sample of adolescents and adults with ASD. N = 38 adolescents and adults (13-49 years) with high-functioning ASD and n = 37 typically developing (TD) control subjects matched for age and IQ participated in the study. Moreover, n = 18 TD children between the ages of 8 and 12 were included to address the question whether face processing skills in ASD follow a delayed developmental pattern. Face processing skills were assessed using computerized tasks of face identity recognition (FR) and identification of facial emotions (IFE). ASD subjects showed impaired performance on several parameters of the FR and IFE task compared to TD control adolescents and adults. Whereas TD adolescents and adults outperformed TD children in both tasks, performance in ASD adolescents and adults was similar to the group of TD children. Within the groups of ASD and control adolescents and adults, no age-related changes in performance were found. Our findings corroborate and extend previous studies showing that ASD is characterised by broad impairments in the ability to process faces. These impairments seem to reflect a developmentally delayed pattern that remains stable throughout adolescence and adulthood.

  15. From neural signatures of emotional modulation to social cognition: individual differences in healthy volunteers and psychiatric participants.

    PubMed

    Ibáñez, Agustín; Aguado, Jaume; Baez, Sandra; Huepe, David; Lopez, Vladimir; Ortega, Rodrigo; Sigman, Mariano; Mikulan, Ezequiel; Lischinsky, Alicia; Torrente, Fernando; Cetkovich, Marcelo; Torralva, Teresa; Bekinschtein, Tristan; Manes, Facundo

    2014-07-01

    It is commonly assumed that early emotional signals provide relevant information for social cognition tasks. The goal of this study was to test the association between (a) cortical markers of face emotional processing and (b) social-cognitive measures, and also to build a model which can predict this association (a and b) in healthy volunteers as well as in different groups of psychiatric patients. Thus, we investigated the early cortical processing of emotional stimuli (N170, using a face and word valence task) and their relationship with the social-cognitive profiles (SCPs, indexed by measures of theory of mind, fluid intelligence, processing speed and executive functions). Group comparisons and individual differences were assessed among schizophrenia (SCZ) patients and their relatives, individuals with attention deficit hyperactivity disorder (ADHD), individuals with euthymic bipolar disorder (BD) and healthy participants (educational level, handedness, age and gender matched). Our results provide evidence of emotional N170 impairments in the affected groups (SCZ and relatives, ADHD and BD) as well as subtle group differences. Importantly, cortical processing of emotional stimuli predicted the SCP, as evidenced by a structural equation model analysis. This is the first study to report an association model of brain markers of emotional processing and SCP. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  16. Modulation of central serotonin affects emotional information processing in impulsive aggressive personality disorder.

    PubMed

    Lee, Royce J; Gill, Andrew; Chen, Bing; McCloskey, Michael; Coccaro, Emil F

    2012-06-01

    The mechanistic model whereby serotonin affects impulsive aggression is not completely understood. The purpose of this study was to test the hypothesis that depleting serotonin reserves via tryptophan depletion affects emotional information processing in susceptible individuals. The effect of tryptophan depletion (vs. placebo) on the processing of Ekman emotional faces was compared between male and female adults with impulsive aggressive personality disorder and normal controls. All subjects were free of psychotropic medications, medically healthy, nondepressed, and substance free. Additionally, subjective mood state and vital signs were monitored. For emotion recognition, a significant interaction of Aggression × Drug × Sex (F(1, 31) = 7.687, P = 0.009) was found, with male normal controls but not impulsive aggressive males showing increased recognition of fear. For intensity ratings of emotional faces, a significant Drug × Group × Sex interaction was found (F(1, 31) = 5.924, P = 0.021), with follow-up tests revealing that males with intermittent explosive disorder tended to increase intensity ratings of angry faces after tryptophan depletion. Additionally, tryptophan depletion was associated with increased heart rate in all subjects, and increased intensity of the subjective emotional state of "anger" in impulsive aggressive subjects. Individuals with clinically relevant levels of impulsive aggression may be susceptible to effects of serotonergic depletion on emotional information processing, showing a tendency to exaggerate their impression of the intensity of angry expressions and to report an angry mood state after tryptophan depletion. This may reflect heightened sensitivity to the effects of serotonergic dysregulation, and suggests that what underlies impulsive aggression is either supersensitivity to serotonergic disturbances or susceptibility to fluctuations in central serotonergic availability.

  17. Disruption of Emotion and Conflict Processing in HIV Infection with and without Alcoholism Comorbidity

    PubMed Central

    Schulte, Tilman; Müller-Oehring, Eva M.; Sullivan, Edith V.; Pfefferbaum, Adolf

    2012-01-01

    Alcoholism and HIV-1 infection each affect components of selective attention and cognitive control that may contribute to deficits in emotion processing based on closely interacting fronto-parietal attention and frontal-subcortical emotion systems. Here, we investigated whether patients with alcoholism, HIV-1 infection, or both diseases have greater difficulty than healthy controls in resolving conflict from emotional words with different valences. Accordingly, patients with alcoholism (ALC, n = 20), HIV-1 infection (HIV, n = 20), ALC + HIV comorbidity (n = 22), and controls (CTL, n = 16) performed an emotional Stroop Match-to-Sample task, which assessed the contribution of emotion (happy, angry) to cognitive control (Stroop conflict processing). ALC + HIV showed greater Stroop effects than HIV, ALC, or CTL for negative (ANGRY) but not for positive (HAPPY) words, and also when the cue color did not match the Stroop stimulus color; the comorbid group performed similarly to the others when cue and word colors matched. Furthermore, emotionally salient face cues prolonged color-matching responses in all groups. HIV alone, compared with the other three groups, showed disproportionately slowed color-matching time when trials featured angry faces. The enhanced Stroop effects prominent in ALC + HIV suggest difficulty in exercising attentional top-down control on processes that consume attentional capacity, especially when cognitive effort is required to ignore negative emotions. PMID:21418720
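
The Stroop interference reported above is typically quantified as a reaction-time difference between incongruent and congruent trials, computed per emotion condition. A minimal Python sketch of that comparison; the RT values below are invented for illustration and are not the study's data:

```python
# Illustrative computation of an emotional Stroop effect:
# mean RT on incongruent trials minus mean RT on congruent trials.
import numpy as np

def stroop_effect(rt_incongruent, rt_congruent):
    """Interference in ms; larger values = more conflict cost."""
    return np.mean(rt_incongruent) - np.mean(rt_congruent)

# Hypothetical RTs (ms) mimicking a larger effect for ANGRY than HAPPY words
angry_incong = [720, 740, 760, 735]
angry_cong   = [640, 655, 662, 648]
happy_incong = [660, 671, 655, 668]
happy_cong   = [645, 650, 641, 652]

print(stroop_effect(angry_incong, angry_cong) >
      stroop_effect(happy_incong, happy_cong))  # True
```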

  18. Who Expressed What Emotion? Men Grab Anger, Women Grab Happiness

    PubMed Central

    Neel, Rebecca; Becker, D. Vaughn; Neuberg, Steven L.; Kenrick, Douglas T.

    2011-01-01

    When anger or happiness flashes on a face in the crowd, do we misperceive that emotion as belonging to someone else? Two studies found that misperception of apparent emotional expressions – “illusory conjunctions” – depended on the gender of the target: male faces tended to “grab” anger from neighboring faces, and female faces tended to grab happiness. Importantly, the evidence did not suggest that this effect was due to the general tendency to misperceive male or female faces as angry or happy, but instead indicated a more subtle interaction of expectations and early visual processes. This suggests a novel aspect of affordance-management in human perception, whereby cues to threat, when they appear, are attributed to those with the greatest capability of doing harm, whereas cues to friendship are attributed to those with the greatest likelihood of providing affiliation opportunities. PMID:22368303

  19. Emotion Words: Adding Face Value.

    PubMed

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
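
The sensitivity (d') and response-bias (c) measures mentioned above come from signal detection theory: both are derived from z-transformed hit and false-alarm rates. A minimal Python sketch with invented counts (not the study's data); the log-linear correction used here is one common convention, not necessarily the authors':

```python
# Illustrative signal-detection computation: d' and criterion c.
from scipy.stats import norm

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    """Return (d', c), with a log-linear correction so that
    rates of exactly 0 or 1 do not produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa
    c = -(z_hit + z_fa) / 2.0   # c < 0 = liberal bias (more "yes" answers)
    return d_prime, c

# Invented example: an emotion-word prime raises false alarms,
# yielding lower sensitivity and a more "yes"-prone criterion.
d_ctrl, c_ctrl = dprime_and_c(hits=40, misses=10, false_alarms=5, correct_rejections=45)
d_emo,  c_emo  = dprime_and_c(hits=40, misses=10, false_alarms=15, correct_rejections=35)
print(d_ctrl > d_emo, c_emo < c_ctrl)  # True True
```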

  20. Sex differences in social cognition: The case of face processing.

    PubMed

    Proverbio, Alice Mado

    2017-01-02

    Several studies have demonstrated that women show a greater interest in social information and a more empathic attitude than men. This article reviews studies on sex differences in the brain, with particular reference to how males and females process faces and facial expressions, social interactions, pain of others, infant faces, faces in things (pareidolia phenomenon), opposite-sex faces, humans vs. landscapes, incongruent behavior, motor actions, biological motion, erotic pictures, and emotional information. Sex differences in oxytocin-based attachment response and emotional memory are also mentioned. In addition, we investigated how 400 different human faces were evaluated for arousal and valence dimensions by a group of healthy male and female university students. Stimuli were carefully balanced for sensory and perceptual characteristics, age, facial expression, and sex. As a whole, women judged all human faces as more positive and more arousing than men. Furthermore, they showed a preference for the faces of children and the elderly in the arousal evaluation. Regardless of face aesthetics, age, or facial expression, women rated human faces higher than men. The preference for opposite- vs. same-sex faces strongly interacted with facial age. Overall, both women and men exhibited differences in facial processing that could be interpreted in the light of evolutionary psychobiology. © 2016 Wiley Periodicals, Inc.

  1. Oxytocin attenuates neural reactivity to masked threat cues from the eyes.

    PubMed

    Kanat, Manuela; Heinrichs, Markus; Schwarzwald, Ralf; Domes, Gregor

    2015-01-01

    The neuropeptide oxytocin has recently been shown to modulate covert attention shifts to emotional face cues and to improve discrimination of masked facial emotions. These results suggest that oxytocin modulates facial emotion processing at early perceptual stages prior to full evaluation of the emotional expression. Here, we used functional magnetic resonance imaging to examine whether oxytocin alters neural responses to backwardly masked angry and happy faces while controlling for attention to the eye vs the mouth region. Intranasal oxytocin administration reduced amygdala reactivity to masked emotions when attending to salient facial features, ie, the eyes of angry faces and the mouth of happy faces. In addition, oxytocin decreased neural responses within the fusiform gyrus and brain stem areas, as well as functional coupling between the amygdala and the fusiform gyrus specifically for threat cues from the eyes. Effects of oxytocin on brain activity were not attributable to differences in behavioral performance, as oxytocin had no impact on mere emotion detection. Our results suggest that oxytocin attenuates neural correlates of early arousal by threat signals from the eye region. As reduced threat sensitivity may increase the likelihood of engaging in social interactions, our findings may have important implications for clinical states of social anxiety.

  2. Testosterone reactivity to facial display of emotions in men and women.

    PubMed

    Zilioli, Samuele; Caldbick, Evan; Watson, Neil V

    2014-05-01

    Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process - the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone - has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n=85) and women (n=79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Altered neural processing of emotional faces in remitted Cushing's disease.

    PubMed

    Bas-Hoogendam, Janna Marie; Andela, Cornelie D; van der Werff, Steven J A; Pannekoek, J Nienke; van Steenbergen, Henk; Meijer, Onno C; van Buchem, Mark A; Rombouts, Serge A R B; van der Mast, Roos C; Biermasz, Nienke R; van der Wee, Nic J A; Pereira, Alberto M

    2015-09-01

    Patients with long-term remission of Cushing's disease (CD) demonstrate residual psychological complaints. At present, it is not known how previous exposure to hypercortisolism affects psychological functioning in the long-term. Earlier magnetic resonance imaging (MRI) studies demonstrated abnormalities of brain structure and resting-state connectivity in patients with long-term remission of CD, but no data are available on functional alterations in the brain during the performance of emotional or cognitive tasks in these patients. We performed a cross-sectional functional MRI study, investigating brain activation during emotion processing in patients with long-term remission of CD. Processing of emotional faces versus a non-emotional control condition was examined in 21 patients and 21 matched healthy controls. Analyses focused on activation and connectivity of two a priori determined regions of interest: the amygdala and the medial prefrontal-orbitofrontal cortex (mPFC-OFC). We also assessed psychological functioning, cognitive failure, and clinical disease severity. Patients showed less mPFC activation during processing of emotional faces compared to controls, whereas no differences were found in amygdala activation. An exploratory psychophysiological interaction analysis demonstrated decreased functional coupling between the ventromedial PFC and posterior cingulate cortex (a region structurally connected to the PFC) in CD patients. The present study is the first to show alterations in brain function and task-related functional coupling in patients with long-term remission of CD relative to matched healthy controls. These alterations may, together with abnormalities in brain structure, be related to the persisting psychological morbidity in patients with CD after long-term remission. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    PubMed Central

    Palermo, Romina; O’Connor, Kirsty B.; Davis, Joshua M.; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing “individual differences” – that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach’s alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). PMID:23840821
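
Cronbach's alpha, the reliability index reported above, is computed from item-level scores as a function of the item variances and the variance of the total score. A minimal Python sketch on simulated data; the generating model and all values below are invented, not the test data from the study:

```python
# Minimal sketch of Cronbach's alpha for internal consistency.
import numpy as np

def cronbach_alpha(items):
    """items: (n_participants, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total score
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# Simulated test: each item = shared ability + independent noise
rng = np.random.default_rng(0)
ability = rng.normal(size=200)
scores = ability[:, None] + rng.normal(scale=1.0, size=(200, 12))
print(round(cronbach_alpha(scores), 2))   # high alpha, items share one trait
```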

  6. Parametric modulation of neural activity by emotion in youth with bipolar disorder, youth with severe mood dysregulation, and healthy volunteers.

    PubMed

    Thomas, Laura A; Brotman, Melissa A; Muhrer, Eli J; Rosen, Brooke H; Bones, Brian L; Reynolds, Richard C; Deveney, Christen M; Pine, Daniel S; Leibenluft, Ellen

    2012-12-01

    CONTEXT Youth with bipolar disorder (BD) and those with severe, nonepisodic irritability (severe mood dysregulation [SMD]) exhibit amygdala dysfunction during facial emotion processing. However, studies have not compared such patients with each other and with comparison individuals in neural responsiveness to subtle changes in facial emotion; the ability to process such changes is important for social cognition. To evaluate this, we used a novel, parametrically designed faces paradigm. OBJECTIVE To compare activation in the amygdala and across the brain in BD patients, SMD patients, and healthy volunteers (HVs). DESIGN Case-control study. SETTING Government research institute. PARTICIPANTS Fifty-seven youths (19 BD, 15 SMD, and 23 HVs). MAIN OUTCOME MEASURE Blood oxygenation level-dependent data. Neutral faces were morphed with angry and happy faces in 25% intervals; static facial stimuli appeared for 3000 milliseconds. Participants performed hostility or nonemotional facial feature (ie, nose width) ratings. The slope of blood oxygenation level-dependent activity was calculated across neutral-to-angry and neutral-to-happy facial stimuli. RESULTS In HVs, but not BD or SMD participants, there was a positive association between left amygdala activity and anger on the face. In the neutral-to-happy whole-brain analysis, BD and SMD participants modulated parietal, temporal, and medial-frontal areas differently from each other and from that in HVs; with increasing facial happiness, SMD patients demonstrated increased, and BD patients decreased, activity in the parietal, temporal, and frontal regions. CONCLUSIONS Youth with BD or SMD differ from HVs in modulation of amygdala activity in response to small changes in facial anger displays. In contrast, individuals with BD or SMD show distinct perturbations in regions mediating attention and face processing in association with changes in the emotional intensity of facial happiness displays. These findings demonstrate similarities and differences in the neural correlates of facial emotion processing in BD and SMD, suggesting that these distinct clinical presentations may reflect differing dysfunctions along a mood disorders spectrum.
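
The parametric slope analysis described above amounts to fitting a line to a response measure across morph levels (0, 25, 50, 75, 100% anger) and taking the fitted slope. A minimal Python sketch of that idea; the signal values below are hypothetical, not the study's BOLD data:

```python
# Sketch of a parametric slope across emotion-morph levels:
# regress the response on % anger and keep the linear coefficient.
import numpy as np

morph_levels = np.array([0, 25, 50, 75, 100])    # % anger in the morph
bold = np.array([0.10, 0.18, 0.31, 0.42, 0.55])  # hypothetical amygdala signal

slope, intercept = np.polyfit(morph_levels, bold, 1)
print(slope > 0)  # True: activity increases with facial anger in this example
```

A positive slope indexes activity that scales up with facial anger; group comparisons would then be run on the per-participant slopes.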

  7. Quantifying facial expression recognition across viewing conditions.

    PubMed

    Goren, Deborah; Wilson, Hugh R

    2006-04-01

    Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion- and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other.

  8. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    PubMed

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. To identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Conventional amygdala activity and seeded connectivity measures, graph-based global and local network connectivity measures, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56] years; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error = .006) in relatives compared with controls. The connectivity of the same subnetwork was significantly decreased in patients with schizophrenia (F = 6.29, P = .01). Furthermore, we found that this subnetwork connectivity measure was negatively correlated with trait anxiety scores (P = .04), test-retest reliable (intraclass correlation coefficient = 0.57), specific to emotional face processing (F = 17.97, P < .001), and independent of gray matter volumes of the identified brain areas (F = 1.84, P = .18). Replicating previous results, no significant group differences were found in face-related amygdala activation and amygdala-anterior cingulate cortex connectivity (P corrected for familywise error = .37 and .11, respectively). Our results indicate that altered connectivity in a visual-limbic subnetwork during emotional face processing may be a functional connectomic intermediate phenotype for schizophrenia. The phenotype is reliable, task specific, related to trait anxiety, and associated with manifest illness. These data encourage the further investigation of this phenotype in clinical and pharmacologic studies.
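
A greatly simplified sketch of the subnetwork idea above: given a region-by-region functional connectivity (correlation) matrix, a candidate subnetwork (e.g., limbic plus visual nodes) can be summarized by its mean within-subnetwork edge strength. The node set and matrix below are invented, and this is an illustration of the general approach, not the study's graph-based statistics:

```python
# Summarize a subnetwork by its mean off-diagonal connectivity.
import numpy as np

def subnetwork_strength(conn, nodes):
    """Mean connectivity among the given node indices (diagonal excluded)."""
    sub = np.asarray(conn)[np.ix_(nodes, nodes)]
    k = len(nodes)
    return (sub.sum() - np.trace(sub)) / (k * (k - 1))

# Invented symmetric connectivity matrix for 8 hypothetical regions
rng = np.random.default_rng(1)
n = 8
conn = rng.uniform(0.0, 1.0, size=(n, n))
conn = (conn + conn.T) / 2
np.fill_diagonal(conn, 1.0)

limbic_visual = [0, 1, 2, 3]   # hypothetical subnetwork node indices
print(0.0 <= subnetwork_strength(conn, limbic_visual) <= 1.0)  # True
```

Group differences would then be tested on this per-participant summary (the actual study used cluster-level network-based statistics).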

  9. Exploring the nature of facial affect processing deficits in schizophrenia.

    PubMed

    van 't Wout, Mascha; Aleman, André; Kessels, Roy P C; Cahn, Wiepke; de Haan, Edward H F; Kahn, René S

    2007-04-15

    Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as well as in controlled evaluation of facial affect. Thirty-seven patients with schizophrenia were compared with 41 control subjects on incidental facial affect processing (gender decision of faces with a fearful, angry, happy, disgusted, and neutral expression) and degraded facial affect labeling (labeling of fearful, angry, happy, and neutral faces). The groups were matched on estimates of verbal and performance intelligence (National Adult Reading Test; Raven's Matrices), general face recognition ability (Benton Face Recognition), and other demographic variables. The results showed that patients with schizophrenia as well as control subjects demonstrate the normal threat-related interference during incidental facial affect processing. Conversely, on controlled evaluation patients were specifically worse in the labeling of fearful faces. In particular, patients with high levels of negative symptoms may be characterized by deficits in labeling fear. We suggest that patients with schizophrenia show no evidence of deficits in the automatic allocation of attention resources to fearful (threat-indicating) faces, but have a deficit in the controlled processing of facial emotions that may be specific for fearful faces.

  10. Neural correlates of cross-modal affective priming by music in Williams syndrome.

    PubMed

    Lense, Miriam D; Gordon, Reyna L; Key, Alexandra P F; Dykens, Elisabeth M

    2014-04-01

    Emotional connection is the main reason people engage with music, and the emotional features of music can influence processing in other domains. Williams syndrome (WS) is a neurodevelopmental genetic disorder where musicality and sociability are prominent aspects of the phenotype. This study examined oscillatory brain activity during a musical affective priming paradigm. Participants with WS and age-matched typically developing controls heard brief emotional musical excerpts or emotionally neutral sounds and then reported the emotional valence (happy/sad) of subsequently presented faces. Participants with WS demonstrated greater evoked fronto-central alpha activity to the happy vs sad musical excerpts. The size of these alpha effects correlated with parent-reported emotional reactivity to music. Although participant groups did not differ in accuracy of identifying facial emotions, reaction time data revealed a music priming effect only in persons with WS, who responded faster when the face matched the emotional valence of the preceding musical excerpt vs when the valence differed. Matching emotional valence was also associated with greater evoked gamma activity thought to reflect cross-modal integration. This effect was not present in controls. The results suggest a specific connection between music and socioemotional processing and have implications for clinical and educational approaches for WS.

  11. Post-Decision Wagering Affects Metacognitive Awareness of Emotional Stimuli: An Event Related Potential Study

    PubMed Central

    Wierzchoń, Michał; Wronka, Eligiusz; Paulewicz, Borysław; Szczepanowski, Remigiusz

    2016-01-01

The present research investigated metacognitive awareness of emotional stimuli and its psychophysiological correlates. We used a backward masking task presenting participants with fearful or neutral faces. We asked participants to discriminate the faces and then probed their metacognitive awareness with confidence rating (CR) and post-decision wagering (PDW) scales. We also analysed psychophysiological correlates of awareness with event-related potential (ERP) components: P1, N170, early posterior negativity (EPN), and P3. We did not observe any differences between the PDW and CR conditions in the emotion identification task. However, "aware" ratings were associated with higher accuracy. This effect was more pronounced in PDW, especially for fearful faces, suggesting that awareness of emotional stimuli may be enhanced by monetary incentives. EEG analysis showed larger N170, EPN, and P3 amplitudes in aware compared with unaware trials. Both the EPN and P3 components were also more pronounced in the PDW condition, especially when emotional faces were presented. Taken together, our ERP findings suggest that metacognitive awareness of emotional stimuli depends on the effectiveness of both early and late visual information processing. Our study also indicates that awareness of emotional stimuli can be enhanced by the motivation induced by wagering. PMID:27490816
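ERP component amplitudes such as the N170, EPN, and P3 discussed in this record are typically quantified as the mean voltage within a latency window of epoched EEG. The sketch below is a generic illustration, not the study's analysis pipeline; the sampling rate, epoch window, channel count, and synthetic data are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative epoched EEG: (trials, channels, samples), 500 Hz sampling,
# epochs from -100 ms to +500 ms relative to face onset. Data are synthetic.
sfreq = 500
times = np.arange(-0.1, 0.5, 1 / sfreq)            # seconds
epochs = rng.standard_normal((40, 2, times.size))  # 40 trials, 2 channels

def mean_amplitude(epochs, times, tmin, tmax):
    """Mean voltage in a latency window (e.g., 150-200 ms for the N170),
    averaged over samples in the window and over trials -> one value
    per channel."""
    mask = (times >= tmin) & (times <= tmax)
    return epochs[:, :, mask].mean(axis=(0, 2))

# One amplitude estimate per channel for an assumed N170 window.
n170 = mean_amplitude(epochs, times, 0.150, 0.200)
```

Condition effects like "larger N170 in aware than unaware trials" then reduce to comparing such window means between trial subsets.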

  12. Post-Decision Wagering Affects Metacognitive Awareness of Emotional Stimuli: An Event Related Potential Study.

    PubMed

    Wierzchoń, Michał; Wronka, Eligiusz; Paulewicz, Borysław; Szczepanowski, Remigiusz

    2016-01-01

The present research investigated metacognitive awareness of emotional stimuli and its psychophysiological correlates. We used a backward masking task presenting participants with fearful or neutral faces. We asked participants to discriminate the faces and then probed their metacognitive awareness with confidence rating (CR) and post-decision wagering (PDW) scales. We also analysed psychophysiological correlates of awareness with event-related potential (ERP) components: P1, N170, early posterior negativity (EPN), and P3. We did not observe any differences between the PDW and CR conditions in the emotion identification task. However, "aware" ratings were associated with higher accuracy. This effect was more pronounced in PDW, especially for fearful faces, suggesting that awareness of emotional stimuli may be enhanced by monetary incentives. EEG analysis showed larger N170, EPN, and P3 amplitudes in aware compared with unaware trials. Both the EPN and P3 components were also more pronounced in the PDW condition, especially when emotional faces were presented. Taken together, our ERP findings suggest that metacognitive awareness of emotional stimuli depends on the effectiveness of both early and late visual information processing. Our study also indicates that awareness of emotional stimuli can be enhanced by the motivation induced by wagering.

  13. Individual differences in emotion processing: how similar are diffusion model parameters across tasks?

    PubMed

    Mueller, Christina J; White, Corey N; Kuchinke, Lars

    2017-11-27

The goal of this study was to replicate findings of diffusion model parameters capturing emotion effects in a lexical decision task and to investigate whether these findings extend to other tasks of implicit emotion processing. Additionally, we were interested in the stability of diffusion model parameters across emotional stimuli and tasks for individual subjects. Responses to words in a lexical decision task were compared with responses to faces in a gender categorization task for stimuli of three emotion categories: happy, neutral, and fearful. Main effects of emotion, as well as the stability of emerging response-style patterns evident in diffusion model parameters across these tasks, were analyzed. Based on earlier findings, drift rates were assumed to be more similar in response to stimuli of the same emotion category than to stimuli of a different emotion category. Results showed that the emotion effects differed between tasks, with a processing advantage for happy followed by neutral and fear-related words in the lexical decision task, and a processing advantage for neutral followed by happy and fearful faces in the gender categorization task. Both emotion effects were captured in the estimated drift rate parameters and, in the case of the lexical decision task, also in the non-decision time parameters. A principal component analysis showed that, contrary to our hypothesis, drift rates were more similar within a specific task context than within a specific emotion category. Individual response patterns across tasks were evident in significant correlations among diffusion model parameters, including response styles, non-decision times, and information accumulation.
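The diffusion model referred to in this record can be illustrated with a minimal simulation. The sketch below is not the authors' fitting procedure, and all parameter values are invented for illustration; it only shows how a higher drift rate (e.g., a processing advantage for one emotion category) yields faster and more accurate simulated responses.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ddm(drift, boundary=1.0, ndt=0.3, dt=0.001, noise=1.0,
                 n_trials=500):
    """Simulate a simple two-boundary drift-diffusion process.

    Evidence starts at 0 and accumulates until it crosses +boundary
    (correct) or -boundary (error). Returns responses (1 = upper
    boundary, 0 = lower) and response times (decision + non-decision).
    Parameter values here are illustrative, not estimates from data.
    """
    responses = np.empty(n_trials)
    rts = np.empty(n_trials)
    for i in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        responses[i] = 1.0 if x >= boundary else 0.0
        rts[i] = t + ndt  # add non-decision time (encoding + motor)
    return responses, rts

# A higher drift rate produces faster, more accurate responses.
resp_hi, rt_hi = simulate_ddm(drift=2.0)
resp_lo, rt_lo = simulate_ddm(drift=0.8)
```

In practice, the parameters are estimated from empirical response-time distributions with dedicated software (e.g., fast-dm or HDDM) rather than chosen by hand.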

  14. Altered saccadic targets when processing facial expressions under different attentional and stimulus conditions.

    PubMed

    Boutsen, Frank A; Dvorak, Justin D; Pulusu, Vinay K; Ross, Elliott D

    2017-04-01

    Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on upper/lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased to the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence other perceptual factors must be in play, such as the phenomenon of virtual scanning. Published by Elsevier Ltd.

  15. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure

    PubMed Central

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure remains controversial. The aim of the present study was to investigate attentional bias in facial recognition after short-term exposure to violent video games and to characterize the neural correlates of this effect. To accomplish this, participants were exposed to either neutral or violent video games for 25 min, and event-related potentials (ERPs) were then recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face in a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face in a crowd of emotional faces. We observed a significant N2pc ERP component during the facilitation task; however, no differences were found between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, no N2pc response was observed when participants viewed happy faces following violent video game exposure, whereas a weak N2pc response was observed after neutral video game exposure. These results provide only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces. PMID:28249033

  16. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure.

    PubMed

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure remains controversial. The aim of the present study was to investigate attentional bias in facial recognition after short-term exposure to violent video games and to characterize the neural correlates of this effect. To accomplish this, participants were exposed to either neutral or violent video games for 25 min, and event-related potentials (ERPs) were then recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face in a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face in a crowd of emotional faces. We observed a significant N2pc ERP component during the facilitation task; however, no differences were found between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, no N2pc response was observed when participants viewed happy faces following violent video game exposure, whereas a weak N2pc response was observed after neutral video game exposure. These results provide only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces.

  17. Unconscious Processing of Facial Emotional Valence Relation: Behavioral Evidence of Integration between Subliminally Perceived Stimuli.

    PubMed

    Liu, Chengzhen; Sun, Zhiyi; Jou, Jerwen; Cui, Qian; Zhao, Guang; Qiu, Jiang; Tu, Shen

    2016-01-01

Although a few studies have investigated the integration between some types of unconscious stimuli, no research has yet explored the integration between unconscious emotional stimuli. This study was designed to provide behavioral evidence for the integration between unconsciously perceived emotional faces (same or different valence relation) using a modified priming paradigm. In two experiments, participants were asked to decide whether two faces in the target, which followed two subliminally presented faces with the same or different emotional expressions, were of the same or different emotional valence. The interstimulus interval (ISI) between the prime and the target was manipulated (0, 53, 163 ms). In Experiment 1, prime visibility was assessed post-experiment. In Experiment 2, it was assessed on each trial. Interestingly, in both experiments, the unconsciously processed valence relation of the two faces in the prime generated a negative priming effect on responses to the supraliminally presented target, independent of the length of the ISI. Further analyses suggested that the negative priming was probably caused by an incongruent motor-response relation between the subliminally perceived prime and the supraliminally perceived target. An incongruent visual-feature relation between the prime and target was not found to play a role in the negative priming. Because the negative priming was found at short ISIs, both an attentional mechanism and a motor inhibition mechanism were proposed to account for the negative priming effect. Overall, this study indicated that the subliminal valence relation was processed, and that integration between different unconsciously perceived stimuli can occur.

  18. Unconscious Processing of Facial Emotional Valence Relation: Behavioral Evidence of Integration between Subliminally Perceived Stimuli

    PubMed Central

    Liu, Chengzhen; Sun, Zhiyi; Jou, Jerwen; Cui, Qian; Zhao, Guang; Qiu, Jiang; Tu, Shen

    2016-01-01

Although a few studies have investigated the integration between some types of unconscious stimuli, no research has yet explored the integration between unconscious emotional stimuli. This study was designed to provide behavioral evidence for the integration between unconsciously perceived emotional faces (same or different valence relation) using a modified priming paradigm. In two experiments, participants were asked to decide whether two faces in the target, which followed two subliminally presented faces with the same or different emotional expressions, were of the same or different emotional valence. The interstimulus interval (ISI) between the prime and the target was manipulated (0, 53, 163 ms). In Experiment 1, prime visibility was assessed post-experiment. In Experiment 2, it was assessed on each trial. Interestingly, in both experiments, the unconsciously processed valence relation of the two faces in the prime generated a negative priming effect on responses to the supraliminally presented target, independent of the length of the ISI. Further analyses suggested that the negative priming was probably caused by an incongruent motor-response relation between the subliminally perceived prime and the supraliminally perceived target. An incongruent visual-feature relation between the prime and target was not found to play a role in the negative priming. Because the negative priming was found at short ISIs, both an attentional mechanism and a motor inhibition mechanism were proposed to account for the negative priming effect. Overall, this study indicated that the subliminal valence relation was processed, and that integration between different unconsciously perceived stimuli can occur. PMID:27622600

  19. An fMRI study of facial emotion processing in patients with schizophrenia.

    PubMed

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  20. Reduced Processing of Facial and Postural Cues in Social Anxiety: Insights from Electrophysiology

    PubMed Central

    Rossignol, Mandy; Fisch, Sophie-Alexandra; Maurage, Pierre; Joassin, Frédéric; Philippot, Pierre

    2013-01-01

Social anxiety is characterized by a fear of evaluative interpersonal situations. Many studies have investigated the perception of emotional faces in socially anxious individuals and have reported biases in the processing of threatening faces. However, faces are not the only stimuli carrying an interpersonal evaluative load. The present study investigated the processing of emotional body postures in social anxiety. Participants with high and low social anxiety completed an attention-shifting paradigm using neutral, angry, and happy faces and postures as cues. We investigated early visual processes through the P100 component, attentional fixation through the P200, structural encoding mirrored by the N170, and attentional orienting toward to-be-detected stimuli through the P100 time-locked to target occurrence. Results showed a global reduction of P100 and P200 responses to faces and postures in socially anxious participants compared with non-anxious participants, with a direct correlation between self-reported social anxiety levels and P100 and P200 amplitudes. Structural encoding of cues and target processing were not modulated by social anxiety, but socially anxious participants were slower to detect the targets. These results suggest reduced processing of social postural and facial cues in social anxiety. PMID:24040403

  1. Detecting and Categorizing Fleeting Emotions in Faces

    PubMed Central

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

Expressions of emotion are often brief, providing only fleeting images on which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorization. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885
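The d′ analysis mentioned in this record is a standard signal-detection computation: the difference between the z-transformed hit and false-alarm rates. The sketch below is a generic illustration with hypothetical rates and trial counts, not the authors' exact procedure.

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate, n_trials):
    """Sensitivity index d' = z(H) - z(F), using a standard log-linear
    correction so that rates of exactly 0 or 1 do not produce infinite
    z-scores."""
    h = (hit_rate * n_trials + 0.5) / (n_trials + 1)
    f = (false_alarm_rate * n_trials + 0.5) / (n_trials + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(h) - z(f)

# Hypothetical example: responding "angry" to 80% of angry faces but
# only 20% of happy faces gives good discriminability; 55% vs 45%
# gives poor discriminability (d' near 0).
good = d_prime(0.80, 0.20, n_trials=100)
poor = d_prime(0.55, 0.45, n_trials=100)
```

Comparing d′ across expression pairs in this way separates true discriminability from response bias, which is why it suits the categorization comparisons described above.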

  2. Multimodal emotion perception after anterior temporal lobectomy (ATL)

    PubMed Central

    Milesi, Valérie; Cekic, Sezen; Péron, Julie; Frühholz, Sascha; Cristinzio, Chiara; Seeck, Margitta; Grandjean, Didier

    2014-01-01

    In the context of emotion information processing, several studies have demonstrated the involvement of the amygdala in emotion perception, for unimodal and multimodal stimuli. However, it seems that not only the amygdala, but several regions around it, may also play a major role in multimodal emotional integration. In order to investigate the contribution of these regions to multimodal emotion perception, five patients who had undergone unilateral anterior temporal lobe resection were exposed to both unimodal (vocal or visual) and audiovisual emotional and neutral stimuli. In a classic paradigm, participants were asked to rate the emotional intensity of angry, fearful, joyful, and neutral stimuli on visual analog scales. Compared with matched controls, patients exhibited impaired categorization of joyful expressions, whether the stimuli were auditory, visual, or audiovisual. Patients confused joyful faces with neutral faces, and joyful prosody with surprise. In the case of fear, unlike matched controls, patients provided lower intensity ratings for visual stimuli than for vocal and audiovisual ones. Fearful faces were frequently confused with surprised ones. When we controlled for lesion size, we no longer observed any overall difference between patients and controls in their ratings of emotional intensity on the target scales. Lesion size had the greatest effect on intensity perceptions and accuracy in the visual modality, irrespective of the type of emotion. These new findings suggest that a damaged amygdala, or a disrupted bundle between the amygdala and the ventral part of the occipital lobe, has a greater impact on emotion perception in the visual modality than it does in either the vocal or audiovisual one. We can surmise that patients are able to use the auditory information contained in multimodal stimuli to compensate for difficulty processing visually conveyed emotion. PMID:24839437

  3. Affective Prosody Labeling in Youths with Bipolar Disorder or Severe Mood Dysregulation

    ERIC Educational Resources Information Center

    Deveney, Christen M.; Brotman, Melissa A.; Decker, Ann Marie; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Background: Accurate identification of nonverbal emotional cues is essential to successful social interactions, yet most research is limited to emotional face expression labeling. Little research focuses on the processing of emotional prosody, or tone of verbal speech, in clinical populations. Methods: Using the Diagnostic Analysis of Nonverbal…

  4. Altered amygdala-prefrontal response to facial emotion in offspring of parents with bipolar disorder.

    PubMed

    Manelis, Anna; Ladouceur, Cecile D; Graur, Simona; Monk, Kelly; Bonar, Lisa K; Hickey, Mary Beth; Dwojak, Amanda C; Axelson, David; Goldstein, Benjamin I; Goldstein, Tina R; Bebko, Genna; Bertocci, Michele A; Hafeman, Danella M; Gill, Mary Kay; Birmaher, Boris; Phillips, Mary L

    2015-09-01

This study aimed to identify neuroimaging measures associated with risk for, or protection against, bipolar disorder by comparing youth offspring of parents with bipolar disorder versus youth offspring of non-bipolar parents versus offspring of healthy parents in (i) the magnitude of activation within emotional face processing circuitry; and (ii) functional connectivity between this circuitry and frontal emotion regulation regions. The study was conducted at the University of Pittsburgh Medical Centre. Participants included 29 offspring of parents with bipolar disorder (mean age = 13.8 years; 14 females), 29 offspring of non-bipolar parents (mean age = 13.8 years; 12 females) and 23 healthy controls (mean age = 13.7 years; 11 females). Participants were scanned during implicit processing of emerging happy, sad, fearful and angry faces and shapes. The activation analyses revealed greater right amygdala activation to emotional faces versus shapes in offspring of parents with bipolar disorder and offspring of non-bipolar parents than healthy controls. Given that abnormally increased amygdala activation during emotion processing characterized offspring of both patient groups, and that abnormally increased amygdala activation has often been reported in individuals with already developed bipolar disorder and those with major depressive disorder, these neuroimaging findings may represent markers of increased risk for affective disorders in general. The analysis of psychophysiological interaction revealed that offspring of parents with bipolar disorder showed significantly more negative right amygdala-anterior cingulate cortex functional connectivity to emotional faces versus shapes, but significantly more positive right amygdala-left ventrolateral prefrontal cortex functional connectivity to happy faces (all P-values corrected for multiple tests) than offspring of non-bipolar parents and healthy controls. Taken together with findings of increased amygdala-ventrolateral prefrontal cortex functional connectivity, and decreased amygdala-anterior cingulate cortex functional connectivity previously shown in individuals with bipolar disorder, these connectivity patterns in offspring of parents with bipolar disorder may be risk markers for, rather than markers conferring protection against, bipolar disorder in youth. The patterns of activation and functional connectivity remained unchanged after removing medicated participants and those with current psychopathology from analyses. This is the first study to demonstrate that abnormal functional connectivity patterns within face emotion processing circuitry distinguish offspring of parents with bipolar disorder from those of non-bipolar parents and healthy controls. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Independent effects of reward expectation and spatial orientation on the processing of emotional facial expressions.

    PubMed

    Kang, Guanlan; Zhou, Xiaolin; Wei, Ping

    2015-09-01

The present study investigated the effects of reward expectation and spatial orientation on the processing of emotional facial expressions, using a spatial cue-target paradigm. A colored cue was presented at the left or right side of the central fixation point, with its color indicating the monetary reward stakes of a given trial (incentive vs. non-incentive), followed by the presentation of an emotional facial target (angry vs. neutral) at a cued or un-cued location. Participants were asked to discriminate the emotional expression of the target, with the cue-target stimulus onset asynchrony being 200-300 ms in Experiment 1 and 950-1250 ms in Experiment 2a (without a fixation cue) and Experiment 2b (with a fixation cue), producing a spatial facilitation effect and an inhibition of return effect, respectively. The results of all the experiments revealed faster reaction times in the monetary incentive condition than in the non-incentive condition, demonstrating the facilitating effect of reward on task performance. An interaction between reward expectation and the emotion of the target was evident in all three experiments, with larger reward effects for angry faces than for neutral faces. This interaction was not affected by spatial orientation. These findings demonstrate that incentive motivation improves task performance and increases sensitivity to angry faces, irrespective of spatial orienting and reorienting processes.

  6. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    ERIC Educational Resources Information Center

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  7. LSD Acutely Impairs Fear Recognition and Enhances Emotional Empathy and Sociality

    PubMed Central

    Dolder, Patrick C; Schmid, Yasmin; Müller, Felix; Borgwardt, Stefan; Liechti, Matthias E

    2016-01-01

Lysergic acid diethylamide (LSD) is used recreationally and has been evaluated as an adjunct to psychotherapy to treat anxiety in patients with life-threatening illness. LSD is well-known to induce perceptual alterations, but it is unknown whether LSD alters emotional processing in ways that can support psychotherapy. We investigated the acute effects of LSD on emotional processing using the Face Emotion Recognition Task (FERT) and Multifaceted Empathy Test (MET). The effects of LSD on social behavior were tested using the Social Value Orientation (SVO) test. Two similar placebo-controlled, double-blind, random-order, crossover studies were conducted using 100 μg LSD in 24 subjects and 200 μg LSD in 16 subjects. All of the subjects were healthy and mostly hallucinogen-naive 25- to 65-year-old volunteers (20 men, 20 women). LSD produced feelings of happiness, trust, and closeness to others, enhanced explicit and implicit emotional empathy on the MET, and impaired the recognition of sad and fearful faces on the FERT. LSD enhanced the participants' desire to be with other people and increased their prosocial behavior on the SVO test. These effects of LSD on emotion processing and sociality may be useful for LSD-assisted psychotherapy. PMID:27249781

  8. LSD Acutely Impairs Fear Recognition and Enhances Emotional Empathy and Sociality.

    PubMed

    Dolder, Patrick C; Schmid, Yasmin; Müller, Felix; Borgwardt, Stefan; Liechti, Matthias E

    2016-10-01

    Lysergic acid diethylamide (LSD) is used recreationally and has been evaluated as an adjunct to psychotherapy to treat anxiety in patients with life-threatening illness. LSD is well-known to induce perceptual alterations, but it is unknown whether LSD alters emotional processing in ways that can support psychotherapy. We investigated the acute effects of LSD on emotional processing using the Face Emotion Recognition Task (FERT) and Multifaceted Empathy Test (MET). The effects of LSD on social behavior were tested using the Social Value Orientation (SVO) test. Two similar placebo-controlled, double-blind, random-order, crossover studies were conducted using 100 μg LSD in 24 subjects and 200 μg LSD in 16 subjects. All of the subjects were healthy and mostly hallucinogen-naive 25- to 65-year-old volunteers (20 men, 20 women). LSD produced feelings of happiness, trust, and closeness to others, enhanced explicit and implicit emotional empathy on the MET, and impaired the recognition of sad and fearful faces on the FERT. LSD enhanced the participants' desire to be with other people and increased their prosocial behavior on the SVO test. These effects of LSD on emotion processing and sociality may be useful for LSD-assisted psychotherapy.

  9. Facing changes and changing faces in adolescence: a new model for investigating adolescent-specific interactions between pubertal, brain and behavioral development.

    PubMed

    Scherf, K Suzanne; Behrmann, Marlene; Dahl, Ronald E

    2012-04-01

    Adolescence is a time of dramatic physical, cognitive, emotional, and social changes as well as a time for the development of many social-emotional problems. These characteristics raise compelling questions about accompanying neural changes that are unique to this period of development. Here, we propose that studying adolescent-specific changes in face processing and its underlying neural circuitry provides an ideal model for addressing these questions. We also use this model to formulate new hypotheses. Specifically, pubertal hormones are likely to increase motivation to master new peer-oriented developmental tasks, which will, in turn, instigate the emergence of new social/affective components of face processing. We also predict that pubertal hormones have a fundamental impact on the re-organization of neural circuitry supporting face processing and propose, in particular, that the functional connectivity, or temporal synchrony, between regions of the face-processing network will change with the emergence of these new components of face processing in adolescence. Finally, we show how this approach will help reveal why adolescence may be a period of vulnerability in brain development and suggest how it could lead to prevention and intervention strategies that facilitate more adaptive functional interactions between regions within the broader social information processing network. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. How do typically developing deaf children and deaf children with autism spectrum disorder use the face when comprehending emotional facial expressions in British sign language?

    PubMed

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-10-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113-118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.'s (Sign Lang Stud 75:113-118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.

  11. Impaired perception of facial emotion in developmental prosopagnosia.

    PubMed

    Biotti, Federica; Cook, Richard

    2016-08-01

    Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined observers' ability to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect. Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties classifying expressions, particularly those with apperceptive profiles. These individuals may have difficulties forming view-invariant structural descriptions at an early stage in the face processing stream, before identity and expression pathways diverge. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Dissociation in Rating Negative Facial Emotions between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    PubMed

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2016-11-01

    Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may contribute to the reason why early stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve early diagnosis of bvFTD. Although emotion processing is affected in bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions overall (i.e., bvFTD patients underrated negative emotions overall, whereas MDD patients overrated negative emotions overall compared with healthy participants). In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.

  13. Fear across the senses: brain responses to music, vocalizations and facial expressions

    PubMed Central

    Aubé, William; Angulo-Perkins, Arafat; Peretz, Isabelle; Concha, Luis; Armony, Jorge L.

    2015-01-01

    Intrinsic emotional expressions such as those communicated by faces and vocalizations have been shown to engage specific brain regions, such as the amygdala. Although music constitutes another powerful means to express emotions, the neural substrates involved in its processing remain poorly understood. In particular, it is unknown whether brain regions typically associated with processing ‘biologically relevant’ emotional expressions are also recruited by emotional music. To address this question, we conducted an event-related functional magnetic resonance imaging study in 47 healthy volunteers in which we directly compared responses to basic emotions (fear, sadness and happiness, as well as neutral) expressed through faces, non-linguistic vocalizations and short novel musical excerpts. Our results confirmed the importance of fear in emotional communication, as revealed by significant blood oxygen level-dependent signal increases in a cluster within the posterior amygdala and anterior hippocampus, as well as in the posterior insula across all three domains. Moreover, subject-specific amygdala responses to fearful music and vocalizations were correlated, consistent with the proposal that the brain circuitry involved in the processing of musical emotions might be shared with the one that has evolved for vocalizations. Overall, our results show that processing of fear expressed through music engages some of the same brain areas known to be crucial for detecting and evaluating threat-related information. PMID:24795437

  14. Fear across the senses: brain responses to music, vocalizations and facial expressions.

    PubMed

    Aubé, William; Angulo-Perkins, Arafat; Peretz, Isabelle; Concha, Luis; Armony, Jorge L

    2015-03-01

    Intrinsic emotional expressions such as those communicated by faces and vocalizations have been shown to engage specific brain regions, such as the amygdala. Although music constitutes another powerful means to express emotions, the neural substrates involved in its processing remain poorly understood. In particular, it is unknown whether brain regions typically associated with processing 'biologically relevant' emotional expressions are also recruited by emotional music. To address this question, we conducted an event-related functional magnetic resonance imaging study in 47 healthy volunteers in which we directly compared responses to basic emotions (fear, sadness and happiness, as well as neutral) expressed through faces, non-linguistic vocalizations and short novel musical excerpts. Our results confirmed the importance of fear in emotional communication, as revealed by significant blood oxygen level-dependent signal increases in a cluster within the posterior amygdala and anterior hippocampus, as well as in the posterior insula across all three domains. Moreover, subject-specific amygdala responses to fearful music and vocalizations were correlated, consistent with the proposal that the brain circuitry involved in the processing of musical emotions might be shared with the one that has evolved for vocalizations. Overall, our results show that processing of fear expressed through music engages some of the same brain areas known to be crucial for detecting and evaluating threat-related information. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  15. Family environment influences emotion recognition following paediatric traumatic brain injury.

    PubMed

    Schmidt, Adam T; Orsten, Kimberley D; Hanten, Gerri R; Li, Xiaoqi; Levin, Harvey S

    2010-01-01

    This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). A total of 142 children (75 TBI, 67 OI) were assessed on three occasions: baseline, 3 months and 1 year post-injury on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results indicated that family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Findings suggest that family functioning variables, especially financial resources, can influence performance on an emotional processing task following TBI in children.

  16. Sustained neural activity to gaze and emotion perception in dynamic social scenes

    PubMed Central

    Ulloa, José Luis; Puce, Aina; Hugueville, Laurent; George, Nathalie

    2014-01-01

    To understand social interactions, we must decode dynamic social cues from seen faces. Here, we used magnetoencephalography (MEG) to study the neural responses underlying the perception of emotional expressions and gaze direction changes as depicted in an interaction between two agents. Subjects viewed displays of paired faces that first established a social scenario of gazing at each other (mutual attention) or gazing laterally together (deviated group attention) and then dynamically displayed either an angry or happy facial expression. The initial gaze change elicited a significantly larger M170 under the deviated than the mutual attention scenario. At around 400 ms after the dynamic emotion onset, responses at posterior MEG sensors differentiated between emotions, and between 1000 and 2200 ms, left posterior sensors were additionally modulated by social scenario. Moreover, activity on right anterior sensors showed both an early and prolonged interaction between emotion and social scenario. These results suggest that activity in right anterior sensors reflects an early integration of emotion and social attention, while posterior activity first differentiated between emotions only, supporting the view of a dual route for emotion processing. Altogether, our data demonstrate that both transient and sustained neurophysiological responses underlie social processing when observing interactions between others. PMID:23202662

  17. Distant influences of amygdala lesion on visual cortical activation during emotional face processing.

    PubMed

    Vuilleumier, Patrik; Richardson, Mark P; Armony, Jorge L; Driver, Jon; Dolan, Raymond J

    2004-11-01

    Emotional visual stimuli evoke enhanced responses in the visual cortex. To test whether this reflects modulatory influences from the amygdala on sensory processing, we used event-related functional magnetic resonance imaging (fMRI) in human patients with medial temporal lobe sclerosis. Twenty-six patients with lesions in the amygdala, the hippocampus or both, plus 13 matched healthy controls, were shown pictures of fearful or neutral faces in task-relevant or task-irrelevant positions on the display. All subjects showed increased fusiform cortex activation when the faces were in task-relevant positions. Both healthy individuals and those with hippocampal damage showed increased activation in the fusiform and occipital cortex when they were shown fearful faces, but this was not the case for individuals with damage to the amygdala, even though visual areas were structurally intact. The distant influence of the amygdala was also evidenced by the parametric relationship between amygdala damage and the level of emotional activation in the fusiform cortex. Our data show that combining the fMRI and lesion approaches can help reveal the source of functional modulatory influences between distant but interconnected brain regions.

  18. Evidence for the triadic model of adolescent brain development: Cognitive load and task-relevance of emotion differentially affect adolescents and adults.

    PubMed

    Mueller, Sven C; Cromheeke, Sofie; Siugzdaite, Roma; Nicolas Boehler, C

    2017-08-01

    In adults, cognitive control is supported by several brain regions including the limbic system and the dorsolateral prefrontal cortex (dlPFC) when processing emotional information. However, in adolescents, some theories hypothesize a neurobiological imbalance proposing heightened sensitivity to affective material in the amygdala and striatum within a cognitive control context. Yet, direct neurobiological evidence is scarce. Twenty-four adolescents (12-16) and 28 adults (25-35) completed an emotional n-back working memory task in response to happy, angry, and neutral faces during fMRI. Importantly, participants either paid attention to the emotion (task-relevant condition) or judged the gender (task-irrelevant condition). Behaviorally, for both groups, when happy faces were task-relevant, performance improved relative to when they were task-irrelevant, while performance decrements were seen for angry faces. In the dlPFC, angry faces elicited more activation in adults during low relative to high cognitive load (2-back vs. 0-back). By contrast, happy faces elicited more activation in the amygdala in adolescents when they were task-relevant. Happy faces also generally increased nucleus accumbens activity (regardless of relevance) in adolescents relative to adults. Together, the findings are consistent with neurobiological models of adolescent brain development and identify neurodevelopmental differences in cognitive control-emotion interactions. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Processing of subliminal facial expressions of emotion: a behavioral and fMRI study.

    PubMed

    Prochnow, D; Kossack, H; Brunheim, S; Müller, K; Wittsack, H-J; Markowitsch, H-J; Seitz, R J

    2013-01-01

    The recognition of emotional facial expressions is an important means to adjust behavior in social interactions. As facial expressions widely differ in their duration and degree of expressiveness, they often manifest with short and transient expressions below the level of awareness. In this combined behavioral and fMRI study, we aimed to examine whether not consciously accessible (subliminal) emotional facial expressions influence empathic judgments and which brain activations are related to this. We hypothesized that subliminal facial expressions of emotions masked with neutral expressions of the same faces induce empathic processing similar to consciously accessible (supraliminal) facial expressions. Our behavioral data in 23 healthy subjects showed that subliminal emotional facial expressions of 40 ms duration affect the judgments of the subsequent neutral facial expressions. In the fMRI study in 12 healthy subjects it was found that both supra- and subliminal emotional facial expressions shared a widespread network of brain areas including the fusiform gyrus, the temporo-parietal junction, and the inferior, dorsolateral, and medial frontal cortex. Compared with subliminal facial expressions, supraliminal facial expressions led to a greater activation of left occipital and fusiform face areas. We conclude that masked subliminal emotional information is suited to trigger processing in brain areas which have been implicated in empathy and, thereby, in social encounters.

  20. Interactive effects between gaze direction and facial expression on attentional resources deployment: the task instruction and context matter

    PubMed Central

    Ricciardelli, Paola; Lugli, Luisa; Pellicano, Antonello; Iani, Cristina; Nicoletti, Roberto

    2016-01-01

    In three experiments, we tested whether the amount of attentional resources needed to process a face displaying neutral/angry/fearful facial expressions with direct or averted gaze depends on task instructions and face presentation. To this end, we used a Rapid Serial Visual Presentation paradigm in which participants in Experiment 1 were first explicitly asked to discriminate whether the expression of a target face (T1) with direct or averted gaze was angry or neutral, and then to judge the orientation of a landscape (T2). Experiment 2 was identical to Experiment 1 except that participants had to discriminate the gender of the face of T1 and fearful faces were also presented randomly intermixed within each block of trials. Experiment 3 differed from Experiment 2 only because angry and fearful faces were never presented within the same block. The findings indicated that the presence of the attentional blink (AB) for face stimuli depends on specific combinations of gaze direction and emotional facial expressions and crucially revealed that contextual factors (e.g., explicit instruction to process the facial expression and the presence of other emotional faces) can modify and even reverse the AB, suggesting a flexible and more contextualized deployment of attentional resources in face processing. PMID:26898473
