Sample records for emotional faces task

  1. Task relevance of emotional information affects anxiety-linked attention bias in visual search.

    PubMed

    Dodd, Helen F; Vogt, Julia; Turkileri, Nilgun; Notebaert, Lies

    2017-01-01

    Task relevance affects emotional attention in healthy individuals. Here, we investigate whether the association between anxiety and attention bias is affected by the task relevance of emotion during an attention task. Participants completed two visual search tasks. In the emotion-irrelevant task, participants were asked to indicate whether a discrepant face in a crowd of neutral, middle-aged faces was old or young. Irrelevant to the task, target faces displayed angry, happy, or neutral expressions. In the emotion-relevant task, participants were asked to indicate whether a discrepant face in a crowd of middle-aged neutral faces was happy or angry (target faces also varied in age). Trait anxiety was not associated with attention in the emotion-relevant task. However, in the emotion-irrelevant task, trait anxiety was associated with a bias for angry over happy faces. These findings demonstrate that the task relevance of emotional information affects conclusions about the presence of an anxiety-linked attention bias. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    PubMed

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  3. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    PubMed Central

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  4. Flexible and inflexible task sets: asymmetric interference when switching between emotional expression, sex, and age classification of perceived faces.

    PubMed

    Schuch, Stefanie; Werheid, Katja; Koch, Iring

    2012-01-01

    The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set "inertia") than processing the age or sex of a face.

  5. Task relevance regulates the interaction between reward expectation and emotion.

    PubMed

    Wei, Ping; Kang, Guanlan

    2014-06-01

    In the present study, we investigated the impact of reward expectation on the processing of emotional facial expression using a cue-target paradigm. A cue indicating the reward condition of each trial (incentive vs. non-incentive) was followed by the presentation of a picture of an emotional face, the target. Participants were asked to discriminate the emotional expression of the target face in Experiment 1, to discriminate the gender of the target face in Experiment 2, and to judge a number superimposed on the center of the target face as even or odd in Experiment 3, rendering the emotional expression of the target face as task relevant in Experiment 1 but task irrelevant in Experiments 2 and 3. Faster reaction times (RTs) were observed in the monetary incentive condition than in the non-incentive condition, demonstrating the effect of reward on facilitating task concentration. Moreover, the reward effect (i.e., RTs in non-incentive conditions versus incentive conditions) was larger for emotional faces than for neutral faces when emotional expression was task relevant but not when it was task irrelevant. The findings suggest that top-down incentive motivation biased attentional processing toward task-relevant stimuli, and that task relevance played an important role in regulating the influence of reward expectation on the processing of emotional stimuli.

  6. Familiarity and face emotion recognition in patients with schizophrenia.

    PubMed

    Lahera, Guillermo; Herrera, Sara; Fernández, Cristina; Bardón, Marta; de los Ángeles, Victoria; Fernández-Liria, Alberto

    2014-01-01

To assess emotion recognition in familiar and unknown faces in a sample of schizophrenic patients and healthy controls. Face emotion recognition of 18 outpatients diagnosed with schizophrenia (DSM-IV-TR) and 18 healthy volunteers was assessed with two Emotion Recognition Tasks using familiar faces and unknown faces. Each subject was accompanied by 4 familiar people (parents, siblings or friends), who were photographed while expressing the 6 basic Ekman emotions. Face emotion recognition in familiar faces was assessed with this ad hoc instrument. In each case, the patient scored (from 1 to 10) the subjective familiarity and affective valence corresponding to each person. Patients with schizophrenia not only showed a deficit in the recognition of emotions on unknown faces (p=.01), but they also showed an even more pronounced deficit on familiar faces (p=.001). Controls had a similar success rate in the unknown faces task (mean: 18 +/- 2.2) and the familiar faces task (mean: 17.4 +/- 3). However, patients had a significantly lower score in the familiar faces task (mean: 13.2 +/- 3.8) than in the unknown faces task (mean: 16 +/- 2.4; p<.05). In both tests, the highest number of errors was with the emotions of anger and fear. Subjectively, the patient group showed a lower level of familiarity and emotional valence toward their respective relatives (p<.01). The sense of familiarity may be a factor involved in face emotion recognition and it may be disturbed in schizophrenia. © 2013.

  7. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    ERIC Educational Resources Information Center

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  8. Emotion recognition training using composite faces generalises across identities but not all emotions.

    PubMed

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific, and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities, but indicated only partial generalisation across emotions. These findings suggest effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  9. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    PubMed Central

    Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola

    2014-01-01

Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643

  10. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    PubMed

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  11. Task-irrelevant emotion facilitates face discrimination learning.

    PubMed

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. When Emotions Matter: Focusing on Emotion Improves Working Memory Updating in Older Adults

    PubMed Central

    Berger, Natalie; Richards, Anne; Davelaar, Eddy J.

    2017-01-01

    Research indicates that emotion can affect the ability to monitor and replace content in working memory, an executive function that is usually referred to as updating. However, it is less clear if the effects of emotion on updating vary with its relevance for the task and with age. Here, 25 younger (20–34 years of age) and 25 older adults (63–80 years of age) performed a 1-back and a 2-back task, in which they responded to younger, middle-aged, and older faces showing neutral, happy or angry expressions. The relevance of emotion for the task was manipulated through instructions to make match/non-match judgments based on the emotion (i.e., emotion was task-relevant) or the age (i.e., emotion was task-irrelevant) of the face. It was found that only older adults updated emotional faces more readily compared to neutral faces as evidenced by faster RTs on non-match trials. This emotion benefit was observed under low-load conditions (1-back task) but not under high-load conditions (2-back task) and only if emotion was task-relevant. In contrast, task-irrelevant emotion did not impair updating performance in either age group. These findings suggest that older adults can benefit from task-relevant emotional information to a greater extent than younger adults when sufficient cognitive resources are available. They also highlight that emotional processing can buffer age-related decline in WM tasks that require not only maintenance but also manipulation of material. PMID:28966602

  13. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    PubMed

Innes, Bobby R; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  14. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    PubMed

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed with different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as indexed by the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Time course of implicit processing and explicit processing of emotional faces and emotional words.

    PubMed

    Frühholz, Sascha; Jellinghaus, Anne; Herrmann, Manfred

    2011-05-01

    Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Prolonged Interruption of Cognitive Control of Conflict Processing Over Human Faces by Task-Irrelevant Emotion Expression

    PubMed Central

    Kim, Jinyoung; Kang, Min-Suk; Cho, Yang Seok; Lee, Sang-Hun

    2017-01-01

As documented by Darwin 150 years ago, emotion expressed in human faces readily draws our attention and promotes sympathetic emotional reactions. How do such reactions to the expression of emotion affect our goal-directed actions? Despite the substantial advances made in understanding the neural mechanisms of both cognitive control and emotional processing, it is not yet well known how these two systems interact. Here, we studied how emotion expressed in human faces influences cognitive control of conflict processing, spatial selective attention and inhibitory control in particular, using the Eriksen flanker paradigm. In this task, participants viewed displays of a central target face flanked by peripheral faces and were asked to judge the gender of the target face; task-irrelevant emotion expressions were embedded in the target face, the flanking faces, or both. We also monitored how emotion expression affects gender judgment performance while varying the relative timing between the target and flanker faces. As previously reported, we found robust gender congruency effects, namely slower responses to the target faces whose gender was incongruent with that of the flanker faces, when the flankers preceded the target by 0.1 s. When the flankers preceded the target by 0.3 s, however, the congruency effect vanished in most of the viewing conditions, except for when emotion was expressed only in the flanking faces or when congruent emotion was expressed in the target and flanking faces. These results suggest that emotional saliency can prolong a substantial degree of conflict by diverting bottom-up attention away from the target, and that inhibitory control on task-irrelevant information from flanking stimuli is deterred by the emotional congruency between target and flanking stimuli. PMID:28676780

  17. Proactive and reactive control depends on emotional valence: a Stroop study with emotional expressions and words.

    PubMed

    Kar, Bhoomika Rastogi; Srinivasan, Narayanan; Nehabala, Yagyima; Nigam, Richa

    2018-03-01

We examined proactive and reactive control effects in the context of task-relevant happy, sad, and angry facial expressions on a face-word Stroop task. Participants identified the emotion expressed by a face that contained a congruent or incongruent emotional word (happy/sad/angry). Proactive control effects were measured in terms of the reduction in Stroop interference (difference between incongruent and congruent trials) as a function of previous trial emotion and previous trial congruence. Reactive control effects were measured in terms of the reduction in Stroop interference as a function of current trial emotion and previous trial congruence. Negative emotions on the previous trial exerted a greater influence on proactive control than the positive emotion. Sad faces in the previous trial resulted in a greater reduction in Stroop interference for happy faces in the current trial. However, angry faces in the current trial showed stronger adaptation effects than happy faces. Thus, both proactive and reactive control mechanisms are dependent on the emotional valence of task-relevant stimuli.

  18. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    PubMed Central

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2014-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but the second task found that face-sensitive ERPs varied with emotional expressions only in TD. Further, ASD showed enhanced neural responding to non-social stimuli. In TD only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in ASD, a significant positive association was found between autonomic activity and attention to mouths. Overall, ASD showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525

  19. Is a neutral expression also a neutral stimulus? A study with functional magnetic resonance.

    PubMed

    Carvajal, Fernando; Rubio, Sandra; Serrano, Juan M; Ríos-Lago, Marcos; Alvarez-Linera, Juan; Pacheco, Lara; Martín, Pilar

    2013-08-01

Although neutral faces do not initially convey an explicit emotional message, it has been found that individuals tend to assign them an affective content. Moreover, previous research has shown that affective judgments are mediated by the task they have to perform. Using functional magnetic resonance imaging in 21 healthy participants, we focus this study on the cerebral activity patterns triggered by neutral and emotional faces in two different tasks (social or gender judgments). Results obtained using conjunction analyses indicated that viewing both emotional and neutral faces evokes activity in several similar brain areas, indicating a common neural substrate. Moreover, neutral faces specifically elicit activation of cerebellum, frontal and temporal areas, while emotional faces involve the cuneus, anterior cingulate gyrus, medial orbitofrontal cortex, posterior superior temporal gyrus, precentral/postcentral gyrus and insula. The task selected was also found to influence brain activity, in that the social task recruited frontal areas while the gender task involved the posterior cingulate, inferior parietal lobule and middle temporal gyrus to a greater extent. Specifically, in the social task viewing neutral faces was associated with longer reaction times and increased activity of the left dorsolateral frontal cortex compared with viewing facial expressions of emotions. In contrast, in the same task emotional expressions distinctively activated the left amygdala. The results are discussed taking into consideration the fact that, like other facial expressions, neutral expressions are usually assigned some emotional significance. However, neutral faces evoke a greater activation of circuits probably involved in more elaborate cognitive processing.

  20. Effect of distracting faces on visual selective attention in the monkey.

    PubMed

    Landman, Rogier; Sharma, Jitendra; Sur, Mriganka; Desimone, Robert

    2014-12-16

In primates, visual stimuli with social and emotional content tend to attract attention. Attention might be captured through rapid, automatic, subcortical processing or guided by slower, more voluntary cortical processing. Here we examined whether irrelevant faces with varied emotional expressions interfere with a covert attention task in macaque monkeys. In the task, the monkeys monitored a target grating in the periphery for a subtle color change while ignoring distracters that included faces appearing elsewhere on the screen. The onset time of distracter faces before the target change, as well as their spatial proximity to the target, was varied from trial to trial. The presence of faces, especially faces with emotional expressions, interfered with the task, indicating a competition for attentional resources between the task and the face stimuli. However, this interference was significant only when faces were presented for more than 200 ms. Emotional faces also affected saccade velocity and reduced the pupillary reflex. Our results indicate that the attraction of attention by emotional faces in the monkey takes a considerable amount of processing time, possibly involving cortical-subcortical interactions. Intranasal application of the hormone oxytocin ameliorated the interfering effects of faces. Together, these results provide evidence for slow modulation of attention by emotional distracters, which likely involves oxytocinergic brain circuits.

  1. Modulation of the composite face effect by unintended emotion cues.

    PubMed

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grew stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se. In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions had been rated high for perceived emotion and half had been rated low. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  2. Music to my ears: Age-related decline in musical and facial emotion recognition.

    PubMed

    Sutcliffe, Ryan; Rendell, Peter G; Henry, Julie D; Bailey, Phoebe E; Ruffman, Ted

    2017-12-01

    We investigated young-old differences in emotion recognition using music and face stimuli and tested explanatory hypotheses regarding older adults' typically worse emotion recognition. In Experiment 1, young and older adults labeled emotions in an established set of faces, and in classical piano stimuli that we pilot-tested on other young and older adults. Older adults were worse at detecting anger, sadness, fear, and happiness in music. Performance on the music and face emotion tasks was not correlated for either age group. Because musical expressions of fear were not equated for age groups in the pilot study of Experiment 1, we conducted a second experiment in which we created a novel set of music stimuli that included more accessible musical styles, and which we again pilot-tested on young and older adults. In this pilot study, all musical emotions were identified similarly by young and older adults. In Experiment 2, participants also made age estimations in another set of faces to examine whether potential relations between the face and music emotion tasks would be shared with the age estimation task. Older adults did worse in each of the tasks, and had specific difficulty recognizing happy, sad, peaceful, angry, and fearful music clips. Older adults' difficulties in each of the 3 tasks-music emotion, face emotion, and face age-were not correlated with each other. General cognitive decline did not appear to explain our results as increasing age predicted emotion performance even after fluid IQ was controlled for within the older adult group. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Development of response inhibition in the context of relevant versus irrelevant emotions.

    PubMed

    Schel, Margot A; Crone, Eveline A

    2013-01-01

    The present study examined the influence of relevant and irrelevant emotions on response inhibition from childhood to early adulthood. Ninety-four participants between 6 and 25 years of age performed two go/nogo tasks with emotional faces (neutral, happy, and fearful) as stimuli. In one go/nogo task emotion formed a relevant dimension of the task and in the other go/nogo task emotion was irrelevant and participants had to respond to the color of the faces instead. A special feature of the latter task, in which emotion was irrelevant, was the inclusion of free choice trials, in which participants could freely decide between acting and inhibiting. Results showed a linear increase in response inhibition performance with increasing age both in relevant and irrelevant affective contexts. Relevant emotions had a pronounced influence on performance across age, whereas irrelevant emotions did not. Overall, participants made more false alarms on trials with fearful faces than happy faces, and happy faces were associated with better performance on go trials (higher percentage correct and faster RTs) than fearful faces. The latter effect was stronger for young children in terms of accuracy. Finally, during the free choice trials participants did not base their decisions on affective context, confirming that irrelevant emotions do not have a strong impact on inhibition. Together, these findings suggest that across development relevant affective context has a larger influence on response inhibition than irrelevant affective context. When emotions are relevant, a context of positive emotions is associated with better performance compared to a context with negative emotions, especially in young children.

  4. Emotion-attention interactions in recognition memory for distractor faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2010-04-01

    Effective filtering of distractor information has been shown to be dependent on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we wanted to explore the recognition memory for emotional distractors especially as a function of focused attention and distributed attention by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for distractor neutral and emotional faces. Participants performed a color discrimination task (low-load) or letter identification task (high-load) with a letter string display in Experiment 1 and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under conditions of less focused or distributed attention. When attention is more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information like sad or happy is associated with focused or distributed attention respectively. Distractor processing with emotional information also has implications for theories of attention. Copyright 2010 APA, all rights reserved.

  5. Age-related changes in emotional face processing across childhood and into young adulthood: evidence from event-related potentials

    PubMed Central

    MacNamara, Annmarie; Vergés, Alvaro; Kujawa, Autumn; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan

    2016-01-01

Socio-emotional processing is an essential part of development, and age-related changes in its neural correlates can be observed. The late positive potential (LPP) is a measure of motivated attention that can be used to assess emotional processing; however, changes in the LPP elicited by emotional faces have not been assessed across a wide age range in childhood and young adulthood. We used an emotional face matching task to examine behavior and event-related potentials (ERPs) in 33 youth aged 7 to 19 years. Younger children were slower when performing the matching task. The LPP elicited by emotional faces, but not control stimuli (geometric shapes), decreased with age; by contrast, an earlier ERP (the P1) decreased with age for both faces and shapes, suggesting increased efficiency of early visual processing. Results indicate age-related attenuation in emotional processing that may stem from increased efficiency and regulatory control when performing a socio-emotional task. PMID:26220144

  6. Attention and memory bias to facial emotions underlying negative symptoms of schizophrenia.

    PubMed

    Jang, Seon-Kyeong; Park, Seon-Cheol; Lee, Seung-Hwan; Cho, Yang Seok; Choi, Kee-Hong

    2016-01-01

This study assessed bias in selective attention to facial emotions in negative symptoms of schizophrenia and its influence on subsequent memory for facial emotions. Thirty people with schizophrenia who had high and low levels of negative symptoms (n = 15 each) and 21 healthy controls completed a visual probe detection task investigating selective attention bias (happy, sad, and angry faces randomly presented for 50, 500, or 1000 ms). A yes/no incidental facial memory task was then completed. Attention bias scores and recognition errors were calculated. Those with high negative symptoms exhibited reduced attention to emotional faces relative to neutral faces; those with low negative symptoms showed the opposite pattern when faces were presented for 500 ms, regardless of valence. Compared to healthy controls, those with high negative symptoms made more errors for happy faces in the memory task. Reduced attention to emotional faces in the probe detection task was significantly associated with less pleasure and motivation and more recognition errors for happy faces in the schizophrenia group only. Attention bias away from emotional information relatively early in the attentional process and associated diminished positive memory may relate to pathological mechanisms for negative symptoms.

  7. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments.

    PubMed

    Murphy, Karen; Ward, Zoe

    2017-01-01

Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross-task comparisons revealed larger complete RB effects for the expression and gender judgments than for the identity decisions. There was a larger emotion RB effect for the expression than for the gender judgments, and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition.

  8. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments

    PubMed Central

    Murphy, Karen; Ward, Zoe

    2017-01-01

Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), and emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross-task comparisons revealed larger complete RB effects for the expression and gender judgments than for the identity decisions. There was a larger emotion RB effect for the expression than for the gender judgments, and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition. PMID:29038663

  9. Family environment influences emotion recognition following paediatric traumatic brain injury.

    PubMed

    Schmidt, Adam T; Orsten, Kimberley D; Hanten, Gerri R; Li, Xiaoqi; Levin, Harvey S

    2010-01-01

    This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). A total of 142 children (75 TBI, 67 OI) were assessed on three occasions: baseline, 3 months and 1 year post-injury on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results indicated that family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Findings suggest family functioning variables--especially financial resources--can influence performance on an emotional processing task following TBI in children.

  10. Positive and negative emotion enhances the processing of famous faces in a semantic judgment task.

    PubMed

    Bate, Sarah; Haslam, Catherine; Hodgson, Timothy L; Jansari, Ashok; Gregory, Nicola; Kay, Janice

    2010-01-01

    Previous work has consistently reported a facilitatory influence of positive emotion in face recognition (e.g., D'Argembeau, Van der Linden, Comblain, & Etienne, 2003). However, these reports asked participants to make recognition judgments in response to faces, and it is unknown whether emotional valence may influence other stages of processing, such as at the level of semantics. Furthermore, other evidence suggests that negative rather than positive emotion facilitates higher level judgments when processing nonfacial stimuli (e.g., Mickley & Kensinger, 2008), and it is possible that negative emotion also influences latter stages of face processing. The present study addressed this issue, examining the influence of emotional valence while participants made semantic judgments in response to a set of famous faces. Eye movements were monitored while participants performed this task, and analyses revealed a reduction in information extraction for the faces of liked and disliked celebrities compared with those of emotionally neutral celebrities. Thus, in contrast to work using familiarity judgments, both positive and negative emotion facilitated processing in this semantic-based task. This pattern of findings is discussed in relation to current models of face processing. Copyright 2009 APA, all rights reserved.

  11. Real-Time Functional Magnetic Resonance Imaging Amygdala Neurofeedback Changes Positive Information Processing in Major Depressive Disorder.

    PubMed

    Young, Kymberly D; Misaki, Masaya; Harmer, Catherine J; Victor, Teresa; Zotev, Vadim; Phillips, Raquel; Siegle, Greg J; Drevets, Wayne C; Bodurka, Jerzy

    2017-10-15

In participants with major depressive disorder who are trained to upregulate their amygdalar hemodynamic responses during positive autobiographical memory recall with real-time functional magnetic resonance imaging neurofeedback (rtfMRI-nf) training, depressive symptoms diminish. This study tested whether amygdalar rtfMRI-nf also changes emotional processing of positive and negative stimuli in a variety of behavioral and imaging tasks. Patients with major depressive disorder completed two rtfMRI-nf sessions (18 received amygdalar rtfMRI-nf, 16 received control parietal rtfMRI-nf). One week before and following rtfMRI-nf training, participants performed tasks measuring responses to emotionally valenced stimuli, including a backward-masking task, which measures the amygdalar hemodynamic response to emotional faces presented for a traditionally subliminal duration and followed by a mask, and the Emotional Test Battery, in which reaction times and performance accuracy are measured during tasks involving emotional faces and words. During the backward-masking task, amygdalar responses increased while viewing masked happy faces but decreased to masked sad faces in the experimental versus control group following rtfMRI-nf. During the Emotional Test Battery, in the experimental versus control group following rtfMRI-nf, reaction times decreased for the identification of positive faces and for self-identification with positive words, while vigilance scores in the faces dot-probe task increased for positive faces and decreased for negative faces. rtfMRI-nf training to increase the amygdalar hemodynamic response to positive memories was associated with changes in amygdalar responses to happy and sad faces and improved processing of positive stimuli during performance of the Emotional Test Battery. These results may suggest that amygdalar rtfMRI-nf training alters responses to emotional stimuli in a manner similar to antidepressant pharmacotherapy. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  12. Age-related emotional bias in processing two emotionally valenced tasks.

    PubMed

    Allen, Philip A; Lien, Mei-Ching; Jardin, Elliott

    2017-01-01

Previous studies suggest that older adults process positive emotions more efficiently than negative emotions, whereas younger adults show the reverse effect. We examined whether this age-related difference in emotional bias still occurs when attention is engaged in two emotional tasks. We used a psychological refractory period paradigm and varied the emotional valence of Task 1 and Task 2. In both experiments, Task 1 was emotional face discrimination (happy vs. angry faces) and Task 2 was sound discrimination (laugh vs. punch vs. cork pop in Experiment 1 and laugh vs. scream in Experiment 2). The backward emotional correspondence effect for positively and negatively valenced Task 2 on Task 1 was measured. In both experiments, younger adults showed a backward correspondence effect from a negatively valenced Task 2, suggesting parallel processing of negatively valenced stimuli. Older adults showed a similar negativity bias in Experiment 2 with a more salient negative sound ("scream" relative to "punch"). These results are consistent with an arousal-biased competition model [Mather and Sutherland (Perspectives on Psychological Science 6:114-133, 2011)], suggesting that emotional arousal modulates top-down attentional control settings (emotional regulation) with age.

  13. Autistic traits and social anxiety predict differential performance on social cognitive tasks in typically developing young adults

    PubMed Central

    Burk, Joshua A.; Fleckenstein, Katarina; Kozikowski, C. Teal

    2018-01-01

The current work examined the unique contributions of autistic traits and social anxiety to performance on tasks examining attention and emotion processing. In Study 1, 119 typically-developing college students completed a flanker task assessing the control of attention to target faces and away from distracting faces during emotion identification. In Study 2, 208 typically-developing college students performed a visual search task that required identification of whether a series of 8 or 16 emotional faces depicted the same or different emotions. Participants with more self-reported autistic traits performed more slowly on the flanker task in Study 1 than those with fewer autistic traits when stimuli depicted complex emotions. In Study 2, participants higher in social anxiety performed less accurately on trials showing all complex faces; participants with autistic traits showed no differences. These studies suggest that traits related to autism and to social anxiety differentially impact social cognitive processing. PMID:29596523

  14. Evidence for the triadic model of adolescent brain development: Cognitive load and task-relevance of emotion differentially affect adolescents and adults.

    PubMed

Mueller, Sven C; Cromheeke, Sofie; Siugzdaite, Roma; Boehler, C Nicolas

    2017-08-01

In adults, cognitive control is supported by several brain regions, including the limbic system and the dorsolateral prefrontal cortex (dlPFC), when processing emotional information. However, in adolescents, some theories hypothesize a neurobiological imbalance proposing heightened sensitivity to affective material in the amygdala and striatum within a cognitive control context. Yet, direct neurobiological evidence is scarce. Twenty-four adolescents (12-16 years) and 28 adults (25-35 years) completed an emotional n-back working memory task in response to happy, angry, and neutral faces during fMRI. Importantly, participants either paid attention to the emotion (task-relevant condition) or judged the gender (task-irrelevant condition). Behaviorally, for both groups, when happy faces were task-relevant, performance improved relative to when they were task-irrelevant, while performance decrements were seen for angry faces. In the dlPFC, angry faces elicited more activation in adults during low relative to high cognitive load (2-back vs. 0-back). By contrast, happy faces elicited more activation in the amygdala in adolescents when they were task-relevant. Happy faces also generally increased nucleus accumbens activity (regardless of relevance) in adolescents relative to adults. Together, the findings are consistent with neurobiological models of adolescent brain development and identify neurodevelopmental differences in cognitive control-emotion interactions. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. The Cambridge Mindreading (CAM) Face-Voice Battery: Testing complex emotion recognition in adults with and without Asperger syndrome.

    PubMed

    Golan, Ofer; Baron-Cohen, Simon; Hill, Jacqueline

    2006-02-01

    Adults with Asperger Syndrome (AS) can recognise simple emotions and pass basic theory of mind tasks, but have difficulties recognising more complex emotions and mental states. This study describes a new battery of tasks, testing recognition of 20 complex emotions and mental states from faces and voices. The battery was given to males and females with AS and matched controls. Results showed the AS group performed worse than controls overall, on emotion recognition from faces and voices and on 12/20 specific emotions. Females recognised faces better than males regardless of diagnosis, and males with AS had more difficulties recognising emotions from faces than from voices. The implications of these results are discussed in relation to social functioning in AS.

  16. Age-Related Developmental and Individual Differences in the Influence of Social and Non-social Distractors on Cognitive Performance.

    PubMed

    Tan, Patricia Z; Silk, Jennifer S; Dahl, Ronald E; Kronhaus, Dina; Ladouceur, Cecile D

    2018-01-01

This study sought to examine age-related differences in the influences of social (neutral, emotional faces) and non-social/non-emotional (shapes) distractor stimuli in children, adolescents, and adults. To assess the degree to which distractor, or task-irrelevant, stimuli of varying social and emotional salience interfere with cognitive performance, children (N = 12; 8-12y), adolescents (N = 17; 13-17y), and adults (N = 17; 18-52y) completed the Emotional Identification and Dynamic Faces (EIDF) task. This task included three types of dynamically-changing distractors: (1) neutral-social (neutral face changing into another face); (2) emotional-social (face changing from 0% emotional to 100% emotional); and (3) non-social/non-emotional (shapes changing from small to large) to index the influence of task-irrelevant social and emotional information on cognition. Results yielded no age-related differences in accuracy but showed an age-related linear reduction in correct reaction times across distractor conditions. An age-related effect in interference was observed, such that children and adults showed slower response times on correct trials with socially-salient distractors, whereas adolescents exhibited faster responses on trials with distractors that included faces rather than shapes. A secondary study goal was to explore individual differences in cognitive interference. Results suggested that regardless of age, low trait anxiety and high effortful control were associated with interference to angry faces. Implications for developmental differences in affective processing, notably the importance of considering the contexts in which purportedly irrelevant social and emotional information might impair vs. improve cognitive control, are discussed.

  17. No Prior Entry for Threat-Related Faces: Evidence from Temporal Order Judgments

    PubMed Central

    Schettino, Antonio; Loeys, Tom; Pourtois, Gilles

    2013-01-01

    Previous research showed that threat-related faces, due to their intrinsic motivational relevance, capture attention more readily than neutral faces. Here we used a standard temporal order judgment (TOJ) task to assess whether negative (either angry or fearful) emotional faces, when competing with neutral faces for attention selection, may lead to a prior entry effect and hence be perceived as appearing first, especially when uncertainty regarding the order of the two onsets is high. We did not find evidence for this conjecture across five different experiments, despite the fact that participants were invariably influenced by asynchronies in the respective onsets of the two competing faces in the pair, and could reliably identify the emotion in the faces. Importantly, by systematically varying task demands across experiments, we could rule out confounds related to suboptimal stimulus presentation or inappropriate task demands. These findings challenge the notion of an early automatic capture of attention by (negative) emotion. Future studies are needed to investigate whether the lack of a systematic bias of attention by emotion is attributable to the primacy of a non-emotional cue in resolving the TOJ task, which in turn may prevent negative emotion from exerting an early bottom-up influence on the guidance of spatial and temporal attention. PMID:23646126
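
    Prior entry in a TOJ paradigm is typically quantified by fitting a psychometric function to the proportion of "emotional face first" responses across stimulus onset asynchronies (SOAs): a shift of the 50% point (the point of subjective simultaneity, PSS) away from zero indicates prior entry. The sketch below illustrates the general idea only; the fitting procedure, SOA values, and data are invented for the example and are not taken from the study.

        # Illustrative TOJ analysis: fit a cumulative Gaussian to the proportion
        # of "emotional face first" responses and read off the PSS (50% point).
        # SOA convention here: positive SOA = emotional face shown first (ms).
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        def cum_gauss(soa, pss, slope):
            return norm.cdf(soa, loc=pss, scale=slope)

        soas = np.array([-120, -60, -30, 0, 30, 60, 120])   # invented SOAs (ms)
        p_emotional_first = np.array([0.08, 0.22, 0.38, 0.55, 0.71, 0.86, 0.95])

        (pss, slope), _ = curve_fit(cum_gauss, soas, p_emotional_first, p0=[0.0, 50.0])
        # A negative PSS would mean the emotional face is judged "first" even at
        # physical simultaneity, i.e., a prior entry effect for emotion.
        print(f"PSS = {pss:.1f} ms, slope = {slope:.1f} ms")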

  18. Family environment influences emotion recognition following paediatric traumatic brain injury

    PubMed Central

    SCHMIDT, ADAM T.; ORSTEN, KIMBERLEY D.; HANTEN, GERRI R.; LI, XIAOQI; LEVIN, HARVEY S.

    2011-01-01

    Objective: This study investigated the relationship between family functioning and performance on two tasks of emotion recognition (emotional prosody and face emotion recognition) and a cognitive control procedure (the Flanker task) following paediatric traumatic brain injury (TBI) or orthopaedic injury (OI). Methods: A total of 142 children (75 TBI, 67 OI) were assessed on three occasions: baseline, 3 months and 1 year post-injury on the two emotion recognition tasks and the Flanker task. Caregivers also completed the Life Stressors and Resources Scale (LISRES) on each occasion. Growth curve analysis was used to analyse the data. Results: Results indicated that family functioning influenced performance on the emotional prosody and Flanker tasks but not on the face emotion recognition task. Findings on both the emotional prosody and Flanker tasks were generally similar across groups. However, financial resources emerged as significantly related to emotional prosody performance in the TBI group only (p = 0.0123). Conclusions: Findings suggest family functioning variables—especially financial resources—can influence performance on an emotional processing task following TBI in children. PMID:21058900
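
    The abstract names growth curve analysis; in practice such models are often fitted as linear mixed-effects models with random intercepts and slopes over the repeated assessments. The sketch below is a hedged illustration of that general approach only; the formula, column names, and predictors are assumptions and do not reproduce the study's actual model.

        # Hypothetical growth-curve sketch: emotional prosody performance modelled
        # over time (baseline, 3 months, 12 months) with injury group and a family
        # financial-resources predictor. All column names are assumed.
        import pandas as pd
        import statsmodels.formula.api as smf

        def fit_growth_model(df: pd.DataFrame):
            # df columns (assumed): subject, months_post_injury (0, 3, 12),
            # group ('TBI'/'OI'), financial_resources, prosody_score
            model = smf.mixedlm(
                "prosody_score ~ months_post_injury * group + financial_resources",
                data=df,
                groups=df["subject"],
                re_formula="~months_post_injury",  # random intercept and slope
            )
            return model.fit()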

  19. On the Automaticity of Emotion Processing in Words and Faces: Event-Related Brain Potentials Evidence from a Superficial Task

    ERIC Educational Resources Information Center

    Rellecke, Julian; Palazova, Marina; Sommer, Werner; Schacht, Annekathrin

    2011-01-01

    The degree to which emotional aspects of stimuli are processed automatically is controversial. Here, we assessed the automatic elicitation of emotion-related brain potentials (ERPs) to positive, negative, and neutral words and facial expressions in an easy and superficial face-word discrimination task, for which the emotional valence was…

  20. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    PubMed

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex and individual sensitivity to smell androstadienone. To do so, we investigated 56 healthy individuals (thereof 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone, once under placebo exposure in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex, task and emotion-specific manner as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time also women were slightly affected by smelling androstadienone as they classified angry faces more often correctly under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was neither correlated with reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  1. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces

    PubMed Central

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus is a putative human chemosignal. Androstadienone has been studied with respect to effects on mood states, attractiveness ratings, physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex and individual sensitivity to smell androstadienone. To do so, we investigated 56 healthy individuals (thereof 29 females taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone, once under placebo exposure in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex, task and emotion-specific manner as a reduction in interference processes in the emotional Stroop task was only apparent for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times for congruent compared to incongruent trials. At the same time also women were slightly affected by smelling androstadienone as they classified angry faces more often correctly under androstadienone. For the emotional dot-probe task no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was neither correlated with reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected. PMID:28369152
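
    The central dependent measure in the emotional Stroop task described above is an interference score: the reaction-time difference between incongruent and congruent trials, computed per participant, exposure condition, and face emotion. A minimal sketch of that computation follows; the column names are illustrative assumptions rather than details from the study.

        # Emotional-Stroop scoring sketch: interference = mean RT on incongruent
        # trials minus mean RT on congruent trials, per participant, exposure
        # condition (androstadienone vs. placebo) and face emotion.
        # Column names are assumptions.
        import pandas as pd

        def stroop_interference(trials: pd.DataFrame) -> pd.DataFrame:
            # trials columns (assumed): subject, exposure, emotion,
            # congruency ('congruent'/'incongruent'), rt_ms, correct (bool)
            correct = trials[trials["correct"]]
            mean_rt = (correct.groupby(["subject", "exposure", "emotion", "congruency"])
                              ["rt_ms"].mean().unstack("congruency"))
            mean_rt["interference"] = mean_rt["incongruent"] - mean_rt["congruent"]
            return mean_rt.reset_index()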

  2. The ties to unbind: age-related differences in feature (un)binding in working memory for emotional faces.

    PubMed

    Pehlivanoglu, Didem; Jain, Shivangi; Ariel, Robert; Verhaeghen, Paul

    2014-01-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyper-binding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both identity and expression of the face (both condition). The two age groups performed more slowly and with lower accuracy in the expression condition than in the both condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory. Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/STM and working memory. Pupil dilation data confirmed that the attention/STM version of the task (1-Back) is more effortful for older adults than younger adults.
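
    The unbinding effect reported here is a contrast between the expression-only condition and the combined identity-plus-expression condition. A hedged sketch of how such a cost could be computed from per-trial data follows; the column names and layout are assumptions, not details from the study.

        # Sketch of an "unbinding cost": performance in the expression condition
        # relative to the combined (identity + expression) condition, per age
        # group and N-back level. Column names are illustrative only.
        import pandas as pd

        def unbinding_cost(trials: pd.DataFrame) -> pd.DataFrame:
            # trials columns (assumed): subject, age_group, nback (0/1/2),
            # condition ('identity'/'expression'/'both'), rt_ms, correct (bool)
            summary = (trials.groupby(["age_group", "nback", "condition"])
                             .agg(accuracy=("correct", "mean"), rt=("rt_ms", "mean")))
            expr = summary.xs("expression", level="condition")
            both = summary.xs("both", level="condition")
            cost = pd.DataFrame({
                "accuracy_cost": both["accuracy"] - expr["accuracy"],  # >0: less accurate in expression-only
                "rt_cost": expr["rt"] - both["rt"],                    # >0: slower in expression-only
            })
            return cost.reset_index()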

  3. Cultural differences in on-line sensitivity to emotional voices: comparing East and West

    PubMed Central

    Liu, Pan; Rigoulot, Simon; Pell, Marc D.

    2015-01-01

    Evidence that culture modulates on-line neural responses to the emotional meanings encoded by vocal and facial expressions was demonstrated recently in a study comparing English North Americans and Chinese (Liu et al., 2015). Here, we compared how individuals from these two cultures passively respond to emotional cues from faces and voices using an Oddball task. Participants viewed in-group emotional faces, with or without simultaneous vocal expressions, while performing a face-irrelevant visual task as the EEG was recorded. A significantly larger visual Mismatch Negativity (vMMN) was observed for Chinese vs. English participants when faces were accompanied by voices, suggesting that Chinese were influenced to a larger extent by task-irrelevant vocal cues. These data highlight further differences in how adults from East Asian vs. Western cultures process socio-emotional cues, arguing that distinct cultural practices in communication (e.g., display rules) shape neurocognitive activity associated with the early perception and integration of multi-sensory emotional cues. PMID:26074808

  4. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential

    PubMed Central

    Bublatzky, Florian; Gerdes, Antje B. M.; White, Andrew J.; Riemer, Martin; Alpers, Georg W.

    2014-01-01

    Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERP) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental tasks. Whereas task effects were observed for P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories. PMID:25076881

  5. “Distracters” Do Not Always Distract: Visual Working Memory for Angry Faces is Enhanced by Incidental Emotional Words

    PubMed Central

    Jackson, Margaret C.; Linden, David E. J.; Raymond, Jane E.

    2012-01-01

    We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM faces task. Participants encoded two angry or happy faces into WM, and several seconds into a 9 s maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task. PMID:23112782

  6. Neural circuitry of emotional face processing in autism spectrum disorders.

    PubMed

    Monk, Christopher S; Weng, Shih-Jen; Wiggins, Jillian Lee; Kurapati, Nikhil; Louro, Hugo M C; Carrasco, Melisa; Maslowsky, Julie; Risi, Susan; Lord, Catherine

    2010-03-01

    Autism spectrum disorders (ASD) are associated with severe impairments in social functioning. Because faces provide nonverbal cues that support social interactions, many studies of ASD have examined neural structures that process faces, including the amygdala, ventromedial prefrontal cortex and superior and middle temporal gyri. However, increases or decreases in activation are often contingent on the cognitive task. Specifically, the cognitive domain of attention influences group differences in brain activation. We investigated brain function abnormalities in participants with ASD using a task that monitored attention bias to emotional faces. Twenty-four participants (12 with ASD, 12 controls) completed a functional magnetic resonance imaging study while performing an attention cuing task with emotional (happy, sad, angry) and neutral faces. In response to emotional faces, those in the ASD group showed greater right amygdala activation than those in the control group. A preliminary psychophysiological connectivity analysis showed that ASD participants had stronger positive right amygdala and ventromedial prefrontal cortex coupling and weaker positive right amygdala and temporal lobe coupling than controls. There were no group differences in the behavioural measure of attention bias to the emotional faces. The small sample size may have affected our ability to detect additional group differences. When attention bias to emotional faces was equivalent between ASD and control groups, ASD was associated with greater amygdala activation. Preliminary analyses showed that ASD participants had stronger connectivity between the amygdala and ventromedial prefrontal cortex (a network implicated in emotional modulation) and weaker connectivity between the amygdala and temporal lobe (a pathway involved in the identification of facial expressions, although areas of group differences were generally in a more anterior region of the temporal lobe than what is typically reported for emotional face processing). These alterations in connectivity are consistent with emotion and face processing disturbances in ASD.

  7. Attentional Modulation of Emotional Conflict Processing with Flanker Tasks

    PubMed Central

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive, largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degrees attention modulated processing of emotion using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color scale faces in neutral expressions or gray scale faces in emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion) and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to those of color and face identity (gender). However, task modulation on color SRC effect was significantly greater than that on gender or emotion SRC effect, indicating that processing of salient information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than that on non-emotional stimuli. PMID:23544155

  8. Attentional modulation of emotional conflict processing with flanker tasks.

    PubMed

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive, largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degrees attention modulated processing of emotion using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color scale faces in neutral expressions or gray scale faces in emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion) and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to those of color and face identity (gender). However, task modulation on color SRC effect was significantly greater than that on gender or emotion SRC effect, indicating that processing of salient information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than that on non-emotional stimuli.

  9. On the neural control of social emotional behavior

    PubMed Central

    Roelofs, Karin; Minelli, Alessandra; Mars, Rogier B.; van Peer, Jacobien; Toni, Ivan

    2009-01-01

    It is known that the orbitofrontal cortex (OFC) is crucially involved in emotion regulation. However, the specific role of the OFC in controlling the behavior evoked by these emotions, such as approach–avoidance (AA) responses, remains largely unexplored. We measured behavioral and neural responses (using fMRI) during the performance of a social task, a reaction time (RT) task where subjects approached or avoided visually presented emotional faces by pulling or pushing a joystick, respectively. RTs were longer for affect-incongruent responses (approach angry faces and avoid happy faces) as compared to affect-congruent responses (approach–happy; avoid–angry). Moreover, affect-incongruent responses recruited increased activity in the left lateral OFC. These behavioral and neural effects emerged only when the subjects responded explicitly to the emotional value of the faces (AA-task) and largely disappeared when subjects responded to an affectively irrelevant feature of the faces during a control (gender evaluation: GE) task. Most crucially, the size of the OFC-effect correlated positively with the size of the behavioral costs of approaching angry faces. These findings qualify the role of the lateral OFC in the voluntary control of social–motivational behavior, emphasizing the relevance of this region for selecting rule-driven stimulus–response associations, while overriding automatic (affect-congruent) stimulus–response mappings. PMID:19047074

  10. Facial Emotion Recognition in Bipolar Disorder and Healthy Aging.

    PubMed

    Altamura, Mario; Padalino, Flavia A; Stella, Eleonora; Balzotti, Angela; Bellomo, Antonello; Palumbo, Rocco; Di Domenico, Alberto; Mammarella, Nicola; Fairfield, Beth

    2016-03-01

    Emotional face recognition is impaired in bipolar disorder, but it is not clear whether this is specific for the illness. Here, we investigated how aging and bipolar disorder influence dynamic emotional face recognition. Twenty older adults, 16 bipolar patients, and 20 control subjects performed a dynamic affective facial recognition task and a subsequent rating task. Participants pressed a key as soon as they were able to discriminate whether the neutral face was assuming a happy or angry facial expression and then rated the intensity of each facial expression. Results showed that older adults recognized happy expressions faster, whereas bipolar patients recognized angry expressions faster. Furthermore, both groups rated emotional faces more intensely than did the control subjects. This study is one of the first to compare how aging and clinical conditions influence emotional facial recognition and underlines the need to consider the role of specific and common factors in emotional face recognition.

  11. Attentional Bias in Psychopathy: An Examination of the Emotional Dot-Probe Task in Male Jail Inmates.

    PubMed

    Edalati, Hanie; Walsh, Zach; Kosson, David S

    2016-08-01

    Numerous studies have identified differences in the identification of emotional displays between psychopaths and non-psychopaths; however, results have been equivocal regarding the nature of these differences. The present study investigated an alternative approach to examining the association between psychopathy and emotion processing by examining attentional bias to emotional faces; we used a modified dot-probe task to measure attentional bias toward emotional faces in comparison with neutral faces, among a sample of male jail inmates assessed using the Psychopathy Checklist-Revised (PCL-R). Results indicated a positive association between psychopathy and attention toward happy versus neutral faces, and that this association was attributable to Factor 1 of the psychopathy construct. © The Author(s) 2015.
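
    Attentional bias in a dot-probe task of this kind is usually scored as the reaction-time advantage for probes that replace the emotional face over probes that replace the neutral face, so that positive values indicate attention drawn toward the emotional face. The sketch below shows that computation under assumed column names; it is illustrative rather than the study's actual pipeline.

        # Dot-probe bias score sketch: bias = mean RT when the probe replaces the
        # neutral face minus mean RT when it replaces the emotional face, per
        # participant and emotion. Positive bias = attention toward the emotion.
        # Column names are assumptions.
        import pandas as pd

        def dot_probe_bias(trials: pd.DataFrame) -> pd.DataFrame:
            # trials columns (assumed): subject, emotion ('happy'/'angry'),
            # probe_location ('emotional'/'neutral'), rt_ms, correct (bool)
            correct = trials[trials["correct"]]
            rt = (correct.groupby(["subject", "emotion", "probe_location"])["rt_ms"]
                         .mean().unstack("probe_location"))
            rt["bias"] = rt["neutral"] - rt["emotional"]
            return rt.reset_index()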

  12. How is this child feeling? Preschool-aged children’s ability to recognize emotion in faces and body poses

    PubMed Central

    Parker, Alison E.; Mathis, Erin T.; Kupersmidt, Janis B.

    2016-01-01

    The study examined children’s recognition of emotion from faces and body poses, as well as gender differences in these recognition abilities. Preschool-aged children (N = 55) and their parents and teachers participated in the study. Preschool-aged children completed a web-based measure of emotion recognition skills, which included five tasks (three with faces and two with bodies). Parents and teachers reported on children’s aggressive behaviors and social skills. Children’s emotion accuracy on two of the three facial tasks and one of the body tasks was related to teacher reports of social skills. Some of these relations were moderated by child gender. In particular, the relationships between emotion recognition accuracy and reports of children’s behavior were stronger for boys than girls. Identifying preschool-aged children’s strengths and weaknesses in identification of emotion from faces and body poses may be helpful in guiding interventions with children who have problems with social and behavioral functioning that may be due, in part, to emotional knowledge deficits. Further developmental implications of these findings are discussed. PMID:27057129

  13. Retention of identity versus expression of emotional faces differs in the recruitment of limbic areas.

    PubMed

    Röder, Christian H; Mohr, Harald; Linden, David E J

    2011-02-01

    Faces are multidimensional stimuli that convey information for complex social and emotional functions. Separate neural systems have been implicated in the recognition of facial identity (mainly extrastriate visual cortex) and emotional expression (limbic areas and the superior temporal sulcus). Working-memory (WM) studies with faces have shown different but partly overlapping activation patterns in comparison to spatial WM in parietal and prefrontal areas. However, little is known about the neural representations of the different facial dimensions during WM. In the present study 22 subjects performed a face-identity or face-emotion WM task at different load levels during functional magnetic resonance imaging. We found a fronto-parietal-visual WM-network for both tasks during maintenance, including fusiform gyrus. Limbic areas in the amygdala and parahippocampal gyrus demonstrated a stronger activation for the identity than the emotion condition. One explanation for this finding is that the repetitive presentation of faces with different identities but the same emotional expression during the identity-task is responsible for the stronger increase in BOLD signal in the amygdala. These results raise the question how different emotional expressions are coded in WM. Our findings suggest that emotional expressions are re-coded in an abstract representation that is supported at the neural level by the canonical fronto-parietal WM network. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Social incentives improve deliberative but not procedural learning in older adults.

    PubMed

    Gorlick, Marissa A; Maddox, W Todd

    2015-01-01

    Age-related deficits are seen across tasks where learning depends on asocial feedback processing; however, plasticity has been observed in some of the same tasks in social contexts, suggesting a novel way to attenuate deficits. Socioemotional selectivity theory suggests this plasticity is due to a deliberative motivational shift toward achieving well-being with age (positivity effect) that reverses when executive processes are limited (negativity effect). The present study examined the interaction of feedback valence (positive, negative) and social salience (emotional face feedback: happy or angry; asocial point feedback: gain or loss) on learning in a deliberative task that challenges executive processes and a procedural task that does not. We predict that angry face feedback will improve learning in a deliberative task when executive function is challenged. We tested two competing hypotheses regarding the interactive effects of deliberative emotional biases on automatic feedback processing: (1) if deliberative emotion regulation and automatic feedback are interactive, we expect happy face feedback to improve learning and angry face feedback to impair learning in older adults because cognitive control is available; (2) if deliberative emotion regulation and automatic feedback are not interactive, we predict that emotional face feedback will not improve procedural learning regardless of valence. Results demonstrate that older adults show persistent deficits relative to younger adults during procedural category learning, suggesting that deliberative emotional biases do not interact with automatic feedback processing. Interestingly, a subgroup of older adults identified as potentially using deliberative strategies tended to learn as well as younger adults with angry relative to happy feedback, matching the pattern observed in the deliberative task. Results suggest that deliberative emotional biases can improve deliberative learning, but have no effect on procedural learning.

  15. Emotional priming with facial exposures in euthymic patients with bipolar disorder.

    PubMed

    Kim, Taek Su; Lee, Su Young; Ha, Ra Yeon; Kim, Eosu; An, Suk Kyoon; Ha, Kyooseob; Cho, Hyun-Sang

    2011-12-01

    People with bipolar disorder have abnormal emotional processing. We investigated automatic and controlled emotional processing via a priming paradigm with subliminal and supraliminal facial exposure. We compared 20 euthymic bipolar patients and 20 healthy subjects on their performance in subliminal and supraliminal tasks. Priming tasks consisted of three different primes according to facial emotion (happy, sad, and neutral), followed by a neutral face as a target stimulus. The prime stimuli were presented subliminally (17 msec) or supraliminally (1000 msec). In subliminal tasks, both patients and controls judged the neutral target face as significantly more unpleasant (a negative judgment shift) when presented with negative emotion primes compared with positive primes. In supraliminal tasks, bipolar subjects showed a significant negative judgment shift, whereas healthy subjects did not. There was a significant group × emotion interaction for the judgment rate in supraliminal tasks. Our finding of persistent affective priming even at conscious awareness may suggest that bipolar patients have impaired cognitive control of emotional processing rather than automatically spreading activation of emotion.

  16. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    PubMed Central

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between group comparisons provide first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions. PMID:23805122

  17. Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.

    PubMed

    Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus

    2013-12-01

    Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity to examine perceptual and cognitive processing of facial expressions. Higher order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding for the motion-based intensity of facial expressions. The comparison of emotion with gender discrimination task revealed increased activation of inferior parietal lobule, which highlights the involvement of parietal areas in processing of high level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.

  18. Gender differences in brain networks supporting empathy.

    PubMed

    Schulte-Rüther, Martin; Markowitsch, Hans J; Shah, N Jon; Fink, Gereon R; Piefke, Martina

    2008-08-01

    Females frequently score higher on standard tests of empathy, social sensitivity, and emotion recognition than do males. It remains to be clarified, however, whether these gender differences are associated with gender specific neural mechanisms of emotional social cognition. We investigated gender differences in an emotion attribution task using functional magnetic resonance imaging. Subjects either focused on their own emotional response to emotion expressing faces (SELF-task) or evaluated the emotional state expressed by the faces (OTHER-task). Behaviorally, females rated SELF-related emotions significantly stronger than males. Across the sexes, SELF- and OTHER-related processing of facial expressions activated a network of medial and lateral prefrontal, temporal, and parietal brain regions involved in emotional perspective taking. During SELF-related processing, females recruited the right inferior frontal cortex and superior temporal sulcus stronger than males. In contrast, there was increased neural activity in the left temporoparietal junction in males (relative to females). When performing the OTHER-task, females showed increased activation of the right inferior frontal cortex while there were no differential activations in males. The data suggest that females recruit areas containing mirror neurons to a higher degree than males during both SELF- and OTHER-related processing in empathic face-to-face interactions. This may underlie facilitated emotional "contagion" in females. Together with the observation that males differentially rely on the left temporoparietal junction (an area mediating the distinction between the SELF and OTHERS) the data suggest that females and males rely on different strategies when assessing their own emotions in response to other people.

  19. Is there less to social anxiety than meets the eye? Behavioral and neural responses to three socio-emotional tasks

    PubMed Central

    2013-01-01

    Background: Social anxiety disorder (SAD) is widely thought to be characterized by heightened behavioral and limbic reactivity to socio-emotional stimuli. However, although behavioral findings are clear, neural findings are surprisingly mixed. Methods: Using functional magnetic resonance imaging (fMRI), we examined behavioral and brain responses in a priori emotion generative regions of interest (amygdala and insula) in 67 patients with generalized SAD and in 28 healthy controls (HC) during three distinct socio-emotional tasks. We administered these socio-emotional tasks during one fMRI scanning session: 1) looming harsh faces (Faces); 2) videotaped actors delivering social criticism (Criticism); and 3) written negative self-beliefs (Beliefs). Results: In each task, SAD patients reported heightened negative emotion, compared to HC. There were, however, no SAD versus HC differential brain responses in the amygdala and insula. Between-group whole-brain analyses confirmed no group differences in the responses of the amygdala and insula, and indicated different brain networks activated during each of the tasks. In SAD participants, social anxiety symptom severity was associated with increased BOLD signal in the left insula during the Faces task. Conclusions: The similar responses in amygdala and insula in SAD and HC participants suggest that heightened negative emotion responses reported by patients with SAD may be related to dysfunction in higher cognitive processes (e.g., distorted appraisal, attention biases, or ineffective cognitive reappraisal). In addition, the findings of this study emphasize the differential effects of socio-emotional experimental tasks. PMID:23448192

  20. Emotional conflict occurs at an early stage: evidence from the emotional face-word Stroop task.

    PubMed

    Zhu, Xiang-ru; Zhang, Hui-jun; Wu, Ting-ting; Luo, Wen-bo; Luo, Yue-jia

    2010-06-30

    The perceptual processing of emotional conflict was studied using electrophysiological techniques to measure event-related potentials (ERPs). The emotional face-word Stroop task, in which emotion words are written in prominent red color across a face, was used to study emotional conflict. In each trial, the emotion word and facial expression were either congruent or incongruent (in conflict). When subjects were asked to identify the expression of the face during a trial, the incongruent condition evoked a more negative N170 ERP component at posterior lateral sites than the congruent condition. In contrast, when subjects were asked to identify the word during a trial, the incongruent condition evoked a less negative N170 component than the congruent condition. The present findings extend our understanding of the control processes involved in emotional conflict by demonstrating that differentiation of emotional congruency begins at an early perceptual processing stage. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
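
    An N170 congruency effect like the one reported here is commonly quantified as the mean ERP amplitude within a time window around 170 ms at posterior-lateral electrodes, compared between congruent and incongruent trials. The sketch below illustrates that computation on simulated data; the window, sampling rate, channel indices, and epoch layout are all assumptions.

        # Sketch: mean amplitude in an assumed 150-190 ms window (N170 range) for
        # congruent vs. incongruent trials, averaged over assumed posterior-lateral
        # channels. Epoch layout and sampling rate are illustrative only.
        import numpy as np

        def n170_mean_amplitude(epochs, times_ms, channel_idx, window=(150.0, 190.0)):
            """epochs: (n_trials, n_channels, n_samples); returns one value per trial."""
            mask = (times_ms >= window[0]) & (times_ms <= window[1])
            return epochs[:, channel_idx, :][:, :, mask].mean(axis=(1, 2))

        # Simulated example: 200 trials, 64 channels, 500 Hz, -100..498 ms epochs.
        times = np.arange(-100, 500, 2.0)
        epochs = np.random.randn(200, 64, times.size)
        congruent = np.random.rand(200) < 0.5        # fake trial labels
        posterior_lateral = [55, 56, 57, 58]         # assumed channel indices
        amp = n170_mean_amplitude(epochs, times, posterior_lateral)
        effect = amp[~congruent].mean() - amp[congruent].mean()
        print(f"Incongruent minus congruent N170 amplitude: {effect:.3f} (a.u.)")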

  1. Searching for emotion or race: task-irrelevant facial cues have asymmetrical effects.

    PubMed

    Lipp, Ottmar V; Craig, Belinda M; Frost, Mareka J; Terry, Deborah J; Smith, Joanne R

    2014-01-01

    Facial cues of threat such as anger and other race membership are detected preferentially in visual search tasks. However, it remains unclear whether these facial cues interact in visual search. If both cues equally facilitate search, a symmetrical interaction would be predicted; anger cues should facilitate detection of other race faces and cues of other race membership should facilitate detection of anger. Past research investigating this race by emotional expression interaction in categorisation tasks revealed an asymmetrical interaction. This suggests that cues of other race membership may facilitate the detection of angry faces but not vice versa. Utilising the same stimuli and procedures across two search tasks, participants were asked to search for targets defined by either race or emotional expression. Contrary to the results revealed in the categorisation paradigm, cues of anger facilitated detection of other race faces whereas differences in race did not differentially influence detection of emotion targets.

  2. Face-Memory and Emotion: Associations with Major Depression in Children and Adolescents

    ERIC Educational Resources Information Center

    Pine, Daniel S.; Lissek, Shmuel; Klein, Rachel G.; Mannuzza, Salvatore; Moulton, John L., III; Guardino, Mary; Woldehawariat, Girma

    2004-01-01

    Background: Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and…

  3. Individual differences in emotion processing: how similar are diffusion model parameters across tasks?

    PubMed

    Mueller, Christina J; White, Corey N; Kuchinke, Lars

    2017-11-27

    The goal of this study was to replicate findings of diffusion model parameters capturing emotion effects in a lexical decision task and to investigate whether these findings extend to other tasks of implicit emotion processing. Additionally, we were interested in the stability of diffusion model parameters across emotional stimuli and tasks for individual subjects. Responses to words in a lexical decision task were compared with responses to faces in a gender categorization task for stimuli of three emotion categories: happy, neutral and fear. Main effects of emotion, as well as the stability of emerging response style patterns as evident in diffusion model parameters across these tasks, were analyzed. Based on earlier findings, drift rates were assumed to be more similar in response to stimuli of the same emotion category than to stimuli of a different emotion category. Results showed that the emotion effects of the tasks differed, with a processing advantage for happy followed by neutral and fear-related words in the lexical decision task and a processing advantage for neutral followed by happy and fearful faces in the gender categorization task. Both emotion effects were captured in the estimated drift rate parameters, and in the case of the lexical decision task also in the non-decision time parameters. A principal component analysis showed that, contrary to our hypothesis, drift rates were more similar within a specific task context than within a specific emotion category. Individual response patterns of subjects across tasks were evident in significant correlations regarding diffusion model parameters, including response styles, non-decision times and information accumulation.
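
    For readers unfamiliar with the diffusion model parameters mentioned above (drift rate, non-decision time), the sketch below simulates the basic process: noisy evidence accumulates at a given drift rate until it reaches one of two boundaries, and a non-decision time is added to the hitting time. Higher drift rates yield faster, more accurate responses, which is how processing advantages surface in fitted parameters. This is a generic illustration with arbitrary parameter values, not the estimation procedure used in the study.

        # Generic drift-diffusion simulation; parameter values are arbitrary.
        import numpy as np

        def simulate_ddm(drift, boundary=1.0, non_decision=0.3,
                         dt=0.001, noise=1.0, max_t=3.0, rng=None):
            """Return (response, rt): response 1 = upper boundary, 0 = lower."""
            rng = rng if rng is not None else np.random.default_rng()
            evidence, t = boundary / 2.0, 0.0   # start midway between boundaries
            while 0.0 < evidence < boundary and t < max_t:
                evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                t += dt
            return int(evidence >= boundary), t + non_decision

        rng = np.random.default_rng(0)
        for label, drift in [("higher drift (e.g., happy words)", 2.0),
                             ("lower drift (e.g., fear-related words)", 1.2)]:
            sims = [simulate_ddm(drift, rng=rng) for _ in range(2000)]
            acc = np.mean([r for r, _ in sims])
            rt = np.mean([t for _, t in sims])
            print(f"{label}: accuracy = {acc:.2f}, mean RT = {rt:.3f} s")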

  4. The ties to unbind: age-related differences in feature (un)binding in working memory for emotional faces

    PubMed Central

    Pehlivanoglu, Didem; Jain, Shivangi; Ariel, Robert; Verhaeghen, Paul

    2014-01-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyper-binding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face’s emotional expression (expression condition), or both identity and expression of the face (both condition). The two age groups performed more slowly and with lower accuracy in the expression condition than in the both condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory. Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/STM and working memory. Pupil dilation data confirmed that the attention/STM version of the task (1-Back) is more effortful for older adults than younger adults. PMID:24795660

  5. Reduced beta connectivity during emotional face processing in adolescents with autism.

    PubMed

    Leung, Rachel C; Ye, Annette X; Wong, Simeon M; Taylor, Margot J; Doesburg, Sam M

    2014-01-01

    Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by impairments in social cognition. The biological basis of deficits in social cognition in ASD, and their difficulty in processing emotional face information in particular, remains unclear. Atypical communication within and between brain regions has been reported in ASD. Interregional phase-locking is a neurophysiological mechanism mediating communication among brain areas and is understood to support cognitive functions. In the present study we investigated interregional magnetoencephalographic phase synchronization during the perception of emotional faces in adolescents with ASD. A total of 22 adolescents with ASD (18 males, mean age = 14.2 ± 1.15 years, 22 right-handed) with mild to no cognitive delay and 17 healthy controls (14 males, mean age = 14.4 ± 0.33 years, 16 right-handed) performed an implicit emotional processing task requiring perception of happy, angry and neutral faces while we recorded neuromagnetic signals. The faces were presented rapidly (80 ms duration) to the left or right of a central fixation cross and participants responded to a scrambled pattern that was presented concurrently on the opposite side of the fixation point. Task-dependent interregional phase-locking was calculated among source-resolved brain regions. Task-dependent increases in interregional beta synchronization were observed. Beta-band interregional phase-locking in adolescents with ASD was reduced, relative to controls, during the perception of angry faces in a distributed network involving the right fusiform gyrus and insula. No significant group differences were found for happy or neutral faces, or other analyzed frequency ranges. Significant reductions in task-dependent beta connectivity strength, clustering and eigenvector centrality (all P < 0.001) in the right insula were found in adolescents with ASD, relative to controls. Reduced beta synchronization may reflect inadequate recruitment of task-relevant networks during emotional face processing in ASD. The right insula, specifically, was a hub of reduced functional connectivity and may play a prominent role in the inability to effectively extract emotional information from faces. These findings suggest that functional disconnection in brain networks mediating emotional processes may contribute to deficits in social cognition in this population.
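
    Interregional phase-locking of the kind analyzed here is commonly quantified as a phase-locking value (PLV): the consistency, across trials, of the instantaneous phase difference between two band-limited signals. The sketch below estimates a beta-band PLV with a Hilbert transform on simulated data; the filter band, array layout, and region labels are assumptions, not the study's actual pipeline.

        # Beta-band phase-locking value (PLV) sketch between two source time
        # series. Band limits, sampling rate, and data are illustrative only.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def beta_plv(sig_a, sig_b, sfreq, band=(15.0, 30.0)):
            """sig_a, sig_b: (n_trials, n_samples) arrays; returns PLV per sample."""
            b, a = butter(4, [band[0] / (sfreq / 2), band[1] / (sfreq / 2)], btype="band")
            phase_a = np.angle(hilbert(filtfilt(b, a, sig_a, axis=-1), axis=-1))
            phase_b = np.angle(hilbert(filtfilt(b, a, sig_b, axis=-1), axis=-1))
            # PLV = magnitude of the trial-averaged unit phasor of the phase difference.
            return np.abs(np.exp(1j * (phase_a - phase_b)).mean(axis=0))

        # Simulated example: 50 trials of 1 s at 600 Hz from two "regions".
        rng = np.random.default_rng(1)
        n_trials, sfreq = 50, 600
        fusiform = rng.standard_normal((n_trials, sfreq))
        insula = 0.6 * fusiform + 0.4 * rng.standard_normal((n_trials, sfreq))
        print("mean beta PLV:", round(float(beta_plv(fusiform, insula, sfreq).mean()), 3))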

  6. Face to face with emotion: holistic face processing is modulated by emotional state.

    PubMed

    Curby, Kim M; Johnson, Kareem J; Tyson, Alyssa

    2012-01-01

    Negative emotions are linked with a local, rather than global, visual processing style, which may preferentially facilitate feature-based, relative to holistic, processing mechanisms. Because faces are typically processed holistically, and because social contexts are prime elicitors of emotions, we examined whether negative emotions decrease holistic processing of faces. We induced positive, negative, or neutral emotions via film clips and measured holistic processing before and after the induction: participants made judgements about cued parts of chimeric faces, and holistic processing was indexed by the interference caused by task-irrelevant face parts. Emotional state significantly modulated face-processing style, with the negative emotion induction leading to decreased holistic processing. Furthermore, self-reported change in emotional state correlated with changes in holistic processing. These results contrast with general assumptions that holistic processing of faces is automatic and immune to outside influences, and they illustrate emotion's power to modulate socially relevant aspects of visual perception.

  7. Age-Related Changes in the Processing of Emotional Faces in a Dual-Task Paradigm.

    PubMed

    Casares-Guillén, Carmen; García-Rodríguez, Beatriz; Delgado, Marisa; Ellgring, Heiner

    2016-01-01

    Background/Study Context: Age-related changes appear to affect the ability to identify emotional facial expressions in dual-task conditions (i.e., while simultaneously performing a second visual task). The level of interference generated by the secondary task depends on the phase of emotional processing affected by the interference and the nature of the secondary task. The aim of the present study was to investigate the effect of these variables on age-related changes in the processing of emotional faces. The identification of emotional facial expressions (EFEs) was assessed in a dual-task paradigm using the following variables: (a) the phase during which interference was applied (encoding vs. retrieval phase); and (b) the nature of the interfering stimulus (visuospatial vs. verbal). The sample population consisted of 24 healthy aged adults (mean age = 75.38) and 40 younger adults (mean age = 26.90). The accuracy of EFE identification was calculated for all experimental conditions. Consistent with our hypothesis, the performance of the older group was poorer than that of the younger group in all experimental conditions. Dual-task performance was poorer when the interference occurred during the encoding phase of emotional face processing and when both tasks were of the same nature (i.e., when the experimental condition was more demanding in terms of attention). These results provide empirical evidence of age-related deficits in the identification of emotional facial expressions, which may be partially explained by the impairment of cognitive resources specific to this task. These findings may account for the difficulties experienced by the elderly during social interactions that require the concomitant processing of emotional and environmental information.

  8. I don't know where to look: the impact of intolerance of uncertainty on saccades towards non-predictive emotional face distractors.

    PubMed

    Morriss, Jayne; McSorley, Eugene; van Reekum, Carien M

    2017-08-24

    Attentional bias to uncertain threat is associated with anxiety disorders. Here we examine the extent to which emotional face distractors (happy, angry and neutral) and individual differences in intolerance of uncertainty (IU) impact saccades in two versions of the "follow a cross" task. In both versions of the task, the probability of receiving an emotional face distractor was 66.7%. To increase perceived uncertainty regarding the location of the face distractors, in one of the tasks additional non-predictive cues were presented before the onset of the face distractors and target. We did not find IU to impact saccades towards non-cued face distractors. However, we found IU, over and above trait anxiety, to impact saccades towards non-predictively cued face distractors. Under these conditions, the eyes of individuals high in IU were pulled towards angry face distractors and away from happy face distractors overall, and the speed of this deviation of the eyes was determined by the combination of the cue and the emotion of the face. Overall, these results suggest a specific role of IU in attentional bias to threat during uncertainty. These findings highlight the potential of intolerance-of-uncertainty-based mechanisms to help understand anxiety disorder pathology and inform potential treatment targets.

  9. The impact of the stimulus features and task instructions on facial processing in social anxiety: an ERP investigation.

    PubMed

    Peschard, Virginie; Philippot, Pierre; Joassin, Frédéric; Rossignol, Mandy

    2013-04-01

    Social anxiety has been characterized by an attentional bias towards threatening faces. Electrophysiological studies have demonstrated modulations of cognitive processing from 100 ms after stimulus presentation. However, the impact of the stimulus features and task instructions on facial processing remains unclear. Event-related potentials were recorded while high and low socially anxious individuals performed an adapted Stroop paradigm that included a colour-naming task with non-emotional stimuli, an emotion-naming task (the explicit task) and a colour-naming task (the implicit task) on happy, angry and neutral faces. Whereas the impact of task factors was examined by contrasting an explicit and an implicit emotional task, the effects of perceptual changes on facial processing were explored by including upright and inverted faces. The findings showed an enhanced P1 in social anxiety during the three tasks, without a moderating effect of the type of task or stimulus. These results suggest a global modulation of attentional processing in performance situations. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Does cortisol modulate emotion recognition and empathy?

    PubMed

    Duesenberg, Moritz; Weber, Juliane; Schulze, Lars; Schaeuffele, Carmen; Roepke, Stefan; Hellmann-Regen, Julian; Otte, Christian; Wingenfeld, Katja

    2016-04-01

    Emotion recognition and empathy are important aspects of interacting with and understanding other people's behaviors and feelings. The human environment comprises stressful situations that impact social interactions on a daily basis. The aim of the study was to examine the effects of the stress hormone cortisol on emotion recognition and empathy. In this placebo-controlled study, 40 healthy men and 40 healthy women (mean age 24.5 years) received either 10 mg of hydrocortisone or placebo. We used the Multifaceted Empathy Test to measure emotional and cognitive empathy. Furthermore, we examined emotion recognition from facial expressions, which contained two emotions (anger and sadness) and two emotion intensities (40% and 80%). We did not find a main effect of treatment or sex on either empathy or emotion recognition, but we did find a sex × emotion interaction on emotion recognition. The main result was a four-way interaction on emotion recognition including treatment, sex, emotion and task difficulty. At 40% task difficulty, women recognized angry faces better than men in the placebo condition. Furthermore, in the placebo condition, men recognized sadness better than anger. At 80% task difficulty, men and women performed equally well in recognizing sad faces, but men performed worse than women with regard to angry faces. Thus, our results did not support the hypothesis that increases in cortisol concentration alone influence empathy and emotion recognition in healthy young individuals. However, sex and task difficulty appear to be important variables in emotion recognition from facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Prism adaptation does not change the rightward spatial preference bias found with ambiguous stimuli in unilateral neglect

    PubMed Central

    Sarri, Margarita; Greenwood, Richard; Kalra, Lalit; Driver, Jon

    2011-01-01

    Previous research has shown that prism adaptation (PA) can ameliorate several symptoms of spatial neglect after right-hemisphere damage, but the mechanisms behind this remain unclear. Recently we reported that prisms may increase leftward awareness for neglect in a task using chimeric visual objects, despite apparently not affecting awareness in a task using chimeric emotional faces (Sarri et al., 2006). Here we explored potential reasons for this apparent discrepancy in outcome, by testing further whether the lack of a prism effect on the chimeric face task could be explained by: (i) the specific category of stimuli used (faces as opposed to objects); (ii) the affective nature of the stimuli; and/or (iii) the particular task implemented, with the chimeric face task requiring forced-choice judgements of lateral 'preference' between pairs of identical but left/right mirror-reversed chimeric faces (as opposed to identification for the chimeric object task). We replicated our previous pattern of no impact of prisms on the emotional chimeric face task here in a new series of patients, while also finding no beneficial impact on another lateral 'preference' measure that used non-face, non-emotional stimuli, namely greyscale gradients. By contrast, we found the usual beneficial impact of PA on some conventional measures of neglect, and improvements for at least some patients in a different face task requiring explicit discrimination of the chimeric or non-chimeric nature of face stimuli. The new findings indicate that prism therapy does not alter spatial biases in neglect as revealed by 'lateral preference' tasks that have no right or wrong answer (requiring forced-choice judgements on left/right mirror-reversed stimuli), regardless of whether these employ face or non-face stimuli. But our data also show that prism therapy can beneficially modulate some aspects of visual awareness in spatial neglect not only for objects, but also for face stimuli, in some cases. PMID:20171612

  12. Emotion Perception or Social Cognitive Complexity: What Drives Face Processing Deficits in Autism Spectrum Disorder?

    ERIC Educational Resources Information Center

    Walsh, Jennifer A.; Creighton, Sarah E.; Rutherford, M. D.

    2016-01-01

    Some, but not all, relevant studies have revealed face processing deficits among those with autism spectrum disorder (ASD). In particular, deficits are revealed in face processing tasks that involve emotion perception. The current study examined whether either deficits in processing emotional expression or deficits in processing social cognitive…

  13. Neural Activation to Emotional Faces in Adolescents with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R.; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S.

    2011-01-01

    Background: Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and…

  14. Alcoholism and dampened temporal limbic activation to emotional faces.

    PubMed

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  15. Attentional bias for emotional faces in paediatric anxiety disorders: an investigation using the emotional Go/No Go task.

    PubMed

    Waters, Allison M; Valvoi, Jaya S

    2009-06-01

    The present study examined contextual modulation of attentional control processes in paediatric anxiety disorders. Anxious children (N=20) and non-anxious controls (N=20) completed an emotional Go/No Go task. In one condition, children responded to neutral faces (Go trials) presented amongst either angry or happy faces to which they withheld responses (No Go trials); in the other condition, angry and happy faces served as Go trials and children withheld responses to neutral faces. Anxious girls were slower to respond to neutral Go faces when the embedded No Go faces were angry rather than happy, whereas non-anxious girls were slower when the embedded No Go faces were happy rather than angry. Anxious and non-anxious boys showed the same basic pattern as non-anxious girls. There were no significant group differences on No Go trials or when the emotional faces were presented as Go trials. Results are discussed in terms of selective interference by angry faces in the control of attention in anxious girls.

  16. Effortful versus automatic emotional processing in schizophrenia: Insights from a face-vignette task.

    PubMed

    Patrick, Regan E; Rastogi, Anuj; Christensen, Bruce K

    2015-01-01

    Adaptive emotional responding relies on dual automatic and effortful processing streams. Dual-stream models of schizophrenia (SCZ) posit a selective deficit in neural circuits that govern goal-directed, effortful processes versus reactive, automatic processes. This imbalance suggests that when patients are confronted with competing automatic and effortful emotional response cues, they will exhibit diminished effortful responding and intact, possibly elevated, automatic responding compared to controls. This prediction was evaluated using a modified version of the face-vignette task (FVT). Participants viewed emotional faces (automatic response cue) paired with vignettes (effortful response cue) that signalled a different emotion category and were instructed to discriminate the manifest emotion. Patients made fewer vignette-based and more face-based responses than controls. However, the relationship between group and FVT responding was moderated by IQ and reading comprehension ability. These results replicate and extend previous research and provide tentative support for abnormal conflict resolution between automatic and effortful emotional processing predicted by dual-stream models of SCZ.

  17. Selective attention to emotional cues and emotion recognition in healthy subjects: the role of mineralocorticoid receptor stimulation.

    PubMed

    Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja

    2016-09-01

    Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3 years), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expressions at four graded levels of emotional intensity (20, 30, 40, and 80%). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and a shift in selective attention toward sad faces after fludrocortisone intake relative to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.

  18. Age-Group Differences in Interference from Young and Older Emotional Faces.

    PubMed

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  19. Happy faces, sad faces: Emotion understanding in toddlers and preschoolers with language impairments.

    PubMed

    Rieffe, Carolien; Wiefferink, Carin H

    2017-03-01

    The capacity for emotion recognition and understanding is crucial for daily social functioning. We examined to what extent this capacity is impaired in young children with a Language Impairment (LI). In typical development, children learn to recognize emotions in faces and situations through social experiences and social learning. Children with LI have less access to these experiences and are therefore expected to fall behind their peers without LI. In this study, 89 preschool children with LI and 202 children without LI (mean age 3 years and 10 months in both groups) were tested on three indices of facial emotion recognition (discrimination, identification, and attribution in emotion-evoking situations). Parents reported on their children's emotion vocabulary and ability to talk about their own emotions. Preschoolers with and without LI performed similarly on the non-verbal emotion discrimination task. Children with LI fell behind their peers without LI on the other two emotion recognition tasks, which involved labelling the four basic emotions (happy, sad, angry, fear). The outcomes of these two tasks were also related to children's level of emotion language. These outcomes emphasize the importance of 'emotion talk' at the youngest age possible for children with LI. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Memory for faces and voices varies as a function of sex and expressed emotion.

    PubMed

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
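    The accuracy measure used in this study (hits minus false alarms) is straightforward to make concrete. The following minimal Python sketch, using hypothetical trial records and illustrative field names rather than the authors' analysis code, computes corrected recognition separately for each emotion category:

```python
# Minimal sketch (hypothetical data): corrected recognition accuracy as
# hits minus false alarms, computed per emotion category.
from collections import defaultdict

def accuracy_by_emotion(trials):
    """trials: list of dicts with keys 'emotion', 'is_old', 'said_old'."""
    counts = defaultdict(lambda: {"hits": 0, "old": 0, "fas": 0, "new": 0})
    for t in trials:
        c = counts[t["emotion"]]
        if t["is_old"]:
            c["old"] += 1
            c["hits"] += t["said_old"]   # correctly calling an old item "old"
        else:
            c["new"] += 1
            c["fas"] += t["said_old"]    # falsely calling a new item "old"
    return {emo: c["hits"] / c["old"] - c["fas"] / c["new"]
            for emo, c in counts.items()}

# Hypothetical example with two emotion categories and four trials.
demo = [
    {"emotion": "neutral", "is_old": True,  "said_old": 1},
    {"emotion": "neutral", "is_old": False, "said_old": 0},
    {"emotion": "anger",   "is_old": True,  "said_old": 0},
    {"emotion": "anger",   "is_old": False, "said_old": 1},
]
print(accuracy_by_emotion(demo))  # e.g. {'neutral': 1.0, 'anger': -1.0}
```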

  1. Memory for faces and voices varies as a function of sex and expressed emotion

    PubMed Central

    Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection (“remember” hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and own-sex bias where female participants displayed memory advantage for female faces and face-voice combinations. Results further suggest that own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy. PMID:28570691

  2. Preliminary evidence that different mechanisms underlie the anger superiority effect in children with and without Autism Spectrum Disorders

    PubMed Central

    Isomura, Tomoko; Ogawa, Shino; Yamada, Satoko; Shibasaki, Masahiro; Masataka, Nobuo

    2014-01-01

    Previous studies have demonstrated that angry faces capture humans' attention more rapidly than emotionally positive faces. This phenomenon is referred to as the anger superiority effect (ASE). Despite atypical emotional processing, adults and children with Autism Spectrum Disorders (ASD) have been reported to show ASE as well as typically developed (TD) individuals. So far, however, few studies have clarified whether or not the mechanisms underlying ASE are the same for both TD and ASD individuals. Here, we tested how TD and ASD children process schematic emotional faces during detection by employing a recognition task in combination with a face-in-the-crowd task. Results of the face-in-the-crowd task revealed the prevalence of ASE both in TD and ASD children. However, the results of the recognition task revealed group differences: In TD children, detection of angry faces required more configural face processing and disrupted the processing of local features. In ASD children, on the other hand, it required more feature-based processing rather than configural processing. Despite the small sample sizes, these findings provide preliminary evidence that children with ASD, in contrast to TD children, show quick detection of angry faces by extracting local features in faces. PMID:24904477

  3. Saccadic movement deficiencies in adults with ADHD tendencies.

    PubMed

    Lee, Yun-Jeong; Lee, Sangil; Chang, Munseon; Kwak, Ho-Wan

    2015-12-01

    The goal of the present study was to explore deficits in gaze detection and emotional value judgment during a saccadic eye movement task in adults with attention deficit/hyperactivity disorder (ADHD) tendencies. Thirty-two participants, 16 with ADHD tendencies and 16 controls, were recruited from a pool of 243 university students. Among the many problems reported in adults with ADHD, our research focused on deficits in the processing of nonverbal cues, such as gaze direction and the emotional value of others' faces. In Experiment 1, a cue display containing a face with emotional value and gaze direction was followed by a target display containing two faces located on the left and right side of the display. The participant's task was to make an anti-saccade opposite to the gaze direction if the cue face was not emotionally neutral. The group with ADHD tendencies made more errors overall than controls in making anti-saccades. Based on the hypothesis that the exposure duration of the cue display in Experiment 1 may have been too long, we presented the cue and target display simultaneously in Experiment 2 to prevent participants from preparing saccades in advance. Participants in Experiment 2 were asked to make either a pro-saccade or an anti-saccade depending on the emotional value of the central cue face. Interestingly, significant group differences were observed for errors of omission and commission. In addition, a significant three-way interaction among group, cue emotion, and target gaze direction suggests that the emotion recognition and gaze control systems might be interconnected. The results also show that adults with ADHD tendencies are more easily distracted by a task-irrelevant gaze direction. Taken together, these results suggest that tasks requiring both response inhibition (anti-saccade) and gaze-emotion recognition might be useful in developing a diagnostic test for discriminating adults with ADHD tendencies from healthy adults.

  4. Emotional Recognition in Autism Spectrum Conditions from Voices and Faces

    ERIC Educational Resources Information Center

    Stewart, Mary E.; McAdam, Clair; Ota, Mitsuhiko; Peppe, Sue; Cleland, Joanne

    2013-01-01

    The present study reports on a new vocal emotion recognition task and assesses whether people with autism spectrum conditions (ASC) perform differently from typically developed individuals on tests of emotional identification from both the face and the voice. The new test of vocal emotion contained trials in which the vocal emotion of the sentence…

  5. Memory for faces with emotional expressions in Alzheimer's disease and healthy older participants: positivity effect is not only due to familiarity.

    PubMed

    Sava, Alina-Alexandra; Krolak-Salmon, Pierre; Delphin-Combe, Floriane; Cloarec, Morgane; Chainay, Hanna

    2017-01-01

    Young individuals better memorize initially seen faces with emotional rather than neutral expressions. Healthy older participants and Alzheimer's disease (AD) patients show better memory for faces with positive expressions. The socioemotional selectivity theory postulates that this positivity effect in memory reflects a general age-related preference for positive stimuli, subserving emotion regulation. Another explanation might be that older participants use compensatory strategies, often considering happy faces as previously seen. The question about the existence of this effect in tasks not permitting such compensatory strategies is still open. Thus, we compared the performance of healthy participants and AD patients for positive, neutral, and negative faces in such tasks. Healthy older participants and AD patients showed a positivity effect in memory, but there was no difference between emotional and neutral faces in young participants. Our results suggest that the positivity effect in memory is not entirely due to the sense of familiarity for smiling faces.

  6. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    ERIC Educational Resources Information Center

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  7. Alterations in neural processing of emotional faces in adolescent anorexia nervosa patients - an event-related potential study.

    PubMed

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Bartling, Jürgen; Schulte-Körne, Gerd; Dieler, Alica C

    2016-09-01

    The present study explored the neurophysiological correlates of perception and recognition of emotional facial expressions in adolescent anorexia nervosa (AN) patients using event-related potentials (ERPs). We included 20 adolescent girls with AN and 24 healthy girls and recorded ERPs during a passive viewing task and three active tasks requiring processing of emotional faces in varying processing depths; one of the tasks also assessed emotion recognition abilities behaviourally. Despite the absence of behavioural differences, we found that across all tasks AN patients exhibited a less pronounced early posterior negativity (EPN) in response to all facial expressions compared to controls. The EPN is an ERP component reflecting an automatic, perceptual processing stage which is modulated by the intrinsic salience of a stimulus. Hence, the less pronounced EPN in anorexic girls suggests that they might perceive other people's faces as less intrinsically relevant, i.e. as less "important" than do healthy girls. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Valence Specific Laterality Effects in Free Viewing Conditions: The Role of Expectancy and Gender of Image

    ERIC Educational Resources Information Center

    Stafford, Lorenzo D.; Brandaro, Nicola

    2010-01-01

    Recent research has looked at whether the expectancy of an emotion can account for subsequent valence specific laterality effects of prosodic emotion, though no research has examined this effect for facial emotion. In the study here (n = 58), we investigated this issue using two tasks; an emotional face perception task and a novel word task that…

  9. Acute tryptophan depletion attenuates conscious appraisal of social emotional signals in healthy female volunteers.

    PubMed

    Beacher, Felix D C C; Gray, Marcus A; Minati, Ludovico; Whale, Richard; Harrison, Neil A; Critchley, Hugo D

    2011-02-01

    Acute tryptophan depletion (ATD) decreases levels of central serotonin. ATD thus enables the cognitive effects of serotonin to be studied, with implications for the understanding of psychiatric conditions, including depression. The aim was to determine the role of serotonin in conscious (explicit) and unconscious/incidental processing of emotional information. A randomized, double-blind, cross-over design was used with 15 healthy female participants. Subjective mood was recorded at baseline and after 4 h, when participants performed an explicit emotional face processing task, and a task eliciting unconscious processing of emotionally aversive and neutral images presented subliminally using backward masking. ATD was associated with a robust reduction in plasma tryptophan at 4 h but had no effect on mood or autonomic physiology. ATD was associated with significantly lower attractiveness ratings for happy faces and attenuation of intensity/arousal ratings of angry faces. ATD also reduced overall reaction times on the unconscious perception task, but there was no interaction with the emotional content of the masked stimuli. ATD did not affect breakthrough perception (accuracy in identification) of masked images. ATD attenuates the attractiveness of positive faces and the negative intensity of threatening faces, suggesting that serotonin contributes specifically to the appraisal of the salience of both positive and negative social emotional cues. We found no evidence that serotonin affects unconscious processing of negative emotional stimuli. These novel findings implicate serotonin in conscious aspects of active social and behavioural engagement and extend knowledge regarding the effects of ATD on emotional perception.

  10. Selective Attention to Emotional Stimuli: What IQ and Openness Do, and Emotional Intelligence Does Not

    ERIC Educational Resources Information Center

    Fiori, Marina; Antonakis, John

    2012-01-01

    We examined how general intelligence, personality, and emotional intelligence--measured as an ability using the MSCEIT--predicted performance on a selective-attention task requiring participants to ignore distracting emotion information. We used a visual prime in which participants saw a pair of faces depicting emotions; their task was to focus on…

  11. Emotion regulation in social anxiety disorder: behavioral and neural responses to three socio-emotional tasks

    PubMed Central

    2013-01-01

    Background Social anxiety disorder (SAD) is thought to involve deficits in emotion regulation, and more specifically, deficits in cognitive reappraisal. However, evidence for such deficits is mixed. Methods Using functional magnetic resonance imaging (fMRI) of blood oxygen-level dependent (BOLD) signal, we examined reappraisal-related behavioral and neural responses in 27 participants with generalized SAD and 27 healthy controls (HC) during three socio-emotional tasks: (1) looming harsh faces (Faces); (2) videotaped actors delivering social criticism (Criticism); and (3) written autobiographical negative self-beliefs (Beliefs). Results Behaviorally, compared to HC, participants with SAD had lesser reappraisal-related reduction in negative emotion in the Beliefs task. Neurally, compared to HC, participants with SAD had lesser BOLD responses in reappraisal-related brain regions when reappraising faces, in visual and attention related regions when reappraising criticism, and in the left superior temporal gyrus when reappraising beliefs. Examination of the temporal dynamics of BOLD responses revealed late reappraisal-related increased responses in HC, compared to SAD. In addition, the dorsomedial prefrontal cortex (DMPFC), which showed reappraisal-related increased activity in both groups, had similar temporal dynamics in SAD and HC during the Faces and Criticism tasks, but greater late response increases in HC, compared to SAD, during the Beliefs task. Reappraisal-related greater late DMPFC responses were associated with greater percent reduction in negative emotion ratings in SAD patients. Conclusions These results suggest a dysfunction of cognitive reappraisal in SAD patients, with overall reduced late brain responses in prefrontal regions, particularly when reappraising faces. Decreased late activity in the DMPFC might be associated with deficient reappraisal and greater negative reactivity. Trial registration ClinicalTrials.gov identifier: NCT00380731 PMID:24517388

  12. Culture modulates the brain response to human expressions of emotion: electrophysiological evidence.

    PubMed

    Liu, Pan; Rigoulot, Simon; Pell, Marc D

    2015-01-01

    To understand how culture modulates on-line neural responses to social information, this study compared how individuals from two distinct cultural groups, English-speaking North Americans and Chinese, process emotional meanings of multi-sensory stimuli as indexed by both behaviour (accuracy) and event-related potential (N400) measures. In an emotional Stroop-like task, participants were presented face-voice pairs expressing congruent or incongruent emotions in conditions where they judged the emotion of one modality while ignoring the other (face or voice focus task). Results indicated that while both groups were sensitive to emotional differences between channels (with lower accuracy and higher N400 amplitudes for incongruent face-voice pairs), there were marked group differences in how intruding facial or vocal cues affected accuracy and N400 amplitudes, with English participants showing greater interference from irrelevant faces than Chinese. Our data illuminate distinct biases in how adults from East Asian versus Western cultures process socio-emotional cues, supplying new evidence that cultural learning modulates not only behaviour, but the neurocognitive response to different features of multi-channel emotion expressions. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure

    PubMed Central

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

    Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure continues to be controversial. The aim of the present study was to investigate attentional bias in facial recognition after short term exposure to violent video games and to characterize the neural correlates of this effect. In order to accomplish this, participants were exposed to either neutral or violent video games for 25 min and then event-related potentials (ERPs) were recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. Our results found a significant presence of the ERP component, N2pc, during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure; however, a weak N2pc response was observed after neutral video game exposure. These results provided only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces. PMID:28249033

  14. Facilitation or disengagement? Attention bias in facial affect processing after short-term violent video game exposure.

    PubMed

    Liu, Yanling; Lan, Haiying; Teng, Zhaojun; Guo, Cheng; Yao, Dezhong

    2017-01-01

    Previous research has been inconsistent on whether violent video games exert positive and/or negative effects on cognition. In particular, attentional bias in facial affect processing after violent video game exposure continues to be controversial. The aim of the present study was to investigate attentional bias in facial recognition after short term exposure to violent video games and to characterize the neural correlates of this effect. In order to accomplish this, participants were exposed to either neutral or violent video games for 25 min and then event-related potentials (ERPs) were recorded during two emotional search tasks. The first search task assessed attentional facilitation, in which participants were required to identify an emotional face from a crowd of neutral faces. In contrast, the second task measured disengagement, in which participants were required to identify a neutral face from a crowd of emotional faces. Our results found a significant presence of the ERP component, N2pc, during the facilitation task; however, no differences were observed between the two video game groups. This finding does not support a link between attentional facilitation and violent video game exposure. Comparatively, during the disengagement task, N2pc responses were not observed when participants viewed happy faces following violent video game exposure; however, a weak N2pc response was observed after neutral video game exposure. These results provided only inconsistent support for the disengagement hypothesis, suggesting that participants found it difficult to separate a neutral face from a crowd of emotional faces.

  15. Alcoholism and Dampened Temporal Limbic Activation to Emotional Faces

    PubMed Central

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O’Reilly, Cara E.; Howard, Julie A.; Sawyer, Kayle; Harris, Gordon J.

    2013-01-01

    Background Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Methods Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Results Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Conclusions Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations. PMID:19673745

  16. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing

    PubMed Central

    Hsu, Chun-Wei; Goh, Joshua O. S.

    2016-01-01

    When comparing between the values of different choices, human beings can rely on either more cognitive processes, such as using mathematical computation, or more affective processes, such as using emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466

  17. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing.

    PubMed

    Hsu, Chun-Wei; Goh, Joshua O S

    2016-01-01

    When comparing between the values of different choices, human beings can rely on either more cognitive processes, such as using mathematical computation, or more affective processes, such as using emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes.

  18. Emotion Recognition in Faces and the Use of Visual Context in Young People with High-Functioning Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Wright, Barry; Clarke, Natalie; Jordan, Jo; Young, Andrew W.; Clarke, Paula; Miles, Jeremy; Nation, Kate; Clarke, Leesa; Williams, Christine

    2008-01-01

    We compared young people with high-functioning autism spectrum disorders (ASDs) with age, sex and IQ matched controls on emotion recognition of faces and pictorial context. Each participant completed two tests of emotion recognition. The first used Ekman series faces. The second used facial expressions in visual context. A control task involved…

  19. Risk for bipolar disorder is associated with face-processing deficits across emotions.

    PubMed

    Brotman, Melissa A; Skup, Martha; Rich, Brendan A; Blair, Karina S; Pine, Daniel S; Blair, James R; Leibenluft, Ellen

    2008-12-01

    Youths with euthymic bipolar disorder (BD) have a deficit in face-emotion labeling that is present across multiple emotions. Recent research indicates that youths at familial risk for BD, but without a history of mood disorder, also have a deficit in face-emotion labeling, suggesting that such impairments may be an endophenotype for BD. It is unclear whether this deficit in at-risk youths is present across all emotions or if the impairment presents initially as an emotion-specific dysfunction that then generalizes to other emotions as the symptoms of BD become manifest. Thirty-seven patients with pediatric BD, 25 unaffected children with a first-degree relative with BD, and 36 typically developing youths were administered the Emotional Expression Multimorph Task, a computerized behavioral task, which presents gradations of facial emotions from 100% neutrality to 100% emotional expression (happiness, surprise, fear, sadness, anger, and disgust). Repeated-measures analysis of covariance revealed that, compared with the control youths, the patients and the at-risk youths required significantly more intense emotional information to identify and correctly label face emotions. The patients with BD and the at-risk youths did not differ from each other. Group-by-emotion interactions were not significant, indicating that the group effects did not differ based on the facial emotion. The youths at risk for BD demonstrate nonspecific deficits in face-emotion recognition, similar to patients with the illness. Further research is needed to determine whether such deficits meet all the criteria for an endophenotype.
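    The Emotional Expression Multimorph Task described above presents graded blends between a neutral face and a fully emotional expression. As a rough illustration only (the published task uses professionally prepared morph stimuli, not this code), a toy pixel-wise interpolation in Python conveys the idea of stepping emotional intensity from 0% to 100%:

```python
# Minimal sketch (assumed, not the authors' software): a toy pixel-wise morph
# continuum from a neutral face image to a full emotional expression. Real
# morphing software also warps facial landmarks, so this only illustrates the
# notion of graded emotional intensity.
import numpy as np

def morph_continuum(neutral, emotional, steps=10):
    """neutral, emotional: float arrays of identical shape (aligned face images)."""
    levels = np.linspace(0.0, 1.0, steps + 1)  # 0.0 = fully neutral, 1.0 = full emotion
    return [(lvl, (1.0 - lvl) * neutral + lvl * emotional) for lvl in levels]

# Hypothetical 2x2 "images" just to demonstrate the call.
neutral_img = np.zeros((2, 2))
emotion_img = np.ones((2, 2))
for level, frame in morph_continuum(neutral_img, emotion_img, steps=4):
    print(f"{level:.0%} emotional intensity:", frame.ravel())
```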

  20. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    PubMed

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-years old), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes, and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  1. Positive and negative emotional contexts unevenly predict episodic memory.

    PubMed

    Martínez-Galindo, Joyce Graciela; Cansino, Selene

    2015-09-15

    The aim of this study was to investigate whether the recognition of faces with neutral expressions differs when they are encoded under different emotional contexts (positive, negative or non-emotional). The effects of the emotional valence context on the subsequent memory effect (SME) and the autonomic responses were also examined. Twenty-eight participants performed a betting-game task in which the faces of their virtual opponents were presented in each trial. The probability of winning or losing was manipulated to generate positive or negative contexts, respectively. Additionally, the participants performed the same task without betting as a non-emotional condition. After the encoding phase, an old/new paradigm was performed for the faces of the virtual opponents. Recognition was better for faces encoded in positive contexts than for faces encoded in non-emotional contexts. The skin conductance response amplitude was equivalent for both of the emotional contexts. The N170 and P300 components at occipital sites and the frontal slow wave manifested SMEs that were modulated by positive contexts; neither negative nor non-emotional contexts influenced these effects. The behavioral and neurophysiological data demonstrated that positive contexts are stronger predictors of episodic memory than negative or non-emotional contexts. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. An fMRI study of facial emotion processing in patients with schizophrenia.

    PubMed

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  3. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    PubMed

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Using an emotional saccade task to characterize executive functioning and emotion processing in attention-deficit hyperactivity disorder and bipolar disorder.

    PubMed

    Yep, Rachel; Soncin, Stephen; Brien, Donald C; Coe, Brian C; Marin, Alina; Munoz, Douglas P

    2018-04-23

    Despite distinct diagnostic criteria, attention-deficit hyperactivity disorder (ADHD) and bipolar disorder (BD) share cognitive and emotion processing deficits that complicate diagnoses. The goal of this study was to use an emotional saccade task to characterize executive functioning and emotion processing in adult ADHD and BD. Participants (21 control, 20 ADHD, 20 BD) performed an interleaved pro/antisaccade task (look toward vs. look away from a visual target, respectively) in which the sex of emotional face stimuli acted as the cue to perform either the pro- or antisaccade. Both patient groups made more direction (erroneous prosaccades on antisaccade trials) and anticipatory (saccades made before cue processing) errors than controls. Controls exhibited lower microsaccade rates preceding correct anti- vs. prosaccade initiation, but this task-related modulation was absent in both patient groups. Regarding emotion processing, the ADHD group performed worse than controls on neutral face trials, while the BD group performed worse than controls on trials presenting faces of all valence. These findings support the role of fronto-striatal circuitry in mediating response inhibition deficits in both ADHD and BD, and suggest that such deficits are exacerbated in BD during emotion processing, presumably via dysregulated limbic system circuitry involving the anterior cingulate and orbitofrontal cortex. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Attentional bias to emotional stimuli is altered during moderate- but not high-intensity exercise.

    PubMed

    Tian, Qu; Smith, J Carson

    2011-12-01

    Little is known regarding how attention to emotional stimuli is affected during simultaneously performed exercise. Attentional biases to emotional face stimuli were assessed in 34 college students (17 women) using the dot-probe task during counterbalanced conditions of moderate- (heart rate at 45% peak oxygen consumption) and high-intensity exercise (heart rate at 80% peak oxygen consumption) compared with seated rest. The dot-probe task consisted of 1 emotional face (pleasant or unpleasant) paired with a neutral face for 1,000 ms; 256 trials (128 trials for each valence) were presented during each condition. Each condition lasted approximately 10 min. Participants were instructed to perform each trial of the dot-probe task as quickly and accurately as possible during the exercise and rest conditions. During moderate-intensity exercise, participants exhibited significantly greater attentional bias scores to pleasant compared with unpleasant faces (p < .01), whereas attentional bias scores to emotional faces did not differ at rest or during high-intensity exercise (p > .05). In addition, the attentional bias to unpleasant faces was significantly reduced during moderate-intensity exercise compared with that during rest (p < .05). These results provide behavioral evidence that during exercise at a moderate intensity, there is a shift in attention allocation toward pleasant emotional stimuli and away from unpleasant emotional stimuli. Future work is needed to determine whether acute exercise may be an effective treatment approach to reduce negative bias or enhance positive bias in individuals diagnosed with mood or anxiety disorders, or whether attentional bias during exercise predicts adherence to exercise. (c) 2011 APA, all rights reserved.
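    Dot-probe attentional bias scores of the kind reported here are conventionally computed as the mean reaction time on incongruent trials (probe replacing the neutral face) minus the mean reaction time on congruent trials (probe replacing the emotional face), with positive values indicating attention drawn toward the emotional face. A minimal Python sketch with hypothetical data, not the authors' analysis code, is shown below:

```python
# Minimal sketch (hypothetical data): dot-probe attentional bias score
# per valence, as mean incongruent RT minus mean congruent RT.
from statistics import mean

def bias_score(trials, valence):
    """trials: list of dicts with keys 'valence', 'congruent', 'rt_ms'."""
    congruent   = [t["rt_ms"] for t in trials if t["valence"] == valence and t["congruent"]]
    incongruent = [t["rt_ms"] for t in trials if t["valence"] == valence and not t["congruent"]]
    return mean(incongruent) - mean(congruent)

demo = [
    {"valence": "pleasant",   "congruent": True,  "rt_ms": 480},
    {"valence": "pleasant",   "congruent": False, "rt_ms": 510},
    {"valence": "unpleasant", "congruent": True,  "rt_ms": 505},
    {"valence": "unpleasant", "congruent": False, "rt_ms": 500},
]
print(bias_score(demo, "pleasant"))    # 30 ms -> bias toward pleasant faces
print(bias_score(demo, "unpleasant"))  # -5 ms -> slight bias away from unpleasant faces
```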

  6. Differential amygdala response during facial recognition in patients with schizophrenia: an fMRI study.

    PubMed

    Kosaka, H; Omori, M; Murata, T; Iidaka, T; Yamada, H; Okada, T; Takahashi, T; Sadato, N; Itoh, H; Yonekura, Y; Wada, Y

    2002-09-01

    Human lesion or neuroimaging studies suggest that the amygdala is involved in facial emotion recognition. Although impairments in recognition of facial and/or emotional expression have been reported in schizophrenia, there are few neuroimaging studies that have examined differential brain activation during facial recognition between patients with schizophrenia and normal controls. To investigate amygdala responses during facial recognition in schizophrenia, we conducted a functional magnetic resonance imaging (fMRI) study with 12 right-handed medicated patients with schizophrenia and 12 age- and sex-matched healthy controls. The experimental task was an emotional intensity judgment task. During the task period, subjects viewed happy (or angry/disgusting/sad) and neutral faces presented simultaneously every 3 s and judged which face was more emotional (positive or negative face discrimination). Imaging data were analyzed on a voxel-by-voxel basis for single-group analysis and for between-group analysis according to the random effect model using Statistical Parametric Mapping (SPM). No significant difference in task accuracy was found between the schizophrenic and control groups. Positive face discrimination activated the bilateral amygdalae in both controls and schizophrenics, with more prominent activation of the right amygdala in the schizophrenic group. Negative face discrimination activated the bilateral amygdalae in the schizophrenic group but only the right amygdala in the control group, although no significant group difference was found. Exaggerated amygdala activation during emotional intensity judgment found in the schizophrenic patients may reflect impaired gating of sensory input containing emotion. Copyright 2002 Elsevier Science B.V.

  7. Socio-cognitive load and social anxiety in an emotional anti-saccade task

    PubMed Central

    Butler, Stephen H.; Grealy, Madeleine A.

    2018-01-01

    The anti-saccade task has been used to measure attentional control related to general anxiety but less so with social anxiety specifically. Previous research has not been conclusive in suggesting that social anxiety may lead to difficulties in inhibiting faces. It is possible that static face paradigms do not convey a sufficient social threat to elicit an inhibitory response in socially anxious individuals. The aim of the current study was twofold. We investigated the effect of social anxiety on performance in an anti-saccade task with neutral or emotional faces preceded either by a social stressor (Experiment 1), or valenced sentence primes designed to increase the social salience of the task (Experiment 2). Our results indicated that latencies were significantly longer for happy than angry faces. Additionally, and surprisingly, high anxious participants made more erroneous anti-saccades to neutral than angry and happy faces, whilst the low anxious groups exhibited a trend in the opposite direction. Results are consistent with a general approach-avoidance response for positive and threatening social information. However increased socio-cognitive load may alter attentional control with high anxious individuals avoiding emotional faces, but finding it more difficult to inhibit ambiguous faces. The effects of social sentence primes on attention appear to be subtle but suggest that the anti-saccade task will only elicit socially relevant responses where the paradigm is more ecologically valid. PMID:29795619

  8. Effects on automatic attention due to exposure to pictures of emotional faces while performing Chinese word judgment tasks.

    PubMed

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low or high demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that reaction times were longer and performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early ERP component around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites were significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attentional resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces can automatically capture attention if residual attentional resources are available under the unattended condition.
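    The P2 measure described above is a mean ERP amplitude taken in a time window around 176 ms over frontal-central-parietal electrodes. The following Python sketch shows one conventional way to compute such a windowed mean from epoched data; the channel names, window limits, and data are illustrative assumptions, not the authors' pipeline:

```python
# Minimal sketch (hypothetical data): mean ERP amplitude in a window centred
# near ~176 ms, averaged over a chosen set of electrodes.
import numpy as np

def mean_window_amplitude(epochs, times_ms, channels, picks, window=(150, 200)):
    """epochs: array (n_trials, n_channels, n_samples) of voltages in microvolts."""
    t_mask = (times_ms >= window[0]) & (times_ms <= window[1])   # samples in the window
    ch_mask = np.isin(channels, picks)                           # selected electrodes
    return epochs[:, ch_mask][:, :, t_mask].mean()

# Hypothetical data: 20 trials, 3 channels, samples spanning -100..699 ms at 1 kHz.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(20, 3, 800))
times_ms = np.arange(-100, 700)
channels = np.array(["Fz", "Cz", "Pz"])
print(mean_window_amplitude(epochs, times_ms, channels, ["Fz", "Cz", "Pz"]))
```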

  9. Effects on Automatic Attention Due to Exposure to Pictures of Emotional Faces while Performing Chinese Word Judgment Tasks

    PubMed Central

    Junhong, Huang; Renlai, Zhou; Senqi, Hu

    2013-01-01

    Two experiments were conducted to investigate the automatic processing of emotional facial expressions while performing low- or high-demand cognitive tasks under unattended conditions. In Experiment 1, 35 subjects performed low (judging the structure of Chinese words) and high (judging the tone of Chinese words) cognitive load tasks while exposed to unattended pictures of fearful, neutral, or happy faces. The results revealed that reaction time was slower and performance accuracy was higher while performing the low cognitive load task than while performing the high cognitive load task. Exposure to fearful faces resulted in significantly longer reaction times and lower accuracy than exposure to neutral faces on the low cognitive load task. In Experiment 2, 26 subjects performed the same word judgment tasks and their brain event-related potentials (ERPs) were measured for a period of 800 ms after the onset of the task stimulus. The amplitudes of the early ERP component around 176 ms (P2) elicited by unattended fearful faces over frontal-central-parietal recording sites were significantly larger than those elicited by unattended neutral faces while performing the word structure judgment task. Together, the findings of the two experiments indicated that unattended fearful faces captured significantly more attention resources than unattended neutral faces on a low cognitive load task, but not on a high cognitive load task. It was concluded that fearful faces can automatically capture attention if residual attention resources are available under the unattended condition. PMID:24124486

  10. Facial Emotion Recognition Performance Differentiates Between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    PubMed

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Kressig, Reto W; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    Misdiagnosis of early behavioral variant frontotemporal dementia (bvFTD) as major depressive disorder (MDD) is not uncommon due to overlapping symptoms. The aim of this study was to improve the discrimination between these disorders using a novel facial emotion perception task. In this prospective cohort study (July 2013-March 2016), we compared 25 patients meeting Rascovsky diagnostic criteria for bvFTD, 20 patients meeting DSM-IV criteria for MDD, 21 patients meeting McKhann diagnostic criteria for Alzheimer's disease dementia, and 31 healthy participants on a novel emotion intensity rating task comprising morphed low-intensity facial stimuli. Participants were asked to rate the intensity of morphed faces on the congruent basic emotion (e.g., rating sadness when a sad face is shown) and on the 5 incongruent basic emotions (e.g., rating each of the other basic emotions when a sad face is shown). While bvFTD patients underrated congruent emotions (P < .01), they also overrated incongruent emotions (P < .001), resulting in confusion of facial emotions. In contrast, MDD patients overrated congruent negative facial emotions (P < .001), but not incongruent facial emotions. Accordingly, ratings of congruent and incongruent emotions highly discriminated between bvFTD and MDD patients, with areas under the curve (AUC) ranging from 93% to 98%. Further, an almost complete discrimination (AUC = 99%) was achieved by contrasting the 2 rating types. In contrast, Alzheimer's disease dementia patients perceived emotions similarly to healthy participants, indicating no impact of cognitive impairment on rating scores. Our congruent and incongruent facial emotion intensity rating task allows a detailed assessment of facial emotion perception in patient populations. By using this simple task, we achieved an almost complete discrimination between bvFTD and MDD, potentially helping improve diagnostic certainty in early bvFTD. © Copyright 2018 Physicians Postgraduate Press, Inc.
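
    As an illustrative sketch only (not the study's code), the following Python snippet shows how an area under the ROC curve (AUC) for discriminating two diagnostic groups from a rating-based score could be computed; the scores and group sizes are hypothetical:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        # Hypothetical per-participant scores (e.g., a congruent-minus-incongruent rating contrast).
        bvftd_scores = np.array([-1.2, -0.8, -1.5, -0.9, -1.1])
        mdd_scores   = np.array([ 0.6,  0.9,  0.4,  1.1,  0.7])

        scores = np.concatenate([bvftd_scores, mdd_scores])
        labels = np.array([0] * len(bvftd_scores) + [1] * len(mdd_scores))   # 0 = bvFTD, 1 = MDD
        print("AUC:", roc_auc_score(labels, scores))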

  11. Childhood anxiety and attention to emotion faces in a modified stroop task.

    PubMed

    Hadwin, Julie A; Donnelly, Nick; Richards, Anne; French, Christopher C; Patel, Umang

    2009-06-01

    This study used an emotional face Stroop task to investigate the effects of self-reported trait anxiety, social concern (SC), and chronological age (CA) on reaction time to match coloured outlines of angry, happy, and neutral faces (and control faces with scrambled features) with coloured buttons in a community sample of 74 children aged 6-12 years. The results showed interference with colour matching for angry (relative to neutral) faces in children with elevated SC. The same effect was not found for happy or control faces. In addition, the results suggest that selective attention to angry faces in children with elevated SC was not significantly moderated by age.
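
    A minimal, hypothetical sketch (not taken from the study) of the interference index implied above, i.e., mean colour-matching RT for angry-face trials minus mean RT for neutral-face trials:

        from statistics import mean

        angry_rts   = [812, 790, 845, 828]   # hypothetical colour-matching RTs (ms), angry-face trials
        neutral_rts = [760, 772, 755, 781]   # hypothetical colour-matching RTs (ms), neutral-face trials

        interference = mean(angry_rts) - mean(neutral_rts)
        print(f"Angry-minus-neutral interference: {interference:.1f} ms")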

  12. One size does not fit all: face emotion processing impairments in semantic dementia, behavioural-variant frontotemporal dementia and Alzheimer's disease are mediated by distinct cognitive deficits.

    PubMed

    Miller, Laurie A; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R; Piguet, Olivier

    2012-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and Emotion Selection) and found that all three patient groups were similarly impaired. Analyses of covariance employed to partial out the influences of language and perceptual impairments, which frequently co-occur in these patients, provided evidence of different underlying cognitive mechanisms. These analyses revealed that language impairments explained the original poor scores obtained by the SD patients on the Ekman 60 and Emotion Selection tasks, which involve verbal labels. Perceptual deficits contributed to Emotion Matching performance in the bvFTD and AD patients. Importantly, all groups remained impaired on one task or more following these analyses, denoting a primary emotion processing disturbance in these dementia syndromes. These findings highlight the multifactorial nature of emotion processing deficits in patients with dementia.

  13. Emotional Processing of Personally Familiar Faces in the Vegetative State

    PubMed Central

    Sharon, Haggai; Pasternak, Yotam; Ben Simon, Eti; Gruberger, Michal; Giladi, Nir; Krimchanski, Ben Zion; Hassin, David; Hendler, Talma

    2013-01-01

    Background: The Vegetative State (VS) is a severe disorder of consciousness in which patients are awake but display no signs of awareness. Yet, recent functional magnetic resonance imaging (fMRI) studies have demonstrated evidence for covert awareness in VS patients by recording specific brain activations during a cognitive task. However, the possible existence of incommunicable subjective emotional experiences in VS patients remains largely unexplored. This study aimed to probe whether VS patients retain the ability to selectively process external stimuli according to their emotional value, and to look for evidence of covert emotional awareness in these patients. Methods and Findings: In order to explore these questions we employed the emotive impact of observing personally familiar faces, known to provoke specific perceptual as well as emotional brain activations. Four VS patients and thirteen healthy controls first underwent an fMRI scan while viewing pictures of non-familiar faces, personally familiar faces, and pictures of themselves. In a subsequent imagery task participants were asked to actively imagine one of their parents' faces. Analyses focused on face- and familiarity-selective regional brain activations and inter-regional functional connectivity. Similar to controls, all patients displayed face-selective brain responses, with further limbic and cortical activations elicited by familiar faces. In patients as well as controls, connectivity was observed between emotional, visual, and face-specific areas, suggesting aware emotional perception. This connectivity was strongest in the two patients who later recovered. Notably, these two patients also displayed selective amygdala activation during familiar face imagery, with one further exhibiting face-selective activations indistinguishable from healthy controls. Conclusions: Taken together, these results show that selective emotional processing can be elicited in VS patients both by external emotionally salient stimuli and by internal cognitive processes, suggesting the capacity for covert emotional awareness of self and the environment in VS patients. PMID:24086365

  14. Acute tryptophan depletion attenuates conscious appraisal of social emotional signals in healthy female volunteers

    PubMed Central

    Gray, Marcus A.; Minati, Ludovico; Whale, Richard; Harrison, Neil A.; Critchley, Hugo D.

    2010-01-01

    Rationale: Acute tryptophan depletion (ATD) decreases levels of central serotonin. ATD thus enables the cognitive effects of serotonin to be studied, with implications for the understanding of psychiatric conditions, including depression. Objective: To determine the role of serotonin in conscious (explicit) and unconscious/incidental processing of emotional information. Materials and methods: A randomized, double-blind, cross-over design was used with 15 healthy female participants. Subjective mood was recorded at baseline and after 4 h, when participants performed an explicit emotional face processing task and a task eliciting unconscious processing of emotionally aversive and neutral images presented subliminally using backward masking. Results: ATD was associated with a robust reduction in plasma tryptophan at 4 h but had no effect on mood or autonomic physiology. ATD was associated with significantly lower attractiveness ratings for happy faces and attenuation of intensity/arousal ratings of angry faces. ATD also reduced overall reaction times on the unconscious perception task, but there was no interaction with the emotional content of masked stimuli. ATD did not affect breakthrough perception (accuracy in identification) of masked images. Conclusions: ATD attenuates the attractiveness of positive faces and the negative intensity of threatening faces, suggesting that serotonin contributes specifically to the appraisal of the salience of both positive and negative social emotional cues. We found no evidence that serotonin affects unconscious processing of negative emotional stimuli. These novel findings implicate serotonin in conscious aspects of active social and behavioural engagement and extend knowledge regarding the effects of ATD on emotional perception. PMID:20596858

  15. Improved emotional conflict control triggered by the processing priority of negative emotion.

    PubMed

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas involved in emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas were negatively coupled with the dorsolateral prefrontal cortex (DLPFC). In contrast, a significant emotional conflict effect was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC was negatively coupled mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operates differently for negative and positive faces: it is implemented more efficiently in the negative face condition, whereas it is more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict.

  16. Improved emotional conflict control triggered by the processing priority of negative emotion

    PubMed Central

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-01-01

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated the effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas involved in emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas were negatively coupled with the dorsolateral prefrontal cortex (DLPFC). In contrast, a significant emotional conflict effect was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC was negatively coupled mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operates differently for negative and positive faces: it is implemented more efficiently in the negative face condition, whereas it is more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict. PMID:27086908

  17. State-dependent alterations in inhibitory control and emotional face identification in seasonal affective disorder.

    PubMed

    Hjordt, Liv V; Stenbæk, Dea S; Madsen, Kathrine Skak; Mc Mahon, Brenda; Jensen, Christian G; Vestergaard, Martin; Hageman, Ida; Meder, David; Hasselbalch, Steen G; Knudsen, Gitte M

    2017-04-01

    Depressed individuals often exhibit impaired inhibition of negative input and impaired identification of positive stimuli, but it is unclear whether this is a state or trait feature. We here exploited a naturalistic model, namely individuals with seasonal affective disorder (SAD), to study this feature longitudinally. The goal of this study was to examine seasonal changes in inhibitory control and identification of emotional faces in individuals with SAD. Twenty-nine individuals diagnosed with winter-SAD and 30 demographically matched controls with no seasonality symptoms completed an emotional Go/NoGo task, requiring inhibition of prepotent responses to emotional facial expressions, and an emotional face identification task twice, once in winter and once in summer. In winter, individuals with SAD showed an impaired ability to inhibit responses to angry (p = .0006) and sad faces (p = .011), and decreased identification of happy faces (p = .032) compared with controls. In summer, individuals with SAD and controls performed similarly on these tasks (ps > .24). We provide novel evidence that inhibition of angry and sad faces and identification of happy faces are impaired in SAD in the symptomatic phase, but not in the remitted phase. These affective biases in cognitive processing constitute state-dependent features of SAD. Our data show that reinstatement of normal affective cognition should be possible and would constitute a major goal of psychiatric treatment to improve the quality of life of these patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. Faces in-between: evaluations reflect the interplay of facial features and task-dependent fluency.

    PubMed

    Winkielman, Piotr; Olszanowski, Michal; Gola, Mateusz

    2015-04-01

    Facial features influence social evaluations. For example, faces are rated as more attractive and trustworthy when they have more smiling features and also more female features. However, the influence of facial features on evaluations should be qualified by the affective consequences of the fluency (cognitive ease) with which such features are processed. Further, fluency (along with its affective consequences) should depend on whether the current task highlights conflict between specific features. Four experiments are presented. In 3 experiments, participants saw faces varying in expression, ranging from pure anger, through mixed expressions, to pure happiness. Perceivers first categorized faces either on a control dimension or on an emotional dimension (angry/happy). Thus, the emotional categorization task made "pure" expressions fluent and "mixed" expressions disfluent. Next, participants made social evaluations. Results show that after emotional categorization, but not control categorization, targets with mixed expressions are relatively devalued. Further, this effect is mediated by categorization disfluency. Additional data from facial electromyography reveal that on a basic physiological level, affective devaluation of mixed expressions is driven by their objective ambiguity. The fourth experiment shows that the relative devaluation of mixed faces varying in gender ambiguity requires a gender categorization task. Overall, these studies highlight that the impact of facial features on evaluation is qualified by their fluency, and that the fluency of features is a function of the current task. The discussion highlights the implications of these findings for research on emotional reactions to ambiguity. (c) 2015 APA, all rights reserved.

  19. Regional gray matter density associated with emotional conflict resolution: evidence from voxel-based morphometry.

    PubMed

    Deng, Z; Wei, D; Xue, S; Du, X; Hitchman, G; Qiu, J

    2014-09-05

    Successful emotion regulation is a fundamental prerequisite for well-being and dysregulation may lead to psychopathology. The ability to inhibit spontaneous emotions while behaving in accordance with desired goals is an important dimension of emotion regulation and can be measured using emotional conflict resolution tasks. Few studies have investigated the gray matter correlates underlying successful emotional conflict resolution at the whole-brain level. We had 190 adults complete an emotional conflict resolution task (face-word task) and examined the brain regions significantly correlated with successful emotional conflict resolution using voxel-based morphometry. We found successful emotional conflict resolution was associated with increased regional gray matter density in widely distributed brain regions. These regions included the dorsal anterior cingulate/dorsal medial prefrontal cortex, ventral medial prefrontal cortex, supplementary motor area, amygdala, ventral striatum, precuneus, posterior cingulate cortex, inferior parietal lobule, superior temporal gyrus and fusiform face area. Together, our results indicate that individual differences in emotional conflict resolution ability may be attributed to regional structural differences across widely distributed brain regions. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  20. Face-memory and emotion: associations with major depression in children and adolescents.

    PubMed

    Pine, Daniel S; Lissek, Shmuel; Klein, Rachel G; Mannuzza, Salvatore; Moulton, John L; Guardino, Mary; Woldehawariat, Girma

    2004-10-01

    Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and memory performance in offspring. Subjects were 152 offspring (ages 9-19) of adults with either MDD, anxiety disorders, both MDD and anxiety, or no disorder. Parents and offspring were assessed for mental disorders. Collection of face-memory data was blind to offspring and parent diagnosis. A computerized task was developed that required rating of facial photographs depicting 'happy,' 'fearful,' or 'angry' emotions followed by a memory recall test. Recall accuracy was examined as a function of face-emotion type. Age and gender independently predicted memory, with better recall in older and female subjects. Controlling for age and gender, offspring with a history of MDD (n = 19) demonstrated significant deficits in memory selectively for fearful faces, but not happy or angry faces. Parental MDD was not associated with face-memory accuracy. This study found an association between MDD in childhood or adolescence and perturbed encoding of fearful faces. MDD in young individuals may predispose to subtle anomalies in a neural circuit encompassing the amygdala, a brain region implicated in the processing of fearful facial expressions. These findings suggest that brain imaging studies using similar face-emotion paradigms should test whether deficits in processing of fearful faces relate to amygdala dysfunction in children and adolescents with MDD.

  1. Intranasal oxytocin impedes the ability to ignore task-irrelevant facial expressions of sadness in students with depressive symptoms.

    PubMed

    Ellenbogen, Mark A; Linnen, Anne-Marie; Cardoso, Christopher; Joober, Ridha

    2013-03-01

    The administration of oxytocin promotes prosocial behavior in humans. The mechanism by which this occurs is unknown, but it likely involves changes in social information processing. In a randomized placebo-controlled study, we examined the influence of intranasal oxytocin and placebo on the interference control component of inhibition (i.e. ability to ignore task-irrelevant information) in 102 participants using a negative affective priming task with sad, angry, and happy faces. In this task, participants are instructed to respond to a facial expression of emotion while simultaneously ignoring another emotional face. On the subsequent trial, the previously-ignored emotional valence may become the emotional valence of the target face. Inhibition is operationalized as the differential delay between responding to a previously-ignored emotional valence and responding to an emotional valence unrelated to the previous one. Although no main effect of drug administration on inhibition was observed, a drug × depressive symptom interaction (β = -0.25; t = -2.6, p < 0.05) predicted the inhibition of sad faces. Relative to placebo, participants with high depression scores who were administered oxytocin were unable to inhibit the processing of sad faces. There was no relationship between drug administration and inhibition among those with low depression scores. These findings are consistent with increasing evidence that oxytocin alters social information processing in ways that have both positive and negative social outcomes. Because elevated depression scores are associated with an increased risk for major depressive disorder, difficulties inhibiting mood-congruent stimuli following oxytocin administration may be associated with risk for depression. Copyright © 2012 Elsevier Ltd. All rights reserved.
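
    For illustration only, a small Python sketch of the inhibition index as operationalised above, i.e., the RT difference between trials whose target valence was ignored on the preceding trial and trials with an unrelated valence; all RT values are invented:

        from statistics import mean

        # Hypothetical RTs (ms) by valence: trials whose target valence was ignored on the
        # previous trial vs. trials whose valence is unrelated to the previous one.
        rts = {
            "sad":   {"previously_ignored": [655, 670, 648, 662], "unrelated": [630, 641, 628, 637]},
            "happy": {"previously_ignored": [612, 625, 618, 620], "unrelated": [608, 615, 611, 617]},
        }

        for valence, trials in rts.items():
            inhibition = mean(trials["previously_ignored"]) - mean(trials["unrelated"])
            print(f"{valence}: inhibition index = {inhibition:.1f} ms")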

  2. Implicit Processing of Visual Emotions Is Affected by Sound-Induced Affective States and Individual Affective Traits

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira

    2014-01-01

    The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence, or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals. PMID:25072162

  3. Visual short-term memory load modulates the early attention and perception of task-irrelevant emotional faces

    PubMed Central

    Yang, Ping; Wang, Min; Jin, Zhenlan; Li, Ling

    2015-01-01

    The ability to focus on task-relevant information while suppressing distraction is critical for human cognition and behavior. Using a delayed-match-to-sample (DMS) task, we investigated the effects of emotional face distractors (positive, negative, and neutral faces) presented during early and late phases of the visual short-term memory (VSTM) maintenance interval, under low and high VSTM loads. Behavioral results showed decreased accuracy and delayed reaction times (RTs) for high vs. low VSTM load. Event-related potentials (ERPs) showed enhanced frontal N1 and occipital P1 amplitudes for negative faces vs. neutral or positive faces, implying rapid attentional alerting effects and early perceptual processing of negative distractors. However, high VSTM load appeared to inhibit face processing in general, with decreased N1 amplitudes and delayed P1 latencies. An inverse correlation between the N1 activation difference (high-load minus low-load) and RT costs (high-load minus low-load) was found at left frontal areas when viewing negative distractors, suggesting that the greater the inhibition, the lower the RT cost for negative faces. An emotional interference effect was not found in the late VSTM-related parietal P300, frontal positive slow wave (PSW), or occipital negative slow wave (NSW) components. In general, our findings suggest that VSTM load modulates the early attention and perception of emotional distractors. PMID:26388763
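
    As a hypothetical sketch (not the authors' analysis), the across-participant correlation described above, between the load-related N1 amplitude difference and the behavioural RT cost, could be computed as follows; all values are invented:

        import numpy as np
        from scipy.stats import pearsonr

        n1_difference = np.array([-1.8, -0.6, -2.3, -1.1, -0.4, -1.6])   # hypothetical high-minus-low-load N1 (microvolts)
        rt_cost       = np.array([  45,   80,   30,   62,   95,   50])   # hypothetical high-minus-low-load RT cost (ms)

        r, p = pearsonr(n1_difference, rt_cost)
        print(f"r = {r:.2f}, p = {p:.3f}")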

  4. Cholinergic enhancement modulates neural correlates of selective attention and emotional processing.

    PubMed

    Bentley, Paul; Vuilleumier, Patrik; Thiel, Christiane M; Driver, Jon; Dolan, Raymond J

    2003-09-01

    Neocortical cholinergic afferents are proposed to influence both selective attention and emotional processing. In a study of healthy adults we used event-related fMRI while orthogonally manipulating attention and emotionality to examine regions showing effects of cholinergic modulation by the anticholinesterase physostigmine. Either face or house pictures appeared at task-relevant locations, with the alternative picture type at irrelevant locations. Faces had either neutral or fearful expressions. Physostigmine increased relative activity within the anterior fusiform gyrus for faces at attended, versus unattended, locations, but decreased relative activity within the posterolateral occipital cortex for houses in attended, versus unattended, locations. A similar pattern of regional differences in the effect of physostigmine on cue-evoked responses was also present in the absence of stimuli. Cholinergic enhancement augmented the relative neuronal response within the middle fusiform gyrus to fearful faces, whether at attended or unattended locations. By contrast, physostigmine influenced responses in the orbitofrontal, intraparietal and cingulate cortices to fearful faces when faces occupied task-irrelevant locations. These findings suggest that acetylcholine may modulate both selective attention and emotional processes through independent, region-specific effects within the extrastriate cortex. Furthermore, cholinergic inputs to the frontoparietal cortex may influence the allocation of attention to emotional information.

  5. Face emotion recognition is related to individual differences in psychosis-proneness.

    PubMed

    Germine, L T; Hooker, C I

    2011-05-01

    Deficits in face emotion recognition (FER) in schizophrenia are well documented, and have been proposed as a potential intermediate phenotype for schizophrenia liability. However, research on the relationship between psychosis vulnerability and FER has mixed findings and methodological limitations. Moreover, no study has yet characterized the relationship between FER ability and level of psychosis-proneness. If FER ability varies continuously with psychosis-proneness, this suggests a relationship between FER and polygenic risk factors. We tested two large internet samples to see whether psychometric psychosis-proneness, as measured by the Schizotypal Personality Questionnaire-Brief (SPQ-B), is related to differences in face emotion identification and discrimination or other face processing abilities. Experiment 1 (n=2332) showed that psychosis-proneness predicts face emotion identification ability but not face gender identification ability. Experiment 2 (n=1514) demonstrated that psychosis-proneness also predicts performance on face emotion but not face identity discrimination. The tasks in Experiment 2 used identical stimuli and task parameters, differing only in emotion/identity judgment. Notably, the relationships demonstrated in Experiments 1 and 2 persisted even when individuals with the highest psychosis-proneness levels (the putative high-risk group) were excluded from analysis. Our data suggest that FER ability is related to individual differences in psychosis-like characteristics in the normal population, and that these differences cannot be accounted for by differences in face processing and/or visual perception. Our results suggest that FER may provide a useful candidate intermediate phenotype.

  6. Early neural activation during facial affect processing in adolescents with Autism Spectrum Disorder.

    PubMed

    Leung, Rachel C; Pang, Elizabeth W; Cassel, Daniel; Brian, Jessica A; Smith, Mary Lou; Taylor, Margot J

    2015-01-01

    Impaired social interaction is one of the hallmarks of Autism Spectrum Disorder (ASD). Emotional faces are arguably the most critical visual social stimuli and the ability to perceive, recognize, and interpret emotions is central to social interaction and communication, and subsequently healthy social development. However, our understanding of the neural and cognitive mechanisms underlying emotional face processing in adolescents with ASD is limited. We recruited 48 adolescents, 24 with high functioning ASD and 24 typically developing controls. Participants completed an implicit emotional face processing task in the MEG. We examined spatiotemporal differences in neural activation between the groups during implicit angry and happy face processing. While there were no differences in response latencies between groups across emotions, adolescents with ASD had lower accuracy on the implicit emotional face processing task when the trials included angry faces. MEG data showed atypical neural activity in adolescents with ASD during angry and happy face processing, which included atypical activity in the insula, anterior and posterior cingulate and temporal and orbitofrontal regions. Our findings demonstrate differences in neural activity during happy and angry face processing between adolescents with and without ASD. These differences in activation in social cognitive regions may index the difficulties in face processing and in comprehension of social reward and punishment in the ASD group. Thus, our results suggest that atypical neural activation contributes to impaired affect processing, and thus social cognition, in adolescents with ASD.

  7. Loneliness and the social monitoring system: Emotion recognition and eye gaze in a real-life conversation.

    PubMed

    Lodder, Gerine M A; Scholte, Ron H J; Goossens, Luc; Engels, Rutger C M E; Verhagen, Maaike

    2016-02-01

    Based on the belongingness regulation theory (Gardner et al., 2005, Pers. Soc. Psychol. Bull., 31, 1549), this study focuses on the relationship between loneliness and social monitoring. Specifically, we examined whether loneliness relates to performance on three emotion recognition tasks and whether lonely individuals show increased gazing towards their conversation partner's faces in a real-life conversation. Study 1 examined 170 college students (Mage = 19.26; SD = 1.21) who completed an emotion recognition task with dynamic stimuli (morph task) and a micro(-emotion) expression recognition task. Study 2 examined 130 college students (Mage = 19.33; SD = 2.00) who completed the Reading the Mind in the Eyes Test and who had a conversation with an unfamiliar peer while their gaze direction was videotaped. In both studies, loneliness was measured using the UCLA Loneliness Scale version 3 (Russell, 1996, J. Pers. Assess., 66, 20). The results showed that loneliness was unrelated to emotion recognition on all emotion recognition tasks, but that it was related to increased gaze towards their conversation partner's faces. Implications for the belongingness regulation system of lonely individuals are discussed. © 2015 The British Psychological Society.

  8. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    ERIC Educational Resources Information Center

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  9. Disrupted neural processing of emotional faces in psychopathy.

    PubMed

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb interpersonal relationship development and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but that a disruption exists in the processing of emotional faces involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.

  10. Passive and Motivated Perception of Emotional Faces: Qualitative and Quantitative Changes in the Face Processing Network

    PubMed Central

    Skelly, Laurie R.; Decety, Jean

    2012-01-01

    Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains. PMID:22768287

  11. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    PubMed

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  12. The impact of facial emotional expressions on behavioral tendencies in females and males

    PubMed Central

    Seidel, Eva-Maria; Habel, Ute; Kirschner, Michaela; Gur, Ruben C.; Derntl, Birgit

    2010-01-01

    Emotional faces communicate both the emotional state and the behavioral intentions of an individual. They also activate behavioral tendencies in the perceiver, namely approach or avoidance. Here, we compared more automatic motor responses with more conscious rating responses to happy, sad, angry, and disgusted faces in a healthy student sample. Happiness was associated with approach and anger with avoidance. However, behavioral tendencies in response to sadness and disgust were more complex. Sadness produced automatic approach but conscious withdrawal, probably influenced by interpersonal relations or personality. Disgust elicited withdrawal in the rating task, whereas no significant tendency emerged in the joystick task, probably driven by expression style. Based on our results, it is highly relevant to further explore actual reactions to emotional expressions and to differentiate between automatic and controlled processes, since emotional faces are used in various kinds of studies. Moreover, our results highlight the importance of gender-of-poser effects when applying emotional expressions as stimuli. PMID:20364933

  13. Amygdala alterations during an emotional conflict task in women recovered from anorexia nervosa.

    PubMed

    Bang, Lasse; Rø, Øyvind; Endestad, Tor

    2016-02-28

    The pathophysiology of anorexia nervosa (AN) is not completely understood, but research suggests that alterations in brain circuits related to cognitive control and emotion are central. The aim of this study was to explore neural responses to an emotional conflict task in women recovered from AN. Functional magnetic resonance imaging was used to measure neural responses to an emotional conflict task in 22 women recovered from AN and 21 age-matched healthy controls. The task involved categorizing affective faces while ignoring affective words. Face and word stimuli were either congruent (non-conflict) or incongruent (conflict). Brain responses to emotional conflict did not differ between groups. However, in response to emotional non-conflict, women recovered from AN showed significantly less activation in the bilateral amygdala relative to healthy controls. Specifically, while emotional non-conflict evoked significant activations of the amygdala in healthy controls, recovered AN women did not show such activations. Similar significant group differences were also observed in the hippocampus and basal ganglia. These results suggest that women recovered from AN are characterized by alterations within emotion-related brain circuits. The recovered women's absence of amygdala and hippocampus activation during non-conflict trials possibly reflects an impaired ability to process emotionally significant stimuli. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Affect and neural activity in women with PTSD during a task of emotional interference.

    PubMed

    Brown, Wilson J; Wojtalik, Jessica A; Dewey, Daniel; Bruce, Steven E; Yang, Zhen; Sheline, Yvette I

    2016-11-01

    The current study sought to determine the relationship between self-reported dimensions of affect and activation in brain regions associated with emotion regulation in PTSD during a task of non-conscious emotional processing in interpersonal trauma survivors with PTSD and healthy controls. Participants included 52 women diagnosed with PTSD and 18 female healthy controls. All participants completed a clinical assessment including the SCID, CAPS, and PANAS followed by a functional MRI assessment including a task of implicit emotional conflict. When PTSD participants were oriented to fearful faces, negative affect (NA) was inversely related to activation in the left amygdala and positive affect (PA) was inversely related to activation in the right amygdala. When ignoring fearful faces, NA was positively associated with activation in the right parahippocampal gyrus and PA was inversely related to activation in the left hippocampus. Similar results were observed in healthy controls regarding PA. However, NA was not significantly related to any region of interest in healthy controls. Limitations include the homogeneity of the healthy controls with regard to racial diversity, results may only be specific to female interpersonal trauma survivors with PTSD, and neutral faces within the conflict task may be perceived as negative by clinical samples. Persistent, increased NA may represent a proxy for disruptions in emotional processing in interpersonal trauma survivors with PTSD. As such, clinicians may prioritize increasing emotional awareness through emotion regulation and/or distress tolerance strategies in this population. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Validation of a novel attentional bias modification task: the future may be in the cards.

    PubMed

    Notebaert, Lies; Clarke, Patrick J F; Grafton, Ben; MacLeod, Colin

    2015-02-01

    Attentional bias modification (ABM) is a promising therapeutic tool aimed at changing patterns of attentional selectivity associated with heightened anxiety. A number of studies have successfully implemented ABM using the modified dot-probe task. However, others have not produced the attentional change required to achieve emotional benefits, highlighting the need for new ABM methods. The current study compared the effectiveness of a newly developed ABM task against the traditional dot-probe ABM task. The new person-identity-matching (PIM) task presented participants with virtual cards, each depicting a happy and an angry person. The task encourages selective attention toward or away from threat by requiring participants to make matching judgements between two cards, based either on the identities of the happy faces or on those of the angry faces. The change in attentional bias achieved by both ABM tasks was measured by a dot-probe assessment task. Their impact on emotional vulnerability was assessed by measuring negative emotional reactions to a video stressor. The PIM task succeeded in modifying attentional bias and exerting an impact on emotional reactivity, whereas this was not the case for the dot-probe task. These results are considered in relation to the potential clinical utility of the current task in comparison to traditional ABM methodologies. Copyright © 2014 Elsevier Ltd. All rights reserved.
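
    Purely illustrative (not from the study): a dot-probe attentional bias index of the kind used in the assessment task above, computed before and after training; positive values indicate attention toward threat, and all RTs are hypothetical:

        from statistics import mean

        def bias_index(probe_at_neutral_rts, probe_at_angry_rts):
            # Positive values mean faster responses when the probe replaces the angry face,
            # i.e., attention directed toward threat.
            return mean(probe_at_neutral_rts) - mean(probe_at_angry_rts)

        pre_training  = bias_index([540, 552, 561], [529, 533, 525])   # hypothetical RTs (ms)
        post_training = bias_index([538, 541, 547], [549, 556, 560])   # hypothetical RTs after attend-away training
        print(f"Bias change (post minus pre): {post_training - pre_training:.1f} ms")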

  16. Measures of skin conductance and heart rate in alcoholic men and women during memory performance

    PubMed Central

    Poey, Alan; Ruiz, Susan Mosher; Marinkovic, Ksenija; Oscar-Berman, Marlene

    2015-01-01

    We examined abnormalities in physiological responses to emotional stimuli associated with long-term chronic alcoholism. Skin conductance responses (SCR) and heart rate (HR) responses were measured in 32 abstinent alcoholic (ALC) and 30 healthy nonalcoholic (NC) men and women undergoing an emotional memory task in an MRI scanner. The task required participants to remember the identity of two emotionally-valenced faces presented at the onset of each trial during functional magnetic resonance imaging (fMRI) scanning. After viewing the faces, participants saw a distractor image (an alcoholic beverage, nonalcoholic beverage, or scrambled image) followed by a single probe face. The task was to decide whether the probe face matched one of the two encoded faces. Skin conductance measurements (before and after the encoded faces, distractor, and probe) were obtained from electrodes on the index and middle fingers on the left hand. HR measurements (beats per minute before and after the encoded faces, distractor, and probe) were obtained by a pulse oximeter placed on the little finger on the left hand. We expected that, relative to NC participants, the ALC participants would show reduced SCR and HR responses to the face stimuli, and that we would identify greater reactivity to the alcoholic beverage stimuli than to the distractor stimuli unrelated to alcohol. While the beverage type did not differentiate the groups, the ALC group did have reduced skin conductance and HR responses to elements of the task, as compared to the NC group. PMID:26020002

  17. Social categories shape the neural representation of emotion: evidence from a visual face adaptation task

    PubMed Central

    Otten, Marte; Banaji, Mahzarin R.

    2012-01-01

    A number of recent behavioral studies have shown that emotional expressions are differently perceived depending on the race of a face, and that perception of race cues is influenced by emotional expressions. However, neural processes related to the perception of invariant cues that indicate the identity of a face (such as race) are often described to proceed independently of processes related to the perception of cues that can vary over time (such as emotion). Using a visual face adaptation paradigm, we tested whether these behavioral interactions between emotion and race also reflect interdependent neural representation of emotion and race. We compared visual emotion aftereffects when the adapting face and ambiguous test face differed in race or not. Emotion aftereffects were much smaller in different race (DR) trials than same race (SR) trials, indicating that the neural representation of a facial expression is significantly different depending on whether the emotional face is black or white. It thus seems that invariable cues such as race interact with variable face cues such as emotion not just at a response level, but also at the level of perception and neural representation. PMID:22403531

  18. Implicit reward associations impact face processing: Time-resolved evidence from event-related brain potentials and pupil dilations.

    PubMed

    Hammerschmidt, Wiebke; Kagan, Igor; Kulke, Louisa; Schacht, Annekathrin

    2018-06-22

    The present study investigated whether associated motivational salience causes preferential processing of inherently neutral faces similar to that of emotional expressions, by means of event-related brain potentials (ERPs) and changes of pupil size. To this aim, neutral faces were implicitly associated with monetary outcomes while participants (N = 44) performed a masked prime face-matching task that ensured performance around chance level and thus an equal proportion of gain, loss, and zero outcomes. Motivational context strongly impacted the processing of the fixation, prime, and mask stimuli prior to the target face, as indicated by enhanced amplitudes of subsequent ERP components and increased pupil size. In a separate test session, previously associated faces as well as novel faces with emotional expressions were presented within the same task but without motivational context and performance feedback. Most importantly, previously gain-associated faces amplified the LPC, although the individually contingent face-outcome assignments were not made explicit during the learning session. Emotional expressions impacted the N170 and EPN components. Modulations of pupil size were absent in both the motivationally associated and the emotional conditions. Our findings demonstrate that neural representations of neutral stimuli can acquire increased salience via implicit learning, with an advantage for gain over loss associations. Copyright © 2018. Published by Elsevier Inc.

  19. Content specificity of attentional bias to threat in post-traumatic stress disorder.

    PubMed

    Zinchenko, A; Al-Amin, M M; Alam, M M; Mahmud, W; Kabir, N; Reza, H M; Burne, T H J

    2017-08-01

    Attentional bias to affective information and reduced cognitive control may maintain the symptoms of post-traumatic stress disorder (PTSD) and impair cognitive functioning. However, the role of the content specificity of affective stimuli (e.g., trauma-related, emotional but trauma-unrelated) in the observed attentional bias and cognitive control is less clear, as this has not been tested simultaneously before. Therefore, we examined the content specificity of attentional bias to threat in PTSD. PTSD participants (survivors of a multistory factory collapse, n=30) and matched controls (n=30) performed an Eriksen flanker task. They identified the direction of a centrally presented target arrow, which was flanked by several task-irrelevant distractor arrows pointing in the same (congruent) or the opposite (incongruent) direction. Additionally, participants were presented with a picture of a face (neutral, emotional) or a building (neutral = normal, emotional = collapsed multistory factory) as a task-irrelevant background image. We found that PTSD participants produced overall larger conflict effects and longer reaction times (RTs) to emotional than to neutral stimuli relative to their healthy counterparts. Moreover, PTSD participants, but not healthy participants, showed a stimulus-specific dissociation in processing emotional stimuli. Emotional faces elicited longer RTs compared to neutral faces, while emotional buildings elicited faster responses compared to neutral buildings. PTSD patients show a content-sensitive attentional bias to emotional information and impaired cognitive control. Copyright © 2017 Elsevier Ltd. All rights reserved.
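
    For illustration only (not the authors' analysis), a minimal sketch of flanker conflict effects (incongruent minus congruent RT) computed separately for each task-irrelevant background category; all values are invented:

        from statistics import mean

        # Hypothetical RTs (ms): background category -> (congruent trials, incongruent trials).
        rts = {
            "neutral face":   ([520, 531, 526], [588, 595, 590]),
            "emotional face": ([544, 539, 551], [628, 635, 622]),
        }

        for category, (congruent, incongruent) in rts.items():
            conflict = mean(incongruent) - mean(congruent)
            print(f"{category}: conflict effect = {conflict:.1f} ms")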

  20. Postural Control in Children with Dyslexia: Effects of Emotional Stimuli in a Dual-Task Environment.

    PubMed

    Goulème, Nathalie; Gerard, Christophe-Loïc; Bucci, Maria Pia

    2017-08-01

    The aim of this study was to compare the visual exploration strategies used during a postural control task in participants with and without dyslexia. We simultaneously recorded eye movements and postural control while children were viewing different types of emotional faces. Twenty-two children with dyslexia and twenty-two age-matched children without dyslexia participated in the study. We analysed the surface area, length, and mean velocity of the centre of pressure for balance, in parallel with visual saccadic latency, the number of saccades, and the time spent in regions of interest. Our results showed that postural stability in children with dyslexia was weaker and that the surface area of their centre of pressure increased significantly when they viewed an unpleasant face. Moreover, children with dyslexia used different strategies from those used by children without dyslexia during visual exploration, in particular when they viewed unpleasant emotional faces. We suggest that lower performance in emotional face processing in children with dyslexia could be due to a difference in their visual strategies, linked to their identification of unpleasant emotional faces. Copyright © 2017 John Wiley & Sons, Ltd.

  1. "We all look the same to me": positive emotions eliminate the own-race in face recognition.

    PubMed

    Johnson, Kareem J; Fredrickson, Barbara L

    2005-11-01

    Extrapolating from the broaden-and-build theory, we hypothesized that positive emotion may reduce the own-race bias in facial recognition. In Experiments 1 and 2, Caucasian participants (N = 89) viewed Black and White faces for a recognition task. They viewed videos eliciting joy, fear, or neutrality before the learning (Experiment 1) or testing (Experiment 2) stages of the task. Results reliably supported the hypothesis. Relative to fear or a neutral state, joy experienced before either stage improved recognition of Black faces and significantly reduced the own-race bias. Discussion centers on possible mechanisms for this reduction of the own-race bias, including improvements in holistic processing and promotion of a common in-group identity due to positive emotions.

  2. Quantifying facial expression recognition across viewing conditions.

    PubMed

    Goren, Deborah; Wilson, Hugh R

    2006-04-01

    Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other.

  3. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    PubMed

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of socio-emotional abilities, characterizing humans as the most highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between 5-HTTLPR/rs25531 polymorphisms - related to serotonin reuptake - and the ability to perceive and recognize faces and emotional facial expressions. For this aim we conducted structural equation modeling on data from 230 young adults, obtained using a comprehensive, multivariate task battery with maximal effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two S alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  4. The Impact of Emotional States on Cognitive Control Circuitry and Function.

    PubMed

    Cohen, Alexandra O; Dellarco, Danielle V; Breiner, Kaitlyn; Helion, Chelsea; Heller, Aaron S; Rahdar, Ahrareh; Pedersen, Gloria; Chein, Jason; Dyke, Jonathan P; Galvan, Adriana; Casey, B J

    2016-03-01

    Typically in the laboratory, cognitive and emotional processes are studied separately or as a stream of fleeting emotional stimuli embedded within a cognitive task. Yet in life, thoughts and actions often occur in more lasting emotional states of arousal. The current study examines the impact of emotions on actions using a novel behavioral paradigm and functional neuroimaging to assess cognitive control under sustained states of threat (anticipation of an aversive noise) and excitement (anticipation of winning money). Thirty-eight healthy adult participants were scanned while performing an emotional go/no-go task with positive (happy faces), negative (fearful faces), and neutral (calm faces) emotional cues, under threat or excitement. Cognitive control performance was enhanced during the excited state relative to a nonarousing control condition. This enhanced performance was paralleled by heightened activity of frontoparietal and frontostriatal circuitry. In contrast, under persistent threat, cognitive control was diminished when the valence of the emotional cue conflicted with the emotional state. Successful task performance in this conflicting emotional condition was associated with increased activity in the posterior cingulate cortex, a default mode network region implicated in complex processes such as processing emotions in the context of self and monitoring performance. This region showed positive coupling with frontoparietal circuitry implicated in cognitive control, providing support for a role of the posterior cingulate cortex in mobilizing cognitive resources to improve performance. These findings suggest that emotional states of arousal differentially modulate cognitive control and point to the potential utility of this paradigm for understanding effects of situational and pathological states of arousal on behavior.

  5. Unconsciously Triggered Emotional Conflict by Emotional Facial Expressions

    PubMed Central

    Chen, Antao; Cui, Qian; Zhang, Qinglin

    2013-01-01

    The present study investigated whether emotional conflict and emotional conflict adaptation could be triggered by unconscious emotional information as assessed in a backward-masked affective priming task. Participants were instructed to identify the valence of a face (e.g., happy or sad) preceded by a masked happy or sad face. The results of two experiments revealed the emotional conflict effect but no emotional conflict adaptation effect. This demonstrates that emotional conflict can be triggered by unconsciously presented emotional information, but participants may not adjust their subsequent performance trial-by-trial to reduce this conflict. PMID:23409084
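
    For readers unfamiliar with the two effects, the emotional conflict effect is the reaction-time cost on prime-target incongruent trials, and conflict adaptation is the reduction of that cost following an incongruent trial. A minimal sketch of both computations on made-up trial data follows; the column names and numbers are hypothetical, not taken from the study.

    ```python
    import pandas as pd

    # Hypothetical trial list: previous-trial and current-trial congruency plus response time (ms)
    trials = pd.DataFrame({
        "prev":    ["C", "C", "I", "I", "C", "I", "C", "I"],
        "current": ["C", "I", "C", "I", "C", "I", "I", "C"],
        "rt":      [520, 585, 540, 560, 515, 555, 590, 545],
    })

    # Emotional conflict effect: incongruent minus congruent trials, collapsed over the previous trial
    conflict = trials.loc[trials.current == "I", "rt"].mean() - trials.loc[trials.current == "C", "rt"].mean()

    # Conflict adaptation: the congruency effect should shrink after an incongruent trial
    cell = trials.groupby(["prev", "current"])["rt"].mean()
    adaptation = (cell["C", "I"] - cell["C", "C"]) - (cell["I", "I"] - cell["I", "C"])
    print(conflict, adaptation)
    ```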

  6. Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.

    PubMed

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-01-01

    Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. In a double-blind, within-subjects, randomized controlled experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium-intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT. This effect was marginally significant (p<.06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhanced facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).

  7. The Moving Window Technique: A Window into Developmental Changes in Attention during Facial Emotion Recognition

    ERIC Educational Resources Information Center

    Birmingham, Elina; Meixner, Tamara; Iarocci, Grace; Kanan, Christopher; Smilek, Daniel; Tanaka, James W.

    2013-01-01

    The strategies children employ to selectively attend to different parts of the face may reflect important developmental changes in facial emotion recognition. Using the Moving Window Technique (MWT), children aged 5-12 years and adults ("N" = 129) explored faces with a mouse-controlled window in an emotion recognition task. An…

  8. Multisensory emotion perception in congenitally, early, and late deaf CI users

    PubMed Central

    Fengler, Ineke; Nava, Elena; Villwock, Agnes K.; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte

    2017-01-01

    Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences. PMID:29023525

  9. Multisensory emotion perception in congenitally, early, and late deaf CI users.

    PubMed

    Fengler, Ineke; Nava, Elena; Villwock, Agnes K; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte

    2017-01-01

    Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences.

  10. Emotion and Destination Memory in Alzheimer's Disease.

    PubMed

    El Haj, Mohamad; Raffard, Stephane; Antoine, Pascal; Gely-Nargeot, Marie-Christine

    2015-01-01

    Research shows a beneficial effect of emotion on the processing of self-related information in patients with Alzheimer's disease (AD). Our paper investigates whether emotion improves destination memory (e.g., did I tell you about the manuscript?), a form of memory thought to be self-related. To this aim, twenty-seven AD patients and thirty healthy older adults told 24 neutral facts to eight neutral faces, eight positive faces, and eight negative faces. On a subsequent recognition task, participants had to decide whether or not they had previously told a given fact to a given face. Data revealed no emotional effect on destination memory in AD patients. However, in healthy older adults, destination memory was better for negative faces than for positive faces, and better for positive faces than for neutral faces. The absence of an emotional effect on destination memory in AD is interpreted in terms of the substantial decline of this memory in the disease.

  11. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    PubMed Central

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. PMID:25759472

  12. Normative data on development of neural and behavioral mechanisms underlying attention orienting toward social-emotional stimuli: An exploratory study

    PubMed Central

    Lindstrom, Kara; Guyer, Amanda E.; Mogg, Karin; Bradley, Brendan P.; Fox, Nathan A.; Ernst, Monique; Nelson, Eric E.; Leibenluft, Ellen; Britton, Jennifer C.; Monk, Christopher S.; Pine, Daniel S.; Bar-Haim, Yair

    2009-01-01

    The ability of positive and negative facial signals to influence attention orienting is crucial to social functioning. Given the dramatic developmental change in neural architecture supporting social function, positive and negative facial cues may influence attention orienting differently in relatively young or old individuals. However, virtually no research examines such age-related differences in the neural circuitry supporting attention orienting to emotional faces. We examined age-related correlations in attention-orienting biases to positive and negative face emotions in a healthy sample (N=37; 9-40 years old) using functional magnetic resonance imaging and a dot-probe task. The dot-probe task in an fMRI setting yields both behavioral and neural indices of attention biases towards or away from an emotional cue (happy or angry face). In the full sample, angry-face attention bias scores did not correlate with age, and age did not correlate with brain activation to angry faces. However, age did positively correlate with attention bias towards happy faces; age also negatively correlated with left cuneus and left caudate activation to a happy-bias fMRI contrast. Secondary analyses suggested age-related changes in attention bias to happy faces. The tendency in younger children to direct attention away from happy faces (relative to neutral faces) was diminished in the older age groups, in tandem with increasing neural deactivation. Implications for future work on developmental changes in attention-emotion processing are discussed. PMID:19631626
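
    In a dot-probe paradigm like the one described above, the behavioral attention-bias score is conventionally the mean reaction time when the probe replaces the neutral face minus the mean reaction time when it replaces the emotional face, so that positive values indicate orienting toward the emotion. A minimal sketch with hypothetical reaction times (the values and the function name are not from the study):

    ```python
    import numpy as np

    def attention_bias(rt_incongruent, rt_congruent):
        """Dot-probe bias score in ms.

        rt_congruent   : RTs when the probe appears at the emotional face's location
        rt_incongruent : RTs when the probe appears at the neutral face's location
        Positive values indicate attention oriented toward the emotional face.
        """
        return np.mean(rt_incongruent) - np.mean(rt_congruent)

    # Hypothetical happy-face trials for a single participant
    happy_bias = attention_bias(rt_incongruent=[410, 432, 405], rt_congruent=[398, 401, 389])
    print(happy_bias)
    ```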

  13. Facial emotion expression recognition by children at familial risk for depression: High risk boys are oversensitive to sadness

    PubMed Central

    Lopez-Duran, Nestor L.; Kuhlman, Kate R.; George, Charles; Kovacs, Maria

    2012-01-01

    In the present study we examined perceptual sensitivity to facial expressions of sadness among children at familial-risk for depression (N = 64) and low-risk peers (N = 40) between the ages of 7 and 13 (M age = 9.51; SD = 2.27). Participants were presented with pictures of facial expressions that varied in emotional intensity from neutral to full-intensity sadness or anger (i.e., emotion recognition), or pictures of faces morphing from anger to sadness (emotion discrimination). After each picture was presented, children indicated whether the face showed a specific emotion (i.e., sadness, anger) or no emotion at all (neutral). In the emotion recognition task, boys (but not girls) at familial-risk for depression identified sadness at significantly lower levels of emotional intensity than did their low-risk peers. The high- and low-risk groups did not differ with regard to identification of anger. In the emotion discrimination task, both groups displayed over-identification of sadness in ambiguous mixed faces but high-risk youth were less likely to show this labeling bias than their peers. Our findings are consistent with the hypothesis that enhanced perceptual sensitivity to subtle traces of sadness in facial expressions may be a potential mechanism of risk among boys at familial-risk for depression. This enhanced perceptual sensitivity does not appear to be due to biases in the labeling of ambiguous faces. PMID:23106941

  14. Rapid communication: Global-local processing affects recognition of distractor emotional faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2011-03-01

    Recent studies have shown links between happy faces and global, distributed attention as well as sad faces to local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with broad scope of attention facilitates recognition of happy faces, and local processing associated with narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results along with earlier complementary results on the effect of emotion on global-local processing support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.

  15. Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

    PubMed Central

    Liu, Taosheng; Pinheiro, Ana; Zhao, Zhongxin; Nestor, Paul G.; McCarley, Robert W.; Niznikiewicz, Margaret A.

    2012-01-01

    Both facial expression and tone of voice represent key signals of emotional communication but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, P300 components were observed at electrodes in the frontal-central region, while P100, N170, P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170 and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that such an effect is mainly distributed in the frontal-central region. PMID:22383987

  16. Post-Decision Wagering Affects Metacognitive Awareness of Emotional Stimuli: An Event Related Potential Study

    PubMed Central

    Wierzchoń, Michał; Wronka, Eligiusz; Paulewicz, Borysław; Szczepanowski, Remigiusz

    2016-01-01

    The present research investigated metacognitive awareness of emotional stimuli and its psychophysiological correlates. We used a backward masking task presenting participants with fearful or neutral faces. We asked participants for face discrimination and then probed their metacognitive awareness with confidence rating (CR) and post-decision wagering (PDW) scales. We also analysed psychophysiological correlates of awareness with event-related potential (ERP) components: P1, N170, early posterior negativity (EPN), and P3. We have not observed any differences between PDW and CR conditions in the emotion identification task. However, the "aware" ratings were associated with increased accuracy performance. This effect was more pronounced in PDW, especially for fearful faces, suggesting that emotional stimuli awareness may be enhanced by monetary incentives. EEG analysis showed larger N170, EPN and P3 amplitudes in aware compared to unaware trials. It also appeared that both EPN and P3 ERP components were more pronounced in the PDW condition, especially when emotional faces were presented. Taken together, our ERP findings suggest that metacognitive awareness of emotional stimuli depends on the effectiveness of both early and late visual information processing. Our study also indicates that awareness of emotional stimuli can be enhanced by the motivation induced by wagering. PMID:27490816

  17. Post-Decision Wagering Affects Metacognitive Awareness of Emotional Stimuli: An Event Related Potential Study.

    PubMed

    Wierzchoń, Michał; Wronka, Eligiusz; Paulewicz, Borysław; Szczepanowski, Remigiusz

    2016-01-01

    The present research investigated metacognitive awareness of emotional stimuli and its psychophysiological correlates. We used a backward masking task presenting participants with fearful or neutral faces. We asked participants for face discrimination and then probed their metacognitive awareness with confidence rating (CR) and post-decision wagering (PDW) scales. We also analysed psychophysiological correlates of awareness with event-related potential (ERP) components: P1, N170, early posterior negativity (EPN), and P3. We have not observed any differences between PDW and CR conditions in the emotion identification task. However, the "aware" ratings were associated with increased accuracy performance. This effect was more pronounced in PDW, especially for fearful faces, suggesting that emotional stimuli awareness may be enhanced by monetary incentives. EEG analysis showed larger N170, EPN and P3 amplitudes in aware compared to unaware trials. It also appeared that both EPN and P3 ERP components were more pronounced in the PDW condition, especially when emotional faces were presented. Taken together, our ERP findings suggest that metacognitive awareness of emotional stimuli depends on the effectiveness of both early and late visual information processing. Our study also indicates that awareness of emotional stimuli can be enhanced by the motivation induced by wagering.

  18. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    PubMed

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. The cortical current density mapping of N170 yielded a main neural source of N170 at posterior section of fusiform gyrus (maximum at left hemisphere for words and right hemisphere for faces and simultaneous stimuli). Neural generators of N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  19. Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.

    PubMed

    Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M

    2014-09-01

    Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD. © 2013 The British Psychological Society.

  20. Distant influences of amygdala lesion on visual cortical activation during emotional face processing.

    PubMed

    Vuilleumier, Patrik; Richardson, Mark P; Armony, Jorge L; Driver, Jon; Dolan, Raymond J

    2004-11-01

    Emotional visual stimuli evoke enhanced responses in the visual cortex. To test whether this reflects modulatory influences from the amygdala on sensory processing, we used event-related functional magnetic resonance imaging (fMRI) in human patients with medial temporal lobe sclerosis. Twenty-six patients with lesions in the amygdala, the hippocampus or both, plus 13 matched healthy controls, were shown pictures of fearful or neutral faces in task-relevant or task-irrelevant positions on the display. All subjects showed increased fusiform cortex activation when the faces were in task-relevant positions. Both healthy individuals and those with hippocampal damage showed increased activation in the fusiform and occipital cortex when they were shown fearful faces, but this was not the case for individuals with damage to the amygdala, even though visual areas were structurally intact. The distant influence of the amygdala was also evidenced by the parametric relationship between amygdala damage and the level of emotional activation in the fusiform cortex. Our data show that combining the fMRI and lesion approaches can help reveal the source of functional modulatory influences between distant but interconnected brain regions.

  1. Contrasting vertical and horizontal representations of affect in emotional visual search.

    PubMed

    Damjanovic, Ljubica; Santiago, Julio

    2016-02-01

    Independent lines of evidence suggest that the representation of emotional evaluation recruits both vertical and horizontal spatial mappings. These two spatial mappings differ in their experiential origins and their productivity, and available data suggest that they differ in their saliency. Yet, no study has so far compared their relative strength in an attentional orienting reaction time task that affords the simultaneous manifestation of both types of mapping. Here, we investigated this question using a visual search task with emotional faces. We presented angry and happy face targets and neutral distracter faces in top, bottom, left, and right locations on the computer screen. Conceptual congruency effects were observed along the vertical dimension supporting the 'up = good' metaphor, but not along the horizontal dimension. This asymmetrical processing pattern was observed when faces were presented in a cropped (Experiment 1) and whole (Experiment 2) format. These findings suggest that the 'up = good' metaphor is more salient and readily activated than the 'right = good' metaphor, and that the former outcompetes the latter when the task context affords the simultaneous activation of both mappings.

  2. The attraction of emotions: Irrelevant emotional information modulates motor actions.

    PubMed

    Ambron, Elisabetta; Foroni, Francesco

    2015-08-01

    Emotional expressions are important cues that capture our attention automatically. Although a wide range of work has explored the role and influence of emotions on cognition and behavior, little is known about the way that emotions influence motor actions. Moreover, considering how critical detecting emotional facial expressions in the environment can be, it is important to understand their impact even when they are not directly relevant to the task being performed. Our novel approach was to explore this issue from the attention-and-action perspective, using a task-irrelevant distractor paradigm in which participants are asked to reach for a target while a nontarget stimulus is also presented. We tested whether the movement trajectory would be influenced by irrelevant stimuli-faces with or without emotional expressions. The results showed that reaching paths veered toward faces with emotional expressions, in particular happiness, but not toward neutral expressions. This reinforces the view of emotions as attention-capturing stimuli that are, however, also potential sources of distraction for motor actions.

  3. Processing of task-irrelevant emotional faces impacted by implicit sequence learning.

    PubMed

    Peng, Ming; Cai, Mengfei; Zhou, Renlai

    2015-12-02

    Attentional load may be increased by task-relevant attention, such as task difficulty, or by task-irrelevant attention, such as an unexpected light spot on the screen. Several studies have focused on the influence of task-relevant attentional load on task-irrelevant emotion processing. In this study, we used event-related potentials to examine the impact of task-irrelevant attentional load on task-irrelevant expression processing. Eighteen participants identified the color of a word (i.e. the color Stroop task) while a picture of a fearful or a neutral face was shown in the background. The task-irrelevant attentional load was increased in the regular condition by regularly presented congruence trials (congruence between the color and the meaning of the word), because implicit sequence learning was induced. We compared task-irrelevant expression processing between the regular condition and the random condition (in which congruence and incongruence trials were presented randomly). Behaviorally, reaction times in the fearful face condition were faster than in the neutral face condition under random presentation, whereas no significant difference was found in the regular condition. The event-related potential results indicated enhanced P2, N2, and P3 amplitudes for fearful relative to neutral faces in the random condition. In comparison, only P2 differed significantly between the two types of expressions in the regular condition. The study showed that attentional load increased by implicit sequence learning influenced the late processing of task-irrelevant expressions.

  4. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    PubMed

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  5. Conflict adaptation in emotional task underlies the amplification of target.

    PubMed

    Chechko, Natalia; Kellermann, Thilo; Schneider, Frank; Habel, Ute

    2014-04-01

    A primary function of cognitive control is to adjust the cognitive system according to situational demands. The so-called "conflict adaptation effect" elicited in laboratory experiments is supposed to reflect the above function. Neuroimaging studies suggest that adaptation of nonemotional conflict is mediated by the dorsolateral prefrontal cortex through a top-down enhancement of task-relevant (target), relative to task-irrelevant (distractor), stimulus representation in the sensory cortices. The adaptation of emotional conflict, on the other hand, is suggested to be related to the rostral anterior cingulate inhibiting the processing of emotional distractors through a top-down modulation of amygdala responsivity. In the present study, we manipulated, on a trial-by-trial basis, the levels of semantic interference conflict triggered by the incompatibility between emotional faces (targets) and emotional words (distractors) in a modified version of the emotional Stroop task. Similar to previous observations involving nonemotional interference effects, the behavioral adaptation of emotional conflict was found to be paralleled by a stronger recruitment of the fusiform face area. Additional areas related to the conflict adaptation effect were the bilateral insula, the bilateral frontal operculum (fO), the right amygdala, the left precentral and postcentral gyri, and the parietal cortex. These findings suggest that augmentation of cortical responses to task-relevant information in emotional conflict may be related to conflict adaptation processes in a way that has been observed in nonemotional conflict, challenging the view that brain circuitries underlying the conflict adaptation effect depend only on the nature of conflict.

  6. Neutral face distractors differentiate performance between depressed and healthy adolescents during an emotional working memory task.

    PubMed

    Tavitian, Lucy R; Ladouceur, Cecile D; Nahas, Ziad; Khater, Beatrice; Brent, David A; Maalouf, Fadi T

    2014-08-01

    The aim of the present study is to examine the effect of neutral and emotional facial expressions on voluntary attentional control using a working memory (WM) task in adolescents with major depressive disorder (MDD). We administered the Emotional Face n-back (EFNBACK) task, a visual WM task with neutral, happy and angry faces as distractors to 22 adolescents with MDD (mean age 15.7 years) and 21 healthy controls (HC) (mean age 14.7 years). There was a significant group by distractor type interaction (p = 0.045) for mean percent accuracy rates. Group comparisons showed that MDD youth were less accurate on neutral trials than HC (p = 0.027). The two groups did not differ on angry, happy and blank trials (p > 0.05). Reaction time did not differ across groups. In addition, when comparing the differences between accuracies on neutral trials and each of the happy and angry trials, respectively [(HAP-NEUT) and (ANG-NEUT)], there was a group effect on (HAP-NEUT) where the difference was larger in MDD than HC (p = 0.009) but not on ANG-NEUT (p > 0.05). Findings were independent of memory load. Findings indicate that attentional control to neutral faces is impaired and negatively affected performance on a WM task in adolescents with MDD. Such an impact of neutral faces on attentional control in MDD may be at the core of the social-cognitive impairment observed in this population.
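
    The (HAP-NEUT) and (ANG-NEUT) comparisons above are per-participant accuracy difference scores contrasting emotional-distractor trials against neutral-distractor trials. A small sketch of how such scores could be derived; the accuracies and group labels are invented for illustration, not data from the study.

    ```python
    import pandas as pd

    # Hypothetical per-participant mean accuracies (% correct) by distractor type
    acc = pd.DataFrame({
        "group":   ["MDD", "MDD", "HC", "HC"],
        "neutral": [78.0, 74.5, 88.0, 86.5],
        "happy":   [86.0, 84.0, 89.5, 88.0],
        "angry":   [83.0, 81.5, 87.0, 86.0],
    })

    acc["hap_minus_neut"] = acc["happy"] - acc["neutral"]   # HAP-NEUT difference score
    acc["ang_minus_neut"] = acc["angry"] - acc["neutral"]   # ANG-NEUT difference score
    print(acc.groupby("group")[["hap_minus_neut", "ang_minus_neut"]].mean())
    ```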

  7. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    PubMed

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. The human body odor compound androstadienone increases neural conflict coupled to higher behavioral costs during an emotional Stroop task.

    PubMed

    Hornung, Jonas; Kogler, Lydia; Erb, Michael; Freiherr, Jessica; Derntl, Birgit

    2018-05-01

    The androgen derivative androstadienone (AND) is a substance found in human sweat and thus may act as a human chemosignal. With the current experiment, we aimed to explore how AND affects interference processing during an emotional Stroop task that used human faces as target stimuli and emotional words as distractors. This was complemented by functional magnetic resonance imaging (fMRI) to unravel the neural mechanism of AND action. Based on previous accounts we expected AND to increase neural activation in areas commonly implicated in the evaluation of emotional faces and to change neural activation in brain regions linked to interference processing. For this aim, a total of 80 healthy individuals (oral contraceptive users, luteal women, men) were tested twice on two consecutive days with an emotional Stroop task using fMRI. Our results suggest that AND increases interference processing in brain areas that are heavily recruited during emotional conflict. At the same time, correlation analyses revealed that this neural interference processing was paralleled by higher behavioral costs (response times) with higher interference-related brain activation under AND. Furthermore, AND elicited higher activation in regions implicated in emotional face processing, including the right fusiform gyrus, inferior frontal gyrus and dorsomedial cortex. In this connection, neural activation was not coupled to behavioral outcome. Furthermore, despite previous accounts of increased hypothalamic activation under AND, we were not able to replicate this finding and discuss possible reasons for this discrepancy. To conclude, AND increased interference processing in regions heavily recruited during emotional conflict, which was coupled to higher costs in resolving emotional conflicts with stronger interference-related brain activation under AND. At the moment it remains unclear whether these effects are due to changes in conflict detection or resolution. However, evidence most consistently suggests that AND does not draw attention to the most potent socio-emotional information (human faces) but rather highlights representations of emotional words. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    PubMed

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings of the study were that musicians' behavioral responses and N170 were more affected by the emotional value of the music administered during the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional faces. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the emotional salience of the stimuli and the listener's appraisal.
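
    A linear mixed-effects model of the kind mentioned here treats participants as a random factor while estimating fixed effects of musical expertise and music condition on N170 amplitude. The statsmodels sketch below only illustrates that model structure on toy data; the variable names and values are assumptions, not the authors' analysis code, and a real dataset would contain many more participants and trials.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy long-format data: one row per condition per subject, N170 amplitude in microvolts
    erp = pd.DataFrame({
        "subject":   ["s01", "s01", "s02", "s02", "s03", "s03", "s04", "s04"],
        "expertise": ["musician"] * 4 + ["nonmusician"] * 4,
        "music":     ["emotional", "neutral"] * 4,
        "n170":      [-6.2, -4.8, -5.9, -4.5, -4.1, -3.9, -4.3, -3.7],
    })

    # Random intercept per subject; fixed effects of expertise, music condition, and their interaction
    model = smf.mixedlm("n170 ~ expertise * music", data=erp, groups=erp["subject"])
    print(model.fit().summary())
    ```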

  10. Amygdala and whole-brain activity to emotional faces distinguishes major depressive disorder and bipolar disorder.

    PubMed

    Fournier, Jay C; Keener, Matthew T; Almeida, Jorge; Kronhaus, Dina M; Phillips, Mary L

    2013-11-01

    It can be clinically difficult to distinguish depressed individuals with bipolar disorder (BD) and major depressive disorder (MDD). To examine potential biomarkers of difference between the two disorders, the current study examined differences in the functioning of emotion-processing neural regions during a dynamic emotional faces task. During functional magnetic resonance imaging, healthy control adults (HC) (n = 29) and depressed adults with MDD (n = 30) and BD (n = 22) performed an implicit emotional-faces task in which they identified a color label superimposed on neutral faces that dynamically morphed into one of four emotional faces (angry, fearful, sad, happy). We compared neural activation between the groups in an amygdala region-of-interest and at the whole-brain level. Adults with MDD showed significantly greater activity than adults with BD in the left amygdala to the anger condition (p = 0.01). Results of whole-brain analyses (at p < 0.005, k ≥ 20) revealed that adults with BD showed greater activity to sad faces in temporoparietal regions, primarily in the left hemisphere, whereas individuals with MDD demonstrated greater activity than those with BD to displays of anger, fear, and happiness. Many of the observed BD-MDD differences represented abnormalities in functioning compared to HC. We observed a dissociation between depressed adults with BD and MDD in the processing of emerging emotional faces. Those with BD showed greater activity during mood-congruent (i.e., sad) faces, whereas those with MDD showed greater activity for mood-incongruent (i.e., fear, anger, and happy) faces. Such findings may reflect markers of differences between BD and MDD depression in underlying pathophysiological processes. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. A Normalization Framework for Emotional Attention

    PubMed Central

    Zhang, Xilin; Japee, Shruti; Safiullah, Zaid; Mlynaryk, Nicole; Ungerleider, Leslie G.

    2016-01-01

    The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects’ attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention. PMID:27870851

  12. A Normalization Framework for Emotional Attention.

    PubMed

    Zhang, Xilin; Japee, Shruti; Safiullah, Zaid; Mlynaryk, Nicole; Ungerleider, Leslie G

    2016-11-01

    The normalization model of attention proposes that attention can affect performance by response- or contrast-gain changes, depending on the size of the stimulus and attention field. Here, we manipulated the attention field by emotional valence, negative faces versus positive faces, while holding stimulus size constant in a spatial cueing task. We observed changes in the cueing effect consonant with changes in response gain for negative faces and contrast gain for positive faces. Neuroimaging experiments confirmed that subjects' attention fields were narrowed for negative faces and broadened for positive faces. Importantly, across subjects, the self-reported emotional strength of negative faces and positive faces correlated, respectively, both with response- and contrast-gain changes and with primary visual cortex (V1) narrowed and broadened attention fields. Effective connectivity analysis showed that the emotional valence-dependent attention field was closely associated with feedback from the dorsolateral prefrontal cortex (DLPFC) to V1. These findings indicate a crucial involvement of DLPFC in the normalization processes of emotional attention.
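
    For reference, the normalization model of attention invoked in these two records is usually written as a stimulus drive multiplied by an attention field and divided by a pooled suppressive drive, roughly as below (notation is generic, following Reynolds and Heeger's 2009 formulation, and is not taken from this article):

    ```latex
    % E = stimulus drive, A = attention field, s = suppressive pooling kernel,
    % sigma = semisaturation constant, R = attention-modulated population response.
    \[
    R(x,\theta) \;=\;
    \frac{A(x,\theta)\, E(x,\theta)}
         {\sigma \;+\; \sum_{x',\theta'} s(x - x',\, \theta - \theta')\, A(x',\theta')\, E(x',\theta')}
    \]
    % A narrow attention field mainly scales the numerator (response gain); a broad field
    % scales numerator and denominator alike, shifting the contrast-response function
    % instead (contrast gain), which is the distinction exploited in the study above.
    ```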

  13. Neural substrates of cognitive switching and inhibition in a face processing task.

    PubMed

    Piguet, Camille; Sterpenich, Virginie; Desseilles, Martin; Cojan, Yann; Bertschy, Gilles; Vuilleumier, Patrik

    2013-11-15

    We frequently need to change our current occupation, an operation requiring additional effortful cognitive demands. Switching from one task to another may involve two distinct processes: inhibition of the previously relevant task-set, and initiation of a new one. Here we tested whether these two processes are underpinned by separate neural substrates, and whether they differ depending on the nature of the task and the emotional content of stimuli. We used functional magnetic resonance imaging in healthy human volunteers who categorize emotional faces according to three different judgment rules (color, gender, or emotional expression). Our paradigm allowed us to separate neural activity associated with inhibition and switching based on the sequence of the tasks required on successive trials. We found that the bilateral medial superior parietal lobule and left intraparietal sulcus showed consistent activation during switching regardless of the task. On the other hand, no common region was activated (or suppressed) as a consequence of inhibition across all tasks. Rather, task-specific effects were observed in brain regions that were more activated when switching to a particular task but less activated after inhibition of the same task. In addition, compared to other conditions, the emotional task elicited a similar switching cost but lower inhibition cost, accompanied by selective decrease in the anterior cingulate cortex when returning to this task shortly after inhibiting it. These results demonstrate that switching relies on domain-general processes mediated by postero-medial parietal areas, engaged across all tasks, but also provide novel evidence that task inhibition produces domain-specific decreases as a function of particular task demands, with only the latter inhibition component being modulated by emotional information. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Increased heart rate after exercise facilitates the processing of fearful but not disgusted faces.

    PubMed

    Pezzulo, G; Iodice, P; Barca, L; Chausse, P; Monceau, S; Mermillod, M

    2018-01-10

    Embodied theories of emotion assume that emotional processing is grounded in bodily and affective processes. Accordingly, the perception of an emotion re-enacts congruent sensory and affective states; and conversely, bodily states congruent with a specific emotion facilitate emotional processing. This study tests whether the ability to process facial expressions (faces having a neutral expression, expressing fear, or disgust) can be influenced by making the participants' body state congruent with the expressed emotion (e.g., high heart rate in the case of faces expressing fear). We designed a task requiring participants to categorize pictures of male and female faces that either had a neutral expression (neutral), or expressed emotions whose linkage with high heart rate is strong (fear) or significantly weaker or absent (disgust). Critically, participants were tested in two conditions: with experimentally induced high heart rate (Exercise) and with normal heart rate (Normal). Participants processed fearful faces (but not disgusted or neutral faces) faster when they were in the Exercise condition than in the Normal condition. These results support the idea that an emotionally congruent body state facilitates the automatic processing of emotionally-charged stimuli and this effect is emotion-specific rather than due to generic factors such as arousal.

  15. Verbal and facial-emotional Stroop tasks reveal specific attentional interferences in sad mood

    PubMed Central

    Isaac, Linda; Vrijsen, Janna N; Eling, Paul; van Oostrom, Iris; Speckens, Anne; Becker, Eni S

    2012-01-01

    Mood congruence refers to the tendency of individuals to attend to information more readily when it has the same emotional content as their current mood state. The aim of the present study was to ascertain whether attentional interference occurred for participants in sad mood states for emotionally relevant stimuli (mood-congruence), and to determine whether this interference occurred for both valenced words and valenced faces. A mood induction procedure was administered to 116 undergraduate females divided into two equal groups for the sad and happy mood condition. This study employed three versions of the Stroop task: color, verbal-emotional, and a facial-emotional Stroop. The two mood groups did not differ on the color Stroop. Significant group differences were found on the verbal-emotional Stroop for sad words with longer latencies for sad-induced participants. Main findings for the facial-emotional Stroop were that sad mood is associated with attentional interference for angry-threatening faces as well as longer latencies for neutral faces. Group differences were not found for positive stimuli. These findings confirm that sad mood is associated with attentional interference for mood-congruent stimuli in the verbal domain (sad words), but this mood-congruent effect does not necessarily apply to the visual domain (sad faces). Attentional interference for neutral faces suggests sad mood participants did not necessarily see valence-free faces. Attentional interference for threatening stimuli is often associated with anxiety; however, the current results show that threat is not an attentional interference observed exclusively in states of anxiety but also in sad mood. PMID:22574276

  16. Verbal and facial-emotional Stroop tasks reveal specific attentional interferences in sad mood.

    PubMed

    Isaac, Linda; Vrijsen, Janna N; Eling, Paul; van Oostrom, Iris; Speckens, Anne; Becker, Eni S

    2012-01-01

    Mood congruence refers to the tendency of individuals to attend to information more readily when it has the same emotional content as their current mood state. The aim of the present study was to ascertain whether attentional interference occurred for participants in sad mood states for emotionally relevant stimuli (mood-congruence), and to determine whether this interference occurred for both valenced words and valenced faces. A mood induction procedure was administered to 116 undergraduate females divided into two equal groups for the sad and happy mood condition. This study employed three versions of the Stroop task: color, verbal-emotional, and a facial-emotional Stroop. The two mood groups did not differ on the color Stroop. Significant group differences were found on the verbal-emotional Stroop for sad words with longer latencies for sad-induced participants. Main findings for the facial-emotional Stroop were that sad mood is associated with attentional interference for angry-threatening faces as well as longer latencies for neutral faces. Group differences were not found for positive stimuli. These findings confirm that sad mood is associated with attentional interference for mood-congruent stimuli in the verbal domain (sad words), but this mood-congruent effect does not necessarily apply to the visual domain (sad faces). Attentional interference for neutral faces suggests sad mood participants did not necessarily see valence-free faces. Attentional interference for threatening stimuli is often associated with anxiety; however, the current results show that threat is not an attentional interference observed exclusively in states of anxiety but also in sad mood.

  17. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    PubMed

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorder. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic scopolamine. Healthy participants (n=15) and unmedicated, depressed patients with major depressive disorder (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, blood oxygen level-dependent signal was measured using functional MRI during a selective attention task. Two stimuli, each comprising superimposed pictures of faces and houses, were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad), creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in blood oxygen level-dependent response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate (P<.02) and middle occipital cortices (P<.02). The magnitude and direction of differential blood oxygen level-dependent response to implicitly processed emotional faces prior to treatment reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
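
    The biomarker logic described above reduces to a per-patient difference score (implicit-condition BOLD response to happy minus sad faces in a region of interest) correlated with later clinical improvement. A minimal sketch of that correlation; the values, column names, and use of MADRS scores are all hypothetical:

        import pandas as pd
        from scipy.stats import pearsonr

        # Hypothetical per-patient values: implicit-condition ROI betas and symptom ratings
        df = pd.DataFrame({
            "sgACC_implicit_happy": [0.42, 0.10, 0.33, 0.05, 0.51, 0.22],
            "sgACC_implicit_sad":   [0.15, 0.20, 0.12, 0.25, 0.18, 0.21],
            "madrs_baseline":       [30, 28, 34, 26, 32, 29],
            "madrs_post":           [12, 22, 15, 24, 10, 20],
        })

        # Emotion-processing bias: implicit BOLD response to happy minus sad faces
        df["bias_sgACC"] = df["sgACC_implicit_happy"] - df["sgACC_implicit_sad"]

        # Treatment response: percent reduction in depression ratings after scopolamine
        df["improvement"] = 100 * (df["madrs_baseline"] - df["madrs_post"]) / df["madrs_baseline"]

        r, p = pearsonr(df["bias_sgACC"], df["improvement"])
        print(f"r = {r:.2f}, p = {p:.3f}")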

  18. New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    PubMed Central

    Palermo, Romina; O’Connor, Kirsty B.; Davis, Joshua M.; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing “individual differences” – that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach’s alphas of .77 and .76 respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). PMID:23840821

  19. New tests to measure individual differences in matching and labelling facial expressions of emotion, and their association with ability to recognise vocal emotions and facial identity.

    PubMed

    Palermo, Romina; O'Connor, Kirsty B; Davis, Joshua M; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing "individual differences"--that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach's alphas of .77 and .76 respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity).
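
    The reliability figures quoted in both versions of this record (Cronbach's alphas of .77 and .76) come from the standard internal-consistency formula. A minimal sketch of that computation, assuming a participants-by-items accuracy matrix; the demo data are random and purely illustrative:

        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            """items: 2-D array of scores, shape (n_participants, n_items)."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)     # variance of participants' total scores
            return (k / (k - 1)) * (1 - item_vars / total_var)

        rng = np.random.default_rng(0)
        demo = (rng.random((100, 40)) > 0.4).astype(int)  # hypothetical 0/1 response matrix
        print(round(cronbach_alpha(demo), 2))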

  20. Emotional face processing in pediatric bipolar disorder: evidence for functional impairments in the fusiform gyrus.

    PubMed

    Perlman, Susan B; Fournier, Jay C; Bebko, Genna; Bertocci, Michele A; Hinze, Amanda K; Bonar, Lisa; Almeida, Jorge R C; Versace, Amelia; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Demeter, Christine; Diwadkar, Vaibhav A; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Horwitz, Sarah M; Arnold, L Eugene; Fristad, Mary A; Youngstrom, Eric A; Findling, Robert L; Phillips, Mary L

    2013-12-01

    Pediatric bipolar disorder involves poor social functioning, but the neural mechanisms underlying these deficits are not well understood. Previous neuroimaging studies have found deficits in emotional face processing localized to emotional brain regions. However, few studies have examined dysfunction in other regions of the face processing circuit. This study assessed hypoactivation in key face processing regions of the brain in pediatric bipolar disorder. Youth with a bipolar spectrum diagnosis (n = 20) were matched to a nonbipolar clinical group (n = 20), with similar demographics and comorbid diagnoses, and a healthy control group (n = 20). Youth participated in a functional magnetic resonance imaging (fMRI) scanning which employed a task-irrelevant emotion processing design in which processing of facial emotions was not germane to task performance. Hypoactivation, isolated to the fusiform gyrus, was found when viewing animated, emerging facial expressions of happiness, sadness, fearfulness, and especially anger in pediatric bipolar participants relative to matched clinical and healthy control groups. The results of the study imply that differences exist in visual regions of the brain's face processing system and are not solely isolated to emotional brain regions such as the amygdala. Findings are discussed in relation to facial emotion recognition and fusiform gyrus deficits previously reported in the autism literature. Behavioral interventions targeting attention to facial stimuli might be explored as possible treatments for bipolar disorder in youth. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  1. Affective bias in visual working memory is associated with capacity.

    PubMed

    Xie, Weizhen; Li, Huanhuan; Ying, Xiangyu; Zhu, Shiyou; Fu, Rong; Zou, Yingmin; Cui, Yanyan

    2017-11-01

    How does the affective nature of task stimuli modulate working memory (WM)? This study investigates whether WM maintains emotional information in a biased manner to meet the motivational principle of approaching positivity and avoiding negativity by retaining more approach-related positive content over avoidance-related negative content. This bias may exist regardless of individual differences in WM functionality, as indexed by WM capacity (overall bias hypothesis). Alternatively, this bias may be contingent on WM capacity (capacity-based hypothesis), in which a better WM system may be more likely to reveal an adaptive bias. In two experiments, participants performed change localisation tasks with emotional and non-emotional stimuli to estimate the number of items that they could retain for each of those stimuli. Although participants did not seem to remember one type of emotional content (e.g. happy faces) better than the other type of emotional content (e.g. sad faces), there was a significant correlation between WM capacity and affective bias. Specifically, participants with higher WM capacity for non-emotional stimuli (colours or line-drawing symbols) tended to maintain more happy faces over sad faces. These findings demonstrated the presence of a "built-in" affective bias in WM as a function of its systematic limitations, favouring the capacity-based hypothesis.
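
    The capacity-based hypothesis above rests on two ingredients: a per-participant working-memory capacity estimate from the change-localisation task and a per-participant affective bias (capacity for happy minus sad faces). A minimal sketch using one common guessing-corrected capacity formula (an observer who stores K of N items and guesses uniformly otherwise); this illustrates the logic rather than the exact estimator used in the study, and all values are hypothetical:

        import numpy as np
        from scipy.stats import pearsonr

        def capacity_k(p_correct, set_size):
            """Guessing-corrected capacity for change localisation with `set_size` items."""
            return set_size * (set_size * p_correct - 1) / (set_size - 1)

        # Hypothetical per-participant values (set size 6)
        acc_colours = np.array([0.55, 0.70, 0.62, 0.48, 0.80])   # non-emotional accuracy
        k_happy     = np.array([2.1, 3.0, 2.4, 1.8, 3.5])        # capacity for happy faces
        k_sad       = np.array([2.0, 2.4, 2.3, 1.9, 2.6])        # capacity for sad faces

        k_nonemotional = capacity_k(acc_colours, set_size=6)
        affective_bias = k_happy - k_sad                         # positive = happy advantage

        r, p = pearsonr(k_nonemotional, affective_bias)
        print(f"r = {r:.2f}, p = {p:.3f}")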

  2. Parametric modulation of neural activity during face emotion processing in unaffected youth at familial risk for bipolar disorder.

    PubMed

    Brotman, Melissa A; Deveney, Christen M; Thomas, Laura A; Hinton, Kendra E; Yi, Jennifer Y; Pine, Daniel S; Leibenluft, Ellen

    2014-11-01

    Both patients with pediatric bipolar disorder (BD) and unaffected youth at familial risk (AR) for the illness show impairments in face emotion labeling. Few studies, however, have examined brain regions engaged in AR youth when processing emotional faces. Moreover, studies have yet to explore neural responsiveness to subtle changes in face emotion in AR youth. Sixty-four unrelated youth, including 20 patients with BD, 15 unaffected AR youth, and 29 healthy comparisons (HC), completed functional magnetic resonance imaging. Neutral faces were morphed with angry or happy faces in 25% intervals. In specific phases of the task, youth alternatively made explicit (hostility) or implicit (nose width) ratings of the faces. The slope of blood oxygenated level-dependent activity was calculated across neutral to angry and neutral to happy face stimuli. Behaviorally, both subjects with BD (p ≤ 0.001) and AR youth (p ≤ 0.05) rated faces as less hostile relative to HC. Consistent with this, in response to increasing anger on the face, patients with BD and AR youth showed decreased modulation in the amygdala and inferior frontal gyrus (IFG; BA 46) compared to HC (all p ≤ 0.05). Amygdala dysfunction was present across both implicit and explicit rating conditions, but IFG modulation deficits were specific to the explicit condition. With increasing happiness, AR youth showed aberrant modulation in the IFG, which was also sensitive to task demands (all p ≤ 0.05). Decreased amygdala and IFG modulation in patients with BD and AR youth may be pathophysiological risk markers for BD, and may underlie the social cognition and face emotion labeling deficits observed in BD and AR youth. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
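
    The key dependent measure above is a per-subject slope of regional activity across the morph continuum (0-100% anger in 25% steps). A minimal sketch of that slope with a least-squares fit; the beta values are hypothetical:

        import numpy as np

        morph_levels = np.array([0, 25, 50, 75, 100])              # percent anger in the face
        amygdala_beta = np.array([0.10, 0.18, 0.22, 0.31, 0.35])   # hypothetical single subject

        # Least-squares slope: change in BOLD response per unit increase in anger intensity
        slope, intercept = np.polyfit(morph_levels, amygdala_beta, deg=1)
        print(f"modulation slope = {slope:.4f} beta units per % anger")
        # Group analyses would then compare these per-subject slopes between BD, AR and HC.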

  3. Emotional body-word conflict evokes enhanced n450 and slow potential.

    PubMed

    Ma, Jianling; Liu, Chang; Zhong, Xin; Wang, Lu; Chen, Xu

    2014-01-01

    Emotional conflict refers to the influence of task irrelevant affective stimuli on current task set. Previously used emotional face-word tasks have produced certain electrophysiological phenomena, such as an enhanced N450 and slow potential; however, it remains unknown whether these effects emerge in other tasks. The present study used an emotional body-word conflict task to investigate the neural dynamics of emotional conflict as reflected by response time, accuracy, and event-related potentials, which were recorded with the aim of replicating the previously observed N450 and slow potential effect. Results indicated increased response time and decreased accuracy in the incongruent condition relative to the congruent condition, indicating a robust interference effect. Furthermore, the incongruent condition evoked pronounced N450 amplitudes and a more positive slow potential, which might be associated with conflict-monitoring and conflict resolution. The present findings extend our understanding of emotional conflict to the body-word domain.
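
    The N450 congruency effect reported above is typically quantified as the difference in mean amplitude between incongruent and congruent trials within a fixed post-stimulus window. A minimal sketch, assuming epoched EEG arrays of shape (trials, channels, samples); the sampling rate, window, channel index, and data are all hypothetical:

        import numpy as np

        sfreq = 500.0            # Hz
        epoch_start = -0.2       # seconds relative to stimulus onset
        win = (0.40, 0.50)       # N450 window in seconds

        def mean_amplitude(epochs: np.ndarray, channel: int) -> np.ndarray:
            """Per-trial mean voltage in the N450 window at one channel."""
            i0 = int((win[0] - epoch_start) * sfreq)
            i1 = int((win[1] - epoch_start) * sfreq)
            return epochs[:, channel, i0:i1].mean(axis=1)

        rng = np.random.default_rng(1)
        congruent = rng.normal(0.0, 2.0, (80, 64, 500))    # hypothetical epochs
        incongruent = rng.normal(-0.5, 2.0, (80, 64, 500))

        cz = 31                                            # hypothetical central channel index
        effect = mean_amplitude(incongruent, cz).mean() - mean_amplitude(congruent, cz).mean()
        print(f"N450 congruency effect: {effect:.2f} µV (more negative = enhanced N450)")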

  4. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    PubMed

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  5. Fixation to features and neural processing of facial expressions in a gender discrimination task

    PubMed Central

    Neath, Karly N.; Itier, Roxane J.

    2017-01-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected general sensitivity to face position. An early effect of emotion (~120 ms) for happy faces was seen at occipital sites and was sustained until ~350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect from ~150 ms until ~300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye-sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. PMID:26277653

  6. Psychopaths lack the automatic avoidance of social threat: relation to instrumental aggression.

    PubMed

    Louise von Borries, Anna Katinka; Volman, Inge; de Bruijn, Ellen Rosalia Aloïs; Bulten, Berend Hendrik; Verkes, Robbert Jan; Roelofs, Karin

    2012-12-30

    Psychopathy (PP) is associated with marked abnormalities in social emotional behaviour, such as high instrumental aggression (IA). A crucial but largely ignored question is whether automatic social approach-avoidance tendencies may underlie this condition. We tested whether offenders with PP show lack of automatic avoidance tendencies, usually activated when (healthy) individuals are confronted with social threat stimuli (angry faces). We applied a computerized approach-avoidance task (AAT), where participants pushed or pulled pictures of emotional faces using a joystick, upon which the faces decreased or increased in size, respectively. Furthermore, participants completed an emotion recognition task which was used to control for differences in recognition of facial emotions. In contrast to healthy controls (HC), PP patients showed total absence of avoidance tendencies towards angry faces. Interestingly, those responses were related to levels of instrumental aggression and the (in)ability to experience personal distress (PD). These findings suggest that social performance in psychopaths is disturbed on a basic level of automatic action tendencies. The lack of implicit threat avoidance tendencies may underlie their aggressive behaviour. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
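
    Approach-avoidance tendencies in an AAT are commonly scored as a push/pull latency difference per emotion; under one common convention, faster pushing than pulling of angry faces indexes an avoidance tendency, and its absence in the patient group is the effect reported above. A minimal sketch of that scoring; the data and column names are hypothetical:

        import pandas as pd

        # Hypothetical trial-level AAT data
        aat = pd.DataFrame({
            "subject":  [1] * 8 + [2] * 8,
            "emotion":  ["angry", "angry", "happy", "happy"] * 4,
            "movement": ["push", "pull"] * 8,
            "rt_ms":    [650, 700, 640, 630, 655, 690, 635, 642,
                         720, 770, 700, 690, 715, 765, 705, 698],
        })

        medians = (aat.groupby(["subject", "emotion", "movement"])["rt_ms"]
                      .median()
                      .unstack("movement"))

        # Avoidance bias: pull (approach) latency minus push (avoid) latency;
        # positive values mean the face is pushed away faster than it is pulled closer.
        medians["avoidance_bias_ms"] = medians["pull"] - medians["push"]
        print(medians["avoidance_bias_ms"].groupby(level="emotion").mean())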

  7. History of Childhood Maltreatment and College Academic Outcomes: Indirect Effects of Hot Execution Function

    PubMed Central

    Welsh, Marilyn C.; Peterson, Eric; Jameson, Molly M.

    2017-01-01

    College students who report a history of childhood maltreatment may be at risk for poor outcomes. In the current study, we conducted an exploratory analysis to examine potential models that statistically mediate associations between aspects of maltreatment and aspects of academic outcome, with a particular focus on executive functions (EF). Consistent with contemporary EF research, we distinguished between relatively “cool” EF tasks (i.e., performed in a context relatively free of emotional or motivational valence) and “hot” EF tasks that emphasize performance under more emotionally arousing conditions. Sixty-one male and female college undergraduates self-reported childhood maltreatment history (emotional abuse and neglect, physical abuse and neglect, and sexual abuse) on the Childhood Trauma Questionnaire (CTQ), and were given two EF measures: (1) a Go-No-Go (GNG) test that included a Color Condition (cool), a Neutral Face Condition (warm), and an Emotion Face condition (hot), and (2) the Iowa Gambling Task (IGT), a measure of risky decision making that reflects hot EF. Academic outcomes were: (1) grade point average (GPA: first-semester, cumulative, and semester concurrent with testing), and (2) the Student Adaptation to College Questionnaire (SACQ). Correlational patterns suggested two EF scores as potential mediators: GNG reaction time (RT) in the Neutral Face condition, and IGT Block 2 adaptive responding. Indirect effects analyses indicated that IGT Block 2 adaptive responding has an indirect effect on the relationship between CTQ Total score and 1st semester GPA, and between CTQ Emotional Abuse and concurrent GPA. Regarding college adaptation, we identified a consistent indirect effect of GNG Neutral Face RT on the relationship between CTQ Emotional Neglect and SACQ total, academic, social, and personal–emotional adaptation scores. Our results demonstrate that higher scores on a child maltreatment history self-report negatively predict college academic outcomes as assessed by GPA and by self-reported adaptation. Further, relatively “hot” EF task performance on the IGT and GNG tasks serves as a link between child maltreatment experiences and college achievement and adaptation, suggesting that hot EF skills may be a fruitful direction for future intervention efforts to improve academic outcomes for this population. PMID:28725204
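
    The indirect effects reported above follow the usual two-regression logic: an a-path from the maltreatment score to the hot-EF measure, a b-path from the hot-EF measure to the academic outcome controlling for maltreatment, and a bootstrap confidence interval for the a*b product. A minimal sketch of that test on simulated data; variable names and effect sizes are hypothetical:

        import numpy as np

        def indirect_effect(x, m, y):
            a = np.polyfit(x, m, 1)[0]                       # X -> M path
            X = np.column_stack([np.ones_like(x), x, m])     # Y ~ intercept + X + M
            b = np.linalg.lstsq(X, y, rcond=None)[0][2]      # partial slope of M
            return a * b

        rng = np.random.default_rng(2)
        n = 61
        ctq = rng.normal(0, 1, n)                            # maltreatment score
        hot_ef = 0.4 * ctq + rng.normal(0, 1, n)             # hypothetical mediator
        gpa = -0.3 * hot_ef + rng.normal(0, 1, n)            # hypothetical outcome

        boot = []
        for _ in range(2000):                                # percentile bootstrap of a*b
            idx = rng.integers(0, n, n)
            boot.append(indirect_effect(ctq[idx], hot_ef[idx], gpa[idx]))
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect = {indirect_effect(ctq, hot_ef, gpa):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")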

  8. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    PubMed

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied, for the first time, psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind balanced cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time for all three categories of affective stimuli was found. Psilocybin decreased the connectivity between the amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala. Psilocybin decreases the connectivity between important nodes linked to emotion processing, such as the frontal pole and the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients. Copyright © 2018 Elsevier B.V. and ECNP. All rights reserved.
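
    Beta-series connectivity, the method named above, correlates trial-by-trial beta estimates from a seed region with the corresponding trial series in every other voxel, separately per condition (drug by face emotion). A minimal sketch of the core computation on simulated arrays; shapes and values are hypothetical:

        import numpy as np

        def beta_series_connectivity(seed_betas: np.ndarray, voxel_betas: np.ndarray) -> np.ndarray:
            """seed_betas: (n_trials,); voxel_betas: (n_trials, n_voxels).
            Returns the Pearson correlation of the seed series with each voxel's series."""
            seed = (seed_betas - seed_betas.mean()) / seed_betas.std()
            vox = (voxel_betas - voxel_betas.mean(axis=0)) / voxel_betas.std(axis=0)
            return (seed[:, None] * vox).mean(axis=0)

        rng = np.random.default_rng(3)
        seed = rng.normal(size=40)                        # 40 trials of one condition
        voxels = rng.normal(size=(40, 5000))              # hypothetical in-brain voxels
        r_map = beta_series_connectivity(seed, voxels)    # one r value per voxel
        print(r_map.shape, float(r_map.max()))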

  9. Neurofunctional Underpinnings of Audiovisual Emotion Processing in Teens with Autism Spectrum Disorders

    PubMed Central

    Doyle-Thomas, Krissy A.R.; Goldberg, Jeremy; Szatmari, Peter; Hall, Geoffrey B.C.

    2013-01-01

    Despite successful performance on some audiovisual emotion tasks, hypoactivity has been observed in frontal and temporal integration cortices in individuals with autism spectrum disorders (ASD). Little is understood about the neurofunctional network underlying this ability in individuals with ASD. Research suggests that there may be processing biases in individuals with ASD, based on their ability to obtain meaningful information from the face and/or the voice. This functional magnetic resonance imaging study examined brain activity in teens with ASD (n = 18) and typically developing controls (n = 16) during audiovisual and unimodal emotion processing. Teens with ASD had a significantly lower accuracy when matching an emotional face to an emotion label. However, no differences in accuracy were observed between groups when matching an emotional voice or face-voice pair to an emotion label. In both groups brain activity during audiovisual emotion matching differed significantly from activity during unimodal emotion matching. Between-group analyses of audiovisual processing revealed significantly greater activation in teens with ASD in a parietofrontal network believed to be implicated in attention, goal-directed behaviors, and semantic processing. In contrast, controls showed greater activity in frontal and temporal association cortices during this task. These results suggest that in the absence of engaging integrative emotional networks during audiovisual emotion matching, teens with ASD may have recruited the parietofrontal network as an alternate compensatory system. PMID:23750139

  10. The perception of positive and negative facial expressions by unilateral stroke patients.

    PubMed

    Abbott, Jacenta D; Wijeratne, Tissa; Hughes, Andrew; Perre, Diana; Lindell, Annukka K

    2014-04-01

    There remains conflict in the literature about the lateralisation of affective face perception. Some studies have reported a right hemisphere advantage irrespective of valence, whereas others have found a left hemisphere advantage for positive, and a right hemisphere advantage for negative, emotion. Differences in injury aetiology and chronicity, proportion of male participants, participant age, and the number of emotions used within a perception task may contribute to these contradictory findings. The present study therefore controlled and/or directly examined the influence of these possible moderators. Right brain-damaged (RBD; n=17), left brain-damaged (LBD; n=17), and healthy control (HC; n=34) participants completed two face perception tasks (identification and discrimination). No group differences in facial expression perception according to valence were found. Across emotions, the RBD group was less accurate than the HC group; however, RBD and LBD group performance did not differ. The lack of difference between RBD and LBD groups indicates that both hemispheres are involved in positive and negative expression perception. The inclusion of older adults and the well-defined chronicity range of the brain-damaged participants may have moderated these findings. Participant sex and general face perception ability did not influence performance. Furthermore, while the RBD group was less accurate than the LBD group when the identification task tested two emotions, performance of the two groups was indistinguishable when the number of emotions increased (four or six). This suggests that task demand moderates a study's ability to find hemispheric differences in the perception of facial emotion. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. How Children Use Emotional Prosody: Crossmodal Emotional Integration?

    ERIC Educational Resources Information Center

    Gil, Sandrine; Hattouti, Jamila; Laval, Virginie

    2016-01-01

    A crossmodal effect has been observed in the processing of facial and vocal emotion in adults and infants. For the first time, we assessed whether this effect is present in childhood by administering a crossmodal task similar to those used in seminal studies featuring emotional faces (i.e., a continuum of emotional expressions running from…

  12. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    PubMed

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

    Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced and better facial emotion recognition was observed in a promotion focus compared to a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  13. The Value of Emoticons in Investigating Student Emotions Related to Mathematics Task Negotiation

    ERIC Educational Resources Information Center

    D'Agostin, Fabio

    2014-01-01

    "Emoticons" are simple face icons expressing common feelings such as happiness, interest and boredom and are popularly used in electronic communication. Emoticons were utilised in this study as experience sampling devices. Year 10 students selected emoticons to indicate their emotional states at set intervals during classroom tasks.…

  14. Cognitive Flexibility in ASD; Task Switching with Emotional Faces

    ERIC Educational Resources Information Center

    de Vries, Marieke; Geurts, Hilde M.

    2012-01-01

    Children with autism spectrum disorders (ASDs) show daily cognitive flexibility deficits, but laboratory data are unconvincing. The current study aimed to bridge this gap. Thirty-one children with ASD (8-12 years) and 31 age- and IQ-matched typically developing children performed a gender emotion switch task. Unannounced switches and complex…

  15. Facial EMG responses to emotional expressions are related to emotion perception ability.

    PubMed

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories understanding emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses - recorded with electromyogram (EMG) - in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  16. Facial EMG Responses to Emotional Expressions Are Related to Emotion Perception Ability

    PubMed Central

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories understanding emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses - recorded with electromyogram (EMG) - in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective. PMID:24489647

  17. Patients with Parkinson's disease display a dopamine therapy related negative bias and an enlarged range in emotional responses to facial emotional stimuli.

    PubMed

    Lundqvist, Daniel; Svärd, Joakim; Michelgård Palmquist, Åsa; Fischer, Håkan; Svenningsson, Per

    2017-09-01

    The literature on emotional processing in Parkinson's disease (PD) patients shows mixed results. This may be because of various methodological and/or patient-related differences, such as failing to adjust for cognitive functioning, depression, and/or mood. In the current study, we tested PD patients and healthy controls (HCs) using emotional stimuli across a variety of tasks, including visual search, short-term memory (STM), categorical perception, and emotional stimulus rating. The PD and HC groups were matched on cognitive ability, depression, and mood. We also explored possible relationships between task results and antiparkinsonian treatment effects, as measured by levodopa equivalent dosages (LED), in the PD group. The results show that PD patients use a larger emotional range compared with HCs when reporting their impression of emotional faces on rated emotional valence, arousal, and potency. The results also show that dopaminergic therapy was correlated with stimulus rating results such that PD patients with higher LED scores rated negative faces as less arousing, less negative, and less powerful. Finally, results also show that PD patients display a general slowing effect in the visual search tasks compared with HCs, indicating overall slowed responses. There were no group differences observed in the STM or categorical perception tasks. Our results indicate a relationship between emotional responses, PD, and dopaminergic therapy, in which PD per se is associated with stronger emotional responses, whereas LED levels are negatively correlated with the strength of emotional responses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. Development of Emotional Face Processing in Premature and Full-Term Infants.

    PubMed

    Carbajal-Valenzuela, Cintli Carolina; Santiago-Rodríguez, Efraín; Quirarte, Gina L; Harmony, Thalía

    2017-03-01

    The rate of premature births has increased in the past 2 decades. Ten percent of premature birth survivors develop motor impairment, but almost half exhibit later sensorial, cognitive, and emotional disabilities attributed to white matter injury and decreased volume of neuronal structures. The aim of this study was to test the hypothesis that premature and full-term infants differ in their development of emotional face processing. A comparative longitudinal study was conducted in premature and full-term infants at 4 and 8 months of age. The absolute power of the electroencephalogram was analyzed in both groups during 5 conditions of an emotional face processing task: positive, negative, neutral faces, non-face, and rest. Differences between the conditions of the task at 4 months were limited to rest versus non-rest comparisons in both groups. Eight-month-old term infants had increases ( P ≤ .05) in absolute power in the left occipital region at the frequency of 10.1 Hz and in the right occipital region at 3.5, 12.8, and 16.0 Hz when shown a positive face in comparison with a neutral face. They also showed increases in absolute power in the left occipital region at 1.9 Hz and in the right occipital region at 2.3 and 3.5 Hz with positive compared to non-face stimuli. In contrast, positive, negative, and neutral faces elicited the same responses in premature infants. In conclusion, our study provides electrophysiological evidence that emotional face processing develops differently in premature than in full-term infants, suggesting that premature birth alters mechanisms of brain development, such as the myelination process, and consequently affects complex cognitive functions.
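
    The dependent measure above is absolute EEG power at specific frequencies (e.g., 3.5, 10.1, 12.8, and 16.0 Hz) in occipital channels. A minimal sketch of one way to obtain such values with Welch's method; the sampling rate and data are hypothetical, and the exact spectral estimator used in the original study may differ:

        import numpy as np
        from scipy.signal import welch

        sfreq = 256.0
        rng = np.random.default_rng(4)
        occipital = rng.normal(size=int(4 * sfreq))          # 4 s of one-channel EEG (hypothetical)

        # Welch PSD with 2-s segments gives 0.5 Hz frequency resolution
        freqs, psd = welch(occipital, fs=sfreq, nperseg=int(2 * sfreq))

        def power_at(target_hz: float) -> float:
            """Absolute power at the PSD bin closest to target_hz."""
            return float(psd[np.argmin(np.abs(freqs - target_hz))])

        for hz in (3.5, 10.1, 12.8, 16.0):
            print(f"{hz:5.1f} Hz: {power_at(hz):.4f}")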

  19. Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.

    PubMed

    Karl, Christian; Hewig, Johannes; Osinsky, Roman

    2016-10-01

    There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded electroencephalogram from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.

  20. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    PubMed

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

    Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, whereas discrimination of happy versus neutral faces was not. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images, after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests a reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by an increased attention to negative stimuli. Copyright © 2015 Elsevier B.V. and ECNP. All rights reserved.
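
    Response strength in the 230-248 ms interval is indexed above by global field power (GFP), i.e., the spatial standard deviation of the evoked potential across all electrodes at each time point. A minimal sketch of that computation on a hypothetical evoked array:

        import numpy as np

        sfreq = 512.0
        rng = np.random.default_rng(5)
        evoked = rng.normal(size=(64, 400))        # channels x samples, hypothetical evoked response

        # GFP: spatial standard deviation across electrodes at every time point
        gfp = evoked.std(axis=0, ddof=0)

        # Mean GFP in the 230-248 ms window (time zero = stimulus onset)
        t = np.arange(evoked.shape[1]) / sfreq
        window = (t >= 0.230) & (t <= 0.248)
        print(f"mean GFP 230-248 ms: {gfp[window].mean():.3f} µV")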

  1. Configural and Featural Face Processing Influences on Emotion Recognition in Schizophrenia and Bipolar Disorder.

    PubMed

    Van Rheenen, Tamsyn E; Joshua, Nicole; Castle, David J; Rossell, Susan L

    2017-03-01

    Emotion recognition impairments have been demonstrated in schizophrenia (Sz), but are less consistent and lesser in magnitude in bipolar disorder (BD). This may be related to the extent to which different face processing strategies are engaged during emotion recognition in each of these disorders. We recently showed that Sz patients had impairments in the use of both featural and configural face processing strategies, whereas BD patients were impaired only in the use of the latter. Here we examine the influence that these impairments have on facial emotion recognition in these cohorts. Twenty-eight individuals with Sz, 28 individuals with BD, and 28 healthy controls completed a facial emotion labeling task with two conditions designed to separate the use of featural and configural face processing strategies; part-based and whole-face emotion recognition. Sz patients performed worse than controls on both conditions, and worse than BD patients on the whole-face condition. BD patients performed worse than controls on the whole-face condition only. Configural processing deficits appear to influence the recognition of facial emotions in BD, whereas both configural and featural processing abnormalities impair emotion recognition in Sz. This may explain discrepancies in the profiles of emotion recognition between the disorders. (JINS, 2017, 23, 287-291).

  2. Cultural immersion alters emotion perception: Neurophysiological evidence from Chinese immigrants to Canada.

    PubMed

    Liu, Pan; Rigoulot, Simon; Pell, Marc D

    2017-12-01

    To explore how cultural immersion modulates emotion processing, this study examined how Chinese immigrants to Canada process multisensory emotional expressions, which were compared to existing data from two groups, Chinese and North Americans. Stroop and Oddball paradigms were employed to examine different stages of emotion processing. The Stroop task presented face-voice pairs expressing congruent/incongruent emotions and participants actively judged the emotion of one modality while ignoring the other. A significant effect of cultural immersion was observed in the immigrants' behavioral performance, which showed greater interference from to-be-ignored faces, comparable with what was observed in North Americans. However, this effect was absent in their N400 data, which retained the same pattern as the Chinese. In the Oddball task, where immigrants passively viewed facial expressions with/without simultaneous vocal emotions, they exhibited a larger visual MMN for faces accompanied by voices, again mirroring patterns observed in Chinese. Correlation analyses indicated that the immigrants' living duration in Canada was associated with neural patterns (N400 and visual mismatch negativity) more closely resembling North Americans. Our data suggest that in multisensory emotion processing, adapting to a new culture first leads to behavioral accommodation followed by alterations in brain activities, providing new evidence on human neurocognitive plasticity in communication.

  3. The effect of sad facial expressions on weight judgment

    PubMed Central

    Weston, Trent D.; Hass, Norah C.; Lim, Seung-Lark

    2015-01-01

    Although the body weight evaluation (e.g., normal or overweight) of others relies on perceptual impressions, it also can be influenced by other psychosocial factors. In this study, we explored the effect of task-irrelevant emotional facial expressions on judgments of body weight and the relationship between emotion-induced weight judgment bias and other psychosocial variables including attitudes toward obese persons. Forty-four participants were asked to quickly make binary body weight decisions for 960 randomized sad and neutral faces of varying weight levels presented on a computer screen. The results showed that sad facial expressions systematically decreased the decision threshold of overweight judgments for male faces. This perceptual decision bias by emotional expressions was positively correlated with the belief that being overweight is not under the control of obese persons. Our results provide experimental evidence that task-irrelevant emotional expressions can systematically change the decision threshold for weight judgments, demonstrating that sad expressions can make faces appear more overweight than they would otherwise be judged. PMID:25914669
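
    The "decision threshold" finding above can be illustrated by fitting a psychometric function to the proportion of overweight judgments across weight levels, separately for sad and neutral faces, and comparing the 50% points. A minimal sketch with a logistic fit; the weight levels and response proportions are hypothetical:

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(x, threshold, slope):
            return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

        weight_levels = np.array([-3, -2, -1, 0, 1, 2, 3], dtype=float)   # morph steps around normal weight
        p_over_neutral = np.array([0.02, 0.05, 0.20, 0.50, 0.80, 0.95, 0.99])
        p_over_sad     = np.array([0.05, 0.12, 0.35, 0.65, 0.88, 0.97, 0.99])

        (th_neutral, _), _ = curve_fit(logistic, weight_levels, p_over_neutral, p0=[0.0, 1.0])
        (th_sad, _), _ = curve_fit(logistic, weight_levels, p_over_sad, p0=[0.0, 1.0])

        # A positive shift means sad faces reach the "overweight" criterion at lower weight levels
        print(f"threshold shift (neutral - sad): {th_neutral - th_sad:.2f} weight steps")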

  4. Older Adults' Trait Impressions of Faces Are Sensitive to Subtle Resemblance to Emotions

    PubMed Central

    Zebrowitz, Leslie A.

    2013-01-01

    Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a ‘pop-out’ effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions. PMID:24058225

  5. Cultural differences in gaze and emotion recognition: Americans contrast more than Chinese.

    PubMed

    Stanley, Jennifer Tehan; Zhang, Xin; Fung, Helene H; Isaacowitz, Derek M

    2013-02-01

    We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye-tracking data suggest that, for some emotions, Americans attended more to the target faces, and they made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  6. Cultural Differences in Gaze and Emotion Recognition: Americans Contrast More than Chinese

    PubMed Central

    Tehan Stanley, Jennifer; Zhang, Xin; Fung, Helene H.; Isaacowitz, Derek M.

    2014-01-01

    We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye tracking data suggest that, for some emotions, Americans attended more to the target faces and made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PMID:22889414

  7. The role of the amygdala and the basal ganglia in visual processing of central vs. peripheral emotional content.

    PubMed

    Almeida, Inês; van Asselen, Marieke; Castelo-Branco, Miguel

    2013-09-01

    In human cognition, most relevant stimuli, such as faces, are processed in central vision. However, it is widely believed that recognition of relevant stimuli (e.g. threatening animal faces) at peripheral locations is also important due to their survival value. Moreover, task instructions have been shown to modulate brain regions involved in threat recognition (e.g. the amygdala). In this respect it is also controversial whether tasks requiring explicit focus on stimulus threat content vs. implicit processing differently engage primitive subcortical structures involved in emotional appraisal. Here we have addressed the role of central vs. peripheral processing in the human amygdala using threatening vs. non-threatening animal face stimuli. First, a simple animal face recognition task with threatening and non-threatening animal faces, as well as non-face control stimuli, was employed in naïve subjects (implicit task). A subsequent task was then performed with the same stimulus categories (but different stimuli) in which subjects were told to explicitly detect threat signals. We found lateralized amygdala responses both to the spatial location of stimuli and to the threatening content of faces depending on the task performed: the right amygdala showed increased responses to central compared to left presented stimuli specifically during the threat detection task, while the left amygdala was better able to discriminate threatening faces from non-facial displays during the animal face recognition task. Additionally, the right amygdala responded to faces during the threat detection task but only when centrally presented. Moreover, we have found no evidence for superior responses of the amygdala to peripheral stimuli. Importantly, we have found that striatal regions activate differentially depending on peripheral vs. central processing of threatening faces. Accordingly, peripheral processing of these stimuli activated the putaminal region more strongly, while central processing engaged mainly the caudate nucleus. We conclude that the human amygdala has a central bias for face stimuli, and that visual processing recruits different striatal regions, putamen- or caudate-based, depending on the task and on whether peripheral or central visual processing is involved. © 2013 Elsevier Ltd. All rights reserved.

  8. Diagnostic Features of Emotional Expressions Are Processed Preferentially

    PubMed Central

    Scheller, Elisa; Büchel, Christian; Gamer, Matthias

    2012-01-01

    Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders. PMID:22848607

  9. Diagnostic features of emotional expressions are processed preferentially.

    PubMed

    Scheller, Elisa; Büchel, Christian; Gamer, Matthias

    2012-01-01

    Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.

  10. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    PubMed

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and the SMC activity during social behavior is region- and emotion-specific.

  11. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    PubMed

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  12. The effects of familiarity and emotional expression on face processing examined by ERPs in patients with schizophrenia.

    PubMed

    Caharel, Stéphanie; Bernard, Christian; Thibaut, Florence; Haouzir, Sadec; Di Maggio-Clozel, Carole; Allio, Gabrielle; Fouldrin, Gaël; Petit, Michel; Lalonde, Robert; Rebaï, Mohamed

    2007-09-01

    The main objective of the study was to determine whether patients with schizophrenia are deficient relative to controls in the processing of faces at different levels of familiarity and types of emotion, and the stage at which such differences may occur. ERPs based on 18 patients with schizophrenia and 18 controls were compared in a face identification task at three levels of familiarity (unknown, familiar, subject's own) and for three types of emotion (disgust, smiling, neutral). The schizophrenic group was less accurate than controls in face processing, especially for unknown faces and those expressing negative emotions such as disgust. P1 and N170 amplitudes were lower, and P1, N170, and P250 components were of slower onset, in patients with schizophrenia. N170 and P250 amplitudes were modulated by familiarity and facial expression in a different manner in patients than in controls. Schizophrenia is associated with a generalized deficit in face processing, both in terms of familiarity and emotional expression, attributable to deficient processing at sensory (P1) and perceptual (N170) stages. These patients appear to have difficulty in encoding the structure of a face and thereby do not correctly evaluate familiarity and emotion.

  13. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    PubMed

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD relative to control group showed greater activation in the amygdala, vPFC and striatum (p < .05 small volume corrected), particularly to sad faces. Moreover, in the ASD group, there was a negative correlation between developmental variables (age and pubertal status) and mean activation from the whole bilateral amygdala; younger adolescents showed greater activation than older adolescents. There were no group differences in accuracy or reaction time in the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  14. The processing of facial identity and expression is interactive, but dependent on task and experience

    PubMed Central

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    Facial identity and emotional expression are two important sources of information for daily social interaction. However, the link between these two aspects of face processing has been the focus of an unresolved debate for the past three decades. Three views have been advocated: (1) separate and parallel processing of identity and emotional expression signals derived from faces; (2) asymmetric processing, with the computation of emotion in faces depending on facial identity coding but not vice versa; and (3) integrated processing of facial identity and emotion. We present studies with healthy participants that primarily apply methods from mathematical psychology, formally testing the relations between the processing of facial identity and emotion. Specifically, we focused on the “Garner” paradigm, the composite face effect and divided attention tasks. We further ask whether the architecture of face-related processes is fixed or flexible and whether (and how) it can be shaped by experience. We conclude that formal methods of testing the relations between processes show that the processing of facial identity and expressions interact, and hence are not fully independent. We further demonstrate that the architecture of the relations depends on experience, with experience leading to a higher degree of inter-dependence in the processing of identity and expressions. We propose that this change occurs because integrative processing is more efficient than parallel processing. Finally, we argue that the dynamic aspects of face processing need to be incorporated into theories in this field. PMID:25452722

  15. Guanfacine Modulates the Emotional Biasing of Amygdala-Prefrontal Connectivity for Cognitive Control

    PubMed Central

    Schulz, Kurt P.; Clerkin, Suzanne M.; Newcorn, Jeffrey H.; Halperin, Jeffrey M.; Fan, Jin

    2014-01-01

    Functional interactions between amygdala and prefrontal cortex provide a cortical entry point for emotional cues to bias cognitive control. Stimulation of α2 adrenoceptors enhances prefrontal control functions and blocks the amygdala-dependent encoding of emotional cues. However, the impact of this stimulation on amygdala-prefrontal interactions and the emotional biasing of cognitive control has not been established. We tested the effect of the α2 adrenoceptor agonist guanfacine on psychophysiological interactions of the amygdala with prefrontal cortex during the emotional biasing of response execution and inhibition. Fifteen healthy adults were scanned twice with event-related functional magnetic resonance imaging while performing an emotional go/no-go task following administration of oral guanfacine (1 mg) and placebo in a double-blind, counterbalanced design. Happy, sad, and neutral faces served as trial cues. Guanfacine moderated the effect of face emotion on the task-related functional connectivity of the left and right amygdala with the left inferior frontal gyrus compared to placebo, by selectively reversing the functional co-activation of the two regions for response execution cued by sad faces. This shift from positively to negatively correlated activation under guanfacine was associated with selective improvements in the relatively low accuracy of responses to sad faces seen under placebo. These results demonstrate the importance of functional interactions between amygdala and inferior frontal gyrus to both bottom-up biasing of cognitive control and top-down control of emotional processing, as well as to the α2 adrenoceptor-mediated modulation of these processes. These mechanisms offer a possible means of addressing the emotional reactivity that is common to several psychiatric disorders. PMID:25059532

  16. The effects of social anxiety on emotional face discrimination and its modulation by mouth salience.

    PubMed

    du Rocher, Andrew R; Pickering, Alan D

    2018-05-21

    People high in social anxiety experience fear of social situations due to the likelihood of social evaluation. Whereas happy faces are generally processed very quickly, this effect is impaired by high social anxiety. Mouth regions are implicated during emotional face processing, therefore differences in mouth salience might affect how social anxiety relates to emotional face discrimination. We designed an emotional facial expression recognition task to reveal how varying levels of sub-clinical social anxiety (measured by questionnaire) related to the discrimination of happy and fearful faces, and of happy and angry faces. We also categorised the facial expressions by the salience of the mouth region (i.e. high [open mouth] vs. low [closed mouth]). In a sample of 90 participants higher social anxiety (relative to lower social anxiety) was associated with a reduced happy face reaction time advantage. However, this effect was mainly driven by the faces with less salient closed mouths. Our results are consistent with theories of anxiety that incorporate an oversensitive valence evaluation system.

  17. Social perception and aging: The relationship between aging and the perception of subtle changes in facial happiness and identity.

    PubMed

    Yang, Tao; Penton, Tegan; Köybaşı, Şerife Leman; Banissy, Michael J

    2017-09-01

    Previous findings suggest that older adults show impairments in the social perception of faces, including the perception of emotion and facial identity. The majority of this work has tended to examine performance on tasks involving young adult faces and prototypical emotions. While useful, this can influence performance differences between groups due to perceptual biases and limitations on task performance. Here we sought to examine how typical aging is associated with the perception of subtle changes in facial happiness and facial identity in older adult faces. We developed novel tasks that permitted the ability to assess facial happiness, facial identity, and non-social perception (object perception) across similar task parameters. We observe that aging is linked with declines in the ability to make fine-grained judgements in the perception of facial happiness and facial identity (from older adult faces), but not for non-social (object) perception. This pattern of results is discussed in relation to mechanisms that may contribute to declines in facial perceptual processing in older adulthood. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    PubMed

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls (p < .001). After psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.

  19. Categorising intersectional targets: An "either/and" approach to race- and gender-emotion congruity.

    PubMed

    Smith, Jacqueline S; LaFrance, Marianne; Dovidio, John F

    2017-01-01

    Research on the interaction of emotional expressions with social category cues in face processing has focused on whether specific emotions are associated with single-category identities, thus overlooking the influence of intersectional identities. Instead, we examined how quickly people categorise intersectional targets by their race, gender, or emotional expression. In Experiment 1, participants categorised Black and White faces displaying angry, happy, or neutral expressions by either race or gender. Emotion influenced responses to men versus women only when gender was made salient by the task. Similarly, emotion influenced responses to Black versus White targets only when participants categorised by race. In Experiment 2, participants categorised faces by emotion so that neither category was more salient. As predicted, responses to Black women differed from those to both Black men and White women. Thus, examining race and gender separately is insufficient for understanding how emotion and social category cues are processed.

  20. The effect of intranasal oxytocin on perceiving and understanding emotion on the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT).

    PubMed

    Cardoso, Christopher; Ellenbogen, Mark A; Linnen, Anne-Marie

    2014-02-01

    Evidence suggests that intranasal oxytocin enhances the perception of emotion in facial expressions during standard emotion identification tasks. However, it is not clear whether this effect is desirable in people who do not show deficits in emotion perception. That is, a heightened perception of emotion in faces could lead to "oversensitivity" to the emotions of others in nonclinical participants. The goal of this study was to assess the effects of intranasal oxytocin on emotion perception using ecologically valid social and nonsocial visual tasks. Eighty-two participants (42 women) self-administered a 24 IU dose of intranasal oxytocin or a placebo in a double-blind, randomized experiment and then completed the perceiving and understanding emotion components of the Mayer-Salovey-Caruso Emotional Intelligence Test. In this test, emotion identification accuracy is based on agreement with a normative sample. As expected, participants administered intranasal oxytocin rated emotion in facial stimuli as expressing greater emotional intensity than those given a placebo. Consequently, accurate identification of emotion in faces, based on agreement with a normative sample, was impaired in the oxytocin group relative to placebo. No such effect was observed for tests using nonsocial stimuli. The results are consistent with the hypothesis that intranasal oxytocin enhances the salience of social stimuli in the environment, but not nonsocial stimuli. The present findings support a growing literature showing that the effects of intranasal oxytocin on social cognition can be negative under certain circumstances, in this case promoting "oversensitivity" to emotion in faces in healthy people. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  1. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect

    PubMed Central

    Madill, Mark; Murray, Janice E.

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64–90 years) and 25 young adults (19–29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63–84 years) and 30 young adults (18–30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance in either age group was not influenced by the task-irrelevant affective images. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially process task-relevant positive non-face images but only when presented within the main focus of attention. PMID:28450848

  2. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    PubMed

    Madill, Mark; Murray, Janice E

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance in either age group was not influenced by the task-irrelevant affective images. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially process task-relevant positive non-face images but only when presented within the main focus of attention.

  3. Colour and emotion: children also associate red with negative valence.

    PubMed

    Gil, Sandrine; Le Bigot, Ludovic

    2016-11-01

    The association of colour with emotion constitutes a growing field of research, as it can affect how humans process their environment. Although there has been increasing interest in the association of red with negative valence in adults, little is known about how it develops. We therefore tested the red-negative association in children for the first time. Children aged 5-10 years performed a face categorization task in the form of a card-sorting task. They had to judge whether ambiguous faces shown against three different colour backgrounds (red, grey, green) seemed to 'feel good' or 'feel bad'. Results of logistic mixed models showed that - as previously demonstrated in adults - children across the age range provided significantly more 'feel bad' responses when the faces were given a red background. This finding is discussed in relation to colour-emotion association theories. © 2015 John Wiley & Sons Ltd.

  4. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    PubMed

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  5. Gaze direction differentially affects avoidance tendencies to happy and angry faces in socially anxious individuals.

    PubMed

    Roelofs, Karin; Putman, Peter; Schouten, Sonja; Lange, Wolf-Gero; Volman, Inge; Rinck, Mike

    2010-04-01

    Increasing evidence indicates that eye gaze direction affects the processing of emotional faces in anxious individuals. However, the effects of eye gaze direction on the behavioral responses elicited by emotional faces, such as avoidance behavior, remain largely unexplored. We administered an Approach-Avoidance Task (AAT) to high (HSA) and low socially anxious (LSA) individuals. All participants responded to photographs of angry, happy and neutral faces (presented with direct and averted gaze) by either pushing a joystick away from them (avoidance) or pulling it towards them (approach). Compared to LSA individuals, HSA individuals were faster in avoiding than approaching angry faces. Most crucially, this avoidance tendency was only present when the perceived anger was directed towards the subject (direct gaze) and not when the gaze of the face stimulus was averted. In contrast, HSA individuals tended to avoid happy faces irrespective of gaze direction. Neutral faces elicited no approach-avoidance tendencies. Thus, avoidance of angry faces in social anxiety, as measured by AA tasks, reflects avoidance of subject-directed anger and not of negative stimuli in general. In addition, although both anger and joy are considered to reflect approach-related emotions, gaze direction did not affect HSA individuals' avoidance of happy faces, suggesting differential mechanisms affecting responses to happy and angry faces in social anxiety. 2009 Elsevier Ltd. All rights reserved.

  6. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    PubMed Central

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

    The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and the effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with a duration of around 20–30 ms), whereas activity in the left amygdala (with a duration of around 50–60 ms) persisted longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, reflecting its evolutionary significance (survival). These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  7. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia.

    PubMed

    Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue

    2009-06-15

    Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.

  8. Using Time Perception to Explore Implicit Sensitivity to Emotional Stimuli in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Jones, Catherine R. G.; Lambrechts, Anna; Gaigg, Sebastian B.

    2017-01-01

    Establishing whether implicit responses to emotional cues are intact in autism spectrum disorder (ASD) is fundamental to ascertaining why their emotional understanding is compromised. We used a temporal bisection task to assess for responsiveness to face and wildlife images that varied in emotional salience. There were no significant differences…

  9. Differential age-related decline in conflict-driven task-set shielding from emotional versus non-emotional distracters.

    PubMed

    Monti, Jim M; Weintraub, Sandra; Egner, Tobias

    2010-05-01

    While normal aging is associated with a marked decline in cognitive abilities, such as memory and executive functions, recent evidence suggests that control processes involved in regulating responses to emotional stimuli may remain well-preserved in the elderly. However, neither the precise nature of these preserved control processes, nor their domain-specificity with respect to comparable non-emotional control processes, are currently well-established. Here, we tested the hypothesis of domain-specific preservation of emotional control in the elderly by employing two closely matched behavioral tasks that assessed the ability to shield the processing of task-relevant stimulus information from competition by task-irrelevant distracter stimuli that could be either non-emotional or emotional in nature. The efficacy of non-emotional versus emotional task-set shielding, gauged via the 'conflict adaptation effect', was compared between cohorts of healthy young adults, healthy elderly adults, and individuals diagnosed with probable Alzheimer's disease (PRAD), age-matched to the elderly subjects. It was found that, compared to the young adult cohort, the healthy elderly displayed deficits in task-set shielding in the non-emotional but not in the emotional task, whereas PRAD subjects displayed impaired performance in both tasks. These results provide new evidence that healthy aging is associated with a domain-specific preservation of emotional control functions, specifically, the shielding of a current task-set from interference by emotional distracter stimuli. This selective preservation of function supports the notion of partly dissociable affective control mechanisms, and may either reflect different time-courses of degeneration in the neuroanatomical circuits mediating task-set maintenance in the face of non-emotional versus emotional distracters, or a motivational shift towards affective processing in the elderly. 2010 Elsevier Ltd. All rights reserved.

  10. Finding an emotional face in a crowd: emotional and perceptual stimulus factors influence visual search efficiency.

    PubMed

    Lundqvist, Daniel; Bruce, Neil; Öhman, Arne

    2015-01-01

    In this article, we examine how emotional and perceptual stimulus factors influence visual search efficiency. In an initial task, we run a visual search task using a large number of target/distractor emotion combinations. In two subsequent tasks, we then assess measures of perceptual (rated and computational distances) and emotional (rated valence, arousal and potency) stimulus properties. In a series of regression analyses, we then explore the degree to which target salience (the size of target/distractor dissimilarities) on these emotional and perceptual measures predicts the outcome on search efficiency measures (response times and accuracy) from the visual search task. The results show that both emotional and perceptual stimulus salience contribute to visual search efficiency. Among the emotional measures, salience on arousal measures was more influential than valence salience. The importance of the arousal factor may be a contributing factor to the contradictory history of results within this field.

  11. Fixation to features and neural processing of facial expressions in a gender discrimination task.

    PubMed

    Neath, Karly N; Itier, Roxane J

    2015-10-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ∼150 ms until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Temporal lobe structures and facial emotion recognition in schizophrenia patients and nonpsychotic relatives.

    PubMed

    Goghari, Vina M; Macdonald, Angus W; Sponheim, Scott R

    2011-11-01

    Temporal lobe abnormalities and emotion recognition deficits are prominent features of schizophrenia and appear related to the diathesis of the disorder. This study investigated whether temporal lobe structural abnormalities were associated with facial emotion recognition deficits in schizophrenia and related to genetic liability for the disorder. Twenty-seven schizophrenia patients, 23 biological family members, and 36 controls participated. Several temporal lobe regions (fusiform, superior temporal, middle temporal, amygdala, and hippocampus) previously associated with face recognition in normative samples and found to be abnormal in schizophrenia were evaluated using volumetric analyses. Participants completed a facial emotion recognition task and an age recognition control task under time-limited and self-paced conditions. Temporal lobe volumes were tested for associations with task performance. Group status explained 23% of the variance in temporal lobe volume. Left fusiform gray matter volume was decreased by 11% in patients and 7% in relatives compared with controls. Schizophrenia patients additionally exhibited smaller hippocampal and middle temporal volumes. Patients were unable to improve facial emotion recognition performance with unlimited time to make a judgment but were able to improve age recognition performance. Patients additionally showed a relationship between reduced temporal lobe gray matter and poor facial emotion recognition. For the middle temporal lobe region, the relationship between greater volume and better task performance was specific to facial emotion recognition and not age recognition. Because schizophrenia patients exhibited a specific deficit in emotion recognition not attributable to a generalized impairment in face perception, impaired emotion recognition may serve as a target for interventions.

  13. Sex differences in functional activation patterns revealed by increased emotion processing demands.

    PubMed

    Hall, Geoffrey B C; Witelson, Sandra F; Szechtman, Henry; Nahmias, Claude

    2004-02-09

    Two [O(15)] PET studies assessed sex differences in regional brain activation during the recognition of emotional stimuli. Study I revealed that the recognition of emotion in visual faces resulted in bilateral frontal activation in women, and unilateral right-sided activation in men. In Study II, the complexity of the emotional face task was increased through the addition of associated auditory emotional stimuli. Men again showed unilateral frontal activation, in this case to the left, whereas women did not show bilateral frontal activation but showed greater limbic activity. These results suggest that when processing broader cross-modal emotional stimuli, men engage more in associative cognitive strategies while women draw more on primary emotional references.

  14. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    PubMed

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have been conducted to investigate the explicit and implicit processes of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, indicating unconscious perception of peak facial expressions.

  15. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    PubMed Central

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have been conducted to investigate the explicit and implicit processes of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, indicating unconscious perception of peak facial expressions. PMID:27630604

  16. Facial Expressions and Ability to Recognize Emotions From Eyes or Mouth in Children

    PubMed Central

    Guarnera, Maria; Hichy, Zira; Cascio, Maura I.; Carrubba, Stefano

    2015-01-01

    This research aims to contribute to the literature on the ability to recognize anger, happiness, fear, surprise, sadness, disgust and neutral emotions from facial information. By investigating children’s performance in detecting these emotions from a specific face region, we were interested to know whether children would show differences in recognizing these expressions from the upper or lower face, and if any difference between specific facial regions depended on the emotion in question. For this purpose, a group of 6-7 year-old children was selected. Participants were asked to recognize emotions by using a labeling task with three stimulus types (region of the eyes, of the mouth, and full face). The findings seem to indicate that children correctly recognize basic facial expressions when pictures represent the whole face, except for a neutral expression, which was recognized from the mouth, and sadness, which was recognized from the eyes. Children are also able to identify anger from the eyes as well as from the whole face. With respect to gender differences, there is no female advantage in emotional recognition. The results indicate a significant interaction ‘gender x face region’ only for anger and neutral emotions. PMID:27247651

  17. Divergent Patterns of Social Cognition Performance in Autism and 22q11.2 Deletion Syndrome (22q11DS)

    ERIC Educational Resources Information Center

    McCabe, Kathryn L.; Melville, Jessica L.; Rich, Dominique; Strutt, Paul A.; Cooper, Gavin; Loughland, Carmel M.; Schall, Ulrich; Campbell, Linda E.

    2013-01-01

    Individuals with developmental disorders frequently report a range of social cognition deficits including difficulties identifying facial displays of emotion. This study examined the specificity of face emotion processing deficits in adolescents with either autism or 22q11DS compared to typically developing (TD) controls. Two tasks (face emotion…

  18. White matter fiber compromise contributes differentially to attention and emotion processing impairment in alcoholism, HIV-infection, and their comorbidity.

    PubMed

    Schulte, T; Müller-Oehring, E M; Sullivan, E V; Pfefferbaum, A

    2012-10-01

    Alcoholism (ALC) and HIV-1 infection (HIV) each affects emotional and attentional processes and integrity of brain white matter fibers likely contributing to functional compromise. The highly prevalent ALC+HIV comorbidity may exacerbate compromise. We used diffusion tensor imaging (DTI) and an emotional Stroop Match-to-Sample task in 19 ALC, 16 HIV, 15 ALC+HIV, and 15 control participants to investigate whether disruption of fiber system integrity accounts for compromised attentional and emotional processing. The task required matching a cue color to that of an emotional word with faces appearing between the color cue and the Stroop word in half of the trials. Nonmatched cue-word color pairs assessed selective attention, and face-word pairs assessed emotion. Relative to controls, DTI-based fiber tracking revealed lower inferior longitudinal fasciculus (ilf) integrity in HIV and ALC+HIV and lower uncinate fasciculus (uf) integrity in all three patient groups. Controls exhibited Stroop effects to positive face-word emotion, and greater interference was related to greater callosal, cingulum and ilf integrity. By contrast, HIV showed greater interference from negative Stroop words during color-nonmatch trials, correlating with greater uf compromise. For face trials, ALC and ALC+HIV showed greater Stroop-word interference, correlating with lower cingulate and callosal integrity. Thus, in HIV, conflict resolution was diminished when challenging conditions usurped resources needed to manage interference from negative emotion and to disengage attention from wrongly cued colors (nonmatch). In ALC and ALC+HIV, poorer callosal integrity was related to enhanced emotional interference suggesting curtailed interhemispheric exchange needed between preferentially right-hemispheric emotion and left-hemispheric Stroop-word functions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Major depression is associated with impaired processing of emotion in music as well as in facial and vocal stimuli.

    PubMed

    Naranjo, C; Kornreich, C; Campanella, S; Noël, X; Vandriette, Y; Gillain, B; de Longueville, X; Delatte, B; Verbanck, P; Constant, E

    2011-02-01

    The processing of emotional stimuli is thought to be negatively biased in major depression. This study investigates this issue using musical, vocal and facial affective stimuli. 23 depressed in-patients and 23 matched healthy controls were recruited. Affective information processing was assessed through musical, vocal and facial emotion recognition tasks. Depression, anxiety level and attention capacity were controlled. The depressed participants demonstrated less accurate identification of emotions than the control group in all three types of emotion-recognition task. The depressed group also gave higher intensity ratings than the controls when scoring negative emotions, and they were more likely to attribute negative emotions to neutral voices and faces. Our in-patient group might differ from the more general population of depressed adults. They were all taking anti-depressant medication, which may have had an influence on their emotional information processing. Major depression is associated with a general negative bias in the processing of emotional stimuli. Emotional processing impairment in depression is not confined to interpersonal stimuli (faces and voices), but is also present in the ability to accurately recognize emotion in music. © 2010 Elsevier B.V. All rights reserved.

  20. “Who said that?” Matching of low- and high-intensity emotional prosody to facial expressions by adolescents with ASD

    PubMed Central

    Grossman, Ruth B; Tager-Flusberg, Helen

    2012-01-01

    Data on emotion processing by individuals with ASD suggest both intact abilities and significant deficits. Signal intensity may be a contributing factor to this discrepancy. We presented low- and high-intensity emotional stimuli in a face-voice matching task to 22 adolescents with ASD and 22 typically developing (TD) peers. Participants heard semantically neutral sentences with happy, surprised, angry, and sad prosody presented at two intensity levels (low, high) and matched them to emotional faces. The facial expression choice was either across- or within-valence. Both groups were less accurate for low-intensity emotions, but the ASD participants' accuracy levels dropped off more sharply. ASD participants were significantly less accurate than their TD peers for trials involving low-intensity emotions and within-valence face contrasts. PMID:22450703

  1. Functional asymmetry and interhemispheric cooperation in the perception of emotions from facial expressions.

    PubMed

    Tamietto, Marco; Latini Corazzini, Luca; de Gelder, Beatrice; Geminiani, Giuliano

    2006-05-01

    The present study used the redundant target paradigm on healthy subjects to investigate functional hemispheric asymmetries and interhemispheric cooperation in the perception of emotions from faces. In Experiment 1, participants responded to checkerboards presented either unilaterally to the left (LVF) or right visual half field (RVF), or simultaneously to both hemifields (BVF), while performing a pointing task for the control of eye movements. As previously reported (Miniussi et al. in J Cogn Neurosci 10:216-230, 1998), redundant stimulation led to shorter latencies for stimulus detection (bilateral gain or redundant target effect, RTE) that exceeded the limit for a probabilistic interpretation, thereby validating the pointing procedure and supporting interhemispheric cooperation. In Experiment 2, the same pointing procedure was used in a go/no-go task requiring subjects to respond when seeing a target emotional expression (happy or fearful, counterbalanced between blocks). Faster reaction times to unilateral LVF than RVF emotions, regardless of valence, indicate that the perception of positive and negative emotional faces is lateralized toward the right hemisphere. Simultaneous presentation of two congruent emotional faces, either happy or fearful, produced an RTE that cannot be explained by probability summation and suggests interhemispheric cooperation and neural summation. No such effect was present with BVF incongruent facial expressions. In Experiment 3, we studied whether the RTE for emotional faces depends on the physical identity between BVF stimuli, and we set up a second BVF congruent condition in which there was only emotional but not physical or gender identity between stimuli (i.e. two different faces expressing the same emotion). The RTE and interhemispheric cooperation were present also in this second BVF congruent condition. This shows that emotional congruency is a sufficient condition for the RTE to take place in the intact brain and that the cerebral hemispheres can interact in spite of physical differences between stimuli.

  2. Seeing Life through Positive-Tinted Glasses: Color–Meaning Associations

    PubMed Central

    Gil, Sandrine; Le Bigot, Ludovic

    2014-01-01

    There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning. PMID:25098167

  3. Seeing life through positive-tinted glasses: color-meaning associations.

    PubMed

    Gil, Sandrine; Le Bigot, Ludovic

    2014-01-01

    There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue-meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning.

  4. Priming Facial Gender and Emotional Valence: The Influence of Spatial Frequency on Face Perception in ASD

    ERIC Educational Resources Information Center

    Vanmarcke, Steven; Wagemans, Johan

    2017-01-01

    Adolescents with and without autism spectrum disorder (ASD) performed two priming experiments in which they implicitly processed a prime stimulus, containing high and/or low spatial frequency information, and then explicitly categorized a target face either as male/female (gender task) or as positive/negative (valence task). Adolescents with ASD…

  5. Facial emotion recognition and borderline personality pathology.

    PubMed

    Meehan, Kevin B; De Panfilis, Chiara; Cain, Nicole M; Antonucci, Camilla; Soliani, Antonio; Clarkin, John F; Sambataro, Fabio

    2017-09-01

    The impact of borderline personality pathology on facial emotion recognition has been in dispute; with impaired, comparable, and enhanced accuracy found in high borderline personality groups. Discrepancies are likely driven by variations in facial emotion recognition tasks across studies (stimuli type/intensity) and heterogeneity in borderline personality pathology. This study evaluates facial emotion recognition for neutral and negative emotions (fear/sadness/disgust/anger) presented at varying intensities. Effortful control was evaluated as a moderator of facial emotion recognition in borderline personality. Non-clinical multicultural undergraduates (n = 132) completed a morphed facial emotion recognition task of neutral and negative emotional expressions across different intensities (100% Neutral; 25%/50%/75% Emotion) and self-reported borderline personality features and effortful control. Greater borderline personality features related to decreased accuracy in detecting neutral faces, but increased accuracy in detecting negative emotion faces, particularly at low-intensity thresholds. This pattern was moderated by effortful control; for individuals with low but not high effortful control, greater borderline personality features related to misattributions of emotion to neutral expressions, and enhanced detection of low-intensity emotional expressions. Individuals with high borderline personality features may therefore exhibit a bias toward detecting negative emotions that are not or barely present; however, good self-regulatory skills may protect against this potential social-cognitive vulnerability. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  6. Do you see what I see? Sex differences in the discrimination of facial emotions during adolescence.

    PubMed

    Lee, Nikki C; Krabbendam, Lydia; White, Thomas P; Meeter, Martijn; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Büchel, Christian; Conrod, Patricia; Flor, Herta; Frouin, Vincent; Heinz, Andreas; Garavan, Hugh; Gowland, Penny; Ittermann, Bernd; Mann, Karl; Paillère Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Rietschel, Marcella; Robbins, Trevor; Fauth-Bühler, Mira; Smolka, Michael N; Gallinat, Juergen; Schumann, Gunther; Shergill, Sukhi S

    2013-12-01

    During adolescence social relationships become increasingly important. Establishing and maintaining these relationships requires understanding of emotional stimuli, such as facial emotions. A failure to adequately interpret emotional facial expressions has previously been associated with various mental disorders that emerge during adolescence. The current study examined sex differences in emotional face processing during adolescence. Participants were adolescents (n = 1951) with a target age of 14, who completed a forced-choice emotion discrimination task. The stimuli used comprised morphed faces that contained a blend of two emotions in varying intensities (11 stimuli per set of emotions). Adolescent girls showed faster and more sensitive perception of facial emotions than boys. However, both adolescent boys and girls were most sensitive to variations in emotion intensity in faces combining happiness and sadness, and least sensitive to changes in faces comprising fear and anger. Furthermore, both sexes overidentified happiness and anger. However, the overidentification of happiness was stronger in boys. These findings were not influenced by individual differences in the level of pubertal maturation. These results indicate that male and female adolescents differ in their ability to identify emotions in morphed faces containing emotional blends. The findings provide information for clinical studies examining whether sex differences in emotional processing are related to sex differences in the prevalence of psychiatric disorders within this age group.

  7. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability to identify, process, and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI refers to a performance measure assessing individual skills at perceiving, using, understanding, and managing emotions. Previous models suggest that a brain “somatic marker circuitry” (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula, and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e., approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific. PMID:26859495

  8. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    PubMed

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussion athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional face expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are affected differently across sexes. (JINS, 2018, 24, 1-11).

  9. Reward prediction error signal enhanced by striatum-amygdala interaction explains the acceleration of probabilistic reward learning by emotion.

    PubMed

    Watanabe, Noriya; Sakagami, Masamichi; Haruno, Masahiko

    2013-03-06

    Learning does not depend on rationality alone, because real-life learning cannot be isolated from emotional or social factors. Therefore, it is intriguing to determine how emotion changes learning, and to identify which neural substrates underlie this interaction. Here, we show that the task-independent presentation of an emotional face before a reward-predicting cue increases the speed of cue-reward association learning in human subjects compared with trials in which a neutral face is presented. This phenomenon was attributable to an increase in the learning rate, which regulates reward prediction errors. Parallel to these behavioral findings, functional magnetic resonance imaging demonstrated that presentation of an emotional face enhanced the reward prediction error (RPE) signal in the ventral striatum. In addition, we found a functional link between this enhanced RPE signal and increased activity in the amygdala following presentation of an emotional face. Thus, this study revealed an acceleration of cue-reward association learning by emotion, and underscored a role of striatum-amygdala interactions in the modulation of reward prediction errors by emotion.
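
    The interpretation above (faster cue-reward learning driven by a higher learning rate on emotional-face trials) can be illustrated with a simple Rescorla-Wagner-style update in which the reward prediction error is scaled by a condition-dependent learning rate. The sketch below is a toy simulation under assumed parameter values, not the authors' computational model or their fitted parameters.

```python
# Toy Rescorla-Wagner learner: the same probabilistic reward schedule, but a
# larger learning rate (alpha) on trials preceded by an emotional face, so the
# cue value is updated faster by each reward prediction error (RPE).
import random

def simulate_learning(p_reward=0.8, n_trials=12, alpha_neutral=0.15, alpha_emotional=0.30):
    value = {"neutral": 0.0, "emotional": 0.0}
    for _ in range(n_trials):
        for condition, alpha in (("neutral", alpha_neutral), ("emotional", alpha_emotional)):
            reward = 1.0 if random.random() < p_reward else 0.0
            rpe = reward - value[condition]        # reward prediction error
            value[condition] += alpha * rpe        # larger alpha -> faster value updating
    return value

random.seed(1)
print(simulate_learning())  # the "emotional" cue value typically approaches 0.8 sooner
```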

  10. Neurocognitive assessment of emotional context sensitivity.

    PubMed

    Myruski, Sarah; Bonanno, George A; Gulyayeva, Olga; Egan, Laura J; Dennis-Tiwary, Tracy A

    2017-10-01

    Sensitivity to emotional context is an emerging construct for characterizing adaptive or maladaptive emotion regulation, but few measurement approaches exist. The current study combined behavioral and neurocognitive measures to assess context sensitivity in relation to self-report measures of adaptive emotional flexibility and well-being. Sixty-six adults completed an emotional go/no-go task using happy, fearful, and neutral faces as go and no-go cues, while EEG was recorded to generate event-related potentials (ERPs) reflecting attentional selection and discrimination (N170) and cognitive control (N2). Context sensitivity was measured as the degree of emotional facilitation or disruption in the go/no-go task and magnitude of ERP response to emotion cues. Participants self-reported on emotional flexibility, anxiety, and depression. Overall participants evidenced emotional context sensitivity, such that when happy faces were go stimuli, accuracy improved (greater behavioral facilitation), whereas when fearful faces were no-go stimuli, errors increased (disrupted behavioral inhibition). These indices predicted emotional flexibility and well-being: Greater behavioral facilitation following happy cues was associated with lower depression and anxiety, whereas greater disruption in behavioral inhibition following fearful cues was associated with lower flexibility. ERP indices of context sensitivity revealed additional associations: Greater N2 to fear go cues was associated with less anxiety and depression, and greater N2 and N170 to happy and fear no-go cues, respectively, were associated with greater emotional flexibility and well-being. Results suggest that pleasant and unpleasant emotions selectively enhance and disrupt components of context sensitivity, and that behavioral and ERP indices of context sensitivity predict flexibility and well-being.
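
    A minimal sketch of the behavioral context-sensitivity indices described above follows: emotional facilitation as the gain in go-trial accuracy for happy versus neutral cues, and disrupted inhibition as the increase in no-go errors for fearful versus neutral cues. The accuracy and error values are invented example data, not results from this study.

```python
# Hypothetical behavioral indices from an emotional go/no-go task.
go_accuracy = {"happy": 0.96, "neutral": 0.90, "fearful": 0.89}   # hit rate on go trials
nogo_errors = {"happy": 0.12, "neutral": 0.10, "fearful": 0.19}   # false-alarm rate on no-go trials

facilitation = go_accuracy["happy"] - go_accuracy["neutral"]   # > 0: happy cues aid responding
disruption = nogo_errors["fearful"] - nogo_errors["neutral"]   # > 0: fearful cues impair inhibition

print(f"facilitation = {facilitation:+.2f}, disruption = {disruption:+.2f}")
```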

  11. The Task and Relational Dimensions of Online Social Support.

    PubMed

    Beck, Stephenson J; Paskewitz, Emily A; Anderson, Whitney A; Bourdeaux, Renee; Currie-Mueller, Jenna

    2017-03-01

    Online support groups are attractive to individuals suffering from various types of mental and physical illness due to their accessibility, convenience, and comfort level. Individuals coping with depression, in particular, may seek social support online to avoid the stigma that accompanies face-to-face support groups. We explored how task and relational messages created social support in online depression support groups using Cutrona and Suhr's social support coding scheme and Bales's Interaction Process Analysis coding scheme. A content analysis revealed emotional support as the most common type of social support within the group, although the majority of messages were task rather than relational. Informational support consisted primarily of task messages, whereas network and esteem support were primarily relational messages. Specific types of task and relational messages were associated with different support types. Results indicate task messages dominated online depression support groups, suggesting the individuals who participate in these groups are interested in solving problems but may also experience emotional support when their uncertainty is reduced via task messages.

  12. Categorical Perception of Emotional Facial Expressions in Preschoolers

    ERIC Educational Resources Information Center

    Cheal, Jenna L.; Rutherford, M. D.

    2011-01-01

    Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the…

  13. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    PubMed

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  14. Guanfacine modulates the emotional biasing of amygdala-prefrontal connectivity for cognitive control.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Newcorn, Jeffrey H; Halperin, Jeffrey M; Fan, Jin

    2014-09-01

    Functional interactions between amygdala and prefrontal cortex provide a cortical entry point for emotional cues to bias cognitive control. Stimulation of α2 adrenoceptors enhances prefrontal control functions and blocks the amygdala-dependent encoding of emotional cues. However, the impact of this stimulation on amygdala-prefrontal interactions and the emotional biasing of cognitive control has not been established. We tested the effect of the α2 adrenoceptor agonist guanfacine on psychophysiological interactions of the amygdala with prefrontal cortex for the emotional biasing of response execution and inhibition. Fifteen healthy adults were scanned twice with event-related functional magnetic resonance imaging while performing an emotional go/no-go task following administration of oral guanfacine (1 mg) and placebo in a double-blind, counterbalanced design. Happy, sad, and neutral faces served as trial cues. Guanfacine moderated the effect of face emotion on the task-related functional connectivity of left and right amygdala with left inferior frontal gyrus compared to placebo, by selectively reversing the functional co-activation of the two regions for response execution cued by sad faces. This shift from positively to negatively correlated activation under guanfacine was associated with selective improvements in the relatively low accuracy of responses to sad faces seen under placebo. These results demonstrate the importance of functional interactions between amygdala and inferior frontal gyrus to both bottom-up biasing of cognitive control and top-down control of emotional processing, as well as to the α2 adrenoceptor-mediated modulation of these processes. These mechanisms offer a possible means of addressing the emotional reactivity that is common to several psychiatric disorders. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.

  15. The influence of variations in eating disorder-related symptoms on processing of emotional faces in a non-clinical female sample: An eye-tracking study.

    PubMed

    Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan

    2016-06-30

    This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
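
    The distinction drawn above between initial orientation and later strategic processing is typically operationalized from fixation records, for example as the first-fixated region versus total dwell time per region. The sketch below shows one such computation on a single hypothetical trial; the data structure and field names are assumptions, not the authors' eye-tracker output format.

```python
# Hypothetical single trial from a face-pair viewing task: each fixation is a
# (region, duration_ms) pair in temporal order.
fixations = [
    ("angry", 180), ("neutral", 240), ("angry", 150), ("neutral", 420),
]

first_fix_emotional = fixations[0][0] != "neutral"   # initial orientation measure

dwell = {"emotional": 0, "neutral": 0}
for region, duration in fixations:
    key = "neutral" if region == "neutral" else "emotional"
    dwell[key] += duration                            # later/strategic processing measure

dwell_bias = dwell["emotional"] - dwell["neutral"]    # negative = avoidance of emotional faces
print(first_fix_emotional, dwell_bias)
```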

  16. Neural circuitry of emotion regulation: Effects of appraisal, attention, and cortisol administration.

    PubMed

    Ma, Sean T; Abelson, James L; Okada, Go; Taylor, Stephan F; Liberzon, Israel

    2017-04-01

    Psychosocial well-being requires effective regulation of emotional responding in the context of threat or stress. Neuroimaging studies have focused on instructed, volitional regulation (e.g., reappraisal or distancing), largely ignoring implicit regulation that does not involve purposeful effort to alter emotional experience. These implicit processes may or may not involve the same neural pathways as explicit regulatory strategies. We examined the neurobiology of implicit emotion regulation processes and the impact of the stress hormone cortisol on these processes. Our study task employed composite pictures of faces and places to examine neural activity during implicit emotional processing (of emotional faces), while these responses were implicitly regulated by attention shifts away from the emotionally evocative stimuli, and while subjects reflectively appraised their own emotional responses to them. Subjects completed the task in an fMRI scanner after random assignment to receive placebo or hydrocortisone (HCT), an orally administered form of cortisol. Implicit emotional processing activated the insula/IFG, dACC/dMPFC, midbrain, and amygdala. With attention shifting, we saw diminished signal in emotion-generating/response regions (e.g., amygdala) and increased activation in task-specific attention regions such as the parahippocampus. With appraisal of emotions, we observed robust activations in medial prefrontal areas, where activation is also seen in instructed reappraisal studies. We observed no main effects of HCT administration on brain activity, but males and females showed opposing neural effects in prefrontal areas. The data suggest that different types of emotion regulation utilize overlapping circuits, but with some strategy-specific activation. Further study of the sexually dimorphic response to cortisol is needed.

  17. Association between amygdala response to emotional faces and social anxiety in autism spectrum disorders.

    PubMed

    Kleinhans, Natalia M; Richards, Todd; Weaver, Kurt; Johnson, L Clark; Greenson, Jessica; Dawson, Geraldine; Aylward, Elizabeth

    2010-10-01

    Difficulty interpreting facial expressions has been reported in autism spectrum disorders (ASD) and is thought to be associated with amygdala abnormalities. To further explore the neural basis of abnormal emotional face processing in ASD, we conducted an fMRI study of emotional face matching in high-functioning adults with ASD and age, IQ, and gender matched controls. In addition, we investigated whether there was a relationship between self-reported social anxiety and fMRI activation. During fMRI scanning, study participants were instructed to match facial expressions depicting fear or anger. The control condition was a comparable shape-matching task. The control group evidenced significantly increased left prefrontal activation and decreased activation in the occipital lobes compared to the ASD group during emotional face matching. Further, within the ASD group, greater social anxiety was associated with increased activation in right amygdala and left middle temporal gyrus, and decreased activation in the fusiform face area. These results indicate that level of social anxiety mediates the neural response to emotional face perception in ASD. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. The complex duration perception of emotional faces: effects of face direction.

    PubMed

    Kliegl, Katrin M; Limbrecht-Ecklundt, Kerstin; Dürr, Lea; Traue, Harald C; Huckauf, Anke

    2015-01-01

    The perceived duration of emotional face stimuli strongly depends on the expressed emotion. However, emotional faces also differ with regard to a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled by using only pictures of female models with direct gaze and frontal face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented toward the observer. We aimed to replicate this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and the participant, female and male participants rated the duration of neutral, angry, and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as induced arousal, social relevance, and an evolutionary context.

  19. Perceived facial expressions of emotion as motivational incentives: evidence from a differential implicit learning paradigm.

    PubMed

    Schultheiss, Oliver C; Pang, Joyce S; Torges, Cynthia M; Wirth, Michelle M; Treynor, Wendy; Derryberry, Douglas

    2005-03-01

    Participants (N = 216) were administered a differential implicit learning task during which they were trained and tested on 3 maximally distinct 2nd-order visuomotor sequences, with sequence color serving as discriminative stimulus. During training, 1 sequence each was followed by an emotional face, a neutral face, and no face, using backward masking. Emotion (joy, surprise, anger), face gender, and exposure duration (12 ms, 209 ms) were varied between participants; implicit motives were assessed with a picture-story exercise. For power-motivated individuals, low-dominance facial expressions enhanced and high-dominance expressions impaired learning. For affiliation-motivated individuals, learning was impaired in the context of hostile faces. These findings did not depend on explicit learning of fixed sequences or on awareness of sequence-face contingencies. Copyright 2005 APA, all rights reserved.

  20. Recognition of face identity and emotion in expressive specific language impairment.

    PubMed

    Merkenschlager, A; Amorosa, H; Kiefl, H; Martinius, J

    2012-01-01

    To study face and emotion recognition in children with mostly expressive specific language impairment (SLI-E). A test movie assessing perception and recognition of faces and mimic-gestural expressions was administered to 24 children diagnosed with SLI-E and to an age-matched control group of normally developing children. Compared to the normal control group, the SLI-E children scored significantly worse on both the face and expression recognition tasks, with a preponderant effect on emotion recognition. The performance of the SLI-E group could not be explained by reduced attention during the test session. We conclude that SLI-E is associated with a deficiency in decoding non-verbal emotional facial and gestural information, which might lead to profound and persistent problems in social interaction and development. Copyright © 2012 S. Karger AG, Basel.

  1. Age-Related Changes in Amygdala-Frontal Connectivity during Emotional Face Processing from Childhood into Young Adulthood

    PubMed Central

    Wu, Minjie; Kujawa, Autumn; Lu, Lisa H.; Fitzgerald, Daniel A.; Klumpp, Heide; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan

    2016-01-01

    The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful and happy faces) in 61 healthy subjects aged 7–25 years. We found age-related decreases in ventral medial prefrontal cortex (vmPFC) activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. PMID:26931629

  2. The role of the cannabinoid receptor in adolescents' processing of facial expressions.

    PubMed

    Ewald, Anais; Becker, Susanne; Heinrich, Angela; Banaschewski, Tobias; Poustka, Luise; Bokde, Arun; Büchel, Christian; Bromberg, Uli; Cattrell, Anna; Conrod, Patricia; Desrivières, Sylvane; Frouin, Vincent; Papadopoulos-Orfanos, Dimitri; Gallinat, Jürgen; Garavan, Hugh; Heinz, Andreas; Walter, Henrik; Ittermann, Bernd; Gowland, Penny; Paus, Tomáš; Martinot, Jean-Luc; Paillère Martinot, Marie-Laure; Smolka, Michael N; Vetter, Nora; Whelan, Rob; Schumann, Gunter; Flor, Herta; Nees, Frauke

    2016-01-01

    The processing of emotional faces is an important prerequisite for adequate social interactions in daily life, and might thus specifically be altered in adolescence, a period marked by significant changes in social emotional processing. Previous research has shown that the cannabinoid receptor CB1R is associated with longer gaze duration and increased brain responses in the striatum to happy faces in adults, yet for adolescents it is not clear whether an association between CB1R and face processing exists. In the present study we investigated genetic effects of the two CB1R polymorphisms, rs1049353 and rs806377, on the processing of emotional faces in healthy adolescents. They participated in functional magnetic resonance imaging during a Faces Task, watching blocks of video clips with angry and neutral facial expressions, and completed a Morphed Faces Task in the laboratory where they looked at different facial expressions that switched from anger to fear or sadness or from happiness to fear or sadness, and labelled them according to these four emotional expressions. A-allele carriers of rs1049353, compared with GG carriers, displayed earlier recognition of facial expressions changing from anger to sadness or fear, but not of expressions changing from happiness to sadness or fear, and higher brain responses to angry, but not neutral, faces in the amygdala and insula. For rs806377 no significant effects emerged. This suggests that rs1049353 is involved in the processing of negative facial expressions related to anger in adolescence. These findings add to our understanding of social emotion-related mechanisms in this life period. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  3. Emotional faces and the default mode network.

    PubMed

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC) and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  4. Emotional reactivity and its impact on neural circuitry for attention-emotion interaction in childhood and adolescence

    PubMed Central

    Perlman, Susan B.; Hein, Tyler C.; Stepp, Stephanie D.

    2013-01-01

    Attention modulation when confronted with emotional stimuli is considered a critical aspect of executive function, yet it is rarely studied during childhood and adolescence, a developmental period marked by changes in these processes. We employed a novel, child-friendly fMRI task that used emotional faces to investigate the neural underpinnings of the attention-emotion interaction in a child and adolescent sample (n = 23; mean age = 13.46 years, SD = 2.86, range = 8.05–16.93). Results implied modulation of activation in the orbitofrontal cortex (OFC) by emotional distractor valence, which marginally correlated with participant age. Additionally, parent-reported emotional reactivity predicted the trajectory of the BOLD signal increase for fearful emotional face distractors, such that participants low in emotional reactivity had a steeper latency to peak activation. Results imply that the use of the OFC to modulate attention in the face of social/emotional stimuli may mature with age and may be tightly coupled with adaptive emotional functioning. Findings are discussed in the context of risk for the development of psychiatric disorders, in which increased emotional reactivity is particularly apparent. PMID:24055416

  5. Emotional face processing deficit in schizophrenia: A replication study in a South African Xhosa population.

    PubMed

    Leppänen, J M; Niehaus, D J H; Koen, L; Du Toit, E; Schoeman, R; Emsley, R

    2006-06-01

    Schizophrenia is associated with a deficit in the recognition of negative emotions from facial expressions. The present study examined the universality of this finding by studying facial expression recognition in an African Xhosa population. Forty-four Xhosa patients with schizophrenia and forty healthy controls were tested with a computerized task requiring rapid perceptual discrimination of matched positive (i.e., happy), negative (i.e., angry), and neutral faces. Patients were as accurate as controls in recognizing happy faces but showed a marked impairment in the recognition of angry faces. The impairment was particularly pronounced for high-intensity (open-mouth) angry faces. Patients also exhibited more false happy and angry responses to neutral faces than controls. No correlation between level of education or illness duration and emotion recognition was found, but the deficit in the recognition of negative emotions was more pronounced in familial compared to non-familial cases of schizophrenia. These findings suggest that the deficit in the recognition of negative facial expressions may constitute a universal neurocognitive marker of schizophrenia.

  6. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    PubMed

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement (better long-term memory for emotional than for neutral stimuli) and the emotion-induced recognition bias (a more liberal response criterion for emotional than for neutral stimuli). Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role in the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus-features account. The double dissociation in the results favors the latter account, that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.
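
    Spatial-frequency filtering of face stimuli of the kind described above is commonly implemented with Gaussian low-pass and high-pass filters. The sketch below is a minimal example under assumed settings; the sigma values are illustrative, a random array stands in for a real face photograph so the snippet runs as written, and published studies usually specify cutoffs in cycles per image rather than pixel-domain sigmas.

```python
# Hedged sketch: low- (LSF) and high-spatial-frequency (HSF) versions of a
# grayscale face image via Gaussian low-pass / high-pass filtering.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
face = rng.uniform(0, 255, size=(256, 256))   # placeholder for an aligned grayscale face image

lsf = gaussian_filter(face, sigma=6)          # low-pass: keeps coarse configural information
hsf = face - gaussian_filter(face, sigma=2)   # high-pass: keeps fine edges and details

# Rescale the high-pass residual into 0-255 for display or saving.
hsf_display = 255 * (hsf - hsf.min()) / (hsf.max() - hsf.min())
print(lsf.shape, round(hsf_display.min()), round(hsf_display.max()))
```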

  7. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    PubMed

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

    Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type, and emotion intensity in face processing for individuals with and without ASD. Participants were 12-year-olds and 18- to 22-year-olds with and without ASD. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to TD participants, individuals with ASD showed slower N170 responses to emotional faces both at 12 years of age and in adulthood. While the TD groups' P1 latency was significantly shorter in adults than in 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences between late childhood and adulthood in the maturation of cortical networks that support visual processing (whether of faces or of stimuli more broadly) among individuals with and without ASD. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. A Multidimensional Approach to the Study of Emotion Recognition in Autism Spectrum Disorders

    PubMed Central

    Xavier, Jean; Vignaud, Violaine; Ruggiero, Rosa; Bodeau, Nicolas; Cohen, David; Chaby, Laurence

    2015-01-01

    Although deficits in emotion recognition have been widely reported in autism spectrum disorder (ASD), experiments have been restricted to either facial or vocal expressions. Here, we explored multimodal emotion processing in children with ASD (N = 19) and with typical development (TD, N = 19), considering unimodal (faces or voices) and multimodal (faces and voices simultaneously) stimuli and developmental comorbidities (neuro-visual, language, and motor impairments). Compared to TD controls, children with ASD had rather high and heterogeneous emotion recognition scores but also showed several significant differences: lower emotion recognition scores for visual stimuli and for the neutral emotion, and a greater number of saccades during the visual task. Multivariate analyses showed that: (1) the difficulties they experienced with visual stimuli were partially alleviated with multimodal stimuli; (2) developmental age was significantly associated with emotion recognition in TD children, whereas this was the case only for the multimodal task in children with ASD; (3) language impairments tended to be associated with the emotion recognition scores of children with ASD in the auditory modality. Conversely, in the visual and bimodal (visuo-auditory) tasks, no impact of developmental coordination disorder or neuro-visual impairments was found. We conclude that impaired emotion processing constitutes a dimension to explore in the field of ASD, as research has the potential to define more homogeneous subgroups and tailored interventions. However, it is clear that developmental age, the nature of the stimuli, and other developmental comorbidities must also be taken into account when studying this dimension. PMID:26733928

  9. Computer-mediated communication preferences predict biobehavioral measures of social-emotional functioning.

    PubMed

    Babkirk, Sarah; Luehring-Jones, Peter; Dennis-Tiwary, Tracy A

    2016-12-01

    The use of computer-mediated communication (CMC) as a form of social interaction has become increasingly prevalent, yet few studies examine individual differences that may shed light on implications of CMC for adjustment. The current study examined neurocognitive individual differences associated with preferences to use technology in relation to social-emotional outcomes. In Study 1 (N = 91), a self-report measure, the Social Media Communication Questionnaire (SMCQ), was evaluated as an assessment of preferences for communicating positive and negative emotions on a scale ranging from purely via CMC to purely face-to-face. In Study 2, SMCQ preferences were examined in relation to event-related potentials (ERPs) associated with early emotional attention capture and reactivity (the frontal N1) and later sustained emotional processing and regulation (the late positive potential (LPP)). Electroencephalography (EEG) was recorded while 22 participants passively viewed emotional and neutral pictures and completed an emotion regulation task with instructions to increase, decrease, or maintain their emotional responses. A greater preference for CMC was associated with reduced size of and satisfaction with social support, greater early (N1) attention capture by emotional stimuli, and reduced LPP amplitudes to unpleasant stimuli in the increase emotion regulatory task. These findings are discussed in the context of possible emotion- and social-regulatory functions of CMC.

  10. Computer-Mediated Communication Preferences and Individual Differences in Neurocognitive Measures of Emotional Attention Capture, Reactivity and Regulation

    PubMed Central

    Babkirk, Sarah; Luehring-Jones, Peter; Dennis, Tracy A.

    2016-01-01

    The use of computer-mediated communication (CMC) to engage socially has become increasingly prevalent, yet few studies examined individual differences that may shed light on implications of CMC for adjustment. The current study examined neurocognitive individual differences associated with preferences to use technology in relation to social-emotional outcomes. In Study 1 (N =91), a self-report measure, the Social Media Communication Questionnaire (SMCQ), was evaluated as an assessment of preferences for communicating positive and negative emotions on a scale ranging from purely via CMC to purely face-to-face. In Study 2, SMCQ preferences were examined in relation to event-related potentials (ERPs) associated with early emotional attention capture and reactivity (the frontal N1) and later sustained emotional processing and regulation [the late positive potential (LPP)]. Electroencephalography (EEG) was recorded while 22 participants passively viewed emotional and neutral pictures and completed an emotion regulation task with instructions to increase, decrease or maintain their emotional responses. A greater preference for CMC was associated with reduced size of and satisfaction with social support, greater early (N1) attention capture by emotional stimuli, and reduced LPP amplitudes to unpleasant stimuli in the increase emotion regulatory task. These findings are discussed in the context of possible emotion- and social-regulatory functions of CMC. PMID:26613269

  11. Adaptation to Emotional Conflict: Evidence from a Novel Face Emotion Paradigm

    PubMed Central

    Clayson, Peter E.; Larson, Michael J.

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words. PMID:24073278

  12. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    PubMed

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.
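
    The two behavioral effects reported in the study above, the congruency effect and conflict adaptation (a smaller congruency effect following incongruent than congruent trials), can be computed from trial-level data as sketched below. The trial records and column names are invented for illustration and do not reflect the authors' data.

```python
# Hypothetical sketch: congruency effect and conflict adaptation (Gratton-style
# sequence effect) from trial-level response times.
import pandas as pd

trials = pd.DataFrame({
    "prev":    ["C", "C", "I", "I", "C", "I", "C", "I"],   # previous-trial congruency
    "current": ["C", "I", "C", "I", "C", "I", "I", "C"],   # current-trial congruency
    "rt":      [512, 598, 540, 560, 505, 555, 605, 535],   # response time in ms
})

mean_rt = trials.groupby(["prev", "current"])["rt"].mean()
congruency_after_C = mean_rt[("C", "I")] - mean_rt[("C", "C")]
congruency_after_I = mean_rt[("I", "I")] - mean_rt[("I", "C")]
conflict_adaptation = congruency_after_C - congruency_after_I   # > 0 indicates adaptation

print(mean_rt)
print("conflict adaptation (ms):", conflict_adaptation)
```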

  13. Exploring the motivational brain: effects of implicit power motivation on brain activation in response to facial expressions of emotion.

    PubMed

    Schultheiss, Oliver C; Wirth, Michelle M; Waugh, Christian E; Stanton, Steven J; Meier, Elizabeth A; Reuter-Lorenz, Patricia

    2008-12-01

    This study tested the hypothesis that implicit power motivation (nPower), in interaction with power incentives, influences activation of brain systems mediating motivation. Twelve individuals low (lowest quartile) and 12 individuals high (highest quartile) in nPower, as assessed per content coding of picture stories, were selected from a larger initial participant pool and participated in a functional magnetic resonance imaging study during which they viewed high-dominance (angry faces), low-dominance (surprised faces) and control stimuli (neutral faces, gray squares) under oddball-task conditions. Consistent with hypotheses, high-power participants showed stronger activation in response to emotional faces in brain structures involved in emotion and motivation (insula, dorsal striatum, orbitofrontal cortex) than low-power participants.

  14. Emerging depression is associated with face memory deficits in adolescent girls.

    PubMed

    Guyer, Amanda E; Choate, Victoria R; Grimm, Kevin J; Pine, Daniel S; Keenan, Kate

    2011-02-01

    To examine the association between memory for previously encoded emotional faces and depression symptoms assessed over 4 years in adolescent girls. Investigating the interface between memory deficits and depression in adolescent girls may provide clues about depression pathophysiology. Participants were 213 girls recruited from a longitudinal, community-based study; the majority were African American. Scores on depressive screening measures at age 8 were used to increase the base rate of depression. Depression symptoms and diagnoses were assessed annually for 4 years. In year 4, when the girls were 12 to 13 years old, a face emotion encoding task was administered during which ratings were generated in response to sad, fearful, angry, and happy faces. A surprise memory task followed whereby participants identified which of two faces, displaying neutral expressions, they had seen previously. Girls with higher depression symptom levels from ages 9 to 12 years evidenced lower accuracy in identifying previously encoded emotional faces. Controlling for IQ, higher depression symptom level was associated with a memory deficit specific to previously encoded sad and happy faces. These effects were not moderated by race. Individual differences in face memory deficits relate to individual differences in emerging, early adolescent depression, and may be vulnerability markers for depression. Copyright © 2011 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  15. People with chronic facial pain perform worse than controls at a facial emotion recognition task, but it is not all about the emotion.

    PubMed

    von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L

    2015-04-01

    Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both tasks (Facially Expressed Emotion Labelling (FEEL) task P < 0.001; left/right judgment task P < 0.001). Participants who were more accurate at one task were also more accurate at the other, regardless of group (P < 0.001, r² = 0.523). Participants with chronic facial pain were worse than controls at both the FEEL emotion recognition task and the left/right facial expression task and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.

  16. Inter-hemispheric interaction facilitates face processing.

    PubMed

    Compton, Rebecca J

    2002-01-01

    Many recent studies have revealed that interaction between the left and right cerebral hemispheres can aid in task performance, but these studies have tended to examine perception of simple stimuli such as letters, digits or simple shapes, which may have limited naturalistic validity. The present study extends these prior findings to a more naturalistic face perception task. Matching tasks required subjects to indicate when a target face matched one of two probe faces. Matches could be either across-field, requiring inter-hemispheric interaction, or within-field, not requiring inter-hemispheric interaction. Subjects indicated when faces matched in emotional expression (Experiment 1; n=32) or in character identity (Experiment 2; n=32). In both experiments, across-field performance was significantly better than within-field performance, supporting the primary hypothesis. Further, this advantage was greater for the more difficult character identity task. Results offer qualified support for the hypothesis that inter-hemispheric interaction is especially advantageous as task demands increase.

  17. Preschool negative emotionality predicts activity and connectivity of the fusiform face area and amygdala in later childhood.

    PubMed

    Kann, Sarah J; O'Rawe, Jonathan F; Huang, Anna S; Klein, Daniel N; Leung, Hoi-Chung

    2017-09-01

    Negative emotionality (NE) refers to individual differences in the propensity to experience and react with negative emotions and is associated with increased risk of psychological disorder. However, research on the neural bases of NE has focused almost exclusively on amygdala activity during emotional face processing. This study broadened this framework by examining the relationship between observed NE in early childhood and subsequent neural responses to emotional faces in both the amygdala and the fusiform face area (FFA) in a late childhood/early adolescent sample. Measures of NE were obtained from children at age 3 using laboratory observations, and functional magnetic resonance imaging (fMRI) data were collected when these children were between the ages of 9 and 12 while performing a visual stimulus identity matching task with houses and emotional faces as stimuli. Multiple regression analyses revealed that higher NE at age 3 is associated with significantly greater activation in the left amygdala and left FFA but lower functional connectivity between these two regions during the face conditions. These findings suggest that those with higher early NE have subsequent alterations in both activity and connectivity within an extended network during face processing. © The Author (2017). Published by Oxford University Press.

  18. Spatial Rotation and Recognizing Emotions: Gender Related Differences in Brain Activity

    ERIC Educational Resources Information Center

    Jausovec, Norbert; Jausovec, Ksenija

    2008-01-01

    In three experiments, gender and ability (performance and emotional intelligence) related differences in brain activity--assessed with EEG methodology--while respondents were solving spatial rotation tasks and identifying emotions in faces were investigated. The most robust gender related difference in brain activity was observed in the lower-2…

  19. Intact Rapid Facial Mimicry as well as Generally Reduced Mimic Responses in Stable Schizophrenia Patients

    PubMed Central

    Chechko, Natalya; Pagel, Alena; Otte, Ellen; Koch, Iring; Habel, Ute

    2016-01-01

    Spontaneous emotional expressions (rapid facial mimicry) perform both emotional and social functions. In the current study, we sought to test whether there were deficits in automatic mimic responses to emotional facial expressions in 15 patients with stable schizophrenia compared to 15 controls. In a perception-action interference paradigm (the Simon task; first experiment), and in the context of a dual-task paradigm (second experiment), the task-relevant stimulus feature was the gender of a face, which, however, displayed a smiling or frowning expression (the task-irrelevant stimulus feature). We measured electromyographic activity in the corrugator supercilii and zygomaticus major muscle regions in response to either compatible or incompatible stimuli (i.e., when the required response did or did not correspond to the depicted facial expression). The compatibility effect based on interactions between the implicit processing of a task-irrelevant emotional facial expression and the conscious production of an emotional facial expression did not differ between the groups. In stable patients, in spite of a reduced mimic reaction, we observed an intact capacity to respond spontaneously to facial emotional stimuli. PMID:27303335

  20. Visuo-spatial interference affects the identification of emotional facial expressions in unmedicated Parkinson's patients.

    PubMed

    García-Rodríguez, Beatriz; Guillén, Carmen Casares; Barba, Rosa Jurado; Rubio Valladolid, Gabriel; Arjona, José Antonio Molina; Ellgring, Heiner

    2012-02-15

    There is evidence that visuo-spatial capacity can become overloaded when processing a secondary visual task (Dual Task, DT), as occurs in daily life. Hence, we investigated the influence of visuo-spatial interference on the identification of emotional facial expressions (EFEs) in the early stages of Parkinson's disease (PD). We compared the identification of 24 emotional faces illustrating six basic emotions in unmedicated, recently diagnosed PD patients (n = 16) and healthy adults (n = 20) under two different conditions: a) simple EFE identification, and b) identification with a concurrent visuo-spatial task (Corsi Blocks). EFE identification by PD patients was significantly worse than that of healthy adults when combined with another visual stimulus. Published by Elsevier B.V.

  1. The association between PTSD and facial affect recognition.

    PubMed

    Williams, Christian L; Milanak, Melissa E; Judah, Matt R; Berenbaum, Howard

    2018-05-05

    The major aims of this study were to examine how, if at all, having higher levels of PTSD would be associated with performance on a facial affect recognition task in which facial expressions of emotion are superimposed on emotionally valenced, non-face images. College students with trauma histories (N = 90) completed a facial affect recognition task as well as measures of exposure to traumatic events, and PTSD symptoms. When the face and context matched, participants with higher levels of PTSD were significantly more accurate. When the face and context were mismatched, participants with lower levels of PTSD were more accurate than were those with higher levels of PTSD. These findings suggest that PTSD is associated with how people process affective information. Furthermore, these results suggest that the enhanced attention of people with higher levels of PTSD to affective information can be either beneficial or detrimental to their ability to accurately identify facial expressions of emotion. Limitations, future directions and clinical implications are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Visual scanpath abnormalities in 22q11.2 deletion syndrome: is this a face specific deficit?

    PubMed

    McCabe, Kathryn; Rich, Dominique; Loughland, Carmel Maree; Schall, Ulrich; Campbell, Linda Elisabet

    2011-09-30

    People with 22q11.2 deletion syndrome (22q11DS) have deficits in face emotion recognition. However, it is not known whether this is a deficit specific to faces or whether it reflects maladaptive information-processing strategies for complex stimuli in general. This study examined the specificity of face emotion processing deficits in 22q11DS by exploring recognition accuracy and visual scanpath performance on a Faces task compared to a Weather Scene task. Seventeen adolescents with 22q11DS (11 females, mean age 17.4) and 18 healthy controls (11 females, mean age 17.7) participated in the study. People with 22q11DS displayed an overall impoverished scanning strategy for face and weather stimuli alike, resulting in poorer accuracy across all stimuli for the 22q11DS participants compared to controls. While the control subjects altered their information processing in response to faces, a similar change was not present in the 22q11DS group, indicating different visual scanpath strategies for identifying category within each of the tasks, of which faces appear to represent a particularly difficult subcategory. To conclude, while this study indicates that people with 22q11DS have a general visual processing deficit, the lack of strategic change between tasks suggests that the 22q11DS group did not adapt to the change in stimulus content as well as the controls did, indicative of cognitive inflexibility rather than a face-specific deficit. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Time-course of attention biases in social phobia.

    PubMed

    Schofield, Casey A; Inhoff, Albrecht W; Coles, Meredith E

    2013-10-01

    Theoretical models of social phobia implicate preferential attention to social threat in the maintenance of anxiety symptoms, though there has been limited work characterizing the nature of these biases over time. The current study utilized eye-movement data to examine the time-course of visual attention over 1500 ms trials of a probe detection task. Nineteen participants with a primary diagnosis of social phobia based on DSM-IV criteria and 20 non-clinical controls completed this task with angry, fearful, and happy face trials. Overt visual attention to the emotional and neutral faces was measured in 50 ms segments across the trial. Over time, participants with social phobia attended less to emotional faces, and specifically less to happy faces, compared to controls. Further, attention to emotional relative to neutral expressions did not vary notably by emotion for participants with social phobia, but control participants showed a pattern after 1000 ms in which, over time, they preferentially attended to happy expressions and avoided negative expressions. Findings highlight the importance of considering attention biases to positive stimuli as well as differences in the pattern of attention between groups. These results suggest that attention "bias" in social phobia may be driven by a relative lack of the biases seen in non-anxious participants. Copyright © 2013 Elsevier Ltd. All rights reserved.
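
    The 50 ms time-course analysis described above amounts to binning gaze samples and computing, for each bin, the share of face-directed samples that land on the emotional rather than the neutral face. The following sketch only illustrates that idea; the sampling rate, the rectangular area-of-interest definitions, and all variable names are assumptions, not details taken from the study.

      import numpy as np

      def dwell_time_course(gaze_x, gaze_y, emo_aoi, neu_aoi,
                            sample_rate_hz=500, bin_ms=50, trial_ms=1500):
          """Proportion of face-directed gaze on the emotional face, per time bin.

          gaze_x, gaze_y : 1-D arrays of gaze coordinates for a single trial.
          emo_aoi, neu_aoi : (x_min, x_max, y_min, y_max) rectangles (assumed).
          """
          def in_aoi(x, y, aoi):
              x0, x1, y0, y1 = aoi
              return (x >= x0) & (x <= x1) & (y >= y0) & (y <= y1)

          samples_per_bin = int(sample_rate_hz * bin_ms / 1000)
          n_bins = trial_ms // bin_ms
          props = np.full(n_bins, np.nan)
          for b in range(n_bins):
              sl = slice(b * samples_per_bin, (b + 1) * samples_per_bin)
              on_emo = in_aoi(gaze_x[sl], gaze_y[sl], emo_aoi)
              on_neu = in_aoi(gaze_x[sl], gaze_y[sl], neu_aoi)
              on_face = on_emo | on_neu
              if on_face.any():          # skip bins with no gaze on either face
                  props[b] = on_emo.sum() / on_face.sum()
          return props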

  4. Connectivity underlying emotion conflict regulation in older adults with 5-HTTLPR short allele: a preliminary investigation.

    PubMed

    Waring, Jill D; Etkin, Amit; Hallmayer, Joachim F; O'Hara, Ruth

    2014-09-01

    The serotonin transporter polymorphism short (s) allele is associated with heightened emotional reactivity and reduced emotion regulation, which increases vulnerability to depression and anxiety disorders. We investigated behavioral and neural markers of emotion regulation in community-dwelling older adults, contrasting s allele carriers and long allele homozygotes. Participants (N = 26) completed a face-word emotion conflict task during functional magnetic resonance imaging, in which facilitated regulation of emotion conflict was observed on face-word incongruent trials following another incongruent trial (i.e., emotional conflict adaptation). There were no differences between genetic groups in behavioral task performance or neural activation in postincongruent versus postcongruent trials. By contrast, connectivity between dorsal anterior cingulate cortex (ACC) and pregenual ACC, regions previously implicated in emotion conflict regulation, was impaired in s carriers for emotional conflict adaptation. This is the first demonstration of an association between serotonin transporter polymorphism and functional connectivity in older adults. Poor dorsal ACC-pregenual ACC connectivity in s carriers may be one route by which these individuals experience greater difficulty in implementing effective emotional regulation, which may contribute to their vulnerability for affective disorders. Copyright © 2014 American Association for Geriatric Psychiatry. All rights reserved.

  5. Early visual ERPs are influenced by individual emotional skills

    PubMed Central

    Roux, Sylvie; Batty, Magali

    2014-01-01

    Processing information from faces is crucial to understanding others and to adapting to social life. Many studies have investigated responses to facial emotions to provide a better understanding of the processes and the neural networks involved. Moreover, several studies have revealed abnormalities of emotional face processing and their neural correlates in affective disorders. The aim of this study was to investigate whether early visual event-related potentials (ERPs) are affected by the emotional skills of healthy adults. Unfamiliar faces expressing the six basic emotions were presented to 28 young adults while recording visual ERPs. No specific task was required during the recording. Participants also completed the Social Skills Inventory (SSI) which measures social and emotional skills. The results confirmed that early visual ERPs (P1, N170) are affected by the emotions expressed by a face and also demonstrated that N170 and P2 are correlated to the emotional skills of healthy subjects. While N170 is sensitive to the subject’s emotional sensitivity and expressivity, P2 is modulated by the ability of the subjects to control their emotions. We therefore suggest that N170 and P2 could be used as individual markers to assess strengths and weaknesses in emotional areas and could provide information for further investigations of affective disorders. PMID:23720573

  6. The beneficial effect of oxytocin on avoidance-related facial emotion recognition depends on early life stress experience.

    PubMed

    Feeser, Melanie; Fan, Yan; Weigand, Anne; Hahn, Adam; Gärtner, Matti; Aust, Sabine; Böker, Heinz; Bajbouj, Malek; Grimm, Simone

    2014-12-01

    Previous studies have shown that oxytocin (OXT) enhances social cognitive processes. It has also been demonstrated that OXT does not uniformly facilitate social cognition. The effects of OXT administration strongly depend on the exposure to stressful experiences in early life. Emotional facial recognition is crucial for social cognition. However, no study has yet examined how the effects of OXT on the ability to identify emotional faces are altered by early life stress (ELS) experiences. Given the role of OXT in modulating social motivational processes, we specifically aimed to investigate its effects on the recognition of approach- and avoidance-related facial emotions. In a double-blind, between-subjects, placebo-controlled design, 82 male participants performed an emotion recognition task with faces taken from the "Karolinska Directed Emotional Faces" set. We clustered the six basic emotions along the dimensions approach (happy, surprise, anger) and avoidance (fear, sadness, disgust). ELS was assessed with the Childhood Trauma Questionnaire (CTQ). Our results showed that OXT improved the ability to recognize avoidance-related emotional faces as compared to approach-related emotional faces. Whereas the performance for avoidance-related emotions in participants with higher ELS scores was comparable in both the OXT and placebo conditions, OXT enhanced emotion recognition in participants with lower ELS scores. Independent of OXT administration, we observed increased emotion recognition for avoidance-related faces in participants with high ELS scores. Our findings suggest that the investigation of OXT effects on social recognition requires a broad approach that takes ELS experiences as well as motivational processes into account.

  7. Body Weight Can Change How Your Emotions Are Perceived

    PubMed Central

    2016-01-01

    Accurately interpreting other’s emotions through facial expressions has important adaptive values for social interactions. However, due to the stereotypical social perception of overweight individuals as carefree, humorous, and light-hearted, the body weight of those with whom we interact may have a systematic influence on our emotion judgment even though it has no relevance to the expressed emotion itself. In this experimental study, we examined the role of body weight in faces on the affective perception of facial expressions. We hypothesized that the weight perceived in a face would bias the assessment of an emotional expression, with overweight faces generally more likely to be perceived as having more positive and less negative expressions than healthy weight faces. Using two-alternative forced-choice perceptual decision tasks, participants were asked to sort the emotional expressions of overweight and healthy weight facial stimuli that had been gradually morphed across six emotional intensity levels into one of two categories—“neutral vs. happy” (Experiment 1) and “neutral vs. sad” (Experiment 2). As predicted, our results demonstrated that overweight faces were more likely to be categorized as happy (i.e., lower happy decision threshold) and less likely to be categorized as sad (i.e., higher sad decision threshold) compared to healthy weight faces that had the same levels of emotional intensity. The neutral-sad decision threshold shift was negatively correlated with participant’s own fear of becoming fat, that is, those without a fear of becoming fat more strongly perceived overweight faces as sad relative to those with a higher fear. These findings demonstrate that the weight of the face systematically influences how its emotional expression is interpreted, suggesting that being overweight may make emotional expressions appear more happy and less sad than they really are. PMID:27870892
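
    Decision thresholds of the kind reported above are commonly estimated by fitting a psychometric function to the proportion of "happy" (or "sad") responses across the morph intensity levels and reading off its midpoint. The SciPy sketch below is a generic illustration with invented response proportions, not the authors' analysis pipeline.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(x, threshold, slope):
          """Cumulative logistic psychometric function."""
          return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

      # Invented example: morph level 0 = fully neutral, 5 = fully happy;
      # p_happy = proportion of "happy" choices at each level for one face type.
      morph_levels = np.arange(6)
      p_happy = np.array([0.05, 0.10, 0.35, 0.70, 0.90, 0.97])

      (threshold, slope), _ = curve_fit(logistic, morph_levels, p_happy, p0=[2.5, 1.0])
      print(f"estimated happy decision threshold: {threshold:.2f} morph steps")
      # A lower threshold for overweight than for healthy-weight faces would
      # correspond to the reported bias toward categorizing overweight faces as happy.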

  8. Age-related changes in amygdala-frontal connectivity during emotional face processing from childhood into young adulthood.

    PubMed

    Wu, Minjie; Kujawa, Autumn; Lu, Lisa H; Fitzgerald, Daniel A; Klumpp, Heide; Fitzgerald, Kate D; Monk, Christopher S; Phan, K Luan

    2016-05-01

    The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful, and happy faces) in 61 healthy subjects aged 7-25 years. We found age-related decreases in ventral medial prefrontal cortex activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. Hum Brain Mapp 37:1684-1695, 2016. © 2016 Wiley Periodicals, Inc.

  9. Body Weight Can Change How Your Emotions Are Perceived.

    PubMed

    Oh, Yujung; Hass, Norah C; Lim, Seung-Lark

    2016-01-01

    Accurately interpreting other's emotions through facial expressions has important adaptive values for social interactions. However, due to the stereotypical social perception of overweight individuals as carefree, humorous, and light-hearted, the body weight of those with whom we interact may have a systematic influence on our emotion judgment even though it has no relevance to the expressed emotion itself. In this experimental study, we examined the role of body weight in faces on the affective perception of facial expressions. We hypothesized that the weight perceived in a face would bias the assessment of an emotional expression, with overweight faces generally more likely to be perceived as having more positive and less negative expressions than healthy weight faces. Using two-alternative forced-choice perceptual decision tasks, participants were asked to sort the emotional expressions of overweight and healthy weight facial stimuli that had been gradually morphed across six emotional intensity levels into one of two categories-"neutral vs. happy" (Experiment 1) and "neutral vs. sad" (Experiment 2). As predicted, our results demonstrated that overweight faces were more likely to be categorized as happy (i.e., lower happy decision threshold) and less likely to be categorized as sad (i.e., higher sad decision threshold) compared to healthy weight faces that had the same levels of emotional intensity. The neutral-sad decision threshold shift was negatively correlated with participant's own fear of becoming fat, that is, those without a fear of becoming fat more strongly perceived overweight faces as sad relative to those with a higher fear. These findings demonstrate that the weight of the face systematically influences how its emotional expression is interpreted, suggesting that being overweight may make emotional expressions appear more happy and less sad than they really are.

  10. Emotional faces influence evaluation of natural and transformed food.

    PubMed

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence has shown a direct relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli composed of a neutral, happy, or disgusted face (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted of judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response depending on the transformation level of the food. In general, the evaluation of natural foods was faster than that of transformed foods, possibly because of their simplicity and perceived healthiness. In addition, transformed foods yielded responses incongruent with the preceding emotional face, whereas natural foods yielded congruent responses. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on the salience of pleasant food stimuli.

  11. Repetition suppression of faces is modulated by emotion

    NASA Astrophysics Data System (ADS)

    Ishai, Alumit; Pessoa, Luiz; Bikle, Philip C.; Ungerleider, Leslie G.

    2004-06-01

    Single-unit recordings and functional brain imaging studies have shown reduced neural responses to repeated stimuli in the visual cortex. By using event-related functional MRI, we compared the activation evoked by repetitions of neutral and fearful faces, which were either task relevant (targets) or irrelevant (distracters). We found that within the inferior occipital gyri, lateral fusiform gyri, superior temporal sulci, amygdala, and the inferior frontal gyri/insula, targets evoked stronger responses than distracters and their repetition was associated with significantly reduced responses. Repetition suppression, as manifested by the difference in response amplitude between the first and third repetitions of a target, was stronger for fearful than neutral faces. Distracter faces, regardless of their repetition or valence, evoked negligible activation, indicating top-down attenuation of behaviorally irrelevant stimuli. Our findings demonstrate a three-way interaction between emotional valence, repetition, and task relevance and suggest that repetition suppression is influenced by high-level cognitive processes in the human brain.

  12. Neural correlates of emotional intelligence in a visual emotional oddball task: an ERP study.

    PubMed

    Raz, Sivan; Dan, Orrie; Zysberg, Leehu

    2014-11-01

    The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI. Copyright © 2014 Elsevier Inc. All rights reserved.
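
    Mean component amplitudes such as the P1, P2, N2, and P3 values compared above are typically obtained by averaging the epoched EEG signal over a component-specific time window at selected electrodes. The snippet below is a minimal, library-free sketch of that step; the epoch array layout, the example time windows, and the electrode groupings are assumptions made for illustration.

      import numpy as np

      def mean_component_amplitude(epochs, times, window_ms, channels):
          """Mean amplitude in a component time window, averaged over trials and channels.

          epochs    : array, shape (n_trials, n_channels, n_samples), in microvolts.
          times     : array, shape (n_samples,), epoch time axis in milliseconds.
          window_ms : (start, end) of the component window, e.g. (80, 130) for P1.
          channels  : list of electrode indices entering the average.
          """
          t0, t1 = window_ms
          in_window = (times >= t0) & (times <= t1)
          return epochs[:, channels, :][:, :, in_window].mean()

      # Example calls with assumed windows and electrode groups:
      # p1 = mean_component_amplitude(epochs_angry, times, (80, 130), occipital_idx)
      # p3 = mean_component_amplitude(epochs_angry, times, (300, 500), parietal_idx)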

  13. Impairment in face processing in autism spectrum disorder: a developmental perspective.

    PubMed

    Greimel, Ellen; Schulte-Rüther, Martin; Kamp-Becker, Inge; Remschmidt, Helmut; Herpertz-Dahlmann, Beate; Konrad, Kerstin

    2014-09-01

    Findings on face identity and facial emotion recognition in autism spectrum disorder (ASD) are inconclusive. Moreover, little is known about the developmental trajectory of face processing skills in ASD. Taking a developmental perspective, the aim of this study was to extend previous findings on face processing skills in a sample of adolescents and adults with ASD. N = 38 adolescents and adults (13-49 years) with high-functioning ASD and n = 37 typically developing (TD) control subjects matched for age and IQ participated in the study. Moreover, n = 18 TD children between the ages of 8 and 12 were included to address the question of whether face processing skills in ASD follow a delayed developmental pattern. Face processing skills were assessed using computerized tasks of face identity recognition (FR) and identification of facial emotions (IFE). ASD subjects showed impaired performance on several parameters of the FR and IFE tasks compared to TD control adolescents and adults. Whereas TD adolescents and adults outperformed TD children in both tasks, performance in ASD adolescents and adults was similar to that of the group of TD children. Within the groups of ASD and control adolescents and adults, no age-related changes in performance were found. Our findings corroborate and extend previous studies showing that ASD is characterised by broad impairments in the ability to process faces. These impairments seem to reflect a developmentally delayed pattern that remains stable throughout adolescence and adulthood.

  14. The dissociable neural dynamics of cognitive conflict and emotional conflict control: An ERP study.

    PubMed

    Xue, Song; Li, Yu; Kong, Xia; He, Qiaolin; Liu, Jia; Qiu, Jiang

    2016-04-21

    This study investigated differences in the neural time-course of cognitive conflict and emotional conflict control, using event-related potentials (ERPs). Although imaging studies have provided some evidence that distinct, dissociable neural systems underlie emotional and nonemotional conflict resolution, no ERP study has directly compared these two types of conflict. Therefore, the present study used a modified face-word Stroop task to explore the electrophysiological correlates of cognitive and emotional conflict control. The behavioral data showed that the response-time congruency effect (incongruent condition minus congruent condition) was larger in the cognitive conflict task than in the emotional conflict task, which indicated that cognitive conflict was stronger than emotional conflict in the present tasks. Analysis of the ERP data revealed a main effect of task type on N2, which may be associated with top-down attention. The N450 results showed an interaction between cognitive and emotional conflict, which might be related to conflict detection. In addition, we found that the incongruent condition elicited a larger SP than the congruent condition, which might be related to conflict resolution. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
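
    The behavioral congruency effect reported above is simply the incongruent-minus-congruent difference in mean response time, computed separately for each task. A minimal pandas sketch with assumed column names and a hypothetical input file is shown below.

      import pandas as pd

      # Assumed long-format trial data (column names are invented):
      #   subject, task ("cognitive" / "emotional"), congruency, rt_ms
      trials = pd.read_csv("face_word_stroop_trials.csv")   # hypothetical file

      mean_rt = (trials.groupby(["subject", "task", "congruency"])["rt_ms"]
                       .mean()
                       .unstack("congruency"))

      # Congruency effect per subject and task: incongruent minus congruent RT.
      mean_rt["congruency_effect"] = mean_rt["incongruent"] - mean_rt["congruent"]

      # A larger average effect in the cognitive task than in the emotional task
      # would mirror the pattern described above.
      print(mean_rt.groupby("task")["congruency_effect"].mean())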

  15. The effect of acute citalopram on face emotion processing in remitted depression: a pharmacoMRI study.

    PubMed

    Anderson, Ian M; Juhasz, Gabriella; Thomas, Emma; Downey, Darragh; McKie, Shane; Deakin, J F William; Elliott, Rebecca

    2011-01-01

    Both reduced serotonergic (5-HT) function and negative emotional biases have been associated with vulnerability to depression. In order to investigate whether these might be related, we examined 5-HT modulation of affective processing in 14 remitted depressed subjects compared with 12 never depressed controls matched for age and sex. Participants underwent functional magnetic resonance imaging (fMRI) during a covert face emotion task with and without intravenous citalopram (7.5 mg) pretreatment. Compared with viewing neutral faces, and irrespective of group, citalopram enhanced left anterior cingulate blood oxygen level dependent (BOLD) response to happy faces, right posterior insula and right lateral orbitofrontal responses to sad faces, and reduced amygdala responses bilaterally to fearful faces. In controls, relative to remitted depressed subjects, citalopram increased bilateral hippocampal responses to happy faces and increased right anterior insula response to sad faces. These findings were not accounted for by changes in BOLD responses to viewing neutral faces. These results are consistent with previous findings showing 5-HT modulation of affective processing; differences found in previously depressed participants compared with controls may contribute to emotional processing biases underlying vulnerability to depressive relapse. Copyright © 2010 Elsevier B.V. and ECNP. All rights reserved.

  16. Emotional responses associated with self-face processing in individuals with autism spectrum disorders: an fMRI study.

    PubMed

    Morita, Tomoyo; Kosaka, Hirotaka; Saito, Daisuke N; Ishitobi, Makoto; Munesue, Toshio; Itakura, Shoji; Omori, Masao; Okazawa, Hidehiko; Wada, Yuji; Sadato, Norihiro

    2012-01-01

    Individuals with autism spectrum disorders (ASD) show impaired emotional responses to self-face processing, but the underlying neural bases are unclear. Using functional magnetic resonance imaging, we investigated brain activity when 15 individuals with high-functioning ASD and 15 controls rated the photogenicity of self-face images and photographs of others' faces. Controls showed a strong correlation between photogenicity ratings and extent of embarrassment evoked by self-face images; this correlation was weaker among ASD individuals, indicating a decoupling between the cognitive evaluation of self-face images and emotional responses. Individuals with ASD demonstrated relatively low self-related activity in the posterior cingulate cortex (PCC), which was related to specific autistic traits. There were significant group differences in the modulation of activity by embarrassment ratings in the right insular (IC) and lateral orbitofrontal cortices. Task-related activity in the right IC was lower in the ASD group. The reduced activity in the right IC for self-face images was associated with weak coupling between cognitive evaluation and emotional responses to self-face images. The PCC is responsible for self-referential processing, and the IC plays a role in emotional experience. Dysfunction in these areas could contribute to the lack of self-conscious behaviors in response to self-reflection in ASD individuals.

  17. Neural Reactivity to Angry Faces Predicts Treatment Response in Pediatric Anxiety.

    PubMed

    Bunford, Nora; Kujawa, Autumn; Fitzgerald, Kate D; Swain, James E; Hanna, Gregory L; Koschmann, Elizabeth; Simpson, David; Connolly, Sucheta; Monk, Christopher S; Phan, K Luan

    2017-02-01

    Although cognitive-behavioral psychotherapy (CBT) and pharmacotherapy are evidence-based treatments for pediatric anxiety, many youth with anxiety disorders fail to respond to these treatments. Given limitations of clinical measures in predicting treatment response, identifying neural predictors is timely. In this study, 35 anxious youth (ages 7-19 years) completed an emotional face-matching task during which the late positive potential (LPP), an event-related potential (ERP) component that indexes sustained attention towards emotional stimuli, was measured. Following the ERP measurement, youth received CBT or selective serotonin reuptake inhibitor (SSRI) treatment, and the LPP was examined as a predictor of treatment response. Findings indicated that, accounting for pre-treatment anxiety severity, neural reactivity to emotional faces predicted anxiety severity after CBT and SSRI treatment such that enhanced electrocortical response to angry faces was associated with better treatment response. An enhanced LPP to angry faces may predict treatment response insofar as it may reflect greater emotion dysregulation or less avoidance and/or enhanced engagement with environmental stimuli in general, including with treatment.
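
    Analyses of this kind regress post-treatment symptom severity on the ERP measure while controlling for pre-treatment severity. The sketch below illustrates that model with simulated stand-in values; the variable names and numbers are invented and do not come from the study.

      import numpy as np
      import statsmodels.api as sm

      # Simulated stand-in data: lpp_angry = LPP amplitude to angry faces,
      # anx_pre / anx_post = anxiety severity before and after treatment.
      rng = np.random.default_rng(1)
      n = 35
      lpp_angry = rng.normal(5.0, 2.0, n)
      anx_pre = rng.normal(60.0, 10.0, n)
      anx_post = anx_pre - 1.5 * lpp_angry + rng.normal(0.0, 5.0, n)

      # Post-treatment severity regressed on the LPP, controlling for baseline severity.
      X = sm.add_constant(np.column_stack([anx_pre, lpp_angry]))
      model = sm.OLS(anx_post, X).fit()
      print(model.params)   # a negative LPP coefficient = larger LPP, better response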

  18. Attention to emotion: auditory-evoked potentials in an emotional choice reaction task and personality traits as assessed by the NEO FFI.

    PubMed

    Mittermeier, Verena; Leicht, Gregor; Karch, Susanne; Hegerl, Ulrich; Möller, Hans-Jürgen; Pogarell, Oliver; Mulert, Christoph

    2011-03-01

    Several studies suggest that attention to emotional content is related to specific changes in central information processing. In particular, event-related potential (ERP) studies focusing on emotion recognition in pictures and faces or word processing have pointed toward a distinct component of the visual-evoked potential, the EPN ('early posterior negativity'), which has been shown to be related to attention to emotional content. In the present study, we were interested in the existence of a corresponding ERP component in the auditory modality and a possible relationship with the personality dimension extraversion-introversion, as assessed by the NEO Five-Factor Inventory. We investigated 29 healthy subjects using three types of auditory choice tasks: (1) the distinction of syllables with emotional intonation, (2) the identification of the emotional content of adjectives, and (3) a purely cognitive control task. Compared with the cognitive control task, emotional paradigms using auditory stimuli evoked an EPN component with a distinct peak after 170 ms (EPN 170). Interestingly, subjects with high scores in the personality trait extraversion showed significantly higher EPN amplitudes for emotional paradigms (syllables and words) than introverted subjects.

  19. Neural architecture underlying classification of face perception paradigms.

    PubMed

    Laird, Angela R; Riedel, Michael C; Sutherland, Matthew T; Eickhoff, Simon B; Ray, Kimberly L; Uecker, Angela M; Fox, P Mickle; Turner, Jessica A; Fox, Peter T

    2015-10-01

    We present a novel strategy for deriving a classification system of functional neuroimaging paradigms that relies on hierarchical clustering of experiments archived in the BrainMap database. The goal of our proof-of-concept application was to examine the underlying neural architecture of the face perception literature from a meta-analytic perspective, as these studies include a wide range of tasks. Task-based results exhibiting similar activation patterns were grouped as similar, while tasks activating different brain networks were classified as functionally distinct. We identified four sub-classes of face tasks: (1) Visuospatial Attention and Visuomotor Coordination to Faces, (2) Perception and Recognition of Faces, (3) Social Processing and Episodic Recall of Faces, and (4) Face Naming and Lexical Retrieval. Interpretation of these sub-classes supports an extension of a well-known model of face perception to include a core system for visual analysis and extended systems for personal information, emotion, and salience processing. Overall, these results demonstrate that a large-scale data mining approach can inform the evolution of theoretical cognitive models by probing the range of behavioral manipulations across experimental tasks. Copyright © 2015 Elsevier Inc. All rights reserved.
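
    The clustering strategy described above groups experiments whose activation patterns are similar. As a generic illustration (not the authors' exact pipeline), experiments can be represented as binary activation vectors over brain regions and clustered hierarchically, for example with SciPy; the data below are randomly generated placeholders.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      # Assumed input: one row per experiment, one column per brain region,
      # True = the region was reported active in that experiment.
      rng = np.random.default_rng(0)
      activation = rng.integers(0, 2, size=(60, 40)).astype(bool)

      # Jaccard distance between binary activation patterns, average-linkage tree.
      distances = pdist(activation, metric="jaccard")
      tree = linkage(distances, method="average")

      # Cut the dendrogram into four clusters, echoing the four sub-classes of
      # face-perception tasks reported in this record.
      labels = fcluster(tree, t=4, criterion="maxclust")
      print(np.bincount(labels)[1:])   # number of experiments per cluster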

  20. The role of working memory in decoding emotions.

    PubMed

    Phillips, Louise H; Channon, Shelley; Tunstall, Mary; Hedenstrom, Anna; Lyons, Kathryn

    2008-04-01

    Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual-task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders which affect both working memory and emotion perception are considered. (c) 2008 APA.

  1. Grounding context in face processing: color, emotion, and gender.

    PubMed

    Gil, Sandrine; Le Bigot, Ludovic

    2015-01-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (vs. green, mixed red/green, and achromatic) background - known to be valenced - on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension.

  2. Why is happy-sad more difficult? Focal emotional information impairs inhibitory control in children and adults.

    PubMed

    Kramer, Hannah J; Lagattuta, Kristin Hansen; Sayfan, Liat

    2015-02-01

    This study compared the relative difficulty of the happy-sad inhibitory control task (say "happy" for the sad face and "sad" for the happy face) against other card tasks that varied by the presence and type (focal vs. peripheral; negative vs. positive) of emotional information in a sample of 4- to 11-year-olds and adults (N = 264). Participants also completed parallel "name games" (direct labeling). All age groups made more errors and took longer to respond to happy-sad compared to other versions, and the relative difficulty of happy-sad increased with age. The happy-sad name game even posed a greater challenge than some opposite games. These data provide insight into the impact of emotions on cognitive processing across a wide age range. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  3. Emotion perception, but not affect perception, is impaired with semantic memory loss.

    PubMed

    Lindquist, Kristen A; Gendron, Maria; Barrett, Lisa Feldman; Dickerson, Bradford C

    2014-04-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others' faces is inborn, prelinguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this article, we report findings from 3 patients with semantic dementia that cannot be explained by this "basic emotion" view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear, and sadness. These findings have important consequences for understanding the processes supporting emotion perception.

  4. Using Event Related Potentials to Explore Stages of Facial Affect Recognition Deficits in Schizophrenia

    PubMed Central

    Wynn, Jonathan K.; Lee, Junghee; Horan, William P.; Green, Michael F.

    2008-01-01

    Schizophrenia patients show impairments in identifying facial affect; however, it is not known at what stage facial affect processing is impaired. We evaluated 3 event-related potentials (ERPs) to explore stages of facial affect processing in schizophrenia patients. Twenty-six schizophrenia patients and 27 normal controls participated. In separate blocks, subjects identified the gender of a face, the emotion of a face, or if a building had 1 or 2 stories. Three ERPs were examined: (1) P100 to examine basic visual processing, (2) N170 to examine facial feature encoding, and (3) N250 to examine affect decoding. Behavioral performance on each task was also measured. Results showed that schizophrenia patients’ P100 was comparable to the controls during all 3 identification tasks. Both patients and controls exhibited a comparable N170 that was largest during processing of faces and smallest during processing of buildings. For both groups, the N250 was largest during the emotion identification task and smallest for the building identification task. However, the patients produced a smaller N250 compared with the controls across the 3 tasks. The groups did not differ in behavioral performance in any of the 3 identification tasks. The pattern of intact P100 and N170 suggest that patients maintain basic visual processing and facial feature encoding abilities. The abnormal N250 suggests that schizophrenia patients are less efficient at decoding facial affect features. Our results imply that abnormalities in the later stage of feature decoding could potentially underlie emotion identification deficits in schizophrenia. PMID:18499704

  5. Piccolo genotype modulates neural correlates of emotion processing but not executive functioning.

    PubMed

    Woudstra, S; Bochdanovits, Z; van Tol, M-J; Veltman, D J; Zitman, F G; van Buchem, M A; van der Wee, N J; Opmeer, E M; Demenescu, L R; Aleman, A; Penninx, B W; Hoogendijk, W J

    2012-04-03

    Major depressive disorder (MDD) is characterized by affective symptoms and cognitive impairments, which have been associated with changes in limbic and prefrontal activity as well as with monoaminergic neurotransmission. A genome-wide association study implicated the polymorphism rs2522833 in the piccolo (PCLO) gene--involved in monoaminergic neurotransmission--as a risk factor for MDD. However, the role of the PCLO risk allele in emotion processing and executive function or its effect on their neural substrate has never been studied. We used functional magnetic resonance imaging (fMRI) to investigate PCLO risk allele carriers vs noncarriers during an emotional face processing task and a visuospatial planning task in 159 current MDD patients and healthy controls. In PCLO risk allele carriers, we found increased activity in the left amygdala during processing of angry and sad faces compared with noncarriers, independent of psychopathological status. During processing of fearful faces, the PCLO risk allele was associated with increased amygdala activation in MDD patients only. During the visuospatial planning task, we found no genotype effect on performance or on BOLD signal in our predefined areas as a function of increasing task load. The PCLO risk allele was found to be specifically associated with altered emotion processing, but not with executive dysfunction. Moreover, the PCLO risk allele appears to modulate amygdala function during fearful facial processing in MDD and may constitute a possible link between genotype and susceptibility for depression via altered processing of fearful stimuli. The current results may therefore aid in better understanding underlying neurobiological mechanisms in MDD.

  6. Altered medial prefrontal activity during dynamic face processing in schizophrenia spectrum patients.

    PubMed

    Mothersill, Omar; Morris, Derek W; Kelly, Sinead; Rose, Emma Jane; Bokde, Arun; Reilly, Richard; Gill, Michael; Corvin, Aiden P; Donohoe, Gary

    2014-08-01

    Processing the emotional content of faces is recognised as a key deficit of schizophrenia, associated with poorer functional outcomes and possibly contributing to the severity of clinical symptoms such as paranoia. At the neural level, fMRI studies have reported altered limbic activity in response to facial stimuli. However, previous studies may be limited by the use of cognitively demanding tasks and static facial stimuli. To address these issues, the current study used a face processing task involving both passive face viewing and dynamic social stimuli. Such a task may (1) lack the potentially confounding effects of high cognitive demands and (2) show higher ecological validity. Functional MRI was used to examine neural activity in 25 patients with a DSM-IV diagnosis of schizophrenia/schizoaffective disorder and 21 age- and gender-matched healthy controls while they participated in a face processing task, which involved viewing videos of angry and neutral facial expressions, and a non-biological baseline condition. While viewing faces, patients showed significantly weaker deactivation of the medial prefrontal cortex, including the anterior cingulate, and decreased activation in the left cerebellum, compared to controls. Patients also showed weaker medial prefrontal deactivation while viewing the angry faces relative to baseline. Given that the anterior cingulate plays a role in processing negative emotion, weaker deactivation of this region in patients while viewing faces may contribute to an increased perception of social threat. Future studies examining the neurobiology of social cognition in schizophrenia using fMRI may help establish targets for treatment interventions. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Emotion based attentional priority for storage in visual short-term memory.

    PubMed

    Simione, Luca; Calabrese, Lucia; Marucci, Francesco S; Belardinelli, Marta Olivetti; Raffone, Antonino; Maratos, Frances A

    2014-01-01

    A plethora of research demonstrates that the processing of emotional faces is prioritised over non-emotive stimuli when cognitive resources are limited (this is known as 'emotional superiority'). However, there is debate as to whether competition for processing resources results in emotional superiority per se, or more specifically, threat superiority. Therefore, to investigate prioritisation of emotional stimuli for storage in visual short-term memory (VSTM), we devised an original VSTM report procedure using schematic (angry, happy, neutral) faces in which processing competition was manipulated. In Experiment 1, display exposure time was manipulated to create competition between stimuli. Participants (n = 20) had to recall a probed stimulus from a set size of four under high (150 ms array exposure duration) and low (400 ms array exposure duration) perceptual processing competition. For the high competition condition (i.e. 150 ms exposure), results revealed an emotional superiority effect per se. In Experiment 2 (n = 20), we increased competition by manipulating set size (three versus five stimuli), whilst maintaining a constrained array exposure duration of 150 ms. Here, for the five-stimulus set size (i.e. maximal competition) only threat superiority emerged. These findings demonstrate attentional prioritisation for storage in VSTM for emotional faces. We argue that task demands modulated the availability of processing resources and consequently the relative magnitude of the emotional/threat superiority effect, with only threatening stimuli prioritised for storage in VSTM under more demanding processing conditions. Our results are discussed in light of models and theories of visual selection; they not only combine the two strands of research (i.e. visual selection and emotion), but also highlight that a critical factor in the processing of emotional stimuli is the availability of processing resources, which is further constrained by task demands.

  8. Orienting and maintenance of attention to threatening facial expressions in anxiety--an eye movement study.

    PubMed

    Holas, Pawel; Krejtz, Izabela; Cypryanska, Marzena; Nezlek, John B

    2014-12-15

    Cognitive models posit that anxiety disorders stem in part from underlying attentional biases to threat. Consistent with this, studies have found that the attentional bias to threat-related stimuli is greater in high vs. low anxious individuals. Nevertheless, it is not clear if similar biases exist for different threatening emotions or for any facial emotional stimulus. In the present study, we used eye-tracking to measure orienting and maintenance of attention to faces displaying anger, fear and disgust as threats, and faces displaying happiness and sadness. Using a free viewing task, we examined differences between low and high trait anxious (HTA) individuals in the attention they paid to each of these emotional faces (paired with a neutral face). We found that initial orienting was faster for angry and happy faces, and high trait anxious participants were more vigilant to fearful and disgust faces. Our results for attentional maintenance were not consistent. The results of the present study suggest that attentional processes may be more emotion-specific than previously believed. Our results suggest that attentional processes for different threatening emotions may not be the same and that attentional processes for some negative and some positive emotions may be similar. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and associated activation for correct no-go trials in inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of response to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.
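
    A quadratic trend of the kind reported above can be estimated by fitting a second-order polynomial to correct-inhibition rates (or activation estimates) as a function of the number of preceding go trials. The snippet below is a schematic example with invented numbers, not the study's data or design.

      import numpy as np

      # Invented example: mean correct-inhibition rate on no-go trials preceded by
      # 1, 3, 5, or 7 go trials (the real design and values come from the study).
      preceding_go = np.array([1, 3, 5, 7])
      accuracy = np.array([0.92, 0.85, 0.80, 0.81])

      # Second-order polynomial fit; the quadratic coefficient captures the
      # curvilinear ("quadratic trend") relationship described above.
      quad, lin, intercept = np.polyfit(preceding_go, accuracy, deg=2)
      print(f"quadratic={quad:.4f}, linear={lin:.4f}, intercept={intercept:.3f}")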

  10. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task.

    PubMed

    Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful for several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.

  11. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task

    PubMed Central

    Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants’ propensity to classify ambiguous faces as fearful for several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions. PMID:28151976

  12. The serotonin transporter gene polymorphism and the effect of baseline on amygdala response to emotional faces.

    PubMed

    von dem Hagen, Elisabeth A H; Passamonti, Luca; Nutland, Sarah; Sambrook, Jennifer; Calder, Andrew J

    2011-03-01

    Previous research has found that a common polymorphism in the serotonin transporter gene (5-HTTLPR) is an important mediator of individual differences in brain responses associated with emotional behaviour. In particular, relative to individuals homozygous for the l-allele, carriers of the s-allele display heightened amygdala activation to emotional compared to non-emotional stimuli. However, there is some debate as to whether this difference is driven by increased activation to emotional stimuli, resting baseline differences between the groups, or decreased activation to neutral stimuli. We performed functional imaging during an implicit facial expression processing task in which participants viewed angry, sad and neutral faces. In addition to neutral faces, we included two further baseline conditions, houses and fixation. We found increased amygdala activation in s-allele carriers relative to l-homozygotes in response to angry faces compared to neutral faces, houses and fixation. When comparing neutral faces to houses or fixation, we found no significant difference in amygdala response between the two groups. In addition, there was no significant difference between the groups in response to fixation when compared with a houses baseline. Overall, these results suggest that the increased amygdala response observed in s-allele carriers to emotional faces is primarily driven by an increased response to emotional faces rather than a decreased response to neutral faces or an increased resting baseline. The results are discussed in relation to the tonic and phasic hypotheses of 5-HTTLPR-mediated modulation of amygdala activity. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Effects of facial color on the subliminal processing of fearful faces.

    PubMed

    Nakajima, K; Minami, T; Nakauchi, S

    2015-12-03

    Recent studies have suggested that both configural information, such as face shape, and surface information is important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases: "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. On the other hand, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated that there was a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that the bluish-colored faces increased the latency effect of facial expressions compared to the natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  14. Effects of exposure to facial expression variation in face learning and recognition.

    PubMed

    Liu, Chang Hong; Chen, Wenfeng; Ward, James

    2015-11-01

    Facial expression is a major source of image variation in face images. Linking numerous expressions to the same face can be a huge challenge for face learning and recognition. It remains largely unknown what level of exposure to this image variation is critical for expression-invariant face recognition. We examined this issue in a recognition memory task, where the number of facial expressions of each face being exposed during a training session was manipulated. Faces were either trained with multiple expressions or a single expression, and they were later tested in either the same or different expressions. We found that recognition performance after learning three emotional expressions had no improvement over learning a single emotional expression (Experiments 1 and 2). However, learning three emotional expressions improved recognition compared to learning a single neutral expression (Experiment 3). These findings reveal both the limitation and the benefit of multiple exposures to variations of emotional expression in achieving expression-invariant face recognition. The transfer of expression training to a new type of expression is likely to depend on a relatively extensive level of training and a certain degree of variation across the types of expressions.

  15. Facial and semantic emotional interference: A pilot study on the behavioral and cortical responses to the dual valence association task

    PubMed Central

    2011-01-01

    Background: Integration of compatible or incompatible emotional valence and semantic information is an essential aspect of complex social interactions. A modified version of the Implicit Association Test (IAT) called the Dual Valence Association Task (DVAT) was designed in order to measure conflict resolution processing arising from the compatibility/incompatibility of semantic and facial valence. The DVAT involves two emotional valence evaluative tasks which elicit two forms of emotional compatible/incompatible associations (facial and semantic). Methods: Behavioural measures and Event Related Potentials were recorded while participants performed the DVAT. Results: Behavioural data showed a robust effect that distinguished compatible/incompatible tasks. The effects of valence and contextual association (between facial and semantic stimuli) showed early discrimination in the N170 to faces. The LPP component was modulated by the compatibility of the DVAT. Conclusions: Results suggest that the DVAT is a robust paradigm for studying the emotional interference effect in the processing of simultaneous information from semantic and facial stimuli. PMID:21489277

  16. Emotion perception, but not affect perception, is impaired with semantic memory loss

    PubMed Central

    Lindquist, Kristen A.; Gendron, Maria; Feldman Barrett, Lisa; Dickerson, Bradford C.

    2014-01-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others’ faces is inborn, pre-linguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this paper, we report findings from three patients with semantic dementia that cannot be explained by this “basic emotion” view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear and sadness. These findings have important consequences for understanding the processes supporting emotion perception. PMID:24512242

  17. Emotional content modulates response inhibition and perceptual processing.

    PubMed

    Yang, Suyong; Luo, Wenbo; Zhu, Xiangru; Broster, Lucas S; Chen, Taolin; Li, Jinzhen; Luo, Yuejia

    2014-11-01

    In this study, event-related potentials were used to investigate the effect of emotion on response inhibition. Participants performed an emotional go/no-go task that required responses to human faces associated with a "go" valence (i.e., emotional, neutral) and response inhibition to human faces associated with a "no-go" valence. Emotional content impaired response inhibition, as evidenced by decreased response accuracy and N2 amplitudes in no-go trials. More importantly, emotional expressions elicited larger N170 amplitudes than neutral expressions, and this effect was larger in no-go than in go trials, indicating that the perceptual processing of emotional expression had priority in inhibitory trials. In no-go trials, correlation analysis showed that increased N170 amplitudes were associated with decreased N2 amplitudes. Taken together, our findings suggest that emotional content impairs response inhibition due to the prioritization of emotional content processing. Copyright © 2014 Society for Psychophysiological Research.
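
    The N170/N2 correlation analysis described above can be illustrated with a small sketch: extract one mean amplitude per subject in an N170-like and an N2-like time window, then correlate the two across subjects. Everything below (the synthetic data, channel indices, and time windows) is an assumption made for illustration, not the study's actual pipeline.

    ```python
    # Illustrative sketch: per-subject mean ERP amplitudes in two windows,
    # correlated across subjects. Shapes, channels, windows, and data are toy values.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_subjects, n_trials, n_channels, n_samples = 20, 40, 16, 250
    times = np.linspace(-0.2, 0.8, n_samples)   # seconds relative to face onset
    data = rng.normal(size=(n_subjects, n_trials, n_channels, n_samples))  # microvolts (toy)

    def mean_amplitude(epochs, window, channels):
        """Mean amplitude over trials, selected channels, and samples within a time window."""
        mask = (times >= window[0]) & (times <= window[1])
        return epochs[:, channels][:, :, mask].mean()

    occipitotemporal = [14, 15]   # hypothetical channel indices (e.g., P7/P8)
    frontocentral = [2, 3]        # hypothetical channel indices (e.g., Fz/FCz)

    n170 = np.array([mean_amplitude(s, (0.15, 0.19), occipitotemporal) for s in data])
    n2 = np.array([mean_amplitude(s, (0.25, 0.35), frontocentral) for s in data])

    r, p = pearsonr(n170, n2)
    print(f"N170-N2 correlation across subjects: r = {r:.2f}, p = {p:.3f}")
    ```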

  18. Attention to Low- and High-Spatial Frequencies in Categorizing Facial Identities, Emotions and Gender in Children with Autism

    ERIC Educational Resources Information Center

    Deruelle, Christine; Rondan, Cecilie; Salle-Collemiche, Xavier; Bastard-Rosset, Delphine; Da Fonseca, David

    2008-01-01

    This study was aimed at investigating face categorization strategies in children with autistic spectrum disorders (ASD). Performance of 17 children with ASD was compared to that of 17 control children in a face-matching task, including hybrid faces (composed of two overlapping faces of different spatial bandwidths) and either low- or high-pass…

  19. Implicit and Explicit Motivational Tendencies to Faces Varying in Trustworthiness and Dominance in Men

    PubMed Central

    Radke, Sina; Kalt, Theresa; Wagels, Lisa; Derntl, Birgit

    2018-01-01

    Motivational tendencies to happy and angry faces are well-established, e.g., in the form of aggression. Approach-avoidance reactions are not only elicited by emotional expressions, but also linked to the evaluation of stable, social characteristics of faces. Grounded in the two fundamental dimensions of face-based evaluations proposed by Oosterhof and Todorov (2008), the current study tested whether emotionally neutral faces varying in trustworthiness and dominance potentiate approach-avoidance in 50 healthy male participants. Given that evaluations of social traits are influenced by testosterone, we further tested for associations of approach-avoidance tendencies with endogenous and prenatal indicators of testosterone. Computer-generated faces signaling high and low trustworthiness and dominance were used to elicit motivational reactions in three approach-avoidance tasks, i.e., one implicit and one explicit joystick-based paradigm, and an additional rating task. When participants rated their behavioral tendencies, highly trustworthy faces evoked approach, and highly dominant faces evoked avoidance. This pattern, however, did not translate to faster initiation times of corresponding approach-avoidance movements. Instead, the joystick tasks revealed general effects, such as faster reactions to faces signaling high trustworthiness or high dominance. These findings partially support the framework of Oosterhof and Todorov (2008) in guiding approach-avoidance decisions, but not behavioral tendencies. Contrary to our expectations, neither endogenous nor prenatal indicators of testosterone were associated with motivational tendencies. Future studies should investigate the contexts in which testosterone influences social motivation. PMID:29410619

  20. Electrocortical processing of social signals of threat in combat-related post-traumatic stress disorder.

    PubMed

    MacNamara, Annmarie; Post, David; Kennedy, Amy E; Rabinak, Christine A; Phan, K Luan

    2013-10-01

    Post-traumatic stress disorder (PTSD) is characterized by avoidance, emotional numbing, increased arousal and hypervigilance for threat following a trauma. Thirty-three veterans (19 with PTSD, 14 without PTSD) who had experienced combat trauma while on deployment in Iraq and/or Afghanistan completed an emotional faces matching task while electroencephalography was recorded. Vertex positive potentials (VPPs) elicited by happy, angry and fearful faces were smaller in veterans with versus without PTSD. In addition, veterans with PTSD exhibited smaller late positive potentials (LPPs) to angry faces and greater intrusive symptoms predicted smaller LPPs to fearful faces in the PTSD group. Veterans with PTSD were also less accurate at identifying angry faces, and accuracy decreased in the PTSD group as hyperarousal symptoms increased. These findings show reduced early processing of emotional faces, irrespective of valence, and blunted prolonged processing of social signals of threat in conjunction with impaired perception for angry faces in PTSD. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Sleep deprivation impairs the accurate recognition of human emotions.

    PubMed

    van der Helm, Els; Gujar, Ninad; Walker, Matthew P

    2010-03-01

    The aim was to investigate the impact of sleep deprivation on the ability to recognize the intensity of human facial emotions, in an experimental laboratory study with randomized total sleep-deprivation and sleep-rested conditions and both between-group and within-group repeated-measures analyses. Thirty-seven healthy participants (21 females) aged 18-25 y were randomly assigned to the sleep control (SC: n = 17) or total sleep deprivation (TSD: n = 20) group. Participants performed an emotional face recognition task, in which they evaluated 3 different affective face categories: Sad, Happy, and Angry, each ranging in a gradient from neutral to increasingly emotional. In the TSD group, the task was performed once under conditions of sleep deprivation, and twice under sleep-rested conditions following different durations of sleep recovery. In the SC group, the task was performed twice under sleep-rested conditions, controlling for repeatability. In the TSD group, when sleep-deprived, there was a marked and significant blunting in the recognition of Angry and Happy affective expressions in the moderate (but not extreme) emotional intensity range; these differences were most reliable and significant in female participants. No change in the recognition of Sad expressions was observed. These recognition deficits were, however, ameliorated following one night of recovery sleep. No changes in task performance were observed in the SC group. Sleep deprivation selectively impairs the accurate judgment of human facial emotions, especially threat-relevant (Anger) and reward-relevant (Happy) categories, an effect observed most significantly in females. Such findings suggest that sleep loss impairs discrete affective neural systems, disrupting the identification of salient affective social cues.

  2. Guanfacine modulates the influence of emotional cues on prefrontal cortex activation for cognitive control.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Fan, Jin; Halperin, Jeffrey M; Newcorn, Jeffrey H

    2013-03-01

    Functional interactions between limbic regions that process emotions and frontal networks that guide response functions provide a substrate for emotional cues to influence behavior. Stimulation of postsynaptic α₂ adrenoceptors enhances the function of prefrontal regions in these networks. However, the impact of this stimulation on the emotional biasing of behavior has not been established. This study tested the effect of the postsynaptic α₂ adrenoceptor agonist guanfacine on the emotional biasing of response execution and inhibition in prefrontal cortex. Fifteen healthy young adults were scanned twice with functional magnetic resonance imaging while performing a face emotion go/no-go task following counterbalanced administration of single doses of oral guanfacine (1 mg) and placebo in a double-blind, cross-over design. Lower perceptual sensitivity and less response bias for sad faces resulted in fewer correct responses compared to happy and neutral faces but had no effect on correct inhibitions. Guanfacine increased the sensitivity and bias selectively for sad faces, resulting in response accuracy comparable to happy and neutral faces, and reversed the valence-dependent variation in response-related activation in left dorsolateral prefrontal cortex (DLPFC), resulting in enhanced activation for response execution cued by sad faces relative to happy and neutral faces, in line with other frontoparietal regions. These results provide evidence that guanfacine stimulation of postsynaptic α₂ adrenoceptors moderates DLPFC activation associated with the emotional biasing of response execution processes. The findings have implications for the α₂ adrenoceptor agonist treatment of attention-deficit hyperactivity disorder.

  3. Emotion Processing in Parkinson’s Disease: A Three-Level Study on Recognition, Representation, and Regulation

    PubMed Central

    Enrici, Ivan; Adenzato, Mauro; Ardito, Rita B.; Mitkova, Antonia; Cavallo, Marco; Zibetti, Maurizio; Lopiano, Leonardo; Castelli, Lorys

    2015-01-01

    Background Parkinson’s disease (PD) is characterised by well-known motor symptoms, whereas the presence of cognitive non-motor symptoms, such as emotional disturbances, is still underestimated. One of the major problems in studying emotion deficits in PD is an atomising approach that does not take into account different levels of emotion elaboration. Our study addressed the question of whether people with PD exhibit difficulties in one or more specific dimensions of emotion processing, investigating three different levels of analysis: recognition, representation, and regulation. Methodology Thirty-two consecutive medicated patients with PD and 25 healthy controls were enrolled in the study. Participants completed a three-level assessment of emotional processing using quantitative standardised emotional tasks: the Ekman 60-Faces for emotion recognition, the full 36-item version of the Reading the Mind in the Eyes (RME) for emotion representation, and the 20-item Toronto Alexithymia Scale (TAS-20) for emotion regulation. Principal Findings Regarding emotion recognition, patients obtained significantly worse scores than controls on the Ekman 60-Faces total score but not on any individual basic emotion. For emotion representation, patients obtained significantly worse scores than controls on the RME experimental score but not on the RME gender control task. Finally, on emotion regulation, PD patients and controls did not perform differently on the TAS-20, and no specific differences were found on the TAS-20 subscales. The PD impairments in emotion recognition and representation did not correlate with dopamine therapy, disease severity, or duration of illness. These results are independent of other cognitive processes, such as global cognitive status and executive function, and of psychiatric status, such as depression, anxiety, or apathy. Conclusions These results may contribute to a better understanding of the emotional problems that are often seen in patients with PD and of the measures used to test these problems, in particular regarding the use of different versions of the RME task. PMID:26110271

  4. How do schizophrenia patients use visual information to decode facial emotion?

    PubMed

    Lee, Junghee; Gosselin, Frédéric; Wynn, Jonathan K; Green, Michael F

    2011-09-01

    Impairment in recognizing facial emotions is a prominent feature of schizophrenia patients, but the underlying mechanism of this impairment remains unclear. This study investigated the specific aspects of visual information that are critical for schizophrenia patients to recognize emotional expression. Using the Bubbles technique, we probed the use of visual information during a facial emotion discrimination task (fear vs. happy) in 21 schizophrenia patients and 17 healthy controls. Visual information was sampled through randomly located Gaussian apertures (or "bubbles") at 5 spatial frequency scales. Online calibration of the amount of face exposed through bubbles was used to ensure 75% overall accuracy for each subject. Least-square multiple linear regression analyses between sampled information and accuracy were performed to identify critical visual information that was used to identify emotional expression. To accurately identify emotional expression, schizophrenia patients required more exposure of facial areas (i.e., more bubbles) compared with healthy controls. To identify fearful faces, schizophrenia patients relied less on bilateral eye regions at high-spatial frequency compared with healthy controls. For identification of happy faces, schizophrenia patients relied on the mouth and eye regions; healthy controls did not utilize eyes and used the mouth much less than patients did. Schizophrenia patients needed more facial information to recognize emotional expression of faces. In addition, patients differed from controls in their use of high-spatial frequency information from eye regions to identify fearful faces. This study provides direct evidence that schizophrenia patients employ an atypical strategy of using visual information to recognize emotional faces.
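
    For readers unfamiliar with the Bubbles technique mentioned above, the sketch below illustrates the core idea at a single spatial scale (the study sampled five spatial frequency scales): a face is revealed only through randomly located Gaussian apertures, and accuracy can later be regressed on the aperture locations to identify diagnostic regions. The image size, aperture width, and bubble count are illustrative assumptions.

    ```python
    # Simplified "Bubbles"-style sampling at one spatial scale; all values are toy.
    import numpy as np

    rng = np.random.default_rng(1)

    def bubble_mask(shape, n_bubbles, sigma):
        """Sum of 2D Gaussian apertures at random locations, clipped to [0, 1]."""
        h, w = shape
        yy, xx = np.mgrid[0:h, 0:w]
        mask = np.zeros(shape)
        for _ in range(n_bubbles):
            cy, cx = rng.integers(0, h), rng.integers(0, w)
            mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
        return np.clip(mask, 0.0, 1.0)

    face = rng.random((128, 128))                         # stand-in for a grayscale face image
    mask = bubble_mask(face.shape, n_bubbles=15, sigma=8.0)
    stimulus = mask * face + (1.0 - mask) * face.mean()   # unsampled regions fade to mid-gray
    ```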

  5. Early visual ERPs are influenced by individual emotional skills.

    PubMed

    Meaux, Emilie; Roux, Sylvie; Batty, Magali

    2014-08-01

    Processing information from faces is crucial to understanding others and to adapting to social life. Many studies have investigated responses to facial emotions to provide a better understanding of the processes and the neural networks involved. Moreover, several studies have revealed abnormalities of emotional face processing and their neural correlates in affective disorders. The aim of this study was to investigate whether early visual event-related potentials (ERPs) are affected by the emotional skills of healthy adults. Unfamiliar faces expressing the six basic emotions were presented to 28 young adults while recording visual ERPs. No specific task was required during the recording. Participants also completed the Social Skills Inventory (SSI) which measures social and emotional skills. The results confirmed that early visual ERPs (P1, N170) are affected by the emotions expressed by a face and also demonstrated that N170 and P2 are correlated to the emotional skills of healthy subjects. While N170 is sensitive to the subject's emotional sensitivity and expressivity, P2 is modulated by the ability of the subjects to control their emotions. We therefore suggest that N170 and P2 could be used as individual markers to assess strengths and weaknesses in emotional areas and could provide information for further investigations of affective disorders. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  6. Individual differences in bodily freezing predict emotional biases in decision making

    PubMed Central

    Ly, Verena; Huys, Quentin J. M.; Stins, John F.; Roelofs, Karin; Cools, Roshan

    2014-01-01

    Instrumental decision making has long been argued to be vulnerable to emotional responses. Literature on multiple decision making systems suggests that this emotional biasing might reflect effects of a system that regulates innately specified, evolutionarily preprogrammed responses. To test this hypothesis directly, we investigated whether effects of emotional faces on instrumental action can be predicted by effects of emotional faces on bodily freezing, an innately specified response to aversive relative to appetitive cues. We tested 43 women using a novel emotional decision making task combined with posturography, which involves a force platform to detect small oscillations of the body to accurately quantify postural control in upright stance. On the platform, participants learned whole body approach-avoidance actions based on monetary feedback, while being primed by emotional faces (angry/happy). Our data evidence an emotional biasing of instrumental action. Thus, angry relative to happy faces slowed instrumental approach relative to avoidance responses. Critically, individual differences in this emotional biasing effect were predicted by individual differences in bodily freezing. This result suggests that emotional biasing of instrumental action involves interaction with a system that controls innately specified responses. Furthermore, our findings help bridge (animal and human) decision making and emotion research to advance our mechanistic understanding of decision making anomalies in daily encounters as well as in a wide range of psychopathology. PMID:25071491

  7. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: a fixation-to-feature approach

    PubMed Central

    Neath-Tavares, Karly N.; Itier, Roxane J.

    2017-01-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp. 1) and an oddball detection (Exp. 2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100–120 ms occipitally, while responses to fearful expressions started around 150 ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350 ms. PMID:27430934

  8. Crossmodal adaptation in right posterior superior temporal sulcus during face-voice emotional integration.

    PubMed

    Watson, Rebecca; Latinus, Marianne; Noguchi, Takao; Garrod, Oliver; Crabbe, Frances; Belin, Pascal

    2014-05-14

    The integration of emotional information from the face and voice of other persons is known to be mediated by a number of "multisensory" cerebral regions, such as the right posterior superior temporal sulcus (pSTS). However, whether multimodal integration in these regions is attributable to interleaved populations of unisensory neurons responding to face or voice, or rather to multimodal neurons receiving input from both modalities, is not fully clear. Here, we examine this question using functional magnetic resonance adaptation and dynamic audiovisual stimuli in which emotional information was manipulated parametrically and independently in the face and voice via morphing between angry and happy expressions. Healthy human adult subjects were scanned while performing a happy/angry emotion categorization task on a series of such stimuli included in a fast event-related, continuous carryover design. Subjects integrated both face and voice information when categorizing emotion, although there was a greater weighting of face information, and showed behavioral adaptation effects both within and across modality. Adaptation also occurred at the neural level: in addition to modality-specific adaptation in visual and auditory cortices, we observed for the first time a crossmodal adaptation effect. Specifically, fMRI signal in the right pSTS was reduced in response to a stimulus in which facial emotion was similar to the vocal emotion of the preceding stimulus. These results suggest that the integration of emotional information from face and voice in the pSTS involves a detectable proportion of bimodal neurons that combine inputs from visual and auditory cortices. Copyright © 2014 the authors.

  9. Unattended facial expressions asymmetrically bias the concurrent processing of nonemotional information.

    PubMed

    Maxwell, Jeffrey S; Shackman, Alexander J; Davidson, Richard J

    2005-09-01

    Planned and reflexive behaviors often occur in the presence of emotional stimuli and within the context of an individual's acute emotional state. Therefore, determining the manner in which emotion and attention interact is an important step toward understanding how we function in the real world. Participants in the current investigation viewed centrally displayed, task-irrelevant, face distractors (angry, neutral, happy) while performing a lateralized go/no-go continuous performance task. Lateralized go targets and no-go lures that did not spatially overlap with the faces were employed to differentially probe processing in the left (LH) and right (RH) cerebral hemispheres. There was a significant interaction between expression and hemisphere, with an overall pattern such that angry distractors were associated with relatively more RH inhibitory errors than neutral or happy distractors and happy distractors with relatively more LH inhibitory errors than angry or neutral distractors. Simple effects analyses confirmed that angry faces differentially interfered with RH relative to LH inhibition and with inhibition in the RH relative to happy faces. A significant three-way interaction further revealed that state anxiety moderated relations between emotional expression and hemisphere. Under conditions of low cognitive load, more intense anxiety was associated with relatively greater RH than LH impairment in the presence of both happy and threatening distractors. By contrast, under high load, only angry distractors produced greater RH than LH interference as a function of anxiety.

  10. Superior Recognition Performance for Happy Masked and Unmasked Faces in Both Younger and Older Adults

    PubMed Central

    Svärd, Joakim; Wiens, Stefan; Fischer, Håkan

    2012-01-01

    In the aging literature it has been shown that even though emotion recognition performance decreases with age, the decrease is smaller for happiness than for other facial expressions. Studies in younger adults have also revealed that happy faces are more strongly attended to and better recognized than other emotional facial expressions. Thus, there might be a largely age-independent happy face advantage in facial expression recognition. By using a backward masking paradigm and varying stimulus onset asynchronies (17–267 ms), the temporal development of a happy face advantage, on a continuum from low to high levels of visibility, was examined in younger and older adults. Results showed that across age groups, recognition performance for happy faces was better than for neutral and fearful faces at durations longer than 50 ms. Importantly, the results showed a happy face advantage already during early processing of emotional faces in both younger and older adults. This advantage is discussed in terms of processing of salient perceptual features and elaborative processing of the happy face. We also investigated the combined effect of age and neuroticism on emotional face processing. The rationale was based on previous findings of age-related differences in physiological arousal to emotional pictures and a relation between arousal and neuroticism. Across all durations, there was an interaction between age and neuroticism, showing that being high in neuroticism might be disadvantageous for younger, but not older, adults’ emotion recognition performance during arousal-enhancing tasks. These results indicate that there is a relation between aging, neuroticism, and performance, potentially related to physiological arousal. PMID:23226135

  11. Visual body recognition in a prosopagnosic patient.

    PubMed

    Moro, V; Pernigo, S; Avesani, R; Bulgarelli, C; Urgesi, C; Candidi, M; Aglioti, S M

    2012-01-01

    Conspicuous deficits in face recognition characterize prosopagnosia. Information on whether agnosic deficits may extend to non-facial body parts is lacking. Here we report the neuropsychological description of FM, a patient affected by a complete deficit in face recognition in the presence of mild clinical signs of visual object agnosia. His deficit involves both overt and covert recognition of faces (i.e. recognition of familiar faces, but also categorization of faces for gender or age) as well as the visual mental imagery of faces. By means of a series of matching-to-sample tasks we investigated: (i) a possible association between prosopagnosia and disorders in visual body perception; (ii) the effect of the emotional content of stimuli on the visual discrimination of faces, bodies and objects; (iii) the existence of a dissociation between identity recognition and the emotional discrimination of faces and bodies. Our results document, for the first time, the co-occurrence of body agnosia, i.e. the visual inability to discriminate body forms and body actions, and prosopagnosia. Moreover, the results show better performance in the discrimination of emotional face and body expressions with respect to body identity and neutral actions. Since FM's lesions involve bilateral fusiform areas, it is unlikely that the amygdala-temporal projections explain the relative sparing of emotion discrimination performance. Indeed, the emotional content of the stimuli did not improve the discrimination of their identity. The results hint at the existence of two segregated brain networks involved in identity and emotional discrimination that are at least partially shared by face and body processing. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Emotion Words, Regardless of Polarity, Have a Processing Advantage over Neutral Words

    ERIC Educational Resources Information Center

    Kousta, Stavroula-Thaleia; Vinson, David P.; Vigliocco, Gabriella

    2009-01-01

    Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat.…

  13. "Who Said That?" Matching of Low- and High-Intensity Emotional Prosody to Facial Expressions by Adolescents with ASD

    ERIC Educational Resources Information Center

    Grossman, Ruth B.; Tager-Flusberg, Helen

    2012-01-01

    Data on emotion processing by individuals with ASD suggest both intact abilities and significant deficits. Signal intensity may be a contributing factor to this discrepancy. We presented low- and high-intensity emotional stimuli in a face-voice matching task to 22 adolescents with ASD and 22 typically developing (TD) peers. Participants heard…

  14. Feedback from the heart: Emotional learning and memory is controlled by cardiac cycle, interoceptive accuracy and personality.

    PubMed

    Pfeifer, Gaby; Garfinkel, Sarah N; Gould van Praag, Cassandra D; Sahota, Kuljit; Betka, Sophie; Critchley, Hugo D

    2017-05-01

    Feedback processing is critical to trial-and-error learning. Here, we examined whether interoceptive signals concerning the state of cardiovascular arousal influence the processing of reinforcing feedback during the learning of 'emotional' face-name pairs, with subsequent effects on retrieval. Participants (N=29) engaged in a learning task of face-name pairs (fearful, neutral, happy faces). Correct and incorrect learning decisions were reinforced by auditory feedback, which was delivered either at cardiac systole (on the heartbeat, when baroreceptors signal the contraction of the heart to the brain), or at diastole (between heartbeats during baroreceptor quiescence). We discovered a cardiac influence on feedback processing that enhanced the learning of fearful faces in people with heightened interoceptive ability. Individuals with enhanced accuracy on a heartbeat counting task learned fearful face-name pairs better when feedback was given at systole than at diastole. This effect was not present for neutral and happy faces. At retrieval, we also observed related effects of personality: First, individuals scoring higher for extraversion showed poorer retrieval accuracy. These individuals additionally manifested lower resting heart rate and lower state anxiety, suggesting that attenuated levels of cardiovascular arousal in extraverts underlies poorer performance. Second, higher extraversion scores predicted higher emotional intensity ratings of fearful faces reinforced at systole. Third, individuals scoring higher for neuroticism showed higher retrieval confidence for fearful faces reinforced at diastole. Our results show that cardiac signals shape feedback processing to influence learning of fearful faces, an effect underpinned by personality differences linked to psychophysiological arousal. Copyright © 2017 Elsevier B.V. All rights reserved.
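
    Time-locking feedback to systole or diastole, as in the record above, requires detecting the ECG R-wave and scheduling stimulus onsets at a fixed delay from it. The sketch below shows a simple threshold-based detector on synthetic data; the sampling rate and the example systole/diastole offsets are placeholders, not values reported in the study.

    ```python
    # Toy R-peak detection and cardiac-phase-locked onset scheduling (assumptions throughout).
    import numpy as np

    fs = 1000                                 # sampling rate in Hz (assumed)
    rng = np.random.default_rng(2)
    ecg = rng.normal(0, 0.05, fs * 10)        # 10 s of synthetic baseline noise
    ecg[::900] += 1.0                         # toy R-peaks roughly every 0.9 s

    def detect_r_peaks(signal, fs, threshold=0.5, refractory=0.3):
        """Indices where the signal crosses a threshold, ignoring crossings
        within a refractory period (s) of the previous detected peak."""
        peaks, last = [], -np.inf
        for i, v in enumerate(signal):
            if v > threshold and (i - last) / fs > refractory:
                peaks.append(i)
                last = i
        return np.array(peaks)

    r_peaks = detect_r_peaks(ecg, fs)
    systole_onsets = r_peaks / fs + 0.30      # e.g., ~300 ms after the R-wave (assumed offset)
    diastole_onsets = r_peaks / fs + 0.55     # later in the cardiac cycle (assumed offset)
    ```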

  15. Variation in White Matter Connectivity Predicts the Ability to Remember Faces and Discriminate Their Emotions

    PubMed Central

    Unger, Ashley; Alm, Kylie H.; Collins, Jessica A.; O’Leary, Jacqueline M.; Olson, Ingrid R.

    2017-01-01

    Objective The extended face network contains clusters of neurons that perform distinct functions on facial stimuli. Regions in the posterior ventral visual stream appear to perform basic perceptual functions on faces, while more anterior regions, such as the ventral anterior temporal lobe and amygdala, function to link mnemonic and affective information to faces. Anterior and posterior regions are interconnected by long-range white matter tracts; however, it is not known whether variation in the connectivity of these pathways explains cognitive performance. Methods Here, we used diffusion imaging and deterministic tractography in a cohort of 28 neurologically normal adults aged 18–28 to examine microstructural properties of visual fiber pathways and their relationship to certain mnemonic and affective functions involved in face processing. We investigated how inter-individual variability in two tracts, the inferior longitudinal fasciculus (ILF) and the inferior fronto-occipital fasciculus (IFOF), related to performance on tests of facial emotion recognition and face memory. Results Microstructure of both tracts predicted variability in behavioral performance on both tasks, suggesting that the ILF and IFOF play a role in facilitating our ability to discriminate emotional expressions in faces, as well as to remember unique faces. Variation in a control tract, the uncinate fasciculus, did not predict performance on these tasks. Conclusions These results corroborate and extend the findings of previous neuropsychology studies investigating the effects of damage to the ILF and IFOF, and demonstrate that differences in face processing abilities are related to white matter microstructure, even in healthy individuals. PMID:26888615

  16. Following the time course of face gender and expression processing: a task-dependent ERP study.

    PubMed

    Valdés-Conroy, Berenice; Aguado, Luis; Fernández-Cahill, María; Romero-Ferreiro, Verónica; Diéguez-Risco, Teresa

    2014-05-01

    The effects of task demands and the interaction between gender and expression in face perception were studied using event-related potentials (ERPs). Participants performed three different tasks with male and female faces that were emotionally inexpressive or that showed happy or angry expressions. In two of the tasks (gender and expression categorization) facial properties were task-relevant, while in a third task (symbol discrimination) facial information was irrelevant. Effects of expression were observed on the visual P100 component under all task conditions, suggesting the operation of an automatic process that is not influenced by task demands. The earliest interaction between expression and gender was observed later, in the face-sensitive N170 component. This component showed differential modulations by specific combinations of gender and expression (e.g., angry male vs. angry female faces). Main effects of expression and task were observed in a later occipito-temporal component peaking around 230 ms post-stimulus onset (EPN, or early posterior negativity); amplitudes were less positive in the presence of angry faces and during performance of the gender and expression tasks. Finally, task demands also modulated a positive component peaking around 400 ms (LPC, or late positive complex) that showed enhanced amplitude for the gender task. The pattern of results obtained here adds new evidence about the sequence of operations involved in face processing and the interaction of facial properties (gender and expression) in response to different task demands. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    PubMed

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.
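
    The "50% disgusted - 50% happy" stimuli described above are morphs between two expression photographs of the same poser. As a rough illustration of the blending step only, the sketch below mixes two images pixel-wise; genuine morphing software additionally warps facial geometry between corresponding landmarks, and the arrays here are synthetic stand-ins for face photographs.

    ```python
    # Intensity-only blend between two (toy) grayscale expression images.
    import numpy as np

    def blend(img_a, img_b, alpha):
        """Pixel-wise mix of two grayscale images: alpha = 0 -> img_a, alpha = 1 -> img_b."""
        mixed = (1.0 - alpha) * img_a.astype(float) + alpha * img_b.astype(float)
        return np.clip(mixed, 0, 255).astype(np.uint8)

    rng = np.random.default_rng(4)
    disgusted = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in image
    happy = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)      # stand-in image

    morph_50 = blend(disgusted, happy, alpha=0.5)   # the 50% disgusted - 50% happy level
    ```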

  18. Judgments of auditory-visual affective congruence in adolescents with and without autism: a pilot study of a new task using fMRI.

    PubMed

    Loveland, Katherine A; Steinberg, Joel L; Pearson, Deborah A; Mansour, Rosleen; Reddoch, Stacy

    2008-10-01

    One of the most widely reported developmental deficits associated with autism is difficulty perceiving and expressing emotion appropriately. Brain activation was examined during performance of a new task, the Emotional Congruence Task, which requires judging the affective congruence of facial expression and voice, as compared with their sex congruence. Participants in this pilot study were adolescents with normal IQ, with (n = 5) or without (n = 4) autism. In the emotional congruence condition, as compared to the sex congruence of voice and face, controls had significantly more activation than the Autism group in the orbitofrontal cortex, the superior temporal, parahippocampal, and posterior cingulate gyri, and occipital regions. Unlike controls, the Autism group did not have significantly greater prefrontal activation during the emotional congruence condition, but did during the sex congruence condition. Results indicate that the Emotional Congruence Task can be used successfully to assess brain activation and behavior associated with the integration of auditory and visual information for emotion. While the numbers in the groups are small, the results suggest that brain activity while performing the Emotional Congruence Task differed between adolescents with and without autism in fronto-limbic areas and in the superior temporal region. These findings must be confirmed using larger samples of participants.

  19. Effects of oxycodone on brain responses to emotional images.

    PubMed

    Wardle, Margaret C; Fitzgerald, Daniel A; Angstadt, Michael; Rabinak, Christine A; de Wit, Harriet; Phan, K Luan

    2014-11-01

    Evidence from animal and human studies suggests that opiate drugs decrease emotional responses to negative stimuli and increase responses to positive stimuli. Such emotional effects may motivate misuse of oxycodone (OXY), a widely abused opiate. Yet, we know little about how OXY affects neural circuits underlying emotional processing in humans. We examined effects of OXY on brain activity during presentation of positive and negative visual emotional stimuli. We predicted that OXY would decrease amygdala activity to negative stimuli and increase ventral striatum (VS) activity to positive stimuli. Secondarily, we examined the effects of OXY on other emotional network regions on an exploratory basis. In a three-session study, healthy adults (N = 17) received placebo, 10 and 20 mg OXY under counterbalanced, double-blind conditions. At each session, participants completed subjective and cardiovascular measures and underwent functional MRI (fMRI) scanning while completing two emotional response tasks. Our emotional tasks reliably activated emotional network areas. OXY produced subjective effects but did not alter either behavioral responses to emotional stimuli or activity in our primary areas of interest. OXY did decrease right medial orbitofrontal cortex (MOFC) responses to happy faces. Contrary to our expectations, OXY did not affect behavioral or neural responses to emotional stimuli in our primary areas of interest. Further, the effects of OXY in the MOFC would be more consistent with a decrease in value for happy faces. This may indicate that healthy adults do not receive emotional benefits from opiates, or the pharmacological actions of OXY differ from other opiates.

  20. A Drawing Task to Assess Emotion Inference in Language-Impaired Children.

    PubMed

    Vendeville, Nathalie; Blanc, Nathalie; Brechet, Claire

    2015-10-01

    Studies investigating the ability of children with language impairment (LI) to infer emotions rely on verbal responses (which can be challenging for these children) and/or the selection of a card representing an emotion (which limits the response range). In contrast, a drawing task might allow a broad spectrum of responses without involving language. This study used a drawing task to compare the ability to make emotional inferences in children with and without LI. Twenty-two children with LI and 22 typically developing children ages 6 to 10 years were assessed in school during 3 sessions. They were asked to listen to audio stories. At specific moments, the experimenter stopped the recording and asked children to complete the drawing of a face to depict the emotion felt by the story's character. Three adult study-blind judges were subsequently asked to evaluate the expressiveness of the drawings. Children with LI had more difficulty than typically developing children making emotional inferences. Children with LI also made more errors of different valence than their typically developing peers. Our findings confirm that children with LI show difficulty in producing emotional inferences, even when performing a drawing task--a relatively language-free response mode.

  1. Fusiform Gyrus Dysfunction is Associated with Perceptual Processing Efficiency to Emotional Faces in Adolescent Depression: A Model-Based Approach.

    PubMed

    Ho, Tiffany C; Zhang, Shunan; Sacchet, Matthew D; Weng, Helen; Connolly, Colm G; Henje Blom, Eva; Han, Laura K M; Mobayed, Nisreen O; Yang, Tony T

    2016-01-01

    While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what are the underlying neural correlates. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., drift rate parameter of the LBA), the MDD group showed significantly reduced responses in left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.

  2. Fusiform Gyrus Dysfunction is Associated with Perceptual Processing Efficiency to Emotional Faces in Adolescent Depression: A Model-Based Approach

    PubMed Central

    Ho, Tiffany C.; Zhang, Shunan; Sacchet, Matthew D.; Weng, Helen; Connolly, Colm G.; Henje Blom, Eva; Han, Laura K. M.; Mobayed, Nisreen O.; Yang, Tony T.

    2016-01-01

    While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what are the underlying neural correlates. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., drift rate parameter of the LBA), the MDD group showed significantly reduced responses in left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed. PMID:26869950
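
    The Linear Ballistic Accumulator model used in the two records above treats each response option as a racer that starts from a uniformly sampled point, accumulates evidence at a normally distributed drift rate, and responds once it reaches a threshold; perceptual processing efficiency corresponds to the drift rate. The simulation sketch below uses arbitrary illustrative parameters, not estimates from the study.

    ```python
    # Minimal two-accumulator LBA simulation with arbitrary parameter values.
    import numpy as np

    rng = np.random.default_rng(3)

    def simulate_lba(n_trials, v, A=0.5, b=1.0, s=0.25, t0=0.2):
        """Simulate choices and RTs from a Linear Ballistic Accumulator.

        v  : mean drift rates, one per response accumulator
        A  : upper bound of the uniform start-point distribution
        b  : response threshold
        s  : between-trial standard deviation of the drift rates
        t0 : non-decision time in seconds
        """
        v = np.asarray(v, dtype=float)
        starts = rng.uniform(0, A, size=(n_trials, v.size))
        drifts = rng.normal(v, s, size=(n_trials, v.size))
        drifts[drifts <= 0] = np.nan              # a non-positive drift never reaches threshold
        finish = (b - starts) / drifts + t0       # finishing time of each accumulator
        choices = np.nanargmin(finish, axis=1)    # fastest accumulator determines the response
        rts = np.nanmin(finish, axis=1)           # (trials where every drift is negative are
        return choices, rts                       #  vanishingly rare with these parameters)

    choices, rts = simulate_lba(5000, v=[1.2, 0.8])   # e.g., two emotion-response accumulators
    print(f"P(choice 0) = {np.mean(choices == 0):.2f}, mean RT = {np.nanmean(rts):.3f} s")
    ```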

  3. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces.

    PubMed

    Javanbakht, Arash; King, Anthony P; Evans, Gary W; Swain, James E; Angstadt, Michael; Phan, K Luan; Liberzon, Israel

    2015-01-01

    Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study, recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task. Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and medial prefrontal cortical (mPFC) responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  4. Age-related differences in brain activity during implicit and explicit processing of fearful facial expressions.

    PubMed

    Zsoldos, Isabella; Cousin, Emilie; Klein-Koerkamp, Yanica; Pichat, Cédric; Hot, Pascal

    2016-11-01

    Age-related differences in neural correlates underlying implicit and explicit emotion processing are unclear. Within the framework of the Frontoamygdalar Age-related Differences in Emotion model (St Jacques et al., 2009), our objectives were to examine the behavioral and neural modifications that occur with age for both processes. During explicit and implicit processing of fearful faces, we expected to observe less amygdala activity in older adults (OA) than in younger adults (YA), associated with poorer recognition performance in the explicit task, and more frontal activity during implicit processing, suggesting compensation. At a behavioral level, explicit recognition of fearful faces was impaired in OA compared with YA. We did not observe any cerebral differences between OA and YA during the implicit task, whereas in the explicit task, OA recruited more frontal, parietal, temporal, occipital, and cingulate areas. Our findings suggest that automatic processing of emotion may be preserved during aging, whereas deliberate processing is impaired. Additional neural recruitment in OA did not appear to compensate for their behavioral deficits. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Facial Emotion Recognition in Child Psychiatry: A Systematic Review

    ERIC Educational Resources Information Center

    Collin, Lisa; Bindra, Jasmeet; Raju, Monika; Gillberg, Christopher; Minnis, Helen

    2013-01-01

    This review focuses on facial affect (emotion) recognition in children and adolescents with psychiatric disorders other than autism. A systematic search, using PRISMA guidelines, was conducted to identify original articles published prior to October 2011 pertaining to face recognition tasks in case-control studies. Used in the qualitative…

  6. Acute glucocorticoid effects on response inhibition in borderline personality disorder.

    PubMed

    Carvalho Fernando, Silvia; Beblo, Thomas; Schlosser, Nicole; Terfehr, Kirsten; Wolf, Oliver Tobias; Otte, Christian; Löwe, Bernd; Spitzer, Carsten; Driessen, Martin; Wingenfeld, Katja

    2013-11-01

    Growing evidence suggests inhibition dysfunctions in borderline personality disorder (BPD). Moreover, abnormalities in hypothalamic-pituitary-adrenal (HPA) axis functioning have also been found in BPD patients. In healthy individuals, response inhibition has been shown to be sensitive to acute stress, and previous research indicates that effects mediated by the HPA axis become particularly apparent when emotional stimuli are processed. This study aimed to explore the influence of acute hydrocortisone administration on response inhibition of emotional stimuli in BPD patients compared to healthy control participants. After a single administration of 10 mg hydrocortisone or placebo, 32 female BPD patients and 32 healthy female participants performed an adapted emotional go/no-go paradigm to assess response inhibition for emotional face stimuli in a cross-over study. Acute cortisol elevations decreased the reaction times to target stimuli in both BPD patients and healthy controls. Patients and controls did not differ in task performance; however, BPD patients with comorbid posttraumatic stress disorder (PTSD) displayed longer reaction times than patients without PTSD. In contrast, the occurrence of a comorbid eating disorder had no significant impact on go/no-go performance. No significant interaction effect between the treatment condition and the emotional valence of the face stimuli was found. Acute hydrocortisone administration enhances response inhibition of face stimuli in BPD patients and healthy controls, regardless of their emotional valence. Our results agree with the suggestion that moderate cortisol enhancement increases the inhibition of task-irrelevant distracters. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Neural bases of different cognitive strategies for facial affect processing in schizophrenia.

    PubMed

    Fakra, Eric; Salgado-Pineda, Pilar; Delaveau, Pauline; Hariri, Ahmad R; Blin, Olivier

    2008-03-01

    The aim was to examine the neural basis and dynamics of facial affect processing in schizophrenic patients as compared to healthy controls. Fourteen schizophrenic patients and fourteen matched controls performed a facial affect identification task during fMRI acquisition. The emotional task included an intuitive emotional condition (matching emotional faces) and a more cognitively demanding condition (labeling emotional faces). Individual analyses for each emotional condition, and second-level t-tests examining both within- and between-group differences, were carried out using a random effects approach. Psychophysiological interactions (PPI) were tested for variations in functional connectivity between the amygdala and other brain regions as a function of changes in experimental conditions (labeling versus matching). During the labeling condition, both groups engaged similar networks. During the matching condition, schizophrenic patients failed to activate regions of the limbic system implicated in the automatic processing of emotions. PPI revealed an inverse functional connectivity between prefrontal regions and the left amygdala in healthy volunteers, but there was no such change in patients. Furthermore, during the matching condition, and compared to controls, patients showed decreased activation of regions involved in holistic face processing (fusiform gyrus) and increased activation of regions associated with feature analysis (inferior parietal cortex, left middle temporal lobe, right precuneus). Our findings suggest that schizophrenic patients invariably adopt a cognitive approach when identifying facial affect. The distributed neocortical network observed during the intuitive condition indicates that patients may resort to feature-based, rather than configuration-based, processing, which may constitute a compensatory strategy for limbic dysfunction.

  8. The Face-to-Face Light Detection Paradigm: A New Methodology for Investigating Visuospatial Attention Across Different Face Regions in Live Face-to-Face Communication Settings.

    PubMed

    Thompson, Laura A; Malloy, Daniel M; Cone, John M; Hendrickson, David L

    2010-01-01

    We introduce a novel paradigm for studying the cognitive processes used by listeners within interactive settings. This paradigm places the talker and the listener in the same physical space, creating opportunities for investigations of attention and comprehension processes taking place during interactive discourse situations. An experiment was conducted to compare results from previous research using videotaped stimuli to those obtained within the live face-to-face task paradigm. A headworn apparatus is used to briefly display LEDs on the talker's face in four locations as the talker communicates with the participant. In addition to the primary task of comprehending speeches, participants make a secondary task light detection response. In the present experiment, the talker gave non-emotionally-expressive speeches that were used in past research with videotaped stimuli. Signal detection analysis was employed to determine which areas of the face received the greatest focus of attention. Results replicate previous findings using videotaped methods.

  9. The Face-to-Face Light Detection Paradigm: A New Methodology for Investigating Visuospatial Attention Across Different Face Regions in Live Face-to-Face Communication Settings

    PubMed Central

    Thompson, Laura A.; Malloy, Daniel M.; Cone, John M.; Hendrickson, David L.

    2009-01-01

    We introduce a novel paradigm for studying the cognitive processes used by listeners within interactive settings. This paradigm places the talker and the listener in the same physical space, creating opportunities for investigations of attention and comprehension processes taking place during interactive discourse situations. An experiment was conducted to compare results from previous research using videotaped stimuli to those obtained within the live face-to-face task paradigm. A headworn apparatus is used to briefly display LEDs on the talker’s face in four locations as the talker communicates with the participant. In addition to the primary task of comprehending speeches, participants make a secondary task light detection response. In the present experiment, the talker gave non-emotionally-expressive speeches that were used in past research with videotaped stimuli. Signal detection analysis was employed to determine which areas of the face received the greatest focus of attention. Results replicate previous findings using videotaped methods. PMID:21113354
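
    The signal detection analysis mentioned in the two records above amounts to comparing hit and false-alarm rates for light detection at each face region. A minimal sketch, with invented counts and region labels, is shown below.

    ```python
    # d' per face region from toy hit/false-alarm counts (invented for illustration).
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """d' with a log-linear (add 0.5) correction to avoid infinite z-scores."""
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # region: (hits, misses, false alarms, correct rejections) -- toy counts
    regions = {
        "eyes": (38, 2, 5, 35),
        "mouth": (30, 10, 6, 34),
        "forehead": (22, 18, 7, 33),
        "chin": (20, 20, 8, 32),
    }

    for region, counts in regions.items():
        print(f"{region:>9}: d' = {d_prime(*counts):.2f}")
    ```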

  10. Independent effects of reward expectation and spatial orientation on the processing of emotional facial expressions.

    PubMed

    Kang, Guanlan; Zhou, Xiaolin; Wei, Ping

    2015-09-01

    The present study investigated the effect of reward expectation and spatial orientation on the processing of emotional facial expressions, using a spatial cue-target paradigm. A colored cue was presented at the left or right side of the central fixation point, with its color indicating the monetary reward stakes of a given trial (incentive vs. non-incentive), followed by the presentation of an emotional facial target (angry vs. neutral) at a cued or un-cued location. Participants were asked to discriminate the emotional expression of the target, with the cue-target stimulus onset asynchrony being 200-300 ms in Experiment 1 and 950-1250 ms in Experiment 2a (without a fixation cue) and Experiment 2b (with a fixation cue), producing a spatial facilitation effect and an inhibition of return effect, respectively. The results of all the experiments revealed faster reaction times in the monetary incentive condition than in the non-incentive condition, demonstrating the effect of reward to facilitate task performance. An interaction between reward expectation and the emotion of the target was evident in all the three experiments, with larger reward effects for angry faces than for neutral faces. This interaction was not affected by spatial orientation. These findings demonstrate that incentive motivation improves task performance and increases sensitivity to angry faces, irrespective of spatial orienting and reorienting processes.

  11. Grounding context in face processing: color, emotion, and gender

    PubMed Central

    Gil, Sandrine; Le Bigot, Ludovic

    2015-01-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (vs. green, mixed red/green, and achromatic) background – known to be valenced – on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder’s gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension. PMID:25852625

  12. From neural signatures of emotional modulation to social cognition: individual differences in healthy volunteers and psychiatric participants

    PubMed Central

    Aguado, Jaume; Baez, Sandra; Huepe, David; Lopez, Vladimir; Ortega, Rodrigo; Sigman, Mariano; Mikulan, Ezequiel; Lischinsky, Alicia; Torrente, Fernando; Cetkovich, Marcelo; Torralva, Teresa; Bekinschtein, Tristan; Manes, Facundo

    2014-01-01

    It is commonly assumed that early emotional signals provide relevant information for social cognition tasks. The goal of this study was to test the association between (a) cortical markers of facial emotion processing and (b) social-cognitive measures, and to build a model that predicts this association in healthy volunteers as well as in different groups of psychiatric patients. Thus, we investigated the early cortical processing of emotional stimuli (N170, using a face and word valence task) and its relationship with social-cognitive profiles (SCPs, indexed by measures of theory of mind, fluid intelligence, processing speed and executive functions). Group comparisons and individual differences were assessed among schizophrenia (SCZ) patients and their relatives, individuals with attention deficit hyperactivity disorder (ADHD), individuals with euthymic bipolar disorder (BD) and healthy participants (matched for educational level, handedness, age and gender). Our results provide evidence of emotional N170 impairments in the affected groups (SCZ and relatives, ADHD and BD) as well as subtle group differences. Importantly, cortical processing of emotional stimuli predicted the SCP, as evidenced by a structural equation model analysis. This is the first study to report an association model of brain markers of emotional processing and SCP. PMID:23685775

  13. Increased amygdala responses to sad but not fearful faces in major depression: relation to mood state and pharmacological treatment.

    PubMed

    Arnone, Danilo; McKie, Shane; Elliott, Rebecca; Thomas, Emma J; Downey, Darragh; Juhasz, Gabriella; Williams, Steve R; Deakin, J F William; Anderson, Ian M

    2012-08-01

    Increased amygdala response to negative emotions seen in functional MRI (fMRI) has been proposed as a biomarker for negative emotion processing bias underlying depressive symptoms and vulnerability to depressive relapse that are normalized by antidepressant drug treatment. The purpose of this study was to determine whether abnormal amygdala responses to face emotions in depression are related to specific emotions or change in response to antidepressant treatment and whether they are present as a stable trait in medication-free patients in remission. Sixty-two medication-free unipolar depressed patients (38 were currently depressed, and 24 were in remission) and 54 healthy comparison subjects underwent an indirect face-emotion processing task during fMRI. Thirty-two currently depressed patients were treated with the antidepressant citalopram for 8 weeks. Adherence to treatment was evaluated by measuring citalopram plasma concentrations. Patients with current depression had increased bilateral amygdala responses specific to sad faces relative to healthy comparison subjects and nonmedicated patients in stable remission. Treatment with citalopram abolished the abnormal amygdala responses to sad faces in currently depressed patients but did not alter responses to fearful faces. Aberrant amygdala activation in response to sad facial emotions is specific to the depressed state and is a potential biomarker for a negative affective bias during a depressive episode.

  14. Interactive effects between gaze direction and facial expression on attentional resources deployment: the task instruction and context matter

    PubMed Central

    Ricciardelli, Paola; Lugli, Luisa; Pellicano, Antonello; Iani, Cristina; Nicoletti, Roberto

    2016-01-01

    In three experiments, we tested whether the amount of attentional resources needed to process a face displaying a neutral, angry, or fearful expression with direct or averted gaze depends on task instructions and on how the faces are presented. To this end, we used a Rapid Serial Visual Presentation paradigm in which participants in Experiment 1 were first explicitly asked to discriminate whether the expression of a target face (T1) with direct or averted gaze was angry or neutral, and then to judge the orientation of a landscape (T2). Experiment 2 was identical to Experiment 1 except that participants had to discriminate the gender of the T1 face, and fearful faces were also presented, randomly intermixed within each block of trials. Experiment 3 differed from Experiment 2 only in that angry and fearful faces were never presented within the same block. The findings indicated that the presence of the attentional blink (AB) for face stimuli depends on specific combinations of gaze direction and emotional facial expression, and crucially revealed that contextual factors (e.g., explicit instruction to process the facial expression and the presence of other emotional faces) can modify and even reverse the AB, suggesting a flexible and more contextualized deployment of attentional resources in face processing. PMID:26898473
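    As a rough sketch of how the attentional blink is typically quantified in such designs, T2 accuracy is computed only on trials with a correct T1 report (T2|T1). The trial data below are invented placeholders, not from the study.

        import pandas as pd

        # Invented single-trial data: T1 condition plus T1 and T2 report accuracy (1 = correct).
        trials = pd.DataFrame({
            "t1_condition": ["angry_direct", "angry_averted", "neutral_direct", "angry_direct",
                             "neutral_averted", "angry_averted", "neutral_direct", "neutral_averted"],
            "t1_correct": [1, 1, 1, 0, 1, 1, 1, 1],
            "t2_correct": [0, 1, 1, 1, 1, 0, 1, 1],
        })

        # The attentional blink is indexed by T2 accuracy conditional on a correct T1 report.
        t2_given_t1 = (trials[trials["t1_correct"] == 1]
                       .groupby("t1_condition")["t2_correct"].mean())
        print(t2_given_t1)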

  15. Distinct facial processing in schizophrenia and schizoaffective disorders

    PubMed Central

    Chen, Yue; Cataldo, Andrea; Norton, Daniel J; Ongur, Dost

    2011-01-01

    Although schizophrenia and schizoaffective disorders have both similar and differing clinical features, it is not well understood whether similar or differing pathophysiological processes mediate patients’ cognitive functions. Using psychophysical methods, this study compared the performances of schizophrenia (SZ) patients, patients with schizoaffective disorder (SA), and a healthy control group in two face-related cognitive tasks: emotion discrimination, which tested perception of facial affect, and identity discrimination, which tested perception of non-affective facial features. Compared to healthy controls, SZ patients, but not SA patients, exhibited deficient performance in both fear and happiness discrimination, as well as identity discrimination. SZ patients, but not SA patients, also showed impaired performance in a theory-of-mind task for which emotional expressions are identified based upon the eye regions of face images. This pattern of results suggests distinct processing of face information in schizophrenia and schizoaffective disorders. PMID:21868199

  16. Emotional contexts modulate intentional memory suppression of neutral faces: Insights from ERPs.

    PubMed

    Pierguidi, Lapo; Righi, Stefania; Gronchi, Giorgio; Marzi, Tessa; Caharel, Stephanie; Giovannelli, Fabio; Viggiano, Maria Pia

    2016-08-01

    The main goal of the present work was to gain new insight into the temporal dynamics underlying voluntary memory control for neutral faces associated with neutral, positive and negative contexts. A directed forgetting (DF) procedure was used during EEG recording to answer the question of whether it is possible to forget a face that has been encoded within a particular emotional context. A face-scene phase, in which a neutral face was shown in a neutral or emotional (positive, negative) scene, was followed by the voluntary memory cue (cue phase) indicating whether the face was to be remembered or to be forgotten (TBR and TBF). Memory for faces was then assessed with an old/new recognition task. Behaviorally, we found that it is harder to suppress faces-in-positive-scenes than faces-in-negative-scenes and faces-in-neutral-scenes. The temporal information obtained from the ERPs showed that: 1) during the face-scene phase, the Late Positive Potential (LPP), which indexes motivated emotional attention, was larger for faces-in-negative-scenes than for faces-in-neutral-scenes; 2) remarkably, during the cue phase, ERPs were significantly modulated by the emotional contexts. Faces-in-neutral-scenes showed an ERP pattern typically associated with the DF effect, whereas faces-in-positive-scenes elicited the reverse ERP pattern. Faces-in-negative-scenes did not show differences in the DF-related neural activities, but a larger N1 amplitude for TBF vs. TBR faces may index early attentional deployment. These results support the hypothesis that the pleasantness or unpleasantness of the context (through attentional broadening and narrowing mechanisms, respectively) may modulate the effectiveness of intentional memory suppression for neutral information. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Neural Reactivity to Angry Faces Predicts Treatment Response in Pediatric Anxiety

    PubMed Central

    Kujawa, Autumn; Fitzgerald, Kate D.; Swain, James E.; Hanna, Gregory L.; Koschmann, Elizabeth; Simpson, David; Connolly, Sucheta; Monk, Christopher S.; Phan, K. Luan

    2018-01-01

    Although cognitive-behavioral psychotherapy (CBT) and pharmacotherapy are evidence-based treatments for pediatric anxiety, many youth with anxiety disorders fail to respond to these treatments. Given limitations of clinical measures in predicting treatment response, identifying neural predictors is timely. In this study, 35 anxious youth (ages 7–19 years) completed an emotional face-matching task during which the late positive potential (LPP), an event-related potential (ERP) component that indexes sustained attention towards emotional stimuli, was measured. Following the ERP measurement, youth received CBT or selective serotonin reuptake inhibitor (SSRI) treatment, and the LPP was examined as a predictor of treatment response. Findings indicated that, accounting for pre-treatment anxiety severity, neural reactivity to emotional faces predicted anxiety severity post-CBT and SSRI treatment such that enhanced electrocortical response to angry faces was associated with better treatment response. An enhanced LPP to angry faces may predict treatment response insofar as it may reflect greater emotion dysregulation or less avoidance and/or enhanced engagement with environmental stimuli in general, including with treatment. PMID:27255517

  18. Facial emotion perception by intensity in children and adolescents with 22q11.2 deletion syndrome.

    PubMed

    Leleu, Arnaud; Saucourt, Guillaume; Rigard, Caroline; Chesnoy, Gabrielle; Baudouin, Jean-Yves; Rossi, Massimiliano; Edery, Patrick; Franck, Nicolas; Demily, Caroline

    2016-03-01

    Difficulties in the recognition of emotions in expressive faces have been reported in people with 22q11.2 deletion syndrome (22q11.2DS). However, although low-intensity expressive faces are frequent in everyday life, nothing is known about their ability to perceive facial emotions as a function of expression intensity. In a visual matching task, children and adolescents with 22q11.2DS as well as gender- and age-matched healthy participants were asked to categorise the emotion of a target face among six possible expressions. Static pictures of morphs between neutrality and full expressions were used to parametrically manipulate the intensity of the target face. Compared with healthy controls, the 22q11.2DS group showed higher perception thresholds (i.e. a more intense expression was needed to perceive the emotion) and lower accuracy for the most expressive faces, indicating reduced categorisation abilities. The number of intrusions (i.e. instances in which one emotion was perceived as another) and a more gradual perception performance indicated smooth boundaries between emotional categories. Correlational analyses with neuropsychological and clinical measures suggested that reduced visual skills may be associated with impaired categorisation of facial emotions. Overall, the present study indicates that children and adolescents with 22q11.2DS have greater difficulty perceiving an emotion in low-intensity expressive faces, and that this difficulty rests on emotional categories that are not sharply organised. It also suggests that these difficulties may be associated with impaired visual cognition, a hallmark of the cognitive deficits observed in the syndrome. These data yield promising tracks for future experimental and clinical investigations.
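    The perception thresholds mentioned above are commonly estimated by fitting a psychometric function to accuracy as a function of morph intensity. The sketch below fits a two-parameter logistic with scipy; the intensities and proportions are hypothetical, and the guessing floor of a six-alternative task is ignored for simplicity.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical accuracy as a function of morph intensity (0 = neutral, 1 = full expression).
        intensity = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
        p_correct = np.array([0.08, 0.12, 0.22, 0.38, 0.55, 0.70, 0.82, 0.90, 0.94, 0.96])

        def logistic(x, x0, k):
            # x0 is the perception threshold (50% point); k is the slope of the category boundary.
            return 1.0 / (1.0 + np.exp(-k * (x - x0)))

        (x0, k), _ = curve_fit(logistic, intensity, p_correct, p0=[0.5, 10.0])
        print(f"estimated threshold = {x0:.2f}, slope = {k:.1f}")

    A higher fitted threshold corresponds to needing a more intense expression before the emotion is perceived, and a shallower slope corresponds to less sharply organised category boundaries.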

  19. Attentional Bias towards Positive Emotion Predicts Stress Resilience.

    PubMed

    Thoern, Hanna A; Grueschow, Marcus; Ehlert, Ulrike; Ruff, Christian C; Kleim, Birgit

    2016-01-01

    There is extensive evidence for an association between an attentional bias towards emotionally negative stimuli and vulnerability to stress-related psychopathology. Less is known about whether selective attention towards emotionally positive stimuli relates to mental health and stress resilience. The current study used a modified Dot Probe task to investigate if individual differences in attentional biases towards either happy or angry emotional stimuli, or an interaction between these biases, are related to self-reported trait stress resilience. In a nonclinical sample (N = 43), we indexed attentional biases as individual differences in reaction time for stimuli preceded by either happy or angry (compared to neutral) face stimuli. Participants with greater attentional bias towards happy faces (but not angry faces) reported higher trait resilience. However, an attentional bias towards angry stimuli moderated this effect: The attentional bias towards happy faces was only predictive for resilience in those individuals who also endorsed an attentional bias towards angry stimuli. An attentional bias towards positive emotional stimuli may thus be a protective factor contributing to stress resilience, specifically in those individuals who also endorse an attentional bias towards negative emotional stimuli. Our findings therefore suggest a novel target for prevention and treatment interventions addressing stress-related psychopathology.
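    For illustration, an attentional bias score in a dot probe task of this kind is usually the reaction-time difference between probes replacing neutral versus emotional faces, which can then be correlated with resilience scores. All values below are invented for the example.

        import numpy as np
        from scipy.stats import pearsonr

        # Invented per-participant mean RTs (ms) and trait resilience scores.
        rt_probe_at_neutral = np.array([520, 505, 540, 498, 530, 515])  # incongruent trials
        rt_probe_at_happy   = np.array([505, 500, 520, 470, 522, 498])  # congruent trials
        resilience          = np.array([3.2, 3.8, 2.9, 4.5, 3.0, 3.6])

        # Positive scores = faster responding when the probe replaces the happy face,
        # i.e. an attentional bias towards positive stimuli.
        bias_towards_happy = rt_probe_at_neutral - rt_probe_at_happy
        r, p = pearsonr(bias_towards_happy, resilience)
        print(f"bias towards happy vs. resilience: r = {r:.2f}, p = {p:.3f}")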

  20. Urinary oxytocin positively correlates with performance in facial visual search in unmarried males, without specific reaction to infant face.

    PubMed

    Saito, Atsuko; Hamada, Hiroki; Kikusui, Takefumi; Mogi, Kazutaka; Nagasawa, Miho; Mitsui, Shohei; Higuchi, Takashi; Hasegawa, Toshikazu; Hiraki, Kazuo

    2014-01-01

    The neuropeptide oxytocin plays a central role in prosocial and parental behavior in non-human mammals as well as in humans. It has been suggested that oxytocin may affect visual processing of infant faces and emotional reactions to infants. Healthy male volunteers (N = 13) were tested for their ability to detect infant or adult faces among adult or infant faces (facial visual search task). Urine samples were collected from all participants before the study to measure the concentration of oxytocin. Urinary oxytocin positively correlated with performance in the facial visual search task. However, task performance and its correlation with oxytocin concentration did not differ between infant faces and adult faces. Our data suggest that endogenous oxytocin is related to facial visual cognition, but does not promote infant-specific responses in unmarried men who are not fathers.

  1. The time course of emotional picture processing: an event-related potential study using a rapid serial visual presentation paradigm

    PubMed Central

    Zhu, Chuanlin; He, Weiqi; Qi, Zhengyang; Wang, Lili; Song, Dongqing; Zhan, Lei; Yi, Shengnan; Luo, Yuejia; Luo, Wenbo

    2015-01-01

    The present study recorded event-related potentials using rapid serial visual presentation paradigm to explore the time course of emotionally charged pictures. Participants completed a dual-target task as quickly and accurately as possible, in which they were asked to judge the gender of the person depicted (task 1) and the valence (positive, neutral, or negative) of the given picture (task 2). The results showed that the amplitudes of the P2 component were larger for emotional pictures than they were for neutral pictures, and this finding represents brain processes that distinguish emotional stimuli from non-emotional stimuli. Furthermore, positive, neutral, and negative pictures elicited late positive potentials with different amplitudes, implying that the differences between emotions are recognized. Additionally, the time course for emotional picture processing was consistent with the latter two stages of a three-stage model derived from studies on emotional facial expression processing and emotional adjective processing. The results of the present study indicate that in the three-stage model of emotion processing, the middle and late stages are more universal and stable, and thus occur at similar time points when using different stimuli (faces, words, or scenes). PMID:26217276

  2. Monitoring cognitive and emotional processes through pupil and cardiac response during dynamic versus logical task.

    PubMed

    Causse, Mickaël; Sénard, Jean-Michel; Démonet, Jean François; Pastor, Josette

    2010-06-01

    This paper deals with the links between physiological measurements and cognitive and emotional functioning. Because the operator is a key agent in charge of complex systems, defining metrics able to predict operator performance is a major challenge. Measuring physiological state is a promising approach, but it requires a precise understanding; in particular, few studies compare autonomic nervous system reactivity across specific cognitive processes during task performance, and task-related psychological stress is often ignored. We compared physiological parameters recorded from 24 healthy subjects facing two neuropsychological tasks: a dynamic task that requires problem solving in a world that continually evolves over time, and a logical task representative of the cognitive processes performed by operators during everyday problem solving. Results showed that the mean pupil diameter change was higher during the dynamic task; conversely, heart rate was more elevated during the logical task. Finally, systolic blood pressure appeared to be strongly sensitive to psychological stress. Taking better account of the precise influence of a given cognitive activity, together with workload and task-induced psychological stress, is a promising way to monitor operators in complex working situations and to detect mental overload or deleterious levels of stress.

  3. Emotion and sex of facial stimuli modulate conditional automaticity in behavioral and neuronal interference in healthy men.

    PubMed

    Kohn, Nils; Fernández, Guillén

    2017-12-06

    Our surroundings provide a host of sensory input, which we cannot fully process without streamlining and automatic processing. Levels of automaticity differ across cognitive and affective processes, and situational and contextual interactions between cognitive and affective processes in turn influence the level of automaticity. Automaticity can be measured by interference in Stroop tasks. We applied an emotional version of the Stroop task to investigate how stress, as a contextual factor, influences the affective valence-dependent level of automaticity. 120 young, healthy men were investigated for behavioral and brain interference following a stress induction or a control procedure in a counterbalanced crossover design. Although Stroop interference was always observed, the sex and emotion of the face strongly modulated interference, which was larger for fearful and male faces. These effects suggest higher automaticity when processing happy faces and also female faces. Consistent with the behavioral patterns, brain data showed lower interference-related activity in executive-control regions in response to happy and female faces. In the absence of behavioral stress effects, congruent compared to incongruent trials (reverse interference) showed little to no deactivation under stress in response to happy female and fearful male trials. These congruency effects are potentially based on altered, stress-related facial processing that interacts with sex-emotion stereotypes. The results indicate that sex and facial emotion modulate Stroop interference in brain and behavior. These effects can be explained by altered response difficulty as a consequence of the contextual and stereotype-related modulation of automaticity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Therapeutic writing as an intervention for symptoms of bulimia nervosa: effects and mechanism of change.

    PubMed

    Johnston, Olwyn; Startup, Helen; Lavender, Anna; Godfrey, Emma; Schmidt, Ulrike

    2010-07-01

    This study explored the effects on bulimic symptomatology of a writing task intended to reduce emotional avoidance. Eighty individuals reporting symptoms of bulimia completed, by e-mail, a therapeutic or control writing task. Participants completed questionnaires on bulimic symptoms, mood, and potential moderating and mediating factors, and were followed up after 4 and 8 weeks. Writing content was explored using a word count package and qualitative framework analysis. Bulimic symptoms decreased in both groups, although in both groups the number of participants who improved was approximately equal to the number who did not improve. Symptom decreases were associated with increases in perceived mood regulation abilities, and decreases in negative beliefs about emotions. Participants preferred internet delivery to face-to-face discussion. For individuals experiencing symptoms of bulimia, the effects of therapeutic writing did not differ significantly from the effects of a control writing task. Copyright © 2009 Wiley Periodicals, Inc.

  5. An fMRI Study of the Neural Correlates of Incidental versus Directed Emotion Processing in Pediatric Bipolar Disorder

    ERIC Educational Resources Information Center

    Pavuluri, Mani N.; Passarotti, Alessandra M.; Harral, Erin M.; Sweeney, John A.

    2009-01-01

    Functional neuroimaging of patients with pediatric bipolar disorder reveals increased amygdala activation in this group when they are asked to judge whether faces displaying emotion are older or younger than 35 years. Right prefrontal systems also appear less engaged in patients with this disorder.

  6. Global-Local Precedence in the Perception of Facial Age and Emotional Expression by Children with Autism and Other Developmental Disabilities

    ERIC Educational Resources Information Center

    Gross, Thomas F.

    2005-01-01

    Global information processing and perception of facial age and emotional expression was studied in children with autism, language disorders, mental retardation, and a clinical control group. Children were given a global-local task and asked to recognize age and emotion in human and canine faces. Children with autism made fewer global responses and…

  7. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    PubMed

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoker male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  8. Music-induced changes in functional cerebral asymmetries.

    PubMed

    Hausmann, Markus; Hodgetts, Sophie; Eerola, Tuomas

    2016-04-01

    After decades of research, it remains unclear whether emotion lateralization occurs because one hemisphere is dominant for processing the emotional content of the stimuli, or whether emotional stimuli activate lateralised networks associated with the subjective emotional experience. By using emotion-induction procedures, we investigated the effect of listening to happy and sad music on three well-established lateralization tasks. In a prestudy, Mozart's piano sonata (K. 448) and Beethoven's Moonlight Sonata were rated as the most happy and sad excerpts, respectively. Participants listened to either one emotional excerpt, or sat in silence before completing an emotional chimeric faces task (Experiment 1), visual line bisection task (Experiment 2) and a dichotic listening task (Experiment 3 and 4). Listening to happy music resulted in a reduced right hemispheric bias in facial emotion recognition (Experiment 1) and visuospatial attention (Experiment 2) and increased left hemispheric bias in language lateralization (Experiments 3 and 4). Although Experiments 1-3 revealed an increased positive emotional state after listening to happy music, mediation analyses revealed that the effect on hemispheric asymmetries was not mediated by music-induced emotional changes. The direct effect of music listening on lateralization was investigated in Experiment 4 in which tempo of the happy excerpt was manipulated by controlling for other acoustic features. However, the results of Experiment 4 made it rather unlikely that tempo is the critical cue accounting for the effects. We conclude that listening to music can affect functional cerebral asymmetries in well-established emotional and cognitive laterality tasks, independent of music-induced changes in the emotion state. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    PubMed

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigate facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities; the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions; the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). Neither patient group differed significantly from controls on any measure of emotion perception/labeling. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.

  10. Selective attention modulates early human evoked potentials during emotional face-voice processing.

    PubMed

    Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A

    2015-04-01

    Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective-one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.

  11. Development and validation of a facial expression database based on the dimensional and categorical model of emotions.

    PubMed

    Fujimura, Tomomi; Umemura, Hiroyuki

    2018-01-15

    The present study describes the development and validation of a facial expression database comprising five different horizontal face angles in dynamic and static presentations. The database includes twelve expression types portrayed by eight Japanese models. This database was inspired by the dimensional and categorical model of emotions: surprise, fear, sadness, anger with open mouth, anger with closed mouth, disgust with open mouth, disgust with closed mouth, excitement, happiness, relaxation, sleepiness, and neutral (static only). The expressions were validated using emotion classification and Affect Grid rating tasks [Russell, Weiss, & Mendelsohn, 1989. Affect Grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57(3), 493-502]. The results indicate that most of the expressions were recognised as the intended emotions and could systematically represent affective valence and arousal. Furthermore, face angle and facial motion information influenced emotion classification and valence and arousal ratings. Our database will be available online at the following URL. https://www.dh.aist.go.jp/database/face2017/ .

  12. Early ERP Modulation for Task-Irrelevant Subliminal Faces.

    PubMed

    Pegna, Alan J; Darque, Alexandra; Berrut, Claire; Khateb, Asaid

    2011-01-01

    A number of investigations have reported that emotional faces can be processed subliminally, and that they give rise to specific patterns of brain activation in the absence of awareness. Recent event-related potential (ERP) studies have suggested that electrophysiological differences occur early in time (<200 ms) in response to backward-masked emotional faces. These findings have been taken as evidence of a rapid non-conscious pathway, which would allow threatening stimuli to be processed rapidly and subsequently allow appropriate avoidance action to be taken. However, for this to be the case, subliminal processing should arise even if the threatening stimulus is not attended. This point has in fact not yet been clearly established. In this ERP study, we investigated whether subliminal processing of fearful faces occurs outside the focus of attention. Fourteen healthy participants performed a line judgment task while fearful and non-fearful (happy or neutral) faces were presented both subliminally and supraliminally. ERPs were compared across the four experimental conditions (i.e., subliminal and supraliminal; fearful and non-fearful). The earliest differences between fearful and non-fearful faces appeared as an enhanced posterior negativity for the former at 170 ms (the N170 component) over right temporo-occipital electrodes. This difference was observed for both subliminal (p < 0.05) and supraliminal presentations (p < 0.01). Our results confirm that subliminal processing of fearful faces occurs early in the course of visual processing, and more importantly, that this arises even when the subject's attention is engaged in an incidental task.
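    A minimal sketch of the kind of N170 analysis described above: compute the mean amplitude in a post-stimulus window (here 150-190 ms) for each condition and take the fearful-minus-non-fearful difference. The waveforms below are synthetic stand-ins, not recorded data.

        import numpy as np

        # Stand-in averaged ERPs (microvolts) at a right temporo-occipital electrode,
        # sampled at 500 Hz from -100 to 400 ms (values are synthetic, not real data).
        sfreq = 500
        times = np.arange(-0.1, 0.4, 1.0 / sfreq)
        n170 = -np.exp(-((times - 0.17) ** 2) / (2 * 0.02 ** 2))  # negative peak near 170 ms
        erp_fearful = 3.0 * n170
        erp_nonfearful = 2.0 * n170

        # Mean amplitude in a 150-190 ms window; a more negative fearful-minus-non-fearful
        # difference corresponds to the enhanced posterior negativity (N170) for fearful faces.
        window = (times >= 0.150) & (times <= 0.190)
        diff = erp_fearful[window].mean() - erp_nonfearful[window].mean()
        print(f"N170 difference (fearful - non-fearful): {diff:.2f} microvolts")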

  13. State-dependent alteration in face emotion recognition in depression.

    PubMed

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

    Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  14. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    PubMed Central

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426

  15. Facial Expression Influences Face Identity Recognition During the Attentional Blink

    PubMed Central

    2014-01-01

    Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry—suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppress memory access for competing objects, but only angry facial expression enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another. PMID:25286076

  16. Facial expression influences face identity recognition during the attentional blink.

    PubMed

    Bach, Dominik R; Schmidt-Daffy, Martin; Dolan, Raymond J

    2014-12-01

    Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry-suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppress memory access for competing objects, but only angry facial expression enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another.

  17. Attention Alters Neural Responses to Evocative Faces in Behaviorally Inhibited Adolescents

    PubMed Central

    Pérez-Edgar, Koraly; Roberson-Nay, Roxann; Hardin, Michael G.; Poeth, Kaitlin; Guyer, Amanda E.; Nelson, Eric E.; McClure, Erin B.; Henderson, Heather A.; Fox, Nathan A.; Pine, Daniel S.; Ernst, Monique

    2007-01-01

    Behavioral inhibition (BI) is a risk factor for anxiety disorders. While the two constructs bear behavioral similarities, previous work has not extended these parallels to the neural level. This study examined amygdala reactivity during a task previously used with clinically anxious adolescents. Adolescents were selected for enduring patterns of BI or non-inhibition (BN). We examined amygdala response to evocative emotion faces in BI (N=10, mean 12.8 years) and BN (N=17, mean 12.5 years) adolescents while systematically manipulating attention. Analyses focused on amygdala response during subjective ratings of internal fear (constrained attention) and passive viewing (unconstrained attention) during the presentation of emotion faces (Happy, Angry, Fearful, and Neutral). BI adolescents, relative to BN adolescents, showed exaggerated amygdala response during subjective fear ratings and deactivation during passive viewing, across all emotion faces. In addition, the BI group showed an abnormally high amygdala response to a task condition marked by novelty and uncertainty (i.e., rating fear state to a Happy face). Perturbations in amygdala function are evident in adolescents temperamentally at risk for anxiety. Attention state alters the underlying pattern of neural processing, potentially mediating the observed behavioral patterns across development. BI adolescents also show a heightened sensitivity to novelty and uncertainty, which has been linked to anxiety. These patterns of reactivity may help sustain early temperamental biases over time and contribute to the observed relation between BI and anxiety. PMID:17376704

  18. Social anxiety under load: the effects of perceptual load in processing emotional faces.

    PubMed

    Soares, Sandra C; Rocha, Marta; Neiva, Tiago; Rodrigues, Paulo; Silva, Carlos F

    2015-01-01

    Previous studies of social anxiety have shown an impaired attentional control system, similar to that found in trait anxiety. However, the effect of task demands on socially anxious individuals' processing of socially threatening stimuli, such as angry faces, has not been examined. In the present study, 54 university students scoring high and low on the Social Interaction and Performance Anxiety and Avoidance Scale (SIPAAS) questionnaire participated in a target letter discrimination task while task-irrelevant face stimuli (angry, disgust, happy, and neutral) were simultaneously presented. The results showed that high (compared to low) socially anxious individuals were more prone to distraction by task-irrelevant stimuli, particularly under high perceptual load conditions. More importantly, for such individuals, the accuracy proportions for angry faces significantly differed between the low and high perceptual load conditions, which is discussed in light of current evolutionary models of social anxiety.

  19. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    PubMed

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  20. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load

    PubMed Central

    Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925

  1. Neural Processing of Facial Identity and Emotion in Infants at High-Risk for Autism Spectrum Disorders

    PubMed Central

    Fox, Sharon E.; Wagner, Jennifer B.; Shrock, Christine L.; Tager-Flusberg, Helen; Nelson, Charles A.

    2013-01-01

    Deficits in face processing and social impairment are core characteristics of autism spectrum disorder. The present work examined 7-month-old infants at high-risk for developing autism and typically developing controls at low-risk, using a face perception task designed to differentiate between the effects of face identity and facial emotions on neural response using functional Near-Infrared Spectroscopy. In addition, we employed independent component analysis, as well as a novel method of condition-related component selection and classification to identify group differences in hemodynamic waveforms and response distributions associated with face and emotion processing. The results indicate similarities of waveforms, but differences in the magnitude, spatial distribution, and timing of responses between groups. These early differences in local cortical regions and the hemodynamic response may, in turn, contribute to differences in patterns of functional connectivity. PMID:23576966

  2. Interference with facial emotion recognition by verbal but not visual loads.

    PubMed

    Reed, Phil; Steed, Ian

    2015-12-01

    The ability to recognize emotions through facial characteristics is critical for social functioning, but is often impaired in those with a developmental or intellectual disability. The current experiments explored the degree to which interfering with the processing capacities of typically-developing individuals would produce a similar inability to recognize emotions through the facial elements of faces displaying particular emotions. It was found that increasing the cognitive load (in an attempt to model learning impairments in a typically developing population) produced deficits in correctly identifying emotions from facial elements. However, this effect was much more pronounced when using a concurrent verbal task than when employing a concurrent visual task, suggesting that there is a substantial verbal element to the labeling and subsequent recognition of emotions. This concurs with previous work conducted with those with developmental disabilities that suggests emotion recognition deficits are connected with language deficits. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Influence of emotional processing on working memory in schizophrenia.

    PubMed

    Becerril, Karla; Barch, Deanna

    2011-09-01

    Research on emotional processing in schizophrenia suggests relatively intact subjective responses to affective stimuli "in the moment." However, neuroimaging evidence suggests diminished activation in brain regions associated with emotional processing in schizophrenia. We asked whether given a more vulnerable cognitive system in schizophrenia, individuals with this disorder would show increased or decreased modulation of working memory (WM) as a function of the emotional content of stimuli compared with healthy control subjects. In addition, we examined whether higher anhedonia levels were associated with a diminished impact of emotion on behavioral and brain activation responses. In the present study, 38 individuals with schizophrenia and 32 healthy individuals completed blocks of a 2-back WM task in a functional magnetic resonance imaging scanning session. Blocks contained faces displaying either only neutral stimuli or neutral and emotional stimuli (happy or fearful faces), randomly intermixed and occurring both as targets and non-targets. Both groups showed higher accuracy but slower reaction time for negative compared to neutral stimuli. Individuals with schizophrenia showed intact amygdala activity in response to emotionally evocative stimuli, but demonstrated altered dorsolateral prefrontal cortex (DLPFC) and hippocampal activity while performing an emotionally loaded WM-task. Higher levels of social anhedonia were associated with diminished amygdala responses to emotional stimuli and increased DLPFC activity in individuals with schizophrenia. Emotional arousal may challenge dorsal-frontal control systems, which may have both beneficial and detrimental influences. Our findings suggest that disturbances in emotional processing in schizophrenia relate to alterations in emotion-cognition interactions rather than to the perception and subjective experience of emotion per se.

  4. Dissociation in Rating Negative Facial Emotions between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    PubMed

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2016-11-01

    Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may partly explain why early-stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve early diagnosis of bvFTD. Although emotion processing is affected in bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions overall (i.e., bvFTD patients underrated negative emotions overall, whereas MDD patients overrated negative emotions overall compared with healthy participants). In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.

  5. Competition between frontoparietal control and default networks supports social working memory and empathy

    PubMed Central

    Xin, Fei

    2015-01-01

    An extensive body of literature has indicated that there is increased activity in the frontoparietal control network (FPC) and decreased activity in the default mode network (DMN) during working memory (WM) tasks. The FPC and DMN operate in a competitive relationship during tasks requiring externally directed attention. However, the association between this FPC-DMN competition and performance in social WM tasks has rarely been reported in previous studies. To investigate this question, we measured FPC-DMN connectivity during resting state and two emotional face recognition WM tasks using the 2-back paradigm. Thirty-four individuals were instructed to perform the tasks based on either the expression [emotion (EMO)] or the identity (ID) of the same set of face stimuli. Consistent with previous studies, an increased anti-correlation between the FPC and DMN was observed during both tasks relative to the resting state. Specifically, this anti-correlation during the EMO task was stronger than during the ID task, as the former has a higher social load. Intriguingly, individual differences in self-reported empathy were significantly correlated with the FPC-DMN anti-correlation in the EMO task. These results indicate that the top-down signals from the FPC suppress the DMN to support social WM and empathy. PMID:25556209
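    The FPC-DMN (anti-)correlation reported above is, at its core, a correlation between two ROI time series, often Fisher z-transformed before group statistics. The sketch below uses simulated signals as stand-ins for the extracted BOLD time courses.

        import numpy as np

        rng = np.random.default_rng(42)
        n_vols = 240  # hypothetical number of fMRI volumes in a run

        # Stand-ins for mean BOLD time courses extracted from FPC and DMN masks.
        fpc = rng.normal(size=n_vols)
        dmn = -0.4 * fpc + rng.normal(scale=0.9, size=n_vols)

        # Fisher z-transformed correlation; more negative values indicate a stronger
        # FPC-DMN anti-correlation, as expected under higher social load.
        r = np.corrcoef(fpc, dmn)[0, 1]
        z = np.arctanh(r)
        print(f"FPC-DMN coupling: r = {r:.2f}, z = {z:.2f}")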

  6. Emotion Awareness Predicts Body Mass Index Percentile Trajectories in Youth.

    PubMed

    Whalen, Diana J; Belden, Andy C; Barch, Deanna; Luby, Joan

    2015-10-01

    To examine the rate of change in body mass index (BMI) percentile across 3 years in relation to emotion identification ability and brain-based reactivity in emotional processing regions. A longitudinal sample of 202 youths completed 3 functional magnetic resonance imaging-based facial processing tasks and behavioral emotion differentiation tasks. We examined the rate of change in the youth's BMI percentile as a function of reactivity in emotional processing brain regions and behavioral emotion identification tasks using multilevel modeling. Lower correct identification of both happiness and sadness measured behaviorally predicted increases in BMI percentile across development, whereas higher correct identification of both happiness and sadness predicted decreases in BMI percentile, while controlling for children's pubertal status, sex, ethnicity, IQ score, exposure to antipsychotic medication, family income-to-needs ratio, and externalizing, internalizing, and depressive symptoms. Greater neural activation in emotional reactivity regions to sad faces also predicted increases in BMI percentile during development, also controlling for the aforementioned covariates. Our findings provide longitudinal developmental data demonstrating links between both emotion identification ability and greater neural reactivity in emotional processing regions with trajectories of BMI percentiles across childhood. Copyright © 2015 Elsevier Inc. All rights reserved.
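    Trajectory analyses of this kind are typically fitted as mixed-effects growth models. The sketch below shows one way such a model might be specified with statsmodels, using simulated data; the variable names (wave, emo_id, bmi_pct) are placeholders, not the study's actual variables.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n_children, n_waves = 30, 3

        # Simulated long-format data: BMI percentile per child per wave plus a baseline
        # emotion-identification score (higher = better identification).
        child = np.repeat(np.arange(n_children), n_waves)
        wave = np.tile(np.arange(n_waves), n_children)
        emo_id = np.repeat(rng.uniform(0.2, 1.0, n_children), n_waves)
        bmi_pct = 60 + 8 * wave * (0.6 - emo_id) + rng.normal(0, 4, n_children * n_waves)
        data = pd.DataFrame({"child": child, "wave": wave, "emo_id": emo_id, "bmi_pct": bmi_pct})

        # Random-intercept growth model: the wave x emo_id interaction tests whether emotion
        # identification ability predicts the rate of change in BMI percentile.
        model = smf.mixedlm("bmi_pct ~ wave * emo_id", data, groups=data["child"])
        print(model.fit().summary())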

  7. Theory of mind and its relationship with executive functions and emotion recognition in borderline personality disorder.

    PubMed

    Baez, Sandra; Marengo, Juan; Perez, Ana; Huepe, David; Font, Fernanda Giralt; Rial, Veronica; Gonzalez-Gadea, María Luz; Manes, Facundo; Ibanez, Agustin

    2015-09-01

    Impaired social cognition has been claimed to be a mechanism underlying the development and maintenance of borderline personality disorder (BPD). One important aspect of social cognition is the theory of mind (ToM), a complex skill that seems to be influenced by more basic processes, such as executive functions (EF) and emotion recognition. Previous ToM studies in BPD have yielded inconsistent results. This study assessed the performance of BPD adults on ToM, emotion recognition, and EF tasks. We also examined whether EF and emotion recognition could predict the performance on ToM tasks. We evaluated 15 adults with BPD and 15 matched healthy controls using different tasks of EF, emotion recognition, and ToM. The results showed that BPD adults exhibited deficits in the three domains, which seem to be task-dependent. Furthermore, we found that EF and emotion recognition predicted the performance on ToM. Our results suggest that tasks that involve real-life social scenarios and contextual cues are more sensitive to detect ToM and emotion recognition deficits in BPD individuals. Our findings also indicate that (a) ToM variability in BPD is partially explained by individual differences on EF and emotion recognition; and (b) ToM deficits of BPD patients are partially explained by the capacity to integrate cues from face, prosody, gesture, and social context to identify the emotions and others' beliefs. © 2014 The British Psychological Society.

  8. How do typically developing deaf children and deaf children with autism spectrum disorder use the face when comprehending emotional facial expressions in British sign language?

    PubMed

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-10-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113-118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.'s (Sign Lang Stud 75:113-118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.

  9. Spatiotemporal dynamics in human visual cortex rapidly encode the emotional content of faces.

    PubMed

    Dima, Diana C; Perry, Gavin; Messaritaki, Eirini; Zhang, Jiaxiang; Singh, Krish D

    2018-06-08

    Recognizing emotion in faces is important in human interaction and survival, yet existing studies do not paint a consistent picture of the neural representation supporting this task. To address this, we collected magnetoencephalography (MEG) data while participants passively viewed happy, angry and neutral faces. Using time-resolved decoding of sensor-level data, we show that responses to angry faces can be discriminated from happy and neutral faces as early as 90 ms after stimulus onset and only 10 ms later than faces can be discriminated from scrambled stimuli, even in the absence of differences in evoked responses. Time-resolved relevance patterns in source space track expression-related information from the visual cortex (100 ms) to higher-level temporal and frontal areas (200-500 ms). Together, our results point to a system optimised for rapid processing of emotional faces and preferentially tuned to threat, consistent with the important evolutionary role that such a system must have played in the development of human social interactions. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
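
    The time-resolved decoding mentioned above amounts to training and cross-validating a classifier independently at each time point on the sensor pattern across trials. The sketch below illustrates this with scikit-learn and simulated data; the classifier, cross-validation scheme, and data shapes are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def timewise_decoding(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Decode condition labels separately at every time point.

    X: (n_trials, n_sensors, n_times) sensor-level data (simulated here).
    y: (n_trials,) labels, e.g. 0 = happy/neutral, 1 = angry.
    Returns the mean cross-validated accuracy per time point."""
    n_times = X.shape[2]
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    scores = np.zeros(n_times)
    for t in range(n_times):
        scores[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    return scores

# Toy demonstration: a class-specific signal is injected from time point 30 on
rng = np.random.default_rng(1)
X = rng.standard_normal((120, 64, 60))
y = rng.integers(0, 2, 120)
X[y == 1, :10, 30:] += 0.8  # signal on 10 sensors for one class
acc = timewise_decoding(X, y)
print(acc[:5].round(2), acc[-5:].round(2))  # near chance early, above chance late
```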

  10. Neural correlates of emotional action control in anger-prone women with borderline personality disorder.

    PubMed

    Bertsch, Katja; Roelofs, Karin; Roch, Paul Jonathan; Ma, Bo; Hensel, Saskia; Herpertz, Sabine C; Volman, Inge

    2018-05-01

    Difficulty in controlling emotional impulses is a crucial component of borderline personality disorder (BPD) that often leads to destructive, impulsive behaviours against others. In line with recent findings in aggressive individuals, deficits in prefrontal-amygdala coupling during emotional action control may account for these symptoms. To study the neurobiological correlates of altered emotional action control in individuals with BPD, we asked medication-free, anger-prone, female patients with BPD and age- and intelligence-matched healthy women to take part in an approach-avoidance task while lying in an MRI scanner. The task required controlling fast behavioural tendencies to approach happy and avoid angry faces. Additionally, before the task we collected salivary testosterone and self-reported information on tendencies to act out anger and correlated this with behavioural and functional MRI (fMRI) data. We included 30 patients and 28 controls in our analysis. Patients with BPD reported increased tendencies to act out anger and were faster to approach than to avoid angry faces compared with healthy women, suggesting deficits in emotional action control in women with BPD. On a neural level, controlling fast emotional action tendencies was associated with enhanced activation in the antero- and dorsolateral prefrontal cortex across groups. Healthy women showed a negative coupling between the left dorsolateral prefrontal cortex and right amygdala, whereas this coupling was absent in patients with BPD. Specificity of results to BPD and sex differences remain unknown owing to the lack of clinical control groups and male participants. The results indicate reduced lateral prefrontal-amygdala communication during emotional action control in anger-prone women with BPD. The findings provide a possible neural mechanism underlying difficulties with controlling emotional impulses in patients with BPD.

  11. Dissociation of Neural Substrates of Response Inhibition to Negative Information between Implicit and Explicit Facial Go/Nogo Tasks: Evidence from an Electrophysiological Study

    PubMed Central

    Sun, Shiyue; Carretié, Luis; Zhang, Lei; Dong, Yi; Zhu, Chunyan; Luo, Yuejia; Wang, Kai

    2014-01-01

    Background Although ample evidence suggests that emotion and response inhibition are interrelated at the behavioral and neural levels, the neural substrates of response inhibition to negative facial information remain unclear. Thus we used event-related potential (ERP) methods to explore the effects of explicit and implicit facial expression processing on response inhibition. Methods We used implicit (gender categorization) and explicit emotional Go/Nogo tasks (emotion categorization) in which neutral and sad faces were presented. Electrophysiological markers at the scalp and the voxel level were analyzed during the two tasks. Results We detected a task × emotion × trial type interaction effect in the Nogo-P3 stage. Larger Nogo-P3 amplitudes during sad conditions versus neutral conditions were detected with explicit tasks. However, the amplitude differences between the two conditions were not significant for implicit tasks. Source analyses of the P3 component revealed that the right inferior frontal junction (rIFJ) was involved during this stage. The current source density (CSD) of the rIFJ was higher for sad than for neutral conditions in the explicit task, but not in the implicit task. Conclusions The findings indicated that response inhibition was modulated by sad facial information at the action inhibition stage when facial expressions were processed explicitly rather than implicitly. The rIFJ may be a key brain region in emotion regulation. PMID:25330212

  12. Neural correlates of emotional face processing in bipolar disorder: an event-related potential study.

    PubMed

    Degabriele, Racheal; Lagopoulos, Jim; Malhi, Gin

    2011-09-01

    Behavioural and imaging studies report that individuals with bipolar disorder (BD) exhibit impairments in emotional face processing. However, few studies have examined the temporal characteristics of these impairments, and event-related potential (ERP) studies that investigate emotion perception in BD are rare. The aim of our study was to explore these processes as indexed by the face-specific P100 and N170 ERP components in a BD cohort. Eighteen subjects diagnosed with BD and 18 age- and sex-matched healthy volunteers completed an emotional go/no-go inhibition task during electroencephalogram (EEG) and ERP acquisition. Patients demonstrated faster responses to happy compared to sad faces, whereas control data revealed no emotional discrimination. Errors of omission were more frequent in the BD group in both emotion conditions, but there were no between-group differences in commission errors. Significant differences were found between groups in P100 amplitude variation across levels of affect, with the BD group exhibiting greater responses to happy compared to sad faces. Conversely, the control cohort failed to demonstrate a differentiation between emotions. A statistically significant between-group effect was also found for N170 amplitudes, indicating reduced responses in the BD group. Future studies should ideally recruit BD patients across all three mood states (manic, depressive, and euthymic) with greater scrutiny of the effects of psychotropic medication. These ERP results primarily suggest an emotion-sensitive face processing impairment in BD whereby patients are initially more attuned to positive emotions as indicated by the P100 ERP component, and this may contribute to the emergence of bipolar-like symptoms. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Reading the mind in the infant eyes: paradoxical effects of oxytocin on neural activity and emotion recognition in watching pictures of infant faces.

    PubMed

    Voorthuis, Alexandra; Riem, Madelon M E; Van IJzendoorn, Marinus H; Bakermans-Kranenburg, Marian J

    2014-09-11

    The neuropeptide oxytocin facilitates parental caregiving and is involved in the processing of infant vocal cues. In this randomized-controlled trial with functional magnetic resonance imaging we examined the influence of intranasally administered oxytocin on neural activity during emotion recognition in infant faces. Blood oxygenation level dependent (BOLD) responses during emotion recognition were measured in 50 women who were administered 16 IU of oxytocin or a placebo. Participants performed an adapted version of the Infant Facial Expressions of Emotions from Looking at Pictures (IFEEL pictures), a task that has been developed to assess the perception and interpretation of infants' facial expressions. Experimentally induced oxytocin levels increased activation in the inferior frontal gyrus (IFG), the middle temporal gyrus (MTG) and the superior temporal gyrus (STG). However, oxytocin decreased performance on the IFEEL picture task. Our findings suggest that oxytocin enhances the processing of facial cues to the emotional state of infants at the neural level, but at the same time it may decrease the correct interpretation of infants' facial expressions at the behavioral level. This article is part of a Special Issue entitled Oxytocin and Social Behav. © 2013 Published by Elsevier B.V.

  14. Parametric modulation of neural activity by emotion in youth with bipolar disorder, severe mood dysregulation, and healthy subjects

    PubMed Central

    Thomas, Laura A.; Brotman, Melissa A.; Muhrer, Eli M.; Rosen, Brooke H.; Bones, Brian L.; Reynolds, Richard C.; Deveney, Christen; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Context Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show amygdala dysfunction during face emotion processing. However, studies have not compared such patients to each other and to comparison subjects in neural responsiveness to subtle changes in face emotion; the ability to process such changes is important for social cognition. We employed a novel parametrically designed faces paradigm. Objective Using a parametrically morphed emotional faces task, we compared activation in the amygdala and across the brain in BD, SMD, and healthy volunteers (HV). Design Case-control study. Setting Government research institute. Participants 57 youths (19 BD, 15 SMD, 23 HV). Main Outcome Measure Blood oxygenation level-dependent (BOLD) data. Neutral faces were morphed with angry and happy faces in 25% intervals; static face stimuli appeared for 3000 ms. Subjects performed hostility or non-emotional facial feature (i.e., nose width) ratings. The slope of BOLD activity was calculated across neutral-to-angry (N→A) and neutral-to-happy (N→H) face stimuli. Results In HV, but not BD or SMD, there was a positive association between left amygdala activity and anger on the face. In the N→H whole-brain analysis, BD and SMD modulated parietal, temporal, and medial-frontal areas differently from each other and from HV; with increasing facial happiness, SMD increased, while BD decreased, activity in parietal, temporal, and frontal regions. Conclusions Youth with BD or SMD differ from HV in modulation of amygdala activity in response to small changes in facial anger displays. In contrast, BD and SMD show distinct perturbations in regions mediating attention and face processing in association with changes in the emotional intensity of facial happiness displays. These findings demonstrate similarities and differences in the neural correlates of face emotion processing in BD and SMD, suggesting these distinct clinical presentations may reflect differing pathologies along a mood disorders spectrum. PMID:23026912
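
    The slope measure described in this record can be illustrated as a per-subject least-squares fit of the BOLD estimate against the morph level (0-100% emotional intensity). The morph levels and BOLD values below are invented for illustration and are not data from the study.

```python
import numpy as np

def bold_slope(morph_levels: np.ndarray, bold: np.ndarray) -> float:
    """Least-squares slope of BOLD activity across parametric morph levels.

    morph_levels: e.g. np.array([0, 25, 50, 75, 100]) for neutral-to-angry morphs.
    bold: mean amygdala BOLD estimate at each morph level (hypothetical values).
    A positive slope means activity increases with the emotional intensity."""
    slope, _intercept = np.polyfit(morph_levels, bold, deg=1)
    return float(slope)

levels = np.array([0, 25, 50, 75, 100])                    # % anger in the morph
bold_estimates = np.array([0.10, 0.18, 0.22, 0.31, 0.35])  # illustrative betas
print(bold_slope(levels, bold_estimates))                  # positive modulation by anger
```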

  15. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: A fixation-to-feature approach.

    PubMed

    Neath-Tavares, Karly N; Itier, Roxane J

    2016-09-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100-120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. The Older Adult Positivity Effect in Evaluations of Trustworthiness: Emotion Regulation or Cognitive Capacity?

    PubMed

    Zebrowitz, Leslie A; Boshyan, Jasmine; Ward, Noreen; Gutchess, Angela; Hadjikhani, Nouchine

    2017-01-01

    An older adult positivity effect, i.e., the tendency for older adults to favor positive over negative stimulus information more than do younger adults, has been previously shown in attention, memory, and evaluations. This effect has been attributed to greater emotion regulation in older adults. In the case of attention and memory, this explanation has been supported by some evidence that the older adult positivity effect is most pronounced for negative stimuli, which would motivate emotion regulation, and that it is reduced by cognitive load, which would impede emotion regulation. We investigated whether greater older adult positivity in the case of evaluative responses to faces is also enhanced for negative stimuli and attenuated by cognitive load, as an emotion regulation explanation would predict. In two studies, younger and older adults rated trustworthiness of faces that varied in valence both under low and high cognitive load, with the latter manipulated by a distracting backwards counting task. In Study 1, face valence was manipulated by attractiveness (low/disfigured faces, medium, high/fashion models' faces). In Study 2, face valence was manipulated by trustworthiness (low, medium, high). Both studies revealed a significant older adult positivity effect. However, contrary to an emotion regulation account, this effect was not stronger for more negative faces, and cognitive load increased rather than decreased the rated trustworthiness of negatively valenced faces. Although inconsistent with emotion regulation, the latter effect is consistent with theory and research arguing that more cognitive resources are required to process negative stimuli, because they are more cognitively elaborated than positive ones. The finding that increased age and increased cognitive load both enhanced the positivity of trustworthy ratings suggests that the older adult positivity effect in evaluative ratings of faces may reflect age-related declines in cognitive capacity rather than increases in the regulation of negative emotions.

  17. Reprint of "Investigating ensemble perception of emotions in autistic and typical children and adolescents".

    PubMed

    Karaminis, Themelis; Neil, Louise; Manning, Catherine; Turi, Marco; Fiorentini, Chiara; Burr, David; Pellicano, Elizabeth

    2018-01-01

    Ensemble perception, the ability to assess automatically the summary of large amounts of information presented in visual scenes, is available early in typical development. This ability might be compromised in autistic children, who are thought to present limitations in maintaining summary statistics representations for the recent history of sensory input. Here we examined ensemble perception of facial emotional expressions in 35 autistic children, 30 age- and ability-matched typical children and 25 typical adults. Participants received three tasks: a) an 'ensemble' emotion discrimination task; b) a baseline (single-face) emotion discrimination task; and c) a facial expression identification task. Children performed worse than adults on all three tasks. Unexpectedly, autistic and typical children were, on average, indistinguishable in their precision and accuracy on all three tasks. Computational modelling suggested that, on average, autistic and typical children used ensemble-encoding strategies to a similar extent; but ensemble perception was related to non-verbal reasoning abilities in autistic but not in typical children. Eye-movement data also showed no group differences in the way children attended to the stimuli. Our combined findings suggest that the abilities of autistic and typical children for ensemble perception of emotions are comparable on average. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  18. Ensemble perception of emotions in autistic and typical children and adolescents.

    PubMed

    Karaminis, Themelis; Neil, Louise; Manning, Catherine; Turi, Marco; Fiorentini, Chiara; Burr, David; Pellicano, Elizabeth

    2017-04-01

    Ensemble perception, the ability to assess automatically the summary of large amounts of information presented in visual scenes, is available early in typical development. This ability might be compromised in autistic children, who are thought to present limitations in maintaining summary statistics representations for the recent history of sensory input. Here we examined ensemble perception of facial emotional expressions in 35 autistic children, 30 age- and ability-matched typical children and 25 typical adults. Participants received three tasks: a) an 'ensemble' emotion discrimination task; b) a baseline (single-face) emotion discrimination task; and c) a facial expression identification task. Children performed worse than adults on all three tasks. Unexpectedly, autistic and typical children were, on average, indistinguishable in their precision and accuracy on all three tasks. Computational modelling suggested that, on average, autistic and typical children used ensemble-encoding strategies to a similar extent; but ensemble perception was related to non-verbal reasoning abilities in autistic but not in typical children. Eye-movement data also showed no group differences in the way children attended to the stimuli. Our combined findings suggest that the abilities of autistic and typical children for ensemble perception of emotions are comparable on average. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    PubMed

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Effects of task demands on the early neural processing of fearful and happy facial expressions

    PubMed Central

    Itier, Roxane J.; Neath-Tavares, Karly N.

    2017-01-01

    Task demands shape how we process environmental stimuli but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during gender discrimination, explicit emotion discrimination, and oddball detection tasks, the most studied tasks in the field. Fixation on the nose of the face was enforced with an eye tracker using a gaze-contingent presentation. Task demands modulated amplitudes from 200–350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170, from 150–350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, the neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing. PMID:28315309
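
    Time-window analyses such as the 200-350 ms EPN effect reported here reduce each condition's waveform to its mean amplitude within a latency window. The sketch below shows that reduction on simulated single-cluster data; the sampling grid, condition labels, and values are hypothetical.

```python
import numpy as np

def mean_window_amplitude(erp: np.ndarray, times_ms: np.ndarray,
                          t_start: float, t_end: float) -> np.ndarray:
    """Mean ERP amplitude in a latency window, per condition.

    erp: (n_conditions, n_times) averaged waveforms at one electrode cluster.
    times_ms: (n_times,) latency of each sample in milliseconds.
    Returns one mean amplitude per condition for the chosen window."""
    mask = (times_ms >= t_start) & (times_ms <= t_end)
    return erp[:, mask].mean(axis=1)

# Toy data: 3 conditions (fearful, happy, neutral) sampled from -100 to 500 ms
times = np.linspace(-100, 500, 308)
waves = np.random.default_rng(2).standard_normal((3, times.size))
print(mean_window_amplitude(waves, times, 200, 350))  # e.g. the EPN window above
```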

  1. Cortical activation deficits during facial emotion processing in youth at high risk for the development of substance use disorders.

    PubMed

    Hulvershorn, Leslie A; Finn, Peter; Hummer, Tom A; Leibenluft, Ellen; Ball, Brandon; Gichina, Victoria; Anand, Amit

    2013-08-01

    Recent longitudinal studies demonstrate that addiction risk may be influenced by a cognitive, affective and behavioral phenotype that emerges during childhood. Relatively little research has focused on the affective or emotional risk components of this high-risk phenotype, including the relevant neurobiology. Non-substance abusing youth (N=19; mean age=12.2) with externalizing psychopathology and paternal history of a substance use disorder and demographically matched healthy comparisons (N=18; mean age=11.9) were tested on a facial emotion matching task during functional MRI. This task involved matching faces by emotions (angry, anxious) or matching shape orientation. High-risk youth exhibited increased medial prefrontal, precuneus and occipital cortex activation compared to the healthy comparison group during the face matching condition, relative to the control shape condition. The occipital activation correlated positively with parent-rated emotion regulation impairments in the high-risk group. These findings suggest a preexisting abnormality in cortical activation in response to facial emotion matching in youth at high risk for the development of problem drug or alcohol use. These cortical deficits may underlie impaired affective processing and regulation, which in turn may contribute to escalating drug use in adolescence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Cortical Activation Deficits During Facial Emotion Processing in Youth at High Risk for the Development of Substance Use Disorders*

    PubMed Central

    Hulvershorn, Leslie A.; Finn, Peter; Hummer, Tom A.; Leibenluft, Ellen; Ball, Brandon; Gichina, Victoria; Anand, Amit

    2013-01-01

    Background Recent longitudinal studies demonstrate that addiction risk may be influenced by a cognitive, affective and behavioral phenotype that emerges during childhood. Relatively little research has focused on the affective or emotional risk components of this high-risk phenotype, including the relevant neurobiology. Methods Non-substance abusing youth (N = 19; mean age = 12.2) with externalizing psychopathology and paternal history of a substance use disorder and demographically matched healthy comparisons (N=18; mean age = 11.9) were tested on a facial emotion matching task during functional MRI. This task involved matching faces by emotions (angry, anxious) or matching shape orientation. Results High-risk youth exhibited increased medial prefrontal, precuneus and occipital cortex activation compared to the healthy comparison group during the face matching condition, relative to the control shape condition. The occipital activation correlated positively with parent-rated emotion regulation impairments in the high-risk group. Conclusions These findings suggest a preexisting abnormality in cortical activation in response to facial emotion matching in youth at high risk for the development of problem drug or alcohol use. These cortical deficits may underlie impaired affective processing and regulation, which in turn may contribute to escalating drug use in adolescence. PMID:23768841

  3. Basic and complex emotion recognition in children with autism: cross-cultural findings.

    PubMed

    Fridenson-Hayo, Shimrit; Berggren, Steve; Lassalle, Amandine; Tal, Shahar; Pigat, Delia; Bölte, Sven; Baron-Cohen, Simon; Golan, Ofer

    2016-01-01

    Children with autism spectrum conditions (ASC) have emotion recognition deficits when tested in different expression modalities (face, voice, body). However, these findings usually focus on basic emotions, using one or two expression modalities. In addition, cultural similarities and differences in emotion recognition patterns in children with ASC have not been explored before. The current study examined the similarities and differences in the recognition of basic and complex emotions by children with ASC and typically developing (TD) controls across three cultures: Israel, Britain, and Sweden. Fifty-five children with high-functioning ASC, aged 5-9, were compared to 58 TD children. At each site, groups were matched on age, sex, and IQ. Children were tested using four tasks, examining recognition of basic and complex emotions from voice recordings, videos of facial and bodily expressions, and emotional video scenarios including all modalities in context. Compared to their TD peers, children with ASC showed emotion recognition deficits in both basic and complex emotions on all three modalities and their integration in context. Complex emotions were harder to recognize than basic emotions for the entire sample. Cross-cultural agreement was found for all major findings, with minor deviations on the face and body tasks. Our findings highlight the multimodal nature of emotion recognition deficits in ASC, which exist for basic as well as complex emotions and are relatively stable cross-culturally. Cross-cultural research has the potential to reveal both autism-specific universal deficits and the role that specific cultures play in the way empathy operates in different countries.

  4. You may look unhappy unless you smile: the distinctiveness of a smiling face against faces without an explicit smile.

    PubMed

    Park, Hyung-Bum; Han, Ji-Eun; Hyun, Joo-Seok

    2015-05-01

    An expressionless face is often perceived as rude whereas a smiling face is considered hospitable. Repeated exposure to such perceptions may have fostered a stereotype of categorizing an expressionless face as expressing negative emotion. To test this idea, we displayed a search array where the target was an expressionless face and the distractors were either smiling or frowning faces. We manipulated set size. Search reaction times were delayed with frowning distractors, and the delays became more evident as the set size increased. We also devised a short-term comparison task where participants compared two sequential sets of expressionless, smiling, and frowning faces. Detection of an expression change across the sets was highly inaccurate when the change was between a frowning and an expressionless face. These results indicate that participants confused the emotions expressed by frowning and expressionless faces, suggesting that it is difficult to distinguish expressionless faces from frowning ones. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Detecting and Categorizing Fleeting Emotions in Faces

    PubMed Central

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885
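
    The d' analysis referred to above follows the standard signal-detection formula d' = z(hit rate) - z(false-alarm rate). The sketch below implements it with a simple count correction to keep rates away from 0 and 1; the correction and the example counts are illustrative assumptions, not details reported in the abstract.

```python
from scipy.stats import norm

def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate).

    Adding 0.5 to each count (a log-linear style correction) keeps the rates
    away from 0 and 1 so the inverse-normal transform stays finite."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return float(norm.ppf(hit_rate) - norm.ppf(fa_rate))

# Example: calling "angry" for angry faces 40/50 times and for happy faces 12/50 times
print(round(d_prime(40, 10, 12, 38), 2))
```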

  6. Posttraumatic stress and alcohol use among veterans: Amygdala and anterior cingulate activation to emotional cues.

    PubMed

    Simons, Raluca M; Simons, Jeffrey S; Olson, Dawne; Baugh, Lee; Magnotta, Vincent; Forster, Gina

    2016-11-01

    This fMRI study tested a model of combat trauma, posttraumatic stress symptoms (PTSS), alcohol use, and behavioral and neural responses to emotional cues in 100 OIF/OEF/OND veterans. Multilevel structural equation models were tested for left and right dorsal ACC (dACC), rostral ACC (rACC), and amygdala blood-oxygen-level-dependent responses during the emotional counting Stroop test and masked faces task. In the Stroop task, combat exposure moderated the effect of combat stimuli, resulting in hyperactivation in the rACC and dACC. Activation in the left amygdala also increased in response to combat stimuli, but effects did not vary as a function of combat severity. In the masked faces task, activation patterns did not vary as a function of stimulus. However, at the between-person level, amygdala activation during the masked faces task was inversely associated with PTSS. With respect to behavioral outcomes, higher PTSS were associated with a stronger Stroop effect, suggesting greater interference associated with combat words. Results are consistent with the premise that combat trauma results in hyperactivation in the ACC in response to combat stimuli, and, via its effect on PTSS, is associated with deficits in cognitive performance in the presence of combat stimuli. Across tasks, predeployment drinking was inversely associated with activation in the dACC but not the rACC or amygdala. Drinking may be a buffering factor, or negatively reinforcing in part because of its effects on normalizing brain response following trauma exposure. Alternatively, drinking may undermine adaptive functioning of the dACC when responding to traumatic stress cues. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. The effect of background music on episodic memory and autonomic responses: listening to emotionally touching music enhances facial memory capacity.

    PubMed

    Proverbio, Alice Mado; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto

    2015-10-15

    The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding.

  8. The effect of background music on episodic memory and autonomic responses: listening to emotionally touching music enhances facial memory capacity

    PubMed Central

    Mado Proverbio, C.A. Alice; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto

    2015-01-01

    The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding. PMID:26469712

  9. Affective theory of mind inferences contextually influence the recognition of emotional facial expressions.

    PubMed

    Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J

    2018-03-14

    The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence on the recognition of emotional facial expressions for both same and different valences.

  10. Crossmodal Adaptation in Right Posterior Superior Temporal Sulcus during Face–Voice Emotional Integration

    PubMed Central

    Latinus, Marianne; Noguchi, Takao; Garrod, Oliver; Crabbe, Frances; Belin, Pascal

    2014-01-01

    The integration of emotional information from the face and voice of other persons is known to be mediated by a number of "multisensory" cerebral regions, such as the right posterior superior temporal sulcus (pSTS). However, whether multimodal integration in these regions is attributable to interleaved populations of unisensory neurons responding to face or voice, or rather to multimodal neurons receiving input from the two modalities, is not fully clear. Here, we examine this question using functional magnetic resonance adaptation and dynamic audiovisual stimuli in which emotional information was manipulated parametrically and independently in the face and voice via morphing between angry and happy expressions. Healthy human adult subjects were scanned while performing a happy/angry emotion categorization task on a series of such stimuli included in a fast event-related, continuous carryover design. Subjects integrated both face and voice information when categorizing emotion—although there was a greater weighting of face information—and showed behavioral adaptation effects both within and across modality. Adaptation also occurred at the neural level: in addition to modality-specific adaptation in visual and auditory cortices, we observed for the first time a crossmodal adaptation effect. Specifically, fMRI signal in the right pSTS was reduced in response to a stimulus in which facial emotion was similar to the vocal emotion of the preceding stimulus. These results suggest that the integration of emotional information from face and voice in the pSTS involves a detectable proportion of bimodal neurons that combine inputs from visual and auditory cortices. PMID:24828635

  11. Neural Systems Underlying Emotional and Non-emotional Interference Processing: An ALE Meta-Analysis of Functional Neuroimaging Studies

    PubMed Central

    Xu, Min; Xu, Guiping; Yang, Yang

    2016-01-01

    Understanding how the nature of interference might influence the recruitment of neural systems is considered key to understanding cognitive control. Although interference processing in the emotional domain has recently attracted great interest, the question of whether there are separable neural patterns for emotional and non-emotional interference processing remains open. Here, we performed an activation likelihood estimation (ALE) meta-analysis of 78 neuroimaging experiments, and examined common and distinct neural systems for emotional and non-emotional interference processing. We examined brain activation in three domains of interference processing: emotional verbal interference in the face-word conflict task, non-emotional verbal interference in the color-word Stroop task, and non-emotional spatial interference in the Simon, SRC and Flanker tasks. Our results show that the dorsal anterior cingulate cortex (ACC) was recruited for both emotional and non-emotional interference. In addition, the right anterior insula, presupplementary motor area (pre-SMA), and right inferior frontal gyrus (IFG) were activated by interference processing across both emotional and non-emotional domains. In light of these results, we propose that the anterior insular cortex may serve to integrate information from different dimensions and work together with the dorsal ACC to detect and monitor conflicts, whereas the pre-SMA and right IFG may be recruited to inhibit inappropriate responses. In contrast, the dorsolateral prefrontal cortex (DLPFC) and posterior parietal cortex (PPC) showed different degrees of activation and distinct lateralization patterns for different processing domains, which suggests that these regions may implement cognitive control based on the specific task requirements. PMID:27895564

  12. Altered neural processing of emotional faces in remitted Cushing's disease.

    PubMed

    Bas-Hoogendam, Janna Marie; Andela, Cornelie D; van der Werff, Steven J A; Pannekoek, J Nienke; van Steenbergen, Henk; Meijer, Onno C; van Buchem, Mark A; Rombouts, Serge A R B; van der Mast, Roos C; Biermasz, Nienke R; van der Wee, Nic J A; Pereira, Alberto M

    2015-09-01

    Patients with long-term remission of Cushing's disease (CD) demonstrate residual psychological complaints. At present, it is not known how previous exposure to hypercortisolism affects psychological functioning in the long-term. Earlier magnetic resonance imaging (MRI) studies demonstrated abnormalities of brain structure and resting-state connectivity in patients with long-term remission of CD, but no data are available on functional alterations in the brain during the performance of emotional or cognitive tasks in these patients. We performed a cross-sectional functional MRI study, investigating brain activation during emotion processing in patients with long-term remission of CD. Processing of emotional faces versus a non-emotional control condition was examined in 21 patients and 21 matched healthy controls. Analyses focused on activation and connectivity of two a priori determined regions of interest: the amygdala and the medial prefrontal-orbitofrontal cortex (mPFC-OFC). We also assessed psychological functioning, cognitive failure, and clinical disease severity. Patients showed less mPFC activation during processing of emotional faces compared to controls, whereas no differences were found in amygdala activation. An exploratory psychophysiological interaction analysis demonstrated decreased functional coupling between the ventromedial PFC and posterior cingulate cortex (a region structurally connected to the PFC) in CD-patients. The present study is the first to show alterations in brain function and task-related functional coupling in patients with long-term remission of CD relative to matched healthy controls. These alterations may, together with abnormalities in brain structure, be related to the persisting psychological morbidity in patients with CD after long-term remission. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Emotion recognition and oxytocin in patients with schizophrenia

    PubMed Central

    Averbeck, B. B.; Bobin, T.; Evans, S.; Shergill, S. S.

    2012-01-01

    Background Studies have suggested that patients with schizophrenia are impaired at recognizing emotions. Recently, it has been shown that the neuropeptide oxytocin can have beneficial effects on social behaviors. Method To examine emotion recognition deficits in patients and see whether oxytocin could improve these deficits, we carried out two experiments. In the first experiment we recruited 30 patients with schizophrenia and 29 age- and IQ-matched control subjects, and gave them an emotion recognition task. Following this, we carried out a second experiment in which we recruited 21 patients with schizophrenia for a double-blind, placebo-controlled cross-over study of the effects of oxytocin on the same emotion recognition task. Results In the first experiment we found that patients with schizophrenia had a deficit relative to controls in recognizing emotions. In the second experiment we found that administration of oxytocin improved the ability of patients to recognize emotions. The improvement was consistent and occurred for most emotions, and was present whether patients were identifying morphed or non-morphed faces. Conclusions These data add to a growing literature showing beneficial effects of oxytocin on social–behavioral tasks, as well as clinical symptoms. PMID:21835090

  14. Older but not younger infants associate own-race faces with happy music and other-race faces with sad music.

    PubMed

    Xiao, Naiqi G; Quinn, Paul C; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2018-03-01

    We used a novel intermodal association task to examine whether infants associate own- and other-race faces with music of different emotional valences. Three- to 9-month-olds saw a series of neutral own- or other-race faces paired with happy or sad musical excerpts. Three- to 6-month-olds did not show any specific association between face race and music. At 9 months, however, infants looked longer at own-race faces paired with happy music than at own-race faces paired with sad music. Nine-month-olds also looked longer at other-race faces paired with sad music than at other-race faces paired with happy music. These results indicate that infants with nearly exclusive own-race face experience develop associations between face race and music emotional valence in the first year of life. The potential implications of such associations for developing racial biases in early childhood are discussed. © 2017 John Wiley & Sons Ltd.

  15. The face and its emotion: right N170 deficits in structural processing and early emotional discrimination in schizophrenic patients and relatives.

    PubMed

    Ibáñez, Agustín; Riveros, Rodrigo; Hurtado, Esteban; Gleichgerrcht, Ezequiel; Urquina, Hugo; Herrera, Eduar; Amoruso, Lucía; Reyes, Migdyrai Martin; Manes, Facundo

    2012-01-30

    Previous studies have reported facial emotion recognition impairments in schizophrenic patients, as well as abnormalities in the N170 component of the event-related potential. Current research on schizophrenia highlights the importance of complexly-inherited brain-based deficits. In order to examine the N170 markers of face structural and emotional processing, DSM-IV diagnosed schizophrenia probands (n=13), unaffected first-degree relatives from multiplex families (n=13), and control subjects (n=13) matched by age, gender and educational level, performed a categorization task which involved words and faces with positive and negative valence. The N170 component, while present in relatives and control subjects, was reduced in patients, not only for faces, but also for face-word differences, suggesting a deficit in structural processing of stimuli. Control subjects showed N170 modulation according to the valence of facial stimuli. However, this discrimination effect was found to be reduced both in patients and relatives. This is the first report showing N170 valence deficits in relatives. Our results suggest a generalized deficit affecting the structural encoding of faces in patients, as well as the emotion discrimination both in patients and relatives. Finally, these findings lend support to the notion that cortical markers of facial discrimination can be validly considered as vulnerability markers. © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Accuracy of emotion labeling in children of parents diagnosed with bipolar disorder.

    PubMed

    Hanford, Lindsay C; Sassi, Roberto B; Hall, Geoffrey B

    2016-04-01

    Emotion labeling deficits have been posited as an endophenotype for bipolar disorder (BD) as they have been observed in both patients and their first-degree relatives. It remains unclear whether these deficits exist secondary to the development of psychiatric symptoms or whether they can be attributed to risk for psychopathology. To explore this, we investigated emotion processing in symptomatic and asymptomatic high-risk bipolar offspring (HRO) and healthy children of healthy parents (HCO). Symptomatic (n = 18, age: 13.8 ± 2.6 years, 44% female) and asymptomatic (n = 12, age: 12.8 ± 3.0 years, 42% female) HRO and age- and sex-matched HCO (n = 20, age: 13.3 ± 2.5 years, 45% female) performed an emotion-labeling task. Total number of errors, emotion category and intensity of emotion error scores were compared. Correlations between total error scores and symptom severity were also investigated. Compared to HCO, both HRO groups made more errors on the adult face task (p corrected = 0.014). The HRO group were 2.3 times [90% CI: 0.9-6.3] more likely and 4.3 times [90% CI: 1.3-14.3] more likely to make errors on sad and angry faces, respectively. With the exception of sad face type errors, we observed no significant differences in error patterns between symptomatic and asymptomatic HRO, and no correlations between symptom severity and total number of errors. This study was cross-sectional in design, limiting our ability to infer trajectories or heritability of these deficits. This study provides further support for emotion labeling deficits as a candidate endophenotype for BD. Our study also suggests these deficits are not attributable to the presence of psychiatric symptoms. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Amygdala activity and prefrontal cortex-amygdala effective connectivity to emerging emotional faces distinguish remitted and depressed mood states in bipolar disorder.

    PubMed

    Perlman, Susan B; Almeida, Jorge R C; Kronhaus, Dina M; Versace, Amelia; Labarbara, Edmund J; Klein, Crystal R; Phillips, Mary L

    2012-03-01

    Few studies have employed effective connectivity (EC) to examine the functional integrity of neural circuitry supporting abnormal emotion processing in bipolar disorder (BD), a key feature of the illness. We used Granger Causality Mapping (GCM) to map EC between the prefrontal cortex (PFC) and bilateral amygdala and a novel paradigm to assess emotion processing in adults with BD. Thirty-one remitted adults with BD [(remitted BD), mean age = 32 years], 21 adults with BD in a depressed episode [(depressed BD), mean age = 33 years], and 25 healthy control participants [(HC), mean age = 31 years] performed a block-design emotion processing task requiring color-labeling of a color flash superimposed on a task-irrelevant face morphing from neutral to emotional (happy, sad, angry, or fearful). GCM measured EC preceding (top-down) and following (bottom-up) activity between the PFC and the left and right amygdalae. Our findings indicated patterns of abnormally elevated bilateral amygdala activity in response to emerging fearful, sad, and angry facial expressions in remitted-BD subjects versus HC, and abnormally elevated right amygdala activity to emerging fearful faces in depressed-BD subjects versus HC. We also showed distinguishable patterns of abnormal EC between the amygdala and dorsomedial and ventrolateral PFC, especially to emerging happy and sad facial expressions in remitted-BD and depressed-BD subjects. EC measures of neural system level functioning can further understanding of neural mechanisms associated with abnormal emotion processing and regulation in BD. Our findings suggest major differences in recruitment of amygdala-PFC circuitry, supporting implicit emotion processing between remitted-BD and depressed-BD subjects, which may underlie changes from remission to depression in BD. © 2012 John Wiley and Sons A/S.
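
    Granger Causality Mapping extends a bivariate Granger test to every voxel; the bivariate building block can be sketched with statsmodels, which asks whether past values of a source time series improve prediction of a target series. The simulated series, lag choice, and test statistic below are assumptions for illustration, not the study's implementation.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def granger_p_value(target: np.ndarray, source: np.ndarray, maxlag: int = 2) -> float:
    """Smallest ssr F-test p-value over lags 1..maxlag for 'source Granger-causes target'.

    statsmodels expects a two-column array whose second column is tested as
    the Granger cause of the first; both inputs are 1-D time series here."""
    data = np.column_stack([target, source])
    results = grangercausalitytests(data, maxlag=maxlag, verbose=False)
    return min(res[0]["ssr_ftest"][1] for res in results.values())

# Toy example: an "amygdala" series lag-coupled to a "prefrontal" series
rng = np.random.default_rng(3)
pfc = rng.standard_normal(300)
amy = 0.7 * np.roll(pfc, 2) + 0.3 * rng.standard_normal(300)
print(granger_p_value(amy, pfc))  # small p-value: evidence for a top-down PFC -> amygdala influence
```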

  18. From neural signatures of emotional modulation to social cognition: individual differences in healthy volunteers and psychiatric participants.

    PubMed

    Ibáñez, Agustín; Aguado, Jaume; Baez, Sandra; Huepe, David; Lopez, Vladimir; Ortega, Rodrigo; Sigman, Mariano; Mikulan, Ezequiel; Lischinsky, Alicia; Torrente, Fernando; Cetkovich, Marcelo; Torralva, Teresa; Bekinschtein, Tristan; Manes, Facundo

    2014-07-01

    It is commonly assumed that early emotional signals provide relevant information for social cognition tasks. The goal of this study was to test the association between (a) cortical markers of face emotional processing and (b) social-cognitive measures, and also to build a model which can predict this association (a and b) in healthy volunteers as well as in different groups of psychiatric patients. Thus, we investigated the early cortical processing of emotional stimuli (N170, using a face and word valence task) and their relationship with the social-cognitive profiles (SCPs, indexed by measures of theory of mind, fluid intelligence, speed processing and executive functions). Group comparisons and individual differences were assessed among schizophrenia (SCZ) patients and their relatives, individuals with attention deficit hyperactivity disorder (ADHD), individuals with euthymic bipolar disorder (BD) and healthy participants (educational level, handedness, age and gender matched). Our results provide evidence of emotional N170 impairments in the affected groups (SCZ and relatives, ADHD and BD) as well as subtle group differences. Importantly, cortical processing of emotional stimuli predicted the SCP, as evidenced by a structural equation model analysis. This is the first study to report an association model of brain markers of emotional processing and SCP. © The Author (2013). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  19. Influence of Temporal Expectations on Response Priming by Subliminal Faces

    PubMed Central

    Pichon, Swann; Guex, Raphael; Vuilleumier, Patrik

    2016-01-01

    Unconscious processes are often assumed immune from attention influence. Recent behavioral studies suggest however that the processing of subliminal information can be influenced by temporal attention. To examine the neural mechanisms underlying these effects, we used a stringent masking paradigm together with fMRI to investigate how temporal attention modulates the processing of unseen (masked) faces. Participants performed a gender decision task on a visible neutral target face, preceded by a masked prime face that could vary in gender (same or different than target) and emotion expression (neutral or fearful). We manipulated temporal attention by instructing participants to expect targets to appear either early or late during the stimulus sequence. Orienting temporal attention to subliminal primes influenced response priming by masked faces, even when gender was incongruent. In addition, gender-congruent primes facilitated responses regardless of attention while gender-incongruent primes reduced accuracy when attended. Emotion produced no differential effects. At the neural level, incongruent and temporally unexpected primes increased brain response in regions of the fronto-parietal attention network, reflecting greater recruitment of executive control and reorienting processes. Congruent and expected primes produced higher activations in fusiform cortex, presumably reflecting facilitation of perceptual processing. These results indicate that temporal attention can influence subliminal processing of face features, and thus facilitate information integration according to task-relevance regardless of conscious awareness. They also suggest that task-congruent information between prime and target may facilitate response priming even when temporal attention is not selectively oriented to the prime onset time. PMID:27764124

  20. Influence of Temporal Expectations on Response Priming by Subliminal Faces.

    PubMed

    Pichon, Swann; Guex, Raphael; Vuilleumier, Patrik

    2016-01-01

    Unconscious processes are often assumed immune from attention influence. Recent behavioral studies suggest however that the processing of subliminal information can be influenced by temporal attention. To examine the neural mechanisms underlying these effects, we used a stringent masking paradigm together with fMRI to investigate how temporal attention modulates the processing of unseen (masked) faces. Participants performed a gender decision task on a visible neutral target face, preceded by a masked prime face that could vary in gender (same or different than target) and emotion expression (neutral or fearful). We manipulated temporal attention by instructing participants to expect targets to appear either early or late during the stimulus sequence. Orienting temporal attention to subliminal primes influenced response priming by masked faces, even when gender was incongruent. In addition, gender-congruent primes facilitated responses regardless of attention while gender-incongruent primes reduced accuracy when attended. Emotion produced no differential effects. At the neural level, incongruent and temporally unexpected primes increased brain response in regions of the fronto-parietal attention network, reflecting greater recruitment of executive control and reorienting processes. Congruent and expected primes produced higher activations in fusiform cortex, presumably reflecting facilitation of perceptual processing. These results indicate that temporal attention can influence subliminal processing of face features, and thus facilitate information integration according to task-relevance regardless of conscious awareness. They also suggest that task-congruent information between prime and target may facilitate response priming even when temporal attention is not selectively oriented to the prime onset time.

  1. Rapid processing of emotional expressions without conscious awareness.

    PubMed

    Smith, Marie L

    2012-08-01

    Rapid accurate categorization of the emotional state of our peers is of critical importance and as such many have proposed that facial expressions of emotion can be processed without conscious awareness. Typically, studies focus selectively on fearful expressions due to their evolutionary significance, leaving the subliminal processing of other facial expressions largely unexplored. Here, I investigated the time course of processing of 3 facial expressions (fearful, disgusted, and happy) plus an emotionally neutral face, during objectively unaware and aware perception. Participants completed the challenging "which expression?" task in response to briefly presented backward-masked expressive faces. Although participants' behavioral responses did not differentiate between the emotional content of the stimuli in the unaware condition, activity over frontal and occipitotemporal (OT) brain regions indicated an emotional modulation of the neuronal response. Over frontal regions this modulation was driven by negative facial expressions and was present on all emotional trials independent of later categorization. In contrast, the N170 component, recorded at lateral OT electrodes, was enhanced for all facial expressions, but only on trials that would later be categorized as emotional. The results indicate that emotional faces, not only fearful ones, are processed without conscious awareness at an early stage and highlight the critical importance of considering the categorization response when studying subliminal perception.

  2. Emotion recognition deficits associated with ventromedial prefrontal cortex lesions are improved by gaze manipulation.

    PubMed

    Wolf, Richard C; Pujara, Maia; Baskaya, Mustafa K; Koenigs, Michael

    2016-09-01

    Facial emotion recognition is a critical aspect of human communication. Since abnormalities in facial emotion recognition are associated with social and affective impairment in a variety of psychiatric and neurological conditions, identifying the neural substrates and psychological processes underlying facial emotion recognition will help advance basic and translational research on social-affective function. Ventromedial prefrontal cortex (vmPFC) has recently been implicated in deploying visual attention to the eyes of emotional faces, although there is mixed evidence regarding the importance of this brain region for recognition accuracy. In the present study of neurological patients with vmPFC damage, we used an emotion recognition task with morphed facial expressions of varying intensities to determine (1) whether vmPFC is essential for emotion recognition accuracy, and (2) whether instructed attention to the eyes of faces would be sufficient to improve any accuracy deficits. We found that vmPFC lesion patients are impaired, relative to neurologically healthy adults, at recognizing moderate intensity expressions of anger and that recognition accuracy can be improved by providing instructions of where to fixate. These results suggest that vmPFC may be important for the recognition of facial emotion through a role in guiding visual attention to emotionally salient regions of faces. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Facial race and sex cues have a comparable influence on emotion recognition in Chinese and Australian participants.

    PubMed

    Craig, Belinda M; Zhang, Jing; Lipp, Ottmar V

    2017-10-01

    The magnitude of the happy categorisation advantage, the faster recognition of happiness than negative expressions, is influenced by facial race and sex cues. Previous studies have investigated these relationships using racial outgroups stereotypically associated with physical threat in predominantly Caucasian samples. To determine whether these influences generalise to stimuli representing other ethnic groups and to participants of different ethnicities, Caucasian Australian (Experiments 1 and 2) and Chinese participants (Experiment 2) categorised happy and angry expressions displayed on own-race male faces presented with emotional other-race male, own-race female, and other-race female faces in separate tasks. The influence of social category cues on the happy categorisation advantage was similar in the Australian and Chinese samples. In both samples, the happy categorisation advantage was present for own-race male faces when they were encountered with other-race male faces but reduced when own-race male faces were categorised along with female faces. The happy categorisation advantage was present for own-race and other-race female faces when they were encountered with own-race male faces in both samples. Results suggest similarity in the influence of social category cues on emotion categorisation.

  4. Amygdala activation in response to facial expressions in pediatric obsessive-compulsive disorder

    PubMed Central

    Britton, Jennifer C.; Stewart, S. Evelyn; Killgore, William D.S.; Rosso, Isabelle M.; Price, Lauren M.; Gold, Andrea L.; Pine, Daniel S.; Wilhelm, Sabine; Jenike, Michael A.; Rauch, Scott L.

    2010-01-01

    Background Exaggerated amygdala activation to threatening faces has been detected in adults and children with anxiety disorders, compared to healthy comparison subjects. However, the profile of amygdala activation in response to facial expressions in obsessive-compulsive disorder (OCD) may be a distinguishing feature; a prior study found that compared with healthy adults, adults with OCD exhibited less amygdala activation to emotional and neutral faces, relative to fixation (Cannistraro et al., 2004). Methods In the current event-related functional magnetic resonance imaging (fMRI) study, a pediatric OCD sample (N=12) and a healthy comparison sample (HC, N=17) performed a gender discrimination task while viewing emotional faces (happy, fear, disgust) and neutral faces. Results Compared to the HC group, the OCD group showed less amygdala/hippocampus activation in all emotion and neutral conditions relative to fixation. Conclusions Consistent with previous reports in adult OCD, pediatric OCD may be characterized by a neural profile distinct from that of other anxiety disorders with respect to amygdala activation in response to emotional stimuli that are not disorder-specific. PMID:20602430

  5. Avoiding threat in late adulthood: testing two life span theories of emotion.

    PubMed

    Orgeta, Vasiliki

    2011-07-01

    The purpose of the present research was to explore the time course of age-related attentional biases and the role of emotion regulation as a potential mediator of older adults' performance in an emotion dot probe task. In two studies, younger and older adults (N = 80) completed a visual probe detection task, which presented happy, angry, and sad facial expressions. Across both studies, age influenced attentional responses to angry faces. Results indicated a bias away from angry-related facial emotion information occurring relatively late in attention. Age effects were not attributable to decreasing information processing speed or visuoperceptual function. Current results demonstrated that an age-related attentional preference away from angry facial cues was mediated by efforts to suppress emotion. Findings are discussed in relation to current theories of sociocognitive aging.

  6. Efficacy of identifying neural components in the face and emotion processing system in schizophrenia using a dynamic functional localizer.

    PubMed

    Arnold, Aiden E G F; Iaria, Giuseppe; Goghari, Vina M

    2016-02-28

    Schizophrenia is associated with deficits in face perception and emotion recognition. Despite consistent behavioural results, the neural mechanisms underlying these cognitive abilities have been difficult to isolate, in part due to differences in neuroimaging methods used between studies for identifying regions in the face processing system. Given this problem, we aimed to validate a recently developed fMRI-based dynamic functional localizer task for use in studies of psychiatric populations and specifically schizophrenia. Previously, this functional localizer successfully identified each of the core face processing regions (i.e. fusiform face area, occipital face area, superior temporal sulcus), and regions within an extended system (e.g. amygdala) in healthy individuals. In this study, we tested the functional localizer success rate in 27 schizophrenia patients and in 24 community controls. Overall, the core face processing regions were localized equally well in both the schizophrenia and control groups. Additionally, the amygdala, a candidate brain region from the extended system, was identified in nearly half the participants from both groups. These results indicate the effectiveness of a dynamic functional localizer at identifying regions of interest associated with face perception and emotion recognition in schizophrenia. The use of dynamic functional localizers may help standardize the investigation of the facial and emotion processing system in this and other clinical populations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Neuroimaging Study of the Human Amygdala - Toward an Understanding of Emotional and Stress Responses -

    NASA Astrophysics Data System (ADS)

    Iidaka, Tetsuya

    The amygdala plays a critical role in the neural system involved in emotional responses and conditioned fear. The dysfunction of this system is thought to be a cause of several neuropsychiatric disorders. A neuroimaging study provides a unique opportunity for noninvasive investigation of the human amygdala. We studied the activity of this structure in normal subjects and patients with schizophrenia using a face recognition task. Our results showed that the amygdala was activated by the presentation of face stimuli, and a negative face activated the amygdala to a greater extent than a neutral face. Under the happy face condition, the activation of the amygdala was higher in the schizophrenic patients than in control subjects. A single nucleotide polymorphism in the regulatory region of the serotonin type 3 receptor gene had modulatory effects on the amygdaloid activity. Emotion regulation had a significant impact on the neural interaction between the amygdala and prefrontal cortices. Thus, studies on the human amygdala would greatly contribute to the elucidation of the neural system that determines emotional and stress responses. To clarify the relationship between this neural dysfunction and neuropsychiatric disorders, further studies using physiological, genetic, and hormonal approaches are essential.

  8. Event-related potentials reveal preserved attention allocation but impaired emotion regulation in patients with epilepsy and comorbid negative affect.

    PubMed

    De Taeye, Leen; Pourtois, Gilles; Meurs, Alfred; Boon, Paul; Vonck, Kristl; Carrette, Evelien; Raedt, Robrecht

    2015-01-01

    Patients with epilepsy have a high prevalence of comorbid mood disorders. This study aims to evaluate whether negative affect in epilepsy is associated with dysfunction of emotion regulation. Event-related potentials (ERPs) are used in order to unravel the exact electrophysiological time course and investigate whether a possible dysfunction arises during early (attention) and/or late (regulation) stages of emotion control. Fifty epileptic patients with (n = 25) versus without (n = 25) comorbid negative affect plus twenty-five matched controls were recruited. ERPs were recorded while subjects performed a face- or house-matching task in which fearful, sad or neutral faces were presented either at attended or unattended spatial locations. Two ERP components were analyzed: the early vertex positive potential (VPP) which is normally enhanced for faces, and the late positive potential (LPP) that is typically larger for emotional stimuli. All participants had larger amplitude of the early face-sensitive VPP for attended faces compared to houses, regardless of their emotional content. By contrast, in patients with negative affect only, the amplitude of the LPP was significantly increased for unattended negative emotional expressions. These VPP results indicate that epilepsy with or without negative affect does not interfere with the early structural encoding and attention selection of faces. However, the LPP results suggest abnormal regulation processes during the processing of unattended emotional faces in patients with epilepsy and comorbid negative affect. In conclusion, this ERP study reveals that early object-based attention processes are not compromised by epilepsy, but instead, when combined with negative affect, this neurological disease is associated with dysfunction during the later stages of emotion regulation. As such, these new neurophysiological findings shed light on the complex interplay of epilepsy with negative affect during the processing of emotional visual stimuli and in turn might help to better understand the etiology and maintenance of mood disorders in epilepsy.

  9. ADRA2B genotype differentially modulates stress-induced neural activity in the amygdala and hippocampus during emotional memory retrieval.

    PubMed

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2015-02-01

    Noradrenaline interacts with stress hormones in the amygdala and hippocampus to enhance emotional memory consolidation, but the noradrenergic-glucocorticoid interaction at retrieval, where stress impairs memory, is less understood. We used a genetic neuroimaging approach to investigate whether a genetic variation of the noradrenergic system impacts stress-induced neural activity in amygdala and hippocampus during recognition of emotional memory. This study is based on a genotype-dependent reanalysis of data from our previous publication (Li et al. Brain Imaging Behav 2014). Twenty-two healthy male volunteers were genotyped for the ADRA2B gene encoding the α2B-adrenergic receptor. Ten deletion carriers and 12 noncarriers performed an emotional face recognition task, while their brain activity was measured with fMRI. During encoding, 50 fearful and 50 neutral faces were presented. One hour later, they underwent either an acute stress (Trier Social Stress Test) or a control procedure, which was followed immediately by the retrieval session, where participants had to discriminate between 100 old and 50 new faces. A genotype-dependent modulation of neural activity at retrieval was found in the bilateral amygdala and right hippocampus. Deletion carriers showed decreased neural activity in the amygdala when recognizing emotional faces in the control condition and increased amygdala activity under stress. Noncarriers showed no differences in emotionally modulated amygdala activation under stress or control conditions. Instead, stress-induced increases during recognition of emotional faces were present in the right hippocampus. The genotype-dependent effects of acute stress on neural activity in amygdala and hippocampus provide evidence for a noradrenergic-glucocorticoid interaction in emotional memory retrieval.

  10. The Relationships between Processing Facial Identity, Emotional Expression, Facial Speech, and Gaze Direction during Development

    ERIC Educational Resources Information Center

    Spangler, Sibylle M.; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna

    2010-01-01

    Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding…

  11. Early ERP Modulation for Task-Irrelevant Subliminal Faces

    PubMed Central

    Pegna, Alan J.; Darque, Alexandra; Berrut, Claire; Khateb, Asaid

    2011-01-01

    A number of investigations have reported that emotional faces can be processed subliminally, and that they give rise to specific patterns of brain activation in the absence of awareness. Recent event-related potential (ERP) studies have suggested that electrophysiological differences occur early in time (<200 ms) in response to backward-masked emotional faces. These findings have been taken as evidence of a rapid non-conscious pathway, which would allow threatening stimuli to be processed rapidly and subsequently allow appropriate avoidance action to be taken. However, for this to be the case, subliminal processing should arise even if the threatening stimulus is not attended. This point has in fact not yet been clearly established. In this ERP study, we investigated whether subliminal processing of fearful faces occurs outside the focus of attention. Fourteen healthy participants performed a line judgment task while fearful and non-fearful (happy or neutral) faces were presented both subliminally and supraliminally. ERPs were compared across the four experimental conditions (i.e., subliminal and supraliminal; fearful and non-fearful). The earliest differences between fearful and non-fearful faces appeared as an enhanced posterior negativity for the former at 170 ms (the N170 component) over right temporo-occipital electrodes. This difference was observed for both subliminal (p < 0.05) and supraliminal presentations (p < 0.01). Our results confirm that subliminal processing of fearful faces occurs early in the course of visual processing, and more importantly, that this arises even when the subject's attention is engaged in an incidental task. PMID:21687457

  12. Brain circuitries involved in emotional interference task in major depression disorder.

    PubMed

    Chechko, Natalia; Augustin, Marc; Zvyagintsev, Michael; Schneider, Frank; Habel, Ute; Kellermann, Thilo

    2013-07-01

    Emotional and non-emotional Stroop tasks are frequently applied to study major depressive disorder (MDD). The versions of the emotional Stroop used in previous studies were not, unlike the ones employed in the present study, based on semantic incongruence, making it difficult to compare the tasks. We used functional magnetic resonance imaging (fMRI) to study the neural and behavioral responses of 18 healthy subjects and 18 subjects with MDD to emotional and non-emotional word-face Stroop tasks based on semantic incompatibility between targets and distractors. In both groups, the distractors triggered significant amounts of interference conflict. A between-groups comparison revealed hypoactivation in MDD during the emotional task in areas supporting conflict resolution (lateral prefrontal cortex, parietal and extrastriate cortices), paralleled by an increased response in the right amygdala. Response in the amygdala, however, did not vary between conflicting and non-conflicting trials. While healthy controls showed considerably stronger involvement of networks related to conflict resolution in the emotional (compared to the non-emotional) task, in patients the processing differences between the two conflict types were negligible. The patient group was heterogeneous in terms of medication and clinical characteristics, and the proportion of female participants was higher, so gender effects could not be examined or ruled out. Whilst healthy controls seemed able to adjust the involvement of the network supporting conflict resolution based on conflict demand, patients appeared to lack this capability. The reduced cortical involvement coupled with the increased response of limbic structures might underlie the maladjustment to new demands in depressed mood. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Sad benefit in face working memory: an emotional bias of melancholic depression.

    PubMed

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, in which two faces with angry, happy, neutral, sad or fearful expressions had to be retained over a one-second delay. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.
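
    As an illustration of the signal detection approach mentioned above, the sketch below computes d' per expression category and a "sad benefit" score (d' for sad faces minus the mean d' of the other categories). It is a minimal sketch with hypothetical trial counts and variable names; it is not the authors' analysis pipeline.

    ```python
    import numpy as np
    from scipy.stats import norm

    def dprime(hits, misses, false_alarms, correct_rejections):
        """Signal-detection sensitivity d', with a log-linear correction so that
        hit or false-alarm rates of exactly 0 or 1 do not yield infinite z-scores."""
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Hypothetical per-category trial counts for one participant
    counts = {
        "sad":     dict(hits=36, misses=4,  false_alarms=6,  correct_rejections=34),
        "happy":   dict(hits=30, misses=10, false_alarms=9,  correct_rejections=31),
        "angry":   dict(hits=31, misses=9,  false_alarms=10, correct_rejections=30),
        "neutral": dict(hits=29, misses=11, false_alarms=8,  correct_rejections=32),
    }
    d = {emotion: dprime(**c) for emotion, c in counts.items()}

    # "Sad benefit": memory sensitivity for sad faces relative to the other categories
    sad_benefit = d["sad"] - np.mean([v for k, v in d.items() if k != "sad"])
    print({k: round(v, 2) for k, v in d.items()}, "sad benefit:", round(sad_benefit, 2))
    ```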

  14. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    PubMed

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other, non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor strictly analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged the STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and the pulvinar when features were seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by reciprocal interactions and modulated by task demands. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample.

    PubMed

    O'Leary-Barrett, Maeve; Pihl, Robert O; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L W; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W; Smolka, Michael N; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J

    2015-01-01

    To investigate the role of personality factors and attentional biases towards emotional faces in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Data were obtained as part of the IMAGEN study, conducted across 8 European sites with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict the mental disorders examined in the current sample. Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed.
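
    For readers unfamiliar with the dot-probe measure used in this record, the sketch below shows one conventional way to derive an attention bias score from trial-level reaction times (mean RT on incongruent trials minus mean RT on congruent trials). The data frame and column names are hypothetical; this is not the IMAGEN study's scoring code.

    ```python
    import pandas as pd

    # Hypothetical trial-level dot-probe data for one participant:
    # "congruent" = the probe appeared at the location of the emotional face.
    trials = pd.DataFrame({
        "congruent": [True, False, True, False, True, False, True, False],
        "rt_ms":     [512, 545, 498, 530, 520, 560, 505, 541],
        "correct":   [True, True, True, True, False, True, True, True],
    })

    valid = trials[trials["correct"]]                     # drop error trials
    bias = (valid.loc[~valid["congruent"], "rt_ms"].mean()
            - valid.loc[valid["congruent"], "rt_ms"].mean())

    # Positive scores indicate attention drawn toward the emotional face
    # (faster responses when the probe replaces it).
    print(f"attention bias score: {bias:.1f} ms")
    ```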

  16. Personality, Attentional Biases towards Emotional Faces and Symptoms of Mental Disorders in an Adolescent Sample

    PubMed Central

    O’Leary-Barrett, Maeve; Pihl, Robert O.; Artiges, Eric; Banaschewski, Tobias; Bokde, Arun L. W.; Büchel, Christian; Flor, Herta; Frouin, Vincent; Garavan, Hugh; Heinz, Andreas; Ittermann, Bernd; Mann, Karl; Paillère-Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Poustka, Luise; Rietschel, Marcella; Robbins, Trevor W.; Smolka, Michael N.; Ströhle, Andreas; Schumann, Gunter; Conrod, Patricia J.

    2015-01-01

    Objective To investigate the role of personality factors and attentional biases towards emotional faces in establishing concurrent and prospective risk for mental disorder diagnosis in adolescence. Method Data were obtained as part of the IMAGEN study, conducted across 8 European sites with a community sample of 2257 adolescents. At 14 years, participants completed an emotional variant of the dot-probe task, as well as two personality measures, namely the Substance Use Risk Profile Scale and the revised NEO Personality Inventory. At 14 and 16 years, participants and their parents were interviewed to determine symptoms of mental disorders. Results Personality traits were general and specific risk indicators for mental disorders at 14 years. Increased specificity was obtained when investigating the likelihood of mental disorders over a 2-year period, with the Substance Use Risk Profile Scale showing incremental validity over the NEO Personality Inventory. Attentional biases to emotional faces did not characterise or predict the mental disorders examined in the current sample. Discussion Personality traits can indicate concurrent and prospective risk for mental disorders in a community youth sample, and identify at-risk youth beyond the impact of baseline symptoms. This study does not support the hypothesis that attentional biases mediate the relationship between personality and psychopathology in a community sample. Task and sample characteristics that contribute to differing results among studies are discussed. PMID:26046352

  17. Spatial frequency filtered images reveal differences between masked and unmasked processing of emotional information.

    PubMed

    Rohr, Michaela; Wentura, Dirk

    2014-10-01

    High and low spatial frequency information has been shown to contribute differently to the processing of emotional information. In three priming studies using spatial frequency filtered emotional face primes, emotional face targets, and an emotion categorization task, we investigated this issue further. Differences in the pattern of results between short and masked, and short and long unmasked presentation conditions emerged. Given long and unmasked prime presentation, high and low frequency primes triggered emotion-specific priming effects. Given brief and masked prime presentation in Experiment 2, we found a dissociation: High frequency primes caused a valence priming effect, whereas low frequency primes yielded a differentiation between low and high arousing information within the negative domain. Brief and unmasked prime presentation in Experiment 3 revealed that subliminal processing of primes was responsible for the pattern observed in Experiment 2. The implications of these findings for theories of early emotional information processing are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Affective learning modulates spatial competition during low-load attentional conditions.

    PubMed

    Lim, Seung-Lark; Padmala, Srikanth; Pessoa, Luiz

    2008-04-01

    It has been hypothesized that the amygdala mediates the processing advantage of emotional items. In the present study, we employed functional magnetic resonance imaging (fMRI) to investigate how fear conditioning affected the visual processing of task-irrelevant faces. We hypothesized that faces previously paired with shock (threat faces) would more effectively vie for processing resources during conditions involving spatial competition. To investigate this question, following conditioning, participants performed a letter-detection task on an array of letters that was superimposed on task-irrelevant faces. Attentional resources were manipulated by having participants perform an easy or a difficult search task. Our findings revealed that threat fearful faces evoked stronger responses in the amygdala and fusiform gyrus relative to safe fearful faces during low-load attentional conditions, but not during high-load conditions. Consistent with the increased processing of shock-paired stimuli during the low-load condition, such stimuli exhibited increased behavioral priming and fMRI repetition effects relative to unpaired faces during a subsequent implicit-memory task. Overall, our results suggest a competition model in which affective significance signals from the amygdala may constitute a key modulatory factor determining the neural fate of visual stimuli. In addition, it appears that such competitive advantage is only evident when sufficient processing resources are available to process the affective stimulus.

  19. Overestimation of the Subjective Experience of Time in Social Anxiety: Effects of Facial Expression, Gaze Direction, and Time Course

    PubMed Central

    Ishikawa, Kenta; Okubo, Matia

    2016-01-01

    It is known that threatening stimuli increase emotional arousal, resulting in overestimating the subjective experience of passing time. Moreover, facial expressions and gaze direction interact to create socially threatening situations in people with social anxiety. The present study investigated the effect of social anxiety on the perceived duration of observing emotional faces with a direct or an averted gaze. Participants were divided into high, medium and low social anxiety groups based on social anxiety inventory scores. Participants then performed a temporal bisection task. Participants with high social anxiety provided larger overestimates for neutral faces with an averted gaze than those with low social anxiety in the second half of the task, whereas these differences were not found for angry face with direct and averted gaze. These results suggest that people with social anxiety perceive the duration of threatening situations as being longer than true durations based on objectively measured time. PMID:27199844
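
    A temporal bisection task of the kind described above is typically scored by locating the bisection point: the probe duration judged "long" on 50% of trials, with lower bisection points indicating overestimation of duration. The sketch below uses hypothetical response proportions and simple linear interpolation; a full analysis would normally fit a psychometric function instead, and this is not the authors' analysis code.

    ```python
    import numpy as np

    # Hypothetical temporal bisection data: probe durations (ms) and the
    # proportion of "long" responses for one participant.
    durations = np.array([400, 600, 800, 1000, 1200, 1400, 1600])
    p_long    = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.95, 1.00])

    # Bisection point = duration judged "long" on 50% of trials,
    # estimated here by linear interpolation between the flanking points.
    bp = np.interp(0.5, p_long, durations)
    print(f"bisection point: {bp:.0f} ms")  # lower values = durations feel longer
    ```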

  20. Emotional Response Inhibition in Bipolar Disorder: A Functional Magnetic Resonance Imaging Study of Trait- and State-Related Abnormalities

    PubMed Central

    Hummer, Tom A.; Hulvershorn, Leslie A.; Karne, Harish S.; Gunn, Abigail D.; Wang, Yang; Anand, Amit

    2018-01-01

    Background Impaired response inhibition and poor impulse control are hallmarks of the manic phase of bipolar disorder but are also present during depressive and, to a lesser degree, euthymic periods. The neural mechanisms underlying these impairments are poorly understood, including how mechanisms are related to bipolar trait or state effects. Methods One-hundred four unmedicated participants with bipolar mania (BM) (n = 30), bipolar depression (BD) (n = 30), bipolar euthymia (BE) (n = 14), and healthy control subjects (n = 30) underwent functional magnetic resonance imaging during emotional and nonemotional go/no-go tasks. The go/no-go task requires participants to press a button for go stimuli, while inhibiting the response to no-go trials. In separate blocks, participants inhibited the response to happy faces, sad faces, or letters. Results The BE group had higher insula activity during happy face inhibition and greater activity in left inferior frontal gyrus during sad face inhibition, demonstrating bipolar trait effects. Relative to the BE group, BD and BM groups demonstrated lower insula activity during inhibition of happy faces, though the depressed sample had lower activity than manic patients. The BD and BM groups had a greater response to inhibiting sad faces in emotion processing and regulation regions, including putamen, insula, and lateral prefrontal cortex. The manic group also had higher activity in insula and putamen during neutral letter inhibition. Conclusions These results suggest distinct trait- and state-related neural abnormalities during response inhibition in bipolar disorder, with implications for future research and treatment. PMID:22871393

  1. Decoding Task and Stimulus Representations in Face-responsive Cortex

    PubMed Central

    Kliemann, Dorit; Jacoby, Nir; Anzellotti, Stefano; Saxe, Rebecca R.

    2017-01-01

    Faces provide rich social information about others’ stable traits (e.g., age) and fleeting states of mind (e.g., emotional expression). While some of these facial aspects may be processed automatically, observers can also deliberately attend to some features while ignoring others. It remains unclear how internal goals (e.g., task context) influence the representational geometry of variable and stable facial aspects in face-responsive cortex. We investigated neural response patterns related to decoding (i) the intention to attend to a facial aspect before its perception, (ii) the attended aspect of a face, and (iii) stimulus properties. We measured neural responses while subjects watched videos of dynamic positive and negative expressions and judged the age or the expression’s valence. Split-half multivoxel pattern analyses (MVPA) showed that (i) the intention to attend to a specific aspect of a face can be decoded from left fronto-lateral, but not face-responsive, regions; (ii) during face perception, the attended aspect (age vs. emotion) could be robustly decoded from almost all face-responsive regions; and (iii) a stimulus property (valence) was represented in right posterior superior temporal sulcus and medial prefrontal cortices. The effect of deliberately shifting the focus of attention on these representations suggests a powerful influence of top-down signals on the cortical representation of social information, varying across cortical regions and likely reflecting neural flexibility to optimally integrate internal goals and dynamic perceptual input. PMID:27978778
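
    Split-half MVPA of the kind referred to above is commonly implemented as correlation-based classification: condition-specific response patterns from one half of the data are compared with patterns from the other half, and decoding succeeds when same-condition correlations exceed between-condition correlations. The sketch below illustrates that logic on synthetic voxel patterns; the names and data are hypothetical and it is not the authors' analysis code.

    ```python
    import numpy as np

    def splithalf_decode(patterns_half1, patterns_half2):
        """Correlation-based split-half classification.
        patterns_half*: dict mapping condition name -> 1D voxel pattern (run average).
        Returns the fraction of conditions whose half-1 pattern correlates more
        strongly with the same condition in half 2 than with any other condition."""
        conds = list(patterns_half1)
        correct = 0
        for c1 in conds:
            r = {c2: np.corrcoef(patterns_half1[c1], patterns_half2[c2])[0, 1]
                 for c2 in conds}
            correct += max(r, key=r.get) == c1
        return correct / len(conds)

    # Hypothetical region-of-interest patterns (200 voxels) for two task conditions
    rng = np.random.default_rng(0)
    base = {c: rng.normal(size=200) for c in ("attend_age", "attend_emotion")}
    half1 = {c: base[c] + rng.normal(scale=0.8, size=200) for c in base}
    half2 = {c: base[c] + rng.normal(scale=0.8, size=200) for c in base}
    print("decoding accuracy:", splithalf_decode(half1, half2))
    ```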

  2. Decreased response inhibition to sad faces during explicit and implicit tasks in females with depression: Evidence from an event-related potential study.

    PubMed

    Yu, Fengqiong; Zhou, Xiaoqing; Qing, Wu; Li, Dan; Li, Jing; Chen, Xingui; Ji, Gongjun; Dong, Yi; Luo, Yuejia; Zhu, Chunyan; Wang, Kai

    2017-01-30

    The present study aimed to investigate the neural substrates of response inhibition to sad faces across explicit and implicit tasks in depressed female patients. Event-related potentials were obtained while participants performed modified explicit and implicit emotional go/no-go tasks. Compared to controls, depressed patients showed decreased discrimination accuracy and reduced amplitudes of the original and no-go minus go difference waves in the P3 interval during response inhibition to sad faces in both explicit and implicit tasks. The P3 difference wave was positively correlated with discrimination accuracy and was independent of clinical assessment. The activation of the right dorsal prefrontal cortex was larger for the implicit than for the explicit task in the sad condition in healthy controls, but was similar for the two tasks in depressed patients. The present study indicated that the selective impairment of response inhibition to sad faces in depressed female patients occurred at the behavioral inhibition stage across implicit and explicit tasks and may be a trait-like marker of depression. Longitudinal studies are required to determine whether decreased response inhibition to sad faces increases the risk for future depressive episodes so that appropriate treatment can be administered to patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
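
    As a concrete illustration of the difference-wave measure described above, the sketch below averages single-trial epochs per condition, forms the no-go minus go difference wave, and extracts its mean amplitude in an assumed 300-500 ms P3 window. The sampling rate, window, and data are hypothetical; a real ERP pipeline (e.g., with MNE-Python) would also include filtering, baseline correction, and artifact rejection.

    ```python
    import numpy as np

    # Hypothetical single-electrode epochs: shape (n_trials, n_timepoints),
    # sampled at 500 Hz with the epoch starting at stimulus onset.
    sfreq = 500.0
    rng = np.random.default_rng(1)
    go_epochs   = rng.normal(size=(120, 400))
    nogo_epochs = rng.normal(size=(40, 400)) + 1.0  # assumed larger positivity on no-go trials

    # Condition averages and the no-go minus go difference wave
    go_erp, nogo_erp = go_epochs.mean(axis=0), nogo_epochs.mean(axis=0)
    diff_wave = nogo_erp - go_erp

    # Mean amplitude in an assumed P3 window (300-500 ms post-stimulus)
    times = np.arange(go_erp.size) / sfreq
    p3_mask = (times >= 0.300) & (times <= 0.500)
    print("P3 difference-wave amplitude (a.u.):", diff_wave[p3_mask].mean())
    ```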

  3. Aberrant activity and connectivity of the posterior superior temporal sulcus during social cognition in schizophrenia.

    PubMed

    Mier, Daniela; Eisenacher, Sarah; Rausch, Franziska; Englisch, Susanne; Gerchen, Martin Fungisai; Zamoscik, Vera; Meyer-Lindenberg, Andreas; Zink, Mathias; Kirsch, Peter

    2017-10-01

    Schizophrenia is associated with significant impairments in social cognition. These impairments have been shown to go along with altered activation of the posterior superior temporal sulcus (pSTS). However, studies that investigate connectivity of pSTS during social cognition in schizophrenia are sparse. Twenty-two patients with schizophrenia and 22 matched healthy controls completed a social-cognitive task for functional magnetic resonance imaging that allows the investigation of affective Theory of Mind (ToM), emotion recognition and the processing of neutral facial expressions. Moreover, a resting-state measurement was taken. Patients with schizophrenia performed worse in the social-cognitive task (main effect of group). In addition, a group by social-cognitive processing interaction was revealed for activity, as well as for connectivity during the social-cognitive task, i.e., patients with schizophrenia showed hyperactivity of right pSTS during neutral face processing, but hypoactivity during emotion recognition and affective ToM. In addition, hypoconnectivity between right and left pSTS was revealed for affective ToM, but not for neutral face processing or emotion recognition. No group differences in connectivity from right to left pSTS occurred during resting state. This pattern of aberrant activity and connectivity of the right pSTS during social cognition might form the basis of false-positive perceptions of emotions and intentions and could contribute to the emergence and sustainment of delusions.

  4. The influence of psychological resilience on the relation between automatic stimulus evaluation and attentional breadth for surprised faces.

    PubMed

    Grol, Maud; De Raedt, Rudi

    2015-01-01

    The broaden-and-build theory relates positive emotions to resilience and cognitive broadening. The theory proposes that the broadening effects underlie the relation between positive emotions and resilience, suggesting that resilient people can benefit more from positive emotions at the level of cognitive functioning. Research has investigated the influence of positive emotions on attentional broadening, but the stimulus that is the target of attention may also influence attentional breadth, depending on its affective evaluation. Surprised faces are particularly interesting as they are valence-ambiguous; therefore, we investigated the relation between affective evaluation (measured with an affective priming task) and attentional breadth for surprised faces, and how this relation is influenced by resilience. Results show that more positive evaluations are related to more attentional broadening at high levels of resilience, while this relation is reversed at low levels. This indicates that resilient individuals can benefit more from attending to positively evaluated stimuli at the level of attentional broadening.

  5. Emotional task management: neural correlates of switching between affective and non-affective task-sets

    PubMed Central

    Reeck, Crystal

    2015-01-01

    Although task-switching has been investigated extensively, its interaction with emotionally salient task content remains unclear. Prioritized processing of affective stimulus content may enhance accessibility of affective task-sets and generate increased interference when switching between affective and non-affective task-sets. Previous research has demonstrated that more dominant task-sets experience greater switch costs, as they necessitate active inhibition during performance of less entrenched tasks. Extending this logic to the affective domain, the present experiment examined (a) whether affective task-sets are more dominant than non-affective ones, and (b) what neural mechanisms regulate affective task-sets, so that weaker, non-affective task-sets can be executed. While undergoing functional magnetic resonance imaging, participants categorized face stimuli according to either their gender (non-affective task) or their emotional expression (affective task). Behavioral results were consistent with the affective task dominance hypothesis: participants were slower to switch to the affective task, and cross-task interference was strongest when participants tried to switch from the affective to the non-affective task. These behavioral costs of controlling the affective task-set were mirrored in the activation of a right-lateralized frontostriatal network previously implicated in task-set updating and response inhibition. Connectivity between amygdala and right ventrolateral prefrontal cortex was especially pronounced during cross-task interference from affective features. PMID:25552571
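
    The behavioral switch costs reported in this record are conventionally computed as mean RT on task-switch trials minus mean RT on task-repeat trials, separately for each task; asymmetric costs then appear as a larger cost for one task than the other. The sketch below illustrates that computation on hypothetical trial-level data; it is not the authors' analysis code.

    ```python
    import pandas as pd

    # Hypothetical trial-level data from a gender/emotion task-switching session
    trials = pd.DataFrame({
        "task":   ["emotion", "gender", "emotion", "gender",
                   "emotion", "gender", "emotion", "gender"],
        "switch": [False, True, True, False, True, True, False, False],
        "rt_ms":  [640, 730, 690, 610, 705, 745, 655, 600],
    })

    # Mean RT per task and trial type, then switch cost = switch minus repeat
    mean_rt = trials.groupby(["task", "switch"])["rt_ms"].mean().unstack("switch")
    switch_cost = mean_rt[True] - mean_rt[False]
    print(switch_cost)  # compare the cost of switching to the affective vs. non-affective task
    ```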

  6. Affective blindsight in the absence of input from face processing regions in occipital-temporal cortex.

    PubMed

    Striemer, Christopher L; Whitwell, Robert L; Goodale, Melvyn A

    2017-11-12

    Previous research suggests that the implicit recognition of emotional expressions may be carried out by pathways that bypass primary visual cortex (V1) and project to the amygdala. Some of the strongest evidence supporting this claim comes from case studies of "affective blindsight" in which patients with V1 damage can correctly guess whether an unseen face was depicting a fearful or happy expression. In the current study, we report a new case of affective blindsight in patient MC who is cortically blind following extensive bilateral lesions to V1, as well as face and object processing regions in her ventral visual stream. Despite her large lesions, MC has preserved motion perception which is related to sparing of the motion sensitive region MT+ in both hemispheres. To examine affective blindsight in MC we asked her to perform gender and emotion discrimination tasks in which she had to guess, using a two-alternative forced-choice procedure, whether the face presented was male or female, happy or fearful, or happy or angry. In addition, we also tested MC in a four-alternative forced-choice target localization task. Results indicated that MC was not able to determine the gender of the faces (53% accuracy), or localize targets in a forced-choice task. However, she was able to determine, at above chance levels, whether the face presented was depicting a happy or fearful (67%, p = .006), or a happy or angry (64%, p = .025) expression. Interestingly, although MC was better than chance at discriminating between emotions in faces when asked to make rapid judgments, her performance fell to chance when she was asked to provide subjective confidence ratings about her performance. These data lend further support to the idea that there is a non-conscious visual pathway that bypasses V1 which is capable of processing affective signals from facial expressions without input from higher-order face and object processing regions in the ventral visual stream. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. The mental and subjective skin: Emotion, empathy, feelings and thermography.

    PubMed

    Salazar-López, E; Domínguez, E; Juárez Ramos, V; de la Fuente, J; Meins, A; Iborra, O; Gálvez, G; Rodríguez-Artacho, M A; Gómez-Milán, E

    2015-07-01

    We applied thermography to investigate the cognitive neuropsychology of emotions, using it as a somatic marker of subjective experience during emotional tasks. We obtained results that showed significant correlations between changes in facial temperature and mental set. The main result was the change in the temperature of the nose, which tended to decrease with negative valence stimuli but to increase with positive emotions and arousal patterns. However, temperature change was identified not only in the nose, but also in the forehead, the oro-facial area, the cheeks and in the face taken as a whole. Nevertheless, thermic facial changes, mostly nasal temperature changes, correlated positively with participants' empathy scores and their performance. We found that temperature changes in the face may reveal maps of bodily sensations associated with different emotions and feelings like love. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Recognizing the bank robber and spotting the difference: emotional state and global vs. local attentional set.

    PubMed

    Pacheco-Unguetti, Antonia Pilar; Acosta, Alberto; Lupiáñez, Juan

    2014-01-01

    In two experiments (161 participants in total), we investigated how current mood influences processing styles (global vs. local). Participants watched a video of a bank robbery before receiving a positive, negative or neutral mood induction, and then performed two tasks: a face-recognition task about the bank robber as a global processing measure, and a spot-the-difference task using neutral pictures (Experiment 1) or emotional scenes (Experiment 2) as a local processing measure. Results showed that positive mood induction favoured a global processing style, enhancing participants' ability to correctly identify a face even though they had watched the video before the mood induction. This shows that, besides influencing encoding processes, mood state can also be related to retrieval processes. In contrast, negative mood induction enhanced a local processing style, making the detection of differences between nearly identical pictures easier and faster, independently of their valence. This dissociation supports the hypothesis that current mood modulates processing through the activation of different cognitive styles.

  9. The Three Models of Emotional Intelligence and Performance in a Hot and Cool go/no-go Task in Undergraduate Students

    PubMed Central

    Gutiérrez-Cobo, María J.; Cabello, Rosario; Fernández-Berrocal, Pablo

    2017-01-01

    Emotional intelligence (EI), or the ability to perceive, use, understand and regulate emotions, appears to be helpful in the performance of “hot” (i.e., emotionally laden) cognitive tasks when using performance-based ability models, but not when using self-report EI models. The aim of this study is to analyze the relationship between EI (as measured through a performance-based ability test, a self-report mixed test and a self-report ability test) and cognitive control ability during the performance of hot and “cool” (i.e., non-emotionally laden) “go/no-go” tasks. An experimental design was used for this study in which 187 undergraduate students (25% men) with a mean age of 21.93 years (standard deviation [SD] = 3.8) completed the three EI tests of interest (Mayer-Salovey-Caruso Emotional Intelligence Test [MSCEIT], Trait Meta-Mood Scale [TMMS] and Emotional Quotient Inventory–Short Form [EQi:S]) as well as go/no-go tasks using faces and geometric figures as stimuli. The results provide evidence for negative associations between the “managing” branch of EI measured through the performance-based ability test of EI and the cognitive control index of the hot go/no-go task, although similar evidence was not found when using the cool task. Further, the present study failed to observe consistent results when using the self-report EI instruments. These findings are discussed in terms of both the validity and implications of the various EI models. PMID:28275343

  10. Behavioral assessment of emotional and motivational appraisal during visual processing of emotional scenes depending on spatial frequencies.

    PubMed

    Fradcourt, B; Peyrin, C; Baciu, M; Campagne, A

    2013-10-01

    Previous studies of the visual processing of emotional stimuli have revealed a preference for a specific type of visual spatial frequency (high spatial frequency, HSF; low spatial frequency, LSF) according to task demands. The majority of studies used faces and focused on the appraisal of the emotional state of others. The present behavioral study investigates the relative role of spatial frequencies in the processing of emotional natural scenes during two explicit cognitive appraisal tasks: one emotional, based on the self-emotional experience, and one motivational, based on the tendency to action. Our results suggest that HSF information was the most relevant for rapidly identifying the self-emotional experience (unpleasant, pleasant, and neutral), while LSF information was required to rapidly identify the tendency to action (avoidance, approach, and no action). The tendency to action based on LSF analysis showed a priority for unpleasant stimuli, whereas the identification of emotional experience based on HSF analysis showed a priority for pleasant stimuli. The present study confirms the interest of considering both the emotional and motivational characteristics of visual stimuli. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Biases in facial and vocal emotion recognition in chronic schizophrenia

    PubMed Central

    Dondaine, Thibaut; Robert, Gabriel; Péron, Julie; Grandjean, Didier; Vérin, Marc; Drapier, Dominique; Millet, Bruno

    2014-01-01

    There has been extensive research on impaired emotion recognition in schizophrenia in the facial and vocal modalities. The literature points to biases toward non-relevant emotions for emotional faces, but few studies have examined biases in emotion recognition across different modalities (facial and vocal). In order to test for emotion recognition biases, we exposed 23 patients with stabilized chronic schizophrenia and 23 healthy controls (HCs) to emotional facial and vocal tasks, asking them to rate emotional intensity on visual analog scales. We showed that patients with schizophrenia provided higher intensity ratings on the non-target scales (e.g., the surprise scale for fear stimuli) than HCs for both tasks. Furthermore, with the exception of neutral vocal stimuli, they provided the same intensity ratings on the target scales as the HCs. These findings suggest that patients with chronic schizophrenia have emotional biases when judging emotional stimuli in the visual and vocal modalities. These biases may stem from a basic sensorial deficit, a high-order cognitive dysfunction, or both. The respective roles of prefrontal-subcortical circuitry and the basal ganglia are discussed. PMID:25202287

  12. Recognition of facial emotions among maltreated children with high rates of post-traumatic stress disorder

    PubMed Central

    Masten, Carrie L.; Guyer, Amanda E.; Hodgdon, Hilary B.; McClure, Erin B.; Charney, Dennis S.; Ernst, Monique; Kaufman, Joan; Pine, Daniel S.; Monk, Christopher S.

    2008-01-01

    Objective The purpose of this study is to examine processing of facial emotions in a sample of maltreated children showing high rates of post-traumatic stress disorder (PTSD). Maltreatment during childhood has been associated independently with both atypical processing of emotion and the development of PTSD. However, research has provided little evidence indicating how high rates of PTSD might relate to maltreated children’s processing of emotions. Method Participants’ reaction time and labeling of emotions were measured using a morphed facial emotion identification task. Participants included a diverse sample of maltreated children with and without PTSD and controls ranging in age from 8 to 15 years. Maltreated children had been removed from their homes and placed in state custody following experiences of maltreatment. Diagnoses of PTSD and other disorders were determined through combination of parent, child, and teacher reports. Results Maltreated children displayed faster reaction times than controls when labeling emotional facial expressions, and this result was most pronounced for fearful faces. Relative to children who were not maltreated, maltreated children both with and without PTSD showed enhanced response times when identifying fearful faces. There was no group difference in labeling of emotions when identifying different facial emotions. Conclusions Maltreated children show heightened ability to identify fearful faces, evidenced by faster reaction times relative to controls. This association between maltreatment and atypical processing of emotion is independent of PTSD diagnosis. PMID:18155144

  13. Detection of emotional faces: salient physical features guide effective visual search.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
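
    The record above refers to computationally modeled visual saliency. As a rough illustration of how a bottom-up saliency map can be computed, the sketch below implements the simple spectral-residual algorithm (Hou & Zhang, 2007) in NumPy; this is not the specific saliency model used by the authors, and the toy "face" image is hypothetical.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter, gaussian_filter

    def spectral_residual_saliency(image):
        """Spectral-residual saliency (Hou & Zhang, 2007) for a 2D grayscale image."""
        f = np.fft.fft2(image)
        log_amp = np.log(np.abs(f) + 1e-12)
        phase = np.angle(f)
        residual = log_amp - uniform_filter(log_amp, size=3)   # remove the smooth spectrum
        sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
        return gaussian_filter(sal, sigma=2.5)                 # smooth the saliency map

    # Hypothetical face-like image: a brighter "mouth" region on a flat background
    img = np.zeros((128, 128))
    img[90:100, 50:78] = 1.0
    sal_map = spectral_residual_saliency(img)
    peak_row, peak_col = np.unravel_index(sal_map.argmax(), sal_map.shape)
    print("most salient location (row, col):", (peak_row, peak_col))
    ```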

  14. Threatening scenes but not threatening faces shorten time-to-contact estimates.

    PubMed

    DeLucia, Patricia R; Brendel, Esther; Hecht, Heiko; Stacy, Ryan L; Larsen, Jeff T

    2014-08-01

    We previously reported that time-to-contact (TTC) judgments of threatening scene pictures (e.g., frontal attacks) resulted in shortened estimations and were mediated by cognitive processes, and that judgments of threatening (e.g., angry) face pictures resulted in a smaller effect and did not seem cognitively mediated. In the present study, the effects of threatening scenes and faces were compared in two different tasks. An effect of threatening scene pictures occurred in a prediction-motion task, which putatively requires cognitive motion extrapolation, but not in a relative TTC judgment task, which was designed to be less reliant on cognitive processes. An effect of threatening face pictures did not occur in either task. We propose that an object's explicit potential of threat per se, and not only emotional valence, underlies the effect of threatening scenes on TTC judgments and that such an effect occurs only when the task allows sufficient cognitive processing. Results are consistent with distinctions between predator and social fear systems and different underlying physiological mechanisms. Not all threatening information elicits the same responses, and whether an effect occurs at all may depend on the task and the degree to which the task involves cognitive processes.

  15. Facial emotion perception in Chinese patients with schizophrenia and non-psychotic first-degree relatives.

    PubMed

    Li, Huijie; Chan, Raymond C K; Zhao, Qing; Hong, Xiaohong; Gong, Qi-Yong

    2010-03-17

    Although there is a consensus that patients with schizophrenia have certain deficits in perceiving and expressing facial emotions, previous studies of facial emotion perception in schizophrenia do not present consistent results. The objective of this study was to explore facial emotion perception deficits in Chinese patients with schizophrenia and their non-psychotic first-degree relatives. Sixty-nine patients with schizophrenia, 56 of their first-degree relatives (33 parents and 23 siblings), and 92 healthy controls (67 younger healthy controls matched to the patients and siblings, and 25 older healthy controls matched to the parents) completed a set of facial emotion perception tasks, including facial emotion discrimination, identification, intensity, valence, and corresponding face identification tasks. The results demonstrated that patients with schizophrenia performed significantly worse than their siblings and younger healthy controls in accuracy in a variety of facial emotion perception tasks, whereas the siblings of the patients performed as well as the corresponding younger healthy controls in all of the facial emotion perception tasks. Patients with schizophrenia were also significantly slower than younger healthy controls, while siblings of patients did not differ significantly in speed from either the patients or the younger healthy controls. Meanwhile, we also found that parents of the schizophrenia patients performed significantly worse than the corresponding older healthy controls in accuracy in terms of facial emotion identification, valence, and the composite index of the facial discrimination, identification, intensity and valence tasks. Moreover, no significant differences were found between the parents of patients and older healthy controls in speed after controlling for years of education and IQ. Taken together, the results suggest that facial emotion perception deficits may serve as potential endophenotypes for schizophrenia. Copyright 2010 Elsevier Inc. All rights reserved.

  16. Emotion recognition in body dysmorphic disorder: application of the Reading the Mind in the Eyes Task.

    PubMed

    Buhlmann, Ulrike; Winter, Anna; Kathmann, Norbert

    2013-03-01

    Body dysmorphic disorder (BDD) is characterized by perceived appearance-related defects, often tied to aspects of the face or head (e.g., acne). Deficits in decoding emotional expressions have been examined in several psychological disorders including BDD. Previous research indicates that BDD is associated with impaired facial emotion recognition, particularly in situations that involve the BDD sufferer him/herself. The purpose of this study was to further evaluate the ability to read other people's emotions among 31 individuals with BDD, and 31 mentally healthy controls. We applied the Reading the Mind in the Eyes task, in which participants are presented with a series of pairs of eyes, one at a time, and are asked to identify the emotion that describes the stimulus best. The groups did not differ with respect to decoding other people's emotions by looking into their eyes. Findings are discussed in light of previous research examining emotion recognition in BDD. Copyright © 2013. Published by Elsevier Ltd.

  17. Conscious and unconscious processing of facial expressions: evidence from two split-brain patients.

    PubMed

    Prete, Giulia; D'Ascenzo, Stefania; Laeng, Bruno; Fabri, Mara; Foschi, Nicoletta; Tommasi, Luca

    2015-03-01

    We investigated how the brain's hemispheres process explicit and implicit facial expressions in two 'split-brain' patients (one with a complete and one with a partial anterior resection). Photographs of faces expressing positive, negative or neutral emotions were shown either centrally or bilaterally. The task consisted of judging the friendliness of each person in the photographs. Half of the photograph stimuli were 'hybrid faces', that is, an amalgamation of filtered images which contained emotional information only in the low range of spatial frequencies, blended with a neutral expression of the same individual in the remaining spatial frequencies. The other half of the images contained unfiltered faces. With the hybrid faces the patients and a matched control group were more influenced in their social judgements by the emotional expression of the face shown in the left visual field (LVF). When the expressions were shown explicitly, that is, without filtering, the control group and the partially callosotomized patient based their judgement on the face shown in the LVF, whereas the complete split-brain patient based his ratings mainly on the face presented in the right visual field. We conclude that the processing of implicit emotions does not require the integrity of callosal fibres and can take place within subcortical routes lateralized in the right hemisphere. © 2013 The British Psychological Society.

  18. Genetic correlations between wellbeing, depression and anxiety symptoms and behavioral responses to the emotional faces task in healthy twins.

    PubMed

    Routledge, Kylie M; Williams, Leanne M; Harris, Anthony W F; Schofield, Peter R; Clark, C Richard; Gatt, Justine M

    2018-06-01

    Currently there is a very limited understanding of how mental wellbeing versus anxiety and depression symptoms are associated with emotion processing behaviour. For the first time, we examined these associations using a behavioural emotion task of positive and negative facial expressions in 1668 healthy adult twins. Linear mixed model results suggested that faster reaction times to happy facial expressions were associated with higher wellbeing scores, and slower reaction times with higher depression and anxiety scores. Multivariate twin modelling identified a significant genetic correlation between depression and anxiety symptoms and reaction time to happy facial expressions, in the absence of any significant correlations with wellbeing. We also found a significant negative phenotypic relationship between depression and anxiety symptoms and accuracy for identifying neutral emotions, although neither the genetic nor the environmental correlations were significant in the multivariate model. Overall, the phenotypic relationships between speed of identifying happy facial expressions and wellbeing on the one hand, versus depression and anxiety symptoms on the other, were in opposing directions. Twin modelling revealed a small common genetic correlation between response to happy faces and depression and anxiety symptoms alone, suggesting that wellbeing and depression and anxiety symptoms show largely independent relationships with emotion processing at the behavioral level. Copyright © 2018 Elsevier B.V. All rights reserved.
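
    The linear mixed model analysis described above can be illustrated with a small, hypothetical sketch: reaction time to happy faces is regressed on wellbeing and depression/anxiety scores, with twin pair (family) entered as a random intercept so that the non-independence of twins is respected. The file and column names below are invented for illustration; the published analysis used its own variables, covariates, and software.

```python
# Hedged sketch (not the authors' exact model): a mixed model with a random
# intercept per twin pair. All file and variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("twin_emotion_task.csv")  # columns: family_id, rt_happy, wellbeing, dep_anx

# Fixed effects: wellbeing and depression/anxiety scores; grouping factor: twin pair.
model = smf.mixedlm("rt_happy ~ wellbeing + dep_anx", data=df, groups=df["family_id"])
result = model.fit()

# A negative wellbeing coefficient would indicate faster RTs with higher wellbeing,
# and a positive dep_anx coefficient slower RTs with higher symptom scores.
print(result.summary())
```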

  19. Facial expression movement enhances the measurement of temporal dynamics of attentional bias in the dot-probe task.

    PubMed

    Caudek, Corrado; Ceccarini, Francesco; Sica, Claudio

    2017-08-01

    The facial dot-probe task is one of the most common experimental paradigms used to assess attentional bias toward emotional information. In recent years, however, the psychometric properties of this paradigm have been questioned. In the present study, attentional bias to emotional face stimuli was measured with dynamic and static images of realistic human faces in 97 college students (63 women) who underwent either a positive or a negative mood induction prior to the experiment. We controlled the bottom-up salience of the stimuli in order to dissociate the top-down orienting of attention from the effects of the bottom-up physical properties of the stimuli. A Bayesian analysis of our results indicates that 1) the traditional global attentional bias index shows low reliability, 2) reliability increases dramatically when biased attention is analyzed by extracting a series of bias estimations from trial to trial (Zvielli, Bernstein, & Koster, 2015), 3) dynamic expression of emotions strengthens biased attention to emotional information, and 4) mood congruency facilitates the measurement of biased attention to emotional stimuli. These results highlight the importance of using ecologically valid stimuli in attentional bias research, together with the importance of estimating biased attention at the trial level. Copyright © 2017 Elsevier Ltd. All rights reserved.
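
    The contrast drawn above between a single aggregate bias index and trial-level bias estimation can be sketched as follows. This is a simplified illustration in the spirit of Zvielli, Bernstein, and Koster (2015), not the authors' pipeline: each congruent trial is paired with the temporally nearest incongruent trial and their reaction-time difference is taken, yielding a time series of bias scores rather than a single number.

```python
# Minimal sketch with simulated data; the pairing rule is a simplification.
import numpy as np

def aggregate_bias(rt, congruent):
    """Classic bias index: mean incongruent RT minus mean congruent RT."""
    rt = np.asarray(rt, float)
    congruent = np.asarray(congruent, bool)
    return rt[~congruent].mean() - rt[congruent].mean()

def trial_level_bias(rt, congruent, max_lag=5):
    """Pair each congruent trial with the temporally nearest incongruent trial."""
    rt = np.asarray(rt, float)
    congruent = np.asarray(congruent, bool)
    idx_c = np.where(congruent)[0]
    idx_i = np.where(~congruent)[0]
    if idx_c.size == 0 or idx_i.size == 0:
        return np.array([])
    scores = []
    for i in idx_c:
        j = idx_i[np.argmin(np.abs(idx_i - i))]  # nearest incongruent trial
        if abs(int(j) - int(i)) <= max_lag:
            scores.append(rt[j] - rt[i])         # incongruent minus congruent RT
    return np.array(scores)

# toy usage with simulated reaction times
rng = np.random.default_rng(0)
rts = rng.normal(500, 60, 40)
cong = rng.integers(0, 2, 40).astype(bool)
print(aggregate_bias(rts, cong), trial_level_bias(rts, cong).mean())
```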

  20. Biased recognition of facial affect in patients with major depressive disorder reflects clinical state.

    PubMed

    Münkler, Paula; Rothkirch, Marcus; Dalati, Yasmin; Schmack, Katharina; Sterzer, Philipp

    2015-01-01

    Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment, subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces morphed along a continuum from sad through neutral to happy expressions and had to decide whether each face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness using interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: in comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
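
    The criterion shift reported above can be visualized with a toy psychometric-function fit: the proportion of "happy" responses is modelled as a logistic function of morph level, and the 50% crossover (point of subjective equality) serves as the criterion estimate. A crossover shifted toward the happy end of the continuum corresponds to needing a greater intensity of the happy expression before a face is called happy. The data and fitting choices below are illustrative assumptions, not the authors' analysis.

```python
# Hypothetical sketch: estimate the 'happy' criterion from morph-level response rates.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Logistic psychometric function; pse = 50% crossover, slope = spread."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

morph = np.linspace(0, 100, 9)  # 0 = fully sad, 100 = fully happy (hypothetical levels)
p_happy = np.array([0.02, 0.05, 0.10, 0.20, 0.45, 0.70, 0.85, 0.95, 0.99])

(pse, slope), _ = curve_fit(logistic, morph, p_happy, p0=[50.0, 10.0])
# A larger PSE means more happy intensity is required before 'happy' is reported.
print(f"point of subjective equality: {pse:.1f}% happy")
```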

  1. Reduced embodied simulation in psychopathy.

    PubMed

    Mier, Daniela; Haddad, Leila; Diers, Kersten; Dressing, Harald; Meyer-Lindenberg, Andreas; Kirsch, Peter

    2014-08-01

    Psychopathy is characterized by severe deficits in emotion processing and empathy. These emotional deficits might not only affect the feeling of one's own emotions, but also the understanding of others' emotional and mental states. The present study aims to identify the neurobiological correlates of social cognition-related alterations in psychopathy. We applied a social-cognitive paradigm for the investigation of face processing, emotion recognition, and affective Theory of Mind (ToM) to 11 imprisoned psychopaths and 18 healthy controls. Functional magnetic resonance imaging was used to measure task-related brain activation. Although the psychopaths showed no overall behavioural deficit, psychopathy was associated with altered brain activation. Psychopaths had reduced fusiform activation related to face processing. Related to affective ToM, psychopaths had hypoactivation in the amygdala, inferior prefrontal gyrus and superior temporal sulcus, areas associated with embodied simulation of emotions and intentions. Furthermore, psychopaths lacked connectivity between the superior temporal sulcus and the amygdala during affective ToM. These results replicate findings of alterations in basal face processing in psychopathy. In addition, they provide evidence for reduced embodied simulation in psychopathy, in concert with a lack of communication between motor areas and the amygdala, which might provide the neural substrate of reduced feeling with others during social cognition.

  2. Affective resonance in response to others' emotional faces varies with affective ratings and psychopathic traits in amygdala and anterior insula.

    PubMed

    Seara-Cardoso, Ana; Sebastian, Catherine L; Viding, Essi; Roiser, Jonathan P

    2016-01-01

    Despite extensive research on the neural basis of empathic responses for pain and disgust, there is limited data about the brain regions that underpin affective response to other people's emotional facial expressions. Here, we addressed this question using event-related functional magnetic resonance imaging to assess neural responses to emotional faces, combined with online ratings of subjective state. When instructed to rate their own affective response to others' faces, participants recruited anterior insula, dorsal anterior cingulate, inferior frontal gyrus, and amygdala, regions consistently implicated in studies investigating empathy for disgust and pain, as well as emotional saliency. Importantly, responses in anterior insula and amygdala were modulated by trial-by-trial variations in subjective affective responses to the emotional facial stimuli. Furthermore, overall task-elicited activations in these regions were negatively associated with psychopathic personality traits, which are characterized by low affective empathy. Our findings suggest that anterior insula and amygdala play important roles in the generation of affective internal states in response to others' emotional cues and that attenuated function in these regions may underlie reduced empathy in individuals with high levels of psychopathic traits.

  3. Right hemisphere or valence hypothesis, or both? The processing of hybrid faces in the intact and callosotomized brain.

    PubMed

    Prete, Giulia; Laeng, Bruno; Fabri, Mara; Foschi, Nicoletta; Tommasi, Luca

    2015-02-01

    The valence hypothesis and the right hemisphere hypothesis in emotion processing have been alternatively supported. To better disentangle the two accounts, we carried out two studies, presenting healthy participants and an anterior callosotomized patient with 'hybrid faces', stimuli created by superimposing the low spatial frequencies of an emotional face onto the high spatial frequencies of the same face in a neutral expression. In both studies we asked participants to judge the friendliness level of the stimuli, which is an indirect measure of the processing of emotional information, despite this remaining "invisible". In Experiment 1 we presented hybrid faces in a divided visual field paradigm using different tachistoscopic presentation times; in Experiment 2 we presented hybrid chimeric faces in canonical view and upside-down. In Experiments 3 and 4 we tested a callosotomized patient, with spared splenium, in similar paradigms to those used in Experiments 1 and 2. Results from Experiments 1 and 3 were consistent with the valence hypothesis, whereas results of Experiments 2 and 4 were consistent with the right hemisphere hypothesis. This study confirms that the low spatial frequencies of emotional faces influence the social judgments of observers, even when seen for 28 ms (Experiment 1), possibly by means of configural analysis (Experiment 2). The possible roles of the cortical and subcortical emotional routes in these tasks are discussed in the light of the results obtained in the callosotomized patient. We propose that the right hemisphere and the valence accounts are not mutually exclusive, at least in the case of subliminal emotion processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Effects of task demands on the early neural processing of fearful and happy facial expressions.

    PubMed

    Itier, Roxane J; Neath-Tavares, Karly N

    2017-05-15

    Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during a gender discrimination task, an explicit emotion discrimination task and an oddball detection task, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced with a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170, from 150 to 350ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350ms of visual processing. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Task-irrelevant fear enhances amygdala-FFG inhibition and decreases subsequent face processing.

    PubMed

    Schulte Holthausen, Barbara; Habel, Ute; Kellermann, Thilo; Schelenz, Patrick D; Schneider, Frank; Christopher Edgar, J; Turetsky, Bruce I; Regenbogen, Christina

    2016-09-01

    Facial threat is associated with changes in limbic activity as well as modifications in the cortical face-related N170. It remains unclear whether task-irrelevant threat modulates the response to a subsequent facial stimulus, and whether the amygdala's role in early threat perception is independent and direct, or modulatory. In 19 participants, crowds of emotional faces were followed by target faces and a rating task while simultaneous EEG-fMRI was recorded. In addition to conventional analyses, fMRI-informed EEG analyses and fMRI dynamic causal modeling (DCM) were performed. Fearful crowds reduced EEG N170 target face amplitudes and increased responses in an fMRI network comprising the insula, amygdala and inferior frontal cortex. Multimodal analyses showed that the amygdala response was present ∼60 ms before the right fusiform gyrus-derived N170. DCM indicated inhibitory connections from the amygdala to the fusiform gyrus, strengthened when fearful crowds preceded a target face. Results demonstrated the suppressing influence of task-irrelevant fearful crowds on subsequent face processing. The amygdala may be sensitive to task-irrelevant fearful crowds and subsequently strengthen its inhibitory influence on face-responsive fusiform N170 generators. This provides spatiotemporal evidence for a feedback mechanism by which the amygdala narrows attention in order to focus on potential threats. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  6. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements.

    PubMed

    van den Bulk, Bianca G; Koolschijn, P Cédric M P; Meens, Paul H F; van Lang, Natasja D J; van der Wee, Nic J A; Rombouts, Serge A R B; Vermeiren, Robert R J M; Crone, Eveline A

    2013-04-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, activation patterns may fluctuate over time. In the current study, 27 healthy adolescents (age: 12-19 years) were scanned three times over a period of six months (mean test-retest interval of three months; final samples N=27, N=22, N=18). At each session, participants performed the same emotional faces task. At the first measurement, the presentation of emotional faces resulted in heightened activation in the bilateral amygdala, bilateral lateral PFC and visual areas including the fusiform face area. Average activation did not differ across test sessions, indicating that at the group level activation patterns in this network do not vary significantly over time. However, using the Intraclass Correlation Coefficient (ICC), fMRI reliability was only fair for the PFC (ICC=0.41-0.59) and poor for the amygdala (ICC<0.4). These findings suggest substantial variability of brain activity over time and may have implications for studies investigating the influence of treatment effects on changes at the neural level in adolescents with psychiatric disorders. Copyright © 2012 Elsevier Ltd. All rights reserved.
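
    The reliability benchmarks quoted above ("fair" for ICC=0.41-0.59, "poor" below 0.4) refer to the intraclass correlation coefficient computed across the three scan sessions. The abstract does not state which ICC variant was used; the sketch below computes ICC(3,1) (two-way mixed, consistency), a common choice for test-retest fMRI, from a subjects-by-sessions matrix of region-of-interest estimates with invented numbers.

```python
# Illustrative sketch only; the ICC variant and data are assumptions.
import numpy as np

def icc_3_1(data):
    """data: 2-D array, rows = subjects, columns = sessions (test-retest)."""
    data = np.asarray(data, float)
    n, k = data.shape
    grand = data.mean()
    ss_total = ((data - grand) ** 2).sum()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()  # between-subject variance
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()  # between-session variance
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# toy example: amygdala activation estimates for 10 subjects over 3 sessions
rng = np.random.default_rng(1)
betas = rng.normal(0.5, 0.3, size=(10, 1)) + rng.normal(0, 0.4, size=(10, 3))
print(round(icc_3_1(betas), 2))  # values below 0.4 would be labelled "poor" by the usual benchmarks
```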

  7. The automaticity of face perception is influenced by familiarity.

    PubMed

    Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J

    2017-10-01

    In this study, we explore the automaticity of encoding for different facial characteristics and ask whether it is influenced by face familiarity. We used a matching task in which participants had to report whether the gender, identity, race, or expression of two briefly presented faces was the same or different. The task was made challenging by allowing nonrelevant dimensions to vary across trials. To test for automaticity, we compared performance on trials in which the task instruction was given at the beginning of the trial, with trials in which the task instruction was given at the end of the trial. As a strong criterion for automatic processing, we reasoned that if perception of a given characteristic (gender, race, identity, or emotion) is fully automatic, the timing of the instruction should not influence performance. We compared automaticity for the perception of familiar and unfamiliar faces. Performance with unfamiliar faces was higher for all tasks when the instruction was given at the beginning of the trial. However, we found a significant interaction between instruction and task with familiar faces. Accuracy of gender and identity judgments to familiar faces was the same regardless of whether the instruction was given before or after the trial, suggesting automatic processing of these properties. In contrast, there was an effect of instruction for judgments of expression and race to familiar faces. These results show that familiarity enhances the automatic processing of some types of facial information more than others.

  8. Finding a face in the crowd: testing the anger superiority effect in Asperger Syndrome.

    PubMed

    Ashwin, Chris; Wheelwright, Sally; Baron-Cohen, Simon

    2006-06-01

    Social threat captures attention and is processed rapidly and efficiently, with many lines of research showing involvement of the amygdala. Visual search paradigms looking at social threat have shown angry faces 'pop-out' in a crowd, compared to happy faces. Autism and Asperger Syndrome (AS) are neurodevelopmental conditions characterised by social deficits, abnormal face processing, and amygdala dysfunction. We tested adults with high-functioning autism (HFA) and AS using a facial visual search paradigm with schematic neutral and emotional faces. We found, contrary to predictions, that people with HFA/AS performed similarly to controls in many conditions. However, the effect was reduced in the HFA/AS group when using widely varying crowd sizes and when faces were inverted, suggesting a difference in face-processing style may be evident even with simple schematic faces. We conclude there are intact threat detection mechanisms in AS, under simple and predictable conditions, but that like other face-perception tasks, the visual search of threat faces task reveals atypical face-processing in HFA/AS.

  9. Differences in the way older and younger adults rate threat in faces but not situations.

    PubMed

    Ruffman, Ted; Sullivan, Susan; Edge, Nigel

    2006-07-01

    We compared young and healthy older adults' ability to rate photos of faces and situations (e.g., sporting activities) for the degree of threat they posed. Older adults did not distinguish between more and less dangerous faces to the same extent as younger adults did. In contrast, we found no significant age differences in young and older adults' ability to distinguish between high- and low-danger situations. The differences between young and older adults on the face task were independent of age differences in older adults' fluid IQ. We discuss results in relation to differences between young and older adults on emotion-recognition tasks; we also discuss sociocognitive and neuropsychological (e.g., amygdala) theories of aging.

  10. Neural Correlates of Irritability in Disruptive Mood Dysregulation and Bipolar Disorders.

    PubMed

    Wiggins, Jillian Lee; Brotman, Melissa A; Adleman, Nancy E; Kim, Pilyoung; Oakes, Allison H; Reynolds, Richard C; Chen, Gang; Pine, Daniel S; Leibenluft, Ellen

    2016-07-01

    Bipolar disorder and disruptive mood dysregulation disorder (DMDD) are clinically and pathophysiologically distinct, yet irritability can be a clinical feature of both illnesses. The authors examine whether the neural mechanisms mediating irritability differ between bipolar disorder and DMDD, using a face emotion labeling paradigm because such labeling is deficient in both patient groups. The authors hypothesized that during face emotion labeling, irritability would be associated with dysfunctional activation in the amygdala and other temporal and prefrontal regions in both disorders, but that the nature of these associations would differ between DMDD and bipolar disorder. During functional MRI acquisition, 71 youths (25 with DMDD, 24 with bipolar disorder, and 22 healthy youths) performed a labeling task with happy, fearful, and angry faces of varying emotional intensity. Participants with DMDD and bipolar disorder showed similar levels of irritability and did not differ from each other or from healthy youths in face emotion labeling accuracy. Irritability correlated with amygdala activity across all intensities for all emotions in the DMDD group; such correlation was present in the bipolar disorder group only for fearful faces. In the ventral visual stream, associations between neural activity and irritability were found more consistently in the DMDD group than in the bipolar disorder group, especially in response to ambiguous angry faces. These results suggest diagnostic specificity in the neural correlates of irritability, a symptom of both DMDD and bipolar disorder. Such evidence of distinct neural correlates suggests the need to evaluate different approaches to treating irritability in the two disorders.

  11. Psychopathic traits affect the visual exploration of facial expressions.

    PubMed

    Boll, Sabrina; Gamer, Matthias

    2016-05-01

    Deficits in emotional reactivity and recognition have been reported in psychopathy. Impaired attention to the eyes along with amygdala malfunctions may underlie these problems. Here, we investigated how different facets of psychopathy modulate the visual exploration of facial expressions by assessing personality traits in a sample of healthy young adults using an eye-tracking based face perception task. Fearless Dominance (the interpersonal-emotional facet of psychopathy) and Coldheartedness scores predicted reduced face exploration consistent with findings on lowered emotional reactivity in psychopathy. Moreover, participants high on the social deviance facet of psychopathy ('Self-Centered Impulsivity') showed a reduced bias to shift attention towards the eyes. Our data suggest that facets of psychopathy modulate face processing in healthy individuals and reveal possible attentional mechanisms which might be responsible for the severe impairments of social perception and behavior observed in psychopathy. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Affective attention under cognitive load: reduced emotional biases but emergent anxiety-related costs to inhibitory control

    PubMed Central

    Berggren, Nick; Richards, Anne; Taylor, Joseph; Derakshan, Nazanin

    2013-01-01

    Trait anxiety is associated with deficits in attentional control, particularly in the ability to inhibit prepotent responses. Here, we investigated this effect while varying the level of cognitive load in a modified antisaccade task that employed emotional facial expressions (neutral, happy, and angry) as targets. Load was manipulated using a secondary auditory task requiring recognition of tones (low load), or recognition of specific tone pitch (high load). Results showed that load increased antisaccade latencies on trials where gaze toward face stimuli should be inhibited. This effect was exacerbated for high anxious individuals. Emotional expression also modulated task performance on antisaccade trials for both high and low anxious participants under low cognitive load, but did not influence performance under high load. Collectively, results (1) suggest that individuals reporting high levels of anxiety are particularly vulnerable to the effects of cognitive load on inhibition, and (2) support recent evidence that loading cognitive processes can reduce emotional influences on attention and cognition. PMID:23717273

  13. Discrimination of face gender and expression under dual-task conditions.

    PubMed

    García-Gutiérrez, Ana; Aguado, Luis; Romero-Ferreiro, Verónica; Pérez-Moreno, Elisa

    2017-02-01

    In order to test whether expression and gender can be attended to simultaneously without a cost in accuracy, four experiments were carried out using a dual gender-expression task with male and female faces showing different emotional expressions that were backward masked by emotionally neutral faces. In the dual-facial condition the participants had to report both the gender and the expression of the targets. In two control conditions the participants reported either the gender or the expression of the face and indicated whether a surrounding frame was continuous or discontinuous. In Experiments 1-3, with angry and happy targets, asymmetric interference was observed. Gender discrimination, but not expression discrimination, was impaired in the dual-facial condition compared to the corresponding control. This effect was obtained with a between-subjects design in Experiment 1, with a within-subjects design in Experiment 2, and with androgynous face masks in Experiment 3. In Experiments 4a and 4b different target combinations were tested. No decrement of performance in the dual-facial task was observed for either gender or expression discrimination with fearful-disgusted (Experiment 4a) or fearful-happy faces (Experiment 4b). We conclude that the ability to attend simultaneously to gender and expression cues without a decrement in performance depends on the specific combination of expressions to be differentiated between. Happy and angry expressions are usually directed at the perceiver and command preferential attention. Under conditions of restricted viewing such as those of the present study, discrimination of these expressions is prioritized, leading to impaired discrimination of other facial properties such as gender.

  14. Adolescent risk-taking is predicted by individual differences in cognitive control over emotional, but not non-emotional, response conflict.

    PubMed

    Botdorf, Morgan; Rosenbaum, Gail M; Patrianakos, Jamie; Steinberg, Laurence; Chein, Jason M

    2017-08-01

    While much research on adolescent risk behaviour has focused on the development of prefrontal self-regulatory mechanisms, prior studies have elicited mixed evidence of a relationship between individual differences in the capacity for self-regulation and individual differences in risk taking. To explain these inconsistent findings, it has been suggested that the capacity for self-regulation may be, for most adolescents, adequately mature to produce adaptive behaviour in non-affective, "cold" circumstances, but that adolescents have a more difficult time exerting control in affective, "hot" contexts. To further explore this claim, the present study examined individual differences in self-control in the face of affective and non-affective response conflict, and examined whether differences in the functioning of cognitive control processes under these different conditions was related to risk taking. Participants completed a cognitive Stroop task, an emotional Stroop task, and a risky driving task known as the Stoplight game. Regression analyses showed that performance on the emotional Stroop task predicted laboratory risk-taking in the driving task, whereas performance on the cognitive Stroop task did not exhibit the same trend. This pattern of results is consistent with theories of adolescent risk-taking that emphasise the impacts of affective contextual influences on the ability to enact effective cognitive control.

  15. Thought suppression predicts task switching deficits in patients with frontal lobe epilepsy.

    PubMed

    Gul, Amara; Ahmad, Hira

    2015-04-01

    To examine the relationship between task switching and thought suppression in connection with frontal lobe epilepsy (FLE). This experimental study included 30 patients with FLE admitted to the Services and Jinnah Hospital, Lahore, Pakistan between February and November 2013, and 30 healthy individuals from the local community. Participants performed a task switching experiment in which they switched between emotion and age categorizations of faces. In addition, they completed a thought suppression questionnaire. There were 3 important results: (i) Patients with FLE showed weaker task switching abilities than healthy individuals. This result is attributed to executive dysfunction in patients with FLE. (ii) Contrary to the control group, patients with FLE showed a larger switch cost for the age than for the emotion categorization. This result can be seen in the context of social cognition deficits and poor inhibitory control in patients with FLE. In addition, the larger switch costs reflected a binding effect with facial emotion as compared to age. The integration might represent emotion as an intrusive facial dimension that interrupted task switching performance. (iii) Patients with FLE reported more recurrent suppression of thoughts than controls. Thought suppression was a significant predictor of switch costs. High scores on thought suppression were correlated with task switching deficits. The results suggest that thought suppression causes significant cognitive decline.

  16. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses

    PubMed Central

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G.; Alpers, Georg W.

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis-)advantageous in order to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety. PMID:25324792

  17. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses.

    PubMed

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G; Alpers, Georg W

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis-)advantageous in order to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  18. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    PubMed

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize the mouth feature holistically and the eyes as isolated parts. © 2012 The Authors. Journal of Child Psychology and Psychiatry © 2012 Association for Child and Adolescent Mental Health.

  19. Dissociation between facial and bodily expressions in emotion recognition: A case study.

    PubMed

    Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo

    2017-12-21

    Existing single-case studies have reported deficits in recognizing basic emotions through facial expressions together with unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's and the control group's performance with a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performance on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), when the patient's performance was compared to the control group's, statistically significant differences were observed only for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
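
    The single-case statistics referred to above can be illustrated with the modified t-test commonly attributed to Crawford and Howell, which treats the control sample as a sample rather than a population when asking whether one patient's score is abnormally low; the dissociation criteria the authors applied follow Crawford and Garthwaite's published procedures. The sketch below is a generic illustration with invented numbers, not the authors' analysis.

```python
# Hedged sketch of a single-case vs. control-group comparison.
import numpy as np
from scipy import stats

def crawford_howell(case_score, control_scores):
    """Modified one-tailed t-test comparing a single case with a small control sample."""
    c = np.asarray(control_scores, float)
    n = c.size
    t = (case_score - c.mean()) / (c.std(ddof=1) * np.sqrt((n + 1) / n))
    p_one_tailed = stats.t.cdf(t, df=n - 1)  # probability of observing a score this low
    return t, p_one_tailed

# toy usage: 30 controls' accuracy on a body-expression task vs. one patient's score
controls = np.random.default_rng(2).normal(0.90, 0.05, 30)
print(crawford_howell(0.55, controls))
```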

  20. Mapping structural covariance networks of facial emotion recognition in early psychosis: A pilot study.

    PubMed

    Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean

    2017-11-01

    People with psychosis show deficits in recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with an early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was done using the mapping anatomical correlations across the cerebral cortex methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed greater cortical covariance between the right fusiform face area seed and the right orbitofrontal cortex in controls than in early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and the right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis. Copyright © 2017 Elsevier B.V. All rights reserved.
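
    Seed-based structural covariance, as used above, correlates cortical thickness in a seed region with thickness everywhere else across subjects, and group differences are then tested on those correlations. The sketch below is a region-level simplification with invented data (the authors' pipeline uses CIVET thickness maps and vertex-wise statistics in SurfStat); the Fisher r-to-z comparison is one common way to contrast a seed-target correlation between groups.

```python
# Hedged, region-level sketch of seed-based structural covariance.
import numpy as np
from scipy import stats

def seed_covariance(thickness, seed_idx):
    """thickness: subjects x regions matrix; correlation of the seed with every region."""
    seed = thickness[:, seed_idx]
    return np.array([stats.pearsonr(seed, thickness[:, j])[0]
                     for j in range(thickness.shape[1])])

def compare_correlations(r1, n1, r2, n2):
    """Fisher r-to-z test for a difference between two independent correlations."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    z = (z1 - z2) / np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    return z, 2 * stats.norm.sf(abs(z))

# toy usage: 15 subjects per group, 10 regions, seed = region 0 (all data invented)
rng = np.random.default_rng(4)
controls = rng.normal(3.0, 0.2, (15, 10))
patients = rng.normal(3.0, 0.2, (15, 10))
r_con = seed_covariance(controls, 0)[5]  # seed-to-region-5 covariance in controls
r_pat = seed_covariance(patients, 0)[5]  # same covariance in patients
print(compare_correlations(r_con, 15, r_pat, 15))
```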

  1. Curvilinear relationship between phonological working memory load and social-emotional modulation

    PubMed Central

    Mano, Quintino R.; Brown, Gregory G.; Bolden, Khalima; Aupperle, Robin; Sullivan, Sarah; Paulus, Martin P.; Stein, Murray B.

    2015-01-01

    Accumulating evidence suggests that working memory load is an important factor in the interplay between cognitive and facial-affective processing. However, it is unclear how distraction caused by the perception of faces interacts with load-related performance. We developed a modified version of the delayed match-to-sample task wherein task-irrelevant facial distracters were presented early in the rehearsal of pseudoword memoranda that varied incrementally in load size (1 syllable, 2 syllables, or 3 syllables). Facial distracters displayed happy, sad, or neutral expressions in Experiment 1 (N=60) and happy, fearful, or neutral expressions in Experiment 2 (N=29). Facial distracters significantly disrupted task performance in the intermediate load condition (2 syllables) but not in the low or high load conditions (1 and 3 syllables, respectively), an interaction replicated and generalised in Experiment 2. All facial distracters disrupted working memory in the intermediate load condition irrespective of valence, suggesting a primary and general effect of distraction caused by faces. However, sad and fearful faces tended to be less disruptive than happy faces, suggesting a secondary and specific valence effect. Working memory appears to be most vulnerable to social-emotional information at intermediate loads. At low loads, spare capacity is capable of accommodating the combinatorial load (1 syllable plus a facial distracter), whereas high loads maximised capacity and prevented facial stimuli from occupying working memory slots and causing disruption. PMID:22928750

  2. Dose-dependent social-cognitive effects of intranasal oxytocin delivered with novel Breath Powered device in adults with autism spectrum disorder: a randomized placebo-controlled double-blind crossover trial

    PubMed Central

    Quintana, D S; Westlye, L T; Hope, S; Nærland, T; Elvsåshagen, T; Dørum, E; Rustan, Ø; Valstad, M; Rezvaya, L; Lishaugen, H; Stensønes, E; Yaqub, S; Smerud, K T; Mahmoud, R A; Djupesland, P G; Andreassen, O A

    2017-01-01

    The neuropeptide oxytocin has shown promise as a treatment for symptoms of autism spectrum disorders (ASD). However, clinical research progress has been hampered by a poor understanding of oxytocin’s dose–response and sub-optimal intranasal delivery methods. We examined two doses of oxytocin delivered using a novel Breath Powered intranasal delivery device designed to improve direct nose-to-brain activity in a double-blind, crossover, randomized, placebo-controlled trial. In a randomized sequence of single-dose sessions, 17 male adults with ASD received 8 international units (IU) oxytocin, 24IU oxytocin or placebo followed by four social-cognitive tasks. We observed an omnibus main effect of treatment on the primary outcome measure of overt emotion salience as measured by emotional ratings of faces (η2=0.18). Compared to placebo, 8IU treatment increased overt emotion salience (P=0.02, d=0.63). There was no statistically significant increase after 24IU treatment (P=0.12, d=0.4). The effects after 8IU oxytocin were observed despite no significant increase in peripheral blood plasma oxytocin concentrations. We found no significant effects for reading the mind in the eyes task performance or secondary outcome social-cognitive tasks (emotional dot probe and face-morphing). To our knowledge, this is the first trial to assess the dose-dependent effects of a single oxytocin administration in autism, with results indicating that a low dose of oxytocin can significantly modulate overt emotion salience despite minimal systemic exposure. PMID:28534875

  3. Dose-dependent social-cognitive effects of intranasal oxytocin delivered with novel Breath Powered device in adults with autism spectrum disorder: a randomized placebo-controlled double-blind crossover trial.

    PubMed

    Quintana, D S; Westlye, L T; Hope, S; Nærland, T; Elvsåshagen, T; Dørum, E; Rustan, Ø; Valstad, M; Rezvaya, L; Lishaugen, H; Stensønes, E; Yaqub, S; Smerud, K T; Mahmoud, R A; Djupesland, P G; Andreassen, O A

    2017-05-23

    The neuropeptide oxytocin has shown promise as a treatment for symptoms of autism spectrum disorders (ASD). However, clinical research progress has been hampered by a poor understanding of oxytocin's dose-response and sub-optimal intranasal delivery methods. We examined two doses of oxytocin delivered using a novel Breath Powered intranasal delivery device designed to improve direct nose-to-brain activity in a double-blind, crossover, randomized, placebo-controlled trial. In a randomized sequence of single-dose sessions, 17 male adults with ASD received 8 international units (IU) oxytocin, 24IU oxytocin or placebo followed by four social-cognitive tasks. We observed an omnibus main effect of treatment on the primary outcome measure of overt emotion salience as measured by emotional ratings of faces (η2=0.18). Compared to placebo, 8IU treatment increased overt emotion salience (P=0.02, d=0.63). There was no statistically significant increase after 24IU treatment (P=0.12, d=0.4). The effects after 8IU oxytocin were observed despite no significant increase in peripheral blood plasma oxytocin concentrations. We found no significant effects for reading the mind in the eyes task performance or secondary outcome social-cognitive tasks (emotional dot probe and face-morphing). To our knowledge, this is the first trial to assess the dose-dependent effects of a single oxytocin administration in autism, with results indicating that a low dose of oxytocin can significantly modulate overt emotion salience despite minimal systemic exposure.
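
    For readers unfamiliar with the effect sizes quoted above, η2 expresses the proportion of variance explained by the omnibus treatment effect and Cohen's d scales the 8IU-versus-placebo difference by its variability. For a within-subject crossover, d is often computed on the paired differences, but the abstract does not specify which variant was used, so the sketch below is illustrative only, with invented ratings and sums of squares.

```python
# Hedged sketch of the two effect sizes; all numbers are hypothetical.
import numpy as np

def eta_squared(ss_effect, ss_total):
    """Proportion of total variance explained by the effect."""
    return ss_effect / ss_total

def cohens_d_paired(x, y):
    """Cohen's d computed on paired differences (one common choice for crossover designs)."""
    diff = np.asarray(x, float) - np.asarray(y, float)
    return diff.mean() / diff.std(ddof=1)

# toy usage: 17 participants' emotion-salience ratings under placebo and 8IU oxytocin
rng = np.random.default_rng(3)
placebo = rng.normal(5.0, 1.0, 17)
oxt_8iu = placebo + rng.normal(0.6, 0.9, 17)
print(round(eta_squared(3.1, 17.2), 2), round(cohens_d_paired(oxt_8iu, placebo), 2))
```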

  4. Amygdala hyperactivation to angry faces in intermittent explosive disorder.

    PubMed

    McCloskey, Michael S; Phan, K Luan; Angstadt, Mike; Fettich, Karla C; Keedy, Sarah; Coccaro, Emil F

    2016-08-01

    Individuals with intermittent explosive disorder (IED) were previously found to exhibit amygdala hyperactivation and relatively reduced orbital medial prefrontal cortex (OMPFC) activation to angry faces while performing an implicit emotion information processing task during functional magnetic resonance imaging (fMRI). This study examines the neural substrates associated with explicit encoding of facial emotions among individuals with IED. Twenty unmedicated IED subjects and twenty healthy, matched comparison subjects (HC) underwent fMRI while viewing blocks of angry, happy, and neutral faces and identifying the emotional valence of each face (positive, negative or neutral). We compared amygdala and OMPFC reactivity to faces between IED and HC subjects. We also examined the relationship between amygdala/OMPFC activation and aggression severity. Compared to controls, the IED group exhibited greater amygdala response to angry (vs. neutral) facial expressions. In contrast, IED and control groups did not differ in OMPFC activation to angry faces. Across subjects amygdala activation to angry faces was correlated with number of prior aggressive acts. These findings extend previous evidence of amygdala dysfunction in response to the identification of an ecologically-valid social threat signal (processing angry faces) among individuals with IED, further substantiating a link between amygdala hyperactivity to social signals of direct threat and aggression. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. On the flexibility of social source memory: a test of the emotional incongruity hypothesis.

    PubMed

    Bell, Raoul; Buchner, Axel; Kroneisen, Meike; Giang, Trang

    2012-11-01

    A popular hypothesis in evolutionary psychology posits that reciprocal altruism is supported by a cognitive module that helps cooperative individuals to detect and remember cheaters. Consistent with this hypothesis, a source memory advantage for faces of cheaters (better memory for the cheating context in which these faces were encountered) was observed in previous studies. Here, we examined whether positive or negative expectancies would influence source memory for cheaters and cooperators. A cooperation task with virtual opponents was used in Experiments 1 and 2. Source memory for the emotionally incongruent information was enhanced relative to the congruent information: In Experiment 1, source memory was best for cheaters with likable faces and for cooperators with unlikable faces; in Experiment 2, source memory was better for smiling cheater faces than for smiling cooperator faces, and descriptively better for angry cooperator faces than for angry cheater faces. Experiments 3 and 4 showed that the emotional incongruity effect generalizes to 3rd-party reputational information (descriptions of cheating and trustworthy behavior). The results are inconsistent with the assumption of a highly specific cheater detection module. Focusing on expectancy-incongruent information may represent a more efficient, general, and hence more adaptive memory strategy for remembering exchange-relevant information than focusing only on cheaters.

  6. Recognition of facial and musical emotions in Parkinson's disease.

    PubMed

    Saenz, A; Doé de Maindreville, A; Henry, A; de Labbey, S; Bakchine, S; Ehrlé, N

    2013-03-01

    Patients with amygdala lesions were found to be impaired in recognizing the fear emotion both from face and from music. In patients with Parkinson's disease (PD), impairment in the recognition of emotions from facial expressions has been reported for disgust, fear, sadness and anger, but no studies had yet investigated this population for the recognition of emotions from both face and music. The ability to recognize basic universal emotions (fear, happiness and sadness) from both face and music was investigated in 24 medicated patients with PD and 24 healthy controls. The patient group was tested for language (verbal fluency tasks), memory (digit and spatial span), executive functions (Similarities and Picture Completion subtests of the WAIS III, Brixton and Stroop tests), and visual attention (Bells test), and completed self-assessment tests for anxiety and depression. Results showed that the PD group was significantly impaired in the recognition of both fear and sadness from facial expressions, whereas their performance in the recognition of emotions from musical excerpts did not differ from that of the control group. The scores for fear and sadness recognition from faces were correlated neither with scores on tests of executive and cognitive functions nor with scores on the self-assessment scales. We attributed the observed dissociation to the modality (visual vs. auditory) of presentation and to the ecological value of the musical stimuli that we used. We discuss the relevance of our findings for the care of patients with PD. © 2012 The Author(s) European Journal of Neurology © 2012 EFNS.

  7. Is cross-modal integration of emotional expressions independent of attentional resources?

    PubMed

    Vroomen, J; Driver, J; de Gelder, B

    2001-12-01

    In this study, we examined whether integration of visual and auditory information about emotions requires limited attentional resources. Subjects judged whether a voice expressed happiness or fear, while trying to ignore a concurrently presented static facial expression. As an additional task, the subjects had to add two numbers together rapidly (Experiment 1), count the occurrences of a target digit in a rapid serial visual presentation (Experiment 2), or judge the pitch of a tone as high or low (Experiment 3). The visible face had an impact on judgments of the emotion of the heard voice in all the experiments. This cross-modal effect was independent of whether or not the subjects performed a demanding additional task. This suggests that integration of visual and auditory information about emotions may be a mandatory process, unconstrained by attentional resources.

  8. Gender Differences in Neural Responses to Perceptually Invisible Fearful Face—An ERP Study

    PubMed Central

    Lee, Seung A.; Kim, Chai-Youn; Shim, Miseon; Lee, Seung-Hwan

    2017-01-01

    Women tend to respond to emotional stimuli differently from men. This study aimed at investigating whether neural responses to perceptually “invisible” emotional stimuli differ between men and women by exploiting event-related potentials (ERPs). Forty healthy participants (21 women) were recruited for the main experiment. A control experiment was conducted by excluding nine (7 women) participants from the main experiment and replacing them with an additional ten (6 women) participants (total 41 participants), in which Beck's Anxiety Inventory (BAI) and Beck's Depression Inventory (BDI) scores were controlled. Using the visual backward masking paradigm, either a fearful or a neutral face stimulus was presented at varied durations (subthreshold, near-threshold, or suprathreshold) followed by a mask. Participants performed a two-alternative forced choice (2-AFC) emotion discrimination task on each face. Behavioral analysis showed that participants were unaware of the masked stimuli with the shortest duration, which were therefore processed at subthreshold level. Nevertheless, women showed significantly larger P100 amplitudes to subthreshold fearful faces than men. This result remained consistent in the control experiment. Our findings indicate gender differences in the neural response to subthreshold emotional faces, which is reflected in the early processing stage. PMID:28184189
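
    The behavioural "invisibility" criterion used here can be made concrete with a minimal Python sketch (hypothetical trial counts, not data from the study): a masked face counts as subthreshold only if 2-AFC discrimination does not exceed chance.

```python
# Hedged sketch with hypothetical counts, not data from the study.
# A masked face is treated as perceptually "invisible" when 2-AFC
# emotion discrimination does not exceed chance (p = 0.5).
from scipy.stats import binomtest

n_trials = 80    # hypothetical subthreshold trials for one participant
n_correct = 43   # hypothetical correct fearful-vs-neutral judgements

result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"2-AFC accuracy = {n_correct / n_trials:.2f}, p = {result.pvalue:.3f}")
# A non-significant result is consistent with chance-level discrimination,
# i.e. with unawareness of the masked face.
```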

  9. Understanding the Impact of User Frustration Intensities on Task Performance Using the OCC Theory of Emotions

    NASA Technical Reports Server (NTRS)

    Washington, Gloria

    2012-01-01

    Have you heard the saying "frustration is written all over your face"? Well, this saying is true, but the face is not the only place. Frustration is written all over your face and your body. The human body has various means to communicate an emotion without the utterance of a single word. The Media Equation says that people interact with computers as if they are human: this includes experiencing frustration. This research measures frustration by monitoring human body-based measures such as heart rate, posture, skin temperature, and respiration. The OCC Theory of Emotions is used to separate frustration into different levels or intensities. The results of this study showed that individual intensities of frustration exist at which task performance is not degraded. Results from this study can be used by usability testers to model how much frustration is needed before task performance measures start to decrease.

  10. Alcohol acutely enhances decoding of positive emotions and emotional concern for positive stimuli and facilitates the viewing of sexual images.

    PubMed

    Dolder, Patrick C; Holze, Friederike; Liakoni, Evangelia; Harder, Samuel; Schmid, Yasmin; Liechti, Matthias E

    2017-01-01

    Social cognition influences social interactions. Alcohol reportedly facilitates social interactions. However, the acute effects of alcohol on social cognition are relatively poorly studied. We investigated the effects of alcoholic or non-alcoholic beer on emotion recognition, empathy, and sexual arousal using the dynamic face emotion recognition task (FERT), Multifaceted Empathy Test (MET), and Sexual Arousal Task (SAT) in a double-blind, random-order, cross-over study in 60 healthy social drinkers. We also assessed subjective effects using visual analog scales (VASs), blood alcohol concentrations, and plasma oxytocin levels. Alcohol increased VAS ratings of stimulated, happy, talkative, open, and want to be with others. The subjective effects of alcohol were greater in participants with higher trait inhibitedness. Alcohol facilitated the recognition of happy faces on the FERT and enhanced emotional empathy for positive stimuli on the MET, particularly in participants with low trait empathy. Pictures of explicit sexual content were rated as less pleasant than neutral pictures after non-alcoholic beer but not after alcoholic beer. Explicit sexual pictures were rated as more pleasant after alcoholic beer compared with non-alcoholic beer, particularly in women. Alcohol did not alter the levels of circulating oxytocin. Alcohol biased emotion recognition toward better decoding of positive emotions and increased emotional concern for positive stimuli. No support was found for a modulatory role of oxytocin. Alcohol also facilitated the viewing of sexual images, consistent with disinhibition, but it did not actually enhance sexual arousal. These effects of alcohol on social cognition likely enhance sociability. www.clinicaltrials.gov/ct2/show/NCT02318823.
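
    In a two-session cross-over design like this, each drinker contributes one score per beverage condition, so a paired within-subject comparison is the natural first-pass test. The sketch below uses simulated, hypothetical FERT accuracies, not the study's data.

```python
# Hedged sketch: simulated, hypothetical happy-face recognition accuracies.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
nonalcoholic = rng.normal(0.70, 0.08, size=60)               # non-alcoholic beer session
alcoholic = nonalcoholic + rng.normal(0.05, 0.06, size=60)   # alcoholic beer session

t, p = ttest_rel(alcoholic, nonalcoholic)                    # paired within-subject comparison
print(f"paired t(59) = {t:.2f}, p = {p:.4f}")
```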

  11. Differential Brain Activation to Angry Faces by Elite Warfighters: Neural Processing Evidence for Enhanced Threat Detection

    PubMed Central

    Paulus, Martin P.; Simmons, Alan N.; Fitzpatrick, Summer N.; Potterat, Eric G.; Van Orden, Karl F.; Bauman, James; Swain, Judith L.

    2010-01-01

    Background Little is known about the neural basis of elite performers and their optimal performance in extreme environments. The purpose of this study was to examine brain processing differences between elite warfighters and comparison subjects in brain structures that are important for emotion processing and interoception. Methodology/Principal Findings Off-duty Navy Sea, Air, and Land Forces (SEAL) personnel (n = 11) were compared with healthy male volunteers (n = 23) while performing a simple emotion face-processing task during functional magnetic resonance imaging. Irrespective of the target emotion, elite warfighters relative to comparison subjects showed relatively greater right-sided insula, but attenuated left-sided insula, activation. Navy SEALs showed selectively greater activation to angry target faces relative to fearful or happy target faces bilaterally in the insula. This was not accounted for by contrasting positive versus negative emotions. Finally, these individuals also showed slower response latencies to fearful and happy target faces than did comparison subjects. Conclusions/Significance These findings support the hypothesis that elite warfighters deploy greater processing resources toward potential threat-related facial expressions and reduced processing resources to non-threat-related facial expressions. Moreover, rather than expending more effort in general, elite warfighters show more focused neural and performance tuning. In other words, greater neural processing resources are directed toward threat stimuli and processing resources are conserved when facing a nonthreat stimulus situation. PMID:20418943

  12. Emotional words facilitate lexical but not early visual processing.

    PubMed

    Trauer, Sophie M; Kotz, Sonja A; Müller, Matthias M

    2015-12-12

    Emotional scenes and faces have been shown to capture and bind visual resources at early sensory processing stages, i.e. in early visual cortex. However, emotional words have led to mixed results. In the current study, ERPs were assessed simultaneously with steady-state visual evoked potentials (SSVEPs) to measure attention effects on early visual activity in emotional word processing. Neutral and negative words were flickered at 12.14 Hz whilst participants performed a Lexical Decision Task. Emotional word content did not modulate the 12.14 Hz SSVEP amplitude, nor did word lexicality. However, emotional words affected the ERP. Negative compared to neutral words, as well as words compared to pseudowords, led to enhanced deflections in the P2 time range, indicative of lexico-semantic access. The N400 was reduced for negative compared to neutral words and enhanced for pseudowords compared to words, indicating facilitated semantic processing of emotional words. LPC amplitudes reflected word lexicality and thus the task-relevant response. In line with previous ERP and imaging evidence, the present results indicate that written emotional words are facilitated in processing only subsequent to visual analysis.
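
    The SSVEP measure described here boils down to the spectral amplitude of the occipital EEG at the 12.14 Hz flicker frequency. The sketch below illustrates that computation on a synthetic signal with an assumed sampling rate; it is not the authors' analysis code.

```python
# Hedged sketch: synthetic signal, assumed sampling rate (not from the paper).
import numpy as np

fs = 500.0                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 4.0, 1 / fs)   # one 4-s epoch
flicker = 12.14                 # word-flicker frequency used in the study

# Synthetic occipital signal: an SSVEP at the flicker frequency plus noise.
rng = np.random.default_rng(0)
eeg = 1.5 * np.sin(2 * np.pi * flicker * t) + rng.normal(0, 1, size=t.size)

amps = 2 * np.abs(np.fft.rfft(eeg)) / t.size         # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
ssvep = amps[np.argmin(np.abs(freqs - flicker))]     # amplitude at the nearest frequency bin
print(f"SSVEP amplitude near {flicker} Hz: {ssvep:.2f} (arbitrary units)")
```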

  13. Does parental anxiety cause biases in the processing of child-relevant threat material?

    PubMed

    Cartwright-Hatton, Sam; Abeles, Paul; Dixon, Clare; Holliday, Christine; Hills, Becky

    2014-06-01

    Anxiety leads to biases in processing personally relevant information. This study set out to examine whether anxious parents also experience biases in processing child-relevant material. Ninety parents either served as a control group or received a social anxiety or child-related anxiety induction. They completed a task examining attentional biases in relation to child-threat words and social-threat words, and a task examining ability to categorize emotion in children's faces and voices. There was a trend towards group differences in attentional bias to social-threat words, which appeared to be present only in the social anxiety condition and not in the child anxiety or control conditions. For child-threat words, attentional bias was present in the child anxiety condition, but not the social anxiety or control conditions. In the emotion recognition task, there was no difference between the control and child anxiety conditions, but parents in the social anxiety condition were more likely to erroneously label children's faces and voices as sad. Parents' anxious biases may spill over into their child's world. Anxious parents may have attentional biases towards threats in their children's environment. Anxious parents may over-attribute negative emotion to children. © 2013 The British Psychological Society.

  14. Subgenual Cingulum Microstructure Supports Control of Emotional Conflict.

    PubMed

    Keedwell, Paul A; Doidge, Amie N; Meyer, Marcel; Lawrence, Natalia; Lawrence, Andrew D; Jones, Derek K

    2016-06-01

    Major depressive disorder (MDD) is associated with specific difficulties in attentional disengagement from negatively valenced material. Diffusion MRI studies have demonstrated altered white matter microstructure in the subgenual cingulum bundle (CB) in individuals with MDD, though the functional significance of these alterations has not been examined formally. This study explored whether individual differences in selective attention to negatively valenced stimuli are related to interindividual differences in subgenual CB microstructure. Forty-six individuals (21 with remitted MDD, 25 never depressed) completed an emotional Stroop task, using happy and angry distractor faces overlaid by pleasant or unpleasant target words and a control gender-based Stroop task. CBs were reconstructed in 38 individuals using diffusion-weighted imaging and tractography, and mean fractional anisotropy (FA) computed for the subgenual, retrosplenial, and parahippocampal subdivisions. No significant correlations were found between FA and performance in the control gender-based Stroop task in any CB region. However, the degree of interference produced by angry face distractors on time to identify pleasant words (emotional conflict) correlated selectively with FA in the subgenual CB (r = -0.53; P = 0.01). Higher FA was associated with reduced interference, irrespective of a diagnosis of MDD, suggesting that subgenual CB microstructure is functionally relevant for regulating attentional bias toward negative interpersonal stimuli. © The Author 2016. Published by Oxford University Press.
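
    The key result is a simple brain-behaviour correlation. The sketch below shows the form of that analysis, a Pearson correlation between subgenual CB fractional anisotropy and angry-face interference, using simulated, hypothetical values for 38 participants.

```python
# Hedged sketch: simulated, hypothetical values, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
fa = rng.normal(0.45, 0.04, size=38)                                  # mean subgenual CB FA
interference = 120 - 600 * (fa - 0.45) + rng.normal(0, 25, size=38)   # emotional conflict (ms)

r, p = pearsonr(fa, interference)
print(f"r = {r:.2f}, p = {p:.3f}")   # higher FA associated with less interference
```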

  15. Subgenual Cingulum Microstructure Supports Control of Emotional Conflict

    PubMed Central

    Keedwell, Paul A.; Doidge, Amie N.; Meyer, Marcel; Lawrence, Natalia; Lawrence, Andrew D.; Jones, Derek K.

    2016-01-01

    Major depressive disorder (MDD) is associated with specific difficulties in attentional disengagement from negatively valenced material. Diffusion MRI studies have demonstrated altered white matter microstructure in the subgenual cingulum bundle (CB) in individuals with MDD, though the functional significance of these alterations has not been examined formally. This study explored whether individual differences in selective attention to negatively valenced stimuli are related to interindividual differences in subgenual CB microstructure. Forty-six individuals (21 with remitted MDD, 25 never depressed) completed an emotional Stroop task, using happy and angry distractor faces overlaid by pleasant or unpleasant target words and a control gender-based Stroop task. CBs were reconstructed in 38 individuals using diffusion-weighted imaging and tractography, and mean fractional anisotropy (FA) computed for the subgenual, retrosplenial, and parahippocampal subdivisions. No significant correlations were found between FA and performance in the control gender-based Stroop task in any CB region. However, the degree of interference produced by angry face distractors on time to identify pleasant words (emotional conflict) correlated selectively with FA in the subgenual CB (r = −0.53; P = 0.01). Higher FA was associated with reduced interference, irrespective of a diagnosis of MDD, suggesting that subgenual CB microstructure is functionally relevant for regulating attentional bias toward negative interpersonal stimuli. PMID:27048427

  16. Infants’ sensitivity to emotion in music and emotion-action understanding

    PubMed Central

    Siu, Tik-Sze Carrey; Cheung, Him

    2017-01-01

    Emerging evidence has indicated infants’ early sensitivity to acoustic cues in music. Do they interpret these cues in emotional terms to represent others’ affective states? The present study examined infants’ development of emotional understanding of music with a violation-of-expectation paradigm. Twelve- and 20-month-olds were presented with emotionally concordant and discordant music-face displays on alternate trials. The 20-month-olds, but not the 12-month-olds, were surprised by emotional incongruence between musical and facial expressions, suggesting their sensitivity to musical emotion. In a separate non-music task, only the 20-month-olds were able to use an actress’s affective facial displays to predict her subsequent action. Interestingly, for the 20-month-olds, such emotion-action understanding correlated with sensitivity to musical expressions measured in the first task. These two abilities however did not correlate with family income, parental estimation of language and communicative skills, and quality of parent-child interaction. The findings suggest that sensitivity to musical emotion and emotion-action understanding may be supported by a generalised common capacity to represent emotion from social cues, which lays a foundation for later social-communicative development. PMID:28152081

  17. A voxel-based lesion study on facial emotion recognition after penetrating brain injury

    PubMed Central

    Dal Monte, Olga; Solomon, Jeffrey M.; Schintu, Selene; Knutson, Kristine M.; Strenziok, Maren; Pardini, Matteo; Leopold, Anne; Raymont, Vanessa; Grafman, Jordan

    2013-01-01

    The ability to read emotions in the face of another person is an important social skill that can be impaired in subjects with traumatic brain injury (TBI). To determine the brain regions that modulate facial emotion recognition, we conducted a whole-brain analysis using a well-validated facial emotion recognition task and voxel-based lesion symptom mapping (VLSM) in a large sample of patients with focal penetrating TBIs (pTBIs). Our results revealed that individuals with pTBI performed significantly worse than normal controls in recognizing unpleasant emotions. VLSM mapping results showed that impairment in facial emotion recognition was due to damage in a bilateral fronto-temporo-limbic network, including medial prefrontal cortex (PFC), anterior cingulate cortex, left insula and temporal areas. Besides these common areas, damage to the bilateral and anterior regions of PFC led to impairment in recognizing unpleasant emotions, whereas damage to bilateral posterior PFC and left temporal areas led to impairment in recognizing pleasant emotions. Our findings add empirical evidence that the ability to read pleasant and unpleasant emotions in other people's faces is a complex process involving not only a common network that includes bilateral fronto-temporo-limbic lobes, but also other regions depending on emotional valence. PMID:22496440
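
    In outline, VLSM asks at every voxel whether patients whose lesion includes that voxel score worse on the behavioural measure than patients whose lesion spares it. The sketch below is a deliberately simplified, hypothetical version with toy data and no multiple-comparison correction; it is not the authors' pipeline.

```python
# Hedged, highly simplified VLSM sketch on toy data.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
n_patients, n_voxels = 100, 500                       # toy dimensions
lesion = rng.random((n_patients, n_voxels)) < 0.2     # binary lesion maps (patient x voxel)
score = rng.normal(0.8, 0.1, size=n_patients)         # emotion-recognition accuracy

t_map = np.full(n_voxels, np.nan)
for v in range(n_voxels):
    hit = score[lesion[:, v]]                         # patients lesioned at voxel v
    spared = score[~lesion[:, v]]
    if hit.size >= 5 and spared.size >= 5:            # minimum-overlap criterion
        t_map[v], _ = ttest_ind(spared, hit)          # positive t: lesion -> lower score

print(f"voxels tested: {np.isfinite(t_map).sum()}, max |t| = {np.nanmax(np.abs(t_map)):.2f}")
# A real VLSM analysis would add a multiple-comparison correction (e.g. permutation testing).
```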

  18. Emotional task management: neural correlates of switching between affective and non-affective task-sets.

    PubMed

    Reeck, Crystal; Egner, Tobias

    2015-08-01

    Although task-switching has been investigated extensively, its interaction with emotionally salient task content remains unclear. Prioritized processing of affective stimulus content may enhance accessibility of affective task-sets and generate increased interference when switching between affective and non-affective task-sets. Previous research has demonstrated that more dominant task-sets experience greater switch costs, as they necessitate active inhibition during performance of less entrenched tasks. Extending this logic to the affective domain, the present experiment examined (a) whether affective task-sets are more dominant than non-affective ones, and (b) what neural mechanisms regulate affective task-sets, so that weaker, non-affective task-sets can be executed. While undergoing functional magnetic resonance imaging, participants categorized face stimuli according to either their gender (non-affective task) or their emotional expression (affective task). Behavioral results were consistent with the affective task dominance hypothesis: participants were slower to switch to the affective task, and cross-task interference was strongest when participants tried to switch from the affective to the non-affective task. These behavioral costs of controlling the affective task-set were mirrored in the activation of a right-lateralized frontostriatal network previously implicated in task-set updating and response inhibition. Connectivity between amygdala and right ventrolateral prefrontal cortex was especially pronounced during cross-task interference from affective features. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
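
    Behaviourally, the dominance claim rests on asymmetric switch costs: the reaction-time cost of switching to a task, relative to repeating it, is larger for the dominant task. The sketch below computes per-task switch costs from hypothetical trial data.

```python
# Hedged sketch: hypothetical reaction times, not the study's data.
import numpy as np

rng = np.random.default_rng(3)

def mean_rt(mu_ms, n=200):
    """Mean of n simulated reaction times (ms) drawn around mu_ms."""
    return rng.normal(mu_ms, 80, size=n).mean()

rt = {
    ("emotion", "repeat"): mean_rt(650),
    ("emotion", "switch"): mean_rt(780),   # larger cost when switching TO the affective task
    ("gender", "repeat"): mean_rt(620),
    ("gender", "switch"): mean_rt(690),    # smaller cost for the non-affective task
}

for task in ("emotion", "gender"):
    cost = rt[(task, "switch")] - rt[(task, "repeat")]
    print(f"switch cost, {task} task: {cost:.0f} ms")
```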

  19. Visual and auditory socio-cognitive perception in unilateral temporal lobe epilepsy in children and adolescents: a prospective controlled study.

    PubMed

    Laurent, Agathe; Arzimanoglou, Alexis; Panagiotakaki, Eleni; Sfaello, Ignacio; Kahane, Philippe; Ryvlin, Philippe; Hirsch, Edouard; de Schonen, Scania

    2014-12-01

    A high rate of abnormal social behavioural traits or perceptual deficits is observed in children with unilateral temporal lobe epilepsy. In the present study, perception of auditory and visual social signals, carried by faces and voices, was evaluated in children or adolescents with temporal lobe epilepsy. We prospectively investigated a sample of 62 children with focal non-idiopathic epilepsy early in the course of the disorder. The present analysis included 39 children with a confirmed diagnosis of temporal lobe epilepsy. Seventy-two control participants, distributed across 10 age groups, served as the comparison group. Our socio-perceptual evaluation protocol comprised three socio-visual tasks (face identity, facial emotion and gaze direction recognition), two socio-auditory tasks (voice identity and emotional prosody recognition), and three control tasks (lip reading, geometrical pattern and linguistic intonation recognition). All 39 patients also underwent a neuropsychological examination. As a group, children with temporal lobe epilepsy performed at a significantly lower level compared to the control group with regards to recognition of facial identity, direction of eye gaze, and emotional facial expressions. We found no relationship between the type of visual deficit and age at first seizure, duration of epilepsy, or the epilepsy-affected cerebral hemisphere. Deficits in socio-perceptual tasks could be found independently of the presence of deficits in visual or auditory episodic memory, visual non-facial pattern processing (control tasks), or speech perception. A normal FSIQ did not rule out an underlying deficit in some of the socio-perceptual tasks. Temporal lobe epilepsy not only impairs development of emotion recognition, but can also impair development of perception of other socio-perceptual signals in children with or without intellectual disability. Prospective studies need to be designed to evaluate the results of appropriate re-education programs in children presenting with deficits in social cue processing.

  20. Abnormal GABAergic function and face processing in schizophrenia: A pharmacologic-fMRI study.

    PubMed

    Tso, Ivy F; Fang, Yu; Phan, K Luan; Welsh, Robert C; Taylor, Stephan F

    2015-10-01

    The involvement of the gamma-aminobutyric acid (GABA) system in schizophrenia is suggested by postmortem studies and the common use of GABA receptor-potentiating agents in treatment. In a recent study, we used a benzodiazepine challenge to demonstrate abnormal GABAergic function during processing of negative visual stimuli in schizophrenia. This study extended this investigation by mapping GABAergic mechanisms associated with face processing and social appraisal in schizophrenia using a benzodiazepine challenge. Fourteen stable, medicated schizophrenia/schizoaffective patients (SZ) and 13 healthy controls (HC) underwent functional MRI using the blood oxygenation level-dependent (BOLD) technique while they performed the Socio-emotional Preference Task (SePT) on emotional face stimuli ("Do you like this face?"). Participants received single-blinded intravenous saline and lorazepam (LRZ) in two separate sessions separated by 1-3 weeks. Both SZ and HC recruited medial prefrontal cortex/anterior cingulate during the SePT, relative to gender identification. A significant drug by group interaction was observed in the medial occipital cortex, such that SZ showed increased BOLD signal to LRZ challenge, while HC showed an expected decrease of signal; the interaction did not vary by task. The altered BOLD response to LRZ challenge in SZ was significantly correlated with increased negative affect across multiple measures. The altered response to LRZ challenge suggests that abnormal face processing and negative affect in SZ are associated with altered GABAergic function in the visual cortex, underscoring the role of impaired visual processing in socio-emotional deficits in schizophrenia. Copyright © 2015 Elsevier B.V. All rights reserved.
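
    With drug as a within-subject factor and group as a between-subject factor, the drug-by-group interaction can be checked by comparing each participant's lorazepam-minus-saline change in the region of interest across groups. The sketch below uses simulated, hypothetical ROI values, not the study's data.

```python
# Hedged sketch: simulated, hypothetical lorazepam-minus-saline BOLD changes.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
hc_change = rng.normal(-0.3, 0.2, size=13)   # controls: expected BOLD decrease under LRZ
sz_change = rng.normal(+0.2, 0.2, size=14)   # patients: paradoxical BOLD increase under LRZ

t, p = ttest_ind(sz_change, hc_change)       # group difference in within-subject change
print(f"drug-by-group interaction: t = {t:.2f}, p = {p:.4f}")
```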
