Sample records for face emotion processing

  1. Face to face with emotion: holistic face processing is modulated by emotional state.

    PubMed

    Curby, Kim M; Johnson, Kareem J; Tyson, Alyssa

    2012-01-01

    Negative emotions are linked with a local, rather than global, visual processing style, which may preferentially facilitate feature-based, relative to holistic, processing mechanisms. Because faces are typically processed holistically, and because social contexts are prime elicitors of emotions, we examined whether negative emotions decrease holistic processing of faces. We induced positive, negative, or neutral emotions via film clips and measured holistic processing before and after the induction: participants made judgements about cued parts of chimeric faces, and holistic processing was indexed by the interference caused by task-irrelevant face parts. Emotional state significantly modulated face-processing style, with the negative emotion induction leading to decreased holistic processing. Furthermore, self-reported change in emotional state correlated with changes in holistic processing. These results contrast with general assumptions that holistic processing of faces is automatic and immune to outside influences, and they illustrate emotion's power to modulate socially relevant aspects of visual perception.
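The interference measure described above is, at heart, a congruency contrast. A minimal sketch, assuming mean response times per condition; the trial values and condition labels are hypothetical, not data from the study:

```python
# Sketch of the composite-face interference index used to quantify
# holistic processing: judgments about a cued face half suffer when the
# task-irrelevant half conflicts. All trial values (ms) are hypothetical.

def interference_index(rt_incongruent, rt_congruent):
    """Extra mean response time (ms) caused by a conflicting,
    task-irrelevant face half; larger values = more holistic processing."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_incongruent) - mean(rt_congruent)

# Hypothetical trials: cued top half of an aligned chimeric face.
incongruent = [720, 750, 710, 740]  # irrelevant bottom half conflicts
congruent = [650, 670, 660, 680]    # irrelevant bottom half matches

print(interference_index(incongruent, congruent))  # 65.0
```

Comparing this index before and after an emotion induction, as the study does, would show holistic processing shrinking if the index decreases.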

  2. Subliminal Face Emotion Processing: A Comparison of Fearful and Disgusted Faces.

    PubMed

    Khalid, Shah; Ansorge, Ulrich

    2017-01-01

    Prior research has provided evidence for (1) subcortical processing of subliminal facial expressions of emotion and (2) the emotion-specificity of these processes. Here, we investigated whether this is also true for the processing of the subliminal facial display of disgust. In Experiment 1, we used differently filtered masked prime faces portraying emotionally neutral or disgusted expressions, presented prior to clearly visible target faces, to test whether the masked primes nonetheless influenced target processing. Although we found evidence for subliminal face congruence or priming effects, in particular reverse priming by low-spatial-frequency disgusted face primes, we did not find any support for a subcortical origin of the effect. In Experiment 2, we compared the influence of subliminal disgusted faces with that of subliminal fearful faces and demonstrated a behavioral performance difference between the two, pointing to emotion-specific processing of the disgusted facial expressions. In both experiments, we also tested whether subliminal emotional face processing depends on spatial attention, with mixed results suggesting attention-independence in Experiment 1 but not in Experiment 2, and we found perfect masking of the face primes, that is, proof of the subliminality of the prime faces. Based on our findings, we speculate that subliminal facial expressions of disgust could afford easy avoidance of these faces. This could be a unique effect of disgusted faces as compared with other emotional facial displays, at least under the conditions studied here.

  3. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    PubMed

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  4. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    PubMed Central

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  5. Subliminal Face Emotion Processing: A Comparison of Fearful and Disgusted Faces

    PubMed Central

    Khalid, Shah; Ansorge, Ulrich

    2017-01-01

    Prior research has provided evidence for (1) subcortical processing of subliminal facial expressions of emotion and (2) the emotion-specificity of these processes. Here, we investigated whether this is also true for the processing of the subliminal facial display of disgust. In Experiment 1, we used differently filtered masked prime faces portraying emotionally neutral or disgusted expressions, presented prior to clearly visible target faces, to test whether the masked primes nonetheless influenced target processing. Although we found evidence for subliminal face congruence or priming effects, in particular reverse priming by low-spatial-frequency disgusted face primes, we did not find any support for a subcortical origin of the effect. In Experiment 2, we compared the influence of subliminal disgusted faces with that of subliminal fearful faces and demonstrated a behavioral performance difference between the two, pointing to emotion-specific processing of the disgusted facial expressions. In both experiments, we also tested whether subliminal emotional face processing depends on spatial attention, with mixed results suggesting attention-independence in Experiment 1 but not in Experiment 2, and we found perfect masking of the face primes, that is, proof of the subliminality of the prime faces. Based on our findings, we speculate that subliminal facial expressions of disgust could afford easy avoidance of these faces. This could be a unique effect of disgusted faces as compared with other emotional facial displays, at least under the conditions studied here. PMID:28680413

  6. Emotionally anesthetized: media violence induces neural changes during emotional face processing

    PubMed Central

    Stockdale, Laura A.; Morrison, Robert G.; Kmiecik, Matthew J.; Garbarino, James

    2015-01-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others’ emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior. PMID:25759472

  7. Time course of implicit processing and explicit processing of emotional faces and emotional words.

    PubMed

    Frühholz, Sascha; Jellinghaus, Anne; Herrmann, Manfred

    2011-05-01

    Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity.
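The ERP components named in this abstract (N170, EPN, LPP) are typically quantified as mean amplitudes within fixed post-stimulus time windows. A minimal numpy sketch on simulated single-trial epochs; the sampling rate, window, and injected effect size are illustrative assumptions, not parameters from the study:

```python
import numpy as np

# Quantify an ERP component as the mean amplitude (microvolts) inside a
# post-stimulus time window, averaged over trials. All data are simulated.

def component_amplitude(epochs, times, window):
    """epochs: (n_trials, n_samples); times: (n_samples,) in seconds;
    window: (start, end) in seconds. Returns mean amplitude in window."""
    mask = (times >= window[0]) & (times <= window[1])
    return float(epochs[:, mask].mean())

fs = 250.0                                  # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1.0 / fs)      # -100 to +500 ms epoch
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 1.0, (40, times.size))      # 40 trials of noise
epochs[:, (times >= 0.15) & (times <= 0.20)] -= 5.0  # inject an N170-like dip

n170 = component_amplitude(epochs, times, (0.15, 0.20))
print(n170)  # close to -5, the injected component amplitude
```

Comparing such window means across conditions (e.g., emotional vs. neutral faces) is how the amplitude modulations reported in these studies are measured.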

  8. Emotionally anesthetized: media violence induces neural changes during emotional face processing.

    PubMed

    Stockdale, Laura A; Morrison, Robert G; Kmiecik, Matthew J; Garbarino, James; Silton, Rebecca L

    2015-10-01

    Media violence exposure causes increased aggression and decreased prosocial behavior, suggesting that media violence desensitizes people to the emotional experience of others. Alterations in emotional face processing following exposure to media violence may result in desensitization to others' emotional states. This study used scalp electroencephalography methods to examine the link between exposure to violence and neural changes associated with emotional face processing. Twenty-five participants were shown a violent or nonviolent film clip and then completed a gender discrimination stop-signal task using emotional faces. Media violence did not affect the early visual P100 component; however, decreased amplitude was observed in the N170 and P200 event-related potentials following the violent film, indicating that exposure to film violence leads to suppression of holistic face processing and implicit emotional processing. Participants who had just seen a violent film showed increased frontal N200/P300 amplitude. These results suggest that media violence exposure may desensitize people to emotional stimuli and thereby require fewer cognitive resources to inhibit behavior.

  9. Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.

    PubMed

    Karl, Christian; Hewig, Johannes; Osinsky, Roman

    2016-10-01

    There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded electroencephalogram from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.

  10. Rapid communication: Global-local processing affects recognition of distractor emotional faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2011-03-01

    Recent studies have shown links between happy faces and global, distributed attention, and between sad faces and local, focused attention. Emotions have been shown to affect global-local processing. Given that studies on emotion-cognition interactions have not explored the effect of perceptual processing at different spatial scales on processing stimuli with emotional content, the present study investigated the link between perceptual focus and emotional processing. The study investigated the effects of global-local processing on the recognition of distractor faces with emotional expressions. Participants performed a digit discrimination task with digits at either the global level or the local level presented against a distractor face (happy or sad) as background. The results showed that global processing associated with a broad scope of attention facilitates recognition of happy faces, and local processing associated with a narrow scope of attention facilitates recognition of sad faces. The novel results of the study provide conclusive evidence for emotion-cognition interactions by demonstrating the effect of perceptual processing on emotional faces. The results, along with earlier complementary results on the effect of emotion on global-local processing, support a reciprocal relationship between emotional processing and global-local processing. Distractor processing with emotional information also has implications for theories of selective attention.

  11. Event-Related Brain Potential Correlates of Emotional Face Processing

    ERIC Educational Resources Information Center

    Eimer, Martin; Holmes, Amanda

    2007-01-01

    Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was…

  12. The Effect of Self-Referential Expectation on Emotional Face Processing

    PubMed Central

    McKendrick, Mel; Butler, Stephen H.; Grealy, Madeleine A.

    2016-01-01

    The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face. PMID:27175487

  13. The Effect of Self-Referential Expectation on Emotional Face Processing.

    PubMed

    McKendrick, Mel; Butler, Stephen H; Grealy, Madeleine A

    2016-01-01

    The role of self-relevance has been somewhat neglected in static face processing paradigms but may be important in understanding how emotional faces impact on attention, cognition and affect. The aim of the current study was to investigate the effect of self-relevant primes on processing emotional composite faces. Sentence primes created an expectation of the emotion of the face before sad, happy, neutral or composite face photos were viewed. Eye movements were recorded and subsequent responses measured the cognitive and affective impact of the emotion expressed. Results indicated that primes did not guide attention, but impacted on judgments of valence intensity and self-esteem ratings. Negative self-relevant primes led to the most negative self-esteem ratings, although the effect of the prime was qualified by salient facial features. Self-relevant expectations about the emotion of a face and subsequent attention to a face that is congruent with these expectations strengthened the affective impact of viewing the face.

  14. Emotional Processing of Personally Familiar Faces in the Vegetative State

    PubMed Central

    Sharon, Haggai; Pasternak, Yotam; Ben Simon, Eti; Gruberger, Michal; Giladi, Nir; Krimchanski, Ben Zion; Hassin, David; Hendler, Talma

    2013-01-01

    Background: The Vegetative State (VS) is a severe disorder of consciousness in which patients are awake but display no signs of awareness. Yet, recent functional magnetic resonance imaging (fMRI) studies have demonstrated evidence for covert awareness in VS patients by recording specific brain activations during a cognitive task. However, the possible existence of incommunicable subjective emotional experiences in VS patients remains largely unexplored. This study aimed to probe the question of whether VS patients retain a brain ability to selectively process external stimuli according to their emotional value and look for evidence of covert emotional awareness in patients. Methods and Findings: In order to explore these questions, we employed the emotive impact of observing personally familiar faces, known to provoke specific perceptual as well as emotional brain activations. Four VS patients and thirteen healthy controls first underwent an fMRI scan while viewing pictures of non-familiar faces, personally familiar faces and pictures of themselves. In a subsequent imagery task participants were asked to actively imagine one of their parent's faces. Analyses focused on face- and familiarity-selective regional brain activations and inter-regional functional connectivity. Similar to controls, all patients displayed face-selective brain responses, with further limbic and cortical activations elicited by familiar faces. In patients as well as controls, connectivity was observed between emotional, visual and face-specific areas, suggesting aware emotional perception. This connectivity was strongest in the two patients who later recovered. Notably, these two patients also displayed selective amygdala activation during familiar face imagery, with one further exhibiting face-selective activations, indistinguishable from healthy controls.
    Conclusions: Taken together, these results show that selective emotional processing can be elicited in VS patients both by external emotionally

  15. Neurophysiological evidence (ERPs) for hemispheric processing of facial expressions of emotions: Evidence from whole face and chimeric face stimuli.

    PubMed

    Damaskinou, Nikoleta; Watling, Dawn

    2018-05-01

    This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.

  16. Multimodal processing of emotional information in 9-month-old infants I: emotional faces and voices.

    PubMed

    Otte, R A; Donkers, F C L; Braeken, M A K A; Van den Bergh, B R H

    2015-04-01

    Making sense of emotions manifesting in human voice is an important social skill which is influenced by emotions in other modalities, such as that of the corresponding face. Although processing emotional information from voices and faces simultaneously has been studied in adults, little is known about the neural mechanisms underlying the development of this ability in infancy. Here we investigated multimodal processing of fearful and happy face/voice pairs using event-related potential (ERP) measures in a group of 84 9-month-olds. Infants were presented with emotional vocalisations (fearful/happy) preceded by the same or a different facial expression (fearful/happy). The ERP data revealed that the processing of emotional information appearing in human voice was modulated by the emotional expression appearing on the corresponding face: Infants responded with larger auditory ERPs after fearful compared to happy facial primes. This finding suggests that infants dedicate more processing capacities to potentially threatening than to non-threatening stimuli.

  17. Neural circuitry of emotional face processing in autism spectrum disorders.

    PubMed

    Monk, Christopher S; Weng, Shih-Jen; Wiggins, Jillian Lee; Kurapati, Nikhil; Louro, Hugo M C; Carrasco, Melisa; Maslowsky, Julie; Risi, Susan; Lord, Catherine

    2010-03-01

    Autism spectrum disorders (ASD) are associated with severe impairments in social functioning. Because faces provide nonverbal cues that support social interactions, many studies of ASD have examined neural structures that process faces, including the amygdala, ventromedial prefrontal cortex and superior and middle temporal gyri. However, increases or decreases in activation are often contingent on the cognitive task. Specifically, the cognitive domain of attention influences group differences in brain activation. We investigated brain function abnormalities in participants with ASD using a task that monitored attention bias to emotional faces. Twenty-four participants (12 with ASD, 12 controls) completed a functional magnetic resonance imaging study while performing an attention cuing task with emotional (happy, sad, angry) and neutral faces. In response to emotional faces, those in the ASD group showed greater right amygdala activation than those in the control group. A preliminary psychophysiological connectivity analysis showed that ASD participants had stronger positive right amygdala and ventromedial prefrontal cortex coupling and weaker positive right amygdala and temporal lobe coupling than controls. There were no group differences in the behavioural measure of attention bias to the emotional faces. The small sample size may have affected our ability to detect additional group differences. When attention bias to emotional faces was equivalent between ASD and control groups, ASD was associated with greater amygdala activation. Preliminary analyses showed that ASD participants had stronger connectivity between the amygdala and ventromedial prefrontal cortex (a network implicated in emotional modulation) and weaker connectivity between the amygdala and temporal lobe (a pathway involved in the identification of facial expressions, although areas of group differences were generally in a more anterior region of the temporal lobe than what is typically reported for

  18. Association with emotional information alters subsequent processing of neutral faces

    PubMed Central

    Riggs, Lily; Fujioka, Takako; Chan, Jessica; McQuiggan, Douglas A.; Anderson, Adam K.; Ryan, Jennifer D.

    2014-01-01

    The processing of emotional as compared to neutral information is associated with different patterns in eye movement and neural activity. However, the ‘emotionality’ of a stimulus can be conveyed not only by its physical properties, but also by the information that is presented with it. There is very limited work examining how emotional information may influence the immediate perceptual processing of otherwise neutral information. We examined how presenting an emotion label for a neutral face may influence subsequent processing by using eye movement monitoring (EMM) and magnetoencephalography (MEG) simultaneously. Participants viewed a series of faces with neutral expressions. Each face was followed by a unique negative or neutral sentence to describe that person, and then the same face was presented in isolation again. Viewing of faces paired with a negative sentence was associated with increased early viewing of the eye region and increased neural activity between 600 and 1200 ms in emotion processing regions such as the cingulate, medial prefrontal cortex, and amygdala, as well as posterior regions such as the precuneus and occipital cortex. Viewing of faces paired with a neutral sentence was associated with increased activity in the parahippocampal gyrus during the same time window. By monitoring behavior and neural activity within the same paradigm, these findings demonstrate that emotional information alters subsequent visual scanning and the neural systems that are presumably invoked to maintain a representation of the neutral information along with its emotional details. PMID:25566024

  19. Disrupted neural processing of emotional faces in psychopathy.

    PubMed

    Contreras-Rodríguez, Oren; Pujol, Jesus; Batalla, Iolanda; Harrison, Ben J; Bosque, Javier; Ibern-Regàs, Immaculada; Hernández-Ribas, Rosa; Soriano-Mas, Carles; Deus, Joan; López-Solà, Marina; Pifarré, Josep; Menchón, José M; Cardoner, Narcís

    2014-04-01

    Psychopaths show a reduced ability to recognize emotional facial expressions, which may disturb interpersonal relationship development and successful social adaptation. Behavioral hypotheses point toward an association between emotion recognition deficits in psychopathy and amygdala dysfunction. Our prediction was that amygdala dysfunction would combine deficient activation with disturbances in functional connectivity with cortical regions of the face-processing network. Twenty-two psychopaths and 22 control subjects were assessed, and functional magnetic resonance maps were generated to identify both brain activation and task-induced functional connectivity using psychophysiological interaction analysis during an emotional face-matching task. Results showed significant amygdala activation in control subjects only, but differences between study groups did not reach statistical significance. In contrast, psychopaths showed significantly increased activation in visual and prefrontal areas, with this latter activation being associated with psychopaths' affective-interpersonal disturbances. Psychophysiological interaction analyses revealed a reciprocal reduction in functional connectivity between the left amygdala and visual and prefrontal cortices. Our results suggest that emotional stimulation may evoke a relevant cortical response in psychopaths, but a disruption in the processing of emotional faces exists involving the reciprocal functional interaction between the amygdala and neocortex, consistent with the notion of a failure to integrate emotion into cognition in psychopathic individuals.
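The psychophysiological interaction (PPI) analysis mentioned in several of these records regresses each target region's signal on the product of a seed region's timecourse and the task condition. A toy numpy sketch on simulated data; it omits HRF deconvolution and nuisance regressors, which real fMRI PPI pipelines require:

```python
import numpy as np

# Toy PPI regression: does coupling with a seed region (e.g., left
# amygdala) change between task conditions? All timecourses are simulated.

rng = np.random.default_rng(1)
n_scans = 120
seed = rng.normal(size=n_scans)              # physiological: seed timecourse
task = np.tile([1.0] * 10 + [-1.0] * 10, 6)  # psychological: faces vs. control
ppi = seed * task                            # interaction regressor

# Simulated target region whose coupling with the seed is task-dependent.
target = 0.8 * ppi + rng.normal(scale=0.5, size=n_scans)

X = np.column_stack([task, seed, ppi, np.ones(n_scans)])  # design matrix
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
print(beta[2])  # PPI coefficient: recovers roughly the simulated 0.8
```

A nonzero PPI coefficient is what "increased (or reduced) task-induced functional connectivity" refers to in these abstracts.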

  20. Social anhedonia is associated with neural abnormalities during face emotion processing.

    PubMed

    Germine, Laura T; Garrido, Lucia; Bruce, Lori; Hooker, Christine

    2011-10-01

    Human beings are social organisms with an intrinsic desire to seek and participate in social interactions. Social anhedonia is a personality trait characterized by a reduced desire for social affiliation and reduced pleasure derived from interpersonal interactions. Abnormally high levels of social anhedonia prospectively predict the development of schizophrenia and contribute to poorer outcomes for schizophrenia patients. Despite the strong association between social anhedonia and schizophrenia, the neural mechanisms that underlie individual differences in social anhedonia have not been studied and are thus poorly understood. Deficits in face emotion recognition are related to poorer social outcomes in schizophrenia, and it has been suggested that face emotion recognition deficits may be a behavioral marker for schizophrenia liability. In the current study, we used functional magnetic resonance imaging (fMRI) to see whether there are differences in the brain networks underlying basic face emotion processing in a community sample of individuals low vs. high in social anhedonia. We isolated the neural mechanisms related to face emotion processing by comparing face emotion discrimination with four other baseline conditions (identity discrimination of emotional faces, identity discrimination of neutral faces, object discrimination, and pattern discrimination). Results showed a group (high/low social anhedonia) × condition (emotion discrimination/control condition) interaction in the anterior portion of the rostral medial prefrontal cortex, right superior temporal gyrus, and left somatosensory cortex. As predicted, high (relative to low) social anhedonia participants showed less neural activity in face emotion processing regions during emotion discrimination as compared to each control condition. The findings suggest that social anhedonia is associated with abnormalities in networks responsible for basic processes associated with social cognition, and provide a

  21. The Development of Emotional Face Processing during Childhood

    ERIC Educational Resources Information Center

    Batty, Magali; Taylor, Margot J.

    2006-01-01

    Our facial expressions give others the opportunity to access our feelings, and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal…

  22. Distinct spatial frequency sensitivities for processing faces and emotional expressions.

    PubMed

    Vuilleumier, Patrik; Armony, Jorge L; Driver, Jon; Dolan, Raymond J

    2003-06-01

    High and low spatial frequency information in visual images is processed by distinct neural channels. Using event-related functional magnetic resonance imaging (fMRI) in humans, we show dissociable roles of such visual channels for processing faces and emotional fearful expressions. Neural responses in fusiform cortex, and effects of repeating the same face identity upon fusiform activity, were greater with intact or high-spatial-frequency face stimuli than with low-frequency faces, regardless of emotional expression. In contrast, amygdala responses to fearful expressions were greater for intact or low-frequency faces than for high-frequency faces. An activation of pulvinar and superior colliculus by fearful expressions occurred specifically with low-frequency faces, suggesting that these subcortical pathways may provide coarse fear-related inputs to the amygdala.
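The high- versus low-spatial-frequency face stimuli described here are commonly produced by Gaussian filtering in the Fourier domain. A minimal numpy sketch on a synthetic image; the cutoff value is an illustrative assumption, not a parameter from the study:

```python
import numpy as np

# Split an image into low- and high-spatial-frequency bands with a
# Gaussian low-pass filter applied in the Fourier domain -- the kind of
# manipulation used to make LSF/HSF face stimuli. The "face" is a toy patch.

def split_spatial_frequencies(img, sigma):
    """Return (low, high) bands; sigma is the frequency-domain cutoff
    in cycles/pixel. By construction, low + high == img."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    lowpass = np.exp(-(fx**2 + fy**2) / (2 * sigma**2))  # Gaussian gain
    low = np.real(np.fft.ifft2(np.fft.fft2(img) * lowpass))
    return low, img - low

img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                      # toy "face" region
low, high = split_spatial_frequencies(img, sigma=0.05)
print(np.allclose(low + high, img))          # True: the bands sum to the image
```

The `low` band keeps coarse luminance structure (the coarse input hypothesized to reach the amygdala via subcortical routes), while `high` keeps edges and fine detail.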

  3. Emotions in word and face processing: early and late cortical responses.

    PubMed

    Schacht, Annekathrin; Sommer, Werner

    2009-04-01

    Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence, differing mainly in the speed of meaning access, which is more direct and faster for facial expressions than for words.

  4. Emotion processing in chimeric faces: hemispheric asymmetries in expression and recognition of emotions.

    PubMed

    Indersmitten, Tim; Gur, Ruben C

    2003-05-01

    Since the discovery of facial asymmetries in emotional expressions of humans and other primates, hypotheses have related the greater left-hemiface intensity to right-hemispheric dominance in emotion processing. However, the difficulty of creating true frontal views of facial expressions in two-dimensional photographs has confounded efforts to better understand the phenomenon. We have recently described a method for obtaining three-dimensional photographs of posed and evoked emotional expressions and used these stimuli to investigate both intensity of expression and accuracy of recognizing emotion in chimeric faces constructed from only left- or right-side composites. The participant population included 38 (19 male, 19 female) African-American, Caucasian, and Asian adults. They were presented with chimeric composites generated from faces of eight actors and eight actresses showing four emotions: happiness, sadness, anger, and fear, each in posed and evoked conditions. We replicated the finding that emotions are expressed more intensely in the left hemiface for all emotions and conditions, with the exception of evoked anger, which was expressed more intensely in the right hemiface. In contrast, the results indicated that emotional expressions are recognized more efficiently in the right hemiface, indicating that the right hemiface expresses emotions more accurately. The double dissociation between the laterality of expression intensity and that of recognition efficiency supports the notion that the two kinds of processes may have distinct neural substrates. Evoked anger is uniquely expressed more intensely and accurately on the side of the face that projects to the viewer's right hemisphere, dominant in emotion recognition.

  5. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    PubMed

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
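The independence test referenced above follows Miller's (1982) race model inequality: under independent parallel processing, the redundant-target RT distribution can never exceed the sum of the single-target distributions at any time point. A minimal sketch (the function name and toy reaction times are illustrative, not the study's data):

```python
import numpy as np

def race_model_violation(rt_redundant, rt_single_a, rt_single_b, t_grid):
    """Test Miller's (1982) race model inequality:
    P(RT <= t | redundant) <= P(RT <= t | A) + P(RT <= t | B).
    Returns a boolean array over t_grid; True marks time points where
    the inequality is violated, i.e. evidence for coactivation."""
    def ecdf(sample, t):
        # empirical cumulative distribution of the RT sample at each t
        return np.mean(np.asarray(sample)[:, None] <= t, axis=0)
    g_red = ecdf(rt_redundant, t_grid)
    bound = np.minimum(ecdf(rt_single_a, t_grid) + ecdf(rt_single_b, t_grid), 1.0)
    return g_red > bound

# Toy reaction times (ms): redundant targets are answered faster than
# either single target, fast enough to exceed the race model bound.
t_grid = np.arange(200, 1000, 10)
violations = race_model_violation([320, 340, 360],
                                  [400, 450, 500],
                                  [410, 460, 520], t_grid)
```

A violation at any time point rules out the independent-race account, which is the sense in which the redundancy gains above indicate superadditive integration.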

  6. Behavioural and neurophysiological evidence for face identity and face emotion processing in animals

    PubMed Central

    Tate, Andrew J; Fischer, Hanno; Leigh, Andrea E; Kendrick, Keith M

    2006-01-01

    Visual cues from faces provide important social information relating to individual identity, sexual attraction and emotional state. Behavioural and neurophysiological studies on both monkeys and sheep have shown that specialized skills and neural systems for processing these complex cues to guide behaviour have evolved in a number of mammals and are not present exclusively in humans. Indeed, there are remarkable similarities in the ways that faces are processed by the brain in humans and other mammalian species. While human studies with brain imaging and gross neurophysiological recording approaches have revealed global aspects of the face-processing network, they cannot investigate how information is encoded by specific neural networks. Single neuron electrophysiological recording approaches in both monkeys and sheep have, however, provided some insights into the neural encoding principles involved and, particularly, the presence of a remarkable degree of high-level encoding even at the level of a specific face. Recent developments that allow simultaneous recordings to be made from many hundreds of individual neurons are also beginning to reveal evidence for global aspects of a population-based code. This review will summarize what we have learned so far from these animal-based studies about the way the mammalian brain processes the faces and the emotions they can communicate, as well as associated capacities such as how identity and emotion cues are dissociated and how face imagery might be generated. It will also try to highlight what questions and advances in knowledge still challenge us in order to provide a complete understanding of just how brain networks perform this complex and important social recognition task. PMID:17118930

  8. Configural and Featural Face Processing Influences on Emotion Recognition in Schizophrenia and Bipolar Disorder.

    PubMed

    Van Rheenen, Tamsyn E; Joshua, Nicole; Castle, David J; Rossell, Susan L

    2017-03-01

Emotion recognition impairments have been demonstrated in schizophrenia (Sz), but are less consistent and smaller in magnitude in bipolar disorder (BD). This may be related to the extent to which different face processing strategies are engaged during emotion recognition in each of these disorders. We recently showed that Sz patients had impairments in the use of both featural and configural face processing strategies, whereas BD patients were impaired only in the use of the latter. Here we examine the influence that these impairments have on facial emotion recognition in these cohorts. Twenty-eight individuals with Sz, 28 individuals with BD, and 28 healthy controls completed a facial emotion labeling task with two conditions designed to separate the use of featural and configural face processing strategies: part-based and whole-face emotion recognition. Sz patients performed worse than controls on both conditions, and worse than BD patients on the whole-face condition. BD patients performed worse than controls on the whole-face condition only. Configural processing deficits appear to influence the recognition of facial emotions in BD, whereas both configural and featural processing abnormalities impair emotion recognition in Sz. This may explain discrepancies in the profiles of emotion recognition between the disorders. (JINS, 2017, 23, 287-291).

  9. Development of Emotional Face Processing in Premature and Full-Term Infants.

    PubMed

    Carbajal-Valenzuela, Cintli Carolina; Santiago-Rodríguez, Efraín; Quirarte, Gina L; Harmony, Thalía

    2017-03-01

The rate of premature births has increased in the past 2 decades. Ten percent of premature birth survivors develop motor impairment, but almost half exhibit later sensory, cognitive, and emotional disabilities attributed to white matter injury and decreased volume of neuronal structures. The aim of this study was to test the hypothesis that premature and full-term infants differ in their development of emotional face processing. A comparative longitudinal study was conducted in premature and full-term infants at 4 and 8 months of age. The absolute power of the electroencephalogram was analyzed in both groups during 5 conditions of an emotional face processing task: positive, negative, neutral faces, non-face, and rest. Differences between the conditions of the task at 4 months were limited to rest versus non-rest comparisons in both groups. Eight-month-old term infants had increases (P ≤ .05) in absolute power in the left occipital region at the frequency of 10.1 Hz and in the right occipital region at 3.5, 12.8, and 16.0 Hz when shown a positive face in comparison with a neutral face. They also showed increases in absolute power in the left occipital region at 1.9 Hz and in the right occipital region at 2.3 and 3.5 Hz with positive compared to non-face stimuli. In contrast, positive, negative, and neutral faces elicited the same responses in premature infants. In conclusion, our study provides electrophysiological evidence that emotional face processing develops differently in premature than in full-term infants, suggesting that premature birth alters mechanisms of brain development, such as the myelination process, and consequently affects complex cognitive functions.

  10. Asymmetric Engagement of Amygdala and Its Gamma Connectivity in Early Emotional Face Processing

    PubMed Central

    Liu, Tai-Ying; Chen, Yong-Sheng; Hsieh, Jen-Chuen; Chen, Li-Fen

    2015-01-01

The amygdala has been regarded as a key substrate for emotion processing. However, the engagement of the left and right amygdala during the early perceptual processing of different emotional faces remains unclear. We investigated the temporal profiles of oscillatory gamma activity in the amygdala and effective connectivity of the amygdala with the thalamus and cortical areas during implicit emotion-perceptual tasks using event-related magnetoencephalography (MEG). We found that within 100 ms after stimulus onset the right amygdala habituated to emotional faces rapidly (with duration around 20–30 ms), whereas activity in the left amygdala (with duration around 50–60 ms) sustained longer than that in the right. Our data suggest that the right amygdala could be linked to autonomic arousal generated by facial emotions and the left amygdala might be involved in decoding or evaluating expressive faces in early perceptual emotion processing. The results of effective connectivity provide evidence that only negative emotional processing engages both cortical and subcortical pathways connected to the right amygdala, representing its evolutionary significance (survival). These findings demonstrate the asymmetric engagement of the bilateral amygdala in emotional face processing as well as the capability of MEG for assessing thalamo-cortico-limbic circuitry. PMID:25629899

  12. Face-to-face: Perceived personal relevance amplifies face processing.

    PubMed

    Bublatzky, Florian; Pittig, Andre; Schupp, Harald T; Alpers, Georg W

    2017-05-01

The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. In order to model a group situation, two faces displaying happy, neutral or angry expressions were presented. Importantly, faces were either facing the observer, or they were presented in profile view directed towards, or looking away from, each other. In Experiment 1 (n = 64), face pairs were rated regarding perceived relevance, wish-to-interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral) and face orientation (facing observer > towards > away), and interactions showed that the evaluation of emotional faces varies strongly with their orientation. Experiment 2 (n = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer, conveyed by facial expression and face direction, amplifies emotional face processing within triadic group situations.

  13. Testing the effects of expression, intensity and age on emotional face processing in ASD.

    PubMed

    Luyster, Rhiannon J; Bick, Johanna; Westerlund, Alissa; Nelson, Charles A

    2017-06-21

Individuals with autism spectrum disorder (ASD) commonly show global deficits in the processing of facial emotion, including impairments in emotion recognition and slowed processing of emotional faces. Growing evidence has suggested that these challenges may increase with age, perhaps due to minimal improvement with age in individuals with ASD. In the present study, we explored the role of age, emotion type and emotion intensity in face processing for individuals with and without ASD. Twelve-year-olds and 18- to 22-year-olds with and without ASD participated. No significant diagnostic group differences were observed on behavioral measures of emotion processing for younger versus older individuals with and without ASD. However, there were significant group differences in neural responses to emotional faces. Relative to typically developing (TD) individuals, at 12 years of age and during adulthood, individuals with ASD showed a slower N170 to emotional faces. While the TD groups' P1 latency was significantly shorter in adults when compared to 12-year-olds, there was no significant age-related difference in P1 latency among individuals with ASD. Findings point to potential differences in the maturation of cortical networks that support visual processing (whether of faces or stimuli more broadly) among individuals with and without ASD between late childhood and adulthood. Finally, associations between ERP amplitudes and behavioral responses on emotion processing tasks suggest possible neural markers for emotional and behavioral deficits among individuals with ASD.

  14. The electrophysiological effects of the serotonin 1A receptor agonist buspirone in emotional face processing.

    PubMed

    Bernasconi, Fosco; Kometer, Michael; Pokorny, Thomas; Seifritz, Erich; Vollenweider, Franz X

    2015-04-01

Emotional face processing is critically modulated by the serotonergic system, and serotonin (5-HT) receptor agonists impair emotional face processing. However, the specific contribution of the 5-HT1A receptor remains poorly understood. Here we investigated the spatiotemporal brain mechanisms underpinning the modulation of emotional face processing induced by buspirone, a partial 5-HT1A receptor agonist. In a psychophysical emotional face discrimination task, we observed that discrimination of fearful versus neutral faces was reduced, but not of happy versus neutral faces. Electrical neuroimaging analyses were applied to visual evoked potentials elicited by emotional face images after placebo and buspirone administration. Buspirone modulated response strength (i.e., global field power) in the interval 230-248 ms after stimulus onset. Distributed source estimation over this time interval revealed that buspirone decreased the neural activity in the right dorsolateral prefrontal cortex that was evoked by fearful faces. These results indicate temporal and valence-specific effects of buspirone on the neuronal correlates of emotional face processing. Furthermore, the reduced neural activity in the dorsolateral prefrontal cortex in response to fearful faces suggests reduced attention to fearful faces. Collectively, these findings provide new insights into the role of 5-HT1A receptors in emotional face processing and have implications for affective disorders that are characterized by increased attention to negative stimuli.

  15. Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights

    PubMed Central

    Liu, Taosheng; Pinheiro, Ana; Zhao, Zhongxin; Nestor, Paul G.; McCarley, Robert W.; Niznikiewicz, Margaret A.

    2012-01-01

Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, within the context of a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) post stimulus onset, in spite of implicit affective processing task demands, and that such an effect is mainly distributed in the frontal-central region. PMID:22383987

  16. Reduced beta connectivity during emotional face processing in adolescents with autism.

    PubMed

    Leung, Rachel C; Ye, Annette X; Wong, Simeon M; Taylor, Margot J; Doesburg, Sam M

    2014-01-01

Autism spectrum disorder (ASD) is a neurodevelopmental disorder characterized by impairments in social cognition. The biological basis of deficits in social cognition in ASD, and their difficulty in processing emotional face information in particular, remains unclear. Atypical communication within and between brain regions has been reported in ASD. Interregional phase-locking is a neurophysiological mechanism mediating communication among brain areas and is understood to support cognitive functions. In the present study we investigated interregional magnetoencephalographic phase synchronization during the perception of emotional faces in adolescents with ASD. A total of 22 adolescents with ASD (18 males, mean age = 14.2 ± 1.15 years, 22 right-handed) with mild to no cognitive delay and 17 healthy controls (14 males, mean age = 14.4 ± 0.33 years, 16 right-handed) performed an implicit emotional processing task requiring perception of happy, angry and neutral faces while we recorded neuromagnetic signals. The faces were presented rapidly (80 ms duration) to the left or right of a central fixation cross and participants responded to a scrambled pattern that was presented concurrently on the opposite side of the fixation point. Task-dependent interregional phase-locking was calculated among source-resolved brain regions. Task-dependent increases in interregional beta synchronization were observed. Beta-band interregional phase-locking in adolescents with ASD was reduced, relative to controls, during the perception of angry faces in a distributed network involving the right fusiform gyrus and insula. No significant group differences were found for happy or neutral faces, or other analyzed frequency ranges. Significant reductions in task-dependent beta connectivity strength, clustering and eigenvector centrality (all P < 0.001) in the right insula were found in adolescents with ASD, relative to controls. Reduced beta synchronization may reflect inadequate…
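Interregional phase-locking of this kind is commonly quantified with the phase-locking value (PLV): the magnitude of the trial-averaged phase-difference vector between two regions. A generic sketch, not the authors' MEG pipeline (function name and toy data are illustrative):

```python
import numpy as np

def phase_locking_value(phase_a, phase_b):
    """PLV between two regions' instantaneous phases, given as arrays of
    shape (n_trials, n_times) in radians. PLV = |mean over trials of
    exp(i * phase difference)|; 1 = perfect locking, 0 = none."""
    diff = np.exp(1j * (np.asarray(phase_a) - np.asarray(phase_b)))
    return np.abs(diff.mean(axis=0))

# Toy beta-band (~20 Hz) phases: region B lags region A by a constant
# quarter cycle on every trial, so the phase relation is perfectly locked.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
pa = np.array([2 * np.pi * 20 * t + rng.uniform(0, 2 * np.pi)
               for _ in range(50)])          # random phase offset per trial
pb_locked = pa - np.pi / 2                   # consistent lag -> PLV near 1
pb_random = rng.uniform(0, 2 * np.pi, pa.shape)  # no consistent relation
plv_locked = phase_locking_value(pa, pb_locked)
plv_random = phase_locking_value(pa, pb_random)
```

In practice the phases would come from bandpass-filtered, source-resolved MEG signals, and PLV would be compared between groups and conditions.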

  17. Effortful versus automatic emotional processing in schizophrenia: Insights from a face-vignette task.

    PubMed

    Patrick, Regan E; Rastogi, Anuj; Christensen, Bruce K

    2015-01-01

Adaptive emotional responding relies on dual automatic and effortful processing streams. Dual-stream models of schizophrenia (SCZ) posit a selective deficit in neural circuits that govern goal-directed, effortful processes versus reactive, automatic processes. This imbalance suggests that when patients are confronted with competing automatic and effortful emotional response cues, they will exhibit diminished effortful responding and intact, possibly elevated, automatic responding compared to controls. This prediction was evaluated using a modified version of the face-vignette task (FVT). Participants viewed emotional faces (automatic response cue) paired with vignettes (effortful response cue) that signalled a different emotion category and were instructed to discriminate the manifest emotion. Patients made fewer vignette responses and more face responses than controls. However, the relationship between group and FVT responding was moderated by IQ and reading comprehension ability. These results replicate and extend previous research and provide tentative support for the abnormal conflict resolution between automatic and effortful emotional processing predicted by dual-stream models of SCZ.

  18. Positive and negative emotion enhances the processing of famous faces in a semantic judgment task.

    PubMed

    Bate, Sarah; Haslam, Catherine; Hodgson, Timothy L; Jansari, Ashok; Gregory, Nicola; Kay, Janice

    2010-01-01

Previous work has consistently reported a facilitatory influence of positive emotion in face recognition (e.g., D'Argembeau, Van der Linden, Comblain, & Etienne, 2003). However, these reports asked participants to make recognition judgments in response to faces, and it is unknown whether emotional valence may influence other stages of processing, such as at the level of semantics. Furthermore, other evidence suggests that negative rather than positive emotion facilitates higher-level judgments when processing nonfacial stimuli (e.g., Mickley & Kensinger, 2008), and it is possible that negative emotion also influences later stages of face processing. The present study addressed this issue, examining the influence of emotional valence while participants made semantic judgments in response to a set of famous faces. Eye movements were monitored while participants performed this task, and analyses revealed a reduction in information extraction for the faces of liked and disliked celebrities compared with those of emotionally neutral celebrities. Thus, in contrast to work using familiarity judgments, both positive and negative emotion facilitated processing in this semantic-based task. This pattern of findings is discussed in relation to current models of face processing.

  19. Selective attention modulates early human evoked potentials during emotional face-voice processing.

    PubMed

    Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A

    2015-04-01

Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective: one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.

  20. Age-related differences in event-related potentials for early visual processing of emotional faces.

    PubMed

    Hilimire, Matthew R; Mienaltowski, Andrew; Blanchard-Fields, Fredda; Corballis, Paul M

    2014-07-01

With advancing age, processing resources are shifted away from negative emotional stimuli and toward positive ones. Here, we explored this 'positivity effect' using event-related potentials (ERPs). Participants identified the presence or absence of a visual probe that appeared over photographs of emotional faces. The ERPs elicited by the onsets of angry, sad, happy and neutral faces were recorded. We examined the frontocentral emotional positivity (FcEP), which is defined as a positive deflection in the waveforms elicited by emotional expressions relative to neutral faces early on in the time course of the ERP. The FcEP is thought to reflect enhanced early processing of emotional expressions. The results show that within the first 130 ms young adults show an FcEP to negative emotional expressions, whereas older adults show an FcEP to positive emotional expressions. These findings provide additional evidence that the age-related positivity effect in emotion processing can be traced to automatic processes that are evident very early in the processing of emotional facial expressions.
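A component measure like the FcEP reduces to averaging an emotional-minus-neutral difference wave within an early latency window. A schematic Python illustration with made-up waveforms (the window, sampling rate, and waveform shapes are hypothetical, not the study's data):

```python
import numpy as np

def mean_amplitude(erp, times, window):
    """Mean ERP amplitude (µV) within a latency window (seconds)."""
    lo, hi = window
    mask = (times >= lo) & (times <= hi)
    return erp[..., mask].mean(axis=-1)

# Hypothetical grand-average waveforms (µV), 1 kHz sampling, -0.1 to 0.4 s.
times = np.arange(-0.1, 0.4, 0.001)
emotional = np.where((times > 0.05) & (times < 0.13), 2.0, 0.0)
neutral = np.zeros_like(times)

# FcEP-style score: emotional minus neutral, averaged over the first 130 ms.
fcep = mean_amplitude(emotional - neutral, times, (0.0, 0.13))
```

A positive score indicates the emotional waveform is more positive than the neutral one in that window, which is what the FcEP indexes.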

  1. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential

    PubMed Central

    Bublatzky, Florian; Gerdes, Antje B. M.; White, Andrew J.; Riemer, Martin; Alpers, Georg W.

    2014-01-01

Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERP) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental tasks. Whereas task effects were observed for P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and the relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing at late stages of visual processing. These findings are discussed within the framework of motivated attention and face processing theories. PMID:25076881

  2. Age-Related Changes in Amygdala-Frontal Connectivity during Emotional Face Processing from Childhood into Young Adulthood

    PubMed Central

    Wu, Minjie; Kujawa, Autumn; Lu, Lisa H.; Fitzgerald, Daniel A.; Klumpp, Heide; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan

    2016-01-01

    The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful and happy faces) in 61 healthy subjects aged 7–25 years. We found age-related decreases in ventral medial prefrontal cortex (vmPFC) activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. PMID:26931629

  3. Passive and Motivated Perception of Emotional Faces: Qualitative and Quantitative Changes in the Face Processing Network

    PubMed Central

    Skelly, Laurie R.; Decety, Jean

    2012-01-01

    Emotionally expressive faces are processed by a distributed network of interacting sub-cortical and cortical brain regions. The components of this network have been identified and described in large part by the stimulus properties to which they are sensitive, but as face processing research matures, interest has broadened to also probe dynamic interactions between these regions and top-down influences such as task demand and context. While some research has tested the robustness of affective face processing by restricting available attentional resources, it is not known whether face network processing can be augmented by increased motivation to attend to affective face stimuli. Short videos of people expressing emotions were presented to healthy participants during functional magnetic resonance imaging. Motivation to attend to the videos was manipulated by providing an incentive for improved recall performance. During the motivated condition, there was greater coherence among nodes of the face processing network, more widespread correlation between signal intensity and performance, and selective signal increases in a task-relevant subset of face processing regions, including the posterior superior temporal sulcus and right amygdala. In addition, an unexpected task-related laterality effect was seen in the amygdala. These findings provide strong evidence that motivation augments co-activity among nodes of the face processing network and the impact of neural activity on performance. These within-subject effects highlight the necessity to consider motivation when interpreting neural function in special populations, and to further explore the effect of task demands on face processing in healthy brains. PMID:22768287

  4. Emotion Perception or Social Cognitive Complexity: What Drives Face Processing Deficits in Autism Spectrum Disorder?

    ERIC Educational Resources Information Center

    Walsh, Jennifer A.; Creighton, Sarah E.; Rutherford, M. D.

    2016-01-01

    Some, but not all, relevant studies have revealed face processing deficits among those with autism spectrum disorder (ASD). In particular, deficits are revealed in face processing tasks that involve emotion perception. The current study examined whether either deficits in processing emotional expression or deficits in processing social cognitive…

  5. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    PubMed Central

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2014-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but in the second task face-sensitive ERPs varied with emotional expression only in the TD group. Further, the ASD group showed enhanced neural responding to non-social stimuli. In the TD group only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in the ASD group, a significant positive association was found between autonomic activity and attention to mouths. Overall, the ASD group showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525

  6. Lateralized hybrid faces: evidence of a valence-specific bias in the processing of implicit emotions.

    PubMed

    Prete, Giulia; Laeng, Bruno; Tommasi, Luca

    2014-01-01

    It is well known that hemispheric asymmetries exist for both the analyses of low-level visual information (such as spatial frequency) and high-level visual information (such as emotional expressions). In this study, we assessed which of the above factors underlies perceptual laterality effects with "hybrid faces": a type of stimulus that allows testing for unaware processing of emotional expressions, when the emotion is displayed in the low-frequency information while an image of the same face with a neutral expression is superimposed to it. Despite hybrid faces being perceived as neutral, the emotional information modulates observers' social judgements. In the present study, participants were asked to assess friendliness of hybrid faces displayed tachistoscopically, either centrally or laterally to fixation. We found a clear influence of the hidden emotions also with lateral presentations. Happy faces were rated as more friendly and angry faces as less friendly with respect to neutral faces. In general, hybrid faces were evaluated as less friendly when they were presented in the left visual field/right hemisphere than in the right visual field/left hemisphere. The results extend the validity of the valence hypothesis in the specific domain of unaware (subcortical) emotion processing.

  7. Age-related changes in amygdala-frontal connectivity during emotional face processing from childhood into young adulthood.

    PubMed

    Wu, Minjie; Kujawa, Autumn; Lu, Lisa H; Fitzgerald, Daniel A; Klumpp, Heide; Fitzgerald, Kate D; Monk, Christopher S; Phan, K Luan

    2016-05-01

    The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful, and happy faces) in 61 healthy subjects aged 7-25 years. We found age-related decreases in ventral medial prefrontal cortex activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. Hum Brain Mapp 37:1684-1695, 2016. © 2016 Wiley Periodicals, Inc.

  8. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    PubMed

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Human sex differences in emotional processing of own-race and other-race faces.

    PubMed

    Ran, Guangming; Chen, Xu; Pan, Yangu

    2014-06-18

    There is evidence that women and men show differences in the perception of affective facial expressions. However, none of the previous studies directly investigated sex differences in emotional processing of own-race and other-race faces. The current study addressed this issue using high time resolution event-related potential techniques. In total, data from 25 participants (13 women and 12 men) were analyzed. It was found that women showed increased N170 amplitudes to negative White faces compared with negative Chinese faces over the right hemisphere electrodes. This result suggests that women show enhanced sensitivity to other-race faces showing negative emotions (fear or disgust), which may contribute toward evolution. However, the current data showed that men had increased N170 amplitudes to happy Chinese versus happy White faces over the left hemisphere electrodes, indicating that men show enhanced sensitivity to own-race faces showing positive emotions (happiness). In this respect, men might use past pleasant emotional experiences to boost recognition of own-race faces.

  10. Prolonged Interruption of Cognitive Control of Conflict Processing Over Human Faces by Task-Irrelevant Emotion Expression

    PubMed Central

    Kim, Jinyoung; Kang, Min-Suk; Cho, Yang Seok; Lee, Sang-Hun

    2017-01-01

    As documented by Darwin 150 years ago, emotion expressed in human faces readily draws our attention and promotes sympathetic emotional reactions. How do such reactions to the expression of emotion affect our goal-directed actions? Despite substantial advances in understanding the neural mechanisms of both cognitive control and emotional processing, it is not yet well known how these two systems interact. Here, we studied how emotion expressed in human faces influences cognitive control of conflict processing, spatial selective attention and inhibitory control in particular, using the Eriksen flanker paradigm. In this task, participants viewed displays of a central target face flanked by peripheral faces and were asked to judge the gender of the target face; task-irrelevant emotion expressions were embedded in the target face, the flanking faces, or both. We also monitored how emotion expression affects gender judgment performance while varying the relative timing between the target and flanker faces. As previously reported, we found robust gender congruency effects, namely slower responses to the target faces whose gender was incongruent with that of the flanker faces, when the flankers preceded the target by 0.1 s. When the flankers preceded the target by 0.3 s, however, the congruency effect vanished in most of the viewing conditions, except for when emotion was expressed only in the flanking faces or when congruent emotion was expressed in the target and flanking faces. These results suggest that emotional saliency can prolong a substantial degree of conflict by diverting bottom-up attention away from the target, and that inhibitory control on task-irrelevant information from flanking stimuli is deterred by the emotional congruency between target and flanking stimuli. PMID:28676780
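
    The congruency effect described in this record is simply the reaction-time cost of incongruent relative to congruent flankers. A minimal sketch of that computation, using invented reaction times rather than the study's data:

```python
import statistics

# Illustrative sketch only: the flanker congruency effect is the mean RT
# on incongruent trials minus the mean RT on congruent trials.
# These reaction times (in seconds) are hypothetical.
rt_congruent = [0.52, 0.55, 0.50, 0.53]
rt_incongruent = [0.60, 0.63, 0.58, 0.61]

congruency_effect = (statistics.mean(rt_incongruent)
                     - statistics.mean(rt_congruent))
print(f"{congruency_effect * 1000:.0f} ms")  # ~80 ms cost
```

    A vanished congruency effect, as reported for the 0.3 s flanker lead, corresponds to this difference approaching zero.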

  11. Interdependent Mechanisms for Processing Gender and Emotion: The Special Status of Angry Male Faces

    PubMed Central

    Harris, Daniel A.; Ciaramitaro, Vivian M.

    2016-01-01

    While some models of how various attributes of a face are processed have posited that face features, invariant physical cues such as gender or ethnicity as well as variant social cues such as emotion, may be processed independently (e.g., Bruce and Young, 1986), other models suggest a more distributed representation and interdependent processing (e.g., Haxby et al., 2000). Here, we use a contingent adaptation paradigm to investigate if mechanisms for processing the gender and emotion of a face are interdependent and symmetric across the happy–angry emotional continuum and regardless of the gender of the face. We simultaneously adapted participants to angry female faces and happy male faces (Experiment 1) or to happy female faces and angry male faces (Experiment 2). In Experiment 1, we found evidence for contingent adaptation, with simultaneous aftereffects in opposite directions: male faces were biased toward angry while female faces were biased toward happy. Interestingly, in the complementary Experiment 2, we did not find evidence for contingent adaptation, with both male and female faces biased toward angry. Our results highlight that evidence for contingent adaptation and the underlying interdependent face processing mechanisms that would allow for contingent adaptation may only be evident for certain combinations of face features. Such limits may be especially important in the case of social cues given how maladaptive it may be to stop responding to threatening information, with male angry faces considered to be the most threatening. The underlying neuronal mechanisms that could account for such asymmetric effects in contingent adaptation remain to be elucidated. PMID:27471482

  12. Neural correlates of emotional face processing in bipolar disorder: an event-related potential study.

    PubMed

    Degabriele, Racheal; Lagopoulos, Jim; Malhi, Gin

    2011-09-01

    Behavioural and imaging studies report that individuals with bipolar disorder (BD) exhibit impairments in emotional face processing. However, few studies have examined the temporal characteristics of these impairments, and event-related potential (ERP) studies of emotion perception in BD are rare. The aim of our study was to explore these processes as indexed by the face-specific P100 and N170 ERP components in a BD cohort. Eighteen subjects diagnosed with BD and 18 age- and sex-matched healthy volunteers completed an emotional go/no-go inhibition task during electroencephalogram (EEG) and ERP acquisition. Patients demonstrated faster responses to happy compared to sad faces, whereas control data revealed no emotional discrimination. Errors of omission were more frequent in the BD group in both emotion conditions, but there were no between-group differences in commission errors. Significant differences were found between groups in P100 amplitude variation across levels of affect, with the BD group exhibiting greater responses to happy compared to sad faces. Conversely, the control cohort failed to demonstrate a differentiation between emotions. A statistically significant between-group effect was also found for N170 amplitudes, indicating reduced responses in the BD group. Future studies should ideally recruit BD patients across all three mood states (manic, depressive, and euthymic) with greater scrutiny of the effects of psychotropic medication. These ERP results primarily suggest an emotion-sensitive face processing impairment in BD whereby patients are initially more attuned to positive emotions as indicated by the P100 ERP component, and this may contribute to the emergence of bipolar-like symptoms. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.

    PubMed

    Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha

    2015-04-01

    According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  14. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    ERIC Educational Resources Information Center

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  15. Visual Search for Faces with Emotional Expressions

    ERIC Educational Resources Information Center

    Frischen, Alexandra; Eastwood, John D.; Smilek, Daniel

    2008-01-01

    The goal of this review is to critically examine contradictory findings in the study of visual search for emotionally expressive faces. Several key issues are addressed: Can emotional faces be processed preattentively and guide attention? What properties of these faces influence search efficiency? Is search moderated by the emotional state of the…

  16. The effects of familiarity and emotional expression on face processing examined by ERPs in patients with schizophrenia.

    PubMed

    Caharel, Stéphanie; Bernard, Christian; Thibaut, Florence; Haouzir, Sadec; Di Maggio-Clozel, Carole; Allio, Gabrielle; Fouldrin, Gaël; Petit, Michel; Lalonde, Robert; Rebaï, Mohamed

    2007-09-01

    The main objective of the study was to determine whether patients with schizophrenia are deficient relative to controls in the processing of faces at different levels of familiarity and types of emotion and the stage where such differences may occur. ERPs based on 18 patients with schizophrenia and 18 controls were compared in a face identification task at three levels of familiarity (unknown, familiar, subject's own) and for three types of emotion (disgust, smiling, neutral). The schizophrenic group was less accurate than controls in face processing, especially for unknown faces and those expressing negative emotions such as disgust. P1 and N170 amplitudes were lower, and P1, N170, and P250 components were of slower onset, in patients with schizophrenia. N170 and P250 amplitudes were modulated by familiarity and face expression in a different manner in patients than in controls. Schizophrenia is associated with a generalized defect of face processing, both in terms of familiarity and emotional expression, attributable to deficient processing at sensory (P1) and perceptual (N170) stages. These patients appear to have difficulty in encoding the structure of a face and thereby do not evaluate correctly familiarity and emotion.

  17. Emotional face processing in pediatric bipolar disorder: evidence for functional impairments in the fusiform gyrus.

    PubMed

    Perlman, Susan B; Fournier, Jay C; Bebko, Genna; Bertocci, Michele A; Hinze, Amanda K; Bonar, Lisa; Almeida, Jorge R C; Versace, Amelia; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Demeter, Christine; Diwadkar, Vaibhav A; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Horwitz, Sarah M; Arnold, L Eugene; Fristad, Mary A; Youngstrom, Eric A; Findling, Robert L; Phillips, Mary L

    2013-12-01

    Pediatric bipolar disorder involves poor social functioning, but the neural mechanisms underlying these deficits are not well understood. Previous neuroimaging studies have found deficits in emotional face processing localized to emotional brain regions. However, few studies have examined dysfunction in other regions of the face processing circuit. This study assessed hypoactivation in key face processing regions of the brain in pediatric bipolar disorder. Youth with a bipolar spectrum diagnosis (n = 20) were matched to a nonbipolar clinical group (n = 20), with similar demographics and comorbid diagnoses, and a healthy control group (n = 20). Youth underwent functional magnetic resonance imaging (fMRI) scanning, which employed a task-irrelevant emotion processing design in which processing of facial emotions was not germane to task performance. Hypoactivation, isolated to the fusiform gyrus, was found when viewing animated, emerging facial expressions of happiness, sadness, fearfulness, and especially anger in pediatric bipolar participants relative to matched clinical and healthy control groups. The results of the study imply that differences exist in visual regions of the brain's face processing system and are not solely isolated to emotional brain regions such as the amygdala. Findings are discussed in relation to facial emotion recognition and fusiform gyrus deficits previously reported in the autism literature. Behavioral interventions targeting attention to facial stimuli might be explored as possible treatments for bipolar disorder in youth. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  18. The Development of Emotional Face and Eye Gaze Processing

    ERIC Educational Resources Information Center

    Hoehl, Stefanie; Striano, Tricia

    2010-01-01

    Recent research has demonstrated that infants' attention towards novel objects is affected by an adult's emotional expression and eye gaze toward the object. The current event-related potential (ERP) study investigated how infants at 3, 6, and 9 months of age process fearful compared to neutral faces looking toward objects or averting gaze away…

  19. Different underlying mechanisms for face emotion and gender processing during feature-selective attention: Evidence from event-related potential studies.

    PubMed

    Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei

    2017-05-01

    Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as indexed by the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Age-related changes in emotional face processing across childhood and into young adulthood: evidence from event-related potentials

    PubMed Central

    MacNamara, Annmarie; Vergés, Alvaro; Kujawa, Autumn; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan

    2016-01-01

    Socio-emotional processing is an essential part of development, and age-related changes in its neural correlates can be observed. The late positive potential (LPP) is a measure of motivated attention that can be used to assess emotional processing; however, changes in the LPP elicited by emotional faces have not been assessed across a wide age range in childhood and young adulthood. We used an emotional face matching task to examine behavior and event-related potentials (ERPs) in 33 youth aged 7 to 19 years old. Younger children were slower when performing the matching task. The LPP elicited by emotional faces but not control stimuli (geometric shapes) decreased with age; by contrast, an earlier ERP (the P1) decreased with age for both faces and shapes, suggesting increased efficiency of early visual processing. Results indicate age-related attenuation in emotional processing that may stem from increased efficiency and regulatory control when performing a socio-emotional task. PMID:26220144
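
    Several of the records above quantify ERP components (P1, LPP, FcEP) as the mean amplitude within a post-stimulus time window, with emotion effects taken as a difference between conditions. A minimal sketch of that measurement on synthetic single-channel data (the sampling rate, window, and waveforms are all invented for illustration):

```python
import numpy as np

# Hedged illustration of common ERP quantification: mean amplitude in a
# time window, with the emotion effect as a difference wave
# (emotional minus neutral). All data here are synthetic.
fs = 500                              # assumed sampling rate in Hz
t = np.arange(-0.2, 0.8, 1 / fs)      # epoch from -200 ms to 800 ms

# Synthetic trial-averaged waveforms (microvolts) at one electrode:
neutral = np.zeros_like(t)
emotional = np.where((t >= 0.4) & (t <= 0.7), 3.0, 0.0)  # sustained positivity

def mean_amplitude(wave, t, start, end):
    """Mean amplitude of `wave` within [start, end] seconds."""
    window = (t >= start) & (t <= end)
    return float(wave[window].mean())

# Example LPP-style window (400-700 ms); the effect is the difference.
lpp_effect = (mean_amplitude(emotional, t, 0.4, 0.7)
              - mean_amplitude(neutral, t, 0.4, 0.7))
print(round(lpp_effect, 2))  # 3.0
```

    Age-related attenuation of the LPP, as reported in this record, would appear as this difference shrinking with age.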

  1. Method for Face-Emotion Retrieval Using A Cartoon Emotional Expression Approach

    NASA Astrophysics Data System (ADS)

    Kostov, Vlaho; Yanagisawa, Hideyoshi; Johansson, Martin; Fukuda, Shuichi

    A simple method for extracting emotion from a human face, as a form of non-verbal communication, was developed to cope with and optimize mobile communication in a globalized and diversified society. A cartoon face based model was developed and used to evaluate emotional content of real faces. After a pilot survey, basic rules were defined and student subjects were asked to express emotion using the cartoon face. Their face samples were then analyzed using principal component analysis and the Mahalanobis distance method. Feature parameters considered to be related to emotions were extracted and new cartoon faces (based on these parameters) were generated. The subjects evaluated emotion of these cartoon faces again and we confirmed these parameters were suitable. To confirm how these parameters could be applied to real faces, we asked subjects to express the same emotions, which were then captured electronically. Simple image processing techniques were also developed to extract these features from real faces, which we then compared with the cartoon face parameters. We demonstrated via the cartoon face that emotions can be expressed with very small amounts of information, and that real and cartoon faces correspond to each other. It was also shown that emotion could be extracted from still and dynamic real-face images using these cartoon-based features.
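
    The classification step named in this record, scoring a face's feature parameters against per-emotion distributions via Mahalanobis distance, can be sketched as follows. The feature vectors, class means, and two-class setup are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Illustrative sketch (not the paper's code): classify a face's emotion
# by its Mahalanobis distance to each emotion class, assuming low-dimensional
# feature vectors (e.g., PCA scores of cartoon-face parameters) are given.
rng = np.random.default_rng(0)

# Hypothetical training features for two emotion classes:
happy = rng.normal(loc=[2.0, 1.0], scale=0.5, size=(50, 2))
angry = rng.normal(loc=[-2.0, -1.0], scale=0.5, size=(50, 2))

def mahalanobis(x, samples):
    """Mahalanobis distance from x to the distribution of `samples`."""
    mu = samples.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

probe = np.array([1.8, 0.9])  # a new face's (hypothetical) feature vector
dists = {"happy": mahalanobis(probe, happy),
         "angry": mahalanobis(probe, angry)}
label = min(dists, key=dists.get)  # nearest class in Mahalanobis terms
print(label)
```

    Unlike plain Euclidean distance, the Mahalanobis distance accounts for the spread and correlation of each class's features, which is why it suits feature scores derived from principal component analysis.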

  2. Event-Related Potentials of Bottom-Up and Top-Down Processing of Emotional Faces

    PubMed Central

    Moradi, Afsane; Mehrinejad, Seyed Abolghasem; Ghadiri, Mohammad; Rezaei, Farzin

    2017-01-01

    Introduction: Emotional stimuli are processed automatically in a bottom-up way or voluntarily in a top-down way. Imaging studies have indicated that bottom-up and top-down processing are mediated through different neural systems. However, temporal differentiation of top-down versus bottom-up processing of facial emotional expressions remains to be clarified. The present study aimed to explore the time course of these processes as indexed by the emotion-specific P100 and late positive potential (LPP) event-related potential (ERP) components in a group of healthy women. Methods: Fourteen female students of Alzahra University, Tehran, Iran aged 18–30 years, voluntarily participated in the study. The subjects completed 2 overt and covert emotional tasks during ERP acquisition. Results: The results indicated that fearful expressions significantly produced greater P100 amplitude compared to other expressions. Moreover, the P100 findings showed an interaction between emotion and processing conditions. Further analysis indicated that within the overt condition, fearful expressions elicited more P100 amplitude compared to other emotional expressions. Also, overt conditions created significantly more LPP latencies and amplitudes compared to covert conditions. Conclusion: Based on the results, early perceptual processing of fearful face expressions is enhanced in a top-down compared to a bottom-up way. It also suggests that P100 may reflect an attentional bias toward fearful emotions. However, no such differentiation was observed within later processing stages of face expressions, as indexed by the ERP LPP component. Overall, this study provides a basis for further exploring of bottom-up and top-down processes underlying emotion and may be particularly helpful for investigating the temporal characteristics associated with impaired emotional processing in psychiatric disorders. PMID:28446947

  3. Altered neural processing of emotional faces in remitted Cushing's disease.

    PubMed

    Bas-Hoogendam, Janna Marie; Andela, Cornelie D; van der Werff, Steven J A; Pannekoek, J Nienke; van Steenbergen, Henk; Meijer, Onno C; van Buchem, Mark A; Rombouts, Serge A R B; van der Mast, Roos C; Biermasz, Nienke R; van der Wee, Nic J A; Pereira, Alberto M

    2015-09-01

    Patients with long-term remission of Cushing's disease (CD) demonstrate residual psychological complaints. At present, it is not known how previous exposure to hypercortisolism affects psychological functioning in the long-term. Earlier magnetic resonance imaging (MRI) studies demonstrated abnormalities of brain structure and resting-state connectivity in patients with long-term remission of CD, but no data are available on functional alterations in the brain during the performance of emotional or cognitive tasks in these patients. We performed a cross-sectional functional MRI study, investigating brain activation during emotion processing in patients with long-term remission of CD. Processing of emotional faces versus a non-emotional control condition was examined in 21 patients and 21 matched healthy controls. Analyses focused on activation and connectivity of two a priori determined regions of interest: the amygdala and the medial prefrontal-orbitofrontal cortex (mPFC-OFC). We also assessed psychological functioning, cognitive failure, and clinical disease severity. Patients showed less mPFC activation during processing of emotional faces compared to controls, whereas no differences were found in amygdala activation. An exploratory psychophysiological interaction analysis demonstrated decreased functional coupling between the ventromedial PFC and posterior cingulate cortex (a region structurally connected to the PFC) in CD-patients. The present study is the first to show alterations in brain function and task-related functional coupling in patients with long-term remission of CD relative to matched healthy controls. These alterations may, together with abnormalities in brain structure, be related to the persisting psychological morbidity in patients with CD after long-term remission. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. An ERP Study of Emotional Face Processing in the Adult and Infant Brain

    ERIC Educational Resources Information Center

    Leppanen, Jukka M.; Moulson, Margaret C.; Vogel-Farley, Vanessa K.; Nelson, Charles A.

    2007-01-01

    To examine the ontogeny of emotional face processing, event-related potentials (ERPs) were recorded from adults and 7-month-old infants while viewing pictures of fearful, happy, and neutral faces. Face-sensitive ERPs at occipital-temporal scalp regions differentiated between fearful and neutral/happy faces in both adults (N170 was larger for fear)…
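Event-related potentials such as the N170 reported in this record are obtained by averaging many EEG epochs time-locked to stimulus onset, so that stimulus-locked activity survives while unrelated noise cancels. A minimal sketch with purely synthetic data (the sampling rate, amplitudes, and noise level below are illustrative assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-trial EEG: a negative deflection near 170 ms buried in noise
fs = 1000                                  # sampling rate (Hz), assumed
t = np.arange(-100, 400) / fs              # -100..399 ms around stimulus onset
component = -5.0 * np.exp(-((t - 0.170) ** 2) / (2 * 0.015 ** 2))  # toy "N170"
trials = component + rng.normal(0, 5.0, size=(200, t.size))

erp = trials.mean(axis=0)                  # averaging attenuates the noise
erp -= erp[t < 0].mean()                   # baseline-correct to pre-stimulus

peak_ms = t[np.argmin(erp)] * 1000
print(round(peak_ms))                      # peak latency close to 170 ms
```

With 200 trials the noise standard error drops by a factor of about 14, which is why the negative peak recovers its true latency despite heavy single-trial noise.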

  5. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces

    PubMed Central

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another’s face; self-face also elicits an enhanced P3 amplitude compared to another’s face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals. PMID:28868041

  6. Self-esteem Modulates the P3 Component in Response to the Self-face Processing after Priming with Emotional Faces.

    PubMed

    Guan, Lili; Zhao, Yufang; Wang, Yige; Chen, Yujie; Yang, Juan

    2017-01-01

    The self-face processing advantage (SPA) refers to the research finding that individuals generally recognize their own face faster than another's face; self-face also elicits an enhanced P3 amplitude compared to another's face. It has been suggested that social evaluation threats could weaken the SPA and that self-esteem could be regarded as a threat buffer. However, little research has directly investigated the neural evidence of how self-esteem modulates the social evaluation threat to the SPA. In the current event-related potential study, 27 healthy Chinese undergraduate students were primed with emotional faces (angry, happy, or neutral) and were asked to judge whether the target face (self, friend, and stranger) was familiar or unfamiliar. Electrophysiological results showed that after priming with emotional faces (angry and happy), self-face elicited similar P3 amplitudes to friend-face in individuals with low self-esteem, but not in individuals with high self-esteem. The results suggest that as low self-esteem raises fears of social rejection and exclusion, priming with emotional faces (angry and happy) can weaken the SPA in low self-esteem individuals but not in high self-esteem individuals.

  7. Startling similarity: Effects of facial self-resemblance and familiarity on the processing of emotional faces

    PubMed Central

    Larra, Mauro F.; Merz, Martina U.; Schächinger, Hartmut

    2017-01-01

    Facial self-resemblance has been associated with positive emotional evaluations, but this effect may be biased by self-face familiarity. Here we report two experiments utilizing startle modulation to investigate how the processing of facial expressions of emotion is affected by subtle resemblance to the self as well as to familiar faces. Participants in the first experiment (N = 39) were presented with morphed faces showing happy, neutral, and fearful expressions which were manipulated to resemble either their own or unknown faces. At SOAs of either 300 ms or 3500–4500 ms after picture onset, startle responses were elicited by binaural bursts of white noise (50 ms, 105 dB), and recorded at the orbicularis oculi via EMG. Manual reaction time was measured in a simple emotion discrimination paradigm. Pictures preceding noise bursts by the short SOA inhibited startle (prepulse inhibition, PPI). Both affective modulation and PPI of startle in response to emotional faces were altered by physical similarity to the self. As indexed both by relative facilitation of startle and by faster manual responses, self-resemblance apparently induced deeper processing of facial affect, particularly in happy faces. The second experiment (N = 54) produced similar findings using morphs of famous faces, yet showed no impact of mere familiarity on PPI effects or on response times. The results are discussed with respect to differential (presumably pre-attentive) effects of self-specific vs. familiar information in face processing. PMID:29216226
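Prepulse inhibition of the kind measured in this record is conventionally quantified as the percent reduction of startle-blink EMG magnitude when a lead stimulus precedes the noise burst. A minimal sketch of that standard index (the magnitudes below are made-up values, not data from the study):

```python
def percent_ppi(startle_alone, startle_prepulse):
    """Standard percent prepulse inhibition: how much a leading stimulus
    reduces startle-blink EMG magnitude relative to startle alone."""
    return 100.0 * (1.0 - startle_prepulse / startle_alone)

# Hypothetical EMG magnitudes in arbitrary units
print(percent_ppi(200.0, 120.0))  # → 40.0 (i.e., 40% inhibition)
```

Negative values of this index indicate startle facilitation rather than inhibition, which is how "relative facilitation of startle" in the abstract would show up numerically.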

  8. Down Syndrome and Automatic Processing of Familiar and Unfamiliar Emotional Faces

    ERIC Educational Resources Information Center

    Morales, Guadalupe E.; Lopez, Ernesto O.

    2010-01-01

    Participants with Down syndrome (DS) took part in a face recognition experiment requiring them to recognize familiar (DS faces) and unfamiliar (non-DS faces) emotional faces, using an affective priming paradigm. Pairs of emotional facial stimuli were presented (one face after another) with a short Stimulus Onset Asynchrony of 300…

  9. Grounding context in face processing: color, emotion, and gender.

    PubMed

    Gil, Sandrine; Le Bigot, Ludovic

    2015-01-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (vs. green, mixed red/green, and achromatic) background - known to be valenced - on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder's gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension.

  10. Grounding context in face processing: color, emotion, and gender

    PubMed Central

    Gil, Sandrine; Le Bigot, Ludovic

    2015-01-01

    In recent years, researchers have become interested in the way that the affective quality of contextual information transfers to a perceived target. We therefore examined the effect of a red (vs. green, mixed red/green, and achromatic) background – known to be valenced – on the processing of stimuli that play a key role in human interactions, namely facial expressions. We also examined whether the valenced-color effect can be modulated by gender, which is also known to be valenced. Female and male adult participants performed a categorization task of facial expressions of emotion in which the faces of female and male posers expressing two ambiguous emotions (i.e., neutral and surprise) were presented against the four different colored backgrounds. Additionally, this task was completed by collecting subjective ratings for each colored background in the form of five semantic differential scales corresponding to both discrete and dimensional perspectives of emotion. We found that the red background resulted in more negative face perception than the green background, whether the poser was female or male. However, whereas this valenced-color effect was the only effect for female posers, for male posers, the effect was modulated by both the nature of the ambiguous emotion and the decoder’s gender. Overall, our findings offer evidence that color and gender have a common valence-based dimension. PMID:25852625

  11. Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies

    PubMed Central

    Fusar-Poli, Paolo; Placentino, Anna; Carletti, Francesco; Landi, Paola; Allen, Paul; Surguladze, Simon; Benedetti, Francesco; Abbamonte, Marta; Gasparotti, Roberto; Barale, Francesco; Perez, Jorge; McGuire, Philip; Politi, Pierluigi

    2009-01-01

    Background Most of our social interactions involve perception of emotional information from the faces of other people. Furthermore, such emotional processes are thought to be aberrant in a range of clinical disorders, including psychosis and depression. However, the exact neurofunctional maps underlying emotional facial processing are not well defined. Methods Two independent researchers conducted separate comprehensive PubMed (1990 to May 2008) searches to find all functional magnetic resonance imaging (fMRI) studies using a variant of the emotional faces paradigm in healthy participants. The search terms were: “fMRI AND happy faces,” “fMRI AND sad faces,” “fMRI AND fearful faces,” “fMRI AND angry faces,” “fMRI AND disgusted faces” and “fMRI AND neutral faces.” We extracted spatial coordinates and inserted them in an electronic database. We performed activation likelihood estimation analysis for voxel-based meta-analyses. Results Of the originally identified studies, 105 met our inclusion criteria. The overall database consisted of 1785 brain coordinates that yielded an overall sample of 1600 healthy participants. Quantitative voxel-based meta-analysis of brain activation provided neurofunctional maps for 1) main effect of human faces; 2) main effect of emotional valence; and 3) modulatory effect of age, sex, explicit versus implicit processing and magnetic field strength. Processing of emotional faces was associated with increased activation in a number of visual, limbic, temporoparietal and prefrontal areas; the putamen; and the cerebellum. Happy, fearful and sad faces specifically activated the amygdala, whereas angry or disgusted faces had no effect on this brain region. Furthermore, amygdala sensitivity was greater for fearful than for happy or sad faces. Insular activation was selectively reported during processing of disgusted and angry faces. However, insular sensitivity was greater for disgusted than for angry faces. Conversely
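The activation likelihood estimation (ALE) analysis described above models each reported activation focus as a three-dimensional Gaussian probability distribution and combines them voxel-wise as a probabilistic union. A toy sketch of that core idea (the grid, coordinates, and kernel width are illustrative assumptions; the published analyses used dedicated tools with empirically derived kernels):

```python
import numpy as np

def ale_map(foci, grid_shape=(20, 20, 20), sigma=2.0):
    """Toy ALE map: each focus becomes a 3-D Gaussian 'modeled activation'
    map with peak 1; voxel-wise ALE values combine them as the union of
    independent probabilities, 1 - prod(1 - p_i)."""
    zz, yy, xx = np.meshgrid(*[np.arange(n) for n in grid_shape],
                             indexing="ij")
    ale = np.zeros(grid_shape)
    for fz, fy, fx in foci:
        d2 = (zz - fz) ** 2 + (yy - fy) ** 2 + (xx - fx) ** 2
        ma = np.exp(-d2 / (2 * sigma ** 2))   # modeled activation, peak 1
        ale = 1.0 - (1.0 - ale) * (1.0 - ma)  # probabilistic union
    return ale

# Two hypothetical nearby foci reinforce each other in the combined map
m = ale_map([(10, 10, 10), (10, 12, 10)])
print(m.max() <= 1.0, m[10, 10, 10] > m[0, 0, 0])
```

The union formula keeps every voxel's ALE value bounded by 1 no matter how many foci contribute, which is what makes it interpretable as a probability of activation.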

  12. The effects of early institutionalization on emotional face processing: evidence for sparing via an experience-dependent mechanism.

    PubMed

    Young, Audrey; Luyster, Rhiannon J; Fox, Nathan A; Zeanah, Charles H; Nelson, Charles A

    2017-09-01

    Early psychosocial deprivation has profound adverse effects on children's brain and behavioural development, including abnormalities in physical growth, intellectual function, social cognition, and emotional development. Nevertheless, the domain of emotional face processing has appeared in previous research to be relatively spared; here, we test for possible sleeper effects emerging in early adolescence. This study employed event-related potentials (ERPs) to examine the neural correlates of facial emotion processing in 12-year-old children who took part in a randomized controlled trial of foster care as an intervention for early institutionalization. Results revealed no significant group differences in two face and emotion-sensitive ERP components (P1 and N170), nor any association with age at placement or per cent of lifetime spent in an institution. These results converged with previous evidence from this population supporting relative sparing of facial emotion processing. We hypothesize that this sparing is due to an experience-dependent mechanism in which the amount of exposure to faces and facial expressions of emotion children received was sufficient to meet the low threshold required for cortical specialization of structures critical to emotion processing. Statement of contribution What is already known on this subject? Early psychosocial deprivation leads to profoundly detrimental effects on children's brain and behavioural development. With respect to children's emotional face processing abilities, few adverse effects of institutionalized rearing have previously been reported. Recent studies suggest that 'sleeper effects' may emerge many years later, especially in the domain of face processing. What does this study add? Examining a cumulative 12 years of data, we found only minimal group differences and no evidence of a sleeper effect in this particular domain. These findings identify emotional face processing as a unique ability in which relative sparing

  13. Risk for Bipolar Disorder is Associated with Face-Processing Deficits across Emotions

    ERIC Educational Resources Information Center

    Brotman, Melissa A.; Skup, Martha; Rich, Brendan A.; Blair, Karina S.; Pine, Daniel S.; Blair, James R.; Leibenluft, Ellen

    2008-01-01

    The relationship between the risks for face-emotion labeling deficits and bipolar disorder (BD) among youths is examined. Findings show that youths at risk for BD did not show specific face-emotion recognition deficits. The need to provide more intense emotional information for face-emotion labeling of patients and at-risk youths is also discussed.

  14. Differential emotion attribution to neutral faces of own and other races.

    PubMed

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with no significant difference in frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differential visual experience, culture, racial stereotypes, or expectations about the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed a significant interaction between emotion attribution and face race on face-processing strategy measures, such as the fixation proportion on the eyes and saccade amplitude. Additionally, pupil size was larger while processing Caucasian faces than while processing Chinese faces.

  15. Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

    PubMed Central

    Rigoulot, Simon; Pell, Marc D.

    2012-01-01

    Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454

  16. Emotion Words: Adding Face Value.

    PubMed

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
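The sensitivity (d′) and response bias (c) measures reported above come from signal detection theory: d′ is the separation between the z-transformed hit and false-alarm rates, and c is the (negated) midpoint between them. A minimal sketch using only the standard library (the rates below are made-up values, not data from the study):

```python
from statistics import NormalDist

def dprime_and_c(hit_rate, fa_rate):
    """Signal-detection sensitivity d' = z(H) - z(FA) and
    criterion c = -(z(H) + z(FA)) / 2."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate), -(z(hit_rate) + z(fa_rate)) / 2

# Hypothetical symmetric case: 84% hits, 16% false alarms
d, c = dprime_and_c(0.84, 0.16)
# d is close to 2 (good discrimination); c is close to 0 (no bias)
```

A shift toward answering "yes", as the primed participants showed, appears as a more negative c while d′ can fall independently, which is why the two measures are reported separately.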

  17. The effect of acute citalopram on face emotion processing in remitted depression: a pharmacoMRI study.

    PubMed

    Anderson, Ian M; Juhasz, Gabriella; Thomas, Emma; Downey, Darragh; McKie, Shane; Deakin, J F William; Elliott, Rebecca

    2011-01-01

    Both reduced serotonergic (5-HT) function and negative emotional biases have been associated with vulnerability to depression. To investigate whether these might be related, we examined 5-HT modulation of affective processing in 14 remitted depressed subjects compared with 12 never-depressed controls matched for age and sex. Participants underwent functional magnetic resonance imaging (fMRI) during a covert face emotion task with and without intravenous citalopram (7.5 mg) pretreatment. Compared with viewing neutral faces, and irrespective of group, citalopram enhanced the left anterior cingulate blood oxygen level dependent (BOLD) response to happy faces, right posterior insula and right lateral orbitofrontal responses to sad faces, and reduced amygdala responses bilaterally to fearful faces. In controls, relative to remitted depressed subjects, citalopram increased bilateral hippocampal responses to happy faces and increased the right anterior insula response to sad faces. These findings were not accounted for by changes in BOLD responses to viewing neutral faces. These results are consistent with previous findings showing 5-HT modulation of affective processing; differences found in previously depressed participants compared with controls may contribute to emotional processing biases underlying vulnerability to depressive relapse. Copyright © 2010 Elsevier B.V. and ECNP. All rights reserved.

  18. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    PubMed

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  19. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    PubMed Central

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing us to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921

  20. Consciousness and arousal effects on emotional face processing as revealed by brain oscillations. A gamma band analysis.

    PubMed

    Balconi, Michela; Lucchiari, Claudio

    2008-01-01

    It remains an open question whether a single brain operation or psychological function for facial emotion decoding can be assigned to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study, brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: supraliminal (150 ms) vs subliminal (10 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) in the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than by subliminal elaboration, as well as more by high-arousal (anger and fear) than by low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in emotional processing of faces in comparison with neutral faces.

  1. Neural signatures of conscious and unconscious emotional face processing in human infants.

    PubMed

    Jessen, Sarah; Grossmann, Tobias

    2015-03-01

    Human adults can process emotional information both with and without conscious awareness, and it has been suggested that the two processes rely on partly distinct brain mechanisms. However, the developmental origins of these brain processes are unknown. In the present event-related brain potential (ERP) study, we examined the brain responses of 7-month-old infants in response to subliminally (50 and 100 msec) and supraliminally (500 msec) presented happy and fearful facial expressions. Our results revealed that infants' brain responses (Pb and Nc) over central electrodes distinguished between emotions irrespective of stimulus duration, whereas the discrimination between emotions at occipital electrodes (N290 and P400) only occurred when faces were presented supraliminally (above threshold). This suggests that early in development the human brain not only discriminates between happy and fearful facial expressions irrespective of conscious perception, but also that, similar to adults, supraliminal and subliminal emotion processing relies on distinct neural processes. Our data further suggest that the processing of emotional facial expressions differs across infants depending on their behaviorally shown perceptual sensitivity. The current ERP findings suggest that distinct brain processes underpinning conscious and unconscious emotion perception emerge early in ontogeny and can therefore be seen as a key feature of human social functioning. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Visual scanning behavior during processing of emotional faces in older adults with major depression.

    PubMed

    Noiret, Nicolas; Carvalho, Nicolas; Laurent, Éric; Vulliez, Lauriane; Bennabi, Djamila; Chopard, Gilles; Haffen, Emmanuel; Nicolier, Magali; Monnin, Julie; Vandel, Pierre

    2015-01-01

    Although several studies have suggested that younger adults with depression display depression-related biases during the processing of emotional faces, there remains a lack of data concerning these biases in older adults. The aim of our study was to assess scanning behavior during the processing of emotional faces in depressed older adults. Older adults with and without depression viewed happy, neutral, or sad portraits while their eye movements were recorded. Depressed older adults spent less time with fewer fixations on emotional features than healthy older adults, but only for sad and neutral portraits, with no significant difference for happy portraits. These results suggest disengagement from sad and neutral faces in depressed older adults, which is not consistent with standard theoretical proposals on congruence biases in depression. Aging and the associated changes in emotion regulation may also explain the expression of depression-related biases. Our preliminary results suggest that information processing in depression is a more complex phenomenon than merely a general search for mood-congruent stimuli or a general disengagement from all kinds of stimuli. These findings underline that care must be taken when evaluating potential variables, such as aging, which interact with depression and selectively influence the choice of relevant stimulus dimensions.

  3. Alcoholism and Dampened Temporal Limbic Activation to Emotional Faces

    PubMed Central

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O’Reilly, Cara E.; Howard, Julie A.; Sawyer, Kayle; Harris, Gordon J.

    2013-01-01

    Background Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Methods Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Results Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Conclusions Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally

  4. Alcoholism and dampened temporal limbic activation to emotional faces.

    PubMed

    Marinkovic, Ksenija; Oscar-Berman, Marlene; Urban, Trinity; O'Reilly, Cara E; Howard, Julie A; Sawyer, Kayle; Harris, Gordon J

    2009-11-01

    Excessive chronic drinking is accompanied by a broad spectrum of emotional changes ranging from apathy and emotional flatness to deficits in comprehending emotional information, but their neural bases are poorly understood. Emotional abnormalities associated with alcoholism were examined with functional magnetic resonance imaging in abstinent long-term alcoholic men in comparison to healthy demographically matched controls. Participants were presented with emotionally valenced words and photographs of faces during deep (semantic) and shallow (perceptual) encoding tasks followed by recognition. Overall, faces evoked stronger activation than words, with the expected material-specific laterality (left hemisphere for words, and right for faces) and depth of processing effects. However, whereas control participants showed stronger activation in the amygdala and hippocampus when viewing faces with emotional (relative to neutral) expressions, the alcoholics responded in an undifferentiated manner to all facial expressions. In the alcoholic participants, amygdala activity was inversely correlated with an increase in lateral prefrontal activity as a function of their behavioral deficits. Prefrontal modulation of emotional function as a compensation for the blunted amygdala activity during a socially relevant face appraisal task is in agreement with a distributed network engagement during emotional face processing. Deficient activation of amygdala and hippocampus may underlie impaired processing of emotional faces associated with long-term alcoholism and may be a part of the wide array of behavioral problems including disinhibition, concurring with previously documented interpersonal difficulties in this population. Furthermore, the results suggest that alcoholics may rely on prefrontal rather than temporal limbic areas in order to compensate for reduced limbic responsivity and to maintain behavioral adequacy when faced with emotionally or socially challenging situations.

  5. Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.

    PubMed

    Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O

    2016-06-01

    Face cognition, including face identity and facial expression processing, is a crucial component of the socio-emotional abilities that characterize humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between the 5-HTTLPR/rs25531 polymorphism - related to serotonin reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained using a comprehensive multivariate task battery with maximal-effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphism with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two S alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating the discriminant validity of these relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.

  6. Automatic Processing of Emotional Faces in High-Functioning Pervasive Developmental Disorders: An Affective Priming Study

    ERIC Educational Resources Information Center

    Kamio, Yoko; Wolf, Julie; Fein, Deborah

    2006-01-01

    This study examined automatic processing of emotional faces in individuals with high-functioning Pervasive Developmental Disorders (HFPDD) using an affective priming paradigm. Sixteen participants (HFPDD and matched controls) were presented with happy faces, fearful faces or objects in both subliminal and supraliminal exposure conditions, followed…

  7. Risk for bipolar disorder is associated with face-processing deficits across emotions.

    PubMed

    Brotman, Melissa A; Skup, Martha; Rich, Brendan A; Blair, Karina S; Pine, Daniel S; Blair, James R; Leibenluft, Ellen

    2008-12-01

    Youths with euthymic bipolar disorder (BD) have a deficit in face-emotion labeling that is present across multiple emotions. Recent research indicates that youths at familial risk for BD, but without a history of mood disorder, also have a deficit in face-emotion labeling, suggesting that such impairments may be an endophenotype for BD. It is unclear whether this deficit in at-risk youths is present across all emotions or if the impairment presents initially as an emotion-specific dysfunction that then generalizes to other emotions as the symptoms of BD become manifest. Thirty-seven patients with pediatric BD, 25 unaffected children with a first-degree relative with BD, and 36 typically developing youths were administered the Emotional Expression Multimorph Task, a computerized behavioral task, which presents gradations of facial emotions from 100% neutrality to 100% emotional expression (happiness, surprise, fear, sadness, anger, and disgust). Repeated-measures analysis of covariance revealed that, compared with the control youths, the patients and the at-risk youths required significantly more intense emotional information to identify and correctly label face emotions. The patients with BD and the at-risk youths did not differ from each other. Group-by-emotion interactions were not significant, indicating that the group effects did not differ based on the facial emotion. The youths at risk for BD demonstrate nonspecific deficits in face-emotion recognition, similar to patients with the illness. Further research is needed to determine whether such deficits meet all the criteria for an endophenotype.

  8. Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias.

    PubMed

    Invitto, Sara; Calcagnì, Antonio; Mignozzi, Arianna; Scardino, Rosanna; Piraino, Giulia; Turchi, Daniele; De Feudis, Irio; Brunetti, Antonio; Bevilacqua, Vitoantonio; de Tommaso, Marina

    2017-01-01

    Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down process. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition and how potential modulation of this processing induced by music could be influenced by the subject's musical competence. We investigated how emotional face recognition processing could be modulated by listening to music and how this modulation varies according to the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 Event-Related Potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main finding of the study was that musicians' behavioral responses and N170 components were more affected by the emotional value of the music administered during the emotional go/no-go task, a bias that was also apparent in responses to the non-target emotional faces. This suggests that emotional information, coming from multiple sensory channels, activates a crossmodal integration process that depends upon the emotional salience of the stimuli and the listener's appraisal.
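
The linear mixed-effects analysis of N170 amplitudes described above can be sketched as follows. This is a minimal illustration with simulated data, not the authors' analysis: the column names, effect sizes, and the random-intercept-only structure are all assumptions for demonstration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for s in range(24):  # 12 musicians, 12 non-musicians, as in the study
    group = "musician" if s < 12 else "nonmusician"
    subj_offset = rng.normal(0, 0.5)  # subject-level random intercept
    for _ in range(40):  # hypothetical number of trials per subject
        music = rng.choice(["Albeniz", "Chopin", "Mozart"])
        # Simulated N170 amplitude (microvolts): baseline + subject offset
        # + an assumed group effect + trial noise
        amp = -4.0 + subj_offset + (0.8 if group == "musician" else 0.0) \
              + rng.normal(0, 1.0)
        rows.append((f"s{s}", group, music, amp))
df = pd.DataFrame(rows, columns=["subject", "group", "music", "n170_amp"])

# Fixed effects of group and music piece, random intercept per subject
model = smf.mixedlm("n170_amp ~ group * music", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```

The same model structure extends naturally to N170 latencies by swapping the dependent variable.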

  9. Spatiotemporal brain dynamics of emotional face processing modulations induced by the serotonin 1A/2A receptor agonist psilocybin.

    PubMed

    Bernasconi, Fosco; Schmidt, André; Pokorny, Thomas; Kometer, Michael; Seifritz, Erich; Vollenweider, Franz X

    2014-12-01

    Emotional face processing is critically modulated by the serotonergic system. For instance, emotional face processing is impaired by acute psilocybin administration, a serotonin (5-HT) 1A and 2A receptor agonist. However, the spatiotemporal brain mechanisms underlying these modulations are poorly understood. Here, we investigated the spatiotemporal brain dynamics underlying psilocybin-induced modulations during emotional face processing. Electrical neuroimaging analyses were applied to visual evoked potentials in response to emotional faces, following psilocybin and placebo administration. Our results indicate a first time period of strength (i.e., Global Field Power) modulation over the 168-189 ms poststimulus interval, induced by psilocybin. A second time period of strength modulation was identified over the 211-242 ms poststimulus interval. Source estimations over these 2 time periods further revealed decreased activity in response to both neutral and fearful faces within limbic areas, including amygdala and parahippocampal gyrus, and the right temporal cortex over the 168-189 ms interval, and reduced activity in response to happy faces within limbic and right temporo-occipital brain areas over the 211-242 ms interval. Our results indicate a selective and temporally dissociable effect of psilocybin on the neuronal correlates of emotional face processing, consistent with a modulation of the top-down control. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
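
Global Field Power (GFP), the strength measure analyzed in the windows above, is simply the spatial standard deviation across electrodes at each time point. A minimal sketch with simulated data (the channel count, sampling rate, and signal shape are illustrative assumptions, not the study's recording parameters):

```python
import numpy as np

def global_field_power(eeg):
    """GFP: standard deviation across channels at each time point.

    eeg: array of shape (n_channels, n_times), average-referenced.
    """
    return eeg.std(axis=0, ddof=0)

# Simulated evoked response: 64 channels, 500 ms sampled at 1 kHz
rng = np.random.default_rng(1)
times = np.arange(0, 500)  # ms
evoked = rng.normal(0, 1, (64, times.size))
# Add a shared deflection with channel-varying gain, so the GFP rises
evoked += np.sin(2 * np.pi * times / 200) * rng.normal(1, 0.2, (64, 1))

gfp = global_field_power(evoked)
# e.g. mean GFP over the 168-189 ms window reported in the abstract
window = (times >= 168) & (times <= 189)
print(gfp[window].mean())
```

Condition differences in GFP over a poststimulus window are then what the abstract calls "strength modulations."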

  10. Altered Functional Subnetwork During Emotional Face Processing: A Potential Intermediate Phenotype for Schizophrenia.

    PubMed

    Cao, Hengyi; Bertolino, Alessandro; Walter, Henrik; Schneider, Michael; Schäfer, Axel; Taurisano, Paolo; Blasi, Giuseppe; Haddad, Leila; Grimm, Oliver; Otto, Kristina; Dixson, Luanna; Erk, Susanne; Mohnke, Sebastian; Heinz, Andreas; Romanczuk-Seiferth, Nina; Mühleisen, Thomas W; Mattheisen, Manuel; Witt, Stephanie H; Cichon, Sven; Noethen, Markus; Rietschel, Marcella; Tost, Heike; Meyer-Lindenberg, Andreas

    2016-06-01

    Although deficits in emotional processing are prominent in schizophrenia, it has been difficult to identify neural mechanisms related to the genetic risk for this highly heritable illness. Prior studies have not found consistent regional activation or connectivity alterations in first-degree relatives compared with healthy controls, suggesting that a more comprehensive search for connectomic biomarkers is warranted. The aims were to identify a potential systems-level intermediate phenotype linked to emotion processing in schizophrenia and to examine the psychological association, task specificity, test-retest reliability, and clinical validity of the identified phenotype. The study was performed in university research hospitals from June 1, 2008, through December 31, 2013. We examined 58 unaffected first-degree relatives of patients with schizophrenia and 94 healthy controls with an emotional face-matching functional magnetic resonance imaging paradigm. Test-retest reliability was analyzed with an independent sample of 26 healthy participants. A clinical association study was performed in 31 patients with schizophrenia and 45 healthy controls. Data analysis was performed from January 1 to September 30, 2014. Measures included conventional amygdala activity and seeded connectivity, graph-based global and local network connectivity, Spearman rank correlation, intraclass correlation, and gray matter volumes. Among the 152 volunteers included in the relative-control sample, 58 were unaffected first-degree relatives of patients with schizophrenia (mean [SD] age, 33.29 [12.56] years; 38 were women), and 94 were healthy controls without a first-degree relative with mental illness (mean [SD] age, 32.69 [10.09] years; 55 were women). A graph-theoretical connectivity approach identified significantly decreased connectivity in a subnetwork that primarily included the limbic cortex, visual cortex, and subcortex during emotional face processing (cluster-level P corrected for familywise error =
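
The graph-based connectivity measures used here can be illustrated with a small sketch: build a functional connectivity matrix from regional time series, threshold it into a graph, and compute global and local (per-node) summaries. This is a generic demonstration with random data; the region count, threshold, and metrics are assumptions, not the study's pipeline.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
n_regions, n_timepoints = 20, 200
ts = rng.normal(size=(n_regions, n_timepoints))  # simulated regional signals

corr = np.corrcoef(ts)        # functional connectivity (correlation) matrix
np.fill_diagonal(corr, 0)     # ignore self-connections
threshold = 0.2               # arbitrary sparsity threshold for illustration
adj = (np.abs(corr) > threshold).astype(int)

G = nx.from_numpy_array(adj)
# Global connectivity summary and per-node (local) degree
global_eff = nx.global_efficiency(G)
degree = dict(G.degree())
print(global_eff, max(degree.values()))
```

Group comparisons of such graph metrics (relatives vs. controls) are one way a reduced-connectivity subnetwork can be localized.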

  11. Emotional responses associated with self-face processing in individuals with autism spectrum disorders: an fMRI study.

    PubMed

    Morita, Tomoyo; Kosaka, Hirotaka; Saito, Daisuke N; Ishitobi, Makoto; Munesue, Toshio; Itakura, Shoji; Omori, Masao; Okazawa, Hidehiko; Wada, Yuji; Sadato, Norihiro

    2012-01-01

    Individuals with autism spectrum disorders (ASD) show impaired emotional responses to self-face processing, but the underlying neural bases are unclear. Using functional magnetic resonance imaging, we investigated brain activity when 15 individuals with high-functioning ASD and 15 controls rated the photogenicity of self-face images and photographs of others' faces. Controls showed a strong correlation between photogenicity ratings and extent of embarrassment evoked by self-face images; this correlation was weaker among ASD individuals, indicating a decoupling between the cognitive evaluation of self-face images and emotional responses. Individuals with ASD demonstrated relatively low self-related activity in the posterior cingulate cortex (PCC), which was related to specific autistic traits. There were significant group differences in the modulation of activity by embarrassment ratings in the right insular (IC) and lateral orbitofrontal cortices. Task-related activity in the right IC was lower in the ASD group. The reduced activity in the right IC for self-face images was associated with weak coupling between cognitive evaluation and emotional responses to self-face images. The PCC is responsible for self-referential processing, and the IC plays a role in emotional experience. Dysfunction in these areas could contribute to the lack of self-conscious behaviors in response to self-reflection in ASD individuals.

  12. Processing of Emotional Faces in Patients with Chronic Pain Disorder: An Eye-Tracking Study.

    PubMed

    Giel, Katrin Elisabeth; Paganini, Sarah; Schank, Irena; Enck, Paul; Zipfel, Stephan; Junne, Florian

    2018-01-01

    Problems in emotion processing potentially contribute to the development and maintenance of chronic pain. Theories focusing on attentional processing have suggested that dysfunctional attention deployment toward emotional information, i.e., attentional biases for negative emotions, might be one developmental and/or maintenance factor of chronic pain. We assessed self-reported alexithymia and attentional orienting to and maintenance on emotional stimuli using eye tracking in 17 patients with chronic pain disorder (CP) and two age- and sex-matched control groups: 17 healthy individuals (HC) and 17 individuals matched to CP according to depressive symptoms (DC). In a choice viewing paradigm, a dot indicated the position of the emotional picture in the next trial to allow for strategic attention deployment. Picture pairs consisted of a happy or sad facial expression and a neutral facial expression of the same individual. Participants were asked to explore picture pairs freely. The CP and DC groups reported higher alexithymia than the HC group. HC showed a previously reported emotionality bias by preferentially orienting to the emotional face and preferentially maintaining on the happy face. CP and DC participants showed no facilitated early attention to sad facial expressions, and DC participants showed no facilitated early attention to happy facial expressions, while CP and HC participants did. We found no group differences in attentional maintenance. Our findings are in line with the large clinical overlap between pain and depression. The blunted initial reaction to sadness could be interpreted as a failure of the attentional system to attend to evolutionarily salient emotional stimuli or as an attempt to suppress negative emotions. These difficulties in emotion processing might contribute to the etiology or maintenance of chronic pain and depression.

  13. Modulation of the composite face effect by unintended emotion cues.

    PubMed

    Gray, Katie L H; Murphy, Jennifer; Marsh, Jade E; Cook, Richard

    2017-04-01

    When upper and lower regions from different emotionless faces are aligned to form a facial composite, observers 'fuse' the two halves together, perceptually. The illusory distortion induced by task-irrelevant ('distractor') halves hinders participants' judgements about task-relevant ('target') halves. This composite-face effect reveals a tendency to integrate feature information from disparate regions of intact upright faces, consistent with theories of holistic face processing. However, observers frequently perceive emotion in ostensibly neutral faces, contrary to the intentions of experimenters. This study sought to determine whether this 'perceived emotion' influences the composite-face effect. In our first experiment, we confirmed that the composite effect grows stronger as the strength of distractor emotion increased. Critically, effects of distractor emotion were induced by weak emotion intensities, and were incidental insofar as emotion cues hindered image matching, not emotion labelling per se . In Experiment 2, we found a correlation between the presence of perceived emotion in a set of ostensibly neutral distractor regions sourced from commonly used face databases, and the strength of illusory distortion they induced. In Experiment 3, participants completed a sequential matching composite task in which half of the distractor regions were rated high and low for perceived emotion, respectively. Significantly stronger composite effects were induced by the high-emotion distractor halves. These convergent results suggest that perceived emotion increases the strength of the composite-face effect induced by supposedly emotionless faces. These findings have important implications for the study of holistic face processing in typical and atypical populations.

  14. Emotional facial expressions reduce neural adaptation to face identity.

    PubMed

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing appears to remain sustained for emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed, and fearful) of the same or different identities were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to repetitions of the same, as compared with different, face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
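
Because the faces are presented at a fixed 3 Hz rate, the ssVEP amplitude can be read out directly from the spectrum at the stimulation frequency. A minimal single-channel sketch with simulated data (the sampling rate, recording length, and signal-to-noise level are illustrative assumptions):

```python
import numpy as np

fs = 250.0                    # assumed sampling rate (Hz)
stim_freq = 3.0               # face presentation rate from the study
t = np.arange(0, 10, 1 / fs)  # 10 s of simulated recording

rng = np.random.default_rng(3)
# Simulated channel: a 3 Hz steady-state response buried in noise
signal = 2.0 * np.sin(2 * np.pi * stim_freq * t) + rng.normal(0, 1, t.size)

# Amplitude spectrum; scale so a pure sinusoid recovers its amplitude
spectrum = np.abs(np.fft.rfft(signal)) / t.size * 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)
idx = np.argmin(np.abs(freqs - stim_freq))
ssvep_amplitude = spectrum[idx]
print(ssvep_amplitude)
```

Comparing this amplitude between same-identity and different-identity streams is what indexes adaptation in this paradigm.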

  15. Sex-dependent neural effect of oxytocin during subliminal processing of negative emotion faces.

    PubMed

    Luo, Lizhu; Becker, Benjamin; Geng, Yayuan; Zhao, Zhiying; Gao, Shan; Zhao, Weihua; Yao, Shuxia; Zheng, Xiaoxiao; Ma, Xiaole; Gao, Zhao; Hu, Jiehui; Kendrick, Keith M

    2017-11-15

    In line with animal models indicating sexually dimorphic effects of oxytocin (OXT) on social-emotional processing, a growing number of OXT-administration studies in humans have also reported sex-dependent effects during social information processing. To explore whether sex-dependent effects already occur during early, subliminal processing stages, the present pharmacological fMRI study combined the intranasal application of either OXT or placebo (n = 86; 43 males) with a backward-masking emotional face paradigm. Results showed that while OXT suppressed inferior frontal gyrus, dorsal anterior cingulate and anterior insula responses to threatening face stimuli in men, it increased them in women. In women, increased anterior cingulate reactivity during subliminal threat processing was also positively associated with trait anxiety. On the network level, sex-dependent effects were observed on amygdala, anterior cingulate and inferior frontal gyrus functional connectivity that were mainly driven by reduced coupling in women following OXT. Our findings demonstrate that OXT produces sex-dependent effects even at the early stages of social-emotional processing, and suggest that while it attenuates neural responses to threatening social stimuli in men, it increases them in women. Thus, in a therapeutic context OXT may potentially produce different effects on anxiety disorders in men and women. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Cortical deficits of emotional face processing in adults with ADHD: its relation to social cognition and executive function.

    PubMed

    Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo

    2011-01-01

    Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. The cortical current density mapping of N170 yielded a main neural source of N170 at posterior section of fusiform gyrus (maximum at left hemisphere for words and right hemisphere for faces and simultaneous stimuli). Neural generators of N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.

  17. What’s in a Face? How Face Gender and Current Affect Influence Perceived Emotion

    PubMed Central

    Harris, Daniel A.; Hayes-Skelton, Sarah A.; Ciaramitaro, Vivian M.

    2016-01-01

    Faces drive our social interactions. A vast literature suggests an interaction between gender and emotional face perception, with studies using different methodologies demonstrating that the gender of a face can affect how emotions are processed. However, how different is our perception of affective male and female faces? Furthermore, how does our current affective state when viewing faces influence our perceptual biases? We presented participants with a series of faces morphed along an emotional continuum from happy to angry. Participants judged each face morph as either happy or angry. We determined each participant’s unique emotional ‘neutral’ point, defined as the face morph judged to be perceived equally happy and angry, separately for male and female faces. We also assessed how current state affect influenced these perceptual neutral points. Our results indicate that, for both male and female participants, the emotional neutral point for male faces is perceptually biased to be happier than for female faces. This bias suggests that more happiness is required to perceive a male face as emotionally neutral, i.e., we are biased to perceive a male face as more negative. Interestingly, we also find that perceptual biases in perceiving female faces are correlated with current mood, such that positive state affect correlates with perceiving female faces as happier, while we find no significant correlation between negative state affect and the perception of facial emotion. Furthermore, we find reaction time biases, with slower responses for angry male faces compared to angry female faces. PMID:27733839
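
The "emotional neutral point" described above is the point of subjective equality on the happy-angry morph continuum, typically estimated by fitting a psychometric function to the proportion of "happy" judgements. A minimal sketch with simulated data (the morph scale, noise level, and true neutral point are illustrative assumptions, not the study's values):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Psychometric function: probability of a 'happy' response.

    x0 is the 50% point (the perceptual neutral point), k the slope.
    """
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Morph levels from 0 (fully angry) to 1 (fully happy); simulated
# proportions of 'happy' judgements with a neutral point at 0.6,
# i.e., a face needs extra happiness to be judged neutral
morph = np.linspace(0, 1, 11)
rng = np.random.default_rng(4)
p_happy = logistic(morph, 0.6, 12) + rng.normal(0, 0.02, morph.size)
p_happy = np.clip(p_happy, 0, 1)

(x0, k), _ = curve_fit(logistic, morph, p_happy, p0=[0.5, 10])
print(x0)  # recovered neutral point, near 0.6
```

Fitting this separately for male and female face sets, per participant, yields the neutral-point bias the abstract reports.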

  18. The development of emotion perception in face and voice during infancy.

    PubMed

    Grossmann, Tobias

    2010-01-01

    Interacting with others by reading their emotional expressions is an essential social skill in humans. How this ability develops during infancy and what brain processes underpin infants' perception of emotion in different modalities are the questions dealt with in this paper. Literature review. The first part provides a systematic review of behavioral findings on infants' developing emotion-reading abilities. The second part presents a set of new electrophysiological studies that provide insights into the brain processes underlying infants' developing abilities. Throughout, evidence from unimodal (face or voice) and multimodal (face and voice) processing of emotion is considered. The implications of the reviewed findings for our understanding of developmental models of emotion processing are discussed. The reviewed infant data suggest that (a) early in development, emotion enhances the sensory processing of faces and voices, (b) infants' ability to allocate increased attentional resources to negative emotional information develops earlier in the vocal domain than in the facial domain, and (c) at least by the age of 7 months, infants reliably match and recognize emotional information across face and voice.

  19. Task-irrelevant emotion facilitates face discrimination learning.

    PubMed

    Lorenzino, Martina; Caudek, Corrado

    2015-03-01

    We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Age-Related Changes in the Processing of Emotional Faces in a Dual-Task Paradigm.

    PubMed

    Casares-Guillén, Carmen; García-Rodríguez, Beatriz; Delgado, Marisa; Ellgring, Heiner

    2016-01-01

    Background/Study Context: Age-related changes appear to affect the ability to identify emotional facial expressions in dual-task conditions (i.e., while simultaneously performing a second visual task). The level of interference generated by the secondary task depends on the phase of emotional processing affected by the interference and the nature of the secondary task. The aim of the present study was to investigate the effect of these variables on age-related changes in the processing of emotional faces. The identification of emotional facial expressions (EFEs) was assessed in a dual-task paradigm using the following variables: (a) the phase during which interference was applied (encoding vs. retrieval phase); and (b) the nature of the interfering stimulus (visuospatial vs. verbal). The sample consisted of 24 healthy older adults (mean age = 75.38) and 40 younger adults (mean age = 26.90). The accuracy of EFE identification was calculated for all experimental conditions. Consistent with our hypothesis, the performance of the older group was poorer than that of the younger group in all experimental conditions. Dual-task performance was poorer when the interference occurred during the encoding phase of emotional face processing and when both tasks were of the same nature (i.e., when the experimental condition was more demanding in terms of attention). These results provide empirical evidence of age-related deficits in the identification of emotional facial expressions, which may be partially explained by the impairment of cognitive resources specific to this task. These findings may account for the difficulties experienced by the elderly during social interactions that require the concomitant processing of emotional and environmental information.

  1. Emotional conflict occurs at an early stage: evidence from the emotional face-word Stroop task.

    PubMed

    Zhu, Xiang-ru; Zhang, Hui-jun; Wu, Ting-ting; Luo, Wen-bo; Luo, Yue-jia

    2010-06-30

    The perceptual processing of emotional conflict was studied using electrophysiological techniques to measure event-related potentials (ERPs). The emotional face-word Stroop task, in which emotion words are written in prominent red color across a face, was used to study emotional conflict. In each trial, the emotion word and facial expression were either congruent or incongruent (in conflict). When subjects were asked to identify the expression of the face during a trial, the incongruent condition evoked a more negative N170 ERP component at posterior lateral sites than the congruent condition. In contrast, when subjects were asked to identify the word during a trial, the incongruent condition evoked a less negative N170 component than the congruent condition. The present findings extend our understanding of the control processes involved in emotional conflict by demonstrating that differentiation of emotional congruency begins at an early perceptual processing stage. © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Improved emotional conflict control triggered by the processing priority of negative emotion.

    PubMed

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-04-18

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas associated with emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas negatively coupled with the dorsolateral prefrontal cortex (DLPFC). However, significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC negatively coupled mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operated differently for negative and positive faces: it was implemented more efficiently in the negative face condition, whereas it was more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict.

  3. Improved emotional conflict control triggered by the processing priority of negative emotion

    PubMed Central

    Yang, Qian; Wang, Xiangpeng; Yin, Shouhang; Zhao, Xiaoyue; Tan, Jinfeng; Chen, Antao

    2016-01-01

    The prefrontal cortex is responsible for emotional conflict resolution, and this control mechanism is affected by the emotional valence of distracting stimuli. In the present study, we investigated effects of negative and positive stimuli on emotional conflict control using a face-word Stroop task in combination with functional brain imaging. Emotional conflict was absent in the negative face context, in accordance with the null activation observed in areas associated with emotional face processing (fusiform face area, middle temporal/occipital gyrus). Importantly, these visual areas negatively coupled with the dorsolateral prefrontal cortex (DLPFC). However, significant emotional conflict was observed in the positive face context; this effect was accompanied by activation in areas associated with emotional face processing and in the default mode network (DMN), and here the DLPFC negatively coupled mainly with the DMN rather than with visual areas. These results suggest that the conflict control mechanism operated differently for negative and positive faces: it was implemented more efficiently in the negative face condition, whereas it was more devoted to inhibiting internal interference in the positive face condition. This study thus provides a plausible mechanism of emotional conflict resolution in which the rapid pathway for negative emotion processing efficiently triggers control mechanisms to preventively resolve emotional conflict. PMID:27086908

  4. Men appear more lateralized when noticing emotion in male faces.

    PubMed

    Rahman, Qazi; Anchassi, Tarek

    2012-02-01

    Empirical tests of the "right hemisphere dominance" versus "valence" theories of emotion processing are confounded by known sex differences in lateralization. Moreover, information about the sex of the person posing an emotion might be processed differently by men and women because of an adaptive male bias to notice expressions of threat and vigilance in other male faces. The purpose of this study was to investigate whether sex of poser and emotion displayed influenced lateralization in men and women by analyzing "laterality quotient" scores on a test which depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression. We found that men (N = 50) were significantly more lateralized for emotions indicative of vigilance and threat (happy, sad, angry, and surprised) in male faces relative to female faces and compared to women (N = 44). These data indicate that sex differences in functional cerebral lateralization for facial emotion may be specific to the emotion presented and the sex of face presenting it. PsycINFO Database Record (c) 2012 APA, all rights reserved

  5. Neurobiological correlates of emotional intelligence in voice and face perception networks

    PubMed Central

    Karle, Kathrin N; Ethofer, Thomas; Jacob, Heike; Brück, Carolin; Erb, Michael; Lotze, Martin; Nizielski, Sophia; Schütz, Astrid; Wildgruber, Dirk; Kreifelts, Benjamin

    2018-01-01

    Facial expressions and voice modulations are among the most important communicational signals to convey emotional information. The ability to correctly interpret this information is highly relevant for successful social interaction and represents an integral component of emotional competencies that have been conceptualized under the term emotional intelligence. Here, we investigated the relationship of emotional intelligence as measured with the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) with cerebral voice and face processing using functional and structural magnetic resonance imaging. MSCEIT scores were positively correlated with increased voice-sensitivity and gray matter volume of the insula, accompanied by voice-sensitivity enhanced connectivity between the insula and the temporal voice area, indicating generally increased salience of voices. Conversely, in the face processing system, higher MSCEIT scores were associated with decreased face-sensitivity and gray matter volume of the fusiform face area. Taken together, these findings point to an alteration in the balance of cerebral voice and face processing systems in the form of an attenuated face-vs-voice bias as one potential factor underpinning emotional intelligence. PMID:29365199

  6. Neurobiological correlates of emotional intelligence in voice and face perception networks.

    PubMed

    Karle, Kathrin N; Ethofer, Thomas; Jacob, Heike; Brück, Carolin; Erb, Michael; Lotze, Martin; Nizielski, Sophia; Schütz, Astrid; Wildgruber, Dirk; Kreifelts, Benjamin

    2018-02-01

    Facial expressions and voice modulations are among the most important communicational signals to convey emotional information. The ability to correctly interpret this information is highly relevant for successful social interaction and represents an integral component of emotional competencies that have been conceptualized under the term emotional intelligence. Here, we investigated the relationship of emotional intelligence as measured with the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) with cerebral voice and face processing using functional and structural magnetic resonance imaging. MSCEIT scores were positively correlated with increased voice-sensitivity and gray matter volume of the insula, accompanied by voice-sensitivity enhanced connectivity between the insula and the temporal voice area, indicating generally increased salience of voices. Conversely, in the face processing system, higher MSCEIT scores were associated with decreased face-sensitivity and gray matter volume of the fusiform face area. Taken together, these findings point to an alteration in the balance of cerebral voice and face processing systems in the form of an attenuated face-vs-voice bias as one potential factor underpinning emotional intelligence.

  7. The effect of age on memory for emotional faces.

    PubMed

    Grady, Cheryl L; Hongwanishkul, Donaya; Keightley, Michelle; Lee, Wendy; Hasher, Lynn

    2007-05-01

    Prior studies of emotion suggest that young adults should have enhanced memory for negative faces and that this enhancement should be reduced in older adults. Several studies have not shown these effects but were conducted with procedures different from those used with other emotional stimuli. In this study, researchers examined age differences in recognition of faces with emotional or neutral expressions, using trial-unique stimuli, as is typically done with other types of emotional stimuli. They also assessed the influence of personality traits and mood on memory. Enhanced recognition for negative faces was found in young adults but not in older adults. Recognition of faces was not influenced by mood or personality traits in young adults, but lower levels of extraversion and better emotional sensitivity predicted better negative face memory in older adults. These results suggest that negative expressions enhance memory for faces in young adults, as negative valence enhances memory for words and scenes. This enhancement is absent in older adults, but memory for emotional faces is modulated in older adults by personality traits that are relevant to emotional processing. (c) 2007 APA, all rights reserved

  8. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    PubMed Central

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments and with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, thus indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  9. The face and its emotion: right N170 deficits in structural processing and early emotional discrimination in schizophrenic patients and relatives.

    PubMed

    Ibáñez, Agustín; Riveros, Rodrigo; Hurtado, Esteban; Gleichgerrcht, Ezequiel; Urquina, Hugo; Herrera, Eduar; Amoruso, Lucía; Reyes, Migdyrai Martin; Manes, Facundo

    2012-01-30

    Previous studies have reported facial emotion recognition impairments in schizophrenic patients, as well as abnormalities in the N170 component of the event-related potential. Current research on schizophrenia highlights the importance of complexly-inherited brain-based deficits. In order to examine the N170 markers of face structural and emotional processing, DSM-IV diagnosed schizophrenia probands (n=13), unaffected first-degree relatives from multiplex families (n=13), and control subjects (n=13) matched by age, gender and educational level, performed a categorization task which involved words and faces with positive and negative valence. The N170 component, while present in relatives and control subjects, was reduced in patients, not only for faces, but also for face-word differences, suggesting a deficit in structural processing of stimuli. Control subjects showed N170 modulation according to the valence of facial stimuli. However, this discrimination effect was found to be reduced both in patients and relatives. This is the first report showing N170 valence deficits in relatives. Our results suggest a generalized deficit affecting the structural encoding of faces in patients, as well as the emotion discrimination both in patients and relatives. Finally, these findings lend support to the notion that cortical markers of facial discrimination can be validly considered as vulnerability markers. © 2011 Elsevier Ireland Ltd. All rights reserved.

  10. More than words (and faces): evidence for a Stroop effect of prosody in emotion word processing.

    PubMed

    Filippi, Piera; Ocklenburg, Sebastian; Bowling, Daniel L; Heege, Larissa; Güntürkün, Onur; Newen, Albert; de Boer, Bart

    2017-08-01

    Humans typically combine linguistic and nonlinguistic information to comprehend emotions. We adopted an emotion identification Stroop task to investigate how different channels interact in emotion communication. In experiment 1, synonyms of "happy" and "sad" were spoken with happy and sad prosody. Participants had more difficulty ignoring prosody than ignoring verbal content. In experiment 2, synonyms of "happy" and "sad" were spoken with happy and sad prosody, while happy or sad faces were displayed. Accuracy was lower when two channels expressed an emotion that was incongruent with the channel participants had to focus on, compared with the cross-channel congruence condition. When participants were required to focus on verbal content, accuracy was significantly lower also when prosody was incongruent with verbal content and face. This suggests that prosody biases emotional verbal content processing, even when conflicting with verbal content and face simultaneously. Implications for multimodal communication and language evolution studies are discussed.

  11. A leftward bias however you look at it: Revisiting the emotional chimeric face task as a tool for measuring emotion lateralization.

    PubMed

    R Innes, Bobby; Burt, D Michael; Birch, Yan K; Hausmann, Markus

    2015-12-28

    Left hemiface biases observed within the Emotional Chimeric Face Task (ECFT) support emotional face perception models whereby all expressions are preferentially processed by the right hemisphere. However, previous research using this task has not considered that the visible midline between hemifaces might engage atypical facial emotion processing strategies in upright or inverted conditions, nor controlled for left visual field (thus right hemispheric) visuospatial attention biases. This study used novel emotional chimeric faces (blended at the midline) to examine laterality biases for all basic emotions. Left hemiface biases were demonstrated across all emotional expressions and were reduced, but not reversed, for inverted faces. The ECFT bias in upright faces was significantly increased in participants with a large attention bias. These results support the theory that left hemiface biases reflect a genuine bias in emotional face processing, and this bias can interact with attention processes similarly localized in the right hemisphere.

  12. Own-sex effects in emotional memory for faces.

    PubMed

    Armony, Jorge L; Sergerie, Karine

    2007-10-09

    The amygdala is known to be critical for the enhancement of memory for emotional, especially negative, material. Importantly, some researchers have suggested a sex-specific hemispheric lateralization in this process. In the case of facial expressions, another important factor that could influence memory success is the sex of the face, which could interact with the emotion depicted as well as with the sex of the perceiver. Whether this is the case remains unknown, as all previous studies of sex difference in emotional memory have employed affective pictures. Here we directly explored this question using functional magnetic resonance imaging in a subsequent memory paradigm for facial expressions (fearful, happy and neutral). Consistent with our hypothesis, we found that the hemispheric laterality of the amygdala involvement in successful memory for emotional material was influenced not only by the sex of the subjects, as previously proposed, but also by the sex of the faces being remembered. Namely, the left amygdala was more active for successfully remembered female fearful faces in women, whereas in men the right amygdala was more involved in memory for male fearful faces. These results confirm the existence of sex differences in amygdala lateralization in emotional memory but also demonstrate a subtle relationship between the observer and the stimulus in this process.

  13. Emotion-attention interactions in recognition memory for distractor faces.

    PubMed

    Srinivasan, Narayanan; Gupta, Rashmi

    2010-04-01

    Effective filtering of distractor information has been shown to be dependent on perceptual load. Given the salience of emotional information and the presence of emotion-attention interactions, we wanted to explore the recognition memory for emotional distractors especially as a function of focused attention and distributed attention by manipulating load and the spatial spread of attention. We performed two experiments to study emotion-attention interactions by measuring recognition memory performance for distractor neutral and emotional faces. Participants performed a color discrimination task (low-load) or letter identification task (high-load) with a letter string display in Experiment 1 and a high-load letter identification task with letters presented in a circular array in Experiment 2. The stimuli were presented against a distractor face background. The recognition memory results show that happy faces were recognized better than sad faces under conditions of less focused or distributed attention. When attention is more spatially focused, sad faces were recognized better than happy faces. The study provides evidence for emotion-attention interactions in which specific emotional information like sad or happy is associated with focused or distributed attention respectively. Distractor processing with emotional information also has implications for theories of attention. Copyright 2010 APA, all rights reserved.

  14. Neural circuitry of masked emotional face processing in youth with bipolar disorder, severe mood dysregulation, and healthy volunteers.

    PubMed

    Thomas, Laura A; Brotman, Melissa A; Bones, Brian L; Chen, Gang; Rosen, Brooke H; Pine, Daniel S; Leibenluft, Ellen

    2014-04-01

    Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show face-emotion labeling deficits. These groups differ from healthy volunteers (HV) in neural responses to emotional faces. It is unknown whether awareness is required to elicit these differences. We compared activation in BD (N=20), SMD (N=18), and HV (N=22) during "Aware" and "Non-aware" priming of shapes by emotional faces. Subjects rated how much they liked the shape. In the aware condition, a face (angry, fearful, happy, neutral, blank oval) appeared (187 ms) before the shape. In the non-aware condition, a face appeared (17 ms), followed by a mask (170 ms), and then the shape. A Diagnosis-by-Awareness-by-Emotion ANOVA was not significant. There were significant Diagnosis-by-Awareness interactions in occipital regions. BD and SMD showed increased activity for non-aware vs. aware; HV showed the reverse pattern. When subjects viewed angry or neutral faces, there were Emotion-by-Diagnosis interactions in face-emotion processing regions, including the L precentral gyrus, R posterior cingulate, R superior temporal gyrus, R middle occipital gyrus, and L medial frontal gyrus. Regardless of awareness, BD and SMD differ in activation patterns from HV and from each other in multiple brain regions, suggesting that BD and SMD are distinct developmental mood disorders. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. The extended functional neuroanatomy of emotional processing biases for masked faces in major depressive disorder.

    PubMed

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Bellgowan, Patrick S F; Öhman, Arne; Drevets, Wayne C

    2012-01-01

    Major depressive disorder (MDD) is associated with a mood-congruent processing bias in the amygdala toward face stimuli portraying sad expressions that is evident even when such stimuli are presented below the level of conscious awareness. The extended functional anatomical network that maintains this response bias has not been established, however. To identify neural network differences in the hemodynamic response to implicitly presented facial expressions between depressed and healthy control participants. Unmedicated-depressed participants with MDD (n=22) and healthy controls (HC; n=25) underwent functional MRI as they viewed face stimuli showing sad, happy or neutral face expressions, presented using a backward masking design. The blood-oxygen-level dependent (BOLD) signal was measured to identify regions where the hemodynamic response to the emotionally valenced stimuli differed between groups. The MDD subjects showed greater BOLD responses than the controls to masked-sad versus masked-happy faces in the hippocampus, amygdala and anterior inferotemporal cortex. While viewing both masked-sad and masked-happy faces relative to masked-neutral faces, the depressed subjects showed greater hemodynamic responses than the controls in a network that included the medial and orbital prefrontal cortices and anterior temporal cortex. Depressed and healthy participants showed distinct hemodynamic responses to masked-sad and masked-happy faces in neural circuits known to support the processing of emotionally valenced stimuli and to integrate the sensory and visceromotor aspects of emotional behavior. Altered function within these networks in MDD may establish and maintain illness-associated differences in the salience of sensory/social stimuli, such that attention is biased toward negative and away from positive stimuli.

  16. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2013-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of…

  17. Adaptation to Emotional Conflict: Evidence from a Novel Face Emotion Paradigm

    PubMed Central

    Clayson, Peter E.; Larson, Michael J.

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words. PMID:24073278

  18. Adaptation to emotional conflict: evidence from a novel face emotion paradigm.

    PubMed

    Clayson, Peter E; Larson, Michael J

    2013-01-01

    The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words.

  19. Effects of Acute Alcohol Consumption on the Processing of Emotion in Faces: Implications for Understanding Alcohol-Related Aggression

    PubMed Central

    Attwood, Angela S.; Munafò, Marcus R.

    2016-01-01

    The negative consequences of chronic alcohol abuse are well known, but heavy episodic consumption ("binge drinking") is also associated with significant personal and societal harms. Aggressive tendencies are increased after alcohol but the mechanisms underlying these changes are not fully understood. While effects on behavioural control are likely to be important, other effects may be involved given the widespread action of alcohol. Altered processing of social signals is associated with changes in social behaviours, including aggression, but until recently there has been little research investigating the effects of acute alcohol consumption on these outcomes. Recent work investigating the effects of acute alcohol on emotional face processing has suggested reduced sensitivity to submissive signals (sad faces) and increased perceptual bias towards provocative signals (angry faces) after alcohol consumption, which may play a role in alcohol-related aggression. Here we discuss a putative mechanism that may explain how alcohol consumption influences emotional processing and subsequent aggressive responding, via disruption of OFC-amygdala connectivity. While the importance of emotional processing on social behaviours is well established, research into acute alcohol consumption and emotional processing is still in its infancy. Further research is needed and we outline a research agenda to address gaps in the literature. PMID:24920135

  20. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    PubMed

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other non-target part. Results indicate that the recognition of happy and anger expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and anger expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate, but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by

  1. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces.

    PubMed

    Voelkle, Manuel C; Ebner, Natalie C; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

    (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20-31 years; middle-aged: 44-55 years; older adults: 70-81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when in a positive mood and less likely to do so when in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect.

  2. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    PubMed

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences.

  3. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    PubMed

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, peak-intensity expressions of athletes at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds while their eye movements were recorded. The results revealed that the isolated-body and face-body congruent images were better recognized than the isolated-face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning body targets and losing body targets were influenced by the invisible peak facial expression primes, which indicated the

  4. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    PubMed Central

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, peak-intensity expressions of athletes at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds while their eye movements were recorded. The results revealed that the isolated-body and face-body congruent images were better recognized than the isolated-face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction times to both winning body targets and losing body targets were influenced by the invisible peak facial expression primes, which indicated the

  5. An fMRI study of emotional face processing in adolescent major depression.

    PubMed

    Hall, Leah M J; Klimes-Dougan, Bonnie; Hunt, Ruskin H; Thomas, Kathleen M; Houri, Alaa; Noack, Emily; Mueller, Bryon A; Lim, Kelvin O; Cullen, Kathryn R

    2014-10-01

Major depressive disorder (MDD) often begins during adolescence when the brain is still maturing. To better understand the neurobiological underpinnings of MDD early in development, this study examined brain function in response to emotional faces in adolescents with MDD and healthy (HC) adolescents using functional magnetic resonance imaging (fMRI). Thirty-two unmedicated adolescents with MDD and 23 healthy age- and gender-matched controls completed an fMRI task viewing happy and fearful faces. Fronto-limbic regions of interest (ROI; bilateral amygdala, insula, subgenual and rostral anterior cingulate cortices) and whole-brain analyses were conducted to examine between-group differences in brain function. ROI analyses revealed that patients had greater bilateral amygdala activity than HC in response to viewing fearful versus happy faces, which remained significant when controlling for comorbid anxiety. Whole-brain analyses revealed that adolescents with MDD had lower activation compared to HC in a right hemisphere cluster comprised of the insula, superior/middle temporal gyrus, and Heschl's gyrus when viewing fearful faces. Brain activity in the subgenual anterior cingulate cortex was inversely correlated with depression severity. Limitations include a cross-sectional design with a modest sample size and use of a limited range of emotional stimuli. Results replicate previous studies that suggest emotion processing in adolescent MDD is associated with abnormalities within fronto-limbic brain regions. Findings implicate elevated amygdalar arousal to negative stimuli in adolescents with depression and provide new evidence for a deficit in functioning of the saliency network, which may be a future target for early intervention and MDD treatment.

  6. Orienting asymmetries and physiological reactivity in dogs' response to human emotional faces.

    PubMed

    Siniscalchi, Marcello; d'Ingeo, Serenella; Quaranta, Angelo

    2018-06-19

Recent scientific literature shows that emotional cues conveyed by human vocalizations and odours are processed asymmetrically by the canine brain. In the present study, dogs engaged in feeding behaviour were suddenly presented with 2-D stimuli depicting human faces expressing Ekman's six basic emotions (anger, fear, happiness, sadness, surprise, and disgust) plus a neutral expression, shown simultaneously in the left and right visual hemifields. A bias to turn the head towards the left (right hemisphere) rather than the right side was observed for human faces expressing anger, fear, and happiness, whereas the opposite bias (left hemisphere) was observed for human faces expressing surprise. Furthermore, dogs displayed higher behavioural and cardiac activity to pictures of human faces expressing a clear state of emotional arousal. Overall, the results demonstrate that dogs are sensitive to emotional cues conveyed by human faces, supporting the existence of an asymmetrical emotional modulation of the canine brain in processing basic human emotions.

  7. Efficacy of identifying neural components in the face and emotion processing system in schizophrenia using a dynamic functional localizer.

    PubMed

    Arnold, Aiden E G F; Iaria, Giuseppe; Goghari, Vina M

    2016-02-28

Schizophrenia is associated with deficits in face perception and emotion recognition. Despite consistent behavioural results, the neural mechanisms underlying these cognitive abilities have been difficult to isolate, in part due to differences between studies in the neuroimaging methods used to identify regions in the face processing system. Given this problem, we aimed to validate a recently developed fMRI-based dynamic functional localizer task for use in studies of psychiatric populations, and specifically schizophrenia. Previously, this functional localizer successfully identified each of the core face processing regions (i.e. fusiform face area, occipital face area, superior temporal sulcus) and regions within an extended system (e.g. amygdala) in healthy individuals. In this study, we tested the functional localizer's success rate in 27 schizophrenia patients and 24 community controls. Overall, the core face processing regions were localized equally well in both the schizophrenia and control groups. Additionally, the amygdala, a candidate brain region from the extended system, was identified in nearly half of the participants in both groups. These results indicate the effectiveness of a dynamic functional localizer at identifying regions of interest associated with face perception and emotion recognition in schizophrenia. The use of dynamic functional localizers may help standardize the investigation of the face and emotion processing system in this and other clinical populations.

  8. A note on age differences in mood-congruent vs. mood-incongruent emotion processing in faces

    PubMed Central

    Voelkle, Manuel C.; Ebner, Natalie C.; Lindenberger, Ulman; Riediger, Michaela

    2014-01-01

This article addresses four interrelated research questions: (1) Does experienced mood affect emotion perception in faces, and is this perception mood-congruent or mood-incongruent? (2) Are there age-group differences in the interplay between experienced mood and emotion perception? (3) Does emotion perception in faces change as a function of the temporal sequence of study sessions and stimuli presentation, and (4) does emotion perception in faces serve a mood-regulatory function? One hundred fifty-four adults of three different age groups (younger: 20–31 years; middle-aged: 44–55 years; older adults: 70–81 years) were asked to provide multidimensional emotion ratings of a total of 1026 face pictures of younger, middle-aged, and older men and women, each displaying six different prototypical (primary) emotional expressions. By analyzing the likelihood of ascribing an additional emotional expression to a face whose primary emotion had been correctly recognized, the multidimensional rating approach permits the study of emotion perception while controlling for emotion recognition. Following up on previous research on mood responses to recurring unpleasant situations using the same dataset (Voelkle et al., 2013), crossed random effects analyses supported a mood-congruent relationship between experienced mood and perceived emotions in faces. In particular, older adults were more likely to perceive happiness in faces when in a positive mood and less likely to do so when in a negative mood. This did not apply to younger adults. Temporal sequence of study sessions and stimuli presentation had a strong effect on the likelihood of ascribing an additional emotional expression. In contrast to previous findings, however, there was neither evidence for a change from mood-congruent to mood-incongruent responses over time nor evidence for a mood-regulatory effect. PMID:25018740

  9. Looking to the eyes influences the processing of emotion on face-sensitive event-related potentials in 7-month-old infants.

    PubMed

    Vanderwert, Ross E; Westerlund, Alissa; Montoya, Lina; McCormick, Sarah A; Miguel, Helga O; Nelson, Charles A

    2015-10-01

Previous studies in infants have shown that face-sensitive components of the ongoing electroencephalogram (the event-related potential, or ERP) are larger in amplitude to negative emotions (e.g., fear, anger) than to positive emotions (e.g., happiness). However, it is still unclear whether negative emotions linked with faces, or negative emotions alone, contribute to these amplitude differences. We simultaneously recorded infant looking behaviors (via eye-tracking) and face-sensitive ERPs while 7-month-old infants viewed human faces or animals displaying happy, fearful, or angry expressions. We observed that the amplitude of the N290 was greater (i.e., more negative) to angry animals compared to happy or fearful animals; no such differences were obtained for human faces. The eye-tracking data highlighted the importance of the eye region in processing emotional human faces. Infants who spent more time looking at the eye region of human faces showing fearful or angry expressions had greater N290 or P400 amplitudes, respectively.

  10. Processing of task-irrelevant emotional faces impacted by implicit sequence learning.

    PubMed

    Peng, Ming; Cai, Mengfei; Zhou, Renlai

    2015-12-02

Attentional load may be increased by task-relevant attention, such as task difficulty, or by task-irrelevant attention, such as an unexpected light spot on the screen. Several studies have focused on the influence of task-relevant attentional load on task-irrelevant emotion processing. In this study, we used event-related potentials to examine the impact of task-irrelevant attentional load on task-irrelevant expression processing. Eighteen participants identified the color of a word (i.e. the color Stroop task) while a picture of a fearful or a neutral face was shown in the background. Task-irrelevant attentional load was increased by regularly presented congruent trials (congruence between the color and the meaning of the word) in the regular condition, because implicit sequence learning was induced. We compared task-irrelevant expression processing between the regular condition and the random condition (in which congruent and incongruent trials were presented randomly). Behaviorally, reaction times for fearful faces were faster than for neutral faces in the random condition, whereas no significant difference was found in the regular condition. The event-related potential results indicated enhanced amplitudes of the P2, N2, and P3 components for fearful relative to neutral faces in the random condition. In comparison, only the P2 differed significantly between the two types of expressions in the regular condition. The study showed that attentional load increased by implicit sequence learning influenced the late processing of task-irrelevant expressions.

  11. State-dependent alteration in face emotion recognition in depression.

    PubMed

    Anderson, Ian M; Shippen, Clare; Juhasz, Gabriella; Chase, Diana; Thomas, Emma; Downey, Darragh; Toth, Zoltan G; Lloyd-Williams, Kathryn; Elliott, Rebecca; Deakin, J F William

    2011-04-01

Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, in whom such biases may contribute to vulnerability to relapse. We aimed to compare accuracy, discrimination, and bias in face emotion recognition in those with current and remitted depression. The sample comprised a control group (n = 101), a currently depressed group (n = 30), and a remitted depression group (n = 99). Participants provided valid data on a computerised face emotion recognition task administered after standardised assessment of diagnosis and mood symptoms. In the control group, women were more accurate in recognising emotions than men, owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear, and sadness, but there was no significant emotion × group interaction; a similar pattern tended to be seen for happiness, although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants performing similarly to the control group. Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.

  12. The changing face of emotion: age-related patterns of amygdala activation to salient faces.

    PubMed

    Todd, Rebecca M; Evans, Jennifer W; Morris, Drew; Lewis, Marc D; Taylor, Margot J

    2011-01-01

    The present study investigated age-related differences in the amygdala and other nodes of face-processing networks in response to facial expression and familiarity. fMRI data were analyzed from 31 children (3.5-8.5 years) and 14 young adults (18-33 years) who viewed pictures of familiar (mothers) and unfamiliar emotional faces. Results showed that amygdala activation for faces over a scrambled image baseline increased with age. Children, but not adults, showed greater amygdala activation to happy than angry faces; in addition, amygdala activation for angry faces increased with age. In keeping with growing evidence of a positivity bias in young children, our data suggest that children find happy faces to be more salient or meaningful than angry faces. Both children and adults showed preferential activation to mothers' over strangers' faces in a region of rostral anterior cingulate cortex associated with self-evaluation, suggesting that some nodes in frontal evaluative networks are active early in development. This study presents novel data on neural correlates of face processing in childhood and indicates that preferential amygdala activation for emotional expressions changes with age.

  13. Alterations in neural processing of emotional faces in adolescent anorexia nervosa patients - an event-related potential study.

    PubMed

    Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Bartling, Jürgen; Schulte-Körne, Gerd; Dieler, Alica C

    2016-09-01

The present study explored the neurophysiological correlates of the perception and recognition of emotional facial expressions in adolescent anorexia nervosa (AN) patients using event-related potentials (ERPs). We included 20 adolescent girls with AN and 24 healthy girls, and recorded ERPs during a passive viewing task and three active tasks requiring processing of emotional faces at varying processing depths; one of the tasks also assessed emotion recognition abilities behaviourally. Despite the absence of behavioural differences, we found that across all tasks AN patients exhibited a less pronounced early posterior negativity (EPN) in response to all facial expressions compared to controls. The EPN is an ERP component reflecting an automatic, perceptual processing stage which is modulated by the intrinsic salience of a stimulus. Hence, the less pronounced EPN in anorexic girls suggests that they might perceive other people's faces as less intrinsically relevant, i.e. as less "important", than do healthy girls.

  14. Dissociable patterns of medial prefrontal and amygdala activity to face identity versus emotion in bipolar disorder.

    PubMed

    Keener, M T; Fournier, J C; Mullin, B C; Kronhaus, D; Perlman, S B; LaBarbara, E; Almeida, J C; Phillips, M L

    2012-09-01

Individuals with bipolar disorder demonstrate abnormal social function. Neuroimaging studies in bipolar disorder have shown functional abnormalities in the neural circuitry supporting face emotion processing, but have not examined face identity processing, a key component of social function. We aimed to elucidate functional abnormalities in the neural circuitry supporting face emotion and face identity processing in bipolar disorder. Twenty-seven currently euthymic individuals with bipolar disorder type I and 27 healthy controls participated in an implicit face processing, block-design paradigm. Participants labeled color flashes that were superimposed on dynamically changing background faces comprising morphs either from neutral to prototypical emotion (happy, sad, angry, and fearful) or from one identity to another identity depicting a neutral face. Whole-brain and amygdala region-of-interest (ROI) activities were compared between groups. There was no significant between-group difference when looking across both emerging face emotion and face identity. During processing of all emerging emotions, euthymic individuals with bipolar disorder showed significantly greater amygdala activity. During face identity processing, and also during happy face processing, euthymic individuals with bipolar disorder showed significantly greater amygdala and medial prefrontal cortical activity compared with controls. This is the first study to examine the neural circuitry supporting face identity and face emotion processing in bipolar disorder. Our findings of abnormally elevated activity in the amygdala and medial prefrontal cortex (mPFC) during face identity and happy face emotion processing suggest functional abnormalities in key regions previously implicated in social processing. This may be of future importance for examining the abnormal self-related processing, grandiosity, and social dysfunction seen in bipolar disorder.

  15. On the Automaticity of Emotion Processing in Words and Faces: Event-Related Brain Potentials Evidence from a Superficial Task

    ERIC Educational Resources Information Center

    Rellecke, Julian; Palazova, Marina; Sommer, Werner; Schacht, Annekathrin

    2011-01-01

    The degree to which emotional aspects of stimuli are processed automatically is controversial. Here, we assessed the automatic elicitation of emotion-related brain potentials (ERPs) to positive, negative, and neutral words and facial expressions in an easy and superficial face-word discrimination task, for which the emotional valence was…

  16. Detecting and Categorizing Fleeting Emotions in Faces

    PubMed Central

    Sweeny, Timothy D.; Suzuki, Satoru; Grabowecky, Marcia; Paller, Ken A.

    2013-01-01

    Expressions of emotion are often brief, providing only fleeting images from which to base important social judgments. We sought to characterize the sensitivity and mechanisms of emotion detection and expression categorization when exposure to faces is very brief, and to determine whether these processes dissociate. Observers viewed 2 backward-masked facial expressions in quick succession, 1 neutral and the other emotional (happy, fearful, or angry), in a 2-interval forced-choice task. On each trial, observers attempted to detect the emotional expression (emotion detection) and to classify the expression (expression categorization). Above-chance emotion detection was possible with extremely brief exposures of 10 ms and was most accurate for happy expressions. We compared categorization among expressions using a d′ analysis, and found that categorization was usually above chance for angry versus happy and fearful versus happy, but consistently poor for fearful versus angry expressions. Fearful versus angry categorization was poor even when only negative emotions (fearful, angry, or disgusted) were used, suggesting that this categorization is poor independent of decision context. Inverting faces impaired angry versus happy categorization, but not emotion detection, suggesting that information from facial features is used differently for emotion detection and expression categorizations. Emotion detection often occurred without expression categorization, and expression categorization sometimes occurred without emotion detection. These results are consistent with the notion that emotion detection and expression categorization involve separate mechanisms. PMID:22866885
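The d′ analysis in the record above has a standard signal-detection form: d′ = z(hit rate) − z(false-alarm rate), where z is the inverse of the standard normal CDF. A minimal sketch in Python, using hypothetical trial counts rather than data from the study, with a log-linear correction so rates of 0 or 1 do not produce infinite z-scores:

```python
# Illustrative sketch (not the study's analysis code): signal-detection
# sensitivity d' = z(hit rate) - z(false-alarm rate).
from statistics import NormalDist


def d_prime(hits: int, misses: int, false_alarms: int, correct_rejections: int) -> float:
    """Compute d' with a log-linear correction (add 0.5 to each cell count)
    so extreme proportions of 0 or 1 stay finite."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard normal CDF
    return z(hit_rate) - z(fa_rate)


# Hypothetical counts: 40 hits / 10 misses, 10 false alarms / 40 correct rejections.
print(round(d_prime(40, 10, 10, 40), 2))  # ≈ 1.64
```

Larger d′ means better discrimination independent of response bias, which is why the authors use it to compare, e.g., fearful-versus-angry against angry-versus-happy categorization.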

  17. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    PubMed

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter has been assessed in paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar.

  18. Changes in the neural correlates of implicit emotional face processing during antidepressant treatment in major depressive disorder.

    PubMed

    Victor, Teresa A; Furey, Maura L; Fromm, Stephen J; Öhman, Arne; Drevets, Wayne C

    2013-11-01

    An emerging hypothesis regarding the mechanisms underlying antidepressant pharmacotherapy suggests that these agents benefit depressed patients by reversing negative emotional processing biases (Harmer, 2008). Neuropsychological indices and functional neuroimaging measures of the amygdala response show that antidepressant drugs shift implicit and explicit processing biases away from the negative valence and toward the positive valence. However, few studies have explored such biases in regions extensively connected with the amygdala, such as the pregenual anterior cingulate cortex (pgACC) area, where pre-treatment activity consistently has predicted clinical outcome during antidepressant treatment. We used functional magnetic resonance imaging (fMRI) to investigate changes in haemodynamic response patterns to positive vs. negative stimuli in patients with major depressive disorder (MDD) under antidepressant treatment. Participants with MDD (n = 10) underwent fMRI before and after 8 wk sertraline treatment; healthy controls (n = 10) were imaged across an equivalent interval. A backward masking task was used to elicit non-conscious neural responses to sad, happy and neutral face expressions. Haemodynamic responses to emotional face stimuli were compared between conditions and groups in the pgACC. The response to masked-sad vs. masked-happy faces (SN-HN) in pgACC in the depressed subjects was higher in the pre-treatment condition than in the post-treatment condition and this difference was significantly greater than the corresponding change across time in the controls. The treatment-associated difference was attributable to an attenuated response to sad faces and an enhanced response to happy faces. Pre-treatment pgACC responses to SN-HN correlated positively with clinical improvement during treatment. The pgACC participates with the amygdala in processing the salience of emotional stimuli. Treatment-associated functional changes in this limbic network may influence

  19. Distant influences of amygdala lesion on visual cortical activation during emotional face processing.

    PubMed

    Vuilleumier, Patrik; Richardson, Mark P; Armony, Jorge L; Driver, Jon; Dolan, Raymond J

    2004-11-01

Emotional visual stimuli evoke enhanced responses in the visual cortex. To test whether this reflects modulatory influences from the amygdala on sensory processing, we used event-related functional magnetic resonance imaging (fMRI) in human patients with medial temporal lobe sclerosis. Twenty-six patients with lesions in the amygdala, the hippocampus or both, plus 13 matched healthy controls, were shown pictures of fearful or neutral faces in task-relevant or task-irrelevant positions on the display. All subjects showed increased fusiform cortex activation when the faces were in task-relevant positions. Both healthy individuals and those with hippocampal damage showed increased activation in the fusiform and occipital cortex when they were shown fearful faces, but this was not the case for individuals with damage to the amygdala, even though visual areas were structurally intact. The distant influence of the amygdala was also evidenced by the parametric relationship between amygdala damage and the level of emotional activation in the fusiform cortex. Our data show that combining the fMRI and lesion approaches can help reveal the source of functional modulatory influences between distant but interconnected brain regions.

  20. Different brain activity in response to emotional faces alone and augmented by contextual information.

    PubMed

    Lee, Kyung Hwa; Siegle, Greg J

    2014-11-01

This study examined the extent to which the neural reactivity associated with emotional face stimuli differs from that associated with more ecologically valid, contextually augmented stimuli. Participants were scanned while they viewed contextually rich pictures depicting both emotional faces and context, and pictures of emotional faces presented alone. Emotional faces alone were more strongly associated with brain activity in paralimbic and social information processing regions, whereas emotional faces augmented by context were associated with increased and sustained activity in regions potentially representing increased complexity and subjective emotional experience. Furthermore, context effects were modulated by emotional intensity and valence. These findings suggest that the cortical elaboration apparent with contextually augmented stimuli may be missed in studies of emotional faces alone, whereas emotional faces may more selectively recruit limbic reactivity.

  1. The changing face of emotion: age-related patterns of amygdala activation to salient faces

    PubMed Central

    Evans, Jennifer W.; Morris, Drew; Lewis, Marc D.; Taylor, Margot J.

    2011-01-01

    The present study investigated age-related differences in the amygdala and other nodes of face-processing networks in response to facial expression and familiarity. fMRI data were analyzed from 31 children (3.5–8.5 years) and 14 young adults (18–33 years) who viewed pictures of familiar (mothers) and unfamiliar emotional faces. Results showed that amygdala activation for faces over a scrambled image baseline increased with age. Children, but not adults, showed greater amygdala activation to happy than angry faces; in addition, amygdala activation for angry faces increased with age. In keeping with growing evidence of a positivity bias in young children, our data suggest that children find happy faces to be more salient or meaningful than angry faces. Both children and adults showed preferential activation to mothers’ over strangers’ faces in a region of rostral anterior cingulate cortex associated with self-evaluation, suggesting that some nodes in frontal evaluative networks are active early in development. This study presents novel data on neural correlates of face processing in childhood and indicates that preferential amygdala activation for emotional expressions changes with age. PMID:20194512

  2. Face-memory and emotion: associations with major depression in children and adolescents.

    PubMed

    Pine, Daniel S; Lissek, Shmuel; Klein, Rachel G; Mannuzza, Salvatore; Moulton, John L; Guardino, Mary; Woldehawariat, Girma

    2004-10-01

    Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and memory performance in offspring. Subjects were 152 offspring (ages 9-19) of adults with either MDD, anxiety disorders, both MDD and anxiety, or no disorder. Parents and offspring were assessed for mental disorders. Collection of face-memory data was blind to offspring and parent diagnosis. A computerized task was developed that required rating of facial photographs depicting 'happy,' 'fearful,' or 'angry' emotions followed by a memory recall test. Recall accuracy was examined as a function of face-emotion type. Age and gender independently predicted memory, with better recall in older and female subjects. Controlling for age and gender, offspring with a history of MDD (n = 19) demonstrated significant deficits in memory selectively for fearful faces, but not happy or angry faces. Parental MDD was not associated with face-memory accuracy. This study found an association between MDD in childhood or adolescence and perturbed encoding of fearful faces. MDD in young individuals may predispose to subtle anomalies in a neural circuit encompassing the amygdala, a brain region implicated in the processing of fearful facial expressions. These findings suggest that brain imaging studies using similar face-emotion paradigms should test whether deficits in processing of fearful faces relate to amygdala dysfunction in children and adolescents with MDD.

  3. Emotion-independent face recognition

    NASA Astrophysics Data System (ADS)

    De Silva, Liyanage C.; Esther, Kho G. P.

    2000-12-01

    Current face recognition techniques tend to work well when recognizing faces under small variations in lighting, facial expression and pose, but deteriorate under more extreme conditions. In this paper, a face recognition system to recognize faces of known individuals, despite variations in facial expression due to different emotions, is developed. The eigenface approach is used for feature extraction. Classification methods include Euclidean distance, back propagation neural network and generalized regression neural network. These methods yield 100% recognition accuracy when the training database is representative, containing one image representing the peak expression for each emotion of each person, in addition to the neutral expression. To achieve this accuracy, the Euclidean distance comparison and the neural network training must use all feature vectors of the training set. These results are obtained for a face database consisting of only four persons.
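
    The eigenface-plus-Euclidean-distance pipeline described above can be sketched as follows. This is a minimal illustration on synthetic data: the image size, number of components, and random "faces" are assumptions for the sketch, not the paper's actual four-person database. Features are extracted by PCA on mean-centred training faces and a probe is matched by nearest-neighbour Euclidean distance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the training database: 4 persons x 7 expressions,
# each face flattened to a 32x32 = 1024-pixel vector (synthetic data).
n_persons, n_expr, n_pix = 4, 7, 1024
labels = np.repeat(np.arange(n_persons), n_expr)
base = rng.normal(size=(n_persons, 1, n_pix)).repeat(n_expr, axis=1)
faces = (base + 0.1 * rng.normal(size=(n_persons, n_expr, n_pix))).reshape(-1, n_pix)

# Eigenface feature extraction: PCA on the mean-centred training faces via SVD.
mean_face = faces.mean(axis=0)
centred = faces - mean_face
_, _, vt = np.linalg.svd(centred, full_matrices=False)
eigenfaces = vt[:10]                       # keep the 10 leading components
features = centred @ eigenfaces.T          # project every training face

def recognise(probe):
    """Nearest-neighbour (Euclidean) match against all training feature vectors."""
    w = (probe - mean_face) @ eigenfaces.T
    dists = np.linalg.norm(features - w, axis=1)
    return labels[np.argmin(dists)]

# A noisy new expression of person 2 should still match person 2.
probe = faces[2 * n_expr] + 0.05 * rng.normal(size=n_pix)
print(recognise(probe))
```

    Note that, as in the abstract, the comparison uses every feature vector of the training set rather than one prototype per person.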

  4. Face emotion recognition is related to individual differences in psychosis-proneness.

    PubMed

    Germine, L T; Hooker, C I

    2011-05-01

    Deficits in face emotion recognition (FER) in schizophrenia are well documented, and have been proposed as a potential intermediate phenotype for schizophrenia liability. However, research on the relationship between psychosis vulnerability and FER has mixed findings and methodological limitations. Moreover, no study has yet characterized the relationship between FER ability and level of psychosis-proneness. If FER ability varies continuously with psychosis-proneness, this suggests a relationship between FER and polygenic risk factors. We tested two large internet samples to see whether psychometric psychosis-proneness, as measured by the Schizotypal Personality Questionnaire-Brief (SPQ-B), is related to differences in face emotion identification and discrimination or other face processing abilities. Experiment 1 (n=2332) showed that psychosis-proneness predicts face emotion identification ability but not face gender identification ability. Experiment 2 (n=1514) demonstrated that psychosis-proneness also predicts performance on face emotion but not face identity discrimination. The tasks in Experiment 2 used identical stimuli and task parameters, differing only in emotion/identity judgment. Notably, the relationships demonstrated in Experiments 1 and 2 persisted even when individuals with the highest psychosis-proneness levels (the putative high-risk group) were excluded from analysis. Our data suggest that FER ability is related to individual differences in psychosis-like characteristics in the normal population, and that these differences cannot be accounted for by differences in face processing and/or visual perception. Our results suggest that FER may provide a useful candidate intermediate phenotype.

  5. Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.

    PubMed

    Ruys, Kirsten I; Stapel, Diederik A

    2008-06-01

    Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.

  6. Amygdala Hyperactivation During Face Emotion Processing in Unaffected Youth at Risk for Bipolar Disorder

    ERIC Educational Resources Information Center

    Olsavsky, Aviva K.; Brotman, Melissa A.; Rutenberg, Julia G.; Muhrer, Eli J.; Deveney, Christen M.; Fromm, Stephen J.; Towbin, Kenneth; Pine, Daniel S.; Leibenluft, Ellen

    2012-01-01

    Objective: Youth at familial risk for bipolar disorder (BD) show deficits in face emotion processing, but the neural correlates of these deficits have not been examined. This preliminary study tests the hypothesis that, relative to healthy comparison (HC) subjects, both BD subjects and youth at risk for BD (i.e., those with a first-degree BD…

  7. Effects of touch on emotional face processing: A study of event-related potentials, facial EMG and cardiac activity.

    PubMed

    Spapé, M M; Harjunen, Ville; Ravaja, N

    2017-03-01

    Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity. Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional stimuli, suggesting altered affective stimulus processing. As, however, these studies approached emotion from a purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more general feature such as salience. Here, we investigated how simple tactile primes modulate event-related potentials (ERPs), facial EMG and cardiac response to pictures of facial expressions of emotion. All measures replicated known effects of emotional face processing: Disgust and fear modulated early ERPs, anger increased the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch may additively affect general stimulus processing, but it does not bias or modulate immediate affective evaluation. Copyright © 2017. Published by Elsevier B.V.

  8. Face-Memory and Emotion: Associations with Major Depression in Children and Adolescents

    ERIC Educational Resources Information Center

    Pine, Daniel S.; Lissek, Shmuel; Klein, Rachel G.; Mannuzza, Salvatore; Moulton, John L., III; Guardino, Mary; Woldehawariat, Girma

    2004-01-01

    Background: Studies in adults with major depressive disorder (MDD) document abnormalities in both memory and face-emotion processing. The current study used a novel face-memory task to test the hypothesis that adolescent MDD is associated with a deficit in memory for face-emotions. The study also examines the relationship between parental MDD and…

  9. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect.

    PubMed

    Madill, Mark; Murray, Janice E

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64-90 years) and 25 young adults (19-29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63-84 years) and 30 young adults (18-30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance was not influenced by the task-irrelevant affective images in either age group. In keeping with the socioemotional selectivity theory, these findings suggest that older adults preferentially

  10. Afterimage induced neural activity during emotional face perception.

    PubMed

    Cheal, Jenna L; Heisz, Jennifer J; Walsh, Jennifer A; Shedden, Judith M; Rutherford, M D

    2014-02-26

    The N170 response differs when positive versus negative facial expressions are viewed. This neural response could be associated with the perception of emotions, or some feature of the stimulus. We used an aftereffect paradigm to clarify this. Consistent with previous reports of emotional aftereffects, a neutral face was more likely to be described as happy following a sad face adaptation, and more likely to be described as sad following a happy face adaptation. In addition, similar to previous observations with actual emotional faces, we found differences in the latency of the N170 elicited by the neutral face following sad versus happy face adaptation, demonstrating that the emotion-specific effect on the N170 emerges even when emotion expressions are perceptually different but physically identical. The re-entry of emotional information from other brain regions may be driving the emotional aftereffects and the N170 latency differences. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Social categories shape the neural representation of emotion: evidence from a visual face adaptation task

    PubMed Central

    Otten, Marte; Banaji, Mahzarin R.

    2012-01-01

    A number of recent behavioral studies have shown that emotional expressions are differently perceived depending on the race of a face, and that perception of race cues is influenced by emotional expressions. However, neural processes related to the perception of invariant cues that indicate the identity of a face (such as race) are often described to proceed independently of processes related to the perception of cues that can vary over time (such as emotion). Using a visual face adaptation paradigm, we tested whether these behavioral interactions between emotion and race also reflect interdependent neural representation of emotion and race. We compared visual emotion aftereffects when the adapting face and ambiguous test face differed in race or not. Emotion aftereffects were much smaller in different race (DR) trials than same race (SR) trials, indicating that the neural representation of a facial expression is significantly different depending on whether the emotional face is black or white. It thus seems that invariable cues such as race interact with variable face cues such as emotion not just at a response level, but also at the level of perception and neural representation. PMID:22403531

  12. Association between autistic traits and emotion adaptation to partially occluded faces.

    PubMed

    Luo, Chengwen; Burns, Edwin; Xu, Hong

    2017-04-01

    Prolonged exposure to a happy face makes subsequently presented faces appear sadder: the facial emotion aftereffect (FEA). People with autism spectrum disorders and their relatives have diminished holistic perception of faces. Levels of autism can be measured continuously in the general population by autistic traits using the autism-quotient (AQ). Prior work has not found any association between AQ and FEA in adults, possibly due to non-holistic processing strategies employed by those at the higher end of the spectrum. In the present study, we tested whether AQ was associated with FEA to partially occluded faces. We hypothesized that inferring emotion from such faces would require participants to process their viewable parts as a gestalt percept, thus we anticipated this ability would diminish as autistic traits increased. In Experiment 1, we partially occluded the adapting faces with aligned or misaligned opaque bars. Both conditions produced significant FEAs, with aftereffects and AQ negatively correlated. In Experiment 2, we adapted participants to obscured faces flickering in luminance, and manipulated the facilitation of holistic perception by varying the synchronization of this flickering. We found significant FEAs in all conditions, but the association with AQ was abolished. In Experiment 3, we showed that the association between AQ and FEA in the occluded conditions in Experiment 1 was not due to the recognizability or perceived emotional intensity of our adaptors, although the overall FEAs were linked to emotional intensity. We propose that increasing autistic traits are associated with diminishing abilities in perceiving emotional faces as a gestalt percept. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    PubMed

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  14. Parametric modulation of neural activity during face emotion processing in unaffected youth at familial risk for bipolar disorder.

    PubMed

    Brotman, Melissa A; Deveney, Christen M; Thomas, Laura A; Hinton, Kendra E; Yi, Jennifer Y; Pine, Daniel S; Leibenluft, Ellen

    2014-11-01

    Both patients with pediatric bipolar disorder (BD) and unaffected youth at familial risk (AR) for the illness show impairments in face emotion labeling. Few studies, however, have examined brain regions engaged in AR youth when processing emotional faces. Moreover, studies have yet to explore neural responsiveness to subtle changes in face emotion in AR youth. Sixty-four unrelated youth, including 20 patients with BD, 15 unaffected AR youth, and 29 healthy comparisons (HC), completed functional magnetic resonance imaging. Neutral faces were morphed with angry or happy faces in 25% intervals. In specific phases of the task, youth alternatively made explicit (hostility) or implicit (nose width) ratings of the faces. The slope of blood oxygenation level-dependent (BOLD) activity was calculated across neutral to angry and neutral to happy face stimuli. Behaviorally, both subjects with BD (p ≤ 0.001) and AR youth (p ≤ 0.05) rated faces as less hostile relative to HC. Consistent with this, in response to increasing anger on the face, patients with BD and AR youth showed decreased modulation in the amygdala and inferior frontal gyrus (IFG; BA 46) compared to HC (all p ≤ 0.05). Amygdala dysfunction was present across both implicit and explicit rating conditions, but IFG modulation deficits were specific to the explicit condition. With increasing happiness, AR youth showed aberrant modulation in the IFG, which was also sensitive to task demands (all p ≤ 0.05). Decreased amygdala and IFG modulation in patients with BD and AR youth may be pathophysiological risk markers for BD, and may underlie the social cognition and face emotion labeling deficits observed in BD and AR youth. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
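
    The slope measure described above (activity modulation across morph intensity) amounts to a least-squares fit of BOLD estimates against morph level. The sketch below illustrates the computation with invented numbers; the BOLD values and group labels are hypothetical and do not come from the study.

```python
import numpy as np

# Hypothetical per-subject BOLD estimates at each morph step from
# neutral (0%) to angry (100%), in the study's 25% intervals.
morph_levels = np.array([0, 25, 50, 75, 100], dtype=float)
bold_hc = np.array([0.10, 0.22, 0.35, 0.44, 0.58])  # invented healthy-comparison values
bold_ar = np.array([0.12, 0.15, 0.18, 0.20, 0.21])  # invented at-risk values

def modulation_slope(levels, bold):
    """Least-squares slope of BOLD activity across face-emotion intensity."""
    slope, _intercept = np.polyfit(levels, bold, deg=1)
    return slope

# A flatter slope in the at-risk sketch mirrors the reported blunted modulation.
print(modulation_slope(morph_levels, bold_hc) > modulation_slope(morph_levels, bold_ar))
```

    In the study such slopes would be computed per voxel or region and compared across groups; here a single pair of hypothetical series stands in for that contrast.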

  15. The influence of variations in eating disorder-related symptoms on processing of emotional faces in a non-clinical female sample: An eye-tracking study.

    PubMed

    Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan

    2016-06-30

    This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Familiarity and face emotion recognition in patients with schizophrenia.

    PubMed

    Lahera, Guillermo; Herrera, Sara; Fernández, Cristina; Bardón, Marta; de los Ángeles, Victoria; Fernández-Liria, Alberto

    2014-01-01

    To assess emotion recognition of familiar and unknown faces in a sample of patients with schizophrenia and healthy controls. Face emotion recognition of 18 outpatients diagnosed with schizophrenia (DSM-IV-TR) and 18 healthy volunteers was assessed with two Emotion Recognition Tasks using familiar faces and unknown faces. Each subject was accompanied by 4 familiar people (parents, siblings or friends), who were photographed expressing the 6 basic Ekman emotions. Face emotion recognition in familiar faces was assessed with this ad hoc instrument. In each case, the patient scored (from 1 to 10) the subjective familiarity and affective valence corresponding to each person. Patients with schizophrenia not only showed a deficit in the recognition of emotions on unknown faces (p=.01), but they also showed an even more pronounced deficit on familiar faces (p=.001). Controls had a similar success rate in the unknown faces task (mean: 18 +/- 2.2) and the familiar face task (mean: 17.4 +/- 3). However, patients had a significantly lower score in the familiar faces task (mean: 13.2 +/- 3.8) than in the unknown faces task (mean: 16 +/- 2.4; p<.05). In both tests, the highest number of errors was with emotions of anger and fear. Subjectively, the patient group showed a lower level of familiarity and emotional valence to their respective relatives (p<.01). The sense of familiarity may be a factor involved in face emotion recognition and it may be disturbed in schizophrenia. © 2013.

  17. The cerebral correlates of subliminal emotions: an electroencephalographic study with emotional hybrid faces.

    PubMed

    Prete, Giulia; Capotosto, Paolo; Zappasodi, Filippo; Laeng, Bruno; Tommasi, Luca

    2015-12-01

    In a high-resolution electroencephalographic study, participants evaluated the friendliness level of upright and inverted 'hybrid faces', i.e. facial photos containing a subliminal emotional core in the low spatial frequencies (< 6 cycles/image), superimposed on a neutral expression in the rest of the spatial frequencies. Upright happy and angry faces were judged as more friendly or less friendly than neutral faces, respectively. We observed the time course of cerebral correlates of these stimuli with event-related potentials (ERPs), confirming that hybrid faces elicited the posterior emotion-related and face-related components (P1, N170 and P2), previously shown to be engaged by non-subliminal emotional stimuli. In addition, these components were stronger in the right hemisphere and were both enhanced and delayed by face inversion. A frontal positivity (210-300 ms) was stronger for emotional than for neutral faces, and for upright than for inverted faces. Hence, hybrid faces represent an original approach in the study of subliminal emotions, which appears promising for investigating their electrophysiological correlates. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
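
    The 'hybrid face' construction described above (an emotional core restricted to low spatial frequencies below 6 cycles/image, combined with the remaining frequencies of a neutral expression) can be sketched as a Fourier-domain mask. The arrays below are random toy data standing in for calibrated face photographs; the cutoff and image size follow the abstract, everything else is an assumption of the sketch.

```python
import numpy as np

def hybrid_face(emotional, neutral, cutoff_cycles=6):
    """Combine the low-spatial-frequency content (< cutoff, in cycles/image)
    of the emotional image with all remaining frequencies of the neutral image.
    Illustrates the filtering principle only."""
    assert emotional.shape == neutral.shape
    h, w = emotional.shape
    fy = np.fft.fftfreq(h) * h              # vertical frequency, cycles/image
    fx = np.fft.fftfreq(w) * w              # horizontal frequency, cycles/image
    radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    low = radius < cutoff_cycles            # boolean low-pass mask
    fe, fn = np.fft.fft2(emotional), np.fft.fft2(neutral)
    combined = np.where(low, fe, fn)        # emotional lows, neutral highs
    return np.real(np.fft.ifft2(combined))

# Toy 64x64 "images" in place of the emotional and neutral face photos.
rng = np.random.default_rng(1)
img_emotional = rng.normal(size=(64, 64))
img_neutral = rng.normal(size=(64, 64))
out = hybrid_face(img_emotional, img_neutral)
print(out.shape)
```

    Because the mask is symmetric in frequency, the combined spectrum stays Hermitian and the inverse transform is real up to rounding error.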

  18. Effects of acute psychosocial stress on neural activity to emotional and neutral faces in a face recognition memory paradigm.

    PubMed

    Li, Shijia; Weerda, Riklef; Milde, Christopher; Wolf, Oliver T; Thiel, Christiane M

    2014-12-01

    Previous studies have shown that acute psychosocial stress impairs recognition of declarative memory and that emotional material is especially sensitive to this effect. Animal studies suggest a central role of the amygdala which modulates memory processes in hippocampus, prefrontal cortex and other brain areas. We used functional magnetic resonance imaging (fMRI) to investigate neural correlates of stress-induced modulation of emotional recognition memory in humans. Twenty-seven healthy, right-handed, non-smoking male volunteers performed an emotional face recognition task. During encoding, participants were presented with 50 fearful and 50 neutral faces. One hour later, they underwent either a stress (Trier Social Stress Test) or a control procedure outside the scanner which was followed immediately by the recognition session inside the scanner, where participants had to discriminate between 100 old and 50 new faces. Stress increased salivary cortisol, blood pressure and pulse, and decreased the mood of participants but did not impact recognition memory. BOLD data during recognition revealed a stress condition by emotion interaction in the left inferior frontal gyrus and right hippocampus which was due to a stress-induced increase of neural activity to fearful and a decrease to neutral faces. Functional connectivity analyses revealed a stress-induced increase in coupling between the right amygdala and the right fusiform gyrus, when processing fearful as compared to neutral faces. Our results provide evidence that acute psychosocial stress affects medial temporal and frontal brain areas differentially for neutral and emotional items, with a stress-induced privileged processing of emotional stimuli.

  19. Emotional faces influence evaluation of natural and transformed food.

    PubMed

    Manippa, Valerio; Padulo, Caterina; Brancucci, Alfredo

    2018-07-01

    Previous evidence has shown a direct relationship between feeding behavior and emotions. Despite that, no studies have focused on the influence of emotional faces on food processing. In our study, participants were presented with 72 pairs of visual stimuli composed of a neutral, happy, or disgusted face (5000 ms duration in Experiment 1, adaptation; 150 ms in Experiment 2, priming) followed by a food stimulus (1500 ms). Food stimuli were grouped into pleasant foods, further divided into natural and transformed, and unpleasant rotten foods. The task consisted in judging the food valence (as 'pleasant' or 'unpleasant') by keypress. Results showed a different pattern of response based on the transformation level of the food. In general, the evaluation of natural foods was more rapid compared with transformed foods, possibly owing to their simplicity and perceived healthiness. In addition, transformed foods yielded responses incongruent with the preceding emotional face, whereas natural foods yielded congruent responses. These effects were independent of the duration of the emotional face (i.e., adaptation or priming paradigm) and may depend on the salience of pleasant food stimuli.

  20. Recognition profile of emotions in natural and virtual faces.

    PubMed

    Dyck, Miriam; Winbeck, Maren; Leiberg, Susanne; Chen, Yuhan; Gur, Ruben C; Mathiak, Klaus

    2008-01-01

    Computer-generated virtual faces become increasingly realistic, including the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be shown for the different emotions and different age groups. Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences in virtual and natural faces depended on specific emotions: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition. Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to even better results as compared to trained actors. Due to the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications.

  1. Recognition memory for emotional and neutral faces: an event-related potential study.

    PubMed

    Johansson, Mikael; Mecklinger, Axel; Treese, Anne-Cécile

    2004-12-01

    This study examined emotional influences on the hypothesized event-related potential (ERP) correlates of familiarity and recollection (Experiment 1) and the states of awareness (Experiment 2) accompanying recognition memory for faces differing in facial affect. Participants made gender judgments to positive, negative, and neutral faces at study and, in the test phase, were instructed to discriminate between studied and nonstudied faces. Whereas old-new discrimination was unaffected by facial expression, negative faces were recollected to a greater extent than both positive and neutral faces, as reflected in the parietal ERP old-new effect and in the proportion of remember judgments. Moreover, emotion-specific modulations were observed in frontally recorded ERPs elicited by correctly rejected new faces that concurred with a more liberal response criterion for emotional as compared to neutral faces. Taken together, the results are consistent with the view that processes promoting recollection are facilitated for negative events and that emotion may affect recognition performance by influencing criterion setting mediated by the prefrontal cortex.

  2. Emotional faces and the default mode network.

    PubMed

    Sreenivas, S; Boehm, S G; Linden, D E J

    2012-01-11

    The default-mode network (DMN) of the human brain has become a central topic of cognitive neuroscience research. Although alterations in its resting state activity and in its recruitment during tasks have been reported for several mental and neurodegenerative disorders, its role in emotion processing has received relatively little attention. We investigated brain responses to different categories of emotional faces with functional magnetic resonance imaging (fMRI) and found deactivation in ventromedial prefrontal cortex (VMPFC), posterior cingulate gyrus (PC) and cuneus. This deactivation was modulated by emotional category and was less prominent for happy than for sad faces. These deactivated areas along the midline conformed to areas of the DMN. We also observed emotion-dependent deactivation of the left middle frontal gyrus, which is not a classical component of the DMN. Conversely, several areas in a fronto-parietal network commonly linked with attention were differentially activated by emotion categories. Functional connectivity patterns, as obtained by correlation of activation levels, also varied between emotions. VMPFC, PC or cuneus served as hubs between the DMN-type areas and the fronto-parietal network. These data support recent suggestions that the DMN is not a unitary system but differentiates according to task and even type of stimulus. The emotion-specific differential pattern of DMN deactivation may be explored further in patients with mood disorder, where the quest for biological markers of emotional biases is still ongoing.

  3. The complex duration perception of emotional faces: effects of face direction.

    PubMed

    Kliegl, Katrin M; Limbrecht-Ecklundt, Kerstin; Dürr, Lea; Traue, Harald C; Huckauf, Anke

    2015-01-01

    The perceived duration of emotional face stimuli strongly depends on the expressed emotion. But emotional faces also differ regarding a number of other features, like gaze, face direction, or sex. Usually, these features have been controlled by only using pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented toward the observer. We aimed at replicating this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and the participant, female and male participants rated the duration of neutral, angry, and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with previous findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters like the induced arousal, social relevance, and an evolutionary context.

  4. Emotion perception accuracy and bias in face-to-face versus cyberbullying.

    PubMed

    Ciucci, Enrica; Baroncelli, Andrea; Nowicki, Stephen

    2014-01-01

    The authors investigated the association of traditional and cyber forms of bullying and victimization with emotion perception accuracy and emotion perception bias. Four basic emotions were considered (i.e., happiness, sadness, anger, and fear); 526 middle school students (280 females; M age = 12.58 years, SD = 1.16 years) were recruited, and emotionality was controlled. Results indicated no significant findings for girls. Boys with higher levels of traditional bullying did not show any deficit in perception accuracy of emotions, but they were prone to identify happiness and fear in faces when a different emotion was expressed; in addition, male cyberbullying was related to greater accuracy in recognizing fear. In terms of the victims, cyber victims had a global problem in recognizing emotions and a specific problem in processing anger and fear. It was concluded that emotion perception accuracy and bias were associated with bullying and victimization for boys not only in traditional settings but also in the electronic ones. Implications of these findings for possible intervention are discussed.

  5. Similar representations of emotions across faces and voices.

    PubMed

    Kuhn, Lisa Katharina; Wydell, Taeko; Lavan, Nadine; McGettigan, Carolyn; Garrido, Lúcia

    2017-09-01

    Emotions are a vital component of social communication, carried across a range of modalities and via different perceptual signals such as specific muscle contractions in the face and in the upper respiratory system. Previous studies have found that emotion recognition impairments after brain damage depend on the modality of presentation: recognition from faces may be impaired whereas recognition from voices remains preserved, and vice versa. On the other hand, there is also evidence for shared neural activation during emotion processing in both modalities. In a behavioral study, we investigated whether there are shared representations in the recognition of emotions from faces and voices. We used a within-subjects design in which participants rated the intensity of facial expressions and nonverbal vocalizations for each of the 6 basic emotion labels. For each participant and each modality, we then computed a representation matrix with the intensity ratings of each emotion. These matrices allowed us to examine the patterns of confusions between emotions and to characterize the representations

  6. Attentional Modulation of Emotional Conflict Processing with Flanker Tasks

    PubMed Central

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing the allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive and largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degree attention modulated the processing of emotion using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color-scale faces with neutral expressions or gray-scale faces with emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion), and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to that of color and face identity (gender). However, task modulation of the color SRC effect was significantly greater than that of the gender or emotion SRC effect, indicating that processing of salient emotional information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time the salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than that on non-emotional stimuli. PMID:23544155

  7. Attentional modulation of emotional conflict processing with flanker tasks.

    PubMed

    Zhou, Pingyan; Liu, Xun

    2013-01-01

    Emotion processing has been shown to acquire priority by biasing the allocation of attentional resources. Aversive images or fearful expressions are processed quickly and automatically. Many existing findings suggested that processing of emotional information was pre-attentive and largely immune from attentional control. Other studies argued that attention gated the processing of emotion. To tackle this controversy, the current study examined whether and to what degree attention modulated the processing of emotion using a stimulus-response-compatibility (SRC) paradigm. We conducted two flanker experiments using color-scale faces with neutral expressions or gray-scale faces with emotional expressions. We found SRC effects for all three dimensions (color, gender, and emotion), and SRC effects were larger when the conflicts were task relevant than when they were task irrelevant, suggesting that conflict processing of emotion was modulated by attention, similar to that of color and face identity (gender). However, task modulation of the color SRC effect was significantly greater than that of the gender or emotion SRC effect, indicating that processing of salient emotional information was modulated by attention to a lesser degree than processing of non-emotional stimuli. We proposed that emotion processing can be influenced by attentional control, but at the same time the salience of emotional information may bias toward bottom-up processing, rendering less top-down modulation than that on non-emotional stimuli.

  8. Recognition memory for low- and high-frequency-filtered emotional faces: Low spatial frequencies drive emotional memory enhancement, whereas high spatial frequencies drive the emotion-induced recognition bias.

    PubMed

    Rohr, Michaela; Tröger, Johannes; Michely, Nils; Uhde, Alarith; Wentura, Dirk

    2017-07-01

    This article deals with two well-documented phenomena regarding emotional stimuli: emotional memory enhancement-that is, better long-term memory for emotional than for neutral stimuli-and the emotion-induced recognition bias-that is, a more liberal response criterion for emotional than for neutral stimuli. Studies on visual emotion perception and attention suggest that emotion-related processes can be modulated by means of spatial-frequency filtering of the presented emotional stimuli. Specifically, low spatial frequencies are assumed to play a primary role for the influence of emotion on attention and judgment. Given this theoretical background, we investigated whether spatial-frequency filtering also impacts (1) the memory advantage for emotional faces and (2) the emotion-induced recognition bias, in a series of old/new recognition experiments. Participants completed incidental-learning tasks with high- (HSF) and low- (LSF) spatial-frequency-filtered emotional and neutral faces. The results of the surprise recognition tests showed a clear memory advantage for emotional stimuli. Most importantly, the emotional memory enhancement was significantly larger for face images containing only low-frequency information (LSF faces) than for HSF faces across all experiments, suggesting that LSF information plays a critical role in this effect, whereas the emotion-induced recognition bias was found only for HSF stimuli. We discuss our findings in terms of both the traditional account of different processing pathways for HSF and LSF information and a stimulus features account. The double dissociation in the results favors the latter account-that is, an explanation in terms of differences in the characteristics of HSF and LSF stimuli.

  9. Recognition Profile of Emotions in Natural and Virtual Faces

    PubMed Central

    Dyck, Miriam; Winbeck, Maren; Leiberg, Susanne; Chen, Yuhan; Gur, Rurben C.; Mathiak, Klaus

    2008-01-01

    Background: Computer-generated virtual faces have become increasingly realistic, including the simulation of emotional expressions. These faces can be used as well-controlled, realistic and dynamic stimuli in emotion research. However, the validity of virtual facial expressions in comparison to natural emotion displays still needs to be shown for the different emotions and different age groups. Methodology/Principal Findings: Thirty-two healthy volunteers between the ages of 20 and 60 rated pictures of natural human faces and faces of virtual characters (avatars) with respect to the expressed emotions: happiness, sadness, anger, fear, disgust, and neutral. Results indicate that virtual emotions were recognized comparably to natural ones. Recognition differences between virtual and natural faces depended on the specific emotion: whereas disgust was difficult to convey with the current avatar technology, virtual sadness and fear achieved better recognition results than natural faces. Furthermore, emotion recognition rates decreased for virtual but not natural faces in participants over the age of 40. This specific age effect suggests that media exposure has an influence on emotion recognition. Conclusions/Significance: Virtual and natural facial displays of emotion may be equally effective. Improved technology (e.g. better modelling of the naso-labial area) may lead to even better results than those achieved by trained actors. Due to the ease with which virtual human faces can be animated and manipulated, validated artificial emotional expressions will be of major relevance in future research and therapeutic applications. PMID:18985152

  10. Processing Distracting Non-face Emotional Images: No Evidence of an Age-Related Positivity Effect

    PubMed Central

    Madill, Mark; Murray, Janice E.

    2017-01-01

    Cognitive aging may be accompanied by increased prioritization of social and emotional goals that enhance positive experiences and emotional states. The socioemotional selectivity theory suggests this may be achieved by giving preference to positive information and avoiding or suppressing negative information. Although there is some evidence of a positivity bias in controlled attention tasks, it remains unclear whether a positivity bias extends to the processing of affective stimuli presented outside focused attention. In two experiments, we investigated age-related differences in the effects of to-be-ignored non-face affective images on target processing. In Experiment 1, 27 older (64–90 years) and 25 young adults (19–29 years) made speeded valence judgments about centrally presented positive or negative target images taken from the International Affective Picture System. To-be-ignored distractor images were presented above and below the target image and were either positive, negative, or neutral in valence. The distractors were considered task relevant because they shared emotional characteristics with the target stimuli. Both older and young adults responded slower to targets when distractor valence was incongruent with target valence relative to when distractors were neutral. Older adults responded faster to positive than to negative targets but did not show increased interference effects from positive distractors. In Experiment 2, affective distractors were task irrelevant, as the target was a three-digit array and did not share emotional characteristics with the distractors. Twenty-six older (63–84 years) and 30 young adults (18–30 years) gave speeded responses on a digit disparity task while ignoring the affective distractors positioned in the periphery. Task performance was not influenced by the task-irrelevant affective images in either age group. In keeping with the socioemotional selectivity theory, these findings suggest that older adults

  11. Intrinsic functional connectivity underlying successful emotion regulation of angry faces

    PubMed Central

    Morawetz, Carmen; Kellermann, Tanja; Kogler, Lydia; Radke, Sina; Blechert, Jens; Derntl, Birgit

    2016-01-01

    Most of our social interaction is naturally based on emotional information derived from the perception of other people's faces. Negative facial expressions of a counterpart might trigger negative emotions and initiate emotion regulatory efforts to reduce the impact of the received emotional message in a perceiver. Despite the high adaptive value of emotion regulation in social interaction, its neural underpinnings are largely unknown. To remedy this, the present study investigated how individual differences in emotion regulation effectiveness during the reappraisal of angry faces relate to the underlying functional activity, using functional magnetic resonance imaging (fMRI), and to the underlying functional connectivity, using resting-state fMRI. Greater emotion regulation ability was associated with greater functional activity in the ventromedial prefrontal cortex. Furthermore, greater functional coupling between activity in the ventrolateral prefrontal cortex and the amygdala was associated with emotion regulation success. Our findings provide a first link between prefrontal cognitive control and subcortical emotion processing systems during successful emotion regulation in an explicitly social context. PMID:27510495

  12. A face a mother could love: depression-related maternal neural responses to infant emotion faces.

    PubMed

    Laurent, Heidemarie K; Ablow, Jennifer C

    2013-01-01

    Depressed mothers show negatively biased responses to their infants' emotional bids, perhaps due to faulty processing of infant cues. This study is the first to examine depression-related differences in mothers' neural response to their own infant's emotion faces, considering both effects of perinatal depression history and current depressive symptoms. Primiparous mothers (n = 22), half of whom had a history of major depressive episodes (with one episode occurring during pregnancy and/or postpartum), were exposed to images of their own and unfamiliar infants' joy and distress faces during functional neuroimaging. Group differences (depression vs. no-depression) and continuous effects of current depressive symptoms were tested in relation to neural response to own infant emotion faces. Compared to mothers with no psychiatric diagnoses, those with depression showed blunted responses to their own infant's distress faces in the dorsal anterior cingulate cortex. Mothers with higher levels of current symptomatology showed reduced responses to their own infant's joy faces in the orbitofrontal cortex and insula. Current symptomatology also predicted lower responses to own infant joy-distress in left-sided prefrontal and insula/striatal regions. These deficits in self-regulatory and motivational response circuits may help explain parenting difficulties in depressed mothers.

  13. Age-related emotional bias in processing two emotionally valenced tasks.

    PubMed

    Allen, Philip A; Lien, Mei-Ching; Jardin, Elliott

    2017-01-01

    Previous studies suggest that older adults process positive emotions more efficiently than negative emotions, whereas younger adults show the reverse effect. We examined whether this age-related difference in emotional bias still occurs when attention is engaged in two emotional tasks. We used a psychological refractory period paradigm and varied the emotional valence of Task 1 and Task 2. In both experiments, Task 1 was emotional face discrimination (happy vs. angry faces) and Task 2 was sound discrimination (laugh, punch, vs. cork pop in Experiment 1 and laugh vs. scream in Experiment 2). The backward emotional correspondence effect for positively and negatively valenced Task 2 on Task 1 was measured. In both experiments, younger adults showed a backward correspondence effect from a negatively valenced Task 2, suggesting parallel processing of negatively valenced stimuli. Older adults showed similar negativity bias in Experiment 2 with a more salient negative sound ("scream" relative to "punch"). These results are consistent with an arousal-bias competition model [Mather and Sutherland (Perspectives in Psychological Sciences 6:114-133, 2011)], suggesting that emotional arousal modulates top-down attentional control settings (emotional regulation) with age.

  14. Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation.

    PubMed

    Jeong, Jeong-Won; Diwadkar, Vaibhav A; Chugani, Carla D; Sinsoongsud, Piti; Muzik, Otto; Behen, Michael E; Chugani, Harry T; Chugani, Diane C

    2011-02-14

    The powerful emotion-inducing properties of music are well known, yet music may convey differing emotional responses depending on environmental factors. We hypothesized that the neural mechanisms involved in listening to music may differ when the music is presented together with visual stimuli that convey the same emotion, as compared to visual stimuli with incongruent emotional content. We designed this study to determine the effect of auditory (happy and sad instrumental music) and visual stimuli (happy and sad faces), congruent or incongruent for emotional content, on audiovisual processing using fMRI blood oxygenation level-dependent (BOLD) signal contrast. The experiment was conducted in the context of a conventional block-design experiment. A block consisted of three emotional ON periods: music alone (happy or sad music), face alone (happy or sad faces), and music combined with faces, where the music excerpt was played while presenting either congruent or incongruent emotional faces. We found activity in the superior temporal gyrus (STG) and fusiform gyrus (FG) to be differentially modulated by music and faces depending on the congruence of emotional content. There was a greater BOLD response in STG when the emotion signaled by the music and faces was congruent. Furthermore, the magnitude of these changes differed for happy congruence and sad congruence, i.e., the activation of STG when happy music was presented with happy faces was greater than the activation seen when sad music was presented with sad faces. In contrast, incongruent stimuli diminished the BOLD response in STG and elicited greater signal change in bilateral FG. Behavioral testing supplemented these findings by showing that subject ratings of emotion in faces were influenced by emotion in music. When presented with happy music, happy faces were rated as more happy (p=0.051) and sad faces were rated as less sad (p=0.030). When presented with sad music, happy faces were rated as less

  15. Psilocybin modulates functional connectivity of the amygdala during emotional face discrimination.

    PubMed

    Grimm, O; Kraehenmann, R; Preller, K H; Seifritz, E; Vollenweider, F X

    2018-04-24

    Recent studies suggest that the antidepressant effects of the psychedelic 5-HT2A receptor agonist psilocybin are mediated through its modulatory properties on prefrontal and limbic brain regions including the amygdala. To further investigate the effects of psilocybin on emotion processing networks, we studied, for the first time, psilocybin's acute effects on amygdala seed-to-voxel connectivity in an event-related face discrimination task in 18 healthy volunteers who received psilocybin and placebo in a double-blind, balanced, cross-over design. The amygdala has been implicated as a salience detector especially involved in the immediate response to emotional face content. We used beta-series amygdala seed-to-voxel connectivity during an emotional face discrimination task to elucidate the connectivity pattern of the amygdala over the entire brain. When we compared psilocybin to placebo, an increase in reaction time for all three categories of affective stimuli was found. Psilocybin decreased the connectivity between the amygdala and the striatum during angry face discrimination. During happy face discrimination, the connectivity between the amygdala and the frontal pole was decreased. No effect was seen during discrimination of fearful faces. Thus, we show psilocybin's effect as a modulator of major connectivity hubs of the amygdala. Psilocybin decreases the connectivity between important nodes linked to emotion processing such as the frontal pole and the striatum. Future studies are needed to clarify whether connectivity changes predict therapeutic effects in psychiatric patients.

  16. Emotional face processing deficit in schizophrenia: A replication study in a South African Xhosa population.

    PubMed

    Leppänen, J M; Niehaus, D J H; Koen, L; Du Toit, E; Schoeman, R; Emsley, R

    2006-06-01

    Schizophrenia is associated with a deficit in the recognition of negative emotions from facial expressions. The present study examined the universality of this finding by studying facial expression recognition in an African Xhosa population. Forty-four Xhosa patients with schizophrenia and forty healthy controls were tested with a computerized task requiring rapid perceptual discrimination of matched positive (i.e. happy), negative (i.e. angry), and neutral faces. Patients were as accurate as controls in recognizing happy faces but showed a marked impairment in the recognition of angry faces. The impairment was particularly pronounced for high-intensity (open-mouth) angry faces. Patients also exhibited more false happy and angry responses to neutral faces than controls. No correlation between level of education or illness duration and emotion recognition was found, but the deficit in the recognition of negative emotions was more pronounced in familial compared to non-familial cases of schizophrenia. These findings suggest that the deficit in the recognition of negative facial expressions may constitute a universal neurocognitive marker of schizophrenia.

  17. Lateralization of Visuospatial Attention across Face Regions Varies with Emotional Prosody

    ERIC Educational Resources Information Center

    Thompson, Laura A.; Malloy, Daniel M.; LeBlanc, Katya L.

    2009-01-01

    It is well-established that linguistic processing is primarily a left-hemisphere activity, while emotional prosody processing is lateralized to the right hemisphere. Does attention, directed at different regions of the talker's face, reflect this pattern of lateralization? We investigated visuospatial attention across a talker's face with a…

  18. Faces in context: a review and systematization of contextual influences on affective face processing.

    PubMed

    Wieser, Matthias J; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  19. Individual differences in emotion lateralisation and the processing of emotional information arising from social interactions.

    PubMed

    Bourne, Victoria J; Watling, Dawn

    2015-01-01

    Previous research examining the possible association between emotion lateralisation and social anxiety has found conflicting results. In this paper, two studies are presented to assess two aspects related to different features of social anxiety: fear of negative evaluation (FNE) and emotion regulation. Lateralisation for the processing of facial emotion was measured using the chimeric faces test. Individuals with greater FNE were more strongly lateralised to the right hemisphere for the processing of anger, happiness and sadness; for the processing of fearful faces, this relationship was found for females only. Emotion regulation strategies were reduced to two factors: positive strategies and negative strategies. For males, but not females, greater reported use of negative emotion strategies was associated with stronger right-hemisphere lateralisation for processing negative emotions. The implications for further understanding the neuropsychological processing of emotion in individuals with social anxiety are discussed.

  20. Functional Brain Activation to Emotional and non-Emotional Faces in Healthy Children: Evidence for Developmentally Undifferentiated Amygdala Function During the School Age Period

    PubMed Central

    Pagliaccio, David; Luby, Joan L.; Gaffrey, Michael S.; Belden, Andrew C.; Botteron, Kelly N.; Harms, Michael P.; Barch, Deanna M.

    2013-01-01

    The amygdala is a key region in emotion processing. In particular, fMRI studies have demonstrated that the amygdala is active during the viewing of emotional faces. Previous research has consistently found greater amygdala responses to fearful faces as compared to neutral faces in adults, convergent with a focus in the animal literature on the amygdala's role in fear processing. Studies have found that the amygdala also responds differentially to other facial emotion types in adults. Yet the literature regarding when this differential amygdala responsivity develops is limited and mixed. Thus, the goal of the current study was to examine amygdala responses to emotional and neutral faces in a relatively large sample of healthy school-age children (N = 52). While the amygdala was active in response to emotional and neutral faces, the results do not support the hypothesis that the amygdala responds differentially to emotional faces in 7- to 12-year-old children. Nonetheless, amygdala activity was correlated with the severity of subclinical depression symptoms and with emotion regulation skills. Additionally, sex differences were observed in frontal, temporal, and visual regions, as well as effects of pubertal development in visual regions. These findings suggest important differences in amygdala reactivity in childhood. PMID:23636982

  1. Emotion recognition training using composite faces generalises across identities but not all emotions.

    PubMed

    Dalili, Michael N; Schofield-Toloza, Lawrence; Munafò, Marcus R; Penton-Voak, Ian S

    2017-08-01

    Many cognitive bias modification (CBM) tasks use facial expressions of emotion as stimuli. Some tasks use unique facial stimuli, while others use composite stimuli, given evidence that emotion is encoded prototypically. However, CBM using composite stimuli may be identity- or emotion-specific and may not generalise to other stimuli. We investigated the generalisability of effects using composite faces in two experiments. Healthy adults in each study were randomised to one of four training conditions: two stimulus-congruent conditions, where the same faces were used during all phases of the task, and two stimulus-incongruent conditions, where faces of the opposite sex (Experiment 1) or faces depicting another emotion (Experiment 2) were used after the modification phase. Our results suggested that training effects generalised across identities but only partially across emotions. These findings suggest that effects obtained using composite stimuli may extend beyond the stimuli used in the task but remain emotion-specific.

  2. Rapid processing of emotional expressions without conscious awareness.

    PubMed

    Smith, Marie L

    2012-08-01

    Rapid, accurate categorization of the emotional state of our peers is of critical importance, and as such many have proposed that facial expressions of emotion can be processed without conscious awareness. Typically, studies focus selectively on fearful expressions due to their evolutionary significance, leaving the subliminal processing of other facial expressions largely unexplored. Here, I investigated the time course of processing of 3 facial expressions (fearful, disgusted, and happy) plus an emotionally neutral face, during objectively unaware and aware perception. Participants completed the challenging "which expression?" task in response to briefly presented backward-masked expressive faces. Although participants' behavioral responses did not differentiate between the emotional content of the stimuli in the unaware condition, activity over frontal and occipitotemporal (OT) brain regions indicated an emotional modulation of the neuronal response. Over frontal regions this modulation was driven by negative facial expressions and was present on all emotional trials, independent of later categorization. In contrast, the N170 component, recorded at lateral OT electrodes, was enhanced for all facial expressions, but only on trials that would later be categorized as emotional. The results indicate that emotional faces, not only fearful ones, are processed without conscious awareness at an early stage, and they highlight the critical importance of considering the categorization response when studying subliminal perception.

  3. Pretreatment Differences in BOLD Response to Emotional Faces Correlate with Antidepressant Response to Scopolamine.

    PubMed

    Furey, Maura L; Drevets, Wayne C; Szczepanik, Joanna; Khanna, Ashish; Nugent, Allison; Zarate, Carlos A

    2015-03-28

    Faster-acting antidepressants and biomarkers that predict treatment response are needed to facilitate the development of more effective treatments for patients with major depressive disorder. Here, we evaluate implicitly and explicitly processed emotional faces using neuroimaging to identify potential biomarkers of treatment response to the antimuscarinic scopolamine. Healthy participants (n=15) and unmedicated, depressed patients with major depressive disorder (n=16) participated in a double-blind, placebo-controlled crossover infusion study using scopolamine (4 μg/kg). Before and following scopolamine, the blood oxygen level-dependent (BOLD) signal was measured using functional MRI during a selective attention task. Two stimuli comprised of superimposed pictures of faces and houses were presented. Participants attended to one stimulus component and performed a matching task. Face emotion was modulated (happy/sad), creating implicit (attend-houses) and explicit (attend-faces) emotion processing conditions. The pretreatment difference in BOLD response to happy and sad faces under implicit and explicit conditions (emotion processing biases) within a priori regions of interest was correlated with subsequent treatment response in major depressive disorder. Correlations were observed exclusively during implicit emotion processing in the regions of interest, which included the subgenual anterior cingulate (P<.02) and middle occipital cortices (P<.02). The magnitude and direction of the differential BOLD response to implicitly processed emotional faces prior to treatment reflect the potential to respond to scopolamine. These findings replicate earlier results, highlighting the potential for pretreatment neural activity in the middle occipital cortices and subgenual anterior cingulate to inform us about the potential to respond clinically to scopolamine. Published by Oxford University Press on behalf of CINP 2015.

  4. Effects of facial color on the subliminal processing of fearful faces.

    PubMed

    Nakajima, K; Minami, T; Nakauchi, S

    2015-12-03

    Recent studies have suggested that both configural information, such as face shape, and surface information are important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. Meanwhile, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that bluish-colored faces increased the latency effect of facial expressions compared to natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  5. Musical chords and emotion: major and minor triads are processed for emotion.

    PubMed

    Bakker, David Radford; Martin, Frances Heritage

    2015-03-01

    Musical chords are arguably the smallest building blocks of music that retain emotional information. Major chords are generally perceived as positive- and minor chords as negative-sounding, but there has been debate concerning how early these emotional connotations may be processed. To investigate this, emotional facial stimuli and musical chord stimuli were simultaneously presented to participants, and facilitation of processing was measured via event-related potential (ERP) amplitudes. Decreased amplitudes of the P1 and N2 ERP components have been found to index the facilitation of early processing. If simultaneously presented musical chords and facial stimuli are perceived at early stages as belonging to the same emotional category, then early processing should be facilitated for these congruent pairs, and ERP amplitudes should therefore be decreased as compared to the incongruent pairs. ERPs were recorded from 30 musically naive participants as they viewed happy, sad, and neutral faces presented simultaneously with a major or minor chord. When faces and chords were presented that contained congruent emotional information (happy-major or sad-minor), processing was facilitated, as indexed by decreased N2 ERP amplitudes. This suggests that musical chords do possess emotional connotations that can be processed as early as 200 ms in naive listeners. The early stages of processing that are involved suggest that major and minor chords have deeply connected emotional meanings, rather than superficially attributed ones, indicating that minor triads possess negative emotional connotations and major triads possess positive emotional connotations.

  6. Neural activation to emotional faces in adolescents with autism spectrum disorders.

    PubMed

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S

    2011-03-01

    Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and striatum, three structures involved in socio-emotional processing in adolescents with ASD. Twenty-two adolescents with ASD and 20 healthy adolescents viewed facial expressions (happy, fearful, sad and neutral) that were briefly presented (250 ms) during functional MRI acquisition. To monitor attention, subjects pressed a button to identify the gender of each face. The ASD group showed greater activation to the faces relative to the control group in the amygdala, vPFC and striatum. Follow-up analyses indicated that the ASD relative to control group showed greater activation in the amygdala, vPFC and striatum (p < .05 small volume corrected), particularly to sad faces. Moreover, in the ASD group, there was a negative correlation between developmental variables (age and pubertal status) and mean activation from the whole bilateral amygdala; younger adolescents showed greater activation than older adolescents. There were no group differences in accuracy or reaction time in the gender identification task. When group differences in attention to facial expressions were limited, adolescents with ASD showed greater activation in structures involved in socio-emotional processing. © 2010 The Authors. Journal of Child Psychology and Psychiatry © 2010 Association for Child and Adolescent Mental Health.

  7. Increased heart rate after exercise facilitates the processing of fearful but not disgusted faces.

    PubMed

    Pezzulo, G; Iodice, P; Barca, L; Chausse, P; Monceau, S; Mermillod, M

    2018-01-10

    Embodied theories of emotion assume that emotional processing is grounded in bodily and affective processes. Accordingly, the perception of an emotion re-enacts congruent sensory and affective states; and conversely, bodily states congruent with a specific emotion facilitate emotional processing. This study tests whether the ability to process facial expressions (faces having a neutral expression, expressing fear, or disgust) can be influenced by making the participants' body state congruent with the expressed emotion (e.g., high heart rate in the case of faces expressing fear). We designed a task requiring participants to categorize pictures of male and female faces that either had a neutral expression (neutral), or expressed emotions whose linkage with high heart rate is strong (fear) or significantly weaker or absent (disgust). Critically, participants were tested in two conditions: with experimentally induced high heart rate (Exercise) and with normal heart rate (Normal). Participants processed fearful faces (but not disgusted or neutral faces) faster when they were in the Exercise condition than in the Normal condition. These results support the idea that an emotionally congruent body state facilitates the automatic processing of emotionally-charged stimuli and this effect is emotion-specific rather than due to generic factors such as arousal.

  8. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    PubMed

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.

  9. Automatic emotion processing as a function of trait emotional awareness: an fMRI study

    PubMed Central

    Lichev, Vladimir; Sacher, Julia; Ihme, Klas; Rosenberg, Nicole; Quirin, Markus; Lepsien, Jöran; Pampel, André; Rufer, Michael; Grabe, Hans-Jörgen; Kugel, Harald; Kersting, Anette; Villringer, Arno; Lane, Richard D.

    2015-01-01

    It is unclear whether reflective awareness of emotions is related to extent and intensity of implicit affective reactions. This study is the first to investigate automatic brain reactivity to emotional stimuli as a function of trait emotional awareness. To assess emotional awareness the Levels of Emotional Awareness Scale (LEAS) was administered. During scanning, masked happy, angry, fearful and neutral facial expressions were presented to 46 healthy subjects, who had to rate the fit between artificial and emotional words. The rating procedure allowed assessment of shifts in implicit affectivity due to emotion faces. Trait emotional awareness was associated with increased activation in the primary somatosensory cortex, inferior parietal lobule, anterior cingulate gyrus, middle frontal and cerebellar areas, thalamus, putamen and amygdala in response to masked happy faces. LEAS correlated positively with shifts in implicit affect caused by masked happy faces. According to our findings, people with high emotional awareness show stronger affective reactivity and more activation in brain areas involved in emotion processing and simulation during the perception of masked happy facial expression than people with low emotional awareness. High emotional awareness appears to be characterized by an enhanced positive affective resonance to others at an automatic processing level. PMID:25140051

  10. Association of Irritability and Anxiety With the Neural Mechanisms of Implicit Face Emotion Processing in Youths With Psychopathology.

    PubMed

    Stoddard, Joel; Tseng, Wan-Ling; Kim, Pilyoung; Chen, Gang; Yi, Jennifer; Donahue, Laura; Brotman, Melissa A; Towbin, Kenneth E; Pine, Daniel S; Leibenluft, Ellen

    2017-01-01

    of irritability (Wald χ²(1) = 21.3; P < .001 for contrast). Irritability was associated with differences in neural response to face emotions in several areas (F(2,888) ≥ 13.45; all P < .001). This primarily occurred in the ventral visual areas, with a positive association to angry and happy faces relative to fearful faces. These data extend prior work conducted in youths with irritability or anxiety alone and suggest that research may miss important findings if the pathophysiology of irritability and anxiety are studied in isolation. Decreased amygdala-medial prefrontal cortex connectivity may mediate emotion dysregulation when very anxious and irritable youth process threat-related faces. Activation in the ventral visual circuitry suggests a mechanism through which signals of social approach (i.e., happy and angry expressions) may capture attention in irritable youth.

  11. No differences in emotion recognition strategies in children with autism spectrum disorder: evidence from hybrid faces.

    PubMed

    Evers, Kris; Kerkhof, Inneke; Steyaert, Jean; Noens, Ilse; Wagemans, Johan

    2014-01-01

    Emotion recognition problems are frequently reported in individuals with an autism spectrum disorder (ASD). However, this research area is characterized by inconsistent findings, with atypical emotion processing strategies possibly contributing to existing contradictions. In addition, an attenuated saliency of the eyes region is often demonstrated in ASD during face identity processing. We wanted to compare reliance on mouth versus eyes information in children with and without ASD, using hybrid facial expressions. A group of six-to-eight-year-old boys with ASD and an age- and intelligence-matched typically developing (TD) group without intellectual disability performed an emotion labelling task with hybrid facial expressions. Five static expressions were used: one neutral expression and four emotional expressions, namely, anger, fear, happiness, and sadness. Hybrid faces were created, consisting of an emotional face half (upper or lower face region) with the other face half showing a neutral expression. Results showed no emotion recognition problem in ASD. Moreover, we provided evidence for the existence of top- and bottom-emotions in children: correct identification of expressions mainly depends on information in the eyes (so-called top-emotions: happiness) or in the mouth region (so-called bottom-emotions: sadness, anger, and fear). No stronger reliance on mouth information was found in children with ASD.

  12. A face to remember: emotional expression modulates prefrontal activity during memory formation.

    PubMed

    Sergerie, Karine; Lepage, Martin; Armony, Jorge L

    2005-01-15

    Emotion can exert a modulatory role on episodic memory. Several studies have shown that negative stimuli (e.g., words, pictures) are better remembered than neutral ones. Although facial expressions are powerful emotional stimuli and have been shown to influence perception and attention processes, little is known about their effect on memory. We used functional magnetic resonance imaging (fMRI) in humans to investigate the effects of expression (happy, neutral, and fearful) on prefrontal cortex (PFC) activity during the encoding of faces, using a subsequent memory effect paradigm. Our results show that activity in the right PFC predicted memory for faces, regardless of expression, while a homotopic region in the left hemisphere was associated with successful encoding only for faces with an emotional expression. These findings are consistent with the proposed role of the right dorsolateral PFC (DLPFC) in successful encoding of nonverbal material, but they also suggest that the left DLPFC may be a site where integration of memory and emotional processes occurs. This study sheds new light on the current controversy regarding the hemispheric lateralization of the PFC in memory encoding.

  13. Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing

    PubMed Central

    Wieser, Matthias J.; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim to (1) systematize the contextual variables that may influence the perception of facial expressions and (2) summarize experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues for future research.

  14. The effects of social anxiety on emotional face discrimination and its modulation by mouth salience.

    PubMed

    du Rocher, Andrew R; Pickering, Alan D

    2018-05-21

    People high in social anxiety experience fear of social situations due to the likelihood of social evaluation. Happy faces are generally processed very quickly, but this advantage is reduced by high social anxiety. Mouth regions are implicated in emotional face processing; therefore, differences in mouth salience might affect how social anxiety relates to emotional face discrimination. We designed an emotional facial expression recognition task to reveal how varying levels of sub-clinical social anxiety (measured by questionnaire) related to the discrimination of happy and fearful faces, and of happy and angry faces. We also categorised the facial expressions by the salience of the mouth region (i.e. high [open mouth] vs. low [closed mouth]). In a sample of 90 participants, higher social anxiety (relative to lower social anxiety) was associated with a reduced happy-face reaction time advantage. However, this effect was mainly driven by the faces with less salient, closed mouths. Our results are consistent with theories of anxiety that incorporate an oversensitive valence evaluation system.

  15. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expressions to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers that model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interfaces, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
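    The face-representation model this record refers to is Turk and Pentland's eigenface approach: faces are encoded as weighted sums of the principal components of a training set, and a probe face is identified by nearest-neighbour matching in that low-dimensional "face space". A minimal sketch with synthetic stand-in vectors (real systems use aligned grayscale photographs; the sizes and noise levels here are purely illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-in data: 40 "face images" of 5 people (8 each),
    # flattened to 256-dimensional vectors.
    n_people, per_person, dim = 5, 8, 256
    prototypes = rng.normal(size=(n_people, dim))
    faces = np.vstack([p + 0.3 * rng.normal(size=(per_person, dim))
                       for p in prototypes])
    labels = np.repeat(np.arange(n_people), per_person)

    # 1. Centre the training set on the mean face.
    mean_face = faces.mean(axis=0)
    centered = faces - mean_face

    # 2. The principal components of the centred set are the "eigenfaces";
    #    SVD yields them directly as the rows of Vt.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    k = 10                      # keep the top-k eigenfaces
    eigenfaces = Vt[:k]

    # 3. Represent each training face by its k projection coefficients.
    train_weights = centered @ eigenfaces.T

    def identify(image):
        """Project a probe image into face space; return the nearest identity."""
        w = (image - mean_face) @ eigenfaces.T
        distances = np.linalg.norm(train_weights - w, axis=1)
        return labels[np.argmin(distances)]

    # A noisy new image of person 3 should come back as person 3.
    probe = prototypes[3] + 0.3 * rng.normal(size=dim)
    print(identify(probe))
    ```

    The dimensionality reduction is the point: matching happens over k coefficients rather than raw pixels, which is what made the approach tractable on 1990-era hardware.
    
    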

  16. Older Adults' Trait Impressions of Faces Are Sensitive to Subtle Resemblance to Emotions

    PubMed Central

    Zebrowitz, Leslie A.

    2013-01-01

    Younger adults (YA) attribute emotion-related traits to people whose neutral facial structure resembles an emotion (emotion overgeneralization). The fact that older adults (OA) show deficits in accurately labeling basic emotions suggests that they may be relatively insensitive to variations in the emotion resemblance of neutral expression faces that underlie emotion overgeneralization effects. On the other hand, the fact that OA, like YA, show a ‘pop-out’ effect for anger, more quickly locating an angry than a happy face in a neutral array, suggests that both age groups may be equally sensitive to emotion resemblance. We used computer modeling to assess the degree to which neutral faces objectively resembled emotions and assessed whether that resemblance predicted trait impressions. We found that both OA and YA showed anger and surprise overgeneralization in ratings of danger and naiveté, respectively, with no significant differences in the strength of the effects for the two age groups. These findings suggest that well-documented OA deficits on emotion recognition tasks may be more due to processing demands than to an insensitivity to the social affordances of emotion expressions. PMID:24058225

  17. Psilocybin with psychological support improves emotional face recognition in treatment-resistant depression.

    PubMed

    Stroud, J B; Freeman, T P; Leech, R; Hindocha, C; Lawn, W; Nutt, D J; Curran, H V; Carhart-Harris, R L

    2018-02-01

    Depressed patients robustly exhibit affective biases in emotional processing which are altered by SSRIs and predict clinical outcome. The objective of this study is to investigate whether psilocybin, recently shown to rapidly improve mood in treatment-resistant depression (TRD), alters patients' emotional processing biases. Seventeen patients with treatment-resistant depression completed a dynamic emotional face recognition task at baseline and 1 month later after two doses of psilocybin with psychological support. Sixteen controls completed the emotional recognition task over the same time frame but did not receive psilocybin. We found evidence for a group × time interaction on speed of emotion recognition (p = .035). At baseline, patients were slower at recognising facial emotions compared with controls (p < .001). After psilocybin, this difference was remediated (p = .208). Emotion recognition was faster at follow-up compared with baseline in patients (p = .004, d = .876) but not controls (p = .263, d = .302). In patients, this change was significantly correlated with a reduction in anhedonia over the same time period (r = .640, p = .010). Psilocybin with psychological support appears to improve processing of emotional faces in treatment-resistant depression, and this correlates with reduced anhedonia. Placebo-controlled studies are warranted to follow up these preliminary findings.
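    The d values reported above are Cohen's d effect sizes for the change in recognition speed. One common paired-samples form divides the mean pre-to-post difference by the standard deviation of the difference scores; a minimal sketch with made-up numbers (the study's raw reaction times are not given here, and the authors' exact formula is an assumption):

    ```python
    import numpy as np

    def cohens_d_paired(pre, post):
        """Paired-samples Cohen's d: mean change / SD of the change scores."""
        diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
        return diff.mean() / diff.std(ddof=1)

    # Hypothetical pre/post scores for four participants.
    print(round(cohens_d_paired([10, 12, 9, 11], [12, 13, 9, 14]), 2))  # prints 1.16
    ```

    By the usual rule of thumb, the patients' d = .876 would count as a large within-group effect and the controls' d = .302 as a small one.
    
    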

  18. Laterality Biases to Chimeric Faces in Asperger Syndrome: What Is Right about Face-Processing?

    ERIC Educational Resources Information Center

    Ashwin, Chris; Wheelwright, Sally; Baron-Cohen, Simon

    2005-01-01

    People show a left visual field (LVF) bias for faces, i.e., involving the right hemisphere of the brain. Lesion and neuroimaging studies confirm the importance of the right-hemisphere and suggest separable neural pathways for processing facial identity vs. emotions. We investigated the hemispheric processing of faces in adults with and without…

  19. An electrocortical investigation of emotional face processing in military-related posttraumatic stress disorder.

    PubMed

    DiGangi, Julia A; Burkhouse, Katie L; Aase, Darrin M; Babione, Joseph M; Schroth, Christopher; Kennedy, Amy E; Greenstein, Justin E; Proescher, Eric; Phan, K Luan

    2017-09-01

    PTSD is a disorder of emotion dysregulation. Although much work has aimed to elucidate the neural underpinnings of the disorder, much remains unknown about the neurobiological substrates of emotion dysregulation in PTSD. To assess the relationship between a neural measure of attention to emotion (i.e., the late positive potential; LPP) and PTSD symptoms, EEG was recorded and examined as a potential predictor of military-related PTSD symptoms in a sample of 73 OEF/OIF/OND veterans. Results revealed that higher PTSD symptoms were related to an attenuated LPP response to angry facial expressions. This finding was not observed for happy or fearful faces. The current study provides initial evidence that, in a relatively young, mostly male sample of OEF/OIF/OND veterans, hyporeactivity to angry faces at the neural level may provide phenotypic data to characterize individual differences in PTSD symptom severity. This work may assist future studies that seek to identify useful psychophysiologic targets for treatment and early intervention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. One size does not fit all: face emotion processing impairments in semantic dementia, behavioural-variant frontotemporal dementia and Alzheimer's disease are mediated by distinct cognitive deficits.

    PubMed

    Miller, Laurie A; Hsieh, Sharpley; Lah, Suncica; Savage, Sharon; Hodges, John R; Piguet, Olivier

    2012-01-01

    Patients with frontotemporal dementia (both behavioural variant [bvFTD] and semantic dementia [SD]) as well as those with Alzheimer's disease (AD) show deficits on tests of face emotion processing, yet the mechanisms underlying these deficits have rarely been explored. We compared groups of patients with bvFTD (n = 17), SD (n = 12) or AD (n = 20) to an age- and education-matched group of healthy control subjects (n = 36) on three face emotion processing tasks (Ekman 60, Emotion Matching and Emotion Selection) and found that all three patient groups were similarly impaired. Analyses of covariance employed to partial out the influences of language and perceptual impairments, which frequently co-occur in these patients, provided evidence of different underlying cognitive mechanisms. These analyses revealed that language impairments explained the original poor scores obtained by the SD patients on the Ekman 60 and Emotion Selection tasks, which involve verbal labels. Perceptual deficits contributed to Emotion Matching performance in the bvFTD and AD patients. Importantly, all groups remained impaired on one task or more following these analyses, denoting a primary emotion processing disturbance in these dementia syndromes. These findings highlight the multifactorial nature of emotion processing deficits in patients with dementia.

  1. Emotional intelligence is associated with reduced insula responses to masked angry faces.

    PubMed

    Alkozei, Anna; Killgore, William D S

    2015-07-08

    High levels of emotional intelligence (EI) have been associated with increased success in the workplace, greater quality of personal relationships, and enhanced wellbeing. Evidence suggests that EI is mediated extensively by the interplay of key emotion regions including the amygdala, insula, and ventromedial prefrontal cortex, among others. The insula, in particular, is important for processing interoceptive and somatic cues that are interpreted as emotional responses. We investigated the association between EI and functional brain responses within the aforementioned neurocircuitry in response to subliminal presentations of social threat. Fifty-four healthy adults completed the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT) and underwent functional magnetic resonance imaging while viewing subliminal presentations of faces displaying anger, using a backward-masked facial affect paradigm to minimize conscious awareness of the expressed emotion. In response to masked angry faces, total MSCEIT scores correlated negatively with a cluster of activation located within the left insula, but not with activation in any other region of interest. Considering the insula's role in the processing of interoceptive emotional cues, the results suggest that greater EI is associated with reduced emotional visceral reactivity and/or more accurate interoceptive prediction when confronted with stimuli indicative of social threat.

  2. Association between Amygdala Response to Emotional Faces and Social Anxiety in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Kleinhans, Natalia M.; Richards, Todd; Weaver, Kurt; Johnson, L. Clark; Greenson, Jessica; Dawson, Geraldine; Aylward, Elizabeth

    2010-01-01

    Difficulty interpreting facial expressions has been reported in autism spectrum disorders (ASD) and is thought to be associated with amygdala abnormalities. To further explore the neural basis of abnormal emotional face processing in ASD, we conducted an fMRI study of emotional face matching in high-functioning adults with ASD and age, IQ, and…

  3. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    PubMed

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. The Effect of Observers’ Mood on the Local Processing of Emotional Faces: Evidence from Short-Lived and Prolonged Mood States

    PubMed Central

    Mokhtari, Setareh; Buttle, Heather

    2015-01-01

    We examined the effect of induced mood, varying in valence and longevity, on local processing of emotional faces. It was found that negative facial expression conveyed by the global level of the face interferes with efficient processing of the local features. The results also showed that the duration of involvement with a mood influenced the local processing. We observed that attending to the local level of faces is not different in short-lived happy and sad mood states. However, as the mood state is experienced for a longer period, local processing was impaired in happy mood compared to sad mood. Taken together, we concluded that both facial expressions and affective states influence processing of the local parts of faces. Moreover, we suggest that mediating factors like the duration of involvement with the mood play a role in the interrelation between mood, attention, and perception. PMID:25883696

  5. Aging and attentional biases for emotional faces.

    PubMed

    Mather, Mara; Carstensen, Laura L

    2003-09-01

    We examined age differences in attention to and memory for faces expressing sadness, anger, and happiness. Participants saw a pair of faces, one emotional and one neutral, and then a dot probe that appeared in the location of one of the faces. In two experiments, older adults responded faster to the dot if it was presented on the same side as a neutral face than if it was presented on the same side as a negative face. Younger adults did not exhibit this attentional bias. Interactions of age and valence were also found for memory for the faces, with older adults remembering positive better than negative faces. These findings reveal that in their initial attention, older adults avoid negative information. This attentional bias is consistent with older adults' generally better emotional well-being and their tendency to remember negative less well than positive information.
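
    As a hypothetical sketch of how such a dot-probe attentional bias score can be computed (the trial fields and RT values below are invented, not the authors' data), faster responses to probes replacing the neutral face than the emotional face yield a positive score, indicating avoidance of the emotional face:

```python
# Illustrative dot-probe bias index: mean RT when the probe replaces the
# emotional face minus mean RT when it replaces the neutral face.
from statistics import mean

def attentional_bias(trials):
    """Bias = mean RT(probe at emotional face) - mean RT(probe at neutral face)."""
    rt_emotional = [t["rt"] for t in trials if t["probe_at"] == "emotional"]
    rt_neutral = [t["rt"] for t in trials if t["probe_at"] == "neutral"]
    return mean(rt_emotional) - mean(rt_neutral)

# Invented trials for an older adult (RTs in ms)
older = [
    {"probe_at": "neutral", "rt": 520}, {"probe_at": "neutral", "rt": 540},
    {"probe_at": "emotional", "rt": 585}, {"probe_at": "emotional", "rt": 600},
]
print(attentional_bias(older))  # positive -> avoidance of the emotional face
```

    A negative score would instead indicate vigilance toward the emotional face.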

  6. In Your Face: Startle to Emotional Facial Expressions Depends on Face Direction.

    PubMed

    Åsli, Ole; Michalsen, Henriette; Øvervoll, Morten

    2017-01-01

    Although faces are often included in the broad category of emotional visual stimuli, the affective impact of different facial expressions is not well documented. The present experiment investigated startle electromyographic responses to pictures of neutral, happy, angry, and fearful facial expressions, with a frontal face direction (directed) and at a 45° angle to the left (averted). Results showed that emotional facial expressions interact with face direction to produce startle potentiation: Greater responses were found for angry expressions, compared with fear and neutrality, with directed faces. When faces were averted, fear and neutrality produced larger responses compared with anger and happiness. These results are in line with the notion that startle is potentiated to stimuli signaling threat. That is, a forward directed angry face may signal a threat toward the observer, and a fearful face directed to the side may signal a possible threat in the environment.

  7. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion.

    PubMed

    Xiu, Daiming; Geiger, Maximilian J; Klaver, Peter

    2015-01-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive ("happy"), neutral and negative ("angry" or "fearful") faces. Dynamic Causal Modeling (DCM) was applied on the functional magnetic resonance imaging (fMRI) data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.
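
    The family-level model comparison described above can be sketched in miniature. This is an illustrative, simplified fixed-effects scheme, not the authors' SPM/DCM pipeline: each family's per-subject evidence pools its models' log-evidences under a uniform within-family prior, and log-evidences sum over subjects. The numbers are invented:

```python
# Toy fixed-effects family comparison over per-subject model log-evidences.
import math

def logsumexp(xs):
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def family_log_evidence(log_ev_per_subject):
    """log_ev_per_subject: list (subjects) of lists (models in the family)."""
    total = 0.0
    for model_log_evs in log_ev_per_subject:
        # uniform prior over the models within the family
        total += logsumexp(model_log_evs) - math.log(len(model_log_evs))
    return total

bottom_up = [[-102.0, -100.0], [-98.0, -97.5]]   # two models x two subjects
top_down = [[-105.0, -104.0], [-101.0, -100.0]]
print(family_log_evidence(bottom_up) > family_log_evidence(top_down))
```

    With these invented values the bottom-up family wins, mirroring the direction of the reported result.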

  8. Emotional content modulates response inhibition and perceptual processing.

    PubMed

    Yang, Suyong; Luo, Wenbo; Zhu, Xiangru; Broster, Lucas S; Chen, Taolin; Li, Jinzhen; Luo, Yuejia

    2014-11-01

    In this study, event-related potentials were used to investigate the effect of emotion on response inhibition. Participants performed an emotional go/no-go task that required responses to human faces associated with a "go" valence (i.e., emotional, neutral) and response inhibition to human faces associated with a "no-go" valence. Emotional content impaired response inhibition, as evidenced by decreased response accuracy and N2 amplitudes in no-go trials. More importantly, emotional expressions elicited larger N170 amplitudes than neutral expressions, and this effect was larger in no-go than in go trials, indicating that the perceptual processing of emotional expression had priority in inhibitory trials. In no-go trials, correlation analysis showed that increased N170 amplitudes were associated with decreased N2 amplitudes. Taken together, our findings suggest that emotional content impairs response inhibition due to the prioritization of emotional content processing. Copyright © 2014 Society for Psychophysiological Research.

  9. Acute pharmacologically induced shifts in serotonin availability abolish emotion-selective responses to negative face emotions in distinct brain networks.

    PubMed

    Grady, Cheryl L; Siebner, Hartwig R; Hornboll, Bettina; Macoveanu, Julian; Paulson, Olaf B; Knudsen, Gitte M

    2013-05-01

    Pharmacological manipulation of serotonin availability can alter the processing of facial expressions of emotion. Using a within-subject design, we measured the effect of serotonin on the brain's response to aversive face emotions with functional MRI while 20 participants judged the gender of neutral, fearful and angry faces. In three separate and counterbalanced sessions, participants received citalopram (CIT) to raise serotonin levels, underwent acute tryptophan depletion (ATD) to lower serotonin, or were studied without pharmacological challenge (Control). An analysis designed to identify distributed brain responses identified two brain networks with modulations of activity related to face emotion and serotonin level. The first network included the left amygdala, bilateral striatum, and fusiform gyri. During the Control session this network responded only to fearful faces; increasing serotonin decreased this response to fear, whereas reducing serotonin enhanced the response of this network to angry faces. The second network involved bilateral amygdala and ventrolateral prefrontal cortex, and these regions also showed increased activity to fear during the Control session. Both drug challenges enhanced the neural response of this set of regions to angry faces, relative to Control, and CIT also enhanced activity for neutral faces. The net effect of these changes in both networks was to abolish the selective response to fearful expressions. These results suggest that a normal level of serotonin is critical for maintaining a differentiated brain response to threatening face emotions. Lower serotonin leads to a broadening of a normally fear-specific response to anger, and higher levels reduce the differentiated brain response to aversive face emotions. Copyright © 2012 Elsevier B.V. and ECNP. All rights reserved.

  10. More than mere mimicry? The influence of emotion on rapid facial reactions to faces.

    PubMed

    Moody, Eric J; McIntosh, Daniel N; Mann, Laura J; Weisser, Kimberly R

    2007-05-01

    Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed.

  11. Fusiform Gyrus Dysfunction is Associated with Perceptual Processing Efficiency to Emotional Faces in Adolescent Depression: A Model-Based Approach.

    PubMed

    Ho, Tiffany C; Zhang, Shunan; Sacchet, Matthew D; Weng, Helen; Connolly, Colm G; Henje Blom, Eva; Han, Laura K M; Mobayed, Nisreen O; Yang, Tony T

    2016-01-01

    While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what are the underlying neural correlates. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., drift rate parameter of the LBA), the MDD group showed significantly reduced responses in left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
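
    A minimal simulation of the Linear Ballistic Accumulator can make the drift-rate ("perceptual processing efficiency") parameter concrete. This is not the authors' hierarchical Bayesian fit; the parameter values and the resampling of non-positive drifts are illustrative choices:

```python
# Toy LBA trial: each response alternative is a linear accumulator with a
# uniform start point in [0, A] and a normally distributed drift rate; the
# first accumulator to reach threshold b determines choice and RT.
import random

def lba_trial(drifts, A=0.5, b=1.0, s=0.3, t0=0.2, rng=random):
    """Simulate one trial; returns (choice_index, reaction_time_sec)."""
    finish = []
    for v in drifts:
        start = rng.uniform(0.0, A)
        d = rng.gauss(v, s)
        while d <= 0:                     # resample non-positive drifts
            d = rng.gauss(v, s)
        finish.append((b - start) / d)    # time to reach threshold
    rt = min(finish)
    return finish.index(rt), rt + t0      # add non-decision time

random.seed(1)
trials = [lba_trial([1.2, 0.6]) for _ in range(2000)]
acc = sum(1 for c, _ in trials if c == 0) / len(trials)
print(acc)  # the higher-drift accumulator wins most trials
```

    Raising the drift rate of the correct accumulator (higher processing efficiency) increases accuracy and shortens RTs, which is the quantity the fusiform signal was associated with.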

  12. Vicarious Social Touch Biases Gazing at Faces and Facial Emotions.

    PubMed

    Schirmer, Annett; Ng, Tabitha; Ebstein, Richard P

    2018-02-01

    Research has suggested that interpersonal touch promotes social processing and other-concern, and that women may respond to it more sensitively than men. In this study, we asked whether this phenomenon would extend to third-party observers who experience touch vicariously. In an eye-tracking experiment, participants (N = 64, 32 men and 32 women) viewed prime and target images with the intention of remembering them. Primes comprised line drawings of dyadic interactions with and without touch. Targets comprised two faces shown side-by-side, with one being neutral and the other being happy or sad. Analysis of prime fixations revealed that faces in touch interactions attracted longer gazing than faces in no-touch interactions. In addition, touch enhanced gazing at the area of touch in women but not men. Analysis of target fixations revealed that touch priming increased looking at both faces immediately after target onset, and subsequently, at the emotional face in the pair. Sex differences in target processing were nonsignificant. Together, the present results imply that vicarious touch biases visual attention to faces and promotes emotion sensitivity. In addition, they suggest that, compared with men, women are more aware of tactile exchanges in their environment. As such, vicarious touch appears to share important qualities with actual physical touch. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    PubMed

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.
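
    The two degradations can be illustrated in a few lines. This sketch uses block-averaging for resolution reduction and a 3 × 3 box blur as a stand-in low-pass filter (the study's exact blur filter is not reproduced here); the checkerboard "image" is invented:

```python
# Grayscale image as a 2D list of floats; two simple degradations.

def downsample(img, factor):
    """Reduce resolution by averaging factor x factor blocks."""
    h, w = len(img), len(img[0])
    return [
        [
            sum(img[y + dy][x + dx] for dy in range(factor) for dx in range(factor))
            / factor ** 2
            for x in range(0, w - factor + 1, factor)
        ]
        for y in range(0, h - factor + 1, factor)
    ]

def box_blur(img):
    """3x3 mean filter on interior pixels (edges left unfiltered)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9
    return out

img = [[float((x + y) % 2) for x in range(8)] for y in range(8)]  # checkerboard
small = downsample(img, 2)
blurred = box_blur(img)
print(len(small), len(small[0]))
```

    Both operations discard high spatial frequencies; the abstract's finding is that expression judgements tolerate a surprising amount of such loss before accuracy and gaze behaviour change.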

  14. Attentional bias for emotional faces in paediatric anxiety disorders: an investigation using the emotional Go/No Go task.

    PubMed

    Waters, Allison M; Valvoi, Jaya S

    2009-06-01

    The present study examined contextual modulation of attentional control processes in paediatric anxiety disorders. Anxious children (N=20) and non-anxious controls (N=20) completed an emotional Go/No Go task in which they responded on some trials (i.e., Go trials) when neutral faces were presented amongst either angry or happy faces to which children avoided responding (i.e., No Go trials) or when angry and happy faces were presented as Go trials and children avoided responding to neutral faces. Anxious girls were slower responding to neutral faces with embedded angry compared with happy face No Go trials whereas non-anxious girls were slower responding to neutral faces with embedded happy versus angry face No Go trials. Anxious and non-anxious boys showed the same basic pattern as non-anxious girls. There were no significant group differences on No Go trials or when the emotional faces were presented as Go trials. Results are discussed in terms of selective interference by angry faces in the control of attention in anxious girls.

  15. Amygdala and whole-brain activity to emotional faces distinguishes major depressive disorder and bipolar disorder.

    PubMed

    Fournier, Jay C; Keener, Matthew T; Almeida, Jorge; Kronhaus, Dina M; Phillips, Mary L

    2013-11-01

    It can be clinically difficult to distinguish depressed individuals with bipolar disorder (BD) and major depressive disorder (MDD). To examine potential biomarkers of difference between the two disorders, the current study examined differences in the functioning of emotion-processing neural regions during a dynamic emotional faces task. During functional magnetic resonance imaging, healthy control adults (HC) (n = 29) and depressed adults with MDD (n = 30) and BD (n = 22) performed an implicit emotional-faces task in which they identified a color label superimposed on neutral faces that dynamically morphed into one of four emotional faces (angry, fearful, sad, happy). We compared neural activation between the groups in an amygdala region-of-interest and at the whole-brain level. Adults with MDD showed significantly greater activity than adults with BD in the left amygdala to the anger condition (p = 0.01). Results of whole-brain analyses (at p < 0.005, k ≥ 20) revealed that adults with BD showed greater activity to sad faces in temporoparietal regions, primarily in the left hemisphere, whereas individuals with MDD demonstrated greater activity than those with BD to displays of anger, fear, and happiness. Many of the observed BD-MDD differences represented abnormalities in functioning compared to HC. We observed a dissociation between depressed adults with BD and MDD in the processing of emerging emotional faces. Those with BD showed greater activity during mood-congruent (i.e., sad) faces, whereas those with MDD showed greater activity for mood-incongruent (i.e., fear, anger, and happy) faces. Such findings may reflect markers of differences between BD and MDD depression in underlying pathophysiological processes. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Priming the Secure Attachment Schema Affects the Emotional Face Processing Bias in Attachment Anxiety: An fMRI Research

    PubMed Central

    Tang, Qingting; Chen, Xu; Hu, Jia; Liu, Ying

    2017-01-01

    Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions, the secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants’ reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual’s processing of positive emotional faces; for instance, the presentation of the partner’s name was associated with stronger activities in a wide range of brain regions and faster reaction times for positive facial expressions in the subjects. The current finding of higher activity in the left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of the activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment secure priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and early-stage information processing system (attention), given the increased attention toward the effects of secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the following study has applications in providing…

  19. Association between amygdala response to emotional faces and social anxiety in autism spectrum disorders.

    PubMed

    Kleinhans, Natalia M; Richards, Todd; Weaver, Kurt; Johnson, L Clark; Greenson, Jessica; Dawson, Geraldine; Aylward, Elizabeth

    2010-10-01

    Difficulty interpreting facial expressions has been reported in autism spectrum disorders (ASD) and is thought to be associated with amygdala abnormalities. To further explore the neural basis of abnormal emotional face processing in ASD, we conducted an fMRI study of emotional face matching in high-functioning adults with ASD and age, IQ, and gender matched controls. In addition, we investigated whether there was a relationship between self-reported social anxiety and fMRI activation. During fMRI scanning, study participants were instructed to match facial expressions depicting fear or anger. The control condition was a comparable shape-matching task. The control group evidenced significantly increased left prefrontal activation and decreased activation in the occipital lobes compared to the ASD group during emotional face matching. Further, within the ASD group, greater social anxiety was associated with increased activation in right amygdala and left middle temporal gyrus, and decreased activation in the fusiform face area. These results indicate that level of social anxiety mediates the neural response to emotional face perception in ASD. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. Emotion-Color Associations in the Context of the Face.

    PubMed

    Thorstenson, Christopher A; Elliot, Andrew J; Pazda, Adam D; Perrett, David I; Xiao, Dengke

    2017-11-27

    Facial expressions of emotion contain important information that is perceived and used by observers to understand others' emotional state. While there has been considerable research into perceptions of facial musculature and emotion, less work has been conducted to understand perceptions of facial coloration and emotion. The current research examined emotion-color associations in the context of the face. Across 4 experiments, participants were asked to manipulate the color of face, or shape, stimuli along 2 color axes (i.e., red-green, yellow-blue) for 6 target emotions (i.e., anger, disgust, fear, happiness, sadness, surprise). The results yielded a pattern that is consistent with physiological and psychological models of emotion. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    PubMed

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assumed that empathy influences cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  2. Perceptual integration of faces and voices depends on the interaction of emotional content and spatial frequency.

    PubMed

    Kokinous, Jenny; Tavano, Alessandro; Kotz, Sonja A; Schröger, Erich

    2017-02-01

The role of spatial frequencies (SF) is highly debated in emotion perception, but previous work suggests the importance of low SFs for detecting emotion in faces. Furthermore, emotion perception essentially relies on the rapid integration of multimodal information from faces and voices. We used EEG to test the functional relevance of SFs in the integration of emotional and non-emotional audiovisual stimuli. While viewing dynamic face-voice pairs, participants were asked to identify auditory interjections, and the electroencephalogram (EEG) was recorded. Audiovisual integration was measured as auditory facilitation, indexed by the extent of the auditory N1 amplitude suppression in the audiovisual compared to an auditory-only condition. We found an interaction of SF filtering and emotion in the auditory response suppression. For neutral faces, larger N1 suppression ensued in the unfiltered and high SF conditions as compared to the low SF condition. Angry face perception led to a larger N1 suppression in the low SF condition. While the results for the neutral faces indicate that perceptual quality in terms of SF content plays a major role in audiovisual integration, the results for angry faces suggest that early multisensory integration of emotional information favors low SF neural processing pathways, overruling the predictive value of the visual signal per se. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Emotional facial expressions differentially influence predictions and performance for face recognition.

    PubMed

    Nomi, Jason S; Rhodes, Matthew G; Cleary, Anne M

    2013-01-01

    This study examined how participants' predictions of future memory performance are influenced by emotional facial expressions. Participants made judgements of learning (JOLs) predicting the likelihood that they would correctly identify a face displaying a happy, angry, or neutral emotional expression in a future two-alternative forced-choice recognition test of identity (i.e., recognition that a person's face was seen before). JOLs were higher for studied faces with happy and angry emotional expressions than for neutral faces. However, neutral test faces with studied neutral expressions had significantly higher identity recognition rates than neutral test faces studied with happy or angry expressions. Thus, these data are the first to demonstrate that people believe happy and angry emotional expressions will lead to better identity recognition in the future relative to neutral expressions. This occurred despite the fact that neutral expressions elicited better identity recognition than happy and angry expressions. These findings contribute to the growing literature examining the interaction of cognition and emotion.

  4. Visual search for facial expressions of emotions: a comparison of dynamic and static faces.

    PubMed

    Horstmann, Gernot; Ansorge, Ulrich

    2009-02-01

    A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (c) 2009 APA, all rights reserved

  5. Preschool negative emotionality predicts activity and connectivity of the fusiform face area and amygdala in later childhood.

    PubMed

    Kann, Sarah J; O'Rawe, Jonathan F; Huang, Anna S; Klein, Daniel N; Leung, Hoi-Chung

    2017-09-01

    Negative emotionality (NE) refers to individual differences in the propensity to experience and react with negative emotions and is associated with increased risk of psychological disorder. However, research on the neural bases of NE has focused almost exclusively on amygdala activity during emotional face processing. This study broadened this framework by examining the relationship between observed NE in early childhood and subsequent neural responses to emotional faces in both the amygdala and the fusiform face area (FFA) in a late childhood/early adolescent sample. Measures of NE were obtained from children at age 3 using laboratory observations, and functional magnetic resonance imaging (fMRI) data were collected when these children were between the ages of 9 and 12 while performing a visual stimulus identity matching task with houses and emotional faces as stimuli. Multiple regression analyses revealed that higher NE at age 3 is associated with significantly greater activation in the left amygdala and left FFA but lower functional connectivity between these two regions during the face conditions. These findings suggest that those with higher early NE have subsequent alterations in both activity and connectivity within an extended network during face processing. © The Author (2017). Published by Oxford University Press.

  6. Image-based Analysis of Emotional Facial Expressions in Full Face Transplants.

    PubMed

    Bedeloglu, Merve; Topcu, Çagdas; Akgul, Arzu; Döger, Ela Naz; Sever, Refik; Ozkan, Ozlenen; Ozkan, Omer; Uysal, Hilmi; Polat, Ovunc; Çolak, Omer Halil

    2018-01-20

In this study, we aimed to determine from photographs the degree to which full face transplant patients have developed emotional expression, so that a rehabilitation process can later be planned according to that assessment. As anticipated, in full face transplant cases expressions may be confused or may not be achieved to the level of a healthy control group. For the image-based analysis, a control group of 9 healthy males and 2 full-face transplant patients participated in the study. Appearance-based Gabor Wavelet Transform (GWT) and Local Binary Pattern (LBP) methods were adopted for recognizing the neutral expression and 6 emotional expressions: angry, scared, happy, hate, confused, and sad. Feature extraction was carried out using each method alone and a serial combination of the two. Features extracted from the most distinctive zones of the facial area, the eye and mouth regions, were used to classify the emotions, and the combination of these region features was used to improve classifier performance. The ability of control subjects and transplant patients to perform emotional expressions was determined with a K-nearest neighbor (KNN) classifier with region-specific and method-specific decision stages, and the patients' results were compared with those of the healthy group. It was observed that the transplant patients do not reflect some emotional expressions, and that there were confusions among expressions.
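The LBP-plus-KNN pipeline described in this record can be sketched in a few lines. This is an illustrative example, not the authors' code: it uses a basic 8-neighbor Local Binary Pattern histogram as the feature vector and a 1-nearest-neighbor classifier, omitting the Gabor Wavelet Transform features, the eye/mouth region cropping, and the region- and method-specific decision stages of the study.

```python
def lbp_histogram(img):
    """256-bin histogram of 8-neighbor LBP codes for a 2D grayscale
    image given as a list of lists of ints (border pixels skipped)."""
    h, w = len(img), len(img[0])
    hist = [0] * 256
    # offsets of the 8 neighbors, clockwise from top-left
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                # set the bit if the neighbor is at least as bright
                if img[y + dy][x + dx] >= c:
                    code |= 1 << bit
            hist[code] += 1
    return hist

def nn_classify(train, query_hist):
    """1-nearest-neighbor over (histogram, label) pairs,
    using squared Euclidean distance between histograms."""
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(train, key=lambda t: dist(t[0], query_hist))[1]
```

With histograms computed for labeled training patches (e.g. mouth regions for each expression), `nn_classify` assigns a query patch the label of its closest training histogram.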

  7. Amygdala atrophy affects emotion-related activity in face-responsive regions in frontotemporal degeneration.

    PubMed

    De Winter, François-Laurent; Van den Stock, Jan; de Gelder, Beatrice; Peeters, Ronald; Jastorff, Jan; Sunaert, Stefan; Vanduffel, Wim; Vandenberghe, Rik; Vandenbulcke, Mathieu

    2016-09-01

    In the healthy brain, modulatory influences from the amygdala commonly explain enhanced activation in face-responsive areas by emotional facial expressions relative to neutral expressions. In the behavioral variant frontotemporal dementia (bvFTD) facial emotion recognition is impaired and has been associated with atrophy of the amygdala. By combining structural and functional MRI in 19 patients with bvFTD and 20 controls we investigated the neural effects of emotion in face-responsive cortex and its relationship with amygdalar gray matter (GM) volume in neurodegeneration. Voxel-based morphometry revealed decreased GM volume in anterior medio-temporal regions including amygdala in patients compared to controls. During fMRI, we presented dynamic facial expressions (fear and chewing) and their spatiotemporally scrambled versions. We found enhanced activation for fearful compared to neutral faces in ventral temporal cortex and superior temporal sulcus in controls, but not in patients. In the bvFTD group left amygdalar GM volume correlated positively with emotion-related activity in left fusiform face area (FFA). This correlation was amygdala-specific and driven by GM in superficial and basolateral (BLA) subnuclei, consistent with reported amygdalar-cortical networks. The data suggests that anterior medio-temporal atrophy in bvFTD affects emotion processing in distant posterior areas. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    PubMed

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Flexible and inflexible task sets: asymmetric interference when switching between emotional expression, sex, and age classification of perceived faces.

    PubMed

    Schuch, Stefanie; Werheid, Katja; Koch, Iring

    2012-01-01

    The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set "inertia") than processing the age or sex of a face.

  10. Face processing in chronic alcoholism: a specific deficit for emotional features.

    PubMed

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specifically for emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) Basic visuo-spatial and facial identity processing; (2) Simple reaction times; (3) Complex facial features identification (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients had a preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing down is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting a specificity of this deficit for emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.

  11. The time course of face processing: startle eyeblink response modulation by face gender and expression.

    PubMed

    Duval, Elizabeth R; Lovelace, Christopher T; Aarant, Justin; Filion, Diane L

    2013-12-01

    The purpose of this study was to investigate the effects of both facial expression and face gender on startle eyeblink response patterns at varying lead intervals (300, 800, and 3500ms) indicative of attentional and emotional processes. We aimed to determine whether responses to affective faces map onto the Defense Cascade Model (Lang et al., 1997) to better understand the stages of processing during affective face viewing. At 300ms, there was an interaction between face expression and face gender with female happy and neutral faces and male angry faces producing inhibited startle. At 3500ms, there was a trend for facilitated startle during angry compared to neutral faces. These findings suggest that affective expressions are perceived differently in male and female faces, especially at short lead intervals. Future studies investigating face processing should take both face gender and expression into account. © 2013.

  13. Auditory Emotion Word Primes Influence Emotional Face Categorization in Children and Adults, but Not Vice Versa.

    PubMed

    Vesker, Michael; Bahn, Daniela; Kauschke, Christina; Tschense, Monika; Degé, Franziska; Schwarzer, Gudrun

    2018-01-01

    In order to assess how the perception of audible speech and facial expressions influence one another for the perception of emotions, and how this influence might change over the course of development, we conducted two cross-modal priming experiments with three age groups of children (6-, 9-, and 12-years old), as well as college-aged adults. In Experiment 1, 74 children and 24 adult participants were tasked with categorizing photographs of emotional faces as positive or negative as quickly as possible after being primed with emotion words presented via audio in valence-congruent and valence-incongruent trials. In Experiment 2, 67 children and 24 adult participants carried out a similar categorization task, but with faces acting as visual primes, and emotion words acting as auditory targets. The results of Experiment 1 showed that participants made more errors when categorizing positive faces primed by negative words versus positive words, and that 6-year-old children are particularly sensitive to positive word primes, giving faster correct responses regardless of target valence. Meanwhile, the results of Experiment 2 did not show any congruency effects for priming by facial expressions. Thus, audible emotion words seem to exert an influence on the emotional categorization of faces, while faces do not seem to influence the categorization of emotion words in a significant way.

  14. Emotional face recognition in adolescent suicide attempters and adolescents engaging in non-suicidal self-injury.

    PubMed

    Seymour, Karen E; Jones, Richard N; Cushman, Grace K; Galvan, Thania; Puzia, Megan E; Kim, Kerri L; Spirito, Anthony; Dickstein, Daniel P

    2016-03-01

    Little is known about the bio-behavioral mechanisms underlying and differentiating suicide attempts from non-suicidal self-injury (NSSI) in adolescents. Adolescents who attempt suicide or engage in NSSI often report significant interpersonal and social difficulties. Emotional face recognition ability is a fundamental skill required for successful social interactions, and deficits in this ability may provide insight into the unique brain-behavior interactions underlying suicide attempts versus NSSI in adolescents. Therefore, we examined emotional face recognition ability among three mutually exclusive groups: (1) inpatient adolescents who attempted suicide (SA, n = 30); (2) inpatient adolescents engaged in NSSI (NSSI, n = 30); and (3) typically developing controls (TDC, n = 30) without psychiatric illness. Participants included adolescents aged 13-17 years, matched on age, gender and full-scale IQ. Emotional face recognition was evaluated using the diagnostic assessment of nonverbal accuracy (DANVA-2). Compared to TDC youth, adolescents with NSSI made more errors on child fearful and adult sad face recognition while controlling for psychopathology and medication status (ps < 0.05). No differences were found on emotional face recognition between NSSI and SA groups. Secondary analyses showed that compared to inpatients without major depression, those with major depression made fewer errors on adult sad face recognition even when controlling for group status (p < 0.05). Further, compared to inpatients without generalized anxiety, those with generalized anxiety made fewer recognition errors on adult happy faces even when controlling for group status (p < 0.05). Adolescent inpatients engaged in NSSI showed greater deficits in emotional face recognition than TDC, but not inpatient adolescents who attempted suicide. Further results suggest the importance of psychopathology in emotional face recognition. 
Replication of these preliminary results and examination of the role

  15. Human versus Non-Human Face Processing: Evidence from Williams Syndrome

    ERIC Educational Resources Information Center

    Santos, Andreia; Rosset, Delphine; Deruelle, Christine

    2009-01-01

    Increased motivation towards social stimuli in Williams syndrome (WS) led us to hypothesize that a face's human status would have greater impact than face's orientation on WS' face processing abilities. Twenty-nine individuals with WS were asked to categorize facial emotion expressions in real, human cartoon and non-human cartoon faces presented…

  16. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.

    PubMed

    Stephan, Blossom Christa Maree; Breen, Nora; Caine, Diana

    2006-11-01

Prosopagnosia is currently viewed within the constraints of two competing theories of face recognition, one highlighting the analysis of features, the other focusing on configural processing of the whole face. This study investigated the role of feature analysis versus whole-face configural processing in the recognition of facial expression. A prosopagnosic patient, SC, made expression decisions from whole and incomplete (eyes-only and mouth-only) faces where features had been obscured. SC was impaired at recognizing some (e.g., anger, sadness, and fear), but not all (e.g., happiness) emotional expressions from the whole face. Analyses of his performance on incomplete faces indicated that his recognition of some expressions actually improved relative to his performance on the whole-face condition. We argue that in SC interference from damaged configural processes seems to override an intact ability to utilize part-based or local feature cues.

  17. The neural representation of emotionally neutral faces and places in patients with panic disorder with agoraphobia.

    PubMed

    Petrowski, Katja; Wintermann, Gloria; Smolka, Michael N; Huebner, Thomas; Donix, Markus

    2014-01-01

Panic disorder with agoraphobia (PD-A) has been associated with abnormal neural activity for threat-related stimuli (faces, places). Recent findings suggest a disturbed neural processing of emotionally neutral stimuli at a more general level. Using functional magnetic resonance imaging (fMRI) we investigated the neural processing of emotionally neutral faces and places in PD-A. Fifteen patients with PD-A and fifteen healthy subjects participated in the study. When they perceived neutral faces and places, the patients with PD-A showed significantly less brain activity in the fusiform gyrus, the inferior occipital gyrus, the calcarine gyrus, the cerebellum, and the cuneus compared with the healthy controls. However, the patients with PD-A showed significantly more brain activity in the precuneus compared with control subjects. It was not possible to distinguish the agoraphobia-associated effects from possible contributions due to general anxiety induced by fMRI. For future investigations, an additional clinical control group with patients suffering from panic disorder without agoraphobia would be of interest. In addition, the psychopathology concerning the agoraphobic symptoms needs to be investigated in more detail. The findings suggest altered neural processing of emotionally neutral faces and places in patients with PD-A. Reduced neural activity in different brain regions may indicate difficulties in recognizing the emotional content in face and place stimuli due to anxiety-related hyper-arousal. © 2013 Published by Elsevier B.V.

  18. Detection of emotional faces: salient physical features guide effective visual search.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2008-08-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent, surprised and disgusted faces was found both under upright and inverted display conditions. Inversion slowed down the detection of these faces less than that of others (fearful, angry, and sad). Accordingly, the detection advantage involves processing of featural rather than configural information. The facial features responsible for the detection advantage are located in the mouth rather than the eye region. Computationally modeled visual saliency predicted both attentional orienting and detection. Saliency was greatest for the faces (happy) and regions (mouth) that were fixated earlier and detected faster, and there was close correspondence between the onset of the modeled saliency peak and the time at which observers initially fixated the faces. The authors conclude that visual saliency of specific facial features--especially the smiling mouth--is responsible for facilitated initial orienting, which thus shortens detection. (PsycINFO Database Record (c) 2008 APA, all rights reserved).
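As a rough illustration of what "computationally modeled visual saliency" can mean, the sketch below scores each pixel by its center-surround contrast: the absolute difference between a pixel and the mean of its surrounding window. This is a hypothetical, much-reduced stand-in for the saliency model used in the study, which combines multiple feature channels and spatial scales.

```python
def saliency_map(img, radius=1):
    """Center-surround contrast for a 2D grayscale image (list of lists):
    saliency(y, x) = |img[y][x] - mean of its surround window|."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, n = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    if dy == 0 and dx == 0:
                        continue  # exclude the center pixel itself
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        total += img[yy][xx]
                        n += 1
            out[y][x] = abs(img[y][x] - total / n)
    return out
```

On a face image, locally high-contrast features such as a smiling mouth would produce peaks in such a map; in a toy image, a single bright pixel on a dark background is the most salient location.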

  19. Childhood Poverty Predicts Adult Amygdala and Frontal Activity and Connectivity in Response to Emotional Faces.

    PubMed

    Javanbakht, Arash; King, Anthony P; Evans, Gary W; Swain, James E; Angstadt, Michael; Phan, K Luan; Liberzon, Israel

    2015-01-01

    Childhood poverty negatively impacts physical and mental health in adulthood. Altered brain development in response to social and environmental factors associated with poverty likely contributes to this effect, engendering maladaptive patterns of social attribution and/or elevated physiological stress. In this fMRI study, we examined the association between childhood poverty and neural processing of social signals (i.e., emotional faces) in adulthood. Fifty-two subjects from a longitudinal prospective study recruited as children, participated in a brain imaging study at 23-25 years of age using the Emotional Faces Assessment Task. Childhood poverty, independent of concurrent adult income, was associated with higher amygdala and medial prefrontal cortical (mPFC) responses to threat vs. happy faces. Also, childhood poverty was associated with decreased functional connectivity between left amygdala and mPFC. This study is unique, because it prospectively links childhood poverty to emotional processing during adulthood, suggesting a candidate neural mechanism for negative social-emotional bias. Adults who grew up poor appear to be more sensitive to social threat cues and less sensitive to positive social cues.

  20. ERPs reveal subliminal processing of fearful faces.

    PubMed

    Kiss, Monika; Eimer, Martin

    2008-03-01

    To investigate whether facial expression is processed in the absence of conscious awareness, ERPs were recorded in a task in which participants had to identify the expression of masked fearful and neutral target faces. On supraliminal trials (200 ms target duration), in which identification performance was high, a sustained positivity to fearful versus neutral target faces started 140 ms after target face onset. On subliminal trials (8 ms target duration), identification performance was at chance level, but ERPs still showed systematic fear-specific effects. An early positivity to fearful target faces was present but smaller than on supraliminal trials. A subsequent enhanced N2 to fearful faces was only present for subliminal trials. In contrast, a P3 enhancement to fearful faces was observed on supraliminal but not subliminal trials. Results demonstrate rapid emotional expression processing in the absence of awareness.

  2. Neurocognitive mechanisms of gaze-expression interactions in face processing and social attention

    PubMed Central

    Graham, Reiko; LaBar, Kevin S.

    2012-01-01

    The face conveys a rich source of non-verbal information used during social communication. While research has revealed how specific facial channels such as emotional expression are processed, little is known about the prioritization and integration of multiple cues in the face during dyadic exchanges. Classic models of face perception have emphasized the segregation of dynamic versus static facial features along independent information processing pathways. Here we review recent behavioral and neuroscientific evidence suggesting that within the dynamic stream, concurrent changes in eye gaze and emotional expression can yield early independent effects on face judgments and covert shifts of visuospatial attention. These effects are partially segregated within initial visual afferent processing volleys, but are subsequently integrated in limbic regions such as the amygdala or via reentrant visual processing volleys. This spatiotemporal pattern may help to resolve otherwise perplexing discrepancies across behavioral studies of emotional influences on gaze-directed attentional cueing. Theoretical explanations of gaze-expression interactions are discussed, with special consideration of speed-of-processing (discriminability) and contextual (ambiguity) accounts. Future research in this area promises to reveal the mental chronometry of face processing and interpersonal attention, with implications for understanding how social referencing develops in infancy and is impaired in autism and other disorders of social cognition. PMID:22285906

  3. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    PubMed

    Nelson, Nicole L; Russell, James A

    2016-01-01

In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks--choice from array and free labeling--to support the conclusion that children recognize basic emotions from facial expressions. Here, five studies (N = 120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition when a novel nonsense facial expression is included in the array: children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination: they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Repetition suppression of faces is modulated by emotion

    NASA Astrophysics Data System (ADS)

    Ishai, Alumit; Pessoa, Luiz; Bikle, Philip C.; Ungerleider, Leslie G.

    2004-06-01

Single-unit recordings and functional brain imaging studies have shown reduced neural responses to repeated stimuli in the visual cortex. By using event-related functional MRI, we compared the activation evoked by repetitions of neutral and fearful faces, which were either task relevant (targets) or irrelevant (distracters). We found that within the inferior occipital gyri, lateral fusiform gyri, superior temporal sulci, amygdala, and the inferior frontal gyri/insula, targets evoked stronger responses than distracters and their repetition was associated with significantly reduced responses. Repetition suppression, as manifested by the difference in response amplitude between the first and third repetitions of a target, was stronger for fearful than neutral faces. Distracter faces, regardless of their repetition or valence, evoked negligible activation, indicating top-down attenuation of behaviorally irrelevant stimuli. Our findings demonstrate a three-way interaction between emotional valence, repetition, and task relevance and suggest that repetition suppression is influenced by high-level cognitive processes in the human brain.

  5. Neurofunctional Underpinnings of Audiovisual Emotion Processing in Teens with Autism Spectrum Disorders

    PubMed Central

    Doyle-Thomas, Krissy A.R.; Goldberg, Jeremy; Szatmari, Peter; Hall, Geoffrey B.C.

    2013-01-01

    Despite successful performance on some audiovisual emotion tasks, hypoactivity has been observed in frontal and temporal integration cortices in individuals with autism spectrum disorders (ASD). Little is understood about the neurofunctional network underlying this ability in individuals with ASD. Research suggests that there may be processing biases in individuals with ASD, based on their ability to obtain meaningful information from the face and/or the voice. This functional magnetic resonance imaging study examined brain activity in teens with ASD (n = 18) and typically developing controls (n = 16) during audiovisual and unimodal emotion processing. Teens with ASD had a significantly lower accuracy when matching an emotional face to an emotion label. However, no differences in accuracy were observed between groups when matching an emotional voice or face-voice pair to an emotion label. In both groups brain activity during audiovisual emotion matching differed significantly from activity during unimodal emotion matching. Between-group analyses of audiovisual processing revealed significantly greater activation in teens with ASD in a parietofrontal network believed to be implicated in attention, goal-directed behaviors, and semantic processing. In contrast, controls showed greater activity in frontal and temporal association cortices during this task. These results suggest that in the absence of engaging integrative emotional networks during audiovisual emotion matching, teens with ASD may have recruited the parietofrontal network as an alternate compensatory system. PMID:23750139

  6. Amygdala excitability to subliminally presented emotional faces distinguishes unipolar and bipolar depression: an fMRI and pattern classification study.

    PubMed

    Grotegerd, Dominik; Stuhrmann, Anja; Kugel, Harald; Schmidt, Simone; Redlich, Ronny; Zwanzger, Peter; Rauch, Astrid Veronika; Heindel, Walter; Zwitserlood, Pienie; Arolt, Volker; Suslow, Thomas; Dannlowski, Udo

    2014-07-01

Bipolar disorder and major depressive disorder are difficult to differentiate during depressive episodes, motivating the search for differentiating neurobiological markers. Dysfunctional amygdala responsiveness during emotion processing has been implicated in both disorders, but the important rapid and automatic stages of emotion processing in the amygdala have not yet been investigated in bipolar patients. fMRI data of 22 bipolar depressed patients (BD), 22 matched unipolar depressed patients (MDD), and 22 healthy controls (HC) were obtained during processing of subliminal sad, happy and neutral faces. Amygdala responsiveness was investigated using standard univariate analyses as well as pattern-recognition techniques to differentiate the two clinical groups. Furthermore, medication effects on amygdala responsiveness were explored. All subjects were unaware of the emotional faces. Univariate analysis revealed a significant group × emotion interaction within the left amygdala. Amygdala responsiveness to sad>neutral faces was increased in MDD relative to BD. In contrast, responsiveness to happy>neutral faces showed the opposite pattern, with higher amygdala activity in BD than in MDD. Most of the activation patterns in both clinical groups differed significantly from activation patterns of HC--and therefore represent abnormalities. Furthermore, pattern classification on amygdala activation to sad>happy faces yielded almost 80% accuracy differentiating MDD and BD patients. Medication had no significant effect on these findings. Distinct amygdala excitability during automatic stages of the processing of emotional faces may reflect differential pathophysiological processes in BD versus MDD depression, potentially representing diagnosis-specific neural markers mostly unaffected by current psychotropic medication. Copyright © 2013 Wiley Periodicals, Inc.
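The pattern-classification result above (almost 80% accuracy from amygdala activation) can be illustrated with a generic leave-one-out decoding sketch. This is not the authors' pipeline: the feature values below are simulated draws from Gaussians, and only the group sizes (22 per group) are taken from the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Simulated per-subject amygdala responses to a sad>happy contrast
# (hypothetical voxel-wise values); 22 MDD vs. 22 BD subjects as in the study.
n_per_group, n_voxels = 22, 50
mdd = rng.normal(loc=0.3, scale=1.0, size=(n_per_group, n_voxels))   # invented
bd = rng.normal(loc=-0.3, scale=1.0, size=(n_per_group, n_voxels))   # invented

X = np.vstack([mdd, bd])
y = np.array([0] * n_per_group + [1] * n_per_group)  # 0 = MDD, 1 = BD

# Leave-one-out cross-validated accuracy of a linear support-vector classifier.
acc = cross_val_score(SVC(kernel="linear"), X, y, cv=LeaveOneOut()).mean()
print(f"classification accuracy: {acc:.2f}")
```

With real data, the feature vectors would be per-subject amygdala contrast estimates rather than simulated values; the leave-one-out scheme mirrors the standard approach for small clinical samples.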

  7. The role of spatial frequency information for ERP components sensitive to faces and emotional facial expression.

    PubMed

    Holmes, Amanda; Winston, Joel S; Eimer, Martin

    2005-10-01

    To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, houses, and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was neither affected by emotional facial expression nor by spatial frequency information.

  8. The interrelations between verbal working memory and visual selection of emotional faces.

    PubMed

    Grecucci, Alessandro; Soto, David; Rumiati, Raffaella Ida; Humphreys, Glyn W; Rotshtein, Pia

    2010-06-01

Working memory (WM) and visual selection processes interact in a reciprocal fashion based on overlapping representations abstracted from the physical characteristics of stimuli. Here, we assessed the neural basis of this interaction using facial expressions that conveyed emotion information. Participants memorized an emotional word for a later recognition test and then searched for a face of a particular gender presented in a display with two faces that differed in gender and expression. The relation between the emotional word and the expressions of the target and distractor faces was varied. RTs for the memory test were faster when the target face matched the emotional word held in WM (on valid trials) relative to when the emotional word matched the expression of the distractor (on invalid trials). There was also enhanced activation on valid compared with invalid trials in the lateral orbital gyrus, superior frontal polar (BA 10), lateral occipital sulcus, and pulvinar. Re-presentation of the WM stimulus in the search display led to an earlier onset of activity in the superior and inferior frontal gyri and the anterior hippocampus irrespective of the search validity of the re-presented stimulus. The data indicate that the medial temporal and prefrontal cortices are sensitive to the reappearance of stimuli that are held in WM, whereas a fronto-thalamic occipital network is sensitive to the behavioral significance of the match between WM and targets for selection. We conclude that these networks are modulated by high-level matches between the contents of WM, behavioral goals, and current sensory input.

  9. A motivational determinant of facial emotion recognition: regulatory focus affects recognition of emotions in faces.

    PubMed

    Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka

    2014-01-01

Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks - such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation duration on the face, which reflects a pattern of attention allocation matched to the eager strategy in a promotion focus (i.e., striving to make hits). A prevention focus affected neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.

  10. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    PubMed

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad (BSF), high (HSF), and low (LSF) spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad-spatial-frequency-dependent processing in P100 amplitude. VPP amplitude was significantly increased in high and broad spatial frequency, compared to low spatial frequency, in panic disorder. EPN amplitude was significantly different between HSF and BSF, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that unique spatial-frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
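Component amplitudes such as those analyzed above are typically quantified as the mean or peak voltage within a component-specific latency window of the epoched EEG. A minimal sketch of that step follows; the single-channel epoch, the injected P100-like deflection, and the latency windows are all invented for illustration, not taken from the study.

```python
import numpy as np

# Simulated single-channel ERP epoch: 600 ms at 1000 Hz, time-locked to face onset.
sfreq = 1000
times = np.arange(0, 600) / sfreq * 1000.0  # in ms
rng = np.random.default_rng(1)
epoch = rng.normal(0.0, 0.5, size=times.size)  # background noise (µV)
# Inject a hypothetical P100-like positivity peaking around 100 ms.
epoch += 3.0 * np.exp(-0.5 * ((times - 100.0) / 15.0) ** 2)

def peak_amplitude(epoch, times, tmin, tmax, polarity=+1):
    """Peak amplitude and latency within a latency window (ms).
    polarity=+1 for positive components (e.g., P100, VPP, P300),
    polarity=-1 for negative ones (e.g., N170, N250, EPN)."""
    mask = (times >= tmin) & (times <= tmax)
    window = polarity * epoch[mask]
    i = np.argmax(window)               # extremum in the chosen direction
    return polarity * window[i], times[mask][i]

# Hypothetical windows for two of the components named above.
amp_p100, lat_p100 = peak_amplitude(epoch, times, 80, 130, polarity=+1)
amp_n170, lat_n170 = peak_amplitude(epoch, times, 140, 200, polarity=-1)
print(f"P100: {amp_p100:.2f} µV at {lat_p100:.0f} ms")
print(f"N170: {amp_n170:.2f} µV at {lat_n170:.0f} ms")
```

In practice these amplitudes would be averaged across trials and electrodes of interest before the group × spatial-frequency statistics are run.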

  11. The serotonin transporter gene polymorphism and the effect of baseline on amygdala response to emotional faces.

    PubMed

    von dem Hagen, Elisabeth A H; Passamonti, Luca; Nutland, Sarah; Sambrook, Jennifer; Calder, Andrew J

    2011-03-01

    Previous research has found that a common polymorphism in the serotonin transporter gene (5-HTTLPR) is an important mediator of individual differences in brain responses associated with emotional behaviour. In particular, relative to individuals homozygous for the l-allele, carriers of the s-allele display heightened amygdala activation to emotional compared to non-emotional stimuli. However, there is some debate as to whether this difference is driven by increased activation to emotional stimuli, resting baseline differences between the groups, or decreased activation to neutral stimuli. We performed functional imaging during an implicit facial expression processing task in which participants viewed angry, sad and neutral faces. In addition to neutral faces, we included two further baseline conditions, houses and fixation. We found increased amygdala activation in s-allele carriers relative to l-homozygotes in response to angry faces compared to neutral faces, houses and fixation. When comparing neutral faces to houses or fixation, we found no significant difference in amygdala response between the two groups. In addition, there was no significant difference between the groups in response to fixation when compared with a houses baseline. Overall, these results suggest that the increased amygdala response observed in s-allele carriers to emotional faces is primarily driven by an increased response to emotional faces rather than a decreased response to neutral faces or an increased resting baseline. The results are discussed in relation to the tonic and phasic hypotheses of 5-HTTLPR-mediated modulation of amygdala activity. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Effects of speaker emotional facial expression and listener age on incremental sentence processing.

    PubMed

    Carminati, Maria Nella; Knoeferle, Pia

    2013-01-01

We report two visual-world eye-tracking experiments that investigated how, and with which time course, emotional information from a speaker's face affects younger (N = 32, mean age = 23) and older (N = 32, mean age = 64) listeners' visual attention and language comprehension as they processed emotional sentences in a visual context. The age manipulation tested predictions by socio-emotional selectivity theory of a positivity effect in older adults. After viewing the emotional face of a speaker (happy or sad) on a computer display, participants were presented simultaneously with two pictures depicting opposite-valence events (positive and negative; IAPS database) while they listened to a sentence referring to one of the events. Participants' eye fixations on the pictures while processing the sentence were increased when the speaker's face was (vs. wasn't) emotionally congruent with the sentence. The enhancement occurred from the early stages of referential disambiguation and was modulated by age. For the older adults it was more pronounced with positive faces, and for the younger ones with negative faces. These findings demonstrate for the first time that emotional facial expressions, similarly to previously studied speaker cues such as eye gaze and gestures, are rapidly integrated into sentence processing. They also provide new evidence for positivity effects in older adults during situated sentence processing.

  13. Using an emotional saccade task to characterize executive functioning and emotion processing in attention-deficit hyperactivity disorder and bipolar disorder.

    PubMed

    Yep, Rachel; Soncin, Stephen; Brien, Donald C; Coe, Brian C; Marin, Alina; Munoz, Douglas P

    2018-04-23

    Despite distinct diagnostic criteria, attention-deficit hyperactivity disorder (ADHD) and bipolar disorder (BD) share cognitive and emotion processing deficits that complicate diagnoses. The goal of this study was to use an emotional saccade task to characterize executive functioning and emotion processing in adult ADHD and BD. Participants (21 control, 20 ADHD, 20 BD) performed an interleaved pro/antisaccade task (look toward vs. look away from a visual target, respectively) in which the sex of emotional face stimuli acted as the cue to perform either the pro- or antisaccade. Both patient groups made more direction (erroneous prosaccades on antisaccade trials) and anticipatory (saccades made before cue processing) errors than controls. Controls exhibited lower microsaccade rates preceding correct anti- vs. prosaccade initiation, but this task-related modulation was absent in both patient groups. Regarding emotion processing, the ADHD group performed worse than controls on neutral face trials, while the BD group performed worse than controls on trials presenting faces of all valence. These findings support the role of fronto-striatal circuitry in mediating response inhibition deficits in both ADHD and BD, and suggest that such deficits are exacerbated in BD during emotion processing, presumably via dysregulated limbic system circuitry involving the anterior cingulate and orbitofrontal cortex. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Unconscious presentation of fearful face modulates electrophysiological responses to emotional prosody.

    PubMed

    Doi, Hirokazu; Shinohara, Kazuyuki

    2015-03-01

Cross-modal integration of visual and auditory emotional cues is thought to be advantageous for the accurate recognition of emotional signals. However, the neural locus of cross-modal integration between affective prosody and unconsciously presented facial expression in the neurologically intact population remains elusive. The present study examined the influences of unconsciously presented facial expressions on the event-related potentials (ERPs) in emotional prosody recognition. In the experiment, fearful, happy, and neutral faces were presented without awareness by continuous flash suppression simultaneously with voices containing laughter and a fearful shout. The conventional peak analysis revealed that the ERPs were modulated interactively by emotional prosody and facial expression at multiple latency ranges, indicating that audio-visual integration of emotional signals takes place automatically without conscious awareness. In addition, the global field power during the late-latency range was larger for shout than for laughter only when a fearful face was presented unconsciously. The neural locus of this effect was localized to the left posterior fusiform gyrus, giving support to the view that this cortical region, traditionally considered a unisensory region for visual processing, functions as a locus of audiovisual integration of emotional signals. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. A dimensional approach to determine common and specific neurofunctional markers for depression and social anxiety during emotional face processing.

    PubMed

    Luo, Lizhu; Becker, Benjamin; Zheng, Xiaoxiao; Zhao, Zhiying; Xu, Xiaolei; Zhou, Feng; Wang, Jiaojian; Kou, Juan; Dai, Jing; Kendrick, Keith M

    2018-02-01

Major depressive disorder (MDD) and anxiety disorders are both prevalent and debilitating. High rates of comorbidity between MDD and social anxiety disorder (SAD) suggest common pathological pathways, including aberrant neural processing of interpersonal signals. In patient populations, the determination of common and distinct neurofunctional markers of MDD and SAD is often hampered by confounding factors, such as generally elevated anxiety levels and disorder-specific brain structural alterations. This study employed a dimensional disorder approach to map neurofunctional markers associated with levels of depression and social anxiety symptoms in a cohort of 91 healthy subjects using an emotional face processing paradigm. Examining linear associations between levels of depression and social anxiety, while controlling for trait anxiety, revealed that both were associated with exaggerated dorsal striatal reactivity to fearful and sad faces, respectively. Exploratory analysis revealed that depression scores were positively correlated with dorsal striatal functional connectivity during processing of fearful faces, whereas those of social anxiety showed a negative association during processing of sad faces. No linear relationships between levels of depression and social anxiety were observed during a facial-identity matching task or with brain structure. Together, the present findings indicate that dorsal striatal neurofunctional alterations might underlie aberrant interpersonal processing associated with both increased levels of depression and social anxiety. © 2017 Wiley Periodicals, Inc.

  16. The NMDA antagonist ketamine and the 5-HT agonist psilocybin produce dissociable effects on structural encoding of emotional face expressions.

    PubMed

    Schmidt, André; Kometer, Michael; Bachmann, Rosilla; Seifritz, Erich; Vollenweider, Franz

    2013-01-01

Both glutamate and serotonin (5-HT) play a key role in the pathophysiology of emotional biases. Recent studies indicate that the glutamate N-methyl-D-aspartate (NMDA) receptor antagonist ketamine and the 5-HT receptor agonist psilocybin are implicated in emotion processing. However, as yet, no study has systematically compared their contribution to emotional biases. This study used event-related potentials (ERPs) and signal detection theory to compare the effects of the NMDA (via S-ketamine) and 5-HT (via psilocybin) receptor systems on non-conscious and conscious emotional face processing biases. S-ketamine or psilocybin was administered to two groups of healthy subjects in a double-blind, within-subject, placebo-controlled design. We behaviorally assessed objective thresholds for non-conscious discrimination in all drug conditions. Electrophysiological responses to fearful, happy, and neutral faces were subsequently recorded with the P100 and the face-specific N170 ERP components. Both S-ketamine and psilocybin impaired the encoding of fearful faces as expressed by a reduced N170 over parieto-occipital brain regions. In contrast, while S-ketamine also impaired the encoding of happy facial expressions, psilocybin had no effect on the N170 in response to happy faces. This study demonstrates that the NMDA and 5-HT receptor systems differentially contribute to the structural encoding of emotional face expressions as expressed by the N170. These findings suggest that the assessment of early visual evoked responses might allow detecting pharmacologically induced changes in emotional processing biases and thus provides a framework to study the pathophysiology of dysfunctional emotional biases.
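The "objective thresholds for non-conscious discrimination" mentioned above are usually verified in signal detection theory by testing whether sensitivity (d′) differs from zero. A minimal sketch of the standard d′ computation follows; the trial counts are invented and this is not the authors' analysis code.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' from a yes/no detection table, with a
    log-linear correction so that hit/false-alarm rates of 0 or 1
    do not produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    # d' is the distance between the z-transformed hit and false-alarm rates.
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts from a masked-face discrimination block:
# performance near chance (d' ≈ 0) suggests the stimuli were not consciously seen.
print(d_prime(hits=26, misses=24, false_alarms=25, correct_rejections=25))
```

A one-sample test of per-subject d′ values against zero would then establish whether discrimination was objectively at chance.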

  17. Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.

    PubMed

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-01-01

Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. In a double-blind, within-subjects, randomized controlled experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium-intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT; this effect was marginally significant (p < .06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces, and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhancement of facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).

  18. Perception of emotion on faces in frontotemporal dementia and Alzheimer's disease: a longitudinal study.

    PubMed

    Lavenu, I; Pasquier, F

    2005-01-01

    Frontotemporal dementia (FTD) is a neurodegenerative disease characterised by behavioural disorders that suggest abnormalities of emotional processing. In a previous study, we showed that patients with Alzheimer's disease (AD) and with FTD were equally able to distinguish a face displaying affect from one not displaying affect. However, recognition of emotion was worse in patients with FTD than in patients with AD who did not differ significantly from controls. The aim of this study was to follow up the perception of emotions on faces in these patients. The poor perception of emotion could worsen differently in AD and in FTD, with the progression of atrophy of the amygdala, the anterior temporal cortex and the orbital frontal cortex, structures that are components of the brain's emotional processing systems. Patients with AD or with FTD had to recognise and point out the name of one of seven basic emotions (anger, disgust, happiness, fear, sadness, surprise and contempt) on a set of 28 faces presented on slides at the first visit and 3 years later. Thirty-seven patients (AD = 19, FTD = 18) performed the tests initially. The two patient groups did not differ for age, sex and duration of the disease. During the follow-up, 12 patients died, 4 patients refused to perform the tests and 8 could not be tested because of the severity of the disease. Finally, 7 patients with AD and 6 patients with FTD performed the two tests at a mean delay of 40 months. All patients with AD had worse results at follow-up on the perception of emotion despite the prescription of inhibitors of cholinesterase in all patients and of selective serotonin reuptake inhibitors (SSRIs) in 4 patients. As a whole, patients with FTD had better results in the second than in the first assessment (however, 3 of them had worse results) independently of the prescription of trazodone (n = 2), other SSRIs (n = 2), or the absence of treatment (n = 2), and of possible cognitive change. 

  19. Sex differences in social cognition: The case of face processing.

    PubMed

    Proverbio, Alice Mado

    2017-01-02

Several studies have demonstrated that women show a greater interest in social information and a more empathic attitude than men. This article reviews studies on sex differences in the brain, with particular reference to how males and females process faces and facial expressions, social interactions, pain of others, infant faces, faces in things (pareidolia phenomenon), opposite-sex faces, humans vs. landscapes, incongruent behavior, motor actions, biological motion, erotic pictures, and emotional information. Sex differences in oxytocin-based attachment response and emotional memory are also mentioned. In addition, we investigated how 400 different human faces were evaluated for arousal and valence dimensions by a group of healthy male and female university students. Stimuli were carefully balanced for sensory and perceptual characteristics, age, facial expression, and sex. As a whole, women judged all human faces as more positive and more arousing than men did. Furthermore, they showed a preference for the faces of children and the elderly in the arousal evaluation. Regardless of face aesthetics, age, or facial expression, women rated human faces higher than men did. The preference for opposite- vs. same-sex faces strongly interacted with facial age. Overall, both women and men exhibited differences in facial processing that could be interpreted in the light of evolutionary psychobiology. © 2016 Wiley Periodicals, Inc.

  20. Audiovisual emotional processing and neurocognitive functioning in patients with depression

    PubMed Central

    Doose-Grünefeld, Sophie; Eickhoff, Simon B.; Müller, Veronika I.

    2015-01-01

Alterations in the processing of emotional stimuli (e.g., facial expressions, prosody, music) have repeatedly been reported in patients with major depression. Such impairments may result from the likewise prevalent executive deficits in these patients. However, studies investigating this relationship are rare. Moreover, most studies to date have only assessed impairments in unimodal emotional processing, whereas in real life, emotions are primarily conveyed through more than just one sensory channel. The current study therefore aimed to investigate multi-modal emotional processing in patients with depression and to assess the relationship between emotional and neurocognitive impairments. Forty-one patients suffering from major depression and 41 never-depressed healthy controls participated in an audiovisual (faces-sounds) emotional integration paradigm as well as a neurocognitive test battery. Our results showed that depressed patients were specifically impaired in the processing of positive auditory stimuli, as they rated faces as significantly more fearful when presented with happy than with neutral sounds. Such an effect was absent in controls. Findings in emotional processing in patients did not correlate with Beck’s depression inventory score. Furthermore, neurocognitive findings revealed significant group differences for two of the tests. The effects found in audiovisual emotional processing, however, did not correlate with performance in the neurocognitive tests. In summary, our results underline the diversity of impairments going along with depression and indicate that deficits found for unimodal emotional processing cannot trivially be generalized to deficits in a multi-modal setting. The mechanisms of impairment therefore might be far more complex than previously thought. Our findings furthermore contradict the assumption that emotional processing deficits in major depression are associated with impaired attention or inhibitory functioning. PMID

  1. Influence of emotional processing on working memory in schizophrenia.

    PubMed

    Becerril, Karla; Barch, Deanna

    2011-09-01

    Research on emotional processing in schizophrenia suggests relatively intact subjective responses to affective stimuli "in the moment." However, neuroimaging evidence suggests diminished activation in brain regions associated with emotional processing in schizophrenia. We asked whether, given a more vulnerable cognitive system in schizophrenia, individuals with this disorder would show increased or decreased modulation of working memory (WM) as a function of the emotional content of stimuli compared with healthy control subjects. In addition, we examined whether higher anhedonia levels were associated with a diminished impact of emotion on behavioral and brain activation responses. In the present study, 38 individuals with schizophrenia and 32 healthy individuals completed blocks of a 2-back WM task in a functional magnetic resonance imaging scanning session. Blocks contained faces displaying either only neutral stimuli or neutral and emotional stimuli (happy or fearful faces), randomly intermixed and occurring both as targets and non-targets. Both groups showed higher accuracy but slower reaction times for negative compared to neutral stimuli. Individuals with schizophrenia showed intact amygdala activity in response to emotionally evocative stimuli, but demonstrated altered dorsolateral prefrontal cortex (DLPFC) and hippocampal activity while performing an emotionally loaded WM task. Higher levels of social anhedonia were associated with diminished amygdala responses to emotional stimuli and increased DLPFC activity in individuals with schizophrenia. Emotional arousal may challenge dorsal-frontal control systems, which may have both beneficial and detrimental influences. Our findings suggest that disturbances in emotional processing in schizophrenia relate to alterations in emotion-cognition interactions rather than to the perception and subjective experience of emotion per se.

  2. What Facial Appearance Reveals Over Time: When Perceived Expressions in Neutral Faces Reveal Stable Emotion Dispositions

    PubMed Central

    Adams, Reginald B.; Garrido, Carlos O.; Albohn, Daniel N.; Hess, Ursula; Kleck, Robert E.

    2016-01-01

    It might seem a reasonable assumption that when we are not actively using our faces to express ourselves (i.e., when we display nonexpressive, or neutral faces), those around us will not be able to read our emotions. Herein, using a variety of expression-related ratings, we examined whether age-related changes in the face can accurately reveal one’s innermost affective dispositions. In each study, we found that expressive ratings of neutral facial displays predicted self-reported positive/negative dispositional affect, but only for elderly women, and only for positive affect. These findings meaningfully replicate and extend earlier work examining age-related emotion cues in the face of elderly women (Malatesta et al., 1987a). We discuss these findings in light of evidence that women are expected to, and do, smile more than men, and that the quality of their smiles predicts their life satisfaction. Although ratings of old male faces did not significantly predict self-reported affective dispositions, the trend was similar to that found for old female faces. A plausible explanation for this gender difference is that in the process of attenuating emotional expressions over their lifetimes, old men reveal less evidence of their total emotional experiences in their faces than do old women. PMID:27445944

  3. Linking children's neuropsychological processing of emotion with their knowledge of emotion expression regulation.

    PubMed

    Watling, Dawn; Bourne, Victoria J

    2007-09-01

    Understanding of emotions has been shown to develop between the ages of 4 and 10 years; however, individual differences exist in this development. While previous research has typically examined these differences in terms of developmental and/or social factors, little research has considered the possible impact of neuropsychological development on the behavioural understanding of emotions. Emotion processing tends to be lateralised to the right hemisphere of the brain in adults, yet this pattern is not as evident in children until around the age of 10 years. In this study, 136 children between 5 and 10 years were given both behavioural and neuropsychological tests of emotion processing. The behavioural task examined expression regulation knowledge (ERK) for prosocial and self-presentational hypothetical interactions. The chimeric faces test was given as a measure of lateralisation for processing positive facial emotion. An interaction between age and lateralisation for emotion processing was predictive of children's ERK for only the self-presentational interactions. The relationship between children's ERK and lateralisation for emotion processing changed across the three age groups, emerging as a positive relationship in the 10-year-olds. The 10-year-olds who were more lateralised to the right hemisphere for emotion processing tended to show greater understanding of the need for regulating negative emotions during interactions that would have a self-presentational motivation. This finding suggests an association between the behavioural and neuropsychological development of emotion processing.

  4. State-dependent alterations in inhibitory control and emotional face identification in seasonal affective disorder.

    PubMed

    Hjordt, Liv V; Stenbæk, Dea S; Madsen, Kathrine Skak; Mc Mahon, Brenda; Jensen, Christian G; Vestergaard, Martin; Hageman, Ida; Meder, David; Hasselbalch, Steen G; Knudsen, Gitte M

    2017-04-01

    Depressed individuals often exhibit impaired inhibition to negative input and identification of positive stimuli, but it is unclear whether this is a state or trait feature. We here exploited a naturalistic model, namely individuals with seasonal affective disorder (SAD), to study this feature longitudinally. The goal of this study was to examine seasonal changes in inhibitory control and identification of emotional faces in individuals with SAD. Twenty-nine individuals diagnosed with winter-SAD and 30 demographically matched controls with no seasonality symptoms completed an emotional Go/NoGo task, requiring inhibition of prepotent responses to emotional facial expressions and an emotional face identification task twice, in winter and summer. In winter, individuals with SAD showed impaired ability to inhibit responses to angry (p = .0006) and sad faces (p = .011), and decreased identification of happy faces (p = .032) compared with controls. In summer, individuals with SAD and controls performed similarly on these tasks (ps > .24). We provide novel evidence that inhibition of angry and sad faces and identification of happy faces are impaired in SAD in the symptomatic phase, but not in the remitted phase. The affective biases in cognitive processing constitute state-dependent features of SAD. Our data show that reinstatement of a normal affective cognition should be possible and would constitute a major goal in psychiatric treatment to improve the quality of life for these patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
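
    Emotional Go/NoGo tasks of the kind described above are conventionally scored with signal-detection measures such as d′, contrasting correct responses on Go trials with failed inhibitions on emotional NoGo trials. A minimal sketch of that computation, with invented trial counts (not data from this study) and a log-linear correction to keep hit and false-alarm rates away from 0 and 1:

```python
from statistics import NormalDist

def go_nogo_dprime(hits, go_trials, false_alarms, nogo_trials):
    """d' for one Go/NoGo condition: z(hit rate) - z(false-alarm rate).

    The log-linear correction (add 0.5 to counts, 1 to totals) keeps
    rates strictly inside (0, 1) so the inverse normal CDF is finite.
    """
    hr = (hits + 0.5) / (go_trials + 1.0)
    far = (false_alarms + 0.5) / (nogo_trials + 1.0)
    z = NormalDist().inv_cdf
    return z(hr) - z(far)

# Hypothetical condition: 38/40 correct Go responses, and 12/40
# failed inhibitions (false alarms) to angry NoGo faces.
d_angry = go_nogo_dprime(38, 40, 12, 40)
```

Lower d′ for angry NoGo faces in winter than in summer would correspond to the impaired inhibition the study reports.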

  5. Spatiotemporal dynamics in human visual cortex rapidly encode the emotional content of faces.

    PubMed

    Dima, Diana C; Perry, Gavin; Messaritaki, Eirini; Zhang, Jiaxiang; Singh, Krish D

    2018-06-08

    Recognizing emotion in faces is important in human interaction and survival, yet existing studies do not paint a consistent picture of the neural representation supporting this task. To address this, we collected magnetoencephalography (MEG) data while participants passively viewed happy, angry and neutral faces. Using time-resolved decoding of sensor-level data, we show that responses to angry faces can be discriminated from happy and neutral faces as early as 90 ms after stimulus onset and only 10 ms later than faces can be discriminated from scrambled stimuli, even in the absence of differences in evoked responses. Time-resolved relevance patterns in source space track expression-related information from the visual cortex (100 ms) to higher-level temporal and frontal areas (200-500 ms). Together, our results point to a system optimised for rapid processing of emotional faces and preferentially tuned to threat, consistent with the important evolutionary role that such a system must have played in the development of human social interactions. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
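
    Time-resolved decoding of the kind described here trains and tests a classifier independently at each timepoint on the vector of sensor values across trials. As a toy illustration only (not the authors' pipeline, which used MEG sensor arrays and more sophisticated classifiers), here is a nearest-class-mean classifier with leave-one-out cross-validation on synthetic "sensor" data:

```python
import random

def nearest_mean_loocv(trials, labels):
    """Leave-one-out accuracy of a nearest-class-mean classifier at one
    timepoint; each trial is a list of sensor values."""
    correct = 0
    for i, (x, y) in enumerate(zip(trials, labels)):
        # Compute class means from all trials except the held-out one.
        means = {}
        for lab in set(labels):
            rows = [t for j, (t, l) in enumerate(zip(trials, labels))
                    if l == lab and j != i]
            means[lab] = [sum(col) / len(rows) for col in zip(*rows)]
        # Predict the class whose mean is nearest in squared distance.
        pred = min(means, key=lambda lab: sum((a - b) ** 2
                                              for a, b in zip(x, means[lab])))
        correct += pred == y
    return correct / len(trials)

# Synthetic data: at an "informative" timepoint, angry trials are
# shifted relative to happy trials across 10 simulated sensors.
random.seed(0)

def sensor_trial(shift):
    return [random.gauss(shift, 1.0) for _ in range(10)]

labels = ["angry"] * 12 + ["happy"] * 12
trials = [sensor_trial(1.0) for _ in range(12)] + \
         [sensor_trial(-1.0) for _ in range(12)]
acc_informative = nearest_mean_loocv(trials, labels)
```

Repeating this at every timepoint yields the decoding time course from which onset latencies such as the 90 ms effect are read off.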

  6. Cholinergic enhancement modulates neural correlates of selective attention and emotional processing.

    PubMed

    Bentley, Paul; Vuilleumier, Patrik; Thiel, Christiane M; Driver, Jon; Dolan, Raymond J

    2003-09-01

    Neocortical cholinergic afferents are proposed to influence both selective attention and emotional processing. In a study of healthy adults we used event-related fMRI while orthogonally manipulating attention and emotionality to examine regions showing effects of cholinergic modulation by the anticholinesterase physostigmine. Either face or house pictures appeared at task-relevant locations, with the alternative picture type at irrelevant locations. Faces had either neutral or fearful expressions. Physostigmine increased relative activity within the anterior fusiform gyrus for faces at attended, versus unattended, locations, but decreased relative activity within the posterolateral occipital cortex for houses in attended, versus unattended, locations. A similar pattern of regional differences in the effect of physostigmine on cue-evoked responses was also present in the absence of stimuli. Cholinergic enhancement augmented the relative neuronal response within the middle fusiform gyrus to fearful faces, whether at attended or unattended locations. By contrast, physostigmine influenced responses in the orbitofrontal, intraparietal and cingulate cortices to fearful faces when faces occupied task-irrelevant locations. These findings suggest that acetylcholine may modulate both selective attention and emotional processes through independent, region-specific effects within the extrastriate cortex. Furthermore, cholinergic inputs to the frontoparietal cortex may influence the allocation of attention to emotional information.

  7. Emotional Processing of Infant Displays in Eating Disorders

    PubMed Central

    Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet

    2014-01-01

    Aim: The aim of this study was to examine emotional processing of infant displays in people with Eating Disorders (EDs). Background: Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have mainly been studied in relation to adult interactions, with less interest given to interactions with infants. Method: A sample of 138 women was recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions to video clips depicting sad, happy and frustrated infants were also recorded. Results: No significant differences between groups were observed in the attentional response to infant photographs, although there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip. Conclusion: People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do show a reduction in facial affect, particularly in response to happy infants. They also report greater negative reactions to sadness and rate positive emotions less intensely than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets. PMID
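
    The visual probe detection (dot-probe) task used in the Method is conventionally scored as an attentional bias index: mean reaction time when the probe replaces the neutral face minus mean reaction time when it replaces the emotional face, so that positive values indicate vigilance toward, and negative values disengagement from, the emotional face. A minimal sketch with invented reaction times (all numbers are illustrative):

```python
from statistics import mean

def bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
    """Attentional bias index in ms: positive -> attention drawn to the
    emotional face (vigilance); negative -> avoidance/disengagement."""
    return mean(rt_probe_at_neutral) - mean(rt_probe_at_emotional)

# Hypothetical reaction times (ms) for probes appearing at the
# neutral-face location vs. the paired happy-infant location.
incongruent = [512, 498, 530, 505]   # probe replaces the neutral face
congruent = [495, 488, 502, 491]     # probe replaces the happy face
happy_bias = bias_score(incongruent, congruent)
```

A negative value of this index for happy infant faces would correspond to the disengagement trend reported above.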

  8. Interactions among the effects of head orientation, emotional expression, and physical attractiveness on face preferences.

    PubMed

    Main, Julie C; DeBruine, Lisa M; Little, Anthony C; Jones, Benedict C

    2010-01-01

    Previous studies have shown that preferences for direct versus averted gaze are modulated by emotional expressions and physical attractiveness. For example, preferences for direct gaze are stronger when judging happy or physically attractive faces than when judging disgusted or physically unattractive faces. Here we show that preferences for front versus three-quarter views of faces, in which gaze direction was always congruent with head orientation, are also modulated by emotional expressions and physical attractiveness; participants demonstrated preferences for front views of faces over three-quarter views of faces when judging the attractiveness of happy, physically attractive individuals, but not when judging the attractiveness of relatively unattractive individuals or those with disgusted expressions. Moreover, further analyses indicated that these interactions did not simply reflect differential perceptions of the intensity of the emotional expressions shown in each condition. Collectively, these findings present novel evidence that the effect of the direction of the attention of others on attractiveness judgments is modulated by cues to the physical attractiveness and emotional state of the depicted individual, potentially reflecting psychological adaptations for efficient allocation of social effort. These data also present the first behavioural evidence that the effect of the direction of the attention of others on attractiveness judgments reflects viewer-referenced, rather than face-referenced, coding and/or processing of gaze direction.

  9. Face puzzle—two new video-based tasks for measuring explicit and implicit aspects of facial emotion recognition

    PubMed Central

    Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel

    2013-01-01

    Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in the respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures were further in favor of the tasks' external validity. Between-group comparisons provided first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive

  10. Influence of Emotional Facial Expressions on 3-5-Year-Olds' Face Recognition

    ERIC Educational Resources Information Center

    Freitag, Claudia; Schwarzer, Gudrun

    2011-01-01

    Three experiments examined 3- and 5-year-olds' recognition of faces in constant and varied emotional expressions. Children were asked to identify repeatedly presented target faces, distinguishing them from distractor faces, during an immediate recognition test and during delayed assessments after 10 min and one week. Emotional facial expression…

  11. Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.

    PubMed

    Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M

    2014-09-01

    Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD. © 2013 The British Psychological Society.

  12. Evaluating faces on trustworthiness: an extension of systems for recognition of emotions signaling approach/avoidance behaviors.

    PubMed

    Todorov, Alexander

    2008-03-01

    People routinely make various trait judgments from facial appearance, and such judgments affect important social outcomes. These judgments are highly correlated with each other, reflecting the fact that valence evaluation permeates trait judgments from faces. Trustworthiness judgments best approximate this evaluation, consistent with evidence about the involvement of the amygdala in the implicit evaluation of face trustworthiness. Based on computer modeling and behavioral experiments, I argue that face evaluation is an extension of functionally adaptive systems for understanding the communicative meaning of emotional expressions. Specifically, in the absence of diagnostic emotional cues, trustworthiness judgments are an attempt to infer behavioral intentions signaling approach/avoidance behaviors. Correspondingly, these judgments are derived from facial features that resemble emotional expressions signaling such behaviors: happiness and anger for the positive and negative ends of the trustworthiness continuum, respectively. The emotion overgeneralization hypothesis can explain highly efficient but not necessarily accurate trait judgments from faces, a pattern that appears puzzling from an evolutionary point of view and also generates novel predictions about brain responses to faces. Specifically, this hypothesis predicts a nonlinear response in the amygdala to face trustworthiness, confirmed in functional magnetic resonance imaging (fMRI) studies, and dissociations between processing of facial identity and face evaluation, confirmed in studies with developmental prosopagnosics. I conclude with some methodological implications for the study of face evaluation, focusing on the advantages of formally modeling representation of faces on social dimensions.

  13. Gaze Behavior Consistency among Older and Younger Adults When Looking at Emotional Faces

    PubMed Central

    Chaby, Laurence; Hupont, Isabelle; Avril, Marie; Luherne-du Boullay, Viviane; Chetouani, Mohamed

    2017-01-01

    The identification of non-verbal emotional signals, and especially of facial expressions, is essential for successful social communication among humans. Previous research has reported an age-related decline in facial emotion identification and argued for socio-emotional or aging-brain model explanations. However, perceptual differences in the gaze strategies that accompany facial emotional processing with advancing age have remained under-explored. In this study, 22 young (22.2 years) and 22 older (70.4 years) adults were instructed to look at basic facial expressions while their gaze movements were recorded by an eye-tracker. Participants were then asked to identify each emotion, and the unbiased hit rate was applied as the performance measure. Gaze data were first analyzed using traditional measures of fixations over two preferential regions of the face (upper and lower areas) for each emotion. Then, to better capture core gaze changes with advancing age, spatio-temporal gaze behaviors were examined more deeply using data-driven analyses (dimension reduction, clustering). Results first confirmed that older adults performed worse than younger adults at identifying facial expressions, except for "joy" and "disgust," and this was accompanied by a gaze preference toward the lower face. Interestingly, this phenomenon was maintained during the whole time course of stimulus presentation. More importantly, trials corresponding to older adults were more tightly clustered, suggesting that the gaze behavior patterns of older adults are more consistent than those of younger adults. This study demonstrates that, confronted with emotional faces, younger and older adults do not prioritize or ignore the same facial areas. Older adults mainly adopted a focused-gaze strategy, focusing only on the lower part of the face throughout the whole stimulus display time. This consistency may constitute a robust and distinctive "social signature" of emotional
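
    The unbiased hit rate used above as the performance measure (Wagner, 1993) corrects raw accuracy for response biases: the squared count of correct responses for an emotion is divided by the product of that emotion's stimulus (row) total and response (column) total in the confusion matrix. A minimal sketch, with an invented confusion matrix for illustration:

```python
def unbiased_hit_rate(confusion, emotion):
    """Wagner's (1993) unbiased hit rate for one emotion.

    `confusion[stimulus][response]` holds raw response counts; the
    result lies in [0, 1], reaching 1 only for perfect, unbiased
    recognition of that emotion.
    """
    row_total = sum(confusion[emotion].values())            # times shown
    col_total = sum(confusion[e][emotion] for e in confusion)  # times chosen
    if row_total == 0 or col_total == 0:
        return 0.0
    return confusion[emotion][emotion] ** 2 / (row_total * col_total)

# Hypothetical confusion matrix for one participant
# (rows: expression shown, columns: label chosen).
conf = {
    "joy":     {"joy": 18, "fear": 1,  "sadness": 1},
    "fear":    {"joy": 2,  "fear": 12, "sadness": 6},
    "sadness": {"joy": 0,  "fear": 5,  "sadness": 15},
}
hu_fear = unbiased_hit_rate(conf, "fear")
```

Unlike a raw hit rate, this index penalizes a participant who indiscriminately over-uses one response label.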

  14. Children's understanding of facial expression of emotion: II. Drawing of emotion-faces.

    PubMed

    Missaghi-Lakshman, M; Whissell, C

    1991-06-01

    67 children from Grades 2, 4, and 7 drew faces representing the emotional expressions of fear, anger, surprise, disgust, happiness, and sadness. The children themselves and 29 adults later decoded the drawings in an emotion-recognition task. Children were the more accurate decoders, and their accuracy and the accuracy of adults increased significantly for judgments of 7th-grade drawings. The emotions happy and sad were most accurately decoded. There were no significant differences associated with sex. In their drawings, children utilized a symbol system that seems to be based on a highlighting or exaggeration of features of the innately governed facial expression of emotion.

  15. Functional MRI of facial emotion processing in left temporal lobe epilepsy.

    PubMed

    Szaflarski, Jerzy P; Allendorfer, Jane B; Heyse, Heidi; Mendoza, Lucy; Szaflarski, Basia A; Cohen, Nancy

    2014-03-01

    Temporal lobe epilepsy (TLE) may negatively affect the ability to recognize emotions. This study aimed to determine the cortical correlates of facial emotion processing (happy, sad, fearful, and neutral) in patients with well-characterized left TLE (LTLE) and to examine the effect of seizure control on emotion processing. We enrolled 34 consecutive patients with LTLE and 30 matched healthy control (HC) subjects. Participants underwent functional MRI (fMRI) with an event-related facial emotion recognition task. The seizures of seventeen patients were controlled (no seizure in at least 3 months; LTLE-sz), and 17 continued to experience frequent seizures (LTLE+sz). Mood was assessed with the Beck Depression Inventory (BDI) and the Profile of Mood States (POMS). There were no differences in demographic characteristics and measures of mood between HC subjects and patients with LTLE. In patients with LTLE, fMRI showed decreased blood oxygenation level dependent (BOLD) signal in the hippocampus/parahippocampus and cerebellum in processing of happy faces and increased BOLD signal in occipital regions in response to fearful faces. Comparison of groups with LTLE+sz and LTLE-sz showed worse BDI and POMS scores in LTLE+sz (all p<0.05) except for POMS tension/anxiety (p=0.067). Functional MRI revealed increased BOLD signal in patients with LTLE+sz in the left precuneus and left parahippocampus for "fearful" faces and in the left periarcheocortex for "neutral" faces. There was a correlation between the fMRI signal and Total Mood Disturbance in the left precuneus in LTLE-sz (p=0.019) and in LTLE+sz (p=0.018). Overall, LTLE appears to have a relatively minor effect on the cortical underpinnings of facial emotion processing, while the effect of seizure state (controlled vs. not controlled) is more pronounced, indicating a significant relationship between seizure control and emotion processing. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Emotion in the Neutral Face: A Mechanism for Impression Formation?

    PubMed Central

    Adams, Reginald B.; Nelson, Anthony J.; Soto, José A.; Hess, Ursula; Kleck, Robert E.

    2012-01-01

    The current work examined contributions of emotion-resembling facial cues to impression formation. There exist common facial cues that make people look male or female or emotional, and from which we derive personality inferences. We first conducted a Pilot Study to assess these effects. We found that neutral female faces were rated as more submissive, affiliative, naïve, honest, cooperative, babyish, fearful, and happy, and as less angry, than neutral male faces. In our Primary Study, we then "warped" these same neutral faces over their corresponding anger and fear displays so the resultant facial appearance cues now structurally resembled emotion while retaining a neutral visage (e.g., no wrinkles, furrows, creases, etc.). The gender effects found in the Pilot Study were replicated in the Primary Study, suggesting clear stereotype-driven impressions. Critically, ratings of the neutral-over-fear warps versus neutral-over-anger warps also revealed a profile similar to the gender-based ratings, revealing perceptually driven impressions directly attributable to emotion overgeneralization. PMID:22471850

  17. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    PubMed

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.

  18. Greater sensitivity of the cortical face processing system to perceptually-equated face detection

    PubMed Central

    Maher, S.; Ekstrom, T.; Tong, Y.; Nickerson, L.D.; Frederick, B.; Chen, Y.

    2015-01-01

    Face detection, the perceptual capacity to identify a visual stimulus as a face before probing deeper into specific attributes (such as its identity or emotion), is essential for social functioning. Despite the importance of this functional capacity, face detection and its underlying brain mechanisms are not well understood. This study evaluated the roles that the cortical face processing system, which is identified largely through studying other aspects of face perception, plays in face detection. Specifically, we used functional magnetic resonance imaging (fMRI) to examine the activations of the fusiform face area (FFA), occipital face area (OFA) and superior temporal sulcus (STS) when face detection was isolated from other aspects of face perception and when face detection was perceptually-equated across individual human participants (n=20). During face detection, FFA and OFA were significantly activated, even for stimuli presented at perceptual-threshold levels, whereas STS was not. During tree detection, however, FFA and OFA were responsive only for highly salient (i.e., high contrast) stimuli. Moreover, activation of FFA during face detection predicted a significant portion of the perceptual performance levels that were determined psychophysically for each participant. This pattern of results indicates that FFA and OFA have a greater sensitivity to face detection signals and selectively support the initial process of face vs. non-face object perception. PMID:26592952

  19. Emotional Recognition in Autism Spectrum Conditions from Voices and Faces

    ERIC Educational Resources Information Center

    Stewart, Mary E.; McAdam, Clair; Ota, Mitsuhiko; Peppe, Sue; Cleland, Joanne

    2013-01-01

    The present study reports on a new vocal emotion recognition task and assesses whether people with autism spectrum conditions (ASC) perform differently from typically developed individuals on tests of emotional identification from both the face and the voice. The new test of vocal emotion contained trials in which the vocal emotion of the sentence…

  20. Anxiety and Sensitivity to Eye Gaze in Emotional Faces

    ERIC Educational Resources Information Center

    Holmes, Amanda; Richards, Anne; Green, Simon

    2006-01-01

    This paper reports three studies in which stronger orienting to perceived eye gaze direction was revealed when observers viewed faces showing fearful or angry, compared with happy or neutral, emotional expressions. Gaze-related spatial cueing effects to laterally presented fearful faces and centrally presented angry faces were also modulated by…

  1. Visual short-term memory load modulates the early attention and perception of task-irrelevant emotional faces

    PubMed Central

    Yang, Ping; Wang, Min; Jin, Zhenlan; Li, Ling

    2015-01-01

    The ability to focus on task-relevant information, while suppressing distraction, is critical for human cognition and behavior. Using a delayed-match-to-sample (DMS) task, we investigated the effects of emotional face distractors (positive, negative, and neutral faces) on early and late phases of visual short-term memory (VSTM) maintenance intervals, using low and high VSTM loads. Behavioral results showed decreased accuracy and delayed reaction times (RTs) for high vs. low VSTM load. Event-related potentials (ERPs) showed enhanced frontal N1 and occipital P1 amplitudes for negative faces vs. neutral or positive faces, implying rapid attentional alerting effects and early perceptual processing of negative distractors. However, high VSTM load appeared to inhibit face processing in general, showing decreased N1 amplitudes and delayed P1 latencies. An inverse correlation between the N1 activation difference (high-load minus low-load) and RT costs (high-load minus low-load) was found at left frontal areas when viewing negative distractors, suggesting that greater inhibition was associated with a lower RT cost for negative faces. An emotional interference effect was not found in the late VSTM-related parietal P300, frontal positive slow wave (PSW) and occipital negative slow wave (NSW) components. In general, our findings suggest that VSTM load modulates the early attention and perception of emotional distractors. PMID:26388763
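    The difference-score correlation described in this record (N1 change under load vs. RT cost) can be sketched as a simple analysis. The sample size and all per-subject values below are hypothetical stand-ins, not the published data:

    ```python
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_subjects = 24  # hypothetical sample size

    # Hypothetical per-subject measures (amplitudes in microvolts, RTs in ms)
    n1_low = rng.normal(-4.0, 1.0, n_subjects)            # N1 amplitude, low VSTM load
    n1_high = n1_low + rng.normal(1.0, 0.5, n_subjects)   # less negative under high load
    rt_low = rng.normal(620.0, 40.0, n_subjects)          # RT, low VSTM load
    rt_high = rt_low + rng.normal(90.0, 25.0, n_subjects) # slower under high load

    # Difference scores as in the study: high-load minus low-load
    n1_diff = n1_high - n1_low
    rt_cost = rt_high - rt_low

    # Pearson correlation between the ERP difference and the behavioral cost
    r, p = pearsonr(n1_diff, rt_cost)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```

    With real data, a negative `r` here would correspond to the inverse relationship the abstract reports.
    
    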

  2. Serotonergic neurotransmission in emotional processing: New evidence from long-term recreational poly-drug ecstasy use.

    PubMed

    Laursen, Helle Ruff; Henningsson, Susanne; Macoveanu, Julian; Jernigan, Terry L; Siebner, Hartwig R; Holst, Klaus K; Skimminge, Arnold; Knudsen, Gitte M; Ramsoy, Thomas Z; Erritzoe, David

    2016-12-01

    The brain's serotonergic system plays a crucial role in the processing of emotional stimuli, and several studies have shown that a reduced serotonergic neurotransmission is associated with an increase in amygdala activity during emotional face processing. Prolonged recreational use of ecstasy (3,4-methylene-dioxymethamphetamine [MDMA]) induces alterations in serotonergic neurotransmission that are comparable to those observed in a depleted state. In this functional magnetic resonance imaging (fMRI) study, we investigated the responsiveness of the amygdala to emotional face stimuli in recreational ecstasy users as a model of long-term serotonin depletion. Fourteen ecstasy users and 12 non-using controls underwent fMRI to measure the regional neural activity elicited in the amygdala by male or female faces expressing anger, disgust, fear, sadness, or no emotion. During fMRI, participants made a sex judgement on each face stimulus. Positron emission tomography with 11C-DASB was additionally performed to assess serotonin transporter (SERT) binding in the brain. In the ecstasy users, SERT binding correlated negatively with amygdala activity, and accumulated lifetime intake of ecstasy tablets was associated with an increase in amygdala activity during angry face processing. Conversely, time since the last ecstasy intake was associated with a trend toward a decrease in amygdala activity during angry and sad face processing. These results indicate that the effects of long-term serotonin depletion resulting from ecstasy use are dose-dependent, affecting the functional neural basis of emotional face processing. © The Author(s) 2016.

  3. Infants' Temperament and Mothers' and Fathers' Depression Predict Infants' Attention to Objects Paired with Emotional Faces.

    PubMed

    Aktar, Evin; Mandell, Dorothy J; de Vente, Wieke; Majdandžić, Mirjana; Raijmakers, Maartje E J; Bögels, Susan M

    2016-07-01

    Between 10 and 14 months, infants gain the ability to learn about unfamiliar stimuli by observing others' emotional reactions to those stimuli, so-called social referencing (SR). Joint processing of emotion and head/gaze direction is essential for SR. This study tested emotion and head/gaze direction effects on infants' attention via pupillometry in the period following the emergence of SR. Pupil responses of 14-to-17-month-old infants (N = 57) were measured during computerized presentations of unfamiliar objects alone, before and after being paired with emotional (happy, sad, fearful vs. neutral) faces gazing toward (vs. away from) objects. Additionally, the associations of infants' temperament and parents' negative affect/depression/anxiety with infants' pupil responses were explored. Both mothers and fathers of participating infants completed questionnaires about their negative affect, depression and anxiety symptoms and their infants' negative temperament. Infants allocated more attention (larger pupils) to negative vs. neutral faces when the faces were presented alone, while they allocated less attention to objects paired with emotional vs. neutral faces, independent of head/gaze direction. Sad (but not fearful) temperament predicted more attention to emotional faces. Infants' sad temperament moderated the associations of mothers' depression (but not anxiety) with infants' attention to objects. Maternal depression predicted more attention to objects paired with emotional expressions in infants low in sad temperament, while it predicted less attention in infants high in sad temperament. Fathers' depression (but not anxiety) predicted more attention to objects paired with emotional expressions, independent of infants' temperament. We conclude that infants' own temperamental dispositions for sadness, and their exposure to mothers' and fathers' depressed moods, may influence infants' attention to emotion-object associations in social learning contexts.

  4. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    ERIC Educational Resources Information Center

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity…

  5. Love withdrawal predicts electrocortical responses to emotional faces with performance feedback: a follow-up and extension.

    PubMed

    Huffmeijer, Renske; Bakermans-Kranenburg, Marian J; Alink, Lenneke R A; van IJzendoorn, Marinus H

    2014-06-02

    Parental use of love withdrawal is thought to affect children's later psychological functioning because it creates a link between children's performance and relational consequences. In addition, recent studies have begun to show that experiences of love withdrawal also relate to the neural processing of socio-emotional information relevant to a performance-relational consequence link, and can moderate effects of oxytocin on social information processing and behavior. The current study follows up on our previous results by attempting to confirm and extend previous findings indicating that experiences of maternal love withdrawal are related to electrocortical responses to emotional faces presented with performance feedback. More maternal love withdrawal was related to enhanced early processing of facial feedback stimuli (reflected in more positive VPP amplitudes, confirming previous findings). However, attentional engagement with and processing of the stimuli at a later stage were diminished in those reporting higher maternal love withdrawal (reflected in less positive LPP amplitudes, diverging from previous findings). Maternal love withdrawal thus affects the processing of emotional faces presented with performance feedback differently at different stages of neural processing.

  6. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    PubMed Central

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426

  7. Neural Activation to Emotional Faces in Adolescents with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Weng, Shih-Jen; Carrasco, Melisa; Swartz, Johnna R.; Wiggins, Jillian Lee; Kurapati, Nikhil; Liberzon, Israel; Risi, Susan; Lord, Catherine; Monk, Christopher S.

    2011-01-01

    Background: Autism spectrum disorders (ASD) involve a core deficit in social functioning and impairments in the ability to recognize face emotions. In an emotional faces task designed to constrain group differences in attention, the present study used functional MRI to characterize activation in the amygdala, ventral prefrontal cortex (vPFC), and…

  8. Dissimilar processing of emotional facial expressions in human and monkey temporal cortex

    PubMed Central

    Zhu, Qi; Nelissen, Koen; Van den Stock, Jan; De Winter, François-Laurent; Pauwels, Karl; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu

    2013-01-01

    Emotional facial expressions play an important role in social communication across primates. Despite major progress made in our understanding of categorical information processing such as for objects and faces, little is known, however, about how the primate brain evolved to process emotional cues. In this study, we used functional magnetic resonance imaging (fMRI) to compare the processing of emotional facial expressions between monkeys and humans. We used a 2 × 2 × 2 factorial design with species (human and monkey), expression (fear and chewing) and configuration (intact versus scrambled) as factors. At the whole brain level, selective neural responses to conspecific emotional expressions were anatomically confined to the superior temporal sulcus (STS) in humans. Within the human STS, we found functional subdivisions with a face-selective right posterior STS area that also responded selectively to emotional expressions of other species and a more anterior area in the right middle STS that responded specifically to human emotions. Hence, we argue that the latter region does not show a mere emotion-dependent modulation of activity but is primarily driven by human emotional facial expressions. Conversely, in monkeys, emotional responses appeared in earlier visual cortex and outside face-selective regions in inferior temporal cortex that responded also to multiple visual categories. Within monkey IT, we also found areas that were more responsive to conspecific than to non-conspecific emotional expressions but these responses were not as specific as in human middle STS. Overall, our results indicate that human STS may have developed unique properties to deal with social cues such as emotional expressions. PMID:23142071

  9. Face-body integration of intense emotional expressions of victory and defeat.

    PubMed

    Wang, Lili; Xia, Lisheng; Zhang, Dandan

    2017-01-01

    Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed in N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies did when face and body conveyed congruent emotional signals. Going beyond knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory.

  10. Face-body integration of intense emotional expressions of victory and defeat

    PubMed Central

    Wang, Lili; Xia, Lisheng; Zhang, Dandan

    2017-01-01

    Human facial expressions can be recognized rapidly and effortlessly. However, for intense emotions from real life, positive and negative facial expressions are difficult to discriminate, and the judgment of facial expressions is biased towards simultaneously perceived body expressions. This study employed event-related potentials (ERPs) to investigate the neural dynamics involved in the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds, with either matched or mismatched emotional expressions in faces and bodies. Behavioral results showed that congruent emotional information of face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent compared to congruent stimuli. Also, a main effect of body valence on the P1 was observed, with enhanced amplitudes for stimuli with losing compared to winning bodies. The main effect of body expression was also observed in N170 and N2, with winning bodies producing larger N170/N2 amplitudes. In the later stage, a significant interaction of congruence by body valence was found on the P3 component. Winning bodies elicited larger P3 amplitudes than losing bodies did when face and body conveyed congruent emotional signals. Going beyond knowledge based on prototypical facial and body expressions, the results of this study help us understand the complexity of emotion evaluation and categorization outside the laboratory. PMID:28245245

  11. The Relationship between Early Neural Responses to Emotional Faces at Age 3 and Later Autism and Anxiety Symptoms in Adolescents with Autism

    ERIC Educational Resources Information Center

    Neuhaus, Emily; Jones, Emily J. H.; Barnes, Karen; Sterling, Lindsey; Estes, Annette; Munson, Jeff; Dawson, Geraldine; Webb, Sara J.

    2016-01-01

    Both autism spectrum (ASD) and anxiety disorders are associated with atypical neural and attentional responses to emotional faces, differing in affective face processing from typically developing peers. Within a longitudinal study of children with ASD (23 male, 3 female), we hypothesized that early ERPs to emotional faces would predict concurrent…

  12. The Relationship between Processing Facial Identity and Emotional Expression in 8-Month-Old Infants

    ERIC Educational Resources Information Center

    Schwarzer, Gudrun; Jovanovic, Bianca

    2010-01-01

    In Experiment 1, it was investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a…

  13. Impairment in face processing in autism spectrum disorder: a developmental perspective.

    PubMed

    Greimel, Ellen; Schulte-Rüther, Martin; Kamp-Becker, Inge; Remschmidt, Helmut; Herpertz-Dahlmann, Beate; Konrad, Kerstin

    2014-09-01

    Findings on face identity and facial emotion recognition in autism spectrum disorder (ASD) are inconclusive. Moreover, little is known about the developmental trajectory of face processing skills in ASD. Taking a developmental perspective, the aim of this study was to extend previous findings on face processing skills in a sample of adolescents and adults with ASD. N = 38 adolescents and adults (13-49 years) with high-functioning ASD and n = 37 typically developing (TD) control subjects matched for age and IQ participated in the study. Moreover, n = 18 TD children between the ages of 8 and 12 were included to address the question whether face processing skills in ASD follow a delayed developmental pattern. Face processing skills were assessed using computerized tasks of face identity recognition (FR) and identification of facial emotions (IFE). ASD subjects showed impaired performance on several parameters of the FR and IFE task compared to TD control adolescents and adults. Whereas TD adolescents and adults outperformed TD children in both tasks, performance in ASD adolescents and adults was similar to the group of TD children. Within the groups of ASD and control adolescents and adults, no age-related changes in performance were found. Our findings corroborate and extend previous studies showing that ASD is characterised by broad impairments in the ability to process faces. These impairments seem to reflect a developmentally delayed pattern that remains stable throughout adolescence and adulthood.

  14. Detection of Emotional Faces: Salient Physical Features Guide Effective Visual Search

    ERIC Educational Resources Information Center

    Calvo, Manuel G.; Nummenmaa, Lauri

    2008-01-01

    In this study, the authors investigated how salient visual features capture attention and facilitate detection of emotional facial expressions. In a visual search task, a target emotional face (happy, disgusted, fearful, angry, sad, or surprised) was presented in an array of neutral faces. Faster detection of happy and, to a lesser extent,…

  15. Impaired neural processing of dynamic faces in left-onset Parkinson's disease.

    PubMed

    Garrido-Vásquez, Patricia; Pell, Marc D; Paulmann, Silke; Sehm, Bernhard; Kotz, Sonja A

    2016-02-01

    Parkinson's disease (PD) affects patients beyond the motor domain. According to previous evidence, one mechanism that may be impaired in the disease is face processing. However, few studies have investigated this process at the neural level in PD. Moreover, research using dynamic facial displays rather than static pictures is scarce, but highly warranted due to the higher ecological validity of dynamic stimuli. In the present study we aimed to investigate how PD patients process emotional and non-emotional dynamic face stimuli at the neural level using event-related potentials. Since the literature has revealed a predominantly right-lateralized network for dynamic face processing, we divided the group into patients with left (LPD) and right (RPD) motor symptom onset (right versus left cerebral hemisphere predominantly affected, respectively). Participants watched short video clips of happy, angry, and neutral expressions and engaged in a shallow gender decision task in order to avoid confounds of task difficulty in the data. In line with our expectations, the LPD group showed significant face processing deficits compared to controls. While there were no group differences in early, sensory-driven processing (fronto-central N1 and posterior P1), the vertex positive potential, which is considered the fronto-central counterpart of the face-specific posterior N170 component, had a reduced amplitude and delayed latency in the LPD group. This may indicate disturbances of structural face processing in LPD. Furthermore, the effect was independent of the emotional content of the videos. In contrast, static facial identity recognition performance in LPD was not significantly different from controls, and comprehensive testing of cognitive functions did not reveal any deficits in this group. We therefore conclude that PD, and more specifically the predominant right-hemispheric affection in left-onset PD, is associated with impaired processing of dynamic facial expressions.

  16. Differential Brain Activation to Angry Faces by Elite Warfighters: Neural Processing Evidence for Enhanced Threat Detection

    PubMed Central

    Paulus, Martin P.; Simmons, Alan N.; Fitzpatrick, Summer N.; Potterat, Eric G.; Van Orden, Karl F.; Bauman, James; Swain, Judith L.

    2010-01-01

    Background Little is known about the neural basis of elite performers and their optimal performance in extreme environments. The purpose of this study was to examine brain processing differences between elite warfighters and comparison subjects in brain structures that are important for emotion processing and interoception. Methodology/Principal Findings Navy Sea, Air, and Land Forces (SEALs) while off duty (n = 11) were compared with n = 23 healthy male volunteers while performing a simple emotion face-processing task during functional magnetic resonance imaging. Irrespective of the target emotion, elite warfighters relative to comparison subjects showed relatively greater right-sided insula, but attenuated left-sided insula, activation. Navy SEALs showed selectively greater activation to angry target faces relative to fearful or happy target faces bilaterally in the insula. This was not accounted for by contrasting positive versus negative emotions. Finally, these individuals also showed slower response latencies to fearful and happy target faces than did comparison subjects. Conclusions/Significance These findings support the hypothesis that elite warfighters deploy greater processing resources toward potential threat-related facial expressions and reduced processing resources to non-threat-related facial expressions. Moreover, rather than expending more effort in general, elite warfighters show more focused neural and performance tuning. In other words, greater neural processing resources are directed toward threat stimuli and processing resources are conserved when facing a nonthreat stimulus situation. PMID:20418943

  17. Neutral face classification using personalized appearance models for fast and robust emotion detection.

    PubMed

    Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha

    2015-09-01

    Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in typical applications such as video chat or photo album/web browsing. Detecting the neutral state at an early stage and thereby excluding those frames from emotion classification saves computational power. In this paper, we propose a light-weight neutral versus emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at key emotion (KE) points using a statistical texture model, constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on a statistical texture model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
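    As a rough illustration of the idea, not the authors' implementation, a neutral-vs-emotion pre-check can compare image patches around key emotion points against a per-user neutral template. The patch size, the normalized-correlation similarity, the threshold, and the `is_neutral` helper below are all assumptions for this sketch:

    ```python
    import numpy as np

    def patch_similarity(patch, template):
        """Normalized correlation between a patch and its neutral template."""
        p = (patch - patch.mean()) / (patch.std() + 1e-8)
        t = (template - template.mean()) / (template.std() + 1e-8)
        return float((p * t).mean())

    def is_neutral(frame_patches, neutral_templates, threshold=0.8):
        """Call a frame neutral only if every key-point patch stays close
        to the user's learned neutral appearance."""
        sims = [patch_similarity(p, t)
                for p, t in zip(frame_patches, neutral_templates)]
        return min(sims) >= threshold

    # Hypothetical 16x16 grayscale patches around two key emotion points
    rng = np.random.default_rng(1)
    templates = [rng.random((16, 16)) for _ in range(2)]
    neutral_frame = [t + rng.normal(0, 0.01, t.shape) for t in templates]  # near-neutral
    smiling_frame = [templates[0], rng.random((16, 16))]  # mouth-region patch changed

    print(is_neutral(neutral_frame, templates))  # True: all patches match the template
    print(is_neutral(smiling_frame, templates))  # False: one patch deviates
    ```

    Frames classified as neutral would then skip the heavier supervised emotion classifier, which is the computational saving the record describes.
    
    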

  18. Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics.

    PubMed

    Reinl, Maren; Bartels, Andreas

    2014-11-15

    Facial movement conveys important information for social interactions, yet its neural processing is poorly understood. Computational models propose that shape- and temporal sequence sensitive mechanisms interact in processing dynamic faces. While face processing regions are known to respond to facial movement, their sensitivity to particular temporal sequences has barely been studied. Here we used fMRI to examine the sensitivity of human face-processing regions to two aspects of directionality in facial movement trajectories. We presented genuine movie recordings of increasing and decreasing fear expressions, each of which were played in natural or reversed frame order. This two-by-two factorial design matched low-level visual properties, static content and motion energy within each factor, emotion-direction (increasing or decreasing emotion) and timeline (natural versus artificial). The results showed sensitivity for emotion-direction in FFA, which was timeline-dependent as it only occurred within the natural frame order, and sensitivity to timeline in the STS, which was emotion-direction-dependent as it only occurred for decreased fear. The occipital face area (OFA) was sensitive to the factor timeline. These findings reveal interacting temporal sequence sensitive mechanisms that are responsive to both ecological meaning and to prototypical unfolding of facial dynamics. These mechanisms are temporally directional, provide socially relevant information regarding emotional state or naturalness of behavior, and agree with predictions from modeling and predictive coding theory. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Deficits of unconscious emotional processing in patients with major depression: An ERP study.

    PubMed

    Zhang, Dandan; He, Zhenhong; Chen, Yuming; Wei, Zhaoguo

    2016-07-15

    Major depressive disorder (MDD) is associated with behavioral and neurobiological evidence of a negative bias in unconscious emotional processing. However, little is known about the time course of this deficit. The current study aimed to explore the unconscious processing of emotional facial expressions in MDD patients by means of event-related potentials (ERPs). The ERP responses to subliminally presented happy/neutral/sad faces were recorded in 26 medication-free patients and 26 healthy controls in a backward masking task. Three ERP components were compared between patients and controls. Detection accuracy was at chance level for both groups, suggesting that processing occurred in the absence of conscious awareness of the emotional stimuli. Robust emotion×group interactions were observed in P1, N170 and P3. Compared with the neutral faces, 1) the patients showed larger P1 for sad and smaller P1 for happy faces, whereas the controls showed the inverse P1 pattern; 2) the controls exhibited larger N170 in the happy but not in the sad trials, whereas the patients had comparably larger N170 amplitudes in both sad and happy trials; 3) although both groups exhibited larger P3 for emotional faces, the patients showed a priority for sad trials while the controls showed a priority for happy trials. Our data suggest that a negative processing bias exists at the unconscious level in individuals with MDD. The ERP measures indicated that this bias unfolds across three successive stages of unconscious emotional processing (indexed by P1, N170, and P3). Copyright © 2016 Elsevier B.V. All rights reserved.
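    The chance-level detection check this record relies on is commonly assessed with a binomial test of forced-choice accuracy against 50%. The trial counts below are hypothetical, not the study's data:

    ```python
    from scipy.stats import binomtest

    # Hypothetical masking-check data for one participant:
    # forced-choice detection of the masked face, tested against chance (p = 0.5).
    n_trials = 120   # hypothetical number of masking-check trials
    n_correct = 63   # hypothetical number of correct responses

    result = binomtest(n_correct, n_trials, p=0.5)
    print(f"accuracy = {n_correct / n_trials:.2f}, p = {result.pvalue:.3f}")

    # A non-significant p-value is consistent with chance-level detection,
    # i.e., the masked faces were not consciously perceived.
    ```

    In practice this check would be run per participant (and often complemented by a Bayesian equivalence test), since a group-level null can hide individual above-chance perceivers.
    
    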

  20. The ties to unbind: age-related differences in feature (un)binding in working memory for emotional faces.

    PubMed

    Pehlivanoglu, Didem; Jain, Shivangi; Ariel, Robert; Verhaeghen, Paul

    2014-01-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyper-binding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face's emotional expression (expression condition), or both identity and expression of the face (both condition). The two age groups performed more slowly and with lower accuracy in the expression condition than in the both condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory. Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/STM and working memory. Pupil dilation data confirmed that the attention/STM version of the task (1-Back) is more effortful for older adults than younger adults.
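    The N-Back match/mismatch logic described in this record can be sketched generically; the stimulus labels and the `nback_matches` helper are hypothetical, not the study's materials:

    ```python
    from collections import deque

    def nback_matches(stream, n, key):
        """Yield a match/mismatch decision for each stimulus that has an
        n-back predecessor; `key` selects the judged dimension(s)."""
        buf = deque(maxlen=n)
        for stim in stream:
            if len(buf) == n:
                yield key(stim) == key(buf[0])  # buf[0] is the stimulus n steps back
            buf.append(stim)

    # Stimuli as (identity, expression) pairs; labels are hypothetical
    stream = [("A", "happy"), ("B", "anger"), ("A", "anger"), ("A", "happy")]

    identity_only = list(nback_matches(stream, 2, key=lambda s: s[0]))  # identity condition
    both = list(nback_matches(stream, 2, key=lambda s: s))              # identity + expression
    print(identity_only)  # [True, False]
    print(both)           # [False, False]
    ```

    The contrast between the `identity_only` and `both` conditions mirrors the (un)binding manipulation: judging a single feature requires separating it from the bound face stimulus.
    
    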

  1. Neurocognitive mechanisms behind emotional attention: Inverse effects of anodal tDCS over the left and right DLPFC on gaze disengagement from emotional faces.

    PubMed

    Sanchez-Lopez, Alvaro; Vanderhasselt, Marie-Anne; Allaert, Jens; Baeken, Chris; De Raedt, Rudi

    2018-06-01

    Attention to relevant emotional information in the environment is an important process related to vulnerability and resilience for mood and anxiety disorders. In the present study, the effects of left and right dorsolateral prefrontal cortex (i.e., DLPFC) stimulation on attentional mechanisms of emotional processing were tested and contrasted. A sample of 54 healthy participants received 20 min of active and sham anodal transcranial direct current stimulation (i.e., tDCS) either of the left (n = 27) or of the right DLPFC (n = 27) on two separate days. The anode electrode was placed over the left or the right DLPFC, the cathode over the corresponding contralateral supraorbital area. After each neurostimulation session, participants completed an eye-tracking task assessing direct processes of attentional engagement towards and attentional disengagement away from emotional faces (happy, disgusted, and sad expressions). Compared to sham, active tDCS over the left DLPFC led to faster gaze disengagement, whereas active tDCS over the right DLPFC led to slower gaze disengagement from emotional faces. Between-group comparisons showed that such inverse change patterns were significantly different and generalized across all types of emotion. Our findings support a lateralized role of left and right DLPFC activity in enhancing/worsening the top-down regulation of emotional attention processing. These results support the rationale of new therapies for affective disorders aimed at increasing the activation of the left over the right DLPFC in combination with attentional control training, and identify specific target attention mechanisms to be trained.

  2. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    PubMed Central

    Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643

  3. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    PubMed

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  4. Sensory Contributions to Impaired Emotion Processing in Schizophrenia

    PubMed Central

    Butler, Pamela D.; Abeles, Ilana Y.; Weiskopf, Nicole G.; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E.; Zemon, Vance; Loughead, James; Gur, Ruben C.; Javitt, Daniel C.

    2009-01-01

    Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective. PMID:19793797

  5. Sensory contributions to impaired emotion processing in schizophrenia.

    PubMed

    Butler, Pamela D; Abeles, Ilana Y; Weiskopf, Nicole G; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E; Zemon, Vance; Loughead, James; Gur, Ruben C; Javitt, Daniel C

    2009-11-01

    Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective.

  6. Emotional Devaluation of Distracting Patterns and Faces: A Consequence of Attentional Inhibition during Visual Search?

    ERIC Educational Resources Information Center

    Raymond, Jane E.; Fenske, Mark J.; Westoby, Nikki

    2005-01-01

    Visual search has been studied extensively, yet little is known about how its constituent processes affect subsequent emotional evaluation of searched-for and searched-through items. In 3 experiments, the authors asked observers to locate a colored pattern or tinted face in an array of other patterns or faces. Shortly thereafter, either the target…

  7. Perceptual Biases in Processing Facial Identity and Emotion

    ERIC Educational Resources Information Center

    Coolican, Jamesie; Eskes, Gail A.; McMullen, Patricia A.; Lecky, Erin

    2008-01-01

    Normal observers demonstrate a bias to process the left sides of faces during perceptual judgments about identity or emotion. This effect suggests a right cerebral hemisphere processing bias. To test the role of the right hemisphere and the involvement of configural processing underlying this effect, young and older control observers and patients…

  8. Assessment of Emotional Expressions after Full-Face Transplantation.

    PubMed

    Topçu, Çağdaş; Uysal, Hilmi; Özkan, Ömer; Özkan, Özlenen; Polat, Övünç; Bedeloğlu, Merve; Akgül, Arzu; Döğer, Ela Naz; Sever, Refik; Barçın, Nur Ebru; Tombak, Kadriye; Çolak, Ömer Halil

    2017-01-01

    We assessed clinical features as well as sensory and motor recoveries in 3 full-face transplantation patients. A frequency analysis was performed on facial surface electromyography data collected during 6 basic emotional expressions and 4 primary facial movements. Motor progress was assessed using the wavelet packet method by comparison against the mean results obtained from 10 healthy subjects. Analyses were conducted on 1 patient at approximately 1 year after face transplantation and at 2 years after transplantation in the remaining 2 patients. Motor recovery was observed following sensory recovery in all 3 patients; however, the 3 cases had different backgrounds and exhibited different degrees and rates of sensory and motor improvements after transplant. Wavelet packet energy was detected in all patients during emotional expressions and primary movements; however, there were fewer active channels during expressions in transplant patients compared to healthy individuals, and patterns of wavelet packet energy were different for each patient. Finally, high-frequency components were typically detected in patients during emotional expressions, but fewer channels demonstrated these high-frequency components in patients compared to healthy individuals. Our data suggest that the posttransplantation recovery of emotional facial expression requires neural plasticity.

  9. In search of the emotional face: anger versus happiness superiority in visual search.

    PubMed

    Savage, Ruth A; Lipp, Ottmar V; Craig, Belinda M; Becker, Stefanie I; Horstmann, Gernot

    2013-08-01

    Previous research has provided inconsistent results regarding visual search for emotional faces, yielding evidence for either anger superiority (i.e., more efficient search for angry faces) or happiness superiority effects (i.e., more efficient search for happy faces), suggesting that these results do not reflect emotional expression per se, but emotion-(un)related low-level perceptual features. The present study investigated possible factors mediating anger/happiness superiority effects; specifically, search strategy (fixed vs. variable target search; Experiment 1), stimulus choice (Nimstim database vs. Ekman & Friesen database; Experiments 1 and 2), and emotional intensity (Experiments 3 and 3a). Angry faces were found faster than happy faces regardless of search strategy using faces from the Nimstim database (Experiment 1). By contrast, a happiness superiority effect was evident in Experiment 2 when using faces from the Ekman and Friesen database. Experiment 3 employed angry, happy, and exuberant expressions (Nimstim database) and yielded anger and happiness superiority effects, respectively, highlighting the importance of the choice of stimulus materials. Ratings of the stimulus materials collected in Experiment 3a indicate that differences in perceived emotional intensity, pleasantness, or arousal do not account for differences in search efficiency. Across three studies, the current investigation indicates that prior reports of anger or happiness superiority effects in visual search are likely to reflect low-level visual features associated with the stimulus materials used, rather than emotion. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  10. Face Patch Resting State Networks Link Face Processing to Social Cognition

    PubMed Central

    Schwiedrzik, Caspar M.; Zarco, Wilbert; Everling, Stefan; Freiwald, Winrich A.

    2015-01-01

    Faces transmit a wealth of social information. How this information is exchanged between face-processing centers and brain areas supporting social cognition remains largely unclear. Here we identify these routes using resting state functional magnetic resonance imaging in macaque monkeys. We find that face areas functionally connect to specific regions within frontal, temporal, and parietal cortices, as well as subcortical structures supporting emotive, mnemonic, and cognitive functions. This establishes the existence of an extended face-recognition system in the macaque. Furthermore, the face patch resting state networks and the default mode network in monkeys show a pattern of overlap akin to that between the social brain and the default mode network in humans: this overlap specifically includes the posterior superior temporal sulcus, medial parietal, and dorsomedial prefrontal cortex, areas supporting high-level social cognition in humans. Together, these results reveal the embedding of face areas into larger brain networks and suggest that the resting state networks of the face patch system offer a new, easily accessible venue into the functional organization of the social brain and into the evolution of possibly uniquely human social skills. PMID:26348613

  11. Neural activity and emotional processing following military deployment: Effects of mild traumatic brain injury and posttraumatic stress disorder.

    PubMed

    Zuj, Daniel V; Felmingham, Kim L; Palmer, Matthew A; Lawrence-Wood, Ellie; Van Hooff, Miranda; Lawrence, Andrew J; Bryant, Richard A; McFarlane, Alexander C

    2017-11-01

    Posttraumatic Stress Disorder (PTSD) and mild traumatic brain injury (mTBI) are common comorbidities during military deployment that affect emotional brain processing, yet few studies have examined the independent effects of mTBI and PTSD. The purpose of this study was to examine distinct differences in neural responses to emotional faces in mTBI and PTSD. Twenty-one soldiers reporting high PTSD symptoms were compared to 21 soldiers with low symptoms, and 16 soldiers who reported mTBI-consistent injury and symptoms were compared with 16 soldiers who did not sustain an mTBI. Participants viewed emotional face expressions while their neural activity was recorded (via event-related potentials) prior to and following deployment. The high-PTSD group displayed increased P1 and P2 amplitudes to threatening faces at post-deployment compared to the low-PTSD group. In contrast, the mTBI group displayed reduced face-specific processing (N170 amplitude) to all facial expressions compared to the no-mTBI group. Here, we identified distinctive neural patterns of emotional face processing, with attentional biases towards threatening faces in PTSD, and reduced emotional face processing in mTBI. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Age- and gender-related variations of emotion recognition in pseudowords and faces.

    PubMed

    Demenescu, Liliana R; Mathiak, Krystyna A; Mathiak, Klaus

    2014-01-01

    BACKGROUND/STUDY CONTEXT: The ability to interpret emotionally salient stimuli is an important skill for successful social functioning at any age. The objective of the present study was to disentangle age and gender effects on emotion recognition ability in voices and faces. Three age groups of participants (young, age range: 18-35 years; middle-aged, age range: 36-55 years; and older, age range: 56-75 years) identified basic emotions presented in voices and faces in a forced-choice paradigm. Five emotions (angry, fearful, sad, disgusted, and happy) and a nonemotional category (neutral) were shown as encoded in color photographs of facial expressions and pseudowords spoken in affective prosody. Overall, older participants had a lower accuracy rate in categorizing emotions than young and middle-aged participants. Females performed better than males in recognizing emotions from voices, and this gender difference emerged in middle-aged and older participants. The performance of emotion recognition in faces was significantly correlated with the performance in voices. The current study provides further evidence for a general age and gender effect on emotion recognition; the advantage of females seems to be age- and stimulus modality-dependent.

  13. Sex differences in functional activation patterns revealed by increased emotion processing demands.

    PubMed

    Hall, Geoffrey B C; Witelson, Sandra F; Szechtman, Henry; Nahmias, Claude

    2004-02-09

    Two [O-15] PET studies assessed sex differences in regional brain activation during the recognition of emotional stimuli. Study I revealed that the recognition of emotion in visual faces resulted in bilateral frontal activation in women and unilateral right-sided activation in men. In Study II, the complexity of the emotional face task was increased through the addition of associated auditory emotional stimuli. Men again showed unilateral frontal activation, in this case on the left, whereas women did not show bilateral frontal activation but showed greater limbic activity. These results suggest that when processing broader cross-modal emotional stimuli, men engage more in associative cognitive strategies while women draw more on primary emotional references.

  14. Exploring the Role of Spatial Frequency Information during Neural Emotion Processing in Human Infants.

    PubMed

    Jessen, Sarah; Grossmann, Tobias

    2017-01-01

    Enhanced attention to fear expressions in adults is primarily driven by information from low as opposed to high spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants (N = 26). Our results revealed that infants' brains discriminated between emotional facial expressions containing high but not between expressions containing low spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and happy and fearful faces containing low spatial frequencies. Our results demonstrate that, already in infancy, spatial frequency content influences the processing of facial emotions. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting a robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent upon frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.
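
    The low- vs. high-spatial-frequency split behind such filtered face stimuli can be sketched with a hard cutoff in the 2-D Fourier domain. This is a minimal NumPy sketch under assumed parameters; the abstract does not specify the authors' cutoff or filter shape (published infant ERP studies typically express cutoffs in cycles per face).

```python
import numpy as np

def split_spatial_frequencies(img, cutoff):
    """Split a grayscale image into low- and high-spatial-frequency parts
    using a hard circular cutoff (in cycles per image) in the Fourier domain."""
    h, w = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))
    # Frequency coordinates in cycles per image along each axis.
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * h
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * w
    radius = np.hypot(fy[:, None], fx[None, :])
    low_mask = radius <= cutoff
    low = np.fft.ifft2(np.fft.ifftshift(F * low_mask)).real
    high = img - low               # complementary high-frequency residual
    return low, high
```

    Defining the high band as the residual guarantees that the two parts sum back to the original image exactly, so no image energy is lost in the split.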

  15. Emotional contexts modulate intentional memory suppression of neutral faces: Insights from ERPs.

    PubMed

    Pierguidi, Lapo; Righi, Stefania; Gronchi, Giorgio; Marzi, Tessa; Caharel, Stephanie; Giovannelli, Fabio; Viggiano, Maria Pia

    2016-08-01

    The main goal of the present work is to gain new insight into the temporal dynamics underlying voluntary memory control for neutral faces associated with neutral, positive, and negative contexts. A directed forgetting (DF) procedure was used during EEG recording to answer the question of whether it is possible to forget a face that has been encoded within a particular emotional context. A face-scene phase, in which a neutral face was shown in a neutral or emotional scene (positive, negative), was followed by a voluntary memory cue (cue phase) indicating whether the face was to be remembered or forgotten (TBR and TBF). Memory for faces was then assessed with an old/new recognition task. Behaviorally, we found that it is harder to suppress faces-in-positive-scenes than faces-in-negative- and faces-in-neutral-scenes. The temporal information obtained from the ERPs showed: 1) during the face-scene phase, the Late Positive Potential (LPP), which indexes motivated emotional attention, was larger for faces-in-negative-scenes than for faces-in-neutral-scenes; 2) remarkably, during the cue phase, ERPs were significantly modulated by the emotional contexts. Faces-in-neutral-scenes showed an ERP pattern typically associated with the DF effect, whereas faces-in-positive-scenes elicited the reverse ERP pattern. Faces-in-negative-scenes did not show differences in the DF-related neural activities, but a larger N1 amplitude for TBF vs. TBR faces may index early attentional deployment. These results support the hypothesis that the pleasantness or unpleasantness of the context (through attentional broadening and narrowing mechanisms, respectively) may modulate the effectiveness of intentional memory suppression of neutral information. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Neural markers of opposite-sex bias in face processing.

    PubMed

    Proverbio, Alice Mado; Riva, Federica; Martin, Eleonora; Zani, Alberto

    2010-01-01

    Some behavioral and neuroimaging studies suggest that adults prefer to view attractive faces of the opposite sex more than attractive faces of the same sex. However, unlike the other-race face effect (Caldara et al., 2004), little is known regarding the existence of an opposite-/same-sex bias in face processing. In this study, the faces of 130 attractive male and female adults were foveally presented to 40 heterosexual university students (20 men and 20 women) who were engaged in a secondary perceptual task (landscape detection). The automatic processing of face gender was investigated by recording ERPs from 128 scalp sites. Neural markers of opposite- vs. same-sex bias in face processing included larger and earlier centro-parietal N400s in response to faces of the opposite sex and a larger late positivity (LP) to same-sex faces. Analysis of intra-cortical neural generators (swLORETA) showed that facial processing-related (FG, BA37, BA20/21) and emotion-related brain areas (the right parahippocampal gyrus, BA35; uncus, BA36/38; and the cingulate gyrus, BA24) had higher activations in response to opposite- than same-sex faces. The results of this analysis, along with data obtained from ERP recordings, support the hypothesis that both genders process opposite-sex faces differently than same-sex faces. The data also suggest a hemispheric asymmetry in the processing of opposite-/same-sex faces, with the right hemisphere involved in processing same-sex faces and the left hemisphere involved in processing faces of the opposite sex. The data support previous literature suggesting a right lateralization for the representation of self-image and body awareness.

  17. Cognitive emotion regulation in children: Reappraisal of emotional faces modulates neural source activity in a frontoparietal network.

    PubMed

    Wessing, Ida; Rehbein, Maimu A; Romer, Georg; Achtergarde, Sandra; Dobel, Christian; Zwitserlood, Pienie; Fürniss, Tilman; Junghöfer, Markus

    2015-06-01

    Emotion regulation plays an important role in child development and psychopathology. Reappraisal, as a cognitive regulation technique, can be used effectively by children. Moreover, the late positive potential (LPP), an ERP component known to reflect emotional processing, can be modulated by children using reappraisal, and this modulation is also related to children's emotional adjustment. The present study seeks to elucidate the neural generators of such LPP effects. To this end, children aged 8-14 years reappraised emotional faces while neural activity in an LPP time window was estimated using magnetoencephalography-based source localization. Additionally, neural activity was correlated with two indexes of emotional adjustment and with age. Reappraisal reduced activity in the left dorsolateral prefrontal cortex during down-regulation and enhanced activity in the right parietal cortex during up-regulation. Activity in the visual cortex decreased with increasing age, more adaptive emotion regulation, and less anxiety. These results demonstrate that reappraisal changed activity within a frontoparietal network in children. The decrease in visual cortex activity with increasing age is suggested to reflect neural maturation. A similar decrease with adaptive emotion regulation and less anxiety implies that better emotional adjustment may be associated with an advance in neural maturation. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Mixed emotions: Sensitivity to facial variance in a crowd of faces.

    PubMed

    Haberman, Jason; Lee, Pegan; Whitney, David

    2015-01-01

    The visual system automatically represents summary information from crowds of faces, such as the average expression. This is a useful heuristic insofar as it provides critical information about the state of the world, not simply information about the state of one individual. However, the average alone is not sufficient for making decisions about how to respond to a crowd. The variance or heterogeneity of the crowd (the mixture of emotions) conveys information about the reliability of the average, essential for determining whether the average can be trusted. Despite its importance, the representation of variance within a crowd of faces has yet to be examined. This is addressed here in three experiments. In the first experiment, observers viewed a sample set of faces that varied in emotion and then adjusted a subsequent set to match the variance of the sample set. To isolate variance as the summary statistic of interest, the average emotion of both sets was random. Results suggested that observers had information regarding crowd variance. The second experiment verified that this was indeed a uniquely high-level phenomenon, as observers were unable to derive the variance of an inverted set of faces as precisely as that of an upright set. The third experiment replicated and extended the first two experiments using the method of constant stimuli. Together, these results show that the visual system is sensitive to emergent information about the emotional heterogeneity, or ambivalence, in crowds of faces.

  19. Reading Faces: Differential Lateral Gaze Bias in Processing Canine and Human Facial Expressions in Dogs and 4-Year-Old Children

    PubMed Central

    Racca, Anaïs; Guo, Kun; Meints, Kerstin; Mills, Daniel S.

    2012-01-01

    Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purpose, a similar experiment was also run with 4-year-old children and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions. PMID:22558335

  20. Visual Afterimages of Emotional Faces in High Functioning Autism

    ERIC Educational Resources Information Center

    Rutherford, M. D.; Troubridge, Erin K.; Walsh, Jennifer

    2012-01-01

    Fixating an emotional facial expression can create afterimages, such that subsequent faces are seen as having the opposite expression of that fixated. Visual afterimages have been used to map the relationships among emotion categories, and this method was used here to compare ASD and matched control participants. Participants adapted to a facial…

  1. Gender differences in human single neuron responses to male emotional faces.

    PubMed

    Newhoff, Morgan; Treiman, David M; Smith, Kris A; Steinmetz, Peter N

    2015-01-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15∕66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6∕76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p < 0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala.
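
    The reported amygdala comparison (15 of 66 responsive neurons in men vs. 6 of 76 in women) can be recomputed with a hand-rolled Fisher exact test using only the standard library. This is an illustrative sketch: the abstract does not state the authors' software or whether the test was one- or two-sided, so the one-sided p-value below is for demonstration, not a claimed reproduction of the published p < 0.01.

```python
from math import comb

def fisher_exact_greater(a, b, c, d):
    """One-sided (greater) Fisher exact p-value for the 2x2 table
    [[a, b], [c, d]]: P(X >= a) with X ~ Hypergeom(N=a+b+c+d, K=a+c, n=a+b)."""
    N, K, n = a + b + c + d, a + c, a + b
    denom = comb(N, n)
    return sum(comb(K, k) * comb(N - K, n - k)
               for k in range(a, min(K, n) + 1)) / denom

# Men: 15 of 66 neurons emotion-responsive; women: 6 of 76.
p = fisher_exact_greater(15, 66 - 15, 6, 76 - 6)
```

    The proportions work out to roughly 23% vs. 8%, and the one-sided tail probability is small, consistent with the reported significant gender difference.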

  2. Gender differences in human single neuron responses to male emotional faces

    PubMed Central

    Newhoff, Morgan; Treiman, David M.; Smith, Kris A.; Steinmetz, Peter N.

    2015-01-01

    Well-documented differences in the psychology and behavior of men and women have spurred extensive exploration of gender's role within the brain, particularly regarding emotional processing. While neuroanatomical studies clearly show differences between the sexes, the functional effects of these differences are less understood. Neuroimaging studies have shown inconsistent locations and magnitudes of gender differences in brain hemodynamic responses to emotion. To better understand the neurophysiology of these gender differences, we analyzed recordings of single neuron activity in the human brain as subjects of both genders viewed emotional expressions. This study included recordings of single-neuron activity of 14 (6 male) epileptic patients in four brain areas: amygdala (236 neurons), hippocampus (n = 270), anterior cingulate cortex (n = 256), and ventromedial prefrontal cortex (n = 174). Neural activity was recorded while participants viewed a series of avatar male faces portraying positive, negative or neutral expressions. Significant gender differences were found in the left amygdala, where 23% (n = 15∕66) of neurons in men were significantly affected by facial emotion, vs. 8% (n = 6∕76) of neurons in women. A Fisher's exact test comparing the two ratios found a highly significant difference between the two (p < 0.01). These results show specific differences between genders at the single-neuron level in the human amygdala. These differences may reflect gender-based distinctions in evolved capacities for emotional processing and also demonstrate the importance of including subject gender as an independent factor in future studies of emotional processing by single neurons in the human amygdala. PMID:26441597

  3. Virtual faces expressing emotions: an initial concomitant and construct validity study.

    PubMed

    Joyal, Christian C; Jacob, Laurence; Cigna, Marie-Hélène; Guay, Jean-Pierre; Renaud, Patrice

    2014-01-01

    Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback loops between physiological measures and the virtual agent. The goal of this study was to initially assess the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eyes and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Emotions expressed by each set of stimuli were similarly recognized by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in eye regions from male and female participants. Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-computer interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.

  4. Memory for faces and voices varies as a function of sex and expressed emotion.

    PubMed

    S Cortes, Diana; Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection ("remember" hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and an own-sex bias whereby female participants displayed a memory advantage for female faces and face-voice combinations. Results further suggest that the own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy.
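
    The accuracy measure used above (hits minus false alarms) is a standard correction for response bias in recognition memory. A tiny sketch with hypothetical counts (the function name and all numbers are illustrative, not from the study):

```python
def corrected_recognition(hits, false_alarms, n_old, n_new):
    """Recognition accuracy as hit rate minus false-alarm rate."""
    return hits / n_old - false_alarms / n_new

# hypothetical counts: 24 studied ("old") and 24 unstudied ("new") items
acc_neutral = corrected_recognition(hits=18, false_alarms=3, n_old=24, n_new=24)
acc_angry = corrected_recognition(hits=16, false_alarms=5, n_old=24, n_new=24)
print(acc_neutral, acc_angry)  # here neutral items score higher, as in the abstract
```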

  5. Memory for faces and voices varies as a function of sex and expressed emotion

    PubMed Central

    Laukka, Petri; Lindahl, Christina; Fischer, Håkan

    2017-01-01

    We investigated how memory for faces and voices (presented separately and in combination) varies as a function of sex and emotional expression (anger, disgust, fear, happiness, sadness, and neutral). At encoding, participants judged the expressed emotion of items in forced-choice tasks, followed by incidental Remember/Know recognition tasks. Results from 600 participants showed that accuracy (hits minus false alarms) was consistently higher for neutral compared to emotional items, whereas accuracy for specific emotions varied across the presentation modalities (i.e., faces, voices, and face-voice combinations). For the subjective sense of recollection (“remember” hits), neutral items received the highest hit rates only for faces, whereas for voices and face-voice combinations anger and fear expressions instead received the highest recollection rates. We also observed better accuracy for items by female expressers, and an own-sex bias whereby female participants displayed a memory advantage for female faces and face-voice combinations. Results further suggest that the own-sex bias can be explained by recollection, rather than familiarity, rates. Overall, results show that memory for faces and voices may be influenced by the expressions that they carry, as well as by the sex of both items and participants. Emotion expressions may also enhance the subjective sense of recollection without enhancing memory accuracy. PMID:28570691

  6. Happy faces, sad faces: Emotion understanding in toddlers and preschoolers with language impairments.

    PubMed

    Rieffe, Carolien; Wiefferink, Carin H

    2017-03-01

    The capacity for emotion recognition and understanding is crucial for daily social functioning. We examined to what extent this capacity is impaired in young children with a language impairment (LI). In typical development, children learn to recognize emotions in faces and situations through social experiences and social learning. Children with LI have less access to these experiences and are therefore expected to fall behind their peers without LI. In this study, 89 preschool children with LI and 202 children without LI (mean age 3 years and 10 months in both groups) were tested on three indices of facial emotion recognition (discrimination, identification, and attribution in emotion-evoking situations). Parents reported on their children's emotion vocabulary and ability to talk about their own emotions. Preschoolers with and without LI performed similarly on the non-verbal task for emotion discrimination. Children with LI fell behind their peers without LI on the other two emotion recognition tasks, which involved labelling the four basic emotions (happy, sad, angry, fear). The outcomes of these two tasks were also related to children's level of emotion language. These outcomes emphasize the importance of 'emotion talk' at the youngest age possible for children with LI. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. In the face of emotions: event-related potentials in supraliminal and subliminal facial expression recognition.

    PubMed

    Balconi, Michela; Lucchiari, Claudio

    2005-02-01

    Is facial expression recognition marked by specific event-related potential (ERP) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded while 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency measures. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing was revealed to be quite similar to aware information processing for peak amplitude but not for latency. In fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.

  8. An Event-Related Potential Study on the Effects of Cannabis on Emotion Processing

    PubMed Central

    Troup, Lucy J.; Bastidas, Stephanie; Nguyen, Maia T.; Andrzejewski, Jeremy A.; Bowers, Matthew; Nomi, Jason S.

    2016-01-01

    The effect of cannabis on emotional processing was investigated using event-related potential (ERP) paradigms. ERPs associated with emotional processing in cannabis users and non-using controls were recorded and compared during an implicit and explicit emotional expression recognition and empathy task. Comparisons in P3 component mean amplitudes were made between cannabis users and controls. Results showed a significant decrease in the P3 amplitude in cannabis users compared to controls. Specifically, cannabis users showed reduced P3 amplitudes for implicit compared to explicit processing over centro-parietal sites, an effect which reversed, and was enhanced, at fronto-central sites. Cannabis users also showed a decreased P3 to happy faces, with an increase to angry faces, compared to controls. These effects appear to increase in participants who self-reported the highest levels of cannabis consumption. Those cannabis users with the greatest consumption rates showed the largest P3 deficits for explicit processing and negative emotions. These data suggest a complex relationship between cannabis consumption and emotion processing that appears to be modulated by attention. PMID:26926868

  9. Time for a Change: College Students' Preference for Technology-Mediated Versus Face-to-Face Help for Emotional Distress.

    PubMed

    Lungu, Anita; Sun, Michael

    2016-12-01

    Even with recent advances in psychological treatments and mobile technology, online computerized therapy is not yet popular. College students, with ubiquitous access to technology, experiencing high distress, and often nontreatment seekers, could be an important area for online treatment dissemination. Finding ways to reach out to college students by offering psychological interventions through technology, devices, and applications they often use might increase their engagement in treatment. This study evaluates college students' reported willingness to seek help for emotional distress through novel delivery mediums, to play computer games for learning emotional coping skills, and to disclose personal information online. We also evaluated the role of ethnicity and level of emotional distress in help-seeking patterns. A survey exploring our domains of interest and the Mental Health Inventory (MHI; used as a mental health index) were completed by 572 students (mean age 18.7 years, predominantly Asian American, female, and freshmen in college). More participants expressed preference for online versus face-to-face professional help. We found no relationship between MHI and help-seeking preference. A third of participants were likely to disclose at least as much information online as face-to-face. Ownership of mobile technology was pervasive. Asian Americans were more likely to be nontreatment seekers than Caucasians. Most participants were interested in serious games for emotional distress. Our results suggest that college students are very open to creative ways of receiving emotional help, such as playing games and seeking emotional help online, suggesting a need for online evidence-based treatments.

  10. Dissociable neural effects of stimulus valence and preceding context during the inhibition of responses to emotional faces.

    PubMed

    Schulz, Kurt P; Clerkin, Suzanne M; Halperin, Jeffrey M; Newcorn, Jeffrey H; Tang, Cheuk Y; Fan, Jin

    2009-09-01

    Socially appropriate behavior requires the concurrent inhibition of actions that are inappropriate in the context. This self-regulatory function requires an interaction of inhibitory and emotional processes that recruits brain regions beyond those engaged by either process alone. In this study, we isolated brain activity associated with response inhibition and emotional processing in 24 healthy adults using event-related functional magnetic resonance imaging (fMRI) and a go/no-go task that independently manipulated the context preceding no-go trials (i.e., the number of go trials) and the valence (i.e., happy, sad, and neutral) of the face stimuli used as trial cues. Parallel quadratic trends were seen in correct inhibitions on no-go trials preceded by increasing numbers of go trials and in associated activation for correct no-go trials in inferior frontal gyrus pars opercularis, pars triangularis, and pars orbitalis, temporoparietal junction, superior parietal lobule, and temporal sensory association cortices. Conversely, the comparison of happy versus neutral faces and sad versus neutral faces revealed valence-dependent activation in the amygdala, anterior insula cortex, and posterior midcingulate cortex. Further, an interaction between inhibition and emotion was seen in valence-dependent variations in the quadratic trend in no-go activation in the right inferior frontal gyrus and left posterior insula cortex. These results suggest that the inhibition of responses to emotional cues involves the interaction of partly dissociable limbic and frontoparietal networks that encode emotional cues and use these cues to exert inhibitory control over the motor, attention, and sensory functions needed to perform the task, respectively. 2008 Wiley-Liss, Inc.
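
    A "quadratic trend" across an ordered factor can be tested with planned contrast weights that sum to zero. A minimal sketch of that arithmetic (the condition means and weights are hypothetical, chosen only to illustrate the computation):

```python
def contrast_score(means, weights):
    """Apply planned contrast weights (which must sum to zero) to condition means."""
    assert abs(sum(weights)) < 1e-9, "contrast weights must sum to zero"
    return sum(w * m for m, w in zip(means, weights))

# hypothetical no-go accuracy after 1, 3, or 5 preceding go trials
means = [0.92, 0.81, 0.88]
quad = contrast_score(means, [1, -2, 1])  # quadratic weights for 3 ordered levels
print(quad)  # a nonzero score indicates curvature across the ordered conditions
```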

  11. Crossmodal adaptation in right posterior superior temporal sulcus during face-voice emotional integration.

    PubMed

    Watson, Rebecca; Latinus, Marianne; Noguchi, Takao; Garrod, Oliver; Crabbe, Frances; Belin, Pascal

    2014-05-14

    The integration of emotional information from the face and voice of other persons is known to be mediated by a number of "multisensory" cerebral regions, such as the right posterior superior temporal sulcus (pSTS). However, whether multimodal integration in these regions is attributable to interleaved populations of unisensory neurons responding to face or voice, or rather to multimodal neurons receiving input from the two modalities, is not fully clear. Here, we examine this question using functional magnetic resonance adaptation and dynamic audiovisual stimuli in which emotional information was manipulated parametrically and independently in the face and voice via morphing between angry and happy expressions. Healthy human adult subjects were scanned while performing a happy/angry emotion categorization task on a series of such stimuli included in a fast event-related, continuous carryover design. Subjects integrated both face and voice information when categorizing emotion, although there was a greater weighting of face information, and showed behavioral adaptation effects both within and across modality. Adaptation also occurred at the neural level: in addition to modality-specific adaptation in visual and auditory cortices, we observed for the first time a crossmodal adaptation effect. Specifically, fMRI signal in the right pSTS was reduced in response to a stimulus in which facial emotion was similar to the vocal emotion of the preceding stimulus. These results suggest that the integration of emotional information from face and voice in the pSTS involves a detectable proportion of bimodal neurons that combine inputs from visual and auditory cortices. Copyright © 2014 the authors.

  12. Emotion processing biases and resting EEG activity in depressed adolescents

    PubMed Central

    Auerbach, Randy P.; Stewart, Jeremy G.; Stanton, Colin H.; Mueller, Erik M.; Pizzagalli, Diego A.

    2015-01-01

    Background While theorists have posited that adolescent depression is characterized by emotion processing biases (greater propensity to identify sad than happy facial expressions), findings have been mixed. Additionally, the neural correlates associated with putative emotion processing biases remain largely unknown. Our aim was to identify emotion processing biases in depressed adolescents and examine neural abnormalities related to these biases using high-density resting EEG and source localization. Methods Healthy (n = 36) and depressed (n = 23) female adolescents, aged 13–18 years, completed a facial recognition task in which they identified happy, sad, fear, and angry expressions across intensities from 10% (low) to 100% (high). Additionally, 128-channel resting (i.e., task-free) EEG was recorded and analyzed using a distributed source localization technique (LORETA). Given research implicating the dorsolateral prefrontal cortex (DLPFC) in depression and emotion processing, analyses focused on this region. Results Relative to healthy youth, depressed adolescents were more accurate for sad and less accurate for happy, particularly low-intensity happy faces. No differences emerged for fearful or angry facial expressions. Further, LORETA analyses revealed greater theta and alpha current density (i.e., reduced brain activity) in depressed versus healthy adolescents, particularly in the left DLPFC (BA9/BA46). Theta and alpha current density were positively correlated, and greater current density predicted reduced accuracy for happy faces. Conclusion Depressed female adolescents were characterized by emotion processing biases in favor of sad emotions and reduced recognition of happiness, especially when cues of happiness were subtle. Blunted recognition of happy was associated with left DLPFC resting hypoactivity. PMID:26032684

  13. How stable is activation in the amygdala and prefrontal cortex in adolescence? A study of emotional face processing across three measurements.

    PubMed

    van den Bulk, Bianca G; Koolschijn, P Cédric M P; Meens, Paul H F; van Lang, Natasja D J; van der Wee, Nic J A; Rombouts, Serge A R B; Vermeiren, Robert R J M; Crone, Eveline A

    2013-04-01

    Prior developmental functional magnetic resonance imaging (fMRI) studies have demonstrated elevated activation patterns in the amygdala and prefrontal cortex (PFC) in response to viewing emotional faces. As adolescence is a time of substantial variability in mood and emotional responsiveness, activation patterns could fluctuate over time. In the current study, 27 healthy adolescents (age: 12-19 years) were scanned three times over a period of six months (mean test-retest interval of three months; final samples N=27, N=22, N=18). At each session, participants performed the same emotional faces task. At the first measurement, the presentation of emotional faces resulted in heightened activation in bilateral amygdala, bilateral lateral PFC and visual areas including the fusiform face area. Average activation did not differ across test-sessions over time, indicating that at the group level activation patterns in this network do not vary significantly over time. However, using the Intraclass Correlation Coefficient (ICC), fMRI reliability was only fair for PFC (ICC=0.41-0.59) and poor for the amygdala (ICC<0.4). These findings suggest substantial variability of brain activity over time and may have implications for studies investigating the influence of treatment effects on changes at the neural level in adolescents with psychiatric disorders. Copyright © 2012 Elsevier Ltd. All rights reserved.
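
    The reliability figures above are intraclass correlations across the three scan sessions. A dependency-free sketch of ICC(3,1) (two-way mixed effects, consistency, single measurement), one common choice for fMRI test-retest reliability; the data values below are invented for illustration:

```python
def icc_3_1(data):
    """ICC(3,1): two-way mixed-effects, consistency, single measurement.

    `data` is a list of per-subject lists, one value per session."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)      # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)      # between sessions
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# hypothetical per-subject activation estimates, 5 subjects x 3 sessions
scores = [[0.8, 0.9, 0.7],
          [0.2, 0.4, 0.3],
          [0.5, 0.6, 0.6],
          [0.9, 0.7, 0.8],
          [0.1, 0.2, 0.4]]
print(f"ICC(3,1) = {icc_3_1(scores):.2f}")
```

    By the conventions cited in the abstract, values below 0.4 would count as poor and 0.41-0.59 as fair reliability.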

  14. What the Face and Body Reveal: In-Group Emotion Effects and Stereotyping of Emotion in African American and European American Children

    ERIC Educational Resources Information Center

    Tuminello, Elizabeth R.; Davidson, Denise

    2011-01-01

    This study examined whether 3- to 7-year-old African American and European American children's assessment of emotion in face-only, face + body, and body-only photographic stimuli was affected by in-group emotion recognition effects and racial or gender stereotyping of emotion. Evidence for racial in-group effects was found, with European American…

  15. The Automaticity of Emotional Face-Context Integration

    PubMed Central

    Aviezer, Hillel; Dudarev, Veronica; Bentin, Shlomo; Hassin, Ran R.

    2011-01-01

    Recent studies have demonstrated that context can dramatically influence the recognition of basic facial expressions, yet the nature of this phenomenon is largely unknown. In the present paper we begin to characterize the underlying process of face-context integration. Specifically, we examine whether it is a relatively controlled or automatic process. In Experiment 1 participants were motivated and instructed to avoid using the context while categorizing contextualized facial expression, or they were led to believe that the context was irrelevant. Nevertheless, they were unable to disregard the context, which exerted a strong effect on their emotion recognition. In Experiment 2, participants categorized contextualized facial expressions while engaged in a concurrent working memory task. Despite the load, the context exerted a strong influence on their recognition of facial expressions. These results suggest that facial expressions and their body contexts are integrated in an unintentional, uncontrollable, and relatively effortless manner. PMID:21707150

  16. Emotional words facilitate lexical but not early visual processing.

    PubMed

    Trauer, Sophie M; Kotz, Sonja A; Müller, Matthias M

    2015-12-12

    Emotional scenes and faces have been shown to capture and bind visual resources at early sensory processing stages, i.e. in early visual cortex. However, emotional words have led to mixed results. In the current study, ERPs were assessed simultaneously with steady-state visual evoked potentials (SSVEPs) to measure attention effects on early visual activity during emotional word processing. Neutral and negative words were flickered at 12.14 Hz whilst participants performed a lexical decision task. Emotional word content did not modulate the 12.14 Hz SSVEP amplitude, and neither did word lexicality. However, emotional words affected the ERP. Negative compared to neutral words, as well as words compared to pseudowords, led to enhanced deflections in the P2 time range, indicative of lexico-semantic access. The N400 was reduced for negative compared to neutral words and enhanced for pseudowords compared to words, indicating facilitated semantic processing of emotional words. LPC amplitudes reflected word lexicality and thus the task-relevant response. In line with previous ERP and imaging evidence, the present results indicate that written emotional words are facilitated in processing only subsequent to visual analysis.
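
    Frequency tagging of this kind works because the response to a stimulus flickered at 12.14 Hz appears as a narrow spectral peak at exactly that frequency. A standard-library sketch of the core measurement, a single-bin DFT at the tagging frequency (the synthetic signal and its amplitudes are invented for illustration):

```python
import math

def amplitude_at(signal, freq, fs):
    """Amplitude of one frequency component via a single-bin DFT --
    the quantity an SSVEP analysis tracks at the tagging frequency."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return 2 * math.sqrt(re * re + im * im) / n

# synthetic 2-second "EEG" trace sampled at 512 Hz: a 12.14 Hz tagged
# response (amplitude 1.5) on top of a 10 Hz background rhythm
fs = 512
eeg = [1.5 * math.sin(2 * math.pi * 12.14 * i / fs)
       + 1.0 * math.sin(2 * math.pi * 10.0 * i / fs)
       for i in range(2 * fs)]
amp = amplitude_at(eeg, 12.14, fs)
print(f"amplitude at 12.14 Hz: {amp:.2f}")
```

    The estimate recovers the tagged component's amplitude (up to a little spectral leakage from the untagged rhythm), while probing an untagged frequency yields a value near zero.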

  17. Emotion Recognition in Face and Body Motion in Bulimia Nervosa.

    PubMed

    Dapelo, Marcela Marin; Surguladze, Simon; Morris, Robin; Tchanturia, Kate

    2017-11-01

    Social cognition has been studied extensively in anorexia nervosa (AN), but there are few studies in bulimia nervosa (BN). This study investigated the ability of people with BN to recognise emotions in ambiguous facial expressions and in body movement. Participants were 26 women with BN, who were compared with 35 with AN, and 42 healthy controls. Participants completed an emotion recognition task by using faces portraying blended emotions, along with a body emotion recognition task by using videos of point-light walkers. The results indicated that BN participants exhibited difficulties recognising disgust in less-ambiguous facial expressions, and a tendency to interpret non-angry faces as anger, compared with healthy controls. These difficulties were similar to those found in AN. There were no significant differences amongst the groups in body motion emotion recognition. The findings suggest that difficulties with disgust and anger recognition in facial expressions may be shared transdiagnostically in people with eating disorders. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  18. Recognition of face identity and emotion in expressive specific language impairment.

    PubMed

    Merkenschlager, A; Amorosa, H; Kiefl, H; Martinius, J

    2012-01-01

    We studied face and emotion recognition in children with mostly expressive specific language impairment (SLI-E). A test movie assessing the perception and recognition of faces and mimic-gestural expression was administered to 24 children diagnosed with SLI-E and an age-matched control group of normally developing children. Compared to the control group, the SLI-E children scored significantly worse on both the face and the expression recognition tasks, with a preponderant effect on emotion recognition. The performance of the SLI-E group could not be explained by reduced attention during the test session. We conclude that SLI-E is associated with a deficiency in decoding non-verbal emotional facial and gestural information, which might lead to profound and persistent problems in social interaction and development. Copyright © 2012 S. Karger AG, Basel.

  19. Implicit reward associations impact face processing: Time-resolved evidence from event-related brain potentials and pupil dilations.

    PubMed

    Hammerschmidt, Wiebke; Kagan, Igor; Kulke, Louisa; Schacht, Annekathrin

    2018-06-22

    The present study investigated whether associated motivational salience causes preferential processing of inherently neutral faces, similar to that of emotional expressions, by means of event-related brain potentials (ERPs) and changes in pupil size. To this aim, neutral faces were implicitly associated with monetary outcome while participants (N = 44) performed a masked prime face-matching task that ensured performance around chance level and thus an equal proportion of gain, loss, and zero outcomes. Motivational context strongly impacted the processing of the fixation, prime and mask stimuli prior to the target face, indicated by enhanced amplitudes of subsequent ERP components and increased pupil size. In a separate test session, previously associated faces as well as novel faces with emotional expressions were presented within the same task but without motivational context and performance feedback. Most importantly, previously gain-associated faces amplified the LPC, although the individually contingent face-outcome assignments were not made explicit during the learning session. Emotional expressions impacted the N170 and EPN components. Modulations of pupil size were absent in both the motivationally associated and emotional conditions. Our findings demonstrate that neural representations of neutral stimuli can acquire increased salience via implicit learning, with an advantage for gain over loss associations. Copyright © 2018. Published by Elsevier Inc.

  20. Detecting emotion in others: increased insula and decreased medial prefrontal cortex activation during emotion processing in elite adventure racers

    PubMed Central

    Johnson, Douglas C.; Flagan, Taru; Simmons, Alan N.; Kotturi, Sante A.; Van Orden, Karl F.; Potterat, Eric G.; Swain, Judith L.; Paulus, Martin P.

    2014-01-01

    Understanding the neural processes that characterize elite performers is a first step toward developing a neuroscience model that can be used to improve performance in stressful circumstances. Adventure racers are elite athletes who operate in small teams in the context of environmental and physical extremes. In particular, awareness of team members' emotional status is critical to the team's ability to navigate high-magnitude stressors. Thus, this functional magnetic resonance imaging (fMRI) study examined the hypothesis that adventure racers would show altered emotion processing in brain areas that are important for resilience and social awareness. Elite adventure racers (n = 10) were compared with healthy volunteers (n = 12) while performing a simple emotion face-processing (modified Hariri) task during fMRI. Across three types of emotional faces, adventure racers showed greater activation in right insula, left amygdala and dorsal anterior cingulate. Additionally, compared with healthy controls, adventure racers showed attenuated right medial prefrontal cortex activation. These results are consistent with previous studies showing that elite performers differentially activate neural substrates underlying interoception. Thus, adventure racers differentially deploy brain resources in an effort to recognize and process the internal sensations associated with emotions in others, which could be advantageous for team-based performance under stress. PMID:23171614

  1. Spatial frequency filtered images reveal differences between masked and unmasked processing of emotional information.

    PubMed

    Rohr, Michaela; Wentura, Dirk

    2014-10-01

    High and low spatial frequency information has been shown to contribute differently to the processing of emotional information. We investigated this issue further in three priming studies using spatial frequency filtered emotional face primes, emotional face targets, and an emotion categorization task. Differences emerged between brief masked, brief unmasked, and long unmasked presentation conditions. With long, unmasked prime presentation, high and low frequency primes triggered emotion-specific priming effects. With brief, masked prime presentation in Experiment 2, we found a dissociation: high frequency primes caused a valence priming effect, whereas low frequency primes yielded a differentiation between low and high arousing information within the negative domain. Brief, unmasked prime presentation in Experiment 3 revealed that subliminal processing of primes was responsible for the pattern observed in Experiment 2. The implications of these findings for theories of early emotional information processing are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Emotional Expression and Heart Rate in High-Risk Infants during the Face-To-Face/Still-Face

    PubMed Central

    Mattson, Whitney I.; Ekas, Naomi V.; Lambert, Brittany; Tronick, Ed; Lester, Barry M.; Messinger, Daniel S.

    2013-01-01

    In infants, eye constriction—the Duchenne marker—and mouth opening appear to index the intensity of both positive and negative facial expressions. We combined eye constriction and mouth opening that co-occurred with smiles and cry-faces (respectively, the prototypic expressions of infant joy and distress) to measure emotional expression intensity. Expression intensity and heart rate were measured throughout the Face-to-Face/Still-Face (FFSF) procedure in a sample of infants with prenatal cocaine exposure who were at risk for developmental difficulties. Smiles declined and cry-faces increased in the still-face episode, but the distribution of eye constriction and mouth opening in smiles and cry-faces did not differ across episodes of the FFSF. As time elapsed in the still-face episode, potential indices of intensity increased: cry-faces were more likely to be accompanied by eye constriction and mouth opening. During cry-faces there were also moderately stable individual differences in the quantity of eye constriction and mouth opening. Infant heart rate was higher during cry-faces and lower during smiles, but did not vary with intensity of expression or by episode. In sum, infants express more intense negative affect as the still-face episode progresses, but do not show clear differences in expressive intensity between episodes of the FFSF. PMID:24095807

  3. Lateralization for Processing Facial Emotions in Gay Men, Heterosexual Men, and Heterosexual Women.

    PubMed

    Rahman, Qazi; Yusuf, Sifat

    2015-07-01

    This study tested whether male sexual orientation and gender nonconformity influenced functional cerebral lateralization for the processing of facial emotions. We also tested for the effects of sex of poser and emotion displayed on putative differences. Thirty heterosexual men, 30 heterosexual women, and 40 gay men completed measures of demographic variables, recalled childhood gender nonconformity (CGN), IQ, and the Chimeric Faces Test (CFT). The CFT depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression and performance is measured using a "laterality quotient" (LQ) score. We found that heterosexual men were significantly more right-lateralized when viewing female faces compared to heterosexual women and gay men, who did not differ significantly from each other. Heterosexual women and gay men were more left-lateralized for processing female faces. There were no significant group differences in lateralization for male faces. These results remained when controlling for age and IQ scores. There was no significant effect of CGN on LQ scores. These data suggest that gay men are feminized in some aspects of functional cerebral lateralization for facial emotion. The results were discussed in relation to the selectivity of functional lateralization and putative brain mechanisms underlying sexual attraction towards opposite-sex and same-sex targets.
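
    The CFT's "laterality quotient" is a normalized asymmetry score over trial counts. A generic sketch of such a quotient (the sign convention and counts below are illustrative assumptions; the published CFT scoring may differ):

```python
def laterality_quotient(left_field_choices, right_field_choices):
    """Normalized asymmetry score in [-1, 1]. Here, positive means more
    judgments driven by the left visual field (a right-hemisphere bias);
    this sign convention is an assumption, not taken from the paper."""
    total = left_field_choices + right_field_choices
    return (left_field_choices - right_field_choices) / total

# illustrative counts for one participant across 36 chimeric-face trials
lq = laterality_quotient(left_field_choices=26, right_field_choices=10)
print(f"LQ = {lq:+.2f}")
```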

  4. Judging trustworthiness from faces: Emotion cues modulate trustworthiness judgments in young children.

    PubMed

    Caulfield, Frances; Ewing, Louise; Bank, Samantha; Rhodes, Gillian

    2016-08-01

    By adulthood, people judge trustworthiness from appearances rapidly and reliably. However, we know little about these judgments in children. This novel study investigates the developmental trajectory of explicit trust judgments from faces, and the contribution made by emotion cues across age groups. Five-, 7-, and 10-year-olds, and adults, rated the trustworthiness of trustworthy and untrustworthy faces with neutral expressions. The same participants also rated faces displaying overt happy and angry expressions, allowing us to investigate whether emotion cues modulate trustworthiness judgments similarly in children and adults. Results revealed that the ability to evaluate the trustworthiness of faces emerges in childhood, but may not be adult-like until 10 years of age. Moreover, we show that emotion cues modulate trust judgments in young children, as well as adults. Anger cues diminished the appearance of trustworthiness for participants from 5 years of age, and happy cues increased it, although this effect did not consistently emerge until later in childhood, that is, 10 years of age. These associations also extended to more subtle emotion cues present in neutral faces. Our results indicate that young children are sensitive to facial trustworthiness, and suggest that similar expression cues modulate these judgments in children and adults. © 2015 The British Psychological Society.

  5. Emotional Faces in Context: Age Differences in Recognition Accuracy and Scanning Patterns

    PubMed Central

    Noh, Soo Rim; Isaacowitz, Derek M.

    2014-01-01

    While age-related declines in facial expression recognition are well documented, previous research relied mostly on isolated faces devoid of context. We investigated the effects of context on age differences in recognition of facial emotions and in visual scanning patterns of emotional faces. While their eye movements were monitored, younger and older participants viewed facial expressions (i.e., anger, disgust) in contexts that were emotionally congruent, incongruent, or neutral with respect to the facial expression to be identified. Both age groups had the highest recognition rates of facial expressions in the congruent context, followed by the neutral context, and the lowest in the incongruent context. These context effects were more pronounced for older adults. Compared to younger adults, older adults exhibited a greater benefit from congruent contextual information, regardless of facial expression. Context also influenced the pattern of visual scanning characteristics of emotional faces in a similar manner across age groups. In addition, older adults initially attended more to context overall. Our data highlight the importance of considering the role of context in understanding emotion recognition in adulthood. PMID:23163713

  6. Effects of facial emotion recognition remediation on visual scanning of novel face stimuli.

    PubMed

    Marsh, Pamela J; Luckett, Gemma; Russell, Tamara; Coltheart, Max; Green, Melissa J

    2012-11-01

    Previous research shows that emotion recognition in schizophrenia can be improved with targeted remediation that draws attention to important facial features (eyes, nose, mouth). Moreover, the effects of training have been shown to last for up to one month after training. The aim of this study was to investigate whether improved emotion recognition of novel faces is associated with concomitant changes in visual scanning of these same novel facial expressions. Thirty-nine participants with schizophrenia received emotion recognition training using Ekman's Micro-Expression Training Tool (METT), with emotion recognition and visual scanpath (VSP) recordings to face stimuli collected simultaneously. Baseline ratings of interpersonal and cognitive functioning were also collected from all participants. Post-METT training, participants showed changes in foveal attention to the features of facial expressions of emotion not used in METT training, which were generally consistent with the information about important features from the METT. In particular, there were changes in how participants looked at the features of surprised, disgusted, fearful, happy, and neutral facial expressions, demonstrating that improved emotion recognition is paralleled by changes in the way participants with schizophrenia viewed novel facial expressions of emotion. However, there were overall decreases in foveal attention to sad and neutral faces, indicating that more intensive instruction might be needed for these faces during training. Most importantly, the evidence shows that participant gender may affect training outcomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. The ties to unbind: age-related differences in feature (un)binding in working memory for emotional faces

    PubMed Central

    Pehlivanoglu, Didem; Jain, Shivangi; Ariel, Robert; Verhaeghen, Paul

    2014-01-01

    In the present study, we investigated age-related differences in the processing of emotional stimuli. Specifically, we were interested in whether older adults would show deficits in unbinding emotional expression (i.e., either no emotion, happiness, anger, or disgust) from bound stimuli (i.e., photographs of faces expressing these emotions), as a hyper-binding account of age-related differences in working memory would predict. Younger and older adults completed different N-Back tasks (side-by-side 0-Back, 1-Back, 2-Back) under three conditions: match/mismatch judgments based on either the identity of the face (identity condition), the face’s emotional expression (expression condition), or both identity and expression of the face (both condition). The two age groups performed more slowly and with lower accuracy in the expression condition than in the both condition, indicating the presence of an unbinding process. This unbinding effect was more pronounced in older adults than in younger adults, but only in the 2-Back task. Thus, older adults seemed to have a specific deficit in unbinding in working memory. Additionally, no age-related differences were found in accuracy in the 0-Back task, but such differences emerged in the 1-Back task, and were further magnified in the 2-Back task, indicating independent age-related differences in attention/STM and working memory. Pupil dilation data confirmed that the attention/STM version of the task (1-Back) is more effortful for older adults than younger adults. PMID:24795660
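
    The three judgment conditions can be thought of as the same N-back sequence scored under different match keys. The sketch below is purely illustrative (the trial encoding and names are hypothetical, not taken from the study):

```python
from typing import Callable, List, Tuple

# Each trial is a face encoded as (identity, expression) — an illustrative encoding.
Trial = Tuple[str, str]

def nback_targets(trials: List[Trial], n: int,
                  key: Callable[[Trial], object]) -> List[bool]:
    """For each trial from index n onward, report whether it matches the trial
    n positions back on the dimension(s) extracted by `key`."""
    return [key(trials[i]) == key(trials[i - n]) for i in range(n, len(trials))]

trials = [("anna", "happy"), ("ben", "angry"),
          ("anna", "angry"), ("ben", "angry")]

# 2-Back scored under the three conditions:
identity_2back   = nback_targets(trials, 2, key=lambda t: t[0])  # identity only
expression_2back = nback_targets(trials, 2, key=lambda t: t[1])  # expression only
both_2back       = nback_targets(trials, 2, key=lambda t: t)     # identity AND expression
```

    The expression condition requires unbinding the emotional expression from the bound face stimulus before the comparison, which is the operation the study argues is selectively harder for older adults at higher loads.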

  8. [Emotional intelligence and oscillatory responses on the emotional facial expressions].

    PubMed

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This was evident both in their subjective evaluation of the stimuli and in a stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high emotional intelligence subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively enhance positive emotions and reduce negative emotions.

  9. Variation in White Matter Connectivity Predicts the Ability to Remember Faces and Discriminate Their Emotions

    PubMed Central

    Unger, Ashley; Alm, Kylie H.; Collins, Jessica A.; O’Leary, Jacqueline M.; Olson, Ingrid R.

    2017-01-01

    Objective The extended face network contains clusters of neurons that perform distinct functions on facial stimuli. Regions in the posterior ventral visual stream appear to perform basic perceptual functions on faces, while more anterior regions, such as the ventral anterior temporal lobe and amygdala, function to link mnemonic and affective information to faces. Anterior and posterior regions are interconnected by long-range white matter tracts; however, it is not known whether variation in the connectivity of these pathways explains cognitive performance. Methods Here, we used diffusion imaging and deterministic tractography in a cohort of 28 neurologically normal adults aged 18–28 to examine microstructural properties of visual fiber pathways and their relationship to certain mnemonic and affective functions involved in face processing. We investigated how inter-individual variability in two tracts, the inferior longitudinal fasciculus (ILF) and the inferior fronto-occipital fasciculus (IFOF), related to performance on tests of facial emotion recognition and face memory. Results Results revealed that microstructure of both tracts predicted variability in behavioral performance indexed by both tasks, suggesting that the ILF and IFOF play a role in facilitating our ability to discriminate emotional expressions in faces, as well as to remember unique faces. Variation in a control tract, the uncinate fasciculus, did not predict performance on these tasks. Conclusions These results corroborate and extend the findings of previous neuropsychology studies investigating the effects of damage to the ILF and IFOF, and demonstrate that differences in face processing abilities are related to white matter microstructure, even in healthy individuals. PMID:26888615

  10. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

    Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between building spaces and residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main question arises from the subjective quality of the emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions have been developed in musical psychology. This work focuses on the identification of human emotions evoked by musical pieces, using human face tracking and optical flow analysis. The facial feature tracking algorithm used for facial feature speed and position estimation is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique proved to give robust identification of human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical backgrounds or mood-dependent radio.
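
    The paper's LBP-based tracker is not reproduced here, but the per-feature speed estimation it feeds rests on optical flow. Below is a minimal NumPy sketch of the classic Lucas-Kanade least-squares step at a single feature point, checked against a synthetic one-pixel translation (an illustrative sketch, not the authors' implementation):

```python
import numpy as np

def lucas_kanade_point(prev, curr, x, y, win=15):
    """Estimate flow (vx, vy) at one point between two grayscale frames.

    Solves the 2x2 Lucas-Kanade normal equations over a win x win window:
        [sum Ix^2   sum IxIy] [vx]   [sum IxIt]
        [sum IxIy   sum Iy^2] [vy] = -[sum IyIt]
    """
    # Spatial gradients via central differences; temporal gradient by frame diff.
    Ix = (np.roll(prev, -1, axis=1) - np.roll(prev, 1, axis=1)) / 2.0
    Iy = (np.roll(prev, -1, axis=0) - np.roll(prev, 1, axis=0)) / 2.0
    It = curr - prev
    h = win // 2
    w = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    ix, iy, it = Ix[w].ravel(), Iy[w].ravel(), It[w].ravel()
    A = np.array([[ix @ ix, ix @ iy],
                  [ix @ iy, iy @ iy]])
    b = -np.array([ix @ it, iy @ it])
    return np.linalg.solve(A, b)

# Synthetic check: a smooth blob translated one pixel to the right.
xx, yy = np.meshgrid(np.arange(64), np.arange(64))
frame1 = np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / 50.0)
frame2 = np.exp(-((xx - 33) ** 2 + (yy - 32) ** 2) / 50.0)
vx, vy = lucas_kanade_point(frame1, frame2, 32, 32)  # vx close to 1, vy close to 0
```

    A production tracker (e.g., pyramidal Lucas-Kanade as in OpenCV's `calcOpticalFlowPyrLK`) iterates and coarsens this step to handle large motions; the speeds of tracked facial features then form the emotion feature vector described in the abstract.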

  11. Right hemisphere or valence hypothesis, or both? The processing of hybrid faces in the intact and callosotomized brain.

    PubMed

    Prete, Giulia; Laeng, Bruno; Fabri, Mara; Foschi, Nicoletta; Tommasi, Luca

    2015-02-01

    Both the valence hypothesis and the right-hemisphere hypothesis of emotion processing have received support. To better disentangle the two accounts, we carried out two studies, presenting healthy participants and an anterior callosotomized patient with 'hybrid faces', stimuli created by superimposing the low spatial frequencies of an emotional face onto the high spatial frequencies of the same face with a neutral expression. In both studies we asked participants to judge the friendliness level of the stimuli, which is an indirect measure of the processing of emotional information, despite this remaining "invisible". In Experiment 1 we presented hybrid faces in a divided visual field paradigm using different tachistoscopic presentation times; in Experiment 2 we presented hybrid chimeric faces in canonical view and upside-down. In Experiments 3 and 4 we tested a callosotomized patient, with spared splenium, in paradigms similar to those used in Experiments 1 and 2. Results from Experiments 1 and 3 were consistent with the valence hypothesis, whereas results of Experiments 2 and 4 were consistent with the right-hemisphere hypothesis. This study confirms that the low spatial frequencies of emotional faces influence the social judgments of observers, even when seen for 28 ms (Experiment 1), possibly by means of configural analysis (Experiment 2). The possible roles of the cortical and subcortical emotional routes in these tasks are discussed in the light of the results obtained in the callosotomized patient. We propose that the right-hemisphere and valence accounts are not mutually exclusive, at least in the case of subliminal emotion processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Developmental Changes in the Relationship Between the Infant's Attention and Emotion During Early Face-to-Face Communication: The 2-Month Transition

    ERIC Educational Resources Information Center

    Lavelli, Manuela; Fogel, Alan

    2005-01-01

    Weekly observations documented developmental changes in mother-infant face-to-face communication between birth and 3 months. Developmental trajectories for each dyad of the duration of infant facial expressions showed a change from the dominance of Simple Attention (without other emotion expressions) to active and emotionally positive forms of…

  13. Differential Interactions between Identity and Emotional Expression in Own and Other-Race Faces: Effects of Familiarity Revealed through Redundancy Gains

    ERIC Educational Resources Information Center

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…

  14. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    PubMed

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs showed that happy eyes elicited a larger P1 than neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited dramatically larger C1 responses than those in the neutral context, reflecting modulation by predictions during the earliest stages of face processing. N170 amplitudes were larger in the neutral and fearful eye contexts than in the happy context, suggesting faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as predictions and attention can modulate face processing from the earliest stages in a top-down manner. © 2017 Society for Psychophysiological Research.

  15. Increased amygdala responses to emotional faces after psilocybin for treatment-resistant depression.

    PubMed

    Roseman, Leor; Demetriou, Lysia; Wall, Matthew B; Nutt, David J; Carhart-Harris, Robin L

    2017-12-27

    Recent evidence indicates that psilocybin with psychological support may be effective for treating depression. Some studies have found that patients with depression show heightened amygdala responses to fearful faces, and there is reliable evidence that treatment with SSRIs attenuates amygdala responses (Ma, 2015). We hypothesised that amygdala responses to emotional faces would be altered post-treatment with psilocybin. In this open-label study, 20 individuals diagnosed with moderate to severe, treatment-resistant depression underwent two separate dosing sessions with psilocybin. Psychological support was provided before, during and after these sessions, and 19 of the participants completed fMRI scans one week prior to the first session and one day after the second and final session. Neutral, fearful and happy faces were presented in the scanner and analyses focused on the amygdala. Group results revealed rapid and enduring improvements in depressive symptoms post psilocybin. Increased responses to fearful and happy faces were observed in the right amygdala post-treatment, and right amygdala increases to fearful versus neutral faces were predictive of clinical improvements at 1-week. Psilocybin with psychological support was associated with increased amygdala responses to emotional stimuli, an opposite effect to previous findings with SSRIs. This suggests fundamental differences in these treatments' therapeutic actions, with SSRIs mitigating negative emotions and psilocybin allowing patients to confront and work through them. Based on the present results, we propose that psilocybin with psychological support is a treatment approach that potentially revives emotional responsiveness in depression, enabling patients to reconnect with their emotions. ISRCTN, number ISRCTN14426797. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Emotional expression and heart rate in high-risk infants during the face-to-face/still-face.

    PubMed

    Mattson, Whitney I; Ekas, Naomi V; Lambert, Brittany; Tronick, Ed; Lester, Barry M; Messinger, Daniel S

    2013-12-01

    In infants, eye constriction (the Duchenne marker) and mouth opening appear to index the intensity of both positive and negative facial expressions. We combined eye constriction and mouth opening that co-occurred with smiles and cry-faces (respectively, the prototypic expressions of infant joy and distress) to measure emotional expression intensity. Expression intensity and heart rate were measured throughout the face-to-face/still-face (FFSF) in a sample of infants with prenatal cocaine exposure who were at risk for developmental difficulties. Smiles declined and cry-faces increased in the still-face episode, but the distribution of eye constriction and mouth opening in smiles and cry-faces did not differ across episodes of the FFSF. As time elapsed in the still-face episode, potential indices of intensity increased: cry-faces were more likely to be accompanied by eye constriction and mouth opening. During cry-faces there were also moderately stable individual differences in the quantity of eye constriction and mouth opening. Infant heart rate was higher during cry-faces and lower during smiles, but did not vary with intensity of expression or by episode. In sum, infants express more intense negative affect as the still-face progresses, but do not show clear differences in expressive intensity between episodes of the FFSF. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Altered medial prefrontal activity during dynamic face processing in schizophrenia spectrum patients.

    PubMed

    Mothersill, Omar; Morris, Derek W; Kelly, Sinead; Rose, Emma Jane; Bokde, Arun; Reilly, Richard; Gill, Michael; Corvin, Aiden P; Donohoe, Gary

    2014-08-01

    Processing the emotional content of faces is recognised as a key deficit of schizophrenia, associated with poorer functional outcomes and possibly contributing to the severity of clinical symptoms such as paranoia. At the neural level, fMRI studies have reported altered limbic activity in response to facial stimuli. However, previous studies may be limited by the use of cognitively demanding tasks and static facial stimuli. To address these issues, the current study used a face processing task involving both passive face viewing and dynamic social stimuli. Such a task may (1) lack the potentially confounding effects of high cognitive demands and (2) show higher ecological validity. Functional MRI was used to examine neural activity in 25 patients with a DSM-IV diagnosis of schizophrenia/schizoaffective disorder and 21 age- and gender-matched healthy controls while they participated in a face processing task, which involved viewing videos of angry and neutral facial expressions, and a non-biological baseline condition. While viewing faces, patients showed significantly weaker deactivation of the medial prefrontal cortex, including the anterior cingulate, and decreased activation in the left cerebellum, compared to controls. Patients also showed weaker medial prefrontal deactivation while viewing the angry faces relative to baseline. Given that the anterior cingulate plays a role in processing negative emotion, weaker deactivation of this region in patients while viewing faces may contribute to an increased perception of social threat. Future studies examining the neurobiology of social cognition in schizophrenia using fMRI may help establish targets for treatment interventions. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Do bodily expressions compete with facial expressions? Time course of integration of emotional signals from the face and the body.

    PubMed

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes.

  19. Amygdala habituation to emotional faces in adolescents with internalizing disorders, adolescents with childhood sexual abuse related PTSD and healthy adolescents.

    PubMed

    van den Bulk, Bianca G; Somerville, Leah H; van Hoof, Marie-José; van Lang, Natasja D J; van der Wee, Nic J A; Crone, Eveline A; Vermeiren, Robert R J M

    2016-10-01

    Adolescents with internalizing disorders and adolescents with childhood sexual abuse related post-traumatic stress disorder (CSA-related PTSD) show a large overlap in symptomatology. In addition, brain research indicated hyper-responsiveness and sustained activation, instead of habituation, of amygdala activation to emotional faces in both groups. Little is known, however, about whether the same patterns of amygdala habituation are present in these two groups. The current study examined habituation patterns of amygdala activity to emotional faces (fearful, happy and neutral) in adolescents with a DSM-IV depressive and/or anxiety disorder (N=25), adolescents with CSA-related PTSD (N=19) and healthy controls (N=26). Behaviourally, the adolescents from the internalizing and CSA-related PTSD groups reported more anxiety to fearful and neutral faces than adolescents from the control group, and adolescents from the CSA-related PTSD group reacted more slowly than the internalizing group. At the whole brain level, there was a significant interaction between time and group within the left amygdala. Follow-up ROI analysis showed elevated initial activity in the amygdala and rapid habituation in the CSA-related PTSD group compared to the internalizing group. These findings suggest that habituation patterns of amygdala activation provide additional information on problems with emotional face processing. Furthermore, the results suggest there are differences in the underlying neurobiological mechanisms related to emotional face processing for adolescents with internalizing disorders and adolescents with CSA-related PTSD. Possibly CSA-related PTSD is characterized by a stronger primary emotional response driven by the amygdala. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Do Bodily Expressions Compete with Facial Expressions? Time Course of Integration of Emotional Signals from the Face and the Body

    PubMed Central

    Gu, Yuanyuan; Mai, Xiaoqin; Luo, Yue-jia

    2013-01-01

    The decoding of social signals from nonverbal cues plays a vital role in the social interactions of socially gregarious animals such as humans. Because nonverbal emotional signals from the face and body are normally seen together, it is important to investigate the mechanism underlying the integration of emotional signals from these two sources. We conducted a study in which the time course of the integration of facial and bodily expressions was examined via analysis of event-related potentials (ERPs) while the focus of attention was manipulated. Distinctive integrating features were found during multiple stages of processing. In the first stage, threatening information from the body was extracted automatically and rapidly, as evidenced by enhanced P1 amplitudes when the subjects viewed compound face-body images with fearful bodies compared with happy bodies. In the second stage, incongruency between emotional information from the face and the body was detected and captured by N2. Incongruent compound images elicited larger N2s than did congruent compound images. The focus of attention modulated the third stage of integration. When the subjects' attention was focused on the face, images with congruent emotional signals elicited larger P3s than did images with incongruent signals, suggesting more sustained attention and elaboration of congruent emotional information extracted from the face and body. On the other hand, when the subjects' attention was focused on the body, images with fearful bodies elicited larger P3s than did images with happy bodies, indicating more sustained attention and elaboration of threatening information from the body during evaluative processes. PMID:23935825

  1. Unconscious processing of facial attractiveness: invisible attractive faces orient visual attention.

    PubMed

    Hung, Shao-Min; Nieh, Chih-Hsuan; Hsieh, Po-Jang

    2016-11-16

    Past research has demonstrated humans' extraordinary ability to extract information from a face in the blink of an eye, including its emotion, gaze direction, and attractiveness. However, it remains elusive whether facial attractiveness can be processed and influence our behaviors in the complete absence of conscious awareness. Here we demonstrate unconscious processing of facial attractiveness with three distinct approaches. In Experiment 1, the time taken for faces to break interocular suppression was measured. The results showed that attractive faces enjoyed the privilege of breaking suppression and reaching consciousness earlier. In Experiment 2, we further showed that attractive faces had lower visibility thresholds, again suggesting that facial attractiveness could be processed more easily to reach consciousness. Crucially, in Experiment 3, a significant decrease in accuracy on an orientation discrimination task following an invisible attractive face showed that attractive faces, albeit suppressed and invisible, still exerted an effect by orienting attention. Taken together, for the first time, we show that facial attractiveness can be processed in the complete absence of consciousness, and that an unconscious attractive face is still capable of directing our attention.

  2. The effects of mothers' past infant-holding preferences on their adult children's face processing lateralisation.

    PubMed

    Vervloed, Mathijs P J; Hendriks, Angélique W; van den Eijnde, Esther

    2011-04-01

    Face processing development is negatively affected when infants have not been exposed to faces for some time because of congenital cataract blocking all vision (Le Grand, Mondloch, Maurer, & Brent, 2001). It is not clear, however, whether more subtle differences in face exposure may also have an influence. The present study looked at the effect of the mother's preferred side of holding an infant on her adult child's face processing lateralisation. Adults with a mother who had a left-arm preference for holding infants were compared with adults with a mother who had a right-arm holding preference. All participants were right-handed and had been exclusively bottle-fed during infancy. The participants were presented with two chimeric faces tests, one involving emotion and the other gender. The left-arm held individuals showed a normal left-bias on the chimeric face tests, whereas the right-arm held individuals showed a significantly decreased left-bias. The results might suggest that reduced exposure to high quality emotional information on faces in infancy results in diminished right-hemisphere lateralisation for face processing. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    PubMed

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether the classification of dynamic facial expressions as happy or disgusted, and the emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces, even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions may be present versus absent.

  4. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits

    PubMed Central

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body point-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better from static faces and fear from PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals. PMID:26557101

  5. Emotion recognition through static faces and moving bodies: a comparison between typically developed adults and individuals with high level of autistic traits.

    PubMed

    Actis-Grosso, Rossana; Bossi, Francesco; Ricciardelli, Paola

    2015-01-01

    We investigated whether the type of stimulus (pictures of static faces vs. body motion) contributes differently to the recognition of emotions. The performance (accuracy and response times) of 25 Low Autistic Traits (LAT group) young adults (21 males) and 20 young adults (16 males) with either High Autistic Traits or with High Functioning Autism Spectrum Disorder (HAT group) was compared in the recognition of four emotions (Happiness, Anger, Fear, and Sadness) either shown in static faces or conveyed by moving body point-light displays (PLDs). Overall, HAT individuals were as accurate as LAT ones in perceiving emotions both with faces and with PLDs. Moreover, they correctly described non-emotional actions depicted by PLDs, indicating that they perceived the motion conveyed by the PLDs per se. For LAT participants, happiness proved to be the easiest emotion to be recognized: in line with previous studies we found a happy face advantage for faces, which for the first time was also found for bodies (happy body advantage). Furthermore, LAT participants recognized sadness better from static faces and fear from PLDs. This advantage for motion kinematics in the recognition of fear was not present in HAT participants, suggesting that (i) emotion recognition is not generally impaired in HAT individuals, (ii) the cues exploited for emotion recognition by LAT and HAT groups are not always the same. These findings are discussed against the background of emotional processing in typically and atypically developed individuals.

  6. Neural Systems Underlying Emotional and Non-emotional Interference Processing: An ALE Meta-Analysis of Functional Neuroimaging Studies

    PubMed Central

    Xu, Min; Xu, Guiping; Yang, Yang

    2016-01-01

    Understanding how the nature of interference might influence the recruitments of the neural systems is considered as the key to understanding cognitive control. Although, interference processing in the emotional domain has recently attracted great interest, the question of whether there are separable neural patterns for emotional and non-emotional interference processing remains open. Here, we performed an activation likelihood estimation meta-analysis of 78 neuroimaging experiments, and examined common and distinct neural systems for emotional and non-emotional interference processing. We examined brain activation in three domains of interference processing: emotional verbal interference in the face-word conflict task, non-emotional verbal interference in the color-word Stroop task, and non-emotional spatial interference in the Simon, SRC and Flanker tasks. Our results show that the dorsal anterior cingulate cortex (ACC) was recruited for both emotional and non-emotional interference. In addition, the right anterior insula, presupplementary motor area (pre-SMA), and right inferior frontal gyrus (IFG) were activated by interference processing across both emotional and non-emotional domains. In light of these results, we propose that the anterior insular cortex may serve to integrate information from different dimensions and work together with the dorsal ACC to detect and monitor conflicts, whereas pre-SMA and right IFG may be recruited to inhibit inappropriate responses. In contrast, the dorsolateral prefrontal cortex (DLPFC) and posterior parietal cortex (PPC) showed different degrees of activation and distinct lateralization patterns for different processing domains, which suggests that these regions may implement cognitive control based on the specific task requirements. PMID:27895564

  7. Face and body perception in schizophrenia: a configural processing deficit?

    PubMed

    Soria Bauser, Denise; Thoma, Patrizia; Aizenberg, Victoria; Brüne, Martin; Juckel, Georg; Daum, Irene

    2012-01-30

    Face and body perception rely on common processing mechanisms and activate similar but not identical brain networks. Patients with schizophrenia show impaired face perception, and the present study addressed for the first time body perception in this group. Seventeen patients diagnosed with schizophrenia or schizoaffective disorder were compared to 17 healthy controls on standardized tests assessing basic face perception skills (identity discrimination, memory for faces, recognition of facial affect). A matching-to-sample task including emotional and neutral faces, bodies and cars either in an upright or in an inverted position was administered to assess potential category-specific performance deficits and impairments of configural processing. Relative to healthy controls, schizophrenia patients showed poorer performance on the tasks assessing face perception skills. In the matching-to-sample task, they also responded more slowly and less accurately than controls, regardless of the stimulus category. Accuracy analysis showed significant inversion effects for faces and bodies across groups, reflecting configural processing mechanisms; however reaction time analysis indicated evidence of reduced inversion effects regardless of category in schizophrenia patients. The magnitude of the inversion effects was not related to clinical symptoms. Overall, the data point towards reduced configural processing, not only for faces but also for bodies and cars in individuals with schizophrenia. © 2011 Elsevier Ltd. All rights reserved.

  8. Implicit Processing of Visual Emotions Is Affected by Sound-Induced Affective States and Individual Affective Traits

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira

    2014-01-01

    The ability to recognize emotions contained in facial expressions are affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. 32 healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by a) a therapeutic music sequence (MusiCure), b) a noise sequence or c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper and pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates the implicit emotional processing in anxious individuals. PMID:25072162

  9. Diagnostic Features of Emotional Expressions Are Processed Preferentially

    PubMed Central

    Scheller, Elisa; Büchel, Christian; Gamer, Matthias

    2012-01-01

    Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of

  10. Diagnostic features of emotional expressions are processed preferentially.

    PubMed

    Scheller, Elisa; Büchel, Christian; Gamer, Matthias

    2012-01-01

    Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of

  11. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing

    PubMed Central

    Hsu, Chun-Wei; Goh, Joshua O. S.

    2016-01-01

    When comparing the values of different choices, human beings can rely either on more cognitive processes, such as mathematical computation, or on more affective processes, such as emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes. PMID:27375466

  12. Distinct and Overlapping Brain Areas Engaged during Value-Based, Mathematical, and Emotional Decision Processing.

    PubMed

    Hsu, Chun-Wei; Goh, Joshua O S

    2016-01-01

    When comparing the values of different choices, human beings can rely either on more cognitive processes, such as mathematical computation, or on more affective processes, such as emotion. However, the neural correlates of how these two types of processes operate during value-based decision-making remain unclear. In this study, we investigated the extent to which neural regions engaged during value-based decision-making overlap with those engaged during mathematical and emotional processing in a within-subject manner. In a functional magnetic resonance imaging experiment, participants viewed stimuli that always consisted of numbers and emotional faces that depicted two choices. Across tasks, participants decided between the two choices based on the expected value of the numbers, a mathematical result of the numbers, or the emotional face stimuli. We found that all three tasks commonly involved various cortical areas including frontal, parietal, motor, somatosensory, and visual regions. Critically, the mathematical task shared common areas with the value but not emotion task in bilateral striatum. Although the emotion task overlapped with the value task in parietal, motor, and sensory areas, the mathematical task also evoked responses in other areas within these same cortical structures. Minimal areas were uniquely engaged for the value task apart from the other two tasks. The emotion task elicited a more expansive area of neural activity whereas value and mathematical task responses were in more focal regions. Whole-brain spatial correlation analysis showed that valuative processing engaged functional brain responses more similarly to mathematical processing than emotional processing. While decisions on expected value entail both mathematical and emotional processing regions, mathematical processes have a more prominent contribution particularly in subcortical processes.

  13. Retention of identity versus expression of emotional faces differs in the recruitment of limbic areas.

    PubMed

    Röder, Christian H; Mohr, Harald; Linden, David E J

    2011-02-01

    Faces are multidimensional stimuli that convey information for complex social and emotional functions. Separate neural systems have been implicated in the recognition of facial identity (mainly extrastriate visual cortex) and emotional expression (limbic areas and the superior temporal sulcus). Working-memory (WM) studies with faces have shown different but partly overlapping activation patterns in comparison to spatial WM in parietal and prefrontal areas. However, little is known about the neural representations of the different facial dimensions during WM. In the present study 22 subjects performed a face-identity or face-emotion WM task at different load levels during functional magnetic resonance imaging. We found a fronto-parietal-visual WM-network for both tasks during maintenance, including fusiform gyrus. Limbic areas in the amygdala and parahippocampal gyrus demonstrated a stronger activation for the identity than the emotion condition. One explanation for this finding is that the repetitive presentation of faces with different identities but the same emotional expression during the identity-task is responsible for the stronger increase in BOLD signal in the amygdala. These results raise the question how different emotional expressions are coded in WM. Our findings suggest that emotional expressions are re-coded in an abstract representation that is supported at the neural level by the canonical fronto-parietal WM network. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Sad benefit in face working memory: an emotional bias of melancholic depression.

    PubMed

    Linden, Stefanie C; Jackson, Margaret C; Subramanian, Leena; Healy, David; Linden, David E J

    2011-12-01

    Emotion biases feature prominently in cognitive theories of depression and are a focus of psychological interventions. However, there is presently no stable neurocognitive marker of altered emotion-cognition interactions in depression. One reason may be the heterogeneity of major depressive disorder. Our aim in the present study was to find an emotional bias that differentiates patients with melancholic depression from controls, and patients with melancholic from those with non-melancholic depression. We used a working memory paradigm for emotional faces, where two faces with angry, happy, neutral, sad or fearful expression had to be retained over one second. Twenty patients with melancholic depression, 20 age-, education- and gender-matched control participants and 20 patients with non-melancholic depression participated in the study. We analysed performance on the working memory task using signal detection measures. We found an interaction between group and emotion on working memory performance that was driven by the higher performance for sad faces compared to other categories in the melancholic group. We computed a measure of "sad benefit", which distinguished melancholic and non-melancholic patients with good sensitivity and specificity. However, replication studies and formal discriminant analysis will be needed in order to assess whether emotion bias in working memory may become a useful diagnostic tool to distinguish these two syndromes. Copyright © 2011 Elsevier B.V. All rights reserved.
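    The record above analyses working-memory performance using signal detection measures. As an illustrative sketch only (not necessarily the authors' exact computation), the most common such measure is the sensitivity index d′, the difference between the z-transformed hit and false-alarm rates, computable with the Python standard library:

    ```python
    from statistics import NormalDist

    def d_prime(hit_rate, fa_rate):
        """Sensitivity index d' = z(hit rate) - z(false-alarm rate).

        Rates must lie strictly between 0 and 1; in practice, extreme
        rates are usually corrected (e.g. 1/(2N) adjustment) first.
        """
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF
        return z(hit_rate) - z(fa_rate)
    ```

    For example, a hit rate of 0.8 with a false-alarm rate of 0.2 gives d′ of about 1.68, while chance performance (equal hit and false-alarm rates) gives d′ of 0.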

  15. Neural correlates of emotional intelligence in a visual emotional oddball task: an ERP study.

    PubMed

    Raz, Sivan; Dan, Orrie; Zysberg, Leehu

    2014-11-01

    The present study was aimed at identifying potential behavioral and neural correlates of Emotional Intelligence (EI) by using scalp-recorded Event-Related Potentials (ERPs). EI levels were defined according to both self-report questionnaire and a performance-based ability test. We identified ERP correlates of emotional processing by using a visual-emotional oddball paradigm, in which subjects were confronted with one frequent standard stimulus (a neutral face) and two deviant stimuli (a happy and an angry face). The effects of these faces were then compared across groups with low and high EI levels. The ERP results indicate that participants with high EI exhibited significantly greater mean amplitudes of the P1, P2, N2, and P3 ERP components in response to emotional and neutral faces, at frontal, posterior-parietal and occipital scalp locations. P1, P2 and N2 are considered indexes of attention-related processes and have been associated with early attention to emotional stimuli. The later P3 component has been thought to reflect more elaborative, top-down, emotional information processing including emotional evaluation and memory encoding and formation. These results may suggest greater recruitment of resources to process all emotional and non-emotional faces at early and late processing stages among individuals with higher EI. The present study underscores the usefulness of ERP methodology as a sensitive measure for the study of emotional stimuli processing in the research field of EI. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Perceptual Grouping, Not Emotion, Accounts for Search Asymmetries with Schematic Faces

    ERIC Educational Resources Information Center

    Becker, Stefanie I.; Horstmann, Gernot; Remington, Roger W.

    2011-01-01

    Several different explanations have been proposed to account for the search asymmetry (SA) for angry schematic faces (i.e., the fact that an angry face target among friendly faces can be found faster than vice versa). The present study critically tested the perceptual grouping account, (a) that the SA is not due to emotional factors, but to…

  17. "We all look the same to me": positive emotions eliminate the own-race bias in face recognition.

    PubMed

    Johnson, Kareem J; Fredrickson, Barbara L

    2005-11-01

    Extrapolating from the broaden-and-build theory, we hypothesized that positive emotion may reduce the own-race bias in facial recognition. In Experiments 1 and 2, Caucasian participants (N = 89) viewed Black and White faces for a recognition task. They viewed videos eliciting joy, fear, or neutrality before the learning (Experiment 1) or testing (Experiment 2) stages of the task. Results reliably supported the hypothesis. Relative to fear or a neutral state, joy experienced before either stage improved recognition of Black faces and significantly reduced the own-race bias. Discussion centers on possible mechanisms for this reduction of the own-race bias, including improvements in holistic processing and promotion of a common in-group identity due to positive emotions.

  18. Behavioral assessment of emotional and motivational appraisal during visual processing of emotional scenes depending on spatial frequencies.

    PubMed

    Fradcourt, B; Peyrin, C; Baciu, M; Campagne, A

    2013-10-01

    Previous studies performed on visual processing of emotional stimuli have revealed preference for a specific type of visual spatial frequencies (high spatial frequency, HSF; low spatial frequency, LSF) according to task demands. The majority of studies used a face and focused on the appraisal of the emotional state of others. The present behavioral study investigates the relative role of spatial frequencies in processing emotional natural scenes during two explicit cognitive appraisal tasks, one emotional, based on the self-emotional experience, and one motivational, based on the tendency to action. Our results suggest that HSF information was the most relevant to rapidly identify the self-emotional experience (unpleasant, pleasant, and neutral) while LSF was required to rapidly identify the tendency to action (avoidance, approach, and no action). The tendency to action based on LSF analysis showed a priority for unpleasant stimuli whereas the identification of emotional experience based on HSF analysis showed a priority for pleasant stimuli. The present study confirms the value of considering both the emotional and the motivational characteristics of visual stimuli. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Abnormal GABAergic function and face processing in schizophrenia: A pharmacologic-fMRI study.

    PubMed

    Tso, Ivy F; Fang, Yu; Phan, K Luan; Welsh, Robert C; Taylor, Stephan F

    2015-10-01

    The involvement of the gamma-aminobutyric acid (GABA) system in schizophrenia is suggested by postmortem studies and the common use of GABA receptor-potentiating agents in treatment. In a recent study, we used a benzodiazepine challenge to demonstrate abnormal GABAergic function during processing of negative visual stimuli in schizophrenia. This study extended this investigation by mapping GABAergic mechanisms associated with face processing and social appraisal in schizophrenia using a benzodiazepine challenge. Fourteen stable, medicated schizophrenia/schizoaffective patients (SZ) and 13 healthy controls (HC) underwent functional MRI using the blood oxygenation level-dependent (BOLD) technique while they performed the Socio-emotional Preference Task (SePT) on emotional face stimuli ("Do you like this face?"). Participants received single-blinded intravenous saline and lorazepam (LRZ) in two separate sessions separated by 1-3 weeks. Both SZ and HC recruited medial prefrontal cortex/anterior cingulate during the SePT, relative to gender identification. A significant drug by group interaction was observed in the medial occipital cortex, such that SZ showed increased BOLD signal to LRZ challenge, while HC showed an expected decrease of signal; the interaction did not vary by task. The altered BOLD response to LRZ challenge in SZ was significantly correlated with increased negative affect across multiple measures. The altered response to LRZ challenge suggests that abnormal face processing and negative affect in SZ are associated with altered GABAergic function in the visual cortex, underscoring the role of impaired visual processing in socio-emotional deficits in schizophrenia. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Adolescents' ability to read different emotional faces relates to their history of maltreatment and type of psychopathology.

    PubMed

    Leist, Tatyana; Dadds, Mark R

    2009-04-01

    Emotional processing styles appear to characterize various forms of psychopathology and environmental adversity in children. For example, autistic, anxious, high- and low-emotion conduct problem children, and children who have been maltreated, all appear to show specific deficits and strengths in recognizing the facial expressions of emotions. Until now, the relationships between emotion recognition, antisocial behaviour, emotional problems, callous-unemotional (CU) traits and early maltreatment have never been assessed simultaneously in one study, and the specific associations of emotion recognition to maltreatment and child characteristics are therefore unknown. We examined facial-emotion processing in a sample of 23 adolescents selected for high-risk status on the variables of interest. As expected, maltreatment and child characteristics showed unique associations. CU traits were uniquely related to impairments in fear recognition. Antisocial behaviour was uniquely associated with better fear recognition, but impaired anger recognition. Emotional problems were associated with better recognition of anger and sadness, but lower recognition of neutral faces. Maltreatment was predictive of superior recognition of fear and sadness. The findings are considered in terms of social information-processing theories of psychopathology. Implications for clinical interventions are discussed.

  1. Unconscious processing of facial attractiveness: invisible attractive faces orient visual attention

    PubMed Central

    Hung, Shao-Min; Nieh, Chih-Hsuan; Hsieh, Po-Jang

    2016-01-01

    Past research has proven humans’ extraordinary ability to extract information from a face in the blink of an eye, including its emotion, gaze direction, and attractiveness. However, it remains elusive whether facial attractiveness can be processed, and can influence our behaviors, in the complete absence of conscious awareness. Here we demonstrate unconscious processing of facial attractiveness with three distinct approaches. In Experiment 1, the time taken for faces to break interocular suppression was measured. The results showed that attractive faces enjoyed the privilege of breaking suppression and reaching consciousness earlier. In Experiment 2, we further showed that attractive faces had lower visibility thresholds, again suggesting that facial attractiveness could be processed more easily to reach consciousness. Crucially, in Experiment 3, a significant decrease of accuracy on an orientation discrimination task subsequent to an invisible attractive face showed that attractive faces, albeit suppressed and invisible, still exerted an effect by orienting attention. Taken together, for the first time, we show that facial attractiveness can be processed in the complete absence of consciousness, and an unconscious attractive face is still capable of directing our attention. PMID:27848992

  2. Neurophysiological Markers of Emotion Processing in Burnout Syndrome.

    PubMed

    Golonka, Krystyna; Mojsa-Kaja, Justyna; Popiel, Katarzyna; Marek, Tadeusz; Gawlowska, Magda

    2017-01-01

    The substantial body of research employing subjective measures indicates that burnout syndrome is associated with cognitive and emotional dysfunctions. The growing amount of neurophysiological and neuroimaging research helps in broadening existing knowledge of the neural mechanisms underlying core burnout components (emotional exhaustion and depersonalization/cynicism) that are inextricably associated with emotional processing. In the presented EEG study, a group of 93 participants (55 women; mean age = 35.8) were selected for the burnout group or the demographically matched control group on the basis of the results of the Maslach Burnout Inventory - General Survey (MBI-GS) and the Areas of Worklife Survey (AWS). Subjects then participated in an EEG experiment using two experimental procedures: a facial recognition task and viewing of passive pictures. The study focuses on analyzing event-related potentials (ERPs): N170, VPP, EPN, and LPP, as indicators of emotional information processing. Our results show that burnout subjects, as compared to the control group, demonstrate significantly weaker response to affect-evoking stimuli, indexed by a decline in VPP amplitude to emotional faces and decreased EPN amplitude in processing emotional scenes. The analysis of N170 and LPP showed no significant between-group difference. The correlation analyses revealed that VPP and EPN, which are ERP components related to emotional processing, are associated with two core burnout symptoms: emotional exhaustion and cynicism. To our knowledge, we are one of the first research groups to use ERPs to demonstrate such a relationship between neurophysiological activity and burnout syndrome in the context of emotional processing. Thus, in conclusion, we emphasize that the decreased amplitude of the VPP and EPN components in the burnout group may be a neurophysiological manifestation of emotional blunting and may be considered as neurophysiological markers of emotional exhaustion and cynicism.

  3. Neurophysiological Markers of Emotion Processing in Burnout Syndrome

    PubMed Central

    Golonka, Krystyna; Mojsa-Kaja, Justyna; Popiel, Katarzyna; Marek, Tadeusz; Gawlowska, Magda

    2017-01-01

    The substantial body of research employing subjective measures indicates that burnout syndrome is associated with cognitive and emotional dysfunctions. The growing amount of neurophysiological and neuroimaging research helps broaden existing knowledge of the neural mechanisms underlying the core burnout components (emotional exhaustion and depersonalization/cynicism), which are inextricably associated with emotional processing. In the present EEG study, 93 participants (55 women; mean age = 35.8) were assigned to the burnout group or to a demographically matched control group on the basis of the Maslach Burnout Inventory – General Survey (MBI-GS) and the Areas of Worklife Survey (AWS). Subjects then participated in an EEG experiment using two experimental procedures: a facial recognition task and passive viewing of pictures. The study focuses on analyzing event-related potentials (ERPs): N170, VPP, EPN, and LPP, as indicators of emotional information processing. Our results show that burnout subjects, compared to the control group, demonstrate a significantly weaker response to affect-evoking stimuli, indexed by a decline in VPP amplitude to emotional faces and a decreased EPN amplitude in processing emotional scenes. The analysis of N170 and LPP showed no significant between-group difference. The correlation analyses revealed that VPP and EPN, ERP components related to emotional processing, are associated with two core burnout symptoms: emotional exhaustion and cynicism. To our knowledge, we are among the first research groups to use ERPs to demonstrate such a relationship between neurophysiological activity and burnout syndrome in the context of emotional processing. We conclude that the decreased amplitude of the VPP and EPN components in the burnout group may be a neurophysiological manifestation of emotional blunting and may be considered a neurophysiological marker of emotional exhaustion and cynicism.

  4. From face processing to face recognition: Comparing three different processing levels.

    PubMed

    Besson, G; Barragan-Jason, G; Thorpe, S J; Fabre-Thorpe, M; Puma, S; Ceccaldi, M; Barbeau, E J

    2017-01-01

    Verifying that a face is from a target person (e.g. finding someone in a crowd) is a critical ability of the human face processing system. Yet how fast this can be performed is unknown. The 'entry-level shift due to expertise' hypothesis suggests that - since humans are face experts - processing faces should be as fast - or even faster - at the individual than at superordinate levels. In contrast, the 'superordinate advantage' hypothesis suggests that faces are processed from coarse to fine, so that the opposite pattern should be observed. To clarify this debate, three different face processing levels were compared: (1) a superordinate face categorization level (i.e. detecting human faces among animal faces), (2) a face familiarity level (i.e. recognizing famous faces among unfamiliar ones) and (3) verifying that a face is from a target person, our condition of interest. The minimal speed at which faces can be categorized (∼260ms) or recognized as familiar (∼360ms) has largely been documented in previous studies, and thus provides boundaries against which to compare our condition of interest. Twenty-seven participants were included. The recent Speed and Accuracy Boosting (SAB) procedure was used since it constrains participants to use their fastest strategy. Stimuli were presented either upright or inverted. Results revealed that verifying that a face is from a target person (minimal RT at ∼260ms) was remarkably fast but longer than the face categorization level (∼240ms) and was more sensitive to face inversion. In contrast, it was much faster than recognizing a face as familiar (∼380ms), a level severely affected by face inversion. Face recognition corresponding to finding a specific person in a crowd thus appears achievable in only a quarter of a second. In favor of the 'superordinate advantage' hypothesis, or coarse-to-fine account of the face visual hierarchy, these results suggest a graded engagement of the face processing system across processing levels.

  5. Positive emotion impedes emotional but not cognitive conflict processing.

    PubMed

    Zinchenko, Artyom; Obermeier, Christian; Kanske, Philipp; Schröger, Erich; Kotz, Sonja A

    2017-06-01

    Cognitive control enables successful goal-directed behavior by resolving a conflict between opposing action tendencies, while emotional control arises as a consequence of emotional conflict processing such as in irony. While negative emotion facilitates both cognitive and emotional conflict processing, it is unclear how emotional conflict processing is affected by positive emotion (e.g., humor). In 2 EEG experiments, we investigated the role of positive audiovisual target stimuli in cognitive and emotional conflict processing. Participants categorized either spoken vowels (cognitive task) or their emotional valence (emotional task) and ignored the visual stimulus dimension. Behaviorally, a positive target showed no influence on cognitive conflict processing, but impeded emotional conflict processing. In the emotional task, response time conflict costs were higher for positive than for neutral targets. In the EEG, we observed an interaction of emotion by congruence in the P200 and N200 ERP components in emotional but not in cognitive conflict processing. In the emotional conflict task, the P200 and N200 conflict effect was larger for emotional than neutral targets. Thus, our results show that emotion affects conflict processing differently as a function of conflict type and emotional valence. This suggests that there are conflict- and valence-specific mechanisms modulating executive control.

  6. An fMRI study of facial emotion processing in patients with schizophrenia.

    PubMed

    Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C

    2002-12-01

    Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.

  7. The endocannabinoid system and emotional processing: a pharmacological fMRI study with ∆9-tetrahydrocannabinol.

    PubMed

    Bossong, Matthijs G; van Hell, Hendrika H; Jager, Gerry; Kahn, René S; Ramsey, Nick F; Jansma, J Martijn

    2013-12-01

    Various psychiatric disorders such as major depression are associated with abnormalities in emotional processing. Evidence indicating involvement of the endocannabinoid system in emotional processing, and thus potentially in related abnormalities, is increasing. In the present study, we examined the role of the endocannabinoid system in processing of stimuli with a positive and negative emotional content in healthy volunteers. A pharmacological functional magnetic resonance imaging (fMRI) study was conducted with a placebo-controlled, cross-over design, investigating effects of the endocannabinoid agonist ∆9-tetrahydrocannabinol (THC) on brain function related to emotional processing in 11 healthy subjects. Performance and brain activity during matching of stimuli with a negative ('fearful faces') or a positive content ('happy faces') were assessed after placebo and THC administration. After THC administration, performance accuracy was decreased for stimuli with a negative but not for stimuli with a positive emotional content. Our task activated a network of brain regions including amygdala, orbital frontal gyrus, hippocampus, parietal gyrus, prefrontal cortex, and regions in the occipital cortex. THC interacted with emotional content, as activity in this network was reduced for negative content, while activity for positive content was increased. These results indicate that THC administration reduces the negative bias in emotional processing. This adds human evidence to support the hypothesis that the endocannabinoid system is involved in modulation of emotional processing. Our findings also suggest a possible role for the endocannabinoid system in abnormal emotional processing, and may thus be relevant for psychiatric disorders such as major depression. Copyright © 2013 Elsevier B.V. and ECNP. All rights reserved.

  8. Individual differences in emotion processing: how similar are diffusion model parameters across tasks?

    PubMed

    Mueller, Christina J; White, Corey N; Kuchinke, Lars

    2017-11-27

    The goal of this study was to replicate findings of diffusion model parameters capturing emotion effects in a lexical decision task and to investigate whether these findings extend to other tasks of implicit emotion processing. Additionally, we were interested in the stability of diffusion model parameters across emotional stimuli and tasks for individual subjects. Responses to words in a lexical decision task were compared with responses to faces in a gender categorization task for stimuli of the emotion categories happy, neutral, and fear. Main effects of emotion, as well as the stability of emerging response style patterns evident in diffusion model parameters across these tasks, were analyzed. Based on earlier findings, drift rates were assumed to be more similar in response to stimuli of the same emotion category than to stimuli of a different emotion category. Results showed that the emotion effects of the two tasks differed, with a processing advantage for happy, followed by neutral and fear-related words in the lexical decision task, and a processing advantage for neutral, followed by happy and fearful faces in the gender categorization task. Both emotion effects were captured in the estimated drift rate parameters and, in the case of the lexical decision task, also in the non-decision time parameters. A principal component analysis showed that, contrary to our hypothesis, drift rates were more similar within a specific task context than within a specific emotion category. Individual response patterns of subjects across tasks were evident in significant correlations of diffusion model parameters, including response styles, non-decision times, and information accumulation.
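The drift-diffusion framework behind these parameters can be made concrete with a minimal simulation (a hedged sketch with illustrative parameter values, not the ones estimated in the study): noisy evidence accumulates at a drift rate v toward one of two decision boundaries, and a non-decision time t0 is added to the decision time.

```python
import random

def simulate_ddm(v, a=1.0, t0=0.3, dt=0.001, sigma=1.0, rng=random):
    """Simulate one trial of a two-boundary drift-diffusion process.

    v  : drift rate (speed of evidence accumulation; higher -> faster, more accurate)
    a  : boundary separation (response caution)
    t0 : non-decision time in seconds (stimulus encoding + motor response)
    Returns (choice, rt): choice is 1 (upper boundary) or 0 (lower boundary).
    """
    x, t = 0.0, 0.0
    while abs(x) < a / 2:
        # Euler step: deterministic drift plus Gaussian diffusion noise
        x += v * dt + sigma * (dt ** 0.5) * rng.gauss(0, 1)
        t += dt
    return (1 if x > 0 else 0), t0 + t

random.seed(1)
trials = [simulate_ddm(v=1.5) for _ in range(2000)]
acc = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"accuracy={acc:.2f} mean RT={mean_rt:.2f}s")
```

In this framework, a "processing advantage" for an emotion category surfaces as a higher drift rate, which produces both higher accuracy and shorter response times, whereas slower encoding or responding would surface in the non-decision time t0.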

  9. Emotional language processing in autism spectrum disorders: a systematic review.

    PubMed

    Lartseva, Alina; Dijkstra, Ton; Buitelaar, Jan K

    2014-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. In particular, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be more than a mere consequence of social impairments, and should receive more attention in future research.

  10. Emotional language processing in autism spectrum disorders: a systematic review

    PubMed Central

    Lartseva, Alina; Dijkstra, Ton; Buitelaar, Jan K.

    2015-01-01

    In his first description of Autism Spectrum Disorders (ASD), Kanner emphasized emotional impairments by characterizing children with ASD as indifferent to other people, self-absorbed, emotionally cold, distanced, and retracted. Thereafter, emotional impairments became regarded as part of the social impairments of ASD, and research mostly focused on understanding how individuals with ASD recognize visual expressions of emotions from faces and body postures. However, it still remains unclear how emotions are processed outside of the visual domain. This systematic review aims to fill this gap by focusing on impairments of emotional language processing in ASD. We systematically searched PubMed for papers published between 1990 and 2013 using standardized search terms. Studies show that people with ASD are able to correctly classify emotional language stimuli as emotionally positive or negative. However, processing of emotional language stimuli in ASD is associated with atypical patterns of attention and memory performance, as well as abnormal physiological and neural activity. In particular, younger children with ASD have difficulties in acquiring and developing emotional concepts, and avoid using these in discourse. These emotional language impairments were not consistently associated with age, IQ, or level of development of language skills. We discuss how emotional language impairments fit with existing cognitive theories of ASD, such as central coherence, executive dysfunction, and weak Theory of Mind. We conclude that emotional impairments in ASD may be more than a mere consequence of social impairments, and should receive more attention in future research. PMID:25610383

  11. The Cambridge Mindreading (CAM) Face-Voice Battery: Testing complex emotion recognition in adults with and without Asperger syndrome.

    PubMed

    Golan, Ofer; Baron-Cohen, Simon; Hill, Jacqueline

    2006-02-01

    Adults with Asperger Syndrome (AS) can recognise simple emotions and pass basic theory of mind tasks, but have difficulties recognising more complex emotions and mental states. This study describes a new battery of tasks, testing recognition of 20 complex emotions and mental states from faces and voices. The battery was given to males and females with AS and matched controls. Results showed the AS group performed worse than controls overall, on emotion recognition from faces and voices and on 12/20 specific emotions. Females recognised faces better than males regardless of diagnosis, and males with AS had more difficulties recognising emotions from faces than from voices. The implications of these results are discussed in relation to social functioning in AS.

  12. Modulation of central serotonin affects emotional information processing in impulsive aggressive personality disorder.

    PubMed

    Lee, Royce J; Gill, Andrew; Chen, Bing; McCloskey, Michael; Coccaro, Emil F

    2012-06-01

    The mechanistic model whereby serotonin affects impulsive aggression is not completely understood. The purpose of this study was to test the hypothesis that depleting serotonin reserves via tryptophan depletion affects emotional information processing in susceptible individuals. The effect of tryptophan depletion (vs placebo) on the processing of Ekman emotional faces was compared between impulsive aggressive personality-disordered male and female adults and normal controls. All subjects were free of psychotropic medications, medically healthy, nondepressed, and substance free. Additionally, subjective mood state and vital signs were monitored. For emotion recognition, a significant Aggression × Drug × Sex interaction (F(1, 31) = 7.687, P = 0.009) was found, with male normal controls, but not impulsive aggressive males, showing increased recognition of fear. For intensity ratings of emotional faces, a significant Drug × Group × Sex interaction was found (F(1, 31) = 5.924, P = 0.021), with follow-up tests revealing that males with intermittent explosive disorder tended to increase intensity ratings of angry faces after tryptophan depletion. Additionally, tryptophan depletion was associated with increased heart rate in all subjects and increased intensity of the subjective emotional state of "anger" in impulsive aggressive subjects. Individuals with clinically relevant levels of impulsive aggression may be susceptible to the effects of serotonergic depletion on emotional information processing, showing a tendency to exaggerate their impression of the intensity of angry expressions and to report an angry mood state after tryptophan depletion. This may reflect heightened sensitivity to the effects of serotonergic dysregulation, and suggests that impulsive aggression is underpinned either by supersensitivity to serotonergic disturbances or by susceptibility to fluctuations in central serotonergic availability.

  13. Emotion Words, Regardless of Polarity, Have a Processing Advantage over Neutral Words

    ERIC Educational Resources Information Center

    Kousta, Stavroula-Thaleia; Vinson, David P.; Vigliocco, Gabriella

    2009-01-01

    Despite increasing interest in the interface between emotion and cognition, the role of emotion in cognitive tasks is unclear. According to one hypothesis, negative valence is more relevant for survival and is associated with a general slowdown of the processing of stimuli, due to a defense mechanism that freezes activity in the face of threat.…

  14. Face processing in different brain areas, and critical band masking.

    PubMed

    Rolls, Edmund T

    2008-09-01

    Neurophysiological evidence is described showing that some neurons in the macaque inferior temporal visual cortex have responses that are invariant with respect to the position, size, view, and spatial frequency of faces and objects, and that these neurons show rapid processing and rapid learning. Critical band spatial frequency masking is shown to be a property of these face-selective neurons and of the human visual perception of faces. Which face or object is present is encoded using a distributed representation in which each neuron conveys independent information in its firing rate, with little information evident in the relative time of firing of different neurons. This ensemble encoding has the advantages of maximizing the information in the representation useful for discrimination between stimuli using a simple weighted sum of the neuronal firing by the receiving neurons, generalization, and graceful degradation. These invariant representations are ideally suited to provide the inputs to brain regions such as the orbitofrontal cortex and amygdala that learn the reinforcement associations of an individual's face, for then the learning, and the appropriate social and emotional responses generalize to other views of the same face. A theory is described of how such invariant representations may be produced by self-organizing learning in a hierarchically organized set of visual cortical areas with convergent connectivity. The theory utilizes either temporal or spatial continuity with an associative synaptic modification rule. Another population of neurons in the cortex in the superior temporal sulcus encodes other aspects of faces such as face expression, eye-gaze, face view, and whether the head is moving. These neurons thus provide important additional inputs to parts of the brain such as the orbitofrontal cortex and amygdala that are involved in social communication and emotional behaviour. Outputs of these systems reach the amygdala, in which face
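The "simple weighted sum of the neuronal firing" readout described above can be sketched with a toy population code (an illustrative construction with invented firing rates and weights, not the recorded macaque data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population code: 20 neurons, each with a mean firing rate (Hz) for
# each of two face stimuli; single-trial responses carry spiking noise.
mean_rates = rng.uniform(5, 30, size=(2, 20))   # rows: face A, face B

def readout(rates, weights, threshold):
    """Classify a population response by a weighted sum of firing rates."""
    return int(rates @ weights > threshold)

# A receiving neuron could learn weights as the difference between the two
# mean response patterns (a simple linear discriminant), with the decision
# threshold placed midway between the two class means.
weights = mean_rates[1] - mean_rates[0]
threshold = 0.5 * (mean_rates[0] @ weights + mean_rates[1] @ weights)

trials = 1000
correct = 0
for _ in range(trials):
    face = rng.integers(0, 2)
    rates = rng.poisson(mean_rates[face])       # Poisson-like trial noise
    correct += readout(rates, weights, threshold) == face
print(f"decoding accuracy: {correct / trials:.2f}")
```

Because the information is distributed across the ensemble, zeroing a few neurons' weights degrades accuracy only gradually, which illustrates the "graceful degradation" property mentioned above.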

  15. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    PubMed

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  16. Emotion processing for arousal and neutral content in Alzheimer's disease.

    PubMed

    Satler, Corina; Uribe, Carlos; Conde, Carlos; Da-Silva, Sergio Leme; Tomaz, Carlos

    2010-02-01

    Objective. To assess the ability of Alzheimer's disease (AD) patients to perceive emotional information and to assign subjective emotional rating scores to audiovisual presentations. Materials and Methods. 24 subjects were studied (14 patients with AD and 10 controls matched for age and educational level). After neuropsychological assessment, they watched a Neutral story and then a story with Emotional content. Results. Recall scores for both stories were significantly lower in AD (Neutral and Emotional: P = .001). The control group (CG) assigned different emotional scores to each version of the test, P = .001, while ratings by AD patients did not differ, P = .32. Linear regression analyses determined the best predictors of emotional rating and recognition memory for each group from the neuropsychological test battery. Conclusions. AD patients show changes in the effect of emotional processing on declarative memory but a preserved ability to express emotions in the face of arousing content. The present findings suggest that these impairments are due to general cognitive decline.

  17. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    PubMed

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. 
However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize

  18. Of toothy grins and angry snarls--open mouth displays contribute to efficiency gains in search for emotional faces.

    PubMed

    Horstmann, Gernot; Lipp, Ottmar V; Becker, Stefanie I

    2012-05-25

    The emotional face-in-a-crowd effect is widely cited, but its origin remains controversial, particularly with photorealistic stimuli. Recently, it has been suggested that one factor underlying the guidance of attention by a photorealistic emotional face in visual search might be the visibility of teeth, a hypothesis, however, that has not been studied systematically to date. The present experiments manipulate the visibility of teeth experimentally and orthogonally to facial emotion. Results suggest that much of the face-in-a-crowd effect with photorealistic emotional faces is due to visible teeth, and that the visibility of teeth can create a search advantage for either a happy or an angry target face when teeth visibility and facial emotion are confounded. Further analyses clarify that the teeth visibility primarily affects the speed with which neutral crowds are scanned, shedding new light on the mechanism that evokes differences in search efficiency for different emotional expressions.

  19. Piccolo genotype modulates neural correlates of emotion processing but not executive functioning.

    PubMed

    Woudstra, S; Bochdanovits, Z; van Tol, M-J; Veltman, D J; Zitman, F G; van Buchem, M A; van der Wee, N J; Opmeer, E M; Demenescu, L R; Aleman, A; Penninx, B W; Hoogendijk, W J

    2012-04-03

    Major depressive disorder (MDD) is characterized by affective symptoms and cognitive impairments, which have been associated with changes in limbic and prefrontal activity as well as with monoaminergic neurotransmission. A genome-wide association study implicated the polymorphism rs2522833 in the piccolo (PCLO) gene--involved in monoaminergic neurotransmission--as a risk factor for MDD. However, the role of the PCLO risk allele in emotion processing and executive function or its effect on their neural substrate has never been studied. We used functional magnetic resonance imaging (fMRI) to investigate PCLO risk allele carriers vs noncarriers during an emotional face processing task and a visuospatial planning task in 159 current MDD patients and healthy controls. In PCLO risk allele carriers, we found increased activity in the left amygdala during processing of angry and sad faces compared with noncarriers, independent of psychopathological status. During processing of fearful faces, the PCLO risk allele was associated with increased amygdala activation in MDD patients only. During the visuospatial planning task, we found no genotype effect on performance or on BOLD signal in our predefined areas as a function of increasing task load. The PCLO risk allele was found to be specifically associated with altered emotion processing, but not with executive dysfunction. Moreover, the PCLO risk allele appears to modulate amygdala function during fearful facial processing in MDD and may constitute a possible link between genotype and susceptibility for depression via altered processing of fearful stimuli. The current results may therefore aid in better understanding underlying neurobiological mechanisms in MDD.

  20. Neural Correlates of Perceiving Emotional Faces and Bodies in Developmental Prosopagnosia: An Event-Related fMRI-Study

    PubMed Central

    Van den Stock, Jan; van de Riet, Wim A. C.; Righart, Ruthger; de Gelder, Beatrice

    2008-01-01

    Many people experience transient difficulties in recognizing faces but only a small number of them cannot recognize their family members when meeting them unexpectedly. Such face blindness is associated with serious problems in everyday life. A better understanding of the neuro-functional basis of impaired face recognition may be achieved by a careful comparison with an equally unique object category and by adding a more realistic setting involving neutral faces as well as facial expressions. We used event-related functional magnetic resonance imaging (fMRI) to investigate the neuro-functional basis of perceiving faces and bodies in three developmental prosopagnosics (DP) and matched healthy controls. Our approach involved materials consisting of neutral faces and bodies as well as faces and bodies expressing fear or happiness. The first main result is that the presence of emotional information has a different effect in the patient vs. the control group in the fusiform face area (FFA). Neutral faces trigger lower activation in the DP group, compared to the control group, while activation for facial expressions is the same in both groups. The second main result is that compared to controls, DPs have increased activation for bodies in the inferior occipital gyrus (IOG) and for neutral faces in the extrastriate body area (EBA), indicating that body- and face-sensitive processes are less categorically segregated in DP. Taken together, our study shows the importance of using naturalistic emotional stimuli for a better understanding of developmental face deficits. PMID:18797499

  1. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    PubMed

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting attention and maintenance of attention) in the processing of emotional faces. Compared to ND, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent marginally less time viewing happy faces than the ND group. No differences were found between the groups with respect to angry faces and orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages in the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.
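The maintenance-of-attention indices used above (first fixation duration and total fixation time per expression) reduce to simple averages over trials. The sketch below is illustrative only; the function and data layout are assumptions, not the authors' analysis code.

```python
# Illustrative sketch of eye-tracking maintenance-of-attention indices.
# Input format (hypothetical): a list of (expression, fixation_durations_ms)
# pairs, one per trial, where fixation_durations_ms lists that trial's
# fixations on the face in temporal order.

def maintenance_indices(trials):
    by_expr = {}
    for expression, durations in trials:
        by_expr.setdefault(expression, []).append(durations)
    indices = {}
    for expression, runs in by_expr.items():
        first = sum(r[0] for r in runs) / len(runs)    # mean first-fixation duration
        total = sum(sum(r) for r in runs) / len(runs)  # mean total fixation time
        indices[expression] = {"first_fixation_ms": first, "total_time_ms": total}
    return indices

def bias_score(indices, emotional, neutral="neutral"):
    """Positive values mean longer dwelling on the emotional expression."""
    return indices[emotional]["total_time_ms"] - indices[neutral]["total_time_ms"]
```

A positive `bias_score` for sad faces, rising with symptom severity, would correspond to the pattern reported above.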

  2. Effect of positive emotion on consolidation of memory for faces: the modulation of facial valence and facial gender.

    PubMed

    Wang, Bo

    2013-01-01

    Studies have shown that emotion elicited after learning enhances memory consolidation. However, no prior studies have used facial photos as stimuli. This study examined the effect of post-learning positive emotion on consolidation of memory for faces. During learning, participants viewed neutral, positive, or negative faces. They were then assigned to a condition in which they watched either a 9-minute positive video clip or a 9-minute neutral video. Thirty minutes after learning, participants took a surprise memory test, in which they made "remember", "know", and "new" judgements. The findings are: (1) Positive emotion enhanced consolidation of recognition for negative male faces, but impaired consolidation of recognition for negative female faces; (2) For males, recognition for negative faces was equivalent to that for positive faces; for females, recognition for negative faces was better than that for positive faces. Our study provides important evidence that the effect of post-learning emotion on memory consolidation extends to facial stimuli and that such an effect can be modulated by facial valence and facial gender. The findings may shed light on establishing models concerning the influence of emotion on memory consolidation.
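Recognition in old/new tests of this kind is often summarized as a corrected recognition score (hit rate minus false-alarm rate), which can be computed per face category (valence by gender). A minimal sketch, with a hypothetical input format:

```python
# Corrected recognition = hit rate - false-alarm rate, a standard correction
# for response bias in old/new recognition tests.
# old_responses: for studied faces, True where the participant said
# "remember" or "know" (i.e. judged the face as studied);
# new_responses: the same judgement for unstudied faces.

def corrected_recognition(old_responses, new_responses):
    hit_rate = sum(old_responses) / len(old_responses)
    false_alarm_rate = sum(new_responses) / len(new_responses)
    return hit_rate - false_alarm_rate
```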

  3. Attachment Patterns Trigger Differential Neural Signature of Emotional Processing in Adolescents

    PubMed Central

    Decety, Jean; Huepe, David; Cardona, Juan Felipe; Canales-Johnson, Andres; Sigman, Mariano; Mikulan, Ezequiel; Helgiu, Elena; Baez, Sandra; Manes, Facundo; Lopez, Vladimir; Ibañez, Agustín

    2013-01-01

    Background Research suggests that individuals with different attachment patterns process social information differently, especially in terms of facial emotion recognition. However, few studies have explored social information processes in adolescents. This study examined the behavioral and ERP correlates of emotional processing in adolescents with different attachment orientations (insecure attachment group and secure attachment group; IAG and SAG, respectively). This study also explored the association of these correlates to individual neuropsychological profiles. Methodology/Principal Findings We used a modified version of the dual valence task (DVT), in which participants classify stimuli (faces and words) according to emotional valence (positive or negative). Results showed that the IAG performed significantly worse than the SAG on tests of executive function (EF: attention, processing speed, visuospatial abilities, and cognitive flexibility). In the behavioral DVT, the IAG presented lower performance and accuracy. The IAG also exhibited slower RTs for stimuli with negative valence. Compared to the SAG, the IAG showed a negative bias for faces; a larger P1 and an attenuated N170 component over the right hemisphere were observed. A negative bias was also observed in the IAG for word stimuli, which was demonstrated by comparing the N170 amplitude of the IAG with the valence of the SAG. Finally, the amplitude of the N170 elicited by the facial stimuli correlated with EF in both groups (and negative valence with EF in the IAG). Conclusions/Significance Our results suggest that individuals with different attachment patterns process key emotional information and corresponding EF differently. This is evidenced by an early modulation of ERP components’ amplitudes, which are correlated with behavioral and neuropsychological effects. In brief, attachment patterns appear to impact multiple domains, such as emotional processing and EFs. PMID:23940552

  4. Is emotion recognition the only problem in ADHD? effects of pharmacotherapy on face and emotion recognition in children with ADHD.

    PubMed

    Demirci, Esra; Erdogan, Ayten

    2016-12-01

    The objectives of this study were to evaluate both face and emotion recognition, to detect differences among attention deficit and hyperactivity disorder (ADHD) subgroups, to identify effects of gender, and to assess the effects of methylphenidate and atomoxetine treatment on both face and emotion recognition in patients with ADHD. The study sample consisted of 41 male and 29 female patients, 8-15 years of age, who were diagnosed as having combined type ADHD (N = 26), hyperactive/impulsive type ADHD (N = 21) or inattentive type ADHD (N = 23) but had not previously used any medication for ADHD, and 35 male and 25 female healthy individuals. Long-acting methylphenidate (OROS-MPH) was prescribed to 38 patients, whereas atomoxetine was prescribed to 32 patients. The Reading the Mind in the Eyes Test (RMET) and the Benton Face Recognition Test (BFRT) were administered to all participants before and after treatment. The patients with ADHD had a significantly lower number of correct answers on the child and adolescent RMET and on the BFRT than the healthy controls. Among the ADHD subtypes, the hyperactive/impulsive subtype had a lower number of correct answers on the RMET than the inattentive subtype, and a lower number of correct answers on the short and long forms of the BFRT than the combined and inattentive subtypes. Male and female patients with ADHD did not differ significantly with respect to the number of correct answers on the RMET and BFRT. The patients showed significant improvement in RMET and BFRT after treatment with OROS-MPH or atomoxetine. Patients with ADHD have difficulties in face recognition as well as emotion recognition. Both OROS-MPH and atomoxetine affect emotion recognition. However, further studies on face and emotion recognition in ADHD are needed.

  5. Disruption of Emotion and Conflict Processing in HIV Infection with and without Alcoholism Comorbidity

    PubMed Central

    Schulte, Tilman; Müller-Oehring, Eva M.; Sullivan, Edith V.; Pfefferbaum, Adolf

    2012-01-01

    Alcoholism and HIV-1 infection each affect components of selective attention and cognitive control that may contribute to deficits in emotion processing based on closely interacting fronto-parietal attention and frontal-subcortical emotion systems. Here, we investigated whether patients with alcoholism, HIV-1 infection, or both diseases have greater difficulty than healthy controls in resolving conflict from emotional words with different valences. Accordingly, patients with alcoholism (ALC, n = 20), HIV-1 infection (HIV, n = 20), ALC + HIV comorbidity (n = 22), and controls (CTL, n = 16) performed an emotional Stroop Match-to-Sample task, which assessed the contribution of emotion (happy, angry) to cognitive control (Stroop conflict processing). ALC + HIV showed greater Stroop effects than HIV, ALC, or CTL for negative (ANGRY) but not for positive (HAPPY) words, and also when the cue color did not match the Stroop stimulus color; the comorbid group performed similarly to the others when cue and word colors matched. Furthermore, emotionally salient face cues prolonged color-matching responses in all groups. HIV alone, compared with the other three groups, showed disproportionately slowed color-matching time when trials featured angry faces. The enhanced Stroop effects prominent in ALC + HIV suggest difficulty in exercising attentional top-down control on processes that consume attentional capacity, especially when cognitive effort is required to ignore negative emotions. PMID:21418720
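The Stroop effect in this paradigm is the reaction-time cost of incongruent relative to congruent color trials, computed separately for each word valence. The sketch below is a hedged illustration; the trial fields and function name are assumptions, not the authors' pipeline.

```python
# Hypothetical sketch: Stroop effect as mean RT on incongruent trials minus
# mean RT on congruent trials, for one valence condition (e.g. ANGRY words).
# trials: list of dicts with 'valence', 'congruent' (bool), and 'rt_ms'.

def stroop_effect(trials, valence):
    cong = [t["rt_ms"] for t in trials
            if t["valence"] == valence and t["congruent"]]
    incong = [t["rt_ms"] for t in trials
              if t["valence"] == valence and not t["congruent"]]
    return sum(incong) / len(incong) - sum(cong) / len(cong)
```

A larger value for negative than for positive words would correspond to the ALC + HIV pattern reported above.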

  6. Disruption of emotion and conflict processing in HIV infection with and without alcoholism comorbidity.

    PubMed

    Schulte, Tilman; Müller-Oehring, Eva M; Sullivan, Edith V; Pfefferbaum, Adolf

    2011-05-01

    Alcoholism and HIV-1 infection each affect components of selective attention and cognitive control that may contribute to deficits in emotion processing based on closely interacting fronto-parietal attention and frontal-subcortical emotion systems. Here, we investigated whether patients with alcoholism, HIV-1 infection, or both diseases have greater difficulty than healthy controls in resolving conflict from emotional words with different valences. Accordingly, patients with alcoholism (ALC, n = 20), HIV-1 infection (HIV, n = 20), ALC + HIV comorbidity (n = 22), and controls (CTL, n = 16) performed an emotional Stroop Match-to-Sample task, which assessed the contribution of emotion (happy, angry) to cognitive control (Stroop conflict processing). ALC + HIV showed greater Stroop effects than HIV, ALC, or CTL for negative (ANGRY) but not for positive (HAPPY) words, and also when the cue color did not match the Stroop stimulus color; the comorbid group performed similarly to the others when cue and word colors matched. Furthermore, emotionally salient face cues prolonged color-matching responses in all groups. HIV alone, compared with the other three groups, showed disproportionately slowed color-matching time when trials featured angry faces. The enhanced Stroop effects prominent in ALC + HIV suggest difficulty in exercising attentional top-down control on processes that consume attentional capacity, especially when cognitive effort is required to ignore negative emotions.

  7. Callousness and affective face processing in adults: Behavioral and brain-potential indicators.

    PubMed

    Brislin, Sarah J; Yancey, James R; Perkins, Emily R; Palumbo, Isabella M; Drislane, Laura E; Salekin, Randall T; Fanti, Kostas A; Kimonis, Eva R; Frick, Paul J; Blair, R James R; Patrick, Christopher J

    2018-03-01

    The investigation of callous-unemotional (CU) traits has been central to contemporary research on child behavior problems, and served as the impetus for inclusion of a specifier for conduct disorder in the latest edition of the official psychiatric diagnostic system. Here, we report results from 2 studies that evaluated the construct validity of callousness as assessed in adults, by testing for affiliated deficits in behavioral and neural processing of fearful faces, as have been shown in youthful samples. We hypothesized that scores on an established measure of callousness would predict reduced recognition accuracy and diminished electrocortical reactivity for fearful faces in adult participants. In Study 1, 66 undergraduate participants performed an emotion recognition task in which they viewed affective faces of different types and indicated the emotion expressed by each. In Study 2, electrocortical data were collected from 254 adult twins during viewing of fearful and neutral face stimuli, and scored for event-related response components. Analyses of Study 1 data revealed that higher callousness was associated with decreased recognition accuracy for fearful faces specifically. In Study 2, callousness was associated with reduced amplitude of both N170 and P200 responses to fearful faces. Current findings demonstrate for the first time that callousness in adults is associated with both behavioral and physiological deficits in the processing of fearful faces. These findings support the validity of the CU construct with adults and highlight the possibility of a multidomain measurement framework for continued study of this important clinical construct. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. Facing emotions in narcolepsy with cataplexy: haemodynamic and behavioural responses during emotional stimulation.

    PubMed

    de Zambotti, Massimiliano; Pizza, Fabio; Covassin, Naima; Vandi, Stefano; Cellini, Nicola; Stegagno, Luciano; Plazzi, Giuseppe

    2014-08-01

    Narcolepsy with cataplexy is a complex sleep disorder that affects the modulation of emotions: cataplexy, the key symptom of narcolepsy, is indeed strongly linked with emotions that usually trigger the episodes. Our study aimed to investigate haemodynamic and behavioural responses during emotional stimulation in narco-cataplexy. Twelve adult drug-naive narcoleptic patients (five males; age: 33.3 ± 9.4 years) and 12 healthy controls (five males; age: 30.9 ± 9.5 years) were exposed to emotional stimuli (pleasant, unpleasant and neutral pictures). Heart rate, arterial blood pressure and mean cerebral blood flow velocity of the middle cerebral arteries were continuously recorded using photoplethysmography and Doppler ultrasound. Ratings of valence and arousal and coping strategies were scored by the Self-Assessment Manikin and by questionnaires, respectively. Narcoleptic patients' haemodynamic responses to pictures overlapped with the data obtained from controls: decrease of heart rate and increase of mean cerebral blood flow velocity regardless of pictures' content, increase of systolic blood pressure during the pleasant condition, and relative reduction of heart rate during pleasant and unpleasant conditions. However, when compared with controls, narcoleptic patients reported lower arousal scores during the pleasant and neutral stimulation, lower valence scores during the pleasant condition, and a lower score on the 'focus on and venting of emotions' dimension of coping. Our results suggest that adult narcoleptic patients, compared with healthy controls, inhibited their emotion-expressive behaviour in response to emotional stimulation, which may be related to the development of adaptive cognitive strategies to face emotions while avoiding cataplexy. © 2014 European Sleep Research Society.

  9. Major depression is associated with impaired processing of emotion in music as well as in facial and vocal stimuli.

    PubMed

    Naranjo, C; Kornreich, C; Campanella, S; Noël, X; Vandriette, Y; Gillain, B; de Longueville, X; Delatte, B; Verbanck, P; Constant, E

    2011-02-01

    The processing of emotional stimuli is thought to be negatively biased in major depression. This study investigates this issue using musical, vocal and facial affective stimuli. Twenty-three depressed in-patients and 23 matched healthy controls were recruited. Affective information processing was assessed through musical, vocal and facial emotion recognition tasks. Depression, anxiety level and attention capacity were controlled. The depressed participants demonstrated less accurate identification of emotions than the control group in all three sorts of emotion-recognition tasks. The depressed group also gave higher intensity ratings than the controls when scoring negative emotions, and they were more likely to attribute negative emotions to neutral voices and faces. Our in-patient group might differ from the more general population of depressed adults. They were all taking anti-depressant medication, which may have had an influence on their emotional information processing. Major depression is associated with a general negative bias in the processing of emotional stimuli. Emotional processing impairment in depression is not confined to interpersonal stimuli (faces and voices) but extends to the ability to accurately perceive emotion in music. © 2010 Elsevier B.V. All rights reserved.

  10. Getting to the Bottom of Face Processing: Species-Specific Inversion Effects for Faces and Behinds in Humans and Chimpanzees (Pan troglodytes).

    PubMed

    Kret, Mariska E; Tomonaga, Masaki

    2016-01-01

    For social species such as primates, the recognition of conspecifics is crucial for survival. As demonstrated by the 'face inversion effect', humans are experts in recognizing faces and, unlike objects, process them configurally to recognize identity. The human face, with its distinct features such as eye-whites, eyebrows, red lips and cheeks, signals emotions, intentions, health and sexual attraction and, as we will show here, shares important features with the primate behind. Chimpanzee females show a swelling and reddening of the anogenital region around the time of ovulation. This provides an important socio-sexual signal for group members, who can identify individuals by their behinds. We hypothesized that chimpanzees process behinds configurally in the way humans process faces. In four different delayed matching-to-sample tasks with upright and inverted body parts, we show that humans demonstrate a face, but not a behind inversion effect and that chimpanzees show a behind, but no clear face inversion effect. The findings suggest an evolutionary shift in socio-sexual signalling function from behinds to faces, two hairless, symmetrical and attractive body parts, which might have attuned the human brain to process faces, and the human face to become more behind-like.
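The inversion effect in a delayed matching-to-sample task is conventionally indexed as the accuracy drop from upright to inverted presentation of a stimulus category. A minimal sketch under that convention, with hypothetical field names:

```python
# Inversion effect = accuracy(upright) - accuracy(inverted) for one category.
# trials: list of dicts with 'category', 'orientation', and 'correct' fields.

def inversion_effect(trials, category):
    def accuracy(orientation):
        outcomes = [t["correct"] for t in trials
                    if t["category"] == category
                    and t["orientation"] == orientation]
        return sum(outcomes) / len(outcomes)
    return accuracy("upright") - accuracy("inverted")
```

On this index, the pattern reported above would be a positive effect for faces (but not behinds) in humans, and for behinds (but not clearly faces) in chimpanzees.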

  11. Dispositional fear, negative affectivity, and neuroimaging response to visually suppressed emotional faces.

    PubMed

    Vizueta, Nathalie; Patrick, Christopher J; Jiang, Yi; Thomas, Kathleen M; He, Sheng

    2012-01-02

    "Invisible" stimulus paradigms provide a method for investigating basic affective processing in clinical and non-clinical populations. Neuroimaging studies utilizing continuous flash suppression (CFS) have shown increased amygdala response to invisible fearful versus neutral faces. The current study used CFS in conjunction with functional MRI to test for differences in brain reactivity to visible and invisible emotional faces in relation to two distinct trait dimensions relevant to psychopathology: negative affectivity (NA) and fearfulness. Subjects consisted of college students (N=31) assessed for fear/fearlessness along with dispositional NA. The main brain regions of interest included the fusiform face area (FFA), superior temporal sulcus (STS), and amygdala. Higher NA, but not trait fear, was associated with enhanced response to fearful versus neutral faces in STS and right amygdala (but not FFA), within the invisible condition specifically. The finding that NA rather than fearfulness predicted degree of amygdala reactivity to suppressed faces implicates the input subdivision of the amygdala in the observed effects. Given the central role of NA in anxiety and mood disorders, the current data also support use of the CFS methodology for investigating the neurobiology of these disorders. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Emotional context enhances auditory novelty processing in superior temporal gyrus.

    PubMed

    Domínguez-Borràs, Judith; Trautmann, Sina-Alexa; Erhard, Peter; Fehr, Thorsten; Herrmann, Manfred; Escera, Carles

    2009-07-01

    Visualizing emotionally loaded pictures intensifies peripheral reflexes toward sudden auditory stimuli, suggesting that the emotional context may potentiate responses elicited by novel events in the acoustic environment. However, psychophysiological results have reported that attentional resources available to sounds become depleted, as attention allocation to emotional pictures increases. These findings have raised the challenging question of whether an emotional context actually enhances or attenuates auditory novelty processing at a central level in the brain. To solve this issue, we used functional magnetic resonance imaging to first identify brain activations induced by novel sounds (NOV) when participants made a color decision on visual stimuli containing both negative (NEG) and neutral (NEU) facial expressions. We then measured modulation of these auditory responses by the emotional load of the task. Contrary to what was assumed, activation induced by NOV in superior temporal gyrus (STG) was enhanced when subjects responded to faces with a NEG emotional expression compared with NEU ones. Accordingly, NOV yielded stronger behavioral disruption on subjects' performance in the NEG context. These results demonstrate that the emotional context modulates the excitability of auditory and possibly multimodal novelty cerebral regions, enhancing acoustic novelty processing in a potentially harming environment.

  13. Processing of subliminal facial expressions of emotion: a behavioral and fMRI study.

    PubMed

    Prochnow, D; Kossack, H; Brunheim, S; Müller, K; Wittsack, H-J; Markowitsch, H-J; Seitz, R J

    2013-01-01

    The recognition of emotional facial expressions is an important means to adjust behavior in social interactions. As facial expressions widely differ in their duration and degree of expressiveness, they often manifest with short and transient expressions below the level of awareness. In this combined behavioral and fMRI study, we aimed at examining whether not consciously accessible (subliminal) emotional facial expressions influence empathic judgments and which brain activations are related to this. We hypothesized that subliminal facial expressions of emotions masked with neutral expressions of the same faces induce empathic processing similar to that of consciously accessible (supraliminal) facial expressions. Our behavioral data in 23 healthy subjects showed that subliminal emotional facial expressions of 40 ms duration affect the judgments of the subsequent neutral facial expressions. In the fMRI study in 12 healthy subjects it was found that both supra- and subliminal emotional facial expressions shared a widespread network of brain areas including the fusiform gyrus, the temporo-parietal junction, and the inferior, dorsolateral, and medial frontal cortex. Compared with subliminal facial expressions, supraliminal facial expressions led to a greater activation of left occipital and fusiform face areas. We conclude that masked subliminal emotional information is suited to trigger processing in brain areas which have been implicated in empathy and, thereby, in social encounters.

  14. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions.

    PubMed

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability of identifying, processing and regulating emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.

  15. Impaired Integration of Emotional Faces and Affective Body Context in a Rare Case of Developmental Visual Agnosia

    PubMed Central

    Aviezer, Hillel; Hassin, Ran. R.; Bentin, Shlomo

    2011-01-01

    In the current study we examined the recognition of facial expressions embedded in emotionally expressive bodies in case LG, an individual with a rare form of developmental visual agnosia who suffers from severe prosopagnosia. Neuropsychological testing demonstrated that LG's agnosia is characterized by profoundly impaired visual integration. Unlike individuals with typical developmental prosopagnosia who display specific difficulties with face identity (but typically not expression) recognition, LG was also impaired at recognizing isolated facial expressions. By contrast, he successfully recognized the expressions portrayed by faceless emotional bodies handling affective paraphernalia. When presented with contextualized faces in emotional bodies, his ability to detect the emotion expressed by a face did not improve even if it was embedded in an emotionally-congruent body context. Furthermore, in contrast to controls, LG displayed an abnormal pattern of contextual influence from emotionally-incongruent bodies. The results are interpreted in the context of a general integration deficit in developmental visual agnosia, suggesting that impaired integration may extend from the level of the face to the level of the full person. PMID:21482423

  16. “Distracters” Do Not Always Distract: Visual Working Memory for Angry Faces is Enhanced by Incidental Emotional Words

    PubMed Central

    Jackson, Margaret C.; Linden, David E. J.; Raymond, Jane E.

    2012-01-01

    We are often required to filter out distraction in order to focus on a primary task during which working memory (WM) is engaged. Previous research has shown that negative versus neutral distracters presented during a visual WM maintenance period significantly impair memory for neutral information. However, the contents of WM are often also emotional in nature. The question we address here is how incidental information might impact upon visual WM when both this and the memory items contain emotional information. We presented emotional versus neutral words during the maintenance interval of an emotional visual WM faces task. Participants encoded two angry or happy faces into WM, and several seconds into a 9 s maintenance period a negative, positive, or neutral word was flashed on the screen three times. A single neutral test face was presented for retrieval with a face identity that was either present or absent in the preceding study array. WM for angry face identities was significantly better when an emotional (negative or positive) versus neutral (or no) word was presented. In contrast, WM for happy face identities was not significantly affected by word valence. These findings suggest that the presence of emotion within an intervening stimulus boosts the emotional value of threat-related information maintained in visual WM and thus improves performance. In addition, we show that incidental events that are emotional in nature do not always distract from an ongoing WM task. PMID:23112782
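Present/absent retrieval decisions in a WM task like this are often summarized with signal-detection sensitivity (d'), which separates discrimination from response bias. A hedged sketch, assuming raw hit/miss/false-alarm counts; the log-linear correction shown is one common choice, not necessarily the authors':

```python
from statistics import NormalDist

# d' = z(hit rate) - z(false-alarm rate); the +0.5 / +1 log-linear
# correction keeps the z-transform finite when a rate is 0 or 1.

def d_prime(hits, misses, false_alarms, correct_rejections):
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)
```

Computed per condition (angry vs. happy faces, by intervening word valence), higher d' for angry faces with emotional words would mirror the result above.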

  17. An exploration of emotional protection and regulation in nurse-patient interactions: The role of the professional face and the emotional mirror.

    PubMed

    Cecil, Penelope; Glass, Nel

    2015-01-01

    While interpersonal styles of nurse-patient communication have become more relaxed in recent years, nurses remain challenged in emotional engagement with patients and other health professionals. In order to preserve a professional distance, however slight, in patient care delivery, nurses need to be able to regulate their emotions. This research aimed to investigate nurses' perceptions of emotional protection and regulation in patient care delivery. A qualitative approach was used for the study utilising in-depth semi-structured interviews and researcher reflective journaling. Participants were drawn from rural New South Wales. Following institutional ethics approval, five nurses were interviewed and reflective journaling commenced. The interviews and the reflective journal were transcribed verbatim. The results revealed that nurses' emotional regulation, demonstrated by a 'professional face', was an important strategy to enable delivery of quality care even though it resulted in emotional containment. Such regulation was a protective mechanism employed to look after self and was critical in situations of emotional dissonance. The results also found that nurses experience emotional dissonance in situations where they have unresolved personal emotional issues, and the latter was an individual motivator to manage emotions in the workplace. Emotions play a pivotal role within nurse-patient relationships. The professional face can be recognised as contributing to emotional health and therefore maintaining the emotional health of nurses in practice. This study foregrounds the importance of regulating emotions and nurturing nurses' emotional health in contemporary practice.

  18. Identification of emotions in mixed disgusted-happy faces as a function of depressive symptom severity.

    PubMed

    Sanchez, Alvaro; Romero, Nuria; Maurage, Pierre; De Raedt, Rudi

    2017-12-01

    Interpersonal difficulties are common in depression, but their underlying mechanisms are not yet fully understood. The role of depression in the identification of mixed emotional signals with a direct interpersonal value remains unclear. The present study aimed to clarify this question. A sample of 39 individuals reporting a broad range of depression levels completed an emotion identification task where they viewed faces expressing three emotional categories (100% disgusted and 100% happy faces, as well as their morphed 50% disgusted - 50% happy exemplars). Participants were asked to identify the corresponding depicted emotion as "clearly disgusted", "mixed", or "clearly happy". Higher depression levels were associated with lower identification of positive emotions in 50% disgusted - 50% happy faces. The study was conducted with an analogue sample reporting individual differences in subclinical depression levels. Further research must replicate these findings in a clinical sample and clarify whether differential emotional identification patterns emerge in depression for different mixed negative-positive emotions (sad-happy vs. disgusted-happy). Depression may account for a lower bias to perceive positive states when ambiguous states from others include subtle signals of social threat (i.e., disgust), leading to an under-perception of positive social signals. Copyright © 2017 Elsevier Ltd. All rights reserved.
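The identification measure here reduces to the proportion of each response label at each morph level, e.g. the rate of "clearly happy" responses at the 50% disgusted - 50% happy level. An illustrative sketch, assuming (morph level, response) tuples:

```python
# Proportion of "clearly happy" identifications per morph level
# (morph level = percent happy in the disgusted-happy continuum).

def happy_identification_rate(responses):
    counts = {}
    for level, label in responses:
        total, happy = counts.get(level, (0, 0))
        counts[level] = (total + 1, happy + (label == "clearly happy"))
    return {level: happy / total for level, (total, happy) in counts.items()}
```

A lower rate at the 50% level for higher-depression participants would correspond to the under-perception of positive signals reported above.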

  19. Direct effects of diazepam on emotional processing in healthy volunteers

    PubMed Central

    Murphy, S. E.; Downham, C.; Cowen, P. J.

    2008-01-01

    Rationale Pharmacological agents used in the treatment of anxiety have been reported to decrease threat relevant processing in patients and healthy controls, suggesting a potentially relevant mechanism of action. However, the effects of the anxiolytic diazepam have typically been examined at sedative doses, which do not allow the direct actions on emotional processing to be fully separated from global effects of the drug on cognition and alertness. Objectives The aim of this study was to investigate the effect of a lower, but still clinically effective, dose of diazepam on emotional processing in healthy volunteers. Materials and methods Twenty-four participants were randomised to receive a single dose of diazepam (5 mg) or placebo. Sixty minutes later, participants completed a battery of psychological tests, including measures of non-emotional cognitive performance (reaction time and sustained attention) and emotional processing (affective modulation of the startle reflex, attentional dot probe, facial expression recognition, and emotional memory). Mood and subjective experience were also measured. Results Diazepam significantly modulated attentional vigilance to masked emotional faces and significantly decreased overall startle reactivity. Diazepam did not significantly affect mood, alertness, response times, facial expression recognition, or sustained attention. Conclusions At non-sedating doses, diazepam produces effects on attentional vigilance and startle responsivity that are consistent with its anxiolytic action. This may be an underlying mechanism through which benzodiazepines exert their therapeutic effects in clinical anxiety. PMID:18581100

  20. Mineralocorticoid receptor haplotype, oral contraceptives and emotional information processing.

    PubMed

    Hamstra, D A; de Kloet, E R; van Hemert, A M; de Rijk, R H; Van der Does, A J W

    2015-02-12

    Oral contraceptives (OCs) affect mood in some women and may have more subtle effects on emotional information processing in many more users. Female carriers of mineralocorticoid receptor (MR) haplotype 2 have been shown to be more optimistic and less vulnerable to depression. We aimed to investigate the effects of oral contraceptives on emotional information processing and a possible moderating effect of MR haplotype in a cross-sectional study of 85 healthy premenopausal women of West-European descent. We found significant main effects of oral contraceptives on facial expression recognition, emotional memory and decision-making. Furthermore, carriers of MR haplotype 1 or 3 were sensitive to the impact of OCs on the recognition of sad and fearful faces and on emotional memory, whereas MR haplotype 2 carriers were not. Different OC compounds were included, and no hormonal measures were taken. Most naturally cycling participants were assessed in the luteal phase of their menstrual cycle. Carriers of MR haplotype 2 may be less sensitive to the depressogenic side-effects of OCs. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  1. Transcutaneous vagus nerve stimulation (tVNS) enhances recognition of emotions in faces but not bodies.

    PubMed

    Sellaro, Roberta; de Gelder, Beatrice; Finisguerra, Alessandra; Colzato, Lorenza S

    2018-02-01

    The polyvagal theory suggests that the vagus nerve is the key phylogenetic substrate enabling optimal social interactions, a crucial aspect of which is emotion recognition. A previous study showed that the vagus nerve plays a causal role in mediating people's ability to recognize emotions based on images of the eye region. The aim of this study is to verify whether the previously reported causal link between vagal activity and emotion recognition can be generalized to situations in which emotions must be inferred from images of whole faces and bodies. To this end, we employed transcutaneous vagus nerve stimulation (tVNS), a novel non-invasive brain stimulation technique that causes the vagus nerve to fire by the application of a mild electrical stimulation to the auricular branch of the vagus nerve, located in the anterior protuberance of the outer ear. In two separate sessions, participants received active or sham tVNS before and while performing two emotion recognition tasks, aimed at indexing their ability to recognize emotions from facial and bodily expressions. Active tVNS, compared to sham stimulation, enhanced emotion recognition for whole faces but not for bodies. Our results confirm and further extend recent observations supporting a causal relationship between vagus nerve activity and the ability to infer others' emotional state, but restrict this association to situations in which the emotional state is conveyed by the whole face and/or by salient facial cues, such as eyes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Gamma Activation in Young People with Autism Spectrum Disorders and Typically-Developing Controls When Viewing Emotions on Faces

    PubMed Central

    Wright, Barry; Alderson-Day, Ben; Prendergast, Garreth; Bennett, Sophie; Jordan, Jo; Whitton, Clare; Gouws, Andre; Jones, Nick; Attur, Ram; Tomlinson, Heather; Green, Gary

    2012-01-01

    Background Behavioural studies have highlighted irregularities in the recognition of facial affect in children and young people with autism spectrum disorders (ASDs). Recent findings from studies utilising electroencephalography (EEG) and magnetoencephalography (MEG) have identified abnormal activation and irregular maintenance of gamma (>30 Hz) range oscillations when ASD individuals attempt basic visual and auditory tasks. Methodology/Principal Findings The pilot study reported here is the first to use spatial filtering techniques in MEG to explore face processing in children with ASD. We set out to examine theoretical suggestions that gamma activation underlying face processing may be different in a group of children and young people with ASD (n = 13) compared to typically developing (TD) age-, gender- and IQ-matched controls. Beamforming and virtual electrode techniques were used to assess spatially localised induced and evoked activity. While lower-band (3–30 Hz) responses to faces were similar between groups, the ASD gamma response in occipital areas was largely absent when viewing emotions on faces. Virtual electrode analysis indicated the presence of intact evoked responses but abnormal induced activity in ASD participants. Conclusions/Significance These findings lend weight to previous suggestions that specific components of the early visual response to emotional faces are abnormal in ASD. Elucidation of the nature and specificity of these findings is worthy of further research. PMID:22859975

  3. The neuroscience of face processing and identification in eyewitnesses and offenders.

    PubMed

    Werner, Nicole-Simone; Kühnel, Sina; Markowitsch, Hans J

    2013-12-06

    Humans are experts in face perception. We are better at distinguishing between faces and their components than between any other kind of object. Several studies investigating the underlying neural networks have provided evidence for deviant face processing in criminal individuals, although results are often confounded by accompanying mental or addiction disorders. On the other hand, face processing in non-criminal healthy persons can be of high juridical interest in cases of witnessing a felony and afterward identifying a culprit. Memory, and therefore recognition of a person, can be affected by many parameters and thus become distorted. Face processing itself is also modulated by different factors such as facial characteristics, degree of familiarity, and emotional relation. These factors make the comparison of different cases, as well as the transfer of laboratory results to real-life settings, very challenging. Several neuroimaging studies have been published in recent years, and some progress has been made in connecting certain brain activation patterns with the correct recognition of an individual. However, there is still a long way to go before brain imaging can make a reliable contribution to court procedures.

  4. The Neuroscience of Face Processing and Identification in Eyewitnesses and Offenders

    PubMed Central

    Werner, Nicole-Simone; Kühnel, Sina; Markowitsch, Hans J.

    2013-01-01

    Humans are experts in face perception. We are better at distinguishing between faces and their components than between any other kind of object. Several studies investigating the underlying neural networks have provided evidence for deviant face processing in criminal individuals, although results are often confounded by accompanying mental or addiction disorders. On the other hand, face processing in non-criminal healthy persons can be of high juridical interest in cases of witnessing a felony and afterward identifying a culprit. Memory, and therefore recognition of a person, can be affected by many parameters and thus become distorted. Face processing itself is also modulated by different factors such as facial characteristics, degree of familiarity, and emotional relation. These factors make the comparison of different cases, as well as the transfer of laboratory results to real-life settings, very challenging. Several neuroimaging studies have been published in recent years, and some progress has been made in connecting certain brain activation patterns with the correct recognition of an individual. However, there is still a long way to go before brain imaging can make a reliable contribution to court procedures. PMID:24367306

  5. Implicit conditioning of faces via the social regulation of emotion: ERP evidence of early attentional biases for security conditioned faces.

    PubMed

    Beckes, Lane; Coan, James A; Morris, James P

    2013-08-01

    Not much is known about the neural and psychological processes that promote the initial conditions necessary for positive social bonding. This study explores one method of conditioned bonding utilizing dynamics related to the social regulation of emotion and attachment theory. This form of conditioning involves repeated presentations of negative stimuli followed by images of warm, smiling faces. L. Beckes, J. Simpson, and A. Erickson (2010) found that this conditioning procedure results in positive associations with the faces measured via a lexical decision task, suggesting they are perceived as comforting. This study found that the P1 ERP was similarly modified by this conditioning procedure and the P1 amplitude predicted lexical decision times to insecure words primed by the faces. The findings have implications for understanding how the brain detects supportive people, the flexibility and modifiability of early ERP components, and social bonding more broadly. Copyright © 2013 Society for Psychophysiological Research.

  6. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments.

    PubMed

    Murphy, Karen; Ward, Zoe

    2017-01-01

    Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), or emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross-task comparisons revealed larger complete RB effects for the expression and gender judgments than for the identity decisions. There was a larger emotion RB effect for the expression than the gender judgments, and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and the relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition.

  7. Repetition Blindness for Faces: A Comparison of Face Identity, Expression, and Gender Judgments

    PubMed Central

    Murphy, Karen; Ward, Zoe

    2017-01-01

    Repetition blindness (RB) refers to the impairment in reporting two identical targets within a rapid serial visual presentation stream. While numerous studies have demonstrated RB for words and pictures of objects, very few studies have examined RB for faces. This study extended this research by examining RB when the two faces were complete repeats (same emotion and identity), identity repeats (same individual, different emotion), or emotion repeats (different individual, same emotion) for identity, gender, and expression judgment tasks. Complete RB and identity RB effects were evident for all three judgment tasks. Emotion RB was only evident for the expression and gender judgments. Complete RB effects were larger than emotion or identity RB effects across all judgment tasks. For the expression judgments, there was more emotion than identity RB. The identity RB effect was larger than the emotion RB effect for the gender judgments. Cross-task comparisons revealed larger complete RB effects for the expression and gender judgments than for the identity decisions. There was a larger emotion RB effect for the expression than the gender judgments, and the identity RB effect was larger for the gender than for the identity and expression judgments. These results indicate that while faces are subject to RB, this is affected by the type of repeated information and the relevance of the facial characteristic to the judgment decision. This study provides further support for the operation of separate processing mechanisms for face gender, emotion, and identity information within models of face recognition. PMID:29038663

  8. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces.

    PubMed

    Hornung, Jonas; Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus a putative human chemosignal. Androstadienone has been studied with respect to its effects on mood states, attractiveness ratings, and physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex, and individual sensitivity to the smell of androstadienone. To do so, we investigated 56 healthy individuals (29 of whom were women taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone and once under placebo exposure, in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing, while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference in the emotional Stroop task was apparent only for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times between congruent and incongruent trials. Women were also slightly affected by smelling androstadienone, classifying angry faces correctly more often under androstadienone. For the emotional dot-probe task, no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected.

  9. The human body odor compound androstadienone leads to anger-dependent effects in an emotional Stroop but not dot-probe task using human faces

    PubMed Central

    Kogler, Lydia; Wolpert, Stephan; Freiherr, Jessica; Derntl, Birgit

    2017-01-01

    The androgen derivative androstadienone is a substance found in human sweat and thus a putative human chemosignal. Androstadienone has been studied with respect to its effects on mood states, attractiveness ratings, and physiological and neural activation. With the current experiment, we aimed to explore in which way androstadienone affects attention to social cues (human faces). Moreover, we wanted to test whether effects depend on specific emotions, the participants' sex, and individual sensitivity to the smell of androstadienone. To do so, we investigated 56 healthy individuals (29 of whom were women taking oral contraceptives) with two attention tasks on two consecutive days (once under androstadienone and once under placebo exposure, in pseudorandomized order). With an emotional dot-probe task we measured visuo-spatial cueing, while an emotional Stroop task allowed us to investigate interference control. Our results suggest that androstadienone acts in a sex-, task- and emotion-specific manner, as a reduction in interference in the emotional Stroop task was apparent only for angry faces in men under androstadienone exposure. More specifically, men showed a smaller difference in reaction times between congruent and incongruent trials. Women were also slightly affected by smelling androstadienone, classifying angry faces correctly more often under androstadienone. For the emotional dot-probe task, no modulation by androstadienone was observed. Furthermore, in both attention paradigms individual sensitivity to androstadienone was correlated with neither reaction times nor error rates in men and women. To conclude, exposure to androstadienone seems to potentiate the relevance of angry faces in both men and women in connection with interference control, while processes of visuo-spatial cueing remain unaffected. PMID:28369152
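    The key Stroop outcome in the androstadienone studies above is a difference score between trial types. A minimal sketch of this interference index (the function name and RT values are hypothetical, not the authors' analysis code):

```python
from statistics import mean

def stroop_interference(congruent_rts_ms, incongruent_rts_ms):
    """Emotional Stroop interference index: mean reaction time on
    incongruent trials minus mean reaction time on congruent trials.
    Larger values indicate more interference from the task-irrelevant
    face dimension."""
    return mean(incongruent_rts_ms) - mean(congruent_rts_ms)

# Hypothetical reaction times (ms). A smaller score in the second
# condition would correspond to the reduced interference reported
# for men under androstadienone exposure.
placebo_score = stroop_interference([510, 530, 525], [590, 605, 580])  # 70.0
drug_score = stroop_interference([515, 528, 522], [560, 548, 552])
```

The reported sex- and emotion-specific effect would then show up as this score shrinking under androstadienone for angry-face trials in men only.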

  10. Association between Ability Emotional Intelligence and Left Insula during Social Judgment of Facial Emotions

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta

    2016-01-01

    The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain “somatic marker circuitry” (SMC) sustains the emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific. PMID:26859495

  11. Amygdala activity and prefrontal cortex-amygdala effective connectivity to emerging emotional faces distinguish remitted and depressed mood states in bipolar disorder.

    PubMed

    Perlman, Susan B; Almeida, Jorge R C; Kronhaus, Dina M; Versace, Amelia; Labarbara, Edmund J; Klein, Crystal R; Phillips, Mary L

    2012-03-01

    Few studies have employed effective connectivity (EC) to examine the functional integrity of neural circuitry supporting abnormal emotion processing in bipolar disorder (BD), a key feature of the illness. We used Granger Causality Mapping (GCM) to map EC between the prefrontal cortex (PFC) and bilateral amygdala and a novel paradigm to assess emotion processing in adults with BD. Thirty-one remitted adults with BD [(remitted BD), mean age = 32 years], 21 adults with BD in a depressed episode [(depressed BD), mean age = 33 years], and 25 healthy control participants [(HC), mean age = 31 years] performed a block-design emotion processing task requiring color-labeling of a color flash superimposed on a task-irrelevant face morphing from neutral to emotional (happy, sad, angry, or fearful). GCM measured EC preceding (top-down) and following (bottom-up) activity between the PFC and the left and right amygdalae. Our findings indicated patterns of abnormally elevated bilateral amygdala activity in response to emerging fearful, sad, and angry facial expressions in remitted-BD subjects versus HC, and abnormally elevated right amygdala activity to emerging fearful faces in depressed-BD subjects versus HC. We also showed distinguishable patterns of abnormal EC between the amygdala and dorsomedial and ventrolateral PFC, especially to emerging happy and sad facial expressions in remitted-BD and depressed-BD subjects. EC measures of neural system level functioning can further understanding of neural mechanisms associated with abnormal emotion processing and regulation in BD. Our findings suggest major differences in recruitment of amygdala-PFC circuitry, supporting implicit emotion processing between remitted-BD and depressed-BD subjects, which may underlie changes from remission to depression in BD. © 2012 John Wiley and Sons A/S.

  12. Cocaine Exposure Is Associated with Subtle Compromises of Infants' and Mothers' Social-Emotional Behavior and Dyadic Features of Their Interaction in the Face-to-Face Still-Face Paradigm

    ERIC Educational Resources Information Center

    Tronick, E. Z.; Messinger, D. S.; Weinberg, M. K.; Lester, B. M.; LaGasse, L.; Seifer, R.; Bauer, C. R.; Shankaran, S.; Bada, H.; Wright, L. L.; Poole, K.; Liu, J.

    2005-01-01

    Prenatal cocaine and opiate exposure are thought to subtly compromise social and emotional development. The authors observed a large sample of 236 cocaine-exposed and 459 nonexposed infants (49 were opiate exposed and 646 nonexposed) with their mothers in the face-to-face still-face paradigm. Infant and maternal behaviors were microanalytically…

  13. Abnormal functional connectivity of EEG gamma band in patients with depression during emotional face processing.

    PubMed

    Li, Yingjie; Cao, Dan; Wei, Ling; Tang, Yingying; Wang, Jijun

    2015-11-01

    This paper evaluates the large-scale structure of functional brain networks using graph theoretical concepts and investigates the difference in brain functional networks between patients with depression and healthy controls while they process emotional stimuli. Electroencephalography (EEG) activity was recorded from 16 patients with depression and 14 healthy controls while they performed a spatial search task for facial expressions. Correlations between all possible pairs of 59 electrodes were determined by coherence, and the coherence matrices were calculated in the delta, theta, alpha, beta, and gamma bands (low gamma: 30-50 Hz; high gamma: 50-80 Hz). Graph theoretical analysis was applied to these matrices using two indexes: the clustering coefficient and the characteristic path length. The global EEG coherence of patients with depression was significantly higher than that of healthy controls in both gamma bands, especially the high gamma band. The global coherence in both gamma bands in healthy controls was higher in negative conditions than in positive conditions. All the brain networks were found to hold a regular, ordered topology during emotion processing; however, the networks of patients with depression appeared randomized compared with the normal ones. The abnormal network topology of patients with depression was detected in both the prefrontal and occipital regions. The negative bias seen in healthy controls occurred in both gamma bands during emotion processing, while it disappeared in patients with depression. By combining the clustering coefficient and the characteristic path length, this work thus identified abnormally increased connectivity in the brain functional networks of patients with depression: both groups' networks were regular during emotion processing, yet those of the depressed group showed trends toward randomization.
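    The two graph indexes named above, the clustering coefficient and the characteristic path length, are standard small-world measures. The sketch below illustrates how they can be computed from a coherence matrix binarized at a threshold; the toy matrix, threshold value, and function names are hypothetical and are not the authors' pipeline.

```python
from itertools import combinations
from collections import deque

def threshold_graph(coh, tau):
    """Binarize a symmetric coherence matrix into an undirected,
    unweighted adjacency matrix: edge i-j iff coherence >= tau."""
    n = len(coh)
    return [[1 if i != j and coh[i][j] >= tau else 0 for j in range(n)]
            for i in range(n)]

def clustering_coefficient(adj):
    """Mean local clustering coefficient: for each node, the fraction
    of its neighbour pairs that are themselves connected."""
    n = len(adj)
    coeffs = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j]]
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(adj[u][v] for u, v in combinations(nbrs, 2))
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / n

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected node pairs,
    computed by breadth-first search from every node."""
    n = len(adj)
    total, pairs = 0, 0
    for s in range(n):
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for t, d in dist.items():
            if t != s:
                total += d
                pairs += 1
    return total / pairs if pairs else float("inf")

# Toy 4-channel "coherence" matrix (symmetric, hypothetical values).
coh = [[1.0, 0.9, 0.8, 0.7],
       [0.9, 1.0, 0.85, 0.1],
       [0.8, 0.85, 1.0, 0.2],
       [0.7, 0.1, 0.2, 1.0]]
adj = threshold_graph(coh, tau=0.6)
print(clustering_coefficient(adj))      # 7/12 ≈ 0.583
print(characteristic_path_length(adj))  # 4/3 ≈ 1.333
```

A regular (ordered) network has a high clustering coefficient and a long characteristic path length, while a random network has the opposite profile, which is the contrast the paper uses to describe the depressed group's networks drifting toward randomization.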

  14. Neural Processing of Facial Identity and Emotion in Infants at High-Risk for Autism Spectrum Disorders

    PubMed Central

    Fox, Sharon E.; Wagner, Jennifer B.; Shrock, Christine L.; Tager-Flusberg, Helen; Nelson, Charles A.

    2013-01-01

    Deficits in face processing and social impairment are core characteristics of autism spectrum disorder. The present work examined 7-month-old infants at high-risk for developing autism and typically developing controls at low-risk, using a face perception task designed to differentiate between the effects of face identity and facial emotions on neural response using functional Near-Infrared Spectroscopy. In addition, we employed independent component analysis, as well as a novel method of condition-related component selection and classification to identify group differences in hemodynamic waveforms and response distributions associated with face and emotion processing. The results indicate similarities of waveforms, but differences in the magnitude, spatial distribution, and timing of responses between groups. These early differences in local cortical regions and the hemodynamic response may, in turn, contribute to differences in patterns of functional connectivity. PMID:23576966

  15. Childhood anxiety and attention to emotion faces in a modified stroop task.

    PubMed

    Hadwin, Julie A; Donnelly, Nick; Richards, Anne; French, Christopher C; Patel, Umang

    2009-06-01

    This study used an emotional face Stroop task to investigate the effects of self-reported trait anxiety, social concern (SC), and chronological age (CA) on reaction time when matching coloured outlines of angry, happy, and neutral faces (and control faces with scrambled features) to coloured buttons in a community sample of 74 children aged 6-12 years. The results showed interference with colour matching for angry (relative to neutral) faces in children with elevated SC. The same effect was not found for happy or control faces. In addition, the results suggest that selective attention to angry faces in children with SC was not significantly moderated by age.

  16. Age-Group Differences in Interference from Young and Older Emotional Faces.

    PubMed

    Ebner, Natalie C; Johnson, Marcia K

    2010-11-01

    Human attention is selective, focusing on some aspects of events at the expense of others. In particular, angry faces engage attention. Most studies have used pictures of young faces, even when comparing young and older age groups. Two experiments asked (1) whether task-irrelevant faces of young and older individuals with happy, angry, and neutral expressions disrupt performance on a face-unrelated task, (2) whether interference varies for faces of different ages and different facial expressions, and (3) whether young and older adults differ in this regard. Participants gave speeded responses on a number task while irrelevant faces appeared in the background. Both age groups were more distracted by own than other-age faces. In addition, young participants' responses were slower for angry than happy faces, whereas older participants' responses were slower for happy than angry faces. Factors underlying age-group differences in interference from emotional faces of different ages are discussed.

  17. Selective attention modulates high-frequency activity in the face-processing network.

    PubMed

    Müsch, Kathrin; Hamamé, Carlos M; Perrone-Bertolotti, Marcela; Minotti, Lorella; Kahane, Philippe; Engel, Andreas K; Lachaux, Jean-Philippe; Schneider, Till R

    2014-11-01

    Face processing depends on the orchestrated activity of a large-scale neuronal network. Its activity can be modulated by attention as a function of task demands. However, it remains largely unknown whether voluntary, endogenous attention and reflexive, exogenous attention to facial expressions equally affect all regions of the face-processing network, and whether such effects primarily modify the strength of the neuronal response, the latency, the duration, or the spectral characteristics. We exploited the good temporal and spatial resolution of intracranial electroencephalography (iEEG) and recorded from depth electrodes to uncover the fast dynamics of emotional face processing. We investigated frequency-specific responses and event-related potentials (ERP) in the ventral occipito-temporal cortex (VOTC), ventral temporal cortex (VTC), anterior insula, orbitofrontal cortex (OFC), and amygdala when facial expressions were task-relevant or task-irrelevant. All investigated regions of interest (ROI) were clearly modulated by task demands and exhibited stronger changes in stimulus-induced gamma band activity (50-150 Hz) when facial expressions were task-relevant. Observed latencies demonstrate that the activation is temporally coordinated across the network, rather than serially proceeding along a processing hierarchy. Early and sustained responses to task-relevant faces in the VOTC and VTC corroborate their role in the core system of face processing, but such responses also occurred in the anterior insula. Strong attentional modulation in the OFC and amygdala (300 msec) suggests that the extended system of the face-processing network is only recruited if the task demands active face processing. Contrary to our expectation, we rarely observed differences between fearful and neutral faces. Our results demonstrate that activity in the face-processing network is susceptible to the deployment of selective attention. Moreover, we show that endogenous attention operates along the whole face-processing network.

  18. The perception of emotion-free faces in schizophrenia: a magneto-encephalography study.

    PubMed

    López-Ibor, Juan J; López-Ibor, María-Inés; Méndez, María-Andreína; Morón, María-Dolores; Ortiz-Terán, Laura; Fernandez, Alberto; Diaz-Marsá, Marina; Ortiz, Tomás

    2008-01-01

    To analyze how patients suffering from schizophrenia perceive faces of unknown individuals showing no actual emotion, in order to investigate the attribution of meanings to a relatively non-significant but complex sensory experience. Analysis of baseline and post-stimulation magneto-encephalographic recordings. The stimuli consisted of four pictures, with neutral emotional expression, of male and female faces of Spanish individuals unknown to the research subjects. Twelve right-handed patients suffering from schizophrenia (DSM-IV-TR criteria), aged 18-65, with active delusional activity (SAPS score on delusions above 39), and 15 right-handed sex- and age-matched control subjects participated. Compared to controls, patients showed significantly higher activity in both hemispheres (0-700 ms), with higher activity in the right hemisphere (RH) than in the left hemisphere (LH). Patients also showed higher activity in the middle fusiform gyrus (BA 37) in the LH (200-300 ms), in the superior temporal areas (BA 22, 41 and 42) in both hemispheres (100-700 ms), and in the temporal pole (BA 38) in the RH (300-400 ms), with lower activity of the latter in the LH. The areas more activated in our study are those involved in the process of thinking, in attributing meanings to perceptions, and in activities such as theory of mind, which are essential for social interaction. The reduced activation of anterior temporal areas indicates a reduced semantic memory for faces that could explain the social withdrawal of schizophrenia. These alterations are suggestive of a dysfunction of left-hemisphere neuronal networks.

  19. Age and gender modulate the neural circuitry supporting facial emotion processing in adults with major depressive disorder.

    PubMed

    Briceño, Emily M; Rapport, Lisa J; Kassel, Michelle T; Bieliauskas, Linas A; Zubieta, Jon-Kar; Weisenbach, Sara L; Langenecker, Scott A

    2015-03-01

    Emotion processing, supported by frontolimbic circuitry known to be sensitive to the effects of aging, is a relatively understudied cognitive-emotional domain in geriatric depression. Some evidence suggests that the neurophysiological disruption observed in emotion processing among adults with major depressive disorder (MDD) may be modulated by both gender and age. Therefore, the present study investigated the effects of gender and age on the neural circuitry supporting emotion processing in MDD, in a cross-sectional comparison of fMRI signal during performance of an emotion processing task in an outpatient university setting. One hundred adults were recruited by MDD status, gender, and age. Participants underwent fMRI while completing the Facial Emotion Perception Test: they viewed photographs of faces and categorized the emotion perceived. The fMRI contrast was face perception minus animal-identification blocks. Effects of depression were observed in the precuneus and effects of age in a number of frontolimbic regions. Three-way interactions were present between MDD status, gender, and age in regions pertinent to emotion processing, including frontal, limbic, and basal ganglia regions. Young women with MDD and older men with MDD exhibited hyperactivation in these regions compared with their respective same-gender healthy comparison (HC) counterparts. In contrast, older women and younger men with MDD exhibited hypoactivation compared to their respective same-gender HC counterparts. This is the first study to report gender- and age-specific differences in emotion processing circuitry in MDD. Gender-differential mechanisms may underlie cognitive-emotional disruption in older adults with MDD. The present findings have implications for improved probes into the heterogeneity of the MDD syndrome. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.

  20. About-face on face recognition ability and holistic processing.

    PubMed

    Richler, Jennifer J; Floyd, R Jackie; Gauthier, Isabel

    2015-01-01

    Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically.

  2. Visual processing of emotional expressions in mixed anxious-depressed subclinical state: an event-related potential study on a female sample.

    PubMed

    Rossignol, M; Philippot, P; Crommelinck, M; Campanella, S

    2008-10-01

    Controversy remains about the existence and the nature of a specific bias in emotional facial expression processing in mixed anxious-depressed state (MAD). Event-related potentials were recorded in the following three types of groups defined by the Spielberger state and trait anxiety inventory (STAI) and the Beck depression inventory (BDI): a group of anxious participants (n=12), a group of participants with depressive and anxious tendencies (n=12), and a control group (n=12). Participants were confronted with a visual oddball task in which they had to detect, as quickly as possible, deviant faces amongst a train of standard neutral faces. Deviant stimuli changed either on identity, or on emotion (happy or sad expression). Anxiety facilitated emotional processing and the two anxious groups produced quicker responses than control participants; these effects were correlated with an earlier decisional wave (P3b) for anxious participants. Mixed anxious-depressed participants showed enhanced visual processing of deviant stimuli and produced higher amplitude in attentional complex (N2b/P3a), both for identity and emotional trials. P3a was also particularly increased for emotional faces in this group. Anxious state mainly influenced later decision processes (shorter latency of P3b), whereas mixed anxious-depressed state acted on earlier steps of emotional processing (enhanced N2b/P3a complex). Mixed anxious-depressed individuals seemed more reactive to any visual change, particularly emotional change, without displaying any valence bias.

  3. Affective blindsight in the absence of input from face processing regions in occipital-temporal cortex.

    PubMed

    Striemer, Christopher L; Whitwell, Robert L; Goodale, Melvyn A

    2017-11-12

    Previous research suggests that the implicit recognition of emotional expressions may be carried out by pathways that bypass primary visual cortex (V1) and project to the amygdala. Some of the strongest evidence supporting this claim comes from case studies of "affective blindsight" in which patients with V1 damage can correctly guess whether an unseen face was depicting a fearful or happy expression. In the current study, we report a new case of affective blindsight in patient MC, who is cortically blind following extensive bilateral lesions to V1, as well as face and object processing regions in her ventral visual stream. Despite her large lesions, MC has preserved motion perception, which is related to sparing of the motion sensitive region MT+ in both hemispheres. To examine affective blindsight in MC we asked her to perform gender and emotion discrimination tasks in which she had to guess, using a two-alternative forced-choice procedure, whether the face presented was male or female, happy or fearful, or happy or angry. In addition, we also tested MC in a four-alternative forced-choice target localization task. Results indicated that MC was not able to determine the gender of the faces (53% accuracy), or localize targets in a forced-choice task. However, she was able to determine, at above chance levels, whether the face presented was depicting a happy or fearful (67%, p = .006), or a happy or angry (64%, p = .025) expression. Interestingly, although MC was better than chance at discriminating between emotions in faces when asked to make rapid judgments, her performance fell to chance when she was asked to provide subjective confidence ratings about her performance. These data lend further support to the idea that there is a non-conscious visual pathway that bypasses V1 and is capable of processing affective signals from facial expressions without input from higher-order face and object processing regions in the ventral visual stream. Copyright © 2017
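
    The above-chance claims here (67% and 64% correct in two-alternative forced choice) rest on comparing observed accuracy against a 50% guessing rate, typically with an exact binomial test. A minimal sketch of such a test follows; the trial count is purely illustrative, as the study's actual number of trials is not given in this abstract.

```python
from math import comb

def binom_p_above_chance(n_trials: int, n_correct: int, p_chance: float = 0.5) -> float:
    """One-sided exact binomial probability of scoring n_correct or more
    out of n_trials when responding at the guessing rate p_chance."""
    return sum(
        comb(n_trials, k) * p_chance ** k * (1 - p_chance) ** (n_trials - k)
        for k in range(n_correct, n_trials + 1)
    )

# Illustrative: 40/60 correct (~67%) in a 2AFC task against 50% chance
print(binom_p_above_chance(60, 40))
```

    A small p-value here indicates performance unlikely to arise from guessing alone, which is the logic behind labeling discrimination "above chance" despite the absence of conscious report.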

  4. The impact of oxytocin administration and maternal love withdrawal on event-related potential (ERP) responses to emotional faces with performance feedback.

    PubMed

    Huffmeijer, Renske; Alink, Lenneke R A; Tops, Mattie; Grewen, Karen M; Light, Kathleen C; Bakermans-Kranenburg, Marian J; van Ijzendoorn, Marinus H

    2013-03-01

    This is the first experimental study to use event-related potentials (ERPs) to examine the effect of oxytocin administration on the neural processing of facial stimuli in female participants. Using a double-blind, placebo-controlled within-subjects design, we studied the effects of 16 IU of intranasal oxytocin on ERPs to pictures combining performance feedback with emotional facial expressions in 48 female undergraduate students. Participants also reported on the amount of love withdrawal they experienced from their mothers. Vertex positive potential (VPP) and late positive potential (LPP) amplitudes were more positive after oxytocin compared to placebo administration. This suggests that oxytocin increased attention to the feedback stimuli (LPP) and enhanced the processing of emotional faces (VPP). Oxytocin heightened processing of the happy and disgusted faces primarily for those reporting less love withdrawal. Significant associations with LPP amplitude suggest that more maternal love withdrawal relates to the allocation of attention toward the motivationally relevant combination of negative feedback with a disgusted face. Copyright © 2012 Elsevier Inc. All rights reserved.

  5. Effect of distracting faces on visual selective attention in the monkey.

    PubMed

    Landman, Rogier; Sharma, Jitendra; Sur, Mriganka; Desimone, Robert

    2014-12-16

    In primates, visual stimuli with social and emotional content tend to attract attention. Attention might be captured through rapid, automatic, subcortical processing or guided by slower, more voluntary cortical processing. Here we examined whether irrelevant faces with varied emotional expressions interfere with a covert attention task in macaque monkeys. In the task, the monkeys monitored a target grating in the periphery for a subtle color change while ignoring distracters that included faces appearing elsewhere on the screen. The onset time of distracter faces before the target change, as well as their spatial proximity to the target, was varied from trial to trial. The presence of faces, especially faces with emotional expressions, interfered with the task, indicating a competition for attentional resources between the task and the face stimuli. However, this interference was significant only when faces were presented for greater than 200 ms. Emotional faces also affected saccade velocity and reduced the pupillary reflex. Our results indicate that the attraction of attention by emotional faces in the monkey takes a considerable amount of processing time, possibly involving cortical-subcortical interactions. Intranasal application of the hormone oxytocin ameliorated the interfering effects of faces. Together these results provide evidence for slow modulation of attention by emotional distracters, which likely involves oxytocinergic brain circuits.

  6. Memory for faces: the effect of facial appearance and the context in which the face is encountered.

    PubMed

    Mattarozzi, Katia; Todorov, Alexander; Codispoti, Maurizio

    2015-03-01

    We investigated the effects of appearance of emotionally neutral faces and the context in which the faces are encountered on incidental face memory. To approximate real-life situations as closely as possible, faces were embedded in a newspaper article, with a headline that specified an action performed by the person pictured. We found that facial appearance affected memory so that faces perceived as trustworthy or untrustworthy were remembered better than neutral ones. Furthermore, the memory of untrustworthy faces was slightly better than that of trustworthy faces. The emotional context of encoding affected the details of face memory. Faces encountered in a neutral context were more likely to be recognized as only familiar. In contrast, emotionally relevant contexts of encoding, whether pleasant or unpleasant, increased the likelihood of remembering semantic and even episodic details associated with faces. These findings suggest that facial appearance (i.e., perceived trustworthiness) affects face memory. Moreover, the findings support prior evidence that the engagement of emotion processing during memory encoding increases the likelihood that events are not only recognized but also remembered.

  7. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    PubMed

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  8. Reading emotions from faces in two indigenous societies.

    PubMed

    Crivelli, Carlos; Jarillo, Sergio; Russell, James A; Fernández-Dols, José-Miguel

    2016-07-01

    That all humans recognize certain specific emotions from their facial expression-the Universality Thesis-is a pillar of research, theory, and application in the psychology of emotion. Its most rigorous test occurs in indigenous societies with limited contact with external cultural influences, but such tests are scarce. Here we report 2 such tests. Study 1 was of children and adolescents (N = 68; aged 6-16 years) of the Trobriand Islands (Papua New Guinea, South Pacific) with a Western control group from Spain (N = 113, of similar ages). Study 2 was of children and adolescents (N = 36; same age range) of Matemo Island (Mozambique, Africa). In both studies, participants were shown an array of prototypical facial expressions and asked to point to the person feeling a specific emotion: happiness, fear, anger, disgust, or sadness. The Spanish control group matched faces to emotions as predicted by the Universality Thesis: matching was seen on 83% to 100% of trials. For the indigenous societies, in both studies, the Universality Thesis was moderately supported for happiness: smiles were matched to happiness on 58% and 56% of trials, respectively. For other emotions, however, results were even more modest: 7% to 46% in the Trobriand Islands and 22% to 53% in Matemo Island. These results were robust across age, gender, static versus dynamic display of the facial expressions, and between- versus within-subjects design. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  9. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    PubMed

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussion athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional face expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across the sexes. (JINS, 2018, 24, 1-11).

  10. ERP Correlates of Target-Distracter Differentiation in Repeated Runs of a Continuous Recognition Task with Emotional and Neutral Faces

    ERIC Educational Resources Information Center

    Treese, Anne-Cecile; Johansson, Mikael; Lindgren, Magnus

    2010-01-01

    The emotional salience of faces has previously been shown to induce memory distortions in recognition memory tasks. This event-related potential (ERP) study used repeated runs of a continuous recognition task with emotional and neutral faces to investigate emotion-induced memory distortions. In the second and third runs, participants made more…

  11. Effect of short-term escitalopram treatment on neural activation during emotional processing.

    PubMed

    Maron, Eduard; Wall, Matt; Norbury, Ray; Godlewska, Beata; Terbeck, Sylvia; Cowen, Philip; Matthews, Paul; Nutt, David J

    2016-01-01

    Recent functional magnetic resonance imaging (fMRI) studies have revealed that subchronic medication with escitalopram leads to significant reduction in both amygdala and medial frontal gyrus reactivity during processing of emotional faces, suggesting that escitalopram may have a distinguishable modulatory effect on neural activation as compared with other serotonin-selective antidepressants. In this fMRI study we aimed to explore whether short-term medication with escitalopram in healthy volunteers is associated with reduced neural response to emotional processing, and whether this effect is predicted by drug plasma concentration. The neural response to fearful and happy faces was measured before and on day 7 of treatment with escitalopram (10 mg) in 15 healthy volunteers and compared with those in an unmedicated control group (n=14). Significantly reduced activation to fearful, but not to happy, facial expressions was observed in the bilateral amygdala, cingulate, and right medial frontal gyrus following escitalopram medication. This effect was not correlated with plasma drug concentration. In accordance with previous data, we showed that escitalopram exerts its rapid direct effect on emotional processing via attenuation of neural activation in pathways involving the medial frontal gyrus and amygdala, an effect that seems to be distinguishable from that of other SSRIs. © The Author(s) 2015.

  12. Identity-expression interaction in face perception: sex, visual field, and psychophysical factors.

    PubMed

    Godard, Ornella; Baudouin, Jean-Yves; Bonnet, Philippe; Fiori, Nicole

    2013-01-01

    We investigated the psychophysical factors underlying the identity-emotion interaction in face perception. Visual field and sex were also taken into account. Participants had to judge whether a probe face, presented in either the left or the right visual field, and a central target face belonged to the same person while emotional expression varied (Experiment 1), or to judge whether probe and target faces expressed the same emotion while identity was manipulated (Experiment 2). For accuracy, we replicated the mutual facilitation effect between identity and emotion; no sex or hemispheric differences were found. Processing speed measurements, however, showed a lesser degree of interference in women than in men, especially for matching identity when faces expressed different emotions after a left visual field probe-face presentation. Psychophysical indices can be used to determine whether these effects are perceptual (A') or instead arise at a post-perceptual decision-making stage (B"). The influence of identity on the processing of facial emotion seems to be due to perceptual factors, whereas the influence of emotion changes on identity processing seems to be related to decisional factors. In addition, men seem to be more "conservative" after a LVF/RH probe-face presentation when processing identity. Women seem to benefit from a better ability to extract the invariant, identity-related aspects of faces.
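
    The A' (sensitivity) and B" (response bias) indices mentioned here are the standard non-parametric signal-detection measures (Grier, 1971), computed from hit and false-alarm rates. A minimal sketch, assuming the hit rate is at least the false-alarm rate:

```python
def a_prime(h: float, f: float) -> float:
    """Grier's non-parametric sensitivity index A' (assumes h >= f).
    0.5 = chance-level discrimination, 1.0 = perfect discrimination."""
    return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))

def b_double_prime(h: float, f: float) -> float:
    """Grier's non-parametric response-bias index B'' (assumes h >= f).
    0 = neutral, positive = conservative, negative = liberal responding."""
    return (h * (1 - h) - f * (1 - f)) / (h * (1 - h) + f * (1 - f))

# Example: hit rate .80, false-alarm rate .20
print(a_prime(0.80, 0.20))         # sensitivity well above the 0.5 chance level
print(b_double_prime(0.80, 0.20))  # symmetric rates yield zero bias
```

    Separating A' from B" is what lets the authors attribute one effect to perception and the other to post-perceptual decision criteria.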

  13. Gender-specific disruptions in emotion processing in younger adults with depression.

    PubMed

    Wright, Sara L; Langenecker, Scott A; Deldin, Patricia J; Rapport, Lisa J; Nielson, Kristy A; Kade, Allison M; Own, Lawrence S; Akil, Huda; Young, Elizabeth A; Zubieta, Jon-Kar

    2009-01-01

    One of the principal theories regarding the biological basis of major depressive disorder (MDD) implicates a dysregulation of emotion-processing circuitry. Gender differences in how emotions are processed and relative experience with emotion processing might help to explain some of the disparities in the prevalence of MDD between women and men. This study sought to explore how gender and depression status relate to emotion processing. This study employed a 2 (MDD status) x 2 (gender) factorial design to explore differences in classifications of posed facial emotional expressions (N=151). For errors, there was an interaction between gender and depression status. Women with MDD made more errors than did nondepressed women and men with MDD, particularly for fearful and sad stimuli (Ps <.02), which they were likely to misinterpret as angry (Ps <.04). There was also an interaction of diagnosis and gender for response cost for negative stimuli, with significantly greater interference from negative faces present in women with MDD compared to nondepressed women (P=.01). Men with MDD, conversely, performed similarly to control men (P=.61). These results provide novel and intriguing evidence that depression in younger adults (<35 years) differentially disrupts emotion processing in women as compared to men. This interaction could be driven by neurobiological and social learning mechanisms, or interactions between them, and may underlie differences in the prevalence of depression in women and men. (c) 2009 Wiley-Liss, Inc.

  14. Altered insular activation and increased insular functional connectivity during sad and happy face processing in adolescent major depressive disorder.

    PubMed

    Henje Blom, Eva; Connolly, Colm G; Ho, Tiffany C; LeWinn, Kaja Z; Mobayed, Nisreen; Han, Laura; Paulus, Martin P; Wu, Jing; Simmons, Alan N; Yang, Tony T

    2015-06-01

    Major depressive disorder (MDD) is a leading cause of disability worldwide and commonly first occurs during adolescence. The insular cortex (IC) plays an important role in integrating emotion processing with interoception and has recently been implicated in the pathophysiology of adult and adolescent MDD. However, no studies have yet specifically examined the IC in adolescent MDD during processing of faces in the sad-happy continuum. Thus, the aim of the present study is to investigate the IC during sad and happy face processing in adolescents with MDD compared to healthy controls (HCL). Thirty-one adolescents (22 female) with MDD and 36 (23 female) HCL underwent a well-validated emotional processing fMRI paradigm that included sad and happy face stimuli. The MDD group showed significantly less differential activation of the anterior/middle insular cortex (AMIC) in response to sad versus happy faces compared to the HCL group. AMIC also showed greater functional connectivity with right fusiform gyrus, left middle frontal gyrus, and right amygdala/parahippocampal gyrus in the MDD compared to HCL group. Moreover, differential activation to sad and happy faces in AMIC correlated negatively with depression severity within the MDD group. The narrow age range and cross-sectional design precluded assessment of the development of the AMIC in adolescent depression. Given the role of the IC in integrating bodily stimuli with conscious cognitive and emotional processes, our findings of aberrant AMIC function in adolescent MDD provide a neuroscientific rationale for targeting the AMIC in the development of new treatment modalities. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Effects of methylphenidate during emotional processing in amphetamine users: preliminary findings.

    PubMed

    Bottelier, M A; Schouw, M L J; de Ruiter, M B; Ruhe, H G; Lindauer, R J L; Reneman, L

    2015-12-01

    D-amphetamine (dAMPH) and methylphenidate (MPH) are stimulants used in the treatment of Attention Deficit Hyperactivity Disorder (ADHD). Preclinical studies have shown that in healthy animals, dAMPH induces dopamine (DA) dysfunction, as evidenced for instance by loss of DA levels and its transporters. It has also been suggested that DA plays an important role in emotional processing, and that altered DA-ergic intervention may modulate amygdala function. To explore the role of the DA system in emotional processing we examined emotional processing using functional magnetic resonance imaging (fMRI) in eight male recreational users of dAMPH and eight male healthy controls. We compared brain activation between both groups during an emotional face-processing task with and without an oral MPH challenge. All subjects were abstinent for at least 2 weeks during the baseline scan. The second scan was performed on the same day 1½ hours after receiving an oral dose of 35 mg MPH. A significant Valence*Group interaction (p = .037) indicated amygdala hyperreactivity to fearful facial expressions in dAMPH users that was robust against adjustment for age (p = .015). Furthermore, duration of amphetamine use in years was positively correlated with amygdala reactivity in dAMPH users (r = .76; p = .029). These exploratory findings are in line with previous findings suggesting that DA plays a role in emotional processing.

  16. Cultural in-group advantage: emotion recognition in African American and European American faces and voices.

    PubMed

    Wickline, Virginia B; Bailey, Wendy; Nowicki, Stephen

    2009-03-01

    The authors explored whether there were in-group advantages in emotion recognition of faces and voices by culture or geographic region. Participants were 72 African American students (33 men, 39 women), 102 European American students (30 men, 72 women), 30 African international students (16 men, 14 women), and 30 European international students (15 men, 15 women). The participants determined emotions in African American and European American faces and voices. Results showed an in-group advantage, sometimes by culture, less often by race, in recognizing facial and vocal emotional expressions. African international students were generally less accurate at interpreting American nonverbal stimuli than were European American, African American, and European international peers. Results suggest that, although partly universal, emotional expressions have subtle differences across cultures that persons must learn.

  17. How Children Use Emotional Prosody: Crossmodal Emotional Integration?

    ERIC Educational Resources Information Center

    Gil, Sandrine; Hattouti, Jamila; Laval, Virginie

    2016-01-01

    A crossmodal effect has been observed in the processing of facial and vocal emotion in adults and infants. For the first time, we assessed whether this effect is present in childhood by administering a crossmodal task similar to those used in seminal studies featuring emotional faces (i.e., a continuum of emotional expressions running from…

  18. White matter fiber compromise contributes differentially to attention and emotion processing impairment in alcoholism, HIV-infection, and their comorbidity.

    PubMed

    Schulte, T; Müller-Oehring, E M; Sullivan, E V; Pfefferbaum, A

    2012-10-01

    Alcoholism (ALC) and HIV-1 infection (HIV) each affects emotional and attentional processes and integrity of brain white matter fibers likely contributing to functional compromise. The highly prevalent ALC+HIV comorbidity may exacerbate compromise. We used diffusion tensor imaging (DTI) and an emotional Stroop Match-to-Sample task in 19 ALC, 16 HIV, 15 ALC+HIV, and 15 control participants to investigate whether disruption of fiber system integrity accounts for compromised attentional and emotional processing. The task required matching a cue color to that of an emotional word with faces appearing between the color cue and the Stroop word in half of the trials. Nonmatched cue-word color pairs assessed selective attention, and face-word pairs assessed emotion. Relative to controls, DTI-based fiber tracking revealed lower inferior longitudinal fasciculus (ilf) integrity in HIV and ALC+HIV and lower uncinate fasciculus (uf) integrity in all three patient groups. Controls exhibited Stroop effects to positive face-word emotion, and greater interference was related to greater callosal, cingulum and ilf integrity. By contrast, HIV showed greater interference from negative Stroop words during color-nonmatch trials, correlating with greater uf compromise. For face trials, ALC and ALC+HIV showed greater Stroop-word interference, correlating with lower cingulate and callosal integrity. Thus, in HIV, conflict resolution was diminished when challenging conditions usurped resources needed to manage interference from negative emotion and to disengage attention from wrongly cued colors (nonmatch). In ALC and ALC+HIV, poorer callosal integrity was related to enhanced emotional interference suggesting curtailed interhemispheric exchange needed between preferentially right-hemispheric emotion and left-hemispheric Stroop-word functions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Emotion Words Shape Emotion Percepts

    PubMed Central

    Gendron, Maria; Lindquist, Kristen A.; Barsalou, Lawrence; Barrett, Lisa Feldman

    2015-01-01

    People believe they see emotion written on the faces of other people. In an instant, simple facial actions are transformed into information about another's emotional state. The present research examined whether a perceiver unknowingly contributes to emotion perception with emotion word knowledge. We present 2 studies that together support a role for emotion concepts in the formation of visual percepts of emotion. As predicted, we found that perceptual priming of emotional faces (e.g., a scowling face) was disrupted when the accessibility of a relevant emotion word (e.g., anger) was temporarily reduced, demonstrating that the exact same face was encoded differently when a word was accessible versus when it was not. The implications of these findings for a linguistically relative view of emotion perception are discussed. PMID:22309717

  20. Emotion Processing in Parkinson’s Disease: A Three-Level Study on Recognition, Representation, and Regulation

    PubMed Central

    Enrici, Ivan; Adenzato, Mauro; Ardito, Rita B.; Mitkova, Antonia; Cavallo, Marco; Zibetti, Maurizio; Lopiano, Leonardo; Castelli, Lorys

    2015-01-01

    Background Parkinson’s disease (PD) is characterised by well-known motor symptoms, whereas the presence of cognitive non-motor symptoms, such as emotional disturbances, is still underestimated. One of the major problems in studying emotion deficits in PD is an atomising approach that does not take into account different levels of emotion elaboration. Our study addressed the question of whether people with PD exhibit difficulties in one or more specific dimensions of emotion processing, investigating three different levels of analysis, that is, recognition, representation, and regulation. Methodology Thirty-two consecutive medicated patients with PD and 25 healthy controls were enrolled in the study. Participants performed a three-level analysis assessment of emotional processing using quantitative standardised emotional tasks: the Ekman 60-Faces for emotion recognition, the full 36-item version of the Reading the Mind in the Eyes (RME) for emotion representation, and the 20-item Toronto Alexithymia Scale (TAS-20) for emotion regulation. Principal Findings Regarding emotion recognition, patients obtained significantly worse scores than controls on the total score of the Ekman 60-Faces but not for any individual basic emotion. For emotion representation, patients obtained significantly worse scores than controls on the RME experimental score but not on the RME gender control task. Finally, on emotion regulation, PD patients and controls did not perform differently on the TAS-20, and no specific differences were found on TAS-20 subscales. The PD impairments in emotion recognition and representation do not correlate with dopamine therapy, disease severity, or duration of illness. These results are independent of other cognitive processes, such as global cognitive status and executive function, and of psychiatric status, such as depression, anxiety, or apathy. 
Conclusions These results may contribute to better understanding of the emotional problems that are often seen in patients

  1. Variations in the serotonin-transporter gene are associated with attention bias patterns to positive and negative emotion faces.

    PubMed

    Pérez-Edgar, Koraly; Bar-Haim, Yair; McDermott, Jennifer Martin; Gorodetsky, Elena; Hodgkinson, Colin A; Goldman, David; Ernst, Monique; Pine, Daniel S; Fox, Nathan A

    2010-03-01

    Both attention biases to threat and a serotonin-transporter gene polymorphism (5-HTTLPR) have been linked to heightened neural activation to threat and the emergence of anxiety. The short allele of 5-HTTLPR may act via its effect on neurotransmitter availability, while attention biases shape broad patterns of cognitive processing. We examined individual differences in attention bias to emotion faces as a function of 5-HTTLPR genotype. Adolescents (N=117) were classified by presumed SLC6A4 expression based on 5-HTTLPR genotype: low (SS, SL(G), or L(G)L(G)), intermediate (SL(A) or L(A)L(G)), or high (L(A)L(A)). Participants completed the dot-probe task, measuring attention biases toward or away from angry and happy faces. Biases for angry faces increased with the genotype-predicted neurotransmission levels (low>intermediate>high). The reverse pattern was evident for happy faces. The data indicate a linear relation between 5-HTTLPR allelic status and attention biases to emotion, demonstrating a genetic mechanism for biased attention using ecologically valid stimuli that target socioemotional adaptation.
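    Attention bias scores in dot-probe studies such as this one are conventionally computed as the mean reaction time on incongruent trials (probe replaces the neutral face) minus the mean reaction time on congruent trials (probe replaces the emotion face), so that positive values index vigilance toward the emotion. A minimal sketch of that computation; the function name and reaction times below are invented for illustration, not taken from the paper:

```python
from statistics import mean

def attention_bias(rt_incongruent, rt_congruent):
    """Dot-probe attention bias score in ms.

    rt_incongruent: RTs (ms) when the probe replaced the neutral face,
                    so attention must shift away from the emotion face.
    rt_congruent:   RTs (ms) when the probe replaced the emotion face.
    Positive score = attention biased toward the emotion face.
    """
    return mean(rt_incongruent) - mean(rt_congruent)

# Hypothetical angry-face trials for one participant:
bias = attention_bias([520, 540, 510], [480, 500, 490])
print(round(bias, 1))  # 33.3 -> vigilance toward angry faces
```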

  2. Face perception in high-functioning autistic adults: evidence for superior processing of face parts, not for a configural face-processing deficit.

    PubMed

    Lahaie, A; Mottron, L; Arguin, M; Berthiaume, C; Jemel, B; Saumier, D

    2006-01-01

    Configural processing in autism was studied in Experiment 1 using the face inversion effect. A normal inversion effect was observed in the participants with autism, suggesting intact configural face processing. A priming paradigm using partial or complete faces was used in Experiment 2 to assess both local and configural face processing. Overall, normal priming effects were found in participants with autism, irrespective of whether the partial face primes were intuitive face parts (i.e., eyes, nose, etc.) or arbitrary segments. An exception, however, was that participants with autism showed magnified priming with single face parts relative to typically developing control participants. The present findings argue for intact configural processing in autism along with enhanced processing of individual face parts. The face-processing peculiarities known to characterize autism are discussed on the basis of these results and of past congruent results with nonsocial stimuli.

  3. Facial color processing in the face-selective regions: an fMRI study.

    PubMed

    Nakajima, Kae; Minami, Tetsuto; Tanabe, Hiroki C; Sadato, Norihiro; Nakauchi, Shigeki

    2014-09-01

    Facial color is important information for social communication, as it provides important clues for recognizing a person's emotion and health condition. Our previous EEG study suggested that the N170 at the left occipito-temporal site is related to facial color processing (Nakajima et al., [2012]: Neuropsychologia 50:2499-2505). However, because of the low spatial resolution of EEG, the brain region involved in facial color processing remains controversial. In the present study, we examined the neural substrates of facial color processing using functional magnetic resonance imaging (fMRI). We measured brain activity from 25 subjects during the presentation of natural- and bluish-colored faces and their scrambled images. The bilateral fusiform face area (FFA) and occipital face area (OFA) were localized by the contrast of natural-colored faces versus natural-colored scrambled images. Moreover, region of interest (ROI) analysis showed that the left FFA was sensitive to facial color, whereas the right FFA and the right and left OFA were insensitive to facial color. In combination with our previous EEG results, these data suggest that the left FFA may play an important role in facial color processing.

  4. Neural processing of emotional-intensity predicts emotion regulation choice.

    PubMed

    Shafir, Roni; Thiruchselvam, Ravi; Suri, Gaurav; Gross, James J; Sheppes, Gal

    2016-12-01

    Emotional-intensity is a core characteristic of affective events that strongly determines how individuals choose to regulate their emotions. Our conceptual framework suggests that in high emotional-intensity situations, individuals prefer to disengage attention using distraction, which can more effectively block highly potent emotional information, as compared with engagement reappraisal, which is preferred in low emotional-intensity situations. However, existing supporting evidence remains indirect, because prior intensity categorization of emotional stimuli was based on subjective measures that are potentially biased and only represent the endpoint of emotional-intensity processing. Accordingly, this study provides the first direct evidence for the role of online emotional-intensity processing in predicting behavioral regulatory-choices. Utilizing the high temporal resolution of event-related potentials, we evaluated online neural processing of stimuli's emotional-intensity (late positive potential, LPP) prior to regulatory-choices between distraction and reappraisal. Results showed that enhanced neural processing of intensity (enhanced LPP amplitudes) uniquely predicted (above subjective measures of intensity) an increased tendency to subsequently choose distraction over reappraisal. Additionally, regulatory-choices led to adaptive consequences, demonstrated by the finding that actual implementation of distraction relative to reappraisal resulted in stronger attenuation of LPPs and self-reported arousal.
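    The reported relation (higher LPP amplitude predicting a greater tendency to choose distraction) can be pictured as a logistic link between amplitude and choice probability. A toy sketch under invented coefficients; nothing below is estimated from the study's data:

```python
import math

def p_choose_distraction(lpp_uv, intercept=-2.0, slope=0.5):
    """Toy logistic model: probability of choosing distraction over
    reappraisal as a function of LPP amplitude in microvolts.
    Coefficients are illustrative, not fitted to any data."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * lpp_uv)))

for amp in (2.0, 4.0, 8.0):
    print(f"LPP {amp:.1f} uV -> P(distraction) = {p_choose_distraction(amp):.2f}")
# LPP 2.0 uV -> P(distraction) = 0.27
# LPP 4.0 uV -> P(distraction) = 0.50
# LPP 8.0 uV -> P(distraction) = 0.88
```

The monotone increase mirrors the study's direction of effect: larger LPP amplitudes, more distraction choices.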

  5. Emotion identification method using RGB information of human face

    NASA Astrophysics Data System (ADS)

    Kita, Shinya; Mita, Akira

    2015-03-01

    Recently, the number of single households has drastically increased due to the growth of the aging society and the diversity of lifestyles. Therefore, the evolution of building spaces is demanded. The Biofied Building we propose can help address this situation. It supports interaction between the building and residents' conscious and unconscious information using robots. The unconscious information includes emotion, condition, and behavior. One important piece of information is thermal comfort. We assume it can be estimated from the human face. There is much research on face color analysis, but little of it has been conducted in real situations. In other words, the existing methods were not tested under disturbances such as room lamps. In this study, Kinect was used with face-tracking. Room lamps and task lamps were used to verify that our method could be applicable to real situations. In this research, two rooms at 22 and 28 degrees C were prepared. We showed that the transition of thermal comfort caused by changing temperature can be observed from the human face. Thus, distinguishing the data of the 22 and 28 degrees C conditions from face color was proved to be possible.
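    The abstract does not detail the face-color analysis, but the core idea (separating two temperature conditions by facial color) can be illustrated with a nearest-mean classifier on the red ratio of face-region pixels. All function names and values below are invented for illustration and are not the authors' method:

```python
from statistics import mean

def mean_red_ratio(pixels):
    """Mean R/(R+G+B) over face-region pixels given as (r, g, b) tuples."""
    return mean(r / (r + g + b) for r, g, b in pixels)

def classify_condition(sample_ratio, ratio_22c, ratio_28c):
    """Assign a sample to whichever condition's reference ratio is closer."""
    if abs(sample_ratio - ratio_28c) < abs(sample_ratio - ratio_22c):
        return "28C"
    return "22C"

# Invented reference ratios; skin tends to look slightly redder when warm.
ratio_22c, ratio_28c = 0.36, 0.40
sample = mean_red_ratio([(120, 90, 80), (130, 95, 85)])
print(classify_condition(sample, ratio_22c, ratio_28c))  # 28C
```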

  6. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    PubMed

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance by researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body.

  7. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: a fixation-to-feature approach

    PubMed Central

    Neath-Tavares, Karly N.; Itier, Roxane J.

    2017-01-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100–120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms. PMID:27430934

  8. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: A fixation-to-feature approach.

    PubMed

    Neath-Tavares, Karly N; Itier, Roxane J

    2016-09-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100-120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms.

  9. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    NASA Astrophysics Data System (ADS)

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-03-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a u-shaped modulation, with stronger reactions towards both the most abstract and the most realistic faces compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, realism and expression interacted on the N170. Finally, the LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face-processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics.

  10. Differential effects of face-realism and emotion on event-related brain potentials and their implications for the uncanny valley theory

    PubMed Central

    Schindler, Sebastian; Zell, Eduard; Botsch, Mario; Kissler, Johanna

    2017-01-01

    Cartoon characters are omnipresent in popular media. While few studies have scientifically investigated their processing, in computer graphics, efforts are made to increase realism. Yet, close approximations of reality have been suggested to sometimes evoke a feeling of eeriness, the “uncanny valley” effect. Here, we used high-density electroencephalography to investigate brain responses to professionally stylized happy, angry, and neutral character faces. We employed six face-stylization levels varying from abstract to realistic and investigated the N170, early posterior negativity (EPN), and late positive potential (LPP) event-related components. The face-specific N170 showed a u-shaped modulation, with stronger reactions towards both the most abstract and the most realistic faces compared to medium-stylized faces. For abstract faces, the N170 was generated more occipitally than for real faces, implying stronger reliance on structural processing. Although emotional faces elicited the highest amplitudes on both the N170 and EPN, realism and expression interacted on the N170. Finally, the LPP increased linearly with face realism, reflecting an activity increase in visual and parietal cortex for more realistic faces. Results reveal differential effects of face stylization on distinct face-processing stages and suggest a perceptual basis to the uncanny valley hypothesis. They are discussed in relation to face perception, media design, and computer graphics. PMID:28332557

  11. Independent effects of reward expectation and spatial orientation on the processing of emotional facial expressions.

    PubMed

    Kang, Guanlan; Zhou, Xiaolin; Wei, Ping

    2015-09-01

    The present study investigated the effects of reward expectation and spatial orientation on the processing of emotional facial expressions, using a spatial cue-target paradigm. A colored cue was presented at the left or right side of the central fixation point, with its color indicating the monetary reward stakes of a given trial (incentive vs. non-incentive), followed by the presentation of an emotional facial target (angry vs. neutral) at a cued or un-cued location. Participants were asked to discriminate the emotional expression of the target, with the cue-target stimulus onset asynchrony being 200-300 ms in Experiment 1 and 950-1250 ms in Experiment 2a (without a fixation cue) and Experiment 2b (with a fixation cue), producing a spatial facilitation effect and an inhibition-of-return effect, respectively. The results of all experiments revealed faster reaction times in the monetary-incentive condition than in the non-incentive condition, demonstrating the facilitating effect of reward on task performance. An interaction between reward expectation and the emotion of the target was evident in all three experiments, with larger reward effects for angry faces than for neutral faces. This interaction was not affected by spatial orientation. These findings demonstrate that incentive motivation improves task performance and increases sensitivity to angry faces, irrespective of spatial orienting and reorienting processes.

  12. Unconscious Processing of Facial Emotional Valence Relation: Behavioral Evidence of Integration between Subliminally Perceived Stimuli.

    PubMed

    Liu, Chengzhen; Sun, Zhiyi; Jou, Jerwen; Cui, Qian; Zhao, Guang; Qiu, Jiang; Tu, Shen

    2016-01-01

    Although a few studies have investigated the integration between some types of unconscious stimuli, no research has yet explored the integration between unconscious emotional stimuli. This study was designed to provide behavioral evidence for the integration between unconsciously perceived emotional faces (same or different valence relation) using a modified priming paradigm. In two experiments, participants were asked to decide whether two faces in the target, which followed two subliminally presented faces of same or different emotional expressions, were of the same or different emotional valence. The interstimulus interval (ISI) between the prime and the target was manipulated (0, 53, 163 ms). In Experiment 1, prime visibility was assessed post-experiment. In Experiment 2, it was assessed on each trial. Interestingly, in both experiments, unconsciously processed valence relation of the two faces in the prime generated a negative priming effect in the response to the supraliminally presented target, independent of the length of ISI. Further analyses suggested that the negative priming was probably caused by a motor response incongruent relation between the subliminally perceived prime and the supraliminally perceived target. The visual feature incongruent relation across the prime and target was not found to play a role in the negative priming. Because the negative priming was found at short ISI, an attention mechanism as well as a motor inhibition mechanism were proposed in the generation of the negative priming effect. Overall, this study indicated that the subliminal valence relation was processed, and that integration between different unconsciously perceived stimuli could occur.

  13. Unconscious Processing of Facial Emotional Valence Relation: Behavioral Evidence of Integration between Subliminally Perceived Stimuli

    PubMed Central

    Liu, Chengzhen; Sun, Zhiyi; Jou, Jerwen; Cui, Qian; Zhao, Guang; Qiu, Jiang; Tu, Shen

    2016-01-01

    Although a few studies have investigated the integration between some types of unconscious stimuli, no research has yet explored the integration between unconscious emotional stimuli. This study was designed to provide behavioral evidence for the integration between unconsciously perceived emotional faces (same or different valence relation) using a modified priming paradigm. In two experiments, participants were asked to decide whether two faces in the target, which followed two subliminally presented faces of same or different emotional expressions, were of the same or different emotional valence. The interstimulus interval (ISI) between the prime and the target was manipulated (0, 53, 163 ms). In Experiment 1, prime visibility was assessed post-experiment. In Experiment 2, it was assessed on each trial. Interestingly, in both experiments, unconsciously processed valence relation of the two faces in the prime generated a negative priming effect in the response to the supraliminally presented target, independent of the length of ISI. Further analyses suggested that the negative priming was probably caused by a motor response incongruent relation between the subliminally perceived prime and the supraliminally perceived target. The visual feature incongruent relation across the prime and target was not found to play a role in the negative priming. Because the negative priming was found at short ISI, an attention mechanism as well as a motor inhibition mechanism were proposed in the generation of the negative priming effect. Overall, this study indicated that the subliminal valence relation was processed, and that integration between different unconsciously perceived stimuli could occur. PMID:27622600

  14. Emotion processing in joint hypermobility: A potential link to the neural bases of anxiety and related somatic symptoms in collagen anomalies.

    PubMed

    Mallorquí-Bagué, N; Bulbena, A; Roé-Vellvé, N; Hoekzema, E; Carmona, S; Barba-Müller, E; Fauquet, J; Pailhez, G; Vilarroya, O

    2015-06-01

    Joint hypermobility syndrome (JHS) has repeatedly been associated with anxiety and anxiety disorders, fibromyalgia, irritable bowel syndrome and temporomandibular joint disorder. However, the neural underpinnings of these associations remain unclear. This study explored brain responses to facial visual stimuli with emotional cues, using fMRI techniques in a general-population sample with different ranges of hypermobility. Fifty-one non-clinical volunteers (33 women) completed state and trait anxiety questionnaire measures, were assessed with a clinical examination for hypermobility (Beighton system) and performed an emotional face processing paradigm during functional neuroimaging. Trait anxiety scores correlated significantly with both state anxiety and hypermobility scores. BOLD signals of the hippocampus correlated positively with hypermobility scores for the crying faces versus neutral faces contrast in ROI analyses. No results were found for any of the other ROIs studied. Additionally, hypermobility scores were also associated with other key affective processing areas (i.e. the middle and anterior cingulate gyrus, fusiform gyrus, parahippocampal region, orbitofrontal cortex and cerebellum) in the whole-brain analysis. Hypermobility scores are thus associated with trait anxiety and with higher brain responses to emotional faces in emotion-processing brain areas (including the hippocampus) described as linked to anxiety and somatic symptoms. These findings increase our understanding of emotion processing in people bearing this heritable variant of collagen and of the mechanisms through which vulnerability to anxiety and somatic symptoms arises in this population.

  15. Are 6-month-old human infants able to transfer emotional information (happy or angry) from voices to faces? An eye-tracking study.

    PubMed

    Palama, Amaya; Malsert, Jennifer; Gentaz, Edouard

    2018-01-01

    The present study examined whether 6-month-old infants could transfer amodal information (i.e., independent of sensory modality) from emotional voices to emotional faces. Sequences of successive emotional stimuli crossing from one sensory modality (auditory) to another (visual), corresponding to a cross-modal transfer, were displayed to 24 infants. Each sequence presented a single emotional (angry or happy) or neutral voice, followed by the simultaneous presentation of two static emotional faces (angry or happy, congruent or incongruent with the emotional voice). Eye movements in response to the visual stimuli were recorded with an eye-tracker. First, the results suggested no difference in infants' looking time to the happy or angry face after listening to the neutral voice or the angry voice. Nevertheless, after listening to the happy voice, infants looked longer at the incongruent angry face (the mouth area in particular) than at the congruent happy face. These results reveal that a cross-modal transfer (from the auditory to the visual modality) is possible for 6-month-old infants only after the presentation of a happy voice, suggesting that they recognize this emotion amodally.

  16. Effects of intranasal oxytocin on amygdala reactivity to emotional faces in recently trauma-exposed individuals.

    PubMed

    Frijling, Jessie L; van Zuiden, Mirjam; Koch, Saskia B J; Nawijn, Laura; Veltman, Dick J; Olff, Miranda

    2016-02-01

    There is a need for effective, early post-trauma preventive interventions for post-traumatic stress disorder (PTSD). Attenuating amygdala hyperreactivity early post-trauma, a likely PTSD vulnerability factor, may decrease PTSD risk. Since oxytocin modulates amygdala reactivity to emotional stimuli, oxytocin administration early post-trauma may be a promising candidate for PTSD prevention. In a randomized double-blind placebo-controlled fMRI study, we investigated the effects of a single intranasal oxytocin administration (40 IU) on amygdala reactivity to happy, neutral and fearful faces in 41 recently trauma-exposed men and women showing moderate to high distress after initial post-trauma screening. We explored treatment interactions with sex. Participants were scanned within 11 days post-trauma. Compared with placebo, oxytocin significantly increased right amygdala reactivity to fearful faces. There was a significant treatment-by-sex interaction on amygdala reactivity to neutral faces, with women showing increased left amygdala reactivity after oxytocin. These findings indicate that a single oxytocin administration may enhance fearful face processing in recently trauma-exposed individuals and neutral face processing in recently trauma-exposed women. These observations may be explained by oxytocin-induced increases in salience processing. The clinical implications of these findings for PTSD prevention should be further investigated. Netherlands Trial Registry; Boosting Oxytocin after trauma: Neurobiology and the Development of Stress-related psychopathology (BONDS); NTR3190; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=3190.

  17. How is this child feeling? Preschool-aged children’s ability to recognize emotion in faces and body poses

    PubMed Central

    Parker, Alison E.; Mathis, Erin T.; Kupersmidt, Janis B.

    2016-01-01

    The study examined children’s recognition of emotion from faces and body poses, as well as gender differences in these recognition abilities. Preschool-aged children (N = 55) and their parents and teachers participated in the study. Preschool-aged children completed a web-based measure of emotion recognition skills, which included five tasks (three with faces and two with bodies). Parents and teachers reported on children’s aggressive behaviors and social skills. Children’s emotion accuracy on two of the three facial tasks and one of the body tasks was related to teacher reports of social skills. Some of these relations were moderated by child gender. In particular, the relationships between emotion recognition accuracy and reports of children’s behavior were stronger for boys than girls. Identifying preschool-aged children’s strengths and weaknesses in identification of emotion from faces and body poses may be helpful in guiding interventions with children who have problems with social and behavioral functioning that may be due, in part, to emotional knowledge deficits. Further developmental implications of these findings are discussed. PMID:27057129

  18. Cyber Victimization in High School: Measurement, Overlap with Face-to-Face Victimization, and Associations with Social-Emotional Outcomes

    ERIC Educational Resources Information Center

    Brown, Christina Flynn; Demaray, Michelle Kilpatrick; Tennant, Jaclyn E.; Jenkins, Lyndsay N.

    2017-01-01

    Cyber victimization is a contemporary problem facing youth and adolescents (Diamanduros, Downs, & Jenkins, 2008; Kowalski & Limber, 2007). It is imperative for researchers and school personnel to understand the associations between cyber victimization and student social-emotional outcomes. This article explores (a) gender differences in…

  19. Abnormal emotion processing, but intact fairness and intentionality considerations during social decision-making in schizophrenia.

    PubMed

    de la Asuncion, Javier; Docx, Lise; Sabbe, Bernard; Morrens, Manuel; de Bruijn, Ellen R A

    2015-01-01

    Schizophrenia is a severe mental disorder characterized by marked social cognitive impairments. Most studies investigating these impairments focus on one specific social domain, such as emotion recognition. However, in daily life, processing complex social situations relies on the combination of several social cognitive and affective processes simultaneously rather than on one process alone. A modified version of the economically based Ultimatum Game was used to measure the interplay between fairness, intentionality, and emotion considerations during social decision-making. In this task, participants accept or reject fair and unfair monetary offers proposed intentionally or unintentionally by either angry, happy, neutral, or sad proposers. Behavioral data were collected from a group of schizophrenia patients (N = 35) and a group of healthy individuals (N = 30). Like healthy participants, schizophrenia patients differentiated between fair and unfair offers by rejecting unfair offers more often than fair offers. However, patients rejected more fair offers overall, indicating that their construct of fairness operates within different margins. In both groups, intentional unfair offers were rejected more than unintentional ones, indicating a normal integration of intentionality considerations in schizophrenia. Importantly, healthy subjects also differentiated between proposers' emotions when rejecting unfair offers (more rejections from proposers depicting angry faces compared to proposers depicting happy, neutral, or sad faces). Schizophrenia patients' decision behavior, on the other hand, was not affected by the proposers' emotions. The current study thus shows that schizophrenia patients have specific problems with processing and integrating emotional information. Importantly, the finding that patients display normal fairness and intentionality considerations emphasizes the preservation of central social cognitive processes in schizophrenia.

  20. Face Emotion Processing in Depressed Children and Adolescents with and without Comorbid Conduct Disorder

    ERIC Educational Resources Information Center

    Schepman, Karen; Taylor, Eric; Collishaw, Stephan; Fombonne, Eric

    2012-01-01

    Studies of adults with depression point to characteristic neurocognitive deficits, including differences in processing facial expressions. Few studies have examined face processing in juvenile depression, or taken account of other comorbid disorders. Three groups were compared: depressed children and adolescents with conduct disorder (n = 23),…

  1. Who cares? Offering emotion work as a 'gift' in the nursing labour process.

    PubMed

    Bolton, S C

    2000-09-01

    The emotional elements of the nursing labour process are being recognized increasingly. Many commentators stress that nurses' 'emotional labour' is hard and productive work and should be valued in the same way as physical or technical labour. However, the term 'emotional labour' fails to conceptualize the many occasions when nurses not only work hard on their emotions in order to present the detached face of a professional carer, but also offer authentic caring behaviour to patients in their care. Using qualitative data collected from a group of gynaecology nurses in an English National Health Service (NHS) Trust hospital, this paper argues that nursing work is emotionally complex and may be better understood by utilizing a combination of Hochschild's concepts: emotion work as a 'gift' in addition to 'emotional labour'. The gynaecology nurses in this study describe their work as 'emotionful', and it could therefore be said that this particular group of nurses represents a distinct example. Nevertheless, though it is impossible to generalize from limited data, the research presented in this paper does highlight the emotional complexity of the nursing labour process, expands the current conceptual analysis, and offers a path for future research. The examination further emphasizes the need to understand and value the motivations behind nurses' emotion work and their wish to maintain caring as a central value in professional nursing.

  2. Common and disorder-specific neural responses to emotional faces in generalised anxiety, social anxiety and panic disorders

    PubMed Central

    Fonzo, Gregory A.; Ramsawh, Holly J.; Flagan, Taru M.; Sullivan, Sarah G.; Letamendi, Andrea; Simmons, Alan N.; Paulus, Martin P.; Stein, Murray B.

    2015-01-01

    Background: Although evidence exists for abnormal brain function across various anxiety disorders, direct comparison of neural function across diagnoses is needed to identify abnormalities common across disorders and those distinct to a particular diagnosis. Aims: To delineate common and distinct abnormalities within generalised anxiety disorder (GAD), panic disorder and social anxiety disorder (SAD) during affective processing. Method: Fifty-nine adults (15 with GAD, 15 with panic disorder, 14 with SAD, and 15 healthy controls) underwent functional magnetic resonance imaging while completing a facial emotion matching task with fearful, angry and happy faces. Results: Greater differential right amygdala activation to matching fearful v. happy facial expressions related to greater negative affectivity (i.e. trait anxiety) and was heightened across all anxiety disorder groups compared with controls. Collapsing across emotional face types, participants with panic disorder uniquely displayed greater posterior insula activation. Conclusions: These preliminary results highlight a common neural basis for clinical anxiety in these diagnoses and also suggest the presence of disorder-specific dysfunction. PMID:25573399

  3. M69. Changes in Neural Measures of Emotion Processes Following Targeted Social Cognition Training

    PubMed Central

    Saxena, Abhishek; Guty, Erin; Dodell-Feder, David; Yin, Hong; Haut, Kristen; Nahum, Mor; Hooker, Christine

    2017-01-01

    Background: Research has shown that people who develop a psychotic disorder display observable decreases in cognitive abilities even before they begin to display overt symptoms of psychosis. Research has therefore shown increased interest in targeted cognitive training (TCT) as a possible technique to slow or even stop cognitive deterioration in psychiatric disorders such as schizophrenia. Although TCT has produced promising improvements in certain cognitive deficits, TCT research has largely ignored social cognition training. The current study investigates whether targeted social cognition training may be a viable method of improving social cognition in patient populations. Methods: To this end, 56 healthy adults from the community completed MRI scans before and after a 2-week period during which participants were randomized to complete either up to 10 hours of SocialVille, a computerized social cognition training program from PositScience Corporation, or up to 10 hours of common computer games. SocialVille consists of a variety of social cognition exercises, such as face emotion recognition, gaze tracking, and recognizing social incongruences. During the MRI scans, participants completed an emotion identification task (EmotID) consisting of object discrimination and emotion discrimination blocks. During the object discrimination blocks, participants were shown pictures of 2 cars and asked to indicate whether the cars were the same or different, while in the emotion discrimination blocks, participants were shown 2 faces and asked whether the faces displayed the same emotion. Results: Behavioral data indicated that, controlling for initial performance, sex, age, and estimated IQ, being in the TCT group predicted better performance during the emotion discrimination blocks after treatment compared to completing placebo computer games. Additionally, fMRI analyses indicate that brain regions central to the emotion processes (i.e., amygdala) and the…

  4. Processing of Facial Emotion in Bipolar Depression and Euthymia.

    PubMed

    Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter

    2015-10-01

    Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigate facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities; the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions; the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). There were no significant group differences on any measures of emotion perception/labeling, compared to controls. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations-including mood state, sample size, and the cognitive demands of the tasks-may contribute significantly to the variability in findings between studies.

  5. Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence

    PubMed Central

    Schirmer, Annett; Adolphs, Ralph

    2017-01-01

    Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here, we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses, and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly non-overlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments. PMID:28173998

  6. Exploring the unconscious using faces.

    PubMed

    Axelrod, Vadim; Bar, Moshe; Rees, Geraint

    2015-01-01

    Understanding the mechanisms of unconscious processing is one of the most substantial endeavors of cognitive science. While there are many different empirical ways to address this question, the use of faces in such research has proven exceptionally fruitful. We review here what has been learned about unconscious processing through the use of faces and face-selective neural correlates. A large number of cognitive systems can be explored with faces, including emotions, social cueing and evaluation, attention, multisensory integration, and various aspects of face processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. ERP correlates of attention allocation in mothers processing faces of their children

    PubMed Central

    Grasso, Damion J.; Moser, Jason S.; Dozier, Mary; Simons, Robert

    2012-01-01

    This study employed visually evoked event-related potential (ERP) methodology to examine temporal patterns of structural and higher-level face processing in birth and foster/adoptive mothers viewing pictures of their children. Fourteen birth mothers and 14 foster/adoptive mothers engaged in a computerized task in which they viewed facial pictures of their own children, and of familiar and unfamiliar children and adults. All mothers, regardless of type, showed ERP patterns suggestive of increased attention allocation to their own children’s faces compared to other child and adult faces beginning as early as 100–150 ms after stimulus onset and lasting for several hundred milliseconds. These data are in line with a parallel processing model that posits the involvement of several brain regions in simultaneously encoding the structural features of faces as well as their emotional and personal significance. Additionally, late positive ERP patterns associated with greater allocation of attention predicted mothers’ perceptions of the parent–child relationship as positive and influential to their children’s psychological development. These findings suggest the potential utility of using ERP components to index maternal processes. PMID:19428973

  8. Holistic face training enhances face processing in developmental prosopagnosia

    PubMed Central

    Cohan, Sarah; Nakayama, Ken

    2014-01-01

    Prosopagnosia has largely been regarded as an untreatable disorder. However, recent case studies using cognitive training have shown that it is possible to enhance face recognition abilities in individuals with developmental prosopagnosia. Our goal was to determine if this approach could be effective in a larger population of developmental prosopagnosics. We trained 24 developmental prosopagnosics using a 3-week online face-training program targeting holistic face processing. Twelve subjects with developmental prosopagnosia were assessed before and after training; the other 12 were assessed before and after a waiting period, then performed the training, and were assessed again. The assessments included measures of front-view face discrimination, face discrimination with view-point changes, measures of holistic face processing, and a 5-day diary to quantify potential real-world improvements. Compared with the waiting period, developmental prosopagnosics showed moderate but significant overall training-related improvements on measures of front-view face discrimination. Those who reached the more difficult levels of training ('better' trainees) showed the strongest improvements in front-view face discrimination and showed significantly increased holistic face processing, to the point of being similar to that of unimpaired control subjects. Despite challenges in characterizing developmental prosopagnosics' everyday face recognition and potential biases in self-report, results also showed modest but consistent self-reported diary improvements. In summary, we demonstrate that by using cognitive training that targets holistic processing, it is possible to enhance face perception across a group of developmental prosopagnosics, and we further suggest that those who improved the most on the training task received the greatest benefits. PMID:24691394

  9. Gender differences in facial imitation and verbally reported emotional contagion from spontaneous to emotionally regulated processing levels.

    PubMed

    Sonnby-Borgström, Marianne; Jönsson, Peter; Svensson, Owe

    2008-04-01

    Previous studies on gender differences in facial imitation and verbally reported emotional contagion have investigated emotional responses to pictures of facial expressions at supraliminal exposure times. The aim of the present study was to investigate how gender differences are related to different exposure times, representing information processing levels from subliminal (spontaneous) to supraliminal (emotionally regulated). Further, the study aimed at exploring correlations between verbally reported emotional contagion and facial responses for men and women. Masked pictures of angry, happy and sad facial expressions were presented to 102 participants (51 men) at exposure times from subliminal (23 ms) to clearly supraliminal (2500 ms). Myoelectric activity (EMG) from the corrugator and the zygomaticus was measured and the participants reported their hedonic tone (verbally reported emotional contagion) after stimulus exposures. The results showed an effect of exposure time on gender differences in facial responses as well as in verbally reported emotional contagion. Women amplified imitative responses towards happy vs. angry faces and verbally reported emotional contagion with prolonged exposure times, whereas men did not. No gender differences were detected at the subliminal or borderliminal exposure times, but at the supraliminal exposure gender differences were found in imitation as well as in verbally reported emotional contagion. Women showed correspondence between their facial responses and their verbally reported emotional contagion to a greater extent than men. The results were interpreted in terms of gender differences in emotion regulation, rather than as differences in biologically prepared emotional reactivity.

  10. Inter-hemispheric interaction facilitates face processing.

    PubMed

    Compton, Rebecca J

    2002-01-01

    Many recent studies have revealed that interaction between the left and right cerebral hemispheres can aid in task performance, but these studies have tended to examine perception of simple stimuli such as letters, digits or simple shapes, which may have limited naturalistic validity. The present study extends these prior findings to a more naturalistic face perception task. Matching tasks required subjects to indicate when a target face matched one of two probe faces. Matches could be either across-field, requiring inter-hemispheric interaction, or within-field, not requiring inter-hemispheric interaction. Subjects indicated when faces matched in emotional expression (Experiment 1; n=32) or in character identity (Experiment 2; n=32). In both experiments, across-field performance was significantly better than within-field performance, supporting the primary hypothesis. Further, this advantage was greater for the more difficult character identity task. Results offer qualified support for the hypothesis that inter-hemispheric interaction is especially advantageous as task demands increase.

  11. The theta burst transcranial magnetic stimulation over the right PFC affects electroencephalogram oscillation during emotional processing.

    PubMed

    Cao, Dan; Li, Yingjie; Niznikiewicz, Margaret A; Tang, Yingying; Wang, Jijun

    2018-03-02

    Prefrontal cortex (PFC) plays an important role in emotional processing and therefore is one of the most frequently targeted regions for non-invasive brain stimulation such as repetitive transcranial magnetic stimulation (rTMS) in clinical trials, especially in the treatment of emotional disorders. As an approach to enhance the effectiveness of rTMS, continuous theta burst stimulation (cTBS) has been demonstrated to be efficient and safe. However, it is unclear how cTBS affects brain processes related to emotion; in particular, psychophysiological studies on the underlying neural mechanisms are sparse. In the current study, we investigated how cTBS influences emotional processing when applied over the right PFC. Participants performed an emotion recognition Go/NoGo task, in which they made a Go response to either happy or fearful faces after cTBS or after sham stimulation, while a 64-channel electroencephalogram (EEG) was recorded. EEG oscillation was examined using event-related spectral perturbation (ERSP) in a time interval between 170 and 310 ms after face stimulus onset. In the sham group, we found a significant difference in the alpha band between responses to happy and fearful stimuli, but that effect did not exist in the cTBS group. The alpha band activity at the scalp was reduced, suggesting an excitatory effect at the brain level. The beta and gamma band activity was not sensitive to cTBS intervention. The results of the current study demonstrate that cTBS does affect emotion processing and that the effect is reflected in changes in EEG oscillations in the alpha band specifically. The results confirm the role of prefrontal cortex in emotion processing. We also suggest that this pattern of cTBS results elucidates mechanisms by which mood improvement in depressive disorders is achieved using cTBS intervention. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. How Context Influences Our Perception of Emotional Faces: A Behavioral Study on the Kuleshov Effect.

    PubMed

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel; Siri, Francesca; Umiltà, Maria A; Gallese, Vittorio

    2017-01-01

    Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions corresponding to one of the so called 'basic emotions.' However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have only used static images as stimuli. Our study used a more ecological design with participants watching film sequences of neutral faces, crosscut with scenes of strong emotional content (evoking happiness or fear, plus neutral stimuli as a baseline condition). The task was to rate the emotion displayed by a target person's face in terms of valence, arousal, and category. Results clearly demonstrated the presence of a significant effect in terms of both valence and arousal in the fear condition only. Moreover, participants tended to categorize the target person's neutral facial expression choosing the emotion category congruent with the preceding context. Our results highlight the context-sensitivity of emotions and the importance of studying them under ecologically valid conditions.

  13. Priming global and local processing of composite faces: revisiting the processing-bias effect on face perception.

    PubMed

    Gao, Zaifeng; Flevaris, Anastasia V; Robertson, Lynn C; Bentin, Shlomo

    2011-07-01

    We used the composite-face illusion and Navon stimuli to determine the consequences of priming local or global processing on subsequent face recognition. The composite-face illusion reflects the difficulty of ignoring the task-irrelevant half-face while attending the task-relevant half if the half-faces in the composite are aligned. On each trial, participants first matched two Navon stimuli, attending to either the global or the local level, and then matched the upper halves of two composite faces presented sequentially. Global processing of Navon stimuli increased the sensitivity to incongruence between the upper and the lower halves of the composite face, relative to a baseline in which the composite faces were not primed. Local processing of Navon stimuli did not influence the sensitivity to incongruence. Although incongruence induced a bias toward different responses, this bias was not modulated by priming. We conclude that global processing of Navon stimuli augments holistic processing of the face.

  14. Familiarity facilitates feature-based face processing.

    PubMed

    Visconti di Oleggio Castello, Matteo; Wheeler, Kelsey G; Cipolli, Carlo; Gobbini, M Ida

    2017-01-01

    Recognition of personally familiar faces is remarkably efficient, effortless and robust. We asked if feature-based face processing facilitates detection of familiar faces by testing the effect of face inversion on a visual search task for familiar and unfamiliar faces. Because face inversion disrupts configural and holistic face processing, we hypothesized that inversion would diminish the familiarity advantage to the extent that it is mediated by such processing. Subjects detected personally familiar and stranger target faces in arrays of two, four, or six face images. Subjects showed significant facilitation of personally familiar face detection for both upright and inverted faces. The effect of familiarity on target absent trials, which involved only rejection of unfamiliar face distractors, suggests that familiarity facilitates rejection of unfamiliar distractors as well as detection of familiar targets. The preserved familiarity effect for inverted faces suggests that facilitation of face detection afforded by familiarity reflects mostly feature-based processes.

  15. Holistic processing of static and moving faces.

    PubMed

    Zhao, Mintao; Bülthoff, Isabelle

    2017-07-01

    Humans' face-processing ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces, which move most of the time. However, how facial movements affect one core aspect of this ability, holistic face processing, remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based face processing by manipulating the presence of facial motion during study and at test in a composite face task. The results showed that rigidly moving faces were processed as holistically as static faces (Experiment 1). Holistic processing of moving faces persisted whether facial motion was presented during study, at test, or both (Experiment 2). Moreover, when faces were inverted to eliminate the contributions of both an upright face template and observers' expertise with upright faces, rigid facial motion facilitated holistic face processing (Experiment 3). Thus, holistic processing represents a general principle of face perception that applies to both static and dynamic faces, rather than being limited to static faces. These results support an emerging view that both perceiver-based and face-based factors contribute to holistic face processing, and they offer new insights on what underlies holistic face processing, how the sources of information supporting it interact with each other, and why facial motion may affect face recognition and holistic face processing differently. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Abnormal emotion processing, but intact fairness and intentionality considerations during social decision-making in schizophrenia

    PubMed Central

    de la Asuncion, Javier; Docx, Lise; Sabbe, Bernard; Morrens, Manuel; de Bruijn, Ellen R. A.

    2015-01-01

    Schizophrenia is a severe mental disorder that is highly characterized by social cognitive impairments. Most studies investigating these impairments focus on one specific social domain such as emotion recognition. However, in daily life, processing complex social situations relies on the combination of several social cognitive and affective processes simultaneously rather than one process alone. A modified version of the economically based Ultimatum Game was used to measure the interplay between fairness, intentionality, and emotion considerations during social decision-making. In this task, participants accept or reject fair and unfair monetary offers proposed intentionally or unintentionally by either angry, happy, neutral, or sad proposers. Behavioral data were collected from a group of schizophrenia patients (N = 35) and a group of healthy individuals (N = 30). Like healthy participants, schizophrenia patients differentiated between fair and unfair offers by rejecting unfair offers more often than fair offers. However, overall, patients rejected more fair offers, indicating that their construct of fairness operates within different margins. In both groups, intentional unfair offers were rejected more often than unintentional ones, indicating a normal integration of intentionality considerations in schizophrenia. Importantly, healthy subjects also differentiated between proposers' emotions when rejecting unfair offers (more rejections from proposers depicting angry faces compared to proposers depicting happy, neutral, or sad faces). Schizophrenia patients' decision behavior, on the other hand, was not affected by the proposers' emotions. The current study thus shows that schizophrenia patients have specific problems with processing and integrating emotional information. Importantly, the finding that patients display normal fairness and intentionality considerations emphasizes preservation of central social cognitive processes in schizophrenia. PMID

  17. Processing of face identity in the affective flanker task: a diffusion model analysis.

    PubMed

    Mueller, Christina J; Kuchinke, Lars

    2016-11-01

    Affective flanker tasks often present affective facial expressions as stimuli. However, it is not clear whether the identity of the person in the target picture needs to be the same for the flanker stimuli or whether it is better to use pictures of different persons as flankers. While Grose-Fifer, Rodrigues, Hoover & Zottoli (Advances in Cognitive Psychology 9(2):81-91, 2013) state that attentional focus might be captured by processing the differences between faces, i.e. the identity, and therefore use pictures of the same individual as target and flanker stimuli, Munro, Dywan, Harris, McKee, Unsal & Segalowitz (Biological Psychology, 76:31-42, 2007) propose an advantage in presenting pictures of a different individual as flankers. They state that participants might focus only on small visual changes when targets and flankers are from the same individual instead of processing the affective content of the stimuli. The present study manipulated face identity in a between-subject design. Through investigation of behavioral measures as well as diffusion model parameters, we conclude that both types of flankers work equally efficiently. This result seems best supported by recent accounts that propose an advantage of emotional processing over identity processing in face recognition. In the present study, there is no evidence that processing of face identity attracts sufficient attention to interfere with the affective evaluation of the target and flanker faces.

  18. Colored halos around faces and emotion-evoked colors: a new form of synesthesia.

    PubMed

    Ramachandran, Vilayanur S; Miller, Luke; Livingstone, Margaret S; Brang, David

    2012-01-01

    The claim that some individuals see colored halos or auras around faces has long been part of popular folklore. Here we report on a 23-year-old man (subject TK) diagnosed with Asperger's disorder, who began to consistently experience colors around individuals at the age of 10. TK's colors are based on the individual's identity and emotional connotation. We interpret these experiences as a form of synesthesia, and confirm their authenticity through a target detection paradigm. Additionally, we investigate TK's claim that emotions evoke highly specific colors, allowing him, despite his Asperger's, to introspect on emotions and recognize them in others.

  19. Effects of short-term quetiapine treatment on emotional processing, sleep and circadian rhythms.

    PubMed

    Rock, Philippa L; Goodwin, Guy M; Wulff, Katharina; McTavish, Sarah F B; Harmer, Catherine J

    2016-03-01

    Quetiapine is an atypical antipsychotic that can stabilise mood from any index episode of bipolar disorder. This study investigated the effects of seven-day quetiapine administration on sleep, circadian rhythms and emotional processing in healthy volunteers. Twenty healthy volunteers received 150 mg quetiapine XL for seven nights and 20 matched controls received placebo. Sleep-wake actigraphy was completed for one week both pre-dose and during drug treatment. On Day 8, participants completed emotional processing tasks. Actigraphy revealed that quetiapine treatment increased sleep duration and efficiency, delayed final wake time and had a tendency to reduce within-day variability. There were no effects of quetiapine on subjective ratings of mood or energy. Quetiapine-treated participants showed diminished bias towards positive words and away from negative words during recognition memory. Quetiapine did not significantly affect facial expression recognition, emotional word categorisation, emotion-potentiated startle or emotional word/faces dot-probe vigilance reaction times. These changes in sleep timing and circadian rhythmicity in healthy volunteers may be relevant to quetiapine's therapeutic actions. Effects on emotional processing did not emulate the effects of antidepressants. The effects of quetiapine on sleep and circadian rhythms in patients with bipolar disorder merit further investigation to elucidate its mechanisms of action. © The Author(s) 2016.

  20. The emotion seen in a face can be a methodological artifact: The process of elimination hypothesis.

    PubMed

    DiGirolamo, Marissa A; Russell, James A

    2017-04-01

    The claim that certain facial expressions signal certain specific emotions has been supported by high observer agreement in labeling the emotion predicted for that expression. Our hypothesis was that, with a method common to the field, high observer agreement can be achieved through a process of elimination: As participants move from trial to trial and they encounter a type of expression not previously encountered in the experiment, they tend to eliminate labels they have already associated with expressions seen on previous trials; they then select among labels not previously used. Seven experiments (total N = 1,068) here showed that the amount of agreement can be altered through a process of elimination. One facial expression not previously theorized to signal any emotion was consensually labeled as disgusted (76%), annoyed (85%), playful (89%), and mischievous (96%). Three quite different facial expressions were labeled nonplussed (82%, 93%, and 82%). A prototypical sad expression was labeled disgusted (55%), and a prototypical fear expression was labeled surprised (55%). A facial expression was labeled with a made-up word (tolen; 53%). Similar results were obtained both in a context focused on demonstrating a process of elimination and in one similar to a commonly used method, with 4 target expressions embedded with other expressions in 24 randomly ordered trials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).