Interference among the Processing of Facial Emotion, Face Race, and Face Gender
Li, Yongna; Tse, Chi-Shing
2016-01-01
People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621
Ferrucci, Roberta; Giannicola, Gaia; Rosa, Manuela; Fumagalli, Manuela; Boggio, Paulo Sergio; Hallett, Mark; Zago, Stefano; Priori, Alberto
2012-01-01
Some evidence suggests that the cerebellum participates in the complex network processing emotional facial expressions. To evaluate the role of the cerebellum in recognising facial expressions, we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p = .0021; cathodal tDCS, p = .018), but left the processing of positive and neutral facial expressions unchanged (p > .05). tDCS over the right prefrontal cortex left the processing of both negative and positive facial expressions unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.
Enhanced subliminal emotional responses to dynamic facial expressions.
Sato, Wataru; Kubota, Yasutaka; Toichi, Motomi
2014-01-01
Emotional processing without conscious awareness plays an important role in human social interaction. Several behavioral studies reported that subliminal presentation of photographs of emotional facial expressions induces unconscious emotional processing. However, it was difficult to elicit strong and robust effects using this method. We hypothesized that dynamic presentations of facial expressions would enhance subliminal emotional effects and tested this hypothesis with two experiments. Fearful or happy facial expressions were presented dynamically or statically in either the left or the right visual field for 20 (Experiment 1) and 30 (Experiment 2) ms. Nonsense target ideographs were then presented, and participants reported their preference for them. The results consistently showed that dynamic presentations of emotional facial expressions induced more evident emotional biases toward subsequent targets than did static ones. These results indicate that dynamic presentations of emotional facial expressions induce more evident unconscious emotional processing.
Balconi, Michela; Canavesio, Ylenia
2016-01-01
The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assumed that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.
von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L
2015-04-01
Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both tasks (Facially Expressed Emotion Labelling (FEEL) task P < 0·001; left/right judgment task P < 0·001). Participants who were more accurate at one task were also more accurate at the other, regardless of group (P < 0·001, r(2) = 0·523). Participants with chronic facial pain were worse than controls at both the FEEL emotion recognition task and the left/right facial expression task and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.
Attention to emotion and non-Western faces: revisiting the facial feedback hypothesis.
Dzokoto, Vivian; Wallace, David S; Peters, Laura; Bentsi-Enchill, Esi
2014-01-01
In a modified replication of Strack, Martin, and Stepper's demonstration of the Facial Feedback Hypothesis (1988), we investigated the effect of attention to emotion on the facial feedback process in a non-western cultural setting. Participants, recruited from two universities in Ghana, West Africa, gave self-reports of their perceived levels of attention to emotion, and then completed cartoon-rating tasks while randomly assigned to smiling, frowning, or neutral conditions. While participants with low Attention to Emotion scores displayed the usual facial feedback effect (rating cartoons as funnier when in the smiling compared to the frowning condition), the effect was not present in individuals with high Attention to Emotion. The findings indicate that (1) the facial feedback process can occur in contexts beyond those in which the phenomenon has previously been studied, and (2) aspects of emotion regulation, such as Attention to Emotion can interfere with the facial feedback process.
ERIC Educational Resources Information Center
Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun
2011-01-01
The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ and age matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity or by facial identity…
Processing of subliminal facial expressions of emotion: a behavioral and fMRI study.
Prochnow, D; Kossack, H; Brunheim, S; Müller, K; Wittsack, H-J; Markowitsch, H-J; Seitz, R J
2013-01-01
The recognition of emotional facial expressions is an important means to adjust behavior in social interactions. As facial expressions widely differ in their duration and degree of expressiveness, they often manifest as short and transient expressions below the level of awareness. In this combined behavioral and fMRI study, we aimed to examine whether emotional facial expressions that are not consciously accessible (subliminal) influence empathic judgments, and which brain activations are related to this. We hypothesized that subliminal facial expressions of emotion masked with neutral expressions of the same faces induce empathic processing similar to that of consciously accessible (supraliminal) facial expressions. Our behavioral data in 23 healthy subjects showed that subliminal emotional facial expressions of 40 ms duration affect the judgments of the subsequent neutral facial expressions. The fMRI study in 12 healthy subjects found that both supra- and subliminal emotional facial expressions shared a widespread network of brain areas including the fusiform gyrus, the temporo-parietal junction, and the inferior, dorsolateral, and medial frontal cortex. Compared with subliminal facial expressions, supraliminal facial expressions led to greater activation of left occipital and fusiform face areas. We conclude that masked subliminal emotional information is suited to trigger processing in brain areas that have been implicated in empathy and, thereby, in social encounters.
Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue
2009-06-15
Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.
Ventura, Joseph; Wood, Rachel C.; Jimenez, Amy M.; Hellemann, Gerhard S.
2014-01-01
Background: In schizophrenia patients, one of the most commonly studied deficits of social cognition is emotion processing (EP), which has documented links to facial recognition (FR). But how are deficits in facial recognition linked to emotion processing deficits? Can neurocognitive and symptom correlates of FR and EP help differentiate the unique contribution of FR to the domain of social cognition? Methods: A meta-analysis of 102 studies (combined n = 4826) in schizophrenia patients was conducted to determine the magnitude and pattern of relationships between facial recognition, emotion processing, neurocognition, and type of symptom. Results: Meta-analytic results indicated that facial recognition and emotion processing are strongly interrelated (r = .51). In addition, the relationship between FR and EP through voice prosody (r = .58) is as strong as the relationship between FR and EP based on facial stimuli (r = .53). Further, the relationship between emotion recognition, neurocognition, and symptoms is independent of the emotion processing modality (facial stimuli and voice prosody). Discussion: The association between FR and EP that occurs through voice prosody suggests that FR is a fundamental cognitive process. The observed links between FR and EP might be due to bottom-up associations between neurocognition and EP, and not simply because most emotion recognition tasks use visual facial stimuli. In addition, links with symptoms, especially negative symptoms and disorganization, suggest possible symptom mechanisms that contribute to FR and EP deficits. PMID:24268469
Deeper than skin deep - The effect of botulinum toxin-A on emotion processing.
Baumeister, J-C; Papa, G; Foroni, F
2016-08-01
The effect of facial botulinum toxin-A (BTX) injections on the processing of emotional stimuli was investigated. The hypothesis that BTX would interfere with the processing of slightly emotional stimuli, but less with very emotional or neutral stimuli, was largely confirmed. BTX users rated slightly emotional sentences and facial expressions, but not very emotional or neutral ones, as less emotional after the treatment. Furthermore, they became slower at categorizing slightly emotional facial expressions under time pressure. Copyright © 2016 Elsevier Ltd. All rights reserved.
Putting the face in context: Body expressions impact facial emotion processing in human infants.
Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias
2016-06-01
Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea
2017-04-01
Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.
Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo
2016-01-01
Altered emotional processing, including reduced emotional facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. We aimed to investigate possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients than in healthy controls (all Ps < 0.05). Patients also yielded a worse Ekman global score and worse disgust, sadness, and fear sub-scores than healthy controls (all Ps < 0.001). Altered facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). These results provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.
Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola
2014-01-01
Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand whether individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models, since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643
From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome
ERIC Educational Resources Information Center
Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques
2009-01-01
Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…
Altering sensorimotor feedback disrupts visual discrimination of facial expressions.
Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula
2016-08-01
Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual, and not just conceptual, processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.
Implicit and explicit processing of emotional facial expressions in Parkinson's disease.
Wagenbreth, Caroline; Wattenberg, Lena; Heinze, Hans-Jochen; Zaehle, Tino
2016-04-15
Besides motor problems, Parkinson's disease (PD) is associated with detrimental emotional and cognitive functioning. Deficient explicit emotional processing has been observed, whilst patients also show impaired Theory of Mind (ToM) abilities. However, it is unclear whether this ToM deficit in PD patients is based on an inability to infer others' emotional states or whether it is due to explicit emotional processing deficits. We investigated implicit and explicit emotional processing in PD with an affective priming paradigm in which we used pictures of human eyes as emotional primes and a lexical decision task (LDT) with emotionally connoted words as target stimuli. First, sixteen PD patients and sixteen matched healthy controls performed an LDT combined with an emotional priming paradigm providing emotional information through the facial eye region, to assess implicit emotional processing. Second, participants explicitly evaluated the emotional status of the eyes and words used in the implicit task. Compared to controls, implicit emotional processing abilities were generally preserved in PD, with, however, considerable alterations in happiness and disgust processing. Furthermore, we observed a general impairment of patients in the explicit evaluation of emotional stimuli, which was augmented for the rating of facial expressions. This is the first study reporting results for affective priming with facial eye expressions in PD patients. Our findings indicate largely preserved implicit emotional processing, with specific altered processing of disgust and happiness. Explicit emotional processing was considerably impaired for semantic and especially for facial stimulus material. Poor ToM abilities in PD patients might be based on deficient explicit emotional processing, with a preserved ability to implicitly infer other people's feelings. Copyright © 2016 Elsevier B.V. All rights reserved.
Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki
2013-09-01
This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically-developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.
ERIC Educational Resources Information Center
Spangler, Sibylle M.; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna
2010-01-01
Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding…
Decoding facial blends of emotion: visual field, attentional and hemispheric biases.
Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I
2013-12-01
Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.
Wojciechowski, Jerzy; Stolarski, Maciej; Matthews, Gerald
2014-01-01
Processing facial emotion, especially mismatches between facial and verbal messages, is believed to be important in the detection of deception. For example, emotional leakage may accompany lying. Individuals with superior emotion perception abilities may then be more adept at detecting deception by identifying mismatches between facial and verbal messages. Two personal factors that may predict such abilities are female gender and high emotional intelligence (EI). However, evidence on the role of gender and EI in the detection of deception is mixed. A key issue is that the facial processing skills required to detect deception may not be the same as those required to identify facial emotion. To test this possibility, we developed a novel facial processing task, the Face Decoding Test (FDT), which requires the detection of inconsistencies between facial and verbal cues to emotion. We hypothesized that gender and ability EI would be related to performance when cues were inconsistent. We also hypothesized that gender effects would be mediated by EI, because women tend to score as more emotionally intelligent on ability tests. Data were collected from 210 participants. Analyses of the FDT suggested that EI was correlated with superior face decoding in all conditions. We also confirmed the expected gender difference, the superiority of high-EI individuals, and the mediation hypothesis. Also, EI was more strongly associated with facial decoding performance in women than in men, implying there may be gender differences in strategies for processing affective cues. It is concluded that the integration of emotional and cognitive cues may be a core attribute of EI that contributes to the detection of deception. PMID:24658500
Shyness and Emotion-Processing Skills in Preschoolers: A 6-Month Longitudinal Study
ERIC Educational Resources Information Center
Strand, Paul S.; Cerna, Sandra; Downs, Andrew
2008-01-01
The present study utilized a short-term longitudinal research design to examine the hypothesis that shyness in preschoolers is differentially related to different aspects of emotion processing. Using teacher reports of shyness and performance measures of emotion processing, including (1) facial emotion recognition, (2) non-facial emotion…
Effects of facial color on the subliminal processing of fearful faces.
Nakajima, K; Minami, T; Nakauchi, S
2015-12-03
Recent studies have suggested that both configural information, such as face shape, and surface information is important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases: "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. On the other hand, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated that there was a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that the bluish-colored faces increased the latency effect of facial expressions compared to the natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.
Impaired recognition of happy facial expressions in bipolar disorder.
Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M
2014-08-01
The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired. However, patients' specific ability to judge happy expressions under time constraints was impaired. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.
Spapé, M M; Harjunen, Ville; Ravaja, N
2017-03-01
Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity. Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional stimuli, suggesting altered affective stimulus processing. As, however, these studies approached emotion from a purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more general feature such as salience. Here, we investigated how simple tactile primes modulate event related potentials (ERPs), facial EMG and cardiac response to pictures of facial expressions of emotion. All measures replicated known effects of emotional face processing: Disgust and fear modulated early ERPs, anger increased the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch may additively affect general stimulus processing, but it does not bias or modulate immediate affective evaluation. Copyright © 2017. Published by Elsevier B.V.
Balconi, Michela; Ferrari, Chiara
2012-01-01
The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, yet it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotion. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness) and a neutral expression under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate alpha power. Moreover, increased right frontal activity was found for negative emotions vs. an increased left-sided response for positive emotions. The significance of facial expressions was adduced to elucidate the different cortical responses to emotional types. PMID:24962767
The Relationship between Processing Facial Identity and Emotional Expression in 8-Month-Old Infants
ERIC Educational Resources Information Center
Schwarzer, Gudrun; Jovanovic, Bianca
2010-01-01
In Experiment 1, it was investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a…
Daly, Eileen M; Deeley, Quinton; Ecker, Christine; Craig, Michael; Hallahan, Brian; Murphy, Clodagh; Johnston, Patrick; Spain, Debbie; Gillan, Nicola; Brammer, Michael; Giampietro, Vincent; Lamar, Melissa; Page, Lisa; Toal, Fiona; Cleare, Anthony; Surguladze, Simon; Murphy, Declan G M
2012-10-01
Background: People with autism spectrum disorders (ASDs) have lifelong deficits in social behavior and differences in behavioral as well as neural responses to facial expressions of emotion. The biological basis of this is incompletely understood, but it may include differences in the role of neurotransmitters such as serotonin, which modulate facial emotion processing in health. While some individuals with ASD have significant differences in the serotonin system, to our knowledge, no one has investigated its role during facial emotion processing in adults with ASD and control subjects using acute tryptophan depletion (ATD) and functional magnetic resonance imaging. Objective: To compare the effects of ATD on brain responses to primary facial expressions of emotion in men with ASD and healthy control subjects. Design: Double-blind, placebo-controlled, crossover trial of ATD and functional magnetic resonance imaging to measure brain activity during incidental processing of disgust, fearful, happy, and sad facial expressions. Setting: Institute of Psychiatry, King's College London, and South London and Maudsley National Health Service Foundation Trust, England. Participants: Fourteen men of normal intelligence with autism and 14 control subjects who did not significantly differ in sex, age, or overall intelligence. Main outcome measure: Blood oxygenation level-dependent response to facial expressions of emotion. Results: Brain activation was differentially modulated by ATD depending on diagnostic group and emotion type within regions of the social brain network. For example, processing of disgust faces was associated with interactions in the medial frontal and lingual gyri, whereas processing of happy faces was associated with interactions in the middle frontal gyrus and putamen. Conclusions: Modulation of the processing of facial expressions of emotion by serotonin significantly differs in people with ASD compared with control subjects. The differences vary with emotion type and occur in social brain regions that have been shown to be associated with group differences in serotonin synthesis/receptor or transporter density.
Rapid processing of emotional expressions without conscious awareness.
Smith, Marie L
2012-08-01
Rapid, accurate categorization of the emotional state of our peers is of critical importance, and as such many have proposed that facial expressions of emotion can be processed without conscious awareness. Typically, studies focus selectively on fearful expressions due to their evolutionary significance, leaving the subliminal processing of other facial expressions largely unexplored. Here, I investigated the time course of processing of 3 facial expressions (fearful, disgusted, and happy) plus an emotionally neutral face, during objectively unaware and aware perception. Participants completed the challenging "which expression?" task in response to briefly presented backward-masked expressive faces. Although participants' behavioral responses did not differentiate between the emotional content of the stimuli in the unaware condition, activity over frontal and occipitotemporal (OT) brain regions indicated an emotional modulation of the neuronal response. Over frontal regions this modulation was driven by negative facial expressions and was present on all emotional trials independent of later categorization, whereas the N170 component, recorded at lateral OT electrodes, was enhanced for all facial expressions but only on trials that would later be categorized as emotional. The results indicate that emotional faces, not only fearful ones, are processed without conscious awareness at an early stage, and they highlight the critical importance of considering the categorization response when studying subliminal perception.
Sassenrath, Claudia; Sassenberg, Kai; Ray, Devin G; Scheiter, Katharina; Jarodzka, Halszka
2014-01-01
Two studies examined an unexplored motivational determinant of facial emotion recognition: observer regulatory focus. It was predicted that a promotion focus would enhance facial emotion recognition relative to a prevention focus, because the attentional strategies associated with a promotion focus enhance performance on well-learned or innate tasks such as facial emotion recognition. In Study 1, a promotion or a prevention focus was experimentally induced, and better facial emotion recognition was observed under a promotion focus than under a prevention focus. In Study 2, individual differences in chronic regulatory focus were assessed, and attention allocation was measured using eye tracking during the facial emotion recognition task. Results indicated that the positive relation between a promotion focus and facial emotion recognition is mediated by shorter fixation durations on the face, which reflect a pattern of attention allocation matched to the eager strategy of a promotion focus (i.e., striving to make hits). A prevention focus had an impact on neither perceptual processing nor facial emotion recognition. Taken together, these findings demonstrate important mechanisms and consequences of observer motivational orientation for facial emotion recognition.
Dynamic Facial Expressions Prime the Processing of Emotional Prosody.
Garrido-Vásquez, Patricia; Pell, Marc D; Paulmann, Silke; Kotz, Sonja A
2018-01-01
Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been put forward: cross-modal integration and cross-modal priming. In cross-modal integration studies, visual and auditory channels are temporally aligned, while in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally-intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not significantly differ. However, N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was activated in incongruent compared to congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early point in time, and that an emotional mismatch between a facial expression and its ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated in case of emotional prime-target incongruency.
Contextual interference processing during fast categorisations of facial expressions.
Frühholz, Sascha; Trautmann-Lengsfeld, Sina A; Herrmann, Manfred
2011-09-01
We examined interference effects of emotionally associated background colours during fast valence categorisations of negative, neutral and positive expressions. According to implicitly learned colour-emotion associations, facial expressions were presented with colours that either matched the valence of these expressions or not. Experiment 1 included infrequent non-matching trials and Experiment 2 a balanced ratio of matching and non-matching trials. Besides general modulatory effects of contextual features on the processing of facial expressions, we found differential effects depending on the valence of the target facial expressions. Whereas performance accuracy was mainly affected for neutral expressions, performance speed was specifically modulated by emotional expressions, indicating some susceptibility of emotional expressions to contextual features. Experiment 3 used two further colour-emotion combinations, but revealed only marginal interference effects, most likely due to missing colour-emotion associations. The results are discussed with respect to the inherent processing demands of emotional and neutral expressions and their susceptibility to contextual interference.
Hemispheric differences in recognizing upper and lower facial displays of emotion.
Prodan, C I; Orbelo, D M; Testa, J A; Ross, E D
2001-01-01
Objective: To determine if there are hemispheric differences in processing upper versus lower facial displays of emotion. Background: Recent evidence suggests that there are two broad classes of emotions with differential hemispheric lateralization. Primary emotions (e.g., anger, fear) and associated displays are innate, are recognized across all cultures, and are thought to be modulated by the right hemisphere. Social emotions (e.g., guilt, jealousy) and associated "display rules" are learned during early child development, vary across cultures, and are thought to be modulated by the left hemisphere. Display rules are used by persons to alter, suppress or enhance primary emotional displays for social purposes. During deceitful behaviors, a subject's true emotional state is often leaked through upper rather than lower facial displays, giving rise to facial blends of emotion. We hypothesized that upper facial displays are processed preferentially by the right hemisphere, as part of the primary emotional system, while lower facial displays are processed preferentially by the left hemisphere, as part of the social emotional system. Methods: 30 strongly right-handed adult volunteers were tested tachistoscopically by randomly flashing facial displays of emotion to the right and left visual fields. The stimuli were line drawings of facial blends with different emotions displayed on the upper versus lower face. The subjects were tested under two conditions: 1) without instructions and 2) with instructions to attend to the upper face. Results: Without instructions, the subjects robustly identified the emotion displayed on the lower face, regardless of visual field presentation. With instructions to attend to the upper face, for the left visual field they robustly identified the emotion displayed on the upper face. For the right visual field, they continued to identify the emotion displayed on the lower face, but to a lesser degree. Conclusions: Our results support the hypothesis that hemispheric differences exist in the ability to process upper versus lower facial displays of emotion. Attention appears to enhance the ability to explore these hemispheric differences under experimental conditions. Our data also support the recent observation that the right hemisphere has a greater ability to recognize deceitful behaviors compared with the left hemisphere. This may be attributable to the different roles the hemispheres play in modulating social versus primary emotions and related behaviors.
Balconi, Michela; Lucchiari, Claudio
2005-02-01
Is facial expression recognition marked by specific event-related potentials (ERPs) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded when 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency measures. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing was revealed to be quite similar to aware information processing for peak amplitude but not for latency. In fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.
Mere social categorization modulates identification of facial expressions of emotion.
Young, Steven G; Hugenberg, Kurt
2010-12-01
The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Facial emotion recognition in paranoid schizophrenia and autism spectrum disorder.
Sachse, Michael; Schlitt, Sabine; Hainz, Daniela; Ciaramidaro, Angela; Walter, Henrik; Poustka, Fritz; Bölte, Sven; Freitag, Christine M
2014-11-01
Schizophrenia (SZ) and autism spectrum disorder (ASD) share deficits in emotion processing. In order to identify convergent and divergent mechanisms, we investigated facial emotion recognition in SZ, high-functioning ASD (HFASD), and typically developed controls (TD). Different degrees of task difficulty and emotion complexity (face, eyes; basic emotions, complex emotions) were used. Two Benton tests were administered in order to assess potentially confounding visuo-perceptual functioning and facial processing. Nineteen participants with paranoid SZ, 22 with HFASD and 20 TD were included, aged between 14 and 33 years. Individuals with SZ were comparable to TD in all obtained emotion recognition measures, but showed reduced basic visuo-perceptual abilities. The HFASD group was impaired in the recognition of basic and complex emotions compared to both SZ and TD. When facial identity recognition was adjusted for, group differences remained for the recognition of complex emotions only. Our results suggest that there is a SZ subgroup with predominantly paranoid symptoms that does not show problems in face processing and emotion recognition, but visuo-perceptual impairments. They also confirm the notion of a general facial and emotion recognition deficit in HFASD. No shared emotion recognition deficit was found for paranoid SZ and HFASD, emphasizing the differential cognitive underpinnings of both disorders. Copyright © 2014 Elsevier B.V. All rights reserved.
Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya
2016-01-01
A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target's emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target's emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.
Neath-Tavares, Karly N.; Itier, Roxane J.
2017-01-01
Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100–120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms. PMID:27430934
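The gaze-contingent procedure above ensures that a chosen facial feature is at fixation when the face appears. The following Python sketch outlines the control logic under stated assumptions; get_gaze is a hypothetical stand-in for a real eye-tracker read-out, and all coordinates and tolerances are illustrative, not the authors' parameters.

```python
import numpy as np

def get_gaze():
    """Hypothetical eye-tracker read-out returning (x, y) in pixels.
    A real study would use a vendor SDK; random values stand in here."""
    return np.random.normal(512, 2), np.random.normal(384, 2)

def present_gaze_contingent(feature_offset, tolerance=40, duration_frames=30):
    """Draw the face so the chosen feature (e.g., eyes or mouth) lands at
    fixation, aborting the trial if gaze drifts beyond `tolerance` pixels."""
    fix_x, fix_y = 512, 384  # fixation cross at screen centre (assumed)
    face_x = fix_x - feature_offset[0]  # shift image so feature sits at fixation
    face_y = fix_y - feature_offset[1]
    for frame in range(duration_frames):
        gx, gy = get_gaze()
        if np.hypot(gx - fix_x, gy - fix_y) > tolerance:
            return False  # gaze left fixation: abort and recycle the trial
        # draw_face(face_x, face_y) would render here in a real experiment
    return True

# Illustrative offset of the mouth relative to the image centre
ok = present_gaze_contingent(feature_offset=(0, -60))
print("trial completed" if ok else "trial aborted")
```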
Wang, Shuo; Yu, Rongjun; Tyszka, J. Michael; Zhen, Shanshan; Kovach, Christopher; Sun, Sai; Huang, Yi; Hurlemann, Rene; Ross, Ian B.; Chung, Jeffrey M.; Mamelak, Adam N.; Adolphs, Ralph; Rutishauser, Ueli
2017-01-01
The human amygdala is a key structure for processing emotional facial expressions, but it remains unclear what aspects of emotion are processed. We investigated this question with three different approaches: behavioural analysis of 3 amygdala lesion patients, neuroimaging of 19 healthy adults, and single-neuron recordings in 9 neurosurgical patients. The lesion patients showed a shift in behavioural sensitivity to fear, and amygdala BOLD responses were modulated by both fear and emotion ambiguity (the uncertainty that a facial expression is categorized as fearful or happy). We found two populations of neurons, one whose response correlated with the increasing degree of fear or happiness, and a second whose response primarily decreased as a linear function of emotion ambiguity. Together, our results indicate that the human amygdala processes both the degree of emotion in facial expressions and the categorical ambiguity of the emotion shown, and that these two aspects of amygdala processing can be most clearly distinguished at the level of single neurons. PMID:28429707
Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.
Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus
2013-12-01
Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity to examine perceptual and cognitive processing of facial expressions. Higher order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while viewing and evaluating either emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding for the motion-based intensity of facial expressions. Comparing the emotion task with the gender discrimination task revealed increased activation of the inferior parietal lobule, highlighting the involvement of parietal areas in processing high-level features of faces. Dynamic emotional stimuli may help to emphasize functions of the hypothesized 'extended' over the 'core' system for face processing.
Sonnby-Borgström, Marianne; Jönsson, Peter; Svensson, Owe
2008-04-01
Previous studies on gender differences in facial imitation and verbally reported emotional contagion have investigated emotional responses to pictures of facial expressions at supraliminal exposure times. The aim of the present study was to investigate how gender differences are related to different exposure times, representing information processing levels from subliminal (spontaneous) to supraliminal (emotionally regulated). Further, the study aimed at exploring correlations between verbally reported emotional contagion and facial responses for men and women. Masked pictures of angry, happy and sad facial expressions were presented to 102 participants (51 men) at exposure times from subliminal (23 ms) to clearly supraliminal (2500 ms). Myoelectric activity (EMG) from the corrugator and the zygomaticus was measured and the participants reported their hedonic tone (verbally reported emotional contagion) after stimulus exposures. The results showed an effect of exposure time on gender differences in facial responses as well as in verbally reported emotional contagion. Women amplified imitative responses towards happy vs. angry faces and verbally reported emotional contagion with prolonged exposure times, whereas men did not. No gender differences were detected at the subliminal or borderliminal exposure times, but at the supraliminal exposure gender differences were found in imitation as well as in verbally reported emotional contagion. Women showed correspondence between their facial responses and their verbally reported emotional contagion to a greater extent than men. The results were interpreted in terms of gender differences in emotion regulation, rather than as differences in biologically prepared emotional reactivity.
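As an illustration of how corrugator and zygomaticus responses like those above are typically quantified, here is a minimal Python sketch of a common rectify-smooth-baseline pipeline. The signal and all parameters are synthetic assumptions; the authors' exact preprocessing may differ.

```python
import numpy as np

srate = 1000  # Hz, assumed EMG sampling rate
t = np.arange(-1.0, 3.0, 1 / srate)  # 1 s baseline + 3 s post-stimulus
rng = np.random.default_rng(0)

# Synthetic corrugator EMG: noise whose amplitude rises after stimulus onset
emg = rng.normal(0, 1.0, t.size) * (1 + 0.8 * ((t > 0.5) & (t < 1.5)))

def emg_response(signal, t, srate, win=0.1):
    """Rectify, smooth with a moving average, and baseline-correct an EMG trace."""
    rectified = np.abs(signal)
    kernel = np.ones(int(win * srate)) / int(win * srate)
    smoothed = np.convolve(rectified, kernel, mode="same")
    baseline = smoothed[t < 0].mean()
    return smoothed - baseline  # positive values = activity above baseline

corrugator = emg_response(emg, t, srate)
print(f"mean post-stimulus corrugator change: {corrugator[t > 0].mean():.3f} (a.u.)")
```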
Do facial movements express emotions or communicate motives?
Parkinson, Brian
2005-01-01
This article addresses the debate between emotion-expression and motive-communication approaches to facial movements, focusing on Ekman's (1972) and Fridlund's (1994) contrasting models and their historical antecedents. Available evidence suggests that the presence of others either reduces or increases facial responses, depending on the quality and strength of the emotional manipulation and on the nature of the relationship between interactants. Although both display rules and social motives provide viable explanations of audience "inhibition" effects, some audience facilitation effects are less easily accommodated within an emotion-expression perspective. In particular, emotion is not a sufficient condition for a corresponding "expression," even discounting explicit regulation, and, apparently, "spontaneous" facial movements may be facilitated by the presence of others. Further, there is no direct evidence that any particular facial movement provides an unambiguous expression of a specific emotion. However, information communicated by facial movements is not necessarily extrinsic to emotion. Facial movements not only transmit emotion-relevant information but also contribute to ongoing processes of emotional action in accordance with pragmatic theories.
Masten, Carrie L.; Guyer, Amanda E.; Hodgdon, Hilary B.; McClure, Erin B.; Charney, Dennis S.; Ernst, Monique; Kaufman, Joan; Pine, Daniel S.; Monk, Christopher S.
2008-01-01
Objective: The purpose of this study was to examine the processing of facial emotions in a sample of maltreated children showing high rates of post-traumatic stress disorder (PTSD). Maltreatment during childhood has been associated independently with both atypical processing of emotion and the development of PTSD. However, research has provided little evidence indicating how high rates of PTSD might relate to maltreated children's processing of emotions. Method: Participants' reaction time and labeling of emotions were measured using a morphed facial emotion identification task. Participants included a diverse sample of maltreated children with and without PTSD and controls ranging in age from 8 to 15 years. Maltreated children had been removed from their homes and placed in state custody following experiences of maltreatment. Diagnoses of PTSD and other disorders were determined through a combination of parent, child, and teacher reports. Results: Maltreated children displayed faster reaction times than controls when labeling emotional facial expressions, and this result was most pronounced for fearful faces. Relative to children who were not maltreated, maltreated children both with and without PTSD showed enhanced response times when identifying fearful faces. There were no group differences in labeling accuracy across the different facial emotions. Conclusions: Maltreated children show a heightened ability to identify fearful faces, evidenced by faster reaction times relative to controls. This association between maltreatment and atypical processing of emotion is independent of PTSD diagnosis. PMID:18155144
Schuch, Stefanie; Werheid, Katja; Koch, Iring
2012-01-01
The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set "inertia") than processing the age or sex of a face.
Facial decoding in schizophrenia is underpinned by basic visual processing impairments.
Belge, Jan-Baptist; Maurage, Pierre; Mangelinckx, Camille; Leleux, Dominique; Delatte, Benoît; Constant, Eric
2017-09-01
Schizophrenia is associated with a strong deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific for emotions or due to a more general impairment for any type of facial processing. This study was designed to clarify this issue. Thirty patients suffering from schizophrenia and 30 matched healthy controls performed several tasks evaluating the recognition of both changeable (i.e. eyes orientation and emotions) and stable (i.e. gender, age) facial characteristics. Accuracy and reaction times were recorded. Schizophrenic patients presented a performance deficit (accuracy and reaction times) in the perception of both changeable and stable aspects of faces, without any specific deficit for emotional decoding. Our results demonstrate a generalized face recognition deficit in schizophrenic patients, probably caused by a perceptual deficit in basic visual processing. It seems that the deficit in the decoding of emotional facial expression (EFE) is not a specific deficit of emotion processing, but is at least partly related to a generalized perceptual deficit in lower-level perceptual processing, occurring before the stage of emotion processing, and underlying more complex cognitive dysfunctions. These findings should encourage future investigations to explore the neurophysiologic background of these generalized perceptual deficits, and stimulate a clinical approach focusing on more basic visual processing. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Face processing in chronic alcoholism: a specific deficit for emotional features.
Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P
2008-04-01
It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific for emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) basic visuo-spatial and facial identity processing; (2) simple reaction times; and (3) complex facial feature identification (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients showed preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting that this deficit is specific to emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.
Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia
2014-01-01
Facial identity and emotional expression are two important sources of information for daily social interaction. However, the link between these two aspects of face processing has been the focus of an unresolved debate for the past three decades. Three views have been advocated: (1) separate and parallel processing of identity and emotional expression signals derived from faces; (2) asymmetric processing, with the computation of emotion in faces depending on facial identity coding but not vice versa; and (3) integrated processing of facial identity and emotion. We present studies with healthy participants that primarily apply methods from mathematical psychology, formally testing the relations between the processing of facial identity and emotion. Specifically, we focused on the "Garner" paradigm, the composite face effect and the divided attention tasks. We further ask whether the architecture of face-related processes is fixed or flexible and whether (and how) it can be shaped by experience. We conclude that formal methods of testing the relations between processes show that the processing of facial identity and expressions interact, and hence are not fully independent. We further demonstrate that the architecture of the relations depends on experience, with experience leading to a higher degree of interdependence in the processing of identity and expressions. We propose that this change occurs because integrative processes are more efficient than parallel ones. Finally, we argue that the dynamic aspects of face processing need to be incorporated into theories in this field. PMID:25452722
Yankouskaya, Alla; Booth, David A; Humphreys, Glyn
2012-11-01
Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
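The violation of independent-processing predictions mentioned above refers to Miller's (1982) race-model inequality: at every time t, the redundant-target response-time CDF should not exceed the sum of the single-target CDFs if the two signals are processed independently. A minimal Python sketch with synthetic reaction times (all numbers illustrative) shows how the test is commonly computed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic reaction times (ms) for single-target and redundant-target trials
rt_identity = rng.normal(560, 60, 200)   # identity target alone
rt_emotion = rng.normal(570, 60, 200)    # emotion target alone
rt_redundant = rng.normal(500, 55, 200)  # both targets present

def ecdf(sample, t):
    """Empirical cumulative distribution function evaluated at times t."""
    return np.searchsorted(np.sort(sample), t, side="right") / sample.size

t_grid = np.linspace(300, 800, 101)
race_bound = np.minimum(ecdf(rt_identity, t_grid) + ecdf(rt_emotion, t_grid), 1.0)
violation = ecdf(rt_redundant, t_grid) - race_bound

# Positive values mean the redundant-target CDF exceeds the race-model bound
# (Miller's inequality), i.e., evidence for coactive (integrated) processing.
print(f"max violation: {violation.max():.3f} "
      f"at {t_grid[np.argmax(violation)]:.0f} ms")
```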
Comparison of emotion recognition from facial expression and music.
Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija
2011-01-01
The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro expressions), and the recognition itself does not have to be a conscious process. We hypothesized that recognition of emotions from facial expressions would be favored over recognition of emotions communicated through music. In order to compare the success rate in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey which included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music works with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because the understanding of facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for owing to the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive skills such as attention, memory and motivation. Music pieces were apparently processed differently in the brain than facial expressions and, consequently, were probably evaluated differently as relevant emotional cues.
Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo
2013-01-01
Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426
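To make the categorical-versus-dimensional contrast above concrete: dimensional (linear) coding predicts a regional response that tracks the physical morph level, whereas categorical coding predicts a response driven by how unambiguous the expression is. A minimal Python sketch with synthetic data (predictors and all numbers are illustrative assumptions, not the study's analysis) compares the two accounts by regression.

```python
import numpy as np

rng = np.random.default_rng(2)

# Morph level from 0 (clearly sad) to 1 (clearly happy), 9 steps
morph = np.linspace(0, 1, 9)
clarity = np.abs(morph - 0.5) * 2  # 1 = unambiguous, 0 = maximally ambiguous

# Synthetic ROI response that actually follows categorical (clarity) coding
roi = 2.0 * clarity + rng.normal(0, 0.2, morph.size)

def r_squared(predictor, response):
    """Fit response = a*predictor + b by least squares and return R^2."""
    X = np.column_stack([predictor, np.ones_like(predictor)])
    coef, *_ = np.linalg.lstsq(X, response, rcond=None)
    resid = response - X @ coef
    return 1 - resid.var() / response.var()

print(f"linear (dimensional) account R^2:  {r_squared(morph, roi):.2f}")
print(f"clarity (categorical) account R^2: {r_squared(clarity, roi):.2f}")
```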
Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder
Eack, Shaun M.; MAZEFSKY, CARLA A.; Minshew, Nancy J.
2014-01-01
Facial emotion perception is significantly affected in autism spectrum disorder (ASD), yet little is known about how individuals with ASD misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with ASD and 30 age- and gender-matched volunteers without ASD to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with ASD. In particular, adults with ASD uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to non-emotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with ASD. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in ASD. PMID:24535689
Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian
2017-04-01
Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.
Greater perceptual sensitivity to happy facial expression.
Maher, Stephen; Ekstrom, Tor; Chen, Yue
2014-01-01
Perception of subtle facial expressions is essential for social functioning; yet it is unclear if human perceptual sensitivities differ in detecting varying types of facial emotions. Evidence diverges as to whether salient negative versus positive emotions (such as sadness versus happiness) are preferentially processed. Here, we measured perceptual thresholds for the detection of four types of emotion in faces (happiness, fear, anger, and sadness) using psychophysical methods. We also evaluated the association of the perceptual performances with facial morphological changes between neutral and respective emotion types. Human observers were highly sensitive to happiness compared with the other emotional expressions. Further, this heightened perceptual sensitivity to happy expressions can be attributed largely to the emotion-induced morphological change of a particular facial feature (end-lip raise).
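Perceptual thresholds of the kind measured above are commonly estimated by fitting a psychometric function to detection accuracy across stimulus intensities. A minimal Python sketch with synthetic data (the logistic form and all numbers are illustrative assumptions, not the authors' procedure):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic detection data: proportion correct at each expression intensity
intensity = np.array([0.05, 0.10, 0.15, 0.20, 0.30, 0.40])
p_correct = np.array([0.52, 0.60, 0.74, 0.85, 0.95, 0.98])

def logistic(x, threshold, slope):
    """Psychometric function rising from chance (0.5) to ceiling (1.0);
    at x = threshold the predicted proportion correct is 0.75."""
    return 0.5 + 0.5 / (1 + np.exp(-slope * (x - threshold)))

(threshold, slope), _ = curve_fit(logistic, intensity, p_correct,
                                  p0=[0.15, 20.0])
print(f"estimated 75%-correct threshold: {threshold:.3f} (morph intensity)")
```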
Uono, Shota; Sato, Wataru; Toichi, Motomi
2010-03-01
Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of expressed emotion in 13 individuals with PDD and 13 typically developing controls. We presented dynamic and static emotional (fearful and happy) expressions. Participants were asked to match a changeable emotional face display with the last presented image. The results showed that both groups perceived the last image of dynamic facial expression to be more emotionally exaggerated than the static facial expression. This finding suggests that individuals with PDD have an intact perceptual mechanism for processing dynamic information in another individual's face.
Bourne, Victoria J; Vladeanu, Matei
2011-04-01
Recent neuropsychological studies have attempted to distinguish between different types of anxiety by contrasting patterns of brain organisation or activation; however, lateralisation for processing emotional stimuli has received relatively little attention. This study examines the relationship between strength of lateralisation for the processing of facial expressions of emotion and three measures of anxiety: state anxiety, trait anxiety and social anxiety. Across all six of the basic emotions (anger, disgust, fear, happiness, sadness, surprise) the same patterns of association were found. Participants with high levels of trait anxiety were more strongly lateralised to the right hemisphere for processing facial emotion. In contrast, participants with high levels of self-reported physiological arousal in response to social anxiety were more weakly lateralised to the right hemisphere, or even lateralised to the left hemisphere, for the processing of facial emotion. There were also sex differences in these associations: the relationships were evident for males only. The finding of distinct patterns of lateralisation for trait anxiety and self-reported physiological arousal suggests different neural circuitry for trait and social anxiety. Copyright © 2011. Published by Elsevier Ltd.
Cognitive penetrability and emotion recognition in human facial expressions
Marchi, Francesco
2015-01-01
Do our background beliefs, desires, and mental images influence our perceptual experience of the emotions of others? In this paper, we will address the possibility of cognitive penetration (CP) of perceptual experience in the domain of social cognition. In particular, we focus on emotion recognition based on the visual experience of facial expressions. After introducing the current debate on CP, we review examples of perceptual adaptation for facial expressions of emotion. This evidence supports the idea that facial expressions are perceptually processed as wholes. That is, the perceptual system integrates lower-level facial features, such as eyebrow orientation, mouth angle etc., into facial compounds. We then present additional experimental evidence showing that in some cases, emotion recognition on the basis of facial expression is sensitive to and modified by the background knowledge of the subject. We argue that such sensitivity is best explained as a difference in the visual experience of the facial expression, not just as a modification of the judgment based on this experience. The difference in experience is characterized as the result of the interference of background knowledge with the perceptual integration process for faces. Thus, according to the best explanation, we have to accept CP in some cases of emotion recognition. Finally, we discuss a recently proposed mechanism for CP in the face-based recognition of emotion. PMID:26150796
Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.
Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun
2016-07-01
The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. ALS patients perceived facial emotions significantly worse than healthy controls did [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile of the controls' scores for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS to include emotion processing dysfunction.
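For illustration, a group comparison and percentile cutoff of the kind reported above can be computed as follows. The scores are synthetic stand-ins generated from the reported means and SDs, and Welch's t-test is one reasonable choice rather than necessarily the authors' statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic facial-emotion recognition accuracies (%) for the two groups
als = rng.normal(65.2, 18.0, 24)
controls = rng.normal(77.1, 6.6, 24)

# Welch's t-test (does not assume equal variances, which clearly differ here)
t, p = stats.ttest_ind(als, controls, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")

# Proportion of patients falling below the controls' 5th percentile
cutoff = np.percentile(controls, 5)
impaired = (als < cutoff).mean()
print(f"cutoff = {cutoff:.1f}%, impaired patients: {impaired:.0%}")
```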
[Impact of facial emotional recognition alterations in Dementia of the Alzheimer type].
Rubinstein, Wanda; Cossini, Florencia; Politis, Daniel
2016-07-01
The recognition of basic facial emotions can be affected independently of other deficits in dementia of the Alzheimer type. Among these deficits, there is disagreement about which emotions are more difficult to recognize. Our aim was to study the presence of alterations in the facial recognition of basic emotions, and to investigate whether there were differences in the recognition of each type of emotion, in Alzheimer's disease. Using three tests of basic facial emotion recognition, we evaluated 29 patients who had been diagnosed with dementia of the Alzheimer type and 18 control subjects. Significant differences between patients and controls were obtained on the basic facial emotion recognition tests, as well as between the individual emotions. Since the amygdala, one of the brain structures responsible for emotional reactions, is affected in the early stages of this disease, our findings are relevant for understanding how this alteration of the process of emotional recognition contributes to the difficulties these patients have in interpersonal relations and to their behavioral disturbances.
Subliminal Face Emotion Processing: A Comparison of Fearful and Disgusted Faces
Khalid, Shah; Ansorge, Ulrich
2017-01-01
Prior research has provided evidence for (1) subcortical processing of subliminal facial expressions of emotion and (2) for the emotion-specificity of these processes. Here, we investigated if this is also true for the processing of the subliminal facial display of disgust. In Experiment 1, we used differently filtered masked prime faces portraying emotionally neutral or disgusted expressions presented prior to clearly visible target faces to test if the masked primes exerted an influence on target processing nonetheless. Whereas we found evidence for subliminal face congruence or priming effects, in particular, reverse priming by low spatial frequencies disgusted face primes, we did not find any support for a subcortical origin of the effect. In Experiment 2, we compared the influence of subliminal disgusted faces with that of subliminal fearful faces and demonstrated a behavioral performance difference between the two, pointing to an emotion-specific processing of the disgusted facial expressions. In both experiments, we also tested for the dependence of the subliminal emotional face processing on spatial attention – with mixed results, suggesting an attention-independence in Experiment 1 but not in Experiment 2 –, and we found perfect masking of the face primes – that is, proof of the subliminality of the prime faces. Based on our findings, we speculate that subliminal facial expressions of disgust could afford easy avoidance of these faces. This could be a unique effect of disgusted faces as compared to other emotional facial displays, at least under the conditions studied here. PMID:28680413
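The differently filtered primes described above are typically produced by splitting an image into low and high spatial frequency content. A minimal Python sketch using scipy (the Gaussian cutoff is an illustrative assumption; studies usually specify filters in cycles per face width):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)
face = rng.random((256, 256))  # stand-in for a grayscale face image

# Low spatial frequencies: Gaussian blur keeps only coarse structure.
# The sigma (in pixels) sets the effective cutoff.
lsf = gaussian_filter(face, sigma=8)

# High spatial frequencies: the residual fine detail
hsf = face - lsf

print(f"LSF variance: {lsf.var():.4f}, HSF variance: {hsf.var():.4f}")
```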
Emotion processing deficits in alexithymia and response to a depth of processing intervention.
Constantinou, Elena; Panayiotou, Georgia; Theodorou, Marios
2014-12-01
Findings on alexithymic emotion difficulties have been inconsistent. We examined potential differences between alexithymic and control participants in general arousal, reactivity, facial and subjective expression, emotion labeling, and covariation between emotion response systems. A depth of processing intervention was introduced. Fifty-four participants (27 alexithymic), selected using the Toronto Alexithymia Scale-20, completed an imagery experiment (imagining joy, fear and neutral scripts), under instructions for shallow or deep emotion processing. Heart rate, skin conductance, facial electromyography and startle reflex were recorded along with subjective ratings. Results indicated hypo-reactivity to emotion among high alexithymic individuals, smaller and slower startle responses, and low covariation between physiology and self-report. No deficits in facial expression, labeling and emotion ratings were identified. Deep processing was associated with increased physiological reactivity and lower perceived dominance and arousal in high alexithymia. Findings suggest a tendency for avoidance of intense, unpleasant emotions and less defensive action preparation in alexithymia. Copyright © 2014 Elsevier B.V. All rights reserved.
Long-term academic stress enhances early processing of facial expressions.
Zhang, Liang; Qin, Shaozheng; Yao, Zhuxi; Zhang, Kan; Wu, Jianhui
2016-11-01
Exposure to long-term stress can lead to a variety of emotional and behavioral problems. Although widely investigated, the neural basis of how long-term stress impacts emotional processing in humans remains largely elusive. Using event-related brain potentials (ERPs), we investigated the effects of long-term stress on the neural dynamics of emotional facial expression processing. Thirty-nine male college students undergoing preparation for a major examination and twenty-one matched controls performed a gender discrimination task for faces displaying angry, happy, and neutral expressions. The results of the Perceived Stress Scale showed that participants in the stress group perceived higher levels of long-term stress relative to the control group. ERP analyses revealed differential effects of long-term stress on two early stages of facial expression processing: 1) long-term stress generally augmented posterior P1 amplitudes to facial stimuli irrespective of expression valence, suggesting that stress can increase sensitization to visual inputs in general, and 2) long-term stress selectively augmented fronto-central P2 amplitudes for angry but not for neutral or positive facial expressions, suggesting that stress may lead to increased attentional prioritization to processing negative emotional stimuli. Together, our findings suggest that long-term stress has profound impacts on the early stages of facial expression processing, with an increase at the very early stage of general information inputs and a subsequent attentional bias toward processing emotionally negative stimuli. Copyright © 2016 Elsevier B.V. All rights reserved.
Jessen, Sarah; Grossmann, Tobias
2017-01-01
Enhanced attention to fear expressions in adults is primarily driven by information from low as opposed to high spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants (N = 26). Our results revealed that infants' brains discriminated between emotional facial expressions containing high but not between expressions containing low spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and happy and fearful faces containing low spatial frequencies. Our results demonstrate that, already in infancy, spatial frequency content influences the processing of facial emotions. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting a robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent upon frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.
Alfimova, M V; Golimbet, V E; Korovaitseva, G I; Lezheiko, T V; Abramova, L I; Aksenova, E V; Bolgov, M I
2014-01-01
The 5-HTTLPR SLC6A4 and catechol-O-methyltransferase (COMT) Val158Met polymorphisms are reported to be associated with the processing of facial expressions in the general population. Impaired recognition of facial expressions, which is characteristic of schizophrenia, negatively impacts the social adaptation of patients. To search for molecular mechanisms of this deficit, we studied main and epistatic effects of the 5-HTTLPR and Val158Met polymorphisms on facial emotion recognition in patients with schizophrenia (n=299) and healthy controls (n=232). The 5-HTTLPR polymorphism was associated with emotion recognition in patients: ll-homozygotes recognized facial emotions significantly better than carriers of an s-allele (F=8.00; p=0.005). Although the recognition of facial emotions was correlated with negative symptoms, verbal learning and trait anxiety, these variables did not significantly modify the association. In both groups, no effect of COMT genotype on the recognition of facial emotions was found.
Binelli, C; Subirà, S; Batalla, A; Muñiz, A; Sugranyés, G; Crippa, J A; Farré, M; Pérez-Jurado, L; Martín-Santos, R
2014-11-01
Social Anxiety Disorder (SAD) and Williams-Beuren Syndrome (WS) are two conditions which seem to be at opposite ends in the continuum of social fear but show compromised abilities in some overlapping areas, including some social interactions, gaze contact and processing of facial emotional cues. The increase in the number of neuroimaging studies has greatly expanded our knowledge of the neural bases of facial emotion processing in both conditions. However, to date, SAD and WS have not been compared. We conducted a systematic review of functional magnetic resonance imaging (fMRI) studies comparing SAD and WS cases to healthy control participants (HC) using facial emotion processing paradigms. Two researchers conducted comprehensive PubMed/Medline searches to identify all fMRI studies of facial emotion processing in SAD and WS. The following search key-words were used: "emotion processing"; "facial emotion"; "social anxiety"; "social phobia"; "Williams syndrome"; "neuroimaging"; "functional magnetic resonance"; "fMRI" and their combinations, as well as terms specifying individual facial emotions. We extracted spatial coordinates from each study and conducted two separate voxel-wise activation likelihood estimation meta-analyses, one for SAD and one for WS. Twenty-two studies met the inclusion criteria: 17 studies of SAD and five of WS. We found evidence for both common and distinct patterns of neural activation. Limbic engagement was common to SAD and WS during facial emotion processing, although we observed opposite patterns of activation for each disorder. Compared to HC, SAD cases showed hyperactivation of the amygdala, the parahippocampal gyrus and the globus pallidus. Compared to controls, participants with WS showed hypoactivation of these regions. Differential activation in a number of regions specific to either condition was also identified: SAD cases exhibited greater activation of the insula, putamen, the superior temporal gyrus, medial frontal regions and the cuneus, while WS subjects showed decreased activation in the inferior region of the parietal lobule. The identification of limbic structures as a shared correlate and the patterns of activation observed for each condition may reflect the aberrant patterns of facial emotion processing that the two conditions share, and may contribute to explaining part of the underlying neural substrate of exaggerated/diminished fear responses to social cues that characterize SAD and WS respectively. We believe that insights from WS and the inclusion of this syndrome as a control group in future experimental studies may improve our understanding of the neural correlates of social fear in general, and of SAD in particular. Copyright © 2014 Elsevier Ltd. All rights reserved.
Selective attention modulates early human evoked potentials during emotional face-voice processing.
Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A
2015-04-01
Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective: one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.
Facial responsiveness of psychopaths to the emotional expressions of others
Mokros, Andreas; Olderbak, Sally; Wilhelm, Oliver
2018-01-01
Psychopathic individuals show selfish, manipulative, and antisocial behavior in addition to emotional detachment and reduced empathy. Their empathic deficits are thought to be associated with a reduced responsiveness to emotional stimuli. Immediate facial muscle responses to the emotional expressions of others reflect the expressive part of emotional responsiveness and are positively related to trait empathy. Empirical evidence for reduced facial muscle responses in adult psychopathic individuals to the emotional expressions of others is rare. In the present study, 261 male criminal offenders and non-offenders categorized dynamically presented facial emotion expressions (angry, happy, sad, and neutral) during facial electromyography recording of their corrugator muscle activity. We replicated a measurement model of facial muscle activity, which controls for general facial responsiveness to face stimuli, and modeled three correlated emotion-specific factors (i.e., anger, happiness, and sadness) representing emotion-specific activity. In a multi-group confirmatory factor analysis, we compared the means of the anger, happiness, and sadness latent factors between three groups: (1) non-offenders, (2) offenders low in psychopathy, and (3) offenders high in psychopathy. There were no significant mean differences between groups. Our results challenge current theories that focus on deficits in emotional responsiveness as leading to the development of psychopathy and encourage further theoretical development on deviant emotional processes in psychopathic individuals. PMID:29324826
Enjoying vs. smiling: Facial muscular activation in response to emotional language.
Fino, Edita; Menegatti, Michela; Avenanti, Alessio; Rubini, Monica
2016-07-01
The present study examined whether emotionally congruent facial muscular activation, a somatic index of emotional language embodiment, can be elicited by reading subject-verb sentences composed of action verbs, which refer directly to facial expressions (e.g., Mario smiles), but also by reading more abstract state verbs, which provide more direct access to the emotions felt by the agent (e.g., Mario enjoys). To address this issue, we measured facial electromyography (EMG) while participants evaluated state and action verb sentences. We found that emotional sentences including both verb categories had valence-congruent effects on emotional ratings and corresponding facial muscle activations. As expected, state verb sentences were judged with higher valence ratings than action verb sentences. Moreover, although emotion-congruent facial activations were similar for the two linguistic categories, in a late temporal window we found a tendency for greater EMG modulation when reading action relative to state verb sentences. These results support embodied theories of language comprehension and suggest that understanding emotional action and state verb sentences relies on partially dissociable motor and emotional processes. Copyright © 2016 Elsevier B.V. All rights reserved.
Holmes, Amanda; Winston, Joel S; Eimer, Martin
2005-10-01
To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, houses, and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was neither affected by emotional facial expression nor by spatial frequency information.
2011-01-01
Background: Integration of compatible or incompatible emotional valence and semantic information is an essential aspect of complex social interactions. A modified version of the Implicit Association Test (IAT), called the Dual Valence Association Task (DVAT), was designed in order to measure conflict resolution processing arising from the compatibility/incompatibility of semantic and facial valence. The DVAT involves two emotional valence evaluative tasks which elicit two forms of compatible/incompatible emotional associations (facial and semantic). Methods: Behavioural measures and event-related potentials were recorded while participants performed the DVAT. Results: Behavioural data showed a robust effect that distinguished compatible/incompatible tasks. The effects of valence and contextual association (between facial and semantic stimuli) showed early discrimination in the N170 to faces. The LPP component was modulated by the compatibility of the DVAT. Conclusions: The results suggest that the DVAT is a robust paradigm for studying the emotional interference effect in the processing of simultaneous information from semantic and facial stimuli. PMID:21489277
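For illustration, the behavioural compatibility effect in a task like the DVAT is usually summarized as the reaction-time cost of incompatible relative to compatible trials. A minimal Python sketch with synthetic data (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic trial-level reaction times (ms) for one participant
rt_compatible = rng.normal(620, 90, 120)
rt_incompatible = rng.normal(690, 95, 120)

# Compatibility effect: mean RT cost of incompatible face-word pairings
effect = rt_incompatible.mean() - rt_compatible.mean()
print(f"compatibility effect: {effect:.0f} ms")

# A common robustness check: medians are less sensitive to slow outliers
print(f"median-based effect: "
      f"{np.median(rt_incompatible) - np.median(rt_compatible):.0f} ms")
```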
Naranjo, C; Kornreich, C; Campanella, S; Noël, X; Vandriette, Y; Gillain, B; de Longueville, X; Delatte, B; Verbanck, P; Constant, E
2011-02-01
The processing of emotional stimuli is thought to be negatively biased in major depression. This study investigated this issue using musical, vocal and facial affective stimuli. Twenty-three depressed in-patients and 23 matched healthy controls were recruited. Affective information processing was assessed through musical, vocal and facial emotion recognition tasks. Depression, anxiety level and attention capacity were controlled. The depressed participants identified emotions less accurately than the control group in all three types of emotion-recognition task. The depressed group also gave higher intensity ratings than the controls when scoring negative emotions, and they were more likely to attribute negative emotions to neutral voices and faces. Our in-patient group might differ from the more general population of depressed adults; all were taking antidepressant medication, which may have influenced their emotional information processing. Major depression is associated with a general negative bias in the processing of emotional stimuli. Emotional processing impairment in depression is not confined to interpersonal stimuli (faces and voices) but extends to the perception of emotions in music. © 2010 Elsevier B.V. All rights reserved.
Emotional facial expressions reduce neural adaptation to face identity.
Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R
2014-05-01
In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems to be sustained for emotional facial expressions. We therefore tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record steady-state visual evoked potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to repetitions of the same compared with different face identities. We found adaptation to the identity of a neutral face. For emotional faces, however, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
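For readers unfamiliar with the method, the ssVEP signal is usually quantified as the spectral amplitude at the stimulation frequency. Below is a hedged sketch under assumed sampling and epoch parameters; the abstract specifies only the 3 Hz presentation rate.

```python
# Minimal sketch: estimating ssVEP amplitude at the 3 Hz stimulation
# frequency from one EEG epoch via the FFT. Sampling rate, epoch length,
# and the adaptation index are illustrative assumptions, not the
# authors' exact pipeline.
import numpy as np

FS = 512          # sampling rate in Hz (assumed)
STIM_FREQ = 3.0   # stimulation frequency from the study

def ssvep_amplitude(epoch: np.ndarray, fs: int = FS, freq: float = STIM_FREQ) -> float:
    """Amplitude of the frequency bin closest to `freq` in one epoch."""
    spectrum = np.abs(np.fft.rfft(epoch)) / len(epoch)
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

# Adaptation can then be expressed as the amplitude ratio between
# same-identity and different-identity conditions (toy data here).
same = ssvep_amplitude(np.random.randn(FS * 10))
diff = ssvep_amplitude(np.random.randn(FS * 10))
adaptation_index = same / diff
```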
Brain correlates of musical and facial emotion recognition: evidence from the dementias.
Hsieh, S; Hornberger, M; Piguet, O; Hodges, J R
2012-07-01
The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from classical or film repertoire but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), Alzheimer's disease (n = 12) and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities: unfamiliar musical tunes and unknown faces as well as volumetric MRI. Patients with SD were most impaired with the recognition of facial and musical emotions, particularly for negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions by facial and musical stimuli but also indicate that the recognition of emotions from music draws upon brain regions that are associated with semantics in language. Copyright © 2012 Elsevier Ltd. All rights reserved.
McBain, Ryan; Norton, Daniel; Chen, Yue
2010-09-01
While schizophrenia patients are impaired at facial emotion perception, the role of basic visual processing in this deficit remains relatively unclear. We examined emotion perception when the spatial frequency content of facial images was manipulated via high-pass and low-pass filtering. Unlike controls (n=29), patients (n=30) perceived images with low spatial frequencies as more fearful than those without this information, across emotional salience levels. Patients also perceived images with high spatial frequencies as happier. In controls, this effect was found only at low emotional salience. These results indicate that basic visual processing has an amplified modulatory effect on emotion perception in schizophrenia. © 2010 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Masten, Carrie L.; Guyer, Amanda E.; Hodgdon, Hilary B.; McClure, Erin B.; Charney, Dennis S.; Ernst, Monique; Kaufman, Joan; Pine, Daniel S.; Monk, Christopher S.
2008-01-01
Objective: The purpose of this study is to examine processing of facial emotions in a sample of maltreated children showing high rates of post-traumatic stress disorder (PTSD). Maltreatment during childhood has been associated independently with both atypical processing of emotion and the development of PTSD. However, research has provided little…
Automatic facial mimicry in response to dynamic emotional stimuli in five-month-old infants.
Isomura, Tomoko; Nakano, Tamami
2016-12-14
Human adults automatically mimic others' emotional expressions, which is believed to contribute to sharing emotions with others. Although this behaviour appears fundamental to social reciprocity, little is known about its developmental process. Therefore, we examined whether infants show automatic facial mimicry in response to others' emotional expressions. Facial electromyographic activity over the corrugator supercilii (brow) and zygomaticus major (cheek) of four- to five-month-old infants was measured while they viewed dynamic clips presenting audiovisual, visual and auditory emotions. The audiovisual bimodal emotion stimuli were a display of a laughing/crying facial expression with an emotionally congruent vocalization, whereas the visual/auditory unimodal emotion stimuli displayed those emotional faces/vocalizations paired with a neutral vocalization/face, respectively. Increased activation of the corrugator supercilii muscle in response to audiovisual cries and the zygomaticus major in response to audiovisual laughter were observed between 500 and 1000 ms after stimulus onset, which clearly suggests rapid facial mimicry. By contrast, both visual and auditory unimodal emotion stimuli did not activate the infants' corresponding muscles. These results revealed that automatic facial mimicry is present as early as five months of age, when multimodal emotional information is present. © 2016 The Author(s).
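The reported effect rests on comparing rectified, baseline-corrected EMG in a 500-1000 ms post-onset window. Here is a minimal sketch of that computation; the sampling rate and baseline length are assumptions, not the authors' settings.

```python
# Sketch of the kind of analysis behind rapid facial mimicry claims:
# baseline-corrected, rectified EMG averaged over 500-1000 ms after
# stimulus onset. Sampling rate and baseline window are assumed.
import numpy as np

FS = 1000  # samples per second (assumed)

def window_activation(emg: np.ndarray, onset_s: float,
                      start_s: float = 0.5, end_s: float = 1.0,
                      baseline_s: float = 0.5) -> float:
    """Mean rectified EMG in [onset+start, onset+end], minus the
    pre-stimulus baseline mean."""
    onset = int(onset_s * FS)
    signal = np.abs(emg)  # rectify
    window = signal[onset + int(start_s * FS): onset + int(end_s * FS)]
    baseline = signal[onset - int(baseline_s * FS): onset]
    return window.mean() - baseline.mean()

corrugator = np.random.randn(10 * FS)   # stand-in for a brow channel
print(window_activation(corrugator, onset_s=5.0))
```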
Processing of Facial Emotion in Bipolar Depression and Euthymia.
Robinson, Lucy J; Gray, John M; Burt, Mike; Ferrier, I Nicol; Gallagher, Peter
2015-10-01
Previous studies of facial emotion processing in bipolar disorder (BD) have reported conflicting findings. In independently conducted studies, we investigated facial emotion labeling in euthymic and depressed BD patients using tasks with static and dynamically morphed images of different emotions displayed at different intensities. Study 1 included 38 euthymic BD patients and 28 controls. Participants completed two tasks: labeling of static images of basic facial emotions (anger, disgust, fear, happy, sad) shown at different expression intensities, and the Eyes Test (Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001), which involves recognition of complex emotions using only the eye region of the face. Study 2 included 53 depressed BD patients and 47 controls. Participants completed two tasks: labeling of "dynamic" facial expressions of the same five basic emotions, and the Emotional Hexagon test (Young, Perret, Calder, Sprengelmeyer, & Ekman, 2002). In neither study were there significant differences between patients and controls on any measure of emotion perception/labeling. A significant group by intensity interaction was observed in both emotion labeling tasks (euthymia and depression), although this effect did not survive the addition of measures of executive function/psychomotor speed as covariates. Only 2.6-15.8% of euthymic patients and 7.8-13.7% of depressed patients scored below the 10th percentile of the controls for total emotion recognition accuracy. There was no evidence of specific deficits in facial emotion labeling in euthymic or depressed BD patients. Methodological variations, including mood state, sample size, and the cognitive demands of the tasks, may contribute significantly to the variability in findings between studies.
de la Rosa, Stephan; Fademrecht, Laura; Bülthoff, Heinrich H; Giese, Martin A; Curio, Cristóbal
2018-06-01
Motor-based theories of facial expression recognition propose that the visual perception of facial expression is aided by sensorimotor processes that are also used for the production of the same expression. Accordingly, sensorimotor and visual processes should provide congruent emotional information about a facial expression. Here, we report evidence that challenges this view. Specifically, the repeated execution of facial expressions had the opposite effect on the recognition of a subsequent facial expression from that of the repeated viewing of facial expressions. Moreover, the findings of the motor condition, but not of the visual condition, were correlated with a nonsensory condition in which participants imagined an emotional situation. These results are well accounted for by the idea that facial expression recognition is not always mediated by motor processes; facial expressions can also be recognized from visual information alone.
Kliemann, Dorit; Rosenblau, Gabriela; Bölte, Sven; Heekeren, Hauke R.; Dziobek, Isabel
2013-01-01
Recognizing others' emotional states is crucial for effective social interaction. While most facial emotion recognition tasks use explicit prompts that trigger consciously controlled processing, emotional faces are almost exclusively processed implicitly in real life. Recent attempts in social cognition suggest a dual-process perspective, whereby explicit and implicit processes largely operate independently. However, due to differences in methodology, the direct comparison of implicit and explicit social cognition has remained a challenge. Here, we introduce a new tool to comparably measure implicit and explicit processing aspects comprising basic and complex emotions in facial expressions. We developed two video-based tasks with similar answer formats to assess performance in the respective facial emotion recognition processes: Face Puzzle, implicit and explicit. To assess the tasks' sensitivity to atypical social cognition and to infer interrelationship patterns between explicit and implicit processes in typical and atypical development, we included healthy adults (NT, n = 24) and adults with autism spectrum disorder (ASD, n = 24). Item analyses yielded good reliability of the new tasks. Group-specific results indicated sensitivity to subtle social impairments in high-functioning ASD. Correlation analyses with established implicit and explicit socio-cognitive measures further supported the tasks' external validity. Between-group comparisons provided first hints of differential relations between implicit and explicit aspects of facial emotion recognition processes in healthy compared to ASD participants. In addition, an increased magnitude of between-group differences in the implicit task was found for a speed-accuracy composite measure. The new Face Puzzle tool thus provides two new tasks to separately assess explicit and implicit social functioning, for instance, to measure subtle impairments as well as potential improvements due to social cognitive interventions. PMID:23805122
Poor sleep quality predicts deficient emotion information processing over time in early adolescence.
Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran
2011-11-01
There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. Participants were 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.76, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.
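As a sanity check, the reported interval is consistent with a Wald-type confidence interval around the multilevel-model coefficient, treating the reported SD of 0.52 as the standard error; this is our reading of the statistics, not the authors' stated computation.

```python
# Quick check that the reported confidence interval matches a Wald
# interval around the coefficient (b = -1.79, SE = 0.52, assumed):
# b +/- 1.96 * SE. Small discrepancies reflect rounding or the exact
# critical value used by the authors.
b, se = -1.79, 0.52
lower, upper = b - 1.96 * se, b + 1.96 * se
print(f"95% CI: [{lower:.2f}, {upper:.2f}]")  # -> [-2.81, -0.77]
```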
Szabó, Ádám György; Farkas, Kinga; Marosi, Csilla; Kozák, Lajos R; Rudas, Gábor; Réthelyi, János; Csukly, Gábor
2017-12-08
Schizophrenia has a negative effect on the activity of the temporal and prefrontal cortices during the processing of emotional facial expressions. However, no previous research has focused on the evaluation of mixed emotions in schizophrenia, although they are common in everyday situations and negative emotions are frequently conveyed by mixed facial expressions. Altogether 37 subjects, 19 patients with schizophrenia and 18 healthy control subjects, were enrolled in the study. The two study groups did not differ in age and education. The stimulus set consisted of 10 fearful (100%), 10 happy (100%), 10 mixed fear (70% fear and 30% happy) and 10 mixed happy facial expressions. During the fMRI acquisition, pictures were presented in a randomized order and subjects had to categorize expressions by button press. A decreased activation was found in the patient group during fear, mixed fear and mixed happy processing in the right ventrolateral prefrontal cortex (VLPFC) and the right anterior insula (RAI) at voxel and cluster level after familywise error correction. No difference was found between study groups in activation to the happy facial condition. Unlike controls, patients with schizophrenia did not show differential activation between mixed happy and happy facial expressions in the right dorsolateral prefrontal cortex (DLPFC). Patients with schizophrenia showed decreased functioning in right prefrontal regions responsible for salience signaling and valence evaluation during emotion recognition. Our results indicate that fear and mixed happy/fear processing are impaired in schizophrenia, while happy facial expression processing is relatively intact.
Nentjes, Lieke; Bernstein, David P; Meijer, Ewout; Arntz, Arnoud; Wiers, Reinout W
2016-12-01
This study investigated the physiological, self-reported, and facial correlates of emotion regulation in psychopathy. Specifically, we compared psychopathic offenders (n = 42), nonpsychopathic offenders (n = 42), and nonoffender controls (n = 26) in their ability to inhibit and express emotion while watching affective films (fear, happy, and sad). Results showed that all participants were capable of drastically diminishing facial emotions under inhibition instructions. Contrary to expectation, psychopaths were not superior in adopting such a "poker face." Further, the inhibition of emotion was associated with cardiovascular changes, an effect that was also not dependent on psychopathy (or its factors), suggesting emotion inhibition to be an effortful process in psychopaths as well. Interestingly, psychopathic offenders did not differ from nonpsychopaths in the capacity to show content-appropriate facial emotions during the expression condition. Taken together, these data challenge the view that psychopathy is associated with either superior emotional inhibitory capacities or a generalized impairment in showing facial affect.
Emotional priming with facial exposures in euthymic patients with bipolar disorder.
Kim, Taek Su; Lee, Su Young; Ha, Ra Yeon; Kim, Eosu; An, Suk Kyoon; Ha, Kyooseob; Cho, Hyun-Sang
2011-12-01
People with bipolar disorder have abnormal emotional processing. We investigated the automatic and controlled emotional processing via a priming paradigm with subliminal and supraliminal facial exposure. We compared 20 euthymic bipolar patients and 20 healthy subjects on their performance in subliminal and supraliminal tasks. Priming tasks consisted of three different primes according to facial emotions (happy, sad, and neutral) followed by a neutral face as a target stimulus. The prime stimuli were presented subliminally (17 msec) or supraliminally (1000 msec). In subliminal tasks, both patients and controls judged the neutral target face as significantly more unpleasant (negative judgment shift) when presented with negative emotion primes compared with positive primes. In supraliminal tasks, bipolar subjects showed significant negative judgment shift, whereas healthy subjects did not. There was a significant group × emotion interaction for the judgment rate in supraliminal tasks. Our finding of persistent affective priming even at conscious awareness may suggest that bipolar patients have impaired cognitive control on emotional processing rather than automatically spreading activation of emotion.
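Frame-locked presentation is how such 17 ms primes are usually achieved: one refresh of a 60 Hz display lasts about 16.7 ms. Below is an illustrative PsychoPy sketch; the image files, window settings, and frame counts are placeholders rather than the authors' materials.

```python
# Illustrative PsychoPy sketch of frame-locked prime presentation: on a
# 60 Hz display, one frame (~16.7 ms) approximates the 17 ms subliminal
# prime and 60 frames approximate the 1000 ms supraliminal prime.
from psychopy import visual, core

win = visual.Window(size=(800, 600), units="pix", color="gray")
prime = visual.ImageStim(win, image="happy_face.png")     # placeholder file
target = visual.ImageStim(win, image="neutral_face.png")  # placeholder file

def present(stim, n_frames):
    """Draw a stimulus for an exact number of screen refreshes."""
    for _ in range(n_frames):
        stim.draw()
        win.flip()

present(prime, n_frames=1)    # subliminal prime: ~17 ms at 60 Hz
present(target, n_frames=60)  # neutral target face: ~1000 ms
win.close()
core.quit()
```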
Stability of facial emotion recognition performance in bipolar disorder.
Martino, Diego J; Samamé, Cecilia; Strejilevich, Sergio A
2016-09-30
The aim of this study was to assess the performance in emotional processing over time in a sample of euthymic patients with bipolar disorder (BD). Performance in the facial recognition of the six basic emotions (surprise, anger, sadness, happiness, disgust, and fear) did not change during a follow-up period of almost 7 years. These preliminary results suggest that performance in facial emotion recognition might be stable over time in BD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Chechko, Natalya; Pagel, Alena; Otte, Ellen; Koch, Iring; Habel, Ute
2016-01-01
Spontaneous emotional expressions (rapid facial mimicry) perform both emotional and social functions. In the current study, we sought to test whether there were deficits in automatic mimic responses to emotional facial expressions in 15 patients with stable schizophrenia compared to 15 controls. In a perception-action interference paradigm (the Simon task; first experiment), and in the context of a dual-task paradigm (second experiment), the task-relevant stimulus feature was the gender of a face, which, however, displayed a smiling or frowning expression (task-irrelevant stimulus feature). We measured electromyographical activity in the corrugator supercilii and zygomaticus major muscle regions in response to either compatible or incompatible stimuli (i.e., when the required response did or did not correspond to the depicted facial expression). The compatibility effect based on interactions between the implicit processing of a task-irrelevant emotional facial expression and the conscious production of an emotional facial expression did not differ between the groups. In stable patients (in spite of a reduced mimic reaction), we observed an intact capacity to respond spontaneously to facial emotional stimuli. PMID:27303335
Rigon, A; Voss, M W; Turkstra, L S; Mutlu, B; Duff, M C
2017-01-01
Although several studies have demonstrated that facial-affect recognition impairment is common following moderate-severe traumatic brain injury (TBI), and that there are diffuse alterations in large-scale functional brain networks in TBI populations, little is known about the relationship between the two. Here, in a sample of 26 participants with TBI and 20 healthy comparison participants (HC) we measured facial-affect recognition abilities and resting-state functional connectivity (rs-FC) using fMRI. We then used network-based statistics to examine (A) the presence of rs-FC differences between individuals with TBI and HC within the facial-affect processing network, and (B) the association between inter-individual differences in emotion recognition skills and rs-FC within the facial-affect processing network. We found that participants with TBI showed significantly lower rs-FC in a component comprising homotopic and within-hemisphere, anterior-posterior connections within the facial-affect processing network. In addition, within the TBI group, participants with higher emotion-labeling skills showed stronger rs-FC within a network comprised of intra- and inter-hemispheric bilateral connections. Findings indicate that the ability to successfully recognize facial-affect after TBI is related to rs-FC within components of facial-affective networks, and provide new evidence that further our understanding of the mechanisms underlying emotion recognition impairment in TBI.
Liu, Xinyang; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Cai, Xinxia; Wilhelm, Oliver
2017-01-01
Facial identity and facial expression processing are crucial socio-emotional abilities but seem to show only limited psychometric uniqueness when processing speed is considered in easy tasks. We applied a comprehensive measurement of processing speed and contrasted performance specificity for socio-emotional, social and non-social stimuli from an individual differences perspective. Performance in a multivariate task battery was best modeled by a general speed factor and a first-order factor capturing some specific variance due to processing emotional facial expressions. We further tested the equivalence of the relationships between speed factors and polymorphisms of dopamine and serotonin transporter genes. Results show that the speed factors are not only psychometrically equivalent but invariant in their relation with the Catechol-O-Methyl-Transferase (COMT) Val158Met polymorphism. However, the 5-HTTLPR/rs25531 serotonin polymorphism was related to the first-order factor of emotion perception speed, suggesting a specific genetic correlate of processing emotions. We further investigated the relationship between several components of event-related brain potentials and psychometric abilities, and tested emotion-specific individual differences at the neurophysiological level. Results revealed that swifter emotion perception went along with larger amplitudes of the P100 and the Early Posterior Negativity (EPN) when emotion processing was modeled on its own. However, after partialling out the variance emotion perception speed shares with general processing speed-related abilities, brain-behavior relationships were no longer specific for emotion. Together, the present results suggest that speed abilities are strongly interrelated but show some specificity for emotion processing speed at the psychometric level. At both the genetic and neurophysiological levels, emotion specificity depended on whether general cognition was taken into account. These findings strongly suggest that general speed abilities should be taken into account when the specificity of emotion recognition abilities is investigated. PMID:28848411
More than mere mimicry? The influence of emotion on rapid facial reactions to faces.
Moody, Eric J; McIntosh, Daniel N; Mann, Laura J; Weisser, Kimberly R
2007-05-01
Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed.
Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome.
Kätsyri, Jari; Saalasti, Satu; Tiippana, Kaisa; von Wendt, Lennart; Sams, Mikko
2008-01-01
The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features. The recognition of emotional facial expressions representing various different levels of detail has not been studied previously in ASDs. We analyzed the recognition of four basic emotional facial expressions (anger, disgust, fear and happiness) from low-spatial frequencies (overall global shapes without local features) in adults with an ASD. A group of 20 participants with Asperger syndrome (AS) was compared to a group of non-autistic age- and sex-matched controls. Emotion recognition was tested from static and dynamic facial expressions whose spatial frequency contents had been manipulated by low-pass filtering at two levels. The two groups recognized emotions similarly from non-filtered faces and from dynamic vs. static facial expressions. In contrast, the participants with AS were less accurate than controls in recognizing facial emotions from very low-spatial frequencies. The results suggest intact recognition of basic facial emotions and dynamic facial information, but impaired visual processing of global features in ASDs.
Right Hemispheric Dominance in Processing of Unconscious Negative Emotion
ERIC Educational Resources Information Center
Sato, Wataru; Aoki, Satoshi
2006-01-01
Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or…
Intact anger recognition in depression despite aberrant visual facial information usage.
Clark, Cameron M; Chiu, Carina G; Diaz, Ruth L; Goghari, Vina M
2014-08-01
Previous literature has indicated abnormalities in facial emotion recognition abilities, as well as deficits in basic visual processes, in major depression. However, the literature is unclear on a number of important factors, including whether these abnormalities represent deficient or enhanced emotion recognition abilities compared to control populations and the degree to which basic visual deficits might impact this process. The present study investigated emotion recognition abilities for angry versus neutral facial expressions in a sample of undergraduate students with Beck Depression Inventory-II (BDI-II) scores indicative of moderate depression (i.e., ≥20), compared to matched low-BDI-II score (i.e., ≤2) controls, via the Bubbles Facial Emotion Perception Task. Results indicated unimpaired behavioural performance in discriminating angry from neutral expressions in the high depressive symptoms group relative to the minimal depressive symptoms group, despite evidence of an abnormal pattern of visual facial information usage. The generalizability of the current findings is limited by the highly structured nature of the facial emotion recognition task used, as well as by the use of an analog sample of undergraduates scoring high in self-rated symptoms of depression rather than a clinical sample. Our findings suggest that basic visual processes are involved in emotion recognition abnormalities in depression, demonstrating consistency with the emotion recognition literature in other psychopathologies (e.g., schizophrenia, autism, social anxiety). Future research should seek to replicate these findings in clinical populations with major depression, and assess the association between aberrant face gaze behaviours and symptom severity and social functioning. Copyright © 2014 Elsevier B.V. All rights reserved.
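The Bubbles technique referenced here reveals the face only through randomly placed Gaussian apertures, so that accuracy can later be regressed onto the revealed locations to map which facial information observers actually use. A minimal sketch of the mask generation follows, with aperture count and size as placeholder values.

```python
# Sketch of the Bubbles masking step: the face is visible only through a
# few Gaussian apertures on each trial. The number and width of the
# apertures here are placeholders, not the study's parameters.
import numpy as np

def bubbles_mask(shape, n_bubbles=5, sigma=15.0, rng=None):
    """Sum of Gaussian apertures at random locations, clipped to [0, 1]."""
    rng = rng or np.random.default_rng()
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    mask = np.zeros(shape)
    for _ in range(n_bubbles):
        cy, cx = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        mask += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
    return np.clip(mask, 0.0, 1.0)

face = np.random.rand(256, 256)            # stand-in for a face image
trial_stimulus = face * bubbles_mask(face.shape)
```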
Facial emotion processing in pediatric social anxiety disorder: Relevance of situational context.
Schwab, Daniela; Schienle, Anne
2017-08-01
Social anxiety disorder (SAD) typically begins in childhood. Previous research has demonstrated that adult patients respond with elevated late positivity (LP) to negative facial expressions. In the present study on pediatric SAD, we investigated responses to negative facial expressions and the role of social context information. Fifteen children with SAD and 15 non-anxious controls were first presented with images of negative facial expressions with masked backgrounds. Following this, the complete images which included context information, were shown. The negative expressions were either a result of an emotion-relevant (e.g., social exclusion) or emotion-irrelevant elicitor (e.g., weight lifting). Relative to controls, the clinical group showed elevated parietal LP during face processing with and without context information. Both groups differed in their frontal LP depending on the type of context. In SAD patients, frontal LP was lower in emotion-relevant than emotion-irrelevant contexts. We conclude that SAD patients direct more automatic attention towards negative facial expressions (parietal effect) and are less capable in integrating affective context information (frontal effect). Copyright © 2017 Elsevier Ltd. All rights reserved.
Perlman, Susan B; Fournier, Jay C; Bebko, Genna; Bertocci, Michele A; Hinze, Amanda K; Bonar, Lisa; Almeida, Jorge R C; Versace, Amelia; Schirda, Claudiu; Travis, Michael; Gill, Mary Kay; Demeter, Christine; Diwadkar, Vaibhav A; Sunshine, Jeffrey L; Holland, Scott K; Kowatch, Robert A; Birmaher, Boris; Axelson, David; Horwitz, Sarah M; Arnold, L Eugene; Fristad, Mary A; Youngstrom, Eric A; Findling, Robert L; Phillips, Mary L
2013-12-01
Pediatric bipolar disorder involves poor social functioning, but the neural mechanisms underlying these deficits are not well understood. Previous neuroimaging studies have found deficits in emotional face processing localized to emotional brain regions. However, few studies have examined dysfunction in other regions of the face processing circuit. This study assessed hypoactivation in key face processing regions of the brain in pediatric bipolar disorder. Youth with a bipolar spectrum diagnosis (n = 20) were matched to a nonbipolar clinical group (n = 20), with similar demographics and comorbid diagnoses, and a healthy control group (n = 20). Youth participated in functional magnetic resonance imaging (fMRI) scanning, which employed a task-irrelevant emotion processing design in which processing of facial emotions was not germane to task performance. Hypoactivation, isolated to the fusiform gyrus, was found when viewing animated, emerging facial expressions of happiness, sadness, fearfulness, and especially anger in pediatric bipolar participants relative to matched clinical and healthy control groups. The results of the study imply that differences exist in visual regions of the brain's face processing system and are not solely isolated to emotional brain regions such as the amygdala. Findings are discussed in relation to facial emotion recognition and fusiform gyrus deficits previously reported in the autism literature. Behavioral interventions targeting attention to facial stimuli might be explored as possible treatments for bipolar disorder in youth. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.
Sex, Sexual Orientation, and Identification of Positive and Negative Facial Affect
ERIC Educational Resources Information Center
Rahman, Qazi; Wilson, Glenn D.; Abrahams, Sharon
2004-01-01
Sex and sexual orientation related differences in processing of happy and sad facial emotions were examined using an experimental facial emotion recognition paradigm with a large sample (N=240). Analysis of covariance (controlling for age and IQ) revealed that women (irrespective of sexual orientation) had faster reaction times than men for…
[The role of experience in the neurology of facial expression of emotions].
Gordillo, Fernando; Pérez, Miguel A; Arana, José M; Mestas, Lilia; López, Rafael M
2015-04-01
Facial expression of emotion has an important social function that facilitates interaction between people. This process has a neurological basis, which is not isolated from the context or from the experience of the interaction between people in that context. Yet, to date, the impact that experience has on the perception of emotions is not completely understood. This review discusses the role of experience in the recognition of facial expressions of emotion and analyzes biases in emotional perception. The maturation of the structures that support the ability to recognize emotion goes through a sensitive period during adolescence, when experience may have a greater impact on emotional recognition. Experiences of abuse, neglect, war, and stress generate a bias towards expressions of anger and sadness. Similarly, positive experiences generate a bias towards the expression of happiness. Only when people are able to use facial expressions as a channel for understanding emotion will they be able to interact appropriately with their environment. This environment, in turn, will provide experiences that modulate this capacity. It is therefore a self-regulatory process that can be directed through the implementation of intervention programs on emotional aspects.
Developmental changes in the primacy of facial cues for emotion recognition.
Leitzke, Brian T; Pollak, Seth D
2016-04-01
There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to reconcile these opposing views. The current study tested for age-related differences in how individuals prioritize viewing emotional faces versus contexts when making emotion judgments. To do so, we asked 4-, 8-, and 12-year-old children as well as college students to categorize facial expressions of emotion that were presented with scenes that were either congruent or incongruent with the facial displays. During this time, we recorded participants' gaze patterns via eye tracking. College students directed their visual attention primarily to the face, regardless of contextual information. Children, however, divided their attention between both the face and the context as sources of emotional information depending on the valence of the context. These findings reveal a developmental shift in how individuals process and integrate emotional cues. © 2016 APA, all rights reserved.
Dissimilar processing of emotional facial expressions in human and monkey temporal cortex
Zhu, Qi; Nelissen, Koen; Van den Stock, Jan; De Winter, François-Laurent; Pauwels, Karl; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu
2013-01-01
Emotional facial expressions play an important role in social communication across primates. Despite major progress in our understanding of categorical information processing, such as for objects and faces, little is known about how the primate brain evolved to process emotional cues. In this study, we used functional magnetic resonance imaging (fMRI) to compare the processing of emotional facial expressions between monkeys and humans. We used a 2 × 2 × 2 factorial design with species (human and monkey), expression (fear and chewing) and configuration (intact versus scrambled) as factors. At the whole-brain level, selective neural responses to conspecific emotional expressions were anatomically confined to the superior temporal sulcus (STS) in humans. Within the human STS, we found functional subdivisions, with a face-selective right posterior STS area that also responded selectively to emotional expressions of other species and a more anterior area in the right middle STS that responded specifically to human emotions. Hence, we argue that the latter region does not show a mere emotion-dependent modulation of activity but is primarily driven by human emotional facial expressions. Conversely, in monkeys, emotional responses appeared in earlier visual cortex and outside face-selective regions in inferior temporal cortex that responded also to multiple visual categories. Within monkey IT, we also found areas that were more responsive to conspecific than to non-conspecific emotional expressions, but these responses were not as specific as in human middle STS. Overall, our results indicate that human STS may have developed unique properties to deal with social cues such as emotional expressions. PMID:23142071
Self-Relevance Appraisal Influences Facial Reactions to Emotional Body Expressions
Grèzes, Julie; Philip, Léonor; Chadwick, Michèle; Dezecache, Guillaume; Soussignan, Robert; Conty, Laurence
2013-01-01
People display facial reactions when exposed to others' emotional expressions, but exactly what mechanism mediates these facial reactions remains a debated issue. In this study, we manipulated two critical perceptual features that contribute to determining the significance of others' emotional expressions: the direction of attention (toward or away from the observer) and the intensity of the emotional display. Electromyographic activity over the corrugator muscle was recorded while participants observed videos of neutral to angry body expressions. Self-directed bodies induced greater corrugator activity than other-directed bodies; additionally, corrugator activity was only influenced by the intensity of anger expressed by self-directed bodies. These data support the hypothesis that rapid facial reactions are the outcome of self-relevant emotional processing. PMID:23405230
Van Rheenen, Tamsyn E; Joshua, Nicole; Castle, David J; Rossell, Susan L
2017-03-01
Emotion recognition impairments have been demonstrated in schizophrenia (Sz), but are less consistent and lesser in magnitude in bipolar disorder (BD). This may be related to the extent to which different face processing strategies are engaged during emotion recognition in each of these disorders. We recently showed that Sz patients had impairments in the use of both featural and configural face processing strategies, whereas BD patients were impaired only in the use of the latter. Here we examine the influence that these impairments have on facial emotion recognition in these cohorts. Twenty-eight individuals with Sz, 28 individuals with BD, and 28 healthy controls completed a facial emotion labeling task with two conditions designed to separate the use of featural and configural face processing strategies; part-based and whole-face emotion recognition. Sz patients performed worse than controls on both conditions, and worse than BD patients on the whole-face condition. BD patients performed worse than controls on the whole-face condition only. Configural processing deficits appear to influence the recognition of facial emotions in BD, whereas both configural and featural processing abnormalities impair emotion recognition in Sz. This may explain discrepancies in the profiles of emotion recognition between the disorders. (JINS, 2017, 23, 287-291).
An fMRI study of facial emotion processing in patients with schizophrenia.
Gur, Raquel E; McGrath, Claire; Chan, Robin M; Schroeder, Lee; Turner, Travis; Turetsky, Bruce I; Kohler, Christian; Alsop, David; Maldjian, Joseph; Ragland, J Daniel; Gur, Ruben C
2002-12-01
Emotion processing deficits are notable in schizophrenia. The authors evaluated cerebral blood flow response in schizophrenia patients during facial emotion processing to test the hypothesis of diminished limbic activation related to emotional relevance of facial stimuli. Fourteen patients with schizophrenia and 14 matched comparison subjects viewed facial displays of happiness, sadness, anger, fear, and disgust as well as neutral faces. Functional magnetic resonance imaging was used to measure blood-oxygen-level-dependent signal changes as the subjects alternated between tasks of discriminating emotional valence (positive versus negative) and age (over 30 versus under 30) of the faces with an interleaved crosshair reference condition. The groups did not differ in performance on either task. For both tasks, healthy participants showed activation in the fusiform gyrus, occipital lobe, and inferior frontal cortex relative to the resting baseline condition. The increase was greater in the amygdala and hippocampus during the emotional valence discrimination task than during the age discrimination task. In the patients with schizophrenia, minimal focal response was observed for all tasks relative to the resting baseline condition. Contrasting patients and comparison subjects on the emotional valence discrimination task revealed voxels in the left amygdala and bilateral hippocampus in which the comparison subjects had significantly greater activation. Failure to activate limbic regions during emotional valence discrimination may explain emotion processing deficits in patients with schizophrenia. While the lack of limbic recruitment did not significantly impair simple valence discrimination performance in this clinically stable group, it may impact performance of more demanding tasks.
Thomas, Laura A; Brotman, Melissa A; Muhrer, Eli J; Rosen, Brooke H; Bones, Brian L; Reynolds, Richard C; Deveney, Christen M; Pine, Daniel S; Leibenluft, Ellen
2012-12-01
Context: Youth with bipolar disorder (BD) and those with severe, nonepisodic irritability (severe mood dysregulation [SMD]) exhibit amygdala dysfunction during facial emotion processing. However, studies have not compared such patients with each other and with comparison individuals in neural responsiveness to subtle changes in facial emotion; the ability to process such changes is important for social cognition. To evaluate this, we used a novel, parametrically designed faces paradigm. Objective: To compare activation in the amygdala and across the brain in BD patients, SMD patients, and healthy volunteers (HVs). Design: Case-control study. Setting: Government research institute. Participants: Fifty-seven youths (19 BD, 15 SMD, and 23 HVs). Main Outcome Measure: Blood oxygenation level-dependent (BOLD) data. Neutral faces were morphed with angry and happy faces in 25% intervals; static facial stimuli appeared for 3000 milliseconds. Participants performed hostility or nonemotional facial feature (ie, nose width) ratings. The slope of BOLD activity was calculated across neutral-to-angry and neutral-to-happy facial stimuli. Results: In HVs, but not BD or SMD participants, there was a positive association between left amygdala activity and anger on the face. In the neutral-to-happy whole-brain analysis, BD and SMD participants modulated parietal, temporal, and medial-frontal areas differently from each other and from HVs; with increasing facial happiness, SMD patients demonstrated increased, and BD patients decreased, activity in the parietal, temporal, and frontal regions. Conclusions: Youth with BD or SMD differ from HVs in modulation of amygdala activity in response to small changes in facial anger displays. In contrast, individuals with BD or SMD show distinct perturbations in regions mediating attention and face processing in association with changes in the emotional intensity of facial happiness displays. These findings demonstrate similarities and differences in the neural correlates of facial emotion processing in BD and SMD, suggesting that these distinct clinical presentations may reflect differing dysfunctions along a mood disorders spectrum.
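The parametric logic of the paradigm (morph levels in 25% steps, then a slope of activity across levels) can be made concrete with a short sketch. Here a pixel cross-fade stands in for true face morphing, which also warps facial geometry, and the BOLD values are toy numbers, not study data.

```python
# Sketch of the parametric design: stimuli step from neutral to angry in
# 25% increments, and the slope of the BOLD response across those levels
# indexes sensitivity to increasing anger.
import numpy as np

levels = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # proportion of anger in the morph

def cross_fade(neutral: np.ndarray, angry: np.ndarray, a: float) -> np.ndarray:
    """Linear pixel blend; real morphing also warps facial geometry."""
    return (1.0 - a) * neutral + a * angry

neutral_img = np.zeros((64, 64))  # stand-ins for face images
angry_img = np.ones((64, 64))
morphs = [cross_fade(neutral_img, angry_img, a) for a in levels]

# Slope of activity across morph levels for one region (toy condition means):
bold = np.array([0.10, 0.15, 0.22, 0.30, 0.41])
slope, intercept = np.polyfit(levels, bold, deg=1)
print(f"BOLD slope across neutral-to-angry morphs: {slope:.2f}")
```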
Neves, Maila de Castro Lourenço das; Tremeau, Fabien; Nicolato, Rodrigo; Lauar, Hélio; Romano-Silva, Marco Aurélio; Correa, Humberto
2011-09-01
A large body of evidence suggests that several aspects of face processing are impaired in autism and that this impairment might be hereditary. This study was aimed at assessing facial emotion recognition in parents of children with autism and its associations with a functional polymorphism of the serotonin transporter (5HTTLPR). We evaluated 40 parents of children with autism and 41 healthy controls. All participants were administered the Penn Emotion Recognition Test (ER40) and were genotyped for 5HTTLPR. Our study showed that parents of children with autism performed worse in the facial emotion recognition test than controls. Analyses of error patterns showed that parents of children with autism over-attributed neutral to emotional faces. We found evidence that 5HTTLPR polymorphism did not influence the performance in the Penn Emotion Recognition Test, but that it may determine different error patterns. Facial emotion recognition deficits are more common in first-degree relatives of autistic patients than in the general population, suggesting that facial emotion recognition is a candidate endophenotype for autism.
More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder
Goghari, Vina M; Sponheim, Scott R
2012-01-01
Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816
A facial expression of pax: Assessing children's "recognition" of emotion from faces.
Nelson, Nicole L; Russell, James A
2016-01-01
In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks (choice from array and free labeling) to support the conclusion that children recognize basic emotions from facial expressions. Here, five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition when a novel nonsense facial expression is included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination: they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.
Interference with facial emotion recognition by verbal but not visual loads.
Reed, Phil; Steed, Ian
2015-12-01
The ability to recognize emotions through facial characteristics is critical for social functioning, but is often impaired in those with a developmental or intellectual disability. The current experiments explored the degree to which interfering with the processing capacities of typically-developing individuals would produce a similar inability to recognize emotions through the facial elements of faces displaying particular emotions. It was found that increasing the cognitive load (in an attempt to model learning impairments in a typically developing population) produced deficits in correctly identifying emotions from facial elements. However, this effect was much more pronounced when using a concurrent verbal task than when employing a concurrent visual task, suggesting that there is a substantial verbal element to the labeling and subsequent recognition of emotions. This concurs with previous work conducted with those with developmental disabilities that suggests emotion recognition deficits are connected with language deficits. Copyright © 2015 Elsevier Ltd. All rights reserved.
Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei
2016-01-01
Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used transiently peak intense expression images of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that isolated bodies and face-body congruent images were better recognized than isolated faces and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning and losing body targets was influenced by the invisible peak facial expression primes, which indicated unconscious perception of peak facial expressions. PMID:27630604
Kim, Do-Won; Kim, Han-Sung; Lee, Seung-Hwan; Im, Chang-Hwan
2013-12-01
Schizophrenia is one of the most devastating of all mental illnesses and has dimensional characteristics that include both positive and negative symptoms. One problem reported in schizophrenia patients is that they tend to show deficits in facial emotion processing, on which negative symptoms are thought to have a stronger influence. In this study, four event-related potential (ERP) components (P100, N170, N250, and P300) and their source activities were analyzed using EEG data acquired from 23 schizophrenia patients while they were presented with facial emotion picture stimuli. Correlations between positive and negative syndrome scale (PANSS) scores and source activations during facial emotion processing were calculated to identify the brain areas affected by symptom scores. Our analysis demonstrates that PANSS positive scores are negatively correlated with major areas of the left temporal lobule for early ERP components (P100, N170) and with the right middle frontal lobule for a later component (N250), which indicates that positive symptoms affect both early face processing and facial emotion processing. On the other hand, PANSS negative scores are negatively correlated with several clustered regions, including the left fusiform gyrus (at P100), most of which do not overlap with regions showing correlations with PANSS positive scores. Our results suggest that positive and negative symptoms affect independent brain regions during facial emotion processing, which may help to explain the heterogeneous characteristics of schizophrenia. © 2013 Elsevier B.V. All rights reserved.
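The correlation step (component activity against PANSS scores across patients) follows a standard recipe. Below is a hedged sketch using scalp-level mean amplitudes and conventional window bounds; the study itself correlated source activations, and its exact time windows are not given in the abstract.

```python
# Sketch of the correlation logic: mean ERP amplitude in a component
# window (e.g., P100 ~80-130 ms; bounds are conventional approximations)
# correlated with PANSS scores across patients. All data here are toys.
import numpy as np
from scipy.stats import pearsonr

FS = 500  # sampling rate in Hz (assumed)

def mean_amplitude(erp: np.ndarray, t0_ms: float, t1_ms: float) -> float:
    """Mean amplitude between t0 and t1 (ms after stimulus onset)."""
    return erp[int(t0_ms / 1000 * FS): int(t1_ms / 1000 * FS)].mean()

n_patients = 23
erps = np.random.randn(n_patients, FS)            # toy single-channel ERPs
panss_pos = np.random.randint(7, 30, n_patients)  # toy positive-symptom scores

p100 = np.array([mean_amplitude(e, 80, 130) for e in erps])
r, p = pearsonr(panss_pos, p100)
print(f"r = {r:.2f}, p = {p:.3f}")
```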
Do you see what I see? Sex differences in the discrimination of facial emotions during adolescence.
Lee, Nikki C; Krabbendam, Lydia; White, Thomas P; Meeter, Martijn; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Büchel, Christian; Conrod, Patricia; Flor, Herta; Frouin, Vincent; Heinz, Andreas; Garavan, Hugh; Gowland, Penny; Ittermann, Bernd; Mann, Karl; Paillère Martinot, Marie-Laure; Nees, Frauke; Paus, Tomas; Pausova, Zdenka; Rietschel, Marcella; Robbins, Trevor; Fauth-Bühler, Mira; Smolka, Michael N; Gallinat, Juergen; Schumann, Gunther; Shergill, Sukhi S
2013-12-01
During adolescence social relationships become increasingly important. Establishing and maintaining these relationships requires understanding of emotional stimuli, such as facial emotions. A failure to adequately interpret emotional facial expressions has previously been associated with various mental disorders that emerge during adolescence. The current study examined sex differences in emotional face processing during adolescence. Participants were adolescents (n = 1951) with a target age of 14, who completed a forced-choice emotion discrimination task. The stimuli used comprised morphed faces that contained a blend of two emotions in varying intensities (11 stimuli per set of emotions). Adolescent girls showed faster and more sensitive perception of facial emotions than boys. However, both adolescent boys and girls were most sensitive to variations in emotion intensity in faces combining happiness and sadness, and least sensitive to changes in faces comprising fear and anger. Furthermore, both sexes overidentified happiness and anger. However, the overidentification of happiness was stronger in boys. These findings were not influenced by individual differences in the level of pubertal maturation. These results indicate that male and female adolescents differ in their ability to identify emotions in morphed faces containing emotional blends. The findings provide information for clinical studies examining whether sex differences in emotional processing are related to sex differences in the prevalence of psychiatric disorders within this age group.
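The morphed stimuli described above (11 stimuli per set of emotions) correspond to a linear blend continuum between two expressions; below is a minimal sketch of the weighting scheme, assuming evenly spaced steps from 100/0 to 0/100 (the actual image morphing is assumed to be done with dedicated software).

```python
# Sketch of an 11-step morph continuum between two emotions,
# parameterized as blend weights only.
def morph_weights(n_steps=11):
    # returns (weight_emotion_a, weight_emotion_b) pairs from 100/0 to 0/100
    return [(1 - i / (n_steps - 1), i / (n_steps - 1)) for i in range(n_steps)]

for w_happy, w_sad in morph_weights():
    print(f"happy {w_happy:.0%} / sad {w_sad:.0%}")
```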
Perception of Face and Body Expressions Using Electromyography, Pupillometry and Gaze Measures
Kret, Mariska E.; Stekelenburg, Jeroen J.; Roelofs, Karin; de Gelder, Beatrice
2013-01-01
Traditional emotion theories stress the importance of the face in the expression of emotions but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and emotionally congruent and incongruent face-body compounds. Participants’ fixations were measured and their pupil size recorded with eye-tracking equipment and their facial reactions measured with electromyography. The results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and importantly, vice versa as well. From their facial expressions, it appeared that observers acted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus) to happy facial expressions. What we predicted and found, was that angry and fearful cues from the face or the body, attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that the congruency between the emotional signals from the face and body facilitates the recognition of the emotion. PMID:23403886
Neural signatures of conscious and unconscious emotional face processing in human infants.
Jessen, Sarah; Grossmann, Tobias
2015-03-01
Human adults can process emotional information both with and without conscious awareness, and it has been suggested that the two processes rely on partly distinct brain mechanisms. However, the developmental origins of these brain processes are unknown. In the present event-related brain potential (ERP) study, we examined the brain responses of 7-month-old infants in response to subliminally (50 and 100 msec) and supraliminally (500 msec) presented happy and fearful facial expressions. Our results revealed that infants' brain responses (Pb and Nc) over central electrodes distinguished between emotions irrespective of stimulus duration, whereas the discrimination between emotions at occipital electrodes (N290 and P400) only occurred when faces were presented supraliminally (above threshold). This suggests that early in development the human brain not only discriminates between happy and fearful facial expressions irrespective of conscious perception, but also that, similar to adults, supraliminal and subliminal emotion processing relies on distinct neural processes. Our data further suggest that the processing of emotional facial expressions differs across infants depending on their behaviorally shown perceptual sensitivity. The current ERP findings suggest that distinct brain processes underpinning conscious and unconscious emotion perception emerge early in ontogeny and can therefore be seen as a key feature of human social functioning. Copyright © 2014 Elsevier Ltd. All rights reserved.
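Component effects such as the Pb, Nc, N290, and P400 reported here are typically quantified as mean amplitudes within fixed post-stimulus time windows. A rough sketch of that computation follows; the sampling rate, window bounds, and random data are illustrative assumptions, not the study's parameters.

```python
# Rough sketch of extracting mean ERP amplitudes in component windows.
import numpy as np

fs = 500                           # sampling rate in Hz (assumed)
epoch = np.random.randn(64, 600)   # channels x samples, -200..1000 ms
t0 = 100                           # sample index of stimulus onset (200 ms baseline)

def mean_amplitude(data, start_ms, end_ms):
    a = t0 + int(start_ms * fs / 1000)
    b = t0 + int(end_ms * fs / 1000)
    return data[:, a:b].mean(axis=1)   # one value per channel

nc   = mean_amplitude(epoch, 300, 600)  # e.g. an Nc-like central window
p400 = mean_amplitude(epoch, 350, 450)  # e.g. a P400-like occipital window
```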
Face Processing and Facial Emotion Recognition in Adults with Down Syndrome
ERIC Educational Resources Information Center
Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial
2008-01-01
Face processing and facial expression recognition was investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…
Discrimination of emotional facial expressions by tufted capuchin monkeys (Sapajus apella).
Calcutt, Sarah E; Rubin, Taylor L; Pokorny, Jennifer J; de Waal, Frans B M
2017-02-01
Tufted or brown capuchin monkeys (Sapajus apella) have been shown to recognize conspecific faces as well as categorize them according to group membership. Little is known, though, about their capacity to differentiate between emotionally charged facial expressions or whether facial expressions are processed as a collection of features or configurally (i.e., as a whole). In 3 experiments, we examined whether tufted capuchins (a) differentiate photographs of neutral faces from either affiliative or agonistic expressions, (b) use relevant facial features to make such choices or view the expression as a whole, and (c) demonstrate an inversion effect for facial expressions suggestive of configural processing. Using an oddity paradigm presented on a computer touchscreen, we collected data from 9 adult and subadult monkeys. Subjects discriminated between emotional and neutral expressions with an exceptionally high success rate, including differentiating open-mouth threats from neutral expressions even when the latter contained varying degrees of visible teeth and mouth opening. They also showed an inversion effect for facial expressions, results that may indicate that quickly recognizing expressions does not originate solely from feature-based processing but likely a combination of relational processes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
2016-11-01
Features of behavioral variant frontotemporal dementia (bvFTD) such as executive dysfunction, apathy, and impaired empathic abilities are also observed in major depressive disorder (MDD). This may contribute to the reason why early-stage bvFTD is often misdiagnosed as MDD. New assessment tools are thus needed to improve the early diagnosis of bvFTD. Although emotion processing is affected in both bvFTD and MDD, growing evidence indicates that the pattern of emotion processing deficits varies between the two disorders. As such, emotion processing paradigms have substantial potential to distinguish bvFTD from MDD. The current study compared 25 patients with bvFTD, 21 patients with MDD, 21 patients with Alzheimer disease (AD) dementia, and 31 healthy participants on a novel facial emotion intensity rating task. Stimuli comprised morphed faces from the Ekman and Friesen stimulus set containing faces of each sex with two different degrees of emotion intensity for each of the six basic emotions. Analyses of covariance uncovered a significant dissociation between bvFTD and MDD patients in rating the intensity of negative emotions: bvFTD patients underrated negative emotions overall, whereas MDD patients overrated them relative to healthy participants. In contrast, AD dementia patients rated negative emotions similarly to healthy participants, suggesting no impact of cognitive deficits on rating facial emotions. By strongly differentiating bvFTD and MDD patients through negative facial emotions, this sensitive and short rating task might help improve the early diagnosis of bvFTD. Copyright © 2016 American Association for Geriatric Psychiatry. All rights reserved.
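The dissociation reported here amounts to opposite-signed rating biases relative to controls; a toy sketch of that bias score, with all numbers invented:

```python
# Sketch of the rating-bias logic: a patient group's mean intensity
# ratings for negative emotions compared against healthy controls.
control_ratings = [4.1, 3.8, 4.3]   # mean ratings, three negative emotions
bvftd_ratings   = [3.2, 3.0, 3.5]   # underrating -> negative bias score
mdd_ratings     = [4.8, 4.6, 5.0]   # overrating  -> positive bias score

def bias(group, control):
    return sum(g - c for g, c in zip(group, control)) / len(group)

print("bvFTD bias:", bias(bvftd_ratings, control_ratings))   # < 0
print("MDD bias:  ", bias(mdd_ratings, control_ratings))     # > 0
```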
Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio
2015-12-01
The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.
Tang, Qingting; Chen, Xu; Hu, Jia; Liu, Ying
2017-01-01
Our study explored how priming with a secure base schema affects the processing of emotional facial stimuli in individuals with attachment anxiety. We enrolled 42 undergraduate students between 18 and 27 years of age, and divided them into two groups: attachment anxiety and attachment secure. All participants were primed under two conditions: secure priming using references to the partner, and neutral priming using neutral references. We performed repeated attachment security priming combined with a dual-task paradigm and functional magnetic resonance imaging. Participants' reaction times in responding to the facial stimuli were also measured. Attachment security priming can facilitate an individual's processing of positive emotional faces; for instance, the presentation of the partner's name was associated with stronger activity in a wide range of brain regions and faster reaction times to positive facial expressions. The current finding of higher activity in left-hemisphere regions for secure priming rather than neutral priming is consistent with the prediction that attachment security priming triggers the spread of the activation of a positive emotional state. However, the difference in brain activity during processing of both positive and negative emotional facial stimuli between the two priming conditions appeared in the attachment anxiety group alone. This study indicates that the effect of attachment secure priming on the processing of emotional facial stimuli could be mediated by chronic attachment anxiety. In addition, it highlights the association between higher-order processes of the attachment system (secure attachment schema priming) and the early-stage information processing system (attention), given the increased attention toward the effects of the secure base schema on the processing of emotion- and attachment-related information among the insecure population. Thus, the present study has applications in providing directions for the clinical treatment of mood disorders in attachment anxiety. PMID:28473796
Face processing regions are sensitive to distinct aspects of temporal sequence in facial dynamics.
Reinl, Maren; Bartels, Andreas
2014-11-15
Facial movement conveys important information for social interactions, yet its neural processing is poorly understood. Computational models propose that shape- and temporal sequence sensitive mechanisms interact in processing dynamic faces. While face processing regions are known to respond to facial movement, their sensitivity to particular temporal sequences has barely been studied. Here we used fMRI to examine the sensitivity of human face-processing regions to two aspects of directionality in facial movement trajectories. We presented genuine movie recordings of increasing and decreasing fear expressions, each of which were played in natural or reversed frame order. This two-by-two factorial design matched low-level visual properties, static content and motion energy within each factor, emotion-direction (increasing or decreasing emotion) and timeline (natural versus artificial). The results showed sensitivity for emotion-direction in FFA, which was timeline-dependent as it only occurred within the natural frame order, and sensitivity to timeline in the STS, which was emotion-direction-dependent as it only occurred for decreased fear. The occipital face area (OFA) was sensitive to the factor timeline. These findings reveal interacting temporal sequence sensitive mechanisms that are responsive to both ecological meaning and to prototypical unfolding of facial dynamics. These mechanisms are temporally directional, provide socially relevant information regarding emotional state or naturalness of behavior, and agree with predictions from modeling and predictive coding theory. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
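The two-by-two design described above (emotion-direction x timeline) can be written out explicitly, for example when building condition lists for an fMRI design matrix; the condition names in this small sketch are illustrative.

```python
# Sketch of enumerating the four cells of a 2x2 factorial design.
from itertools import product

emotion_direction = ["increasing_fear", "decreasing_fear"]
timeline = ["natural_order", "reversed_order"]

conditions = [f"{e}/{t}" for e, t in product(emotion_direction, timeline)]
print(conditions)
# ['increasing_fear/natural_order', 'increasing_fear/reversed_order', ...]
```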
Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas
2012-01-01
There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states. PMID:22844519
Drug effects on responses to emotional facial expressions: recent findings.
Miller, Melissa A; Bershad, Anya K; de Wit, Harriet
2015-09-01
Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally.
Enhanced embodied response following ambiguous emotional processing.
Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial
2012-08-01
It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.
Computerised analysis of facial emotion expression in eating disorders.
Leppanen, Jenni; Dapelo, Marcela Marin; Davies, Helen; Lang, Katie; Treasure, Janet; Tchanturia, Kate
2017-01-01
Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has been frequently reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. 297 participants with AN, BN, or RecAN, and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. The findings mirrored those from previous work, showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip compared to the AN group. There were no differences in emotion expression during the sad film clip. These findings support the use of computerised methods to analyse emotion expression in EDs. They also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile.
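Automated facial coding software of this kind outputs frame-by-frame expression estimates that must be aggregated per participant before any group comparison. The sketch below shows one plausible way to do this from a CSV export; the file name and column names ('participant', 'happy') are assumptions, not FaceReader's actual export format.

```python
# Sketch of aggregating frame-level expression estimates per participant.
import csv
from collections import defaultdict

def mean_happiness(csv_path):
    sums, counts = defaultdict(float), defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            sums[row["participant"]] += float(row["happy"])
            counts[row["participant"]] += 1
    return {p: sums[p] / counts[p] for p in sums}

# e.g. compare mean positive expression during the happy clip across groups:
# happiness_by_participant = mean_happiness("happy_clip_frames.csv")
```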
Time course of implicit processing and explicit processing of emotional faces and emotional words.
Frühholz, Sascha; Jellinghaus, Anne; Herrmann, Manfred
2011-05-01
Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity. Copyright © 2011 Elsevier B.V. All rights reserved.
[Recognition of facial expression of emotions in Parkinson's disease: a theoretical review].
Alonso-Recio, L; Serrano-Rodriguez, J M; Carvajal-Molina, F; Loeches-Alonso, A; Martin-Plasencia, P
2012-04-16
Emotional facial expression is a basic guide during social interaction; therefore, alterations in expressing or recognizing it are important limitations for communication. Our aim was to examine facial expression recognition abilities and their possible impairment in Parkinson's disease. First, we review the studies on this topic, which have not yielded entirely consistent results. Second, we analyze the factors that may explain these discrepancies and, in particular, as a third objective, we consider the relationship between emotion recognition problems and the cognitive impairment associated with the disease. Finally, we propose alternative strategies for designing studies that could clarify the status of these abilities in Parkinson's disease. Most studies suggest deficits in facial expression recognition, especially for expressions with negative emotional content. However, it is possible that these alterations are related to impairments that also appear in the course of the disease in other perceptual and executive processes. To make progress on this issue, we consider it necessary to design emotion recognition studies that differentially implicate executive or visuospatial processes, and/or that contrast cognitive abilities with facial expressions against non-emotional stimuli. Clarifying the status of these abilities, as well as increasing our knowledge of the functional consequences of the characteristic brain damage in the disease, may indicate whether special attention should be paid to their rehabilitation within intervention programs.
Testosterone reactivity to facial display of emotions in men and women.
Zilioli, Samuele; Caldbick, Evan; Watson, Neil V
2014-05-01
Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process - the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone - has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n=85) and women (n=79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone. Copyright © 2014 Elsevier Inc. All rights reserved.
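The reactivity measure in designs like this is simply the pre-to-post change in salivary testosterone, averaged within each between-subjects condition; a minimal sketch with invented values:

```python
# Sketch of a testosterone-reactivity computation
# (values are illustrative pg/mL concentrations).
samples = [
    {"cond": "opposite_sex", "pre": 95.0, "post": 108.0},
    {"cond": "opposite_sex", "pre": 88.0, "post": 97.0},
    {"cond": "same_sex",     "pre": 92.0, "post": 90.0},
    {"cond": "same_sex",     "pre": 99.0, "post": 98.0},
]

def mean_change(cond):
    deltas = [s["post"] - s["pre"] for s in samples if s["cond"] == cond]
    return sum(deltas) / len(deltas)

print("opposite-sex condition:", mean_change("opposite_sex"))
print("same-sex condition:    ", mean_change("same_sex"))
```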
Hindocha, Chandni; Freeman, Tom P; Schafer, Grainne; Gardener, Chelsea; Das, Ravi K; Morgan, Celia J A; Curran, H Valerie
2015-03-01
Acute administration of the primary psychoactive constituent of cannabis, Δ-9-tetrahydrocannabinol (THC), impairs human facial affect recognition, implicating the endocannabinoid system in emotional processing. Another main constituent of cannabis, cannabidiol (CBD), has seemingly opposite functional effects on the brain. This study aimed to determine the effects of THC and CBD, both alone and in combination on emotional facial affect recognition. 48 volunteers, selected for high and low frequency of cannabis use and schizotypy, were administered, THC (8 mg), CBD (16 mg), THC+CBD (8 mg+16 mg) and placebo, by inhalation, in a 4-way, double-blind, placebo-controlled crossover design. They completed an emotional facial affect recognition task including fearful, angry, happy, sad, surprise and disgust faces varying in intensity from 20% to 100%. A visual analogue scale (VAS) of feeling ‘stoned’ was also completed. In comparison to placebo, CBD improved emotional facial affect recognition at 60% emotional intensity; THC was detrimental to the recognition of ambiguous faces of 40% intensity. The combination of THC+CBD produced no impairment. Relative to placebo, both THC alone and combined THC+CBD equally increased feelings of being ‘stoned’. CBD did not influence feelings of ‘stoned’. No effects of frequency of use or schizotypy were found. In conclusion, CBD improves recognition of emotional facial affect and attenuates the impairment induced by THC. This is the first human study examining the effects of different cannabinoids on emotional processing. It provides preliminary evidence that different pharmacological agents acting upon the endocannabinoid system can both improve and impair recognition of emotional faces. PMID:25534187
Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet
2015-01-01
Background: Difficulties in social cognition have been identified in eating disorders (EDs), but the exact profile of these abnormalities is unclear. The aim of this study is to examine distinct processes of social-cognition in this patient group, including attentional processing and recognition, empathic reaction and evoked facial expression in response to discrete vignettes of others displaying positive (i.e. happiness) or negative (i.e. sadness and anger) emotions. Method: One hundred and thirty-eight female participants were included in the study: 73 healthy controls (HCs) and 65 individuals with an ED (49 with Anorexia Nervosa and 16 with Bulimia Nervosa). Self-report and behavioural measures were used. Results: Participants with EDs did not display specific abnormalities in emotional processing, recognition and empathic response to others' basic discrete emotions. However, they had poorer facial expressivity and a tendency to turn away from emotional displays. Conclusion: Treatments focusing on the development of non-verbal emotional communication skills might be of benefit for patients with EDs. PMID:26252220
Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.
Pecchinenda, Anna; Petrucci, Manuel
2016-01-01
Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925
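The gaze-cueing effect referred to above is conventionally the invalid-minus-valid reaction-time difference, computed per expression and load condition; the sketch below uses illustrative group means, not the study's data.

```python
# Sketch of gaze-cueing effects per expression and cognitive load.
mean_rts = {
    # (expression, load): {"valid": ms, "invalid": ms}
    ("angry", "high"):   {"valid": 428, "invalid": 471},
    ("angry", "low"):    {"valid": 433, "invalid": 452},
    ("neutral", "high"): {"valid": 440, "invalid": 447},
    ("neutral", "low"):  {"valid": 431, "invalid": 455},
}

for (emotion, load), rt in mean_rts.items():
    effect = rt["invalid"] - rt["valid"]
    print(f"{emotion:7s} / {load:4s} load: cueing effect = {effect} ms")
```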
Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G
2013-11-01
The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter has been assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.
Wieser, Matthias J; Brosch, Tobias
2012-01-01
Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.
Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.
Umiltà, Maria Allessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio
2013-01-01
Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing, in former child soldiers and non-combatant civilians. All participants have experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adults war survivors. A total of 76 young, male adults (38 former child soldier survivors and 38 civilian survivors) were tested in order to assess participants' ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in the other three emotions and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movies presentation the recognition of sadness was significantly worse for soldiers. While both former child soldiers and civilians were found to be able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. Such bias was particularly pronounced in former child soldiers. These findings point to a pervasive long-lasting effect of childhood exposure to violence on emotion processing in later life. PMID:24027541
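Response-bias patterns like the sadness-for-anger confusions reported here are usually read off a confusion matrix of shown versus chosen labels; below is a minimal sketch with invented responses.

```python
# Sketch of an emotion-labeling confusion matrix
# ((shown, chosen) -> count), from which mislabeling patterns
# such as sadness read as anger can be identified.
from collections import Counter

responses = [("sad", "anger"), ("sad", "sad"), ("sad", "anger"),
             ("anger", "anger"), ("fear", "sad"), ("happy", "happy")]

confusion = Counter(responses)
for (shown, chosen), n in sorted(confusion.items()):
    print(f"shown {shown:5s} -> chosen {chosen:5s}: {n}")
```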
Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona
2018-01-01
Facial mimicry (FM) is an automatic response to imitate the facial expressions of others. However, neural correlates of the phenomenon are as yet not well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., inferior frontal gyrus, a classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions. PMID:29467691
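Facial mimicry in such EMG data is identified by the signed pattern of baseline-corrected activity across muscles; a toy sketch of that check follows (muscle abbreviations follow the abstract, values are invented microvolt means).

```python
# Sketch of a simple facial-mimicry check: baseline-corrected EMG change
# per muscle, where mimicry of a happy face appears as increased
# zygomaticus (ZM) / orbicularis oculi (OO) activity and decreased
# corrugator supercilii (CS) activity.
baseline   = {"ZM": 2.1, "OO": 1.8, "CS": 2.5}
happy_face = {"ZM": 3.4, "OO": 2.6, "CS": 1.9}

change = {m: happy_face[m] - baseline[m] for m in baseline}
is_mimicry = change["ZM"] > 0 and change["OO"] > 0 and change["CS"] < 0
print(change, "mimicry pattern:", is_mimicry)
```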
Anders, Silke; Sack, Benjamin; Pohl, Anna; Münte, Thomas; Pramstaller, Peter; Klein, Christine; Binkofski, Ferdinand
2012-04-01
Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia–cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia–cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons (‘mirror neurons’) in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia–cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. As predicted, Parkin mutation carriers showed significantly stronger activity in the right ventrolateral premotor cortex during execution and perception of affective facial gestures than healthy controls. Furthermore, Parkin mutation carriers showed a slightly reduced ability to recognize facial emotions that was least severe in individuals who showed the strongest increase of ventrolateral premotor activity. In addition, Parkin mutation carriers showed a significantly weaker than normal increase of activity in the left lateral orbitofrontal cortex (inferior frontal gyrus pars orbitalis, Brodmann area 47), which was unrelated to facial emotion recognition ability. These findings are consistent with the hypothesis that compensatory activity in the ventrolateral premotor cortex during processing of affective facial gestures can reduce impairments in facial emotion recognition in subclinical Parkin mutation carriers. A breakdown of this compensatory mechanism might lead to the impairment of facial expressivity and facial emotion recognition observed in manifest Parkinson's disease. PMID:22434215
Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J
2018-03-14
The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically, contextually influence the perceptual processing of emotional facial expressions in a separate task even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence to the recognition of emotional facial expressions for both same and different valences.
Farsham, Aida; Abbaslou, Tahereh; Bidaki, Reza; Bozorg, Bonnie
2017-01-01
Objective: No research has been conducted on facial emotion recognition in patients with borderline personality disorder (BPD) and schizotypal personality disorder (SPD). The present study aimed at comparing facial emotion recognition in these patients with that of the general population. The neurocognitive processing of emotions can reveal the pathologic style of these 2 disorders. Method: Twenty BPD patients, 16 SPD patients, and 20 healthy individuals were selected through convenience sampling. The Structured Clinical Interview for Axis II, the Millon Personality Inventory, the Beck Depression Inventory, and a Facial Emotional Recognition Test were administered to all participants. Results: One-way ANOVA and Scheffe's post hoc test revealed significant differences in the neuropsychological assessment of facial emotion recognition between BPD and SPD patients and the normal group (p = 0.001). A significant difference was found in the recognition of fear between the BPD group and the normal population (p = 0.008). A significant difference was observed between SPD patients and the control group in the recognition of wonder (p = 0.04). The obtained results indicated a deficit in negative emotion recognition, especially of disgust; thus, it can be concluded that these patients have the same neurocognitive profile in the emotion domain. PMID:28659980
Processing of Emotional Faces in Patients with Chronic Pain Disorder: An Eye-Tracking Study.
Giel, Katrin Elisabeth; Paganini, Sarah; Schank, Irena; Enck, Paul; Zipfel, Stephan; Junne, Florian
2018-01-01
Problems in emotion processing potentially contribute to the development and maintenance of chronic pain. Theories focusing on attentional processing have suggested that dysfunctional attention deployment toward emotional information, i.e., attentional biases for negative emotions, might be one developmental and/or maintenance factor of chronic pain. We assessed self-reported alexithymia and attentional orienting to and maintenance on emotional stimuli using eye tracking in 17 patients with chronic pain disorder (CP) and two age- and sex-matched control groups: 17 healthy individuals (HC) and 17 individuals matched to CP according to depressive symptoms (DC). In a choice viewing paradigm, a dot indicated the position of the emotional picture in the next trial to allow for strategic attention deployment. Picture pairs consisted of a happy or sad facial expression and a neutral facial expression of the same individual. Participants were asked to explore the picture pairs freely. The CP and DC groups reported higher alexithymia than the HC group. HC participants showed a previously reported emotionality bias by preferentially orienting to the emotional face and preferentially maintaining on the happy face. CP and DC participants showed no facilitated early attention to sad facial expressions, and DC participants showed no facilitated early attention to happy facial expressions, while CP and HC participants did. We found no group differences in attentional maintenance. Our findings are in line with the large clinical overlap between pain and depression. The blunted initial reaction to sadness could be interpreted as a failure of the attentional system to attend to evolutionarily salient emotional stimuli or as an attempt to suppress negative emotions. These difficulties in emotion processing might contribute to the etiology or maintenance of chronic pain and depression.
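Early orienting in a choice viewing paradigm like this is often summarized as the proportion of trials whose first fixation lands on the emotional member of the pair (0.5 indicating no bias); a minimal sketch with invented trial records:

```python
# Sketch of an early-orienting bias score from eye-tracking trials.
trials = [
    {"pair": "sad_neutral",   "first_fix": "emotional"},
    {"pair": "sad_neutral",   "first_fix": "neutral"},
    {"pair": "happy_neutral", "first_fix": "emotional"},
    {"pair": "happy_neutral", "first_fix": "emotional"},
]

def orienting_bias(pair):
    subset = [t for t in trials if t["pair"] == pair]
    hits = sum(t["first_fix"] == "emotional" for t in subset)
    return hits / len(subset)   # 0.5 = no bias

print("sad bias:  ", orienting_bias("sad_neutral"))
print("happy bias:", orienting_bias("happy_neutral"))
```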
The recognition of facial emotion expressions in Parkinson's disease.
Assogna, Francesca; Pontieri, Francesco E; Caltagirone, Carlo; Spalletta, Gianfranco
2008-11-01
A limited number of studies in Parkinson's Disease (PD) suggest a disturbance of recognition of facial emotion expressions. In particular, disgust recognition impairment has been reported in unmedicated and medicated PD patients. However, the results are rather inconclusive in the definition of the degree and the selectivity of emotion recognition impairment, and an associated impairment of almost all basic facial emotions in PD is also described. Few studies have investigated the relationship with neuropsychiatric and neuropsychological symptoms with mainly negative results. This inconsistency may be due to many different problems, such as emotion assessment, perception deficit, cognitive impairment, behavioral symptoms, illness severity and antiparkinsonian therapy. Here we review the clinical characteristics and neural structures involved in the recognition of specific facial emotion expressions, and the plausible role of dopamine transmission and dopamine replacement therapy in these processes. It is clear that future studies should be directed to clarify all these issues.
ERIC Educational Resources Information Center
Carvajal, Fernando; Fernandez-Alcaraz, Camino; Rueda, Maria; Sarrion, Louise
2012-01-01
The processing of facial expressions of emotions by 23 adults with Down syndrome and moderate intellectual disability was compared with that of adults with intellectual disability of other etiologies (24 matched in cognitive level and 26 with mild intellectual disability). Each participant performed 4 tasks of the Florida Affect Battery and an…
Maternal Personality and Infants' Neural and Visual Responsivity to Facial Expressions of Emotion
ERIC Educational Resources Information Center
De Haan, Michelle; Belsky, Jay; Reid, Vincent; Volein, Agnes; Johnson, Mark H.
2004-01-01
Background: Recent investigations suggest that experience plays an important role in the development of face processing. The aim of this study was to investigate the potential role of experience in the development of the ability to process facial expressions of emotion. Method: We examined the potential role of experience indirectly by…
ERIC Educational Resources Information Center
Magnee, Maurice J. C. M.; de Gelder, Beatrice; van Engeland, Herman; Kemner, Chantal
2007-01-01
Background: Despite extensive research, it is still debated whether impairments in social skills of individuals with pervasive developmental disorder (PDD) are related to specific deficits in the early processing of emotional information. We aimed to test both automatic processing of facial affect as well as the integration of auditory and visual…
Mavratzakis, Aimee; Herbert, Cornelia; Walla, Peter
2016-01-01
In the current study, electroencephalography (EEG) was recorded simultaneously with facial electromyography (fEMG) to determine whether emotional faces and emotional scenes are processed differently at the neural level. In addition, it was investigated whether these differences can be observed at the behavioural level via spontaneous facial muscle activity. Emotional content of the stimuli did not affect early P1 activity. Emotional faces elicited enhanced amplitudes of the face-sensitive N170 component, while its counterpart, the scene-related N100, was not sensitive to the emotional content of scenes. At 220-280 ms, the early posterior negativity (EPN) was enhanced only slightly for fearful as compared to neutral or happy faces. However, its amplitudes were significantly enhanced during processing of scenes with positive content, particularly over the right hemisphere. Scenes of positive content also elicited enhanced spontaneous zygomatic activity from 500-750 ms onwards, while happy faces elicited no such changes. Contrastingly, both fearful faces and negative scenes elicited enhanced spontaneous corrugator activity at 500-750 ms after stimulus onset. However, relative to baseline, EMG changes occurred earlier for faces (250 ms) than for scenes (500 ms), whereas for scenes the activity changes were more pronounced over the whole viewing period. Taking all effects into account, the data suggest that emotional facial expressions evoke faster attentional orienting, but weaker affective neural activity and emotional behavioural responses, compared to emotional scenes. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Event-related brain responses to emotional words, pictures, and faces – a cross-domain comparison
Bayer, Mareike; Schacht, Annekathrin
2014-01-01
Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal. PMID:25339927
Maki, Yohko; Yoshida, Hiroshi; Yamaguchi, Tomoharu; Yamaguchi, Haruyasu
2013-01-01
Positivity recognition bias has been reported for facial expression as well as memory and visual stimuli in aged individuals, whereas emotional facial recognition in Alzheimer disease (AD) patients is controversial, with possible involvement of confounding factors such as deficits in the spatial processing of non-emotional facial features and in the verbal processing needed to express emotions. Thus, we examined whether recognition of positive facial expressions was preserved in AD patients, by adopting a new method that eliminated the influence of these confounding factors. Sensitivity to six basic facial expressions (happiness, sadness, surprise, anger, disgust, and fear) was evaluated in 12 outpatients with mild AD, 17 aged normal controls (ANC), and 25 young normal controls (YNC). To eliminate factors related to non-emotional facial features, averaged faces were prepared as stimuli. To eliminate factors related to verbal processing, participants were required to match the stimulus and answer images, avoiding the use of verbal labels. In recognition of happiness, there was no difference in sensitivity between YNC and ANC, or between ANC and AD patients. AD patients were less sensitive than ANC in recognition of sadness, surprise, and anger. ANC were less sensitive than YNC in recognition of surprise, anger, and disgust. Within the AD patient group, sensitivity to happiness was significantly higher than to the other five expressions. In AD patients, recognition of happiness was thus relatively preserved: it was the most sensitively recognized expression and withstood the influences of age and disease.
Sex differences in event-related potentials and attentional biases to emotional facial stimuli.
Pfabigan, Daniela M; Lamplmayr-Kragl, Elisabeth; Pintzinger, Nina M; Sailer, Uta; Tran, Ulrich S
2014-01-01
Attentional processes play an important role in the processing of emotional information. Previous research reported attentional biases during stimulus processing in anxiety and depression. However, sex differences in the processing of emotional stimuli, and higher prevalence rates of anxiety disorders among women compared to men, suggest that attentional biases may also differ between the two sexes. The present study used a modified version of the dot probe task with happy, angry, and neutral facial stimuli to investigate the time course of attentional biases in healthy volunteers. Moreover, associations of attentional biases with alexithymia were examined on the behavioral and physiological level. Event-related potentials were measured while 21 participants (11 women) performed the task, also utilizing, for the first time, a difference wave approach in the analysis to highlight emotion-specific aspects. Women showed overall enhanced probe P1 amplitudes compared to men, in particular after rewarding facial stimuli. Using the difference wave approach, probe P1 amplitudes appeared specifically enhanced with regard to congruently presented happy facial stimuli among women, compared to men. Both methods yielded enhanced probe P1 amplitudes after presentation of the emotional stimulus in the left compared to the right visual hemifield. Probe P1 amplitudes correlated negatively with self-reported alexithymia; most of these correlations were observable only in women. Our results suggest that women orient their attention to facial stimuli to a greater extent than men, and corroborate that alexithymia is a correlate of reduced emotional reactivity at a neuronal level. We recommend using a difference wave approach when addressing attentional processes of orientation and disengagement in future studies.
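A minimal numpy sketch of the difference-wave idea referred to above: average single-trial epochs per condition, subtract the neutral ERP from the emotional ERP, then score a component as the mean amplitude in its time window. Epochs, channel, and window are hypothetical.

```python
import numpy as np

sfreq = 500                                    # sampling rate in Hz
times = np.arange(-0.1, 0.5, 1 / sfreq)        # epoch from -100 to 500 ms

# Hypothetical single-trial epochs (n_trials, n_times) at one posterior site
rng = np.random.default_rng(1)
happy_congruent = rng.normal(0, 1, (80, times.size))
neutral         = rng.normal(0, 1, (80, times.size))

erp_happy   = happy_congruent.mean(axis=0)
erp_neutral = neutral.mean(axis=0)
difference_wave = erp_happy - erp_neutral      # isolates emotion-specific activity

p1_window = (times >= 0.08) & (times <= 0.13)  # ~80-130 ms post-stimulus
p1_amplitude = difference_wave[p1_window].mean()
print(f"P1 difference amplitude: {p1_amplitude:.3f} (arbitrary units)")
```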
Hartley, Alan A; Ravich, Zoe; Stringer, Sarah; Wiley, Katherine
2015-09-01
Memory for both facial emotional expression and facial identity was explored in younger and older adults in 3 experiments using a delayed match-to-sample procedure. Memory sets of 1, 2, or 3 faces were presented, which were followed by a probe after a 3-s retention interval. There was very little difference between younger and older adults in memory for emotional expressions, but memory for identity was substantially impaired in the older adults. Possible explanations for spared memory for emotional expressions include socioemotional selectivity theory as well as the existence of overlapping yet distinct brain networks for processing of different emotions. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R
2014-04-01
Componential theories assume that emotion episodes consist of emergent and dynamic response changes to relevant events in different components, such as appraisal, physiology, motivation, expression, and subjective feeling. In particular, Scherer's Component Process Model hypothesizes that subjective feeling emerges when the synchronization (or coherence) of appraisal-driven changes between emotion components has reached a critical threshold. We examined the prerequisite of this synchronization hypothesis for appraisal-driven response changes in facial expression. The appraisal process was manipulated by using feedback stimuli, presented in a gambling task. Participants' responses to the feedback were investigated in concurrently recorded brain activity related to appraisal (event-related potentials, ERP) and facial muscle activity (electromyography, EMG). Using principal component analysis, the prediction of appraisal-driven response changes in facial EMG was examined. Results support this prediction: early cognitive processes (related to the feedback-related negativity) seem to primarily affect the upper face, whereas processes that modulate P300 amplitudes tend to predominantly drive cheek region responses. Copyright © 2013 Elsevier B.V. All rights reserved.
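A hedged sketch of the analysis logic in the study above: reduce trial-wise ERP amplitudes with principal component analysis, then test whether the component scores (e.g., FRN-like and P300-like factors) predict upper-face and lower-face EMG responses. All data here are simulated, and the variable names are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_trials = 200
erp_features = rng.normal(size=(n_trials, 20))   # e.g., amplitudes at 20 sites
pca = PCA(n_components=2).fit(erp_features)
scores = pca.transform(erp_features)             # trial-wise component scores

# Simulated EMG: corrugator tied to component 1, zygomaticus to component 2
corrugator  = 0.8 * scores[:, 0] + rng.normal(0, 1, n_trials)
zygomaticus = 0.6 * scores[:, 1] + rng.normal(0, 1, n_trials)

for muscle, y in [("corrugator", corrugator), ("zygomaticus", zygomaticus)]:
    r2 = LinearRegression().fit(scores, y).score(scores, y)
    print(f"{muscle}: R^2 = {r2:.2f}")
```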
Fujiwara, Takeo; Mizuki, Rie; Miki, Takahiro; Chemtob, Claude
2015-01-01
“Emotional numbing” is a symptom of post-traumatic stress disorder (PTSD) characterized by a loss of interest in usually enjoyable activities, feeling detached from others, and an inability to express a full range of emotions. Emotional numbing is usually assessed through self-report, and is particularly difficult to ascertain among young children. We conducted a pilot study to explore the use of facial expression ratings in response to a comedy video clip to assess emotional reactivity among preschool children directly exposed to the Great East Japan Earthquake. This study included 23 child participants. Child PTSD symptoms were measured using a modified version of the Parent’s Report of the Child’s Reaction to Stress scale. Children were filmed while watching a 2-min video compilation of natural scenes (‘baseline video’) followed by a 2-min video clip from a television comedy (‘comedy video’). Children’s facial expressions were processed using the Noldus FaceReader software, which implements the Facial Action Coding System (FACS). We investigated the association between PTSD symptom scores and facial emotion reactivity using linear regression analysis. Children with higher PTSD symptom scores showed a significantly greater proportion of neutral facial expressions, controlling for sex, age, and baseline facial expression (p < 0.05). This pilot study suggests that facial emotion reactivity, measured using facial expression recognition software, has the potential to index emotional numbing in young children. This pilot study adds to the emerging literature on using experimental psychopathology methods to characterize children’s reactions to disasters. PMID:26528206
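The regression reported above can be sketched as follows: the proportion of neutral expressions during the comedy clip regressed on PTSD symptom score, adjusting for sex, age, and the baseline-video expression. The data frame below is simulated with the study's sample size; variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 23
df = pd.DataFrame({
    "ptsd":     rng.integers(0, 30, n),       # PTSD symptom score
    "sex":      rng.integers(0, 2, n),        # 0/1
    "age":      rng.uniform(3, 6, n),         # years
    "baseline": rng.uniform(0.2, 0.8, n),     # neutral proportion, baseline video
})
# Simulated outcome: neutral proportion during the comedy video
df["neutral"] = 0.3 + 0.01 * df["ptsd"] + 0.3 * df["baseline"] \
    + rng.normal(0, 0.05, n)

model = smf.ols("neutral ~ ptsd + sex + age + baseline", data=df).fit()
print(model.params["ptsd"], model.pvalues["ptsd"])
```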
Multimedia Content Development as a Facial Expression Datasets for Recognition of Human Emotions
NASA Astrophysics Data System (ADS)
Mamonto, N. E.; Maulana, H.; Liliana, D. Y.; Basaruddin, T.
2018-02-01
Previously developed datasets contain facial expressions from foreign people. The development of this multimedia content aims to answer the problems experienced by the research team and other researchers who will conduct similar research. The method used in developing the multimedia content as a facial expression dataset for human emotion recognition is the Villamil-Molina version of the multimedia development method. The multimedia content was developed with 10 subjects (talents), each performing 3 shots and demonstrating 19 facial expressions in each shot. After the process of editing and rendering, tests were carried out, concluding that the multimedia content can be used as a facial expression dataset for the recognition of human emotions.
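A short sketch of how a dataset organized as described (10 talents x 3 shots x 19 expressions) might be enumerated for downstream recognition work; the directory naming scheme here is hypothetical, not the authors' actual layout.

```python
from itertools import product
from pathlib import Path

TALENTS = [f"talent{t:02d}" for t in range(1, 11)]        # 10 subjects
SHOTS = [f"shot{s}" for s in range(1, 4)]                 # 3 shots each
EXPRESSIONS = [f"expr{e:02d}" for e in range(1, 20)]      # 19 expression labels

root = Path("facial_expression_dataset")
clips = [root / t / s / f"{e}.mp4"
         for t, s, e in product(TALENTS, SHOTS, EXPRESSIONS)]
assert len(clips) == 10 * 3 * 19   # 570 expected clips
```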
Identity modulates short-term memory for facial emotion
Galster, Murray; Kahana, Michael J.; Wilson, Hugh R.; Sekuler, Robert
2010-01-01
For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects’ similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces’ perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces’ perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental. PMID:19897794
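The Experiment 2 analysis above can be sketched with scikit-learn: multidimensional scaling turns a matrix of pairwise dissimilarity judgments into a 2-D perceptual map of the stimuli. The dissimilarity matrix below is hypothetical (4 faces: 2 identities x 2 expressions).

```python
import numpy as np
from sklearn.manifold import MDS

# Symmetric pairwise dissimilarities among 4 face stimuli (hypothetical values)
d = np.array([[0.0, 0.3, 0.6, 0.8],
              [0.3, 0.0, 0.7, 0.6],
              [0.6, 0.7, 0.0, 0.4],
              [0.8, 0.6, 0.4, 0.0]])

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(d)   # rows: stimuli; columns: 2-D coordinates
print(coords)
```

Distances in the recovered space can then be correlated with recognition errors, as the abstract describes for predicting correct and false recognitions.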
Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence
Schirmer, Annett; Adolphs, Ralph
2017-01-01
Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here, we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses, and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly non-overlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments. PMID:28173998
Amphetamine as a social drug: Effects of d-amphetamine on social processing and behavior
Wardle, Margaret C.; Garner, Matthew J.; Munafò, Marcus R.; de Wit, Harriet
2012-01-01
Rationale: Drug users often report using drugs to enhance social situations, and empirical studies support the idea that drugs increase both social behavior and the value of social interactions. One way drugs may affect social behavior is by altering social processing, for example by decreasing perceptions of negative emotion in others. Objectives: We examined effects of d-amphetamine on processing of emotional facial expressions, and on the social behavior of talking. We predicted amphetamine would enhance attention, identification and responsivity to positive expressions, and that this in turn would predict increased talkativeness. Methods: Over three sessions, 36 healthy normal adults received placebo, 10 mg, and 20 mg d-amphetamine under counterbalanced double-blind conditions. At each session we measured processing of happy, fearful, sad and angry expressions using an attentional visual probe task, a dynamic emotion identification task, and measures of facial muscle activity. We also measured talking. Results: Amphetamine decreased the threshold for identifying all emotions, increased negative facial responses to sad expressions, and increased talkativeness. Contrary to our hypotheses, amphetamine did not alter attention to, identification of, or facial responses to positive emotions specifically. Interestingly, the drug's lowering of the identification threshold for all emotions was uniquely related to increased talkativeness, even after controlling for overall sensitivity to amphetamine. Conclusions: The results suggest that amphetamine may encourage sociability by increasing sensitivity to subtle emotional expressions. These findings suggest novel social mechanisms that may contribute to the rewarding effects of amphetamine. PMID:22526538
Drug effects on responses to emotional facial expressions: recent findings
Miller, Melissa A.; Bershad, Anya K.; de Wit, Harriet
2016-01-01
Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally. PMID:26226144
Damaskinou, Nikoleta; Watling, Dawn
2018-05-01
This study was designed to investigate the patterns of electrophysiological responses of early emotional processing at frontocentral sites in adults and to explore whether adults' activation patterns show hemispheric lateralization for facial emotion processing. Thirty-five adults viewed full face and chimeric face stimuli. After viewing two faces, sequentially, participants were asked to decide which of the two faces was more emotive. The findings from the standard faces and the chimeric faces suggest that emotion processing is present during the early phases of face processing in the frontocentral sites. In particular, sad emotional faces are processed differently than neutral and happy (including happy chimeras) faces in these early phases of processing. Further, there were differences in the electrode amplitudes over the left and right hemisphere, particularly in the early temporal window. This research provides supporting evidence that the chimeric face test is a test of emotion processing that elicits right hemispheric processing.
Boutsen, Frank A; Dvorak, Justin D; Pulusu, Vinay K; Ross, Elliott D
2017-04-01
Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on upper/lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased to the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence other perceptual factors must be in play, such as the phenomenon of virtual scanning. Published by Elsevier Ltd.
Davis, Jennifer S; Fani, Negar; Ressler, Kerry; Jovanovic, Tanja; Tone, Erin B.; Bradley, Bekh
2014-01-01
Research indicates that some individuals who were maltreated in childhood demonstrate biases in social information processing. However, the mechanisms through which these biases develop remain unclear—one possible mechanism is via attachment-related processes. Childhood maltreatment increases risk for insecure attachment. The internal working models of self and others associated with insecure attachment may impact the processing of socially relevant information, particularly emotion conveyed in facial expressions. We investigated associations among child abuse, attachment anxiety and avoidance, and attention biases for emotion in an adult population. Specifically, we examined how self-reported attachment influences the relationship between childhood abuse and attention bias for emotion. A dot probe task consisting of happy, threatening, and neutral female facial stimuli was used to assess possible biases in attention for socially relevant stimuli. Our findings indicate that attachment anxiety moderated the relationship between maltreatment and attention bias for happy emotion; among individuals with a child abuse history, attachment anxiety significantly predicted an attention bias away from happy facial stimuli. PMID:24680873
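A minimal sketch of the kind of analysis described: a dot-probe attention bias score (mean RT on incongruent minus congruent trials) regressed on abuse history, attachment anxiety, and their interaction, so that a significant interaction term indicates moderation. All data are simulated and variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 150
df = pd.DataFrame({
    "abuse":   rng.integers(0, 2, n),     # child abuse history (0/1)
    "anxiety": rng.normal(0, 1, n),       # attachment anxiety (z-scored)
})
# Happy-face bias in ms; negative values = attention away from happy faces
df["bias"] = -8 * df["abuse"] * df["anxiety"] + rng.normal(0, 15, n)

model = smf.ols("bias ~ abuse * anxiety", data=df).fit()
print(model.params["abuse:anxiety"], model.pvalues["abuse:anxiety"])
```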
Seeing Life through Positive-Tinted Glasses: Color–Meaning Associations
Gil, Sandrine; Le Bigot, Ludovic
2014-01-01
There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning. PMID:25098167
Balconi, Michela; Mazza, Guido
2009-11-01
Modulation of alpha brain oscillations was analyzed in response to masked emotional facial expressions. In addition, the behavioural activation system (BAS) and behavioural inhibition system (BIS) were considered as explanatory factors to verify the effect of motivational significance on cortical activity. Nineteen subjects viewed a wide range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral). The results demonstrated that anterior frontal sites were more active than central and posterior sites in response to facial stimuli. Moreover, right-side responses varied as a function of emotion type, with increased right-frontal activity for negative emotions. Finally, whereas higher-BIS subjects generated more right-hemisphere activation for some negative emotions (such as fear, anger, and surprise), Reward-BAS subjects were more responsive to positive emotion (happiness) within the left hemisphere. Valence and the potential threatening power of facial expressions were considered to elucidate these cortical differences.
Neural bases of different cognitive strategies for facial affect processing in schizophrenia.
Fakra, Eric; Salgado-Pineda, Pilar; Delaveau, Pauline; Hariri, Ahmad R; Blin, Olivier
2008-03-01
To examine the neural basis and dynamics of facial affect processing in schizophrenic patients as compared to healthy controls. Fourteen schizophrenic patients and fourteen matched controls performed a facial affect identification task during fMRI acquisition. The emotional task included an intuitive emotional condition (matching emotional faces) and a more cognitively demanding condition (labeling emotional faces). Individual analysis for each emotional condition, and second-level t-tests examining both within-, and between-group differences, were carried out using a random effects approach. Psychophysiological interactions (PPI) were tested for variations in functional connectivity between amygdala and other brain regions as a function of changes in experimental conditions (labeling versus matching). During the labeling condition, both groups engaged similar networks. During the matching condition, schizophrenics failed to activate regions of the limbic system implicated in the automatic processing of emotions. PPI revealed an inverse functional connectivity between prefrontal regions and the left amygdala in healthy volunteers but there was no such change in patients. Furthermore, during the matching condition, and compared to controls, patients showed decreased activation of regions involved in holistic face processing (fusiform gyrus) and increased activation of regions associated with feature analysis (inferior parietal cortex, left middle temporal lobe, right precuneus). Our findings suggest that schizophrenic patients invariably adopt a cognitive approach when identifying facial affect. The distributed neocortical network observed during the intuitive condition indicates that patients may resort to feature-based, rather than configuration-based, processing and may constitute a compensatory strategy for limbic dysfunction.
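The PPI logic used above can be sketched as follows: the interaction regressor is the element-wise product of a (centered) psychological condition vector (labeling vs. matching) and the seed-region (amygdala) timecourse, entered into a GLM alongside the main effects. Real fMRI pipelines typically deconvolve the seed signal to the neural level and reconvolve with a hemodynamic response function; this toy numpy version skips that step, and all signals are simulated.

```python
import numpy as np

rng = np.random.default_rng(5)
n_scans = 200
psych = np.where(np.arange(n_scans) % 40 < 20, 1.0, -1.0)  # labeling vs matching
seed = rng.normal(size=n_scans)                            # amygdala timecourse
ppi = (psych - psych.mean()) * seed                        # interaction regressor

X = np.column_stack([np.ones(n_scans), psych, seed, ppi])  # design matrix
y = 0.5 * ppi + rng.normal(size=n_scans)                   # a target region
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"PPI beta: {beta[3]:.2f}")  # nonzero -> condition-dependent coupling
```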
NK1 receptor antagonism and emotional processing in healthy volunteers.
Chandra, P; Hafizi, S; Massey-Chase, R M; Goodwin, G M; Cowen, P J; Harmer, C J
2010-04-01
The neurokinin-1 (NK(1)) receptor antagonist, aprepitant, showed activity in several animal models of depression; however, its efficacy in clinical trials was disappointing. There is little knowledge of the role of NK(1) receptors in human emotional behaviour to help explain this discrepancy. The aim of the current study was to assess the effects of a single oral dose of aprepitant (125 mg) on models of emotional processing sensitive to conventional antidepressant drug administration in 38 healthy volunteers, randomly allocated to receive aprepitant or placebo in a between-groups, double-blind design. Performance on measures of facial expression recognition, emotional categorisation, memory, and an attentional visual-probe task was assessed following drug absorption. Relative to placebo, aprepitant improved recognition of happy facial expressions and increased vigilance to emotional information in the unmasked condition of the visual probe task. In contrast, aprepitant impaired emotional memory and slowed responses in the facial expression recognition task, suggesting possible deleterious effects on cognition. These results suggest that while antagonism of NK(1) receptors does affect emotional processing in humans, its effects are more restricted and less consistent across tasks than those of conventional antidepressants. Human models of emotional processing may provide a useful means of assessing the likely therapeutic potential of new treatments for depression.
Computerised analysis of facial emotion expression in eating disorders
2017-01-01
Background: Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has been frequently reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. Method: 297 participants with AN, BN, RecAN, and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. Results: The findings mirrored those from previous work, showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip compared to the AN group. There were no differences in emotion expression during the sad film clip. Discussion: These findings support the use of computerised methods to analyse emotion expression in EDs. The findings also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile. PMID:28575109
Wolf, Richard C; Pujara, Maia; Baskaya, Mustafa K; Koenigs, Michael
2016-09-01
Facial emotion recognition is a critical aspect of human communication. Since abnormalities in facial emotion recognition are associated with social and affective impairment in a variety of psychiatric and neurological conditions, identifying the neural substrates and psychological processes underlying facial emotion recognition will help advance basic and translational research on social-affective function. Ventromedial prefrontal cortex (vmPFC) has recently been implicated in deploying visual attention to the eyes of emotional faces, although there is mixed evidence regarding the importance of this brain region for recognition accuracy. In the present study of neurological patients with vmPFC damage, we used an emotion recognition task with morphed facial expressions of varying intensities to determine (1) whether vmPFC is essential for emotion recognition accuracy, and (2) whether instructed attention to the eyes of faces would be sufficient to improve any accuracy deficits. We found that vmPFC lesion patients are impaired, relative to neurologically healthy adults, at recognizing moderate intensity expressions of anger and that recognition accuracy can be improved by providing instructions of where to fixate. These results suggest that vmPFC may be important for the recognition of facial emotion through a role in guiding visual attention to emotionally salient regions of faces. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ardizzi, Martina; Umiltà, Maria Alessandra; Evangelista, Valentina; Di Liscia, Alessandra; Ravera, Roberto; Gallese, Vittorio
2016-01-01
Facial mimicry and vagal regulation represent two crucial physiological responses to others’ facial expressions of emotions. Facial mimicry, defined as the automatic, rapid and congruent electromyographic activation to others’ facial expressions, is implicated in empathy, emotional reciprocity and emotions recognition. Vagal regulation, quantified by the computation of Respiratory Sinus Arrhythmia (RSA), exemplifies the autonomic adaptation to contingent social cues. Although it has been demonstrated that childhood maltreatment induces alterations in the processing of the facial expression of emotions, both at an explicit and implicit level, the effects of maltreatment on children’s facial mimicry and vagal regulation in response to facial expressions of emotions remain unknown. The purpose of the present study was to fill this gap, involving 24 street-children (maltreated group) and 20 age-matched controls (control group). We recorded their spontaneous facial electromyographic activations of corrugator and zygomaticus muscles and RSA responses during the visualization of the facial expressions of anger, fear, joy and sadness. Results demonstrated a different impact of childhood maltreatment on facial mimicry and vagal regulation. Maltreated children did not show the typical positive-negative modulation of corrugator mimicry. Furthermore, when only negative facial expressions were considered, maltreated children demonstrated lower corrugator mimicry than controls. With respect to vagal regulation, whereas maltreated children manifested the expected and functional inverse correlation between RSA value at rest and RSA response to angry facial expressions, controls did not. These results describe an early and divergent functional adaptation to hostile environment of the two investigated physiological mechanisms. On the one side, maltreatment leads to the suppression of the spontaneous facial mimicry normally concurring to empathic understanding of others’ emotions. On the other side, maltreatment forces the precocious development of the functional synchronization between vagal regulation and threatening social cues facilitating the recruitment of fight-or-flight defensive behavioral strategies. PMID:27685802
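One common way RSA is quantified, consistent with the abstract's description, is as the log band power of the interbeat-interval (IBI) series in the respiratory frequency band (roughly 0.24-1.04 Hz in children). The sketch below simulates an evenly resampled IBI series and estimates RSA via Welch's method; the signal and parameters are hypothetical.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(6)
# Hypothetical interbeat intervals (ms), resampled to an even 4 Hz time base,
# with a 0.3 Hz respiratory modulation superimposed on noise
t = np.arange(0, 120, 0.25)                       # 120 s at 4 Hz
ibi = 600 + 40 * np.sin(2 * np.pi * 0.3 * t) + rng.normal(0, 10, t.size)

freqs, psd = welch(ibi - ibi.mean(), fs=4.0, nperseg=240)
band = (freqs >= 0.24) & (freqs <= 1.04)          # child respiratory band
band_power = psd[band].sum() * (freqs[1] - freqs[0])
rsa = np.log(band_power)                          # RSA estimate in ln(ms^2)
print(f"RSA: {rsa:.2f}")
```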
The role of the cannabinoid receptor in adolescents' processing of facial expressions.
Ewald, Anais; Becker, Susanne; Heinrich, Angela; Banaschewski, Tobias; Poustka, Luise; Bokde, Arun; Büchel, Christian; Bromberg, Uli; Cattrell, Anna; Conrod, Patricia; Desrivières, Sylvane; Frouin, Vincent; Papadopoulos-Orfanos, Dimitri; Gallinat, Jürgen; Garavan, Hugh; Heinz, Andreas; Walter, Henrik; Ittermann, Bernd; Gowland, Penny; Paus, Tomáš; Martinot, Jean-Luc; Paillère Martinot, Marie-Laure; Smolka, Michael N; Vetter, Nora; Whelan, Rob; Schumann, Gunter; Flor, Herta; Nees, Frauke
2016-01-01
The processing of emotional faces is an important prerequisite for adequate social interactions in daily life, and might thus specifically be altered in adolescence, a period marked by significant changes in social emotional processing. Previous research has shown that the cannabinoid receptor CB1R is associated with longer gaze duration and increased brain responses in the striatum to happy faces in adults; yet, for adolescents, it is not clear whether an association between CB1R and face processing exists. In the present study we investigated genetic effects of the two CB1R polymorphisms, rs1049353 and rs806377, on the processing of emotional faces in healthy adolescents. They participated in functional magnetic resonance imaging during a Faces Task, watching blocks of video clips with angry and neutral facial expressions, and completed a Morphed Faces Task in the laboratory where they looked at different facial expressions that switched from anger to fear or sadness or from happiness to fear or sadness, and labelled them according to these four emotional expressions. A-allele carriers versus GG-carriers in rs1049353 displayed earlier recognition of facial expressions changing from anger to sadness or fear, but not of expressions changing from happiness to sadness or fear, and higher brain responses to angry, but not neutral, faces in the amygdala and insula. For rs806377 no significant effects emerged. This suggests that rs1049353 is involved in the processing of negative facial expressions relating to anger in adolescence. These findings add to our understanding of social emotion-related mechanisms in this life period. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Neural Circuitry of Emotional and Cognitive Conflict Revealed through Facial Expressions
Chiew, Kimberly S.; Braver, Todd S.
2011-01-01
Background: Neural systems underlying conflict processing have been well studied in the cognitive realm, but the extent to which these overlap with those underlying emotional conflict processing remains unclear. A novel adaptation of the AX Continuous Performance Task (AX-CPT), a stimulus-response incompatibility paradigm, was examined that permits close comparison of emotional and cognitive conflict conditions, through the use of affectively-valenced facial expressions as the response modality. Methodology/Principal Findings: Brain activity was monitored with functional magnetic resonance imaging (fMRI) during performance of the emotional AX-CPT. Emotional conflict was manipulated on a trial-by-trial basis, by requiring contextually pre-cued facial expressions to emotional probe stimuli (IAPS images) that were either affectively compatible (low-conflict) or incompatible (high-conflict). The emotion condition was contrasted against a matched cognitive condition that was identical in all respects, except that probe stimuli were emotionally neutral. Components of the brain cognitive control network, including dorsal anterior cingulate cortex (ACC) and lateral prefrontal cortex (PFC), showed conflict-related activation increases in both conditions, but with higher activity during emotion conditions. In contrast, emotion conflict effects were not found in regions associated with affective processing, such as rostral ACC. Conclusions/Significance: These activation patterns provide evidence for a domain-general neural system that is active for both emotional and cognitive conflict processing. In line with previous behavioural evidence, greatest activity in these brain regions occurred when both emotional and cognitive influences additively combined to produce increased interference. PMID:21408006
Right Hemisphere Dominance for Emotion Processing in Baboons
ERIC Educational Resources Information Center
Wallez, Catherine; Vauclair, Jacques
2011-01-01
Asymmetries of emotional facial expressions in humans offer reliable indexes to infer brain lateralization and mostly revealed right hemisphere dominance. Studies concerned with oro-facial asymmetries in nonhuman primates largely showed a left-sided asymmetry in chimpanzees, marmosets and macaques. The presence of asymmetrical oro-facial…
Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.
Karl, Christian; Hewig, Johannes; Osinsky, Roman
2016-10-01
There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded electroencephalogram from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.
Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis
2018-05-06
Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussion athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of the N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing, and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across sexes. (JINS, 2018, 24, 1-11).
Multimodal processing of emotional information in 9-month-old infants I: emotional faces and voices.
Otte, R A; Donkers, F C L; Braeken, M A K A; Van den Bergh, B R H
2015-04-01
Making sense of emotions manifesting in human voice is an important social skill which is influenced by emotions in other modalities, such as that of the corresponding face. Although processing emotional information from voices and faces simultaneously has been studied in adults, little is known about the neural mechanisms underlying the development of this ability in infancy. Here we investigated multimodal processing of fearful and happy face/voice pairs using event-related potential (ERP) measures in a group of 84 9-month-olds. Infants were presented with emotional vocalisations (fearful/happy) preceded by the same or a different facial expression (fearful/happy). The ERP data revealed that the processing of emotional information appearing in human voice was modulated by the emotional expression appearing on the corresponding face: Infants responded with larger auditory ERPs after fearful compared to happy facial primes. This finding suggests that infants dedicate more processing capacities to potentially threatening than to non-threatening stimuli. Copyright © 2014 Elsevier Inc. All rights reserved.
Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V.; Hänninen, Laura; Krause, Christina M.; Vainio, Outi
2016-01-01
Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to attentional bias, which was dependent on the depicted species: threatening conspecifics’ faces evoked heightened attention but threatening human faces instead an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates. PMID:26761433
Bublatzky, Florian; Gerdes, Antje B. M.; White, Andrew J.; Riemer, Martin; Alpers, Georg W.
2014-01-01
Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERP) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental tasks. Whereas task effects were observed for P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories. PMID:25076881
The Functional Role of the Periphery in Emotional Language Comprehension
Havas, David A.; Matheson, James
2013-01-01
Language can impact emotion, even when it makes no reference to emotion states. For example, reading sentences with positive meanings (“The water park is refreshing on the hot summer day”) induces patterns of facial feedback congruent with the sentence emotionality (smiling), whereas sentences with negative meanings induce a frown. Moreover, blocking facial afference with botox selectively slows comprehension of emotional sentences. Therefore, theories of cognition should account for emotion-language interactions above the level of explicit emotion words, and the role of peripheral feedback in comprehension. For this special issue exploring frontiers in the role of the body and environment in cognition, we propose a theory in which facial feedback provides a context-sensitive constraint on the simulation of actions described in language. Paralleling the role of emotions in real-world behavior, our account proposes that (1) facial expressions accompany sudden shifts in wellbeing as described in language; (2) facial expressions modulate emotional action systems during reading; and (3) emotional action systems prepare the reader for an effective simulation of the ensuing language content. To inform the theory and guide future research, we outline a framework based on internal models for motor control. To support the theory, we assemble evidence from diverse areas of research. Taking a functional view of emotion, we tie the theory to behavioral and neural evidence for a role of facial feedback in cognition. Our theoretical framework provides a detailed account that can guide future research on the role of emotional feedback in language processing, and on interactions of language and emotion. It also highlights the bodily periphery as relevant to theories of embodied cognition. PMID:23750145
The effect of Ramadan fasting on spatial attention through emotional stimuli
Molavi, Maziyar; Yunus, Jasmy; Utama, Nugraha P
2016-01-01
Fasting can influence psychological and mental states. In the current study, the effect of periodical fasting on the processing of emotion through gazed facial expressions, a realistic multisource of social information, was investigated for the first time. A dynamic cue-target task was administered with behavioral and event-related potential measurements in 40 participants to reveal the temporal and spatial brain activities before, during, and after the fasting period. Fasting had several significant effects. The amplitude of the N1 component decreased over the centroparietal scalp during fasting, and reaction times during the fasting period decreased. Self-measures of deficit arousal as well as mood increased during the fasting period. There was a significant contralateral alteration of the P1 over the occipital area for happy facial expression stimuli. A significant effect of gazed expression and its interaction with the emotional stimuli was indicated by the amplitude of the N1. Furthermore, the findings confirmed the validity effect, i.e., the congruency between gaze and target position: P3 amplitude over the centroparietal area increased and behavioral reaction times slowed in the incongruent (invalid) condition compared with the valid condition. The results showed that attention to facial expression stimuli, a communicative social signal, was affected by fasting, and that fasting improved the mood of practitioners. Moreover, the behavioral and event-related potential analyses indicated that the neural dynamics of facial emotion are processed faster than those of gaze: participants tended to react faster and to rely on the type of facial emotion rather than on gaze direction while doing the task. For happy facial expression stimuli, right-hemisphere activation exceeded that of the left hemisphere, consistent with the lateralization account of emotional processing rather than the valence account. PMID:27307772
Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta
2016-01-01
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific.
Wynn, Jonathan K.; Lee, Junghee; Horan, William P.; Green, Michael F.
2008-01-01
Schizophrenia patients show impairments in identifying facial affect; however, it is not known at what stage facial affect processing is impaired. We evaluated 3 event-related potentials (ERPs) to explore stages of facial affect processing in schizophrenia patients. Twenty-six schizophrenia patients and 27 normal controls participated. In separate blocks, subjects identified the gender of a face, the emotion of a face, or if a building had 1 or 2 stories. Three ERPs were examined: (1) P100, to examine basic visual processing, (2) N170, to examine facial feature encoding, and (3) N250, to examine affect decoding. Behavioral performance on each task was also measured. Results showed that schizophrenia patients’ P100 was comparable to that of the controls during all 3 identification tasks. Both patients and controls exhibited a comparable N170 that was largest during processing of faces and smallest during processing of buildings. For both groups, the N250 was largest during the emotion identification task and smallest for the building identification task. However, the patients produced a smaller N250 compared with the controls across the 3 tasks. The groups did not differ in behavioral performance in any of the 3 identification tasks. The pattern of intact P100 and N170 suggests that patients maintain basic visual processing and facial feature encoding abilities. The abnormal N250 suggests that schizophrenia patients are less efficient at decoding facial affect features. Our results imply that abnormalities in the later stage of feature decoding could potentially underlie emotion identification deficits in schizophrenia. PMID:18499704
ERIC Educational Resources Information Center
Gross, Thomas F.
2005-01-01
Global information processing and perception of facial age and emotional expression were studied in children with autism, language disorders, mental retardation, and a clinical control group. Children were given a global-local task and asked to recognize age and emotion in human and canine faces. Children with autism made fewer global responses and…
Culture shapes 7-month-olds' perceptual strategies in discriminating facial expressions of emotion.
Geangu, Elena; Ichikawa, Hiroko; Lao, Junpeng; Kanazawa, So; Yamaguchi, Masami K; Caldara, Roberto; Turati, Chiara
2016-07-25
Emotional facial expressions are thought to have evolved because they play a crucial role in species' survival. From infancy, humans develop dedicated neural circuits [1] to exhibit and recognize a variety of facial expressions [2]. But there is increasing evidence that culture specifies when and how certain emotions can be expressed - social norms - and that the mature perceptual mechanisms used to transmit and decode the visual information from emotional signals differ between Western and Eastern adults [3-5]. Specifically, the mouth is more informative for transmitting emotional signals in Westerners and the eye region for Easterners [4], generating culture-specific fixation biases towards these features [5]. During development, it is recognized that cultural differences can be observed at the level of emotional reactivity and regulation [6], and to the culturally dominant modes of attention [7]. Nonetheless, to our knowledge no study has explored whether culture shapes the processing of facial emotional signals early in development. The data we report here show that, by 7 months, infants from both cultures visually discriminate facial expressions of emotion by relying on culturally distinct fixation strategies, resembling those used by the adults from the environment in which they develop [5].
A voxel-based lesion study on facial emotion recognition after penetrating brain injury
Dal Monte, Olga; Solomon, Jeffrey M.; Schintu, Selene; Knutson, Kristine M.; Strenziok, Maren; Pardini, Matteo; Leopold, Anne; Raymont, Vanessa; Grafman, Jordan
2013-01-01
The ability to read emotions in the face of another person is an important social skill that can be impaired in subjects with traumatic brain injury (TBI). To determine the brain regions that modulate facial emotion recognition, we conducted a whole-brain analysis using a well-validated facial emotion recognition task and voxel-based lesion symptom mapping (VLSM) in a large sample of patients with focal penetrating TBIs (pTBIs). Our results revealed that individuals with pTBI performed significantly worse than normal controls in recognizing unpleasant emotions. VLSM mapping results showed that impairment in facial emotion recognition was due to damage in a bilateral fronto-temporo-limbic network, including medial prefrontal cortex (PFC), anterior cingulate cortex, left insula and temporal areas. Besides those common areas, damage to bilateral and anterior regions of PFC led to impairment in recognizing unpleasant emotions, whereas damage to bilateral posterior PFC and left temporal areas led to impairment in recognizing pleasant emotions. Our findings add empirical evidence that the ability to read pleasant and unpleasant emotions in other people's faces is a complex process involving not only a common network that includes bilateral fronto-temporo-limbic lobes, but also other regions depending on emotional valence. PMID:22496440
Frühholz, Sascha; Fehr, Thorsten; Herrmann, Manfred
2009-10-01
Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color-combinations as in the first run (congruent trials) or with different face-color-combinations (incongruent trials). Incongruent trials induced significantly increased response latencies and significantly decreased performance accuracy. Contextual incongruent information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN) both localized in occipito-temporal areas. Contextual congruent information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. Highest amplitude of the N170 was found for negative expressions in a negatively associated context and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity which was localized to superior parietal cortex and which most likely represents a posterior manifestation of the N450 as an indicator of conflict processing. A sustained activation of the late LPP over parietal cortex for all incongruent trials might reflect enhanced engagement with facial expression during task conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expression in specific contexts, late components seem to subserve interference resolution during emotional face processing.
Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona
2016-01-01
Facial mimicry is the spontaneous response to others’ facial expressions, by which the observer mirrors or matches the expression of the interaction partner. Recent evidence suggests that mimicry may not be only an automatic reaction but could be dependent on many factors, including social context, the type of task in which the participant is engaged, or stimulus properties (dynamic vs static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgment of emotional intensity. Electromyographic activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. The ratings of the emotional intensity of facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major muscle activity in response to dynamic happiness stimuli than static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing. PMID:27390867
Adaptation to Emotional Conflict: Evidence from a Novel Face Emotion Paradigm
Clayson, Peter E.; Larson, Michael J.
2013-01-01
The preponderance of research on trial-by-trial recruitment of affective control (e.g., conflict adaptation) relies on stimuli wherein lexical word information conflicts with facial affective stimulus properties (e.g., the face-Stroop paradigm where an emotional word is overlaid on a facial expression). Several studies, however, indicate different neural time course and properties for processing of affective lexical stimuli versus affective facial stimuli. The current investigation used a novel task to examine control processes implemented following conflicting emotional stimuli with conflict-inducing affective face stimuli in the absence of affective words. Forty-one individuals completed a task wherein the affective-valence of the eyes and mouth were either congruent (happy eyes, happy mouth) or incongruent (happy eyes, angry mouth) while high-density event-related potentials (ERPs) were recorded. There was a significant congruency effect and significant conflict adaptation effects for error rates. Although response times (RTs) showed a significant congruency effect, the effect of previous-trial congruency on current-trial RTs was only present for current congruent trials. Temporospatial principal components analysis showed a P3-like ERP source localized using FieldTrip software to the medial cingulate gyrus that was smaller on incongruent than congruent trials and was significantly influenced by the recruitment of control processes following previous-trial emotional conflict (i.e., there was significant conflict adaptation in the ERPs). Results show that a face-only paradigm may be sufficient to elicit emotional conflict and suggest a system for rapidly detecting conflicting emotional stimuli and subsequently adjusting control resources, similar to cognitive conflict detection processes, when using conflicting facial expressions without words. PMID:24073278
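The conflict adaptation contrast reported here (also called the Gratton effect) is the reduction of the congruency effect on trials that follow incongruent trials. A small pandas sketch of that contrast on simulated trial-level data; the column names and random data are hypothetical:

```python
import numpy as np
import pandas as pd

# Fake trial-level data; column names are hypothetical.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "congruent": rng.integers(0, 2, n).astype(bool),
    "rt": rng.normal(550, 60, n),
})

# Previous-trial congruency defines the adaptation factor (first trial dropped).
df["prev_congruent"] = df["congruent"].shift(1)
df = df.dropna()

cell_means = df.groupby(["prev_congruent", "congruent"])["rt"].mean()
# Congruency (interference) effect, computed separately after congruent
# and after incongruent trials:
ci_after_c = cell_means[(True, False)] - cell_means[(True, True)]
ci_after_i = cell_means[(False, False)] - cell_means[(False, True)]
# Conflict adaptation = shrinkage of the congruency effect after conflict.
print(f"congruency effect after congruent trials:   {ci_after_c:.1f} ms")
print(f"congruency effect after incongruent trials: {ci_after_i:.1f} ms")
print(f"conflict adaptation effect: {ci_after_c - ci_after_i:.1f} ms")
```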
Radke, Sina; Schäfer, Ina C; Müller, Bernhard W; de Bruijn, Ellen R A
2013-12-15
Although 'irrational' decision-making has been linked to depression, the contribution of biases in information processing to these findings remains unknown. To investigate the impact of cognitive biases and aberrant processing of facial emotions on social decision-making, we manipulated both context-related and emotion-related information in a modified Ultimatum Game. Unfair offers were (1) paired with different unselected alternatives, establishing the context in which an offer was made, and (2) accompanied by emotional facial expressions of proposers. Responder behavior was assessed in patients with major depressive disorder and healthy controls. In both groups alike, rejection rates were highest following unambiguous signals of unfairness, i.e. an angry proposer face or when an unfair distribution had deliberately been chosen over an equal split. However, depressed patients showed overall higher rejection rates than healthy volunteers, without exhibiting differential processing biases. This suggests that depressed patients were, as healthy individuals, basing their decisions on informative, salient features and differentiating between (i) fair and unfair offers, (ii) alternatives to unfair offers and (iii) proposers' facial emotions. Although more fundamental processes, e.g. reduced reward sensitivity, might underlie increased rejection in depression, the current study provides insight into mechanisms that shape fairness considerations in both depressed and healthy individuals.
Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.
Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M
2017-01-01
Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. In a double-blind, within-subjects, randomized controlled experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT; this effect was marginally significant (p<.06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces, and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhanced facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest the effect of OXT on visual attention may differ depending on task requirements.
Facial thermal variations: A new marker of emotional arousal.
Kosonogov, Vladimir; De Zorzi, Lucas; Honoré, Jacques; Martínez-Velázquez, Eduardo S; Nandrino, Jean-Louis; Martinez-Selva, José M; Sequeira, Henrique
2017-01-01
Functional infrared thermal imaging (fITI) is considered a promising method to measure emotional autonomic responses through facial cutaneous thermal variations. However, the facial thermal response to emotions still needs to be investigated within the framework of the dimensional approach to emotions. The main aim of this study was to assess how facial thermal variations index the emotional arousal and valence dimensions of visual stimuli. Twenty-four participants were presented with three groups of standardized emotional pictures (unpleasant, neutral and pleasant) from the International Affective Picture System. Facial temperature was recorded at the nose tip, an important region of interest for facial thermal variations, and compared to electrodermal responses, a robust index of emotional arousal. Both types of responses were also compared to subjective ratings of pictures. An emotional arousal effect was found on the amplitude and latency of thermal responses and on the amplitude and frequency of electrodermal responses. The participants showed greater thermal and dermal responses to emotional than to neutral pictures, with no difference between pleasant and unpleasant ones. Thermal responses correlated, and the dermal ones tended to correlate, with subjective ratings. Finally, in the emotional conditions compared to the neutral one, the frequency of simultaneous thermal and dermal responses increased while isolated thermal or dermal responses decreased. Overall, this study provides convergent evidence that fITI is a promising method for indexing the arousal dimension of emotional stimulation and, consequently, a credible alternative to the classical recording of electrodermal activity. The present research offers an original way to unveil autonomic involvement in emotional processes and opens new perspectives for measuring them in touchless conditions.
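The amplitude and latency measures for thermal responses can be illustrated with a simple baseline-to-peak definition. The sketch below applies that definition to a synthetic nose-tip temperature trace; real fITI pipelines (ROI tracking, artifact rejection, response criteria) are more involved, so the exact definitions here should be read as assumptions.

```python
import numpy as np

def response_amplitude_latency(signal, times, onset=0.0, baseline=(-2.0, 0.0)):
    """Baseline-to-peak response amplitude and its latency from stimulus onset.

    signal: 1-D temperature trace (e.g., nose-tip ROI mean, in deg C)
    times:  time stamps in seconds, with 0 at picture onset
    """
    base = signal[(times >= baseline[0]) & (times < baseline[1])].mean()
    post = times >= onset
    deviation = signal[post] - base
    k = np.argmax(np.abs(deviation))        # largest excursion from baseline
    return deviation[k], times[post][k]

# Fake 10 s trace sampled at 10 Hz: a slow nose-tip cooling response plus noise.
times = np.arange(-2.0, 8.0, 0.1)
rng = np.random.default_rng(2)
signal = 34.0 - 0.15 / (1 + np.exp(-(times - 3.0))) + rng.normal(0, 0.01, times.size)

amp, lat = response_amplitude_latency(signal, times)
print(f"amplitude: {amp:.3f} deg C, latency: {lat:.1f} s")
```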
Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing
Wieser, Matthias J.; Brosch, Tobias
2012-01-01
Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research. PMID:23130011
Fenske, Sabrina; Lis, Stefanie; Liebke, Lisa; Niedtfeld, Inga; Kirsch, Peter; Mier, Daniela
2015-01-01
Borderline Personality Disorder (BPD) is characterized by severe deficits in social interactions, which might be linked to deficits in emotion recognition. Research on emotion recognition abilities in BPD has revealed heterogeneous results, ranging from deficits to heightened sensitivity. The most stable findings point to an impairment in the evaluation of neutral facial expressions as neutral, as well as to a negative bias in emotion recognition, that is, the tendency to attribute negative emotions to neutral expressions or, in a broader sense, to report a more negative emotion category than depicted. However, it remains unclear which contextual factors influence the occurrence of this negative bias. Previous studies suggest that priming by preceding emotional information and also constrained processing time might augment the emotion recognition deficit in BPD. To test these assumptions, 32 female BPD patients and 31 healthy females, matched for age and education, participated in an emotion recognition study in which every facial expression was preceded by either a positive, neutral or negative scene. Furthermore, time constraints for processing were varied by presenting the facial expressions with short (100 ms) or long duration (up to 3000 ms) in two separate blocks. BPD patients showed a significant deficit in emotion recognition for neutral and positive facial expressions, associated with a significant negative bias. In BPD patients, this emotion recognition deficit was differentially affected by preceding emotional information and time constraints, with a greater influence of emotional information during long face presentations and a greater influence of neutral information during short face presentations. Our results are in line with previous findings supporting the existence of a negative bias in emotion recognition in BPD patients, and provide further insights into biased social perceptions in BPD patients.
Impaired perception of facial emotion in developmental prosopagnosia.
Biotti, Federica; Cook, Richard
2016-08-01
Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined the ability of observers to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect. Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties classifying expressions, particularly those with apperceptive profiles. These individuals may have difficulties forming view-invariant structural descriptions at an early stage in the face processing stream, before identity and expression pathways diverge.
Tang, Dorothy Y Y; Liu, Amy C Y; Lui, Simon S Y; Lam, Bess Y H; Siu, Bonnie W M; Lee, Tatia M C; Cheung, Eric F C
2016-02-28
Impairment in facial emotion perception is believed to be associated with aggression. Schizophrenia patients with antisocial features are more impaired in facial emotion perception than their counterparts without these features. However, previous studies did not define the comorbidity of antisocial personality disorder (ASPD) using stringent criteria. We recruited 30 participants with dual diagnoses of ASPD and schizophrenia, 30 participants with schizophrenia and 30 controls. We employed the Facial Emotional Recognition paradigm to measure facial emotion perception, and administered a battery of neurocognitive tests. The Life History of Aggression scale was used. ANOVAs and ANCOVAs were conducted to examine group differences in facial emotion perception, and control for the effect of other neurocognitive dysfunctions on facial emotion perception. Correlational analyses were conducted to examine the association between facial emotion perception and aggression. Patients with dual diagnoses performed worst in facial emotion perception among the three groups. The group differences in facial emotion perception remained significant, even after other neurocognitive impairments were controlled for. Severity of aggression was correlated with impairment in perceiving negative-valenced facial emotions in patients with dual diagnoses. Our findings support the presence of facial emotion perception impairment and its association with aggression in schizophrenia patients with comorbid ASPD.
Effects of task demands on the early neural processing of fearful and happy facial expressions
Itier, Roxane J.; Neath-Tavares, Karly N.
2017-01-01
Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during a gender discrimination, an explicit emotion discrimination and an oddball detection task, the three most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation. Task demands modulated amplitudes from 200–350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170 from 150–350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing. PMID:28315309
Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces
Rigoulot, Simon; Pell, Marc D.
2012-01-01
Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality) which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally-inflected pseudo-utterance (Someone migged the pazing) uttered in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0–1250 ms], [1250–2500 ms], [2500–5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions. PMID:22303454
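The window-based gaze analysis described above (frequency and duration of looks in three temporal windows, split by prosody-face congruence) can be sketched as a simple aggregation over fixation records. The column names and simulated fixations below are hypothetical:

```python
import numpy as np
import pandas as pd

# Simulated fixation records; column names are hypothetical.
rng = np.random.default_rng(3)
fix = pd.DataFrame({
    "onset_ms": rng.uniform(0, 4800, 300),
    "duration_ms": rng.uniform(80, 400, 300),
    "matches_prosody": rng.integers(0, 2, 300).astype(bool),
})

# The three temporal windows used in the study design.
for lo, hi in [(0, 1250), (1250, 2500), (2500, 5000)]:
    in_win = fix[(fix["onset_ms"] >= lo) & (fix["onset_ms"] < hi)]
    agg = in_win.groupby("matches_prosody")["duration_ms"].agg(["count", "sum"])
    for congruent in (True, False):
        n, total = agg.loc[congruent] if congruent in agg.index else (0, 0.0)
        label = "congruent" if congruent else "incongruent"
        print(f"[{lo}-{hi} ms] {label}: {int(n)} looks, {total:.0f} ms total")
```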
Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan
2016-06-01
Deficits in facial emotion processing are a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad (BSF), high (HSF), and low (LSF) spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to low spatial frequency compared to high spatial frequency, whereas healthy controls demonstrated significant broad spatial frequency dependent processing in P100 amplitude. VPP amplitude was significantly increased for high and broad spatial frequencies, compared to low spatial frequency, in panic disorder. EPN amplitude was significantly different between HSF and BSF processing, and between LSF and BSF processing, in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder prefer global to detailed information. However, in later processing, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that this unique spatial frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder.
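Spatial-frequency-filtered face stimuli of the kind used here are commonly produced by low-pass filtering an image (LSF) and subtracting the low-pass version from the original (HSF), with the unfiltered image serving as the BSF condition. A sketch with a Gaussian filter on a synthetic image follows; the sigma value is an arbitrary stand-in for the cycles-per-face cutoffs a real stimulus set would specify.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spatial_frequency_versions(face, sigma=6.0):
    """Broad (BSF), low (LSF) and high (HSF) spatial-frequency versions of an image."""
    f = face.astype(float)
    lsf = gaussian_filter(f, sigma)   # low-pass: keeps coarse, global structure
    hsf = f - lsf                     # residual: keeps fine detail and edges
    return f, lsf, hsf

# Synthetic "face": a smooth gradient plus fine-grained texture.
rng = np.random.default_rng(4)
img = (np.outer(np.linspace(0, 1, 128), np.linspace(0, 1, 128))
       + 0.05 * rng.normal(size=(128, 128)))
bsf, lsf, hsf = spatial_frequency_versions(img)
print(f"coarse energy (LSF std): {lsf.std():.3f}, fine energy (HSF std): {hsf.std():.3f}")
```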
Spontaneous facial expressions of emotion of congenitally and noncongenitally blind individuals.
Matsumoto, David; Willingham, Bob
2009-01-01
The study of the spontaneous expressions of blind individuals offers a unique opportunity to understand basic processes concerning the emergence and source of facial expressions of emotion. In this study, the authors compared the expressions of congenitally and noncongenitally blind athletes in the 2004 Paralympic Games with each other and with those produced by sighted athletes in the 2004 Olympic Games. The authors also examined how expressions change from 1 context to another. There were no differences between congenitally blind, noncongenitally blind, and sighted athletes, either on the level of individual facial actions or in facial emotion configurations. Blind athletes did produce more overall facial activity, but this activity was restricted to head and eye movements. The blind athletes' expressions differentiated whether they had won or lost a medal match at 3 different points in time, and there were no cultural differences in expression. These findings provide compelling evidence that the production of spontaneous facial expressions of emotion is not dependent on observational learning but simultaneously demonstrate a learned component to the social management of expressions, even among blind individuals.
Implicit attentional bias for facial emotion in dissociative seizures: Additional evidence.
Pick, Susannah; Mellers, John D C; Goldstein, Laura H
2018-03-01
This study sought to extend knowledge about the previously reported preconscious attentional bias (AB) for facial emotion in patients with dissociative seizures (DS) by exploring whether the finding could be replicated, while controlling for concurrent anxiety, depression, and potentially relevant cognitive impairments. Patients diagnosed with DS (n=38) were compared with healthy controls (n=43) on a pictorial emotional Stroop test, in which backwardly masked emotional faces (angry, happy, neutral) were processed implicitly. The group with DS displayed a significantly greater AB to facial emotion relative to controls; however, the bias was not specific to negative or positive emotions. The group effect could not be explained by performance on standardized cognitive tests or self-reported depression/anxiety. The study provides additional evidence of a disproportionate and automatic allocation of attention to facial affect in patients with DS, including both positive and negative facial expressions. Such a tendency could act as a predisposing factor for developing DS initially, or may contribute to triggering individuals' seizures on an ongoing basis. Psychological interventions such as Cognitive Behavioral Therapy (CBT) or AB modification might be suitable approaches to target this bias in clinical practice.
Reduced emotion processing efficiency in healthy males relative to females
Rapport, Lisa J.; Briceno, Emily M.; Haase, Brennan D.; Vederman, Aaron C.; Bieliauskas, Linas A.; Welsh, Robert C.; Starkman, Monica N.; McInnis, Melvin G.; Zubieta, Jon-Kar; Langenecker, Scott A.
2014-01-01
This study examined sex differences in the categorization of facial emotions and in the activation of brain regions supporting those classifications. In Experiment 1, performance on the Facial Emotion Perception Test (FEPT) was examined among 75 healthy females and 63 healthy males. Females were more accurate in the categorization of fearful expressions relative to males. In Experiment 2, 3T functional magnetic resonance imaging data were acquired for a separate sample of 21 healthy females and 17 healthy males while performing the FEPT. Activation to neutral facial expressions was subtracted from activation to sad, angry, fearful and happy facial expressions. Although females and males demonstrated activation in some overlapping regions for all emotions, many regions were exclusive to females or males. For angry, sad and happy expressions, males displayed a larger extent of activation than did females, and greater height of activation was detected in diffuse cortical and subcortical regions. For fear, males displayed greater activation than females only in right postcentral gyri. With one exception in females, performance was not associated with activation. Results suggest that females and males process emotions using different neural pathways, and these differences cannot be explained by performance variations. PMID:23196633
Voorthuis, Alexandra; Riem, Madelon M E; Van IJzendoorn, Marinus H; Bakermans-Kranenburg, Marian J
2014-09-11
The neuropeptide oxytocin facilitates parental caregiving and is involved in the processing of infant vocal cues. In this randomized-controlled trial with functional magnetic resonance imaging we examined the influence of intranasally administered oxytocin on neural activity during emotion recognition in infant faces. Blood oxygenation level dependent (BOLD) responses during emotion recognition were measured in 50 women who were administered 16 IU of oxytocin or a placebo. Participants performed an adapted version of the Infant Facial Expressions of Emotions from Looking at Pictures (IFEEL pictures), a task that has been developed to assess the perception and interpretation of infants' facial expressions. Experimentally induced oxytocin levels increased activation in the inferior frontal gyrus (IFG), the middle temporal gyrus (MTG) and the superior temporal gyrus (STG). However, oxytocin decreased performance on the IFEEL picture task. Our findings suggest that oxytocin enhances the processing of facial cues to the emotional state of infants at the neural level, but at the same time it may decrease the correct interpretation of infants' facial expressions at the behavioral level.
A small-world network model of facial emotion recognition.
Takehara, Takuma; Ochiai, Fumio; Suzuki, Naoto
2016-01-01
Various models have been proposed to increase understanding of the cognitive basis of facial emotions. Despite those efforts, interactions between facial emotions have received minimal attention. If collective behaviours relating to each facial emotion in the comprehensive cognitive system could be assumed, specific facial emotion relationship patterns might emerge. In this study, we demonstrate that the frameworks of complex networks can effectively capture those patterns. We generate 81 facial emotion images (6 prototypes and 75 morphs) and then ask participants to rate degrees of similarity in 3240 facial emotion pairs in a paired comparison task. A facial emotion network constructed on the basis of similarity clearly forms a small-world network, which features an extremely short average network distance and close connectivity. Further, even if two facial emotions have opposing valences, they are connected within only two steps. In addition, we show that intermediary morphs are crucial for maintaining full network integration, whereas prototypes are not at all important. These results suggest the existence of collective behaviours in the cognitive systems of facial emotions and also describe why people can efficiently recognize facial emotions in terms of information transmission and propagation. For comparison, we construct three simulated networks: one based on the categorical model, one based on the dimensional model, and one random network. The results reveal that small-world connectivity in facial emotion networks is apparently different from that of these simulated networks, suggesting that a small-world network is the most suitable model for capturing the cognitive basis of facial emotions.
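Small-worldness of a similarity-based network is typically judged by comparing the graph's clustering coefficient and characteristic path length against a size- and density-matched random graph. Below is a networkx sketch on a synthetic similarity matrix; the thresholding rule and the sigma criterion are common conventions, not necessarily the authors' exact procedure.

```python
import networkx as nx
import numpy as np

# Synthetic symmetric similarity matrix for 81 facial-emotion images.
rng = np.random.default_rng(5)
n = 81
sim = rng.uniform(0, 1, (n, n))
sim = (sim + sim.T) / 2
np.fill_diagonal(sim, 0)

# Keep only the strongest 10% of pairwise similarities as unweighted edges.
threshold = np.quantile(sim[np.triu_indices(n, k=1)], 0.90)
G = nx.from_numpy_array((sim >= threshold).astype(int))
if not nx.is_connected(G):  # path length is defined on one component only
    G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

C = nx.average_clustering(G)
L = nx.average_shortest_path_length(G)

# Random reference graph with matched node and edge counts.
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
if not nx.is_connected(R):
    R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
Cr = nx.average_clustering(R)
Lr = nx.average_shortest_path_length(R)

# sigma > 1: high clustering with near-random path length, i.e. small-world.
print(f"sigma = {(C / Cr) / (L / Lr):.2f}")
```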
Beall, Paula M; Moody, Eric J; McIntosh, Daniel N; Hepburn, Susan L; Reed, Catherine L
2008-11-01
Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.
Direction of Amygdala-Neocortex Interaction During Dynamic Facial Expression Processing.
Sato, Wataru; Kochiyama, Takanori; Uono, Shota; Yoshikawa, Sakiko; Toichi, Motomi
2017-03-01
Dynamic facial expressions of emotion strongly elicit multifaceted emotional, perceptual, cognitive, and motor responses. Neuroimaging studies have revealed that some subcortical (e.g., amygdala) and neocortical (e.g., superior temporal sulcus and inferior frontal gyrus) brain regions and their functional interaction are involved in processing dynamic facial expressions. However, the direction of the functional interaction between the amygdala and the neocortex remains unknown. To investigate this issue, we re-analyzed functional magnetic resonance imaging (fMRI) data from 2 studies and magnetoencephalography (MEG) data from 1 study. First, a psychophysiological interaction analysis of the fMRI data confirmed the functional interaction between the amygdala and neocortical regions. Then, dynamic causal modeling analysis was used to compare models with forward, backward, or bidirectional effective connectivity between the amygdala and neocortical networks in the fMRI and MEG data. The results consistently supported the model of effective connectivity from the amygdala to the neocortex. A further increasing-time-window analysis of the MEG data demonstrated that this model was valid from 200 ms after stimulus onset. These data suggest that emotional processing in the amygdala rapidly modulates some neocortical processing, such as perception, recognition, and motor mimicry, when observing dynamic facial expressions of emotion.
Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?
Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K
2017-12-01
Disgust is a core emotion that evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether the classification of dynamic facial expressions as happy or disgusted, and the emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect and suggest that future studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions are present versus absent.
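Bayesian hypothesis testing of a null effect, as used above, can be approximated from classical model fits: Wagenmakers' (2007) BIC approximation gives BF01 = exp((BIC1 - BIC0) / 2) for a null model 0 against an alternative model 1. Below is a sketch on simulated ratings; this is a generic approximation, not the authors' analysis, and all variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated trial data with no true odor effect; names are hypothetical.
rng = np.random.default_rng(6)
n = 900
df = pd.DataFrame({
    "odor": rng.choice(["valeric", "lilac", "none"], n),
    "rating": rng.normal(5.0, 1.5, n),
})

m0 = smf.ols("rating ~ 1", data=df).fit()        # null: odor does not matter
m1 = smf.ols("rating ~ C(odor)", data=df).fit()  # alternative: odor effect
# BIC approximation to the Bayes factor (Wagenmakers, 2007):
bf01 = np.exp((m1.bic - m0.bic) / 2)
print(f"BF01 = {bf01:.1f}  (values > 1 favor the null hypothesis)")
```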
Emotion and Object Processing in Parkinson's Disease
ERIC Educational Resources Information Center
Cohen, Henri; Gagne, Marie-Helene; Hess, Ursula; Pourcher, Emmanuelle
2010-01-01
The neuropsychological literature on the processing of emotions in Parkinson's disease (PD) reveals conflicting evidence about the role of the basal ganglia in the recognition of facial emotions. Hence, the present study had two objectives. One was to determine the extent to which the visual processing of emotions and objects differs in PD. The…
Zangara, Andrea; Blair, R J R; Curran, H Valerie
2002-08-01
Accumulating evidence from neuropsychological and neuroimaging research suggests that facial expressions are processed by at least partially separable neurocognitive systems. Recent evidence implies that the processing of different facial expressions may also be dissociable pharmacologically by GABAergic and noradrenergic compounds, although no study has directly compared the two types of drugs. The present study therefore directly compared the effects of a benzodiazepine with those of a beta-adrenergic blocker on the ability to recognise emotional expressions. A double-blind, independent group design was used with 45 volunteers to compare the effects of diazepam (15 mg) and metoprolol (50 mg) with matched placebo. Participants were presented with morphed facial expression stimuli and asked to identify which of the six basic emotions (sadness, happiness, anger, disgust, fear and surprise) were portrayed. Control measures of mood, pulse rate and word recall were also taken. Diazepam selectively impaired participants' ability to recognise expressions of both anger and fear but not other emotional expressions. Errors were mainly mistaking fear for surprise and disgust for anger. Metoprolol did not significantly affect facial expression recognition. These findings are interpreted as providing further support for the suggestion that there are dissociable systems responsible for processing emotional expressions. The results may have implications for understanding why 'paradoxical' aggression is sometimes elicited by benzodiazepines and for extending our psychological understanding of the anxiolytic effects of these drugs.
Developmental Changes in the Primacy of Facial Cues for Emotion Recognition
ERIC Educational Resources Information Center
Leitzke, Brian T.; Pollak, Seth D.
2016-01-01
There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to…
Facial Emotion Processing and Social Adaptation in Adults with and without Autism Spectrum Disorder
ERIC Educational Resources Information Center
Garcia-Villamisar, Domingo; Rojahn, Johannes; Zaja, Rebecca H.; Jodra, Marina
2010-01-01
Individuals with autism spectrum disorder (ASD) and individuals with intellectual disabilities without ASD have limited facial emotion recognition abilities, which may adversely impact social adjustment and other adaptive behavior. This study was designed to examine this relationship in adults with and without ASD. Two groups of adults with…
Neural Correlates of Explicit versus Implicit Facial Emotion Processing in ASD
ERIC Educational Resources Information Center
Luckhardt, Christina; Kröger, Anne; Cholemkery, Hannah; Bender, Stephan; Freitag, Christine M.
2017-01-01
The underlying neural mechanisms of implicit and explicit facial emotion recognition (FER) were studied in children and adolescents with autism spectrum disorder (ASD) compared to matched typically developing controls (TDC). EEG was obtained from N = 21 ASD and N = 16 TDC. Task performance, visual (P100, N170) and cognitive (late positive…
Sun, Shiyue; Carretié, Luis; Zhang, Lei; Dong, Yi; Zhu, Chunyan; Luo, Yuejia; Wang, Kai
2014-01-01
Background: Although ample evidence suggests that emotion and response inhibition are interrelated at the behavioral and neural levels, the neural substrates of response inhibition to negative facial information remain unclear. Thus, we used event-related potential (ERP) methods to explore the effects of explicit and implicit facial expression processing on response inhibition. Methods: We used implicit (gender categorization) and explicit (emotion categorization) emotional Go/Nogo tasks in which neutral and sad faces were presented. Electrophysiological markers at the scalp and the voxel level were analyzed during the two tasks. Results: We detected a task, emotion and trial type interaction effect in the Nogo-P3 stage. Larger Nogo-P3 amplitudes during sad conditions versus neutral conditions were detected with explicit tasks. However, the amplitude differences between the two conditions were not significant for implicit tasks. Source analyses on the P3 component revealed that the right inferior frontal junction (rIFJ) was involved during this stage. The current source density (CSD) of the rIFJ was higher in sad conditions compared to neutral conditions for explicit tasks, but not for implicit tasks. Conclusions: The findings indicated that response inhibition was modulated by sad facial information at the action inhibition stage when facial expressions were processed explicitly rather than implicitly. The rIFJ may be a key brain region in emotion regulation. PMID:25330212
Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.
Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha
2015-04-01
According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions.
Emotional Processing of Infant Displays in Eating Disorders
Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet
2014-01-01
Aim: The aim of this study is to examine emotional processing of infant displays in people with Eating Disorders (EDs). Background: Social and emotional factors are implicated as causal and maintaining factors in EDs. Difficulties in emotional regulation have been mainly studied in relation to adult interactions, with less interest given to interactions with infants. Method: A sample of 138 women were recruited, of whom 49 suffered from Anorexia Nervosa (AN), 16 from Bulimia Nervosa (BN), and 73 were healthy controls (HCs). Attentional responses to happy and sad infant faces were tested with the visual probe detection task. Emotional identification of, and reactivity to, infant displays were measured using self-report measures. Facial expressions to video clips depicting sad, happy and frustrated infants were also recorded. Results: No significant differences between groups were observed in the attentional response to infant photographs. However, there was a trend for patients to disengage from happy faces. People with EDs also reported lower positive ratings of happy infant displays and greater subjective negative reactions to sad infants. Finally, patients showed a significantly lower production of facial expressions, especially in response to the happy infant video clip. Insecure attachment was negatively correlated with positive facial expressions displayed in response to the happy infant and positively correlated with the intensity of negative emotions experienced in response to the sad infant video clip. Conclusion: People with EDs do not have marked abnormalities in their attentional processing of infant emotional faces. However, they do have a reduction in facial affect, particularly in response to happy infants. Also, they report greater negative reactions to sadness and rate positive emotions less intensively than HCs. This pattern of emotional responsivity suggests abnormalities in social reward sensitivity and might indicate new treatment targets. PMID:25463051
Face Generation Using Emotional Regions for Sensibility Robot
NASA Astrophysics Data System (ADS)
Gotoh, Minori; Kanoh, Masayoshi; Kato, Shohei; Kunitachi, Tsutomu; Itoh, Hidenori
We think that psychological interaction is necessary for smooth communication between robots and people. One way to interact psychologically with others is through facial expressions. Facial expressions are very important for communication because they show true emotions and feelings. The "Ifbot" robot communicates with people by considering its own "emotions". Ifbot has many facial expressions to communicate enjoyment. We developed a method for generating facial expressions based on human subjective judgements, mapping Ifbot's facial expressions to its emotions. We first created Ifbot's emotional space to map its facial expressions, applying a five-layer auto-associative neural network to the space. We then subjectively evaluated the emotional space and created emotional regions based on the results. Finally, we generated emotive facial expressions using the emotional regions.
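As a concrete illustration of the auto-associative mapping described above, here is a minimal sketch of a five-layer bottleneck network in Python (PyTorch). The layer sizes, the number of motor parameters, and the training data are illustrative assumptions, not the authors' specification.

```python
# Sketch of a five-layer auto-associative (bottleneck) network: motor
# parameters are compressed through a low-dimensional middle layer that
# serves as the "emotional space". Five neuron layers (input, hidden,
# bottleneck, hidden, output) correspond to four weight matrices.
import torch
import torch.nn as nn

n_motor_params = 20   # hypothetical number of facial actuator values
emotional_dim = 2     # hypothetical dimensionality of the emotional space

autoencoder = nn.Sequential(
    nn.Linear(n_motor_params, 10), nn.Tanh(),
    nn.Linear(10, emotional_dim), nn.Tanh(),   # bottleneck = emotional space
    nn.Linear(emotional_dim, 10), nn.Tanh(),
    nn.Linear(10, n_motor_params),
)

optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

expressions = torch.rand(200, n_motor_params)  # placeholder training data

for epoch in range(500):
    optimizer.zero_grad()
    reconstruction = autoencoder(expressions)
    loss = loss_fn(reconstruction, expressions)
    loss.backward()
    optimizer.step()

# After training, the bottleneck activations give each expression's
# coordinates in the learned emotional space.
with torch.no_grad():
    emotional_coords = autoencoder[:4](expressions)  # up to the bottleneck
```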
Wingenbach, Tanja S. H.; Brosnan, Mark; Pfaltz, Monique C.; Plichta, Michael M.; Ashwin, Chris
2018-01-01
According to embodied cognition accounts, viewing others’ facial emotion can elicit the respective emotion representation in observers which entails simulations of sensory, motor, and contextual experiences. In line with that, published research found viewing others’ facial emotion to elicit automatic matched facial muscle activation, which was further found to facilitate emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage. If there is conflicting sensory information, i.e., incongruent facial muscle activity, this might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) experimental condition (a) and (b) result in greater facial muscle activity than (c), (2) experimental condition (a) increases emotion recognition accuracy from others’ faces compared to (c), (3) experimental condition (b) lowers recognition accuracy for expressions with a salient facial feature in the lower, but not the upper face area, compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The experimental conditions’ order was counter-balanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed. PMID:29928240
Stereotypes and prejudice affect the recognition of emotional body postures.
Bijlstra, Gijsbert; Holland, Rob W; Dotsch, Ron; Wigboldus, Daniel H J
2018-03-26
Most research on emotion recognition focuses on facial expressions. However, people communicate emotional information through bodily cues as well. Prior research on facial expressions has demonstrated that emotion recognition is modulated by top-down processes. Here, we tested whether this top-down modulation generalizes to the recognition of emotions from body postures. We report three studies demonstrating that stereotypes and prejudice about men and women may affect how fast people classify various emotional body postures. Our results suggest that gender cues activate gender associations, which affect the recognition of emotions from body postures in a top-down fashion.
Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John
2014-10-01
Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113-118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.'s (Sign Lang Stud 75:113-118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.
Mondloch, Catherine J
2012-02-01
The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception.
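The congruency effect reported above is a simple reaction-time contrast. A minimal sketch of how such an effect could be computed from trial-level data; the column names and values are hypothetical.

```python
# Per-subject congruency effect: mean RT on incongruent trials minus
# mean RT on congruent trials, using correct trials only.
import pandas as pd

trials = pd.DataFrame({
    "subject":   [1, 1, 1, 1, 2, 2, 2, 2],
    "congruent": [True, False, True, False, True, False, True, False],
    "rt_ms":     [540, 610, 525, 650, 580, 690, 560, 655],
    "correct":   [True, True, True, False, True, True, True, True],
})

mean_rt = (trials[trials["correct"]]
           .groupby(["subject", "congruent"])["rt_ms"]
           .mean()
           .unstack("congruent"))

# Positive values indicate slower responses on incongruent trials.
congruency_effect = mean_rt[False] - mean_rt[True]
print(congruency_effect)
```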
A new look at emotion perception: Concepts speed and shape facial emotion recognition.
Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil
2015-10-01
Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels, which correspond to discrete emotion concepts, affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia, who have difficulty labeling their own emotions, struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology.
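The sensitivity analysis mentioned above is standard signal detection. A minimal sketch of a d' computation; the hit and false-alarm counts are made-up illustrative numbers, not the study's data.

```python
# Signal-detection sensitivity (d') for emotion recognition, with a
# log-linear correction so extreme rates do not yield infinite z-scores.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# e.g., recognising "sad" with vs. without an emotion label present
print(d_prime(hits=42, misses=8, false_alarms=12, correct_rejections=38))
print(d_prime(hits=36, misses=14, false_alarms=15, correct_rejections=35))
```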
Wang, Hailing; Ip, Chengteng; Fu, Shimin; Sun, Pei
2017-05-01
Face recognition theories suggest that our brains process invariant (e.g., gender) and changeable (e.g., emotion) facial dimensions separately. To investigate whether these two dimensions are processed in different time courses, we analyzed the selection negativity (SN, an event-related potential component reflecting attentional modulation) elicited by face gender and emotion during a feature-selective attention task. Participants were instructed to attend to a combination of face emotion and gender attributes in Experiment 1 (bi-dimensional task) and to either face emotion or gender in Experiment 2 (uni-dimensional task). The results revealed that face emotion did not elicit a substantial SN, whereas face gender consistently generated a substantial SN in both experiments. These results suggest that face gender is more sensitive to feature-selective attention and that face emotion is encoded relatively automatically, as indexed by the SN, implying the existence of different underlying processing mechanisms for invariant and changeable facial dimensions.
Trautmann-Lengsfeld, Sina Alexa; Domínguez-Borràs, Judith; Escera, Carles; Herrmann, Manfred; Fehr, Thorsten
2013-01-01
A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study aimed at investigating the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERP) and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis on static emotional face stimuli indicated the spatio-temporal modulation of predominantly posterior regional brain activation related to the visual processing stream for both emotional valences when compared to the neutral condition in the fusiform gyrus. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays convey more information, reflected in complex neural networks, in particular because their changing features potentially trigger sustained activation related to a continuing evaluation of those faces. A combined fMRI and EEG approach thus provides advanced insight into the spatio-temporal characteristics of emotional face processing, by also revealing additional neural generators not identifiable by the use of fMRI alone. PMID:23818974
Quarto, Tiziana; Blasi, Giuseppe; Maddalena, Chiara; Viscanti, Giovanna; Lanciano, Tiziana; Soleti, Emanuela; Mangiulli, Ivan; Taurisano, Paolo; Fazio, Leonardo; Bertolino, Alessandro; Curci, Antonietta
2016-01-01
The human ability to identify, process and regulate emotions from social stimuli is generally referred to as Emotional Intelligence (EI). Within EI, Ability EI identifies a performance measure assessing individual skills at perceiving, using, understanding and managing emotions. Previous models suggest that a brain "somatic marker circuitry" (SMC) sustains emotional sub-processes included in EI. Three primary brain regions are included: the amygdala, the insula and the ventromedial prefrontal cortex (vmPFC). Here, our aim was to investigate the relationship between Ability EI scores and SMC activity during social judgment of emotional faces. Sixty-three healthy subjects completed a test measuring Ability EI and underwent fMRI during a social decision task (i.e. approach or avoid) about emotional faces with different facial expressions. Imaging data revealed that EI scores are associated with left insula activity during social judgment of emotional faces as a function of facial expression. Specifically, higher EI scores are associated with greater left insula activity during social judgment of fearful faces but also with lower activity of this region during social judgment of angry faces. These findings indicate that the association between Ability EI and SMC activity during social behavior is region- and emotion-specific. PMID:26859495
Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.
Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M
2014-09-01
Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, the facial expression versus gender and age comparison permits us to contrast whether emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). These results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD.
Barden, R C; Ford, M E; Wilhelm, W M; Rogers-Salyer, M; Salyer, K E
1988-09-01
The present experiment investigated whether observers' emotional and behavioral reactions to facially deformed patients could be substantially improved by surgical procedures conducted by well-trained specialists in an experienced multidisciplinary team. Also investigated was the hypothesis that emotional states mediate the effects of physical attractiveness and facial deformity on social interaction. Twenty patients between the ages of 3 months and 17 years were randomly selected from over 2000 patients' files of Kenneth E. Salyer of Dallas, Texas. Patient diagnoses included facial clefts, hypertelorism, Treacher Collins syndrome, and craniofacial dysostoses (Crouzon's and Apert's syndromes). Rigorously standardized photographs of patients taken before and after surgery were shown to 22 "naive" raters ranging in age from 18 to 54 years. Raters were asked to predict their emotional and behavioral responses to the patients. These ratings indicated that observers' behavioral reactions to facially deformed children and adolescents would be more positive following craniofacial surgery. Similarly, the ratings indicated that observers' emotional reactions to these patients would be more positive following surgery. The results are discussed in terms of current sociopsychologic theoretical models for the effects of attractiveness on social interaction. A new model is presented that implicates induced emotional states as a mediating process in explaining the effects of attractiveness and facial deformity on the quality of social interactions. Limitations of the current investigation and directions for future research are also discussed.
Event-Related Brain Potential Correlates of Emotional Face Processing
ERIC Educational Resources Information Center
Eimer, Martin; Holmes, Amanda
2007-01-01
Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was…
Namba, Shushi; Kabir, Russell S.; Miyatani, Makoto; Nakao, Takashi
2017-01-01
While numerous studies have examined the relationships between facial actions and emotions, they have yet to account for the ways that specific spontaneous facial expressions map onto emotional experiences induced without expressive intent. Moreover, previous studies emphasized that a fine-grained investigation of facial components could establish the coherence of facial actions with actual internal states. Therefore, this study aimed to accumulate evidence for the correspondence between spontaneous facial components and emotional experiences. We reinvestigated data from previous research which secretly recorded spontaneous facial expressions of Japanese participants as they watched film clips designed to evoke four different target emotions: surprise, amusement, disgust, and sadness. The participants rated their emotional experiences via a self-reported questionnaire of 16 emotions. These spontaneous facial expressions were coded using the Facial Action Coding System, the gold standard for classifying visible facial movements. We corroborated each facial action that was present in the emotional experiences by applying stepwise regression models. The results found that spontaneous facial components occurred in ways that cohere to their evolutionary functions based on the rating values of emotional experiences (e.g., the inner brow raiser might be involved in the evaluation of novelty). This study provided new empirical evidence for the correspondence between each spontaneous facial component and first-person internal states of emotion as reported by the expresser. PMID:28522979
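The stepwise approach described above can be sketched as a forward-selection loop over action-unit predictors. The data, action-unit names, and significance threshold below are placeholders, not the study's materials.

```python
# Forward-stepwise OLS: regress one self-reported emotion rating on binary
# FACS action-unit predictors, adding the AU that most improves fit until
# no remaining predictor is significant.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
aus = ["AU1", "AU4", "AU9", "AU12"]            # hypothetical coded action units
data = pd.DataFrame(rng.integers(0, 2, (80, 4)), columns=aus)
data["amusement"] = 2 * data["AU12"] + rng.normal(0, 1, 80)  # toy rating

def forward_stepwise(df, outcome, predictors, alpha=0.05):
    selected, remaining = [], list(predictors)
    while remaining:
        pvals = {}
        for p in remaining:
            X = sm.add_constant(df[selected + [p]])
            pvals[p] = sm.OLS(df[outcome], X).fit().pvalues[p]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

print(forward_stepwise(data, "amusement", aus))  # expected: ['AU12']
```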
Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T
2012-12-01
Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize the mouth feature holistically and the eyes as isolated parts.
More emotional facial expressions during episodic than during semantic autobiographical retrieval.
El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis
2016-04-01
There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve 3 autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. The autobiographical recall was analyzed with facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions (i.e., happy facial expression for memories cued by happy). Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of cues. Our study provides insight into facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.
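One plausible way to summarise software-coded expressions during recall is as the proportion of video frames assigned to each basic emotion. The sketch below assumes a stream of per-frame labels; the labels and categories are hypothetical placeholders, not the actual output format of any particular facial-coding tool.

```python
# Summarise per-frame emotion labels as a proportion profile over the
# basic emotion categories.
from collections import Counter

BASIC_EMOTIONS = ["happy", "sad", "angry", "surprised",
                  "scared", "disgusted", "neutral"]

# Placeholder stream of per-frame labels, as a facial-coding tool might emit.
frame_labels = ["neutral"] * 50 + ["happy"] * 30 + ["sad"] * 20

def expression_profile(labels):
    counts = Counter(labels)
    total = len(labels)
    return {e: counts.get(e, 0) / total for e in BASIC_EMOTIONS}

print(expression_profile(frame_labels))
```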
Exploring the nature of facial affect processing deficits in schizophrenia.
van 't Wout, Mascha; Aleman, André; Kessels, Roy P C; Cahn, Wiepke; de Haan, Edward H F; Kahn, René S
2007-04-15
Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as well as in controlled evaluation of facial affect. Thirty-seven patients with schizophrenia were compared with 41 control subjects on incidental facial affect processing (gender decision of faces with a fearful, angry, happy, disgusted, and neutral expression) and degraded facial affect labeling (labeling of fearful, angry, happy, and neutral faces). The groups were matched on estimates of verbal and performance intelligence (National Adult Reading Test; Raven's Matrices), general face recognition ability (Benton Face Recognition), and other demographic variables. The results showed that patients with schizophrenia as well as control subjects demonstrate the normal threat-related interference during incidental facial affect processing. Conversely, on controlled evaluation patients were specifically worse in the labeling of fearful faces. In particular, patients with high levels of negative symptoms may be characterized by deficits in labeling fear. We suggest that patients with schizophrenia show no evidence of deficits in the automatic allocation of attention resources to fearful (threat-indicating) faces, but have a deficit in the controlled processing of facial emotions that may be specific for fearful faces.
Facial emotion recognition in Parkinson's disease: A review and new hypotheses
Vérin, Marc; Sauleau, Paul; Grandjean, Didier
2018-01-01
Parkinson's disease is a neurodegenerative disorder classically characterized by motor symptoms. Among them, hypomimia affects facial expressiveness and social communication and has a highly negative impact on patients' and relatives' quality of life. Patients also frequently experience nonmotor symptoms, including emotional-processing impairments, leading to difficulty in recognizing emotions from faces. Aside from its theoretical importance, understanding the disruption of facial emotion recognition in PD is crucial for improving quality of life for both patients and caregivers, as this impairment is associated with heightened interpersonal difficulties. However, studies assessing abilities in recognizing facial emotions in PD still report contradictory outcomes. The origins of this inconsistency are unclear, and several questions (regarding the role of dopamine replacement therapy or the possible consequences of hypomimia) remain unanswered. We therefore undertook a fresh review of relevant articles focusing on facial emotion recognition in PD to deepen current understanding of this nonmotor feature, exploring multiple significant potential confounding factors, both clinical and methodological, and discussing probable pathophysiological mechanisms. This led us to examine recent proposals about the role of basal ganglia-based circuits in emotion and to consider the involvement of facial mimicry in this deficit from the perspective of embodied simulation theory. We believe our findings will inform clinical practice and increase fundamental knowledge, particularly in relation to potential embodied emotion impairment in PD. PMID:29473661
ERIC Educational Resources Information Center
Zurek, Peter Paul; Scheithauer, Herbert
2017-01-01
Empathy entails basic cognitive processes such as the recognition of facial expressions and basic emotional processes such as emotional contagion, but also higher-order cognitive processes such as abstract reasoning about the other person's emotional states and higher-order emotional processes such as empathic concern. Thus, empathy must be…
Rodway, Paul; Wright, Lynn; Hardie, Scott
2003-12-01
The right hemisphere has often been viewed as having a dominant role in the processing of emotional information. Other evidence indicates that both hemispheres process emotional information but their involvement is valence specific, with the right hemisphere dealing with negative emotions and the left hemisphere preferentially processing positive emotions. This has been found under both restricted (Reuter-Lorenz & Davidson, 1981) and free viewing conditions (Jansari, Tranel, & Adolphs, 2000). It remains unclear whether the valence-specific laterality effect is also sex specific or is influenced by the handedness of participants. To explore this issue we repeated Jansari et al.'s free-viewing laterality task with 78 participants. We found a valence-specific laterality effect in women but not men, with women discriminating negative emotional expressions more accurately when the face was presented on the left-hand side and discriminating positive emotions more accurately when those faces were presented on the right-hand side. These results indicate that under free viewing conditions women are more lateralised for the processing of facial emotion than are men. Handedness did not affect the lateralised processing of facial emotion. Finally, participants demonstrated a response bias on control trials, where facial emotion did not differ between the faces. Participants selected the left-hand side more frequently when they believed the expression was negative and the right-hand side more frequently when they believed the expression was positive. This response bias can cause a spurious valence-specific laterality effect which might have contributed to the conflicting findings within the literature.
Larra, Mauro F.; Merz, Martina U.; Schächinger, Hartmut
2017-01-01
Facial self-resemblance has been associated with positive emotional evaluations, but this effect may be biased by self-face familiarity. Here we report two experiments utilizing startle modulation to investigate how the processing of facial expressions of emotion is affected by subtle resemblance to the self as well as to familiar faces. Participants of the first experiment (I) (N = 39) were presented with morphed faces showing happy, neutral, and fearful expressions which were manipulated to resemble either their own or unknown faces. At SOAs of either 300 ms or 3500–4500 ms after picture onset, startle responses were elicited by binaural bursts of white noise (50 ms, 105 dB), and recorded at the orbicularis oculi via EMG. Manual reaction time was measured in a simple emotion discrimination paradigm. Pictures preceding noise bursts by short SOA inhibited startle (prepulse inhibition, PPI). Both affective modulation and PPI of startle in response to emotional faces was altered by physical similarity to the self. As indexed both by relative facilitation of startle and faster manual responses, self-resemblance apparently induced deeper processing of facial affect, particularly in happy faces. Experiment II (N = 54) produced similar findings using morphs of famous faces, yet showed no impact of mere familiarity on PPI effects (or response time, either). The results are discussed with respect to differential (presumably pre-attentive) effects of self-specific vs. familiar information in face processing. PMID:29216226
Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.
Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia
2018-01-01
It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.
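The vMMN is, at its core, a deviant-minus-standard difference wave. A minimal sketch under assumed epoch shapes, sampling rate, and an illustrative analysis window (cf. the M300 range above); all values are simulated placeholders.

```python
# Difference wave: average deviant epochs minus average standard epochs,
# then mean amplitude in a chosen time window.
import numpy as np

sfreq = 1000                              # Hz (assumed)
times = np.arange(-100, 500) / 1000.0     # -100..499 ms epoch

rng = np.random.default_rng(1)
standard_epochs = rng.normal(0, 1, (120, times.size))  # trials x samples
deviant_epochs = rng.normal(0, 1, (30, times.size))

difference_wave = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Mean amplitude in a hypothetical 250-350 ms window.
window = (times >= 0.25) & (times < 0.35)
print(difference_wave[window].mean())
```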
Younger and Older Users’ Recognition of Virtual Agent Facial Expressions
Beer, Jenay M.; Smarr, Cory-Ann; Fisk, Arthur D.; Rogers, Wendy A.
2015-01-01
As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent's social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been supported (Ruffman et al., 2008), with older adults showing a deficit for recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al. 2009; 2011; Breazeal, 2003); however, little research has compared in depth younger and older adults' ability to label a virtual agent's facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand if those age-related differences are influenced by the intensity of the emotion, dynamic formation of emotion (i.e., a neutral expression developing into an expression of emotion through motion), or the type of virtual character differing by human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in emotion recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition of emotions expressed by three types of virtual characters differing by human-likeness (non-humanoid iCat, synthetic human, and human). Study 2 also investigated the role of configural and featural processing as a possible explanation for age-related differences in emotion recognition. First, our findings show age-related differences in the recognition of emotions expressed by a virtual agent, with older adults showing lower recognition for the emotions of anger, disgust, fear, happiness, sadness, and neutral. These age-related differences might be explained by older adults having difficulty discriminating similarity in configural arrangement of facial features for certain emotions; for example, older adults often mislabeled the similar emotion of fear as surprise. Second, our results did not provide evidence that dynamic formation improves emotion recognition, but, in general, greater emotion intensity improved recognition. Lastly, we learned that emotion recognition, for older and younger adults, differed by character type, from best to worst: human, synthetic human, and then iCat. Our findings provide guidance for design, as well as the development of a framework of age-related differences in emotion recognition. PMID:25705105
Bedi, Gillinder; Shiffrin, Laura; Vadhan, Nehal P; Nunes, Edward V; Foltin, Richard W; Bisaga, Adam
2016-04-01
In addition to difficulties in daily social functioning, regular cocaine users have decrements in social processing (the cognitive and affective processes underlying social behavior) relative to non-users. Little is known, however, about the effects of clinically-relevant pharmacological agents, such as cocaine and potential treatment medications, on social processing in cocaine users. Such drug effects could potentially alleviate or compound baseline social processing decrements in cocaine abusers. Here, we assessed the individual and combined effects of smoked cocaine and a potential treatment medication, levodopa-carbidopa-entacapone (LCE), on facial emotion recognition in cocaine smokers. Healthy non-treatment-seeking cocaine smokers (N = 14; two female) completed this 11-day inpatient within-subjects study. Participants received LCE (titrated to 400mg/100mg/200mg b.i.d.) for five days with the remaining time on placebo. The order of medication administration was counterbalanced. Facial emotion recognition was measured twice during target LCE dosing and twice on placebo: once without cocaine and once after repeated cocaine doses. LCE increased the response threshold for identification of facial fear, biasing responses away from fear identification. Cocaine had no effect on facial emotion recognition. Results highlight the possibility for candidate pharmacotherapies to have unintended impacts on social processing in cocaine users, potentially exacerbating already existing difficulties in this population.
Mancuso, Mauro; Magnani, Nadia; Cantagallo, Anna; Rossi, Giulia; Capitani, Donatella; Galletti, Vania; Cardamone, Giuseppe; Robertson, Ian Hamilton
2015-02-01
The aim of our study was to identify the common and separate mechanisms that might underpin emotion recognition impairment in patients with traumatic brain injury (TBI) and schizophrenia (Sz) compared with healthy controls (HCs). We recruited 21 Sz outpatients, 24 severe TBI outpatients, and 38 HCs, and we used eye-tracking to compare facial emotion processing performance. Both Sz and TBI patients were significantly poorer at recognizing facial emotions compared with HCs. Sz patients showed a different way of exploring the Pictures of Facial Affect stimuli and were significantly worse in recognition of neutral expressions. Selective or sustained attention deficits in TBI may reduce efficient emotion recognition, whereas in Sz, there is a more strategic deficit underlying the observed problem. There would seem to be scope for tailoring effective rehabilitative training focused on emotion recognition.
ERIC Educational Resources Information Center
Lacroix, Agnes; Guidetti, Michele; Roge, Bernadette; Reilly, Judy
2009-01-01
The aim of our study was to compare two neurodevelopmental disorders (Williams syndrome and autism) in terms of the ability to recognize emotional and nonemotional facial expressions. The comparison of these two disorders is particularly relevant to the investigation of face processing and should contribute to a better understanding of social…
Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo
2012-01-01
The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG’s impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face’s emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG’s performance was strongly influenced by the diagnosticity of the components: His emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446
Emotion categories and dimensions in the facial communication of affect: An integrated approach.
Mehu, Marc; Scherer, Klaus R
2015-12-01
We investigated the role of facial behavior in emotional communication, using both categorical and dimensional approaches. We used a corpus of enacted emotional expressions (GEMEP) in which professional actors are instructed, with the help of scenarios, to communicate a variety of emotional experiences. The results of Study 1 replicated earlier findings showing that only a minority of facial action units are associated with specific emotional categories. Likewise, facial behavior did not show a specific association with particular emotional dimensions. Study 2 showed that facial behavior plays a significant role both in the detection of emotions and in the judgment of their dimensional aspects, such as valence, arousal, dominance, and unpredictability. In addition, a mediation model revealed that the association between facial behavior and recognition of the signaler's emotional intentions is mediated by perceived emotional dimensions. We conclude that, from a production perspective, facial action units convey neither specific emotions nor specific emotional dimensions, but are associated with several emotions and several dimensions. From the perceiver's perspective, facial behavior facilitated both dimensional and categorical judgments, and the former mediated the effect of facial behavior on recognition accuracy. The classification of emotional expressions into discrete categories may, therefore, rely on the perception of more general dimensions such as valence and arousal and, presumably, the underlying appraisals that are inferred from facial movements.
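A mediation model of the kind described (facial behavior to perceived dimensions to recognition) can be sketched with the product-of-coefficients approach: one regression gives the X-to-M path, a second gives the M-to-Y path controlling for X. The simulated data below are placeholders, not the GEMEP corpus.

```python
# Simple mediation via two OLS regressions (product-of-coefficients).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 200
facial_behavior = rng.normal(size=n)                            # X
perceived_valence = 0.6 * facial_behavior + rng.normal(size=n)  # M
recognition = 0.5 * perceived_valence + rng.normal(size=n)      # Y
df = pd.DataFrame({"x": facial_behavior, "m": perceived_valence,
                   "y": recognition})

a = smf.ols("m ~ x", df).fit().params["x"]   # X -> M path
model_y = smf.ols("y ~ x + m", df).fit()
b = model_y.params["m"]                      # M -> Y path, controlling X
direct = model_y.params["x"]                 # remaining direct effect

print(f"indirect (a*b) = {a * b:.3f}, direct = {direct:.3f}")
```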
Functional MRI of facial emotion processing in left temporal lobe epilepsy.
Szaflarski, Jerzy P; Allendorfer, Jane B; Heyse, Heidi; Mendoza, Lucy; Szaflarski, Basia A; Cohen, Nancy
2014-03-01
Temporal lobe epilepsy (TLE) may negatively affect the ability to recognize emotions. This study aimed to determine the cortical correlates of facial emotion processing (happy, sad, fearful, and neutral) in patients with well-characterized left TLE (LTLE) and to examine the effect of seizure control on emotion processing. We enrolled 34 consecutive patients with LTLE and 30 matched healthy control (HC) subjects. Participants underwent functional MRI (fMRI) with an event-related facial emotion recognition task. The seizures of seventeen patients were controlled (no seizure in at least 3 months; LTLE-sz), and 17 continued to experience frequent seizures (LTLE+sz). Mood was assessed with the Beck Depression Inventory (BDI) and the Profile of Mood States (POMS). There were no differences in demographic characteristics and measures of mood between HC subjects and patients with LTLE. In patients with LTLE, fMRI showed decreased blood oxygenation level dependent (BOLD) signal in the hippocampus/parahippocampus and cerebellum in processing of happy faces and increased BOLD signal in occipital regions in response to fearful faces. Comparison of groups with LTLE+sz and LTLE-sz showed worse BDI and POMS scores in LTLE+sz (all p<0.05) except for POMS tension/anxiety (p=0.067). Functional MRI revealed increased BOLD signal in patients with LTLE+sz in the left precuneus and left parahippocampus for "fearful" faces and in the left periarcheocortex for "neutral" faces. There was a correlation between the fMRI and Total Mood Disturbance in the left precuneus in LTLE-sz (p=0.019) and in LTLE+sz (p=0.018). Overall, LTLE appears to have a relatively minor effect on the cortical underpinnings of facial emotion processing, while the effect of seizure state (controlled vs. not controlled) is more pronounced, indicating a significant relationship between seizure control and emotion processing.
Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan
2016-06-30
This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders.
Chiu, Isabelle; Gfrörer, Regina I; Piguet, Olivier; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
2015-08-01
The importance of including measures of emotion processing, such as tests of facial emotion recognition (FER), as part of a comprehensive neuropsychological assessment is being increasingly recognized. In clinical settings, FER tests need to be sensitive, short, and easy to administer, given the limited time available and patient limitations. Current tests, however, commonly use stimuli that either display prototypical emotions, bearing the risk of ceiling effects and unequal task difficulty, or are cognitively too demanding and time-consuming. To overcome these limitations in FER testing in patient populations, we aimed to define FER threshold levels for the six basic emotions in healthy individuals. Forty-nine healthy individuals between 52 and 79 years of age were asked to identify the six basic emotions at different intensity levels (25%, 50%, 75%, 100%, and 125% of the prototypical emotion). Analyses uncovered differing threshold levels across emotions and sex of facial stimuli, ranging from 50% up to 100% intensities. Using these findings as "healthy population benchmarks", we propose to apply these threshold levels to clinical populations either as facial emotion recognition or intensity rating tasks. As part of any comprehensive social cognition test battery, this approach should allow for a rapid and sensitive assessment of potential FER deficits.
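Deriving a threshold from graded-intensity accuracies can be as simple as finding the lowest tested intensity at which accuracy reaches a criterion. The criterion and accuracy values below are illustrative assumptions, not the study's benchmarks.

```python
# Lowest tested morph intensity at which group accuracy reaches a criterion.
intensities = [25, 50, 75, 100, 125]   # % of the prototypical expression

# Hypothetical mean proportion correct per intensity for one emotion.
accuracy = {25: 0.38, 50: 0.61, 75: 0.83, 100: 0.91, 125: 0.95}

def threshold(acc_by_intensity, criterion=0.75):
    for level in sorted(acc_by_intensity):
        if acc_by_intensity[level] >= criterion:
            return level
    return None  # criterion never reached at any tested intensity

print(threshold(accuracy))  # -> 75
```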
Avoiding threat in late adulthood: testing two life span theories of emotion.
Orgeta, Vasiliki
2011-07-01
The purpose of the present research was to explore the time course of age-related attentional biases and the role of emotion regulation as a potential mediator of older adults' performance in an emotion dot probe task. In two studies, younger and older adults (N = 80) completed a visual probe detection task, which presented happy, angry, and sad facial expressions. Across both studies, age influenced attentional responses to angry faces. Results indicated a bias away from angry-related facial emotion information occurring relatively late in attention. Age effects were not attributable to decreasing information processing speed or visuoperceptual function. Current results demonstrated that an age-related attentional preference away from angry facial cues was mediated by efforts to suppress emotion. Findings are discussed in relation to current theories of sociocognitive aging.
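Dot-probe attentional bias of the sort described is conventionally scored as mean RT when the probe replaces the neutral face minus mean RT when it replaces the emotional face; positive scores indicate vigilance toward the emotion, negative scores avoidance. A minimal sketch with hypothetical trial data:

```python
# Dot-probe bias score per emotion category.
import pandas as pd

trials = pd.DataFrame({
    "emotion":        ["angry"] * 4 + ["happy"] * 4,
    "probe_location": ["emotional", "neutral"] * 4,
    "rt_ms":          [520, 505, 530, 500, 480, 495, 470, 500],
})

mean_rt = (trials.groupby(["emotion", "probe_location"])["rt_ms"]
           .mean()
           .unstack("probe_location"))

# Negative values suggest attention away from that emotion.
bias = mean_rt["neutral"] - mean_rt["emotional"]
print(bias)
```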
CACNA1C risk variant affects facial emotion recognition in healthy individuals.
Nieratschker, Vanessa; Brückmann, Christof; Plewnia, Christian
2015-11-27
Recognition and correct interpretation of facial emotion is essential for social interaction and communication. Previous studies have shown that impairments in this cognitive domain are common features of several psychiatric disorders. Recent association studies identified CACNA1C as one of the most promising genetic risk factors for psychiatric disorders, and previous evidence suggests that the most replicated risk variant in CACNA1C (rs1006737) affects emotion recognition and processing. However, studies investigating the influence of rs1006737 on this intermediate phenotype in healthy subjects at the behavioral level are largely missing to date. Here, we applied the "Reading the Mind in the Eyes" test, a facial emotion recognition paradigm, in a cohort of 92 healthy individuals to address this question. Whereas accuracy was not affected by genotype, CACNA1C rs1006737 risk-allele carriers (AA/AG) showed significantly slower mean response times compared to individuals homozygous for the G-allele, indicating that healthy risk-allele carriers require more information to correctly identify a facial emotion. Our study is the first to provide evidence for an impairing behavioral effect of the CACNA1C risk variant rs1006737 on facial emotion recognition in healthy individuals and adds to the growing number of studies pointing towards CACNA1C as affecting intermediate phenotypes of psychiatric disorders.
The Automaticity of Emotional Face-Context Integration
Aviezer, Hillel; Dudarev, Veronica; Bentin, Shlomo; Hassin, Ran R.
2011-01-01
Recent studies have demonstrated that context can dramatically influence the recognition of basic facial expressions, yet the nature of this phenomenon is largely unknown. In the present paper we begin to characterize the underlying process of face-context integration. Specifically, we examine whether it is a relatively controlled or automatic process. In Experiment 1 participants were motivated and instructed to avoid using the context while categorizing contextualized facial expression, or they were led to believe that the context was irrelevant. Nevertheless, they were unable to disregard the context, which exerted a strong effect on their emotion recognition. In Experiment 2, participants categorized contextualized facial expressions while engaged in a concurrent working memory task. Despite the load, the context exerted a strong influence on their recognition of facial expressions. These results suggest that facial expressions and their body contexts are integrated in an unintentional, uncontrollable, and relatively effortless manner. PMID:21707150
Facial dynamics and emotional expressions in facial aging treatments.
Michaud, Thierry; Gassia, Véronique; Belhaouari, Lakhdar
2015-03-01
Facial expressions convey emotions that form the foundation of interpersonal relationships, and many of these emotions promote and regulate our social linkages. Hence, the symptomatological analysis of facial aging and the treatment plan must include knowledge of facial dynamics and the emotional expressions of the face. This approach aims to more closely meet patients' expectations of natural-looking results, by correcting age-related negative expressions while observing the emotional language of the face. This article will successively describe patients' expectations, the role of facial expressions in relational dynamics, the relationship between facial structures and facial expressions, and the way facial aging mimics negative expressions. Finally, therapeutic implications for facial aging treatment will be addressed.
Effects of task demands on the early neural processing of fearful and happy facial expressions.
Itier, Roxane J; Neath-Tavares, Karly N
2017-05-15
Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during gender discrimination, explicit emotion discrimination and oddball detection tasks, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced with a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing.
The role of encoding and attention in facial emotion memory: an EEG investigation.
Brenner, Colleen A; Rumak, Samuel P; Burns, Amy M N; Kieffaber, Paul D
2014-09-01
Facial expressions are encoded via sensory mechanisms, but meaning extraction and salience of these expressions involve cognitive functions. We investigated the time course of sensory encoding and subsequent maintenance in memory via EEG. Twenty-nine healthy participants completed a facial emotion delayed match-to-sample task. P100, N170 and N250 ERPs were measured in response to the first stimulus, and evoked theta power (4-7 Hz) was measured during the delay interval. Negative facial expressions produced larger N170 amplitudes and greater theta power early in the delay. N170 amplitude correlated with theta power; however, larger N170 amplitude coupled with greater theta power only predicted behavioural performance for one emotion condition (very happy) out of six tested (see Supplemental Data). These findings indicate that the N170 ERP may be sensitive to emotional facial expressions when task demands require encoding and retention of this information. Furthermore, sustained theta activity may represent continued attentional processing that supports short-term memory, especially of negative facial stimuli. Further study is needed to investigate the potential influence of these measures, and their interaction, on behavioural performance.
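Evoked theta power of the sort described is typically estimated from the trial-averaged signal. A minimal sketch using Welch's method, with an assumed sampling rate and simulated delay-interval epochs standing in for real EEG data:

```python
# Evoked theta (4-7 Hz) power during a delay interval: average epochs
# first (evoked activity), then estimate the power spectrum.
import numpy as np
from scipy.signal import welch

sfreq = 250                                        # Hz (assumed)
rng = np.random.default_rng(3)
delay_epochs = rng.normal(0, 1, (60, sfreq * 2))   # 60 trials x 2 s delay

evoked = delay_epochs.mean(axis=0)                 # trial-averaged signal
freqs, psd = welch(evoked, fs=sfreq, nperseg=sfreq)

theta = (freqs >= 4) & (freqs <= 7)
theta_power = np.trapz(psd[theta], freqs[theta])   # integrate the band
print(theta_power)
```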
Does Facial Amimia Impact the Recognition of Facial Emotions? An EMG Study in Parkinson’s Disease
Argaud, Soizic; Delplanque, Sylvain; Houvenaghel, Jean-François; Auffret, Manon; Duprez, Joan; Vérin, Marc; Grandjean, Didier; Sauleau, Paul
2016-01-01
According to embodied simulation theory, understanding other people's emotions is fostered by facial mimicry. However, studies assessing the effect of facial mimicry on the recognition of emotion are still controversial. In Parkinson's disease (PD), one of the most distinctive clinical features is facial amimia, a reduction in facial expressiveness, but patients also show emotional disturbances. The present study used the pathological model of PD to examine the role of facial mimicry in emotion recognition by investigating EMG responses in PD patients during a facial emotion recognition task (anger, joy, neutral). Our results evidenced a significant decrease in facial mimicry for joy in PD, essentially linked to the absence of reaction of the zygomaticus major and the orbicularis oculi muscles in response to happy avatars, whereas facial mimicry for expressions of anger was relatively preserved. We also confirmed that PD patients were less accurate in recognizing positive and neutral facial expressions and highlighted a beneficial effect of facial mimicry on the recognition of emotion. We thus provide additional arguments for embodied simulation theory, suggesting that facial mimicry is a potential lever for therapeutic action in PD, even if it does not appear to be strictly required for emotion recognition as such. PMID:27467393
Attention to emotion modulates fMRI activity in human right superior temporal sulcus.
Narumoto, J; Okada, T; Sadato, N; Fukui, K; Yonekura, Y
2001-10-01
A parallel neural network has been proposed for processing the various types of information conveyed by faces, including emotion. Using functional magnetic resonance imaging (fMRI), we tested the effect of explicit attention to the emotional expression of faces on the activity of face-responsive regions. A delayed match-to-sample procedure was adopted. Subjects were required to match visually presented pictures with regard to the contour of the face pictures, facial identity, and emotional expressions by valence (happy and fearful expressions) and arousal (fearful and sad expressions). Contour matching of non-face scrambled pictures was used as a control condition. The face-responsive regions that responded more to faces than to non-face stimuli were the bilateral lateral fusiform gyrus (LFG), the right superior temporal sulcus (STS), and the bilateral intraparietal sulcus (IPS). In these regions, general attention to the face enhanced the activities of the bilateral LFG, the right STS, and the left IPS compared with attention to the contour of the facial image. Selective attention to facial emotion specifically enhanced the activity of the right STS compared with attention to the face per se. The results suggest that the right STS region plays a special role in facial emotion recognition within distributed face-processing systems. This finding may support the notion that the STS is involved in social perception.
Routledge, Kylie M; Williams, Leanne M; Harris, Anthony W F; Schofield, Peter R; Clark, C Richard; Gatt, Justine M
2018-06-01
Currently there is a very limited understanding of how mental wellbeing versus anxiety and depression symptoms are associated with emotion processing behaviour. For the first time, we examined these associations using a behavioural emotion task of positive and negative facial expressions in 1668 healthy adult twins. Linear mixed model results suggested that faster reaction times to happy facial expressions were associated with higher wellbeing scores, and slower reaction times with higher depression and anxiety scores. Multivariate twin modelling identified a significant genetic correlation between depression and anxiety symptoms and reaction time to happy facial expressions, in the absence of any significant correlations with wellbeing. We also found a significant negative phenotypic relationship between depression and anxiety symptoms and accuracy for identifying neutral emotions, although the genetic and environmental correlations were not significant in the multivariate model. Overall, the phenotypic relationships between speed of identifying happy facial expressions and wellbeing on the one hand, versus depression and anxiety symptoms on the other, were in opposing directions. Twin modelling revealed a small common genetic correlation between response to happy faces and depression and anxiety symptoms alone, suggesting that wellbeing and depression and anxiety symptoms show largely independent relationships with emotion processing at the behavioural level. Copyright © 2018 Elsevier B.V. All rights reserved.
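A hedged sketch of the kind of linear mixed model described above: reaction time to happy faces regressed on wellbeing and symptom scores, with a random intercept per twin pair. The data file, column names, and simplified model are assumptions; the published model (and the twin decomposition) is more elaborate.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("twin_emotion_task.csv")  # hypothetical per-subject table
model = smf.mixedlm(
    "rt_happy ~ wellbeing + depression_anxiety",  # fixed effects
    data=df,
    groups=df["twin_pair_id"],                    # random intercept per pair
).fit()
print(model.summary())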
Lahera, Guillermo; Ruiz, Alicia; Brañas, Antía; Vicens, María; Orozco, Arantxa
Previous studies have linked processing speed with social cognition and functioning in patients with schizophrenia. A discriminant analysis is needed to determine the different components of this neuropsychological construct. This paper analyzes the impact of processing speed, reaction time and sustained attention on social functioning. 98 outpatients aged between 18 and 65 years, with a DSM-5 diagnosis of schizophrenia and a period of 3 months of clinical stability, were recruited. Sociodemographic and clinical data were collected, and the following variables were measured: processing speed (Trail Making Test [TMT], symbol coding [BACS], verbal fluency), simple and choice reaction time, sustained attention, recognition of facial emotions and global functioning. Processing speed (measured only through the BACS), sustained attention (CPT) and choice (but not simple) reaction time were associated with functioning. Recognition of facial emotions (FEIT) correlated significantly with scores on measures of processing speed (BACS, Animals, TMT), sustained attention (CPT) and reaction time. The linear regression model showed a significant relationship between functioning, emotion recognition (P=.015) and processing speed (P=.029). Deficits in processing speed and facial emotion recognition are associated with worse global functioning in patients with schizophrenia. Copyright © 2017 SEP y SEPB. Publicado por Elsevier España, S.L.U. All rights reserved.
Feeser, Melanie; Fan, Yan; Weigand, Anne; Hahn, Adam; Gärtner, Matti; Aust, Sabine; Böker, Heinz; Bajbouj, Malek; Grimm, Simone
2014-12-01
Previous studies have shown that oxytocin (OXT) enhances social cognitive processes. It has also been demonstrated that OXT does not uniformly facilitate social cognition. The effects of OXT administration strongly depend on the exposure to stressful experiences in early life. Emotional facial recognition is crucial for social cognition. However, no study has yet examined how the effects of OXT on the ability to identify emotional faces are altered by early life stress (ELS) experiences. Given the role of OXT in modulating social motivational processes, we specifically aimed to investigate its effects on the recognition of approach- and avoidance-related facial emotions. In a double-blind, between-subjects, placebo-controlled design, 82 male participants performed an emotion recognition task with faces taken from the "Karolinska Directed Emotional Faces" set. We clustered the six basic emotions along the dimensions approach (happy, surprise, anger) and avoidance (fear, sadness, disgust). ELS was assessed with the Childhood Trauma Questionnaire (CTQ). Our results showed that OXT improved the ability to recognize avoidance-related emotional faces as compared to approach-related emotional faces. Whereas performance for avoidance-related emotions in participants with higher ELS scores was comparable in the OXT and placebo conditions, OXT enhanced emotion recognition in participants with lower ELS scores. Independent of OXT administration, we observed increased emotion recognition for avoidance-related faces in participants with high ELS scores. Our findings suggest that the investigation of OXT effects on social recognition requires a broad approach that takes ELS experiences as well as motivational processes into account.
The emotion seen in a face can be a methodological artifact: The process of elimination hypothesis.
DiGirolamo, Marissa A; Russell, James A
2017-04-01
The claim that certain facial expressions signal certain specific emotions has been supported by high observer agreement in labeling the emotion predicted for that expression. Our hypothesis was that, with a method common to the field, high observer agreement can be achieved through a process of elimination: As participants move from trial to trial and they encounter a type of expression not previously encountered in the experiment, they tend to eliminate labels they have already associated with expressions seen on previous trials; they then select among labels not previously used. Seven experiments (total N = 1,068) here showed that the amount of agreement can be altered through a process of elimination. One facial expression not previously theorized to signal any emotion was consensually labeled as disgusted (76%), annoyed (85%), playful (89%), and mischievous (96%). Three quite different facial expressions were labeled nonplussed (82%, 93%, and 82%). A prototypical sad expression was labeled disgusted (55%), and a prototypical fear expression was labeled surprised (55%). A facial expression was labeled with a made-up word (tolen; 53%). Similar results were obtained both in a context focused on demonstrating a process of elimination and in one similar to a commonly used method, with 4 target expressions embedded with other expressions in 24 randomly ordered trials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Neural mechanism for judging the appropriateness of facial affect.
Kim, Ji-Woong; Kim, Jae-Jin; Jeong, Bum Seok; Ki, Seon Wan; Im, Dong-Mi; Lee, Soo Jung; Lee, Hong Shick
2005-12-01
Questions regarding the appropriateness of facial expressions in particular situations arise ubiquitously in everyday social interactions. To determine the appropriateness of facial affect, first of all, we should represent our own or the other's emotional state as induced by the social situation. Then, based on these representations, we should infer the possible affective response of the other person. In this study, we identified the brain mechanism mediating special types of social evaluative judgments of facial affect in which the internal reference is related to theory of mind (ToM) processing. Many previous ToM studies have used non-emotional stimuli, but, because so much valuable social information is conveyed through nonverbal emotional channels, this investigation used emotionally salient visual materials to tap ToM. Fourteen right-handed healthy subjects volunteered for our study. We used functional magnetic resonance imaging to examine brain activation during the judgmental task for the appropriateness of facial affect as opposed to gender matching tasks. We identified activation of a brain network, which includes the medial frontal cortex bilaterally, the left temporal pole, the left inferior frontal gyrus, and the left thalamus, during the judgmental task for appropriateness of facial affect compared to the gender matching task. The results of this study suggest that the brain system involved in ToM plays a key role in judging the appropriateness of facial affect in an emotionally laden situation. In addition, our results support the notion that common neural substrates are involved in performing diverse kinds of ToM tasks irrespective of perceptual modalities and the emotional salience of test materials.
Effects of delta-9-tetrahydrocannabinol on evaluation of emotional images
Ballard, Michael E; Bedi, Gillinder; de Wit, Harriet
2013-01-01
There is growing evidence that drugs of abuse alter processing of emotional information in ways that could be attractive to users. Our recent report that Δ9-tetrahydrocannabinol (THC) diminishes amygdalar activation in response to threat-related faces suggests that THC may modify evaluation of emotionally-salient, particularly negative or threatening, stimuli. In this study, we examined the effects of acute THC on evaluation of emotional images. Healthy volunteers received two doses of THC (7.5 and 15 mg; p.o.) and placebo across separate sessions before performing tasks assessing facial emotion recognition and emotional responses to pictures of emotional scenes. THC significantly impaired recognition of facial fear and anger, but it only marginally impaired recognition of sadness and happiness. The drug did not consistently affect ratings of emotional scenes. THC's effects on emotional evaluation were not clearly related to its mood-altering effects. These results support our previous work, and show that THC reduces perception of facial threat. Nevertheless, THC does not appear to positively bias evaluation of emotional stimuli in general. PMID:22585232
Suslow, Thomas; Kugel, Harald; Rufer, Michael; Redlich, Ronny; Dohm, Katharina; Grotegerd, Dominik; Zaremba, Dario; Dannlowski, Udo
2016-02-04
Alexithymia is a clinically relevant personality trait related to difficulties in recognizing and describing emotions. Previous studies examining the neural correlates of alexithymia have shown mainly decreased response of several brain areas during emotion processing in healthy samples and patients suffering from autism or post-traumatic stress disorder. In the present study, we examined the effect of alexithymia on automatic brain reactivity to negative and positive facial expressions in clinical depression. Brain activation in response to sad, happy, neutral, and no facial expression (presented for 33 ms and masked by neutral faces) was measured by functional magnetic resonance imaging at 3 T in 26 alexithymic and 26 non-alexithymic patients with major depression. Alexithymic patients manifested less activation in response to masked sad and happy (compared to neutral) faces in right frontal regions and right caudate nuclei than non-alexithymic patients. Our neuroimaging study provides evidence that the personality trait alexithymia has a modulating effect on automatic emotion processing in clinical depression. Our findings support the idea that alexithymia could be associated with functional deficits of the right hemisphere. Future research on the neural substrates of emotion processing in depression should assess and control alexithymia in their analyses.
Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J
2017-12-01
The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly for anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing that holistic perception of the aligned composite images interfered with recognition of the component expressions. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.
Gaze Behavior Consistency among Older and Younger Adults When Looking at Emotional Faces
Chaby, Laurence; Hupont, Isabelle; Avril, Marie; Luherne-du Boullay, Viviane; Chetouani, Mohamed
2017-01-01
The identification of non-verbal emotional signals, and especially of facial expressions, is essential for successful social communication among humans. Previous research has reported an age-related decline in facial emotion identification, and argued for socio-emotional or aging-brain model explanations. However, perceptual differences in the gaze strategies that accompany facial emotion processing with advancing age remain under-explored. In this study, 22 young (22.2 years) and 22 older (70.4 years) adults were instructed to look at basic facial expressions while their gaze movements were recorded by an eye-tracker. Participants were then asked to identify each emotion, and the unbiased hit rate was applied as the performance measure. Gaze data were first analyzed using traditional measures of fixations over two preferential regions of the face (upper and lower areas) for each emotion. Then, to better capture core gaze changes with advancing age, spatio-temporal gaze behaviors were further examined using data-driven analyses (dimension reduction, clustering). Results first confirmed that older adults performed worse than younger adults at identifying facial expressions, except for “joy” and “disgust,” and this was accompanied by a gaze preference toward the lower face. Interestingly, this phenomenon was maintained during the whole time course of stimulus presentation. More importantly, trials corresponding to older adults were more tightly clustered, suggesting that the gaze behavior patterns of older adults are more consistent than those of younger adults. This study demonstrates that, when confronted with emotional faces, younger and older adults do not prioritize or ignore the same facial areas. Older adults mainly adopted a focused-gaze strategy, consisting of focusing only on the lower part of the face throughout the whole stimulus display time. This consistency may constitute a robust and distinctive “social signature” of emotional identification in aging. Younger adults, however, were more dispersed in terms of gaze behavior and used a more exploratory-gaze strategy, consisting of repeatedly visiting both facial areas. PMID:28450841
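The unbiased hit rate used above (Wagner, 1993) corrects raw accuracy for response bias; a minimal implementation from a confusion matrix, where the toy data and function name are illustrative:

import numpy as np

def unbiased_hit_rate(cm):
    """Hu_i = hits_i^2 / (n presentations of emotion i * n uses of label i)."""
    cm = np.asarray(cm, dtype=float)
    hits = np.diag(cm)
    stim_totals = cm.sum(axis=1)   # times each emotion was shown
    resp_totals = cm.sum(axis=0)   # times each label was chosen
    return hits ** 2 / (stim_totals * resp_totals)

cm = np.array([[18, 2], [5, 15]])  # toy 2-emotion confusion matrix
print(unbiased_hit_rate(cm))       # one Hu value per emotion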
Emotion perception across cultures: the role of cognitive mechanisms
Engelmann, Jan B.; Pogosyan, Marianna
2012-01-01
Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception. PMID:23486743
Automatic emotion processing as a function of trait emotional awareness: an fMRI study
Lichev, Vladimir; Sacher, Julia; Ihme, Klas; Rosenberg, Nicole; Quirin, Markus; Lepsien, Jöran; Pampel, André; Rufer, Michael; Grabe, Hans-Jörgen; Kugel, Harald; Kersting, Anette; Villringer, Arno; Lane, Richard D.
2015-01-01
It is unclear whether reflective awareness of emotions is related to extent and intensity of implicit affective reactions. This study is the first to investigate automatic brain reactivity to emotional stimuli as a function of trait emotional awareness. To assess emotional awareness the Levels of Emotional Awareness Scale (LEAS) was administered. During scanning, masked happy, angry, fearful and neutral facial expressions were presented to 46 healthy subjects, who had to rate the fit between artificial and emotional words. The rating procedure allowed assessment of shifts in implicit affectivity due to emotion faces. Trait emotional awareness was associated with increased activation in the primary somatosensory cortex, inferior parietal lobule, anterior cingulate gyrus, middle frontal and cerebellar areas, thalamus, putamen and amygdala in response to masked happy faces. LEAS correlated positively with shifts in implicit affect caused by masked happy faces. According to our findings, people with high emotional awareness show stronger affective reactivity and more activation in brain areas involved in emotion processing and simulation during the perception of masked happy facial expression than people with low emotional awareness. High emotional awareness appears to be characterized by an enhanced positive affective resonance to others at an automatic processing level. PMID:25140051
Kometer, Michael; Schmidt, André; Bachmann, Rosilla; Studerus, Erich; Seifritz, Erich; Vollenweider, Franz X
2012-12-01
Serotonin (5-HT) 1A and 2A receptors have been associated with dysfunctional emotional processing biases in mood disorders. These receptors further predominantly mediate the subjective and behavioral effects of psilocybin and might be important for its recently suggested antidepressive effects. However, the effect of psilocybin on emotional processing biases and the specific contribution of 5-HT2A receptors across different emotional domains is unknown. In a randomized, double-blind study, 17 healthy human subjects received on 4 separate days placebo, psilocybin (215 μg/kg), the preferential 5-HT2A antagonist ketanserin (50 mg), or psilocybin plus ketanserin. Mood states were assessed by self-report ratings, and behavioral and event-related potential measurements were used to quantify facial emotional recognition and goal-directed behavior toward emotional cues. Psilocybin enhanced positive mood and attenuated recognition of negative facial expression. Furthermore, psilocybin increased goal-directed behavior toward positive compared with negative cues, facilitated positive but inhibited negative sequential emotional effects, and valence-dependently attenuated the P300 component. Ketanserin alone had no effects but blocked the psilocybin-induced mood enhancement and decreased recognition of negative facial expression. This study shows that psilocybin shifts the emotional bias across various psychological domains and that activation of 5-HT2A receptors is central in mood regulation and emotional face recognition in healthy subjects. These findings may not only have implications for the pathophysiology of dysfunctional emotional biases but may also provide a framework to delineate the mechanisms underlying psilocybin's putative antidepressant effects. Copyright © 2012 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Psychometric challenges and proposed solutions when scoring facial emotion expression codes.
Olderbak, Sally; Hildebrandt, Andrea; Pinkpank, Thomas; Sommer, Werner; Wilhelm, Oliver
2014-12-01
Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion on how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing scoring procedures for these codes. Then, on the basis of a time series data set collected to assess individual differences in facial emotion expression ability, we derive, apply, and evaluate several statistical procedures, including four scoring methods and four data treatments, to score software-coded emotion expression data. These scoring procedures are illustrated to inform decisions about scoring and data treatment for other emotion expression questions and under different experimental circumstances. Overall, we found that applying loess smoothing and controlling for baseline facial emotion expression and facial plasticity are the recommended methods of data treatment. When scoring facial emotion expression ability, the maximum score is preferred. Finally, we discuss the scoring methods and data treatments in the larger context of emotion expression research.
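The recommended data treatment above (loess smoothing plus baseline correction) might look like the following for a single software-coded intensity series; the window fraction, stand-in data, and names are assumptions, not the authors' code:

import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def smooth_and_baseline(scores, times, baseline_mask, frac=0.2):
    """Loess-smooth an emotion-intensity series, then subtract the baseline mean."""
    smoothed = lowess(scores, times, frac=frac, return_sorted=False)
    return smoothed - smoothed[baseline_mask].mean()

times = np.linspace(0.0, 10.0, 300)
scores = np.random.rand(300)  # stand-in for software-coded expression intensities
corrected = smooth_and_baseline(scores, times, baseline_mask=times < 1.0)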
Facial color is an efficient mechanism to visually transmit emotion.
Benitez-Quiroz, Carlos F; Srinivasan, Ramprakash; Martinez, Aleix M
2018-04-03
Facial expressions of emotion in humans are believed to be produced by contracting one's facial muscles, generally called action units. However, the surface of the face is also innervated with a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Here, we study the hypothesis that these visible facial colors allow observers to successfully transmit and visually interpret emotion even in the absence of facial muscle activation. To study this hypothesis, we address the following two questions. Are observable facial colors consistent within and differential between emotion categories and positive vs. negative valence? And does the human visual system use these facial colors to decode emotion from faces? These questions suggest the existence of an important, unexplored mechanism of the production of facial expressions of emotion by a sender and their visual interpretation by an observer. The results of our studies provide evidence in favor of our hypothesis. We show that people successfully decode emotion using these color features, even in the absence of any facial muscle activation. We also demonstrate that this color signal is independent from that provided by facial muscle movements. These results support a revised model of the production and perception of facial expressions of emotion where facial color is an effective mechanism to visually transmit and decode emotion. Copyright © 2018 the Author(s). Published by PNAS.
Emotion processing biases and resting EEG activity in depressed adolescents
Auerbach, Randy P.; Stewart, Jeremy G.; Stanton, Colin H.; Mueller, Erik M.; Pizzagalli, Diego A.
2015-01-01
Background: While theorists have posited that adolescent depression is characterized by emotion processing biases (greater propensity to identify sad than happy facial expressions), findings have been mixed. Additionally, the neural correlates associated with putative emotion processing biases remain largely unknown. Our aim was to identify emotion processing biases in depressed adolescents and examine neural abnormalities related to these biases using high-density resting EEG and source localization. Methods: Healthy (n = 36) and depressed (n = 23) female adolescents, aged 13–18 years, completed a facial recognition task in which they identified happy, sad, fear, and angry expressions across intensities from 10% (low) to 100% (high). Additionally, 128-channel resting (i.e., task-free) EEG was recorded and analyzed using a distributed source localization technique (LORETA). Given research implicating the dorsolateral prefrontal cortex (DLPFC) in depression and emotion processing, analyses focused on this region. Results: Relative to healthy youth, depressed adolescents were more accurate for sad and less accurate for happy, particularly low-intensity happy faces. No differences emerged for fearful or angry facial expressions. Further, LORETA analyses revealed greater theta and alpha current density (i.e., reduced brain activity) in depressed versus healthy adolescents, particularly in the left DLPFC (BA9/BA46). Theta and alpha current density were positively correlated, and greater current density predicted reduced accuracy for happy faces. Conclusion: Depressed female adolescents were characterized by emotion processing biases in favor of sad emotions and reduced recognition of happiness, especially when cues of happiness were subtle. Blunted recognition of happy was associated with left DLPFC resting hypoactivity. PMID:26032684
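A minimal sketch of the brain-behavior relationship reported above (resting current density against recognition accuracy for happy faces), with randomly generated stand-in data; variable names and the simple bivariate form are assumptions:

import numpy as np
from scipy.stats import pearsonr

theta_density = np.random.rand(59)   # per-subject left-DLPFC theta current density
happy_accuracy = np.random.rand(59)  # per-subject accuracy for happy faces

r, p = pearsonr(theta_density, happy_accuracy)  # reported effect: negative r
print(f"r = {r:.2f}, p = {p:.3f}")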
Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C
2007-11-01
Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.
Young, Audrey; Luyster, Rhiannon J; Fox, Nathan A; Zeanah, Charles H; Nelson, Charles A
2017-09-01
Early psychosocial deprivation has profound adverse effects on children's brain and behavioural development, including abnormalities in physical growth, intellectual function, social cognition, and emotional development. Nevertheless, the domain of emotional face processing has appeared in previous research to be relatively spared; here, we test for possible sleeper effects emerging in early adolescence. This study employed event-related potentials (ERPs) to examine the neural correlates of facial emotion processing in 12-year-old children who took part in a randomized controlled trial of foster care as an intervention for early institutionalization. Results revealed no significant group differences in two face and emotion-sensitive ERP components (P1 and N170), nor any association with age at placement or per cent of lifetime spent in an institution. These results converged with previous evidence from this population supporting relative sparing of facial emotion processing. We hypothesize that this sparing is due to an experience-dependent mechanism in which the amount of exposure to faces and facial expressions of emotion children received was sufficient to meet the low threshold required for cortical specialization of structures critical to emotion processing. Statement of contribution What is already known on this subject? Early psychosocial deprivation leads to profoundly detrimental effects on children's brain and behavioural development. With respect to children's emotional face processing abilities, few adverse effects of institutionalized rearing have previously been reported. Recent studies suggest that 'sleeper effects' may emerge many years later, especially in the domain of face processing. What does this study add? Examining a cumulative 12 years of data, we found only minimal group differences and no evidence of a sleeper effect in this particular domain. These findings identify emotional face processing as a unique ability in which relative sparing can be found. We propose an experience-dependent mechanism in which the amount of social interaction children received met the low threshold required for cortical specialization. © 2017 The British Psychological Society.
Emotional facial recognition in proactive and reactive violent offenders.
Philipp-Wiegmann, Florence; Rösler, Michael; Retz-Junginger, Petra; Retz, Wolfgang
2017-10-01
The purpose of this study is to analyse individual differences in the ability to recognize emotional facial expressions in violent offenders, who were characterised as either reactive or proactive in relation to their offending. In accordance with findings of our previous study, we expected higher impairments in facial recognition in reactive than proactive violent offenders. To assess the ability to recognize facial expressions, the computer-based Facial Emotional Expression Labeling Test (FEEL) was performed. Group allocation of reactive and proactive violent offenders and assessment of psychopathic traits were performed by an independent forensic expert using rating scales (PROREA, PCL-SV). Compared to proactive violent offenders and controls, the performance of emotion recognition in the reactive offender group was significantly lower, both in total and especially in recognition of negative emotions such as anxiety (d = -1.29), sadness (d = -1.54), and disgust (d = -1.11). Furthermore, reactive violent offenders showed a tendency to interpret non-anger emotions as anger. In contrast, proactive violent offenders performed as well as controls. General and specific deficits in reactive violent offenders are in line with the results of our previous study and correspond to predictions of the Integrated Emotion System (IES, 7) and hostile attribution processes (21). Due to the different error patterns in the FEEL test, the theoretical distinction between proactive and reactive aggression can be supported on the basis of emotion recognition, even though aggression itself is always a heterogeneous act rather than a distinct one-dimensional concept.
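The group differences above are reported as Cohen's d; a small helper for the pooled-standard-deviation version of that effect size (an illustrative sketch, not the authors' code):

import numpy as np

def cohens_d(a, b):
    """Cohen's d for two independent samples, with pooled standard deviation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(cohens_d(np.random.randn(30), np.random.randn(30) + 1.0))  # toy example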
Taurisano, Paolo; Blasi, Giuseppe; Romano, Raffaella; Sambataro, Fabio; Fazio, Leonardo; Gelao, Barbara; Ursini, Gianluca; Lo Bianco, Luciana; Di Giorgio, Annabella; Ferrante, Francesca; Papazacharias, Apostolos; Porcelli, Annamaria; Sinibaldi, Lorenzo; Popolizio, Teresa; Bertolino, Alessandro
2013-12-01
Maternal care (MC) and dopamine modulate brain activity during emotion processing in inferior frontal gyrus (IFG), striatum and amygdala. Reuptake of dopamine from the synapse is performed by the dopamine transporter (DAT), whose abundance is predicted by variation in its gene (DAT 3'VNTR; 10 > 9-repeat alleles). Here, we investigated the interaction between perceived MC and DAT 3'VNTR genotype on brain activity during processing of aversive facial emotional stimuli. Sixty-one healthy subjects were genotyped for DAT 3'VNTR and categorized as low or high MC individuals. They underwent functional magnetic resonance imaging while performing a task requiring gender discrimination of facial stimuli with angry, fearful or neutral expressions. An interaction between facial expression, DAT genotype and MC was found in left IFG, such that low MC and homozygosity for the 10-repeat allele are associated with greater activity during processing of fearful faces. This greater activity was also inversely correlated with a measure of emotion control as scored with the Big Five Questionnaire. Moreover, MC and DAT genotype described a double dissociation on functional connectivity between IFG and amygdala. These findings suggest that perceived early parental bonding may interact with DAT 3'VNTR genotype in modulating brain activity during emotionally relevant inputs.
Brain Response to a Humanoid Robot in Areas Implicated in the Perception of Human Emotional Gestures
Chaminade, Thierry; Zecca, Massimiliano; Blakemore, Sarah-Jayne; Takanishi, Atsuo; Frith, Chris D.; Micera, Silvestro; Dario, Paolo; Rizzolatti, Giacomo; Gallese, Vittorio; Umiltà, Maria Alessandra
2010-01-01
Background: The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet we might utilize different neural processes than those used for reading the emotions in human agents. Methodology: Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expression of Anger, Joy, Disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted. Principal Findings: Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like left Broca's area for the perception of speech, and in the processing of emotions like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased response to robot, but not human, facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance. Conclusions: Motor resonance towards a humanoid robot's, but not a human's, display of facial emotion is increased when attention is directed towards judging emotions. Significance: Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions. PMID:20657777
NASA Astrophysics Data System (ADS)
Khan, Masood Mehmood; Ward, Robert D.; Ingleby, Michael
The ability to distinguish feigned from involuntary expressions of emotions could help in the investigation and treatment of neuropsychiatric and affective disorders and in the detection of malingering. This work investigates differences in emotion-specific patterns of thermal variations along the major facial muscles. Using experimental data extracted from 156 images, we attempted to classify patterns of emotion-specific thermal variations into neutral, and voluntary and involuntary expressions of positive and negative emotive states. Initial results suggest (i) each facial muscle exhibits a unique thermal response to various emotive states; (ii) the pattern of thermal variances along the facial muscles may assist in classifying voluntary and involuntary facial expressions; and (iii) facial skin temperature measurements along the major facial muscles may be used in automated emotion assessment.
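The classification step described above can be sketched with a standard supervised-learning pipeline; the eight-muscle feature layout, five-class labeling, and choice of a linear SVM are assumptions for illustration, not the authors' method:

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X = np.random.rand(156, 8)        # 156 images x thermal variation per facial muscle
y = np.random.randint(0, 5, 156)  # neutral / voluntary pos. / voluntary neg. / involuntary pos. / involuntary neg.

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(scores.mean())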
Suslow, Thomas; Kugel, Harald; Lindner, Christian; Dannlowski, Udo; Egloff, Boris
2017-01-06
Extraversion-introversion is a personality dimension referring to individual differences in social behavior. In the past, neurobiological research on extraversion was almost entirely based upon questionnaires, which inform about the explicit self-concept. Today, indirect measures are available that tap into the implicit self-concept of extraversion, which is assumed to result from automatic processing functions. In our study, brain activation while viewing facial expression of affiliation-relevant (i.e., happiness and disgust) and affiliation-irrelevant (i.e., fear) emotions was examined as a function of the implicit and explicit self-concept of extraversion and processing mode (automatic vs. controlled). 40 healthy volunteers watched blocks of masked and unmasked emotional faces while undergoing functional magnetic resonance imaging. The Implicit Association Test and the NEO Five-Factor Inventory were applied as implicit and explicit measures of extraversion, which were uncorrelated in our sample. Implicit extraversion was found to be positively associated with neural response to masked happy faces in the thalamus and temporo-parietal regions and to masked disgust faces in cerebellar areas. Moreover, it was positively correlated with brain response to unmasked disgust faces in the amygdala and cortical areas. Explicit extraversion was not related to brain response to facial emotions when controlling for trait anxiety. The implicit compared to the explicit self-concept of extraversion seems to be more strongly associated with brain activation not only during automatic but also during controlled processing of affiliation-relevant facial emotions. Enhanced neural response to facial disgust could reflect high sensitivity to signals of interpersonal rejection in extraverts (i.e., individuals with affiliative tendencies). Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.
Stan, Ana D; Schirda, Claudiu V; Bertocci, Michele A; Bebko, Genna M; Kronhaus, Dina M; Aslam, Haris A; LaBarbara, Eduard J; Tanase, Costin; Lockovich, Jeanette C; Pollock, Myrna H; Stiffler, Richelle S; Phillips, Mary L
2014-09-30
The dorsomedial prefrontal cortex (MdPFC) and anterior cingulate cortices (ACC) play a critical role in implicit emotion regulation; however the understanding of the specific neurotransmitters that mediate such role is lacking. In this study, we examined relationships between MdPFC concentrations of two neurotransmitters, glutamate and γ-amino butyric acid (GABA), and BOLD activity in ACC during performance of an implicit facial emotion-processing task. Twenty healthy volunteers, aged 20-35 years, were scanned while performing an implicit facial emotion-processing task, whereby presented facial expressions changed from neutral to one of the four emotions: happy, anger, fear, or sad. Glutamate concentrations were measured before and after the emotion-processing task in right MdPFC using magnetic resonance spectroscopy (MRS). GABA concentrations were measured in bilateral MdPFC after the emotion-processing task. Multiple regression models were run to determine the relative contribution of glutamate and GABA concentration, age, and gender to BOLD signal in ACC to each of the four emotions. Multiple regression analyses revealed a significant negative correlation between MdPFC GABA concentration and BOLD signal in subgenual ACC (p<0.05, corrected) to sad versus shape contrast. For the anger versus shape contrast, there was a significant negative correlation between age and BOLD signal in pregenual ACC (p<0.05, corrected) and a positive correlation between MdPFC glutamate concentration (pre-task) and BOLD signal in pregenual ACC (p<0.05, corrected). Our findings are the first to provide insight into relationships between MdPFC neurotransmitter concentrations and ACC BOLD signal, and could further understanding of molecular mechanisms underlying emotion processing in healthy and mood-disordered individuals. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
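A minimal sketch of the multiple-regression models described above (ACC BOLD signal for one emotion contrast regressed on neurotransmitter concentrations, age, and gender); the data table and column names are hypothetical:

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("mrs_bold.csv")  # hypothetical per-subject table
model = smf.ols("bold_sad_vs_shape ~ gaba_mdpfc + glu_mdpfc + age + C(gender)",
                data=df).fit()
print(model.summary())  # the sign of the GABA coefficient tests the reported effect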
Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder.
Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can
2017-01-01
Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD were slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results showed that individuals with IGD exhibited decreased amplitudes in the ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy-neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy-neutral expressions context, as well as to sad and neutral expressions in the sad-neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy-neutral expressions context.
• The present study investigated whether the unconscious processing of facial expressions is influenced by excessive online gaming. A validated backward masking paradigm was used to investigate whether individuals with Internet Gaming Disorder (IGD) and normal controls (NC) exhibit different patterns of facial expression processing.
• The results demonstrated that individuals with IGD respond differently to facial expressions compared with NC at a preattentive level. Behaviorally, individuals with IGD were slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results further showed (1) decreased amplitudes in the N170 component (an index of early face processing) in individuals with IGD when they processed neutral expressions compared with happy expressions in the happy-neutral expressions context, whereas the NC exhibited comparable N170 amplitudes in response to these two expressions; and (2) both the IGD and NC groups demonstrated similar N170 amplitudes in response to sad and neutral faces in the sad-neutral expressions context.
• The decreased N170 amplitudes to neutral faces relative to happy faces in individuals with IGD might be due to their lower expectancies for neutral content in the happy-neutral expressions context, whereas individuals with IGD may have no differing expectancies for neutral and sad faces in the sad-neutral expressions context.
Mapping the emotional face. How individual face parts contribute to successful emotion recognition
Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna
2017-01-01
Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have been frequently shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated on a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, allowing to visualize the importance of different face areas for each expression. Overall, observers were mostly relying on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed to group the expressions in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921
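One way to compute the per-tile "contribution to successful recognition" described above, under assumed data structures (a hypothetical matrix of which tiles had been uncovered at response time, and a per-trial correctness vector); the 48-tile count comes from the abstract, everything else is illustrative:

import numpy as np

def tile_diagnostic_value(uncovered, correct, n_tiles=48):
    """uncovered: (n_trials, n_tiles) bool; correct: (n_trials,) bool.
    Returns, per tile, accuracy when it was visible minus accuracy when hidden.
    Assumes each tile is visible on some trials and hidden on others."""
    uncovered = np.asarray(uncovered, bool)
    correct = np.asarray(correct, float)
    acc_with = np.array([correct[uncovered[:, t]].mean() for t in range(n_tiles)])
    acc_without = np.array([correct[~uncovered[:, t]].mean() for t in range(n_tiles)])
    return acc_with - acc_without  # higher = tile more useful for recognition

# the resulting values can be reshaped to the face grid and shown as a heat map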
Papazacharias, Apostolos; Taurisano, Paolo; Fazio, Leonardo; Gelao, Barbara; Di Giorgio, Annabella; Lo Bianco, Luciana; Quarto, Tiziana; Mancini, Marina; Porcelli, Annamaria; Romano, Raffaella; Caforio, Grazia; Todarello, Orlando; Popolizio, Teresa; Blasi, Giuseppe; Bertolino, Alessandro
2015-01-01
Earlier studies have demonstrated that emotional stimulation modulates attentional processing during goal-directed behavior and related activity of a brain network including the inferior frontal gyrus (IFG) and the caudate nucleus. However, it is not clear how emotional interference modulates behavior and brain physiology during variation in attentional control, a relevant question for everyday life situations in which both emotional stimuli and cognitive load vary. The aim of this study was to investigate the impact of negative emotions on behavior and activity in IFG and caudate nucleus during increasing levels of attentional control. Twenty-two healthy subjects underwent event-related functional magnetic resonance imaging while performing a task in which neutral or fearful facial expressions were displayed before stimuli eliciting increasing levels of attentional control processing. Results indicated slower reaction time (RT) and greater right IFG activity when fearful compared with neutral facial expressions preceded the low level of attentional control. On the other hand, fearful facial expressions preceding the intermediate level of attentional control elicited faster behavioral responses and greater activity in the right and left sides of the caudate. Finally, correlation analysis indicated a relationship between behavioral correlates of attentional control after emotional interference and right IFG activity. Altogether, these results suggest that the impact of negative emotions on attentional processing is differentially elicited at the behavioral and physiological levels as a function of cognitive load. PMID:25954172
Patterns of Emotion Experiences as Predictors of Facial Expressions of Emotion.
ERIC Educational Resources Information Center
Blumberg, Samuel H.; Izard, Carroll E.
1991-01-01
Examined the relations between emotion and facial expressions of emotion in 8- to 12-year-old male psychiatric patients. Results indicated that patterns or combinations of emotion experiences had an impact on facial expressions of emotion. (Author/BB)
Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo
2012-04-01
The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.
Quantifying facial expression recognition across viewing conditions.
Goren, Deborah; Wilson, Hugh R
2006-04-01
Facial expressions are key to social interactions and to assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when spatial frequency information in the stimulus was modified. The results show an emotion and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other.
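A minimal sketch of how the spatial-frequency content of a face stimulus can be restricted with a difference-of-Gaussians band-pass, one plausible way to realize a low/mid/high peak-frequency manipulation (the cutoffs, in pixels rather than cycles per degree, are hypothetical; the study's actual filtering method is not specified here):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass_face(img, sigma_fine, sigma_coarse):
    """Difference-of-Gaussians band-pass: keeps structure between two
    spatial scales (sigma_fine < sigma_coarse, in pixels)."""
    return gaussian_filter(img, sigma_fine) - gaussian_filter(img, sigma_coarse)

# Example: three bands with hypothetical cutoffs.
face = np.random.rand(256, 256)  # stand-in for a grayscale face image
low_band = bandpass_face(face, sigma_fine=16, sigma_coarse=32)   # low peak frequency
mid_band = bandpass_face(face, sigma_fine=4, sigma_coarse=16)    # mid peak frequency
high_band = bandpass_face(face, sigma_fine=1, sigma_coarse=4)    # high peak frequency
```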
Recognition of schematic facial displays of emotion in parents of children with autism.
Palermo, Mark T; Pasqualetti, Patrizio; Barbati, Giulia; Intelligente, Fabio; Rossini, Paolo Maria
2006-07-01
Performance on an emotional labeling task in response to schematic facial patterns representing five basic emotions without the concurrent presentation of a verbal category was investigated in 40 parents of children with autism and 40 matched controls. 'Autism fathers' performed worse than 'autism mothers', who performed worse than controls in decoding displays representing sadness or disgust. This indicates the need to include facial expression decoding tasks in genetic research of autism. In addition, emotional expression interactions between parents and their children with autism, particularly through play, where affect and prosody are 'physiologically' exaggerated, may stimulate development of social competence. Future studies could benefit from a combination of stimuli including photographs and schematic drawings, with and without associated verbal categories. This may allow the subdivision of patients and relatives on the basis of the amount of information needed to understand and process social-emotionally relevant information.
Hagan, Cindy C; Woods, Will; Johnson, Sam; Calder, Andrew J; Green, Gary G R; Young, Andrew W
2009-11-24
An influential neural model of face perception suggests that the posterior superior temporal sulcus (STS) is sensitive to those aspects of faces that produce transient visual changes, including facial expression. Other researchers note that recognition of expression involves multiple sensory modalities and suggest that the STS also may respond to crossmodal facial signals that change transiently. Indeed, many studies of audiovisual (AV) speech perception show STS involvement in AV speech integration. Here we examine whether these findings extend to AV emotion. We used magnetoencephalography to measure the neural responses of participants as they viewed and heard emotionally congruent fear and minimally congruent neutral face and voice stimuli. We demonstrate significant supra-additive responses (i.e., where AV > [unimodal auditory + unimodal visual]) in the posterior STS within the first 250 ms for emotionally congruent AV stimuli. These findings show a role for the STS in processing crossmodal emotive signals.
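The supra-additivity criterion itself is simple arithmetic; a minimal sketch, assuming baseline-corrected evoked amplitudes per sensor and time point (array shapes and sampling rate are hypothetical):

```python
import numpy as np

# Hypothetical evoked responses of shape (n_sensors, n_times) for the
# audiovisual (av), auditory-only (a), and visual-only (v) conditions.
def supra_additive_mask(av, a, v):
    """Boolean mask where the multisensory response exceeds the sum of
    the unimodal responses (AV > A + V)."""
    return av > (a + v)

rng = np.random.default_rng(0)
av, a, v = (rng.normal(size=(102, 300)) for _ in range(3))
mask = supra_additive_mask(av, a, v)
# e.g., proportion of supra-additive samples within the first 250 ms,
# assuming 1 kHz sampling (samples 0..249)
print(mask[:, :250].mean())
```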
Chen, Kuan-Hua; Lwi, Sandy J.; Hua, Alice Y.; Haase, Claudia M.; Miller, Bruce L.; Levenson, Robert W.
2017-01-01
Although laboratory procedures are designed to produce specific emotions, participants often experience mixed emotions (i.e., target and non-target emotions). We examined non-target emotions in patients with frontotemporal dementia (FTD), Alzheimer’s disease (AD), other neurodegenerative diseases, and healthy controls. Participants watched film clips designed to produce three target emotions. Subjective experience of non-target emotions was assessed and emotional facial expressions were coded. Compared to patients with other neurodegenerative diseases and healthy controls, FTD patients reported more positive and negative non-target emotions, whereas AD patients reported more positive non-target emotions. There were no group differences in facial expressions of non-target emotions. We interpret these findings as reflecting deficits in processing interoceptive and contextual information resulting from neurodegeneration in brain regions critical for creating subjective emotional experience. PMID:29457053
Preferential responses in amygdala and insula during presentation of facial contempt and disgust.
Sambataro, Fabio; Dimalta, Savino; Di Giorgio, Annabella; Taurisano, Paolo; Blasi, Giuseppe; Scarabino, Tommaso; Giannatempo, Giuseppe; Nardini, Marcello; Bertolino, Alessandro
2006-10-01
Some authors consider contempt to be a basic emotion while others consider it a variant of disgust. The neural correlates of contempt have not so far been specifically contrasted with disgust. Using functional magnetic resonance imaging (fMRI), we investigated the neural networks involved in the processing of facial contempt and disgust in 24 healthy subjects. Facial recognition of contempt was lower than that of disgust and of neutral faces. The imaging data indicated significant activity in the amygdala and in globus pallidus and putamen during processing of contemptuous faces. Bilateral insula and caudate nuclei and left as well as right inferior frontal gyrus were engaged during processing of disgusted faces. Moreover, direct comparisons of contempt vs. disgust yielded significantly different activations in the amygdala. On the other hand, disgusted faces elicited greater activation than contemptuous faces in the right insula and caudate. Our findings suggest preferential involvement of different neural substrates in the processing of facial emotional expressions of contempt and disgust.
Pistoia, Francesca; Carolei, Antonio; Sacco, Simona; Conson, Massimiliano; Pistarini, Caterina; Cazzulani, Benedetta; Stewart, Janet; Franceschini, Marco; Sarà, Marco
2015-12-15
There is much evidence to suggest that recognizing and sharing emotions with others require a first-hand experience of those emotions in our own body which, in turn, depends on the adequate perception of our own internal state (interoception) through preserved sensory pathways. Here we explored the contribution of interoception to first-hand emotional experiences and to the recognition of others' emotions. For this aim, 10 individuals with sensory deafferentation as a consequence of high spinal cord injury (SCI; five males and five females; mean age, 48 ± 14.8 years) and 20 healthy subjects matched for age, sex, and education were included in the study. Recognition of facial expressions and judgment of emotionally evocative scenes were investigated in both groups using the Ekman and Friesen set of Pictures of Facial Affect and the International Affective Picture System. A two-way mixed analysis of variance and post hoc comparisons were used to test differences among emotions and groups. Compared with healthy subjects, individuals with SCI, when asked to judge emotionally evocative scenes, had difficulties in judging their own emotional response to complex scenes eliciting fear and anger, while they were able to recognize the same emotions when conveyed by facial expressions. Our findings endorse a simulative view of emotional processing according to which the proper perception of our own internal state (interoception), through preserved sensory pathways, is crucial for first-hand experiences of the more primordial emotions, such as fear and anger.
Attention and memory bias to facial emotions underlying negative symptoms of schizophrenia.
Jang, Seon-Kyeong; Park, Seon-Cheol; Lee, Seung-Hwan; Cho, Yang Seok; Choi, Kee-Hong
2016-01-01
This study assessed bias in selective attention to facial emotions in negative symptoms of schizophrenia and its influence on subsequent memory for facial emotions. Thirty people with schizophrenia who had high and low levels of negative symptoms (n = 15, respectively) and 21 healthy controls completed a visual probe detection task investigating selective attention bias (happy, sad, and angry faces randomly presented for 50, 500, or 1000 ms). A yes/no incidental facial memory task was then completed. Attention bias scores and recognition errors were calculated. Those with high negative symptoms exhibited reduced attention to emotional faces relative to neutral faces; those with low negative symptoms showed the opposite pattern when faces were presented for 500 ms regardless of the valence. Compared to healthy controls, those with high negative symptoms made more errors for happy faces in the memory task. Reduced attention to emotional faces in the probe detection task was significantly associated with less pleasure and motivation and more recognition errors for happy faces in schizophrenia group only. Attention bias away from emotional information relatively early in the attentional process and associated diminished positive memory may relate to pathological mechanisms for negative symptoms.
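For reference, the attention bias score in a dot-probe task is conventionally the mean reaction-time difference between trials where the probe replaces the neutral face and trials where it replaces the emotional face; a minimal sketch under that convention (the study's exact scoring is not specified here):

```python
import numpy as np

def attention_bias_score(rt_probe_at_neutral, rt_probe_at_emotional):
    """Classic dot-probe bias score: positive values indicate attention
    toward the emotional face (faster when the probe replaces it)."""
    return np.mean(rt_probe_at_neutral) - np.mean(rt_probe_at_emotional)

# Hypothetical reaction times (ms) for one participant at one exposure duration
bias_happy = attention_bias_score([512, 498, 530], [470, 455, 490])
print(bias_happy)  # > 0: vigilance toward happy faces
```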
Facial emotion recognition ability: psychiatry nurses versus nurses from other departments.
Gultekin, Gozde; Kincir, Zeliha; Kurt, Merve; Catal, Yasir; Acil, Asli; Aydin, Aybike; Özcan, Mualla; Delikkaya, Busra N; Kacar, Selma; Emul, Murat
2016-12-01
Facial emotion recognition is a basic element in non-verbal communication. Although some researchers have shown that recognizing facial expressions may be important in the interaction between doctors and patients, there are no studies concerning facial emotion recognition in nurses. Here, we aimed to investigate facial emotion recognition ability in nurses and compare the abilities between nurses from psychiatry and other departments. In this cross-sectional study, sixty-seven nurses were divided into two groups according to their departments: psychiatry (n=31); and other departments (n=36). A Facial Emotion Recognition Test, constructed from a set of photographs from Ekman and Friesen's book "Pictures of Facial Affect", was administered to all participants. In the whole group, the highest mean accuracy rate was for recognizing happy facial expressions (99.14%), while the least accurately recognized facial expression was fear (47.71%). There were no significant differences between the two groups in mean accuracy rates for recognizing happy, sad, fearful, angry, and surprised facial expressions (for all, p>0.05). The ability to recognize disgusted and neutral facial emotions tended to be better in nurses from other departments than in psychiatry nurses (p=0.052 and p=0.053, respectively). Conclusion: This study was the first to reveal no difference in facial emotion recognition ability between psychiatry nurses and non-psychiatry nurses. In medical education curricula throughout the world, no specific training program is scheduled for recognizing emotional cues of patients. We consider that improving the ability to recognize facial emotion expressions in medical staff might be beneficial in reducing inappropriate patient-staff interactions.
Schultebraucks, Katharina; Deuter, Christian E; Duesenberg, Moritz; Schulze, Lars; Hellmann-Regen, Julian; Domke, Antonia; Lockenvitz, Lisa; Kuehl, Linn K; Otte, Christian; Wingenfeld, Katja
2016-09-01
Selective attention toward emotional cues and emotion recognition of facial expressions are important aspects of social cognition. Stress modulates social cognition through cortisol, which acts on glucocorticoid (GR) and mineralocorticoid receptors (MR) in the brain. We examined the role of MR activation on attentional bias toward emotional cues and on emotion recognition. We included 40 healthy young women and 40 healthy young men (mean age 23.9 ± 3.3), who received either 0.4 mg of the MR agonist fludrocortisone or placebo. A dot-probe paradigm was used to test for attentional biases toward emotional cues (happy and sad faces). Moreover, we used a facial emotion recognition task to investigate the ability to recognize emotional valence (anger and sadness) from facial expression in four graded categories of emotional intensity (20, 30, 40, and 80%). In the emotional dot-probe task, we found a main effect of treatment and a treatment × valence interaction. Post hoc analyses revealed an attentional bias away from sad faces after placebo intake and, after fludrocortisone, a shift in selective attention toward sad faces relative to placebo. We found no attentional bias toward happy faces after fludrocortisone or placebo intake. In the facial emotion recognition task, there was no main effect of treatment. MR stimulation seems to be important in modulating quick, automatic emotional processing, i.e., a shift in selective attention toward negative emotional cues. Our results confirm and extend previous findings of MR function. However, we did not find an effect of MR stimulation on emotion recognition.
The Development of Emotional Face Processing during Childhood
ERIC Educational Resources Information Center
Batty, Magali; Taylor, Margot J.
2006-01-01
Our facial expressions give others the opportunity to access our feelings, and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal…
Peschard, Virginie; Philippot, Pierre; Joassin, Frédéric; Rossignol, Mandy
2013-04-01
Social anxiety has been characterized by an attentional bias towards threatening faces. Electrophysiological studies have demonstrated modulations of cognitive processing from 100 ms after stimulus presentation. However, the impact of the stimulus features and task instructions on facial processing remains unclear. Event-related potentials were recorded while high and low socially anxious individuals performed an adapted Stroop paradigm that included a colour-naming task with non-emotional stimuli, an emotion-naming task (the explicit task) and a colour-naming task (the implicit task) on happy, angry and neutral faces. Whereas the impact of task factors was examined by contrasting an explicit and an implicit emotional task, the effects of perceptual changes on facial processing were explored by including upright and inverted faces. The findings showed an enhanced P1 in social anxiety during the three tasks, without a moderating effect of the type of task or stimulus. These results suggest a global modulation of attentional processing in performance situations. Copyright © 2013 Elsevier B.V. All rights reserved.
Tryptophan depletion decreases the recognition of fear in female volunteers.
Harmer, C J; Rogers, R D; Tunbridge, E; Cowen, P J; Goodwin, G M
2003-06-01
Serotonergic processes have been implicated in the modulation of fear conditioning in humans, postulated to occur at the level of the amygdala. The processing of other fear-relevant cues, such as facial expressions, has also been associated with amygdala function, but an effect of serotonin depletion on these processes has not been assessed. The present study investigated the effects of reducing serotonin function, using acute tryptophan depletion, on the recognition of basic facial expressions of emotions in healthy male and female volunteers. A double-blind between-groups design was used, with volunteers being randomly allocated to receive an amino acid drink specifically lacking tryptophan or a control mixture containing a balanced mixture of these amino acids. Participants were given a facial expression recognition task 5 h after drink administration. This task featured examples of six basic emotions (fear, anger, disgust, surprise, sadness and happiness) that had been morphed between each full emotion and neutral in 10% steps. As a control, volunteers were given a famous face classification task matched in terms of response selection and difficulty level. Tryptophan depletion significantly impaired the recognition of fearful facial expressions in female, but not male, volunteers. This was specific since recognition of other basic emotions was comparable in the two groups. There was also no effect of tryptophan depletion on the classification of famous faces or on subjective state ratings of mood or anxiety. These results confirm a role for serotonin in the processing of fear related cues, and in line with previous findings also suggest greater effects of tryptophan depletion in female volunteers. Although acute tryptophan depletion does not typically affect mood in healthy subjects, the present results suggest that subtle changes in the processing of emotional material may occur with this manipulation of serotonin function.
Modulation of α power and functional connectivity during facial affect recognition.
Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte; Weisz, Nathan
2013-04-03
Research has linked oscillatory activity in the α frequency range, particularly in sensorimotor cortex, to processing of social actions. Results further suggest involvement of sensorimotor α in the processing of facial expressions, including affect. The sensorimotor face area may be critical for perception of emotional face expression, but the role it plays is unclear. The present study sought to clarify how oscillatory brain activity contributes to or reflects processing of facial affect during changes in facial expression. Neuromagnetic oscillatory brain activity was monitored while 30 volunteers viewed videos of human faces that changed their expression from neutral to fearful, neutral, or happy expressions. Induced changes in α power during the different morphs, source analysis, and graph-theoretic metrics served to identify the role of α power modulation and cross-regional coupling by means of phase synchrony during facial affect recognition. Changes from neutral to emotional faces were associated with a 10-15 Hz power increase localized in bilateral sensorimotor areas, together with occipital power decrease, preceding reported emotional expression recognition. Graph-theoretic analysis revealed that, in the course of a trial, the balance between sensorimotor power increase and decrease was associated with decreased and increased transregional connectedness as measured by node degree. Results suggest that modulations in α power facilitate early registration, with sensorimotor cortex including the sensorimotor face area largely functionally decoupled and thereby protected from additional, disruptive input and that subsequent α power decrease together with increased connectedness of sensorimotor areas facilitates successful facial affect recognition.
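A minimal sketch of the node-degree metric used in such graph-theoretic analyses, assuming a symmetric phase-locking value (PLV) matrix and an illustrative threshold (both hypothetical, not the study's actual parameters):

```python
import numpy as np

def node_degree(plv, threshold=0.6):
    """Node degree from a phase-locking value (PLV) matrix: count of
    supra-threshold connections per region, ignoring self-connections."""
    plv = np.asarray(plv)
    adj = (plv >= threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj.sum(axis=1)

# Hypothetical symmetric PLV matrix for 10 sources
rng = np.random.default_rng(1)
m = rng.random((10, 10))
plv = (m + m.T) / 2
print(node_degree(plv))  # connectedness per region
```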
Objectifying facial expressivity assessment of Parkinson's patients: preliminary study.
Wu, Peng; Gonzalez, Isabel; Patsis, Georgios; Jiang, Dongmei; Sahli, Hichem; Kerckhofs, Eric; Vandekerckhove, Marie
2014-01-01
Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as "facial masking," a symptom in which facial muscles become rigid. To improve clinical assessment of facial expressivity in PD, this work attempts to quantify dynamic facial expressivity (facial activity) by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To elicit spontaneous facial expressions that resemble those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were induced using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports. Reported emotion intensity was significantly higher for disgust than for the other emotions. We therefore focused our analysis on the data recorded while participants watched the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences were also observed between PD patients at different stages of disease progression.
Influence of gender in the recognition of basic facial expressions: A critical literature review
Forni-Santos, Larissa; Osório, Flávia L
2015-01-01
AIM: To conduct a systematic literature review about the influence of gender on the recognition of facial expressions of six basic emotions. METHODS: We made a systematic search with the search terms (face OR facial) AND (processing OR recognition OR perception) AND (emotional OR emotion) AND (gender or sex) in PubMed, PsycINFO, LILACS, and SciELO electronic databases for articles assessing outcomes related to response accuracy and latency and emotional intensity. The articles selection was performed according to parameters set by COCHRANE. The reference lists of the articles found through the database search were checked for additional references of interest. RESULTS: In respect to accuracy, women tend to perform better than men when all emotions are considered as a set. Regarding specific emotions, there seems to be no gender-related differences in the recognition of happiness, whereas results are quite heterogeneous in respect to the remaining emotions, especially sadness, anger, and disgust. Fewer articles dealt with the parameters of response latency and emotional intensity, which hinders the generalization of their findings, especially in the face of their methodological differences. CONCLUSION: The analysis of the studies conducted to date do not allow for definite conclusions concerning the role of the observer’s gender in the recognition of facial emotion, mostly because of the absence of standardized methods of investigation. PMID:26425447
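For illustration, the stated search term could be run programmatically against PubMed, e.g., with Biopython's Entrez interface (a sketch; the email address and retmax are placeholders, and the original search also covered PsycINFO, LILACS, and SciELO, which this interface does not reach):

```python
from Bio import Entrez  # Biopython

Entrez.email = "you@example.org"  # required by NCBI; placeholder
query = ('(face OR facial) AND (processing OR recognition OR perception) '
         'AND (emotional OR emotion) AND (gender OR sex)')
handle = Entrez.esearch(db="pubmed", term=query, retmax=100)
record = Entrez.read(handle)
handle.close()
print(record["Count"], record["IdList"][:5])  # hit count and first PMIDs
```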
Motor signatures of emotional reactivity in frontotemporal dementia.
Marshall, Charles R; Hardy, Chris J D; Russell, Lucy L; Clark, Camilla N; Bond, Rebecca L; Dick, Katrina M; Brotherhood, Emilie V; Mummery, Cath J; Schott, Jonathan M; Rohrer, Jonathan D; Kilner, James M; Warren, Jason D
2018-01-18
Automatic motor mimicry is essential to the normal processing of perceived emotion, and disrupted automatic imitation might underpin socio-emotional deficits in neurodegenerative diseases, particularly the frontotemporal dementias. However, the pathophysiology of emotional reactivity in these diseases has not been elucidated. We studied facial electromyographic responses during emotion identification on viewing videos of dynamic facial expressions in 37 patients representing canonical frontotemporal dementia syndromes versus 21 healthy older individuals. Neuroanatomical associations of emotional expression identification accuracy and facial muscle reactivity were assessed using voxel-based morphometry. Controls showed characteristic profiles of automatic imitation, and this response predicted correct emotion identification. Automatic imitation was reduced in the behavioural and right temporal variant groups, while the normal coupling between imitation and correct identification was lost in the right temporal and semantic variant groups. Grey matter correlates of emotion identification and imitation were delineated within a distributed network including primary visual and motor, prefrontal, insular, anterior temporal and temporo-occipital junctional areas, with common involvement of supplementary motor cortex across syndromes. Impaired emotional mimesis may be a core mechanism of disordered emotional signal understanding and reactivity in frontotemporal dementia, with implications for the development of novel physiological biomarkers of socio-emotional dysfunction in these diseases.
Öztürk, Ahmet; Kiliç, Alperen; Deveci, Erdem; Kirpinar, İsmet
2016-01-01
Background The concept of facial emotion recognition is well established in various neuropsychiatric disorders. Although emotional disturbances are strongly associated with somatoform disorders, there are a restricted number of studies that have investigated facial emotion recognition in somatoform disorders. Furthermore, there have been no studies that have regarded this issue using the new diagnostic criteria for somatoform disorders as somatic symptoms and related disorders (SSD). In this study, we aimed to compare the factors of facial emotion recognition between patients with SSD and age- and sex-matched healthy controls (HC) and to retest and investigate the factors of facial emotion recognition using the new criteria for SSD. Patients and methods After applying the inclusion and exclusion criteria, 54 patients who were diagnosed with SSD according to the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) criteria and 46 age- and sex-matched HC were selected to participate in the present study. Facial emotion recognition, alexithymia, and the status of anxiety and depression were compared between the groups. Results Patients with SSD had significantly decreased scores of facial emotion for fear faces, disgust faces, and neutral faces compared with age- and sex-matched HC (t=−2.88, P=0.005; t=−2.86, P=0.005; and t=−2.56, P=0.009, respectively). After eliminating the effects of alexithymia and depressive and anxious states, the groups were found to be similar in terms of their responses to facial emotion and mean reaction time to facial emotions. Discussion Although there have been limited numbers of studies that have examined the recognition of facial emotion in patients with somatoform disorders, our study is the first to investigate facial recognition in patients with SSD diagnosed according to the DSM-5 criteria. Recognition of facial emotion was found to be disturbed in patients with SSD. However, our findings suggest that disturbances in facial recognition were significantly associated with alexithymia and the status of depression and anxiety, which is consistent with the previous studies. Further studies are needed to highlight the associations between facial emotion recognition and SSD. PMID:27199559
Tanzer, Michal; Shahar, Golan; Avidan, Galia
2014-01-01
The aim of the proposed theoretical model is to illuminate personal and interpersonal resilience by drawing from the field of emotional face perception. We suggest that perception/recognition of emotional facial expressions serves as a central link between subjective, self-related processes and the social context. Emotional face perception constitutes a salient social cue underlying interpersonal communication and behavior. Because problems in communication and interpersonal behavior underlie most, if not all, forms of psychopathology, it follows that perception/recognition of emotional facial expressions impacts psychopathology. The ability to accurately interpret one’s facial expression is crucial in subsequently deciding on an appropriate course of action. However, perception in general, and of emotional facial expressions in particular, is highly influenced by individuals’ personality and the self-concept. Herein we briefly outline well-established theories of personal and interpersonal resilience and link them to the neuro-cognitive basis of face perception. We then describe the findings of our ongoing program of research linking two well-established resilience factors, general self-efficacy (GSE) and perceived social support (PSS), with face perception. We conclude by pointing out avenues for future research focusing on possible genetic markers and patterns of brain connectivity associated with the proposed model. Implications of our integrative model to psychotherapy are discussed. PMID:25165439
The association between PTSD and facial affect recognition.
Williams, Christian L; Milanak, Melissa E; Judah, Matt R; Berenbaum, Howard
2018-05-05
The major aims of this study were to examine how, if at all, having higher levels of PTSD would be associated with performance on a facial affect recognition task in which facial expressions of emotion are superimposed on emotionally valenced, non-face images. College students with trauma histories (N = 90) completed a facial affect recognition task as well as measures of exposure to traumatic events, and PTSD symptoms. When the face and context matched, participants with higher levels of PTSD were significantly more accurate. When the face and context were mismatched, participants with lower levels of PTSD were more accurate than were those with higher levels of PTSD. These findings suggest that PTSD is associated with how people process affective information. Furthermore, these results suggest that the enhanced attention of people with higher levels of PTSD to affective information can be either beneficial or detrimental to their ability to accurately identify facial expressions of emotion. Limitations, future directions and clinical implications are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
Kuin, Niki C.; Masthoff, Erik D. M.; Munafò, Marcus R.; Penton-Voak, Ian S.
2017-01-01
Research into the causal and perpetuating factors influencing aggression has partly focused on the general tendency of aggression-prone individuals to infer hostile intent in others, even in ambiguous circumstances. This is referred to as the ‘hostile interpretation bias’. Whether this hostile interpretation bias also exists in basal information processing, such as perception of facial emotion, is not yet known, especially with respect to the perception of ambiguous expressions. In addition, little is known about how this potential bias in facial emotion perception is related to specific characteristics of aggression. In the present study, conducted in a penitentiary setting with detained male adults, we investigated if violent offenders (n = 71) show a stronger tendency to interpret ambiguous facial expressions on a computer task as angry rather than happy, compared to non-violent offenders (n = 14) and to a control group of healthy volunteers (n = 32). We also investigated if hostile perception of facial expressions is related to specific characteristics of aggression, such as proactive and reactive aggression. No clear statistical evidence was found that violent offenders perceived facial emotional expressions as more angry than non-violent offenders or healthy volunteers. A regression analysis in the violent offender group showed that only age and a self-report measure of hostility predicted outcome on the emotion perception task. Other traits, such as psychopathic traits, intelligence, attention and a tendency to jump to conclusions were not associated with interpretation of anger in facial emotional expressions. We discuss the possible impact of the study design and population studied on our results, as well as implications for future studies. PMID:29190802
Balconi, Michela; Pala, Francesca; Manenti, Rosa; Brambilla, Michela; Cobelli, Chiara; Rosini, Sandra; Benussi, Alberto; Padovani, Alessandro; Borroni, Barbara; Cotelli, Maria
2016-08-11
Emotional deficits are part of the non-motor features of Parkinson's disease, but little attention has been paid to specific aspects such as subjective emotional experience and autonomic responses. This study aimed to investigate the mechanisms of emotional recognition in Parkinson's disease (PD) at the following levels: explicit evaluation of emotions (Self-Assessment Manikin) and implicit reactivity (Skin Conductance Response; electromyographic measure of facial feedback of the zygomaticus and corrugator muscles). 20 PD patients and 34 healthy controls were required to observe and evaluate affective pictures while physiological parameters were recorded. In PD, the appraisal process for both valence and arousal features of emotional cues was preserved, but we found significant impairment in autonomic responses. Specifically, in comparison to healthy controls, PD patients revealed lower Skin Conductance Response values to negative and high-arousal emotional stimuli. In addition, the electromyographic measures showed defective responses limited exclusively to the negative and high-arousal emotional category: PD patients did not show an increase in corrugator activity in response to negative emotions, as healthy controls did. PD subjects responded inadequately to the emotional categories considered most "salient": they had a preserved appraisal process, but an impaired automatic ability to distinguish between different emotional contexts.
ERIC Educational Resources Information Center
Huggenberger, Harriet J.; Suter, Susanne E.; Reijnen, Ester; Schachinger, Hartmut
2009-01-01
Women's cradling side preference has been related to contralateral hemispheric specialization of processing emotional signals; but not of processing baby's facial expression. Therefore, 46 nulliparous female volunteers were characterized as left or non-left holders (HG) during a doll holding task. During a signal detection task they were then…
How does context affect assessments of facial emotion? The role of culture and age
Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara
2010-01-01
People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. PMID:21038967
Automatic Facial Expression Recognition and Operator Functional State
NASA Technical Reports Server (NTRS)
Blanson, Nina
2011-01-01
The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.
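A minimal sketch in the spirit of the described program, using OpenCV's stock Haar cascades to locate the face and eyes in a live video stream (the cascade choice and drawing are illustrative; the actual NASA program's code is not reproduced here):

```python
import cv2

# Stock Haar cascades shipped with the opencv-python package
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces, then eyes within each face region
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            cv2.rectangle(frame, (x + ex, y + ey),
                          (x + ex + ew, y + ey + eh), (255, 0, 0), 1)
    cv2.imshow("landmarks", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```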
Age-Related Changes in the Processing of Emotional Faces in a Dual-Task Paradigm.
Casares-Guillén, Carmen; García-Rodríguez, Beatriz; Delgado, Marisa; Ellgring, Heiner
2016-01-01
Background/Study Context: Age-related changes appear to affect the ability to identify emotional facial expressions in dual-task conditions (i.e., while simultaneously performing a second visual task). The level of interference generated by the secondary task depends on the phase of emotional processing affected by the interference and the nature of the secondary task. The aim of the present study was to investigate the effect of these variables on age-related changes in the processing of emotional faces. The identification of emotional facial expressions (EFEs) was assessed in a dual-task paradigm using the following variables: (a) the phase during which interference was applied (encoding vs. retrieval phase); and (b) the nature of the interfering stimulus (visuospatial vs. verbal). The sample population consisted of 24 healthy older adults (mean age = 75.38) and 40 younger adults (mean age = 26.90). The accuracy of EFE identification was calculated for all experimental conditions. Consistent with our hypothesis, the performance of the older group was poorer than that of the younger group in all experimental conditions. Dual-task performance was poorer when the interference occurred during the encoding phase of emotional face processing and when both tasks were of the same nature (i.e., when the experimental condition was more demanding in terms of attention). These results provide empirical evidence of age-related deficits in the identification of emotional facial expressions, which may be partially explained by the impairment of cognitive resources specific to this task. These findings may account for the difficulties experienced by the elderly during social interactions that require the concomitant processing of emotional and environmental information.
Cognitive and physiological markers of emotional awareness in chimpanzees (Pan troglodytes).
Parr, L A
2001-11-01
The ability to understand emotion in others is one of the most important factors involved in regulating social interactions in primates. Such emotional awareness functions to coordinate activity among group members, enable the formation of long-lasting individual relationships, and facilitate the pursuit of shared interests. Despite these important evolutionary implications, comparative studies of emotional processing in humans and great apes are practically nonexistent, constituting a major gap in our understanding of the extent to which emotional awareness has played an important role in shaping human behavior and societies. This paper presents the results of two experiments that examine chimpanzees' responses to emotional stimuli. First, changes in peripheral skin temperature were measured while subjects viewed three categories of emotionally negative video scenes: conspecifics being injected with needles (INJ), darts and needles alone (DART), and conspecifics directing agonism towards the veterinarians (CHASE). Second, chimpanzees were required to use facial expressions to categorize emotional video scenes, i.e., favorite food and objects and veterinarian procedures, according to their positive and negative valence. With no prior training, subjects spontaneously matched the emotional videos to conspecific facial expressions according to their shared emotional meaning, indicating that chimpanzee facial expressions are processed emotionally, as are human expressions. Decreases in peripheral skin temperature, indicative of negative sympathetic arousal, were significantly larger when subjects viewed the INJ and DART videos than when they viewed the CHASE videos, indicating greater negative arousal when viewing conspecifics being injected with needles, and needles themselves, than when viewing conspecifics engaged in general agonism.
Carroll, Erin M A; Kamboj, Sunjeev K; Conroy, Laura; Tookman, Adrian; Williams, Amanda C de C; Jones, Louise; Morgan, Celia J A; Curran, H Valerie
2011-06-01
As a multidimensional phenomenon, pain is influenced by various psychological factors. One such factor is catastrophizing, which is associated with higher pain intensity and emotional distress in cancer and noncancer pain. One possibility is that catastrophizing represents a general cognitive style that preferentially supports the processing of negative affective stimuli. Such preferential processing of threat (toward negative facial expressions, for example) is seen in emotional disorders and is sensitive to pharmacological treatment. Whether pharmacological (analgesic) treatment might also influence the processing of threat in pain patients is currently unclear. This study investigates the effects of catastrophizing on the processing of facial affect in those receiving an acute opioid dose. In a double-blind crossover design, the performance of 20 palliative care patients after their usual dose of immediate-release opioid was compared with their performance following matched-placebo administration on a facial affect recognition task (speed and accuracy) and a threat-pain estimation task (ratings of pain intensity). The influence of catastrophizing was examined by splitting the sample according to scores on the Pain Catastrophizing Scale (PCS). Opioid administration had no effect on facial affect processing compared with placebo. However, the main finding was that enhanced processing of fear, sadness, and disgust was found only in patients who scored highly on the PCS. There was no difference in performance between the two PCS groups on the other emotions (i.e., happiness, surprise, and anger). These findings suggest that catastrophizing is associated with an affective information-processing bias in patients with severe pain conditions. Copyright © 2011 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
Agency and facial emotion judgment in context.
Ito, Kenichi; Masuda, Takahiko; Li, Liman Man Wai
2013-06-01
Past research showed that East Asians' belief in holism was expressed as their tendencies to include background facial emotions into the evaluation of target faces more than North Americans. However, this pattern can be interpreted as North Americans' tendency to downplay background facial emotions due to their conceptualization of facial emotion as volitional expression of internal states. Examining this alternative explanation, we investigated whether different types of contextual information produce varying degrees of effect on one's face evaluation across cultures. In three studies, European Canadians and East Asians rated the intensity of target facial emotions surrounded with either affectively salient landscape sceneries or background facial emotions. The results showed that, although affectively salient landscapes influenced the judgment of both cultural groups, only European Canadians downplayed the background facial emotions. The role of agency as differently conceptualized across cultures and multilayered systems of cultural meanings are discussed.
Dalkıran, Mihriban; Tasdemir, Akif; Salihoglu, Tamer; Emul, Murat; Duran, Alaattin; Ugur, Mufit; Yavuz, Ruhi
2017-09-01
People with schizophrenia have impairments in emotion recognition along with other social cognitive deficits. In the current study, we aimed to investigate the immediate benefits of ECT on facial emotion recognition ability. Thirty-two treatment-resistant patients with schizophrenia who had been indicated for ECT enrolled in the study. Facial emotion stimuli were a set of 56 photographs that depicted seven basic emotions: sadness, anger, happiness, disgust, surprise, fear, and neutral faces. The average age of the participants was 33.4 ± 10.5 years. The rate of recognizing the disgusted facial expression increased significantly after ECT (p < 0.05), and no significant changes were found for the other facial expressions (p > 0.05). After ECT, response times to the fearful and happy facial expressions were significantly shorter (p < 0.05). Facial emotion recognition is an important social cognitive skill for social harmony, proper relationships, and independent living. At the least, ECT does not seem to affect facial emotion recognition ability negatively, and it appears to improve identification of the disgusted facial emotion, which is related to dopamine-enriched regions in the brain.
Chiranjeevi, Pojala; Gopalakrishnan, Viswanath; Moogi, Pratibha
2015-09-01
Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, and so on, in a limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage and bypassing those frames in emotion classification would save computational power. In this paper, we propose a light-weight neutral versus emotion classification engine, which acts as a pre-processor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at key emotion (KE) points using a statistical texture model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on a statistical texture model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.
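A minimal sketch of the gating idea, with normalized cross-correlation standing in for the paper's statistical texture model (the patch representation, function names, and threshold are hypothetical):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def looks_neutral(frame_patches, neutral_templates, sim_threshold=0.8):
    """Neutral-vs-emotion gate: declare the frame neutral when patches
    around the key emotion (KE) points still match the stored neutral
    appearance, so the heavier emotion classifier can be skipped."""
    sims = [ncc(p, t) for p, t in zip(frame_patches, neutral_templates)]
    return np.mean(sims) >= sim_threshold
```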
When Early Experiences Build a Wall to Others’ Emotions: An Electrophysiological and Autonomic Study
Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Sestito, Mariateresa; Ravera, Roberto; Gallese, Vittorio
2013-01-01
Facial expression of emotions is a powerful vehicle for communicating information about others’ emotional states and it normally induces facial mimicry in the observers. The aim of this study was to investigate if early aversive experiences could interfere with emotion recognition, facial mimicry, and with the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. We recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of autonomic system promoting social behaviors and predisposition, in response to the observation of facial expressions of emotions. Results showed an over-attribution of anger, and reduced EMG responses during the observation of both positive and negative expressions only among street-boys. Street-boys also showed lower RSA after observation of facial expressions and ineffective RSA suppression during presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors inducing lower social predisposition after the visualization of facial expressions and an ineffective recruitment of defensive behavior in response to non-threatening expressions. PMID:23593374
Thomas, Laura A.; Brotman, Melissa A.; Muhrer, Eli M.; Rosen, Brooke H.; Bones, Brian L.; Reynolds, Richard C.; Deveney, Christen; Pine, Daniel S.; Leibenluft, Ellen
2012-01-01
Context Youth with bipolar disorder (BD) and those with severe, non-episodic irritability (severe mood dysregulation, SMD) show amygdala dysfunction during face emotion processing. However, studies have not compared such patients to each other and to comparison subjects in neural responsiveness to subtle changes in face emotion; the ability to process such changes is important for social cognition. We employed a novel parametrically designed faces paradigm. Objective Using a parametrically morphed emotional faces task, we compared activation in the amygdala and across the brain in BD, SMD, and healthy volunteers (HV). Design Case-control study. Setting Government research institute. Participants 57 youths (19 BD, 15 SMD, 23 HV). Main Outcome Measure Blood oxygenated level dependent (BOLD) data. Neutral faces were morphed with angry and happy faces in 25% intervals; static face stimuli appeared for 3000ms. Subjects performed hostility or non-emotional facial feature (i.e., nose width) ratings. Slope of BOLD activity was calculated across neutral-to-angry (N→A) and neutral-to-happy (N→H) face stimuli. Results In HV, but not BD or SMD, there was a positive association between left amygdala activity and anger on the face. In the N→H whole brain analysis, BD and SMD modulated parietal, temporal, and medial-frontal areas differently from each other and from HV; with increasing facial-happiness, SMD increased, while BD decreased, activity in parietal, temporal, and frontal regions. Conclusions Youth with BD or SMD differ from HV in modulation of amygdala activity in response to small changes in facial anger displays. In contrast, BD and SMD show distinct perturbations in regions mediating attention and face processing in association with changes in the emotional intensity of facial happiness displays. These findings demonstrate similarities and differences in the neural correlates of face emotion processing in BD and SMD, suggesting these distinct clinical presentations may reflect differing pathologies along a mood disorders spectrum. PMID:23026912
ERIC Educational Resources Information Center
Balconi, Michela; Carrera, Alba
2007-01-01
The paper explored conceptual and lexical skills with regard to emotional correlates of facial stimuli and scripts. In two different experimental phases normal and autistic children observed six facial expressions of emotions (happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions). In…
Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.
2015-01-01
Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…
Face-to-face: Perceived personal relevance amplifies face processing
Pittig, Andre; Schupp, Harald T.; Alpers, Georg W.
2017-01-01
Abstract The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. In order to model a group situation, two faces displaying happy, neutral or angry expressions were presented. Importantly, faces were either facing the observer, or they were presented in profile view directed towards, or looking away from, each other. In Experiment 1 (n = 64), face pairs were rated regarding perceived relevance, wish-to-interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral) and face orientation (facing observer > towards > away), and interactions showed that the evaluation of emotional faces varies strongly with their orientation. Experiment 2 (n = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer, conveyed by facial expression and face direction, amplifies emotional face processing within triadic group situations. PMID:28158672
Musical chords and emotion: major and minor triads are processed for emotion.
Bakker, David Radford; Martin, Frances Heritage
2015-03-01
Musical chords are arguably the smallest building blocks of music that retain emotional information. Major chords are generally perceived as positive-sounding and minor chords as negative-sounding, but there has been debate concerning how early these emotional connotations are processed. To investigate this, emotional facial stimuli and musical chord stimuli were simultaneously presented to participants, and facilitation of processing was measured via event-related potential (ERP) amplitudes. Decreased amplitudes of the P1 and N2 ERP components have been found to index the facilitation of early processing. If simultaneously presented musical chords and facial stimuli are perceived at early stages as belonging to the same emotional category, then early processing should be facilitated for these congruent pairs, and ERP amplitudes should therefore be decreased compared to the incongruent pairs. ERPs were recorded from 30 musically naive participants as they viewed happy, sad, and neutral faces presented simultaneously with a major or minor chord. When faces and chords contained congruent emotional information (happy-major or sad-minor), processing was facilitated, as indexed by decreased N2 ERP amplitudes. This suggests that musical chords possess emotional connotations that can be processed as early as 200 ms in naive listeners, and that the negative connotations of minor triads and positive connotations of major triads are deeply connected emotional meanings rather than superficially attributed ones.
ERIC Educational Resources Information Center
Rellecke, Julian; Palazova, Marina; Sommer, Werner; Schacht, Annekathrin
2011-01-01
The degree to which emotional aspects of stimuli are processed automatically is controversial. Here, we assessed the automatic elicitation of emotion-related brain potentials (ERPs) to positive, negative, and neutral words and facial expressions in an easy and superficial face-word discrimination task, for which the emotional valence was…
On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information
NASA Astrophysics Data System (ADS)
Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.
Towards realizing a multimodal affect recognition system, we consider the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy of affective user models. Specifically, we present and discuss the development and evaluation of two corresponding affect recognition subsystems, with emphasis on the recognition of six basic emotional states, namely happiness, sadness, surprise, anger, and disgust, as well as the emotion-less state, which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information, and that the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.
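The abstract does not specify the fusion rule; a minimal sketch of one common option, weighted late fusion of per-modality class probabilities, is shown below. The weight, label list, and function name are assumptions for illustration only.

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "surprise", "anger", "disgust", "neutral"]

def fuse_predictions(p_visual, p_keyboard, w_visual=0.6):
    """Weighted late fusion of per-class probabilities from two modalities."""
    p = w_visual * np.asarray(p_visual) + (1.0 - w_visual) * np.asarray(p_keyboard)
    return EMOTIONS[int(np.argmax(p))]

# Example: the visual channel favours happiness, the keyboard channel surprise.
print(fuse_predictions([0.5, 0.1, 0.1, 0.1, 0.1, 0.1],
                       [0.2, 0.1, 0.4, 0.1, 0.1, 0.1]))  # -> happiness
```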
2014-01-01
Background Alexithymia is a personality trait that is characterized by difficulties in identifying and describing feelings. Previous studies have shown that alexithymia is related to problems in recognizing others’ emotional facial expressions when these are presented with temporal constraints. These problems can be less severe when the expressions are visible for a relatively long time. Because the neural correlates of these recognition deficits are still relatively unexplored, we investigated the labeling of facial emotions and brain responses to facial emotions as a function of alexithymia. Results Forty-eight healthy participants had to label the emotional expression (angry, fearful, happy, or neutral) of faces presented for 1 or 3 seconds in a forced-choice format while undergoing functional magnetic resonance imaging. The participants’ level of alexithymia was assessed using self-report and interview. In light of the previous findings, we focused our analysis on the alexithymia component of difficulties in describing feelings. Difficulties describing feelings, as assessed by the interview, were associated with increased reaction times for negative (i.e., angry and fearful) faces, but not with labeling accuracy. Moreover, individuals with higher alexithymia showed increased brain activation in the somatosensory cortex and supplementary motor area (SMA) in response to angry and fearful faces. These cortical areas are known to be involved in the simulation of the bodily (motor and somatosensory) components of facial emotions. Conclusion The present data indicate that alexithymic individuals may use information related to bodily actions rather than affective states to understand the facial expressions of other persons. PMID:24629094
Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R.
2015-01-01
Scherer’s Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined facial muscle activity changes (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli which presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiments 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms) and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region), suggesting amplified goal conduciveness effects when power was high, in contrast to invariant goal conduciveness effects when power was low. An interaction of goal conduciveness and control appraisals was also found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that the corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region is more influenced by coping implications. Our results provide the first evidence demonstrating that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect facial expressions dynamically over time, immediately after an event is perceived. In addition, our results provide further indications for the chronography of appraisal-driven facial movements and the underlying cognitive processes. PMID:26295338
Lateralization for Processing Facial Emotions in Gay Men, Heterosexual Men, and Heterosexual Women.
Rahman, Qazi; Yusuf, Sifat
2015-07-01
This study tested whether male sexual orientation and gender nonconformity influenced functional cerebral lateralization for the processing of facial emotions. We also tested for the effects of sex of poser and emotion displayed on putative differences. Thirty heterosexual men, 30 heterosexual women, and 40 gay men completed measures of demographic variables, recalled childhood gender nonconformity (CGN), IQ, and the Chimeric Faces Test (CFT). The CFT depicts vertically split chimeric faces, formed with one half showing a neutral expression and the other half showing an emotional expression and performance is measured using a "laterality quotient" (LQ) score. We found that heterosexual men were significantly more right-lateralized when viewing female faces compared to heterosexual women and gay men, who did not differ significantly from each other. Heterosexual women and gay men were more left-lateralized for processing female faces. There were no significant group differences in lateralization for male faces. These results remained when controlling for age and IQ scores. There was no significant effect of CGN on LQ scores. These data suggest that gay men are feminized in some aspects of functional cerebral lateralization for facial emotion. The results were discussed in relation to the selectivity of functional lateralization and putative brain mechanisms underlying sexual attraction towards opposite-sex and same-sex targets.
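Laterality quotients of this kind are usually a normalized difference of trial counts; the sketch below shows one common form. Sign conventions and exact scoring differ across CFT studies, so treat this as an assumed illustration rather than the authors' scoring rule.

```python
def laterality_quotient(n_right: int, n_left: int) -> float:
    """LQ in [-1, 1]: n_right / n_left are trials resolved from the right /
    left hemiface (how counts map to hemispheres varies by convention)."""
    return (n_right - n_left) / (n_right + n_left)

print(laterality_quotient(28, 12))  # 0.4 -> right-biased responding
```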
Effects of the potential lithium-mimetic, ebselen, on impulsivity and emotional processing.
Masaki, Charles; Sharpley, Ann L; Cooper, Charlotte M; Godlewska, Beata R; Singh, Nisha; Vasudevan, Sridhar R; Harmer, Catherine J; Churchill, Grant C; Sharp, Trevor; Rogers, Robert D; Cowen, Philip J
2016-07-01
Lithium remains the most effective treatment for bipolar disorder and also has important effects in lowering suicidal behaviour, a property that may be linked to its ability to diminish impulsive, aggressive behaviour. The antioxidant drug ebselen has been proposed as a possible lithium-mimetic based on its ability in animals to inhibit inositol monophosphatase (IMPase), an action it shares with lithium. The aim of the study was to determine whether treatment with ebselen altered emotional processing and diminished measures of risk-taking behaviour. We studied 20 healthy participants who were tested on two occasions, receiving either ebselen (3600 mg over 24 h) or identical placebo in a double-blind, randomized, cross-over design. Three hours after the final dose of ebselen/placebo, participants completed the Cambridge Gambling Task (CGT) and a task that required the detection of emotional facial expressions (facial emotion recognition task, FERT). On the CGT, relative to placebo, ebselen reduced delay aversion, while on the FERT it increased the recognition of positive vs negative facial expressions. The study suggests that, at the dosage used, ebselen can decrease impulsivity and produce a positive bias in emotional processing. These findings have implications for the possible use of ebselen in disorders characterized by impulsive behaviour and dysphoric mood.
Weisbuch, Max; Grunberg, Rebecca L; Slepian, Michael L; Ambady, Nalini
2016-10-01
Beliefs about the malleability versus stability of traits (incremental vs. entity lay theories) have a profound impact on social cognition and self-regulation, shaping phenomena that range from the fundamental attribution error and group-based stereotyping to academic motivation and achievement. Less is known about the causes than the effects of these lay theories, and in the current work the authors examine the perception of facial emotion as a causal influence on lay theories. Specifically, they hypothesized that (a) within-person variability in facial emotion signals within-person variability in traits and (b) social environments replete with within-person variability in facial emotion encourage perceivers to endorse incremental lay theories. Consistent with Hypothesis 1, Study 1 participants were more likely to attribute dynamic (vs. stable) traits to a person who exhibited several different facial emotions than to a person who exhibited a single facial emotion across multiple images. Hypothesis 2 suggests that social environments support incremental lay theories to the extent that they include many people who exhibit within-person variability in facial emotion. Consistent with Hypothesis 2, participants in Studies 2-4 were more likely to endorse incremental theories of personality, intelligence, and morality after exposure to multiple individuals exhibiting within-person variability in facial emotion than after exposure to multiple individuals exhibiting a single emotion several times. Perceptions of within-person variability in facial emotion-rather than perceptions of simple diversity in facial emotion-were responsible for these effects. Discussion focuses on how social ecologies shape lay theories.
Effects of the BDNF Val66Met polymorphism on neural responses to facial emotion.
Mukherjee, Prerona; Whalley, Heather C; McKirdy, James W; McIntosh, Andrew M; Johnstone, Eve C; Lawrie, Stephen M; Hall, Jeremy
2011-03-31
The brain-derived neurotrophic factor (BDNF) Val66Met polymorphism has been associated with affective disorders, but its role in emotion processing has not been fully established. Due to the clinically heterogeneous nature of these disorders, studying the effect of genetic variation in the BDNF gene on a common attribute such as fear processing may elucidate how the BDNF Val66Met polymorphism impacts brain function. Here we use functional magnetic resonance imaging to examine the effect of the BDNF Val66Met genotype on neural activity during fear processing. Forty healthy participants performed an implicit fear task during scanning, in which subjects made gender judgments from facial images with neutral or fearful emotion. Subjects were tested for facial emotion recognition post-scan. Functional connectivity was investigated using psycho-physiological interactions. Subjects were genotyped for the BDNF Val66Met polymorphism and the measures compared between genotype groups. Met carriers showed overactivation in the anterior cingulate cortex (ACC), brainstem and insula bilaterally for fear processing, along with reduced functional connectivity from the ACC to the left hippocampus, and impaired fear recognition ability. The results show that during fear processing, Met allele carriers show an increased neural response in regions previously implicated in mediating autonomic arousal. Further, the Met carriers show decreased functional connectivity with the hippocampus, which may reflect differential retrieval of emotional associations. Together, these effects show significant differences in the neural substrate for fear processing with genetic variation in BDNF.
Bertsch, Katja; Böhnke, Robina; Kruk, Menno R; Naumann, Ewald
2009-01-01
Aggression is a common behavior which has frequently been explained as involving changes in higher level information processing patterns. Although researchers have started only recently to investigate information processing in healthy individuals while engaged in aggressive behavior, the impact of aggression on information processing beyond an aggressive encounter remains unclear. In an event-related potential study, we investigated the processing of facial expressions (happy, angry, fearful, and neutral) in an emotional Stroop task after experimentally provoking aggressive behavior in healthy participants. Compared to a non-provoked group, these individuals showed increased early (P2) and late (P3) positive amplitudes for all facial expressions. For the P2 amplitude, the effect of provocation was greatest for threat-related expressions. Beyond this, a bias for emotional expressions, i.e., slower reaction times to all emotional expressions, was found in provoked participants with a high level of trait anger. These results indicate significant effects of aggression on information processing, which last beyond the aggressive encounter even in healthy participants.
Emotion elicitor or emotion messenger? Subliminal priming reveals two faces of facial expressions.
Ruys, Kirsten I; Stapel, Diederik A
2008-06-01
Facial emotional expressions can serve both as emotional stimuli and as communicative signals. The research reported here was conducted to illustrate how responses to both roles of facial emotional expressions unfold over time. As an emotion elicitor, a facial emotional expression (e.g., a disgusted face) activates a response that is similar to responses to other emotional stimuli of the same valence (e.g., a dirty, nonflushed toilet). As an emotion messenger, the same facial expression (e.g., a disgusted face) serves as a communicative signal by also activating the knowledge that the sender is experiencing a specific emotion (e.g., the sender feels disgusted). By varying the duration of exposure to disgusted, fearful, angry, and neutral faces in two subliminal-priming studies, we demonstrated that responses to faces as emotion elicitors occur prior to responses to faces as emotion messengers, and that both types of responses may unfold unconsciously.
ERIC Educational Resources Information Center
Herba, Catherine; Phillips, Mary
2004-01-01
Background: Intact emotion processing is critical for normal emotional development. Recent advances in neuroimaging have facilitated the examination of brain development, and have allowed for the exploration of the relationships between the development of emotion processing abilities, and that of associated neural systems. Methods: A literature…
Objectifying Facial Expressivity Assessment of Parkinson's Patients: Preliminary Study
Patsis, Georgios; Jiang, Dongmei; Sahli, Hichem; Kerckhofs, Eric; Vandekerckhove, Marie
2014-01-01
Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as “facial masking,” a symptom in which facial muscles become rigid. To improve clinical assessment of facial expressivity in PD, this work attempts to quantify dynamic facial expressivity (facial activity) by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To elicit spontaneous facial expressions resembling those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were induced using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports; self-reported disgust was significantly stronger than the other emotions, so we focused the analysis on the data recorded while participants watched the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences were also observed between PD patients at different stages of disease progression. PMID:25478003
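Once per-frame AU intensities are available, a facial-activity summary can be as simple as mean activation plus dynamic range over time. The authors' exact metric is not given in the abstract, so the following is an assumed, minimal sketch.

```python
import numpy as np

def facial_activity(au_intensities):
    """au_intensities: (n_frames, n_aus) array of per-frame AU intensities.
    Returns mean activation and mean dynamic range across AUs (assumed metric)."""
    au = np.asarray(au_intensities, dtype=float)
    mean_activation = au.mean()                             # overall level
    dynamic_range = (au.max(axis=0) - au.min(axis=0)).mean()  # temporal variability
    return mean_activation, dynamic_range
```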
Racca, Anaïs; Guo, Kun; Meints, Kerstin; Mills, Daniel S.
2012-01-01
Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges, since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that the processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in the laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purposes, a similar experiment was also run with 4-year-old children, and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions. PMID:22558335
A Facial Control Method Using Emotional Parameters in Sensibility Robot
NASA Astrophysics Data System (ADS)
Shibata, Hiroshi; Kanoh, Masayoshi; Kato, Shohei; Kunitachi, Tsutomu; Itoh, Hidenori
The “Ifbot” robot communicates with people by considering its own “emotions”. Ifbot has many facial expressions with which it communicates enjoyment. These are used to express its internal emotions, purposes, and reactions caused by external stimuli, and for entertainment such as singing songs. All of these facial expressions were developed manually by designers. With this approach, every facial motion we want Ifbot to express must be designed by hand, which is not realistic. We have therefore developed a system which converts Ifbot's emotions to its facial expressions automatically. In this paper, we propose a method for creating Ifbot's facial expressions from “emotional parameters”, which represent its internal emotions computationally.
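One simple way to realize such a mapping is to blend prototype expression poses, weighted by the current emotional-parameter values. The prototype vectors, actuator layout, and function below are hypothetical illustrations, not Ifbot's actual parameters or API.

```python
import numpy as np

# Hypothetical actuator targets (brow, eyelid, mouth) for three prototype expressions.
PROTOTYPES = {
    "joy":     np.array([0.2, 0.8, 0.9]),
    "sadness": np.array([-0.6, 0.3, 0.1]),
    "anger":   np.array([-0.8, 0.9, 0.4]),
}

def expression_from_emotions(emotion_params):
    """Blend prototype poses, weighted by emotional-parameter intensities in [0, 1]."""
    total = sum(emotion_params.values()) or 1.0
    return sum(w * PROTOTYPES[e] for e, w in emotion_params.items()) / total

print(expression_from_emotions({"joy": 0.7, "anger": 0.3}))  # mixed-feeling pose
```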
Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?
Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James
2017-01-01
Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-alternative forced-choice response format (emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial, participants rated their confidence in the accuracy of their response. For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001) and on the single-emotion sub-scores for happiness, fear, anger, and sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, and anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). PD patients showed difficulties both in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which may be degraded in patients with PD. These results open new clinical and rehabilitation perspectives.
Imitating expressions: emotion-specific neural substrates in facial mimicry.
Lee, Tien-Wen; Josephs, Oliver; Dolan, Raymond J; Critchley, Hugo D
2006-09-01
Intentionally adopting a discrete emotional facial expression can modulate the subjective feelings corresponding to that emotion; however, the underlying neural mechanism is poorly understood. We therefore used functional brain imaging (functional magnetic resonance imaging) to examine brain activity during intentional mimicry of emotional and non-emotional facial expressions and to relate regional responses to the magnitude of expression-induced facial movement. Eighteen healthy subjects were scanned while imitating video clips depicting three emotional (sad, angry, happy) and two 'ingestive' (chewing and licking) facial expressions. Simultaneously, facial movement was monitored from the displacement of fiducial markers (highly reflective dots) on each subject's face. Imitating emotional expressions enhanced activity within right inferior prefrontal cortex. This pattern was absent during passive viewing conditions. Moreover, the magnitude of facial movement during emotion-imitation predicted responses within right insula and motor/premotor cortices. Enhanced activity in ventromedial prefrontal cortex and frontal pole was observed during imitation of anger, in ventromedial prefrontal and rostral anterior cingulate cortices during imitation of sadness, and in striatal, amygdala and occipitotemporal regions during imitation of happiness. Our findings suggest a central role for right inferior frontal gyrus in the intentional imitation of emotional expressions. Further, by entering metrics for facial muscular change into the analysis of brain imaging data, we highlight shared and discrete neural substrates supporting the affective, action and social consequences of somatomotor emotional expression.
ERIC Educational Resources Information Center
Brideau, Linda B.; Allen, Vernon L.
A study was undertaken to examine the impact of the paralinguistic channel on the ability to encode facial expressions of emotion. The first set of subjects, 19 encoders, were asked to encode facial expressions for five emotions (fear, sadness, anger, happiness, and disgust). The emotions were produced in three encoding conditions: facial channel…
Mike, Andrea; Strammer, Erzsebet; Aradi, Mihaly; Orsi, Gergely; Perlaki, Gabor; Hajnal, Andras; Sandor, Janos; Banati, Miklos; Illes, Eniko; Zaitsev, Alexander; Herold, Robert; Guttmann, Charles R G; Illes, Zsolt
2013-01-01
Successful socialization requires the ability to understand others' mental states. This ability, called mentalization (Theory of Mind), may become deficient and contribute to everyday life difficulties in multiple sclerosis. We aimed to explore the impact of brain pathology on mentalization performance in multiple sclerosis. Mentalization performance of 49 patients with multiple sclerosis was compared to 24 age- and gender-matched healthy controls. T1- and T2-weighted three-dimensional brain MRI images were acquired at 3 Tesla from patients with multiple sclerosis and 18 gender- and age-matched healthy controls. We assessed overall brain cortical thickness in patients with multiple sclerosis and the scanned healthy controls, and measured the total and regional T1 and T2 white matter lesion volumes in patients with multiple sclerosis. Performance in tests of recognition of mental states and emotions from facial expressions and eye gazes correlated with both total T1-lesion load and regional T1-lesion load of association fiber tracts interconnecting cortical regions related to visual and emotion processing (genu and splenium of corpus callosum, right inferior longitudinal fasciculus, right inferior fronto-occipital fasciculus, uncinate fasciculus). Both of these tests showed correlations with specific cortical areas involved in emotion recognition from facial expressions (right and left fusiform face area, frontal eye field), processing of emotions (right entorhinal cortex) and socially relevant information (left temporal pole). Thus, both a disconnection mechanism due to white matter lesions and cortical thinning of specific brain areas may result in cognitive deficits in multiple sclerosis affecting emotion and mental state processing from facial expressions and contributing to the everyday and social life difficulties of these patients.
Smitha, K G; Vinod, A P
2015-11-01
Children with autism spectrum disorder have difficulty in understanding emotional and mental states from the facial expressions of the people they interact with. The inability to understand other people's emotions hinders their interpersonal communication. Though many facial emotion recognition algorithms have been proposed in the literature, they are mainly intended for processing by a personal computer, which limits their usability in on-the-move applications where portability is desired. The portability of the system will ensure ease of use and real-time emotion recognition, which will allow immediate feedback while communicating with caretakers. Principal component analysis (PCA) has been identified as the least complex feature extraction algorithm to be implemented in hardware. In this paper, we present a detailed study of serial and parallel implementations of PCA in order to identify the most feasible method for the realization of a portable emotion detector for autistic children. The proposed emotion recognizer architectures are implemented on a Virtex 7 XC7VX330T FFG1761-3 FPGA. We achieved 82.3% detection accuracy for a word length of 8 bits.
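For reference, the sketch below shows PCA-based feature extraction (the eigenfaces idea) in floating point; the paper's contribution is a fixed-point 8-bit serial/parallel FPGA realization, so this is only the algorithmic core, not their implementation.

```python
import numpy as np

def pca_features(train_faces, probe_face, n_components=10):
    """train_faces: (n_samples, n_pixels) flattened grayscale face images.
    Returns the probe image's coordinates in the top eigenface basis."""
    mean = train_faces.mean(axis=0)
    centered = train_faces - mean
    # Principal axes via SVD of the centered data matrix (rows of vt)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]            # top eigenfaces
    return basis @ (probe_face - mean)   # projection = feature vector
```

A classifier (e.g., nearest neighbour over these feature vectors) would then map features to emotion labels.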
Developmental differences in the neural mechanisms of facial emotion labeling
Adleman, Nancy E.; Kim, Pilyoung; Oakes, Allison H.; Hsu, Derek; Reynolds, Richard C.; Chen, Gang; Pine, Daniel S.; Brotman, Melissa A.; Leibenluft, Ellen
2016-01-01
Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ in accuracy when labeling emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several ‘ventral stream’ brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. PMID:26245836
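As a rough illustration of the intensity manipulation described above, a pixel-wise blend between a neutral and an emotional image is sketched below. Published stimuli are usually produced with dedicated morphing software that also warps facial geometry, so this linear blend is a simplification.

```python
import numpy as np

def morph(neutral_img, emotional_img, intensity):
    """Blend a neutral face toward an emotional one; intensity in [0, 1]
    (e.g., 0.0, 0.5, 0.75, 1.0 as in the study's levels)."""
    return (1.0 - intensity) * np.asarray(neutral_img, dtype=float) \
           + intensity * np.asarray(emotional_img, dtype=float)
```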
Retention interval affects visual short-term memory encoding.
Bankó, Eva M; Vidnyánszky, Zoltán
2010-03-01
Humans can efficiently store fine-detailed facial emotional information in visual short-term memory for several seconds. However, an unresolved question is whether the same neural mechanisms underlie high-fidelity short-term memory for emotional expressions at different retention intervals. Here we show that retention interval affects the neural processes of short-term memory encoding using a delayed facial emotion discrimination task. The early sensory P100 component of the event-related potentials (ERP) was larger in the 1-s interstimulus interval (ISI) condition than in the 6-s ISI condition, whereas the face-specific N170 component was larger in the longer ISI condition. Furthermore, the memory-related late P3b component of the ERP responses was also modulated by retention interval: it was reduced in the 1-s ISI as compared with the 6-s condition. The present findings cannot be explained based on differences in sensory processing demands or overall task difficulty because there was no difference in the stimulus information and subjects' performance between the two different ISI conditions. These results reveal that encoding processes underlying high-precision short-term memory for facial emotional expressions are modulated depending on whether information has to be stored for one or for several seconds.
Fixation to features and neural processing of facial expressions in a gender discrimination task
Neath, Karly N.; Itier, Roxane J.
2017-01-01
Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected a general sensitivity to face position. An early effect of emotion (~120 ms) for happy faces was seen at occipital sites and was sustained until ~350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect lasting from ~150 ms to ~300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye-sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. PMID:26277653
Sex differences in facial emotion recognition across varying expression intensity levels from videos
Wingenbach, Tanja S H; Ashwin, Chris; Brosnan, Mark
2018-01-01
There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or ‘extreme’ examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations. PMID:29293674
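The unbiased hit rate (Hu) reported above is commonly computed following Wagner (1993): the squared count of correct responses divided by the product of the row total (times an emotion was presented) and column total (times that response was given). A minimal sketch:

```python
import numpy as np

def unbiased_hit_rates(confusions):
    """confusions: square matrix, rows = emotion presented, cols = response given.
    Hu_i = hits_i**2 / (row_total_i * column_total_i), per Wagner (1993)."""
    m = np.asarray(confusions, dtype=float)
    hits = np.diag(m)
    return hits ** 2 / (m.sum(axis=1) * m.sum(axis=0))

print(unbiased_hit_rates([[8, 2], [3, 7]]))  # toy 2-emotion matrix -> [~0.58, ~0.54]
```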
Interoceptive sensitivity predicts sensitivity to the emotions of others.
Terasawa, Yuri; Moriguchi, Yoshiya; Tochizawa, Saiko; Umeda, Satoshi
2014-01-01
Some theories of emotion emphasise a close relationship between interoception and subjective experiences of emotion. In this study, we used facial expressions to examine whether interoceptive sensibility modulated emotional experience in a social context. Interoceptive sensibility was measured using the heartbeat detection task. To estimate individual emotional sensitivity, we made morphed photos that ranged between a neutral and an emotional facial expression (i.e., anger, sadness, disgust and happiness). Recognition rates of particular emotions from these photos were calculated and considered as emotional sensitivity thresholds. Our results indicate that participants with accurate interoceptive awareness are sensitive to the emotions of others, especially expressions of sadness and happiness. We also found that false responses to sad faces were closely related to an individual's degree of social anxiety. These results suggest that interoceptive awareness modulates the intensity of the subjective experience of emotion and affects individual traits related to emotion processing.
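The abstract does not spell out how a threshold is derived from recognition rates; one plausible reading, sketched below under that assumption, is the lowest morph intensity at which recognition first reaches a criterion (here 50%). The authors' exact computation may differ.

```python
import numpy as np

def sensitivity_threshold(levels, rates, criterion=0.5):
    """Lowest morph level at which the recognition rate reaches the criterion."""
    levels, rates = np.asarray(levels, dtype=float), np.asarray(rates, dtype=float)
    above = np.nonzero(rates >= criterion)[0]
    return float(levels[above[0]]) if above.size else float("nan")

print(sensitivity_threshold([0.2, 0.4, 0.6, 0.8, 1.0],
                            [0.10, 0.35, 0.62, 0.90, 0.97]))  # -> 0.6
```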
How Children Use Emotional Prosody: Crossmodal Emotional Integration?
ERIC Educational Resources Information Center
Gil, Sandrine; Hattouti, Jamila; Laval, Virginie
2016-01-01
A crossmodal effect has been observed in the processing of facial and vocal emotion in adults and infants. For the first time, we assessed whether this effect is present in childhood by administering a crossmodal task similar to those used in seminal studies featuring emotional faces (i.e., a continuum of emotional expressions running from…
Silver, Henry; Bilker, Warren B
2015-01-01
Social cognition is commonly assessed by identification of emotions in facial expressions. Presence of colour, a salient feature of stimuli, might influence emotional face perception. We administered 2 tests of facial emotion recognition, the Emotion Recognition Test (ER40) using colour pictures and the Penn Emotional Acuity Test using monochromatic pictures, to 37 young healthy, 39 old healthy and 37 schizophrenic men. Among young healthy individuals recognition of emotions was more accurate and faster in colour than in monochromatic pictures. Compared to the younger group, older healthy individuals revealed impairment in identification of sad expressions in colour but not monochromatic pictures. Schizophrenia patients showed greater impairment in colour than monochromatic pictures of neutral and sad expressions and overall total score compared to both healthy groups. Patients showed significant correlations between cognitive impairment and perception of emotion in colour but not monochromatic pictures. Colour enhances perception of general emotional clues and this contextual effect is impaired in healthy ageing and schizophrenia. The effects of colour need to be considered in interpreting and comparing studies of emotion perception. Coloured face stimuli may be more sensitive to emotion processing impairments but less selective for emotion-specific information than monochromatic stimuli. This may impact on their utility in early detection of impairments and investigations of underlying mechanisms.
Facial EMG Responses to Emotional Expressions Are Related to Emotion Perception Ability
Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver
2014-01-01
Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories understanding emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses - recorded with electromyogram (EMG) - in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective. PMID:24489647
Gaze Dynamics in the Recognition of Facial Expressions of Emotion.
Barabanschikov, Vladimir A
2015-01-01
We studied the preferentially fixated parts and features of the human face during the recognition of facial expressions of emotion. Photographs of facial expressions were used. Participants were asked to categorize these as basic emotions while their eye movements were registered. It was found that variation in the intensity of an expression is mirrored in the accuracy of emotion recognition and is also reflected in several indices of oculomotor function: the duration of inspection of certain areas of the face (its upper and bottom parts, right and left sides); the location, number and duration of fixations; and the viewing trajectory. In particular, for low-intensity expressions, the right side of the face was attended to predominantly (right-side dominance); this right-side dominance effect was, however, absent for expressions of high intensity. For both low- and high-intensity expressions, the upper part of the face was predominantly fixated, though with greater fixation for high-intensity expressions. The majority of trials (70%), in line with findings in previous studies, revealed a V-shaped pattern of inspection trajectory. No relationship was found, though, between the accuracy of recognition of emotional expressions and either the location and duration of fixations or the pattern of gaze directedness in the face.
Li, Huijie; Chan, Raymond C K; Zhao, Qing; Hong, Xiaohong; Gong, Qi-Yong
2010-03-17
Although there is a consensus that patients with schizophrenia have certain deficits in perceiving and expressing facial emotions, previous studies of facial emotion perception in schizophrenia do not present consistent results. The objective of this study was to explore facial emotion perception deficits in Chinese patients with schizophrenia and their non-psychotic first-degree relatives. Sixty-nine patients with schizophrenia, 56 of their first-degree relatives (33 parents and 23 siblings), and 92 healthy controls (67 younger healthy controls matched to the patients and siblings, and 25 older healthy controls matched to the parents) completed a set of facial emotion perception tasks, including facial emotion discrimination, identification, intensity, valence, and corresponding face identification tasks. The results demonstrated that patients with schizophrenia performed significantly worse than their siblings and younger healthy controls in accuracy on a variety of facial emotion perception tasks, whereas the siblings of the patients performed as well as the corresponding younger healthy controls on all of the facial emotion perception tasks. Patients with schizophrenia were also significantly slower than younger healthy controls, while siblings of patients did not differ significantly in speed from either patients or younger healthy controls. Meanwhile, we also found that parents of the schizophrenia patients performed significantly worse than the corresponding older healthy controls in accuracy in terms of facial emotion identification, valence, and the composite index of the facial discrimination, identification, intensity and valence tasks. Moreover, no significant differences were found between the parents of patients and older healthy controls in speed after controlling for years of education and IQ. Taken together, the results suggest that facial emotion perception deficits may serve as potential endophenotypes for schizophrenia.
ERIC Educational Resources Information Center
Beall, Paula M.; Moody, Eric J.; McIntosh, Daniel N.; Hepburn, Susan L.; Reed, Catherine L.
2008-01-01
Typical adults mimic facial expressions within 1000ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study…
Leist, Tatyana; Dadds, Mark R
2009-04-01
Emotional processing styles appear to characterize various forms of psychopathology and environmental adversity in children. For example, autistic, anxious, high- and low-emotion conduct problem children, and children who have been maltreated, all appear to show specific deficits and strengths in recognizing the facial expressions of emotions. Until now, the relationships between emotion recognition, antisocial behaviour, emotional problems, callous-unemotional (CU) traits and early maltreatment have never been assessed simultaneously in one study, and the specific associations of emotion recognition to maltreatment and child characteristics are therefore unknown. We examined facial-emotion processing in a sample of 23 adolescents selected for high-risk status on the variables of interest. As expected, maltreatment and child characteristics showed unique associations. CU traits were uniquely related to impairments in fear recognition. Antisocial behaviour was uniquely associated with better fear recognition, but impaired anger recognition. Emotional problems were associated with better recognition of anger and sadness, but lower recognition of neutral faces. Maltreatment was predictive of superior recognition of fear and sadness. The findings are considered in terms of social information-processing theories of psychopathology. Implications for clinical interventions are discussed.
Ibáñez, Agustin; Petroni, Agustin; Urquina, Hugo; Torrente, Fernando; Torralva, Teresa; Hurtado, Esteban; Guex, Raphael; Blenkmann, Alejandro; Beltrachini, Leandro; Muravchik, Carlos; Baez, Sandra; Cetkovich, Marcelo; Sigman, Mariano; Lischinsky, Alicia; Manes, Facundo
2011-01-01
Although it has been shown that adults with attention-deficit hyperactivity disorder (ADHD) have impaired social cognition, no previous study has reported the brain correlates of face valence processing. This study looked for behavioral, neuropsychological, and electrophysiological markers of emotion processing for faces (N170) in adult ADHD compared to controls matched by age, gender, educational level, and handedness. We designed an event-related potential (ERP) study based on a dual valence task (DVT), in which faces and words were presented to test the effects of stimulus type (faces, words, or face-word stimuli) and valence (positive versus negative). Individual signatures of cognitive functioning in participants with ADHD and controls were assessed with a comprehensive neuropsychological evaluation, including executive functioning (EF) and theory of mind (ToM). Compared to controls, the adult ADHD group showed deficits in N170 emotion modulation for facial stimuli. These N170 impairments were observed in the absence of any deficit in facial structural processing, suggesting a specific ADHD impairment in early facial emotion modulation. The cortical current density mapping of N170 yielded a main neural source of N170 at posterior section of fusiform gyrus (maximum at left hemisphere for words and right hemisphere for faces and simultaneous stimuli). Neural generators of N170 (fusiform gyrus) were reduced in ADHD. In those patients, N170 emotion processing was associated with performance on an emotional inference ToM task, and N170 from simultaneous stimuli was associated with EF, especially working memory. This is the first report to reveal an adult ADHD-specific impairment in the cortical modulation of emotion for faces and an association between N170 cortical measures and ToM and EF.
Neural correlates of the perception of dynamic versus static facial expressions of emotion.
Kessler, Henrik; Doyen-Waldecker, Cornelia; Hofer, Christian; Hoffmann, Holger; Traue, Harald C; Abler, Birgit
2011-04-20
This study investigated brain areas involved in the perception of dynamic facial expressions of emotion. A group of 30 healthy subjects was measured with fMRI when passively viewing prototypical facial expressions of fear, disgust, sadness and happiness. Using morphing techniques, all faces were displayed as still images and also dynamically as a film clip with the expressions evolving from neutral to emotional. Irrespective of a specific emotion, dynamic stimuli selectively activated bilateral superior temporal sulcus, visual area V5, fusiform gyrus, thalamus and other frontal and parietal areas. Interaction effects of emotion and mode of presentation (static/dynamic) were only found for the expression of happiness, where static faces evoked greater activity in the medial prefrontal cortex. Our results confirm previous findings on neural correlates of the perception of dynamic facial expressions and are in line with studies showing the importance of the superior temporal sulcus and V5 in the perception of biological motion. Differential activation in the fusiform gyrus for dynamic stimuli stands in contrast to classical models of face perception but is coherent with new findings arguing for a more general role of the fusiform gyrus in the processing of socially relevant stimuli.
Hulvershorn, Leslie A; Finn, Peter; Hummer, Tom A; Leibenluft, Ellen; Ball, Brandon; Gichina, Victoria; Anand, Amit
2013-08-01
Recent longitudinal studies demonstrate that addiction risk may be influenced by a cognitive, affective and behavioral phenotype that emerges during childhood. Relatively little research has focused on the affective or emotional risk components of this high-risk phenotype, including the relevant neurobiology. Non-substance abusing youth (N=19; mean age=12.2) with externalizing psychopathology and paternal history of a substance use disorder and demographically matched healthy comparisons (N=18; mean age=11.9) were tested on a facial emotion matching task during functional MRI. This task involved matching faces by emotions (angry, anxious) or matching shape orientation. High-risk youth exhibited increased medial prefrontal, precuneus and occipital cortex activation compared to the healthy comparison group during the face matching condition, relative to the control shape condition. The occipital activation correlated positively with parent-rated emotion regulation impairments in the high-risk group. These findings suggest a preexisting abnormality in cortical activation in response to facial emotion matching in youth at high risk for the development of problem drug or alcohol use. These cortical deficits may underlie impaired affective processing and regulation, which in turn may contribute to escalating drug use in adolescence. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Mimicking emotions: how 3-12-month-old infants use the facial expressions and eyes of a model.
Soussignan, Robert; Dollion, Nicolas; Schaal, Benoist; Durand, Karine; Reissland, Nadja; Baudouin, Jean-Yves
2018-06-01
While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants' emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants' facial displays and eye-movement tracking to examine infants' looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model's negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.
Yang, Tao; Penton, Tegan; Köybaşı, Şerife Leman; Banissy, Michael J
2017-09-01
Previous findings suggest that older adults show impairments in the social perception of faces, including the perception of emotion and facial identity. The majority of this work has tended to examine performance on tasks involving young adult faces and prototypical emotions. While useful, this can influence performance differences between groups due to perceptual biases and limitations on task performance. Here we sought to examine how typical aging is associated with the perception of subtle changes in facial happiness and facial identity in older adult faces. We developed novel tasks that permitted the ability to assess facial happiness, facial identity, and non-social perception (object perception) across similar task parameters. We observe that aging is linked with declines in the ability to make fine-grained judgements in the perception of facial happiness and facial identity (from older adult faces), but not for non-social (object) perception. This pattern of results is discussed in relation to mechanisms that may contribute to declines in facial perceptual processing in older adulthood. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Emotion identification and aging: Behavioral and neural age-related changes.
Gonçalves, Ana R; Fernandes, Carina; Pasion, Rita; Ferreira-Santos, Fernando; Barbosa, Fernando; Marques-Teixeira, João
2018-05-01
Aging is known to alter the processing of facial expressions of emotion (FEE); however, the impact of this alteration is less clear. Additionally, there is little information about the temporal dynamics of the neural processing of facial affect. We examined behavioral and neural age-related changes in the identification of FEE using event-related potentials, and analyzed the relationship between behavioral/neural responses and neuropsychological functioning. To this purpose, 30 younger adults, 29 middle-aged adults and 26 older adults identified FEE. The behavioral results showed a similar performance between groups. The neural results showed no significant differences between groups for the P100 component and an increased N170 amplitude in the older group. Furthermore, a pattern of asymmetric activation was evident in the N170 component. Results also suggest deficits in facial feature decoding abilities, reflected by a reduced N250 amplitude in older adults. Neuropsychological functioning predicted P100 modulation, but did not seem to influence emotion identification ability. The findings suggest the existence of a compensatory function that would explain the age-equivalent performance in emotion identification. The study may help future research addressing the behavioral and neural processes involved in the processing of FEE in neurodegenerative conditions. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
Candra, Henry; Yuwono, Mitchell; Rifai Chai; Nguyen, Hung T; Su, Steven
2016-08-01
Psychotherapy requires appropriate recognition of a patient's facial emotion expression to provide proper treatment during psychotherapy sessions. To address this need, this paper proposes a facial emotion recognition system combining the Viola-Jones detector with a feature descriptor we term Edge-Histogram of Oriented Gradients (E-HOG). The performance of the proposed method is compared across various feature sources, including the whole face, the eyes, the mouth, and both the eyes and the mouth. Seven classes of basic emotions were identified with 96.4% accuracy using a multi-class Support Vector Machine (SVM). The proposed E-HOG descriptor is much leaner to compute than traditional HOG, as shown by a significant improvement in processing time of up to 1833.33% (p-value = 2.43E-17) with a slight reduction in accuracy of only 1.17% (p-value = 0.0016).
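The pipeline described above can be sketched with standard open-source components. The following minimal Python example is an illustration, not the authors' code: the E-HOG descriptor is not publicly specified, so the ordinary HOG from scikit-image stands in for it, and the crop size, detector parameters, and variable names are illustrative assumptions.

    # Viola-Jones detection -> HOG features -> multi-class SVM (sketch)
    import cv2
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC

    # Viola-Jones cascade shipped with OpenCV
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def face_descriptor(gray):
        """Detect the largest face in a grayscale image; return its HOG vector."""
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
        crop = cv2.resize(gray[y:y + h, x:x + w], (64, 64))
        return hog(crop, orientations=9, pixels_per_cell=(8, 8),
                   cells_per_block=(2, 2))

    # Training (hypothetical data): X_imgs is a list of grayscale images,
    # y holds one of seven basic-emotion labels per image.
    # X = np.array([face_descriptor(im) for im in X_imgs])
    # clf = SVC(kernel="linear", decision_function_shape="ovr").fit(X, y)
    # clf.predict([face_descriptor(test_img)])  # -> e.g. array(['happiness'])

With string labels, SVC.predict returns the emotion name directly; the one-vs-rest decision function handles the seven-class problem.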
Kaulard, Kathrin; Cunningham, Douglas W.; Bülthoff, Heinrich H.; Wallraven, Christian
2012-01-01
The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions. PMID:22438875
Measuring facial expression of emotion.
Wolf, Karsten
2015-12-01
Research into emotions has increased in recent decades, especially on the subject of emotion recognition. However, studies of the facial expression of emotion were long compromised by technical problems with video analysis and electromyography in experimental settings, which have only recently been overcome. New developments in automated, computerized facial recognition now allow real-time identification of facial expression in social environments. This review addresses three approaches to measuring facial expression of emotion and describes their specific contributions to understanding emotion in the healthy population and in persons with mental illness. Despite recent progress, studies on human emotions have been hindered by the lack of consensus on an emotion theory suited to examining the dynamic aspects of emotion and its expression. Studying expression of emotion in patients with mental health conditions for diagnostic and therapeutic purposes will profit from theoretical and methodological progress.
Visual search for facial expressions of emotions: a comparison of dynamic and static faces.
Horstmann, Gernot; Ansorge, Ulrich
2009-02-01
A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (c) 2009 APA, all rights reserved
The influence of context on distinct facial expressions of disgust.
Reschke, Peter J; Walle, Eric A; Knothe, Jennifer M; Lopez, Lukas D
2018-06-11
Face perception is susceptible to contextual influence and perceived physical similarities between emotion cues. However, studies often use structurally homogeneous facial expressions, making it difficult to explore how within-emotion variability in facial configuration affects emotion perception. This study examined the influence of context on the emotional perception of categorically identical, yet physically distinct, facial expressions of disgust. Participants categorized two perceptually distinct disgust facial expressions, "closed" (i.e., scrunched nose, closed mouth) and "open" (i.e., scrunched nose, open mouth, protruding tongue), that were embedded in contexts comprising emotion postures and scenes. Results demonstrated that the effect of nonfacial elements was significantly stronger for "open" disgust facial expressions than "closed" disgust facial expressions. These findings provide support that physical similarity within discrete categories of facial expressions is mutable and plays an important role in affective face perception. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Infant Expressions in an Approach/Withdrawal Framework
Sullivan, Margaret Wolan
2014-01-01
Since the introduction of empirical methods for studying facial expression, the interpretation of infant facial expressions has generated much debate. The premise of this paper is that action tendencies of approach and withdrawal constitute a core organizational feature of emotion in humans, promoting coherence of behavior, facial signaling and physiological responses. The approach/withdrawal framework can provide a taxonomy of contexts and the neurobehavioral framework for the systematic, empirical study of individual differences in expression, physiology, and behavior within individuals as well as across contexts over time. By adopting this framework in developmental work on basic emotion processes, it may be possible to better understand the behavioral principles governing facial displays, and how individual differences in them are related to physiology and behavior and function in context. PMID:25412273
Compensating for age limits through emotional crossmodal integration
Chaby, Laurence; Boullay, Viviane Luherne-du; Chetouani, Mohamed; Plaza, Monique
2015-01-01
Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults responded more slowly and less accurately than younger adults when recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger adults except for anger. Importantly, additional analyses using the “race model” demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults. PMID:26074845
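The "race model" analysis referred to above is commonly implemented as Miller's race-model inequality: the cumulative distribution of crossmodal reaction times is compared against the bound predicted by two independent unimodal processes racing each other. A minimal sketch follows, with illustrative variable names and no claim to match the authors' exact procedure:

    import numpy as np

    def ecdf(rts, t):
        """Empirical cumulative distribution of reaction times at time(s) t."""
        rts = np.sort(np.asarray(rts))
        return np.searchsorted(rts, t, side="right") / rts.size

    def race_model_violation(rt_face, rt_voice, rt_both, t_grid):
        """Crossmodal ECDF minus the race bound min(F_face + F_voice, 1).
        Positive values indicate responses faster than any race of
        independent unimodal processes could produce, i.e. evidence of
        genuine crossmodal integration."""
        bound = np.minimum(ecdf(rt_face, t_grid) + ecdf(rt_voice, t_grid), 1.0)
        return ecdf(rt_both, t_grid) - bound

    # Example (hypothetical reaction times in ms):
    # t = np.linspace(200, 1000, 81)
    # violation = race_model_violation(rt_face, rt_voice, rt_both, t)

Equal crossmodal benefit in the two age groups would show as comparable violation curves for younger and older participants.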
Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Kressig, Reto W; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc
Misdiagnosis of early behavioral variant frontotemporal dementia (bvFTD) with major depressive disorder (MDD) is not uncommon due to overlapping symptoms. The aim of this study was to improve the discrimination between these disorders using a novel facial emotion perception task. In this prospective cohort study (July 2013-March 2016), we compared 25 patients meeting Rascovsky diagnostic criteria for bvFTD, 20 patients meeting DSM-IV criteria for MDD, 21 patients meeting McKhann diagnostic criteria for Alzheimer's disease dementia, and 31 healthy participants on a novel emotion intensity rating task comprising morphed low-intensity facial stimuli. Participants were asked to rate the intensity of morphed faces on the congruent basic emotion (eg, rating on sadness when sad face is shown) and on the 5 incongruent basic emotions (eg, rating on each of the other basic emotions when sad face is shown). While bvFTD patients underrated congruent emotions (P < .01), they also overrated incongruent emotions (P < .001), resulting in confusion of facial emotions. In contrast, MDD patients overrated congruent negative facial emotions (P < .001), but not incongruent facial emotions. Accordingly, ratings of congruent and incongruent emotions highly discriminated between bvFTD and MDD patients, ranging from area under the curve (AUC) = 93% to AUC = 98%. Further, an almost complete discrimination (AUC = 99%) was achieved by contrasting the 2 rating types. In contrast, Alzheimer's disease dementia patients perceived emotions similarly to healthy participants, indicating no impact of cognitive impairment on rating scores. Our congruent and incongruent facial emotion intensity rating task allows a detailed assessment of facial emotion perception in patient populations. By using this simple task, we achieved an almost complete discrimination between bvFTD and MDD, potentially helping improve the diagnostic certainty in early bvFTD. © Copyright 2018 Physicians Postgraduate Press, Inc.
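The discrimination figures above are areas under the ROC curve. As a minimal sketch of how such an AUC is obtained from a single rating score per participant (the score vectors below are fabricated illustrations, not the study's data):

    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Hypothetical mean incongruent-emotion ratings per participant
    bvftd_scores = np.array([3.1, 2.8, 3.5, 2.9, 3.3])
    mdd_scores = np.array([1.2, 1.6, 1.1, 1.9, 1.4])

    labels = np.r_[np.ones(bvftd_scores.size), np.zeros(mdd_scores.size)]
    auc = roc_auc_score(labels, np.r_[bvftd_scores, mdd_scores])
    print(f"AUC = {auc:.2f}")  # 1.00 here, since the toy groups do not overlap

An AUC of 0.5 means the score separates the groups no better than chance; the 93% to 99% values reported above indicate near-complete separation.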
Leitman, David I; Wolf, Daniel H; Loughead, James; Valdez, Jeffrey N; Kohler, Christian G; Brensinger, Colleen; Elliott, Mark A; Turetsky, Bruce I; Gur, Raquel E; Gur, Ruben C
2011-01-01
Schizophrenia patients display impaired performance and brain activity during facial affect recognition. These impairments may reflect stimulus-driven perceptual decrements and evaluative processing abnormalities. We differentiated these two processes by contrasting responses to identical stimuli presented under different contexts. Seventeen healthy controls and 16 schizophrenia patients performed an fMRI facial affect detection task. Subjects identified an affective target presented amongst foils of differing emotions. We hypothesized that targeting affiliative emotions (happiness, sadness) would create a task demand context distinct from that generated when targeting threat emotions (anger, fear). We compared affiliative foil stimuli within a congruent affiliative context with identical stimuli presented in an incongruent threat context. Threat foils were analysed in the same manner. Controls activated right orbitofrontal cortex (OFC)/ventrolateral prefrontal cortex (VLPFC) more to affiliative foils in threat contexts than to identical stimuli within affiliative contexts. Patients displayed reduced OFC/VLPFC activation to all foils, and no activation modulation by context. This lack of context modulation coincided with a 2-fold decrement in foil detection efficiency. Task demands produce contextual effects during facial affective processing in regions activated during affect evaluation. In schizophrenia, reduced modulation of OFC/VLPFC by context coupled with reduced behavioural efficiency suggests impaired ventral prefrontal control mechanisms that optimize affective appraisal.
Perceptual Biases in Processing Facial Identity and Emotion
ERIC Educational Resources Information Center
Coolican, Jamesie; Eskes, Gail A.; McMullen, Patricia A.; Lecky, Erin
2008-01-01
Normal observers demonstrate a bias to process the left sides of faces during perceptual judgments about identity or emotion. This effect suggests a right cerebral hemisphere processing bias. To test the role of the right hemisphere and the involvement of configural processing underlying this effect, young and older control observers and patients…
Voluntary facial action generates emotion-specific autonomic nervous system activity.
Levenson, R W; Ekman, P; Friesen, W V
1990-07-01
Four experiments were conducted to determine whether voluntarily produced emotional facial configurations are associated with differentiated patterns of autonomic activity, and if so, how this might be mediated. Subjects received muscle-by-muscle instructions and coaching to produce facial configurations for anger, disgust, fear, happiness, sadness, and surprise while heart rate, skin conductance, finger temperature, and somatic activity were monitored. Results indicated that voluntary facial activity produced significant levels of subjective experience of the associated emotion, and that autonomic distinctions among emotions: (a) were found both between negative and positive emotions and among negative emotions, (b) were consistent between group and individual subjects' data, (c) were found in both male and female subjects, (d) were found in both specialized (actors, scientists) and nonspecialized populations, (e) were stronger when the voluntary facial configurations most closely resembled actual emotional expressions, and (f) were stronger when experience of the associated emotion was reported. The capacity of voluntary facial activity to generate emotion-specific autonomic activity: (a) did not require subjects to see facial expressions (either in a mirror or on an experimenter's face), and (b) could not be explained by differences in the difficulty of making the expressions or by differences in concomitant somatic activity.
Emotion understanding in postinstitutionalized Eastern European children
Wismer Fries, Alison B.; Pollak, Seth D.
2005-01-01
To examine the effects of early emotional neglect on children’s affective development, we assessed children who had experienced institutionalized care prior to adoption into family environments. One task required children to identify photographs of facial expressions of emotion. A second task required children to match facial expressions to an emotional situation. Internationally adopted, postinstitutionalized children had difficulty identifying facial expressions of emotion. In addition, postinstitutionalized children had significant difficulty matching appropriate facial expressions to happy, sad, and fearful scenarios. However, postinstitutionalized children performed as well as comparison children when asked to identify and match angry facial expressions. These results are discussed in terms of the importance of emotional input early in life on later developmental organization. PMID:15487600
Bekele, E; Bian, D; Peterman, J; Park, S; Sarkar, N
2017-06-01
Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding the facial emotion processing impairments that may lead to poor social outcome in schizophrenia, which we call the VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control the facial expressions of an avatar, with or without accompanying verbal interaction, with an eye tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his/her affective states, allowing an in-depth, quantitative understanding of the emotion recognition mechanisms of patients with schizophrenia. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine processing of the emotional faces. Preliminary results indicated that there were significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. These preliminary results underscore the utility of such a VR-based system, which enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.
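Systems of this kind must time-align heterogeneous data streams (task events, gaze, physiology) before analysis. The sketch below shows one generic way to resample two streams onto a shared clock; all names, fields, and rates are illustrative assumptions about such a system, not VR-SAAFE's actual interface.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class FusedSample:
        t: float           # seconds since trial onset
        gaze_x: float      # normalized screen coordinates
        gaze_y: float
        heart_rate: float  # beats per minute

    def fuse_streams(gaze_t, gaze_xy, hr_t, hr_bpm, rate_hz=60.0):
        """Linearly interpolate both streams onto a shared timeline.
        Timestamps are assumed sorted; gaze_xy is an (N, 2) array."""
        t = np.arange(max(gaze_t[0], hr_t[0]),
                      min(gaze_t[-1], hr_t[-1]), 1.0 / rate_hz)
        gx = np.interp(t, gaze_t, gaze_xy[:, 0])
        gy = np.interp(t, gaze_t, gaze_xy[:, 1])
        hr = np.interp(t, hr_t, hr_bpm)
        return [FusedSample(*s) for s in zip(t, gx, gy, hr)]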
Palumbo, Letizia; Jellema, Tjeerd
2013-01-01
Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.
Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo; Kawai, Nobuyuki
2012-01-01
A Noh mask, worn by expert actors when performing in a traditional Japanese Noh drama, is suggested to convey countless different facial expressions according to different angles of head/body orientation. The present study addressed the question of how different facial parts of a Noh mask, including the eyebrows, the eyes, and the mouth, may contribute to different emotional expressions. Both experimental situations of active creation and passive recognition of emotional facial expressions were introduced. In Experiment 1, participants either created happy or sad facial expressions, or imitated a face that looked up or down, by actively changing each facial part of a Noh mask image presented on a computer screen. For an upward tilted mask, the eyebrows and the mouth shared common features with sad expressions, whereas the eyes shared features with happy expressions. This contingency tended to be reversed for a downward tilted mask. Experiment 2 further examined which facial parts of a Noh mask are crucial in determining emotional expressions. Participants were exposed to synthesized Noh mask images with different facial parts expressing different emotions. Results clearly revealed that participants primarily used the shape of the mouth in judging emotions. Facial images having the mouth of an upward/downward tilted Noh mask strongly tended to be evaluated as sad/happy, respectively. The results suggest that Noh masks express chimeric emotional patterns, with different facial parts conveying different emotions. This appears consistent with the principles of Noh, which highly appreciate subtle and composite emotional expressions, as well as with the mysterious facial expressions observed in Western art. It was further demonstrated that the mouth serves as a diagnostic feature in characterizing the emotional expressions. This indicates the superiority of biologically-driven factors over the traditionally formulated performing styles when evaluating the emotions of the Noh masks.
Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism?
Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J.; Kilner, James
2017-01-01
Background and aim: Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups. Methods: Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of six primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. Results: For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001) and on the single-emotion sub-scores for happiness, fear, anger, sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, and anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between emotion facial recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). Conclusions: PD patients showed difficulties in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which could be deteriorated in patients with PD. These results open new clinical and rehabilitation perspectives. PMID:28068393
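The rank-based correlation reported in the conclusions can be reproduced in outline with a Spearman correlation over per-emotion scores. The numbers below are fabricated illustrations, not the study's data:

    from scipy.stats import spearmanr

    emotions = ["happiness", "surprise", "anger", "sadness", "disgust", "fear"]
    recognition = [55, 48, 40, 38, 30, 25]   # e.g. group recognition accuracy
    expressivity = [52, 41, 44, 35, 33, 24]  # e.g. rater accuracy for posed faces

    rho, p = spearmanr(recognition, expressivity)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

Spearman's rho correlates the rankings rather than the raw scores, which is what "ranking emotions from the best recognized to the worst" amounts to.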
Positive facial expressions during retrieval of self-defining memories.
Gandolphe, Marie Charlotte; Nandrino, Jean Louis; Delelis, Gérald; Ducro, Claire; Lavallee, Audrey; Saloppe, Xavier; Moustafa, Ahmed A; El Haj, Mohamad
2017-11-14
In this study, we investigated, for the first time, facial expressions during the retrieval of self-defining memories (i.e., those vivid and emotionally intense memories of enduring concerns or unresolved conflicts). Participants self-rated the emotional valence of their self-defining memories, and autobiographical retrieval was analyzed with facial analysis software. This software (FaceReader) synthesizes facial expression information (i.e., from the cheek, lip, and eyebrow muscles) to describe and categorize facial expressions (i.e., neutral, happy, sad, surprised, angry, scared, and disgusted facial expressions). We found that participants showed more emotional than neutral facial expressions during the retrieval of self-defining memories. We also found that participants showed more positive than negative facial expressions during the retrieval of self-defining memories. Interestingly, participants attributed positive valence to the retrieved memories. These findings are the first to demonstrate the consistency between facial expressions and the emotional subjective experience of self-defining memories, and they provide valuable physiological information about the emotional experience of the past.
Watters, Anna J; Harris, Anthony W F; Williams, Leanne M
2018-05-21
Facial expressions signaling threat and mood-congruent loss have been used to probe abnormal neural reactivity in major depressive disorder (MDD) and may be implicated in genetic vulnerability to MDD. This study investigated electro-cortical reactivity to facial expressions in 101 unaffected adult first-degree relatives of probands with MDD and non-relative controls (n = 101). We investigated event-related potentials (ERPs) to five facial expressions of basic emotion (fear, anger, disgust, sadness and happiness) under both subliminal (masked) and conscious (unmasked) presentation conditions, and the source localization of group differences. In the conscious condition, controls showed a distinctly positive-going shift in response to negative versus happy faces, reflected in a greater positivity for the VPP frontally and the P300 parietally, and less negativity for the N200. By contrast, relatives showed less differentiation of emotions, reflected in less VPP and P300 positivity, particularly for anger and disgust, which produced an enhanced N200 for sadness. These group differences were consistently source-localized to the anterior cingulate cortex. The findings contribute new evidence for neural disruptions underlying the differentiation of salient emotions in familial risk for depression. These disruptions occur from the appraisal (∼200 ms post-stimulus) through to the context evaluation (∼300+ ms post-stimulus) phases of emotion processing, consistent with theories that risk for depression involves biased or attenuated processing of emotion. Copyright © 2018. Published by Elsevier B.V.
Saneiro, Mar; Santos, Olga C; Salmeron-Majadas, Sergio; Boticario, Jesus G
2014-01-01
We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expression and body movements that conform to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources such as qualitative, self-reported, physiological, and behavioral information. These data altogether are to train data mining algorithms that serve to automatically identify changes in the learners' affective states when dealing with cognitive tasks which help to provide emotional personalized support.
El Haj, Mohamad; Daoudi, Mohamed; Gallouj, Karim; Moustafa, Ahmed A; Nandrino, Jean-Louis
2018-05-11
Thanks to current advances in the software analysis of facial expressions, there is a burgeoning interest in understanding the emotional facial expressions observed during the retrieval of autobiographical memories. This review describes research on facial expressions during autobiographical retrieval showing distinct emotional facial expressions according to the characteristics of the retrieved memories. More specifically, this research demonstrates that the retrieval of emotional memories can trigger corresponding emotional facial expressions (e.g. positive memories may trigger positive facial expressions), and that facial expressions vary with the specificity, self-relevance, or past-versus-future direction of memory construction. Besides linking research on facial expressions during autobiographical retrieval to the cognitive and affective characteristics of autobiographical memory in general, this review positions this research within the broader context of research on the physiological characteristics of autobiographical retrieval. We also provide several perspectives for clinical studies to investigate facial expressions in populations with deficits in autobiographical memory (e.g. whether autobiographical overgenerality in neurologic and psychiatric populations may trigger fewer emotional facial expressions). In sum, this review demonstrates how the evaluation of facial expressions during autobiographical retrieval may help in understanding the functioning and dysfunction of autobiographical memory.
Aberrant patterns of visual facial information usage in schizophrenia.
Clark, Cameron M; Gosselin, Frédéric; Goghari, Vina M
2013-05-01
Deficits in facial emotion perception have been linked to poorer functional outcome in schizophrenia. However, the relationship between abnormal emotion perception and functional outcome remains poorly understood. To better understand the nature of facial emotion perception deficits in schizophrenia, we used the Bubbles Facial Emotion Perception Task to identify differences in usage of visual facial information in schizophrenia patients (n = 20) and controls (n = 20), when differentiating between angry and neutral facial expressions. As hypothesized, schizophrenia patients required more facial information than controls to accurately differentiate between angry and neutral facial expressions, and they relied on different facial features and spatial frequencies to differentiate these facial expressions. Specifically, schizophrenia patients underutilized the eye regions, overutilized the nose and mouth regions, and virtually ignored information presented at the lowest levels of spatial frequency. In addition, a post hoc one-tailed t test revealed a positive relationship of moderate strength between the degree of divergence from "normal" visual facial information usage in the eye region and lower overall social functioning. These findings provide direct support for aberrant patterns of visual facial information usage in schizophrenia in differentiating between socially salient emotional states. © 2013 American Psychological Association
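The Bubbles method used above reveals a face only through randomly placed Gaussian apertures, applied independently at several spatial-frequency bands, so that accuracy can later be regressed onto which regions and bands were visible. Below is a minimal sketch of the stimulus-generation idea, with Gaussian blurring standing in for formal band-pass filtering and all counts and sizes as illustrative assumptions:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def gaussian_bubbles(shape, n_bubbles, sigma, rng):
        """Random mask of Gaussian apertures, peak-normalized to [0, 1]."""
        mask = np.zeros(shape)
        ys = rng.integers(0, shape[0], n_bubbles)
        xs = rng.integers(0, shape[1], n_bubbles)
        mask[ys, xs] = 1.0
        mask = gaussian_filter(mask, sigma)
        return mask / mask.max()

    def bubbles_stimulus(face, n_bubbles=(10, 8, 6, 4, 2), rng=None):
        """Sample a grayscale (float array) face at five frequency bands."""
        rng = rng or np.random.default_rng()
        blurred = [face] + [gaussian_filter(face, 2.0 ** k) for k in range(1, 6)]
        bands = [blurred[k] - blurred[k + 1] for k in range(5)]  # band-pass layers
        out = blurred[5].copy()  # coarsest residual stays visible
        for k, band in enumerate(bands):
            out += band * gaussian_bubbles(face.shape, n_bubbles[k],
                                           2.0 ** (k + 1), rng)
        return out

Across many trials, correlating response accuracy with the random masks identifies the facial regions and spatial frequencies a group actually uses, which is how under-use of the eyes or of the lowest spatial frequencies can be detected.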
Dynamic Displays Enhance the Ability to Discriminate Genuine and Posed Facial Expressions of Emotion
Namba, Shushi; Kabir, Russell S.; Miyatani, Makoto; Nakao, Takashi
2018-01-01
Accurately gauging the emotional experience of another person is important for navigating interpersonal interactions. This study investigated whether perceivers are capable of distinguishing between unintentionally expressed (genuine) and intentionally manipulated (posed) facial expressions attributed to four major emotions: amusement, disgust, sadness, and surprise. Sensitivity to this discrimination was explored by comparing unstaged dynamic and static facial stimuli and analyzing the results with signal detection theory. Participants indicated whether facial stimuli presented on a screen depicted a person showing a given emotion and whether that person was feeling a given emotion. The results showed that genuine displays were evaluated more as felt expressions than posed displays for all target emotions presented. In addition, sensitivity to the perception of emotional experience, or discriminability, was enhanced in dynamic facial displays, but was less pronounced in the case of static displays. This finding indicates that dynamic information in facial displays contributes to the ability to accurately infer the emotional experiences of another person. PMID:29896135
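Sensitivity (discriminability) in the signal-detection sense is typically computed as d', the distance between the z-transformed hit and false-alarm rates. A minimal sketch with a standard correction for extreme rates follows; the counts in the example are fabricated:

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        """d' = z(hit rate) - z(false-alarm rate), with a log-linear
        correction so rates of 0 or 1 do not yield infinite z-scores."""
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # "Hit" = judging a genuine display as felt; "false alarm" = judging a
    # posed display as felt.
    # d_prime(hits=38, misses=12, false_alarms=20, correct_rejections=30)  # ~0.94

Higher d' for dynamic than for static stimuli is the pattern the study reports: better separation of genuine from posed displays when motion information is available.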
[Negative symptoms, emotion and cognition in schizophrenia].
Fakra, E; Belzeaux, R; Azorin, J-M; Adida, M
2015-12-01
For a long time, the treatment of schizophrenia focused essentially on managing positive symptoms. Yet, even if these symptoms are the most noticeable, negative symptoms are more enduring, resistant to pharmacological treatment, and associated with a worse prognosis. In the last two decades, attention has shifted towards the cognitive deficit, as this deficit is most robustly associated with functional outcome. But it appears that the modest improvement in cognition obtained in schizophrenia through pharmacological treatment or, more purposely, by cognitive enhancement therapy has led only to limited amelioration of functional outcome. Authors have claimed that pure cognitive processes, such as those evaluated and trained in many of these programs, may be too distant from real-life conditions, as the latter are largely based on social interactions. Consequently, the field of social cognition, at the interface of cognition and emotion, has emerged. In the first part of this article we examine the links, in schizophrenia, between negative symptoms, cognition and emotions from a therapeutic standpoint. Nonetheless, the investigation of emotion in schizophrenia may also hold relevant promise for understanding the pathophysiology of this disorder. In the second part, we illustrate this research by relying on the heuristic value of an elementary marker of social cognition, facial affect recognition. Facial affect recognition has repeatedly been reported to be impaired in schizophrenia, and some authors have argued that this deficit could constitute an endophenotype of the illness. We examine how facial affect processing has been used to explore broader emotion dysfunction in schizophrenia through behavioural and imaging studies. In particular, fMRI paradigms using facial affect have shown particular patterns of amygdala engagement in schizophrenia, suggesting an intact potential to engage the limbic system which may, however, not be advantageous. Finally, we analyse facial affect processing at a cognitive-perceptual level, and the aptitude in schizophrenia to manipulate featural and configural information in faces. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Facial expressions of emotion and the course of conjugal bereavement.
Bonanno, G A; Keltner, D
1997-02-01
The common assumption that emotional expression mediates the course of bereavement is tested. Competing hypotheses about the direction of mediation were formulated from the grief work and social-functional accounts of emotional expression. Facial expressions of emotion in conjugally bereaved adults were coded at 6 months post-loss as they described their relationship with the deceased; grief and perceived health were measured at 6, 14, and 25 months. Facial expressions of negative emotion, in particular anger, predicted increased grief at 14 months and poorer perceived health through 25 months. Facial expressions of positive emotion predicted decreased grief through 25 months and a positive but nonsignificant relation to perceived health. Predictive relations between negative and positive emotional expression persisted when initial levels of self-reported emotion, grief, and health were statistically controlled, demonstrating the mediating role of facial expressions of emotion in adjustment to conjugal loss. Theoretical and clinical implications are discussed.
Facial recognition in primary focal dystonia.
Rinnerthaler, Martina; Benecke, Cord; Bartha, Lisa; Entner, Tanja; Poewe, Werner; Mueller, Joerg
2006-01-01
The basal ganglia seem to be involved in emotional processing. Primary dystonia is a movement disorder considered to result from basal ganglia dysfunction, and the aim of the present study was to investigate emotion recognition in patients with primary focal dystonia. Thirty-two patients with primary cranial (n=12) and cervical (n=20) dystonia were compared to 32 healthy controls matched for age, sex, and educational level on the facially expressed emotion labeling (FEEL) test, a computer-based tool measuring a person's ability to recognize facially expressed emotions. Patients with cognitive impairment or depression were excluded. None of the patients received medication with a possible cognitive side effect profile and only those with mild to moderate dystonia were included. Patients with primary dystonia showed isolated deficits in the recognition of disgust (P=0.007), while no differences between patients and controls were found with regard to the other emotions (fear, happiness, surprise, sadness, and anger). The findings of the present study add further evidence to the conception that dystonia is not only a motor but a complex basal ganglia disorder including selective emotion recognition disturbances. Copyright (c) 2005 Movement Disorder Society.
Dawel, Amy; O'Kearney, Richard; McKone, Elinor; Palermo, Romina
2012-11-01
The present meta-analysis aimed to clarify whether deficits in emotion recognition in psychopathy are restricted to certain emotions and modalities or whether they are more pervasive. We also attempted to assess the influence of other important variables: age, and the affective factor of psychopathy. A systematic search of electronic databases and a subsequent manual search identified 26 studies that included 29 experiments (N = 1376) involving six emotion categories (anger, disgust, fear, happiness, sadness, surprise) across three modalities (facial, vocal, postural). Meta-analyses found evidence of pervasive impairments across modalities (facial and vocal) with significant deficits evident for several emotions (i.e., not only fear and sadness) in both adults and children/adolescents. These results are consistent with recent theorizing that the amygdala, which is believed to be dysfunctional in psychopathy, has a broad role in emotion processing. We discuss limitations of the available data that restrict the ability of meta-analysis to consider the influence of age and separate the sub-factors of psychopathy, highlighting important directions for future research. Copyright © 2012 Elsevier Ltd. All rights reserved.
Diano, Matteo; Tamietto, Marco; Celeghin, Alessia; Weiskrantz, Lawrence; Tatu, Mona-Karina; Bagnis, Arianna; Duca, Sergio; Geminiani, Giuliano; Cauda, Franco; Costa, Tommaso
2017-03-27
The quest to characterize the neural signature distinctive of different basic emotions has recently come under renewed scrutiny. Here we investigated whether facial expressions of different basic emotions modulate the functional connectivity of the amygdala with the rest of the brain. To this end, we presented seventeen healthy participants (8 females) with facial expressions of anger, disgust, fear, happiness, sadness and emotional neutrality and analyzed the amygdala's psychophysiological interaction (PPI). PPI can reveal how inter-regional amygdala communications change dynamically depending on the perception of various emotional expressions, recruiting different brain networks compared with the functional interactions entertained during perception of neutral expressions. We found that for each emotion the amygdala recruited a distinctive and spatially distributed set of structures to interact with. These changes in amygdala connectivity patterns characterize the dynamic signature prototypical of individual emotion processing, and seemingly represent a neural mechanism that serves to implement the distinctive influence that each emotion exerts on perceptual, cognitive, and motor responses. Beyond these differences, all emotions enhanced amygdala functional integration with premotor cortices compared to neutral faces. The present findings thus converge to reconceptualise the brain-emotion structure-function relation, moving from the traditional one-to-one mapping toward a network-based and dynamic perspective.
How does context affect assessments of facial emotion? The role of culture and age.
Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara
2011-03-01
People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. (c) 2011 APA, all rights reserved.
Zhang, Heming; Chen, Xuhai; Chen, Shengdong; Li, Yansong; Chen, Changming; Long, Quanshan; Yuan, Jiajin
2018-05-09
Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, currently little is known about how emotion perception and its neural substrates differ across facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primers and bimodal materials as target) and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region of interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of the target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosodies.
Kohn, Nils; Fernández, Guillén
2017-12-06
Our surroundings provide a host of sensory input, which we cannot fully process without streamlining and automatic processing. Levels of automaticity differ for different cognitive and affective processes, and situational and contextual interactions between cognitive and affective processes in turn influence the level of automaticity. Automaticity can be measured by interference in Stroop tasks. We applied an emotional version of the Stroop task to investigate how stress, as a contextual factor, influences the affective valence-dependent level of automaticity. 120 young, healthy men were investigated for behavioral and brain interference following a stress induction or control procedure in a counterbalanced cross-over design. Although Stroop interference was always observed, sex and emotion of the face strongly modulated interference, which was larger for fearful and male faces. These effects suggest higher automaticity when processing happy and also female faces. Supporting the behavioral patterns, brain data show lower interference-related brain activity in executive control-related regions in response to happy and female faces. In the absence of behavioral stress effects, congruent compared to incongruent trials (reverse interference) showed little to no deactivation under stress in response to happy female and fearful male trials. These congruency effects are potentially based on altered context- and stress-related facial processing that interacts with sex-emotion stereotypes. Results indicate that sex and facial emotion modulate Stroop interference in brain and behavior. These effects can be explained by altered response difficulty as a consequence of the contextual and stereotype-related modulation of automaticity. Copyright © 2017 Elsevier Ltd. All rights reserved.
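As a concrete illustration of how such an interference score is computed, here is a minimal sketch in Python; the trial table, column names, and RT values are hypothetical and not taken from this study.

```python
# A minimal sketch of emotional Stroop interference scoring on a hypothetical
# trial table; column names and RT values are illustrative only.
import pandas as pd

trials = pd.DataFrame({
    "face_sex":  ["male", "female"] * 4,
    "emotion":   ["fearful"] * 4 + ["happy"] * 4,
    "congruent": [True, True, False, False] * 2,
    "rt_ms":     [640, 620, 710, 670, 600, 585, 640, 615],
})

# Interference = mean incongruent RT minus mean congruent RT, computed
# separately for each face-sex x emotion cell.
mean_rt = (trials.groupby(["face_sex", "emotion", "congruent"])["rt_ms"]
                 .mean()
                 .unstack("congruent"))
interference = mean_rt[False] - mean_rt[True]
print(interference)  # larger values indicate weaker automaticity for that cell
```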
Face-to-face: Perceived personal relevance amplifies face processing.
Bublatzky, Florian; Pittig, Andre; Schupp, Harald T; Alpers, Georg W
2017-05-01
The human face conveys emotional and social information, but it is not well understood how these two aspects influence face perception. In order to model a group situation, two faces displaying happy, neutral or angry expressions were presented. Importantly, faces were either facing the observer, or they were presented in profile view directed towards, or looking away from, each other. In Experiment 1 (n = 64), face pairs were rated regarding perceived relevance, wish-to-interact, and displayed interactivity, as well as valence and arousal. All variables revealed main effects of facial expression (emotional > neutral) and face orientation (facing observer > towards > away), and interactions indicated that the evaluation of emotional faces varies strongly with their orientation. Experiment 2 (n = 33) examined the temporal dynamics of perceptual-attentional processing of these face constellations with event-related potentials. Processing of emotional and neutral faces differed significantly in N170 amplitudes, early posterior negativity (EPN), and sustained positive potentials. Importantly, selective emotional face processing varied as a function of face orientation, indicating early emotion-specific (N170, EPN) and late threat-specific effects (LPP, sustained positivity). Taken together, perceived personal relevance to the observer, conveyed by facial expression and face direction, amplifies emotional face processing within triadic group situations. © The Author (2017). Published by Oxford University Press.
Sfärlea, Anca; Greimel, Ellen; Platt, Belinda; Bartling, Jürgen; Schulte-Körne, Gerd; Dieler, Alica C
2016-09-01
The present study explored the neurophysiological correlates of the perception and recognition of emotional facial expressions in adolescent anorexia nervosa (AN) patients using event-related potentials (ERPs). We included 20 adolescent girls with AN and 24 healthy girls and recorded ERPs during a passive viewing task and three active tasks requiring the processing of emotional faces at varying processing depths; one of the tasks also assessed emotion recognition abilities behaviourally. Despite the absence of behavioural differences, we found that across all tasks AN patients exhibited a less pronounced early posterior negativity (EPN) in response to all facial expressions compared to controls. The EPN is an ERP component reflecting an automatic, perceptual processing stage which is modulated by the intrinsic salience of a stimulus. Hence, the less pronounced EPN in anorexic girls suggests that they might perceive other people's faces as less intrinsically relevant, i.e. as less "important", than healthy girls do. Copyright © 2016 Elsevier B.V. All rights reserved.
Neural responses to facial expressions support the role of the amygdala in processing threat
Sormaz, Mladen; Flack, Tessa; Asghar, Aziz U. R.; Fan, Siyan; Frey, Julia; Manssuer, Luis; Usten, Deniz; Young, Andrew W.; Andrews, Timothy J.
2014-01-01
The amygdala is known to play an important role in the response to facial expressions that convey fear. However, it remains unclear whether the amygdala’s response to fear reflects its role in the interpretation of danger and threat, or whether it is to some extent activated by all facial expressions of emotion. Previous attempts to address this issue using neuroimaging have been confounded by differences in the use of control stimuli across studies. Here, we address this issue using a block design functional magnetic resonance imaging paradigm, in which we compared the response to face images posing expressions of fear, anger, happiness, disgust and sadness with a range of control conditions. The responses in the amygdala to different facial expressions were compared with the responses to a non-face condition (buildings), to mildly happy faces and to neutral faces. Results showed that only fear and anger elicited significantly greater responses compared with the control conditions involving faces. Overall, these findings are consistent with the role of the amygdala in processing threat, rather than in the processing of all facial expressions of emotion, and demonstrate the critical importance of the choice of comparison condition to the pattern of results. PMID:24097376
Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time.
Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G
2014-01-20
Designed by biological and social evolutionary pressures, facial expressions of emotion comprise specific facial movements to support a near-optimal system of signaling and decoding. Although facial expressions are highly dynamic, little is known about the form and function of their temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously to optimize categorization of the six classic emotions, or sequentially to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling, information theory, and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of "biologically basic to socially specific" information over time. Early in the signaling dynamics, facial expressions systematically transmit few, biologically rooted face signals supporting the categorization of fewer elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). Here, we show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication comprises six basic (i.e., psychologically irreducible) categories, and instead suggesting four. Copyright © 2014 Elsevier Ltd. All rights reserved.
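To illustrate the coarse-to-fine logic of the Bayesian-classifier analysis, here is a minimal, simulated sketch: an action unit shared by two categories early in the expression carries little diagnostic information, while a later, diverging action unit supports accurate categorization. The data, action units, and classifier choice are illustrative assumptions, not the authors' actual pipeline.

```python
# A minimal sketch of Bayesian decoding of emotion category from action-unit
# (AU) measurements taken early vs late in the expression; data are simulated.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
labels = rng.integers(0, 2, n)            # 0 = fear, 1 = surprise

# Early window: both categories share "biologically basic" eye widening,
# so the early AU carries little category information.
early_au = 0.8 + 0.1 * rng.standard_normal(n)
# Late window: a diagnostic AU diverges between the two categories.
late_au = labels * 0.6 + 0.1 * rng.standard_normal(n)

for name, feat in [("early", early_au), ("late", late_au)]:
    acc = cross_val_score(GaussianNB(), feat.reshape(-1, 1), labels, cv=5).mean()
    print(f"{name} window decoding accuracy: {acc:.2f}")
# Expected: near-chance early, high late -- a coarse-to-fine signal hierarchy.
```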
Svärd, Joakim; Wiens, Stefan; Fischer, Håkan
2012-01-01
In the aging literature it has been shown that even though emotion recognition performance decreases with age, the decrease is smaller for happiness than for other facial expressions. Studies in younger adults have also revealed that happy faces are more strongly attended to and better recognized than other emotional facial expressions. Thus, there might be a largely age-independent happy face advantage in facial expression recognition. By using a backward masking paradigm and varying stimulus onset asynchronies (17–267 ms), the temporal development of a happy face advantage, on a continuum from low to high levels of visibility, was examined in younger and older adults. Results showed that across age groups, recognition performance for happy faces was better than for neutral and fearful faces at durations longer than 50 ms. Importantly, the results showed a happy face advantage already during early processing of emotional faces in both younger and older adults. This advantage is discussed in terms of processing of salient perceptual features and elaborative processing of the happy face. We also investigated the combined effect of age and neuroticism on emotional face processing. The rationale was based on previous findings of age-related differences in physiological arousal to emotional pictures and a relation between arousal and neuroticism. Across all durations, there was an interaction between age and neuroticism, showing that being high in neuroticism might be disadvantageous for younger, but not older, adults' emotion recognition performance during arousal-enhancing tasks. These results indicate that there is a relation between aging, neuroticism, and performance, potentially related to physiological arousal. PMID:23226135
Yang, Chengqing; Zhang, Tianhong; Li, Zezhi; Heeramun-Aubeeluck, Anisha; Liu, Na; Huang, Nan; Zhang, Jie; He, Leiying; Li, Hui; Tang, Yingying; Chen, Fazhan; Liu, Fei; Wang, Jijun; Lu, Zheng
2015-10-08
Although many studies have examined executive functions and facial emotion recognition in people with schizophrenia, few have focused on the correlation between them. Furthermore, their relationship in the siblings of patients also remains unclear. The aim of the present study was to examine the correlation between executive functions and facial emotion recognition in patients with first-episode schizophrenia and their siblings. Thirty patients with first-episode schizophrenia, twenty-six of their siblings, and thirty healthy controls were enrolled. They completed facial emotion recognition tasks using the Ekman Standard Faces Database, and executive functioning was measured with the Wisconsin Card Sorting Test (WCST). Hierarchical regression analysis was applied to assess the correlation between executive functions and facial emotion recognition. Our study found that in siblings, the accuracy in recognizing low-intensity 'disgust' emotion was negatively correlated with the total correct rate in the WCST (r = -0.614, p = 0.023), but positively correlated with the total errors in the WCST (r = 0.623, p = 0.020); the accuracy in recognizing 'neutral' emotion was positively correlated with the total error rate in the WCST (r = 0.683, p = 0.014) and negatively correlated with the total correct rate in the WCST (r = -0.677, p = 0.017). People with schizophrenia showed an impairment in facial emotion recognition when identifying moderate-intensity 'happy' facial emotion, the accuracy of which was significantly correlated with the number of completed categories in the WCST (R(2) = 0.432, P < .05). There were no correlations between executive functions and facial emotion recognition in the healthy control group. Our study demonstrated that facial emotion recognition impairment correlated with executive function impairment in people with schizophrenia and their unaffected siblings, but not in healthy controls.
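The core sibling analysis reduces to bivariate correlations between WCST indices and recognition accuracy. A minimal sketch with simulated scores follows; variable names and values are hypothetical.

```python
# A minimal sketch of the correlational analysis described above, with
# simulated scores for 26 siblings; variable names are hypothetical.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
wcst_correct = rng.normal(60, 10, 26)                          # WCST total correct rate
disgust_acc = 80 - 0.6 * wcst_correct + rng.normal(0, 5, 26)   # negatively coupled by construction

r, p = pearsonr(disgust_acc, wcst_correct)
print(f"r = {r:.3f}, p = {p:.3f}")   # compare with the reported r = -0.614
```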
Kang, Guanlan; Zhou, Xiaolin; Wei, Ping
2015-09-01
The present study investigated the effect of reward expectation and spatial orienting on the processing of emotional facial expressions, using a spatial cue-target paradigm. A colored cue was presented at the left or right side of the central fixation point, with its color indicating the monetary reward stakes of a given trial (incentive vs. non-incentive), followed by the presentation of an emotional facial target (angry vs. neutral) at a cued or un-cued location. Participants were asked to discriminate the emotional expression of the target, with the cue-target stimulus onset asynchrony being 200-300 ms in Experiment 1 and 950-1250 ms in Experiment 2a (without a fixation cue) and Experiment 2b (with a fixation cue), producing a spatial facilitation effect and an inhibition-of-return effect, respectively. The results of all the experiments revealed faster reaction times in the monetary incentive condition than in the non-incentive condition, demonstrating that reward facilitates task performance. An interaction between reward expectation and the emotion of the target was evident in all three experiments, with larger reward effects for angry faces than for neutral faces. This interaction was not affected by spatial orienting. These findings demonstrate that incentive motivation improves task performance and increases sensitivity to angry faces, irrespective of spatial orienting and reorienting processes.
Ibáñez, Agustín; Riveros, Rodrigo; Hurtado, Esteban; Gleichgerrcht, Ezequiel; Urquina, Hugo; Herrera, Eduar; Amoruso, Lucía; Reyes, Migdyrai Martin; Manes, Facundo
2012-01-30
Previous studies have reported facial emotion recognition impairments in schizophrenic patients, as well as abnormalities in the N170 component of the event-related potential. Current research on schizophrenia highlights the importance of complexly-inherited brain-based deficits. In order to examine the N170 markers of face structural and emotional processing, DSM-IV diagnosed schizophrenia probands (n=13), unaffected first-degree relatives from multiplex families (n=13), and control subjects (n=13) matched by age, gender and educational level, performed a categorization task which involved words and faces with positive and negative valence. The N170 component, while present in relatives and control subjects, was reduced in patients, not only for faces, but also for face-word differences, suggesting a deficit in structural processing of stimuli. Control subjects showed N170 modulation according to the valence of facial stimuli. However, this discrimination effect was found to be reduced both in patients and relatives. This is the first report showing N170 valence deficits in relatives. Our results suggest a generalized deficit affecting the structural encoding of faces in patients, as well as the emotion discrimination both in patients and relatives. Finally, these findings lend support to the notion that cortical markers of facial discrimination can be validly considered as vulnerability markers. © 2011 Elsevier Ireland Ltd. All rights reserved.
Facial emotion recognition and borderline personality pathology.
Meehan, Kevin B; De Panfilis, Chiara; Cain, Nicole M; Antonucci, Camilla; Soliani, Antonio; Clarkin, John F; Sambataro, Fabio
2017-09-01
The impact of borderline personality pathology on facial emotion recognition has been in dispute, with impaired, comparable, and enhanced accuracy all reported in high borderline personality groups. Discrepancies are likely driven by variations in facial emotion recognition tasks across studies (stimulus type/intensity) and heterogeneity in borderline personality pathology. This study evaluates facial emotion recognition for neutral and negative emotions (fear/sadness/disgust/anger) presented at varying intensities. Effortful control was evaluated as a moderator of facial emotion recognition in borderline personality. Non-clinical multicultural undergraduates (n = 132) completed a morphed facial emotion recognition task of neutral and negative emotional expressions across different intensities (100% Neutral; 25%/50%/75% Emotion) and self-reported borderline personality features and effortful control. Greater borderline personality features related to decreased accuracy in detecting neutral faces, but increased accuracy in detecting negative emotion faces, particularly at low-intensity thresholds. This pattern was moderated by effortful control; for individuals with low but not high effortful control, greater borderline personality features related to misattributions of emotion to neutral expressions, and enhanced detection of low-intensity emotional expressions. Individuals with high borderline personality features may therefore exhibit a bias toward detecting negative emotions that are not or barely present; however, good self-regulatory skills may protect against this potential social-cognitive vulnerability. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Empathy costs: Negative emotional bias in high empathisers.
Chikovani, George; Babuadze, Lasha; Iashvili, Nino; Gvalia, Tamar; Surguladze, Simon
2015-09-30
Excessive empathy has been associated with compassion fatigue in health professionals and caregivers. We investigated the effect of empathy on emotion processing in 137 healthy individuals of both sexes. We tested the hypothesis that high empathy may underlie increased sensitivity to negative emotion recognition, and that this sensitivity may interact with gender. Facial emotion stimuli comprised happy, angry, fearful, and sad faces presented at different intensities (mild and prototypical) and different durations (500 ms and 2000 ms). The parameters of emotion processing were discrimination accuracy, response bias, and reaction time. We found that higher empathy was associated with better recognition of all emotions. We also demonstrated that higher empathy was associated with a response bias towards sad and fearful faces. The reaction time analysis revealed that higher empathy in females was associated with faster (compared with males) recognition of mildly sad faces of brief duration. We conclude that although empathic abilities conferred advantages in the recognition of all facial emotional expressions, the bias towards emotional negativity may potentially carry a risk for empathic distress. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Spontaneous and posed facial expression in Parkinson's disease.
Smith, M C; Smith, M K; Ellgring, H
1996-09-01
Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2-5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their emotional experience on 9-point Likert scales. The PD group showed significantly less overall facial reactivity than did controls when viewing the films. The predicted Group X Condition (spontaneous vs. posed) interaction effect on smile intensity was found when PD participants with more severe disease were compared with those with milder disease and with controls. In contrast, ratings of emotional experience were similar for both groups. Depression was positively associated with emotion rating but not with measures of facial activity. Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact.
Misinterpretation of facial expression: a cross-cultural study.
Shioiri, T; Someya, T; Helmeste, D; Tang, S W
1999-02-01
Accurately recognizing facial emotional expressions is important in psychiatrist-patient interactions. This might be difficult when the physician and patients are from different cultures. More than two decades of research on facial expressions have documented the universality of the emotions of anger, contempt, disgust, fear, happiness, sadness, and surprise. In contrast, some research data support the concept that there are significant cultural differences in the judgment of emotion. In this pilot study, the recognition of emotional facial expressions by 123 Japanese subjects was evaluated using the Japanese and Caucasian Facial Expression of Emotion (JACFEE) photos. The results indicated that Japanese subjects experienced difficulties in recognizing some emotional facial expressions and misinterpreted others, relative to the emotions the posers intended, when compared to previous studies using American subjects. Interestingly, the sex and cultural background of the poser did not appear to influence the accuracy of recognition. The data suggest that in this young Japanese sample, judgment of certain emotional facial expressions differed significantly from that of American subjects. Further exploration in this area is warranted due to its importance in cross-cultural clinician-patient interactions.
Park, Soowon; Kim, Taehoon; Shin, Seong A; Kim, Yu Kyeong; Sohn, Bo Kyung; Park, Hyeon-Ju; Youn, Jung-Hae; Lee, Jun-Young
2017-01-01
Background: Facial emotion recognition (FER) is impaired in individuals with frontotemporal dementia (FTD) and Alzheimer's disease (AD) when compared to healthy older adults. Since deficits in emotion recognition are closely related to caregiver burden and social interactions, researchers have a fundamental interest in FER performance in patients with dementia. Purpose: The purpose of this study was to identify the performance profiles for six facial emotions (i.e., fear, anger, disgust, sadness, surprise, and happiness) and neutral faces among Korean healthy controls (HCs) and individuals with mild cognitive impairment (MCI), AD, and FTD. Additionally, the neuroanatomical correlates of facial emotions were investigated. Methods: A total of 110 older adult participants (33 HC, 32 MCI, 32 AD, 13 FTD) were recruited from two medical centers in metropolitan areas of South Korea. These individuals underwent an FER test that assessed the recognition of emotions or the absence of emotion (neutral) in 35 facial stimuli. Repeated-measures two-way analyses of variance were used to examine the distinct profiles of emotion recognition among the four groups. We also performed brain imaging and voxel-based morphometry (VBM) to examine the associations between FER scores and gray matter volume. Results: The mean score for negative emotion recognition (i.e., fear, anger, disgust, and sadness) clearly discriminated FTD participants from HCs and from individuals with MCI and AD [F(3,106) = 10.829, p < 0.001, η2 = 0.235], whereas the mean score for positive emotion recognition (i.e., surprise and happiness) did not. A VBM analysis showed that negative emotions were correlated with gray matter volume in anterior temporal regions, whereas positive emotions were related to gray matter volume in fronto-parietal regions. Conclusion: Impairment of negative FER in patients with FTD is cross-cultural. The discrete neural correlates of FER indicate that emotion recognition processing is a multi-modal system in the brain. Focusing on negative emotion recognition is a more effective way to discriminate FTD from healthy aging, MCI, and AD in older Korean adults. PMID:29249960
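A repeated-measures ANOVA of the kind reported can be run with statsmodels' AnovaRM. The sketch below simulates long-format data with only the within-subject valence factor; the between-group diagnosis factor would require a mixed ANOVA, which AnovaRM does not implement, so this is a simplified illustration rather than the authors' full design.

```python
# A minimal sketch of a repeated-measures ANOVA on simulated FER scores;
# factor names, subject count, and effect sizes are illustrative.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(3)
rows = []
for subject in range(20):
    for valence in ("negative", "positive"):
        # Positive-emotion recognition simulated as easier than negative.
        score = rng.normal(0.7 if valence == "positive" else 0.5, 0.1)
        rows.append({"subject": subject, "valence": valence, "fer_score": score})
df = pd.DataFrame(rows)

# Within-subject factor only; a diagnosis group factor would need a mixed model.
print(AnovaRM(df, depvar="fer_score", subject="subject", within=["valence"]).fit())
```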
Sedda, Anna; Petito, Sara; Guarino, Maria; Stracciari, Andrea
2017-07-14
Most studies to date have shown an impairment in the recognition of facial displays of disgust in Parkinson disease. A general impairment in disgust processing in patients with Parkinson disease might adversely affect their social interactions, given the relevance of this emotion for human relations. However, despite the importance of faces, disgust is also expressed through other formats of visual stimuli, such as sentences and visual images. The aim of our study was to explore disgust processing in a sample of patients affected by Parkinson disease by means of various tests tackling not only facial recognition but also other formats of visual stimuli through which disgust can be recognized. Our results confirm that patients are impaired in recognizing facial displays of disgust. Further analyses show that patients are also impaired and slower for other facial expressions, with the only exception of happiness. Notably, however, patients with Parkinson disease processed visual images and sentences as well as controls did. Our findings show a dissociation between different formats of visual stimuli of disgust, suggesting that Parkinson disease is not characterized by a general compromise of disgust processing, as often suggested. The involvement of the basal ganglia-frontal cortex system might spare some cognitive components of emotional processing, related to memory and culture, at least for disgust. Copyright © 2017 Elsevier B.V. All rights reserved.
The effect of constraining eye-contact during dynamic emotional face perception—an fMRI study
Zurcher, Nicole R.; Lassalle, Amandine; Hippolyte, Loyse; Ward, Noreen; Johnels, Jakob Åsberg
2017-01-01
Eye-contact modifies how we perceive emotions and modulates activity in the social brain network. Here, using fMRI, we demonstrate that adding a fixation cross in the eye region of dynamic facial emotional stimuli significantly increases activation in the social brain of healthy, neurotypical participants when compared with activation for the exact same stimuli observed in a free-viewing mode. In addition, using PPI analysis, we show that the degree of amygdala connectivity with the rest of the brain is enhanced for the constrained view for all emotions tested except for fear, and that anxiety and alexithymia modulate the strength of amygdala connectivity for each emotion differently. Finally, we show that autistic traits have opposite effects on amygdala connectivity for fearful and angry emotional expressions, suggesting that these emotions should be treated separately in studies investigating facial emotion processing. PMID:28402536
Facial Expression Generation from Speaker's Emotional States in Daily Conversation
NASA Astrophysics Data System (ADS)
Mori, Hiroki; Ohshima, Koh
A framework for generating facial expressions from emotional states in daily conversation is described. It provides a mapping between emotional states and facial expressions, where the former are represented by vectors with psychologically defined abstract dimensions, and the latter are coded with the Facial Action Coding System. To obtain the mapping, parallel data with rated emotional states and facial expressions were collected for utterances of a female speaker, and a neural network was trained on the data. The effectiveness of the proposed method was verified in a subjective evaluation test: the Mean Opinion Score for the suitability of the generated facial expressions was 3.86 for the speaker, close to that of hand-made facial expressions.
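The mapping itself can be sketched as a small multi-output regression network. Everything below is a toy stand-in: the 2-D valence/arousal representation, the five action-unit targets, and the layer size are assumptions for illustration, not the paper's actual data or architecture.

```python
# A minimal sketch of learning an emotion-to-FACS mapping with a small MLP.
# Emotional states are hypothetical 2-D (valence, arousal) vectors; targets
# are intensities of five hypothetical action units.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 2))      # rated emotional states
AU = np.clip(X @ rng.uniform(0, 1, (2, 5))     # synthetic AU intensities
             + 0.1 * rng.standard_normal((500, 5)), 0, 1)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, AU)                                # learn the emotion-to-AU mapping

happy_state = np.array([[0.8, 0.4]])            # high valence, moderate arousal
print(model.predict(happy_state))               # predicted action-unit intensities
```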
Distinct facial processing in schizophrenia and schizoaffective disorders
Chen, Yue; Cataldo, Andrea; Norton, Daniel J; Ongur, Dost
2011-01-01
Although schizophrenia and schizoaffective disorders have both similar and differing clinical features, it is not well understood whether similar or differing pathophysiological processes mediate patients’ cognitive functions. Using psychophysical methods, this study compared the performances of schizophrenia (SZ) patients, patients with schizoaffective disorder (SA), and a healthy control group in two face-related cognitive tasks: emotion discrimination, which tested perception of facial affect, and identity discrimination, which tested perception of non-affective facial features. Compared to healthy controls, SZ patients, but not SA patients, exhibited deficient performance in both fear and happiness discrimination, as well as identity discrimination. SZ patients, but not SA patients, also showed impaired performance in a theory-of-mind task for which emotional expressions are identified based upon the eye regions of face images. This pattern of results suggests distinct processing of face information in schizophrenia and schizoaffective disorders. PMID:21868199
Harmer, Catherine J; Shelley, Nicholas C; Cowen, Philip J; Goodwin, Guy M
2004-07-01
Antidepressants that inhibit the reuptake of serotonin (SSRIs) or norepinephrine (SNRIs) are effective in the treatment of disorders such as depression and anxiety. Cognitive psychological theories emphasize the importance of correcting negative biases of information processing in the nonpharmacological treatment of these disorders, but it is not known whether antidepressant drugs can directly modulate the neural processing of affective information. The present study therefore assessed the actions of repeated antidepressant administration on perception and memory for positive and negative emotional information in healthy volunteers. Forty-two male and female volunteers were randomly assigned to 7 days of double-blind intervention with the SSRI citalopram (20 mg/day), the SNRI reboxetine (8 mg/day), or placebo. On the final day, facial expression recognition, emotion-potentiated startle response, and memory for affect-laden words were assessed. Questionnaires monitoring mood, hostility, and anxiety were given before and after treatment. In the facial expression recognition task, citalopram and reboxetine reduced the identification of the negative facial expressions of anger and fear. Citalopram also abolished the increased startle response found in the context of negative affective images. Both antidepressants increased the relative recall of positive (versus negative) emotional material. These changes in emotional processing occurred in the absence of significant differences in ratings of mood and anxiety. However, reboxetine decreased subjective ratings of hostility and elevated energy. Short-term administration of two different antidepressant types had similar effects on emotion-related tasks in healthy volunteers, reducing the processing of negative relative to positive emotional material. Such effects of antidepressants may ameliorate the negative biases in information processing that characterize mood and anxiety disorders. They also suggest a mechanism of action potentially compatible with cognitive theories of anxiety and depression.
Bertsch, Katja; Böhnke, Robina; Kruk, Menno R.; Naumann, Ewald
2009-01-01
Aggression is a common behavior which has frequently been explained as involving changes in higher level information processing patterns. Although researchers have started only recently to investigate information processing in healthy individuals while engaged in aggressive behavior, the impact of aggression on information processing beyond an aggressive encounter remains unclear. In an event-related potential study, we investigated the processing of facial expressions (happy, angry, fearful, and neutral) in an emotional Stroop task after experimentally provoking aggressive behavior in healthy participants. Compared to a non-provoked group, these individuals showed increased early (P2) and late (P3) positive amplitudes for all facial expressions. For the P2 amplitude, the effect of provocation was greatest for threat-related expressions. Beyond this, a bias for emotional expressions, i.e., slower reaction times to all emotional expressions, was found in provoked participants with a high level of trait anger. These results indicate significant effects of aggression on information processing, which last beyond the aggressive encounter even in healthy participants. PMID:19826616
Ardizzi, Martina; Sestito, Mariateresa; Martini, Francesca; Umiltà, Maria Alessandra; Ravera, Roberto; Gallese, Vittorio
2014-01-01
Age-group membership effects on the explicit recognition of emotional facial expressions have been widely demonstrated. In this study we investigated whether age-group membership also affects implicit physiological responses, such as facial mimicry and autonomic regulation, during the observation of emotional facial expressions. To this aim, facial electromyography (EMG) and respiratory sinus arrhythmia (RSA) were recorded from teenage and adult participants during the observation of facial expressions performed by teenage and adult models. Results highlighted that teenagers exhibited greater facial EMG responses to peers' facial expressions, whereas adults showed higher RSA responses to adult facial expressions. The different physiological modalities through which young people and adults respond to peers' emotional expressions are likely to reflect two different ways of engaging in social interactions with age peers. The findings confirm that age is an important and powerful social feature that modulates interpersonal interactions by influencing low-level physiological responses. PMID:25337916
Dingle, Genevieve A; Neves, Diana da Costa; Alhadad, Sakinah S J; Hides, Leanne
2018-06-01
Self-report studies show that negative emotional states and ineffective use of emotion regulation strategies are key maintaining factors of substance use disorders (SUD). However, experimental research into emotional processing in adults with SUD is in its infancy. Theoretical conceptualizations of emotion regulation have shifted from a focus on individual (internal) processes to one that encompasses social and interpersonal functions, including the regulation of facial expression of emotion. The purpose of this study was to examine the individual and interpersonal emotion regulation capacity of 35 adults in residential treatment diagnosed with a SUD compared to 35 demographically matched controls (both samples M age = 25 years; 37% females). Participants completed a facial emotion expression flexibility task while viewing emotive images, as well as the Difficulties in Emotion Regulation Scale (DERS) and the Social (Emotion) Expectancy Scale (SES). Adults in SUD treatment experienced significantly more emotion regulation difficulties on all DERS subscales than controls. They also reported higher levels of negative self-evaluation and social expectancies not to feel negative emotions (anxiety and depression) compared to controls. Moreover, when viewing emotive images, the treatment sample showed significantly less flexibility in their emotional expression than the control sample. These findings demonstrate that the awareness, expression, and regulation of emotions are particularly difficult for people with SUD; this may maintain their substance use and provides an important target for treatment. Compared to matched controls, adults with substance use disorders self-report significantly more difficulties with emotional awareness and regulation. Compared to matched controls, adults with substance use disorders report significantly greater expectancies not to show depression and anxiety. When viewing positive and negative images, adults with substance use disorders are significantly less flexible in their facial expression of emotion than matched controls in response to regulatory instructions. Emotion regulation should be measured and addressed as part of substance use disorder treatment. © 2017 Commonwealth of Australia. British Journal of Clinical Psychology © 2017 The British Psychological Society.
Common cues to emotion in the dynamic facial expressions of speech and song
Livingstone, Steven R.; Thompson, William F.; Wanderley, Marcelo M.; Palmer, Caroline
2015-01-01
Speech and song are universal forms of vocalization that may share aspects of emotional expression. Research has focused on parallels in acoustic features, overlooking facial cues to emotion. In three experiments, we compared moving facial expressions in speech and song. In Experiment 1, vocalists spoke and sang statements each with five emotions. Vocalists exhibited emotion-dependent movements of the eyebrows and lip corners that transcended speech–song differences. Vocalists’ jaw movements were coupled to their acoustic intensity, exhibiting differences across emotion and speech–song. Vocalists’ emotional movements extended beyond vocal sound to include large sustained expressions, suggesting a communicative function. In Experiment 2, viewers judged silent videos of vocalists’ facial expressions prior to, during, and following vocalization. Emotional intentions were identified accurately for movements during and after vocalization, suggesting that these movements support the acoustic message. Experiment 3 compared emotional identification in voice-only, face-only, and face-and-voice recordings. Emotion judgements for voice-only singing were poorly identified, yet were accurate for all other conditions, confirming that facial expressions conveyed emotion more accurately than the voice in song, yet were equivalent in speech. Collectively, these findings highlight broad commonalities in the facial cues to emotion in speech and song, yet highlight differences in perception and acoustic-motor production. PMID:25424388
Developmental differences in the neural mechanisms of facial emotion labeling.
Wiggins, Jillian Lee; Adleman, Nancy E; Kim, Pilyoung; Oakes, Allison H; Hsu, Derek; Reynolds, Richard C; Chen, Gang; Pine, Daniel S; Brotman, Melissa A; Leibenluft, Ellen
2016-01-01
Adolescence is a time of increased risk for the onset of psychological disorders associated with deficits in face emotion labeling. We used functional magnetic resonance imaging (fMRI) to examine age-related differences in brain activation while adolescents and adults labeled the emotion on fearful, happy and angry faces of varying intensities [0% (i.e. neutral), 50%, 75%, 100%]. Adolescents and adults did not differ on accuracy to label emotions. In the superior temporal sulcus, ventrolateral prefrontal cortex and middle temporal gyrus, adults show an inverted-U-shaped response to increasing intensities of fearful faces and a U-shaped response to increasing intensities of happy faces, whereas adolescents show the opposite patterns. In addition, adults, but not adolescents, show greater inferior occipital gyrus activation to negative (angry, fearful) vs positive (happy) emotions. In sum, when subjects classify subtly varying facial emotions, developmental differences manifest in several 'ventral stream' brain regions. Charting the typical developmental course of the brain mechanisms of socioemotional processes, such as facial emotion labeling, is an important focus for developmental psychopathology research. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.
Theory of mind and recognition of facial emotion in dementia: challenge to current concepts.
Freedman, Morris; Binns, Malcolm A; Black, Sandra E; Murphy, Cara; Stuss, Donald T
2013-01-01
Current literature suggests that theory of mind (ToM) and recognition of facial emotion are impaired in behavioral variant frontotemporal dementia (bvFTD). In contrast, studies suggest that ToM is spared in Alzheimer disease (AD). However, there is controversy whether recognition of emotion in faces is impaired in AD. This study challenges the concepts that ToM is preserved in AD and that recognition of facial emotion is impaired in bvFTD. ToM, recognition of facial emotion, and identification of emotions associated with video vignettes were studied in bvFTD, AD, and normal controls. ToM was assessed using false-belief and visual perspective-taking tasks. Identification of facial emotion was tested using Ekman and Friesen's pictures of facial affect. After adjusting for relevant covariates, there were significant ToM deficits in bvFTD and AD compared with controls, whereas neither group was impaired in the identification of emotions associated with video vignettes. There was borderline impairment in recognizing angry faces in bvFTD. Patients with AD showed significant deficits on false belief and visual perspective taking, and bvFTD patients were impaired on second-order false belief. We report novel findings challenging the concepts that ToM is spared in AD and that recognition of facial emotion is impaired in bvFTD.
Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N
2015-06-01
The impairment of social cognition, including facial affect recognition, is well established in schizophrenia, and specific cognitive remediation programs focusing on facial affect recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of results across subjects. Therefore, individual abilities should be assessed before proposing such programs. Most research teams apply tasks based on facial affect recognition by Ekman et al. or Gur et al. However, these tasks are not easily applicable in routine clinical practice. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affect recognition impairments in clinical practice. The test is composed of 54 photos and evaluates the recognition of six universal emotions (joy, anger, sadness, fear, disgust and contempt). Each of these emotions is represented in colored photos of 4 different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented for 10 seconds; no time limit for responding is applied. The present study compared scores on the TREF in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia (45 subjects) according to DSM IV-TR criteria. We analysed global scores for all emotions, as well as subscores for each emotion, between these two groups, taking gender differences into account. Our results were consistent with previous findings. Applying the TREF, we confirmed an impairment in facial affect recognition in schizophrenia by showing significant differences between the two groups in global results (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in subscores for each emotion except joy. Scores for women were significantly higher than for men in the population without a psychiatric diagnosis. The study also allowed the identification of cut-off scores; results below 2 standard deviations of the healthy control average (61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool to identify facial affect recognition impairment in schizophrenia. Neuropsychologists who have tried this task report positive feedback. The TREF is easy to use (duration of about 15 minutes), easy to apply in subjects with attentional difficulties, and tests facial affect recognition at ecologically plausible intensity levels. These results have to be confirmed in the future with larger sample sizes and in comparison with other tasks evaluating facial affect recognition processes. Copyright © 2014 L'Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.
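The cut-off rule is simple arithmetic: a score more than 2 standard deviations below the healthy-control mean indicates a deficit. Working backwards from the reported mean (76.45%) and cut-off (61.57%), the control standard deviation is about 7.44%; the sketch below makes that explicit.

```python
# The cut-off logic described above, made explicit. The standard deviation
# is inferred from the reported mean and cut-off, not reported directly.
control_mean = 76.45
control_sd = (76.45 - 61.57) / 2           # ~= 7.44, inferred from reported values
cutoff = control_mean - 2 * control_sd
print(f"deficit threshold: {cutoff:.2f}%")  # 61.57% -> scores below suggest a deficit
```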
Fixation to features and neural processing of facial expressions in a gender discrimination task.
Neath, Karly N; Itier, Roxane J
2015-10-01
Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and whether it can also be seen on other ERP components such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected a general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ∼150 ms and lasting until ∼300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. Copyright © 2015 Elsevier Inc. All rights reserved.
Balconi, Michela; Lucchiari, Claudio
2008-01-01
It remains an open question whether it is possible to assign a single brain operation or psychological function for facial emotion decoding to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study, brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: subliminal (10 ms) vs supraliminal (150 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and the significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) during the 150-350 ms time range: an early oscillatory event showed its peak at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than subliminal elaboration, as well as more by high-arousal (anger and fear) than low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere was discriminant in the emotional processing of faces in comparison with neutral faces.
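For orientation, gamma-band power can be estimated from a single EEG channel with Welch's method, as in the sketch below; the synthetic signal and the 30-45 Hz band limits are illustrative conventions, and the study itself used event-related time-frequency measures rather than this static estimate.

```python
# A minimal sketch of gamma-band power estimation from one EEG channel.
# The signal is synthetic: a 40 Hz oscillation embedded in noise.
import numpy as np
from scipy.signal import welch

fs = 500                                   # sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)
eeg = np.sin(2 * np.pi * 40 * t) + 0.5 * np.random.default_rng(4).standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=fs)
gamma = (freqs >= 30) & (freqs <= 45)
gamma_power = np.trapz(psd[gamma], freqs[gamma])   # integrate PSD over the band
print(f"gamma-band power: {gamma_power:.3f}")
```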
[Emotional intelligence and oscillatory responses to emotional facial expressions].
Kniazev, G G; Mitrofanova, L G; Bocharov, A V
2013-01-01
Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluations of the stimuli and in stronger EEG theta synchronization at an earlier (between 100 and 500 ms after face presentation) processing stage. Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high emotional intelligence subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative ones.
Compound facial expressions of emotion: from basic research to clinical applications
Du, Shichuan; Martinez, Aleix M.
2015-01-01
Emotions are sometimes revealed through facial expressions. When these natural facial articulations involve the contraction of the same muscle groups in people of distinct cultural upbringings, this is taken as evidence of a biological origin of these emotions. While past research had identified facial expressions associated with a single internally felt category (e.g., the facial expression of happiness when we feel joyful), we have recently studied facial expressions observed when people experience compound emotions (e.g., the facial expression of happy surprise when we feel joyful in a surprised way, as, for example, at a surprise birthday party). Our research has identified 17 compound expressions consistently produced across cultures, suggesting that the number of facial expressions of emotion of biological origin is much larger than previously believed. The present paper provides an overview of these findings and shows evidence supporting the view that spontaneous expressions are produced using the same facial articulations previously identified in laboratory experiments. We also discuss the implications of our results in the study of psychopathologies, and consider several open research questions. PMID:26869845
Facial mimicry in its social setting
Seibt, Beate; Mühlberger, Andreas; Likowski, Katja U.; Weyers, Peter
2015-01-01
In interpersonal encounters, individuals often exhibit changes in their own facial expressions in response to the emotional expressions of another person. Such changes are often called facial mimicry. While this first appeared to be an automatic tendency of the perceiver to show the same emotional expression as the sender, evidence is now accumulating that situation, person, and relationship jointly determine whether, and for which emotions, such congruent facial behavior is shown. We review the evidence regarding the moderating influence of such factors on facial mimicry, with a focus on understanding the meaning of facial responses to emotional expressions in a particular constellation. From this, we derive recommendations for a research agenda with a stronger focus on the most common forms of encounters, actual interactions with known others, and on assessing potential mediators of facial mimicry. We conclude that facial mimicry is modulated by many factors: attention deployment and sensitivity, detection of valence, emotional feelings, and social motivations. We posit that these are the more proximal causes of changes in facial mimicry due to changes in its social setting. PMID:26321970
Missana, Manuela; Grigutsch, Maren; Grossmann, Tobias
2014-01-01
We examined the processing of facial expressions of pain and anger in 8-month-old infants and adults by measuring event-related brain potentials (ERPs) and frontal EEG alpha asymmetry. The ERP results revealed that while adults showed a late positive potential (LPP) to emotional expressions that was enhanced to pain expressions, reflecting increased evaluation and emotional arousal to pain expressions, infants showed a negative component (Nc) to emotional expressions that was enhanced to angry expressions, reflecting increased allocation of attention to angry faces. Moreover, infants and adults showed opposite patterns in their frontal asymmetry responses to pain and anger, suggesting developmental differences in the motivational processes engendered by these facial expressions. These findings are discussed in the light of associated individual differences in infant temperament and adult dispositional empathy. PMID:24705497
Wu, Minjie; Kujawa, Autumn; Lu, Lisa H.; Fitzgerald, Daniel A.; Klumpp, Heide; Fitzgerald, Kate D.; Monk, Christopher S.; Phan, K. Luan
2016-01-01
The ability to process and respond to emotional facial expressions is a critical skill for healthy social and emotional development. There has been growing interest in understanding the neural circuitry underlying development of emotional processing, with previous research implicating functional connectivity between amygdala and frontal regions. However, existing work has focused on threatening emotional faces, raising questions regarding the extent to which these developmental patterns are specific to threat or to emotional face processing more broadly. In the current study, we examined age-related changes in brain activity and amygdala functional connectivity during an fMRI emotional face matching task (including angry, fearful and happy faces) in 61 healthy subjects aged 7–25 years. We found age-related decreases in ventral medial prefrontal cortex (vmPFC) activity in response to happy faces but not to angry or fearful faces, and an age-related change (shifting from positive to negative correlation) in amygdala-anterior cingulate cortex/medial prefrontal cortex (ACC/mPFC) functional connectivity to all emotional faces. Specifically, positive correlations between amygdala and ACC/mPFC in children changed to negative correlations in adults, which may suggest early emergence of bottom-up amygdala excitatory signaling to ACC/mPFC in children and later development of top-down inhibitory control of ACC/mPFC over amygdala in adults. Age-related changes in amygdala-ACC/mPFC connectivity did not vary for processing of different facial emotions, suggesting changes in amygdala-ACC/mPFC connectivity may underlie development of broad emotional processing, rather than threat-specific processing. PMID:26931629
Acute alcohol effects on facial expressions of emotions in social drinkers: a systematic review
Capito, Eva Susanne; Lautenbacher, Stefan; Horn-Hofmann, Claudia
2017-01-01
Background: As known from everyday experience and experimental research, alcohol modulates emotions. Particularly regarding social interaction, the effects of alcohol on the facial expression of emotion might be of relevance. However, these effects have not been systematically studied. We performed a systematic review on acute alcohol effects on social drinkers' facial expressions of induced positive and negative emotions. Materials and methods: With a predefined algorithm, we searched three electronic databases (PubMed, PsycInfo, and Web of Science) for studies conducted on social drinkers that used acute alcohol administration, emotion induction, and standardized methods to record facial expressions. We excluded those studies that failed common quality standards, and finally selected 13 investigations for this review. Results: Overall, alcohol exerted effects on facial expressions of emotions in social drinkers. These effects were not generally disinhibiting, but varied depending on the valence of emotion and on social interaction. When consumed within social groups, alcohol mostly influenced facial expressions of emotions in a socially desirable way, thus underscoring the view of alcohol as a social lubricant. However, methodical differences in alcohol administration between the studies complicated comparability. Conclusion: Our review highlighted the relevance of emotional valence and social-context factors for acute alcohol effects on social drinkers' facial expressions of emotions. Future research should investigate how these alcohol effects influence the development of problematic drinking behavior in social drinkers. PMID:29255375
Fiacconi, Chris M; Owen, Adrian M
2016-09-01
To examine whether emotional functioning can be observed in patients who are behaviourally non-responsive, using peripheral markers of emotional functioning. We tested two patients, both diagnosed as being in a vegetative state (VS) following hypoxia secondary to cardiac arrest. Thirty-seven healthy participants with no history of neurological illness served as a control group. The activity of two facial muscles (zygomaticus major, corrugator supercilii) was measured using facial electromyography (EMG) to probe for patterned responses that differentiate between auditorily presented joke and non-joke stimuli in VS patients. One of the two VS patients we tested demonstrated greater zygomatic and reduced corrugator activity in response to jokes compared with non-jokes. Critically, these responses followed the pattern and temporal profile of muscle activity observed in our healthy control sample. Despite their behaviourally non-responsive profile, some patients diagnosed as VS appear to retain some aspects of emotional experience. Our findings represent, to our knowledge, the first demonstration that a patient diagnosed as VS can exhibit intact emotional responses to humor as assessed by facial EMG. Our approach may therefore constitute a feasible bedside tool capable of providing novel insight into the mental and emotional lives of patients who are behaviourally non-responsive. Copyright © 2016 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
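The core contrast reduces to comparing mean muscle activity across the two stimulus types for each muscle. A minimal sketch with simulated, baseline-corrected amplitudes follows; all values and group sizes are hypothetical.

```python
# A minimal sketch of the EMG contrast described above: compare mean
# zygomaticus and corrugator activity for joke vs non-joke stimuli.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(5)
zyg_joke, zyg_nonjoke = rng.normal(1.2, 0.3, 30), rng.normal(0.9, 0.3, 30)
cor_joke, cor_nonjoke = rng.normal(0.7, 0.2, 30), rng.normal(1.0, 0.2, 30)

for muscle, joke, nonjoke in [("zygomaticus", zyg_joke, zyg_nonjoke),
                              ("corrugator", cor_joke, cor_nonjoke)]:
    t, p = ttest_ind(joke, nonjoke)
    print(f"{muscle}: joke mean {joke.mean():.2f} vs non-joke {nonjoke.mean():.2f} "
          f"(t = {t:.2f}, p = {p:.3f})")
# The "humor-responsive" pattern is higher zygomatic and lower corrugator
# activity for jokes, mirroring the healthy-control profile.
```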
Zheng, Leilei; Chai, Hao; Chen, Wanzhen; Yu, Rongrong; He, Wei; Jiang, Zhengyan; Yu, Shaohua; Li, Huichun; Wang, Wei
2011-12-01
Early parental bonding experiences play a role in emotion recognition and expression in later adulthood, and patients with personality disorder frequently experience inappropriate parental bonding styles; the aim of the present study was therefore to explore whether parental bonding style is correlated with recognition of facial emotion in personality disorder patients. The Parental Bonding Instrument (PBI) and the Matsumoto and Ekman Japanese and Caucasian Facial Expressions of Emotion (JACFEE) photo set tests were administered to 289 participants. Patients scored lower on the parental Care subscale but higher on the parental Freedom Control and Autonomy Denial subscales, and they were less accurate in recognizing contempt, disgust and happiness than the healthy volunteers. In healthy volunteers, maternal Autonomy Denial significantly predicted accuracy in recognizing fear, and maternal Care predicted accuracy in recognizing sadness. In patients, paternal Care negatively predicted accuracy in recognizing anger, paternal Freedom Control predicted the perceived intensity of contempt, and maternal Care predicted both accuracy in recognizing sadness and the perceived intensity of disgust. Parental bonding styles have an impact on the decoding process and sensitivity when recognizing facial emotions, especially in personality disorder patients. © 2011 The Authors. Psychiatry and Clinical Neurosciences © 2011 Japanese Society of Psychiatry and Neurology.
Face and emotion expression processing and the serotonin transporter polymorphism 5-HTTLPR/rs25531.
Hildebrandt, A; Kiy, A; Reuter, M; Sommer, W; Wilhelm, O
2016-06-01
Face cognition, including face identity and facial expression processing, is a crucial component of the socio-emotional abilities that characterize humans as highly developed social beings. However, for these trait domains, molecular genetic studies investigating gene-behavior associations based on well-founded phenotype definitions are still rare. We examined the relationship between 5-HTTLPR/rs25531 polymorphisms - related to serotonin reuptake - and the ability to perceive and recognize faces and emotional expressions in human faces. To this end, we conducted structural equation modeling on data from 230 young adults, obtained by using a comprehensive, multivariate task battery with maximal effort tasks. By additionally modeling fluid intelligence and immediate and delayed memory factors, we aimed to address the discriminant relationships of the 5-HTTLPR/rs25531 polymorphisms with socio-emotional abilities. We found a robust association between the 5-HTTLPR/rs25531 polymorphism and facial emotion perception. Carriers of two long (L) alleles outperformed carriers of one or two short (S) alleles. Weaker associations were present for face identity perception and memory for emotional facial expressions. There was no association between the 5-HTTLPR/rs25531 polymorphism and non-social abilities, demonstrating the discriminant validity of the relationships. We discuss the implications and possible neural mechanisms underlying these novel findings. © 2016 John Wiley & Sons Ltd and International Behavioural and Neural Genetics Society.
ERIC Educational Resources Information Center
Gaspar, Augusta; Esteves, Francisco G.
2012-01-01
Prototypical facial expressions of emotion, also known as universal facial expressions, are the underpinnings of most research concerning recognition of emotions in both adults and children. Data on natural occurrences of these prototypes in natural emotional contexts are rare and difficult to obtain in adults. By recording naturalistic…
ERIC Educational Resources Information Center
Doi, Hirokazu; Fujisawa, Takashi X.; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki
2013-01-01
This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group…
Benito, Adolfo; Lahera, Guillermo; Herrera, Sara; Muncharaz, Ramón; Benito, Guillermo; Fernández-Liria, Alberto; Montes, José Manuel
2013-01-01
To analyze the recognition, identification, and discrimination of facial emotions in a sample of outpatients with bipolar disorder (BD). Forty-four outpatients with a diagnosis of BD and 48 matched control subjects were selected. Both groups were assessed with tests for recognition (Emotion Recognition-40 - ER40), identification (Facial Emotion Identification Test - FEIT), and discrimination (Facial Emotion Discrimination Test - FEDT) of facial emotions, as well as a theory of mind (ToM) verbal test (Hinting Task). Differences between groups were analyzed, controlling for the influence of mild depressive and manic symptoms. Patients with BD scored significantly lower than controls on recognition (ER40), identification (FEIT), and discrimination (FEDT) of emotions. Regarding the verbal measure of ToM, a lower score was also observed in patients compared to controls. Patients with mild syndromal depressive symptoms obtained outcomes similar to those of patients in euthymia. A significant correlation between FEDT scores and global functioning (measured by the Functioning Assessment Short Test, FAST) was found. These results suggest that, even in euthymia, patients with BD experience deficits in recognition, identification, and discrimination of facial emotions, with potential functional implications.
Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann
2017-01-01
Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states, and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful for several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.
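As an illustration of the psychometric approach this abstract describes, the sketch below fits a logistic psychometric function to forced-choice classification data from a fear-happiness morph continuum. This is a minimal sketch under stated assumptions: the function form, parameter names, and data points are hypothetical, not the authors' analysis pipeline.

```python
# Sketch: fitting a logistic psychometric function to forced-choice
# emotion classification on a fear-happiness morph continuum.
# All variable names and example data are hypothetical illustrations.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, alpha, beta):
    """P(classify as happy) at morph level x.
    alpha = point of subjective equality (PSE), beta = slope."""
    return 1.0 / (1.0 + np.exp(-beta * (x - alpha)))

# Hypothetical data: morph levels (0 = full fear, 1 = full happiness)
# and the proportion of "happy" responses at each level.
morph_levels = np.linspace(0.0, 1.0, 7)
p_happy = np.array([0.02, 0.05, 0.20, 0.55, 0.85, 0.95, 0.99])

(alpha, beta), _ = curve_fit(psychometric, morph_levels, p_happy, p0=[0.5, 10.0])
# A shift of the fitted PSE toward the happiness end after a negative mood
# induction would correspond to the fear bias reported in the abstract.
print(f"PSE = {alpha:.3f}, slope = {beta:.3f}")
```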
Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira
2014-01-01
The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by (a) a therapeutic music sequence (MusiCure), (b) a noise sequence, or (c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals. PMID:25072162
Laterality of facial expressions of emotion: Universal and culture-specific influences.
Mandal, Manas K; Ambady, Nalini
2004-01-01
Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals. Copyright 2004 IOS Press
Skilful communication: Emotional facial expressions recognition in very old adults.
María Sarabia-Cobo, Carmen; Navas, María José; Ellgring, Heiner; García-Rodríguez, Beatriz
2016-02-01
The main objective of this study was to assess the changes associated with ageing in the ability to identify emotional facial expressions, and to what extent such age-related changes depend on the intensity with which each basic emotion is manifested. A randomised controlled trial was carried out on 107 subjects who performed a six-alternative forced-choice emotional expression identification task. The stimuli consisted of 270 virtual emotional faces expressing the six basic emotions (happiness, sadness, surprise, fear, anger and disgust) at three different levels of intensity (low, pronounced and maximum). The virtual faces were generated by facial surface changes, as described in the Facial Action Coding System (FACS). A progressive age-related decline in the ability to identify emotional facial expressions was detected. The ability to recognise the intensity of expressions was one of the variables most strongly impaired with age, although the valence of emotion was also poorly identified, particularly in terms of recognising negative emotions. Nurses should be mindful of how ageing affects communication with older patients. In this study, very old adults displayed more difficulties in identifying emotional facial expressions, especially low-intensity expressions and those associated with difficult emotions like disgust or fear. Copyright © 2015 Elsevier Ltd. All rights reserved.
Identity-expression interaction in face perception: sex, visual field, and psychophysical factors.
Godard, Ornella; Baudouin, Jean-Yves; Bonnet, Philippe; Fiori, Nicole
2013-01-01
We investigated the psychophysical factors underlying the identity-emotion interaction in face perception. Visual field and sex were also taken into account. Participants had to judge whether a probe face, presented in either the left or the right visual field, and a central target face belonged to the same person while emotional expression varied (Experiment 1), or to judge whether probe and target faces expressed the same emotion while identity was manipulated (Experiment 2). For accuracy, we replicated the mutual facilitation effect between identity and emotion; no sex or hemispheric differences were found. Processing speed measurements, however, showed a lesser degree of interference in women than in men, especially for matching identity when faces expressed different emotions after a left visual field presentation of the probe face. Psychophysical indices can be used to determine whether these effects are perceptual (A') or instead arise at a post-perceptual decision-making stage (B"). The influence of identity on the processing of facial emotion seems to be due to perceptual factors, whereas the influence of emotion changes on identity processing seems to be related to decisional factors. In addition, men seem to be more "conservative" after an LVF/RH probe-face presentation when processing identity.
[Emotion Recognition in Patients with Peripheral Facial Paralysis - A Pilot Study].
Konnerth, V; Mohr, G; von Piekartz, H
2016-02-01
The perception of emotions is an important component that enables human beings to engage in social interaction in everyday life; the ability to recognize emotions in another person's facial expression is thus a key prerequisite. The present study aimed to evaluate the ability of subjects with peripheral facial paresis to perceive emotions, compared with healthy individuals. A pilot study was conducted in which 13 people with peripheral facial paresis participated. The assessment included the Facially Expressed Emotion Labeling Test (FEEL-Test), the Facial Laterality Recognition Test (FLR-Test) and the Toronto Alexithymia Scale 26 (TAS-26). The results were compared with data on healthy people from other studies. The subjects with facial paresis had more difficulty recognizing basic emotions than healthy individuals, but the differences were not significant. The participants were significantly slower (right/left: p<0.001) in perceiving facial laterality than healthy people. With regard to alexithymia, the tested group scored significantly higher (p<0.001) than unimpaired people. This pilot study therefore does not demonstrate an impairment of this specific patient group's ability to recognize emotions and facial laterality; for future studies, the research question should be examined in a larger sample. © Georg Thieme Verlag KG Stuttgart · New York.
Task-irrelevant emotion facilitates face discrimination learning.
Lorenzino, Martina; Caudek, Corrado
2015-03-01
We understand poorly how the ability to discriminate faces from one another is shaped by visual experience. The purpose of the present study is to determine whether face discrimination learning can be facilitated by facial emotions. To answer this question, we used a task-irrelevant perceptual learning paradigm because it closely mimics the learning processes that, in daily life, occur without a conscious intention to learn and without an attentional focus on specific facial features. We measured face discrimination thresholds before and after training. During the training phase (4 days), participants performed a contrast discrimination task on face images. They were not informed that we introduced (task-irrelevant) subtle variations in the face images from trial to trial. For the Identity group, the task-irrelevant features were variations along a morphing continuum of facial identity. For the Emotion group, the task-irrelevant features were variations along an emotional expression morphing continuum. The Control group did not undergo contrast discrimination learning and only performed the pre-training and post-training tests, with the same temporal gap between them as the other two groups. Results indicate that face discrimination improved, but only for the Emotion group. Participants in the Emotion group, moreover, showed face discrimination improvements also for stimulus variations along the facial identity dimension, even if these (task-irrelevant) stimulus features had not been presented during training. The present results highlight the importance of emotions for face discrimination learning. Copyright © 2015 Elsevier Ltd. All rights reserved.
Emotion recognition of virtual agents' facial expressions: the effects of age and emotion intensity
Beer, Jenay M.; Fisk, Arthur D.; Rogers, Wendy A.
2014-01-01
People make determinations about the social characteristics of an agent (e.g., robot or virtual agent) by interpreting social cues displayed by the agent, such as facial expressions. Although a considerable amount of research has been conducted investigating age-related differences in emotion recognition of human faces (e.g., Sullivan & Ruffman, 2004), the effect of age on emotion identification of virtual agent facial expressions has been largely unexplored. Age-related differences in emotion recognition of facial expressions are an important factor to consider in the design of agents that may assist older adults in a recreational or healthcare setting. The purpose of the current research was to investigate whether age-related differences in facial emotion recognition extend to emotion-expressive virtual agents. Younger and older adults performed a recognition task with a virtual agent expressing six basic emotions. Larger age-related differences were expected for virtual agents displaying negative emotions, such as anger, sadness, and fear. In fact, the results indicated that older adults showed a decrease in emotion recognition accuracy for a virtual agent's emotions of anger, fear, and happiness. PMID:25552896
Recognition of facial expressions of mixed emotions in school-age children exposed to terrorism.
Scrimin, Sara; Moscardino, Ughetta; Capello, Fabia; Altoè, Gianmarco; Axia, Giovanna
2009-09-01
This exploratory study aims at investigating the effects of terrorism on children's ability to recognize emotions. A sample of 101 exposed and 102 nonexposed children (mean age = 11 years), balanced for age and gender, was assessed 20 months after a terrorist attack in Beslan, Russia. Two trials controlled for children's ability to match a facial emotional stimulus with an emotional label and their ability to match an emotional label with an emotional context. The experimental trial evaluated the relation between exposure to terrorism and children's free labeling of mixed-emotion facial stimuli created by morphing between 2 prototypical emotions. Repeated measures analyses of covariance revealed that exposed children correctly recognized pure emotions. Four log-linear models were performed to explore the association between exposure group and category of answer given in response to different mixed-emotion facial stimuli. Model parameters indicated that, compared with nonexposed children, exposed children (a) labeled facial expressions containing anger and sadness as anger significantly more often than expected, and (b) produced fewer correct answers in response to stimuli containing sadness as a target emotion.
Grossman, Ruth B; Tager-Flusberg, Helen
2012-01-01
Data on emotion processing by individuals with ASD suggest both intact abilities and significant deficits. Signal intensity may be a contributing factor to this discrepancy. We presented low- and high-intensity emotional stimuli in a face-voice matching task to 22 adolescents with ASD and 22 typically developing (TD) peers. Participants heard semantically neutral sentences with happy, surprised, angry, and sad prosody presented at two intensity levels (low, high) and matched them to emotional faces. The facial expression choice was either across- or within-valence. Both groups were less accurate for low-intensity emotions, but the ASD participants' accuracy levels dropped off more sharply. ASD participants were significantly less accurate than their TD peers for trials involving low-intensity emotions and within-valence face contrasts. PMID:22450703
Dissociation between facial and bodily expressions in emotion recognition: A case study.
Leiva, Samanta; Margulis, Laura; Micciulli, Andrea; Ferreres, Aldo
2017-12-21
Existing single-case studies have reported deficit in recognizing basic emotions through facial expression and unaffected performance with body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) with four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient and the control group performance with a modified one-tailed t-test designed specifically for single-case studies. There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex) when the patient's performance was compared to the control group's, statistically significant differences were only observed for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study that describes the existence of this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
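The single-case statistic named in this abstract (a modified one-tailed t-test for comparing one patient with a control sample) can be made concrete with a short sketch. The code below assumes the Crawford-Howell form of the statistic; the exact procedure in the cited paper may differ, and the scores used here are hypothetical.

```python
# Sketch: modified t-test comparing a single case with a small control
# sample, in the spirit of the Crawford-Howell method (an assumption;
# the cited paper's exact procedure may differ). Data are hypothetical.
import numpy as np
from scipy import stats

def single_case_t(case_score, control_scores):
    """Return t and one-tailed p for a single case vs. a control sample."""
    controls = np.asarray(control_scores, dtype=float)
    n = controls.size
    t = (case_score - controls.mean()) / (
        controls.std(ddof=1) * np.sqrt(1.0 + 1.0 / n))
    p = stats.t.sf(abs(t), df=n - 1)  # one-tailed p for |t|
    return t, p

# Hypothetical usage: patient accuracy vs. 30 controls on a body-expression task.
rng = np.random.default_rng(0)
t, p = single_case_t(0.55, rng.normal(0.85, 0.06, 30))
print(f"t = {t:.2f}, one-tailed p = {p:.4f}")
```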
Prigent, Elise; Amorim, Michel-Ange; de Oliveira, Armando Mónica
2018-01-01
Humans have developed a specific capacity to rapidly perceive and anticipate other people's facial expressions so as to get an immediate impression of their emotional state of mind. We carried out two experiments to examine the perceptual and memory dynamics of facial expressions of pain. In the first experiment, we investigated how people estimate other people's levels of pain based on the perception of various dynamic facial expressions; these differ both in terms of the amount and intensity of activated action units. A second experiment used a representational momentum (RM) paradigm to study the emotional anticipation (memory bias) elicited by the same facial expressions of pain studied in Experiment 1. Our results highlighted the relationship between the level of perceived pain (in Experiment 1) and the direction and magnitude of memory bias (in Experiment 2): when perceived pain increases, the memory bias tends to be reduced (if positive) and ultimately becomes negative. Dynamic facial expressions of pain may reenact an "immediate perceptual history" in the perceiver before leading to an emotional anticipation of the agent's upcoming state. Thus, a subtle facial expression of pain (i.e., a low contraction around the eyes) that leads to a significant positive anticipation can be considered an adaptive process, one through which we can swiftly and involuntarily detect other people's pain.
Neural correlates of mirth and laughter: a direct electrical cortical stimulation study.
Yamao, Yukihiro; Matsumoto, Riki; Kunieda, Takeharu; Shibata, Sumiya; Shimotake, Akihiro; Kikuchi, Takayuki; Satow, Takeshi; Mikuni, Nobuhiro; Fukuyama, Hidenao; Ikeda, Akio; Miyamoto, Susumu
2015-05-01
Laughter consists of both motor and emotional aspects. The emotional component, known as mirth, is usually associated with the motor component, namely, bilateral facial movements. Previous electrical cortical stimulation (ES) studies revealed that mirth was associated with the basal temporal cortex, inferior frontal cortex, and medial frontal cortex. Functional neuroimaging implicated a role for the left inferior frontal and bilateral temporal cortices in humor processing. However, the neural origins and pathways linking mirth with facial movements are still unclear. We hereby report two cases with temporal lobe epilepsy undergoing subdural electrode implantation in whom ES of the left basal temporal cortex elicited both mirth and laughter-related facial muscle movements. In one case with normal hippocampus, high-frequency ES consistently caused contralateral facial movement, followed by bilateral facial movements with mirth. In contrast, in another case with hippocampal sclerosis (HS), ES elicited only mirth at low intensity and short duration, and eventually laughter at higher intensity and longer duration. In both cases, the basal temporal language area (BTLA) was located within or adjacent to the cortex where ES produced mirth. In conclusion, the present direct ES study demonstrated that 1) mirth had a close relationship with language function, 2) intact mesial temporal structures were actively engaged in the beginning of facial movements associated with mirth, and 3) these emotion-related facial movements had contralateral dominance. Copyright © 2014 Elsevier Ltd. All rights reserved.
The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults
LoBue, Vanessa; Thrasher, Cat
2014-01-01
Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants. PMID:25610415
Haldane, Morgan; Jogia, Jigar; Cobb, Annabel; Kozuch, Eliza; Kumari, Veena; Frangou, Sophia
2008-01-01
Verbal working memory and emotional self-regulation are impaired in Bipolar Disorder (BD). Our aim was to investigate the effect of Lamotrigine (LTG), which is effective in the clinical management of BD, on the neural circuits subserving working memory and emotional processing. Functional Magnetic Resonance Imaging data from 12 stable BD patients were used to detect LTG-induced changes as the differences in brain activity between drug-free and post-LTG monotherapy conditions during a verbal working memory task (N-back sequential letter task) and an angry facial affect recognition task. For both tasks, LTG monotherapy compared to baseline was associated with increased activation mostly within the prefrontal cortex and cingulate gyrus, in regions normally engaged in verbal working memory and emotional processing. Therefore, LTG monotherapy in BD patients may enhance cortical function within neural circuits involved in memory and emotional self-regulation.
Grossard, Charline; Chaby, Laurence; Hun, Stéphanie; Pellerin, Hugues; Bourgeois, Jérémy; Dapogny, Arnaud; Ding, Huaxiong; Serret, Sylvie; Foulon, Pierre; Chetouani, Mohamed; Chen, Liming; Bailly, Kevin; Grynszpan, Ouriel; Cohen, David
2018-01-01
The production of facial expressions (FEs) is an important skill that allows children to share and adapt emotions with their relatives and peers during social interactions. These skills are impaired in children with Autism Spectrum Disorder. However, the way in which typical children develop and master their production of FEs has still not been clearly assessed. This study aimed to explore factors that could influence the production of FEs in childhood, such as age, gender, emotion subtype (sadness, anger, joy, and neutral), elicitation task (on request, imitation), area of recruitment (French Riviera and Parisian) and emotion multimodality. A total of 157 children aged 6–11 years were enrolled in Nice and Paris, France. We asked them to produce FEs in two different tasks: imitation with an avatar model and production on request without a model. Results from a multivariate analysis revealed that: (1) children performed better with age; (2) positive emotions were easier to produce than negative emotions; (3) children produced better FEs on request (as opposed to imitation); and (4) Riviera children performed better than Parisian children, suggesting regional influences on emotion production. We conclude that facial emotion production is a complex developmental process influenced by several factors that need to be acknowledged in future research. PMID:29670561
ERIC Educational Resources Information Center
Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia
2014-01-01
We examined relations between the processing of facial identity and emotion in own- and other-race faces, using a fully crossed design with participants from 3 different ethnicities. The benefits of redundant identity and emotion signals were evaluated and formally tested in relation to models of independent and coactive feature processing and…
Leppänen, J M; Niehaus, D J H; Koen, L; Du Toit, E; Schoeman, R; Emsley, R
2006-06-01
Schizophrenia is associated with a deficit in the recognition of negative emotions from facial expressions. The present study examined the universality of this finding by studying facial expression recognition in an African Xhosa population. Forty-four Xhosa patients with schizophrenia and forty healthy controls were tested with a computerized task requiring rapid perceptual discrimination of matched positive (i.e. happy), negative (i.e. angry), and neutral faces. Patients were as accurate as controls in recognizing happy faces but showed a marked impairment in the recognition of angry faces. The impairment was particularly pronounced for high-intensity (open-mouth) angry faces. Patients also exhibited more false happy and angry responses to neutral faces than controls. No correlation between level of education or illness duration and emotion recognition was found, but the deficit in the recognition of negative emotions was more pronounced in familial compared to non-familial cases of schizophrenia. These findings suggest that the deficit in the recognition of negative facial expressions may constitute a universal neurocognitive marker of schizophrenia.
Aho-Özhan, Helena E A; Keller, Jürgen; Heimrath, Johanna; Uttner, Ingo; Kassubek, Jan; Birbaumer, Niels; Ludolph, Albert C; Lulé, Dorothée
2016-01-01
Amyotrophic lateral sclerosis (ALS) primarily impairs motor abilities but also affects cognition and emotional processing. We hypothesise that subjective ratings of emotional stimuli depicting social interactions and facial expressions are changed in ALS. It was found that recognition of negative emotions and the ability to mentalize others' intentions are reduced. Here, processing of emotions in faces was investigated. A behavioural test of Ekman faces expressing six basic emotions was presented to 30 ALS patients and 29 age-, gender- and education-matched healthy controls. Additionally, a subgroup of 15 ALS patients who were able to lie supine in the scanner and 14 matched healthy controls viewed the Ekman faces during functional magnetic resonance imaging (fMRI). Affective state and the number of daily social contacts were measured. ALS patients recognized disgust and fear less accurately than healthy controls. In fMRI, reduced brain activity was seen in areas involved in the processing of negative emotions, replicating our previous results. During processing of sad faces, increased brain activity was seen in areas associated with social emotions in the right inferior frontal gyrus, and reduced activity in the hippocampus bilaterally. No differences in brain activity were seen for any of the other emotional expressions. Inferior frontal gyrus activity for sad faces was associated with an increased amount of social contacts in ALS patients. ALS patients showed decreased brain and behavioural responses in the processing of disgust and fear, and an altered brain response pattern for sadness. The negative consequences of neurodegenerative processes in the course of ALS might be counteracted by positive emotional activity and positive social interactions.
Involvement of Right STS in Audio-Visual Integration for Affective Speech Demonstrated Using MEG
Hagan, Cindy C.; Woods, Will; Johnson, Sam; Green, Gary G. R.; Young, Andrew W.
2013-01-01
Speech and emotion perception are dynamic processes in which it may be optimal to integrate synchronous signals emitted from different sources. Studies of audio-visual (AV) perception of neutrally expressed speech demonstrate supra-additive (i.e., where AV>[unimodal auditory+unimodal visual]) responses in left STS to crossmodal speech stimuli. However, emotions are often conveyed simultaneously with speech; through the voice in the form of speech prosody and through the face in the form of facial expression. Previous studies of AV nonverbal emotion integration showed a role for right (rather than left) STS. The current study therefore examined whether the integration of facial and prosodic signals of emotional speech is associated with supra-additive responses in left (cf. results for speech integration) or right (due to emotional content) STS. As emotional displays are sometimes difficult to interpret, we also examined whether supra-additive responses were affected by emotional incongruence (i.e., ambiguity). Using magnetoencephalography, we continuously recorded eighteen participants as they viewed and heard AV congruent emotional and AV incongruent emotional speech stimuli. Significant supra-additive responses were observed in right STS within the first 250 ms for emotionally incongruent and emotionally congruent AV speech stimuli, which further underscores the role of right STS in processing crossmodal emotive signals. PMID:23950977
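To make the supra-additivity criterion used here concrete, the sketch below tests AV > (A + V) on simulated per-subject response amplitudes. The array names, values, and the simple one-sample test are illustrative assumptions, not the authors' MEG pipeline.

```python
# Sketch: testing the supra-additivity criterion AV > (A + V), the
# definition of crossmodal integration used in the abstract.
# Shapes, values, and the test choice are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects = 18
# Hypothetical mean evoked amplitudes in right STS, 0-250 ms window.
av = rng.normal(2.4, 0.5, n_subjects)  # audio-visual condition
a = rng.normal(1.0, 0.4, n_subjects)   # auditory-only condition
v = rng.normal(1.1, 0.4, n_subjects)   # visual-only condition

# Supra-additivity: the AV response exceeds the sum of unimodal responses.
diff = av - (a + v)
t, p = stats.ttest_1samp(diff, 0.0, alternative="greater")
print(f"t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")
```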
ERIC Educational Resources Information Center
Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa
2012-01-01
Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…
ERIC Educational Resources Information Center
Tardif, Carole; Laine, France; Rodriguez, Melissa; Gepner, Bruno
2007-01-01
This study examined the effects of slowing down presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on…
Singing emotionally: a study of pre-production, production, and post-production facial expressions
Quinto, Lena R.; Thompson, William F.; Kroos, Christian; Palmer, Caroline
2014-01-01
Singing involves vocal production accompanied by a dynamic and meaningful use of facial expressions, which may serve as ancillary gestures that complement, disambiguate, or reinforce the acoustic signal. In this investigation, we examined the use of facial movements to communicate emotion, focusing on movements arising in three epochs: before vocalization (pre-production), during vocalization (production), and immediately after vocalization (post-production). The stimuli were recordings of seven vocalists' facial movements as they sang short (14 syllable) melodic phrases with the intention of communicating happiness, sadness, irritation, or no emotion. Facial movements were presented as point-light displays to 16 observers who judged the emotion conveyed. Experiment 1 revealed that the accuracy of emotional judgment varied with singer, emotion, and epoch. Accuracy was highest in the production epoch, however, happiness was well communicated in the pre-production epoch. In Experiment 2, observers judged point-light displays of exaggerated movements. The ratings suggested that the extent of facial and head movements was largely perceived as a gauge of emotional arousal. In Experiment 3, observers rated point-light displays of scrambled movements. Configural information was removed in these stimuli but velocity and acceleration were retained. Exaggerated scrambled movements were likely to be associated with happiness or irritation whereas unexaggerated scrambled movements were more likely to be identified as “neutral.” An analysis of singers' facial movements revealed systematic changes as a function of the emotional intentions of singers. The findings confirm the central role of facial expressions in vocal emotional communication, and highlight individual differences between singers in the amount and intelligibility of facial movements made before, during, and after vocalization. PMID:24808868
Liu, Chengwei; Liu, Ying; Iqbal, Zahida; Li, Wenhui; Lv, Bo; Jiang, Zhongqing
2017-01-01
To investigate the interaction between facial expression and facial gender information during face perception, the present study matched the intensities of the two types of information in face images and then adopted the orthogonal condition of the Garner Paradigm, in which gender and expression were varied orthogonally, to present the images to participants who judged the gender and expression of the faces. Gender and expression processing displayed a mutual interaction. On the one hand, the judgment of angry expressions occurred faster when presented with male facial images; on the other hand, the classification of the female gender occurred faster when presented with a happy facial expression than when presented with an angry facial expression. According to the event-related potential (ERP) results, expression classification was influenced by gender during the face structural processing stage (as indexed by N170), which indicates the promotion or interference of facial gender with the coding of facial expression features. However, gender processing was affected by facial expressions in more stages, including the early (P1) and late (LPC) stages of perceptual processing, reflecting that emotional expression influences gender processing mainly by directing attention.
Briceño, Emily M; Rapport, Lisa J; Kassel, Michelle T; Bieliauskas, Linas A; Zubieta, Jon-Kar; Weisenbach, Sara L; Langenecker, Scott A
2015-03-01
Emotion processing, supported by frontolimbic circuitry known to be sensitive to the effects of aging, is a relatively understudied cognitive-emotional domain in geriatric depression. Some evidence suggests that the neurophysiological disruption observed in emotion processing among adults with major depressive disorder (MDD) may be modulated by both gender and age. Therefore, the present study investigated the effects of gender and age on the neural circuitry supporting emotion processing in MDD. Cross-sectional comparison of fMRI signal during performance of an emotion processing task. Outpatient university setting. One hundred adults recruited by MDD status, gender, and age. Participants underwent fMRI while completing the Facial Emotion Perception Test. They viewed photographs of faces and categorized the emotion perceived. Contrast for fMRI was of face perception minus animal identification blocks. Effects of depression were observed in precuneus and effects of age in a number of frontolimbic regions. Three-way interactions were present between MDD status, gender, and age in regions pertinent to emotion processing, including frontal, limbic, and basal ganglia. Young women with MDD and older men with MDD exhibited hyperactivation in these regions compared with their respective same-gender healthy comparison (HC) counterparts. In contrast, older women and younger men with MDD exhibited hypoactivation compared to their respective same-gender HC counterparts. This is the first study to report gender- and age-specific differences in emotion processing circuitry in MDD. Gender-differential mechanisms may underlie cognitive-emotional disruption in older adults with MDD. The present findings have implications for improved probes into the heterogeneity of the MDD syndrome. Copyright © 2015 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
Ventromedial prefrontal cortex mediates visual attention during facial emotion recognition.
Wolf, Richard C; Philippi, Carissa L; Motzkin, Julian C; Baskaya, Mustafa K; Koenigs, Michael
2014-06-01
The ventromedial prefrontal cortex is known to play a crucial role in regulating human social and emotional behaviour, yet the precise mechanisms by which it subserves this broad function remain unclear. Whereas previous neuropsychological studies have largely focused on the role of the ventromedial prefrontal cortex in higher-order deliberative processes related to valuation and decision-making, here we test whether ventromedial prefrontal cortex may also be critical for more basic aspects of orienting attention to socially and emotionally meaningful stimuli. Using eye tracking during a test of facial emotion recognition in a sample of lesion patients, we show that bilateral ventromedial prefrontal cortex damage impairs visual attention to the eye regions of faces, particularly for fearful faces. This finding demonstrates a heretofore unrecognized function of the ventromedial prefrontal cortex: the basic attentional process of controlling eye movements to faces expressing emotion. © The Author (2014). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Harit, Aditya; Joshi, J. C., Col; Gupta, K. K.
2018-03-01
The paper proposes an automatic facial emotion recognition algorithm comprising two main components: feature extraction and expression recognition. The algorithm uses a Gabor filter bank on fiducial points to extract facial expression features. The resulting magnitudes of the Gabor transforms, along with 14 chosen FAPs (Facial Animation Parameters), compose the feature space. There are two stages: a training phase and a recognition phase. In the training phase, the system classifies all training expressions of the six emotions into six classes, one for each emotion. In the recognition phase, it applies the Gabor bank to a face image, locates the fiducial points, and feeds the resulting features to the trained neural architecture to recognize the emotion.
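A minimal sketch of the feature-extraction stage described above, assuming a small Gabor filter bank sampled at hand-picked fiducial points; the filter parameters, fiducial coordinates, and downstream classifier are illustrative assumptions, not the authors' implementation.

```python
# Sketch: Gabor-magnitude features at fiducial points, approximating the
# feature-extraction stage described in the abstract. Parameters and
# coordinates are hypothetical.
import numpy as np
from scipy.signal import fftconvolve
from skimage.filters import gabor_kernel

def gabor_features(image, points, frequencies=(0.1, 0.2, 0.3), n_orient=4):
    """Return |Gabor| responses sampled at fiducial (row, col) points."""
    feats = []
    for f in frequencies:
        for k in range(n_orient):
            kernel = gabor_kernel(frequency=f, theta=k * np.pi / n_orient)
            real = fftconvolve(image, kernel.real, mode="same")
            imag = fftconvolve(image, kernel.imag, mode="same")
            mag = np.hypot(real, imag)  # magnitude of the complex response
            feats.extend(mag[r, c] for r, c in points)
    return np.asarray(feats)

# Hypothetical usage: features from one face image at three fiducial points.
# In the full system these would be concatenated with the 14 FAP values and
# fed to the trained neural classifier.
face = np.random.rand(128, 128)
fiducials = [(40, 50), (40, 78), (90, 64)]
print(gabor_features(face, fiducials).shape)  # 3 freqs * 4 orients * 3 points
```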
Almeida, Jorge R C; Versace, Amelia; Hassel, Stefanie; Kupfer, David J; Phillips, Mary L
2010-03-01
Difficulties in emotion processing and poor social function are common to bipolar disorder (BD) and major depressive disorder (MDD) depression, resulting in many BD depressed individuals being misdiagnosed with MDD. The amygdala is a key region implicated in processing emotionally salient stimuli, including emotional facial expressions. It is unclear, however, whether abnormal amygdala activity during positive and negative emotion processing represents a persistent marker of BD regardless of illness phase or a state marker of depression common or specific to BD and MDD depression. Sixty adults were recruited: 15 depressed with BD type 1 (BDd), 15 depressed with recurrent MDD, 15 with BD in remission (BDr), diagnosed with DSM-IV and Structured Clinical Interview for DSM-IV Research Version criteria; and 15 healthy control subjects (HC). Groups were age- and gender-ratio-matched; patient groups were matched for age of illness onset and illness duration; depressed groups were matched for depression severity. The BDd were taking more psychotropic medication than the other patient groups. All individuals participated in three separate 3T neuroimaging event-related experiments, where they viewed mild and intense emotional and neutral faces of fear, happiness, or sadness from a standardized series. The BDd, relative to HC, BDr, and MDD, showed elevated left amygdala activity to mild and neutral facial expressions in the sad (p < .009) but not the other emotion experiments; this effect was not associated with medication. There were no other significant between-group differences in amygdala activity. Abnormally elevated left amygdala activity to mild sad and neutral faces might be a depression-specific marker in BD but not MDD, suggesting different pathophysiologic processes for BD versus MDD depression. Copyright 2010 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Blasi, Giuseppe; Bianco, Luciana Lo; Taurisano, Paolo; Gelao, Barbara; Romano, Raffaella; Fazio, Leonardo; Papazacharias, Apostolos; Di Giorgio, Annabella; Caforio, Grazia; Rampino, Antonio; Masellis, Rita; Papp, Audrey; Ursini, Gianluca; Sinibaldi, Lorenzo; Popolizio, Teresa; Sadee, Wolfgang; Bertolino, Alessandro
2010-01-01
Personality traits related to emotion processing are, at least in part, heritable and genetically determined. Dopamine D2 receptor signaling is involved in modulation of emotional behavior and activity of associated brain regions such as the amygdala and the prefrontal cortex. An intronic single nucleotide polymorphism within the D2 receptor gene (DRD2, rs1076560, guanine>thymine - G>T) shifts splicing of the two protein isoforms (D2 short, D2S, mainly presynaptic, and D2 long, D2L) and has been associated with modulation of memory performance and brain activity. Here, our aim was to investigate the association of DRD2 rs1076560 genotype with personality traits of emotional stability and with brain physiology during processing of emotionally relevant stimuli. DRD2 genotype and Big Five Questionnaire scores were evaluated in 134 healthy subjects demonstrating that GG subjects have reduced ‘emotion control’ compared with GT subjects. fMRI in a sample of 24 individuals indicated greater amygdala activity during implicit processing and greater dorsolateral prefrontal cortex (DLPFC) response during explicit processing of facial emotional stimuli in GG subjects compared with GT. Other results also demonstrate an interaction between DRD2 genotype and facial emotional expression on functional connectivity of both amygdala and dorsolateral prefrontal regions with overlapping medial prefrontal areas. Moreover, rs1076560 genotype is associated with differential relationships between amygdala/DLPFC functional connectivity and emotion control scores. These results suggest that genetically determined D2 signaling may explain part of personality traits related to emotion processing and individual variability in specific brain responses to emotionally relevant inputs. PMID:19940176
Development and validation of an Argentine set of facial expressions of emotion.
Vaiman, Marcelo; Wagner, Mónica Anna; Caicedo, Estefanía; Pereno, Germán Leandro
2017-02-01
Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion research is receiving in this region. Here we present the development and validation of the Universidad Nacional de Cordoba, Expresiones de Emociones Faciales (UNCEEF), a Facial Action Coding System (FACS)-verified set of pictures of Argentineans expressing the six basic emotions, plus neutral expressions. FACS scores, recognition rates, Hu scores, and discrimination indices are reported. Evidence of convergent validity was obtained using the Pictures of Facial Affect in an Argentine sample. However, recognition accuracy was greater for UNCEEF. The importance of local sets of emotion pictures is discussed.
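The "Hu scores" reported in such stimulus-set validations are commonly Wagner's unbiased hit rates; assuming that reading (an assumption, since the abstract does not define the term), the sketch below computes them from a hypothetical confusion matrix.

```python
# Sketch: Wagner's unbiased hit rate (Hu) per emotion category, one
# plausible reading of the "Hu scores" mentioned in the abstract.
# The confusion matrix is a hypothetical example
# (rows = posed emotion, columns = judged emotion).
import numpy as np

def unbiased_hit_rate(confusion):
    """Hu per category: squared hits / (row total * column total)."""
    confusion = np.asarray(confusion, dtype=float)
    hits = np.diag(confusion)
    return hits ** 2 / (confusion.sum(axis=1) * confusion.sum(axis=0))

conf = np.array([[45, 3, 2],    # happiness
                 [4, 38, 8],    # sadness
                 [2, 10, 38]])  # fear
print(unbiased_hit_rate(conf).round(3))
```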
Effects of speaker emotional facial expression and listener age on incremental sentence processing.
Carminati, Maria Nella; Knoeferle, Pia
2013-01-01
We report two visual-world eye-tracking experiments that investigated how, and with which time course, emotional information from a speaker's face affects younger (N = 32, mean age = 23) and older (N = 32, mean age = 64) listeners' visual attention and language comprehension as they processed emotional sentences in a visual context. The age manipulation tested predictions by socio-emotional selectivity theory of a positivity effect in older adults. After viewing the emotional face of a speaker (happy or sad) on a computer display, participants were presented simultaneously with two pictures depicting opposite-valence events (positive and negative; IAPS database) while they listened to a sentence referring to one of the events. Participants' eye fixations on the pictures while processing the sentence were increased when the speaker's face was (vs. wasn't) emotionally congruent with the sentence. The enhancement occurred from the early stages of referential disambiguation and was modulated by age: for the older adults it was more pronounced with positive faces, and for the younger ones with negative faces. These findings demonstrate for the first time that emotional facial expressions, like previously studied speaker cues such as eye gaze and gestures, are rapidly integrated into sentence processing. They also provide new evidence for positivity effects in older adults during situated sentence processing.
Direct effects of diazepam on emotional processing in healthy volunteers
Murphy, S. E.; Downham, C.; Cowen, P. J.
2008-01-01
Rationale: Pharmacological agents used in the treatment of anxiety have been reported to decrease threat-relevant processing in patients and healthy controls, suggesting a potentially relevant mechanism of action. However, the effects of the anxiolytic diazepam have typically been examined at sedative doses, which do not allow the direct actions on emotional processing to be fully separated from global effects of the drug on cognition and alertness. Objectives: The aim of this study was to investigate the effect of a lower, but still clinically effective, dose of diazepam on emotional processing in healthy volunteers. Materials and methods: Twenty-four participants were randomised to receive a single dose of diazepam (5 mg) or placebo. Sixty minutes later, participants completed a battery of psychological tests, including measures of non-emotional cognitive performance (reaction time and sustained attention) and emotional processing (affective modulation of the startle reflex, attentional dot probe, facial expression recognition, and emotional memory). Mood and subjective experience were also measured. Results: Diazepam significantly modulated attentional vigilance to masked emotional faces and significantly decreased overall startle reactivity. Diazepam did not significantly affect mood, alertness, response times, facial expression recognition, or sustained attention. Conclusions: At non-sedating doses, diazepam produces effects on attentional vigilance and startle responsivity that are consistent with its anxiolytic action. This may be an underlying mechanism through which benzodiazepines exert their therapeutic effects in clinical anxiety. PMID:18581100
Gosselin, P; Larocque, C
2000-09-01
The effects of Asian and Caucasian facial morphology were examined by having Canadian children categorize pictures of facial expressions of basic emotions. The pictures were selected from the Japanese and Caucasian Facial Expressions of Emotion set developed by D. Matsumoto and P. Ekman (1989). Sixty children between the ages of 5 and 10 years were presented with short stories and an array of facial expressions, and were asked to point to the expression that best depicted the specific emotion experienced by the characters. The results indicated that expressions of fear and surprise were better categorized from Asian faces, whereas expressions of disgust were better categorized from Caucasian faces. These differences stemmed from specific confusions between particular expressions.
Four not six: Revealing culturally common facial expressions of emotion.
Jack, Rachael E; Sun, Wei; Delis, Ioannis; Garrod, Oliver G B; Schyns, Philippe G
2016-06-01
As a highly social species, humans generate complex facial expressions to communicate a diverse range of emotions. Since Darwin's work, identifying which of these complex patterns are common across cultures and which are culture-specific has remained a central question in psychology, anthropology, philosophy, and more recently machine vision and social robotics. Classic approaches to addressing this question typically tested the cross-cultural recognition of theoretically motivated facial expressions representing 6 emotions, and reported universality. Yet, variable recognition accuracy across cultures suggests a narrower cross-cultural communication supported by sets of simpler expressive patterns embedded in more complex facial expressions. We explore this hypothesis by modeling the facial expressions of over 60 emotions across 2 cultures, and segregating out the latent expressive patterns. Using a multidisciplinary approach, we first map the conceptual organization of a broad spectrum of emotion words by building semantic networks in 2 cultures. For each emotion word in each culture, we then model and validate its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. We then apply to the pooled models a multivariate data reduction technique, revealing 4 latent and culturally common facial expression patterns that each communicates specific combinations of valence, arousal, and dominance. We then reveal the face movements that accentuate each latent expressive pattern to create complex facial expressions. Our data question the widely held view that 6 facial expression patterns are universal, instead suggesting 4 latent expressive patterns with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
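The multivariate data reduction step can be pictured with a generic non-negative factorization of the pooled expression models; the use of sklearn's NMF, the array shapes, and the feature encoding below are illustrative assumptions, not the authors' exact technique:

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical input: 120 validated expression models (60 emotions x 2
# cultures), each encoded as a non-negative activation vector over 42
# action-unit/temporal features.
rng = np.random.default_rng(0)
models = rng.random((120, 42))

# Reduce the pooled models to 4 latent expressive patterns.
nmf = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
weights = nmf.fit_transform(models)  # (120, 4): loading of each model on each pattern
patterns = nmf.components_           # (4, 42): the latent expressive patterns
```

Each row of `patterns` would then be inspected for the face movements that accentuate it, as described above.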
The Face of the Chameleon: The Experience of Facial Mimicry for the Mimicker and the Mimickee
Kulesza, Wojciech Marek; Cisłak, Aleksandra; Vallacher, Robin R.; Nowak, Andrzej; Czekiel, Martyna; Bedynska, Sylwia
2015-01-01
This research addressed three questions concerning facial mimicry: (a) Does the relationship between mimicry and liking characterize all facial expressions, or is it limited to specific expressions? (b) Is the relationship between facial mimicry and liking symmetrical for the mimicker and the mimickee? (c) Does conscious mimicry have consequences for emotion recognition? A paradigm is introduced in which participants interact over a computer setup with a confederate whose prerecorded facial displays of emotion are synchronized with participants’ behavior to create the illusion of social interaction. In Experiment 1, the confederate did or did not mimic participants’ facial displays of various subsets of basic emotions. Mimicry promoted greater liking for the confederate regardless of which emotions were mimicked. Experiment 2 reversed these roles: participants were instructed to mimic or not to mimic the confederate’s facial displays. Mimicry did not affect liking for the confederate but it did impair emotion recognition. PMID:25811746
Emotion Estimation Algorithm from Facial Image Analyses of e-Learning Users
NASA Astrophysics Data System (ADS)
Shigeta, Ayuko; Koike, Takeshi; Kurokawa, Tomoya; Nosu, Kiyoshi
This paper proposes an emotion estimation algorithm based on facial images of e-Learning users. The algorithm's characteristics are as follows: the criteria used to relate an e-Learning user's emotion to a representative emotion were obtained from a time-sequential analysis of the user's facial expressions. Based on the experimentally observed emotions of e-Learning users and the positional changes of their facial feature points, the following procedures are introduced to improve estimation reliability: (1) effective facial feature points are selected for emotion estimation; (2) subjects are divided into two groups according to the change rates of their facial feature points; (3) eigenvectors of the variance-covariance matrices are selected (cumulative contribution rate >= 95%); and (4) emotion is estimated using the Mahalanobis distance.
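Steps (3) and (4) amount to standard principal component selection followed by nearest-class Mahalanobis classification. A minimal sketch under that reading; the feature-point matrices and per-emotion class statistics are assumed inputs, not the paper's data:

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

def pca_basis(X: np.ndarray, cum_contrib: float = 0.95) -> np.ndarray:
    """Eigenvectors of the variance-covariance matrix of X (rows = samples),
    kept until their cumulative contribution rate reaches the threshold."""
    cov = np.cov(X - X.mean(axis=0), rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]  # sort eigenvalues in descending order
    vals, vecs = vals[order], vecs[:, order]
    k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), cum_contrib)) + 1
    return vecs[:, :k]

def estimate_emotion(sample, class_means, class_inv_covs):
    """Assign the emotion whose class mean is nearest in Mahalanobis distance;
    sample and means live in the reduced feature-point space."""
    return min(class_means,
               key=lambda e: mahalanobis(sample, class_means[e],
                                         class_inv_covs[e]))
```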
Facial Feedback Mechanisms in Autistic Spectrum Disorders
van den Heuvel, Claudia; Smeets, Raymond C.
2008-01-01
Facial feedback mechanisms of adolescents with Autistic Spectrum Disorders (ASD) were investigated in three studies. Facial expressions, which became activated via automatic (Studies 1 and 2) or intentional (Study 2) mimicry, or via holding a pen between the teeth (Study 3), influenced corresponding emotions for controls, while individuals with ASD remained emotionally unaffected. Thus, individuals with ASD do not experience feedback from activated facial expressions as controls do. This facial feedback impairment enhances our understanding of the social and emotional lives of individuals with ASD. PMID:18293075
Oxytocin attenuates neural reactivity to masked threat cues from the eyes.
Kanat, Manuela; Heinrichs, Markus; Schwarzwald, Ralf; Domes, Gregor
2015-01-01
The neuropeptide oxytocin has recently been shown to modulate covert attention shifts to emotional face cues and to improve discrimination of masked facial emotions. These results suggest that oxytocin modulates facial emotion processing at early perceptual stages prior to full evaluation of the emotional expression. Here, we used functional magnetic resonance imaging to examine whether oxytocin alters neural responses to backwardly masked angry and happy faces while controlling for attention to the eye vs. the mouth region. Intranasal oxytocin administration reduced amygdala reactivity to masked emotions when attending to salient facial features, i.e., the eyes of angry faces and the mouth of happy faces. In addition, oxytocin decreased neural responses within the fusiform gyrus and brain stem areas, as well as functional coupling between the amygdala and the fusiform gyrus, specifically for threat cues from the eyes. Effects of oxytocin on brain activity were not attributable to differences in behavioral performance, as oxytocin had no impact on mere emotion detection. Our results suggest that oxytocin attenuates neural correlates of early arousal by threat signals from the eye region. As reduced threat sensitivity may increase the likelihood of engaging in social interactions, our findings may have important implications for clinical states of social anxiety.
Realistic prediction of individual facial emotion expressions for craniofacial surgery simulations
NASA Astrophysics Data System (ADS)
Gladilin, Evgeny; Zachow, Stefan; Deuflhard, Peter; Hege, Hans-Christian
2003-05-01
In addition to static soft tissue prediction, the estimation of individual facial emotion expressions is an important criterion in the evaluation of craniofacial surgery planning. In this paper, we present an approach for the estimation of individual facial emotion expressions on the basis of geometrical models of human anatomy derived from tomographic data and finite element modeling of facial tissue biomechanics.
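In FEM-based soft tissue simulation, the deformation for a given expression is commonly obtained by solving a boundary-value problem of elastostatics; a linear elastic sketch (the abstract does not specify the authors' exact constitutive model) is:

```latex
% Linear-elastic sketch of FEM soft-tissue simulation (assumed model):
% strain from the displacement field u, Hooke's law, and the assembled
% discrete system for the nodal displacements.
\varepsilon(u) = \tfrac{1}{2}\,(\nabla u + \nabla u^{\top}), \qquad
\sigma = C : \varepsilon(u), \qquad
K\,\mathbf{u} = \mathbf{f}
```

Here K is the stiffness matrix assembled over the tomography-derived tissue mesh, and f encodes the prescribed loads or muscle actions that drive a particular emotion expression.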
García-Rodríguez, Beatriz; Guillén, Carmen Casares; Barba, Rosa Jurado; io Valladolid, Gabriel Rub; Arjona, José Antonio Molina; Ellgring, Heiner
2012-02-15
There is evidence that visuo-spatial capacity can become overloaded when processing a secondary visual task (dual task, DT), as occurs in daily life. Hence, we investigated the influence of visuo-spatial interference on the identification of emotional facial expressions (EFEs) in the early stages of Parkinson's disease (PD). We compared the identification of 24 emotional faces illustrating six basic emotions in unmedicated, recently diagnosed PD patients (n = 16) and healthy adults (n = 20) under two different conditions: (a) simple EFE identification, and (b) identification with a concurrent visuo-spatial task (Corsi blocks). EFE identification by PD patients was significantly worse than that of healthy adults when combined with another visual stimulus. Published by Elsevier B.V.
Ardizzi, Martina; Evangelista, Valentina; Ferroni, Francesca; Umiltà, Maria A.; Ravera, Roberto; Gallese, Vittorio
2017-01-01
One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, which normally allows the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus, unbalancing victims’ recognition of negative emotions. Despite the plethora of studies on this topic, to date, it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30, 15 males) performed a forced-choice chimeric facial expressions recognition task. The chimeric facial expressions were obtained by pairing upper and lower half faces of two different negative emotions (selected from anger, fear and sadness for a total of six different combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger. In this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims’ performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify the perceptual analysis of phylogenetically old behavioral patterns like the facial expressions of emotions. PMID:28690565
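The "six different combinations" follow from ordered pairings of three negative emotions across the two face halves (an anger-over-fear chimera differs from a fear-over-anger one). A one-line check, purely illustrative:

```python
from itertools import permutations

emotions = ["anger", "fear", "sadness"]
# Upper and lower halves must show two different emotions, and the
# upper/lower assignment matters, so ordered pairs apply: 3 * 2 = 6.
chimeras = [{"upper": up, "lower": lo} for up, lo in permutations(emotions, 2)]
assert len(chimeras) == 6
```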
Can an anger face also be scared? Malleability of facial expressions.
Widen, Sherri C; Naab, Pamela
2012-10-01
Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.
Real-time speech-driven animation of expressive talking faces
NASA Astrophysics Data System (ADS)
Liu, Jia; You, Mingyu; Chen, Chun; Song, Mingli
2011-05-01
In this paper, we present a real-time facial animation system in which speech drives mouth movements and facial expressions synchronously. Considering five basic emotions, a hierarchical structure with an upper layer of emotion classification is established. Based on the recognized emotion label, the lower-layer classification at the sub-phonemic level is modelled on the relationship between the acoustic features of frames and the audio labels within phonemes. Using certain constraints, the predicted emotion labels of speech are adjusted to obtain facial expression labels, which are combined with the sub-phonemic labels. The combinations are mapped onto facial action units (FAUs), and audio-visual synchronized animation with mouth movements and facial expressions is generated by morphing between FAUs. The experimental results demonstrate that the two-layer structure succeeds in both emotion and sub-phonemic classification, and that the synthesized facial sequences reach a comparatively convincing quality.
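A structural sketch of the two-layer pipeline described above; every function body is a hypothetical stand-in (the classifiers, feature set, and FAU mapping are not specified beyond the abstract):

```python
from typing import List, Tuple

EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised"]  # assumed label set

def classify_emotion(frames: List[List[float]]) -> str:
    """Upper layer: map the utterance's acoustic features to one emotion label."""
    return EMOTIONS[0]  # placeholder decision rule

def classify_subphoneme(frame: List[float], emotion: str) -> str:
    """Lower layer: frame-level sub-phonemic label, conditioned on the emotion."""
    return "sp0"  # placeholder decision rule

def map_to_fau(emotion: str, subphoneme: str) -> Tuple[str, str]:
    """Combine expression and sub-phonemic labels into a FAU key to morph between."""
    return (emotion, subphoneme)

def animate(frames: List[List[float]]) -> List[Tuple[str, str]]:
    emotion = classify_emotion(frames)
    return [map_to_fau(emotion, classify_subphoneme(f, emotion)) for f in frames]
```

The point of the hierarchy is presumably that the sub-phonemic classifier can be conditioned per emotion, since the mapping from acoustics to mouth shape shifts with expressive speaking style.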
Sex differences in social cognition: The case of face processing.
Proverbio, Alice Mado
2017-01-02
Several studies have demonstrated that women show a greater interest in social information and a more empathic attitude than men. This article reviews studies on sex differences in the brain, with particular reference to how males and females process faces and facial expressions, social interactions, pain of others, infant faces, faces in things (the pareidolia phenomenon), opposite-sex faces, humans vs. landscapes, incongruent behavior, motor actions, biological motion, erotic pictures, and emotional information. Sex differences in oxytocin-based attachment response and emotional memory are also mentioned. In addition, we investigated how 400 different human faces were evaluated on arousal and valence dimensions by a group of healthy male and female University students. Stimuli were carefully balanced for sensory and perceptual characteristics, age, facial expression, and sex. As a whole, women judged all human faces as more positive and more arousing than men did. Furthermore, they showed a preference for the faces of children and the elderly in the arousal evaluation. Regardless of face aesthetics, age, or facial expression, women rated human faces higher than men did. The preference for opposite- vs. same-sex faces strongly interacted with facial age. Overall, both women and men exhibited differences in facial processing that could be interpreted in the light of evolutionary psychobiology. © 2016 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Chickerur, Satyadhyan; Joshi, Kartik
2015-01-01
Emotion detection using facial images is a technique that researchers have been using for the last two decades to try to analyze a person's emotional state given his/her image. Detection of various kinds of emotion using facial expressions of students in educational environment is useful in providing insight into the effectiveness of tutoring…
Barkl, Sophie J; Lah, Suncica; Harris, Anthony W F; Williams, Leanne M
2014-10-01
Patients with chronic schizophrenia are characterized by deficits in identifying facial expressions of emotion, and these deficits relate to impaired social and occupational function. It is not yet known whether these deficits are trait-like and present at the onset of psychosis, preceding a subsequent diagnosis of schizophrenia. Our objective was to systematically review and analyze the extant literature to assess whether there is a consistent profile of emotion identification problems in early-onset and first-episode psychosis. We conducted a systematic review and meta-analysis of 12 peer-reviewed studies of facial emotion identification in early-onset and first-episode psychosis, published between 1980 and March 2013. We examined the average mean difference between patients and controls on measures of facial emotion identification. Findings suggest that patients with early-onset and first-episode psychosis have impairment in identifying facial expressions of biologically salient emotion. Across the 12 studies, the onset of psychosis was distinguished by a generalized effect of significantly poorer accuracy in identifying facial expressions of emotion than healthy controls, and this difference had a substantial effect size (d = -0.88, N = 378, 95% CI = -1.42 to -0.32). Within this general effect, some emotions were also harder for patients to identify than others, with the magnitude of impairment found to be (i) large for disgust, fear, and surprise, and (ii) medium for sadness and happiness. No between-group mean differences were found for anger or neutral facial expressions. Deficits in facial emotion identification are thus evident at the first onset of a psychotic episode. The findings suggest that, over and above a generalized deficit in identifying facial emotion, patients may find some emotions harder to identify than others. This mirrors findings in chronic schizophrenia populations and suggests that emotion identification impairment represents a trait susceptibility marker rather than a sequela of illness. These findings signal the urgent need to treat emotion identification deficits at the onset of illness, which could improve functional outcomes. Copyright © 2014 Elsevier B.V. All rights reserved.
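The pooled effect size above is a standardized mean difference; before pooling, each study contributes a d of this form. A sketch with hypothetical numbers (not the review's data):

```python
import math

def cohens_d_with_ci(m1, s1, n1, m2, s2, n2):
    """Cohen's d for a patient-vs-control comparison, with an approximate
    95% CI based on the usual large-sample standard error of d."""
    s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, (d - 1.96 * se, d + 1.96 * se)

# Hypothetical study: patients identify fewer expressions correctly than controls.
d, ci = cohens_d_with_ci(m1=20.1, s1=4.0, n1=30, m2=23.6, s2=4.2, n2=30)
print(round(d, 2), tuple(round(x, 2) for x in ci))  # approx. -0.85 (-1.38, -0.32)
```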