Sample records for facial expression processing

  1. Two Ways to Facial Expression Recognition? Motor and Visual Information Have Different Effects on Facial Expression Recognition.

    PubMed

    de la Rosa, Stephan; Fademrecht, Laura; Bülthoff, Heinrich H; Giese, Martin A; Curio, Cristóbal

    2018-06-01

    Motor-based theories of facial expression recognition propose that the visual perception of facial expression is aided by sensorimotor processes that are also used for the production of the same expression. Accordingly, sensorimotor and visual processes should provide congruent emotional information about a facial expression. Here, we report evidence that challenges this view. Specifically, the repeated execution of facial expressions has the opposite effect on the recognition of a subsequent facial expression from that of the repeated viewing of facial expressions. Moreover, the findings of the motor condition, but not of the visual condition, were correlated with a nonsensory condition in which participants imagined an emotional situation. These results are well accounted for by the idea that facial expression recognition is not always mediated by motor processes; facial expressions can also be recognized from visual information alone.

  2. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    ERIC Educational Resources Information Center

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated whether deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity…

  3. Symmetrical and Asymmetrical Interactions between Facial Expressions and Gender Information in Face Perception.

    PubMed

    Liu, Chengwei; Liu, Ying; Iqbal, Zahida; Li, Wenhui; Lv, Bo; Jiang, Zhongqing

    2017-01-01

    To investigate the interaction between facial expressions and facial gender information during face perception, the present study matched the intensities of the two types of information in face images and then adopted the orthogonal condition of the Garner paradigm: participants judged the gender and expression of faces whose gender and expression were varied orthogonally. Gender and expression processing displayed a mutual interaction. On the one hand, angry expressions were judged faster when presented on male faces; on the other hand, female faces were classified by gender faster when they showed a happy rather than an angry expression. According to the event-related potential results, expression classification was influenced by gender during the face structural processing stage (as indexed by the N170), indicating promotion or interference by facial gender with the coding of facial expression features. However, gender processing was affected by facial expressions at more stages, including the early (P1) and late (LPC) stages of perceptual processing, reflecting that emotional expression influences gender processing mainly by directing attention.

  4. Cerebellum and processing of negative facial emotions: cerebellar transcranial DC stimulation specifically enhances the emotional recognition of facial anger and sadness.

    PubMed

    Ferrucci, Roberta; Giannicola, Gaia; Rosa, Manuela; Fumagalli, Manuela; Boggio, Paulo Sergio; Hallett, Mark; Zago, Stefano; Priori, Alberto

    2012-01-01

    Some evidence suggests that the cerebellum participates in the complex network that processes emotional facial expressions. To evaluate the role of the cerebellum in recognising facial expressions, we delivered transcranial direct current stimulation (tDCS) over the cerebellum and prefrontal cortex. A facial emotion recognition task was administered to 21 healthy subjects before and after cerebellar tDCS; we also tested subjects with a visual attention task and a visual analogue scale (VAS) for mood. Anodal and cathodal cerebellar tDCS both significantly enhanced sensory processing in response to negative facial expressions (anodal tDCS, p=.0021; cathodal tDCS, p=.018), but left responses to positive and neutral facial expressions unchanged (p>.05). tDCS over the right prefrontal cortex left responses to facial expressions of both negative and positive emotion unchanged. These findings suggest that the cerebellum is specifically involved in processing facial expressions of negative emotion.

  5. Processing of subliminal facial expressions of emotion: a behavioral and fMRI study.

    PubMed

    Prochnow, D; Kossack, H; Brunheim, S; Müller, K; Wittsack, H-J; Markowitsch, H-J; Seitz, R J

    2013-01-01

    The recognition of emotional facial expressions is an important means of adjusting behavior in social interactions. As facial expressions differ widely in their duration and degree of expressiveness, they often manifest as short and transient expressions below the level of awareness. In this combined behavioral and fMRI study, we aimed to examine whether emotional facial expressions that are not consciously accessible (subliminal) influence empathic judgments, and which brain activations are related to this. We hypothesized that subliminal facial expressions of emotions masked with neutral expressions of the same faces induce empathic processing similar to that of consciously accessible (supraliminal) facial expressions. Our behavioral data in 23 healthy subjects showed that subliminal emotional facial expressions of 40 ms duration affect the judgments of the subsequent neutral facial expressions. The fMRI study in 12 healthy subjects found that supra- and subliminal emotional facial expressions both engaged a widespread network of brain areas including the fusiform gyrus, the temporo-parietal junction, and the inferior, dorsolateral, and medial frontal cortex. Compared with subliminal facial expressions, supraliminal facial expressions led to greater activation of left occipital and fusiform face areas. We conclude that masked subliminal emotional information is suited to trigger processing in brain areas that have been implicated in empathy and, thereby, in social encounters.

  6. Facial identity and facial expression are initially integrated at visual perceptual stages of face processing.

    PubMed

    Fisher, Katie; Towler, John; Eimer, Martin

    2016-01-08

    It is frequently assumed that facial identity and facial expression are analysed in functionally and anatomically distinct streams within the core visual face processing system. To investigate whether expression and identity interact during the visual processing of faces, we employed a sequential matching procedure where participants compared either the identity or the expression of two successively presented faces, and ignored the other irrelevant dimension. Repetitions versus changes of facial identity and expression were varied independently across trials, and event-related potentials (ERPs) were recorded during task performance. Irrelevant facial identity and irrelevant expression both interfered with performance in the expression and identity matching tasks. These symmetrical interference effects show that neither identity nor expression can be selectively ignored during face matching, and suggest that they are not processed independently. N250r components to identity repetitions that reflect identity matching mechanisms in face-selective visual cortex were delayed and attenuated when there was an expression change, demonstrating that facial expression interferes with visual identity matching. These findings provide new evidence for interactions between facial identity and expression within the core visual processing system, and question the hypothesis that these two attributes are processed independently.

  7. Altering sensorimotor feedback disrupts visual discrimination of facial expressions.

    PubMed

    Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula

    2016-08-01

    Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual (and not just conceptual) processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.

  8. Effects of facial color on the subliminal processing of fearful faces.

    PubMed

    Nakajima, K; Minami, T; Nakauchi, S

    2015-12-03

    Recent studies have suggested that both configural information, such as face shape, and surface information are important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. On the other hand, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that the bluish-colored faces increased the latency effect of facial expressions compared to the natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color.

  9. Proposal of Self-Learning and Recognition System of Facial Expression

    NASA Astrophysics Data System (ADS)

    Ogawa, Yukihiro; Kato, Kunihito; Yamamoto, Kazuhiko

    We describe the realization of a more complex function using information acquired from simpler functions with which the robot is already equipped. We propose a self-learning and recognition system for human facial expressions that operates within a natural relationship between human and robot. A robot with this system can understand human facial expressions and behave according to them after completing the learning process. The system is modelled after the process by which a baby learns its parents' facial expressions. Equipped with a camera, the system acquires face images; equipped with CdS sensors on the robot's head, it acquires information about the human's actions. Using the information from these sensors, the robot extracts a feature for each facial expression. After self-learning is complete, when a person changes his or her facial expression in front of the robot, the robot performs the action associated with that expression.
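
    As a rough illustration of the learning-then-reacting loop described above (a hedged sketch, not the authors' implementation: the feature vectors, nearest-neighbour matching, and action labels are hypothetical stand-ins for the camera and CdS sensor data), consider the following Python fragment:

      # Toy sketch: associate facial-expression features (camera) with observed
      # human actions (CdS sensors), then react with the action paired with the
      # nearest learned expression. All data here are illustrative stand-ins.
      import numpy as np

      class ExpressionLearner:
          def __init__(self):
              self.features, self.actions = [], []

          def learn(self, face_feature, action):
              """Learning phase: store an expression feature with its paired action."""
              self.features.append(np.asarray(face_feature, dtype=float))
              self.actions.append(action)

          def react(self, face_feature):
              """Recognition phase: return the action paired with the nearest
              stored expression feature (simple nearest-neighbour matching)."""
              face_feature = np.asarray(face_feature, dtype=float)
              dists = [np.linalg.norm(face_feature - f) for f in self.features]
              return self.actions[int(np.argmin(dists))]

      robot = ExpressionLearner()
      robot.learn([0.9, 0.1], "smile back")  # e.g. features of a happy face
      robot.learn([0.1, 0.8], "back away")   # e.g. features of an angry face
      print(robot.react([0.85, 0.2]))        # -> "smile back"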

  10. [Prosopagnosia and facial expression recognition].

    PubMed

    Koyama, Shinichi

    2014-04-01

    This paper reviews clinical neuropsychological studies that have indicated that the recognition of a person's identity and the recognition of facial expressions are processed by different cortical and subcortical areas of the brain. The fusiform gyrus, especially the right fusiform gyrus, plays an important role in the recognition of identity. The superior temporal sulcus, amygdala, and medial frontal cortex play important roles in facial-expression recognition. Both facial recognition and facial-expression recognition are highly intellectual processes that involve several regions of the brain.

  11. Differences in holistic processing do not explain cultural differences in the recognition of facial expression.

    PubMed

    Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J

    2017-12-01

    The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly for anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing that holistic perception of the aligned composite images interfered with recognition of the expressions' parts. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.

  12. Enhanced subliminal emotional responses to dynamic facial expressions.

    PubMed

    Sato, Wataru; Kubota, Yasutaka; Toichi, Motomi

    2014-01-01

    Emotional processing without conscious awareness plays an important role in human social interaction. Several behavioral studies reported that subliminal presentation of photographs of emotional facial expressions induces unconscious emotional processing. However, it was difficult to elicit strong and robust effects using this method. We hypothesized that dynamic presentations of facial expressions would enhance subliminal emotional effects and tested this hypothesis with two experiments. Fearful or happy facial expressions were presented dynamically or statically in either the left or the right visual field for 20 (Experiment 1) and 30 (Experiment 2) ms. Nonsense target ideographs were then presented, and participants reported their preference for them. The results consistently showed that dynamic presentations of emotional facial expressions induced more evident emotional biases toward subsequent targets than did static ones. These results indicate that dynamic presentations of emotional facial expressions induce more evident unconscious emotional processing.

  13. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder.

    PubMed

    Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can

    2017-01-01

    Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD were slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results showed that individuals with IGD exhibited decreased amplitudes of the ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy-neutral context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to happy and neutral expressions in the happy-neutral context, as well as to sad and neutral expressions in the sad-neutral context. Both groups showed comparable ERP amplitudes during the processing of sad and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy-neutral context. Highlights:

    • The present study investigated whether the unconscious processing of facial expressions is influenced by excessive online gaming. A validated backward masking paradigm was used to test whether individuals with IGD and NC exhibit different patterns of facial expression processing.

    • The results demonstrated that individuals with IGD respond differently to facial expressions than NC at a preattentive level. Behaviorally, individuals with IGD were slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results further showed (1) decreased N170 amplitudes in individuals with IGD when processing neutral expressions compared with happy expressions in the happy-neutral context, whereas the NC exhibited comparable N170 amplitudes in response to these two expressions; and (2) similar N170 amplitudes in both groups in response to sad and neutral faces in the sad-neutral context.

    • The decreased N170 amplitudes to neutral faces relative to happy faces in individuals with IGD might be due to their lower expectancies for neutral content in the happy-neutral context, whereas they may hold similar expectancies for neutral and sad faces in the sad-neutral context.

  14. The Relationships between Processing Facial Identity, Emotional Expression, Facial Speech, and Gaze Direction during Development

    ERIC Educational Resources Information Center

    Spangler, Sibylle M.; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna

    2010-01-01

    Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding…

  15. Discrimination of emotional facial expressions by tufted capuchin monkeys (Sapajus apella).

    PubMed

    Calcutt, Sarah E; Rubin, Taylor L; Pokorny, Jennifer J; de Waal, Frans B M

    2017-02-01

    Tufted or brown capuchin monkeys (Sapajus apella) have been shown to recognize conspecific faces as well as categorize them according to group membership. Little is known, though, about their capacity to differentiate between emotionally charged facial expressions or whether facial expressions are processed as a collection of features or configurally (i.e., as a whole). In 3 experiments, we examined whether tufted capuchins (a) differentiate photographs of neutral faces from either affiliative or agonistic expressions, (b) use relevant facial features to make such choices or view the expression as a whole, and (c) demonstrate an inversion effect for facial expressions suggestive of configural processing. Using an oddity paradigm presented on a computer touchscreen, we collected data from 9 adult and subadult monkeys. Subjects discriminated between emotional and neutral expressions with an exceptionally high success rate, including differentiating open-mouth threats from neutral expressions even when the latter contained varying degrees of visible teeth and mouth opening. They also showed an inversion effect for facial expressions, a result suggesting that rapid expression recognition does not rely on feature-based processing alone but likely on a combination of featural and relational processes.

  16. Facial Expression Recognition using Multiclass Ensemble Least-Square Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Lawi, Armin; Sya'Rani Machrizzandi, M.

    2018-03-01

    Facial expression is one of the behavioral characteristics of human beings. The use of biometric technology with facial expression characteristics makes it possible to recognize a person's mood or emotion. The basic components of a facial expression analysis system are face detection, face image extraction, facial classification, and facial expression recognition. This paper uses the Principal Component Analysis (PCA) algorithm to extract facial features for the expression parameters happy, sad, neutral, angry, fearful, and disgusted. A Multiclass Ensemble Least-Squares Support Vector Machine (MELS-SVM) is then used for the classification of facial expressions. Evaluated on 185 expression images of 10 persons, the MELS-SVM model achieved a high accuracy of 99.998% using the RBF kernel.
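
    A minimal sketch of a pipeline in this spirit, using Python and scikit-learn: PCA feature extraction feeding a multiclass RBF-kernel SVM. Note that scikit-learn ships no ensemble least-squares SVM, so the standard SVC stands in for the paper's MELS-SVM, and the image data below are random placeholders for real face images:

      # PCA features + multiclass RBF-kernel SVM, a stand-in for the PCA +
      # MELS-SVM pipeline described above (SVC is one-vs-one multiclass by default).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.random((185, 64 * 64))    # placeholder: 185 flattened face images
      y = rng.integers(0, 6, size=185)  # placeholder: 6 expression labels

      clf = make_pipeline(
          PCA(n_components=50, whiten=True),         # eigenface-style features
          SVC(kernel="rbf", C=10.0, gamma="scale"),  # multiclass RBF-kernel SVM
      )
      print(cross_val_score(clf, X, y, cv=5).mean())  # chance-level on random data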

  17. Impaired holistic coding of facial expression and facial identity in congenital prosopagnosia.

    PubMed

    Palermo, Romina; Willis, Megan L; Rivolta, Davide; McKone, Elinor; Wilson, C Ellie; Calder, Andrew J

    2011-04-01

    We test 12 individuals with congenital prosopagnosia (CP), who replicate a common pattern of showing severe difficulty in recognising facial identity in conjunction with normal recognition of facial expressions (both basic and 'social'). Strength of holistic processing was examined using standard expression composite and identity composite tasks. Compared to age- and sex-matched controls, group analyses demonstrated that CPs showed weaker holistic processing, for both expression and identity information. Implications are (a) normal expression recognition in CP can derive from compensatory strategies (e.g., over-reliance on non-holistic cues to expression); (b) the split between processing of expression and identity information may take place after a common stage of holistic processing; and (c) contrary to a recent claim, holistic processing of identity is functionally involved in face identification ability.

  18. Impaired holistic coding of facial expression and facial identity in congenital prosopagnosia

    PubMed Central

    Palermo, Romina; Willis, Megan L.; Rivolta, Davide; McKone, Elinor; Wilson, C. Ellie; Calder, Andrew J.

    2011-01-01

    We test 12 individuals with congenital prosopagnosia (CP), who replicate a common pattern of showing severe difficulty in recognising facial identity in conjunction with normal recognition of facial expressions (both basic and ‘social’). Strength of holistic processing was examined using standard expression composite and identity composite tasks. Compared to age- and sex-matched controls, group analyses demonstrated that CPs showed weaker holistic processing, for both expression and identity information. Implications are (a) normal expression recognition in CP can derive from compensatory strategies (e.g., over-reliance on non-holistic cues to expression); (b) the split between processing of expression and identity information may take place after a common stage of holistic processing; and (c) contrary to a recent claim, holistic processing of identity is functionally involved in face identification ability. PMID:21333662

  19. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    PubMed Central

    Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown whether a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand whether individuals with impaired face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models, since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643

  20. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    PubMed

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown whether a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand whether individuals with impaired face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus process this facial dimension independently from features (which are impaired in CP) and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expressions (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models, since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  21. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    PubMed

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  22. The Relationship between Processing Facial Identity and Emotional Expression in 8-Month-Old Infants

    ERIC Educational Resources Information Center

    Schwarzer, Gudrun; Jovanovic, Bianca

    2010-01-01

    In Experiment 1, it was investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a…

  23. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: a fixation-to-feature approach

    PubMed Central

    Neath-Tavares, Karly N.; Itier, Roxane J.

    2017-01-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100–120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms. PMID:27430934

  24. Recognition of facial expressions and prosodic cues with graded emotional intensities in adults with Asperger syndrome.

    PubMed

    Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-09-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically-developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.

  25. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    PubMed

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception.

  26. Contextual interference processing during fast categorisations of facial expressions.

    PubMed

    Frühholz, Sascha; Trautmann-Lengsfeld, Sina A; Herrmann, Manfred

    2011-09-01

    We examined interference effects of emotionally associated background colours during fast valence categorisations of negative, neutral and positive expressions. According to implicitly learned colour-emotion associations, facial expressions were presented with colours that either matched the valence of these expressions or not. Experiment 1 included infrequent non-matching trials and Experiment 2 a balanced ratio of matching and non-matching trials. Besides general modulatory effects of contextual features on the processing of facial expressions, we found differential effects depending on the valence of target facial expressions. Whereas performance accuracy was mainly affected for neutral expressions, performance speed was specifically modulated by emotional expressions, indicating some susceptibility of emotional expressions to contextual features. Experiment 3 used two further colour-emotion combinations, but revealed only marginal interference effects, most likely due to missing colour-emotion associations. The results are discussed with respect to inherent processing demands of emotional and neutral expressions and their susceptibility to contextual interference.

  27. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease.

    PubMed

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Altered emotional processing, including reduced emotional facial expression and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. We aimed to investigate possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled in this study. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients than in healthy controls (all Ps < 0.05). Patients also yielded a worse Ekman global score and worse disgust, sadness, and fear sub-scores than healthy controls (all Ps < 0.001). Altered facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). These results provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.

  28. Functional integration of the posterior superior temporal sulcus correlates with facial expression recognition.

    PubMed

    Wang, Xu; Song, Yiying; Zhen, Zonglei; Liu, Jia

    2016-05-01

    Face perception is essential for daily and social activities. Neuroimaging studies have revealed a distributed face network (FN) consisting of multiple regions that exhibit preferential responses to invariant or changeable facial information. However, our understanding about how these regions work collaboratively to facilitate facial information processing is limited. Here, we focused on changeable facial information processing, and investigated how the functional integration of the FN is related to the performance of facial expression recognition. To do so, we first defined the FN as voxels that responded more strongly to faces than objects, and then used a voxel-based global brain connectivity method based on resting-state fMRI to characterize the within-network connectivity (WNC) of each voxel in the FN. By relating the WNC and performance in the "Reading the Mind in the Eyes" Test across participants, we found that individuals with stronger WNC in the right posterior superior temporal sulcus (rpSTS) were better at recognizing facial expressions. Further, the resting-state functional connectivity (FC) between the rpSTS and right occipital face area (rOFA), early visual cortex (EVC), and bilateral STS were positively correlated with the ability of facial expression recognition, and the FCs of EVC-pSTS and OFA-pSTS contributed independently to facial expression recognition. In short, our study highlights the behavioral significance of intrinsic functional integration of the FN in facial expression processing, and provides evidence for the hub-like role of the rpSTS for facial expression recognition. Hum Brain Mapp 37:1930-1940, 2016.
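
    On one common reading, the voxel-based within-network connectivity (WNC) measure used here is each voxel's mean resting-state correlation with every other voxel in the face network. A minimal numpy sketch under that assumption (array shapes and data are illustrative):

      # Within-network connectivity (WNC): mean correlation of each voxel's
      # time series with those of all other voxels in the network.
      import numpy as np

      def within_network_connectivity(ts):
          """ts: (n_voxels, n_timepoints) resting-state time series.
          Returns one WNC value per voxel."""
          r = np.corrcoef(ts)          # (n_voxels, n_voxels) correlation matrix
          np.fill_diagonal(r, np.nan)  # exclude self-correlations
          return np.nanmean(r, axis=1)

      rng = np.random.default_rng(0)
      wnc = within_network_connectivity(rng.standard_normal((500, 200)))
      print(wnc.shape)  # (500,): values to relate to behavior across participants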

  29. The Associations between Visual Attention and Facial Expression Identification in Patients with Schizophrenia.

    PubMed

    Lin, I-Mei; Fan, Sheng-Yu; Huang, Tiao-Lai; Wu, Wan-Ting; Li, Shi-Ming

    2013-12-01

    Visual search is an important attention process that precedes information processing, and it mediates the relationship between cognitive function (attention) and social cognition (such as facial expression identification). However, the association between visual attention and social cognition in patients with schizophrenia remains unknown. The purposes of this study were to examine the differences in visual search performance and facial expression identification between patients with schizophrenia and normal controls, and to explore the relationship between visual search performance and facial expression identification in patients with schizophrenia. Fourteen patients with schizophrenia (mean age=46.36±6.74) and 15 normal controls (mean age=40.87±9.33) participated in this study. A visual search task, including feature search and conjunction search, and the Japanese and Caucasian Facial Expressions of Emotion set were administered. Patients with schizophrenia had worse visual search performance, in both feature search and conjunction search, than normal controls, as well as worse facial expression identification, especially for surprise and sadness. In addition, there were negative associations between visual search performance and facial expression identification in patients with schizophrenia, especially for surprise and sadness; this pattern was not shown in normal controls. Patients with schizophrenia who had visual search deficits showed impaired facial expression identification. Improving visual search and facial expression identification abilities may therefore improve their social functioning and interpersonal relationships.

  30. Long-term academic stress enhances early processing of facial expressions.

    PubMed

    Zhang, Liang; Qin, Shaozheng; Yao, Zhuxi; Zhang, Kan; Wu, Jianhui

    2016-11-01

    Exposure to long-term stress can lead to a variety of emotional and behavioral problems. Although widely investigated, the neural basis of how long-term stress impacts emotional processing in humans remains largely elusive. Using event-related brain potentials (ERPs), we investigated the effects of long-term stress on the neural dynamics of emotional facial expression processing. Thirty-nine male college students undergoing preparation for a major examination and twenty-one matched controls performed a gender discrimination task for faces displaying angry, happy, and neutral expressions. The results of the Perceived Stress Scale showed that participants in the stress group perceived higher levels of long-term stress relative to the control group. ERP analyses revealed differential effects of long-term stress on two early stages of facial expression processing: 1) long-term stress generally augmented posterior P1 amplitudes to facial stimuli irrespective of expression valence, suggesting that stress can increase sensitization to visual inputs in general, and 2) long-term stress selectively augmented fronto-central P2 amplitudes for angry but not for neutral or positive facial expressions, suggesting that stress may lead to increased attentional prioritization of negative emotional stimuli. Together, our findings suggest that long-term stress has profound impacts on the early stages of facial expression processing, with an increase at the very early stage of general information input and a subsequent attentional bias toward emotionally negative stimuli.

  31. Rapid processing of emotional expressions without conscious awareness.

    PubMed

    Smith, Marie L

    2012-08-01

    Rapid, accurate categorization of the emotional state of our peers is of critical importance, and as such many have proposed that facial expressions of emotion can be processed without conscious awareness. Typically, studies focus selectively on fearful expressions due to their evolutionary significance, leaving the subliminal processing of other facial expressions largely unexplored. Here, I investigated the time course of processing of 3 facial expressions (fearful, disgusted, and happy) plus an emotionally neutral face, during objectively unaware and aware perception. Participants completed the challenging "which expression?" task in response to briefly presented backward-masked expressive faces. Although participants' behavioral responses did not differentiate between the emotional content of the stimuli in the unaware condition, activity over frontal and occipitotemporal (OT) brain regions indicated an emotional modulation of the neuronal response. Over frontal regions this modulation was driven by negative facial expressions and was present on all emotional trials independent of later categorization, whereas the N170 component, recorded at lateral OT electrodes, was enhanced for all facial expressions but only on trials that would later be categorized as emotional. The results indicate that emotional faces, not only fearful ones, are processed without conscious awareness at an early stage, and highlight the critical importance of considering the categorization response when studying subliminal perception.

  32. System for face recognition under expression variations of neutral-sampled individuals using recognized expression warping and a virtual expression-face database

    NASA Astrophysics Data System (ADS)

    Petpairote, Chayanut; Madarasmi, Suthep; Chamnongthai, Kosin

    2018-01-01

    The practical identification of individuals using facial recognition techniques requires matching faces bearing specific expressions to faces from a neutral-face database. We propose a method for facial recognition under varied expressions against neutral face samples of individuals, via recognized-expression warping and the use of a virtual expression-face database. In this method, facial expressions are recognized and the input expression faces are classified into facial expression groups. To aid facial recognition, the virtual expression-face database is sorted into average facial-expression shapes and by coarse- and fine-featured facial textures. Wrinkle information is also employed in classification, using a masking process to adjust input faces to match the expression-face database. We evaluate the performance of the proposed method using the CMU Multi-PIE, Cohn-Kanade, and AR expression-face databases, and find that it provides significantly improved face recognition accuracy compared to conventional methods and is acceptable for facial recognition under expression variation.
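
    The control flow of such a system might look like the following Python skeleton. This is a heavily hedged sketch: every function body is a placeholder (the real method uses recognized-expression classification, shape warping toward the virtual expression-face database, and wrinkle masking), and all names are hypothetical:

      # Schematic matching flow: classify the probe's expression, warp each
      # neutral gallery face toward that expression, then match the probe
      # against the resulting "virtual expression faces".
      import numpy as np

      def classify_expression(face):
          return "happy"  # placeholder expression classifier

      def warp_to_expression(neutral_face, expression):
          return neutral_face  # placeholder for the expression-shape warp

      def similarity(a, b):
          return -float(np.linalg.norm(a - b))  # placeholder matcher

      def identify(probe, gallery):
          expr = classify_expression(probe)
          virtual = {name: warp_to_expression(f, expr) for name, f in gallery.items()}
          return max(virtual, key=lambda name: similarity(probe, virtual[name]))

      rng = np.random.default_rng(0)
      gallery = {"alice": rng.random(128), "bob": rng.random(128)}
      print(identify(gallery["alice"] + 0.01, gallery))  # -> "alice"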

  33. Neural processing of fearful and happy facial expressions during emotion-relevant and emotion-irrelevant tasks: A fixation-to-feature approach.

    PubMed

    Neath-Tavares, Karly N; Itier, Roxane J

    2016-09-01

    Research suggests an important role of the eyes and mouth for discriminating facial expressions of emotion. A gaze-contingent procedure was used to test the impact of fixation to facial features on the neural response to fearful, happy and neutral facial expressions in an emotion discrimination (Exp.1) and an oddball detection (Exp.2) task. The N170 was the only eye-sensitive ERP component, and this sensitivity did not vary across facial expressions. In both tasks, compared to neutral faces, responses to happy expressions were seen as early as 100-120ms occipitally, while responses to fearful expressions started around 150ms, on or after the N170, at both occipital and lateral-posterior sites. Analyses of scalp topographies revealed different distributions of these two emotion effects across most of the epoch. Emotion processing interacted with fixation location at different times between tasks. Results suggest a role of both the eyes and mouth in the neural processing of fearful expressions and of the mouth in the processing of happy expressions, before 350ms.

  34. Mere social categorization modulates identification of facial expressions of emotion.

    PubMed

    Young, Steven G; Hugenberg, Kurt

    2010-12-01

    The ability of the human face to communicate emotional states via facial expressions is well known, and past research has established the importance and universality of emotional facial expressions. However, recent evidence has revealed that facial expressions of emotion are most accurately recognized when the perceiver and expresser are from the same cultural ingroup. The current research builds on this literature and extends this work. Specifically, we find that mere social categorization, using a minimal-group paradigm, can create an ingroup emotion-identification advantage even when the culture of the target and perceiver is held constant. Follow-up experiments show that this effect is supported by differential motivation to process ingroup versus outgroup faces and that this motivational disparity leads to more configural processing of ingroup faces than of outgroup faces. Overall, the results point to distinct processing modes for ingroup and outgroup faces, resulting in differential identification accuracy for facial expressions of emotion.

  35. The processing of facial identity and expression is interactive, but dependent on task and experience

    PubMed Central

    Yankouskaya, Alla; Humphreys, Glyn W.; Rotshtein, Pia

    2014-01-01

    Facial identity and emotional expression are two important sources of information for daily social interaction. However, the link between these two aspects of face processing has been the focus of an unresolved debate for the past three decades. Three views have been advocated: (1) separate and parallel processing of identity and emotional expression signals derived from faces; (2) asymmetric processing, with the computation of emotion in faces depending on facial identity coding but not vice versa; and (3) integrated processing of facial identity and emotion. We present studies with healthy participants that primarily apply methods from mathematical psychology, formally testing the relations between the processing of facial identity and emotion. Specifically, we focused on the "Garner" paradigm, the composite face effect, and divided attention tasks. We further ask whether the architecture of face-related processes is fixed or flexible and whether (and how) it can be shaped by experience. We conclude that formal methods of testing the relations between processes show that the processing of facial identity and expressions interact, and hence are not fully independent. We further demonstrate that the architecture of the relations depends on experience, with experience leading to a higher degree of interdependence in the processing of identity and expressions. We propose that this change occurs because integrative processes are more efficient than parallel ones. Finally, we argue that the dynamic aspects of face processing need to be incorporated into theories in this field. PMID:25452722

  36. Revisiting the Relationship between the Processing of Gaze Direction and the Processing of Facial Expression

    ERIC Educational Resources Information Center

    Ganel, Tzvi

    2011-01-01

    There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension.…

  37. Impaired mixed emotion processing in the right ventrolateral prefrontal cortex in schizophrenia: an fMRI study.

    PubMed

    Szabó, Ádám György; Farkas, Kinga; Marosi, Csilla; Kozák, Lajos R; Rudas, Gábor; Réthelyi, János; Csukly, Gábor

    2017-12-08

    Schizophrenia has a negative effect on the activity of the temporal and prefrontal cortices during the processing of emotional facial expressions. However, no previous research has focused on the evaluation of mixed emotions in schizophrenia, although they are frequently expressed in everyday situations and negative emotions are often conveyed by mixed facial expressions. Altogether, 37 subjects (19 patients with schizophrenia and 18 healthy controls) were enrolled in the study. The two study groups did not differ in age or education. The stimulus set consisted of 10 fearful (100%), 10 happy (100%), 10 mixed fear (70% fear and 30% happy), and 10 mixed happy facial expressions. During the fMRI acquisition, pictures were presented in randomized order and subjects had to categorize the expressions by button press. Decreased activation was found in the patient group during fear, mixed fear, and mixed happy processing in the right ventrolateral prefrontal cortex (VLPFC) and the right anterior insula (RAI) at voxel and cluster level after familywise error correction. No difference was found between the study groups in activations to the happy facial condition. Unlike controls, patients with schizophrenia did not show differential activation between mixed happy and happy facial expressions in the right dorsolateral prefrontal cortex (DLPFC). Patients with schizophrenia thus showed decreased functioning in right prefrontal regions responsible for salience signaling and valence evaluation during emotion recognition. Our results indicate that fear and mixed happy/fear processing are impaired in schizophrenia, while happy facial expression processing is relatively intact.

  38. Dynamics of processing invisible faces in the brain: automatic neural encoding of facial expression information.

    PubMed

    Jiang, Yi; Shannon, Robert W; Vizueta, Nathalie; Bernat, Edward M; Patrick, Christopher J; He, Sheng

    2009-02-01

    The fusiform face area (FFA) and the superior temporal sulcus (STS) are suggested to process facial identity and facial expression information respectively. We recently demonstrated a functional dissociation between the FFA and the STS as well as correlated sensitivity of the STS and the amygdala to facial expressions using an interocular suppression paradigm [Jiang, Y., He, S., 2006. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023-2029.]. In the current event-related brain potential (ERP) study, we investigated the temporal dynamics of facial information processing. Observers viewed neutral, fearful, and scrambled face stimuli, either visibly or rendered invisible through interocular suppression. Relative to scrambled face stimuli, intact visible faces elicited larger positive P1 (110-130 ms) and larger negative N1 or N170 (160-180 ms) potentials at posterior occipital and bilateral occipito-temporal regions respectively, with the N170 amplitude significantly greater for fearful than neutral faces. Invisible intact faces generated a stronger signal than scrambled faces at 140-200 ms over posterior occipital areas whereas invisible fearful faces (compared to neutral and scrambled faces) elicited a significantly larger negative deflection starting at 220 ms along the STS. These results provide further evidence for cortical processing of facial information without awareness and elucidate the temporal sequence of automatic facial expression information extraction.

  19. Anatomically constrained neural network models for the categorization of facial expression

    NASA Astrophysics Data System (ADS)

    McMenamin, Brenton W.; Assadi, Amir H.

    2004-12-01

    In humans, the recognition of facial expressions relies on the amygdala, which uses parallel processing streams to identify expressions quickly and accurately; a feedback mechanism may play a role in this process as well. Implementing a model with a similar parallel structure and feedback mechanisms could improve current facial recognition algorithms, for which varied expressions are a source of error. An anatomically constrained artificial neural-network model was created that uses this parallel processing architecture and feedback to categorize facial expressions. The presence of a feedback mechanism was not found to significantly improve performance for models with a parallel architecture. However, the use of parallel processing streams significantly improved accuracy over a similar network without a parallel architecture. Further investigation is necessary to determine the benefits of using parallel streams and feedback mechanisms in more advanced object recognition tasks.
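
    To make the described architecture concrete, the following is a minimal Python/NumPy sketch of a classifier with two parallel processing streams and an optional feedback pass. All layer sizes, the random inputs, and the single feedback iteration are hypothetical illustrations, not details taken from the study.

        import numpy as np

        rng = np.random.default_rng(0)

        def relu(x):
            return np.maximum(0.0, x)

        def softmax(z):
            e = np.exp(z - z.max(axis=1, keepdims=True))
            return e / e.sum(axis=1, keepdims=True)

        n_features, n_hidden, n_classes = 64, 16, 6  # hypothetical sizes

        # Two parallel streams, each with its own weights (cf. parallel processing streams).
        W1a = rng.normal(scale=0.1, size=(n_features, n_hidden))
        W1b = rng.normal(scale=0.1, size=(n_features, n_hidden))
        # A merge layer maps the concatenated streams onto expression categories.
        W2 = rng.normal(scale=0.1, size=(2 * n_hidden, n_classes))
        # Optional feedback projects the output back onto the input space.
        W_fb = rng.normal(scale=0.1, size=(n_classes, n_features))

        def forward(x, feedback=False):
            """One pass through both streams; optionally a second pass with feedback."""
            h = np.concatenate([relu(x @ W1a), relu(x @ W1b)], axis=1)
            y = softmax(h @ W2)
            if feedback:
                # Second pass: the first-pass output modulates the input (cf. feedback).
                x2 = x + y @ W_fb
                h2 = np.concatenate([relu(x2 @ W1a), relu(x2 @ W1b)], axis=1)
                y = softmax(h2 @ W2)
            return y

        x = rng.normal(size=(4, n_features))       # four hypothetical face feature vectors
        print(forward(x, feedback=True).round(3))  # class probabilities per face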

  1. Contributions of feature shapes and surface cues to the recognition of facial expressions.

    PubMed

    Sormaz, Mladen; Young, Andrew W; Andrews, Timothy J

    2016-10-01

    Theoretical accounts of face processing often emphasise feature shapes as the primary visual cue to the recognition of facial expressions. However, changes in facial expression also affect the surface properties of the face. In this study, we investigated whether this surface information can also be used in the recognition of facial expression. First, participants identified facial expressions (fear, anger, disgust, sadness, happiness) from images that were manipulated such that they varied mainly in shape or mainly in surface properties. We found that the categorization of facial expression is possible in either type of image, but that different expressions are relatively dependent on surface or shape properties. Next, we investigated the relative contributions of shape and surface information to the categorization of facial expressions. This employed a complementary method that involved combining the surface properties of one expression with the shape properties from a different expression. Our results showed that the categorization of facial expressions in these hybrid images was equally dependent on the surface and shape properties of the image. Together, these findings provide a direct demonstration that both feature shape and surface information make significant contributions to the recognition of facial expressions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Exploring the Role of Spatial Frequency Information during Neural Emotion Processing in Human Infants.

    PubMed

    Jessen, Sarah; Grossmann, Tobias

    2017-01-01

    Enhanced attention to fear expressions in adults is driven primarily by information from low as opposed to high spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants (N = 26). Our results revealed that infants' brains discriminated between emotional facial expressions containing high but not low spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and happy and fearful faces containing low spatial frequencies. Our results demonstrate that spatial frequency content influences the processing of facial emotions already in infancy. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent upon frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.
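
    As an illustration of how such filtered face stimuli can be constructed, the sketch below splits an image into its low and high spatial frequency content with a Gaussian filter. The cutoff (the sigma value) and the random placeholder image are assumptions made for illustration; the abstract does not specify the filtering parameters used.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        face = rng.random((128, 128))  # placeholder for a grayscale face image

        sigma = 4.0  # hypothetical cutoff; a larger sigma keeps only coarser structure

        # Low spatial frequencies: blur the image, discarding fine detail.
        low_sf = gaussian_filter(face, sigma=sigma)
        # High spatial frequencies: the fine detail that remains after removing the blur.
        high_sf = face - low_sf

        # The two bands sum back to the original image exactly.
        assert np.allclose(face, low_sf + high_sf)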

  3. Multivariate Pattern Classification of Facial Expressions Based on Large-Scale Functional Connectivity.

    PubMed

    Liang, Yin; Liu, Baolin; Li, Xianglin; Wang, Peiyuan

    2018-01-01

    How human beings achieve efficient recognition of others' facial expressions is an important question in cognitive neuroscience, and previous studies have identified specific cortical regions that show preferential activation to facial expressions. However, the potential contributions of connectivity patterns to the processing of facial expressions have remained unclear. The present functional magnetic resonance imaging (fMRI) study explored whether facial expressions could be decoded from functional connectivity (FC) patterns using multivariate pattern analysis combined with machine learning algorithms (fcMVPA). We employed a block design experiment and collected neural activity while participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise). Both static and dynamic expression stimuli were included in our study. A behavioral experiment after scanning confirmed the validity of the facial stimuli presented during the fMRI experiment in terms of classification accuracies and emotional intensities. We obtained whole-brain FC patterns for each facial expression and found that both static and dynamic facial expressions could be successfully decoded from the FC patterns. Moreover, we identified the expression-discriminative networks for the static and dynamic facial expressions, which span beyond the conventional face-selective areas. Overall, these results reveal that large-scale FC patterns contain rich expression information and can be used to accurately decode facial expressions, suggesting a novel mechanism in which general interactions between distributed brain regions contribute to human facial expression recognition.
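
    A minimal sketch of the fcMVPA logic, assuming simulated data: each stimulus block is summarized by the upper triangle of an ROI-by-ROI correlation matrix, and these functional connectivity patterns are fed to a cross-validated classifier. The ROI count, block counts, and the linear support vector machine are illustrative assumptions, not the authors' exact pipeline.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_rois, n_timepoints = 30, 120          # hypothetical parcellation and block length
        expressions = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

        def fc_pattern(ts):
            """Vectorize a connectivity matrix: upper triangle of ROI-ROI correlations."""
            corr = np.corrcoef(ts.T)            # ts has shape (timepoints, ROIs)
            iu = np.triu_indices(n_rois, k=1)
            return corr[iu]

        # Simulate a few blocks per expression (a stand-in for preprocessed fMRI data).
        X, y = [], []
        for label, expr in enumerate(expressions):
            for _ in range(10):
                ts = rng.normal(size=(n_timepoints, n_rois))
                X.append(fc_pattern(ts))
                y.append(label)
        X, y = np.array(X), np.array(y)

        # Cross-validated decoding of expression category from FC patterns.
        scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
        print(f"mean decoding accuracy: {scores.mean():.2f} (chance = {1 / len(expressions):.2f})")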

  4. Multivariate Pattern Classification of Facial Expressions Based on Large-Scale Functional Connectivity

    PubMed Central

    Liang, Yin; Liu, Baolin; Li, Xianglin; Wang, Peiyuan

    2018-01-01

    How human beings achieve efficient recognition of others' facial expressions is an important question in cognitive neuroscience, and previous studies have identified specific cortical regions that show preferential activation to facial expressions. However, the potential contributions of connectivity patterns to the processing of facial expressions have remained unclear. The present functional magnetic resonance imaging (fMRI) study explored whether facial expressions could be decoded from functional connectivity (FC) patterns using multivariate pattern analysis combined with machine learning algorithms (fcMVPA). We employed a block design experiment and collected neural activity while participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise). Both static and dynamic expression stimuli were included in our study. A behavioral experiment after scanning confirmed the validity of the facial stimuli presented during the fMRI experiment in terms of classification accuracies and emotional intensities. We obtained whole-brain FC patterns for each facial expression and found that both static and dynamic facial expressions could be successfully decoded from the FC patterns. Moreover, we identified the expression-discriminative networks for the static and dynamic facial expressions, which span beyond the conventional face-selective areas. Overall, these results reveal that large-scale FC patterns contain rich expression information and can be used to accurately decode facial expressions, suggesting a novel mechanism in which general interactions between distributed brain regions contribute to human facial expression recognition. PMID:29615882

  5. Impaired recognition of happy facial expressions in bipolar disorder.

    PubMed

    Lawlor-Savage, Linette; Sponheim, Scott R; Goghari, Vina M

    2014-08-01

    The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive or emotion-specific impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control. Bipolar patients' overall facial recognition ability was unimpaired; however, their specific ability to judge happy expressions under time constraints was impaired. The findings suggest a deficit in the recognition of happy expressions that is affected by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.

  6. Are Happy Faces Attractive? The Roles of Early vs. Late Processing

    PubMed Central

    Sun, Delin; Chan, Chetwyn C. H.; Fan, Jintu; Wu, Yi; Lee, Tatia M. C.

    2015-01-01

    Facial attractiveness is closely related to romantic love. To understand whether perceived facial attractiveness and facial expression rely on similar neural underpinnings, we recorded neural signals using an event-related potential (ERP) methodology from 20 participants who viewed faces varying in attractiveness and expression. We found that attractiveness and expression were reflected by two early components, P2-lateral (P2l) and P2-medial (P2m), respectively; their interaction effect was reflected by the LPP, a late component. The findings suggest that facial attractiveness and expression are first processed in parallel for discrimination between stimuli. After this initial processing, more attentional resources are allocated to the faces with the most positive or most negative valence in both the attractiveness and expression dimensions. The findings contribute to the theoretical model of face perception. PMID:26648885

  7. Altered saccadic targets when processing facial expressions under different attentional and stimulus conditions.

    PubMed

    Boutsen, Frank A; Dvorak, Justin D; Pulusu, Vinay K; Ross, Elliott D

    2017-04-01

    Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on the upper and lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased toward the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face, but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence, other perceptual factors must be at play, such as the phenomenon of virtual scanning. Published by Elsevier Ltd.

  8. Functionally dissociated aspects in anterior and posterior electrocortical processing of facial threat.

    PubMed

    Schutter, Dennis J L G; de Haan, Edward H F; van Honk, Jack

    2004-06-01

    The angry facial expression is an important socially threatening stimulus argued to have evolved to regulate social hierarchies. In the present study, event-related potentials (ERPs) were used to investigate the involvement and temporal dynamics of the frontal and parietal regions in the processing of angry facial expressions. Angry, happy and neutral faces were shown to eighteen healthy right-handed volunteers in a passive viewing task. Stimulus-locked ERPs were recorded from frontal and parietal scalp sites. The P200, N300 and early contingent negativity variation (eCNV) components of the electric brain potentials were investigated. Analyses revealed statistically significant reductions in P200 amplitude for the angry facial expression at both frontal and parietal electrode sites. Furthermore, apart from being strongly associated with the anterior P200, the N300 was also more negative for the angry facial expression in the anterior regions. Finally, the eCNV was more pronounced over the parietal sites for the angry facial expressions. The present study demonstrated specific electrocortical correlates underlying the processing of angry facial expressions in the anterior and posterior brain sectors. The P200 is argued to index valence tagging by a fast and early detection mechanism. The lowered N300 with an anterior distribution for the angry facial expressions indicates more elaborate evaluation of stimulus relevance. The fact that the P200 and the N300 are highly correlated suggests that they reflect different stages of the same anterior evaluation mechanism. The more pronounced posterior eCNV suggests sustained attention to socially threatening information. Copyright 2004 Elsevier B.V.

  9. The Automaticity of Emotional Face-Context Integration

    PubMed Central

    Aviezer, Hillel; Dudarev, Veronica; Bentin, Shlomo; Hassin, Ran R.

    2011-01-01

    Recent studies have demonstrated that context can dramatically influence the recognition of basic facial expressions, yet the nature of this phenomenon is largely unknown. In the present paper we begin to characterize the underlying process of face-context integration. Specifically, we examine whether it is a relatively controlled or automatic process. In Experiment 1 participants were motivated and instructed to avoid using the context while categorizing contextualized facial expression, or they were led to believe that the context was irrelevant. Nevertheless, they were unable to disregard the context, which exerted a strong effect on their emotion recognition. In Experiment 2, participants categorized contextualized facial expressions while engaged in a concurrent working memory task. Despite the load, the context exerted a strong influence on their recognition of facial expressions. These results suggest that facial expressions and their body contexts are integrated in an unintentional, uncontrollable, and relatively effortless manner. PMID:21707150

  10. Does Gaze Direction Modulate Facial Expression Processing in Children with Autism Spectrum Disorder?

    ERIC Educational Resources Information Center

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent…

  11. Brief report: Representational momentum for dynamic facial expressions in pervasive developmental disorder.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2010-03-01

    Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of expressed emotion in 13 individuals with PDD and 13 typically developing controls. We presented dynamic and static emotional (fearful and happy) expressions. Participants were asked to match a changeable emotional face display with the last presented image. The results showed that both groups perceived the last image of dynamic facial expression to be more emotionally exaggerated than the static facial expression. This finding suggests that individuals with PDD have an intact perceptual mechanism for processing dynamic information in another individual's face.

  12. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    PubMed

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  13. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load

    PubMed Central

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925

  14. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    PubMed

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, intense peak expressions of athletes at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated-body and face-body congruent images were better recognized than isolated-face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. The results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicates unconscious perception of peak facial expressions.

  15. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    PubMed Central

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of the transient, intense peak expressions of athletes at the winning or losing point of a competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated-body and face-body congruent images were better recognized than isolated-face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, the eye movement records showed that participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, a subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. The results of Experiment 2B showed that reaction times to both winning and losing body targets were influenced by the invisible peak facial expression primes, which indicates unconscious perception of peak facial expressions. PMID:27630604

  16. Contemporary solutions for the treatment of facial nerve paralysis.

    PubMed

    Garcia, Ryan M; Hadlock, Tessa A; Klebuc, Michael J; Simpson, Roger L; Zenn, Michael R; Marcus, Jeffrey R

    2015-06-01

    After reviewing this article, the participant should be able to: 1. Understand the most modern indications and technique for neurotization, including masseter-to-facial nerve transfer (fifth-to-seventh cranial nerve transfer). 2. Contrast the advantages and limitations associated with contiguous muscle transfers and free-muscle transfers for facial reanimation. 3. Understand the indications for two-stage and one-stage free gracilis muscle transfer for facial reanimation. 4. Apply nonsurgical adjuvant treatments for acute facial nerve paralysis. Facial expression is a complex neuromotor and psychomotor process that is disrupted in patients with facial paralysis, breaking the link between emotion and physical expression. Contemporary reconstructive options are being implemented in patients with facial paralysis. While static procedures provide facial symmetry at rest, true 'facial reanimation' requires restoration of facial movement. Contemporary treatment options include neurotization procedures (in which a new motor nerve is used to restore innervation to a viable muscle), contiguous regional muscle transfer (most commonly temporalis muscle transfer), microsurgical free muscle transfer, and nonsurgical adjuvants used to balance facial symmetry. Each approach has advantages and disadvantages, along with ongoing controversies, and should be individualized for each patient. Treatments for patients with facial paralysis continue to evolve in order to restore the complex psychomotor process of facial expression.

  17. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity – Evidence from Gazing Patterns

    PubMed Central

    Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V.; Hänninen, Laura; Krause, Christina M.; Vainio, Outi

    2016-01-01

    Appropriate response to companions' emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore whether domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether the observed emotional expression affects dogs' gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs' gaze fixations spread systematically among the facial features. The distribution of fixations was altered by the viewed expression, but the eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by the eyes, midface and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias that depended on the depicted species: threatening conspecifics' faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates. PMID:26761433

  18. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    PubMed Central

    Balconi, Michela; Ferrari, Chiara

    2012-01-01

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, yet it remains unclear whether a specific function can be assigned to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate alpha power. Moreover, increased right frontal activity was found for negative emotions vs. an increased left frontal response for the positive emotion. The significance of facial expressions was adduced to elucidate the different cortical responses to emotion types. PMID:24962767

  19. In the face of emotions: event-related potentials in supraliminal and subliminal facial expression recognition.

    PubMed

    Balconi, Michela; Lucchiari, Claudio

    2005-02-01

    Is facial expression recognition marked by specific event-related potential (ERP) effects? Are conscious and unconscious elaborations of emotional facial stimuli qualitatively different processes? In Experiment 1, ERPs elicited by supraliminal stimuli were recorded while 21 participants viewed emotional facial expressions of four emotions and a neutral stimulus. Two ERP components (N2 and P3) were analyzed for their peak amplitude and latency measures. First, emotional face-specificity was observed for the negative deflection N2, whereas P3 was not affected by the content of the stimulus (emotional or neutral). A more posterior distribution of ERPs was found for N2. Moreover, a lateralization effect was revealed for negative (right lateralization) and positive (left lateralization) facial expressions. In Experiment 2 (20 participants), 1-ms subliminal stimulation was carried out. Unaware information processing proved to be quite similar to aware information processing with respect to peak amplitude but not latency; in fact, unconscious stimulation produced a more delayed peak variation than conscious stimulation.

  20. Subliminal and supraliminal processing of facial expression of emotions: brain oscillation in the left/right frontal area.

    PubMed

    Balconi, Michela; Ferrari, Chiara

    2012-03-26

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, yet it remains unclear whether a specific function can be assigned to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate alpha power. Moreover, increased right frontal activity was found for negative emotions vs. an increased left frontal response for the positive emotion. The significance of facial expressions was adduced to elucidate the different cortical responses to emotion types.

  1. Real-time face and gesture analysis for human-robot interaction

    NASA Astrophysics Data System (ADS)

    Wallhoff, Frank; Rehrl, Tobias; Mayer, Christoph; Radig, Bernd

    2010-05-01

    Human communication relies on a large number of different communication mechanisms like spoken language, facial expressions, or gestures. Facial expressions and gestures are among the main nonverbal communication mechanisms and pass large amounts of information between human dialog partners. Therefore, to allow for intuitive human-machine interaction, real-time capable processing and recognition of facial expressions and hand and head gestures are of great importance. We present a system that tackles these challenges. The input features for the dynamic head gestures and facial expressions are obtained from a sophisticated three-dimensional model, which is fitted to the user in a real-time capable manner. Applying this model, different kinds of information are extracted from the image data and afterwards handed over to a real-time capable data-transferring framework, the so-called Real-Time DataBase (RTDB). In addition to the head- and facial-related features, low-level image features regarding the human hand (optical flow, Hu moments) are also stored in the RTDB for the evaluation of hand gestures. In general, the input of a single camera is sufficient for the parallel evaluation of the different gestures and facial expressions. The real-time capable recognition of the dynamic hand and head gestures is performed via different Hidden Markov Models, which have proven to be a quick and real-time capable classification method. For the facial expressions, on the other hand, classical decision trees or more sophisticated support vector machines are used for the classification process. The results of these classification processes are again handed over to the RTDB, where other processes (like a Dialog Management Unit) can easily access them without any blocking effects. In addition, an adjustable amount of history can be stored by the RTDB buffer unit.
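
    As a sketch of the gesture-recognition step, the code below trains one Gaussian hidden Markov model per gesture class on feature sequences and classifies a new sequence by maximum log-likelihood. The hmmlearn library, the simulated four-dimensional motion features, and all sizes are illustrative assumptions; the abstract does not specify the system's models at this level of detail.

        import numpy as np
        from hmmlearn import hmm

        rng = np.random.default_rng(0)
        gestures = ["nod", "shake", "wave"]  # hypothetical gesture classes

        def simulate_sequences(offset, n_seq=20, seq_len=30, n_feat=4):
            """Stand-in for per-frame motion features (e.g., optical flow, Hu moments)."""
            return [rng.normal(loc=offset, size=(seq_len, n_feat)) for _ in range(n_seq)]

        # Train one HMM per gesture on that gesture's training sequences.
        models = {}
        for i, name in enumerate(gestures):
            seqs = simulate_sequences(offset=i)
            X = np.concatenate(seqs)
            lengths = [len(s) for s in seqs]
            m = hmm.GaussianHMM(n_components=3, covariance_type="diag", random_state=0)
            m.fit(X, lengths)
            models[name] = m

        # Classify an unseen sequence with the model that assigns the highest log-likelihood.
        test_seq = rng.normal(loc=2, size=(30, 4))  # should look most like "wave"
        scores = {name: m.score(test_seq) for name, m in models.items()}
        print(max(scores, key=scores.get))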

  2. People with chronic facial pain perform worse than controls at a facial emotion recognition task, but it is not all about the emotion.

    PubMed

    von Piekartz, H; Wallwork, S B; Mohr, G; Butler, D S; Moseley, G L

    2015-04-01

    Alexithymia, or a lack of emotional awareness, is prevalent in some chronic pain conditions and has been linked to poor recognition of others' emotions. Recognising others' emotions from their facial expression involves both emotional and motor processing, but the possible contribution of motor disruption has not been considered. It is possible that poor performance on emotional recognition tasks could reflect problems with emotional processing, motor processing or both. We hypothesised that people with chronic facial pain would be less accurate in recognising others' emotions from facial expressions, would be less accurate in a motor imagery task involving the face, and that performance on both tasks would be positively related. A convenience sample of 19 people (15 females) with chronic facial pain and 19 gender-matched controls participated. They undertook two tasks; in the first task, they identified the facial emotion presented in a photograph. In the second, they identified whether the person in the image had a facial feature pointed towards their left or right side, a well-recognised paradigm to induce implicit motor imagery. People with chronic facial pain performed worse than controls at both tasks (Facially Expressed Emotion Labelling (FEEL) task P < 0·001; left/right judgment task P < 0·001). Participants who were more accurate at one task were also more accurate at the other, regardless of group (P < 0·001, r² = 0·523). Participants with chronic facial pain were worse than controls at both the FEEL emotion recognition task and the left/right facial expression task and performance covaried within participants. We propose that disrupted motor processing may underpin or at least contribute to the difficulty that facial pain patients have in emotion recognition and that further research that tests this proposal is warranted. © 2014 John Wiley & Sons Ltd.

  3. Individual differences in the recognition of facial expressions: an event-related potentials study.

    PubMed

    Tamamiya, Yoshiyuki; Hiraki, Kazuo

    2013-01-01

    Previous studies have shown that early posterior components of event-related potentials (ERPs) are modulated by facial expressions. The goal of the current study was to investigate individual differences in the recognition of facial expressions by examining the relationship between ERP components and the discrimination of facial expressions. Pictures of three facial expressions (angry, happy, and neutral) were presented to 36 young adults during ERP recording. Participants were asked to respond with a button press as soon as they recognized the expression depicted. A multiple regression analysis, with ERP components as predictor variables, assessed hits and reaction times to the facial expressions as dependent variables. The N170 amplitudes significantly predicted accuracy for angry and happy expressions, and the N170 latencies were predictive of accuracy for neutral expressions. The P2 amplitudes significantly predicted reaction time. The P2 latencies significantly predicted reaction times only for neutral faces. These results suggest that individual differences in the recognition of facial expressions emerge from early components in visual processing.
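
    For concreteness, the sketch below extracts a peak amplitude and latency from an N170-like time window and regresses recognition accuracy on those measures, mirroring the style of analysis described above. The window boundaries, sampling rate, and simulated data are all hypothetical.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        sfreq = 500                              # hypothetical sampling rate (Hz)
        times = np.arange(-0.1, 0.5, 1 / sfreq)  # epoch from -100 to 500 ms
        n_subjects = 36

        def peak_in_window(erp, tmin=0.13, tmax=0.20):
            """Most negative deflection in the window: peak amplitude and latency."""
            mask = (times >= tmin) & (times <= tmax)
            idx = np.argmin(erp[mask])
            return erp[mask][idx], times[mask][idx]

        # Simulated per-subject ERPs and accuracy scores (placeholders for real data).
        amps, lats, accuracy = [], [], []
        for _ in range(n_subjects):
            erp = rng.normal(scale=0.5, size=times.size)
            erp[(times > 0.14) & (times < 0.19)] -= rng.uniform(2, 6)  # N170-like dip
            a, l = peak_in_window(erp)
            amps.append(a)
            lats.append(l)
            accuracy.append(0.8 - 0.02 * a + rng.normal(scale=0.05))  # toy relationship

        # Regress recognition accuracy on the ERP measures (cf. the multiple regression).
        X = np.column_stack([amps, lats])
        model = LinearRegression().fit(X, accuracy)
        print("coefficients (amplitude, latency):", model.coef_.round(3))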

  4. Identifying differences in biased affective information processing in major depression.

    PubMed

    Gollan, Jackie K; Pane, Heather T; McCloskey, Michael S; Coccaro, Emil F

    2008-05-30

    This study investigates the extent to which participants with major depression differ from healthy comparison participants in irregularities of affective information processing, characterized by deficits in facial expression recognition, intensity categorization, and reaction time to identifying emotionally salient and neutral information. Data on diagnoses, symptom severity, and affective information processing using a facial recognition task were collected from 66 participants, male and female, between the ages of 18 and 54 years, grouped by major depressive disorder (N=37) or healthy non-psychiatric (N=29) status. Findings from MANCOVAs revealed that major depression was associated with significantly longer reaction times to sad facial expressions compared with healthy status. Depressed participants also demonstrated a negative bias, interpreting neutral facial expressions as sad significantly more often than healthy participants; in turn, healthy participants interpreted neutral faces as happy significantly more often than depressed participants. No group differences were observed for facial expression recognition or intensity categorization. The observed effects suggest that depression significantly affects the perception of the intensity of negative affective stimuli, delays the processing of sad affective information, and biases interpretation of neutral faces toward sadness.

  5. Serotonin and the neural processing of facial emotions in adults with autism: an fMRI study using acute tryptophan depletion.

    PubMed

    Daly, Eileen M; Deeley, Quinton; Ecker, Christine; Craig, Michael; Hallahan, Brian; Murphy, Clodagh; Johnston, Patrick; Spain, Debbie; Gillan, Nicola; Brammer, Michael; Giampietro, Vincent; Lamar, Melissa; Page, Lisa; Toal, Fiona; Cleare, Anthony; Surguladze, Simon; Murphy, Declan G M

    2012-10-01

    People with autism spectrum disorders (ASDs) have lifelong deficits in social behavior and differences in behavioral as well as neural responses to facial expressions of emotion. The biological basis of this is incompletely understood, but it may include differences in the role of neurotransmitters such as serotonin, which modulate facial emotion processing in health. While some individuals with ASD have significant differences in the serotonin system, to our knowledge, no one has investigated its role during facial emotion processing in adults with ASD and control subjects using acute tryptophan depletion (ATD) and functional magnetic resonance imaging. Our objective was to compare the effects of ATD on brain responses to primary facial expressions of emotion in men with ASD and healthy control subjects. We conducted a double-blind, placebo-controlled, crossover trial of ATD with functional magnetic resonance imaging to measure brain activity during incidental processing of disgust, fearful, happy, and sad facial expressions, at the Institute of Psychiatry, King's College London, and the South London and Maudsley National Health Service Foundation Trust, England. Participants were 14 men of normal intelligence with autism and 14 control subjects who did not significantly differ in sex, age, or overall intelligence; the main outcome measure was the blood oxygenation level-dependent response to facial expressions of emotion. Brain activation was differentially modulated by ATD depending on diagnostic group and emotion type within regions of the social brain network. For example, processing of disgust faces was associated with interactions in the medial frontal and lingual gyri, whereas processing of happy faces was associated with interactions in the middle frontal gyrus and putamen. Modulation of the processing of facial expressions of emotion by serotonin thus differs significantly in people with ASD compared with control subjects; the differences vary with emotion type and occur in social brain regions that have been associated with group differences in serotonin synthesis and receptor or transporter density.

  6. Regional Brain Responses Are Biased Toward Infant Facial Expressions Compared to Adult Facial Expressions in Nulliparous Women.

    PubMed

    Li, Bingbing; Cheng, Gang; Zhang, Dajun; Wei, Dongtao; Qiao, Lei; Wang, Xiangpeng; Che, Xianwei

    2016-01-01

    Recent neuroimaging studies suggest that neutral infant faces compared to neutral adult faces elicit greater activity in brain areas associated with face processing, attention, empathic response, reward, and movement. However, whether infant facial expressions evoke larger brain responses than adult facial expressions remains unclear. Here, we performed event-related functional magnetic resonance imaging in nulliparous women while they were presented with images of matched unfamiliar infant and adult facial expressions (happy, neutral, and uncomfortable/sad) in a pseudo-randomized order. We found that the bilateral fusiform and right lingual gyrus were overall more activated during the presentation of infant facial expressions compared to adult facial expressions. Uncomfortable infant faces compared to sad adult faces evoked greater activation in the bilateral fusiform gyrus, precentral gyrus, postcentral gyrus, posterior cingulate cortex-thalamus, and precuneus. Neutral infant faces activated larger brain responses in the left fusiform gyrus compared to neutral adult faces. Happy infant faces compared to happy adult faces elicited larger responses in areas of the brain associated with emotion and reward processing using a more liberal threshold of p < 0.005 uncorrected. Furthermore, the level of the test subjects' Interest-In-Infants was positively associated with the intensity of right fusiform gyrus response to infant faces and uncomfortable infant faces compared to sad adult faces. In addition, the Perspective Taking subscale score on the Interpersonal Reactivity Index-Chinese was significantly correlated with precuneus activity during uncomfortable infant faces compared to sad adult faces. Our findings suggest that regional brain areas may bias cognitive and emotional responses to infant facial expressions compared to adult facial expressions among nulliparous women, and this bias may be modulated by individual differences in Interest-In-Infants and perspective taking ability.

  7. Regional Brain Responses Are Biased Toward Infant Facial Expressions Compared to Adult Facial Expressions in Nulliparous Women

    PubMed Central

    Li, Bingbing; Cheng, Gang; Zhang, Dajun; Wei, Dongtao; Qiao, Lei; Wang, Xiangpeng; Che, Xianwei

    2016-01-01

    Recent neuroimaging studies suggest that neutral infant faces compared to neutral adult faces elicit greater activity in brain areas associated with face processing, attention, empathic response, reward, and movement. However, whether infant facial expressions evoke larger brain responses than adult facial expressions remains unclear. Here, we performed event-related functional magnetic resonance imaging in nulliparous women while they were presented with images of matched unfamiliar infant and adult facial expressions (happy, neutral, and uncomfortable/sad) in a pseudo-randomized order. We found that the bilateral fusiform and right lingual gyrus were overall more activated during the presentation of infant facial expressions compared to adult facial expressions. Uncomfortable infant faces compared to sad adult faces evoked greater activation in the bilateral fusiform gyrus, precentral gyrus, postcentral gyrus, posterior cingulate cortex-thalamus, and precuneus. Neutral infant faces activated larger brain responses in the left fusiform gyrus compared to neutral adult faces. Happy infant faces compared to happy adult faces elicited larger responses in areas of the brain associated with emotion and reward processing using a more liberal threshold of p < 0.005 uncorrected. Furthermore, the level of the test subjects’ Interest-In-Infants was positively associated with the intensity of right fusiform gyrus response to infant faces and uncomfortable infant faces compared to sad adult faces. In addition, the Perspective Taking subscale score on the Interpersonal Reactivity Index-Chinese was significantly correlated with precuneus activity during uncomfortable infant faces compared to sad adult faces. Our findings suggest that regional brain areas may bias cognitive and emotional responses to infant facial expressions compared to adult facial expressions among nulliparous women, and this bias may be modulated by individual differences in Interest-In-Infants and perspective taking ability. PMID:27977692

  8. Decoding facial blends of emotion: visual field, attentional and hemispheric biases.

    PubMed

    Ross, Elliott D; Shayya, Luay; Champlain, Amanda; Monnot, Marilee; Prodan, Calin I

    2013-12-01

    Most clinical research assumes that modulation of facial expressions is lateralized predominantly across the right-left hemiface. However, social psychological research suggests that facial expressions are organized predominantly across the upper-lower face. Because humans learn to cognitively control facial expression for social purposes, the lower face may display a false emotion, typically a smile, to enable approach behavior. In contrast, the upper face may leak a person's true feeling state by producing a brief facial blend of emotion, i.e. a different emotion on the upper versus lower face. Previous studies from our laboratory have shown that upper facial emotions are processed preferentially by the right hemisphere under conditions of directed attention if facial blends of emotion are presented tachistoscopically to the mid left and right visual fields. This paper explores how facial blends are processed within the four visual quadrants. The results, combined with our previous research, demonstrate that lower more so than upper facial emotions are perceived best when presented to the viewer's left and right visual fields just above the horizontal axis. Upper facial emotions are perceived best when presented to the viewer's left visual field just above the horizontal axis under conditions of directed attention. Thus, by gazing at a person's left ear, which also avoids the social stigma of eye-to-eye contact, one's ability to decode facial expressions should be enhanced. Published by Elsevier Inc.

  9. A computational model of the development of separate representations of facial identity and expression in the primate visual system.

    PubMed

    Tromans, James Matthew; Harris, Mitchell; Stringer, Simon Maitland

    2011-01-01

    Experimental studies have provided evidence that the visual processing areas of the primate brain represent facial identity and facial expression within different subpopulations of neurons. For example, in non-human primates there is evidence that cells within the inferior temporal gyrus (TE) respond primarily to facial identity, while cells within the superior temporal sulcus (STS) respond to facial expression. More recently, it has been found that the orbitofrontal cortex (OFC) of non-human primates contains some cells that respond exclusively to changes in facial identity, while other cells respond exclusively to facial expression. How might the primate visual system develop physically separate representations of facial identity and expression given that the visual system is always exposed to simultaneous combinations of facial identity and expression during learning? In this paper, a biologically plausible neural network model of the ventral visual pathway, VisNet, is trained on a set of carefully designed cartoon faces with different identities and expressions. The VisNet model architecture is composed of a hierarchical series of four Self-Organising Maps (SOMs), with associative learning in the feedforward synaptic connections between successive layers. During learning, the network develops separate clusters of cells that respond exclusively to either facial identity or facial expression. We interpret the performance of the network in terms of the learning properties of SOMs, which are able to exploit the statistical independence between facial identity and expression.
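
    To give a flavor of the self-organising map layers that the model stacks into a hierarchy, the sketch below trains a single small SOM with the classic best-matching-unit and neighborhood update rule. The map size, input dimensionality, and random training data are hypothetical, and VisNet's associative learning between successive layers is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        grid, dim = 8, 20                        # an 8x8 map of units, 20-dimensional inputs
        weights = rng.random((grid, grid, dim))  # one weight vector per map unit
        coords = np.dstack(np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"))

        def train_step(x, lr, radius):
            """Move the best-matching unit and its neighbors toward the input."""
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighborhood centered on the best-matching unit.
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=2)
            influence = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
            weights[...] = weights + lr * influence[..., None] * (x - weights)

        # Train on random inputs with a decaying learning rate and neighborhood radius.
        for t in range(2000):
            frac = t / 2000
            train_step(rng.random(dim), lr=0.5 * (1 - frac), radius=4.0 * (1 - frac) + 0.5)

        print("trained weight range:", weights.min().round(2), weights.max().round(2))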

  10. Cognitive penetrability and emotion recognition in human facial expressions

    PubMed Central

    Marchi, Francesco

    2015-01-01

    Do our background beliefs, desires, and mental images influence our perceptual experience of the emotions of others? In this paper, we will address the possibility of cognitive penetration (CP) of perceptual experience in the domain of social cognition. In particular, we focus on emotion recognition based on the visual experience of facial expressions. After introducing the current debate on CP, we review examples of perceptual adaptation for facial expressions of emotion. This evidence supports the idea that facial expressions are perceptually processed as wholes. That is, the perceptual system integrates lower-level facial features, such as eyebrow orientation, mouth angle etc., into facial compounds. We then present additional experimental evidence showing that in some cases, emotion recognition on the basis of facial expression is sensitive to and modified by the background knowledge of the subject. We argue that such sensitivity is best explained as a difference in the visual experience of the facial expression, not just as a modification of the judgment based on this experience. The difference in experience is characterized as the result of the interference of background knowledge with the perceptual integration process for faces. Thus, according to the best explanation, we have to accept CP in some cases of emotion recognition. Finally, we discuss a recently proposed mechanism for CP in the face-based recognition of emotion. PMID:26150796

  11. Emotion unfolded by motion: a role for parietal lobe in decoding dynamic facial expressions.

    PubMed

    Sarkheil, Pegah; Goebel, Rainer; Schneider, Frank; Mathiak, Klaus

    2013-12-01

    Facial expressions convey important emotional and social information and are frequently applied in investigations of human affective processing. Dynamic faces may provide higher ecological validity for examining the perceptual and cognitive processing of facial expressions. Higher-order processing of emotional faces was addressed by varying the task and virtual face models systematically. Blood oxygenation level-dependent activation was assessed using functional magnetic resonance imaging in 20 healthy volunteers while they viewed and evaluated either the emotion or gender intensity of dynamic face stimuli. A general linear model analysis revealed that high valence activated a network of motion-responsive areas, indicating that visual motion areas support perceptual coding of the motion-based intensity of facial expressions. The comparison of the emotion task with the gender discrimination task revealed increased activation of the inferior parietal lobule, which highlights the involvement of parietal areas in the processing of high-level features of faces. Dynamic emotional stimuli may help to emphasize the functions of the hypothesized 'extended' over the 'core' system for face processing.

  12. Relative preservation of the recognition of positive facial expression "happiness" in Alzheimer disease.

    PubMed

    Maki, Yohko; Yoshida, Hiroshi; Yamaguchi, Tomoharu; Yamaguchi, Haruyasu

    2013-01-01

    A positivity bias in recognition has been reported for facial expressions as well as memory and visual stimuli in aged individuals, whereas emotional facial recognition in Alzheimer disease (AD) patients is controversial, with possible involvement of confounding factors such as deficits in the spatial processing of non-emotional facial features and in the verbal processing needed to express emotions. Thus, we examined whether recognition of positive facial expressions is preserved in AD patients by adapting a new method that eliminates the influence of these confounding factors. Sensitivity to six basic facial expressions (happiness, sadness, surprise, anger, disgust, and fear) was evaluated in 12 outpatients with mild AD, 17 aged normal controls (ANC), and 25 young normal controls (YNC). To eliminate factors related to non-emotional facial features, averaged faces were prepared as stimuli. To eliminate factors related to verbal processing, participants were required to match the stimulus and answer images, avoiding the use of verbal labels. In the recognition of happiness, there was no difference in sensitivity between YNC and ANC, or between ANC and AD patients. AD patients were less sensitive than ANC in the recognition of sadness, surprise, and anger, and ANC were less sensitive than YNC in the recognition of surprise, anger, and disgust. Within the AD patient group, sensitivity to happiness was significantly higher than that for the other five expressions. In AD patients, recognition of happiness was thus relatively preserved: it was the most sensitively recognized expression and was robust against the influences of age and disease.

  13. Discriminability effect on Garner interference: evidence from recognition of facial identity and expression

    PubMed Central

    Wang, Yamin; Fu, Xiaolan; Johnston, Robert A.; Yan, Zheng

    2013-01-01

    Using Garner’s speeded classification task, existing studies demonstrated an asymmetric interference in the recognition of facial identity and facial expression. It seems that expression hardly interferes with identity recognition. However, the discriminability of identity and expression, a potential confounding variable, had not been carefully examined in existing studies. In the current work, we manipulated the discriminability of identity and expression by matching facial shape (long or round) in identity and matching the mouth (opened or closed) in facial expression. Garner interference was found either from identity to expression (Experiment 1) or from expression to identity (Experiment 2). Interference was also found in both directions (Experiment 3) or in neither direction (Experiment 4). The results suggest that Garner interference tends to occur under conditions of low discriminability of the relevant dimension, regardless of the facial property. Our findings indicate that Garner interference is not necessarily related to interdependent processing in the recognition of facial identity and expression. The findings also suggest that discriminability, as a mediating factor, should be carefully controlled in future research. PMID:24391609

  14. Effects of task demands on the early neural processing of fearful and happy facial expressions

    PubMed Central

    Itier, Roxane J.; Neath-Tavares, Karly N.

    2017-01-01

    Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during gender discrimination, explicit emotion discrimination and oddball detection tasks, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, the neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing. PMID:28315309
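
    The windowed mean-amplitude analysis described above (task effects from 200 to 350 ms at occipito-temporal electrodes) can be expressed in a few lines of array code. Below is a minimal sketch in Python, assuming epoched ERP data are already available as a NumPy array; the variable names and channel indices are placeholders, not the study's actual montage or pipeline.

    ```python
    import numpy as np

    def mean_amplitude(epochs, times, channels, t_win=(0.200, 0.350)):
        """Mean amplitude over a time window and a set of channels.

        epochs:   (n_trials, n_channels, n_samples) array in microvolts
        times:    (n_samples,) array of sample times in seconds
        channels: indices of the occipito-temporal sites of interest
        """
        window = (times >= t_win[0]) & (times <= t_win[1])
        erp = epochs.mean(axis=0)                    # average across trials
        return erp[np.ix_(channels, window)].mean()  # channel-by-window mean

    # Hypothetical usage: EPN-range emotion effect = fearful minus neutral
    # effect = (mean_amplitude(fear_epochs, times, ot_sites)
    #           - mean_amplitude(neutral_epochs, times, ot_sites))
    ```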

  15. Neural responses to facial expressions support the role of the amygdala in processing threat

    PubMed Central

    Sormaz, Mladen; Flack, Tessa; Asghar, Aziz U. R.; Fan, Siyan; Frey, Julia; Manssuer, Luis; Usten, Deniz; Young, Andrew W.; Andrews, Timothy J.

    2014-01-01

    The amygdala is known to play an important role in the response to facial expressions that convey fear. However, it remains unclear whether the amygdala’s response to fear reflects its role in the interpretation of danger and threat, or whether it is to some extent activated by all facial expressions of emotion. Previous attempts to address this issue using neuroimaging have been confounded by differences in the use of control stimuli across studies. Here, we address this issue using a block design functional magnetic resonance imaging paradigm, in which we compared the response to face images posing expressions of fear, anger, happiness, disgust and sadness with a range of control conditions. The responses in the amygdala to different facial expressions were compared with the responses to a non-face condition (buildings), to mildly happy faces and to neutral faces. Results showed that only fear and anger elicited significantly greater responses compared with the control conditions involving faces. Overall, these findings are consistent with the role of the amygdala in processing threat, rather than in the processing of all facial expressions of emotion, and demonstrate the critical importance of the choice of comparison condition to the pattern of results. PMID:24097376

  16. Expressive facial animation synthesis by learning speech coarticulation and expression spaces.

    PubMed

    Deng, Zhigang; Neumann, Ulrich; Lewis, J P; Kim, Tae-Yong; Bulut, Murtaza; Narayanan, Shrikanth

    2006-01-01

    Synthesizing expressive facial animation is a very challenging topic within the graphics community. In this paper, we present an expressive facial animation synthesis system enabled by automated learning from facial motion capture data. Accurate 3D motions of the markers on the face of a human subject are captured while he/she recites a predesigned corpus, with specific spoken and visual expressions. We present a novel motion capture mining technique that "learns" speech coarticulation models for diphones and triphones from the recorded data. A Phoneme-Independent Expression Eigenspace (PIEES) that encloses the dynamic expression signals is constructed by motion signal processing (phoneme-based time-warping and subtraction) and Principal Component Analysis (PCA) reduction. New expressive facial animations are synthesized as follows: First, the learned coarticulation models are concatenated to synthesize neutral visual speech according to novel speech input, then a texture-synthesis-based approach is used to generate a novel dynamic expression signal from the PIEES model, and finally the synthesized expression signal is blended with the synthesized neutral visual speech to create the final expressive facial animation. Our experiments demonstrate that the system can effectively synthesize realistic expressive facial animation.
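
    As a rough illustration of the expression-space step, the sketch below subtracts time-aligned neutral speech motion from expressive motion and reduces the residual with PCA, in the spirit of the PIEES construction described above. It assumes marker trajectories have already been time-warped to equal length; the function name and component count are invented for the example.

    ```python
    import numpy as np

    def build_expression_space(expressive, neutral, n_components=10):
        """Illustrative PIEES-style construction: remove the speech
        component by subtracting time-warped neutral motion, then
        reduce the residual expression signal with PCA.

        expressive, neutral: (n_frames, n_markers * 3) marker
        trajectories, assumed already time-warped to equal length.
        """
        residual = expressive - neutral       # dynamic expression signal
        mean = residual.mean(axis=0)
        centered = residual - mean
        # PCA via singular value decomposition of the centered residuals
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        basis = vt[:n_components]             # principal expression axes
        coeffs = centered @ basis.T           # per-frame coordinates
        return mean, basis, coeffs
    ```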

  17. Spatiotemporal neural network dynamics for the processing of dynamic facial expressions.

    PubMed

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota

    2015-07-24

    The dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic-resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150-200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300-350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and visual-motor route. Superior influences of feedforward and feedback connections were identified before and after 200 ms, respectively. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions.

  18. Spatiotemporal neural network dynamics for the processing of dynamic facial expressions

    PubMed Central

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota

    2015-01-01

    The dynamic facial expressions of emotion automatically elicit multifaceted psychological activities; however, the temporal profiles and dynamic interaction patterns of brain activities remain unknown. We investigated these issues using magnetoencephalography. Participants passively observed dynamic facial expressions of fear and happiness, or dynamic mosaics. Source-reconstruction analyses utilizing functional magnetic-resonance imaging data revealed higher activation in broad regions of the bilateral occipital and temporal cortices in response to dynamic facial expressions than in response to dynamic mosaics at 150–200 ms and some later time points. The right inferior frontal gyrus exhibited higher activity for dynamic faces versus mosaics at 300–350 ms. Dynamic causal-modeling analyses revealed that dynamic faces activated the dual visual routes and visual–motor route. Superior influences of feedforward and feedback connections were identified before and after 200 ms, respectively. These results indicate that hierarchical, bidirectional neural network dynamics within a few hundred milliseconds implement the processing of dynamic facial expressions. PMID:26206708

  19. Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing

    PubMed Central

    Wieser, Matthias J.; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research. PMID:23130011

  20. Faces in context: a review and systematization of contextual influences on affective face processing.

    PubMed

    Wieser, Matthias J; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  1. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    PubMed

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.
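
    The two distortion manipulations (resolution reduction and image blur) can be emulated with standard image operations. A minimal sketch, assuming a 2D grayscale array; the mapping from a cycles/image cutoff to a Gaussian sigma is a rough approximation, not the study's exact filter.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, zoom

    def degrade_face(face, target_res=(48, 64), blur_cycles=15.0):
        """Create resolution-reduced and blurred versions of a face.

        face: 2D grayscale array. target_res mirrors the 48 x 64 pixel
        condition; the cycles/image-to-sigma mapping for the blur is a
        rough approximation, not the study's exact filter.
        """
        # Resolution: downsample, then upsample back for same-size display
        small = zoom(face, (target_res[0] / face.shape[0],
                            target_res[1] / face.shape[1]), order=1)
        low_res = zoom(small, (face.shape[0] / small.shape[0],
                               face.shape[1] / small.shape[1]), order=0)
        # Blur: Gaussian low-pass with a cutoff near blur_cycles cycles/image
        sigma = face.shape[0] / (2.0 * np.pi * blur_cycles)
        blurred = gaussian_filter(face, sigma)
        return low_res, blurred
    ```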

  2. The MPI Facial Expression Database — A Validated Database of Emotional and Conversational Facial Expressions

    PubMed Central

    Kaulard, Kathrin; Cunningham, Douglas W.; Bülthoff, Heinrich H.; Wallraven, Christian

    2012-01-01

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions. PMID:22438875

  3. Interference among the Processing of Facial Emotion, Face Race, and Face Gender.

    PubMed

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender).

  4. Interference among the Processing of Facial Emotion, Face Race, and Face Gender

    PubMed Central

    Li, Yongna; Tse, Chi-Shing

    2016-01-01

    People can process multiple dimensions of facial properties simultaneously. Facial processing models are based on the processing of facial properties. The current study examined the processing of facial emotion, face race, and face gender using categorization tasks. The same set of Chinese, White and Black faces, each posing a neutral, happy or angry expression, was used in three experiments. Facial emotion interacted with face race in all the tasks. The interaction of face race and face gender was found in the race and gender categorization tasks, whereas the interaction of facial emotion and face gender was significant in the emotion and gender categorization tasks. These results provided evidence for a symmetric interaction between variant facial properties (emotion) and invariant facial properties (race and gender). PMID:27840621

  5. Multimedia Content Development as a Facial Expression Datasets for Recognition of Human Emotions

    NASA Astrophysics Data System (ADS)

    Mamonto, N. E.; Maulana, H.; Liliana, D. Y.; Basaruddin, T.

    2018-02-01

    Previously developed datasets contain facial expressions from foreign people. The development of this multimedia content aims to address the problems experienced by the research team and other researchers who will conduct similar research. The method used to develop the multimedia content as a facial expression dataset for human emotion recognition is the Villamil-Molina version of the multimedia development method. The multimedia content was developed with 10 subjects (talents), each performing 3 shots and demonstrating 19 facial expressions in each shot. After editing and rendering, tests were carried out, leading to the conclusion that the multimedia content can be used as a facial expression dataset for the recognition of human emotions.

  6. The role of spatial frequency information for ERP components sensitive to faces and emotional facial expression.

    PubMed

    Holmes, Amanda; Winston, Joel S; Eimer, Martin

    2005-10-01

    To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, houses, and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was neither affected by emotional facial expression nor by spatial frequency information.
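
    For readers unfamiliar with this stimulus manipulation, LSF, HSF, and BSF versions of a face image can be approximated with a Gaussian low-pass filter and its residual. A minimal sketch; the sigma value is illustrative, since the actual cutoff in cycles/face depends on image size and viewing distance, which are not given here.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def spatial_frequency_versions(face, sigma=4.0):
        """Return LSF, HSF, and BSF versions of a grayscale face image.

        sigma is illustrative; the actual cutoffs (in cycles/face) used
        in such studies depend on image size and viewing distance.
        """
        lsf = gaussian_filter(face, sigma)   # low-pass: coarse structure
        hsf = face - lsf                     # high-pass: fine detail
        hsf = hsf + face.mean()              # re-center for display
        bsf = face                           # broad-band: unfiltered
        return lsf, hsf, bsf
    ```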

  7. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions

    PubMed Central

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2018-01-01

    Facial mimicry (FM) is an automatic response to imitate the facial expressions of others. However, the neural correlates of the phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., inferior frontal gyrus, a classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions. PMID:29467691

  8. A comparison of the effects of a beta-adrenergic blocker and a benzodiazepine upon the recognition of human facial expressions.

    PubMed

    Zangara, Andrea; Blair, R J R; Curran, H Valerie

    2002-08-01

    Accumulating evidence from neuropsychological and neuroimaging research suggests that facial expressions are processed by at least partially separable neurocognitive systems. Recent evidence implies that the processing of different facial expressions may also be dissociable pharmacologically by GABAergic and noradrenergic compounds, although no study has directly compared the two types of drugs. The present study therefore directly compared the effects of a benzodiazepine with those of a beta-adrenergic blocker on the ability to recognise emotional expressions. A double-blind, independent group design was used with 45 volunteers to compare the effects of diazepam (15 mg) and metoprolol (50 mg) with matched placebo. Participants were presented with morphed facial expression stimuli and asked to identify which of the six basic emotions (sadness, happiness, anger, disgust, fear and surprise) were portrayed. Control measures of mood, pulse rate and word recall were also taken. Diazepam selectively impaired participants' ability to recognise expressions of both anger and fear but not other emotional expressions. Errors were mainly mistaking fear for surprise and disgust for anger. Metoprolol did not significantly affect facial expression recognition. These findings are interpreted as providing further support for the suggestion that there are dissociable systems responsible for processing emotional expressions. The results may have implications for understanding why 'paradoxical' aggression is sometimes elicited by benzodiazepines and for extending our psychological understanding of the anxiolytic effects of these drugs.

  9. Cradling Side Preference Is Associated with Lateralized Processing of Baby Facial Expressions in Females

    ERIC Educational Resources Information Center

    Huggenberger, Harriet J.; Suter, Susanne E.; Reijnen, Ester; Schachinger, Hartmut

    2009-01-01

    Women's cradling side preference has been related to contralateral hemispheric specialization for processing emotional signals, but not for processing a baby's facial expression. Therefore, 46 nulliparous female volunteers were characterized as left or non-left holders (HG) during a doll holding task. During a signal detection task they were then…

  10. Facing the Problem: Impaired Emotion Recognition During Multimodal Social Information Processing in Borderline Personality Disorder.

    PubMed

    Niedtfeld, Inga; Defiebre, Nadine; Regenbogen, Christina; Mier, Daniela; Fenske, Sabrina; Kirsch, Peter; Lis, Stefanie; Schmahl, Christian

    2017-04-01

    Previous research has revealed alterations and deficits in facial emotion recognition in patients with borderline personality disorder (BPD). During interpersonal communication in daily life, social signals such as speech content, variation in prosody, and facial expression need to be considered simultaneously. We hypothesized that deficits in higher level integration of social stimuli contribute to difficulties in emotion recognition in BPD, and heightened arousal might explain this effect. Thirty-one patients with BPD and thirty-one healthy controls were asked to identify emotions in short video clips, which were designed to represent different combinations of the three communication channels: facial expression, speech content, and prosody. Skin conductance was recorded as a measure of sympathetic arousal, while controlling for state dissociation. Patients with BPD showed lower mean accuracy scores than healthy control subjects in all conditions comprising emotional facial expressions. This was true for the condition with facial expression only, and for the combination of all three communication channels. Electrodermal responses were enhanced in BPD only in response to auditory stimuli. In line with the major body of facial emotion recognition studies, we conclude that deficits in the interpretation of facial expressions lead to the difficulties observed in multimodal emotion processing in BPD.

  11. Face in profile view reduces perceived facial expression intensity: an eye-tracking study.

    PubMed

    Guo, Kun; Shaw, Heather

    2015-02-01

    Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, having a mechanism which allows invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because diagnostic cues from local facial features for decoding expressions could vary with viewpoints. Here we manipulated the orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgust and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although quantitatively viewpoint had expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that viewpoint-invariant facial expression processing reflects categorical perception, which could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Right Hemispheric Dominance in Processing of Unconscious Negative Emotion

    ERIC Educational Resources Information Center

    Sato, Wataru; Aoki, Satoshi

    2006-01-01

    Right hemispheric dominance in unconscious emotional processing has been suggested, but remains controversial. This issue was investigated using the subliminal affective priming paradigm combined with unilateral visual presentation in 40 normal subjects. In either left or right visual fields, angry facial expressions, happy facial expressions, or…

  13. Do facial movements express emotions or communicate motives?

    PubMed

    Parkinson, Brian

    2005-01-01

    This article addresses the debate between emotion-expression and motive-communication approaches to facial movements, focusing on Ekman's (1972) and Fridlund's (1994) contrasting models and their historical antecedents. Available evidence suggests that the presence of others either reduces or increases facial responses, depending on the quality and strength of the emotional manipulation and on the nature of the relationship between interactants. Although both display rules and social motives provide viable explanations of audience "inhibition" effects, some audience facilitation effects are less easily accommodated within an emotion-expression perspective. In particular, emotion is not a sufficient condition for a corresponding "expression," even discounting explicit regulation, and, apparently, "spontaneous" facial movements may be facilitated by the presence of others. Further, there is no direct evidence that any particular facial movement provides an unambiguous expression of a specific emotion. However, information communicated by facial movements is not necessarily extrinsic to emotion. Facial movements not only transmit emotion-relevant information but also contribute to ongoing processes of emotional action in accordance with pragmatic theories.

  14. The emotion seen in a face can be a methodological artifact: The process of elimination hypothesis.

    PubMed

    DiGirolamo, Marissa A; Russell, James A

    2017-04-01

    The claim that certain facial expressions signal certain specific emotions has been supported by high observer agreement in labeling the emotion predicted for that expression. Our hypothesis was that, with a method common to the field, high observer agreement can be achieved through a process of elimination: As participants move from trial to trial and encounter a type of expression not previously encountered in the experiment, they tend to eliminate labels they have already associated with expressions seen on previous trials; they then select among labels not previously used. Seven experiments (total N = 1,068) here showed that the amount of agreement can be altered through a process of elimination. One facial expression not previously theorized to signal any emotion was consensually labeled as disgusted (76%), annoyed (85%), playful (89%), and mischievous (96%). Three quite different facial expressions were labeled nonplussed (82%, 93%, and 82%). A prototypical sad expression was labeled disgusted (55%), and a prototypical fear expression was labeled surprised (55%). A facial expression was labeled with a made-up word (tolen; 53%). Similar results were obtained both in a context focused on demonstrating a process of elimination and in one similar to a commonly used method, with 4 target expressions embedded with other expressions in 24 randomly ordered trials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
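
    The elimination account lends itself to a toy simulation. In the hypothetical sketch below, participants have already attached five conventional labels to familiar expressions and, when shown a novel expression, choose among the labels that remain; with a single leftover label this alone produces perfect "agreement". The label set and trial structure are invented for illustration.

    ```python
    import random

    # Hypothetical label set: five familiar emotion labels plus one
    # label ("nonplussed") that no familiar expression has claimed.
    LABELS = ["happy", "sad", "angry", "afraid", "disgusted", "nonplussed"]
    FAMILIAR_LABELS = {"happy", "sad", "angry", "afraid", "disgusted"}

    def label_novel_face():
        # Familiar expressions have "used up" their conventional labels,
        # so the novel face is labeled from whatever remains.
        remaining = [lab for lab in LABELS if lab not in FAMILIAR_LABELS]
        return random.choice(remaining)

    trials = 10_000
    agreement = sum(label_novel_face() == "nonplussed" for _ in range(trials))
    print(f"Consensus on the novel expression: {agreement / trials:.0%}")
    # Prints 100%: perfect "agreement" without any genuine recognition.
    ```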

  15. Emotional facial expressions reduce neural adaptation to face identity.

    PubMed

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to the same as compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
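
    Because stimulation is fixed at 3 Hz, the ssVEP amplitude can be read off the FFT bin at the stimulation frequency. A minimal single-channel sketch, assuming a raw EEG vector and its sampling rate; the adaptation index in the trailing comment is an illustrative construction, not the paper's exact measure.

    ```python
    import numpy as np

    def ssvep_amplitude(eeg, fs, f_stim=3.0):
        """Amplitude at the stimulation frequency for one EEG channel.

        eeg: 1D array; fs: sampling rate in Hz. The segment is trimmed
        to an integer number of stimulation cycles so that f_stim falls
        on an exact FFT bin.
        """
        cycles = int(len(eeg) * f_stim / fs)    # complete 3 Hz cycles
        n = int(round(cycles * fs / f_stim))    # samples covering them
        spectrum = np.fft.rfft(eeg[:n])
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        k = np.argmin(np.abs(freqs - f_stim))
        return 2.0 * np.abs(spectrum[k]) / n

    # Hypothetical adaptation index: same-identity relative to
    # different-identity amplitude (values below 1 indicate adaptation).
    # adaptation = ssvep_amplitude(same_id, fs) / ssvep_amplitude(diff_id, fs)
    ```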

  16. Flexible and inflexible task sets: asymmetric interference when switching between emotional expression, sex, and age classification of perceived faces.

    PubMed

    Schuch, Stefanie; Werheid, Katja; Koch, Iring

    2012-01-01

    The present study investigated whether the processing characteristics of categorizing emotional facial expressions are different from those of categorizing facial age and sex information. Given that emotions change rapidly, it was hypothesized that processing facial expressions involves a more flexible task set that causes less between-task interference than the task sets involved in processing age or sex of a face. Participants switched between three tasks: categorizing a face as looking happy or angry (emotion task), young or old (age task), and male or female (sex task). Interference between tasks was measured by global interference and response interference. Both measures revealed patterns of asymmetric interference. Global between-task interference was reduced when a task was mixed with the emotion task. Response interference, as measured by congruency effects, was larger for the emotion task than for the nonemotional tasks. The results support the idea that processing emotional facial expression constitutes a more flexible task set that causes less interference (i.e., task-set "inertia") than processing the age or sex of a face.

  17. Shades of Emotion: What the Addition of Sunglasses or Masks to Faces Reveals about the Development of Facial Expression Processing

    ERIC Educational Resources Information Center

    Roberson, Debi; Kikutani, Mariko; Doge, Paula; Whitaker, Lydia; Majid, Asifa

    2012-01-01

    Three studies investigated developmental changes in facial expression processing, between 3 years-of-age and adulthood. For adults and older children, the addition of sunglasses to upright faces caused an equivalent decrement in performance to face inversion. However, younger children showed "better" classification of expressions of faces wearing…

  18. Effects of task demands on the early neural processing of fearful and happy facial expressions.

    PubMed

    Itier, Roxane J; Neath-Tavares, Karly N

    2017-05-15

    Task demands shape how we process environmental stimuli, but their impact on the early neural processing of facial expressions remains unclear. In a within-subject design, ERPs were recorded to the same fearful, happy and neutral facial expressions presented during gender discrimination, explicit emotion discrimination and oddball detection tasks, the most studied tasks in the field. Using an eye tracker, fixation on the nose of the face was enforced using a gaze-contingent presentation. Task demands modulated amplitudes from 200 to 350 ms at occipito-temporal sites spanning the EPN component. Amplitudes were more negative for fearful than neutral expressions starting at the N170, from 150 to 350 ms, with a temporo-occipital distribution, whereas no clear effect of happy expressions was seen. Task and emotion effects never interacted in any time window or for the ERP components analyzed (P1, N170, EPN). Thus, whether emotion is explicitly discriminated or irrelevant for the task at hand, the neural correlates of fearful and happy facial expressions seem immune to these task demands during the first 350 ms of visual processing. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Spontaneous facial expressions of emotion of congenitally and noncongenitally blind individuals.

    PubMed

    Matsumoto, David; Willingham, Bob

    2009-01-01

    The study of the spontaneous expressions of blind individuals offers a unique opportunity to understand basic processes concerning the emergence and source of facial expressions of emotion. In this study, the authors compared the expressions of congenitally and noncongenitally blind athletes in the 2004 Paralympic Games with each other and with those produced by sighted athletes in the 2004 Olympic Games. The authors also examined how expressions change from 1 context to another. There were no differences between congenitally blind, noncongenitally blind, and sighted athletes, either on the level of individual facial actions or in facial emotion configurations. Blind athletes did produce more overall facial activity, but these differences were isolated to head and eye movements. The blind athletes' expressions differentiated whether they had won or lost a medal match at 3 different points in time, and there were no cultural differences in expression. These findings provide compelling evidence that the production of spontaneous facial expressions of emotion is not dependent on observational learning but simultaneously demonstrates a learned component to the social management of expressions, even among blind individuals.

  20. Early and late temporo-spatial effects of contextual interference during perception of facial affect.

    PubMed

    Frühholz, Sascha; Fehr, Thorsten; Herrmann, Manfred

    2009-10-01

    Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color-combinations as in the first run (congruent trials) or with different face-color-combinations (incongruent trials). Incongruent trials induced significantly increased response latencies and significantly decreased performance accuracy. Contextual incongruent information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN), both localized in occipito-temporal areas. Contextual congruent information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. The highest N170 amplitude was found for negative expressions in a negatively associated context, and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity which was localized to superior parietal cortex and which most likely represents a posterior manifestation of the N450 as an indicator of conflict processing. Sustained activation of the late positive potential (LPP) over parietal cortex for all incongruent trials might reflect enhanced engagement with facial expression during task conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expression in specific contexts, late components seem to subserve interference resolution during emotional face processing.

  1. Deeper than skin deep - The effect of botulinum toxin-A on emotion processing.

    PubMed

    Baumeister, J-C; Papa, G; Foroni, F

    2016-08-01

    The effect of facial botulinum Toxin-A (BTX) injections on the processing of emotional stimuli was investigated. The hypothesis that BTX would interfere with the processing of slightly emotional stimuli, but less with very emotional or neutral stimuli, was largely confirmed. BTX users rated slightly emotional sentences and facial expressions, but not very emotional or neutral ones, as less emotional after the treatment. Furthermore, they became slower at categorizing slightly emotional facial expressions under time pressure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Selective attention to a facial feature with and without facial context: an ERP-study.

    PubMed

    Wijers, A A; Van Besouw, N J P; Mulder, G

    2002-04-01

    The present experiment addressed the question of whether selectively attending to a facial feature (mouth shape) would benefit from the presence of a correct facial context. Subjects attended selectively to one of two possible mouth shapes belonging to photographs of a face with a happy or sad expression, respectively. These mouths were presented randomly either in isolation, embedded in the original photos, or in an exchanged facial context. The ERP effect of attending to mouth shape was a lateral posterior negativity and anterior positivity with an onset latency of 160-200 ms; this effect was completely unaffected by the type of facial context. When the mouth shape and the facial context conflicted, this resulted in a medial parieto-occipital positivity with an onset latency of 180 ms, independent of the relevance of the mouth shape. Finally, there was a late (onset at approx. 400 ms) expression (happy vs. sad) effect, which was strongly lateralized to the right posterior hemisphere and was most prominent for attended stimuli in the correct facial context. For the isolated mouth stimuli, a similarly distributed expression effect was observed at an earlier latency range (180-240 ms). These data suggest the existence of separate, independent and neuroanatomically segregated processors engaged in the selective processing of facial features and the detection of contextual congruence and emotional expression of face stimuli. The data do not support the idea that early selective attention processes benefit from top-down constraints provided by the correct facial context.

  3. Greater perceptual sensitivity to happy facial expression.

    PubMed

    Maher, Stephen; Ekstrom, Tor; Chen, Yue

    2014-01-01

    Perception of subtle facial expressions is essential for social functioning; yet it is unclear if human perceptual sensitivities differ in detecting varying types of facial emotions. Evidence diverges as to whether salient negative versus positive emotions (such as sadness versus happiness) are preferentially processed. Here, we measured perceptual thresholds for the detection of four types of emotion in faces (happiness, fear, anger, and sadness) using psychophysical methods. We also evaluated the association of the perceptual performances with facial morphological changes between neutral and respective emotion types. Human observers were highly sensitive to happiness compared with the other emotional expressions. Further, this heightened perceptual sensitivity to happy expressions can be attributed largely to the emotion-induced morphological change of a particular facial feature (end-lip raise).
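
    Perceptual thresholds of this kind are commonly estimated by fitting a psychometric function to detection accuracy across stimulus intensities. A minimal sketch with hypothetical numbers; the logistic form, guess and lapse rates, and the data values are assumptions for illustration, not results from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def psychometric(x, alpha, beta, gamma=0.5, lam=0.02):
        # Logistic psychometric function for a two-alternative task:
        # gamma = guess rate, lam = lapse rate, alpha = threshold,
        # beta = slope.
        return gamma + (1.0 - gamma - lam) / (1.0 + np.exp(-beta * (x - alpha)))

    # Hypothetical data: expression intensity (% morph away from neutral)
    # versus proportion of correct detections for one emotion.
    intensity = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 40.0, 60.0])
    p_correct = np.array([0.52, 0.55, 0.68, 0.80, 0.90, 0.96, 0.99])

    (alpha, beta), _ = curve_fit(psychometric, intensity, p_correct,
                                 p0=[20.0, 0.2])
    print(f"Estimated detection threshold: {alpha:.1f}% expression intensity")
    ```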

  4. Modulation of α power and functional connectivity during facial affect recognition.

    PubMed

    Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte; Weisz, Nathan

    2013-04-03

    Research has linked oscillatory activity in the α frequency range, particularly in sensorimotor cortex, to processing of social actions. Results further suggest involvement of sensorimotor α in the processing of facial expressions, including affect. The sensorimotor face area may be critical for perception of emotional face expression, but the role it plays is unclear. The present study sought to clarify how oscillatory brain activity contributes to or reflects processing of facial affect during changes in facial expression. Neuromagnetic oscillatory brain activity was monitored while 30 volunteers viewed videos of human faces that changed their expression from neutral to fearful, neutral, or happy expressions. Induced changes in α power during the different morphs, source analysis, and graph-theoretic metrics served to identify the role of α power modulation and cross-regional coupling by means of phase synchrony during facial affect recognition. Changes from neutral to emotional faces were associated with a 10-15 Hz power increase localized in bilateral sensorimotor areas, together with occipital power decrease, preceding reported emotional expression recognition. Graph-theoretic analysis revealed that, in the course of a trial, the balance between sensorimotor power increase and decrease was associated with decreased and increased transregional connectedness as measured by node degree. Results suggest that modulations in α power facilitate early registration, with sensorimotor cortex including the sensorimotor face area largely functionally decoupled and thereby protected from additional, disruptive input and that subsequent α power decrease together with increased connectedness of sensorimotor areas facilitates successful facial affect recognition.
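
    The analysis chain sketched in this abstract (band-limited power, phase synchrony, and graph-theoretic node degree) can be outlined in a few functions. The sketch below band-passes at 10-15 Hz, extracts phase with the Hilbert transform, computes phase-locking values, and thresholds the resulting graph; the filter order and threshold are illustrative choices, not those of the study.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def alpha_band(data, fs, lo=10.0, hi=15.0):
        """Band-pass in the 10-15 Hz range reported in the abstract."""
        b, a = butter(4, [lo / (fs / 2.0), hi / (fs / 2.0)], btype="band")
        return filtfilt(b, a, data, axis=-1)

    def alpha_power(data, fs):
        """Mean alpha-band power per sensor (squared Hilbert envelope)."""
        envelope = np.abs(hilbert(alpha_band(data, fs), axis=-1))
        return (envelope ** 2).mean(axis=-1)

    def plv_matrix(data, fs):
        """Phase-locking value for all sensor pairs.
        data: (n_sensors, n_samples) array."""
        phase = np.angle(hilbert(alpha_band(data, fs), axis=-1))
        n = data.shape[0]
        plv = np.ones((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                plv[i, j] = plv[j, i] = np.abs(
                    np.mean(np.exp(1j * (phase[i] - phase[j]))))
        return plv

    def node_degree(plv, threshold=0.6):
        # Binarize the connectivity graph at an illustrative threshold
        # and count each sensor's connections.
        adjacency = (plv > threshold) & ~np.eye(len(plv), dtype=bool)
        return adjacency.sum(axis=1)
    ```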

  5. Intact Rapid Facial Mimicry as well as Generally Reduced Mimic Responses in Stable Schizophrenia Patients

    PubMed Central

    Chechko, Natalya; Pagel, Alena; Otte, Ellen; Koch, Iring; Habel, Ute

    2016-01-01

    Spontaneous emotional expressions (rapid facial mimicry) perform both emotional and social functions. In the current study, we sought to test whether there were deficits in automatic mimic responses to emotional facial expressions in 15 patients with stable schizophrenia compared to 15 controls. In a perception-action interference paradigm (the Simon task; first experiment), and in the context of a dual-task paradigm (second experiment), the task-relevant stimulus feature was the gender of a face, which, however, displayed a smiling or frowning expression (task-irrelevant stimulus feature). We measured the electromyographical activity in the corrugator supercilii and zygomaticus major muscle regions in response to either compatible or incompatible stimuli (i.e., when the required response did or did not correspond to the depicted facial expression). The compatibility effect based on interactions between the implicit processing of a task-irrelevant emotional facial expression and the conscious production of an emotional facial expression did not differ between the groups. In stable patients (in spite of a reduced mimic reaction), we observed an intact capacity to respond spontaneously to facial emotional stimuli. PMID:27303335

  6. Facial emotion processing in pediatric social anxiety disorder: Relevance of situational context.

    PubMed

    Schwab, Daniela; Schienle, Anne

    2017-08-01

    Social anxiety disorder (SAD) typically begins in childhood. Previous research has demonstrated that adult patients respond with elevated late positivity (LP) to negative facial expressions. In the present study on pediatric SAD, we investigated responses to negative facial expressions and the role of social context information. Fifteen children with SAD and 15 non-anxious controls were first presented with images of negative facial expressions with masked backgrounds. Following this, the complete images, which included context information, were shown. The negative expressions were either a result of an emotion-relevant (e.g., social exclusion) or emotion-irrelevant elicitor (e.g., weight lifting). Relative to controls, the clinical group showed elevated parietal LP during face processing with and without context information. Both groups differed in their frontal LP depending on the type of context. In SAD patients, frontal LP was lower in emotion-relevant than emotion-irrelevant contexts. We conclude that SAD patients direct more automatic attention towards negative facial expressions (parietal effect) and are less capable of integrating affective context information (frontal effect). Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. A facial expression of pax: Assessing children's "recognition" of emotion from faces.

    PubMed

    Nelson, Nicole L; Russell, James A

    2016-01-01

    In a classic study, children were shown an array of facial expressions and asked to choose the person who expressed a specific emotion. Children were later asked to name the emotion in the face with any label they wanted. Subsequent research often relied on the same two tasks (choice from array and free labeling) to support the conclusion that children recognize basic emotions from facial expressions. Here five studies (N=120, 2- to 10-year-olds) showed that these two tasks produce illusory recognition; a novel nonsense facial expression was included in the array. Children "recognized" a nonsense emotion (pax or tolen) and two familiar emotions (fear and jealousy) from the same nonsense face. Children likely used a process of elimination; they paired the unknown facial expression with a label given in the choice-from-array task and, after just two trials, freely labeled the new facial expression with the new label. These data indicate that past studies using this method may have overestimated children's expression knowledge. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. The role of the cannabinoid receptor in adolescents' processing of facial expressions.

    PubMed

    Ewald, Anais; Becker, Susanne; Heinrich, Angela; Banaschewski, Tobias; Poustka, Luise; Bokde, Arun; Büchel, Christian; Bromberg, Uli; Cattrell, Anna; Conrod, Patricia; Desrivières, Sylvane; Frouin, Vincent; Papadopoulos-Orfanos, Dimitri; Gallinat, Jürgen; Garavan, Hugh; Heinz, Andreas; Walter, Henrik; Ittermann, Bernd; Gowland, Penny; Paus, Tomáš; Martinot, Jean-Luc; Paillère Martinot, Marie-Laure; Smolka, Michael N; Vetter, Nora; Whelan, Rob; Schumann, Gunter; Flor, Herta; Nees, Frauke

    2016-01-01

    The processing of emotional faces is an important prerequisite for adequate social interactions in daily life, and might thus specifically be altered in adolescence, a period marked by significant changes in social emotional processing. Previous research has shown that the cannabinoid receptor CB1R is associated with longer gaze duration and increased brain responses in the striatum to happy faces in adults, yet, for adolescents, it is not clear whether an association between CB1R and face processing exists. In the present study we investigated genetic effects of the two CB1R polymorphisms, rs1049353 and rs806377, on the processing of emotional faces in healthy adolescents. They participated in functional magnetic resonance imaging during a Faces Task, watching blocks of video clips with angry and neutral facial expressions, and completed a Morphed Faces Task in the laboratory where they looked at different facial expressions that switched from anger to fear or sadness or from happiness to fear or sadness, and labelled them according to these four emotional expressions. A-allele carriers versus GG-carriers of rs1049353 displayed earlier recognition of facial expressions changing from anger to sadness or fear, but not of expressions changing from happiness to sadness or fear, and higher brain responses to angry, but not neutral, faces in the amygdala and insula. For rs806377 no significant effects emerged. This suggests that rs1049353 is involved in the processing of negative facial expressions with relation to anger in adolescence. These findings add to our understanding of social emotion-related mechanisms in this life period. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  9. Drug effects on responses to emotional facial expressions: recent findings.

    PubMed

    Miller, Melissa A; Bershad, Anya K; de Wit, Harriet

    2015-09-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally.

  10. Automatic processing of facial affects in patients with borderline personality disorder: associations with symptomatology and comorbid disorders.

    PubMed

    Donges, Uta-Susan; Dukalski, Bibiana; Kersting, Anette; Suslow, Thomas

    2015-01-01

    Instability of affects and interpersonal relations are important features of borderline personality disorder (BPD). Interpersonal problems of individuals suffering from BPD might develop based on abnormalities in the processing of facial affects and high sensitivity to negative affective expressions. The aims of the present study were to examine automatic evaluative shifts and latencies as a function of masked facial affects in patients with BPD compared to healthy individuals. As BPD comorbidity rates for mental and personality disorders are high, we also investigated the relationships of affective processing characteristics with specific borderline symptoms and comorbidity. Twenty-nine women with BPD and 38 healthy women participated in the study. The majority of patients suffered from additional Axis I disorders and/or additional personality disorders. In the priming experiment, angry, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Evaluative decisions and response latencies were registered. Borderline-typical symptomatology was assessed with the Borderline Symptom List. In the total sample, valence-congruent evaluative shifts and delays of evaluative decision due to facial affect were observed. No between-group differences were obtained for evaluative decisions and latencies. The presence of comorbid anxiety disorders was found to be positively correlated with evaluative shifting owing to masked happy primes, regardless of baseline (neutral or no facial expression condition). The presence of comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression were significantly correlated with response delay due to masked angry faces, regardless of baseline. In the present affective priming study, no abnormalities in the automatic recognition and processing of facial affects were observed in BPD patients compared to healthy individuals. The presence of comorbid anxiety disorders could make patients more susceptible to the influence of a happy expression on judgment processes at an automatic processing level. Comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression may enhance automatic attention allocation to threatening facial expressions in BPD. Increased automatic vigilance for social threat stimuli might contribute to affective instability and interpersonal problems in specific patients with BPD.

  11. Human amygdala response to dynamic facial expressions of positive and negative surprise.

    PubMed

    Vrticka, Pascal; Lordier, Lara; Bediou, Benoît; Sander, David

    2014-02-01

    Although brain imaging evidence accumulates to suggest that the amygdala plays a key role in the processing of novel stimuli, little is known about its role in processing expressed novelty conveyed by surprised faces, and even less about possible interactive encoding of novelty and valence. Those investigations that have already probed human amygdala involvement in the processing of surprised facial expressions either used static pictures displaying negative surprise (as contained in fear) or "neutral" surprise, and manipulated valence by contextually priming or subjectively associating static surprise with either negative or positive information. Therefore, it still remains unresolved how the human amygdala differentially processes dynamic surprised facial expressions displaying either positive or negative surprise. Here, we created new artificial dynamic 3-dimensional facial expressions conveying surprise with an intrinsic positive (wonderment) or negative (fear) connotation, but also intrinsic positive (joy) or negative (anxiety) emotions not containing any surprise, in addition to neutral facial displays either containing ("typical surprise" expression) or not containing ("neutral") surprise. Results showed heightened amygdala activity to faces containing positive (vs. negative) surprise, which may either correspond to a specific wonderment effect as such, or to the computation of a negative expected value prediction error. Findings are discussed in the light of data obtained from a closely matched nonsocial lottery task, which revealed overlapping activity within the left amygdala to unexpected positive outcomes. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  12. Dynamic Facial Expressions Prime the Processing of Emotional Prosody.

    PubMed

    Garrido-Vásquez, Patricia; Pell, Marc D; Paulmann, Silke; Kotz, Sonja A

    2018-01-01

    Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been put forward: cross-modal integration and cross-modal priming. In cross-modal integration studies, visual and auditory channels are temporally aligned, while in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally-intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not significantly differ. However, N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was activated in incongruent compared to congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early point in time, and that an emotional mismatch between a facial expression and its ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated in case of emotional prime-target incongruency.
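
    Components such as the N100 and P200 are conventionally measured as mean or peak amplitudes in fixed post-stimulus windows of the trial-averaged, baseline-corrected ERP. The sketch below (Python; sampling rate, baseline, and window are illustrative defaults, not the study's settings) shows the basic computation.

    ```python
    import numpy as np

    def n100_amplitude(epochs_uv, fs=500.0, prestim_s=0.2, window_s=(0.08, 0.14)):
        """Toy N100 measurement: average EEG epochs (trials x samples), subtract
        the pre-stimulus baseline, and take the most negative value in a canonical
        N100 window after stimulus onset."""
        erp = np.asarray(epochs_uv, float).mean(axis=0)   # trial-averaged waveform
        baseline = erp[: int(prestim_s * fs)].mean()      # pre-stimulus mean
        erp = erp - baseline                              # baseline correction
        i0 = int((prestim_s + window_s[0]) * fs)
        i1 = int((prestim_s + window_s[1]) * fs)
        return erp[i0:i1].min()                           # N100 is a negativity
    ```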

  13. Subliminal Face Emotion Processing: A Comparison of Fearful and Disgusted Faces

    PubMed Central

    Khalid, Shah; Ansorge, Ulrich

    2017-01-01

    Prior research has provided evidence for (1) subcortical processing of subliminal facial expressions of emotion and (2) the emotion-specificity of these processes. Here, we investigated whether this is also true for the processing of the subliminal facial display of disgust. In Experiment 1, we used differently filtered masked prime faces portraying emotionally neutral or disgusted expressions, presented prior to clearly visible target faces, to test whether the masked primes nonetheless exerted an influence on target processing. Whereas we found evidence for subliminal face congruence or priming effects, in particular reverse priming by low-spatial-frequency disgusted face primes, we did not find any support for a subcortical origin of the effect. In Experiment 2, we compared the influence of subliminal disgusted faces with that of subliminal fearful faces and demonstrated a behavioral performance difference between the two, pointing to emotion-specific processing of the disgusted facial expressions. In both experiments, we also tested for the dependence of subliminal emotional face processing on spatial attention – with mixed results, suggesting attention-independence in Experiment 1 but not in Experiment 2 – and we found perfect masking of the face primes – that is, proof of the subliminality of the prime faces. Based on our findings, we speculate that subliminal facial expressions of disgust could afford easy avoidance of these faces. This could be a unique effect of disgusted faces as compared to other emotional facial displays, at least under the conditions studied here. PMID:28680413

  14. Spontaneous Facial Mimicry Is Enhanced by the Goal of Inferring Emotional States: Evidence for Moderation of "Automatic" Mimicry by Higher Cognitive Processes.

    PubMed

    Murata, Aiko; Saito, Hisamichi; Schug, Joanna; Ogawa, Kenji; Kameda, Tatsuya

    2016-01-01

    A number of studies have shown that individuals often spontaneously mimic the facial expressions of others, a tendency known as facial mimicry. This tendency has generally been considered a reflex-like "automatic" response, but several recent studies have shown that the degree of mimicry may be moderated by contextual information. However, the cognitive and motivational factors underlying the contextual moderation of facial mimicry require further empirical investigation. In this study, we present evidence that the degree to which participants spontaneously mimic a target's facial expressions depends on whether participants are motivated to infer the target's emotional state. In the first study we show that facial mimicry, assessed by facial electromyography, occurs more frequently when participants are specifically instructed to infer a target's emotional state than when given no instruction. In the second study, we replicate this effect using the Facial Action Coding System to show that participants are more likely to mimic facial expressions of emotion when they are asked to infer the target's emotional state, rather than make inferences about a physical trait unrelated to emotion. These results provide convergent evidence that the explicit goal of understanding a target's emotional state affects the degree of facial mimicry shown by the perceiver, suggesting moderation of reflex-like motor activities by higher cognitive processes.
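
    Facial-mimicry EMG of this kind is usually reduced to an amplitude envelope per muscle site (zygomaticus for happy faces, corrugator for angry faces) before congruency scoring. A minimal sketch, assuming typical surface-EMG parameters rather than the study's actual pipeline:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def emg_envelope(raw_uv, fs=1000.0, band=(20.0, 450.0), smooth_ms=100):
        """Illustrative facial-EMG reduction for mimicry scoring: band-pass
        filter, full-wave rectify, then smooth into an amplitude envelope."""
        nyq = fs / 2.0
        b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
        filtered = filtfilt(b, a, np.asarray(raw_uv, float))  # remove drift/noise
        rectified = np.abs(filtered)                          # full-wave rectify
        win = max(1, int(fs * smooth_ms / 1000.0))
        kernel = np.ones(win) / win
        return np.convolve(rectified, kernel, mode="same")    # moving-average envelope
    ```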

  15. Sad Facial Expressions Increase Choice Blindness

    PubMed Central

    Wang, Yajie; Zhao, Song; Zhang, Zhijie; Feng, Wenfeng

    2018-01-01

    Previous studies have discovered a fascinating phenomenon known as choice blindness—individuals fail to detect mismatches between the face they choose and the face replaced by the experimenter. Although previous studies have reported a couple of factors that can modulate the magnitude of choice blindness, the potential effect of facial expression on choice blindness has not yet been explored. Using faces with sad and neutral expressions (Experiment 1) and faces with happy and neutral expressions (Experiment 2) in the classic choice blindness paradigm, the present study investigated the effects of facial expressions on choice blindness. The results showed that the detection rate was significantly lower on sad faces than neutral faces, whereas no significant difference was observed between happy faces and neutral faces. The exploratory analysis of verbal reports found that participants who reported less facial features for sad (as compared to neutral) expressions also tended to show a lower detection rate of sad (as compared to neutral) faces. These findings indicated that sad facial expressions increased choice blindness, which might have resulted from inhibition of further processing of the detailed facial features by the less attractive sad expressions (as compared to neutral expressions). PMID:29358926

  16. The human amygdala parametrically encodes the intensity of specific facial emotions and their categorical ambiguity

    PubMed Central

    Wang, Shuo; Yu, Rongjun; Tyszka, J. Michael; Zhen, Shanshan; Kovach, Christopher; Sun, Sai; Huang, Yi; Hurlemann, Rene; Ross, Ian B.; Chung, Jeffrey M.; Mamelak, Adam N.; Adolphs, Ralph; Rutishauser, Ueli

    2017-01-01

    The human amygdala is a key structure for processing emotional facial expressions, but it remains unclear what aspects of emotion are processed. We investigated this question with three different approaches: behavioural analysis of 3 amygdala lesion patients, neuroimaging of 19 healthy adults, and single-neuron recordings in 9 neurosurgical patients. The lesion patients showed a shift in behavioural sensitivity to fear, and amygdala BOLD responses were modulated by both fear and emotion ambiguity (the uncertainty that a facial expression is categorized as fearful or happy). We found two populations of neurons, one whose response correlated with increasing degree of fear, or happiness, and a second whose response primarily decreased as a linear function of emotion ambiguity. Together, our results indicate that the human amygdala processes both the degree of emotion in facial expressions and the categorical ambiguity of the emotion shown and that these two aspects of amygdala processing can be most clearly distinguished at the level of single neurons. PMID:28429707

  17. Effects of touch on emotional face processing: A study of event-related potentials, facial EMG and cardiac activity.

    PubMed

    Spapé, M M; Harjunen, Ville; Ravaja, N

    2017-03-01

    Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity. Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional stimuli, suggesting altered affective stimulus processing. As, however, these studies approached emotion from a purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more general feature such as salience. Here, we investigated how simple tactile primes modulate event-related potentials (ERPs), facial EMG, and cardiac responses to pictures of facial expressions of emotion. All measures replicated known effects of emotional face processing: disgust and fear modulated early ERPs, anger increased the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch may additively affect general stimulus processing, but it does not bias or modulate immediate affective evaluation. Copyright © 2017. Published by Elsevier B.V.

  1. Comparison of emotion recognition from facial expression and music.

    PubMed

    Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija

    2011-01-01

    The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is not clearly determined, as their presentation is usually very short (micro expressions), and the recognition itself does not have to be a conscious process. We hypothesized that recognition of emotions from facial expressions would be favored over recognition of emotions communicated through music. In order to compare the success rate in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey that included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face and 8 pieces of classical music with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities such as attention, memory, and motivation. Music pieces are probably processed in the brain differently from facial expressions and, consequently, are probably evaluated differently as relevant emotional cues.

  2. Concurrent development of facial identity and expression discrimination.

    PubMed

    Dalrymple, Kirsten A; Visconti di Oleggio Castello, Matteo; Elison, Jed T; Gobbini, M Ida

    2017-01-01

    Facial identity and facial expression processing both appear to follow a protracted developmental trajectory, yet these trajectories have been studied independently and have not been directly compared. Here we investigated whether these processes develop at the same or different rates using matched identity and expression discrimination tasks. The Identity task begins with a target face that is a morph between two identities (Identity A/Identity B). After a brief delay, the target face is replaced by two choice faces: 100% Identity A and 100% Identity B. Children 5-12 years old were asked to pick the choice face that is most similar to the target identity. The Expression task is matched in format and difficulty to the Identity task, except the targets are morphs between two expressions (Angry/Happy, or Disgust/Surprise). The same children were asked to pick the choice face with the expression that is most similar to the target expression. There were significant effects of age, with performance improving (becoming more accurate and faster) on both tasks with increasing age. Accuracy and reaction times were not significantly different across tasks and there was no significant Age x Task interaction. Thus, facial identity and facial expression discrimination appear to develop at a similar rate, with comparable improvement on both tasks from age five to twelve. Because our tasks are so closely matched in format and difficulty, they may prove useful for testing face identity and face expression processing in special populations, such as autism or prosopagnosia, where one of these abilities might be impaired.

  3. Brain Responses to Dynamic Facial Expressions: A Normative Meta-Analysis.

    PubMed

    Zinchenko, Oksana; Yaple, Zachary A; Arsalidou, Marie

    2018-01-01

    Identifying facial expressions is crucial for social interactions. Functional neuroimaging studies show that a set of brain areas, such as the fusiform gyrus and amygdala, become active when viewing emotional facial expressions. The majority of functional magnetic resonance imaging (fMRI) studies investigating face perception typically employ static images of faces. However, studies that use dynamic facial expressions (e.g., videos) are accumulating and suggest that a dynamic presentation may be more sensitive and ecologically valid for investigating faces. By using quantitative fMRI meta-analysis, the present study examined the concordance of brain regions associated with viewing dynamic facial expressions. We analyzed data from 216 participants across 14 studies, which reported coordinates for 28 experiments. Our analysis revealed concordance in the bilateral fusiform and middle temporal gyri, left amygdala, left declive of the cerebellum, and right inferior frontal gyrus. These regions are discussed in terms of their relation to models of face processing.
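
    Coordinate-based meta-analyses of this kind are normally run with dedicated tools (e.g., activation likelihood estimation). The sketch below is a deliberately simplified illustration of the core idea only: smooth each study's reported peaks into a modeled-activation map, then combine maps as a probabilistic union across studies.

    ```python
    import numpy as np
    from scipy import ndimage

    def ale_style_map(study_foci, shape=(91, 109, 91), fwhm_vox=6.0):
        """Simplified ALE-style concordance map. study_foci holds one list of
        integer (x, y, z) voxel coordinates per study. Real ALE uses
        sample-size-dependent kernels and permutation-based thresholding;
        none of that is modeled here."""
        sigma = fwhm_vox / 2.3548                   # convert FWHM to Gaussian sigma
        ale = np.zeros(shape)
        for foci in study_foci:
            ma = np.zeros(shape)
            for x, y, z in foci:                    # impulse at each reported peak
                ma[x, y, z] = 1.0
            ma = ndimage.gaussian_filter(ma, sigma)
            if ma.max() > 0:
                ma /= ma.max()                      # per-voxel activation probability
            ale = 1.0 - (1.0 - ale) * (1.0 - ma)    # probabilistic union over studies
        return ale
    ```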

  4. Processing of Facial Expressions of Emotions by Adults with Down Syndrome and Moderate Intellectual Disability

    ERIC Educational Resources Information Center

    Carvajal, Fernando; Fernandez-Alcaraz, Camino; Rueda, Maria; Sarrion, Louise

    2012-01-01

    The processing of facial expressions of emotions by 23 adults with Down syndrome and moderate intellectual disability was compared with that of adults with intellectual disability of other etiologies (24 matched in cognitive level and 26 with mild intellectual disability). Each participant performed 4 tasks of the Florida Affect Battery and an…

  5. Maternal Personality and Infants' Neural and Visual Responsivity to Facial Expressions of Emotion

    ERIC Educational Resources Information Center

    De Haan, Michelle; Belsky, Jay; Reid, Vincent; Volein, Agnes; Johnson, Mark H.

    2004-01-01

    Background: Recent investigations suggest that experience plays an important role in the development of face processing. The aim of this study was to investigate the potential role of experience in the development of the ability to process facial expressions of emotion. Method: We examined the potential role of experience indirectly by…

  6. Drug effects on responses to emotional facial expressions: recent findings

    PubMed Central

    Miller, Melissa A.; Bershad, Anya K.; de Wit, Harriet

    2016-01-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally. PMID:26226144

  7. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    PubMed

    van Dijke, Annemiek; van 't Wout, Mascha; Ford, Julian D; Aleman, André

    2016-01-01

    Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful, and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group, and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD patients did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy expressions compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.

  8. Exposure to the self-face facilitates identification of dynamic facial expressions: influences on individual differences.

    PubMed

    Li, Yuan Hang; Tottenham, Nim

    2013-04-01

    A growing literature suggests that the self-face is involved in processing the facial expressions of others. The authors experimentally activated self-face representations to assess its effects on the recognition of dynamically emerging facial expressions of others. They exposed participants to videos of either their own faces (self-face prime) or faces of others (nonself-face prime) prior to a facial expression judgment task. Their results show that experimentally activating self-face representations results in earlier recognition of dynamically emerging facial expression. As a group, participants in the self-face prime condition recognized expressions earlier (when less affective perceptual information was available) compared to participants in the nonself-face prime condition. There were individual differences in performance, such that poorer expression identification was associated with higher autism traits (in this neurocognitively healthy sample). However, when randomized into the self-face prime condition, participants with high autism traits performed as well as those with low autism traits. Taken together, these data suggest that the ability to recognize facial expressions in others is linked with the internal representations of our own faces. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  9. Influence of spatial frequency and emotion expression on face processing in patients with panic disorder.

    PubMed

    Shim, Miseon; Kim, Do-Won; Yoon, Sunkyung; Park, Gewnhi; Im, Chang-Hwan; Lee, Seung-Hwan

    2016-06-01

    Deficient facial emotion processing is a major characteristic of patients with panic disorder. It is known that visual stimuli with different spatial frequencies take distinct neural pathways. This study investigated facial emotion processing involving stimuli presented at broad (BSF), high (HSF), and low (LSF) spatial frequencies in patients with panic disorder. Eighteen patients with panic disorder and 19 healthy controls were recruited. Seven event-related potential (ERP) components (P100, N170, early posterior negativity (EPN), vertex positive potential (VPP), N250, P300, and late positive potential (LPP)) were evaluated while the participants looked at fearful and neutral facial stimuli presented at the three spatial frequencies. When a fearful face was presented, panic disorder patients showed a significantly increased P100 amplitude in response to LSF compared to HSF stimuli, whereas healthy controls demonstrated significant BSF-dependent processing of P100 amplitude. VPP amplitude was significantly increased for HSF and BSF compared to LSF stimuli in panic disorder. EPN amplitude differed significantly between HSF and BSF processing and between LSF and BSF processing in both groups, regardless of facial expression. The possibly confounding effects of medication could not be controlled. During early visual processing, patients with panic disorder appear to prefer global to detailed information. However, in later processing stages, panic disorder patients overuse detailed information for the perception of facial expressions. These findings suggest that this distinctive spatial-frequency-dependent facial processing could shed light on the neural pathology associated with panic disorder. Copyright © 2016 Elsevier B.V. All rights reserved.
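
    LSF/HSF/BSF face stimuli of this kind are standardly produced by spatial-frequency filtering. A minimal sketch using a Gaussian low-pass and its complement (the cutoff here is an arbitrary placeholder; published stimuli specify cutoffs in cycles per image):

    ```python
    import numpy as np
    from scipy import ndimage

    def spatial_frequency_versions(face, cutoff_sigma=6.0):
        """Split a grayscale face image (2-D array) into low- (LSF), high- (HSF),
        and broad-spatial-frequency (BSF) versions."""
        face = np.asarray(face, float)
        lsf = ndimage.gaussian_filter(face, sigma=cutoff_sigma)  # coarse structure
        hsf = face - lsf                                         # fine detail/edges
        bsf = face                                               # unfiltered original
        return lsf, hsf, bsf
    ```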

  10. Less Empathic and More Reactive: The Different Impact of Childhood Maltreatment on Facial Mimicry and Vagal Regulation

    PubMed Central

    Ardizzi, Martina; Umiltà, Maria Alessandra; Evangelista, Valentina; Di Liscia, Alessandra; Ravera, Roberto; Gallese, Vittorio

    2016-01-01

    Facial mimicry and vagal regulation represent two crucial physiological responses to others’ facial expressions of emotions. Facial mimicry, defined as the automatic, rapid and congruent electromyographic activation to others’ facial expressions, is implicated in empathy, emotional reciprocity and emotions recognition. Vagal regulation, quantified by the computation of Respiratory Sinus Arrhythmia (RSA), exemplifies the autonomic adaptation to contingent social cues. Although it has been demonstrated that childhood maltreatment induces alterations in the processing of the facial expression of emotions, both at an explicit and implicit level, the effects of maltreatment on children’s facial mimicry and vagal regulation in response to facial expressions of emotions remain unknown. The purpose of the present study was to fill this gap, involving 24 street-children (maltreated group) and 20 age-matched controls (control group). We recorded their spontaneous facial electromyographic activations of corrugator and zygomaticus muscles and RSA responses during the visualization of the facial expressions of anger, fear, joy and sadness. Results demonstrated a different impact of childhood maltreatment on facial mimicry and vagal regulation. Maltreated children did not show the typical positive-negative modulation of corrugator mimicry. Furthermore, when only negative facial expressions were considered, maltreated children demonstrated lower corrugator mimicry than controls. With respect to vagal regulation, whereas maltreated children manifested the expected and functional inverse correlation between RSA value at rest and RSA response to angry facial expressions, controls did not. These results describe an early and divergent functional adaptation to hostile environment of the two investigated physiological mechanisms. On the one side, maltreatment leads to the suppression of the spontaneous facial mimicry normally concurring to empathic understanding of others’ emotions. On the other side, maltreatment forces the precocious development of the functional synchronization between vagal regulation and threatening social cues facilitating the recruitment of fight-or-flight defensive behavioral strategies. PMID:27685802
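
    RSA is commonly operationalized as the natural log of high-frequency heart-rate-variability power computed from inter-beat intervals. A minimal sketch under that assumption (parameters are typical defaults, not the study's; the band shown is the adult default, and child studies typically use a higher respiratory band):

    ```python
    import numpy as np
    from scipy.signal import welch

    def rsa_from_beats(beat_times_s, fs=4.0, band=(0.15, 0.40)):
        """Estimate RSA as ln(high-frequency HRV power), one common
        operationalization. beat_times_s: heartbeat times in seconds."""
        beats = np.asarray(beat_times_s, float)
        ibis = np.diff(beats) * 1000.0                    # inter-beat intervals, ms
        t = beats[1:]                                     # time stamp of each IBI
        grid = np.arange(t[0], t[-1], 1.0 / fs)           # resample to an even grid
        ibi_even = np.interp(grid, t, ibis)
        f, pxx = welch(ibi_even - ibi_even.mean(), fs=fs,
                       nperseg=min(256, len(ibi_even)))
        mask = (f >= band[0]) & (f <= band[1])
        hf_power = np.trapz(pxx[mask], f[mask])           # HF band power, ms^2
        return np.log(hf_power)                           # ln(ms^2), typical RSA unit
    ```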

  11. Impaired social brain network for processing dynamic facial expressions in autism spectrum disorders.

    PubMed

    Sato, Wataru; Toichi, Motomi; Uono, Shota; Kochiyama, Takanori

    2012-08-13

    Impairment of social interaction via facial expressions represents a core clinical feature of autism spectrum disorders (ASD). However, the neural correlates of this dysfunction remain unidentified. Because this dysfunction is manifested in real-life situations, we hypothesized that the observation of dynamic, compared with static, facial expressions would reveal abnormal brain functioning in individuals with ASD. We presented dynamic and static facial expressions of fear and happiness to individuals with high-functioning ASD and to age- and sex-matched typically developing controls and recorded their brain activities using functional magnetic resonance imaging (fMRI). Regional analysis revealed reduced activation of several brain regions in the ASD group compared with controls in response to dynamic versus static facial expressions, including the middle temporal gyrus (MTG), fusiform gyrus, amygdala, medial prefrontal cortex, and inferior frontal gyrus (IFG). Dynamic causal modeling analyses revealed that bi-directional effective connectivity involving the primary visual cortex-MTG-IFG circuit was enhanced in response to dynamic as compared with static facial expressions in the control group. Group comparisons revealed that all these modulatory effects were weaker in the ASD group than in the control group. These results suggest that weak activity and connectivity of the social brain network underlie the impairment in social interaction involving dynamic facial expressions in individuals with ASD.

  12. Strategies for Perceiving Facial Expressions in Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Walsh, Jennifer A.; Vida, Mark D.; Rutherford, M. D.

    2014-01-01

    Rutherford and McIntosh (J Autism Dev Disord 37:187-196, 2007) demonstrated that individuals with autism spectrum disorder (ASD) are more tolerant than controls of exaggerated schematic facial expressions, suggesting that they may use an alternative strategy when processing emotional expressions. The current study was designed to test this finding…

  13. The Role of Facial Expressions in Attention-Orienting in Adults and Infants

    ERIC Educational Resources Information Center

    Rigato, Silvia; Menon, Enrica; Di Gangi, Valentina; George, Nathalie; Farroni, Teresa

    2013-01-01

    Faces convey many signals (i.e., gaze or expressions) essential for interpersonal interaction. We have previously shown that facial expressions of emotion and gaze direction are processed and integrated in specific combinations early in life. These findings open a number of developmental questions and specifically in this paper we address whether…

  14. Discrimination of gender using facial image with expression change

    NASA Astrophysics Data System (ADS)

    Kuniyada, Jun; Fukuda, Takahiro; Terada, Kenji

    2005-12-01

    By carrying out marketing research, the managers of large department stores or small convenience stores obtain information such as the ratio of men to women among visitors and their age groups, and use it to improve their management plans. However, this work is carried out manually, which is a large burden on small stores. In this paper, the authors propose a method for discriminating between men and women by extracting differences in facial expression change from color facial images. Many methods already exist in the image-processing field for automatic recognition of individuals using moving or still facial images. However, it is very difficult to discriminate gender under the influence of hairstyle, clothing, etc. Therefore, we propose a method that is not affected by individual characteristics, such as the size and position of facial parts, by paying attention to changes in expression. The method requires two facial images: one with an expression and one expressionless. First, the facial surface region and the regions of facial parts such as the eyes, nose, and mouth are extracted from the facial image using hue and saturation information in the HSV color system and emphasized edge information. Next, features are extracted by calculating the rate of change of each facial part generated by the expression change. In the last step, the feature values are compared between the input data and the database, and the gender is discriminated. In this paper, experiments were carried out on laughing and smiling expressions, and good gender discrimination results were obtained.
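
    The color-based extraction step described above can be approximated with standard image-processing primitives. A minimal sketch using OpenCV (threshold values are illustrative guesses, not the authors' parameters):

    ```python
    import cv2
    import numpy as np

    def skin_mask_hsv(bgr_image, h_range=(0, 25), s_range=(40, 180)):
        """Rough facial-surface extraction by thresholding hue and saturation in
        HSV space, in the spirit of the method described above (OpenCV hue runs
        0-179)."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        lower = np.array([h_range[0], s_range[0], 0], dtype=np.uint8)
        upper = np.array([h_range[1], s_range[1], 255], dtype=np.uint8)
        return cv2.inRange(hsv, lower, upper)  # 255 where the pixel looks skin-like

    def part_change_rate(part_mask_neutral, part_mask_expressive):
        """Candidate gender-discrimination feature: relative change in a facial
        part's area (e.g., the mouth region) between the expressionless and the
        smiling image."""
        a0 = float(np.count_nonzero(part_mask_neutral))
        a1 = float(np.count_nonzero(part_mask_expressive))
        return (a1 - a0) / a0 if a0 else 0.0
    ```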

  15. From Facial Emotional Recognition Abilities to Emotional Attribution: A Study in Down Syndrome

    ERIC Educational Resources Information Center

    Hippolyte, Loyse; Barisnikov, Koviljka; Van der Linden, Martial; Detraux, Jean-Jacques

    2009-01-01

    Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences…

  16. [Recognition of facial expression of emotions in Parkinson's disease: a theoretical review].

    PubMed

    Alonso-Recio, L; Serrano-Rodriguez, J M; Carvajal-Molina, F; Loeches-Alonso, A; Martin-Plasencia, P

    2012-04-16

    Emotional facial expression is a basic guide during social interaction; therefore, alterations in its expression or recognition are important limitations for communication. To examine facial expression recognition abilities and their possible impairment in Parkinson's disease. First, we review the studies on this topic, which have not found entirely consistent results. Second, we analyze the factors that may explain these discrepancies and, in particular, as a third objective, we consider the relationship between emotional recognition problems and the cognitive impairment associated with the disease. Finally, we propose alternative strategies for the development of studies that could clarify the state of these abilities in Parkinson's disease. Most studies suggest deficits in facial expression recognition, especially for expressions with negative emotional content. However, it is possible that these alterations are related to those that also appear in the course of the disease in other perceptual and executive processes. To make progress on this issue, we consider it necessary to design emotional recognition studies that differentially engage executive or visuospatial processes, and/or that contrast cognitive abilities with facial expressions and non-emotional stimuli. Specifying the status of these abilities more precisely, as well as increasing our knowledge of the functional consequences of the brain damage characteristic of the disease, may indicate whether special attention should be paid to their rehabilitation within the intervention programs implemented.

  17. Perception of face and body expressions using electromyography, pupillometry and gaze measures.

    PubMed

    Kret, Mariska E; Stekelenburg, Jeroen J; Roelofs, Karin; de Gelder, Beatrice

    2013-01-01

    Traditional emotion theories stress the importance of the face in the expression of emotions, but bodily expressions are becoming increasingly important as well. In these experiments we tested the hypothesis that similar physiological responses can be evoked by observing emotional face and body signals and that the reaction to angry signals is amplified in anxious individuals. We designed three experiments in which participants categorized emotional expressions from isolated facial and bodily expressions and from emotionally congruent and incongruent face-body compounds. Participants' fixations were measured and their pupil size recorded with eye-tracking equipment, and their facial reactions measured with electromyography. The results support our prediction that the recognition of a facial expression is improved in the context of a matching posture and, importantly, vice versa as well. From their facial expressions, it appeared that observers reacted with signs of negative emotionality (increased corrugator activity) to angry and fearful facial expressions and with positive emotionality (increased zygomaticus activity) to happy facial expressions. As predicted, we found that angry and fearful cues from the face or the body attracted more attention than happy cues. We further observed that responses evoked by angry cues were amplified in individuals with high anxiety scores. In sum, we show that people process bodily expressions of emotion in a similar fashion as facial expressions and that congruency between the emotional signals from the face and body facilitates recognition of the emotion.

  18. Developmental Changes in Infants' Processing of Happy and Angry Facial Expressions: A Neurobehavioral Study

    ERIC Educational Resources Information Center

    Grossmann, Tobias; Striano, Tricia; Friederici, Angela D.

    2007-01-01

    Event-related brain potentials were measured in 7- and 12-month-old infants to examine the development of processing happy and angry facial expressions. In 7-month-olds a larger negativity to happy faces was observed at frontal, central, temporal and parietal sites (Experiment 1), whereas 12-month-olds showed a larger negativity to angry faces at…

  19. [Molecular cloning and expression of rat Msx-1 and Msx-2 during early embryogenesis and their roles in mandibular chondrogenesis].

    PubMed

    Ishiguro, S

    1999-03-01

    Quail-chick chimera experiments have shown a contribution of cranial neural crest cells to the craniofacial skeletal elements. Moreover, epithelial-mesenchymal tissue interactions during early facial process development are required for both skeletal differentiation and morphogenesis. In this study, it was observed that the Msx homeobox-containing genes expressed in the facial process are important molecules in cartilage morphogenesis. Rat cDNAs encoding Msx-1 and Msx-2 were isolated, and their expression patterns during early rat face development were investigated using in situ hybridization. These genes were correlatively expressed in the cranial neural crest-forming area (E 9.5 dpc) and the facial process (E 12.5 dpc). Antisense inhibition of the Msx genes in the E 12.5 mandibular process altered their gene expression and cartilage patterns: antisense inhibition of Msx-1 induced a lack of the medial portion of the cartilage, and antisense inhibition of Msx-2 enhanced chondrogenesis of the mandibular process under organ culture conditions. Thus, it was concluded that expression of the Msx genes during mandibular process development provides important signals for chondrogenesis.

  1. Infant Expressions in an Approach/Withdrawal Framework

    PubMed Central

    Sullivan, Margaret Wolan

    2014-01-01

    Since the introduction of empirical methods for studying facial expression, the interpretation of infant facial expressions has generated much debate. The premise of this paper is that action tendencies of approach and withdrawal constitute a core organizational feature of emotion in humans, promoting coherence of behavior, facial signaling, and physiological responses. The approach/withdrawal framework can provide a taxonomy of contexts and the neurobehavioral framework for the systematic, empirical study of individual differences in expression, physiology, and behavior within individuals as well as across contexts over time. By adopting this framework in developmental work on basic emotion processes, it may be possible to better understand the behavioral principles governing facial displays, how individual differences in them are related to physiology and behavior, and how they function in context. PMID:25412273

  2. Self-Relevance Appraisal Influences Facial Reactions to Emotional Body Expressions

    PubMed Central

    Grèzes, Julie; Philip, Léonor; Chadwick, Michèle; Dezecache, Guillaume; Soussignan, Robert; Conty, Laurence

    2013-01-01

    People display facial reactions when exposed to others’ emotional expressions, but exactly what mechanism mediates these facial reactions remains a debated issue. In this study, we manipulated two critical perceptual features that contribute to determining the significance of others’ emotional expressions: the direction of attention (toward or away from the observer) and the intensity of the emotional display. Electromyographic activity over the corrugator muscle was recorded while participants observed videos of neutral to angry body expressions. Self-directed bodies induced greater corrugator activity than other-directed bodies; additionally, corrugator activity was only influenced by the intensity of anger expressed by self-directed bodies. These data support the hypothesis that rapid facial reactions are the outcome of self-relevant emotional processing. PMID:23405230

  3. Representational momentum in dynamic facial expressions is modulated by the level of expressed pain: Amplitude and direction effects.

    PubMed

    Prigent, Elise; Amorim, Michel-Ange; de Oliveira, Armando Mónica

    2018-01-01

    Humans have developed a specific capacity to rapidly perceive and anticipate other people's facial expressions so as to get an immediate impression of their emotional state of mind. We carried out two experiments to examine the perceptual and memory dynamics of facial expressions of pain. In the first experiment, we investigated how people estimate other people's levels of pain based on the perception of various dynamic facial expressions; these differ in both the number and the intensity of activated action units. A second experiment used a representational momentum (RM) paradigm to study the emotional anticipation (memory bias) elicited by the same facial expressions of pain studied in Experiment 1. Our results highlighted the relationship between the level of perceived pain (in Experiment 1) and the direction and magnitude of memory bias (in Experiment 2): when perceived pain increases, the memory bias tends to be reduced (if positive) and ultimately becomes negative. Dynamic facial expressions of pain may reenact an "immediate perceptual history" in the perceiver before leading to an emotional anticipation of the agent's upcoming state. Thus, a subtle facial expression of pain (i.e., a low contraction around the eyes) that leads to a significant positive anticipation can be considered an adaptive process: one through which we can swiftly and involuntarily detect other people's pain.
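
    In RM paradigms the memory bias is conventionally a signed displacement score; as an illustrative formalization (ours, not necessarily the authors' exact measure):

    ```latex
    % Signed representational-momentum (memory bias) score:
    \text{bias} = \hat{x}_{\text{remembered}} - x_{\text{final}}
    % bias > 0: forward displacement (anticipation of continued change);
    % bias < 0: backward displacement. The finding above corresponds to this
    % score shrinking and turning negative as perceived pain increases.
    ```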

  4. Perceptual and affective mechanisms in facial expression recognition: An integrative review.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2016-09-01

    Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.

  5. A dynamic Shh expression pattern, regulated by SHH and BMP signaling, coordinates fusion of primordia in the amniote face

    PubMed Central

    Hu, Diane; Young, Nathan M.; Li, Xin; Xu, Yanhua; Hallgrímsson, Benedikt; Marcucio, Ralph S.

    2015-01-01

    The mechanisms of morphogenesis are not well understood, yet shaping structures during development is essential for establishing correct organismal form and function. Here, we examine mechanisms that help to shape the developing face during the crucial period of facial primordia fusion. This period of development is a time when the faces of amniote embryos exhibit the greatest degree of similarity, and it probably results from the necessity for fusion to occur to establish the primary palate. Our results show that hierarchical induction mechanisms, consisting of iterative signaling by Sonic hedgehog (SHH) followed by Bone morphogenetic proteins (BMPs), regulate a dynamic expression pattern of Shh in the ectoderm covering the frontonasal (FNP) and maxillary (MxP) processes. Furthermore, this Shh expression domain contributes to the morphogenetic processes that drive the directional growth of the globular process of the FNP toward the lateral nasal process and MxP, in part by regulating cell proliferation in the facial mesenchyme. The nature of the induction mechanism that we discovered suggests that the process of fusion of the facial primordia is intrinsically buffered against producing maladaptive morphologies, such as clefts of the primary palate, because there appears to be little opportunity for variation to occur during expansion of the Shh expression domain in the ectoderm of the facial primordia. Ultimately, these results might explain why this period of development constitutes a phylotypic stage of facial development among amniotes. PMID:25605783

  6. Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.

    PubMed

    Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun

    2016-07-01

    The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. ALS patients perceived facial emotions significantly worse than healthy controls did [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile of control scores for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS to include emotion processing dysfunction.

  7. Identification and intensity of disgust: Distinguishing visual, linguistic and facial expressions processing in Parkinson disease.

    PubMed

    Sedda, Anna; Petito, Sara; Guarino, Maria; Stracciari, Andrea

    2017-07-14

    Most studies to date show an impairment in the recognition of facial displays of disgust in Parkinson disease. A general impairment in disgust processing in patients with Parkinson disease might adversely affect their social interactions, given the relevance of this emotion for human relations. However, despite the importance of faces, disgust is also expressed through other formats of visual stimuli, such as sentences and visual images. The aim of our study was to explore disgust processing in a sample of patients affected by Parkinson disease by means of various tests tackling not only facial recognition but also the other formats of visual stimuli through which disgust can be recognized. Our results confirm that patients are impaired in recognizing facial displays of disgust. Further analyses show that patients are also impaired and slower for other facial expressions, with the only exception of happiness. Notably, however, patients with Parkinson disease processed visual images and sentences as well as controls did. Our findings show a dissociation between different formats of visual stimuli of disgust, suggesting that Parkinson disease is not characterized by a general compromise of disgust processing, as often suggested. The involvement of the basal ganglia-frontal cortex system might spare some cognitive components of emotional processing, related to memory and culture, at least for disgust. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Social and emotional relevance in face processing: happy faces of future interaction partners enhance the late positive potential

    PubMed Central

    Bublatzky, Florian; Gerdes, Antje B. M.; White, Andrew J.; Riemer, Martin; Alpers, Georg W.

    2014-01-01

    Human face perception is modulated by both emotional valence and social relevance, but their interaction has rarely been examined. Event-related brain potentials (ERPs) to happy, neutral, and angry facial expressions with different degrees of social relevance were recorded. To implement a social anticipation task, relevance was manipulated by presenting faces of two specific actors as future interaction partners (socially relevant), whereas two other face actors remained non-relevant. In a further control task all stimuli were presented without specific relevance instructions (passive viewing). Face stimuli of four actors (2 women, from the KDEF) were randomly presented for 1 s to 26 participants (16 female). Results showed an augmented N170, early posterior negativity (EPN), and late positive potential (LPP) for emotional in contrast to neutral facial expressions. Of particular interest, face processing varied as a function of experimental task. Whereas task effects were observed for P1 and EPN regardless of instructed relevance, LPP amplitudes were modulated by emotional facial expression and the relevance manipulation. The LPP was specifically enhanced for happy facial expressions of the anticipated future interaction partners. This underscores that social relevance can impact face processing already at an early stage of visual processing. These findings are discussed within the framework of motivated attention and face processing theories. PMID:25076881

  9. [The role of experience in the neurology of facial expression of emotions].

    PubMed

    Gordillo, Fernando; Pérez, Miguel A; Arana, José M; Mestas, Lilia; López, Rafael M

    2015-04-01

    Facial expression of emotion has an important social function that facilitates interaction between people. This process has a neurological basis, which is not isolated from the context or from the experience of the interaction between people in that context. Yet, to date, the impact that experience has on the perception of emotions is not completely understood. To discuss the role of experience in the recognition of facial expressions of emotion and to analyze biases in emotional perception. The maturation of the structures that support the ability to recognize emotion goes through a sensitive period during adolescence, when experience may have a greater impact on emotional recognition. Experiences of abuse, neglect, war, and stress generate a bias towards expressions of anger and sadness. Similarly, positive experiences generate a bias towards the expression of happiness. Only when people are able to use the facial expression of emotions as a channel for understanding others will they be able to interact appropriately with their environment. This environment, in turn, will provide experiences that modulate this capacity. It is therefore a self-regulating process that can be guided through the implementation of intervention programs targeting emotional skills.

  10. Neural responses to facial expression and face identity in the monkey amygdala.

    PubMed

    Gothard, K M; Battaglia, F P; Erickson, C A; Spitler, K M; Amaral, D G

    2007-02-01

    The amygdala is purported to play an important role in face processing, yet the specificity of its activation to face stimuli and the relative contribution of identity and expression to its activation are unknown. In the current study, neural activity in the amygdala was recorded as monkeys passively viewed images of monkey faces, human faces, and objects on a computer monitor. Comparable proportions of neurons responded selectively to images from each category. Neural responses to monkey faces were further examined to determine whether face identity or facial expression drove the face-selective responses. The majority of these neurons (64%) responded to both identity and facial expression, suggesting that these parameters are processed jointly in the amygdala. Large fractions of neurons, however, showed pure identity-selective or expression-selective responses. Neurons were selective for a particular facial expression by either increasing or decreasing their firing rate compared with the firing rates elicited by the other expressions. Responses to appeasing faces were often marked by significant decreases in firing rate, whereas responses to threatening faces were strongly associated with increased firing rate. Thus, global activation in the amygdala might be greater for threatening faces than for neutral or appeasing faces.

  11. Deficits in facial affect recognition among antisocial populations: a meta-analysis.

    PubMed

    Marsh, Abigail A; Blair, R J R

    2008-01-01

    Individuals with disorders marked by antisocial behavior frequently show deficits in recognizing displays of facial affect. Antisociality may be associated with specific deficits in identifying fearful expressions, which would implicate dysfunction in neural structures that subserve fearful expression processing. A meta-analysis of 20 studies was conducted to assess: (a) if antisocial populations show any consistent deficits in recognizing six emotional expressions; (b) beyond any generalized impairment, whether specific fear recognition deficits are apparent; and (c) if deficits in fear recognition are a function of task difficulty. Results show a robust link between antisocial behavior and specific deficits in recognizing fearful expressions. This impairment cannot be attributed solely to task difficulty. These results suggest dysfunction among antisocial individuals in specified neural substrates, namely the amygdala, involved in processing fearful facial affect.

  12. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.

    PubMed

    Nummenmaa, Lauri; Calvo, Manuel G

    2015-04-01

    Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques for resolving this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random-effects meta-analysis was conducted to estimate effect sizes at the population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. A robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.
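
    The random-effects pooling of r-based effect sizes described above can be sketched in a few lines. The following is a minimal illustration, assuming the standard Fisher r-to-z transform with a DerSimonian-Laird estimate of between-study variance; the authors' actual code is not published here, and the per-study values below are hypothetical.

```python
import numpy as np

def random_effects_meta(r_values, n_values):
    """DerSimonian-Laird random-effects meta-analysis of correlations.

    Correlations are Fisher z-transformed, pooled with inverse-variance
    weights that include a between-study variance component (tau^2),
    then back-transformed to the r metric.
    """
    r = np.asarray(r_values, dtype=float)
    n = np.asarray(n_values, dtype=float)
    z = np.arctanh(r)                    # Fisher r-to-z transform
    v = 1.0 / (n - 3.0)                  # within-study variance of z
    w = 1.0 / v                          # fixed-effect weights
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)   # Cochran's Q (heterogeneity)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(r) - 1)) / c)   # between-study variance
    w_star = 1.0 / (v + tau2)                 # random-effects weights
    z_pooled = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    lo, hi = np.tanh(z_pooled - 1.96 * se), np.tanh(z_pooled + 1.96 * se)
    return np.tanh(z_pooled), (lo, hi)

# Hypothetical per-study effect sizes (r) and sample sizes
pooled, (lo, hi) = random_effects_meta([0.30, 0.45, 0.22], [120, 85, 210])
print(f"pooled r = {pooled:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```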

  13. Association between facial expression and PTSD symptoms among young children exposed to the Great East Japan Earthquake: a pilot study.

    PubMed

    Fujiwara, Takeo; Mizuki, Rie; Miki, Takahiro; Chemtob, Claude

    2015-01-01

    "Emotional numbing" is a symptom of post-traumatic stress disorder (PTSD) characterized by a loss of interest in usually enjoyable activities, feeling detached from others, and an inability to express a full range of emotions. Emotional numbing is usually assessed through self-report, and is particularly difficult to ascertain among young children. We conducted a pilot study to explore the use of facial expression ratings in response to a comedy video clip to assess emotional reactivity among preschool children directly exposed to the Great East Japan Earthquake. This study included 23 child participants. Child PTSD symptoms were measured using a modified version of the Parent's Report of the Child's Reaction to Stress scale. Children were filmed while watching a 2-min video compilation of natural scenes ('baseline video') followed by a 2-min video clip from a television comedy ('comedy video'). Children's facial expressions were processed the using Noldus FaceReader software, which implements the Facial Action Coding System (FACS). We investigated the association between PTSD symptom scores and facial emotion reactivity using linear regression analysis. Children with higher PTSD symptom scores showed a significantly greater proportion of neutral facial expressions, controlling for sex, age, and baseline facial expression (p < 0.05). This pilot study suggests that facial emotion reactivity, measured using facial expression recognition software, has the potential to index emotional numbing in young children. This pilot study adds to the emerging literature on using experimental psychopathology methods to characterize children's reactions to disasters.

  14. Association between facial expression and PTSD symptoms among young children exposed to the Great East Japan Earthquake: a pilot study

    PubMed Central

    Fujiwara, Takeo; Mizuki, Rie; Miki, Takahiro; Chemtob, Claude

    2015-01-01

    “Emotional numbing” is a symptom of post-traumatic stress disorder (PTSD) characterized by a loss of interest in usually enjoyable activities, feeling detached from others, and an inability to express a full range of emotions. Emotional numbing is usually assessed through self-report, and is particularly difficult to ascertain among young children. We conducted a pilot study to explore the use of facial expression ratings in response to a comedy video clip to assess emotional reactivity among preschool children directly exposed to the Great East Japan Earthquake. This study included 23 child participants. Child PTSD symptoms were measured using a modified version of the Parent’s Report of the Child’s Reaction to Stress scale. Children were filmed while watching a 2-min video compilation of natural scenes (‘baseline video’) followed by a 2-min video clip from a television comedy (‘comedy video’). Children’s facial expressions were processed using the Noldus FaceReader software, which implements the Facial Action Coding System (FACS). We investigated the association between PTSD symptom scores and facial emotion reactivity using linear regression analysis. Children with higher PTSD symptom scores showed a significantly greater proportion of neutral facial expressions, controlling for sex, age, and baseline facial expression (p < 0.05). This pilot study suggests that facial emotion reactivity, measured using facial expression recognition software, has the potential to index emotional numbing in young children. This pilot study adds to the emerging literature on using experimental psychopathology methods to characterize children’s reactions to disasters. PMID:26528206
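
    As a rough illustration of the analytic step this record describes (regressing facial reactivity on PTSD symptom scores while controlling for sex, age, and baseline expression), here is a minimal sketch. The data frame, column names, and values are hypothetical, and the format of the expression software's output is assumed rather than documented here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-child data: proportion of comedy-clip frames coded
# "neutral" by expression software, plus the covariates named in the
# abstract (sex, age, baseline expression).
df = pd.DataFrame({
    "ptsd_score":       [12, 3, 8, 15, 5, 10, 1, 18],
    "neutral_comedy":   [0.62, 0.35, 0.48, 0.71, 0.40, 0.55, 0.30, 0.76],
    "neutral_baseline": [0.58, 0.50, 0.52, 0.60, 0.47, 0.54, 0.49, 0.63],
    "age_years":        [4, 5, 4, 6, 5, 4, 6, 5],
    "sex":              ["f", "m", "f", "m", "f", "m", "f", "m"],
})

# Linear regression of facial reactivity on PTSD symptom scores,
# controlling for sex, age, and baseline facial expression.
model = smf.ols(
    "neutral_comedy ~ ptsd_score + C(sex) + age_years + neutral_baseline",
    data=df,
).fit()
print(model.params)  # a positive ptsd_score coefficient would mirror the finding
```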

  15. [Association between intelligence development and facial expression recognition ability in children with autism spectrum disorder].

    PubMed

    Pan, Ning; Wu, Gui-Hua; Zhang, Ling; Zhao, Ya-Fen; Guan, Han; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2017-03-01

    To investigate the features of intelligence development, facial expression recognition ability, and the association between them in children with autism spectrum disorder (ASD). A total of 27 ASD children aged 6-16 years (ASD group, full intelligence quotient >70) and age- and gender-matched normally developed children (control group) were enrolled. The Wechsler Intelligence Scale for Children, Fourth Edition, and the Chinese Static Facial Expression Photos were used for intelligence evaluation and the facial expression recognition test. Compared with the control group, the ASD group had significantly lower scores of full intelligence quotient, verbal comprehension index, perceptual reasoning index (PRI), processing speed index (PSI), and working memory index (WMI) (P<0.05). The ASD group also had a significantly lower overall accuracy rate of facial expression recognition and significantly lower accuracy rates of the recognition of happy, angry, sad, and frightened expressions than the control group (P<0.05). In the ASD group, the overall accuracy rate of facial expression recognition and the accuracy rates of the recognition of happy and frightened expressions were positively correlated with PRI (r=0.415, 0.455, and 0.393 respectively; P<0.05). The accuracy rate of the recognition of angry expression was positively correlated with WMI (r=0.397; P<0.05). ASD children have delayed intelligence development compared with normally developed children and impaired expression recognition ability. Perceptual reasoning and working memory abilities are positively correlated with expression recognition ability, which suggests that insufficient perceptual reasoning and working memory abilities may be important factors affecting facial expression recognition ability in ASD children.

  16. Dissimilar processing of emotional facial expressions in human and monkey temporal cortex

    PubMed Central

    Zhu, Qi; Nelissen, Koen; Van den Stock, Jan; De Winter, François-Laurent; Pauwels, Karl; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu

    2013-01-01

    Emotional facial expressions play an important role in social communication across primates. Despite major progress made in our understanding of categorical information processing such as for objects and faces, little is known, however, about how the primate brain evolved to process emotional cues. In this study, we used functional magnetic resonance imaging (fMRI) to compare the processing of emotional facial expressions between monkeys and humans. We used a 2 × 2 × 2 factorial design with species (human and monkey), expression (fear and chewing) and configuration (intact versus scrambled) as factors. At the whole brain level, selective neural responses to conspecific emotional expressions were anatomically confined to the superior temporal sulcus (STS) in humans. Within the human STS, we found functional subdivisions with a face-selective right posterior STS area that also responded selectively to emotional expressions of other species and a more anterior area in the right middle STS that responded specifically to human emotions. Hence, we argue that the latter region does not show a mere emotion-dependent modulation of activity but is primarily driven by human emotional facial expressions. Conversely, in monkeys, emotional responses appeared in earlier visual cortex and outside face-selective regions of inferior temporal (IT) cortex that also responded to multiple visual categories. Within monkey IT, we also found areas that were more responsive to conspecific than to non-conspecific emotional expressions, but these responses were not as specific as in human middle STS. Overall, our results indicate that human STS may have developed unique properties to deal with social cues such as emotional expressions. PMID:23142071

  17. Impaired social brain network for processing dynamic facial expressions in autism spectrum disorders

    PubMed Central

    2012-01-01

    Background Impairment of social interaction via facial expressions represents a core clinical feature of autism spectrum disorders (ASD). However, the neural correlates of this dysfunction remain unidentified. Because this dysfunction is manifested in real-life situations, we hypothesized that the observation of dynamic, compared with static, facial expressions would reveal abnormal brain functioning in individuals with ASD. We presented dynamic and static facial expressions of fear and happiness to individuals with high-functioning ASD and to age- and sex-matched typically developing controls and recorded their brain activities using functional magnetic resonance imaging (fMRI). Results Regional analysis revealed reduced activation of several brain regions in the ASD group compared with controls in response to dynamic versus static facial expressions, including the middle temporal gyrus (MTG), fusiform gyrus, amygdala, medial prefrontal cortex, and inferior frontal gyrus (IFG). Dynamic causal modeling analyses revealed that bi-directional effective connectivity involving the primary visual cortex–MTG–IFG circuit was enhanced in response to dynamic as compared with static facial expressions in the control group. Group comparisons revealed that all these modulatory effects were weaker in the ASD group than in the control group. Conclusions These results suggest that weak activity and connectivity of the social brain network underlie the impairment in social interaction involving dynamic facial expressions in individuals with ASD. PMID:22889284

  18. Seeing Life through Positive-Tinted Glasses: Color–Meaning Associations

    PubMed Central

    Gil, Sandrine; Le Bigot, Ludovic

    2014-01-01

    There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue–meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning. PMID:25098167

  19. Seeing life through positive-tinted glasses: color-meaning associations.

    PubMed

    Gil, Sandrine; Le Bigot, Ludovic

    2014-01-01

    There is a growing body of literature to show that color can convey information, owing to its emotionally meaningful associations. Most research so far has focused on negative hue-meaning associations (e.g., red) with the exception of the positive aspects associated with green. We therefore set out to investigate the positive associations of two colors (i.e., green and pink), using an emotional facial expression recognition task in which colors provided the emotional contextual information for the face processing. In two experiments, green and pink backgrounds enhanced happy face recognition and impaired sad face recognition, compared with a control color (gray). Our findings therefore suggest that because green and pink both convey positive information, they facilitate the processing of emotionally congruent facial expressions (i.e., faces expressing happiness) and interfere with that of incongruent facial expressions (i.e., faces expressing sadness). Data also revealed a positive association for white. Results are discussed within the theoretical framework of emotional cue processing and color meaning.

  20. Computerised analysis of facial emotion expression in eating disorders.

    PubMed

    Leppanen, Jenni; Dapelo, Marcela Marin; Davies, Helen; Lang, Katie; Treasure, Janet; Tchanturia, Kate

    2017-01-01

    Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has been frequently reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. A total of 297 participants with AN, BN, RecAN, and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. The findings mirrored those from previous work showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip compared to the AN group. There were no differences in emotion expression during the sad film clip. These findings support the use of computerised methods to analyse emotion expression in EDs. The findings also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile.

  1. Interactions between facial emotion and identity in face processing: evidence based on redundancy gains.

    PubMed

    Yankouskaya, Alla; Booth, David A; Humphreys, Glyn

    2012-11-01

    Interactions between the processing of emotion expression and form-based information from faces (facial identity) were investigated using the redundant-target paradigm, in which we specifically tested whether identity and emotional expression are integrated in a superadditive manner (Miller, Cognitive Psychology 14:247-279, 1982). In Experiments 1 and 2, participants performed emotion and face identity judgments on faces with sad or angry emotional expressions. Responses to redundant targets were faster than responses to either single target when a universal emotion was conveyed, and performance violated the predictions from a model assuming independent processing of emotion and face identity. Experiment 4 showed that these effects were not modulated by varying interstimulus and nontarget contingencies, and Experiment 5 demonstrated that the redundancy gains were eliminated when faces were inverted. Taken together, these results suggest that the identification of emotion and facial identity interact in face processing.
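
    The redundant-target logic above is conventionally evaluated against Miller's (1982) race-model inequality: if responses to redundant targets are faster than an independent race between the single-target channels allows, the channels are taken to interact. A minimal sketch of that test on simulated reaction times follows; the time grid and RT distributions are hypothetical, not taken from the study.

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times at times t."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t, side="right") / rts.size

def race_model_test(rt_redundant, rt_single_a, rt_single_b):
    """Evaluate Miller's (1982) race-model inequality:
    P(RT <= t | redundant) <= P(RT <= t | A) + P(RT <= t | B).
    Positive differences indicate coactivation, i.e. the two target
    dimensions (here, identity and emotion) are integrated rather
    than racing independently.
    """
    t = np.linspace(200, 1000, 81)   # hypothetical ms grid
    bound = np.minimum(1.0, ecdf(rt_single_a, t) + ecdf(rt_single_b, t))
    return t, ecdf(rt_redundant, t) - bound

# Simulated RTs: redundant targets faster than either single target
rng = np.random.default_rng(0)
t, diff = race_model_test(rng.normal(420, 60, 200),
                          rng.normal(500, 70, 200),
                          rng.normal(510, 70, 200))
print("max violation of the race-model bound:", diff.max())
```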

  2. Selective attention modulates early human evoked potentials during emotional face-voice processing.

    PubMed

    Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A

    2015-04-01

    Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective: one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.

  3. Multiple faces of pain: effects of chronic pain on the brain regulation of facial expression

    PubMed Central

    Vachon-Presseau, Etienne; Roy, Mathieu; Woo, Choong-Wan; Kunz, Miriam; Martel, Marc-Olivier; Sullivan, Michael J.; Jackson, Philip L.; Wager, Tor D.; Rainville, Pierre

    2018-01-01

    Pain behaviors are shaped by social demands and learning processes, and chronic pain has been previously suggested to affect their meaning. In this study, we combined functional magnetic resonance imaging with in-scanner video recording during thermal pain stimulations and used multilevel mediation analyses to study the brain mediators of pain facial expressions and the perception of pain intensity (self-reports) in healthy individuals and patients with chronic back pain (CBP). Behavioral data showed that the relation between pain expression and pain report was disrupted in CBP. In both patients with CBP and healthy controls, brain activity varying on a trial-by-trial basis with pain facial expressions was mainly located in the primary motor cortex and completely dissociated from the pattern of brain activity varying with pain intensity ratings. Stronger activity was observed in CBP specifically during pain facial expressions in several nonmotor brain regions such as the medial prefrontal cortex, the precuneus, and the medial temporal lobe. In sharp contrast, no moderating effect of chronic pain was observed on brain activity associated with pain intensity ratings. Our results demonstrate that pain facial expressions and pain intensity ratings reflect different aspects of pain processing and support psychosocial models of pain suggesting that distinctive mechanisms are involved in the regulation of pain behaviors in chronic pain. PMID:27411160

  4. Spatially generalizable representations of facial expressions: Decoding across partial face samples.

    PubMed

    Greening, Steven G; Mitchell, Derek G V; Smith, Fraser W

    2018-04-01

    A network of cortical and sub-cortical regions is known to be important in the processing of facial expression. However, to date no study has investigated whether representations of facial expressions present in this network permit generalization across independent samples of face information (e.g., eye region vs mouth region). We presented participants with partial face samples of five expression categories in a rapid event-related fMRI experiment. We reveal a network of face-sensitive regions that contain information about facial expression categories regardless of which part of the face is presented. We further reveal that the neural information present in a subset of these regions: dorsal prefrontal cortex (dPFC), superior temporal sulcus (STS), lateral occipital and ventral temporal cortex, and even early visual cortex, enables reliable generalization across independent visual inputs (faces depicting the 'eyes only' vs 'eyes removed'). Furthermore, classification performance was correlated with behavioral performance in STS and dPFC. Our results demonstrate that both higher (e.g., STS, dPFC) and lower level cortical regions contain information useful for facial expression decoding that goes beyond the visual information presented, and implicate a key role for contextual mechanisms such as cortical feedback in facial expression perception under challenging conditions of visual occlusion. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Categorical Representation of Facial Expressions in the Infant Brain

    ERIC Educational Resources Information Center

    Leppanen, Jukka M.; Richmond, Jenny; Vogel-Farley, Vanessa K.; Moulson, Margaret C.; Nelson, Charles A.

    2009-01-01

    Categorical perception, demonstrated as reduced discrimination of within-category relative to between-category differences in stimuli, has been found in a variety of perceptual domains in adults. To examine the development of categorical perception in the domain of facial expression processing, we used behavioral and event-related potential (ERP)…

  6. Positive, but Not Negative, Facial Expressions Facilitate 3-Month-Olds' Recognition of an Individual Face

    ERIC Educational Resources Information Center

    Brenna, Viola; Proietti, Valentina; Montirosso, Rosario; Turati, Chiara

    2013-01-01

    The current study examined whether and how the presence of a positive or a negative emotional expression may affect the face recognition process at 3 months of age. Using a familiarization procedure, Experiment 1 demonstrated that positive (i.e., happiness), but not negative (i.e., fear and anger) facial expressions facilitate infants' ability to…

  7. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    PubMed

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.

    PubMed

    Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha

    2015-04-01

    According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  9. Parental catastrophizing about children's pain and selective attention to varying levels of facial expression of pain in children: a dot-probe study.

    PubMed

    Vervoort, Tine; Caes, Line; Crombez, Geert; Koster, Ernst; Van Damme, Stefaan; Dewitte, Marieke; Goubert, Liesbet

    2011-08-01

    The attentional demand of pain has primarily been investigated within an intrapersonal context. Little is known about observers' attentional processing of another's pain. The present study investigated, within a sample of parents (n=65; 51 mothers, 14 fathers) of school children, parental selective attention to children's facial display of pain and the moderating role of the child's facial expressiveness of pain and parental catastrophizing about their child's pain. Parents performed a dot-probe task in which child facial displays of pain (of varying pain expressiveness) were presented. Findings provided evidence of parental selective attention to child pain displays. Low facial displays of pain appeared sufficiently and also, as compared with higher facial displays of pain, equally capable of engaging parents' attention to the location of threat. Severity of facial displays of pain had a nonspatial effect on attention; that is, there was increased interference (i.e., delayed responding) with increasing facial expressiveness. This interference effect was particularly pronounced for high-catastrophizing parents, suggesting that being confronted with increasing child pain displays becomes particularly demanding for high-catastrophizing parents. Finally, parents with higher levels of catastrophizing increasingly attended away from low pain expressions, whereas selective attention to high-pain expressions did not differ between high-catastrophizing and low-catastrophizing parents. Theoretical implications and further research directions are discussed. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
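
    A dot-probe bias index of the kind used here is typically the mean reaction time on incongruent trials (probe replaces the neutral face) minus that on congruent trials (probe replaces the pain face). A minimal sketch, with hypothetical trial data:

```python
import numpy as np

def dot_probe_bias(rts, probe_at_pain_face):
    """Attentional bias index for a dot-probe task.

    rts: reaction times in ms; probe_at_pain_face: True on congruent
    trials (probe replaces the pain face). A positive index means
    responses were faster at the pain face's location, i.e. attention
    was drawn toward the pain expression.
    """
    rts = np.asarray(rts, dtype=float)
    congruent = np.asarray(probe_at_pain_face, dtype=bool)
    return rts[~congruent].mean() - rts[congruent].mean()

# Hypothetical trials, split by congruency
bias = dot_probe_bias([480, 495, 530, 540], [True, True, False, False])
print(f"attentional bias = {bias:+.1f} ms")
```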

  10. Parallel Processing in Face Perception

    ERIC Educational Resources Information Center

    Martens, Ulla; Leuthold, Hartmut; Schweinberger, Stefan R.

    2010-01-01

    The authors examined face perception models with regard to the functional and temporal organization of facial identity and expression analysis. Participants performed a manual 2-choice go/no-go task to classify faces, where response hand depended on facial familiarity (famous vs. unfamiliar) and response execution depended on facial expression…

  11. Interactive effects between gaze direction and facial expression on attentional resources deployment: the task instruction and context matter

    PubMed Central

    Ricciardelli, Paola; Lugli, Luisa; Pellicano, Antonello; Iani, Cristina; Nicoletti, Roberto

    2016-01-01

    In three experiments, we tested whether the amount of attentional resources needed to process a face displaying neutral/angry/fearful facial expressions with direct or averted gaze depends on task instructions and face presentation. To this end, we used a Rapid Serial Visual Presentation paradigm in which participants in Experiment 1 were first explicitly asked to discriminate whether the expression of a target face (T1) with direct or averted gaze was angry or neutral, and then to judge the orientation of a landscape (T2). Experiment 2 was identical to Experiment 1 except that participants had to discriminate the gender of the T1 face, and fearful faces were also presented, randomly intermixed within each block of trials. Experiment 3 differed from Experiment 2 only in that angry and fearful faces were never presented within the same block. The findings indicated that the presence of the attentional blink (AB) for face stimuli depends on specific combinations of gaze direction and emotional facial expressions and crucially revealed that contextual factors (e.g., the explicit instruction to process the facial expression and the presence of other emotional faces) can modify and even reverse the AB, suggesting a flexible and more contextualized deployment of attentional resources in face processing. PMID:26898473

  12. Attention to gaze and emotion in schizophrenia.

    PubMed

    Schwartz, Barbara L; Vaidya, Chandan J; Howard, James H; Deutsch, Stephen I

    2010-11-01

    Individuals with schizophrenia have difficulty interpreting social and emotional cues such as facial expression, gaze direction, body position, and voice intonation. Nonverbal cues are powerful social signals but are often processed implicitly, outside the focus of attention. The aim of this research was to assess implicit processing of social cues in individuals with schizophrenia. Patients with schizophrenia or schizoaffective disorder and matched controls performed a primary task of word classification with social cues in the background. Participants were asked to classify target words (LEFT/RIGHT) by pressing a key that corresponded to the word, in the context of facial expressions with eye gaze averted to the left or right. Although facial expression and gaze direction were irrelevant to the task, these facial cues influenced word classification performance. Participants were slower to classify target words (e.g., LEFT) that were incongruent to gaze direction (e.g., eyes averted to the right) compared to target words (e.g., LEFT) that were congruent to gaze direction (e.g., eyes averted to the left), but this only occurred for expressions of fear. This pattern did not differ for patients and controls. The results showed that threat-related signals capture the attention of individuals with schizophrenia. These data suggest that implicit processing of eye gaze and fearful expressions is intact in schizophrenia. (c) 2010 APA, all rights reserved

  13. Direction of Amygdala-Neocortex Interaction During Dynamic Facial Expression Processing.

    PubMed

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota; Yoshikawa, Sakiko; Toichi, Motomi

    2017-03-01

    Dynamic facial expressions of emotion strongly elicit multifaceted emotional, perceptual, cognitive, and motor responses. Neuroimaging studies revealed that some subcortical (e.g., amygdala) and neocortical (e.g., superior temporal sulcus and inferior frontal gyrus) brain regions and their functional interaction were involved in processing dynamic facial expressions. However, the direction of the functional interaction between the amygdala and the neocortex remains unknown. To investigate this issue, we re-analyzed functional magnetic resonance imaging (fMRI) data from 2 studies and magnetoencephalography (MEG) data from 1 study. First, a psychophysiological interaction analysis of the fMRI data confirmed the functional interaction between the amygdala and neocortical regions. Then, dynamic causal modeling analysis was used to compare models with forward, backward, or bidirectional effective connectivity between the amygdala and neocortical networks in the fMRI and MEG data. The results consistently supported the model of effective connectivity from the amygdala to the neocortex. Further increasing time-window analysis of the MEG demonstrated that this model was valid after 200 ms from the stimulus onset. These data suggest that emotional processing in the amygdala rapidly modulates some neocortical processing, such as perception, recognition, and motor mimicry, when observing dynamic facial expressions of emotion. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry?

    PubMed Central

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2016-01-01

    Facial mimicry is the spontaneous response to others’ facial expressions by mirroring or matching the interaction partner. Recent evidence suggested that mimicry may not be only an automatic reaction but could be dependent on many factors, including social context, type of task in which the participant is engaged, or stimulus properties (dynamic vs static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgment of emotional intensity. Electromyography recordings were recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. The ratings of the emotional intensity of facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major muscle activity in response to dynamic happiness stimuli than static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing. PMID:27390867
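
    Facial mimicry in EMG studies of this kind is usually operationalized as stimulus-locked muscle activity expressed relative to a pre-stimulus baseline. The sketch below shows one common variant (percent change from baseline); the sampling parameters and amplitudes are hypothetical, not taken from the study.

```python
import numpy as np

def emg_response(stimulus_trace, baseline_trace):
    """Facial EMG response as percent change from the pre-stimulus baseline.

    Both inputs are rectified (absolute-valued), filtered EMG traces for
    one muscle site. Greater zygomaticus major activity to happy faces
    (or corrugator supercilii activity to angry faces) relative to
    baseline is the usual operationalization of facial mimicry.
    """
    base = np.mean(baseline_trace)
    return 100.0 * (np.mean(stimulus_trace) - base) / base

# Hypothetical microvolt traces (1 s baseline, 2 s stimulus at 1 kHz)
rng = np.random.default_rng(1)
baseline = np.abs(rng.normal(5.0, 1.0, 1000))
response = np.abs(rng.normal(6.5, 1.0, 2000))
print(f"zygomaticus change: {emg_response(response, baseline):+.1f}% vs baseline")
```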

  15. Facial responsiveness of psychopaths to the emotional expressions of others

    PubMed Central

    Mokros, Andreas; Olderbak, Sally; Wilhelm, Oliver

    2018-01-01

    Psychopathic individuals show selfish, manipulative, and antisocial behavior in addition to emotional detachment and reduced empathy. Their empathic deficits are thought to be associated with a reduced responsiveness to emotional stimuli. Immediate facial muscle responses to the emotional expressions of others reflect the expressive part of emotional responsiveness and are positively related to trait empathy. Empirical evidence for reduced facial muscle responses in adult psychopathic individuals to the emotional expressions of others is rare. In the present study, 261 male criminal offenders and non-offenders categorized dynamically presented facial emotion expressions (angry, happy, sad, and neutral) during facial electromyography recording of their corrugator muscle activity. We replicated a measurement model of facial muscle activity, which controls for general facial responsiveness to face stimuli, and modeled three correlated emotion-specific factors (i.e., anger, happiness, and sadness) representing emotion specific activity. In a multi-group confirmatory factor analysis, we compared the means of the anger, happiness, and sadness latent factors between three groups: 1) non-offenders, 2) low, and 3) high psychopathic offenders. There were no significant mean differences between groups. Our results challenge current theories that focus on deficits in emotional responsiveness as leading to the development of psychopathy and encourage further theoretical development on deviant emotional processes in psychopathic individuals. PMID:29324826

  16. Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.

    PubMed

    Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M

    2014-09-01

    Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD. © 2013 The British Psychological Society.

  17. Smiling faces, sometimes they don't tell the truth: facial expression in the ultimatum game impacts decision making and event-related potentials.

    PubMed

    Mussel, Patrick; Hewig, Johannes; Allen, John J B; Coles, Michael G H; Miltner, Wolfgang

    2014-04-01

    Facial expressions are an important aspect of social interaction, conveying not only information regarding emotional states, but also regarding intentions, personality, and complex social characteristics. The present research investigates how a smiling, compared to a nonsmiling, expression impacts decision making and underlying cognitive and emotional processes in economic bargaining. Our results using the ultimatum game show that facial expressions have an impact on decision making as well as the feedback-related negativity following the offer. Furthermore, a moderating effect of sex on decision making was observed, with differential effects of facial expressions from male compared to female proposers. It is concluded that predictions of bargaining behavior must account for aspects of social interactions as well as sex effects to obtain more precise estimates of behavior. Copyright © 2014 Society for Psychophysiological Research.

  18. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses

    PubMed Central

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G.; Alpers, Georg W.

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis-) advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety. PMID:25324792

  19. Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses.

    PubMed

    Pittig, Andre; Pawlikowski, Mirko; Craske, Michelle G; Alpers, Georg W

    2014-01-01

    Recent research indicates that angry facial expressions are preferentially processed and may facilitate automatic avoidance responses, especially in socially anxious individuals. However, few studies have examined whether this bias also expresses itself in more complex cognitive processes and behavior such as decision making. We recently introduced a variation of the Iowa Gambling Task which allowed us to document the influence of task-irrelevant emotional cues on rational decision making. The present study used a modified gambling task to investigate the impact of angry facial expressions on decision making in 38 individuals with a wide range of social anxiety. Participants were to find out which choices were (dis-) advantageous to maximize overall gain. To create a decision conflict between approach of reward and avoidance of fear-relevant angry faces, advantageous choices were associated with angry facial expressions, whereas disadvantageous choices were associated with happy facial expressions. Results indicated that higher social avoidance predicted less advantageous decisions in the beginning of the task, i.e., when contingencies were still uncertain. Interactions with specific skin conductance responses further clarified that this initial avoidance only occurred in combination with elevated responses before choosing an angry facial expression. In addition, an interaction between high trait anxiety and elevated responses to early losses predicted faster learning of an advantageous strategy. These effects were independent of intelligence, general risky decision-making, self-reported state anxiety, and depression. Thus, socially avoidant individuals who respond emotionally to angry facial expressions are more likely to show avoidance of these faces under uncertainty. This novel laboratory paradigm may be an appropriate analog for central features of social anxiety.

  20. Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome.

    PubMed

    Kätsyri, Jari; Saalasti, Satu; Tiippana, Kaisa; von Wendt, Lennart; Sams, Mikko

    2008-01-01

    The theory of 'weak central coherence' [Happe, F., & Frith, U. (2006). The weak coherence account: Detail-focused cognitive style in autism spectrum disorders. Journal of Autism and Developmental Disorders, 36(1), 5-25] implies that persons with autism spectrum disorders (ASDs) have a perceptual bias for local but not for global stimulus features. The recognition of emotional facial expressions representing different levels of detail has not been studied previously in ASDs. We analyzed the recognition of four basic emotional facial expressions (anger, disgust, fear and happiness) from low-spatial frequencies (overall global shapes without local features) in adults with an ASD. A group of 20 participants with Asperger syndrome (AS) was compared to a group of non-autistic age- and sex-matched controls. Emotion recognition was tested from static and dynamic facial expressions whose spatial frequency contents had been manipulated by low-pass filtering at two levels. The two groups recognized emotions similarly from non-filtered faces and from dynamic vs. static facial expressions. In contrast, the participants with AS were less accurate than controls in recognizing facial emotions from very low-spatial frequencies. The results suggest intact recognition of basic facial emotions and dynamic facial information, but impaired visual processing of global features in ASDs.
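
    Low-pass filtering of the kind described here removes high spatial frequencies so that only the coarse global shape of the face survives. A minimal FFT-based sketch follows; the Gaussian filter profile and the cutoff values are illustrative assumptions, since the study's exact filter settings are not reproduced in the abstract.

```python
import numpy as np

def low_pass_face(image, cutoff_cycles):
    """Gaussian low-pass filter in the Fourier domain.

    Frequencies are expressed in cycles per image; components well above
    `cutoff_cycles` are attenuated, leaving the coarse global shape of
    the face without fine local features.
    """
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * h   # cycles/image, vertical
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * w   # cycles/image, horizontal
    radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    mask = np.exp(-(radius ** 2) / (2.0 * cutoff_cycles ** 2))
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

# Hypothetical grayscale face; two illustrative filtering levels
face = np.random.rand(256, 256)
coarse = low_pass_face(face, cutoff_cycles=8)    # global shape only
coarser = low_pass_face(face, cutoff_cycles=4)   # even fewer details
```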

  1. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    PubMed Central

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2014-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder (ASD), yet little is known about how individuals with ASD misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with ASD and 30 age- and gender-matched volunteers without ASD to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with ASD. In particular, adults with ASD uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to non-emotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with ASD. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in ASD. PMID:24535689

  2. More than mere mimicry? The influence of emotion on rapid facial reactions to faces.

    PubMed

    Moody, Eric J; McIntosh, Daniel N; Mann, Laura J; Weisser, Kimberly R

    2007-05-01

    Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed.

  3. Misinterpretation of facial expressions of emotion in verbal adults with autism spectrum disorder.

    PubMed

    Eack, Shaun M; Mazefsky, Carla A; Minshew, Nancy J

    2015-04-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum disorder and 30 age- and gender-matched volunteers without autism spectrum disorder to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with autism spectrum disorder. In particular, adults with autism spectrum disorder uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to nonemotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with autism spectrum disorder. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in autism spectrum disorder. © The Author(s) 2014.

  4. Identity modulates short-term memory for facial emotion.

    PubMed

    Galster, Murray; Kahana, Michael J; Wilson, Hugh R; Sekuler, Robert

    2009-12-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects' similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces' perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces' perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.

  5. Identity modulates short-term memory for facial emotion

    PubMed Central

    Galster, Murray; Kahana, Michael J.; Wilson, Hugh R.; Sekuler, Robert

    2010-01-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects’ similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces’ perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces’ perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental. PMID:19897794
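
    The MDS step in Experiment 2 of this record takes a matrix of pairwise dissimilarity judgments and embeds the faces in a low-dimensional perceptual space whose inter-point distances can then be used to predict recognition performance. A minimal sketch using scikit-learn, with a hypothetical dissimilarity matrix:

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical group-averaged dissimilarity judgments among six synthetic
# faces (three identities x two expressions); symmetric, zero diagonal.
dissim = np.array([
    [0.00, 0.35, 0.60, 0.65, 0.70, 0.75],
    [0.35, 0.00, 0.55, 0.60, 0.72, 0.68],
    [0.60, 0.55, 0.00, 0.30, 0.58, 0.62],
    [0.65, 0.60, 0.30, 0.00, 0.54, 0.57],
    [0.70, 0.72, 0.58, 0.54, 0.00, 0.28],
    [0.75, 0.68, 0.62, 0.57, 0.28, 0.00],
])

# Embed the faces in a 2-D perceptual space; inter-point distances can
# then be correlated with correct and false recognitions.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)
print(coords)   # one (x, y) coordinate per face
```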

  6. Impaired perception of facial emotion in developmental prosopagnosia.

    PubMed

    Biotti, Federica; Cook, Richard

    2016-08-01

    Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined the ability of observers to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect. Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties classifying expressions, particularly those with apperceptive profiles. These individuals may have difficulties forming view-invariant structural descriptions at an early stage in the face processing stream, before identity and expression pathways diverge. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Orienting to face expression during encoding improves men's recognition of own gender faces.

    PubMed

    Fulton, Erika K; Bulluck, Megan; Hertzog, Christopher

    2015-10-01

    It is unclear why women have superior episodic memory of faces, but the benefit may be partially the result of women engaging in superior processing of facial expressions. Therefore, we hypothesized that orienting instructions to attend to facial expression at encoding would significantly improve men's memory of faces and possibly reduce gender differences. We directed 203 college students (122 women) to study 120 faces under instructions to orient to either the person's gender or their emotional expression. They later took a recognition test of these faces, judging either whether they had previously studied the same person or whether they had studied that person with the exact same expression; the latter test evaluated recollection of specific facial details. Orienting to facial expressions during encoding significantly improved men's recognition of own-gender faces and eliminated the advantage that women had for male faces under gender orienting instructions. Although gender differences in spontaneous strategy use when orienting to faces cannot fully account for gender differences in face recognition, orienting men to facial expression during encoding is one way to significantly improve their episodic memory for male faces. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. An Age-Related Dissociation of Short-Term Memory for Facial Identity and Facial Emotional Expression.

    PubMed

    Hartley, Alan A; Ravich, Zoe; Stringer, Sarah; Wiley, Katherine

    2015-09-01

    Memory for both facial emotional expression and facial identity was explored in younger and older adults in 3 experiments using a delayed match-to-sample procedure. Memory sets of 1, 2, or 3 faces were presented, which were followed by a probe after a 3-s retention interval. There was very little difference between younger and older adults in memory for emotional expressions, but memory for identity was substantially impaired in the older adults. Possible explanations for spared memory for emotional expressions include socioemotional selectivity theory as well as the existence of overlapping yet distinct brain networks for processing of different emotions. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Implicit and explicit processing of emotional facial expressions in Parkinson's disease.

    PubMed

    Wagenbreth, Caroline; Wattenberg, Lena; Heinze, Hans-Jochen; Zaehle, Tino

    2016-04-15

    Besides motor problems, Parkinson's disease (PD) is associated with detrimental emotional and cognitive functioning. Deficient explicit emotional processing has been observed, whilst patients also show impaired Theory of Mind (ToM) abilities. However, it is unclear whether this ToM deficit in PD patients is based on an inability to infer others' emotional states or whether it is due to explicit emotional processing deficits. We investigated implicit and explicit emotional processing in PD with an affective priming paradigm in which pictures of human eyes served as emotional primes and emotionally connoted words in a lexical decision task (LDT) served as target stimuli. First, sixteen PD patients and sixteen matched healthy controls performed the LDT combined with the emotional priming paradigm, in which emotional information was conveyed through the facial eye region, to assess implicit emotional processing. Second, participants explicitly evaluated the emotional status of the eyes and words used in the implicit task. Compared to controls, implicit emotional processing abilities were generally preserved in PD, with, however, considerable alterations in happiness and disgust processing. Furthermore, we observed a general impairment of patients for explicit evaluation of emotional stimuli, which was augmented for the rating of facial expressions. This is the first study reporting results for affective priming with facial eye expressions in PD patients. Our findings indicate largely preserved implicit emotional processing, with specifically altered processing of disgust and happiness. Explicit emotional processing was considerably impaired for semantic and especially for facial stimulus material. Poor ToM abilities in PD patients might be based on deficient explicit emotional processing, with a preserved ability to implicitly infer other people's feelings. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Coherence explored between emotion components: evidence from event-related potentials and facial electromyography.

    PubMed

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2014-04-01

    Componential theories assume that emotion episodes consist of emergent and dynamic response changes to relevant events in different components, such as appraisal, physiology, motivation, expression, and subjective feeling. In particular, Scherer's Component Process Model hypothesizes that subjective feeling emerges when the synchronization (or coherence) of appraisal-driven changes between emotion components has reached a critical threshold. We examined the prerequisite of this synchronization hypothesis for appraisal-driven response changes in facial expression. The appraisal process was manipulated by using feedback stimuli, presented in a gambling task. Participants' responses to the feedback were investigated in concurrently recorded brain activity related to appraisal (event-related potentials, ERP) and facial muscle activity (electromyography, EMG). Using principal component analysis, the prediction of appraisal-driven response changes in facial EMG was examined. Results support this prediction: early cognitive processes (related to the feedback-related negativity) seem to primarily affect the upper face, whereas processes that modulate P300 amplitudes tend to predominantly drive cheek region responses. Copyright © 2013 Elsevier B.V. All rights reserved.
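
    As an illustration of how PCA can link EMG response patterns to ERP components, here is a small sketch under stated assumptions: the single-trial EMG features for three muscle regions and the single-trial FRN/P300 amplitudes are simulated placeholders, not the study's recordings or its exact analysis pipeline.

    ```python
    # Sketch: decompose facial-EMG channel space with PCA, then correlate
    # component scores with ERP amplitudes (FRN, P300).
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    n_trials = 200

    # Simulated mean rectified EMG (a.u.): corrugator, cheek, frontalis.
    emg = rng.normal(size=(n_trials, 3))

    # Simulated ERP amplitudes with built-in structure: FRN tracks the
    # upper face (frontalis), P300 tracks the cheek region.
    frn = 0.8 * emg[:, 2] + rng.normal(scale=0.5, size=n_trials)
    p300 = 0.8 * emg[:, 1] + rng.normal(scale=0.5, size=n_trials)

    pca = PCA(n_components=2)
    scores = pca.fit_transform(emg)  # orthogonal EMG response components

    for name, erp in [("FRN", frn), ("P300", p300)]:
        rs = [np.corrcoef(erp, scores[:, k])[0, 1] for k in range(2)]
        print(name, [f"{r:+.2f}" for r in rs])
    ```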

  11. Human Facial Expressions as Adaptations: Evolutionary Questions in Facial Expression Research

    PubMed Central

    SCHMIDT, KAREN L.; COHN, JEFFREY F.

    2007-01-01

    The importance of the face in social interaction and social intelligence is widely recognized in anthropology. Yet the adaptive functions of human facial expression remain largely unknown. An evolutionary model of human facial expression as behavioral adaptation can be constructed, given the current knowledge of the phenotypic variation, ecological contexts, and fitness consequences of facial behavior. Studies of facial expression are available, but results are not typically framed in an evolutionary perspective. This review identifies the relevant physical phenomena of facial expression and integrates the study of this behavior with the anthropological study of communication and sociality in general. Anthropological issues with relevance to the evolutionary study of facial expression include: facial expressions as coordinated, stereotyped behavioral phenotypes, the unique contexts and functions of different facial expressions, the relationship of facial expression to speech, the value of facial expressions as signals, and the relationship of facial expression to social intelligence in humans and in nonhuman primates. Human smiling is used as an example of adaptation, and testable hypotheses concerning the human smile, as well as other expressions, are proposed. PMID:11786989

  12. Sex differences in social cognition: The case of face processing.

    PubMed

    Proverbio, Alice Mado

    2017-01-02

    Several studies have demonstrated that women show a greater interest for social information and empathic attitude than men. This article reviews studies on sex differences in the brain, with particular reference to how males and females process faces and facial expressions, social interactions, pain of others, infant faces, faces in things (pareidolia phenomenon), opposite-sex faces, humans vs. landscapes, incongruent behavior, motor actions, biological motion, erotic pictures, and emotional information. Sex differences in oxytocin-based attachment response and emotional memory are also mentioned. In addition, we investigated how 400 different human faces were evaluated for arousal and valence dimensions by a group of healthy male and female University students. Stimuli were carefully balanced for sensory and perceptual characteristics, age, facial expression, and sex. As a whole, women judged all human faces as more positive and more arousing than men did. Furthermore, they showed a preference for the faces of children and the elderly in the arousal evaluation. Regardless of face aesthetics, age, or facial expression, women rated human faces higher than men did. The preference for opposite- vs. same-sex faces strongly interacted with facial age. Overall, both women and men exhibited differences in facial processing that could be interpreted in the light of evolutionary psychobiology. © 2016 Wiley Periodicals, Inc.

  13. Computerised analysis of facial emotion expression in eating disorders

    PubMed Central

    2017-01-01

    Background: Problems with social-emotional processing are known to be an important contributor to the development and maintenance of eating disorders (EDs). Diminished facial communication of emotion has been frequently reported in individuals with anorexia nervosa (AN). Less is known about facial expressivity in bulimia nervosa (BN) and in people who have recovered from AN (RecAN). This study aimed to pilot the use of computerised facial expression analysis software to investigate emotion expression across the ED spectrum and recovery in a large sample of participants. Method: 297 participants with AN, BN, RecAN, and healthy controls were recruited. Participants watched film clips designed to elicit happy or sad emotions, and facial expressions were then analysed using FaceReader. Results: The findings mirrored those from previous work showing that healthy control and RecAN participants expressed significantly more positive emotions during the positive clip compared to the AN group. There were no differences in emotion expression during the sad film clip. Discussion: These findings support the use of computerised methods to analyse emotion expression in EDs. The findings also demonstrate that reduced positive emotion expression is likely to be associated with the acute stage of AN illness, with individuals with BN showing an intermediate profile. PMID:28575109

  14. Developmental and Individual Differences in the Neural Processing of Dynamic Expressions of Pain and Anger

    PubMed Central

    Missana, Manuela; Grigutsch, Maren; Grossmann, Tobias

    2014-01-01

    We examined the processing of facial expressions of pain and anger in 8-month-old infants and adults by measuring event-related brain potentials (ERPs) and frontal EEG alpha asymmetry. The ERP results revealed that while adults showed a late positive potential (LPP) to emotional expressions that was enhanced to pain expressions, reflecting increased evaluation and emotional arousal to pain expressions, infants showed a negative component (Nc) to emotional expressions that was enhanced to angry expressions, reflecting increased allocation of attention to angry faces. Moreover, infants and adults showed opposite patterns in their frontal asymmetry responses to pain and anger, suggesting developmental differences in the motivational processes engendered by these facial expressions. These findings are discussed in the light of associated individual differences in infant temperament and adult dispositional empathy. PMID:24705497

  15. Characterization and recognition of mixed emotional expressions in thermal face image

    NASA Astrophysics Data System (ADS)

    Saha, Priya; Bhattacharjee, Debotosh; De, Barin K.; Nasipuri, Mita

    2016-05-01

    Infrared imaging of facial expressions has been introduced to overcome the problem of illumination, which is an integral constituent of visible imagery. The paper investigates facial skin temperature distributions for mixed thermal facial expressions in a face database we created, in which six expressions are basic and the remaining 12 are mixtures of those basic expressions. Temperature analysis was performed on three facial regions of interest (ROIs): periorbital, supraorbital, and mouth. Temperature variability of the ROIs across expressions was measured using statistical parameters. The temperature variation measurements in the ROIs of a particular expression form a vector, which is later used in the recognition of mixed facial expressions. Investigations show that facial features in mixed facial expressions can be characterized as induced by positive emotions or by negative emotions. The supraorbital region is useful for differentiating basic expressions from mixed expressions. Analysis and interpretation of mixed expressions were conducted with the help of box-and-whisker plots. A facial region containing a mixture of two expressions generally induces less temperature change than the corresponding region during a basic expression.
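
    The ROI feature vector described above can be sketched directly. The following assumes a thermal image is available as a 2-D array of temperatures; the ROI coordinates, the choice of statistics, and the input image are hypothetical.

    ```python
    # Sketch: per-ROI temperature statistics concatenated into one vector
    # per expression image, usable as input to a classifier.
    import numpy as np

    def roi_features(thermal, rois):
        """Concatenate temperature-variability statistics for each facial ROI."""
        feats = []
        for r0, r1, c0, c1 in rois.values():
            patch = thermal[r0:r1, c0:c1]
            feats += [patch.mean(), patch.std(), patch.min(), patch.max()]
        return np.array(feats)

    rois = {                        # (row0, row1, col0, col1), hypothetical
        "supraorbital": (20, 40, 30, 90),
        "periorbital": (40, 70, 30, 90),
        "mouth": (90, 120, 45, 80),
    }

    thermal = np.random.default_rng(0).uniform(30.0, 37.0, size=(128, 128))
    vec = roi_features(thermal, rois)
    print(vec.shape)  # (12,) = 3 ROIs x 4 statistics
    ```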

  16. Responses in the right posterior superior temporal sulcus show a feature-based response to facial expression.

    PubMed

    Flack, Tessa R; Andrews, Timothy J; Hymers, Mark; Al-Mosaiwi, Mohammed; Marsden, Samuel P; Strachan, James W A; Trakulpipat, Chayanit; Wang, Liang; Wu, Tian; Young, Andrew W

    2015-08-01

    The face-selective region of the right posterior superior temporal sulcus (pSTS) plays an important role in analysing facial expressions. However, it is less clear how facial expressions are represented in this region. In this study, we used the face composite effect to explore whether the pSTS contains a holistic or feature-based representation of facial expression. Aligned and misaligned composite images were created from the top and bottom halves of faces posing different expressions. In Experiment 1, participants performed a behavioural matching task in which they judged whether the top half of two images was the same or different. The ability to discriminate the top half of the face was affected by changes in the bottom half of the face when the images were aligned, but not when they were misaligned. This shows a holistic behavioural response to expression. In Experiment 2, we used fMR-adaptation to ask whether the pSTS has a corresponding holistic neural representation of expression. Aligned or misaligned images were presented in blocks that involved repeating the same image or in which the top or bottom half of the images changed. Increased neural responses were found in the right pSTS regardless of whether the change occurred in the top or bottom of the image, showing that changes in expression were detected across all parts of the face. However, in contrast to the behavioural data, the pattern did not differ between aligned and misaligned stimuli. This suggests that the pSTS does not encode facial expressions holistically. In contrast to the pSTS, a holistic pattern of response to facial expression was found in the right inferior frontal gyrus (IFG). Together, these results suggest that pSTS reflects an early stage in the processing of facial expression in which facial features are represented independently. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face.

    PubMed

    Matsumiya, Kazumichi

    2013-10-01

    Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imaging a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on shared representation underlying cross-modal interactions.

  18. Automatic Facial Expression Recognition and Operator Functional State

    NASA Technical Reports Server (NTRS)

    Blanson, Nina

    2012-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.
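
    The basic detection step described above can be sketched with OpenCV's bundled Haar cascades. This is a generic illustration of the technique, not the program from the paper; it detects faces and, as coarse landmarks, eyes in a live webcam stream.

    ```python
    # Sketch: Haar-cascade face and eye detection on a live video stream.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
            roi = gray[y:y + h, x:x + w]  # search for eyes inside the face
            for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
                cv2.rectangle(frame, (x + ex, y + ey),
                              (x + ex + ew, y + ey + eh), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break
    cap.release()
    cv2.destroyAllWindows()
    ```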

  19. Automatic Facial Expression Recognition and Operator Functional State

    NASA Technical Reports Server (NTRS)

    Blanson, Nina

    2011-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.

  20. Appraisals Generate Specific Configurations of Facial Muscle Movements in a Gambling Task: Evidence for the Component Process Model of Emotion.

    PubMed

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R

    2015-01-01

    Scherer's Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined facial muscle activity changes (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli which presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiments 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms) and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region), suggesting amplified goal conduciveness effects when power was high in contrast to invariant goal conduciveness effects when power was low. An interaction of goal conduciveness and control appraisals was also found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region would be more influenced by coping implications. Our results provide the first evidence demonstrating that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect facial expressions dynamically over time, immediately after an event is perceived. In addition, our results provide further indications for the chronography of appraisal-driven facial movements and the underlying cognitive processes.
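
    To make the time-course logic concrete, here is a minimal sketch of epoching rectified EMG around feedback onset and comparing condition means over time, the kind of analysis behind onset estimates such as ~600 ms. The sampling rate, event times, condition labels, and signal are simulated placeholders, not the study's data.

    ```python
    # Sketch: epoch a rectified EMG channel around feedback events and
    # locate when two conditions start to diverge.
    import numpy as np

    fs = 1000                        # sampling rate (Hz)
    pre, post = 0.2, 1.0             # epoch window (s) around feedback onset
    rng = np.random.default_rng(0)

    emg = np.abs(rng.normal(size=fs * 60))        # 60 s of corrugator EMG
    onsets = np.arange(2 * fs, 58 * fs, 3 * fs)   # one feedback every 3 s
    conds = rng.choice(["win", "loss"], size=onsets.size)

    def mean_course(cond):
        """Baseline-corrected condition-average time course."""
        eps = np.stack([emg[o - int(pre * fs): o + int(post * fs)]
                        for o in onsets[conds == cond]])
        base = eps[:, : int(pre * fs)].mean(axis=1, keepdims=True)
        return (eps - base).mean(axis=0)

    diff = mean_course("loss") - mean_course("win")
    t = np.arange(-pre, post, 1 / fs)
    print("largest condition difference at", t[np.argmax(np.abs(diff))], "s")
    ```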

  1. Appraisals Generate Specific Configurations of Facial Muscle Movements in a Gambling Task: Evidence for the Component Process Model of Emotion

    PubMed Central

    Gentsch, Kornelia; Grandjean, Didier; Scherer, Klaus R.

    2015-01-01

    Scherer’s Component Process Model provides a theoretical framework for research on the production mechanism of emotion and facial emotional expression. The model predicts that appraisal results drive facial expressions, which unfold sequentially and cumulatively over time. In two experiments, we examined facial muscle activity changes (via facial electromyography recordings over the corrugator, cheek, and frontalis regions) in response to events in a gambling task. These events were experimentally manipulated feedback stimuli which presented simultaneous information directly affecting goal conduciveness (gambling outcome: win, loss, or break-even) and power appraisals (Experiments 1 and 2), as well as control appraisal (Experiment 2). We repeatedly found main effects of goal conduciveness (starting ~600 ms) and power appraisals (starting ~800 ms after feedback onset). Control appraisal main effects were inconclusive. Interaction effects of goal conduciveness and power appraisals were obtained in both experiments (Experiment 1: over the corrugator and cheek regions; Experiment 2: over the frontalis region), suggesting amplified goal conduciveness effects when power was high in contrast to invariant goal conduciveness effects when power was low. An interaction of goal conduciveness and control appraisals was also found over the cheek region, showing differential goal conduciveness effects when control was high and invariant effects when control was low. These interaction effects suggest that the appraisal of having sufficient control or power affects facial responses towards gambling outcomes. The result pattern suggests that corrugator and frontalis regions are primarily related to cognitive operations that process motivational pertinence, whereas the cheek region would be more influenced by coping implications. Our results provide the first evidence demonstrating that cognitive-evaluative mechanisms related to goal conduciveness, control, and power appraisals affect facial expressions dynamically over time, immediately after an event is perceived. In addition, our results provide further indications for the chronography of appraisal-driven facial movements and the underlying cognitive processes. PMID:26295338

  2. Is empathy necessary to comprehend the emotional faces? The empathic effect on attentional mechanisms (eye movements), cortical correlates (N200 event-related potentials) and facial behaviour (electromyography) in face processing.

    PubMed

    Balconi, Michela; Canavesio, Ylenia

    2016-01-01

    The present research explored the effect of social empathy on processing emotional facial expressions. Previous evidence suggested a close relationship between emotional empathy and both the ability to detect facial emotions and the attentional mechanisms involved. A multi-measure approach was adopted: we investigated the association between trait empathy (Balanced Emotional Empathy Scale) and individuals' performance (response times; RTs), attentional mechanisms (eye movements; number and duration of fixations), correlates of cortical activation (event-related potential (ERP) N200 component), and facial responsiveness (facial zygomatic and corrugator activity). Trait empathy was found to affect face detection performance (reduced RTs), attentional processes (more scanning eye movements in specific areas of interest), the ERP salience effect (increased N200 amplitude), and electromyographic activity (more facial responses). A second important result was the demonstration of strong, direct correlations among these measures. We suggest that empathy may function as a social facilitator of the processes underlying the detection of facial emotion, and a general "facial response effect" is proposed to explain these results. We assume that empathy influences both cognitive processing and facial responsiveness, such that empathic individuals are more skilful in processing facial emotion.

  3. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    PubMed

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.

  4. Amphetamine as a social drug: Effects of d-amphetamine on social processing and behavior

    PubMed Central

    Wardle, Margaret C.; Garner, Matthew J.; Munafò, Marcus R.; de Wit, Harriet

    2012-01-01

    Rationale Drug users often report using drugs to enhance social situations, and empirical studies support the idea that drugs increase both social behavior and the value of social interactions. One way drugs may affect social behavior is by altering social processing, for example by decreasing perceptions of negative emotion in others. Objectives We examined effects of d-amphetamine on processing of emotional facial expressions, and on the social behavior of talking. We predicted amphetamine would enhance attention, identification and responsivity to positive expressions, and that this in turn would predict increased talkativeness. Methods Over three sessions, 36 healthy normal adults received placebo, 10mg, and 20mg d-amphetamine under counterbalanced double-blind conditions. At each session we measured processing of happy, fearful, sad and angry expressions using an attentional visual probe task, a dynamic emotion identification task, and measures of facial muscle activity. We also measured talking. Results Amphetamine decreased the threshold for identifying all emotions, increased negative facial responses to sad expressions, and increased talkativeness. Contrary to our hypotheses, amphetamine did not alter attention to, identification of or facial responses to positive emotions specifically. Interestingly, the drug decreased the threshold to identify all emotions, and this effect was uniquely related to increased talkativeness, even after controlling for overall sensitivity to amphetamine. Conclusions The results suggest that amphetamine may encourage sociability by increasing sensitivity to subtle emotional expressions. These findings suggest novel social mechanisms that may contribute to the rewarding effects of amphetamine. PMID:22526538

  5. Recognition of Emotional and Nonemotional Facial Expressions: A Comparison between Williams Syndrome and Autism

    ERIC Educational Resources Information Center

    Lacroix, Agnes; Guidetti, Michele; Roge, Bernadette; Reilly, Judy

    2009-01-01

    The aim of our study was to compare two neurodevelopmental disorders (Williams syndrome and autism) in terms of the ability to recognize emotional and nonemotional facial expressions. The comparison of these two disorders is particularly relevant to the investigation of face processing and should contribute to a better understanding of social…

  6. Teachers' Perception Regarding Facial Expressions as an Effective Teaching Tool

    ERIC Educational Resources Information Center

    Butt, Muhammad Naeem; Iqbal, Mohammad

    2011-01-01

    The major objective of the study was to explore teachers' perceptions about the importance of facial expression in the teaching-learning process. All the teachers of government secondary schools constituted the population of the study. A sample of 40 teachers, both male and female, in rural and urban areas of district Peshawar, were selected…

  7. Testosterone reactivity to facial display of emotions in men and women.

    PubMed

    Zilioli, Samuele; Caldbick, Evan; Watson, Neil V

    2014-05-01

    Previous studies have examined testosterone's role in regulating the processing of facial displays of emotions (FDEs). However, the reciprocal process - the influence of FDEs, an evolutionarily ancient and potent class of social signals, on the secretion of testosterone - has not yet been studied. To address this gap, we examined the effects of emotional content and sex of facial stimuli in modulating endogenous testosterone fluctuations, as well as sex differences in the endocrine responses to faces. One hundred and sixty-four young healthy men and women were exposed, in a between-subjects design, to happy or angry same-sex or opposite-sex facial expressions. Results showed that in both men (n=85) and women (n=79), extended exposure to faces of the opposite sex, regardless of their apparent emotional content, was accompanied by an accumulation in salivary testosterone when compared to exposure to faces of the same sex. Furthermore, testosterone change in women exposed to angry expressions was greater than testosterone change in women exposed to happy expressions. These results add emotional facial stimuli to the collection of social signals that modulate endocrine status, and are discussed with regard to the evolutionary roles of testosterone. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. The role of encoding and attention in facial emotion memory: an EEG investigation.

    PubMed

    Brenner, Colleen A; Rumak, Samuel P; Burns, Amy M N; Kieffaber, Paul D

    2014-09-01

    Facial expressions are encoded via sensory mechanisms, but meaning extraction and salience of these expressions involve cognitive functions. We investigated the time course of sensory encoding and subsequent maintenance in memory via EEG. Twenty-nine healthy participants completed a facial emotion delayed match-to-sample task. P100, N170 and N250 ERPs were measured in response to the first stimulus, and evoked theta power (4-7Hz) was measured during the delay interval. Negative facial expressions produced larger N170 amplitudes and greater theta power early in the delay. N170 amplitude correlated with theta power; however, larger N170 amplitude coupled with greater theta power only predicted behavioural performance for one emotion condition (very happy) out of six tested (see Supplemental Data). These findings indicate that the N170 ERP may be sensitive to emotional facial expressions when task demands require encoding and retention of this information. Furthermore, sustained theta activity may represent continued attentional processing that supports short-term memory, especially of negative facial stimuli. Further study is needed to investigate the potential influence of these measures, and their interaction, on behavioural performance. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.

  9. Affect of the unconscious: visually suppressed angry faces modulate our decisions.

    PubMed

    Almeida, Jorge; Pajtas, Petra E; Mahon, Bradford Z; Nakayama, Ken; Caramazza, Alfonso

    2013-03-01

    Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item--a Chinese character--that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala.

  10. Facial decoding in schizophrenia is underpinned by basic visual processing impairments.

    PubMed

    Belge, Jan-Baptist; Maurage, Pierre; Mangelinckx, Camille; Leleux, Dominique; Delatte, Benoît; Constant, Eric

    2017-09-01

    Schizophrenia is associated with a strong deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific for emotions or due to a more general impairment for any type of facial processing. This study was designed to clarify this issue. Thirty patients suffering from schizophrenia and 30 matched healthy controls performed several tasks evaluating the recognition of both changeable (i.e. eyes orientation and emotions) and stable (i.e. gender, age) facial characteristics. Accuracy and reaction times were recorded. Schizophrenic patients presented a performance deficit (accuracy and reaction times) in the perception of both changeable and stable aspects of faces, without any specific deficit for emotional decoding. Our results demonstrate a generalized face recognition deficit in schizophrenic patients, probably caused by a perceptual deficit in basic visual processing. It seems that the deficit in the decoding of emotional facial expression (EFE) is not a specific deficit of emotion processing, but is at least partly related to a generalized perceptual deficit in lower-level perceptual processing, occurring before the stage of emotion processing, and underlying more complex cognitive dysfunctions. These findings should encourage future investigations to explore the neurophysiologic background of these generalized perceptual deficits, and stimulate a clinical approach focusing on more basic visual processing. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  11. Facial dynamics and emotional expressions in facial aging treatments.

    PubMed

    Michaud, Thierry; Gassia, Véronique; Belhaouari, Lakhdar

    2015-03-01

    Facial expressions convey emotions that form the foundation of interpersonal relationships, and many of these emotions promote and regulate our social linkages. Hence, the symptomatological analysis of facial aging and the treatment plan must necessarily include knowledge of the facial dynamics and the emotional expressions of the face. This approach aims to more closely meet patients' expectations of natural-looking results, by correcting age-related negative expressions while observing the emotional language of the face. This article will successively describe patients' expectations, the role of facial expressions in relational dynamics, the relationship between facial structures and facial expressions, and the way facial aging mimics negative expressions. Finally, therapeutic implications for facial aging treatment will be addressed. © 2015 Wiley Periodicals, Inc.

  12. When your face describes your memories: facial expressions during retrieval of autobiographical memories.

    PubMed

    El Haj, Mohamad; Daoudi, Mohamed; Gallouj, Karim; Moustafa, Ahmed A; Nandrino, Jean-Louis

    2018-05-11

    Thanks to current advances in the software analysis of facial expressions, there is a burgeoning interest in understanding the emotional facial expressions observed during the retrieval of autobiographical memories. This review describes the research on facial expressions during autobiographical retrieval, showing distinct emotional facial expressions according to the characteristics of the retrieved memories. More specifically, this research demonstrates that the retrieval of emotional memories can trigger corresponding emotional facial expressions (e.g. positive memories may trigger positive facial expressions). This work also demonstrates variations in facial expressions according to the specificity, self-relevance, or past versus future direction of memory construction. Besides linking research on facial expressions during autobiographical retrieval to the cognitive and affective characteristics of autobiographical memory in general, this review positions this research within the broader context of research on the physiological characteristics of autobiographical retrieval. We also provide several perspectives for clinical studies to investigate facial expressions in populations with deficits in autobiographical memory (e.g. whether autobiographical overgenerality in neurologic and psychiatric populations may trigger fewer emotional facial expressions). In sum, this review demonstrates how the evaluation of facial expressions during autobiographical retrieval may help in understanding the functioning and dysfunctioning of autobiographical memory.

  13. Facial expression judgments support a socio-relational model, rather than a negativity bias model of political psychology.

    PubMed

    Vigil, Jacob M; Strenth, Chance

    2014-06-01

    Self-reported opinions and judgments may be more rooted in expressive biases than in cognitive processing biases, and ultimately operate within a broader behavioral style for advertising the capacity - versus the trustworthiness - dimension of human reciprocity potential. Our analyses of facial expression judgments of likely voters are consistent with this thesis, and directly contradict one major prediction from the authors' "negativity-bias" model.

  14. Repeated short presentations of morphed facial expressions change recognition and evaluation of facial expressions.

    PubMed

    Moriya, Jun; Tanno, Yoshihiko; Sugiura, Yoshinori

    2013-11-01

    This study investigated whether sensitivity to and evaluation of facial expressions varied with repeated exposure to non-prototypical facial expressions for a short presentation time. A morphed facial expression was presented for 500 ms repeatedly, and participants were required to indicate whether each facial expression was happy or angry. We manipulated the distribution of presentations of the morphed facial expressions for each facial stimulus. Some of the individuals depicted in the facial stimuli expressed anger frequently (i.e., anger-prone individuals), while the others expressed happiness frequently (i.e., happiness-prone individuals). After being exposed to the faces of anger-prone individuals, the participants became less sensitive to those individuals' angry faces. Further, after being exposed to the faces of happiness-prone individuals, the participants became less sensitive to those individuals' happy faces. We also found a relative increase in the social desirability of happiness-prone individuals after exposure to the facial stimuli.

  15. Gaze Dynamics in the Recognition of Facial Expressions of Emotion.

    PubMed

    Barabanschikov, Vladimir A

    2015-01-01

    We studied which parts and features of the human face are preferentially fixated during the recognition of facial expressions of emotion. Photographs of facial expressions were used. Participants were asked to categorize these as basic emotions while their eye movements were registered. It was found that variation in the intensity of an expression is mirrored in the accuracy of emotion recognition; it was also reflected in several indices of oculomotor function: the duration of inspection of certain areas of the face (its upper and bottom parts, right and left sides), the location, number, and duration of fixations, and the viewing trajectory. In particular, for low-intensity expressions, the right side of the face was attended to predominantly (right-side dominance); this right-side dominance effect was, however, absent for expressions of high intensity. For both low- and high-intensity expressions, the upper part of the face was predominantly fixated, with greater fixation for high-intensity expressions. The majority of trials (70%), in line with findings from previous studies, revealed a V-shaped inspection trajectory. However, no relationship was found between the accuracy of recognition of emotional expressions and either the location and duration of fixations or the pattern of gaze direction across the face. © The Author(s) 2015.

  16. Event-related brain responses to emotional words, pictures, and faces – a cross-domain comparison

    PubMed Central

    Bayer, Mareike; Schacht, Annekathrin

    2014-01-01

    Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of early posterior negativity (EPN) and late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which was assumed to be caused by generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation captured by ratings of stimulus valence and arousal. PMID:25339927

  17. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    PubMed

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion that evolved to detect and avoid the ingestion of poisonous food as well as contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or whether individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether the classification of dynamic facial expressions as happy or disgusted, and the emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect the recognition of emotion in dynamic faces, even when body odor disgust sensitivity was used as a moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect, and suggest that further studies should focus on reproducibility, specifying the experimental circumstances under which odor effects on facial expressions may be present versus absent.
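
    The Bayesian step above can be sketched with a default JZS Bayes factor, as implemented in the pingouin package; a BF10 well below 1 is read as evidence for the null. The simulated ratings, group sizes, and the particular Bayes-factor choice are assumptions for illustration, not the study's preregistered analysis.

    ```python
    # Sketch: Bayes factor for a (null) odor effect on emotion ratings.
    import numpy as np
    import pingouin as pg
    from scipy import stats

    rng = np.random.default_rng(0)
    valeric = rng.normal(5.0, 1.0, size=30)   # ratings, unpleasant-odor group
    control = rng.normal(5.0, 1.0, size=30)   # ratings, no-odor group

    t, p = stats.ttest_ind(valeric, control)
    bf10 = pg.bayesfactor_ttest(t, nx=30, ny=30)
    print(f"t = {t:.2f}, BF10 = {float(bf10):.3f}")  # BF10 << 1: null favoured
    ```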

  18. The Impact of Perceptual Load on the Non-Conscious Processing of Fearful Faces

    PubMed Central

    Wang, Lili; Feng, Chunliang; Mai, Xiaoqin; Jia, Lina; Zhu, Xiangru; Luo, Wenbo; Luo, Yue-jia

    2016-01-01

    Emotional stimuli can be processed without consciousness. In the current study, we used event-related potentials (ERPs) to assess whether perceptual load influences non-conscious processing of fearful facial expressions. Perceptual load was manipulated using a letter search task with the target letter presented at the fixation point, while facial expressions were presented peripherally and masked to prevent conscious awareness. The letter string comprised six letters (X or N) that were identical (low load) or different (high load). Participants were instructed to discriminate the letters at fixation or the facial expression (fearful or neutral) in the periphery. Participants were faster and more accurate at detecting letters in the low load condition than in the high load condition. Fearful faces elicited a sustained positivity from 250 ms to 700 ms post-stimulus over fronto-central areas during the face discrimination and low-load letter discrimination conditions, but this effect was completely eliminated during high-load letter discrimination. Our findings imply that non-conscious processing of fearful faces depends on perceptual load, and attentional resources are necessary for non-conscious processing. PMID:27149273

  19. The effect of Ramadan fasting on spatial attention through emotional stimuli

    PubMed Central

    Molavi, Maziyar; Yunus, Jasmy; Utama, Nugraha P

    2016-01-01

    Fasting can influence psychological and mental states. In the current study, the effect of periodic fasting on the processing of emotion through gazed facial expressions, as a realistic multisource of social information, was investigated for the first time. A dynamic cue-target task was administered, with behavioural and event-related potential measurements for 40 participants, to reveal the temporal and spatial brain activities before, during, and after the fasting period. Fasting had several significant effects. The amplitude of the N1 component decreased over the centroparietal scalp during fasting. Furthermore, reaction times during the fasting period decreased. Self-ratings of arousal deficit, as well as mood, increased during the fasting period. There was a significant contralateral alteration of P1 over the occipital area for happy facial expression stimuli. A significant effect of gazed expression and its interaction with the emotional stimuli was indicated by the amplitude of N1. Furthermore, the findings of the study confirmed the validity effect, i.e. the congruency between gaze and target position, as indicated by an increment of P3 amplitude over the centroparietal area as well as slower reaction times in the behavioural response data during the incongruent (invalid) condition between gaze and target position compared with the valid condition. The results of this study showed that attention to facial expression stimuli, as a kind of communicative social signal, was affected by fasting. Fasting also improved the mood of practitioners. Moreover, findings from the behavioural and event-related potential data analyses indicated that the neural dynamics of facial emotion are processed faster than those of gazing, as participants tended to react faster and preferred to rely on the type of facial emotion rather than on gaze direction while doing the task. For happy facial expression stimuli, right-hemisphere activation was greater than left-hemisphere activation, consistent with the emotional lateralization account rather than the valence account of emotional processing. PMID:27307772

  20. Turning Avatar into Realistic Human Expression Using Linear and Bilinear Interpolations

    NASA Astrophysics Data System (ADS)

    Hazim Alkawaz, Mohammed; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Facial animation in terms of 3D facial data has strong research support, with laser scanning and advanced 3D tools available for producing complex facial models. However, the approach still lacks facial expression driven by emotional condition, even though facial skin colour, being closely related to human emotion, is needed to enhance facial expressions. This paper presents innovative techniques for facial animation transformation using facial skin colour based on linear interpolation and bilinear interpolation. The generated expressions are almost identical to genuine human expressions and also enhance the facial expressiveness of the virtual human.
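
    The two interpolation schemes named in the abstract are standard and easy to sketch on RGB skin-colour values: linear interpolation blends two colour states, bilinear interpolation blends four. The colour values below are illustrative, not taken from the paper.

    ```python
    # Sketch: linear and bilinear interpolation of facial skin colour.
    import numpy as np

    def lerp(c0, c1, t):
        """Linear interpolation between two colours, t in [0, 1]."""
        return (1 - t) * c0 + t * c1

    def bilerp(c00, c10, c01, c11, s, t):
        """Bilinear interpolation over four corner colours, s and t in [0, 1]."""
        return lerp(lerp(c00, c10, s), lerp(c01, c11, s), t)

    neutral = np.array([224.0, 172.0, 105.0])  # baseline skin tone (RGB)
    angry = np.array([233.0, 150.0, 122.0])    # flushed tone (illustrative)
    sad = np.array([210.0, 170.0, 120.0])      # paler tone (illustrative)
    happy = np.array([240.0, 165.0, 110.0])    # warmer tone (illustrative)

    print(lerp(neutral, angry, 0.5))                     # blend of two states
    print(bilerp(neutral, angry, sad, happy, 0.3, 0.7))  # blend of four states
    ```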

  1. Valence Scaling of Dynamic Facial Expressions Is Altered in High-Functioning Subjects with Autism Spectrum Disorders: An FMRI Study

    ERIC Educational Resources Information Center

    Rahko, Jukka S.; Paakki, Jyri-Johan; Starck, Tuomo H.; Nikkinen, Juha; Pauls, David L.; Katsyri, Jari V.; Jansson-Verkasalo, Eira M.; Carter, Alice S.; Hurtig, Tuula M.; Mattila, Marja-Leena; Jussila, Katja K.; Remes, Jukka J.; Kuusikko-Gauffin, Sanna A.; Sams, Mikko E.; Bolte, Sven; Ebeling, Hanna E.; Moilanen, Irma K.; Tervonen, Osmo; Kiviniemi, Vesa

    2012-01-01

    FMRI was performed with the dynamic facial expressions fear and happiness. This was done to detect differences in valence processing between 25 subjects with autism spectrum disorders (ASDs) and 27 typically developing controls. Valence scaling was abnormal in ASDs. Positive valence induces lower deactivation and abnormally strong activity in ASD…

  2. More Pronounced Deficits in Facial Emotion Recognition for Schizophrenia than Bipolar Disorder

    PubMed Central

    Goghari, Vina M; Sponheim, Scott R

    2012-01-01

    Schizophrenia and bipolar disorder are typically separated in diagnostic systems. Behavioural, cognitive, and brain abnormalities associated with each disorder nonetheless overlap. We evaluated the diagnostic specificity of facial emotion recognition deficits in schizophrenia and bipolar disorder to determine whether select aspects of emotion recognition differed for the two disorders. The investigation used an experimental task that included the same facial images in an emotion recognition condition and an age recognition condition (to control for processes associated with general face recognition) in 27 schizophrenia patients, 16 bipolar I patients, and 30 controls. Schizophrenia and bipolar patients exhibited both shared and distinct aspects of facial emotion recognition deficits. Schizophrenia patients had deficits in recognizing angry facial expressions compared to healthy controls and bipolar patients. Compared to control participants, both schizophrenia and bipolar patients were more likely to mislabel facial expressions of anger as fear. Given that schizophrenia patients exhibited a deficit in emotion recognition for angry faces, which did not appear due to generalized perceptual and cognitive dysfunction, improving recognition of threat-related expression may be an important intervention target to improve social functioning in schizophrenia. PMID:23218816

  3. Tactile Stimulation of the Face and the Production of Facial Expressions Activate Neurons in the Primate Amygdala

    PubMed Central

    Mosher, Clayton P.; Zimmerman, Prisca E.; Fuglevand, Andrew J.

    2016-01-01

    The majority of neurophysiological studies that have explored the role of the primate amygdala in the evaluation of social signals have relied on visual stimuli such as images of facial expressions. Vision, however, is not the only sensory modality that carries social signals. Both humans and nonhuman primates exchange emotionally meaningful social signals through touch. Indeed, social grooming in nonhuman primates and caressing touch in humans is critical for building lasting and reassuring social bonds. To determine the role of the amygdala in processing touch, we recorded the responses of single neurons in the macaque amygdala while we applied tactile stimuli to the face. We found that one-third of the recorded neurons responded to tactile stimulation. Although we recorded exclusively from the right amygdala, the receptive fields of 98% of the neurons were bilateral. A fraction of these tactile neurons were monitored during the production of facial expressions and during facial movements elicited occasionally by touch stimuli. Firing rates arising during the production of facial expressions were similar to those elicited by tactile stimulation. In a subset of cells, combining tactile stimulation with facial movement further augmented the firing rates. This suggests that tactile neurons in the amygdala receive input from skin mechanoceptors that are activated by touch and by compressions and stretches of the facial skin during the contraction of the underlying muscles. Tactile neurons in the amygdala may play a role in extracting the valence of touch stimuli and/or monitoring the facial expressions of self during social interactions. PMID:27752543

  4. Tactile Stimulation of the Face and the Production of Facial Expressions Activate Neurons in the Primate Amygdala.

    PubMed

    Mosher, Clayton P; Zimmerman, Prisca E; Fuglevand, Andrew J; Gothard, Katalin M

    2016-01-01

    The majority of neurophysiological studies that have explored the role of the primate amygdala in the evaluation of social signals have relied on visual stimuli such as images of facial expressions. Vision, however, is not the only sensory modality that carries social signals. Both humans and nonhuman primates exchange emotionally meaningful social signals through touch. Indeed, social grooming in nonhuman primates and caressing touch in humans is critical for building lasting and reassuring social bonds. To determine the role of the amygdala in processing touch, we recorded the responses of single neurons in the macaque amygdala while we applied tactile stimuli to the face. We found that one-third of the recorded neurons responded to tactile stimulation. Although we recorded exclusively from the right amygdala, the receptive fields of 98% of the neurons were bilateral. A fraction of these tactile neurons were monitored during the production of facial expressions and during facial movements elicited occasionally by touch stimuli. Firing rates arising during the production of facial expressions were similar to those elicited by tactile stimulation. In a subset of cells, combining tactile stimulation with facial movement further augmented the firing rates. This suggests that tactile neurons in the amygdala receive input from skin mechanoceptors that are activated by touch and by compressions and stretches of the facial skin during the contraction of the underlying muscles. Tactile neurons in the amygdala may play a role in extracting the valence of touch stimuli and/or monitoring the facial expressions of self during social interactions.

  5. Face Processing and Facial Emotion Recognition in Adults with Down Syndrome

    ERIC Educational Resources Information Center

    Barisnikov, Koviljka; Hippolyte, Loyse; Van der Linden, Martial

    2008-01-01

    Face processing and facial expression recognition were investigated in 17 adults with Down syndrome, and results were compared with those of a child control group matched for receptive vocabulary. On the tasks involving faces without emotional content, the adults with Down syndrome performed significantly worse than did the controls. However, their…

  6. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    PubMed

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied upon when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.
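
    The per-tile contribution measure described above lends itself to a simple computation: for every trial, record which tiles were uncovered at the moment of correct recognition, then score each tile by how strongly its visibility co-occurs with success. The Python sketch below illustrates one such scoring scheme; the trial data, the 6 x 8 grid layout, and the visible-minus-invisible scoring rule are illustrative assumptions, not the paper's exact computation.

        import numpy as np

        # Hypothetical trial data: which of the 48 tiles were uncovered when the
        # participant stopped the sequence, and whether the label was correct.
        N_TILES = 48
        rng = np.random.default_rng(0)
        visible = rng.random((200, N_TILES)) < 0.4   # 200 trials x 48 tiles
        correct = rng.random(200) < 0.7              # recognition success per trial

        def tile_diagnostic_value(visible, correct):
            """Score each tile by how much more often it was uncovered on
            correct trials than on incorrect ones (illustrative scoring)."""
            return visible[correct].mean(axis=0) - visible[~correct].mean(axis=0)

        scores = tile_diagnostic_value(visible, correct)
        importance_map = scores.reshape(6, 8)   # assumed 6 x 8 tile grid
        print(importance_map.round(2))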

  7. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    PubMed Central

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

    Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied upon when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the facial action coding system. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eyes or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921

  8. Aggression differentially modulates brain responses to fearful and angry faces: an exploratory study.

    PubMed

    Lu, Hui; Wang, Yu; Xu, Shuang; Wang, Yifeng; Zhang, Ruiping; Li, Tsingan

    2015-08-19

    Aggression is reported to modulate neural responses to threatening information. However, whether aggression can modulate neural responses to different kinds of threatening facial expressions (angry and fearful expressions) remains unknown. Thus, event-related potentials were measured in individuals (13 high aggressive, 12 low aggressive) exposed to neutral, angry, and fearful facial expressions while performing a frame-distinguishing task, irrespective of the emotional valence of the expressions. Highly aggressive participants showed no differential neural responses to the three facial expressions. In addition, compared with individuals with low aggression, highly aggressive individuals showed a decreased frontocentral response to fearful faces within 250-300 ms and to angry faces within 400-500 ms of exposure. These results indicate that fearful faces represent a more threatening signal requiring a quick cognitive response during the early stage of facial processing, whereas angry faces elicit a stronger response during the later processing stage because of their pronounced emotional significance. The present results represent the first known evidence that aggression is associated with different neural responses to fearful and angry faces. By exploring the distinct temporal responses to fearful and angry faces modulated by aggression, this study more precisely characterizes the cognitive characteristics of aggressive individuals. Copyright © 2015 Wolters Kluwer Health, Inc. All rights reserved.

  9. [Neural mechanisms of facial recognition].

    PubMed

    Nagai, Chiyoko

    2007-01-01

    We review recent research on the neural mechanisms of facial recognition in light of three aspects: facial discrimination and identification, recognition of facial expressions, and face perception in itself. First, it has been demonstrated that the fusiform gyrus plays a central role in facial discrimination and identification. However, whether the FFA (fusiform face area) is truly specialized for facial processing remains controversial; some researchers maintain that the FFA is related to 'becoming an expert' with certain kinds of visual objects, including faces. The neural mechanisms of prosopagnosia bear directly on this issue. Second, the amygdala appears closely involved in the recognition of facial expressions, especially fear. The amygdala, connected with the superior temporal sulcus and the orbitofrontal cortex, appears to modulate these cortical functions. The amygdala and the superior temporal sulcus are also involved in gaze recognition, which explains why a patient with bilateral amygdala damage failed to recognize only fearful expressions; information from the eyes is necessary for fear recognition. Finally, even a newborn infant can recognize a face as a face, which is congruent with the innateness hypothesis of facial recognition. Some researchers speculate that the neural basis of such face perception is a subcortical network comprising the amygdala, the superior colliculus, and the pulvinar. This network may underlie the covert recognition that prosopagnosic patients retain.

  10. Computational Simulation on Facial Expressions and Experimental Tensile Strength for Silicone Rubber as Artificial Skin

    NASA Astrophysics Data System (ADS)

    Amijoyo Mochtar, Andi

    2018-02-01

    Applications of robotics have become important to human life in recent years. Many robot designs have been improved and enriched by advances in technology; among them are humanoid robots whose facial expressions closely approximate natural human ones. The purpose of this research was to compute facial expressions and to conduct tensile-strength tests of silicone rubber used as artificial skin. Facial expressions were computed by specifying the model dimensions, material properties, number of nodal elements, boundary conditions, force conditions, and analysis type. A robot's facial expression is determined by the direction and magnitude of the external forces applied at driven points. The robot's expressions mimic human ones because the simulated muscle structure follows the anatomy of the human face; the facial action coding system (FACS) was used to approximate human expressions. Tensile testing was conducted to determine the force that the artificial skin can sustain in future robot facial expressions. Combining the computational and experimental results can yield reliable and sustainable robot facial expressions using silicone rubber as artificial skin.
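
    As an illustration of the tensile-test side of this work, the following minimal Python sketch computes engineering stress (force divided by the original cross-sectional area), engineering strain (elongation divided by gauge length), and the ultimate tensile strength from force-elongation readings. All specimen dimensions and load values are invented for illustration and do not come from the paper.

        import numpy as np

        # Invented force-elongation readings for a silicone rubber strip.
        L0 = 50.0          # gauge length, mm
        A0 = 10.0 * 2.0    # cross-section: 10 mm wide x 2 mm thick, mm^2

        force_N = np.array([0.0, 12.0, 25.0, 41.0, 60.0, 74.0])
        elongation_mm = np.array([0.0, 10.0, 25.0, 45.0, 80.0, 120.0])

        stress_MPa = force_N / A0          # engineering stress (N/mm^2 = MPa)
        strain = elongation_mm / L0        # engineering strain (dimensionless)

        # Ultimate tensile strength: the maximum engineering stress reached.
        i = stress_MPa.argmax()
        print(f"UTS = {stress_MPa[i]:.2f} MPa at strain {strain[i]:.2f}")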

  11. [The effect of the serotonin transporter 5-HTTLPR polymorphism on the recognition of facial emotions in schizophrenia].

    PubMed

    Alfimova, M V; Golimbet, V E; Korovaitseva, G I; Lezheiko, T V; Abramova, L I; Aksenova, E V; Bolgov, M I

    2014-01-01

    The 5-HTTLPR SLC6A4 and catechol-O-methyltransferase (COMT) Val158Met polymorphisms are reported to be associated with the processing of facial expressions in the general population. Impaired recognition of facial expressions, which is characteristic of schizophrenia, negatively impacts the social adaptation of patients. To search for molecular mechanisms of this deficit, we studied the main and epistatic effects of the 5-HTTLPR and Val158Met polymorphisms on facial emotion recognition in patients with schizophrenia (n=299) and healthy controls (n=232). The 5-HTTLPR polymorphism was associated with emotion recognition in patients: ll-homozygotes recognized facial emotions significantly better than carriers of the s-allele (F=8.00; p=0.005). Although the recognition of facial emotions was correlated with negative symptoms, verbal learning, and trait anxiety, these variables did not significantly modify the association. In both groups, no effect of COMT on the recognition of facial emotions was found.

  12. Testosterone reduces conscious detection of signals serving social correction: implications for antisocial behavior.

    PubMed

    van Honk, Jack; Schutter, Dennis J L G

    2007-08-01

    Elevated levels of testosterone have repeatedly been associated with antisocial behavior, but the psychobiological mechanisms underlying this effect are unknown. However, testosterone is evidently capable of altering the processing of facial threat, and facial signals of fear and anger serve sociality through their higher-level empathy-provoking and socially corrective properties. We investigated the hypothesis that testosterone predisposes people to antisocial behavior by reducing conscious recognition of facial threat. In a within-subjects design, testosterone (0.5 mg) or placebo was administered to 16 female volunteers. Afterward, a task with morphed stimuli indexed their sensitivity for consciously recognizing the facial expressions of threat (disgust, fear, and anger) and nonthreat (surprise, sadness, and happiness). Testosterone induced a significant reduction in the conscious recognition of facial threat overall. Separate analyses for the three categories of threat faces indicated that this effect was reliable for angry facial expressions exclusively. This testosterone-induced impairment in the conscious detection of the socially corrective facial signal of anger may predispose individuals to antisocial behavior.

  13. Reproducibility of the dynamics of facial expressions in unilateral facial palsy.

    PubMed

    Alagha, M A; Ju, X; Morley, S; Ayoub, A

    2018-02-01

    The aim of this study was to assess the reproducibility of non-verbal facial expressions in unilateral facial paralysis using dynamic four-dimensional (4D) imaging. The Di4D system was used to record five facial expressions of 20 adult patients. The system captured 60 three-dimensional (3D) images per second; each facial expression took 3-4 seconds and was recorded in real time. Thus, a set of 180 3D facial images was generated for each expression. The procedure was repeated after 30 min to assess the reproducibility of the expressions. A mathematical facial mesh consisting of thousands of quasi-point 'vertices' was conformed to the face in order to determine the morphological characteristics in a comprehensive manner. The vertices were tracked throughout the sequence of the 180 images. Five key 3D facial frames from each sequence of images were analyzed. Comparisons were made between the first and second capture of each facial expression to assess the reproducibility of facial movements. Corresponding images were aligned using partial Procrustes analysis, and the root mean square distance between them was calculated and analyzed statistically (paired Student t-test, P<0.05). Facial expressions of lip purse, cheek puff, and raising of the eyebrows were reproducible. Facial expressions of maximum smile and forceful eye closure were not reproducible. The limited coordination of various groups of facial muscles contributed to the lack of reproducibility of these facial expressions. 4D imaging is a useful clinical tool for the assessment of facial expressions. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
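
    A minimal Python sketch of the analysis pipeline described above follows: a partial Procrustes superimposition (translation plus rotation, no scaling) implemented via the SVD-based Kabsch method, the root mean square (RMS) distance between corresponding vertices, and a paired t-test across patients. The stand-in data are random; the study's actual meshes contain thousands of vertices.

        import numpy as np
        from scipy.stats import ttest_rel

        def partial_procrustes_align(X, Y):
            """Rigidly align Y to X (translation + rotation, no scaling).
            X, Y: (n_points, 3) arrays of corresponding vertices."""
            Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
            U, _, Vt = np.linalg.svd(Yc.T @ Xc)
            R = U @ Vt
            if np.linalg.det(R) < 0:     # guard against a reflection
                U[:, -1] *= -1
                R = U @ Vt
            return Xc, Yc @ R

        def rms_distance(X, Y):
            """Root mean square distance between corresponding vertices."""
            return np.sqrt(np.mean(np.sum((X - Y) ** 2, axis=1)))

        # Stand-in meshes: the second capture is a noisy, rotated, translated
        # copy of the first, so alignment is actually exercised.
        rng = np.random.default_rng(1)
        mesh1 = rng.normal(size=(500, 3))
        theta = 0.3
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
        mesh2 = (mesh1 + 0.05 * rng.normal(size=(500, 3))) @ Rz + np.array([5.0, -2.0, 1.0])
        Xc, Ya = partial_procrustes_align(mesh1, mesh2)
        print(f"RMS after alignment: {rms_distance(Xc, Ya):.3f}")

        # Per-patient RMS values for one expression across the two sessions,
        # compared with a paired Student t-test (invented numbers).
        rms_session1 = rng.normal(1.0, 0.2, 20)
        rms_session2 = rng.normal(1.1, 0.2, 20)
        t, p = ttest_rel(rms_session1, rms_session2)
        print(f"paired t = {t:.2f}, p = {p:.3f}")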

  14. The Mask of Sanity: Facial Expressive, Self-Reported, and Physiological Consequences of Emotion Regulation in Psychopathic Offenders.

    PubMed

    Nentjes, Lieke; Bernstein, David P; Meijer, Ewout; Arntz, Arnoud; Wiers, Reinout W

    2016-12-01

    This study investigated the physiological, self-reported, and facial correlates of emotion regulation in psychopathy. Specifically, we compared psychopathic offenders (n = 42), nonpsychopathic offenders (n = 42), and nonoffender controls (n = 26) in their ability to inhibit and express emotion while watching affective films (fear, happy, and sad). Results showed that all participants were capable of drastically diminishing facial emotions under inhibition instructions. Contrary to expectation, psychopaths were not superior in adopting such a "poker face." Further, the inhibition of emotion was associated with cardiovascular changes, an effect that was also not dependent on psychopathy (or its factors), suggesting emotion inhibition to be an effortful process in psychopaths as well. Interestingly, psychopathic offenders did not differ from nonpsychopaths in the capacity to show content-appropriate facial emotions during the expression condition. Taken together, these data challenge the view that psychopathy is associated with either superior emotional inhibitory capacities or a generalized impairment in showing facial affect.

  15. Rapid facial reactions to emotional facial expressions in typically developing children and children with autism spectrum disorder.

    PubMed

    Beall, Paula M; Moody, Eric J; McIntosh, Daniel N; Hepburn, Susan L; Reed, Catherine L

    2008-11-01

    Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study evaluated mechanisms underlying RFRs during childhood and examined possible impairment in children with ASD. Experiment 1 found RFRs to happy and angry faces (not fear faces) in 15 typically developing children from 7 to 12 years of age. RFRs of fear (not anger) in response to angry faces indicated an emotional mechanism. In 11 children (8-13 years of age) with ASD, Experiment 2 found undifferentiated RFRs to fear expressions and no consistent RFRs to happy or angry faces. However, as children with ASD aged, matching RFRs to happy faces increased significantly, suggesting the development of processes underlying matching RFRs during this period in ASD.

  16. A study of patient facial expressivity in relation to orthodontic/surgical treatment.

    PubMed

    Nafziger, Y J

    1994-09-01

    A dynamic analysis of the faces of patients seeking aesthetic correction of facial aberrations through orthognathic treatment requires, besides the routine static records (study models, photographs, and cephalometric tracings), the study of their facial expressions. To classify the units of expressive facial behavior, the mobility of the face was studied with the aid of the facial action coding system (FACS) created by Ekman and Friesen. Using video recordings of faces and photographic images taken from those recordings, the author adapted a technique of facial analysis structured on the visual observation of the anatomic basis of movement. The technique is based on defining individual facial expressions and then codifying them through minimal, anatomic action units, which combine to form facial expressions. With the help of FACS, the facial expressions of 18 patients before and after orthognathic surgery and of six control subjects without dentofacial deformation were studied. From 6278 registered facial expressions, 18,844 action units were identified. Classifying the facial expressions by subject group over quantified time frames allowed the establishment of "rules" or "norms" of expression, enabling comparisons of facial expressiveness between patients and control subjects. This study indicates that the facial expressions of the patients were more similar to those of the controls after orthognathic surgery. It was possible to distinguish changes in facial expressivity after dentofacial surgery; the type and degree of change depended on the facial structure before surgery. The changes tended toward functioning identical to that of subjects without dysmorphosis and toward greater lip competence, particularly in the function of the orbicular muscle of the lips, with reduced compensatory activity of the lower lip and chin. These findings are supported by clinical observations and suggest that the FACS technique can provide a coding scheme for the study of facial expression.
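
    Quantifying expressions as combinations of FACS action units reduces, at its simplest, to tallying how often each action unit appears per subject group. The Python sketch below shows such a tally; the action-unit codes and group data are invented placeholders, and the study's actual classification over quantified time frames is considerably richer.

        from collections import Counter

        # Invented coded data: each expression is the set of action units (AUs)
        # it combines, e.g. AU6 + AU12 for a Duchenne smile.
        pre_surgery  = [("AU6", "AU12"), ("AU4",), ("AU12",), ("AU17", "AU24")]
        post_surgery = [("AU6", "AU12"), ("AU12",), ("AU6", "AU12")]
        controls     = [("AU6", "AU12"), ("AU12",), ("AU1", "AU2")]

        def au_rates(expressions):
            """Relative frequency of each action unit across coded expressions."""
            counts = Counter(au for expr in expressions for au in expr)
            total = sum(counts.values())
            return {au: round(n / total, 2) for au, n in counts.items()}

        for label, data in (("pre", pre_surgery), ("post", post_surgery),
                            ("controls", controls)):
            print(label, au_rates(data))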

  17. Facial Expression Influences Face Identity Recognition During the Attentional Blink

    PubMed Central

    2014-01-01

    Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry—suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppress memory access for competing objects, but only angry facial expression enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another. PMID:25286076

  18. Facial expression influences face identity recognition during the attentional blink.

    PubMed

    Bach, Dominik R; Schmidt-Daffy, Martin; Dolan, Raymond J

    2014-12-01

    Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry, suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppress memory access for competing objects, but only angry facial expression enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another.

  19. Facial expression identification using 3D geometric features from Microsoft Kinect device

    NASA Astrophysics Data System (ADS)

    Han, Dongxu; Al Jawad, Naseer; Du, Hongbo

    2016-05-01

    Facial expression identification is an important part of face recognition and closely related to emotion detection from face images. Various solutions have been proposed in the past using different types of cameras and features. The Microsoft Kinect device has been widely used for multimedia interactions. More recently, the device has been increasingly deployed to support scientific investigations. This paper explores the effectiveness of using the device to identify emotional facial expressions such as surprise, smiling, and sadness, and evaluates the usefulness of 3D data points on a face mesh structure obtained from the Kinect device. We present a distance-based geometric feature component that is derived from the distances between points on the face mesh and selected reference points in a single frame. The feature components extracted across a sequence of frames starting and ending with a neutral expression represent a whole expression. The feature vector eliminates the need for complex face orientation correction, simplifying the feature extraction process and making it more efficient. We applied a kNN classifier that exploits a feature-component-based similarity measure following the principle of dynamic time warping to determine the closest neighbors. Preliminary tests on a small-scale database of different facial expressions show the promise of the newly developed features and the usefulness of the Kinect device in facial expression identification.
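
    The pipeline described above can be sketched in a few functions: distance-based features per frame, dynamic time warping (DTW) to compare variable-length expression sequences, and a kNN vote over the DTW distances. The Python sketch below is a simplified reconstruction under assumed data shapes (3D mesh points per frame), not the authors' implementation.

        import numpy as np

        def frame_features(mesh_points, ref_idx):
            """Distance-based feature vector for one frame: Euclidean distances
            from each tracked mesh point to a few reference points.
            mesh_points: (n_points, 3); ref_idx: indices of reference points."""
            refs = mesh_points[ref_idx]                                  # (n_refs, 3)
            d = np.linalg.norm(mesh_points[:, None, :] - refs[None, :, :], axis=2)
            return d.ravel()

        def dtw_distance(A, B):
            """Dynamic time warping cost between two feature sequences
            A: (Ta, d) and B: (Tb, d), with Euclidean frame-to-frame cost."""
            Ta, Tb = len(A), len(B)
            D = np.full((Ta + 1, Tb + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, Ta + 1):
                for j in range(1, Tb + 1):
                    cost = np.linalg.norm(A[i - 1] - B[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[Ta, Tb]

        def knn_classify(query_seq, train_seqs, train_labels, k=3):
            """Label a query sequence by majority vote of its k nearest
            training sequences under the DTW distance."""
            dists = [dtw_distance(query_seq, s) for s in train_seqs]
            nearest = np.argsort(dists)[:k]
            votes = [train_labels[i] for i in nearest]
            return max(set(votes), key=votes.count)

        # Example with random stand-in sequences of 10-dimensional features.
        rng = np.random.default_rng(2)
        seqs = [rng.normal(size=(30, 10)), rng.normal(size=(25, 10))]
        print(knn_classify(rng.normal(size=(28, 10)), seqs, ["smile", "surprise"], k=1))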

  20. Treatment with escitalopram improves the attentional bias toward negative facial expressions in patients with major depressive disorders.

    PubMed

    Zhou, Zhenhe; Cao, Suxia; Li, Hengfen; Li, Youhui

    2015-10-01

    We hypothesized that treatment with escitalopram would improve cognitive bias and contribute to the recovery process for patients with major depressive disorder (MDD). Many previous studies have established that patients with MDD tend to pay selective attention to negative stimuli. The assessment of the level of cognitive bias is regarded as a crucial dimension of treatment outcomes for MDD. To our knowledge, no prior studies have been reported on the effects of treatment with escitalopram on attentional bias in MDD employing a dot probe task of facial expression. We studied 25 patients with MDD and 25 controls, and used a dot probe task of facial expression to measure cognitive bias. The patients' psychopathologies were rated using the Hamilton Depression Scale (HAMD) at baseline and after 8 weeks of treatment with escitalopram. All participants performed the facial expression dot probe task. The results revealed that the 8-week escitalopram treatment decreased HAMD scores. The patients with MDD at baseline exhibited an attentional bias toward negative faces; however, no significant bias toward either negative or happy faces was observed in the controls. After the 8-week escitalopram treatment, no significant bias toward negative faces was observed in the patient group. In conclusion, patients with MDD pay more attention to negative facial expressions, and treatment with escitalopram improves this attentional bias toward negative facial expressions. This is the first study, to our knowledge, on the effects of treatment with escitalopram on attentional bias in patients with MDD that has employed a dot probe task of facial expression. Copyright © 2015 Elsevier Ltd. All rights reserved.
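
    The dot probe task quantifies attentional bias as a reaction-time difference: if attention is drawn to the emotional face, probes appearing at its location (congruent trials) are answered faster than probes at the neutral face's location (incongruent trials). A minimal Python sketch with invented reaction times:

        import numpy as np

        def attentional_bias(rt_congruent, rt_incongruent):
            """Dot-probe bias score: RT(probe opposite the emotional face)
            minus RT(probe at the emotional face's location). Positive values
            indicate attention drawn toward the emotional expression."""
            return np.mean(rt_incongruent) - np.mean(rt_congruent)

        # Invented reaction times in ms, purely for illustration.
        rt_congruent = np.array([512, 498, 530, 505])     # probe replaces negative face
        rt_incongruent = np.array([548, 561, 539, 550])   # probe replaces neutral face
        bias = attentional_bias(rt_congruent, rt_incongruent)
        print(f"bias toward negative faces: {bias:.1f} ms")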

  1. Analysis of facial expressions in parkinson's disease through video-based automatic methods.

    PubMed

    Bandini, Andrea; Orlandi, Silvia; Escalante, Hugo Jair; Giovannelli, Fabio; Cincotta, Massimo; Reyes-Garcia, Carlos A; Vanni, Paola; Zaccara, Gaetano; Manfredi, Claudia

    2017-04-01

    The automatic analysis of facial expressions is an evolving field with several clinical applications. One of these applications is the study of facial bradykinesia in Parkinson's disease (PD), a major motor sign of this neurodegenerative illness. Facial bradykinesia consists of the reduction or loss of facial movements and emotional facial expressions, known as hypomimia. In this work we propose an automatic method for studying facial expressions in PD patients relying on video-based methods. Seventeen Parkinsonian patients and 17 healthy control subjects were asked to show basic facial expressions, upon request of the clinician and after imitation of a visual cue on a screen. Through an existing face tracker, the Euclidean distance of the facial model from a neutral baseline was computed in order to quantify the changes in facial expressivity during the tasks. Moreover, an automatic facial expression recognition algorithm was trained in order to study how PD expressions differed from the standard expressions. Results show that control subjects exhibited, on average, larger distances than PD patients across the tasks, confirming that control subjects show larger movements during both posed and imitated facial expressions. Moreover, our results demonstrate that anger and disgust are the two most impaired expressions in PD patients. Contactless video-based systems can be important tools for analyzing facial expressions also in rehabilitation, in particular speech therapy, where patients could benefit from real-time feedback about the proper facial expressions/movements to perform. Copyright © 2017 Elsevier B.V. All rights reserved.
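
    The core expressivity index described above, the Euclidean distance of the tracked facial model from a neutral baseline, is straightforward to compute per frame. The Python sketch below assumes 2D landmark coordinates from a generic face tracker; all shapes and values are illustrative, not the authors' configuration.

        import numpy as np

        def expressivity_over_time(landmarks, neutral):
            """Per-frame Euclidean distance of the tracked facial model from a
            neutral baseline, as a simple index of facial movement amplitude.
            landmarks: (n_frames, n_points, 2); neutral: (n_points, 2)."""
            diffs = landmarks - neutral[None, :, :]
            return np.linalg.norm(diffs.reshape(len(landmarks), -1), axis=1)

        # Stand-in tracked landmarks: a 100-frame clip with 68 face points.
        rng = np.random.default_rng(3)
        neutral = rng.normal(size=(68, 2))
        frames = neutral + 0.1 * rng.normal(size=(100, 68, 2))
        activity = expressivity_over_time(frames, neutral)
        print(f"mean expressivity index: {activity.mean():.3f}")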

  2. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    PubMed

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expressions. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Eighty-one young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expressions elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  3. Women's Greater Ability to Perceive Happy Facial Emotion Automatically: Gender Differences in Affective Priming

    PubMed Central

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better at recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It has been observed that masked emotional facial expressions have an affect-congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expressions. In our priming experiment, a sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Eighty-one young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of the emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expressions elicited valence-congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability than men to perceive and respond to positive facial emotion at an automatic processing level. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states. PMID:22844519

  4. Slowing down Presentation of Facial Movements and Vocal Sounds Enhances Facial Expression Recognition and Induces Facial-Vocal Imitation in Children with Autism

    ERIC Educational Resources Information Center

    Tardif, Carole; Laine, France; Rodriguez, Melissa; Gepner, Bruno

    2007-01-01

    This study examined the effects of slowing down presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on…

  5. Dynamic facial expression recognition based on geometric and texture features

    NASA Astrophysics Data System (ADS)

    Li, Ming; Wang, Zengfu

    2018-04-01

    Recently, dynamic facial expression recognition in videos has attracted growing attention. In this paper, we propose a novel dynamic facial expression recognition method using geometric and texture features. In our system, facial landmark movements and texture variations between pairwise images are used to perform the dynamic facial expression recognition tasks. For each facial expression sequence, pairwise images are created between the first frame and each of its subsequent frames. Integrating both geometric and texture features further enhances the representation of the facial expressions. Finally, a Support Vector Machine is used for facial expression recognition. Experiments conducted on the extended Cohn-Kanade database show that our proposed method achieves performance competitive with other methods.
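
    A hedged sketch of this kind of pairwise-feature pipeline: geometric features are the landmark displacements between the first frame and a later frame, texture features are the difference of per-frame descriptors, and an SVM (here scikit-learn's SVC) does the classification. Plain intensity histograms stand in for the paper's texture descriptor, and the training data are random placeholders.

        import numpy as np
        from sklearn.svm import SVC

        def pairwise_features(landmarks_first, landmarks_t, tex_first, tex_t):
            """Feature vector for one (first frame, later frame) pair:
            geometric part = landmark displacements between the two frames;
            texture part   = difference of per-frame texture descriptors
            (simple histograms here, standing in for richer features)."""
            geom = (landmarks_t - landmarks_first).ravel()
            tex = tex_t - tex_first
            return np.concatenate([geom, tex])

        # Stand-in training set: one feature vector per expression sequence,
        # 68 landmarks (x, y) plus a 32-bin texture histogram difference.
        rng = np.random.default_rng(4)
        X = rng.normal(size=(60, 68 * 2 + 32))
        y = rng.integers(0, 6, size=60)   # six basic-expression labels

        clf = SVC(kernel="rbf", C=1.0).fit(X, y)
        print(clf.predict(X[:5]))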

  6. A comparison of facial expression properties in five hylobatid species.

    PubMed

    Scheider, Linda; Liebal, Katja; Oña, Leonardo; Burrows, Anne; Waller, Bridget

    2014-07-01

    Little is known about facial communication in lesser apes (family Hylobatidae) and how their facial expressions, and the use of those expressions, relate to social organization. We investigated facial expressions (defined as combinations of facial movements) in social interactions of mated pairs in five different hylobatid species belonging to three different genera, using a recently developed objective coding system, the Facial Action Coding System for hylobatid species (GibbonFACS). We described three important properties of their facial expressions and compared them between genera. First, we compared the rate of facial expressions, defined as the number of facial expressions per unit of time. Second, we compared repertoire size, defined as the number of different types of facial expressions used, independent of their frequency. Third, we compared the diversity of expression, defined as the repertoire weighted by the rate of use of each type of facial expression. We observed a higher rate and diversity of facial expression, but no larger repertoire, in Symphalangus (siamangs) compared to Hylobates and Nomascus species. In line with previous research, these results suggest siamangs differ from other hylobatids in certain aspects of their social behavior. To investigate whether differences in facial expressions are linked to hylobatid socio-ecology, we used a Phylogenetic Generalized Least Squares (PGLS) regression analysis to correlate these properties with two social factors: group size and level of monogamy. No relationship between the properties of facial expressions and these socio-ecological factors was found. One explanation could be that facial expressions in hylobatid species are subject to phylogenetic inertia and do not differ sufficiently between species to reveal correlations with factors such as group size and monogamy level. © 2014 Wiley Periodicals, Inc.
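
    PGLS is ordinary regression with an error covariance structure taken from the phylogeny: under a Brownian-motion model, the covariance between two species is proportional to their shared branch length. The Python sketch below runs such a regression with statsmodels' GLS, passing the phylogenetic covariance matrix as sigma; the species values and the covariance matrix are invented for illustration.

        import numpy as np
        import statsmodels.api as sm

        # Invented data for five species: expression rate vs. group size.
        rate = np.array([3.1, 2.8, 4.0, 1.9, 2.2])
        group_size = np.array([2.0, 2.0, 4.0, 3.0, 2.0])

        # Phylogenetic covariance: entry (i, j) grows with the shared branch
        # length of species i and j (two invented clades: {0,1} and {2,3,4}).
        Sigma = np.array([
            [1.0, 0.5, 0.1, 0.1, 0.1],
            [0.5, 1.0, 0.1, 0.1, 0.1],
            [0.1, 0.1, 1.0, 0.5, 0.5],
            [0.1, 0.1, 0.5, 1.0, 0.5],
            [0.1, 0.1, 0.5, 0.5, 1.0],
        ])

        X = sm.add_constant(group_size)
        pgls = sm.GLS(rate, X, sigma=Sigma).fit()
        print(pgls.params)    # intercept and slope under phylogenetic error
        print(pgls.pvalues)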

  7. Facial expressions and pair bonds in hylobatids.

    PubMed

    Florkiewicz, Brittany; Skollar, Gabriella; Reichard, Ulrich H

    2018-06-06

    Facial expressions are an important component of primate communication that functions to transmit social information and modulate intentions and motivations. Chimpanzees and macaques, for example, produce a variety of facial expressions when communicating with conspecifics. Hylobatids also produce various facial expressions; however, the origin and function of these facial expressions are still largely unclear. It has been suggested that larger facial expression repertoires may have evolved in the context of social complexity, but this link has yet to be tested at a broader empirical basis. The social complexity hypothesis offers a possible explanation for the evolution of complex communicative signals such as facial expressions, because as the complexity of an individual's social environment increases so does the need for communicative signals. We used an intraspecies, pair-focused study design to test the link between facial expressions and sociality within hylobatids, specifically the strength of pair-bonds. The current study compared 206 hr of video and 103 hr of focal animal data for ten hylobatid pairs from three genera (Nomascus, Hoolock, and Hylobates) living at the Gibbon Conservation Center. Using video footage, we explored 5,969 facial expressions along three dimensions: repertoire use, repertoire breadth, and facial expression synchrony [FES]. We then used focal animal data to compare dimensions of facial expressiveness to pair bond strength and behavioral synchrony. Hylobatids in our study overlapped in only half of their facial expressions (50%) with the only other detailed, quantitative study of hylobatid facial expressions, while 27 facial expressions were uniquely observed in our study animals. Taken together, hylobatids have a large facial expression repertoire of at least 80 unique facial expressions. Contrary to our prediction, facial repertoire composition was not significantly correlated with pair bond strength, rates of territorial synchrony, or rates of behavioral synchrony. We found that FES was the strongest measure of hylobatid expressiveness and was significantly positively correlated with higher sociality index scores; however, FES showed no significant correlation with behavioral synchrony. No noticeable differences between pairs were found regarding rates of behavioral or territorial synchrony. Facial repertoire sizes and FES were not significantly correlated with rates of behavioral synchrony or territorial synchrony. Our study confirms an important role of facial expressions in maintaining pair bonds and coordinating activities in hylobatids. Data support the hypothesis that facial expressions and sociality have been linked in hylobatid and primate evolution. It is possible that larger facial repertoires may have contributed to strengthening pair bonds in primates, because richer facial repertoires provide more opportunities for FES which can effectively increase the "understanding" between partners through smoother coordination of interaction patterns. This study supports the social complexity hypothesis as the driving force for the evolution of complex communication signaling. © 2018 Wiley Periodicals, Inc.

  8. Enhanced embodied response following ambiguous emotional processing.

    PubMed

    Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial

    2012-08-01

    It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.

  9. Plain faces are more expressive: comparative study of facial colour, mobility and musculature in primates

    PubMed Central

    Santana, Sharlene E.; Dobson, Seth D.; Diogo, Rui

    2014-01-01

    Facial colour patterns and facial expressions are among the most important phenotypic traits that primates use during social interactions. While colour patterns provide information about the sender's identity, expressions can communicate its behavioural intentions. Extrinsic factors, including social group size, have shaped the evolution of facial coloration and mobility, but intrinsic relationships and trade-offs likely operate in their evolution as well. We hypothesize that complex facial colour patterning could reduce how salient facial expressions appear to a receiver, and thus species with highly expressive faces would have evolved uniformly coloured faces. We test this hypothesis through a phylogenetic comparative study, and explore the underlying morphological factors of facial mobility. Supporting our hypothesis, we find that species with highly expressive faces have plain facial colour patterns. The number of facial muscles does not predict facial mobility; instead, species that are larger and have a larger facial nucleus have more expressive faces. This highlights a potential trade-off between facial mobility and colour patterning in primates and reveals complex relationships between facial features during primate evolution. PMID:24850898

  10. A unified probabilistic framework for spontaneous facial action modeling and understanding.

    PubMed

    Tong, Yan; Chen, Jixu; Ji, Qiang

    2010-02-01

    Facial expression is a natural and powerful means of human communication. Recognizing spontaneous facial actions, however, is very challenging due to subtle facial deformation, frequent head movements, and ambiguous and uncertain facial motion measurements. Because of these challenges, current research in facial expression recognition is limited to posed expressions, often captured in frontal view. A spontaneous facial expression is characterized by rigid head movements and nonrigid facial muscular movements. More importantly, it is the coherent and consistent spatiotemporal interactions among rigid and nonrigid facial motions that produce a meaningful facial expression. Recognizing this fact, we introduce a unified probabilistic facial action model based on the dynamic Bayesian network (DBN) to simultaneously and coherently represent rigid and nonrigid facial motions, their spatiotemporal dependencies, and their image measurements. Advanced machine learning methods are introduced to learn the model based on both training data and subjective prior knowledge. Given the model and the measurements of facial motions, facial action recognition is accomplished through probabilistic inference by systematically integrating visual measurements with the facial action model. Experiments show that, compared to state-of-the-art techniques, the proposed system yields significant improvements in recognizing both rigid and nonrigid facial motions, especially for spontaneous facial expressions.
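
    A full DBN over coupled rigid and nonrigid motions is beyond a short example, but the core inference step, propagating a belief through a transition model and updating it with a noisy measurement, can be shown for a single binary facial action unit. The Python sketch below is a drastically simplified stand-in (effectively a two-state forward filter) with invented probabilities.

        import numpy as np

        # Invented model for one action unit: states 0 = inactive, 1 = active.
        T = np.array([[0.9, 0.1],    # P(state_t | state_{t-1}): rows from, cols to
                      [0.2, 0.8]])
        E = np.array([[0.8, 0.2],    # P(measurement | state): rows state, cols obs
                      [0.3, 0.7]])

        def forward_filter(obs, prior=np.array([0.5, 0.5])):
            """Return P(state_t | obs_1..t) for each frame t."""
            belief = prior.copy()
            out = []
            for o in obs:
                belief = T.T @ belief      # predict one step ahead
                belief = belief * E[:, o]  # update with the frame's measurement
                belief /= belief.sum()     # normalize to a probability
                out.append(belief.copy())
            return np.array(out)

        obs = [0, 0, 1, 1, 1, 0]                    # detector output per frame
        print(forward_filter(obs)[:, 1].round(2))   # P(AU active) over time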

  11. Objectifying facial expressivity assessment of Parkinson's patients: preliminary study.

    PubMed

    Wu, Peng; Gonzalez, Isabel; Patsis, Georgios; Jiang, Dongmei; Sahli, Hichem; Kerckhofs, Eric; Vandekerckhove, Marie

    2014-01-01

    Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as "facial masking," a symptom in which facial muscles become rigid. To improve clinical assessment of facial expressivity in PD, this work attempts to quantify dynamic facial expressivity (facial activity) by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To elicit spontaneous facial expressions resembling those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were induced using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports: self-reported disgust was significantly stronger than the other elicited emotions. We therefore focused our analysis on the data recorded while participants watched the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences between PD patients at different stages of disease progression were also observed.

  12. Event-related theta synchronization predicts deficit in facial affect recognition in schizophrenia.

    PubMed

    Csukly, Gábor; Stefanics, Gábor; Komlósi, Sarolta; Czigler, István; Czobor, Pál

    2014-02-01

    Growing evidence suggests that abnormalities in the synchronized oscillatory activity of neurons in schizophrenia may lead to impaired neural activation and temporal coding and thus lead to neurocognitive dysfunctions, such as deficits in facial affect recognition. To gain an insight into the neurobiological processes linked to facial affect recognition, we investigated both induced and evoked oscillatory activity by calculating the Event Related Spectral Perturbation (ERSP) and the Inter Trial Coherence (ITC) during facial affect recognition. Fearful and neutral faces as well as nonface patches were presented to 24 patients with schizophrenia and 24 matched healthy controls while EEG was recorded. The participants' task was to recognize facial expressions. Because previous findings with healthy controls showed that facial feature decoding was associated primarily with oscillatory activity in the theta band, we analyzed ERSP and ITC in this frequency band in the time interval of 140-200 ms, which corresponds to the N170 component. Event-related theta activity and phase-locking to facial expressions, but not to nonface patches, predicted emotion recognition performance in both controls and patients. Event-related changes in theta amplitude and phase-locking were found to be significantly weaker in patients compared with healthy controls, which is in line with previous investigations showing decreased neural synchronization in the low frequency bands in patients with schizophrenia. Neural synchrony is thought to underlie distributed information processing. Our results indicate a less effective functioning in the recognition process of facial features, which may contribute to a less effective social cognition in schizophrenia. PsycINFO Database Record (c) 2014 APA, all rights reserved.
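
    The two spectral measures used here have compact definitions: ERSP is the trial-averaged spectral power relative to a baseline level, and ITC is the length of the mean unit phase vector across trials (1 = perfect phase locking, 0 = random phase). A minimal Python sketch over synthetic complex time-frequency coefficients:

        import numpy as np

        def inter_trial_coherence(coeffs):
            """ITC from complex time-frequency coefficients (trials x times):
            the length of the mean unit phase vector across trials."""
            return np.abs((coeffs / np.abs(coeffs)).mean(axis=0))

        def ersp_db(coeffs, baseline_power):
            """ERSP in dB: trial-averaged power relative to a baseline level."""
            return 10 * np.log10((np.abs(coeffs) ** 2).mean(axis=0) / baseline_power)

        # Synthetic theta-band coefficients: 50 trials x 80 time points with
        # random phase, so ITC should come out near 0.
        rng = np.random.default_rng(5)
        phases = rng.uniform(0, 2 * np.pi, (50, 80))
        amps = rng.gamma(2.0, 1.0, (50, 80))
        coeffs = amps * np.exp(1j * phases)
        print(inter_trial_coherence(coeffs).mean().round(3))
        print(ersp_db(coeffs, baseline_power=2.0).mean().round(2))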

  13. Influence of aggression on information processing in the emotional stroop task--an event-related potential study.

    PubMed

    Bertsch, Katja; Böhnke, Robina; Kruk, Menno R; Naumann, Ewald

    2009-01-01

    Aggression is a common behavior which has frequently been explained as involving changes in higher level information processing patterns. Although researchers have started only recently to investigate information processing in healthy individuals while engaged in aggressive behavior, the impact of aggression on information processing beyond an aggressive encounter remains unclear. In an event-related potential study, we investigated the processing of facial expressions (happy, angry, fearful, and neutral) in an emotional Stroop task after experimentally provoking aggressive behavior in healthy participants. Compared to a non-provoked group, these individuals showed increased early (P2) and late (P3) positive amplitudes for all facial expressions. For the P2 amplitude, the effect of provocation was greatest for threat-related expressions. Beyond this, a bias for emotional expressions, i.e., slower reaction times to all emotional expressions, was found in provoked participants with a high level of trait anger. These results indicate significant effects of aggression on information processing, which last beyond the aggressive encounter even in healthy participants.

  14. Global-Local Precedence in the Perception of Facial Age and Emotional Expression by Children with Autism and Other Developmental Disabilities

    ERIC Educational Resources Information Center

    Gross, Thomas F.

    2005-01-01

    Global information processing and the perception of facial age and emotional expression were studied in children with autism, language disorders, mental retardation, and a clinical control group. Children were given a global-local task and asked to recognize age and emotion in human and canine faces. Children with autism made fewer global responses and…

  15. Visual attention mechanisms in happiness versus trustworthiness processing of facial expressions.

    PubMed

    Calvo, Manuel G; Krumhuber, Eva G; Fernández-Martín, Andrés

    2018-03-01

    A happy facial expression makes a person look (more) trustworthy. Do perceptions of happiness and trustworthiness rely on the same face regions and visual attention processes? In an eye-tracking study, eye movements and fixations were recorded while participants judged the un/happiness or the un/trustworthiness of dynamic facial expressions in which the eyes and/or the mouth unfolded from neutral to happy or vice versa. A smiling mouth and happy eyes enhanced perceived happiness and trustworthiness similarly, with a greater contribution of the smile relative to the eyes. This comparable judgement output for happiness and trustworthiness was reached through shared as well as distinct attentional mechanisms: (a) entry times and (b) initial fixation thresholds for each face region were equivalent for both judgements, thereby revealing the same attentional orienting in happiness and trustworthiness processing. However, (c) greater and (d) longer fixation density for the mouth region in the happiness task, and for the eye region in the trustworthiness task, demonstrated different selective attentional engagement. Relatedly, (e) mean fixation duration across face regions was longer in the trustworthiness task, thus showing increased attentional intensity or processing effort.
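
    The attentional measures named above (entry time, fixation density, mean fixation duration) can be derived directly from a list of fixation records. The Python sketch below uses invented fixation tuples and area-of-interest (AOI) labels purely for illustration:

        # Invented fixation records for one trial: (AOI, onset_ms, duration_ms).
        fixations = [("eyes", 180, 240), ("mouth", 450, 310),
                     ("eyes", 790, 200), ("mouth", 1020, 280)]

        def entry_time(fixations, aoi):
            """Latency of the first fixation landing on the region of interest."""
            times = [onset for a, onset, _ in fixations if a == aoi]
            return min(times) if times else None

        def fixation_density(fixations, aoi):
            """Share of total fixation time spent on the region."""
            total = sum(dur for _, _, dur in fixations)
            on_aoi = sum(dur for a, _, dur in fixations if a == aoi)
            return on_aoi / total

        def mean_fixation_duration(fixations):
            """Average duration of all fixations, a proxy for processing effort."""
            return sum(dur for _, _, dur in fixations) / len(fixations)

        print(entry_time(fixations, "mouth"),
              round(fixation_density(fixations, "mouth"), 2),
              mean_fixation_duration(fixations))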

  16. Emotional Processing, Recognition, Empathy and Evoked Facial Expression in Eating Disorders: An Experimental Study to Map Deficits in Social Cognition

    PubMed Central

    Cardi, Valentina; Corfield, Freya; Leppanen, Jenni; Rhind, Charlotte; Deriziotis, Stephanie; Hadjimichalis, Alexandra; Hibbs, Rebecca; Micali, Nadia; Treasure, Janet

    2015-01-01

    Background: Difficulties in social cognition have been identified in eating disorders (EDs), but the exact profile of these abnormalities is unclear. The aim of this study is to examine distinct processes of social-cognition in this patient group, including attentional processing and recognition, empathic reaction and evoked facial expression in response to discrete vignettes of others displaying positive (i.e. happiness) or negative (i.e. sadness and anger) emotions. Method: One hundred and thirty-eight female participants were included in the study: 73 healthy controls (HCs) and 65 individuals with an ED (49 with Anorexia Nervosa and 16 with Bulimia Nervosa). Self-report and behavioural measures were used. Results: Participants with EDs did not display specific abnormalities in emotional processing, recognition and empathic response to others' basic discrete emotions. However, they had poorer facial expressivity and a tendency to turn away from emotional displays. Conclusion: Treatments focusing on the development of non-verbal emotional communication skills might be of benefit for patients with EDs. PMID:26252220

  17. Exaggerated perception of facial expressions is increased in individuals with schizotypal traits

    PubMed Central

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2015-01-01

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions. PMID:26135081

  18. Exaggerated perception of facial expressions is increased in individuals with schizotypal traits.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2015-07-02

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions.

  19. Wait, are you sad or angry? Large exposure time differences required for the categorization of facial expressions of emotion

    PubMed Central

    Du, Shichuan; Martinez, Aleix M.

    2013-01-01

    Facial expressions of emotion are essential components of human behavior, yet little is known about the hierarchical organization of their cognitive analysis. We study the minimum exposure time needed to successfully classify the six classical facial expressions of emotion (joy, surprise, sadness, anger, disgust, fear) plus neutral as seen at different image resolutions (240 × 160 to 15 × 10 pixels). Our results suggest a consistent hierarchical analysis of these facial expressions regardless of the resolution of the stimuli. Happiness and surprise can be recognized after very short exposure times (10–20 ms), even at low resolutions. Fear and anger are recognized the slowest (100–250 ms), even in high-resolution images, suggesting a later computation. Sadness and disgust are recognized in between (70–200 ms). The minimum exposure time required for successful classification of each facial expression correlates with the ability of a human subject to identify it correctly at low resolutions. These results suggest a fast, early computation of expressions represented mostly by low spatial frequencies or global configural cues and a later, slower process for those categories requiring a more fine-grained analysis of the image. We also demonstrate that those expressions that are mostly visible in higher-resolution images are not recognized as accurately. We summarize implications for current computational models. PMID:23509409

  20. The role of spatial frequency information in the decoding of facial expressions of pain: a novel hybrid task.

    PubMed

    Wang, Shan; Eccleston, Christopher; Keogh, Edmund

    2017-11-01

    Spatial frequency (SF) information contributes to the recognition of facial expressions, including pain. Low-SF encodes facial configuration and structure and often dominates over high-SF information, which encodes fine details in facial features. This low-SF preference has not been investigated within the context of pain. In this study, we investigated whether perceptual preference differences exist for low-SF and high-SF pain information. A novel hybrid expression paradigm was used in which 2 different expressions, one containing low-SF information and the other high-SF information, were combined in a facial hybrid. Participants were instructed to identify the core expression contained within the hybrid, allowing for the measurement of SF information preference. Three experiments were conducted (46 participants in each) that varied the expressions within the hybrid faces: pain-neutral, pain-fear, and pain-happiness, respectively. In order to measure the temporal aspects of image processing, each hybrid image was presented for 33, 67, 150, and 300 ms. As expected, identification of pain and other expressions was dominated by low-SF information across the 3 experiments. The low-SF preference was largest when the presentation of hybrid faces was brief and reduced as the presentation duration increased. A sex difference was also found in experiment 1. For women, the low-SF preference was dampened by high-SF pain information when viewing low-SF neutral expressions. These results not only confirm the role that SF information has in the recognition of pain in facial expressions but also suggest that in some situations there may be sex differences in how pain is communicated.
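
    A minimal sketch of the hybrid-face construction described above, in which the low spatial frequencies of one expression are combined with the high spatial frequencies of another. The Gaussian cutoff (sigma) and the make_hybrid helper are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def make_hybrid(low_sf_face, high_sf_face, sigma=6.0):
            """Blend the low-SF content of one face with the high-SF content of another."""
            a = low_sf_face.astype(float)
            b = high_sf_face.astype(float)
            low = gaussian_filter(a, sigma)        # low-pass: global configuration
            high = b - gaussian_filter(b, sigma)   # high-pass: fine featural detail
            return np.clip(low + high, 0, 255)     # hybrid image kept in 8-bit range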

  1. Recognizing Facial Expressions Automatically from Video

    NASA Astrophysics Data System (ADS)

    Shan, Caifeng; Braspenning, Ralph

    Facial expressions, resulting from movements of the facial muscles, are changes in the face that reflect a person's internal emotional states, intentions, or social communications. The study of facial expressions has a considerable history. Darwin [22], who argued that all mammals show emotions reliably in their faces, was the first to describe in detail the specific facial expressions associated with emotions in animals and humans. Since then, facial expression analysis has been an area of great research interest for behavioral scientists [27]. Psychological studies [48, 3] suggest that facial expressions, as the main mode of nonverbal communication, play a vital role in human face-to-face communication. For illustration, we show some examples of facial expressions in Fig. 1.

  2. Sad or fearful? The influence of body posture on adults' and children's perception of facial displays of emotion.

    PubMed

    Mondloch, Catherine J

    2012-02-01

    The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception. 2011 Elsevier Inc. All rights reserved.

  3. Affective theory of mind inferences contextually influence the recognition of emotional facial expressions.

    PubMed

    Stewart, Suzanne L K; Schepman, Astrid; Haigh, Matthew; McHugh, Rhian; Stewart, Andrew J

    2018-03-14

    The recognition of emotional facial expressions is often subject to contextual influence, particularly when the face and the context convey similar emotions. We investigated whether spontaneous, incidental affective theory of mind inferences made while reading vignettes describing social situations would produce context effects on the identification of same-valenced emotions (Experiment 1) as well as differently-valenced emotions (Experiment 2) conveyed by subsequently presented faces. Crucially, we found an effect of context on reaction times in both experiments while, in line with previous work, we found evidence for a context effect on accuracy only in Experiment 1. This demonstrates that affective theory of mind inferences made at the pragmatic level of a text can automatically and contextually influence the perceptual processing of emotional facial expressions in a separate task, even when those emotions are of a distinctive valence. Thus, our novel findings suggest that language acts as a contextual influence on the recognition of emotional facial expressions of both same and different valences.

  4. Slowing down presentation of facial movements and vocal sounds enhances facial expression recognition and induces facial-vocal imitation in children with autism.

    PubMed

    Tardif, Carole; Lainé, France; Rodriguez, Mélissa; Gepner, Bruno

    2007-09-01

    This study examined the effects of slowing down presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on CD-Rom, under audio or silent conditions, and under dynamic visual conditions (slowly, very slowly, at normal speed) plus a static control. Overall, children with autism showed lower performance in expression recognition and more induced facial-vocal imitation than controls. In the autistic group, facial expression recognition and induced facial-vocal imitation were significantly enhanced in slow conditions. Findings may give new perspectives for understanding and intervention for verbal and emotional perceptive and communicative impairments in autistic populations.

  5. Automated Video Based Facial Expression Analysis of Neuropsychiatric Disorders

    PubMed Central

    Wang, Peng; Barrett, Frederick; Martin, Elizabeth; Milanova, Marina; Gur, Raquel E.; Gur, Ruben C.; Kohler, Christian; Verma, Ragini

    2008-01-01

    Deficits in emotional expression are prominent in several neuropsychiatric disorders, including schizophrenia. Available clinical facial expression evaluations provide subjective and qualitative measurements, which are based on static 2D images that do not capture the temporal dynamics and subtleties of expression changes. Therefore, there is a need for automated, objective and quantitative measurements of facial expressions captured using videos. This paper presents a computational framework that creates probabilistic expression profiles for video data and can potentially help to automatically quantify emotional expression differences between patients with neuropsychiatric disorders and healthy controls. Our method automatically detects and tracks facial landmarks in videos, and then extracts geometric features to characterize facial expression changes. To analyze temporal facial expression changes, we employ probabilistic classifiers that analyze facial expressions in individual frames, and then propagate the probabilities throughout the video to capture the temporal characteristics of facial expressions. The applications of our method to healthy controls and case studies of patients with schizophrenia and Asperger’s syndrome demonstrate the capability of the video-based expression analysis method in capturing subtleties of facial expression. Such results can pave the way for a video based method for quantitative analysis of facial expressions in clinical research of disorders that cause affective deficits. PMID:18045693
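
    The per-frame classification and temporal propagation pipeline described above can be illustrated with a small sketch. This is not the authors' code: the exponential blending of class probabilities (the temporal_profile helper and its alpha parameter) stands in for whatever propagation scheme the paper actually uses.

        import numpy as np

        def temporal_profile(frame_probs, alpha=0.8):
            """Smooth per-frame expression probabilities (T x K array) over time."""
            profile = np.empty_like(frame_probs, dtype=float)
            state = frame_probs[0].astype(float)
            profile[0] = state / state.sum()
            for t in range(1, len(frame_probs)):
                # Blend accumulated history with the current frame's classifier output.
                state = alpha * state + (1.0 - alpha) * frame_probs[t]
                profile[t] = state / state.sum()   # keep rows normalized as probabilities
            return profile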

  6. Affect of the unconscious: Visually suppressed angry faces modulate our decisions

    PubMed Central

    Pajtas, Petra E.; Mahon, Bradford Z.; Nakayama, Ken; Caramazza, Alfonso

    2016-01-01

    Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item—a Chinese character—that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala. PMID:23224765

  7. Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology.

    PubMed

    Carrier-Toutant, Frédérike; Guay, Samuel; Beaulieu, Christelle; Léveillé, Édith; Turcotte-Giroux, Alexandre; Papineau, Samaël D; Brisson, Benoit; D'Hondt, Fabien; De Beaumont, Louis

    2018-05-06

    Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussion athletes during an EFE identification task. Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of N1 component amplitude after concussion, which affected male athletes. These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFEs in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are affected differently across the sexes. (JINS, 2018, 24, 1-11).
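
    The P1 and N1 measures above are component amplitudes extracted from the averaged ERP within fixed latency windows. A minimal sketch, where the window (here roughly 80-130 ms, a typical P1 range) and the peak-picking rule are illustrative assumptions rather than the authors' exact procedure:

        import numpy as np

        def peak_amplitude(erp, fs, window=(0.08, 0.13)):
            """Peak amplitude of an averaged ERP within a latency window (s after onset)."""
            i0, i1 = int(window[0] * fs), int(window[1] * fs)  # assumes epoch starts at onset
            return float(np.max(erp[i0:i1]))  # use np.min for negative-going components (N1)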

  8. The recognition of facial emotion expressions in Parkinson's disease.

    PubMed

    Assogna, Francesca; Pontieri, Francesco E; Caltagirone, Carlo; Spalletta, Gianfranco

    2008-11-01

    A limited number of studies in Parkinson's disease (PD) suggest a disturbance in the recognition of facial emotion expressions. In particular, disgust recognition impairment has been reported in both unmedicated and medicated PD patients. However, the results remain inconclusive regarding the degree and selectivity of the emotion recognition impairment, and an associated impairment of almost all basic facial emotions in PD has also been described. Few studies have investigated the relationship between emotion recognition and neuropsychiatric or neuropsychological symptoms, with mainly negative results. This inconsistency may stem from many different factors, such as emotion assessment, perception deficits, cognitive impairment, behavioral symptoms, illness severity, and antiparkinsonian therapy. Here we review the clinical characteristics and neural structures involved in the recognition of specific facial emotion expressions, and the plausible role of dopamine transmission and dopamine replacement therapy in these processes. It is clear that future studies should be directed at clarifying all these issues.

  9. Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults

    PubMed Central

    Cooper, Nicholas R.; Simpson, Andrew; Till, Amy; Simmons, Kelly; Puzzo, Ignazio

    2013-01-01

    The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and if so, would this differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism. PMID:23630489

  10. Beta event-related desynchronization as an index of individual differences in processing human facial expression: further investigations of autistic traits in typically developing adults.

    PubMed

    Cooper, Nicholas R; Simpson, Andrew; Till, Amy; Simmons, Kelly; Puzzo, Ignazio

    2013-01-01

    The human mirror neuron system (hMNS) has been associated with various forms of social cognition and affective processing including vicarious experience. It has also been proposed that a faulty hMNS may underlie some of the deficits seen in the autism spectrum disorders (ASDs). In the present study we set out to investigate whether emotional facial expressions could modulate a putative EEG index of hMNS activation (mu suppression) and if so, would this differ according to the individual level of autistic traits [high versus low Autism Spectrum Quotient (AQ) score]. Participants were presented with 3 s films of actors opening and closing their hands (classic hMNS mu-suppression protocol) while simultaneously wearing happy, angry, or neutral expressions. Mu-suppression was measured in the alpha and low beta bands. The low AQ group displayed greater low beta event-related desynchronization (ERD) to both angry and neutral expressions. The high AQ group displayed greater low beta ERD to angry than to happy expressions. There was also significantly more low beta ERD to happy faces for the low than for the high AQ group. In conclusion, an interesting interaction between AQ group and emotional expression revealed that hMNS activation can be modulated by emotional facial expressions and that this is differentiated according to individual differences in the level of autistic traits. The EEG index of hMNS activation (mu suppression) seems to be a sensitive measure of the variability in facial processing in typically developing individuals with high and low self-reported traits of autism.
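
    Mu suppression of the kind measured above is conventionally quantified as event-related (de)synchronization: band power during the observation period expressed as a percentage change from a pre-stimulus baseline. The sketch below illustrates that standard measure; the band-pass filtering step is assumed to have been done upstream, and the sign convention (positive = desynchronization) is chosen here for illustration.

        import numpy as np

        def erd_percent(event_band, baseline_band):
            """ERD% from band-passed signals: relative power drop vs. baseline."""
            p_base = np.mean(baseline_band ** 2)        # baseline band power
            p_event = np.mean(event_band ** 2)          # event-period band power
            return (p_base - p_event) / p_base * 100.0  # positive = suppression (ERD)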

  11. Objectifying Facial Expressivity Assessment of Parkinson's Patients: Preliminary Study

    PubMed Central

    Patsis, Georgios; Jiang, Dongmei; Sahli, Hichem; Kerckhofs, Eric; Vandekerckhove, Marie

    2014-01-01

    Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as "facial masking," a symptom in which the facial muscles become rigid. To improve clinical assessment of facial expressivity in PD, this work attempts to quantify dynamic facial expressivity (facial activity) in PD by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To elicit spontaneous facial expressions resembling those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were induced using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports; self-reported disgust was significantly stronger than the other induced emotions. We therefore focused our analysis on the data recorded while participants watched the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences between PD patients at different stages of disease progression were also observed. PMID:25478003

  12. Neural measures of the role of affective prosody in empathy for pain.

    PubMed

    Meconi, Federica; Doro, Mattia; Lomoriello, Arianna Schiano; Mastrella, Giulia; Sessa, Paola

    2018-01-10

    Emotional communication often requires the integration of affective prosodic and semantic components from speech with the speaker's facial expression. Affective prosody may have a special role by virtue of its dual nature: pre-verbal on one side and accompanying semantic content on the other. This consideration led us to hypothesize that it could act transversely, encompassing a wide temporal window involving the processing of facial expressions and the semantic content expressed by the speaker. This would allow powerful communication in contexts of potential urgency, such as witnessing the speaker's physical pain. Seventeen participants were shown faces preceded by verbal reports of pain. Facial expressions, intelligibility of the semantic content of the report (i.e., participants' mother tongue vs. a fictional language), and the affective prosody of the report (neutral vs. painful) were manipulated. We monitored event-related potentials (ERPs) time-locked to the onset of the faces as a function of the semantic content intelligibility and affective prosody of the verbal reports. We found that affective prosody may interact with facial expressions and semantic content in two successive temporal windows, supporting its role as a transverse communication cue.

  13. Emotion processing deficits in alexithymia and response to a depth of processing intervention.

    PubMed

    Constantinou, Elena; Panayiotou, Georgia; Theodorou, Marios

    2014-12-01

    Findings on alexithymic emotion difficulties have been inconsistent. We examined potential differences between alexithymic and control participants in general arousal, reactivity, facial and subjective expression, emotion labeling, and covariation between emotion response systems. A depth of processing intervention was introduced. Fifty-four participants (27 alexithymic), selected using the Toronto Alexithymia Scale-20, completed an imagery experiment (imagining joy, fear and neutral scripts), under instructions for shallow or deep emotion processing. Heart rate, skin conductance, facial electromyography and startle reflex were recorded along with subjective ratings. Results indicated hypo-reactivity to emotion among high alexithymic individuals, smaller and slower startle responses, and low covariation between physiology and self-report. No deficits in facial expression, labeling and emotion ratings were identified. Deep processing was associated with increased physiological reactivity and lower perceived dominance and arousal in high alexithymia. Findings suggest a tendency for avoidance of intense, unpleasant emotions and less defensive action preparation in alexithymia. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Perceiving the evil eye: Investigating hostile interpretation of ambiguous facial emotional expression in violent and non-violent offenders.

    PubMed

    Kuin, Niki C; Masthoff, Erik D M; Munafò, Marcus R; Penton-Voak, Ian S

    2017-01-01

    Research into the causal and perpetuating factors influencing aggression has partly focused on the general tendency of aggression-prone individuals to infer hostile intent in others, even in ambiguous circumstances. This is referred to as the 'hostile interpretation bias'. Whether this hostile interpretation bias also exists in basal information processing, such as perception of facial emotion, is not yet known, especially with respect to the perception of ambiguous expressions. In addition, little is known about how this potential bias in facial emotion perception is related to specific characteristics of aggression. In the present study, conducted in a penitentiary setting with detained male adults, we investigated if violent offenders (n = 71) show a stronger tendency to interpret ambiguous facial expressions on a computer task as angry rather than happy, compared to non-violent offenders (n = 14) and to a control group of healthy volunteers (n = 32). We also investigated if hostile perception of facial expressions is related to specific characteristics of aggression, such as proactive and reactive aggression. No clear statistical evidence was found that violent offenders perceived facial emotional expressions as more angry than non-violent offenders or healthy volunteers. A regression analysis in the violent offender group showed that only age and a self-report measure of hostility predicted outcome on the emotion perception task. Other traits, such as psychopathic traits, intelligence, attention and a tendency to jump to conclusions were not associated with interpretation of anger in facial emotional expressions. We discuss the possible impact of the study design and population studied on our results, as well as implications for future studies.

  15. Perceiving the evil eye: Investigating hostile interpretation of ambiguous facial emotional expression in violent and non-violent offenders

    PubMed Central

    Masthoff, Erik D. M.; Munafò, Marcus R.; Penton-Voak, Ian S.

    2017-01-01

    Research into the causal and perpetuating factors influencing aggression has partly focused on the general tendency of aggression-prone individuals to infer hostile intent in others, even in ambiguous circumstances. This is referred to as the ‘hostile interpretation bias’. Whether this hostile interpretation bias also exists in basal information processing, such as perception of facial emotion, is not yet known, especially with respect to the perception of ambiguous expressions. In addition, little is known about how this potential bias in facial emotion perception is related to specific characteristics of aggression. In the present study, conducted in a penitentiary setting with detained male adults, we investigated if violent offenders (n = 71) show a stronger tendency to interpret ambiguous facial expressions on a computer task as angry rather than happy, compared to non-violent offenders (n = 14) and to a control group of healthy volunteers (n = 32). We also investigated if hostile perception of facial expressions is related to specific characteristics of aggression, such as proactive and reactive aggression. No clear statistical evidence was found that violent offenders perceived facial emotional expressions as more angry than non-violent offenders or healthy volunteers. A regression analysis in the violent offender group showed that only age and a self-report measure of hostility predicted outcome on the emotion perception task. Other traits, such as psychopathic traits, intelligence, attention and a tendency to jump to conclusions were not associated with interpretation of anger in facial emotional expressions. We discuss the possible impact of the study design and population studied on our results, as well as implications for future studies. PMID:29190802

  16. Following the time course of face gender and expression processing: a task-dependent ERP study.

    PubMed

    Valdés-Conroy, Berenice; Aguado, Luis; Fernández-Cahill, María; Romero-Ferreiro, Verónica; Diéguez-Risco, Teresa

    2014-05-01

    The effects of task demands and the interaction between gender and expression in face perception were studied using event-related potentials (ERPs). Participants performed three different tasks with male and female faces that were emotionally inexpressive or that showed happy or angry expressions. In two of the tasks (gender and expression categorization) facial properties were task-relevant, while in a third task (symbol discrimination) facial information was irrelevant. Effects of expression were observed on the visual P100 component under all task conditions, suggesting the operation of an automatic process that is not influenced by task demands. The earliest interaction between expression and gender was observed later, in the face-sensitive N170 component. This component showed differential modulations by specific combinations of gender and expression (e.g., angry male vs. angry female faces). Main effects of expression and task were observed in a later occipito-temporal component peaking around 230 ms post-stimulus onset (EPN, or early posterior negativity): amplitudes were less positive in the presence of angry faces and during performance of the gender and expression tasks. Finally, task demands also modulated a positive component peaking around 400 ms (LPC, or late positive complex), which showed enhanced amplitude for the gender task. The pattern of results obtained here adds new evidence about the sequence of operations involved in face processing and the interaction of facial properties (gender and expression) in response to different task demands. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Role of temporal processing stages by inferior temporal neurons in facial recognition.

    PubMed

    Sugase-Miyamoto, Yasuko; Matsumoto, Narihisa; Kawano, Kenji

    2011-01-01

    In this review, we focus on the role of temporal stages of encoded facial information in the visual system, which might enable the efficient determination of species, identity, and expression. Facial recognition is an important function of our brain and is known to be processed in the ventral visual pathway, where visual signals are processed through areas V1, V2, V4, and the inferior temporal (IT) cortex. In the IT cortex, neurons show selective responses to complex visual images such as faces, and at each stage along the pathway the stimulus selectivity of the neural responses becomes sharper, particularly in the later portion of the responses. In the IT cortex of the monkey, facial information is represented by different temporal stages of neural responses, as shown in our previous study: the initial transient response of face-responsive neurons represents information about global categories, i.e., human vs. monkey vs. simple shapes, whilst the later portion of these responses represents information about detailed facial categories, i.e., expression and/or identity. This suggests that the temporal stages of the neuronal firing pattern play an important role in the coding of visual stimuli, including faces. This type of coding may be a plausible mechanism underlying the temporal dynamics of recognition, including the process of detection/categorization followed by the identification of objects. Recent single-unit studies in monkeys have also provided evidence consistent with the important role of the temporal stages of encoded facial information. For example, view-invariant facial identity information is represented in the response at a later period within a region of face-selective neurons. Consistent with these findings, temporally modulated neural activity has also been observed in human studies. These results suggest a close correlation between the temporal processing stages of facial information by IT neurons and the temporal dynamics of face recognition.

  18. Role of Temporal Processing Stages by Inferior Temporal Neurons in Facial Recognition

    PubMed Central

    Sugase-Miyamoto, Yasuko; Matsumoto, Narihisa; Kawano, Kenji

    2011-01-01

    In this review, we focus on the role of temporal stages of encoded facial information in the visual system, which might enable the efficient determination of species, identity, and expression. Facial recognition is an important function of our brain and is known to be processed in the ventral visual pathway, where visual signals are processed through areas V1, V2, V4, and the inferior temporal (IT) cortex. In the IT cortex, neurons show selective responses to complex visual images such as faces, and at each stage along the pathway the stimulus selectivity of the neural responses becomes sharper, particularly in the later portion of the responses. In the IT cortex of the monkey, facial information is represented by different temporal stages of neural responses, as shown in our previous study: the initial transient response of face-responsive neurons represents information about global categories, i.e., human vs. monkey vs. simple shapes, whilst the later portion of these responses represents information about detailed facial categories, i.e., expression and/or identity. This suggests that the temporal stages of the neuronal firing pattern play an important role in the coding of visual stimuli, including faces. This type of coding may be a plausible mechanism underlying the temporal dynamics of recognition, including the process of detection/categorization followed by the identification of objects. Recent single-unit studies in monkeys have also provided evidence consistent with the important role of the temporal stages of encoded facial information. For example, view-invariant facial identity information is represented in the response at a later period within a region of face-selective neurons. Consistent with these findings, temporally modulated neural activity has also been observed in human studies. These results suggest a close correlation between the temporal processing stages of facial information by IT neurons and the temporal dynamics of face recognition. PMID:21734904

  19. Extraction and representation of common feature from uncertain facial expressions with cloud model.

    PubMed

    Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing

    2017-12-01

    Human facial expressions are a key ingredient for conveying an individual's innate emotions in communication. However, the variation of facial expressions affects the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is established under cloud generators. With the forward cloud generator, arbitrarily many facial expression images can be regenerated to visually represent the three extracted features, each of which plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression database. Three common features are extracted from seven facial expression images. Finally, conclusions and remarks are given.
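
    The forward cloud generator invoked above is a standard construct in cloud-model theory: from three numerical characteristics, expectation (Ex), entropy (En), and hyper-entropy (He), it generates "cloud drops" together with their membership degrees. A sketch of the textbook algorithm follows; it is not the authors' implementation, and the forward_cloud name is ours.

        import numpy as np

        def forward_cloud(ex, en, he, n=1000, seed=0):
            """Generate n cloud drops (position, membership) from (Ex, En, He)."""
            rng = np.random.default_rng(seed)
            en_prime = rng.normal(en, he, n)                   # per-drop entropy sample
            x = rng.normal(ex, np.abs(en_prime))               # drop positions
            mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))  # certainty degrees
            return x, mu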

  20. Hemodynamic response of children with attention-deficit and hyperactive disorder (ADHD) to emotional facial expressions.

    PubMed

    Ichikawa, Hiroko; Nakato, Emi; Kanazawa, So; Shimamura, Keiichi; Sakuta, Yuiko; Sakuta, Ryoichi; Yamaguchi, Masami K; Kakigi, Ryusuke

    2014-10-01

    Children with attention-deficit/hyperactivity disorder (ADHD) have difficulty recognizing facial expressions. They identify angry expressions less accurately than typically developing (TD) children, yet little is known about the atypical neural basis of their recognition of facial expressions. Here, we used near-infrared spectroscopy (NIRS) to examine the distinctive cerebral hemodynamics of ADHD and TD children while they viewed happy and angry expressions. We measured the hemodynamic responses of 13 ADHD boys and 13 TD boys to happy and angry expressions at their bilateral temporal areas, which are sensitive to face processing. The ADHD children showed an increased concentration of oxy-Hb for happy faces but not for angry faces, while TD children showed increased oxy-Hb for both. Moreover, the individual peak latency of the hemodynamic response in the right temporal area showed significantly greater variance in the ADHD group than in the TD group. Such atypical brain activity in ADHD boys may relate to their preserved ability to recognize a happy expression and their difficulty recognizing an angry expression. We demonstrated for the first time that NIRS can be used to detect atypical hemodynamic responses to facial expressions in ADHD children. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Spontaneous Facial Mimicry in Response to Dynamic Facial Expressions

    ERIC Educational Resources Information Center

    Sato, Wataru; Yoshikawa, Sakiko

    2007-01-01

    Based on previous neuroscientific evidence indicating activation of the mirror neuron system in response to dynamic facial actions, we hypothesized that facial mimicry would occur while subjects viewed dynamic facial expressions. To test this hypothesis, dynamic/static facial expressions of anger/happiness were presented using computer-morphing…

  2. Dynamic facial expressions evoke distinct activation in the face perception network: a connectivity analysis study.

    PubMed

    Foley, Elaine; Rippon, Gina; Thai, Ngoc Jade; Longe, Olivia; Senior, Carl

    2012-02-01

    Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Science, 4, 223-233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, which is identified as insensitive to motion or affect but sensitive to the visual stimulus, the STS, identified as specifically sensitive to motion, and the amygdala, recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediate the perception of different dynamic facial expressions.

  3. Neural signatures of conscious and unconscious emotional face processing in human infants.

    PubMed

    Jessen, Sarah; Grossmann, Tobias

    2015-03-01

    Human adults can process emotional information both with and without conscious awareness, and it has been suggested that the two processes rely on partly distinct brain mechanisms. However, the developmental origins of these brain processes are unknown. In the present event-related brain potential (ERP) study, we examined the brain responses of 7-month-old infants in response to subliminally (50 and 100 msec) and supraliminally (500 msec) presented happy and fearful facial expressions. Our results revealed that infants' brain responses (Pb and Nc) over central electrodes distinguished between emotions irrespective of stimulus duration, whereas the discrimination between emotions at occipital electrodes (N290 and P400) only occurred when faces were presented supraliminally (above threshold). This suggests that early in development the human brain not only discriminates between happy and fearful facial expressions irrespective of conscious perception, but also that, similar to adults, supraliminal and subliminal emotion processing relies on distinct neural processes. Our data further suggest that the processing of emotional facial expressions differs across infants depending on their behaviorally shown perceptual sensitivity. The current ERP findings suggest that distinct brain processes underpinning conscious and unconscious emotion perception emerge early in ontogeny and can therefore be seen as a key feature of human social functioning. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Culture shapes 7-month-olds' perceptual strategies in discriminating facial expressions of emotion.

    PubMed

    Geangu, Elena; Ichikawa, Hiroko; Lao, Junpeng; Kanazawa, So; Yamaguchi, Masami K; Caldara, Roberto; Turati, Chiara

    2016-07-25

    Emotional facial expressions are thought to have evolved because they play a crucial role in species' survival. From infancy, humans develop dedicated neural circuits [1] to exhibit and recognize a variety of facial expressions [2]. But there is increasing evidence that culture specifies when and how certain emotions can be expressed - social norms - and that the mature perceptual mechanisms used to transmit and decode the visual information from emotional signals differ between Western and Eastern adults [3-5]. Specifically, the mouth is more informative for transmitting emotional signals in Westerners and the eye region for Easterners [4], generating culture-specific fixation biases towards these features [5]. During development, it is recognized that cultural differences can be observed at the level of emotional reactivity and regulation [6], and to the culturally dominant modes of attention [7]. Nonetheless, to our knowledge no study has explored whether culture shapes the processing of facial emotional signals early in development. The data we report here show that, by 7 months, infants from both cultures visually discriminate facial expressions of emotion by relying on culturally distinct fixation strategies, resembling those used by the adults from the environment in which they develop [5]. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Automatic facial mimicry in response to dynamic emotional stimuli in five-month-old infants.

    PubMed

    Isomura, Tomoko; Nakano, Tamami

    2016-12-14

    Human adults automatically mimic others' emotional expressions, which is believed to contribute to sharing emotions with others. Although this behaviour appears fundamental to social reciprocity, little is known about its developmental process. Therefore, we examined whether infants show automatic facial mimicry in response to others' emotional expressions. Facial electromyographic activity over the corrugator supercilii (brow) and zygomaticus major (cheek) of four- to five-month-old infants was measured while they viewed dynamic clips presenting audiovisual, visual and auditory emotions. The audiovisual bimodal emotion stimuli were a display of a laughing/crying facial expression with an emotionally congruent vocalization, whereas the visual/auditory unimodal emotion stimuli displayed those emotional faces/vocalizations paired with a neutral vocalization/face, respectively. Increased activation of the corrugator supercilii muscle in response to audiovisual cries and the zygomaticus major in response to audiovisual laughter were observed between 500 and 1000 ms after stimulus onset, which clearly suggests rapid facial mimicry. By contrast, both visual and auditory unimodal emotion stimuli did not activate the infants' corresponding muscles. These results revealed that automatic facial mimicry is present as early as five months of age, when multimodal emotional information is present. © 2016 The Author(s).
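
    The windowed EMG measure implied above (mean muscle activity 500-1000 ms after stimulus onset, relative to a pre-stimulus baseline) can be sketched as follows. The sampling rate handling, baseline window, and rectified-mean estimator are illustrative assumptions rather than the authors' exact preprocessing.

        import numpy as np

        def windowed_emg(signal, fs, onset_s, win=(0.5, 1.0), base=(-0.2, 0.0)):
            """Baseline-corrected mean rectified EMG in a post-stimulus window."""
            def mean_abs(t0, t1):
                i0 = int((onset_s + t0) * fs)   # assumes the recording starts at t = 0
                i1 = int((onset_s + t1) * fs)
                return np.mean(np.abs(signal[i0:i1]))
            return mean_abs(*win) - mean_abs(*base)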

  6. A Quantitative Assessment of Lip Movements in Different Facial Expressions Through 3-Dimensional on 3-Dimensional Superimposition: A Cross-Sectional Study.

    PubMed

    Gibelli, Daniele; Codari, Marina; Pucciarelli, Valentina; Dolci, Claudia; Sforza, Chiarella

    2017-11-23

    The quantitative assessment of facial modifications from mimicry is of particular interest for the rehabilitation of patients who can no longer produce facial expressions. This study investigated a novel application of 3-dimensional on 3-dimensional superimposition for facial mimicry. This cross-sectional study was based on 10 men aged 30 to 40 years who underwent stereophotogrammetry for neutral, happy, sad, and angry expressions. Registration of facial expressions on the neutral expression was performed. Root mean square (RMS) point-to-point distance in the labial area was calculated between each facial expression and the neutral one and was considered the main parameter for assessing facial modifications. In addition, effect size (Cohen d) was calculated to assess the effects of labial movements in relation to facial modifications. All participants were free from facial deformities, pathologies, or trauma that could affect facial mimicry. RMS values of facial areas differed significantly among facial expressions (P = .0004 by Friedman test). The widest modifications of the lips were observed in happy expressions (RMS, 4.06 mm; standard deviation [SD], 1.14 mm), with a statistically relevant difference compared with the sad (RMS, 1.42 mm; SD, 1.15 mm) and angry (RMS, 0.76 mm; SD, 0.45 mm) expressions. The effect size of labial versus total face movements was limited for happy and sad expressions and large for the angry expression. This study found that a happy expression produces wider modifications of the lips than the other facial expressions and suggests a novel procedure for assessing regional changes from mimicry. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
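
    Once the expression scan has been registered to the neutral scan, the RMS point-to-point distance used above reduces to a single formula. A minimal sketch, assuming two (N x 3) arrays of corresponding surface points in millimetres:

        import numpy as np

        def rms_distance(expr_pts, neutral_pts):
            """Root mean square of Euclidean distances between matched 3D points."""
            d = np.linalg.norm(expr_pts - neutral_pts, axis=1)  # per-point distance (mm)
            return float(np.sqrt(np.mean(d ** 2)))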

  7. Diminished neural and cognitive responses to facial expressions of disgust in patients with psoriasis: a functional magnetic resonance imaging study.

    PubMed

    Kleyn, C Elise; McKie, Shane; Ross, Andrew R; Montaldi, Daniela; Gregory, Lloyd J; Elliott, Rebecca; Isaacs, Clare L; Anderson, Ian M; Richards, Helen L; Deakin, J F William; Fortune, Donal G; Griffiths, Christopher E M

    2009-11-01

    Psoriasis produces significant psychosocial disability; however, little is understood about the neurocognitive mechanisms that mediate the adverse consequences of the social stigma associated with visible skin lesions, such as disgusted facial expressions of others. Both the feeling of disgust and the observation of disgust in others are known to activate the insula cortex. We investigated whether the social impact of psoriasis is associated with altered cognitive processing of disgust using (i) a covert recognition of faces task conducted using functional magnetic resonance imaging (fMRI) and (ii) the facial expression recognition task (FERT), a decision-making task, conducted outside the scanner to assess the ability to recognize overtly different intensities of disgust. Thirteen right-handed male patients with psoriasis and 13 age-matched male controls were included. In the fMRI study, psoriasis patients had significantly (P<0.005) smaller signal responses to disgusted faces in the bilateral insular cortex compared with healthy controls. These data were corroborated by FERT, in that patients were less able than controls to identify all intensities of disgust tested. We hypothesize that patients with psoriasis, in this case male patients, develop a coping mechanism to protect them from stressful emotional responses by blocking the processing of disgusted facial expressions.

  8. Hierarchical Encoding of Social Cues in Primate Inferior Temporal Cortex

    PubMed Central

    Morin, Elyse L.; Hadj-Bouziane, Fadila; Stokes, Mark; Ungerleider, Leslie G.; Bell, Andrew H.

    2015-01-01

    Faces convey information about identity and emotional state, both of which are important for our social interactions. Models of face processing propose that changeable versus invariant aspects of a face, specifically facial expression/gaze direction versus facial identity, are coded by distinct neural pathways and yet neurophysiological data supporting this separation are incomplete. We recorded activity from neurons along the inferior bank of the superior temporal sulcus (STS), while monkeys viewed images of conspecific faces and non-face control stimuli. Eight monkey identities were used, each presented with 3 different facial expressions (neutral, fear grin, and threat). All facial expressions were displayed with both a direct and averted gaze. In the posterior STS, we found that about one-quarter of face-responsive neurons are sensitive to social cues, the majority of which being sensitive to only one of these cues. In contrast, in anterior STS, not only did the proportion of neurons sensitive to social cues increase, but so too did the proportion of neurons sensitive to conjunctions of identity with either gaze direction or expression. These data support a convergence of signals related to faces as one moves anteriorly along the inferior bank of the STS, which forms a fundamental part of the face-processing network. PMID:24836688

  9. Genetic correlations between wellbeing, depression and anxiety symptoms and behavioral responses to the emotional faces task in healthy twins.

    PubMed

    Routledge, Kylie M; Williams, Leanne M; Harris, Anthony W F; Schofield, Peter R; Clark, C Richard; Gatt, Justine M

    2018-06-01

    Currently there is a very limited understanding of how mental wellbeing versus anxiety and depression symptoms are associated with emotion processing behaviour. For the first time, we examined these associations using a behavioural emotion task of positive and negative facial expressions in 1668 healthy adult twins. Linear mixed model results suggested that faster reaction times to happy facial expressions were associated with higher wellbeing scores, and slower reaction times with higher depression and anxiety scores. Multivariate twin modelling identified a significant genetic correlation between depression and anxiety symptoms and reaction time to happy facial expressions, in the absence of any significant correlations with wellbeing. We also found a significant negative phenotypic relationship between depression and anxiety symptoms and accuracy for identifying neutral emotions, although neither the genetic nor the environmental correlations were significant in the multivariate model. Overall, the phenotypic relationships between speed of identifying happy facial expressions and wellbeing on the one hand, versus depression and anxiety symptoms on the other, were in opposing directions. Twin modelling revealed a small common genetic correlation between response to happy faces and depression and anxiety symptoms alone, suggesting that wellbeing and depression and anxiety symptoms show largely independent relationships with emotion processing at the behavioural level. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    PubMed

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  11. Beyond Face Value: Does Involuntary Emotional Anticipation Shape the Perception of Dynamic Facial Expressions?

    PubMed Central

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent’s mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically-valid, facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent ‘emotional anticipation’, i.e. the involuntary anticipation of the other’s emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor’s identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases but in opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic, facial expressions, perceptual distortions occur that reflect ‘emotional anticipation’ (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding. PMID:23409112

  12. Lateralization for dynamic facial expressions in human superior temporal sulcus.

    PubMed

    De Winter, François-Laurent; Zhu, Qi; Van den Stock, Jan; Nelissen, Koen; Peeters, Ronald; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu

    2015-02-01

    Most face processing studies in humans show stronger activation in the right than in the left hemisphere. The evidence is largely based on studies with static stimuli focusing on the fusiform face area (FFA); hence, the pattern of lateralization for dynamic faces is less clear. Furthermore, it is unclear whether this property is common to human and non-human primates, reflecting predisposing processing strategies in the right hemisphere, or whether left-sided specialization for language in humans could alternatively be the driving force behind this phenomenon. We aimed to address both issues by studying lateralization for dynamic facial expressions in monkeys and humans. Therefore, we conducted an event-related fMRI experiment in three macaques and twenty right-handed humans. We presented human and monkey dynamic facial expressions (chewing and fear) as well as scrambled versions to both species. We studied lateralization in independently defined face-responsive and face-selective regions by calculating a weighted lateralization index (LIwm) using a bootstrapping method. In order to examine whether lateralization in humans is related to language, we performed a separate fMRI experiment in ten human volunteers that included a 'speech' expression (a one-syllable non-word) and its scrambled version. Both within face-responsive and face-selective regions, we found consistent lateralization for dynamic faces (chewing and fear) versus scrambled versions in the right human posterior superior temporal sulcus (pSTS), but not in the FFA nor in ventral temporal cortex. Conversely, in monkeys no consistent pattern of lateralization for dynamic facial expressions was observed. Finally, LIwms based on the contrast between different types of dynamic facial expressions (relative to scrambled versions) revealed left-sided lateralization in human pSTS for speech-related expressions compared with chewing and emotional expressions. To conclude, we found consistent laterality effects in human posterior STS but not in the visual cortex of monkeys. Based on our results, it is tempting to speculate that lateralization for dynamic face processing in humans may be driven by left-hemispheric language specialization, which may not yet have been present in the common ancestor of humans and macaque monkeys. Copyright © 2014 Elsevier Inc. All rights reserved.
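
    Lateralization indices of the kind used above generally take the form LI = (R - L) / (R + L), with bootstrapping over voxels to stabilize the estimate. The sketch below shows that generic form only; the paper's specific weighting scheme (LIwm) is not reproduced here, and positive activation values are assumed so the denominator stays well-behaved.

        import numpy as np

        def bootstrap_li(left_vox, right_vox, n_boot=1000, seed=0):
            """Mean (R - L)/(R + L) over bootstrap resamples of voxel activations."""
            rng = np.random.default_rng(seed)
            lis = []
            for _ in range(n_boot):
                l = rng.choice(left_vox, size=left_vox.size, replace=True).sum()
                r = rng.choice(right_vox, size=right_vox.size, replace=True).sum()
                lis.append((r - l) / (r + l))
            return float(np.mean(lis))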

  13. Behavioural responses to facial and postural expressions of emotion: An interpersonal circumplex approach.

    PubMed

    Aan Het Rot, Marije; Enea, Violeta; Dafinoiu, Ion; Iancu, Sorina; Taftă, Steluţa A; Bărbuşelu, Mariana

    2017-11-01

    While the recognition of emotional expressions has been extensively studied, the behavioural response to these expressions has not. In the interpersonal circumplex, behaviour is defined in terms of communion and agency. In this study, we examined behavioural responses to both facial and postural expressions of emotion. We presented 101 Romanian students with facial and postural stimuli involving individuals ('targets') expressing happiness, sadness, anger, or fear. Using an interpersonal grid, participants simultaneously indicated how communal (i.e., quarrelsome or agreeable) and agentic (i.e., dominant or submissive) they would be towards people displaying these expressions. Participants were agreeable-dominant towards targets showing happy facial expressions and primarily quarrelsome towards targets with angry or fearful facial expressions. Responses to targets showing sad facial expressions were neutral on both dimensions of interpersonal behaviour. Postural versus facial expressions of happiness and anger elicited similar behavioural responses. Participants responded in a quarrelsome-submissive way to fearful postural expressions and in an agreeable way to sad postural expressions. Behavioural responses to the various facial expressions were largely comparable to those previously observed in Dutch students. Observed differences may be explained from participants' cultural background. Responses to the postural expressions largely matched responses to the facial expressions. © 2017 The British Psychological Society.

  14. Positive facial expressions during retrieval of self-defining memories.

    PubMed

    Gandolphe, Marie Charlotte; Nandrino, Jean Louis; Delelis, Gérald; Ducro, Claire; Lavallee, Audrey; Saloppe, Xavier; Moustafa, Ahmed A; El Haj, Mohamad

    2017-11-14

    In this study, we investigated, for the first time, facial expressions during the retrieval of self-defining memories (i.e., vivid and emotionally intense memories of enduring concerns or unresolved conflicts). Participants self-rated the emotional valence of their self-defining memories, and autobiographical retrieval was analyzed with facial analysis software. This software (FaceReader) synthesizes facial expression information (i.e., from the cheek, lip, and eyebrow muscles) to describe and categorize facial expressions (i.e., neutral, happy, sad, surprised, angry, scared, and disgusted facial expressions). We found that participants showed more emotional than neutral facial expressions during the retrieval of self-defining memories. We also found that participants showed more positive than negative facial expressions during the retrieval of self-defining memories. Interestingly, participants attributed positive valence to the retrieved memories. These findings are the first to demonstrate the consistency between facial expressions and the emotional subjective experience of self-defining memories. They provide valuable physiological information about the emotional experience of the past.

  15. Forming Facial Expressions Influences Assessment of Others' Dominance but Not Trustworthiness.

    PubMed

    Ueda, Yoshiyuki; Nagoya, Kie; Yoshikawa, Sakiko; Nomura, Michio

    2017-01-01

    Forming specific facial expressions influences emotions and perception. This calls for a reconsideration of studies in which observers with neutral expressions inferred personal traits from the facial expressions of others. In the present study, participants were asked to make happy, neutral, and disgusted facial expressions: for "happy," they held a wooden chopstick in their molars to form a smile; for "neutral," they clasped the chopstick between their lips, making no expression; for "disgusted," they put the chopstick between their upper lip and nose and knit their brows in a scowl. However, they were not asked to intentionally change their emotional state. Observers judged happy expression images as more trustworthy, competent, warm, friendly, and distinctive than disgusted expression images, regardless of the observers' own facial expression. Observers judged disgusted expression images as more dominant than happy expression images. However, observers expressing disgust overestimated dominance in the observed disgusted expression images and underestimated dominance in the happy expression images. In contrast, observers with happy facial forms gave attenuated dominance ratings for disgusted expression images. These results suggest that dominance inferred from facial expressions is unstable and is influenced not only by the observed facial expression, but also by the observers' own physiological states.

  16. Impaired Overt Facial Mimicry in Response to Dynamic Facial Expressions in High-Functioning Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Yoshimura, Sayaka; Sato, Wataru; Uono, Shota; Toichi, Motomi

    2015-01-01

    Previous electromyographic studies have reported that individuals with autism spectrum disorders (ASD) exhibited atypical patterns of facial muscle activity in response to facial expression stimuli. However, whether such activity is expressed in visible facial mimicry remains unknown. To investigate this issue, we videotaped facial responses in…

  17. Neural correlates of the perception of dynamic versus static facial expressions of emotion.

    PubMed

    Kessler, Henrik; Doyen-Waldecker, Cornelia; Hofer, Christian; Hoffmann, Holger; Traue, Harald C; Abler, Birgit

    2011-04-20

    This study investigated brain areas involved in the perception of dynamic facial expressions of emotion. A group of 30 healthy subjects was measured with fMRI while passively viewing prototypical facial expressions of fear, disgust, sadness and happiness. Using morphing techniques, all faces were displayed as still images and also dynamically as film clips with the expressions evolving from neutral to emotional. Irrespective of the specific emotion, dynamic stimuli selectively activated the bilateral superior temporal sulcus, visual area V5, fusiform gyrus, thalamus and other frontal and parietal areas. Interaction effects of emotion and mode of presentation (static/dynamic) were found only for the expression of happiness, where static faces evoked greater activity in the medial prefrontal cortex. Our results confirm previous findings on the neural correlates of the perception of dynamic facial expressions and are in line with studies showing the importance of the superior temporal sulcus and V5 in the perception of biological motion. The differential activation in the fusiform gyrus for dynamic stimuli stands in contrast to classical models of face perception but is consistent with newer findings arguing for a more general role of the fusiform gyrus in the processing of socially relevant stimuli.

  18. Face Generation Using Emotional Regions for Sensibility Robot

    NASA Astrophysics Data System (ADS)

    Gotoh, Minori; Kanoh, Masayoshi; Kato, Shohei; Kunitachi, Tsutomu; Itoh, Hidenori

    We believe that psychological interaction is necessary for smooth communication between robots and people, and facial expressions are one important channel for such interaction because they convey true emotions and feelings. The “Ifbot” robot communicates with people by considering its own “emotions” and has many facial expressions for enjoyable communication. We developed a method for generating facial expressions based on human subjective judgements that map Ifbot's facial expressions to its emotions. We first created Ifbot's emotional space for mapping its facial expressions by applying a five-layer auto-associative neural network, as sketched below. We then subjectively evaluated the emotional space, created emotional regions based on the results, and generated emotive facial expressions using these regions.
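
    As an illustration of the five-layer auto-associative network mentioned above, the PyTorch sketch below compresses facial expression parameter vectors through a low-dimensional bottleneck whose activations serve as coordinates in an emotional space. The layer sizes, the 2-D bottleneck, and the training data are illustrative assumptions; the abstract does not report them.

    ```python
    import torch
    import torch.nn as nn

    class AutoAssociator(nn.Module):
        """Five-layer auto-associative network (input, three hidden layers,
        output) with a low-dimensional bottleneck. After reconstruction
        training, the bottleneck activations give each facial expression a
        position in the "emotional space"."""
        def __init__(self, n_params=30, n_hidden=12, n_bottleneck=2):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(n_params, n_hidden), nn.Tanh(),
                nn.Linear(n_hidden, n_bottleneck), nn.Tanh(),
            )
            self.decoder = nn.Sequential(
                nn.Linear(n_bottleneck, n_hidden), nn.Tanh(),
                nn.Linear(n_hidden, n_params),
            )

        def forward(self, x):
            z = self.encoder(x)          # coordinates in the emotional space
            return self.decoder(z), z

    model = AutoAssociator()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    faces = torch.rand(200, 30)          # hypothetical expression parameter vectors
    for _ in range(500):                 # reconstruction (auto-associative) training
        recon, _ = model(faces)
        loss = nn.functional.mse_loss(recon, faces)
        opt.zero_grad(); loss.backward(); opt.step()
    ```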

  19. Compound facial expressions of emotion: from basic research to clinical applications

    PubMed Central

    Du, Shichuan; Martinez, Aleix M.

    2015-01-01

    Emotions are sometimes revealed through facial expressions. When these natural facial articulations involve the contraction of the same muscle groups in people of distinct cultural upbringings, this is taken as evidence of a biological origin of these emotions. While past research had identified facial expressions associated with a single internally felt category (e.g., the facial expression of happiness when we feel joyful), we have recently studied facial expressions observed when people experience compound emotions (e.g., the facial expression of happy surprise when we feel joyful in a surprised way, as, for example, at a surprise birthday party). Our research has identified 17 compound expressions consistently produced across cultures, suggesting that the number of facial expressions of emotion of biological origin is much larger than previously believed. The present paper provides an overview of these findings and shows evidence supporting the view that spontaneous expressions are produced using the same facial articulations previously identified in laboratory experiments. We also discuss the implications of our results for the study of psychopathologies, and consider several open research questions. PMID:26869845

  20. Exploring the nature of facial affect processing deficits in schizophrenia.

    PubMed

    van 't Wout, Mascha; Aleman, André; Kessels, Roy P C; Cahn, Wiepke; de Haan, Edward H F; Kahn, René S

    2007-04-15

    Schizophrenia has been associated with deficits in facial affect processing, especially for negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in the automatic allocation of attention as well as in the controlled evaluation of facial affect. Thirty-seven patients with schizophrenia were compared with 41 control subjects on incidental facial affect processing (gender decision for faces with a fearful, angry, happy, disgusted, or neutral expression) and degraded facial affect labeling (labeling of fearful, angry, happy, and neutral faces). The groups were matched on estimates of verbal and performance intelligence (National Adult Reading Test; Raven's Matrices), general face recognition ability (Benton Face Recognition), and other demographic variables. The results showed that patients with schizophrenia, as well as control subjects, demonstrated the normal threat-related interference during incidental facial affect processing. Conversely, on controlled evaluation, patients were specifically worse at labeling fearful faces. In particular, patients with high levels of negative symptoms may be characterized by deficits in labeling fear. We suggest that patients with schizophrenia show no evidence of deficits in the automatic allocation of attentional resources to fearful (threat-indicating) faces, but have a deficit in the controlled processing of facial emotions that may be specific to fearful faces.

  1. Reduced emotion processing efficiency in healthy males relative to females

    PubMed Central

    Rapport, Lisa J.; Briceno, Emily M.; Haase, Brennan D.; Vederman, Aaron C.; Bieliauskas, Linas A.; Welsh, Robert C.; Starkman, Monica N.; McInnis, Melvin G.; Zubieta, Jon-Kar; Langenecker, Scott A.

    2014-01-01

    This study examined sex differences in the categorization of facial emotions and in the activation of brain regions supporting those classifications. In Experiment 1, performance on the Facial Emotion Perception Test (FEPT) was examined among 75 healthy females and 63 healthy males. Females were more accurate than males in the categorization of fearful expressions. In Experiment 2, 3T functional magnetic resonance imaging data were acquired for a separate sample of 21 healthy females and 17 healthy males while they performed the FEPT. Activation to neutral facial expressions was subtracted from activation to sad, angry, fearful and happy facial expressions. Although females and males demonstrated activation in some overlapping regions for all emotions, many regions were exclusive to females or males. For angry, sad and happy expressions, males displayed a larger extent of activation than did females, and greater height of activation was detected in diffuse cortical and subcortical regions. For fear, males displayed greater activation than females only in the right postcentral gyrus. With one exception in females, performance was not associated with activation. Results suggest that females and males process emotions using different neural pathways, and these differences cannot be explained by performance variations. PMID:23196633

  2. Reading the mind in the infant eyes: paradoxical effects of oxytocin on neural activity and emotion recognition in watching pictures of infant faces.

    PubMed

    Voorthuis, Alexandra; Riem, Madelon M E; Van IJzendoorn, Marinus H; Bakermans-Kranenburg, Marian J

    2014-09-11

    The neuropeptide oxytocin facilitates parental caregiving and is involved in the processing of infant vocal cues. In this randomized controlled trial with functional magnetic resonance imaging, we examined the influence of intranasally administered oxytocin on neural activity during emotion recognition in infant faces. Blood oxygenation level dependent (BOLD) responses during emotion recognition were measured in 50 women who were administered 16 IU of oxytocin or a placebo. Participants performed an adapted version of the Infant Facial Expressions of Emotions from Looking at Pictures (IFEEL Pictures) task, which has been developed to assess the perception and interpretation of infants' facial expressions. Oxytocin administration increased activation in the inferior frontal gyrus (IFG), the middle temporal gyrus (MTG) and the superior temporal gyrus (STG). However, oxytocin decreased performance on the IFEEL Pictures task. Our findings suggest that oxytocin enhances the processing of facial cues to the emotional state of infants at the neural level, but at the same time may decrease the correct interpretation of infants' facial expressions at the behavioral level. This article is part of a Special Issue entitled Oxytocin and Social Behavior. © 2013 Published by Elsevier B.V.

  3. When Age Matters: Differences in Facial Mimicry and Autonomic Responses to Peers' Emotions in Teenagers and Adults

    PubMed Central

    Ardizzi, Martina; Sestito, Mariateresa; Martini, Francesca; Umiltà, Maria Alessandra; Ravera, Roberto; Gallese, Vittorio

    2014-01-01

    Effects of age-group membership on the explicit recognition of emotional facial expressions have been widely demonstrated. In this study, we investigated whether age-group membership could also affect implicit physiological responses, such as facial mimicry and autonomic regulation, to the observation of emotional facial expressions. To this end, facial electromyography (EMG) and respiratory sinus arrhythmia (RSA) were recorded from teenage and adult participants during the observation of facial expressions performed by teenage and adult models. Results highlighted that teenagers exhibited greater facial EMG responses to peers' facial expressions, whereas adults showed higher RSA responses to adult facial expressions. The different physiological modalities through which teenagers and adults respond to peers' emotional expressions likely reflect two different ways of engaging in social interactions with people of their own age. The findings confirm that age is an important and powerful social feature that modulates interpersonal interactions by influencing low-level physiological responses. PMID:25337916

  4. Impaired recognition of body expressions in the behavioral variant of frontotemporal dementia.

    PubMed

    Van den Stock, Jan; De Winter, François-Laurent; de Gelder, Beatrice; Rangarajan, Janaki Raman; Cypers, Gert; Maes, Frederik; Sunaert, Stefan; Goffin, Karolien; Vandenberghe, Rik; Vandenbulcke, Mathieu

    2015-08-01

    Progressive deterioration of social cognition and emotion processing are core symptoms of the behavioral variant of frontotemporal dementia (bvFTD). Here we investigate whether bvFTD is also associated with impaired recognition of static (Experiment 1) and dynamic (Experiment 2) bodily expressions. In addition, we compared body expression processing with processing of static (Experiment 3) and dynamic (Experiment 4) facial expressions, as well as with face identity processing (Experiment 5). The results reveal that bvFTD is associated with impaired recognition of static and dynamic bodily and facial expressions, while identity processing was intact. No differential impairments were observed regarding motion (static vs. dynamic) or category (body vs. face). Within the bvFTD group, we observed a significant partial correlation between body and face expression recognition, when controlling for performance on the identity task. Voxel-Based Morphometry (VBM) analysis revealed that body emotion recognition was positively associated with gray matter volume in a region of the inferior frontal gyrus (pars orbitalis/triangularis). The results are in line with a supramodal emotion recognition deficit in bvFTD. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. The look of fear and anger: facial maturity modulates recognition of fearful and angry expressions.

    PubMed

    Sacco, Donald F; Hugenberg, Kurt

    2009-02-01

    The current series of studies provides converging evidence that facial expressions of fear and anger may have co-evolved to mimic mature and babyish faces in order to enhance their communicative signal. In Studies 1 and 2, fearful and angry facial expressions were manipulated to have enhanced babyish features (larger eyes) or enhanced mature features (smaller eyes). Using a speeded categorization task in Study 1 and a visual noise paradigm in Study 2, we found that larger eyes facilitated the recognition of fearful facial expressions, while smaller eyes facilitated the recognition of angry facial expressions. Study 3 manipulated facial roundness, a stable structure that does not vary systematically with expressions, and found that congruency between maturity and expression (narrow face-anger; round face-fear) facilitated expression recognition accuracy. Results are discussed as representing a broad co-evolutionary relationship between facial maturity and fearful and angry facial expressions. (c) 2009 APA, all rights reserved

  6. Emotional facial activation induced by unconsciously perceived dynamic facial expressions.

    PubMed

    Kaiser, Jakob; Davey, Graham C L; Parkhouse, Thomas; Meeres, Jennifer; Scott, Ryan B

    2016-12-01

    Do facial expressions of emotion influence us when not consciously perceived? Methods to investigate this question have typically relied on brief presentation of static images. In contrast, real facial expressions are dynamic and unfold over several seconds. Recent studies demonstrate that gaze contingent crowding (GCC) can block awareness of dynamic expressions while still inducing behavioural priming effects. The current experiment tested for the first time whether dynamic facial expressions presented using this method can induce unconscious facial activation. Videos of dynamic happy and angry expressions were presented outside participants' conscious awareness while EMG measurements captured activation of the zygomaticus major (active when smiling) and the corrugator supercilii (active when frowning). Forced-choice classification of expressions confirmed they were not consciously perceived, while EMG revealed significant differential activation of facial muscles consistent with the expressions presented. This successful demonstration opens new avenues for research examining the unconscious emotional influences of facial expressions. Copyright © 2016 Elsevier B.V. All rights reserved.
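
    A minimal sketch of the kind of summary implied above: rectified, baseline-corrected EMG amplitudes are averaged per muscle and condition, with congruent mimicry predicting greater zygomaticus activity for happy and greater corrugator activity for angry expressions. The function and array layout are illustrative assumptions, not the authors' pipeline.

    ```python
    import numpy as np

    def emg_by_condition(zygomaticus, corrugator, condition):
        """Average facial EMG per muscle and stimulus condition.

        zygomaticus / corrugator: trial x sample arrays of rectified,
        baseline-corrected EMG amplitudes; condition: NumPy array of
        per-trial labels ('happy' or 'angry').
        """
        summary = {}
        for cond in ("happy", "angry"):
            mask = condition == cond
            summary[cond] = {
                "zygomaticus": zygomaticus[mask].mean(),
                "corrugator": corrugator[mask].mean(),
            }
        return summary
    ```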

  7. Social Use of Facial Expressions in Hylobatids

    PubMed Central

    Scheider, Linda; Waller, Bridget M.; Oña, Leonardo; Burrows, Anne M.; Liebal, Katja

    2016-01-01

    Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social) contexts the duration of facial expressions was significantly longer when gibbons were facing another individual than in non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than in non-social contexts, where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely ‘responded to’ by the partner’s facial expressions when individuals were facing each other than when they were not. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with conspecifics. PMID:26978660

  8. Neurophysiology of spontaneous facial expressions: I. Motor control of the upper and lower face is behaviorally independent in adults.

    PubMed

    Ross, Elliott D; Gupta, Smita S; Adnan, Asif M; Holden, Thomas L; Havlicek, Joseph; Radhakrishnan, Sridhar

    2016-03-01

    Facial expressions are traditionally described as monolithic entities. However, humans have the capacity to produce facial blends, in which the upper and lower face simultaneously display different emotional expressions. This, in turn, has led to the Component Theory of facial expressions. Recent neuroanatomical studies in monkeys have demonstrated that there are separate cortical motor areas for controlling the upper and lower face that, presumably, also occur in humans. The lower face is represented on the posterior ventrolateral surface of the frontal lobes in the primary motor and premotor cortices, and the upper face is represented on the medial surface of the posterior frontal lobes in the supplementary motor and anterior cingulate cortices. Our laboratory has been engaged in a series of studies exploring the perception and production of facial blends. Using high-speed videography, we began measuring the temporal aspects of facial expressions to develop a more complete understanding of the neurophysiology underlying facial expressions and facial blends. The goal of the research presented here was to determine whether spontaneous facial expressions in adults are predominantly monolithic or exhibit independent motor control of the upper and lower face. We found that spontaneous facial expressions are very complex and that the motor control of the upper and lower face is overwhelmingly independent, thus robustly supporting the Component Theory of facial expressions. Seemingly monolithic expressions, be they full facial or facial blends, are most likely the result of a timing coincidence rather than of synchronous coordination between the ventrolateral and medial cortical motor areas responsible for controlling the lower and upper face, respectively. In addition, we found evidence that the right and left face may also exhibit independent motor control, supporting the concept that spontaneous facial expressions are organized predominantly across the horizontal facial axis and secondarily across the vertical axis. Published by Elsevier Ltd.

  9. Perceived differences between chimpanzee (Pan troglodytes) and human (Homo sapiens) facial expressions are related to emotional interpretation.

    PubMed

    Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C

    2007-11-01

    Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.

  10. Tryptophan depletion decreases the recognition of fear in female volunteers.

    PubMed

    Harmer, C J; Rogers, R D; Tunbridge, E; Cowen, P J; Goodwin, G M

    2003-06-01

    Serotonergic processes have been implicated in the modulation of fear conditioning in humans, postulated to occur at the level of the amygdala. The processing of other fear-relevant cues, such as facial expressions, has also been associated with amygdala function, but the effect of serotonin depletion on these processes had not been assessed. The present study investigated the effects of reducing serotonin function, using acute tryptophan depletion, on the recognition of basic facial expressions of emotion in healthy male and female volunteers. A double-blind between-groups design was used, with volunteers randomly allocated to receive an amino acid drink specifically lacking tryptophan or a control drink containing a balanced mixture of these amino acids. Participants were given a facial expression recognition task 5 h after drink administration. This task featured examples of six basic emotions (fear, anger, disgust, surprise, sadness and happiness) that had been morphed between each full emotion and neutral in 10% steps. As a control, volunteers were given a famous-face classification task matched in terms of response selection and difficulty level. Tryptophan depletion significantly impaired the recognition of fearful facial expressions in female, but not male, volunteers. This effect was specific, since recognition of the other basic emotions was comparable in the two groups. There was also no effect of tryptophan depletion on the classification of famous faces or on subjective state ratings of mood or anxiety. These results confirm a role for serotonin in the processing of fear-related cues and, in line with previous findings, also suggest greater effects of tryptophan depletion in female volunteers. Although acute tryptophan depletion does not typically affect mood in healthy subjects, the present results suggest that subtle changes in the processing of emotional material may occur with this manipulation of serotonin function.

  11. The influence of context on distinct facial expressions of disgust.

    PubMed

    Reschke, Peter J; Walle, Eric A; Knothe, Jennifer M; Lopez, Lukas D

    2018-06-11

    Face perception is susceptible to contextual influence and to perceived physical similarities between emotion cues. However, studies often use structurally homogeneous facial expressions, making it difficult to explore how within-emotion variability in facial configuration affects emotion perception. This study examined the influence of context on the emotional perception of categorically identical, yet physically distinct, facial expressions of disgust. Participants categorized two perceptually distinct disgust facial expressions, "closed" (i.e., scrunched nose, closed mouth) and "open" (i.e., scrunched nose, open mouth, protruding tongue), that were embedded in contexts comprising emotion postures and scenes. Results demonstrated that the effect of nonfacial elements was significantly stronger for "open" disgust facial expressions than for "closed" disgust facial expressions. These findings support the view that physical similarity within discrete categories of facial expressions is mutable and plays an important role in affective face perception. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. More emotional facial expressions during episodic than during semantic autobiographical retrieval.

    PubMed

    El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis

    2016-04-01

    There is a substantial body of research on the relationship between emotion and autobiographical memory. Using facial analysis software, our study addressed this relationship by investigating basic emotional facial expressions that may be detected during autobiographical recall. Participants were asked to retrieve three autobiographical memories, each of which was triggered by one of the following cue words: happy, sad, and city. Autobiographical recall was analyzed with facial analysis software that detects and classifies basic emotional expressions. Analyses showed that emotional cues triggered the corresponding basic facial expressions (i.e., a happy facial expression for memories cued by happy). Furthermore, we dissociated episodic and semantic retrieval, observing more emotional facial expressions during episodic than during semantic retrieval, regardless of the emotional valence of the cues. Our study provides insight into the facial expressions that are associated with emotional autobiographical memory. It also highlights an ecological tool to reveal physiological changes that are associated with emotion and memory.
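
    A minimal sketch of the aggregation step implied above, assuming the coding software emits one expression label per video frame (actual tool output formats differ):

    ```python
    from collections import Counter

    EMOTIONS = {"happy", "sad", "surprised", "angry", "scared", "disgusted"}

    def expression_proportions(frame_labels):
        """Turn per-frame expression labels into proportions of emotional
        and neutral frames for one retrieval period."""
        counts = Counter(frame_labels)
        total = sum(counts.values())
        emotional = sum(counts[e] for e in EMOTIONS) / total
        return {"emotional": emotional, "neutral": counts["neutral"] / total}

    # Hypothetical usage:
    print(expression_proportions(["neutral", "happy", "happy", "sad", "neutral"]))
    ```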

  13. Decoding facial expressions based on face-selective and motion-sensitive areas.

    PubMed

    Liang, Yin; Liu, Baolin; Xu, Junhai; Zhang, Gaoyan; Li, Xianglin; Wang, Peiyuan; Wang, Bin

    2017-06-01

    Humans can easily recognize others' facial expressions. Among the brain substrates that enable this ability, considerable attention has been paid to face-selective areas; in contrast, whether motion-sensitive areas, which clearly exhibit sensitivity to facial movements, are involved in facial expression recognition remains unclear. The present functional magnetic resonance imaging (fMRI) study used multi-voxel pattern analysis (MVPA) to explore facial expression decoding in both face-selective and motion-sensitive areas. In a block design experiment, participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise) in images, videos, and eyes-obscured videos. Due to the use of multiple stimulus types, the impacts of facial motion and eye-related information on facial expression decoding were also examined. It was found that motion-sensitive areas showed significant responses to emotional expressions and that dynamic expressions could be successfully decoded in both face-selective and motion-sensitive areas. Compared with static stimuli, dynamic expressions elicited consistently higher neural responses and decoding performance in all regions. A significant decrease in both activation and decoding accuracy due to the absence of eye-related information was also observed. Overall, the findings showed that emotional expressions are represented in motion-sensitive areas in addition to conventional face-selective areas, suggesting that motion-sensitive regions may also effectively contribute to facial expression recognition. The results also suggested that facial motion and eye-related information play important roles by carrying considerable expression information that can facilitate facial expression recognition. Hum Brain Mapp 38:3113-3125, 2017. © 2017 Wiley Periodicals, Inc.
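
    The abstract does not name the classifier; a typical MVPA pipeline for ROI-based decoding trains a linear support vector machine on trial-wise response patterns and evaluates it with cross-validation, as in the scikit-learn sketch below. The data are random placeholders standing in for trial x voxel patterns from a face-selective or motion-sensitive ROI.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.standard_normal((120, 500))   # 120 trials x 500 ROI voxels (placeholder)
    y = np.repeat(np.arange(6), 20)       # labels for the six basic emotions

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    accuracy = cross_val_score(clf, X, y, cv=5).mean()
    print(f"decoding accuracy: {accuracy:.2f} (chance = {1 / 6:.2f})")
    ```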

  14. Right Hemisphere Dominance for Emotion Processing in Baboons

    ERIC Educational Resources Information Center

    Wallez, Catherine; Vauclair, Jacques

    2011-01-01

    Asymmetries of emotional facial expressions in humans offer reliable indexes to infer brain lateralization and mostly revealed right hemisphere dominance. Studies concerned with oro-facial asymmetries in nonhuman primates largely showed a left-sided asymmetry in chimpanzees, marmosets and macaques. The presence of asymmetrical oro-facial…

  15. Factors contributing to the adaptation aftereffects of facial expression.

    PubMed

    Butler, Andrea; Oruc, Ipek; Fox, Christopher J; Barton, Jason J S

    2008-01-29

    Previous studies have demonstrated the existence of adaptation aftereffects for facial expressions. Here we investigated which aspects of facial stimuli contribute to these aftereffects. In Experiment 1, we examined the role of local adaptation to image elements such as curvature, shape and orientation, independent of expression, by using hybrid faces constructed from either the same or opposing expressions. While hybrid faces made with consistent expressions generated aftereffects as large as those with normal faces, there were no aftereffects from hybrid faces made from different expressions, despite the fact that these contained the same local image elements. In Experiment 2, we examined the role of facial features independent of the normal face configuration by contrasting adaptation with whole faces to adaptation with scrambled faces. We found that scrambled faces also generated significant aftereffects, indicating that expressive features without a normal facial configuration could generate expression aftereffects. In Experiment 3, we examined the role of facial configuration by using schematic faces made from line elements that in isolation do not carry expression-related information (e.g. curved segments and straight lines) but that convey an expression when arranged in a normal facial configuration. We obtained a significant aftereffect for facial configurations but not scrambled configurations of these line elements. We conclude that facial expression aftereffects are not due to local adaptation to image elements but due to high-level adaptation of neural representations that involve both facial features and facial configuration.

  16. Knowing how you are feeling depends on what's on my mind: Cognitive load and expression categorization.

    PubMed

    Ahmed, Lubna

    2018-03-01

    The ability to correctly interpret facial expressions is key to effective social interactions. People are well rehearsed and generally very efficient at correctly categorizing expressions. However, does their ability to do so depend on how cognitively loaded they are at the time? Using repeated-measures designs, we assessed the sensitivity of facial expression categorization to cognitive resource availability by measuring people's expression categorization performance during concurrent low and high cognitive load situations. In Experiment 1, participants categorized the six basic upright facial expressions in a six-alternative forced-choice (6-AFC) response paradigm while maintaining low- or high-load information in working memory (N = 40; 60 observations per load condition). In Experiment 2, they did so for both upright and inverted faces (N = 46; 60 observations per load and inversion condition). In both experiments, expression categorization for upright faces was worse during high versus low load. Categorization rates actually improved with increased load for the inverted faces. The opposing effects of cognitive load on upright and inverted expressions are explained in terms of a cognitive load-related dispersion in the attentional window. Overall, the findings support that expression categorization is sensitive to cognitive resource availability and moreover suggest that, in this paradigm, it is the perceptual processing stage of expression categorization that is affected by cognitive load. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. Influence of Aggression on Information Processing in the Emotional Stroop Task – an Event-Related Potential Study

    PubMed Central

    Bertsch, Katja; Böhnke, Robina; Kruk, Menno R.; Naumann, Ewald

    2009-01-01

    Aggression is a common behavior that has frequently been explained in terms of changes in higher-level information processing patterns. Although researchers have only recently started to investigate information processing in healthy individuals while they are engaged in aggressive behavior, the impact of aggression on information processing beyond an aggressive encounter remains unclear. In an event-related potential study, we investigated the processing of facial expressions (happy, angry, fearful, and neutral) in an emotional Stroop task after experimentally provoking aggressive behavior in healthy participants. Compared to a non-provoked group, these individuals showed increased early (P2) and late (P3) positive amplitudes for all facial expressions. For the P2 amplitude, the effect of provocation was greatest for threat-related expressions. Beyond this, a bias for emotional expressions, i.e., slower reaction times to all emotional expressions, was found in provoked participants with a high level of trait anger. These results indicate significant effects of aggression on information processing, which last beyond the aggressive encounter even in healthy participants. PMID:19826616

  18. The not face: A grammaticalization of facial expressions of emotion.

    PubMed

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. The Not Face: A grammaticalization of facial expressions of emotion

    PubMed Central

    Benitez-Quiroz, C. Fabian; Wilbur, Ronnie B.; Martinez, Aleix M.

    2016-01-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3–8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. PMID:26872248

  20. Facial Expression Generation from Speaker's Emotional States in Daily Conversation

    NASA Astrophysics Data System (ADS)

    Mori, Hiroki; Ohshima, Koh

    A framework for generating facial expressions from emotional states in daily conversation is described. It provides a mapping between emotional states and facial expressions, where the former are represented by vectors on psychologically defined abstract dimensions and the latter are coded with the Facial Action Coding System. To obtain the mapping, parallel data with rated emotional states and facial expressions were collected for utterances of a female speaker, and a neural network was trained on the data. The effectiveness of the proposed method was verified in a subjective evaluation test: the Mean Opinion Score for the suitability of the generated facial expressions was 3.86 for the speaker, close to that of hand-made facial expressions.
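
    A minimal sketch of the mapping described above: a small neural network is trained to map emotional-state vectors on abstract psychological dimensions to FACS action unit intensities. The 2-D input, the particular AU set, and the network shape are illustrative assumptions; the paper's actual dimensions and coding are not given in this abstract.

    ```python
    import torch
    import torch.nn as nn

    # Map assumed 2-D emotional states (e.g., pleasantness/arousal) to the
    # intensities of six assumed action units (e.g., AU1, AU2, AU4, AU6,
    # AU12, AU15), scaled to [0, 1] by the final sigmoid.
    net = nn.Sequential(
        nn.Linear(2, 16), nn.Tanh(),
        nn.Linear(16, 6), nn.Sigmoid(),
    )
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    emotions = torch.rand(100, 2)   # hypothetical rated emotional states
    aus = torch.rand(100, 6)        # hypothetical coded AU intensities
    for _ in range(300):
        loss = nn.functional.mse_loss(net(emotions), aus)
        opt.zero_grad(); loss.backward(); opt.step()
    ```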

  1. Evidence for Anger Saliency during the Recognition of Chimeric Facial Expressions of Emotions in Underage Ebola Survivors

    PubMed Central

    Ardizzi, Martina; Evangelista, Valentina; Ferroni, Francesca; Umiltà, Maria A.; Ravera, Roberto; Gallese, Vittorio

    2017-01-01

    One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, normally allowing the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus unbalancing victims’ recognition of negative emotions. Despite the plethora of studies on this topic, to date, it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30, 15 males) performed a forced-choice chimeric facial expressions recognition task. The chimeric facial expressions were obtained by pairing upper and lower half faces of two different negative emotions (selected from anger, fear and sadness for a total of six different combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger. In this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims’ performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify the perceptual analysis of phylogenetically old behavioral patterns like the facial expressions of emotions. PMID:28690565

  2. Evidence for Anger Saliency during the Recognition of Chimeric Facial Expressions of Emotions in Underage Ebola Survivors.

    PubMed

    Ardizzi, Martina; Evangelista, Valentina; Ferroni, Francesca; Umiltà, Maria A; Ravera, Roberto; Gallese, Vittorio

    2017-01-01

    One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, normally allowing the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus unbalancing victims' recognition of negative emotions. Despite the plethora of studies on this topic, to date, it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30, 15 males) performed a forced-choice chimeric facial expressions recognition task. The chimeric facial expressions were obtained by pairing upper and lower half faces of two different negative emotions (selected from anger, fear and sadness for a total of six different combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger. In this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims' performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify the perceptual analysis of phylogenetically old behavioral patterns like the facial expressions of emotions.

  3. Mimicking emotions: how 3-12-month-old infants use the facial expressions and eyes of a model.

    PubMed

    Soussignan, Robert; Dollion, Nicolas; Schaal, Benoist; Durand, Karine; Reissland, Nadja; Baudouin, Jean-Yves

    2018-06-01

    While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants' emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants' facial displays and eye-movement tracking to examine infants' looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions (joy, anger, fear, disgust, sadness) of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model's negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.

  4. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expressions to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers that model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interfaces, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.
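
    The abstract does not name a specific model; this line of work is closely associated with PCA-based "eigenface" representations, so the sketch below illustrates that family of models rather than the paper's exact method: face images are flattened, projected onto principal components, and recognized by nearest neighbour in the projected space. The gallery data are random placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    gallery_images = np.random.rand(50, 64 * 64)   # flattened, aligned face images
    probe_image = np.random.rand(64 * 64)

    pca = PCA(n_components=20).fit(gallery_images)  # components = "eigenfaces"
    gallery = pca.transform(gallery_images)
    probe = pca.transform(probe_image[None, :])

    # Nearest neighbour in face space gives the recognized identity.
    identity = int(np.argmin(np.linalg.norm(gallery - probe, axis=1)))
    ```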

  5. Psychometric challenges and proposed solutions when scoring facial emotion expression codes.

    PubMed

    Olderbak, Sally; Hildebrandt, Andrea; Pinkpank, Thomas; Sommer, Werner; Wilhelm, Oliver

    2014-12-01

    Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion of how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing scoring procedures for these codes. Then, on the basis of a time series data set collected to assess individual differences in facial emotion expression ability, we derive, apply, and evaluate several statistical procedures, including four scoring methods and four data treatments, to score software-coded emotion expression data. These scoring procedures are illustrated to inform analysis decisions pertaining to the scoring and data treatment of other emotion expression questions and under different experimental circumstances. Overall, we found that applying loess smoothing and controlling for baseline facial emotion expression and facial plasticity are the recommended data treatments, and that when scoring facial emotion expression ability, the maximum score is preferred. Finally, we discuss the scoring methods and data treatments in the larger context of emotion expression research.
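
    A minimal sketch of the two recommended data treatments and the preferred scoring method, under assumed inputs (a per-frame expression score time series with an initial rest period as baseline); the authors' exact smoothing parameters are not reported in this abstract.

    ```python
    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    def treat_and_score(scores, times, baseline_mask, frac=0.1):
        """Loess-smooth a software-coded expression time series, subtract
        the baseline (resting) expression level, and return the maximum
        score as the expression-ability index."""
        smoothed = lowess(scores, times, frac=frac, return_sorted=False)
        corrected = smoothed - smoothed[baseline_mask].mean()
        return corrected, corrected.max()

    # Hypothetical usage: 10 Hz happiness codes, first 2 s treated as baseline.
    t = np.arange(100) / 10.0
    raw = np.clip(0.3 * np.random.rand(100) + np.exp(-(t - 5.0) ** 2), 0.0, 1.0)
    corrected, max_score = treat_and_score(raw, t, t < 2.0)
    ```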

  6. Visual search for facial expressions of emotions: a comparison of dynamic and static faces.

    PubMed

    Horstmann, Gernot; Ansorge, Ulrich

    2009-02-01

    A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (c) 2009 APA, all rights reserved

  7. Quantifying facial expression recognition across viewing conditions.

    PubMed

    Goren, Deborah; Wilson, Hugh R

    2006-04-01

    Facial expressions are key to social interactions and to the assessment of potential danger in various situations. Therefore, our brains must be able to recognize facial expressions when they are transformed in biologically plausible ways. We used synthetic happy, sad, angry and fearful faces to determine the amount of geometric change required to recognize these emotions during brief presentations. Five-alternative forced-choice conditions involving central viewing, peripheral viewing and inversion were used to study recognition among the four emotions. Two-alternative forced choice was used to study affect discrimination when the spatial frequency information in the stimulus was modified. The results show an emotion- and task-dependent pattern of detection. Facial expressions presented with low peak frequencies are much harder to discriminate from neutral than faces defined by either mid or high peak frequencies. Peripheral presentation of faces also makes recognition much more difficult, except for happy faces. Differences between fearful detection and recognition tasks are probably due to common confusions with sadness when recognizing fear from among other emotions. These findings further support the idea that these emotions are processed separately from each other.

  8. Investigating the genetic basis of attention to facial expressions: the role of the norepinephrine transporter gene.

    PubMed

    Yang, Xing; Ru, Wenzhao; Wang, Bei; Gao, Xiaocai; Yang, Lu; Li, She; Xi, Shoumin; Gong, Pingyuan

    2016-12-01

    Levels of norepinephrine (NE) in the brain are related to attention ability in animals and to the risk of attention-deficit hyperactivity disorder in humans. Given the modulation of brain NE levels by the norepinephrine transporter (NET) and the link between NE and the attention impairment of attention-deficit hyperactivity disorder, we hypothesized that the NET gene underpins individual differences in attention processes in healthy populations. To investigate to what extent NET modulates attention orientation to facial expressions, we categorized individuals according to the genotypes of the -182 T/C (rs2242446) polymorphism and measured attention orientation with the spatial cueing task. Our results indicated that the -182 T/C polymorphism significantly modulated attention orientation to facial expressions, with the CC genotype facilitating attention reorientation to the locations where cued faces had previously been presented. However, this polymorphism showed no significant effect on the regulation of attention orientation by emotional cues. Our findings suggest that the NET gene modulates individual differences in attention to facial expressions, which provides new insights into the roles of NE in social interactions.

  9. Facial morphology and children's categorization of facial expressions of emotions: a comparison between Asian and Caucasian faces.

    PubMed

    Gosselin, P; Larocque, C

    2000-09-01

    The effects of Asian and Caucasian facial morphology were examined by having Canadian children categorize pictures of facial expressions of basic emotions. The pictures were selected from the Japanese and Caucasian Facial Expressions of Emotion set developed by D. Matsumoto and P. Ekman (1989). Sixty children between the ages of 5 and 10 years were presented with short stories and an array of facial expressions, and were asked to point to the expression that best depicted the specific emotion experienced by the characters. The results indicated that expressions of fear and surprise were better categorized from Asian faces, whereas expressions of disgust were better categorized from Caucasian faces. These differences originated in some specific confusions between expressions.

  10. Recognizing Action Units for Facial Expression Analysis

    PubMed Central

    Tian, Ying-li; Kanade, Takeo; Cohn, Jeffrey F.

    2010-01-01

    Most automatic expression analysis systems attempt to recognize a small set of prototypic expressions, such as happiness, anger, surprise, and fear. Such prototypic expressions, however, occur rather infrequently. Human emotions and intentions are more often communicated by changes in one or a few discrete facial features. In this paper, we develop an Automatic Face Analysis (AFA) system to analyze facial expressions based on both permanent facial features (brows, eyes, mouth) and transient facial features (deepening of facial furrows) in a nearly frontal-view face image sequence. The AFA system classifies fine-grained changes in facial expression into action units (AUs) of the Facial Action Coding System (FACS), instead of a few prototypic expressions. Multistate face and facial component models are proposed for tracking and modeling the various facial features, including lips, eyes, brows, cheeks, and furrows. During tracking, detailed parametric descriptions of the facial features are extracted. With these parameters as the inputs, a group of action units (the neutral expression, six upper-face AUs, and ten lower-face AUs) are recognized whether they occur alone or in combination. The system has achieved average recognition rates of 96.4 percent (95.4 percent if neutral expressions are excluded) for upper-face AUs and 96.7 percent (95.6 percent with neutral expressions excluded) for lower-face AUs. The generalizability of the system has been tested using independent image databases collected and FACS-coded for ground truth by different research teams. PMID:25210210
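
    As a hedged sketch of the final recognition step, the tracked parametric feature descriptions can be treated as inputs to a multi-label classifier over co-occurring AUs. The original system used its own recognizers; the scikit-learn estimator and placeholder data below are illustrative assumptions only.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.multioutput import MultiOutputClassifier

    X = np.random.rand(300, 15)                     # tracked feature parameters
    Y = (np.random.rand(300, 6) > 0.5).astype(int)  # presence of six upper-face AUs

    # One binary classifier per AU, so AUs can be recognized alone or in combination.
    clf = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    predicted_aus = clf.predict(np.random.rand(1, 15))
    ```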

  11. A Facial Control Method Using Emotional Parameters in Sensibility Robot

    NASA Astrophysics Data System (ADS)

    Shibata, Hiroshi; Kanoh, Masayoshi; Kato, Shohei; Kunitachi, Tsutomu; Itoh, Hidenori

    The “Ifbot” robot communicates with people by considering its own “emotions”. Ifbot has many facial expressions for enjoyable communication. These are used to express its internal emotions, purposes, and reactions caused by external stimuli, and for entertainment such as singing songs. All of these facial expressions are designed manually; with this approach, every facial motion Ifbot should express must be designed by hand, which is not realistic. We have therefore developed a system which converts Ifbot's emotions to facial expressions automatically. In this paper, we propose a method for creating Ifbot's facial expressions from emotional parameters, which represent its internal emotions computationally.
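
    One simple way to realize such a parameter-to-expression conversion, sketched under assumptions (the actual mapping is not detailed in this abstract), is to blend designer-made basis poses in proportion to normalized emotional parameters:

    ```python
    import numpy as np

    # Hypothetical designer-made actuator targets for each basis emotion
    # (e.g., eyelid, brow, mouth, and neck actuator positions in [0, 1]).
    BASIS = {
        "joy":     np.array([0.9, 0.1, 0.8, 0.0]),
        "sadness": np.array([0.1, 0.8, 0.0, 0.6]),
        "anger":   np.array([0.2, 0.9, 0.3, 0.1]),
    }

    def face_from_emotions(params):
        """Blend basis poses by normalized emotional parameters."""
        total = sum(params.values()) or 1.0
        pose = sum((v / total) * BASIS[k] for k, v in params.items())
        return np.clip(pose, 0.0, 1.0)

    actuators = face_from_emotions({"joy": 0.7, "anger": 0.2, "sadness": 0.1})
    ```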

  12. Towards Emotion Detection in Educational Scenarios from Facial Expressions and Body Movements through Multimodal Approaches

    PubMed Central

    Saneiro, Mar; Salmeron-Majadas, Sergio

    2014-01-01

    We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expressions and body movements that correspond to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral information. Together, these data are to be used to train data mining algorithms that automatically identify changes in learners' affective states during cognitive tasks, which in turn helps provide personalized emotional support. PMID:24892055

  13. Automatic recognition of emotions from facial expressions

    NASA Astrophysics Data System (ADS)

    Xue, Henry; Gertner, Izidor

    2014-06-01

    In the human-computer interaction (HCI) process, it is desirable to have an artificially intelligent (AI) system that can identify and categorize human emotions from facial expressions. Such systems can be used in security, in the entertainment industry, and to study visual perception, social interactions, and disorders (e.g., schizophrenia and autism). In this work we survey and compare the performance of different feature extraction algorithms and classification schemes. We introduce a faster feature extraction method that resizes and applies a set of filters to the data images without sacrificing accuracy. In addition, we extended SVM to multiple dimensions while retaining its high accuracy rate. The algorithms were tested using the Japanese Female Facial Expression (JAFFE) Database and the Database of Faces (AT&T Faces).
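
    The resize-filter-classify pipeline described above can be sketched generically with off-the-shelf tools. The sketch below uses a small Gabor filter bank and a standard multi-class SVM; it illustrates the general approach, not the authors' specific feature extractor or their SVM extension.

      # Generic sketch of the pipeline: downsample the face image, apply a
      # small Gabor filter bank, and classify with a multi-class SVM.

      import numpy as np
      from skimage.transform import resize
      from skimage.filters import gabor
      from sklearn.svm import SVC

      def extract_features(img: np.ndarray, size=(48, 48)) -> np.ndarray:
          small = resize(img, size, anti_aliasing=True)
          feats = []
          for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
              real, _ = gabor(small, frequency=0.25, theta=theta)
              feats.append(real.ravel())
          return np.concatenate(feats)

      # X_train: list of grayscale face images; y_train: emotion labels.
      # clf = SVC(kernel="rbf", C=10.0)    # one-vs-one multi-class SVM
      # clf.fit([extract_features(x) for x in X_train], y_train)
      # pred = clf.predict([extract_features(test_img)])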

  14. Towards emotion detection in educational scenarios from facial expressions and body movements through multimodal approaches.

    PubMed

    Saneiro, Mar; Santos, Olga C; Salmeron-Majadas, Sergio; Boticario, Jesus G

    2014-01-01

    We report current findings when considering video recordings of facial expressions and body movements to provide affective personalized support in an educational context from an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag facial expressions and body movements that correspond to changes in the affective states of learners while dealing with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral information. Together, these data are to be used to train data mining algorithms that automatically identify changes in learners' affective states during cognitive tasks, which in turn helps provide personalized emotional support.

  15. Categorical Perception of Affective and Linguistic Facial Expressions

    ERIC Educational Resources Information Center

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  16. Four not six: Revealing culturally common facial expressions of emotion.

    PubMed

    Jack, Rachael E; Sun, Wei; Delis, Ioannis; Garrod, Oliver G B; Schyns, Philippe G

    2016-06-01

    As a highly social species, humans generate complex facial expressions to communicate a diverse range of emotions. Since Darwin's work, identifying which of these complex patterns are common across cultures and which are culture-specific has remained a central question in psychology, anthropology, philosophy, and more recently machine vision and social robotics. Classic approaches to addressing this question typically tested the cross-cultural recognition of theoretically motivated facial expressions representing 6 emotions, and reported universality. Yet, variable recognition accuracy across cultures suggests a narrower cross-cultural communication supported by sets of simpler expressive patterns embedded in more complex facial expressions. We explore this hypothesis by modeling the facial expressions of over 60 emotions across 2 cultures, and segregating out the latent expressive patterns. Using a multidisciplinary approach, we first map the conceptual organization of a broad spectrum of emotion words by building semantic networks in 2 cultures. For each emotion word in each culture, we then model and validate its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. We then apply to the pooled models a multivariate data reduction technique, revealing 4 latent and culturally common facial expression patterns that each communicates specific combinations of valence, arousal, and dominance. We then reveal the face movements that accentuate each latent expressive pattern to create complex facial expressions. Our data question the widely held view that 6 facial expression patterns are universal, instead suggesting 4 latent expressive patterns with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
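
    The data-reduction step described above can be approximated with a standard technique such as non-negative matrix factorization over the pooled expression models. In the sketch below, the data matrix is random and the choice of NMF is an assumption; the study's own reduction method may differ.

      # Sketch of the data-reduction step: stack the expression models as
      # rows of an (expressions x AU features) matrix and factor them into
      # a small number of latent expressive patterns.

      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      # Hypothetical data: 60 expression models x 42 AU-activation features.
      X = rng.random((60, 42))

      nmf = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
      W = nmf.fit_transform(X)   # (60, 4): how much each model uses each pattern
      H = nmf.components_        # (4, 42): the latent AU patterns themselves

      for k, pattern in enumerate(H):
          top = np.argsort(pattern)[-3:][::-1]
          print(f"latent pattern {k}: strongest AU features {top.tolist()}")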

  17. Project PAVE (Personality And Vision Experimentation): role of personal and interpersonal resilience in the perception of emotional facial expression

    PubMed Central

    Tanzer, Michal; Shahar, Golan; Avidan, Galia

    2014-01-01

    The aim of the proposed theoretical model is to illuminate personal and interpersonal resilience by drawing from the field of emotional face perception. We suggest that perception/recognition of emotional facial expressions serves as a central link between subjective, self-related processes and the social context. Emotional face perception constitutes a salient social cue underlying interpersonal communication and behavior. Because problems in communication and interpersonal behavior underlie most, if not all, forms of psychopathology, it follows that perception/recognition of emotional facial expressions impacts psychopathology. The ability to accurately interpret others' facial expressions is crucial in subsequently deciding on an appropriate course of action. However, perception in general, and of emotional facial expressions in particular, is highly influenced by individuals' personality and self-concept. Herein we briefly outline well-established theories of personal and interpersonal resilience and link them to the neuro-cognitive basis of face perception. We then describe the findings of our ongoing program of research linking two well-established resilience factors, general self-efficacy (GSE) and perceived social support (PSS), with face perception. We conclude by pointing out avenues for future research focusing on possible genetic markers and patterns of brain connectivity associated with the proposed model. Implications of our integrative model for psychotherapy are discussed. PMID:25165439

  18. Brain oscillations and BIS/BAS (behavioral inhibition/activation system) effects on processing masked emotional cues. ERS/ERD and coherence measures of alpha band.

    PubMed

    Balconi, Michela; Mazza, Guido

    2009-11-01

    Alpha brain oscillation modulation was analyzed in response to masked emotional facial expressions. In addition, the behavioral activation (BAS) and behavioral inhibition (BIS) systems were considered as explanatory factors to verify the effect of motivational significance on cortical activity. Nineteen subjects were presented with a broad range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral). The results demonstrated that anterior frontal sites were more active than central and posterior sites in response to facial stimuli. Moreover, right-side responses varied as a function of emotion type, with increased right-frontal activity for negative emotions. Finally, whereas higher-BIS subjects generated more right-hemisphere activation for some negative emotions (such as fear, anger, and surprise), reward-BAS subjects were more responsive to positive emotion (happiness) within the left hemisphere. The valence and potential threatening power of facial expressions were considered to elucidate these cortical differences.

  19. Pose-variant facial expression recognition using an embedded image system

    NASA Astrophysics Data System (ADS)

    Song, Kai-Tai; Han, Meng-Ju; Chang, Shuo-Hung

    2008-12-01

    In recent years, one of the most attractive research areas in human-robot interaction has been automated facial expression recognition. By recognizing facial expressions, a pet robot can interact with humans in a more natural manner. In this study, we focus on the facial pose-variant problem. A novel method is proposed in this paper to recognize pose-variant facial expressions. After locating the face position in an image frame, the active appearance model (AAM) is applied to track facial features. Fourteen feature points are extracted to represent the variation of facial expressions. The distances between feature points are defined as the feature values. These feature values are sent to a support vector machine (SVM) for facial expression determination. The pose-variant facial expression is classified into happiness, neutral, sadness, surprise, or anger. Furthermore, in order to evaluate the performance for practical applications, this study also built a low-resolution database (160x120 pixels) using a CMOS image sensor. Experimental results show a recognition rate of 84% on the self-built database.
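
    The distance-based feature computation described above can be sketched directly: tracked feature points are reduced to normalized pairwise distances and passed to an SVM. The 14-point layout and the all-pairs distance set below are illustrative; the paper's AAM point set and chosen distances may differ.

      # Sketch of distance-based expression features: tracked landmarks are
      # reduced to pairwise distances and fed to a multi-class SVM.

      import numpy as np
      from itertools import combinations
      from sklearn.svm import SVC

      def distance_features(points: np.ndarray) -> np.ndarray:
          """points: (14, 2) array of (x, y) landmark coordinates."""
          dists = np.array([np.linalg.norm(points[i] - points[j])
                            for i, j in combinations(range(len(points)), 2)])
          return dists / dists.max()   # normalize by the largest pairwise distance

      # clf = SVC(kernel="rbf")
      # clf.fit([distance_features(p) for p in train_landmarks], train_labels)
      # label = clf.predict([distance_features(test_landmarks)])  # e.g. "happiness"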

  20. Hierarchical Encoding of Social Cues in Primate Inferior Temporal Cortex.

    PubMed

    Morin, Elyse L; Hadj-Bouziane, Fadila; Stokes, Mark; Ungerleider, Leslie G; Bell, Andrew H

    2015-09-01

    Faces convey information about identity and emotional state, both of which are important for our social interactions. Models of face processing propose that changeable versus invariant aspects of a face, specifically facial expression/gaze direction versus facial identity, are coded by distinct neural pathways, and yet neurophysiological data supporting this separation are incomplete. We recorded activity from neurons along the inferior bank of the superior temporal sulcus (STS) while monkeys viewed images of conspecific faces and non-face control stimuli. Eight monkey identities were used, each presented with 3 different facial expressions (neutral, fear grin, and threat). All facial expressions were displayed with both a direct and averted gaze. In the posterior STS, we found that about one-quarter of face-responsive neurons are sensitive to social cues, the majority of which are sensitive to only one of these cues. In contrast, in anterior STS, not only did the proportion of neurons sensitive to social cues increase, but so too did the proportion of neurons sensitive to conjunctions of identity with either gaze direction or expression. These data support a convergence of signals related to faces as one moves anteriorly along the inferior bank of the STS, which forms a fundamental part of the face-processing network. Published by Oxford University Press 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  1. Attention to emotion modulates fMRI activity in human right superior temporal sulcus.

    PubMed

    Narumoto, J; Okada, T; Sadato, N; Fukui, K; Yonekura, Y

    2001-10-01

    A parallel neural network has been proposed for processing various types of information conveyed by faces, including emotion. Using functional magnetic resonance imaging (fMRI), we tested the effect of explicit attention to the emotional expression of faces on the neuronal activity of face-responsive regions. A delayed match-to-sample procedure was adopted. Subjects were required to match visually presented face pictures with regard to contour, facial identity, and emotional expression by valence (happy and fearful expressions) and arousal (fearful and sad expressions). Contour matching of non-face scrambled pictures was used as a control condition. The face-responsive regions that responded more to faces than to non-face stimuli were the bilateral lateral fusiform gyrus (LFG), the right superior temporal sulcus (STS), and the bilateral intraparietal sulcus (IPS). In these regions, general attention to the face enhanced the activities of the bilateral LFG, the right STS, and the left IPS compared with attention to the contour of the facial image. Selective attention to facial emotion specifically enhanced the activity of the right STS compared with attention to the face per se. The results suggest that the right STS region plays a special role in facial emotion recognition within distributed face-processing systems. This finding may support the notion that the STS is involved in social perception.

  2. A Virtual Environment to Improve the Detection of Oral-Facial Malfunction in Children with Cerebral Palsy.

    PubMed

    Martín-Ruiz, María-Luisa; Máximo-Bocanegra, Nuria; Luna-Oliva, Laura

    2016-03-26

    The importance of an early rehabilitation process for children with cerebral palsy (CP) is widely recognized. On the one hand, new and useful treatment tools, such as rehabilitation systems based on interactive technologies, have appeared for the rehabilitation of gross motor movements. On the other hand, from a therapeutic point of view, performing rehabilitation exercises with the facial muscles can improve the swallowing process, facial expression through the management of the muscles of the face, and even the speech of children with cerebral palsy. However, it is difficult to find interactive games that improve the detection and evaluation of oral-facial musculature dysfunctions in children with CP. This paper describes a framework based on strategies developed for interactive serious games that is created both for typically developing children and children with disabilities. Four interactive games are the core of a Virtual Environment called SONRIE. This paper demonstrates the benefits of SONRIE in monitoring children's oral-facial difficulties. The next steps will focus on the validation of SONRIE to carry out the rehabilitation process of oral-facial musculature in children with cerebral palsy.

  3. Facial expressions perceived by the adolescent brain: Towards the proficient use of low spatial frequency information.

    PubMed

    Peters, Judith C; Kemner, Chantal

    2017-10-01

    Rapid decoding of emotional expressions is essential for social communication. Fast processing of facial expressions depends on the adequate (subcortical) processing of important global face cues in the low spatial frequency (LSF) ranges. However, children below 9 years of age extract fearful expression information from local details represented by high SF (HSF) image content. Our ERP study investigated at which developmental stage this ineffective HSF-driven processing is replaced by the proficient and rapid LSF-driven perception of fearful faces, in which adults are highly skilled. We examined behavioral and neural responses to high- and low-pass filtered faces with a fearful or neutral expression in groups of children on the verge of pre-adolescence (9-10 years), adolescents (14-15 years), and young adults (20-28 years). Our results suggest that the neural emotional face processing network has a protracted maturational course into adolescence, which is related to changes in SF processing. In mid-adolescence, increased sensitivity to emotional LSF cues is developed, which aids the fast and adequate processing of fearful expressions that might signal impending danger. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Faces in-between: evaluations reflect the interplay of facial features and task-dependent fluency.

    PubMed

    Winkielman, Piotr; Olszanowski, Michal; Gola, Mateusz

    2015-04-01

    Facial features influence social evaluations. For example, faces are rated as more attractive and trustworthy when they have more smiling features and also more female features. However, the influence of facial features on evaluations should be qualified by the affective consequences of the fluency (cognitive ease) with which such features are processed. Further, fluency (along with its affective consequences) should depend on whether the current task highlights conflict between specific features. Four experiments are presented. In 3 experiments, participants saw faces varying in expressions ranging from pure anger, through mixed expression, to pure happiness. Perceivers first categorized faces either on a control dimension or an emotional dimension (angry/happy). Thus, the emotional categorization task made "pure" expressions fluent and "mixed" expressions disfluent. Next, participants made social evaluations. Results show that after emotional categorization, but not control categorization, targets with mixed expressions are relatively devalued. Further, this effect is mediated by categorization disfluency. Additional data from facial electromyography reveal that on a basic physiological level, affective devaluation of mixed expressions is driven by their objective ambiguity. The fourth experiment shows that the relative devaluation of mixed faces that vary in gender ambiguity requires a gender categorization task. Overall, these studies highlight that the impact of facial features on evaluation is qualified by their fluency, and that the fluency of features is a function of the current task. The discussion highlights the implications of these findings for research on emotional reactions to ambiguity. (c) 2015 APA, all rights reserved.

  5. Analysis of Facial Expression by Taste Stimulation

    NASA Astrophysics Data System (ADS)

    Tobitani, Kensuke; Kato, Kunihito; Yamamoto, Kazuhiko

    In this study, we focused on basic taste stimuli for the analysis of real facial expressions. We considered that the expressions caused by taste stimulation were unaffected by individuality or emotion, that is, that such expressions were involuntary. We analyzed the movement of facial muscles under taste stimulation and compared real expressions with artificial expressions. From the results, we identified a clear difference between real and artificial expressions. Thus, our method offers a new approach to facial expression recognition.

  6. Gender differences in facial imitation and verbally reported emotional contagion from spontaneous to emotionally regulated processing levels.

    PubMed

    Sonnby-Borgström, Marianne; Jönsson, Peter; Svensson, Owe

    2008-04-01

    Previous studies on gender differences in facial imitation and verbally reported emotional contagion have investigated emotional responses to pictures of facial expressions at supraliminal exposure times. The aim of the present study was to investigate how gender differences are related to different exposure times, representing information processing levels from subliminal (spontaneous) to supraliminal (emotionally regulated). Further, the study aimed at exploring correlations between verbally reported emotional contagion and facial responses for men and women. Masked pictures of angry, happy and sad facial expressions were presented to 102 participants (51 men) at exposure times from subliminal (23 ms) to clearly supraliminal (2500 ms). Myoelectric activity (EMG) from the corrugator and the zygomaticus was measured and the participants reported their hedonic tone (verbally reported emotional contagion) after stimulus exposures. The results showed an effect of exposure time on gender differences in facial responses as well as in verbally reported emotional contagion. Women amplified imitative responses towards happy vs. angry faces and verbally reported emotional contagion with prolonged exposure times, whereas men did not. No gender differences were detected at the subliminal or borderliminal exposure times, but at the supraliminal exposure gender differences were found in imitation as well as in verbally reported emotional contagion. Women showed correspondence between their facial responses and their verbally reported emotional contagion to a greater extent than men. The results were interpreted in terms of gender differences in emotion regulation, rather than as differences in biologically prepared emotional reactivity.

  7. Face inversion decreased information about facial identity and expression in face-responsive neurons in macaque area TE.

    PubMed

    Sugase-Miyamoto, Yasuko; Matsumoto, Narihisa; Ohyama, Kaoru; Kawano, Kenji

    2014-09-10

    To investigate the effect of face inversion and thatcherization (eye inversion) on the temporal processing stages of facial information, single-neuron activities in the temporal cortex (area TE) of two rhesus monkeys were recorded. Test stimuli were colored pictures of monkey faces (four with four different expressions), human faces (three with four different expressions), and geometric shapes. Modifications were made to each face picture, and its four variations were used as stimuli: upright original, inverted original, upright thatcherized, and inverted thatcherized faces. A total of 119 neurons responded to at least one of the upright original facial stimuli. A majority of the neurons (71%) showed activity modulations depending on upright versus inverted presentation, and a smaller number of neurons (13%) showed activity modulations depending on the original versus thatcherized condition. In the case of face inversion, information about the fine category (facial identity and expression) decreased, whereas information about the global category (monkey vs human vs shape) was retained for both the original and thatcherized faces. Principal component analysis on the neuronal population responses revealed that global categorization occurred regardless of face inversion and that the inverted faces were represented near the upright faces in the principal component analysis space. By contrast, face inversion decreased the ability to represent human facial identity and monkey facial expression. Thus, the neuronal population represented inverted faces as faces but failed to represent the identity and expression of the inverted faces, indicating that the neuronal representation in area TE causes the perceptual effect of face inversion. Copyright © 2014 the authors.

  8. Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone.

    PubMed

    Umiltà, Maria Allessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio

    2013-01-01

    Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing in former child soldiers and non-combatant civilians. All participants had experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adult war survivors. A total of 76 young male adults (38 former child soldier survivors and 38 civilian survivors) were tested to assess their ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in identifying the other three emotions, and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movie presentations, recognition of sadness was significantly worse for the former soldiers. While both former child soldiers and civilians were able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. This bias was particularly pronounced in former child soldiers. These findings point to a pervasive, long-lasting effect of childhood exposure to violence on emotion processing in later life.

  9. Impact of civil war on emotion recognition: the denial of sadness in Sierra Leone

    PubMed Central

    Umiltà, Maria Allessandra; Wood, Rachel; Loffredo, Francesca; Ravera, Roberto; Gallese, Vittorio

    2013-01-01

    Studies of children with atypical emotional experience demonstrate that childhood exposure to high levels of hostility and threat biases emotion perception. This study investigates emotion processing in former child soldiers and non-combatant civilians. All participants had experienced prolonged violence exposure during childhood. The study, carried out in Sierra Leone, aimed to examine the effects of exposure to and forced participation in acts of extreme violence on the emotion processing of young adult war survivors. A total of 76 young male adults (38 former child soldier survivors and 38 civilian survivors) were tested to assess their ability to identify four different facial emotion expressions from photographs and movies. Both groups were able to recognize facial expressions of emotion. However, despite their general ability to correctly identify facial emotions, participants showed a significant response bias in their recognition of sadness. Both former soldiers and civilians made more errors in identifying expressions of sadness than in identifying the other three emotions, and when mislabeling sadness participants most often described it as anger. Conversely, when making erroneous identifications of other emotions, participants were most likely to label the expressed emotion as sadness. In addition, while for three of the four emotions participants were better able to make a correct identification the greater the intensity of the expression, this pattern was not observed for sadness. During movie presentations, recognition of sadness was significantly worse for the former soldiers. While both former child soldiers and civilians were able to identify facial emotions, a significant response bias in their attribution of negative emotions was observed. This bias was particularly pronounced in former child soldiers. These findings point to a pervasive, long-lasting effect of childhood exposure to violence on emotion processing in later life. PMID:24027541

  10. Disconnection mechanism and regional cortical atrophy contribute to impaired processing of facial expressions and theory of mind in multiple sclerosis: a structural MRI study.

    PubMed

    Mike, Andrea; Strammer, Erzsebet; Aradi, Mihaly; Orsi, Gergely; Perlaki, Gabor; Hajnal, Andras; Sandor, Janos; Banati, Miklos; Illes, Eniko; Zaitsev, Alexander; Herold, Robert; Guttmann, Charles R G; Illes, Zsolt

    2013-01-01

    Successful socialization requires the ability to understand others' mental states. This ability, called mentalization (Theory of Mind), may become deficient and contribute to everyday life difficulties in multiple sclerosis. We aimed to explore the impact of brain pathology on mentalization performance in multiple sclerosis. Mentalization performance of 49 patients with multiple sclerosis was compared to that of 24 age- and gender-matched healthy controls. T1- and T2-weighted three-dimensional brain MRI images were acquired at 3 Tesla from the patients with multiple sclerosis and from 18 gender- and age-matched healthy controls. We assessed overall brain cortical thickness in the patients with multiple sclerosis and the scanned healthy controls, and measured the total and regional T1 and T2 white matter lesion volumes in the patients with multiple sclerosis. Performance in tests of recognition of mental states and emotions from facial expressions and eye gazes correlated with both the total T1-lesion load and the regional T1-lesion load of association fiber tracts interconnecting cortical regions related to visual and emotion processing (genu and splenium of the corpus callosum, right inferior longitudinal fasciculus, right inferior fronto-occipital fasciculus, uncinate fasciculus). Both of these tests showed correlations with specific cortical areas involved in emotion recognition from facial expressions (right and left fusiform face area, frontal eye field), processing of emotions (right entorhinal cortex), and socially relevant information (left temporal pole). Thus, both a disconnection mechanism due to white matter lesions and cortical thinning of specific brain areas may result in the cognitive deficit in multiple sclerosis affecting emotion and mental state processing from facial expressions, contributing to the everyday and social life difficulties of these patients.

  11. Processing of Emotional Faces in Patients with Chronic Pain Disorder: An Eye-Tracking Study.

    PubMed

    Giel, Katrin Elisabeth; Paganini, Sarah; Schank, Irena; Enck, Paul; Zipfel, Stephan; Junne, Florian

    2018-01-01

    Problems in emotion processing potentially contribute to the development and maintenance of chronic pain. Theories focusing on attentional processing have suggested that dysfunctional attention deployment toward emotional information, i.e., attentional biases for negative emotions, might constitute one potential developmental and/or maintenance factor of chronic pain. We assessed self-reported alexithymia and attentional orienting to and maintenance on emotional stimuli using eye tracking in 17 patients with chronic pain disorder (CP) and two age- and sex-matched control groups: 17 healthy individuals (HC) and 17 individuals matched to CP according to depressive symptoms (DC). In a choice viewing paradigm, a dot indicated the position of the emotional picture in the next trial to allow for strategic attention deployment. Picture pairs consisted of a happy or sad facial expression and a neutral facial expression of the same individual. Participants were asked to explore the picture pairs freely. The CP and DC groups reported higher alexithymia than the HC group. HC participants showed a previously reported emotionality bias by preferentially orienting to the emotional face and preferentially maintaining on the happy face. CP and DC participants showed no facilitated early attention to sad facial expressions, and DC participants showed no facilitated early attention to happy facial expressions, while CP and HC participants did. We found no group differences in attentional maintenance. Our findings are in line with the large clinical overlap between pain and depression. The blunted initial reaction to sadness could be interpreted as a failure of the attentional system to attend to evolutionarily salient emotional stimuli or as an attempt to suppress negative emotions. These difficulties in emotion processing might contribute to the etiology or maintenance of chronic pain and depression.

  12. A large-scale analysis of sex differences in facial expressions

    PubMed Central

    Kodra, Evan; el Kaliouby, Rana; LaFrance, Marianne

    2017-01-01

    There exists a stereotype that women are more expressive than men; however, research has almost exclusively focused on a single facial behavior, smiling. This large-scale study examines whether women are consistently more expressive than men or whether the effects depend on the emotion expressed. Studies of gender differences in expressivity have been somewhat restricted to data collected in lab settings or requiring labor-intensive manual coding. In the present study, we analyze gender differences in facial behaviors as over 2,000 viewers watch a set of video advertisements in their home environments. The facial responses were recorded using participants' own webcams. Using new automated facial coding technology, we coded facial activity. We find that women are not universally more expressive across all facial actions, nor are they more expressive in all positive-valence actions and less expressive in all negative-valence actions. It appears that women generally express actions more frequently than men, and in particular express more positive-valence actions. However, expressiveness is not greater in women for all negative-valence actions and depends on the discrete emotional state. PMID:28422963

  13. Social Alienation in Schizophrenia Patients: Association with Insula Responsiveness to Facial Expressions of Disgust

    PubMed Central

    Lindner, Christian; Dannlowski, Udo; Walhöfer, Kirsten; Rödiger, Maike; Maisch, Birgit; Bauer, Jochen; Ohrmann, Patricia; Lencer, Rebekka; Zwitserlood, Pienie; Kersting, Anette; Heindel, Walter; Arolt, Volker

    2014-01-01

    Among the functional neuroimaging studies on emotional face processing in schizophrenia, few have used paradigms with facial expressions of disgust. In this study, we investigated whether schizophrenia patients show less insula activation to macro-expressions (overt, clearly visible expressions) and micro-expressions (covert, very brief expressions) of disgust than healthy controls. Furthermore, departing from the assumption that disgust faces signal social rejection, we examined whether perceptual sensitivity to disgust is related to social alienation in patients and controls. We hypothesized that high insula responsiveness to facial disgust predicts social alienation. We used functional magnetic resonance imaging to measure insula activation in 36 schizophrenia patients and 40 healthy controls. During scanning, subjects passively viewed covert and overt presentations of disgust and neutral faces. To measure social alienation, a social loneliness scale and an agreeableness scale were administered. Schizophrenia patients exhibited reduced insula activation in response to covert facial expressions of disgust. With respect to macro-expressions of disgust, no between-group differences emerged. In patients, insula responsiveness to covert faces of disgust was positively correlated with social loneliness. Furthermore, patients' insula responsiveness to covert and overt faces of disgust was negatively correlated with agreeableness. In controls, insula responsiveness to covert expressions of disgust correlated negatively with agreeableness. In sum, schizophrenia patients show reduced insula responsiveness to micro-expressions but not macro-expressions of disgust compared to healthy controls. In patients, low agreeableness was associated with a stronger insula response to micro- and macro-expressions of disgust. Patients with a strong tendency to feel uncomfortable in social interactions appear to be characterized by a high sensitivity to facial expressions signaling social rejection. Given the association of insula responsiveness to covert disgust expressions with low agreeableness in healthy individuals, insula responsiveness to expressions of disgust might in general be a neural marker of the personality trait of agreeableness. PMID:24465469

  14. Social alienation in schizophrenia patients: association with insula responsiveness to facial expressions of disgust.

    PubMed

    Lindner, Christian; Dannlowski, Udo; Walhöfer, Kirsten; Rödiger, Maike; Maisch, Birgit; Bauer, Jochen; Ohrmann, Patricia; Lencer, Rebekka; Zwitserlood, Pienie; Kersting, Anette; Heindel, Walter; Arolt, Volker; Kugel, Harald; Suslow, Thomas

    2014-01-01

    Among the functional neuroimaging studies on emotional face processing in schizophrenia, few have used paradigms with facial expressions of disgust. In this study, we investigated whether schizophrenia patients show less insula activation to macro-expressions (overt, clearly visible expressions) and micro-expressions (covert, very brief expressions) of disgust than healthy controls. Furthermore, departing from the assumption that disgust faces signal social rejection, we examined whether perceptual sensitivity to disgust is related to social alienation in patients and controls. We hypothesized that high insula responsiveness to facial disgust predicts social alienation. We used functional magnetic resonance imaging to measure insula activation in 36 schizophrenia patients and 40 healthy controls. During scanning, subjects passively viewed covert and overt presentations of disgust and neutral faces. To measure social alienation, a social loneliness scale and an agreeableness scale were administered. Schizophrenia patients exhibited reduced insula activation in response to covert facial expressions of disgust. With respect to macro-expressions of disgust, no between-group differences emerged. In patients, insula responsiveness to covert faces of disgust was positively correlated with social loneliness. Furthermore, patients' insula responsiveness to covert and overt faces of disgust was negatively correlated with agreeableness. In controls, insula responsiveness to covert expressions of disgust correlated negatively with agreeableness. In sum, schizophrenia patients show reduced insula responsiveness to micro-expressions but not macro-expressions of disgust compared to healthy controls. In patients, low agreeableness was associated with a stronger insula response to micro- and macro-expressions of disgust. Patients with a strong tendency to feel uncomfortable in social interactions appear to be characterized by a high sensitivity to facial expressions signaling social rejection. Given the association of insula responsiveness to covert disgust expressions with low agreeableness in healthy individuals, insula responsiveness to expressions of disgust might in general be a neural marker of the personality trait of agreeableness.

  15. The Right Place at the Right Time: Priming Facial Expressions with Emotional Face Components in Developmental Visual Agnosia

    PubMed Central

    Aviezer, Hillel; Hassin, Ran. R.; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-01-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normally sighted individuals extract information about expressed emotions from face regions whose activity is diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining whether priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full face and then categorized the face's emotion. Critically, the matched components came from regions that were diagnostic or non-diagnostic of the emotion portrayed by the full face. In Experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted to within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in Experiment 2, when the face exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished, as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. PMID:22349446

  16. The association between PTSD and facial affect recognition.

    PubMed

    Williams, Christian L; Milanak, Melissa E; Judah, Matt R; Berenbaum, Howard

    2018-05-05

    The major aims of this study were to examine whether, and how, higher levels of PTSD are associated with performance on a facial affect recognition task in which facial expressions of emotion are superimposed on emotionally valenced, non-face images. College students with trauma histories (N = 90) completed a facial affect recognition task as well as measures of exposure to traumatic events and PTSD symptoms. When the face and context matched, participants with higher levels of PTSD were significantly more accurate. When the face and context were mismatched, participants with lower levels of PTSD were more accurate than those with higher levels of PTSD. These findings suggest that PTSD is associated with how people process affective information. Furthermore, these results suggest that the enhanced attention of people with higher levels of PTSD to affective information can be either beneficial or detrimental to their ability to accurately identify facial expressions of emotion. Limitations, future directions, and clinical implications are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Sparse coding for flexible, robust 3D facial-expression synthesis.

    PubMed

    Lin, Yuxu; Song, Mingli; Quynh, Dao Thi Phuong; He, Ying; Chen, Chun

    2012-01-01

    Computer animation researchers have been extensively investigating 3D facial-expression synthesis for decades. However, flexible, robust production of realistic 3D facial expressions is still technically challenging. A proposed modeling framework applies sparse coding to synthesize 3D expressive faces, using specified coefficients or expression examples. It also robustly recovers facial expressions from noisy and incomplete data. This approach can synthesize higher-quality expressions in less time than the state-of-the-art techniques.
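
    The synthesis-by-sparse-coding idea lends itself to a compact sketch: a face, represented as a flattened vertex vector, is coded as a sparse combination of dictionary atoms and re-synthesized from that code. The dictionary below is random for illustration; in the paper's framework it would be learned from example expressions.

      # Minimal sketch of sparse-coding synthesis: a new expressive face
      # (flattened mesh vertices) is coded as a sparse combination of
      # expression bases, then re-synthesized from that code.

      import numpy as np
      from sklearn.decomposition import SparseCoder

      rng = np.random.default_rng(1)
      n_atoms, n_vertex_coords = 30, 3 * 500          # 500 vertices x (x, y, z)
      D = rng.standard_normal((n_atoms, n_vertex_coords))
      D /= np.linalg.norm(D, axis=1, keepdims=True)   # unit-norm atoms

      coder = SparseCoder(dictionary=D, transform_algorithm="omp",
                          transform_n_nonzero_coefs=5)

      noisy_face = rng.standard_normal((1, n_vertex_coords))  # stand-in input
      code = coder.transform(noisy_face)        # sparse expression coefficients
      synthesized = code @ D                    # reconstructed 3D face vector
      print(np.count_nonzero(code), "active atoms used for synthesis")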

  18. Facial expressions recognition with an emotion expressive robotic head

    NASA Astrophysics Data System (ADS)

    Doroftei, I.; Adascalitei, F.; Lefeber, D.; Vanderborght, B.; Doroftei, I. A.

    2016-08-01

    The purpose of this study is to present the preliminary steps in facial expression recognition with a new version of an expressive social robotic head. In the first phase, our main goal was to reach a minimum level of emotional expressiveness, in order to enable nonverbal communication between the robot and humans, by building six basic facial expressions. To evaluate the facial expressions, the robot was used in preliminary user studies with children and adults.

  19. A Neural Basis of Facial Action Recognition in Humans

    PubMed Central

    Srinivasan, Ramprakash; Golomb, Julie D.

    2016-01-01

    By combining different facial muscle actions, called action units, humans can produce an extraordinarily large number of facial expressions. Computational models and studies in cognitive science and social psychology have long hypothesized that the brain needs to visually interpret these action units to understand other people's actions and intentions. Surprisingly, no studies have identified the neural basis of the visual recognition of these action units. Here, using functional magnetic resonance imaging and an innovative machine learning analysis approach, we identify a consistent and differential coding of action units in the brain. Crucially, in a brain region thought to be responsible for the processing of changeable aspects of the face, multivoxel pattern analysis could decode the presence of specific action units in an image. This coding was found to be consistent across people, facilitating the estimation of the perceived action units on participants not used to train the multivoxel decoder. Furthermore, this coding of action units was identified when participants attended to the emotion category of the facial expression, suggesting an interaction between the visual analysis of action units and emotion categorization as predicted by the computational models mentioned above. These results provide the first evidence for a representation of action units in the brain and suggest a mechanism for the analysis of large numbers of facial actions and a loss of this capacity in psychopathologies. SIGNIFICANCE STATEMENT Computational models and studies in cognitive and social psychology propound that visual recognition of facial expressions requires an intermediate step to identify visible facial changes caused by the movement of specific facial muscles. Because facial expressions are indeed created by moving one's facial muscles, it is logical to assume that our visual system solves this inverse problem. Here, using an innovative machine learning method and neuroimaging data, we identify for the first time a brain region responsible for the recognition of actions associated with specific facial muscles. Furthermore, this representation is preserved across subjects. Our machine learning analysis does not require mapping the data to a standard brain and may serve as an alternative to hyperalignment. PMID:27098688
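
    The cross-participant decoding described above can be made concrete with a leave-one-subject-out sketch: a linear classifier trained on all other participants' voxel patterns is tested on the held-out participant. All data below are synthetic placeholders, and logistic regression stands in for whatever multivoxel decoder the authors actually used.

      # Sketch of cross-participant decoding: train on voxel patterns from
      # all but one participant, test on the held-out participant's patterns
      # for the presence of one action unit.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n_subj, n_trials, n_voxels = 8, 40, 200
      X = rng.standard_normal((n_subj, n_trials, n_voxels))  # voxel patterns
      y = rng.integers(0, 2, size=(n_subj, n_trials))        # AU present / absent

      scores = []
      for held_out in range(n_subj):                     # leave-one-subject-out
          train = [s for s in range(n_subj) if s != held_out]
          clf = LogisticRegression(max_iter=1000)
          clf.fit(X[train].reshape(-1, n_voxels), y[train].ravel())
          scores.append(clf.score(X[held_out], y[held_out]))

      print(f"mean cross-participant decoding accuracy: {np.mean(scores):.2f}")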

  20. Developmental Changes in the Primacy of Facial Cues for Emotion Recognition

    ERIC Educational Resources Information Center

    Leitzke, Brian T.; Pollak, Seth D.

    2016-01-01

    There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to…

  1. Sex Differences in Facial Scanning: Similarities and Dissimilarities between Infants and Adults

    ERIC Educational Resources Information Center

    Rennels, Jennifer L.; Cummings, Andrew J.

    2013-01-01

    When face processing studies find sex differences, male infants appear better at face recognition than female infants, whereas female adults appear better at face recognition than male adults. Both female infants and adults, however, discriminate emotional expressions better than males. To investigate if sex and age differences in facial scanning…

  2. Developmental changes in the primacy of facial cues for emotion recognition.

    PubMed

    Leitzke, Brian T; Pollak, Seth D

    2016-04-01

    There have been long-standing differences of opinion regarding the influence of the face relative to that of contextual information on how individuals process and judge facial expressions of emotion. However, developmental changes in how individuals use such information have remained largely unexplored and could be informative in attempting to reconcile these opposing views. The current study tested for age-related differences in how individuals prioritize viewing emotional faces versus contexts when making emotion judgments. To do so, we asked 4-, 8-, and 12-year-old children as well as college students to categorize facial expressions of emotion that were presented with scenes that were either congruent or incongruent with the facial displays. During this time, we recorded participants' gaze patterns via eye tracking. College students directed their visual attention primarily to the face, regardless of contextual information. Children, however, divided their attention between both the face and the context as sources of emotional information depending on the valence of the context. These findings reveal a developmental shift in how individuals process and integrate emotional cues. (c) 2016 APA, all rights reserved.

  3. Viewing distance matter to perceived intensity of facial expressions

    PubMed Central

    Gerhardsson, Andreas; Högman, Lennart; Fischer, Håkan

    2015-01-01

    In our daily perception of facial expressions, we depend on the ability to generalize across the varied distances at which they may appear. This is important to how we interpret the quality and intensity of an expression. Previous research has not investigated whether this so-called perceptual constancy also applies to the experienced intensity of facial expressions. Using a psychophysical measure (the Borg CR100 scale), the present study aimed to further investigate the perceptual constancy of happy and angry facial expressions at varied sizes, a proxy for varying viewing distances. Seventy-one participants (42 female) rated the intensity and valence of facial expressions varying in distance and intensity. The results demonstrated that the perceived intensity (PI) of an emotional facial expression depended on the distance of the face and on the person perceiving it. An interaction effect was noted, indicating that close-up faces are perceived as more intense than faces at a distance, and that this effect is stronger the more intense the facial expression truly is. The present study raises considerations regarding the constancy of the PI of happy and angry facial expressions at varied distances. PMID:26191035

  4. Gaze Behavior of Children with ASD toward Pictures of Facial Expressions.

    PubMed

    Matsuda, Soichiro; Minagawa, Yasuyo; Yamamoto, Junichi

    2015-01-01

    Atypical gaze behavior in response to a face has been well documented in individuals with autism spectrum disorders (ASDs). Children with ASD appear to differ from typically developing (TD) children in gaze behavior for spoken and dynamic face stimuli but not for nonspeaking, static face stimuli. Furthermore, children with ASD and TD children show a difference in their gaze behavior for certain expressions. However, few studies have examined the relationship between autism severity and gaze behavior toward certain facial expressions. The present study replicated and extended previous studies by examining gaze behavior towards pictures of facial expressions. We presented children with ASD and TD children with pictures of surprised, happy, neutral, angry, and sad facial expressions. Autism severity was assessed using the Childhood Autism Rating Scale (CARS). The results showed no group difference in gaze behavior when looking at pictures of facial expressions. However, the children with ASD who had more severe autistic symptomatology tended to gaze at angry facial expressions for a shorter duration than at other facial expressions. These findings suggest that autism severity should be considered when examining atypical responses to certain facial expressions.

  5. Gaze Behavior of Children with ASD toward Pictures of Facial Expressions

    PubMed Central

    Matsuda, Soichiro; Minagawa, Yasuyo; Yamamoto, Junichi

    2015-01-01

    Atypical gaze behavior in response to a face has been well documented in individuals with autism spectrum disorders (ASDs). Children with ASD appear to differ from typically developing (TD) children in gaze behavior for spoken and dynamic face stimuli but not for nonspeaking, static face stimuli. Furthermore, children with ASD and TD children show a difference in their gaze behavior for certain expressions. However, few studies have examined the relationship between autism severity and gaze behavior toward certain facial expressions. The present study replicated and extended previous studies by examining gaze behavior towards pictures of facial expressions. We presented children with ASD and TD children with pictures of surprised, happy, neutral, angry, and sad facial expressions. Autism severity was assessed using the Childhood Autism Rating Scale (CARS). The results showed no group difference in gaze behavior when looking at pictures of facial expressions. However, the children with ASD who had more severe autistic symptomatology tended to gaze at angry facial expressions for a shorter duration than at other facial expressions. These findings suggest that autism severity should be considered when examining atypical responses to certain facial expressions. PMID:26090223
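
    The gaze measure underlying both records above, fixation duration toward each expression category, reduces to a simple aggregation. The fixation record format in the sketch below is invented for the example.

      # Sketch of the gaze measure implied by the abstract: total fixation
      # duration on each facial-expression stimulus, summed per category.

      from collections import defaultdict

      # (expression shown, fixation duration in ms) for one participant
      fixations = [
          ("angry", 180), ("happy", 240), ("angry", 120),
          ("neutral", 300), ("sad", 210), ("happy", 150),
      ]

      dwell = defaultdict(int)
      for expression, duration_ms in fixations:
          dwell[expression] += duration_ms

      total = sum(dwell.values())
      for expression, ms in sorted(dwell.items(), key=lambda kv: -kv[1]):
          print(f"{expression:8s} {ms:4d} ms ({ms / total:.0%} of gaze time)")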

  6. 4D ultrasound study of fetal facial expressions in the third trimester of pregnancy.

    PubMed

    AboEllail, Mohamed Ahmed Mostafa; Kanenishi, Kenji; Mori, Nobuhiro; Mohamed, Osman Abdel Kareem; Hata, Toshiyuki

    2018-07-01

    Our aim was to evaluate the frequencies of fetal facial expressions in the third trimester of pregnancy, when fetal brain maturation and development are progressing in normal healthy fetuses. Four-dimensional (4D) ultrasound was used to examine the facial expressions of 111 healthy fetuses between 30 and 40 weeks of gestation. The frequencies of seven facial expressions (mouthing, yawning, smiling, tongue expulsion, scowling, sucking, and blinking) during 15-minute recordings were assessed. The fetuses were further divided into three gestational age groups (25 fetuses at 30-31 weeks, 43 at 32-35 weeks, and 43 at ≥36 weeks), and facial expressions were compared among the three groups to determine their changes with advancing gestation. Mouthing was the most frequent facial expression at 30-40 weeks of gestation, followed by blinking; both were significantly more frequent than the other expressions (p < .05). The frequency of yawning decreased with gestational age after 30 weeks of gestation (p = .031) and was significantly higher at 30-31 weeks than at 36-40 weeks (p < .05). The other facial expressions did not change between 30 and 40 weeks, and showed no significant differences among the three gestational age groups. Our results suggest that 4D ultrasound assessment of fetal facial expressions may be a useful modality for evaluating fetal brain maturation and development. The decreasing frequency of fetal yawning after 30 weeks of gestation may explain the emergence of distinct states of arousal.

  7. When Early Experiences Build a Wall to Others’ Emotions: An Electrophysiological and Autonomic Study

    PubMed Central

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Sestito, Mariateresa; Ravera, Roberto; Gallese, Vittorio

    2013-01-01

    Facial expression of emotions is a powerful vehicle for communicating information about others' emotional states, and it normally induces facial mimicry in observers. The aim of this study was to investigate whether early aversive experiences interfere with emotion recognition, facial mimicry, and the autonomic regulation of social behaviors. We conducted a facial emotion recognition task in a group of “street-boys” and in an age-matched control group. In response to the observation of facial expressions of emotions, we recorded facial electromyography (EMG), a marker of facial mimicry, and respiratory sinus arrhythmia (RSA), an index of the recruitment of the autonomic system that promotes social behaviors and social predisposition. Results showed an over-attribution of anger and reduced EMG responses during the observation of both positive and negative expressions only among the street-boys. Street-boys also showed lower RSA after observing facial expressions and ineffective RSA suppression during the presentation of non-threatening expressions. Our findings suggest that early aversive experiences alter not only emotion recognition but also facial mimicry of emotions. These deficits affect the autonomic regulation of social behaviors, inducing lower social predisposition after viewing facial expressions and ineffective recruitment of defensive behavior in response to non-threatening expressions. PMID:23593374

  8. Children's Representations of Facial Expression and Identity: Identity-Contingent Expression Aftereffects

    ERIC Educational Resources Information Center

    Vida, Mark D.; Mondloch, Catherine J.

    2009-01-01

    This investigation used adaptation aftereffects to examine developmental changes in the perception of facial expressions. Previous studies have shown that adults' perceptions of ambiguous facial expressions are biased following adaptation to intense expressions. These expression aftereffects are strong when the adapting and probe expressions share…

  9. The face is not an empty canvas: how facial expressions interact with facial appearance.

    PubMed

    Hess, Ursula; Adams, Reginald B; Kleck, Robert E

    2009-12-12

    Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

  10. Using Video Modeling to Teach Children with PDD-NOS to Respond to Facial Expressions

    ERIC Educational Resources Information Center

    Axe, Judah B.; Evans, Christine J.

    2012-01-01

    Children with autism spectrum disorders often exhibit delays in responding to facial expressions, and few studies have examined teaching responding to subtle facial expressions to this population. We used video modeling to train 3 participants with PDD-NOS (age 5) to respond to eight facial expressions: approval, bored, calming, disapproval,…

  11. Development of a Support Application and a Textbook for Practicing Facial Expression Detection for Students with Visual Impairment

    ERIC Educational Resources Information Center

    Saito, Hirotaka; Ando, Akinobu; Itagaki, Shota; Kawada, Taku; Davis, Darold; Nagai, Nobuyuki

    2017-01-01

    Until now, when practicing facial expression recognition skills in the nonverbal communication areas of social skills training (SST), judgments of facial expression were not quantitative because the subjects of SST were assessed by teachers. Therefore, we considered whether SST could be performed using facial expression detection devices that can quantitatively measure facial…

  12. Facial Expression Recognition Deficits and Faulty Learning: Implications for Theoretical Models and Clinical Applications

    ERIC Educational Resources Information Center

    Sheaffer, Beverly L.; Golden, Jeannie A.; Averett, Paige

    2009-01-01

    The ability to recognize facial expressions of emotion is integral in social interaction. Although the importance of facial expression recognition is reflected in increased research interest as well as in popular culture, clinicians may know little about this topic. The purpose of this article is to discuss facial expression recognition literature…

  13. Violent Media Consumption and the Recognition of Dynamic Facial Expressions

    ERIC Educational Resources Information Center

    Kirsh, Steven J.; Mounts, Jeffrey R. W.; Olczak, Paul V.

    2006-01-01

    This study assessed the speed of recognition of facial emotional expressions (happy and angry) as a function of violent media consumption. Color photos of calm facial expressions were morphed into either an angry or a happy facial expression. Participants were asked to make a speeded identification of the emotion (happiness or anger) during the morph.…

  14. Blend Shape Interpolation and FACS for Realistic Avatar

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

    The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus towards the rapid advancement of complex virtual human facial models. Because face-to-face communication is the most natural form of human interaction, facial animation systems have become attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors nevertheless remains challenging. A facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc. The mood of a particular person in the midst of a large group can immediately be identified via very subtle changes in facial expressions. Facial expressions, being a complex as well as important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face, while the FACS is employed to reflect the exact facial muscle movements for four basic emotional expressions (anger, happiness, sadness and fear) with high fidelity. The results on perceiving realistic facial expressions of virtual human emotions, based on facial skin color and texture, may contribute towards the development of virtual reality and game environments in computer-aided graphics animation systems.
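
    For readers unfamiliar with BSI, the sketch below shows its core operation: a weighted linear combination of per-expression delta shapes added to a neutral mesh. The data layout, vertex count, and weights are hypothetical stand-ins, and the FACS-driven mapping from action units to blend weights is not reproduced here.

        # Minimal blend-shape interpolation sketch (hypothetical data layout).
        import numpy as np

        def blend_shapes(neutral, deltas, weights):
            """Interpolate a face mesh from blend-shape targets.

            neutral : (V, 3) vertex positions of the rest face.
            deltas  : (K, V, 3) array; deltas[k] = target_k - neutral.
            weights : (K,) blend weights, typically in [0, 1].
            """
            weights = np.clip(np.asarray(weights, dtype=float), 0.0, 1.0)
            return neutral + np.tensordot(weights, deltas, axes=1)

        # Example: 60% of a "happy" target mixed with 20% of a "sad" target.
        V = 1000                                  # vertex count (made up)
        neutral = np.zeros((V, 3))
        deltas = np.random.randn(2, V, 3) * 0.01  # stand-ins for sculpted targets
        mesh = blend_shapes(neutral, deltas, [0.6, 0.2])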

  15. iFER: facial expression recognition using automatically selected geometric eye and eyebrow features

    NASA Astrophysics Data System (ADS)

    Oztel, Ismail; Yolcu, Gozde; Oz, Cemil; Kazan, Serap; Bunyak, Filiz

    2018-03-01

    Facial expressions have an important role in interpersonal communications and estimation of emotional states or intentions. Automatic recognition of facial expressions has led to many practical applications and has become one of the important topics in computer vision. We present a facial expression recognition system that relies on geometry-based features extracted from eye and eyebrow regions of the face. The proposed system detects keypoints on frontal face images and forms a feature set using geometric relationships among groups of detected keypoints. The obtained feature set is refined and reduced using the sequential forward selection (SFS) algorithm and fed to a support vector machine classifier to recognize five facial expression classes. The proposed system, iFER (eye-eyebrow only facial expression recognition), is robust to lower face occlusions that may be caused by beards, mustaches, scarves, etc., and to lower face motion during speech production. Preliminary experiments on benchmark datasets produced promising results, outperforming previous facial expression recognition studies using partial face features and approaching studies using whole face information: accuracy was only ~2.5% lower than the best whole-face system while using only ~1/3 of the facial region.
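
    A minimal sketch of the selection-plus-classification stage is given below, assuming a matrix X of geometric eye/eyebrow features and labels y; the feature counts, parameters, and data are illustrative stand-ins, not the authors' configuration.

        # Sequential forward selection wrapped around an SVM (scikit-learn).
        import numpy as np
        from sklearn.feature_selection import SequentialFeatureSelector
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X = np.random.rand(200, 40)        # 40 candidate geometric features (made up)
        y = np.random.randint(0, 5, 200)   # five expression classes

        svm = SVC(kernel="rbf")
        sfs = SequentialFeatureSelector(svm, n_features_to_select=10,
                                        direction="forward", cv=5)
        model = make_pipeline(StandardScaler(), sfs, svm)
        model.fit(X, y)
        mask = model.named_steps["sequentialfeatureselector"].get_support()
        print("selected features:", np.flatnonzero(mask))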

  16. That "poker face" just might lose you the game! The impact of expressive suppression and mimicry on sensitivity to facial expressions of emotion.

    PubMed

    Schneider, Kristin G; Hempel, Roelie J; Lynch, Thomas R

    2013-10-01

    Successful interpersonal functioning often requires both the ability to mask inner feelings and the ability to accurately recognize others' expressions--but what if effortful control of emotional expressions impacts the ability to accurately read others? In this study, we examined the influence of self-controlled expressive suppression and mimicry on facial affect sensitivity--the speed with which one can accurately identify gradually intensifying facial expressions of emotion. Muscle activity of the brow (corrugator, related to anger), upper lip (levator, related to disgust), and cheek (zygomaticus, related to happiness) were recorded using facial electromyography while participants randomized to one of three conditions (Suppress, Mimic, and No-Instruction) viewed a series of six distinct emotional expressions (happiness, sadness, fear, anger, surprise, and disgust) as they morphed from neutral to full expression. As hypothesized, individuals instructed to suppress their own facial expressions showed impairment in facial affect sensitivity. Conversely, mimicry of emotion expressions appeared to facilitate facial affect sensitivity. Results suggest that it is difficult for a person to be able to simultaneously mask inner feelings and accurately "read" the facial expressions of others, at least when these expressions are at low intensity. The combined behavioral and physiological data suggest that the strategies an individual selects to control his or her own expression of emotion have important implications for interpersonal functioning.

  17. Intact anger recognition in depression despite aberrant visual facial information usage.

    PubMed

    Clark, Cameron M; Chiu, Carina G; Diaz, Ruth L; Goghari, Vina M

    2014-08-01

    Previous literature has indicated abnormalities in facial emotion recognition abilities, as well as deficits in basic visual processes, in major depression. However, the literature is unclear on a number of important factors, including whether these abnormalities represent deficient or enhanced emotion recognition abilities compared to control populations, and the degree to which basic visual deficits might impact this process. The present study investigated emotion recognition abilities for angry versus neutral facial expressions in a sample of undergraduate students with Beck Depression Inventory-II (BDI-II) scores indicative of moderate depression (i.e., ≥20), compared to matched low-BDI-II score (i.e., ≤2) controls, via the Bubbles Facial Emotion Perception Task. Results indicated unimpaired behavioural performance in discriminating angry from neutral expressions in the high depressive symptoms group relative to the minimal depressive symptoms group, despite evidence of an abnormal pattern of visual facial information usage. The generalizability of the current findings is limited by the highly structured nature of the facial emotion recognition task used, as well as the use of an analog sample of undergraduates scoring high in self-rated symptoms of depression rather than a clinical sample. Our findings suggest that basic visual processes are involved in emotion recognition abnormalities in depression, demonstrating consistency with the emotion recognition literature in other psychopathologies (e.g., schizophrenia, autism, social anxiety). Future research should seek to replicate these findings in clinical populations with major depression, and assess the association between aberrant face gaze behaviours and symptom severity and social functioning. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Accurate landmarking of three-dimensional facial data in the presence of facial expressions and occlusions using a three-dimensional statistical facial feature model.

    PubMed

    Zhao, Xi; Dellandréa, Emmanuel; Chen, Liming; Kakadiaris, Ioannis A

    2011-10-01

    Three-dimensional face landmarking aims at automatically localizing facial landmarks and has a wide range of applications (e.g., face recognition, face tracking, and facial expression analysis). Existing methods assume neutral facial expressions and unoccluded faces. In this paper, we propose a general learning-based framework for reliable landmark localization on 3-D facial data under challenging conditions (i.e., facial expressions and occlusions). Our approach relies on a statistical model, called 3-D statistical facial feature model, which learns both the global variations in configurational relationships between landmarks and the local variations of texture and geometry around each landmark. Based on this model, we further propose an occlusion classifier and a fitting algorithm. Results from experiments on three publicly available 3-D face databases (FRGC, BU-3-DFE, and Bosphorus) demonstrate the effectiveness of our approach, in terms of landmarking accuracy and robustness, in the presence of expressions and occlusions.
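
    As a toy illustration of the global-configuration component of such a statistical model, the sketch below learns a PCA shape subspace from stacked landmark coordinates and projects a noisy landmark estimate back onto it; the local texture/geometry descriptors, occlusion classifier, and fitting algorithm from the paper are omitted, and all data are simulated.

        # PCA shape model over landmark configurations (toy version).
        import numpy as np

        train = np.random.rand(300, 15, 3)    # 300 faces x 15 landmarks x xyz (simulated)
        X = train.reshape(len(train), -1)
        mean = X.mean(axis=0)
        U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
        components = Vt[:10]                  # retain 10 shape modes
        sigma = S[:10] / np.sqrt(len(X) - 1)  # per-mode standard deviations

        def regularize(landmarks):
            """Project a (15, 3) landmark estimate onto the shape model."""
            b = components @ (landmarks.ravel() - mean)   # shape parameters
            b = np.clip(b, -3 * sigma, 3 * sigma)         # keep the shape plausible
            return (mean + components.T @ b).reshape(15, 3)

        noisy = train[0] + np.random.randn(15, 3) * 0.05
        cleaned = regularize(noisy)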

  19. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery.

    PubMed

    Tanaka, James W; Wolf, Julie M; Klaiman, Cheryl; Koenig, Kathleen; Cockburn, Jeffrey; Herlihy, Lauren; Brown, Carla; Stahl, Sherin S; South, Mikle; McPartland, James C; Kaiser, Martha D; Schultz, Robert T

    2012-12-01

    Although impaired social-emotional ability is a hallmark of autism spectrum disorder (ASD), the perceptual skills and mediating strategies contributing to the social deficits of autism are not well understood. A perceptual skill that is fundamental to effective social communication is the ability to accurately perceive and interpret facial emotions. To evaluate the expression processing of participants with ASD, we designed the Let's Face It! Emotion Skills Battery (LFI! Battery), a computer-based assessment composed of three subscales measuring verbal and perceptual skills implicated in the recognition of facial emotions. We administered the LFI! Battery to groups of participants with ASD and typically developing control (TDC) participants that were matched for age and IQ. On the Name Game labeling task, participants with ASD (N = 68) performed on par with TDC individuals (N = 66) in their ability to name the facial emotions of happy, sad, disgust and surprise and were only impaired in their ability to identify the angry expression. On the Matchmaker Expression task that measures the recognition of facial emotions across different facial identities, the ASD participants (N = 66) performed reliably worse than TDC participants (N = 67) on the emotions of happy, sad, disgust, frighten and angry. In the Parts-Wholes test of perceptual strategies of expression, the TDC participants (N = 67) displayed more holistic encoding for the eyes than the mouths in expressive faces whereas ASD participants (N = 66) exhibited the reverse pattern of holistic recognition for the mouth and analytic recognition of the eyes. In summary, findings from the LFI! Battery show that participants with ASD were able to label the basic facial emotions (with the exception of angry expression) on par with age- and IQ-matched TDC participants. However, participants with ASD were impaired in their ability to generalize facial emotions across different identities and showed a tendency to recognize the mouth feature holistically and the eyes as isolated parts. © 2012 The Authors. Journal of Child Psychology and Psychiatry © 2012 Association for Child and Adolescent Mental Health.

  20. Mapping spontaneous facial expression in people with Parkinson's disease: A multiple case study design.

    PubMed

    Gunnery, Sarah D; Naumova, Elena N; Saint-Hilaire, Marie; Tickle-Degnen, Linda

    2017-01-01

    People with Parkinson's disease (PD) often experience a decrease in their facial expressivity, but little is known about how the coordinated movements across regions of the face are impaired in PD. The face has neurologically independent regions that coordinate to articulate distinct social meanings that others perceive as gestalt expressions, and so understanding how different regions of the face are affected is important. Using the Facial Action Coding System, this study comprehensively measured spontaneous facial expression across 600 frames for a multiple case study of people with PD who were rated as having varying degrees of facial expression deficits, and created correlation matrices for frequency and intensity of produced muscle activations across different areas of the face. Data visualization techniques were used to create temporal and correlational mappings of muscle action in the face at different degrees of facial expressivity. Results showed that as severity of facial expression deficit increased, there was a decrease in number, duration, intensity, and coactivation of facial muscle action. This understanding of how regions of the parkinsonian face move independently and in conjunction with other regions will provide a new focus for future research aiming to model how facial expression in PD relates to disease progression, stigma, and quality of life.
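
    A hedged sketch of the kind of coactivation analysis described here: given a frames-by-action-units table of FACS intensity codes (simulated below; 0 = absent, 1-5 ≈ A-E), pairwise correlations give the coactivation matrix and simple aggregates give frequency and intensity summaries. The AU subset and coding are illustrative only.

        import numpy as np
        import pandas as pd

        aus = ["AU1", "AU2", "AU4", "AU6", "AU12", "AU15"]   # illustrative subset
        coding = pd.DataFrame(np.random.randint(0, 6, (600, len(aus))), columns=aus)

        coactivation = coding.corr()                     # AU-by-AU coactivation matrix
        frames_active = (coding > 0).any(axis=1).sum()   # frames with any AU active
        mean_intensity = coding[coding > 0].mean()       # mean intensity of active AUs
        print(coactivation.round(2))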

  1. Fixation to features and neural processing of facial expressions in a gender discrimination task

    PubMed Central

    Neath, Karly N.; Itier, Roxane J.

    2017-01-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. Whether this sensitivity varies with facial expressions of emotion, and whether it can also be seen on other ERP components such as the P1 and EPN, was investigated. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1 that likely reflected general sensitivity to face position. An early effect of emotion (~120 ms) for happy faces was seen at occipital sites and was sustained until ~350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ~150 ms and lasting until ~300 ms at lateral posterior sites. Results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye-sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times. PMID:26277653

  2. The mysterious noh mask: contribution of multiple facial parts to the recognition of emotional expressions.

    PubMed

    Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo; Kawai, Nobuyuki

    2012-01-01

    A Noh mask, worn by expert actors when performing in a traditional Japanese Noh drama, is suggested to convey countless different facial expressions according to different angles of head/body orientation. The present study addressed the question of how different facial parts of a Noh mask, including the eyebrows, the eyes, and the mouth, may contribute to different emotional expressions. Both experimental situations of active creation and passive recognition of emotional facial expressions were introduced. In Experiment 1, participants either created happy or sad facial expressions, or imitated a face that looked up or down, by actively changing each facial part of a Noh mask image presented on a computer screen. For an upward tilted mask, the eyebrows and the mouth shared common features with sad expressions, whereas the eyes shared features with happy expressions. This contingency tended to be reversed for a downward tilted mask. Experiment 2 further examined which facial parts of a Noh mask are crucial in determining emotional expressions. Participants were exposed to synthesized Noh mask images with different facial parts expressing different emotions. Results clearly revealed that participants primarily used the shape of the mouth in judging emotions. Facial images having the mouth of an upward/downward tilted Noh mask strongly tended to be evaluated as sad/happy, respectively. The results suggest that Noh masks express chimeric emotional patterns, with different facial parts conveying different emotions. This appears consistent with the principles of Noh, which highly appreciate subtle and composite emotional expressions, as well as with the mysterious facial expressions observed in Western art. It was further demonstrated that the mouth serves as a diagnostic feature in characterizing the emotional expressions. This indicates the superiority of biologically-driven factors over the traditionally formulated performing styles when evaluating the emotions of the Noh masks.

  3. The Mysterious Noh Mask: Contribution of Multiple Facial Parts to the Recognition of Emotional Expressions

    PubMed Central

    Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo; Kawai, Nobuyuki

    2012-01-01

    Background A Noh mask, worn by expert actors when performing in a traditional Japanese Noh drama, is suggested to convey countless different facial expressions according to different angles of head/body orientation. The present study addressed the question of how different facial parts of a Noh mask, including the eyebrows, the eyes, and the mouth, may contribute to different emotional expressions. Both experimental situations of active creation and passive recognition of emotional facial expressions were introduced. Methodology/Principal Findings In Experiment 1, participants either created happy or sad facial expressions, or imitated a face that looked up or down, by actively changing each facial part of a Noh mask image presented on a computer screen. For an upward tilted mask, the eyebrows and the mouth shared common features with sad expressions, whereas the eyes shared features with happy expressions. This contingency tended to be reversed for a downward tilted mask. Experiment 2 further examined which facial parts of a Noh mask are crucial in determining emotional expressions. Participants were exposed to synthesized Noh mask images with different facial parts expressing different emotions. Results clearly revealed that participants primarily used the shape of the mouth in judging emotions. Facial images having the mouth of an upward/downward tilted Noh mask strongly tended to be evaluated as sad/happy, respectively. Conclusions/Significance The results suggest that Noh masks express chimeric emotional patterns, with different facial parts conveying different emotions. This appears consistent with the principles of Noh, which highly appreciate subtle and composite emotional expressions, as well as with the mysterious facial expressions observed in Western art. It was further demonstrated that the mouth serves as a diagnostic feature in characterizing the emotional expressions. This indicates the superiority of biologically-driven factors over the traditionally formulated performing styles when evaluating the emotions of the Noh masks. PMID:23185595

  4. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    PubMed

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated them only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

  5. Reading Faces: Differential Lateral Gaze Bias in Processing Canine and Human Facial Expressions in Dogs and 4-Year-Old Children

    PubMed Central

    Racca, Anaïs; Guo, Kun; Meints, Kerstin; Mills, Daniel S.

    2012-01-01

    Sensitivity to the emotions of others provides clear biological advantages. However, in the case of heterospecific relationships, such as that existing between dogs and humans, there are additional challenges since some elements of the expression of emotions are species-specific. Given that faces provide important visual cues for communicating emotional state in both humans and dogs, and that processing of emotions is subject to brain lateralisation, we investigated lateral gaze bias in adult dogs when presented with pictures of expressive human and dog faces. Our analysis revealed clear differences in laterality of eye movements in dogs towards conspecific faces according to the emotional valence of the expressions. Differences were also found towards human faces, but to a lesser extent. For comparative purpose, a similar experiment was also run with 4-year-old children and it was observed that they showed differential processing of facial expressions compared to dogs, suggesting a species-dependent engagement of the right or left hemisphere in processing emotions. PMID:22558335

  6. Decoding Facial Expressions: A New Test with Decoding Norms.

    ERIC Educational Resources Information Center

    Leathers, Dale G.; Emigh, Ted H.

    1980-01-01

    Describes the development and testing of a new facial meaning sensitivity test designed to determine how specialized are the meanings that can be decoded from facial expressions. Demonstrates the use of the test to measure a receiver's current level of skill in decoding facial expressions. (JMF)

  7. Multi-layer sparse representation for weighted LBP-patches based facial expression recognition.

    PubMed

    Jia, Qi; Gao, Xinkai; Guo, He; Luo, Zhongxuan; Wang, Yi

    2015-03-19

    In this paper, a novel facial expression recognition method based on sparse representation is proposed. Most contemporary facial expression recognition systems suffer from a limited ability to handle image nuisances such as low resolution and noise. Especially for low-intensity expressions, most existing training methods have quite low recognition rates. Motivated by sparse representation, the problem can be solved by finding sparse coefficients that represent the test image in terms of the whole training set. Deriving an effective facial representation from original face images is a vital step for successful facial expression recognition. We evaluate a facial representation based on weighted local binary patterns, and the Fisher separation criterion is used to calculate the weights of the patches. A multi-layer sparse representation framework is proposed for multi-intensity facial expression recognition, especially for low-intensity and noisy expressions encountered in practice, a critical but seldom addressed problem in existing work. To this end, several experiments based on low-resolution and multi-intensity expressions are carried out. Promising results on publicly available databases demonstrate the potential of the proposed approach.
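
    The sketch below illustrates the basic sparse-representation classification step that such a framework builds on: a test vector is coded over a dictionary of training vectors and assigned to the class with the smallest reconstruction residual. The L1 solver, feature vectors, and parameters are stand-ins; the multi-layer structure and Fisher-weighted LBP patches are not reproduced.

        # Sparse-representation classification over training feature vectors.
        import numpy as np
        from sklearn.linear_model import Lasso

        def src_predict(D, labels, x, alpha=0.01):
            """Classify x by the class whose atoms best reconstruct it.

            D      : (d, n) dictionary, one normalized column per training image.
            labels : (n,) class label of each column.
            x      : (d,) test feature vector.
            """
            coef = Lasso(alpha=alpha, max_iter=5000).fit(D, x).coef_
            residuals = {c: np.linalg.norm(x - D[:, labels == c] @ coef[labels == c])
                         for c in np.unique(labels)}
            return min(residuals, key=residuals.get)

        rng = np.random.default_rng(0)
        D = rng.standard_normal((128, 60))
        D /= np.linalg.norm(D, axis=0)           # normalize dictionary columns
        labels = np.repeat(np.arange(6), 10)     # six expression classes
        print(src_predict(D, labels, D[:, 7]))   # recovers class 0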

  8. Non-rigid, but not rigid, motion interferes with the processing of structural face information in developmental prosopagnosia.

    PubMed

    Maguinness, Corrina; Newell, Fiona N

    2015-04-01

    There is growing evidence to suggest that facial motion is an important cue for face recognition. However, it is poorly understood whether motion is integrated with facial form information or whether it provides an independent cue to identity. To provide further insight into this issue, we compared the effect of motion on face perception in two developmental prosopagnosics and age-matched controls. Participants first learned faces presented dynamically (video), or in a sequence of static images, in which rigid (viewpoint) or non-rigid (expression) changes occurred. Immediately following learning, participants were required to match a static face image to the learned face. Test face images varied by viewpoint (Experiment 1) or expression (Experiment 2) and were learned or novel face images. We found similar performance across prosopagnosics and controls in matching facial identity across changes in viewpoint when the learned face was shown moving in a rigid manner. However, non-rigid motion interfered with face matching across changes in expression in both individuals with prosopagnosia relative to control participants. In contrast, non-rigid motion did not differentially affect the matching of facial expressions across changes in identity for either prosopagnosic (Experiment 3). Our results suggest that whilst the processing of rigid motion information of a face may be preserved in developmental prosopagnosia, non-rigid motion can specifically interfere with the representation of structural face information. Taken together, these results suggest that both form and motion cues are important in face perception and that these cues are likely integrated in the representation of facial identity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Can an anger face also be scared? Malleability of facial expressions.

    PubMed

    Widen, Sherri C; Naab, Pamela

    2012-10-01

    Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

  10. Deception Detection in Multicultural Coalitions: Foundations for a Cognitive Model

    DTIC Science & Technology

    2011-06-01

    and spontaneous vs. deliberate and contrived facial expression of emotions, symmetry, leakage through microexpressions, hand postures, dynamic...sequences of visually detectable cues, such as facial muscle-group coordination and correlations expressed as changes in facial expressions and face...concert, whereas facial expressions of deceivers emphasize a few cues that arise more randomly and chaotically [15]. A smile without the use of

  11. Dissociation of sad facial expressions and autonomic nervous system responding in boys with disruptive behavior disorders

    PubMed Central

    Marsh, Penny; Beauchaine, Theodore P.; Williams, Bailey

    2009-01-01

    Although deficiencies in emotional responding have been linked to externalizing behaviors in children, little is known about how discrete response systems (e.g., expressive, physiological) are coordinated during emotional challenge among these youth. We examined time-linked correspondence of sad facial expressions and autonomic reactivity during an empathy-eliciting task among boys with disruptive behavior disorders (n = 31) and controls (n = 23). For controls, sad facial expressions were associated with reduced sympathetic (lower skin conductance level, lengthened cardiac preejection period [PEP]) and increased parasympathetic (higher respiratory sinus arrhythmia [RSA]) activity. In contrast, no correspondence between facial expressions and autonomic reactivity was observed among boys with conduct problems. Furthermore, low correspondence between facial expressions and PEP predicted externalizing symptom severity, whereas low correspondence between facial expressions and RSA predicted internalizing symptom severity. PMID:17868261

  12. NK1 receptor antagonism and emotional processing in healthy volunteers.

    PubMed

    Chandra, P; Hafizi, S; Massey-Chase, R M; Goodwin, G M; Cowen, P J; Harmer, C J

    2010-04-01

    The neurokinin-1 (NK(1)) receptor antagonist, aprepitant, showed activity in several animal models of depression; however, its efficacy in clinical trials was disappointing. There is little knowledge of the role of NK(1) receptors in human emotional behaviour to help explain this discrepancy. The aim of the current study was to assess the effects of a single oral dose of aprepitant (125 mg) on models of emotional processing sensitive to conventional antidepressant drug administration in 38 healthy volunteers, randomly allocated to receive aprepitant or placebo in a between-groups, double-blind design. Performance on measures of facial expression recognition, emotional categorisation, memory and an attentional visual-probe task was assessed following drug absorption. Relative to placebo, aprepitant improved recognition of happy facial expressions and increased vigilance to emotional information in the unmasked condition of the visual-probe task. In contrast, aprepitant impaired emotional memory and slowed responses in the facial expression recognition task, suggesting possible deleterious effects on cognition. These results suggest that while antagonism of NK(1) receptors does affect emotional processing in humans, its effects are more restricted and less consistent across tasks than those of conventional antidepressants. Human models of emotional processing may provide a useful means of assessing the likely therapeutic potential of new treatments for depression.

  13. Mutual information-based facial expression recognition

    NASA Astrophysics Data System (ADS)

    Hazar, Mliki; Hammami, Mohamed; Hanêne, Ben-Abdallah

    2013-12-01

    This paper introduces a novel low-computation discriminative region representation for the expression analysis task. The proposed approach relies on studies in psychology showing that most of the regions descriptive of, and responsible for, facial expression are located around a few face parts. The contribution of this work lies in a new approach that supports automatic facial expression recognition based on automatic region selection. The region selection step, which aims to select the descriptive regions responsible for facial expression, was performed using a Mutual Information (MI) technique. For facial feature extraction, we applied Local Binary Patterns (LBP) to the gradient image to encode salient micro-patterns of facial expressions. Experimental studies showed that using discriminative regions provides better results than using the whole face region while reducing the feature vector dimension.
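
    A rough sketch of the two stages as described: LBP histograms computed over a gradient-magnitude image for each face region, then mutual information used to score regions against expression labels. The region grid, LBP parameters, and data are assumptions for illustration.

        import numpy as np
        from skimage.feature import local_binary_pattern
        from sklearn.feature_selection import mutual_info_classif

        def region_histograms(gray, grid=(4, 4), P=8, R=1):
            gy, gx = np.gradient(gray.astype(float))
            lbp = local_binary_pattern(np.hypot(gx, gy), P, R, method="uniform")
            h, w = lbp.shape
            feats = []
            for i in range(grid[0]):
                for j in range(grid[1]):
                    patch = lbp[i*h//grid[0]:(i+1)*h//grid[0],
                                j*w//grid[1]:(j+1)*w//grid[1]]
                    hist, _ = np.histogram(patch, bins=P + 2, range=(0, P + 2))
                    feats.append(hist / hist.sum())
            return np.concatenate(feats)            # one feature vector per face

        X = np.vstack([region_histograms(np.random.rand(64, 64)) for _ in range(50)])
        y = np.random.randint(0, 6, 50)             # six expression labels (simulated)
        mi = mutual_info_classif(X, y)              # per-feature MI scores
        region_scores = mi.reshape(16, -1).sum(axis=1)   # aggregate per region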

  14. Younger and Older Users’ Recognition of Virtual Agent Facial Expressions

    PubMed Central

    Beer, Jenay M.; Smarr, Cory-Ann; Fisk, Arthur D.; Rogers, Wendy A.

    2015-01-01

    As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent’s social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been supported (Ruffman et al., 2008), with older adults showing a deficit for recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al. 2009; 2011; Breazeal, 2003); however, little research has compared in depth younger and older adults’ ability to label a virtual agent’s facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand whether those differences are influenced by the intensity of the emotion, the dynamic formation of the emotion (i.e., a neutral expression developing into an expression of emotion through motion), or the type of virtual character, differing in human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition of emotions expressed by three types of virtual characters differing in human-likeness (non-humanoid iCat, synthetic human, and human). Study 2 also investigated the role of configural and featural processing as a possible explanation for age-related differences in emotion recognition. First, our findings show age-related differences in the recognition of emotions expressed by a virtual agent, with older adults showing lower recognition for the emotions of anger, disgust, fear, happiness, sadness, and neutral. These age-related differences might be explained by older adults having difficulty discriminating similarity in the configural arrangement of facial features for certain emotions; for example, older adults often mislabeled the similar emotions of fear as surprise. Second, our results did not provide evidence for dynamic formation improving emotion recognition; in general, however, the intensity of the emotion improved recognition. Lastly, we learned that emotion recognition, for older and younger adults, differed by character type, from best to worst: human, synthetic human, and then iCat. Our findings provide guidance for design, as well as the development of a framework of age-related differences in emotion recognition. PMID:25705105

  15. Design of a Virtual Reality System for Affect Analysis in Facial Expressions (VR-SAAFE); Application to Schizophrenia.

    PubMed

    Bekele, E; Bian, D; Peterman, J; Park, S; Sarkar, N

    2017-06-01

    Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding the facial emotion processing impairments that may lead to poor social outcome in schizophrenia, which we henceforth call a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control the facial expressions of an avatar, with or without accompanying verbal interaction, with an eye-tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his/her affective states, allowing in-depth understanding of the emotion recognition mechanisms of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine the processing of emotional faces. Preliminary results indicated that there were significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. These preliminary results underscore the utility of such a VR-based system that enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.

  16. Emotion perception across cultures: the role of cognitive mechanisms

    PubMed Central

    Engelmann, Jan B.; Pogosyan, Marianna

    2012-01-01

    Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception. PMID:23486743

  17. Emotion perception across cultures: the role of cognitive mechanisms.

    PubMed

    Engelmann, Jan B; Pogosyan, Marianna

    2013-01-01

    Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception.

  18. Incongruence between Verbal and Non-Verbal Information Enhances the Late Positive Potential.

    PubMed

    Morioka, Shu; Osumi, Michihiro; Shiotani, Mayu; Nobusako, Satoshi; Maeoka, Hiroshi; Okada, Yohei; Hiyamizu, Makoto; Matsuo, Atsushi

    2016-01-01

    Smooth social communication consists of both verbal and non-verbal information. However, when verbal information is incongruent with non-verbal information, it is unclear how observers judge the trustworthiness of the person presenting the incongruence, or which brain activities accompany such judgments. In the present study, we attempted to identify the impact of incongruence between verbal information and facial expression on the perceived value of trustworthiness and on brain activity, using event-related potentials (ERPs). Combinations of verbal information [positive/negative] and facial expressions [smile/angry] were presented randomly on a computer screen to 17 healthy volunteers. The trustworthiness of the presented facial expression was evaluated by the amount of donation offered by the observer to the person depicted on the computer screen. In addition, the time required to judge trustworthiness was recorded for each trial. Using electroencephalography, ERPs were obtained by averaging the waveforms recorded while the participants judged trustworthiness. The amount of donation offered was significantly lower when the verbal information and facial expression were incongruent, particularly for [negative × smile]. The amplitude of the early posterior negativity (EPN) at the temporal lobe showed no significant difference between conditions. However, the amplitude of the late positive potential (LPP) at the parietal electrodes was higher for the incongruent condition [negative × smile] than for the congruent condition [positive × smile]. These results suggest that the LPP amplitude observed over the parietal cortex is involved in the processing of incongruence between verbal information and facial expression.
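
    For readers new to the method, the sketch below shows the basic ERP computation described here: baseline-correcting stimulus-locked EEG epochs, averaging them per condition, and comparing mean amplitude in a late time window. All shapes, the sampling rate, condition codes, and the window are illustrative assumptions, not the study's actual parameters.

        import numpy as np

        fs = 500                                  # sampling rate in Hz (assumed)
        epochs = np.random.randn(80, 64, 600)     # trials x channels x samples, -200..1000 ms
        labels = np.random.randint(0, 4, 80)      # four verbal x expression conditions

        onset = int(0.2 * fs)                     # stimulus onset 200 ms into the epoch
        epochs -= epochs[:, :, :onset].mean(axis=2, keepdims=True)  # baseline-correct

        erp_incong = epochs[labels == 2].mean(axis=0)   # e.g. [negative x smile]
        erp_cong = epochs[labels == 0].mean(axis=0)     # e.g. [positive x smile]

        late = slice(onset + int(0.4 * fs), onset + int(0.8 * fs))  # 400-800 ms window
        print(erp_incong[:, late].mean() - erp_cong[:, late].mean())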

  19. Intranasal oxytocin increases facial expressivity, but not ratings of trustworthiness, in patients with schizophrenia and healthy controls.

    PubMed

    Woolley, J D; Chuang, B; Fussell, C; Scherer, S; Biagianti, B; Fulford, D; Mathalon, D H; Vinogradov, S

    2017-05-01

    Blunted facial affect is a common negative symptom of schizophrenia. Additionally, assessing the trustworthiness of faces is a social cognitive ability that is impaired in schizophrenia. Currently available pharmacological agents are ineffective at improving either of these symptoms, despite their clinical significance. The hypothalamic neuropeptide oxytocin has multiple prosocial effects when administered intranasally to healthy individuals and shows promise in decreasing negative symptoms and enhancing social cognition in schizophrenia. Although two small studies have investigated oxytocin's effects on ratings of facial trustworthiness in schizophrenia, its effects on facial expressivity have not been investigated in any population. We investigated the effects of oxytocin on facial emotional expressivity while participants performed a facial trustworthiness rating task in 33 individuals with schizophrenia and 35 age-matched healthy controls using a double-blind, placebo-controlled, cross-over design. Participants rated the trustworthiness of presented faces interspersed with emotionally evocative photographs while being video-recorded. Participants' facial expressivity in these videos was quantified by blind raters using a well-validated manualized approach (i.e. the Facial Expression Coding System; FACES). While oxytocin administration did not affect ratings of facial trustworthiness, it significantly increased facial expressivity in individuals with schizophrenia (Z = -2.33, p = 0.02) and at trend level in healthy controls (Z = -1.87, p = 0.06). These results demonstrate that oxytocin administration can increase facial expressivity in response to emotional stimuli and suggest that oxytocin may have the potential to serve as a treatment for blunted facial affect in schizophrenia.

  20. Dynamic Emotional Faces Generalise Better to a New Expression but not to a New View.

    PubMed

    Liu, Chang Hong; Chen, Wenfeng; Ward, James; Takahashi, Nozomi

    2016-08-08

    Prior research based on static images has found limited improvement for recognising previously learnt faces in a new expression after several different facial expressions of these faces had been shown during the learning session. We investigated whether non-rigid motion of facial expression facilitates the learning process. In Experiment 1, participants remembered faces that were either presented in short video clips or still images. To assess the effect of exposure to expression variation, each face was either learnt through a single expression or three different expressions. Experiment 2 examined whether learning faces in video clips could generalise more effectively to a new view. The results show that faces learnt from video clips generalised effectively to a new expression with exposure to a single expression, whereas faces learnt from stills showed poorer generalisation with exposure to either single or three expressions. However, although superior recognition performance was demonstrated for faces learnt through video clips, dynamic facial expression did not create better transfer of learning to faces tested in a new view. The data thus fail to support the hypothesis that non-rigid motion enhances viewpoint invariance. These findings reveal both benefits and limitations of exposures to moving expressions for expression-invariant face recognition.

  1. Dynamic Emotional Faces Generalise Better to a New Expression but not to a New View

    PubMed Central

    Liu, Chang Hong; Chen, Wenfeng; Ward, James; Takahashi, Nozomi

    2016-01-01

    Prior research based on static images has found limited improvement for recognising previously learnt faces in a new expression after several different facial expressions of these faces had been shown during the learning session. We investigated whether non-rigid motion of facial expression facilitates the learning process. In Experiment 1, participants remembered faces that were either presented in short video clips or still images. To assess the effect of exposure to expression variation, each face was either learnt through a single expression or three different expressions. Experiment 2 examined whether learning faces in video clips could generalise more effectively to a new view. The results show that faces learnt from video clips generalised effectively to a new expression with exposure to a single expression, whereas faces learnt from stills showed poorer generalisation with exposure to either single or three expressions. However, although superior recognition performance was demonstrated for faces learnt through video clips, dynamic facial expression did not create better transfer of learning to faces tested in a new view. The data thus fail to support the hypothesis that non-rigid motion enhances viewpoint invariance. These findings reveal both benefits and limitations of exposures to moving expressions for expression-invariant face recognition. PMID:27499252

  2. Using facial expressions as CSs and fearsome and disgusting pictures as UCSs: affective responding and evaluative learning of fear and disgust in blood-injection-injury phobia.

    PubMed

    Olatunji, Bunmi O; Lohr, Jeffrey M; Sawchuk, Craig N; Westendorf, David H

    2005-01-01

    Two experiments examined the use of an evaluative conditioning (EC) paradigm in the acquisition of fear and disgust responding to neutral facial expressions. In Experiment 1, 60 participants were randomly assigned to one of three evaluative learning conditions in which neutral facial expressions were paired with fearsome, disgusting, or neutral pictures. No statistically significant differences were detected between the three conditions. However, significant within-subject differences emerged: post-exposure fear and disgust ratings were higher for expressions that had been paired with pictorial stimuli. Experiment 2 used a similar experimental design to examine whether an analogue sample of BII phobics would be more susceptible than nonphobic controls to fear and disgust EC, given the co-occurrence of fear and disgust in BII-phobic responding. Results failed to demonstrate an EC effect specific to the analogue phobic group, although both groups showed an evaluative shift toward disgust for those facial expressions paired with BII-relevant pictures. Consistent with previous findings, examination of the picture rating data suggested that analogue BII phobics rated the BII pictures as significantly more disgusting than fearful. The role of EC processes and a priori expectancy biases in the associative learning of disgust in BII phobia is discussed.

  3. The Development of Emotional Face Processing during Childhood

    ERIC Educational Resources Information Center

    Batty, Magali; Taylor, Margot J.

    2006-01-01

    Our facial expressions give others the opportunity to access our feelings, and constitute an important nonverbal tool for communication. Many recent studies have investigated emotional perception in adults, and our knowledge of neural processes involved in emotions is increasingly precise. Young children also use faces to express their internal…

  4. Event-Related Brain Potential Correlates of Emotional Face Processing

    ERIC Educational Resources Information Center

    Eimer, Martin; Holmes, Amanda

    2007-01-01

    Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was…

  5. Spontaneous Facial Actions Map onto Emotional Experiences in a Non-social Context: Toward a Component-Based Approach

    PubMed Central

    Namba, Shushi; Kabir, Russell S.; Miyatani, Makoto; Nakao, Takashi

    2017-01-01

    While numerous studies have examined the relationships between facial actions and emotions, they have yet to account for the ways that specific spontaneous facial expressions map onto emotional experiences induced without expressive intent. Moreover, previous studies emphasized that a fine-grained investigation of facial components could establish the coherence of facial actions with actual internal states. Therefore, this study aimed to accumulate evidence for the correspondence between spontaneous facial components and emotional experiences. We reinvestigated data from previous research which secretly recorded spontaneous facial expressions of Japanese participants as they watched film clips designed to evoke four different target emotions: surprise, amusement, disgust, and sadness. The participants rated their emotional experiences via a self-reported questionnaire of 16 emotions. These spontaneous facial expressions were coded using the Facial Action Coding System, the gold standard for classifying visible facial movements. We corroborated each facial action that was present in the emotional experiences by applying stepwise regression models. The results found that spontaneous facial components occurred in ways that cohere to their evolutionary functions based on the rating values of emotional experiences (e.g., the inner brow raiser might be involved in the evaluation of novelty). This study provided new empirical evidence for the correspondence between each spontaneous facial component and first-person internal states of emotion as reported by the expresser. PMID:28522979

  6. Electrocortical reactivity to negative and positive facial expressions in individuals with a family history of major depression.

    PubMed

    Watters, Anna J; Harris, Anthony W F; Williams, Leanne M

    2018-05-21

    Facial expressions signaling threat and mood-congruent loss have been used to probe abnormal neural reactivity in major depressive disorder (MDD) and may be implicated in genetic vulnerability to MDD. This study investigated electrocortical reactivity to facial expressions in 101 unaffected adult first-degree relatives of probands with MDD and non-relative controls (n = 101). We investigated event-related potentials (ERPs) to five facial expressions of basic emotion: fear, anger, disgust, sadness and happiness, under both subliminal (masked) and conscious (unmasked) presentation conditions, and the source localization of group differences. In the conscious condition, controls showed a distinctly positive-going shift in response to negative versus happy faces, reflected in a greater positivity for the VPP frontally and the P300 parietally, and less negativity for the N200. By contrast, relatives showed less differentiation of emotions, reflected in less VPP and P300 positivity, particularly for anger and disgust, and an enhanced N200 for sadness. These group differences were consistently source-localized to the anterior cingulate cortex. The findings contribute new evidence for neural disruptions underlying the differentiation of salient emotions in familial risk for depression. These disruptions occur in the appraisal (~200 ms post-stimulus) through to the context evaluation (~300 ms+ post-stimulus) phases of emotion processing, consistent with theories that risk for depression involves biased or attenuated processing of emotion. Copyright © 2018. Published by Elsevier B.V.

  7. Realistic facial expression of virtual human based on color, sweat, and tears effects.

    PubMed

    Alkawaz, Mohammed Hazim; Basori, Ahmad Hoirul; Mohamad, Dzulkifli; Mohamed, Farhan

    2014-01-01

    Generating extreme appearances, such as sweating when scared, tears (crying) when happy, and blushing with anger or happiness, is a key issue in achieving high-quality facial animation. The effects of sweat, tears, and color are integrated into a single animation model to create realistic facial expressions of a 3D avatar. The physical properties of muscles and emotions, as well as fluid properties with sweating and tear initiators, are incorporated. The action units (AUs) of the Facial Action Coding System are merged with autonomous AUs to create expressions including sadness, anger with blushing, happiness with blushing, and fear. Fluid effects such as sweat and tears are simulated using the particle system and smoothed-particle hydrodynamics (SPH) methods, which are combined with the facial animation technique to produce complex facial expressions. The effects of oxygenation on facial skin color appearance are measured using a pulse oximeter system and a 3D skin analyzer. The results show that virtual human facial expressions are enhanced by mimicking actual sweating and tear simulations for all extreme expressions. The proposed method contributes to the facial animation and game industries as well as to computer graphics.
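
    For the fluid part, the standard first step of an SPH solver such as the sweat and tear simulation described above is a kernel-weighted density estimate over neighbouring particles. A generic sketch using the widely used poly6 kernel; this is illustrative only, not the paper's implementation, and the particle data are hypothetical:

        import numpy as np

        def poly6(r2, h):
            """Poly6 smoothing kernel evaluated on squared distances r2 within support radius h."""
            w = np.zeros_like(r2)
            inside = r2 < h * h
            w[inside] = (315.0 / (64.0 * np.pi * h**9)) * (h * h - r2[inside]) ** 3
            return w

        def densities(pos, mass, h):
            """Density at each particle as a kernel-weighted sum over all neighbours."""
            diff = pos[:, None, :] - pos[None, :, :]   # pairwise offsets
            r2 = (diff ** 2).sum(-1)
            return (mass * poly6(r2, h)).sum(axis=1)

        pos = np.random.rand(200, 3) * 0.05            # droplet positions (m), hypothetical
        print(densities(pos, mass=1e-5, h=0.01)[:5])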

  8. Realistic Facial Expression of Virtual Human Based on Color, Sweat, and Tears Effects

    PubMed Central

    Alkawaz, Mohammed Hazim; Basori, Ahmad Hoirul; Mohamad, Dzulkifli; Mohamed, Farhan

    2014-01-01

    Generating extreme appearances, such as sweating when scared, tears (crying) when happy, and blushing with anger or happiness, is a key issue in achieving high-quality facial animation. The effects of sweat, tears, and color are integrated into a single animation model to create realistic facial expressions of a 3D avatar. The physical properties of muscles and emotions, as well as fluid properties with sweating and tear initiators, are incorporated. The action units (AUs) of the Facial Action Coding System are merged with autonomous AUs to create expressions including sadness, anger with blushing, happiness with blushing, and fear. Fluid effects such as sweat and tears are simulated using the particle system and smoothed-particle hydrodynamics (SPH) methods, which are combined with the facial animation technique to produce complex facial expressions. The effects of oxygenation on facial skin color appearance are measured using a pulse oximeter system and a 3D skin analyzer. The results show that virtual human facial expressions are enhanced by mimicking actual sweating and tear simulations for all extreme expressions. The proposed method contributes to the facial animation and game industries as well as to computer graphics. PMID:25136663

  9. Guide to Understanding Facial Palsy

    MedlinePlus

    ... to many different facial muscles. These muscles control facial expression. The coordinated activity of this nerve and these ... involves a weakness of the muscles responsible for facial expression and side-to-side eye movement. Moebius syndrome ...

  10. A Virtual Environment to Improve the Detection of Oral-Facial Malfunction in Children with Cerebral Palsy

    PubMed Central

    Martín-Ruiz, María-Luisa; Máximo-Bocanegra, Nuria; Luna-Oliva, Laura

    2016-01-01

    The importance of an early rehabilitation process in children with cerebral palsy (CP) is widely recognized. On the one hand, new and useful treatment tools such as rehabilitation systems based on interactive technologies have appeared for rehabilitation of gross motor movements. On the other hand, from the therapeutic point of view, performing rehabilitation exercises with the facial muscles can improve the swallowing process, the facial expression through the management of muscles in the face, and even the speech of children with cerebral palsy. However, it is difficult to find interactive games to improve the detection and evaluation of oral-facial musculature dysfunctions in children with CP. This paper describes a framework based on strategies developed for interactive serious games that is created both for typically developed children and children with disabilities. Four interactive games are the core of a Virtual Environment called SONRIE. This paper demonstrates the benefits of SONRIE to monitor children’s oral-facial difficulties. The next steps will focus on the validation of SONRIE to carry out the rehabilitation process of oral-facial musculature in children with cerebral palsy. PMID:27023561

  11. Preferential responses in amygdala and insula during presentation of facial contempt and disgust.

    PubMed

    Sambataro, Fabio; Dimalta, Savino; Di Giorgio, Annabella; Taurisano, Paolo; Blasi, Giuseppe; Scarabino, Tommaso; Giannatempo, Giuseppe; Nardini, Marcello; Bertolino, Alessandro

    2006-10-01

    Some authors consider contempt to be a basic emotion while others consider it a variant of disgust. The neural correlates of contempt have so far not been specifically contrasted with those of disgust. Using functional magnetic resonance imaging (fMRI), we investigated the neural networks involved in the processing of facial contempt and disgust in 24 healthy subjects. Recognition of contemptuous faces was lower than that of disgusted and neutral faces. The imaging data indicated significant activity in the amygdala, globus pallidus and putamen during processing of contemptuous faces. The bilateral insula and caudate nuclei and the left and right inferior frontal gyri were engaged during processing of disgusted faces. Moreover, direct comparisons of contempt vs. disgust yielded significantly different activations in the amygdala. On the other hand, disgusted faces elicited greater activation than contemptuous faces in the right insula and caudate. Our findings suggest preferential involvement of different neural substrates in the processing of facial emotional expressions of contempt and disgust.

  12. Caricaturing facial expressions.

    PubMed

    Calder, A J; Rowland, D; Young, A W; Nimmo-Smith, I; Keane, J; Perrett, D I

    2000-08-14

    The physical differences between facial expressions (e.g. fear) and a reference norm (e.g. a neutral expression) were altered to produce photographic-quality caricatures. In Experiment 1, participants rated caricatures of fear, happiness and sadness for the intensity of each of these three emotions; a second group of participants rated how 'face-like' the caricatures appeared. With increasing levels of exaggeration the caricatures were rated as more emotionally intense, but less 'face-like'. Experiment 2 demonstrated a similar relationship between emotional intensity and level of caricature for six different facial expressions. Experiments 3 and 4 compared intensity ratings of facial expression caricatures prepared relative to a selection of reference norms - a neutral expression, an average expression, or a different facial expression (e.g. anger caricatured relative to fear). Each norm produced a linear relationship between caricature level and rated intensity of emotion; this finding is inconsistent with two-dimensional models of the perceptual representation of facial expression. An exemplar-based multidimensional model is proposed as an alternative account.
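
    Geometrically, caricaturing amounts to exaggerating each feature's deviation from the chosen norm by a fixed proportion. A sketch on landmark coordinates; the published method additionally warps photographic texture, which is omitted here, and the data below are hypothetical:

        import numpy as np

        def caricature(landmarks, norm, k):
            """Shift each landmark away from the norm by a fraction k of its deviation."""
            return norm + (1.0 + k) * (landmarks - norm)

        face = np.random.rand(68, 2)     # 68 (x, y) landmarks of an expressive face
        neutral = np.random.rand(68, 2)  # landmarks of the reference norm
        exaggerated = caricature(face, neutral, k=0.5)   # +50% caricature
        anticature = caricature(face, neutral, k=-0.5)   # -50%, shifted toward the norm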

  13. Face expressive lifting (FEL): an original surgical concept combined with bipolar radiofrequency.

    PubMed

    Divaris, Marc; Blugerman, Guillermo; Paul, Malcolm D

    2014-01-01

    Aging can lead to changes in facial expressions, transforming the positive youthful expression of happiness into negative expressions such as sadness, tiredness, and disgust. Local skin distension is another consequence of aging, which can be difficult to treat with rejuvenation procedures. The "face expressive lifting" (FEL) is an original concept in facial rejuvenation surgery. On the one hand, FEL integrates established convergent surgical techniques aiming to correct the age-related negative facial expressions. On the other hand, FEL incorporates novel bipolar RF technology aiming to correct local skin distension. One hundred twenty-six patients underwent the FEL procedure. Facial expression and local skin distension were assessed over 2 years of follow-up. There was a correction of negative facial expression in 96 patients (76 %) and a tightening of local skin distension in 100 % of cases. FEL is an effective procedure that can correct both age-related negative changes in facial expression and, through radiofrequency, local skin distension. Level of Evidence: Level IV, therapeutic study.

  14. Face processing in chronic alcoholism: a specific deficit for emotional features.

    PubMed

    Maurage, P; Campanella, S; Philippot, P; Martin, S; de Timary, P

    2008-04-01

    It is well established that chronic alcoholism is associated with a deficit in the decoding of emotional facial expression (EFE). Nevertheless, it is still unclear whether this deficit is specific to emotions or due to a more general impairment in visual or facial processing. This study was designed to clarify this issue using multiple control tasks and the subtraction method. Eighteen patients suffering from chronic alcoholism and 18 matched healthy control subjects were asked to perform several tasks evaluating (1) basic visuo-spatial and facial identity processing; (2) simple reaction times; and (3) identification of complex facial features (namely age, emotion, gender, and race). Accuracy and reaction times were recorded. Alcoholic patients showed preserved performance for visuo-spatial and facial identity processing, but their performance was impaired for visuo-motor abilities and for the detection of complex facial aspects. More importantly, the subtraction method showed that alcoholism is associated with a specific EFE decoding deficit, still present when visuo-motor slowing is controlled for. These results offer a post hoc confirmation of earlier data showing an EFE decoding deficit in alcoholism by strongly suggesting that this deficit is specific to emotions. This may have implications for clinical situations, where emotional impairments are frequently observed among alcoholic subjects.

  15. FaceWarehouse: a 3D facial expression database for visual computing.

    PubMed

    Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun

    2014-03-01

    We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of her different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, kiss, etc. For every RGBD raw data record, a set of facial feature points on the color image such as eye corners, mouth contour, and the nose tip are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, for every person in our database, there is a much richer matching collection of expressions, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
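
    Reconstruction from such a bilinear model contracts the core tensor with one identity weight vector and one expression weight vector. A sketch of that contraction with hypothetical dimensions, deliberately much smaller than the real database:

        import numpy as np

        n_verts, n_id, n_exp = 2000, 20, 10
        core = np.random.randn(3 * n_verts, n_id, n_exp)  # core tensor: (vertex coords, identity, expression)

        w_id = np.random.randn(n_id)    # identity weights for one person
        w_exp = np.random.randn(n_exp)  # expression weights for one expression

        # Contract the core tensor with both weight vectors to obtain mesh vertices.
        verts = np.einsum('vie,i,e->v', core, w_id, w_exp).reshape(n_verts, 3)
        print(verts.shape)  # (2000, 3)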

  16. The right place at the right time: priming facial expressions with emotional face components in developmental visual agnosia.

    PubMed

    Aviezer, Hillel; Hassin, Ran R; Perry, Anat; Dudarev, Veronica; Bentin, Shlomo

    2012-04-01

    The current study examined the nature of deficits in emotion recognition from facial expressions in case LG, an individual with a rare form of developmental visual agnosia (DVA). LG presents with profoundly impaired recognition of facial expressions, yet the underlying nature of his deficit remains unknown. During typical face processing, normal sighted individuals extract information about expressed emotions from face regions with activity diagnostic for specific emotion categories. Given LG's impairment, we sought to shed light on his emotion perception by examining if priming facial expressions with diagnostic emotional face components would facilitate his recognition of the emotion expressed by the face. LG and control participants matched isolated face components with components appearing in a subsequently presented full-face and then categorized the face's emotion. Critically, the matched components were from regions which were diagnostic or non-diagnostic of the emotion portrayed by the full face. In experiment 1, when the full faces were briefly presented (150 ms), LG's performance was strongly influenced by the diagnosticity of the components: his emotion recognition was boosted within normal limits when diagnostic components were used and was obliterated when non-diagnostic components were used. By contrast, in experiment 2, when the face-exposure duration was extended (2000 ms), the beneficial effect of the diagnostic matching was diminished as was the detrimental effect of the non-diagnostic matching. These data highlight the impact of diagnostic facial features in normal expression recognition and suggest that impaired emotion recognition in DVA results from deficient visual integration across diagnostic face components. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Emotional Representation in Facial Expression and Script: A Comparison between Normal and Autistic Children

    ERIC Educational Resources Information Center

    Balconi, Michela; Carrera, Alba

    2007-01-01

    The paper explored conceptual and lexical skills with regard to emotional correlates of facial stimuli and scripts. In two different experimental phases normal and autistic children observed six facial expressions of emotions (happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions). In…

  18. New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    PubMed Central

    Palermo, Romina; O’Connor, Kirsty B.; Davis, Joshua M.; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing “individual differences” – that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach’s alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). PMID:23840821
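
    Cronbach's alpha, the reliability statistic reported here, is computed from the item variances and the total-score variance. A generic sketch with hypothetical data:

        import numpy as np

        def cronbach_alpha(items):
            """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_vars / total_var)

        scores = np.random.randint(0, 2, size=(200, 40)).astype(float)  # 200 participants x 40 binary items
        print(round(cronbach_alpha(scores), 2))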

  19. New tests to measure individual differences in matching and labelling facial expressions of emotion, and their association with ability to recognise vocal emotions and facial identity.

    PubMed

    Palermo, Romina; O'Connor, Kirsty B; Davis, Joshua M; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing "individual differences"--that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach's alphas of .77 and .76, respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity).

  20. Spontaneous and posed facial expression in Parkinson's disease.

    PubMed

    Smith, M C; Smith, M K; Ellgring, H

    1996-09-01

    Spontaneous and posed emotional facial expressions in individuals with Parkinson's disease (PD, n = 12) were compared with those of healthy age-matched controls (n = 12). The intensity and amount of facial expression in PD patients were expected to be reduced for spontaneous but not posed expressions. Emotional stimuli were video clips selected from films, 2-5 min in duration, designed to elicit feelings of happiness, sadness, fear, disgust, or anger. Facial movements were coded using Ekman and Friesen's (1978) Facial Action Coding System (FACS). In addition, participants rated their emotional experience on 9-point Likert scales. The PD group showed significantly less overall facial reactivity than did controls when viewing the films. The predicted Group X Condition (spontaneous vs. posed) interaction effect on smile intensity was found when PD participants with more severe disease were compared with those with milder disease and with controls. In contrast, ratings of emotional experience were similar for both groups. Depression was positively associated with emotion rating but not with measures of facial activity. Spontaneous facial expression appears to be selectively affected in PD, whereas posed expression and emotional experience remain relatively intact.

  1. Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time.

    PubMed

    Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G

    2014-01-20

    Designed by biological and social evolutionary pressures, facial expressions of emotion comprise specific facial movements to support a near-optimal system of signaling and decoding. Although facial expressions are highly dynamic, little is known about the form and function of their temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously to optimize categorization of the six classic emotions, or sequentially to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling, information theory, and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of "biologically basic to socially specific" information over time. Early in the signaling dynamics, facial expressions systematically transmit few, biologically rooted face signals supporting the categorization of fewer elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). Here, we show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication is comprised of six basic (i.e., psychologically irreducible) categories, and instead suggesting four. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Automatic decoding of facial movements reveals deceptive pain expressions

    PubMed Central

    Bartlett, Marian Stewart; Littlewort, Gwen C.; Frank, Mark G.; Lee, Kang

    2014-01-01

    In highly social species such as humans, faces have evolved to convey rich information for social interaction, including expressions of emotions and pain [1–3]. Two motor pathways control facial movement [4–7]. A subcortical extrapyramidal motor system drives spontaneous facial expressions of felt emotions. A cortical pyramidal motor system controls voluntary facial expressions. The pyramidal system enables humans to simulate facial expressions of emotions not actually experienced. Their simulation is so successful that they can deceive most observers [8–11]. Machine vision may, however, be able to distinguish deceptive from genuine facial signals by identifying the subtle differences between pyramidally and extrapyramidally driven movements. Here we show that human observers could not discriminate real from faked expressions of pain better than chance, and after training, improved accuracy to a modest 55%. However, a computer vision system that automatically measures facial movements and performs pattern recognition on those movements attained 85% accuracy. The machine system’s superiority is attributable to its ability to differentiate the dynamics of genuine from faked expressions. Thus, by revealing the dynamics of facial action through machine vision systems, our approach has the potential to elucidate behavioral fingerprints of neural control systems involved in emotional signaling. PMID:24656830
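
    The described pipeline, automated measurement of facial movements followed by pattern recognition on their dynamics, can be sketched as a classifier over per-video movement-dynamics features. The feature choices and data below are hypothetical stand-ins, not the authors' system:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        def dynamics_features(au_ts):
            """Summarize one video's AU time series (frames x AUs) by simple dynamics:
            peak amplitude, mean frame-to-frame speed, and fraction of frames above half-maximum."""
            peak = au_ts.max(axis=0)
            speed = np.abs(np.diff(au_ts, axis=0)).mean(axis=0)
            above_half = (au_ts > 0.5 * peak).mean(axis=0)
            return np.concatenate([peak, speed, above_half])

        videos = [np.random.rand(120, 20) for _ in range(100)]  # 100 videos, 20 AU intensity traces
        X = np.stack([dynamics_features(v) for v in videos])
        y = np.random.randint(0, 2, 100)                        # 0 = genuine, 1 = faked (hypothetical labels)

        print(cross_val_score(SVC(kernel='linear'), X, y, cv=5).mean())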

  3. [Facial expressions of negative emotions in clinical interviews: The development, reliability and validity of a categorical system for the attribution of functions to facial expressions of negative emotions].

    PubMed

    Bock, Astrid; Huber, Eva; Peham, Doris; Benecke, Cord

    2015-01-01

    We report the development (Study 1) and validation (Study 2) of a categorical system for attributing facial expressions of negative emotions to specific functions. The facial expressions observed in OPD interviews (OPD Task Force 2009) are coded according to the Facial Action Coding System (FACS; Ekman et al. 2002) and attributed to categories of basic emotional displays using EmFACS (Friesen & Ekman 1984). In Study 1 we analyze a partial sample of 20 interviews and postulate 10 categories of functions that can be arranged into three main categories (interactive, self and object). In Study 2 we rate the facial expressions (n=2320) from the OPD interviews (10 minutes per interview) of 80 female subjects (16 healthy, 64 with a DSM-IV diagnosis; age: 18-57 years) according to the categorical system and correlate them with problematic relationship experiences (measured with the IIP; Horowitz et al. 2000). Functions of negative facial expressions can be attributed reliably and validly with the RFE-Coding System. The attribution of interactive, self-related and object-related functions allows for a deeper understanding of the emotional facial expressions of patients with mental disorders.

  4. Image ratio features for facial expression recognition application.

    PubMed

    Song, Mingli; Tao, Dacheng; Liu, Zicheng; Li, Xuelong; Zhou, Mengchu

    2010-06-01

    Video-based facial expression recognition is a challenging problem in computer vision and human-computer interaction. To address this problem, texture features have been widely used because they capture the image intensity changes caused by skin deformation. However, existing texture features encounter problems with albedo and lighting variations. To solve both problems, we propose new texture features called image ratio features. Compared with previously proposed texture features, e.g., high gradient component features, image ratio features are more robust to albedo and lighting variations. In addition, to further improve facial expression recognition accuracy based on image ratio features, we combine image ratio features with facial animation parameters (FAPs), which describe the geometric motions of facial feature points. The performance evaluation is based on the Carnegie Mellon University Cohn-Kanade database, our own database, and the Japanese Female Facial Expression database. Experimental results show that the proposed image ratio feature is more robust to albedo and lighting variations, and the combination of image ratio features and FAPs outperforms each feature alone. In addition, we study asymmetric facial expressions based on our own facial expression database and demonstrate the superior performance of our combined expression recognition system.
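
    The core idea behind a ratio representation is that dividing an expression image pixel-wise by a reference image cancels multiplicative albedo. A generic ratio-image sketch, not the paper's exact feature definition; the images are hypothetical grayscale arrays in [0, 1]:

        import numpy as np

        def ratio_image(expr, neutral, eps=1e-3):
            """Per-pixel ratio of an expression frame to a neutral reference frame.
            If I = albedo * shading, the albedo factor cancels in the ratio."""
            return expr / (neutral + eps)   # eps guards against division by zero

        neutral = np.random.rand(128, 128)
        expr = neutral * (1.0 + 0.3 * np.random.rand(128, 128))  # simulated shading change from deformation
        features = ratio_image(expr, neutral)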

  5. Alexithymia and the labeling of facial emotions: response slowing and increased motor and somatosensory processing

    PubMed Central

    2014-01-01

    Background: Alexithymia is a personality trait that is characterized by difficulties in identifying and describing feelings. Previous studies have shown that alexithymia is related to problems in recognizing others’ emotional facial expressions when these are presented with temporal constraints. These problems can be less severe when the expressions are visible for a relatively long time. Because the neural correlates of these recognition deficits are still relatively unexplored, we investigated the labeling of facial emotions and brain responses to facial emotions as a function of alexithymia. Results: Forty-eight healthy participants had to label the emotional expression (angry, fearful, happy, or neutral) of faces presented for 1 or 3 seconds in a forced-choice format while undergoing functional magnetic resonance imaging. The participants’ level of alexithymia was assessed using self-report and interview. In light of the previous findings, we focused our analysis on the alexithymia component of difficulties in describing feelings. Difficulties describing feelings, as assessed by the interview, were associated with increased reaction times for negative (i.e., angry and fearful) faces, but not with labeling accuracy. Moreover, individuals with higher alexithymia showed increased brain activation in the somatosensory cortex and supplementary motor area (SMA) in response to angry and fearful faces. These cortical areas are known to be involved in the simulation of the bodily (motor and somatosensory) components of facial emotions. Conclusion: The present data indicate that alexithymic individuals may use information related to bodily actions rather than affective states to understand the facial expressions of other persons. PMID:24629094

  6. Neural mechanisms underlying the effects of face-based affective signals on memory for faces: a tentative model

    PubMed Central

    Tsukiura, Takashi

    2012-01-01

    In our daily lives, we form some impressions of other people. Although those impressions are affected by many factors, face-based affective signals such as facial expression, facial attractiveness, or trustworthiness are important. Previous psychological studies have demonstrated the impact of facial impressions on remembering other people, but little is known about the neural mechanisms underlying this psychological process. The purpose of this article is to review recent functional MRI (fMRI) studies to investigate the effects of face-based affective signals including facial expression, facial attractiveness, and trustworthiness on memory for faces, and to propose a tentative concept for understanding this affective-cognitive interaction. On the basis of the aforementioned research, three brain regions are potentially involved in the processing of face-based affective signals. The first candidate is the amygdala, where activity is generally modulated by both affectively positive and negative signals from faces. Activity in the orbitofrontal cortex (OFC), as the second candidate, increases as a function of perceived positive signals from faces; whereas activity in the insular cortex, as the third candidate, reflects a function of face-based negative signals. In addition, neuroscientific studies have reported that the three regions are functionally connected to the memory-related hippocampal regions. These findings suggest that the effects of face-based affective signals on memory for faces could be modulated by interactions between the regions associated with the processing of face-based affective signals and the hippocampus as a memory-related region. PMID:22837740

  7. The role of holistic processing in judgments of facial attractiveness.

    PubMed

    Abbas, Zara-Angela; Duchaine, Bradley

    2008-01-01

    Previous work has demonstrated that facial identity recognition, expression recognition, gender categorisation, and race categorisation rely on a holistic representation. Here we examine whether a holistic representation is also used for judgments of facial attractiveness. Like past studies, we used the composite paradigm to assess holistic processing (Young et al 1987, Perception 16 747-759). Experiment 1 showed that top halves of upright faces are judged to be more attractive when aligned with an attractive bottom half than when aligned with an unattractive bottom half. To assess whether this effect resulted from holistic processing or more general effects, we examined the impact of the attractive and unattractive bottom halves when upright halves were misaligned and when aligned and misaligned halves were presented upside-down. The bottom halves had no effect in either condition. These results demonstrate that the perceptual processes underlying upright facial-attractiveness judgments represent the face holistically. Our findings with attractiveness judgments and previous demonstrations involving other aspects of face processing suggest that a common holistic representation is used for most types of face processing.

  8. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults.

    PubMed

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development: the Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions (angry, fearful, sad, happy, surprised, and disgusted) and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  9. Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions.

    PubMed

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio

    2015-01-01

    The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expression recognition task. Recognition bias was measured as participants' tendency to over-attribute the anger label to other negative facial expressions. Participants' heart rate was assessed and related to their behavioral performance, as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants' performance was controlled for age, cognitive and educational levels and for naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, there was a significant effect of heart rate on participants' tendency to use the anger label. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children's "pre-existing bias" for anger labeling in forced-choice emotion recognition tasks. Moreover, they strengthen the thesis according to which the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim's perceptive and attentive focus on salient environmental social stimuli.

  10. Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions

    PubMed Central

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio

    2015-01-01

    The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expression recognition task. Recognition bias was measured as participants’ tendency to over-attribute the anger label to other negative facial expressions. Participants’ heart rate was assessed and related to their behavioral performance, as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants’ performance was controlled for age, cognitive and educational levels and for naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, there was a significant effect of heart rate on participants’ tendency to use the anger label. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children’s “pre-existing bias” for anger labeling in forced-choice emotion recognition tasks. Moreover, they strengthen the thesis according to which the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim’s perceptive and attentive focus on salient environmental social stimuli. PMID:26509890

  11. Eye-Tracking Evidence that Happy Faces Impair Verbal Message Comprehension: The Case of Health Warnings in Direct-to-Consumer Pharmaceutical Television Commercials

    PubMed Central

    Russell, Cristel Antonia; Swasy, John L.; Russell, Dale Wesley; Engel, Larry

    2017-01-01

    Risk warning or disclosure information in advertising is only effective in correcting consumers’ judgments if enough cognitive capacity is available to process that information. Hence, comprehension of verbal warnings in TV commercials may suffer if accompanied by positive visual elements. This research addresses this concern about cross-modality interference in the context of direct-to-consumer (DTC) pharmaceutical commercials in the United States by experimentally testing whether positive facial expressions reduce consumers’ understanding of the mandated health warning. A content analysis of a sample of DTC commercials reveals that positive facial expressions are more prevalent during the verbal warning act of the commercials than during the other acts. An eye-tracking experiment conducted with specially produced DTC commercials, which vary the valence of characters’ facial expressions during the health warning, provides evidence that happy faces reduce objective comprehension of the warning. PMID:29269979

  12. Red - Take a Closer Look

    PubMed Central

    Buechner, Vanessa L.; Maier, Markus A.; Lichtenfeld, Stephanie; Schwarz, Sascha

    2014-01-01

    Color research has shown that red is associated with avoidance of threat (e.g., failure) or approach of reward (e.g., mating) depending on the context in which it is perceived. In the present study we explored one central cognitive process that might be involved in the context dependency of red associations. According to our theory, red is held to highlight the relevance (importance) of a goal-related stimulus and correspondingly to intensify the perceivers’ attentional reaction to it. Angry and happy human facial expressions, compared with non-human ones, were used as goal-relevant stimuli. The data indicate that the color red leads to enhanced attentional engagement with angry and happy human facial expressions (compared to neutral ones), whereas non-human facial expressions do not bias attention. The results are discussed with regard to the idea that red-induced attentional biases might explain the red-context effects on motivation. PMID:25254380

  13. The influence of attention toward facial expressions on size perception.

    PubMed

    Choi, Jeong-Won; Kim, Kiho; Lee, Jang-Han

    2016-01-01

    According to the New Look theory, size perception is affected by emotional factors. Although previous studies have attempted to explain the effects of both emotion and motivation on size perception, they have failed to identify the underlying mechanisms. This study aimed to investigate the underlying mechanisms of size perception by applying attention toward facial expressions using the Ebbinghaus illusion as a measurement tool. The participants, female university students, were asked to judge the size of a target stimulus relative to the size of facial expressions (i.e., happy, angry, and neutral) surrounding the target. The results revealed that the participants perceived angry and neutral faces to be larger than happy faces. This finding indicates that individuals pay closer attention to neutral and angry faces than happy ones. These results suggest that the mechanisms underlying size perception involve cognitive processes that focus attention toward relevant stimuli and block out irrelevant stimuli.

  14. Is right hemisphere decline in the perception of emotion a function of aging?

    PubMed

    McDowell, C L; Harrison, D W; Demaree, H A

    1994-11-01

    The hypothesis that the right cerebral hemisphere declines more quickly than the left cerebral hemisphere in the normal aging process was tested using accuracy and intensity measures in a facial recognition test and using response time and response bias measures in a tachistoscopic paradigm. Elderly and younger men and women (N = 60) participated in both experiments. Experiment 1 required facial affect identification and intensity ratings of 50 standardized photographs of 5 affective categories: Happy, Neutral, Sad, Angry, and Fearful. The elderly were significantly less accurate in identifying facial affective valence. This effect was found using negative and neutral expressions. Results for happy expressions, however, were consistent with the younger group. In Experiment 2, age differences in hemispheric asymmetry were evaluated using presentation of affective faces in each visual field. Following prolonged experience with the affective stimuli during Experiment 1, the elderly showed heightened cerebral asymmetry for facial affect processing compared to the younger group. Both groups showed a positive affective bias to neutral stimuli presented to the left hemisphere. Elderly and younger subjects scored significantly higher on Vocabulary and Block Design subtests of the WAIS-R, respectively. Overall, the findings suggest that the elderly have more difficulty processing negative affect, while their ability to process positive affect remains intact. The results lend only partial support to the right hemi-aging hypothesis.

  15. The effect of facial expressions on respirators contact pressures.

    PubMed

    Cai, Mang; Shen, Shengnan; Li, Hui

    2017-08-01

    This study investigated the effect of four typical facial expressions (calmness, happiness, sadness and surprise) on contact characteristics between an N95 filtering facepiece respirator and a headform. The respirator model comprised two layers (an inner layer and an outer layer) and a nose clip. The headform model comprised a skin layer, a fatty tissue layer embedded with eight muscles, and a skull layer. Four typical facial expressions were generated by the coordinated contraction of four facial muscles. The distribution of contact pressure on the headform and the contact area were then calculated. Results demonstrated that the nose clip helped the respirator fit closer to the nose bridge, though at the cost of facial discomfort. Moreover, contact areas varied with different facial expressions, and facial expressions significantly altered contact pressures at different key areas, which may result in leakage.

  16. Macaques can predict social outcomes from facial expressions.

    PubMed

    Waller, Bridget M; Whitehouse, Jamie; Micheletta, Jérôme

    2016-09-01

    There is widespread acceptance that facial expressions are useful in social interactions, but empirical demonstration of their adaptive function has remained elusive. Here, we investigated whether macaques can use the facial expressions of others to predict the future outcomes of social interaction. Crested macaques (Macaca nigra) were shown an approach between two unknown individuals on a touchscreen and were required to choose between one of two potential social outcomes. The facial expressions of the actors were manipulated in the last frame of the video. One subject reached the experimental stage and accurately predicted different social outcomes depending on which facial expressions the actors displayed. The bared-teeth display (homologue of the human smile) was most strongly associated with predicted friendly outcomes. Contrary to our predictions, screams and threat faces were not associated more with conflict outcomes. Overall, therefore, the presence of any facial expression (compared to neutral) caused the subject to choose friendly outcomes more than negative outcomes. Facial expression in general, therefore, indicated a reduced likelihood of social conflict. The findings dispute traditional theories that view expressions only as indicators of present emotion and instead suggest that expressions form part of complex social interactions where individuals think beyond the present.

  17. Imitating expressions: emotion-specific neural substrates in facial mimicry.

    PubMed

    Lee, Tien-Wen; Josephs, Oliver; Dolan, Raymond J; Critchley, Hugo D

    2006-09-01

    Intentionally adopting a discrete emotional facial expression can modulate the subjective feelings corresponding to that emotion; however, the underlying neural mechanism is poorly understood. We therefore used functional brain imaging (functional magnetic resonance imaging) to examine brain activity during intentional mimicry of emotional and non-emotional facial expressions and relate regional responses to the magnitude of expression-induced facial movement. Eighteen healthy subjects were scanned while imitating video clips depicting three emotional (sad, angry, happy), and two 'ingestive' (chewing and licking) facial expressions. Simultaneously, facial movement was monitored from displacement of fiducial markers (highly reflective dots) on each subject's face. Imitating emotional expressions enhanced activity within right inferior prefrontal cortex. This pattern was absent during passive viewing conditions. Moreover, the magnitude of facial movement during emotion-imitation predicted responses within right insula and motor/premotor cortices. Enhanced activity in ventromedial prefrontal cortex and frontal pole was observed during imitation of anger, in ventromedial prefrontal and rostral anterior cingulate cortex during imitation of sadness, and in striatal, amygdalar and occipitotemporal regions during imitation of happiness. Our findings suggest a central role for right inferior frontal gyrus in the intentional imitation of emotional expressions. Further, by entering metrics for facial muscular change into analysis of brain imaging data, we highlight shared and discrete neural substrates supporting affective, action and social consequences of somatomotor emotional expression.

  18. Capturing Physiology of Emotion along Facial Muscles: A Method of Distinguishing Feigned from Involuntary Expressions

    NASA Astrophysics Data System (ADS)

    Khan, Masood Mehmood; Ward, Robert D.; Ingleby, Michael

    The ability to distinguish feigned from involuntary expressions of emotions could help in the investigation and treatment of neuropsychiatric and affective disorders and in the detection of malingering. This work investigates differences in emotion-specific patterns of thermal variations along the major facial muscles. Using experimental data extracted from 156 images, we attempted to classify patterns of emotion-specific thermal variations into neutral, and voluntary and involuntary expressions of positive and negative emotive states. Initial results suggest (i) each facial muscle exhibits a unique thermal response to various emotive states; (ii) the pattern of thermal variances along the facial muscles may assist in classifying voluntary and involuntary facial expressions; and (iii) facial skin temperature measurements along the major facial muscles may be used in automated emotion assessment.
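
    One plausible reading of this classification step is a standard discriminant analysis over mean thermal readings within facial-muscle regions of interest. A hedged sketch under that assumption; the data, labels, and ROI count are hypothetical:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        X = np.random.rand(156, 8)        # 156 images x mean temperature over 8 muscle ROIs (hypothetical)
        y = np.random.randint(0, 5, 156)  # 5 classes: neutral, pos/neg x voluntary/involuntary

        print(cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean())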

  19. Posed versus spontaneous facial expressions are modulated by opposite cerebral hemispheres.

    PubMed

    Ross, Elliott D; Pulusu, Vinay K

    2013-05-01

    Clinical research has indicated that the left face is more expressive than the right face, suggesting that modulation of facial expressions is lateralized to the right hemisphere. The findings, however, are controversial because the results explain, on average, approximately 4% of the data variance. Using high-speed videography, we sought to determine if movement-onset asymmetry was a more powerful research paradigm than terminal movement asymmetry. The results were very robust, explaining up to 70% of the data variance. Posed expressions began overwhelmingly on the right face whereas spontaneous expressions began overwhelmingly on the left face. This dichotomy was most robust for upper facial expressions. In addition, movement-onset asymmetries did not predict terminal movement asymmetries, which were not significantly lateralized. The results support recent neuroanatomic observations that upper versus lower facial movements have different forebrain motor representations and recent behavioral constructs that posed versus spontaneous facial expressions are modulated preferentially by opposite cerebral hemispheres and that spontaneous facial expressions are graded rather than non-graded movements. Published by Elsevier Ltd.

  20. Tuning to the Positive: Age-Related Differences in Subjective Perception of Facial Emotion

    PubMed Central

    Picardo, Rochelle; Baron, Andrew S.; Anderson, Adam K.; Todd, Rebecca M.

    2016-01-01

    Facial expressions aid social transactions and serve as socialization tools, with smiles signaling approval and reward, and angry faces signaling disapproval and punishment. The present study examined whether the subjective experience of positive vs. negative facial expressions differs between children and adults. Specifically, we examined age-related differences in biases toward happy and angry facial expressions. Young children (5–7 years) and young adults (18–29 years) rated the intensity of happy and angry expressions as well as levels of experienced arousal. Results showed that young children—but not young adults—rated happy facial expressions as both more intense and arousing than angry faces. This finding, which we replicated in two independent samples, was not due to differences in the ability to identify facial expressions, and suggests that children are more tuned to information in positive expressions. Together these studies provide evidence that children see unambiguous adult emotional expressions through rose-colored glasses, and suggest that what is emotionally relevant can shift with development. PMID:26734940

  1. Effects of damping head movement and facial expression in dyadic conversation using real–time facial expression tracking and synthesized avatars

    PubMed Central

    Boker, Steven M.; Cohn, Jeffrey F.; Theobald, Barry-John; Matthews, Iain; Brick, Timothy R.; Spies, Jeffrey R.

    2009-01-01

    When people speak with one another, they tend to adapt their head movements and facial expressions in response to each other's head movements and facial expressions. We present an experiment in which confederates' head movements and facial expressions were motion tracked during videoconference conversations, an avatar face was reconstructed in real time, and naive participants spoke with the avatar face. No naive participant guessed that the computer-generated face was not video. Confederates' facial expressions, vocal inflections and head movements were attenuated at 1 min intervals in a fully crossed experimental design. Attenuated head movements led to increased head nods and lateral head turns, and attenuated facial expressions led to increased head nodding in both naive participants and confederates. Together, these results are consistent with a hypothesis that the dynamics of head movements in dyadic conversation include a shared equilibrium. Although both conversational partners were blind to the manipulation, when apparent head movement of one conversant was attenuated, both partners responded by increasing the velocity of their head movements. PMID:19884143
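
    One plausible implementation of such attenuation is to scale pose deviations toward a running-average baseline; a hedged sketch of that idea (parameter names and data are hypothetical, not the authors' system):

        import numpy as np

        def attenuate(pose, alpha=0.5, window=30):
            """Scale per-frame pose deviations (e.g., yaw/pitch/roll) toward a moving average."""
            kernel = np.ones(window) / window
            baseline = np.apply_along_axis(
                lambda x: np.convolve(x, kernel, mode='same'), 0, pose)
            return baseline + alpha * (pose - baseline)   # alpha < 1 damps movement

        head_pose = np.cumsum(np.random.randn(300, 3), axis=0) * 0.1  # frames x (yaw, pitch, roll)
        damped = attenuate(head_pose, alpha=0.5)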

  2. Misinterpretation of facial expression: a cross-cultural study.

    PubMed

    Shioiri, T; Someya, T; Helmeste, D; Tang, S W

    1999-02-01

    Accurately recognizing facial emotional expressions is important in psychiatrist-versus-patient interactions. This might be difficult when the physician and patients are from different cultures. More than two decades of research on facial expressions have documented the universality of the emotions of anger, contempt, disgust, fear, happiness, sadness, and surprise. In contrast, some research data supported the concept that there are significant cultural differences in the judgment of emotion. In this pilot study, the recognition of emotional facial expressions in 123 Japanese subjects was evaluated using the Japanese and Caucasian Facial Expression of Emotion (JACFEE) photos. The results indicated that Japanese subjects experienced difficulties in recognizing some emotional facial expressions and misunderstood others as depicted by the posers, when compared to previous studies using American subjects. Interestingly, the sex and cultural background of the poser did not appear to influence the accuracy of recognition. The data suggest that in this young Japanese sample, judgment of certain emotional facial expressions was significantly different from the Americans. Further exploration in this area is warranted due to its importance in cross-cultural clinician-patient interactions.

  3. Multichannel Communication: The Impact of the Paralinguistic Channel on Facial Expression of Emotion.

    ERIC Educational Resources Information Center

    Brideau, Linda B.; Allen, Vernon L.

    A study was undertaken to examine the impact of the paralinguistic channel on the ability to encode facial expressions of emotion. The first set of subjects, 19 encoders, were asked to encode facial expressions for five emotions (fear, sadness, anger, happiness, and disgust). The emotions were produced in three encoding conditions: facial channel…

  4. The Effect of Observers’ Mood on the Local Processing of Emotional Faces: Evidence from Short-Lived and Prolonged Mood States

    PubMed Central

    Mokhtari, Setareh; Buttle, Heather

    2015-01-01

    We examined the effect of induced mood, varying in valence and longevity, on local processing of emotional faces. It was found that negative facial expression conveyed by the global level of the face interferes with efficient processing of the local features. The results also showed that the duration of involvement with a mood influenced the local processing. We observed that attending to the local level of faces is not different in short-lived happy and sad mood states. However, as the mood state is experienced for a longer period, local processing was impaired in happy mood compared to sad mood. Taken together, we concluded that both facial expressions and affective states influence processing of the local parts of faces. Moreover, we suggest that mediating factors like the duration of involvement with the mood play a role in the interrelation between mood, attention, and perception. PMID:25883696

  5. Reconstructing dynamic mental models of facial expressions in prosopagnosia reveals distinct representations for identity and expression.

    PubMed

    Richoz, Anne-Raphaëlle; Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G; Caldara, Roberto

    2015-04-01

    The human face transmits a wealth of signals that readily provide crucial information for social interactions, such as facial identity and emotional expression. Yet, a fundamental question remains unresolved: does the face information for identity and emotional expression categorization tap into common or distinct representational systems? To address this question we tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions that are assumed to contribute to facial expression (de)coding (i.e., the amygdala, the insula and the posterior superior temporal sulcus--pSTS). We previously demonstrated that PS does not use information from the eye region to identify faces, but relies on the suboptimal mouth region. PS's abnormal information use for identity, coupled with her neural dissociation, provides a unique opportunity to probe the existence of a dichotomy in the face representational system. To reconstruct the mental models of the six basic facial expressions of emotion in PS and age-matched healthy observers, we used a novel reverse correlation technique tracking information use on dynamic faces. PS was comparable to controls, using all facial features to (de)code facial expressions with the exception of fear. PS's normal (de)coding of dynamic facial expressions suggests that the face system relies either on distinct representational systems for identity and expression, or dissociable cortical pathways to access them. Interestingly, PS showed a selective impairment for categorizing many static facial expressions, which could be accounted for by her lesion in the right inferior occipital gyrus. PS's advantage for dynamic facial expressions might instead relate to a functionally distinct and sufficient cortical pathway directly connecting the early visual cortex to the spared pSTS. Altogether, our data provide critical insights on the healthy and impaired face systems, question evidence of deficits obtained from patients by using static images of facial expressions, and offer novel routes for patient rehabilitation. Copyright © 2014 Elsevier Ltd. All rights reserved.
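
    Generic reverse correlation estimates an observer's internal template as the average of the random stimulus samples that elicited a given response, relative to the overall average. A minimal sketch of that idea; the study's dynamic-face technique is considerably more elaborate, and the data here are hypothetical:

        import numpy as np

        n_trials, h, w = 2000, 64, 64
        masks = np.random.rand(n_trials, h, w)          # random per-trial stimulus sampling masks
        responses = np.random.randint(0, 2, n_trials)   # 1 = "fear" response on that trial, else 0

        # Classification image: what was shown on "fear" trials, relative to the grand mean.
        classification_image = masks[responses == 1].mean(axis=0) - masks.mean(axis=0)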

  6. Identity recognition and happy and sad facial expression recall: influence of depressive symptoms.

    PubMed

    Jermann, Françoise; van der Linden, Martial; D'Argembeau, Arnaud

    2008-05-01

    Relatively few studies have examined memory bias for social stimuli in depression or dysphoria. The aim of this study was to investigate the influence of depressive symptoms on memory for facial information. A total of 234 participants completed the Beck Depression Inventory II and a task examining memory for facial identity and expression of happy and sad faces. For both facial identity and expression, the recollective experience was measured with the Remember/Know/Guess procedure (Gardiner & Richardson-Klavehn, 2000). The results show no major association between depressive symptoms and memory for identities. However, dysphoric individuals consciously recalled (Remember responses) more sad facial expressions than non-dysphoric individuals. These findings suggest that sad facial expressions led to more elaborate encoding, and thereby better recollection, in dysphoric individuals.

  7. Changes in nuclear morphology and chromatin texture of basal keratinocytes in melasma.

    PubMed

    Brianezi, G; Handel, A C; Schmitt, J V; Miot, L D B; Miot, H A

    2015-04-01

    The pathogenesis of melasma and the role of keratinocytes in disease development and maintenance are not completely understood. Dermal abnormalities and the expression of inflammatory mediators, growth factors, and epithelial melanocortin and sexual hormone receptors suggest that not only melanocytes but the entire epidermal melanin unit is involved in melasma pathophysiology. To compare nuclear morphological features and chromatin texture between basal keratinocytes in facial melasma and adjacent normal skin, we took facial skin biopsies (2 mm; melasma and adjacent normal skin) from women and processed them for haematoxylin and eosin staining. Thirty non-overlapping basal keratinocyte nuclei were segmented, and descriptors of area, largest diameter, perimeter, circularity, pixel intensity, profilometric index (Ra) and fractal dimension were extracted using ImageJ software. Basal keratinocyte nuclei from facial melasma epidermis displayed larger size, more irregular shape, hyperpigmentation and greater chromatin heterogeneity (by fractal dimension) than those from perilesional skin. Basal keratinocytes in facial melasma thus display changes in nuclear form and chromatin texture, suggesting that the phenotypic differences between melasma and adjacent facial skin can result from alterations of the entire epidermal melanin unit, not just from hypertrophic melanocytes.
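
    For readers who want to reproduce such measurements, the shape descriptors named above are straightforward to compute from a segmented nucleus mask. The sketch below is illustrative only (it is not the authors' ImageJ pipeline): it assumes a 2-D boolean NumPy mask of a single nucleus and uses scikit-image region properties, with circularity following the ImageJ convention 4*pi*area/perimeter^2.

        import numpy as np
        from skimage import measure

        def nucleus_descriptors(mask: np.ndarray) -> dict:
            """Simple nuclear shape descriptors from a binary mask
            (True inside the nucleus)."""
            props = measure.regionprops(measure.label(mask))[0]
            area = props.area                   # size in pixels
            perimeter = props.perimeter         # boundary length in pixels
            diameter = props.major_axis_length  # largest diameter (ellipse fit)
            # ImageJ-style circularity: 1.0 for a perfect circle, lower otherwise
            circularity = 4.0 * np.pi * area / perimeter ** 2
            return {"area": area, "perimeter": perimeter,
                    "largest_diameter": diameter, "circularity": circularity}

    The pixel-intensity and fractal-dimension descriptors would additionally require the grey-level image and a box-counting estimate, which are omitted from this sketch.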

  8. Realistic facial animation generation based on facial expression mapping

    NASA Astrophysics Data System (ADS)

    Yu, Hui; Garrod, Oliver; Jack, Rachael; Schyns, Philippe

    2014-01-01

    Facial expressions reflect a character's internal emotional states or responses to social communication. Although much effort has been devoted to generating realistic facial expressions, this remains a challenging topic because humans are sensitive to subtle facial movements. In this paper, we present a method for facial animation generation that reflects true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static AU peak frames, based on FACS, to the conformed target face. Dynamic parameters derived using a psychophysical method are then integrated to generate facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation.
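
    To make the mapping step concrete, the sketch below shows the simplest linear-blendshape reading of such a pipeline: per-AU vertex displacements (obtained by transferring captured AU peak frames onto the target topology) are weighted by per-frame activation curves and summed. All names are hypothetical, and the paper's intermediate model space and retargeting details are abstracted away.

        import numpy as np

        def animate_face(neutral_verts, au_deltas, au_curves):
            """Blend per-AU vertex displacements over time.

            neutral_verts : (V, 3) vertices of the neutral target face
            au_deltas     : dict au_name -> (V, 3) displacement at AU peak
            au_curves     : dict au_name -> (T,) activation weights in [0, 1]
            Returns a (T, V, 3) animated vertex sequence.
            """
            T = len(next(iter(au_curves.values())))
            frames = np.repeat(neutral_verts[None].astype(float), T, axis=0)
            for au, delta in au_deltas.items():
                w = np.asarray(au_curves[au], dtype=float)
                # broadcast (T,1,1) * (1,V,3): linear AU blending per frame
                frames += w[:, None, None] * np.asarray(delta, float)[None]
            return frames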

  9. Behind the Robot's Smiles and Frowns: In Social Context, People Do Not Mirror Android's Expressions But React to Their Informational Value.

    PubMed

    Hofree, Galit; Ruvolo, Paul; Reinert, Audrey; Bartlett, Marian S; Winkielman, Piotr

    2018-01-01

    Facial actions are key elements of non-verbal behavior. Perceivers' reactions to others' facial expressions often represent a match or mirroring (e.g., they smile to a smile). However, the information conveyed by an expression depends on context. Thus, when shown by an opponent, a smile conveys bad news and evokes frowning. The availability of anthropomorphic agents capable of facial actions raises the question of how people respond to such agents in a social context. We explored this issue in a study where participants played a strategic game with or against a facially expressive android. Electromyography (EMG) recorded participants' reactions over the zygomaticus muscle (smiling) and the corrugator muscle (frowning). We found that participants' facial responses to the android's expressions reflect their informational value, rather than a direct match. Overall, participants smiled more, and frowned less, when winning than losing. Critically, participants' responses to the game outcome were similar regardless of whether it was conveyed via the android's smile or frown. Furthermore, the outcome had a greater impact on people's facial reactions when it was conveyed through the android's face than through a computer screen. These findings demonstrate that the facial actions of artificial agents impact human facial responding. They also suggest a sophistication in human-robot communication that highlights the signaling value of facial expressions.

  10. Facial Expression Enhances Emotion Perception Compared to Vocal Prosody: Behavioral and fMRI Studies.

    PubMed

    Zhang, Heming; Chen, Xuhai; Chen, Shengdong; Li, Yansong; Chen, Changming; Long, Quanshan; Yuan, Jiajin

    2018-05-09

    Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, currently little is known about how emotion perception and its neural substrates differ across facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primers and bimodal materials as target) and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region of interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of the target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosodies.

  11. Variants of Independence in the Perception of Facial Identity and Expression

    ERIC Educational Resources Information Center

    Fitousi, Daniel; Wenger, Michael J.

    2013-01-01

    A prominent theory in the face perception literature--the parallel-route hypothesis (Bruce & Young, 1986)--assumes a dedicated channel for the processing of identity that is separate and independent from the channel(s) in which nonidentity information is processed (e.g., expression, eye gaze). The current work subjected this assumption to…

  12. Lateralisation for processing facial emotion and anxiety: contrasting state, trait and social anxiety.

    PubMed

    Bourne, Victoria J; Vladeanu, Matei

    2011-04-01

    Recent neuropsychological studies have attempted to distinguish between different types of anxiety by contrasting patterns of brain organisation or activation; however, lateralisation for processing emotional stimuli has received relatively little attention. This study examines the relationship between strength of lateralisation for the processing of facial expressions of emotion and three measures of anxiety: state anxiety, trait anxiety and social anxiety. Across all six of the basic emotions (anger, disgust, fear, happiness, sadness, surprise) the same patterns of association were found. Participants with high levels of trait anxiety were more strongly lateralised to the right hemisphere for processing facial emotion. In contrast, participants with high levels of self-reported physiological arousal in response to social anxiety were more weakly lateralised to the right hemisphere, or even lateralised to the left hemisphere, for the processing of facial emotion. There were also sex differences in these associations: the relationships were evident for males only. The finding of distinct patterns of lateralisation for trait anxiety and self-reported physiological arousal suggests different neural circuitry for trait and social anxiety.

  13. Sex differences in facial emotion recognition across varying expression intensity levels from videos.

    PubMed

    Wingenbach, Tanja S H; Ashwin, Chris; Brosnan, Mark

    2018-01-01

    There has been much research on sex differences in the ability to recognise facial expressions of emotions, with results generally showing a female advantage in reading emotional expressions from the face. However, most of the research to date has used static images and/or 'extreme' examples of facial expressions. Therefore, little is known about how expression intensity and dynamic stimuli might affect the commonly reported female advantage in facial emotion recognition. The current study investigated sex differences in accuracy of response (Hu; unbiased hit rates) and response latencies for emotion recognition using short video stimuli (1sec) of 10 different facial emotion expressions (anger, disgust, fear, sadness, surprise, happiness, contempt, pride, embarrassment, neutral) across three variations in the intensity of the emotional expression (low, intermediate, high) in an adolescent and adult sample (N = 111; 51 male, 60 female) aged between 16 and 45 (M = 22.2, SD = 5.7). Overall, females showed more accurate facial emotion recognition compared to males and were faster in correctly recognising facial emotions. The female advantage in reading expressions from the faces of others was unaffected by expression intensity levels and emotion categories used in the study. The effects were specific to recognition of emotions, as males and females did not differ in the recognition of neutral faces. Together, the results showed a robust sex difference favouring females in facial emotion recognition using video stimuli of a wide range of emotions and expression intensity variations.
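
    The unbiased hit rate (Hu) used here is Wagner's (1993) measure: for category i of a stimulus-by-response confusion matrix, Hu is the squared number of hits divided by the product of the row total (stimuli presented) and the column total (responses given), which penalises indiscriminate overuse of a response category. A minimal sketch:

        import numpy as np

        def unbiased_hit_rates(confusion: np.ndarray) -> np.ndarray:
            """Wagner's (1993) unbiased hit rate Hu per stimulus category.

            confusion[i, j] = count of stimulus category i answered as j.
            Hu_i = hits_i**2 / (stimuli_i * responses_i).
            """
            hits = np.diag(confusion).astype(float)
            stimuli = confusion.sum(axis=1)    # row totals: stimuli shown
            responses = confusion.sum(axis=0)  # column totals: responses given
            valid = (stimuli > 0) & (responses > 0)
            hu = np.full(len(hits), np.nan)
            hu[valid] = hits[valid] ** 2 / (stimuli[valid] * responses[valid])
            return hu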

  14. Emotional Intelligence and Mismatching Expressive and Verbal Messages: A Contribution to Detection of Deception

    PubMed Central

    Wojciechowski, Jerzy; Stolarski, Maciej; Matthews, Gerald

    2014-01-01

    Processing facial emotion, especially mismatches between facial and verbal messages, is believed to be important in the detection of deception. For example, emotional leakage may accompany lying. Individuals with superior emotion perception abilities may then be more adept in detecting deception by identifying mismatch between facial and verbal messages. Two personal factors that may predict such abilities are female gender and high emotional intelligence (EI). However, evidence on the role of gender and EI in detection of deception is mixed. A key issue is that the facial processing skills required to detect deception may not be the same as those required to identify facial emotion. To test this possibility, we developed a novel facial processing task, the FDT (Face Decoding Test) that requires detection of inconsistencies between facial and verbal cues to emotion. We hypothesized that gender and ability EI would be related to performance when cues were inconsistent. We also hypothesized that gender effects would be mediated by EI, because women tend to score as more emotionally intelligent on ability tests. Data were collected from 210 participants. Analyses of the FDT suggested that EI was correlated with superior face decoding in all conditions. We also confirmed the expected gender difference, the superiority of high EI individuals, and the mediation hypothesis. Also, EI was more strongly associated with facial decoding performance in women than in men, implying there may be gender differences in strategies for processing affective cues. It is concluded that integration of emotional and cognitive cues may be a core attribute of EI that contributes to the detection of deception. PMID:24658500
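
    The mediation claim (gender influences face decoding via EI) is typically tested by bootstrapping the indirect effect a*b, where a is the gender-to-EI slope and b the EI-to-decoding slope with gender controlled. The sketch below is a generic bootstrap of that quantity, not the authors' analysis; it assumes NumPy arrays gender (coded 0/1), ei, and decoding of equal length.

        import numpy as np

        def indirect_effect(x, m, y):
            """a*b from two OLS fits: m ~ x, then y ~ x + m."""
            a = np.polyfit(x, m, 1)[0]                    # x -> mediator slope
            X = np.column_stack([np.ones_like(x), x, m])
            b = np.linalg.lstsq(X, y, rcond=None)[0][2]   # mediator -> y slope
            return a * b

        def mediation_ci(x, m, y, n_boot=5000, seed=0):
            """95% bootstrap CI for the indirect effect (excluding 0 suggests mediation)."""
            rng = np.random.default_rng(seed)
            n = len(x)
            boots = [indirect_effect(x[i], m[i], y[i])
                     for i in (rng.integers(0, n, n) for _ in range(n_boot))]
            return np.percentile(boots, [2.5, 97.5])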

  15. Not on the Face Alone: Perception of Contextualized Face Expressions in Huntington's Disease

    ERIC Educational Resources Information Center

    Aviezer, Hillel; Bentin, Shlomo; Hassin, Ran R.; Meschino, Wendy S.; Kennedy, Jeanne; Grewal, Sonya; Esmail, Sherali; Cohen, Sharon; Moscovitch, Morris

    2009-01-01

    Numerous studies have demonstrated that Huntington's disease mutation-carriers have deficient explicit recognition of isolated facial expressions. There are no studies, however, which have investigated the recognition of facial expressions embedded within an emotional body and scene context. Real life facial expressions are typically embedded in…

  16. Botulinum toxin and the facial feedback hypothesis: can looking better make you feel happier?

    PubMed

    Alam, Murad; Barrett, Karen C; Hodapp, Robert M; Arndt, Kenneth A

    2008-06-01

    The facial feedback hypothesis suggests that muscular manipulations which result in more positive facial expressions may lead to more positive emotional states in affected individuals. In this essay, we hypothesize that the injection of botulinum toxin for upper face dynamic creases might induce positive emotional states by reducing the ability to frown and create other negative facial expressions. The use of botulinum toxin to pharmacologically alter upper face muscular expressiveness may curtail the appearance of negative emotions, most notably anger, but also fear and sadness. This occurs via the relaxation of the corrugator supercilii and the procerus, which are responsible for brow furrowing, and to a lesser extent, because of the relaxation of the frontalis. Concurrently, botulinum toxin may dampen some positive expressions like the true smile, which requires activity of the orbicularis oculi, a muscle also relaxed after toxin injections. On balance, the evidence suggests that botulinum toxin injections for upper face dynamic creases may reduce negative facial expressions more than they reduce positive facial expressions. Based on the facial feedback hypothesis, this net change in facial expression may potentially have the secondary effect of reducing the internal experience of negative emotions, thus making patients feel less angry, sad, and fearful.

  17. The face of pain--a pilot study to validate the measurement of facial pain expression with an improved electromyogram method.

    PubMed

    Wolf, Karsten; Raedler, Thomas; Henke, Kai; Kiefer, Falk; Mass, Reinhard; Quante, Markus; Wiedemann, Klaus

    2005-01-01

    The purpose of this pilot study was to establish the validity of an improved facial electromyogram (EMG) method for the measurement of facial pain expression. Darwin defined pain in connection with fear as a simultaneous occurrence of eye staring, brow contraction and teeth chattering. Prkachin was the first to use the video-based Facial Action Coding System to measure facial expressions while using four different types of pain triggers, identifying a group of facial muscles around the eyes. The activity of nine facial muscles in 10 healthy male subjects was analyzed. Pain was induced through a laser system with a randomized sequence of different intensities. Muscle activity was measured with a new, highly sensitive and selective facial EMG. The results indicate two groups of muscles as key for pain expression. These results are in concordance with Darwin's definition. As in Prkachin's findings, one muscle group is assembled around the orbicularis oculi muscle, initiating eye staring. The second group consists of the mentalis and depressor anguli oris muscles, which trigger mouth movements. The results demonstrate the validity of the facial EMG method for measuring facial pain expression. Further studies with psychometric measurements, a larger sample size and a female test group should be conducted.

  18. Recognition, Expression, and Understanding Facial Expressions of Emotion in Adolescents with Nonverbal and General Learning Disabilities

    ERIC Educational Resources Information Center

    Bloom, Elana; Heath, Nancy

    2010-01-01

    Children with nonverbal learning disabilities (NVLD) have been found to be worse at recognizing facial expressions than children with verbal learning disabilities (LD) and without LD. However, little research has been done with adolescents. In addition, expressing and understanding facial expressions is yet to be studied among adolescents with LD…

  19. The Perception of Dynamic and Static Facial Expressions of Happiness and Disgust Investigated by ERPs and fMRI Constrained Source Analysis

    PubMed Central

    Trautmann-Lengsfeld, Sina Alexa; Domínguez-Borràs, Judith; Escera, Carles; Herrmann, Manfred; Fehr, Thorsten

    2013-01-01

    A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study aimed at investigating the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERP) and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis on static emotional face stimuli indicated the spatio-temporal modulation of predominantly posterior regional brain activation related to the visual processing stream for both emotional valences, when compared to the neutral condition, in the fusiform gyrus. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays convey more information, reflected in complex neural networks; in particular, their changing features may trigger sustained activation related to the continuing evaluation of those faces. A combined fMRI and EEG approach thus provides advanced insight into the spatio-temporal characteristics of emotional face processing by revealing additional neural generators that are not identifiable with fMRI alone. PMID:23818974

  1. Fixation to features and neural processing of facial expressions in a gender discrimination task.

    PubMed

    Neath, Karly N; Itier, Roxane J

    2015-10-01

    Early face encoding, as reflected by the N170 ERP component, is sensitive to fixation to the eyes. We investigated whether this sensitivity varies with facial expressions of emotion and can also be seen on other ERP components, such as the P1 and EPN. Using eye-tracking to manipulate fixation on facial features, we found the N170 to be the only eye-sensitive component, and this was true for fearful, happy and neutral faces. A different effect of fixation to features was seen for the earlier P1, which likely reflected a general sensitivity to face position. An early effect of emotion (∼120 ms) for happy faces was seen at occipital sites and was sustained until ∼350 ms post-stimulus. For fearful faces, an early effect was seen around 80 ms, followed by a later effect appearing at ∼150 ms and lasting until ∼300 ms at lateral posterior sites. The results suggest that in this emotion-irrelevant gender discrimination task, processing of fearful and happy expressions occurred early and largely independently of the eye sensitivity indexed by the N170. Processing of the two emotions involved different underlying brain networks active at different times.

  2. Emotion understanding in postinstitutionalized Eastern European children

    PubMed Central

    WISMER FRIES, ALISON B.; POLLAK, SETH D.

    2005-01-01

    To examine the effects of early emotional neglect on children’s affective development, we assessed children who had experienced institutionalized care prior to adoption into family environments. One task required children to identify photographs of facial expressions of emotion. A second task required children to match facial expressions to an emotional situation. Internationally adopted, postinstitutionalized children had difficulty identifying facial expressions of emotion. In addition, postinstitutionalized children had significant difficulty matching appropriate facial expressions to happy, sad, and fearful scenarios. However, postinstitutionalized children performed as well as comparison children when asked to identify and match angry facial expressions. These results are discussed in terms of the importance of emotional input early in life on later developmental organization. PMID:15487600

  3. Empathy, but not mimicry restriction, influences the recognition of change in emotional facial expressions.

    PubMed

    Kosonogov, Vladimir; Titova, Alisa; Vorobyeva, Elena

    2015-01-01

    The current study addressed the hypothesis that empathy and the restriction of the facial muscles of observers can influence the recognition of emotional facial expressions. A sample of 74 participants recognized the subjective onset of emotional facial expressions (anger, disgust, fear, happiness, sadness, surprise, and neutral) in a series of morphed face photographs showing a gradual change (frame by frame) from one expression to another. High-empathy participants (as measured by the Empathy Quotient) recognized emotional facial expressions at earlier photographs in the series than did low-empathy participants, but there was no difference in exploration time. Restriction of the facial muscles of observers (with plasters and a stick held in the mouth) did not influence the responses. We discuss these findings in the context of the embodied simulation theory and previous data on empathy.

  4. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults

    PubMed Central

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants. PMID:25610415

  5. Emotion recognition in borderline personality disorder: effects of emotional information on negative bias.

    PubMed

    Fenske, Sabrina; Lis, Stefanie; Liebke, Lisa; Niedtfeld, Inga; Kirsch, Peter; Mier, Daniela

    2015-01-01

    Borderline Personality Disorder (BPD) is characterized by severe deficits in social interactions, which might be linked to deficits in emotion recognition. Research on emotion recognition abilities in BPD has revealed heterogeneous results, ranging from deficits to heightened sensitivity. The most stable findings point to an impairment in the evaluation of neutral facial expressions as neutral, as well as to a negative bias in emotion recognition; that is, the tendency to attribute negative emotions to neutral expressions, or in a broader sense to report a more negative emotion category than the one depicted. However, it remains unclear which contextual factors influence the occurrence of this negative bias. Previous studies suggest that priming by preceding emotional information and constrained processing time might augment the emotion recognition deficit in BPD. To test these assumptions, 32 female BPD patients and 31 healthy females, matched for age and education, participated in an emotion recognition study in which every facial expression was preceded by either a positive, neutral or negative scene. Furthermore, time constraints for processing were varied by presenting the facial expressions with a short (100 ms) or long duration (up to 3000 ms) in two separate blocks. BPD patients showed a significant deficit in emotion recognition for neutral and positive facial expressions, associated with a significant negative bias. In BPD patients, this emotion recognition deficit was differentially affected by preceding emotional information and time constraints, with a greater influence of emotional information during long face presentations and a greater influence of neutral information during short face presentations. Our results are in line with previous findings supporting the existence of a negative bias in emotion recognition in BPD patients, and provide further insights into biased social perception in BPD patients.

  6. Anodal tDCS targeting the right orbitofrontal cortex enhances facial expression recognition

    PubMed Central

    Murphy, Jillian M.; Ridley, Nicole J.; Vercammen, Ans

    2015-01-01

    The orbitofrontal cortex (OFC) has been implicated in the capacity to accurately recognise facial expressions. The aim of the current study was to determine if anodal transcranial direct current stimulation (tDCS) targeting the right OFC in healthy adults would enhance facial expression recognition, compared with a sham condition. Across two counterbalanced sessions of tDCS (i.e. anodal and sham), 20 undergraduate participants (18 female) completed a facial expression labelling task comprising angry, disgusted, fearful, happy, sad and neutral expressions, and a control (social judgement) task comprising the same expressions. Responses on the labelling task were scored for accuracy, median reaction time and overall efficiency (i.e. combined accuracy and reaction time). Anodal tDCS targeting the right OFC enhanced facial expression recognition, reflected in greater efficiency and speed of recognition across emotions, relative to the sham condition. In contrast, there was no effect of tDCS to responses on the control task. This is the first study to demonstrate that anodal tDCS targeting the right OFC boosts facial expression recognition. This finding provides a solid foundation for future research to examine the efficacy of this technique as a means to treat facial expression recognition deficits, particularly in individuals with OFC damage or dysfunction. PMID:25971602
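
    "Overall efficiency" measures that combine accuracy and reaction time are commonly operationalised as the inverse efficiency score (mean correct-trial RT divided by proportion correct; Townsend & Ashby, 1983). Whether this exact formula was used in the study is an assumption here; the sketch is purely illustrative:

        import numpy as np

        def inverse_efficiency(rt_ms: np.ndarray, correct: np.ndarray) -> float:
            """IES = mean RT on correct trials / proportion correct.

            Lower values indicate more efficient (fast and accurate) responding.
            """
            correct = correct.astype(bool)
            acc = correct.mean()
            return float("nan") if acc == 0 else rt_ms[correct].mean() / acc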

  7. Dissociation of Neural Substrates of Response Inhibition to Negative Information between Implicit and Explicit Facial Go/Nogo Tasks: Evidence from an Electrophysiological Study

    PubMed Central

    Sun, Shiyue; Carretié, Luis; Zhang, Lei; Dong, Yi; Zhu, Chunyan; Luo, Yuejia; Wang, Kai

    2014-01-01

    Background: Although ample evidence suggests that emotion and response inhibition are interrelated at the behavioral and neural levels, the neural substrates of response inhibition to negative facial information remain unclear. We therefore used event-related potential (ERP) methods to explore the effects of explicit and implicit facial expression processing on response inhibition. Methods: We used implicit (gender categorization) and explicit (emotion categorization) emotional Go/Nogo tasks in which neutral and sad faces were presented. Electrophysiological markers at the scalp and the voxel level were analyzed during the two tasks. Results: We detected a task by emotion by trial type interaction effect in the Nogo-P3 stage. Larger Nogo-P3 amplitudes during sad conditions versus neutral conditions were detected with explicit tasks. However, the amplitude differences between the two conditions were not significant for implicit tasks. Source analyses on the P3 component revealed that the right inferior frontal junction (rIFJ) was involved during this stage. The current source density (CSD) of the rIFJ was higher in sad conditions compared to neutral conditions for explicit tasks, but not for implicit tasks. Conclusions: The findings indicate that response inhibition was modulated by sad facial information at the action inhibition stage when facial expressions were processed explicitly rather than implicitly. The rIFJ may be a key brain region in emotion regulation. PMID:25330212

  8. Incongruence Between Observers’ and Observed Facial Muscle Activation Reduces Recognition of Emotional Facial Expressions From Video Stimuli

    PubMed Central

    Wingenbach, Tanja S. H.; Brosnan, Mark; Pfaltz, Monique C.; Plichta, Michael M.; Ashwin, Chris

    2018-01-01

    According to embodied cognition accounts, viewing others’ facial emotion can elicit the respective emotion representation in observers, which entails simulations of sensory, motor, and contextual experiences. In line with this, published research has found that viewing others’ facial emotion elicits automatic matched facial muscle activation, which in turn facilitates emotion recognition. Perhaps making congruent facial muscle activity explicit produces an even greater recognition advantage; if there is conflicting sensory information, i.e., incongruent facial muscle activity, this might impede recognition. The effects of actively manipulating facial muscle activity on facial emotion recognition from videos were investigated across three experimental conditions: (a) explicit imitation of viewed facial emotional expressions (stimulus-congruent condition), (b) pen-holding with the lips (stimulus-incongruent condition), and (c) passive viewing (control condition). It was hypothesised that (1) experimental conditions (a) and (b) result in greater facial muscle activity than (c), (2) experimental condition (a) increases emotion recognition accuracy from others’ faces compared to (c), and (3) experimental condition (b) lowers recognition accuracy for expressions with a salient facial feature in the lower, but not the upper, face area, compared to (c). Participants (42 males, 42 females) underwent a facial emotion recognition experiment (ADFES-BIV) while electromyography (EMG) was recorded from five facial muscle sites. The order of the experimental conditions was counterbalanced. Pen-holding caused stimulus-incongruent facial muscle activity for expressions with facial feature saliency in the lower face region, which reduced recognition of lower face region emotions. Explicit imitation caused stimulus-congruent facial muscle activity without modulating recognition. Methodological implications are discussed. PMID:29928240

  9. Deliberately generated and imitated facial expressions of emotions in people with eating disorders.

    PubMed

    Dapelo, Marcela Marin; Bodas, Sergio; Morris, Robin; Tchanturia, Kate

    2016-02-01

    People with eating disorders have difficulties in socio-emotional functioning that could contribute to maintaining the functional consequences of the disorder. This study aimed to explore the ability to deliberately generate (i.e., pose) and imitate facial expressions of emotions in women with anorexia nervosa (AN) and bulimia nervosa (BN), compared to healthy controls (HC). One hundred and three participants (36 AN, 25 BN, and 42 HC) were asked to pose and imitate facial expressions of anger, disgust, fear, happiness, and sadness. Their facial expressions were recorded and coded. Participants with eating disorders (both AN and BN) were less accurate than HC when posing facial expressions of emotions. Participants with AN were less accurate than HC when imitating facial expressions, whilst BN participants had a middle-range performance. All results remained significant after controlling for anxiety, depression and autistic features. A limitation is the relatively small number of BN participants recruited for this study. The findings suggest that people with eating disorders, particularly those with AN, have difficulties posing and imitating facial expressions of emotions. These difficulties could have an impact on social communication and social functioning. This is the first study to investigate the ability to pose and imitate facial expressions of emotions in people with eating disorders, and the findings suggest this area should be further explored in future studies.

  10. Dissociable roles of internal feelings and face recognition ability in facial expression decoding.

    PubMed

    Zhang, Lin; Song, Yiying; Liu, Ling; Liu, Jia

    2016-05-15

    The problem of emotion recognition has been tackled by researchers in both affective computing and cognitive neuroscience. While affective computing relies on analyzing visual features from facial expressions, it has been proposed that humans recognize emotions by internally simulating the emotional states conveyed by others' expressions, in addition to perceptual analysis of facial features. Here we investigated whether and how our internal feelings contributed to the ability to decode facial expressions. In two independent large samples of participants, we observed that individuals who generally experienced richer internal feelings exhibited a higher ability to decode facial expressions, and the contribution of internal feelings was independent of face recognition ability. Further, using voxel-based morphometry, we found that the gray matter volume (GMV) of bilateral superior temporal sulcus (STS) and the right inferior parietal lobule was associated with facial expression decoding through the mediating effect of internal feelings, while the GMV of bilateral STS, precuneus, and the right central opercular cortex contributed to facial expression decoding through the mediating effect of face recognition ability. In addition, the clusters in bilateral STS involved in the two components were neighboring yet separate. Our results may provide clues about the mechanism by which internal feelings, in addition to face recognition ability, serve as an important instrument for humans in facial expression decoding.

  11. Facioscapulohumeral muscular dystrophy

    MedlinePlus

    ... due to weakness of the cheek muscles; decreased facial expression due to weakness of facial muscles; depressed or angry facial expression; difficulty pronouncing words; difficulty reaching above the shoulder ...

  12. Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?

    PubMed

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James

    2017-01-01

    Parkinson's disease (PD) patients have impaired facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants, in order to explore the relationship between these two abilities and any differences between the two groups. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability to recognize emotional facial expressions was assessed with the Ekman 60-faces test (emotion recognition task). Participants were video-recorded while posing facial expressions of six primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer screen in a pseudo-random fashion and to identify the emotional label in a six-alternative forced-choice response format (emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial, participants rated their confidence in the perceived accuracy of their response. For emotion recognition, PD patients scored lower than HC on the Ekman total score (p<0.001) and on the single-emotion sub-scores for happiness, fear, anger and sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness and anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between facial emotion recognition and expressivity in both groups; the correlation was even stronger when ranking emotions from the best recognized to the worst (R = 0.75, p = 0.004). PD patients showed difficulties in recognizing emotional facial expressions produced by others and in posing facial emotional expressions compared to healthy subjects. The linear correlation between recognition and expression in both experimental groups suggests that the two mechanisms share a common system, which could be deteriorated in patients with PD. These results open new clinical and rehabilitation perspectives.

  13. Rapid Facial Reactions to Emotional Facial Expressions in Typically Developing Children and Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Beall, Paula M.; Moody, Eric J.; McIntosh, Daniel N.; Hepburn, Susan L.; Reed, Catherine L.

    2008-01-01

    Typical adults mimic facial expressions within 1000ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study…

  14. Social orienting of children with autism to facial expressions and speech: a study with a wearable eye-tracker in naturalistic settings

    PubMed Central

    Magrelli, Silvia; Jermann, Patrick; Noris, Basilio; Ansermet, François; Hentsch, François; Nadel, Jacqueline; Billard, Aude

    2013-01-01

    This study investigates attention orienting to social stimuli in children with Autism Spectrum Conditions (ASC) during dyadic social interactions taking place in real-life settings. We study the effect of social cues that differ in complexity and distinguish between social cues produced by facial expressions of emotion and those produced during speech. We recorded the children's gazes using a head-mounted eye-tracking device and report a detailed, quantitative analysis of the motion of the gaze in response to the social cues. The study encompasses a group of children with ASC from 2 to 11 years old (n = 14) and a group of typically developing (TD) children between 3 and 6 years old (n = 17). While both groups orient overtly to facial expressions, children with ASC do so to a lesser extent. Children with ASC differ importantly from TD children in the way they respond to speech cues, displaying little overt shifting of attention to speaking faces. When children with ASC orient to facial expressions, they show reaction times and first-fixation lengths similar to those of TD children. However, children with ASC orient to speaking faces more slowly than TD children. These results support the hypothesis that individuals affected by ASC have difficulties processing complex social sounds and detecting intermodal correspondence between facial and vocal information. They also corroborate evidence that people with ASC show reduced overt attention toward social stimuli. PMID:24312064

  15. Modulation of neural circuits underlying temporal production by facial expressions of pain.

    PubMed

    Ballotta, Daniela; Lui, Fausta; Porro, Carlo Adolfo; Nichelli, Paolo Frigio; Benuzzi, Francesca

    2018-01-01

    According to Scalar Expectancy Theory, humans are equipped with a biological internal clock, possibly modulated by attention and arousal. Both emotions and pain are arousing and can absorb attentional resources, thus causing distortions of temporal perception. The aims of the present single-event fMRI study were to investigate: a) whether the observation of facial expressions of pain interferes with time production; and b) the neural network subserving this kind of temporal distortion. Thirty healthy volunteers took part in the study. Subjects were asked to perform a temporal production task and a concurrent gender discrimination task while viewing faces of unknown people with either pain-related or neutral expressions. Behavioural data showed temporal underestimation (i.e., longer produced intervals) during implicit pain expression processing; this was accompanied by increased activity of the right middle temporal gyrus, a region known to be active during the perception of emotional and painful faces. Psycho-physiological interaction analyses showed that: 1) the activity of the middle temporal gyrus was positively related to that of areas previously reported to play a role in timing: the left primary motor cortex, middle cingulate cortex, supplementary motor area, right anterior insula, inferior frontal gyrus, bilateral cerebellum and basal ganglia; 2) the functional connectivity of the supplementary motor area with several frontal regions, the anterior cingulate cortex and the right angular gyrus was correlated with the produced interval during painful expression processing. Our data support the hypothesis that observing emotional expressions distorts subjective time perception through the interaction of the neural network subserving the processing of facial expressions with the brain network involved in timing. Within this frame, the middle temporal gyrus appears to be the key region of the interplay between the two neural systems.
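
    The attentional account sketched above can be made concrete with a toy pacemaker-accumulator model: if attention captured by a painful expression means some clock pulses go uncounted, reaching the remembered pulse criterion takes longer, so produced intervals lengthen. All parameters below are invented for illustration; this is not the authors' model.

        import numpy as np

        def produce_interval(criterion_pulses=100, clock_hz=50.0,
                             attention=1.0, seed=0):
            """Time (s) needed to accumulate `criterion_pulses` counted pulses.

            `attention` in (0, 1] is the probability that each emitted pulse
            is counted; distraction (attention < 1) lengthens production.
            """
            rng = np.random.default_rng(seed)
            t, counted = 0.0, 0
            while counted < criterion_pulses:
                t += rng.exponential(1.0 / clock_hz)  # Poisson pacemaker ticks
                counted += rng.random() < attention   # attentional gate
            return t

        # Expected production is criterion / (clock_hz * attention):
        # ~2.0 s with full attention, ~2.5 s when attention drops to 0.8.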

  16. Facial expression recognition based on improved deep belief networks

    NASA Astrophysics Data System (ADS)

    Wu, Yao; Qiu, Weigen

    2017-08-01

    To improve the robustness of facial expression recognition, a method based on Local Binary Patterns (LBP) combined with improved deep belief networks (DBNs) is proposed. The method uses LBP to extract features and then uses the improved DBNs as the detector and classifier operating on those LBP features, realizing the combination of LBP and improved DBNs for facial expression recognition. On the JAFFE (Japanese Female Facial Expression) database, the recognition rate improved significantly.
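
    As a concrete reference for the feature-extraction half of this pipeline, the basic 8-neighbour LBP histogram can be computed as below. This sketch covers standard LBP only (the "improved DBN" classifier is beyond a short example, and any classifier could consume these histograms); in practice LBP histograms are usually computed per image block and concatenated.

        import numpy as np

        def lbp_histogram(gray: np.ndarray) -> np.ndarray:
            """Normalised 256-bin histogram of basic 8-neighbour LBP codes.

            Each interior pixel is compared with its 8 neighbours; every
            neighbour >= the centre sets one bit of an 8-bit pattern code.
            """
            h, w = gray.shape
            center = gray[1:-1, 1:-1].astype(int)
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                       (1, 1), (1, 0), (1, -1), (0, -1)]
            codes = np.zeros_like(center)
            for bit, (dy, dx) in enumerate(offsets):
                neigh = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(int)
                codes |= (neigh >= center).astype(int) << bit
            hist = np.bincount(codes.ravel(), minlength=256)
            return hist / hist.sum()  # feature vector for a downstream classifier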

  17. Patterns of Emotion Experiences as Predictors of Facial Expressions of Emotion.

    ERIC Educational Resources Information Center

    Blumberg, Samuel H.; Izard, Carroll E.

    1991-01-01

    Examined the relations between emotion and facial expressions of emotion in 8- to 12-year-old male psychiatric patients. Results indicated that patterns or combinations of emotion experiences had an impact on facial expressions of emotion. (Author/BB)

  18. Neuropsychological Studies of Linguistic and Affective Facial Expressions in Deaf Signers.

    ERIC Educational Resources Information Center

    Corina, David P.; Bellugi, Ursula; Reilly, Judy

    1999-01-01

    Presents two studies that explore facial expression production in deaf signers. An experimental paradigm uses chimeric stimuli of American Sign Language linguistic and facial expressions to explore patterns of productive asymmetries in brain-intact signers. (Author/VWL)

  19. The influence of communicative relations on facial responses to pain: Does it matter who is watching?

    PubMed Central

    Karmann, Anna J; Lautenbacher, Stefan; Bauer, Florian; Kunz, Miriam

    2014-01-01

    BACKGROUND: Facial responses to pain are believed to be an act of communication and, as such, are likely to be affected by the relationship between sender and receiver. OBJECTIVES: To investigate this effect by examining the impact that variations in communicative relations (from being alone to being with an intimate other) have on the elements of the facial language used to communicate pain (types of facial responses), and on the degree of facial expressiveness. METHODS: Facial responses of 126 healthy participants to phasic heat pain were assessed in three different social situations: alone, but aware of video recording; in the presence of an experimenter; and in the presence of an intimate other. Furthermore, pain catastrophizing and sex (of participant and experimenter) were considered as additional influences. RESULTS: Whereas similar types of facial responses were elicited independent of the relationship between sender and observer, the degree of facial expressiveness varied significantly, with increased expressiveness occurring in the presence of the partner. Interestingly, being with an experimenter decreased facial expressiveness only in women. Pain catastrophizing and the sex of the experimenter exhibited no substantial influence on facial responses. CONCLUSION: Variations in communicative relations had no effect on the elements of the facial pain language. The degree of facial expressiveness, however, was adapted to the relationship between sender and observer. Individuals suppressed their facial communication of pain toward unfamiliar persons, whereas they overtly displayed it in the presence of an intimate other. Furthermore, when confronted with an unfamiliar person, different situational demands appeared to apply for both sexes. PMID:24432350

  1. Impact of visual learning on facial expressions of physical distress: a study on voluntary and evoked expressions of pain in congenitally blind and sighted individuals.

    PubMed

    Kunz, Miriam; Faltermeier, Nicole; Lautenbacher, Stefan

    2012-02-01

    The ability to facially communicate physical distress (e.g. pain) can be essential to ensure help, support and clinical treatment for the individual experiencing physical distress. So far, it is not known to what degree this ability reflects innate, biologically prepared programs or whether it requires visual learning. Here, we address this question by studying evoked and voluntary facial expressions of pain in congenitally blind (N=21) and sighted (N=42) individuals. The repertoire of evoked facial expressions was comparable in congenitally blind and sighted individuals; however, blind individuals were less capable of facially encoding different intensities of experimental pain. Moreover, blind individuals were less capable of voluntarily modulating their pain expression. We conclude that the repertoire of facial muscles being activated during pain is biologically prepared. However, visual learning is a prerequisite for encoding different intensities of physical distress, as well as for up- and down-regulating one's facial expression.

  2. Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.

    PubMed

    Karl, Christian; Hewig, Johannes; Osinsky, Roman

    2016-10-01

    There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded the electroencephalogram (EEG) from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and the emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.

  3. Compensating for age limits through emotional crossmodal integration

    PubMed Central

    Chaby, Laurence; Boullay, Viviane Luherne-du; Chetouani, Mohamed; Plaza, Monique

    2015-01-01

    Social interactions in daily life necessitate the integration of social signals from different sensory modalities. In the aging literature, it is well established that the recognition of emotion in facial expressions declines with advancing age, and this also occurs with vocal expressions. By contrast, crossmodal integration processing in healthy aging individuals is less documented. Here, we investigated the age-related effects on emotion recognition when faces and voices were presented alone or simultaneously, allowing for crossmodal integration. In this study, 31 young adults (M = 25.8 years) and 31 older adults (M = 67.2 years) were instructed to identify several basic emotions (happiness, sadness, anger, fear, disgust) and a neutral expression, which were displayed as visual (facial expressions), auditory (non-verbal affective vocalizations) or crossmodal (simultaneous, congruent facial and vocal affective expressions) stimuli. The results showed that older adults performed slower and worse than younger adults at recognizing negative emotions from isolated faces and voices. In the crossmodal condition, although slower, older adults were as accurate as younger except for anger. Importantly, additional analyses using the “race model” demonstrate that older adults benefited to the same extent as younger adults from the combination of facial and vocal emotional stimuli. These results help explain some conflicting results in the literature and may clarify emotional abilities related to daily life that are partially spared among older adults. PMID:26074845
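
    The "race model" referenced here is Miller's (1982) race model inequality: if audiovisual responses were produced by two independent unimodal processes merely racing each other, then at every latency t, P(RT <= t | AV) could not exceed P(RT <= t | A) + P(RT <= t | V). Benefits that violate this bound indicate genuine crossmodal integration. A minimal empirical check (sketch, hypothetical names):

        import numpy as np

        def race_violations(rt_av, rt_a, rt_v, n_points=20):
            """Latencies where Miller's race-model inequality is violated.

            Violation: F_av(t) > F_a(t) + F_v(t), with F the empirical CDF
            of response times in the bimodal (av) and unimodal conditions.
            """
            rt_av, rt_a, rt_v = map(np.asarray, (rt_av, rt_a, rt_v))
            ts = np.quantile(np.concatenate([rt_av, rt_a, rt_v]),
                             np.linspace(0.05, 0.95, n_points))
            ecdf = lambda rt: (rt[:, None] <= ts).mean(axis=0)
            f_av, f_a, f_v = ecdf(rt_av), ecdf(rt_a), ecdf(rt_v)
            return ts[f_av > f_a + f_v]  # empty array => no violation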

  4. Ectodermal Wnt/β-catenin signaling shapes the mouse face

    PubMed Central

    Reid, Bethany S.; Yang, Hui; Melvin, Vida Senkus; Taketo, Makoto M.; Williams, Trevor

    2010-01-01

    The canonical Wnt/β-catenin pathway is an essential component of multiple developmental processes. To investigate the role of this pathway in the ectoderm during facial morphogenesis, we generated conditional β-catenin mouse mutants using a novel ectoderm-specific Cre recombinase transgenic line. Our results demonstrate that ablating or stabilizing β-catenin in the embryonic ectoderm causes dramatic changes in facial morphology. There are accompanying alterations in the expression of Fgf8 and Shh, key molecules that establish a signaling center critical for facial patterning, the frontonasal ectodermal zone (FEZ). These data indicate that Wnt/β-catenin signaling within the ectoderm is critical for facial development and further suggest that this pathway is an important mechanism for generating the diverse facial shapes of vertebrates during evolution. PMID:21087601

  5. Facial expressions of emotion and the course of conjugal bereavement.

    PubMed

    Bonanno, G A; Keltner, D

    1997-02-01

    The common assumption that emotional expression mediates the course of bereavement is tested. Competing hypotheses about the direction of mediation were formulated from the grief work and social-functional accounts of emotional expression. Facial expressions of emotion in conjugally bereaved adults were coded at 6 months post-loss as they described their relationship with the deceased; grief and perceived health were measured at 6, 14, and 25 months. Facial expressions of negative emotion, in particular anger, predicted increased grief at 14 months and poorer perceived health through 25 months. Facial expressions of positive emotion predicted decreased grief through 25 months and a positive but nonsignificant relation to perceived health. Predictive relations between negative and positive emotional expression persisted when initial levels of self-reported emotion, grief, and health were statistically controlled, demonstrating the mediating role of facial expressions of emotion in adjustment to conjugal loss. Theoretical and clinical implications are discussed.

  6. Preferential amygdala reactivity to the negative assessment of neutral faces.

    PubMed

    Blasi, Giuseppe; Hariri, Ahmad R; Alce, Guilna; Taurisano, Paolo; Sambataro, Fabio; Das, Saumitra; Bertolino, Alessandro; Weinberger, Daniel R; Mattay, Venkata S

    2009-11-01

    Prior studies suggest that the amygdala shapes complex behavioral responses to socially ambiguous cues. We explored human amygdala function during explicit behavioral decision making about discrete emotional facial expressions that can represent socially unambiguous and ambiguous cues. During functional magnetic resonance imaging, 43 healthy adults were required to make complex social decisions (i.e., approach or avoid) about either relatively unambiguous (i.e., angry, fearful, happy) or ambiguous (i.e., neutral) facial expressions. Amygdala activation during this task was compared with that elicited by simple, perceptual decisions (sex discrimination) about the identical facial stimuli. Angry and fearful expressions were more frequently judged as avoidable and happy expressions most often as approachable. Neutral expressions were equally judged as avoidable and approachable. Reaction times to neutral expressions were longer than those to angry, fearful, and happy expressions during social judgment only. Imaging data on stimuli judged to be avoided revealed a significant task by emotion interaction in the amygdala. Here, only neutral facial expressions elicited greater activity during social judgment than during sex discrimination. Furthermore, during social judgment only, neutral faces judged to be avoided were associated with greater amygdala activity relative to neutral faces that were judged as approachable. Moreover, functional coupling between the amygdala and both dorsolateral prefrontal (social judgment > sex discrimination) and cingulate (sex discrimination > social judgment) cortices was differentially modulated by task during processing of neutral faces. Our results suggest that increased amygdala reactivity and differential functional coupling with prefrontal circuitries may shape complex decisions and behavioral responses to socially ambiguous cues.

  7. Preferential Amygdala Reactivity to the Negative Assessment of Neutral Faces

    PubMed Central

    Blasi, Giuseppe; Hariri, Ahmad R.; Alce, Guilna; Taurisano, Paolo; Sambataro, Fabio; Das, Saumitra; Bertolino, Alessandro; Weinberger, Daniel R.; Mattay, Venkata S.

    2010-01-01

    Background Prior studies suggest that the amygdala shapes complex behavioral responses to socially ambiguous cues. We explored human amygdala function during explicit behavioral decision making about discrete emotional facial expressions that can represent socially unambiguous and ambiguous cues. Methods During functional magnetic resonance imaging, 43 healthy adults were required to make complex social decisions (i.e., approach or avoid) about either relatively unambiguous (i.e., angry, fearful, happy) or ambiguous (i.e., neutral) facial expressions. Amygdala activation during this task was compared with that elicited by simple, perceptual decisions (sex discrimination) about the identical facial stimuli. Results Angry and fearful expressions were more frequently judged as avoidable and happy expressions most often as approachable. Neutral expressions were equally judged as avoidable and approachable. Reaction times to neutral expressions were longer than those to angry, fearful, and happy expressions during social judgment only. Imaging data on stimuli judged to be avoided revealed a significant task by emotion interaction in the amygdala. Here, only neutral facial expressions elicited greater activity during social judgment than during sex discrimination. Furthermore, during social judgment only, neutral faces judged to be avoided were associated with greater amygdala activity relative to neutral faces that were judged as approachable. Moreover, functional coupling between the amygdala and both dorsolateral prefrontal (social judgment > sex discrimination) and cingulate (sex discrimination > social judgment) cortices was differentially modulated by task during processing of neutral faces. Conclusions Our results suggest that increased amygdala reactivity and differential functional coupling with prefrontal circuitries may shape complex decisions and behavioral responses to socially ambiguous cues. PMID:19709644

  8. Facing mixed emotions: Analytic and holistic perception of facial emotion expressions engages separate brain networks.

    PubMed

    Meaux, Emilie; Vuilleumier, Patrik

    2016-11-01

    The ability to decode facial emotions is of primary importance for human social interactions; yet, it is still debated how we analyze faces to determine their expression. Here we compared the processing of emotional face expressions through holistic integration and/or local analysis of visual features, and determined which brain systems mediate these distinct processes. Behavioral, physiological, and brain responses to happy and angry faces were assessed by presenting congruent global configurations of expressions (e.g., happy top+happy bottom), incongruent composite configurations (e.g., angry top+happy bottom), and isolated features (e.g., happy top only). Top and bottom parts were always from the same individual. Twenty-six healthy volunteers were scanned using fMRI while they classified the expression in either the top or the bottom face part but ignored information in the other, non-target part. Results indicate that the recognition of happy and angry expressions is neither strictly holistic nor analytic. Both routes were involved, but with a different role for analytic and holistic information depending on the emotion type, and different weights of local features between happy and angry expressions. Dissociable neural pathways were engaged depending on emotional face configurations. In particular, regions within the face processing network differed in their sensitivity to holistic expression information, which predominantly activated fusiform and inferior occipital areas and the amygdala when internal features were congruent (i.e., template matching), whereas more local analysis of independent features preferentially engaged the STS and prefrontal areas (IFG/OFC) in the context of full face configurations, but early visual areas and the pulvinar when seen in isolated parts. Collectively, these findings suggest that facial emotion recognition recruits separate but interactive dorsal and ventral routes within the face processing networks, whose engagement may be shaped by reciprocal interactions and modulated by task demands. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. How Do Typically Developing Deaf Children and Deaf Children with Autism Spectrum Disorder Use the Face When Comprehending Emotional Facial Expressions in British Sign Language?

    ERIC Educational Resources Information Center

    Denmark, Tanya; Atkinson, Joanna; Campbell, Ruth; Swettenham, John

    2014-01-01

    Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their…

  10. Realistic prediction of individual facial emotion expressions for craniofacial surgery simulations

    NASA Astrophysics Data System (ADS)

    Gladilin, Evgeny; Zachow, Stefan; Deuflhard, Peter; Hege, Hans-Christian

    2003-05-01

    In addition to static soft tissue prediction, the estimation of individual facial emotion expressions is an important criterion for evaluating craniofacial surgery planning. In this paper, we present an approach for estimating individual facial emotion expressions on the basis of geometrical models of human anatomy derived from tomographic data and finite element modeling of facial tissue biomechanics.

  11. A Web-based Game for Teaching Facial Expressions to Schizophrenic Patients.

    PubMed

    Gülkesen, Kemal Hakan; Isleyen, Filiz; Cinemre, Buket; Samur, Mehmet Kemal; Sen Kaya, Semiha; Zayim, Nese

    2017-07-12

    Recognizing facial expressions is an important social skill. In some psychological disorders, such as schizophrenia, loss of this skill may complicate the patient's daily life. Prior research has shown that information technology may help to develop facial expression recognition skills through educational software and games. Our aim was to examine whether a computer game designed for teaching facial expressions would improve the facial expression recognition skills of patients with schizophrenia. We developed a website composed of eight serious games. Thirty-two patients were given a pre-test composed of 21 facial expression photographs; 18 patients formed the study group and 14 the control group. Patients in the study group were asked to play the games on the website. After a period of one month, all patients took a post-test. On the pre-test, the median number of correct answers (of 21) was 17.5 in the control group and 16.5 in the study group. The median post-test score was 18 in the control group (p = 0.052) and 20 in the study group (p < 0.001). Computer games may be used for the purpose of educating people who have difficulty in recognizing facial expressions.
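
    For illustration, a within-group comparison of such pre/post medians is commonly done with a paired nonparametric test; the record does not state which test the authors used, so the sketch below simply applies a Wilcoxon signed-rank test to hypothetical scores:

      import numpy as np
      from scipy.stats import wilcoxon

      # Hypothetical pre/post scores (of 21) for a study group of 18 patients
      rng = np.random.default_rng(4)
      pre = np.clip(rng.normal(16.5, 2.0, 18).round(), 0, 21)
      post = np.clip(pre + rng.integers(0, 5, 18), 0, 21)

      stat, p = wilcoxon(post, pre)  # paired test on post - pre differences
      print("Wilcoxon signed-rank: W = %.1f, p = %.4f" % (stat, p))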

  12. Cyclin D1 expression and facial function outcome after vestibular schwannoma surgery.

    PubMed

    Lassaletta, Luis; Del Rio, Laura; Torres-Martin, Miguel; Rey, Juan A; Patrón, Mercedes; Madero, Rosario; Roda, Jose Maria; Gavilan, Javier

    2011-01-01

    The proto-oncogene cyclin D1 has been implicated in the development and behavior of vestibular schwannoma. This study evaluates the association of cyclin D1 expression, alongside other known prognostic factors, with facial function outcome 1 year after vestibular schwannoma surgery. Sixty-four patients undergoing surgery for vestibular schwannoma were studied. Immunohistochemistry analysis was performed with anti-cyclin D1 in all cases. Cyclin D1 expression, as well as other demographic, clinical, radiologic, and intraoperative data, was correlated with 1-year postoperative facial function. Good 1-year facial function (Grades 1-2) was achieved in 73% of cases. Cyclin D1 expression was found in 67% of the tumors. Positive cyclin D1 staining was more frequent in patients with Grades 1 to 2 (75%) than in those with Grades 3 to 6 (25%). Other significant variables were tumor volume and facial nerve stimulation after tumor resection. The area under the receiver operating characteristic curve increased when cyclin D1 expression was added to the multivariate model. Cyclin D1 expression is associated with facial function outcome after vestibular schwannoma surgery, and its prognostic value is independent of tumor size and facial nerve stimulation at the end of surgery.
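
    For illustration, the incremental value of a marker like cyclin D1 can be checked by fitting the outcome model with and without it and comparing the areas under the ROC curves. The sketch below does this on simulated data; the predictor names, effect sizes, and in-sample evaluation are illustrative assumptions, not the study's dataset or analysis:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      # Simulate 64 hypothetical patients with three candidate predictors
      rng = np.random.default_rng(1)
      n = 64
      tumor_volume = rng.normal(4.0, 1.5, n)    # cm^3, hypothetical
      nerve_response = rng.integers(0, 2, n)    # stimulation after resection
      cyclin_d1 = rng.integers(0, 2, n)         # positive staining or not
      logit = -1.2 * tumor_volume + 1.0 * nerve_response + 1.5 * cyclin_d1 + 3.0
      good_outcome = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

      X_base = np.column_stack([tumor_volume, nerve_response])
      X_full = np.column_stack([tumor_volume, nerve_response, cyclin_d1])
      for name, X in [("without cyclin D1", X_base), ("with cyclin D1", X_full)]:
          probs = LogisticRegression().fit(X, good_outcome).predict_proba(X)[:, 1]
          print(name, "AUC = %.3f" % roc_auc_score(good_outcome, probs))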

  13. Developmental Changes in Infants' Categorization of Anger and Disgust Facial Expressions

    ERIC Educational Resources Information Center

    Ruba, Ashley L.; Johnson, Kristin M.; Harris, Lasana T.; Wilbourn, Makeba Parramore

    2017-01-01

    For decades, scholars have examined how children first recognize emotional facial expressions. This research has found that infants younger than 10 months can discriminate negative, within-valence facial expressions in looking time tasks, and children older than 24 months struggle to categorize these expressions in labeling and free-sort tasks.…

  14. Attention Orienting by Gaze and Facial Expressions Across Development

    PubMed Central

    Neath, Karly; Nilsen, Elizabeth S.; Gittsovich, Katarzyna; Itier, Roxane J.

    2014-01-01

    Processing of facial expressions has been shown to potentiate orienting of attention toward the direction signaled by gaze in adults, an important social–cognitive function. However, little is known about how this social attention skill develops. This study is the first to examine the developmental trajectory of the gaze orienting effect (GOE), its modulations by facial expressions, and its links with theory of mind (ToM) abilities. Dynamic emotional stimuli were presented to 222 participants (7–25 years old) with normal trait anxiety using a gaze-cuing paradigm. The GOE was found as early as 7 years of age and decreased linearly until 12–13 years, at which point adult levels were reached. Both fearful and surprised expressions enhanced the GOE compared with neutral expressions. The GOE for fearful faces was also larger than for joyful and angry expressions. These effects did not interact with age and were not driven by intertrial variance. Importantly, the GOE did not correlate with ToM abilities as assessed by the “Reading the Mind in the Eyes” test. The implication of these findings for clinical and typically developing populations is discussed. PMID:23356559

  15. A View of the Therapy for Bell's Palsy Based on Molecular Biological Analyses of Facial Muscles.

    PubMed

    Moriyama, Hiroshi; Mitsukawa, Nobuyuki; Itoh, Masahiro; Otsuka, Naruhito

    2017-12-01

    Details regarding the molecular biological features of Bell's palsy have not been widely reported in textbooks. We genetically analyzed facial muscles and clarified these points. We performed genetic analysis of facial muscle specimens from Japanese patients with severe (House-Brackmann facial nerve grading system V) and moderate (House-Brackmann facial nerve grading system III) dysfunction due to Bell's palsy. Microarray analysis of gene expression was performed using specimens from the healthy and affected sides, and gene expression was compared. Changes in gene expression were defined as an affected side/healthy side ratio of >1.5 or <0.5. We observed that gene expression in Bell's palsy changes with the degree of facial nerve palsy. In particular, muscle, neuron, and energy category genes tended to fluctuate with the degree of facial nerve palsy. It is expected that this study will aid in the development of new treatments and diagnostic/prognostic markers based on the severity of facial nerve palsy.
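
    For illustration, the change criterion used here (affected/healthy ratio >1.5 or <0.5) amounts to a per-gene fold-change filter; a minimal sketch with hypothetical normalized intensities (only the thresholds come from the record):

      import numpy as np

      def changed_genes(affected, healthy, up=1.5, down=0.5, eps=1e-9):
          """Flag genes whose affected/healthy expression ratio exceeds the
          up-regulation threshold or falls below the down-regulation one."""
          ratio = np.asarray(affected, float) / (np.asarray(healthy, float) + eps)
          return np.where((ratio > up) | (ratio < down))[0], ratio

      # Hypothetical normalized intensities for five probes
      affected = [120.0, 40.0, 95.0, 300.0, 10.0]
      healthy = [100.0, 90.0, 100.0, 150.0, 30.0]
      idx, ratio = changed_genes(affected, healthy)
      print(idx)            # indices of probes meeting the change criterion
      print(ratio.round(2))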

  16. Gently does it: Humans outperform a software classifier in recognizing subtle, nonstereotypical facial expressions.

    PubMed

    Yitzhak, Neta; Giladi, Nir; Gurevich, Tanya; Messinger, Daniel S; Prince, Emily B; Martin, Katherine; Aviezer, Hillel

    2017-12-01

    According to dominant theories of affect, humans innately and universally express a set of emotions using specific configurations of prototypical facial activity. Accordingly, thousands of studies have tested emotion recognition using sets of highly intense and stereotypical facial expressions, yet their incidence in real life is virtually unknown. In fact, a commonplace experience is that emotions are expressed in subtle and nonprototypical forms. Such facial expressions are at the focus of the current study. In Experiment 1, we present the development and validation of a novel stimulus set consisting of dynamic and subtle emotional facial displays conveyed without constraining expressers to using prototypical configurations. Although these subtle expressions were more challenging to recognize than prototypical dynamic expressions, they were still well recognized by human raters, and perhaps most importantly, they were rated as more ecological and naturalistic than the prototypical expressions. In Experiment 2, we examined the characteristics of subtle versus prototypical expressions by subjecting them to a software classifier, which used prototypical basic emotion criteria. Although the software was highly successful at classifying prototypical expressions, it performed very poorly at classifying the subtle expressions. Further validation was obtained from human expert face coders: Subtle stimuli did not contain many of the key facial movements present in prototypical expressions. Together, these findings suggest that emotions may be successfully conveyed to human viewers using subtle nonprototypical expressions. Although classic prototypical facial expressions are well recognized, they appear less naturalistic and may not capture the richness of everyday emotional communication. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Facial expression recognition based on improved local ternary pattern and stacked auto-encoder

    NASA Astrophysics Data System (ADS)

    Wu, Yao; Qiu, Weigen

    2017-08-01

    In order to enhance the robustness of facial expression recognition, we propose a method based on an improved Local Ternary Pattern (LTP) combined with a Stacked Auto-Encoder (SAE). The method extracts features with the improved LTP operator and then uses the stacked auto-encoder as the detector and classifier of these features, combining the two techniques for facial expression recognition. The recognition rate on the CK+ database is improved significantly.
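
    For readers unfamiliar with the descriptor, a basic (non-improved) local ternary pattern codes each of a pixel's eight neighbors as +1/0/-1 relative to the center intensity within a tolerance t, then splits the ternary code into "upper" and "lower" binary patterns whose histograms form the feature vector, following Tan and Triggs's original formulation. A minimal sketch of that baseline operator (image size and threshold are illustrative; this is not the authors' improved variant):

      import numpy as np

      def ltp_codes(img, t=5):
          """Basic local ternary pattern for interior pixels: each neighbor
          is coded +1/0/-1 against the center with tolerance t, and the
          ternary pattern is split into upper and lower binary codes."""
          img = np.asarray(img, dtype=np.int32)
          h, w = img.shape
          # Offsets of the 8 neighbors, clockwise from the top-left corner
          offs = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
          center = img[1:-1, 1:-1]
          upper = np.zeros((h - 2, w - 2), dtype=np.int32)
          lower = np.zeros((h - 2, w - 2), dtype=np.int32)
          for bit, (dy, dx) in enumerate(offs):
              nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
              upper |= (nb > center + t).astype(np.int32) << bit
              lower |= (nb < center - t).astype(np.int32) << bit
          return upper, lower

      # Histograms of the two 8-bit code maps give a 512-dim feature vector
      face = np.random.default_rng(2).integers(0, 256, (64, 64))
      up, lo = ltp_codes(face)
      feat = np.concatenate([np.bincount(up.ravel(), minlength=256),
                             np.bincount(lo.ravel(), minlength=256)])
      print(feat.shape)  # (512,); features like these would feed the SAE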

  18. Facial mimicry in its social setting

    PubMed Central

    Seibt, Beate; Mühlberger, Andreas; Likowski, Katja U.; Weyers, Peter

    2015-01-01

    In interpersonal encounters, individuals often exhibit changes in their own facial expressions in response to the emotional expressions of another person. Such changes are often called facial mimicry. While facial mimicry first appeared to be an automatic tendency of the perceiver to show the same emotional expression as the sender, evidence is now accumulating that situation, person, and relationship jointly determine whether and for which emotions such congruent facial behavior is shown. We review the evidence regarding the moderating influence of such factors on facial mimicry with a focus on understanding the meaning of facial responses to emotional expressions in a particular constellation. From this, we derive recommendations for a research agenda with a stronger focus on the most common forms of encounters, actual interactions with known others, and on assessing potential mediators of facial mimicry. We conclude that facial mimicry is modulated by many factors: attention deployment and sensitivity, detection of valence, emotional feelings, and social motivations. We posit that these are the more proximal causes of changes in facial mimicry due to changes in its social setting. PMID:26321970

  19. Residual fMRI sensitivity for identity changes in acquired prosopagnosia.

    PubMed

    Fox, Christopher J; Iaria, Giuseppe; Duchaine, Bradley C; Barton, Jason J S

    2013-01-01

    While a network of cortical regions contributes to face processing, the lesions in acquired prosopagnosia are highly variable and likely result in different combinations of spared and affected regions of this network. To assess the residual functional sensitivities of spared regions in prosopagnosia, we designed a rapid event-related functional magnetic resonance imaging (fMRI) experiment that included pairs of faces with same or different identities and same or different expressions. By measuring the release from adaptation to these facial changes we determined the residual sensitivity of face-selective regions of interest. We tested three patients with acquired prosopagnosia, and all three demonstrated residual sensitivity for facial identity changes in surviving fusiform and occipital face areas of either the right or left hemisphere, but not in the right posterior superior temporal sulcus. The patients also showed some residual capabilities for facial discrimination, with normal performance on the Benton Facial Recognition Test but impaired performance on more complex tasks of facial discrimination. We conclude that fMRI can demonstrate residual processing of facial identity in acquired prosopagnosia, that this adaptation can occur in the same structures that show similar processing in healthy subjects, and further, that this adaptation may be related to behavioral indices of face perception.
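
    Methodologically, "release from adaptation" in such paired designs reduces to a contrast between ROI responses to pairs that change an attribute and pairs that repeat it; a minimal sketch on hypothetical trial-wise beta estimates (not the study's data):

      import numpy as np

      def release_from_adaptation(betas_different, betas_same):
          """Adaptation index for one ROI: mean response to 'different'
          face pairs minus mean response to 'same' pairs; a positive value
          indicates residual sensitivity to the changed facial attribute."""
          return float(np.mean(betas_different) - np.mean(betas_same))

      # Hypothetical beta estimates from a fusiform face area ROI
      rng = np.random.default_rng(3)
      same_identity = rng.normal(0.8, 0.3, 40)
      diff_identity = rng.normal(1.1, 0.3, 40)
      print("identity release: %.3f"
            % release_from_adaptation(diff_identity, same_identity))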

  20. Residual fMRI sensitivity for identity changes in acquired prosopagnosia

    PubMed Central

    Fox, Christopher J.; Iaria, Giuseppe; Duchaine, Bradley C.; Barton, Jason J. S.

    2013-01-01

    While a network of cortical regions contributes to face processing, the lesions in acquired prosopagnosia are highly variable and likely result in different combinations of spared and affected regions of this network. To assess the residual functional sensitivities of spared regions in prosopagnosia, we designed a rapid event-related functional magnetic resonance imaging (fMRI) experiment that included pairs of faces with same or different identities and same or different expressions. By measuring the release from adaptation to these facial changes we determined the residual sensitivity of face-selective regions of interest. We tested three patients with acquired prosopagnosia, and all three demonstrated residual sensitivity for facial identity changes in surviving fusiform and occipital face areas of either the right or left hemisphere, but not in the right posterior superior temporal sulcus. The patients also showed some residual capabilities for facial discrimination, with normal performance on the Benton Facial Recognition Test but impaired performance on more complex tasks of facial discrimination. We conclude that fMRI can demonstrate residual processing of facial identity in acquired prosopagnosia, that this adaptation can occur in the same structures that show similar processing in healthy subjects, and further, that this adaptation may be related to behavioral indices of face perception. PMID:24151479
