Sample records for facial affect perception

  1. Categorical Perception of Affective and Linguistic Facial Expressions

    ERIC Educational Resources Information Center

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  2. The influence of context on distinct facial expressions of disgust.

    PubMed

    Reschke, Peter J; Walle, Eric A; Knothe, Jennifer M; Lopez, Lukas D

    2018-06-11

    Face perception is susceptible to contextual influence and perceived physical similarities between emotion cues. However, studies often use structurally homogeneous facial expressions, making it difficult to explore how within-emotion variability in facial configuration affects emotion perception. This study examined the influence of context on the emotional perception of categorically identical, yet physically distinct, facial expressions of disgust. Participants categorized two perceptually distinct disgust facial expressions, "closed" (i.e., scrunched nose, closed mouth) and "open" (i.e., scrunched nose, open mouth, protruding tongue), that were embedded in contexts comprising emotion postures and scenes. Results demonstrated that the effect of nonfacial elements was significantly stronger for "open" disgust facial expressions than "closed" disgust facial expressions. These findings provide support that physical similarity within discrete categories of facial expressions is mutable and plays an important role in affective face perception. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2015-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…

  4. Dynamic facial expressions evoke distinct activation in the face perception network: a connectivity analysis study.

    PubMed

    Foley, Elaine; Rippon, Gina; Thai, Ngoc Jade; Longe, Olivia; Senior, Carl

    2012-02-01

    Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Science, 4, 223-233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, which is identified as insensitive to motion or affect but sensitive to the visual stimulus, the STS, identified as specifically sensitive to motion, and the amygdala, recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediate the perception of different dynamic facial expressions.

  5. He Throws like a Girl (but Only when He's Sad): Emotion Affects Sex-Decoding of Biological Motion Displays

    ERIC Educational Resources Information Center

    Johnson, Kerri L.; McKay, Lawrie S.; Pollick, Frank E.

    2011-01-01

    Gender stereotypes have been implicated in sex-typed perceptions of facial emotion. Such interpretations were recently called into question because facial cues of emotion are confounded with sexually dimorphic facial cues. Here we examine the role of visual cues and gender stereotypes in perceptions of biological motion displays, thus overcoming…

  6. The Interactive Effects of Facial Expressions of Emotion and Verbal Messages on Perceptions of Affective Meaning.

    ERIC Educational Resources Information Center

    Friedman, Howard S.

    1979-01-01

    Students' perceptions of sincerity, dominance, and positivity were measured by pairing happy, angry, surprised and sad faces of teachers with teachers' comments characterized as positive or negative and dominant or submissive. Clear effects of facial-verbal combinations emerged; there were no sex differences other than in perceptions of sincerity.…

  7. Bodily action penetrates affective perception

    PubMed Central

    Rigutti, Sara; Gerbino, Walter

    2016-01-01

    Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observer’s internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; i.e., we show that observer’s internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than uncomfortable reaches; while the detection threshold for anger was lower after fifty uncomfortable than comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impact sensitivity in an approximately symmetric way relative to a baseline inaction condition. All of these constitute compelling evidence of a genuine top-down effect on perception: specifically, facial expressions of emotion are penetrable by action-induced mood. Affective priming by action valence is a candidate mechanism for the influence of observer’s internal states on properties experienced as phenomenally objective and yet loaded with meaning. PMID:26893964
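
    The sensitivity-versus-bias distinction this record draws corresponds to standard equal-variance signal detection theory, in which d' indexes sensitivity and the criterion c indexes response bias. The Python sketch below illustrates that decomposition; the hit and false-alarm counts are invented for illustration and are not the study's data.

      # Equal-variance signal detection: d' (sensitivity) and c (response bias).
      # Counts are hypothetical, not taken from the study's data.
      from scipy.stats import norm

      def sdt_measures(hits, misses, false_alarms, correct_rejections):
          # Log-linear correction avoids infinite z-scores when rates hit 0 or 1.
          hit_rate = (hits + 0.5) / (hits + misses + 1)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
          d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)               # sensitivity
          criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))    # bias
          return d_prime, criterion

      # Hypothetical happiness-detection counts after comfortable vs. uncomfortable reaches.
      print(sdt_measures(42, 8, 12, 38))   # e.g., higher d' after comfortable actions
      print(sdt_measures(33, 17, 11, 39))  # e.g., lower d', similar criterion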

  8. An FMRI Study of Olfactory Cues to Perception of Conspecific Stress

    DTIC Science & Technology

    2010-04-01

    Excerpt from the document's matched reference list: "…modulate recognition of fear in ambiguous facial expressions." Psychol Sci 20: 177-183; Pause BM, Ohrt A, Prehn A, Ferstl R (2004). "Positive emotional priming of facial affect perception in females is diminished by chemosensory anxiety signals." Chem Senses 29: 797-805.

  9. Cultural Differences in Affect Intensity Perception in the Context of Advertising

    PubMed Central

    Pogosyan, Marianna; Engelmann, Jan B.

    2011-01-01

    Cultural differences in the perception of positive affect intensity within an advertising context were investigated among American, Japanese, and Russian participants. Participants were asked to rate the intensity of facial expressions of positive emotions, which displayed either subtle, low intensity, or salient, high intensity expressions of positive affect. In agreement with previous findings from cross-cultural psychological research, current results demonstrate both cross-cultural agreement and differences in the perception of positive affect intensity across the three cultures. Specifically, American participants perceived high arousal (HA) images as significantly less calm than participants from the other two cultures, while the Japanese participants perceived low arousal (LA) images as significantly more excited than participants from the other cultures. The underlying mechanisms of these cultural differences were further investigated through difference scores that probed for cultural differences in perception and categorization of positive emotions. Findings indicate that rating differences are due to (1) perceptual differences in the extent to which HA images were discriminated from LA images, and (2) categorization differences in the extent to which facial expressions were grouped into affect intensity categories. Specifically, American participants revealed significantly higher perceptual differentiation between arousal levels of facial expressions in high and intermediate intensity categories. Japanese participants, on the other hand, did not discriminate between high and low arousal affect categories to the same extent as did the American and Russian participants. These findings indicate the presence of cultural differences in underlying decoding mechanisms of facial expressions of positive affect intensity. Implications of these results for global advertising are discussed. PMID:22084635

  10. Cultural differences in affect intensity perception in the context of advertising.

    PubMed

    Pogosyan, Marianna; Engelmann, Jan B

    2011-01-01

    Cultural differences in the perception of positive affect intensity within an advertising context were investigated among American, Japanese, and Russian participants. Participants were asked to rate the intensity of facial expressions of positive emotions, which displayed either subtle, low intensity, or salient, high intensity expressions of positive affect. In agreement with previous findings from cross-cultural psychological research, current results demonstrate both cross-cultural agreement and differences in the perception of positive affect intensity across the three cultures. Specifically, American participants perceived high arousal (HA) images as significantly less calm than participants from the other two cultures, while the Japanese participants perceived low arousal (LA) images as significantly more excited than participants from the other cultures. The underlying mechanisms of these cultural differences were further investigated through difference scores that probed for cultural differences in perception and categorization of positive emotions. Findings indicate that rating differences are due to (1) perceptual differences in the extent to which HA images were discriminated from LA images, and (2) categorization differences in the extent to which facial expressions were grouped into affect intensity categories. Specifically, American participants revealed significantly higher perceptual differentiation between arousal levels of facial expressions in high and intermediate intensity categories. Japanese participants, on the other hand, did not discriminate between high and low arousal affect categories to the same extent as did the American and Russian participants. These findings indicate the presence of cultural differences in underlying decoding mechanisms of facial expressions of positive affect intensity. Implications of these results for global advertising are discussed.
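
    The cultural effects in the two records above rest on simple difference scores: each participant's mean intensity rating for high arousal (HA) faces minus their mean rating for low arousal (LA) faces, compared across cultural groups. A minimal pandas sketch of that logic, using made-up ratings rather than the study's data:

      # Difference scores (HA minus LA ratings) per participant, compared by culture.
      # The data frame is invented for illustration only.
      import pandas as pd

      ratings = pd.DataFrame({
          "participant": [1, 1, 2, 2, 3, 3],
          "culture": ["US", "US", "JP", "JP", "RU", "RU"],
          "arousal": ["HA", "LA", "HA", "LA", "HA", "LA"],
          "intensity": [6.2, 3.1, 5.0, 4.2, 5.8, 3.5],
      })

      wide = ratings.pivot_table(index=["participant", "culture"],
                                 columns="arousal", values="intensity")
      wide["HA_minus_LA"] = wide["HA"] - wide["LA"]   # perceptual differentiation score
      print(wide.groupby(level="culture")["HA_minus_LA"].mean())  # group comparison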

  11. Facial Expression Enhances Emotion Perception Compared to Vocal Prosody: Behavioral and fMRI Studies.

    PubMed

    Zhang, Heming; Chen, Xuhai; Chen, Shengdong; Li, Yansong; Chen, Changming; Long, Quanshan; Yuan, Jiajin

    2018-05-09

    Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, currently little is known about how emotion perception and its neural substrates differ across facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primers and bimodal materials as target) and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region of interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of the target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosodies.

  12. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    PubMed Central

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2014-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder (ASD), yet little is known about how individuals with ASD misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with ASD and 30 age- and gender-matched volunteers without ASD to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with ASD. In particular, adults with ASD uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to non-emotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with ASD. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in ASD. PMID:24535689

  13. Misinterpretation of facial expressions of emotion in verbal adults with autism spectrum disorder.

    PubMed

    Eack, Shaun M; Mazefsky, Carla A; Minshew, Nancy J

    2015-04-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum disorder and 30 age- and gender-matched volunteers without autism spectrum disorder to identify patterns of emotion misinterpretation during face processing that contribute to emotion recognition impairments in autism. Results revealed that difficulty distinguishing emotional from neutral facial expressions characterized much of the emotion perception impairments exhibited by participants with autism spectrum disorder. In particular, adults with autism spectrum disorder uniquely misinterpreted happy faces as neutral, and were significantly more likely than typical volunteers to attribute negative valence to nonemotional faces. The over-attribution of emotions to neutral faces was significantly related to greater communication and emotional intelligence impairments in individuals with autism spectrum disorder. These findings suggest a potential negative bias toward the interpretation of facial expressions and may have implications for interventions designed to remediate emotion perception in autism spectrum disorder. © The Author(s) 2014.

  14. Hostility and Facial Affect Recognition: Effects of a Cold Pressor Stressor on Accuracy and Cardiovascular Reactivity

    ERIC Educational Resources Information Center

    Herridge, Matt L.; Harrison, David W.; Mollet, Gina A.; Shenal, Brian V.

    2004-01-01

    The effects of hostility and a cold pressor stressor on the accuracy of facial affect perception were examined in the present experiment. A mechanism whereby physiological arousal level is mediated by systems which also mediate accuracy of an individual's interpretation of affective cues is described. Right-handed participants were classified as…

  15. Asymmetric bias in perception of facial affect among Roman and Arabic script readers.

    PubMed

    Heath, Robin L; Rouhana, Aida; Ghanem, Dana Abi

    2005-01-01

    The asymmetric chimeric faces test is used frequently as an indicator of right hemisphere involvement in the perception of facial affect, as the test is considered free of linguistic elements. Much of the original research with the asymmetric chimeric faces test was conducted with subjects reading left-to-right Roman script, i.e., English. As readers of right-to-left scripts, such as Arabic, demonstrated a mixed or weak rightward bias in judgements of facial affect, the influence of habitual scanning direction was thought to intersect with laterality. We administered the asymmetric chimeric faces test to 1239 adults who represented a range of script experience, i.e., Roman script readers (English and French), Arabic readers, bidirectional readers of Roman and Arabic scripts, and illiterates. Our findings supported the hypothesis that the bias in facial affect judgement is rooted in laterality, but can be influenced by script direction. Specifically, right-handed readers of Roman script demonstrated the greatest mean leftward score, and mixed-handed Arabic script readers demonstrated the greatest mean rightward score. Biliterates showed a gradual shift in asymmetric perception, as their scores fell between those of Roman and Arabic script readers, basically distributed in the order expected by their handedness and most often used script. Illiterates, whose only directional influence was laterality, showed a slight leftward bias.

  16. Recognition of Facially Expressed Emotions and Visual Search Strategies in Adults with Asperger Syndrome

    ERIC Educational Resources Information Center

    Falkmer, Marita; Bjallmark, Anna; Larsson, Matilda; Falkmer, Torbjorn

    2011-01-01

    Can the disadvantages persons with Asperger syndrome frequently experience with reading facially expressed emotions be attributed to a different visual perception, affecting their scanning patterns? Visual search strategies, particularly regarding the importance of information from the eye area, and the ability to recognise facially expressed…

  17. Faces in Context: A Review and Systematization of Contextual Influences on Affective Face Processing

    PubMed Central

    Wieser, Matthias J.; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research. PMID:23130011

  18. Faces in context: a review and systematization of contextual influences on affective face processing.

    PubMed

    Wieser, Matthias J; Brosch, Tobias

    2012-01-01

    Facial expressions are of eminent importance for social interaction as they convey information about other individuals' emotions and social intentions. According to the predominant "basic emotion" approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual's face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories. Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.

  19. Beauty in the eye of the beholder: Using facial electromyography to examine the association between eating disorder symptoms and perceptions of emaciation among undergraduate women.

    PubMed

    Dodd, Dorian R; Velkoff, Elizabeth A; Forrest, Lauren N; Fussner, Lauren M; Smith, April

    2017-06-01

    Thin-ideal internalization, drive for thinness, and over-evaluation of the importance of thinness are associated with eating disorders (EDs). However, little research has examined to what extent perceptions of emaciation are also associated with ED symptoms. In the present study, 80 undergraduate women self-reported on ED symptomatology and perceptions of emaciated, thin, and overweight female bodies. While participants viewed images of these different body types, facial electromyography was used to measure activation of facial muscles associated with disgust reactions. Emaciated and overweight bodies were rated negatively and elicited facial responses consistent with disgust. Further, ED symptomatology was associated with pronounced aversion to overweight bodies (assessed via self-report pleasantness ratings), and attenuated negative affect to emaciated bodies (assessed via facial electromyography). The latter association was significant even when controlling for self-reported perceptions of emaciation, suggesting that psychophysiological methods in ED research may provide valuable information unavailable via self-report. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Cognitive Processing about Classroom-Relevant Contexts: Teachers' Attention to and Utilization of Girls' Body Size, Ethnicity, Attractiveness, and Facial Affect

    ERIC Educational Resources Information Center

    Wang, Shirley S.; Treat, Teresa A.; Brownell, Kelly D.

    2008-01-01

    This study examines 2 aspects of cognitive processing in person perception (attention and decision making) in classroom-relevant contexts. Teachers completed 2 implicit, performance-based tasks that characterized attention to and utilization of 4 student characteristics of interest: ethnicity, facial affect, body size, and attractiveness. Stimuli…

  21. A new look at emotion perception: Concepts speed and shape facial emotion recognition.

    PubMed

    Nook, Erik C; Lindquist, Kristen A; Zaki, Jamil

    2015-10-01

    Decades ago, the "New Look" movement challenged how scientists thought about vision by suggesting that conceptual processes shape visual perceptions. Currently, affective scientists are likewise debating the role of concepts in emotion perception. Here, we utilized a repetition-priming paradigm in conjunction with signal detection and individual difference analyses to examine how providing emotion labels, which correspond to discrete emotion concepts, affects emotion recognition. In Study 1, pairing emotional faces with emotion labels (e.g., "sad") increased individuals' speed and sensitivity in recognizing emotions. Additionally, individuals with alexithymia, who have difficulty labeling their own emotions, struggled to recognize emotions based on visual cues alone, but not when emotion labels were provided. Study 2 replicated these findings and further demonstrated that emotion concepts can shape perceptions of facial expressions. Together, these results suggest that emotion perception involves conceptual processing. We discuss the implications of these findings for affective, social, and clinical psychology.

  22. The influence of attention toward facial expressions on size perception.

    PubMed

    Choi, Jeong-Won; Kim, Kiho; Lee, Jang-Han

    2016-01-01

    According to the New Look theory, size perception is affected by emotional factors. Although previous studies have attempted to explain the effects of both emotion and motivation on size perception, they have failed to identify the underlying mechanisms. This study aimed to investigate the underlying mechanisms of size perception by applying attention toward facial expressions using the Ebbinghaus illusion as a measurement tool. The participants, female university students, were asked to judge the size of a target stimulus relative to the size of facial expressions (i.e., happy, angry, and neutral) surrounding the target. The results revealed that the participants perceived angry and neutral faces to be larger than happy faces. This finding indicates that individuals pay closer attention to neutral and angry faces than happy ones. These results suggest that the mechanisms underlying size perception involve cognitive processes that focus attention toward relevant stimuli and block out irrelevant stimuli.

  23. Gender differences in emotion experience perception under different facial muscle manipulations.

    PubMed

    Wang, Yufeng; Zhang, Dongjun; Zou, Feng; Li, Hao; Luo, Yanyan; Zhang, Meng; Liu, Yijun

    2016-04-01

    According to embodied emotion theory, facial manipulations should modulate and initiate particular emotions. However, whether there are gender differences in emotion experience perception under different facial muscle manipulations is not clear. Therefore, we conducted two behavioral experiments to examine gender differences in emotional perception in response to facial expressions (sad, neutral, and happy) under three conditions: (1) holding a pen using only the teeth (HPT), which facilitates the muscles typically associated with smiling; (2) holding a pen using only the lips (HPL), which inhibits the muscles typically associated with smiling; and (3) a control condition of holding no pen (HNP). We found that HPT made emotional feelings more positive, and that females' ratings of sad facial expressions changed more between conditions (HPL to HPT) than males' did. These results suggest that cognition can be affected by the interaction between stimuli and the body, especially in females.

  24. Cross-modal Association between Auditory and Visuospatial Information in Mandarin Tone Perception in Noise by Native and Non-native Perceivers.

    PubMed

    Hannah, Beverly; Wang, Yue; Jongman, Allard; Sereno, Joan A; Cao, Jiguo; Nie, Yunlong

    2017-01-01

    Speech perception involves multiple input modalities. Research has indicated that perceivers establish cross-modal associations between auditory and visuospatial events to aid perception. Such intermodal relations can be particularly beneficial for speech development and learning, where infants and non-native perceivers need additional resources to acquire and process new sounds. This study examines how facial articulatory cues and co-speech hand gestures mimicking pitch contours in space affect non-native Mandarin tone perception. Native English as well as Mandarin perceivers identified tones embedded in noise with either congruent or incongruent Auditory-Facial (AF) and Auditory-Facial-Gestural (AFG) inputs. Native Mandarin results showed the expected ceiling-level performance in the congruent AF and AFG conditions. In the incongruent conditions, while AF identification was primarily auditory-based, AFG identification was partially based on gestures, demonstrating the use of gestures as valid cues in tone identification. The English perceivers' performance was poor in the congruent AF condition, but improved significantly in AFG. While the incongruent AF identification showed some reliance on facial information, incongruent AFG identification relied more on gestural than auditory-facial information. These results indicate positive effects of facial and especially gestural input on non-native tone perception, suggesting that cross-modal (visuospatial) resources can be recruited to aid auditory perception when phonetic demands are high. The current findings may inform patterns of tone acquisition and development, suggesting how multi-modal speech enhancement principles may be applied to facilitate speech learning.

  25. Cross-modal Association between Auditory and Visuospatial Information in Mandarin Tone Perception in Noise by Native and Non-native Perceivers

    PubMed Central

    Hannah, Beverly; Wang, Yue; Jongman, Allard; Sereno, Joan A.; Cao, Jiguo; Nie, Yunlong

    2017-01-01

    Speech perception involves multiple input modalities. Research has indicated that perceivers establish cross-modal associations between auditory and visuospatial events to aid perception. Such intermodal relations can be particularly beneficial for speech development and learning, where infants and non-native perceivers need additional resources to acquire and process new sounds. This study examines how facial articulatory cues and co-speech hand gestures mimicking pitch contours in space affect non-native Mandarin tone perception. Native English as well as Mandarin perceivers identified tones embedded in noise with either congruent or incongruent Auditory-Facial (AF) and Auditory-Facial-Gestural (AFG) inputs. Native Mandarin results showed the expected ceiling-level performance in the congruent AF and AFG conditions. In the incongruent conditions, while AF identification was primarily auditory-based, AFG identification was partially based on gestures, demonstrating the use of gestures as valid cues in tone identification. The English perceivers’ performance was poor in the congruent AF condition, but improved significantly in AFG. While the incongruent AF identification showed some reliance on facial information, incongruent AFG identification relied more on gestural than auditory-facial information. These results indicate positive effects of facial and especially gestural input on non-native tone perception, suggesting that cross-modal (visuospatial) resources can be recruited to aid auditory perception when phonetic demands are high. The current findings may inform patterns of tone acquisition and development, suggesting how multi-modal speech enhancement principles may be applied to facilitate speech learning. PMID:29255435

  26. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    PubMed

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processes of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as stimuli, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning body targets and losing body targets was influenced by the invisible peak facial expression primes, which indicated the unconscious perception of peak facial expressions.

  27. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    PubMed Central

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processes of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as stimuli, and investigated the diagnosability of peak facial expressions at both implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous, and the body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning body targets and losing body targets was influenced by the invisible peak facial expression primes, which indicated the unconscious perception of peak facial expressions. PMID:27630604

  28. Evaluating Posed and Evoked Facial Expressions of Emotion from Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Faso, Daniel J.; Sasson, Noah J.; Pinkham, Amy E.

    2015-01-01

    Though many studies have examined facial affect perception by individuals with autism spectrum disorder (ASD), little research has investigated how facial expressivity in ASD is perceived by others. Here, naïve female observers (n = 38) judged the intensity, naturalness and emotional category of expressions produced by adults with ASD (n = 6) and…

  29. Early Sign Language Experience Goes Along with an Increased Cross-modal Gain for Affective Prosodic Recognition in Congenitally Deaf CI Users.

    PubMed

    Fengler, Ineke; Delfau, Pia-Céline; Röder, Brigitte

    2018-04-01

    It is yet unclear whether congenitally deaf cochlear implant (CD CI) users' visual and multisensory emotion perception is influenced by their history in sign language acquisition. We hypothesized that early-signing CD CI users, relative to late-signing CD CI users and hearing, non-signing controls, show better facial expression recognition and rely more on the facial cues of audio-visual emotional stimuli. Two groups of young adult CD CI users, early signers (ES CI users; n = 11) and late signers (LS CI users; n = 10), and a group of hearing, non-signing, age-matched controls (n = 12) performed an emotion recognition task with auditory, visual, and cross-modal emotionally congruent and incongruent speech stimuli. On different trials, participants categorized either the facial or the vocal expressions. The ES CI users more accurately recognized affective prosody than the LS CI users in the presence of congruent facial information. Furthermore, the ES CI users, but not the LS CI users, gained more than the controls from congruent visual stimuli when recognizing affective prosody. Both CI groups performed overall worse than the controls in recognizing affective prosody. These results suggest that early sign language experience affects multisensory emotion perception in CD CI users.

  30. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    PubMed

    Kujala, Miiamaaria V; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expression in a similar manner, and the perception of both species is influenced by psychological factors of the evaluators. Especially empathy affects both the speed and intensity of rating dogs' emotional facial expressions.

  31. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    PubMed Central

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people’s perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects’ personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expression in a similar manner, and the perception of both species is influenced by psychological factors of the evaluators. Especially empathy affects both the speed and intensity of rating dogs’ emotional facial expressions. PMID:28114335

  32. Moving Faces

    ERIC Educational Resources Information Center

    Journal of College Science Teaching, 2005

    2005-01-01

    A recent study by Zara Ambadar and Jeffrey F. Cohn of the University of Pittsburgh and Jonathan W. Schooler of the University of British Columbia examined how motion affects people's judgment of subtle facial expressions. Two experiments demonstrated robust effects of motion in facilitating the perception of subtle facial expressions depicting…

  33. He throws like a girl (but only when he's sad): emotion affects sex-decoding of biological motion displays.

    PubMed

    Johnson, Kerri L; McKay, Lawrie S; Pollick, Frank E

    2011-05-01

    Gender stereotypes have been implicated in sex-typed perceptions of facial emotion. Such interpretations were recently called into question because facial cues of emotion are confounded with sexually dimorphic facial cues. Here we examine the role of visual cues and gender stereotypes in perceptions of biological motion displays, thus overcoming the morphological confounding inherent in facial displays. In four studies, participants' judgments revealed gender stereotyping. Observers accurately perceived emotion from biological motion displays (Study 1), and this affected sex categorizations. Angry displays were overwhelmingly judged to be men; sad displays were judged to be women (Studies 2-4). Moreover, this pattern remained strong when stimuli were equated for velocity (Study 3). We argue that these results were obtained because perceivers applied gender stereotypes of emotion to infer sex category (Study 4). Implications for both vision sciences and social psychology are discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  34. Eyelid-openness and mouth curvature influence perceived intelligence beyond attractiveness.

    PubMed

    Talamas, Sean N; Mavor, Kenneth I; Axelsson, John; Sundelin, Tina; Perrett, David I

    2016-05-01

    Impression formation is profoundly influenced by facial attractiveness, but the existence of facial cues which affect judgments beyond such an "attractiveness halo" may be underestimated. Because depression and tiredness adversely affect cognitive capacity, we reasoned that facial cues to mood (mouth curvature) and alertness (eyelid-openness) affect impressions of intellectual capacity. Over 4 studies we investigated the influence of these malleable facial cues on first impressions of intelligence. In Studies 1 and 2 we scrutinize the perceived intelligence and attractiveness ratings of images of 100 adults (aged 18-33) and 90 school-age children (aged 5-17), respectively. Intelligence impression was partially mediated by attractiveness, but independent effects of eyelid-openness and subtle smiling were found that enhanced intelligence ratings independent of attractiveness. In Study 3 we digitally manipulated stimuli to have altered eyelid-openness or mouth curvature and found that each independent manipulation had an influence on perceptions of intelligence. In a final set of stimuli (Study 4) we explored changes in these cues before and after sleep restriction, to examine whether natural variations in these cues according to sleep condition can influence perceptions. In Studies 3 and 4 variations with increased eyelid-openness and mouth curvature were found to relate positively to intelligence ratings. These findings suggest potential overgeneralizations based on subtle facial cues that indicate mood and tiredness, both of which alter cognitive ability. These findings also have important implications for students who are directly influenced by expectations of ability and teachers who may form expectations based on initial perceptions of intelligence.

  35. Face or body? Oxytocin improves perception of emotions from facial expressions in incongruent emotional body context.

    PubMed

    Perry, Anat; Aviezer, Hillel; Goldstein, Pavel; Palgi, Sharon; Klein, Ehud; Shamay-Tsoory, Simone G

    2013-11-01

    The neuropeptide oxytocin (OT) has been repeatedly reported to play an essential role in the regulation of social cognition in humans in general, and specifically in enhancing the recognition of emotions from facial expressions. The latter was assessed in different paradigms that rely primarily on isolated and decontextualized emotional faces. However, recent evidence has indicated that the perception of basic facial expressions is not context invariant and can be categorically altered by context, especially body context, at early perceptual levels. Body context has a strong effect on our perception of emotional expressions, especially when the actual target face and the contextually expected face are perceptually similar. To examine whether and how OT affects emotion recognition, we investigated the role of OT in categorizing facial expressions in incongruent body contexts. Our results show that in the combined process of deciphering emotions from facial expressions and from context, OT gives an advantage to the face. This advantage is most evident when the target face and the contextually expected face are perceptually similar.

  36. Perceiving emotions in neutral faces: expression processing is biased by affective person knowledge.

    PubMed

    Suess, Franziska; Rabovsky, Milena; Abdel Rahman, Rasha

    2015-04-01

    According to a widely held view, basic emotions such as happiness or anger are reflected in facial expressions that are invariant and uniquely defined by specific facial muscle movements. Accordingly, expression perception should not be vulnerable to influences outside the face. Here, we test this assumption by manipulating the emotional valence of biographical knowledge associated with individual persons. Faces of well-known and initially unfamiliar persons displaying neutral expressions were associated with socially relevant negative, positive or comparatively neutral biographical information. The expressions of faces associated with negative information were classified as more negative than faces associated with neutral information. Event-related brain potential modulations in the early posterior negativity, a component taken to reflect early sensory processing of affective stimuli such as emotional facial expressions, suggest that negative affective knowledge can bias the perception of faces with neutral expressions toward subjectively displaying negative emotions.

  37. Facial emotion perception impairments in schizophrenia patients with comorbid antisocial personality disorder.

    PubMed

    Tang, Dorothy Y Y; Liu, Amy C Y; Lui, Simon S Y; Lam, Bess Y H; Siu, Bonnie W M; Lee, Tatia M C; Cheung, Eric F C

    2016-02-28

    Impairment in facial emotion perception is believed to be associated with aggression. Schizophrenia patients with antisocial features are more impaired in facial emotion perception than their counterparts without these features. However, previous studies did not define the comorbidity of antisocial personality disorder (ASPD) using stringent criteria. We recruited 30 participants with dual diagnoses of ASPD and schizophrenia, 30 participants with schizophrenia and 30 controls. We employed the Facial Emotional Recognition paradigm to measure facial emotion perception, and administered a battery of neurocognitive tests. The Life History of Aggression scale was used. ANOVAs and ANCOVAs were conducted to examine group differences in facial emotion perception, and control for the effect of other neurocognitive dysfunctions on facial emotion perception. Correlational analyses were conducted to examine the association between facial emotion perception and aggression. Patients with dual diagnoses performed worst in facial emotion perception among the three groups. The group differences in facial emotion perception remained significant, even after other neurocognitive impairments were controlled for. Severity of aggression was correlated with impairment in perceiving negative-valenced facial emotions in patients with dual diagnoses. Our findings support the presence of facial emotion perception impairment and its association with aggression in schizophrenia patients with comorbid ASPD. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
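
    The group comparison described above, an ANCOVA on facial emotion perception with neurocognitive performance as a covariate, can be sketched with statsmodels. The variable names and values below are hypothetical and stand in for the study's measures.

      # ANCOVA sketch: facial emotion perception by group, controlling for a
      # neurocognitive covariate. Data and variable names are illustrative only.
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.DataFrame({
          "group": ["ASPD_SZ"] * 3 + ["SZ"] * 3 + ["control"] * 3,
          "emotion_perception": [52, 48, 55, 60, 63, 58, 72, 70, 75],
          "neurocog": [85, 80, 88, 92, 95, 90, 105, 102, 108],
      })

      model = smf.ols("emotion_perception ~ C(group) + neurocog", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))   # Type II ANCOVA table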

  38. Sad or fearful? The influence of body posture on adults' and children's perception of facial displays of emotion.

    PubMed

    Mondloch, Catherine J

    2012-02-01

    The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body posing fear). Adults and 8-year-olds made more errors and had longer reaction times on incongruent trials than on congruent trials when judging sad versus fearful facial expressions, an effect that was larger in 8-year-olds. The congruency effect was reduced when faces and bodies were misaligned, providing some evidence for holistic processing. Neither adults nor 8-year-olds were affected by congruency when judging sad versus happy expressions. Evidence that congruency effects vary with age and with similarity of emotional expressions is consistent with dimensional theories and "emotional seed" models of emotion perception.

  39. Emotional face processing and flat affect in schizophrenia: functional and structural neural correlates.

    PubMed

    Lepage, M; Sergerie, K; Benoit, A; Czechowska, Y; Dickie, E; Armony, J L

    2011-09-01

    There is a general consensus in the literature that schizophrenia causes difficulties with facial emotion perception and discrimination. Functional brain imaging studies have observed reduced limbic activity during facial emotion perception but few studies have examined the relation to flat affect severity. A total of 26 people with schizophrenia and 26 healthy controls took part in this event-related functional magnetic resonance imaging study. Sad, happy and neutral faces were presented in a pseudo-random order and participants indicated the gender of the face presented. Manual segmentation of the amygdala was performed on a structural T1 image. Both the schizophrenia group and the healthy control group rated the emotional valence of facial expressions similarly. Both groups exhibited increased brain activity during the perception of emotional faces relative to neutral ones in multiple brain regions, including multiple prefrontal regions bilaterally, the right amygdala, right cingulate cortex and cuneus. Group comparisons, however, revealed increased activity in the healthy group in the anterior cingulate, right parahippocampal gyrus and multiple visual areas. In schizophrenia, the severity of flat affect correlated significantly with neural activity in several brain areas including the amygdala and parahippocampal region bilaterally. These results suggest that many of the brain regions involved in emotional face perception, including the amygdala, are equally recruited in both schizophrenia and controls, but flat affect can also moderate activity in some other brain regions, notably in the left amygdala and parahippocampal gyrus bilaterally. There were no significant group differences in the volume of the amygdala.

  40. Influence of Perceived Height, Masculinity, and Age on Each Other and on Perceptions of Dominance in Male Faces.

    PubMed

    Batres, Carlota; Re, Daniel E; Perrett, David I

    2015-01-01

    Several studies have examined the individual effects of facial cues to height, masculinity, and age on interpersonal interactions and partner preferences. We know much less about the influence of these traits on each other. We, therefore, examined how facial cues to height, masculinity, and age influence perceptions of each other and found significant overlap. This suggests that studies investigating the effects of one of these traits in isolation may need to account for the influence of the other two traits. Additionally, there is inconsistent evidence on how each of these three facial traits affects dominance. We, therefore, investigated how varying such traits influences perceptions of dominance in male faces. We found that increases in perceived height, masculinity, and age (up to 35 years) all increased facial dominance. Our results may reflect perceptual generalizations from sex differences as men are on average taller, more dominant, and age faster than women. Furthermore, we found that the influences of height and age on perceptions of dominance are mediated by masculinity. These results give us a better understanding of the facial characteristics that convey the appearance of dominance, a trait that is linked to a wealth of real-world outcomes. © The Author(s) 2015.

  1. Symmetrical and Asymmetrical Interactions between Facial Expressions and Gender Information in Face Perception.

    PubMed

    Liu, Chengwei; Liu, Ying; Iqbal, Zahida; Li, Wenhui; Lv, Bo; Jiang, Zhongqing

    2017-01-01

    To investigate the interaction between facial expressions and facial gender information during face perception, the present study matched the intensities of the two types of information in face images and presented them under the orthogonal condition of the Garner Paradigm, in which gender and expression varied orthogonally; participants judged either the gender or the expression of each face. Gender and expression processing showed a mutual interaction. On the one hand, angry expressions were judged faster when shown on male faces; on the other hand, the female gender was classified faster when paired with a happy expression than with an angry expression. According to the event-related potential results, expression classification was influenced by gender during the face structural processing stage (as indexed by the N170), indicating that facial gender can facilitate or interfere with the coding of facial expression features. Gender processing, in contrast, was affected by facial expressions at more stages, including the early (P1) and late (LPC) stages of perceptual processing, suggesting that emotional expression influences gender processing mainly by directing attention.

  2. Distinct facial processing in schizophrenia and schizoaffective disorders

    PubMed Central

    Chen, Yue; Cataldo, Andrea; Norton, Daniel J; Ongur, Dost

    2011-01-01

    Although schizophrenia and schizoaffective disorders have both similar and differing clinical features, it is not well understood whether similar or differing pathophysiological processes mediate patients’ cognitive functions. Using psychophysical methods, this study compared the performances of schizophrenia (SZ) patients, patients with schizoaffective disorder (SA), and a healthy control group in two face-related cognitive tasks: emotion discrimination, which tested perception of facial affect, and identity discrimination, which tested perception of non-affective facial features. Compared to healthy controls, SZ patients, but not SA patients, exhibited deficient performance in both fear and happiness discrimination, as well as identity discrimination. SZ patients, but not SA patients, also showed impaired performance in a theory-of-mind task for which emotional expressions are identified based upon the eye regions of face images. This pattern of results suggests distinct processing of face information in schizophrenia and schizoaffective disorders. PMID:21868199

  3. Colour homogeneity and visual perception of age, health and attractiveness of male facial skin.

    PubMed

    Fink, B; Matts, P J; D'Emiliano, D; Bunse, L; Weege, B; Röder, S

    2012-12-01

    Visible facial skin condition in females is known to affect perception of age, health and attractiveness. Skin colour distribution in shape- and topography-standardized female faces, driven by localized melanin and haemoglobin, can account for up to twenty years of variation in apparent age. Although this is corroborated by an ability to discern female age even in isolated, non-contextual skin images, a similar effect in the perception of male skin has yet to be demonstrated. This study investigated the effect of skin colour homogeneity and chromophore distribution on the visual perception of the age, health and attractiveness of male facial skin. Images cropped from the cheeks of facial photographs of 160 Caucasian British men aged 10-70 years were blind-rated for age, health and attractiveness by a total of 308 participants. In addition, the homogeneity of the skin images and of the corresponding eumelanin/oxyhaemoglobin concentration maps was analysed objectively using Haralick's image segmentation algorithm. Isolated skin images taken from the cheeks of younger males were judged as healthier and more attractive. Perception of age, health and attractiveness was strongly related to melanin and haemoglobin distribution, whereby more even distributions led to perception of younger age and greater health and attractiveness. The evenness of melanized features was a stronger cue for age perception, whereas haemoglobin distribution was associated more strongly with health and attractiveness perception. Male skin colour homogeneity, driven by melanin and haemoglobin distribution, influences perception of age, health and attractiveness. © 2011 The Authors. Journal of the European Academy of Dermatology and Venereology © 2011 European Academy of Dermatology and Venereology.
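    The objective homogeneity analysis mentioned above relies on Haralick-style texture statistics. As a minimal illustration of one such measure, grey-level co-occurrence homogeneity computed with scikit-image (a sketch under stated assumptions, not the authors' exact pipeline; the quantization level and pixel offsets are arbitrary choices here):

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      def glcm_homogeneity(gray_patch, levels=64):
          """Haralick-style homogeneity of a grayscale skin patch (2-D uint8 array).

          Higher values indicate a more even, less textured intensity distribution.
          """
          # Quantize to fewer gray levels to stabilize the co-occurrence counts.
          quantized = (gray_patch.astype(np.float64) / 256.0 * levels).astype(np.uint8)
          glcm = graycomatrix(quantized,
                              distances=[1, 2, 4],                       # pixel offsets
                              angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                              levels=levels, symmetric=True, normed=True)
          # Average the homogeneity statistic over all offsets and directions.
          return float(graycoprops(glcm, 'homogeneity').mean())

    Applied to cropped cheek images, a higher homogeneity score would correspond to the "more even distributions" that the study links to younger, healthier and more attractive ratings.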

  4. Identifying differences in biased affective information processing in major depression.

    PubMed

    Gollan, Jackie K; Pane, Heather T; McCloskey, Michael S; Coccaro, Emil F

    2008-05-30

    This study investigated the extent to which participants with major depression differ from healthy comparison participants in affective information processing, as indexed by facial expression recognition, intensity categorization, and reaction time to emotionally salient and neutral information. Data on diagnoses, symptom severity, and affective information processing from a facial recognition task were collected from 66 male and female participants aged 18 to 54 years, grouped by major depressive disorder (N=37) or healthy non-psychiatric (N=29) status. MANCOVAs revealed that major depression was associated with significantly longer reaction times to sad facial expressions compared with healthy status. Depressed participants also showed a negative bias, interpreting neutral facial expressions as sad significantly more often than healthy participants; in turn, healthy participants interpreted neutral faces as happy significantly more often than depressed participants. No group differences were observed for facial expression recognition or intensity categorization. The observed effects suggest that depression alters the perceived intensity of negative affective stimuli, slows the processing of sad affective information, and biases the interpretation of neutral faces towards sadness.
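    As a rough illustration of the kind of MANCOVA reported above (a generic sketch, not the authors' analysis; the file name and column names such as rt_sad are hypothetical), a group comparison on several reaction-time measures with a covariate could be run in Python with statsmodels:

      import pandas as pd
      from statsmodels.multivariate.manova import MANOVA

      # Hypothetical data: one row per participant with reaction times (ms) to sad,
      # happy and neutral faces, diagnostic group, and age as a covariate.
      df = pd.read_csv("facial_recognition_rt.csv")  # columns: rt_sad, rt_happy, rt_neutral, group, age

      # MANCOVA: the reaction-time outcomes are modeled jointly, with group as the
      # factor of interest and age entered as a covariate.
      model = MANOVA.from_formula("rt_sad + rt_happy + rt_neutral ~ group + age", data=df)
      print(model.mv_test())  # Wilks' lambda, Pillai's trace, etc. for each term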

  5. Social perception and aging: The relationship between aging and the perception of subtle changes in facial happiness and identity.

    PubMed

    Yang, Tao; Penton, Tegan; Köybaşı, Şerife Leman; Banissy, Michael J

    2017-09-01

    Previous findings suggest that older adults show impairments in the social perception of faces, including the perception of emotion and facial identity. The majority of this work has tended to examine performance on tasks involving young adult faces and prototypical emotions. While useful, this can influence performance differences between groups due to perceptual biases and limitations on task performance. Here we sought to examine how typical aging is associated with the perception of subtle changes in facial happiness and facial identity in older adult faces. We developed novel tasks that permitted the ability to assess facial happiness, facial identity, and non-social perception (object perception) across similar task parameters. We observe that aging is linked with declines in the ability to make fine-grained judgements in the perception of facial happiness and facial identity (from older adult faces), but not for non-social (object) perception. This pattern of results is discussed in relation to mechanisms that may contribute to declines in facial perceptual processing in older adulthood. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  6. What's in a "face file"? Feature binding with facial identity, emotion, and gaze direction.

    PubMed

    Fitousi, Daniel

    2017-07-01

    A series of four experiments investigated the binding of facial (i.e., facial identity, emotion, and gaze direction) and non-facial (i.e., spatial location and response location) attributes. The experiments provide evidence for the creation and retrieval of temporary memory structures for faces across perception and action. These episodic structures, dubbed herein "face files", consisted of both visuo-visual and visuo-motor bindings. Feature binding was indicated by partial-repetition costs: repeating a combination of facial features, or altering all of them, led to faster responses than repeating or alternating only one of the features. Taken together, the results indicate that (a) "face files" affect both action and perception mechanisms, (b) binding can take place with facial dimensions and is not restricted to low-level features (Hommel, Visual Cognition 5:183-216, 1998), and (c) the binding of facial and non-facial attributes is facilitated if the dimensions share common spatial or motor codes. The theoretical contributions of these results to "person construal" theories (Freeman & Ambady, Psychological Science, 20(10), 1183-1188, 2011), as well as to face recognition models (Haxby, Hoffman, & Gobbini, Biological Psychiatry, 51(1), 59-67, 2000), are discussed.

  7. Emotion perception across cultures: the role of cognitive mechanisms

    PubMed Central

    Engelmann, Jan B.; Pogosyan, Marianna

    2012-01-01

    Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception. PMID:23486743

  8. Emotion perception across cultures: the role of cognitive mechanisms.

    PubMed

    Engelmann, Jan B; Pogosyan, Marianna

    2013-01-01

    Despite consistently documented cultural differences in the perception of facial expressions of emotion, the role of culture in shaping cognitive mechanisms that are central to emotion perception has received relatively little attention in past research. We review recent developments in cross-cultural psychology that provide particular insights into the modulatory role of culture on cognitive mechanisms involved in interpretations of facial expressions of emotion through two distinct routes: display rules and cognitive styles. Investigations of emotion intensity perception have demonstrated that facial expressions with varying levels of intensity of positive affect are perceived and categorized differently across cultures. Specifically, recent findings indicating significant levels of differentiation between intensity levels of facial expressions among American participants, as well as deviations from clear categorization of high and low intensity expressions among Japanese and Russian participants, suggest that display rules shape mental representations of emotions, such as intensity levels of emotion prototypes. Furthermore, a series of recent studies using eye tracking as a proxy for overt attention during face perception have identified culture-specific cognitive styles, such as the propensity to attend to very specific features of the face. Together, these results suggest a cascade of cultural influences on cognitive mechanisms involved in interpretations of facial expressions of emotion, whereby cultures impart specific behavioral practices that shape the way individuals process information from the environment. These cultural influences lead to differences in cognitive styles due to culture-specific attentional biases and emotion prototypes, which partially account for the gradient of cultural agreements and disagreements obtained in past investigations of emotion perception.

  9. Exaggerated perception of facial expressions is increased in individuals with schizotypal traits

    PubMed Central

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2015-01-01

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions. PMID:26135081

  10. Exaggerated perception of facial expressions is increased in individuals with schizotypal traits.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2015-07-02

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions.

  11. Perceived functional impact of abnormal facial appearance.

    PubMed

    Rankin, Marlene; Borah, Gregory L

    2003-06-01

    Functional facial deformities are usually described as those that impair respiration, eating, hearing, or speech. Yet facial scars and cutaneous deformities have a significant negative effect on social functionality that has been poorly documented in the scientific literature. Insurance companies are declining payment for reconstructive surgical procedures for facial deformities caused by congenital disabilities and after cancer or trauma operations when the deformity does not affect mechanical facial activity. The purpose of this study was to establish a large, sample-based evaluation of the perceived social functioning, interpersonal characteristics, and employability indices for a range of facial appearances (normal and abnormal). Adult volunteer evaluators (n = 210) provided their subjective perceptions based on facial physical appearance, and an analysis of the consequences of facial deformity on parameters of preferential treatment was performed. A two-group comparative design was used to rate the differences among 10 examples of digitally altered facial photographs of actual patients, of various ages and ethnic groups, with "normal" and "abnormal" congenital deformities or posttrauma scars. Photographs of adult patients with observable congenital and posttraumatic deformities (abnormal) were digitally retouched to eliminate the stigmatic defects (normal). The normal and abnormal photographs of identical patients were evaluated by the large study sample on nine parameters of social functioning, such as honesty, employability, attractiveness, and effectiveness, using a visual analogue rating scale. Patients with abnormal facial characteristics were rated as significantly less honest (p = 0.007), less employable (p = 0.001), less trustworthy (p = 0.01), less optimistic (p = 0.001), less effective (p = 0.02), less capable (p = 0.002), less intelligent (p = 0.03), less popular (p = 0.001), and less attractive (p = 0.001) than were the same patients with normal facial appearances. Facial deformities caused by trauma, congenital disabilities, and postsurgical sequelae thus carry significant adverse functional consequences: they have a significant negative effect on perceptions of social functionality, including employability, honesty, and trustworthiness. These adverse perceptions occur regardless of the sex, educational level, and age of the evaluator.

  12. Compensatory premotor activity during affective face processing in subclinical carriers of a single mutant Parkin allele.

    PubMed

    Anders, Silke; Sack, Benjamin; Pohl, Anna; Münte, Thomas; Pramstaller, Peter; Klein, Christine; Binkofski, Ferdinand

    2012-04-01

    Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia-cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia-cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons ('mirror neurons') in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia-cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. As predicted, Parkin mutation carriers showed significantly stronger activity in the right ventrolateral premotor cortex during execution and perception of affective facial gestures than healthy controls. Furthermore, Parkin mutation carriers showed a slightly reduced ability to recognize facial emotions that was least severe in individuals who showed the strongest increase of ventrolateral premotor activity. In addition, Parkin mutation carriers showed a significantly weaker than normal increase of activity in the left lateral orbitofrontal cortex (inferior frontal gyrus pars orbitalis, Brodmann area 47), which was unrelated to facial emotion recognition ability. These findings are consistent with the hypothesis that compensatory activity in the ventrolateral premotor cortex during processing of affective facial gestures can reduce impairments in facial emotion recognition in subclinical Parkin mutation carriers. A breakdown of this compensatory mechanism might lead to the impairment of facial expressivity and facial emotion recognition observed in manifest Parkinson's disease.

  13. Compensatory premotor activity during affective face processing in subclinical carriers of a single mutant Parkin allele

    PubMed Central

    Sack, Benjamin; Pohl, Anna; Münte, Thomas; Pramstaller, Peter; Klein, Christine; Binkofski, Ferdinand

    2012-01-01

    Patients with Parkinson's disease suffer from significant motor impairments and accompanying cognitive and affective dysfunction due to progressive disturbances of basal ganglia–cortical gating loops. Parkinson's disease has a long presymptomatic stage, which indicates a substantial capacity of the human brain to compensate for dopaminergic nerve degeneration before clinical manifestation of the disease. Neuroimaging studies provide evidence that increased motor-related cortical activity can compensate for progressive dopaminergic nerve degeneration in carriers of a single mutant Parkin or PINK1 gene, who show a mild but significant reduction of dopamine metabolism in the basal ganglia in the complete absence of clinical motor signs. However, it is currently unknown whether similar compensatory mechanisms are effective in non-motor basal ganglia–cortical gating loops. Here, we ask whether asymptomatic Parkin mutation carriers show altered patterns of brain activity during processing of facial gestures, and whether this might compensate for latent facial emotion recognition deficits. Current theories in social neuroscience assume that execution and perception of facial gestures are linked by a special class of visuomotor neurons (‘mirror neurons’) in the ventrolateral premotor cortex/pars opercularis of the inferior frontal gyrus (Brodmann area 44/6). We hypothesized that asymptomatic Parkin mutation carriers would show increased activity in this area during processing of affective facial gestures, replicating the compensatory motor effects that have previously been observed in these individuals. Additionally, Parkin mutation carriers might show altered activity in other basal ganglia–cortical gating loops. Eight asymptomatic heterozygous Parkin mutation carriers and eight matched controls underwent functional magnetic resonance imaging and a subsequent facial emotion recognition task. As predicted, Parkin mutation carriers showed significantly stronger activity in the right ventrolateral premotor cortex during execution and perception of affective facial gestures than healthy controls. Furthermore, Parkin mutation carriers showed a slightly reduced ability to recognize facial emotions that was least severe in individuals who showed the strongest increase of ventrolateral premotor activity. In addition, Parkin mutation carriers showed a significantly weaker than normal increase of activity in the left lateral orbitofrontal cortex (inferior frontal gyrus pars orbitalis, Brodmann area 47), which was unrelated to facial emotion recognition ability. These findings are consistent with the hypothesis that compensatory activity in the ventrolateral premotor cortex during processing of affective facial gestures can reduce impairments in facial emotion recognition in subclinical Parkin mutation carriers. A breakdown of this compensatory mechanism might lead to the impairment of facial expressivity and facial emotion recognition observed in manifest Parkinson's disease. PMID:22434215

  14. Perceptual and affective mechanisms in facial expression recognition: An integrative review.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2016-09-01

    Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.

  15. An Intact Social Cognitive Process in Schizophrenia: Situational Context Effects on Perception of Facial Affect

    PubMed Central

    Lee, Junghee; Kern, Robert S.; Harvey, Philippe-Olivier; Horan, William P.; Kee, Kimmy S.; Ochsner, Kevin; Penn, David L.; Green, Michael F.

    2013-01-01

    Background Impaired facial affect recognition is the most consistent social cognitive finding in schizophrenia. Although social situations provide powerful constraints on our perception, little is known about how situational context modulates facial affect recognition in schizophrenia. Methods Study 1 was a single-site study with 34 schizophrenia patients and 22 healthy controls. Study 2 was a 2-site study with 68 schizophrenia patients and 28 controls. Both studies administered a Situational Context Facial Affect Recognition Task with 2 conditions: a situational context condition and a no-context condition. In the situational context condition, a briefly shown face was preceded by a sentence describing either a fear- or surprise-inducing event. In the no-context condition, a face was presented without a sentence. In both conditions, subjects rated how fearful or surprised the face appeared on a 9-point Likert scale. Results In the situational context condition of study 1, both patients and controls rated faces as more afraid when they were paired with fear-inducing sentences and as more surprised when they were paired with surprise-inducing sentences. The degree of modulation was comparable across groups. In the no-context condition, patients rated faces comparably to controls. The findings of study 2 replicated those of study 1. Conclusions Despite previously reported abnormalities in other types of context paradigms, this study found intact situational context processing in schizophrenia, suggesting that patients benefit from situational context when interpreting ambiguous facial expressions. This area of relative social cognitive strength in schizophrenia has implications for social cognitive training programs. PMID:22532704

  16. Perceptions of midline deviations among different facial types.

    PubMed

    Williams, Ryan P; Rinchuse, Daniel J; Zullo, Thomas G

    2014-02-01

    The correction of a deviated midline can involve complicated mechanics and a protracted treatment. The threshold below which midline deviations are considered acceptable might depend on multiple factors. The objective of this study was to evaluate the effect of facial type on laypersons' perceptions of various degrees of midline deviation. Smiling photographs of male and female subjects were altered to create 3 facial type variations (euryprosopic, mesoprosopic, and leptoprosopic) and deviations in the midline ranging from 0.0 to 4.0 mm. Evaluators rated the overall attractiveness and acceptability of each photograph. Data were collected from 160 raters. The overall threshold for the acceptability of a midline deviation was 2.92 ± 1.10 mm, with the threshold for the male subject significantly lower than that for the female subject. The euryprosopic facial type showed no decrease in mean attractiveness until the deviations were 2 mm or more. All other facial types were rated as decreasingly attractive from 1 mm onward. Among all facial types, the attractiveness of the male subject was only affected at deviations of 2 mm or greater; for the female subject, the attractiveness scores were significantly decreased at 1 mm. The mesoprosopic facial type was most attractive for the male subject but was the least attractive for the female subject. Facial type and sex may affect the thresholds at which a midline deviation is detected and above which a midline deviation is considered unacceptable. Both the euryprosopic facial type and male sex were associated with higher levels of attractiveness at relatively small levels of deviations. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  17. Perception of Age, Attractiveness, and Tiredness After Isolated and Combined Facial Subunit Aging.

    PubMed

    Forte, Antonio Jorge; Andrew, Tom W; Colasante, Cesar; Persing, John A

    2015-12-01

    Patients often seek help to redress aging that affects various regions of the face (subunits). The purpose of this study was to determine how aging of different facial subunits affects the perception of age, attractiveness, and tiredness. Frontal and lateral view facial photographs of a middle-aged woman were modified using imaging software to age different facial features independently. Sixty-six subjects were administered a questionnaire and presented with a baseline unmodified picture together with others showing aging of individual or grouped facial subunits. Test subjects were asked to estimate the age of the subject in each image and to quantify (0-10 scale) how "tired" and "attractive" she appeared. Facial subunits were then ranked by their impact on the perception of age, attractiveness, and tiredness. Estimated age and attractiveness showed a strong inverse correlation of approximately -0.95 in both lateral and frontal views. From most to least impact on age, the ranking for frontal view facial subunits was full facial aging, middle third, lower third, upper third, vertical lip rhytides, horizontal forehead rhytides, jowls, upper eyelid ptosis, loss of malar volume, lower lid fat herniation, deepening glabellar furrows, and deepening nasolabial folds. From most to least impact on age, the ranking for lateral view facial subunits was severe neck ptosis, jowls, moderate neck ptosis, vertical lip rhytides, crow's feet, lower lid fat herniation, loss of malar volume, and elongated earlobe. This study provides a preliminary template for further research to determine which anatomical subunit has the most substantial effect on an aged appearance, as well as on the perception of tiredness and attractiveness. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  18. Emotion Perception from Face, Voice, and Touch: Comparisons and Convergence

    PubMed Central

    Schirmer, Annett; Adolphs, Ralph

    2017-01-01

    Historically, research on emotion perception has focused on facial expressions, and findings from this modality have come to dominate our thinking about other modalities. Here, we examine emotion perception through a wider lens by comparing facial with vocal and tactile processing. We review stimulus characteristics and ensuing behavioral and brain responses, and show that audition and touch do not simply duplicate visual mechanisms. Each modality provides a distinct input channel and engages partly non-overlapping neuroanatomical systems with different processing specializations (e.g., specific emotions versus affect). Moreover, processing of signals across the different modalities converges, first into multi- and later into amodal representations that enable holistic emotion judgments. PMID:28173998

  19. Self-concept and the perception of facial appearance in children and adolescents seeking orthodontic treatment.

    PubMed

    Phillips, Ceib; Beal, Kimberly N Edwards

    2009-01-01

    To examine, in adolescents with mild to moderate malocclusion, the relationship between self-concept and demographic characteristics, a clinical assessment of malocclusion, self-perception of malocclusion, and self-perception of facial attractiveness. Fifty-nine consecutive patients ages 9 to 15 years scheduled for initial records in a graduate orthodontic clinic consented to participate. Each subject independently completed the Multidimensional Self-Concept Scale (MSCS), the Facial Image Scale, and the Index of Treatment Need-Aesthetic Component (IOTN-AC). Peer Assessment Rating (PAR) scores were obtained from the patients' diagnostic dental casts. Forward multiple-regression analysis with a backward overlook was used to analyze the effect of the demographic, clinical, and self-perception measures on each of the six self-concept (MSCS) domains. Self-perception of the dentofacial region was the only statistically significant predictor (P < .05) for the Global, Competence, Affect, Academic, and Physical domains of self-concept, while age, parental marital status, and the adolescent's self-perception of the dentofacial region were statistically significant predictors (P < .05) of Social Self-Concept. The self-perceived level of the attractiveness or "positive" feelings toward the dentofacial region is more strongly related to self-concept than the severity of the malocclusion as indicated by the PAR score or by the adolescent's perception of their malocclusion.
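    The "forward multiple-regression analysis with a backward overlook" mentioned above is a stepwise predictor-selection procedure. As a generic sketch of the forward step only (an illustration, not the exact procedure or software used in the study; the backward re-check is omitted), one might write:

      import statsmodels.api as sm

      def forward_select(X, y, alpha=0.05):
          """Greedy forward selection over the columns of a pandas DataFrame X.

          At each step, add the remaining predictor with the smallest p-value,
          provided that p-value is below alpha; otherwise stop.
          """
          selected, remaining = [], list(X.columns)
          while remaining:
              pvals = {}
              for col in remaining:
                  fit = sm.OLS(y, sm.add_constant(X[selected + [col]])).fit()
                  pvals[col] = fit.pvalues[col]
              best = min(pvals, key=pvals.get)
              if pvals[best] >= alpha:
                  break
              selected.append(best)
              remaining.remove(best)
          return selected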

  20. Impressions of dominance are made relative to others in the visual environment.

    PubMed

    Re, Daniel E; Lefevre, Carmen E; DeBruine, Lisa M; Jones, Benedict C; Perrett, David I

    2014-03-27

    Face judgments of dominance play an important role in human social interaction. Perceived facial dominance is thought to indicate physical formidability, as well as resource acquisition and holding potential. Dominance cues in the face affect perceptions of attractiveness, emotional state, and physical strength. Most experimental paradigms test perceptions of facial dominance in individual faces, or they use manipulated versions of the same face in a forced-choice task but in the absence of other faces. Here, we extend this work by assessing whether dominance ratings are absolute or are judged relative to other faces. We presented participants with faces to be rated for dominance (target faces), while also presenting a second face (non-target faces) that was not to be rated. We found that both the masculinity and sex of the non-target face affected dominance ratings of the target face. Masculinized non-target faces decreased the perceived dominance of a target face relative to a feminized non-target face, and displaying a male non-target face decreased perceived dominance of a target face more so than a female non-target face. Perceived dominance of male target faces was affected more by masculinization of male non-target faces than female non-target faces. These results indicate that dominance perceptions can be altered by surrounding faces, demonstrating that facial dominance is judged at least partly relative to other faces.

  1. Fourier power spectrum characteristics of face photographs: attractiveness perception depends on low-level image properties.

    PubMed

    Menzel, Claudia; Hayn-Leichsenring, Gregor U; Langner, Oliver; Wiese, Holger; Redies, Christoph

    2015-01-01

    We investigated whether low-level processed image properties that are shared by natural scenes and artworks - but not veridical face photographs - affect the perception of facial attractiveness and age. Specifically, we considered the slope of the radially averaged Fourier power spectrum in a log-log plot. This slope is a measure of the distribution of spatial frequency power in an image. Images of natural scenes and artworks possess - compared to face images - a relatively shallow slope (i.e., increased high spatial frequency power). Since aesthetic perception might be based on the efficient processing of images with natural scene statistics, we assumed that the perception of facial attractiveness might also be affected by these properties. We calculated Fourier slope and other beauty-associated measurements in face images and correlated them with ratings of attractiveness and age of the depicted persons (Study 1). We found that Fourier slope - in contrast to the other tested image properties - did not predict attractiveness ratings when we controlled for age. In Study 2A, we overlaid face images with random-phase patterns with different statistics. Patterns with a slope similar to those in natural scenes and artworks resulted in lower attractiveness and higher age ratings. In Studies 2B and 2C, we directly manipulated the Fourier slope of face images and found that images with shallower slopes were rated as more attractive. Additionally, attractiveness of unaltered faces was affected by the Fourier slope of a random-phase background (Study 3). Faces in front of backgrounds with statistics similar to natural scenes and faces were rated as more attractive. We conclude that facial attractiveness ratings are affected by specific image properties. An explanation might be the efficient coding hypothesis.
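    The power-spectrum slope described above can be estimated by radially averaging an image's 2-D power spectrum and fitting a line in log-log coordinates. A minimal NumPy sketch (details such as windowing and the fitted frequency range are simplified assumptions, not the authors' exact procedure):

      import numpy as np

      def fourier_slope(gray_image):
          """Slope of the radially averaged Fourier power spectrum in log-log space.

          gray_image: 2-D float array. Shallower (less negative) slopes indicate
          relatively more high-spatial-frequency power.
          """
          spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray_image))) ** 2
          h, w = spectrum.shape
          cy, cx = h // 2, w // 2
          ys, xs = np.indices(spectrum.shape)
          radius = np.hypot(ys - cy, xs - cx).astype(int)   # integer radial frequency bin
          # Mean power within each radial bin (bin 0 is the DC component and is skipped).
          radial_power = np.bincount(radius.ravel(), weights=spectrum.ravel()) / np.bincount(radius.ravel())
          freqs = np.arange(1, min(cy, cx))                 # stay inside the Nyquist range
          slope, _ = np.polyfit(np.log(freqs), np.log(radial_power[freqs]), 1)
          return slope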

  2. Fourier Power Spectrum Characteristics of Face Photographs: Attractiveness Perception Depends on Low-Level Image Properties

    PubMed Central

    Langner, Oliver; Wiese, Holger; Redies, Christoph

    2015-01-01

    We investigated whether low-level processed image properties that are shared by natural scenes and artworks – but not veridical face photographs – affect the perception of facial attractiveness and age. Specifically, we considered the slope of the radially averaged Fourier power spectrum in a log-log plot. This slope is a measure of the distribution of spatial frequency power in an image. Images of natural scenes and artworks possess – compared to face images – a relatively shallow slope (i.e., increased high spatial frequency power). Since aesthetic perception might be based on the efficient processing of images with natural scene statistics, we assumed that the perception of facial attractiveness might also be affected by these properties. We calculated Fourier slope and other beauty-associated measurements in face images and correlated them with ratings of attractiveness and age of the depicted persons (Study 1). We found that Fourier slope – in contrast to the other tested image properties – did not predict attractiveness ratings when we controlled for age. In Study 2A, we overlaid face images with random-phase patterns with different statistics. Patterns with a slope similar to those in natural scenes and artworks resulted in lower attractiveness and higher age ratings. In Studies 2B and 2C, we directly manipulated the Fourier slope of face images and found that images with shallower slopes were rated as more attractive. Additionally, attractiveness of unaltered faces was affected by the Fourier slope of a random-phase background (Study 3). Faces in front of backgrounds with statistics similar to natural scenes and faces were rated as more attractive. We conclude that facial attractiveness ratings are affected by specific image properties. An explanation might be the efficient coding hypothesis. PMID:25835539

  3. Brain systems for assessing the affective value of faces

    PubMed Central

    Said, Christopher P.; Haxby, James V.; Todorov, Alexander

    2011-01-01

    Cognitive neuroscience research on facial expression recognition and face evaluation has proliferated over the past 15 years. Nevertheless, large questions remain unanswered. In this overview, we discuss the current understanding in the field, and describe what is known and what remains unknown. In §2, we describe three types of behavioural evidence that the perception of traits in neutral faces is related to the perception of facial expressions, and may rely on the same mechanisms. In §3, we discuss cortical systems for the perception of facial expressions, and argue for a partial segregation of function in the superior temporal sulcus and the fusiform gyrus. In §4, we describe the current understanding of how the brain responds to emotionally neutral faces. To resolve some of the inconsistencies in the literature, we perform a large group analysis across three different studies, and argue that one parsimonious explanation of prior findings is that faces are coded in terms of their typicality. In §5, we discuss how these two lines of research—perception of emotional expressions and face evaluation—could be integrated into a common, cognitive neuroscience framework. PMID:21536552

  4. Dentofacial self-perception and social perception of adults with unilateral cleft lip and palate.

    PubMed

    Meyer-Marcotty, Philipp; Stellzig-Eisenhauer, Angelika

    2009-05-01

    The aim of this study was to investigate the influence of facial asymmetry on how an adult population with unilateral cleft lip and palate (CLP) perceived themselves and were perceived by others. 3D facial data of 30 adult patients with CLP were acquired by scanning, and standardized extra- and intraoral photographs were taken. The degree of 3D asymmetry was computed for the entire face, the midface and the lower face. Subjective estimates of facial symmetry and attractiveness, as well as satisfaction and a desire or indication for further treatment, were surveyed by means of a questionnaire completed by the patients and by an assessment group (10 orthodontists, 10 oral and maxillofacial (OM) surgeons, 15 laypersons). The results show that the largest degree of asymmetry was found in the midface of the CLP patients. The vast majority of the patients were dissatisfied with their facial appearance, and patients, experts and laypersons expressed great interest in and a need for correction. We observed tangible incongruence between how the patients perceived their own faces and how others perceived them. Asymmetry, especially in the midface, appears to detract from how facial appearance is self-perceived and perceived by others, which explains the primary desire for and need of nose correction. The self-perception of patients affected by CLP does not correlate with objective measurements or with how others perceive them. Clinicians should be open to adult patients' requests for correction, but the patient's self-perception should also be critically explored.

  5. Self-Concept and the Perception of Facial Appearance in Children and Adolescents Seeking Orthodontic Treatment

    PubMed Central

    Phillips, Ceib; Beal, Kimberly N. Edwards

    2009-01-01

    Objective To examine, in adolescents with mild to moderate malocclusion, the relationship between self-concept and demographic characteristics, a clinical assessment of malocclusion, self-perception of malocclusion, and self-perception of facial attractiveness. Methods and Materials Fifty-nine consecutive patients ages 9 to 15 years scheduled for initial records in a graduate orthodontic clinic consented to participate. Each subject independently completed the Multidimensional Self-Concept Scale (MSCS), the Facial Image Scale, and the Index of Treatment Need–Aesthetic Component (IOTN-AC). Peer Assessment Rating (PAR) scores were obtained from the patients’ diagnostic dental casts. Forward multiple-regression analysis with a backward overlook was used to analyze the effect of the demographic, clinical, and self-perception measures on each of the six self-concept (MSCS) domains. Results Self-perception of the dentofacial region was the only statistically significant predictor (P < .05) for the Global, Competence, Affect, Academic, and Physical domains of self-concept, while age, parental marital status, and the adolescent's self-perception of the dentofacial region were statistically significant predictors (P < .05) of Social Self-Concept. Conclusion The self-perceived level of the attractiveness or “positive” feelings toward the dentofacial region is more strongly related to self-concept than the severity of the malocclusion as indicated by the PAR score or by the adolescent's perception of their malocclusion. PMID:19123700

  6. Drug effects on responses to emotional facial expressions: recent findings.

    PubMed

    Miller, Melissa A; Bershad, Anya K; de Wit, Harriet

    2015-09-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally.

  7. Effects of face feature and contour crowding in facial expression adaptation.

    PubMed

    Liu, Pan; Montaser-Kouhsari, Leila; Xu, Hong

    2014-12-01

    Prolonged exposure to a visual stimulus, such as a happy face, biases the perception of a subsequently presented neutral face toward sadness, a phenomenon known as face adaptation. Face adaptation is affected by the visibility or awareness of the adapting face; however, whether it is affected by the discriminability of the adapting face is largely unknown. In the current study, we used crowding to manipulate the discriminability of the adapting face and tested its effect on face adaptation. Instead of presenting flanking faces near the target face, we shortened the distance between facial features (internal feature crowding) and reduced the size of the face contour (external contour crowding) to introduce crowding. Our first experiment asked whether internal feature crowding or external contour crowding is more effective in inducing a crowding effect; we found that combining internal feature and external contour crowding, but not either alone, induced a significant crowding effect. Experiment 2 investigated the effect of crowding on adaptation: both internal feature crowding and external contour crowding significantly reduced the facial expression aftereffect (FEA). However, we did not find a significant correlation between the discriminability of the adapting face and its FEA; interestingly, the discriminabilities of the adapting and test faces were significantly correlated. Experiment 3 found that the reduced adaptation aftereffect under combined crowding by the external face contour and the internal facial features cannot be decomposed linearly into the separate effects of the face contour and the facial features, suggesting a nonlinear integration between facial features and face contour in face adaptation.

  8. The effect of skin surface topography and skin colouration cues on perception of male facial age, health and attractiveness.

    PubMed

    Fink, B; Matts, P J; Brauckmann, C; Gundlach, S

    2018-04-01

    Previous studies investigating the effects of skin surface topography and colouration cues on the perception of female faces reported a differential weighting for the perception of skin topography and colour evenness, where topography was a stronger visual cue for the perception of age, whereas skin colour evenness was a stronger visual cue for the perception of health. We extend these findings in a study of the effect of skin surface topography and colour evenness cues on the perceptions of facial age, health and attractiveness in males. Facial images of six men (aged 40 to 70 years), selected for co-expression of lines/wrinkles and discolouration, were manipulated digitally to create eight stimuli, namely, separate removal of these two features (a) on the forehead, (b) in the periorbital area, (c) on the cheeks and (d) across the entire face. Omnibus (within-face) pairwise combinations, including the original (unmodified) face, were presented to a total of 240 male and female judges, who selected the face they considered younger, healthier and more attractive. Significant effects were detected for facial image choice, in response to skin feature manipulation. The combined removal of skin surface topography resulted in younger age perception compared with that seen with the removal of skin colouration cues, whereas the opposite pattern was found for health preference. No difference was detected for the perception of attractiveness. These perceptual effects were seen particularly on the forehead and cheeks. Removing skin topography cues (but not discolouration) in the periorbital area resulted in higher preferences for all three attributes. Skin surface topography and colouration cues affect the perception of age, health and attractiveness in men's faces. The combined removal of these features on the forehead, cheeks and in the periorbital area results in the most positive assessments. © 2018 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  9. Modulation of α power and functional connectivity during facial affect recognition.

    PubMed

    Popov, Tzvetan; Miller, Gregory A; Rockstroh, Brigitte; Weisz, Nathan

    2013-04-03

    Research has linked oscillatory activity in the α frequency range, particularly in sensorimotor cortex, to the processing of social actions. Results further suggest involvement of sensorimotor α in the processing of facial expressions, including affect. The sensorimotor face area may be critical for the perception of emotional facial expressions, but the role it plays is unclear. The present study sought to clarify how oscillatory brain activity contributes to or reflects the processing of facial affect during changes in facial expression. Neuromagnetic oscillatory brain activity was monitored while 30 volunteers viewed videos of human faces that changed their expression from neutral to fearful, neutral, or happy expressions. Induced changes in α power during the different morphs, source analysis, and graph-theoretic metrics served to identify the role of α power modulation and of cross-regional coupling, measured as phase synchrony, during facial affect recognition. Changes from neutral to emotional faces were associated with a 10-15 Hz power increase localized in bilateral sensorimotor areas, together with an occipital power decrease, preceding reported recognition of the emotional expression. Graph-theoretic analysis revealed that, in the course of a trial, the balance between sensorimotor power increase and decrease was associated with decreased and increased transregional connectedness, respectively, as measured by node degree. The results suggest that the early increase in α power facilitates initial registration by leaving sensorimotor cortex, including the sensorimotor face area, largely functionally decoupled and thereby protected from additional, disruptive input, and that the subsequent decrease in α power, together with increased connectedness of sensorimotor areas, facilitates successful facial affect recognition.
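    As a schematic illustration of the phase-synchrony and node-degree measures referred to above (not the authors' MEG pipeline; the input shape and the binarization threshold are assumptions for illustration), a phase-locking-value matrix and its node degrees could be computed as follows:

      import numpy as np
      from scipy.signal import hilbert

      def plv_node_degree(band_passed, threshold=0.6):
          """Phase-locking values and node degrees for alpha-band signals.

          band_passed: array of shape (n_channels, n_samples), already band-pass
          filtered. Returns the PLV matrix and, for each channel/node, the number
          of supra-threshold connections (its degree).
          """
          phases = np.angle(hilbert(band_passed, axis=1))         # instantaneous phase per channel
          n = band_passed.shape[0]
          plv = np.ones((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  diff = phases[i] - phases[j]
                  plv[i, j] = plv[j, i] = np.abs(np.mean(np.exp(1j * diff)))
          adjacency = (plv > threshold) & ~np.eye(n, dtype=bool)  # binarize, drop self-connections
          degree = adjacency.sum(axis=1)                          # node degree per channel
          return plv, degree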

  10. Aberrant patterns of visual facial information usage in schizophrenia.

    PubMed

    Clark, Cameron M; Gosselin, Frédéric; Goghari, Vina M

    2013-05-01

    Deficits in facial emotion perception have been linked to poorer functional outcome in schizophrenia. However, the relationship between abnormal emotion perception and functional outcome remains poorly understood. To better understand the nature of facial emotion perception deficits in schizophrenia, we used the Bubbles Facial Emotion Perception Task to identify differences in the usage of visual facial information between schizophrenia patients (n = 20) and controls (n = 20) when differentiating between angry and neutral facial expressions. As hypothesized, schizophrenia patients required more facial information than controls to accurately differentiate between angry and neutral facial expressions, and they relied on different facial features and spatial frequencies to do so. Specifically, schizophrenia patients underutilized the eye regions, overutilized the nose and mouth regions, and virtually ignored information presented at the lowest spatial frequencies. In addition, a post hoc one-tailed t test revealed a moderate association between greater divergence from "normal" visual facial information usage in the eye region and lower overall social functioning. These findings provide direct support for aberrant patterns of visual facial information usage in schizophrenia when differentiating between socially salient emotional states. © 2013 American Psychological Association.
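    For readers unfamiliar with the Bubbles technique named above: on each trial the face is revealed only through randomly placed Gaussian apertures, and accuracy is later related to which regions happened to be visible. A minimal sketch of generating such a mask (the number and width of the bubbles here are illustrative, not the study's parameters):

      import numpy as np

      def bubbles_mask(height, width, n_bubbles=10, sigma=15.0, rng=None):
          """Sum of randomly placed 2-D Gaussian apertures, clipped to [0, 1].

          Multiplying a grayscale face image by this mask reveals only the regions
          under the 'bubbles'; across many trials, accuracy can be regressed on the
          revealed pixels to estimate which regions are diagnostic for the task.
          """
          rng = np.random.default_rng() if rng is None else rng
          ys, xs = np.mgrid[0:height, 0:width]
          mask = np.zeros((height, width))
          for _ in range(n_bubbles):
              cy, cx = rng.integers(0, height), rng.integers(0, width)
              mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))
          return np.clip(mask, 0.0, 1.0)

      # Example: reveal a grayscale face array through the bubbles.
      # revealed = face_gray * bubbles_mask(*face_gray.shape)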

  11. Facial emotion perception in Chinese patients with schizophrenia and non-psychotic first-degree relatives.

    PubMed

    Li, Huijie; Chan, Raymond C K; Zhao, Qing; Hong, Xiaohong; Gong, Qi-Yong

    2010-03-17

    Although there is a consensus that patients with schizophrenia have certain deficits in perceiving and expressing facial emotions, previous studies of facial emotion perception in schizophrenia do not present consistent results. The objective of this study was to explore facial emotion perception deficits in Chinese patients with schizophrenia and their non-psychotic first-degree relatives. Sixty-nine patients with schizophrenia, 56 of their first-degree relatives (33 parents and 23 siblings), and 92 healthy controls (67 younger healthy controls matched to the patients and siblings, and 25 older healthy controls matched to the parents) completed a set of facial emotion perception tasks, including facial emotion discrimination, identification, intensity, valence, and corresponding face identification tasks. The results demonstrated that patients with schizophrenia were significantly less accurate than their siblings and the younger healthy controls in a variety of facial emotion perception tasks, whereas the siblings of the patients performed as well as the corresponding younger healthy controls in all of the facial emotion perception tasks. Patients with schizophrenia were also significantly slower than the younger healthy controls, while the siblings of patients did not differ significantly in speed from either the patients or the younger healthy controls. We also found that parents of the schizophrenia patients were significantly less accurate than the corresponding older healthy controls in facial emotion identification, valence, and the composite index of the facial discrimination, identification, intensity and valence tasks. Moreover, no significant differences in speed were found between the parents of patients and the older healthy controls after controlling for years of education and IQ. Taken together, the results suggest that facial emotion perception deficits may serve as potential endophenotypes for schizophrenia. Copyright 2010 Elsevier Inc. All rights reserved.

  12. Effective connectivity during processing of facial affect: evidence for multiple parallel pathways.

    PubMed

    Dima, Danai; Stephan, Klaas E; Roiser, Jonathan P; Friston, Karl J; Frangou, Sophia

    2011-10-05

    The perception of facial affect engages a distributed cortical network. We used functional magnetic resonance imaging and dynamic causal modeling to characterize effective connectivity during explicit (conscious) categorization of affective stimuli in the human brain. Specifically, we examined the modulation of connectivity from posterior regions of the face-processing network to the lateral ventral prefrontal cortex (VPFC) during affective categorization and we tested for a potential role of the amygdala (AMG) in mediating this modulation. We found that explicit processing of facial affect led to prominent modulation (increase) in the effective connectivity from the inferior occipital gyrus (IOG) to the VPFC, while there was less evidence for modulation of the afferent connections from fusiform gyrus and AMG to VPFC. More specifically, the forward connection from IOG to the VPFC exhibited a selective increase under anger (as opposed to fear or sadness). Furthermore, Bayesian model comparison suggested that the modulation of afferent connections to the VPFC was mediated directly by facial affect, as opposed to an indirect modulation mediated by the AMG. Our results thus suggest that affective information is conveyed to the VPFC along multiple parallel pathways and that AMG activity is not sufficient to account for the gating of information transfer to the VPFC during explicit emotional processing.

  13. Discrimination and categorization of emotional facial expressions and faces in Parkinson's disease.

    PubMed

    Alonso-Recio, Laura; Martín, Pilar; Rubio, Sandra; Serrano, Juan M

    2014-09-01

    Our objective was to compare the ability to discriminate and categorize emotional facial expressions (EFEs) and facial identity characteristics (age and/or gender) in a group of 53 individuals with Parkinson's disease (PD) and another group of 53 healthy subjects. On the one hand, by means of discrimination and identification tasks, we compared two stages in the visual recognition process that could be selectively affected in individuals with PD. On the other hand, facial expression versus gender and age comparison permits us to contrast whether the emotional or non-emotional content influences the configural perception of faces. In Experiment I, we did not find differences between groups, either with facial expression or age, in discrimination tasks. Conversely, in Experiment II, we found differences between the groups, but only in the EFE identification task. Taken together, our results indicate that configural perception of faces does not seem to be globally impaired in PD. However, this ability is selectively altered when the categorization of emotional faces is required. A deeper assessment of the PD group indicated that decline in facial expression categorization is more evident in a subgroup of patients with higher global impairment (motor and cognitive). Taken together, these results suggest that the problems found in facial expression recognition may be associated with the progressive neuronal loss in frontostriatal and mesolimbic circuits, which characterizes PD. © 2013 The British Psychological Society.

  14. Facial Features: What Women Perceive as Attractive and What Men Consider Attractive.

    PubMed

    Muñoz-Reyes, José Antonio; Iglesias-Julios, Marta; Pita, Miguel; Turiegano, Enrique

    2015-01-01

    Attractiveness plays an important role in social exchange and in the ability to attract potential mates, especially for women. Several facial traits have been described as reliable indicators of attractiveness in women, but very few studies consider the influence of several measurements simultaneously. In addition, most studies consider just one of two assessments to directly measure attractiveness: either self-evaluation or men's ratings. We explored the relationship between these two estimators of attractiveness and a set of facial traits in a sample of 266 young Spanish women. These traits are: facial fluctuating asymmetry, facial averageness, facial sexual dimorphism, and facial maturity. We made use of the advantage of having recently developed methodologies that enabled us to measure these variables in real faces. We also controlled for three other widely used variables: age, body mass index and waist-to-hip ratio. The inclusion of many different variables allowed us to detect any possible interaction between the features described that could affect attractiveness perception. Our results show that facial fluctuating asymmetry is related both to self-perceived and male-rated attractiveness. Other facial traits are related only to one direct attractiveness measurement: facial averageness and facial maturity only affect men's ratings. Unmodified faces are closer to natural stimuli than are manipulated photographs, and therefore our results support the importance of employing unmodified faces to analyse the factors affecting attractiveness. We also discuss the relatively low equivalence between self-perceived and male-rated attractiveness and how various anthropometric traits are relevant to them in different ways. Finally, we highlight the need to perform integrated-variable studies to fully understand female attractiveness.

  15. Facial Features: What Women Perceive as Attractive and What Men Consider Attractive

    PubMed Central

    Muñoz-Reyes, José Antonio; Iglesias-Julios, Marta; Pita, Miguel; Turiegano, Enrique

    2015-01-01

    Attractiveness plays an important role in social exchange and in the ability to attract potential mates, especially for women. Several facial traits have been described as reliable indicators of attractiveness in women, but very few studies consider the influence of several measurements simultaneously. In addition, most studies consider just one of two assessments to directly measure attractiveness: either self-evaluation or men's ratings. We explored the relationship between these two estimators of attractiveness and a set of facial traits in a sample of 266 young Spanish women. These traits are: facial fluctuating asymmetry, facial averageness, facial sexual dimorphism, and facial maturity. We made use of the advantage of having recently developed methodologies that enabled us to measure these variables in real faces. We also controlled for three other widely used variables: age, body mass index and waist-to-hip ratio. The inclusion of many different variables allowed us to detect any possible interaction between the features described that could affect attractiveness perception. Our results show that facial fluctuating asymmetry is related both to self-perceived and male-rated attractiveness. Other facial traits are related only to one direct attractiveness measurement: facial averageness and facial maturity only affect men's ratings. Unmodified faces are closer to natural stimuli than are manipulated photographs, and therefore our results support the importance of employing unmodified faces to analyse the factors affecting attractiveness. We also discuss the relatively low equivalence between self-perceived and male-rated attractiveness and how various anthropometric traits are relevant to them in different ways. Finally, we highlight the need to perform integrated-variable studies to fully understand female attractiveness. PMID:26161954

  16. Theory of Mind, Emotion Recognition and Social Perception in Individuals at Clinical High Risk for Psychosis: findings from the NAPLS-2 cohort.

    PubMed

    Barbato, Mariapaola; Liu, Lu; Cadenhead, Kristin S; Cannon, Tyrone D; Cornblatt, Barbara A; McGlashan, Thomas H; Perkins, Diana O; Seidman, Larry J; Tsuang, Ming T; Walker, Elaine F; Woods, Scott W; Bearden, Carrie E; Mathalon, Daniel H; Heinssen, Robert; Addington, Jean

    2015-09-01

    Social cognition, the mental operations that underlie social interactions, is a major construct to investigate in schizophrenia. Impairments in social cognition are present before the onset of psychosis, and even in unaffected first-degree relatives, suggesting that social cognition may be a trait marker of the illness. In a large cohort of individuals at clinical high risk for psychosis (CHR) and healthy controls, three domains of social cognition (theory of mind, facial emotion recognition, and social perception) were assessed to clarify which domains are impaired in this population. Six hundred and seventy-five CHR individuals and 264 controls, who were part of the multi-site North American Prodrome Longitudinal Study, completed The Awareness of Social Inference Test, the Penn Emotion Recognition task, the Penn Emotion Differentiation task, and the Relationship Across Domains, measures of theory of mind, facial emotion recognition, and social perception, respectively. Social cognition was not related to positive and negative symptom severity, but was associated with age and IQ. CHR individuals demonstrated poorer performance on all measures of social cognition. However, after controlling for age and IQ, the group differences remained significant for measures of theory of mind and social perception, but not for facial emotion recognition. Theory of mind and social perception are impaired in individuals at CHR for psychosis. Age and IQ seem to play an important role in the emergence of deficits in facial affect recognition. Future studies should examine the stability of social cognition deficits over time and their role, if any, in the development of psychosis.

  17. Facial expression perception correlates with verbal working memory function in schizophrenia.

    PubMed

    Hagiya, Kumiko; Sumiyoshi, Tomiki; Kanie, Ayako; Pu, Shenghong; Kaneko, Koichi; Mogami, Tamiko; Oshima, Sachie; Niwa, Shin-ichi; Inagaki, Akiko; Ikebuchi, Emi; Kikuchi, Akiko; Yamasaki, Syudo; Iwata, Kazuhiko; Nakagome, Kazuyuki

    2015-12-01

    Facial emotion perception is considered to provide a measure of social cognition. Numerous studies have examined the perception of emotion in patients with schizophrenia, and the majority have reported an impaired ability to recognize facial emotions. We aimed to investigate the correlation between facial expression recognition and other domains of social cognition and neurocognition in Japanese patients with schizophrenia. Participants were 52 patients with schizophrenia and 53 normal controls with no history of psychiatric disease. All participants completed the Hinting Task and the Social Cognition Screening Questionnaire. The Brief Assessment of Cognition in Schizophrenia was administered only to the patients. Facial emotion perception measured by the Facial Emotion Selection Test (FEST) was compared between the patients and normal controls. Patients performed significantly worse on the FEST compared to normal control subjects. The FEST total score was significantly positively correlated with scores on the Brief Assessment of Cognition in Schizophrenia attention subscale, the Hinting Task, and the Social Cognition Screening Questionnaire Verbal Working Memory and Metacognition subscales. Stepwise multiple regression analysis revealed that verbal working memory function was positively related to facial emotion perception ability in patients with schizophrenia. These results support the notion that facial emotion perception and some types of working memory use common cognitive resources. Our findings may have implications for cognitive rehabilitation and related interventions in schizophrenia. © 2015 The Authors. Psychiatry and Clinical Neurosciences © 2015 Japanese Society of Psychiatry and Neurology.

  18. Subliminal perception of others' physical pain and pleasure.

    PubMed

    Chiesa, Patrizia Andrea; Liuzza, Marco Tullio; Acciarino, Adriano; Aglioti, Salvatore Maria

    2015-08-01

    Studies indicate that explicit and implicit processing of affectively charged stimuli may be reflected in specific behavioral markers and physiological signatures. This study investigated whether the pleasantness ratings of a neutral target were affected by subliminal perception of pleasant and painful facial expressions. Participants were presented with images depicting the faces of non-famous models being slapped (painful condition), caressed (pleasant condition) or touched (neutral condition) by the right hand of another individual. In particular, we combined the continuous flash suppression technique with the affective misattribution procedure (AMP) to explore subliminal empathic processing. Measures of pupil reactivity along with empathy traits were also collected. Results showed that participants rated the neutral target as less or more likeable, congruent with the painful or pleasant facial expression presented, respectively. Pupil dilation was associated both with the implicit attitudes (AMP score) and with empathic concern. Thus, the results provide behavioral and physiological evidence that state-related empathic reactivity can occur at an entirely subliminal level and that it is linked to autonomic responses and empathic traits.

  19. Psychocentricity and participant profiles: implications for lexical processing among multilinguals

    PubMed Central

    Libben, Gary; Curtiss, Kaitlin; Weber, Silke

    2014-01-01

    Lexical processing among bilinguals is often affected by complex patterns of individual experience. In this paper we discuss the psychocentric perspective on language representation and processing, which highlights the centrality of individual experience in psycholinguistic experimentation. We discuss applications to the investigation of lexical processing among multilinguals and explore the advantages of using high-density experiments with multilinguals. High-density experiments are designed to co-index measures of lexical perception and production, as well as participant profiles. We discuss the challenges associated with the characterization of participant profiles and present a new data visualization technique that we term Facial Profiles. This technique is based on Chernoff faces, developed over 40 years ago. The Facial Profile technique seeks to overcome some of the challenges associated with the use of Chernoff faces, while maintaining the core insight that recoding multivariate data as facial features can engage the human face recognition system and thus enhance our ability to detect and interpret patterns within multivariate datasets. We demonstrate that Facial Profiles can code participant characteristics in lexical processing studies by recoding variables such as reading ability, speaking ability, and listening ability into iconically related relative sizes of eye, mouth, and ear, respectively. The balance of ability in bilinguals can be captured by creating composite facial profiles or Janus Facial Profiles. We demonstrate the use of Facial Profiles and Janus Facial Profiles in the characterization of participant effects in the study of lexical perception and production. PMID:25071614
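
    For illustration, a minimal matplotlib sketch in the spirit of the Facial Profile idea described above (the shapes, positions, and scaling constants are illustrative assumptions, not the authors' implementation): reading, speaking, and listening scores, normalized to the 0-1 range, scale the eyes, mouth, and ears, respectively.

        import matplotlib.pyplot as plt
        from matplotlib.patches import Circle, Ellipse

        def facial_profile(ax, reading, speaking, listening):
            """Eye size tracks reading, mouth size tracks speaking, ear size tracks listening."""
            ax.add_patch(Circle((0.5, 0.5), 0.4, fill=False, lw=2))       # head outline
            for x in (0.35, 0.65):                                        # eyes
                ax.add_patch(Ellipse((x, 0.6), 0.06 + 0.10 * reading,
                                     0.04 + 0.06 * reading, color="black"))
            ax.add_patch(Ellipse((0.5, 0.35), 0.10 + 0.20 * speaking,     # mouth
                                 0.02 + 0.06 * speaking, color="black"))
            for x in (0.08, 0.92):                                        # ears
                ax.add_patch(Ellipse((x, 0.5), 0.04 + 0.08 * listening,
                                     0.08 + 0.12 * listening, fill=False, lw=2))
            ax.set_xlim(0, 1); ax.set_ylim(0, 1); ax.set_aspect("equal"); ax.axis("off")

        # Two hypothetical participants with different ability balances
        fig, axes = plt.subplots(1, 2, figsize=(6, 3))
        facial_profile(axes[0], reading=0.9, speaking=0.4, listening=0.7)
        facial_profile(axes[1], reading=0.3, speaking=0.8, listening=0.5)
        plt.show()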

  20. Drug effects on responses to emotional facial expressions: recent findings

    PubMed Central

    Miller, Melissa A.; Bershad, Anya K.; de Wit, Harriet

    2016-01-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally. PMID:26226144

  1. Effects of inverting contour and features on processing for static and dynamic face perception: an MEG study.

    PubMed

    Miki, Kensaku; Takeshima, Yasuyuki; Watanabe, Shoko; Honda, Yukiko; Kakigi, Ryusuke

    2011-04-06

    We investigated the effects of inverting facial contour (hair and chin) and features (eyes, nose and mouth) on processing for static and dynamic face perception using magnetoencephalography (MEG). We used apparent motion, in which the first stimulus (S1) was replaced by a second stimulus (S2) with no interstimulus interval and subjects perceived visual motion, and presented three conditions as follows: (1) U&U: Upright contour and Upright features, (2) U&I: Upright contour and Inverted features, and (3) I&I: Inverted contour and Inverted features. In static face perception (S1 onset), the peak latency of the fusiform area's activity, which was related to static face perception, was significantly longer for U&I and I&I than for U&U in the right hemisphere and for U&I than for U&U and I&I in the left. In dynamic face perception (S2 onset), the strength (moment) of the occipitotemporal area's activity, which was related to dynamic face perception, was significantly larger for I&I than for U&U and U&I in the right hemisphere, but not the left. These results can be summarized as follows: (1) in static face perception, the activity of the right fusiform area was more affected by the inversion of features while that of the left fusiform area was more affected by the disruption of the spatial relation between the contour and features, and (2) in dynamic face perception, the activity of the right occipitotemporal area was affected by the inversion of the facial contour. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Adult Perceptions of Positive and Negative Infant Emotional Expressions

    ERIC Educational Resources Information Center

    Bolzani Dinehart, Laura H.; Messinger, Daniel S.; Acosta, Susan I.; Cassel, Tricia; Ambadar, Zara; Cohn, Jeffrey

    2005-01-01

    Adults' perceptions provide information about the emotional meaning of infant facial expressions. This study asks whether similar facial movements influence adult perceptions of emotional intensity in both infant positive (smile) and negative (cry face) facial expressions. Ninety-five college students rated a series of naturally occurring and…

  3. Is right hemisphere decline in the perception of emotion a function of aging?

    PubMed

    McDowell, C L; Harrison, D W; Demaree, H A

    1994-11-01

    The hypothesis that the right cerebral hemisphere declines more quickly than the left cerebral hemisphere in the normal aging process was tested using accuracy and intensity measures in a facial recognition test and using response time and response bias measures in a tachistoscopic paradigm. Elderly and younger men and women (N = 60) participated in both experiments. Experiment 1 required facial affect identification and intensity ratings of 50 standardized photographs of 5 affective categories: Happy, Neutral, Sad, Angry, and Fearful. The elderly were significantly less accurate in identifying facial affective valence. This effect was found using negative and neutral expressions. Results for happy expressions, however, were consistent with the younger group. In Experiment 2, age differences in hemispheric asymmetry were evaluated using presentation of affective faces in each visual field. Following prolonged experience with the affective stimuli during Experiment 1, the elderly showed heightened cerebral asymmetry for facial affect processing compared to the younger group. Both groups showed a positive affective bias to neutral stimuli presented to the left hemisphere. Elderly and younger subjects scored significantly higher on Vocabulary and Block Design subtests of the WAIS-R, respectively. Overall, the findings suggest that the elderly have more difficulty processing negative affect, while their ability to process positive affect remains intact. The results lend only partial support to the right hemi-aging hypothesis.

  4. Facial attractiveness of skeletal class I and class II malocclusion as perceived by laypeople, patients and clinicians.

    PubMed

    Pace, Michela; Cioffi, Iacopo; D'antò, Vincenzo; Valletta, Alessandra; Valletta, Rosa; Amato, Massimo

    2018-06-01

    Physical attractiveness is dependent on facial appearance. The facial profile plays a crucial role in facial attractiveness and can be improved with orthodontic treatment. The aesthetic assessment of facial appearance may be influenced by the cultural background and education of the assessor and dependent upon the experience level of dental professionals. This study aimed to evaluate how the sagittal jaw relationship in Class I and Class II individuals affects facial attractiveness, and whether the assessor's professional education and background affect the perception of facial attractiveness. Facial silhouettes simulating mandibular retrusion, maxillary protrusion, mandibular retrusion combined with maxillary protrusion, bimaxillary protrusion and severe bimaxillary protrusion in Class I and Class II patients were assessed by five groups of people with different backgrounds and education levels (i.e., 23 expert orthodontists, 21 orthodontists, 15 maxillofacial surgeons, 19 orthodontic patients and 28 laypeople). Straight facial profiles were judged to be more attractive than convex profiles due to severe mandibular retrusion and to mandibular retrusion combined with maxillary protrusion (all P < 0.05). Convex profiles due to a slightly retruded position of the mandible were judged less attractive by clinicians than by patients and laypeople (all P < 0.05). Convex facial profiles are less attractive than Class I profiles. The assessment of facial attractiveness is dependent on the assessor's education and background. Laypeople and patients are considerably less sensitive to abnormal sagittal jaw relationships than orthodontists.

  5. The perception of positive and negative facial expressions by unilateral stroke patients.

    PubMed

    Abbott, Jacenta D; Wijeratne, Tissa; Hughes, Andrew; Perre, Diana; Lindell, Annukka K

    2014-04-01

    There remains conflict in the literature about the lateralisation of affective face perception. Some studies have reported a right hemisphere advantage irrespective of valence, whereas others have found a left hemisphere advantage for positive, and a right hemisphere advantage for negative, emotion. Differences in injury aetiology and chronicity, proportion of male participants, participant age, and the number of emotions used within a perception task may contribute to these contradictory findings. The present study therefore controlled and/or directly examined the influence of these possible moderators. Right brain-damaged (RBD; n=17), left brain-damaged (LBD; n=17), and healthy control (HC; n=34) participants completed two face perception tasks (identification and discrimination). No group differences in facial expression perception according to valence were found. Across emotions, the RBD group was less accurate than the HC group; however, RBD and LBD group performance did not differ. The lack of difference between the RBD and LBD groups indicates that both hemispheres are involved in positive and negative expression perception. The inclusion of older adults and the well-defined chronicity range of the brain-damaged participants may have moderated these findings. Participant sex and general face perception ability did not influence performance. Furthermore, while the RBD group was less accurate than the LBD group when the identification task tested two emotions, performance of the two groups was indistinguishable when the number of emotions increased (four or six). This suggests that task demand moderates a study's ability to find hemispheric differences in the perception of facial emotion. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Aspects of Facial Contrast Decrease with Age and Are Cues for Age Perception

    PubMed Central

    Porcheron, Aurélie; Mauger, Emmanuelle; Russell, Richard

    2013-01-01

    Age is a primary social dimension. We behave differently toward people as a function of how old we perceive them to be. Age perception relies on cues that are correlated with age, such as wrinkles. Here we report that aspects of facial contrast (the contrast between facial features and the surrounding skin) decreased with age in a large sample of adult Caucasian females. These same aspects of facial contrast were also significantly correlated with the perceived age of the faces. Individual faces were perceived as younger when these aspects of facial contrast were artificially increased, but older when these aspects of facial contrast were artificially decreased. These findings show that facial contrast plays a role in age perception, and that faces with greater facial contrast look younger. Because facial contrast is increased by typical cosmetics use, we infer that cosmetics function in part by making the face appear younger. PMID:23483959

  7. The Influence of Music on Facial Emotion Recognition in Children with Autism Spectrum Disorder and Neurotypical Children.

    PubMed

    Brown, Laura S

    2017-03-01

    Children with autism spectrum disorder (ASD) often struggle with social skills, including the ability to perceive emotions based on facial expressions. Research evidence suggests that many individuals with ASD can perceive emotion in music. Examining whether music can be used to enhance recognition of facial emotion by children with ASD would inform development of music therapy interventions. The purpose of this study was to investigate the influence of music with a strong emotional valence (happy; sad) on the ability of children with ASD to label emotions depicted in facial photographs, and on their response time. Thirty neurotypical children and 20 children with high-functioning ASD rated expressions of happy, neutral, and sad in 30 photographs under two music listening conditions (sad music; happy music). During each music listening condition, participants rated the 30 images using a 7-point scale that ranged from very sad to very happy. Response time data were also collected across both conditions. A significant two-way interaction revealed that participants' ratings of happy and neutral faces were unaffected by music conditions, but sad faces were perceived to be sadder with sad music than with happy music. Across both conditions, neurotypical children rated the happy faces as happier and the sad faces as sadder than did participants with ASD. Response times of the neurotypical children were consistently shorter than response times of the children with ASD; both groups took longer to rate sad faces than happy faces. Response times of neurotypical children were generally unaffected by the valence of the music condition; however, children with ASD took longer to respond when listening to sad music. Music appears to affect perceptions of emotion in children with ASD, and perceptions of sad facial expressions seem to be more affected by emotionally congruent background music than are perceptions of happy or neutral faces. © the American Music Therapy Association 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  8. Face adaptation aftereffects reveal anterior medial temporal cortex role in high level category representation.

    PubMed

    Furl, N; van Rijsbergen, N J; Treves, A; Dolan, R J

    2007-08-01

    Previous studies have shown reductions of the functional magnetic resonance imaging (fMRI) signal in response to repetition of specific visual stimuli. We examined how adaptation affects the neural responses associated with categorization behavior, using face adaptation aftereffects. Adaptation to a given facial category biases categorization towards non-adapted facial categories in response to presentation of ambiguous morphs. We explored a hypothesis, posed by recent psychophysical studies, that these adaptation-induced categorizations are mediated by activity in relatively advanced stages within the occipitotemporal visual processing stream. Replicating these studies, we find that adaptation to a facial expression heightens perception of non-adapted expressions. Using comparable behavioral methods, we also show that adaptation to a specific identity heightens perception of a second identity in morph faces. We show both expression and identity effects to be associated with heightened anterior medial temporal lobe activity, specifically when perceiving the non-adapted category. These regions, incorporating bilateral anterior ventral rhinal cortices, perirhinal cortex and left anterior hippocampus are regions previously implicated in high-level visual perception. These categorization effects were not evident in fusiform or occipital gyri, although activity in these regions was reduced to repeated faces. The findings suggest that adaptation-induced perception is mediated by activity in regions downstream to those showing reductions due to stimulus repetition.

  9. The effect of face patch microstimulation on perception of faces and objects.

    PubMed

    Moeller, Sebastian; Crapse, Trinity; Chang, Le; Tsao, Doris Y

    2017-05-01

    What is the range of stimuli encoded by face-selective regions of the brain? We asked how electrical microstimulation of face patches in macaque inferotemporal cortex affects perception of faces and objects. We found that microstimulation strongly distorted face percepts and that this effect depended on precise targeting to the center of face patches. While microstimulation had no effect on the percept of many non-face objects, it did affect the percept of some, including non-face objects whose shape is consistent with a face (for example, apples) as well as somewhat facelike abstract images (for example, cartoon houses). Microstimulation even perturbed the percept of certain objects that did not activate the stimulated face patch at all. Overall, these results indicate that representation of facial identity is localized to face patches, but activity in these patches can also affect perception of face-compatible non-face objects, including objects normally represented in other parts of inferotemporal cortex.

  10. Early and late temporo-spatial effects of contextual interference during perception of facial affect.

    PubMed

    Frühholz, Sascha; Fehr, Thorsten; Herrmann, Manfred

    2009-10-01

    Contextual features during recognition of facial affect are assumed to modulate the temporal course of emotional face processing. Here, we simultaneously presented colored backgrounds during valence categorizations of facial expressions. Subjects incidentally learned to perceive negative, neutral and positive expressions within a specific colored context. Subsequently, subjects made fast valence judgments while presented with the same face-color-combinations as in the first run (congruent trials) or with different face-color-combinations (incongruent trials). Incongruent trials induced significantly increased response latencies and significantly decreased performance accuracy. Contextual incongruent information during processing of neutral expressions modulated the P1 and the early posterior negativity (EPN) both localized in occipito-temporal areas. Contextual congruent information during emotional face perception revealed an emotion-related modulation of the P1 for positive expressions and of the N170 and the EPN for negative expressions. Highest amplitude of the N170 was found for negative expressions in a negatively associated context and the N170 amplitude varied with the amount of overall negative information. Incongruent trials with negative expressions elicited a parietal negativity which was localized to superior parietal cortex and which most likely represents a posterior manifestation of the N450 as an indicator of conflict processing. A sustained activation of the late LPP over parietal cortex for all incongruent trials might reflect enhanced engagement with facial expression during task conditions of contextual interference. In conclusion, whereas early components seem to be sensitive to the emotional valence of facial expression in specific contexts, late components seem to subserve interference resolution during emotional face processing.

  11. Beyond face value: does involuntary emotional anticipation shape the perception of dynamic facial expressions?

    PubMed

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent's mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically valid facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent 'emotional anticipation', i.e. the involuntary anticipation of the other's emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor's identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases, but in the opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic facial expressions, perceptual distortions occur that reflect 'emotional anticipation' (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding.

  12. Beyond Face Value: Does Involuntary Emotional Anticipation Shape the Perception of Dynamic Facial Expressions?

    PubMed Central

    Palumbo, Letizia; Jellema, Tjeerd

    2013-01-01

    Emotional facial expressions are immediate indicators of the affective dispositions of others. Recently it has been shown that early stages of social perception can already be influenced by (implicit) attributions made by the observer about the agent’s mental state and intentions. In the current study possible mechanisms underpinning distortions in the perception of dynamic, ecologically valid facial expressions were explored. In four experiments we examined to what extent basic perceptual processes such as contrast/context effects, adaptation and representational momentum underpinned the perceptual distortions, and to what extent ‘emotional anticipation’, i.e. the involuntary anticipation of the other’s emotional state of mind on the basis of the immediate perceptual history, might have played a role. Neutral facial expressions displayed at the end of short video-clips, in which an initial facial expression of joy or anger gradually morphed into a neutral expression, were misjudged as being slightly angry or slightly happy, respectively (Experiment 1). This response bias disappeared when the actor’s identity changed in the final neutral expression (Experiment 2). Videos depicting neutral-to-joy-to-neutral and neutral-to-anger-to-neutral sequences again produced biases, but in the opposite direction (Experiment 3). The bias survived insertion of a 400 ms blank (Experiment 4). These results suggested that the perceptual distortions were not caused by any of the low-level perceptual mechanisms (adaptation, representational momentum and contrast effects). We speculate that especially when presented with dynamic facial expressions, perceptual distortions occur that reflect ‘emotional anticipation’ (a low-level mindreading mechanism), which overrules low-level visual mechanisms. Underpinning neural mechanisms are discussed in relation to the current debate on action and emotion understanding. PMID:23409112

  13. We look like our names: The manifestation of name stereotypes in facial appearance.

    PubMed

    Zwebner, Yonat; Sellier, Anne-Laure; Rosenfeld, Nir; Goldenberg, Jacob; Mayo, Ruth

    2017-04-01

    Research demonstrates that facial appearance affects social perceptions. The current research investigates the reverse possibility: Can social perceptions influence facial appearance? We examine a social tag that is associated with us early in life: our given name. The hypothesis is that name stereotypes can be manifested in facial appearance, producing a face-name matching effect, whereby both a social perceiver and a computer are able to accurately match a person's name to his or her face. In 8 studies we demonstrate the existence of this effect, as participants examining an unfamiliar face accurately select the person's true name from a list of several names, significantly above chance level. We replicate the effect in 2 countries and find that it extends beyond the limits of socioeconomic cues. We also find the effect using a computer-based paradigm and 94,000 faces. In our exploration of the underlying mechanism, we show that existing name stereotypes produce the effect, as its occurrence is culture-dependent. A self-fulfilling prophecy seems to be at work, as initial evidence shows that facial appearance regions that are controlled by the individual (e.g., hairstyle) are sufficient to produce the effect, and socially using one's given name is necessary to generate the effect. Together, these studies suggest that facial appearance represents social expectations of how a person with a specific name should look. In this way a social tag may influence one's facial appearance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
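
    For illustration, a minimal sketch of testing whether face-name matching accuracy exceeds chance, assuming a hypothetical design with four name options per face (the counts below are illustrative, not the study's data):

        from scipy.stats import binomtest

        n_trials = 200          # hypothetical number of face-name matching trials
        n_correct = 72          # hypothetical number of correct name selections
        chance = 1 / 4          # four name options per face, so 25% chance level

        result = binomtest(n_correct, n_trials, chance, alternative="greater")
        print(f"observed accuracy = {n_correct / n_trials:.2f}, p = {result.pvalue:.4f}")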

  14. Affect of the unconscious: visually suppressed angry faces modulate our decisions.

    PubMed

    Almeida, Jorge; Pajtas, Petra E; Mahon, Bradford Z; Nakayama, Ken; Caramazza, Alfonso

    2013-03-01

    Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item (a Chinese character) that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala.

  15. I feel your voice. Cultural differences in the multisensory perception of emotion.

    PubMed

    Tanaka, Akihiro; Koizumi, Ai; Imai, Hisato; Hiramatsu, Saori; Hiramoto, Eriko; de Gelder, Beatrice

    2010-09-01

    Cultural differences in emotion perception have been reported mainly for facial expressions and to a lesser extent for vocal expressions. However, the way in which the perceiver combines auditory and visual cues may itself be subject to cultural variability. Our study investigated cultural differences between Japanese and Dutch participants in the multisensory perception of emotion. A face and a voice, expressing either congruent or incongruent emotions, were presented on each trial. Participants were instructed to judge the emotion expressed in one of the two sources. The effect of to-be-ignored voice information on facial judgments was larger in Japanese than in Dutch participants, whereas the effect of to-be-ignored face information on vocal judgments was smaller in Japanese than in Dutch participants. This result indicates that Japanese people are more attuned than Dutch people to vocal processing in the multisensory perception of emotion. Our findings provide the first evidence that multisensory integration of affective information is modulated by perceivers' cultural background.

  16. Reduced white matter integrity and facial emotion perception in never-medicated patients with first-episode schizophrenia: A diffusion tensor imaging study.

    PubMed

    Zhao, Xiaoxin; Sui, Yuxiu; Yao, Jingjing; Lv, Yiding; Zhang, Xinyue; Jin, Zhuma; Chen, Lijun; Zhang, Xiangrong

    2017-07-03

    Facial emotion perception is impaired in schizophrenia. Although the pathology of schizophrenia is thought to involve abnormality in white matter (WM), few studies have examined the correlation between facial emotion perception and WM abnormalities in never-medicated patients with first-episode schizophrenia. The present study tested associations between facial emotion perception and WM integrity in order to investigate the neural basis of impaired facial emotion perception in schizophrenia. Sixty-three patients with schizophrenia and thirty control subjects underwent facial emotion categorization (FEC). The FEC data were fitted with a logistic function, and the resulting shift point and slope served as outcome measures in independent-samples t tests. Severity of symptoms was measured using a five-factor model of the Positive and Negative Syndrome Scale (PANSS). Voxelwise group comparison of WM fractional anisotropy (FA) was performed using tract-based spatial statistics (TBSS). The correlation between impaired facial emotion perception and FA reduction was examined in patients using simple regression analysis within brain areas that showed a significant FA reduction in patients compared with controls. The same correlation analysis was also performed for control subjects in the whole brain. The patients with schizophrenia showed a higher shift point and a steeper slope than control subjects in FEC. The patients showed a significant FA reduction in left deep WM in the parietal, temporal and occipital lobes, a small portion of the corpus callosum (CC), and the corona radiata. In voxelwise correlation analysis, we found that facial emotion perception significantly correlated with reduced FA in various WM regions, including the left forceps major (FM), inferior longitudinal fasciculus (ILF), inferior fronto-occipital fasciculus (IFOF), and the left splenium of the CC. The correlation analyses in healthy controls revealed no significant correlation of FA with FEC performance. These results suggest that disrupted WM integrity in these regions constitutes a potential neural basis for the facial emotion perception impairments in schizophrenia. Copyright © 2017 Elsevier Inc. All rights reserved.
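
    For illustration, a minimal sketch of fitting a logistic psychometric function to facial emotion categorization data, in the spirit of the shift-point and slope analysis described above (the morph intensities, response proportions, and parameter names are illustrative assumptions):

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(x, shift, slope):
            """Probability of categorizing a morph of a given intensity as emotional."""
            return 1.0 / (1.0 + np.exp(-slope * (x - shift)))

        intensity = np.array([0, 20, 40, 50, 60, 80, 100], dtype=float)   # % emotional morph
        p_emotion = np.array([0.02, 0.10, 0.35, 0.55, 0.80, 0.95, 0.99])  # proportion "emotional" responses

        (shift, slope), _ = curve_fit(logistic, intensity, p_emotion, p0=[50.0, 0.1])
        print(f"shift point = {shift:.1f}% intensity, slope = {slope:.3f}")

    Per-participant shift and slope estimates obtained this way can then be compared between groups with independent-samples t tests (e.g., scipy.stats.ttest_ind).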

  17. Facial EMG responses to emotional expressions are related to emotion perception ability.

    PubMed

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  18. Facial EMG Responses to Emotional Expressions Are Related to Emotion Perception Ability

    PubMed Central

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding the emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses, recorded with electromyography (EMG), in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective. PMID:24489647

  19. Assessment of perception of morphed facial expressions using the Emotion Recognition Task: normative data from healthy participants aged 8-75.

    PubMed

    Kessels, Roy P C; Montagne, Barbara; Hendriks, Angelique W; Perrett, David I; de Haan, Edward H F

    2014-03-01

    The ability to recognize and label emotional facial expressions is an important aspect of social cognition. However, existing paradigms to examine this ability present only static facial expressions, suffer from ceiling effects or have limited or no norms. A computerized test, the Emotion Recognition Task (ERT), was developed to overcome these difficulties. In this study, we examined the effects of age, sex, and intellectual ability on emotion perception using the ERT. In this test, emotional facial expressions are presented as morphs gradually expressing one of the six basic emotions from neutral to four levels of intensity (40%, 60%, 80%, and 100%). The task was administered in 373 healthy participants aged 8-75. In children aged 8-17, only small developmental effects were found for the emotions anger and happiness, in contrast to adults who showed age-related decline on anger, fear, happiness, and sadness. Sex differences were present predominantly in the adult participants. IQ only minimally affected the perception of disgust in the children, while years of education were correlated with all emotions but surprise and disgust in the adult participants. A regression-based approach was adopted to present age- and education- or IQ-adjusted normative data for use in clinical practice. Previous studies using the ERT have demonstrated selective impairments on specific emotions in a variety of psychiatric, neurologic, or neurodegenerative patient groups, making the ERT a valuable addition to existing paradigms for the assessment of emotion perception. © 2013 The British Psychological Society.
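
    For illustration, a minimal sketch of a regression-based normative approach of the kind described above (the simulated data and coefficients are illustrative assumptions, not the ERT norms): scores from a healthy sample are regressed on age and education, and a patient's raw score is converted to a demographically adjusted z-score.

        import numpy as np

        rng = np.random.default_rng(0)
        age = rng.uniform(18, 75, 200)
        education = rng.integers(8, 20, 200).astype(float)
        score = 90 - 0.2 * age + 0.5 * education + rng.normal(0, 5, 200)   # simulated accuracy scores

        # Ordinary least squares fit of score ~ intercept + age + education
        X = np.column_stack([np.ones_like(age), age, education])
        beta, *_ = np.linalg.lstsq(X, score, rcond=None)
        residual_sd = np.std(score - X @ beta, ddof=X.shape[1])

        def adjusted_z(raw_score, patient_age, patient_education):
            """Observed minus demographically predicted score, scaled by the residual SD."""
            predicted = beta @ np.array([1.0, patient_age, patient_education])
            return (raw_score - predicted) / residual_sd

        print(adjusted_z(raw_score=75, patient_age=70, patient_education=10))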

  20. Alexithymia, emotion perception, and social assertiveness in adult women with Noonan and Turner syndromes.

    PubMed

    Roelofs, Renée L; Wingbermühle, Ellen; Freriks, Kim; Verhaak, Chris M; Kessels, Roy P C; Egger, Jos I M

    2015-04-01

    Noonan syndrome (NS) and Turner syndrome (TS) are associated with cognitive problems and difficulties in affective information processing. While both phenotypes include short stature, facial dysmorphisms, and a webbed neck, genetic etiology and neuropsychological phenotype differ significantly. The present study examines putative differences in affective information processing and social assertiveness between adult women with NS and TS. Twenty-six women with NS, 40 women with TS, and 40 female controls were matched on age and intelligence, and subsequently compared on (1) alexithymia, measured by the Bermond-Vorst Alexithymia Questionnaire, (2) emotion perception, evaluated by the Emotion Recognition Task, and (3) social assertiveness and social discomfort, assessed by the Scale for Interpersonal Behavior. Women with TS showed higher levels of alexithymia than women with NS and controls (P-values < 0.001), whereas women with NS had more trouble recognizing angry facial expressions in comparison with controls (P = 0.01). No significant group differences were found for the frequency of social assertiveness and the level of social discomfort. Women with NS and TS demonstrated different patterns of impairment in affective information processing, in terms of alexithymia and emotion perception. The present findings suggest neuropsychological phenotyping to be helpful for the diagnosis of specific cognitive-affective deficits in genetic syndromes, for the enhancement of genetic counseling, and for the development of personalized treatment plans. © 2015 Wiley Periodicals, Inc.

  1. What is adapted in face adaptation? The neural representations of expression in the human visual system.

    PubMed

    Fox, Christopher J; Barton, Jason J S

    2007-01-05

    The neural representation of facial expression within the human visual system is not well defined. Using an adaptation paradigm, we examined aftereffects on expression perception produced by various stimuli. Adapting to a face, which was used to create morphs between two expressions, substantially biased expression perception within the morphed faces away from the adapting expression. This adaptation was not based on low-level image properties, as a different image of the same person displaying that expression produced equally robust aftereffects. Smaller but significant aftereffects were generated by images of different individuals, irrespective of gender. Non-face visual, auditory, or verbal representations of emotion did not generate significant aftereffects. These results suggest that adaptation affects at least two neural representations of expression: one specific to the individual (not the image), and one that represents expression across different facial identities. The identity-independent aftereffect suggests the existence of a 'visual semantic' for facial expression in the human visual system.

  2. Hair Color and Skin Color Together Influence Perceptions of Age, Health, and Attractiveness in Lightly-Pigmented, Young Women.

    PubMed

    Fink, Bernhard; Liebner, Katharina; Müller, Ann-Kathrin; Hirn, Thomas; McKelvey, Graham; Lankhof, John

    2018-05-17

    Research documents that even subtle changes in visible skin condition affect perceptions of age, health, and attractiveness. There is evidence that hair quality also affects the assessment of physical appearance, as variations in hair diameter, hair density, and hair style have systematic effects on perception. Here, we consider combined effects of hair color and skin color on the perception of female physical appearance. In two experiments, we digitally manipulated facial skin color of lightly-pigmented, young women, both between-subjects (Experiment 1) and within-subjects (Experiment 2), and investigated possible interactions with hair color in regard to age, health, and attractiveness perception. In both experiments, we detected hair color and skin color interaction effects on men's and women's assessments. For between-subjects comparisons, participants with lighter hair color were judged to be younger than those with darker shades; this effect was more pronounced in women with light skin color. No such effect was observed for within-subjects variation in skin color. Both experiments showed that smaller perceived contrast between hair color and skin color resulted in more positive responses. We conclude that hair color and facial skin color together have an effect on perceptions of female age, health, and attractiveness in young women, and we discuss these findings with reference to the literature on the role of hair and skin in the assessment of female physical appearance. This article is protected by copyright. All rights reserved.

  3. Changing perception: facial reanimation surgery improves attractiveness and decreases negative facial perception.

    PubMed

    Dey, Jacob K; Ishii, Masaru; Boahene, Kofi D O; Byrne, Patrick J; Ishii, Lisa E

    2014-01-01

    Determine the effect of facial reanimation surgery on observer-graded attractiveness and negative facial perception of patients with facial paralysis. Randomized controlled experiment. Ninety observers viewed images of paralyzed faces, smiling and in repose, before and after reanimation surgery, as well as normal comparison faces. Observers rated the attractiveness of each face and characterized the paralyzed faces by rating severity, disfigured/bothersome, and importance to repair. Iterated factor analysis indicated these highly correlated variables measure a common domain, so they were combined to create the disfigured, important to repair, bothersome, severity (DIBS) factor score. Mixed effects linear regression determined the effect of facial reanimation surgery on attractiveness and DIBS score. Facial paralysis induces an attractiveness penalty of 2.51 on a 10-point scale for faces in repose and 3.38 for smiling faces. Mixed effects linear regression showed that reanimation surgery improved attractiveness for faces both in repose and smiling by 0.84 (95% confidence interval [CI]: 0.67, 1.01) and 1.24 (95% CI: 1.07, 1.42) respectively. Planned hypothesis tests confirmed statistically significant differences in attractiveness ratings between postoperative and normal faces, indicating attractiveness was not completely normalized. Regression analysis also showed that reanimation surgery decreased DIBS by 0.807 (95% CI: 0.704, 0.911) for faces in repose and 0.989 (95% CI: 0.886, 1.093), an entire standard deviation, for smiling faces. Facial reanimation surgery increases attractiveness and decreases negative facial perception of patients with facial paralysis. These data emphasize the need to optimize reanimation surgery to restore not only function, but also symmetry and cosmesis to improve facial perception and patient quality of life. © 2013 The American Laryngological, Rhinological and Otological Society, Inc.
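
    For illustration, a minimal statsmodels sketch of a mixed effects model of this kind (the data are simulated and the variable names are illustrative assumptions, not the study's dataset): attractiveness ratings are modeled as a function of surgical timepoint and facial expression, with a random intercept per observer.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        rows = []
        for obs in [f"o{i}" for i in range(30)]:
            bias = rng.normal(0, 0.3)                         # rater-specific leniency
            for timepoint in ("pre", "post"):
                for expression in ("repose", "smiling"):
                    rating = (4.0 + (1.0 if timepoint == "post" else 0.0)
                              + (0.3 if expression == "smiling" else 0.0)
                              + bias + rng.normal(0, 0.5))
                    rows.append((obs, timepoint, expression, rating))
        ratings = pd.DataFrame(rows, columns=["observer", "timepoint", "expression", "attractiveness"])

        # Fixed effects of timepoint and expression; random intercept per observer
        model = smf.mixedlm("attractiveness ~ timepoint * expression",
                            data=ratings, groups=ratings["observer"])
        print(model.fit().summary())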

  4. The prevalence of visual hallucinations in non-affective psychosis, and the role of perception and attention.

    PubMed

    van Ommen, M M; van Beilen, M; Cornelissen, F W; Smid, H G O M; Knegtering, H; Aleman, A; van Laar, T

    2016-06-01

    Little is known about visual hallucinations (VH) in psychosis. We investigated the prevalence and the role of bottom-up and top-down processing in VH. The prevailing view is that VH are probably related to altered top-down processing, rather than to distorted bottom-up processing. Conversely, VH in Parkinson's disease are associated with impaired visual perception and attention, as proposed by the Perception and Attention Deficit (PAD) model. Auditory hallucinations (AH) in psychosis, however, are thought to be related to increased attention. Our retrospective database study included 1119 patients with non-affective psychosis and 586 controls. The Community Assessment of Psychic Experiences established the VH rate. Scores on visual perception tests [Degraded Facial Affect Recognition (DFAR), Benton Facial Recognition Task] and attention tests [Response Set-shifting Task, Continuous Performance Test-HQ (CPT-HQ)] were compared between 75 VH patients, 706 non-VH patients and 485 non-VH controls. The lifetime VH rate was 37%. The patient groups performed similarly on cognitive tasks; both groups showed worse perception (DFAR) than controls. Non-VH patients showed worse attention (CPT-HQ) than controls, whereas VH patients did not perform differently. We did not find significant VH-related impairments in bottom-up processing or direct top-down alterations. However, the results suggest a relatively spared attentional performance in VH patients, whereas face perception and processing speed were equally impaired in both patient groups relative to controls. This would match better with the increased attention hypothesis than with the PAD model. Our finding that VH frequently co-occur with AH may support an increased attention-induced 'hallucination proneness'.

  5. Humor and laughter in patients with cerebellar degeneration.

    PubMed

    Frank, B; Propson, B; Göricke, S; Jacobi, H; Wild, B; Timmann, D

    2012-06-01

    Humor is a complex behavior which includes cognitive, affective and motor responses. Based on observations of affective changes in patients with cerebellar lesions, the cerebellum may support cerebral and brainstem areas involved in understanding and appreciation of humorous stimuli and expression of laughter. The aim of the present study was to examine if humor appreciation, perception of humorous stimuli, and the succeeding facial reaction differ between patients with cerebellar degeneration and healthy controls. Twenty-three adults with pure cerebellar degeneration were compared with 23 age-, gender-, and education-matched healthy control subjects. No significant difference in humor appreciation and perception of humorous stimuli could be found between groups using the 3 Witz-Dimensionen Test, a validated test asking for funniness and aversiveness of jokes and cartoons. Furthermore, while observing jokes, humorous cartoons, and video sketches, facial expressions of subjects were videotaped and afterwards analysed using the Facial Action Coding System. Using depression as a covariate, the number, and to a lesser degree, the duration of facial expressions during laughter were reduced in cerebellar patients compared to healthy controls. In sum, appreciation of humor appears to be largely preserved in patients with chronic cerebellar degeneration. Cerebellar circuits may contribute to the expression of laughter. Findings add to the literature that non-motor disorders in patients with chronic cerebellar disease are generally mild, but do not exclude that more marked disorders may show up in acute cerebellar disease and/or in more specific tests of humor appreciation.
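
    The group comparison above adjusts facial-expression counts for depression as a covariate. A minimal ANCOVA-style sketch of that kind of adjustment is shown below; the synthetic data, column names, and effect sizes are illustrative assumptions, not the study's values.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    group = np.repeat(["patient", "control"], 23)                        # 23 per group, as above
    depression = rng.normal(loc=np.where(group == "patient", 6, 4), scale=2.0)
    n_expressions = rng.poisson(lam=np.where(group == "patient", 4, 7))  # expressions during laughter
    df = pd.DataFrame({"group": group, "depression": depression, "n_expressions": n_expressions})

    # Group difference in expression counts, adjusted for depression (ANCOVA-style OLS).
    ancova = smf.ols("n_expressions ~ C(group) + depression", data=df).fit()
    print(ancova.summary())
    ```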

  6. Holistic processing of static and moving faces.

    PubMed

    Zhao, Mintao; Bülthoff, Isabelle

    2017-07-01

    Humans' face ability develops and matures with extensive experience in perceiving, recognizing, and interacting with faces that move most of the time. However, how facial movements affect one core aspect of face ability, holistic face processing, remains unclear. Here we investigated the influence of rigid facial motion on holistic and part-based face processing by manipulating the presence of facial motion during study and at test in a composite face task. The results showed that rigidly moving faces were processed as holistically as static faces (Experiment 1). Holistic processing of moving faces persisted whether facial motion was presented during study, at test, or both (Experiment 2). Moreover, when faces were inverted to eliminate the contributions of both an upright face template and observers' expertise with upright faces, rigid facial motion facilitated holistic face processing (Experiment 3). Thus, holistic processing represents a general principle of face perception that applies to both static and dynamic faces, rather than being limited to static faces. These results support an emerging view that both perceiver-based and face-based factors contribute to holistic face processing, and they offer new insights on what underlies holistic face processing, how the sources of information supporting holistic face processing interact with each other, and why facial motion may affect face recognition and holistic face processing differently. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Emotion perception, but not affect perception, is impaired with semantic memory loss.

    PubMed

    Lindquist, Kristen A; Gendron, Maria; Barrett, Lisa Feldman; Dickerson, Bradford C

    2014-04-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others' faces is inborn, prelinguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this article, we report findings from 3 patients with semantic dementia that cannot be explained by this "basic emotion" view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear, and sadness. These findings have important consequences for understanding the processes supporting emotion perception.

  8. Emotion perception, but not affect perception, is impaired with semantic memory loss

    PubMed Central

    Lindquist, Kristen A.; Gendron, Maria; Feldman Barrett, Lisa; Dickerson, Bradford C.

    2014-01-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others’ faces is inborn, pre-linguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this paper, we report findings from three patients with semantic dementia that cannot be explained by this “basic emotion” view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear and sadness. These findings have important consequences for understanding the processes supporting emotion perception. PMID:24512242

  9. Brain Response to a Humanoid Robot in Areas Implicated in the Perception of Human Emotional Gestures

    PubMed Central

    Chaminade, Thierry; Zecca, Massimiliano; Blakemore, Sarah-Jayne; Takanishi, Atsuo; Frith, Chris D.; Micera, Silvestro; Dario, Paolo; Rizzolatti, Giacomo; Gallese, Vittorio; Umiltà, Maria Alessandra

    2010-01-01

    Background The humanoid robot WE4-RII was designed to express human emotions in order to improve human-robot interaction. We can read the emotions depicted in its gestures, yet might utilize different neural processes than those used for reading the emotions in human agents. Methodology Here, fMRI was used to assess how brain areas activated by the perception of human basic emotions (facial expression of Anger, Joy, Disgust) and silent speech respond to a humanoid robot impersonating the same emotions, while participants were instructed to attend either to the emotion or to the motion depicted. Principal Findings Increased responses to robot compared to human stimuli in the occipital and posterior temporal cortices suggest additional visual processing when perceiving a mechanical anthropomorphic agent. In contrast, activity in cortical areas endowed with mirror properties, like left Broca's area for the perception of speech, and in the processing of emotions like the left anterior insula for the perception of disgust and the orbitofrontal cortex for the perception of anger, is reduced for robot stimuli, suggesting lesser resonance with the mechanical agent. Finally, instructions to explicitly attend to the emotion significantly increased response to robot, but not human facial expressions in the anterior part of the left inferior frontal gyrus, a neural marker of motor resonance. Conclusions Motor resonance towards a humanoid robot, but not a human, display of facial emotion is increased when attention is directed towards judging emotions. Significance Artificial agents can be used to assess how factors like anthropomorphism affect neural response to the perception of human actions. PMID:20657777

  10. Effects of facial color on the subliminal processing of fearful faces.

    PubMed

    Nakajima, K; Minami, T; Nakauchi, S

    2015-12-03

    Recent studies have suggested that both configural information, such as face shape, and surface information are important for face perception. In particular, facial color is sufficiently suggestive of emotional states, as in the phrases: "flushed with anger" and "pale with fear." However, few studies have examined the relationship between facial color and emotional expression. On the other hand, event-related potential (ERP) studies have shown that emotional expressions, such as fear, are processed unconsciously. In this study, we examined how facial color modulated the supraliminal and subliminal processing of fearful faces. We recorded electroencephalograms while participants performed a facial emotion identification task involving masked target faces exhibiting facial expressions (fearful or neutral) and colors (natural or bluish). The results indicated that there was a significant interaction between facial expression and color for the latency of the N170 component. Subsequent analyses revealed that the bluish-colored faces increased the latency effect of facial expressions compared to the natural-colored faces, indicating that the bluish color modulated the processing of fearful expressions. We conclude that the unconscious processing of fearful faces is affected by facial color. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  11. Judgment of Nasolabial Esthetics in Cleft Lip and Palate Is Not Influenced by Overall Facial Attractiveness.

    PubMed

    Kocher, Katharina; Kowalski, Piotr; Kolokitha, Olga-Elpis; Katsaros, Christos; Fudalej, Piotr S

    2016-05-01

    To determine whether judgment of nasolabial esthetics in cleft lip and palate (CLP) is influenced by overall facial attractiveness. Experimental study. University of Bern, Switzerland. Seventy-two fused images (36 of boys, 36 of girls) were constructed. Each image comprised (1) the nasolabial region of a treated child with complete unilateral CLP (UCLP) and (2) the external facial features, i.e., the face with masked nasolabial region, of a noncleft child. Photographs of the nasolabial region of six boys and six girls with UCLP representing a wide range of esthetic outcomes, i.e., from very good to very poor appearance, were randomly chosen from a sample of 60 consecutively treated patients in whom nasolabial esthetics had been rated in a previous study. Photographs of external facial features of six boys and six girls without UCLP with various esthetics were randomly selected from patients' files. Eight lay raters evaluated the fused images using a 100-mm visual analogue scale. Method reliability was assessed by reevaluation of fused images after >1 month. A regression model was used to analyze which elements of facial esthetics influenced the perception of nasolabial appearance. Method reliability was good. A regression analysis demonstrated that only the appearance of the nasolabial area affected the esthetic scores of fused images (coefficient = -11.44; P < .001; R² = 0.464). The appearance of the external facial features did not influence perceptions of fused images. Cropping facial images for assessment of nasolabial appearance in CLP seems unnecessary. Instead, esthetic evaluation can be performed on images of full faces.

  12. Looking Like a Leader–Facial Shape Predicts Perceived Height and Leadership Ability

    PubMed Central

    Re, Daniel E.; Hunter, David W.; Coetzee, Vinet; Tiddeman, Bernard P.; Xiao, Dengke; DeBruine, Lisa M.; Jones, Benedict C.; Perrett, David I.

    2013-01-01

    Judgments of leadership ability from face images predict the outcomes of actual political elections and are correlated with leadership success in the corporate world. The specific facial cues that people use to judge leadership remain unclear, however. Physical height is also associated with political and organizational success, raising the possibility that facial cues of height contribute to leadership perceptions. Consequently, we assessed whether cues to height exist in the face and, if so, whether they are associated with perception of leadership ability. We found that facial cues to perceived height had a strong relationship with perceived leadership ability. Furthermore, when allowed to manually manipulate faces, participants increased facial cues associated with perceived height in order to maximize leadership perception. A morphometric analysis of face shape revealed that structural facial masculinity was not responsible for the relationship between perceived height and perceived leadership ability. Given the prominence of facial appearance in making social judgments, facial cues to perceived height may have a significant influence on leadership selection. PMID:24324651

  13. Parallel Processing in Face Perception

    ERIC Educational Resources Information Center

    Martens, Ulla; Leuthold, Hartmut; Schweinberger, Stefan R.

    2010-01-01

    The authors examined face perception models with regard to the functional and temporal organization of facial identity and expression analysis. Participants performed a manual 2-choice go/no-go task to classify faces, where response hand depended on facial familiarity (famous vs. unfamiliar) and response execution depended on facial expression…

  14. Do Valenced Odors and Trait Body Odor Disgust Affect Evaluation of Emotion in Dynamic Faces?

    PubMed

    Syrjänen, Elmeri; Liuzza, Marco Tullio; Fischer, Håkan; Olofsson, Jonas K

    2017-12-01

    Disgust is a core emotion evolved to detect and avoid the ingestion of poisonous food as well as the contact with pathogens and other harmful agents. Previous research has shown that multisensory presentation of olfactory and visual information may strengthen the processing of disgust-relevant information. However, it is not known whether these findings extend to dynamic facial stimuli that change from neutral to emotionally expressive, or if individual differences in trait body odor disgust may influence the processing of disgust-related information. In this preregistered study, we tested whether a classification of dynamic facial expressions as happy or disgusted, and an emotional evaluation of these facial expressions, would be affected by individual differences in body odor disgust sensitivity, and by exposure to a sweat-like, negatively valenced odor (valeric acid), as compared with a soap-like, positively valenced odor (lilac essence) or a no-odor control. Using Bayesian hypothesis testing, we found evidence that odors do not affect recognition of emotion in dynamic faces even when body odor disgust sensitivity was used as moderator. However, an exploratory analysis suggested that an unpleasant odor context may cause faster RTs for faces, independent of their emotional expression. Our results further our understanding of the scope and limits of odor effects on the perception of facial affect and suggest that further studies should focus on reproducibility, specifying experimental circumstances where odor effects on facial expressions may be present versus absent.
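
    The study above reports Bayesian hypothesis tests of whether odour context affects emotion ratings. One simple, commonly used approximation (not necessarily the authors' procedure) derives a Bayes factor from the BIC difference between a null and an alternative regression model, BF01 ≈ exp((BIC_alt - BIC_null) / 2); the sketch below applies it to synthetic ratings with no built-in odour effect.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({
        "odor": rng.choice(["valeric", "lilac", "none"], size=n),
        "expression": rng.choice(["happy", "disgusted"], size=n),
    })
    # Ratings depend on the expression only; no odour effect is built in.
    df["rating"] = 5 + 2 * (df["expression"] == "happy") + rng.normal(scale=1.0, size=n)

    null_fit = smf.ols("rating ~ expression", data=df).fit()          # H0: no odour effect
    alt_fit = smf.ols("rating ~ expression + odor", data=df).fit()    # H1: odour effect

    # BIC approximation to the Bayes factor (Wagenmakers, 2007); BF01 > 1 favours the null.
    bf01 = np.exp((alt_fit.bic - null_fit.bic) / 2.0)
    print(f"BF01 (evidence for no odour effect): {bf01:.2f}")
    ```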

  15. Affect of the unconscious: Visually suppressed angry faces modulate our decisions

    PubMed Central

    Pajtas, Petra E.; Mahon, Bradford Z.; Nakayama, Ken; Caramazza, Alfonso

    2016-01-01

    Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item—a Chinese character—that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala. PMID:23224765

  16. Facial profile preferences, self-awareness and perception among groups of people in the United Arab Emirates

    PubMed Central

    Al Taki, Amjad; Guidoum, Amina

    2014-01-01

    Objectives: The objective of this study is to assess the differences in facial profile preference among different groups of people in the United Arab Emirates. Facial profile self-awareness among the different groups was also evaluated. Materials and Methods: A total sample of 222 participants (mean [standard deviation] age = 25.71 [8.3] years; almost 80% of Arab origin; 55% male) consisted of 60 laypersons, 60 dental students, 60 general practitioners, 16 oral surgeons, and 26 orthodontists. Facial profile photographs of a male and female adult with straight profiles and a Class I skeletal relationship were used as a baseline template. Computerized photographic image modification was carried out on the templates to obtain seven different facial profile silhouettes for each gender. To assess differences in facial profile perception, participants were asked to rank the profiles of each gender on a scale from most to least attractive (1 [highest score] to 7 [lowest score]). Awareness of and satisfaction with the facial appearance on a profile view were assessed using questionnaires completed by the non-expert groups. Results: The straight facial profile was perceived to be highly attractive by all five groups. The least attractive profiles were the bimaxillary protrusion and the mandibular retrusion for the male and the female profiles, respectively. Lip protrusion was more esthetically acceptable in females. Significant differences in perception existed among groups. The female profile esthetic perception was highly correlated between the expert groups (P > 0.05). Overall agreement between the non-expert group's perceptions of their own profiles and evaluation by the expert orthodontist was 51% (κ = 0.089). Candidates who perceived themselves as having a Class III facial profile were the least satisfied with their profile. Conclusions: Dental professionals, dental students, and laypersons showed similar trends in perception of female and male esthetic preference. Laypersons were more tolerant of profiles with bimaxillary retrusion. The expert group's esthetic perception was highly correlated only for the female profiles. Most of the non-experts were unable to correctly identify their facial profile. PMID:24987664
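
    The agreement figure above (κ = 0.089) is a chance-corrected agreement statistic. A minimal sketch of computing Cohen's kappa between self-assessed and expert-assigned profile classes is given below; the label data are made-up examples, not the study's ratings.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Made-up example labels: self-assessed vs. expert-assigned profile class.
    self_assessed = ["I", "II", "III", "I", "II", "I", "III", "I"]
    expert_rating = ["I", "I", "III", "II", "II", "I", "I", "II"]

    kappa = cohen_kappa_score(self_assessed, expert_rating)   # chance-corrected agreement
    print(f"Cohen's kappa: {kappa:.3f}")
    ```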

  17. Children's Representations of Facial Expression and Identity: Identity-Contingent Expression Aftereffects

    ERIC Educational Resources Information Center

    Vida, Mark D.; Mondloch, Catherine J.

    2009-01-01

    This investigation used adaptation aftereffects to examine developmental changes in the perception of facial expressions. Previous studies have shown that adults' perceptions of ambiguous facial expressions are biased following adaptation to intense expressions. These expression aftereffects are strong when the adapting and probe expressions share…

  18. Emotional expressions beyond facial muscle actions. A call for studying autonomic signals and their impact on social perception

    PubMed Central

    Kret, Mariska E.

    2015-01-01

    Humans are well adapted to quickly recognize and adequately respond to another’s emotions. Different theories propose that mimicry of emotional expressions (facial or otherwise) mechanistically underlies, or at least facilitates, these swift adaptive reactions. When people unconsciously mimic their interaction partner’s expressions of emotion, they come to feel reflections of those companions’ emotions, which in turn influence the observer’s own emotional and empathic behavior. The majority of research has focused on facial actions as expressions of emotion. However, the fact that emotions are not just expressed by facial muscles alone is often still ignored in emotion perception research. In this article, I therefore argue for a broader exploration of emotion signals from sources beyond the face muscles that are more automatic and difficult to control. Specifically, I will focus on the perception of implicit sources such as gaze and tears and autonomic responses such as pupil dilation, eye blinks, and blushing, which are subtle yet visible to observers and, because they can hardly be controlled or regulated by the sender, provide important “veridical” information. Recently, more research has been emerging on the mimicry of these subtle affective signals, including pupil mimicry. I will here review this literature and suggest avenues for future research that will eventually lead to a better comprehension of how these signals help in making social judgments and understanding each other’s emotions. PMID:26074855

  19. Second to fourth digit ratio and face shape

    PubMed Central

    Fink, Bernhard; Grammer, Karl; Mitteroecker, Philipp; Gunz, Philipp; Schaefer, Katrin; Bookstein, Fred L; Manning, John T

    2005-01-01

    The average human male face differs from the average female face in size and shape of the jaws, cheek-bones, lips, eyes and nose. It is possible that this dimorphism is determined by sex steroids such as testosterone (T) and oestrogen (E), and several studies on the perception of such characteristics have been based on this assumption, but those studies focussed mainly on the relationship of male faces with circulating hormone levels; the corresponding biology of the female face remains mainly speculative. This paper is concerned with the relative importance of prenatal T and E levels (assessed via the 2D : 4D finger length ratio, a proxy for the ratio of T/E) and sex in the determination of facial form as characterized by 64 landmark points on facial photographs of 106 Austrians of college age. We found that (i) prenatal sex steroid ratios (in terms of 2D : 4D) and actual chromosomal sex dimorphism operate differently on faces, (ii) 2D : 4D affects male and female face shape by similar patterns, but (iii) is three times more intense in men than in women. There was no evidence that these effects were confounded by allometry or facial asymmetry. Our results suggest that studies on the perception of facial characteristics need to consider differential effects of prenatal hormone exposure and actual chromosomal gender in order to understand how characteristics have come to be rated ‘masculine’ or ‘feminine’ and the consequences of these perceptions in terms of mate preferences. PMID:16191608

  20. Gender, age, and psychosocial context of the perception of facial esthetics.

    PubMed

    Tole, Nikoleta; Lajnert, Vlatka; Kovacevic Pavicic, Daniela; Spalj, Stjepan

    2014-01-01

    To explore the effects of gender, age, and psychosocial context on the perception of facial esthetics. The study included 1,444 Caucasian subjects aged 16 to 85 years. Two sets of color photographs illustrating 13 male and 13 female Caucasian facial type alterations, representing different skeletal and dentoalveolar components of sagittal maxillary-mandibular relationships, were used to estimate the facial profile attractiveness. The examinees graded the profiles based on a 0 to 10 numerical rating scale. The examinees graded the profiles of their own sex only from a social perspective, whereas opposite sex profiles were graded both from the social and emotional perspective separately. The perception of facial esthetics was found to be related to the gender, age, and psychosocial context of evaluation (p < 0.05). The most attractive profiles to men are the orthognathic female profile from the social perspective and the moderate bialveolar protrusion from the emotional perspective. The most attractive profile to women is the orthognathic male profile, when graded from the social aspect, and the mild bialveolar retrusion when graded from the emotional aspect. Increasing assessor age was associated with higher attractiveness grades. When planning treatment that modifies the facial profile, the clinician should bear in mind that the perception of facial profile esthetics is a complex phenomenon influenced by biopsychosocial factors. This study allows a better understanding of the concept of perception of facial esthetics that includes gender, age, and psychosocial context. © 2013 Wiley Periodicals, Inc.

  1. Impact of facial defect reconstruction on attractiveness and negative facial perception.

    PubMed

    Dey, Jacob K; Ishii, Masaru; Boahene, Kofi D O; Byrne, Patrick; Ishii, Lisa E

    2015-06-01

    Measure the impact of facial defect reconstruction on observer-graded attractiveness and negative facial perception. Prospective, randomized, controlled experiment. One hundred twenty casual observers viewed images of faces with defects of varying sizes and locations before and after reconstruction as well as normal comparison faces. Observers rated attractiveness, defect severity, and how disfiguring, bothersome, and important to repair they considered each face. Facial defects decreased attractiveness -2.26 (95% confidence interval [CI]: -2.45, -2.08) on a 10-point scale. Mixed effects linear regression showed this attractiveness penalty varied with defect size and location, with large and central defects generating the greatest penalty. Reconstructive surgery increased attractiveness 1.33 (95% CI: 1.18, 1.47), an improvement dependent upon size and location, restoring some defect categories to near normal ranges of attractiveness. Iterated principal factor analysis indicated the disfiguring, important to repair, bothersome, and severity variables were highly correlated and measured a common domain; thus, they were combined to create the disfigured, important to repair, bothersome, severity (DIBS) factor score, representing negative facial perception. The DIBS regression showed defect faces have a 1.5 standard deviation increase in negative perception (DIBS: 1.69, 95% CI: 1.61, 1.77) compared to normal faces, which decreased by a similar magnitude after surgery (DIBS: -1.44, 95% CI: -1.49, -1.38). These findings varied with defect size and location. Surgical reconstruction of facial defects increased attractiveness and decreased negative social facial perception, an impact that varied with defect size and location. These new social perception data add to the evidence base demonstrating the value of high-quality reconstructive surgery. NA. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.
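
    Both reconstruction studies above collapse four highly correlated observer ratings into a single DIBS factor score. As a hedged illustration of that idea (not the authors' iterated principal-factor procedure), the sketch below extracts a one-factor score from four synthetic, correlated rating items.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    latent = rng.normal(size=(200, 1))                 # one underlying "negative perception" factor
    loadings = np.array([[0.90, 0.80, 0.85, 0.75]])    # disfiguring, repair, bothersome, severity
    items = latent @ loadings + rng.normal(scale=0.3, size=(200, 4))   # four correlated rating items

    z = StandardScaler().fit_transform(items)          # standardize the items
    fa = FactorAnalysis(n_components=1, random_state=0)
    composite_score = fa.fit_transform(z)[:, 0]        # one DIBS-like score per rated face

    print("item loadings:", fa.components_.round(2))
    ```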

  2. Facial emotion recognition, face scan paths, and face perception in children with neurofibromatosis type 1.

    PubMed

    Lewis, Amelia K; Porter, Melanie A; Williams, Tracey A; Bzishvili, Samantha; North, Kathryn N; Payne, Jonathan M

    2017-05-01

    This study aimed to investigate face scan paths and face perception abilities in children with Neurofibromatosis Type 1 (NF1) and how these might relate to emotion recognition abilities in this population. The authors investigated facial emotion recognition, face scan paths, and face perception in 29 children with NF1 compared to 29 chronological age-matched typically developing controls. Correlations between facial emotion recognition, face scan paths, and face perception in children with NF1 were examined. Children with NF1 displayed significantly poorer recognition of fearful expressions compared to controls, as well as a nonsignificant trend toward poorer recognition of anger. Although there was no significant difference between groups in time spent viewing individual core facial features (eyes, nose, mouth, and nonfeature regions), children with NF1 spent significantly less time than controls viewing the face as a whole. Children with NF1 also displayed significantly poorer face perception abilities than typically developing controls. Facial emotion recognition deficits were not significantly associated with aberrant face scan paths or face perception abilities in the NF1 group. These results suggest that impairments in the perception, identification, and interpretation of information from faces are important aspects of the social-cognitive phenotype of NF1. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Rules versus Prototype Matching: Strategies of Perception of Emotional Facial Expressions in the Autism Spectrum

    ERIC Educational Resources Information Center

    Rutherford, M. D.; McIntosh, Daniel N.

    2007-01-01

    When perceiving emotional facial expressions, people with autistic spectrum disorders (ASD) appear to focus on individual facial features rather than configurations. This paper tests whether individuals with ASD use these features in a rule-based strategy of emotional perception, rather than a typical, template-based strategy by considering…

  4. Project PAVE (Personality And Vision Experimentation): role of personal and interpersonal resilience in the perception of emotional facial expression

    PubMed Central

    Tanzer, Michal; Shahar, Golan; Avidan, Galia

    2014-01-01

    The aim of the proposed theoretical model is to illuminate personal and interpersonal resilience by drawing from the field of emotional face perception. We suggest that perception/recognition of emotional facial expressions serves as a central link between subjective, self-related processes and the social context. Emotional face perception constitutes a salient social cue underlying interpersonal communication and behavior. Because problems in communication and interpersonal behavior underlie most, if not all, forms of psychopathology, it follows that perception/recognition of emotional facial expressions impacts psychopathology. The ability to accurately interpret one’s facial expression is crucial in subsequently deciding on an appropriate course of action. However, perception in general, and of emotional facial expressions in particular, is highly influenced by individuals’ personality and the self-concept. Herein we briefly outline well-established theories of personal and interpersonal resilience and link them to the neuro-cognitive basis of face perception. We then describe the findings of our ongoing program of research linking two well-established resilience factors, general self-efficacy (GSE) and perceived social support (PSS), with face perception. We conclude by pointing out avenues for future research focusing on possible genetic markers and patterns of brain connectivity associated with the proposed model. Implications of our integrative model to psychotherapy are discussed. PMID:25165439

  5. Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions

    PubMed Central

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2018-01-01

    Facial mimicry (FM) is an automatic response to imitate the facial expressions of others. However, neural correlates of the phenomenon are as yet not well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., inferior frontal gyrus, a classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during perception of natural emotional expressions. PMID:29467691

  6. Effects of Objective 3-Dimensional Measures of Facial Shape and Symmetry on Perceptions of Facial Attractiveness.

    PubMed

    Hatch, Cory D; Wehby, George L; Nidey, Nichole L; Moreno Uribe, Lina M

    2017-09-01

    Meeting patient desires for enhanced facial esthetics requires that providers have standardized and objective methods to measure esthetics. The authors evaluated the effects of objective 3-dimensional (3D) facial shape and asymmetry measurements derived from 3D facial images on perceptions of facial attractiveness. The 3D facial images of 313 adults in Iowa were digitized with 32 landmarks, and objective 3D facial measurements capturing symmetric and asymmetric components of shape variation, centroid size, and fluctuating asymmetry were obtained from the 3D coordinate data using geometric morphometric analyses. Frontal and profile images of study participants were rated for facial attractiveness by 10 volunteers (5 women and 5 men) on a 5-point Likert scale and a visual analog scale. Multivariate regression was used to identify the effects of the objective 3D facial measurements on attractiveness ratings. Several objective 3D facial measurements had marked effects on attractiveness ratings. Shorter facial heights with protrusive chins, midface retrusion, faces with protrusive noses and thin lips, flat mandibular planes with deep labiomental folds, any cants of the lip commissures and floor of the nose, larger faces overall, and increased fluctuating asymmetry were rated as significantly (P < .001) less attractive. Perceptions of facial attractiveness can be explained by specific 3D measurements of facial shapes and fluctuating asymmetry, which have important implications for clinical practice and research. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
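
    The measurements above come from geometric morphometrics: landmark configurations are Procrustes-aligned, and size is summarized by centroid size. The sketch below illustrates those two steps on placeholder 32-landmark 3D configurations; it is not the authors' pipeline and omits the symmetric/asymmetric decomposition.

    ```python
    import numpy as np
    from scipy.spatial import procrustes

    rng = np.random.default_rng(4)
    face_a = rng.normal(size=(32, 3))                        # 32 landmarks x (x, y, z)
    face_b = face_a + rng.normal(scale=0.05, size=(32, 3))   # slightly perturbed configuration

    def centroid_size(landmarks):
        """Square root of the summed squared distances of landmarks from their centroid."""
        centered = landmarks - landmarks.mean(axis=0)
        return np.sqrt((centered ** 2).sum())

    mtx_a, mtx_b, disparity = procrustes(face_a, face_b)     # superimposed shapes + residual difference
    print(f"centroid size of face A: {centroid_size(face_a):.2f}")
    print(f"shape difference after Procrustes alignment: {disparity:.4f}")
    ```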

  7. Perceived differences between chimpanzee (Pan troglodytes) and human (Homo sapiens) facial expressions are related to emotional interpretation.

    PubMed

    Waller, Bridget M; Bard, Kim A; Vick, Sarah-Jane; Smith Pasqualini, Marcia C

    2007-11-01

    Human face perception is a finely tuned, specialized process. When comparing faces between species, therefore, it is essential to consider how people make these observational judgments. Comparing facial expressions may be particularly problematic, given that people tend to consider them categorically as emotional signals, which may affect how accurately specific details are processed. The bared-teeth display (BT), observed in most primates, has been proposed as a homologue of the human smile (J. A. R. A. M. van Hooff, 1972). In this study, judgments of similarity between BT displays of chimpanzees (Pan troglodytes) and human smiles varied in relation to perceived emotional valence. When a chimpanzee BT was interpreted as fearful, observers tended to underestimate the magnitude of the relationship between certain features (the extent of lip corner raise) and human smiles. These judgments may reflect the combined effects of categorical emotional perception, configural face processing, and perceptual organization in mental imagery and may demonstrate the advantages of using standardized observational methods in comparative facial expression research. Copyright 2007 APA.

  8. The role of working memory in decoding emotions.

    PubMed

    Phillips, Louise H; Channon, Shelley; Tunstall, Mary; Hedenstrom, Anna; Lyons, Kathryn

    2008-04-01

    Decoding facial expressions of emotion is an important aspect of social communication that is often impaired following psychiatric or neurological illness. However, little is known of the cognitive components involved in perceiving emotional expressions. Three dual task studies explored the role of verbal working memory in decoding emotions. Concurrent working memory load substantially interfered with choosing which emotional label described a facial expression (Experiment 1). A key factor in the magnitude of interference was the number of emotion labels from which to choose (Experiment 2). In contrast, the ability to decide that two faces represented the same emotion in a discrimination task was relatively unaffected by concurrent working memory load (Experiment 3). Different methods of assessing emotion perception make substantially different demands on working memory. Implications for clinical disorders which affect both working memory and emotion perception are considered. Copyright 2008 APA.

  9. More than mere mimicry? The influence of emotion on rapid facial reactions to faces.

    PubMed

    Moody, Eric J; McIntosh, Daniel N; Mann, Laura J; Weisser, Kimberly R

    2007-05-01

    Within a second of seeing an emotional facial expression, people typically match that expression. These rapid facial reactions (RFRs), often termed mimicry, are implicated in emotional contagion, social perception, and embodied affect, yet ambiguity remains regarding the mechanism(s) involved. Two studies evaluated whether RFRs to faces are solely nonaffective motor responses or whether emotional processes are involved. Brow (corrugator, related to anger) and forehead (frontalis, related to fear) activity were recorded using facial electromyography (EMG) while undergraduates in two conditions (fear induction vs. neutral) viewed fear, anger, and neutral facial expressions. As predicted, fear induction increased fear expressions to angry faces within 1000 ms of exposure, demonstrating an emotional component of RFRs. This did not merely reflect increased fear from the induction, because responses to neutral faces were unaffected. Considering RFRs to be merely nonaffective automatic reactions is inaccurate. RFRs are not purely motor mimicry; emotion influences early facial responses to faces. The relevance of these data to emotional contagion, autism, and the mirror system-based perspectives on imitation is discussed.

  10. Oxygenated-Blood Colour Change Thresholds for Perceived Facial Redness, Health, and Attractiveness

    PubMed Central

    Re, Daniel E.; Whitehead, Ross D.; Xiao, Dengke; Perrett, David I.

    2011-01-01

    Blood oxygenation level is associated with cardiovascular fitness, and raising oxygenated blood colouration in human faces increases perceived health. The current study used a two-alternative forced choice (2AFC) psychophysics design to quantify the oxygenated blood colour (redness) change threshold required to affect perception of facial colour, health and attractiveness. Detection thresholds for colour judgments were lower than those for health and attractiveness, which did not differ. The results suggest redness preferences do not reflect a sensory bias, rather preferences may be based on accurate indications of health status. Furthermore, results suggest perceived health and attractiveness may be perceptually equivalent when they are assessed based on facial redness. Appearance-based motivation for lifestyle change can be effective; thus future studies could assess the degree to which cardiovascular fitness increases face redness and could quantify changes in aerobic exercise needed to increase facial attractiveness. PMID:21448270

  11. Oxygenated-blood colour change thresholds for perceived facial redness, health, and attractiveness.

    PubMed

    Re, Daniel E; Whitehead, Ross D; Xiao, Dengke; Perrett, David I

    2011-03-23

    Blood oxygenation level is associated with cardiovascular fitness, and raising oxygenated blood colouration in human faces increases perceived health. The current study used a two-alternative forced choice (2AFC) psychophysics design to quantify the oxygenated blood colour (redness) change threshold required to affect perception of facial colour, health and attractiveness. Detection thresholds for colour judgments were lower than those for health and attractiveness, which did not differ. The results suggest redness preferences do not reflect a sensory bias, rather preferences may be based on accurate indications of health status. Furthermore, results suggest perceived health and attractiveness may be perceptually equivalent when they are assessed based on facial redness. Appearance-based motivation for lifestyle change can be effective; thus future studies could assess the degree to which cardiovascular fitness increases face redness and could quantify changes in aerobic exercise needed to increase facial attractiveness.
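
    The threshold reported above comes from fitting a psychometric function to two-alternative forced-choice data. A minimal sketch of that procedure is shown below: a cumulative Gaussian with a 0.5 guess rate fitted to illustrative proportion-correct values (not the study's data), with the 75%-correct point read off as the threshold.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    redness_step = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # illustrative colour-change magnitudes
    prop_correct = np.array([0.52, 0.58, 0.70, 0.88, 0.97])   # illustrative 2AFC performance

    def psychometric(x, mu, sigma):
        # 0.5 guess rate because chance performance in a 2AFC task is 50%.
        return 0.5 + 0.5 * norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(psychometric, redness_step, prop_correct, p0=[2.0, 1.0])
    # With this parameterization, 75% correct falls exactly at the Gaussian mean.
    print(f"75%-correct detection threshold: {mu:.2f} redness units")
    ```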

  12. Perceptions of variability in facial emotion influence beliefs about the stability of psychological characteristics.

    PubMed

    Weisbuch, Max; Grunberg, Rebecca L; Slepian, Michael L; Ambady, Nalini

    2016-10-01

    Beliefs about the malleability versus stability of traits (incremental vs. entity lay theories) have a profound impact on social cognition and self-regulation, shaping phenomena that range from the fundamental attribution error and group-based stereotyping to academic motivation and achievement. Less is known about the causes than the effects of these lay theories, and in the current work the authors examine the perception of facial emotion as a causal influence on lay theories. Specifically, they hypothesized that (a) within-person variability in facial emotion signals within-person variability in traits and (b) social environments replete with within-person variability in facial emotion encourage perceivers to endorse incremental lay theories. Consistent with Hypothesis 1, Study 1 participants were more likely to attribute dynamic (vs. stable) traits to a person who exhibited several different facial emotions than to a person who exhibited a single facial emotion across multiple images. Hypothesis 2 suggests that social environments support incremental lay theories to the extent that they include many people who exhibit within-person variability in facial emotion. Consistent with Hypothesis 2, participants in Studies 2-4 were more likely to endorse incremental theories of personality, intelligence, and morality after exposure to multiple individuals exhibiting within-person variability in facial emotion than after exposure to multiple individuals exhibiting a single emotion several times. Perceptions of within-person variability in facial emotion-rather than perceptions of simple diversity in facial emotion-were responsible for these effects. Discussion focuses on how social ecologies shape lay theories. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-Related Potential Component.

    PubMed

    Tanaka, Hideaki

    2016-01-01

    Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the application of cosmetic makeup and the amplitudes of the P1 and N170 event-related potential components during facial perception tasks. Moreover, the influence of visual perception on N170 amplitude was evaluated under three makeup conditions: Eye Shadow, Lipstick, and No Makeup. Electroencephalography was used to monitor 17 participants who were exposed to visual stimuli under each of these three makeup conditions. The results of the present study subsequently demonstrated that the Lipstick condition elicited a significantly greater N170 amplitude than the No Makeup condition, while P1 amplitude was unaffected by any of the conditions. Such findings indicate that the application of cosmetic makeup alters general facial perception but exerts no influence on the perception of low-level visual features. Collectively, these results support the notion that the application of makeup induces subtle alterations in the processing of facial stimuli, with a particular effect on the processing of specific facial components (i.e., the mouth), as reflected by changes in N170 amplitude.

  14. Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-Related Potential Component

    PubMed Central

    Tanaka, Hideaki

    2016-01-01

    Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the application of cosmetic makeup and the amplitudes of the P1 and N170 event-related potential components during facial perception tasks. Moreover, the influence of visual perception on N170 amplitude was evaluated under three makeup conditions: Eye Shadow, Lipstick, and No Makeup. Electroencephalography was used to monitor 17 participants who were exposed to visual stimuli under each of these three makeup conditions. The results of the present study subsequently demonstrated that the Lipstick condition elicited a significantly greater N170 amplitude than the No Makeup condition, while P1 amplitude was unaffected by any of the conditions. Such findings indicate that the application of cosmetic makeup alters general facial perception but exerts no influence on the perception of low-level visual features. Collectively, these results support the notion that the application of makeup induces subtle alterations in the processing of facial stimuli, with a particular effect on the processing of specific facial components (i.e., the mouth), as reflected by changes in N170 amplitude. PMID:27656161
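
    The two records above hinge on N170 amplitude measured from EEG epochs. As a generic illustration only (not the study's processing pipeline), the sketch below reads a mean amplitude and negative-peak latency out of an already-epoched, placeholder data array in a typical 130-200 ms window.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    sfreq = 500                                      # sampling rate in Hz (assumed)
    times = np.arange(-0.1, 0.5, 1 / sfreq)          # epoch from -100 to +500 ms
    epochs = rng.normal(size=(60, 64, times.size))   # trials x channels x samples (placeholder data)

    channel = 57                                     # hypothetical occipito-temporal channel index
    erp = epochs[:, channel, :].mean(axis=0)         # trial-averaged waveform

    window = (times >= 0.13) & (times <= 0.20)       # typical N170 window, 130-200 ms
    n170_amplitude = erp[window].mean()              # mean amplitude in the window
    n170_latency = times[window][np.argmin(erp[window])]   # latency of the most negative sample

    print(f"N170 mean amplitude: {n170_amplitude:.2f} (a.u.), peak latency: {n170_latency * 1000:.0f} ms")
    ```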

  15. Emotion categories and dimensions in the facial communication of affect: An integrated approach.

    PubMed

    Mehu, Marc; Scherer, Klaus R

    2015-12-01

    We investigated the role of facial behavior in emotional communication, using both categorical and dimensional approaches. We used a corpus of enacted emotional expressions (GEMEP) in which professional actors are instructed, with the help of scenarios, to communicate a variety of emotional experiences. The results of Study 1 replicated earlier findings showing that only a minority of facial action units are associated with specific emotional categories. Likewise, facial behavior did not show a specific association with particular emotional dimensions. Study 2 showed that facial behavior plays a significant role both in the detection of emotions and in the judgment of their dimensional aspects, such as valence, arousal, dominance, and unpredictability. In addition, a mediation model revealed that the association between facial behavior and recognition of the signaler's emotional intentions is mediated by perceived emotional dimensions. We conclude that, from a production perspective, facial action units convey neither specific emotions nor specific emotional dimensions, but are associated with several emotions and several dimensions. From the perceiver's perspective, facial behavior facilitated both dimensional and categorical judgments, and the former mediated the effect of facial behavior on recognition accuracy. The classification of emotional expressions into discrete categories may, therefore, rely on the perception of more general dimensions such as valence and arousal and, presumably, the underlying appraisals that are inferred from facial movements. (c) 2015 APA, all rights reserved.
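
    Study 2 above uses a mediation model in which perceived emotional dimensions carry the effect of facial behaviour on recognition. A hedged product-of-coefficients sketch of such a mediation analysis, with a bootstrap confidence interval on the indirect effect, is given below; the variables are synthetic stand-ins, not the GEMEP data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n = 250
    facial_behaviour = rng.normal(size=n)                                     # e.g., action-unit intensity
    valence_rating = 0.6 * facial_behaviour + rng.normal(scale=0.8, size=n)   # mediator
    recognition = 0.5 * valence_rating + 0.1 * facial_behaviour + rng.normal(scale=0.8, size=n)
    df = pd.DataFrame({"facial_behaviour": facial_behaviour,
                       "valence_rating": valence_rating,
                       "recognition": recognition})

    def indirect_effect(data):
        # a-path: behaviour -> mediator; b-path: mediator -> outcome, controlling for behaviour.
        a = smf.ols("valence_rating ~ facial_behaviour", data=data).fit().params["facial_behaviour"]
        b = smf.ols("recognition ~ valence_rating + facial_behaviour", data=data).fit().params["valence_rating"]
        return a * b

    boots = [indirect_effect(df.sample(len(df), replace=True, random_state=i)) for i in range(1000)]
    lo, hi = np.percentile(boots, [2.5, 97.5])
    print(f"indirect effect: {indirect_effect(df):.3f}, bootstrap 95% CI [{lo:.3f}, {hi:.3f}]")
    ```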

  16. Short-term visual deprivation reduces interference effects of task-irrelevant facial expressions on affective prosody judgments

    PubMed Central

    Fengler, Ineke; Nava, Elena; Röder, Brigitte

    2015-01-01

    Several studies have suggested that neuroplasticity can be triggered by short-term visual deprivation in healthy adults. Specifically, these studies have provided evidence that visual deprivation reversibly affects basic perceptual abilities. The present study investigated the long-lasting effects of short-term visual deprivation on emotion perception. To this aim, we visually deprived a group of young healthy adults, age-matched with a group of non-deprived controls, for 3 h and tested them before and after visual deprivation (i.e., after 8 h on average and at 4 week follow-up) on an audio–visual (i.e., faces and voices) emotion discrimination task. To observe changes at the level of basic perceptual skills, we additionally employed a simple audio–visual (i.e., tone bursts and light flashes) discrimination task and two unimodal (one auditory and one visual) perceptual threshold measures. During the 3 h period, both groups performed a series of auditory tasks. To exclude the possibility that changes in emotion discrimination may emerge as a consequence of the exposure to auditory stimulation during the 3 h stay in the dark, we visually deprived an additional group of age-matched participants who concurrently performed unrelated (i.e., tactile) tasks to the later tested abilities. The two visually deprived groups showed enhanced affective prosodic discrimination abilities in the context of incongruent facial expressions following the period of visual deprivation; this effect was partially maintained until follow-up. By contrast, no changes were observed in affective facial expression discrimination and in the basic perception tasks in any group. These findings suggest that short-term visual deprivation per se triggers a reweighting of visual and auditory emotional cues, which seems to possibly prevail for longer durations. PMID:25954166

  17. Increased positive versus negative affective perception and memory in healthy volunteers following selective serotonin and norepinephrine reuptake inhibition.

    PubMed

    Harmer, Catherine J; Shelley, Nicholas C; Cowen, Philip J; Goodwin, Guy M

    2004-07-01

    Antidepressants that inhibit the reuptake of serotonin (SSRIs) or norepinephrine (SNRIs) are effective in the treatment of disorders such as depression and anxiety. Cognitive psychological theories emphasize the importance of correcting negative biases of information processing in the nonpharmacological treatment of these disorders, but it is not known whether antidepressant drugs can directly modulate the neural processing of affective information. The present study therefore assessed the actions of repeated antidepressant administration on perception and memory for positive and negative emotional information in healthy volunteers. Forty-two male and female volunteers were randomly assigned to 7 days of double-blind intervention with the SSRI citalopram (20 mg/day), the SNRI reboxetine (8 mg/day), or placebo. On the final day, facial expression recognition, emotion-potentiated startle response, and memory for affect-laden words were assessed. Questionnaires monitoring mood, hostility, and anxiety were given before and after treatment. In the facial expression recognition task, citalopram and reboxetine reduced the identification of the negative facial expressions of anger and fear. Citalopram also abolished the increased startle response found in the context of negative affective images. Both antidepressants increased the relative recall of positive (versus negative) emotional material. These changes in emotional processing occurred in the absence of significant differences in ratings of mood and anxiety. However, reboxetine decreased subjective ratings of hostility and elevated energy. Short-term administration of two different antidepressant types had similar effects on emotion-related tasks in healthy volunteers, reducing the processing of negative relative to positive emotional material. Such effects of antidepressants may ameliorate the negative biases in information processing that characterize mood and anxiety disorders. They also suggest a mechanism of action potentially compatible with cognitive theories of anxiety and depression.

  18. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    PubMed

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  19. Seeing a haptically explored face: visual facial-expression aftereffect from haptic adaptation to a face.

    PubMed

    Matsumiya, Kazumichi

    2013-10-01

    Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imaging a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on shared representation underlying cross-modal interactions.

  20. Automatic prediction of facial trait judgments: appearance vs. structural models.

    PubMed

    Rojas, Mario; Masip, David; Todorov, Alexander; Vitria, Jordi

    2011-01-01

    Evaluating other individuals with respect to personality characteristics plays a crucial role in human relations and is a focus of research in diverse fields such as psychology and interactive computer systems. In psychology, face perception has been recognized as a key component of this evaluation system. Multiple studies suggest that observers use face information to infer personality characteristics. Designers of interactive computer systems are trying to take advantage of these findings to make interaction more natural and to improve system performance. Here, we experimentally test whether the automatic prediction of facial trait judgments (e.g. dominance) can be made using the full appearance information of the face, or whether a reduced representation of its structure is sufficient. We evaluate two separate approaches: a holistic representation model using the facial appearance information and a structural model constructed from the relations among facial salient points. State-of-the-art machine learning methods are applied to a) derive a facial trait judgment model from training data and b) predict a facial trait value for any face. Furthermore, we address the issue of whether there are specific structural relations among facial points that predict perception of facial traits. Experimental results over a set of labeled data (9 different trait evaluations) and classification rules (4 rules) suggest that a) prediction of perception of facial traits is learnable by both holistic and structural approaches; b) the most reliable prediction of facial trait judgments is obtained by a certain type of holistic description of the face appearance; and c) for some traits such as attractiveness and extroversion, there are relationships between specific structural features and social perceptions.
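
    The two modelling approaches compared above lend themselves to a brief illustration. The following Python sketch is not the authors' pipeline; it assumes hypothetical arrays of flattened face images (the holistic appearance representation), landmark coordinates from which pairwise distances form the structural representation, and binarized trait judgments, and compares the two feature sets with a linear classifier under cross-validation.

        # Minimal sketch (hypothetical data, not the authors' pipeline):
        # holistic appearance features vs. structural landmark-distance
        # features for predicting binarized facial trait judgments.
        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_faces = 200
        pixels = rng.random((n_faces, 64 * 64))    # flattened grey-level images (appearance)
        landmarks = rng.random((n_faces, 80, 2))   # 80 salient points per face (structure)
        ratings = rng.integers(0, 2, n_faces)      # high/low trait judgment per face

        def pairwise_distances(points):
            # Structural representation: distances between all pairs of landmarks.
            diffs = points[:, :, None, :] - points[:, None, :, :]
            dists = np.sqrt((diffs ** 2).sum(-1))
            upper = np.triu_indices(points.shape[1], k=1)
            return dists[:, upper[0], upper[1]]

        structural = pairwise_distances(landmarks)

        for name, features in [("appearance", pixels), ("structural", structural)]:
            accuracy = cross_val_score(LinearSVC(max_iter=5000), features, ratings, cv=5).mean()
            print(f"{name} model: mean cross-validated accuracy = {accuracy:.2f}")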

  1. Sex differences in amygdala activation during the perception of facial affect.

    PubMed

    Killgore, W D; Yurgelun-Todd, D A

    2001-08-08

    The cognitive and affective systems of the cerebral cortex are often more lateralized in males than females, but it is unclear whether these differences extend to subcortical systems. We used fMRI to examine sex differences in lateralized amygdala activity during happy and fearful face perception. Amygdala activation differed for men and women depending on the valence of the expression. Overall, males were more lateralized than females, but the direction differed between valence conditions. Happy faces produced greater right than left amygdala activation for males but not females. Both sexes showed greater left amygdala activation for fearful faces. These findings suggest that the lateralization of affective function may extend beyond the cortex to subcortical regions such as the amygdala.

  2. Multiple Mechanisms in the Perception of Face Gender: Effect of Sex-Irrelevant Features

    ERIC Educational Resources Information Center

    Komori, Masashi; Kawamura, Satoru; Ishihara, Shigekazu

    2011-01-01

    Effects of sex-relevant and sex-irrelevant facial features on the evaluation of facial gender were investigated. Participants rated masculinity of 48 male facial photographs and femininity of 48 female facial photographs. Eighty feature points were measured on each of the facial photographs. Using a generalized Procrustes analysis, facial shapes…

  3. Attention Alters Perceived Attractiveness.

    PubMed

    Störmer, Viola S; Alvarez, George A

    2016-04-01

    Can attention alter the impression of a face? Previous studies showed that attention modulates the appearance of lower-level visual features. For instance, attention can make a simple stimulus appear to have higher contrast than it actually does. We tested whether attention can also alter the perception of a higher-order property-namely, facial attractiveness. We asked participants to judge the relative attractiveness of two faces after summoning their attention to one of the faces using a briefly presented visual cue. Across trials, participants judged the attended face to be more attractive than the same face when it was unattended. This effect was not due to decision or response biases, but rather was due to changes in perceptual processing of the faces. These results show that attention alters perceived facial attractiveness, and broadly demonstrate that attention can influence higher-level perception and may affect people's initial impressions of one another. © The Author(s) 2016.

  4. Psychopathic traits affect the visual exploration of facial expressions.

    PubMed

    Boll, Sabrina; Gamer, Matthias

    2016-05-01

    Deficits in emotional reactivity and recognition have been reported in psychopathy. Impaired attention to the eyes along with amygdala malfunctions may underlie these problems. Here, we investigated how different facets of psychopathy modulate the visual exploration of facial expressions by assessing personality traits in a sample of healthy young adults using an eye-tracking based face perception task. Fearless Dominance (the interpersonal-emotional facet of psychopathy) and Coldheartedness scores predicted reduced face exploration consistent with findings on lowered emotional reactivity in psychopathy. Moreover, participants high on the social deviance facet of psychopathy ('Self-Centered Impulsivity') showed a reduced bias to shift attention towards the eyes. Our data suggest that facets of psychopathy modulate face processing in healthy individuals and reveal possible attentional mechanisms which might be responsible for the severe impairments of social perception and behavior observed in psychopathy. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. The role of visual experience in the production of emotional facial expressions by blind people: a review.

    PubMed

    Valente, Dannyelle; Theurel, Anne; Gentaz, Edouard

    2018-04-01

    Facial expressions of emotion are nonverbal behaviors that allow us to interact efficiently in social life and respond to events affecting our welfare. This article reviews 21 studies, published between 1932 and 2015, examining the production of facial expressions of emotion by blind people. It particularly discusses the impact of visual experience on the development of this behavior from birth to adulthood. After a discussion of three methodological considerations, the review of studies reveals that blind subjects demonstrate differing capacities for producing spontaneous expressions and voluntarily posed expressions. Seventeen studies provided evidence that blind and sighted individuals spontaneously produce the same pattern of facial expressions, even if some variations can be found, reflecting facial and body movements specific to blindness or differences in intensity and control of emotions in some specific contexts. This suggests that lack of visual experience does not have a major impact when this behavior is generated spontaneously in real emotional contexts. In contrast, eight studies examining voluntary expressions indicate that blind individuals have difficulty posing emotional expressions. The opportunity for prior visual observation seems to affect performance in this case. Finally, we discuss three new directions for research to provide additional, stronger evidence for the debate regarding the innate or the culture-constant learning character of the production of emotional facial expressions by blind individuals: the link between perception and production of facial expressions, the impact of display rules in the absence of vision, and the role of other channels in the expression of emotions in the context of blindness.

  6. Models of hemispheric specialization in facial emotion perception--a reevaluation.

    PubMed

    Najt, Pablo; Bayer, Ulrike; Hausmann, Markus

    2013-02-01

    A considerable amount of research on functional cerebral asymmetries (FCAs) for facial emotion perception has shown conflicting support for three competing models: (i) the Right Hemisphere Hypothesis, (ii) the Valence-Specific Hypothesis, and (iii) the Approach/Withdrawal model. However, the majority of studies evaluating the Right Hemisphere or the Valence-Specific Hypotheses are rather limited by the small number of emotional expressions used. In addition, it is difficult to evaluate the Approach/Withdrawal Hypothesis due to insufficient data on anger and FCAs. The aim of the present study was (a) to review visual half field (VHF) studies of hemispheric specialization in facial emotion perception and (b) to reevaluate empirical evidence with respect to all three partly conflicting hypotheses. Results from the present study revealed a left visual field (LVF)/right hemisphere advantage for the perception of angry, fearful, and sad facial expressions and a right visual field (RVF)/left hemisphere advantage for the perception of happy expressions. Thus, FCAs for the perception of specific facial emotions do not fully support the Right Hemisphere Hypothesis, the Valence-Specific Hypothesis, or the Approach/Withdrawal model. A systematic literature review, together with the results of the present study, indicate a consistent LVF/right hemisphere advantage only for a subset of negative emotions including anger, fear and sadness, rather suggesting a "negative (only) valence model." PsycINFO Database Record (c) 2013 APA, all rights reserved.

  7. Sad or Fearful? The Influence of Body Posture on Adults' and Children's Perception of Facial Displays of Emotion

    ERIC Educational Resources Information Center

    Mondloch, Catherine J.

    2012-01-01

    The current research investigated the influence of body posture on adults' and children's perception of facial displays of emotion. In each of two experiments, participants categorized facial expressions that were presented on a body posture that was congruent (e.g., a sad face on a body posing sadness) or incongruent (e.g., a sad face on a body…

  8. Perception of health from facial cues

    PubMed Central

    Henderson, Audrey J.; Holzleitner, Iris J.; Talamas, Sean N.

    2016-01-01

    Impressions of health are integral to social interactions, yet poorly understood. A review of the literature reveals multiple facial characteristics that potentially act as cues to health judgements. The cues vary in their stability across time: structural shape cues including symmetry and sexual dimorphism alter slowly across the lifespan and have been found to have weak links to actual health, but show inconsistent effects on perceived health. Facial adiposity changes over a medium time course and is associated with both perceived and actual health. Skin colour alters over a short time and has strong effects on perceived health, yet links to health outcomes have barely been evaluated. The review also suggested an additional influence of demeanour as a perceptual cue to health. We therefore investigated the association of health judgements with multiple facial cues measured objectively from two-dimensional and three-dimensional facial images. We found evidence for independent contributions of face shape and skin colour cues to perceived health. Our empirical findings: (i) reinforce the role of skin yellowness; (ii) demonstrate the utility of global face shape measures of adiposity; and (iii) emphasize the role of affect in facial images with nominally neutral expression in impressions of health. PMID:27069057

  9. Intranasal oxytocin selectively attenuates rhesus monkeys' attention to negative facial expressions.

    PubMed

    Parr, Lisa A; Modi, Meera; Siebert, Erin; Young, Larry J

    2013-09-01

    Intranasal oxytocin (IN-OT) modulates social perception and cognition in humans and could be an effective pharmacotherapy for treating social impairments associated with neuropsychiatric disorders, like autism. However, it is unknown how IN-OT modulates social cognition, its effect after repeated use, or its impact on the developing brain. Animal models are urgently needed. This study examined the effect of IN-OT on social perception in monkeys using tasks that reveal some of the social impairments seen in autism. Six rhesus macaques (Macaca mulatta, 4 males) received a 48 IU dose of OT or saline placebo using a pediatric nebulizer. An hour later, they performed a computerized task (the dot-probe task) to measure their attentional bias to social, emotional, and nonsocial images. Results showed that IN-OT significantly reduced monkeys' attention to negative facial expressions, but not to neutral faces or clip art images, and additionally showed a trend toward enhanced attention to faces with direct vs. averted gaze. This study is the first to demonstrate an effect of IN-OT on social perception in monkeys: IN-OT selectively reduced monkeys' attention to negative facial expressions, but not to neutral social or nonsocial images. These findings complement several reports in humans showing that IN-OT reduces the aversive quality of social images, suggesting that, as in humans, monkey social perception is mediated by the oxytocinergic system. Importantly, these results in monkeys suggest that IN-OT does not dampen the emotional salience of social stimuli, but rather acts to affect the evaluation of emotional images during the early stages of information processing. Copyright © 2013 Elsevier Ltd. All rights reserved.
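
    For readers unfamiliar with the dot-probe task mentioned above, the attentional bias it yields is a reaction-time difference: faster responses when the probe replaces the emotional image than when it replaces the neutral image indicate attention drawn to the emotional stimulus. A minimal Python sketch of that computation, using hypothetical trial data, is given below.

        # Minimal sketch of a dot-probe attentional bias score (hypothetical data).
        # bias = mean RT when the probe replaces the neutral image
        #      - mean RT when the probe replaces the emotional image;
        # positive values indicate attention drawn toward the emotional image.
        from statistics import mean

        trials = [
            # (probe_location, reaction_time_ms)
            ("emotional", 412), ("neutral", 455), ("emotional", 430),
            ("neutral", 470), ("emotional", 405), ("neutral", 448),
        ]

        rt_emotional = mean(rt for loc, rt in trials if loc == "emotional")
        rt_neutral = mean(rt for loc, rt in trials if loc == "neutral")
        bias = rt_neutral - rt_emotional
        print(f"attentional bias toward emotional images: {bias:.1f} ms")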

  10. Impaired Perception of Emotional Expression in Amyotrophic Lateral Sclerosis.

    PubMed

    Oh, Seong Il; Oh, Ki Wook; Kim, Hee Jin; Park, Jin Seok; Kim, Seung Hyun

    2016-07-01

    The increasing recognition that deficits in social emotions occur in amyotrophic lateral sclerosis (ALS) is helping to explain the spectrum of neuropsychological dysfunctions, thus supporting the view of ALS as a multisystem disorder involving neuropsychological deficits as well as motor deficits. The aim of this study was to characterize the emotion perception abilities of Korean patients with ALS based on the recognition of facial expressions. Twenty-four patients with ALS and 24 age- and sex-matched healthy controls completed neuropsychological tests and facial emotion recognition tasks [ChaeLee Korean Facial Expressions of Emotions (ChaeLee-E)]. The ChaeLee-E test includes facial expressions for seven emotions: happiness, sadness, anger, disgust, fear, surprise, and neutral. The ability to perceive facial emotions was significantly worse among ALS patients than among healthy controls [65.2±18.0% vs. 77.1±6.6% (mean±SD), p=0.009]. Eight of the 24 patients (33%) scored below the 5th percentile of control scores for recognizing facial emotions. Emotion perception deficits occur in Korean ALS patients, particularly regarding facial expressions of emotion. These findings expand the spectrum of cognitive and behavioral dysfunction associated with ALS into emotion processing dysfunction.

  11. Visual and auditory socio-cognitive perception in unilateral temporal lobe epilepsy in children and adolescents: a prospective controlled study.

    PubMed

    Laurent, Agathe; Arzimanoglou, Alexis; Panagiotakaki, Eleni; Sfaello, Ignacio; Kahane, Philippe; Ryvlin, Philippe; Hirsch, Edouard; de Schonen, Scania

    2014-12-01

    A high rate of abnormal social behavioural traits or perceptual deficits is observed in children with unilateral temporal lobe epilepsy. In the present study, perception of auditory and visual social signals, carried by faces and voices, was evaluated in children or adolescents with temporal lobe epilepsy. We prospectively investigated a sample of 62 children with focal non-idiopathic epilepsy early in the course of the disorder. The present analysis included 39 children with a confirmed diagnosis of temporal lobe epilepsy. Seventy-two control participants, distributed across 10 age groups, served as the comparison group. Our socio-perceptual evaluation protocol comprised three socio-visual tasks (face identity, facial emotion and gaze direction recognition), two socio-auditory tasks (voice identity and emotional prosody recognition), and three control tasks (lip reading, geometrical pattern and linguistic intonation recognition). All 39 patients also underwent a neuropsychological examination. As a group, children with temporal lobe epilepsy performed at a significantly lower level compared to the control group with regard to recognition of facial identity, direction of eye gaze, and emotional facial expressions. We found no relationship between the type of visual deficit and age at first seizure, duration of epilepsy, or the epilepsy-affected cerebral hemisphere. Deficits in socio-perceptual tasks could be found independently of the presence of deficits in visual or auditory episodic memory, visual non-facial pattern processing (control tasks), or speech perception. A normal FSIQ did not exempt some of the patients from an underlying deficit in some of the socio-perceptual tasks. Temporal lobe epilepsy not only impairs development of emotion recognition, but can also impair development of perception of other socio-perceptual signals in children with or without intellectual disability. Prospective studies need to be designed to evaluate the results of appropriate re-education programs in children presenting with deficits in social cue processing.

  12. Positive and negative symptom scores are correlated with activation in different brain regions during facial emotion perception in schizophrenia patients: a voxel-based sLORETA source activity study.

    PubMed

    Kim, Do-Won; Kim, Han-Sung; Lee, Seung-Hwan; Im, Chang-Hwan

    2013-12-01

    Schizophrenia is one of the most devastating of all mental illnesses, and has dimensional characteristics that include both positive and negative symptoms. One problem reported in schizophrenia patients is that they tend to show deficits in face emotion processing, on which negative symptoms are thought to have stronger influence. In this study, four event-related potential (ERP) components (P100, N170, N250, and P300) and their source activities were analyzed using EEG data acquired from 23 schizophrenia patients while they were presented with facial emotion picture stimuli. Correlations between positive and negative syndrome scale (PANSS) scores and source activations during facial emotion processing were calculated to identify the brain areas affected by symptom scores. Our analysis demonstrates that PANSS positive scores are negatively correlated with major areas of the left temporal lobule for early ERP components (P100, N170) and with the right middle frontal lobule for a later component (N250), which indicates that positive symptoms affect both early face processing and facial emotion processing. On the other hand, PANSS negative scores are negatively correlated with several clustered regions, including the left fusiform gyrus (at P100), most of which are not overlapped with regions showing correlations with PANSS positive scores. Our results suggest that positive and negative symptoms affect independent brain regions during facial emotion processing, which may help to explain the heterogeneous characteristics of schizophrenia. © 2013 Elsevier B.V. All rights reserved.
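
    The brain-behaviour analysis described above reduces, for each ERP component, to correlating patients' symptom scores with source activation region by region. The Python sketch below illustrates only that correlation step with hypothetical arrays; it is not the authors' sLORETA pipeline.

        # Minimal sketch (hypothetical data, not the authors' sLORETA pipeline):
        # correlate PANSS symptom scores with per-region source activation
        # for one ERP component across patients.
        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(1)
        regions = ["left fusiform", "left temporal", "right middle frontal"]
        panss_positive = rng.integers(8, 35, 23)        # hypothetical scores, 23 patients
        activation = rng.random((23, len(regions)))     # hypothetical P100 source activity

        for i, region in enumerate(regions):
            r, p = pearsonr(panss_positive, activation[:, i])
            print(f"{region}: r = {r:+.2f}, p = {p:.3f}")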

  13. Putting the face in context: Body expressions impact facial emotion processing in human infants.

    PubMed

    Rajhans, Purva; Jessen, Sarah; Missana, Manuela; Grossmann, Tobias

    2016-06-01

    Body expressions exert strong contextual effects on facial emotion perception in adults. Specifically, conflicting body cues hamper the recognition of emotion from faces, as evident on both the behavioral and neural level. We examined the developmental origins of the neural processes involved in emotion perception across body and face in 8-month-old infants by measuring event-related brain potentials (ERPs). We primed infants with body postures (fearful, happy) that were followed by either congruent or incongruent facial expressions. Our results revealed that body expressions impact facial emotion processing and that incongruent body cues impair the neural discrimination of emotional facial expressions. Priming effects were associated with attentional and recognition memory processes, as reflected in a modulation of the Nc and Pc evoked at anterior electrodes. These findings demonstrate that 8-month-old infants possess neural mechanisms that allow for the integration of emotion across body and face, providing evidence for the early developmental emergence of context-sensitive facial emotion perception. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  14. Perceiving the evil eye: Investigating hostile interpretation of ambiguous facial emotional expression in violent and non-violent offenders.

    PubMed

    Kuin, Niki C; Masthoff, Erik D M; Munafò, Marcus R; Penton-Voak, Ian S

    2017-01-01

    Research into the causal and perpetuating factors influencing aggression has partly focused on the general tendency of aggression-prone individuals to infer hostile intent in others, even in ambiguous circumstances. This is referred to as the 'hostile interpretation bias'. Whether this hostile interpretation bias also exists in basal information processing, such as perception of facial emotion, is not yet known, especially with respect to the perception of ambiguous expressions. In addition, little is known about how this potential bias in facial emotion perception is related to specific characteristics of aggression. In the present study, conducted in a penitentiary setting with detained male adults, we investigated if violent offenders (n = 71) show a stronger tendency to interpret ambiguous facial expressions on a computer task as angry rather than happy, compared to non-violent offenders (n = 14) and to a control group of healthy volunteers (n = 32). We also investigated if hostile perception of facial expressions is related to specific characteristics of aggression, such as proactive and reactive aggression. No clear statistical evidence was found that violent offenders perceived facial emotional expressions as more angry than non-violent offenders or healthy volunteers. A regression analysis in the violent offender group showed that only age and a self-report measure of hostility predicted outcome on the emotion perception task. Other traits, such as psychopathic traits, intelligence, attention and a tendency to jump to conclusions were not associated with interpretation of anger in facial emotional expressions. We discuss the possible impact of the study design and population studied on our results, as well as implications for future studies.

  15. Perceiving the evil eye: Investigating hostile interpretation of ambiguous facial emotional expression in violent and non-violent offenders

    PubMed Central

    Masthoff, Erik D. M.; Munafò, Marcus R.; Penton-Voak, Ian S.

    2017-01-01

    Research into the causal and perpetuating factors influencing aggression has partly focused on the general tendency of aggression-prone individuals to infer hostile intent in others, even in ambiguous circumstances. This is referred to as the ‘hostile interpretation bias’. Whether this hostile interpretation bias also exists in basal information processing, such as perception of facial emotion, is not yet known, especially with respect to the perception of ambiguous expressions. In addition, little is known about how this potential bias in facial emotion perception is related to specific characteristics of aggression. In the present study, conducted in a penitentiary setting with detained male adults, we investigated if violent offenders (n = 71) show a stronger tendency to interpret ambiguous facial expressions on a computer task as angry rather than happy, compared to non-violent offenders (n = 14) and to a control group of healthy volunteers (n = 32). We also investigated if hostile perception of facial expressions is related to specific characteristics of aggression, such as proactive and reactive aggression. No clear statistical evidence was found that violent offenders perceived facial emotional expressions as more angry than non-violent offenders or healthy volunteers. A regression analysis in the violent offender group showed that only age and a self-report measure of hostility predicted outcome on the emotion perception task. Other traits, such as psychopathic traits, intelligence, attention and a tendency to jump to conclusions were not associated with interpretation of anger in facial emotional expressions. We discuss the possible impact of the study design and population studied on our results, as well as implications for future studies. PMID:29190802

  16. Face perception in women with Turner syndrome and its underlying factors.

    PubMed

    Anaki, David; Zadikov Mor, Tal; Gepstein, Vardit; Hochberg, Ze'ev

    2016-09-01

    Turner syndrome (TS) is a chromosomal condition that affects development in females. It is characterized by short stature, ovarian failure and other congenital malformations, due to a partial or complete absence of one sex chromosome. Women with TS frequently suffer from various physical and hormonal dysfunctions, along with impairments in visual-spatial processing and difficulties in social cognition. Previous research has also shown difficulties in face and emotion perception. In the current study we examined two questions. The first was whether women with TS, who are impaired in face perception, also suffer from deficits in face-specific processes. The second question was whether these face impairments in TS are related to visual-spatial perceptual dysfunctions exhibited by TS individuals, or to impaired social cognition skills. Twenty-six women with TS and 26 control participants were tested on various cognitive and psychological tests to assess visual-spatial perception, face and facial expression perception, and social cognition skills. Results show that women with TS were less accurate in face perception and facial expression processing, yet they exhibited normal face-specific processes (configural and holistic processing). They also showed difficulties in spatial perception and social cognition capacities. Additional analyses revealed that their face perception impairments were related to their deficits in visual-spatial processing. Thus, our results do not support the claim that the impairments in face processing observed in TS are related to difficulties in social cognition. Rather, our data point to the possibility that face perception difficulties in TS stem from visual-spatial impairments and may not be specific to faces. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Differential roles of low and high spatial frequency content in abnormal facial emotion perception in schizophrenia.

    PubMed

    McBain, Ryan; Norton, Daniel; Chen, Yue

    2010-09-01

    While schizophrenia patients are impaired at facial emotion perception, the role of basic visual processing in this deficit remains relatively unclear. We examined emotion perception when spatial frequency content of facial images was manipulated via high-pass and low-pass filtering. Unlike controls (n=29), patients (n=30) perceived images with low spatial frequencies as more fearful than those without this information, across emotional salience levels. Patients also perceived images with high spatial frequencies as happier. In controls, this effect was found only at low emotional salience. These results indicate that basic visual processing has an amplified modulatory effect on emotion perception in schizophrenia. (c) 2010 Elsevier B.V. All rights reserved.
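
    The spatial frequency manipulation described above is commonly implemented by blurring an image to keep only its low spatial frequencies and subtracting the blurred image from the original to keep only its high spatial frequencies. A minimal Python sketch along those lines follows; the face image and filter cutoff are hypothetical placeholders, not the study's parameters.

        # Minimal sketch of low-pass / high-pass spatial frequency filtering of a
        # face image via Gaussian blurring (placeholder image and cutoff; the
        # study's exact filter parameters are not reproduced here).
        import numpy as np
        from scipy.ndimage import gaussian_filter

        face = np.random.default_rng(2).random((256, 256))  # placeholder greyscale face

        sigma = 8.0                                # larger sigma -> lower cutoff frequency
        low_pass = gaussian_filter(face, sigma)    # coarse structure only (low SF)
        high_pass = face - low_pass                # fine detail only (high SF)

        print("low-pass range: ", low_pass.min(), low_pass.max())
        print("high-pass range:", high_pass.min(), high_pass.max())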

  18. Selective attention modulates early human evoked potentials during emotional face-voice processing.

    PubMed

    Ho, Hao Tam; Schröger, Erich; Kotz, Sonja A

    2015-04-01

    Recent findings on multisensory integration suggest that selective attention influences cross-sensory interactions from an early processing stage. Yet, in the field of emotional face-voice integration, the hypothesis prevails that facial and vocal emotional information interacts preattentively. Using ERPs, we investigated the influence of selective attention on the perception of congruent versus incongruent combinations of neutral and angry facial and vocal expressions. Attention was manipulated via four tasks that directed participants to (i) the facial expression, (ii) the vocal expression, (iii) the emotional congruence between the face and the voice, and (iv) the synchrony between lip movement and speech onset. Our results revealed early interactions between facial and vocal emotional expressions, manifested as modulations of the auditory N1 and P2 amplitude by incongruent emotional face-voice combinations. Although audiovisual emotional interactions within the N1 time window were affected by the attentional manipulations, interactions within the P2 modulation showed no such attentional influence. Thus, we propose that the N1 and P2 are functionally dissociated in terms of emotional face-voice processing and discuss evidence in support of the notion that the N1 is associated with cross-sensory prediction, whereas the P2 relates to the derivation of an emotional percept. Essentially, our findings put the integration of facial and vocal emotional expressions into a new perspective-one that regards the integration process as a composite of multiple, possibly independent subprocesses, some of which are susceptible to attentional modulation, whereas others may be influenced by additional factors.

  19. Contribution of Interoceptive Information to Emotional Processing: Evidence from Individuals with Spinal Cord Injury.

    PubMed

    Pistoia, Francesca; Carolei, Antonio; Sacco, Simona; Conson, Massimiliano; Pistarini, Caterina; Cazzulani, Benedetta; Stewart, Janet; Franceschini, Marco; Sarà, Marco

    2015-12-15

    There is much evidence to suggest that recognizing and sharing emotions with others require a first-hand experience of those emotions in our own body which, in turn, depends on the adequate perception of our own internal state (interoception) through preserved sensory pathways. Here we explored the contribution of interoception to first-hand emotional experiences and to the recognition of others' emotions. For this aim, 10 individuals with sensory deafferentation as a consequence of high spinal cord injury (SCI; five males and five females; mean age, 48 ± 14.8 years) and 20 healthy subjects matched for age, sex, and education were included in the study. Recognition of facial expressions and judgment of emotionally evocative scenes were investigated in both groups using the Ekman and Friesen set of Pictures of Facial Affect and the International Affective Picture System. A two-way mixed analysis of variance and post hoc comparisons were used to test differences among emotions and groups. Compared with healthy subjects, individuals with SCI, when asked to judge emotionally evocative scenes, had difficulties in judging their own emotional response to complex scenes eliciting fear and anger, while they were able to recognize the same emotions when conveyed by facial expressions. Our findings endorse a simulative view of emotional processing according to which the proper perception of our own internal state (interoception), through preserved sensory pathways, is crucial for first-hand experiences of the more primordial emotions, such as fear and anger.

  20. Baby schema in human and animal faces induces cuteness perception and gaze allocation in children.

    PubMed

    Borgi, Marta; Cogliati-Dezza, Irene; Brelsford, Victoria; Meints, Kerstin; Cirulli, Francesca

    2014-01-01

    The baby schema concept was originally proposed as a set of infantile traits with high appeal for humans, subsequently shown to elicit caretaking behavior and to affect cuteness perception and attentional processes. However, it is unclear whether the response to the baby schema may be extended to the human-animal bond context. Moreover, questions remain as to whether the cute response is constant and persistent or whether it changes with development. In the present study we parametrically manipulated the baby schema in images of humans, dogs, and cats. We analyzed responses of 3-6 year-old children, using both explicit (i.e., cuteness ratings) and implicit (i.e., eye gaze patterns) measures. By means of eye-tracking, we assessed children's preferential attention to images varying only for the degree of baby schema and explored participants' fixation patterns during a cuteness task. For comparative purposes, cuteness ratings were also obtained in a sample of adults. Overall our results show that the response to an infantile facial configuration emerges early during development. In children, the baby schema affects both cuteness perception and gaze allocation to infantile stimuli and to specific facial features, an effect not simply limited to human faces. In line with previous research, results confirm human positive appraisal toward animals and inform both educational and therapeutic interventions involving pets, helping to minimize risk factors (e.g., dog bites).

  1. The Effect of Target Sex, Sexual Dimorphism, and Facial Attractiveness on Perceptions of Target Attractiveness and Trustworthiness

    PubMed Central

    Hu, Yuanyan; Abbasi, Najam ul Hasan; Zhang, Yang; Chen, Hong

    2018-01-01

    Facial sexual dimorphism has been widely shown to influence facial attractiveness and social interactions. However, earlier studies show inconsistent results on the effect of sexual dimorphism on facial attractiveness judgments. Previous studies suggest that the level of attractiveness might act as a moderating variable in the relationship between sexual dimorphism and facial preference; they have also typically focused on the effect of sexual dimorphism on general attractiveness ratings rather than on trustworthiness perception. Male and female participants viewed target male and female faces that varied in attractiveness (more attractive or less attractive) and sexual dimorphism (masculine or feminine). Participants rated the attractiveness of the faces and reported how much money they would give to the target person as a measure of trust. For the facial attractiveness ratings, (a) both male and female participants preferred masculine male faces to feminine male ones under the more attractive condition, whereas they preferred feminine male faces to masculine male ones under the less attractive condition; (b) all participants preferred feminine female faces to masculine female ones under the less attractive condition, while there were no differences between feminine and masculine female faces under the more attractive condition. For trustworthiness perception, (a) participants showed no preference between masculine and feminine male faces under either the more attractive or the less attractive condition; (b) however, all participants preferred masculine female faces over feminine female faces under the more attractive condition, and exhibited no preference between feminine and masculine female faces under the less attractive condition. These findings suggest that the attractiveness of the facial stimuli may help explain the inconsistent results of previous studies on the effect of facial sexual dimorphism on facial attractiveness. Implications of the effect of target facial sexual dimorphism on participants’ trustworthiness perception are also discussed.

  2. The influence of sagittal position of the mandible in facial attractiveness and social perception.

    PubMed

    Sena, Lorena Marques Ferreira de; Damasceno E Araújo, Lislley Anne Lacerda; Farias, Arthur Costa Rodrigues; Pereira, Hallissa Simplício Gomes

    2017-01-01

    This study compares the perceptions of orthodontists, maxillofacial surgeons, visual artists and laypersons when evaluating the influence of the sagittal position of the mandible (in lateral view) on facial attractiveness, on job-hiring decisions, and on perceived socioeconomic profile. A black male, a white male, a black female and a white female with harmonious faces served as models to obtain a facial profile photograph. Each photograph was digitally manipulated to obtain seven facial profiles: an ideal profile, three simulating mandibular advancement, and three simulating mandibular retrusion, producing 28 photographs. These photographs were evaluated through a questionnaire by orthodontists, maxillofacial surgeons, visual artists and laypersons. The anteroposterior positioning of the mandible exerted a strong influence on the level of facial attractiveness, but few significant differences between the groups of evaluators were observed (p < 0.05). The profiles rated as most attractive were also rated as the most likely to be hired for a job position and as having the best socioeconomic condition.

  3. Reduced Accuracy and Sensitivity in the Perception of Emotional Facial Expressions in Individuals with High Autism Spectrum Traits

    ERIC Educational Resources Information Center

    Poljac, Ervin; Poljac, Edita; Wagemans, Johan

    2013-01-01

    Autism spectrum disorder (ASD) is among other things characterized by specific impairments in emotion processing. It is not clear, however, to what extent the typical decline in affective functioning is related to the specific autistic traits. We employed "The Autism Spectrum-Quotient" (AQ) to quantify autistic traits in a group of 500…

  4. Through the eyes of a child: preschoolers' identification of emotional expressions from the child affective facial expression (CAFE) set.

    PubMed

    LoBue, Vanessa; Baker, Lewis; Thrasher, Cat

    2017-08-10

    Researchers have been interested in the perception of human emotional expressions for decades. Importantly, most empirical work in this domain has relied on controlled stimulus sets of adults posing for various emotional expressions. Recently, the Child Affective Facial Expression (CAFE) set was introduced to the scientific community, featuring a large validated set of photographs of preschool aged children posing for seven different emotional expressions. Although the CAFE set was extensively validated using adult participants, the set was designed for use with children. It is therefore necessary to verify that adult validation applies to child performance. In the current study, we examined 3- to 4-year-olds' identification of a subset of children's faces in the CAFE set, and compared it to adult ratings cited in previous research. Our results demonstrate an exceptionally strong relationship between adult ratings of the CAFE photos and children's ratings, suggesting that the adult validation of the set can be applied to preschool-aged participants. The results are discussed in terms of methodological implications for the use of the CAFE set with children, and theoretical implications for using the set to study the development of emotion perception in early childhood.

  5. African perceptions of female attractiveness.

    PubMed

    Coetzee, Vinet; Faerber, Stella J; Greeff, Jaco M; Lefevre, Carmen E; Re, Daniel E; Perrett, David I

    2012-01-01

    Little is known about mate choice preferences outside Western, educated, industrialised, rich and democratic societies, even though these Western populations may be particularly unrepresentative of human populations. To our knowledge, this is the first study to test which facial cues contribute to African perceptions of African female attractiveness and also the first study to test the combined role of facial adiposity, skin colour (lightness, yellowness and redness), skin homogeneity and youthfulness in the facial attractiveness preferences of any population. Results show that youthfulness, skin colour, skin homogeneity and facial adiposity significantly and independently predict attractiveness in female African faces. Younger, thinner women with a lighter, yellower skin colour and a more homogenous skin tone are considered more attractive. These findings provide a more global perspective on human mate choice and point to a universal role for these four facial cues in female facial attractiveness.
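
    The claim that the four cues predict attractiveness significantly and independently corresponds to entering them together in a multiple regression on attractiveness ratings. The Python sketch below illustrates that analysis with hypothetical per-face measurements; it is not the study's data or exact model.

        # Minimal sketch (hypothetical data): multiple regression testing whether
        # youthfulness, skin colour, skin homogeneity and facial adiposity each
        # independently predict rated facial attractiveness.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n_faces = 120
        predictors = np.column_stack([
            rng.random(n_faces),   # youthfulness (hypothetical standardized measure)
            rng.random(n_faces),   # skin yellowness
            rng.random(n_faces),   # skin homogeneity
            rng.random(n_faces),   # facial adiposity
        ])
        attractiveness = predictors @ np.array([0.5, 0.3, 0.2, -0.4]) + rng.normal(0, 0.3, n_faces)

        model = sm.OLS(attractiveness, sm.add_constant(predictors)).fit()
        print(model.summary())  # per-predictor coefficients and p-values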

  6. Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions.

    PubMed

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio

    2015-01-01

    The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expression recognition task. Recognition bias was measured as participants' tendency to over-attribute the anger label to other negative facial expressions. Participants' heart rate was assessed and related to their behavioral performance, as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants' performance was controlled for age, cognitive and educational levels and for naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, a significant effect of heart rate on participants' tendency to use the anger label was found. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children's "pre-existing bias" for anger labeling in a forced-choice emotion recognition task. Moreover, they strengthen the thesis according to which the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim's perceptual and attentional focus on salient environmental social stimuli.
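
    As a concrete illustration of the bias measure used in this study, over-attribution of the anger label can be quantified as the proportion of non-anger negative expressions (e.g., fear, sadness, disgust) that a participant labels as angry. A minimal Python sketch with hypothetical responses follows.

        # Minimal sketch (hypothetical responses): anger over-attribution bias,
        # i.e. the proportion of non-anger negative expressions labelled "anger"
        # in a forced-choice emotion recognition task.
        responses = [
            # (true_expression, chosen_label)
            ("fear", "anger"), ("sadness", "sadness"), ("disgust", "anger"),
            ("fear", "fear"), ("sadness", "anger"), ("disgust", "disgust"),
        ]

        negative_non_anger = [r for r in responses if r[0] in {"fear", "sadness", "disgust"}]
        bias = sum(label == "anger" for _, label in negative_non_anger) / len(negative_non_anger)
        print(f"anger over-attribution bias: {bias:.2f}")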

  7. Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions

    PubMed Central

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio

    2015-01-01

    The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expression recognition task. Recognition bias was measured as participants’ tendency to over-attribute the anger label to other negative facial expressions. Participants’ heart rate was assessed and related to their behavioral performance, as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants’ performance was controlled for age, cognitive and educational levels and for naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, a significant effect of heart rate on participants’ tendency to use the anger label was found. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children’s “pre-existing bias” for anger labeling in a forced-choice emotion recognition task. Moreover, they strengthen the thesis according to which the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim’s perceptual and attentional focus on salient environmental social stimuli. PMID:26509890

  8. Effects of induced sad mood on facial emotion perception in young and older adults.

    PubMed

    Lawrie, Louisa; Jackson, Margaret C; Phillips, Louise H

    2018-02-15

    Older adults perceive less intense negative emotion in facial expressions compared to their younger counterparts. Prior research has also demonstrated that mood alters facial emotion perception. Nevertheless, there is little evidence evaluating the interactive effects of age and mood on emotion perception. This study investigated the effects of sad mood on younger and older adults' perception of emotional and neutral faces. Participants rated the intensity of stimuli while listening to sad music and in silence. Measures of mood were administered. Younger and older participants rated sad faces as displaying stronger sadness when they experienced sad mood. While younger participants showed no influence of sad mood on happiness ratings of happy faces, older adults rated happy faces as conveying less happiness when they experienced sad mood. This study demonstrates how emotion perception can change when a controlled mood induction procedure is applied to alter mood in young and older participants.

  9. Differences in holistic processing do not explain cultural differences in the recognition of facial expression.

    PubMed

    Yan, Xiaoqian; Young, Andrew W; Andrews, Timothy J

    2017-12-01

    The aim of this study was to investigate the causes of the own-race advantage in facial expression perception. In Experiment 1, we investigated Western Caucasian and Chinese participants' perception and categorization of facial expressions of six basic emotions that included two pairs of confusable expressions (fear and surprise; anger and disgust). People were slightly better at identifying facial expressions posed by own-race members (mainly in anger and disgust). In Experiment 2, we asked whether the own-race advantage was due to differences in the holistic processing of facial expressions. Participants viewed composite faces in which the upper part of one expression was combined with the lower part of a different expression. The upper and lower parts of the composite faces were either aligned or misaligned. Both Chinese and Caucasian participants were better at identifying the facial expressions from the misaligned images, showing interference on recognizing the parts of the expressions created by holistic perception of the aligned composite images. However, this interference from holistic processing was equivalent across expressions of own-race and other-race faces in both groups of participants. Whilst the own-race advantage in recognizing facial expressions does seem to reflect the confusability of certain emotions, it cannot be explained by differences in holistic processing.

  10. High visual resolution matters in audiovisual speech perception, but only for some.

    PubMed

    Alsius, Agnès; Wayne, Rachel V; Paré, Martin; Munhall, Kevin G

    2016-07-01

    The basis for individual differences in the degree to which visual speech input enhances comprehension of acoustically degraded speech is largely unknown. Previous research indicates that fine facial detail is not critical for visual enhancement when auditory information is available; however, these studies did not examine individual differences in ability to make use of fine facial detail in relation to audiovisual speech perception ability. Here, we compare participants based on their ability to benefit from visual speech information in the presence of an auditory signal degraded with noise, modulating the resolution of the visual signal through low-pass spatial frequency filtering and monitoring gaze behavior. Participants who benefited most from the addition of visual information (high visual gain) were more adversely affected by the removal of high spatial frequency information, compared to participants with low visual gain, for materials with both poor and rich contextual cues (i.e., words and sentences, respectively). Differences as a function of gaze behavior between participants with the highest and lowest visual gains were observed only for words, with participants with the highest visual gain fixating longer on the mouth region. Our results indicate that the individual variance in audiovisual speech in noise performance can be accounted for, in part, by better use of fine facial detail information extracted from the visual signal and increased fixation on mouth regions for short stimuli. Thus, for some, audiovisual speech perception may suffer when the visual input (in addition to the auditory signal) is less than perfect.
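
    The visual gain measure used above to split participants is the improvement in speech-in-noise accuracy when the talker's face is visible relative to the auditory-only condition. A minimal Python sketch of that computation, with hypothetical per-participant scores, follows.

        # Minimal sketch (hypothetical scores): visual gain as the improvement in
        # speech-in-noise accuracy from audiovisual over auditory-only
        # presentation, followed by a median split into high/low gain groups.
        from statistics import median

        scores = {
            # participant: (auditory_only_accuracy, audiovisual_accuracy)
            "P01": (0.35, 0.70), "P02": (0.40, 0.55),
            "P03": (0.30, 0.75), "P04": (0.45, 0.50),
        }

        gains = {p: av - a for p, (a, av) in scores.items()}
        cutoff = median(gains.values())
        groups = {p: ("high" if g >= cutoff else "low") for p, g in gains.items()}
        print(gains)
        print(groups)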

  11. An investigation of a novel transdiagnostic model of delusions in a group with positive schizotypal symptoms.

    PubMed

    Cameron, Clare; Kaplan, Ryan A; Rossell, Susan L

    2014-01-01

    Although several theories of delusions have been put forward, most do not offer a comprehensive diagnosis-independent explanation of delusion aetiology. This study used a non-clinical sample to provide empirical support for a novel transdiagnostic model of delusions that implicates aberrant semantic memory and emotion perception processes as key factors in delusion formation and maintenance. It was hypothesised that among a non-clinical sample, people high in schizotypy would demonstrate differences in semantic memory and emotion perception, relative to people low in schizotypy. Using the Cognitive Disorganisation subscale of the Oxford-Liverpool Inventory of Feelings and Experiences, 41 healthy participants were separated into high and low schizotypy groups and completed facial emotion perception and semantic priming tasks. As expected, participants in the high schizotypy group demonstrated different performance on the semantic priming task, reduced facial affect recognition accuracy for anger, and reaction time differences in response to fearful faces. These findings suggest that such processes may be involved in the development of the sorts of unusual beliefs which underlie delusions. Investigation of how emotion perception and semantic memory may interrelate in the aetiology of delusions would be of value in furthering our understanding of their role in delusion formation.

  12. Categorical Perception of Emotional Facial Expressions in Preschoolers

    ERIC Educational Resources Information Center

    Cheal, Jenna L.; Rutherford, M. D.

    2011-01-01

    Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the…

  13. Categorical Representation of Facial Expressions in the Infant Brain

    ERIC Educational Resources Information Center

    Leppanen, Jukka M.; Richmond, Jenny; Vogel-Farley, Vanessa K.; Moulson, Margaret C.; Nelson, Charles A.

    2009-01-01

    Categorical perception, demonstrated as reduced discrimination of within-category relative to between-category differences in stimuli, has been found in a variety of perceptual domains in adults. To examine the development of categorical perception in the domain of facial expression processing, we used behavioral and event-related potential (ERP)…

  14. Contemporary Koreans’ Perceptions of Facial Beauty

    PubMed Central

    An, Soo-Jung; Hwang, Rahil

    2017-01-01

    Background This article investigates current perceptions of beauty among the general public and among physicians who perform aesthetic procedures without specializing in plastic surgery. Methods A cross-sectional interview questionnaire was administered to 290 people in Seoul, South Korea in September 2015. The questionnaire addressed three issues: general attitudes about plastic surgery (Q1), perception of and preferences regarding Korean female celebrities’ facial attractiveness (Q2), and the relative influence of each facial aesthetic subunit on overall facial attractiveness (Q3). The survey’s results were gathered by a professional research agency and classified according to a respondent’s gender, age, and job type (95%±5.75% confidence interval). Statistical analysis was performed using SPSS ver. 10.1, calculating one-way analysis of variance with post hoc analysis and Tukey’s t-test. Results Among the respondents, 38.3% were in favor of aesthetic plastic surgery. The most common source of plastic surgery information was the internet (50.0%). The most powerful factor influencing hospital or clinic selection was the postoperative surgical results of acquaintances (74.9%). We created a composite face of an attractive Korean female, representing the current facial configuration considered appealing to Koreans. Beauty perceptions differed to some degree by gender and generation. We found that there were certain differences in beauty perceptions between general physicians who perform aesthetic procedures and the general public. Conclusions Our study results provide aesthetic plastic surgeons with detailed information about contemporary Korean people’s attitudes toward and perceptions of plastic surgery and the specific characteristics of female Korean faces currently considered attractive, plus trends in these perceptions, which should inform plastic surgeons within their specialized fields. PMID:28946720
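
    The statistical approach reported above (one-way ANOVA with Tukey-style post hoc comparisons, run in SPSS) can be reproduced outside SPSS. The Python sketch below shows the equivalent analysis on hypothetical attractiveness ratings from three respondent groups; the group labels and values are placeholders, not the study's data.

        # Minimal sketch (hypothetical ratings): one-way ANOVA across respondent
        # groups followed by Tukey HSD post hoc comparisons, approximating the
        # SPSS analysis described in the abstract.
        import numpy as np
        from scipy.stats import f_oneway
        from statsmodels.stats.multicomp import pairwise_tukeyhsd

        rng = np.random.default_rng(3)
        ratings = {
            "public_younger": rng.normal(6.0, 1.0, 100),
            "public_older": rng.normal(5.7, 1.0, 100),
            "physicians": rng.normal(6.4, 1.0, 40),
        }

        f_stat, p_value = f_oneway(*ratings.values())
        print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

        values = np.concatenate(list(ratings.values()))
        groups = np.repeat(list(ratings.keys()), [len(v) for v in ratings.values()])
        print(pairwise_tukeyhsd(values, groups))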

  15. Neural correlates of the perception of dynamic versus static facial expressions of emotion.

    PubMed

    Kessler, Henrik; Doyen-Waldecker, Cornelia; Hofer, Christian; Hoffmann, Holger; Traue, Harald C; Abler, Birgit

    2011-04-20

    This study investigated brain areas involved in the perception of dynamic facial expressions of emotion. A group of 30 healthy subjects was measured with fMRI when passively viewing prototypical facial expressions of fear, disgust, sadness and happiness. Using morphing techniques, all faces were displayed as still images and also dynamically as a film clip with the expressions evolving from neutral to emotional. Irrespective of a specific emotion, dynamic stimuli selectively activated bilateral superior temporal sulcus, visual area V5, fusiform gyrus, thalamus and other frontal and parietal areas. Interaction effects of emotion and mode of presentation (static/dynamic) were only found for the expression of happiness, where static faces evoked greater activity in the medial prefrontal cortex. Our results confirm previous findings on neural correlates of the perception of dynamic facial expressions and are in line with studies showing the importance of the superior temporal sulcus and V5 in the perception of biological motion. Differential activation in the fusiform gyrus for dynamic stimuli stands in contrast to classical models of face perception but is coherent with new findings arguing for a more general role of the fusiform gyrus in the processing of socially relevant stimuli.

  16. Spontaneous Gender Categorization in Masking and Priming Studies: Key for Distinguishing Jane from John Doe but Not Madonna from Sinatra

    PubMed Central

    Habibi, Ruth; Khurana, Beena

    2012-01-01

    Facial recognition is key to social interaction; however, with unfamiliar faces only generic information, in the form of facial stereotypes such as gender and age, is available. Is generic information therefore more prominent in unfamiliar than in familiar face processing? To address this question we tapped into two relatively disparate stages of face processing. At the early stages of encoding, we employed perceptual masking to reveal that only perception of unfamiliar face targets is affected by the gender of the facial masks. At the semantic end, using a priming paradigm, we found that while to-be-ignored unfamiliar faces prime lexical decisions to gender-congruent stereotypic words, familiar faces do not. Our findings indicate that gender is a more salient dimension in unfamiliar relative to familiar face processing, both in early perceptual stages and in later semantic stages of person construal. PMID:22389697

  17. Enhanced embodied response following ambiguous emotional processing.

    PubMed

    Beffara, Brice; Ouellet, Marc; Vermeulen, Nicolas; Basu, Anamitra; Morisseau, Tiffany; Mermillod, Martial

    2012-08-01

    It has generally been assumed that high-level cognitive and emotional processes are based on amodal conceptual information. In contrast, however, "embodied simulation" theory states that the perception of an emotional signal can trigger a simulation of the related state in the motor, somatosensory, and affective systems. To study the effect of social context on the mimicry effect predicted by the "embodied simulation" theory, we recorded the electromyographic (EMG) activity of participants when looking at emotional facial expressions. We observed an increase in embodied responses when the participants were exposed to a context involving social valence before seeing the emotional facial expressions. An examination of the dynamic EMG activity induced by two socially relevant emotional expressions (namely joy and anger) revealed enhanced EMG responses of the facial muscles associated with the related social prime (either positive or negative). These results are discussed within the general framework of embodiment theory.

  18. Residual fMRI sensitivity for identity changes in acquired prosopagnosia.

    PubMed

    Fox, Christopher J; Iaria, Giuseppe; Duchaine, Bradley C; Barton, Jason J S

    2013-01-01

    While a network of cortical regions contributes to face processing, the lesions in acquired prosopagnosia are highly variable, and likely result in different combinations of spared and affected regions of this network. To assess the residual functional sensitivities of spared regions in prosopagnosia, we designed a rapid event-related functional magnetic resonance imaging (fMRI) experiment that included pairs of faces with same or different identities and same or different expressions. By measuring the release from adaptation to these facial changes we determined the residual sensitivity of face-selective regions-of-interest. We tested three patients with acquired prosopagnosia, and all three of these patients demonstrated residual sensitivity for facial identity changes in surviving fusiform and occipital face areas of either the right or left hemisphere, but not in the right posterior superior temporal sulcus. The patients also showed some residual capabilities for facial discrimination with normal performance on the Benton Facial Recognition Test, but impaired performance on more complex tasks of facial discrimination. We conclude that fMRI can demonstrate residual processing of facial identity in acquired prosopagnosia, that this adaptation can occur in the same structures that show similar processing in healthy subjects, and further, that this adaptation may be related to behavioral indices of face perception.

  19. Residual fMRI sensitivity for identity changes in acquired prosopagnosia

    PubMed Central

    Fox, Christopher J.; Iaria, Giuseppe; Duchaine, Bradley C.; Barton, Jason J. S.

    2013-01-01

    While a network of cortical regions contributes to face processing, the lesions in acquired prosopagnosia are highly variable, and likely result in different combinations of spared and affected regions of this network. To assess the residual functional sensitivities of spared regions in prosopagnosia, we designed a rapid event-related functional magnetic resonance imaging (fMRI) experiment that included pairs of faces with same or different identities and same or different expressions. By measuring the release from adaptation to these facial changes we determined the residual sensitivity of face-selective regions-of-interest. We tested three patients with acquired prosopagnosia, and all three of these patients demonstrated residual sensitivity for facial identity changes in surviving fusiform and occipital face areas of either the right or left hemisphere, but not in the right posterior superior temporal sulcus. The patients also showed some residual capabilities for facial discrimination with normal performance on the Benton Facial Recognition Test, but impaired performance on more complex tasks of facial discrimination. We conclude that fMRI can demonstrate residual processing of facial identity in acquired prosopagnosia, that this adaptation can occur in the same structures that show similar processing in healthy subjects, and further, that this adaptation may be related to behavioral indices of face perception. PMID:24151479

  20. Attractiveness as a Function of Skin Tone and Facial Features: Evidence from Categorization Studies.

    PubMed

    Stepanova, Elena V; Strube, Michael J

    2018-01-01

    Participants rated the attractiveness and racial typicality of male faces varying in their facial features from Afrocentric to Eurocentric and in skin tone from dark to light in two experiments. Experiment 1 provided evidence that facial features and skin tone have an interactive effect on perceptions of attractiveness and mixed-race faces are perceived as more attractive than single-race faces. Experiment 2 further confirmed that faces with medium levels of skin tone and facial features are perceived as more attractive than faces with extreme levels of these factors. Black phenotypes (combinations of dark skin tone and Afrocentric facial features) were rated as more attractive than White phenotypes (combinations of light skin tone and Eurocentric facial features); ambiguous faces (combinations of Afrocentric and Eurocentric physiognomy) with medium levels of skin tone were rated as the most attractive in Experiment 2. Perceptions of attractiveness were relatively independent of racial categorization in both experiments.

  1. Curvilinear relationship between phonological working memory load and social-emotional modulation

    PubMed Central

    Mano, Quintino R.; Brown, Gregory G.; Bolden, Khalima; Aupperle, Robin; Sullivan, Sarah; Paulus, Martin P.; Stein, Murray B.

    2015-01-01

    Accumulating evidence suggests that working memory load is an important factor for the interplay between cognitive and facial-affective processing. However, it is unclear how distraction caused by perception of faces interacts with load-related performance. We developed a modified version of the delayed match-to-sample task wherein task-irrelevant facial distracters were presented early in the rehearsal of pseudoword memoranda that varied incrementally in load size (1-syllable, 2-syllables, or 3-syllables). Facial distracters displayed happy, sad, or neutral expressions in Experiment 1 (N=60) and happy, fearful, or neutral expressions in Experiment 2 (N=29). Facial distracters significantly disrupted task performance in the intermediate load condition (2-syllable) but not in the low or high load conditions (1- and 3-syllables, respectively), an interaction replicated and generalised in Experiment 2. All facial distracters disrupted working memory in the intermediate load condition irrespective of valence, suggesting a primary and general effect of distraction caused by faces. However, sad and fearful faces tended to be less disruptive than happy faces, suggesting a secondary and specific valence effect. Working memory appears to be most vulnerable to social-emotional information at intermediate loads. At low loads, spare capacity is capable of accommodating the combinatorial load (1-syllable plus facial distracter), whereas high loads maximised capacity and prevented facial stimuli from occupying working memory slots and causing disruption. PMID:22928750

  2. Beyond pleasure and pain: Facial expression ambiguity in adults and children during intense situations.

    PubMed

    Wenzler, Sofia; Levine, Sarah; van Dick, Rolf; Oertel-Knöchel, Viola; Aviezer, Hillel

    2016-09-01

    According to psychological models as well as common intuition, intense positive and negative situations evoke highly distinct emotional expressions. Nevertheless, recent work has shown that when judging isolated faces, the affective valence of winning and losing professional tennis players is hard to differentiate. However, expressions produced by professional athletes during publicly broadcasted sports events may be strategically controlled. To shed light on this matter we examined if ordinary people's spontaneous facial expressions evoked during highly intense situations are diagnostic for the situational valence. In Experiment 1 we compared reactions to highly intense positive situations (surprise soldier reunions) versus highly intense negative situations (terror attacks). In Experiment 2, we turned to children and compared facial reactions to highly positive situations (e.g., a child receiving a surprise trip to Disneyland) versus highly negative situations (e.g., a child discovering her parents ate up all her Halloween candy). The results demonstrate that facial expressions of both adults and children are often not diagnostic for the valence of the situation. These findings demonstrate the ambiguity of extreme facial expressions and highlight the importance of context in everyday emotion perception. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  3. Non-rigid, but not rigid, motion interferes with the processing of structural face information in developmental prosopagnosia.

    PubMed

    Maguinness, Corrina; Newell, Fiona N

    2015-04-01

    There is growing evidence to suggest that facial motion is an important cue for face recognition. However, it is poorly understood whether motion is integrated with facial form information or whether it provides an independent cue to identity. To provide further insight into this issue, we compared the effect of motion on face perception in two developmental prosopagnosics and age-matched controls. Participants first learned faces presented dynamically (video), or in a sequence of static images, in which rigid (viewpoint) or non-rigid (expression) changes occurred. Immediately following learning, participants were required to match a static face image to the learned face. Test face images varied by viewpoint (Experiment 1) or expression (Experiment 2) and were learned or novel face images. We found similar performance across prosopagnosics and controls in matching facial identity across changes in viewpoint when the learned face was shown moving in a rigid manner. However, non-rigid motion interfered with face matching across changes in expression for both individuals with prosopagnosia, relative to the performance of control participants. In contrast, non-rigid motion did not differentially affect the matching of facial expressions across changes in identity for either prosopagnosic (Experiment 3). Our results suggest that whilst the processing of rigid motion information of a face may be preserved in developmental prosopagnosia, non-rigid motion can specifically interfere with the representation of structural face information. Taken together, these results suggest that both form and motion cues are important in face perception and that these cues are likely integrated in the representation of facial identity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Developmental Change in Infant Categorization: The Perception of Correlations among Facial Features.

    ERIC Educational Resources Information Center

    Younger, Barbara

    1992-01-01

    Tested 7 and 10 month olds for perception of correlations among facial features. After habituation to faces displaying a pattern of correlation, 10 month olds generalized to a novel face that preserved the pattern of correlation but showed increased attention to a novel face that violated the pattern. (BC)

  5. The Influence of Averageness on Adults' Perceptions of Attractiveness: The Effect of Early Visual Deprivation.

    PubMed

    Vingilis-Jaremko, Larissa; Maurer, Daphne; Rhodes, Gillian; Jeffery, Linda

    2016-08-03

    Adults who missed early visual input because of congenital cataracts later have deficits in many aspects of face processing. Here we investigated whether they make normal judgments of facial attractiveness. In particular, we studied whether their perceptions are affected normally by a face's proximity to the population mean, as is true of typically developing adults, who find average faces to be more attractive than most other faces. We compared the judgments of facial attractiveness of 12 cataract-reversal patients to norms established from 36 adults with normal vision. Participants viewed pairs of adult male and adult female faces that had been transformed 50% toward and 50% away from their respective group averages, and selected which face was more attractive. Averageness influenced patients' judgments of attractiveness, but to a lesser extent than controls. The results suggest that cataract-reversal patients are able to develop a system for representing faces with a privileged position for an average face, consistent with evidence from identity aftereffects. However, early visual experience is necessary to set up the neural architecture necessary for averageness to influence perceptions of attractiveness with its normal potency. © The Author(s) 2016.

  6. Early Sign Language Experience Goes along with an Increased Cross-Modal Gain for Affective Prosodic Recognition in Congenitally Deaf CI Users

    ERIC Educational Resources Information Center

    Fengler, Ineke; Delfau, Pia-Céline; Röder, Brigitte

    2018-01-01

    It is yet unclear whether congenitally deaf cochlear implant (CD CI) users' visual and multisensory emotion perception is influenced by their history in sign language acquisition. We hypothesized that early-signing CD CI users, relative to late-signing CD CI users and hearing, non-signing controls, show better facial expression recognition and…

  7. Face in profile view reduces perceived facial expression intensity: an eye-tracking study.

    PubMed

    Guo, Kun; Shaw, Heather

    2015-02-01

    Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, having a mechanism which allows invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because diagnostic cues from local facial features for decoding expressions could vary with viewpoints. Here we manipulated orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgust and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although quantitatively viewpoint had expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that the viewpoint-invariant facial expression processing is categorical perception, which could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Empathetic perspective-taking is impaired in schizophrenia: evidence from a study of emotion attribution and theory of mind.

    PubMed

    Langdon, Robyn; Coltheart, Max; Ward, Philip B

    2006-03-01

    Schizophrenia and autism are clinically distinct yet both disorders are characterised by theory of mind (ToM) deficits. Autistic individuals fail to appreciate false beliefs, yet understand the causal connections between behavioural events and simple emotions. Findings of this type have promoted the view that ToM deficits in autism reflect a domain-specific difficulty with appreciating the representational nature of epistemic mental states (i.e., beliefs and intentions and not emotions). This study examines whether the same holds true for schizophrenia. A picture-sequencing task assessed capacity to infer false beliefs in patients with schizophrenia and healthy controls. To assess emotion attribution, participants were shown cartoon strips of events likely to elicit strong emotional reactions in story characters. Characters' faces were blanked out. Participants were instructed to think about how the characters would be feeling in order to match up the cards depicting facial affect appropriately. Participants later named emotions depicted in facial affect cards. Patients were as capable as controls of identifying cartoon facial expressions, yet had greater difficulties with: (a) attributing emotions based on circumstances; and (b) inferring false beliefs. Schizophrenia patients, unlike autistic individuals, suffer a domain-general difficulty with empathetic perspective-taking that affects equally their appreciation of other people's beliefs, percepts, and emotions.

  9. Unconscious processing of facial affect in children and adolescents.

    PubMed

    Killgore, William D S; Yurgelun-Todd, Deborah A

    2007-01-01

    In a previous study, with adults, we demonstrated that the amygdala and anterior cingulate gyrus are differentially responsive to happy and sad faces presented subliminally. Because the ability to perceive subtle facial signals communicating sadness is an important aspect of prosocial development, and is critical for empathic behavior, we examined this phenomenon from a developmental perspective using a backward masking paradigm. While undergoing functional magnetic resonance imaging (fMRI), 10 healthy adolescent children were presented with a series of happy and sad facial expressions, each lasting 20 ms and masked immediately by a neutral face to prevent conscious awareness of the affective expression. Relative to fixation baseline, masked sad faces activated the right amygdala, whereas masked happy faces failed to activate any of the regions of interest. Direct comparison between masked happy and sad faces revealed valence specific differences in the anterior cingulate gyrus. When the data were compared statistically to our previous sample of adults, the adolescent group showed significantly greater activity in the right amygdala relative to the adults during the masked sad condition. Groups also differed in several non-hypothesized regions. Development of unconscious perception from adolescence into adulthood appears to be accompanied by reduced activity within limbic affect processing systems, and perhaps increased involvement of other cortical and cerebellar systems.

  10. The impact of facial abnormalities and their spatial position on perception of cuteness and attractiveness of infant faces

    PubMed Central

    Lewis, Jennifer; Roberson, Debi

    2017-01-01

    Research has demonstrated that how “cute” an infant is perceived to be has consequences for caregiving. Infants with facial abnormalities receive lower ratings of cuteness, but relatively little is known about how different abnormalities and their location affect these aesthetic judgements. The objective of the current study was to compare the impact of different abnormalities on the perception of infant faces, while controlling for infant identity. In two experiments, adult participants gave ratings of cuteness and attractiveness in response to face images that had been edited to introduce common facial abnormalities. Stimulus faces displayed either a haemangioma (a small, benign birth mark), strabismus (an abnormal alignment of the eyes) or a cleft lip (an abnormal opening in the upper lip). In Experiment 1, haemangioma had less of a detrimental effect on ratings than the more severe abnormalities. In Experiment 2, we manipulated the position of a haemangioma on the face. We found small but robust effects of this position, with abnormalities in the top and on the left of the face receiving lower cuteness ratings. This is consistent with previous research showing that people attend more to the top of the face (particularly the eyes) and to the left hemifield. PMID:28749958

  11. Visual attention to variation in female facial skin color distribution.

    PubMed

    Fink, Bernhard; Matts, Paul J; Klingenberg, Heiner; Kuntze, Sebastian; Weege, Bettina; Grammer, Karl

    2008-06-01

    Visible skin condition of women is argued to influence human physical attraction. Recent research has shown that people are sensitive to variation in skin color distribution, and such variation affects visual perception of female facial attractiveness, healthiness, and age. The eye gaze of 39 males and females, aged 13 to 45 years, was tracked while they viewed images of shape- and topography-standardized stimulus faces that varied only in terms of skin color distribution. The number of fixations and dwell time were significantly higher when viewing stimulus faces with the homogeneous skin color distribution of young people, compared with those of more elderly people. In accordance with recent research, facial stimuli with even skin tones were also judged to be younger and received higher attractiveness ratings. Finally, visual attention measures were negatively correlated with perceived age, but positively associated with attractiveness judgments. Variation in visible skin color distribution (independent of facial form and skin surface topography) is able to selectively attract people's attention toward female faces, and this higher attention results in more positive statements about a woman's face.

  12. Subliminal cues bias perception of facial affect in patients with social phobia: evidence for enhanced unconscious threat processing

    PubMed Central

    Jusyte, Aiste; Schönenberg, Michael

    2014-01-01

    Socially anxious individuals have been shown to exhibit altered processing of facial affect, especially expressions signaling threat. Enhanced unaware processing has been suggested an important mechanism which may give rise to anxious conscious cognition and behavior. This study investigated whether individuals with social anxiety disorder (SAD) are perceptually more vulnerable to the biasing effects of subliminal threat cues compared to healthy controls. In a perceptual judgment task, 23 SAD and 23 matched control participants were asked to rate the affective valence of parametrically manipulated affective expressions ranging from neutral to angry. Each trial was preceded by subliminal presentation of an angry/neutral cue. The SAD group tended to rate target faces as “angry” when the preceding subliminal stimulus was angry vs. neutral, while healthy participants were not biased by the subliminal stimulus presentation. The perceptual bias in SAD was also associated with higher reaction time latencies in the subliminal angry cue condition. The results provide further support for enhanced unconscious threat processing in SAD individuals. The implications for etiology, maintenance, and treatment of SAD are discussed. PMID:25136307

  13. Subliminal cues bias perception of facial affect in patients with social phobia: evidence for enhanced unconscious threat processing.

    PubMed

    Jusyte, Aiste; Schönenberg, Michael

    2014-01-01

    Socially anxious individuals have been shown to exhibit altered processing of facial affect, especially expressions signaling threat. Enhanced unaware processing has been suggested an important mechanism which may give rise to anxious conscious cognition and behavior. This study investigated whether individuals with social anxiety disorder (SAD) are perceptually more vulnerable to the biasing effects of subliminal threat cues compared to healthy controls. In a perceptual judgment task, 23 SAD and 23 matched control participants were asked to rate the affective valence of parametrically manipulated affective expressions ranging from neutral to angry. Each trial was preceded by subliminal presentation of an angry/neutral cue. The SAD group tended to rate target faces as "angry" when the preceding subliminal stimulus was angry vs. neutral, while healthy participants were not biased by the subliminal stimulus presentation. The perceptual bias in SAD was also associated with higher reaction time latencies in the subliminal angry cue condition. The results provide further support for enhanced unconscious threat processing in SAD individuals. The implications for etiology, maintenance, and treatment of SAD are discussed.

  14. Odor Valence Linearly Modulates Attractiveness, but Not Age Assessment, of Invariant Facial Features in a Memory-Based Rating Task

    PubMed Central

    Seubert, Janina; Gregory, Kristen M.; Chamberland, Jessica; Dessirier, Jean-Marc; Lundström, Johan N.

    2014-01-01

    Scented cosmetic products are used across cultures as a way to favorably influence one's appearance. While crossmodal effects of odor valence on perceived attractiveness of facial features have been demonstrated experimentally, it is unknown whether they represent a phenomenon specific to affective processing. In this experiment, we presented odors in the context of a face battery with systematic feature manipulations during a speeded response task. Modulatory effects of linear increases of odor valence were investigated by juxtaposing subsequent memory-based ratings tasks – one predominantly affective (attractiveness) and a second, cognitive (age). The linear modulation pattern observed for attractiveness was consistent with additive effects of face and odor appraisal. Effects of odor valence on age perception were not linearly modulated and may be the result of cognitive interference. Affective and cognitive processing of faces thus appear to differ in their susceptibility to modulation by odors, likely as a result of privileged access of olfactory stimuli to affective brain networks. These results are critically discussed with respect to potential biases introduced by the preceding speeded response task. PMID:24874703

  15. Children's Perceptions of and Beliefs about Facial Maturity

    ERIC Educational Resources Information Center

    Gross, Thomas F.

    2004-01-01

    The author studied children's and young adults' perceptions of facial age and beliefs about the sociability, cognitive ability, and physical fitness of adult faces. From pairs of photographs of adult faces, participants (4-6 years old, 8-10 years old, 13-16 years old, and 19-23 years old) selected the one face that appeared younger, older, better…

  16. Teachers' Perception Regarding Facial Expressions as an Effective Teaching Tool

    ERIC Educational Resources Information Center

    Butt, Muhammad Naeem; Iqbal, Mohammad

    2011-01-01

    The major objective of the study was to explore teachers' perceptions about the importance of facial expression in the teaching-learning process. All the teachers of government secondary schools constituted the population of the study. A sample of 40 teachers, both male and female, in rural and urban areas of district Peshawar, were selected…

  17. A differential neural response to threatening and non-threatening negative facial expressions in paranoid and non-paranoid schizophrenics.

    PubMed

    Phillips, M L; Williams, L; Senior, C; Bullmore, E T; Brammer, M J; Andrew, C; Williams, S C; David, A S

    1999-11-08

    Several studies have demonstrated impaired facial expression recognition in schizophrenia. Few have examined the neural basis for this; none have compared the neural correlates of facial expression perception in different schizophrenic patient subgroups. We compared neural responses to facial expressions in 10 right-handed schizophrenic patients (five paranoid and five non-paranoid) and five normal volunteers using functional Magnetic Resonance Imaging (fMRI). In three 5-min experiments, subjects viewed alternating 30-s blocks of black-and-white facial expressions of either fear, anger or disgust contrasted with expressions of mild happiness. After scanning, subjects categorised each expression. All patients were less accurate in identifying expressions, and showed less activation to these stimuli than normals. Non-paranoids performed poorly in the identification task and failed to activate neural regions that are normally linked with perception of these stimuli. They categorised disgust as either anger or fear more frequently than paranoids, and, in response to disgust expressions, demonstrated activation in the amygdala, a region associated with perception of fearful faces. Paranoids were more accurate in recognising expressions, and demonstrated greater activation than non-paranoids to most stimuli. We provide the first evidence for a distinction between two schizophrenic patient subgroups on the basis of recognition of and neural response to different negative facial expressions.

  18. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task.

    PubMed

    Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants' propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions.

  19. Transient emotional events and individual affective traits affect emotion recognition in a perceptual decision-making task

    PubMed Central

    Qiao-Tasserit, Emilie; Garcia Quesada, Maria; Antico, Lia; Bavelier, Daphne; Vuilleumier, Patrik; Pichon, Swann

    2017-01-01

    Both affective states and personality traits shape how we perceive the social world and interpret emotions. The literature on affective priming has mostly focused on brief influences of emotional stimuli and emotional states on perceptual and cognitive processes. Yet this approach does not fully capture more dynamic processes at the root of emotional states, with such states lingering beyond the duration of the inducing external stimuli. Our goal was to put in perspective three different types of affective states (induced affective states, more sustained mood states and affective traits such as depression and anxiety) and investigate how they may interact and influence emotion perception. Here, we hypothesized that absorption into positive and negative emotional episodes generates sustained affective states that outlast the episode period and bias the interpretation of facial expressions in a perceptual decision-making task. We also investigated how such effects are influenced by more sustained mood states and by individual affect traits (depression and anxiety) and whether they interact. Transient emotional states were induced using movie-clips, after which participants performed a forced-choice emotion classification task with morphed facial expressions ranging from fear to happiness. Using a psychometric approach, we show that negative (vs. neutral) clips increased participants’ propensity to classify ambiguous faces as fearful during several minutes. In contrast, positive movies biased classification toward happiness only for those clips perceived as most absorbing. Negative mood, anxiety and depression had a stronger effect than transient states and increased the propensity to classify ambiguous faces as fearful. These results provide the first evidence that absorption and different temporal dimensions of emotions have a significant effect on how we perceive facial expressions. PMID:28151976
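
    The psychometric approach mentioned in the two records above can be illustrated with a short sketch: fit a logistic function to the proportion of "happy" classifications along the fear-to-happiness morph continuum and compare the point of subjective equality (PSE) between induction conditions. The morph coding, response proportions, and starting values below are invented for illustration and are not the authors' data or model.

    ```python
    # Minimal sketch (invented data): fit a logistic psychometric function to
    # forced-choice classifications of morphed fear-happiness faces and compare
    # the point of subjective equality (PSE) between two induction conditions.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, pse, slope):
        """Proportion of 'happy' responses as a function of morph level."""
        return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

    morph = np.linspace(0.0, 1.0, 9)  # 0 = full fear, 1 = full happiness
    # Hypothetical proportions of 'happy' responses after neutral vs. negative clips.
    p_neutral = np.array([0.02, 0.05, 0.12, 0.30, 0.55, 0.78, 0.92, 0.97, 0.99])
    p_negative = np.array([0.01, 0.03, 0.07, 0.18, 0.38, 0.62, 0.85, 0.94, 0.98])

    (pse_neu, _), _ = curve_fit(logistic, morph, p_neutral, p0=[0.5, 0.1])
    (pse_neg, _), _ = curve_fit(logistic, morph, p_negative, p0=[0.5, 0.1])
    print(f"PSE after neutral clips:  {pse_neu:.3f}")
    print(f"PSE after negative clips: {pse_neg:.3f}  # rightward shift = more faces read as fearful")
    ```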

  20. Sex-related differences in behavioral and amygdalar responses to compound facial threat cues.

    PubMed

    Im, Hee Yeon; Adams, Reginald B; Cushing, Cody A; Boshyan, Jasmine; Ward, Noreen; Kveraga, Kestutis

    2018-03-08

    During face perception, we integrate facial expression and eye gaze to take advantage of their shared signals. For example, fear with averted gaze provides a congruent avoidance cue, signaling both threat presence and its location, whereas fear with direct gaze sends an incongruent cue, leaving threat location ambiguous. It has been proposed that the processing of different combinations of threat cues is mediated by dual processing routes: reflexive processing via magnocellular (M) pathway and reflective processing via parvocellular (P) pathway. Because growing evidence has identified a variety of sex differences in emotional perception, here we also investigated how M and P processing of fear and eye gaze might be modulated by observer's sex, focusing on the amygdala, a structure important to threat perception and affective appraisal. We adjusted luminance and color of face stimuli to selectively engage M or P processing and asked observers to identify emotion of the face. Female observers showed more accurate behavioral responses to faces with averted gaze and greater left amygdala reactivity both to fearful and neutral faces. Conversely, males showed greater right amygdala activation only for M-biased averted-gaze fear faces. In addition to functional reactivity differences, females had proportionately greater bilateral amygdala volumes, which positively correlated with behavioral accuracy for M-biased fear. Conversely, in males only the right amygdala volume was positively correlated with accuracy for M-biased fear faces. Our findings suggest that M and P processing of facial threat cues is modulated by functional and structural differences in the amygdalae associated with observer's sex. © 2018 Wiley Periodicals, Inc.

  1. An Adult Developmental Approach to Perceived Facial Attractiveness and Distinctiveness

    PubMed Central

    Ebner, Natalie C.; Luedicke, Joerg; Voelkle, Manuel C.; Riediger, Michaela; Lin, Tian; Lindenberger, Ulman

    2018-01-01

    Attractiveness and distinctiveness constitute facial features with high biological and social relevance. Bringing a developmental perspective to research on social-cognitive face perception, we used a large set of faces taken from the FACES Lifespan Database to examine effects of face and perceiver characteristics on subjective evaluations of attractiveness and distinctiveness in young (20–31 years), middle-aged (44–55 years), and older (70–81 years) men and women. We report novel findings supporting variations by face and perceiver age, in interaction with gender and emotion: although older and middle-aged compared to young perceivers generally rated faces of all ages as more attractive, young perceivers gave relatively higher attractiveness ratings to young compared to middle-aged and older faces. Controlling for variations in attractiveness, older compared to young faces were viewed as more distinctive by young and middle-aged perceivers. Age affected attractiveness more negatively for female than male faces. Furthermore, happy faces were rated as most attractive, while disgusted faces were rated as least attractive, particularly so by middle-aged and older perceivers and for young and female faces. Perceivers largely agreed on distinctiveness ratings for neutral and happy emotions, but older and middle-aged compared to young perceivers rated faces displaying negative emotions as more distinctive. These findings underscore the importance of a lifespan perspective on perception of facial characteristics and suggest possible effects of age on goal-directed perception, social motivation, and in-group bias. This publication makes available picture-specific normative data for experimental stimulus selection. PMID:29867620

  2. Social vision: sustained perceptual enhancement of affective facial cues in social anxiety

    PubMed Central

    McTeague, Lisa M.; Shumen, Joshua R.; Wieser, Matthias J.; Lang, Peter J.; Keil, Andreas

    2010-01-01

    Heightened perception of facial cues is at the core of many theories of social behavior and its disorders. In the present study, we continuously measured electrocortical dynamics in human visual cortex, as evoked by happy, neutral, fearful, and angry faces. Thirty-seven participants endorsing high versus low generalized social anxiety (upper and lower tertiles of 2,104 screened undergraduates) viewed naturalistic faces flickering at 17.5 Hz to evoke steady-state visual evoked potentials (ssVEPs), recorded from 129 scalp electrodes. Electrophysiological data were evaluated in the time-frequency domain after linear source space projection using the minimum norm method. Source estimation indicated an early visual cortical origin of the face-evoked ssVEP, which showed sustained amplitude enhancement for emotional expressions specifically in individuals with pervasive social anxiety. Participants in the low symptom group showed no such sensitivity, and a correlational analysis across the entire sample revealed a strong relationship between self-reported interpersonal anxiety/avoidance and enhanced visual cortical response amplitude for emotional, versus neutral expressions. This pattern was maintained across the 3500 ms viewing epoch, suggesting that temporally sustained, heightened perceptual bias towards affective facial cues is associated with generalized social anxiety. PMID:20832490
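
    The frequency-tagging logic behind the ssVEP measure described above can be sketched in a few lines: the response to faces flickering at 17.5 Hz is read out as the amplitude of the EEG spectrum at that tagging frequency. The sampling rate, epoch length, and simulated signal below are assumptions for illustration only; the study's actual pipeline (129-channel recording, minimum norm source projection, time-frequency analysis) is far more involved.

    ```python
    # Minimal sketch (simulated single-channel data): estimate ssVEP amplitude
    # at a 17.5 Hz face-flicker frequency from the FFT of one EEG epoch.
    import numpy as np

    fs = 500.0                        # assumed sampling rate (Hz)
    flicker = 17.5                    # stimulation frequency (Hz)
    t = np.arange(0, 3.5, 1.0 / fs)   # 3500 ms viewing epoch

    rng = np.random.default_rng(1)
    eeg = 0.8 * np.sin(2 * np.pi * flicker * t) + rng.normal(0.0, 2.0, t.size)

    # Amplitude spectrum; read out the bin closest to the tagging frequency.
    amps = np.abs(np.fft.rfft(eeg)) * 2.0 / t.size
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
    ssvep_amp = amps[np.argmin(np.abs(freqs - flicker))]
    print(f"ssVEP amplitude at {flicker} Hz: {ssvep_amp:.2f} (arbitrary units)")
    ```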

  3. Brain potentials indicate the effect of other observers' emotions on perceptions of facial attractiveness.

    PubMed

    Huang, Yujing; Pan, Xuwei; Mo, Yan; Ma, Qingguo

    2016-03-23

    Perceptions of facial attractiveness are sensitive to emotional expression of the perceived face. However, little is known about whether the emotional expression on the face of another observer of the perceived face may have an effect on perceptions of facial attractiveness. The present study used the event-related potential technique to examine the social influence of the emotional expression on the face of another observer of the perceived face on perceptions of facial attractiveness. The experiment consisted of two phases. In the first phase, a neutral target face was paired with two images of individuals gazing at the target face with smiling, fearful or neutral expressions. In the second phase, participants were asked to judge the attractiveness of the target face. We found that a target face was judged more attractive when the other observers gazed at it with positive expressions than when the other observers' expressions were negative. Additionally, the results of brain potentials showed that the visual positive component P3, with peak latency from 270 to 330 ms, was larger after participants observed the target face paired with smiling individuals than after the target face paired with neutral individuals. These findings suggested that the facial attractiveness of an individual may be influenced by the emotional expression on the face of another observer of the perceived face. Copyright © 2016. Published by Elsevier Ireland Ltd.

  4. Serial dependence in the perception of attractiveness.

    PubMed

    Xia, Ye; Leib, Allison Yamanashi; Whitney, David

    2016-12-01

    The perception of attractiveness is essential for choices of food, object, and mate preference. Like perception of other visual features, perception of attractiveness is stable despite constant changes of image properties due to factors like occlusion, visual noise, and eye movements. Recent results demonstrate that perception of low-level stimulus features and even more complex attributes like human identity are biased towards recent percepts. This effect is often called serial dependence. Some recent studies have suggested that serial dependence also exists for perceived facial attractiveness, though there is also concern that the reported effects are due to response bias. Here we used an attractiveness-rating task to test the existence of serial dependence in perceived facial attractiveness. Our results demonstrate that perceived face attractiveness was pulled by the attractiveness level of facial images encountered up to 6 s prior. This effect was not due to response bias and did not rely on the previous motor response. This perceptual pull increased as the difference in attractiveness between previous and current stimuli increased. Our results reconcile previously conflicting findings and extend previous work, demonstrating that sequential dependence in perception operates across different levels of visual analysis, even at the highest levels of perceptual interpretation.
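
    One common way to quantify the serial dependence reported above is to regress the part of the current rating not explained by the current face on the attractiveness of the preceding face; a positive slope indicates a pull toward the previous stimulus. The sketch below does this on simulated ratings; the trial count, pull strength, and noise level are invented, and the authors' actual analysis may differ.

    ```python
    # Minimal sketch (simulated data): measure serial dependence in an
    # attractiveness-rating task as the slope relating current-trial rating
    # residuals to the previous face's attractiveness.
    import numpy as np

    rng = np.random.default_rng(2)
    n_trials = 500
    attract = rng.uniform(1, 7, n_trials)            # hypothetical stimulus attractiveness

    # Simulated ratings: driven by the current face, slightly pulled toward the
    # previous face, plus rating noise.
    pull = 0.15
    ratings = attract + rng.normal(0.0, 0.5, n_trials)
    ratings[1:] += pull * (attract[:-1] - attract[1:])

    resid = ratings[1:] - attract[1:]                # rating error on current trial
    prev = attract[:-1] - attract[:-1].mean()        # previous face, mean-centred
    slope = np.polyfit(prev, resid, 1)[0]
    print(f"serial-dependence slope: {slope:.3f} (positive = pull toward previous face)")
    ```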

  5. Social cognition in schizophrenia and healthy aging: differences and similarities.

    PubMed

    Silver, Henry; Bilker, Warren B

    2014-12-01

    Social cognition is impaired in schizophrenia but it is not clear whether this is specific to the illness and whether emotion perception is selectively affected. To study this we examined the perception of emotional and non-emotional clues in facial expressions, a key social cognitive skill, in schizophrenia patients and old healthy individuals, using young healthy individuals as reference. Tests of object recognition, visual orientation, psychomotor speed, and working memory were included to allow multivariate analysis taking into account other cognitive functions. Schizophrenia patients showed impairments in recognition of identity and emotional facial clues compared to young and old healthy groups. Severity was similar to that for object recognition and visuospatial processing. Older and younger healthy groups did not differ from each other on these tests. Schizophrenia patients and old healthy individuals were similarly impaired in the ability to automatically learn new faces during the testing procedure (measured by the CSTFAC index) compared to young healthy individuals. Social cognition is distinctly impaired in schizophrenia compared to healthy aging. Further study is needed to identify the mechanisms of automatic social cognitive learning impairment in schizophrenia patients and healthy aging individuals and determine whether similar neural systems are affected. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. The perception of attractiveness and trustworthiness in male faces affects hypothetical voting decisions differently in wartime and peacetime scenarios.

    PubMed

    Little, Anthony C; Roberts, S Craig; Jones, Benedict C; Debruine, Lisa M

    2012-01-01

    Facial appearance of candidates has been linked to real election outcomes. Here we extend these findings by examining the contributions of attractiveness and trustworthiness in male faces to perceived votability. We first use real faces to show that attractiveness and trustworthiness are positively and independently related to perceptions of good leadership (rating study). We then show that computer graphic manipulations of attractiveness and trustworthiness influence choice of leader (experiments 1 and 2). Finally, we show that changing context from wartime to peacetime can affect which face receives the most votes. Attractive faces were relatively more valued for wartime and trustworthy faces relatively more valued for peacetime (experiments 1 and 2). This pattern suggests that attractiveness, which may indicate health and fitness, is perceived to be a useful attribute in wartime leaders, whereas trustworthiness, which may indicate prosocial traits, is perceived to be more important during peacetime. Our studies highlight the possible role of facial appearance in voting behaviour and the role of attributions of attractiveness and trust. We also show that there may be no general characteristics of faces that make them perceived as the best choice of leader; leaders may be chosen because of characteristics that are perceived as the best for leaders to possess in particular situations.

  7. Involvement of Right STS in Audio-Visual Integration for Affective Speech Demonstrated Using MEG

    PubMed Central

    Hagan, Cindy C.; Woods, Will; Johnson, Sam; Green, Gary G. R.; Young, Andrew W.

    2013-01-01

    Speech and emotion perception are dynamic processes in which it may be optimal to integrate synchronous signals emitted from different sources. Studies of audio-visual (AV) perception of neutrally expressed speech demonstrate supra-additive (i.e., where AV>[unimodal auditory+unimodal visual]) responses in left STS to crossmodal speech stimuli. However, emotions are often conveyed simultaneously with speech; through the voice in the form of speech prosody and through the face in the form of facial expression. Previous studies of AV nonverbal emotion integration showed a role for right (rather than left) STS. The current study therefore examined whether the integration of facial and prosodic signals of emotional speech is associated with supra-additive responses in left (cf. results for speech integration) or right (due to emotional content) STS. As emotional displays are sometimes difficult to interpret, we also examined whether supra-additive responses were affected by emotional incongruence (i.e., ambiguity). Using magnetoencephalography, we continuously recorded eighteen participants as they viewed and heard AV congruent emotional and AV incongruent emotional speech stimuli. Significant supra-additive responses were observed in right STS within the first 250 ms for emotionally incongruent and emotionally congruent AV speech stimuli, which further underscores the role of right STS in processing crossmodal emotive signals. PMID:23950977

  8. Involvement of right STS in audio-visual integration for affective speech demonstrated using MEG.

    PubMed

    Hagan, Cindy C; Woods, Will; Johnson, Sam; Green, Gary G R; Young, Andrew W

    2013-01-01

    Speech and emotion perception are dynamic processes in which it may be optimal to integrate synchronous signals emitted from different sources. Studies of audio-visual (AV) perception of neutrally expressed speech demonstrate supra-additive (i.e., where AV>[unimodal auditory+unimodal visual]) responses in left STS to crossmodal speech stimuli. However, emotions are often conveyed simultaneously with speech; through the voice in the form of speech prosody and through the face in the form of facial expression. Previous studies of AV nonverbal emotion integration showed a role for right (rather than left) STS. The current study therefore examined whether the integration of facial and prosodic signals of emotional speech is associated with supra-additive responses in left (cf. results for speech integration) or right (due to emotional content) STS. As emotional displays are sometimes difficult to interpret, we also examined whether supra-additive responses were affected by emotional incongruence (i.e., ambiguity). Using magnetoencephalography, we continuously recorded eighteen participants as they viewed and heard AV congruent emotional and AV incongruent emotional speech stimuli. Significant supra-additive responses were observed in right STS within the first 250 ms for emotionally incongruent and emotionally congruent AV speech stimuli, which further underscores the role of right STS in processing crossmodal emotive signals.
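
    The supra-additivity criterion used in the two records above (AV greater than the sum of the unimodal auditory and visual responses) reduces to a simple per-trial comparison, sketched below on invented response amplitudes; the numbers are not from the study, and the one-sample t-test is only one of several ways such an index could be evaluated.

    ```python
    # Minimal sketch (invented numbers): the supra-additivity test AV > A + V.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n_trials = 60
    resp_a = rng.normal(1.0, 0.3, n_trials)    # auditory-only response amplitude
    resp_v = rng.normal(0.8, 0.3, n_trials)    # visual-only response amplitude
    resp_av = rng.normal(2.2, 0.3, n_trials)   # audio-visual response amplitude

    # Supra-additivity index per trial: AV minus the sum of the unimodal responses.
    index = resp_av - (resp_a + resp_v)
    t_stat, p_val = stats.ttest_1samp(index, 0.0)
    print(f"mean index = {index.mean():.2f} (> 0 means AV > A + V), t = {t_stat:.2f}, p = {p_val:.4f}")
    ```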

  9. Brief Report: Representational Momentum for Dynamic Facial Expressions in Pervasive Developmental Disorder

    ERIC Educational Resources Information Center

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2010-01-01

    Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of…

  10. The Perception and Mimicry of Facial Movements Predict Judgments of Smile Authenticity

    PubMed Central

    Korb, Sebastian; With, Stéphane; Niedenthal, Paula; Kaiser, Susanne; Grandjean, Didier

    2014-01-01

    The mechanisms through which people perceive different types of smiles and judge their authenticity remain unclear. Here, 19 different types of smiles were created based on the Facial Action Coding System (FACS), using highly controlled, dynamic avatar faces. Participants observed short videos of smiles while their facial mimicry was measured with electromyography (EMG) over four facial muscles. Smile authenticity was judged after each trial. Avatar attractiveness was judged once in response to each avatar’s neutral face. Results suggest that, in contrast to most earlier work using static pictures as stimuli, participants relied less on the Duchenne marker (the presence of crow’s feet wrinkles around the eyes) in their judgments of authenticity. Furthermore, mimicry of smiles occurred in the Zygomaticus Major, Orbicularis Oculi, and Corrugator muscles. Consistent with theories of embodied cognition, activity in these muscles predicted authenticity judgments, suggesting that facial mimicry influences the perception of smiles. However, no significant mediation effect of facial mimicry was found. Avatar attractiveness did not predict authenticity judgments or mimicry patterns. PMID:24918939

  11. Revisiting the Relationship between the Processing of Gaze Direction and the Processing of Facial Expression

    ERIC Educational Resources Information Center

    Ganel, Tzvi

    2011-01-01

    There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension.…

  12. Developmental Changes in the Perception of Adult Facial Age

    ERIC Educational Resources Information Center

    Gross, Thomas F.

    2007-01-01

    The author studied children's (aged 5-16 years) and young adults' (aged 18-22 years) perception and use of facial features to discriminate the age of mature adult faces. In Experiment 1, participants rated the age of unaltered and transformed (eyes, nose, eyes and nose, and whole face blurred) adult faces (aged 20-80 years). In Experiment 2,…

  13. Functional Alterations of Postcentral Gyrus Modulated by Angry Facial Expressions during Intraoral Tactile Stimuli in Patients with Burning Mouth Syndrome: A Functional Magnetic Resonance Imaging Study

    PubMed Central

    Yoshino, Atsuo; Okamoto, Yasumasa; Doi, Mitsuru; Okada, Go; Takamura, Masahiro; Ichikawa, Naho; Yamawaki, Shigeto

    2017-01-01

    Previous findings suggest that negative emotions could influence abnormal sensory perception in burning mouth syndrome (BMS). However, few studies have investigated the underlying neural mechanisms associated with BMS. We examined activation of brain regions in response to intraoral tactile stimuli when modulated by angry facial expressions. We performed functional magnetic resonance imaging on a group of 27 BMS patients and 21 age-matched healthy controls. Tactile stimuli were presented during different emotional contexts, which were induced via the continuous presentation of angry or neutral pictures of human faces. BMS patients exhibited higher tactile ratings and greater activation in the postcentral gyrus during the presentation of tactile stimuli involving angry faces relative to controls. Significant positive correlations between changes in brain activation elicited by angry facial images in the postcentral gyrus and changes in tactile rating scores by angry facial images were found for both groups. For BMS patients, there was a significant positive correlation between changes in tactile-related activation of the postcentral gyrus elicited by angry facial expressions and pain intensity in daily life. Findings suggest that neural responses in the postcentral gyrus are more strongly affected by angry facial expressions in BMS patients, which may reflect one possible mechanism underlying impaired somatosensory system function in this disorder. PMID:29163243

  14. Men's facial masculinity: when (body) size matters.

    PubMed

    Holzleitner, Iris J; Hunter, David W; Tiddeman, Bernard P; Seck, Alassane; Re, Daniel E; Perrett, David I

    2014-01-01

    Recent studies suggest that judgments of facial masculinity reflect more than sexually dimorphic shape. Here, we investigated whether the perception of masculinity is influenced by facial cues to body height and weight. We used the average differences in three-dimensional face shape of forty men and forty women to compute a morphological masculinity score, and derived analogous measures for facial correlates of height and weight based on the average face shape of short and tall, and light and heavy men. We found that facial cues to body height and weight had substantial and independent effects on the perception of masculinity. Our findings suggest that men are perceived as more masculine if they appear taller and heavier, independent of how much their face shape differs from women's. We describe a simple method to quantify how body traits are reflected in the face and to define the physical basis of psychological attributions.
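
    The idea of a morphological masculinity score described above, scoring an individual face along a male-female shape axis, can be sketched as a projection onto the difference between the average male and average female shape vectors. The landmark counts and coordinates below are invented; the authors' three-dimensional model-based computation is more elaborate.

    ```python
    # Minimal sketch (invented data): score a face's shape by projecting its
    # landmark vector onto the male-minus-female average shape axis.
    import numpy as np

    rng = np.random.default_rng(3)
    n_dims = 30 * 3                                     # hypothetical 30 landmarks in 3D
    male_faces = rng.normal(0.5, 1.0, (40, n_dims))
    female_faces = rng.normal(-0.5, 1.0, (40, n_dims))

    # Sexual-dimorphism axis: average male shape minus average female shape.
    axis = male_faces.mean(axis=0) - female_faces.mean(axis=0)
    axis /= np.linalg.norm(axis)
    grand_mean = np.vstack([male_faces, female_faces]).mean(axis=0)

    def masculinity_score(face_vec):
        """Signed projection of a face onto the male-female shape axis."""
        return float(np.dot(face_vec - grand_mean, axis))

    print(masculinity_score(male_faces[0]), masculinity_score(female_faces[0]))
    ```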

  15. Automatic emotion processing as a function of trait emotional awareness: an fMRI study

    PubMed Central

    Lichev, Vladimir; Sacher, Julia; Ihme, Klas; Rosenberg, Nicole; Quirin, Markus; Lepsien, Jöran; Pampel, André; Rufer, Michael; Grabe, Hans-Jörgen; Kugel, Harald; Kersting, Anette; Villringer, Arno; Lane, Richard D.

    2015-01-01

    It is unclear whether reflective awareness of emotions is related to extent and intensity of implicit affective reactions. This study is the first to investigate automatic brain reactivity to emotional stimuli as a function of trait emotional awareness. To assess emotional awareness the Levels of Emotional Awareness Scale (LEAS) was administered. During scanning, masked happy, angry, fearful and neutral facial expressions were presented to 46 healthy subjects, who had to rate the fit between artificial and emotional words. The rating procedure allowed assessment of shifts in implicit affectivity due to emotion faces. Trait emotional awareness was associated with increased activation in the primary somatosensory cortex, inferior parietal lobule, anterior cingulate gyrus, middle frontal and cerebellar areas, thalamus, putamen and amygdala in response to masked happy faces. LEAS correlated positively with shifts in implicit affect caused by masked happy faces. According to our findings, people with high emotional awareness show stronger affective reactivity and more activation in brain areas involved in emotion processing and simulation during the perception of masked happy facial expression than people with low emotional awareness. High emotional awareness appears to be characterized by an enhanced positive affective resonance to others at an automatic processing level. PMID:25140051

  16. Impaired perception of facial emotion in developmental prosopagnosia.

    PubMed

    Biotti, Federica; Cook, Richard

    2016-08-01

    Developmental prosopagnosia (DP) is a neurodevelopmental condition characterised by difficulties recognising faces. Despite severe difficulties recognising facial identity, expression recognition is typically thought to be intact in DP; case studies have described individuals who are able to correctly label photographic displays of facial emotion, and no group differences have been reported. This pattern of deficits suggests a locus of impairment relatively late in the face processing stream, after the divergence of expression and identity analysis pathways. To date, however, there has been little attempt to investigate emotion recognition systematically in a large sample of developmental prosopagnosics using sensitive tests. In the present study, we describe three complementary experiments that examine emotion recognition in a sample of 17 developmental prosopagnosics. In Experiment 1, we investigated observers' ability to make binary classifications of whole-face expression stimuli drawn from morph continua. In Experiment 2, observers judged facial emotion using only the eye-region (the rest of the face was occluded). Analyses of both experiments revealed diminished ability to classify facial expressions in our sample of developmental prosopagnosics, relative to typical observers. Imprecise expression categorisation was particularly evident in those individuals exhibiting apperceptive profiles, associated with problems encoding facial shape accurately. Having split the sample of prosopagnosics into apperceptive and non-apperceptive subgroups, only the apperceptive prosopagnosics were impaired relative to typical observers. In our third experiment, we examined observers' ability to classify the emotion present within segments of vocal affect. Despite difficulties judging facial emotion, the prosopagnosics exhibited excellent recognition of vocal affect. Contrary to the prevailing view, our results suggest that many prosopagnosics do experience difficulties classifying expressions, particularly those with apperceptive profiles. These individuals may have difficulties forming view-invariant structural descriptions at an early stage in the face processing stream, before identity and expression pathways diverge. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Emotional Intelligence and Mismatching Expressive and Verbal Messages: A Contribution to Detection of Deception

    PubMed Central

    Wojciechowski, Jerzy; Stolarski, Maciej; Matthews, Gerald

    2014-01-01

    Processing facial emotion, especially mismatches between facial and verbal messages, is believed to be important in the detection of deception. For example, emotional leakage may accompany lying. Individuals with superior emotion perception abilities may then be more adept in detecting deception by identifying mismatch between facial and verbal messages. Two personal factors that may predict such abilities are female gender and high emotional intelligence (EI). However, evidence on the role of gender and EI in detection of deception is mixed. A key issue is that the facial processing skills required to detect deception may not be the same as those required to identify facial emotion. To test this possibility, we developed a novel facial processing task, the FDT (Face Decoding Test) that requires detection of inconsistencies between facial and verbal cues to emotion. We hypothesized that gender and ability EI would be related to performance when cues were inconsistent. We also hypothesized that gender effects would be mediated by EI, because women tend to score as more emotionally intelligent on ability tests. Data were collected from 210 participants. Analyses of the FDT suggested that EI was correlated with superior face decoding in all conditions. We also confirmed the expected gender difference, the superiority of high EI individuals, and the mediation hypothesis. Also, EI was more strongly associated with facial decoding performance in women than in men, implying there may be gender differences in strategies for processing affective cues. It is concluded that integration of emotional and cognitive cues may be a core attribute of EI that contributes to the detection of deception. PMID:24658500

  18. Societal Value of Surgery for Facial Reanimation.

    PubMed

    Su, Peiyi; Ishii, Lisa E; Joseph, Andrew; Nellis, Jason; Dey, Jacob; Bater, Kristin; Byrne, Patrick J; Boahene, Kofi D O; Ishii, Masaru

    2017-03-01

    Patients with facial paralysis are perceived negatively by society in a number of domains. Society's perception of the health utility of varying degrees of facial paralysis and the value society places on reconstructive surgery for facial reanimation need to be quantified. To measure health state utility of varying degrees of facial paralysis, willingness to pay (WTP) for a repair, and the subsequent value of facial reanimation surgery as perceived by society. This prospective observational study conducted in an academic tertiary referral center evaluated a group of 348 casual observers who viewed images of faces with unilateral facial paralysis of 3 severity levels (low, medium, and high) categorized by House-Brackmann grade. Structural equation modeling was performed to understand associations among health utility metrics, WTP, and facial perception domains. Data were collected from July 16 to September 26, 2015. Observer-rated (1) quality of life (QOL) using established health utility metrics (standard gamble, time trade-off, and a visual analog scale) and (2) their WTP for surgical repair. Among the 348 observers (248 women [71.3%]; 100 men [28.7%]; mean [SD] age, 29.3 [11.6] years), mixed-effects linear regression showed that WTP increased nonlinearly with increasing severity of paralysis. Participants were willing to pay $3487 (95% CI, $2362-$4961) to repair low-grade paralysis, $8571 (95% CI, $6401-$11 234) for medium-grade paralysis, and $20 431 (95% CI, $16 273-$25 317) for high-grade paralysis. The dominant factor affecting the participants' WTP was perceived QOL. Modeling showed that perceived QOL decreased with paralysis severity (regression coefficient, -0.004; 95% CI, -0.005 to -0.004; P < .001) and increased with attractiveness (regression coefficient, 0.002; 95% CI, 0.002 to 0.003; P < .001). Mean (SD) health utility scores calculated by the standard gamble metric for low- and high-grade paralysis were 0.98 (0.09) and 0.77 (0.25), respectively. Time trade-off and visual analog scale measures were highly correlated. We calculated mean (SD) WTP per quality-adjusted life-year, which ranged from $10 167 ($14 565) to $17 008 ($38 288) for low- to high-grade paralysis, respectively. Society perceives the repair of facial paralysis to be a high-value intervention. Societal WTP increases and perceived health state utility decreases with increasing House-Brackmann grade. This study demonstrates the usefulness of WTP as an objective measure to inform dimensions of disease severity and signal the value society places on proper facial function. NA.
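
    The WTP-per-QALY figures follow from dividing willingness to pay by the quality-adjusted life-years gained from repair. The abstract does not state the time horizon or discounting used, so the sketch below keeps these as explicit, assumed parameters; the numbers in the example call are illustrative and are not intended to reproduce the study's estimates.

```python
def wtp_per_qaly(wtp, utility_with_condition, utility_after_repair=1.0, years=1.0):
    """Willingness to pay divided by the QALYs gained from repair.

    QALY gain = (utility_after_repair - utility_with_condition) * years.
    The time horizon and any discounting used in the study are not given in
    the abstract, so this sketch leaves them as explicit parameters.
    """
    qaly_gain = (utility_after_repair - utility_with_condition) * years
    if qaly_gain <= 0:
        raise ValueError("Repair must yield a positive utility gain.")
    return wtp / qaly_gain

# Illustrative call using the abstract's high-grade figures and an assumed 10-year horizon:
print(round(wtp_per_qaly(20431, 0.77, years=10)))
```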

  19. Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception.

    PubMed

    Bänziger, Tanja; Mortillaro, Marcello; Scherer, Klaus R

    2012-10-01

    Research on the perception of emotional expressions in faces and voices is exploding in psychology, the neurosciences, and affective computing. This article provides an overview of some of the major emotion expression (EE) corpora currently available for empirical research and introduces a new, dynamic, multimodal corpus of emotion expressions, the Geneva Multimodal Emotion Portrayals Core Set (GEMEP-CS). The design features of the corpus are outlined and justified, and detailed validation data for the core set selection are presented and discussed. Finally, an associated database with microcoded facial, vocal, and body action elements, as well as observer ratings, is introduced.

  20. Not on the Face Alone: Perception of Contextualized Face Expressions in Huntington's Disease

    ERIC Educational Resources Information Center

    Aviezer, Hillel; Bentin, Shlomo; Hassin, Ran R.; Meschino, Wendy S.; Kennedy, Jeanne; Grewal, Sonya; Esmail, Sherali; Cohen, Sharon; Moscovitch, Morris

    2009-01-01

    Numerous studies have demonstrated that Huntington's disease mutation-carriers have deficient explicit recognition of isolated facial expressions. There are no studies, however, which have investigated the recognition of facial expressions embedded within an emotional body and scene context. Real life facial expressions are typically embedded in…

  1. Global-Local Precedence in the Perception of Facial Age and Emotional Expression by Children with Autism and Other Developmental Disabilities

    ERIC Educational Resources Information Center

    Gross, Thomas F.

    2005-01-01

    Global information processing and perception of facial age and emotional expression was studied in children with autism, language disorders, mental retardation, and a clinical control group. Children were given a global-local task and asked to recognize age and emotion in human and canine faces. Children with autism made fewer global responses and…

  2. Effects of spatial frequency and location of fearful faces on human amygdala activity.

    PubMed

    Morawetz, Carmen; Baudewig, Juergen; Treue, Stefan; Dechent, Peter

    2011-01-31

    Facial emotion perception plays a fundamental role in interpersonal social interactions. Images of faces contain visual information at various spatial frequencies. The amygdala has previously been reported to be preferentially responsive to low-spatial frequency (LSF) rather than to high-spatial frequency (HSF) filtered images of faces presented at the center of the visual field. Furthermore, it has been proposed that the amygdala might be especially sensitive to affective stimuli in the periphery. In the present study we investigated the impact of spatial frequency and stimulus eccentricity on face processing in the human amygdala and fusiform gyrus using functional magnetic resonance imaging (fMRI). The spatial frequencies of pictures of fearful faces were filtered to produce images that retained only LSF or HSF information. Facial images were presented either in the left or right visual field at two different eccentricities. In contrast to previous findings, we found that the amygdala responds to LSF and HSF stimuli in a similar manner regardless of the location of the affective stimuli in the visual field. Furthermore, the fusiform gyrus did not show differential responses to spatial frequency filtered images of faces. Our findings argue against the view that LSF information plays a crucial role in the processing of facial expressions in the amygdala, and against the proposal of a higher sensitivity to affective stimuli in the periphery. Copyright © 2010 Elsevier B.V. All rights reserved.

  3. Auto white balance method using a pigmentation separation technique for human skin color

    NASA Astrophysics Data System (ADS)

    Tanaka, Satomi; Kakinuma, Akihiro; Kamijo, Naohiro; Takahashi, Hiroshi; Tsumura, Norimichi

    2017-02-01

    The human visual system maintains the perception of colors of an object across various light sources. Similarly, current digital cameras feature an auto white balance function, which estimates the illuminant color and corrects the color of a photograph as if the photograph had been taken under a certain light source. The main subject in a photograph is often a person's face, which could be used to estimate the illuminant color. However, such estimation is adversely affected by differences in facial colors among individuals. The present paper proposes an auto white balance algorithm based on a pigmentation separation method that separates the human skin color image into the components of melanin, hemoglobin and shading. Pigment densities, which can be calculated from the melanin and hemoglobin components of the face, are approximately uniform within the same race. We thus propose a method that uses the subject's facial color in an image and is unaffected by individual differences in facial color among Japanese people.
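
    A rough sketch of the pigmentation-separation idea, assuming skin colour mixes approximately linearly in optical-density (-log RGB) space. The melanin and hemoglobin vectors below are placeholders, not the values used by the authors, and the decomposition is a generic least-squares version of the technique rather than their implementation.

```python
import numpy as np

# Hypothetical pigment density vectors in -log RGB (optical density) space.
# Real values would be estimated from measured skin data; these are placeholders.
MELANIN    = np.array([0.74, 0.55, 0.39])
HEMOGLOBIN = np.array([0.45, 0.81, 0.38])
SHADING    = np.ones(3) / np.sqrt(3.0)   # uniform attenuation direction

def separate_pigments(rgb):
    """Decompose skin pixels into melanin, hemoglobin and shading densities.

    rgb: (n_pixels, 3) array of linear RGB values in (0, 1].
    Returns an (n_pixels, 3) array of least-squares coefficients
    [melanin, hemoglobin, shading] per pixel, treating skin colour as a
    linear mixture in optical-density space.
    """
    density = -np.log(np.clip(rgb, 1e-6, 1.0))                 # optical density
    basis = np.stack([MELANIN, HEMOGLOBIN, SHADING], axis=1)   # (3, 3), columns = vectors
    coeffs, *_ = np.linalg.lstsq(basis, density.T, rcond=None)
    return coeffs.T

# Illuminant estimation could then compare the skin colour predicted from the
# (population-stable) pigment densities with the observed colour and attribute
# the residual colour cast to the light source.
pixels = np.random.default_rng(1).uniform(0.2, 0.9, size=(100, 3))
print(separate_pigments(pixels)[:2])
```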

  4. Judging emotional congruency: Explicit attention to situational context modulates processing of facial expressions of emotion.

    PubMed

    Diéguez-Risco, Teresa; Aguado, Luis; Albert, Jacobo; Hinojosa, José Antonio

    2015-12-01

    The influence of explicit evaluative processes on the contextual integration of facial expressions of emotion was studied in a procedure that required the participants to judge the congruency of happy and angry faces with preceding sentences describing emotion-inducing situations. Judgments were faster on congruent trials in the case of happy faces and on incongruent trials in the case of angry faces. At the electrophysiological level, a congruency effect was observed in the face-sensitive N170 component that showed larger amplitudes on incongruent trials. An interactive effect of congruency and emotion appeared on the LPP (late positive potential), with larger amplitudes in response to happy faces that followed anger-inducing situations. These results show that the deliberate intention to judge the contextual congruency of facial expressions influences not only processes involved in affective evaluation such as those indexed by the LPP but also earlier processing stages that are involved in face perception. Copyright © 2015. Published by Elsevier B.V.

  5. The Impact of Facial Aesthetic and Reconstructive Surgeries on Patients' Quality of Life.

    PubMed

    Yıldız, Tülin; Selimen, Deniz

    2015-12-01

    The aim of the present prospective and descriptive study was to assess the impact of facial aesthetic and reconstructive surgeries on quality of life. Ninety-one patients, of whom 43 had aesthetic surgery and 48 had reconstructive surgery, were analysed. The data were collected using the patient information form, body cathexis scale, and short form (SF)-36 quality of life scale. There were significant differences between before and after the surgery in both groups in terms of body cathexis scale and quality of life (p < 0.05 for both). It was observed that problems regarding the body image perception were encountered more, and the quality of life was poorer in both aesthetic and reconstructive surgery patients before the surgery. However, the problems were decreased, and the quality of life was enhanced after the surgery. Among the parameters of SF-36 quality of life scale, particularly the mean scores of social functioning, physical role functioning, emotional role functioning, mental health, and vitality/fatigue were found low before the surgery, whereas the mean scores were significantly improved after the surgery. The results revealed that facial aesthetic and reconstructive surgical interventions favourably affected the body image perception and self-esteem and that positive reflections in emotional, social, and mental aspects were effective in enhancing self-confidence and quality of life of the individual.

  6. The correlates of subjective perception of identity and expression in the face network: an fMRI adaptation study

    PubMed Central

    Fox, Christopher J.; Moon, So Young; Iaria, Giuseppe; Barton, Jason J.S.

    2009-01-01

    The recognition of facial identity and expression are distinct tasks, with current models hypothesizing anatomic segregation of processing within a face-processing network. Using fMRI adaptation and a region-of-interest approach, we assessed how the perception of identity and expression changes in morphed stimuli affected the signal within this network, by contrasting (a) changes that crossed categorical boundaries of identity or expression with those that did not, and (b) changes that subjects perceived as causing identity or expression to change, versus changes that they perceived as not affecting the category of identity or expression. The occipital face area (OFA) was sensitive to any structural change in a face, whether it was identity or expression, but its signal did not correlate with whether subjects perceived a change or not. Both the fusiform face area (FFA) and the posterior superior temporal sulcus (pSTS) showed release from adaptation when subjects perceived a change in either identity or expression, although in the pSTS this effect only occurred when subjects were explicitly attending to expression. The middle superior temporal sulcus (mSTS) showed release from adaptation for expression only, and the precuneus for identity only. The data support models where the OFA is involved in the early perception of facial structure. However, evidence for a functional overlap in the FFA and pSTS, with both identity and expression signals in both areas, argues against a complete independence of identity and expression processing in these regions of the core face-processing network. PMID:18852053

  7. The correlates of subjective perception of identity and expression in the face network: an fMRI adaptation study.

    PubMed

    Fox, Christopher J; Moon, So Young; Iaria, Giuseppe; Barton, Jason J S

    2009-01-15

    The recognition of facial identity and expression are distinct tasks, with current models hypothesizing anatomic segregation of processing within a face-processing network. Using fMRI adaptation and a region-of-interest approach, we assessed how the perception of identity and expression changes in morphed stimuli affected the signal within this network, by contrasting (a) changes that crossed categorical boundaries of identity or expression with those that did not, and (b) changes that subjects perceived as causing identity or expression to change, versus changes that they perceived as not affecting the category of identity or expression. The occipital face area (OFA) was sensitive to any structural change in a face, whether it was identity or expression, but its signal did not correlate with whether subjects perceived a change or not. Both the fusiform face area (FFA) and the posterior superior temporal sulcus (pSTS) showed release from adaptation when subjects perceived a change in either identity or expression, although in the pSTS this effect only occurred when subjects were explicitly attending to expression. The middle superior temporal sulcus (mSTS) showed release from adaptation for expression only, and the precuneus for identity only. The data support models where the OFA is involved in the early perception of facial structure. However, evidence for a functional overlap in the FFA and pSTS, with both identity and expression signals in both areas, argues against a complete independence of identity and expression processing in these regions of the core face-processing network.

  8. Visible skin colouration predicts perception of male facial age, health and attractiveness.

    PubMed

    Fink, B; Bunse, L; Matts, P J; D'Emiliano, D

    2012-08-01

    Although there is evidence that perception of facial age, health and attractiveness is informed by shape characteristics as well as by visible skin condition, studies on the latter have focused almost exclusively on female skin. Recent research, however, suggests that a decrease in skin colour homogeneity leads to older, less healthy and less attractive ratings of facial skin in both women and men. Here, we elaborate on the significance of the homogeneity of visible skin colouration in men by testing the hypothesis that perception of age, health and attractiveness of (non-contextual) digitally isolated fields of cheek skin only can predict that of whole facial images. Facial digital images of 160 British men (all Caucasian) aged between 10 and 70 were blind-rated for age, health and attractiveness by a total of 147 men and 154 women (mean age = 22.95, SD = 4.26), and these ratings were related to those of corresponding images of cheek skin reported by Fink et al. (J. Eur. Acad. Dermatol. Venereol. in press). Linear regression analysis showed that age, health and attractiveness perception of men's faces could be predicted by the ratings of cheek skin only, such that older men were viewed as older, less healthy and less attractive. This result underlines once again the potent signalling role of skin in its own right, independent of shape or other factors and suggests strongly that visible skin condition, and skin colour homogeneity in particular, plays a significant role in the perception of men's faces. © 2012 The Authors. ICS © 2012 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  9. Relation between facial affect recognition and configural face processing in antipsychotic-free schizophrenia.

    PubMed

    Fakra, Eric; Jouve, Elisabeth; Guillaume, Fabrice; Azorin, Jean-Michel; Blin, Olivier

    2015-03-01

    Deficit in facial affect recognition is a well-documented impairment in schizophrenia, closely connected to social outcome. This deficit could be related to psychopathology, but also to a broader dysfunction in processing facial information. In addition, patients with schizophrenia inadequately use configural information, a type of processing that relies on spatial relationships between facial features. To date, no study has specifically examined the link between symptoms and misuse of configural information in the deficit in facial affect recognition. Unmedicated schizophrenia patients (n = 30) and matched healthy controls (n = 30) performed a facial affect recognition task and a face inversion task, which tests aptitude to rely on configural information. In patients, regressions were carried out between facial affect recognition, symptom dimensions and inversion effect. Patients, compared with controls, showed a deficit in facial affect recognition and a lower inversion effect. Negative symptoms and lower inversion effect could account for 41.2% of the variance in facial affect recognition. This study confirms the presence of a deficit in facial affect recognition, as well as dysfunctional use of configural information, in antipsychotic-free patients. Negative symptoms and poor processing of configural information explained a substantial part of the deficient recognition of facial affect. We speculate that this deficit may be caused by several factors, among which psychopathology and failure to correctly use configural information stand independently. (PsycINFO Database Record (c) 2015 APA, all rights reserved.)

  10. Facial affective reactions to bitter-tasting foods and body mass index in adults.

    PubMed

    Garcia-Burgos, D; Zamora, M C

    2013-12-01

    Differences in food consumption among body-weight statuses (e.g., higher fruit intake linked with lower body mass index (BMI) and energy-dense products with higher BMI) have raised the question of why people who are overweight or are at risk of becoming overweight eat differently from thinner people. One explanation, in terms of sensitivity to affective properties of food, suggests that palatability-driven consumption is likely to be an important contributor to food intake, and therefore body weight. Extending this approach to unpalatable tastes, we examined the relationship between aversive reactions to foods and BMI. We hypothesized that people who have a high BMI will show more negative affective reactions to bitter-tasting stimuli, even after controlling for sensory perception differences. Given that hedonic reactions may influence consumption even without conscious feelings of pleasure/displeasure, facial expressions were included in order to provide more direct access to affective systems than subjective reports. Forty adults (28 females, 12 males) participated voluntarily. Their ages ranged from 18 to 46 years (M=24.2, SD=5.8). On the basis of BMI, participants were classified as low BMI (BMI<20; n=20) and high BMI (BMI>23; n=20). The mean BMI was 19.1 for low BMI (SD=0.7) and 25.2 for high BMI participants (SD=1.8). Each subject tasted 5 mL of a grapefruit juice drink and a bitter chocolate drink. Subjects rated the drinks' hedonic and incentive value, familiarity and bitter intensity immediately after each stimulus presentation. The results indicated that high BMI participants reacted to bitter stimuli with more profound changes from baseline in neutral and disgust facial expressions than low BMI participants. No differences between groups were detected for subjective pleasantness and familiarity. This research is the first to examine how affective facial reactions to bitter food, apart from taste responsiveness, can predict differences in BMI. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Eigen-disfigurement model for simulating plausible facial disfigurement after reconstructive surgery.

    PubMed

    Lee, Juhun; Fingeret, Michelle C; Bovik, Alan C; Reece, Gregory P; Skoracki, Roman J; Hanasono, Matthew M; Markey, Mia K

    2015-03-27

    Patients with facial cancers can experience disfigurement as they may undergo considerable appearance changes from their illness and its treatment. Individuals with difficulties adjusting to facial cancer are concerned about how others perceive and evaluate their appearance. Therefore, it is important to understand how humans perceive disfigured faces. We describe a new strategy that allows simulation of surgically plausible facial disfigurement on a novel face for elucidating human perception of facial disfigurement. Longitudinal 3D facial images of patients (N = 17) with facial disfigurement due to cancer treatment were replicated using a facial mannequin model, by applying Thin-Plate Spline (TPS) warping and linear interpolation on the facial mannequin model in polar coordinates. Principal Component Analysis (PCA) was used to capture longitudinal structural and textural variations found within each patient with facial disfigurement arising from the treatment. We treated such variations as disfigurement. Each disfigurement was smoothly stitched on a healthy face by seeking a Poisson solution to guided interpolation using the gradient of the learned disfigurement as the guidance field vector. The modeling technique was quantitatively evaluated. In addition, panel ratings of experienced medical professionals on the plausibility of the simulation were used to evaluate the proposed disfigurement model. The algorithm reproduced the given face effectively using a facial mannequin model, with less than 4.4 mm maximum error for the validation fiducial points that were not used for the processing. Panel ratings of experienced medical professionals on the plausibility of the simulation showed that the disfigurement model (especially for peripheral disfigurement) yielded predictions comparable to the real disfigurements. The modeling technique of this study is able to capture facial disfigurements, and its simulation represents plausible outcomes of reconstructive surgery for facial cancers. Thus, our technique can be used to study human perception of facial disfigurement.
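
    The stitching step described here can be approximated with off-the-shelf gradient-domain cloning. The sketch below uses OpenCV's seamlessClone as a stand-in for the Poisson guided-interpolation solver the abstract describes; it is not the authors' implementation, and the synthetic images are purely illustrative.

```python
import cv2
import numpy as np

def stitch_disfigurement(healthy_face, disfigured_patch, mask, center):
    """Blend a learned disfigurement patch onto a healthy face.

    The paper stitches the disfigurement by solving a Poisson (guided
    interpolation) problem whose guidance field is the gradient of the
    learned disfigurement. cv2.seamlessClone solves the same class of
    gradient-domain problem, so it is used here only as a stand-in.

    healthy_face, disfigured_patch: uint8 BGR images of the same size.
    mask: uint8 single-channel mask, non-zero where the patch applies.
    center: (x, y) position of the patch centre in the healthy face.
    """
    return cv2.seamlessClone(disfigured_patch, healthy_face, mask,
                             center, cv2.NORMAL_CLONE)

# Hypothetical usage with synthetic images:
face = np.full((256, 256, 3), 180, np.uint8)
patch = np.full((256, 256, 3), 120, np.uint8)
mask = np.zeros((256, 256), np.uint8)
cv2.circle(mask, (128, 128), 40, 255, -1)   # region to transplant
blended = stitch_disfigurement(face, patch, mask, (128, 128))
```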

  12. [Neural mechanisms of facial recognition].

    PubMed

    Nagai, Chiyoko

    2007-01-01

    We review recent research on the neural mechanisms of facial recognition in light of three aspects: facial discrimination and identification, recognition of facial expressions, and face perception itself. First, it has been demonstrated that the fusiform gyrus plays a main role in facial discrimination and identification. However, whether the FFA (fusiform face area) is truly specialized for facial processing remains controversial; some researchers argue that the FFA is instead related to 'becoming an expert' for certain kinds of visual objects, including faces. The neural mechanisms of prosopagnosia are closely related to this issue. Second, the amygdala appears to be deeply involved in the recognition of facial expressions, especially fear. The amygdala, connected with the superior temporal sulcus and the orbitofrontal cortex, appears to modulate these cortical functions. The amygdala and the superior temporal sulcus are also related to gaze recognition, which explains why a patient with bilateral amygdala damage failed to recognize only the fear expression; information from the eyes is necessary for fear recognition. Finally, even a newborn infant can recognize a face as a face, which is congruent with the innate hypothesis of facial recognition. Some researchers speculate that the neural basis of such face perception is a subcortical network comprising the amygdala, the superior colliculus, and the pulvinar. This network may also underlie the covert recognition observed in prosopagnosic patients.

  13. [Emotion Recognition in Patients with Peripheral Facial Paralysis - A Pilot Study].

    PubMed

    Konnerth, V; Mohr, G; von Piekartz, H

    2016-02-01

    The perception of emotions is an important component of everyday social interaction, and the ability to recognize emotions from another person's facial expression is a key prerequisite for it. The present study aimed to evaluate the ability of subjects with peripheral facial paresis to perceive emotions in healthy individuals. A pilot study was conducted with 13 participants with peripheral facial paresis. The assessment included the Facially Expressed Emotion Labeling Test (FEEL-Test), the Facial Laterality Recognition Test (FLR-Test) and the Toronto Alexithymia Scale 26 (TAS-26). The results were compared with data from healthy people reported in other studies. Compared with healthy individuals, the subjects with facial paresis showed more difficulty recognizing basic emotions; however, the differences were not significant. The participants were significantly slower (right/left: p<0.001) in judging facial laterality than healthy people. With regard to alexithymia, the tested group scored significantly higher (p<0.001) than unimpaired individuals. The present pilot study thus does not demonstrate an impairment of this specific patient group's ability to recognize emotions and facial laterality. For future studies, the research question should be examined in a larger sample. © Georg Thieme Verlag KG Stuttgart · New York.

  14. The implicit processing of categorical and dimensional strategies: an fMRI study of facial emotion perception

    PubMed Central

    Matsuda, Yoshi-Taka; Fujimura, Tomomi; Katahira, Kentaro; Okada, Masato; Ueno, Kenichi; Cheng, Kang; Okanoya, Kazuo

    2013-01-01

    Our understanding of facial emotion perception has been dominated by two seemingly opposing theories: the categorical and dimensional theories. However, we have recently demonstrated that hybrid processing involving both categorical and dimensional perception can be induced in an implicit manner (Fujimura et al., 2012). The underlying neural mechanisms of this hybrid processing remain unknown. In this study, we tested the hypothesis that separate neural loci might intrinsically encode categorical and dimensional processing functions that serve as a basis for hybrid processing. We used functional magnetic resonance imaging to measure neural correlates while subjects passively viewed emotional faces and performed tasks that were unrelated to facial emotion processing. Activity in the right fusiform face area (FFA) increased in response to psychologically obvious emotions and decreased in response to ambiguous expressions, demonstrating the role of the FFA in categorical processing. The amygdala, insula and medial prefrontal cortex exhibited evidence of dimensional (linear) processing that correlated with physical changes in the emotional face stimuli. The occipital face area and superior temporal sulcus did not respond to these changes in the presented stimuli. Our results indicated that distinct neural loci process the physical and psychological aspects of facial emotion perception in a region-specific and implicit manner. PMID:24133426

  15. Expression-dependent susceptibility to face distortions in processing of facial expressions of emotion.

    PubMed

    Guo, Kun; Soornack, Yoshi; Settle, Rebecca

    2018-03-05

    Our capability of recognizing facial expressions of emotion under different viewing conditions implies the existence of an invariant expression representation. As natural visual signals are often distorted and our perceptual strategy changes with external noise level, it is essential to understand how expression perception is susceptible to face distortion and whether the same facial cues are used to process high- and low-quality face images. We systematically manipulated face image resolution (experiment 1) and blur (experiment 2), and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. Our analysis revealed a reasonable tolerance to face distortion in expression perception. Reducing image resolution up to 48 × 64 pixels or increasing image blur up to 15 cycles/image had little impact on expression assessment and associated gaze behaviour. Further distortion led to decreased expression categorization accuracy and intensity rating, increased reaction time and fixation duration, and stronger central fixation bias which was not driven by distortion-induced changes in local image saliency. Interestingly, the observed distortion effects were expression-dependent with less deterioration impact on happy and surprise expressions, suggesting this distortion-invariant facial expression perception might be achieved through the categorical model involving a non-linear configural combination of local facial features. Copyright © 2018 Elsevier Ltd. All rights reserved.
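
    A small sketch of how such degraded stimuli could be generated, assuming Pillow is available. The resampling choices and the blur-radius parameter are assumptions; the abstract specifies blur in cycles/image rather than pixels, so any mapping between the two is left to the caller.

```python
from PIL import Image, ImageFilter

def degrade_face(img, target_size=(48, 64), blur_radius=None):
    """Create degraded versions of a face image, loosely following the study's manipulations.

    target_size: (width, height) to downsample to and back up, discarding high
    spatial frequencies (the abstract reports resolutions down to 48 x 64 pixels).
    blur_radius: Gaussian blur radius in pixels (an assumed stand-in for the
    cycles/image specification in the abstract).
    """
    grey = img.convert("L")
    low_res = grey.resize(target_size).resize(grey.size)   # remove fine detail
    blurred = grey.filter(ImageFilter.GaussianBlur(blur_radius)) if blur_radius else None
    return low_res, blurred

# Example with a synthetic image; real stimuli would be loaded with Image.open(...).
face = Image.new("L", (384, 512), color=128)
low_res_version, blurred_version = degrade_face(face, blur_radius=4)
```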

  16. How does context affect assessments of facial emotion? The role of culture and age

    PubMed Central

    Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara

    2010-01-01

    People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. PMID:21038967

  17. Women's Facial Redness Increases Their Perceived Attractiveness: Mediation Through Perceived Healthiness.

    PubMed

    Pazda, Adam D; Thorstenson, Christopher A; Elliot, Andrew J; Perrett, David I

    2016-07-01

    In the present research, we investigated whether the red-attraction relation that has been observed for men viewing women may also be observed with regard to women's facial redness. We manipulated facial redness by slightly increasing or decreasing the redness on the faces of baseline pictures of target women, and then had men judge the attractiveness of the women. We also examined healthiness perceptions as a mediator of the redness-attraction relation, along with several other candidate mediator variables. A series of experiments showed that increased redness led to increased ratings of attractiveness, and decreased redness led to decreased ratings of attractiveness. Perceived healthiness was documented as a mediator of the influence of female facial redness on male perceptions of attractiveness, and this mediation was independent of other candidate mediator variables. The findings highlight the importance of attending to facial coloration as an attraction-relevant cue and point to interesting areas for subsequent research. © The Author(s) 2016.
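
    A generic sketch of the kind of mediation test reported here (an indirect effect of redness on attractiveness through perceived healthiness), using a product-of-coefficients estimate with a percentile bootstrap. The data are simulated and this is not the authors' analysis pipeline.

```python
import numpy as np

def bootstrap_indirect_effect(x, m, y, n_boot=5000, seed=0):
    """Estimate the indirect effect of x on y through mediator m.

    Product-of-coefficients approach: path a (x -> m) times path b
    (m -> y, controlling for x), with a percentile bootstrap CI.
    """
    rng = np.random.default_rng(seed)
    x, m, y = map(np.asarray, (x, m, y))
    n = len(x)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]                          # x -> m slope
        b = np.linalg.lstsq(np.column_stack([mb, xb, np.ones(n)]),
                            yb, rcond=None)[0][0]             # m -> y slope, controlling for x
        estimates.append(a * b)
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    return float(np.mean(estimates)), (float(lo), float(hi))

# Simulated example: redness raises perceived healthiness, which raises attractiveness.
rng = np.random.default_rng(1)
redness = rng.normal(size=200)
health = 0.6 * redness + rng.normal(scale=0.8, size=200)
attract = 0.5 * health + rng.normal(scale=0.8, size=200)
print(bootstrap_indirect_effect(redness, health, attract))
```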

  18. The relative contributions of facial shape and surface information to perceptions of attractiveness and dominance.

    PubMed

    Torrance, Jaimie S; Wincenciak, Joanna; Hahn, Amanda C; DeBruine, Lisa M; Jones, Benedict C

    2014-01-01

    Although many studies have investigated the facial characteristics that influence perceptions of others' attractiveness and dominance, the majority of these studies have focused on either the effects of shape information or surface information alone. Consequently, the relative contributions of facial shape and surface characteristics to attractiveness and dominance perceptions are unclear. To address this issue, we investigated the relationships between ratings of original versions of faces and ratings of versions in which either surface information had been standardized (i.e., shape-only versions) or shape information had been standardized (i.e., surface-only versions). For attractiveness and dominance judgments of both male and female faces, ratings of shape-only and surface-only versions independently predicted ratings of the original versions of faces. The correlations between ratings of original and shape-only versions and between ratings of original and surface-only versions differed only in two instances. For male attractiveness, ratings of original versions were more strongly related to ratings of surface-only than shape-only versions, suggesting that surface information is particularly important for men's facial attractiveness. The opposite was true for female physical dominance, suggesting that shape information is particularly important for women's facial physical dominance. In summary, our results indicate that both facial shape and surface information contribute to judgments of others' attractiveness and dominance, suggesting that it may be important to consider both sources of information in research on these topics.

  19. Perceived emotion genuineness: normative ratings for popular facial expression stimuli and the development of perceived-as-genuine and perceived-as-fake sets.

    PubMed

    Dawel, Amy; Wright, Luke; Irons, Jessica; Dumbleton, Rachael; Palermo, Romina; O'Kearney, Richard; McKone, Elinor

    2017-08-01

    In everyday social interactions, people's facial expressions sometimes reflect genuine emotion (e.g., anger in response to a misbehaving child) and sometimes do not (e.g., smiling for a school photo). There is increasing theoretical interest in this distinction, but little is known about perceived emotion genuineness for existing facial expression databases. We present a new method for rating perceived genuineness using a neutral-midpoint scale (-7 = completely fake; 0 = don't know; +7 = completely genuine) that, unlike previous methods, provides data on both relative and absolute perceptions. Normative ratings from typically developing adults for five emotions (anger, disgust, fear, sadness, and happiness) provide three key contributions. First, the widely used Pictures of Facial Affect (PoFA; i.e., "the Ekman faces") and the Radboud Faces Database (RaFD) are typically perceived as not showing genuine emotion. Also, in the only published set for which the actual emotional states of the displayers are known (via self-report; the McLellan faces), percepts of emotion genuineness often do not match actual emotion genuineness. Second, we provide genuine/fake norms for 558 faces from several sources (PoFA, RaFD, KDEF, Gur, FacePlace, McLellan, News media), including a list of 143 stimuli that are event-elicited (rather than posed) and, congruently, perceived as reflecting genuine emotion. Third, using the norms we develop sets of perceived-as-genuine (from event-elicited sources) and perceived-as-fake (from posed sources) stimuli, matched on sex, viewpoint, eye-gaze direction, and rated intensity. We also outline the many types of research questions that these norms and stimulus sets could be used to answer.

  20. Multisensory emotion perception in congenitally, early, and late deaf CI users

    PubMed Central

    Nava, Elena; Villwock, Agnes K.; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte

    2017-01-01

    Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences. PMID:29023525

  1. Multisensory emotion perception in congenitally, early, and late deaf CI users.

    PubMed

    Fengler, Ineke; Nava, Elena; Villwock, Agnes K; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte

    2017-01-01

    Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences.

  2. Neurobiological mechanisms associated with facial affect recognition deficits after traumatic brain injury.

    PubMed

    Neumann, Dawn; McDonald, Brenna C; West, John; Keiski, Michelle A; Wang, Yang

    2016-06-01

    The neurobiological mechanisms that underlie facial affect recognition deficits after traumatic brain injury (TBI) have not yet been identified. Using functional magnetic resonance imaging (fMRI), study aims were to 1) determine if there are differences in brain activation during facial affect processing in people with TBI who have facial affect recognition impairments (TBI-I) relative to people with TBI and healthy controls who do not have facial affect recognition impairments (TBI-N and HC, respectively); and 2) identify relationships between neural activity and facial affect recognition performance. A facial affect recognition screening task performed outside the scanner was used to determine group classification; TBI patients who scored more than one standard deviation below normative performance were classified as TBI-I, while TBI patients with normal scores were classified as TBI-N. An fMRI facial recognition paradigm was then performed within the 3T environment. Results from 35 participants are reported (TBI-I = 11, TBI-N = 12, and HC = 12). For the fMRI task, TBI-I and TBI-N groups scored significantly lower than the HC group. Blood oxygenation level-dependent (BOLD) signals for facial affect recognition, compared with a baseline condition of viewing a scrambled face, revealed lower neural activation in the right fusiform gyrus (FG) in the TBI-I group than in the HC group. Right fusiform gyrus activity correlated with accuracy on the facial affect recognition tasks (both within and outside the scanner). Decreased FG activity suggests facial affect recognition deficits after TBI may be the result of impaired holistic face processing. Future directions and clinical implications are discussed.

  3. The Noh mask effect: vertical viewpoint dependence of facial expression perception.

    PubMed Central

    Lyons, M J; Campbell, R; Plante, A; Coleman, M; Kamachi, M; Akamatsu, S

    2000-01-01

    Full-face masks, worn by skilled actors in the Noh tradition, can induce a variety of perceived expressions with changes in head orientation. Out-of-plane rotation of the head changes the two-dimensional image characteristics of the face which viewers may misinterpret as non-rigid changes due to muscle action. Three experiments with Japanese and British viewers explored this effect. Experiment 1 confirmed a systematic relationship between vertical angle of view of a Noh mask and judged affect. A forward tilted mask was more often judged happy, and one backward tilted more often judged sad. This effect was moderated by culture. Japanese viewers ascribed happiness to the mask at greater degrees of backward tilt with a reversal towards sadness at extreme forward angles. Cropping the facial image of chin and upper head contour reduced the forward-tilt reversal. Finally, the relationship between head tilt and affect was replicated with a laser-scanned human face image, but with no cultural effect. Vertical orientation of the head changes the apparent disposition of facial features and viewers respond systematically to these changes. Culture moderates this effect, and we discuss how perceptual strategies for ascribing expression to familiar and unfamiliar images may account for the differences. PMID:11413638

  4. Facial Identity and Self-Perception: An Examination of Psychosocial Outcomes in Cosmetic Surgery Patients.

    PubMed

    Slavin, Benjamin; Beer, Jacob

    2017-06-01

    The psychosocial health of patients undergoing cosmetic procedures has often been linked to a host of pre-existing conditions, including the type of procedure being performed. Age, gender, and the psychological state of the patients also contribute to the perceived outcome. Specifically, the presence or absence of Body Dysmorphic Disorder (BDD) has been identified as an independent marker for unhappiness following cosmetic procedures.1 However, no study has, to our knowledge, identified a more precise indicator that is associated with higher rates of patient dissatisfaction from cosmetic procedures. This review identifies facial identity and self-perception as potential identifiers of future patient dissatisfaction with cosmetic procedures. Specifically, we believe that patients with a realistic facial identity and self-perception are more likely to be satisfied than those whose self-perceptions are distorted. Patients undergoing restorative procedures, including blepharoplasty, rhytidectomy, and liposuction, are more likely to have an increased outcome favorability rating than those undergoing type change procedures, such as rhinoplasty and breast augmentation. Age, which typically is an independent variable for satisfaction, tends to be associated with increased favorability ratings following cosmetic procedures. Female gender is a second variable associated with higher satisfaction. The authors believe that negative facial identity and self-perception are risk factors for patient dissatisfaction with cosmetic procedural outcomes. Based on this assumption, clinicians may want to focus on the face as a particular area of psychosocial concern.

    J Drugs Dermatol. 2017;16(6):617-620.

  5. Is fear in your head? A comparison of instructed and real-life expressions of emotion in the face and body.

    PubMed

    Abramson, Lior; Marom, Inbal; Petranker, Rotem; Aviezer, Hillel

    2017-04-01

    The majority of emotion perception studies utilize instructed and stereotypical expressions of faces or bodies. While such stimuli are highly standardized and well-recognized, their resemblance to real-life expressions of emotion remains unknown. Here we examined facial and body expressions of fear and anger during real-life situations and compared their recognition to that of instructed expressions of the same emotions. In order to examine the source of the affective signal, expressions of emotion were presented as faces alone, bodies alone, and naturally, as faces with bodies. The results demonstrated striking deviations between recognition of instructed and real-life stimuli, which differed as a function of the emotion expressed. In real-life fearful expressions of emotion, bodies were far better recognized than faces, a pattern not found with instructed expressions of emotion. Anger reactions were better recognized from the body than from the face in both real-life and instructed stimuli. However, the real-life stimuli were overall better recognized than their instructed counterparts. These results indicate that differences between instructed and real-life expressions of emotion are prevalent and raise caution against an overreliance of researchers on instructed affective stimuli. The findings also demonstrate that in real life, facial expression perception may rely heavily on information from the contextualizing body. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Test battery for measuring the perception and recognition of facial expressions of emotion

    PubMed Central

    Wilhelm, Oliver; Hildebrandt, Andrea; Manske, Karsten; Schacht, Annekathrin; Sommer, Werner

    2014-01-01

    Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations. PMID:24860528

  7. Somatosensory Representations Link the Perception of Emotional Expressions and Sensory Experience.

    PubMed

    Kragel, Philip A; LaBar, Kevin S

    2016-01-01

    Studies of human emotion perception have linked a distributed set of brain regions to the recognition of emotion in facial, vocal, and body expressions. In particular, lesions to somatosensory cortex in the right hemisphere have been shown to impair recognition of facial and vocal expressions of emotion. Although these findings suggest that somatosensory cortex represents body states associated with distinct emotions, such as a furrowed brow or gaping jaw, functional evidence directly linking somatosensory activity and subjective experience during emotion perception is critically lacking. Using functional magnetic resonance imaging and multivariate decoding techniques, we show that perceiving vocal and facial expressions of emotion yields hemodynamic activity in right somatosensory cortex that discriminates among emotion categories, exhibits somatotopic organization, and tracks self-reported sensory experience. The findings both support embodied accounts of emotion and provide mechanistic insight into how emotional expressions are capable of biasing subjective experience in those who perceive them.
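
    A minimal sketch of the multivariate decoding approach described here, assuming trial-wise voxel patterns from a somatosensory region of interest. The data are simulated, and the classifier choice (a linear SVM with scikit-learn) is an assumption rather than the authors' pipeline.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: trial-by-voxel activity patterns from a right
# somatosensory ROI and one emotion-category label per trial.
rng = np.random.default_rng(0)
n_trials, n_voxels, n_emotions = 120, 300, 5
patterns = rng.normal(size=(n_trials, n_voxels))
labels = rng.integers(0, n_emotions, size=n_trials)

# Cross-validated decoding: above-chance accuracy indicates that the ROI's
# activity patterns discriminate among emotion categories.
decoder = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))
accuracy = cross_val_score(decoder, patterns, labels, cv=5)
print("Mean decoding accuracy: %.3f (chance = %.3f)" % (accuracy.mean(), 1 / n_emotions))
```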

  8. Perceptions of Emotion from Facial Expressions are Not Culturally Universal: Evidence from a Remote Culture

    PubMed Central

    Gendron, Maria; Roberson, Debi; van der Vyver, Jacoba Marietta; Barrett, Lisa Feldman

    2014-01-01

    It is widely believed that certain emotions are universally recognized in facial expressions. Recent evidence indicates that Western perceptions (e.g., scowls as anger) depend on cues to US emotion concepts embedded in experiments. Since such cues are a standard feature of methods used in cross-cultural experiments, we hypothesized that evidence of universality depends on this conceptual context. In our study, participants from the US and the Himba ethnic group sorted images of posed facial expressions into piles by emotion type. Without cues to emotion concepts, Himba participants did not show the presumed “universal” pattern, whereas US participants produced a pattern with presumed universal features. With cues to emotion concepts, participants in both cultures produced sorts that were closer to the presumed “universal” pattern, although substantial cultural variation persisted. Our findings indicate that perceptions of emotion are not universal, but depend on cultural and conceptual contexts. PMID:24708506

  9. Somatosensory Representations Link the Perception of Emotional Expressions and Sensory Experience

    PubMed Central

    2016-01-01

    Studies of human emotion perception have linked a distributed set of brain regions to the recognition of emotion in facial, vocal, and body expressions. In particular, lesions to somatosensory cortex in the right hemisphere have been shown to impair recognition of facial and vocal expressions of emotion. Although these findings suggest that somatosensory cortex represents body states associated with distinct emotions, such as a furrowed brow or gaping jaw, functional evidence directly linking somatosensory activity and subjective experience during emotion perception is critically lacking. Using functional magnetic resonance imaging and multivariate decoding techniques, we show that perceiving vocal and facial expressions of emotion yields hemodynamic activity in right somatosensory cortex that discriminates among emotion categories, exhibits somatotopic organization, and tracks self-reported sensory experience. The findings both support embodied accounts of emotion and provide mechanistic insight into how emotional expressions are capable of biasing subjective experience in those who perceive them. PMID:27280154

  10. The Emotional Modulation of Facial Mimicry: A Kinematic Study.

    PubMed

    Tramacere, Antonella; Ferrari, Pier F; Gentilucci, Maurizio; Giuffrida, Valeria; De Marco, Doriana

    2017-01-01

    It is well established that observing emotional facial expressions induces facial mimicry responses in observers. However, how the interaction between the emotional and motor components of facial expressions modulates the motor behavior of the perceiver is still unknown. We developed a kinematic experiment to evaluate the effect of different oro-facial expressions on the perceiver's face movements. Participants were asked to perform two movements, i.e., lip stretching and lip protrusion, in response to the observation of four meaningful (i.e., smile, angry-mouth, kiss, and spit) and two meaningless mouth gestures. All the stimuli were characterized by different motor patterns (mouth aperture or mouth closure). Response times and kinematic parameters of the movements (amplitude, duration, and mean velocity) were recorded and analyzed. Results revealed a dissociated effect on reaction times and movement kinematics. We found shorter reaction times when a mouth movement was preceded by the observation of a meaningful and motorically congruent oro-facial gesture, in line with the facial mimicry effect. By contrast, during execution, the perception of a smile facilitated, in terms of shorter duration and higher velocity, the incongruent movement, i.e., lip protrusion. The same effect was observed in response to kiss and spit, which significantly facilitated the execution of lip stretching. We call this phenomenon the facial mimicry reversal effect, the overturning of the effect normally observed during facial mimicry. In general, the findings show that both motor features and the type of emotional oro-facial gesture (conveying positive or negative valence) affect the kinematics of subsequent mouth movements at different levels: whereas congruent motor features facilitate a general motor response, motor execution can be speeded by gestures that are motorically incongruent with the observed one. Moreover, the valence effect depends on the specific movement required. Results are discussed in relation to Basic Emotion Theory and the embodied cognition framework.

  11. Multiple faces of pain: effects of chronic pain on the brain regulation of facial expression

    PubMed Central

    Vachon-Presseau, Etienne; Roy, Mathieu; Woo, Choong-Wan; Kunz, Miriam; Martel, Marc-Olivier; Sullivan, Michael J.; Jackson, Philip L.; Wager, Tor D.; Rainville, Pierre

    2018-01-01

    Pain behaviors are shaped by social demands and learning processes, and chronic pain has been previously suggested to affect their meaning. In this study, we combined functional magnetic resonance imaging with in-scanner video recording during thermal pain stimulation and used multilevel mediation analyses to study the brain mediators of pain facial expressions and the perception of pain intensity (self-reports) in healthy individuals and patients with chronic back pain (CBP). Behavioral data showed that the relation between pain expression and pain report was disrupted in CBP. In both patients with CBP and healthy controls, brain activity varying on a trial-by-trial basis with pain facial expressions was mainly located in the primary motor cortex and completely dissociated from the pattern of brain activity varying with pain intensity ratings. Stronger activity was observed in CBP specifically during pain facial expressions in several nonmotor brain regions such as the medial prefrontal cortex, the precuneus, and the medial temporal lobe. In sharp contrast, no moderating effect of chronic pain was observed on brain activity associated with pain intensity ratings. Our results demonstrate that pain facial expressions and pain intensity ratings reflect different aspects of pain processing and support psychosocial models of pain suggesting that distinctive mechanisms are involved in the regulation of pain behaviors in chronic pain. PMID:27411160

  12. Nasal Oxytocin Treatment Biases Dogs’ Visual Attention and Emotional Response toward Positive Human Facial Expressions

    PubMed Central

    Somppi, Sanni; Törnqvist, Heini; Topál, József; Koskela, Aija; Hänninen, Laura; Krause, Christina M.; Vainio, Outi

    2017-01-01

    The neuropeptide oxytocin plays a critical role in social behavior and emotion regulation in mammals. The aim of this study was to explore how nasal oxytocin administration affects gazing behavior during emotional perception in domestic dogs. Looking patterns of dogs, as a measure of voluntary attention, were recorded during the viewing of human facial expression photographs. The pupil diameters of dogs were also measured as a physiological index of emotional arousal. In a placebo-controlled within-subjects experimental design, 43 dogs, after having received either oxytocin or placebo (saline) nasal spray treatment, were presented with pictures of unfamiliar male human faces displaying either a happy or an angry expression. We found that, depending on the facial expression, the dogs’ gaze patterns were affected selectively by oxytocin treatment. After receiving oxytocin, dogs fixated less often on the eye regions of angry faces and revisited (glanced back at) more often the eye regions of smiling (happy) faces than after the placebo treatment. Furthermore, following the oxytocin treatment dogs fixated and revisited the eyes of happy faces significantly more often than the eyes of angry faces. The analysis of dogs’ pupil diameters during viewing of human facial expressions indicated that oxytocin may also have a modulatory effect on dogs’ emotional arousal. While subjects’ pupil sizes were significantly larger when viewing angry faces than happy faces in the control (placebo treatment) condition, oxytocin treatment not only eliminated this effect but caused an opposite pupil response. Overall, these findings suggest that nasal oxytocin administration selectively changes the allocation of attention and emotional arousal in domestic dogs. Oxytocin has the potential to decrease vigilance toward threatening social stimuli and increase the salience of positive social stimuli thus making eye gaze of friendly human faces more salient for dogs. Our study provides further support for the role of the oxytocinergic system in the social perception abilities of domestic dogs. We propose that oxytocin modulates fundamental emotional processing in dogs through a mechanism that may facilitate communication between humans and dogs. PMID:29089919

  13. Slowing down presentation of facial movements and vocal sounds enhances facial expression recognition and induces facial-vocal imitation in children with autism.

    PubMed

    Tardif, Carole; Lainé, France; Rodriguez, Mélissa; Gepner, Bruno

    2007-09-01

    This study examined the effects of slowing down the presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on CD-ROM, under audio or silent conditions, and under dynamic visual conditions (slowly, very slowly, at normal speed) plus a static control. Overall, children with autism showed lower performance in expression recognition and more induced facial-vocal imitation than controls. In the autistic group, facial expression recognition and induced facial-vocal imitation were significantly enhanced in the slow conditions. These findings may offer new perspectives for understanding and treating verbal, emotional-perceptual, and communicative impairments in autistic populations.

  14. Assessment of health-related quality of life in Turkish patients with facial prostheses

    PubMed Central

    2013-01-01

    Background Facial prostheses are intended to provide a non-operative rehabilitation for patients with acquired facial defects. By improving aesthetics and quality of life (QOL), this treatment involves reintegration of the patient into family and social life. The aim of this study was to evaluate the perception of QOL in adult patients with facial prostheses and to compare this perception with that of a control group. Methods The study participants consisted of 72 patients, who were divided into three equal-sized groups according to the type of prosthesis (OP - orbital prosthesis, AP - auricular prosthesis, NP - nasal prosthesis), and 24 healthy control participants without any congenital or acquired deformity of face or body. Clinical and socio-demographic data were gathered from each person's medical chart. Participants completed the Turkish version of the World Health Organization Quality of Life Instrument, Short Form (WHOQOL-BREF). Descriptive statistics, independent sample t-tests, Pearson's chi-square test, ANOVA, ANCOVA, and Pearson correlation were used to analyse the data. Results Compared with the control participants, patients with NP scored lower on all domains of QOL, and all three patient groups had lower scores on overall QOL and its domains of physical and environmental health. Patients with OP reported significantly lower physical health scores than those with AP, while patients with NP reported significantly lower overall QOL and psychological health scores than those with AP. Female patients had lower environmental domain scores than did male patients. The patient's age and income correlated with social relationships QOL, while the patient's income and the age of the facial prosthesis were correlated with environmental QOL. Conclusion Patients with facial prostheses had lower scores in overall QOL and in the physical and environmental health domains than the control participants. Socio-demographic and clinical characteristics such as age, gender, income, localization of the defect, and age of the facial prosthesis were associated with patients' QOL. These findings may provide valuable information about the specific health needs of these patients that may affect their well-being. Further studies are needed to confirm these results. Use of the WHOQOL-BREF may provide valuable information for determining patients' needs and priorities as well as for planning and developing comprehensive prosthetic rehabilitation programs. PMID:23351906

  15. Societal Value of Surgery for Facial Reanimation

    PubMed Central

    Su, Peiyi; Ishii, Lisa E.; Joseph, Andrew; Nellis, Jason; Dey, Jacob; Bater, Kristin; Byrne, Patrick J.; Boahene, Kofi D. O.; Ishii, Masaru

    2017-01-01

    IMPORTANCE Patients with facial paralysis are perceived negatively by society in a number of domains. Society’s perception of the health utility of varying degrees of facial paralysis and the value society places on reconstructive surgery for facial reanimation need to be quantified. OBJECTIVE To measure health state utility of varying degrees of facial paralysis, willingness to pay (WTP) for a repair, and the subsequent value of facial reanimation surgery as perceived by society. DESIGN, SETTING, AND PARTICIPANTS This prospective observational study conducted in an academic tertiary referral center evaluated a group of 348 casual observers who viewed images of faces with unilateral facial paralysis of 3 severity levels (low, medium, and high) categorized by House-Brackmann grade. Structural equation modeling was performed to understand associations among health utility metrics, WTP, and facial perception domains. Data were collected from July 16 to September 26, 2015. MAIN OUTCOMES AND MEASURES Observer-rated (1) quality of life (QOL) using established health utility metrics (standard gamble, time trade-off, and a visual analog scale) and (2) their WTP for surgical repair. RESULTS Among the 348 observers (248 women [71.3%]; 100 men [28.7%]; mean [SD] age, 29.3 [11.6] years), mixed-effects linear regression showed that WTP increased nonlinearly with increasing severity of paralysis. Participants were willing to pay $3487 (95% CI, $2362–$4961) to repair low-grade paralysis, $8571 (95% CI, $6401–$11 234) for medium-grade paralysis, and $20 431 (95% CI, $16 273–$25 317) for high-grade paralysis. The dominant factor affecting the participants’ WTP was perceived QOL. Modeling showed that perceived QOL decreased with paralysis severity (regression coefficient, −0.004; 95% CI, −0.005 to −0.004; P < .001) and increased with attractiveness (regression coefficient, 0.002; 95% CI, 0.002 to 0.003; P < .001). Mean (SD) health utility scores calculated by the standard gamble metric for low- and high-grade paralysis were 0.98 (0.09) and 0.77 (0.25), respectively. Time trade-off and visual analog scale measures were highly correlated. We calculated mean (SD) WTP per quality-adjusted life-year, which ranged from $10 167 ($14 565) to $17 008 ($38 288) for low- to high-grade paralysis, respectively. CONCLUSIONS AND RELEVANCE Society perceives the repair of facial paralysis to be a high-value intervention. Societal WTP increases and perceived health state utility decreases with increasing House-Brackmann grade. This study demonstrates the usefulness of WTP as an objective measure to inform dimensions of disease severity and signal the value society places on proper facial function. LEVEL OF EVIDENCE NA. PMID:27892977
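
    The abstract reports willingness to pay alongside standard-gamble utilities and a derived WTP per quality-adjusted life-year. As a generic, hedged illustration of that quantity (not the study's actual model), a QALY gain can be formed from a utility decrement and an assumed time horizon and then divided into the stated WTP; every number below is made up.

      # Hedged illustration of a WTP-per-QALY calculation (hypothetical numbers;
      # the published study's model may differ).
      utility_normal = 1.00        # assumed utility of unimpaired facial function
      utility_paralysis = 0.77     # e.g. a standard-gamble utility for severe paralysis
      years = 10.0                 # assumed time horizon over which repair benefits apply
      wtp = 20431.0                # stated willingness to pay for repair (USD)

      qalys_gained = (utility_normal - utility_paralysis) * years
      wtp_per_qaly = wtp / qalys_gained
      print(f"QALYs gained: {qalys_gained:.2f}, WTP per QALY: ${wtp_per_qaly:,.0f}")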

  16. The Measurement of the Sensory Recovery Period in Zygoma and Blow-Out Fractures with Neurometer Current Perception Threshold.

    PubMed

    Oh, Daemyung; Yun, Taebin; Kim, Junhyung; Choi, Jaehoon; Jeong, Woonhyeok; Chu, Hojun; Lee, Soyoung

    2016-09-01

    Facial hypoesthesia is one of the most troublesome complaints in the management of facial bone fractures. However, there is a lack of literature on facial sensory recovery after facial trauma. The purpose of this study was to evaluate the facial sensory recovery period after facial bone fractures using the Neurometer device. Sixty-three patients who underwent open reduction of zygomatic and blowout fractures between December 2013 and July 2015 were included in the study. The facial sensory status of the patients was examined repeatedly, preoperatively and postoperatively, with the Neurometer current perception threshold (CPT) test until the results normalized. Among the 63 subjects, 30 patients had normal Neurometer results preoperatively and postoperatively. According to fracture type, 17 patients with blowout fracture had a median recovery period of 0.25 months. Twelve patients with zygomatic fracture had a median recovery period of 1.00 month. Four patients with both fracture types had a median recovery period of 0.625 months. The median recovery period of all 33 patients was 0.25 months. There was no statistically significant difference in the sensory recovery period between types and subgroups of zygomatic and blowout fractures. In addition, there was no statistically significant difference in the sensory recovery period according to Neurometer results and the patients' own subjective reports. Neurometer CPT is effective for evaluating and comparing preoperative and postoperative facial sensory status and for evaluating the sensory recovery period in facial bone fracture patients.

  17. Quantifying deficits in the perception of fear and anger in morphed facial expressions after bilateral amygdala damage.

    PubMed

    Graham, Reiko; Devinsky, Orrin; Labar, Kevin S

    2007-01-07

    Amygdala damage has been associated with impairments in perceiving facial expressions of fear. However, deficits in perceiving other emotions, such as anger, and deficits in perceiving emotion blends have not been definitively established. One possibility is that methods used to index expression perception are susceptible to heuristic use, which may obscure impairments. To examine this, we adapted a task used to examine categorical perception of morphed facial expressions [Etcoff, N. L., & Magee, J. J. (1992). Categorical perception of facial expressions. Cognition, 44(3), 227-240]. In one version of the task, expressions were categorized without time constraints. In the other, expressions were presented with limited exposure durations to tap more automatic aspects of processing. Three morph progressions were employed: neutral to anger, neutral to fear, and fear to anger. Both tasks were administered to a participant with bilateral amygdala damage (S.P.), age- and education-matched controls, and young controls. The second task was also administered to unilateral temporal lobectomy patients. In the first version, S.P. showed impairments relative to normal controls on the neutral-to-anger and fear-to-anger morphs, but not on the neutral-to-fear morph. However, reaction times suggested that speed-accuracy tradeoffs could account for the results. In the second version, S.P. showed impairments on all morph types relative to all other subject groups. A third experiment showed that this deficit did not extend to the perception of morphed identities. These results imply that when heuristic use is discouraged on tasks utilizing subtle emotion transitions, deficits in the perception of anger and anger/fear blends, as well as fear, are evident with bilateral amygdala damage.
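
    Morph progressions like the neutral-to-anger continuum used in such tasks are built by interpolating between two expression images. The sketch below shows only a pixel-level cross-fade between two pre-aligned images; published morph stimuli additionally warp facial geometry, and the file names are hypothetical.

      # Hedged sketch: a linear cross-fade between two pre-aligned expression
      # images (pixel blending only; real morphs also warp feature geometry).
      import numpy as np
      from PIL import Image

      neutral = np.asarray(Image.open("neutral.png").convert("L"), dtype=float)  # hypothetical file
      anger = np.asarray(Image.open("anger.png").convert("L"), dtype=float)      # hypothetical file

      for i, w in enumerate(np.linspace(0.0, 1.0, 11)):   # 0% to 100% anger in 10% steps
          blend = (1.0 - w) * neutral + w * anger
          Image.fromarray(blend.astype(np.uint8)).save(f"morph_{i:02d}.png")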

  18. Facial profile parameters and their relative influence on bilabial prominence and the perceptions of facial profile attractiveness: A novel approach

    PubMed Central

    Denize, Erin Stewart; McDonald, Fraser; Sherriff, Martyn

    2014-01-01

    Objective To evaluate the relative importance of bilabial prominence in relation to other facial profile parameters in a normal population. Methods Profile stimulus images of 38 individuals (28 female and 10 male; ages 19-25 years) were shown to an unrelated group of first-year students (n = 42; ages 18-24 years). The images were individually viewed on a 17-inch monitor. The observers received standardized instructions before viewing. A six-question questionnaire was completed using a Likert-type scale. The responses were analyzed by ordered logistic regression to identify associations between profile characteristics and observer preferences. The Bayesian Information Criterion was used to select variables that explained observer preferences most accurately. Results Nasal, bilabial, and chin prominences; the nasofrontal angle; and lip curls had the greatest effect on overall profile attractiveness perceptions. The lip-chin-throat angle and upper lip curl had the greatest effect on forehead prominence perceptions. The bilabial prominence, nasolabial angle (particularly the lower component), and mentolabial angle had the greatest effect on nasal prominence perceptions. The bilabial prominence, nasolabial angle, chin prominence, and submental length had the greatest effect on lip prominence perceptions. The bilabial prominence, nasolabial angle, mentolabial angle, and submental length had the greatest effect on chin prominence perceptions. Conclusions More prominent lips, within normal limits, may be considered more attractive in the profile view. Profile parameters have a greater influence on their neighboring aesthetic units but indirectly influence related profile parameters, endorsing the importance of achieving an aesthetic balance between relative prominences of all aesthetic units of the facial profile. PMID:25133133

  19. Comparison of the effect of labiolingual inclination and anteroposterior position of maxillary incisors on esthetic profile in three different facial patterns

    PubMed Central

    Chirivella, Praveen; Singaraju, Gowri Sankar; Mandava, Prasad; Reddy, V Karunakar; Neravati, Jeevan Kumar; George, Suja Ani

    2017-01-01

    Objective: To test the null hypothesis that a change in maxillary incisor inclination and position has no effect on the esthetic perception of the smiling profile in three different facial types. Materials and Methods: A smiling profile photograph of a subject with a Class I skeletal and dental pattern and a normal profile was taken for each of the three facial types: dolichofacial, mesofacial, and brachyfacial. Based on the original digital image, 15 smiling profiles for each facial type were created using the FACAD software by altering the labiolingual inclination and anteroposterior position of the maxillary incisors. These photographs were rated on a visual analog scale by three panels of examiners consisting of orthodontists, dentists, and nonprofessionals, with twenty members in each group. The responses were assessed by analysis of variance (ANOVA) followed by post hoc Scheffe tests. Results: Significant differences (P < 0.001) were detected when the ratings of each photograph within each facial type were compared. In the dolichofacial and mesofacial patterns, the position of the maxillary incisor must be limited to 2 mm from the goal anterior limit line (GALL). In the brachyfacial pattern, any movement of the facial axis point of the maxillary incisors away from the GALL worsens facial esthetics. The ANOVA showed differences among the three groups for certain facial profiles. Conclusion: The hypothesis was rejected. The esthetic perception of the labiolingual inclination and anteroposterior position of the maxillary incisors differs across facial types, and this may affect the formulation of treatment plans for different facial types. PMID:28197396

  20. Dependence of the appearance-based perception of criminality, suggestibility, and trustworthiness on the level of pixelation of facial images.

    PubMed

    Nurmoja, Merle; Eamets, Triin; Härma, Hanne-Loore; Bachmann, Talis

    2012-10-01

    While the dependence of face identification on the level of pixelation of facial images has been well studied, similar research on face-based trait perception is underdeveloped. Because depiction formats used for hiding individual identity in visual media and evidential material recorded by surveillance cameras often consist of pixelized images, knowing the effects of pixelation on person perception has practical relevance. Here, the results of two experiments are presented showing the effect of facial image pixelation on the perception of criminality, trustworthiness, and suggestibility. It appears that individuals (N = 46, M age = 21.5 yr., SD = 3.1 for criminality ratings; N = 94, M age = 27.4 yr., SD = 10.1 for other ratings) are able to discriminate facial cues indicative of these perceived traits even at a coarse level of image pixelation (10-12 pixels per face horizontally), and that discriminability increases as the pixelation becomes less coarse. Perceived criminality and trustworthiness appear to be better carried by the pixelized images than perceived suggestibility.
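
    The pixelation levels described (roughly 10-12 pixels per face horizontally) can be reproduced by downsampling an image to that block resolution and enlarging it back with nearest-neighbour resampling. The sketch below is illustrative only; the input file and block count are assumptions.

      # Hedged sketch: coarse pixelation by downsampling and nearest-neighbour upsizing.
      from PIL import Image

      def pixelate(path, blocks_across=12):
          img = Image.open(path)                                # hypothetical input image
          w, h = img.size
          blocks_down = max(1, round(h * blocks_across / w))    # keep the aspect ratio
          small = img.resize((blocks_across, blocks_down), Image.BILINEAR)
          return small.resize((w, h), Image.NEAREST)            # enlarge without smoothing

      pixelate("face.png", blocks_across=12).save("face_12px.png")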

  1. Effects of nonverbal behavior on perceptions of a female employee's power bases.

    PubMed

    Aguinis, H; Henle, C A

    2001-08-01

    The authors extended a previous examination of the effects of nonverbal behavior on perceptions of a male employee's power bases (H. Aguinis, M. M. Simonsen, & C. A. Pierce, 1998) by examining the effects of nonverbal behavior on perceptions of a female employee's power bases. U.S. undergraduates read vignettes describing a female employee engaging in 3 types of nonverbal behavior (i.e., eye contact, facial expression, body posture) and rated their perceptions of the woman's power bases (i.e., reward, coercive, legitimate, referent, expert, credibility). As predicted, (a) direct eye contact increased perceptions of coercive power, and (b) a relaxed facial expression decreased perceptions of all 6 power bases. Also as predicted, the present results differed markedly from those of Aguinis et al. (1998) regarding a male employee. The authors discuss implications for theory, future research, and the advancement of female employees.

  2. Associations between feelings of social anxiety and emotion perception.

    PubMed

    Lynn, Spencer K; Bui, Eric; Hoeppner, Susanne S; O'Day, Emily B; Palitz, Sophie A; Barrett, Lisa Feldman; Simon, Naomi M

    2018-06-01

    Abnormally biased perceptual judgment is a feature of many psychiatric disorders. For example, individuals with social anxiety disorder are biased to recall or interpret social events negatively. Cognitive behavioral therapy addresses such bias by teaching patients, via verbal instruction, to become aware of and change pathological misjudgment. The present study examined whether targeting verbal instruction at specific decision parameters that influence perceptual judgment can effect changes in anger perception. We used a signal detection framework to decompose anger perception into three decision parameters (base rate of encountering anger vs. no-anger, payoff for correct vs. incorrect categorization of face stimuli, and perceptual similarity of angry vs. not-angry facial expressions). We created brief verbal instructions that emphasized each parameter separately. Participants with social anxiety disorder, generalized anxiety disorder, and healthy controls were assigned to one of the three instruction conditions. We compared anger perception pre- vs. post-instruction. Base rate and payoff instructions affected response bias over and above practice effects, across the three groups. There was no interaction with diagnosis. The ability to target specific decision parameters that underlie perceptual judgment suggests that cognitive behavioral therapy might be improved by tailoring it to patients' individual parameter "estimation" deficits. Copyright © 2017 Elsevier Ltd. All rights reserved.
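
    The decision parameters named in this abstract map onto standard signal detection quantities. The sketch below computes sensitivity (d') and criterion (c) from hit and false-alarm rates, and the optimal likelihood-ratio criterion implied by a base rate and payoff matrix; these are textbook signal detection formulas with illustrative numbers, not values or code from the study.

      # Hedged sketch of textbook signal detection quantities (illustrative numbers).
      from statistics import NormalDist

      z = NormalDist().inv_cdf               # inverse standard normal CDF

      def dprime_and_criterion(hit_rate, fa_rate):
          d = z(hit_rate) - z(fa_rate)            # sensitivity
          c = -0.5 * (z(hit_rate) + z(fa_rate))   # response criterion
          return d, c

      d, c = dprime_and_criterion(hit_rate=0.80, fa_rate=0.30)
      print(f"d' = {d:.2f}, criterion c = {c:.2f}")

      # Optimal likelihood-ratio criterion given a base rate and payoffs:
      # beta* = [P(no anger) / P(anger)] * [(V_correct_rejection + C_false_alarm) /
      #                                     (V_hit + C_miss)]
      p_anger = 0.25
      beta_opt = ((1 - p_anger) / p_anger) * ((1 + 1) / (1 + 1))   # symmetric payoffs here
      print(f"optimal beta = {beta_opt:.2f}")   # > 1: require more evidence to respond 'anger'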

  3. Perceived association between diagnostic and non-diagnostic cues of women's sexual interest: General Recognition Theory predictors of risk for sexual coercion.

    PubMed

    Farris, Coreen; Viken, Richard J; Treat, Teresa A

    2010-01-01

    Young men's errors in sexual perception have been linked to sexual coercion. The current investigation sought to explicate the perceptual and decisional sources of these social perception errors, as well as their link to risk for sexual violence. General Recognition Theory (GRT; [Ashby, F. G., & Townsend, J. T. (1986). Varieties of perceptual independence. Psychological Review, 93, 154-179]) was used to estimate participants' ability to discriminate between affective cues and clothing style cues and to measure illusory correlations between men's perception of women's clothing style and sexual interest. High-risk men were less sensitive to the distinction between women's friendly and sexual interest cues relative to other men. In addition, they were more likely to perceive an illusory correlation between women's diagnostic sexual interest cues (e.g., facial affect) and non-diagnostic cues (e.g., provocative clothing), which increases the probability that high-risk men will misperceive friendly women as intending to communicate sexual interest. The results provide information about the degree of risk conferred by individual differences in perceptual processing of women's interest cues, and also illustrate how translational scientists might adapt GRT to examine research questions about individual differences in social perception.

  4. Identity modulates short-term memory for facial emotion.

    PubMed

    Galster, Murray; Kahana, Michael J; Wilson, Hugh R; Sekuler, Robert

    2009-12-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects' similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces' perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces' perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.
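
    The similarity-scaling step described here can be sketched with an off-the-shelf MDS implementation: pairwise dissimilarity judgments go in, low-dimensional coordinates come out. The dissimilarity matrix below is hypothetical, and scikit-learn's MDS is only a stand-in for whatever scaling procedure the authors used.

      # Hedged sketch: non-metric MDS on a hypothetical 4x4 dissimilarity matrix.
      import numpy as np
      from sklearn.manifold import MDS

      D = np.array([[0.0, 2.1, 5.4, 6.0],
                    [2.1, 0.0, 4.8, 5.5],
                    [5.4, 4.8, 0.0, 1.9],
                    [6.0, 5.5, 1.9, 0.0]])    # made-up dissimilarity ratings

      mds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
      coords = mds.fit_transform(D)
      print(np.round(coords, 2))               # 2-D coordinates, one row per face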

  5. Facial expressions as a model to test the role of the sensorimotor system in the visual perception of the actions.

    PubMed

    Mele, Sonia; Ghirardi, Valentina; Craighero, Laila

    2017-12-01

    A long-standing debate concerns whether the sensorimotor coding carried out during the observation of transitive actions reflects low-level movement implementation details or movement goals. By contrast, phonemes and emotional facial expressions are intransitive actions that do not fall into this debate. The investigation of phoneme discrimination has proven to be a good model for demonstrating that the sensorimotor system plays a role in understanding acoustically presented actions. In the present study, we adapted the experimental paradigms already used for phoneme discrimination during face posture manipulation to the discrimination of emotional facial expressions. Participants underwent a lower or an upper face posture manipulation while performing a four-alternative labelling task on pictures randomly taken from four morphed continua between two emotional facial expressions. The results showed that the implementation of low-level movement details influences the discrimination of ambiguous facial expressions that differ in the specific involvement of those movement details. These findings indicate that facial expression discrimination is a good model for testing the role of the sensorimotor system in the perception of visually presented actions.

  6. Identity modulates short-term memory for facial emotion

    PubMed Central

    Galster, Murray; Kahana, Michael J.; Wilson, Hugh R.; Sekuler, Robert

    2010-01-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects’ similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces’ perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces’ perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental. PMID:19897794

  7. Facial profile esthetic preferences: perception in two Brazilian states.

    PubMed

    Oliveira, Marina Detoni Vieira de; Silveira, Bruno Lopes da; Mattos, Cláudia Trindade; Marquezan, Mariana

    2015-01-01

    The aim of this study was to assess the regional influence on the perception of facial profile esthetics in Rio de Janeiro state (RJ) and Rio Grande do Sul state (RS), Brazil. Two Caucasian models, a man and a woman, with balanced facial profiles, had their photographs digitally manipulated so as to produce seven different profiles. First year dental students (laypeople) assessed the images and classified them according to their esthetic preference. The result of the t test for independent samples showed differences among states for certain facial profiles. The female photograph identified with the letter 'G' (mandibular retrusion) received higher scores in RS state (p = 0.006). No differences were found for male photographs (p > 0.007). The evaluators' sex seemed not to influence their esthetic perception (p > 0.007). Considering all evaluators together, ANOVA/Tukey's test showed differences among the profiles (p ≤ 0.05) for both male and female photographs. The female photograph that received the highest score was the one identified with the letter 'F' (dentoalveolar bimaxillary retrusion/ straight profile). For the male profiles, photograph identified with the letter 'E' (dentoalveolar bimaxillary protrusion/ straight profile) received the best score. Regional differences were observed regarding preferences of facial profile esthetics. In Rio de Janeiro state, more prominent lips were preferred while in Rio Grande do Sul state, profiles with straight lips were favored. Class III profiles were considered less attractive.

  8. A research on motion design for APP's loading pages based on time perception

    NASA Astrophysics Data System (ADS)

    Cao, Huai; Hu, Xiaoyun

    2018-04-01

    Owing to objective constraints such as network bandwidth and hardware performance, waiting remains an unavoidable part of using mobile applications. Research shows that users' feelings during a waiting scenario can affect their evaluation of the whole product and the services it provides. With the development of user experience and interface design as disciplines, the role of motion effects in interface design has attracted increasing scholarly attention. Existing theory on motion design for waiting scenarios, however, remains incomplete. This article uses the basic theory and experimental methods of cognitive psychology to explore how motion design affects users' time perception while an app's loading pages are displayed. First, the article analyzes the factors that shape the waiting experience on loading pages, based on the theory of time perception, and then discusses how motion design influences perceived duration during loading and the corresponding design strategy. Building on an analysis of existing loading animations, the article classifies current loading motions and designs an experiment to verify the impact of different motion types on users' time perception. The results show that perceived waiting time in mobile apps is related to the type of loading motion, and that combined loading motions can effectively shorten perceived waiting time, receiving the best mean scores on the time-perception measure.

  9. The effects of facial adiposity on attractiveness and perceived leadership ability.

    PubMed

    Re, Daniel E; Perrett, David I

    2014-01-01

    Facial attractiveness has a positive influence on electoral success both in experimental paradigms and in the real world. One parameter that influences facial attractiveness and social judgements is facial adiposity (a facial correlate to body mass index, BMI). Overweight people have high facial adiposity and are perceived to be less attractive and lower in leadership ability. Here, we used an interactive design in order to assess whether the most attractive level of facial adiposity is also perceived as most leader-like. We found that participants reduced facial adiposity more to maximize attractiveness than to maximize perceived leadership ability. These results indicate that facial appearance impacts leadership judgements beyond the effects of attractiveness. We suggest that the disparity between optimal facial adiposity in attractiveness and leadership judgements stems from social trends that have produced thin ideals for attractiveness, while leadership judgements are associated with perception of physical dominance.

  10. Effects of delta-9-tetrahydrocannabinol on evaluation of emotional images

    PubMed Central

    Ballard, Michael E; Bedi, Gillinder; de Wit, Harriet

    2013-01-01

    There is growing evidence that drugs of abuse alter processing of emotional information in ways that could be attractive to users. Our recent report that Δ9-tetrahydrocannabinol (THC) diminishes amygdalar activation in response to threat-related faces suggests that THC may modify evaluation of emotionally salient, particularly negative or threatening, stimuli. In this study, we examined the effects of acute THC on evaluation of emotional images. Healthy volunteers received two doses of THC (7.5 and 15 mg; p.o.) and placebo across separate sessions before performing tasks assessing facial emotion recognition and emotional responses to pictures of emotional scenes. THC significantly impaired recognition of facial fear and anger, but it only marginally impaired recognition of sadness and happiness. The drug did not consistently affect ratings of emotional scenes. THC's effects on emotional evaluation were not clearly related to its mood-altering effects. These results support our previous work and show that THC reduces perception of facial threat. Nevertheless, THC does not appear to positively bias evaluation of emotional stimuli in general. PMID:22585232

  11. Modelling the perceptual similarity of facial expressions from image statistics and neural responses.

    PubMed

    Sormaz, Mladen; Watson, David M; Smith, William A P; Young, Andrew W; Andrews, Timothy J

    2016-04-01

    The ability to perceive facial expressions of emotion is essential for effective social communication. We investigated how the perception of facial expression emerges from the image properties that convey this important social signal, and how neural responses in face-selective brain regions might track these properties. To do this, we measured the perceptual similarity between expressions of basic emotions, and investigated how this is reflected in image measures and in the neural response of different face-selective regions. We show that the perceptual similarity of different facial expressions (fear, anger, disgust, sadness, happiness) can be predicted by both surface and feature shape information in the image. Using block design fMRI, we found that the perceptual similarity of expressions could also be predicted from the patterns of neural response in the face-selective posterior superior temporal sulcus (STS), but not in the fusiform face area (FFA). These results show that the perception of facial expression is dependent on the shape and surface properties of the image and on the activity of specific face-selective regions. Copyright © 2016 Elsevier Inc. All rights reserved.
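
    The analysis logic here, relating a behavioural similarity structure to similarity structures derived from image measures and from neural response patterns, can be sketched by correlating the off-diagonal entries of the respective similarity matrices. The matrices below are random placeholders for the five expressions; this mirrors the general representational-similarity idea rather than the authors' exact pipeline.

      # Hedged sketch: rank-correlate a behavioural similarity matrix with image-based
      # and neural-pattern similarity matrices (hypothetical 5x5 matrices).
      import numpy as np
      from scipy.stats import spearmanr

      def lower_triangle(m):
          i, j = np.tril_indices_from(m, k=-1)
          return m[i, j]

      rng = np.random.default_rng(1)
      perceptual = rng.random((5, 5)); perceptual = (perceptual + perceptual.T) / 2
      image_based = rng.random((5, 5)); image_based = (image_based + image_based.T) / 2
      neural_sts = rng.random((5, 5)); neural_sts = (neural_sts + neural_sts.T) / 2

      for name, m in [("image measures", image_based), ("STS patterns", neural_sts)]:
          rho, p = spearmanr(lower_triangle(perceptual), lower_triangle(m))
          print(f"{name}: rho = {rho:.2f}, p = {p:.3f}")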

  12. Experience-based human perception of facial expressions in Barbary macaques (Macaca sylvanus).

    PubMed

    Maréchal, Laëtitia; Levy, Xandria; Meints, Kerstin; Majolo, Bonaventura

    2017-01-01

    Facial expressions convey key cues of human emotions, and may also be important for interspecies interactions. The universality hypothesis suggests that six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) should be expressed by similar facial expressions in phylogenetically close species such as humans and nonhuman primates. However, some facial expressions have been shown to differ in meaning between humans and nonhuman primates like macaques. This ambiguity in signalling emotion can lead to an increased risk of aggression and injuries for both humans and animals. This raises serious concerns for activities such as wildlife tourism, where humans closely interact with wild animals. Understanding what factors (i.e., experience and type of emotion) affect the ability to recognise the emotional state of nonhuman primates, based on their facial expressions, can enable us to test the validity of the universality hypothesis, as well as reduce the risk of aggression and potential injuries in wildlife tourism. The present study investigated whether different levels of experience with Barbary macaques, Macaca sylvanus, affect the ability to correctly assess different facial expressions related to aggressive, distressed, friendly or neutral states, using an online questionnaire. Participants' level of experience was defined as either: (1) naïve: never worked with nonhuman primates and never or rarely encountered live Barbary macaques; (2) exposed: shown pictures of the different Barbary macaque facial expressions along with the description and the corresponding emotion prior to undertaking the questionnaire; (3) expert: worked with Barbary macaques for at least two months. Experience with Barbary macaques was associated with better performance in judging their emotional state. Simple exposure to pictures of macaques' facial expressions improved the ability of inexperienced participants to discriminate neutral and distressed faces, and a trend was found for aggressive faces. However, these participants, even when previously exposed to pictures, had difficulties in recognising aggressive, distressed and friendly faces above chance level. These results do not support the universality hypothesis, as exposed and naïve participants had difficulties in correctly identifying aggressive, distressed and friendly faces. Exposure to facial expressions improved their correct recognition. In addition, the findings suggest that providing simple exposure to 2D pictures (for example, information signs explaining animals' facial signalling in zoos or animal parks) is not a sufficient educational tool to reduce tourists' misinterpretations of macaque emotion. Additional measures, such as keeping a safe distance between tourists and wild animals, as well as reinforcing learning via videos or supervised visits led by expert guides, could reduce such issues and improve both animal welfare and tourist experience.

  13. Caring more and knowing more reduces age-related differences in emotion perception.

    PubMed

    Stanley, Jennifer Tehan; Isaacowitz, Derek M

    2015-06-01

    Traditional emotion perception tasks show that older adults are less accurate than are young adults at recognizing facial expressions of emotion. Recently, we proposed that socioemotional factors might explain why older adults seem impaired in lab tasks but less so in everyday life (Isaacowitz & Stanley, 2011). Thus, in the present research we empirically tested whether socioemotional factors such as motivation and familiarity can alter this pattern of age effects. In 1 task, accountability instructions eliminated age differences in the traditional emotion perception task. Using a novel emotion perception paradigm featuring spontaneous dynamic facial expressions of a familiar romantic partner versus a same-age stranger, we found that age differences in emotion perception accuracy were attenuated in the familiar partner condition, relative to the stranger condition. Taken together, the results suggest that both overall accuracy as well as specific patterns of age effects differ appreciably between traditional emotion perception tasks and emotion perception within a socioemotional context. (c) 2015 APA, all rights reserved.

  14. Caring More and Knowing More Reduces Age-Related Differences in Emotion Perception

    PubMed Central

    Stanley, Jennifer Tehan; Isaacowitz, Derek M.

    2015-01-01

    Traditional emotion perception tasks show that older adults are less accurate than young adults at recognizing facial expressions of emotion. Recently, we proposed that socioemotional factors might explain why older adults seem impaired in lab tasks but less so in everyday life (Isaacowitz & Stanley, 2011). Thus, in the present research we empirically tested whether socioemotional factors such as motivation and familiarity can alter this pattern of age effects. In one task, accountability instructions eliminated age differences in the traditional emotion perception task. Using a novel emotion perception paradigm featuring spontaneous dynamic facial expressions of a familiar romantic partner versus a same-age stranger, we found that age differences in emotion perception accuracy were attenuated in the familiar partner condition, relative to the stranger condition. Taken together, the results suggest that both overall accuracy as well as specific patterns of age effects differ appreciably between traditional emotion perception tasks and emotion perception within a socioemotional context. PMID:26030775

  15. What’s in a Face? How Face Gender and Current Affect Influence Perceived Emotion

    PubMed Central

    Harris, Daniel A.; Hayes-Skelton, Sarah A.; Ciaramitaro, Vivian M.

    2016-01-01

    Faces drive our social interactions. A vast literature suggests an interaction between gender and emotional face perception, with studies using different methodologies demonstrating that the gender of a face can affect how emotions are processed. However, how different is our perception of affective male and female faces? Furthermore, how does our current affective state when viewing faces influence our perceptual biases? We presented participants with a series of faces morphed along an emotional continuum from happy to angry. Participants judged each face morph as either happy or angry. We determined each participant’s unique emotional ‘neutral’ point, defined as the face morph judged to be perceived equally happy and angry, separately for male and female faces. We also assessed how current state affect influenced these perceptual neutral points. Our results indicate that, for both male and female participants, the emotional neutral point for male faces is perceptually biased to be happier than for female faces. This bias suggests that more happiness is required to perceive a male face as emotionally neutral, i.e., we are biased to perceive a male face as more negative. Interestingly, we also find that perceptual biases in perceiving female faces are correlated with current mood, such that positive state affect correlates with perceiving female faces as happier, while we find no significant correlation between negative state affect and the perception of facial emotion. Furthermore, we find reaction time biases, with slower responses for angry male faces compared to angry female faces. PMID:27733839
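
    The 'emotional neutral point' described here is a point of subjective equality: the morph level judged angry on half of trials. A common way to estimate it is to fit a logistic psychometric function to the proportion of 'angry' responses and solve for the 50% level; the sketch below uses made-up response proportions, and comparing the fitted point for male versus female face morphs would quantify the kind of bias reported.

      # Hedged sketch: estimate the emotional 'neutral point' (morph level judged
      # angry on 50% of trials) by fitting a logistic psychometric function.
      import numpy as np
      from scipy.optimize import curve_fit

      morph_level = np.linspace(0, 100, 9)   # 0 = fully happy, 100 = fully angry
      p_angry = np.array([0.02, 0.05, 0.10, 0.30, 0.55, 0.80, 0.92, 0.97, 0.99])  # hypothetical data

      def logistic(x, pse, slope):
          return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

      (pse, slope), _ = curve_fit(logistic, morph_level, p_angry, p0=[50.0, 10.0])
      print(f"neutral point: {pse:.1f}% anger (slope {slope:.1f})")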

  16. Emotion-Color Associations in the Context of the Face.

    PubMed

    Thorstenson, Christopher A; Elliot, Andrew J; Pazda, Adam D; Perrett, David I; Xiao, Dengke

    2017-11-27

    Facial expressions of emotion contain important information that is perceived and used by observers to understand others' emotional state. While there has been considerable research into perceptions of facial musculature and emotion, less work has been conducted to understand perceptions of facial coloration and emotion. The current research examined emotion-color associations in the context of the face. Across 4 experiments, participants were asked to manipulate the color of face, or shape, stimuli along 2 color axes (i.e., red-green, yellow-blue) for 6 target emotions (i.e., anger, disgust, fear, happiness, sadness, surprise). The results yielded a pattern that is consistent with physiological and psychological models of emotion. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Rhinoplasty and facial asymmetry: Analysis of subjective and anthropometric factors in the Caucasian nose

    PubMed Central

    Carvalho, Bettina; Ballin, Annelyse Christine; Becker, Renata Vecentin; Berger, Cezar Augusto Sarraff; Hurtado, Johann G. G. Melcherts; Mocellin, Marcos

    2012-01-01

    Introduction: Anthropometric proportions and symmetry are considered determinants of beauty. These parameters have significant importance in facial plastic surgery, particularly in rhinoplasty. As the central organ of the face, the nose is especially important in determining facial symmetry, both through the perception of a crooked nose and through the determination of facial growth. The evaluation of the presence of facial asymmetry has great relevance preoperatively, both for surgical planning and counseling. Aim/Objective: To evaluate and document the presence of facial asymmetry in patients undergoing rhinoplasty planning and to correlate the anthropometric measures with the perception of facial symmetry or asymmetry, assessing whether there is a higher prevalence of facial asymmetry in these patients compared with volunteers without nasal complaints. Methods: This prospective study was performed by comparing photographs of patients undergoing rhinoplasty planning and volunteers (controls), n = 201, and by evaluating anthropometric measurements taken from a line passing through the center of the face to the tragus, medial canthus, lateral alar margin, and oral commissure on each side, using statistical analysis (Z test and odds ratio). Results: None of the patients or volunteers had completely symmetric values. Subjectively, 59% of patients were perceived as asymmetric, compared with 54% of volunteers. Objectively, more than 89% of respondents had asymmetrical measures. Patients had greater RLMTr (MidLine Tragus Ratio) asymmetry than volunteers, which was statistically significant. Discussion/Conclusion: Facial asymmetries are very common in patients seeking rhinoplasty, and special attention should be paid to these aspects both for surgical planning and for counseling of patients. PMID:25991972

  18. Rhinoplasty and facial asymmetry: Analysis of subjective and anthropometric factors in the Caucasian nose.

    PubMed

    Carvalho, Bettina; Ballin, Annelyse Christine; Becker, Renata Vecentin; Berger, Cezar Augusto Sarraff; Hurtado, Johann G G Melcherts; Mocellin, Marcos

    2012-10-01

    Anthropometric proportions and symmetry are considered determinants of beauty. These parameters have significant importance in facial plastic surgery, particularly in rhinoplasty. As the central organ of the face, the nose is especially important in determining facial symmetry, both through the perception of a crooked nose and through the determination of facial growth. The evaluation of the presence of facial asymmetry has great relevance preoperatively, both for surgical planning and counseling.  To evaluate and document the presence of facial asymmetry in patients undergoing rhinoplasty planning and to correlate the anthropometric measures with the perception of facial symmetry or asymmetry, assessing whether there is a higher prevalence of facial asymmetry in these patients compared with volunteers without nasal complaints.  This prospective study was performed by comparing photographs of patients undergoing rhinoplasty planning and volunteers (controls), n = 201, and by evaluating anthropometric measurements taken from a line passing through the center of the face to the tragus, medial canthus, lateral alar margin, and oral commissure on each side, using statistical analysis (Z test and odds ratio).  None of the patients or volunteers had completely symmetric values. Subjectively, 59% of patients were perceived as asymmetric, compared with 54% of volunteers. Objectively, more than 89% of respondents had asymmetrical measures. Patients had greater RLMTr (MidLine Tragus Ratio) asymmetry than volunteers, which was statistically significant.  Facial asymmetries are very common in patients seeking rhinoplasty, and special attention should be paid to these aspects both for surgical planning and for counseling of patients.

  19. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind

    PubMed Central

    Kitada, Ryo; Okamoto, Yuko; Sasaki, Akihiro T.; Kochiyama, Takanori; Miyahara, Motohide; Lederman, Susan J.; Sadato, Norihiro

    2012-01-01

    Face perception is critical for social communication. Given its fundamental importance in the course of evolution, the innate neural mechanisms can anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in the sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic-resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience. PMID:23372547

  20. Early visual experience and the recognition of basic facial expressions: involvement of the middle temporal and inferior frontal gyri during haptic identification by the early blind.

    PubMed

    Kitada, Ryo; Okamoto, Yuko; Sasaki, Akihiro T; Kochiyama, Takanori; Miyahara, Motohide; Lederman, Susan J; Sadato, Norihiro

    2013-01-01

    Face perception is critical for social communication. Given its fundamental importance in the course of evolution, the innate neural mechanisms can anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in the sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic-resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience.

  1. Colour influences perception of facial emotions but this effect is impaired in healthy ageing and schizophrenia.

    PubMed

    Silver, Henry; Bilker, Warren B

    2015-01-01

    Social cognition is commonly assessed by identification of emotions in facial expressions. Presence of colour, a salient feature of stimuli, might influence emotional face perception. We administered 2 tests of facial emotion recognition, the Emotion Recognition Test (ER40) using colour pictures and the Penn Emotional Acuity Test using monochromatic pictures, to 37 young healthy, 39 old healthy and 37 schizophrenic men. Among young healthy individuals recognition of emotions was more accurate and faster in colour than in monochromatic pictures. Compared to the younger group, older healthy individuals revealed impairment in identification of sad expressions in colour but not monochromatic pictures. Schizophrenia patients showed greater impairment in colour than monochromatic pictures of neutral and sad expressions and overall total score compared to both healthy groups. Patients showed significant correlations between cognitive impairment and perception of emotion in colour but not monochromatic pictures. Colour enhances perception of general emotional clues and this contextual effect is impaired in healthy ageing and schizophrenia. The effects of colour need to be considered in interpreting and comparing studies of emotion perception. Coloured face stimuli may be more sensitive to emotion processing impairments but less selective for emotion-specific information than monochromatic stimuli. This may impact on their utility in early detection of impairments and investigations of underlying mechanisms.

  2. Mutual information, perceptual independence, and holistic face perception.

    PubMed

    Fitousi, Daniel

    2013-07-01

    The concept of perceptual independence is ubiquitous in psychology. It addresses the question of whether two (or more) dimensions are perceived independently. Several authors have proposed perceptual independence (or its lack thereof) as a viable measure of holistic face perception (Loftus, Oberg, & Dillon, Psychological Review 111:835-863, 2004; Wenger & Ingvalson, Learning, Memory, and Cognition 28:872-892, 2002). According to this notion, the processing of facial features occurs in an interactive manner. Here, I examine this idea from the perspective of two theories of perceptual independence: the multivariate uncertainty analysis (MUA; Garner & Morton, Psychological Bulletin 72:233-259, 1969), and the general recognition theory (GRT; Ashby & Townsend, Psychological Review 93:154-179, 1986). The goals of the study were to (1) introduce the MUA, (2) examine various possible relations between MUA and GRT using numerical simulations, and (3) apply the MUA to two consensual markers of holistic face perception: recognition of facial features (Farah, Wilson, Drain, & Tanaka, Psychological Review 105:482-498, 1998) and the composite face effect (Young, Hellawell, & Hay, Perception 16:747-759, 1987). The results suggest that facial holism is generated by violations of several types of perceptual independence. They highlight the important theoretical role played by converging operations in the study of holistic face perception.

  3. Coding and quantification of a facial expression for pain in lambs.

    PubMed

    Guesgen, M J; Beausoleil, N J; Leach, M; Minot, E O; Stewart, M; Stafford, K J

    2016-11-01

    Facial expressions are routinely used to assess pain in humans, particularly those who are non-verbal. Recently, there has been an interest in developing coding systems for facial grimacing in non-human animals, such as rodents, rabbits, horses and sheep. The aims of this preliminary study were to: 1. Qualitatively identify facial feature changes in lambs experiencing pain as a result of tail-docking and compile these changes to create a Lamb Grimace Scale (LGS); 2. Determine whether human observers can use the LGS to differentiate tail-docked lambs from control lambs and differentiate lambs before and after docking; 3. Determine whether changes in facial action units of the LGS can be objectively quantified in lambs before and after docking; 4. Evaluate effects of restraint of lambs on observers' perceptions of pain using the LGS and on quantitative measures of facial action units. By comparing images of lambs before (no pain) and after (pain) tail-docking, the LGS was devised in consultation with scientists experienced in assessing facial expression in other species. The LGS consists of five facial action units: Orbital Tightening, Mouth Features, Nose Features, Cheek Flattening and Ear Posture. The aims of the study were addressed in two experiments. In Experiment I, still images of the faces of restrained lambs were taken from video footage before and after tail-docking (n=4) or sham tail-docking (n=3). These images were scored by a group of five naïve human observers using the LGS. Because lambs were restrained for the duration of the experiment, Ear Posture was not scored. The scores for the images were averaged to provide one value per feature per period and then scores for the four LGS action units were averaged to give one LGS score per lamb per period. In Experiment II, still images of the faces of nine lambs were taken before and after tail-docking. Stills were taken when lambs were restrained and unrestrained in each period. A different group of five human observers scored the images from Experiment II. Changes in facial action units were also quantified objectively by a researcher using image measurement software. In both experiments LGS scores were analyzed using a linear mixed model to evaluate the effects of tail docking on observers' perception of facial expression changes. Kendall's Index of Concordance was used to measure reliability among observers. In Experiment I, human observers were able to use the LGS to differentiate docked lambs from control lambs. LGS scores significantly increased from before to after treatment in docked lambs but not control lambs. In Experiment II there was a significant increase in LGS scores after docking. This was coupled with changes in other validated indicators of pain after docking in the form of pain-related behaviour. Only two components, Mouth Features and Orbital Tightening, showed significant quantitative changes after docking. The direction of these changes agrees with the description of these facial action units in the LGS. Restraint affected people's perceptions of pain as well as quantitative measures of LGS components. Freely moving lambs were scored lower using the LGS over both periods and had a significantly smaller eye aperture and smaller nose and ear angles than when they were held. Agreement among observers for LGS scores was fair overall (Experiment I: W=0.60; Experiment II: W=0.66). This preliminary study demonstrates changes in lamb facial expression associated with pain.
The results of these experiments should be interpreted with caution due to low lamb numbers. Copyright © 2016 Elsevier B.V. All rights reserved.
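
    The abstract reports inter-observer agreement as Kendall's Index (coefficient) of Concordance, W. Below is a minimal sketch of that computation for a panel of observers scoring the same set of images; the scores are hypothetical stand-ins, not the study's data, and no tie correction is applied.

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def kendalls_w(ratings):
        """Kendall's coefficient of concordance for an observers x items array."""
        ratings = np.asarray(ratings, dtype=float)
        m, n = ratings.shape                                 # m observers, n items
        ranks = np.apply_along_axis(rankdata, 1, ratings)    # rank items within each observer
        rank_sums = ranks.sum(axis=0)                        # summed rank per item
        s = ((rank_sums - rank_sums.mean()) ** 2).sum()      # spread of the rank sums
        return 12 * s / (m ** 2 * (n ** 3 - n))

    # Hypothetical LGS scores from 5 observers on 7 lamb images.
    scores = [[1.0, 1.5, 0.5, 2.0, 1.0, 0.5, 1.5],
              [1.5, 1.5, 0.5, 2.5, 1.0, 1.0, 2.0],
              [1.0, 2.0, 1.0, 2.0, 0.5, 0.5, 1.5],
              [0.5, 1.5, 0.5, 2.5, 1.0, 0.5, 2.0],
              [1.0, 1.0, 0.5, 2.0, 1.5, 0.5, 1.5]]
    print(f"Kendall's W = {kendalls_w(scores):.2f}")  # values near 1 mean strong agreement
    ```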

  4. Two Ways to Facial Expression Recognition? Motor and Visual Information Have Different Effects on Facial Expression Recognition.

    PubMed

    de la Rosa, Stephan; Fademrecht, Laura; Bülthoff, Heinrich H; Giese, Martin A; Curio, Cristóbal

    2018-06-01

    Motor-based theories of facial expression recognition propose that the visual perception of facial expression is aided by sensorimotor processes that are also used for the production of the same expression. Accordingly, sensorimotor and visual processes should provide congruent emotional information about a facial expression. Here, we report evidence that challenges this view. Specifically, the repeated execution of facial expressions has the opposite effect on the recognition of a subsequent facial expression from that of the repeated viewing of facial expressions. Moreover, the findings of the motor condition, but not of the visual condition, were correlated with a nonsensory condition in which participants imagined an emotional situation. These results can be well accounted for by the idea that facial expressions are not always recognized via motor processes but can also be recognized from visual information alone.

  5. I thought that I heard you laughing: Contextual facial expressions modulate the perception of authentic laughter and crying.

    PubMed

    Lavan, Nadine; Lima, César F; Harvey, Hannah; Scott, Sophie K; McGettigan, Carolyn

    2015-01-01

    It is well established that categorising the emotional content of facial expressions may differ depending on contextual information. Whether this malleability is observed in the auditory domain and in genuine emotion expressions is poorly explored. We examined the perception of authentic laughter and crying in the context of happy, neutral and sad facial expressions. Participants rated the vocalisations on separate unipolar scales of happiness and sadness and on arousal. Although they were instructed to focus exclusively on the vocalisations, consistent context effects were found: For both laughter and crying, emotion judgements were shifted towards the information expressed by the face. These modulations were independent of response latencies and were larger for more emotionally ambiguous vocalisations. No effects of context were found for arousal ratings. These findings suggest that the automatic encoding of contextual information during emotion perception generalises across modalities, to purely non-verbal vocalisations, and is not confined to acted expressions.

  6. Facial biases on vocal perception and memory.

    PubMed

    Boltz, Marilyn G

    2017-06-01

    Does a speaker's face influence the way their voice is heard and later remembered? This question was addressed in two experiments in which participants listened to middle-aged voices accompanied by faces that were either age-appropriate, younger, or older than the voice or, as a control, by no face at all. In Experiment 1, participants evaluated each voice on various acoustical dimensions and speaker characteristics. The results showed that facial displays influenced perception such that the same voice was heard differently depending on the age of the accompanying face. Experiment 2 further revealed that facial displays led to memory distortions that were age-congruent in nature. These findings illustrate that faces can activate certain social categories and preconceived stereotypes that then influence vocal and person perception in a corresponding fashion. Processes of face/voice integration are very similar to those of music/film, indicating that the two areas can mutually inform one another and perhaps, more generally, reflect a centralized mechanism of cross-sensory integration. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Trustworthy-Looking Face Meets Brown Eyes

    PubMed Central

    Kleisner, Karel; Priplatova, Lenka; Frost, Peter; Flegr, Jaroslav

    2013-01-01

    We tested whether eye color influences perception of trustworthiness. Facial photographs of 40 female and 40 male students were rated for perceived trustworthiness. Eye color had a significant effect, the brown-eyed faces being perceived as more trustworthy than the blue-eyed ones. Geometric morphometrics, however, revealed significant correlations between eye color and face shape. Thus, face shape likewise had a significant effect on perceived trustworthiness but only for male faces, the effect for female faces not being significant. To determine whether perception of trustworthiness was being influenced primarily by eye color or by face shape, we recolored the eyes on the same male facial photos and repeated the test procedure. Eye color now had no effect on perceived trustworthiness. We concluded that although the brown-eyed faces were perceived as more trustworthy than the blue-eyed ones, it was not brown eye color per se that caused the stronger perception of trustworthiness but rather the facial features associated with brown eyes. PMID:23326406

  8. Emotions in "Black and White" or Shades of Gray? How We Think About Emotion Shapes Our Perception and Neural Representation of Emotion.

    PubMed

    Satpute, Ajay B; Nook, Erik C; Narayanan, Sandhya; Shu, Jocelyn; Weber, Jochen; Ochsner, Kevin N

    2016-11-01

    The demands of social life often require categorically judging whether someone's continuously varying facial movements express "calm" or "fear," or whether one's fluctuating internal states mean one feels "good" or "bad." In two studies, we asked whether this kind of categorical, "black and white," thinking can shape the perception and neural representation of emotion. Using psychometric and neuroimaging methods, we found that (a) across participants, judging emotions using a categorical, "black and white" scale relative to judging emotions using a continuous, "shades of gray," scale shifted subjective emotion perception thresholds; (b) these shifts corresponded with activity in brain regions previously associated with affective responding (i.e., the amygdala and ventral anterior insula); and (c) connectivity of these regions with the medial prefrontal cortex correlated with the magnitude of categorization-related shifts. These findings suggest that categorical thinking about emotions may actively shape the perception and neural representation of the emotions in question. © The Author(s) 2016.

  9. Emotions in ‘black or white’ or shades of gray? How we think about emotion shapes our perception and neural representation of emotion

    PubMed Central

    Satpute, Ajay B.; Nook, Erik C.; Narayanan, Sandhya; Shu, Jocelyn; Weber, Jochen; Ochsner, Kevin N.

    2016-01-01

    The demands of social life often require categorically judging whether someone's continuously varying facial movements express “calm” or “fear”, or whether our fluctuating internal states mean we feel “good” or “bad”. In two neuroimaging studies, we ask whether this kind of categorical, ‘black and white’, thinking can shape the perception and neural representation of emotion. Using psychometric and neuroimaging methods, we found that (1) across participants, judging emotions using a ‘black and white’ scale vs. a ‘shades of gray’ scale shifted subjective emotion perception thresholds, (2) these shifts corresponded with activity in regions associated with affective responding including the amygdala and ventral anterior insula, and (3) connectivity of these regions with the medial prefrontal cortex correlated with the magnitude of categorization-related shifts. These findings suggest that categorical thinking about emotion may actively shape the perception and neural representation of the emotions in question. PMID:27670663

  10. Categorical perception of facial expressions in individuals with non-clinical social anxiety.

    PubMed

    Qiu, Fanghui; Han, Mingxiu; Zhai, Yu; Jia, Shiwei

    2018-03-01

    According to the well-established categorical perception (CP) of facial expressions, we decode complicated expression signals into simplified categories to facilitate expression processing. Expression processing deficits have been widely described in social anxiety (SA), but it remains to be investigated whether CP of expressions is affected by SA. The present study examined whether individuals with SA had an interpretation bias when processing ambiguous expressions and whether the sensitivity of their CP was affected by their SA. Sixty-four participants (high SA, 30; low SA, 34) were selected from 658 undergraduates using the Interaction Anxiousness Scale (IAS). With the CP paradigm, specifically with the analysis method of the logistic function model, we derived the categorical boundaries (reflecting interpretation bias) and slopes (reflecting sensitivity of CP) of both high- and low-SA groups while recognizing angry-fearful, happy-angry, and happy-fearful expression continua. Based on a comparison of the categorical boundaries and slopes between the high- and low-SA groups, the results showed that the categorical boundaries between the two groups were not different for any of the three continua, which means that SA does not affect interpretation bias for any of the three continua. The slopes for the high-SA group were flatter than those for the low-SA group for both the angry-fearful and happy-angry continua, indicating that the high-SA group is insensitive to the subtle changes that occur from angry to fearful faces and from happy to angry faces. Since participants were selected from a sample of undergraduates based on their IAS scores, the results cannot be directly generalized to individuals with clinical SA disorder. The study indicates that SA does not affect interpretation biases in the processing of anger, fear, and happiness, but does modulate the sensitivity of individuals' CP when anger appears. High-SA individuals perceive angry expressions in a less categorical manner than the low-SA group, but no such difference was found in the perception of happy or fearful expressions. Copyright © 2017 Elsevier Ltd. All rights reserved.
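
    The logistic-function analysis referred to above can be sketched as follows: identification proportions along an expression continuum are fitted with a logistic curve, and the categorical boundary (the 50% point) and slope are read off the fitted parameters. The morph levels and response proportions below are hypothetical, not data from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, boundary, slope):
        """Probability of choosing category B at morph level x."""
        return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

    # Hypothetical data: morph level (0 = fully angry, 1 = fully fearful)
    # and the proportion of "fearful" responses observed at each level.
    morph_level = np.linspace(0.0, 1.0, 9)
    p_fearful = np.array([0.02, 0.05, 0.10, 0.25, 0.55, 0.80, 0.92, 0.97, 0.99])

    (boundary, slope), _ = curve_fit(logistic, morph_level, p_fearful, p0=[0.5, 10.0])
    print(f"categorical boundary = {boundary:.2f}, slope = {slope:.1f}")
    # A flatter (smaller) slope would indicate less sharply bounded,
    # less categorical perception of the continuum.
    ```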

  11. How does context affect assessments of facial emotion? The role of culture and age.

    PubMed

    Ko, Seon-Gyu; Lee, Tae-Ho; Yoon, Hyea-Young; Kwon, Jung-Hye; Mather, Mara

    2011-03-01

    People from Asian cultures are more influenced by context in their visual processing than people from Western cultures. In this study, we examined how these cultural differences in context processing affect how people interpret facial emotions. We found that younger Koreans were more influenced than younger Americans by emotional background pictures when rating the emotion of a central face, especially those younger Koreans with low self-rated stress. In contrast, among older adults, neither Koreans nor Americans showed significant influences of context in their face emotion ratings. These findings suggest that cultural differences in reliance on context to interpret others' emotions depend on perceptual integration processes that decline with age, leading to fewer cultural differences in perception among older adults than among younger adults. Furthermore, when asked to recall the background pictures, younger participants recalled more negative pictures than positive pictures, whereas older participants recalled similar numbers of positive and negative pictures. These age differences in the valence of memory were consistent across culture. (c) 2011 APA, all rights reserved.

  12. Facial profile esthetic preferences: perception in two Brazilian states

    PubMed Central

    de Oliveira, Marina Detoni Vieira; da Silveira, Bruno Lopes; Mattos, Cláudia Trindade; Marquezan, Mariana

    2015-01-01

    OBJECTIVE: The aim of this study was to assess the regional influence on the perception of facial profile esthetics in Rio de Janeiro state (RJ) and Rio Grande do Sul state (RS), Brazil. METHODS: Two Caucasian models, a man and a woman, with balanced facial profiles, had their photographs digitally manipulated so as to produce seven different profiles. First year dental students (laypeople) assessed the images and classified them according to their esthetic preference. RESULTS: The result of the t test for independent samples showed differences among states for certain facial profiles. The female photograph identified with the letter 'G' (mandibular retrusion) received higher scores in RS state (p = 0.006). No differences were found for male photographs (p > 0.007). The evaluators' sex seemed not to influence their esthetic perception (p > 0.007). Considering all evaluators together, ANOVA/Tukey's test showed differences among the profiles (p ≤ 0.05) for both male and female photographs. The female photograph that received the highest score was the one identified with the letter 'F' (dentoalveolar bimaxillary retrusion/ straight profile). For the male profiles, photograph identified with the letter 'E' (dentoalveolar bimaxillary protrusion/ straight profile) received the best score. CONCLUSION: Regional differences were observed regarding preferences of facial profile esthetics. In Rio de Janeiro state, more prominent lips were preferred while in Rio Grande do Sul state, profiles with straight lips were favored. Class III profiles were considered less attractive. PMID:26154461

  13. Social perception of morbidity in facial nerve paralysis.

    PubMed

    Li, Matthew Ka Ki; Niles, Navin; Gore, Sinclair; Ebrahimi, Ardalan; McGuinness, John; Clark, Jonathan Robert

    2016-08-01

    There are many patient-based and clinician-based scales measuring the severity of facial nerve paralysis and the impact on quality of life; however, the social perception of facial palsy has received little attention. The purpose of this pilot study was to measure the consequences of facial paralysis on selected domains of social perception and compare the social impact of paralysis of the different components. Four patients with typical facial palsies (global, marginal mandibular, zygomatic/buccal, and frontal) and 1 control were photographed. These images were each shown to 100 participants who subsequently rated variables of normality, perceived distress, trustworthiness, intelligence, interaction, symmetry, and disability. Statistical analysis was performed to compare the results among each palsy. Paralyzed faces were considered less normal compared to the control on a scale of 0 to 10 (mean, 8.6; 95% confidence interval [CI] = 8.30-8.86) with global paralysis (mean, 3.4; 95% CI = 3.08-3.80) rated as the most disfiguring, followed by the zygomatic/buccal (mean, 6.0; 95% CI = 5.68-6.37), marginal (mean, 6.5; 95% CI = 6.08-6.86), and then temporal palsies (mean, 6.9; 95% CI = 6.57-7.21). Similar trends were seen when analyzing these palsies for perceived distress, intelligence, and trustworthiness, using a random effects regression model. Our sample suggests that society views paralyzed faces as less normal, less trustworthy, and more distressed. Different components of facial paralysis are worse than others and surgical correction may need to be prioritized in an evidence-based manner with social morbidity in mind. © 2016 Wiley Periodicals, Inc. Head Neck 38:1158-1163, 2016.
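
    The group means and 95% confidence intervals reported above can be computed along the following lines; this is a minimal sketch using a normal approximation, and the ratings are randomly generated stand-ins rather than the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical 0-10 normality ratings of one face from 100 participants.
    ratings = np.clip(rng.normal(loc=3.4, scale=1.8, size=100), 0, 10)

    mean = ratings.mean()
    sem = ratings.std(ddof=1) / np.sqrt(ratings.size)        # standard error of the mean
    ci_low, ci_high = mean - 1.96 * sem, mean + 1.96 * sem   # normal-approximation 95% CI
    print(f"mean = {mean:.1f}, 95% CI = {ci_low:.2f}-{ci_high:.2f}")
    ```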

  14. Stability of Facial Affective Expressions in Schizophrenia

    PubMed Central

    Fatouros-Bergman, H.; Spang, J.; Merten, J.; Preisler, G.; Werbart, A.

    2012-01-01

    Thirty-two videorecorded interviews were conducted by two interviewers with eight patients diagnosed with schizophrenia. Each patient was interviewed four times: three weekly interviews by the first interviewer and one additional interview by the second interviewer. Sixty-four selected sequences in which the patients were speaking about psychotic experiences were scored for facial affective behaviour with the Emotion Facial Action Coding System (EMFACS). In accordance with previous research, the results show that patients diagnosed with schizophrenia express negative facial affectivity. Facial affective behaviour seems not to be dependent on temporality, since within-subjects ANOVA revealed no substantial changes in the amount of affects displayed across the weekly interview occasions. Whereas previous findings found contempt to be the most frequent affect in patients, in the present material disgust was as common, but depended on the interviewer. The results suggest that facial affectivity in these patients is dominated primarily by the negative emotions of disgust and, to a lesser extent, contempt, and that this seems to be a fairly stable feature. PMID:22966449

  15. Facial Redness Increases Men's Perceived Healthiness and Attractiveness.

    PubMed

    Thorstenson, Christopher A; Pazda, Adam D; Elliot, Andrew J; Perrett, David I

    2017-06-01

    Past research has shown that peripheral and facial redness influences perceptions of attractiveness for men viewing women. The current research investigated whether a parallel effect is present when women rate men with varying facial redness. In four experiments, women judged the attractiveness of men's faces, which were presented with varying degrees of redness. We also examined perceived healthiness and other candidate variables as mediators of the red-attractiveness effect. The results show that facial redness positively influences ratings of men's attractiveness. Additionally, perceived healthiness was documented as a mediator of this effect, independent of other potential mediator variables. The current research emphasizes facial coloration as an important feature of social judgments.

  16. Long-Term Exposure to American and European Movies and Television Series Facilitates Caucasian Face Perception in Young Chinese Watchers.

    PubMed

    Wang, Yamin; Zhou, Lu

    2016-10-01

    Most young Chinese people now learn about Caucasian individuals via media, especially American and European movies and television series (AEMT). The current study aimed to explore whether long-term exposure to AEMT facilitates Caucasian face perception in young Chinese watchers. Before the experiment, we created Chinese, Caucasian, and generic average faces (generic average face was created from both Chinese and Caucasian faces) and tested participants' ability to identify them. In the experiment, we asked AEMT watchers and Chinese movie and television series (CMT) watchers to complete a facial norm detection task. This task was developed recently to detect norms used in facial perception. The results indicated that AEMT watchers coded Caucasian faces relative to a Caucasian face norm better than they did to a generic face norm, whereas no such difference was found among CMT watchers. All watchers coded Chinese faces by referencing a Chinese norm better than they did relative to a generic norm. The results suggested that long-term exposure to AEMT has the same effect as daily other-race face contact in shaping facial perception. © The Author(s) 2016.

  17. When does Subliminal Affective Image Priming Influence the Ability of Schizophrenic Patients to Perceive Face Emotions?

    PubMed Central

    Vaina, Lucia M.; Rana, Kunjan D.; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A.; Podea, Delia

    2014-01-01

    Background: Deficits in face emotion perception are among the most pervasive aspects of schizophrenia impairments, which strongly affect interpersonal communication and social skills. Material/Methods: Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed 2 psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of face-expression and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome (PANSS), Young Mania Rating (YMRS), Hamilton Depression (HAM-D), and Hamilton Anxiety (HAM-A). Results: On the SAFFIMAP test, different from the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry, and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ’s tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test-faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Conclusions: Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ is directly affected by their positive or negative symptoms of the disease as defined by their individual scores on the clinical diagnostic scales. PMID:25537115

  18. When does subliminal affective image priming influence the ability of schizophrenic patients to perceive face emotions?

    PubMed

    Vaina, Lucia Maria; Rana, Kunjan D; Cotos, Ionela; Li-Yang, Chen; Huang, Melissa A; Podea, Delia

    2014-12-24

    Deficits in face emotion perception are among the most pervasive aspects of schizophrenia impairments, which strongly affect interpersonal communication and social skills. Schizophrenic patients (PSZ) and healthy control subjects (HCS) performed 2 psychophysical tasks. One, the SAFFIMAP test, was designed to determine the impact of subliminally presented affective or neutral images on the accuracy of face-expression (angry or neutral) perception. In the second test, FEP, subjects saw pictures of face-expression and were asked to rate them as angry, happy, or neutral. The following clinical scales were used to determine the acute symptoms in PSZ: Positive and Negative Syndrome (PANSS), Young Mania Rating (YMRS), Hamilton Depression (HAM-D), and Hamilton Anxiety (HAM-A). On the SAFFIMAP test, different from the HCS group, the PSZ group tended to categorize the neutral expression of test faces as angry, and their response to the test-face expression was not influenced by the affective content of the primes. In PSZ, the PANSS-positive score was significantly correlated with correct perception of angry faces for aggressive or pleasant primes. YMRS scores were strongly correlated with PSZ's tendency to recognize angry face expressions when the prime was a pleasant or a neutral image. The HAM-D score was positively correlated with categorizing the test-faces as neutral, regardless of the affective content of the prime or of the test-face expression (angry or neutral). Despite its exploratory nature, this study provides the first evidence that conscious perception and categorization of facial emotions (neutral or angry) in PSZ is directly affected by their positive or negative symptoms of the disease as defined by their individual scores on the clinical diagnostic scales.

  19. Student Inferences Based on Facial Appearance

    ERIC Educational Resources Information Center

    Mendez, Jeanette Morehouse; Mendez, Jesse Perez

    2016-01-01

    This study extends the scope of research that examines the connection between physical attractiveness and student perception through a survey analysis. While other studies concentrate on physical attractiveness alone, we examined not only perceptions of attractiveness but its impact on students' perception of knowledge, approachability and faculty…

  20. Identifying and detecting facial expressions of emotion in peripheral vision.

    PubMed

    Smith, Fraser W; Rossit, Stephanie

    2018-01-01

    Facial expressions of emotion are signals of high biological value. Whilst recognition of facial expressions has been much studied in central vision, the ability to perceive these signals in peripheral vision has only seen limited research to date, despite the potential adaptive advantages of such perception. In the present experiment, we investigate facial expression recognition and detection performance for each of the basic emotions (plus neutral) at up to 30 degrees of eccentricity. We demonstrate, as expected, a decrease in recognition and detection performance with increasing eccentricity, with happiness and surprised being the best recognized expressions in peripheral vision. In detection however, while happiness and surprised are still well detected, fear is also a well detected expression. We show that fear is a better detected than recognized expression. Our results demonstrate that task constraints shape the perception of expression in peripheral vision and provide novel evidence that detection and recognition rely on partially separate underlying mechanisms, with the latter more dependent on the higher spatial frequency content of the face stimulus.

  1. Identifying and detecting facial expressions of emotion in peripheral vision

    PubMed Central

    Smith, Fraser W.; Rossit, Stephanie

    2018-01-01

    Facial expressions of emotion are signals of high biological value. Whilst recognition of facial expressions has been much studied in central vision, the ability to perceive these signals in peripheral vision has only seen limited research to date, despite the potential adaptive advantages of such perception. In the present experiment, we investigate facial expression recognition and detection performance for each of the basic emotions (plus neutral) at up to 30 degrees of eccentricity. We demonstrate, as expected, a decrease in recognition and detection performance with increasing eccentricity, with happiness and surprised being the best recognized expressions in peripheral vision. In detection however, while happiness and surprised are still well detected, fear is also a well detected expression. We show that fear is a better detected than recognized expression. Our results demonstrate that task constraints shape the perception of expression in peripheral vision and provide novel evidence that detection and recognition rely on partially separate underlying mechanisms, with the latter more dependent on the higher spatial frequency content of the face stimulus. PMID:29847562

  2. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity – Evidence from Gazing Patterns

    PubMed Central

    Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V.; Hänninen, Laura; Krause, Christina M.; Vainio, Outi

    2016-01-01

    Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to attentional bias, which was dependent on the depicted species: threatening conspecifics’ faces evoked heightened attention but threatening human faces instead an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates. PMID:26761433

  3. What's in a face? The role of skin tone, facial physiognomy, and color presentation mode of facial primes in affective priming effects.

    PubMed

    Stepanova, Elena V; Strube, Michael J

    2012-01-01

    Participants (N = 106) performed an affective priming task with facial primes that varied in their skin tone and facial physiognomy and were presented either in color or in gray-scale. Participants' racial evaluations were more positive for Eurocentric than for Afrocentric physiognomy faces. Light skin tone faces were evaluated more positively than dark skin tone faces, but the magnitude of this effect depended on the mode of color presentation. The results suggest that in affective priming tasks, faces might not be processed holistically, and instead, visual features of facial priming stimuli independently affect implicit evaluations.

  4. Facial Affect Recognition in Violent and Nonviolent Antisocial Behavior Subtypes.

    PubMed

    Schönenberg, Michael; Mayer, Sarah Verena; Christian, Sandra; Louis, Katharina; Jusyte, Aiste

    2016-10-01

    Prior studies provide evidence for impaired recognition of distress cues in individuals exhibiting antisocial behavior. However, it remains unclear whether this deficit is generally associated with antisociality or may be specific to violent behavior only. To examine whether there are meaningful differences between the two behavioral dimensions rule-breaking and aggression, violent and nonviolent incarcerated offenders as well as control participants were presented with an animated face recognition task in which a video sequence of a neutral face changed into an expression of one of the six basic emotions. The participants were instructed to press a button as soon as they were able to identify the emotional expression, allowing for an assessment of the perceived emotion onset. Both aggressive and nonaggressive offenders demonstrated a delayed perception of primarily fearful facial cues as compared to controls. These results suggest the importance of targeting impaired emotional processing in both types of antisocial behavior.

  5. The Effect of Observers’ Mood on the Local Processing of Emotional Faces: Evidence from Short-Lived and Prolonged Mood States

    PubMed Central

    Mokhtari, Setareh; Buttle, Heather

    2015-01-01

    We examined the effect of induced mood, varying in valence and longevity, on local processing of emotional faces. It was found that negative facial expression conveyed by the global level of the face interferes with efficient processing of the local features. The results also showed that the duration of involvement with a mood influenced the local processing. We observed that attending to the local level of faces is not different in short-lived happy and sad mood states. However, as the mood state is experienced for a longer period, local processing was impaired in happy mood compared to sad mood. Taken together, we concluded that both facial expressions and affective states influence processing of the local parts of faces. Moreover, we suggest that mediating factors like the duration of involvement with the mood play a role in the interrelation between mood, attention, and perception. PMID:25883696

  6. Multiple mechanisms in the perception of face gender: Effect of sex-irrelevant features.

    PubMed

    Komori, Masashi; Kawamura, Satoru; Ishihara, Shigekazu

    2011-06-01

    Effects of sex-relevant and sex-irrelevant facial features on the evaluation of facial gender were investigated. Participants rated masculinity of 48 male facial photographs and femininity of 48 female facial photographs. Eighty feature points were measured on each of the facial photographs. Using a generalized Procrustes analysis, facial shapes were converted into multidimensional vectors, with the average face as a starting point. Each vector was decomposed into a sex-relevant subvector and a sex-irrelevant subvector which were, respectively, parallel and orthogonal to the main male-female axis. Principal components analysis (PCA) was performed on the sex-irrelevant subvectors. One principal component was negatively correlated with both perceived masculinity and femininity, and another was correlated only with femininity, though both components were orthogonal to the male-female dimension (and thus by definition sex-irrelevant). These results indicate that evaluation of facial gender depends on sex-irrelevant as well as sex-relevant facial features.
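
    A minimal sketch of the decomposition described above: a face-shape vector, expressed as a deviation from the average face, is split into a sex-relevant subvector parallel to the male-female axis and a sex-irrelevant subvector orthogonal to it. The vectors here are random stand-ins rather than Procrustes-aligned landmark data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-ins: 80 landmarks x 2 coordinates, flattened, relative to the average face.
    male_female_axis = rng.normal(size=160)   # direction from the female mean toward the male mean
    face = rng.normal(size=160)               # one face's deviation from the average face

    unit_axis = male_female_axis / np.linalg.norm(male_female_axis)

    sex_relevant = (face @ unit_axis) * unit_axis   # projection onto the male-female axis
    sex_irrelevant = face - sex_relevant            # residual, orthogonal to that axis

    assert np.isclose(sex_irrelevant @ unit_axis, 0.0)   # orthogonality check
    # A PCA across participants' sex_irrelevant vectors would then give the
    # sex-irrelevant components examined in the study.
    ```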

  7. A greater decline in female facial attractiveness during middle age reflects women’s loss of reproductive value

    PubMed Central

    Maestripieri, Dario; Klimczuk, Amanda C. E.; Traficonte, Daniel M.; Wilson, M. Claire

    2014-01-01

    Facial attractiveness represents an important component of an individual’s overall attractiveness as a potential mating partner. Perceptions of facial attractiveness are expected to vary with age-related changes in health, reproductive value, and power. In this study, we investigated perceptions of facial attractiveness, power, and personality in two groups of women of pre- and post-menopausal ages (35–50 years and 51–65 years, respectively) and two corresponding groups of men. We tested three hypotheses: (1) that perceived facial attractiveness would be lower for older than for younger men and women; (2) that the age-related reduction in facial attractiveness would be greater for women than for men; and (3) that for men, there would be a larger increase in perceived power at older ages. Eighty facial stimuli were rated by 60 (30 male, 30 female) middle-aged women and men using online surveys. Our three main hypotheses were supported by the data. Consistent with sex differences in mating strategies, the greater age-related decline in female facial attractiveness was driven by male respondents, while the greater age-related increase in male perceived power was driven by female respondents. In addition, we found evidence that some personality ratings were correlated with perceived attractiveness and power ratings. The results of this study are consistent with evolutionary theory and with previous research showing that faces can provide important information about characteristics that men and women value in a potential mating partner such as their health, reproductive value, and power or possession of resources. PMID:24592253

  8. A method of assessing facial profile attractiveness and its application in comparing the aesthetic preferences of two samples of South Africans.

    PubMed

    Morar, Ajay; Stein, Errol

    2011-06-01

    Numerous studies have evaluated the perception of facial attractiveness. However, many of the instruments previously used have limitations. This study introduces an improved tool and describes its application in the assessment of the preferred facial profile in two sample groups. Cross-sectional study. Two sites were involved: a rural healthcare facility (Winterveldt, Northwest Province) and the campus of the University of the Witwatersrand (Johannesburg, Gauteng Province). Adult females and males selected from amongst first, attendees at the healthcare facility, and second, staff of the University of the Witwatersrand. Eight androgynous lateral facial profile images were created using a morphing software programme representing six transitions between two anchoring extremes in terms of lip retrusion/protrusion vs protrusion/retrusion. These images were presented to, and rated by, two mixed male/female groups of rural and of urban habitat using a pre-piloted form. Statistical analysis of the responses obtained established the preferred facial profile by gender in each group. The perception of facial attractiveness varied marginally between rural and urban black South Africans. There was no statistically significant difference between females and males in the rural group (P=0·2353) and those in the urban sample (P=0·1318) with respect to their choice of ideal facial profile. Females and males in both the rural and urban groups found extreme profile convexity unappealing. By contrast, a larger proportion of rural females, rural males and urban females demonstrated a preference for extreme profile concavity. The research tool described is a useful instrument in the assessment of facial profile attractiveness.

  9. A greater decline in female facial attractiveness during middle age reflects women's loss of reproductive value.

    PubMed

    Maestripieri, Dario; Klimczuk, Amanda C E; Traficonte, Daniel M; Wilson, M Claire

    2014-01-01

    Facial attractiveness represents an important component of an individual's overall attractiveness as a potential mating partner. Perceptions of facial attractiveness are expected to vary with age-related changes in health, reproductive value, and power. In this study, we investigated perceptions of facial attractiveness, power, and personality in two groups of women of pre- and post-menopausal ages (35-50 years and 51-65 years, respectively) and two corresponding groups of men. We tested three hypotheses: (1) that perceived facial attractiveness would be lower for older than for younger men and women; (2) that the age-related reduction in facial attractiveness would be greater for women than for men; and (3) that for men, there would be a larger increase in perceived power at older ages. Eighty facial stimuli were rated by 60 (30 male, 30 female) middle-aged women and men using online surveys. Our three main hypotheses were supported by the data. Consistent with sex differences in mating strategies, the greater age-related decline in female facial attractiveness was driven by male respondents, while the greater age-related increase in male perceived power was driven by female respondents. In addition, we found evidence that some personality ratings were correlated with perceived attractiveness and power ratings. The results of this study are consistent with evolutionary theory and with previous research showing that faces can provide important information about characteristics that men and women value in a potential mating partner such as their health, reproductive value, and power or possession of resources.

  10. Face Context Influences Local Part Processing: An ERP Study.

    PubMed

    Zhang, Hong; Sun, Yaoru; Zhao, Lun

    2017-09-01

    Perception of face parts on the basis of features is thought to be different from perception of whole faces, which is more based on configural information. Face context is also suggested to play an important role in face processing. To investigate how face context influences the early-stage perception of facial local parts, we used an oddball paradigm that tested perceptual stages of face processing rather than recognition. We recorded the event-related potentials (ERPs) elicited by whole faces and face parts presented in four conditions (upright-normal, upright-thatcherised, inverted-normal and inverted-thatcherised), as well as the ERPs elicited by non-face objects (whole houses and house parts) with corresponding conditions. The results showed that face context significantly affected the N170 with increased amplitudes and earlier peak latency for upright normal faces. Removing face context delayed the P1 latency but did not affect the P1 amplitude prominently for both upright and inverted normal faces. Across all conditions, neither the N170 nor the P1 was modulated by house context. The significant changes on the N170 and P1 components revealed that face context influences local part processing at the early stage of face processing and this context effect might be specific for face perception. We further suggested that perceptions of whole faces and face parts are functionally distinguished.

  11. Relationship between individual differences in functional connectivity and facial-emotion recognition abilities in adults with traumatic brain injury.

    PubMed

    Rigon, A; Voss, M W; Turkstra, L S; Mutlu, B; Duff, M C

    2017-01-01

    Although several studies have demonstrated that facial-affect recognition impairment is common following moderate-severe traumatic brain injury (TBI), and that there are diffuse alterations in large-scale functional brain networks in TBI populations, little is known about the relationship between the two. Here, in a sample of 26 participants with TBI and 20 healthy comparison participants (HC) we measured facial-affect recognition abilities and resting-state functional connectivity (rs-FC) using fMRI. We then used network-based statistics to examine (A) the presence of rs-FC differences between individuals with TBI and HC within the facial-affect processing network, and (B) the association between inter-individual differences in emotion recognition skills and rs-FC within the facial-affect processing network. We found that participants with TBI showed significantly lower rs-FC in a component comprising homotopic and within-hemisphere, anterior-posterior connections within the facial-affect processing network. In addition, within the TBI group, participants with higher emotion-labeling skills showed stronger rs-FC within a network comprised of intra- and inter-hemispheric bilateral connections. Findings indicate that the ability to successfully recognize facial-affect after TBI is related to rs-FC within components of facial-affective networks, and provide new evidence that further our understanding of the mechanisms underlying emotion recognition impairment in TBI.

  12. Emerging perceptions of facial plastic surgery among medical students.

    PubMed

    Rosenthal, E; Clark, J M; Wax, M K; Cook, T A

    2001-11-01

    The purpose of this study was to examine the perceptions of medical students regarding facial aesthetic surgery and those specialists most likely to perform aesthetic or reconstructive facial surgery. A survey was designed based on a review of the literature to assess the desirable characteristics and the perceived role of the facial plastic and reconstructive surgeon (FPRS). The surveys were distributed to 2 populations: medical students from 4 medical schools and members of the general public. A total of 339 surveys were collected, 217 from medical students and 122 from the general public. Medical students and the public had similar responses. The results demonstrated that respondents preferred a male plastic surgeon from the ages of 41 to 50 years old and would look to their family doctor for a recommendation. Facial aesthetic and reconstructive surgery was considered the domain of maxillofacial and general plastic surgeons, not the FPRS. Integration of the FPRS into the medical school curriculum may help to improve the perceived role of the specialty within the medical community. It is important for the specialty to communicate to aspiring physicians the dedicated training of an otolaryngologist specializing in FPRS.

  13. Characterization of a posed smile and evaluation of facial attractiveness by panel perception and its correlation with hard and soft tissue.

    PubMed

    Malhotra, Smriti; Sidhu, Maninder Singh; Prabhakar, Mona; Kochhar, Anuraj Singh

    2012-01-01

    To examine whether specific hard and soft tissue had any effect on smile characteristics and to ascertain the opinions of laypersons and clinicians in evaluating facial attractiveness among different occlusions. Photographs of posed smiles, along with profiles and full faces, of 76 patients with different occlusions were captured, and a lateral cephalogram of each subject was traced. These photographs were judged by a panel of 10 clinicians and 10 laypersons on a 5-point visual analog scale. Quantitative measurements were carried out on the smile images for 14 smile characteristics. The effect of hard and soft tissue on these characteristics was also examined. The upper vermilion lip thickness was affected by Pt.A-UI and E-line to upper lip, while the lower vermilion lip thickness was affected by lower anterior facial height. FMA had a significant positive effect on gingival display (P ≤ .05). This meant that an increase in FMA also caused the gingival display to increase. The nasolabial angle showed a significant positive effect on incisal display, while FMA showed a negative effect on intercanine width. Lower facial height and FMA had a significant negative effect on the smile index. A correlation was found between the judgments of clinicians and laypersons. Both judged Class I relationships to be the most attractive. FMA was found to have a positive effect on the amount of gingival display. It was also observed that patients with Class II Division 1 relationships had the thickest lips compared with patients having other types of occlusions. Class III patients exhibited no gingival display on smile. Patients with Class I showed the maximum smile width, while patients with Class III showed the least amount of buccal corridor.

  14. The perceptual saliency of fearful eyes and smiles: A signal detection study

    PubMed Central

    Saban, Muhammet Ikbal; Rotshtein, Pia

    2017-01-01

    Facial features differ in the amount of expressive information they convey. Specifically, eyes are argued to be essential for fear recognition, while smiles are crucial for recognising happy expressions. In three experiments, we tested whether expression modulates the perceptual saliency of diagnostic facial features and whether the feature’s saliency depends on the face configuration. Participants were presented with masked facial features or noise at perceptual conscious threshold. The task was to indicate whether eyes (experiments 1-3A) or a mouth (experiment 3B) was present. The expression of the face and its configuration (i.e. spatial arrangement of the features) were manipulated. Experiment 1 compared fearful with neutral expressions, experiments 2 and 3 compared fearful versus happy expressions. The detection accuracy data was analysed using Signal Detection Theory (SDT), to examine the effects of expression and configuration on perceptual precision (d’) and response bias (c), separately. Across all three experiments, fearful eyes were detected better (higher d’) than neutral and happy eyes. Eyes were more precisely detected than mouths, whereas smiles were detected better than fearful mouths. The configuration of the features had no consistent effects across the experiments on the ability to detect expressive features. But facial configuration consistently affected the response bias. Participants used a more liberal criterion for detecting the eyes in canonical configuration and fearful expression. Finally, the power in low spatial frequency of a feature predicted its discriminability index. The results suggest that expressive features are perceptually more salient with a higher d’ due to changes at the low-level visual properties, with emotions and configuration affecting perception through top-down processes, as reflected by the response bias. PMID:28267761
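
    The discriminability index d' and the criterion c referred to above follow the standard signal-detection formulas. The sketch below computes them from hit and false-alarm counts, with a small log-linear correction to guard against rates of exactly 0 or 1; the counts are hypothetical.

    ```python
    from scipy.stats import norm

    def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
        """d' = z(H) - z(FA); c = -(z(H) + z(FA)) / 2, with a log-linear correction."""
        h = (hits + 0.5) / (hits + misses + 1)
        fa = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z_h, z_fa = norm.ppf(h), norm.ppf(fa)
        return z_h - z_fa, -(z_h + z_fa) / 2

    # Hypothetical eye-detection counts for one participant in one condition.
    d, c = dprime_and_criterion(hits=42, misses=8, false_alarms=12, correct_rejections=38)
    print(f"d' = {d:.2f}, c = {c:.2f}")  # a negative c indicates a more liberal criterion
    ```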

  15. Plastic Surgery on Children with Down Syndrome: Parents' Perceptions of Physical, Personal, and Social Functioning.

    ERIC Educational Resources Information Center

    Kravetz, Shlomo; And Others

    1992-01-01

    This study compared perceptions of parents of 19 children with Down's syndrome (DS) who had undergone plastic facial surgery with perceptions of parents of DS children who had not received surgery. The comparison found little evidence of positive impact of the surgery on parents' perceptions of their children's physical, personal, and social…

  16. Feeling Touched: Emotional Modulation of Somatosensory Potentials to Interpersonal Touch.

    PubMed

    Ravaja, N; Harjunen, V; Ahmed, I; Jacucci, G; Spapé, M M

    2017-01-12

    Although previous studies have shown that an emotional context may alter touch processing, it is not clear how visual contextual information modulates the sensory signals, or at what levels this modulation takes place. Therefore, we investigated how a toucher's emotional expressions (anger, happiness, fear, and sadness) modulate the touchee's somatosensory-evoked potentials (SEPs) in different temporal ranges. Participants were presented with tactile stimulation appearing to originate from expressive characters in virtual reality. Touch processing was indexed using SEPs, and self-reports of touch experience were collected. Early potentials were found to be amplified after angry, happy and sad facial expressions, while late potentials were amplified after anger but attenuated after happiness. These effects were related to two stages of emotional modulation of tactile perception: anticipation and interpretation. The findings show that not only does touch affect emotion, but also emotional expressions affect touch perception. The affective modulation of touch was initially obtained as early as 25 ms after the touch onset, suggesting that emotional context is integrated into the tactile sensation at a very early stage.

  17. Feeling Touched: Emotional Modulation of Somatosensory Potentials to Interpersonal Touch

    PubMed Central

    Ravaja, N.; Harjunen, V.; Ahmed, I.; Jacucci, G.; Spapé, M. M.

    2017-01-01

    Although previous studies have shown that an emotional context may alter touch processing, it is not clear how visual contextual information modulates the sensory signals, or at what levels this modulation takes place. Therefore, we investigated how a toucher’s emotional expressions (anger, happiness, fear, and sadness) modulate the touchee’s somatosensory-evoked potentials (SEPs) in different temporal ranges. Participants were presented with tactile stimulation appearing to originate from expressive characters in virtual reality. Touch processing was indexed using SEPs, and self-reports of touch experience were collected. Early potentials were found to be amplified after angry, happy and sad facial expressions, while late potentials were amplified after anger but attenuated after happiness. These effects were related to two stages of emotional modulation of tactile perception: anticipation and interpretation. The findings show that not only does touch affect emotion, but also emotional expressions affect touch perception. The affective modulation of touch was initially obtained as early as 25 ms after the touch onset, suggesting that emotional context is integrated into the tactile sensation at a very early stage. PMID:28079157

  18. Facial affect processing and depression susceptibility: cognitive biases and cognitive neuroscience.

    PubMed

    Bistricky, Steven L; Ingram, Rick E; Atchley, Ruth Ann

    2011-11-01

    Facial affect processing is essential to social development and functioning and is particularly relevant to models of depression. Although cognitive and interpersonal theories have long described different pathways to depression, cognitive-interpersonal and evolutionary social risk models of depression focus on the interrelation of interpersonal experience, cognition, and social behavior. We therefore review the burgeoning depressive facial affect processing literature and examine its potential for integrating disciplines, theories, and research. In particular, we evaluate studies in which information processing or cognitive neuroscience paradigms were used to assess facial affect processing in depressed and depression-susceptible populations. Most studies have assessed and supported cognitive models. This research suggests that depressed and depression-vulnerable groups show abnormal facial affect interpretation, attention, and memory, although findings vary based on depression severity, comorbid anxiety, or length of time faces are viewed. Facial affect processing biases appear to correspond with distinct neural activity patterns and increased depressive emotion and thought. Biases typically emerge in depressed moods but are occasionally found in the absence of such moods. Indirect evidence suggests that childhood neglect might cultivate abnormal facial affect processing, which can impede social functioning in ways consistent with cognitive-interpersonal and interpersonal models. However, reviewed studies provide mixed support for the social risk model prediction that depressive states prompt cognitive hypervigilance to social threat information. We recommend prospective interdisciplinary research examining whether facial affect processing abnormalities promote, or are promoted by, depressogenic attachment experiences, negative thinking, and social dysfunction.

  19. Infants' Intermodal Perception of Canine ("Canis Familiaris") Facial Expressions and Vocalizations

    ERIC Educational Resources Information Center

    Flom, Ross; Whipple, Heather; Hyde, Daniel

    2009-01-01

    From birth, human infants are able to perceive a wide range of intersensory relationships. The current experiment examined whether infants between 6 months and 24 months old perceive the intermodal relationship between aggressive and nonaggressive canine vocalizations (i.e., barks) and appropriate canine facial expressions. Infants simultaneously…

  20. Negative affect is related to reduced differential neural responses to social and non-social stimuli in 5-to-8-month-old infants: A functional near-infrared spectroscopy-study.

    PubMed

    van der Kant, Anne; Biro, Szilvia; Levelt, Claartje; Huijbregts, Stephan

    2018-04-01

    Both social perception and temperament in young infants have been related to social functioning later in life. Previous functional Near-Infrared Spectroscopy (fNIRS) data (Lloyd-Fox et al., 2009) showed larger blood-oxygenation changes for social compared to non-social stimuli in the posterior temporal cortex of five-month-old infants. We sought to replicate and extend these findings by using fNIRS to study the neural basis of social perception in relation to infant temperament (Negative Affect) in 37 five-to-eight-month-old infants. Infants watched short videos displaying either hand and facial movements of female actors (social dynamic condition) or moving toys and machinery (non-social dynamic condition), while fNIRS data were collected over temporal brain regions. Negative Affect was measured using the Infant Behavior Questionnaire. Results showed significantly larger blood-oxygenation changes in the right posterior-temporal region in the social compared to the non-social condition. Furthermore, this differential activation was smaller in infants showing higher Negative Affect. Our results replicate those of Lloyd-Fox et al. and confirm that five-to-eight-month-old infants show cortical specialization for social perception. Furthermore, the decreased cortical sensitivity to social stimuli in infants showing high Negative Affect may be an early biomarker for later difficulties in social interaction. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Facial Affect Recognition and Social Anxiety in Preschool Children

    ERIC Educational Resources Information Center

    Ale, Chelsea M.; Chorney, Daniel B.; Brice, Chad S.; Morris, Tracy L.

    2010-01-01

    Research relating anxiety and facial affect recognition has focused mostly on school-aged children and adults and has yielded mixed results. The current study sought to demonstrate an association among behavioural inhibition and parent-reported social anxiety, shyness, social withdrawal and facial affect recognition performance in 30 children,…

  2. She looks sad, but he looks mad: the effects of age, gender, and ambiguity on emotion perception.

    PubMed

    Parmley, Maria; Cunningham, Joseph G

    2014-01-01

    This study investigated how target sex, target age, and expressive ambiguity influence emotion perception. Undergraduate participants (N = 192) watched morphed video clips of eight child and eight adult facial expressions shifting from neutral to either sadness or anger. Participants were asked to stop the video clip when they first saw an emotion appear (perceptual sensitivity) and were asked to identify the emotion that they saw (accuracy). Results indicate that female participants identified sad expressions sooner in female targets than in male targets. Participants were also more accurate identifying angry facial expressions by male children than by female children. Findings are discussed in terms of the effects of ambiguity, gender, and age on the perception of emotional expressions.

  3. Mirror, mirror on the wall…: self-perception of facial beauty versus judgement by others.

    PubMed

    Springer, I N; Wiltfang, J; Kowalski, J T; Russo, P A J; Schulze, M; Becker, S; Wolfart, S

    2012-12-01

    In 1878, Margaret Wolfe Hungerford published a simple but insightful phrase in her novel 'Molly Bawn' that was to be quoted so often it has almost become cliché: "Beauty is in the eye of the beholder". While many questions regarding the perception and neural processing of facial attractiveness have been resolved, it became obvious to us that study designs have been principally based on either facial self-perception or perception by others. The relationship between these, however, remains both crucial and unknown. Standardized images were taken of 141 subjects. These subjects were asked to complete the adjective mood scale (AMS) and to rank specific issues related to their looks on a visual analogue scale. The images were then shown to independent judges, who ranked the same issues on a visual analogue scale. Our results provide proof of a strikingly simple observation: that individuals perceive their own beauty to be greater than that expressed in the opinions of others (p < 0.001). This observation provides insight into our basic behavioural patterns and suggests that there are strong psychological mechanisms in humans supporting self-identification and thereby encouraging the self-confidence and resilience necessary to maintain one's social standing. While the psychological basis of self-confidence is multifactorial, our finding provides critical objective insight. We prove here for the first time that nothing more than the beauty of the beholder is in the eyes of the latter. Copyright © 2012 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  4. Face processing in autism: Reduced integration of cross-feature dynamics.

    PubMed

    Shah, Punit; Bird, Geoffrey; Cook, Richard

    2016-02-01

    Characteristic problems with social interaction have prompted considerable interest in the face processing of individuals with Autism Spectrum Disorder (ASD). Studies suggest that reduced integration of information from disparate facial regions likely contributes to difficulties recognizing static faces in this population. Recent work also indicates that observers with ASD have problems using patterns of facial motion to judge identity and gender, and may be less able to derive global motion percepts. These findings raise the possibility that feature integration deficits also impact the perception of moving faces. To test this hypothesis, we examined whether observers with ASD exhibit susceptibility to a new dynamic face illusion, thought to index integration of moving facial features. When typical observers view eye-opening and -closing in the presence of asynchronous mouth-opening and -closing, the concurrent mouth movements induce a strong illusory slowing of the eye transitions. However, we find that observers with ASD are not susceptible to this illusion, suggestive of weaker integration of cross-feature dynamics. Nevertheless, observers with ASD and typical controls were equally able to detect the physical differences between comparison eye transitions. Importantly, this confirms that observers with ASD were able to fixate the eye-region, indicating that the striking group difference has a perceptual, not attentional origin. The clarity of the present results contrasts starkly with the modest effect sizes and equivocal findings seen throughout the literature on static face perception in ASD. We speculate that differences in the perception of facial motion may be a more reliable feature of this condition. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Probabilistic Perception, Empathy, and Dynamic Homeostasis: Insights in Autism Spectrum Disorders and Conduct Disorders

    PubMed Central

    Guilé, Jean Marc

    2013-01-01

    Homeostasis is not a permanent and stable state but instead results from conflicting forces. Therefore, infants have to engage in dynamic exchanges with their environment, in biological, cognitive, and affective domains. Empathy is an adaptive response to these environmental challenges, which contributes to reaching proper dynamic homeostasis and development. Empathy relies on implicit interactive processes, namely probabilistic perception and synchrony, which will be reviewed in the article. If typically-developed neonates are fully equipped to automatically and synchronously interact with their human environment, conduct disorders (CD) and autism spectrum disorders (ASD) present with impairments in empathetic communication, e.g., emotional arousal and facial emotion processing. In addition sensorimotor resonance is lacking in ASD, and emotional concern and semantic empathy are impaired in CD with Callous-Unemotional traits. PMID:24479115

  6. Experience-based human perception of facial expressions in Barbary macaques (Macaca sylvanus)

    PubMed Central

    Levy, Xandria; Meints, Kerstin; Majolo, Bonaventura

    2017-01-01

    Background Facial expressions convey key cues of human emotions, and may also be important for interspecies interactions. The universality hypothesis suggests that six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) should be expressed by similar facial expressions in close phylogenetic species such as humans and nonhuman primates. However, some facial expressions have been shown to differ in meaning between humans and nonhuman primates like macaques. This ambiguity in signalling emotion can lead to an increased risk of aggression and injuries for both humans and animals. This raises serious concerns for activities such as wildlife tourism where humans closely interact with wild animals. Understanding what factors (i.e., experience and type of emotion) affect ability to recognise emotional state of nonhuman primates, based on their facial expressions, can enable us to test the validity of the universality hypothesis, as well as reduce the risk of aggression and potential injuries in wildlife tourism. Methods The present study investigated whether different levels of experience of Barbary macaques, Macaca sylvanus, affect the ability to correctly assess different facial expressions related to aggressive, distressed, friendly or neutral states, using an online questionnaire. Participants’ level of experience was defined as either: (1) naïve: never worked with nonhuman primates and never or rarely encountered live Barbary macaques; (2) exposed: shown pictures of the different Barbary macaques’ facial expressions along with the description and the corresponding emotion prior to undertaking the questionnaire; (3) expert: worked with Barbary macaques for at least two months. Results Experience with Barbary macaques was associated with better performance in judging their emotional state. Simple exposure to pictures of macaques’ facial expressions improved the ability of inexperienced participants to better discriminate neutral and distressed faces, and a trend was found for aggressive faces. However, these participants, even when previously exposed to pictures, had difficulties in recognising aggressive, distressed and friendly faces above chance level. Discussion These results do not support the universality hypothesis as exposed and naïve participants had difficulties in correctly identifying aggressive, distressed and friendly faces. Exposure to facial expressions improved their correct recognition. In addition, the findings suggest that providing simple exposure to 2D pictures (for example, information signs explaining animals’ facial signalling in zoos or animal parks) is not a sufficient educational tool to reduce tourists’ misinterpretations of macaque emotion. Additional measures, such as keeping a safe distance between tourists and wild animals, as well as reinforcing learning via videos or supervised visits led by expert guides, could reduce such issues and improve both animal welfare and tourist experience. PMID:28584731

  7. Global facial beauty: approaching a unified aesthetic ideal.

    PubMed

    Sands, Noah B; Adamson, Peter A

    2014-04-01

    Recognition of facial beauty is both inborn and learned through social discourses and exposures. Demographic shifts across the globe, in addition to cross-cultural interactions that typify 21st century globalization in virtually all industries, comprise major active evolutionary forces that reshape our individual notions of facial beauty. This article highlights the changing perceptions of beauty, while defining and distinguishing natural beauty and artificial beauty.

  8. A dental-facial attractiveness scale. Part II. Consistency of perception.

    PubMed

    Tedesco, L A; Albino, J E; Cunat, J J; Slakter, M J; Waltz, K J

    1983-01-01

    A previous report describes the reliability and validity of a scale designed to assess perceptions of dental-facial attractiveness, independent of occlusal function. The purpose of the present study was to assess the consistency of judgments of dental-facial attractiveness (DFA) for sex and race differences in photographed children. Using a five-point DFA scale, twelve college freshmen (three black females, three black males, three white females, three white males) rated ninety-six photographs of the mouths and jaws of 13- to 14-year-old children (twenty-four black females, twenty-four black males, twenty-four white females, twenty-four white males). No significant mean differences were found between the black and white children photographed or between the female and male children photographed. However, means were significantly different for DFA judgments by race and sex of the raters. Black raters judged all photographs to be more attractive than did white raters, and female raters judged all photographs to be more attractive than did male raters. Correlational data are presented describing consistency of perception within rater groups and photographed groups of children.

  9. Passing faces: sequence-dependent variations in the perceptual processing of emotional faces.

    PubMed

    Karl, Christian; Hewig, Johannes; Osinsky, Roman

    2016-10-01

    There is broad evidence that contextual factors influence the processing of emotional facial expressions. Yet temporal-dynamic aspects, inter alia how face processing is influenced by the specific order of neutral and emotional facial expressions, have been largely neglected. To shed light on this topic, we recorded electroencephalogram from 168 healthy participants while they performed a gender-discrimination task with angry and neutral faces. Our event-related potential (ERP) analyses revealed a strong emotional modulation of the N170 component, indicating that the basic visual encoding and emotional analysis of a facial stimulus happen, at least partially, in parallel. While the N170 and the late positive potential (LPP; 400-600 ms) were only modestly affected by the sequence of preceding faces, we observed a strong influence of face sequences on the early posterior negativity (EPN; 200-300 ms). Finally, the differing response patterns of the EPN and LPP indicate that these two ERPs represent distinct processes during face analysis: while the former seems to represent the integration of contextual information in the perception of a current face, the latter appears to represent the net emotional interpretation of a current face.

  10. Exploring social cognition in patients with apathy following acquired brain damage.

    PubMed

    Njomboro, Progress; Humphreys, Glyn W; Deb, Shoumitro

    2014-01-23

    Research on cognition in apathy has largely focused on executive functions. To the best of our knowledge, no studies have investigated the relationship between apathy symptoms and processes involved in social cognition. Apathy symptoms include attenuated emotional behaviour, low social engagement and social withdrawal, all of which may be linked to underlying socio-cognitive deficits. We compared patients with brain damage who also had apathy symptoms against similar patients with brain damage but without apathy symptoms. Both patient groups were also compared against normal controls on key socio-cognitive measures involving moral reasoning, social awareness related to making judgements between normative and non-normative behaviour, Theory of Mind processing, and the perception of facial expressions of emotion. We also controlled for the likely effects of executive deficits and depressive symptoms on these comparisons. Our results indicated that patients with apathy were distinctively impaired in making moral reasoning decisions and in judging the social appropriateness of behaviour. Deficits in Theory of Mind and perception of facial expressions of emotion did not distinguish patients with apathy from those without apathy. Our findings point to a possible socio-cognitive profile for apathy symptoms and provide initial insights into how socio-cognitive deficits in patients with apathy may affect social functioning.

  11. Factors Influencing Perception of Facial Attractiveness: Gender and Dental Education.

    PubMed

    Jung, Ga-Hee; Jung, Seunggon; Park, Hong-Ju; Oh, Hee-Kyun; Kook, Min-Suk

    2018-03-01

    This study was conducted to investigate gender- and dental education-specific differences in the perception of facial attractiveness for varying ratios of the lower face contour. Two hundred eleven students (110 male respondents and 110 female respondents; aged 20 to 38 years) were requested to rate facial figures with alterations to the bigonial width and the vertical length of the lower face. We produced a standard figure based on the "golden ratio" and 4 additional series of figures with either horizontal or vertical alterations to the contour of the lower face. The preference for each figure was evaluated using a Visual Analog Scale. The Kruskal-Wallis test was used to test for differences in the preferences for each figure, and the Mann-Whitney U test was used to evaluate gender-specific differences and differences by dental education. In general, the highest preference score was given to the standard figure, whereas the facial figure with a large bigonial width and chin length had the lowest score. Male respondents showed a significantly higher preference score for the facial contour that had a 0.1 proportional increase in the facial height-bigonial width ratio over that of the standard figure. For horizontal alterations to the facial profiles, there were no significant differences in preferences by level of dental education. For vertically altered images, the average Visual Analog Scale score was significantly lower among the dentally educated respondents for facial images that had a proportional 0.22 and 0.42 increase in the ratio between the vertical length of the chin and the lip. Overall, the standard image based on the golden ratio was the most preferred. A slender face appealed more to male than to female respondents, and facial images with an increased lower facial height were perceived as much less attractive by the dentally educated respondents, which suggests that dental education may influence sensitivity to vertical changes in the lower face.
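
    For readers unfamiliar with the nonparametric tests named above, the sketch below runs a Mann-Whitney U comparison of two groups' Visual Analog Scale preference ratings; the rating values are invented for illustration and do not reproduce the study's data.

      from scipy.stats import mannwhitneyu

      # Hypothetical VAS preference ratings (0-100) for one altered facial figure.
      male_vas = [62, 70, 55, 68, 74, 60]
      female_vas = [48, 52, 57, 45, 50, 59]

      stat, p = mannwhitneyu(male_vas, female_vas, alternative="two-sided")
      print(stat, p)  # U statistic and two-sided p-value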

  12. Children's Facial Trustworthiness Judgments: Agreement and Relationship with Facial Attractiveness.

    PubMed

    Ma, Fengling; Xu, Fen; Luo, Xianming

    2016-01-01

    This study examined developmental changes in children's abilities to make trustworthiness judgments based on faces and the relationship between a child's perception of trustworthiness and facial attractiveness. One hundred and one 8-, 10-, and 12-year-olds, along with 37 undergraduates, were asked to judge the trustworthiness of 200 faces. Next, they issued facial attractiveness judgments. The results indicated that children made consistent trustworthiness and attractiveness judgments based on facial appearance, but agreement with adults and within-age agreement levels of facial judgments increased with age. Additionally, the agreement levels of judgments made by girls were higher than those made by boys. Furthermore, the relationship between trustworthiness and attractiveness judgments increased with age, and the relationship between the two judgments was closer for girls than for boys. These findings suggest that face-based trait judgment ability develops throughout childhood and that, like adults, children may use facial attractiveness as a heuristic cue that signals a stranger's trustworthiness.

  13. Predicting the Accuracy of Facial Affect Recognition: The Interaction of Child Maltreatment and Intellectual Functioning

    ERIC Educational Resources Information Center

    Shenk, Chad E.; Putnam, Frank W.; Noll, Jennie G.

    2013-01-01

    Previous research demonstrates that both child maltreatment and intellectual performance contribute uniquely to the accurate identification of facial affect by children and adolescents. The purpose of this study was to extend this research by examining whether child maltreatment affects the accuracy of facial recognition differently at varying…

  14. Muecas: A Multi-Sensor Robotic Head for Affective Human Robot Interaction and Imitation

    PubMed Central

    Cid, Felipe; Moreno, Jose; Bustos, Pablo; Núñez, Pedro

    2014-01-01

    This paper presents a multi-sensor humanoid robotic head for human robot interaction. The design of the robotic head, Muecas, is based on ongoing research on the mechanisms of perception and imitation of human expressions and emotions. These mechanisms allow direct interaction between the robot and its human companion through the different natural language modalities: speech, body language and facial expressions. The robotic head has 12 degrees of freedom, in a human-like configuration, including eyes, eyebrows, mouth and neck, and has been designed and built entirely by IADeX (Engineering, Automation and Design of Extremadura) and RoboLab. A detailed description of its kinematics is provided along with the design of the most complex controllers. Muecas can be directly controlled by FACS (Facial Action Coding System), the de facto standard for facial expression recognition and synthesis. This feature facilitates its use by third party platforms and encourages the development of imitation and of goal-based systems. Imitation systems learn from the user, while goal-based ones use planning techniques to drive the user towards a final desired state. To show the flexibility and reliability of the robotic head, the paper presents a software architecture that is able to detect, recognize, classify and generate facial expressions in real time using FACS. This system has been implemented using the robotics framework, RoboComp, which provides hardware-independent access to the sensors in the head. Finally, the paper presents experimental results showing the real-time functioning of the whole system, including recognition and imitation of human facial expressions. PMID:24787636
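
    As a rough sketch of what FACS-driven control can look like in code, the example below maps a few expression labels to textbook FACS action-unit (AU) intensity targets; the AU sets, intensity values and function name are illustrative assumptions and do not reproduce the actual Muecas/RoboComp interface.

      # Hypothetical mapping from expression labels to FACS AU-intensity targets.
      EXPRESSION_AUS = {
          "happiness": {6: 0.8, 12: 1.0},        # AU6 cheek raiser, AU12 lip corner puller
          "sadness": {1: 0.7, 4: 0.5, 15: 0.8},  # AU1 inner brow raiser, AU4 brow lowerer, AU15 lip corner depressor
          "surprise": {1: 0.9, 2: 0.9, 26: 0.7}, # AU1/AU2 brow raisers, AU26 jaw drop
      }

      def au_targets(expression):
          """Return the AU-intensity targets a FACS-driven head controller could track."""
          return EXPRESSION_AUS.get(expression, {})

      print(au_targets("happiness"))  # {6: 0.8, 12: 1.0}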

  15. Facial contrast is a cue for perceiving health from the face.

    PubMed

    Russell, Richard; Porcheron, Aurélie; Sweda, Jennifer R; Jones, Alex L; Mauger, Emmanuelle; Morizot, Frederique

    2016-09-01

    How healthy someone appears has important social consequences. Yet the visual cues that determine perceived health remain poorly understood. Here we report evidence that facial contrast-the luminance and color contrast between internal facial features and the surrounding skin-is a cue for the perception of health from the face. Facial contrast was measured from a large sample of Caucasian female faces, and was found to predict ratings of perceived health. Most aspects of facial contrast were positively related to perceived health, meaning that faces with higher facial contrast appeared healthier. In 2 subsequent experiments, we manipulated facial contrast and found that participants perceived faces with increased facial contrast as appearing healthier than faces with decreased facial contrast. These results support the idea that facial contrast is a cue for perceived health. This finding adds to the growing knowledge about perceived health from the face, and helps to ground our understanding of perceived health in terms of lower-level perceptual features such as contrast. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
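
    As a minimal sketch of the kind of facial-contrast measure described above, the code below computes a Michelson-style luminance contrast between a facial feature region and the surrounding skin; the sample pixel values, the mean-luminance summary and the Michelson formulation are assumptions for illustration rather than the study's measurement pipeline.

      import numpy as np

      def michelson_contrast(feature_luminance, skin_luminance):
          """Contrast between mean feature luminance and mean surrounding-skin luminance."""
          f = float(np.mean(feature_luminance))
          s = float(np.mean(skin_luminance))
          return abs(f - s) / (f + s)

      # Made-up luminance samples, e.g. lip region versus nearby cheek skin.
      lip = np.array([52.0, 48.0, 50.0])
      skin = np.array([70.0, 72.0, 68.0])
      print(round(michelson_contrast(lip, skin), 3))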

  16. Facial Emotion Recognition Performance Differentiates Between Behavioral Variant Frontotemporal Dementia and Major Depressive Disorder.

    PubMed

    Chiu, Isabelle; Piguet, Olivier; Diehl-Schmid, Janine; Riedl, Lina; Beck, Johannes; Leyhe, Thomas; Holsboer-Trachsler, Edith; Kressig, Reto W; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    Misdiagnosis of early behavioral variant frontotemporal dementia (bvFTD) with major depressive disorder (MDD) is not uncommon due to overlapping symptoms. The aim of this study was to improve the discrimination between these disorders using a novel facial emotion perception task. In this prospective cohort study (July 2013-March 2016), we compared 25 patients meeting Rascovsky diagnostic criteria for bvFTD, 20 patients meeting DSM-IV criteria for MDD, 21 patients meeting McKhann diagnostic criteria for Alzheimer's disease dementia, and 31 healthy participants on a novel emotion intensity rating task comprising morphed low-intensity facial stimuli. Participants were asked to rate the intensity of morphed faces on the congruent basic emotion (eg, rating on sadness when sad face is shown) and on the 5 incongruent basic emotions (eg, rating on each of the other basic emotions when sad face is shown). While bvFTD patients underrated congruent emotions (P < .01), they also overrated incongruent emotions (P < .001), resulting in confusion of facial emotions. In contrast, MDD patients overrated congruent negative facial emotions (P < .001), but not incongruent facial emotions. Accordingly, ratings of congruent and incongruent emotions highly discriminated between bvFTD and MDD patients, ranging from area under the curve (AUC) = 93% to AUC = 98%. Further, an almost complete discrimination (AUC = 99%) was achieved by contrasting the 2 rating types. In contrast, Alzheimer's disease dementia patients perceived emotions similarly to healthy participants, indicating no impact of cognitive impairment on rating scores. Our congruent and incongruent facial emotion intensity rating task allows a detailed assessment of facial emotion perception in patient populations. By using this simple task, we achieved an almost complete discrimination between bvFTD and MDD, potentially helping improve the diagnostic certainty in early bvFTD. © Copyright 2018 Physicians Postgraduate Press, Inc.
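
    To make the reported AUC values concrete, the sketch below computes a receiver operating characteristic area under the curve (AUC) from diagnostic labels and rating scores; the labels, rating values and use of scikit-learn are assumptions for illustration and are not the study's data or analysis code.

      from sklearn.metrics import roc_auc_score

      # 1 = bvFTD, 0 = MDD; a higher incongruent-emotion rating is assumed for bvFTD.
      labels = [1, 1, 1, 1, 0, 0, 0, 0]
      incongruent_ratings = [3.2, 2.8, 3.5, 2.9, 1.1, 1.4, 0.9, 1.6]

      print(roc_auc_score(labels, incongruent_ratings))  # 1.0 for this perfectly separated toy sample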

  17. Exploring the nature of facial affect processing deficits in schizophrenia.

    PubMed

    van 't Wout, Mascha; Aleman, André; Kessels, Roy P C; Cahn, Wiepke; de Haan, Edward H F; Kahn, René S

    2007-04-15

    Schizophrenia has been associated with deficits in facial affect processing, especially negative emotions. However, the exact nature of the deficit remains unclear. The aim of the present study was to investigate whether schizophrenia patients have problems in automatic allocation of attention as well as in controlled evaluation of facial affect. Thirty-seven patients with schizophrenia were compared with 41 control subjects on incidental facial affect processing (gender decision of faces with a fearful, angry, happy, disgusted, and neutral expression) and degraded facial affect labeling (labeling of fearful, angry, happy, and neutral faces). The groups were matched on estimates of verbal and performance intelligence (National Adult Reading Test; Raven's Matrices), general face recognition ability (Benton Face Recognition), and other demographic variables. The results showed that patients with schizophrenia as well as control subjects demonstrate the normal threat-related interference during incidental facial affect processing. Conversely, on controlled evaluation patients were specifically worse in the labeling of fearful faces. In particular, patients with high levels of negative symptoms may be characterized by deficits in labeling fear. We suggest that patients with schizophrenia show no evidence of deficits in the automatic allocation of attention resources to fearful (threat-indicating) faces, but have a deficit in the controlled processing of facial emotions that may be specific for fearful faces.

  18. Recognition of Facial Expressions of Emotion in Adults with Down Syndrome

    ERIC Educational Resources Information Center

    Virji-Babul, Naznin; Watt, Kimberley; Nathoo, Farouk; Johnson, Peter

    2012-01-01

    Research on facial expressions in individuals with Down syndrome (DS) has been conducted using photographs. Our goal was to examine the effect of motion on perception of emotional expressions. Adults with DS, adults with typical development matched for chronological age (CA), and children with typical development matched for developmental age (DA)…

  19. Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures

    PubMed Central

    Fang, Xia; Sauter, Disa A.; Van Kleef, Gerben A.

    2017-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese and Dutch observers viewing facial expressions: emotions which were morphologically similar to the intended emotion and emotions which were morphologically dissimilar to the intended emotion. Findings were consistent across two studies and showed that (a) morphologically similar emotions were endorsed to a greater extent than dissimilar emotions and (b) Chinese observers endorsed nonintended emotions more than did Dutch observers. Furthermore, the difference between Chinese and Dutch observers was more pronounced for the endorsement of morphologically similar emotions than of dissimilar emotions. We also obtained consistent evidence that Dutch observers endorsed nonintended emotions that were congruent with the preceding expressions to a greater degree. These findings suggest that culture and morphological similarity both influence the extent to which perceivers see several emotions in a facial expression. PMID:29386689

  20. Seeing Mixed Emotions: The Specificity of Emotion Perception From Static and Dynamic Facial Expressions Across Cultures.

    PubMed

    Fang, Xia; Sauter, Disa A; Van Kleef, Gerben A

    2018-01-01

    Although perceivers often agree about the primary emotion that is conveyed by a particular expression, observers may concurrently perceive several additional emotions from a given facial expression. In the present research, we compared the perception of two types of nonintended emotions in Chinese and Dutch observers viewing facial expressions: emotions which were morphologically similar to the intended emotion and emotions which were morphologically dissimilar to the intended emotion. Findings were consistent across two studies and showed that (a) morphologically similar emotions were endorsed to a greater extent than dissimilar emotions and (b) Chinese observers endorsed nonintended emotions more than did Dutch observers. Furthermore, the difference between Chinese and Dutch observers was more pronounced for the endorsement of morphologically similar emotions than of dissimilar emotions. We also obtained consistent evidence that Dutch observers endorsed nonintended emotions that were congruent with the preceding expressions to a greater degree. These findings suggest that culture and morphological similarity both influence the extent to which perceivers see several emotions in a facial expression.

  1. Women's greater ability to perceive happy facial emotion automatically: gender differences in affective priming.

    PubMed

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states.

  2. Women's Greater Ability to Perceive Happy Facial Emotion Automatically: Gender Differences in Affective Priming

    PubMed Central

    Donges, Uta-Susan; Kersting, Anette; Suslow, Thomas

    2012-01-01

    There is evidence that women are better in recognizing their own and others' emotions. The female advantage in emotion recognition becomes even more apparent under conditions of rapid stimulus presentation. Affective priming paradigms have been developed to examine empirically whether facial emotion stimuli presented outside of conscious awareness color our impressions. It was observed that masked emotional facial expression has an affect congruent influence on subsequent judgments of neutral stimuli. The aim of the present study was to examine the effect of gender on affective priming based on negative and positive facial expression. In our priming experiment sad, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces which had to be evaluated. 81 young healthy volunteers (53 women) participated in the study. Subjects had no subjective awareness of emotional primes. Women did not differ from men with regard to age, education, intelligence, trait anxiety, or depressivity. In the whole sample, happy but not sad facial expression elicited valence congruent affective priming. Between-group analyses revealed that women manifested greater affective priming due to happy faces than men. Women seem to have a greater ability to perceive and respond to positive facial emotion at an automatic processing level compared to men. High perceptual sensitivity to minimal social-affective signals may contribute to women's advantage in understanding other persons' emotional states. PMID:22844519

  3. Greater perceptual sensitivity to happy facial expression.

    PubMed

    Maher, Stephen; Ekstrom, Tor; Chen, Yue

    2014-01-01

    Perception of subtle facial expressions is essential for social functioning; yet it is unclear if human perceptual sensitivities differ in detecting varying types of facial emotions. Evidence diverges as to whether salient negative versus positive emotions (such as sadness versus happiness) are preferentially processed. Here, we measured perceptual thresholds for the detection of four types of emotion in faces--happiness, fear, anger, and sadness--using psychophysical methods. We also evaluated the association of the perceptual performances with facial morphological changes between neutral and respective emotion types. Human observers were highly sensitive to happiness compared with the other emotional expressions. Further, this heightened perceptual sensitivity to happy expressions can be attributed largely to the emotion-induced morphological change of a particular facial feature (end-lip raise).

  4. Brief report: Representational momentum for dynamic facial expressions in pervasive developmental disorder.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2010-03-01

    Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of expressed emotion in 13 individuals with PDD and 13 typically developing controls. We presented dynamic and static emotional (fearful and happy) expressions. Participants were asked to match a changeable emotional face display with the last presented image. The results showed that both groups perceived the last image of dynamic facial expression to be more emotionally exaggerated than the static facial expression. This finding suggests that individuals with PDD have an intact perceptual mechanism for processing dynamic information in another individual's face.

  5. Facial Animations: Future Research Directions & Challenges

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Computer facial animation is nowadays used in a multitude of fields, including computer games, films and interactive multimedia. Authoring complex and subtle facial expressions remains challenging and fraught with problems; as a result, most facial animation is currently authored with general-purpose computer animation techniques, which often limits the quality and quantity of facial animation that can be produced. Despite growing computing power, improved understanding of the face, increasing software sophistication and newly emerging face-centric methods, the field remains immature. This paper therefore surveys facial animation experts in order to define and categorize the current state of the field, its observed bottlenecks and its developing techniques. The paper further presents a real-time simulation model of human worry and howling, with a detailed discussion of the perception of astonishment, sorrow, annoyance and panic.

  6. Rapid processing of emotional expressions without conscious awareness.

    PubMed

    Smith, Marie L

    2012-08-01

    Rapid, accurate categorization of the emotional state of our peers is of critical importance, and as such many have proposed that facial expressions of emotion can be processed without conscious awareness. Typically, studies focus selectively on fearful expressions due to their evolutionary significance, leaving the subliminal processing of other facial expressions largely unexplored. Here, I investigated the time course of processing of 3 facial expressions (fearful, disgusted, and happy) plus an emotionally neutral face, during objectively unaware and aware perception. Participants completed the challenging "which expression?" task in response to briefly presented backward-masked expressive faces. Although participants' behavioral responses did not differentiate between the emotional content of the stimuli in the unaware condition, activity over frontal and occipitotemporal (OT) brain regions indicated an emotional modulation of the neuronal response. Over frontal regions this was driven by negative facial expressions and was present on all emotional trials independent of later categorization, whereas the N170 component, recorded at lateral OT electrodes, was enhanced for all facial expressions but only on trials that would later be categorized as emotional. The results indicate that emotional faces, not only fearful ones, are processed without conscious awareness at an early stage and highlight the critical importance of considering the categorization response when studying subliminal perception.

  7. Information-Processing Alternatives to Holistic Perception: Identifying the Mechanisms of Secondary-Level Holism within a Categorization Paradigm

    ERIC Educational Resources Information Center

    Fific, Mario; Townsend, James T.

    2010-01-01

    Failure to selectively attend to a facial feature, in the part-to-whole paradigm, has been taken as evidence of holistic perception in a large body of face perception literature. In this article, we demonstrate that although failure of selective attention is a necessary property of holistic perception, its presence alone is not sufficient to…

  8. MEG demonstrates a supra-additive response to facial and vocal emotion in the right superior temporal sulcus.

    PubMed

    Hagan, Cindy C; Woods, Will; Johnson, Sam; Calder, Andrew J; Green, Gary G R; Young, Andrew W

    2009-11-24

    An influential neural model of face perception suggests that the posterior superior temporal sulcus (STS) is sensitive to those aspects of faces that produce transient visual changes, including facial expression. Other researchers note that recognition of expression involves multiple sensory modalities and suggest that the STS also may respond to crossmodal facial signals that change transiently. Indeed, many studies of audiovisual (AV) speech perception show STS involvement in AV speech integration. Here we examine whether these findings extend to AV emotion. We used magnetoencephalography to measure the neural responses of participants as they viewed and heard emotionally congruent fear and minimally congruent neutral face and voice stimuli. We demonstrate significant supra-additive responses (i.e., where AV > [unimodal auditory + unimodal visual]) in the posterior STS within the first 250 ms for emotionally congruent AV stimuli. These findings show a role for the STS in processing crossmodal emotive signals.

  9. Body Weight Can Change How Your Emotions Are Perceived

    PubMed Central

    2016-01-01

    Accurately interpreting other’s emotions through facial expressions has important adaptive values for social interactions. However, due to the stereotypical social perception of overweight individuals as carefree, humorous, and light-hearted, the body weight of those with whom we interact may have a systematic influence on our emotion judgment even though it has no relevance to the expressed emotion itself. In this experimental study, we examined the role of body weight in faces on the affective perception of facial expressions. We hypothesized that the weight perceived in a face would bias the assessment of an emotional expression, with overweight faces generally more likely to be perceived as having more positive and less negative expressions than healthy weight faces. Using two-alternative forced-choice perceptual decision tasks, participants were asked to sort the emotional expressions of overweight and healthy weight facial stimuli that had been gradually morphed across six emotional intensity levels into one of two categories—“neutral vs. happy” (Experiment 1) and “neutral vs. sad” (Experiment 2). As predicted, our results demonstrated that overweight faces were more likely to be categorized as happy (i.e., lower happy decision threshold) and less likely to be categorized as sad (i.e., higher sad decision threshold) compared to healthy weight faces that had the same levels of emotional intensity. The neutral-sad decision threshold shift was negatively correlated with participant’s own fear of becoming fat, that is, those without a fear of becoming fat more strongly perceived overweight faces as sad relative to those with a higher fear. These findings demonstrate that the weight of the face systematically influences how its emotional expression is interpreted, suggesting that being overweight may make emotional expressions appear more happy and less sad than they really are. PMID:27870892

  10. Body Weight Can Change How Your Emotions Are Perceived.

    PubMed

    Oh, Yujung; Hass, Norah C; Lim, Seung-Lark

    2016-01-01

    Accurately interpreting other's emotions through facial expressions has important adaptive values for social interactions. However, due to the stereotypical social perception of overweight individuals as carefree, humorous, and light-hearted, the body weight of those with whom we interact may have a systematic influence on our emotion judgment even though it has no relevance to the expressed emotion itself. In this experimental study, we examined the role of body weight in faces on the affective perception of facial expressions. We hypothesized that the weight perceived in a face would bias the assessment of an emotional expression, with overweight faces generally more likely to be perceived as having more positive and less negative expressions than healthy weight faces. Using two-alternative forced-choice perceptual decision tasks, participants were asked to sort the emotional expressions of overweight and healthy weight facial stimuli that had been gradually morphed across six emotional intensity levels into one of two categories-"neutral vs. happy" (Experiment 1) and "neutral vs. sad" (Experiment 2). As predicted, our results demonstrated that overweight faces were more likely to be categorized as happy (i.e., lower happy decision threshold) and less likely to be categorized as sad (i.e., higher sad decision threshold) compared to healthy weight faces that had the same levels of emotional intensity. The neutral-sad decision threshold shift was negatively correlated with participant's own fear of becoming fat, that is, those without a fear of becoming fat more strongly perceived overweight faces as sad relative to those with a higher fear. These findings demonstrate that the weight of the face systematically influences how its emotional expression is interpreted, suggesting that being overweight may make emotional expressions appear more happy and less sad than they really are.
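
    As a hedged sketch of how a decision threshold can be estimated from two-alternative forced-choice data like those described above, the code below fits a logistic psychometric function to the proportion of "happy" responses across morph intensity levels; the response proportions, starting values and the logistic form are illustrative assumptions, not the authors' analysis.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(x, x0, k):
          """Proportion of 'happy' responses as a function of morph intensity."""
          return 1.0 / (1.0 + np.exp(-k * (x - x0)))

      intensity = np.array([0, 1, 2, 3, 4, 5], dtype=float)      # six morph levels
      p_happy = np.array([0.05, 0.15, 0.40, 0.70, 0.90, 0.97])   # invented response proportions

      (threshold, slope), _ = curve_fit(logistic, intensity, p_happy, p0=[2.5, 1.0])
      print(f"estimated happy decision threshold: {threshold:.2f} morph steps")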

  11. Electrophysiological correlates of facial decision: insights from upright and upside-down Mooney-face perception.

    PubMed

    George, Nathalie; Jemel, Boutheina; Fiori, Nicole; Chaby, Laurence; Renault, Bernard

    2005-08-01

    We investigated the ERP correlates of the subjective perception of upright and upside-down ambiguous pictures as faces using two-tone Mooney stimuli in an explicit facial decision task (deciding whether a face is perceived or not in the display). The difficulty in perceiving upside-down Mooneys as faces was reflected by both lower rates of "Face" responses and delayed "Face" reaction times for upside-down relative to upright stimuli. The N170 was larger for the stimuli reported as "faces". It was also larger for the upright than the upside-down stimuli only when they were reported as faces. Furthermore, facial decision as well as stimulus orientation effects spread from 140-190 ms to 390-440 ms. The behavioural delay in 'Face' responses to upside-down stimuli was reflected in ERPs by later effect of facial decision for upside-down relative to upright Mooneys over occipito-temporal electrodes. Moreover, an orientation effect was observed only for the stimuli reported as faces; it yielded a marked hemispheric asymmetry, lasting from 140-190 ms to 390-440 ms post-stimulus onset in the left hemisphere and from 340-390 to 390-440 ms only in the right hemisphere. Taken together, the results supported a preferential involvement of the right hemisphere in the detection of faces, whatever their orientation. By contrast, the early orientation effect in the left hemisphere suggested that upside-down Mooney stimuli were processed as non face objects until facial decision was reached in this hemisphere. The present data show that face perception involves not only spatially but also temporally distributed activities in occipito-temporal regions.

  12. [Beauty judgment: review of the literature].

    PubMed

    Faure, Jacques; Bolender, Yves

    2014-03-01

    Esthetic judgments are surely subjective, but as surely, that does not preclude their being studied objectively through rigorous scientific methods. The factual basis of a science of esthetics is not to settle whether some person or image is "objectively beautiful" but rather to determine whether some representative set or sets of individuals judge or experience him/her/it as beautiful or unattractive. The aim of this paper is to review the definitional, theoretical and methodological aspects pertaining to the perception of facial/dental attractiveness by a group of representative individuals. The first part lays down the basic principles of the perception of facial/dental attractiveness: the perception involves a jury, a field of investigation and a test providing quantitative data; the following general determinants of beauty perception are reviewed: the average morphology, the judge's cultural background, the numerology, the judge's ethnic origin. Indirect determinants are the dentition, the osseous architecture and the muscular envelope. Some disruptive factors might alter the judges' facial perception. They might be qualified as either peripheral to the face or psycho-social factors. Peripheral factors include hair style and color, skin hue, wrinkles, lip color... Psycho-social factors cover the personality of the subject being evaluated, his/her intelligence or behavior. The second part deals specifically with the methodology used to determine facial attractiveness and to correlate the latter with a specific morphology. Typically such a study aims to determine average esthetic preferences for some set of visual displays among a particular jury, given a specific task to judge esthetic quality or qualities. The sample being studied, the displays, the jury or juries, and the rating procedure must all be specified prior to collecting data. A specific emphasis will be given to the rating process and the associated morphometrics, the ultimate goal being to discriminate morphologies judged as attractive among our patients. © EDP Sciences, SFODF, 2014.

  13. The face is not an empty canvas: how facial expressions interact with facial appearance.

    PubMed

    Hess, Ursula; Adams, Reginald B; Kleck, Robert E

    2009-12-12

    Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

  14. Deficits in Degraded Facial Affect Labeling in Schizophrenia and Borderline Personality Disorder.

    PubMed

    van Dijke, Annemiek; van 't Wout, Mascha; Ford, Julian D; Aleman, André

    2016-01-01

    Although deficits in facial affect processing have been reported in schizophrenia as well as in borderline personality disorder (BPD), these disorders have not yet been directly compared on facial affect labeling. Using degraded stimuli portraying neutral, angry, fearful and happy facial expressions, we hypothesized more errors in labeling negative facial expressions in patients with schizophrenia compared to healthy controls. Patients with BPD were expected to have difficulty in labeling neutral expressions and to display a bias towards a negative attribution when wrongly labeling neutral faces. Patients with schizophrenia (N = 57) and patients with BPD (N = 30) were compared to patients with somatoform disorder (SoD, a psychiatric control group; N = 25) and healthy control participants (N = 41) on facial affect labeling accuracy and type of misattributions. Patients with schizophrenia showed deficits in labeling angry and fearful expressions compared to the healthy control group and patients with BPD showed deficits in labeling neutral expressions compared to the healthy control group. Schizophrenia and BPD patients did not differ significantly from each other when labeling any of the facial expressions. Compared to SoD patients, schizophrenia patients showed deficits on fearful expressions, but BPD patients did not significantly differ from SoD patients on any of the facial expressions. With respect to the type of misattributions, BPD patients mistook neutral expressions more often for fearful expressions compared to schizophrenia patients and healthy controls, and less often for happy compared to schizophrenia patients. These findings suggest that although schizophrenia and BPD patients demonstrate different as well as similar facial affect labeling deficits, BPD may be associated with a tendency to detect negative affect in neutral expressions.

  15. The asymmetric facial skin perfusion distribution of Bell's palsy discovered by laser speckle imaging technology.

    PubMed

    Cui, Han; Chen, Yi; Zhong, Weizheng; Yu, Haibo; Li, Zhifeng; He, Yuhai; Yu, Wenlong; Jin, Lei

    2016-01-01

    Bell's palsy is a peripheral nerve disease that causes abrupt onset of unilateral facial weakness. Pathologic studies have provided evidence of ischemia of the facial nerve on the affected side of the face in Bell's palsy patients. Since the direction of facial nerve blood flow is primarily proximal to distal, facial skin microcirculation would also be affected after the onset of Bell's palsy. Monitoring the full area of facial skin microcirculation would therefore help to characterize the condition of Bell's palsy patients. In this study, a non-invasive, real-time and full-field imaging technology, laser speckle imaging (LSI), was applied to measure the facial skin blood perfusion distribution of Bell's palsy patients. Eighty-five participants at different stages of Bell's palsy were included. Results showed that patients' facial skin perfusion on the affected side was lower than that on the normal side in the eyelid region, and that the asymmetric distribution of facial skin perfusion between the two sides of the eyelid was positively related to the stage of the disease (P < 0.001). During recovery, the perfusion of the affected eyelid increased to nearly the same level as the normal side. This study was a novel application of LSI to evaluating the facial skin perfusion of Bell's palsy patients; we found that facial skin blood perfusion could reflect the stage of Bell's palsy, suggesting that microcirculation should be investigated in patients with this neurological deficit. LSI is also suggested as a potential diagnostic tool for Bell's palsy.
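
    As an illustration of the asymmetry measure implied above, the sketch below computes a simple perfusion asymmetry index between the affected and normal eyelid regions; the index definition and the made-up perfusion values are assumptions, not the study's reported method.

      import numpy as np

      def perfusion_asymmetry(affected_roi, normal_roi):
          """(normal - affected) / (normal + affected); positive when the affected side is lower."""
          a = float(np.mean(affected_roi))
          n = float(np.mean(normal_roi))
          return (n - a) / (n + a)

      # Made-up laser speckle perfusion values for the two eyelid regions.
      affected = np.array([180.0, 175.0, 182.0])
      normal = np.array([230.0, 228.0, 235.0])
      print(round(perfusion_asymmetry(affected, normal), 3))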

  16. How Context Influences Our Perception of Emotional Faces: A Behavioral Study on the Kuleshov Effect.

    PubMed

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel; Siri, Francesca; Umiltà, Maria A; Gallese, Vittorio

    2017-01-01

    Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions corresponding to one of the so called 'basic emotions.' However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have only used static images as stimuli. Our study used a more ecological design with participants watching film sequences of neutral faces, crosscut with scenes of strong emotional content (evoking happiness or fear, plus neutral stimuli as a baseline condition). The task was to rate the emotion displayed by a target person's face in terms of valence, arousal, and category. Results clearly demonstrated the presence of a significant effect in terms of both valence and arousal in the fear condition only. Moreover, participants tended to categorize the target person's neutral facial expression choosing the emotion category congruent with the preceding context. Our results highlight the context-sensitivity of emotions and the importance of studying them under ecologically valid conditions.

  17. Emotion Perception in Asperger's Syndrome and High-Functioning Autism: The Importance of Diagnostic Criteria and Cue Intensity

    ERIC Educational Resources Information Center

    Mazefsky, Carla A.; Oswald, Donald P.

    2007-01-01

    This study compared emotion perception accuracy between children with Asperger's syndrome (AS) and high-functioning autism (HFA). Thirty children were diagnosed with AS or HFA based on empirically supported diagnostic criteria and administered an emotion perception test consisting of facial expressions and tone of voice cues that varied in…

  18. Biased recognition of facial affect in patients with major depressive disorder reflects clinical state.

    PubMed

    Münkler, Paula; Rothkirch, Marcus; Dalati, Yasmin; Schmack, Katharina; Sterzer, Philipp

    2015-01-01

    Cognitive theories of depression posit that perception is negatively biased in depressive disorder. Previous studies have provided empirical evidence for this notion, but left open the question whether the negative perceptual bias reflects a stable trait or the current depressive state. Here we investigated the stability of negatively biased perception over time. Emotion perception was examined in patients with major depressive disorder (MDD) and healthy control participants in two experiments. In the first experiment subjective biases in the recognition of facial emotional expressions were assessed. Participants were presented with faces that were morphed between sad and neutral and happy expressions and had to decide whether the face was sad or happy. The second experiment assessed automatic emotion processing by measuring the potency of emotional faces to gain access to awareness using interocular suppression. A follow-up investigation using the same tests was performed three months later. In the emotion recognition task, patients with major depression showed a shift in the criterion for the differentiation between sad and happy faces: In comparison to healthy controls, patients with MDD required a greater intensity of the happy expression to recognize a face as happy. After three months, this negative perceptual bias was reduced in comparison to the control group. The reduction in negative perceptual bias correlated with the reduction of depressive symptoms. In contrast to previous work, we found no evidence for preferential access to awareness of sad vs. happy faces. Taken together, our results indicate that MDD-related perceptual biases in emotion recognition reflect the current clinical state rather than a stable depressive trait.
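
    One common way to express the criterion shift described above is to fit a psychometric function to the proportion of "happy" responses across the sad-to-happy morph continuum and read off the point of subjective equality (PSE); a PSE shifted toward the happy end means more happy intensity is needed before a face is judged happy. The sketch below is a generic illustration with made-up data, not the study's analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Proportion of 'happy' responses as a function of morph level (0 = sad, 1 = happy)."""
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

morph_levels = np.linspace(0, 1, 9)            # 0 = fully sad, 1 = fully happy
# Hypothetical response proportions for one participant.
p_happy = np.array([0.02, 0.05, 0.10, 0.20, 0.40, 0.65, 0.85, 0.95, 0.99])

(pse, slope), _ = curve_fit(logistic, morph_levels, p_happy, p0=[0.5, 0.1])
print(f"PSE = {pse:.2f} (values > 0.5 indicate a bias away from judging faces as happy)")
```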

  19. Identifying Facial Emotions: Valence Specific Effects and an Exploration of the Effects of Viewer Gender

    ERIC Educational Resources Information Center

    Jansari, Ashok; Rodway, Paul; Goncalves, Salvador

    2011-01-01

    The valence hypothesis suggests that the right hemisphere is specialised for negative emotions and the left hemisphere is specialised for positive emotions (Silberman & Weingartner, 1986). It is unclear to what extent valence-specific effects in facial emotion perception depend upon the gender of the perceiver. To explore this question 46…

  20. [Measuring impairment of facial affects recognition in schizophrenia. Preliminary study of the facial emotions recognition task (TREF)].

    PubMed

    Gaudelus, B; Virgile, J; Peyroux, E; Leleu, A; Baudouin, J-Y; Franck, N

    2015-06-01

    The impairment of social cognition, including facial affects recognition, is a well-established trait in schizophrenia, and specific cognitive remediation programs focusing on facial affects recognition have been developed by different teams worldwide. However, even though social cognitive impairments have been confirmed, previous studies have also shown heterogeneity of the results between different subjects. Therefore, assessment of personal abilities should be measured individually before proposing such programs. Most research teams apply tasks based on facial affects recognition by Ekman et al. or Gur et al. However, these tasks are not easily applicable in a clinical exercise. Here, we present the Facial Emotions Recognition Test (TREF), which is designed to identify facial affects recognition impairments in a clinical practice. The test is composed of 54 photos and evaluates abilities in the recognition of six universal emotions (joy, anger, sadness, fear, disgust and contempt). Each of these emotions is represented with colored photos of 4 different models (two men and two women) at nine intensity levels from 20 to 100%. Each photo is presented during 10 seconds; no time limit for responding is applied. The present study compared the scores of the TREF test in a sample of healthy controls (64 subjects) and people with stabilized schizophrenia (45 subjects) according to the DSM IV-TR criteria. We analysed global scores for all emotions, as well as sub scores for each emotion between these two groups, taking into account gender differences. Our results were coherent with previous findings. Applying TREF, we confirmed an impairment in facial affects recognition in schizophrenia by showing significant differences between the two groups in their global results (76.45% for healthy controls versus 61.28% for people with schizophrenia), as well as in sub scores for each emotion except for joy. Scores for women were significantly higher than for men in the population without psychiatric diagnosis. The study also allowed the identification of cut-off scores; results below 2 standard deviations of the healthy control average (61.57%) pointed to a facial affect recognition deficit. The TREF appears to be a useful tool to identify facial affects recognition impairment in schizophrenia. Neuropsychologists, who have tried this task, have positive feedback. The TREF is easy to use (duration of about 15 minutes), easy to apply in subjects with attentional difficulties, and tests facial affects recognition at ecological intensity levels. These results have to be confirmed in the future with larger sample sizes, and in comparison with other tasks, evaluating the facial affects recognition processes. Copyright © 2014 L’Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  1. Facial averageness and genetic quality: Testing heritability, genetic correlation with attractiveness, and the paternal age effect.

    PubMed

    Lee, Anthony J; Mitchem, Dorian G; Wright, Margaret J; Martin, Nicholas G; Keller, Matthew C; Zietsch, Brendan P

    2016-01-01

    Popular theory suggests that facial averageness is preferred in a partner for genetic benefits to offspring. However, whether facial averageness is associated with genetic quality is yet to be established. Here, we computed an objective measure of facial averageness for a large sample (N = 1,823) of identical and nonidentical twins and their siblings to test two predictions from the theory that facial averageness reflects genetic quality. First, we use biometrical modelling to estimate the heritability of facial averageness, which is necessary if it reflects genetic quality. We also test for a genetic association between facial averageness and facial attractiveness. Second, we assess whether paternal age at conception (a proxy of mutation load) is associated with facial averageness and facial attractiveness. Our findings are mixed with respect to our hypotheses. While we found that facial averageness does have a genetic component, and a significant phenotypic correlation exists between facial averageness and attractiveness, we did not find a genetic correlation between facial averageness and attractiveness (therefore, we cannot say that the genes that affect facial averageness also affect facial attractiveness), and paternal age at conception was not negatively associated with facial averageness. These findings support some of the previously untested assumptions of the 'genetic benefits' account of facial averageness, but cast doubt on others.
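
    The heritability estimate mentioned above comes from biometrical twin modelling; as a rough illustration of the underlying logic, Falconer's classic approximation estimates heritability from the difference between identical (MZ) and non-identical (DZ) twin correlations. This is a simplified stand-in, not the structural equation modelling the authors used, and the correlation values below are made up.

```python
# Falconer's approximation: h^2 ~ 2 * (r_MZ - r_DZ),
# shared environment c^2 ~ 2 * r_DZ - r_MZ, and unique environment e^2 = 1 - r_MZ.
r_mz = 0.55   # hypothetical MZ twin correlation for facial averageness
r_dz = 0.30   # hypothetical DZ twin correlation

h2 = 2 * (r_mz - r_dz)
c2 = 2 * r_dz - r_mz
e2 = 1 - r_mz
print(f"h^2 = {h2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
```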

  2. Test-retest reliability of subliminal facial affective priming.

    PubMed

    Dannlowski, Udo; Suslow, Thomas

    2006-02-01

    Since the seminal 1993 demonstrations of Murphy and Zajonc, researchers have replicated and extended findings concerning subliminal affective priming. So far, however, no data on the test-retest reliability of affective priming effects are available. A subliminal facial affective priming task was administered to 22 healthy individuals (15 women and 7 men) twice, about 7 weeks apart. Happy and sad facial expressions were used as affective primes, and neutral Chinese ideographs served as target masks, which had to be evaluated. Neutral facial primes and a no-face condition served as baselines. All participants reported not having seen any of the prime faces at either testing session. Priming scores for affective faces compared to the baselines were computed. Acceptable test-retest correlations (rs) of up to .74 were found for the affective priming scores. Although measured almost 2 months apart, subliminal affective priming seems to be a temporally stable effect.

  3. Age, Health and Attractiveness Perception of Virtual (Rendered) Human Hair

    PubMed Central

    Fink, Bernhard; Hufschmidt, Carla; Hirn, Thomas; Will, Susanne; McKelvey, Graham; Lankhof, John

    2016-01-01

    The social significance of physical appearance and beauty has been documented in many studies. It is known that even subtle manipulations of facial morphology and skin condition can alter people’s perception of a person’s age, health and attractiveness. While the variation in facial morphology and skin condition cues has been studied quite extensively, comparatively little is known about the effect of hair on social perception. This has been partly caused by the technical difficulty of creating appropriate stimuli for investigating people’s responses to systematic variation of certain hair characteristics, such as color and style, while keeping other features constant. Here, we present a modeling approach to the investigation of human hair perception using computer-generated, virtual (rendered) human hair. In three experiments, we manipulated hair diameter (Experiment 1), hair density (Experiment 2), and hair style (Experiment 3) of human (female) head hair and studied perceptions of age, health and attractiveness. Our results show that even subtle changes in these features have an impact on hair perception. We discuss our findings with reference to previous studies on condition-dependent quality cues in women that influence human social perception, thereby suggesting that hair is a salient feature of human physical appearance, which contributes to the perception of beauty. PMID:28066276

  4. Enhanced Facial Symmetry Assessment in Orthodontists

    PubMed Central

    Jackson, Tate H.; Clark, Kait; Mitroff, Stephen R.

    2013-01-01

    Assessing facial symmetry is an evolutionarily important process, which suggests that individual differences in this ability should exist. As existing data are inconclusive, the current study explored whether a group trained in facial symmetry assessment, orthodontists, possessed enhanced abilities. Symmetry assessment was measured using face and non-face stimuli among orthodontic residents and two control groups: university participants with no symmetry training and airport security luggage screeners, a group previously shown to possess expert visual search skills unrelated to facial symmetry. Orthodontic residents were more accurate at assessing symmetry in both upright and inverted faces compared to both control groups, but not for non-face stimuli. These differences are not likely due to motivational biases or a speed-accuracy tradeoff—orthodontic residents were slower than the university participants but not the security screeners. Understanding such individual differences in facial symmetry assessment may inform the perception of facial attractiveness. PMID:24319342

  5. Learning representative features for facial images based on a modified principal component analysis

    NASA Astrophysics Data System (ADS)

    Averkin, Anton; Potapov, Alexey

    2013-05-01

    The paper is devoted to facial image analysis and particularly deals with the problem of automatically evaluating the attractiveness of human faces. We propose a new approach for automatic construction of a feature space based on a modified principal component analysis. The input to the algorithm is a learning data set of facial images rated by one person. The proposed approach allows one to extract features of an individual's subjective perception of facial beauty and to predict attractiveness values for new facial images that were not included in the learning data set. The Pearson correlation coefficient between the values predicted by our method for new facial images and the personal attractiveness ratings equals 0.89. This suggests that the proposed approach is promising and can be used for predicting subjective facial attractiveness values in real facial image analysis systems.
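
    As a rough sketch of the kind of pipeline the abstract describes (principal components of face images used as features to predict a single rater's attractiveness scores, evaluated by Pearson correlation), the code below uses standard PCA and ridge regression. The data shapes, names, and the plain PCA/regression combination are assumptions, since the paper's "modified" PCA is not specified here; with the synthetic noise data below the correlation will be near zero, whereas real rated face images are where a meaningful value would appear.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from scipy.stats import pearsonr

# Hypothetical data: 500 flattened grayscale face images (64x64) and one rater's scores.
rng = np.random.default_rng(42)
faces = rng.random((500, 64 * 64))
ratings = rng.uniform(1, 10, size=500)

X_train, X_test, y_train, y_test = train_test_split(faces, ratings, random_state=0)

pca = PCA(n_components=50).fit(X_train)              # learn a low-dimensional face space
model = Ridge(alpha=1.0).fit(pca.transform(X_train), y_train)

predicted = model.predict(pca.transform(X_test))     # attractiveness for unseen faces
r, _ = pearsonr(predicted, y_test)
print(f"Pearson r between predicted and actual ratings: {r:.2f}")
```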

  6. Subject independent facial expression recognition with robust face detection using a convolutional neural network.

    PubMed

    Matsugu, Masakazu; Mori, Katsuhiko; Mitari, Yusuke; Kaneda, Yuji

    2003-01-01

    Reliable detection of ordinary facial expressions (e.g. smiling) despite variability among individuals as well as in face appearance is an important step toward the realization of a perceptual user interface with autonomous perception of persons. We describe a rule-based algorithm for robust facial expression recognition combined with robust face detection using a convolutional neural network. In this study, we address the problem of subject independence as well as translation, rotation, and scale invariance in the recognition of facial expression. The results show reliable detection of smiles with a recognition rate of 97.6% for 5600 still images of more than 10 subjects. The proposed algorithm demonstrated the ability to discriminate smiling from talking based on the saliency score obtained from voting of visual cues. To the best of our knowledge, it is the first facial expression recognition model with the property of subject independence combined with robustness to variability in facial appearance.
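
    The abstract combines face detection and expression recognition in a convolutional network; as a minimal, generic illustration of a CNN classifier for facial expressions (not the authors' architecture), the PyTorch sketch below maps a face crop to expression logits such as "smile" vs. "non-smile". All layer sizes, the input resolution, and the class count are assumptions.

```python
import torch
import torch.nn as nn

class SmallExpressionCNN(nn.Module):
    """Toy CNN: 64x64 grayscale face crop -> expression logits."""
    def __init__(self, num_classes: int = 2):   # e.g., smile vs. non-smile
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = SmallExpressionCNN()
logits = model(torch.randn(4, 1, 64, 64))      # batch of 4 hypothetical face crops
print(logits.shape)                            # torch.Size([4, 2])
```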

  7. Near-optimal integration of facial form and motion.

    PubMed

    Dobs, Katharina; Ma, Wei Ji; Reddy, Leila

    2017-09-08

    Human perception consists of the continuous integration of sensory cues pertaining to the same object. While it has been fairly well shown that humans use an optimal strategy when integrating low-level cues proportional to their relative reliability, the integration processes underlying high-level perception are much less understood. Here we investigate cue integration in a complex high-level perceptual system, the human face processing system. We tested cue integration of facial form and motion in an identity categorization task and found that an optimal model could successfully predict subjects' identity choices. Our results suggest that optimal cue integration may be implemented across different levels of the visual processing hierarchy.
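
    The "optimal" integration referred to above is usually formalized as reliability-weighted averaging: each cue's estimate is weighted by its reliability (inverse variance), which minimizes the variance of the combined estimate. Below is a generic numerical illustration of that rule with made-up form and motion cue values; it is not the authors' model code.

```python
import numpy as np

def optimal_combination(estimates, sigmas):
    """Reliability-weighted average of cue estimates.

    estimates: per-cue estimates of the same quantity (e.g., identity evidence)
    sigmas:    per-cue standard deviations (noisier cue -> lower weight)
    Returns the combined estimate and its standard deviation.
    """
    estimates, sigmas = np.asarray(estimates, float), np.asarray(sigmas, float)
    reliabilities = 1.0 / sigmas**2
    weights = reliabilities / reliabilities.sum()
    combined = np.sum(weights * estimates)
    combined_sigma = np.sqrt(1.0 / reliabilities.sum())
    return combined, combined_sigma

# Hypothetical facial-form and facial-motion cues about the same identity.
est, sd = optimal_combination(estimates=[0.8, 0.4], sigmas=[0.1, 0.3])
print(f"combined estimate = {est:.3f}, combined sigma = {sd:.3f}")
```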

  8. Bihippocampal damage with emotional dysfunction: impaired auditory recognition of fear.

    PubMed

    Ghika-Schmid, F; Ghika, J; Vuilleumier, P; Assal, G; Vuadens, P; Scherer, K; Maeder, P; Uske, A; Bogousslavsky, J

    1997-01-01

    A right-handed man developed a sudden, transient amnestic syndrome associated with bilateral hemorrhage of the hippocampi, probably due to Urbach-Wiethe disease. In the 3rd month, despite significant hippocampal structural damage on imaging, only a milder degree of retrograde and anterograde amnesia persisted on detailed neuropsychological examination. On systematic testing of the recognition of facial and vocal expressions of emotion, we found an impairment of the vocal perception of fear, but not of other emotions such as joy, sadness and anger. This selective impairment of fear perception was not present in the recognition of facial expressions of emotion. Thus emotional perception varies according to different aspects of emotions and the modality of presentation (faces versus voices). This is consistent with the idea that there may be multiple emotion systems. The study of emotional perception in this unique case of bilateral involvement of the hippocampus suggests that this structure may play a critical role in the recognition of fear in vocal expression, possibly dissociated from that of other emotions and from that of fear in facial expression. In light of recent data suggesting that the amygdala plays a role in the recognition of fear in the auditory as well as the visual modality, this could suggest that the hippocampus may be part of the auditory pathway for fear recognition.

  9. Perceptions of brachyfacial, mesofacial and dolichofacial individuals with regard to the buccal corridor in different facial types

    PubMed Central

    PITHON, Matheus Melo; da MATA, Kayure Rocha; ROCHA, Karina Silva; COSTA, Brenda do Nascimento; NEVES, Fernando; BARBOSA, George Caique Gouveia; COQUEIRO, Raildo da Silva

    2014-01-01

    Objective Evaluate the esthetic perception and attractiveness of the smile with regard to the buccal corridor in different facial types by brachyfacial, mesofacial and dolichofacial individuals. Material and Methods The image of a smiling individual with a mesofacial type of face was changed to create three different facial types with five different buccal corridors (2%, 10%, 15%, 22% and 28%). To achieve this effect, photo editing software was used (Adobe Photoshop, Adobe Systems Inc, San Francisco, CA, USA). The images were submitted to evaluators with brachyfacial, mesofacial and dolichofacial types of faces, who evaluated the degree of esthetic perception and attractiveness by means of a visual analog scale measuring 70 mm. The differences between evaluators were verified by the Mann-Whitney test. All statistics were performed with a confidence level of 95%. Results Brachyfacial individuals perceived mesofacial and dolichofacial types of faces with a buccal corridor of 2% as more attractive. Mesofacial individuals perceived mesofacial and dolichofacial types of faces with buccal corridors of 2%, 10% and 15% as more attractive. Dolichofacial individuals perceived the mesofacial type of face with a buccal corridor of 2% as more attractive. Female evaluators generally attributed higher scores than male evaluators. Conclusion To achieve an enhanced esthetic smile it is necessary to observe the patient's facial type. Narrow buccal corridors are an esthetic characteristic preferred by both men and women, whereas wide buccal corridors are less attractive. PMID:25466472

  10. Perceptions of brachyfacial, mesofacial and dolichofacial individuals with regard to the buccal corridor in different facial types.

    PubMed

    Pithon, Matheus Melo; Mata, Kayure Rocha da; Rocha, Karina Silva; Costa, Brenda do Nascimento; Neves, Fernando; Barbosa, George Caique Gouveia; Coqueiro, Raildo da Silva

    2014-01-01

    Evaluate the esthetic perception and attractiveness of the smile with regard to the buccal corridor in different facial types by brachyfacial, mesofacial and dolichofacial individuals. The image of a smiling individual with a mesofacial type of face was changed to create three different facial types with five different buccal corridors (2%, 10%, 15%, 22% and 28%). To achieve this effect, photo editing software was used (Adobe Photoshop, Adobe Systems Inc, San Francisco, CA, USA). The images were submitted to evaluators with brachyfacial, mesofacial and dolichofacial types of faces, who evaluated the degree of esthetic perception and attractiveness by means of a visual analog scale measuring 70 mm. The differences between evaluators were verified by the Mann-Whitney test. All statistics were performed with a confidence level of 95%. Brachyfacial individuals perceived mesofacial and dolichofacial types of faces with a buccal corridor of 2% as more attractive. Mesofacial individuals perceived mesofacial and dolichofacial types of faces with buccal corridors of 2%, 10% and 15% as more attractive. Dolichofacial individuals perceived the mesofacial type of face with a buccal corridor of 2% as more attractive. Female evaluators generally attributed higher scores than male evaluators. To achieve an enhanced esthetic smile it is necessary to observe the patient's facial type. Narrow buccal corridors are an esthetic characteristic preferred by both men and women, whereas wide buccal corridors are less attractive.

  11. Face and body perception in schizophrenia: a configural processing deficit?

    PubMed

    Soria Bauser, Denise; Thoma, Patrizia; Aizenberg, Victoria; Brüne, Martin; Juckel, Georg; Daum, Irene

    2012-01-30

    Face and body perception rely on common processing mechanisms and activate similar but not identical brain networks. Patients with schizophrenia show impaired face perception, and the present study addressed for the first time body perception in this group. Seventeen patients diagnosed with schizophrenia or schizoaffective disorder were compared to 17 healthy controls on standardized tests assessing basic face perception skills (identity discrimination, memory for faces, recognition of facial affect). A matching-to-sample task including emotional and neutral faces, bodies and cars either in an upright or in an inverted position was administered to assess potential category-specific performance deficits and impairments of configural processing. Relative to healthy controls, schizophrenia patients showed poorer performance on the tasks assessing face perception skills. In the matching-to-sample task, they also responded more slowly and less accurately than controls, regardless of the stimulus category. Accuracy analysis showed significant inversion effects for faces and bodies across groups, reflecting configural processing mechanisms; however reaction time analysis indicated evidence of reduced inversion effects regardless of category in schizophrenia patients. The magnitude of the inversion effects was not related to clinical symptoms. Overall, the data point towards reduced configural processing, not only for faces but also for bodies and cars in individuals with schizophrenia. © 2011 Elsevier Ltd. All rights reserved.

  12. Amphetamine as a social drug: Effects of d-amphetamine on social processing and behavior

    PubMed Central

    Wardle, Margaret C.; Garner, Matthew J.; Munafò, Marcus R.; de Wit, Harriet

    2012-01-01

    Rationale Drug users often report using drugs to enhance social situations, and empirical studies support the idea that drugs increase both social behavior and the value of social interactions. One way drugs may affect social behavior is by altering social processing, for example by decreasing perceptions of negative emotion in others. Objectives We examined effects of d-amphetamine on processing of emotional facial expressions, and on the social behavior of talking. We predicted amphetamine would enhance attention, identification and responsivity to positive expressions, and that this in turn would predict increased talkativeness. Methods Over three sessions, 36 healthy normal adults received placebo, 10 mg, and 20 mg d-amphetamine under counterbalanced double-blind conditions. At each session we measured processing of happy, fearful, sad and angry expressions using an attentional visual probe task, a dynamic emotion identification task, and measures of facial muscle activity. We also measured talking. Results Amphetamine decreased the threshold for identifying all emotions, increased negative facial responses to sad expressions, and increased talkativeness. Contrary to our hypotheses, amphetamine did not alter attention to, identification of or facial responses to positive emotions specifically. Interestingly, the drug decreased the threshold to identify all emotions, and this effect was uniquely related to increased talkativeness, even after controlling for overall sensitivity to amphetamine. Conclusions The results suggest that amphetamine may encourage sociability by increasing sensitivity to subtle emotional expressions. These findings suggest novel social mechanisms that may contribute to the rewarding effects of amphetamine. PMID:22526538

  13. Children's Facial Trustworthiness Judgments: Agreement and Relationship with Facial Attractiveness

    PubMed Central

    Ma, Fengling; Xu, Fen; Luo, Xianming

    2016-01-01

    This study examined developmental changes in children's abilities to make trustworthiness judgments based on faces and the relationship between a child's perception of trustworthiness and facial attractiveness. One hundred and one 8-, 10-, and 12-year-olds, along with 37 undergraduates, were asked to judge the trustworthiness of 200 faces. Next, they issued facial attractiveness judgments. The results indicated that children made consistent trustworthiness and attractiveness judgments based on facial appearance, but agreement with adults and within-age agreement levels of facial judgments increased with age. Additionally, the agreement levels of judgments made by girls were higher than those made by boys. Furthermore, the relationship between trustworthiness and attractiveness judgments increased with age, and the relationship between the two judgments was closer for girls than for boys. These findings suggest that face-based trait judgment ability develops throughout childhood and that, like adults, children may use facial attractiveness as a heuristic cue that signals a stranger's trustworthiness. PMID:27148111

  14. Mesial temporal lobe epilepsy diminishes functional connectivity during emotion perception.

    PubMed

    Steiger, Bettina K; Muller, Angela M; Spirig, Esther; Toller, Gianina; Jokeit, Hennric

    2017-08-01

    Unilateral mesial temporal lobe epilepsy (MTLE) has been associated with impaired recognition of emotional facial expressions. Correspondingly, imaging studies showed decreased activity of the amygdala and cortical face processing regions in response to emotional faces. However, functional connectivity among regions involved in emotion perception has not been studied so far. To address this, we examined intrinsic functional connectivity (FC) modulated by the perception of dynamic fearful faces among the amygdala and limbic, frontal, temporal and brainstem regions. Regions of interest were identified in an activation analysis by presenting a block-design with dynamic fearful faces and dynamic landscapes to 15 healthy individuals. This led to 10 predominantly right-hemispheric regions. Functional connectivity between these regions during the perception of fearful faces was examined in drug-refractory patients with left- (n=16) or right-sided (n=17) MTLE, epilepsy patients with extratemporal seizure onset (n=15) and a second group of 15 healthy controls. Healthy controls showed a widespread functional network modulated by the perception of fearful faces that encompassed bilateral amygdalae, limbic, cortical, subcortical and brainstem regions. In patients with left MTLE, a downsized network of frontal and temporal regions centered on the right amygdala was present. Patients with right MTLE showed almost no significant functional connectivity. A maintained network in the epilepsy control group indicates that findings in mesial temporal lobe epilepsy could not be explained by clinical factors such as seizures and antiepileptic medication. Functional networks underlying facial emotion perception are considerably changed in left and right MTLE. Alterations are present for both hemispheres in either MTLE group, but are more pronounced in right MTLE. Disruption of the functional network architecture possibly contributes to deficits in facial emotion recognition frequently reported in MTLE. Copyright © 2017 Elsevier B.V. All rights reserved.
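
    Functional connectivity between regions of interest, as analyzed above, is commonly quantified as the pairwise correlation of ROI time series within a condition (here, during fearful-face blocks). The sketch below illustrates that generic computation on made-up data; it is not the study's pipeline, and the ROI labels and array shapes are assumptions.

```python
import numpy as np

# Hypothetical ROI time series: 10 regions x 200 fMRI volumes acquired
# while participants viewed dynamic fearful faces.
rng = np.random.default_rng(1)
roi_timeseries = rng.standard_normal((10, 200))

# Functional connectivity matrix: Pearson correlation between every pair of ROIs.
fc_matrix = np.corrcoef(roi_timeseries)
print(fc_matrix.shape)                                  # (10, 10)
print(f"example ROI-pair coupling: {fc_matrix[0, 1]:.2f}")
```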

  15. Facial cues to perceived height influence leadership choices in simulated war and peace contexts.

    PubMed

    Re, Daniel E; DeBruine, Lisa M; Jones, Benedict C; Perrett, David I

    2013-01-31

    Body size and other signs of physical prowess are associated with leadership hierarchies in many social species. Here we (1) assess whether facial cues associated with perceived height and masculinity have different effects on leadership judgments in simulated wartime and peacetime contexts and (2) test how facial cues associated with perceived height and masculinity influence dominance perceptions. Results indicate that cues associated with perceived height and masculinity in potential leaders' faces are valued more in a wartime (vs. peacetime) context. Furthermore, increasing cues of apparent height and masculinity in faces increased perceived dominance. Together, these findings suggest that facial cues of physical stature contribute to establishing leadership hierarchies in humans.

  16. The association between PTSD and facial affect recognition.

    PubMed

    Williams, Christian L; Milanak, Melissa E; Judah, Matt R; Berenbaum, Howard

    2018-05-05

    The major aim of this study was to examine whether higher levels of PTSD are associated with performance on a facial affect recognition task in which facial expressions of emotion are superimposed on emotionally valenced, non-face images. College students with trauma histories (N = 90) completed a facial affect recognition task as well as measures of exposure to traumatic events and PTSD symptoms. When the face and context matched, participants with higher levels of PTSD were significantly more accurate. When the face and context were mismatched, participants with lower levels of PTSD were more accurate than those with higher levels of PTSD. These findings suggest that PTSD is associated with how people process affective information. Furthermore, these results suggest that the enhanced attention of people with higher levels of PTSD to affective information can be either beneficial or detrimental to their ability to accurately identify facial expressions of emotion. Limitations, future directions and clinical implications are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. How affective information from faces and scenes interacts in the brain

    PubMed Central

    Vandenbulcke, Mathieu; Sinke, Charlotte B. A.; Goebel, Rainer; de Gelder, Beatrice

    2014-01-01

    Facial expression perception can be influenced by the natural visual context in which the face is perceived. We performed an fMRI experiment presenting participants with fearful or neutral faces against threatening or neutral background scenes. Triangles and scrambled scenes served as control stimuli. The results showed that the valence of the background influences face-selective activity in the right anterior parahippocampal place area (PPA) and subgenual anterior cingulate cortex (sgACC), with higher activation for neutral backgrounds compared to threatening backgrounds (controlled for isolated background effects), and that this effect correlated with trait empathy in the sgACC. In addition, the left fusiform gyrus (FG) responds to the affective congruence between face and background scene. The results show that the valence of the background modulates face processing and support the hypothesis that empathic processing in the sgACC is inhibited when affective information is present in the background. In addition, the findings reveal a pattern of complex scene perception showing a gradient of functional specialization along the posterior-anterior axis: from sensitivity to the affective content of scenes (extrastriate body area, EBA, and posterior PPA), through scene emotion-face emotion interaction (left FG) and category-scene interaction (anterior PPA), to scene-category-personality interaction (sgACC). PMID:23956081

  18. Perceiving emotional expressions in others: Activation likelihood estimation meta-analyses of explicit evaluation, passive perception and incidental perception of emotions.

    PubMed

    Dricu, Mihai; Frühholz, Sascha

    2016-12-01

    We conducted a series of activation likelihood estimation (ALE) meta-analyses to determine the commonalities and distinctions between separate levels of emotion perception, namely incidental perception, passive perception, and explicit evaluation of emotional expressions. Pooling together more than 180 neuroimaging experiments using facial, vocal or body expressions, our results are threefold. First, explicitly evaluating the emotions of others recruits brain regions associated with the sensory processing of expressions, such as the inferior occipital gyrus, middle fusiform gyrus and the superior temporal gyrus, and brain regions involved in low-level and high-level mindreading, namely the posterior superior temporal sulcus, the inferior frontal cortex and dorsomedial frontal cortex. Second, we show that only the sensory regions were also consistently active during the passive perception of emotional expressions. Third, we show that the brain regions involved in mindreading were active during the explicit evaluation of both facial and vocal expressions. We discuss these results in light of the existing literature and conclude by proposing a cognitive model for perceiving and evaluating the emotions of others. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Smile attractiveness related to buccal corridor space in 3 different facial types: A perception of 3 ethnic groups of Malaysians.

    PubMed

    Nimbalkar, Smita; Oh, Yih Y; Mok, Reei Y; Tioh, Jing Y; Yew, Kai J; Patil, Pravinkumar G

    2018-03-16

    Buccal corridor space and its variations greatly influence smile attractiveness. Facial types differ across ethnic populations, and so does the perception of smile attractiveness. The subjective perception of smile attractiveness in different populations may vary with different buccal corridor spaces and facial patterns. The purpose of this study was to determine esthetic perceptions of the Malaysian population regarding the width of buccal corridor spaces and their effect on smile esthetics in individuals with short, normal, and long faces. The image of a smiling individual with a mesofacial face was modified to create 2 different facial types (brachyfacial and dolichofacial). Each face form was further modified into 5 different buccal corridors (2%, 10%, 15%, 22%, and 28%). The images were submitted to 3 different ethnic groups of evaluators (Chinese, Malay, Indian; 100 each), ranging between 17 and 21 years of age. A visual analog scale (50 mm in length) was used for assessment. The scores given to each image were compared with the Kruskal-Wallis test, and pairwise comparisons were performed using the Mann-Whitney U test (α=.05). All 3 groups of evaluators could distinguish gradations of dark space in the buccal corridor at 2%, 10%, and 28%. Statistically significant differences in esthetic perception were observed among the 3 groups of evaluators when pairwise comparisons were performed. A 15% buccal corridor was scored as equally esthetic across the 3 face types by all 3 groups of evaluators. The Indian evaluators were more critical than the Chinese or Malay evaluators. In pairwise comparisons, the most significant differences were found when the normal face was compared separately with the long and short faces. The width of the buccal corridor space influences smile attractiveness in different facial types. A medium buccal corridor (15%) is the esthetic characteristic preferred by all groups of evaluators in short, normal, and long face types. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
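
    The nonparametric comparison described above (a Kruskal-Wallis test across the three evaluator groups, followed by pairwise Mann-Whitney U tests) can be run with standard SciPy calls. The sketch below uses made-up visual analog scale scores and is only an illustration of that analysis scheme, not the study's data.

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(7)
# Hypothetical VAS scores (0-50 mm) for one image, by evaluator group.
chinese = rng.uniform(20, 45, size=100)
malay = rng.uniform(18, 42, size=100)
indian = rng.uniform(15, 40, size=100)

h_stat, p_overall = kruskal(chinese, malay, indian)
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_overall:.3f}")

# Pairwise follow-up comparisons (alpha = .05; ideally corrected for multiplicity).
for name, (a, b) in {"Chinese vs Malay": (chinese, malay),
                     "Chinese vs Indian": (chinese, indian),
                     "Malay vs Indian": (malay, indian)}.items():
    u_stat, p = mannwhitneyu(a, b)
    print(f"{name}: U = {u_stat:.1f}, p = {p:.3f}")
```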

  20. Decoding of Emotion through Facial Expression, Prosody and Verbal Content in Children and Adolescents with Asperger's Syndrome

    ERIC Educational Resources Information Center

    Lindner, Jennifer L.; Rosen, Lee A.

    2006-01-01

    This study examined differences in the ability to decode emotion through facial expression, prosody, and verbal content between 14 children with Asperger's Syndrome (AS) and 16 typically developing peers. The ability to decode emotion was measured by the Perception of Emotion Test (POET), which portrayed the emotions of happy, angry, sad, and…

  1. [The role of experience in the neurology of facial expression of emotions].

    PubMed

    Gordillo, Fernando; Pérez, Miguel A; Arana, José M; Mestas, Lilia; López, Rafael M

    2015-04-01

    Facial expression of emotion has an important social function that facilitates interaction between people. This process has a neurological basis, which is not isolated from the context or from the experience of interaction between people in that context. Yet, to date, the impact that experience has on the perception of emotions is not completely understood. This article discusses the role of experience in the recognition of facial expressions of emotion and analyzes biases in emotional perception. The maturation of the structures that support the ability to recognize emotion goes through a sensitive period during adolescence, when experience may have a greater impact on emotional recognition. Experiences of abuse, neglect, war, and stress generate a bias towards expressions of anger and sadness. Similarly, positive experiences generate a bias towards expressions of happiness. Only when people are able to use the facial expression of emotions as a channel for understanding others will they be able to interact appropriately with their environment. This environment, in turn, will provide experiences that modulate this capacity. It is therefore a self-regulatory process that can be guided through the implementation of intervention programs addressing emotional aspects.

  2. Intrapersonal and Interpersonal Concomitants of Facial Blushing during Everyday Social Encounters

    PubMed Central

    aan het Rot, Marije; Moskowitz, D. S.; de Jong, Peter J.

    2015-01-01

    Facial blushing may usually be undesirable but may have an ameliorative function for some individuals under some circumstances. Researchers have studied the blush in laboratory settings, but not in daily life. In the present research, conducted with young adults, we employed for the first time an event-contingent recording method for assessing facial blushing during everyday social encounters. Blushing was associated with feeling embarrassed, ashamed, and exposed. These findings, though based on correlational analyses, are consistent with the idea that blushing is often unpleasant and can be maladaptive, and may contribute to the common belief that blushing is an undesirable response. Frequent blushers generally reported lower levels of dominant behavior, higher levels of submissive behavior, and perceived their social interaction partners as more powerful and less affiliative. This was independent of whether they blushed or not, suggesting that altered social behaviors and perceptions are associated with blushing-associated traits rather than with the blushing state. The experience of the blush varied as a function of the frequency with which a person blushed. Blushing was associated with higher levels of shame in frequent blushers than in infrequent blushers. In infrequent blushers, blushing was associated with higher levels of pleasant affect, suggesting that for infrequent blushers the blush may occur in positive social encounters. PMID:25679216

  3. Neural mechanism for judging the appropriateness of facial affect.

    PubMed

    Kim, Ji-Woong; Kim, Jae-Jin; Jeong, Bum Seok; Ki, Seon Wan; Im, Dong-Mi; Lee, Soo Jung; Lee, Hong Shick

    2005-12-01

    Questions regarding the appropriateness of facial expressions in particular situations arise ubiquitously in everyday social interactions. To determine the appropriateness of facial affect, we must first represent our own or the other person's emotional state as induced by the social situation. Then, based on these representations, we must infer the likely affective response of the other person. In this study, we identified the brain mechanism mediating a special type of social evaluative judgment of facial affect in which the internal reference is related to theory of mind (ToM) processing. Many previous ToM studies have used non-emotional stimuli, but, because so much valuable social information is conveyed through nonverbal emotional channels, this investigation used emotionally salient visual materials to tap ToM. Fourteen right-handed healthy subjects volunteered for our study. We used functional magnetic resonance imaging to examine brain activation during a judgment task on the appropriateness of facial affect as opposed to a gender-matching task. We identified activation of a brain network that includes the medial frontal cortex, left temporal pole, left inferior frontal gyrus, and left thalamus during the judgment task for the appropriateness of facial affect compared to the gender-matching task. The results of this study suggest that the brain system involved in ToM plays a key role in judging the appropriateness of facial affect in an emotionally laden situation. In addition, our results support the idea that common neural substrates are involved in performing diverse kinds of ToM tasks irrespective of perceptual modality and the emotional salience of the test materials.

  4. Body Image and Quality of Life in Adolescents With Craniofacial Conditions

    PubMed Central

    Crerand, Canice E.; Sarwer, David B.; Kazak, Anne E.; Clarke, Alexandra; Rumsey, Nichola

    2017-01-01

    Objective To evaluate body image in adolescents with and without craniofacial conditions; and to examine relationships between body image and quality of life. Design Case-control design. Setting A pediatric hospital’s craniofacial center and primary care practices. Participants 70 adolescents with visible craniofacial conditions and a demographically-matched sample of 42 adolescents without craniofacial conditions. Main Outcome Measure Adolescents completed measures of quality of life and body image including satisfaction with weight, facial and overall appearance; investment in appearance (importance of appearance to self-worth); and body image disturbance (appearance-related distress and impairment in functioning). Results Adolescents with craniofacial conditions reported lower appearance investment (p < 0.001) and were more likely to report concerns about facial features (p < 0.02) compared to non-affected youth. Females in both groups reported greater investment in appearance, greater body image disturbance, and lower weight satisfaction compared to males (p < 0.01). Within both groups, greater body image disturbance was associated with lower quality of life (p < 0.01). The two groups did not differ significantly on measures of quality of life, body image disturbance, or satisfaction with appearance. Conclusions Body image and quality of life in adolescents with craniofacial conditions are similar to non-affected youth. Relationships between body image and quality of life emphasize that appearance perceptions are important to adolescents’ well-being regardless of whether they have a facial disfigurement. Investment in one’s appearance may explain variations in body image satisfaction and serve as an intervention target particularly for females. PMID:26751907

  5. Steady-state visual evoked potentials as a research tool in social affective neuroscience

    PubMed Central

    Wieser, Matthias J.; Miskovic, Vladimir; Keil, Andreas

    2017-01-01

    Like many other primates, humans place a high premium on social information transmission and processing. One important aspect of this information concerns the emotional state of other individuals, conveyed by distinct visual cues such as facial expressions, overt actions, or by cues extracted from the situational context. A rich body of theoretical and empirical work has demonstrated that these socio-emotional cues are processed by the human visual system in a prioritized fashion, in the service of optimizing social behavior. Furthermore, socio-emotional perception is highly dependent on situational contexts and previous experience. Here, we review current issues in this area of research and discuss the utility of the steady-state visual evoked potential (ssVEP) technique for addressing key empirical questions. Methodological advantages and caveats are discussed with particular regard to quantifying time-varying competition among multiple perceptual objects, trial-by-trial analysis of visual cortical activation, functional connectivity, and the control of low-level stimulus features. Studies on facial expression and emotional scene processing are summarized, with an emphasis on viewing faces and other social cues in emotional contexts, or when competing with each other. Further, because the ssVEP technique can be readily accommodated to studying the viewing of complex scenes with multiple elements, it enables researchers to advance theoretical models of socio-emotional perception, based on complex, quasi-naturalistic viewing situations. PMID:27699794
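
    The core ssVEP measurement described above is the amplitude of the EEG response at the stimulus flicker (tagging) frequency and its harmonics, typically obtained from the Fourier spectrum of the recorded signal. The sketch below shows that generic extraction step on a simulated signal; the sampling rate, flicker frequency, and signal are assumptions for illustration only.

```python
import numpy as np

fs = 500.0                      # sampling rate in Hz (assumed)
flicker_hz = 15.0               # stimulation (tagging) frequency (assumed)
t = np.arange(0, 4.0, 1 / fs)   # 4 s epoch

# Simulated occipital EEG: an ssVEP at the flicker frequency buried in noise.
rng = np.random.default_rng(3)
eeg = 2.0 * np.sin(2 * np.pi * flicker_hz * t) + rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(eeg)) / (t.size / 2)   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)

ssvep_amplitude = spectrum[np.argmin(np.abs(freqs - flicker_hz))]
print(f"ssVEP amplitude at {flicker_hz:.0f} Hz: {ssvep_amplitude:.2f} (a.u.)")
```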

  6. Emotions facilitate the communication of ambiguous group memberships.

    PubMed

    Tskhay, Konstantin O; Rule, Nicholas O

    2015-12-01

    It is well known that emotions intersect with obvious social categories (e.g., race), influencing both how targets are categorized and the emotions that are read from their faces. Here, we examined the influence of emotional expression on the perception of less obvious group memberships for which, in the absence of obvious and stable physical markers, emotion may serve as a major avenue for group categorization and identification. Specifically, we examined whether emotions are embedded in the mental representations of sexual orientation and political affiliation, and whether people may use emotional expressions to communicate these group memberships to others. Using reverse correlation methods, we found that mental representations of gay and liberal faces were characterized by more positive facial expressions than mental representations of straight and conservative faces (Study 1). Furthermore, participants were evaluated as expressing more positive emotions when enacting self-defined "gay" and "liberal" versus "straight" and "conservative" facial expressions in the lab (Study 2). In addition, neutral faces morphed with happiness were perceived as more gay than when morphed with anger, and when compared to unmorphed controls (Study 3). Finally, we found that affect facilitated perceptions of sexual orientation and political affiliation in naturalistic settings (Study 4). Together, these studies suggest that emotion is a defining characteristic of person construal that people tend to use both when signaling their group memberships and when receiving those signals to categorize others. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  7. Splash Safety During Dermatologic Procedures Among US Dermatology Residents.

    PubMed

    Korta, Dorota Z; Chapman, Lance W; Lee, Patrick K; Linden, Kenneth G

    2017-07-01

    Dermatologists are at potential risk of acquiring infections from contamination of the mucous membranes by blood and body fluids. However, there are few data on splash safety during procedural dermatology. To determine dermatology residents' perceptions of splash risk during dermatologic procedures and to quantify the rate of protective equipment use. An anonymous online survey was sent to 108 United States ACGME-approved dermatology residency programs assessing the frequency of facial protection during dermatologic procedures, personal history of splash injury, and, if applicable, reasons for not always wearing facial protection. A total of 153 dermatology residents responded. Rates of facial protection varied by procedure, with the highest rates during surgery and the lowest during local anesthetic injection. Over 54% of respondents reported suffering a facial splash while not wearing facial protection during a procedure. In contrast, 88.9% of respondents correctly answered that there is a small risk of acquiring infection from a mucosal splash. Residency program recommendations for facial protection seem to vary by procedure. The authors' results demonstrate that although facial splash is a common injury, facial protection rates and protective recommendations vary significantly by procedure. These data support the recommendation for enhanced facial protection guidelines in procedural dermatology.

  8. Emotional memory and perception in temporal lobectomy patients with amygdala damage.

    PubMed

    Brierley, B; Medford, N; Shaw, P; David, A S

    2004-04-01

    The human amygdala is implicated in the formation of emotional memories and the perception of emotional stimuli, particularly fear, across various modalities. To discern the extent to which these functions are related, 28 patients who had undergone anterior temporal lobectomy (13 left and 15 right) for intractable epilepsy were recruited. Structural magnetic resonance imaging showed that three of them had atrophy of their remaining amygdala. All participants were given tests of affect perception from facial and vocal expressions and of emotional memory, using a standard narrative test and a novel test of word recognition. The results were standardised against matched healthy controls. Performance on all emotion tasks in patients with unilateral lobectomy ranged from unimpaired to moderately impaired. Perception of emotions in faces and voices was (with exceptions) significantly positively correlated, indicating multimodal emotional processing. However, there was no correlation between the subjects' performance on tests of emotional memory and perception. Several subjects showed strong emotional memory enhancement but poor fear perception. Patients with bilateral amygdala damage had greater impairment, particularly on the narrative test of emotional memory, one showing superior fear recognition but absent memory enhancement. Bilateral amygdala damage is particularly disruptive of emotional memory processes in comparison with unilateral temporal lobectomy. On a cognitive level, the pattern of results implies that perception of emotional expressions and emotional memory are supported by separate processing systems or streams.

  9. How Context Influences Our Perception of Emotional Faces: A Behavioral Study on the Kuleshov Effect

    PubMed Central

    Calbi, Marta; Heimann, Katrin; Barratt, Daniel; Siri, Francesca; Umiltà, Maria A.; Gallese, Vittorio

    2017-01-01

    Facial expressions are of major importance in understanding the mental and emotional states of others. So far, most studies on the perception and comprehension of emotions have used isolated facial expressions as stimuli; for example, photographs of actors displaying facial expressions corresponding to one of the so called ‘basic emotions.’ However, our real experience during social interactions is different: facial expressions of emotion are mostly perceived in a wider context, constituted by body language, the surrounding environment, and our beliefs and expectations. Already in the early twentieth century, the Russian filmmaker Lev Kuleshov argued that such context, established by intermediate shots of strong emotional content, could significantly change our interpretation of facial expressions in film. Prior experiments have shown behavioral effects pointing in this direction, but have only used static images as stimuli. Our study used a more ecological design with participants watching film sequences of neutral faces, crosscut with scenes of strong emotional content (evoking happiness or fear, plus neutral stimuli as a baseline condition). The task was to rate the emotion displayed by a target person’s face in terms of valence, arousal, and category. Results clearly demonstrated the presence of a significant effect in terms of both valence and arousal in the fear condition only. Moreover, participants tended to categorize the target person’s neutral facial expression choosing the emotion category congruent with the preceding context. Our results highlight the context-sensitivity of emotions and the importance of studying them under ecologically valid conditions. PMID:29046652

  10. Illuminant color estimation based on pigmentation separation from human skin color

    NASA Astrophysics Data System (ADS)

    Tanaka, Satomi; Kakinuma, Akihiro; Kamijo, Naohiro; Takahashi, Hiroshi; Tsumura, Norimichi

    2015-03-01

    Humans have a visual ability called "color constancy" that maintains the perceived colors of the same object across various light sources. An effective color constancy algorithm that uses human facial color in a digital color image has been proposed; however, this method produces incorrect estimates because of differences between individual facial colors. In this paper, we present a novel color constancy algorithm based on skin color analysis. Skin color analysis is a method that separates skin color into melanin, hemoglobin, and shading components. We use a stationary property of Japanese facial color, which is calculated from the melanin and hemoglobin components. As a result, we propose a method that uses the subject's facial color in the image without depending on individual differences among Japanese facial colors.
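
    The separation step described above works in a log color space, where skin color can be approximated as a linear mixture of melanin and hemoglobin components plus a shading/illumination term. The sketch below illustrates that decomposition with fixed, made-up basis vectors; methods of this kind typically estimate the component directions from data (for example by independent component analysis), so everything here, including the basis values, is an assumption rather than the paper's algorithm.

```python
import numpy as np

# Assumed unit basis vectors for melanin and hemoglobin absorbance in -log(RGB) space.
melanin_vec = np.array([0.74, 0.57, 0.36])
hemoglobin_vec = np.array([0.43, 0.70, 0.57])
melanin_vec /= np.linalg.norm(melanin_vec)
hemoglobin_vec /= np.linalg.norm(hemoglobin_vec)
shading_vec = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # uniform intensity/shading direction

def separate_pigmentation(rgb_pixels: np.ndarray) -> np.ndarray:
    """Decompose skin pixels (N x 3, values in (0, 1]) into
    melanin, hemoglobin, and shading coordinates (N x 3)."""
    density = -np.log(np.clip(rgb_pixels, 1e-6, 1.0))                    # optical-density space
    basis = np.stack([melanin_vec, hemoglobin_vec, shading_vec], axis=1)  # 3 x 3 mixing matrix
    return np.linalg.solve(basis, density.T).T                            # mixture coefficients

# Hypothetical skin pixels.
pixels = np.array([[0.80, 0.60, 0.50], [0.70, 0.50, 0.45]])
print(separate_pigmentation(pixels))
```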

  11. [Emotional facial expression recognition impairment in Parkinson disease].

    PubMed

    Lachenal-Chevallet, Karine; Bediou, Benoit; Bouvard, Martine; Thobois, Stéphane; Broussolle, Emmanuel; Vighetto, Alain; Krolak-Salmon, Pierre

    2006-03-01

    Some behavioral disturbances observed in Parkinson's disease (PD) could be related to impaired recognition of various social messages, particularly emotional facial expressions. Facial expression recognition was assessed using morphed faces (five categories: happiness, fear, anger, disgust, neutral) and compared to gender recognition and general cognitive assessment in 12 patients with Parkinson's disease and 14 control subjects. Facial expression recognition was impaired among patients, whereas gender recognition, visuo-perceptive capacities and overall cognitive efficiency were preserved. Post hoc analyses disclosed a deficit in fear and disgust recognition compared to control subjects. The impairment of emotional facial expression recognition in PD appears independent of other cognitive deficits. This impairment may be related to dopaminergic depletion in the basal ganglia and limbic brain regions. It could play a part in the psycho-behavioral disorders, and particularly the communication disorders, observed in Parkinson's disease patients.

  12. Automatic processing of facial affects in patients with borderline personality disorder: associations with symptomatology and comorbid disorders.

    PubMed

    Donges, Uta-Susan; Dukalski, Bibiana; Kersting, Anette; Suslow, Thomas

    2015-01-01

    Instability of affect and interpersonal relations is an important feature of borderline personality disorder (BPD). Interpersonal problems of individuals suffering from BPD might develop based on abnormalities in the processing of facial affects and high sensitivity to negative affective expressions. The aims of the present study were to examine automatic evaluative shifts and latencies as a function of masked facial affects in patients with BPD compared to healthy individuals. As BPD comorbidity rates for mental and personality disorders are high, we also investigated the relationships of affective processing characteristics with specific borderline symptoms and comorbidity. Twenty-nine women with BPD and 38 healthy women participated in the study. The majority of patients suffered from additional Axis I disorders and/or additional personality disorders. In the priming experiment, angry, happy, neutral, or no facial expression was briefly presented (for 33 ms) and masked by neutral faces that had to be evaluated. Evaluative decisions and response latencies were registered. Borderline-typical symptomatology was assessed with the Borderline Symptom List. In the total sample, valence-congruent evaluative shifts and delays of evaluative decision due to facial affect were observed. No between-group differences were obtained for evaluative decisions and latencies. The presence of comorbid anxiety disorders was found to be positively correlated with evaluative shifting owing to masked happy primes, regardless of baseline condition (neutral or no facial expression). The presence of comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression were significantly correlated with response delay due to masked angry faces, regardless of baseline. In the present affective priming study, no abnormalities in the automatic recognition and processing of facial affects were observed in BPD patients compared to healthy individuals. The presence of comorbid anxiety disorders could make patients more susceptible to the influence of a happy expression on judgment processes at an automatic processing level. Comorbid depressive disorder, paranoid personality disorder, and symptoms of social isolation and self-aggression may enhance automatic attention allocation to threatening facial expressions in BPD. Increased automatic vigilance for social threat stimuli might contribute to affective instability and interpersonal problems in specific patients with BPD.

  13. Facial trauma among victims of terrestrial transport accidents.

    PubMed

    d'Avila, Sérgio; Barbosa, Kevan Guilherme Nóbrega; Bernardino, Ítalo de Macedo; da Nóbrega, Lorena Marques; Bento, Patrícia Meira; E Ferreira, Efigênia Ferreira

    2016-01-01

    In developing countries, terrestrial transport accidents - TTA, especially those involving automobiles and motorcycles - are a major cause of facial trauma, surpassing urban violence. This cross-sectional census study attempted to determine the occurrence of facial trauma caused by terrestrial transport accidents (involving cars, motorcycles, or pedestrians) in the northeastern region of Brazil, and to examine victims' socio-demographic characteristics. Morbidity data from forensic service reports of victims who sought care from January to December 2012 were analyzed. Altogether, 2379 reports were evaluated, of which 673 were related to terrestrial transport accidents and 103 involved facial trauma. Three previously trained and calibrated researchers collected data using a specific form. The facial trauma occurrence rate was 15.3% (n=103). The most affected age group was 20-29 years (48.3%), and more men than women were affected (2.81:1). Motorcycles were involved in the majority of accidents resulting in facial trauma (66.3%). Facial trauma in terrestrial transport accident victims tends to affect a greater proportion of young and male subjects, and the most prevalent accidents involve motorcycles. Copyright © 2015 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  14. Facial color is an efficient mechanism to visually transmit emotion

    PubMed Central

    Benitez-Quiroz, Carlos F.; Srinivasan, Ramprakash

    2018-01-01

    Facial expressions of emotion in humans are believed to be produced by contracting one’s facial muscles, generally called action units. However, the surface of the face is also innervated with a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Here, we study the hypothesis that these visible facial colors allow observers to successfully transmit and visually interpret emotion even in the absence of facial muscle activation. To study this hypothesis, we address the following two questions. Are observable facial colors consistent within and differential between emotion categories and positive vs. negative valence? And does the human visual system use these facial colors to decode emotion from faces? These questions suggest the existence of an important, unexplored mechanism of the production of facial expressions of emotion by a sender and their visual interpretation by an observer. The results of our studies provide evidence in favor of our hypothesis. We show that people successfully decode emotion using these color features, even in the absence of any facial muscle activation. We also demonstrate that this color signal is independent from that provided by facial muscle movements. These results support a revised model of the production and perception of facial expressions of emotion where facial color is an effective mechanism to visually transmit and decode emotion. PMID:29555780

  15. Facial color is an efficient mechanism to visually transmit emotion.

    PubMed

    Benitez-Quiroz, Carlos F; Srinivasan, Ramprakash; Martinez, Aleix M

    2018-04-03

    Facial expressions of emotion in humans are believed to be produced by contracting one's facial muscles, generally called action units. However, the surface of the face is also innervated with a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Here, we study the hypothesis that these visible facial colors allow observers to successfully transmit and visually interpret emotion even in the absence of facial muscle activation. To study this hypothesis, we address the following two questions. Are observable facial colors consistent within and differential between emotion categories and positive vs. negative valence? And does the human visual system use these facial colors to decode emotion from faces? These questions suggest the existence of an important, unexplored mechanism of the production of facial expressions of emotion by a sender and their visual interpretation by an observer. The results of our studies provide evidence in favor of our hypothesis. We show that people successfully decode emotion using these color features, even in the absence of any facial muscle activation. We also demonstrate that this color signal is independent from that provided by facial muscle movements. These results support a revised model of the production and perception of facial expressions of emotion where facial color is an effective mechanism to visually transmit and decode emotion. Copyright © 2018 the Author(s). Published by PNAS.

  16. Intranasal oxytocin increases facial expressivity, but not ratings of trustworthiness, in patients with schizophrenia and healthy controls.

    PubMed

    Woolley, J D; Chuang, B; Fussell, C; Scherer, S; Biagianti, B; Fulford, D; Mathalon, D H; Vinogradov, S

    2017-05-01

    Blunted facial affect is a common negative symptom of schizophrenia. Additionally, assessing the trustworthiness of faces is a social cognitive ability that is impaired in schizophrenia. Currently available pharmacological agents are ineffective at improving either of these symptoms, despite their clinical significance. The hypothalamic neuropeptide oxytocin has multiple prosocial effects when administered intranasally to healthy individuals and shows promise in decreasing negative symptoms and enhancing social cognition in schizophrenia. Although two small studies have investigated oxytocin's effects on ratings of facial trustworthiness in schizophrenia, its effects on facial expressivity have not been investigated in any population. We investigated the effects of oxytocin on facial emotional expressivity while participants performed a facial trustworthiness rating task in 33 individuals with schizophrenia and 35 age-matched healthy controls using a double-blind, placebo-controlled, cross-over design. Participants rated the trustworthiness of presented faces interspersed with emotionally evocative photographs while being video-recorded. Participants' facial expressivity in these videos was quantified by blind raters using a well-validated manualized approach (i.e. the Facial Expression Coding System; FACES). While oxytocin administration did not affect ratings of facial trustworthiness, it significantly increased facial expressivity in individuals with schizophrenia (Z = -2.33, p = 0.02) and at trend level in healthy controls (Z = -1.87, p = 0.06). These results demonstrate that oxytocin administration can increase facial expressivity in response to emotional stimuli and suggest that oxytocin may have the potential to serve as a treatment for blunted facial affect in schizophrenia.

  17. The insula is not specifically involved in disgust processing: an fMRI study.

    PubMed

    Schienle, A; Stark, R; Walter, B; Blecker, C; Ott, U; Kirsch, P; Sammer, G; Vaitl, D

    2002-11-15

    fMRI studies have shown that the perception of facial disgust expressions specifically activates the insula. The present fMRI study investigated whether this structure is also involved in the processing of visual stimuli depicting non-mimic disgust elicitors compared to fear-inducing and neutral scenes. Twelve female subjects were scanned while viewing alternating blocks of 40 disgust-inducing, 40 fear-inducing and 40 affectively neutral pictures, shown for 1.5 s each. Afterwards, affective ratings were assessed. The disgust pictures, rated as highly repulsive, induced activation in the insula, the amygdala, and the orbitofrontal and occipito-temporal cortices. Since the insula was also involved during the fear condition, our findings do not support the idea of the insula as a disgust-specific processor.

  18. Individual differences in ensemble perception reveal multiple, independent levels of ensemble representation.

    PubMed

    Haberman, Jason; Brady, Timothy F; Alvarez, George A

    2015-04-01

    Ensemble perception, including the ability to "see the average" from a group of items, operates in numerous feature domains (size, orientation, speed, facial expression, etc.). Although the ubiquity of ensemble representations is well established, the large-scale cognitive architecture of this process remains poorly defined. We address this using an individual differences approach. In a series of experiments, observers saw groups of objects and reported either a single item from the group or the average of the entire group. High-level ensemble representations (e.g., average facial expression) showed complete independence from low-level ensemble representations (e.g., average orientation). In contrast, low-level ensemble representations (e.g., orientation and color) were correlated with each other, but not with high-level ensemble representations (e.g., facial expression and person identity). These results suggest that there is not a single domain-general ensemble mechanism, and that the relationship among various ensemble representations depends on how proximal they are in representational space. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  19. Knowing your face: A componential analysis of self-perceived facial attractiveness.

    PubMed

    Yoder, Marcel S; Ault, Lara K; Mathews, Maureen A

    2017-01-01

    Facial attractiveness (FA) is a highly agreed upon and socially important characteristic, but contemporary research has not fully investigated individuals' enhancement of their FA. We used the social relations model (SRM) to analyze data from 187 participants. In face-to-face groups of four, participants rated their own and others' FA. We assessed the degree of FA enhancement using unconfounded measures of both self-insight and social comparison. Results indicated African Americans and men enhanced more than Caucasians and women. Race effects were mediated by psychological adjustment, while gender effects were related to meta-perceptions. Men tended to view themselves through a "frog prince" lens, such that they accurately predicted others' lesser views of them but persisted in overly positive self-perceptions of FA. Findings suggest further consideration of SRM measures, meta-perceptions, and a focus on sample diversity in the study of self-enhancement.

  20. Differential hemispheric and visual stream contributions to ensemble coding of crowd emotion

    PubMed Central

    Im, Hee Yeon; Albohn, Daniel N.; Steiner, Troy G.; Cushing, Cody A.; Adams, Reginald B.; Kveraga, Kestutis

    2017-01-01

    In crowds, where scrutinizing individual facial expressions is inefficient, humans can make snap judgments about the prevailing mood by reading “crowd emotion”. We investigated how the brain accomplishes this feat in a set of behavioral and fMRI studies. Participants were asked to either avoid or approach one of two crowds of faces presented in the left and right visual hemifields. Perception of crowd emotion was improved when crowd stimuli contained goal-congruent cues and was highly lateralized to the right hemisphere. The dorsal visual stream was preferentially activated in crowd emotion processing, with activity in the intraparietal sulcus and superior frontal gyrus predicting perceptual accuracy for crowd emotion perception, whereas activity in the fusiform cortex in the ventral stream predicted better perception of individual facial expressions. Our findings thus reveal significant behavioral differences and differential involvement of the hemispheres and the major visual streams in reading crowd versus individual face expressions. PMID:29226255

  1. Gaze Patterns in Auditory-Visual Perception of Emotion by Children with Hearing Aids and Hearing Children

    PubMed Central

    Wang, Yifang; Zhou, Wei; Cheng, Yanhong; Bian, Xiaoying

    2017-01-01

    This study investigated eye-movement patterns during emotion perception for children with hearing aids and hearing children. Seventy-eight participants aged 3 to 7 years were asked to watch videos with a facial expression followed by an oral statement, and these two cues were either congruent or incongruent in emotional valence. Results showed that while hearing children paid more attention to the upper part of the face, children with hearing aids paid more attention to the lower part of the face after the oral statement was presented, especially in the neutral facial expression/neutral oral statement condition. These results suggest that children with hearing aids have an altered eye-contact pattern with others and difficulty in matching visual and vocal cues in emotion perception. Early rehabilitation for hearing-impaired children with assistive devices should address these gaze patterns to avoid their potential negative consequences. PMID:29312104

  2. Resting RSA Is Associated with Natural and Self-Regulated Responses to Negative Emotional Stimuli

    ERIC Educational Resources Information Center

    Demaree, Heath A.; Robinson, Jennifer L.; Everhart, D. Erik; Schmeichel, Brandon J.

    2004-01-01

    Resting respiratory sinus arrhythmia (RSA) was assessed among 111 adult participants. These individuals were then asked to watch a positive or negative affective film in either a natural manner or while exaggerating their facial response. Facial reactions to the film were video-recorded and subsequently rated in terms of facial affect.…

  3. The Relation of Facial Affect Recognition and Empathy to Delinquency in Youth Offenders

    ERIC Educational Resources Information Center

    Carr, Mary B.; Lutjemeier, John A.

    2005-01-01

    Associations among facial affect recognition, empathy, and self-reported delinquency were studied in a sample of 29 male youth offenders at a probation placement facility. Youth offenders were asked to recognize facial expressions of emotions from adult faces, child faces, and cartoon faces. Youth offenders also responded to a series of statements…

  4. Assessing the Utility of a Virtual Environment for Enhancing Facial Affect Recognition in Adolescents with Autism

    ERIC Educational Resources Information Center

    Bekele, Esubalew; Crittendon, Julie; Zheng, Zhi; Swanson, Amy; Weitlauf, Amy; Warren, Zachary; Sarkar, Nilanjan

    2014-01-01

    Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer generated avatar. The system assessed performance (i.e.,…

  5. To Capture a Face: A Novel Technique for the Analysis and Quantification of Facial Expressions in American Sign Language

    ERIC Educational Resources Information Center

    Grossman, Ruth B.; Kegl, Judy

    2006-01-01

    American Sign Language uses the face to express vital components of grammar in addition to the more universal expressions of emotion. The study of ASL facial expressions has focused mostly on the perception and categorization of various expression types by signing and nonsigning subjects. Only a few studies of the production of ASL facial…

  6. The Perception of Facial Expressions and Stimulus Motion by Two- and Five-Month-Old Infants Using Holographic Stimuli.

    ERIC Educational Resources Information Center

    Nelson, Charles A.; Horowitz, Frances Degen

    1983-01-01

    Holograms of faces were used to study two- and five-month-old infants' discriminations of changes in facial expression and pose when the stimulus was seen to move or to remain stationary. While no evidence was found suggesting that infants preferred the moving face, evidence indicated that motion contrasts facilitate face recognition. (Author/RH)

  7. Thin-Slice Perception Develops Slowly

    ERIC Educational Resources Information Center

    Balas, Benjamin; Kanwisher, Nancy; Saxe, Rebecca

    2012-01-01

    Body language and facial gesture provide sufficient visual information to support high-level social inferences from "thin slices" of behavior. Given short movies of nonverbal behavior, adults make reliable judgments in a large number of tasks. Here we find that the high precision of adults' nonverbal social perception depends on the slow…

  8. Laterality of facial expressions of emotion: Universal and culture-specific influences.

    PubMed

    Mandal, Manas K; Ambady, Nalini

    2004-01-01

    Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality, and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals. Copyright 2004 IOS Press

  9. Using Event Related Potentials to Explore Stages of Facial Affect Recognition Deficits in Schizophrenia

    PubMed Central

    Wynn, Jonathan K.; Lee, Junghee; Horan, William P.; Green, Michael F.

    2008-01-01

    Schizophrenia patients show impairments in identifying facial affect; however, it is not known at what stage facial affect processing is impaired. We evaluated 3 event-related potentials (ERPs) to explore stages of facial affect processing in schizophrenia patients. Twenty-six schizophrenia patients and 27 normal controls participated. In separate blocks, subjects identified the gender of a face, the emotion of a face, or if a building had 1 or 2 stories. Three ERPs were examined: (1) P100 to examine basic visual processing, (2) N170 to examine facial feature encoding, and (3) N250 to examine affect decoding. Behavioral performance on each task was also measured. Results showed that schizophrenia patients’ P100 was comparable to the controls during all 3 identification tasks. Both patients and controls exhibited a comparable N170 that was largest during processing of faces and smallest during processing of buildings. For both groups, the N250 was largest during the emotion identification task and smallest for the building identification task. However, the patients produced a smaller N250 compared with the controls across the 3 tasks. The groups did not differ in behavioral performance in any of the 3 identification tasks. The pattern of intact P100 and N170 suggest that patients maintain basic visual processing and facial feature encoding abilities. The abnormal N250 suggests that schizophrenia patients are less efficient at decoding facial affect features. Our results imply that abnormalities in the later stage of feature decoding could potentially underlie emotion identification deficits in schizophrenia. PMID:18499704

  10. Subtle perceptions of male sexual orientation influence occupational opportunities.

    PubMed

    Rule, Nicholas O; Bjornsdottir, R Thora; Tskhay, Konstantin O; Ambady, Nalini

    2016-12-01

    Theories linking the literatures on stereotyping and human resource management have proposed that individuals may enjoy greater success obtaining jobs congruent with stereotypes about their social categories or traits. Here, we explored such effects for a detectable, but not obvious, social group distinction: male sexual orientation. Bridging previous work on prejudice and occupational success with that on social perception, we found that perceivers rated gay and straight men as more suited to professions consistent with stereotypes about their groups (nurses, pediatricians, and English teachers vs. engineers, managers, surgeons, and math teachers) from mere photos of their faces. Notably, distinct evaluations of the gay and straight men emerged based on perceptions of their faces with no explicit indication of sexual orientation. Neither perceivers' expertise with hiring decisions nor diagnostic information about the targets eliminated these biases, but encouraging fair decisions did contribute to partly ameliorating the differences. Mediation analysis further showed that perceptions of the targets' sexual orientations and facial affect accounted for these effects. Perceivers may therefore infer individuals' group memberships from their faces and use this information in a way that meaningfully influences evaluations of their suitability for particular jobs. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. Accurate perception of negative emotions predicts functional capacity in schizophrenia.

    PubMed

    Abram, Samantha V; Karpouzian, Tatiana M; Reilly, James L; Derntl, Birgit; Habel, Ute; Smith, Matthew J

    2014-04-30

    Several studies suggest facial affect perception (FAP) deficits in schizophrenia are linked to poorer social functioning. However, whether reduced functioning is associated with inaccurate perception of specific emotional valence or a global FAP impairment remains unclear. The present study examined whether impairment in the perception of specific emotional valences (positive, negative) and neutrality were uniquely associated with social functioning, using a multimodal social functioning battery. A sample of 59 individuals with schizophrenia and 41 controls completed a computerized FAP task, and measures of functional capacity, social competence, and social attainment. Participants also underwent neuropsychological testing and symptom assessment. Regression analyses revealed that only accurately perceiving negative emotions explained significant variance (7.9%) in functional capacity after accounting for neurocognitive function and symptoms. Partial correlations indicated that accurately perceiving anger, in particular, was positively correlated with functional capacity. FAP for positive, negative, or neutral emotions were not related to social competence or social attainment. Our findings were consistent with prior literature suggesting negative emotions are related to functional capacity in schizophrenia. Furthermore, the observed relationship between perceiving anger and performance of everyday living skills is novel and warrants further exploration. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
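
    To make the analysis step concrete, here is a minimal, hedged sketch of a hierarchical regression of the kind reported above, computing the incremental R² contributed by negative-emotion perception accuracy after neurocognition and symptoms. The DataFrame column names are hypothetical placeholders, not the study's variables.

```python
# Hedged sketch (not the authors' analysis code): hierarchical regression
# testing whether accuracy for negative facial emotions adds explained
# variance in functional capacity beyond neurocognition and symptoms.
import pandas as pd
import statsmodels.formula.api as smf

def incremental_r2(df: pd.DataFrame) -> float:
    base = smf.ols(
        "functional_capacity ~ neurocognition + symptoms", data=df
    ).fit()
    full = smf.ols(
        "functional_capacity ~ neurocognition + symptoms + negative_fap_accuracy",
        data=df,
    ).fit()
    # Change in R^2 attributable to negative-emotion perception accuracy
    # (the abstract reports roughly 7.9% for a comparable step).
    return full.rsquared - base.rsquared
```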

  12. Mothers' pupillary responses to infant facial expressions.

    PubMed

    Yrttiaho, Santeri; Niehaus, Dana; Thomas, Eileen; Leppänen, Jukka M

    2017-02-06

    Human parental care relies heavily on the ability to monitor and respond to a child's affective states. The current study examined pupil diameter as a potential physiological index of mothers' affective response to infant facial expressions. Pupillary time-series were measured from 86 mothers of young infants in response to an array of photographic infant faces falling into four emotive categories based on valence (positive vs. negative) and arousal (mild vs. strong). Pupil dilation was highly sensitive to the valence of facial expressions, being larger for negative vs. positive facial expressions. A separate control experiment with luminance-matched non-face stimuli indicated that the valence effect was specific to facial expressions and cannot be explained by luminance confounds. Pupil response was not sensitive to the arousal level of facial expressions. The results show the feasibility of using pupil diameter as a marker of mothers' affective responses to ecologically valid infant stimuli and point to a particularly prompt maternal response to infant distress cues.
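
    The pupillometry reduction implied by this record can be illustrated with a short, hedged Python sketch: baseline-correct each trial's pupil time-series and average the post-onset response within a stimulus category. The sampling rate, baseline window, and array layout are assumptions for illustration only.

```python
# Hedged sketch: baseline-correcting pupil time-series and averaging the
# dilation response per stimulus category; shapes and windows are illustrative.
import numpy as np

def mean_dilation(trials: np.ndarray, fs: float,
                  baseline_s: float = 0.5) -> np.ndarray:
    """trials: (n_trials, n_samples) pupil diameter for one stimulus category.
    Subtracts each trial's pre-stimulus baseline, then averages across trials."""
    n_base = int(baseline_s * fs)
    baseline = trials[:, :n_base].mean(axis=1, keepdims=True)
    corrected = trials - baseline
    return corrected[:, n_base:].mean(axis=0)  # mean response after onset
```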

  13. Vocal fundamental and formant frequencies affect perceptions of speaker cooperativeness.

    PubMed

    Knowles, Kristen K; Little, Anthony C

    2016-01-01

    In recent years, the perception of social traits in faces and voices has received much attention. Facial and vocal masculinity are linked to perceptions of trustworthiness; however, while feminine faces are generally considered to be trustworthy, vocal trustworthiness is associated with masculinized vocal features. Vocal traits such as pitch and formants have previously been associated with perceived social traits such as trustworthiness and dominance, but the link between these measurements and perceptions of cooperativeness has yet to be examined. In Experiment 1, cooperativeness ratings of male and female voices were examined against four vocal measurements: fundamental frequency (F0), pitch variation (F0-SD), formant dispersion (Df), and formant position (Pf). Feminine pitch traits (F0 and F0-SD) and masculine formant traits (Df and Pf) were associated with higher cooperativeness ratings. In Experiment 2, manipulated voices with feminized F0 were found to be more cooperative than voices with masculinized F0 among both male and female speakers, confirming our results from Experiment 1. Feminine pitch qualities may indicate an individual who is friendly and non-threatening, while masculine formant qualities may reflect an individual who is socially dominant or prestigious, and the perception of these associated traits may influence the perceived cooperativeness of the speakers.

  14. Multisensory perception of the six basic emotions is modulated by attentional instruction and unattended modality

    PubMed Central

    Takagi, Sachiko; Hiramatsu, Saori; Tabei, Ken-ichi; Tanaka, Akihiro

    2015-01-01

    Previous studies have shown that facial and vocal affective expressions interact with each other in emotion perception. Facial expressions usually dominate vocal expressions when we perceive the emotions of face–voice stimuli. In most of these studies, participants were instructed to pay attention to the face or voice. Few studies have compared the perceived emotions with and without specific instructions regarding the modality to which attention should be directed. Also, these studies used combinations of a face and a voice expressing two opposing emotions, which limits the generalizability of the findings. The purpose of this study is to examine whether emotion perception is modulated by instructions to pay attention to the face or voice, using the six basic emotions. We also examine the modality dominance between the face and voice for each emotion category. Before the experiment, we recorded faces and voices expressing the six basic emotions and orthogonally combined these faces and voices. Consequently, the emotional valence of visual and auditory information was either congruent or incongruent. In the experiment, there were unisensory and multisensory sessions. The multisensory session was divided into three blocks according to whether an instruction was given to pay attention to a given modality (face attention, voice attention, and no instruction). Participants judged whether the speaker expressed happiness, sadness, anger, fear, disgust, or surprise. Our results revealed that instructions to pay attention to one modality and the congruency of the emotions between modalities modulated the modality dominance, and that the modality dominance differed across emotion categories. In particular, the modality dominance for anger changed according to each instruction. Analyses also revealed that the modality dominance suggested by the congruency effect can be explained in terms of the facilitation effect and the interference effect. PMID:25698945

  15. Tuning to the Positive: Age-Related Differences in Subjective Perception of Facial Emotion

    PubMed Central

    Picardo, Rochelle; Baron, Andrew S.; Anderson, Adam K.; Todd, Rebecca M.

    2016-01-01

    Facial expressions aid social transactions and serve as socialization tools, with smiles signaling approval and reward, and angry faces signaling disapproval and punishment. The present study examined whether the subjective experience of positive vs. negative facial expressions differs between children and adults. Specifically, we examined age-related differences in biases toward happy and angry facial expressions. Young children (5–7 years) and young adults (18–29 years) rated the intensity of happy and angry expressions as well as levels of experienced arousal. Results showed that young children—but not young adults—rated happy facial expressions as both more intense and arousing than angry faces. This finding, which we replicated in two independent samples, was not due to differences in the ability to identify facial expressions, and suggests that children are more tuned to information in positive expressions. Together these studies provide evidence that children see unambiguous adult emotional expressions through rose-colored glasses, and suggest that what is emotionally relevant can shift with development. PMID:26734940

  16. Dynamic Displays Enhance the Ability to Discriminate Genuine and Posed Facial Expressions of Emotion

    PubMed Central

    Namba, Shushi; Kabir, Russell S.; Miyatani, Makoto; Nakao, Takashi

    2018-01-01

    Accurately gauging the emotional experience of another person is important for navigating interpersonal interactions. This study investigated whether perceivers are capable of distinguishing between unintentionally expressed (genuine) and intentionally manipulated (posed) facial expressions attributed to four major emotions: amusement, disgust, sadness, and surprise. Sensitivity to this discrimination was explored by comparing unstaged dynamic and static facial stimuli and analyzing the results with signal detection theory. Participants indicated whether facial stimuli presented on a screen depicted a person showing a given emotion and whether that person was feeling a given emotion. The results showed that genuine displays were evaluated more as felt expressions than posed displays for all target emotions presented. In addition, sensitivity to the perception of emotional experience, or discriminability, was enhanced in dynamic facial displays, but was less pronounced in the case of static displays. This finding indicates that dynamic information in facial displays contributes to the ability to accurately infer the emotional experiences of another person. PMID:29896135
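
    As a hedged illustration of the signal detection analysis mentioned above, the sketch below computes d' from hit and false-alarm counts, treating a "felt" response to a genuine display as a hit and a "felt" response to a posed display as a false alarm. The counts in the example are made up, not data from the study.

```python
# Hedged sketch: sensitivity (d') from signal detection theory for a
# genuine-vs-posed discrimination task; example counts are illustrative only.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return z(hit_rate) - z(fa_rate)

# Example: dynamic displays might yield higher sensitivity than static ones.
print(d_prime(hits=38, misses=10, false_alarms=14, correct_rejections=34))
```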

  17. The face-selective N170 component is modulated by facial color.

    PubMed

    Nakajima, Kae; Minami, Tetsuto; Nakauchi, Shigeki

    2012-08-01

    Faces play an important role in social interaction by conveying information and emotion. Of the various components of the face, color particularly provides important clues with regard to perception of age, sex, health status, and attractiveness. In event-related potential (ERP) studies, the N170 component has been identified as face-selective. To determine the effect of color on face processing, we investigated the modulation of N170 by facial color. We recorded ERPs while subjects viewed facial color stimuli at 8 hue angles, which were generated by rotating the original facial color distribution around the white point by 45° for each human face. Responses to facial color were localized to the left, but not to the right hemisphere. N170 amplitudes gradually increased in proportion to the increase in hue angle from the natural-colored face. This suggests that N170 amplitude in the left hemisphere reflects processing of facial color information. Copyright © 2012 Elsevier Ltd. All rights reserved.
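
    To illustrate the kind of stimulus manipulation described above, here is a minimal, hedged Python sketch that rotates a face image's chromatic distribution around the neutral point. The use of CIELAB a*-b* coordinates and the skimage functions is an assumption for illustration, not the authors' stimulus-generation code.

```python
# Hedged sketch: rotate the chromatic distribution of a face image around the
# neutral (white) point to produce hue-angle-shifted facial color stimuli.
import numpy as np
from skimage import color

def rotate_facial_hue(rgb_image: np.ndarray, angle_deg: float) -> np.ndarray:
    """rgb_image: float array in [0, 1], shape (H, W, 3)."""
    lab = color.rgb2lab(rgb_image)
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    ab = lab[..., 1:].reshape(-1, 2) @ rot.T   # rotate a*, b* about (0, 0)
    lab[..., 1:] = ab.reshape(lab.shape[:2] + (2,))
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

# Eight hue angles at 45-degree steps, as in the study's design:
# stimuli = [rotate_facial_hue(img, a) for a in range(0, 360, 45)]
```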

  18. Influence of skin ageing features on Chinese women's perception of facial age and attractiveness.

    PubMed

    Porcheron, A; Latreille, J; Jdid, R; Tschachler, E; Morizot, F

    2014-08-01

    Ageing leads to characteristic changes in the appearance of facial skin. Among these changes, we can distinguish the skin topographic cues (skin sagging and wrinkles), the dark spots and the dark circles around the eyes. Although skin changes are similar in Caucasian and Chinese faces, the age of occurrence and the severity of age-related features differ between the two populations. Little is known about how the ageing of skin influences the perception of female faces in Chinese women. The aim of this study is to evaluate the contribution of the different age-related skin features to the perception of age and attractiveness in Chinese women. Facial images of Caucasian women and Chinese women in their 60s were manipulated separately to reduce the following skin features: (i) skin sagging and wrinkles, (ii) dark spots and (iii) dark circles. Finally, all signs were reduced simultaneously (iv). Female Chinese participants were asked to estimate the age difference between the modified and original images and evaluate the attractiveness of modified and original faces. Chinese women perceived the Chinese faces as younger after the manipulation of dark spots than after the reduction in wrinkles/sagging, whereas they perceived the Caucasian faces as the youngest after the manipulation of wrinkles/sagging. Interestingly, Chinese women evaluated faces with reduced dark spots as being the most attractive whatever the origin of the face. The manipulation of dark circles contributed to making Caucasian and Chinese faces being perceived younger and more attractive than the original faces, although the effect was less pronounced than for the two other types of manipulation. This is the first study to have examined the influence of various age-related skin features on the facial age and attractiveness perception of Chinese women. The results highlight different contributions of dark spots, sagging/wrinkles and dark circles to their perception of Chinese and Caucasian faces. © 2014 The Authors. International Journal of Cosmetic Science published by John Wiley & Sons Ltd on behalf of Society of Cosmetic Scientists and Societe Francaise de Cosmetologie.

  19. Facial movements strategically camouflage involuntary social signals of face morphology.

    PubMed

    Gill, Daniel; Garrod, Oliver G B; Jack, Rachael E; Schyns, Philippe G

    2014-05-01

    Animals use social camouflage as a tool of deceit to increase the likelihood of survival and reproduction. We tested whether humans can also strategically deploy transient facial movements to camouflage the default social traits conveyed by the phenotypic morphology of their faces. We used the responses of 12 observers to create models of the dynamic facial signals of dominance, trustworthiness, and attractiveness. We applied these dynamic models to facial morphologies differing on perceived dominance, trustworthiness, and attractiveness to create a set of dynamic faces; new observers rated each dynamic face according to the three social traits. We found that specific facial movements camouflage the social appearance of a face by modulating the features of phenotypic morphology. A comparison of these facial expressions with those similarly derived for facial emotions showed that social-trait expressions, rather than being simple one-to-one overgeneralizations of emotional expressions, are a distinct set of signals composed of movements from different emotions. Our generative face models represent novel psychophysical laws for social sciences; these laws predict the perception of social traits on the basis of dynamic face identities.

  20. Intact Rapid Facial Mimicry as well as Generally Reduced Mimic Responses in Stable Schizophrenia Patients

    PubMed Central

    Chechko, Natalya; Pagel, Alena; Otte, Ellen; Koch, Iring; Habel, Ute

    2016-01-01

    Spontaneous emotional expressions (rapid facial mimicry) perform both emotional and social functions. In the current study, we sought to test whether automatic mimic responses to emotional facial expressions are deficient in 15 patients with stable schizophrenia compared to 15 controls. In a perception-action interference paradigm (the Simon task; first experiment), and in the context of a dual-task paradigm (second experiment), the task-relevant stimulus feature was the gender of a face, which, however, displayed a smiling or frowning expression (task-irrelevant stimulus feature). We measured the electromyographical activity in the corrugator supercilii and zygomaticus major muscle regions in response to either compatible or incompatible stimuli (i.e., when the required response did or did not correspond to the depicted facial expression). The compatibility effect based on interactions between the implicit processing of a task-irrelevant emotional facial expression and the conscious production of an emotional facial expression did not differ between the groups. In stable patients, in spite of a generally reduced mimic reaction, we observed an intact capacity to respond spontaneously to facial emotional stimuli. PMID:27303335

  1. The effect of background music on episodic memory and autonomic responses: listening to emotionally touching music enhances facial memory capacity.

    PubMed

    Proverbio, Alice Mado; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto

    2015-10-15

    The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding.

  2. The effect of background music on episodic memory and autonomic responses: listening to emotionally touching music enhances facial memory capacity

    PubMed Central

    Mado Proverbio, C.A. Alice; Lozano Nasi, Valentina; Alessandra Arcari, Laura; De Benedetto, Francesco; Guardamagna, Matteo; Gazzola, Martina; Zani, Alberto

    2015-01-01

    The aim of this study was to investigate how background auditory processing can affect other perceptual and cognitive processes as a function of stimulus content, style and emotional nature. Previous studies have offered contrasting evidence, and it has been recently shown that listening to music negatively affected concurrent mental processing in the elderly but not in young adults. To further investigate this matter, the effect of listening to music vs. listening to the sound of rain or silence was examined by administering an old/new face memory task (involving 448 unknown faces) to a group of 54 non-musician university students. Heart rate and diastolic and systolic blood pressure were measured during an explicit face study session that was followed by a memory test. The results indicated that more efficient and faster recall of faces occurred under conditions of silence or when participants were listening to emotionally touching music. Whereas auditory background (e.g., rain or joyful music) interfered with memory encoding, listening to emotionally touching music improved memory and significantly increased heart rate. It is hypothesized that touching music is able to modify the visual perception of faces by binding facial properties with auditory and emotionally charged information (music), which may therefore result in deeper memory encoding. PMID:26469712

  3. Text-to-audiovisual speech synthesizer for children with learning disabilities.

    PubMed

    Mendi, Engin; Bayrak, Coskun

    2013-01-01

    Learning disabilities affect the ability of children to learn, despite their having normal intelligence. Assistive tools can highly increase functional capabilities of children with learning disorders such as writing, reading, or listening. In this article, we describe a text-to-audiovisual synthesizer that can serve as an assistive tool for such children. The system automatically converts an input text to audiovisual speech, providing synchronization of the head, eye, and lip movements of the three-dimensional face model with appropriate facial expressions and word flow of the text. The proposed system can enhance speech perception and help children having learning deficits to improve their chances of success.
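
    One small, hedged sketch of the scheduling step such a text-to-audiovisual synthesizer needs is shown below: timed phonemes from a TTS front end are mapped to viseme keyframes that drive the 3D face model's lip movements. The phoneme set, viseme labels, and timings are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: map timed phonemes to viseme keyframes for lip animation.
# Phoneme inventory, viseme names, and timings are illustrative placeholders.
PHONEME_TO_VISEME = {
    "p": "lips_closed", "b": "lips_closed", "m": "lips_closed",
    "f": "lip_teeth", "v": "lip_teeth",
    "a": "open_jaw", "o": "rounded", "i": "spread",
}

def viseme_track(timed_phonemes):
    """timed_phonemes: list of (phoneme, start_sec, end_sec) from a TTS front end."""
    track = []
    for phoneme, start, end in timed_phonemes:
        viseme = PHONEME_TO_VISEME.get(phoneme, "rest")
        track.append({"viseme": viseme, "start": start, "end": end})
    return track

# Example: the word "map" -> m, a, p with hypothetical timings.
print(viseme_track([("m", 0.00, 0.08), ("a", 0.08, 0.22), ("p", 0.22, 0.30)]))
```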

  4. Brain Responses to Dynamic Facial Expressions: A Normative Meta-Analysis.

    PubMed

    Zinchenko, Oksana; Yaple, Zachary A; Arsalidou, Marie

    2018-01-01

    Identifying facial expressions is crucial for social interactions. Functional neuroimaging studies show that a set of brain areas, such as the fusiform gyrus and amygdala, become active when viewing emotional facial expressions. The majority of functional magnetic resonance imaging (fMRI) studies investigating face perception typically employ static images of faces. However, studies that use dynamic facial expressions (e.g., videos) are accumulating and suggest that a dynamic presentation may be more sensitive and ecologically valid for investigating faces. By using quantitative fMRI meta-analysis, the present study examined the concordance of brain regions associated with viewing dynamic facial expressions. We analyzed data from 216 participants across 14 studies, which reported coordinates for 28 experiments. Our analysis revealed concordant activation in the bilateral fusiform and middle temporal gyri, the left amygdala, the left declive of the cerebellum, and the right inferior frontal gyrus. These regions are discussed in terms of their relation to models of face processing.

  5. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    ERIC Educational Resources Information Center

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated if deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ and age matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity or by facial identity…

  6. Viewing distance matter to perceived intensity of facial expressions

    PubMed Central

    Gerhardsson, Andreas; Högman, Lennart; Fischer, Håkan

    2015-01-01

    In our daily perception of facial expressions, we depend on an ability to generalize across the varied distances at which they may appear. This is important to how we interpret the quality and the intensity of the expression. Previous research has not investigated whether this so called perceptual constancy also applies to the experienced intensity of facial expressions. Using a psychophysical measure (Borg CR100 scale) the present study aimed to further investigate perceptual constancy of happy and angry facial expressions at varied sizes, which is a proxy for varying viewing distances. Seventy-one (42 females) participants rated the intensity and valence of facial expressions varying in distance and intensity. The results demonstrated that the perceived intensity (PI) of the emotional facial expression was dependent on the distance of the face and the person perceiving it. An interaction effect was noted, indicating that close-up faces are perceived as more intense than faces at a distance and that this effect is stronger the more intense the facial expression truly is. The present study raises considerations regarding constancy of the PI of happy and angry facial expressions at varied distances. PMID:26191035

  7. Forming Facial Expressions Influences Assessment of Others' Dominance but Not Trustworthiness.

    PubMed

    Ueda, Yoshiyuki; Nagoya, Kie; Yoshikawa, Sakiko; Nomura, Michio

    2017-01-01

    Forming specific facial expressions influences emotions and perception. Bearing this in mind, studies should be reconsidered in which observers expressing neutral emotions inferred personal traits from the facial expressions of others. In the present study, participants were asked to make happy, neutral, and disgusted facial expressions: for "happy," they held a wooden chopstick in their molars to form a smile; for "neutral," they clasped the chopstick between their lips, making no expression; for "disgusted," they put the chopstick between their upper lip and nose and knit their brows in a scowl. However, they were not asked to intentionally change their emotional state. Observers judged happy expression images as more trustworthy, competent, warm, friendly, and distinctive than disgusted expression images, regardless of the observers' own facial expression. Observers judged disgusted expression images as more dominant than happy expression images. However, observers expressing disgust overestimated dominance in observed disgusted expression images and underestimated dominance in happy expression images. In contrast, observers with happy facial forms attenuated dominance for disgusted expression images. These results suggest that dominance inferred from facial expressions is unstable and influenced by not only the observed facial expression, but also the observers' own physiological states.

  8. Is moral beauty different from facial beauty? Evidence from an fMRI study

    PubMed Central

    Wang, Tingting; Mo, Ce; Tan, Li Hai; Cant, Jonathan S.; Zhong, Luojin; Cupchik, Gerald

    2015-01-01

    Is moral beauty different from facial beauty? Two functional magnetic resonance imaging experiments were performed to answer this question. Experiment 1 investigated the network of moral aesthetic judgments and facial aesthetic judgments. Participants performed aesthetic judgments and gender judgments on both faces and scenes containing moral acts. The conjunction analysis of the contrasts ‘facial aesthetic judgment > facial gender judgment’ and ‘scene moral aesthetic judgment > scene gender judgment’ identified the common involvement of the orbitofrontal cortex (OFC), inferior temporal gyrus and medial superior frontal gyrus, suggesting that both types of aesthetic judgments are based on the orchestration of perceptual, emotional and cognitive components. Experiment 2 examined the network of facial beauty and moral beauty during implicit perception. Participants performed a non-aesthetic judgment task on both faces (beautiful vs common) and scenes (containing morally beautiful vs neutral information). We observed that facial beauty (beautiful faces > common faces) involved both the cortical reward region OFC and the subcortical reward region putamen, whereas moral beauty (moral beauty scenes > moral neutral scenes) only involved the OFC. Moreover, compared with facial beauty, moral beauty spanned a larger-scale cortical network, indicating more advanced and complex cerebral representations characterizing moral beauty. PMID:25298010

  9. Reaction Time of Facial Affect Recognition in Asperger's Disorder for Cartoon and Real, Static and Moving Faces

    ERIC Educational Resources Information Center

    Miyahara, Motohide; Bray, Anne; Tsujii, Masatsugu; Fujita, Chikako; Sugiyama, Toshiro

    2007-01-01

    This study used a choice reaction-time paradigm to test the perceived impairment of facial affect recognition in Asperger's disorder. Twenty teenagers with Asperger's disorder and 20 controls were compared with respect to the latency and accuracy of response to happy or disgusted facial expressions, presented in cartoon or real images and in…

  10. On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information

    NASA Astrophysics Data System (ADS)

    Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.

    Towards realizing a multimodal affect recognition system, we are considering the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy in affective user models. Specifically, we present and discuss the development and evaluation process of two corresponding affect recognition subsystems, with emphasis on the recognition of 6 basic emotional states, namely happiness, sadness, surprise, anger and disgust as well as the emotion-less state which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information and the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.
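
    As a hedged illustration of how the two modalities discussed above could be combined, the sketch below performs a simple late fusion of class probabilities from a visual-facial classifier and a keyboard-stroke classifier over the six states named in the abstract. The weights, probability vectors, and function are placeholders, not the authors' system.

```python
# Hedged sketch: weighted late fusion of per-class probabilities from two
# affect-recognition modalities. Weights and example numbers are illustrative.
import numpy as np

EMOTIONS = ["happiness", "sadness", "surprise", "anger", "disgust", "neutral"]

def fuse(face_probs: np.ndarray, keystroke_probs: np.ndarray,
         face_weight: float = 0.7) -> str:
    """Each argument is a probability vector over EMOTIONS (sums to 1)."""
    combined = face_weight * face_probs + (1.0 - face_weight) * keystroke_probs
    return EMOTIONS[int(np.argmax(combined))]

# Example: the face channel is ambiguous between anger and sadness, and the
# keyboard channel tips the decision.
face = np.array([0.05, 0.30, 0.05, 0.35, 0.15, 0.10])
keys = np.array([0.05, 0.55, 0.05, 0.15, 0.10, 0.10])
print(fuse(face, keys))  # -> "sadness" with these illustrative numbers
```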

  11. Agency and facial emotion judgment in context.

    PubMed

    Ito, Kenichi; Masuda, Takahiko; Li, Liman Man Wai

    2013-06-01

    Past research showed that East Asians' belief in holism was expressed as their tendencies to include background facial emotions into the evaluation of target faces more than North Americans. However, this pattern can be interpreted as North Americans' tendency to downplay background facial emotions due to their conceptualization of facial emotion as volitional expression of internal states. Examining this alternative explanation, we investigated whether different types of contextual information produce varying degrees of effect on one's face evaluation across cultures. In three studies, European Canadians and East Asians rated the intensity of target facial emotions surrounded with either affectively salient landscape sceneries or background facial emotions. The results showed that, although affectively salient landscapes influenced the judgment of both cultural groups, only European Canadians downplayed the background facial emotions. The role of agency as differently conceptualized across cultures and multilayered systems of cultural meanings are discussed.

  12. Cutaneous electrical stimulation treatment in unresolved facial nerve paralysis: an exploratory study.

    PubMed

    Hyvärinen, Antti; Tarkka, Ina M; Mervaala, Esa; Pääkkönen, Ari; Valtonen, Hannu; Nuutinen, Juhani

    2008-12-01

    The purpose of this study was to assess clinical and neurophysiological changes after 6 mos of transcutaneous electrical stimulation in patients with unresolved facial nerve paralysis. A pilot case series of 10 consecutive patients with chronic facial nerve paralysis, either of idiopathic origin or due to herpes zoster oticus, participated in this open study. All patients received below-sensory-threshold transcutaneous electrical stimulation for 6 mos for their facial nerve paralysis. The intervention consisted of gradually increasing the duration of electrical stimulation of three sites on the affected area for up to 6 hrs/day. Assessments of facial nerve function were performed using the House-Brackmann clinical scale and neurophysiological measurements of compound motor action potential distal latencies on the affected and nonaffected sides. Patients were tested before and after the intervention. A significant improvement was observed in the facial nerve upper branch compound motor action potential distal latency on the affected side in all patients. An improvement of one grade on the House-Brackmann scale was observed, and some patients also reported subjective improvement. Transcutaneous electrical stimulation treatment may have a positive effect on unresolved facial nerve paralysis. This study illustrates a possibly effective treatment option for patients with chronic facial paresis who have no other expectation of recovery.

  13. Use of context in emotion perception: The role of top-down control, cue type, and perceiver's age.

    PubMed

    Ngo, Nhi; Isaacowitz, Derek M

    2015-06-01

    Although context is crucial to emotion perception, there are various factors that can modulate contextual influence. The current research investigated how cue type, top-down control, and the perceiver's age influence attention to context in facial emotion perception. In 2 experiments, younger and older adults identified facial expressions contextualized by other faces, isolated objects, and scenes. In the first experiment, participants were instructed to ignore face, object, and scene contexts. Face context was found to influence perception the least, whereas scene context produced the most contextual effect. Older adults were more influenced by context than younger adults, but both age groups were similarly influenced by different types of contextual cues, even when they were instructed to ignore the context. In the second experiment, when explicitly instructed that the context had no meaningful relationship to the target, younger and older adults both were less influenced by context than when they were instructed that the context was relevant to the target. Results from both studies indicate that contextual influence on emotion perception is not constant, but can vary based on the type of contextual cue, cue relevance, and the perceiver's age. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  14. Brain synchronization during perception of facial emotional expressions with natural and unnatural dynamics

    PubMed Central

    Perdikis, Dionysios; Volhard, Jakob; Müller, Viktor; Kaulard, Kathrin; Brick, Timothy R.; Wallraven, Christian; Lindenberger, Ulman

    2017-01-01

    Research on the perception of facial emotional expressions (FEEs) often uses static images that do not capture the dynamic character of social coordination in natural settings. Recent behavioral and neuroimaging studies suggest that dynamic FEEs (videos or morphs) enhance emotion perception. To identify mechanisms associated with the perception of FEEs with natural dynamics, the present EEG (electroencephalography) study compared (i) ecologically valid stimuli of angry and happy FEEs with natural dynamics to (ii) FEEs with unnatural dynamics, and to (iii) static FEEs. FEEs with unnatural dynamics showed faces moving in a biologically possible but unpredictable and atypical manner, generally resulting in ambivalent emotional content. Participants were asked to explicitly recognize FEEs. Using whole power (WP) and phase synchrony (Phase Locking Index, PLI), we found that brain responses discriminated between natural and unnatural FEEs (both static and dynamic). Differences were primarily observed in the timing and brain topographies of delta and theta PLI and WP, and in alpha and beta WP. Our results support the view that biologically plausible, albeit atypical, FEEs are processed by the brain by different mechanisms than natural FEEs. We conclude that natural movement dynamics are essential for the perception of FEEs and the associated brain processes. PMID:28723957

  15. Brain synchronization during perception of facial emotional expressions with natural and unnatural dynamics.

    PubMed

    Perdikis, Dionysios; Volhard, Jakob; Müller, Viktor; Kaulard, Kathrin; Brick, Timothy R; Wallraven, Christian; Lindenberger, Ulman

    2017-01-01

    Research on the perception of facial emotional expressions (FEEs) often uses static images that do not capture the dynamic character of social coordination in natural settings. Recent behavioral and neuroimaging studies suggest that dynamic FEEs (videos or morphs) enhance emotion perception. To identify mechanisms associated with the perception of FEEs with natural dynamics, the present EEG (electroencephalography) study compared (i) ecologically valid stimuli of angry and happy FEEs with natural dynamics to (ii) FEEs with unnatural dynamics, and to (iii) static FEEs. FEEs with unnatural dynamics showed faces moving in a biologically possible but unpredictable and atypical manner, generally resulting in ambivalent emotional content. Participants were asked to explicitly recognize FEEs. Using whole power (WP) and phase synchrony (Phase Locking Index, PLI), we found that brain responses discriminated between natural and unnatural FEEs (both static and dynamic). Differences were primarily observed in the timing and brain topographies of delta and theta PLI and WP, and in alpha and beta WP. Our results support the view that biologically plausible, albeit atypical, FEEs are processed by the brain by different mechanisms than natural FEEs. We conclude that natural movement dynamics are essential for the perception of FEEs and the associated brain processes.
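
    The phase-synchrony measure named in this record (the Phase Locking Index, also called inter-trial coherence) is computed by extracting instantaneous phase on each trial and averaging unit phasors across trials. The sketch below is a minimal illustration of that idea, assuming a NumPy/SciPy environment and a hypothetical array of band-pass-filtered epochs; it is not the authors' analysis pipeline.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def phase_locking_index(trials):
        """Inter-trial phase locking (PLI/ITC) per time point.

        trials : array of shape (n_trials, n_samples), already band-pass
                 filtered in the frequency band of interest (e.g., theta).
        Returns an array of shape (n_samples,) with values in [0, 1];
        1 means the phase is identical across trials at that latency.
        """
        analytic = hilbert(trials, axis=-1)          # complex analytic signal
        phases = np.angle(analytic)                  # instantaneous phase per trial
        return np.abs(np.mean(np.exp(1j * phases), axis=0))

    # Hypothetical usage: 40 trials, 500 samples of theta-filtered EEG (noise here)
    rng = np.random.default_rng(0)
    trials = rng.standard_normal((40, 500))
    pli = phase_locking_index(trials)
    print(pli.shape, float(pli.mean()))
    ```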

  16. Understanding the Burden of Adult Female Acne

    PubMed Central

    Kawata, Ariane K.; Daniels, Selena R.; Yeomans, Karen; Burk, Caroline T.; Callender, Valerie D.

    2014-01-01

    Objective: Typically regarded as an adolescent condition, acne among adult females is also prevalent. Limited data are available on the clinical characteristics and burden of adult female acne. The study objective was to describe clinical characteristics and psychosocial impact of acne in adult women. Design: Cross-sectional, web-based survey. Setting: Data were collected from a diverse sample of United States females. Participants: Women ages 25 to 45 years with facial acne (≥25 visible lesions). Measurements: Outcomes included sociodemographic and clinical characteristics, perceptions, coping behaviors, psychosocial impact of acne (health-related quality of life using acne-specific Quality of Life questionnaire and psychological status using Patient Health Questionnaire), and work/productivity. Results: A total of 208 women completed the survey (mean age 35±6 years), comprising White/Caucasian (51.4%), Black/African American (24.5%), Hispanic/Latino (11.1%), Asian (7.7%), and Other (5.3%). Facial acne presented most prominently on cheeks, chin, and forehead and was characterized by erythema, postinflammatory hyperpigmentation, and scarring. Average age of adult onset was 25±6 years, and one-third (33.7%) were diagnosed with acne as an adult. The majority (80.3%) had 25 to 49 visible facial lesions. Acne was perceived as troublesome and impacted self-confidence. Makeup was frequently used to conceal acne. Facial acne negatively affected health-related quality of life, was associated with mild/moderate symptoms of depression and/or anxiety, and impacted ability to concentrate on work or school. Conclusion: Results highlight the multifaceted impact of acne and provide evidence that adult female acne is under-recognized and burdensome. PMID:24578779

  17. Subjective and objective evaluation of frontal smile esthetics in patients with facial asymmetry-a comparative cross-sectional study.

    PubMed

    Singh, H; Maurya, R K; Kapoor, P; Sharma, P; Srivastava, D

    2017-02-01

    To analyze the relationship between subjective and objective evaluations of pre-treatment posed smiles in patients with facial asymmetry and to assess the influence of dentofacial structures involved in asymmetry on the perception of smile attractiveness. Thirty-five patients (25 males and 10 females) between 18 and 25 years of age with facial asymmetry were selected. Pre-treatment clinical photographs of posed smiles were subjectively evaluated by a panel of 20 orthodontists, 20 oral surgeons, and 20 laypersons. A customized Smile Mesh program was used for objective evaluation of the same smiles. Direct comparison among three smile groups (unattractive, slightly attractive, and attractive) for different Smile Mesh measurements was carried out using a two-way ANOVA. Additionally, linear regression was performed to evaluate whether these measurements could predict the attractiveness of captured smiles. Patients with 'slightly attractive' smiles had a significantly greater distance between the incisal margin of the maxillary central incisor and the lower lip during smiling. The Smile Index was significantly greater in attractive smiles. However, based on the coefficients of linear regression, no objectively gathered measurement could predict smile attractiveness. Attractiveness or unattractiveness of smiles in patients with facial asymmetry could not be predicted by any measurement of Smile Mesh. The presence of facial asymmetry did not significantly influence the perception of smile esthetics. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
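
    For readers who want to reproduce the general form of this analysis (a two-way ANOVA comparing an objective measurement across groups, followed by regression to test whether the measurement predicts attractiveness), the snippet below is a generic sketch using statsmodels. The column names and factor levels (smile_index, attract_group, rater_group, attract_score) are placeholders, not the study's actual variables.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical data frame: one row per rated smile, with a Smile Mesh-style
    # measurement, its assigned attractiveness group, and the rater panel.
    rng = np.random.default_rng(6)
    df = pd.DataFrame({
        "smile_index": rng.normal(1.3, 0.2, 180),
        "attract_group": np.repeat(["unattractive", "slightly attractive", "attractive"], 60),
        "rater_group": np.tile(["orthodontist", "oral surgeon", "layperson"], 60),
        "attract_score": rng.uniform(1, 10, 180),
    })

    # Two-way ANOVA: does the measurement differ across attractiveness groups
    # and rater panels (including their interaction)?
    model = smf.ols("smile_index ~ C(attract_group) * C(rater_group)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))

    # Linear regression: can the measurement predict the continuous attractiveness score?
    print(smf.ols("attract_score ~ smile_index", data=df).fit().summary())
    ```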

  18. A View of the Therapy for Bell's Palsy Based on Molecular Biological Analyses of Facial Muscles.

    PubMed

    Moriyama, Hiroshi; Mitsukawa, Nobuyuki; Itoh, Masahiro; Otsuka, Naruhito

    2017-12-01

    Details regarding the molecular biological features of Bell's palsy have not been widely reported in textbooks. We genetically analyzed facial muscles and clarified these points. We performed genetic analysis of facial muscle specimens from Japanese patients with severe (House-Brackmann facial nerve grading system V) and moderate (House-Brackmann facial nerve grading system III) dysfunction due to Bell's palsy. Microarray analysis of gene expression was performed using specimens from the healthy and affected sides, and gene expression was compared. Changes in gene expression were defined as an affected side/healthy side ratio of >1.5 or <0.5. We observed that gene expression in Bell's palsy changes with the degree of facial nerve palsy. In particular, genes in the muscle, neuron, and energy categories tended to fluctuate with the degree of facial nerve palsy. It is expected that this study will aid in the development of new treatments and diagnostic/prognostic markers based on the severity of facial nerve palsy.

  19. Animated pedagogical agents: How the presence and nonverbal communication of a virtual instructor affect perceptions and learning outcomes in a computer-based environment about basic physics concepts

    NASA Astrophysics Data System (ADS)

    Frechette, M. Casey

    One important but under-researched area of instructional technology concerns the effects of animated pedagogical agents (APAs), or lifelike characters designed to enhance learning in computer-based environments. This research sought to broaden what is currently known about APAs' instructional value by investigating the effects of agents' visual presence and nonverbal communication. A theoretical framework based on APA literature published in the past decade guided the design of the study. This framework sets forth that APAs impact learning through their presence and communication. The communication displayed by an APA involves two distinct kinds of nonverbal cues: cognitive (hand and arm gestures) and affective (facial expressions). It was predicted that the presence of an agent would enhance learning and that nonverbal communication would amplify these effects. The research utilized a between-subjects experimental design. Participants were randomly assigned to treatment conditions in a controlled lab setting, and group means were compared with a MANCOVA. Participants received (1) a non-animated agent, (2) an agent with hand and arm gestures, (3) an agent with facial expressions, or (4) a fully animated agent. The agent appeared in a virtual learning environment focused on Kepler's laws of planetary motion. A control group did not receive the visual presence of an agent. Two effects were studied: participants' perceptions and their learning outcomes. Perceptions were measured with an attitudinal survey with five subscales. Learning outcomes were measured with an open-ended recall test, a multiple choice comprehension test, and an open-ended transfer test. Learners presented with an agent with affective nonverbal communication comprehended less than learners exposed to a non-animated agent. No significant differences were observed when a group exposed to a fully animated agent was compared to a group with a non-animated agent. Adding both nonverbal communication channels mitigated the disadvantages of adding just one kind of nonverbal cue. No statistically significant differences were observed on measures of recall or transfer, or on the attitudinal survey. The research supports the notion that invoking a human-like presence in a virtual learning environment prompts strong expectations about the character's realism. When these expectations are not met, learning is hindered.

  20. The Influences of Face Inversion and Facial Expression on Sensitivity to Eye Contact in High-Functioning Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Vida, Mark D.; Maurer, Daphne; Calder, Andrew J.; Rhodes, Gillian; Walsh, Jennifer A.; Pachai, Matthew V.; Rutherford, M. D.

    2013-01-01

    We examined the influences of face inversion and facial expression on sensitivity to eye contact in high-functioning adults with and without an autism spectrum disorder (ASD). Participants judged the direction of gaze of angry, fearful, and neutral faces. In the typical group only, the range of directions of gaze leading to the perception of eye…

  1. The effect of facial makeup on the frequency of drivers stopping for hitchhikers.

    PubMed

    Guéguen, Nicolas; Lamy, Lubomir

    2013-08-01

    Judgments of photographs have shown that makeup enhances ratings of women's facial attractiveness. The present study assessed whether makeup affects the stopping behavior of drivers in response to a hitchhiker's signal. Four 20- to 22-year-old female confederates wore facial makeup, or not, while pretending to be hitchhiking. Frequency of stopping was compared in 1,600 male and female drivers. Facial makeup was associated with an increase in the number of male drivers who stopped to offer a ride. Makeup did not affect frequency of stopping by female drivers.

  2. Valence-Specific Laterality Effects in Vocal Emotion: Interactions with Stimulus Type, Blocking and Sex

    ERIC Educational Resources Information Center

    Schepman, Astrid; Rodway, Paul; Geddes, Pauline

    2012-01-01

    Valence-specific laterality effects have been frequently obtained in facial emotion perception but not in vocal emotion perception. We report a dichotic listening study further examining whether valence-specific laterality effects generalise to vocal emotions. Based on previous literature, we tested whether valence-specific laterality effects were…

  3. Variants of Independence in the Perception of Facial Identity and Expression

    ERIC Educational Resources Information Center

    Fitousi, Daniel; Wenger, Michael J.

    2013-01-01

    A prominent theory in the face perception literature--the parallel-route hypothesis (Bruce & Young, 1986)--assumes a dedicated channel for the processing of identity that is separate and independent from the channel(s) in which nonidentity information is processed (e.g., expression, eye gaze). The current work subjected this assumption to…

  4. Effects of touch on emotional face processing: A study of event-related potentials, facial EMG and cardiac activity.

    PubMed

    Spapé, M M; Harjunen, Ville; Ravaja, N

    2017-03-01

    Being touched is known to affect emotion, and even a casual touch can elicit positive feelings and affinity. Psychophysiological studies have recently shown that tactile primes affect visual evoked potentials to emotional stimuli, suggesting altered affective stimulus processing. As, however, these studies approached emotion from a purely unidimensional perspective, it remains unclear whether touch biases emotional evaluation or a more general feature such as salience. Here, we investigated how simple tactile primes modulate event related potentials (ERPs), facial EMG and cardiac response to pictures of facial expressions of emotion. All measures replicated known effects of emotional face processing: Disgust and fear modulated early ERPs, anger increased the cardiac orienting response, and expressions elicited emotion-congruent facial EMG activity. Tactile primes also affected these measures, but priming never interacted with the type of emotional expression. Thus, touch may additively affect general stimulus processing, but it does not bias or modulate immediate affective evaluation. Copyright © 2017. Published by Elsevier B.V.

  5. The effects of postnatal maternal depression and anxiety on the processing of infant faces

    PubMed Central

    Arteche, Adriane; Joormann, Jutta; Harvey, Allison; Craske, Michelle; Gotlib, Ian H.; Lehtonen, Annukka; Counsell, Nicholas; Stein, Alan

    2011-01-01

    Background Postnatally depressed mothers have difficulties responding appropriately to their infants. The quality of the mother–child relationship depends on a mother's ability to respond to her infant's cues, which are largely non-verbal. Therefore, it is likely that difficulties in a mother's appraisal of her infants' facial expressions will affect the quality of mother–infant interaction. This study aimed to investigate the effects of postnatal depression and anxiety on the processing of infants' facial expressions. Method A total of 89 mothers, 34 with Generalised Anxiety Disorder, 21 with Major Depressive Disorder, and 34 controls, completed a ‘morphed infants’ faces task when their children were between 10 and 18 months. Results Overall, mothers were more likely to identify happy faces accurately and at lower intensity than sad faces. Depressed compared to control participants, however, were less likely to accurately identify happy infant faces. Interestingly, mothers with GAD tended to identify happy faces at a lower intensity than controls. There were no differences between the groups in relation to sad faces. Limitations Our sample was relatively small and further research is needed to investigate the links between mothers' perceptions of infant expressions and both maternal responsiveness and later measures of child development. Conclusion Our findings have potential clinical implications as the difficulties in the processing of positive facial expressions in depression may lead to less maternal responsiveness to positive affect in the offspring and may diminish the quality of the mother–child interactions. Results for participants with GAD are consistent with the literature demonstrating that persons with GAD are intolerant of uncertainty and seek reassurance due to their worries. PMID:21641652

  6. The effects of postnatal maternal depression and anxiety on the processing of infant faces.

    PubMed

    Arteche, Adriane; Joormann, Jutta; Harvey, Allison; Craske, Michelle; Gotlib, Ian H; Lehtonen, Annukka; Counsell, Nicholas; Stein, Alan

    2011-09-01

    Postnatally depressed mothers have difficulties responding appropriately to their infants. The quality of the mother-child relationship depends on a mother's ability to respond to her infant's cues, which are largely non-verbal. Therefore, it is likely that difficulties in a mother's appraisal of her infants' facial expressions will affect the quality of mother-infant interaction. This study aimed to investigate the effects of postnatal depression and anxiety on the processing of infants' facial expressions. A total of 89 mothers, 34 with Generalised Anxiety Disorder, 21 with Major Depressive Disorder, and 34 controls, completed a 'morphed infants' faces task when their children were between 10 and 18 months. Overall, mothers were more likely to identify happy faces accurately and at lower intensity than sad faces. Depressed compared to control participants, however, were less likely to accurately identify happy infant faces. Interestingly, mothers with GAD tended to identify happy faces at a lower intensity than controls. There were no differences between the groups in relation to sad faces. Our sample was relatively small and further research is needed to investigate the links between mothers' perceptions of infant expressions and both maternal responsiveness and later measures of child development. Our findings have potential clinical implications as the difficulties in the processing of positive facial expressions in depression may lead to less maternal responsiveness to positive affect in the offspring and may diminish the quality of the mother-child interactions. Results for participants with GAD are consistent with the literature demonstrating that persons with GAD are intolerant of uncertainty and seek reassurance due to their worries. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. On the Perception of Religious Group Membership from Faces

    PubMed Central

    Rule, Nicholas O.; Garrett, James V.; Ambady, Nalini

    2010-01-01

    Background The study of social categorization has largely been confined to examining groups distinguished by perceptually obvious cues. Yet many ecologically important group distinctions are less clear, permitting insights into the general processes involved in person perception. Although religious group membership is thought to be perceptually ambiguous, folk beliefs suggest that Mormons and non-Mormons can be categorized from their appearance. We tested whether Mormons could be distinguished from non-Mormons and investigated the basis for this effect to gain insight into how subtle perceptual cues can support complex social categorizations. Methodology/Principal Findings Participants categorized Mormons' and non-Mormons' faces or facial features according to their group membership. Individuals could distinguish between the two groups significantly better than chance guessing from their full faces and faces without hair, with eyes and mouth covered, without outer face shape, and inverted 180°; but not from isolated features (i.e., eyes, nose, or mouth). Perceivers' estimations of their accuracy did not match their actual accuracy. Exploration of the remaining features showed that Mormons and non-Mormons significantly differed in perceived health and that these perceptions were related to perceptions of skin quality, as demonstrated in a structural equation model representing the contributions of skin color and skin texture. Other judgments related to health (facial attractiveness, facial symmetry, and structural aspects related to body weight) did not differ between the two groups. Perceptions of health were also responsible for differences in perceived spirituality, explaining folk hypotheses that Mormons are distinct because they appear more spiritual than non-Mormons. Conclusions/Significance Subtle markers of group membership can influence how others are perceived and categorized. Perceptions of health from non-obvious and minimal cues distinguished individuals according to their religious group membership. These data illustrate how the non-conscious detection of very subtle differences in others' appearances supports cognitively complex judgments such as social categorization. PMID:21151864

  8. Event-related theta synchronization predicts deficit in facial affect recognition in schizophrenia.

    PubMed

    Csukly, Gábor; Stefanics, Gábor; Komlósi, Sarolta; Czigler, István; Czobor, Pál

    2014-02-01

    Growing evidence suggests that abnormalities in the synchronized oscillatory activity of neurons in schizophrenia may lead to impaired neural activation and temporal coding and thus lead to neurocognitive dysfunctions, such as deficits in facial affect recognition. To gain an insight into the neurobiological processes linked to facial affect recognition, we investigated both induced and evoked oscillatory activity by calculating the Event Related Spectral Perturbation (ERSP) and the Inter Trial Coherence (ITC) during facial affect recognition. Fearful and neutral faces as well as nonface patches were presented to 24 patients with schizophrenia and 24 matched healthy controls while EEG was recorded. The participants' task was to recognize facial expressions. Because previous findings with healthy controls showed that facial feature decoding was associated primarily with oscillatory activity in the theta band, we analyzed ERSP and ITC in this frequency band in the time interval of 140-200 ms, which corresponds to the N170 component. Event-related theta activity and phase-locking to facial expressions, but not to nonface patches, predicted emotion recognition performance in both controls and patients. Event-related changes in theta amplitude and phase-locking were found to be significantly weaker in patients compared with healthy controls, which is in line with previous investigations showing decreased neural synchronization in the low frequency bands in patients with schizophrenia. Neural synchrony is thought to underlie distributed information processing. Our results indicate a less effective functioning in the recognition process of facial features, which may contribute to a less effective social cognition in schizophrenia. PsycINFO Database Record (c) 2014 APA, all rights reserved.
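
    The Event Related Spectral Perturbation referenced in this record is, in essence, trial-averaged time-frequency power expressed relative to a pre-stimulus baseline. The sketch below illustrates that computation for a single channel, assuming NumPy/SciPy and hypothetical epoch data; the authors' actual pipeline, window parameters, and baseline interval are not reproduced.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    def ersp_db(trials, fs, baseline_samples):
        """Event-related spectral perturbation in dB relative to a pre-stimulus baseline.

        trials           : (n_trials, n_samples) single-channel EEG epochs
        fs               : sampling rate in Hz
        baseline_samples : number of samples belonging to the pre-stimulus baseline
        Returns (freqs, times, ersp), where ersp has shape (n_freqs, n_times).
        """
        powers = []
        for trial in trials:
            freqs, times, sxx = spectrogram(trial, fs=fs, nperseg=128, noverlap=96)
            powers.append(sxx)
        mean_power = np.mean(powers, axis=0)                        # average over trials
        baseline = mean_power[:, times < baseline_samples / fs].mean(axis=1, keepdims=True)
        return freqs, times, 10 * np.log10(mean_power / baseline)

    # Hypothetical usage: 30 epochs of 1 s at 500 Hz with a 200-ms baseline (noise here)
    rng = np.random.default_rng(1)
    freqs, times, ersp = ersp_db(rng.standard_normal((30, 500)), fs=500, baseline_samples=100)
    print(ersp.shape)
    ```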

  9. Cross-Category Adaptation: Objects Produce Gender Adaptation in the Perception of Faces

    PubMed Central

    Javadi, Amir Homayoun; Wee, Natalie

    2012-01-01

    Adaptation aftereffects have been found for low-level visual features such as colour, motion and shape perception, as well as higher-level features such as gender, race and identity in domains such as faces and biological motion. It is not yet clear if adaptation effects in humans extend beyond this set of higher order features. The aim of this study was to investigate whether objects highly associated with one gender, e.g. high heels for females or electric shavers for males can modulate gender perception of a face. In two separate experiments, we adapted subjects to a series of objects highly associated with one gender and subsequently asked participants to judge the gender of an ambiguous face. Results showed that participants are more likely to perceive an ambiguous face as male after being exposed to objects highly associated to females and vice versa. A gender adaptation aftereffect was obtained despite the adaptor and test stimuli being from different global categories (objects and faces respectively). These findings show that our perception of gender from faces is highly affected by our environment and recent experience. This suggests two possible mechanisms: (a) that perception of the gender associated with an object shares at least some brain areas with those responsible for gender perception of faces and (b) adaptation to gender, which is a high-level concept, can modulate brain areas that are involved in facial gender perception through top-down processes. PMID:23049942

  10. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults

    PubMed Central

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants. PMID:25610415

  11. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults.

    PubMed

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development-The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions-angry, fearful, sad, happy, surprised, and disgusted-and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.

  12. Effects of a small talking facial image on autonomic activity: the moderating influence of dispositional BIS and BAS sensitivities and emotions.

    PubMed

    Ravaja, Niklas

    2004-01-01

    We examined the moderating influence of dispositional behavioral inhibition system (BIS) and behavioral activation system (BAS) sensitivities, Negative Affect, and Positive Affect on the relationship between a small moving vs. static facial image and autonomic responses when viewing/listening to news messages read by a newscaster among 36 young adults. Autonomic parameters measured were respiratory sinus arrhythmia (RSA), low-frequency (LF) component of heart rate variability (HRV), electrodermal activity, and pulse transit time (PTT). The results showed that dispositional BAS sensitivity, particularly BAS Fun Seeking, and Negative Affect interacted with facial image motion in predicting autonomic nervous system activity. A moving facial image was related to lower RSA and LF component of HRV and shorter PTTs as compared to a static facial image among high BAS individuals. Even a small talking facial image may contribute to sustained attentional engagement among high BAS individuals, given that the BAS directs attention toward the positive cue and a moving social stimulus may act as a positive incentive for high BAS individuals.
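
    The autonomic measures listed here (respiratory sinus arrhythmia and the low-frequency component of heart rate variability) are typically derived from the spectrum of the interbeat-interval series. The snippet below is a rough, simplified sketch of that kind of computation, assuming NumPy/SciPy and a hypothetical series of R-R intervals; it is not the analysis used in the study.

    ```python
    import numpy as np
    from scipy.signal import welch
    from scipy.interpolate import interp1d

    def hrv_band_power(rr_ms, fs_resample=4.0):
        """Power of the LF (0.04-0.15 Hz) and HF/RSA (0.15-0.40 Hz) bands.

        rr_ms : sequence of successive R-R intervals in milliseconds.
        The irregularly spaced R-R series is resampled to a uniform grid,
        then band power is integrated from a Welch spectrum.
        """
        rr = np.asarray(rr_ms, dtype=float)
        beat_times = np.cumsum(rr) / 1000.0                      # seconds
        grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs_resample)
        rr_uniform = interp1d(beat_times, rr)(grid)
        freqs, psd = welch(rr_uniform - rr_uniform.mean(), fs=fs_resample, nperseg=256)
        df = freqs[1] - freqs[0]                                 # spectral resolution
        lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum() * df
        hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum() * df
        return lf, hf

    # Hypothetical usage: about 5 minutes of simulated R-R intervals around 800 ms
    rng = np.random.default_rng(2)
    print(hrv_band_power(800 + 50 * rng.standard_normal(375)))
    ```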

  13. Lip colour affects perceived sex typicality and attractiveness of human faces.

    PubMed

    Stephen, Ian D; McKeegan, Angela M

    2010-01-01

    The luminance contrast between facial features and facial skin is greater in women than in men, and women's use of make-up enhances this contrast. In black-and-white photographs, increased luminance contrast enhances femininity and attractiveness in women's faces, but reduces masculinity and attractiveness in men's faces. In Caucasians, much of the contrast between the lips and facial skin is in redness. Red lips have been considered attractive in women in geographically and temporally diverse cultures, possibly because they mimic vasodilation associated with sexual arousal. Here, we investigate the effects of lip luminance and colour contrast on the attractiveness and sex typicality (masculinity/femininity) of human faces. In a Caucasian sample, we allowed participants to manipulate the colour of the lips in colour-calibrated face photographs along CIELab L* (light--dark), a* (red--green), and b* (yellow--blue) axes to enhance apparent attractiveness and sex typicality. Participants increased redness contrast to enhance femininity and attractiveness of female faces, but reduced redness contrast to enhance masculinity of men's faces. Lip blueness was reduced more in female than male faces. Increased lightness contrast enhanced the attractiveness of both sexes, and had little effect on perceptions of sex typicality. The association between lip colour contrast and attractiveness in women's faces may be attributable to its association with oxygenated blood perfusion indicating oestrogen levels, sexual arousal, and cardiac and respiratory health.
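
    The colour manipulation described in this record operates in CIELab space, where the a* axis carries red-green contrast. The snippet below is a minimal illustration of shifting a* within a lip region, assuming scikit-image, a synthetic image, and a hypothetical boolean lip mask; it does not reproduce the calibrated stimuli or the manipulation software used by the authors.

    ```python
    import numpy as np
    from skimage.color import rgb2lab, lab2rgb

    def shift_lip_redness(image_rgb, lip_mask, delta_a):
        """Increase (or decrease) lip redness by delta_a units along the CIELab a* axis.

        image_rgb : float RGB image in [0, 1], shape (H, W, 3)
        lip_mask  : boolean array of shape (H, W) marking lip pixels
        delta_a   : positive values push the lips toward red, negative toward green
        """
        lab = rgb2lab(image_rgb)
        lab[..., 1] = np.where(lip_mask, lab[..., 1] + delta_a, lab[..., 1])
        return np.clip(lab2rgb(lab), 0.0, 1.0)

    # Hypothetical usage with a synthetic image and mask (stand-ins for a real photograph)
    face = np.random.default_rng(5).random((64, 64, 3))
    mask = np.zeros((64, 64), dtype=bool)
    mask[40:48, 20:44] = True                      # pretend this rectangle is the lip region
    redder = shift_lip_redness(face, mask, delta_a=8.0)
    print(redder.shape, redder.dtype)
    ```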

  14. Considering sex differences clarifies the effects of depression on facial emotion processing during fMRI.

    PubMed

    Jenkins, L M; Kendall, A D; Kassel, M T; Patrón, V G; Gowins, J R; Dion, C; Shankman, S A; Weisenbach, S L; Maki, P; Langenecker, S A

    2018-01-01

    Sex differences in emotion processing may play a role in women's increased risk for Major Depressive Disorder (MDD). However, studies of sex differences in brain mechanisms involved in emotion processing in MDD (or interactions of sex and diagnosis) are sparse. We conducted an event-related fMRI study examining the interactive and distinct effects of sex and MDD on neural activity during a facial emotion perception task. To minimize effects of current affective state and cumulative disease burden, we studied participants with remitted MDD (rMDD) who were early in the course of the illness. In total, 88 individuals aged 18-23 participated, including 48 with rMDD (32 female) and 40 healthy controls (HC; 25 female). fMRI revealed an interaction between sex and diagnosis for sad and neutral facial expressions in the superior frontal gyrus and left middle temporal gyrus. Results also revealed an interaction of sex with diagnosis in the amygdala. Data were from two sites, which might increase variability but also increases power to examine sex-by-diagnosis interactions. This study demonstrates the importance of taking sex differences into account when examining potential trait (or scar) mechanisms that could be useful in identifying individuals at-risk for MDD as well as for evaluating potential therapeutic innovations. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. BMI and WHR Are Reflected in Female Facial Shape and Texture: A Geometric Morphometric Image Analysis.

    PubMed

    Mayer, Christine; Windhager, Sonja; Schaefer, Katrin; Mitteroecker, Philipp

    2017-01-01

    Facial markers of body composition are frequently studied in evolutionary psychology and are important in computational and forensic face recognition. We assessed the association of body mass index (BMI) and waist-to-hip ratio (WHR) with facial shape and texture (color pattern) in a sample of young Middle European women by a combination of geometric morphometrics and image analysis. Faces of women with high BMI had a wider and rounder facial outline relative to the size of the eyes and lips, and relatively lower eyebrows. Furthermore, women with high BMI had a brighter and more reddish skin color than women with lower BMI. The same facial features were associated with WHR, even though BMI and WHR were only moderately correlated. Yet BMI was better predictable than WHR from facial attributes. After leave-one-out cross-validation, we were able to predict 25% of variation in BMI and 10% of variation in WHR by facial shape. Facial texture predicted only about 3-10% of variation in BMI and WHR. This indicates that facial shape primarily reflects total fat proportion, rather than the distribution of fat within the body. The association of reddish facial texture in high-BMI women may be mediated by increased blood pressure and superficial blood flow as well as diet. Our study elucidates how geometric morphometric image analysis serves to quantify the effect of biological factors such as BMI and WHR to facial shape and color, which in turn contributes to social perception.
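
    The prediction figures quoted in this record (about 25% of BMI variance explained by facial shape after leave-one-out cross-validation) come from regressing body measures on shape variables and scoring each prediction on the left-out case. The sketch below shows that general scheme with scikit-learn, assuming a hypothetical matrix of shape coordinates or their principal components; it is not the geometric-morphometric pipeline the authors used.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    # Hypothetical data: shape_features stands in for Procrustes coordinates or their
    # principal components; bmi holds the body mass index of each photographed woman.
    rng = np.random.default_rng(3)
    shape_features = rng.standard_normal((100, 10))
    bmi = 22 + 2 * shape_features[:, 0] + rng.standard_normal(100)

    # Leave-one-out: each face's BMI is predicted by a model fit on all other faces.
    pred = cross_val_predict(LinearRegression(), shape_features, bmi, cv=LeaveOneOut())

    # Cross-validated R^2, i.e., the proportion of BMI variance explained out of sample.
    ss_res = np.sum((bmi - pred) ** 2)
    ss_tot = np.sum((bmi - bmi.mean()) ** 2)
    print("cross-validated R^2:", 1 - ss_res / ss_tot)
    ```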

  16. Neural mechanisms underlying the effects of face-based affective signals on memory for faces: a tentative model

    PubMed Central

    Tsukiura, Takashi

    2012-01-01

    In our daily lives, we form some impressions of other people. Although those impressions are affected by many factors, face-based affective signals such as facial expression, facial attractiveness, or trustworthiness are important. Previous psychological studies have demonstrated the impact of facial impressions on remembering other people, but little is known about the neural mechanisms underlying this psychological process. The purpose of this article is to review recent functional MRI (fMRI) studies to investigate the effects of face-based affective signals including facial expression, facial attractiveness, and trustworthiness on memory for faces, and to propose a tentative concept for understanding this affective-cognitive interaction. On the basis of the aforementioned research, three brain regions are potentially involved in the processing of face-based affective signals. The first candidate is the amygdala, where activity is generally modulated by both affectively positive and negative signals from faces. Activity in the orbitofrontal cortex (OFC), as the second candidate, increases as a function of perceived positive signals from faces; whereas activity in the insular cortex, as the third candidate, reflects a function of face-based negative signals. In addition, neuroscientific studies have reported that the three regions are functionally connected to the memory-related hippocampal regions. These findings suggest that the effects of face-based affective signals on memory for faces could be modulated by interactions between the regions associated with the processing of face-based affective signals and the hippocampus as a memory-related region. PMID:22837740

  17. How to Avoid Facial Nerve Injury in Mastoidectomy?

    PubMed Central

    Ryu, Nam-Gyu

    2016-01-01

    Unexpected iatrogenic facial nerve paralysis not only causes facial disfigurement but also has a devastating effect on the social, psychological, and economic aspects of an affected person's life. The aims of this study were to identify where surgeons had mistakenly drilled, or where the nerve had been obscured by granulations or fibrous bands, and to describe a surgical approach focused on the safety of the facial nerve in mastoid surgery. We found 14 cases of iatrogenic facial nerve injury (IFNI) during mastoid surgery over 5 years in Korea. The medical records of all the patients were obtained, and the injured facial nerve segment was analyzed together with the surgical technique of mastoidectomy. Eleven patients underwent facial nerve exploration and three patients had conservative management. 43% (6 cases) of iatrogenic facial nerve injuries occurred in the tympanic segment, 28.5% (4 cases) in the second genu combined with the tympanic segment, and 28.5% (4 cases) in the mastoid segment. Surgeons should try to identify the facial nerve using available landmarks and keep in mind the anomalies of the facial nerve. With the use of intraoperative facial nerve monitoring, IFNI could be avoided in more cases. Many authors have emphasized the importance of intraoperative facial nerve monitoring, even in primary otologic surgery. However, the importance of anatomical understanding of intratemporal landmarks, combined with meticulous dissection, cannot be overemphasized in preventing IFNI. PMID:27626078

  18. Human and animal sounds influence recognition of body language.

    PubMed

    Van den Stock, Jan; Grèzes, Julie; de Gelder, Beatrice

    2008-11-25

    In naturalistic settings emotional events have multiple correlates and are simultaneously perceived by several sensory systems. Recent studies have shown that recognition of facial expressions is biased towards the emotion expressed by a simultaneously presented emotional expression in the voice even if attention is directed to the face only. So far, no study examined whether this phenomenon also applies to whole body expressions, although there is no obvious reason why this crossmodal influence would be specific for faces. Here we investigated whether perception of emotions expressed in whole body movements is influenced by affective information provided by human and by animal vocalizations. Participants were instructed to attend to the action displayed by the body and to categorize the expressed emotion. The results indicate that recognition of body language is biased towards the emotion expressed by the simultaneously presented auditory information, whether it consists of human or animal sounds. Our results show that a crossmodal influence from auditory to visual emotional information obtains for whole body video images with the facial expression blanked and includes human as well as animal sounds.

  19. Cultural similarities and differences in the perception of emotional valence and intensity: a comparison of Americans and Hong Kong Chinese.

    PubMed

    Zhu, Zhuoying; Ho, Samuel M Y; Bonanno, George A

    2013-01-01

    Despite being challenged for their ecological validity, studies of emotion perception have often relied on static, posed expressions. One of the key reasons is that dynamic, spontaneous expressions are difficult to control because of the existence of display rules and frequent co-occurrence of non-emotion related facial movements. The present study investigated cross-cultural patterns in the perception of emotion using an expressive regulation paradigm for generating facial expressions. The paradigm largely balances out the competing concerns for ecological and internal validity. Americans and Hong Kong Chinese (expressors) were presented with positively and negatively valenced pictures and were asked to enhance, suppress, or naturally display their facial expressions according to their subjective emotions. Videos of naturalistic and dynamic expressions of emotions were rated by Americans and Hong Kong Chinese (judges) for valence and intensity. The 2 cultures agreed on the valence and relative intensity of emotion expressions, but cultural differences were observed in absolute intensity ratings. The differences varied between positive and negative expressions. With positive expressions, ratings were higher when there was a cultural match between the expressor and the judge and when the expression was enhanced by the expressor. With negative expressions, Chinese judges gave higher ratings than their American counterparts for Chinese expressions under all 3 expressive conditions, and the discrepancy increased with expression intensity; no cultural differences were observed when American expressions were judged. The results were discussed with respect to the "decoding rules" and "same-culture advantage" approaches of emotion perception and a negativity bias in the Chinese collective culture.

  20. That "poker face" just might lose you the game! The impact of expressive suppression and mimicry on sensitivity to facial expressions of emotion.

    PubMed

    Schneider, Kristin G; Hempel, Roelie J; Lynch, Thomas R

    2013-10-01

    Successful interpersonal functioning often requires both the ability to mask inner feelings and the ability to accurately recognize others' expressions--but what if effortful control of emotional expressions impacts the ability to accurately read others? In this study, we examined the influence of self-controlled expressive suppression and mimicry on facial affect sensitivity--the speed with which one can accurately identify gradually intensifying facial expressions of emotion. Muscle activity of the brow (corrugator, related to anger), upper lip (levator, related to disgust), and cheek (zygomaticus, related to happiness) were recorded using facial electromyography while participants randomized to one of three conditions (Suppress, Mimic, and No-Instruction) viewed a series of six distinct emotional expressions (happiness, sadness, fear, anger, surprise, and disgust) as they morphed from neutral to full expression. As hypothesized, individuals instructed to suppress their own facial expressions showed impairment in facial affect sensitivity. Conversely, mimicry of emotion expressions appeared to facilitate facial affect sensitivity. Results suggest that it is difficult for a person to be able to simultaneously mask inner feelings and accurately "read" the facial expressions of others, at least when these expressions are at low intensity. The combined behavioral and physiological data suggest that the strategies an individual selects to control his or her own expression of emotion have important implications for interpersonal functioning.

  1. Modality-specific alterations in the perception of emotional stimuli in Bipolar Disorder compared to Healthy Controls and Major Depressive Disorder

    PubMed Central

    Vederman, Aaron C.; Weisenbach, Sara L.; Rapport, Lisa J.; Leon, Hadia M.; Haase, Brennan D.; Franti, Lindsay M.; Schallmo, Michael-Paul; Saunders, Erika F.H.; Kamali, Masoud M.; Zubieta, Jon-Kar; Langenecker, Scott A.; McInnis, Melvin G.

    2013-01-01

    Objectives Affect identification accuracy paradigms have increasingly been utilized to understand psychiatric illness including Bipolar Disorder (BD) and Major Depressive Disorder (MDD). This investigation focused on perceptual accuracy in affect identification in both visual and auditory domains among patients with BD, relative to Healthy Controls (HC) and patients with MDD. Demographic and clinical variables, in addition to medications were also investigated. Methods The visual Facial Emotion Perception Test (FEPT) and auditory Emotional Perception Test (EPT) were administered to adults with BD (n = 119) and MDD (n = 78) as well as HC (n = 66). Results Performance on the FEPT was significantly stronger than on the EPT irrespective of group. Performance on the EPT did not significantly differentiate the groups. On the FEPT, BD samples had the greatest difficulty relative to HC in identification of sad and fearful faces. BD participants also had greater difficulty identifying sad faces relative to MDD participants though not after controlling for severity of illness factors. For the BD (but not MDD) sample several clinical variables were also correlated with FEPT performance. Conclusions The findings suggest that disruptions in identification of negative emotions such as sadness and fear may be a characteristic trait of BD. However, this effect may be moderated by greater illness severity found in our BD sample. PMID:21683948

  2. Amblyopia Associated with Congenital Facial Nerve Paralysis.

    PubMed

    Iwamura, Hitoshi; Kondo, Kenji; Sawamura, Hiromasa; Baba, Shintaro; Yasuhara, Kazuo; Yamasoba, Tatsuya

    2016-01-01

    The association between congenital facial paralysis and visual development has not been thoroughly studied. Of 27 pediatric cases of congenital facial paralysis, we identified 3 patients who developed amblyopia, a visual acuity decrease caused by abnormal visual development, as comorbidity. These 3 patients had facial paralysis in the periocular region and developed amblyopia on the paralyzed side. They started treatment by wearing an eye patch immediately after diagnosis and before the critical visual developmental period; all patients responded to the treatment. Our findings suggest that the incidence of amblyopia in the cases of congenital facial paralysis, particularly the paralysis in the periocular region, is higher than that in the general pediatric population. Interestingly, 2 of the 3 patients developed anisometropic amblyopia due to the hyperopia of the affected eye, implying that the periocular facial paralysis may have affected the refraction of the eye through yet unspecified mechanisms. Therefore, the physicians who manage facial paralysis should keep this pathology in mind, and when they see pediatric patients with congenital facial paralysis involving the periocular region, they should consult an ophthalmologist as soon as possible. © 2016 S. Karger AG, Basel.

  3. Differences between Caucasian and Asian attractive faces.

    PubMed

    Rhee, S C

    2018-02-01

    There are discrepancies between the public's current beauty desires and conventional theories and historical rules regarding facial beauty. This photogrammetric study aims to describe in detail mathematical differences in facial configuration between attractive Caucasian and attractive Asian faces. To analyse the structural differences between attractive Caucasian and attractive Asian faces, frontal face and lateral face views for each race were morphed; facial landmarks were defined, and the relative photographic pixel distances and angles were measured. Absolute values were acquired by arithmetic conversion for comparison. The data indicate that some conventional beliefs of facial attractiveness can be applied but others are no longer valid in explaining perspectives of beauty between Caucasians and Asians. Racial differences in the perceptions of attractive faces were evident. Common features, reflecting a global fusion of perspectives on facial beauty, were also revealed. Beauty standards differ with race and ethnicity, and some conventional rules for ideal facial attractiveness were found to be inappropriate. We must reexamine old principles of facial beauty and continue to fundamentally question them according to their racial, cultural, and neuropsychological aspects. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. Bidirectional Gender Face Aftereffects: Evidence Against Normative Facial Coding.

    PubMed

    Cronin, Sophie L; Spence, Morgan L; Miller, Paul A; Arnold, Derek H

    2017-02-01

    Facial appearance can be altered, not just by restyling but also by sensory processes. Exposure to a female face can, for instance, make subsequent faces look more masculine than they would otherwise. Two explanations exist. According to one, exposure to a female face renormalizes face perception, making that female and all other faces look more masculine as a consequence-a unidirectional effect. According to that explanation, exposure to a male face would have the opposite unidirectional effect. Another suggestion is that face gender is subject to contrastive aftereffects. These should make some faces look more masculine than the adaptor and other faces more feminine-a bidirectional effect. Here, we show that face gender aftereffects are bidirectional, as predicted by the latter hypothesis. Images of real faces rated as more and less masculine than adaptors at baseline tended to look even more and less masculine than adaptors post adaptation. This suggests that, rather than mental representations of all faces being recalibrated to better reflect the prevailing statistics of the environment, mental operations exaggerate differences between successive faces, and this can impact facial gender perception.

  5. Top-Down and Bottom-Up Visual Information Processing of Non-Social Stimuli in High-Functioning Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Maekawa, Toshihiko; Tobimatsu, Shozo; Inada, Naoko; Oribe, Naoya; Onitsuka, Toshiaki; Kanba, Shigenobu; Kamio, Yoko

    2011-01-01

    Individuals with high-functioning autism spectrum disorder (HF-ASD) often show superior performance in simple visual tasks, despite difficulties in the perception of socially important information such as facial expression. The neural basis of visual perception abnormalities associated with HF-ASD is currently unclear. We sought to elucidate the…

  6. Infants' Perception of Emotion from Body Movements

    ERIC Educational Resources Information Center

    Zieber, Nicole; Kangas, Ashley; Hock, Alyson; Bhatt, Ramesh S.

    2014-01-01

    Adults recognize emotions conveyed by bodies with comparable accuracy to facial emotions. However, no prior study has explored infants' perception of body emotions. In Experiment 1, 6.5-month-olds (n = 32) preferred happy over neutral actions of actors with covered faces in upright but not inverted silent videos. In Experiment 2, infants…

  7. The facial nerve: anatomy and associated disorders for oral health professionals.

    PubMed

    Takezawa, Kojiro; Townsend, Grant; Ghabriel, Mounir

    2018-04-01

    The facial nerve, the seventh cranial nerve, is of great clinical significance to oral health professionals. Most published literature either addresses the central connections of the nerve or its peripheral distribution but few integrate both of these components and also highlight the main disorders affecting the nerve that have clinical implications in dentistry. The aim of the current study is to provide a comprehensive description of the facial nerve. Multiple aspects of the facial nerve are discussed and integrated, including its neuroanatomy, functional anatomy, gross anatomy, clinical problems that may involve the nerve, and the use of detailed anatomical knowledge in the diagnosis of the site of facial nerve lesion in clinical neurology. Examples are provided of disorders that can affect the facial nerve during its intra-cranial, intra-temporal and extra-cranial pathways, and key aspects of clinical management are discussed. The current study is complemented by original detailed dissections and sketches that highlight key anatomical features and emphasise the extent and nature of anatomical variations displayed by the facial nerve.

  8. Is moral beauty different from facial beauty? Evidence from an fMRI study.

    PubMed

    Wang, Tingting; Mo, Lei; Mo, Ce; Tan, Li Hai; Cant, Jonathan S; Zhong, Luojin; Cupchik, Gerald

    2015-06-01

    Is moral beauty different from facial beauty? Two functional magnetic resonance imaging experiments were performed to answer this question. Experiment 1 investigated the network of moral aesthetic judgments and facial aesthetic judgments. Participants performed aesthetic judgments and gender judgments on both faces and scenes containing moral acts. The conjunction analysis of the contrasts 'facial aesthetic judgment > facial gender judgment' and 'scene moral aesthetic judgment > scene gender judgment' identified the common involvement of the orbitofrontal cortex (OFC), inferior temporal gyrus and medial superior frontal gyrus, suggesting that both types of aesthetic judgments are based on the orchestration of perceptual, emotional and cognitive components. Experiment 2 examined the network of facial beauty and moral beauty during implicit perception. Participants performed a non-aesthetic judgment task on both faces (beautiful vs common) and scenes (containing morally beautiful vs neutral information). We observed that facial beauty (beautiful faces > common faces) involved both the cortical reward region OFC and the subcortical reward region putamen, whereas moral beauty (moral beauty scenes > moral neutral scenes) only involved the OFC. Moreover, compared with facial beauty, moral beauty spanned a larger-scale cortical network, indicating more advanced and complex cerebral representations characterizing moral beauty. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  9. Orthodontists' and laypeople's perception of smile height aesthetics in relation to varying degrees of transverse cant of anterior teeth.

    PubMed

    Shiyan, Huang; Xu, Qian; Shuhao, Xu; Nanquan, Rao; Xiaobing, Li

    2016-05-01

    To determine the effect of varying the transverse cant of the anterior teeth on orthodontists' and laypeople's perceptions of smile aesthetics, and the influence that smile height has on this perception. A 20-year-old Chinese female with an aesthetic smile and normal occlusion was chosen and agreed to participate. Digital pictures of her posed smile were taken and manipulated to create three smile height variations: low, medium, or high. Each variation was further manipulated to create varying degrees of transverse anterior tooth cant. Fifty-six laypeople and 40 orthodontists participated as raters of the dental and facial impact of the altered smile images. The orthodontists more commonly and precisely identified the transverse cants of the anterior teeth and the detracting influence on smile aesthetics compared with laypersons. The orthodontists accepted a lesser range of anterior transverse cant. Increased smile heights enhanced the capability of all raters to detect a transverse cant and reduced the acceptable cant range. In addition, an increased smile height worsened the detracting effects of the transverse anterior cant in all raters' perceptions of smile aesthetics. An increased display of teeth and angulation of an anterior cant increased the ability of raters in both groups to detect differences. Transverse cants of anterior teeth can affect orthodontists' and laypeople's perceptions of smile aesthetics. Smile height and incisor display were significant factors that affected the orthodontist's and layperson's perceptions of smile aesthetics, and suggested that a description of the detracting effect of an anterior transverse cant should also consider smile height. A transverse occlusal cant is an important aesthetic factor used by clinicians during orthodontic diagnosis and review. It is important to appreciate that there is a difference in perception between orthodontic professionals and patients (laypeople). The extent of this perceptual difference and influencing factors could help the clinician set more appropriate treatment goals.

  10. Blinded by Beauty: Attractiveness Bias and Accurate Perceptions of Academic Performance

    PubMed Central

    Talamas, Sean N.; Mavor, Kenneth I.; Perrett, David I.

    2016-01-01

    Despite the old adage not to ‘judge a book by its cover’, facial cues often guide first impressions and these first impressions guide our decisions. Literature suggests there are valid facial cues that assist us in assessing someone’s health or intelligence, but such cues are overshadowed by an ‘attractiveness halo’ whereby desirable attributions are preferentially ascribed to attractive people. The impact of the attractiveness halo effect on perceptions of academic performance in the classroom is concerning as this has shown to influence students’ future performance. We investigated the limiting effects of the attractiveness halo on perceptions of actual academic performance in faces of 100 university students. Given the ambiguity and various perspectives on the definition of intelligence and the growing consensus on the importance of conscientiousness over intelligence in predicting actual academic performance, we also investigated whether perceived conscientiousness was a more accurate predictor of academic performance than perceived intelligence. Perceived conscientiousness was found to be a better predictor of actual academic performance when compared to perceived intelligence and perceived academic performance, and accuracy was improved when controlling for the influence of attractiveness on judgments. These findings emphasize the misleading effect of attractiveness on the accuracy of first impressions of competence, which can have serious consequences in areas such as education and hiring. The findings also have implications for future research investigating impression accuracy based on facial stimuli. PMID:26885976

  11. Blinded by Beauty: Attractiveness Bias and Accurate Perceptions of Academic Performance.

    PubMed

    Talamas, Sean N; Mavor, Kenneth I; Perrett, David I

    2016-01-01

    Despite the old adage not to 'judge a book by its cover', facial cues often guide first impressions and these first impressions guide our decisions. Literature suggests there are valid facial cues that assist us in assessing someone's health or intelligence, but such cues are overshadowed by an 'attractiveness halo' whereby desirable attributions are preferentially ascribed to attractive people. The impact of the attractiveness halo effect on perceptions of academic performance in the classroom is concerning, as this has been shown to influence students' future performance. We investigated the limiting effects of the attractiveness halo on perceptions of actual academic performance in faces of 100 university students. Given the ambiguity and various perspectives on the definition of intelligence and the growing consensus on the importance of conscientiousness over intelligence in predicting actual academic performance, we also investigated whether perceived conscientiousness was a more accurate predictor of academic performance than perceived intelligence. Perceived conscientiousness was found to be a better predictor of actual academic performance when compared to perceived intelligence and perceived academic performance, and accuracy was improved when controlling for the influence of attractiveness on judgments. These findings emphasize the misleading effect of attractiveness on the accuracy of first impressions of competence, which can have serious consequences in areas such as education and hiring. The findings also have implications for future research investigating impression accuracy based on facial stimuli.
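
    The step of "controlling for the influence of attractiveness on judgments" can be illustrated as a partial-correlation computation: residualize both perceived conscientiousness and actual performance on rated attractiveness, then correlate the residuals. The sketch below is a minimal illustration under assumed variable names and simulated data, not the authors' exact analysis.

      # Hypothetical sketch: partial out attractiveness before relating perceived
      # conscientiousness to actual academic performance (all data simulated).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n = 100  # e.g., one rating per face photo

      attractiveness = rng.normal(size=n)
      conscientious_perc = 0.5 * attractiveness + rng.normal(scale=0.8, size=n)
      academic_perf = 0.4 * conscientious_perc - 0.3 * attractiveness + rng.normal(size=n)

      def residualize(y, x):
          """Residuals of y after removing a linear effect of x."""
          slope, intercept, *_ = stats.linregress(x, y)
          return y - (intercept + slope * x)

      # Zero-order correlation (attractiveness halo still present).
      r_raw, p_raw = stats.pearsonr(conscientious_perc, academic_perf)

      # Partial correlation: both variables residualized on attractiveness.
      r_partial, p_partial = stats.pearsonr(
          residualize(conscientious_perc, attractiveness),
          residualize(academic_perf, attractiveness),
      )

      print(f"zero-order r = {r_raw:.2f} (p = {p_raw:.3f})")
      print(f"partial r controlling attractiveness = {r_partial:.2f} (p = {p_partial:.3f})")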

  12. Bell's palsy and partial hypoglossal to facial nerve transfer: Case presentation and literature review

    PubMed Central

    Socolovsky, Mariano; Páez, Miguel Domínguez; Masi, Gilda Di; Molina, Gonzalo; Fernández, Eduardo

    2012-01-01

    Background: Idiopathic facial nerve palsy (Bell's palsy) is a very common condition that affects the active population. Despite its generally benign course, a minority of patients are left with permanent and severe sequelae, including facial palsy or dyskinesia. Hypoglossal to facial nerve anastomosis is rarely used to reinnervate the mimic muscles in these patients. In this paper, we present a case in which a direct partial hypoglossal to facial nerve transfer was used to reinnervate the upper and lower face, and we discuss the indications for this procedure. Case Description: A 53-year-old woman presenting with a spontaneous complete (House and Brackmann grade 6) facial palsy on her left side showed no improvement after 13 months of conservative treatment. Electromyography (EMG) showed complete denervation of the mimic muscles. A direct partial hypoglossal to facial nerve anastomosis was performed, including dissection of the facial nerve at the fallopian canal. One year after the procedure, the patient showed House and Brackmann grade 3 function on the affected side of her face. Conclusions: Partial hypoglossal–facial anastomosis with intratemporal drilling of the facial nerve is a viable technique in the rare cases in which severe Bell's palsy does not recover spontaneously. Only carefully selected patients can really benefit from this technique. PMID:22574255

  13. Wanting it Too Much: An Inverse Relation Between Social Motivation and Facial Emotion Recognition in Autism Spectrum Disorder

    PubMed Central

    Garman, Heather D.; Spaulding, Christine J.; Webb, Sara Jane; Mikami, Amori Yee; Morris, James P.

    2016-01-01

    This study examined social motivation and early-stage face perception as frameworks for understanding impairments in facial emotion recognition (FER) in a well-characterized sample of youth with autism spectrum disorders (ASD). Early-stage face perception (N170 event-related potential latency) was recorded while participants completed a standardized FER task, and social motivation was obtained via parent report. Participants with greater social motivation exhibited poorer FER, while those with shorter N170 latencies exhibited better FER for child angry face stimuli. Social motivation partially mediated the relationship between a faster N170 and better FER. These effects were all robust to variations in IQ, age, and ASD severity. These findings argue against theories implicating social motivation as uniformly valuable for individuals with ASD, and augment models suggesting a close link between early-stage face perception, social motivation, and FER in this population. Broader implications for models and development of FER in ASD are discussed. PMID:26743637

  14. Wanting it Too Much: An Inverse Relation Between Social Motivation and Facial Emotion Recognition in Autism Spectrum Disorder.

    PubMed

    Garman, Heather D; Spaulding, Christine J; Webb, Sara Jane; Mikami, Amori Yee; Morris, James P; Lerner, Matthew D

    2016-12-01

    This study examined social motivation and early-stage face perception as frameworks for understanding impairments in facial emotion recognition (FER) in a well-characterized sample of youth with autism spectrum disorders (ASD). Early-stage face perception (N170 event-related potential latency) was recorded while participants completed a standardized FER task, and social motivation was obtained via parent report. Participants with greater social motivation exhibited poorer FER, while those with shorter N170 latencies exhibited better FER for child angry face stimuli. Social motivation partially mediated the relationship between a faster N170 and better FER. These effects were all robust to variations in IQ, age, and ASD severity. These findings argue against theories implicating social motivation as uniformly valuable for individuals with ASD, and augment models suggesting a close link between early-stage face perception, social motivation, and FER in this population. Broader implications for models and development of FER in ASD are discussed.
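
    The reported partial mediation (faster N170 → social motivation → better FER) follows the standard product-of-coefficients logic. The sketch below illustrates that logic with ordinary least squares in statsmodels on simulated data; the variable names and scales are assumptions rather than the study's actual pipeline, and a real analysis would bootstrap the indirect effect.

      # Hypothetical mediation sketch (N170 latency -> social motivation -> FER accuracy).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 100
      n170_latency = rng.normal(170, 15, size=n)    # ms (assumed scale)
      social_motivation = -0.02 * n170_latency + rng.normal(scale=0.5, size=n)
      fer_accuracy = -0.01 * n170_latency + 0.3 * social_motivation + rng.normal(scale=0.5, size=n)

      def ols(y, X):
          """Fit an OLS model with an intercept."""
          return sm.OLS(y, sm.add_constant(X)).fit()

      a = ols(social_motivation, n170_latency).params[1]             # path a: X -> M
      model_b = ols(fer_accuracy, np.column_stack([n170_latency, social_motivation]))
      c_prime, b = model_b.params[1], model_b.params[2]              # paths c' and b
      c = ols(fer_accuracy, n170_latency).params[1]                  # total effect c

      print(f"total c = {c:.3f}, direct c' = {c_prime:.3f}, indirect a*b = {a * b:.3f}")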

  15. Perceived intelligence is associated with measured intelligence in men but not women.

    PubMed

    Kleisner, Karel; Chvátalová, Veronika; Flegr, Jaroslav

    2014-01-01

    The ability to accurately assess the intelligence of other persons finds its place in everyday social interaction and should have important evolutionary consequences. We used static facial photographs of 40 men and 40 women to test the relationship between measured IQ, perceived intelligence, and facial shape. Both men and women were able to accurately evaluate the intelligence of men by viewing facial photographs. In addition to general intelligence, figural and fluid intelligence showed a significant relationship with perceived intelligence, but again, only in men. No relationship between perceived intelligence and IQ was found for women. We used geometric morphometrics to determine which facial traits are associated with the perception of intelligence, as well as with intelligence as measured by IQ testing. Faces that are perceived as highly intelligent are rather prolonged, with a broader distance between the eyes, a larger nose, a slight upturn to the corners of the mouth, and a sharper, more pointed, less rounded chin. By contrast, the perception of lower intelligence is associated with broader, more rounded faces with eyes closer to each other, a shorter nose, declining corners of the mouth, and a rounded and massive chin. However, we found no correlation between morphological traits and real intelligence measured with an IQ test, in either men or women. These results suggest that a perceiver can accurately gauge the real intelligence of men, but not women, by viewing their faces in photographs; however, this estimation is possibly not based on facial shape. Our study revealed no relation between intelligence and either attractiveness or face shape.

  16. Perceived Intelligence Is Associated with Measured Intelligence in Men but Not Women

    PubMed Central

    Kleisner, Karel; Chvátalová, Veronika; Flegr, Jaroslav

    2014-01-01

    Background The ability to accurately assess the intelligence of other persons finds its place in everyday social interaction and should have important evolutionary consequences. Methodology/Principal Findings We used static facial photographs of 40 men and 40 women to test the relationship between measured IQ, perceived intelligence, and facial shape. Both men and women were able to accurately evaluate the intelligence of men by viewing facial photographs. In addition to general intelligence, figural and fluid intelligence showed a significant relationship with perceived intelligence, but again, only in men. No relationship between perceived intelligence and IQ was found for women. We used geometric morphometrics to determine which facial traits are associated with the perception of intelligence, as well as with intelligence as measured by IQ testing. Faces that are perceived as highly intelligent are rather prolonged, with a broader distance between the eyes, a larger nose, a slight upturn to the corners of the mouth, and a sharper, more pointed, less rounded chin. By contrast, the perception of lower intelligence is associated with broader, more rounded faces with eyes closer to each other, a shorter nose, declining corners of the mouth, and a rounded and massive chin. However, we found no correlation between morphological traits and real intelligence measured with an IQ test, in either men or women. Conclusions These results suggest that a perceiver can accurately gauge the real intelligence of men, but not women, by viewing their faces in photographs; however, this estimation is possibly not based on facial shape. Our study revealed no relation between intelligence and either attractiveness or face shape. PMID:24651120
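
    The geometric-morphometric approach described above typically begins with Procrustes superimposition of landmark configurations before relating shape to ratings. Below is a minimal sketch of that idea; the landmark count, the hypothetical landmark indices, and the simulated ratings are assumptions for illustration only, and a full analysis would use generalized Procrustes analysis and multivariate regression on the shape coordinates.

      # Hypothetical sketch: Procrustes-align 2D facial landmarks, then relate a
      # simple shape descriptor to perceived-intelligence ratings (all simulated).
      import numpy as np
      from scipy.spatial import procrustes
      from scipy import stats

      rng = np.random.default_rng(2)
      n_faces, n_landmarks = 40, 16
      faces = rng.normal(size=(n_faces, n_landmarks, 2))
      perceived_iq = rng.normal(100, 15, size=n_faces)

      # Align every configuration to the first face (a full GPA would iterate to a mean shape).
      reference = faces[0]
      aligned = np.empty_like(faces)
      for i, face in enumerate(faces):
          _, aligned[i], _ = procrustes(reference, face)

      # Example descriptor: inter-ocular distance relative to face height, assuming
      # landmarks 0/1 are eye corners and 2/3 are forehead/chin (hypothetical indices).
      eye_dist = np.linalg.norm(aligned[:, 0] - aligned[:, 1], axis=1)
      face_height = np.linalg.norm(aligned[:, 2] - aligned[:, 3], axis=1)
      ratio = eye_dist / face_height

      r, p = stats.pearsonr(ratio, perceived_iq)
      print(f"eye-spacing ratio vs perceived intelligence: r = {r:.2f}, p = {p:.3f}")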

  17. Botulinum toxin treatment for facial palsy: A systematic review.

    PubMed

    Cooper, Lilli; Lui, Michael; Nduka, Charles

    2017-06-01

    Facial palsy may be complicated by ipsilateral synkinesis or contralateral hyperkinesis. Botulinum toxin is increasingly used in the management of facial palsy; however, the optimum dose, treatment interval, adjunct therapy, and performance compared with alternative treatments have not been well established. This study aimed to systematically review the evidence for the use of botulinum toxin in facial palsy. The Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE(R) (1946 to September 2015) and Embase Classic + Embase (1947 to September 2015) were searched for randomised studies using botulinum toxin in facial palsy. Forty-seven studies were identified, and three were included. Their physical and patient-reported outcomes are described, and observations and cautions are discussed. Facial asymmetry correlates strongly with subjective domains such as impairment in social interaction and perception of self-image and appearance. Botulinum toxin injections represent a minimally invasive technique that is helpful in restoring facial symmetry at rest and during movement in chronic, and potentially acute, facial palsy. Botulinum toxin in combination with physical therapy may be particularly helpful. Currently, there is a paucity of data; areas for further research are suggested. A strong body of evidence may allow botulinum toxin treatment to be nationally standardised and recommended in the management of facial palsy. Copyright © 2017 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  18. Are Happy Faces Attractive? The Roles of Early vs. Late Processing

    PubMed Central

    Sun, Delin; Chan, Chetwyn C. H.; Fan, Jintu; Wu, Yi; Lee, Tatia M. C.

    2015-01-01

    Facial attractiveness is closely related to romantic love. To understand if the neural underpinnings of perceived facial attractiveness and facial expression are similar constructs, we recorded neural signals using an event-related potential (ERP) methodology for 20 participants who were viewing faces with varied attractiveness and expressions. We found that attractiveness and expression were reflected by two early components, P2-lateral (P2l) and P2-medial (P2m), respectively; their interaction effect was reflected by LPP, a late component. The findings suggested that facial attractiveness and expression are first processed in parallel for discrimination between stimuli. After the initial processing, more attentional resources are allocated to the faces with the most positive or most negative valence in both the attractiveness and expression dimensions. The findings contribute to the theoretical model of face perception. PMID:26648885
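
    The ERP components mentioned above (P2 around 150–250 ms; LPP later) are conventionally quantified as mean amplitudes within a time window at selected electrodes. The snippet below is a generic illustration of that windowed-mean step on epoched data; the sampling rate, window bounds, and array layout are assumptions, not the study's parameters.

      # Hypothetical sketch: mean ERP amplitude in a component time window.
      import numpy as np

      fs = 500            # sampling rate in Hz (assumed)
      t_start = -0.2      # epoch starts 200 ms before face onset (assumed)
      n_trials, n_samples = 120, 500   # 1-s epochs from -0.2 to 0.8 s
      rng = np.random.default_rng(3)
      epochs = rng.normal(size=(n_trials, n_samples))   # one electrode, microvolts (simulated)

      def mean_amplitude(epochs, window, fs=fs, t_start=t_start):
          """Average amplitude across a (start, end) window in seconds, per trial."""
          i0 = int(round((window[0] - t_start) * fs))
          i1 = int(round((window[1] - t_start) * fs))
          return epochs[:, i0:i1].mean(axis=1)

      p2_amp = mean_amplitude(epochs, (0.150, 0.250))    # assumed P2 window
      lpp_amp = mean_amplitude(epochs, (0.400, 0.800))   # assumed LPP window
      print(f"P2 mean amplitude: {p2_amp.mean():.2f} µV, LPP: {lpp_amp.mean():.2f} µV")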

  19. Neural measures of the role of affective prosody in empathy for pain.

    PubMed

    Meconi, Federica; Doro, Mattia; Lomoriello, Arianna Schiano; Mastrella, Giulia; Sessa, Paola

    2018-01-10

    Emotional communication often requires the integration of affective prosodic and semantic components of speech with the speaker's facial expression. Affective prosody may have a special role by virtue of its dual nature: pre-verbal on one side and accompanying semantic content on the other. This consideration led us to hypothesize that it could act transversely, encompassing a wide temporal window involving the processing of facial expressions and of the semantic content expressed by the speaker. This would allow powerful communication in contexts of potential urgency, such as witnessing the speaker's physical pain. Seventeen participants were shown faces preceded by verbal reports of pain. Facial expressions, intelligibility of the semantic content of the report (i.e., participants' mother tongue vs. a fictional language) and the affective prosody of the report (neutral vs. painful) were manipulated. We monitored event-related potentials (ERPs) time-locked to the onset of the faces as a function of semantic content intelligibility and affective prosody of the verbal reports. We found that affective prosody may interact with facial expressions and semantic content in two successive temporal windows, supporting its role as a transverse communication cue.

  20. A new quantitative evaluation method for age-related changes of individual pigmented spots in facial skin.

    PubMed

    Kikuchi, K; Masuda, Y; Yamashita, T; Sato, K; Katagiri, C; Hirao, T; Mizokami, Y; Yaguchi, H

    2016-08-01

    Facial skin pigmentation is one of the most prominent visible features of skin aging and often affects perception of health and beauty. To date, facial pigmentation has been evaluated using various image analysis methods developed for the cosmetic and esthetic fields. However, existing methods cannot provide precise information on pigmented spots, such as variations in size, color shade, and distribution pattern. The purpose of this study is the development of image evaluation methods to analyze individual pigmented spots and acquire detailed information on their age-related changes. To characterize the individual pigmented spots within a cheek image, we established a simple object-counting algorithm. First, we captured cheek images using an original imaging system equipped with an illumination unit and a high-resolution digital camera. The acquired images were converted into melanin concentration images using compensation formulae. Next, the melanin images were converted into binary images. The binary images were then subjected to noise reduction. Finally, we calculated parameters such as the melanin concentration, quantity, and size of individual pigmented spots using a connected-components labeling algorithm, which assigns a unique label to each separate group of connected pixels. The cheek image analysis was evaluated on 643 female Japanese subjects. We confirmed that the proposed method was sufficiently sensitive to measure the melanin concentration, and the numbers and sizes of individual pigmented spots through manual evaluation of the cheek images. The image analysis results for the 643 Japanese women indicated clear relationships between age and the changes in the pigmented spots. We developed a new quantitative evaluation method for individual pigmented spots in facial skin. This method facilitates the analysis of the characteristics of various pigmented facial spots and is directly applicable to the fields of dermatology, pharmacology, and esthetic cosmetology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
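
    The counting step described above, namely binarizing the melanin image and labeling each connected group of pixels as one spot, can be sketched with scipy.ndimage. The toy image, the melanin threshold, and the opening structure below are illustrative assumptions, not the paper's calibrated parameters or compensation formulae.

      # Hypothetical sketch: count and measure pigmented spots via connected-components labeling.
      import numpy as np
      from scipy import ndimage

      # Toy "melanin concentration" image: smooth background plus a few bright blobs.
      rng = np.random.default_rng(4)
      melanin = rng.normal(0.2, 0.02, size=(256, 256))
      yy, xx = np.mgrid[0:256, 0:256]
      for cy, cx, r in [(60, 80, 6), (150, 40, 10), (200, 200, 4)]:   # assumed spot positions
          melanin += 0.4 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * r ** 2))

      # Binarize with an assumed melanin threshold, then clean up with a binary opening.
      binary = ndimage.binary_opening(melanin > 0.35, structure=np.ones((3, 3)))

      # Label each connected group of pixels as one pigmented spot.
      labels, n_spots = ndimage.label(binary)
      spot_ids = np.arange(1, n_spots + 1)
      areas = ndimage.sum(binary, labels, index=spot_ids)           # spot size in pixels
      mean_melanin = ndimage.mean(melanin, labels, index=spot_ids)  # per-spot concentration

      print(f"{n_spots} spots found; areas (px): {areas.astype(int)}")
      print(f"mean melanin per spot: {np.round(mean_melanin, 2)}")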

  1. Extracted facial feature of racial closely related faces

    NASA Astrophysics Data System (ADS)

    Liewchavalit, Chalothorn; Akiba, Masakazu; Kanno, Tsuneo; Nagao, Tomoharu

    2010-02-01

    Human faces contain a great deal of demographic information, such as identity, gender, age, race, and emotion. Human beings can perceive these pieces of information and use them as important clues in social interaction with other people. Race perception is considered one of the most delicate and sensitive parts of face perception. There is much research concerning image-based race recognition, but most of it focuses on major race groups such as Caucasoid, Negroid, and Mongoloid. This paper focuses on how people classify race within racially closely related groups. As a sample of a racially closely related group, we chose Japanese and Thai faces to represent the difference between Northern and Southern Mongoloid. Three psychological experiments were performed to study the strategies of face perception in race classification. The results suggest that race perception is an ability that can be learned. Eyes and eyebrows are the main points of attention, and the eyes are a significant factor in race perception. Principal Component Analysis (PCA) was performed to extract facial features of the sample race groups. Extracted race features of texture and shape were used to synthesize faces. The results suggest that racial features rely on detailed texture rather than on shape. This work is fundamental research on race perception, which is essential for the establishment of a human-like race recognition system.
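
    The PCA step described, extracting principal components from aligned face images and re-synthesizing faces from them, follows the classic eigenface construction. The sketch below illustrates it on random stand-in data; the image resolution and the number of retained components are assumptions.

      # Hypothetical eigenface-style sketch: PCA on vectorized face images and reconstruction.
      import numpy as np

      rng = np.random.default_rng(5)
      n_faces, h, w = 60, 64, 64                 # assumed sample size and image resolution
      faces = rng.random((n_faces, h * w))       # stand-in for aligned grayscale face images

      mean_face = faces.mean(axis=0)
      centered = faces - mean_face

      # Principal components via SVD of the centered data matrix.
      U, S, Vt = np.linalg.svd(centered, full_matrices=False)
      components = Vt[:20]                       # keep the first 20 feature axes (assumed)

      # Project one face onto the component space and synthesize it back.
      coeffs = components @ (faces[0] - mean_face)
      synthesized = mean_face + components.T @ coeffs
      print("reconstruction error:", np.linalg.norm(faces[0] - synthesized))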

  2. Reprint of "Investigating ensemble perception of emotions in autistic and typical children and adolescents".

    PubMed

    Karaminis, Themelis; Neil, Louise; Manning, Catherine; Turi, Marco; Fiorentini, Chiara; Burr, David; Pellicano, Elizabeth

    2018-01-01

    Ensemble perception, the ability to assess automatically the summary of large amounts of information presented in visual scenes, is available early in typical development. This ability might be compromised in autistic children, who are thought to present limitations in maintaining summary statistics representations for the recent history of sensory input. Here we examined ensemble perception of facial emotional expressions in 35 autistic children, 30 age- and ability-matched typical children and 25 typical adults. Participants received three tasks: a) an 'ensemble' emotion discrimination task; b) a baseline (single-face) emotion discrimination task; and c) a facial expression identification task. Children performed worse than adults on all three tasks. Unexpectedly, autistic and typical children were, on average, indistinguishable in their precision and accuracy on all three tasks. Computational modelling suggested that, on average, autistic and typical children used ensemble-encoding strategies to a similar extent; but ensemble perception was related to non-verbal reasoning abilities in autistic but not in typical children. Eye-movement data also showed no group differences in the way children attended to the stimuli. Our combined findings suggest that the abilities of autistic and typical children for ensemble perception of emotions are comparable on average. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.

  3. Ensemble perception of emotions in autistic and typical children and adolescents.

    PubMed

    Karaminis, Themelis; Neil, Louise; Manning, Catherine; Turi, Marco; Fiorentini, Chiara; Burr, David; Pellicano, Elizabeth

    2017-04-01

    Ensemble perception, the ability to assess automatically the summary of large amounts of information presented in visual scenes, is available early in typical development. This ability might be compromised in autistic children, who are thought to present limitations in maintaining summary statistics representations for the recent history of sensory input. Here we examined ensemble perception of facial emotional expressions in 35 autistic children, 30 age- and ability-matched typical children and 25 typical adults. Participants received three tasks: a) an 'ensemble' emotion discrimination task; b) a baseline (single-face) emotion discrimination task; and c) a facial expression identification task. Children performed worse than adults on all three tasks. Unexpectedly, autistic and typical children were, on average, indistinguishable in their precision and accuracy on all three tasks. Computational modelling suggested that, on average, autistic and typical children used ensemble-encoding strategies to a similar extent; but ensemble perception was related to non-verbal reasoning abilities in autistic but not in typical children. Eye-movement data also showed no group differences in the way children attended to the stimuli. Our combined findings suggest that the abilities of autistic and typical children for ensemble perception of emotions are comparable on average. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
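
    The ensemble-encoding idea examined here, judging the average expression of a set of faces rather than any single face, can be illustrated with a simple averaging-observer model: pooling noisy per-face estimates reduces the variability of the judgement roughly by the square root of the set size. The sketch below only illustrates that statistical point with assumed noise levels and set size; it is not the authors' computational model.

      # Hypothetical averaging-observer sketch for ensemble emotion perception.
      import numpy as np

      rng = np.random.default_rng(6)
      n_trials, set_size = 1000, 8
      true_mean_emotion = 0.3          # arbitrary units along a morph continuum (assumed)
      internal_noise = 0.5             # per-face encoding noise (assumed)

      # Each face in the ensemble is encoded with independent noise.
      per_face_estimates = true_mean_emotion + rng.normal(scale=internal_noise,
                                                          size=(n_trials, set_size))

      single_face_judgement = per_face_estimates[:, 0]        # use one face only
      ensemble_judgement = per_face_estimates.mean(axis=1)    # pool across the set

      print(f"single-face judgement SD: {single_face_judgement.std():.3f}")
      print(f"ensemble judgement SD:    {ensemble_judgement.std():.3f} "
            f"(about 1/sqrt({set_size}) of the single-face SD)")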

  4. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    PubMed Central

    Balconi, Michela; Ferrari, Chiara

    2012-01-01

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research; however, it remains unclear whether a specific function can be assigned to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at facial expressions of six emotions plus a neutral face (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate alpha power. Moreover, right frontal activity increased for negative emotions, whereas left frontal activity increased for positive emotions. The significance of facial expressions was adduced to elucidate these different cortical responses to emotion types. PMID:24962767

  5. Subliminal and supraliminal processing of facial expression of emotions: brain oscillation in the left/right frontal area.

    PubMed

    Balconi, Michela; Ferrari, Chiara

    2012-03-26

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research; however, it remains unclear whether a specific function can be assigned to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored over the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at facial expressions of six emotions plus a neutral face (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate alpha power. Moreover, right frontal activity increased for negative emotions, whereas left frontal activity increased for positive emotions. The significance of facial expressions was adduced to elucidate these different cortical responses to emotion types.
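
    Alpha-band power (here taken as roughly 8–13 Hz) over left and right frontal sites is conventionally estimated from a power spectral density, for example with Welch's method, and asymmetry is often expressed as a difference of log powers. The sketch below shows that computation on simulated signals; the channel labels, sampling rate, and band limits are assumptions rather than the study's parameters.

      # Hypothetical sketch: alpha power for left vs right frontal channels via Welch PSD.
      import numpy as np
      from scipy.signal import welch

      fs = 250                                   # sampling rate in Hz (assumed)
      t = np.arange(0, 10, 1 / fs)               # 10 s of data
      rng = np.random.default_rng(7)

      # Simulated signals: a 10 Hz alpha rhythm plus noise, stronger on the right.
      left_frontal = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)
      right_frontal = 1.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)

      def alpha_power(signal, fs=fs, band=(8.0, 13.0)):
          """Mean Welch PSD across the alpha band."""
          freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
          mask = (freqs >= band[0]) & (freqs <= band[1])
          return psd[mask].mean()

      lp, rp = alpha_power(left_frontal), alpha_power(right_frontal)
      print(f"alpha power left: {lp:.2f}  right: {rp:.2f}  "
            f"asymmetry (ln R - ln L): {np.log(rp) - np.log(lp):.2f}")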

  6. The relationship between facial 3-D morphometry and the perception of attractiveness in children.

    PubMed

    Ferrario, V F; Sforza, C; Poggio, C E; Colombo, A; Tartaglia, G

    1997-01-01

    The aim of this investigation was to determine whether attractive children differ in their three-dimensional facial characteristics from nonattractive children of the same age, race, and sex. The facial characteristics of 36 boys and 44 girls aged 8 to 9 years were investigated. Frontal and profile photographs were analyzed independently by 21 judges, and, for each view, four groups were obtained: attractive boys, nonattractive boys, attractive girls, and nonattractive girls. For each child, the three-dimensional coordinates of 16 standardized soft tissue facial landmarks were automatically collected using an infrared system and used to calculate several three-dimensional angles, linear distances, and linear distance ratios. Mean values were computed in the eight groups, and attractive and nonattractive children were compared within sex and view. Most children received a different esthetic evaluation in the separate frontal and profile assessments; concordance in both attractive and nonattractive groups was only 50%. Moreover, three-dimensional facial morphometry was not able to separate attractive and nonattractive children.
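
    The three-dimensional measures used in such studies, namely linear distances, distance ratios, and angles computed from soft-tissue landmark coordinates, are simple vector operations. The sketch below illustrates them on simulated coordinates; the landmark indices named in the comments are hypothetical placeholders, not the study's landmark set.

      # Hypothetical sketch: distances, ratios, and an angle from 3D facial landmarks.
      import numpy as np

      rng = np.random.default_rng(8)
      landmarks = rng.normal(size=(16, 3))    # 16 soft-tissue landmarks, x/y/z in mm (simulated)

      def dist(a, b):
          return float(np.linalg.norm(landmarks[a] - landmarks[b]))

      def angle(a, b, c):
          """Angle at vertex b (degrees) formed by points a-b-c."""
          v1 = landmarks[a] - landmarks[b]
          v2 = landmarks[c] - landmarks[b]
          cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

      # Hypothetical indices: 0/1 outer eye corners, 2 nasion, 3 chin, 4/5 mouth corners.
      face_width = dist(0, 1)
      face_height = dist(2, 3)
      mouth_width = dist(4, 5)

      print(f"width/height ratio: {face_width / face_height:.2f}")
      print(f"mouth/face-width ratio: {mouth_width / face_width:.2f}")
      print(f"angle at the chin (nasion-chin-mouth corner): {angle(2, 3, 4):.1f} degrees")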

  7. Age Differences in the Complexity of Emotion Perception.

    PubMed

    Kim, Seungyoun; Geren, Jennifer L; Knight, Bob G

    2015-01-01

    The current study examined age differences in the number of emotion components used in the judgment of emotion from facial expressions. Fifty-eight younger and 58 older adults were compared on the complexity of perception of emotion from standardized facial expressions that were either clear or ambiguous exemplars of emotion. Using an intra-individual factor analytic approach, results showed that older adults used more emotion components in perceiving emotion in faces than younger adults. Both age groups reported greater emotional complexity for the clear and prototypical emotional stimuli. Age differences in emotional complexity were more pronounced for the ambiguous expressions compared with the clear expressions. These findings demonstrate that older adults showed increased elaboration of emotion, particularly when emotion cues were subtle and provide support for greater emotion differentiation in older adulthood.

  8. Placing the face in context: cultural differences in the perception of facial emotion.

    PubMed

    Masuda, Takahiko; Ellsworth, Phoebe C; Mesquita, Batja; Leu, Janxin; Tanida, Shigehito; Van de Veerdonk, Ellen

    2008-03-01

    Two studies tested the hypothesis that in judging people's emotions from their facial expressions, Japanese, more than Westerners, incorporate information from the social context. In Study 1, participants viewed cartoons depicting a happy, sad, angry, or neutral person surrounded by other people expressing the same emotion as the central person or a different one. The surrounding people's emotions influenced Japanese but not Westerners' perceptions of the central person. These differences reflect differences in attention, as indicated by eye-tracking data (Study 2): Japanese looked at the surrounding people more than did Westerners. Previous findings on East-West differences in contextual sensitivity generalize to social contexts, suggesting that Westerners see emotions as individual feelings, whereas Japanese see them as inseparable from the feelings of the group.

  9. Additive Manufacturing Techniques for the Reconstruction of 3D Fetal Faces.

    PubMed

    Speranza, Domenico; Citro, Daniela; Padula, Francesco; Motyl, Barbara; Marcolin, Federica; Calì, Michele; Martorelli, Massimo

    2017-01-01

    This paper deals with additive manufacturing techniques for the creation of 3D fetal face models starting from routine 3D ultrasound data. Two distinct themes are addressed. First, a method for processing and building 3D models based on medical image processing techniques is proposed. Second, preliminary results are reported from a questionnaire distributed to future parents on the use of these reconstructions from both an emotional and an affective point of view. In particular, the study focuses on the enhancement of the perception of maternity or paternity and on the improvement of the relationship between parents and physicians in cases of fetal malformation, in particular facial anomalies or cleft lip.
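
    One plausible processing pipeline for turning segmented volumetric ultrasound data into a printable model is to extract an isosurface mesh and export it in a 3D-printing format such as STL. The sketch below uses scikit-image's marching cubes and the trimesh library on a toy volume; the libraries, iso-level, and volume are assumptions, not the authors' workflow.

      # Hypothetical sketch: isosurface mesh from a toy "ultrasound" volume, exported to STL.
      import numpy as np
      from skimage import measure   # scikit-image
      import trimesh                # one option for mesh handling/export (assumed choice)

      # Toy volume: a sphere standing in for segmented fetal-face voxel data.
      z, y, x = np.mgrid[-32:32, -32:32, -32:32]
      volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)

      # Extract a surface mesh at an assumed iso-level.
      verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)

      # Build a mesh object and write an STL file for additive manufacturing.
      face_mesh = trimesh.Trimesh(vertices=verts, faces=faces)
      face_mesh.export("fetal_face_model.stl")
      print(f"exported mesh with {len(verts)} vertices and {len(faces)} faces")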

  10. Operant conditioning of facial displays of pain.

    PubMed

    Kunz, Miriam; Rainville, Pierre; Lautenbacher, Stefan

    2011-06-01

    The operant model of chronic pain posits that nonverbal pain behavior, such as facial expressions, is sensitive to reinforcement, but experimental evidence supporting this assumption is sparse. The aim of the present study was to investigate in a healthy population a) whether facial pain behavior can indeed be operantly conditioned using a discriminative reinforcement schedule to increase and decrease facial pain behavior and b) to what extent these changes affect pain experience indexed by self-ratings. In the experimental group (n = 29), the participants were reinforced every time that they showed pain-indicative facial behavior (up-conditioning) or a neutral expression (down-conditioning) in response to painful heat stimulation. Once facial pain behavior was successfully up- or down-conditioned, respectively (which occurred in 72% of participants), facial pain displays and self-report ratings were assessed. In addition, a control group (n = 11) was used that was yoked to the reinforcement plans of the experimental group. During the conditioning phases, reinforcement led to significant changes in facial pain behavior in the majority of the experimental group (p < .001) but not in the yoked control group (p > .136). Fine-grained analyses of facial muscle movements revealed a similar picture. Furthermore, the decline in facial pain displays (as observed during down-conditioning) strongly predicted changes in pain ratings (R(2) = 0.329). These results suggest that a) facial pain displays are sensitive to reinforcement and b) that changes in facial pain displays can affect self-report ratings.

  11. Bell's Palsy

    MedlinePlus

    ... hours to days; facial droop and difficulty making facial expressions, such as closing your eye or smiling; drooling; pain around the jaw or in or behind your ear on the affected side; increased ... if you experience facial weakness or drooping, to determine the underlying cause ...

  12. Motion Based Target Acquisition and Evaluation in an Adaptive Machine Vision System

    DTIC Science & Technology

    1995-05-01

    ... paths in facial recognition and learning. Annals of Neurology, 22, 41-45. Tolman, E.C. (1932) Purposive Behavior in Animals and Men. New York: Appleton ... Learned scan paths are the active processes of perception. Rizzo et al. (1987) studied the fixation patterns of two patients with impaired facial recognition and learning and found an increase in the randomness of the scan patterns compared to controls, indicating that the cortex was failing to direct ...

  13. Does facial attractiveness influence perception of epilepsy diagnosis? An insight into stigma in epilepsy.

    PubMed

    Ristić, Aleksandar J; Jovanović, Olja; Popadić, Dragan; Pađen, Višnja; Moosa, Ahsan N V; Krivokapić, Ana; Parojčić, Aleksandra; Berisavac, Ivana; Ilanković, Andrej; Baščarević, Vladimir; Vojvodić, Nikola; Sokić, Dragoslav

    2017-12-01

    Using a group of young healthy individuals and patients with multiple sclerosis (pMS), we aimed to investigate whether judgments of physical attractiveness affect the perception of epilepsy. We tested the hypothesis that subjects, in the absence of relevant clues, would rely on facial attractiveness when asked to speculate which person suffers from epilepsy, and would select the less attractive choices. Two photo-arrays (7 photos for each gender) selected from the Chicago Face Database (180 neutral faces of Caucasian volunteers with unknown medical status) were shown to study participants. Photos were evenly distributed along a continuum of attractiveness that was estimated by independent raters in a prestudy stage. In each photo-array, three photos had a rating of 1-3 (unattractive), one photo had a rating of 4 (neutral), and three photos had a rating of 5-7 (attractive). High-quality printed photo-arrays were presented to test subjects, who were asked to select one person from each photo-array "who has epilepsy". Finally, all subjects were asked to complete a questionnaire of self-esteem and a 19-item Scale of stereotypes toward people with epilepsy. In total, 71 students of psychology, anthropology, or andragogy (mean age: 21.6±1.7 years; female: 85.9%) and 70 pMS (mean age: 37.9±8 years; female: 71.4%) were tested. The majority of students and pMS had no previous personal experience with individuals with epilepsy (63.4%; 47.1%, p=0.052). The male photo was selected as epileptic in the following proportions: students - 84.5% unattractive, 8.5% neutral, and 7% attractive; pMS - 62.9% unattractive, 8.6% neutral, and 28.6% attractive (p=0.003). The female photo was selected as epileptic in the following proportions: students - 38% unattractive, 52.1% neutral, and 9.9% attractive; pMS - 32.9% unattractive, 34.3% neutral, and 32.9% attractive (p=0.003). Both groups showed very low potential for stigmatization, significantly lower in pMS on 10 items. Patients with multiple sclerosis showed significantly higher self-esteem than students (p=0.007). Facial attractiveness influences the perception of a diagnosis of epilepsy: both students and pMS were less willing to attribute epilepsy to an attractive person of either gender. Copyright © 2017 Elsevier Inc. All rights reserved.
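
    The group comparison reported for each photo-array (e.g., students vs. pMS choosing an unattractive, neutral, or attractive male photo, p=0.003) corresponds to a test of independence on a 2×3 contingency table. The sketch below illustrates that test; the counts are rough reconstructions from the reported percentages and sample sizes and are shown for illustration only.

      # Hypothetical chi-square sketch for the group-by-attractiveness selection table.
      import numpy as np
      from scipy.stats import chi2_contingency

      # Rows: students (n=71), pMS (n=70); columns: unattractive, neutral, attractive photo chosen.
      # Counts are approximate reconstructions from the reported percentages (illustrative only).
      table = np.array([
          [60, 6, 5],    # students: ~84.5%, ~8.5%, ~7%
          [44, 6, 20],   # pMS:      ~62.9%, ~8.6%, ~28.6%
      ])

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")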

  14. What a Smile Means: Contextual Beliefs and Facial Emotion Expressions in a Non-verbal Zero-Sum Game

    PubMed Central

    Pádua Júnior, Fábio P.; Prado, Paulo H. M.; Roeder, Scott S.; Andrade, Eduardo B.

    2016-01-01

    Research into the authenticity of facial emotion expressions often focuses on the physical properties of the face while paying little attention to the role of beliefs in emotion perception. Further, the literature most often investigates how people express a pre-determined emotion rather than what facial emotion expressions people strategically choose to express. To fill these gaps, this paper proposes a non-verbal zero-sum game – the Face X Game – to assess the role of contextual beliefs and strategic displays of facial emotion expression in interpersonal interactions. This new research paradigm was used in a series of three studies, where two participants are asked to play the role of the sender (individual expressing emotional information on his/her face) or the observer (individual interpreting the meaning of that expression). Study 1 examines the outcome of the game with reference to the sex of the pair, where senders won more frequently when the pair included at least one female. Study 2 examines the strategic display of facial emotion expressions. The outcome of the game was again contingent upon the sex of the pair. Among female pairs, senders won the game more frequently, replicating the pattern of results from study 1. We also demonstrate that senders who strategically express an emotion incongruent with the valence of the event (e.g., smile after seeing a negative event) are able to mislead observers, who tend to hold a congruent belief about the meaning of the emotion expression. If sending an incongruent signal helps to explain why female senders win more frequently, it logically follows that female observers were more prone to hold a congruent, and therefore inaccurate, belief. This prospect implies that while female senders are willing and/or able to display fake smiles, paired female observers are not taking this into account. Study 3 investigates the role of contextual factors by manipulating female observers' beliefs. When prompted to think in an incongruent manner, these observers significantly improve their performance in the game. These findings emphasize the role that contextual factors play in emotion perception: observers' beliefs do indeed affect their judgments of facial emotion expressions. PMID:27148142

  15. Emotion recognition in borderline personality disorder: effects of emotional information on negative bias.

    PubMed

    Fenske, Sabrina; Lis, Stefanie; Liebke, Lisa; Niedtfeld, Inga; Kirsch, Peter; Mier, Daniela

    2015-01-01

    Borderline Personality Disorder (BPD) is characterized by severe deficits in social interactions, which might be linked to deficits in emotion recognition. Research on emotion recognition abilities in BPD revealed heterogeneous results, ranging from deficits to heightened sensitivity. The most stable findings point to an impairment in the evaluation of neutral facial expressions as neutral, as well as to a negative bias in emotion recognition; that is the tendency to attribute negative emotions to neutral expressions, or in a broader sense to report a more negative emotion category than depicted. However, it remains unclear which contextual factors influence the occurrence of this negative bias. Previous studies suggest that priming by preceding emotional information and also constrained processing time might augment the emotion recognition deficit in BPD. To test these assumptions, 32 female BPD patients and 31 healthy females, matched for age and education, participated in an emotion recognition study, in which every facial expression was preceded by either a positive, neutral or negative scene. Furthermore, time constraints for processing were varied by presenting the facial expressions with short (100 ms) or long duration (up to 3000 ms) in two separate blocks. BPD patients showed a significant deficit in emotion recognition for neutral and positive facial expression, associated with a significant negative bias. In BPD patients, this emotion recognition deficit was differentially affected by preceding emotional information and time constraints, with a greater influence of emotional information during long face presentations and a greater influence of neutral information during short face presentations. Our results are in line with previous findings supporting the existence of a negative bias in emotion recognition in BPD patients, and provide further insights into biased social perceptions in BPD patients.

  16. BMI and WHR Are Reflected in Female Facial Shape and Texture: A Geometric Morphometric Image Analysis

    PubMed Central

    Mayer, Christine; Windhager, Sonja; Schaefer, Katrin; Mitteroecker, Philipp

    2017-01-01

    Facial markers of body composition are frequently studied in evolutionary psychology and are important in computational and forensic face recognition. We assessed the association of body mass index (BMI) and waist-to-hip ratio (WHR) with facial shape and texture (color pattern) in a sample of young Middle European women by a combination of geometric morphometrics and image analysis. Faces of women with high BMI had a wider and rounder facial outline relative to the size of the eyes and lips, and relatively lower eyebrows. Furthermore, women with high BMI had a brighter and more reddish skin color than women with lower BMI. The same facial features were associated with WHR, even though BMI and WHR were only moderately correlated. Yet BMI was more predictable than WHR from facial attributes. After leave-one-out cross-validation, we were able to predict 25% of variation in BMI and 10% of variation in WHR by facial shape. Facial texture predicted only about 3–10% of variation in BMI and WHR. This indicates that facial shape primarily reflects total fat proportion, rather than the distribution of fat within the body. The association of reddish facial texture with high BMI may be mediated by increased blood pressure and superficial blood flow as well as diet. Our study elucidates how geometric morphometric image analysis serves to quantify the effect of biological factors such as BMI and WHR on facial shape and color, which in turn contributes to social perception. PMID:28052103
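
    The leave-one-out cross-validation used to estimate how much BMI variance facial attributes predict can be sketched with scikit-learn: fit a regression on all faces but one, predict the left-out face, and summarize the out-of-sample predictions with R². The features below are simulated placeholders for the shape or texture scores, so the numbers are illustrative only.

      # Hypothetical LOOCV sketch: predicting BMI from facial-shape features.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import LeaveOneOut, cross_val_predict
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(9)
      n_faces, n_shape_features = 120, 10          # assumed sample and feature counts
      shape = rng.normal(size=(n_faces, n_shape_features))
      bmi = 22 + 1.5 * shape[:, 0] + rng.normal(scale=2.0, size=n_faces)  # simulated BMI

      # Leave-one-out predictions: each face is predicted by a model trained on all others.
      pred = cross_val_predict(LinearRegression(), shape, bmi, cv=LeaveOneOut())
      print(f"LOOCV R^2 for BMI from shape: {r2_score(bmi, pred):.2f}")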

  17. Ramsay Hunt Syndrome

    MedlinePlus

    ... spinning or moving (vertigo); a change in taste perception or loss of taste; dry mouth and eyes ... of one-sided facial paralysis and hearing loss. Risk factors: anyone who has had chickenpox can develop ...

  18. Association of impaired facial affect recognition with basic facial and visual processing deficits in schizophrenia.

    PubMed

    Norton, Daniel; McBain, Ryan; Holt, Daphne J; Ongur, Dost; Chen, Yue

    2009-06-15

    Impaired emotion recognition has been reported in schizophrenia, yet the nature of this impairment is not completely understood. Recognition of facial emotion depends on processing affective and nonaffective facial signals, as well as basic visual attributes. We examined whether and how poor facial emotion recognition in schizophrenia is related to basic visual processing and nonaffective face recognition. Schizophrenia patients (n = 32) and healthy control subjects (n = 29) performed emotion discrimination, identity discrimination, and visual contrast detection tasks, where the emotionality, distinctiveness of identity, or visual contrast was systematically manipulated. Subjects determined which of two presentations in a trial contained the target: the emotional face for emotion discrimination, a specific individual for identity discrimination, and a sinusoidal grating for contrast detection. Patients had significantly higher thresholds (worse performance) than control subjects for discriminating both fearful and happy faces. Furthermore, patients' poor performance in fear discrimination was predicted by performance in visual detection and face identity discrimination. Schizophrenia patients require greater emotional signal strength to discriminate fearful or happy face images from neutral ones. Deficient emotion recognition in schizophrenia does not appear to be determined solely by affective processing but is also linked to the processing of basic visual and facial information.
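
    Discrimination thresholds of the kind compared here are typically obtained by fitting a psychometric function to proportion-correct data across signal strengths (e.g., emotional intensity or grating contrast) and reading off the level that yields a criterion performance. The sketch below fits a two-alternative cumulative-Gaussian to simulated data; the stimulus levels, function form, and 75%-correct criterion are assumptions.

      # Hypothetical sketch: estimating a discrimination threshold from a psychometric fit.
      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import norm

      # Emotional signal strength (e.g., % morph toward fearful) and proportion correct (simulated).
      levels = np.array([5, 10, 20, 40, 60, 80], dtype=float)
      p_correct = np.array([0.52, 0.58, 0.70, 0.86, 0.95, 0.98])

      def psychometric(x, mu, sigma):
          """Two-alternative cumulative-Gaussian rising from chance (0.5) to 1.0."""
          return 0.5 + 0.5 * norm.cdf(x, loc=mu, scale=sigma)

      params, _ = curve_fit(psychometric, levels, p_correct, p0=[30.0, 15.0])
      mu, sigma = params

      # Threshold defined as the level supporting 75% correct (an assumed criterion).
      threshold = norm.ppf((0.75 - 0.5) / 0.5, loc=mu, scale=sigma)
      print(f"estimated 75%-correct threshold: {threshold:.1f}% signal strength")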

  19. Neuropsychological Studies of Linguistic and Affective Facial Expressions in Deaf Signers.

    ERIC Educational Resources Information Center

    Corina, David P.; Bellugi, Ursula; Reilly, Judy

    1999-01-01

    Presents two studies that explore facial expression production in deaf signers. An experimental paradigm uses chimeric stimuli of American Sign Language linguistic and facial expressions to explore patterns of productive asymmetries in brain-intact signers. (Author/VWL)

  20. Crossmodal and Incremental Perception of Audiovisual Cues to Emotional Speech

    ERIC Educational Resources Information Center

    Barkhuysen, Pashiera; Krahmer, Emiel; Swerts, Marc

    2010-01-01

    In this article we report on two experiments about the perception of audiovisual cues to emotional speech. The article addresses two questions: (1) how do visual cues from a speaker's face to emotion relate to auditory cues, and (2) what is the recognition speed for various facial cues to emotion? Both experiments reported below are based on tests…
