Science.gov

Sample records for facial expression

  1. Holistic facial expression classification

    NASA Astrophysics Data System (ADS)

    Ghent, John; McDonald, J.

    2005-06-01

    This paper details a procedure for classifying facial expressions, a growing and relatively new problem within computer vision. A fundamental problem in previous approaches to classifying facial expressions is the lack of a consistent method of measuring expression. This paper addresses that problem through computation of the Facial Expression Shape Model (FESM). This statistical model of facial expression is based on an anatomical analysis of facial expression called the Facial Action Coding System (FACS). We use the term Action Unit (AU) to describe a movement of one or more muscles of the face, and all expressions can be described using the AUs defined by FACS. The shape model is calculated by marking the face with 122 landmark points. We use Principal Component Analysis (PCA) to analyse how the landmark points move with respect to each other and to lower the dimensionality of the problem. Using the FESM in conjunction with Support Vector Machines (SVMs), we classify facial expressions. SVMs are a powerful machine learning technique based on optimisation theory. This project is largely concerned with the statistical models, machine learning techniques, and psychological tools used in the classification of facial expression. This holistic approach to expression classification provides a level of interaction with a computer that is a significant step forward in human-computer interaction.
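
    A minimal sketch of this pipeline, assuming each training face is annotated with 122 (x, y) landmark points flattened into a vector and labeled with an expression class; the array sizes and hyperparameters below are illustrative, not taken from the paper:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        # X: one row per face, 122 landmarks flattened to (x1, y1, ..., x122, y122);
        # y: one expression label per face. Random data stands in for real annotations.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 244))
        y = rng.integers(0, 6, size=300)

        # PCA captures how the landmarks move with respect to each other and lowers
        # the dimensionality; the SVM then separates expression classes in that space.
        model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf"))
        model.fit(X, y)
        print(model.predict(X[:5]))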

  2. Facial expression and sarcasm.

    PubMed

    Rockwell, P

    2001-08-01

    This study examined facial expression in the presentation of sarcasm. Sixty responses (30 sarcastic, 30 nonsarcastic) from 40 different speakers were coded by two trained coders. Expressions in three facial areas--eyebrow, eyes, and mouth--were evaluated. Only movement in the mouth area significantly differentiated ratings of sarcasm from nonsarcasm.

  3. [Prosopagnosia and facial expression recognition].

    PubMed

    Koyama, Shinichi

    2014-04-01

    This paper reviews clinical neuropsychological studies that have indicated that the recognition of a person's identity and the recognition of facial expressions are processed by different cortical and subcortical areas of the brain. The fusiform gyrus, especially the right fusiform gyrus, plays an important role in the recognition of identity. The superior temporal sulcus, amygdala, and medial frontal cortex play important roles in facial-expression recognition. Both facial recognition and facial-expression recognition are highly intellectual processes that involve several regions of the brain.

  4. Retinotopy of facial expression adaptation.

    PubMed

    Matsumiya, Kazumichi

    2014-01-01

    The face aftereffect (FAE; the illusion of faces after adaptation to a face) has been reported to occur without retinal overlap between adaptor and test, but recent studies revealed that the FAE is not constant across all test locations, which suggests that the FAE is also retinotopic. However, it remains unclear whether the characteristic of the retinotopy of the FAE for one facial aspect is the same as that of the FAE for another facial aspect. In the research reported here, an examination of the retinotopy of the FAE for facial expression indicated that the facial expression aftereffect occurs without retinal overlap between adaptor and test, and depends on the retinal distance between them. Furthermore, the results indicate that, although dependence of the FAE on adaptation-test distance is similar between facial expression and facial identity, the FAE for facial identity is larger than that for facial expression when a test face is presented in the opposite hemifield. On the basis of these results, I discuss adaptation mechanisms underlying facial expression processing and facial identity processing for the retinotopy of the FAE.

  5. Processing faces and facial expressions.

    PubMed

    Posamentier, Mette T; Abdi, Hervé

    2003-09-01

    This paper reviews the processing of facial identity and expressions. The issue of the independence of these two processing systems has been addressed from different approaches over the past 25 years. More recently, neuroimaging techniques have provided researchers with new tools to investigate how facial information is processed in the brain. First, findings from "traditional" approaches to identity and expression processing are summarized. The review then covers findings from neuroimaging studies on face perception, recognition, and encoding. Processing of the basic facial expressions is detailed in light of behavioral and neuroimaging data. Whereas data from experimental and neuropsychological studies support the existence of two systems, the neuroimaging literature yields a less clear picture because it shows considerable overlap in activation patterns in response to the different face-processing tasks. Further, activation patterns in response to facial expressions support the notion that distinct neural substrates are involved in processing different facial expressions.

  6. Measuring facial expression of emotion

    PubMed Central

    Wolf, Karsten

    2015-01-01

    Research into emotions has increased in recent decades, especially on the subject of recognition of emotions. However, studies of the facial expressions of emotion were compromised by technical problems with visible video analysis and electromyography in experimental settings. These have only recently been overcome. There have been new developments in the field of automated computerized facial recognition, allowing real-time identification of facial expression in social environments. This review addresses three approaches to measuring facial expression of emotion and describes their specific contributions to understanding emotion in the healthy population and in persons with mental illness. Despite recent progress, studies on human emotions have been hindered by the lack of consensus on an emotion theory suited to examining the dynamic aspects of emotion and its expression. Studying expression of emotion in patients with mental health conditions for diagnostic and therapeutic purposes will profit from theoretical and methodological progress. PMID:26869846

  7. Measuring facial expression of emotion.

    PubMed

    Wolf, Karsten

    2015-12-01

    Research into emotions has increased in recent decades, especially on the subject of recognition of emotions. However, studies of the facial expressions of emotion were compromised by technical problems with visible video analysis and electromyography in experimental settings. These have only recently been overcome. There have been new developments in the field of automated computerized facial recognition, allowing real-time identification of facial expression in social environments. This review addresses three approaches to measuring facial expression of emotion and describes their specific contributions to understanding emotion in the healthy population and in persons with mental illness. Despite recent progress, studies on human emotions have been hindered by the lack of consensus on an emotion theory suited to examining the dynamic aspects of emotion and its expression. Studying expression of emotion in patients with mental health conditions for diagnostic and therapeutic purposes will profit from theoretical and methodological progress.

  8. Cortical control of facial expression.

    PubMed

    Müri, René M

    2016-06-01

    The present Review deals with the motor control of facial expressions in humans. Facial expressions are a central part of human communication. Emotional facial expressions play a crucial role in human nonverbal behavior, allowing a rapid transfer of information between individuals. Facial expressions can be either voluntarily or emotionally controlled. Recent studies in nonhuman primates and humans have revealed that the motor control of facial expressions has a distributed neural representation. At least five cortical regions on the medial and lateral aspects of each hemisphere are involved: the primary motor cortex, the ventral lateral premotor cortex, the supplementary motor area on the medial wall, and the rostral and caudal cingulate cortex. The results of studies in humans and nonhuman primates suggest that the innervation of the face is bilaterally controlled for the upper part and mainly contralaterally controlled for the lower part. Furthermore, the primary motor cortex, the ventral lateral premotor cortex, and the supplementary motor area are essential for the voluntary control of facial expressions. In contrast, the cingulate cortical areas are important for emotional expression, because they receive input from different structures of the limbic system.

  9. Compound facial expressions of emotion

    PubMed Central

    Du, Shichuan; Tao, Yong; Martinez, Aleix M.

    2014-01-01

    Understanding the different categories of facial expressions of emotion regularly used by us is essential to gain insights into human cognition and affect as well as for the design of computational models and perceptual interfaces. Past research on facial expressions of emotion has focused on the study of six basic categories—happiness, surprise, anger, sadness, fear, and disgust. However, many more facial expressions of emotion exist and are used regularly by humans. This paper describes an important group of expressions, which we call compound emotion categories. Compound emotions are those that can be constructed by combining basic component categories to create new ones. For instance, happily surprised and angrily surprised are two distinct compound emotion categories. The present work defines 21 distinct emotion categories. Sample images of their facial expressions were collected from 230 human subjects. A Facial Action Coding System analysis shows the production of these 21 categories is different but consistent with the subordinate categories they represent (e.g., a happily surprised expression combines muscle movements observed in happiness and surprise). We show that these differences are sufficient to distinguish between the 21 defined categories. We then use a computational model of face perception to demonstrate that most of these categories are also visually discriminable from one another. PMID:24706770
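
    As a toy illustration of the compositional idea, the sketch below builds compound categories as unions of the action units (AUs) of their basic components; the AU sets are common FACS approximations of the basic emotions, not the paper's exact coding:

        # Approximate FACS action-unit prototypes for three basic emotions.
        BASIC_AUS = {
            "happiness": {6, 12},       # cheek raiser, lip-corner puller
            "surprise": {1, 2, 5, 26},  # brow raisers, upper-lid raiser, jaw drop
            "anger": {4, 5, 7, 23},     # brow lowerer, lid tighteners, lip tightener
        }

        def compound(*emotions):
            """A compound category combines the muscle movements of its components."""
            aus = set()
            for emotion in emotions:
                aus |= BASIC_AUS[emotion]
            return aus

        print(compound("happiness", "surprise"))  # happily surprised
        print(compound("anger", "surprise"))      # angrily surprised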

  10. Compound facial expressions of emotion.

    PubMed

    Du, Shichuan; Tao, Yong; Martinez, Aleix M

    2014-04-15

    Understanding the different categories of facial expressions of emotion regularly used by us is essential to gain insights into human cognition and affect as well as for the design of computational models and perceptual interfaces. Past research on facial expressions of emotion has focused on the study of six basic categories--happiness, surprise, anger, sadness, fear, and disgust. However, many more facial expressions of emotion exist and are used regularly by humans. This paper describes an important group of expressions, which we call compound emotion categories. Compound emotions are those that can be constructed by combining basic component categories to create new ones. For instance, happily surprised and angrily surprised are two distinct compound emotion categories. The present work defines 21 distinct emotion categories. Sample images of their facial expressions were collected from 230 human subjects. A Facial Action Coding System analysis shows the production of these 21 categories is different but consistent with the subordinate categories they represent (e.g., a happily surprised expression combines muscle movements observed in happiness and surprise). We show that these differences are sufficient to distinguish between the 21 defined categories. We then use a computational model of face perception to demonstrate that most of these categories are also visually discriminable from one another.

  11. Darwin, deception, and facial expression.

    PubMed

    Ekman, Paul

    2003-12-01

    Darwin did not focus on deception. Only a few sentences in his book mentioned the issue. One of them raised the very interesting question of whether it is difficult to voluntarily inhibit the emotional expressions that are most difficult to voluntarily fabricate. Another suggestion was that it would be possible to unmask a fabricated expression by the absence of the difficult-to-voluntarily-generate facial actions. Still another was that during emotion body movements could be more easily suppressed than facial expression. Research relevant to each of Darwin's suggestions is reviewed, as is other research on deception that Darwin did not foresee.

  12. [Part II: Recognising facial expressions].

    PubMed

    Krolak-Salmon, P; Hénaff, M A; Bertrand, O; Vighetto, A; Mauguière, F

    2006-11-01

    In this second part, we specifically address the neural mechanisms and structures involved in the recognition of emotional facial expressions, which is crucial in social cognition. Emotion recognition in others can be critically impaired in some neurodegenerative and neurovascular diseases. This dysfunction, which sometimes correlates with disabling behavioural disorders and impaired interpersonal communication, must be better understood. The results of a series of scalp and intracranial event-related potential (ERP) recordings, as well as recent advances in the literature, are reported. ERPs to emotional facial expressions were recorded in multiple subcortical and cortical areas of patients with drug-refractory epilepsy implanted with depth electrodes. The roles of the amygdala, insula, and prefrontal cortex, located at the crossroads between perceptual analysis and conceptual emotional knowledge, are particularly underlined. Altogether, these studies demonstrate that facial expressions are processed widely in space and time, with some structures reacting early and automatically and others providing a sustained reaction that depends on attention.

  13. Spontaneous Facial Mimicry in Response to Dynamic Facial Expressions

    ERIC Educational Resources Information Center

    Sato, Wataru; Yoshikawa, Sakiko

    2007-01-01

    Based on previous neuroscientific evidence indicating activation of the mirror neuron system in response to dynamic facial actions, we hypothesized that facial mimicry would occur while subjects viewed dynamic facial expressions. To test this hypothesis, dynamic/static facial expressions of anger/happiness were presented using computer-morphing…

  14. Mapping and Manipulating Facial Expression

    ERIC Educational Resources Information Center

    Theobald, Barry-John; Matthews, Iain; Mangini, Michael; Spies, Jeffrey R.; Brick, Timothy R.; Cohn, Jeffrey F.; Boker, Steven M.

    2009-01-01

    Nonverbal visual cues accompany speech to supplement the meaning of spoken words, signify emotional state, indicate position in discourse, and provide back-channel feedback. This visual information includes head movements, facial expressions and body gestures. In this article we describe techniques for manipulating both verbal and nonverbal facial…

  15. Interaction between facial expression and color

    PubMed Central

    Nakajima, Kae; Minami, Tetsuto; Nakauchi, Shigeki

    2017-01-01

    Facial color varies depending on emotional state, and emotions are often described in relation to facial color. In this study, we investigated whether the recognition of facial expressions was affected by facial color and vice versa. In the facial expression task, expression morph continua were employed: fear-anger and sadness-happiness. The morphed faces were presented in three different facial colors (bluish, neutral, and reddish color). Participants identified a facial expression between the two endpoints (e.g., fear or anger) regardless of its facial color. The results showed that the perception of facial expression was influenced by facial color. In the fear-anger morphs, intermediate morphs of reddish- and bluish-colored faces had a greater tendency to be identified as angry faces and fearful faces, respectively. In the facial color task, two bluish-to-reddish colored face continua were presented in three different facial expressions (fear-neutral-anger and sadness-neutral-happiness). Participants judged whether the facial color was reddish or bluish regardless of its expression. Faces with a sad expression tended to be identified as more bluish, whereas the other expressions did not affect facial color judgments. These results suggest that an interactive but disproportionate relationship exists between facial color and expression in face perception. PMID:28117349

  16. Facial dynamics and emotional expressions in facial aging treatments.

    PubMed

    Michaud, Thierry; Gassia, Véronique; Belhaouari, Lakhdar

    2015-03-01

    Facial expressions convey emotions that form the foundation of interpersonal relationships, and many of these emotions promote and regulate our social linkages. Hence, the symptomatological analysis of facial aging and the treatment plan must necessarily include knowledge of the facial dynamics and emotional expressions of the face. This approach aims to more closely meet patients' expectations of natural-looking results by correcting age-related negative expressions while observing the emotional language of the face. This article successively describes patients' expectations, the role of facial expressions in relational dynamics, the relationship between facial structures and facial expressions, and the way facial aging mimics negative expressions. Finally, therapeutic implications for facial aging treatment are addressed. © 2015 Wiley Periodicals, Inc.

  17. Neuroticism Delays Detection of Facial Expressions.

    PubMed

    Sawada, Reiko; Sato, Wataru; Uono, Shota; Kochiyama, Takanori; Kubota, Yasutaka; Yoshimura, Sayaka; Toichi, Motomi

    2016-01-01

    The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high- and low-neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions within a crowd of neutral expressions. Anti-expressions contained an amount of visual changes equivalent to those found in normal expressions compared to neutral expressions, but they were usually recognized as neutral expressions. Subjective emotional ratings in response to each facial expression stimulus were also obtained. Participants with high neuroticism showed an overall delay in the detection of target facial expressions compared to participants with low neuroticism. Additionally, the high-neuroticism group showed higher levels of arousal to facial expressions compared to the low-neuroticism group. These data suggest that neuroticism modulates the detection of emotional facial expressions in healthy participants; high levels of neuroticism delay overall detection of facial expressions and enhance emotional arousal in response to facial expressions.

  18. Neuroticism Delays Detection of Facial Expressions

    PubMed Central

    Sawada, Reiko; Sato, Wataru; Uono, Shota; Kochiyama, Takanori; Kubota, Yasutaka; Yoshimura, Sayaka; Toichi, Motomi

    2016-01-01

    The rapid detection of emotional signals from facial expressions is fundamental for human social interaction. The personality factor of neuroticism modulates the processing of various types of emotional facial expressions; however, its effect on the detection of emotional facial expressions remains unclear. In this study, participants with high- and low-neuroticism scores performed a visual search task to detect normal expressions of anger and happiness, and their anti-expressions within a crowd of neutral expressions. Anti-expressions contained an amount of visual changes equivalent to those found in normal expressions compared to neutral expressions, but they were usually recognized as neutral expressions. Subjective emotional ratings in response to each facial expression stimulus were also obtained. Participants with high neuroticism showed an overall delay in the detection of target facial expressions compared to participants with low neuroticism. Additionally, the high-neuroticism group showed higher levels of arousal to facial expressions compared to the low-neuroticism group. These data suggest that neuroticism modulates the detection of emotional facial expressions in healthy participants; high levels of neuroticism delay overall detection of facial expressions and enhance emotional arousal in response to facial expressions. PMID:27073904

  1. Facial expressions recognition with an emotion expressive robotic head

    NASA Astrophysics Data System (ADS)

    Doroftei, I.; Adascalitei, F.; Lefeber, D.; Vanderborght, B.; Doroftei, I. A.

    2016-08-01

    The purpose of this study is to present the preliminary steps in facial expression recognition with a new version of an expressive social robotic head. In a first phase, our main goal was to reach a minimum level of emotional expressiveness, in order to obtain nonverbal communication between the robot and humans, by building six basic facial expressions. To evaluate the facial expressions, the robot was used in preliminary user studies among children and adults.

  2. Recognizing Facial Expressions Automatically from Video

    NASA Astrophysics Data System (ADS)

    Shan, Caifeng; Braspenning, Ralph

    Facial expressions, resulting from movements of the facial muscles, are the face changes in response to a person's internal emotional states, intentions, or social communications. There is a considerable history associated with the study of facial expressions. Darwin [22], who argued that all mammals reliably show emotions in their faces, was the first to describe in detail the specific facial expressions associated with emotions in animals and humans. Since then, facial expression analysis has been an area of great research interest for behavioral scientists [27]. Psychological studies [48, 3] suggest that facial expressions, as the main mode of nonverbal communication, play a vital role in human face-to-face communication. For illustration, we show some examples of facial expressions in Fig. 1.

  3. Mapping and manipulating facial expression.

    PubMed

    Theobald, Barry-John; Matthews, Iain; Mangini, Michael; Spies, Jeffrey R; Brick, Timothy R; Cohn, Jeffrey F; Boker, Steven M

    2009-01-01

    Nonverbal visual cues accompany speech to supplement the meaning of spoken words, signify emotional state, indicate position in discourse, and provide back-channel feedback. This visual information includes head movements, facial expressions and body gestures. In this article we describe techniques for manipulating both verbal and nonverbal facial gestures in video sequences of people engaged in conversation. We are developing a system for use in psychological experiments, where the effects of manipulating individual components of nonverbal visual behavior during live face-to-face conversation can be studied. In particular, the techniques we describe operate in real-time at video frame-rate and the manipulation can be applied so both participants in a conversation are kept blind to the experimental conditions.

  4. Mapping and Manipulating Facial Expression

    PubMed Central

    Theobald, Barry-John; Matthews, Iain; Mangini, Michael; Spies, Jeffrey R.; Brick, Timothy R.; Cohn, Jeffrey F.; Boker, Steven M.

    2009-01-01

    Non-verbal visual cues accompany speech to supplement the meaning of spoken words, signify emotional state, indicate position in discourse, and provide back-channel feedback. This visual information includes head movements, facial expressions and body gestures. In this paper we describe techniques for manipulating both verbal and non-verbal facial gestures in video sequences of people engaged in conversation. We are developing a system for use in psychological experiments, where the effects of manipulating individual components of non-verbal visual behaviour during live face-to-face conversation can be studied. In particular, the techniques we describe operate in real-time at video frame-rate and the manipulation can be applied so both participants in a conversation are kept blind to the experimental conditions. PMID:19624037

  5. Facial expression recognition using thermal image.

    PubMed

    Jiang, Guotai; Song, Xuemin; Zheng, Fuhui; Wang, Peipei; Omer, Ashgan

    2005-01-01

    In this paper, facial expression recognition is studied using mathematical morphology, by extracting and analyzing the overall geometric characteristics, and the geometric characteristics of the regions of interest, in Infrared Thermal Imaging (IRTI). The results show that the geometric characteristics of the regions of interest differ markedly across expressions, and that facial temperature changes almost simultaneously with expression. These results show the feasibility of facial expression recognition on the basis of IRTI. The method can be used to monitor facial expressions in real time, which can be used in auxiliary medical diagnosis.
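
    A rough sketch of this kind of analysis, assuming an 8-bit thermal face image in which warmer pixels are brighter; the threshold and the choice of descriptors are illustrative, since the abstract does not specify the exact morphological operators used:

        import cv2
        import numpy as np

        def region_geometry(ir_image, thresh=200):
            """Segment the warmest region of a thermal face image and return
            simple geometric descriptors: area, perimeter, and aspect ratio."""
            _, mask = cv2.threshold(ir_image, thresh, 255, cv2.THRESH_BINARY)
            # Morphological opening removes small speckle before measuring geometry.
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return None
            c = max(contours, key=cv2.contourArea)
            x, y, w, h = cv2.boundingRect(c)
            return cv2.contourArea(c), cv2.arcLength(c, True), w / h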

  6. Man-machine collaboration using facial expressions

    NASA Astrophysics Data System (ADS)

    Dai, Ying; Katahera, S.; Cai, D.

    2002-09-01

    For realizing flexible man-machine collaboration, understanding of facial expressions and gestures cannot be neglected. We proposed a hierarchical recognition approach for the understanding of human emotions. In this method, the facial AFs (action features) were first extracted and recognized using histograms of optical flow. Then, based on the facial AFs, facial expressions were classified into two classes, one representing positive emotions and the other negative ones. The expressions belonging to the positive class and those belonging to the negative class were then classified into the more complex emotions revealed by the corresponding facial expressions. Finally, a system architecture for coordinating the recognition of facial action features and facial expressions in man-machine collaboration was proposed.
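
    A minimal sketch of the first stage under stated assumptions: dense optical flow (OpenCV's Farnebäck method) summarized as a magnitude-weighted histogram of flow directions; the binning is illustrative rather than the authors' exact action-feature definition:

        import cv2
        import numpy as np

        def flow_direction_histogram(prev_gray, next_gray, bins=8):
            """Magnitude-weighted histogram of dense optical-flow directions
            between two consecutive grayscale frames."""
            flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
            hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
            return hist / (hist.sum() + 1e-9)  # normalize to compare across frames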

  7. Social Use of Facial Expressions in Hylobatids

    PubMed Central

    Scheider, Linda; Waller, Bridget M.; Oña, Leonardo; Burrows, Anne M.; Liebal, Katja

    2016-01-01

    Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social contexts) the duration of facial expressions was significantly longer when gibbons were facing another individual compared to non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than non-social contexts where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely ‘responded to’ by the partner’s facial expressions when facing another individual than non-facing. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with other conspecifics. PMID:26978660

  8. Social Use of Facial Expressions in Hylobatids.

    PubMed

    Scheider, Linda; Waller, Bridget M; Oña, Leonardo; Burrows, Anne M; Liebal, Katja

    2016-01-01

    Non-human primates use various communicative means in interactions with others. While primate gestures are commonly considered to be intentionally and flexibly used signals, facial expressions are often referred to as inflexible, automatic expressions of affective internal states. To explore whether and how non-human primates use facial expressions in specific communicative interactions, we studied five species of small apes (gibbons) by employing a newly established Facial Action Coding System for hylobatid species (GibbonFACS). We found that, despite individuals often being in close proximity to each other, in social (as opposed to non-social contexts) the duration of facial expressions was significantly longer when gibbons were facing another individual compared to non-facing situations. Social contexts included grooming, agonistic interactions and play, whereas non-social contexts included resting and self-grooming. Additionally, gibbons used facial expressions while facing another individual more often in social contexts than non-social contexts where facial expressions were produced regardless of the attentional state of the partner. Also, facial expressions were more likely 'responded to' by the partner's facial expressions when facing another individual than non-facing. Taken together, our results indicate that gibbons use their facial expressions differentially depending on the social context and are able to use them in a directed way in communicative interactions with other conspecifics.

  9. Geometry-driven photorealistic facial expression synthesis.

    PubMed

    Zhang, Qingshan; Liu, Zicheng; Guo, Baining; Terzopoulos, Demetri; Shum, Heung-Yeung

    2006-01-01

    Expression mapping (also called performance-driven animation) has been a popular method for generating facial animations. A shortcoming of this method is that it does not generate expression details such as the wrinkles due to skin deformations. In this paper, we provide a solution to this problem. We have developed a geometry-driven facial expression synthesis system. Given feature point positions (the geometry) of a facial expression, our system automatically synthesizes a corresponding expression image that includes photorealistic and natural looking expression details. Due to the difficulty of point tracking, the number of feature points required by the synthesis system is, in general, more than what is directly available from a performance sequence. We have developed a technique to infer the missing feature point motions from the tracked subset by using an example-based approach. Another application of our system is expression editing where the user drags feature points while the system interactively generates facial expressions with skin deformation details.

  10. Mutual information-based facial expression recognition

    NASA Astrophysics Data System (ADS)

    Hazar, Mliki; Hammami, Mohamed; Hanêne, Ben-Abdallah

    2013-12-01

    This paper introduces a novel low-computation discriminative-regions representation for the expression analysis task. The proposed approach relies on studies in psychology which show that most of the regions descriptive of, and responsible for, facial expression are located around certain face parts. The contribution of this work lies in the proposition of a new approach that supports automatic facial expression recognition based on automatic region selection. The region selection step aims to select the descriptive regions responsible for facial expression and was performed using the Mutual Information (MI) technique. For facial feature extraction, we applied Local Binary Patterns (LBP) to the gradient image to encode salient micro-patterns of facial expressions. Experimental studies have shown that using discriminative regions provides better results than using the whole face region, whilst reducing the feature vector dimension.
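
    A sketch of both steps under stated assumptions: scikit-image's local_binary_pattern applied to a gradient-magnitude image, and scikit-learn's mutual_info_classif standing in for the paper's MI-based region selection (the grid of candidate regions and all sizes are illustrative):

        import numpy as np
        from skimage.feature import local_binary_pattern
        from sklearn.feature_selection import mutual_info_classif

        def lbp_gradient_histogram(face, P=8, R=1):
            """Uniform-LBP histogram computed on the gradient magnitude of a face."""
            gy, gx = np.gradient(face.astype(float))
            codes = local_binary_pattern(np.hypot(gx, gy), P, R, method="uniform")
            hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
            return hist

        # Score each candidate face region by its mutual information with the
        # expression labels and keep only the most discriminative regions.
        rng = np.random.default_rng(0)
        region_feats = rng.normal(size=(100, 16))   # one scalar descriptor per region
        labels = rng.integers(0, 6, size=100)
        mi = mutual_info_classif(region_feats, labels, random_state=0)
        print(np.argsort(mi)[-5:])                  # five most informative regions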

  11. Human Facial Expressions as Adaptations: Evolutionary Questions in Facial Expression Research

    PubMed Central

    SCHMIDT, KAREN L.; COHN, JEFFREY F.

    2007-01-01

    The importance of the face in social interaction and social intelligence is widely recognized in anthropology. Yet the adaptive functions of human facial expression remain largely unknown. An evolutionary model of human facial expression as behavioral adaptation can be constructed, given the current knowledge of the phenotypic variation, ecological contexts, and fitness consequences of facial behavior. Studies of facial expression are available, but results are not typically framed in an evolutionary perspective. This review identifies the relevant physical phenomena of facial expression and integrates the study of this behavior with the anthropological study of communication and sociality in general. Anthropological issues with relevance to the evolutionary study of facial expression include: facial expressions as coordinated, stereotyped behavioral phenotypes, the unique contexts and functions of different facial expressions, the relationship of facial expression to speech, the value of facial expressions as signals, and the relationship of facial expression to social intelligence in humans and in nonhuman primates. Human smiling is used as an example of adaptation, and testable hypotheses concerning the human smile, as well as other expressions, are proposed. PMID:11786989

  12. Emotional facial activation induced by unconsciously perceived dynamic facial expressions.

    PubMed

    Kaiser, Jakob; Davey, Graham C L; Parkhouse, Thomas; Meeres, Jennifer; Scott, Ryan B

    2016-12-01

    Do facial expressions of emotion influence us when not consciously perceived? Methods to investigate this question have typically relied on brief presentation of static images. In contrast, real facial expressions are dynamic and unfold over several seconds. Recent studies demonstrate that gaze contingent crowding (GCC) can block awareness of dynamic expressions while still inducing behavioural priming effects. The current experiment tested for the first time whether dynamic facial expressions presented using this method can induce unconscious facial activation. Videos of dynamic happy and angry expressions were presented outside participants' conscious awareness while EMG measurements captured activation of the zygomaticus major (active when smiling) and the corrugator supercilii (active when frowning). Forced-choice classification of expressions confirmed they were not consciously perceived, while EMG revealed significant differential activation of facial muscles consistent with the expressions presented. This successful demonstration opens new avenues for research examining the unconscious emotional influences of facial expressions. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Realistic facial animation generation based on facial expression mapping

    NASA Astrophysics Data System (ADS)

    Yu, Hui; Garrod, Oliver; Jack, Rachael; Schyns, Philippe

    2014-01-01

    Facial expressions reflect a character's internal emotional states or responses to social communication. Though much effort has been devoted to generating realistic facial expressions, this remains a challenging topic due to human beings' sensitivity to subtle facial movements. In this paper, we present a method for facial animation generation which reflects true facial muscle movements with high fidelity. An intermediate model space is introduced to transfer captured static AU peak frames, based on FACS, to the conformed target face. Dynamic parameters derived using a psychophysical method are then integrated to generate facial animation, which is assumed to represent the natural correlation of multiple AUs. Finally, the animation sequence in the intermediate model space is mapped to the target face to produce the final animation.

  14. Horses discriminate between facial expressions of conspecifics

    PubMed Central

    Wathan, J.; Proops, L.; Grounds, K.; McComb, K.

    2016-01-01

    In humans, facial expressions are rich sources of social information and have an important role in regulating social interactions. However, the extent to which this is true in non-human animals, and particularly in non-primates, remains largely unknown. Therefore we tested whether domestic horses (Equus caballus) could discriminate between facial expressions of their conspecifics captured in different contexts, and whether viewing these expressions elicited functionally relevant reactions. Horses were more likely to approach photographic stimuli displaying facial expressions associated with positive attention and relaxation, and to avoid stimuli displaying an expression associated with aggression. Moreover, differing patterns of heart rate changes were observed in response to viewing the positive anticipation and agonistic facial expressions. These results indicate that horses spontaneously discriminate between photographs of unknown conspecifics portraying different facial expressions, showing appropriate behavioural and physiological responses. Thus horses, an animal far-removed from the primate lineage, also have the ability to use facial expressions as a means of gaining social information and potentially regulating social interactions. PMID:27995958

  15. The MPI Facial Expression Database — A Validated Database of Emotional and Conversational Facial Expressions

    PubMed Central

    Kaulard, Kathrin; Cunningham, Douglas W.; Bülthoff, Heinrich H.; Wallraven, Christian

    2012-01-01

    The ability to communicate is one of the core aspects of human life. For this, we use not only verbal but also nonverbal signals of remarkable complexity. Among the latter, facial expressions belong to the most important information channels. Despite the large variety of facial expressions we use in daily life, research on facial expressions has so far mostly focused on the emotional aspect. Consequently, most databases of facial expressions available to the research community also include only emotional expressions, neglecting the largely unexplored aspect of conversational expressions. To fill this gap, we present the MPI facial expression database, which contains a large variety of natural emotional and conversational expressions. The database contains 55 different facial expressions performed by 19 German participants. Expressions were elicited with the help of a method-acting protocol, which guarantees both well-defined and natural facial expressions. The method-acting protocol was based on every-day scenarios, which are used to define the necessary context information for each expression. All facial expressions are available in three repetitions, in two intensities, as well as from three different camera angles. A detailed frame annotation is provided, from which a dynamic and a static version of the database have been created. In addition to describing the database in detail, we also present the results of an experiment with two conditions that serve to validate the context scenarios as well as the naturalness and recognizability of the video sequences. Our results provide clear evidence that conversational expressions can be recognized surprisingly well from visual information alone. The MPI facial expression database will enable researchers from different research fields (including the perceptual and cognitive sciences, but also affective computing, as well as computer vision) to investigate the processing of a wider range of natural facial expressions.

  16. Computer-enhanced emotion in facial expressions.

    PubMed Central

    Calder, A J; Young, A W; Rowland, D; Perrett, D I

    1997-01-01

    Benson & Perrett's (1991b) computer-based caricature procedure was used to alter the positions of anatomical landmarks in photographs of emotional facial expressions with respect to their locations in a reference norm face (e.g. a neutral expression). Exaggerating the differences between an expression and its norm produces caricatured images, whereas reducing the differences produces 'anti-caricatures'. Experiment 1 showed that caricatured (+50% different from neutral) expressions were recognized significantly faster than the veridical (0%, undistorted) expressions. This held for all six basic emotions from the Ekman & Friesen (1976) series, and the effect generalized across different posers. For experiment 2, caricatured (+50%) and anti-caricatured (-50%) images were prepared using two types of reference norm; a neutral-expression norm, which would be optimal if facial expression recognition involves monitoring changes in the positioning of underlying facial muscles, and a perceptually-based norm involving an average of the expressions of six basic emotions (excluding neutral) in the Ekman & Friesen (1976) series. The results showed that the caricatured images were identified significantly faster, and the anti-caricatured images significantly slower, than the veridical expressions. Furthermore, the neutral-expression and average-expression norm caricatures produced the same pattern of results. PMID:9265191
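
    In landmark terms, the caricature procedure reduces to scaling an expression's deviation from the norm face; a minimal sketch (the landmark count is arbitrary, while the ±50% levels are those used in the experiments):

        import numpy as np

        def caricature(expr, norm, level=0.5):
            """Exaggerate (level > 0) or reduce (level < 0) an expression's
            deviation from a reference norm face; level=0 returns it unchanged."""
            return norm + (1.0 + level) * (expr - norm)

        norm = np.zeros((60, 2))                        # e.g. a neutral-expression norm
        expr = np.random.default_rng(0).normal(size=(60, 2))
        plus50 = caricature(expr, norm, +0.5)           # caricature
        minus50 = caricature(expr, norm, -0.5)          # anti-caricature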

  17. Biased Facial Expression Interpretation in Shy Children

    ERIC Educational Resources Information Center

    Kokin, Jessica; Younger, Alastair; Gosselin, Pierre; Vaillancourt, Tracy

    2016-01-01

    The relationship between shyness and the interpretations of the facial expressions of others was examined in a sample of 123 children aged 12 to 14 years. Participants viewed faces displaying happiness, fear, anger, disgust, sadness, surprise, as well as a neutral expression, presented on a computer screen. The children identified each expression…

  18. Facial expression recognition and subthalamic nucleus stimulation

    PubMed Central

    Schroeder, U; Kuehler, A; Hennenlotter, A; Haslinger, B; Tronnier, V; Krause, M; Pfister, R; Sprengelmeyer, R; Lange, K; Ceballos-Baumann, A

    2004-01-01

    Objective: To study the impact of STN stimulation in Parkinson's disease on perception of facial expressions. Results: There was a selective reduction in recognition of angry faces, but not other expressions, during STN stimulation. Conclusions: The findings may have important implications for social adjustment in these patients. PMID:15026519

  19. Hereditary family signature of facial expression

    PubMed Central

    Peleg, Gili; Katzir, Gadi; Peleg, Ofer; Kamara, Michal; Brodsky, Leonid; Hel-Or, Hagit; Keren, Daniel; Nevo, Eviatar

    2006-01-01

    Although facial expressions of emotion are universal, individual differences create a facial expression “signature” for each person; but is there a unique family facial expression signature? Only a few family studies on the heredity of facial expressions have been performed, none of which compared the gestalt of movements in various emotional states; they compared only a few movements in one or two emotional states. No studies, to our knowledge, have compared the movements of congenitally blind subjects with those of their relatives. Using two types of analyses, we show a correlation between the movements of congenitally blind subjects and those of their relatives in think-concentrate, sadness, anger, disgust, joy, and surprise, and provide evidence for a unique family facial expression signature. In the “in-out family test” analysis, a particular movement was compared each time across subjects. Results show that the frequency of occurrence of a movement of a congenitally blind subject in his family is significantly higher than that outside of his family in think-concentrate, sadness, and anger. In the “classification test” analysis, in which congenitally blind subjects were classified to their families according to the gestalt of movements, results show 80% correct classification over the entire interview and 75% in anger. Analysis of the movement frequencies in anger revealed a correlation between the movement frequencies of congenitally blind individuals and those of their relatives. This study anticipates discovering genes that influence facial expressions, understanding their evolutionary significance, and elucidating repair mechanisms for syndromes lacking facial expression, such as autism. PMID:17043232

  1. The identification of unfolding facial expressions.

    PubMed

    Fiorentini, Chiara; Schmidt, Susanna; Viviani, Paolo

    2012-01-01

    We asked whether the identification of emotional facial expressions (FEs) involves the simultaneous perception of the facial configuration or the detection of emotion-specific diagnostic cues. We recorded at high speed (500 frames/s) the unfolding of the FE in five actors, each expressing six emotions (anger, surprise, happiness, disgust, fear, sadness). Recordings were coded every 10 frames (20 ms of real time) with the Facial Action Coding System (FACS, Ekman et al 2002, Salt Lake City, UT: Research Nexus eBook) to identify the facial actions contributing to each expression, and their intensity changes over time. Recordings were shown in slow motion (1/20 of recording speed) to one hundred observers in a forced-choice identification task. Participants were asked to identify the emotion during the presentation as soon as they felt confident to do so. Responses were recorded along with the associated response times (RTs). The RT probability density functions for both correct and incorrect responses were correlated with the facial activity during the presentation. There were systematic correlations between facial activities, response probabilities, and RT peaks, and significant differences in RT distributions for correct and incorrect answers. The results show that a reliable response is possible long before the full FE configuration is reached. This suggests that identification is reached by integrating in time individual diagnostic facial actions, and does not require perceiving the full apex configuration.

  2. Facial expressions triggered by imagining the future.

    PubMed

    El Haj, Mohamad; Antoine, Pascal; Nandrino, Jean Louis

    2017-01-01

    A body of research has investigated the cognitive and affective features of imagining the future. Our study aimed to extend this research by evaluating the expressions triggered by future thinking. Participants were asked to remember and to imagine personal events. Both past and future thinking were video-recorded, and the recordings were later analyzed by facial analysis software that detects and classifies basic emotional expressions (i.e., happy, sad, angry, surprised, scared, disgusted, and neutral). The analysis showed more emotional and fewer neutral facial expressions when imagining the future than when remembering the past. These findings mirror a wealth of psychological research highlighting the emotional valence of future thinking, which has mainly been assessed by subjective methodologies. Our work provides an empirical description of the facial expressions that can be triggered by imagining the future.

  3. Facial expression recognition in perceptual color space.

    PubMed

    Lajevardi, Seyed Mehdi; Wu, Hong Ren

    2012-08-01

    This paper introduces a tensor perceptual color framework (TPCF) for facial expression recognition (FER), which is based on information contained in color facial images. The TPCF enables multi-linear image analysis in different color spaces and demonstrates that color components provide additional information for robust FER. Using this framework, the components (in either RGB, YCbCr, CIELab or CIELuv space) of color images are unfolded to two-dimensional (2-D) tensors based on multi-linear algebra and tensor concepts, from which the features are extracted by Log-Gabor filters. The mutual information quotient (MIQ) method is employed for feature selection. These features are classified using a multi-class linear discriminant analysis (LDA) classifier. The effectiveness of color information on FER using low-resolution facial expression images with illumination variations is assessed for performance evaluation. Experimental results demonstrate that color information has significant potential to improve emotion recognition performance due to the complementary characteristics of image textures. Furthermore, the perceptual color spaces (CIELab and CIELuv) are better overall for FER than the other color spaces, providing more efficient and robust performance on facial images with illumination variation.
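
    A compressed sketch of the color-space side of such a pipeline, assuming scikit-image and scikit-learn; per-channel histograms in CIELab stand in for the paper's Log-Gabor features to keep the example short:

        import numpy as np
        from skimage.color import rgb2lab
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def lab_histogram_features(rgb_face):
            """Describe a color face image by per-channel histograms in CIELab.
            (The paper extracts Log-Gabor responses from the unfolded tensors;
            histograms are a stand-in to keep the sketch short.)"""
            lab = rgb2lab(rgb_face)
            return np.concatenate([
                np.histogram(lab[..., c], bins=16, density=True)[0] for c in range(3)
            ])

        rng = np.random.default_rng(0)
        faces = rng.random((60, 32, 32, 3))             # stand-in RGB face images
        labels = rng.integers(0, 6, size=60)
        X = np.array([lab_histogram_features(f) for f in faces])
        print(LinearDiscriminantAnalysis().fit(X, labels).score(X, labels))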

  4. Mothers' pupillary responses to infant facial expressions.

    PubMed

    Yrttiaho, Santeri; Niehaus, Dana; Thomas, Eileen; Leppänen, Jukka M

    2017-02-06

    Human parental care relies heavily on the ability to monitor and respond to a child's affective states. The current study examined pupil diameter as a potential physiological index of mothers' affective response to infant facial expressions. Pupillary time-series were measured from 86 mothers of young infants in response to an array of photographic infant faces falling into four emotive categories based on valence (positive vs. negative) and arousal (mild vs. strong). Pupil dilation was highly sensitive to the valence of facial expressions, being larger for negative vs. positive facial expressions. A separate control experiment with luminance-matched non-face stimuli indicated that the valence effect was specific to facial expressions and cannot be explained by luminance confounds. Pupil response was not sensitive to the arousal level of facial expressions. The results show the feasibility of using pupil diameter as a marker of mothers' affective responses to ecologically valid infant stimuli and point to a particularly prompt maternal response to infant distress cues.
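
    The core quantity in such designs is the stimulus-locked change in pupil diameter relative to a pre-stimulus baseline; a minimal sketch of that correction (the baseline window is a common choice, not taken from the paper):

        import numpy as np

        def baseline_corrected_pupil(diameter, t, baseline=(-0.5, 0.0)):
            """Subtract the mean pre-stimulus pupil diameter so that the time
            series expresses dilation relative to stimulus onset (t = 0)."""
            base = diameter[(t >= baseline[0]) & (t < baseline[1])].mean()
            return diameter[t >= 0.0] - base

        t = np.linspace(-0.5, 2.0, 251)                 # seconds around onset
        diameter = 4.0 + 0.2 * np.maximum(t, 0.0)       # toy dilation ramp (mm)
        response = baseline_corrected_pupil(diameter, t)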

  5. Visual perception of facial expressions of emotion.

    PubMed

    Martinez, Aleix M

    2017-10-01

    Facial expressions of emotion are produced by contracting and relaxing the muscles of the face. I hypothesize that the human visual system solves the inverse problem of production, that is, to interpret emotion, the visual system attempts to identify the underlying muscle activations. I show converging computational, behavioral and imaging evidence in favor of this hypothesis. I detail the computations performed by the human visual system to achieve the decoding of these facial actions and identify a brain region where these computations likely take place. The resulting computational model explains how humans readily classify emotions into categories as well as continuous variables. This model also predicts the existence of a large number of previously unknown facial expressions, including compound emotions, affect attributes and mental states that are regularly used by people. I provide evidence in favor of this prediction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. A study on facial expressions recognition

    NASA Astrophysics Data System (ADS)

    Xu, Jingjing

    2017-09-01

    In communication, postures and facial expressions of feelings such as happiness, anger, and sadness play important roles in conveying information. With the development of the technology, a number of algorithms dealing with face alignment, face landmark detection, classification, facial landmark localization, and pose estimation have recently been put forward. However, many challenges and problems remain to be addressed. In this paper, several techniques are summarized and analyzed, all of which relate to facial expression recognition and pose handling: the pose-indexed multi-view method for face alignment, robust facial landmark detection under significant head pose and occlusion, partitioning of the input domain for classification, and robust-statistics face formalization.

  7. Facial age affects emotional expression decoding.

    PubMed

    Fölster, Mara; Hess, Ursula; Werheid, Katja

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode. In this paper, we review theoretical frameworks and empirical findings on age effects on decoding emotional expressions, with an emphasis on age-of-face effects. We conclude that the age of the face plays an important role for facial expression decoding. Lower expressivity, age-related changes in the face, less elaborated emotion schemas for older faces, negative attitudes toward older adults, and different visual scan patterns and neural processing of older than younger faces may lower decoding accuracy for older faces. Furthermore, age-related stereotypes and age-related changes in the face may bias the attribution of specific emotions such as sadness to older faces.

  8. Facial age affects emotional expression decoding

    PubMed Central

    Fölster, Mara; Hess, Ursula; Werheid, Katja

    2014-01-01

    Facial expressions convey important information on emotional states of our interaction partners. However, in interactions between younger and older adults, there is evidence for a reduced ability to accurately decode emotional facial expressions. Previous studies have often followed up this phenomenon by examining the effect of the observers' age. However, decoding emotional faces is also likely to be influenced by stimulus features, and age-related changes in the face such as wrinkles and folds may render facial expressions of older adults harder to decode. In this paper, we review theoretical frameworks and empirical findings on age effects on decoding emotional expressions, with an emphasis on age-of-face effects. We conclude that the age of the face plays an important role for facial expression decoding. Lower expressivity, age-related changes in the face, less elaborated emotion schemas for older faces, negative attitudes toward older adults, and different visual scan patterns and neural processing of older than younger faces may lower decoding accuracy for older faces. Furthermore, age-related stereotypes and age-related changes in the face may bias the attribution of specific emotions such as sadness to older faces. PMID:24550859

  9. Stereoscopy Amplifies Emotions Elicited by Facial Expressions

    PubMed Central

    Kätsyri, Jari; Häkkinen, Jukka

    2015-01-01

    Mediated facial expressions do not elicit emotions as strongly as real-life facial expressions, possibly due to the low fidelity of pictorial presentations in typical mediation technologies. In the present study, we investigated the extent to which stereoscopy amplifies emotions elicited by images of neutral, angry, and happy facial expressions. The emotional self-reports of positive and negative valence (which were evaluated separately) and arousal of 40 participants were recorded. The magnitude of perceived depth in the stereoscopic images was manipulated by varying the camera base at 15, 40, 65, 90, and 115 mm. The analyses controlled for participants’ gender, gender match, emotional empathy, and trait alexithymia. The results indicated that stereoscopy significantly amplified the negative valence and arousal elicited by angry expressions at the most natural (65 mm) camera base, whereas stereoscopy amplified the positive valence elicited by happy expressions in both the narrowed and most natural (15–65 mm) base conditions. Overall, the results indicate that stereoscopy amplifies the emotions elicited by mediated emotional facial expressions when the depth geometry is close to natural. The findings highlight the sensitivity of the visual system to depth and its effect on emotions. PMID:27551358

  10. Stereoscopy Amplifies Emotions Elicited by Facial Expressions.

    PubMed

    Hakala, Jussi; Kätsyri, Jari; Häkkinen, Jukka

    2015-12-01

    Mediated facial expressions do not elicit emotions as strongly as real-life facial expressions, possibly due to the low fidelity of pictorial presentations in typical mediation technologies. In the present study, we investigated the extent to which stereoscopy amplifies emotions elicited by images of neutral, angry, and happy facial expressions. The emotional self-reports of positive and negative valence (which were evaluated separately) and arousal of 40 participants were recorded. The magnitude of perceived depth in the stereoscopic images was manipulated by varying the camera base at 15, 40, 65, 90, and 115 mm. The analyses controlled for participants' gender, gender match, emotional empathy, and trait alexithymia. The results indicated that stereoscopy significantly amplified the negative valence and arousal elicited by angry expressions at the most natural (65 mm) camera base, whereas stereoscopy amplified the positive valence elicited by happy expressions in both the narrowed and most natural (15-65 mm) base conditions. Overall, the results indicate that stereoscopy amplifies the emotions elicited by mediated emotional facial expressions when the depth geometry is close to natural. The findings highlight the sensitivity of the visual system to depth and its effect on emotions.
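
    In parallel-camera stereo geometry, on-screen horizontal parallax (and hence perceived depth magnitude) grows linearly with the camera base, which is why varying the base from 15 to 115 mm manipulates depth. The toy calculation below sketches this relation; the viewing-geometry numbers (focal length, distances, screen magnification) are illustrative assumptions, not values from the study.

        # Toy parallel-camera stereo geometry: screen parallax scales
        # linearly with the camera base. All scene/display numbers are
        # illustrative assumptions, not the study's parameters.
        def screen_parallax_mm(base_mm, obj_dist_mm, conv_dist_mm,
                               focal_mm, screen_mag):
            # Parallax of a point at obj_dist for a rig converged (via
            # sensor shift) on conv_dist; screen_mag maps sensor mm to
            # screen mm. Positive values = uncrossed (behind-screen) depth.
            return screen_mag * focal_mm * base_mm * (
                1.0 / conv_dist_mm - 1.0 / obj_dist_mm)

        for base in (15, 40, 65, 90, 115):  # camera bases from the study (mm)
            p = screen_parallax_mm(base, obj_dist_mm=1200, conv_dist_mm=1000,
                                   focal_mm=35, screen_mag=20)
            print(f"base {base:3d} mm -> parallax {p:5.2f} mm")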

  11. Sex Differences in the Rapid Detection of Emotional Facial Expressions

    PubMed Central

    Sawada, Reiko; Sato, Wataru; Kochiyama, Takanori; Uono, Shota; Kubota, Yasutaka; Yoshimura, Sayaka; Toichi, Motomi

    2014-01-01

    Background Previous studies have shown that females and males differ in the processing of emotional facial expressions including the recognition of emotion, and that emotional facial expressions are detected more rapidly than are neutral expressions. However, whether the sexes differ in the rapid detection of emotional facial expressions remains unclear. Methodology/Principal Findings We measured reaction times (RTs) during a visual search task in which 44 females and 46 males detected normal facial expressions of anger and happiness or their anti-expressions within crowds of neutral expressions. Anti-expressions expressed neutral emotions with visual changes quantitatively comparable to normal expressions. We also obtained subjective emotional ratings in response to the facial expression stimuli. RT results showed that both females and males detected normal expressions more rapidly than anti-expressions and normal-angry expressions more rapidly than normal-happy expressions. However, females and males showed different patterns in their subjective ratings in response to the facial expressions. Furthermore, sex differences were found in the relationships between subjective ratings and RTs. High arousal was more strongly associated with rapid detection of facial expressions in females, whereas negatively valenced feelings were more clearly associated with the rapid detection of facial expressions in males. Conclusion Our data suggest that females and males differ in their subjective emotional reactions to facial expressions and in the emotional processes that modulate the detection of facial expressions. PMID:24728084

  12. Judgments of subtle facial expressions of emotion.

    PubMed

    Matsumoto, David; Hwang, Hyisung C

    2014-04-01

    Most studies on judgments of facial expressions of emotion have primarily utilized prototypical, high-intensity expressions. This paper examines judgments of subtle facial expressions of emotion, including not only low-intensity versions of full-face prototypes but also variants of those prototypes. A dynamic paradigm was used in which observers were shown a neutral expression followed by the target expression to judge, and then the neutral expression again, allowing for a simulation of the emergence of the expression from and then return to a baseline. We also examined how signal and intensity clarities of the expressions (explained more fully in the Introduction) were associated with judgment agreement levels. Low-intensity, full-face prototypical expressions of emotion were judged as the intended emotion at rates significantly greater than chance. A number of the proposed variants were also judged as the intended emotions. Both signal and intensity clarities were individually associated with agreement rates; when their interrelationships were taken into account, signal clarity independently predicted agreement rates but intensity clarity did not. The presence or absence of specific muscles appeared to be more important to agreement rates than their intensity levels, with the exception of the intensity of zygomatic major, which was positively correlated with agreement rates for judgments of joy.

  13. LBP and SIFT based facial expression recognition

    NASA Astrophysics Data System (ADS)

    Sumer, Omer; Gunes, Ece O.

    2015-02-01

    This study compares the performance of local binary patterns (LBP) and the scale-invariant feature transform (SIFT) with support vector machines (SVM) in the automatic classification of discrete facial expressions. Facial expression recognition is a multiclass classification problem in which seven classes (happiness, anger, sadness, disgust, surprise, fear, and contempt) are classified. Using SIFT feature vectors and a linear SVM, 93.1% mean accuracy is achieved on the CK+ database. The performance of the LBP-based classifier with a linear SVM is reported on SFEW using the strictly person independent (SPI) protocol; seven-class mean accuracy on SFEW is 59.76%. Experiments on both databases showed that LBP features can be used in a fairly descriptive way if a good localization of facial points and a sound partitioning strategy are followed.
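
    As a rough sketch of the LBP-plus-linear-SVM pipeline this abstract compares, uniform LBP histograms can be pooled over a spatial grid of the face and fed to a linear SVM. The grid size, LBP parameters, and data arrays below are illustrative placeholders, not the paper's exact setup.

        # Sketch of an LBP-histogram + linear-SVM expression classifier
        # (illustrative parameters, not the paper's exact configuration).
        import numpy as np
        from skimage.feature import local_binary_pattern
        from sklearn.svm import LinearSVC

        def lbp_grid_features(img, grid=(7, 7), P=8, R=1):
            """Uniform LBP histograms pooled over a spatial grid."""
            lbp = local_binary_pattern(img, P, R, method="uniform")
            n_bins = P + 2        # P+1 uniform codes + one non-uniform bin
            h, w = img.shape
            feats = []
            for i in range(grid[0]):
                for j in range(grid[1]):
                    cell = lbp[i * h // grid[0]:(i + 1) * h // grid[0],
                               j * w // grid[1]:(j + 1) * w // grid[1]]
                    hist, _ = np.histogram(cell, bins=n_bins,
                                           range=(0, n_bins), density=True)
                    feats.append(hist)
            return np.concatenate(feats)

        # face_imgs: aligned grayscale face crops; labels: 7 expression classes
        # X = np.stack([lbp_grid_features(f) for f in face_imgs])
        # clf = LinearSVC().fit(X, labels)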

  14. Facial Expressivity at 4 Months: A Context by Expression Analysis

    PubMed Central

    Bennett, David S.; Bendersky, Margaret; Lewis, Michael

    2006-01-01

    The specificity predicted by differential emotions theory (DET) for early facial expressions in response to 5 different eliciting situations was studied in a sample of 4-month-old infants (n = 150). Infants were videotaped during tickle, sour taste, jack-in-the-box, arm restraint, and masked-stranger situations and their expressions were coded second by second. Infants showed a variety of facial expressions in each situation; however, more infants exhibited positive (joy and surprise) than negative expressions (anger, disgust, fear, and sadness) across all situations except sour taste. Consistent with DET-predicted specificity, joy expressions were the most common in response to tickling, and were less common in response to other situations. Surprise expressions were the most common in response to the jack-in-the-box, as predicted, but also were the most common in response to the arm restraint and masked-stranger situations, indicating a lack of specificity. No evidence of predicted specificity was found for anger, disgust, fear, and sadness expressions. Evidence of individual differences in expressivity within situations, as well as stability in the pattern across situations, underscores the need to examine both child and contextual factors in studying emotional development. The results provide little support for the DET postulate of situational specificity and suggest that a synthesis of differential emotions and dynamic systems theories of emotional expression should be considered. PMID:16878184

  15. [Facial expressions and human emotions. Bibliographic survey].

    PubMed

    da Silva, J A; da Silva, M J

    1995-01-01

    The present work aims to disseminate selected nursing and psychology publications on the human face and emotions. It is intended to help health professionals reflect on facial expressions of emotion in interpersonal relationships by presenting the most frequently cited studies in the specialized literature, first with regard to children and then to adults.

  16. Time perception and dynamics of facial expressions of emotions.

    PubMed

    Fayolle, Sophie L; Droit-Volet, Sylvie

    2014-01-01

    Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts.

  17. Time Perception and Dynamics of Facial Expressions of Emotions

    PubMed Central

    Fayolle, Sophie L.; Droit-Volet, Sylvie

    2014-01-01

    Two experiments were run to examine the effects of dynamic displays of facial expressions of emotions on time judgments. The participants were given a temporal bisection task with emotional facial expressions presented in a dynamic or a static display. Two emotional facial expressions and a neutral expression were tested and compared. Each of the emotional expressions had the same affective valence (unpleasant), but one was high-arousing (expressing anger) and the other low-arousing (expressing sadness). Our results showed that time judgments are highly sensitive to movements in facial expressions and the emotions expressed. Indeed, longer perceived durations were found in response to the dynamic faces and the high-arousing emotional expressions compared to the static faces and low-arousing expressions. In addition, the facial movements amplified the effect of emotions on time perception. Dynamic facial expressions are thus interesting tools for examining variations in temporal judgments in different social contexts. PMID:24835285
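
    In a temporal bisection task of the kind used here, the standard analysis fits a psychometric function to the proportion of "long" responses and reads off the bisection point, i.e., the duration judged long on 50% of trials; a leftward shift of that point (e.g., for dynamic, high-arousal faces) corresponds to longer perceived durations. The sketch below is generic and uses invented data, not the study's.

        # Generic temporal-bisection analysis: fit a logistic psychometric
        # function and locate the bisection point (data are invented).
        import numpy as np
        from scipy.optimize import curve_fit

        durations = np.array([400, 566, 732, 898, 1064, 1230, 1396, 1600])  # ms
        p_long    = np.array([0.05, 0.10, 0.25, 0.45, 0.70, 0.85, 0.95, 0.98])

        def logistic(t, bisection_point, slope):
            return 1.0 / (1.0 + np.exp(-(t - bisection_point) / slope))

        (bp, slope), _ = curve_fit(logistic, durations, p_long, p0=(900, 100))
        print(f"bisection point ~ {bp:.0f} ms")  # smaller bp => time feels longer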

  18. Holistic Gaze Strategy to Categorize Facial Expression of Varying Intensities

    PubMed Central

    Guo, Kun

    2012-01-01

    Using faces representing exaggerated emotional expressions, recent behaviour and eye-tracking studies have suggested a dominant role of individual facial features in transmitting diagnostic cues for decoding facial expressions. Considering that in everyday life we frequently view low-intensity expressive faces in which local facial cues are more ambiguous, we probably need to combine expressive cues from more than one facial feature to reliably decode naturalistic facial affects. In this study we applied a morphing technique to systematically vary intensities of six basic facial expressions of emotion, and employed a self-paced expression categorization task to measure participants' categorization performance and associated gaze patterns. The analysis of pooled data from all expressions showed that increasing expression intensity would improve categorization accuracy, shorten reaction time and reduce number of fixations directed at faces. The proportion of fixations and viewing time directed at internal facial features (eyes, nose and mouth region), however, was not affected by varying levels of intensity. Further comparison between individual facial expressions revealed that although proportional gaze allocation at individual facial features was quantitatively modulated by the viewed expressions, the overall gaze distribution in face viewing was qualitatively similar across different facial expressions and different intensities. It seems that we adopt a holistic viewing strategy to extract expressive cues from all internal facial features in processing of naturalistic facial expressions. PMID:22880043

  19. Violent Media Consumption and the Recognition of Dynamic Facial Expressions

    ERIC Educational Resources Information Center

    Kirsh, Steven J.; Mounts, Jeffrey R. W.; Olczak, Paul V.

    2006-01-01

    This study assessed the speed of recognition of facial emotional expressions (happy and angry) as a function of violent media consumption. Color photos of calm facial expressions morphed to either an angry or a happy facial expression. Participants were asked to make a speeded identification of the emotion (happiness or anger) during the morph.…

  20. Judging the Accuracy of Facial Expressions: A Methodological Inquiry.

    ERIC Educational Resources Information Center

    Andersen, Peter A.; Buller, David B.

    The extent to which human facial expressions are universal and cross-culturally recognizable has been the subject of considerable debate. Two approaches have been used in studies of facial expressions: the judgment approach entails showing examples of facial expressions to various cultures or groups of people and determining whether they interpret…

  1. Categorical Perception of Affective and Linguistic Facial Expressions

    ERIC Educational Resources Information Center

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  4. Automatic recognition of emotions from facial expressions

    NASA Astrophysics Data System (ADS)

    Xue, Henry; Gertner, Izidor

    2014-06-01

    In the human-computer interaction (HCI) process it is desirable to have an artificially intelligent (AI) system that can identify and categorize human emotions from facial expressions. Such systems can be used in security, in the entertainment industries, and also to study visual perception, social interactions, and disorders (e.g. schizophrenia and autism). In this work we survey and compare the performance of different feature extraction algorithms and classification schemes. We introduce a faster feature extraction method that resizes and applies a set of filters to the data images without sacrificing accuracy. In addition, we have extended SVM to multiple dimensions while retaining its high accuracy rate. The algorithms were tested using the Japanese Female Facial Expression (JAFFE) Database and the Database of Faces (AT&T Faces).

  5. Perception of facial expression and facial identity in subjects with social developmental disorders.

    PubMed

    Hefter, Rebecca L; Manoach, Dara S; Barton, Jason J S

    2005-11-22

    It has been hypothesized that the social dysfunction in social developmental disorders (SDDs), such as autism, Asperger disorder, and the socioemotional processing disorder, impairs the acquisition of normal face-processing skills. The authors investigated whether this purported perceptual deficit was generalized to both facial expression and facial identity or whether these different types of facial perception were dissociated in SDDs. They studied 26 adults with a variety of SDD diagnoses, assessing their ability to discriminate famous from anonymous faces, their perception of emotional expression from facial and nonfacial cues, and the relationship between these abilities. They also compared the performance of two defined subgroups of subjects with SDDs on expression analysis: one with normal and one with impaired recognition of facial identity. While perception of facial expression was related to the perception of nonfacial expression, the perception of facial identity was not related to either facial or nonfacial expression. Likewise, subjects with SDDs with impaired facial identity processing perceived facial expression as well as those with normal facial identity processing. The processing of facial identity and that of facial expression are dissociable in social developmental disorders. Deficits in perceiving facial expression may be related to emotional processing more than face processing. Dissociations between the perception of facial identity and facial emotion are consistent with current cognitive models of face processing. The results argue against hypotheses that the social dysfunction in social developmental disorder causes a generalized failure to acquire face-processing skills.

  6. Stability of facial affective expressions in schizophrenia.

    PubMed

    Fatouros-Bergman, H; Spang, J; Merten, J; Preisler, G; Werbart, A

    2012-01-01

    Thirty-two video-recorded interviews were conducted by two interviewers with eight patients diagnosed with schizophrenia. Each patient was interviewed four times: three weekly interviews by the first interviewer and one additional interview by the second interviewer. Sixty-four selected sequences in which the patients were speaking about psychotic experiences were scored for facial affective behaviour with the Emotion Facial Action Coding System (EMFACS). In accordance with previous research, the results show that patients diagnosed with schizophrenia express negative facial affectivity. Facial affective behaviour seems not to be dependent on temporality, since within-subjects ANOVA revealed no substantial changes in the amount of affect displayed across the weekly interview occasions. Whereas previous studies found contempt to be the most frequent affect in patients, in the present material disgust was as common, but depended on the interviewer. The results suggest that facial affectivity in these patients is dominated primarily by the negative emotions of disgust and, to a lesser extent, contempt, and that this appears to be a fairly stable feature.

  7. Categorical perception of affective and linguistic facial expressions.

    PubMed

    McCullough, Stephen; Emmorey, Karen

    2009-02-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. The continua were created by morphing end-point photo exemplars into 11 images, changing linearly from one expression to another in equal steps. For both affective and linguistic expressions, hearing non-signers exhibited better discrimination across category boundaries than within categories for both experiments, thus replicating previous results with affective expressions and demonstrating CP effects for non-canonical facial expressions. Deaf signers, however, showed significant CP effects only for linguistic facial expressions. Subsequent analyses indicated that order of presentation influenced signers' response time performance for affective facial expressions: viewing linguistic facial expressions first slowed response time for affective facial expressions. We conclude that CP effects for affective facial expressions can be influenced by language experience.
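
    The 11-step continua described here interpolate between two end-point expressions in equal steps. In its simplest pixel-blend form (a true face morph also warps landmark geometry, which this sketch omits), such a continuum is just a set of weighted averages of two pre-aligned images.

        # Simplest stand-in for an 11-step expression continuum: linear
        # pixel blending between two pre-aligned end-point face images.
        # (Real morphs also warp landmark geometry; omitted here.)
        import numpy as np

        def morph_continuum(img_a, img_b, n_steps=11):
            weights = np.linspace(0.0, 1.0, n_steps)
            return [(1.0 - w) * img_a + w * img_b for w in weights]

        # continuum = morph_continuum(happy.astype(float), angry.astype(float))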

  8. Adults' responsiveness to children's facial expressions.

    PubMed

    Aradhye, Chinmay; Vonk, Jennifer; Arida, Danielle

    2015-07-01

    We investigated the effect of young children's (hereafter children's) facial expressions on adult responsiveness. In Study 1, 131 undergraduate students from a midsized university in the midwestern United States viewed children's images and videos showing smiling, crying, or neutral expressions. Looking times at the images and videos were measured, along with ratings of cuteness, likelihood to adopt, and experienced distress on 10-point Likert scales. Videos of smiling children were rated as cuter and more likely to be adopted and were viewed for longer times than videos of crying children, which evoked more distress. In Study 2, we recorded responses from 101 of the same participants in an online survey measuring gender role identity, empathy, and perspective taking. Higher levels of femininity (as measured by Bem's Sex Role Inventory) predicted higher "likely to adopt" ratings for crying images. These findings indicate that adult perception of children and motivation to nurture are affected by both children's facial expressions and adult characteristics, and they build on the existing literature to demonstrate that children may use expressions to manipulate the motivations of even non-kin adults, directing attention toward and perhaps eliciting nurturing of young children.

  9. The Communicative Function of Sad Facial Expressions.

    PubMed

    Reed, Lawrence Ian; DeScioli, Peter

    2017-01-01

    What are the communicative functions of sad facial expressions? Research shows that people feel sadness in response to losses, but it is unclear whether sad expressions function to communicate losses to others and, if so, what makes these signals credible. Here we use economic games to test the hypothesis that sad expressions lend credibility to claims of loss. Participants play the role of either proposer or recipient in a game with a fictional backstory and real monetary payoffs. The proposer views a (fictional) video of the recipient's character displaying either a neutral or a sad expression paired with a claim of loss, and then decides how much money to give to the recipient. In three experiments, we test alternative theories using situations in which the recipient's losses were uncertain (Experiment 1), the recipient's losses were certain (Experiment 2), or the recipient claimed failed gains rather than losses (Experiment 3). Overall, we find that participants gave more money to recipients who displayed sad expressions than to those with neutral expressions, but only under conditions of uncertain loss. This finding supports the hypothesis that sad expressions function to increase the credibility of claims of loss.

  10. Levels of Structural Integration and Facial Expressions of Negative Emotions.

    PubMed

    Bock, Astrid; Huber, Eva; Benecke, Cord

    2016-09-01

    For a clinically relevant understanding of the facial displays of patients with mental disorders, it is crucial to go beyond merely counting frequencies of facial expressions and to include the contextual information of the expression. We assume that patients with different levels of structural integration differ in the contextual embedding of their negative facial expressions of emotions. The facial affective behaviour of 80 female participants during an OPD interview was analysed using FACS (Facial Action Coding System) and the RFE coding system (References-of-Facial-Expression coding system; Bock et al. 2015). Using the RFE coding system, 2192 negative facial expressions of emotions were attributed to different references (e.g., interactive, self-related, object-related) by relying on contextual variables. The pure frequency of negative facial affect was not related to the level of structural integration. Negative facial expressions of emotions directed towards the interviewer (interactive reference), as well as negative facial expressions directed towards the displayer's whole self, were associated with lower levels of structural integration. In contrast, negative facial affects directed at single aspects of the self, at single aspects of objects, or at external situations were associated with higher levels of structural integration. The differentiation of references of facial affective behaviour allows a deeper understanding of the connection between facial displays and structural levels of psychic integration.

  11. Dynamic facial expressions are processed holistically, but not more holistically than static facial expressions.

    PubMed

    Tobin, Alanna; Favelle, Simone; Palermo, Romina

    2016-09-01

    There is evidence that facial expressions are perceived holistically and featurally. The composite task is a direct measure of holistic processing (although the absence of a composite effect implies the use of other types of processing). Most composite task studies have used static images, despite the fact that movement is an important aspect of facial expressions and there is some evidence that movement may facilitate recognition. We created static and dynamic composites, in which emotions were reliably identified from each half of the face. The magnitude of the composite effect was similar for static and dynamic expressions identified from the top half (anger, sadness and surprise) but was reduced in dynamic as compared to static expressions identified from the bottom half (fear, disgust and joy). Thus, any advantage in recognising dynamic over static expressions is not likely to stem from enhanced holistic processing, rather motion may emphasise or disambiguate diagnostic featural information.

  12. Shadows alter facial expressions of Noh masks.

    PubMed

    Kawai, Nobuyuki; Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo

    2013-01-01

    A Noh mask, worn by expert actors during performance on the Japanese traditional Noh drama, conveys various emotional expressions despite its fixed physical properties. How does the mask change its expressions? Shadows change subtly during the actual Noh drama, which plays a key role in creating elusive artistic enchantment. We here describe evidence from two experiments regarding how attached shadows of the Noh masks influence the observers' recognition of the emotional expressions. In Experiment 1, neutral-faced Noh masks having the attached shadows of the happy/sad masks were recognized as bearing happy/sad expressions, respectively. This was true for all four types of masks each of which represented a character differing in sex and age, even though the original characteristics of the masks also greatly influenced the evaluation of emotions. Experiment 2 further revealed that frontal Noh mask images having shadows of upward/downward tilted masks were evaluated as sad/happy, respectively. This was consistent with outcomes from preceding studies using actually tilted Noh mask images. Results from the two experiments concur that purely manipulating attached shadows of the different types of Noh masks significantly alters the emotion recognition. These findings go in line with the mysterious facial expressions observed in Western paintings, such as the elusive qualities of Mona Lisa's smile. They also agree with the aesthetic principle of Japanese traditional art "yugen (profound grace and subtlety)", which highly appreciates subtle emotional expressions in the darkness.

  13. A comparison of facial expression properties in five hylobatid species.

    PubMed

    Scheider, Linda; Liebal, Katja; Oña, Leonardo; Burrows, Anne; Waller, Bridget

    2014-07-01

    Little is known about the facial communication of lesser apes (family Hylobatidae) and how their facial expressions (and their use) relate to social organization. We investigated facial expressions (defined as combinations of facial movements) in social interactions of mated pairs in five different hylobatid species belonging to three different genera, using a recently developed objective coding system, the Facial Action Coding System for hylobatid species (GibbonFACS). We described three important properties of their facial expressions and compared them between genera. First, we compared the rate of facial expressions, defined as the number of facial expressions per unit of time. Second, we compared repertoire size, defined as the number of different types of facial expressions used, independent of their frequency. Third, we compared the diversity of expression, defined as the repertoire weighted by the rate of use of each type of facial expression. We observed a higher rate and diversity of facial expression, but no larger repertoire, in Symphalangus (siamangs) compared to Hylobates and Nomascus species. In line with previous research, these results suggest that siamangs differ from other hylobatids in certain aspects of their social behavior. To investigate whether differences in facial expressions are linked to hylobatid socio-ecology, we used a Phylogenetic Generalized Least Squares (PGLS) regression analysis to correlate those properties with two social factors: group size and level of monogamy. No relationship between the properties of facial expressions and these socio-ecological factors was found. One explanation could be that facial expressions in hylobatid species are subject to phylogenetic inertia and do not differ sufficiently between species to reveal correlations with factors such as group size and monogamy level. © 2014 Wiley Periodicals, Inc.
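
    The PGLS regression mentioned here amounts to generalized least squares whose error covariance is derived from the phylogeny: under a Brownian-motion model, the covariance between two species is their shared branch length. The schematic below uses an invented four-species tree and invented trait values, not the study's data.

        # Schematic PGLS: generalized least squares with a phylogenetic
        # covariance matrix (invented 4-species tree and trait values).
        import numpy as np
        import statsmodels.api as sm

        C = np.array([[1.0, 0.7, 0.3, 0.3],    # Brownian-motion covariance:
                      [0.7, 1.0, 0.3, 0.3],    # C[i, j] = shared branch
                      [0.3, 0.3, 1.0, 0.5],    # length of species i and j
                      [0.3, 0.3, 0.5, 1.0]])
        group_size = np.array([2.0, 3.0, 4.0, 5.0])   # predictor (toy values)
        expr_rate  = np.array([1.1, 1.4, 2.2, 2.0])   # response  (toy values)

        fit = sm.GLS(expr_rate, sm.add_constant(group_size), sigma=C).fit()
        print(fit.params)   # intercept and slope, corrected for phylogeny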

  14. Sex differences in perception of invisible facial expressions

    PubMed Central

    Hong, Sang Wook; Yoon, K. Lira; Peaco, Sophia

    2015-01-01

    Previous research indicates that women are better at recognizing facial expressions than men. In the current study, we examined whether this female advantage in the processing of facial expressions also occurs at the unconscious level. In two studies, participants performed a simple detection task and a 4-AFC task while faces were rendered invisible by continuous flash suppression. When faces with full intensity expressions were suppressed, there was no significant sex difference in the time of breakup of suppression (Study 1). However, when suppressed faces depicted low intensity expressions, suppression broke up earlier in men than women, indicating that men may be more sensitive to facial features related to mild facial expressions (Study 2). The current findings suggest that the female advantage in processing of facial expressions is absent in unconscious processing of emotional information. The female advantage in facial expression processing may require conscious perception of faces. PMID:25883583

  15. Sex differences in perception of invisible facial expressions.

    PubMed

    Hong, Sang Wook; Yoon, K Lira; Peaco, Sophia

    2015-01-01

    Previous research indicates that women are better at recognizing facial expressions than men. In the current study, we examined whether this female advantage in the processing of facial expressions also occurs at the unconscious level. In two studies, participants performed a simple detection task and a 4-AFC task while faces were rendered invisible by continuous flash suppression. When faces with full intensity expressions were suppressed, there was no significant sex difference in the time of breakup of suppression (Study 1). However, when suppressed faces depicted low intensity expressions, suppression broke up earlier in men than women, indicating that men may be more sensitive to facial features related to mild facial expressions (Study 2). The current findings suggest that the female advantage in processing of facial expressions is absent in unconscious processing of emotional information. The female advantage in facial expression processing may require conscious perception of faces.

  16. Objectifying Facial Expressivity Assessment of Parkinson's Patients: Preliminary Study

    PubMed Central

    Patsis, Georgios; Jiang, Dongmei; Sahli, Hichem; Kerckhofs, Eric; Vandekerckhove, Marie

    2014-01-01

    Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as “facial masking,” a symptom in which facial muscles become rigid. To improve the clinical assessment of facial expressivity in PD, this work attempts to quantify dynamic facial expressivity (facial activity) by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To elicit spontaneous facial expressions resembling those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were induced using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports. Disgust-induced emotions were rated significantly higher than the other emotions, so we focused on analysis of the data recorded while participants watched the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences between PD patients at different stages of disease progression were also observed. PMID:25478003

  17. Objectifying facial expressivity assessment of Parkinson's patients: preliminary study.

    PubMed

    Wu, Peng; Gonzalez, Isabel; Patsis, Georgios; Jiang, Dongmei; Sahli, Hichem; Kerckhofs, Eric; Vandekerckhove, Marie

    2014-01-01

    Patients with Parkinson's disease (PD) can exhibit a reduction of spontaneous facial expression, designated as "facial masking," a symptom in which facial muscles become rigid. To improve the clinical assessment of facial expressivity in PD, this work attempts to quantify dynamic facial expressivity (facial activity) by automatically recognizing facial action units (AUs) and estimating their intensity. Spontaneous facial expressivity was assessed by comparing 7 PD patients with 8 control participants. To elicit spontaneous facial expressions resembling those typically triggered by emotions, six emotions (amusement, sadness, anger, disgust, surprise, and fear) were induced using movie clips. During the movie clips, physiological signals (facial electromyography (EMG) and electrocardiogram (ECG)) and frontal face video of the participants were recorded. The participants were asked to report on their emotional states throughout the experiment. We first examined the effectiveness of the emotion manipulation by evaluating the participants' self-reports. Disgust-induced emotions were rated significantly higher than the other emotions, so we focused on analysis of the data recorded while participants watched the disgust movie clips. The proposed facial expressivity assessment approach captured differences in facial expressivity between PD patients and controls. Differences between PD patients at different stages of disease progression were also observed.

  18. The Impact of Orthognathic Surgery on Facial Expressions.

    PubMed

    Al-Hiyali, Ali; Ayoub, Ashraf; Ju, Xiangyang; Almuzian, Mohammad; Al-Anezi, Thamer

    2015-12-01

    The aim of this study was to evaluate the impact of orthognathic surgical correction of facial asymmetry and maxillary hypoplasia on the magnitude and pattern of facial expressions. This study was carried out on 2 cohorts of patients: in group 1, 10 patients had surgical correction of facial asymmetry; in group 2, 13 patients had Le Fort I osteotomy to correct maxillary hypoplasia. The patients were asked to perform 3 facial expressions (maximal smile, lip purse, and cheek puff) that were recorded using the Di4D image-capture system before and after surgery. The capture of each expression generated 180 3-dimensional (3D) facial images. Twenty-seven facial soft tissue landmarks were digitized on the first frame of the 3D image of each expression and a mathematical generic mesh was applied on the 3D model to clone each patient's face. The cloned mesh was superimposed automatically on each sequence of the 3D images to evaluate the pattern of facial expressions. The digitization of facial landmarks was satisfactorily accurate and reproducible. In group 1, the asymmetry of facial expressions was significantly decreased after surgical correction (P = .0458). In group 2, Le Fort I osteotomy decreased the magnitude of facial expressions (P = .0267). This study confirmed that orthognathic surgery affects the dynamics of facial expressions; this should be considered when planning the surgery and informing patients about the surgical correction of dentofacial deformities. Copyright © 2015 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  19. Children's Representations of Facial Expression and Identity: Identity-Contingent Expression Aftereffects

    ERIC Educational Resources Information Center

    Vida, Mark D.; Mondloch, Catherine J.

    2009-01-01

    This investigation used adaptation aftereffects to examine developmental changes in the perception of facial expressions. Previous studies have shown that adults' perceptions of ambiguous facial expressions are biased following adaptation to intense expressions. These expression aftereffects are strong when the adapting and probe expressions share…

  1. Intensity dependence in high-level facial expression adaptation aftereffect.

    PubMed

    Hong, Sang Wook; Yoon, K Lira

    2017-06-14

    Perception of a facial expression can be altered or biased by a prolonged viewing of other facial expressions, known as the facial expression adaptation aftereffect (FEAA). Recent studies using antiexpressions have demonstrated a monotonic relation between the magnitude of the FEAA and adaptor extremity, suggesting that facial expressions are opponent coded and represented continuously from one expression to its antiexpression. However, it is unclear whether the opponent-coding scheme can account for the FEAA between two facial expressions. In the current study, we demonstrated that the magnitude of the FEAA between two facial expressions increased monotonically as a function of the intensity of adapting facial expressions, consistent with the predictions based on the opponent-coding model. Further, the monotonic increase in the FEAA occurred even when the intensity of an adapting face was too weak for its expression to be recognized. These results together suggest that multiple facial expressions are encoded and represented by balanced activity of neural populations tuned to different facial expressions.
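
    The opponent-coding account can be illustrated with a toy two-pool model: one pool tuned to the adapted expression and one to its opponent, with adaptation reducing the adapted pool's gain in proportion to adaptor intensity, so the readout bias (the aftereffect) grows monotonically with intensity. All constants below are invented for illustration, not fitted to the study's data.

        # Toy opponent-coding model of the facial expression aftereffect:
        # adaptation lowers the gain of the adapted pool in proportion to
        # adaptor intensity; the percept reads out the normalized difference.
        # All constants are invented for illustration.
        import numpy as np

        def aftereffect(adaptor_intensity, k=0.4):
            gain_adapted  = 1.0 - k * adaptor_intensity
            gain_opponent = 1.0
            # A neutral test face drives both pools equally; the bias away
            # from the adapted expression is the normalized difference.
            return (gain_opponent - gain_adapted) / (gain_opponent + gain_adapted)

        for s in np.linspace(0.0, 1.0, 5):      # adaptor intensity, 0-100%
            print(f"intensity {s:.2f} -> aftereffect {aftereffect(s):+.3f}")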

  2. From facial expressions to bodily gestures

    PubMed Central

    2016-01-01

    This article aims to determine to what extent photographic practices in psychology, psychiatry and physiology contributed to the definition of the external bodily signs of passions and emotions in the second half of the 19th century in France. Bridging the gap between recent research in the history of emotions and photographic history, the following analyses focus on the photographic production of scientists and photographers who made significant contributions to the study of expressions and gestures, namely Duchenne de Boulogne, Charles Darwin, Paul Richer and Albert Londe. This article argues that photography became a key technology in their works due to the adequateness of the exposure time of different cameras to the duration of the bodily manifestations to be recorded, and that these uses constituted facial expressions and bodily gestures as particular objects for the scientific study. PMID:26900264

  3. Altering sensorimotor feedback disrupts visual discrimination of facial expressions.

    PubMed

    Wood, Adrienne; Lupyan, Gary; Sherrin, Steven; Niedenthal, Paula

    2016-08-01

    Looking at another person's facial expression of emotion can trigger the same neural processes involved in producing the expression, and such responses play a functional role in emotion recognition. Disrupting individuals' facial action, for example, interferes with verbal emotion recognition tasks. We tested the hypothesis that facial responses also play a functional role in the perceptual processing of emotional expressions. We altered the facial action of participants with a gel facemask while they performed a task that involved distinguishing target expressions from highly similar distractors. Relative to control participants, participants in the facemask condition demonstrated inferior perceptual discrimination of facial expressions, but not of nonface stimuli. The findings suggest that somatosensory/motor processes involving the face contribute to the visual perceptual, and not just conceptual, processing of facial expressions. More broadly, our study contributes to growing evidence for the fundamentally interactive nature of the perceptual inputs from different sensory modalities.

  4. Emotional Empathy and Facial Mimicry for Static and Dynamic Facial Expressions of Fear and Disgust.

    PubMed

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2016-01-01

    Facial mimicry is the tendency to imitate the emotional facial expressions of others. Increasing evidence suggests that the perception of dynamic displays leads to enhanced facial mimicry, especially for happiness and anger. However, little is known about the impact of dynamic stimuli on facial mimicry for fear and disgust. To investigate this issue, facial EMG responses were recorded from the corrugator supercilii, levator labii, and lateral frontalis muscles while participants viewed static (photos) and dynamic (videos) emotional facial expressions. Moreover, we tested whether emotional empathy modulated facial mimicry for emotional facial expressions. In accordance with our predictions, the highly empathic group responded with larger activity in the corrugator supercilii and levator labii muscles. Moreover, dynamic compared to static facial expressions of fear revealed enhanced mimicry in the highly empathic group in the frontalis and corrugator supercilii muscles. In the low-empathy group the facial reactions did not differentiate between fear and disgust for either dynamic or static facial expressions. We conclude that highly empathic subjects are more sensitive in their facial reactions to facial expressions of fear and disgust than their less empathic counterparts. Our data confirm that personal characteristics, i.e., empathy traits, as well as the modality of the presented stimuli, modulate the strength of facial mimicry. In addition, measures of the EMG activity of the levator labii and frontalis muscles may be a useful index of empathic responses to fear and disgust.

  5. Emotional Empathy and Facial Mimicry for Static and Dynamic Facial Expressions of Fear and Disgust

    PubMed Central

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2016-01-01

    Facial mimicry is the tendency to imitate the emotional facial expressions of others. Increasing evidence suggests that the perception of dynamic displays leads to enhanced facial mimicry, especially for happiness and anger. However, little is known about the impact of dynamic stimuli on facial mimicry for fear and disgust. To investigate this issue, facial EMG responses were recorded from the corrugator supercilii, levator labii, and lateral frontalis muscles while participants viewed static (photos) and dynamic (videos) emotional facial expressions. Moreover, we tested whether emotional empathy modulated facial mimicry for emotional facial expressions. In accordance with our predictions, the highly empathic group responded with larger activity in the corrugator supercilii and levator labii muscles. Moreover, dynamic compared to static facial expressions of fear revealed enhanced mimicry in the highly empathic group in the frontalis and corrugator supercilii muscles. In the low-empathy group the facial reactions did not differentiate between fear and disgust for either dynamic or static facial expressions. We conclude that highly empathic subjects are more sensitive in their facial reactions to facial expressions of fear and disgust than their less empathic counterparts. Our data confirm that personal characteristics, i.e., empathy traits, as well as the modality of the presented stimuli, modulate the strength of facial mimicry. In addition, measures of the EMG activity of the levator labii and frontalis muscles may be a useful index of empathic responses to fear and disgust. PMID:27933022

  6. Shadows Alter Facial Expressions of Noh Masks

    PubMed Central

    Kawai, Nobuyuki; Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo

    2013-01-01

    Background A Noh mask, worn by expert actors during performance on the Japanese traditional Noh drama, conveys various emotional expressions despite its fixed physical properties. How does the mask change its expressions? Shadows change subtly during the actual Noh drama, which plays a key role in creating elusive artistic enchantment. We here describe evidence from two experiments regarding how attached shadows of the Noh masks influence the observers’ recognition of the emotional expressions. Methodology/Principal Findings In Experiment 1, neutral-faced Noh masks having the attached shadows of the happy/sad masks were recognized as bearing happy/sad expressions, respectively. This was true for all four types of masks each of which represented a character differing in sex and age, even though the original characteristics of the masks also greatly influenced the evaluation of emotions. Experiment 2 further revealed that frontal Noh mask images having shadows of upward/downward tilted masks were evaluated as sad/happy, respectively. This was consistent with outcomes from preceding studies using actually tilted Noh mask images. Conclusions/Significance Results from the two experiments concur that purely manipulating attached shadows of the different types of Noh masks significantly alters the emotion recognition. These findings go in line with the mysterious facial expressions observed in Western paintings, such as the elusive qualities of Mona Lisa’s smile. They also agree with the aesthetic principle of Japanese traditional art “yugen (profound grace and subtlety)”, which highly appreciates subtle emotional expressions in the darkness. PMID:23940748

  7. Facial Expression Generation from Speaker's Emotional States in Daily Conversation

    NASA Astrophysics Data System (ADS)

    Mori, Hiroki; Ohshima, Koh

    A framework for generating facial expressions from emotional states in daily conversation is described. It provides a mapping between emotional states and facial expressions, where the former are represented by vectors with psychologically defined abstract dimensions and the latter are coded using the Facial Action Coding System. To obtain the mapping, parallel data with rated emotional states and facial expressions were collected for utterances of a female speaker, and a neural network was trained with the data. The effectiveness of the proposed method was verified by a subjective evaluation test. As a result, the Mean Opinion Score with respect to the suitability of the generated facial expressions was 3.86 for the speaker, which was close to that of hand-made facial expressions.
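
    The mapping this framework learns (an abstract emotion-space vector in, FACS action-unit intensities out) can be sketched as a small multilayer-perceptron regressor. The dimensionality, AU set, and training data below are placeholders rather than the paper's materials.

        # Sketch of an emotion-state -> action-unit mapping as a small MLP
        # (dimensions, AU set, and training data are placeholders).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        E  = rng.uniform(-1, 1, size=(200, 3))  # e.g., pleasantness/arousal/...
        AU = rng.uniform(0, 1, size=(200, 8))   # intensities of 8 chosen AUs

        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(E, AU)
        predicted_aus = net.predict([[0.8, 0.3, 0.0]])  # AUs for one state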

  8. The own-sex effect in facial expression recognition.

    PubMed

    Doi, Hirokazu; Amamoto, Takaaki; Okishige, Yuuka; Kato, Mikako; Shinohara, Kazuyuki

    2010-06-02

    Responses to smiling and nonsmiling expressions are influenced by the sex of both viewer and expresser. This study investigated the stage of neural processing at which the sexes of viewer and expresser modulate the recognition of smiling and nonsmiling expressions by measuring event-related potentials. The results showed that the late positive component was larger in response to neutral expressions of own-sex faces than to those of opposite-sex faces. These results indicate that the neural correlates of facial expression recognition are influenced by the sexes of both viewer and expresser at the stage of cognitive evaluation.

  9. Facial Expression Recognition in Nonvisual Imagery

    NASA Astrophysics Data System (ADS)

    Olague, Gustavo; Hammoud, Riad; Trujillo, Leonardo; Hernández, Benjamín; Romero, Eva

    This chapter presents two novel approaches that allow computer vision applications to perform human facial expression recognition (FER). From a problem standpoint, we focus on FER beyond the human visual spectrum, in long-wave infrared imagery, thus allowing us to offer illumination-independent solutions to this important human-computer interaction problem. From a methodological standpoint, we introduce two different feature extraction techniques: a principal component analysis-based approach with automatic feature selection and one based on texture information selected by an evolutionary algorithm. In the former, facial features are selected based on interest point clusters, and classification is carried out using eigenfeature information; in the latter, an evolutionary-based learning algorithm searches for optimal regions of interest and texture features based on classification accuracy. Both of these approaches use a support vector machine committee for classification. Results show effective performance for both techniques, from which we can conclude that thermal imagery contains worthwhile information for the FER problem beyond the human visual spectrum.
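
    In outline, the first approach (PCA-based eigenfeatures classified by a support vector machine committee) resembles the classic eigenfaces recipe followed by voting over several SVMs. The sketch below uses soft voting over SVCs with different kernels as a stand-in committee; the component count and kernels are illustrative choices, not the chapter's.

        # Outline of PCA eigenfeatures + an SVM committee for FER
        # (component count and committee members are illustrative).
        from sklearn.decomposition import PCA
        from sklearn.ensemble import VotingClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        committee = VotingClassifier(
            estimators=[("rbf",  SVC(kernel="rbf", probability=True)),
                        ("poly", SVC(kernel="poly", probability=True)),
                        ("lin",  SVC(kernel="linear", probability=True))],
            voting="soft")

        model = make_pipeline(PCA(n_components=50), committee)
        # model.fit(train_pixels, train_labels); model.predict(test_pixels)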

  10. Dielectric elastomer actuators for facial expression

    NASA Astrophysics Data System (ADS)

    Wang, Yuzhe; Zhu, Jian

    2016-04-01

    Dielectric elastomer actuators have the advantage of mimicking the salient feature of life: movements in response to stimuli. In this paper we explore application of dielectric elastomer actuators to artificial muscles. These artificial muscles can mimic natural masseter to control jaw movements, which are key components in facial expressions especially during talking and singing activities. This paper investigates optimal design of the dielectric elastomer actuator. It is found that the actuator with embedded plastic fibers can avert electromechanical instability and can greatly improve its actuation. Two actuators are then installed in a robotic skull to drive jaw movements, mimicking the masseters in a human jaw. Experiments show that the maximum vertical displacement of the robotic jaw, driven by artificial muscles, is comparable to that of the natural human jaw during speech activities. Theoretical simulations are conducted to analyze the performance of the actuator, which is quantitatively consistent with the experimental observations.
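
    The actuation mechanism of such dielectric elastomer actuators is commonly summarized by the Maxwell stress p = eps0 * eps_r * E^2, where E is the electric field across the membrane. Below is a quick order-of-magnitude check with typical textbook material values, not the paper's measurements.

        # Order-of-magnitude Maxwell stress for a dielectric elastomer
        # actuator, p = eps0 * eps_r * E^2 (typical textbook values,
        # not the paper's measurements).
        EPS0 = 8.854e-12      # vacuum permittivity, F/m
        eps_r = 4.7           # relative permittivity, e.g., acrylic elastomer
        voltage = 5e3         # V applied across the membrane
        thickness = 100e-6    # membrane thickness, m

        E = voltage / thickness       # electric field, V/m
        p = EPS0 * eps_r * E ** 2     # actuation (Maxwell) stress, Pa
        print(f"E = {E:.2e} V/m, p = {p / 1e3:.0f} kPa")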

  11. Reproducibility of the dynamics of facial expressions in unilateral facial palsy.

    PubMed

    Alagha, M A; Ju, X; Morley, S; Ayoub, A

    2017-09-04

    The aim of this study was to assess the reproducibility of non-verbal facial expressions in unilateral facial paralysis using dynamic four-dimensional (4D) imaging. The Di4D system was used to record five facial expressions of 20 adult patients. The system captured 60 three-dimensional (3D) images per second; each facial expression took 3-4 seconds and was recorded in real time. Thus a set of 180 3D facial images was generated for each expression. The procedure was repeated after 30 min to assess the reproducibility of the expressions. A mathematical facial mesh consisting of thousands of quasi-point 'vertices' was conformed to the face in order to determine the morphological characteristics in a comprehensive manner. The vertices were tracked throughout the sequence of the 180 images, and five key 3D facial frames from each sequence were analyzed. Comparisons were made between the first and second capture of each facial expression to assess the reproducibility of facial movements. Corresponding images were aligned using partial Procrustes analysis, and the root mean square distance between them was calculated and analyzed statistically (paired Student t-test, P<0.05). The facial expressions of lip purse, cheek puff, and raising of the eyebrows were reproducible, whereas maximum smile and forceful eye closure were not. The limited coordination of the various groups of facial muscles contributed to the lack of reproducibility of these expressions. 4D imaging is a useful clinical tool for the assessment of facial expressions. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
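
    The reproducibility measure described (Procrustes-align corresponding captures, then take the root mean square distance between them) can be sketched with SciPy. Note that scipy.spatial.procrustes also normalizes scale, which a strictly partial Procrustes (rotation and translation only) would omit.

        # Sketch of the reproducibility measure: Procrustes-align two
        # corresponding (n_points, 3) vertex sets, then take the RMS
        # distance. (SciPy's routine also normalizes scale, unlike a
        # strictly partial Procrustes.)
        import numpy as np
        from scipy.spatial import procrustes

        def rms_after_procrustes(verts_a, verts_b):
            m_a, m_b, _ = procrustes(verts_a, verts_b)
            return np.sqrt(np.mean(np.sum((m_a - m_b) ** 2, axis=1)))

        # capture_1, capture_2: the same expression frame from the two
        # recording sessions; print(rms_after_procrustes(capture_1, capture_2))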

  12. Automatic decoding of facial movements reveals deceptive pain expressions.

    PubMed

    Bartlett, Marian Stewart; Littlewort, Gwen C; Frank, Mark G; Lee, Kang

    2014-03-31

    In highly social species such as humans, faces have evolved to convey rich information for social interaction, including expressions of emotions and pain [1-3]. Two motor pathways control facial movement [4-7]: a subcortical extrapyramidal motor system drives spontaneous facial expressions of felt emotions, and a cortical pyramidal motor system controls voluntary facial expressions. The pyramidal system enables humans to simulate facial expressions of emotions not actually experienced. Their simulation is so successful that they can deceive most observers [8-11]. However, machine vision may be able to distinguish deceptive facial signals from genuine facial signals by identifying the subtle differences between pyramidally and extrapyramidally driven movements. Here, we show that human observers could not discriminate real expressions of pain from faked expressions of pain better than chance, and after training human observers, we improved accuracy to a modest 55%. However, a computer vision system that automatically measures facial movements and performs pattern recognition on those movements attained 85% accuracy. The machine system's superiority is attributable to its ability to differentiate the dynamics of genuine expressions from faked expressions. Thus, by revealing the dynamics of facial action through machine vision systems, our approach has the potential to elucidate behavioral fingerprints of neural control systems involved in emotional signaling.
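
    The finding that dynamics, rather than static appearance, separate genuine from faked expressions suggests features such as time to peak, onset speed, and duration computed from an action-unit intensity time series. Below is a hypothetical feature extractor in that spirit; the feature set is invented for illustration and is not the system's actual set of measures.

        # Hypothetical dynamic features from one AU intensity time series
        # (invented feature set, not the system's actual measures).
        import numpy as np

        def dynamic_features(au_intensity, fps=30):
            """au_intensity: 1-D array of per-frame AU intensity."""
            t = np.arange(len(au_intensity)) / fps
            peak = au_intensity.max()
            time_to_peak = t[np.argmax(au_intensity)]
            onset_speed = peak / max(time_to_peak, 1e-6)  # intensity / s
            duration = (au_intensity >= 0.2 * peak).sum() / fps
            return np.array([peak, time_to_peak, onset_speed, duration])

        # Features for all AUs can then feed any standard classifier.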

  13. The Facial Expression Coding System (FACES): Development, Validation, and Utility

    ERIC Educational Resources Information Center

    Kring, Ann M.; Sloan, Denise M.

    2007-01-01

    This article presents information on the development and validation of the Facial Expression Coding System (FACES; A. M. Kring & D. Sloan, 1991). Grounded in a dimensional model of emotion, FACES provides information on the valence (positive, negative) of facial expressive behavior. In 5 studies, reliability and validity data from 13 diverse…

  15. Four not six: Revealing culturally common facial expressions of emotion.

    PubMed

    Jack, Rachael E; Sun, Wei; Delis, Ioannis; Garrod, Oliver G B; Schyns, Philippe G

    2016-06-01

    As a highly social species, humans generate complex facial expressions to communicate a diverse range of emotions. Since Darwin's work, identifying among these complex patterns which are common across cultures and which are culture-specific has remained a central question in psychology, anthropology, philosophy, and more recently machine vision and social robotics. Classic approaches to addressing this question typically tested the cross-cultural recognition of theoretically motivated facial expressions representing 6 emotions, and reported universality. Yet, variable recognition accuracy across cultures suggests a narrower cross-cultural communication supported by sets of simpler expressive patterns embedded in more complex facial expressions. We explore this hypothesis by modeling the facial expressions of over 60 emotions across 2 cultures, and segregating out the latent expressive patterns. Using a multidisciplinary approach, we first map the conceptual organization of a broad spectrum of emotion words by building semantic networks in 2 cultures. For each emotion word in each culture, we then model and validate its corresponding dynamic facial expression, producing over 60 culturally valid facial expression models. We then apply to the pooled models a multivariate data reduction technique, revealing 4 latent and culturally common facial expression patterns that each communicates specific combinations of valence, arousal, and dominance. We then reveal the face movements that accentuate each latent expressive pattern to create complex facial expressions. Our data question the widely held view that 6 facial expression patterns are universal, instead suggesting 4 latent expressive patterns with direct implications for emotion communication, social psychology, cognitive neuroscience, and social robotics. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
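
    The data-reduction step can be illustrated schematically. The sketch below uses non-negative matrix factorization as one standard multivariate reduction technique (the paper's exact method is not reproduced here), and the input matrix is a hypothetical stand-in for the pooled culturally valid expression models.

    import numpy as np
    from sklearn.decomposition import NMF

    # Hypothetical input: a (models x action-units) matrix pooling the
    # dynamic expression models; entries are AU activation strengths.
    rng = np.random.default_rng(2)
    model_by_au = np.abs(rng.normal(size=(120, 42)))

    # Reduce the pooled models to 4 latent expressive patterns.
    nmf = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
    weights = nmf.fit_transform(model_by_au)  # each model as a mix of 4 patterns
    patterns = nmf.components_                # the 4 latent AU patterns

    print(weights.shape, patterns.shape)      # (120, 4) (4, 42)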

  16. Facial Expressions of Emotion and the Assessment of Performance

    DTIC Science & Technology

    2010-07-01

    AFRL-RH-WP-TR-2010-0114: Facial Expressions of Emotion and the Assessment of Performance. Regina M. Schmidt and Christienne R. Ruth; in-house report covering January 2009 - May 2010. The report examined the ability of facial contractions to provide information regarding one's decision-making capabilities under two emotional stressors.

  17. Recognition of 3D facial expression from posed data

    NASA Astrophysics Data System (ADS)

    Samad, Manar D.; Iftekharuddin, Khan M.

    2013-05-01

    Although recognition of facial expression in 3D facial images has been an active research area, most prior works are limited to full frontal facial images. These techniques primarily project the 3D facial image onto 2D and manually select landmarks in the 2D projection to extract relevant features. Face recognition in 2D images can be challenging due to unconstrained conditions such as head pose, occlusion, and the resulting loss of data. Similarly, pose variation in 3D facial imaging can also result in loss of data. In most current 3D facial recognition works, when 3D posed face data are projected onto 2D, additional data loss may render 2D facial expression recognition even more challenging. In comparison, this work proposes novel feature extraction directly from posed 3D facial images, without the need for manual selection of landmarks or projection of the images into 2D space. The feature is obtained as the angle between consecutive 3D normal vectors at the vertex points aligned either horizontally or vertically across the 3D facial image. Our facial expression recognition results show that the feature obtained from vertices aligned vertically across the face yields the best classification accuracy, with an average 87.8% area under the ROC curve. The results further suggest that the same feature outperforms its horizontal counterpart in recognizing facial expressions for pose variations between 35° and 50°, with average accuracies of 80% and 60%, respectively.
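
    The proposed feature is easy to state concretely: for a scan line of vertices, take the angle between each pair of consecutive unit normals. A minimal numpy sketch, with synthetic normals standing in for a real mesh:

    import numpy as np

    def consecutive_normal_angles(normals):
        """Angles (radians) between consecutive unit normal vectors along one
        scan line of mesh vertices, i.e., the feature described above."""
        n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
        cos = np.clip(np.sum(n[:-1] * n[1:], axis=1), -1.0, 1.0)
        return np.arccos(cos)

    # Synthetic normals along a smoothly curving vertical scan line.
    t = np.linspace(0.0, np.pi / 4, 50)
    normals = np.stack([np.sin(t), np.zeros_like(t), np.cos(t)], axis=1)
    print(consecutive_normal_angles(normals)[:3])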

  18. Enhanced subliminal emotional responses to dynamic facial expressions.

    PubMed

    Sato, Wataru; Kubota, Yasutaka; Toichi, Motomi

    2014-01-01

    Emotional processing without conscious awareness plays an important role in human social interaction. Several behavioral studies reported that subliminal presentation of photographs of emotional facial expressions induces unconscious emotional processing. However, it was difficult to elicit strong and robust effects using this method. We hypothesized that dynamic presentations of facial expressions would enhance subliminal emotional effects and tested this hypothesis with two experiments. Fearful or happy facial expressions were presented dynamically or statically in either the left or the right visual field for 20 (Experiment 1) and 30 (Experiment 2) ms. Nonsense target ideographs were then presented, and participants reported their preference for them. The results consistently showed that dynamic presentations of emotional facial expressions induced more evident emotional biases toward subsequent targets than did static ones. These results indicate that dynamic presentations of emotional facial expressions induce more evident unconscious emotional processing.

  19. The Not Face: A grammaticalization of facial expressions of emotion

    PubMed Central

    Benitez-Quiroz, C. Fabian; Wilbur, Ronnie B.; Martinez, Aleix M.

    2016-01-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3–8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers. PMID:26872248

  20. The not face: A grammaticalization of facial expressions of emotion.

    PubMed

    Benitez-Quiroz, C Fabian; Wilbur, Ronnie B; Martinez, Aleix M

    2016-05-01

    Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers.

  1. Contextual interference processing during fast categorisations of facial expressions.

    PubMed

    Frühholz, Sascha; Trautmann-Lengsfeld, Sina A; Herrmann, Manfred

    2011-09-01

    We examined interference effects of emotionally associated background colours during fast valence categorisations of negative, neutral and positive expressions. According to implicitly learned colour-emotion associations, facial expressions were presented with colours that either matched the valence of these expressions or not. Experiment 1 included infrequent non-matching trials and Experiment 2 a balanced ratio of matching and non-matching trials. Besides general modulatory effects of contextual features on the processing of facial expressions, we found differential effects depending on the valence of target facial expressions. Whereas performance accuracy was mainly affected for neutral expressions, performance speed was specifically modulated by emotional expressions, indicating some susceptibility of emotional expressions to contextual features. Experiment 3 used two further colour-emotion combinations, but revealed only marginal interference effects, most likely due to missing colour-emotion associations. The results are discussed with respect to inherent processing demands of emotional and neutral expressions and their susceptibility to contextual interference.

  2. Electrophysiological correlates of the efficient detection of emotional facial expressions.

    PubMed

    Sawada, Reiko; Sato, Wataru; Uono, Shota; Kochiyama, Takanori; Toichi, Motomi

    2014-04-29

    Behavioral studies have shown that emotional facial expressions are detected more rapidly and accurately than are neutral expressions. However, the neural mechanism underlying this efficient detection has remained unclear. To investigate this mechanism, we measured event-related potentials (ERPs) during a visual search task in which participants detected the normal emotional facial expressions of anger and happiness or their control stimuli, termed "anti-expressions," within crowds of neutral expressions. The anti-expressions, which were created using a morphing technique that produced changes equivalent to those in the normal emotional facial expressions compared with the neutral facial expressions, were most frequently recognized as emotionally neutral. Behaviorally, normal expressions were detected faster and more accurately and were rated as more emotionally arousing than were the anti-expressions. Regarding ERPs, the normal expressions elicited larger early posterior negativity (EPN) at 200-400 ms compared with anti-expressions. Furthermore, larger EPN was related to faster and more accurate detection and higher emotional arousal. These data suggest that the efficient detection of emotional facial expressions is implemented via enhanced activation of the posterior visual cortices at 200-400 ms based on their emotional significance. Copyright © 2014 Elsevier B.V. All rights reserved.
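
    A minimal sketch of how such a windowed ERP measure is typically extracted (the epochs array, sampling rate, and channel selection are illustrative assumptions, not the study's pipeline):

    import numpy as np

    def mean_amplitude(epochs, times, t0=0.200, t1=0.400):
        """Mean amplitude in the 200-400 ms window over which the EPN was
        measured. epochs: (trials, channels, samples); the channels are
        assumed to be the posterior sites of interest."""
        window = (times >= t0) & (times <= t1)
        return epochs[:, :, window].mean(axis=(1, 2))  # one value per trial

    times = np.linspace(-0.1, 0.6, 701)  # seconds; 1 kHz sampling assumed
    epochs = np.random.default_rng(7).normal(size=(40, 8, 701))
    print(mean_amplitude(epochs, times)[:5])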

  3. Processing of facial emotion expression in major depression: a review.

    PubMed

    Bourke, Cecilia; Douglas, Katie; Porter, Richard

    2010-08-01

    Processing of facial expressions of emotion is central to human interaction, and has important effects on behaviour and affective state. A range of methods and paradigms have been used to investigate various aspects of abnormal processing of facial expressions in major depression, including emotion-specific deficits in recognition accuracy, response biases and attentional biases. The aim of this review is to examine and interpret data from studies of facial emotion processing in major depression, in the context of current knowledge about the neural correlates of facial expression processing of primary emotions. The review also discusses the methodologies used to examine facial expression processing. Studies of facial emotion processing and facial emotion recognition were identified up to December 2009 utilizing MEDLINE and Web of Science. Although methodological variations complicate interpretation of findings, there is reasonably consistent evidence of a negative response bias towards sadness in individuals with major depression, so that positive (happy), neutral or ambiguous facial expressions tend to be evaluated as more sad or less happy compared with healthy control groups. There is also evidence of increased vigilance and selective attention towards sad expressions and away from happy expressions, but less evidence of reduced general or emotion-specific recognition accuracy. Interpretation of the data is complicated by the use of multiple paradigms and the heterogeneity of major depression. Future studies should address methodological problems, including variations in patient characteristics, testing paradigms and procedures, and statistical methods used to analyse findings.

  4. Facial expression recognition based on improved deep belief networks

    NASA Astrophysics Data System (ADS)

    Wu, Yao; Qiu, Weigen

    2017-08-01

    In order to improve the robustness of facial expression recognition, a method based on Local Binary Patterns (LBP) combined with improved deep belief networks (DBNs) is proposed. The method uses LBP to extract texture features and then uses the improved deep belief networks as the detector and classifier of those features, realizing the combination of LBP and improved DBNs for facial expression recognition. On the JAFFE (Japanese Female Facial Expression) database, the recognition rate is improved significantly.
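
    The LBP feature-extraction stage can be sketched with a standard library call (a minimal illustration; methods of this kind typically divide the face into blocks and concatenate per-block histograms before passing them to the DBN):

    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(gray, P=8, R=1.0):
        """Uniform LBP histogram of a grayscale face image; this histogram is
        the texture feature that would be fed to the DBN classifier."""
        codes = local_binary_pattern(gray, P, R, method="uniform")
        n_bins = P + 2  # P+1 uniform patterns plus one catch-all bin
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        return hist

    face = np.random.default_rng(3).integers(0, 256, size=(64, 64)).astype(np.uint8)
    print(lbp_histogram(face))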

  5. The Effect of Expression Sequence and Expressor Identity on Judgments of the Intensity of Facial Expression.

    ERIC Educational Resources Information Center

    Thayer, Stephen

    Three hundred female college students participated in a study to determine whether viewing a prior sequence of photographs of emotional-facial expressions would influence their impressions of later expressions. Each subject was tested individually and shown one series of five separate photographs of varying facial expressions, one photograph at a…

  6. Recognition, Expression, and Understanding Facial Expressions of Emotion in Adolescents with Nonverbal and General Learning Disabilities

    ERIC Educational Resources Information Center

    Bloom, Elana; Heath, Nancy

    2010-01-01

    Children with nonverbal learning disabilities (NVLD) have been found to be worse at recognizing facial expressions than children with verbal learning disabilities (LD) and without LD. However, little research has been done with adolescents. In addition, expressing and understanding facial expressions is yet to be studied among adolescents with LD…

  7. Viewing distance matter to perceived intensity of facial expressions.

    PubMed

    Gerhardsson, Andreas; Högman, Lennart; Fischer, Håkan

    2015-01-01

    In our daily perception of facial expressions, we depend on an ability to generalize across the varied distances at which they may appear. This is important to how we interpret the quality and the intensity of the expression. Previous research has not investigated whether this so-called perceptual constancy also applies to the experienced intensity of facial expressions. Using a psychophysical measure (Borg CR100 scale), the present study aimed to further investigate perceptual constancy of happy and angry facial expressions at varied sizes, which is a proxy for varying viewing distances. Seventy-one (42 females) participants rated the intensity and valence of facial expressions varying in distance and intensity. The results demonstrated that the perceived intensity (PI) of the emotional facial expression was dependent both on the distance of the face and on the person perceiving it. An interaction effect was noted, indicating that close-up faces are perceived as more intense than faces at a distance and that this effect is stronger the more intense the facial expression truly is. The present study raises considerations regarding constancy of the PI of happy and angry facial expressions at varied distances.

  8. Parameterized Facial Expression Synthesis Based on MPEG-4

    NASA Astrophysics Data System (ADS)

    Raouzaiou, Amaryllis; Tsapatsoulis, Nicolas; Karpouzis, Kostas; Kollias, Stefanos

    2002-12-01

    In the framework of MPEG-4, one can include applications where virtual agents, utilizing both textual and multisensory data, including facial expressions and nonverbal speech, help systems become accustomed to the actual feelings of the user. Applications of this technology are expected in educational environments, virtual collaborative workplaces, communities, and interactive entertainment. Facial animation has gained much interest within the MPEG-4 framework, with implementation details being an open research area (Tekalp, 1999). In this paper, we describe a method for enriching human-computer interaction, focusing on the analysis and synthesis of primary and intermediate facial expressions (Ekman and Friesen, 1978). To achieve this goal, we utilize facial animation parameters (FAPs) to model primary expressions and describe a rule-based technique for handling intermediate ones. A relation between FAPs and the activation parameter proposed in classical psychological studies is established, leading to parameterized facial expression analysis and synthesis notions compatible with the MPEG-4 standard.
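
    A minimal sketch of the parameterization idea: store an archetypal expression as a profile of FAP displacements and scale it by an activation parameter to obtain intermediate expressions. The FAP names and amplitudes below are illustrative, not taken from the paper.

    # Archetypal expression stored as MPEG-4 FAP displacements (illustrative
    # names/values); an activation in [0, 1] scales it to an intermediate one.
    JOY_PROFILE = {
        "stretch_l_cornerlip": 120.0,   # displacements in FAP units
        "stretch_r_cornerlip": 120.0,
        "raise_l_cornerlip": 80.0,
        "raise_r_cornerlip": 80.0,
    }

    def synthesize(profile, activation):
        """Scale an archetypal FAP profile by an activation parameter."""
        return {fap: activation * amp for fap, amp in profile.items()}

    print(synthesize(JOY_PROFILE, 0.3))   # a mild, intermediate joy expression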

  9. The Relationships between Processing Facial Identity, Emotional Expression, Facial Speech, and Gaze Direction during Development

    ERIC Educational Resources Information Center

    Spangler, Sibylle M.; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna

    2010-01-01

    Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding…

  11. Reduced facial expressiveness in Parkinson's disease: A pure motor disorder?

    PubMed

    Ricciardi, Lucia; Bologna, Matteo; Morgante, Francesca; Ricciardi, Diego; Morabito, Bruno; Volpe, Daniele; Martino, Davide; Tessitore, Alessandro; Pomponi, Massimiliano; Bentivoglio, Anna Rita; Bernabei, Roberto; Fasano, Alfonso

    2015-11-15

    Impaired emotional facial expressiveness is an important feature in Parkinson's disease (PD). Although there is evidence of a possible relationship between reduced facial expressiveness and altered emotion recognition or imagery in PD, it is unknown whether other aspects of emotional processing, such as subjective emotional experience (alexithymia), might influence hypomimia in this condition. In this study we aimed to investigate the possible relationship between reduced facial expressiveness and altered emotion processing (including facial recognition and alexithymia) in patients with PD. Forty PD patients and seventeen healthy controls (HC) were evaluated. Facial expressiveness was rated on video recordings, according to UPDRS-III item 19 and using an ad hoc scale assessing static and dynamic facial expression and posed emotions. Six blinded raters evaluated the patients' videos. Facial emotion recognition was tested using the Ekman Test; alexithymia was assessed using the Toronto Alexithymia Scale (TAS-20). PD patients had significantly reduced static and dynamic facial expressiveness and a deficit in posing happiness and surprise. They performed significantly worse than healthy controls in recognizing surprise (p=0.03). The Ekman total score correlated positively with global expressiveness (R=0.39, p=0.01) and with the expressiveness of disgust (R=0.32, p=0.01). The occurrence of alexithymia did not differ between PD patients and HC; however, a significant negative correlation was found between the expressiveness of disgust and a TAS-20 subscore (R=-0.447, p=0.007). Reduced facial expressiveness in PD may be in part related to difficulties with emotion recognition in the context of an unimpaired subjective emotional experience. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions.

    PubMed

    Ambadar, Zara; Schooler, Jonathan W; Cohn, Jeffrey F

    2005-05-01

    Most studies investigating the recognition of facial expressions have focused on static displays of intense expressions. Consequently, researchers may have underestimated the importance of motion in deciphering the subtle expressions that permeate real-life situations. In two experiments, we examined the effect of motion on perception of subtle facial expressions and tested the hypotheses that motion improves affect judgment by (a) providing denser sampling of expressions, (b) providing dynamic information, (c) facilitating configural processing, and (d) enhancing the perception of change. Participants viewed faces depicting subtle facial expressions in four modes (single-static, multi-static, dynamic, and first-last). Experiment 1 demonstrated a robust effect of motion and suggested that this effect was due to the dynamic property of the expression. Experiment 2 showed that the beneficial effect of motion may be due more specifically to its role in perception of change. Together, these experiments demonstrated the importance of motion in identifying subtle facial expressions.

  13. The Neuropsychology of Facial Identity and Facial Expression in Children with Mental Retardation

    ERIC Educational Resources Information Center

    Singh, Nirbhay N.; Oswald, Donald P.; Lancioni, Giulio E.; Ellis, Cynthia R.; Sage, Monica; Ferris, Jennifer R.

    2005-01-01

    We indirectly determined how children with mental retardation analyze facial identity and facial expression, and if these analyses of identity and expression were controlled by independent cognitive processes. In a reaction time study, 20 children with mild mental retardation were required to determine if simultaneously presented photographs of…

  14. Measuring the speed of recognising facially expressed emotions.

    PubMed

    Hildebrandt, Andrea; Schacht, Annekathrin; Sommer, Werner; Wilhelm, Oliver

    2012-01-01

    Faces provide identity- and emotion-related information: basic cues for mastering social interactions. Traditional models of face recognition suggest that, after a first initial stage, the processing streams for facial identity and expression diverge. In the present study we extended our previous multivariate investigations of face identity processing abilities to the speed of recognising facially expressed emotions. Analyses are based on a sample of N=151 young adults. First, we established a measurement model with a higher-order factor for the speed of recognising facially expressed emotions (SRE). This model has acceptable fit without specifying emotion-specific relations between indicators. Next, we assessed whether SRE can be reliably distinguished from the speed of recognising facial identity (SRI) and found latent factors for SRE and SRI to be perfectly correlated. In contrast, SRE and SRI were both only moderately related to a latent factor for the speed of recognising non-face stimuli (SRNF). We conclude that the processing of facial stimuli, and not the processing of facially expressed basic emotions, is the critical component of SRE. These findings are at variance with suggestions of separate routes for processing facial identity and emotional facial expressions, and suggest much more commonality between these streams as far as processing speed is concerned.

  15. The relationships between processing facial identity, emotional expression, facial speech, and gaze direction during development.

    PubMed

    Spangler, Sibylle M; Schwarzer, Gudrun; Korell, Monika; Maier-Karius, Johanna

    2010-01-01

    Four experiments were conducted with 5- to 11-year-olds and adults to investigate whether facial identity, facial speech, emotional expression, and gaze direction are processed independently of or in interaction with one another. In a computer-based, speeded sorting task, participants sorted faces according to facial identity while disregarding facial speech, emotional expression, and gaze direction or, alternatively, according to facial speech, emotional expression, and gaze direction while disregarding facial identity. Reaction times showed that children and adults were able to direct their attention selectively to facial identity despite variations of other kinds of face information, but when sorting according to facial speech and emotional expression, they were unable to ignore facial identity. In contrast, gaze direction could be processed independently of facial identity in all age groups. Apart from shorter reaction times and fewer classification errors, no substantial change in processing facial information was found to be correlated with age. We conclude that adult-like face processing routes are employed from 5 years of age onward.

  16. Automated Facial Action Coding System for Dynamic Analysis of Facial Expressions in Neuropsychiatric Disorders

    PubMed Central

    Hamm, Jihun; Kohler, Christian G.; Gur, Ruben C.; Verma, Ragini

    2011-01-01

    Facial expression is widely used to evaluate emotional impairment in neuropsychiatric disorders. Ekman and Friesen’s Facial Action Coding System (FACS) encodes movements of individual facial muscles from distinct momentary changes in facial appearance. Unlike facial expression ratings based on categorization of expressions into prototypical emotions (happiness, sadness, anger, fear, disgust, etc.), FACS can encode ambiguous and subtle expressions, and therefore is potentially more suitable for analyzing the small differences in facial affect. However, FACS rating requires extensive training, and is time consuming and subjective thus prone to bias. To overcome these limitations, we developed an automated FACS based on advanced computer science technology. The system automatically tracks faces in a video, extracts geometric and texture features, and produces temporal profiles of each facial muscle movement. These profiles are quantified to compute frequencies of single and combined Action Units (AUs) in videos, which can facilitate statistical study of large populations in disorders affecting facial expression. We derived quantitative measures of flat and inappropriate facial affect automatically from temporal AU profiles. Applicability of the automated FACS was illustrated in a pilot study, by applying it to data of videos from eight schizophrenia patients and controls. We created temporal AU profiles that provided rich information on the dynamics of facial muscle movements for each subject. The quantitative measures of flatness and inappropriateness showed clear differences between patients and the controls, highlighting their potential in automatic and objective quantification of symptom severity. PMID:21741407

  17. Automated Facial Action Coding System for dynamic analysis of facial expressions in neuropsychiatric disorders.

    PubMed

    Hamm, Jihun; Kohler, Christian G; Gur, Ruben C; Verma, Ragini

    2011-09-15

    Facial expression is widely used to evaluate emotional impairment in neuropsychiatric disorders. Ekman and Friesen's Facial Action Coding System (FACS) encodes movements of individual facial muscles from distinct momentary changes in facial appearance. Unlike facial expression ratings based on categorization of expressions into prototypical emotions (happiness, sadness, anger, fear, disgust, etc.), FACS can encode ambiguous and subtle expressions, and therefore is potentially more suitable for analyzing the small differences in facial affect. However, FACS rating requires extensive training, and is time consuming and subjective thus prone to bias. To overcome these limitations, we developed an automated FACS based on advanced computer science technology. The system automatically tracks faces in a video, extracts geometric and texture features, and produces temporal profiles of each facial muscle movement. These profiles are quantified to compute frequencies of single and combined Action Units (AUs) in videos, and they can facilitate a statistical study of large populations in disorders known to impact facial expression. We derived quantitative measures of flat and inappropriate facial affect automatically from temporal AU profiles. Applicability of the automated FACS was illustrated in a pilot study, by applying it to data of videos from eight schizophrenia patients and controls. We created temporal AU profiles that provided rich information on the dynamics of facial muscle movements for each subject. The quantitative measures of flatness and inappropriateness showed clear differences between patients and the controls, highlighting their potential in automatic and objective quantification of symptom severity. Copyright © 2011 Elsevier B.V. All rights reserved.
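
    The profile-quantification step can be sketched as follows (a minimal illustration with synthetic AU intensity traces and an assumed activation threshold; the published system derives its profiles from tracked video, which is not reproduced here):

    import numpy as np

    def au_frequencies(profiles, threshold=0.5):
        """From temporal AU profiles ({AU name: per-frame intensity array}),
        return the fraction of frames in which each single AU, and each pair
        of AUs jointly, is active. Threshold and traces are illustrative."""
        active = {au: p >= threshold for au, p in profiles.items()}
        single = {au: float(a.mean()) for au, a in active.items()}
        aus = sorted(active)
        combined = {(a, b): float((active[a] & active[b]).mean())
                    for i, a in enumerate(aus) for b in aus[i + 1:]}
        return single, combined

    rng = np.random.default_rng(4)
    profiles = {au: rng.random(300) for au in ("AU1", "AU4", "AU12")}
    print(au_frequencies(profiles)[0])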

  18. Automatic decoding of facial movements reveals deceptive pain expressions

    PubMed Central

    Bartlett, Marian Stewart; Littlewort, Gwen C.; Frank, Mark G.; Lee, Kang

    2014-01-01

    Summary In highly social species such as humans, faces have evolved to convey rich information for social interaction, including expressions of emotions and pain [1–3]. Two motor pathways control facial movement [4–7]. A subcortical extrapyramidal motor system drives spontaneous facial expressions of felt emotions. A cortical pyramidal motor system controls voluntary facial expressions. The pyramidal system enables humans to simulate facial expressions of emotions not actually experienced. Their simulation is so successful that they can deceive most observers [8–11]. Machine vision may, however, be able to distinguish deceptive from genuine facial signals by identifying the subtle differences between pyramidally and extrapyramidally driven movements. Here we show that human observers could not discriminate real from faked expressions of pain better than chance, and after training, improved accuracy to a modest 55%. However a computer vision system that automatically measures facial movements and performs pattern recognition on those movements attained 85% accuracy. The machine system’s superiority is attributable to its ability to differentiate the dynamics of genuine from faked expressions. Thus by revealing the dynamics of facial action through machine vision systems, our approach has the potential to elucidate behavioral fingerprints of neural control systems involved in emotional signaling. PMID:24656830

  19. Image ratio features for facial expression recognition application.

    PubMed

    Song, Mingli; Tao, Dacheng; Liu, Zicheng; Li, Xuelong; Zhou, Mengchu

    2010-06-01

    Video-based facial expression recognition is a challenging problem in computer vision and human-computer interaction. To target this problem, texture features have been extracted and widely used, because they can capture image intensity changes raised by skin deformation. However, existing texture features encounter problems with albedo and lighting variations. To solve both problems, we propose a new texture feature called image ratio features. Compared with previously proposed texture features, e.g., high gradient component features, image ratio features are more robust to albedo and lighting variations. In addition, to further improve facial expression recognition accuracy based on image ratio features, we combine image ratio features with facial animation parameters (FAPs), which describe the geometric motions of facial feature points. The performance evaluation is based on the Carnegie Mellon University Cohn-Kanade database, our own database, and the Japanese Female Facial Expression database. Experimental results show that the proposed image ratio feature is more robust to albedo and lighting variations, and the combination of image ratio features and FAPs outperforms each feature alone. In addition, we study asymmetric facial expressions based on our own facial expression database and demonstrate the superior performance of our combined expression recognition system.
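
    The core of the ratio feature is easy to illustrate: because image intensity is approximately albedo times shading, dividing an expressive frame by a neutral frame of the same person cancels the albedo pixelwise. A minimal sketch, not the authors' full pipeline:

    import numpy as np

    def ratio_image(expr, neutral, eps=1e-3):
        """Pixelwise ratio of an expressive face to the same person's neutral
        face. With intensity ~ albedo x shading, the albedo cancels in the
        ratio, which is the source of the feature's robustness."""
        return (expr.astype(np.float64) + eps) / (neutral.astype(np.float64) + eps)

    rng = np.random.default_rng(5)
    albedo = 0.2 + 0.8 * rng.random((64, 64))  # per-pixel skin albedo
    neutral = albedo * 0.8                     # neutral-pose shading
    smiling = albedo * 1.1                     # deformed-pose shading
    print(ratio_image(smiling, neutral).mean())  # ~1.1/0.8, albedo-independent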

  20. Facial expression of emotions in borderline personality disorder and depression.

    PubMed

    Renneberg, Babette; Heyn, Katrin; Gebhard, Rita; Bachmann, Silke

    2005-09-01

    Borderline personality disorder (BPD) is characterized by marked problems in interpersonal relationships and emotion regulation. The assumption of emotional hyper-reactivity in BPD is tested with regard to the facial expression of emotions, an aspect highly relevant for communication processes and a central feature of emotion regulation. Facial expressions of emotions were examined in a group of 30 female inpatients with BPD, 27 women with major depression, and 30 non-patient female controls. Participants were videotaped while watching two short movie sequences inducing either positive or negative emotions. The frequency of emotional facial expressions and the intensity of happiness expressions were examined using the Emotional Facial Action Coding System (EMFACS-7; Friesen & Ekman, unpublished manual, 1984). Group differences were analyzed separately for the negative and the positive mood-induction procedures. Results indicate that BPD patients reacted similarly to depressed patients, with reduced facial expressiveness to both films. The highest emotional facial activity to both films, and the most intense happiness expressions, were displayed by the non-clinical control group. The current findings contradict the assumption of a general hyper-reactivity to emotional stimuli in patients with BPD.

  1. Perception of temporal asymmetries in dynamic facial expressions

    PubMed Central

    Reinl, Maren; Bartels, Andreas

    2015-01-01

    In the current study we examined whether timeline reversals and the emotional direction of dynamic facial expressions affect the subjective experience of human observers. We recorded natural movies of faces that increased or decreased their expressions of fear, and played them either in the natural frame order or reversed from last to first frame (reversed timeline). This led to four conditions of increasing or decreasing fear, following either the natural or the reversed temporal trajectory of facial dynamics. This 2-by-2 factorial design controlled for visual low-level properties, static visual content, and motion energy across the different factors. It allowed us to examine the perceptual consequences that would occur if the timeline trajectory of facial muscle movements during the increase of an emotion is not the exact mirror of the timeline during its decrease. It additionally allowed us to study perceptual differences between increasing and decreasing emotional expressions. The perception of these time-dependent asymmetries had not previously been quantified. We found that three emotional measures (emotional intensity, artificialness of facial movement, and convincingness or plausibility of the emotion portrayal) were affected by timeline reversals as well as by the emotional direction of the facial expressions. Our results imply that natural dynamic facial expressions contain temporal asymmetries, and show that deviations from the natural timeline lead to a reduction of perceived emotional intensity and convincingness and to an increase of perceived artificialness of the dynamic facial expression. In addition, they show that decreasing facial expressions are judged as less plausible than increasing facial expressions. Our findings are of relevance for both behavioral and neuroimaging studies, as processing and perception are influenced by temporal asymmetries. PMID:26300807

  2. Development of the Korean Facial Emotion Stimuli: Korea University Facial Expression Collection 2nd Edition

    PubMed Central

    Kim, Sun-Min; Kwon, Ye-Jin; Jung, Soo-Yun; Kim, Min-Ji; Cho, Yang Seok; Kim, Hyun Taek; Nam, Ki-Chun; Kim, Hackjin; Choi, Kee-Hong; Choi, June-Seek

    2017-01-01

    Background: Developing valid emotional facial stimuli for specific ethnicities creates ample opportunities to investigate both the nature of emotional facial information processing in general and clinical populations as well as the underlying mechanisms of facial emotion processing within and across cultures. Given that most entries in emotional facial stimuli databases were developed with western samples, and given that very few of the eastern emotional facial stimuli sets were based strictly on the Ekman’s Facial Action Coding System, developing valid emotional facial stimuli of eastern samples remains a high priority. Aims: To develop and examine the psychometric properties of six basic emotional facial stimuli recruiting professional Korean actors and actresses based on the Ekman’s Facial Action Coding System for the Korea University Facial Expression Collection-Second Edition (KUFEC-II). Materials And Methods: Stimulus selection was done in two phases. First, researchers evaluated the clarity and intensity of each stimulus developed based on the Facial Action Coding System. Second, researchers selected a total of 399 stimuli from a total of 57 actors and actresses, which were then rated on accuracy, intensity, valence, and arousal by 75 independent raters. Conclusion: The hit rates between the targeted and rated expressions of the KUFEC-II were all above 80%, except for fear (50%) and disgust (63%). The KUFEC-II appears to be a valid emotional facial stimuli database, providing the largest set of emotional facial stimuli. The mean intensity score was 5.63 (out of 7), suggesting that the stimuli delivered the targeted emotions with great intensity. All positive expressions were rated as having a high positive valence, whereas all negative expressions were rated as having a high negative valence. The KUFEC II is expected to be widely used in various psychological studies on emotional facial expression. KUFEC-II stimuli can be obtained through contacting the

  3. Development of the Korean Facial Emotion Stimuli: Korea University Facial Expression Collection 2nd Edition.

    PubMed

    Kim, Sun-Min; Kwon, Ye-Jin; Jung, Soo-Yun; Kim, Min-Ji; Cho, Yang Seok; Kim, Hyun Taek; Nam, Ki-Chun; Kim, Hackjin; Choi, Kee-Hong; Choi, June-Seek

    2017-01-01

    Background: Developing valid emotional facial stimuli for specific ethnicities creates ample opportunities to investigate both the nature of emotional facial information processing in general and clinical populations as well as the underlying mechanisms of facial emotion processing within and across cultures. Given that most entries in emotional facial stimuli databases were developed with western samples, and given that very few of the eastern emotional facial stimuli sets were based strictly on the Ekman's Facial Action Coding System, developing valid emotional facial stimuli of eastern samples remains a high priority. Aims: To develop and examine the psychometric properties of six basic emotional facial stimuli recruiting professional Korean actors and actresses based on the Ekman's Facial Action Coding System for the Korea University Facial Expression Collection-Second Edition (KUFEC-II). Materials And Methods: Stimulus selection was done in two phases. First, researchers evaluated the clarity and intensity of each stimulus developed based on the Facial Action Coding System. Second, researchers selected a total of 399 stimuli from a total of 57 actors and actresses, which were then rated on accuracy, intensity, valence, and arousal by 75 independent raters. Conclusion: The hit rates between the targeted and rated expressions of the KUFEC-II were all above 80%, except for fear (50%) and disgust (63%). The KUFEC-II appears to be a valid emotional facial stimuli database, providing the largest set of emotional facial stimuli. The mean intensity score was 5.63 (out of 7), suggesting that the stimuli delivered the targeted emotions with great intensity. All positive expressions were rated as having a high positive valence, whereas all negative expressions were rated as having a high negative valence. The KUFEC II is expected to be widely used in various psychological studies on emotional facial expression. KUFEC-II stimuli can be obtained through contacting the

  4. Discrimination of gender using facial image with expression change

    NASA Astrophysics Data System (ADS)

    Kuniyada, Jun; Fukuda, Takahiro; Terada, Kenji

    2005-12-01

    By carrying out marketing research, the managers of large department stores or small convenience stores obtain information such as the ratio of male to female visitors and their age groups, and use it to improve their management plans. However, this work is carried out manually and becomes a large burden for small stores. In this paper, the authors propose a method for discriminating gender by extracting differences in facial expression change from color facial images. Many methods for the automatic recognition of individuals from moving or still facial images already exist in the field of image processing. However, it is very difficult to discriminate gender under the influence of hairstyle, clothing, etc. We therefore propose a method that is not affected by individual characteristics such as the size and position of facial parts, by paying attention to changes in expression. The method requires two facial images of the subject: one with an expression and one expressionless. First, the facial surface region and the regions of facial parts such as the eyes, nose, and mouth are extracted from the facial image using hue and saturation information in the HSV color system together with emphasized edge information. Next, features are extracted by calculating the rate of change of each facial part produced by the expression change. In the last step, the feature values are compared between the input data and a database, and the gender is discriminated. In this paper, experiments were conducted on laughing and smiling expressions, and good results were obtained for gender discrimination.
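
    A minimal sketch of the color-based region-extraction step (hue/saturation thresholds plus edge emphasis; the threshold values are illustrative assumptions that would need tuning to real skin tones and lighting):

    import cv2
    import numpy as np

    def face_regions(bgr):
        """Rough facial-surface extraction via hue/saturation thresholds in
        HSV, plus emphasized edges to help delimit eyes, nose, and mouth.
        Threshold values are illustrative and would need tuning."""
        hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([0, 40, 60], dtype=np.uint8)    # assumed skin range
        upper = np.array([25, 180, 255], dtype=np.uint8)
        surface = cv2.inRange(hsv, lower, upper)
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(gray, 80, 160)
        return surface, edges

    frame = np.zeros((240, 320, 3), dtype=np.uint8)      # placeholder image
    surface, edges = face_regions(frame)
    print(surface.shape, edges.shape)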

  5. Understanding Facial Expressions of Pain in Patients With Depression.

    PubMed

    Lautenbacher, Stefan; Bär, Karl-Juergen; Eisold, Patricia; Kunz, Miriam

    2016-12-02

    Although depression is associated with more clinical pain complaints, psychophysical data sometimes point to hypoalgesic alterations. Studying the more reflex-like facial expression of pain in patients with depression may offer a new perspective. Facial and psychophysical responses to nonpainful and painful heat stimuli were studied in 23 patients with major depressive disorder (MDD) and 23 matched control participants. As psychophysical data, pain thresholds, tolerance thresholds, and self-report were assessed. Facial responses were videotaped and subjected offline to Facial Action Coding System analysis. One of the key facial responses to pain, which is a known facial signal of negative affect (contraction of the eyebrows), was significantly increased in MDD patients. Moreover, facial expressions and pain ratings were strongly correlated in MDD patients, whereas these 2 response systems were, in line with established findings, only weakly related in healthy participants. Pain psychophysics was unaltered in MDD patients compared with healthy control participants. In conclusion, the facial expression of pain in MDD patients indicates hyperalgesia rather than hypoalgesia, with enhanced affective pain processing. Moreover, the linkage between subjective and facial responses was much stronger in MDD patients, which may be due to a reduced influence of social display rules, which normally complicate this relationship.

  6. Facial identity and facial expression are initially integrated at visual perceptual stages of face processing.

    PubMed

    Fisher, Katie; Towler, John; Eimer, Martin

    2016-01-08

    It is frequently assumed that facial identity and facial expression are analysed in functionally and anatomically distinct streams within the core visual face processing system. To investigate whether expression and identity interact during the visual processing of faces, we employed a sequential matching procedure where participants compared either the identity or the expression of two successively presented faces, and ignored the other irrelevant dimension. Repetitions versus changes of facial identity and expression were varied independently across trials, and event-related potentials (ERPs) were recorded during task performance. Irrelevant facial identity and irrelevant expression both interfered with performance in the expression and identity matching tasks. These symmetrical interference effects show that neither identity nor expression can be selectively ignored during face matching, and suggest that they are not processed independently. N250r components to identity repetitions that reflect identity matching mechanisms in face-selective visual cortex were delayed and attenuated when there was an expression change, demonstrating that facial expression interferes with visual identity matching. These findings provide new evidence for interactions between facial identity and expression within the core visual processing system, and question the hypothesis that these two attributes are processed independently. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Does the Organization of Emotional Expression Change over Time? Facial Expressivity from 4 to 12 Months

    ERIC Educational Resources Information Center

    Bennett, David S.; Bendersky, Margaret; Lewis, Michael

    2005-01-01

    Differentiation models contend that the organization of facial expressivity increases during infancy. Accordingly, infants are believed to exhibit increasingly specific facial expressions in response to stimuli as a function of development. This study tested this hypothesis in a sample of 151 infants (83 boys and 68 girls) observed in 4 situations…

  9. Top-down guidance in visual search for facial expressions.

    PubMed

    Hahn, Sowon; Gronlund, Scott D

    2007-02-01

    Using a visual search paradigm, we investigated how a top-down goal modified attentional bias for threatening facial expressions. In two experiments, participants searched for a facial expression either based on stimulus characteristics or a top-down goal. In Experiment 1, participants searched for a discrepant facial expression in a homogeneous crowd of faces. Consistent with previous research, we obtained a shallower response time (RT) slope when the target face was angry than when it was happy. In Experiment 2, participants searched for a specific type of facial expression (allowing a top-down goal). When the display included a target, we found a shallower RT slope for the angry than for the happy face search. However, when an angry or happy face was present in the display in opposition to the task goal, we obtained equivalent RT slopes, suggesting that the mere presence of an angry face in opposition to the task goal did not support the well-known angry face superiority effect. Furthermore, RT distribution analyses supported the special status of an angry face only when it was combined with the top-down goal. On the basis of these results, we suggest that a threatening facial expression may guide attention as a high-priority stimulus in the absence of a specific goal; however, in the presence of a specific goal, the efficiency of facial expression search is dependent on the combined influence of a top-down goal and the stimulus characteristics.

  10. Importance of the brow in facial expressiveness during human communication.

    PubMed

    Neely, John Gail; Lisker, Paul; Drapekin, Jesse

    2014-03-01

    The objective of this study was to evaluate laterality and upper/lower face dominance of expressiveness during prescribed speech, using a unique validated image-subtraction system capable of sensitive and reliable measurement of facial surface deformation. Observations and experiments on the central control of facial expressions during speech and social utterances in humans and animals suggest that the right side of the mouth moves more than the left during nonemotional speech. However, proficient lip readers seem to attend to the whole face to interpret meaning from expressed facial cues, also implicating a horizontal (upper face-lower face) axis. Study design: prospective experimental design. Experimental maneuver: recited speech. Main outcome measure: image-subtraction strength-duration curve amplitude. Thirty normal human adults were evaluated during memorized nonemotional recitation of 2 short sentences. Facial movements were assessed using a video-image subtraction system capable of simultaneously measuring specific upper and lower areas of each hemiface. The results demonstrate that both axes influence facial expressiveness in human communication; however, the horizontal axis (upper versus lower face) appears dominant, especially during what would appear to be spontaneous breakthrough unplanned expressiveness. These data are congruent with the concept that the left cerebral hemisphere has control over nonemotionally stimulated speech; however, the multisynaptic brainstem extrapyramidal pathways may override hemiface laterality and preferentially take control of the upper face. Additionally, these data demonstrate the importance of the often-ignored brow in facial expressiveness. Experimental study; EBM levels not applicable.
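
    The image-subtraction measurement described above can be sketched generically (a minimal illustration; the authors' validated system is more sophisticated, and the region boxes and frames below are synthetic):

    import numpy as np

    def deformation_scores(rest, frame, boxes):
        """Mean absolute grayscale difference from a rest frame within each
        facial region; boxes maps region name -> (row0, row1, col0, col1).
        The quadrant regions follow the study's two axes (left/right,
        upper/lower), but the scoring itself is a generic stand-in."""
        diff = np.abs(frame.astype(np.int16) - rest.astype(np.int16))
        return {name: float(diff[r0:r1, c0:c1].mean())
                for name, (r0, r1, c0, c1) in boxes.items()}

    h, w = 240, 320
    boxes = {"upper_left": (0, h // 2, 0, w // 2),
             "upper_right": (0, h // 2, w // 2, w),
             "lower_left": (h // 2, h, 0, w // 2),
             "lower_right": (h // 2, h, w // 2, w)}
    rng = np.random.default_rng(6)
    rest = rng.integers(0, 256, (h, w)).astype(np.uint8)
    speaking = rest.copy()
    speaking[h // 2:, :] //= 2   # exaggerated lower-face change
    print(deformation_scores(rest, speaking, boxes))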

  11. Facial expressions of mice in aggressive and fearful contexts.

    PubMed

    Defensor, Erwin B; Corley, Michael J; Blanchard, Robert J; Blanchard, D Caroline

    2012-12-05

    Some animals display a variety of context dependent facial expressions. Previous studies have shown that rodents display a facial grimace while in pain. To determine if the facial expressions of mice extend beyond pain, facial expressions were analyzed in the presence of non-social, social and predator stimuli. In a vibrissae contact test, the whiskers of mice were stroked by the bristles of a brush. In a social proximity test, two mice were placed together in a small chamber where contact was virtually unavoidable. In a resident-intruder test of aggression, an unknown mouse was placed into the homecage of another mouse. In a cat odor exposure test and in a live rat exposure test, mice were presented with the respective stimuli. Results from this study indicated that mice showed two patterns of expression, either a full display of changes in the measured facial components, characterized by tightened eyes, flattened ears, nose swells and cheek swells; or a more limited display of these facial changes. The full display of changes occurred in the vibrissae contact test, the social proximity test, and in resident mice in the resident-intruder test. The more limited display of facial changes occurred in the cat odor exposure test, the rat exposure test and in intruder mice in the resident-intruder test. The differential display of facial changes across conditions indicated that mice showed tightened eyes and flattened ears in situations that provided the immediate potential for contact, suggesting that such changes are involved in protection of sensitive and/or vulnerable body parts. Furthermore, the display of facial expressions by mice indicates that these expressions are widely distributed across evolution.

  12. Facial expressions of emotion: a cognitive neuroscience perspective.

    PubMed

    Erickson, Kristine; Schulkin, Jay

    2003-06-01

    Facial expressions are one example of emotional behavior that illustrate the importance of emotions to both basic survival and social interaction. Basic facial responses to stimuli such as sweet and bitter taste are important for species fitness and governed by simple rules. Even at this basic level, facial responses have communicative value to other species members. During evolution simple facial responses were extended for use in more complex nonverbal communications; the responses are labile. The perception and production of facial expressions are cognitive processes and numerous subcortical and cortical areas contribute to these operations. We suggest that no specific emotion center exists over and above cognitive systems in the brain, and that emotion should not be divorced from cognition.

  13. Automatic Facial Expression Recognition and Operator Functional State

    NASA Technical Reports Server (NTRS)

    Blanson, Nina

    2011-01-01

    The prevalence of human error in safety-critical occupations remains a major challenge to mission success despite increasing automation in control processes. Although various methods have been proposed to prevent incidences of human error, none of these have been developed to employ the detection and regulation of Operator Functional State (OFS), or the optimal condition of the operator while performing a task, in work environments due to drawbacks such as obtrusiveness and impracticality. A video-based system with the ability to infer an individual's emotional state from facial feature patterning mitigates some of the problems associated with other methods of detecting OFS, like obtrusiveness and impracticality in integration with the mission environment. This paper explores the utility of facial expression recognition as a technology for inferring OFS by first expounding on the intricacies of OFS and the scientific background behind emotion and its relationship with an individual's state. Then, descriptions of the feedback loop and the emotion protocols proposed for the facial recognition program are explained. A basic version of the facial expression recognition program uses Haar classifiers and OpenCV libraries to automatically locate key facial landmarks during a live video stream. Various methods of creating facial expression recognition software are reviewed to guide future extensions of the program. The paper concludes with an examination of the steps necessary in the research of emotion and recommendations for the creation of an automatic facial expression recognition program for use in real-time, safety-critical missions.

  15. Slowing down Presentation of Facial Movements and Vocal Sounds Enhances Facial Expression Recognition and Induces Facial-Vocal Imitation in Children with Autism

    ERIC Educational Resources Information Center

    Tardif, Carole; Laine, France; Rodriguez, Melissa; Gepner, Bruno

    2007-01-01

    This study examined the effects of slowing down presentation of facial expressions and their corresponding vocal sounds on facial expression recognition and facial and/or vocal imitation in children with autism. Twelve autistic children and twenty-four normal control children were presented with emotional and non-emotional facial expressions on…

  17. A specific neural substrate for perceiving facial expressions of disgust.

    PubMed

    Phillips, M L; Young, A W; Senior, C; Brammer, M; Andrew, C; Calder, A J; Bullmore, E T; Perrett, D I; Rowland, D; Williams, S C; Gray, J A; David, A S

    1997-10-02

    Recognition of facial expressions is critical to our appreciation of the social and physical environment, with separate emotions having distinct facial expressions. Perception of fearful facial expressions has been extensively studied, appearing to depend upon the amygdala. Disgust, literally 'bad taste', is another important emotion, with a distinct evolutionary history, and is conveyed by a characteristic facial expression. We have used functional magnetic resonance imaging (fMRI) to examine the neural substrate for perceiving disgust expressions. Normal volunteers were presented with faces showing mild or strong disgust or fear. Cerebral activation in response to these stimuli was contrasted with that for neutral faces. Results for fear generally confirmed previous positron emission tomography findings of amygdala involvement. Both strong and mild expressions of disgust activated anterior insular cortex but not the amygdala; strong disgust also activated structures linked to a limbic cortico-striatal-thalamic circuit. The anterior insula is known to be involved in responses to offensive tastes. The neural response to facial expressions of disgust in others is thus closely related to appraisal of distasteful stimuli.

  18. Expressive facial animation synthesis by learning speech coarticulation and expression spaces.

    PubMed

    Deng, Zhigang; Neumann, Ulrich; Lewis, J P; Kim, Tae-Yong; Bulut, Murtaza; Narayanan, Shrikanth

    2006-01-01

    Synthesizing expressive facial animation is a very challenging topic within the graphics community. In this paper, we present an expressive facial animation synthesis system enabled by automated learning from facial motion capture data. Accurate 3D motions of the markers on the face of a human subject are captured while he/she recites a predesigned corpus, with specific spoken and visual expressions. We present a novel motion capture mining technique that "learns" speech coarticulation models for diphones and triphones from the recorded data. A Phoneme-Independent Expression Eigenspace (PIEES) that encloses the dynamic expression signals is constructed by motion signal processing (phoneme-based time-warping and subtraction) and Principal Component Analysis (PCA) reduction. New expressive facial animations are synthesized as follows: First, the learned coarticulation models are concatenated to synthesize neutral visual speech according to novel speech input, then a texture-synthesis-based approach is used to generate a novel dynamic expression signal from the PIEES model, and finally the synthesized expression signal is blended with the synthesized neutral visual speech to create the final expressive facial animation. Our experiments demonstrate that the system can effectively synthesize realistic expressive facial animation.
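
    The PIEES construction lends itself to a compact sketch: subtract time-warped neutral speech motion from expressive motion, then reduce the residual with PCA to obtain an expression eigenspace from which new expression signals can be generated. The arrays below are hypothetical stand-ins for the paper's motion-capture data, not its actual pipeline.

    ```python
    # Sketch: a phoneme-independent expression eigenspace built with PCA.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    expressive = rng.normal(size=(500, 3 * 40))  # 500 frames x 40 3-D markers
    neutral = rng.normal(size=(500, 3 * 40))     # time-warped neutral speech

    expression_signal = expressive - neutral     # phoneme-independent residual

    pca = PCA(n_components=10)                   # the expression eigenspace
    coeffs = pca.fit_transform(expression_signal)

    # Generate a novel expression frame by sampling eigenspace coefficients
    # with variances matched to the training data, then mapping back.
    new_coeffs = rng.normal(scale=coeffs.std(axis=0))
    new_frame = pca.inverse_transform(new_coeffs.reshape(1, -1))
    ```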

  19. Face processing in children with autism spectrum disorder: independent or interactive processing of facial identity and facial expression?

    PubMed

    Krebs, Julia F; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-06-01

    The current study investigated whether deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity, disregarding emotional expression. Typically developing children processed facial identity independently from facial expressions but processed facial expressions in interaction with identity. Children with autism processed both facial expression and identity independently of each other. They selectively directed their attention to one facial parameter despite variations in the other. Results indicate that there is no interaction in processing facial identity and emotional expression in autism spectrum disorder.

  20. Training facial expression production in children on the autism spectrum.

    PubMed

    Gordon, Iris; Pierce, Matthew D; Bartlett, Marian S; Tanaka, James W

    2014-10-01

    Children with autism spectrum disorder (ASD) show deficits in their ability to produce facial expressions. In this study, a group of children with ASD and IQ-matched, typically developing (TD) children were trained to produce "happy" and "angry" expressions with the FaceMaze computer game. FaceMaze uses an automated computer recognition system that analyzes the child's facial expression in real time. Before and after playing the Angry and Happy versions of FaceMaze, children posed "happy" and "angry" expressions. Naïve raters judged the post-FaceMaze "happy" and "angry" expressions of the ASD group as higher in quality than their pre-FaceMaze productions. Moreover, the post-game expressions of the ASD group were rated as equal in quality to the expressions of the TD group.

  1. Reconstructing dynamic mental models of facial expressions in prosopagnosia reveals distinct representations for identity and expression.

    PubMed

    Richoz, Anne-Raphaëlle; Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G; Caldara, Roberto

    2015-04-01

    The human face transmits a wealth of signals that readily provide crucial information for social interactions, such as facial identity and emotional expression. Yet, a fundamental question remains unresolved: does the face information for identity and emotional expression categorization tap into common or distinct representational systems? To address this question we tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions that are assumed to contribute to facial expression (de)coding (i.e., the amygdala, the insula and the posterior superior temporal sulcus, pSTS). We previously demonstrated that PS does not use information from the eye region to identify faces, but relies on the suboptimal mouth region. PS's abnormal information use for identity, coupled with her neural dissociation, provides a unique opportunity to probe the existence of a dichotomy in the face representational system. To reconstruct the mental models of the six basic facial expressions of emotion in PS and age-matched healthy observers, we used a novel reverse correlation technique tracking information use on dynamic faces. PS was comparable to controls, using all facial features to (de)code facial expressions with the exception of fear. PS's normal (de)coding of dynamic facial expressions suggests that the face system relies either on distinct representational systems for identity and expression, or dissociable cortical pathways to access them. Interestingly, PS showed a selective impairment for categorizing many static facial expressions, which could be accounted for by her lesion in the right inferior occipital gyrus. PS's advantage for dynamic facial expressions might instead relate to a functionally distinct and sufficient cortical pathway directly connecting the early visual cortex to the spared pSTS. Altogether, our data provide critical insights on the healthy and impaired face systems, question evidence of deficits

  2. Impaired Overt Facial Mimicry in Response to Dynamic Facial Expressions in High-Functioning Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Yoshimura, Sayaka; Sato, Wataru; Uono, Shota; Toichi, Motomi

    2015-01-01

    Previous electromyographic studies have reported that individuals with autism spectrum disorders (ASD) exhibited atypical patterns of facial muscle activity in response to facial expression stimuli. However, whether such activity is expressed in visible facial mimicry remains unknown. To investigate this issue, we videotaped facial responses in…

  4. Recognising facial expression from spatially and temporally modified movements.

    PubMed

    Pollick, Frank E; Hill, Harold; Calder, Andrew; Paterson, Helena

    2003-01-01

    We examined how the recognition of facial emotion was influenced by manipulation of both spatial and temporal properties of 3-D point-light displays of facial motion. We started with the measurement of 3-D position of multiple locations on the face during posed expressions of anger, happiness, sadness, and surprise, and then manipulated the spatial and temporal properties of the measurements to obtain new versions of the movements. In two experiments, we examined recognition of these original and modified facial expressions: in experiment 1, we manipulated the spatial properties of the facial movement, and in experiment 2 we manipulated the temporal properties. The results of experiment 1 showed that exaggeration of facial expressions relative to a fixed neutral expression resulted in enhanced ratings of the intensity of that emotion. The results of experiment 2 showed that changing the duration of an expression had a small effect on ratings of emotional intensity, with a trend for expressions with shorter durations to have lower ratings of intensity. The results are discussed within the context of theories of encoding as related to caricature and emotion.
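
    The spatial manipulation in experiment 1 reduces to a simple linear operation: each measured facial point is displaced away from the corresponding point of a fixed neutral expression by a gain factor. A minimal sketch, with hypothetical landmark coordinates:

    ```python
    # Sketch: caricature-style exaggeration of facial motion relative to a
    # fixed neutral expression (gain > 1 exaggerates, gain < 1 attenuates).
    import numpy as np

    def exaggerate(expression, neutral, gain):
        return neutral + gain * (expression - neutral)

    neutral = np.zeros((30, 3))  # 30 tracked 3-D facial points (hypothetical)
    anger = neutral + np.random.default_rng(1).normal(0.0, 0.5, size=(30, 3))

    anger_exaggerated = exaggerate(anger, neutral, 1.5)
    anger_attenuated = exaggerate(anger, neutral, 0.5)
    ```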

  5. Training Facial Expression Production in Children on the Autism Spectrum

    ERIC Educational Resources Information Center

    Gordon, Iris; Pierce, Matthew D.; Bartlett, Marian S.; Tanaka, James W.

    2014-01-01

    Children with autism spectrum disorder (ASD) show deficits in their ability to produce facial expressions. In this study, a group of children with ASD and IQ-matched, typically developing (TD) children were trained to produce "happy" and "angry" expressions with the FaceMaze computer game. FaceMaze uses an automated computer…

  7. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load

    PubMed Central

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925

  8. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    PubMed

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.
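
    The gaze-cueing effect in such paradigms is conventionally quantified as the mean response time on invalidly cued trials minus the mean on validly cued trials, separately for each expression and load condition. A small sketch with made-up data (not the authors' dataset):

    ```python
    # Sketch: computing gaze-cueing effects per expression x load condition.
    import pandas as pd

    trials = pd.DataFrame({
        "expression": ["angry", "angry", "happy", "happy"] * 2,
        "load": ["high"] * 4 + ["low"] * 4,
        "validity": ["valid", "invalid"] * 4,
        "rt_ms": [512, 561, 498, 520, 505, 538, 501, 522],  # made-up RTs
    })

    means = trials.pivot_table(index=["expression", "load"],
                               columns="validity", values="rt_ms")
    means["cueing_effect_ms"] = means["invalid"] - means["valid"]
    print(means)
    ```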

  9. Fast and Accurate Digital Morphometry of Facial Expressions.

    PubMed

    Grewe, Carl Martin; Schreiber, Lisa; Zachow, Stefan

    2015-10-01

    Facial surgery deals with a part of the human body that is of particular importance in everyday social interactions. The perception of a person's natural, emotional, and social appearance is significantly influenced by one's expression. This is why facial dynamics have been increasingly studied by both artists and scholars since the mid-Renaissance. Currently, facial dynamics and their importance in the perception of a patient's identity play a fundamental role in planning facial surgery. Assistance is needed for patient information and communication, for documentation and evaluation of the treatment, as well as during the surgical procedure. Here, the quantitative assessment of morphological features has been facilitated by the emergence of diverse digital imaging modalities in the last decades. Unfortunately, the manual data preparation usually needed for further quantitative analysis of the digitized head models (surface registration, landmark annotation) is time-consuming, and thus inhibits its use for treatment planning and communication. In this article, we refer to historical studies on facial dynamics, briefly present related work from the field of facial surgery, and draw implications for further developments in this context. A prototypical stereophotogrammetric system for high-quality assessment of patient-specific 3D dynamic morphology is described. An individual statistical model of several facial expressions is computed, and possibilities to address a broad range of clinical questions in facial surgery are demonstrated.

  10. The face is not an empty canvas: how facial expressions interact with facial appearance

    PubMed Central

    Hess, Ursula; Adams, Reginald B.; Kleck, Robert E.

    2009-01-01

    Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions. PMID:19884144

  11. The face is not an empty canvas: how facial expressions interact with facial appearance.

    PubMed

    Hess, Ursula; Adams, Reginald B; Kleck, Robert E

    2009-12-12

    Faces are not simply blank canvases upon which facial expressions write their emotional messages. In fact, facial appearance and facial movement are both important social signalling systems in their own right. We here provide multiple lines of evidence for the notion that the social signals derived from facial appearance on the one hand and facial movement on the other interact in a complex manner, sometimes reinforcing and sometimes contradicting one another. Faces provide information on who a person is. Sex, age, ethnicity, personality and other characteristics that can define a person and the social group the person belongs to can all be derived from the face alone. The present article argues that faces interact with the perception of emotion expressions because this information informs a decoder's expectations regarding an expresser's probable emotional reactions. Facial appearance also interacts more directly with the interpretation of facial movement because some of the features that are used to derive personality or sex information are also features that closely resemble certain emotional expressions, thereby enhancing or diluting the perceived strength of particular expressions.

  12. Emotional facial expressions reduce neural adaptation to face identity.

    PubMed

    Gerlicher, Anna M V; van Loon, Anouk M; Scholte, H Steven; Lamme, Victor A F; van der Leij, Andries R

    2014-05-01

    In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems to be sustained for emotional facial expressions. We therefore tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record steady-state visual evoked potentials (ssVEP) and tested the extent to which the ssVEP amplitude adapts to the same as compared with different face identities. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
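
    The ssVEP measure itself is straightforward to compute: Fourier-transform an EEG epoch and read off the amplitude at the 3 Hz stimulation frequency; adaptation is then the amplitude difference between same-identity and different-identity conditions. A sketch on synthetic signals, not the authors' analysis code:

    ```python
    # Sketch: extracting ssVEP amplitude at the 3 Hz stimulation frequency.
    import numpy as np

    fs, seconds = 250, 10                      # hypothetical sampling setup
    t = np.arange(fs * seconds) / fs
    rng = np.random.default_rng(2)
    same_id = 0.8 * np.sin(2 * np.pi * 3 * t) + rng.normal(0, 1, t.size)
    diff_id = 1.2 * np.sin(2 * np.pi * 3 * t) + rng.normal(0, 1, t.size)

    def amplitude_at(signal, freq, fs):
        spectrum = np.abs(np.fft.rfft(signal)) / (signal.size / 2)
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        return spectrum[np.argmin(np.abs(freqs - freq))]

    adaptation = amplitude_at(diff_id, 3.0, fs) - amplitude_at(same_id, 3.0, fs)
    print(f"ssVEP adaptation effect at 3 Hz: {adaptation:.3f}")
    ```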

  13. Unseen facial and bodily expressions trigger fast emotional reactions

    PubMed Central

    Tamietto, Marco; Castelli, Lorys; Vighetti, Sergio; Perozzo, Paola; Geminiani, Giuliano; Weiskrantz, Lawrence; de Gelder, Beatrice

    2009-01-01

    The spontaneous tendency to synchronize our facial expressions with those of others is often termed emotional contagion. It is unclear, however, whether emotional contagion depends on visual awareness of the eliciting stimulus and which processes underlie the unfolding of expressive reactions in the observer. It has been suggested either that emotional contagion is driven by motor imitation (i.e., mimicry), or that it is one observable aspect of the emotional state arising when we see the corresponding emotion in others. Emotional contagion reactions to different classes of consciously seen and “unseen” stimuli were compared by presenting pictures of facial or bodily expressions either to the intact or blind visual field of two patients with unilateral destruction of the visual cortex and ensuing phenomenal blindness. Facial reactions were recorded using electromyography, and arousal responses were measured with pupil dilatation. Passive exposure to unseen expressions evoked faster facial reactions and higher arousal compared with seen stimuli, therefore indicating that emotional contagion occurs also when the triggering stimulus cannot be consciously perceived because of cortical blindness. Furthermore, stimuli that are very different in their visual characteristics, such as facial and bodily gestures, induced highly similar expressive responses. This shows that the patients did not simply imitate the motor pattern observed in the stimuli, but resonated to their affective meaning. Emotional contagion thus represents an instance of truly affective reactions that may be mediated by visual pathways of old evolutionary origin bypassing cortical vision while still providing a cornerstone for emotion communication and affect sharing. PMID:19805044

  14. Regional Brain Responses Are Biased Toward Infant Facial Expressions Compared to Adult Facial Expressions in Nulliparous Women.

    PubMed

    Li, Bingbing; Cheng, Gang; Zhang, Dajun; Wei, Dongtao; Qiao, Lei; Wang, Xiangpeng; Che, Xianwei

    2016-01-01

    Recent neuroimaging studies suggest that neutral infant faces compared to neutral adult faces elicit greater activity in brain areas associated with face processing, attention, empathic response, reward, and movement. However, whether infant facial expressions evoke larger brain responses than adult facial expressions remains unclear. Here, we performed event-related functional magnetic resonance imaging in nulliparous women while they were presented with images of matched unfamiliar infant and adult facial expressions (happy, neutral, and uncomfortable/sad) in a pseudo-randomized order. We found that the bilateral fusiform and right lingual gyrus were overall more activated during the presentation of infant facial expressions compared to adult facial expressions. Uncomfortable infant faces compared to sad adult faces evoked greater activation in the bilateral fusiform gyrus, precentral gyrus, postcentral gyrus, posterior cingulate cortex-thalamus, and precuneus. Neutral infant faces activated larger brain responses in the left fusiform gyrus compared to neutral adult faces. Happy infant faces compared to happy adult faces elicited larger responses in areas of the brain associated with emotion and reward processing using a more liberal threshold of p < 0.005 uncorrected. Furthermore, the level of the test subjects' Interest-In-Infants was positively associated with the intensity of right fusiform gyrus response to infant faces and uncomfortable infant faces compared to sad adult faces. In addition, the Perspective Taking subscale score on the Interpersonal Reactivity Index-Chinese was significantly correlated with precuneus activity during uncomfortable infant faces compared to sad adult faces. Our findings suggest that regional brain areas may bias cognitive and emotional responses to infant facial expressions compared to adult facial expressions among nulliparous women, and this bias may be modulated by individual differences in Interest-In-Infants and

  15. Regional Brain Responses Are Biased Toward Infant Facial Expressions Compared to Adult Facial Expressions in Nulliparous Women

    PubMed Central

    Li, Bingbing; Cheng, Gang; Zhang, Dajun; Wei, Dongtao; Qiao, Lei; Wang, Xiangpeng; Che, Xianwei

    2016-01-01

    Recent neuroimaging studies suggest that neutral infant faces compared to neutral adult faces elicit greater activity in brain areas associated with face processing, attention, empathic response, reward, and movement. However, whether infant facial expressions evoke larger brain responses than adult facial expressions remains unclear. Here, we performed event-related functional magnetic resonance imaging in nulliparous women while they were presented with images of matched unfamiliar infant and adult facial expressions (happy, neutral, and uncomfortable/sad) in a pseudo-randomized order. We found that the bilateral fusiform and right lingual gyrus were overall more activated during the presentation of infant facial expressions compared to adult facial expressions. Uncomfortable infant faces compared to sad adult faces evoked greater activation in the bilateral fusiform gyrus, precentral gyrus, postcentral gyrus, posterior cingulate cortex-thalamus, and precuneus. Neutral infant faces activated larger brain responses in the left fusiform gyrus compared to neutral adult faces. Happy infant faces compared to happy adult faces elicited larger responses in areas of the brain associated with emotion and reward processing using a more liberal threshold of p < 0.005 uncorrected. Furthermore, the level of the test subjects’ Interest-In-Infants was positively associated with the intensity of right fusiform gyrus response to infant faces and uncomfortable infant faces compared to sad adult faces. In addition, the Perspective Taking subscale score on the Interpersonal Reactivity Index-Chinese was significantly correlated with precuneus activity during uncomfortable infant faces compared to sad adult faces. Our findings suggest that regional brain areas may bias cognitive and emotional responses to infant facial expressions compared to adult facial expressions among nulliparous women, and this bias may be modulated by individual differences in Interest-In-Infants and

  16. Comparison of emotion recognition from facial expression and music.

    PubMed

    Gaspar, Tina; Labor, Marina; Jurić, Iva; Dumancić, Dijana; Ilakovac, Vesna; Heffer, Marija

    2011-01-01

    The recognition of basic emotions in everyday communication involves the interpretation of different visual and auditory cues. The ability to recognize emotions is difficult to assess, as emotional displays are usually very brief (micro-expressions), and recognition itself does not have to be a conscious process. We assumed that the recognition of emotions from facial expressions would be favored over the recognition of emotions communicated through music. In order to compare the success rates in recognizing emotions presented as facial expressions or in classical music works, we conducted a survey that included 90 elementary school and 87 high school students from Osijek (Croatia). The participants had to match 8 photographs of different emotions expressed on the face, and 8 pieces of classical music, with 8 offered emotions. The recognition of emotions expressed through classical music pieces was significantly less successful than the recognition of emotional facial expressions. The high school students were significantly better at recognizing facial emotions than the elementary school students, and girls were better than boys. The success rate in recognizing emotions from music pieces was associated with higher grades in mathematics. Basic emotions are far better recognized when presented on human faces than in music, possibly because understanding facial emotions is one of the oldest communication skills in human society. The female advantage in emotion recognition may have been selected for because of the necessity of communicating with newborns during early development. Proficiency in recognizing the emotional content of music and mathematical skills probably share some general cognitive abilities, such as attention, memory and motivation. Music pieces are probably processed differently in the brain than facial expressions and, consequently, evaluated differently as emotional cues.

  17. Extreme Facial Expressions Classification Based on Reality Parameters

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Rad, Abdolvahab Ehsani; Rehman, Amjad; Altameem, Ayman

    2014-09-01

    Extreme expressions are emotional expressions driven by strong emotion; crying, for example, is an extreme expression accompanied by tears. To reproduce such features, additional elements such as a fluid mechanism (particle system) and physics techniques such as Smoothed Particle Hydrodynamics (SPH) are introduced. The fusion of facial animation with SPH shows promising results. Accordingly, the proposed fluid technique combined with facial animation is the core of this research: it is used to obtain complex expressions that occur on the human face, such as laughing, smiling and crying (with the emergence of tears) up to strong crying, as an extreme-expression classification.

  18. Development of a System for Automatic Facial Expression Analysis

    NASA Astrophysics Data System (ADS)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

    Automatic recognition of facial expressions can be an important component of natural human-machine interactions. Although many samples are desirable for accurately estimating a person's feelings about a machine interface (e.g., how much they like it), in real-world situations only a small number of samples can be obtained, because collecting emotions from the observed person is costly. This paper proposes a system that solves this problem while accommodating individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, the proposed method achieved better generalization performance than previous HNN and Support Vector Machine (SVM) classifiers, while requiring less learning time than the SVM classifiers.

  19. Language and affective facial expression in children with perinatal stroke

    PubMed Central

    Lai, Philip T.; Reilly, Judy S.

    2015-01-01

    Children with perinatal stroke (PS) provide a unique opportunity to understand developing brain-behavior relations. Previous research has noted distinctive differences in behavioral sequelae between children with PS and adults with acquired stroke: children fare better, presumably due to the plasticity of the developing brain for adaptive reorganization. Whereas we are beginning to understand language development, we know little about another communicative domain, emotional expression. The current study investigates the use and integration of language and facial expression during an interview. As anticipated, the language performance of the five- and six-year-old PS group is comparable to that of their typically developing (TD) peers; however, their affective profiles are distinctive: those with right hemisphere injury are less expressive with respect to affective language and affective facial expression than either those with left hemisphere injury or the TD group. The two distinctive profiles for language and emotional expression in these children suggest gradients of neuroplasticity in the developing brain. PMID:26117314

  20. Processing of emotional facial expressions in Korsakoff's syndrome.

    PubMed

    Montagne, Barbara; Kessels, Roy P C; Wester, Arie J; de Haan, Edward H F

    2006-07-01

    Interpersonal contacts depend to a large extent on understanding emotional facial expressions of others. Several neurological conditions may affect proficiency in emotional expression recognition. It has been shown that chronic alcoholics are impaired in labelling emotional expressions. More specifically, they mislabel sad expressions, regarding them as more hostile. Surprisingly, there has been relatively little research on patients with Korsakoff's syndrome as a result of chronic alcohol abuse. The current study investigated 23 patients diagnosed with Korsakoff's syndrome compared to 23 matched control participants. This study is the first to make use of a newly developed sensitive paradigm to measure emotion recognition for several emotions (anger, disgust, fear, happiness, sadness and surprise). The results show that patients with Korsakoff's syndrome are impaired at recognizing angry, fearful and surprised facial emotional expressions. These deficits might be due to the reported sub-cortical brain dysfunction in Korsakoff's syndrome.

  1. Visual decodification of some facial expressions through microimitation.

    PubMed

    Ruggieri, V; Fiorenza, M; Sabatini, N

    1986-04-01

    We examined the level of muscular tension of the mentalis muscle in 36 students of graphic design at rest and during the presentation of three slides reproducing facial expressions. Analysis showed an increase in the myographic level of the mentalis muscle from the third second of measurement onwards after the presentation of the slide in which contraction of the chin was involved. We interpret this result by hypothesizing that the decodification of some facial expressions is realized through a micro-reproduction of the stimulus by the decoding subject.

  2. Recognition of facial expressions of emotion in panic disorder.

    PubMed

    Cai, Liqiang; Chen, Wanzhen; Shen, Yuedi; Wang, Xinling; Wei, Lili; Zhang, Yingchun; Wang, Wei; Chen, Wei

    2012-01-01

    Whether patients with panic disorder behave differently when recognizing facial expressions of emotion remains unsettled. We tested 21 outpatients with panic disorder and 34 healthy subjects with a photo set from the Matsumoto and Ekman Japanese and Caucasian facial expressions of emotion, which includes anger, contempt, disgust, fear, happiness, sadness, and surprise. Compared to the healthy subjects, patients showed lower accuracy when recognizing disgust and fear, but higher accuracy when recognizing surprise. These results suggest that altered specificity to these emotions engages self-awareness mechanisms that prevent further emotional reactions in panic disorder patients.

  3. Neonatal pain facial expression: evaluating the primal face of pain.

    PubMed

    Schiavenato, Martin; Byers, Jacquie F; Scovanner, Paul; McMahon, James M; Xia, Yinglin; Lu, Naiji; He, Hua

    2008-08-31

    The primal face of pain (PFP) is postulated to be a common and universal facial expression to pain, hardwired and present at birth. We evaluated its presence by applying a computer-based methodology consisting of "point-pair" comparisons captured from video to measure facial movement in the pain expression by way of change across two images: one image before and one image after a painful stimulus (heel-stick). Similarity of facial expression was analyzed in a sample of 57 neonates representing both sexes and 3 ethnic backgrounds (African American, Caucasian and Hispanic/Latino) while controlling for these extraneous and potentially modulating factors: feeding type (bottle, breast, or both), behavioral state (awake or asleep), and use of epidural and/or other perinatal anesthesia. The PFP is consistent with previous reports of expression of pain in neonates and is characterized by opening of the mouth, drawing in of the brows, and closing of the eyes. Although facial expression was not identical across or among groups, our analyses showed no particular clustering or unique display by sex, or ethnicity. The clinical significance of this commonality of pain display, and of the origin of its potential individual variation begs further evaluation.
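
    The point-pair measure can be sketched in a few lines: given landmark coordinates annotated on the before and after images, facial movement is the change in distance within each pair. The coordinates and pairings below are hypothetical illustrations, not the study's data.

    ```python
    # Sketch: point-pair change between a pre-stimulus and post-stimulus image.
    import numpy as np

    before = np.array([[120, 200], [180, 200],   # brow landmarks (x, y)
                       [130, 260], [170, 260],   # eyelid landmarks
                       [140, 330], [160, 330]])  # mouth-corner landmarks
    after = np.array([[118, 206], [182, 206],
                      [132, 262], [168, 262],
                      [130, 345], [170, 345]])

    pairs = [(0, 1), (2, 3), (4, 5)]             # brow, eye, mouth point-pairs
    for i, j in pairs:
        d_before = np.linalg.norm(before[i] - before[j])
        d_after = np.linalg.norm(after[i] - after[j])
        print(f"pair {i}-{j}: change = {d_after - d_before:+.1f} px")
    ```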

  4. Adult age differences in the interpretation of surprised facial expressions.

    PubMed

    Shuster, Michael M; Mikels, Joseph A; Camras, Linda A

    2017-03-01

    Research on adult age differences in the interpretation of facial expressions has yet to examine evaluations of surprised faces, which signal that an unexpected and ambiguous event has occurred in the expresser's environment. The present study examined whether older and younger adults differed in their interpretations of the affective valence of surprised faces. Specifically, we examined older and younger participants' evaluations of happy, angry, and surprised facial expressions. We predicted that, on the basis of age-related changes in the processing of emotional information, older adults would evaluate surprised faces more positively than would younger adults. The results indicated that older adults interpreted surprised faces more positively than did their younger counterparts. These findings reveal a novel age-related positivity effect in the interpretation of surprised faces, suggesting that older adults imbue ambiguous facial expressions (that is, expressions that lack either positive or negative facial actions) with positive meaning.

  5. Exploring emotional and cognitive conflict using speeded voluntary facial expressions.

    PubMed

    Chiew, Kimberly S; Braver, Todd S

    2010-12-01

    Affective conflict and control may have important parallels to cognitive conflict and control, but these processes have been difficult to study quantitatively with emotionally naturalistic laboratory paradigms. The current study examines a modification of the AX-Continuous Performance Task (AX-CPT), a well-validated probe of cognitive conflict and control, for the study of emotional conflict. In the Emotional AX-CPT, speeded emotional facial expressions measured with electromyography (EMG) were used as the primary response modality and as an index of emotional conflict. Bottom-up emotional conflict occurred on trials in which precued facial expressions were incongruent with the valence of an emotionally evocative picture probe (e.g., smiling to a negative picture). A second form of top-down conflict occurred in which the facial expression and picture probe were congruent, but the opposite expression was expected based on the precue. A matched version of the task was also performed (in a separate group of participants) with affectively neutral probe stimuli. Behavioral interference was observed, in terms of response latencies and errors, on all conflict trials. However, bottom-up conflict was stronger in the emotional version of the task compared to the neutral version; top-down conflict was similar across the two versions. The results suggest that voluntary facial expressions may be more sensitive to indexing emotional than nonemotional conflict and, importantly, may provide an ecologically valid method of examining how emotional conflict manifests in behavior and brain activity.

  6. Processing emotional facial expressions: The role of anxiety and awareness

    PubMed Central

    FOX, ELAINE

    2012-01-01

    In this paper, the role of self-reported anxiety and degree of conscious awareness as determinants of the selective processing of affective facial expressions is investigated. In two experiments, an attentional bias toward fearful facial expressions was observed, although this bias was apparent only for those reporting high levels of trait anxiety and only when the emotional face was presented in the left visual field. This pattern was especially strong when the participants were unaware of the presence of the facial stimuli. In Experiment 3, a patient with right-hemisphere brain damage and visual extinction was presented with photographs of faces and fruits on unilateral and bilateral trials. On bilateral trials, it was found that faces produced less extinction than did fruits. Moreover, faces portraying a fearful or a happy expression tended to produce less extinction than did neutral expressions. This suggests that emotional facial expressions may be less dependent on attention to achieve awareness. The implications of these results for understanding the relations between attention, emotion, and anxiety are discussed. PMID:12452584

  7. The Enfacement Illusion Is Not Affected by Negative Facial Expressions

    PubMed Central

    Beck, Brianna; Cardini, Flavia; Làdavas, Elisabetta; Bertini, Caterina

    2015-01-01

    Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one’s own face to assimilate another person’s face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer’s motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant’s own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other’s face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing. PMID:26291532

  8. Interference between conscious and unconscious facial expression information.

    PubMed

    Ye, Xing; He, Sheng; Hu, Ying; Yu, Yong Qiang; Wang, Kai

    2014-01-01

    There is ample evidence to show that many types of visual information, including emotional information, can be processed in the absence of visual awareness. For example, it has been shown that masked subliminal facial expressions can induce priming and adaptation effects. However, stimuli made invisible in different ways may be processed to different extents and have differential effects. In this study, we adopted a flanker-type behavioral method to investigate whether a flanker rendered invisible through Continuous Flash Suppression (CFS) could induce a congruency effect on the discrimination of a visible target. Specifically, during the experiment, participants judged the expression (either happy or fearful) of a visible face in the presence of a nearby invisible face (with a happy or fearful expression). Results show that participants were slower and less accurate in discriminating the expression of the visible face when the expression of the invisible flanker face was incongruent. Thus, facial expression information rendered invisible with CFS and presented at a different spatial location can enhance or interfere with consciously processed facial expression information.

  9. N170 sensitivity to facial expression: A meta-analysis.

    PubMed

    Hinojosa, J A; Mercado, F; Carretié, L

    2015-08-01

    The N170 component is the most important electrophysiological index of face processing. Early studies concluded that it was insensitive to facial expression, thus supporting dual theories postulating separate mechanisms for identity and expression encoding. However, recent evidence contradicts this assumption. We conducted a meta-analysis to resolve inconsistencies and to derive theoretical implications. A systematic review of 128 studies analyzing the N170 in response to neutral and emotional expressions yielded 57 meta-analyzable experiments (involving 1645 healthy adults). First, the N170 was found to be sensitive to facial expressions, supporting proposals arguing for integrated rather than segregated mechanisms in the processing of identity and expression. Second, this sensitivity is heterogeneous, with anger, fear and happy faces eliciting the largest N170 amplitudes. Third, we explored some modulatory factors, including the focus of attention (the N170 amplitude was found to be sensitive even to unattended expressions) and the reference electrode (a common reference reinforced the effects). In sum, the N170 is a valuable tool for studying the neural processing of facial expressions and for developing current theories.

  10. The Enfacement Illusion Is Not Affected by Negative Facial Expressions.

    PubMed

    Beck, Brianna; Cardini, Flavia; Làdavas, Elisabetta; Bertini, Caterina

    2015-01-01

    Enfacement is an illusion wherein synchronous visual and tactile inputs update the mental representation of one's own face to assimilate another person's face. Emotional facial expressions, serving as communicative signals, may influence enfacement by increasing the observer's motivation to understand the mental state of the expresser. Fearful expressions, in particular, might increase enfacement because they are valuable for adaptive behavior and more strongly represented in somatosensory cortex than other emotions. In the present study, a face was seen being touched at the same time as the participant's own face. This face was either neutral, fearful, or angry. Anger was chosen as an emotional control condition for fear because it is similarly negative but induces less somatosensory resonance, and requires additional knowledge (i.e., contextual information and social contingencies) to effectively guide behavior. We hypothesized that seeing a fearful face (but not an angry one) would increase enfacement because of greater somatosensory resonance. Surprisingly, neither fearful nor angry expressions modulated the degree of enfacement relative to neutral expressions. Synchronous interpersonal visuo-tactile stimulation led to assimilation of the other's face, but this assimilation was not modulated by facial expression processing. This finding suggests that dynamic, multisensory processes of self-face identification operate independently of facial expression processing.

  11. Facial expression recognition based on improved DAGSVM

    NASA Astrophysics Data System (ADS)

    Luo, Yuan; Cui, Ye; Zhang, Yi

    2014-11-01

    To address the cumulative-error problem caused by the random ordering of classifiers in traditional DAGSVM (Directed Acyclic Graph Support Vector Machine) classification, this paper presents an improved DAGSVM expression recognition method. The method uses between-class distance and standard deviation as measures for ordering the component classifiers, which minimizes the error rate in the upper levels of the classification graph. Expression features are extracted by combining the Discrete Cosine Transform (DCT) with Local Binary Patterns (LBP) and are used as input to the improved DAGSVM classifier for recognition. Experimental results show that, compared with other multi-class support vector machine methods, the improved DAGSVM classifier achieves a higher recognition rate; experiments on an intelligent wheelchair platform further show that the method is robust.
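
    The DAGSVM decision procedure referred to above can be sketched as follows: one binary SVM is trained per class pair, and at test time classes are eliminated pairwise along a path through the decision graph until one remains. The features below are synthetic stand-ins for the paper's DCT+LBP vectors, and the sketch uses the standard DAG traversal rather than the paper's improved classifier ordering.

    ```python
    # Sketch: pairwise SVMs evaluated as a Directed Acyclic Graph (DAGSVM).
    from itertools import combinations
    import numpy as np
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    X, y = make_blobs(n_samples=300, centers=4, random_state=0)

    clf = {}
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        clf[(a, b)] = SVC(kernel="linear").fit(X[mask], y[mask])

    def dag_predict(x, classes):
        remaining = list(classes)
        while len(remaining) > 1:
            a, b = remaining[0], remaining[-1]
            winner = clf[(min(a, b), max(a, b))].predict(x.reshape(1, -1))[0]
            remaining.remove(b if winner == a else a)  # eliminate the loser
        return remaining[0]

    preds = np.array([dag_predict(x, np.unique(y)) for x in X])
    print("training accuracy:", (preds == y).mean())
    ```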

  12. Drug effects on responses to emotional facial expressions: recent findings

    PubMed Central

    Miller, Melissa A.; Bershad, Anya K.; de Wit, Harriet

    2016-01-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally. PMID:26226144

  13. Drug effects on responses to emotional facial expressions: recent findings.

    PubMed

    Miller, Melissa A; Bershad, Anya K; de Wit, Harriet

    2015-09-01

    Many psychoactive drugs increase social behavior and enhance social interactions, which may, in turn, increase their attractiveness to users. Although the psychological mechanisms by which drugs affect social behavior are not fully understood, there is some evidence that drugs alter the perception of emotions in others. Drugs can affect the ability to detect, attend to, and respond to emotional facial expressions, which in turn may influence their use in social settings. Either increased reactivity to positive expressions or decreased response to negative expressions may facilitate social interaction. This article reviews evidence that psychoactive drugs alter the processing of emotional facial expressions using subjective, behavioral, and physiological measures. The findings lay the groundwork for better understanding how drugs alter social processing and social behavior more generally.

  14. Fashioning the Face: Sensorimotor Simulation Contributes to Facial Expression Recognition.

    PubMed

    Wood, Adrienne; Rychlowska, Magdalena; Korb, Sebastian; Niedenthal, Paula

    2016-03-01

    When we observe a facial expression of emotion, we often mimic it. This automatic mimicry reflects underlying sensorimotor simulation that supports accurate emotion recognition. Why this is so is becoming more obvious: emotions are patterns of expressive, behavioral, physiological, and subjective feeling responses. Activation of one component can therefore automatically activate other components. When people simulate a perceived facial expression, they partially activate the corresponding emotional state in themselves, which provides a basis for inferring the underlying emotion of the expresser. We integrate recent evidence in favor of a role for sensorimotor simulation in emotion recognition. We then connect this account to a domain-general understanding of how sensory information from multiple modalities is integrated to generate perceptual predictions in the brain.

  15. Visual Exploration of Emotional Facial Expressions in Parkinson's Disease

    PubMed Central

    Clark, Uraina S.; Neargarder, Sandy; Cronin-Golomb, Alice

    2010-01-01

    Parkinson's disease (PD) is associated with impairments in facial emotion recognition as well as visual and executive dysfunction. We investigated whether facial emotion categorization impairments in PD are attributable to visual scanning abnormalities by recording the eye movements of 16 non-demented PD and 20 healthy control (HC) participants during an emotion recognition task. We examined the influence of several factors that can affect visual scanning, including oculomotor, basic visual, and cognitive abilities (executive function). Increases in the number and duration of fixations in the top regions of surprise facial expressions were related to increases in recognition accuracy for this emotion in PD participants with left-sided motor symptom onset. Compared to HC men, HC women spent less time fixating on fearful expressions. PD participants displayed oculomotor abnormalities (antisaccades), but these were unrelated to scanning patterns. Performance on visual measures (acuity, contrast sensitivity) correlated with scanning patterns in the PD group only. Poorer executive function was associated with longer fixation times in PD and with a greater number of fixations in HC. Our findings indicate a specific relation between facial emotion categorization impairments and scanning of facial expressions in PD. PD and HC participants' scanning behaviors during an emotion categorization task were driven by different perceptual processes and cognitive strategies. Our results underscore the need to consider differences in perceptual and cognitive abilities in studies of visual scanning, particularly when examining this ability in patient populations for which both vision and cognition are impaired. PMID:20230847

  16. Impaired holistic coding of facial expression and facial identity in congenital prosopagnosia

    PubMed Central

    Palermo, Romina; Willis, Megan L.; Rivolta, Davide; McKone, Elinor; Wilson, C. Ellie; Calder, Andrew J.

    2011-01-01

    We test 12 individuals with congenital prosopagnosia (CP), who replicate a common pattern of showing severe difficulty in recognising facial identity in conjunction with normal recognition of facial expressions (both basic and ‘social’). Strength of holistic processing was examined using standard expression composite and identity composite tasks. Compared to age- and sex-matched controls, group analyses demonstrated that CPs showed weaker holistic processing, for both expression and identity information. Implications are (a) normal expression recognition in CP can derive from compensatory strategies (e.g., over-reliance on non-holistic cues to expression); (b) the split between processing of expression and identity information may take place after a common stage of holistic processing; and (c) contrary to a recent claim, holistic processing of identity is functionally involved in face identification ability. PMID:21333662

  17. Face Processing in Children with Autism Spectrum Disorder: Independent or Interactive Processing of Facial Identity and Facial Expression?

    ERIC Educational Resources Information Center

    Krebs, Julia F.; Biswas, Ajanta; Pascalis, Olivier; Kamp-Becker, Inge; Remschmidt, Helmuth; Schwarzer, Gudrun

    2011-01-01

    The current study investigated whether deficits in processing emotional expression affect facial identity processing and vice versa in children with autism spectrum disorder. Children with autism and IQ- and age-matched typically developing children classified faces either by emotional expression, thereby ignoring facial identity, or by facial identity…

  18. Facial Expression Presentation for Real-Time Internet Communication

    NASA Astrophysics Data System (ADS)

    Dugarry, Alexandre; Berrada, Aida; Fu, Shan

    2003-01-01

    Text, voice and video images are the most common forms of media content for instant communication on the Internet. Studies have shown that facial expressions convey much richer information than text and voice during a face-to-face conversation. The currently available real-time means of communication (instant text messages, chat programs and videoconferencing), however, have major drawbacks in terms of exchanging facial expression. The first two do not involve image transmission, whilst videoconferencing requires a large bandwidth that is not always available, and the transmitted image sequence is neither smooth nor free of delay. The objective of the work presented here is to develop a technique that overcomes these limitations by extracting speakers' facial expressions and realising real-time communication. To capture the facial expressions, the main characteristics of the image are first emphasised. Interpolation is then performed on previously detected edge points to create geometric shapes such as arcs and lines. The regional dominant colours of the pictures are also extracted, and the combined results are subsequently converted into Scalable Vector Graphics (SVG) format. The application based on the proposed technique is intended to run alongside chat programs on any platform.
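
    The pipeline described above (edge detection, geometric approximation, colour extraction, SVG serialisation) can be illustrated in a few lines. The following Python sketch is a rough approximation of the idea, not the authors' implementation: it uses OpenCV to trace a frame's dominant edge contours and serialise them as SVG polylines. The Canny thresholds, contour limit and simplification tolerance are assumptions made for the example.

      import cv2  # OpenCV, assumed available

      def frame_to_svg(frame, low=50, high=150, max_contours=200):
          """Convert a video frame into a compact SVG of its dominant edges."""
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          edges = cv2.Canny(gray, low, high)
          contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
          h, w = gray.shape
          paths = []
          for c in sorted(contours, key=cv2.contourArea, reverse=True)[:max_contours]:
              # Approximate each contour with short line segments (the paper
              # also fits arcs); 2.0 px is an assumed simplification tolerance.
              approx = cv2.approxPolyDP(c, 2.0, False)
              pts = " ".join(f"{p[0][0]},{p[0][1]}" for p in approx)
              paths.append(f'<polyline points="{pts}" fill="none" stroke="black"/>')
          return (f'<svg xmlns="http://www.w3.org/2000/svg" width="{w}" height="{h}">'
                  + "".join(paths) + "</svg>")

    An SVG document of this kind is typically far smaller than the raw frame, which is what makes the low-bandwidth transmission described in the abstract plausible.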

  19. Facial Expressions in Context: Contributions to Infant Emotion Theory.

    ERIC Educational Resources Information Center

    Camras, Linda A.

    To make the point that infant emotions are more dynamic than suggested by Differential Emotions Theory, which maintains that infants show the same prototypical facial expressions for emotions as adults do, this paper explores two questions: (1) when infants experience an emotion, do they always show the corresponding prototypical facial…

  1. Detection of Deception in Adults and Children via Facial Expressions.

    ERIC Educational Resources Information Center

    Feldman, Robert S.; And Others

    1979-01-01

    Examines the effect of age of encoder (first graders, seventh graders, and college students) on the decoding of nonverbal facial expressions indicative of verbal deception. Results showed the ratings of untrained, naive adult judges to be more accurate in decoding the first-grade stimulus persons than the older ones. (JMB)

  2. Further Evidence on Preschoolers' Interpretation of Facial Expressions.

    ERIC Educational Resources Information Center

    Bullock, Merry; Russell, James A.

    1985-01-01

    Assessed through two studies the organization and basis for preschool children's (n=240) and adults' (n=60) categorization of emotions. In one, children and adults chose facial expressions that exemplify emotion categories such as fear, anger, and happiness. In another they grouped emotions differing in arousal level or pleasure-displeasure…

  3. Perceived Bias in the Facial Expressions of Television News Broadcasters.

    ERIC Educational Resources Information Center

    Friedman, Howard S.; And Others

    1980-01-01

    Studied the nuances of perceived media bias in television reporting of the 1976 Presidential election campaign, comparing the adjudged positivity of the facial expressions of network anchorpersons as they named or referred to either of the two candidates. (JMF)

  4. Teachers' Perception Regarding Facial Expressions as an Effective Teaching Tool

    ERIC Educational Resources Information Center

    Butt, Muhammad Naeem; Iqbal, Mohammad

    2011-01-01

    The major objective of the study was to explore teachers' perceptions about the importance of facial expression in the teaching-learning process. All the teachers of government secondary schools constituted the population of the study. A sample of 40 teachers, both male and female, in rural and urban areas of district Peshawar, was selected…

  5. Categorical Perception of Emotional Facial Expressions in Preschoolers

    ERIC Educational Resources Information Center

    Cheal, Jenna L.; Rutherford, M. D.

    2011-01-01

    Adults perceive emotional facial expressions categorically. In this study, we explored categorical perception in 3.5-year-olds by creating a morphed continuum of emotional faces and tested preschoolers' discrimination and identification of them. In the discrimination task, participants indicated whether two examples from the continuum "felt the…

  6. Categorical Representation of Facial Expressions in the Infant Brain

    ERIC Educational Resources Information Center

    Leppanen, Jukka M.; Richmond, Jenny; Vogel-Farley, Vanessa K.; Moulson, Margaret C.; Nelson, Charles A.

    2009-01-01

    Categorical perception, demonstrated as reduced discrimination of within-category relative to between-category differences in stimuli, has been found in a variety of perceptual domains in adults. To examine the development of categorical perception in the domain of facial expression processing, we used behavioral and event-related potential (ERP)…

  7. Specificity of Facial Expression Labeling Deficits in Childhood Psychopathology

    ERIC Educational Resources Information Center

    Guyer, Amanda E.; McClure, Erin B.; Adler, Abby D.; Brotman, Melissa A.; Rich, Brendan A.; Kimes, Alane S.; Pine, Daniel S.; Ernst, Monique; Leibenluft, Ellen

    2007-01-01

    Background: We examined whether face-emotion labeling deficits are illness-specific or an epiphenomenon of generalized impairment in pediatric psychiatric disorders involving mood and behavioral dysregulation. Method: Two hundred fifty-two youths (7-18 years old) completed child and adult facial expression recognition subtests from the Diagnostic…

  8. Generating Facial Expressions Using an Anatomically Accurate Biomechanical Model.

    PubMed

    Wu, Tim; Hung, Alice; Mithraratne, Kumar

    2014-11-01

    This paper presents a computational framework for modelling the biomechanics of human facial expressions. A detailed high-order (Cubic-Hermite) finite element model of the human head was constructed using anatomical data segmented from magnetic resonance images. The model includes a superficial soft-tissue continuum consisting of skin, the subcutaneous layer and the superficial Musculo-Aponeurotic system. Embedded within this continuum mesh are 20 pairs of facial muscles, which drive facial expressions. These muscles were treated as transversely isotropic, and their anatomical geometries and fibre orientations were accurately depicted. In order to capture the relative composition of muscles and fat, material heterogeneity was also introduced into the model. Complex contact interactions between the lips and eyelids, and between the superficial soft-tissue continuum and the deep rigid skeletal bones, were also computed. In addition, this paper investigates the impact of incorporating material heterogeneity and contact interactions, which are often neglected in similar studies. Four facial expressions were simulated using the developed model and the results were compared with surface data obtained from a 3D structured-light scanner. Predicted expressions showed good agreement with the experimental data.

  9. [The neural networks of facial expression].

    PubMed

    Gordillo, F; Mestas, L; Castillo, G; Perez, M A; Lopez, R M; Arana, J M

    2017-02-01

    Introduction. Face perception involves a wide network of connections between cortical and subcortical regions that exchange and synchronise information through white-matter tracts. This precise communication system can be affected both at the level of the structures themselves and through the pathways that connect them. Aims. To delimit the neural substrate underlying the perception of facial expression and to analyse the different factors that modulate the integrity of this neural network, with the aim of proposing improvements to rehabilitation programmes. Development. When the complex network of connections involved in the perception of facial expression is altered by trauma, neurodegenerative disease, developmental disorders, or even by social isolation or negative contexts, the capacity to interact adaptively with the environment also deteriorates. Conclusions. Restoring the integrity of the neural network responsible for processing facial expression requires taking into account different variables that, to a greater or lesser degree, have proved capable of modifying the structure or functionality of neural networks, such as aerobic training, transcranial magnetic stimulation, transcranial electrical stimulation and learning, although these variables are conditioned by age, by the type and course of the disorder, and by the generating context, which raises the need for rehabilitation protocols that are tailored and oriented towards delimiting the neural substrate of the deficit.

  10. Gaze Dynamics in the Recognition of Facial Expressions of Emotion.

    PubMed

    Barabanschikov, Vladimir A

    2015-01-01

    We studied which parts and features of the human face are preferentially fixated during the recognition of facial expressions of emotion. Photographs of facial expressions were used; participants categorized them as basic emotions while their eye movements were recorded. Variation in the intensity of an expression was mirrored in the accuracy of emotion recognition and was also reflected in several indices of oculomotor function: the duration of inspection of certain areas of the face (its upper and lower parts, right and left sides), as well as the location, number and duration of fixations and the viewing trajectory. In particular, for low-intensity expressions the right side of the face was attended to predominantly (right-side dominance); this right-side dominance effect was, however, absent for expressions of high intensity. For both low- and high-intensity expressions the upper part of the face was predominantly fixated, with greater fixation for high-intensity expressions. The majority of trials (70%), in line with findings in previous studies, revealed a V-shaped inspection trajectory. However, no relationship was found between the accuracy of recognition of emotional expressions and either the location and duration of fixations or the pattern of gaze direction across the face.

  11. Active AU Based Patch Weighting for Facial Expression Recognition

    PubMed Central

    Xie, Weicheng; Shen, Linlin; Yang, Meng; Lai, Zhihui

    2017-01-01

    Facial expression recognition has many applications in human-computer interaction. Although feature extraction and selection have been well studied, the specificity of each expression variation is not fully explored in state-of-the-art work. In this work, the problem of multiclass expression recognition is converted into triplet-wise expression recognition. For each expression triplet, a new feature optimization model based on action unit (AU) weighting and patch weight optimization is proposed to represent the specificity of the expression triplet. A sparse representation-based approach is then proposed to detect the active AUs of the testing sample for better generalization. The algorithm achieved competitive accuracies of 89.67% and 94.09% on the JAFFE and Cohn–Kanade (CK+) databases, respectively. Better cross-database performance has also been observed. PMID:28146094
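
    The sparse-representation step mentioned in the abstract can be sketched with off-the-shelf tools. Below is a minimal, hypothetical illustration (not the authors' code): a test sample's features are decomposed over a dictionary of AU templates with an L1-penalised regression, and AUs with large coefficients are declared active. The dictionary, feature dimensionality and threshold are made-up stand-ins.

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(0)
      # Hypothetical dictionary: each column holds the feature vector of one
      # action-unit (AU) template; 128 and 20 are illustrative sizes only.
      au_dictionary = rng.normal(size=(128, 20))

      def active_aus(test_features, threshold=0.1):
          """Return indices of AUs whose sparse coefficients exceed a threshold."""
          lasso = Lasso(alpha=0.05, fit_intercept=False, max_iter=5000)
          lasso.fit(au_dictionary, test_features)  # min ||y - Dw||^2 + alpha*|w|_1
          return np.flatnonzero(np.abs(lasso.coef_) > threshold)

      # A noisy copy of template 3 should be decoded as AU index 3.
      sample = au_dictionary[:, 3] + 0.01 * rng.normal(size=128)
      print(active_aus(sample))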

  12. Facial Expression Influences Face Identity Recognition During the Attentional Blink

    PubMed Central

    Bach, Dominik R.; Schmidt-Daffy, Martin; Dolan, Raymond J.

    2014-01-01

    Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry—suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppress memory access for competing objects, but only angry facial expression enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another. PMID:25286076

  13. Facial expression influences face identity recognition during the attentional blink.

    PubMed

    Bach, Dominik R; Schmidt-Daffy, Martin; Dolan, Raymond J

    2014-12-01

    Emotional stimuli (e.g., negative facial expressions) enjoy prioritized memory access when task relevant, consistent with their ability to capture attention. Whether emotional expression also impacts on memory access when task-irrelevant is important for arbitrating between feature-based and object-based attentional capture. Here, the authors address this question in 3 experiments using an attentional blink task with face photographs as first and second target (T1, T2). They demonstrate reduced neutral T2 identity recognition after angry or happy T1 expression, compared to neutral T1, and this supports attentional capture by a task-irrelevant feature. Crucially, after neutral T1, T2 identity recognition was enhanced and not suppressed when T2 was angry, suggesting that attentional capture by this task-irrelevant feature may be object-based and not feature-based. As an unexpected finding, both angry and happy facial expressions suppress memory access for competing objects, but only angry facial expression enjoyed privileged memory access. This could imply that these 2 processes are relatively independent from one another.

  14. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry?

    PubMed

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2016-01-01

    Facial mimicry is the spontaneous response to others' facial expressions by mirroring or matching the interaction partner. Recent evidence suggests that mimicry may not be merely an automatic reaction but may depend on many factors, including social context, the type of task in which the participant is engaged, and stimulus properties (dynamic vs static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgment of emotional intensity. Electromyographic activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. The ratings of the emotional intensity of facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major muscle activity in response to dynamic happiness stimuli than static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing.

  15. Do Dynamic Compared to Static Facial Expressions of Happiness and Anger Reveal Enhanced Facial Mimicry?

    PubMed Central

    Rymarczyk, Krystyna; Żurawski, Łukasz; Jankowiak-Siuda, Kamila; Szatkowska, Iwona

    2016-01-01

    Facial mimicry is the spontaneous response to others’ facial expressions by mirroring or matching the interaction partner. Recent evidence suggests that mimicry may not be merely an automatic reaction but may depend on many factors, including social context, the type of task in which the participant is engaged, and stimulus properties (dynamic vs static presentation). In the present study, we investigated the impact of dynamic facial expression and sex differences on facial mimicry and judgment of emotional intensity. Electromyographic activity was recorded from the corrugator supercilii, zygomaticus major, and orbicularis oculi muscles during passive observation of static and dynamic images of happiness and anger. The ratings of the emotional intensity of facial expressions were also analysed. As predicted, dynamic expressions were rated as more intense than static ones. Compared to static images, dynamic displays of happiness also evoked stronger activity in the zygomaticus major and orbicularis oculi, suggesting that subjects experienced positive emotion. No muscles showed mimicry activity in response to angry faces. Moreover, we found that women exhibited greater zygomaticus major muscle activity in response to dynamic happiness stimuli than static stimuli. Our data support the hypothesis that people mimic positive emotions and confirm the importance of dynamic stimuli in some emotional processing. PMID:27390867
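
    Neither record above specifies the analysis pipeline, but a standard quantification of facial mimicry from such recordings is a baseline-corrected EMG amplitude per muscle and condition. The sketch below, with an assumed sampling rate, window length and function name, shows one plausible way to compute it; none of it is taken from the paper.

      import numpy as np

      def emg_response(signal, fs, baseline_s=1.0):
          """Relative change in mean rectified EMG amplitude after stimulus
          onset, versus the pre-stimulus baseline. The 1 s baseline window
          is an illustrative assumption."""
          n_base = int(baseline_s * fs)
          rectified = np.abs(signal)
          baseline = rectified[:n_base].mean()
          return (rectified[n_base:].mean() - baseline) / baseline

      # Toy zygomaticus trace whose amplitude doubles after onset (a "smile").
      fs = 1000  # Hz
      t = np.arange(0, 3.0, 1 / fs)
      trace = np.random.default_rng(1).normal(0, 1, t.size) * np.where(t > 1.0, 2.0, 1.0)
      print(f"relative zygomaticus response: {emg_response(trace, fs):.2f}")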

  16. Recognition of facial expressions in obsessive-compulsive disorder.

    PubMed

    Corcoran, Kathleen M; Woody, Sheila R; Tolin, David F

    2008-01-01

    Sprengelmeyer et al. [Sprengelmeyer, R., Young, A. W., Pundt, I., Sprengelmeyer, A., Calder, A. J., Berrios, G., et al. (1997). Disgust implicated in obsessive-compulsive disorder. Proceedings of the Royal Society of London, 264, 1767-1773] found that patients with OCD showed severely impaired recognition of facial expressions of disgust. This result has potential to provide a unique window into the psychopathology of OCD, but several published attempts to replicate this finding have failed. The current study compared OCD patients to normal controls and panic disorder patients on ability to recognize facial expressions of negative emotions. Overall, the OCD patients were impaired in their ability to recognize disgust expressions, but only 33% of patients showed this deficit. These deficits were related to OCD symptom severity and general functioning, factors that may account for the inconsistent findings observed in different laboratories.

  17. Forming impressions: effects of facial expression and gender stereotypes.

    PubMed

    Hack, Tay

    2014-04-01

    The present study of 138 participants explored how facial expressions and gender stereotypes influence impressions. It was predicted that images of smiling women would be evaluated more favorably on traits reflecting warmth, and that images of non-smiling men would be evaluated more favorably on traits reflecting competence. As predicted, smiling female faces were rated as more warm; however, contrary to prediction, perceived competence of male faces was not affected by facial expression. Participants' female stereotype endorsement was a significant predictor for evaluations of female faces; those who ascribed more strongly to traditional female stereotypes reported the most positive impressions of female faces displaying a smiling expression. However, a similar effect was not found for images of men; endorsement of traditional male stereotypes did not predict participants' impressions of male faces.

  18. Making faces: posed facial expression, self-competence, and personality.

    PubMed

    Browne, B A

    1994-03-01

    This study examined the relationships between posed facial expression, children's perceived self-competence, and teachers' perceptions of competence. Third- and fifth-grade children completed the Self-Perception Profile for Children, the Eysenck Personality Questionnaire, and the Junior Self-Monitoring Scale for Children. Individual differences in posing accuracy were determined with a videotaped acting task. Children who were more able to produce prototypical expressions obtained higher teacher ratings of academic competence; however, posing ability bore little relationship to children's self-competence. Gender differences in feelings of self-competence, but not in sending ability, were observed. Extraversion and self-monitoring were unrelated to ability to pose emotional expressions.

  1. Judgment of other people's facial expressions of emotions is influenced by their concurrent affective hand movements.

    PubMed

    Hietanen, Jari K; Leppänen, Jukka M

    2008-06-01

    We investigated whether emotional information from facial expression and hand movement quality was integrated when identifying the expression of a compound stimulus showing a static facial expression combined with emotionally expressive dynamic manual actions. The emotions (happiness, neutrality, and anger) expressed by the face and hands were either congruent or incongruent. In Experiment 1, the participants judged whether the stimulus person was happy, neutral, or angry. Judgments were mainly based on the facial expressions, but were affected by manual expressions to some extent. In Experiment 2, the participants were instructed to base their judgment on the facial expression only. An effect of hand movement expressive quality was observed for happy facial expressions. The results conform with the proposal that perception of facial expressions of emotions can be affected by the expressive qualities of hand movements.

  2. Neuropsychological Studies of Linguistic and Affective Facial Expressions in Deaf Signers.

    ERIC Educational Resources Information Center

    Corina, David P.; Bellugi, Ursula; Reilly, Judy

    1999-01-01

    Presents two studies that explore facial expression production in deaf signers. An experimental paradigm uses chimeric stimuli of American Sign Language linguistic and facial expressions to explore patterns of productive asymmetries in brain-intact signers. (Author/VWL)

  3. Facial expressions and the evolution of the speech rhythm.

    PubMed

    Ghazanfar, Asif A; Takahashi, Daniel Y

    2014-06-01

    In primates, different vocalizations are produced, at least in part, by making different facial expressions. Not surprisingly, humans, apes, and monkeys all recognize the correspondence between vocalizations and the facial postures associated with them. However, one major dissimilarity between monkey vocalizations and human speech is that, in the latter, the acoustic output and associated movements of the mouth are both rhythmic (in the 3- to 8-Hz range) and tightly correlated, whereas monkey vocalizations have a similar acoustic rhythmicity but lack the concomitant rhythmic facial motion. This raises the question of how we evolved from a presumptive ancestral acoustic-only vocal rhythm to the one that is audiovisual with improved perceptual sensitivity. According to one hypothesis, this bisensory speech rhythm evolved through the rhythmic facial expressions of ancestral primates. If this hypothesis has any validity, we expect that the extant nonhuman primates produce at least some facial expressions with a speech-like rhythm in the 3- to 8-Hz frequency range. Lip smacking, an affiliative signal observed in many genera of primates, satisfies this criterion. We review a series of studies using developmental, x-ray cineradiographic, EMG, and perceptual approaches with macaque monkeys producing lip smacks to further investigate this hypothesis. We then explore its putative neural basis and remark on important differences between lip smacking and speech production. Overall, the data support the hypothesis that lip smacking may have been an ancestral expression that was linked to vocal output to produce the original rhythmic audiovisual speech-like utterances in the human lineage.

  4. Facial Expressions and the Evolution of the Speech Rhythm

    PubMed Central

    Ghazanfar, Asif A.; Takahashi, Daniel Y.

    2015-01-01

    In primates, different vocalizations are produced, at least in part, by making different facial expressions. Not surprisingly, humans, apes, and monkeys all recognize the correspondence between vocalizations and the facial postures associated with them. However, one major dissimilarity between monkey vocalizations and human speech is that, in the latter, the acoustic output and associated movements of the mouth are both rhythmic (in the 3- to 8-Hz range) and tightly correlated, whereas monkey vocalizations have a similar acoustic rhythmicity but lack the concomitant rhythmic facial motion. This raises the question of how we evolved from a presumptive ancestral acoustic-only vocal rhythm to the one that is audiovisual with improved perceptual sensitivity. According to one hypothesis, this bisensory speech rhythm evolved through the rhythmic facial expressions of ancestral primates. If this hypothesis has any validity, we expect that the extant nonhuman primates produce at least some facial expressions with a speech-like rhythm in the 3- to 8-Hz frequency range. Lip smacking, an affiliative signal observed in many genera of primates, satisfies this criterion. We review a series of studies using developmental, x-ray cineradiographic, EMG, and perceptual approaches with macaque monkeys producing lip smacks to further investigate this hypothesis. We then explore its putative neural basis and remark on important differences between lip smacking and speech production. Overall, the data support the hypothesis that lip smacking may have been an ancestral expression that was linked to vocal output to produce the original rhythmic audiovisual speech-like utterances in the human lineage. PMID:24456390

  5. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs

    PubMed Central

    Olszanowski, Michal; Pochwatko, Grzegorz; Kuklinski, Krzysztof; Scibor-Rylski, Michal; Lewinski, Peter; Ohme, Rafal K.

    2015-01-01

    Emotional facial expressions play a critical role in theories of emotion and figure prominently in research on almost every aspect of emotion. This article provides a background for a new database of basic emotional expressions. The goal in creating this set was to provide high quality photographs of genuine facial expressions. Thus, after proper training, participants were inclined to express “felt” emotions. The novel approach taken in this study was also used to establish whether a given expression was perceived as intended by untrained judges. The judgment task for perceivers was designed to be sensitive to subtle changes in meaning caused by the way an emotional display was evoked and expressed. Consequently, this allowed us to measure the purity and intensity of emotional displays, which are parameters that validation methods used by other researchers do not capture. The final set comprises those pictures that received the highest recognition marks (e.g., accuracy with intended display) from independent judges, totaling 210 high quality photographs of 30 individuals. Descriptions of the accuracy, intensity, and purity of displayed emotion as well as FACS AU codes are provided for each picture. Given the unique methodology applied to gathering and validating this set of pictures, it may be a useful tool for research using face stimuli. The Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) is freely accessible to the scientific community for non-commercial use by request at http://www.emotional-face.org. PMID:25601846

  6. Warsaw set of emotional facial expression pictures: a validation study of facial display photographs.

    PubMed

    Olszanowski, Michal; Pochwatko, Grzegorz; Kuklinski, Krzysztof; Scibor-Rylski, Michal; Lewinski, Peter; Ohme, Rafal K

    2014-01-01

    Emotional facial expressions play a critical role in theories of emotion and figure prominently in research on almost every aspect of emotion. This article provides a background for a new database of basic emotional expressions. The goal in creating this set was to provide high quality photographs of genuine facial expressions. Thus, after proper training, participants were inclined to express "felt" emotions. The novel approach taken in this study was also used to establish whether a given expression was perceived as intended by untrained judges. The judgment task for perceivers was designed to be sensitive to subtle changes in meaning caused by the way an emotional display was evoked and expressed. Consequently, this allowed us to measure the purity and intensity of emotional displays, which are parameters that validation methods used by other researchers do not capture. The final set comprises those pictures that received the highest recognition marks (e.g., accuracy with intended display) from independent judges, totaling 210 high quality photographs of 30 individuals. Descriptions of the accuracy, intensity, and purity of displayed emotion as well as FACS AU codes are provided for each picture. Given the unique methodology applied to gathering and validating this set of pictures, it may be a useful tool for research using face stimuli. The Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) is freely accessible to the scientific community for non-commercial use by request at http://www.emotional-face.org.
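
    The abstract reports accuracy, intensity, and purity scores for each photograph but does not give formulas. The following sketch shows one plausible operationalisation of such parameters from judges' multi-scale ratings; the rating format, value range, and the purity definition are assumptions for illustration only, not the WSEFEP procedure.

      import numpy as np

      # Hypothetical ratings: rows are judges, columns are emotion scales
      # (0-6), with column 0 the intended display. Values are invented.
      ratings = np.array([[6, 1, 0, 0, 0, 0],
                          [5, 0, 1, 0, 0, 0],
                          [6, 2, 0, 1, 0, 0]], dtype=float)
      intended = 0

      accuracy = np.mean(ratings.argmax(axis=1) == intended)  # agreement with intended display
      intensity = ratings[:, intended].mean()                 # mean strength of intended emotion
      purity = ratings[:, intended].sum() / ratings.sum()     # share of total endorsement
      print(accuracy, intensity, purity)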

  7. Facial Expression Recognition Deficits and Faulty Learning: Implications for Theoretical Models and Clinical Applications

    ERIC Educational Resources Information Center

    Sheaffer, Beverly L.; Golden, Jeannie A.; Averett, Paige

    2009-01-01

    The ability to recognize facial expressions of emotion is integral in social interaction. Although the importance of facial expression recognition is reflected in increased research interest as well as in popular culture, clinicians may know little about this topic. The purpose of this article is to discuss facial expression recognition literature…

  8. Using Video Modeling to Teach Children with PDD-NOS to Respond to Facial Expressions

    ERIC Educational Resources Information Center

    Axe, Judah B.; Evans, Christine J.

    2012-01-01

    Children with autism spectrum disorders often exhibit delays in responding to facial expressions, and few studies have examined teaching responding to subtle facial expressions to this population. We used video modeling to train 3 participants with PDD-NOS (age 5) to respond to eight facial expressions: approval, bored, calming, disapproval,…

  9. Americans and Palestinians judge spontaneous facial expressions of emotion.

    PubMed

    Kayyal, Mary H; Russell, James A

    2013-10-01

    The claim that certain emotions are universally recognized from facial expressions is based primarily on the study of expressions that were posed. The current study was of spontaneous facial expressions shown by aborigines in Papua New Guinea (Ekman, 1980); 17 faces claimed to convey one (or, in the case of blends, two) basic emotions and five faces claimed to show other universal feelings. For each face, participants rated the degree to which each of the 12 predicted emotions or feelings was conveyed. The modal choice for English-speaking Americans (n = 60), English-speaking Palestinians (n = 60), and Arabic-speaking Palestinians (n = 44) was the predicted label for only 4, 5, and 4, respectively, of the 17 faces for basic emotions, and for only 2, 2, and 2, respectively, of the 5 faces for other feelings. Observers endorsed the predicted emotion or feeling moderately often (65%, 55%, and 44%), but also denied it moderately often (35%, 45%, and 56%). They also endorsed more than one (or, for blends, two) label(s) in each face: on average, 2.3, 2.3, and 1.5 of basic emotions and 2.6, 2.2, and 1.5 of other feelings. There were both similarities and differences across culture and language, but the emotional meaning of a facial expression is not well captured by the predicted label(s) or, indeed, by any single label.

  10. Neural processing of dynamic emotional facial expressions in psychopaths

    PubMed Central

    Decety, Jean; Skelly, Laurie; Yoder, Keith J.; Kiehl, Kent A.

    2014-01-01

    Facial expressions play a critical role in social interactions by eliciting rapid responses in the observer. Failure to perceive and experience a normal range and depth of emotion seriously impacts interpersonal communication and relationships. As has been demonstrated across a number of domains, abnormal emotion processing in individuals with psychopathy plays a key role in their lack of empathy. However, the neuroimaging literature is unclear as to whether deficits are specific to particular emotions such as fear and perhaps sadness. Moreover, findings are inconsistent across studies. In the current experiment, eighty adult incarcerated males scoring high, medium, and low on the Hare Psychopathy Checklist-Revised (PCL-R) underwent fMRI scanning while viewing dynamic facial expressions of fear, sadness, happiness and pain. Participants who scored high on the PCL-R showed a reduction in neuro-hemodynamic response to all four categories of facial expressions in the face processing network (inferior occipital gyrus, fusiform gyrus, STS) as well as the extended network (inferior frontal gyrus and orbitofrontal cortex), which supports a pervasive deficit across emotion domains. Unexpectedly, the response in dorsal insula to fear, sadness and pain was greater in psychopaths than non-psychopaths. Importantly, the orbitofrontal cortex and ventromedial prefrontal cortex, regions critically implicated in affective and motivated behaviors, were significantly less active in individuals with psychopathy during the perception of all four emotional expressions. PMID:24359488

  11. Neural processing of dynamic emotional facial expressions in psychopaths.

    PubMed

    Decety, Jean; Skelly, Laurie; Yoder, Keith J; Kiehl, Kent A

    2014-02-01

    Facial expressions play a critical role in social interactions by eliciting rapid responses in the observer. Failure to perceive and experience a normal range and depth of emotion seriously impacts interpersonal communication and relationships. As has been demonstrated across a number of domains, abnormal emotion processing in individuals with psychopathy plays a key role in their lack of empathy. However, the neuroimaging literature is unclear as to whether deficits are specific to particular emotions such as fear and perhaps sadness. Moreover, findings are inconsistent across studies. In the current experiment, 80 incarcerated adult males scoring high, medium, and low on the Hare Psychopathy Checklist-Revised (PCL-R) underwent functional magnetic resonance imaging (fMRI) scanning while viewing dynamic facial expressions of fear, sadness, happiness, and pain. Participants who scored high on the PCL-R showed a reduction in neuro-hemodynamic response to all four categories of facial expressions in the face processing network (inferior occipital gyrus, fusiform gyrus, and superior temporal sulcus (STS)) as well as the extended network (inferior frontal gyrus and orbitofrontal cortex (OFC)), which supports a pervasive deficit across emotion domains. Unexpectedly, the response in dorsal insula to fear, sadness, and pain was greater in psychopaths than non-psychopaths. Importantly, the orbitofrontal cortex and ventromedial prefrontal cortex (vmPFC), regions critically implicated in affective and motivated behaviors, were significantly less active in individuals with psychopathy during the perception of all four emotional expressions.

  12. Looking with different eyes: The psychological meaning of categorisation goals moderates facial reactivity to facial expressions.

    PubMed

    van Dillen, Lotte F; Harris, Lasana T; van Dijk, Wilco W; Rotteveel, Mark

    2015-01-01

    In the present research we examined whether the psychological meaning of people's categorisation goals affects facial muscle activity in response to facial expressions of emotion. We had participants associate eye colour (blue, brown) with either a personality trait (extraversion) or a physical trait (light frequency) and asked them to use these associations in a speeded categorisation task of angry, disgusted, happy and neutral faces, during which we assessed response times and facial muscle activity. We predicted that participants would respond differentially to the emotional faces when the categorisation criteria allowed for inferences about a target's thoughts, feelings or behaviour (i.e., when categorising extraversion), but not when these lacked any social meaning (i.e., when categorising light frequency). Indeed, emotional faces triggered facial reactions when participants categorised extraversion, but not when they categorised light frequency. In line with this, only when categorising extraversion did participants' response times indicate a negativity bias, replicating previous results. Together, these findings provide further evidence for the contextual nature of people's selective responses to the emotions expressed by others.

  13. Does the Organization of Emotional Expression Change Over Time? Facial Expressivity From 4 to 12 Months

    PubMed Central

    Bendersky, Margaret

    2006-01-01

    Differentiation models contend that the organization of facial expressivity increases during infancy. Accordingly, infants are believed to exhibit increasingly specific facial expressions in response to stimuli as a function of development. This study tested this hypothesis in a sample of 151 infants (83 boys and 68 girls) observed in 4 situations (tickle, sour taste, arm restraint, and masked stranger) at 4 and 12 months of age. Three of the 4 situations showed evidence of increasing specificity over time. In response to tickle, the number of infants exhibiting joy expressions increased and the number exhibiting interest, surprise, and surprise blends decreased from 4 to 12 months. In tasting a sour substance, more infants exhibited disgust and fewer exhibited joy and interest expressions, and fear and surprise blends over time. For arm restraint, more infants exhibited anger expressions and anger blends and fewer exhibited interest and surprise expressions and surprise blends over time. In response to a masked stranger, however, no evidence of increased specificity was found. Overall, these findings suggest that infants increasingly exhibit particular expressions in response to specific stimuli during the 1st year of life. These data provide partial support for the hypothesis that facial expressivity becomes increasingly organized over time. PMID:16906232

  14. Younger and Older Users' Recognition of Virtual Agent Facial Expressions.

    PubMed

    Beer, Jenay M; Smarr, Cory-Ann; Fisk, Arthur D; Rogers, Wendy A

    2015-03-01

    As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent's social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been supported (Ruffman et al., 2008), with older adults showing a deficit for recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al. 2009; 2011; Breazeal, 2003); however, little research has compared in-depth younger and older adults' ability to label a virtual agent's facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand if those age-related differences are influenced by the intensity of the emotion, dynamic formation of emotion (i.e., a neutral expression developing into an expression of emotion through motion), or the type of virtual character differing by human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in emotion recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition expressed by three types of virtual characters differing by human-likeness (non-humanoid iCat, synthetic human, and human). Study 2 also investigated the role of configural and featural processing as a possible

  15. Younger and Older Users’ Recognition of Virtual Agent Facial Expressions

    PubMed Central

    Beer, Jenay M.; Smarr, Cory-Ann; Fisk, Arthur D.; Rogers, Wendy A.

    2015-01-01

    As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent’s social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been supported (Ruffman et al., 2008), with older adults showing a deficit for recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al. 2009; 2011; Breazeal, 2003); however, little research has compared in-depth younger and older adults’ ability to label a virtual agent’s facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand if those age-related differences are influenced by the intensity of the emotion, dynamic formation of emotion (i.e., a neutral expression developing into an expression of emotion through motion), or the type of virtual character differing by human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in emotion recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition expressed by three types of virtual characters differing by human-likeness (non-humanoid iCat, synthetic human, and human). Study 2 also investigated the role of configural and featural processing as a

  16. Moving to continuous facial expression space using the MPEG-4 facial definition parameter (FDP) set

    NASA Astrophysics Data System (ADS)

    Karpouzis, Kostas; Tsapatsoulis, Nicolas; Kollias, Stefanos D.

    2000-06-01

    Research in facial expression has concluded that at least six emotions, conveyed by human faces, are universally associated with distinct expressions. Sadness, anger, joy, fear, disgust and surprise are categories of expressions that are recognizable across cultures. In this work we form a relation between the description of the universal expressions and the MPEG-4 Facial Definition Parameter Set (FDP). We also investigate the relation between the movement of basic FDPs and the parameters that describe emotion-related words according to some classical psychological studies. In particular, Whissel suggested that emotions are points in a space that seems to occupy two dimensions: activation and evaluation. We show that some of the MPEG-4 Facial Animation Parameters (FAPs), approximated by the motion of the corresponding FDPs, can be combined by means of a fuzzy rule system to estimate the activation parameter. In this way variations of the six archetypal emotions can be achieved. Moreover, Plutchik concluded that emotion terms are unevenly distributed through the space defined by dimensions like Whissel's; instead they tend to form an approximately circular pattern, called the 'emotion wheel,' modeled using an angular measure. The 'emotion wheel' can be defined as a reference for creating intermediate expressions from the universal ones, by interpolating the movement of dominant FDP points between neighboring basic expressions. By exploiting the relation between the movement of the basic FDP points and the activation and angular parameters, we can model more emotions than the primary ones and achieve efficient recognition in video sequences.
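
    As a toy illustration of the interpolation idea (creating intermediate expressions by moving dominant FDP points between neighbouring basic expressions on the emotion wheel), the sketch below linearly blends two invented FAP displacement profiles according to an angular position. The profile values and angles are fabricated for the example; they are not Whissel's or Plutchik's data, and the paper's fuzzy rule system is not reproduced here.

      import numpy as np

      # Invented FAP displacement profiles for two neighbouring basic
      # expressions, with their (assumed) angular positions on the wheel.
      PROFILES = {"joy":      (0.0,  np.array([0.8, 0.3, 0.0, 0.5])),
                  "surprise": (60.0, np.array([0.2, 0.9, 0.7, 0.1]))}

      def intermediate_expression(theta):
          """Blend FAP displacements between two neighbouring basic
          expressions according to an angle on the emotion wheel."""
          (a_ang, a_faps), (b_ang, b_faps) = PROFILES["joy"], PROFILES["surprise"]
          t = float(np.clip((theta - a_ang) / (b_ang - a_ang), 0.0, 1.0))
          return (1.0 - t) * a_faps + t * b_faps

      print(intermediate_expression(30.0))  # halfway between joy and surprise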

  17. Facial expressions of emotion are not culturally universal

    PubMed Central

    Jack, Rachael E.; Garrod, Oliver G. B.; Yu, Hui; Caldara, Roberto; Schyns, Philippe G.

    2012-01-01

    Since Darwin’s seminal works, the universality of facial expressions of emotion has remained one of the longest standing debates in the biological and social sciences. Briefly stated, the universality hypothesis claims that all humans communicate six basic internal emotional states (happy, surprise, fear, disgust, anger, and sad) using the same facial movements by virtue of their biological and evolutionary origins [Susskind JM, et al. (2008) Nat Neurosci 11:843–850]. Here, we refute this assumed universality. Using a unique computer graphics platform that combines generative grammars [Chomsky N (1965) MIT Press, Cambridge, MA] with visual perception, we accessed the mind’s eye of 30 Western and Eastern culture individuals and reconstructed their mental representations of the six basic facial expressions of emotion. Cross-cultural comparisons of the mental representations challenge universality on two separate counts. First, whereas Westerners represent each of the six basic emotions with a distinct set of facial movements common to the group, Easterners do not. Second, Easterners represent emotional intensity with distinctive dynamic eye activity. By refuting the long-standing universality hypothesis, our data highlight the powerful influence of culture on shaping basic behaviors once considered biologically hardwired. Consequently, our data open a unique nature–nurture debate across broad fields from evolutionary psychology and social neuroscience to social networking via digital avatars. PMID:22509011

  18. Facial expressions of emotion are not culturally universal.

    PubMed

    Jack, Rachael E; Garrod, Oliver G B; Yu, Hui; Caldara, Roberto; Schyns, Philippe G

    2012-05-08

    Since Darwin's seminal works, the universality of facial expressions of emotion has remained one of the longest standing debates in the biological and social sciences. Briefly stated, the universality hypothesis claims that all humans communicate six basic internal emotional states (happy, surprise, fear, disgust, anger, and sad) using the same facial movements by virtue of their biological and evolutionary origins [Susskind JM, et al. (2008) Nat Neurosci 11:843-850]. Here, we refute this assumed universality. Using a unique computer graphics platform that combines generative grammars [Chomsky N (1965) MIT Press, Cambridge, MA] with visual perception, we accessed the mind's eye of 30 Western and Eastern culture individuals and reconstructed their mental representations of the six basic facial expressions of emotion. Cross-cultural comparisons of the mental representations challenge universality on two separate counts. First, whereas Westerners represent each of the six basic emotions with a distinct set of facial movements common to the group, Easterners do not. Second, Easterners represent emotional intensity with distinctive dynamic eye activity. By refuting the long-standing universality hypothesis, our data highlight the powerful influence of culture on shaping basic behaviors once considered biologically hardwired. Consequently, our data open a unique nature-nurture debate across broad fields from evolutionary psychology and social neuroscience to social networking via digital avatars.

  19. Three-year-olds' rapid facial electromyographic responses to emotional facial expressions and body postures.

    PubMed

    Geangu, Elena; Quadrelli, Ermanno; Conte, Stefania; Croci, Emanuela; Turati, Chiara

    2016-04-01

    Rapid facial reactions (RFRs) to observed emotional expressions are proposed to be involved in a wide array of socioemotional skills, from empathy to social communication. Two of the most persuasive theoretical accounts propose RFRs to rely either on motor resonance mechanisms or on more complex mechanisms involving affective processes. Previous studies demonstrated that presentation of facial and bodily expressions can generate rapid changes in adult and school-age children's muscle activity. However, to date there is little to no evidence to suggest the existence of emotional RFRs from infancy to preschool age. To investigate whether RFRs are driven by motor mimicry or could also be a result of emotional appraisal processes, we recorded facial electromyographic (EMG) activation from the zygomaticus major and frontalis medialis muscles to presentation of static facial and bodily expressions of emotions (i.e., happiness, anger, fear, and neutral) in 3-year-old children. Results showed no specific EMG activation in response to bodily emotion expressions. However, observing others' happy faces led to increased activation of the zygomaticus major and decreased activation of the frontalis medialis, whereas observing others' angry faces elicited the opposite pattern of activation. This study suggests that RFRs are the result of complex mechanisms in which both affective processes and motor resonance may play an important role.

  1. Facial expression recognition in Alzheimer's disease: a longitudinal study.

    PubMed

    Torres, Bianca; Santos, Raquel Luiza; Sousa, Maria Fernanda Barroso de; Simões Neto, José Pedro; Nogueira, Marcela Moreira Lima; Belfort, Tatiana T; Dias, Rachel; Dourado, Marcia Cristina Nascimento

    2015-05-01

    Facial recognition is one of the most important aspects of social cognition. In this study, we investigated the patterns of change and the factors involved in the ability to recognize emotion in mild Alzheimer's disease (AD). Using a longitudinal design, we assessed 30 people with AD on an experimental task that includes matching expressions to picture stimuli, labelling emotions, and recognizing the emotion conveyed by a depicted situation. We observed a significant difference in the situational recognition task (p ≤ 0.05) between baseline and the second evaluation. Linear regression showed that cognition is a predictor of emotion-recognition impairment (p ≤ 0.05). The ability to perceive emotions from facial expressions was impaired, particularly when the emotions presented were relatively subtle. Cognition is recruited to comprehend emotional situations in cases of mild dementia.

  2. Affective priming using facial expressions modulates liking for abstract art.

    PubMed

    Flexas, Albert; Rosselló, Jaume; Christensen, Julia F; Nadal, Marcos; Olivera La Rosa, Antonio; Munar, Enric

    2013-01-01

    We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20 ms) and extended (SOA = 300 ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation.

  3. Affective Priming Using Facial Expressions Modulates Liking for Abstract Art

    PubMed Central

    Flexas, Albert; Rosselló, Jaume; Christensen, Julia F.; Nadal, Marcos; Olivera La Rosa, Antonio; Munar, Enric

    2013-01-01

    We examined the influence of affective priming on the appreciation of abstract artworks using an evaluative priming task. Facial primes (showing happiness, disgust or no emotion) were presented under brief (Stimulus Onset Asynchrony, SOA = 20ms) and extended (SOA = 300ms) conditions. Differences in aesthetic liking for abstract paintings depending on the emotion expressed in the preceding primes provided a measure of the priming effect. The results showed that, for the extended SOA, artworks were liked more when preceded by happiness primes and less when preceded by disgust primes. Facial expressions of happiness, though not of disgust, exerted similar effects in the brief SOA condition. Subjective measures and a forced-choice task revealed no evidence of prime awareness in the suboptimal condition. Our results are congruent with findings showing that the affective transfer elicited by priming biases evaluative judgments, extending previous research to the domain of aesthetic appreciation. PMID:24260350
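
    The trial structure of such an evaluative priming task is easy to schematise. The sketch below uses a hypothetical present() routine (a stand-in for a real stimulus-presentation library) and equates prime duration with SOA, a simplification; the target duration and file names are likewise invented for illustration.

      import time

      def present(stimulus, duration_ms):
          """Hypothetical display routine standing in for a stimulus library."""
          print(f"showing {stimulus} for {duration_ms} ms")
          time.sleep(duration_ms / 1000.0)

      def priming_trial(prime_face, artwork, soa_ms):
          """One trial: facial prime, then the to-be-rated abstract artwork.
          soa_ms=20 approximates the brief (suboptimal) condition and
          soa_ms=300 the extended condition described in the abstract."""
          present(prime_face, soa_ms)  # prime offset coincides with target onset
          present(artwork, 1000)       # liking rating would be collected after

      priming_trial("happy_face.png", "abstract_painting.png", soa_ms=300)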

  4. A longitudinal analysis of the development of infant facial expressions in response to acute pain: immediate and regulatory expressions.

    PubMed

    Ahola Kohut, Sara; Pillai Riddell, Rebecca; Flora, David B; Oster, Harriet

    2012-12-01

    Facial expressions during infancy are important to examine, as infants do not have the language skills to describe their experiences. This is particularly vital in the context of pain, where infants depend solely on their caregivers for relief. The objective of the current study was to investigate the development of negative infant facial expressions in response to immunization pain over the first year of life. Infant facial expressions were examined longitudinally using a subsample of 100 infants that were each videotaped during their 2-, 4-, 6-, and 12-month routine immunization appointments. Infant facial expressions were coded using BabyFACS (facial action coding system) for the first minute after a painful needle prick. Facial expressions were examined with a catalogue of the most commonly occurring facial expressions. Results demonstrated that clear differences were seen over ages. Infants display a variety of facial expressions with some of the components of adult pain expressions immediately after the needle and they abate shortly after. However, infants did not display adult expressions of discrete negative emotions. Instead, infants displayed a variety of generalized pain and distress faces aimed at gaining caregiver aid. The development of nonverbal communication in infants, particularly facial expressions, remains an important area of inquiry. Further study into accurately measuring infant negative emotions, pain, and distress is warranted.

  5. Can Neurotypical Individuals Read Autistic Facial Expressions? Atypical Production of Emotional Facial Expressions in Autism Spectrum Disorders.

    PubMed

    Brewer, Rebecca; Biotti, Federica; Catmur, Caroline; Press, Clare; Happé, Francesca; Cook, Richard; Bird, Geoffrey

    2016-02-01

    The difficulties encountered by individuals with autism spectrum disorder (ASD) when interacting with neurotypical (NT, i.e. nonautistic) individuals are usually attributed to failure to recognize the emotions and mental states of their NT interaction partner. It is also possible, however, that at least some of the difficulty is due to a failure of NT individuals to read the mental and emotional states of ASD interaction partners. Previous research has frequently observed deficits of typical facial emotion recognition in individuals with ASD, suggesting atypical representations of emotional expressions. Relatively little research, however, has investigated the ability of individuals with ASD to produce recognizable emotional expressions, and thus, whether NT individuals can recognize autistic emotional expressions. The few studies which have investigated this have used only NT observers, making it impossible to determine whether atypical representations are shared among individuals with ASD, or idiosyncratic. This study investigated NT and ASD participants' ability to recognize emotional expressions produced by NT and ASD posers. Three posing conditions were included, to determine whether potential group differences are due to atypical cognitive representations of emotion, impaired understanding of the communicative value of expressions, or poor proprioceptive feedback. Results indicated that ASD expressions were recognized less well than NT expressions, and that this is likely due to a genuine deficit in the representation of typical emotional expressions in this population. Further, ASD expressions were equally poorly recognized by NT individuals and those with ASD, implicating idiosyncratic, rather than common, atypical representations of emotional expressions in ASD.

  6. Emotion identification from facial expressions in children adopted internationally.

    PubMed

    Hwa-Froelich, Deborah A; Matsuo, Hisako; Becker, Jenna C

    2014-11-01

    Children adopted internationally who are exposed to institutional care receive less social interaction than children reared in families. These children spend their preadoptive life with individuals from their birth country and are adopted into families who may look and interact differently. The presumed patterns of limited social stimulation and transition from ethnically similar to ethnically and culturally different social interactions may affect these children's ability to accurately identify emotions from facial expressions. Thirty-five 4-year-old children adopted from Asia and Eastern Europe by U.S. families were compared with 33 nonadopted peers on the Diagnostic Analysis of Nonverbal Accuracy, Version 2 (DANVA2) Faces subtests. Correlation and regression analyses were completed with preadoption (adoption age, foster care exposure), postadoption environment (postadoption care duration, number of siblings, socioeconomic status), and individual (chronological age, gender, language competence) variables to determine related and predictive variables. The nonadopted group demonstrated better emotion identification than children internationally adopted, but no region-of-origin differences were found. English language performance was correlated with and predicted 20% of the variance in emotion identification of facial expressions on the DANVA2. Children adopted internationally who have stronger language ability tend to be more accurate in identifying emotions from facial expressions.

  7. Modulation of incentivized dishonesty by disgust facial expressions

    PubMed Central

    Lim, Julian; Ho, Paul M.; Mullette-Gillman, O'Dhaniel A.

    2015-01-01

    Disgust modulates moral decisions involving harming others. We recently specified that this effect is bi-directionally modulated by individual sensitivity to disgust. Here, we show that this effect generalizes to the moral domain of honesty and extends to outcomes with real-world impact. We employed a dice-rolling task in which participants were incentivized to dishonestly report outcomes to increase their potential final monetary payoff. Disgust or control facial expressions were presented subliminally on each trial. Our results reveal that the disgust facial expressions altered honest reporting as a bi-directional function moderated by individual sensitivity. Combining these data with those from prior experiments revealed that the effect of disgust presentation on both harm judgments and honesty could be accounted for by the same bidirectional function, with no significant effect of domain. This clearly demonstrates that disgust facial expressions produce the same modulation of moral judgments across different moral foundations (harm and honesty). Our results suggest strong overlap in the cognitive/neural processes of moral judgments across moral foundations, and provide a framework for further studies to specify the integration of emotional information in moral decision making. PMID:26257599

  8. 4-D facial expression recognition by learning geometric deformations.

    PubMed

    Ben Amor, Boulbaba; Drira, Hassen; Berretti, Stefano; Daoudi, Mohamed; Srivastava, Anuj

    2014-12-01

    In this paper, we present an automatic approach for facial expression recognition from 3-D video sequences. In the proposed solution, the 3-D faces are represented by collections of radial curves and a Riemannian shape analysis is applied to effectively quantify the deformations induced by the facial expressions in a given subsequence of 3-D frames. This is obtained from the dense scalar field, which denotes the shooting directions of the geodesic paths constructed between pairs of corresponding radial curves of two faces. As the resulting dense scalar fields show a high dimensionality, Linear Discriminant Analysis (LDA) transformation is applied to the dense feature space. Two methods are then used for classification: 1) 3-D motion extraction with temporal Hidden Markov model (HMM) and 2) mean deformation capturing with random forest. While a dynamic HMM on the features is trained in the first approach, the second one computes mean deformations under a window and applies multiclass random forest. Both of the proposed classification schemes on the scalar fields showed comparable results and outperformed earlier studies on facial expression recognition from 3-D video sequences.
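
    The second classification scheme above reduces dense deformation features with LDA and classifies them with a multiclass random forest. The following sketch illustrates only the shape of that pipeline, with randomly generated placeholder features standing in for the dense scalar fields (the radial-curve and geodesic computations are not reproduced):

```python
# Sketch only: placeholder features stand in for the dense scalar fields.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 500))    # 240 clips x 500 deformation features (fake)
y = rng.integers(0, 6, size=240)   # six expression labels

# LDA reduces to at most n_classes - 1 = 5 dimensions, then a random forest
# performs the multiclass decision, mirroring the paper's second scheme.
clf = make_pipeline(LinearDiscriminantAnalysis(n_components=5),
                    RandomForestClassifier(n_estimators=200, random_state=0))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```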

  9. Hierarchical Recognition Scheme for Human Facial Expression Recognition Systems

    PubMed Central

    Siddiqi, Muhammad Hameed; Lee, Sungyoung; Lee, Young-Koo; Khan, Adil Mehmood; Truc, Phan Tran Ho

    2013-01-01

    Over the last decade, human facial expression recognition (FER) has emerged as an important research area. Several factors make FER a challenging research problem. These include varying light conditions in training and test images; the need for automatic and accurate face detection before feature extraction; and high similarity among different expressions that makes it difficult to distinguish them with high accuracy. This work implements a hierarchical linear discriminant analysis-based facial expression recognition (HL-FER) system to tackle these problems. Unlike previous systems, the HL-FER uses a pre-processing step to eliminate light effects, incorporates a new automatic face detection scheme, employs methods to extract both global and local features, and uses a hierarchical recognition scheme to overcome the problem of high similarity among different expressions. Unlike most previous works that were evaluated using a single dataset, the performance of the HL-FER is assessed using three publicly available datasets under three different experimental settings: n-fold cross-validation based on subjects for each dataset separately; n-fold cross-validation based on datasets; and, finally, a last set of experiments to assess the effectiveness of each module of the HL-FER separately. A weighted average recognition accuracy of 98.7% across the three datasets, using three classifiers, indicates the success of employing the HL-FER for human FER. PMID:24316568
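
    Evaluation "based on subjects" means that folds are split by person, so no subject appears in both training and test data. A minimal illustration with placeholder features and an SVM standing in for the HL-FER stages:

```python
# Subject-wise cross-validation: folds never mix a subject between train/test.
import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 64))            # placeholder feature vectors
y = rng.integers(0, 6, size=300)          # six expression classes
subjects = rng.integers(0, 10, size=300)  # hypothetical subject IDs

scores = []
for train, test in GroupKFold(n_splits=5).split(X, y, groups=subjects):
    model = SVC().fit(X[train], y[train])  # SVM stand-in for the classifiers
    scores.append(model.score(X[test], y[test]))
print("mean accuracy:", np.mean(scores))
```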

  10. The effect of sad facial expressions on weight judgment

    PubMed Central

    Weston, Trent D.; Hass, Norah C.; Lim, Seung-Lark

    2015-01-01

    Although the body weight evaluation (e.g., normal or overweight) of others relies on perceptual impressions, it also can be influenced by other psychosocial factors. In this study, we explored the effect of task-irrelevant emotional facial expressions on judgments of body weight and the relationship between emotion-induced weight judgment bias and other psychosocial variables including attitudes toward obese persons. Forty-four participants were asked to quickly make binary body weight decisions for 960 randomized sad and neutral faces of varying weight levels presented on a computer screen. The results showed that sad facial expressions systematically decreased the decision threshold of overweight judgments for male faces. This perceptual decision bias by emotional expressions was positively correlated with the belief that being overweight is not under the control of obese persons. Our results provide experimental evidence that task-irrelevant emotional expressions can systematically change the decision threshold for weight judgments, demonstrating that sad expressions can make faces appear more overweight than they would otherwise be judged. PMID:25914669
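
    A decision threshold of the kind reported here can be estimated by fitting a logistic psychometric function to the binary overweight judgments at each weight level. The sketch below uses simulated response curves, not the study's data; a lower fitted threshold for sad-face trials corresponds to the reported bias:

```python
# Simulated fit of a logistic psychometric function; threshold = 50% point.
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, threshold, slope):
    # Probability of an "overweight" response at weight level x.
    return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

levels = np.linspace(-3, 3, 9)              # stand-in weight morph levels
p_neutral = psychometric(levels, 0.4, 2.0)  # simulated response rates
p_sad = psychometric(levels, -0.1, 2.0)     # sad primes lower the threshold

for label, p in [("neutral", p_neutral), ("sad", p_sad)]:
    (thr, _), _ = curve_fit(psychometric, levels, p, p0=[0.0, 1.0])
    print(f"{label}: estimated threshold = {thr:.2f}")
```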

  11. Joint sparse learning for 3-D facial expression generation.

    PubMed

    Song, Mingli; Tao, Dacheng; Sun, Shengpeng; Chen, Chun; Bu, Jiajun

    2013-08-01

    3-D facial expression generation, including synthesis and retargeting, has received intensive attention in recent years, because it is important to produce realistic 3-D faces with specific expressions in modern film production and computer games. In this paper, we present joint sparse learning (JSL) to learn mapping functions and their respective inverses to model the relationship between the high-dimensional 3-D faces (of different expressions and identities) and their corresponding low-dimensional representations. Based on JSL, we can effectively and efficiently generate various expressions of a 3-D face by either synthesizing or retargeting. Furthermore, JSL is able to restore 3-D faces with holes by learning a mapping function between incomplete and intact data. Experimental results on a wide range of 3-D faces demonstrate the effectiveness of the proposed approach by comparing with representative ones in terms of quality, time cost, and robustness.

  12. Rapid Facial Reactions to Emotional Facial Expressions in Typically Developing Children and Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Beall, Paula M.; Moody, Eric J.; McIntosh, Daniel N.; Hepburn, Susan L.; Reed, Catherine L.

    2008-01-01

    Typical adults mimic facial expressions within 1000 ms, but adults with autism spectrum disorder (ASD) do not. These rapid facial reactions (RFRs) are associated with the development of social-emotional abilities. Such interpersonal matching may be caused by motor mirroring or emotional responses. Using facial electromyography (EMG), this study…

  13. Lateralization for dynamic facial expressions in human superior temporal sulcus.

    PubMed

    De Winter, François-Laurent; Zhu, Qi; Van den Stock, Jan; Nelissen, Koen; Peeters, Ronald; de Gelder, Beatrice; Vanduffel, Wim; Vandenbulcke, Mathieu

    2015-02-01

    Most face processing studies in humans show stronger activation in the right compared to the left hemisphere. Evidence is largely based on studies with static stimuli focusing on the fusiform face area (FFA). Hence, the pattern of lateralization for dynamic faces is less clear. Furthermore, it is unclear whether this property is common to human and non-human primates, due to predisposing processing strategies in the right hemisphere, or whether left-sided specialization for language in humans is the driving force behind this phenomenon. We aimed to address both issues by studying lateralization for dynamic facial expressions in monkeys and humans. Therefore, we conducted an event-related fMRI experiment in three macaques and twenty right-handed humans. We presented human and monkey dynamic facial expressions (chewing and fear) as well as scrambled versions to both species. We studied lateralization in independently defined face-responsive and face-selective regions by calculating a weighted lateralization index (LIwm) using a bootstrapping method. In order to examine whether lateralization in humans is related to language, we performed a separate fMRI experiment in ten human volunteers including a 'speech' expression (one-syllable non-word) and its scrambled version. Both within face-responsive and face-selective regions, we found consistent lateralization for dynamic faces (chewing and fear) versus scrambled versions in the right human posterior superior temporal sulcus (pSTS), but not in the FFA nor in ventral temporal cortex. Conversely, in monkeys no consistent pattern of lateralization for dynamic facial expressions was observed. Finally, LIwms based on the contrast between different types of dynamic facial expressions (relative to scrambled versions) revealed left-sided lateralization in human pSTS for speech-related expressions compared to chewing and emotional expressions. To conclude, we found consistent laterality effects in human posterior STS but not in monkeys.
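
    A weighted lateralization index with bootstrap resampling can be illustrated with the common convention LI = (R − L)/(R + L) over summed activations; the study's exact weighting (LIwm) is not reproduced, and the voxel values below are simulated:

```python
# LI = (R - L) / (R + L) over summed activations, with a bootstrap CI.
import numpy as np

rng = np.random.default_rng(2)
left = np.abs(rng.normal(1.0, 0.5, size=400))   # simulated voxel activations
right = np.abs(rng.normal(1.4, 0.5, size=400))

def li(l, r):
    return (r.sum() - l.sum()) / (r.sum() + l.sum())

boot = [li(rng.choice(left, left.size), rng.choice(right, right.size))
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"LI = {li(left, right):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```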

  14. Automated and objective action coding of facial expressions in patients with acute facial palsy.

    PubMed

    Haase, Daniel; Minnigerode, Laura; Volk, Gerd Fabian; Denzler, Joachim; Guntinas-Lichius, Orlando

    2015-05-01

    The aim of the present observational single-center study was to objectively assess facial function in patients with idiopathic facial palsy with a new computer-based system that automatically recognizes action units (AUs) defined by the Facial Action Coding System (FACS). Still photographs using posed facial expressions of 28 healthy subjects and of 299 patients with acute facial palsy were automatically analyzed for bilateral AU expression profiles. All palsies were graded with the House-Brackmann (HB) grading system and with the Stennert Index (SI). Changes of the AU profiles during follow-up were analyzed for 77 patients. The initial HB grading of all patients was 3.3 ± 1.2. SI at rest was 1.86 ± 1.3 and during motion 3.79 ± 4.3. Healthy subjects showed a significant AU asymmetry score of 21 ± 11% and there was no significant difference from patients (p = 0.128). At initial examination of patients, the number of activated AUs was significantly lower on the paralyzed side than on the healthy side (p < 0.0001). The final examination for patients took place 4 ± 6 months post baseline. The number of activated AUs and the ratio between affected and healthy side increased significantly between baseline and final examination (both p < 0.0001). The asymmetry score decreased between baseline and final examination (p < 0.0001). The number of activated AUs on the healthy side did not change significantly (p = 0.779). Radical rethinking in facial grading is worthwhile: automated FACS delivers fast and objective global and regional data on facial motor function for use in clinical routine and clinical trials.
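
    An AU asymmetry score of this kind can be sketched as the percentage of action units whose activation state disagrees between the two face halves, assuming a detector that returns per-side AU activation probabilities (the values below are hypothetical):

```python
# Percent mismatch in AU activation between the two face halves.
import numpy as np

def asymmetry_score(left_aus, right_aus, active=0.5):
    """left_aus/right_aus: per-AU activation probabilities for each side."""
    l = np.asarray(left_aus) > active
    r = np.asarray(right_aus) > active
    return 100.0 * np.mean(l != r)

# Hypothetical detector output for 10 AUs; three AUs disagree across sides.
left = [0.9, 0.8, 0.1, 0.7, 0.2, 0.9, 0.6, 0.1, 0.8, 0.3]
right = [0.9, 0.2, 0.1, 0.7, 0.2, 0.1, 0.6, 0.1, 0.8, 0.9]
print(asymmetry_score(left, right))  # 30.0
```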

  15. Perceptual, categorical, and affective processing of ambiguous smiling facial expressions.

    PubMed

    Calvo, Manuel G; Fernández-Martín, Andrés; Nummenmaa, Lauri

    2012-12-01

    Why is a face with a smile but non-happy eyes likely to be interpreted as happy? We used blended expressions in which a smiling mouth was incongruent with the eyes (e.g., angry eyes), as well as genuine expressions with congruent eyes and mouth (e.g., both happy or angry). Tasks involved detection of a smiling mouth (perceptual), categorization of the expression (semantic), and valence evaluation (affective). The face stimulus display duration and stimulus onset asynchrony (SOA) were varied to assess the time course of each process. Results indicated that (a) a smiling mouth was visually more salient than the eyes both in truly happy and blended expressions; (b) a smile led viewers to categorize blended expressions as happy similarly for upright and inverted faces; (c) truly happy, but not blended, expressions primed the affective evaluation of probe scenes 550 ms following face onset; (d) both truly happy and blended expressions primed the detection of a smile in a probe scene by 170 ms post-stimulus; and (e) smile detection and expression categorization had similar processing thresholds and preceded affective evaluation. We conclude that the saliency of single physical features such as the mouth shape makes the smile quickly accessible to the visual system, which initially speeds up expression categorization regardless of congruence with the eyes. Only when the eye expression is later configurally integrated with the mouth, will affective discrimination begin. The present research provides support for serial models of facial expression processing. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Fear Modulates Visual Awareness Similarly for Facial and Bodily Expressions

    PubMed Central

    Stienen, Bernard M. C.; de Gelder, Beatrice

    2011-01-01

    Background: Social interaction depends on a multitude of signals carrying information about the emotional state of others. But the relative importance of facial and bodily signals is still poorly understood. Past research has focused on the perception of facial expressions while perception of whole-body signals has only been studied recently. In order to better understand the relative contribution of affective signals from the face only or from the whole body we performed two experiments using binocular rivalry. This method seems to be perfectly suitable to contrast two classes of stimuli to test our processing sensitivity to either stimulus and to address the question how emotion modulates this sensitivity. Method: In the first experiment we directly contrasted fearful, angry, and neutral bodies and faces. We always presented bodies in one eye and faces in the other simultaneously for 60 s and asked participants to report what they perceived. In the second experiment we focused specifically on the role of fearful expressions of faces and bodies. Results: Taken together the two experiments show that there is no clear bias toward either the face or the body when the expression of the body and face are neutral or angry. However, the perceptual dominance in favor of either the face or the body is a function of the stimulus class expressing fear. PMID:22125517

  17. Body Actions Change the Appearance of Facial Expressions

    PubMed Central

    Fantoni, Carlo; Gerbino, Walter

    2014-01-01

    Perception, cognition, and emotion do not operate along segregated pathways; rather, their adaptive interaction is supported by various sources of evidence. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the facial expression of emotions towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the comfort/discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we let participants perform comfortable/uncomfortable visually-guided reaches and tested them in a facial emotion identification task. Through the alleged mediation of motor action induced mood, action comfort enhanced the quality of the participant’s global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved the sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, as a possible effect of hyper-arousal from an unpleasant bodily experience. PMID:25251882

  18. Accurate identification of fear facial expressions predicts prosocial behavior.

    PubMed

    Marsh, Abigail A; Kozak, Megan N; Ambady, Nalini

    2007-05-01

    The fear facial expression is a distress cue that is associated with the provision of help and prosocial behavior. Prior psychiatric studies have found deficits in the recognition of this expression by individuals with antisocial tendencies. However, no prior study has shown accuracy for recognition of fear to predict actual prosocial or antisocial behavior in an experimental setting. In 3 studies, the authors tested the prediction that individuals who recognize fear more accurately will behave more prosocially. In Study 1, participants who identified fear more accurately also donated more money and time to a victim in a classic altruism paradigm. In Studies 2 and 3, participants' ability to identify the fear expression predicted prosocial behavior in a novel task designed to control for confounding variables. In Study 3, accuracy for recognizing fear proved a better predictor of prosocial behavior than gender, mood, or scores on an empathy scale.

  19. Accurate Identification of Fear Facial Expressions Predicts Prosocial Behavior

    PubMed Central

    Marsh, Abigail A.; Kozak, Megan N.; Ambady, Nalini

    2009-01-01

    The fear facial expression is a distress cue that is associated with the provision of help and prosocial behavior. Prior psychiatric studies have found deficits in the recognition of this expression by individuals with antisocial tendencies. However, no prior study has shown accuracy for recognition of fear to predict actual prosocial or antisocial behavior in an experimental setting. In 3 studies, the authors tested the prediction that individuals who recognize fear more accurately will behave more prosocially. In Study 1, participants who identified fear more accurately also donated more money and time to a victim in a classic altruism paradigm. In Studies 2 and 3, participants’ ability to identify the fear expression predicted prosocial behavior in a novel task designed to control for confounding variables. In Study 3, accuracy for recognizing fear proved a better predictor of prosocial behavior than gender, mood, or scores on an empathy scale. PMID:17516803

  20. Emotional Representation in Facial Expression and Script: A Comparison between Normal and Autistic Children

    ERIC Educational Resources Information Center

    Balconi, Michela; Carrera, Alba

    2007-01-01

    The paper explored conceptual and lexical skills with regard to emotional correlates of facial stimuli and scripts. In two different experimental phases normal and autistic children observed six facial expressions of emotions (happiness, anger, fear, sadness, surprise, and disgust) and six emotional scripts (contextualized facial expressions). In…

  1. Common impairments of emotional facial expression recognition in schizophrenia across French and Japanese cultures.

    PubMed

    Okada, Takashi; Kubota, Yasutaka; Sato, Wataru; Murai, Toshiya; Pellion, Fréderic; Gorog, Françoise

    2015-01-01

    To address whether the recognition of emotional facial expressions is impaired in schizophrenia across different cultures, patients with schizophrenia and age-matched normal controls in France and Japan were tested with a labeling task of emotional facial expressions and a matching task of unfamiliar faces. Schizophrenia patients in both France and Japan were less accurate in labeling fearful facial expressions. There was no correlation between the scores of facial emotion labeling and face matching. These results suggest that the impaired recognition of emotional facial expressions in schizophrenia is common across different cultures.

  2. Common impairments of emotional facial expression recognition in schizophrenia across French and Japanese cultures

    PubMed Central

    Okada, Takashi; Kubota, Yasutaka; Sato, Wataru; Murai, Toshiya; Pellion, Fréderic; Gorog, Françoise

    2015-01-01

    To address whether the recognition of emotional facial expressions is impaired in schizophrenia across different cultures, patients with schizophrenia and age-matched normal controls in France and Japan were tested with a labeling task of emotional facial expressions and a matching task of unfamiliar faces. Schizophrenia patients in both France and Japan were less accurate in labeling fearful facial expressions. There was no correlation between the scores of facial emotion labeling and face matching. These results suggest that the impaired recognition of emotional facial expressions in schizophrenia is common across different cultures. PMID:26257678

  3. Behavioural responses to facial and postural expressions of emotion: An interpersonal circumplex approach.

    PubMed

    Aan Het Rot, Marije; Enea, Violeta; Dafinoiu, Ion; Iancu, Sorina; Taftă, Steluţa A; Bărbuşelu, Mariana

    2017-03-21

    While the recognition of emotional expressions has been extensively studied, the behavioural response to these expressions has not. In the interpersonal circumplex, behaviour is defined in terms of communion and agency. In this study, we examined behavioural responses to both facial and postural expressions of emotion. We presented 101 Romanian students with facial and postural stimuli involving individuals ('targets') expressing happiness, sadness, anger, or fear. Using an interpersonal grid, participants simultaneously indicated how communal (i.e., quarrelsome or agreeable) and agentic (i.e., dominant or submissive) they would be towards people displaying these expressions. Participants were agreeable-dominant towards targets showing happy facial expressions and primarily quarrelsome towards targets with angry or fearful facial expressions. Responses to targets showing sad facial expressions were neutral on both dimensions of interpersonal behaviour. Postural versus facial expressions of happiness and anger elicited similar behavioural responses. Participants responded in a quarrelsome-submissive way to fearful postural expressions and in an agreeable way to sad postural expressions. Behavioural responses to the various facial expressions were largely comparable to those previously observed in Dutch students. Observed differences may be explained from participants' cultural background. Responses to the postural expressions largely matched responses to the facial expressions. © 2017 The British Psychological Society.

  4. Discriminative shared Gaussian processes for multiview and view-invariant facial expression recognition.

    PubMed

    Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja

    2015-01-01

    Images of facial expressions are often captured from various views as a result of either head movements or variable camera position. Existing methods for multiview and/or view-invariant facial expression recognition typically perform classification of the observed expression using either classifiers learned separately for each view or a single classifier learned for all views. However, these approaches ignore the fact that different views of a facial expression are just different manifestations of the same facial expression. By accounting for this redundancy, we can design more effective classifiers for the target task. To this end, we propose a discriminative shared Gaussian process latent variable model (DS-GPLVM) for multiview and view-invariant classification of facial expressions from multiple views. In this model, we first learn a discriminative manifold shared by multiple views of a facial expression. Subsequently, we perform facial expression classification in the expression manifold. Finally, classification of an observed facial expression is carried out either in the view-invariant manner (using only a single view of the expression) or in the multiview manner (using multiple views of the expression). The proposed model can also be used to perform fusion of different facial features in a principled manner. We validate the proposed DS-GPLVM on both posed and spontaneously displayed facial expressions from three publicly available datasets (MultiPIE, labeled face parts in the wild, and static facial expressions in the wild). We show that this model outperforms the state-of-the-art methods for multiview and view-invariant facial expression classification, and several state-of-the-art methods for multiview learning and feature fusion.
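
    The DS-GPLVM itself does not fit in a short sketch, but the underlying idea of learning a manifold shared by several views and classifying within it can be loosely illustrated with canonical correlation analysis as a simpler stand-in; this is emphatically not the paper's model, and the data below are simulated:

```python
# CCA as a loose stand-in for a shared-manifold model: project two views of
# the same expressions into correlated subspaces, train on one view's
# embedding, and test on the other's.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
latent = rng.normal(size=(200, 4))                 # hidden expression factors
view_a = latent @ rng.normal(size=(4, 30)) + 0.1 * rng.normal(size=(200, 30))
view_b = latent @ rng.normal(size=(4, 30)) + 0.1 * rng.normal(size=(200, 30))
y = (latent[:, 0] > 0).astype(int)                 # two expression classes

cca = CCA(n_components=4).fit(view_a, view_b)
za, zb = cca.transform(view_a, view_b)
clf = LogisticRegression().fit(za, y)              # learn in the shared space
print("cross-view accuracy:", clf.score(zb, y))    # classify the other view
```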

  5. Face recognition using facial expression: a novel approach

    NASA Astrophysics Data System (ADS)

    Singh, Deepak Kumar; Gupta, Priya; Tiwary, U. S.

    2008-04-01

    Facial expressions are undoubtedly the most effective nonverbal communication. The face has always been central to a person's identity, and each line on the face adds an attribute to that identity. These lines become prominent when we experience an emotion, and they do not change completely with age. In this paper we propose a new technique for face recognition which focuses on the facial expressions of the subject to identify the face. This is an area on which little work has been done. Because earlier research suggests that natural expressions are difficult to alter, our technique may be beneficial for identifying occluded or intentionally disguised faces. The results of the experiments conducted indicate that this technique can give a new direction to the field of face recognition, provide a base for further work in the area, and serve in critical defense and security applications.

  6. Can Neurotypical Individuals Read Autistic Facial Expressions? Atypical Production of Emotional Facial Expressions in Autism Spectrum Disorders

    PubMed Central

    Biotti, Federica; Catmur, Caroline; Press, Clare; Happé, Francesca; Cook, Richard; Bird, Geoffrey

    2015-01-01

    The difficulties encountered by individuals with autism spectrum disorder (ASD) when interacting with neurotypical (NT, i.e. nonautistic) individuals are usually attributed to failure to recognize the emotions and mental states of their NT interaction partner. It is also possible, however, that at least some of the difficulty is due to a failure of NT individuals to read the mental and emotional states of ASD interaction partners. Previous research has frequently observed deficits of typical facial emotion recognition in individuals with ASD, suggesting atypical representations of emotional expressions. Relatively little research, however, has investigated the ability of individuals with ASD to produce recognizable emotional expressions, and thus, whether NT individuals can recognize autistic emotional expressions. The few studies which have investigated this have used only NT observers, making it impossible to determine whether atypical representations are shared among individuals with ASD, or idiosyncratic. This study investigated NT and ASD participants’ ability to recognize emotional expressions produced by NT and ASD posers. Three posing conditions were included, to determine whether potential group differences are due to atypical cognitive representations of emotion, impaired understanding of the communicative value of expressions, or poor proprioceptive feedback. Results indicated that ASD expressions were recognized less well than NT expressions, and that this is likely due to a genuine deficit in the representation of typical emotional expressions in this population. Further, ASD expressions were equally poorly recognized by NT individuals and those with ASD, implicating idiosyncratic, rather than common, atypical representations of emotional expressions in ASD. Autism Res 2016, 9: 262–271. © 2015 International Society for Autism Research, Wiley Periodicals, Inc. PMID:26053037

  7. Generalized Discriminant Orthogonal Nonnegative Tensor Factorization for Facial Expression Recognition

    PubMed Central

    XiuJun, Zhang; Chang, Liu

    2014-01-01

    In order to overcome the limitation of traditional nonnegative factorization algorithms, the paper presents a generalized discriminant orthogonal non-negative tensor factorization algorithm. At first, the algorithm takes the orthogonal constraint into account to ensure the nonnegativity of the low-dimensional features. Furthermore, the discriminant constraint is imposed on low-dimensional weights to strengthen the discriminant capability of the low-dimensional features. The experiments on facial expression recognition have demonstrated that the algorithm is superior to other non-negative factorization algorithms. PMID:24982970
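
    The building block that the paper extends is nonnegative factorization with multiplicative updates. The classic matrix version (Lee and Seung) is sketched below; the paper's orthogonal and discriminant constraints and its tensor formulation are omitted:

```python
# Classic Lee-Seung multiplicative updates for V ~ W @ H with V, W, H >= 0.
import numpy as np

rng = np.random.default_rng(3)
V = np.abs(rng.normal(size=(50, 40)))  # nonnegative data matrix
k = 5                                  # number of basis components
W = np.abs(rng.normal(size=(50, k)))
H = np.abs(rng.normal(size=(k, 40)))

eps = 1e-9  # guards against division by zero
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print("reconstruction error:", np.linalg.norm(V - W @ H))
```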

  8. Face in profile view reduces perceived facial expression intensity: an eye-tracking study.

    PubMed

    Guo, Kun; Shaw, Heather

    2015-02-01

    Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, having a mechanism which allows invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because diagnostic cues from local facial features for decoding expressions could vary with viewpoints. Here we manipulated orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgust and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although quantitatively viewpoint had expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that the viewpoint-invariant facial expression processing is categorical perception, which could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues.

  9. QUANTIFYING ATYPICALITY IN AFFECTIVE FACIAL EXPRESSIONS OF CHILDREN WITH AUTISM SPECTRUM DISORDERS.

    PubMed

    Metallinou, Angeliki; Grossman, Ruth B; Narayanan, Shrikanth

    2013-01-01

    We focus on the analysis, quantification and visualization of atypicality in affective facial expressions of children with High Functioning Autism (HFA). We examine facial Motion Capture data from typically developing (TD) children and children with HFA, using various statistical methods, including Functional Data Analysis, in order to quantify atypical expression characteristics and uncover patterns of expression evolution in the two populations. Our results show that children with HFA display higher asynchrony of motion between facial regions, rougher facial and head motion, and a larger range of facial region motion. Overall, subjects with HFA consistently display a wider variability in the expressive facial gestures that they employ. Our analysis demonstrates the utility of computational approaches for understanding behavioral data and brings new insights into the autism domain regarding the atypicality that is often associated with facial expressions of subjects with HFA.
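
    One of the measures above, asynchrony of motion between facial regions, can be approximated as the lag that maximizes the cross-correlation between two regions' motion trajectories. A toy version with simulated trajectories:

```python
# Lag (in samples) that maximizes cross-correlation between two trajectories.
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 2 * np.pi, 200)
brow = np.sin(t) + 0.05 * rng.normal(size=t.size)         # simulated motion
mouth = np.sin(t - 0.3) + 0.05 * rng.normal(size=t.size)  # mouth trails brow

def best_lag(a, b, max_lag=50):
    """Positive result: b trails a by that many samples."""
    a, b = a - a.mean(), b - b.mean()
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(b[max(0, k):b.size + min(0, k)],
                         a[max(0, -k):a.size - max(0, k)])[0, 1] for k in lags]
    return lags[int(np.argmax(corrs))]

print("mouth trails brow by", best_lag(brow, mouth), "samples")
```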

  10. Analysis of facial expressions in parkinson's disease through video-based automatic methods.

    PubMed

    Bandini, Andrea; Orlandi, Silvia; Escalante, Hugo Jair; Giovannelli, Fabio; Cincotta, Massimo; Reyes-Garcia, Carlos A; Vanni, Paola; Zaccara, Gaetano; Manfredi, Claudia

    2017-04-01

    The automatic analysis of facial expressions is an evolving field with several clinical applications. One of these applications is the study of facial bradykinesia in Parkinson's disease (PD), a major motor sign of this neurodegenerative illness. Facial bradykinesia consists of the reduction or loss of facial movements and emotional facial expressions, known as hypomimia. In this work we propose an automatic method for studying facial expressions in PD patients relying on video-based techniques. Seventeen Parkinsonian patients and 17 healthy control subjects were asked to show basic facial expressions, upon request of the clinician and after the imitation of a visual cue on a screen. Through an existing face tracker, the Euclidean distance of the facial model from a neutral baseline was computed in order to quantify the changes in facial expressivity during the tasks. Moreover, an automatic facial expression recognition algorithm was trained in order to study how PD expressions differed from the standard expressions. Results show that control subjects reported on average higher distances than PD patients across the tasks. This confirms that control subjects show larger movements during both posed and imitated facial expressions. Moreover, our results demonstrate that anger and disgust are the two most impaired expressions in PD patients. Contactless video-based systems can be important tools for analyzing facial expressions also in rehabilitation, in particular speech therapy, where patients could benefit from real-time feedback about the proper facial expressions/movements to perform. Copyright © 2017 Elsevier B.V. All rights reserved.
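
    The core expressivity measure, the per-frame Euclidean distance of the tracked facial model from a neutral baseline, reduces to a few lines once a face tracker supplies landmark coordinates (the tracker is assumed, not shown, and the data below are simulated):

```python
# Per-frame Euclidean distance of landmark coordinates from a neutral frame.
import numpy as np

def expressivity(frames, neutral):
    """frames: (n_frames, n_landmarks, 2); neutral: (n_landmarks, 2)."""
    return np.linalg.norm(frames - neutral, axis=(1, 2))

rng = np.random.default_rng(5)
neutral = rng.normal(size=(68, 2))                      # e.g. a 68-point model
frames = neutral + 0.1 * rng.normal(size=(90, 68, 2))   # fake tracker output
dist = expressivity(frames, neutral)
print("mean displacement from neutral:", dist.mean())
```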

  11. Pubertal development shapes perception of complex facial expressions.

    PubMed

    Motta-Mena, Natalie V; Scherf, K Suzanne

    2017-07-01

    We previously hypothesized that pubertal development shapes the emergence of new components of face processing (Scherf et al., 2012; Garcia & Scherf, 2015). Here, we evaluate this hypothesis by investigating emerging perceptual sensitivity to complex versus basic facial expressions across pubertal development. We tested pre-pubescent children (6-8 years), age- and sex-matched adolescents in early and later stages of pubertal development (11-14 years), and sexually mature adults (18-24 years). Using a perceptual staircase procedure, participants made visual discriminations of both socially complex expressions (sexual interest, contempt) that are arguably relevant to emerging peer-oriented relationships of adolescence, and basic (happy, anger) expressions that are important even in early infancy. Only sensitivity to detect complex expressions improved as a function of pubertal development. The ability to perceive these expressions is adult-like by late puberty when adolescents become sexually mature. This pattern of results provides the first evidence that pubertal development specifically influences emerging affective components of face perception in adolescence. © 2016 John Wiley & Sons Ltd.
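
    A perceptual staircase procedure adaptively adjusts stimulus intensity from the observer's responses. A bare-bones 2-down/1-up variant with a simulated observer is sketched below, offered only as a generic illustration of this class of method, not the study's exact procedure:

```python
# Generic 2-down/1-up staircase with a simulated observer (illustration only).
import random

random.seed(6)

def observer(intensity, threshold=0.3):
    # Correct above threshold, at chance below it.
    return intensity >= threshold or random.random() < 0.5

intensity, step = 1.0, 0.1
streak, direction, reversals = 0, -1, []
for _ in range(100):
    if observer(intensity):
        streak += 1
        if streak == 2:                       # two correct -> make it harder
            streak = 0
            if direction == +1:
                reversals.append(intensity)   # direction flip: a reversal
            direction = -1
            intensity = max(0.0, intensity - step)
    else:                                     # one error -> make it easier
        streak = 0
        if direction == -1:
            reversals.append(intensity)
        direction = +1
        intensity = min(1.0, intensity + step)

# Threshold estimated as the mean intensity over the last reversals.
print("threshold estimate:", sum(reversals[-6:]) / len(reversals[-6:]))
```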

  12. Accurate landmarking of three-dimensional facial data in the presence of facial expressions and occlusions using a three-dimensional statistical facial feature model.

    PubMed

    Zhao, Xi; Dellandréa, Emmanuel; Chen, Liming; Kakadiaris, Ioannis A

    2011-10-01

    Three-dimensional face landmarking aims at automatically localizing facial landmarks and has a wide range of applications (e.g., face recognition, face tracking, and facial expression analysis). Existing methods assume neutral facial expressions and unoccluded faces. In this paper, we propose a general learning-based framework for reliable landmark localization on 3-D facial data under challenging conditions (i.e., facial expressions and occlusions). Our approach relies on a statistical model, called 3-D statistical facial feature model, which learns both the global variations in configurational relationships between landmarks and the local variations of texture and geometry around each landmark. Based on this model, we further propose an occlusion classifier and a fitting algorithm. Results from experiments on three publicly available 3-D face databases (FRGC, BU-3-DFE, and Bosphorus) demonstrate the effectiveness of our approach, in terms of landmarking accuracy and robustness, in the presence of expressions and occlusions.

  13. Plain faces are more expressive: comparative study of facial colour, mobility and musculature in primates.

    PubMed

    Santana, Sharlene E; Dobson, Seth D; Diogo, Rui

    2014-05-01

    Facial colour patterns and facial expressions are among the most important phenotypic traits that primates use during social interactions. While colour patterns provide information about the sender's identity, expressions can communicate its behavioural intentions. Extrinsic factors, including social group size, have shaped the evolution of facial coloration and mobility, but intrinsic relationships and trade-offs likely operate in their evolution as well. We hypothesize that complex facial colour patterning could reduce how salient facial expressions appear to a receiver, and thus species with highly expressive faces would have evolved uniformly coloured faces. We test this hypothesis through a phylogenetic comparative study, and explore the underlying morphological factors of facial mobility. Supporting our hypothesis, we find that species with highly expressive faces have plain facial colour patterns. The number of facial muscles does not predict facial mobility; instead, species that are larger and have a larger facial nucleus have more expressive faces. This highlights a potential trade-off between facial mobility and colour patterning in primates and reveals complex relationships between facial features during primate evolution.

  14. More than meets the smile: facial muscle expression in children with Ochoa syndrome.

    PubMed

    Ganesan, I; Thomas, T

    2011-12-01

    The Ochoa syndrome is the association of a non-neurogenic neurogenic bladder with abnormal facial muscle expression. Patients are at risk for renal failure due to obstructive uropathy. We report a family of three siblings, with an emphasis on the abnormalities in facial expression. Careful examination shows an unusual co-contraction of the orbicularis oculi and orbicularis oris muscles only when full facial expressions are exhibited, across a range of emotional or voluntary situations. This suggests a peripheral disorder in facial muscle control. Two thirds of patients have anal sphincter abnormalities. Aberrant organisation of the facial motor and urinary-anal sphincter nuclei may explain these symptoms.

  15. Deficits in the Mimicry of Facial Expressions in Parkinson's Disease.

    PubMed

    Livingstone, Steven R; Vezer, Esztella; McGarry, Lucy M; Lang, Anthony E; Russo, Frank A

    2016-01-01

    Humans spontaneously mimic the facial expressions of others, facilitating social interaction. This mimicking behavior may be impaired in individuals with Parkinson's disease, for whom the loss of facial movements is a clinical feature. To assess the presence of facial mimicry in patients with Parkinson's disease. Twenty-seven non-depressed patients with idiopathic Parkinson's disease and 28 age-matched controls had their facial muscles recorded with electromyography while they observed presentations of calm, happy, sad, angry, and fearful emotions. Patients exhibited reduced amplitude and delayed onset in the zygomaticus major muscle region (smiling response) following happy presentations (patients M = 0.02, 95% confidence interval [CI] -0.15 to 0.18, controls M = 0.26, CI 0.14 to 0.37, ANOVA, effect size [ES] = 0.18, p < 0.001). Although patients exhibited activation of the corrugator supercilii and medial frontalis (frowning response) following sad and fearful presentations, the frontalis response to sad presentations was attenuated relative to controls (patients M = 0.05, CI -0.08 to 0.18, controls M = 0.21, CI 0.09 to 0.34, ANOVA, ES = 0.07, p = 0.017). The amplitude of patients' zygomaticus activity in response to positive emotions was found to be negatively correlated with response times for ratings of emotional identification, suggesting a motor-behavioral link (r = -0.45, p = 0.02, two-tailed). Patients showed decreased mimicry overall, mimicking other peoples' frowns to some extent, but presenting with profoundly weakened and delayed smiles. These findings open a new avenue of inquiry into the "masked face" syndrome of PD.
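
    The two EMG measures reported, response amplitude and onset latency, can be sketched as the mean post-stimulus activity and the first sustained excursion above a baseline-derived threshold. The trace below is simulated and the study's actual pipeline is not known:

```python
# Simulated rectified EMG trace at 1 kHz: 1 s baseline, then a response with
# ~400 ms onset latency. Amplitude = mean post-stimulus activity; onset =
# first smoothed sample above baseline mean + 3 SD.
import numpy as np

rng = np.random.default_rng(7)
fs = 1000
baseline = rng.normal(0, 0.01, size=fs)
post = np.concatenate([rng.normal(0, 0.01, 400),     # pre-onset quiescence
                       rng.normal(0.2, 0.02, 600)])  # zygomaticus burst

thresh = baseline.mean() + 3 * baseline.std()
smooth = np.convolve(post, np.ones(20) / 20, mode="same")  # 20 ms moving mean
onset_ms = int(np.argmax(smooth > thresh))  # first sustained crossing
print(f"amplitude = {post.mean():.3f}, onset = {onset_ms} ms")
```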

  16. Deficits in the Mimicry of Facial Expressions in Parkinson's Disease

    PubMed Central

    Livingstone, Steven R.; Vezer, Esztella; McGarry, Lucy M.; Lang, Anthony E.; Russo, Frank A.

    2016-01-01

    Background: Humans spontaneously mimic the facial expressions of others, facilitating social interaction. This mimicking behavior may be impaired in individuals with Parkinson's disease, for whom the loss of facial movements is a clinical feature. Objective: To assess the presence of facial mimicry in patients with Parkinson's disease. Method: Twenty-seven non-depressed patients with idiopathic Parkinson's disease and 28 age-matched controls had their facial muscles recorded with electromyography while they observed presentations of calm, happy, sad, angry, and fearful emotions. Results: Patients exhibited reduced amplitude and delayed onset in the zygomaticus major muscle region (smiling response) following happy presentations (patients M = 0.02, 95% confidence interval [CI] −0.15 to 0.18, controls M = 0.26, CI 0.14 to 0.37, ANOVA, effect size [ES] = 0.18, p < 0.001). Although patients exhibited activation of the corrugator supercilii and medial frontalis (frowning response) following sad and fearful presentations, the frontalis response to sad presentations was attenuated relative to controls (patients M = 0.05, CI −0.08 to 0.18, controls M = 0.21, CI 0.09 to 0.34, ANOVA, ES = 0.07, p = 0.017). The amplitude of patients' zygomaticus activity in response to positive emotions was found to be negatively correlated with response times for ratings of emotional identification, suggesting a motor-behavioral link (r = –0.45, p = 0.02, two-tailed). Conclusions: Patients showed decreased mimicry overall, mimicking other peoples' frowns to some extent, but presenting with profoundly weakened and delayed smiles. These findings open a new avenue of inquiry into the “masked face” syndrome of PD. PMID:27375505

  17. Functional consequences of perceiving facial expressions of emotion without awareness.

    PubMed

    Eastwood, John D; Smilek, Daniel

    2005-09-01

    A substantial body of research has established that even when we are not consciously aware of the faces of others we are nevertheless sensitive to, and impacted by their facial expression. In this paper, we consider this body of research from a new perspective by examining the functions of unconscious perception revealed by these studies. A consideration of the literature from this perspective highlights that existing research methods are limited when it comes to revealing possible functions of unconscious perception. The critical shortcoming is that in all of the methods, the perceived facial expression remains outside of awareness. This is a problem because there are good reasons to believe that one important function of unconsciously perceived negative faces is to attract attention so that they are consciously perceived; such conscious perception, however, is never allowed with existing methodologies. We discuss recent studies of emotional face perception under conditions of visual search that address this issue directly. Further, we suggest that methodologies that do not examine cognitive processes as they occur in more natural settings may result in fundamental misunderstandings of human cognition.

  18. Facial expression recognition and emotional regulation in narcolepsy with cataplexy.

    PubMed

    Bayard, Sophie; Croisier Langenier, Muriel; Dauvilliers, Yves

    2013-04-01

    Cataplexy is pathognomonic of narcolepsy with cataplexy, and is defined by a transient loss of muscle tone triggered by strong emotions. Recent research suggests abnormal amygdala function in narcolepsy with cataplexy. Emotion processing and emotional regulation strategies are complex functions involving cortical and limbic structures, such as the amygdala. As the amygdala has been shown to play a role in facial emotion recognition, we tested the hypothesis that patients with narcolepsy with cataplexy would have impaired recognition of facial emotional expressions compared with patients affected with central hypersomnia without cataplexy and healthy controls. We also aimed to determine whether cataplexy modulates emotional regulation strategies. Emotional intensity, arousal and valence ratings on Ekman faces displaying happiness, surprise, fear, anger, disgust, sadness and neutral expressions of 21 drug-free patients with narcolepsy with cataplexy were compared with those of 23 drug-free sex-, age- and intellectual level-matched adult patients with hypersomnia without cataplexy and 21 healthy controls. All participants underwent polysomnography recording and multiple sleep latency tests, and completed depression, anxiety and emotional regulation questionnaires. Performance of patients with narcolepsy with cataplexy did not differ from that of patients with hypersomnia without cataplexy or healthy controls on either intensity rating of each emotion on its prototypical label or mean ratings for valence and arousal. Moreover, patients with narcolepsy with cataplexy did not use different emotional regulation strategies. The level of depressive and anxious symptoms in narcolepsy with cataplexy did not differ from the other groups. Our results demonstrate that patients with narcolepsy with cataplexy accurately perceive and discriminate facial emotions, and regulate their emotions normally. The absence of alteration of perceived affective valence remains of major clinical interest in narcolepsy with cataplexy.

  19. Facial expression recognition based on improved local ternary pattern and stacked auto-encoder

    NASA Astrophysics Data System (ADS)

    Wu, Yao; Qiu, Weigen

    2017-08-01

    In order to enhance the robustness of facial expression recognition, we propose a method based on an improved Local Ternary Pattern (LTP) combined with a Stacked Auto-Encoder (SAE). The method extracts features with the improved LTP operator and then uses an improved deep belief network as the detector and classifier of these features. This combination of improved LTP features with a deep network is applied to facial expression recognition, and the recognition rate on the CK+ database improves significantly.
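
    The basic local ternary pattern underlying the method codes each neighbour of a pixel as −1, 0, or +1 depending on whether it is darker than, close to, or brighter than the centre by a tolerance t; the paper's improvements and the SAE classifier are beyond this sketch:

```python
# Ternary code of the 8 neighbours of the centre pixel, with tolerance t.
import numpy as np

def ltp_codes(patch, t=5):
    """3x3 patch -> list of 8 codes in {-1, 0, +1}, clockwise from top-left."""
    c = patch[1, 1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    return [1 if patch[i, j] > c + t else -1 if patch[i, j] < c - t else 0
            for i, j in order]

patch = np.array([[52, 55, 61],
                  [59, 50, 49],
                  [42, 48, 44]])
print(ltp_codes(patch))  # [0, 0, 1, 0, -1, 0, -1, 1]
```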

  20. Symmetrical and Asymmetrical Interactions between Facial Expressions and Gender Information in Face Perception.

    PubMed

    Liu, Chengwei; Liu, Ying; Iqbal, Zahida; Li, Wenhui; Lv, Bo; Jiang, Zhongqing

    2017-01-01

    To investigate the interaction between facial expressions and facial gender information during face perception, the present study matched the intensities of the two types of information in face images and then adopted the orthogonal condition of the Garner Paradigm to present the images to participants who were required to judge the gender and expression of the faces; the gender and expression presentations were varied orthogonally. Gender and expression processing displayed a mutual interaction. On the one hand, the judgment of angry expressions occurred faster when presented with male facial images; on the other hand, the classification of the female gender occurred faster when presented with a happy facial expression than when presented with an angry facial expression. According to the event-related potential results, the expression classification was influenced by gender during the face structural processing stage (as indexed by N170), which indicates the promotion or interference of facial gender with the coding of facial expression features. However, gender processing was affected by facial expressions in more stages, including the early (P1) and late (LPC) stages of perceptual processing, reflecting that emotional expression influences gender processing mainly by directing attention.

  1. Symmetrical and Asymmetrical Interactions between Facial Expressions and Gender Information in Face Perception

    PubMed Central

    Liu, Chengwei; Liu, Ying; Iqbal, Zahida; Li, Wenhui; Lv, Bo; Jiang, Zhongqing

    2017-01-01

    To investigate the interaction between facial expressions and facial gender information during face perception, the present study matched the intensities of the two types of information in face images and then adopted the orthogonal condition of the Garner Paradigm to present the images to participants who were required to judge the gender and expression of the faces; the gender and expression presentations were varied orthogonally. Gender and expression processing displayed a mutual interaction. On the one hand, the judgment of angry expressions occurred faster when presented with male facial images; on the other hand, the classification of the female gender occurred faster when presented with a happy facial expression than when presented with an angry facial expression. According to the event-related potential results, the expression classification was influenced by gender during the face structural processing stage (as indexed by N170), which indicates the promotion or interference of facial gender with the coding of facial expression features. However, gender processing was affected by facial expressions in more stages, including the early (P1) and late (LPC) stages of perceptual processing, reflecting that emotional expression influences gender processing mainly by directing attention. PMID:28855882

  2. Contributions of feature shapes and surface cues to the recognition of facial expressions.

    PubMed

    Sormaz, Mladen; Young, Andrew W; Andrews, Timothy J

    2016-10-01

    Theoretical accounts of face processing often emphasise feature shapes as the primary visual cue to the recognition of facial expressions. However, changes in facial expression also affect the surface properties of the face. In this study, we investigated whether this surface information can also be used in the recognition of facial expression. First, participants identified facial expressions (fear, anger, disgust, sadness, happiness) from images that were manipulated such that they varied mainly in shape or mainly in surface properties. We found that the categorization of facial expression is possible in either type of image, but that different expressions are relatively dependent on surface or shape properties. Next, we investigated the relative contributions of shape and surface information to the categorization of facial expressions. This employed a complementary method that involved combining the surface properties of one expression with the shape properties from a different expression. Our results showed that the categorization of facial expressions in these hybrid images was equally dependent on the surface and shape properties of the image. Together, these findings provide a direct demonstration that both feature shape and surface information make significant contributions to the recognition of facial expressions.

  3. Exaggerated perception of facial expressions is increased in individuals with schizotypal traits

    PubMed Central

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2015-01-01

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions. PMID:26135081
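
    A minimal sketch of the kind of trait-perception correlation reported above, assuming an exaggeration index defined as the matched final intensity minus the actually presented final intensity; the variable names and random numbers below are stand-ins for illustration, not the study's data or scoring.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n = 40
    schizotypy = rng.normal(50, 10, n)     # questionnaire totals (stand-in values)
    matched = rng.normal(110, 8, n)        # matched image, % of true final intensity
    exaggeration = matched - 100           # > 0 means perceptual overshoot

    r, p = pearsonr(schizotypy, exaggeration)
    print(f"r = {r:.2f}, p = {p:.3f}")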

  4. Affect-specific activation of shared networks for perception and execution of facial expressions.

    PubMed

    Kircher, Tilo; Pohl, Anna; Krach, Sören; Thimm, Markus; Schulte-Rüther, Martin; Anders, Silke; Mathiak, Klaus

    2013-04-01

    Previous studies have shown overlapping neural activations for observation and execution or imitation of emotional facial expressions. These shared representations have been assumed to provide indirect evidence for a human mirror neuron system, which is suggested to be a prerequisite of action comprehension. We aimed at clarifying whether shared representations in and beyond human mirror areas are specifically activated by affective facial expressions or whether they are activated by facial expressions independent of the emotional meaning. During neuroimaging, participants observed and executed happy and non-emotional facial expressions. Shared representations were revealed for happy facial expressions in the pars opercularis, the precentral gyrus, in the superior temporal gyrus/medial temporal gyrus (MTG), in the pre-supplementary motor area and in the right amygdala. All areas showed less pronounced activation in the non-emotional condition. When directly compared, significantly stronger neural responses emerged for happy facial expressions in the pre-supplementary motor area and in the MTG than for non-emotional stimuli. We assume that activation of shared representations depends on the affect and (social) relevance of the facial expression. The pre-supplementary motor area is a core shared-representation structure supporting observation and execution of affective, contagious facial expressions and might have a modulatory role during the preparation of executing happy facial expressions.

  5. Affect-specific activation of shared networks for perception and execution of facial expressions

    PubMed Central

    Kircher, Tilo; Krach, Sören; Thimm, Markus; Schulte-Rüther, Martin; Anders, Silke; Mathiak, Klaus

    2013-01-01

    Previous studies have shown overlapping neural activations for observation and execution or imitation of emotional facial expressions. These shared representations have been assumed to provide indirect evidence for a human mirror neuron system, which is suggested to be a prerequisite of action comprehension. We aimed at clarifying whether shared representations in and beyond human mirror areas are specifically activated by affective facial expressions or whether they are activated by facial expressions independent of the emotional meaning. During neuroimaging, participants observed and executed happy and non-emotional facial expressions. Shared representations were revealed for happy facial expressions in the pars opercularis, the precentral gyrus, in the superior temporal gyrus/medial temporal gyrus (MTG), in the pre-supplementary motor area and in the right amygdala. All areas showed less pronounced activation in the non-emotional condition. When directly compared, significantly stronger neural responses emerged for happy facial expressions in the pre-supplementary motor area and in the MTG than for non-emotional stimuli. We assume that activation of shared representations depends on the affect and (social) relevance of the facial expression. The pre-supplementary motor area is a core shared-representation structure supporting observation and execution of affective, contagious facial expressions and might have a modulatory role during the preparation of executing happy facial expressions. PMID:22275167

  6. Exaggerated perception of facial expressions is increased in individuals with schizotypal traits.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2015-07-02

    Emotional facial expressions are indispensable communicative tools, and social interactions involving facial expressions are impaired in some psychiatric disorders. Recent studies revealed that the perception of dynamic facial expressions was exaggerated in normal participants, and this exaggerated perception is weakened in autism spectrum disorder (ASD). Based on the notion that ASD and schizophrenia spectrum disorder are at two extremes of the continuum with respect to social impairment, we hypothesized that schizophrenic characteristics would strengthen the exaggerated perception of dynamic facial expressions. To test this hypothesis, we investigated the relationship between the perception of facial expressions and schizotypal traits in a normal population. We presented dynamic and static facial expressions, and asked participants to change an emotional face display to match the perceived final image. The presence of schizotypal traits was positively correlated with the degree of exaggeration for dynamic, as well as static, facial expressions. Among its subscales, the paranoia trait was positively correlated with the exaggerated perception of facial expressions. These results suggest that schizotypal traits, specifically the tendency to over-attribute mental states to others, exaggerate the perception of emotional facial expressions.

  7. Can an anger face also be scared? Malleability of facial expressions.

    PubMed

    Widen, Sherri C; Naab, Pamela

    2012-10-01

    Do people always interpret a facial expression as communicating a single emotion (e.g., the anger face as only angry) or is that interpretation malleable? The current study investigated preschoolers' (N = 60; 3-4 years) and adults' (N = 20) categorization of facial expressions. On each of five trials, participants selected from an array of 10 facial expressions (an open-mouthed, high arousal expression and a closed-mouthed, low arousal expression each for happiness, sadness, anger, fear, and disgust) all those that displayed the target emotion. Children's interpretation of facial expressions was malleable: 48% of children who selected the fear, anger, sadness, and disgust faces for the "correct" category also selected these same faces for another emotion category; 47% of adults did so for the sadness and disgust faces. The emotion children and adults attribute to facial expressions is influenced by the emotion category for which they are looking.

  8. The relation of expression recognition and affective experience in facial expression processing: an event-related potential study

    PubMed Central

    Dong, Guangheng; Lu, Shenglan

    2010-01-01

    The present study investigates the relationship between expression recognition and affective experience during facial expression processing by means of event-related potentials (ERP). The facial expressions used in the present study fell into three categories: positive (happy), neutral, and negative (angry). Participants were asked to finish two kinds of facial recognition tasks: one was easy, and the other was difficult. In the easy task, significant main effects were found for the different valence conditions, meaning that emotions were evoked effectively when participants recognized the expressions in facial expression processing. However, no difference was found in the difficult task, meaning that even if participants had identified the expressions correctly, no relevant emotion was evoked during the process. The findings suggest that emotional experience was not simultaneous with expression identification in facial expression processing, and the affective experience process could be suppressed in challenging cognitive tasks. The results indicate that we should pay attention to the level of cognitive load when using facial expressions as emotion-eliciting materials in emotion studies; otherwise, the emotion may not be evoked effectively. PMID:22110330

  9. Face to face: blocking facial mimicry can selectively impair recognition of emotional expressions.

    PubMed

    Oberman, Lindsay M; Winkielman, Piotr; Ramachandran, Vilayanur S

    2007-01-01

    People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations, (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated muscles only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.
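
    The abstract does not specify the EMG pipeline; a common way to quantify activation of a facial muscle region is to band-pass filter, rectify, and average the raw trace, as in this hedged Python sketch (the filter settings are conventional choices, not the authors' reported parameters).

    import numpy as np
    from scipy.signal import butter, filtfilt

    def emg_mean_amplitude(signal: np.ndarray, fs: float) -> float:
        """Band-pass filter (20-450 Hz), full-wave rectify, and average."""
        b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
        return float(np.mean(np.abs(filtfilt(b, a, signal))))

    fs = 1000.0                                               # sampling rate (assumed)
    emg = np.random.default_rng(1).normal(size=int(2 * fs))   # stand-in raw trace
    print(emg_mean_amplitude(emg, fs))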

  10. Misinterpretation of Facial Expressions of Emotion in Verbal Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Eack, Shaun M.; Mazefsky, Carla A.; Minshew, Nancy J.

    2015-01-01

    Facial emotion perception is significantly affected in autism spectrum disorder, yet little is known about how individuals with autism spectrum disorder misinterpret facial expressions that result in their difficulty in accurately recognizing emotion in faces. This study examined facial emotion perception in 45 verbal adults with autism spectrum…

  11. High frequency of facial expressions corresponding to confusion, concentration, and worry in an analysis of naturally occurring facial expressions of Americans.

    PubMed

    Rozin, Paul; Cohen, Adam B

    2003-03-01

    College students were instructed to observe symmetric and asymmetric facial expressions and to report the target's judgment of the "emotion" she or he was expressing, the facial movements involved, and the more expressive side. For both asymmetric and symmetric expressions, some of the most common emotions or states reported are neither included in standard taxonomies of emotion nor studied as important signals. Confusion is the most common descriptor reported for asymmetric expressions and is commonly reported for symmetrical expressions as well. Other frequent descriptors were think-concentrate and worry. Confusion is characterized principally by facial movements around the eyes and has many properties usually attributed to emotions. There was no evidence for lateralization of positive versus negative valenced states.

  12. Facial expression of positive emotions in individuals with eating disorders.

    PubMed

    Dapelo, Marcela M; Hart, Sharon; Hale, Christiane; Morris, Robin; Lynch, Thomas R; Tchanturia, Kate

    2015-11-30

    A large body of research has associated Eating Disorders with difficulties in socio-emotional functioning, and it has been argued that these difficulties may serve to maintain the illness. This study aimed to explore facial expressions of positive emotions in individuals with Anorexia Nervosa (AN) and Bulimia Nervosa (BN) compared to healthy controls (HC), through an examination of the Duchenne smile (DS), which has been associated with feelings of enjoyment, amusement and happiness (Ekman et al., 1990). Sixty participants (AN=20; BN=20; HC=20) were videotaped while watching a humorous film clip. The duration and intensity of DS were subsequently analyzed using the facial action coding system (FACS) (Ekman and Friesen, 2003). Participants with AN displayed DS for shorter durations than BN and HC participants, and their DS had lower intensity. In the clinical groups, lower duration and intensity of DS were associated with lower BMI and with the use of psychotropic medication. The study is the first to explore DS in people with eating disorders, providing further evidence of difficulties in the socio-emotional domain in people with AN.
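
    As a sketch of how DS duration and intensity might be summarized from frame-by-frame FACS codes: here a Duchenne smile is operationalized as joint activation of AU6 (cheek raiser) and AU12 (lip corner puller), and the one-row-per-video-frame layout is an assumption, not the study's documented data format.

    import pandas as pd

    def duchenne_summary(frames: pd.DataFrame, fps: float = 25.0) -> dict:
        """frames: one row per video frame with AU6/AU12 intensity codes (0 = absent)."""
        ds = (frames["AU6"] > 0) & (frames["AU12"] > 0)
        duration_s = ds.sum() / fps
        intensity = frames.loc[ds, ["AU6", "AU12"]].mean().mean() if ds.any() else 0.0
        return {"duration_s": float(duration_s), "mean_intensity": float(intensity)}

    demo = pd.DataFrame({"AU6": [0, 2, 3, 0], "AU12": [1, 3, 4, 0]})
    print(duchenne_summary(demo, fps=2.0))   # {'duration_s': 1.0, 'mean_intensity': 3.0}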

  13. Paedomorphic facial expressions give dogs a selective advantage.

    PubMed

    Waller, Bridget M; Peirce, Kate; Caeiro, Cátia C; Scheider, Linda; Burrows, Anne M; McCune, Sandra; Kaminski, Juliane

    2013-01-01

    How wolves were first domesticated is unknown. One hypothesis suggests that wolves underwent a process of self-domestication by tolerating human presence and taking advantage of scavenging possibilities. The puppy-like physical and behavioural traits seen in dogs are thought to have evolved later, as a byproduct of selection against aggression. Using speed of selection from rehoming shelters as a proxy for artificial selection, we tested whether paedomorphic features give dogs a selective advantage in their current environment. Dogs who exhibited facial expressions that enhance their neonatal appearance were preferentially selected by humans. Thus, early domestication of wolves may have occurred not only as wolf populations became tamer, but also as they exploited human preferences for paedomorphic characteristics. These findings, therefore, add to our understanding of early dog domestication as a complex co-evolutionary process.

  14. Paedomorphic Facial Expressions Give Dogs a Selective Advantage

    PubMed Central

    Waller, Bridget M.; Peirce, Kate; Caeiro, Cátia C.; Scheider, Linda; Burrows, Anne M.; McCune, Sandra; Kaminski, Juliane

    2013-01-01

    How wolves were first domesticated is unknown. One hypothesis suggests that wolves underwent a process of self-domestication by tolerating human presence and taking advantage of scavenging possibilities. The puppy-like physical and behavioural traits seen in dogs are thought to have evolved later, as a byproduct of selection against aggression. Using speed of selection from rehoming shelters as a proxy for artificial selection, we tested whether paedomorphic features give dogs a selective advantage in their current environment. Dogs who exhibited facial expressions that enhance their neonatal appearance were preferentially selected by humans. Thus, early domestication of wolves may have occurred not only as wolf populations became tamer, but also as they exploited human preferences for paedomorphic characteristics. These findings, therefore, add to our understanding of early dog domestication as a complex co-evolutionary process. PMID:24386109

  15. Decoding facial expressions based on face-selective and motion-sensitive areas.

    PubMed

    Liang, Yin; Liu, Baolin; Xu, Junhai; Zhang, Gaoyan; Li, Xianglin; Wang, Peiyuan; Wang, Bin

    2017-06-01

    Humans can easily recognize others' facial expressions. Among the brain substrates that enable this ability, considerable attention has been paid to face-selective areas; in contrast, whether motion-sensitive areas, which clearly exhibit sensitivity to facial movements, are involved in facial expression recognition has remained unclear. The present functional magnetic resonance imaging (fMRI) study used multi-voxel pattern analysis (MVPA) to explore facial expression decoding in both face-selective and motion-sensitive areas. In a block design experiment, participants viewed facial expressions of six basic emotions (anger, disgust, fear, joy, sadness, and surprise) in images, videos, and eyes-obscured videos. Due to the use of multiple stimulus types, the impacts of facial motion and eye-related information on facial expression decoding were also examined. It was found that motion-sensitive areas showed significant responses to emotional expressions and that dynamic expressions could be successfully decoded in both face-selective and motion-sensitive areas. Compared with static stimuli, dynamic expressions elicited consistently higher neural responses and decoding performance in all regions. A significant decrease in both activation and decoding accuracy due to the absence of eye-related information was also observed. Overall, the findings showed that emotional expressions are represented in motion-sensitive areas in addition to conventional face-selective areas, suggesting that motion-sensitive regions may also effectively contribute to facial expression recognition. The results also suggested that facial motion and eye-related information played important roles by carrying considerable expression information that could facilitate facial expression recognition.
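
    A minimal sketch of ROI-based MVPA along the lines described above: a linear support vector machine with leave-one-run-out cross-validation over six expression classes. The synthetic patterns, trial counts, and classifier settings are assumptions for illustration, not the study's actual pipeline.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    n_runs, n_trials_per_run, n_voxels = 8, 12, 200
    X = rng.normal(size=(n_runs * n_trials_per_run, n_voxels))  # ROI patterns (synthetic)
    y = np.tile(np.repeat(np.arange(6), 2), n_runs)             # six expression labels
    runs = np.repeat(np.arange(n_runs), n_trials_per_run)

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    acc = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())
    print(f"decoding accuracy: {acc.mean():.2f} (chance = {1 / 6:.2f})")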

  16. The mysterious noh mask: contribution of multiple facial parts to the recognition of emotional expressions.

    PubMed

    Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo; Kawai, Nobuyuki

    2012-01-01

    A Noh mask worn by expert actors when performing in a Japanese traditional Noh drama is suggested to convey countless different facial expressions according to different angles of head/body orientation. The present study addressed the question of how different facial parts of a Noh mask, including the eyebrows, the eyes, and the mouth, may contribute to different emotional expressions. Both experimental situations of active creation and passive recognition of emotional facial expressions were introduced. In Experiment 1, participants either created happy or sad facial expressions, or imitated a face that looked up or down, by actively changing each facial part of a Noh mask image presented on a computer screen. For an upward tilted mask, the eyebrows and the mouth shared common features with sad expressions, whereas the eyes shared common features with happy expressions. This contingency tended to be reversed for a downward tilted mask. Experiment 2 further examined which facial parts of a Noh mask are crucial in determining emotional expressions. Participants were exposed to synthesized Noh mask images with different facial parts expressing different emotions. Results clearly revealed that participants primarily used the shape of the mouth in judging emotions. The facial images having the mouth of an upward/downward tilted Noh mask strongly tended to be evaluated as sad/happy, respectively. The results suggest that Noh masks express chimeric emotional patterns, with different facial parts conveying different emotions. This appears consistent with the principles of Noh, which highly appreciate subtle and composite emotional expressions, as well as with the mysterious facial expressions observed in Western art. It was further demonstrated that the mouth serves as a diagnostic feature in characterizing the emotional expressions. This indicates the superiority of biologically-driven factors over the traditionally formulated performing styles when evaluating the emotions of the Noh masks.

  17. The Mysterious Noh Mask: Contribution of Multiple Facial Parts to the Recognition of Emotional Expressions

    PubMed Central

    Miyata, Hiromitsu; Nishimura, Ritsuko; Okanoya, Kazuo; Kawai, Nobuyuki

    2012-01-01

    Background A Noh mask worn by expert actors when performing in a Japanese traditional Noh drama is suggested to convey countless different facial expressions according to different angles of head/body orientation. The present study addressed the question of how different facial parts of a Noh mask, including the eyebrows, the eyes, and the mouth, may contribute to different emotional expressions. Both experimental situations of active creation and passive recognition of emotional facial expressions were introduced. Methodology/Principal Findings In Experiment 1, participants either created happy or sad facial expressions, or imitated a face that looked up or down, by actively changing each facial part of a Noh mask image presented on a computer screen. For an upward tilted mask, the eyebrows and the mouth shared common features with sad expressions, whereas the eyes shared common features with happy expressions. This contingency tended to be reversed for a downward tilted mask. Experiment 2 further examined which facial parts of a Noh mask are crucial in determining emotional expressions. Participants were exposed to synthesized Noh mask images with different facial parts expressing different emotions. Results clearly revealed that participants primarily used the shape of the mouth in judging emotions. The facial images having the mouth of an upward/downward tilted Noh mask strongly tended to be evaluated as sad/happy, respectively. Conclusions/Significance The results suggest that Noh masks express chimeric emotional patterns, with different facial parts conveying different emotions. This appears consistent with the principles of Noh, which highly appreciate subtle and composite emotional expressions, as well as with the mysterious facial expressions observed in Western art. It was further demonstrated that the mouth serves as a diagnostic feature in characterizing the emotional expressions. This indicates the superiority of biologically-driven factors over the traditionally formulated performing styles when evaluating the emotions of the Noh masks.

  18. Gamma-band activity reflects attentional guidance by facial expression.

    PubMed

    Müsch, Kathrin; Siegel, Markus; Engel, Andreas K; Schneider, Till R

    2017-02-01

    Facial expressions attract attention due to their motivational significance. Previous work focused on attentional biases towards threat-related, fearful faces, although healthy participants tend to avoid mild threat. Growing evidence suggests that neuronal gamma-band (>30 Hz) and alpha-band (8-12 Hz) activity play an important role in attentional selection, but it is unknown if such oscillatory activity is involved in the guidance of attention through facial expressions. Thus, in this magnetoencephalography (MEG) study we investigated whether attention is shifted towards or away from fearful faces and characterized the underlying neuronal activity in these frequency ranges in forty-four healthy volunteers. We employed a covert spatial attention task using neutral and fearful faces as task-irrelevant distractors and emotionally neutral Gabor patches as targets. Participants had to indicate the tilt direction of the target. Analysis of the neuronal data was restricted to the responses to target Gabor patches. We performed statistical analysis at the sensor level and used subsequent source reconstruction to localize the observed effects. Spatially selective attention effects in the alpha and gamma bands were revealed in parieto-occipital regions. We observed an attentional cost of processing the face distractors, as reflected in lower task performance on targets with a short stimulus onset asynchrony (SOA < 150 ms) between faces and targets. On the neuronal level, attentional orienting to face distractors led to enhanced gamma-band activity in bilateral occipital and parietal regions when fearful faces were presented in the same hemifield as targets, but only in short SOA trials. Our findings provide evidence that both top-down and bottom-up attentional biases are reflected in parieto-occipital gamma-band activity.
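
    The study's analysis is task-locked and source-reconstructed; purely to make the band definitions above concrete, this Python sketch estimates alpha (8-12 Hz) and gamma (30-100 Hz) power in a single channel with Welch's method. All settings and the synthetic signal are assumptions.

    import numpy as np
    from scipy.signal import welch

    def band_power(x: np.ndarray, fs: float, lo: float, hi: float) -> float:
        """Integrated spectral power between lo and hi (in Hz)."""
        f, pxx = welch(x, fs=fs, nperseg=int(fs))
        mask = (f >= lo) & (f <= hi)
        return float(pxx[mask].sum() * (f[1] - f[0]))

    fs = 600.0
    t = np.arange(0, 2, 1 / fs)
    x = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.default_rng(3).normal(size=t.size)
    print(band_power(x, fs, 8, 12), band_power(x, fs, 30, 100))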

  19. Altered saccadic targets when processing facial expressions under different attentional and stimulus conditions.

    PubMed

    Boutsen, Frank A; Dvorak, Justin D; Pulusu, Vinay K; Ross, Elliott D

    2017-03-13

    Depending on a subject's attentional bias, robust changes in emotional perception occur when facial blends (different emotions expressed on upper/lower face) are presented tachistoscopically. If no instructions are given, subjects overwhelmingly identify the lower facial expression when blends are presented to either visual field. If asked to attend to the upper face, subjects overwhelmingly identify the upper facial expression in the left visual field but remain slightly biased to the lower facial expression in the right visual field. The current investigation sought to determine whether differences in initial saccadic targets could help explain the perceptual biases described above. Ten subjects were presented with full and blend facial expressions under different attentional conditions. No saccadic differences were found for left versus right visual field presentations or for full facial versus blend stimuli. When asked to identify the presented emotion, saccades were directed to the lower face. When asked to attend to the upper face, saccades were directed to the upper face. When asked to attend to the upper face and try to identify the emotion, saccades were directed to the upper face but to a lesser degree. Thus, saccadic behavior supports the concept that there are cognitive-attentional pre-attunements when subjects visually process facial expressions. However, these pre-attunements do not fully explain the perceptual superiority of the left visual field for identifying the upper facial expression when facial blends are presented tachistoscopically. Hence other perceptual factors must be in play, such as the phenomenon of virtual scanning.

  1. Altered Kinematics of Facial Emotion Expression and Emotion Recognition Deficits Are Unrelated in Parkinson's Disease.

    PubMed

    Bologna, Matteo; Berardelli, Isabella; Paparella, Giulia; Marsili, Luca; Ricciardi, Lucia; Fabbrini, Giovanni; Berardelli, Alfredo

    2016-01-01

    Altered emotional processing, including reduced facial expression of emotion and defective emotion recognition, has been reported in patients with Parkinson's disease (PD). However, few studies have objectively investigated facial expression abnormalities in PD using neurophysiological techniques, and it is not known whether altered facial expression and recognition in PD are related. This study therefore investigated possible deficits in facial emotion expression and emotion recognition, and their relationship, if any, in patients with PD. Eighteen patients with PD and 16 healthy controls were enrolled. Facial expressions of emotion were recorded using a 3D optoelectronic system and analyzed using the facial action coding system. Possible deficits in emotion recognition were assessed using the Ekman test. Participants were assessed in one experimental session. Possible relationships between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients were evaluated using Spearman's test and multiple regression analysis. The facial expression of all six basic emotions had slower velocity and lower amplitude in patients in comparison to healthy controls (all Ps < 0.05). Patients also yielded a worse Ekman global score and worse disgust, sadness, and fear sub-scores than healthy controls (all Ps < 0.001). Altered facial expression kinematics and emotion recognition deficits were unrelated in patients (all Ps > 0.05). Finally, no relationship emerged between the kinematic variables of facial emotion expression, the Ekman test scores, and clinical and demographic data in patients (all Ps > 0.05). The results of this study provide further evidence of altered emotional processing in PD. The lack of any correlation between altered facial emotion expression kinematics and emotion recognition deficits in patients suggests that these abnormalities are mediated by separate pathophysiological mechanisms.
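
    The kinematic variables named above (amplitude, velocity) can be derived from an optoelectronic marker trajectory roughly as follows; the marker layout, units, and absence of smoothing are illustrative assumptions, not the authors' documented procedure.

    import numpy as np

    def kinematics(xyz: np.ndarray, fs: float) -> tuple:
        """xyz: (n_samples, 3) marker positions in mm.
        Returns (amplitude in mm, peak velocity in mm/s)."""
        displacement = np.linalg.norm(xyz - xyz[0], axis=1)
        velocity = np.gradient(displacement, 1.0 / fs)
        return float(displacement.max()), float(np.abs(velocity).max())

    fs = 100.0
    t = np.linspace(0, 1, int(fs))
    marker = np.stack([np.zeros_like(t), np.zeros_like(t), 5 * np.sin(np.pi * t)], axis=1)
    print(kinematics(marker, fs))   # ~ (5.0 mm, ~15.7 mm/s)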

  2. Effects of Intensity of Facial Expressions on Amygdalar Activation Independently of Valence.

    PubMed

    Lin, Huiyan; Mueller-Bardorff, Miriam; Mothes-Lasch, Martin; Buff, Christine; Brinkmann, Leonie; Miltner, Wolfgang H R; Straube, Thomas

    2016-01-01

    For several stimulus categories (e.g., pictures, odors, and words), the arousal of both negative and positive stimuli has been shown to modulate amygdalar activation. In contrast, previous studies did not observe similar amygdalar effects in response to negative and positive facial expressions of varying intensity. Reasons for this discrepancy may be related to analytical strategies, experimental design and stimuli. Therefore, the present study aimed at re-investigating whether the intensity of facial expressions modulates amygdalar activation by circumventing limitations of previous research. Event-related functional magnetic resonance imaging was used to assess brain activation while participants observed a static neutral expression and positive (happy) and negative (angry) expressions of either high or low intensity from an ecologically valid, novel stimulus set. The ratings of arousal and intensity were highly correlated. We found that amygdalar activation followed a u-shaped pattern, with the highest activation to highly intense facial expressions as compared to low-intensity facial expressions and to the neutral expression, irrespective of valence, suggesting a critical role of the amygdala in valence-independent arousal processing of facial expressions. Additionally, consistent with previous studies, intensity effects were also found in visual areas, and generally increased activation to angry versus happy faces was found in the visual cortex and insula, indicating enhanced visual representations of highly arousing facial expressions and increased visual and somatosensory representations of threat.
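
    One way to formalize the u-shaped pattern is a quadratic fit of response against signed intensity (negative values = angry, positive = happy, 0 = neutral); the numbers below are made up for illustration and are not the study's data.

    import numpy as np

    signed_intensity = np.array([-2, -1, 0, 1, 2])   # angry-high ... happy-high
    bold = np.array([0.9, 0.4, 0.1, 0.5, 1.0])       # % signal change (fabricated)
    quad = np.polyfit(signed_intensity, bold, deg=2)[0]
    print(f"quadratic term = {quad:.2f} (positive -> u-shape)")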

  3. Effects of Intensity of Facial Expressions on Amygdalar Activation Independently of Valence

    PubMed Central

    Lin, Huiyan; Mueller-Bardorff, Miriam; Mothes-Lasch, Martin; Buff, Christine; Brinkmann, Leonie; Miltner, Wolfgang H. R.; Straube, Thomas

    2016-01-01

    For several stimulus categories (e.g., pictures, odors, and words), the arousal of both negative and positive stimuli has been shown to modulate amygdalar activation. In contrast, previous studies did not observe similar amygdalar effects in response to negative and positive facial expressions of varying intensity. Reasons for this discrepancy may be related to analytical strategies, experimental design and stimuli. Therefore, the present study aimed at re-investigating whether the intensity of facial expressions modulates amygdalar activation by circumventing limitations of previous research. Event-related functional magnetic resonance imaging was used to assess brain activation while participants observed a static neutral expression and positive (happy) and negative (angry) expressions of either high or low intensity from an ecologically valid, novel stimulus set. The ratings of arousal and intensity were highly correlated. We found that amygdalar activation followed a u-shaped pattern, with the highest activation to highly intense facial expressions as compared to low-intensity facial expressions and to the neutral expression, irrespective of valence, suggesting a critical role of the amygdala in valence-independent arousal processing of facial expressions. Additionally, consistent with previous studies, intensity effects were also found in visual areas, and generally increased activation to angry versus happy faces was found in the visual cortex and insula, indicating enhanced visual representations of highly arousing facial expressions and increased visual and somatosensory representations of threat. PMID:28066216

  4. Do Dynamic Facial Expressions Convey Emotions to Children Better than Do Static Ones?

    ERIC Educational Resources Information Center

    Widen, Sherri C.; Russell, James A.

    2015-01-01

    Past research has shown that children recognize emotions from facial expressions poorly and improve only gradually with age, but the stimuli in such studies have been static faces. Because dynamic faces include more information, it may well be that children more readily recognize emotions from dynamic facial expressions. The current study of…

  5. Effectiveness of Teaching Naming Facial Expression to Children with Autism via Video Modeling

    ERIC Educational Resources Information Center

    Akmanoglu, Nurgul

    2015-01-01

    This study aims to examine the effectiveness of teaching the naming of emotional facial expressions via video modeling to children with autism. The naming of emotions (happy, sad, scared, disgusted, surprised, feeling physical pain, and bored) was taught by creating situations that lead to the emergence of facial expressions to children…

  6. The Relationship between Processing Facial Identity and Emotional Expression in 8-Month-Old Infants

    ERIC Educational Resources Information Center

    Schwarzer, Gudrun; Jovanovic, Bianca

    2010-01-01

    In Experiment 1, it was investigated whether infants process facial identity and emotional expression independently or in conjunction with one another. Eight-month-old infants were habituated to two upright or two inverted faces varying in facial identity and emotional expression. Infants were tested with a habituation face, a switch face, and a…

  7. Brief Report: Representational Momentum for Dynamic Facial Expressions in Pervasive Developmental Disorder

    ERIC Educational Resources Information Center

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2010-01-01

    Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of…

  8. Evaluating Posed and Evoked Facial Expressions of Emotion from Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Faso, Daniel J.; Sasson, Noah J.; Pinkham, Amy E.

    2015-01-01

    Though many studies have examined facial affect perception by individuals with autism spectrum disorder (ASD), little research has investigated how facial expressivity in ASD is perceived by others. Here, naïve female observers (n = 38) judged the intensity, naturalness and emotional category of expressions produced by adults with ASD (n = 6) and…

  9. Does Gaze Direction Modulate Facial Expression Processing in Children with Autism Spectrum Disorder?

    ERIC Educational Resources Information Center

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent…

  10. Preschooler's Faces in Spontaneous Emotional Contexts--How Well Do They Match Adult Facial Expression Prototypes?

    ERIC Educational Resources Information Center

    Gaspar, Augusta; Esteves, Francisco G.

    2012-01-01

    Prototypical facial expressions of emotion, also known as universal facial expressions, are the underpinnings of most research concerning recognition of emotions in both adults and children. Data on natural occurrences of these prototypes in natural emotional contexts are rare and difficult to obtain in adults. By recording naturalistic…

  11. Spontaneous Facial Expressions in Congenitally Blind and Sighted Children Aged 8-11.

    ERIC Educational Resources Information Center

    Galati, Dario; Sini, Barbara; Schmidt, Susanne; Tinti, Carla

    2003-01-01

    This study found that the emotional facial expressions of 10 congenitally blind and 10 sighted children, ages 8-11, were similar. However, the frequency of certain facial movements was higher in the blind children than in the sighted children, and social influences were evident only in the expressions of the sighted children, who often masked…

  12. Recognition of Facial Expressions and Prosodic Cues with Graded Emotional Intensities in Adults with Asperger Syndrome

    ERIC Educational Resources Information Center

    Doi, Hirokazu; Fujisawa, Takashi X.; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-01-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group…

  13. Mu desynchronization during observation and execution of facial expressions in 30-month-old children.

    PubMed

    Rayson, Holly; Bonaiuto, James John; Ferrari, Pier Francesco; Murray, Lynne

    2016-06-01

    Simulation theories propose that observing another's facial expression activates sensorimotor representations involved in the execution of that expression, facilitating recognition processes. The mirror neuron system (MNS) is a potential mechanism underlying simulation of facial expressions, with like neural processes activated both during observation and performance. Research with monkeys and adult humans supports this proposal, but so far there have been no investigations of facial MNS activity early in human development. The current study used electroencephalography (EEG) to explore mu rhythm desynchronization, an index of MNS activity, in 30-month-old children as they observed videos of dynamic emotional and non-emotional facial expressions, as well as scrambled versions of the same videos. We found significant mu desynchronization in central regions during observation and execution of both emotional and non-emotional facial expressions, which was right-lateralized for emotional and bilateral for non-emotional expressions during observation. These findings support previous research suggesting movement simulation during observation of facial expressions, and are the first to provide evidence for sensorimotor activation during observation of facial expressions, consistent with a functioning facial MNS at an early stage of human development.
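
    Mu desynchronization is conventionally quantified as the percent change of band power in an active window relative to a pre-stimulus baseline. In the sketch below, the band limits, window placement, and FFT-based power estimator are assumptions (infant and toddler mu is often taken around 6-9 Hz), not the study's reported parameters.

    import numpy as np

    def mu_erd(epoch: np.ndarray, fs: float, band=(6.0, 9.0),
               baseline=(0.0, 1.0), active=(1.0, 2.0)) -> float:
        """epoch: 1-D central-channel trace; returns ERD% (negative = desynchronization)."""
        def band_power(segment: np.ndarray) -> float:
            freqs = np.fft.rfftfreq(segment.size, 1 / fs)
            power = np.abs(np.fft.rfft(segment)) ** 2
            m = (freqs >= band[0]) & (freqs <= band[1])
            return float(power[m].mean())
        b0, b1 = int(baseline[0] * fs), int(baseline[1] * fs)
        a0, a1 = int(active[0] * fs), int(active[1] * fs)
        p_base, p_act = band_power(epoch[b0:b1]), band_power(epoch[a0:a1])
        return (p_act - p_base) / p_base * 100.0

    fs = 250.0
    epoch = np.random.default_rng(4).normal(size=int(2 * fs))
    epoch[: int(fs)] += np.sin(2 * np.pi * 8 * np.arange(int(fs)) / fs)  # stronger mu in baseline
    print(f"{mu_erd(epoch, fs):.1f} % (negative = desynchronization)")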

  14. Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions.

    PubMed

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio

    2015-01-01

    The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expression recognition task. Recognition bias was measured as participants' tendency to over-attribute the anger label to other negative facial expressions. Participants' heart rate was assessed and related to their behavioral performance, as an index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants' performance was controlled for age, cognitive and educational levels, and naming skills. None of these variables influenced the recognition bias for angry facial expressions. In contrast, a significant effect of heart rate on participants' tendency to use the anger label was evident. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children's "pre-existing bias" for anger labeling in a forced-choice emotion recognition task. Moreover, they strengthen the thesis according to which the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes the victim's perceptive and attentive focus on salient environmental social stimuli.
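
    A scoring rule inferred from the abstract rather than taken from the paper: the bias can be expressed as the proportion of non-angry negative expressions labelled as anger in the forced-choice task. The column names and demo rows are hypothetical.

    import pandas as pd

    def anger_bias(responses: pd.DataFrame) -> float:
        """responses: columns 'true_label' and 'chosen_label', one row per trial."""
        negatives = responses[responses["true_label"].isin(["fear", "sadness", "disgust"])]
        return float((negatives["chosen_label"] == "anger").mean())

    demo = pd.DataFrame({
        "true_label": ["fear", "sadness", "disgust", "happiness"],
        "chosen_label": ["anger", "sadness", "anger", "happiness"],
    })
    print(anger_bias(demo))   # ~0.67: two of the three negatives mislabelled as anger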

  15. Compound facial expressions of emotion: from basic research to clinical applications.

    PubMed

    Du, Shichuan; Martinez, Aleix M

    2015-12-01

    Emotions are sometimes revealed through facial expressions. When these natural facial articulations involve the contraction of the same muscle groups in people of distinct cultural upbringings, this is taken as evidence of a biological origin of these emotions. While past research had identified facial expressions associated with a single internally felt category (e.g., the facial expression of happiness when we feel joyful), we have recently studied facial expressions observed when people experience compound emotions (e.g., the facial expression of happy surprise when we feel joyful in a surprised way, as, for example, at a surprise birthday party). Our research has identified 17 compound expressions consistently produced across cultures, suggesting that the number of facial expressions of emotion of biological origin is much larger than previously believed. The present paper provides an overview of these findings and shows evidence supporting the view that spontaneous expressions are produced using the same facial articulations previously identified in laboratory experiments. We also discuss the implications of our results in the study of psychopathologies, and consider several open research questions.

  1. Selective attention and facial expression recognition in patients with Parkinson's disease.

    PubMed

    Alonso-Recio, Laura; Serrano, Juan M; Martín, Pilar

    2014-06-01

    Parkinson's disease (PD) has been associated with facial expression recognition difficulties. However, this impairment could be secondary to one produced in other cognitive processes involved in recognition, such as selective attention. This study investigates the influence of two selective attention components (inhibition and visual search) on facial expression recognition in PD. We compared the facial expression and non-emotional stimulus recognition abilities of 51 patients and 51 healthy controls by means of an adapted Stroop task and "The Face in the Crowd" paradigm, which assess inhibition and visual search abilities, respectively. Patients scored worse than controls in both tasks with facial expressions, but not with the other non-emotional stimuli, indicating a specific emotional recognition impairment that does not depend on selective attention abilities. This should be taken into account in patients' neuropsychological assessment, given the relevance of emotional facial expression for social communication in everyday settings.

  2. Modulation of perception and brain activity by predictable trajectories of facial expressions.

    PubMed

    Furl, N; van Rijsbergen, N J; Kiebel, S J; Friston, K J; Treves, A; Dolan, R J

    2010-03-01

    People track facial expression dynamics with ease to accurately perceive distinct emotions. Although the superior temporal sulcus (STS) appears to possess mechanisms for perceiving changeable facial attributes such as expressions, the nature of the underlying neural computations is not known. Motivated by novel theoretical accounts, we hypothesized that visual and motor areas represent expressions as anticipated motion trajectories. Using magnetoencephalography, we show predictable transitions between fearful and neutral expressions (compared with scrambled and static presentations) heighten activity in visual cortex as quickly as 165 ms poststimulus onset and later (237 ms) engage fusiform gyrus, STS and premotor areas. Consistent with proposed models of biological motion representation, we suggest that visual areas predictively represent coherent facial trajectories. We show that such representations bias emotion perception of subsequent static faces, suggesting that facial movements elicit predictions that bias perception. Our findings reveal critical processes evoked in the perception of dynamic stimuli such as facial expressions, which can endow perception with temporal continuity.

  3. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults.

    PubMed

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development: the Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions (angry, fearful, sad, happy, surprised, and disgusted) and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants.
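
    Validity data of this kind typically reduce to item-level agreement between raters' labels and the intended expression. A hedged sketch follows; the column names are assumptions, and the published CAFE validation also reports reliability statistics beyond this simple hit rate.

    import pandas as pd

    def item_validity(ratings: pd.DataFrame) -> pd.Series:
        """ratings: columns 'photo_id', 'intended', 'response', one row per rating.
        Returns the proportion of raters matching the intended label, per photo."""
        hit = ratings["response"] == ratings["intended"]
        return hit.groupby(ratings["photo_id"]).mean().sort_values()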

  4. Self-monitoring of social facial expressions in the primate amygdala and cingulate cortex

    PubMed Central

    Livneh, Uri; Resnik, Jennifer; Shohat, Yosi; Paz, Rony

    2012-01-01

    Keeping track of self-executed facial expressions is essential for the ability to correctly interpret and reciprocate social expressions. However, little is known about the neural mechanisms that participate in self-monitoring of facial expression. We designed a natural paradigm for social interactions where a monkey is seated in front of a peer monkey that is concealed by an opaque liquid crystal display shutter positioned between them. Opening the shutter for short durations allowed the monkeys to see each other and encouraged facial communication. To explore the neural mechanisms that participate in self-monitoring of facial expression, we simultaneously recorded the elicited natural facial interactions and the neural activity of single neurons in the amygdala and the dorsal anterior cingulate cortex (dACC), two regions implicated in the decoding of others' gestures. Neural activity in both regions was temporally locked to distinctive facial gestures, and close inspection of time lags revealed activity that either preceded (production) or lagged (monitoring) the initiation of facial expressions. This result indicates that single neurons in the dACC and the amygdala hold information about self-executed facial expressions and demonstrates an intimate overlap between the neural networks that participate in decoding and production of socially informative facial information. PMID:23112157

  5. [Emotional intelligence and oscillatory responses to emotional facial expressions].

    PubMed

    Kniazev, G G; Mitrofanova, L G; Bocharov, A V

    2013-01-01

    Emotional intelligence-related differences in oscillatory responses to emotional facial expressions were investigated in 48 subjects (26 men and 22 women) aged 18-30 years. Participants were instructed to evaluate the emotional expression (angry, happy, or neutral) of each presented face on an analog scale ranging from -100 (very hostile) to +100 (very friendly). High emotional intelligence (EI) participants were found to be more sensitive to the emotional content of the stimuli. This showed up both in their subjective evaluation of the stimuli and in stronger EEG theta synchronization at an early processing stage (between 100 and 500 ms after face presentation). Source localization using sLORETA showed that this effect was localized in the fusiform gyrus upon the presentation of angry faces and in the posterior cingulate gyrus upon the presentation of happy faces. At a later processing stage (500-870 ms), event-related theta synchronization in high-EI subjects was higher in the left prefrontal cortex upon the presentation of happy faces, but lower in the anterior cingulate cortex upon the presentation of angry faces. This suggests the existence of a mechanism that can selectively increase positive emotions and reduce negative emotions.

  6. [Context effects on the recognition of emotional facial expressions].

    PubMed

    Kostandov, E A; Cheremushkin, E A; Iakovenko, I A; Ashkinazi, M L

    2010-11-01

    The influence of additional working memory load on emotional face recognition was studied in healthy adults. A visual set to emotional facial expression was experimentally formed, and two types of additional task (visual-spatial or semantic) were embedded in the experiment. The additional task made the set less plastic, i.e., produced slower set-shifting. This effect manifested itself as an increase in erroneous perceptions of facial expression. The character of these erroneous perceptions (assimilative, contrast, or visual illusions) depended on the type of additional task. Pre-stimulus EEG coherence across experimental trials in the theta (4-7 Hz), low-alpha (8-10 Hz), and beta (14-20 Hz) bands was analysed. The low-alpha and beta coherence data supported the hypothesis that increased memory load reduces the involvement of the frontal lobes in the selective attention mechanisms associated with set-forming, which results in slower set-shifting. Increased memory load also led to a growth of theta-band coherence in the left hemisphere and a decrease in the right hemisphere. The role of the decrease in theta coherence between right prefrontal and temporal areas in slower set-shifting is discussed.
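
    Purely to make the band definitions above concrete, this sketch computes magnitude-squared coherence between two electrode traces and averages it within each named band; the segment length and band-averaging convention are assumptions, not the original analysis settings.

    import numpy as np
    from scipy.signal import coherence

    BANDS = {"theta": (4, 7), "low_alpha": (8, 10), "beta": (14, 20)}

    def band_coherence(x: np.ndarray, y: np.ndarray, fs: float) -> dict:
        """Mean magnitude-squared coherence of x and y within each band (Hz)."""
        f, cxy = coherence(x, y, fs=fs, nperseg=int(2 * fs))
        return {name: float(cxy[(f >= lo) & (f <= hi)].mean())
                for name, (lo, hi) in BANDS.items()}

    fs = 256.0
    rng = np.random.default_rng(5)
    shared = rng.normal(size=int(8 * fs))    # common driver -> nonzero coherence
    x = shared + 0.5 * rng.normal(size=shared.size)
    y = shared + 0.5 * rng.normal(size=shared.size)
    print(band_coherence(x, y, fs))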

  7. The amygdalo-motor pathways and the control of facial expressions

    PubMed Central

    Gothard, Katalin M.

    2013-01-01

    Facial expressions reflect decisions about the perceived meaning of social stimuli and the expected socio-emotional outcome of responding (or not) with a reciprocating expression. The decision to produce a facial expression emerges from the joint activity of a network of structures that include the amygdala and multiple, interconnected cortical and subcortical motor areas. Reciprocal transformations between these sensory and motor signals give rise to distinct brain states that promote or impede the production of facial expressions. The muscles of the upper and lower face are controlled by anatomically distinct motor areas. Facial expressions engage the lower and upper face to different extents and thus require distinct patterns of neural activity distributed across multiple facial motor areas in ventrolateral frontal cortex, the supplementary motor area, and two areas in the midcingulate cortex. The distributed nature of the decision manifests in the joint activation of multiple motor areas that initiate the production of facial expression. Concomitantly, multiple areas, including the amygdala, monitor ongoing overt behaviors (the expression itself) and the covert, autonomic responses that accompany emotional expressions. As the production of facial expressions is brought into the framework of formal decision making, an important challenge will be to incorporate autonomic and visceral states into decisions that govern the receiving-emitting cycle of social signals. PMID:24678289

  9. The effect of emotional facial expressions on children's working memory: associations with age and behavior.

    PubMed

    Augusti, Else-Marie; Torheim, Hanna Karoline; Melinder, Annika

    2014-01-01

    Studies on adults have revealed a disadvantageous effect of negative emotional stimuli on executive functions (EF), and it has been suggested that this effect is amplified in children. The present study's aim was to assess how emotional facial expressions affect working memory in 9- to 12-year-olds, using a working memory task with emotional facial expressions as stimuli. Additionally, we explored how the degree of internalizing and externalizing symptoms in typically developing children was related to performance on the same task. Before employing the working memory task, an independent sample of 9- to 12-year-olds was asked to recognize the facial expressions intended to serve as stimuli and to rate them for the degree to which the emotion was expressed and for arousal, to obtain a baseline for how children in this age range recognize and react to facial expressions. The first study revealed that children rated the facial expressions with similar intensity and arousal across ages. When the working memory task with facial expressions was employed, results revealed that negatively valenced expressions impaired working memory more than neutral and positively valenced expressions. The ability to successfully complete the working memory task increased between 9 and 12 years of age. Children's total problems were associated with poorer performance on the working memory task with facial expressions. The results on the effect of emotion on working memory are discussed in light of recent models and empirical findings on how emotional information might interact and interfere with cognitive processes such as working memory.

  10. Representational momentum in dynamic facial expressions is modulated by the level of expressed pain: Amplitude and direction effects.

    PubMed

    Prigent, Elise; Amorim, Michel-Ange; de Oliveira, Armando Mónica

    2017-09-27

    Humans have developed a specific capacity to rapidly perceive and anticipate other people's facial expressions so as to get an immediate impression of their emotional state of mind. We carried out two experiments to examine the perceptual and memory dynamics of facial expressions of pain. In the first experiment, we investigated how people estimate other people's levels of pain based on the perception of various dynamic facial expressions; these differ in terms of both the number and the intensity of activated action units. A second experiment used a representational momentum (RM) paradigm to study the emotional anticipation (memory bias) elicited by the same facial expressions of pain studied in Experiment 1. Our results highlighted the relationship between the level of perceived pain (in Experiment 1) and the direction and magnitude of memory bias (in Experiment 2): when perceived pain increases, the memory bias tends to be reduced (if positive) and ultimately becomes negative. Dynamic facial expressions of pain may reenact an "immediate perceptual history" in the perceiver before leading to an emotional anticipation of the agent's upcoming state. Thus, a subtle facial expression of pain (i.e., a low contraction around the eyes) that leads to a significant positive anticipation can be considered an adaptive process, one through which we can swiftly and involuntarily detect other people's pain.

  11. [Facial expressions of negative emotions in clinical interviews: The development, reliability and validity of a categorical system for the attribution of functions to facial expressions of negative emotions].

    PubMed

    Bock, Astrid; Huber, Eva; Peham, Doris; Benecke, Cord

    2015-01-01

    We report the development (Study 1) and validation (Study 2) of a categorical system for the attribution of facial expressions of negative emotions to specific functions. The facial expressions observed in OPD interviews (OPD Task Force 2009) are coded according to the Facial Action Coding System (FACS; Ekman et al. 2002) and attributed to categories of basic emotional displays using EmFACS (Friesen & Ekman 1984). In Study 1, we analyze a partial sample of 20 interviews and postulate 10 categories of functions that can be arranged into three main categories (interactive, self, and object). In Study 2, we rate the facial expressions (n=2320) from the OPD interviews (10 minutes per interview) of 80 female subjects (16 healthy, 64 with a DSM-IV diagnosis; age: 18-57 years) according to the categorical system and correlate them with problematic relationship experiences (measured with the IIP; Horowitz et al. 2000). Functions of negative facial expressions can be attributed reliably and validly with the RFE-Coding System. The attribution of interactive, self-related, and object-related functions allows for a deeper understanding of the emotional facial expressions of patients with mental disorders.

  12. FaceWarehouse: a 3D facial expression database for visual computing.

    PubMed

    Cao, Chen; Weng, Yanlin; Zhou, Shun; Tong, Yiying; Zhou, Kun

    2014-03-01

    We present FaceWarehouse, a database of 3D facial expressions for visual computing applications. We use Kinect, an off-the-shelf RGBD camera, to capture 150 individuals aged 7-80 from various ethnic backgrounds. For each person, we captured the RGBD data of that person's different expressions, including the neutral expression and 19 other expressions such as mouth-opening, smile, and kiss. For every raw RGBD record, a set of facial feature points on the color image, such as eye corners, mouth contour, and the nose tip, are automatically localized, and manually adjusted if better accuracy is required. We then deform a template facial mesh to fit the depth data as closely as possible while matching the feature points on the color image to their corresponding points on the mesh. Starting from these fitted face meshes, we construct a set of individual-specific expression blendshapes for each person. These meshes with consistent topology are assembled as a rank-3 tensor to build a bilinear face model with two attributes: identity and expression. Compared with previous 3D facial databases, our database offers a much richer matching collection of expressions for every person, enabling depiction of most human facial actions. We demonstrate the potential of FaceWarehouse for visual computing with four applications: facial image manipulation, face component transfer, real-time performance-based facial image animation, and facial animation retargeting from video to image.
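
    A bilinear face model of the kind described stores the fitted meshes as a third-order tensor (vertex coordinates x identities x expressions); contracting the identity and expression modes with weight vectors produces a concrete mesh. The sketch below shows that contraction with numpy; the tensor sizes and random data are placeholders, not FaceWarehouse data, and the real model would use a reduced core tensor.

        import numpy as np

        def bilinear_face(core, w_id, w_exp):
            """core: (n_coords, n_id, n_exp) tensor; the weights select one mesh."""
            return np.einsum("vie,i,e->v", core, w_id, w_exp)   # flattened vertices

        n_coords, n_id, n_exp = 3 * 1000, 50, 20                # placeholder sizes
        core = np.random.default_rng(0).random((n_coords, n_id, n_exp))
        # Example: the "average identity, average expression" mesh.
        mesh = bilinear_face(core, np.full(n_id, 1 / n_id), np.full(n_exp, 1 / n_exp))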

  13. A large-scale analysis of sex differences in facial expressions.

    PubMed

    McDuff, Daniel; Kodra, Evan; Kaliouby, Rana El; LaFrance, Marianne

    2017-01-01

    There exists a stereotype that women are more expressive than men; however, research has almost exclusively focused on a single facial behavior, smiling. This large-scale study examines whether women are consistently more expressive than men or whether the effects depend on the emotion expressed. Studies of gender differences in expressivity have been somewhat restricted to data collected in lab settings or requiring labor-intensive manual coding. In the present study, we analyze gender differences in facial behaviors as over 2,000 viewers watch a set of video advertisements in their home environments. The facial responses were recorded using participants' own webcams. Using a new automated facial coding technology, we coded facial activity. We find that women are not universally more expressive across all facial actions. Nor are they more expressive in all positive valence actions and less expressive in all negative valence actions. It appears that women generally express actions more frequently than men, and in particular express more positive valence actions. However, expressiveness is not greater in women for all negative valence actions and depends on the discrete emotional state.

  14. A large-scale analysis of sex differences in facial expressions

    PubMed Central

    Kodra, Evan; el Kaliouby, Rana; LaFrance, Marianne

    2017-01-01

    There exists a stereotype that women are more expressive than men; however, research has almost exclusively focused on a single facial behavior, smiling. This large-scale study examines whether women are consistently more expressive than men or whether the effects depend on the emotion expressed. Studies of gender differences in expressivity have been somewhat restricted to data collected in lab settings or requiring labor-intensive manual coding. In the present study, we analyze gender differences in facial behaviors as over 2,000 viewers watch a set of video advertisements in their home environments. The facial responses were recorded using participants’ own webcams. Using a new automated facial coding technology, we coded facial activity. We find that women are not universally more expressive across all facial actions. Nor are they more expressive in all positive valence actions and less expressive in all negative valence actions. It appears that women generally express actions more frequently than men, and in particular express more positive valence actions. However, expressiveness is not greater in women for all negative valence actions and depends on the discrete emotional state. PMID:28422963

  15. Gaze Behavior of Children with ASD toward Pictures of Facial Expressions.

    PubMed

    Matsuda, Soichiro; Minagawa, Yasuyo; Yamamoto, Junichi

    2015-01-01

    Atypical gaze behavior in response to a face has been well documented in individuals with autism spectrum disorders (ASDs). Children with ASD appear to differ from typically developing (TD) children in gaze behavior for spoken and dynamic face stimuli but not for nonspeaking, static face stimuli. Furthermore, children with ASD and TD children show a difference in their gaze behavior for certain expressions. However, few studies have examined the relationship between autism severity and gaze behavior toward certain facial expressions. The present study replicated and extended previous studies by examining gaze behavior towards pictures of facial expressions. We presented ASD and TD children with pictures of surprised, happy, neutral, angry, and sad facial expressions. Autism severity was assessed using the Childhood Autism Rating Scale (CARS). The results showed that there was no group difference in gaze behavior when looking at pictures of facial expressions. However, the children with ASD who had more severe autistic symptomatology tended to gaze at angry facial expressions for a shorter duration than at other facial expressions. These findings suggest that autism severity should be considered when examining atypical responses to certain facial expressions.

  16. Developmental changes in facial expression recognition in Japanese school-age children.

    PubMed

    Naruse, Susumu; Hashimoto, Toshiaki; Mori, Kenji; Tsuda, Yoshimi; Takahara, Mitsue; Kagami, Shoji

    2013-01-01

    Facial expressions hold abundant information and play a central part in communication. In daily life, we must construct amicable interpersonal relationships by communicating through verbal and nonverbal behaviors. While school age is a period of rapid social growth, few studies have examined developmental changes in facial expression recognition during this period. This study investigated such developmental changes by examining observers' gaze on others' expressions. Participants were 87 school-age children from first to sixth grade (41 boys, 46 girls). The Tobii T60 Eye-tracker (Tobii Technologies, Sweden) was used to gauge eye movement during a task of matching pre-instructed emotion words with facial expression images (neutral, angry, happy, surprised, sad, disgusted) presented on a monitor fixed at a distance of 50 cm. In the task of matching the six facial expression images and emotion words, the mid- and higher-grade children answered more accurately than the lower-grade children for four expressions, excluding neutral and happy. For fixation time and fixation count, the lower-grade children scored lower than the children in other grades, gazing at all facial expressions significantly fewer times and for shorter periods. These findings suggest that the transition from the lower to the middle grades is a turning point in facial expression recognition.

  17. Gaze Behavior of Children with ASD toward Pictures of Facial Expressions

    PubMed Central

    Matsuda, Soichiro; Minagawa, Yasuyo; Yamamoto, Junichi

    2015-01-01

    Atypical gaze behavior in response to a face has been well documented in individuals with autism spectrum disorders (ASDs). Children with ASD appear to differ from typically developing (TD) children in gaze behavior for spoken and dynamic face stimuli but not for nonspeaking, static face stimuli. Furthermore, children with ASD and TD children show a difference in their gaze behavior for certain expressions. However, few studies have examined the relationship between autism severity and gaze behavior toward certain facial expressions. The present study replicated and extended previous studies by examining gaze behavior towards pictures of facial expressions. We presented ASD and TD children with pictures of surprised, happy, neutral, angry, and sad facial expressions. Autism severity was assessed using the Childhood Autism Rating Scale (CARS). The results showed that there was no group difference in gaze behavior when looking at pictures of facial expressions. However, the children with ASD who had more severe autistic symptomatology tended to gaze at angry facial expressions for a shorter duration than at other facial expressions. These findings suggest that autism severity should be considered when examining atypical responses to certain facial expressions. PMID:26090223

  18. Brief report: Representational momentum for dynamic facial expressions in pervasive developmental disorder.

    PubMed

    Uono, Shota; Sato, Wataru; Toichi, Motomi

    2010-03-01

    Individuals with pervasive developmental disorder (PDD) have difficulty with social communication via emotional facial expressions, but behavioral studies involving static images have reported inconsistent findings about emotion recognition. We investigated whether dynamic presentation of facial expression would enhance subjective perception of expressed emotion in 13 individuals with PDD and 13 typically developing controls. We presented dynamic and static emotional (fearful and happy) expressions. Participants were asked to match a changeable emotional face display with the last presented image. The results showed that both groups perceived the last image of dynamic facial expression to be more emotionally exaggerated than the static facial expression. This finding suggests that individuals with PDD have an intact perceptual mechanism for processing dynamic information in another individual's face.

  19. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions

    PubMed Central

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning body targets and losing body targets was influenced by the invisible peak facial expression primes, which indicated the

  20. Can We Distinguish Emotions from Faces? Investigation of Implicit and Explicit Processes of Peak Facial Expressions.

    PubMed

    Xiao, Ruiqi; Li, Xianchun; Li, Lin; Wang, Yanmei

    2016-01-01

    Most previous studies on facial expression recognition have focused on moderate emotions; to date, few studies have investigated the explicit and implicit processing of peak emotions. In the current study, we used images of transient, peak-intensity expressions of athletes at the winning or losing point in competition as materials, and investigated the diagnosability of peak facial expressions at both the implicit and explicit levels. In Experiment 1, participants were instructed to evaluate isolated faces, isolated bodies, and face-body compounds, and eye movements were recorded. The results revealed that the isolated body and face-body congruent images were better recognized than isolated face and face-body incongruent images, indicating that the emotional information conveyed by facial cues was ambiguous and that body cues influenced facial emotion recognition. Furthermore, eye movement records showed that the participants displayed distinct gaze patterns for the congruent and incongruent compounds. In Experiment 2A, the subliminal affective priming task was used, with faces as primes and bodies as targets, to investigate the unconscious emotion perception of peak facial expressions. The results showed that a winning face prime facilitated reactions to a winning body target, whereas a losing face prime inhibited reactions to a winning body target, suggesting that peak facial expressions could be perceived at the implicit level. In general, the results indicate that peak facial expressions cannot be consciously recognized but can be perceived at the unconscious level. In Experiment 2B, a revised subliminal affective priming task and a strict awareness test were used to examine the validity of the unconscious perception of peak facial expressions found in Experiment 2A. Results of Experiment 2B showed that reaction time to both winning body targets and losing body targets was influenced by the invisible peak facial expression primes, which indicated the

  1. Can Healthy Fetuses Show Facial Expressions of “Pain” or “Distress”?

    PubMed Central

    Reissland, Nadja; Francis, Brian; Mason, James

    2013-01-01

    Background With advances of research on fetal behavioural development, the question of whether we can identify fetal facial expressions and determine their developmental progression takes on greater importance. In this study we investigate longitudinally the increasing complexity of combinations of facial movements from 24 to 36 weeks gestation in a sample of healthy fetuses using frame-by-frame coding of 4-D ultrasound scans. The primary aim was to examine whether these complex facial movements coalesce into a recognisable facial expression of pain/distress. Methodology/Findings Fifteen fetuses (8 girls, 7 boys) were observed four times in the second and third trimester of pregnancy. Fetuses showed significant progress towards more complex facial expressions as gestational age increased. Statistical analysis of the facial movements making up a specific facial configuration, namely "pain/distress", also demonstrates that this facial expression becomes significantly more complete as the fetus matures. Conclusions/Significance The study shows that one can determine the normal progression of fetal facial movements. Furthermore, our results suggest that healthy fetuses progress towards an increasingly complete pain/distress expression as they mature. We argue that this is an adaptive process which is beneficial to the fetus postnatally and has the potential to identify normal versus abnormal developmental pathways. PMID:23755245

  2. Pose-variant facial expression recognition using an embedded image system

    NASA Astrophysics Data System (ADS)

    Song, Kai-Tai; Han, Meng-Ju; Chang, Shuo-Hung

    2008-12-01

    In recent years, one of the most attractive research areas in human-robot interaction has been automated facial expression recognition. By recognizing facial expressions, a pet robot can interact with humans in a more natural manner. In this study, we focus on the facial pose-variant problem. A novel method is proposed in this paper to recognize pose-variant facial expressions. After locating the face position in an image frame, the active appearance model (AAM) is applied to track facial features. Fourteen feature points are extracted to represent the variation of facial expressions. The distances between feature points are defined as the feature values. These feature values are sent to a support vector machine (SVM) for facial expression determination. The pose-variant facial expression is classified into happiness, neutral, sadness, surprise, or anger. Furthermore, in order to evaluate the performance for practical applications, this study also built a low-resolution database (160x120 pixels) using a CMOS image sensor. Experimental results show that the recognition rate is 84% with the self-built database.
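
    The classification stage described above reduces each face to pairwise distances between the 14 tracked feature points and passes them to an SVM. A minimal sketch of that stage, assuming scikit-learn and random placeholder data in place of real AAM output, follows.

        import numpy as np
        from itertools import combinations
        from sklearn.svm import SVC

        def distance_features(points):
            """points: (14, 2) array of AAM-tracked feature coordinates."""
            return np.array([np.linalg.norm(points[i] - points[j])
                             for i, j in combinations(range(len(points)), 2)])

        rng = np.random.default_rng(0)                     # placeholder data
        X = np.array([distance_features(rng.random((14, 2))) for _ in range(100)])
        y = rng.integers(0, 5, 100)    # happiness, neutral, sadness, surprise, anger
        clf = SVC(kernel="rbf").fit(X, y)                  # expression classifier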

  3. Empathy, but not mimicry restriction, influences the recognition of change in emotional facial expressions.

    PubMed

    Kosonogov, Vladimir; Titova, Alisa; Vorobyeva, Elena

    2015-01-01

    The current study addressed the hypothesis that empathy and the restriction of observers' facial muscles can influence the recognition of emotional facial expressions. A sample of 74 participants recognized the subjective onset of emotional facial expressions (anger, disgust, fear, happiness, sadness, surprise, and neutral) in a series of morphed face photographs showing a gradual change (frame by frame) from one expression to another. The high-empathy participants (as measured by the Empathy Quotient) recognized emotional facial expressions earlier in the photograph series than did low-empathy ones, but there was no difference in exploration time. Restriction of the observers' facial muscles (with adhesive plasters and a stick held in the mouth) did not influence the responses. We discuss these findings in the context of the embodied simulation theory and previous data on empathy.

  4. The Child Affective Facial Expression (CAFE) set: validity and reliability from untrained adults

    PubMed Central

    LoBue, Vanessa; Thrasher, Cat

    2014-01-01

    Emotional development is one of the largest and most productive areas of psychological research. For decades, researchers have been fascinated by how humans respond to, detect, and interpret emotional facial expressions. Much of the research in this area has relied on controlled stimulus sets of adults posing various facial expressions. Here we introduce a new stimulus set of emotional facial expressions into the domain of research on emotional development—The Child Affective Facial Expression set (CAFE). The CAFE set features photographs of a racially and ethnically diverse group of 2- to 8-year-old children posing for six emotional facial expressions—angry, fearful, sad, happy, surprised, and disgusted—and a neutral face. In the current work, we describe the set and report validity and reliability data on the set from 100 untrained adult participants. PMID:25610415

  5. 4D ultrasound study of fetal facial expressions in the third trimester of pregnancy.

    PubMed

    AboEllail, Mohamed Ahmed Mostafa; Kanenishi, Kenji; Mori, Nobuhiro; Mohamed, Osman Abdel Kareem; Hata, Toshiyuki

    2017-05-29

    To evaluate the frequencies of fetal facial expressions in the third trimester of pregnancy, when fetal brain maturation and development are progressing in normal healthy fetuses. Four-dimensional (4D) ultrasound was used to examine the facial expressions of 111 healthy fetuses between 30 and 40 weeks of gestation. The frequencies of seven facial expressions (mouthing, yawning, smiling, tongue expulsion, scowling, sucking, and blinking) during 15-minute recordings were assessed. The fetuses were further divided into three gestational age groups (25 fetuses at 30-31 weeks, 43 at 32-35 weeks, and 43 at ≥36 weeks). Facial expressions were compared among the three gestational age groups to determine their changes with advancing gestation. Mouthing was the most frequent facial expression at 30-40 weeks of gestation, followed by blinking. Both facial expressions were significantly more frequent than the other expressions (p < .05). The frequency of yawning decreased with gestational age after 30 weeks of gestation (p = .031). Other facial expressions did not change between 30 and 40 weeks. The frequency of yawning at 30-31 weeks was significantly higher than that at 36-40 weeks (p < .05). There were no significant differences in the other facial expressions among the three gestational age groups. Our results suggest that 4D ultrasound assessment of fetal facial expressions may be a useful modality for evaluating fetal brain maturation and development. The decreasing frequency of fetal yawning after 30 weeks of gestation may explain the emergence of distinct states of arousal.

  6. Individual differences in the recognition of facial expressions: an event-related potentials study.

    PubMed

    Tamamiya, Yoshiyuki; Hiraki, Kazuo

    2013-01-01

    Previous studies have shown that early posterior components of event-related potentials (ERPs) are modulated by facial expressions. The goal of the current study was to investigate individual differences in the recognition of facial expressions by examining the relationship between ERP components and the discrimination of facial expressions. Pictures of 3 facial expressions (angry, happy, and neutral) were presented to 36 young adults during ERP recording. Participants were asked to respond with a button press as soon as they recognized the expression depicted. A multiple regression analysis, with ERP components as predictor variables, assessed hits and reaction times in response to the facial expressions as dependent variables. The N170 amplitudes significantly predicted accuracy for angry and happy expressions, and the N170 latencies were predictive of accuracy for neutral expressions. The P2 amplitudes significantly predicted reaction time. The P2 latencies significantly predicted reaction times only for neutral faces. These results suggest that individual differences in the recognition of facial expressions emerge from early components of visual processing.
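
    The analysis described is a standard multiple regression with ERP components as predictors of behavioral measures. A small sketch of that setup with placeholder data is shown below, using scikit-learn; the record does not name the study's actual statistics software.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        # Columns: N170 amplitude, N170 latency, P2 amplitude, P2 latency (placeholders).
        erp = rng.normal(size=(36, 4))
        accuracy = rng.uniform(0.5, 1.0, 36)    # hit rate for one expression
        model = LinearRegression().fit(erp, accuracy)
        print(model.coef_)                      # per-component regression weights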

  7. Dissociable roles of internal feelings and face recognition ability in facial expression decoding.

    PubMed

    Zhang, Lin; Song, Yiying; Liu, Ling; Liu, Jia

    2016-05-15

    The problem of emotion recognition has been tackled by researchers in both affective computing and cognitive neuroscience. While affective computing relies on analyzing visual features from facial expressions, it has been proposed that humans recognize emotions by internally simulating the emotional states conveyed by others' expressions, in addition to perceptual analysis of facial features. Here we investigated whether and how our internal feelings contributed to the ability to decode facial expressions. In two independent large samples of participants, we observed that individuals who generally experienced richer internal feelings exhibited a higher ability to decode facial expressions, and the contribution of internal feelings was independent of face recognition ability. Further, using voxel-based morphometry, we found that the gray matter volume (GMV) of bilateral superior temporal sulcus (STS) and the right inferior parietal lobule was associated with facial expression decoding through the mediating effect of internal feelings, while the GMV of bilateral STS, precuneus, and the right central opercular cortex contributed to facial expression decoding through the mediating effect of face recognition ability. In addition, the clusters in bilateral STS involved in the two components were neighboring yet separate. Our results may provide clues about the mechanism by which internal feelings, in addition to face recognition ability, serve as an important instrument for humans in facial expression decoding.

  8. Cognitive tasks during expectation affect the congruency ERP effects to facial expressions.

    PubMed

    Lin, Huiyan; Schulz, Claudia; Straube, Thomas

    2015-01-01

    Expectancy congruency has been shown to modulate event-related potentials (ERPs) to emotional stimuli, such as facial expressions. However, it is unknown whether the congruency ERP effects to facial expressions can be modulated by cognitive manipulations during stimulus expectation. To this end, electroencephalography (EEG) was recorded while participants viewed (neutral and fearful) facial expressions. Each trial started with a cue, predicting a facial expression, followed by an expectancy interval without any cues and subsequently the face. In half of the trials, participants had to solve a cognitive task in which different letters were presented for target letter detection during the expectancy interval. Furthermore, facial expressions were congruent with the cues in 75% of all trials. ERP results revealed that for fearful faces, the cognitive task during expectation altered the congruency effect in N170 amplitude; congruent compared to incongruent fearful faces evoked larger N170 in the non-task condition but the congruency effect was not evident in the task condition. Regardless of facial expression, the congruency effect was generally altered by the cognitive task during expectation in P3 amplitude; the amplitudes were larger for incongruent compared to congruent faces in the non-task condition but the congruency effect was not shown in the task condition. The findings indicate that cognitive tasks during expectation reduce the processing of expectation and subsequently, alter congruency ERP effects to facial expressions.

  9. Deliberately generated and imitated facial expressions of emotions in people with eating disorders.

    PubMed

    Dapelo, Marcela Marin; Bodas, Sergio; Morris, Robin; Tchanturia, Kate

    2016-02-01

    People with eating disorders have difficulties in socio-emotional functioning that could contribute to maintaining the functional consequences of the disorder. This study aimed to explore the ability to deliberately generate (i.e., pose) and imitate facial expressions of emotions in women with anorexia (AN) and bulimia nervosa (BN), compared to healthy controls (HC). One hundred and three participants (36 AN, 25 BN, and 42 HC) were asked to pose and imitate facial expressions of anger, disgust, fear, happiness, and sadness. Their facial expressions were recorded and coded. Participants with eating disorders (both AN and BN) were less accurate than HC when posing facial expressions of emotions. Participants with AN were less accurate than HC when imitating facial expressions, whilst BN participants showed mid-range performance. All results remained significant after controlling for anxiety, depression, and autistic features. A limitation is the relatively small number of BN participants recruited for this study. The study findings suggest that people with eating disorders, particularly those with AN, have difficulties posing and imitating facial expressions of emotions. These difficulties could have an impact on social communication and social functioning. This is the first study to investigate the ability to pose and imitate facial expressions of emotions in people with eating disorders, and the findings suggest this area should be further explored in future studies.

  10. Analysis of facial expressions in patients with schizophrenia, in comparison with a healthy control - case study.

    PubMed

    Dudek, Arkadiusz; Wilczyński, Krzysztof M; Krysta, Krzysztof; Pęszor, Damian; Martyniak, Ewa; Wojciechowska, Marzena; Krzystanek, Marek; Wojciechowski, Konrad; Janas-Kozik, Małgorzata

    2017-09-01

    Deficits in the area of communication, which is crucial for maintaining proper social bonds, may have a prominent adverse impact on the quality of life of patients with schizophrenia. Social exclusion, lack of employment, and deterioration of family life may be consequences of impaired social competencies caused by an inability to properly exhibit and interpret facial expressions. Although this phenomenon has been known since the first clinical descriptions of schizophrenia, a lack of proper methodology has limited our knowledge in this area. The aim of our study was to compare the facial expressivity of a patient with schizophrenia and a healthy individual. A 47-year-old patient suffering from schizophrenia and a 36-year-old healthy individual were invited to participate in our study. They underwent the examination in the Human Facial Modelling Lab of the Polish-Japanese Institute of Information Technology in Bytom (Silesia). Both participants were presented with two video materials: the first contained different facial expressions, which they had to imitate; the second was a part of a comedy show, during which spontaneous reactions were recorded. Acquisition of facial expressions was conducted with marker-based modelling technology. The obtained data were analyzed using Microsoft Excel. Overall facial expression intensity, expressed as the average distance traveled by the markers during shifts from the neutral position, was higher for the healthy participant during both parts of the study. The difference was especially visible in the upper half of the face. The use of marker-based methods in the analysis of human facial expressions seems to be reliable and remarkably accurate.
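
    The intensity measure described, the average distance traveled by markers relative to the neutral position, is simple to compute from motion-capture output. A minimal sketch with assumed array shapes follows.

        import numpy as np

        def expression_intensity(frames, neutral):
            """frames: (n_frames, n_markers, 3); neutral: (n_markers, 3) rest pose."""
            shifts = np.linalg.norm(frames - neutral, axis=2)   # per-marker distances
            return shifts.mean()                 # average over markers and frames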

  11. Cognitive tasks during expectation affect the congruency ERP effects to facial expressions

    PubMed Central

    Lin, Huiyan; Schulz, Claudia; Straube, Thomas

    2015-01-01

    Expectancy congruency has been shown to modulate event-related potentials (ERPs) to emotional stimuli, such as facial expressions. However, it is unknown whether the congruency ERP effects to facial expressions can be modulated by cognitive manipulations during stimulus expectation. To this end, electroencephalography (EEG) was recorded while participants viewed (neutral and fearful) facial expressions. Each trial started with a cue, predicting a facial expression, followed by an expectancy interval without any cues and subsequently the face. In half of the trials, participants had to solve a cognitive task in which different letters were presented for target letter detection during the expectancy interval. Furthermore, facial expressions were congruent with the cues in 75% of all trials. ERP results revealed that for fearful faces, the cognitive task during expectation altered the congruency effect in N170 amplitude; congruent compared to incongruent fearful faces evoked larger N170 in the non-task condition but the congruency effect was not evident in the task condition. Regardless of facial expression, the congruency effect was generally altered by the cognitive task during expectation in P3 amplitude; the amplitudes were larger for incongruent compared to congruent faces in the non-task condition but the congruency effect was not shown in the task condition. The findings indicate that cognitive tasks during expectation reduce the processing of expectation and subsequently, alter congruency ERP effects to facial expressions. PMID:26578938

  12. Processing of Individual Items during Ensemble Coding of Facial Expressions

    PubMed Central

    Li, Huiyun; Ji, Luyan; Tong, Ke; Ren, Naixin; Chen, Wenfeng; Liu, Chang Hong; Fu, Xiaolan

    2016-01-01

    There is growing evidence that human observers are able to extract the mean emotion or other type of information from a set of faces. The most intriguing aspect of this phenomenon is that observers often fail to identify or form a representation for individual faces in a face set. However, most of these results were based on judgments under limited processing resource. We examined a wider range of exposure time and observed how the relationship between the extraction of a mean and representation of individual facial expressions would change. The results showed that with an exposure time of 50 ms for the faces, observers were more sensitive to mean representation over individual representation, replicating the typical findings in the literature. With longer exposure time, however, observers were able to extract both individual and mean representation more accurately. Furthermore, diffusion model analysis revealed that the mean representation is also more prone to suffer from the noise accumulated in redundant processing time and leads to a more conservative decision bias, whereas individual representations seem more resistant to this noise. Results suggest that the encoding of emotional information from multiple faces may take two forms: single face processing and crowd face processing. PMID:27656154
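
    The diffusion model analysis mentioned above treats each judgment as noisy evidence accumulation toward one of two decision boundaries. The toy simulation below illustrates the model's core mechanism; all parameter values are illustrative and unrelated to the study's fitted parameters.

        import numpy as np

        def ddm_trial(v=0.3, a=1.0, z=0.5, noise=1.0, dt=0.001, rng=None):
            """One simulated decision: returns (choice, rt_in_seconds)."""
            rng = rng or np.random.default_rng()
            x, t = z * a, 0.0                    # start between the boundaries
            while 0.0 < x < a:                   # accumulate until a boundary is hit
                x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
                t += dt
            return (1 if x >= a else 0), t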

  13. Facial feedback affects valence judgments of dynamic and static emotional expressions

    PubMed Central

    Hyniewska, Sylwia; Sato, Wataru

    2015-01-01

    The ability to judge others' emotions is required for the establishment and maintenance of smooth interactions in a community. Several lines of evidence suggest that the attribution of meaning to a face is influenced by the facial actions produced by an observer during the observation of a face. However, empirical studies testing causal relationships between observers' facial actions and emotion judgments have reported mixed findings. This issue was investigated by measuring emotion judgments in terms of valence and arousal dimensions while comparing dynamic vs. static presentations of facial expressions. We presented pictures and videos of facial expressions of anger and happiness. Participants (N = 36) were asked to differentiate between the gender of faces by activating the corrugator supercilii muscle (brow lowering) and zygomaticus major muscle (cheek raising). They were also asked to evaluate the internal states of the stimuli using the affect grid while maintaining the facial action until they finished responding. The cheek raising condition increased the attributed valence scores compared with the brow-lowering condition. This effect of facial actions was observed for static as well as for dynamic facial expressions. These data suggest that facial feedback mechanisms contribute to the judgment of the valence of emotional facial expressions. PMID:25852608

  14. Effects of cultural characteristics on building an emotion classifier through facial expression analysis

    NASA Astrophysics Data System (ADS)

    da Silva, Flávio Altinier Maximiano; Pedrini, Helio

    2015-03-01

    Facial expressions are an important display of human moods and emotions. Algorithms capable of recognizing facial expressions and associating them with emotions were developed and employed to compare the expressions that different cultural groups use to show their emotions. Static pictures of predominantly occidental and oriental subjects from public datasets were used to train machine learning algorithms, with local binary patterns, histograms of oriented gradients (HOGs), and Gabor filters employed to describe the facial expressions for six basic emotions. The most consistent combination, formed by the association of the HOG descriptor and support vector machines, was then used to classify the other cultural group: there was a strong drop in accuracy, meaning that the subtle differences in each culture's facial expressions affected the classifier's performance. Finally, a classifier was trained with images from both occidental and oriental subjects, and its accuracy was higher on multicultural data, demonstrating the need for a multicultural training set to build an efficient classifier.
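
    The winning combination reported, HOG descriptors fed to a support vector machine, can be sketched as follows with scikit-image and scikit-learn. The random arrays stand in for the two cultural groups' face crops (assumed 64x64 grayscale), and the labels are placeholders.

        import numpy as np
        from skimage.feature import hog
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score

        def describe(face):
            """HOG descriptor for a 64x64 grayscale face crop."""
            return hog(face, orientations=9, pixels_per_cell=(8, 8),
                       cells_per_block=(2, 2))

        rng = np.random.default_rng(0)
        # Stand-ins for occidental (training) and oriental (test) face crops.
        X_occ = np.array([describe(rng.random((64, 64))) for _ in range(60)])
        X_ori = np.array([describe(rng.random((64, 64))) for _ in range(60)])
        y_occ, y_ori = rng.integers(0, 6, 60), rng.integers(0, 6, 60)

        clf = SVC(kernel="linear").fit(X_occ, y_occ)
        print(accuracy_score(y_ori, clf.predict(X_ori)))   # cross-culture accuracy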

  15. Realistic facial expression of virtual human based on color, sweat, and tears effects.

    PubMed

    Alkawaz, Mohammed Hazim; Basori, Ahmad Hoirul; Mohamad, Dzulkifli; Mohamed, Farhan

    2014-01-01

    Generating extreme appearances such as sweating when scared, tears when overwhelmed by happiness (crying), and blushing (in anger or happiness) is the key issue in achieving high-quality facial animation. The effects of sweat, tears, and color are integrated into a single animation model to create realistic facial expressions for a 3D avatar. The physical properties of muscles and emotions, as well as fluid properties with sweating and tears initiators, are incorporated. The action units (AUs) of the Facial Action Coding System are merged with autonomous AUs to create expressions including sadness, anger with blushing, happiness with blushing, and fear. Fluid effects such as sweat and tears are simulated using particle-system and smoothed-particle hydrodynamics (SPH) methods, which are combined with a facial animation technique to produce complex facial expressions. The effects of oxygenation on the facial skin color appearance are measured using a pulse oximeter system and a 3D skin analyzer. The results show that the virtual human's facial expression is enhanced by mimicking actual sweating and tears for all extreme expressions. The proposed method contributes to the development of the facial animation and game industries, as well as computer graphics.
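
    The sweat and tear effects described are driven by particle simulation; the toy step below shows only the basic particle integration (gravity with explicit Euler), not the paper's full SPH solver, and the constants are made up.

        import numpy as np

        GRAVITY = np.array([0.0, -9.8, 0.0])     # m/s^2, illustrative

        def step(pos, vel, dt=1.0 / 60.0):
            """pos, vel: (n, 3) droplet positions and velocities; explicit Euler."""
            vel = vel + GRAVITY * dt             # accelerate droplets downward
            return pos + vel * dt, vel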

  16. Realistic Facial Expression of Virtual Human Based on Color, Sweat, and Tears Effects

    PubMed Central

    Alkawaz, Mohammed Hazim; Basori, Ahmad Hoirul; Mohamad, Dzulkifli; Mohamed, Farhan

    2014-01-01

    Generating extreme appearances such as sweating when scared, tears when overwhelmed by happiness (crying), and blushing (in anger or happiness) is the key issue in achieving high-quality facial animation. The effects of sweat, tears, and color are integrated into a single animation model to create realistic facial expressions for a 3D avatar. The physical properties of muscles and emotions, as well as fluid properties with sweating and tears initiators, are incorporated. The action units (AUs) of the Facial Action Coding System are merged with autonomous AUs to create expressions including sadness, anger with blushing, happiness with blushing, and fear. Fluid effects such as sweat and tears are simulated using particle-system and smoothed-particle hydrodynamics (SPH) methods, which are combined with a facial animation technique to produce complex facial expressions. The effects of oxygenation on the facial skin color appearance are measured using a pulse oximeter system and a 3D skin analyzer. The results show that the virtual human's facial expression is enhanced by mimicking actual sweating and tears for all extreme expressions. The proposed method contributes to the development of the facial animation and game industries, as well as computer graphics. PMID:25136663

  17. The Effects of Early Institutionalization on the Discrimination of Facial Expressions of Emotion in Young Children.

    PubMed

    Jeon, Hana; Moulson, Margaret C; Fox, Nathan; Zeanah, Charles; Nelson, Charles A

    2010-03-01

    The current study examined the effects of institutionalization on the discrimination of facial expressions of emotion in 3 groups of 42-month-old children. One group consisted of children abandoned at birth who were randomly assigned to Care as Usual (institutional care) following a baseline assessment. Another group consisted of children abandoned at birth who were randomly assigned to high-quality foster care following a baseline assessment. A third group consisted of never-institutionalized children who were reared by their biological parents. All children were familiarized to happy, sad, fearful, and neutral facial expressions and tested on their ability to discriminate familiar versus novel facial expressions. Contrary to our prediction, all three groups of children were equally able to discriminate among the different expressions. Furthermore, in contrast to findings at 13-30 months of age, these same children showed familiarity rather than novelty preferences toward different expressions. There were also asymmetries in children's discrimination of facial expressions depending on which facial expression served as the familiar versus the novel stimulus. Collectively, early institutionalization appears not to impact the development of the ability to discriminate facial expressions of emotion, at least when preferential looking serves as the dependent measure. These findings are discussed in the context of the myriad domains that are affected by early institutionalization.

  18. Discriminability effect on Garner interference: evidence from recognition of facial identity and expression

    PubMed Central

    Wang, Yamin; Fu, Xiaolan; Johnston, Robert A.; Yan, Zheng

    2013-01-01

    Using Garner’s speeded classification task, existing studies have demonstrated an asymmetric interference in the recognition of facial identity and facial expression: expression seems unable to interfere with identity recognition. However, the discriminability of identity and expression, a potential confounding variable, had not been carefully examined in existing studies. In the current work, we manipulated the discriminability of identity and expression by matching facial shape (long or round) for identity and mouth state (open or closed) for facial expression. Garner interference was found either from identity to expression (Experiment 1) or from expression to identity (Experiment 2). Interference was also found in both directions (Experiment 3) or in neither direction (Experiment 4). The results support the view that Garner interference tends to occur under conditions of low discriminability of the relevant dimension, regardless of facial property. Our findings indicate that Garner interference is not necessarily related to interdependent processing in the recognition of facial identity and expression. They also suggest that discriminability, as a mediating factor, should be carefully controlled in future research. PMID:24391609

  19. Neural responses to facial and vocal expressions of fear and disgust.

    PubMed Central

    Phillips, M L; Young, A W; Scott, S K; Calder, A J; Andrew, C; Giampietro, V; Williams, S C; Bullmore, E T; Brammer, M; Gray, J A

    1998-01-01

    Neuropsychological studies report more impaired responses to facial expressions of fear than disgust in people with amygdala lesions, and vice versa in people with Huntington's disease. Experiments using functional magnetic resonance imaging (fMRI) have confirmed the role of the amygdala in the response to fearful faces and have implicated the anterior insula in the response to facial expressions of disgust. We used fMRI to extend these studies to the perception of fear and disgust from both facial and vocal expressions. Consistent with neuropsychological findings, both types of fearful stimuli activated the amygdala. Facial expressions of disgust activated the anterior insula and the caudate-putamen; vocal expressions of disgust did not significantly activate either of these regions. All four types of stimuli activated the superior temporal gyrus. Our findings therefore (i) support the differential localization of the neural substrates of fear and disgust; (ii) confirm the involvement of the amygdala in the emotion of fear, whether evoked by facial or vocal expressions; (iii) confirm the involvement of the anterior insula and the striatum in reactions to facial expressions of disgust; and (iv) suggest a possible general role for the superior temporal gyrus in the perception of emotional expressions. PMID:9802236

  20. Cultural similarities and differences in perceiving and recognizing facial expressions of basic emotions.

    PubMed

    Yan, Xiaoqian; Andrews, Timothy J; Young, Andrew W

    2016-03-01

    The ability to recognize facial expressions of basic emotions is often considered a universal human ability. However, recent studies have suggested that this commonality has been overestimated and that people from different cultures use different facial signals to represent expressions (Jack, Blais, Scheepers, Schyns, & Caldara, 2009; Jack, Caldara, & Schyns, 2012). We investigated this possibility by examining similarities and differences in the perception and categorization of facial expressions between Chinese and white British participants using whole-face and partial-face images. Our results showed no cultural difference in the patterns of perceptual similarity of expressions from whole-face images. When categorizing the same expressions, however, both British and Chinese participants were slightly more accurate with whole-face images of their own ethnic group. To further investigate potential strategy differences, we repeated the perceptual similarity and categorization tasks with presentation of only the upper or lower half of each face. Again, the perceptual similarity of facial expressions was similar between Chinese and British participants for both the upper and lower face regions. However, participants were slightly better at categorizing facial expressions of their own ethnic group for the lower face regions, indicating that the way in which culture shapes the categorization of facial expressions is largely driven by differences in information decoding from this part of the face.

  1. Anodal tDCS targeting the right orbitofrontal cortex enhances facial expression recognition.

    PubMed

    Willis, Megan L; Murphy, Jillian M; Ridley, Nicole J; Vercammen, Ans

    2015-12-01

    The orbitofrontal cortex (OFC) has been implicated in the capacity to accurately recognise facial expressions. The aim of the current study was to determine if anodal transcranial direct current stimulation (tDCS) targeting the right OFC in healthy adults would enhance facial expression recognition, compared with a sham condition. Across two counterbalanced sessions of tDCS (i.e. anodal and sham), 20 undergraduate participants (18 female) completed a facial expression labelling task comprising angry, disgusted, fearful, happy, sad and neutral expressions, and a control (social judgement) task comprising the same expressions. Responses on the labelling task were scored for accuracy, median reaction time and overall efficiency (i.e. combined accuracy and reaction time). Anodal tDCS targeting the right OFC enhanced facial expression recognition, reflected in greater efficiency and speed of recognition across emotions, relative to the sham condition. In contrast, there was no effect of tDCS to responses on the control task. This is the first study to demonstrate that anodal tDCS targeting the right OFC boosts facial expression recognition. This finding provides a solid foundation for future research to examine the efficacy of this technique as a means to treat facial expression recognition deficits, particularly in individuals with OFC damage or dysfunction.
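
    The record does not spell out how accuracy and reaction time were combined into "overall efficiency"; a common operationalization, assumed here, is the inverse efficiency score: median correct-trial reaction time divided by proportion correct.

        import numpy as np

        def inverse_efficiency(rts_ms, correct):
            """Median correct-trial RT divided by proportion correct; lower is better."""
            rts, ok = np.asarray(rts_ms), np.asarray(correct, dtype=bool)
            return np.median(rts[ok]) / ok.mean()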

  2. Anodal tDCS targeting the right orbitofrontal cortex enhances facial expression recognition

    PubMed Central

    Murphy, Jillian M.; Ridley, Nicole J.; Vercammen, Ans

    2015-01-01

    The orbitofrontal cortex (OFC) has been implicated in the capacity to accurately recognise facial expressions. The aim of the current study was to determine if anodal transcranial direct current stimulation (tDCS) targeting the right OFC in healthy adults would enhance facial expression recognition, compared with a sham condition. Across two counterbalanced sessions of tDCS (i.e. anodal and sham), 20 undergraduate participants (18 female) completed a facial expression labelling task comprising angry, disgusted, fearful, happy, sad and neutral expressions, and a control (social judgement) task comprising the same expressions. Responses on the labelling task were scored for accuracy, median reaction time and overall efficiency (i.e. combined accuracy and reaction time). Anodal tDCS targeting the right OFC enhanced facial expression recognition, reflected in greater efficiency and speed of recognition across emotions, relative to the sham condition. In contrast, there was no effect of tDCS to responses on the control task. This is the first study to demonstrate that anodal tDCS targeting the right OFC boosts facial expression recognition. This finding provides a solid foundation for future research to examine the efficacy of this technique as a means to treat facial expression recognition deficits, particularly in individuals with OFC damage or dysfunction. PMID:25971602

  3. Experimental results of affective valence and arousal to avatar's facial expressions.

    PubMed

    Ku, Jeonghun; Jang, Hee Jeong; Kim, Kwang Uk; Kim, Jae Hun; Park, Sung Hyouk; Lee, Jang Han; Kim, Jae Jin; Kim, In Y; Kim, Sun I

    2005-10-01

    The objectives of this study were to propose a method of presenting dynamic facial expressions to experimental subjects, in order to investigate human perception of an avatar's facial expressions at different levels of emotional intensity. The investigation concerned how perception varies according to the strength of the facial expression, as well as according to the avatar's gender. To accomplish these goals, we generated a male and a female virtual avatar with five levels of intensity of happiness and anger using a morphing technique. We then recruited 16 normal healthy subjects and measured each subject's emotional reaction by scoring affective arousal and valence after showing them the avatar's face. Through this study, we were able to investigate the human perceptual characteristics evoked by male and female avatars' graduated facial expressions of happiness and anger. In addition, we were able to identify that a virtual avatar's facial expression can affect human emotion in different ways according to the avatar's gender and the intensity of its facial expressions. However, we could also see that virtual faces have some limitations because they are not real: subjects recognized the expressions well but were not influenced to the same extent. Although a virtual avatar has some limitations in conveying emotion through facial expressions, this study is significant in that it shows the potential to manipulate emotional intensity by controlling a virtual avatar's facial expression linearly using a morphing technique. It is therefore predicted that this technique may be used for assessing the emotional characteristics of humans, and may be of particular benefit for work with people with emotional disorders through the presentation of dynamic expressions at various emotional intensities.
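
    Graded expression intensities of the kind used here can be approximated, in the simplest case, by linearly interpolating between a neutral and a full-intensity expression image; true morphing also warps facial geometry, which this sketch omits.

        import numpy as np

        def intensity_levels(neutral, full, n_levels=5):
            """neutral, full: float image arrays of the same shape (H, W[, 3])."""
            alphas = np.linspace(1.0 / n_levels, 1.0, n_levels)   # 0.2 ... 1.0
            return [(1 - a) * neutral + a * full for a in alphas]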

  4. Gently Does It: Humans Outperform a Software Classifier in Recognizing Subtle, Nonstereotypical Facial Expressions.

    PubMed

    Yitzhak, Neta; Giladi, Nir; Gurevich, Tanya; Messinger, Daniel S; Prince, Emily B; Martin, Katherine; Aviezer, Hillel

    2017-04-13

    According to dominant theories of affect, humans innately and universally express a set of emotions using specific configurations of prototypical facial activity. Accordingly, thousands of studies have tested emotion recognition using sets of highly intense and stereotypical facial expressions, yet their incidence in real life is virtually unknown. In fact, a commonplace experience is that emotions are expressed in subtle and nonprototypical forms. Such facial expressions are at the focus of the current study. In Experiment 1, we present the development and validation of a novel stimulus set consisting of dynamic and subtle emotional facial displays conveyed without constraining expressers to using prototypical configurations. Although these subtle expressions were more challenging to recognize than prototypical dynamic expressions, they were still well recognized by human raters, and perhaps most importantly, they were rated as more ecological and naturalistic than the prototypical expressions. In Experiment 2, we examined the characteristics of subtle versus prototypical expressions by subjecting them to a software classifier, which used prototypical basic emotion criteria. Although the software was highly successful at classifying prototypical expressions, it performed very poorly at classifying the subtle expressions. Further validation was obtained from human expert face coders: Subtle stimuli did not contain many of the key facial movements present in prototypical expressions. Together, these findings suggest that emotions may be successfully conveyed to human viewers using subtle nonprototypical expressions. Although classic prototypical facial expressions are well recognized, they appear less naturalistic and may not capture the richness of everyday emotional communication.

  5. Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time.

    PubMed

    Jack, Rachael E; Garrod, Oliver G B; Schyns, Philippe G

    2014-01-20

    Designed by biological and social evolutionary pressures, facial expressions of emotion comprise specific facial movements to support a near-optimal system of signaling and decoding. Although facial expressions are highly dynamic, little is known about the form and function of their temporal dynamics. Do facial expressions transmit diagnostic signals simultaneously to optimize categorization of the six classic emotions, or sequentially to support a more complex communication system of successive categorizations over time? Our data support the latter. Using a combination of perceptual expectation modeling, information theory, and Bayesian classifiers, we show that dynamic facial expressions of emotion transmit an evolving hierarchy of "biologically basic to socially specific" information over time. Early in the signaling dynamics, facial expressions systematically transmit few, biologically rooted face signals supporting the categorization of fewer elementary categories (e.g., approach/avoidance). Later transmissions comprise more complex signals that support categorization of a larger number of socially specific categories (i.e., the six classic emotions). Here, we show that dynamic facial expressions of emotion provide a sophisticated signaling system, questioning the widely accepted notion that emotion communication is composed of six basic (i.e., psychologically irreducible) categories, and instead suggesting four.
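
    The idea of successive categorizations can be illustrated with a toy sequential Bayesian update over emotion categories. The likelihood values below are invented purely for illustration: an early, biologically shared signal (e.g., AU5, upper-lid raiser) leaves fear and surprise tied, and only a later diagnostic signal (e.g., AU20, lip stretcher) resolves the specific emotion.

    ```python
    EMOTIONS = ["happy", "surprise", "fear", "disgust", "anger", "sad"]

    # p(observed signal | emotion) at two successive time steps (toy numbers).
    likelihoods = [
        {"fear": 0.9, "surprise": 0.9, "happy": 0.05,      # t1: AU5 observed
         "disgust": 0.05, "anger": 0.3, "sad": 0.05},
        {"fear": 0.8, "surprise": 0.1, "happy": 0.05,      # t2: AU20 observed
         "disgust": 0.1, "anger": 0.2, "sad": 0.1},
    ]

    posterior = {e: 1.0 / len(EMOTIONS) for e in EMOTIONS}  # uniform prior
    for t, lik in enumerate(likelihoods, start=1):
        posterior = {e: posterior[e] * lik[e] for e in EMOTIONS}
        z = sum(posterior.values())
        posterior = {e: p / z for e, p in posterior.items()}
        print(f"t{t}:", {e: round(p, 2) for e, p in posterior.items()})
    # t1: fear and surprise are tied (a broad, avoidance-like category);
    # t2: fear dominates (a socially specific category).
    ```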

  6. Influence of Intensity on Children's Sensitivity to Happy, Sad, and Fearful Facial Expressions

    ERIC Educational Resources Information Center

    Gao, Xiaoqing; Maurer, Daphne

    2009-01-01

    Most previous studies investigating children's ability to recognize facial expressions used only intense exemplars. Here we compared the sensitivity of 5-, 7-, and 10-year-olds with that of adults (n = 24 per age group) for less intense expressions of happiness, sadness, and fear. The developmental patterns differed across expressions. For…

  7. Revisiting the Relationship between the Processing of Gaze Direction and the Processing of Facial Expression

    ERIC Educational Resources Information Center

    Ganel, Tzvi

    2011-01-01

    There is mixed evidence on the nature of the relationship between the perception of gaze direction and the perception of facial expressions. Major support for shared processing of gaze and expression comes from behavioral studies that showed that observers cannot process expression or gaze and ignore irrelevant variations in the other dimension.…

  8. Does Facial Expressivity Count? How Typically Developing Children Respond Initially to Children with Autism

    ERIC Educational Resources Information Center

    Stagg, Steven D.; Slavny, Rachel; Hand, Charlotte; Cardoso, Alice; Smith, Pamela

    2014-01-01

    Research investigating expressivity in children with autism spectrum disorder has reported flat affect or bizarre facial expressivity within this population; however, the impact expressivity may have on first impression formation has received little research input. We examined how videos of children with autism spectrum disorder were rated for…

  10. Facial age cues and emotional expression interact asymmetrically: age cues moderate emotion categorisation.

    PubMed

    Craig, Belinda M; Lipp, Ottmar V

    2017-04-03

    Facial attributes such as race, sex, and age can interact with emotional expressions; however, only a couple of studies have investigated the nature of the interaction between facial age cues and emotional expressions, and these have produced inconsistent results. Additionally, these studies have not addressed the mechanism(s) driving the influence of facial age cues on emotional expression or vice versa. In the current study, participants categorised young and older adult faces expressing happiness and anger (Experiment 1) or sadness (Experiment 2) by their age and their emotional expression. Age cues moderated categorisation of happiness vs. anger and sadness in the absence of an influence of emotional expression on age categorisation times. This asymmetrical interaction suggests that facial age cues are obligatorily processed prior to emotional expressions. Finding a categorisation advantage for happiness expressed on young faces relative to both anger and sadness, which are negative in valence but differ in their congruence with old-age stereotypes and structural overlap with age cues, suggests that the observed influence of facial age cues on emotion perception is due to the congruence between relatively positive evaluations of young faces and happy expressions.

  11. Face-selective regions differ in their ability to classify facial expressions

    PubMed Central

    Zhang, Hui; Japee, Shruti; Nolan, Rachel; Chu, Carlton; Liu, Ning; Ungerleider, Leslie G

    2016-01-01

    Recognition of facial expressions is crucial for effective social interactions. Yet, the extent to which the various face-selective regions in the human brain classify different facial expressions remains unclear. We used functional magnetic resonance imaging (fMRI) and support vector machine pattern classification analysis to determine how well face-selective brain regions are able to decode different categories of facial expression. Subjects participated in a slow event-related fMRI experiment in which they were shown 32 face pictures, portraying four different expressions: neutral, fearful, angry, and happy and belonging to eight different identities. Our results showed that only the amygdala and the posterior superior temporal sulcus (STS) were able to accurately discriminate between these expressions, albeit in different ways: The amygdala discriminated fearful faces from non-fearful faces, whereas STS discriminated neutral from emotional (fearful, angry and happy) faces. In contrast to these findings on the classification of emotional expression, only the fusiform face area (FFA) and anterior inferior temporal cortex (aIT) could discriminate among the various facial identities. Further, the amygdala and STS were better than FFA and aIT at classifying expression, while FFA and aIT were better than the amygdala and STS at classifying identity. Taken together, our findings indicate that the decoding of facial emotion and facial identity occurs in different neural substrates: the amygdala and STS for the former and FFA and aIT for the latter. PMID:26826513
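
    A schematic of the support-vector-machine pattern classification used in studies like this one, written with scikit-learn, is given below. The data here are random, so decoding accuracy sits at chance (0.25); with real trial-wise response patterns from a region such as the amygdala or STS, above-chance accuracy would indicate that the region carries expression information. All shapes and labels are illustrative.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    # Hypothetical trial-by-voxel response patterns for one face-selective
    # ROI: 128 trials (32 per expression) x 200 voxels.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((128, 200))
    y = np.repeat(["neutral", "fearful", "angry", "happy"], 32)

    clf = SVC(kernel="linear")
    cv = StratifiedKFold(n_splits=8, shuffle=True, random_state=0)
    acc = cross_val_score(clf, X, y, cv=cv).mean()
    print(f"cross-validated decoding accuracy: {acc:.2f} (chance = 0.25)")
    ```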

  12. Facial expression recognition in the wild based on multimodal texture features

    NASA Astrophysics Data System (ADS)

    Sun, Bo; Li, Liandong; Zhou, Guoyan; He, Jun

    2016-11-01

    Facial expression recognition in the wild is a very challenging task. We describe our work in static and continuous facial expression recognition in the wild. We evaluate the recognition results of gray deep features and color deep features, and explore the fusion of multimodal texture features. For continuous facial expression recognition, we design two temporal-spatial dense scale-invariant feature transform (SIFT) features and combine multimodal features to recognize expression from image sequences. For static facial expression recognition based on video frames, we extract dense SIFT and several deep convolutional neural network (CNN) features, including our proposed CNN architecture. We train linear support vector machine and partial least squares classifiers for these features on the static facial expression in the wild (SFEW) and acted facial expression in the wild (AFEW) datasets, and we propose a fusion network that combines all the extracted features at the decision level. Our final recognition rates are 56.32% on the SFEW test set and 50.67% on the AFEW validation set, well above the baseline rates of 35.96% and 36.08%, respectively.
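
    Decision-level fusion of this kind can be sketched as a weighted combination of the per-class score matrices produced by the individual classifiers. The paper's fusion network learns its combination; the fixed-weight average below, with hypothetical score matrices, is only a minimal illustration of the idea.

    ```python
    import numpy as np

    def fuse_decisions(score_matrices, weights=None):
        """Weighted average of per-class score matrices from different
        feature-specific classifiers, followed by an argmax over classes.

        score_matrices: list of (n_samples, n_classes) arrays.
        weights: optional per-classifier weights (normalised internally).
        """
        scores = np.stack(score_matrices)              # (n_models, n, c)
        if weights is None:
            weights = np.ones(len(score_matrices))
        w = np.asarray(weights, dtype=float)
        w /= w.sum()
        fused = np.tensordot(w, scores, axes=1)        # (n, c)
        return fused.argmax(axis=1)

    # e.g., fusing hypothetical dense-SIFT and CNN classifier scores,
    # trusting the CNN twice as much:
    rng = np.random.default_rng(1)
    sift_scores = rng.random((10, 7))   # 7 expression classes
    cnn_scores = rng.random((10, 7))
    labels = fuse_decisions([sift_scores, cnn_scores], weights=[1.0, 2.0])
    ```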

  13. Recognition of facial expressions and prosodic cues with graded emotional intensities in adults with Asperger syndrome.

    PubMed

    Doi, Hirokazu; Fujisawa, Takashi X; Kanai, Chieko; Ohta, Haruhisa; Yokoi, Hideki; Iwanami, Akira; Kato, Nobumasa; Shinohara, Kazuyuki

    2013-09-01

    This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically-developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.

  14. Face-selective regions differ in their ability to classify facial expressions.

    PubMed

    Zhang, Hui; Japee, Shruti; Nolan, Rachel; Chu, Carlton; Liu, Ning; Ungerleider, Leslie G

    2016-04-15

    Recognition of facial expressions is crucial for effective social interactions. Yet, the extent to which the various face-selective regions in the human brain classify different facial expressions remains unclear. We used functional magnetic resonance imaging (fMRI) and support vector machine pattern classification analysis to determine how well face-selective brain regions are able to decode different categories of facial expression. Subjects participated in a slow event-related fMRI experiment in which they were shown 32 face pictures, portraying four different expressions: neutral, fearful, angry, and happy and belonging to eight different identities. Our results showed that only the amygdala and the posterior superior temporal sulcus (STS) were able to accurately discriminate between these expressions, albeit in different ways: the amygdala discriminated fearful faces from non-fearful faces, whereas STS discriminated neutral from emotional (fearful, angry and happy) faces. In contrast to these findings on the classification of emotional expression, only the fusiform face area (FFA) and anterior inferior temporal cortex (aIT) could discriminate among the various facial identities. Further, the amygdala and STS were better than FFA and aIT at classifying expression, while FFA and aIT were better than the amygdala and STS at classifying identity. Taken together, our findings indicate that the decoding of facial emotion and facial identity occurs in different neural substrates: the amygdala and STS for the former and FFA and aIT for the latter.

  15. 3D Face Model Dataset: Automatic Detection of Facial Expressions and Emotions for Educational Environments

    ERIC Educational Resources Information Center

    Chickerur, Satyadhyan; Joshi, Kartik

    2015-01-01

    Emotion detection using facial images is a technique that researchers have been using for the last two decades to try to analyze a person's emotional state given his/her image. Detection of various kinds of emotion using facial expressions of students in educational environment is useful in providing insight into the effectiveness of tutoring…

  16. Rules versus Prototype Matching: Strategies of Perception of Emotional Facial Expressions in the Autism Spectrum

    ERIC Educational Resources Information Center

    Rutherford, M. D.; McIntosh, Daniel N.

    2007-01-01

    When perceiving emotional facial expressions, people with autistic spectrum disorders (ASD) appear to focus on individual facial features rather than configurations. This paper tests whether individuals with ASD use these features in a rule-based strategy of emotional perception, rather than a typical, template-based strategy by considering…

  17. Recognition of Facially Expressed Emotions and Visual Search Strategies in Adults with Asperger Syndrome

    ERIC Educational Resources Information Center

    Falkmer, Marita; Bjallmark, Anna; Larsson, Matilda; Falkmer, Torbjorn

    2011-01-01

    Can the disadvantages persons with Asperger syndrome frequently experience with reading facially expressed emotions be attributed to a different visual perception, affecting their scanning patterns? Visual search strategies, particularly regarding the importance of information from the eye area, and the ability to recognise facially expressed…

  20. Social Context Modulates Facial Imitation of Children’s Emotional Expressions

    PubMed Central

    Jap-Tjong, Nadine; Spencer, Hannah; Hofman, Dennis

    2016-01-01

    Children use emotional facial expressions of others for guiding their behavior, a process which is important to a child’s social-emotional development. Earlier studies on facial interaction demonstrate that imitation of emotional expressions of others is automatic, yet can be dynamically modulated depending on contextual information. Considering the value of emotional expressions for children especially, we tested whether and to what extent information about children’s temperament and domestic situation alters mimicry of their emotional expressions. Results show that angry expressions of children displaying negative behavior resulted in stronger imitation, which may serve as a corrective signal. Sad facial expressions resulted in stronger imitation towards those behaving positively but only when exposed to a difficult domestic situation, indicating increased empathy towards these children. These findings shed new light on the dynamic implicit communicative processes that shape interaction with children of different social-emotional backgrounds. PMID:27930714

  1. Impaired recognition of prosody and subtle emotional facial expressions in Parkinson's disease.

    PubMed

    Buxton, Sharon L; MacDonald, Lorraine; Tippett, Lynette J

    2013-04-01

    Accurately recognizing the emotional states of others is crucial for successful social interactions and social relationships. Individuals with Parkinson's disease (PD) have shown deficits in emotional recognition abilities although findings have been inconsistent. This study examined recognition of emotions from prosody and from facial emotional expressions with three levels of subtlety, in 30 individuals with PD (without dementia) and 30 control participants. The PD group were impaired on the prosody task, with no differential impairments in specific emotions. PD participants were also impaired at recognizing facial expressions of emotion, with a significant association between how well they could recognize emotions in the two modalities, even after controlling for disease severity. When recognizing facial expressions, the PD group had no difficulty identifying prototypical Ekman and Friesen (1976) emotional faces, but were poorer than controls at recognizing the moderate and difficult levels of subtle expressions. They were differentially impaired at recognizing moderately subtle expressions of disgust and sad expressions at the difficult level. Notably, however, they were impaired at recognizing happy expressions at both levels of subtlety. Furthermore how well PD participants identified happy expressions conveyed by either face or voice was strongly related to accuracy in the other modality. This suggests dysfunction of overlapping components of the circuitry processing happy expressions in PD. This study demonstrates the usefulness of including subtle expressions of emotion, likely to be encountered in everyday life, when assessing recognition of facial expressions.

  2. Attention for emotional facial expressions in dysphoria: an eye-movement registration study.

    PubMed

    Leyman, Lemke; De Raedt, Rudi; Vaeyens, Roel; Philippaerts, Renaat M

    2011-01-01

    Previous research has demonstrated that depression is associated with dysfunctional attentional processing of emotional information. Most studies have examined this bias by registering response latencies. The present study employed an ecologically valid measurement of attentive processing, using eye-movement registration. Dysphoric and non-dysphoric participants viewed slides presenting sad, angry, happy and neutral facial expressions. For each type of expression, three components of visual attention were analysed: the relative fixation frequency, fixation time and glance duration. Attentional biases were also investigated for inverted facial expressions to ensure that they were not related to eye-catching facial features. Results indicated that non-dysphoric individuals fixated and dwelt longer on happy faces, whereas dysphoric individuals dwelt longer on sad and neutral faces. These results were not found for inverted facial expressions. The present findings are in line with the assumption that depression is associated with prolonged attentional elaboration on negative information.
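
    Two of the three attention components named above, relative fixation frequency and relative fixation time, reduce to simple per-AOI (area-of-interest) bookkeeping, sketched below; glance duration (consecutive fixations within an AOI before the gaze leaves it) is omitted for brevity. The fixation format and rectangular AOIs are assumptions for illustration.

    ```python
    def aoi_metrics(fixations, aois):
        """Relative fixation frequency and relative fixation time per AOI.

        fixations: iterable of (x, y, duration_ms) tuples.
        aois: dict mapping AOI name -> (x0, y0, x1, y1) bounding box.
        """
        counts = {name: 0 for name in aois}
        times = {name: 0.0 for name in aois}
        for x, y, dur in fixations:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    counts[name] += 1
                    times[name] += dur
        n = sum(counts.values()) or 1
        t = sum(times.values()) or 1.0
        return ({k: c / n for k, c in counts.items()},
                {k: d / t for k, d in times.items()})

    # e.g., one AOI per quadrant of a four-expression slide (hypothetical):
    aois = {"sad": (0, 0, 400, 300), "happy": (400, 0, 800, 300),
            "angry": (0, 300, 400, 600), "neutral": (400, 300, 800, 600)}
    freq, time_share = aoi_metrics([(120, 80, 310), (620, 450, 180)], aois)
    ```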

  3. Facial Emotion Recognition and Expression in Parkinson’s Disease: An Emotional Mirror Mechanism?

    PubMed Central

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J.; Kilner, James

    2017-01-01

    Background and aim: Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants in order to explore the relationship between these two abilities and any differences between the two groups of participants. Methods: Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability of recognizing emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer-screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. Results: For emotion recognition, PD reported lower score than HC for Ekman total score (p<0.001), and for single emotions sub-scores happiness, fear, anger, sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between the emotion facial recognition and

  4. Facial Emotion Recognition and Expression in Parkinson's Disease: An Emotional Mirror Mechanism?

    PubMed

    Ricciardi, Lucia; Visco-Comandini, Federica; Erro, Roberto; Morgante, Francesca; Bologna, Matteo; Fasano, Alfonso; Ricciardi, Diego; Edwards, Mark J; Kilner, James

    2017-01-01

    Parkinson's disease (PD) patients have impairment of facial expressivity (hypomimia) and difficulties in interpreting the emotional facial expressions produced by others, especially for aversive emotions. We aimed to evaluate the ability to produce facial emotional expressions and to recognize facial emotional expressions produced by others in a group of PD patients and a group of healthy participants in order to explore the relationship between these two abilities and any differences between the two groups of participants. Twenty non-demented, non-depressed PD patients and twenty healthy participants (HC) matched for demographic characteristics were studied. The ability of recognizing emotional facial expressions was assessed with the Ekman 60-faces test (Emotion recognition task). Participants were video-recorded while posing facial expressions of 6 primary emotions (happiness, sadness, surprise, disgust, fear and anger). The most expressive pictures for each emotion were derived from the videos. Ten healthy raters were asked to look at the pictures displayed on a computer-screen in pseudo-random fashion and to identify the emotional label in a six-forced-choice response format (Emotion expressivity task). Reaction time (RT) and accuracy of responses were recorded. At the end of each trial the participant was asked to rate his/her confidence in his/her perceived accuracy of response. For emotion recognition, PD reported lower score than HC for Ekman total score (p<0.001), and for single emotions sub-scores happiness, fear, anger, sadness (p<0.01) and surprise (p = 0.02). In the facial emotion expressivity task, PD and HC significantly differed in the total score (p = 0.05) and in the sub-scores for happiness, sadness, anger (all p<0.001). RT and the level of confidence showed significant differences between PD and HC for the same emotions. There was a significant positive correlation between the emotion facial recognition and expressivity in both groups; the

  5. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity--Evidence from Gazing Patterns.

    PubMed

    Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V; Hänninen, Laura; Krause, Christina M; Vainio, Outi

    2016-01-01

    Appropriate response to companions' emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs' gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs' gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to attentional bias, which was dependent on the depicted species: threatening conspecifics' faces evoked heightened attention but threatening human faces instead an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel perspective on

  6. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity – Evidence from Gazing Patterns

    PubMed Central

    Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V.; Hänninen, Laura; Krause, Christina M.; Vainio, Outi

    2016-01-01

    Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore if domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but eyes were the most probable targets of the first fixations and gathered longer looking durations than mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but the interpretation of the composition formed by eyes, midface and mouth. Dogs evaluated social threat rapidly and this evaluation led to attentional bias, which was dependent on the depicted species: threatening conspecifics’ faces evoked heightened attention but threatening human faces instead an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel

  7. Organization of the central control of muscles of facial expression in man

    PubMed Central

    Root, A A; Stephens, J A

    2003-01-01

    Surface EMGs were recorded simultaneously from ipsilateral pairs of facial muscles while subjects made three different common facial expressions: the smile, a sad expression and an expression of horror, and three contrived facial expressions. Central peaks were found in the cross-correlograms of EMG activity recorded from the orbicularis oculi and zygomaticus major during smiling, the corrugator and depressor anguli oris during the sad look and the frontalis and mentalis during the horror look. The size of the central peak was significantly greater between the orbicularis oculi and zygomaticus major during smiling. It is concluded that co-contraction of facial muscles during some facial expressions is accompanied by the presence of common synaptic drive to the motoneurones supplying the muscles involved. Central peaks were found in the cross-correlograms of EMG activity recorded from the frontalis and depressor anguli oris during a contrived expression. However, no central peaks were found in the cross-correlograms of EMG activity recorded from the frontalis and orbicularis oculi or from the frontalis and zygomaticus major during the other two contrived expressions. It is concluded that a common synaptic drive is not present between all possible facial muscle pairs, which suggests a functional role for the synergy. The origin of the common drive is discussed. It is concluded that activity in branches of common stem last-order presynaptic input fibres to motoneurones innervating the different facial muscles and presynaptic synchronization of input activity to the different motoneurone pools is involved. The former probably contributes more to the drive to the orbicularis oculi and zygomaticus major during smiling, while the latter is probably more prevalent in the corrugator and depressor anguli oris during the sad look, the frontalis and mentalis during the horror look and the frontalis and depressor anguli oris during one of the contrived expressions. The strength

  8. Organization of the central control of muscles of facial expression in man.

    PubMed

    Root, A A; Stephens, J A

    2003-05-15

    Surface EMGs were recorded simultaneously from ipsilateral pairs of facial muscles while subjects made three different common facial expressions: the smile, a sad expression and an expression of horror, and three contrived facial expressions. Central peaks were found in the cross-correlograms of EMG activity recorded from the orbicularis oculi and zygomaticus major during smiling, the corrugator and depressor anguli oris during the sad look and the frontalis and mentalis during the horror look. The size of the central peak was significantly greater between the orbicularis oculi and zygomaticus major during smiling. It is concluded that co-contraction of facial muscles during some facial expressions is accompanied by the presence of common synaptic drive to the motoneurones supplying the muscles involved. Central peaks were found in the cross-correlograms of EMG activity recorded from the frontalis and depressor anguli oris during a contrived expression. However, no central peaks were found in the cross-correlograms of EMG activity recorded from the frontalis and orbicularis oculi or from the frontalis and zygomaticus major during the other two contrived expressions. It is concluded that a common synaptic drive is not present between all possible facial muscle pairs, which suggests a functional role for the synergy. The origin of the common drive is discussed. It is concluded that activity in branches of common stem last-order presynaptic input fibres to motoneurones innervating the different facial muscles and presynaptic synchronization of input activity to the different motoneurone pools is involved. The former probably contributes more to the drive to the orbicularis oculi and zygomaticus major during smiling, while the latter is probably more prevalent in the corrugator and depressor anguli oris during the sad look, the frontalis and mentalis during the horror look and the frontalis and depressor anguli oris during one of the contrived expressions. The strength
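
    A minimal sketch of building a cross-correlogram from two surface EMG channels is given below: the signals are rectified and demeaned, and correlation is computed across a range of lags, so that a peak near zero lag indicates common synaptic drive to the two motoneurone pools. The normalisation and the significance criterion for a "central peak" in the actual study are more involved; this is illustrative only.

    ```python
    import numpy as np

    def cross_correlogram(emg_a, emg_b, fs, max_lag_ms=50):
        """Cross-correlogram of two surface EMG channels.

        emg_a, emg_b: 1-D arrays sampled at fs Hz (assumed much longer
        than the maximum lag). Returns (lags in ms, normalised correlation).
        """
        a = np.abs(emg_a) - np.mean(np.abs(emg_a))   # rectify and demean
        b = np.abs(emg_b) - np.mean(np.abs(emg_b))
        max_lag = int(fs * max_lag_ms / 1000)
        lags = np.arange(-max_lag, max_lag + 1)
        cc = np.array([
            np.dot(a[max(0, -l):len(a) - max(0, l)],
                   b[max(0, l):len(b) - max(0, -l)])
            for l in lags
        ])
        return lags / fs * 1000.0, cc / (len(a) * a.std() * b.std())

    # e.g., two synthetic channels sharing a common drive component:
    rng = np.random.default_rng(2)
    drive = rng.standard_normal(20000)
    ch1 = drive + rng.standard_normal(20000)
    ch2 = drive + rng.standard_normal(20000)
    lags_ms, cc = cross_correlogram(ch1, ch2, fs=2000)
    print(f"peak at {lags_ms[cc.argmax()]:.1f} ms")  # expect a peak near 0 ms
    ```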

  9. The role of context in interpreting facial expression: comment on Russell and Fehr (1987).

    PubMed

    Ekman, P; O'Sullivan, M

    1988-03-01

    In their article, "Relativity in the Perception of Emotion in Facial Expressions," Russell and Fehr (1987) argued that context is the principal determinant in interpreting facial expressions of emotion. They questioned the biological bases for emotion suggested by Darwin and supported by many cross-cultural studies. We suggest that their results occurred because the target faces they used were emotionally neutral or ambiguous. We also argue that their findings can be interpreted as supporting the communicative importance of the face.

  10. Impact of Childhood Maltreatment on the Recognition of Facial Expressions of Emotions

    PubMed Central

    Ardizzi, Martina; Martini, Francesca; Umiltà, Maria Alessandra; Evangelista, Valentina; Ravera, Roberto; Gallese, Vittorio

    2015-01-01

    The development of the explicit recognition of facial expressions of emotions can be affected by childhood maltreatment experiences. A previous study demonstrated the existence of an explicit recognition bias for angry facial expressions among a population of adolescent Sierra Leonean street-boys exposed to high levels of maltreatment. In the present study, the recognition bias for angry facial expressions was investigated in a younger population of street-children and age-matched controls. Participants performed a forced-choice facial expressions recognition task. Recognition bias was measured as participants’ tendency to over-attribute anger label to other negative facial expressions. Participants’ heart rate was assessed and related to their behavioral performance, as index of their stress-related physiological responses. Results demonstrated the presence of a recognition bias for angry facial expressions among street-children, also pinpointing a similar, although significantly less pronounced, tendency among controls. Participants’ performance was controlled for age, cognitive and educational levels and for naming skills. None of these variables influenced the recognition bias for angry facial expressions. Differently, a significant effect of heart rate on participants’ tendency to use anger label was evidenced. Taken together, these results suggest that childhood exposure to maltreatment experiences amplifies children’s “pre-existing bias” for anger labeling in forced-choice emotion recognition task. Moreover, they strengthen the thesis according to which the recognition bias for angry facial expressions is a manifestation of a functional adaptive mechanism that tunes victim’s perceptive and attentive focus on salient environmental social stimuli. PMID:26509890

  11. Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

    PubMed

    Recio, Guillermo; Wilhelm, Oliver; Sommer, Werner; Hildebrandt, Andrea

    2017-04-01

    Despite a wealth of knowledge about the neural mechanisms behind emotional facial expression processing, little is known about how they relate to individual differences in social cognition abilities. We studied individual differences in the event-related potentials (ERPs) elicited by dynamic facial expressions. First, we assessed the latent structure of the ERPs, reflecting structural face processing in the N170, and the allocation of processing resources and reflexive attention to emotionally salient stimuli, in the early posterior negativity (EPN) and the late positive complex (LPC). Then we estimated brain-behavior relationships between the ERP factors and behavioral indicators of facial identity and emotion-processing abilities. Structural models revealed that the participants who formed faster structural representations of neutral faces (i.e., shorter N170 latencies) performed better at face perception (r = -.51) and memory (r = -.42). The N170 amplitude was not related to individual differences in face cognition or emotion processing. The latent EPN factor correlated with emotion perception (r = .47) and memory (r = .32), and also with face perception abilities (r = .41). Interestingly, the latent factor representing the difference in EPN amplitudes between the two neutral control conditions (chewing and blinking movements) also correlated with emotion perception (r = .51), highlighting the importance of tracking facial changes in the perception of emotional facial expressions. The LPC factor for negative expressions correlated with the memory for emotional facial expressions. The links revealed between the latency and strength of activations of brain systems and individual differences in processing socio-emotional information provide new insights into the brain mechanisms involved in social communication.

  12. Spontaneous facial expressions of emotion of congenitally and noncongenitally blind individuals.

    PubMed

    Matsumoto, David; Willingham, Bob

    2009-01-01

    The study of the spontaneous expressions of blind individuals offers a unique opportunity to understand basic processes concerning the emergence and source of facial expressions of emotion. In this study, the authors compared the expressions of congenitally and noncongenitally blind athletes in the 2004 Paralympic Games with each other and with those produced by sighted athletes in the 2004 Olympic Games. The authors also examined how expressions change from 1 context to another. There were no differences between congenitally blind, noncongenitally blind, and sighted athletes, either on the level of individual facial actions or in facial emotion configurations. Blind athletes did produce more overall facial activity, but these were isolated to head and eye movements. The blind athletes' expressions differentiated whether they had won or lost a medal match at 3 different points in time, and there were no cultural differences in expression. These findings provide compelling evidence that the production of spontaneous facial expressions of emotion is not dependent on observational learning but simultaneously demonstrates a learned component to the social management of expressions, even among blind individuals.

  13. Interpreting text messages with graphic facial expression by deaf and hearing people.

    PubMed

    Saegusa, Chihiro; Namatame, Miki; Watanabe, Katsumi

    2015-01-01

    In interpreting verbal messages, humans use not only verbal information but also non-verbal signals such as facial expression. For example, when a person says "yes" with a troubled face, what he or she really means appears ambiguous. In the present study, we examined how deaf and hearing people differ in perceiving real meanings in texts accompanied by representations of facial expression. Deaf and hearing participants were asked to imagine that the face presented on the computer monitor had been asked a question by another person (e.g., do you like her?). They observed either a realistic or a schematic face with a different magnitude of positive or negative expression on a computer monitor. A balloon that contained either a positive or negative text response to the question appeared at the same time as the face. Then, participants rated how much the individual on the monitor really meant it (i.e., perceived earnestness), using a 7-point scale. Results showed that the facial expression significantly modulated the perceived earnestness. The influence of positive expression on negative text responses was relatively weaker than that of negative expression on positive responses (i.e., "no" tended to mean "no" irrespective of facial expression) for both participant groups. However, this asymmetrical effect was stronger in the hearing group. These results suggest that the contribution of facial expression in perceiving real meanings from text messages is qualitatively similar but quantitatively different between deaf and hearing people.

  14. Interpreting text messages with graphic facial expression by deaf and hearing people

    PubMed Central

    Saegusa, Chihiro; Namatame, Miki; Watanabe, Katsumi

    2015-01-01

    In interpreting verbal messages, humans use not only verbal information but also non-verbal signals such as facial expression. For example, when a person says “yes” with a troubled face, what he or she really means appears ambiguous. In the present study, we examined how deaf and hearing people differ in perceiving real meanings in texts accompanied by representations of facial expression. Deaf and hearing participants were asked to imagine that the face presented on the computer monitor had been asked a question by another person (e.g., do you like her?). They observed either a realistic or a schematic face with a different magnitude of positive or negative expression on a computer monitor. A balloon that contained either a positive or negative text response to the question appeared at the same time as the face. Then, participants rated how much the individual on the monitor really meant it (i.e., perceived earnestness), using a 7-point scale. Results showed that the facial expression significantly modulated the perceived earnestness. The influence of positive expression on negative text responses was relatively weaker than that of negative expression on positive responses (i.e., “no” tended to mean “no” irrespective of facial expression) for both participant groups. However, this asymmetrical effect was stronger in the hearing group. These results suggest that the contribution of facial expression in perceiving real meanings from text messages is qualitatively similar but quantitatively different between deaf and hearing people. PMID:25883582

  15. Recognizing dynamic facial expressions of emotion: Specificity and intensity effects in event-related brain potentials.

    PubMed

    Recio, Guillermo; Schacht, Annekathrin; Sommer, Werner

    2014-02-01

    Emotional facial expressions usually arise dynamically from a neutral expression. Yet, most previous research focused on static images. The present study investigated basic aspects of processing dynamic facial expressions. In two experiments, we presented short videos of facial expressions of six basic emotions and non-emotional facial movements emerging at variable and fixed rise times, attaining different intensity levels. In event-related brain potentials (ERPs), effects appeared not only for emotional but also for non-emotional movements as an early posterior negativity (EPN) between 200 and 350 ms, suggesting an overall facilitation of early visual encoding for all facial movements. These EPN effects were emotion-unspecific. In contrast, relative to happy and neutral expressions, negative emotional expressions elicited larger late positive ERP components (LPCs), indicating more elaborate processing. Both EPN and LPC amplitudes increased with expression intensity. Effects of emotion and intensity were additive, indicating that intensity (understood as the degree of motion) increases the impact of emotional expressions but not their quality. These processes can be driven by all basic emotions, and there is little emotion-specificity even when statistical power is considerable (N = 102 in Experiment 2).
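
    Component measures like the EPN and LPC are typically quantified as mean amplitudes within a time window over a channel group. The sketch below shows that computation; the 200-350 ms EPN window comes from the abstract, while the LPC window and channel groupings are illustrative assumptions.

    ```python
    import numpy as np

    def mean_amplitude(erp, times, window):
        """Mean ERP amplitude within a component time window.

        erp: (n_channels, n_times) averaged waveform in microvolts.
        times: (n_times,) time vector in ms, locked to expression onset.
        window: (start_ms, end_ms), inclusive.
        """
        mask = (times >= window[0]) & (times <= window[1])
        return erp[:, mask].mean(axis=1)

    # Synthetic example: 32 channels, -200..800 ms at 1 ms resolution.
    rng = np.random.default_rng(3)
    times = np.arange(-200, 801)
    erp = rng.normal(0, 1, size=(32, times.size))

    posterior_idx = np.arange(24, 32)        # hypothetical posterior channels
    centroparietal_idx = np.arange(10, 18)   # hypothetical centro-parietal channels
    epn = mean_amplitude(erp[posterior_idx], times, (200, 350))        # EPN
    lpc = mean_amplitude(erp[centroparietal_idx], times, (400, 700))   # assumed LPC window
    ```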

  16. Rapid influence of emotional scenes on encoding of facial expressions: an ERP study.

    PubMed

    Righart, Ruthger; de Gelder, Beatrice

    2008-09-01

    In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized by using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information at this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes, and were expressed in left occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.

  17. Rapid influence of emotional scenes on encoding of facial expressions: an ERP study

    PubMed Central

    Righart, Ruthger

    2008-01-01

    In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized by using isolated faces, it is unclear what role the context plays. Although it has been observed that the N170 for facial expressions is modulated by the emotional context, it was not clear whether individuals use context information at this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes as compared to faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes as compared to fearful faces in happy scenes, and were expressed in left occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing. PMID:19015119

  18. Social anxiety and trustworthiness judgments of dynamic facial expressions of emotion.

    PubMed

    Gutiérrez-García, Aida; Calvo, Manuel G

    2016-09-01

    Perception of trustworthiness in other people is essential for successful social interaction. Facial expressions, as conveyors of feelings and intentions, are an important source of this information. We investigated how social anxiety is related to biases in the judgment of faces towards un/trustworthiness depending on the type of emotional expression and its intensity. Undergraduates with clinical levels of social anxiety and low-anxiety controls were presented with 1-s video-clips displaying facial happiness, anger, fear, sadness, disgust, surprise, or neutrality, at various levels of emotional intensity. Participants judged how trustworthy the expressers looked. Social anxiety was associated with enhanced distrust towards angry and disgusted expressions, and this occurred at lower intensity thresholds, relative to non-anxious controls. There was no effect for other negative expressions (sadness and fear), basically ambiguous expressions (surprise and neutral), or happy faces. The social anxiety and control groups both consisted of more females than males, although this gender disproportion was the same in both groups. Also, the expressive speed rate differed across the intensity conditions, although such differences were equated across all expressions and both groups. Individuals with high social anxiety overestimate perceived social danger even from subtle facial cues, thus exhibiting a threat-related interpretative bias in the form of untrustworthiness judgments. Such a bias is, nevertheless, limited to facial expressions conveying direct threat such as hostility and rejection.

  19. Perceptual and affective mechanisms in facial expression recognition: An integrative review.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2016-09-01

    Facial expressions of emotion involve a physical component of morphological changes in a face and an affective component conveying information about the expresser's internal feelings. It remains unresolved how much recognition and discrimination of expressions rely on the perception of morphological patterns or the processing of affective content. This review of research on the role of visual and emotional factors in expression recognition reached three major conclusions. First, behavioral, neurophysiological, and computational measures indicate that basic expressions are reliably recognized and discriminated from one another, albeit the effect may be inflated by the use of prototypical expression stimuli and forced-choice responses. Second, affective content along the dimensions of valence and arousal is extracted early from facial expressions, although this coarse affective representation contributes minimally to categorical recognition of specific expressions. Third, the physical configuration and visual saliency of facial features contribute significantly to expression recognition, with "emotionless" computational models being able to reproduce some of the basic phenomena demonstrated in human observers. We conclude that facial expression recognition, as it has been investigated in conventional laboratory tasks, depends to a greater extent on perceptual than affective information and mechanisms.

  20. Functional integration of the posterior superior temporal sulcus correlates with facial expression recognition.

    PubMed

    Wang, Xu; Song, Yiying; Zhen, Zonglei; Liu, Jia

    2016-05-01

    Face perception is essential for daily and social activities. Neuroimaging studies have revealed a distributed face network (FN) consisting of multiple regions that exhibit preferential responses to invariant or changeable facial information. However, our understanding about how these regions work collaboratively to facilitate facial information processing is limited. Here, we focused on changeable facial information processing, and investigated how the functional integration of the FN is related to the performance of facial expression recognition. To do so, we first defined the FN as voxels that responded more strongly to faces than objects, and then used a voxel-based global brain connectivity method based on resting-state fMRI to characterize the within-network connectivity (WNC) of each voxel in the FN. By relating the WNC and performance in the "Reading the Mind in the Eyes" Test across participants, we found that individuals with stronger WNC in the right posterior superior temporal sulcus (rpSTS) were better at recognizing facial expressions. Further, the resting-state functional connectivity (FC) between the rpSTS and right occipital face area (rOFA), early visual cortex (EVC), and bilateral STS were positively correlated with the ability of facial expression recognition, and the FCs of EVC-pSTS and OFA-pSTS contributed independently to facial expression recognition. In short, our study highlights the behavioral significance of intrinsic functional integration of the FN in facial expression processing, and provides evidence for the hub-like role of the rpSTS for facial expression recognition.
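
    The within-network connectivity measure described here reduces, in its simplest form, to each voxel's average correlation with every other voxel in the face network. A minimal sketch, assuming a time-by-voxel matrix of preprocessed resting-state signals with no constant voxels:

    ```python
    import numpy as np

    def within_network_connectivity(ts):
        """Per-voxel within-network connectivity (WNC): mean Pearson
        correlation between each voxel's time series and all other
        face-network voxels.

        ts: (n_timepoints, n_voxels) array of resting-state signals.
        """
        z = (ts - ts.mean(axis=0)) / ts.std(axis=0)   # z-score each voxel
        r = (z.T @ z) / ts.shape[0]                   # voxel-by-voxel correlations
        np.fill_diagonal(r, np.nan)                   # exclude self-correlation
        return np.nanmean(r, axis=1)

    # e.g., 200 timepoints x 500 face-network voxels (synthetic):
    rng = np.random.default_rng(4)
    ts = rng.standard_normal((200, 500))
    wnc = within_network_connectivity(ts)   # one WNC value per voxel
    ```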

  1. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia

    PubMed Central

    Daini, Roberta; Comparetti, Chiara M.; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition. PMID:25520643

  2. [Association between intelligence development and facial expression recognition ability in children with autism spectrum disorder].

    PubMed

    Pan, Ning; Wu, Gui-Hua; Zhang, Ling; Zhao, Ya-Fen; Guan, Han; Xu, Cai-Juan; Jing, Jin; Jin, Yu

    2017-03-01

    To investigate the features of intelligence development, facial expression recognition ability, and the association between them in children with autism spectrum disorder (ASD). A total of 27 ASD children aged 6-16 years (ASD group, full intelligence quotient >70) and age- and gender-matched normally developed children (control group) were enrolled. Wechsler Intelligence Scale for Children Fourth Edition and Chinese Static Facial Expression Photos were used for intelligence evaluation and facial expression recognition test. Compared with the control group, the ASD group had significantly lower scores of full intelligence quotient, verbal comprehension index, perceptual reasoning index (PRI), processing speed index (PSI), and working memory index (WMI) (P<0.05). The ASD group also had a significantly lower overall accuracy rate of facial expression recognition and significantly lower accuracy rates of the recognition of happy, angry, sad, and frightened expressions than the control group (P<0.05). In the ASD group, the overall accuracy rate of facial expression recognition and the accuracy rates of the recognition of happy and frightened expressions were positively correlated with PRI (r=0.415, 0.455, and 0.393 respectively; P<0.05). The accuracy rate of the recognition of angry expression was positively correlated with WMI (r=0.397; P<0.05). ASD children have delayed intelligence development compared with normally developed children and impaired expression recognition ability. Perceptual reasoning and working memory abilities are positively correlated with expression recognition ability, which suggests that insufficient perceptual reasoning and working memory abilities may be important factors affecting facial expression recognition ability in ASD children.

  3. Behavioral dissociation between emotional and non-emotional facial expressions in congenital prosopagnosia.

    PubMed

    Daini, Roberta; Comparetti, Chiara M; Ricciardelli, Paola

    2014-01-01

    Neuropsychological and neuroimaging studies have shown that facial recognition and emotional expressions are dissociable. However, it is unknown if a single system supports the processing of emotional and non-emotional facial expressions. We aimed to understand if individuals with impairment in face recognition from birth (congenital prosopagnosia, CP) can use non-emotional facial expressions to recognize a face as an already seen one, and thus, process this facial dimension independently from features (which are impaired in CP), and basic emotional expressions. To this end, we carried out a behavioral study in which we compared the performance of 6 CP individuals to that of typically developing individuals, using upright and inverted faces. Four avatar faces with a neutral expression were presented in the initial phase. The target faces presented in the recognition phase, in which a recognition task was requested (2AFC paradigm), could be identical (neutral) to those of the initial phase or present biologically plausible changes to features, non-emotional expressions, or emotional expressions. After this task, a second task was performed, in which the participants had to detect whether or not the recognized face exactly matched the study face or showed any difference. The results confirmed the CPs' impairment in the configural processing of the invariant aspects of the face, but also showed a spared configural processing of non-emotional facial expression (task 1). Interestingly, and unlike the non-emotional expressions, the configural processing of emotional expressions was compromised in CPs and did not improve their change detection ability (task 2). These new results have theoretical implications for face perception models since they suggest that, at least in CPs, non-emotional expressions are processed configurally, can be dissociated from other facial dimensions, and may serve as a compensatory strategy to achieve face recognition.

  4. Electromyographic Responses to Emotional Facial Expressions in 6-7 Year Olds with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Deschamps, P. K. H.; Coppes, L.; Kenemans, J. L.; Schutter, D. J. L. G.; Matthys, W.

    2015-01-01

    This study aimed to examine facial mimicry in 6-7 year old children with autism spectrum disorder (ASD) and to explore whether facial mimicry was related to the severity of impairment in social responsiveness. Facial electromyographic activity in response to angry, fearful, sad and happy facial expressions was recorded in twenty 6-7 year old…

  6. Intact recognition of facial expression, gender, and age in patients with impaired recognition of face identity.

    PubMed

    Tranel, D; Damasio, A R; Damasio, H

    1988-05-01

    We conducted a series of experiments to assess the ability to recognize the meaning of facial expressions, gender, and age in four patients with severe impairments of the recognition of facial identity. In three patients the recognition of face identity could be dissociated from that of facial expression, age, and gender. In one, all forms of face recognition were impaired. Thus, a given lesion may preclude one type of recognition but not another. We conclude that (1) the cognitive demands posed by different forms of recognition are met at different processing levels, and (2) different levels depend on different neural substrates.

  7. Laterality of facial expressions of emotion: Universal and culture-specific influences.

    PubMed

    Mandal, Manas K; Ambady, Nalini

    2004-01-01

    Recent research indicates that (a) the perception and expression of facial emotion are lateralized to a great extent in the right hemisphere, and (b) whereas facial expressions of emotion embody universal signals, culture-specific learning moderates the expression and interpretation of these emotions. In the present article, we review the literature on laterality and universality and propose that, although some components of facial expressions of emotion are governed biologically, others are culturally influenced. We suggest that the left side of the face is more expressive of emotions, is more uninhibited, and displays culture-specific emotional norms. The right side of the face, on the other hand, is less susceptible to cultural display norms and exhibits more universal emotional signals.

  8. Towards a Unified Framework for Pose, Expression, and Occlusion Tolerant Automatic Facial Alignment.

    PubMed

    Seshadri, Keshav; Savvides, Marios

    2016-10-01

    We propose a facial alignment algorithm that is able to jointly deal with the presence of facial pose variation, partial occlusion of the face, and varying illumination and expressions. Our approach proceeds from sparse to dense landmarking steps using a set of specific models trained to best account for the shape and texture variation manifested by facial landmarks and facial shapes across pose and various expressions. We also propose the use of a novel l1-regularized least squares approach that we incorporate into our shape model, which is an improvement over the shape model used by several prior Active Shape Model (ASM) based facial landmark localization algorithms. Our approach is compared against several state-of-the-art methods on many challenging test datasets and exhibits a higher fitting accuracy on all of them.
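
    The l1-regularized least-squares fit named above can be sketched generically. Below is a minimal illustration, not the authors' implementation: it solves min_x 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient descent (ISTA), where the basis matrix A, the observation vector b, and the weight lam are illustrative stand-ins for a shape basis, observed landmark coordinates, and a tuning parameter.

      import numpy as np

      def soft_threshold(v, t):
          # Elementwise soft-thresholding: the proximal operator of the l1 norm.
          return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

      def l1_least_squares(A, b, lam=0.1, n_iter=500):
          # Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 with ISTA.
          step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              grad = A.T @ (A @ x - b)                # gradient of the smooth term
              x = soft_threshold(x - step * grad, lam * step)
          return x

      # Toy usage: recover sparse coefficients from noisy observations.
      rng = np.random.default_rng(0)
      A = rng.standard_normal((40, 20))               # stand-in shape basis
      x_true = np.zeros(20)
      x_true[[2, 7]] = [1.5, -2.0]
      b = A @ x_true + 0.01 * rng.standard_normal(40)
      print(np.round(l1_least_squares(A, b, lam=0.5), 2))

    The l1 penalty drives most coefficients to exactly zero, so only shape components actually supported by the observed landmarks survive, which is one plausible reason such a penalty helps under partial occlusion.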

  9. Effects of exposure to facial expression variation in face learning and recognition.

    PubMed

    Liu, Chang Hong; Chen, Wenfeng; Ward, James

    2015-11-01

    Facial expression is a major source of image variation in face images. Linking numerous expressions to the same face can be a huge challenge for face learning and recognition. It remains largely unknown what level of exposure to this image variation is critical for expression-invariant face recognition. We examined this issue in a recognition memory task in which the number of facial expressions shown for each face during a training session was manipulated. Faces were trained with either multiple expressions or a single expression, and they were later tested in either the same or different expressions. We found that recognition performance after learning three emotional expressions showed no improvement over learning a single emotional expression (Experiments 1 and 2). However, learning three emotional expressions improved recognition compared to learning a single neutral expression (Experiment 3). These findings reveal both the limitation and the benefit of multiple exposures to variations of emotional expression in achieving expression-invariant face recognition. The transfer of expression training to a new type of expression is likely to depend on a relatively extensive level of training and a certain degree of variation across the types of expressions.

  10. Posed versus spontaneous facial expressions are modulated by opposite cerebral hemispheres.

    PubMed

    Ross, Elliott D; Pulusu, Vinay K

    2013-05-01

    Clinical research has indicated that the left face is more expressive than the right face, suggesting that modulation of facial expressions is lateralized to the right hemisphere. The findings, however, are controversial because the results explain, on average, approximately 4% of the data variance. Using high-speed videography, we sought to determine if movement-onset asymmetry was a more powerful research paradigm than terminal movement asymmetry. The results were very robust, explaining up to 70% of the data variance. Posed expressions began overwhelmingly on the right face whereas spontaneous expressions began overwhelmingly on the left face. This dichotomy was most robust for upper facial expressions. In addition, movement-onset asymmetries did not predict terminal movement asymmetries, which were not significantly lateralized. The results support recent neuroanatomic observations that upper versus lower facial movements have different forebrain motor representations and recent behavioral constructs that posed versus spontaneous facial expressions are modulated preferentially by opposite cerebral hemispheres and that spontaneous facial expressions are graded rather than non-graded movements.

  11. Exposure to the self-face facilitates identification of dynamic facial expressions: influences on individual differences.

    PubMed

    Li, Yuan Hang; Tottenham, Nim

    2013-04-01

    A growing literature suggests that the self-face is involved in processing the facial expressions of others. The authors experimentally activated self-face representations to assess their effects on the recognition of dynamically emerging facial expressions of others. They exposed participants to videos of either their own faces (self-face prime) or the faces of others (nonself-face prime) prior to a facial expression judgment task. The results show that experimentally activating self-face representations leads to earlier recognition of dynamically emerging facial expressions. As a group, participants in the self-face prime condition recognized expressions earlier (when less affective perceptual information was available) than participants in the nonself-face prime condition. There were individual differences in performance, such that poorer expression identification was associated with higher autism traits (in this neurocognitively healthy sample). However, when randomized into the self-face prime condition, participants with high autism traits performed as well as those with low autism traits. Taken together, these data suggest that the ability to recognize facial expressions in others is linked with the internal representations of our own faces.

  12. Use of 3D faces facilitates facial expression recognition in children

    PubMed Central

    Wang, Lamei; Chen, Wenfeng; Li, Hong

    2017-01-01

    This study assessed whether presenting 3D face stimuli could facilitate children’s facial expression recognition. Seventy-one children aged between 3 and 6 participated in the study. Their task was to judge whether a face presented in each trial showed a happy or fearful expression. Half of the face stimuli were shown with 3D representations, whereas the other half of the images were shown as 2D pictures. We compared expression recognition under these conditions. The results showed that the use of 3D faces improved the speed of facial expression recognition in both boys and girls. Moreover, 3D faces improved boys’ recognition accuracy for fearful expressions. Since fear is the most difficult facial expression for children to recognize, the facilitation effect of 3D faces has important practical implications for children with difficulties in facial expression recognition. The potential benefits of 3D representation for other expressions also have implications for developing more realistic assessments of children’s expression recognition. PMID:28368008

  13. Synthesis of Facial Image with Expression Based on Muscular Contraction Parameters Using Linear Muscle and Sphincter Muscle

    NASA Astrophysics Data System (ADS)

    Ahn, Seonju; Ozawa, Shinji

    We aim to synthesize individual facial images with expressions based on muscular contraction parameters. We have previously proposed a method of calculating the muscular contraction parameters from an arbitrary face image without per-individual learning; as a result, we could generate not only an individual's facial expressions but also the facial expressions of various persons. In this paper, we propose a muscle-based facial model in which the facial muscles comprise both linear muscles and a novel sphincter muscle, together with a method of synthesizing an individual facial image with expression from muscular contraction parameters. First, an individual facial model with expression is generated by fitting to the arbitrary face image. Next, the muscular contraction parameters corresponding to the expression displacement of the input face image are calculated. Finally, the facial expression is synthesized by displacing the vertices of a neutral facial model according to the calculated muscular contraction parameters. Experimental results show that the model with the novel sphincter muscle can synthesize facial expressions that correspond to actual face images with arbitrary mouth or eye expressions.
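
    The synthesis step described above is, at its core, a displacement blend over a neutral mesh. The sketch below assumes a purely linear model with hypothetical arrays; it does not reproduce the paper's actual muscle geometry (in particular the sphincter muscle). Vertices are displaced by a contraction-weighted sum of per-muscle displacement fields, and the inverse step estimates contractions from an observed face by least squares.

      import numpy as np

      def synthesize_expression(neutral, muscle_basis, contractions):
          # neutral: (V, 3) rest vertices; muscle_basis: (M, V, 3) displacement
          # field per muscle at full contraction; contractions: (M,) weights.
          displacement = np.tensordot(contractions, muscle_basis, axes=1)
          return neutral + displacement

      def estimate_contractions(neutral, observed, muscle_basis):
          # Least-squares estimate of contraction parameters from displacements.
          D = muscle_basis.reshape(muscle_basis.shape[0], -1).T   # (3V, M)
          d = (observed - neutral).ravel()
          c, *_ = np.linalg.lstsq(D, d, rcond=None)
          return c

      # Toy round trip: a 4-vertex "mesh" and two hypothetical muscles.
      rng = np.random.default_rng(1)
      neutral = rng.standard_normal((4, 3))
      basis = 0.1 * rng.standard_normal((2, 4, 3))
      face = synthesize_expression(neutral, basis, np.array([0.8, 0.3]))
      print(np.round(estimate_contractions(neutral, face, basis), 2))  # ~[0.8 0.3]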

  14. How Do We Update Faces? Effects of Gaze Direction and Facial Expressions on Working Memory Updating

    PubMed Central

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes. PMID:23060832

  15. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    PubMed Central

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans, but not dogs, higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy, in particular, affects both the speed and intensity of ratings of dogs' emotional facial expressions. PMID:28114335

  16. Support vector machine-based facial-expression recognition method combining shape and appearance

    NASA Astrophysics Data System (ADS)

    Han, Eun Jung; Kang, Byung Jun; Park, Kang Ryoung; Lee, Sangyoun

    2010-11-01

    Facial expression recognition can be widely used for various applications, such as emotion-based human-machine interaction, intelligent robot interfaces, and face recognition robust to expression variation. Previous studies have been classified as either shape- or appearance-based. The shape-based method has the disadvantage that individual variance of facial feature points exists irrespective of similar expressions, which can reduce recognition accuracy. The appearance-based method is limited in that the textural information of the face is very sensitive to variations in illumination. To overcome these problems, a new facial-expression recognition method is proposed that combines both shape and appearance information, based on the support vector machine (SVM). This research is novel in the following three ways compared to previous work. First, the facial feature points are automatically detected by using an active appearance model; from these, shape-based recognition is performed using the ratios between the facial feature points, based on the facial action coding system. Second, an SVM trained to recognize same- and different-expression classes is proposed to combine the two matching scores obtained from the shape- and appearance-based recognition. Finally, a single SVM is trained to discriminate four different expressions: neutral, smile, anger, and scream. By assigning the input facial image the expression whose SVM output is at a minimum, the accuracy of expression recognition is much enhanced. The experimental results showed that the recognition accuracy of the proposed method was better than that of previous approaches and other fusion methods.
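
    The score-level fusion step can be sketched with scikit-learn. Everything below is a synthetic placeholder: each sample is a pair of matching scores (shape-based, appearance-based), and an SVM is trained to separate "same expression" from "different expression" pairs, loosely mirroring the fusion SVM described above.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)
      # Column 0 = shape matching score, column 1 = appearance matching score.
      # "Same expression" pairs (label 1) get higher scores than "different" (0).
      same = rng.normal(loc=[0.8, 0.7], scale=0.1, size=(100, 2))
      diff = rng.normal(loc=[0.4, 0.3], scale=0.1, size=(100, 2))
      X = np.vstack([same, diff])
      y = np.array([1] * 100 + [0] * 100)

      clf = SVC(kernel="rbf", C=1.0).fit(X, y)          # fusion classifier
      print(clf.predict([[0.75, 0.65], [0.35, 0.40]]))  # expected: [1 0]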

  17. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    PubMed

    Kujala, Miiamaaria V; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans, but not dogs, higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy, in particular, affects both the speed and intensity of ratings of dogs' emotional facial expressions.

  18. Does gaze direction modulate facial expression processing in children with autism spectrum disorder?

    PubMed

    Akechi, Hironori; Senju, Atsushi; Kikuchi, Yukiko; Tojo, Yoshikuni; Osanai, Hiroo; Hasegawa, Toshikazu

    2009-01-01

    Two experiments investigated whether children with autism spectrum disorder (ASD) integrate relevant communicative signals, such as gaze direction, when decoding a facial expression. In Experiment 1, typically developing children (9-14 years old; n = 14) were faster at detecting a facial expression accompanying a gaze direction with a congruent motivational tendency (i.e., an avoidant facial expression with averted eye gaze) than those with an incongruent motivational tendency. Children with ASD (9-14 years old; n = 14) were not affected by the gaze direction of facial stimuli. This finding was replicated in Experiment 2, which presented only the eye region of the face to typically developing children (n = 10) and children with ASD (n = 10). These results demonstrated that children with ASD do not encode and/or integrate multiple communicative signals based on their affective or motivational tendency.

  19. Development and validation of an Argentine set of facial expressions of emotion.

    PubMed

    Vaiman, Marcelo; Wagner, Mónica Anna; Caicedo, Estefanía; Pereno, Germán Leandro

    2017-02-01

    Pictures of facial expressions of emotion are used in a wide range of experiments. The last decade has seen an increase in the number of studies presenting local sets of emotion stimuli. However, only a few existing sets contain pictures of Latin Americans, despite the growing attention emotion research is receiving in this region. Here we present the development and validation of the Universidad Nacional de Cordoba, Expresiones de Emociones Faciales (UNCEEF), a Facial Action Coding System (FACS)-verified set of pictures of Argentineans expressing the six basic emotions, plus neutral expressions. FACS scores, recognition rates, Hu scores, and discrimination indices are reported. Evidence of convergent validity was obtained using the Pictures of Facial Affect in an Argentine sample. However, recognition accuracy was greater for UNCEEF. The importance of local sets of emotion pictures is discussed.
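
    The Hu scores reported above conventionally denote Wagner's (1993) unbiased hit rate, which corrects raw recognition accuracy for response bias. A minimal sketch, with a made-up three-emotion confusion matrix rather than the UNCEEF validation data:

      import numpy as np

      def unbiased_hit_rates(confusion):
          # Wagner's unbiased hit rate per emotion, for a confusion matrix with
          # rows = presented emotion and columns = chosen label:
          #   Hu_i = hits_i**2 / (stimulus_total_i * response_total_i)
          confusion = np.asarray(confusion, dtype=float)
          hits = np.diag(confusion)
          return hits ** 2 / (confusion.sum(axis=1) * confusion.sum(axis=0))

      c = [[45,  3,  2],    # happy presented
           [ 5, 38,  7],    # sad presented
           [ 4, 12, 34]]    # fear presented
      print(np.round(unbiased_hit_rates(c), 2))   # -> [0.75 0.54 0.54]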

  20. An optimized ERP brain-computer interface based on facial expression changes

    NASA Astrophysics Data System (ADS)

    Jin, Jing; Daly, Ian; Zhang, Yu; Wang, Xingyu; Cichocki, Andrzej

    2014-06-01

    Objective. Interferences from spatially adjacent non-target stimuli are known to evoke event-related potentials (ERPs) during non-target flashes and, therefore, lead to false positives. This phenomenon is commonly seen in visual attention-based brain-computer interfaces (BCIs) using conspicuous stimuli and is known to adversely affect the performance of BCI systems. Although users try to focus on the target stimulus, they cannot help but be affected by conspicuous changes of the stimuli (such as flashes or presented images) adjacent to the target stimulus. Furthermore, subjects have reported that conspicuous stimuli make them tired and annoyed. In view of this, the aim of this study was to reduce adjacent interference, annoyance and fatigue using a new stimulus presentation pattern based upon facial expression changes. Our goal was not to design a new pattern that could evoke larger ERPs than the face pattern, but one that could reduce adjacent interference, annoyance and fatigue while evoking ERPs as good as those observed with the face pattern. Approach. Positive facial expressions can be changed to negative facial expressions by minor changes to the original facial image. Although the changes are minor, the contrast is large enough to evoke strong ERPs. In this paper, a facial expression change pattern between positive and negative facial expressions was used to attempt to minimize interference effects. This was compared against two different conditions: a shuffled pattern containing the same shapes and colours as the facial expression change pattern, but without the semantic content associated with a change in expression, and a face versus no-face pattern. Comparisons were made in terms of classification accuracy and information transfer rate as well as user-supplied subjective measures. Main results. The results showed that interference from adjacent stimuli, and the annoyance and fatigue experienced by the subjects, could be reduced by the facial expression change pattern.
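
    The information transfer rate for such ERP spellers is conventionally computed with the Wolpaw formula; the abstract does not state exactly which variant the study used, so the sketch below is a generic illustration with made-up parameter values:

      import math

      def wolpaw_itr(n_classes, accuracy, trial_seconds):
          # Information transfer rate in bits/min (Wolpaw formula).
          n, p = n_classes, accuracy
          if p <= 1.0 / n:
              return 0.0
          bits = math.log2(n)
          if p < 1.0:
              bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
          return bits * 60.0 / trial_seconds

      # e.g., 12 selectable items, 90% accuracy, 10 s per selection:
      print(round(wolpaw_itr(12, 0.90, 10.0), 2), "bits/min")   # ~16.62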

  1. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder.

    PubMed

    Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can

    2017-01-01

    Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad-neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy-neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy-neutral expressions context, as well as sad and neutral expressions in the sad-neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy-neutral expressions context. • The present study investigated whether the unconscious processing of facial expressions is influenced by excessive online gaming. A validated

  2. Unconscious Processing of Facial Expressions in Individuals with Internet Gaming Disorder

    PubMed Central

    Peng, Xiaozhe; Cui, Fang; Wang, Ting; Jiao, Can

    2017-01-01

    Internet Gaming Disorder (IGD) is characterized by impairments in social communication and the avoidance of social contact. Facial expression processing is the basis of social communication. However, few studies have investigated how individuals with IGD process facial expressions, and whether they have deficits in emotional facial processing remains unclear. The aim of the present study was to explore these two issues by investigating the time course of emotional facial processing in individuals with IGD. A backward masking task was used to investigate the differences between individuals with IGD and normal controls (NC) in the processing of subliminally presented facial expressions (sad, happy, and neutral) with event-related potentials (ERPs). The behavioral results showed that individuals with IGD are slower than NC in response to both sad and neutral expressions in the sad–neutral context. The ERP results showed that individuals with IGD exhibit decreased amplitudes in ERP component N170 (an index of early face processing) in response to neutral expressions compared to happy expressions in the happy–neutral expressions context, which might be due to their expectancies for positive emotional content. The NC, on the other hand, exhibited comparable N170 amplitudes in response to both happy and neutral expressions in the happy–neutral expressions context, as well as sad and neutral expressions in the sad–neutral expressions context. Both individuals with IGD and NC showed comparable ERP amplitudes during the processing of sad expressions and neutral expressions. The present study revealed that individuals with IGD have different unconscious neutral facial processing patterns compared with normal individuals and suggested that individuals with IGD may expect more positive emotion in the happy–neutral expressions context. Highlights: • The present study investigated whether the unconscious processing of facial expressions is influenced by excessive

  3. Perceptual, Categorical, and Affective Processing of Ambiguous Smiling Facial Expressions

    ERIC Educational Resources Information Center

    Calvo, Manuel G.; Fernandez-Martin, Andres; Nummenmaa, Lauri

    2012-01-01

    Why is a face with a smile but non-happy eyes likely to be interpreted as happy? We used blended expressions in which a smiling mouth was incongruent with the eyes (e.g., angry eyes), as well as genuine expressions with congruent eyes and mouth (e.g., both happy or angry). Tasks involved detection of a smiling mouth (perceptual), categorization of…

  4. Discrimination and Imitation of Facial Expressions by Neonates.

    ERIC Educational Resources Information Center

    Field, Tiffany

    Findings of a series of studies on individual differences and maturational changes in expressivity at the neonatal stage and during early infancy are reported. Research results indicate that newborns are able to discriminate and imitate the basic emotional expressions: happy, sad, and surprised. Results show widened infant lips when the happy…

  6. Development and Standardization of Extended ChaeLee Korean Facial Expressions of Emotions.

    PubMed

    Lee, Kyoung-Uk; Kim, Jieun; Yeon, Bora; Kim, Seung-Hwan; Chae, Jeong-Ho

    2013-06-01

    In recent years there has been an enormous increase in neuroscience research using the facial expressions of emotion. This has led to a need for ethnically specific facial expression data, due to differences in facial emotion processing among different ethnicities. Fifty professional actors were asked to pose with each of the following facial expressions in turn: happiness, sadness, fear, anger, disgust, surprise, and neutral. A total of 283 facial pictures of 40 actors were selected to be included in the validation study. Facial expression emotion identification was performed in a validation study by 104 healthy raters who provided emotion labeling, valence ratings, and arousal ratings. A total of 259 images of 37 actors were selected for inclusion in the Extended ChaeLee Korean Facial Expressions of Emotions tool, based on the analysis of results. In these images, the actors' mean age was 38±11.1 years (range 26-60 years), with 16 (43.2%) males and 21 (56.8%) females. The consistency varied by emotion type, showing the highest for happiness (95.5%) and the lowest for fear (49.0%). The mean scores for the valence ratings ranged from 4.0 (happiness) to 1.9 (sadness, anger, and disgust). The mean scores for the arousal ratings ranged from 3.7 (anger and fear) to 2.5 (neutral). We obtained facial expressions from individuals of Korean ethnicity and performed a study to validate them. Our results provide a tool for the affective neurosciences which could be used for the investigation of mechanisms of emotion processing in healthy individuals as well as in patients with various psychiatric disorders.

  7. Express

    Integrated Risk Information System (IRIS)

    Express; CASRN 101200-48-0. Human health assessment information on a chemical substance is included in the IRIS database only after a comprehensive review of toxicity data, as outlined in the IRIS assessment development process. Sections I (Health Hazard Assessments for Noncarcinogenic Effect

  8. N170 response to facial expressions is modulated by the affective congruency between the emotional expression and preceding affective picture.

    PubMed

    Hietanen, Jari K; Astikainen, Piia

    2013-02-01

    Does contextual affective information influence the processing of facial expressions already at the relatively early stages of face processing? We measured event-related brain potentials to happy and sad facial expressions primed by preceding pictures with affectively positive and negative scenes. The face-sensitive N170 response amplitudes showed a clear affective priming effect: N170 amplitudes to happy faces were larger when presented after positive vs. negative primes, whereas the N170 amplitudes to sad faces were larger when presented after negative vs. positive primes. Priming effects were also observed on later brain responses. The results support an early integration in processing of contextual and facial affective information. The results also provide neurophysiological support for theories suggesting that behavioral affective priming effects are based, at least in part, on facilitation of encoding of incoming affective information.

  9. Social Alienation in Schizophrenia Patients: Association with Insula Responsiveness to Facial Expressions of Disgust

    PubMed Central

    Lindner, Christian; Dannlowski, Udo; Walhöfer, Kirsten; Rödiger, Maike; Maisch, Birgit; Bauer, Jochen; Ohrmann, Patricia; Lencer, Rebekka; Zwitserlood, Pienie; Kersting, Anette; Heindel, Walter; Arolt, Volker

    2014-01-01

    Introduction: Among the functional neuroimaging studies on emotional face processing in schizophrenia, few have used paradigms with facial expressions of disgust. In this study, we investigated whether schizophrenia patients show less insula activation to macro-expressions (overt, clearly visible expressions) and micro-expressions (covert, very brief expressions) of disgust than healthy controls. Furthermore, departing from the assumption that disgust faces signal social rejection, we examined whether perceptual sensitivity to disgust is related to social alienation in patients and controls. We hypothesized that high insula responsiveness to facial disgust predicts social alienation. Methods: We used functional magnetic resonance imaging to measure insula activation in 36 schizophrenia patients and 40 healthy controls. During scanning, subjects passively viewed covert and overt presentations of disgust and neutral faces. To measure social alienation, a social loneliness scale and an agreeableness scale were administered. Results: Schizophrenia patients exhibited reduced insula activation in response to covert facial expressions of disgust. With respect to macro-expressions of disgust, no between-group differences emerged. In patients, insula responsiveness to covert faces of disgust was positively correlated with social loneliness. Furthermore, patients' insula responsiveness to covert and overt faces of disgust was negatively correlated with agreeableness. In controls, insula responsiveness to covert expressions of disgust correlated negatively with agreeableness. Discussion: Schizophrenia patients show reduced insula responsiveness to micro-expressions but not macro-expressions of disgust compared to healthy controls. In patients, low agreeableness was associated with stronger insula response to micro- and macro-expressions of disgust. Patients with a strong tendency to feel uncomfortable with social interactions appear to be characterized by a high sensitivity for

  10. Internal representations reveal cultural diversity in expectations of facial expressions of emotion.

    PubMed

    Jack, Rachael E; Caldara, Roberto; Schyns, Philippe G

    2012-02-01

    Facial expressions have long been considered the "universal language of emotion." Yet consistent cultural differences in the recognition of facial expressions contradict such notions (e.g., R. E. Jack, C. Blais, C. Scheepers, P. G. Schyns, & R. Caldara, 2009). Rather, culture, as an intricate system of social concepts and beliefs, could generate different expectations (i.e., internal representations) of facial expression signals. To investigate, the authors used a powerful psychophysical technique (reverse correlation) to estimate the observer-specific internal representations of the 6 basic facial expressions of emotion (i.e., happy, surprise, fear, disgust, anger, and sad) in two culturally distinct groups (i.e., Western Caucasian [WC] and East Asian [EA]). Using complementary statistical image analyses, cultural specificity was directly revealed in these representations. Specifically, whereas WC internal representations predominantly featured the eyebrows and mouth, EA internal representations showed a preference for expressive information in the eye region. Closer inspection of the EA observer preference revealed a surprising feature: changes of gaze direction, shown primarily among the EA group. For the first time, it is revealed directly that culture can finely shape the internal representations of common facial expressions of emotion, challenging notions of a biologically hardwired "universal language of emotion."

  11. Long-term academic stress enhances early processing of facial expressions.

    PubMed

    Zhang, Liang; Qin, Shaozheng; Yao, Zhuxi; Zhang, Kan; Wu, Jianhui

    2016-11-01

    Exposure to long-term stress can lead to a variety of emotional and behavioral problems. Although widely investigated, the neural basis of how long-term stress impacts emotional processing in humans remains largely elusive. Using event-related brain potentials (ERPs), we investigated the effects of long-term stress on the neural dynamics of emotional facial expression processing. Thirty-nine male college students undergoing preparation for a major examination and twenty-one matched controls performed a gender discrimination task for faces displaying angry, happy, and neutral expressions. The results of the Perceived Stress Scale showed that participants in the stress group perceived higher levels of long-term stress relative to the control group. ERP analyses revealed differential effects of long-term stress on two early stages of facial expression processing: 1) long-term stress generally augmented posterior P1 amplitudes to facial stimuli irrespective of expression valence, suggesting that stress can increase sensitization to visual inputs in general, and 2) long-term stress selectively augmented fronto-central P2 amplitudes for angry but not for neutral or positive facial expressions, suggesting that stress may lead to increased attentional prioritization to processing negative emotional stimuli. Together, our findings suggest that long-term stress has profound impacts on the early stages of facial expression processing, with an increase at the very early stage of general information inputs and a subsequent attentional bias toward processing emotionally negative stimuli.

  12. Young Infants Match Facial and Vocal Emotional Expressions of Other Infants.

    PubMed

    Vaillant-Molina, Mariana; Bahrick, Lorraine E; Flom, Ross

    2013-08-01

    Research has demonstrated that infants recognize emotional expressions of adults in the first half-year of life. We extended this research to a new domain, infant perception of the expressions of other infants. In an intermodal matching procedure, 3.5- and 5-month-old infants heard a series of infant vocal expressions (positive and negative affect) along with side-by-side dynamic videos in which one infant conveyed positive facial affect and another infant conveyed negative facial affect. Results demonstrated that 5-month-olds matched the vocal expressions with the affectively congruent facial expressions, whereas 3.5-month-olds showed no evidence of matching. These findings indicate that by 5 months of age, infants detect, discriminate, and match the facial and vocal affective displays of other infants. Further, because the facial and vocal expressions were portrayed by different infants and shared no face-voice synchrony, temporal or intensity patterning, matching was likely based on detection of a more general affective valence common to the face and voice.

  13. Young Infants Match Facial and Vocal Emotional Expressions of Other Infants

    PubMed Central

    Vaillant-Molina, Mariana; Bahrick, Lorraine E.; Flom, Ross

    2013-01-01

    Research has demonstrated that infants recognize emotional expressions of adults in the first half-year of life. We extended this research to a new domain, infant perception of the expressions of other infants. In an intermodal matching procedure, 3.5- and 5-month-old infants heard a series of infant vocal expressions (positive and negative affect) along with side-by-side dynamic videos in which one infant conveyed positive facial affect and another infant conveyed negative facial affect. Results demonstrated that 5-month-olds matched the vocal expressions with the affectively congruent facial expressions, whereas 3.5-month-olds showed no evidence of matching. These findings indicate that by 5 months of age, infants detect, discriminate, and match the facial and vocal affective displays of other infants. Further, because the facial and vocal expressions were portrayed by different infants and shared no face-voice synchrony, temporal or intensity patterning, matching was likely based on detection of a more general affective valence common to the face and voice. PMID:24302853

  14. Quantitative analysis of Euclidean distance to complement qualitative analysis of facial expression during deception

    PubMed Central

    Mondal, Ananya; Mukhopadhyay, Pritha; Basu, Nabanita; Bandyopadhyay, Samir Kumar; Chatterjee, Tanima

    2016-01-01

    Background: Accurate evaluation of an individual's veracity is a fundamental aspect of social functioning that allows individuals to act in adaptive ways. The study of deception detection is still young, and many of its components remain unexplored, demanding further research in this field. Aims: The present study aims to decipher the structural composition of the face during felt, posed, and deceptive emotional expressions unique to Indian culture, using the Facial Action Coding System (FACS). Quantitative analysis of Euclidean distances has been done to complement the qualitative FACS analysis. Methods: In this study, thirty young female adults aged 23-27 years were chosen randomly to portray felt, posed, and deceptive facial expressions. All facial expressions were captured through instruction, and videos were converted into static images. The static images were coded on the basis of FACS to decipher the felt, posed, and deceptive expressions. Quantitative analysis of the data was done using MATLAB to meet the objectives of the study and to complement the qualitative analysis. Results: Felt and posed emotions differ in the intensity of the expression and in subjective experience. Posed and deceptive expressions differ in intent. Facial asymmetry is an important indicator for detecting deception. PMID:28163412
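
    The Euclidean-distance analysis is straightforward to reproduce generically (the study used MATLAB; the landmark coordinates and indices below are hypothetical):

      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      # Hypothetical 2-D landmark coordinates (x, y) for one face image.
      landmarks = np.array([
          [120.0, 200.0],   # 0: left mouth corner
          [180.0, 202.0],   # 1: right mouth corner
          [130.0, 150.0],   # 2: left inner eyebrow
          [170.0, 151.0],   # 3: right inner eyebrow
      ])

      dist = squareform(pdist(landmarks))   # pairwise Euclidean distances
      mouth_width = dist[0, 1]
      brow_gap = dist[2, 3]
      # Illustrative asymmetry cue: left vs right mouth-to-brow distances.
      asymmetry = abs(dist[0, 2] - dist[1, 3])
      print(round(mouth_width, 1), round(brow_gap, 1), round(asymmetry, 2))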

  15. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.

    PubMed

    Nummenmaa, Lauri; Calvo, Manuel G

    2015-04-01

    Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques for resolving this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random effects meta-analysis was conducted to estimate effect sizes at population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. Robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing.
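
    Pooling correlation effect sizes under a random-effects model typically follows the recipe below (Fisher z transform plus the DerSimonian-Laird between-study variance estimate); whether the meta-analysis used exactly this estimator is not stated in the abstract, and the study inputs here are made up:

      import numpy as np

      def random_effects_r(rs, ns):
          # DerSimonian-Laird random-effects pooling of correlations.
          rs, ns = np.asarray(rs, float), np.asarray(ns, float)
          z = np.arctanh(rs)                 # Fisher z transform
          v = 1.0 / (ns - 3.0)               # within-study variance of z
          w = 1.0 / v
          z_fixed = np.sum(w * z) / np.sum(w)
          q = np.sum(w * (z - z_fixed) ** 2)               # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(rs) - 1)) / c)         # between-study variance
          w_star = 1.0 / (v + tau2)
          z_pooled = np.sum(w_star * z) / np.sum(w_star)
          return np.tanh(z_pooled)           # back-transform to r

      # Made-up example: five studies of a happy-face recognition advantage.
      print(round(random_effects_r([0.30, 0.45, 0.25, 0.50, 0.35],
                                   [40, 60, 55, 80, 100]), 3))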

  16. Recognition of facial expressions of emotions by 3-year-olds.

    PubMed

    Székely, Eszter; Tiemeier, Henning; Arends, Lidia R; Jaddoe, Vincent W V; Hofman, Albert; Verhulst, Frank C; Herba, Catherine M

    2011-04-01

    Very few large-scale studies have focused on emotional facial expression recognition (FER) in 3-year-olds, an age of rapid social and language development. We studied FER in 808 healthy 3-year-olds using verbal and nonverbal computerized tasks for four basic emotions (happiness, sadness, anger, and fear). Three-year-olds showed differential performance on the verbal and nonverbal FER tasks, especially with respect to fear: fear was among the most accurately recognized facial expressions when matched nonverbally and the least accurately recognized when labeled verbally. Sex influenced neither emotion-matching nor emotion-labeling performance after adjusting for basic matching or labeling ability. Three-year-olds made systematic errors in emotion-labeling: happy expressions were often confused with fearful expressions, whereas negative expressions were often confused with other negative expressions. Together, these findings suggest that 3-year-olds' FER skills strongly depend on task specifications, with fear being the most sensitive facial expression in this regard. Finally, in line with previous studies, we found that recognized emotion categories are initially broad, including emotions of the same valence, as reflected in the nonrandom errors of 3-year-olds.

  17. Recognition of Facial Expressions of Emotion in Adults with Down Syndrome

    ERIC Educational Resources Information Center

    Virji-Babul, Naznin; Watt, Kimberley; Nathoo, Farouk; Johnson, Peter

    2012-01-01

    Research on facial expressions in individuals with Down syndrome (DS) has been conducted using photographs. Our goal was to examine the effect of motion on perception of emotional expressions. Adults with DS, adults with typical development matched for chronological age (CA), and children with typical development matched for developmental age (DA)…

  18. Strategies for Perceiving Facial Expressions in Adults with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Walsh, Jennifer A.; Vida, Mark D.; Rutherford, M. D.

    2014-01-01

    Rutherford and McIntosh (J Autism Dev Disord 37:187-196, 2007) demonstrated that individuals with autism spectrum disorder (ASD) are more tolerant than controls of exaggerated schematic facial expressions, suggesting that they may use an alternative strategy when processing emotional expressions. The current study was designed to test this finding…

  19. Production of Emotional Facial Expressions in European American, Japanese, and Chinese Infants.

    ERIC Educational Resources Information Center

    Camras, Linda A.; And Others

    1998-01-01

    European American, Japanese, and Chinese 11-month-olds participated in emotion-inducing laboratory procedures. Facial responses were scored with BabyFACS, an anatomically based coding system. Overall, Chinese infants were less expressive than European American and Japanese infants, suggesting that differences in expressivity between European…

  20. Discrimination of Facial Expression by 5-Month-Old Infants of Nondepressed and Clinically Depressed Mothers

    PubMed Central

    Bornstein, Marc H.; Arterberry, Martha; Mash, Clay; Manian, Nanmathi

    2011-01-01

    Five-month-old infants of nondepressed and clinically depressed mothers were habituated to either a face with a neutral expression or the same face with a smile. Infants of nondepressed mothers subsequently discriminated between neutral and smiling facial expressions, whereas infants of clinically depressed mothers failed to make the same discrimination. PMID:21112092

  1. The Role of Facial Expressions in Attention-Orienting in Adults and Infants

    ERIC Educational Resources Information Center

    Rigato, Silvia; Menon, Enrica; Di Gangi, Valentina; George, Nathalie; Farroni, Teresa

    2013-01-01

    Faces convey many signals (i.e., gaze or expressions) essential for interpersonal interaction. We have previously shown that facial expressions of emotion and gaze direction are processed and integrated in specific combinations early in life. These findings open a number of developmental questions and specifically in this paper we address whether…

  3. Nine-year-old children use norm-based coding to visually represent facial expression.

    PubMed

    Burton, Nichola; Jeffery, Linda; Skinner, Andrew L; Benton, Christopher P; Rhodes, Gillian

    2013-10-01

    Children are less skilled than adults at making judgments about facial expression. This could be because they have not yet developed adult-like mechanisms for visually representing faces. Adults are thought to represent faces in a multidimensional face-space, and have been shown to code the expression of a face relative to the norm or average face in face-space. Norm-based coding is economical and adaptive, and may be what makes adults more sensitive to facial expression than children. This study investigated the coding system that children use to represent facial expression. An adaptation aftereffect paradigm was used to test 24 adults and 18 children (9 years 2 months to 9 years 11 months old). Participants adapted to weak and strong antiexpressions. They then judged the expression of an average expression. Adaptation created aftereffects that made the test face look like the expression opposite that of the adaptor. Consistent with the predictions of norm-based but not exemplar-based coding, aftereffects were larger for strong than weak adaptors for both age groups. Results indicate that, like adults, children's coding of facial expressions is norm-based.
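
    The norm-based prediction tested above (larger aftereffects for stronger adaptors) can be illustrated with a toy one-dimensional face-space model. The assumption, which is mine rather than the authors', is that adaptation shifts the norm by a fixed fraction of the adaptor's distance from it, so a physically average test face is perceived as displaced toward the opposite expression:

      def aftereffect(adaptor_strength, gain=0.4):
          # Toy norm-based coding account of expression aftereffects.
          # Faces lie on an expression axis through the norm (0 = average face).
          # Adaptation moves the norm a fraction `gain` toward the adaptor.
          shifted_norm = gain * adaptor_strength
          test_face = 0.0                    # the physically average expression
          return test_face - shifted_norm    # perceived position of the test

      for s in (-1.0, -0.5):                 # strong vs weak anti-expression
          print(f"adaptor {s:+.1f} -> perceived test {aftereffect(s):+.2f}")

    Under this account the aftereffect scales with adaptor strength, matching the larger aftereffects for strong adaptors reported above; a pure exemplar code has no norm to recalibrate and does not predict this scaling in the same direct way.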

  4. Positive, but Not Negative, Facial Expressions Facilitate 3-Month-Olds' Recognition of an Individual Face

    ERIC Educational Resources Information Center

    Brenna, Viola; Proietti, Valentina; Montirosso, Rosario; Turati, Chiara

    2013-01-01

    The current study examined whether and how the presence of a positive or a negative emotional expression may affect the face recognition process at 3 months of age. Using a familiarization procedure, Experiment 1 demonstrated that positive (i.e., happiness), but not negative (i.e., fear and anger) facial expressions facilitate infants' ability to…

  9. The Facial Expressive Action Stimulus Test. A test battery for the assessment of face memory, face and object perception, configuration processing, and facial expression recognition.

    PubMed

    de Gelder, Beatrice; Huis In 't Veld, Elisabeth M J; Van den Stock, Jan

    2015-01-01

    There are many ways to assess face perception skills. In this study, we describe a novel task battery, the FEAST (Facial Expressive Action Stimulus Test), developed to test recognition of the identity and expressions of human faces as well as control stimulus categories. The FEAST consists of a neutral and emotional face memory task, a face and shoe identity matching task, a face and house part-to-whole matching task, and a human and animal facial expression matching task. The identity and part-to-whole matching tasks contain both upright and inverted conditions. The results provide reference data from a healthy sample of controls in two age groups for future users of the FEAST.

  10. Facial expressions, their communicatory functions and neuro-cognitive substrates.

    PubMed Central

    Blair, R J R

    2003-01-01

    Human emotional expressions serve a crucial communicatory role allowing the rapid transmission of valence information from one individual to another. This paper will review the literature on the neural mechanisms necessary for this communication: both the mechanisms involved in the production of emotional expressions and those involved in the interpretation of the emotional expressions of others. Finally, reference to the neuro-psychiatric disorders of autism, psychopathy and acquired sociopathy will be made. In these conditions, the appropriate processing of emotional expressions is impaired. In autism, it is argued that the basic response to emotional expressions remains intact but that there is impaired ability to represent the referent of the individual displaying the emotion. In psychopathy, the response to fearful and sad expressions is attenuated and this interferes with socialization resulting in an individual who fails to learn to avoid actions that result in harm to others. In acquired sociopathy, the response to angry expressions in particular is attenuated resulting in reduced regulation of social behaviour. PMID:12689381

  11. Facial expressions, their communicatory functions and neuro-cognitive substrates.

    PubMed

    Blair, R J R

    2003-03-29

    Human emotional expressions serve a crucial communicatory role allowing the rapid transmission of valence information from one individual to another. This paper will review the literature on the neural mechanisms necessary for this communication: both the mechanisms involved in the production of emotional expressions and those involved in the interpretation of the emotional expressions of others. Finally, reference to the neuro-psychiatric disorders of autism, psychopathy and acquired sociopathy will be made. In these conditions, the appropriate processing of emotional expressions is impaired. In autism, it is argued that the basic response to emotional expressions remains intact but that there is impaired ability to represent the referent of the individual displaying the emotion. In psychopathy, the response to fearful and sad expressions is attenuated and this interferes with socialization resulting in an individual who fails to learn to avoid actions that result in harm to others. In acquired sociopathy, the response to angry expressions in particular is attenuated resulting in reduced regulation of social behaviour.

  12. Differential expression of wound fibrotic factors between facial and trunk dermal fibroblasts.

    PubMed

    Kurita, Masakazu; Okazaki, Mutsumi; Kaminishi-Tanikawa, Akiko; Niikura, Mamoru; Takushima, Akihiko; Harii, Kiyonori

    2012-01-01

    Clinically, wounds on the face tend to heal with less scarring than those on the trunk, but the causes of this difference have not been clarified. Fibroblasts obtained from different parts of the body are known to show different properties. To investigate whether the characteristic properties of facial and trunk wound healing are caused by differences in local fibroblasts, we comparatively analyzed the functional properties of superficial and deep dermal fibroblasts obtained from the facial and trunk skin of seven individuals, with an emphasis on tendency for fibrosis. Proliferation kinetics and mRNA and protein expression of 11 fibrosis-associated factors were investigated. The proliferation kinetics of facial and trunk fibroblasts were identical, but the expression and production levels of profibrotic factors, such as extracellular matrix, transforming growth factor-β1, and connective tissue growth factor mRNA, were lower in facial fibroblasts when compared with trunk fibroblasts, while the expression of antifibrotic factors, such as collagenase, basic fibroblast growth factor, and hepatocyte growth factor, showed no clear trends. The differences in functional properties of facial and trunk dermal fibroblasts were consistent with the clinical tendencies of healing of facial and trunk wounds. Thus, the differences between facial and trunk scarring are at least partly related to the intrinsic nature of the local dermal fibroblasts.

  13. Electromyographic responses to emotional facial expressions in 6-7-year-olds: a feasibility study.

    PubMed

    Deschamps, P K H; Schutte, I; Kenemans, J L; Matthys, W; Schutter, D J L G

    2012-08-01

    Preliminary studies have demonstrated that school-aged children (average age 9-10 years) show mimicry responses to happy and angry facial expressions. The aim of the present study was to assess the feasibility of using facial electromyography (EMG) as a method to study facial mimicry responses in younger children aged 6-7 years to emotional facial expressions of other children. Facial EMG activity to the presentation of dynamic emotional faces was recorded from the corrugator, zygomaticus, frontalis and depressor muscles in sixty-one healthy participants aged 6-7 years. Results showed that the presentation of angry faces was associated with corrugator activation and zygomaticus relaxation, happy faces with an increase in zygomaticus and a decrease in corrugator activation, fearful faces with frontalis activation, and sad faces with a combination of corrugator and frontalis activation. This study demonstrates the feasibility of measuring facial EMG responses to emotional facial expressions in 6-7-year-old children.
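
    For readers unfamiliar with how such mimicry effects are scored, the sketch below illustrates the usual logic: rectify the EMG trace, average it over a post-stimulus window, and subtract a pre-stimulus baseline, so positive values indicate muscle activation (e.g., corrugator to angry faces) and negative values relaxation. This is a minimal Python illustration; the sampling rate, window lengths, and variable names are assumptions, not parameters reported by the study.

        import numpy as np

        FS = 1000  # assumed sampling rate in Hz

        def emg_response(emg, onset, fs=FS, baseline_s=0.5, window_s=2.0):
            """Mean rectified EMG after stimulus onset minus the pre-stimulus baseline."""
            rectified = np.abs(emg)  # full-wave rectification
            baseline = rectified[onset - int(baseline_s * fs):onset].mean()
            response = rectified[onset:onset + int(window_s * fs)].mean()
            return response - baseline  # > 0 activation, < 0 relaxation

        # Hypothetical usage: compare responses across emotion conditions, e.g.
        # delta_corr = emg_response(corrugator_trace, onset_sample)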

  14. Differential Expression of Wound Fibrotic Factors between Facial and Trunk Dermal Fibroblasts

    PubMed Central

    Kurita, Masakazu; Okazaki, Mutsumi; Kaminishi-Tanikawa, Akiko; Niikura, Mamoru; Takushima, Akihiko; Harii, Kiyonori

    2012-01-01

    Clinically, wounds on the face tend to heal with less scarring than those on the trunk, but the causes of this difference have not been clarified. Fibroblasts obtained from different parts of the body are known to show different properties. To investigate whether the characteristic properties of facial and trunk wound healing are caused by differences in local fibroblasts, we comparatively analyzed the functional properties of superficial and deep dermal fibroblasts obtained from the facial and trunk skin of seven individuals, with an emphasis on tendency for fibrosis. Proliferation kinetics and mRNA and protein expression of 11 fibrosis-associated factors were investigated. The proliferation kinetics of facial and trunk fibroblasts were identical, but the expression and production levels of profibrotic factors, such as extracellular matrix, transforming growth factor-β1, and connective tissue growth factor mRNA, were lower in facial fibroblasts when compared with trunk fibroblasts, while the expression of antifibrotic factors, such as collagenase, basic fibroblast growth factor, and hepatocyte growth factor, showed no clear trends. The differences in functional properties of facial and trunk dermal fibroblasts were consistent with the clinical tendencies of healing of facial and trunk wounds. Thus, the differences between facial and trunk scarring are at least partly related to the intrinsic nature of the local dermal fibroblasts. PMID:22260504

  15. In Your Face: Startle to Emotional Facial Expressions Depends on Face Direction

    PubMed Central

    Michalsen, Henriette; Øvervoll, Morten

    2017-01-01

    Although faces are often included in the broad category of emotional visual stimuli, the affective impact of different facial expressions is not well documented. The present experiment investigated startle electromyographic responses to pictures of neutral, happy, angry, and fearful facial expressions, with a frontal face direction (directed) and at a 45° angle to the left (averted). Results showed that emotional facial expressions interact with face direction to produce startle potentiation: Greater responses were found for angry expressions, compared with fear and neutrality, with directed faces. When faces were averted, fear and neutrality produced larger responses compared with anger and happiness. These results are in line with the notion that startle is potentiated to stimuli signaling threat. That is, a forward directed angry face may signal a threat toward the observer, and a fearful face directed to the side may signal a possible threat in the environment. PMID:28321290

  16. Does social support protect against recognition of angry facial expressions following failure?

    PubMed

    Tanzer, Michal; Avidan, Galia; Shahar, Golan

    2013-01-01

    Herein we examined the ability of perceived social support to modulate the recognition of facial expressions. Specifically, we hypothesised that perceived social support would serve as a "protective lens", biasing individuals against recognition of angry faces and towards recognizing happy ones. Experimentally inducing failure and success in a performance task, we examined both main and interactive effects of perceived social support. Under experimentally induced failure, perceived social support was negatively associated with recognition of angry facial expressions. Interestingly, we also found a trend toward a positive association between perceived social support and recognition of happy facial expressions, irrespective of the failure/success experimental manipulation. These findings elucidate a potential link between social/personality and cognitive processes underlying the perception of emotional facial expressions.

  17. Sex differences in the perception of affective facial expressions: do men really lack emotional sensitivity?

    PubMed

    Montagne, Barbara; Kessels, Roy P C; Frigerio, Elisa; de Haan, Edward H F; Perrett, David I

    2005-06-01

    There is evidence that men and women display differences in both cognitive and affective functions. Recent studies have examined the processing of emotions in males and females. However, the findings are inconclusive, possibly the result of methodological differences. The aim of this study was to investigate the perception of emotional facial expressions in men and women. Video clips of neutral faces, gradually morphing into full-blown expressions, were used. By doing this, we were able to examine both the accuracy and the sensitivity in labelling emotional facial expressions. Furthermore, all participants completed an anxiety and a depression rating scale. Research participants were 40 female students and 28 male students. Results revealed that men were less accurate, as well as less sensitive, in labelling facial expressions. Thus, men show an overall worse performance compared to women on a task measuring the processing of emotional faces. This result is discussed in relation to recent findings.
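
    The morphing paradigm separates two measures that a static-photo task conflates: accuracy (is the full-intensity expression labelled correctly?) and sensitivity (how early along the morph is it first labelled correctly?). The sketch below illustrates that distinction; the trial format of (intensity, label) pairs is an assumption made purely for illustration.

        def score_trial(responses, true_label):
            """responses: (intensity_percent, label) pairs in ascending intensity order."""
            accurate = responses[-1][1] == true_label  # correct at full intensity?
            # sensitivity: lowest intensity at which the correct label was first given
            threshold = next((i for i, lab in responses if lab == true_label), None)
            return accurate, threshold

        print(score_trial([(40, "fear"), (70, "anger"), (100, "anger")], "anger"))
        # -> (True, 70): accurate overall, first recognized at 70% intensity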

  18. In Your Face: Startle to Emotional Facial Expressions Depends on Face Direction.

    PubMed

    Åsli, Ole; Michalsen, Henriette; Øvervoll, Morten

    2017-01-01

    Although faces are often included in the broad category of emotional visual stimuli, the affective impact of different facial expressions is not well documented. The present experiment investigated startle electromyographic responses to pictures of neutral, happy, angry, and fearful facial expressions, with a frontal face direction (directed) and at a 45° angle to the left (averted). Results showed that emotional facial expressions interact with face direction to produce startle potentiation: Greater responses were found for angry expressions, compared with fear and neutrality, with directed faces. When faces were averted, fear and neutrality produced larger responses compared with anger and happiness. These results are in line with the notion that startle is potentiated to stimuli signaling threat. That is, a forward directed angry face may signal a threat toward the observer, and a fearful face directed to the side may signal a possible threat in the environment.
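
    Startle potentiation in studies like this is conventionally scored as the peak of the rectified orbicularis oculi EMG shortly after the startle probe, relative to a pre-probe baseline, with larger blinks taken to index a more defensive state. A rough sketch follows; the sampling rate and window bounds are typical values assumed here, not those of the study.

        import numpy as np

        def startle_magnitude(emg, probe_onset, fs=1000, search=(0.02, 0.15), base_s=0.05):
            """Peak rectified EMG in the post-probe window minus the pre-probe baseline."""
            rect = np.abs(emg)
            base = rect[probe_onset - int(base_s * fs):probe_onset].mean()
            lo, hi = (probe_onset + int(t * fs) for t in search)
            return rect[lo:hi].max() - base

        # Potentiation is then tested by comparing mean magnitudes across conditions,
        # e.g., directed-angry versus directed-neutral trials.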

  19. Mapping the impairment in decoding static facial expressions of emotion in prosopagnosia.

    PubMed

    Fiset, Daniel; Blais, Caroline; Royer, Jessica; Richoz, Anne-Raphaëlle; Dugas, Gabrielle; Caldara, Roberto

    2017-08-01

    Acquired prosopagnosia is characterized by a deficit in face recognition due to diverse brain lesions, but interestingly most prosopagnosic patients suffering from posterior lesions use the mouth instead of the eyes for face identification. Whether this bias is present for the recognition of facial expressions of emotion has not yet been addressed. We tested PS, a pure case of acquired prosopagnosia with bilateral occipitotemporal lesions anatomically sparing the regions dedicated for facial expression recognition. PS used mostly the mouth to recognize facial expressions even when the eye area was the most diagnostic. Moreover, PS directed most of her fixations towards the mouth. Her impairment was still largely present when she was instructed to look at the eyes, or when she was forced to look at them. Control participants showed a performance comparable to PS when only the lower part of the face was available. These observations suggest that the deficits observed in PS with static images are not solely attentional, but are rooted at the level of facial information use. This study corroborates neuroimaging findings suggesting that the Occipital Face Area might play a critical role in extracting facial features that are integrated for both face identification and facial expression recognition in static images.

  20. Automated decoding of facial expressions reveals marked differences in children when telling antisocial versus prosocial lies.

    PubMed

    Zanette, Sarah; Gao, Xiaoqing; Brunet, Megan; Bartlett, Marian Stewart; Lee, Kang

    2016-10-01

    The current study used computer vision technology to examine the nonverbal facial expressions of children (6-11 years old) telling antisocial and prosocial lies. Children in the antisocial lying group completed a temptation resistance paradigm where they were asked not to peek at a gift being wrapped for them. All children peeked at the gift and subsequently lied about their behavior. Children in the prosocial lying group were given an undesirable gift and asked if they liked it. All children lied about liking the gift. Nonverbal behavior was analyzed using the Computer Expression Recognition Toolbox (CERT), which employs the Facial Action Coding System (FACS), to automatically code children's facial expressions while lying. Using CERT, children's facial expressions during antisocial and prosocial lying were accurately and reliably differentiated significantly above chance-level accuracy. The basic expressions of emotion that distinguished antisocial lies from prosocial lies were joy and contempt. Children expressed joy more in prosocial lying than in antisocial lying. Girls showed more joy and less contempt compared with boys when they told prosocial lies. Boys showed more contempt when they told prosocial lies than when they told antisocial lies. The key action units (AUs) that differentiate children's antisocial and prosocial lies are blink/eye closure, lip pucker, and lip raise on the right side. Together, these findings indicate that children's facial expressions differ while telling antisocial versus prosocial lies. The reliability of CERT in detecting such differences in facial expression suggests the viability of using computer vision technology in deception research.
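
    CERT outputs frame-by-frame estimates of Action Unit intensity, so discriminating the two lie types reduces to supervised classification over AU features. The sketch below shows the shape of such a pipeline with simulated placeholder features; the study's actual features, classifier, and validation scheme may differ.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(80, 20))    # placeholder: per-child mean AU intensities
        y = rng.integers(0, 2, size=80)  # 0 = prosocial lie, 1 = antisocial lie

        clf = LogisticRegression(max_iter=1000)
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"cross-validated accuracy: {acc:.2f}")  # above chance would mirror the paper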

  1. Neurophysiology of spontaneous facial expressions: I. Motor control of the upper and lower face is behaviorally independent in adults.

    PubMed

    Ross, Elliott D; Gupta, Smita S; Adnan, Asif M; Holden, Thomas L; Havlicek, Joseph; Radhakrishnan, Sridhar

    2016-03-01

    Facial expressions are described traditionally as monolithic entities. However, humans have the capacity to produce facial blends, in which the upper and lower face simultaneously display different emotional expressions. This, in turn, has led to the Component Theory of facial expressions. Recent neuroanatomical studies in monkeys have demonstrated that there are separate cortical motor areas for controlling the upper and lower face that, presumably, also occur in humans. The lower face is represented on the posterior ventrolateral surface of the frontal lobes in the primary motor and premotor cortices and the upper face is represented on the medial surface of the posterior frontal lobes in the supplementary motor and anterior cingulate cortices. Our laboratory has been engaged in a series of studies exploring the perception and production of facial blends. Using high-speed videography, we began measuring the temporal aspects of facial expressions to develop a more complete understanding of the neurophysiology underlying facial expressions and facial blends. The goal of the research presented here was to determine if spontaneous facial expressions in adults are predominantly monolithic or exhibit independent motor control of the upper and lower face. We found that spontaneous facial expressions are very complex and that the motor control of the upper and lower face is overwhelmingly independent, thus robustly supporting the Component Theory of facial expressions. Seemingly monolithic expressions, be they full facial or facial blends, are most likely the result of a timing coincidence rather than a synchronous coordination between the ventrolateral and medial cortical motor areas responsible for controlling the lower and upper face, respectively. In addition, we found evidence that the right and left face may also exhibit independent motor control, thus supporting the concept that spontaneous facial expressions are organized predominantly across the horizontal facial…

  2. Alexithymia, not autism, predicts poor recognition of emotional facial expressions.

    PubMed

    Cook, Richard; Brewer, Rebecca; Shah, Punit; Bird, Geoffrey

    2013-05-01

    Despite considerable research into whether face perception is impaired in autistic individuals, clear answers have proved elusive. In the present study, we sought to determine whether co-occurring alexithymia (characterized by difficulties interpreting emotional states) may be responsible for face-perception deficits previously attributed to autism. Two experiments were conducted using psychophysical procedures to determine the relative contributions of alexithymia and autism to identity and expression recognition. Experiment 1 showed that alexithymia correlates strongly with the precision of expression attributions, whereas autism severity was unrelated to expression-recognition ability. Experiment 2 confirmed that alexithymia is not associated with impaired ability to detect expression variation; instead, results suggested that alexithymia is associated with difficulties interpreting intact sensory descriptions. Neither alexithymia nor autism was associated with biased or imprecise identity attributions. These findings accord with the hypothesis that the emotional symptoms of autism are in fact due to co-occurring alexithymia and that existing diagnostic criteria may need to be revised.

  3. Red enhances the processing of facial expressions of anger.

    PubMed

    Young, Steven G; Elliot, Andrew J; Feltman, Roger; Ambady, Nalini

    2013-06-01

    Emotional expressions convey important social information. Given the social importance of decoding emotions, expressive faces wield great influence on cognition and perception. However, contextual factors also exert a top-down influence on emotion detection, privileging particular expressions over others. The current research investigates how the psychological meaning implied by the color red biases the processing of anger expressions. Red has been shown to carry the meaning of threat and danger, and in two experiments we find that exposure to red enhances the perception and identification of anger. In Experiment 1, the identification of anger, relative to happiness, was facilitated when faces were viewed on a red background. In Experiment 2, the red-anger facilitation effect was replicated and shown to not generalize to another high arousal negative emotion, fear. These results document a novel influence of color on emotion detection processes.

  4. The Development of Dynamic Facial Expression Recognition at Different Intensities in 4- to 18-Year-Olds

    ERIC Educational Resources Information Center

    Montirosso, Rosario; Peverelli, Milena; Frigerio, Elisa; Crespi, Monica; Borgatti, Renato

    2010-01-01

    The primary purpose of this study was to examine the effect of the intensity of emotion expression on children's developing ability to label emotion during a dynamic presentation of five facial expressions (anger, disgust, fear, happiness, and sadness). A computerized task (AFFECT--animated full facial expression comprehension test) was used to…

  6. Experience-based human perception of facial expressions in Barbary macaques (Macaca sylvanus).

    PubMed

    Maréchal, Laëtitia; Levy, Xandria; Meints, Kerstin; Majolo, Bonaventura

    2017-01-01

    Facial expressions convey key cues of human emotions, and may also be important for interspecies interactions. The universality hypothesis suggests that six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) should be expressed by similar facial expressions in close phylogenetic species such as humans and nonhuman primates. However, some facial expressions have been shown to differ in meaning between humans and nonhuman primates like macaques. This ambiguity in signalling emotion can lead to an increased risk of aggression and injuries for both humans and animals. This raises serious concerns for activities such as wildlife tourism where humans closely interact with wild animals. Understanding what factors (i.e., experience and type of emotion) affect ability to recognise emotional state of nonhuman primates, based on their facial expressions, can enable us to test the validity of the universality hypothesis, as well as reduce the risk of aggression and potential injuries in wildlife tourism. The present study investigated whether different levels of experience of Barbary macaques, Macaca sylvanus, affect the ability to correctly assess different facial expressions related to aggressive, distressed, friendly or neutral states, using an online questionnaire. Participants' level of experience was defined as either: (1) naïve: never worked with nonhuman primates and never or rarely encountered live Barbary macaques; (2) exposed: shown pictures of the different Barbary macaques' facial expressions along with the description and the corresponding emotion prior to undertaking the questionnaire; (3) expert: worked with Barbary macaques for at least two months. Experience with Barbary macaques was associated with better performance in judging their emotional state. Simple exposure to pictures of macaques' facial expressions improved the ability of inexperienced participants to better discriminate neutral and distressed faces, and a trend was found for…

  7. Experience-based human perception of facial expressions in Barbary macaques (Macaca sylvanus)

    PubMed Central

    Levy, Xandria; Meints, Kerstin; Majolo, Bonaventura

    2017-01-01

    Background Facial expressions convey key cues of human emotions, and may also be important for interspecies interactions. The universality hypothesis suggests that six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) should be expressed by similar facial expressions in close phylogenetic species such as humans and nonhuman primates. However, some facial expressions have been shown to differ in meaning between humans and nonhuman primates like macaques. This ambiguity in signalling emotion can lead to an increased risk of aggression and injuries for both humans and animals. This raises serious concerns for activities such as wildlife tourism where humans closely interact with wild animals. Understanding what factors (i.e., experience and type of emotion) affect ability to recognise emotional state of nonhuman primates, based on their facial expressions, can enable us to test the validity of the universality hypothesis, as well as reduce the risk of aggression and potential injuries in wildlife tourism. Methods The present study investigated whether different levels of experience of Barbary macaques, Macaca sylvanus, affect the ability to correctly assess different facial expressions related to aggressive, distressed, friendly or neutral states, using an online questionnaire. Participants’ level of experience was defined as either: (1) naïve: never worked with nonhuman primates and never or rarely encountered live Barbary macaques; (2) exposed: shown pictures of the different Barbary macaques’ facial expressions along with the description and the corresponding emotion prior to undertaking the questionnaire; (3) expert: worked with Barbary macaques for at least two months. Results Experience with Barbary macaques was associated with better performance in judging their emotional state. Simple exposure to pictures of macaques’ facial expressions improved the ability of inexperienced participants to better discriminate neutral and distressed…

  8. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze.

    PubMed

    Wells, Laura Jean; Gillespie, Steven Mark; Rotshtein, Pia

    2016-01-01

    The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each with three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, but slowest and least accurate for fearful expressions. More intense expressions were also classified most accurately. Reaction time showed a different pattern, with slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than on the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to the most for fearful, angry, and disgusted expressions and least for surprise. These results extend upon previous findings to show important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down.
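
    Dwell-time measures of this kind are computed by summing fixation durations that land inside areas of interest (AOIs) such as the eye and mouth regions. A minimal sketch follows, with invented AOI rectangles and a simple (x, y, duration) fixation format; both are assumptions for illustration only.

        AOIS = {"eyes": (100, 300, 80, 160), "mouth": (150, 250, 260, 320)}  # x0, x1, y0, y1

        def dwell_proportions(fixations, aois=AOIS):
            """fixations: iterable of (x, y, duration_ms) tuples."""
            totals = {name: 0.0 for name in aois}
            grand = 0.0
            for x, y, dur in fixations:
                grand += dur
                for name, (x0, x1, y0, y1) in aois.items():
                    if x0 <= x <= x1 and y0 <= y <= y1:
                        totals[name] += dur
            return {name: t / grand for name, t in totals.items()} if grand else totals

        print(dwell_proportions([(150, 100, 400), (200, 290, 150)]))  # eyes-heavy example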

  9. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze

    PubMed Central

    Rotshtein, Pia

    2016-01-01

    The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each with three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, but slowest and least accurate for fearful expressions. More intense expressions were also classified most accurately. Reaction time showed a different pattern, with slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than on the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to the most for fearful, angry, and disgusted expressions and least for surprise. These results extend upon previous findings to show important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down. PMID:27942030

  10. That "poker face" just might lose you the game! The impact of expressive suppression and mimicry on sensitivity to facial expressions of emotion.

    PubMed

    Schneider, Kristin G; Hempel, Roelie J; Lynch, Thomas R

    2013-10-01

    Successful interpersonal functioning often requires both the ability to mask inner feelings and the ability to accurately recognize others' expressions--but what if effortful control of emotional expressions impacts the ability to accurately read others? In this study, we examined the influence of self-controlled expressive suppression and mimicry on facial affect sensitivity--the speed with which one can accurately identify gradually intensifying facial expressions of emotion. Muscle activity of the brow (corrugator, related to anger), upper lip (levator, related to disgust), and cheek (zygomaticus, related to happiness) were recorded using facial electromyography while participants randomized to one of three conditions (Suppress, Mimic, and No-Instruction) viewed a series of six distinct emotional expressions (happiness, sadness, fear, anger, surprise, and disgust) as they morphed from neutral to full expression. As hypothesized, individuals instructed to suppress their own facial expressions showed impairment in facial affect sensitivity. Conversely, mimicry of emotion expressions appeared to facilitate facial affect sensitivity. Results suggest that it is difficult for a person to be able to simultaneously mask inner feelings and accurately "read" the facial expressions of others, at least when these expressions are at low intensity. The combined behavioral and physiological data suggest that the strategies an individual selects to control his or her own expression of emotion have important implications for interpersonal functioning.

  11. 4D ultrasound evaluation of fetal facial expressions during the latter stages of the second trimester.

    PubMed

    Kanenishi, Kenji; Hanaoka, Uiko; Noguchi, Junko; Marumo, Genzo; Hata, Toshiyuki

    2013-06-01

    The aim of this study was to assess the frequency of fetal facial expressions at 25-27 weeks of gestation using 4D ultrasound. Twenty-four normal fetuses were examined using 4D ultrasound. The face of each fetus was recorded continuously for 15 minutes. The frequencies of tongue expulsion, yawning, sucking, mouthing, blinking, scowling, and smiling were assessed and compared with those observed at 28-34 weeks of gestation in a previous study. Mouthing was the most common facial expression at 25-27 weeks of gestation; the frequency of mouthing was significantly higher than that of the other 6 facial expressions (P<0.05). Yawning was significantly more frequent than the other facial expressions, apart from mouthing (P<0.05). The frequencies of yawning, smiling, tongue expulsion, sucking, and blinking differed significantly between 25-27 and 28-34 weeks (P<0.05). The results indicate that facial expressions can be used as an indicator of normal fetal neurologic development from the second to the third trimester. 4D ultrasound may be a valuable tool for assessing fetal neurobehavioral development during gestation.

  12. "You Should Have Seen the Look on Your Face…": Self-awareness of Facial Expressions.

    PubMed

    Qu, Fangbing; Yan, Wen-Jing; Chen, Yu-Hsin; Li, Kaiyun; Zhang, Hui; Fu, Xiaolan

    2017-01-01

    The awareness of facial expressions allows one to better understand, predict, and regulate his/her states to adapt to different social situations. The present research investigated individuals' awareness of their own facial expressions and the influence of the duration and intensity of expressions in two self-reference modalities, a real-time condition and a video-review condition. The participants were instructed to respond as soon as they became aware of any facial movements. The results revealed that awareness rates were 57.79% in the real-time condition and 75.92% in the video-review condition. The awareness rate was influenced by the intensity and (or) the duration. The intensity thresholds for individuals to become aware of their own facial expressions were calculated using logistic regression models. The results of Generalized Estimating Equations (GEE) revealed that video-review awareness was a significant predictor of real-time awareness. These findings extend our understanding of human facial expression self-awareness in two modalities.
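
    The intensity thresholds described above fall out of a logistic model of awareness (0/1) as a function of expression intensity: the threshold is the intensity at which the fitted probability of awareness crosses 50%, i.e. -intercept/slope for a single predictor. Below is a sketch on simulated data; the study's exact model covariates and software are not specified here.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        intensity = rng.uniform(0, 1, 300)
        p_aware = 1 / (1 + np.exp(-(10 * (intensity - 0.4))))  # true threshold at 0.4
        aware = rng.random(300) < p_aware

        model = LogisticRegression().fit(intensity.reshape(-1, 1), aware)
        threshold = -model.intercept_[0] / model.coef_[0, 0]   # p = 0.5 crossing
        print(f"estimated awareness threshold: {threshold:.2f}")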

  13. Common cues to emotion in the dynamic facial expressions of speech and song

    PubMed Central

    Livingstone, Steven R.; Thompson, William F.; Wanderley, Marcelo M.; Palmer, Caroline

    2015-01-01

    Speech and song are universal forms of vocalization that may share aspects of emotional expression. Research has focused on parallels in acoustic features, overlooking facial cues to emotion. In three experiments, we compared moving facial expressions in speech and song. In Experiment 1, vocalists spoke and sang statements each with five emotions. Vocalists exhibited emotion-dependent movements of the eyebrows and lip corners that transcended speech–song differences. Vocalists’ jaw movements were coupled to their acoustic intensity, exhibiting differences across emotion and speech–song. Vocalists’ emotional movements extended beyond vocal sound to include large sustained expressions, suggesting a communicative function. In Experiment 2, viewers judged silent videos of vocalists’ facial expressions prior to, during, and following vocalization. Emotional intentions were identified accurately for movements during and after vocalization, suggesting that these movements support the acoustic message. Experiment 3 compared emotional identification in voice-only, face-only, and face-and-voice recordings. Emotion judgements for voice-only singing were poorly identified, yet were accurate for all other conditions, confirming that facial expressions conveyed emotion more accurately than the voice in song, yet were equivalent in speech. Collectively, these findings highlight broad commonalities in the facial cues to emotion in speech and song, yet highlight differences in perception and acoustic-motor production. PMID:25424388

  14. Gaze behavior predicts memory bias for angry facial expressions in stable dysphoria.

    PubMed

    Wells, Tony T; Beevers, Christopher G; Robison, Adrienne E; Ellis, Alissa J

    2010-12-01

    Interpersonal theories suggest that depressed individuals are sensitive to signs of interpersonal rejection, such as angry facial expressions. The present study examined memory bias for happy, sad, angry, and neutral facial expressions in stably dysphoric and stably nondysphoric young adults. Participants' gaze behavior (i.e., fixation duration, number of fixations, and distance between fixations) while viewing these facial expressions was also assessed. Using signal detection analyses, the dysphoric group had better accuracy on a surprise recognition task for angry faces than the nondysphoric group. Further, mediation analyses indicated that greater breadth of attentional focus (i.e., distance between fixations) accounted for enhanced recall of angry faces among the dysphoric group. There were no differences between dysphoria groups in gaze behavior or memory for sad, happy, or neutral facial expressions. Findings from this study identify a specific cognitive mechanism (i.e., breadth of attentional focus) that accounts for biased recall of angry facial expressions in dysphoria. This work also highlights the potential for integrating cognitive and interpersonal theories of depression.
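
    The signal detection approach used here separates recognition sensitivity from response bias; the standard sensitivity index is d' = z(hit rate) - z(false-alarm rate). The sketch below computes it with a common correction for extreme rates; the counts are invented for illustration.

        from scipy.stats import norm

        def d_prime(hits, misses, false_alarms, correct_rejections):
            def rate(k, n):  # correction keeps rates away from exactly 0 or 1
                return (k + 0.5) / (n + 1.0)
            h = rate(hits, hits + misses)
            f = rate(false_alarms, false_alarms + correct_rejections)
            return norm.ppf(h) - norm.ppf(f)

        print(d_prime(18, 2, 4, 16))  # e.g., recognition of angry faces in one group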

  15. Common cues to emotion in the dynamic facial expressions of speech and song.

    PubMed

    Livingstone, Steven R; Thompson, William F; Wanderley, Marcelo M; Palmer, Caroline

    2015-01-01

    Speech and song are universal forms of vocalization that may share aspects of emotional expression. Research has focused on parallels in acoustic features, overlooking facial cues to emotion. In three experiments, we compared moving facial expressions in speech and song. In Experiment 1, vocalists spoke and sang statements each with five emotions. Vocalists exhibited emotion-dependent movements of the eyebrows and lip corners that transcended speech-song differences. Vocalists' jaw movements were coupled to their acoustic intensity, exhibiting differences across emotion and speech-song. Vocalists' emotional movements extended beyond vocal sound to include large sustained expressions, suggesting a communicative function. In Experiment 2, viewers judged silent videos of vocalists' facial expressions prior to, during, and following vocalization. Emotional intentions were identified accurately for movements during and after vocalization, suggesting that these movements support the acoustic message. Experiment 3 compared emotional identification in voice-only, face-only, and face-and-voice recordings. Emotion judgements for voice-only singing were poorly identified, yet were accurate for all other conditions, confirming that facial expressions conveyed emotion more accurately than the voice in song, yet were equivalent in speech. Collectively, these findings highlight broad commonalities in the facial cues to emotion in speech and song, yet highlight differences in perception and acoustic-motor production.

  16. Perception of stereoscopic direct gaze: The effects of interaxial distance and emotional facial expressions.

    PubMed

    Hakala, Jussi; Kätsyri, Jari; Takala, Tapio; Häkkinen, Jukka

    2016-07-01

    Gaze perception has received considerable research attention due to its importance in social interaction. The majority of recent studies have utilized monoscopic pictorial gaze stimuli. However, a monoscopic direct gaze differs from a live or stereoscopic gaze. In the monoscopic condition, both eyes of the observer receive a direct gaze, whereas in live and stereoscopic conditions, only one eye receives a direct gaze. In the present study, we examined the implications of the difference between monoscopic and stereoscopic direct gaze. Moreover, because research has shown that stereoscopy affects the emotions elicited by facial expressions, and facial expressions affect the range of directions where an observer perceives mutual gaze-the cone of gaze-we studied the interaction effect of stereoscopy and facial expressions on gaze perception. Forty observers viewed stereoscopic images wherein one eye of the observer received a direct gaze while the other eye received a horizontally averted gaze at five different angles corresponding to five interaxial distances between the cameras in stimulus acquisition. In addition to monoscopic and stereoscopic conditions, the stimuli included neutral, angry, and happy facial expressions. The observers judged the gaze direction and mutual gaze of four lookers. Our results show that the mean of the directions received by the left and right eyes approximated the perceived gaze direction in the stereoscopic semidirect gaze condition. The probability of perceiving mutual gaze in the stereoscopic condition was substantially lower compared with monoscopic direct gaze. Furthermore, stereoscopic semidirect gaze significantly widened the cone of gaze for happy facial expressions.

  17. Facial EMG responses to emotional expressions are related to emotion perception ability.

    PubMed

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a "reactivation" of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses - recorded with electromyogram (EMG) - in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective.

  18. Facial EMG Responses to Emotional Expressions Are Related to Emotion Perception Ability

    PubMed Central

    Künecke, Janina; Hildebrandt, Andrea; Recio, Guillermo; Sommer, Werner; Wilhelm, Oliver

    2014-01-01

    Although most people can identify facial expressions of emotions well, they still differ in this ability. According to embodied simulation theories, understanding emotions of others is fostered by involuntarily mimicking the perceived expressions, causing a “reactivation” of the corresponding mental state. Some studies suggest automatic facial mimicry during expression viewing; however, findings on the relationship between mimicry and emotion perception abilities are equivocal. The present study investigated individual differences in emotion perception and its relationship to facial muscle responses - recorded with electromyogram (EMG) - in response to emotional facial expressions. N = 269 participants completed multiple tasks measuring face and emotion perception. EMG recordings were taken from a subsample (N = 110) in an independent emotion classification task of short videos displaying six emotions. Confirmatory factor analyses of the m. corrugator supercilii in response to angry, happy, sad, and neutral expressions showed that individual differences in corrugator activity can be separated into a general response to all faces and an emotion-related response. Structural equation modeling revealed a substantial relationship between the emotion-related response and emotion perception ability, providing evidence for the role of facial muscle activation in emotion perception from an individual differences perspective. PMID:24489647

  19. Facial expression: An under-utilised tool for the assessment of welfare in mammals.

    PubMed

    Descovich, Kris A; Wathan, Jennifer; Leach, Matthew C; Buchanan-Smith, Hannah M; Flecknell, Paul; Farningham, David; Vick, Sarah-Jane

    2017-02-08

    Animal welfare is a key issue for industries that use or impact upon animals. The accurate identification of welfare states is particularly relevant to the field of bioscience, where the 3Rs framework encourages refinement of experimental procedures involving animal models. The assessment and improvement of welfare states in animals is reliant on reliable and valid measurement tools. Behavioural measures (activity, attention, posture and vocalisation) are frequently used because they are immediate and non-invasive, however no single indicator can yield a complete picture of the internal state of an animal. Facial expressions are extensively studied in humans as a measure of psychological and emotional experiences but are infrequently used in animal studies, with the exception of emerging research on pain behaviour. In this review, we discuss current evidence for facial representations of underlying affective states, and how communicative or functional expressions can be useful within welfare assessments. Validated tools for measuring facial movement are outlined, and the potential of expressions as honest signals are discussed, alongside other challenges and limitations to facial expression measurement within the context of animal welfare. We conclude that facial expression determination in animals is a useful but underutilised measure that complements existing tools in the assessment of welfare.

  20. Analysis and evaluation of facial expression and perceived age for designing automotive frontal views

    NASA Astrophysics Data System (ADS)

    Fujiwara, Takayuki; Kawasumi, Mikiko; Koshimizu, Hiroyasu

    2007-01-01

    We propose a method for quantifying automotive frontal-view design based on research into human visual impressions of facial expressions. We evaluated the automotive frontal "face" using facial-impression words and perceived age, and verified experimentally how effectively line-drawing images and coche-PICASSO images could serve as visual stimuli. Some of the facial words were strongly correlated with both the facial expressions and the perceived age of the line-drawing images. In addition, the perceived age of the coche-PICASSO images was consistently younger than that of the line-drawing images.

  1. Electrocortical responses to NIMSTIM facial expressions of emotion.

    PubMed

    Smith, Ezra; Weinberg, Anna; Moran, Tim; Hajcak, Greg

    2013-04-01

    Emotional faces are motivationally salient stimuli that automatically capture attention and rapidly potentiate neural processing. Because of their superior temporal resolution, scalp-recorded event-related potentials (ERPs) are ideal for examining rapid changes in neural activity. Some reports have found larger ERPs for fearful and angry faces compared with both neutral and other emotional faces, and a key aim of the present study was to assess neural response to multiple emotional expressions using the NIMSTIM. Importantly, no study has yet systematically evaluated neural activity and self-report ratings for multiple NIMSTIM expressions. Study 1 examined the time-course of electrocortical activity in response to fearful, angry, sad, happy, and neutral NIMSTIM faces. In Study 2, valence and arousal ratings were collected for the same faces in a separate sample. In line with previous findings, the early P1 was larger for fearful compared with neutral faces. The vertex positivity (VPP) was enhanced for fearful, angry, and happy expressions compared to neutral. There was no effect of expression on the N170. Marginally significant enhancements were observed for all expressions during the early posterior negativity (EPN). The late positive potential (LPP) was enhanced only for fearful and angry faces. All emotional expressions were rated as more arousing and more pleasant/unpleasant than neutral expressions. Overall, findings suggest that angry and fearful faces might be especially potent in terms of eliciting ERP responses and ideal for emotion research when more evocative images cannot be used.
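
    Component effects like these are usually quantified as the mean amplitude of the trial-averaged ERP within a component-specific time window. The sketch below shows that computation on a plain NumPy array of baseline-corrected epochs; the window boundaries are textbook-style assumptions rather than the study's exact analysis windows.

        import numpy as np

        WINDOWS = {"P1": (0.08, 0.13), "N170": (0.13, 0.20), "VPP": (0.14, 0.20),
                   "EPN": (0.20, 0.30), "LPP": (0.40, 0.80)}  # seconds, illustrative

        def component_amplitudes(epochs, fs=250, tmin=-0.2, windows=WINDOWS):
            """epochs: (n_trials, n_samples) array for one channel and condition."""
            erp = epochs.mean(axis=0)  # average over trials
            out = {}
            for name, (t0, t1) in windows.items():
                i0, i1 = (int((t - tmin) * fs) for t in (t0, t1))
                out[name] = erp[i0:i1].mean()
            return out

        # Condition effects are then tested by comparing these means,
        # e.g., fearful vs. neutral P1 amplitude.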

  2. Evidence for Anger Saliency during the Recognition of Chimeric Facial Expressions of Emotions in Underage Ebola Survivors.

    PubMed

    Ardizzi, Martina; Evangelista, Valentina; Ferroni, Francesca; Umiltà, Maria A; Ravera, Roberto; Gallese, Vittorio

    2017-01-01

    One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, normally allowing the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus unbalancing victims' recognition of negative emotions. Despite the plethora of studies on this topic, to date, it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30, 15 males) performed a forced-choice chimeric facial expressions recognition task. The chimeric facial expressions were obtained pairing upper and lower half faces of two different negative emotions (selected from anger, fear and sadness for a total of six different combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger. In this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims' performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify the perceptual…

  3. Evidence for Anger Saliency during the Recognition of Chimeric Facial Expressions of Emotions in Underage Ebola Survivors

    PubMed Central

    Ardizzi, Martina; Evangelista, Valentina; Ferroni, Francesca; Umiltà, Maria A.; Ravera, Roberto; Gallese, Vittorio

    2017-01-01

    One of the crucial features defining basic emotions and their prototypical facial expressions is their value for survival. Childhood traumatic experiences affect the effective recognition of facial expressions of negative emotions, normally allowing the recruitment of adequate behavioral responses to environmental threats. Specifically, anger becomes an extraordinarily salient stimulus unbalancing victims’ recognition of negative emotions. Despite the plethora of studies on this topic, to date, it is not clear whether this phenomenon reflects an overall response tendency toward anger recognition or a selective proneness to the salience of specific facial expressive cues of anger after trauma exposure. To address this issue, a group of underage Sierra Leonean Ebola virus disease survivors (mean age 15.40 years, SE 0.35; years of schooling 8.8 years, SE 0.46; 14 males) and a control group (mean age 14.55, SE 0.30; years of schooling 8.07 years, SE 0.30, 15 males) performed a forced-choice chimeric facial expressions recognition task. The chimeric facial expressions were obtained pairing upper and lower half faces of two different negative emotions (selected from anger, fear and sadness for a total of six different combinations). Overall, results showed that upper facial expressive cues were more salient than lower facial expressive cues. This priority was lost among Ebola virus disease survivors for the chimeric facial expressions of anger. In this case, differently from controls, Ebola virus disease survivors recognized anger regardless of the upper or lower position of the facial expressive cues of this emotion. The present results demonstrate that victims’ performance in the recognition of the facial expression of anger does not reflect an overall response tendency toward anger recognition, but rather the specific greater salience of facial expressive cues of anger. Furthermore, the present results show that traumatic experiences deeply modify the perceptual…

  4. Reduced expression of regeneration associated genes in chronically axotomized facial motoneurons.

    PubMed

    Gordon, T; You, S; Cassar, S L; Tetzlaff, W

    2015-02-01

    Chronically axotomized motoneurons progressively fail to regenerate their axons. Since axonal regeneration is associated with the increased expression of tubulin, actin and GAP-43, we examined whether the regenerative failure is due to failure of chronically axotomized motoneurons to express and sustain the expression of these regeneration associated genes (RAGs). Chronically axotomized facial motoneurons were subjected to a second axotomy to mimic the clinical surgical procedure of refreshing the proximal nerve stump prior to nerve repair. Expression of α1-tubulin, actin and GAP-43 was analyzed in axotomized motoneurons using in situ hybridization followed by autoradiography and silver grain quantification. The expression of these RAGs by acutely axotomized motoneurons declined over several months. The chronically injured motoneurons responded to a refreshment axotomy with a re-increase in RAG expression. However, this response to a refreshment axotomy of chronically injured facial motoneurons was less than that seen in acutely axotomized facial motoneurons. These data demonstrate that the neuronal RAG expression can be induced by injury-related signals and does not require acute deprivation of target derived factors. The transient expression is consistent with a transient inflammatory response to the injury. We conclude that transient RAG expression in chronically axotomized motoneurons and the weak response of the chronically axotomized motoneurons to a refreshment axotomy provides a plausible explanation for the progressive decline in regenerative capacity of chronically axotomized motoneurons.

  5. Transcranial magnetic stimulation disrupts the perception and embodiment of facial expressions.

    PubMed

    Pitcher, David; Garrido, Lúcia; Walsh, Vincent; Duchaine, Bradley C

    2008-09-03

    Theories of embodied cognition propose that recognizing facial expressions requires visual processing followed by simulation of the somatovisceral responses associated with the perceived expression. To test this proposal, we targeted the right occipital face area (rOFA) and the face region of right somatosensory cortex (rSC) with repetitive transcranial magnetic stimulation (rTMS) while participants discriminated facial expressions. rTMS selectively impaired discrimination of facial expressions at both sites but had no effect on a matched face identity task. Site specificity within the rSC was demonstrated by targeting rTMS at the face and finger regions while participants performed the expression discrimination task. rTMS targeted at the face region impaired task performance relative to rTMS targeted at the finger region. To establish the temporal course of visual and somatosensory contributions to expression processing, double-pulse TMS was delivered at different times to rOFA and rSC during expression discrimination. Accuracy dropped when pulses were delivered at 60-100 ms at rOFA and at 100-140 and 130-170 ms at rSC. These sequential impairments at rOFA and rSC support embodied accounts of expression recognition as well as hierarchical models of face processing. The results also demonstrate that nonvisual cortical areas contribute during early stages of expression processing.

  6. Sex Differences in Neural Activation to Facial Expressions Denoting Contempt and Disgust

    PubMed Central

    Aleman, André; Swart, Marte

    2008-01-01

    The facial expression of contempt has been regarded to communicate feelings of moral superiority. Contempt is an emotion that is closely related to disgust, but in contrast to disgust, contempt is inherently interpersonal and hierarchical. The aim of this study was twofold. First, to investigate the hypothesis of preferential amygdala responses to contempt expressions versus disgust. Second, to investigate whether, at a neural level, men would respond stronger to biological signals of interpersonal superiority (e.g., contempt) than women. We performed an experiment using functional magnetic resonance imaging (fMRI), in which participants watched facial expressions of contempt and disgust in addition to neutral expressions. The faces were presented as distractors in an oddball task in which participants had to react to one target face. Facial expressions of contempt and disgust activated a network of brain regions, including prefrontal areas (superior, middle and medial prefrontal gyrus), anterior cingulate, insula, amygdala, parietal cortex, fusiform gyrus, occipital cortex, putamen and thalamus. Contemptuous faces did not elicit stronger amygdala activation than did disgusted expressions. To limit the number of statistical comparisons, we confined our analyses of sex differences to the frontal and temporal lobes. Men displayed stronger brain activation than women to facial expressions of contempt in the medial frontal gyrus, inferior frontal gyrus, and superior temporal gyrus. Conversely, women showed stronger neural responses than men to facial expressions of disgust. In addition, the effect of stimulus sex differed for men versus women. Specifically, women showed stronger responses to male contemptuous faces (as compared to female expressions), in the insula and middle frontal gyrus. Contempt has been conceptualized as signaling perceived moral violations of social hierarchy, whereas disgust would signal violations of physical purity. Thus, our results suggest a…

  7. Dynamic and Static Facial Expressions Decoded from Motion-Sensitive Areas in the Macaque Monkey

    PubMed Central

    Furl, Nicholas; Hadj-Bouziane, Fadila; Liu, Ning; Averbeck, Bruno B.; Ungerleider, Leslie G.

    2012-01-01

    Humans adeptly use visual motion to recognize socially relevant facial information. The macaque provides a model visual system for studying neural coding of expression movements, as its superior temporal sulcus (STS) possesses brain areas selective for faces and areas sensitive to visual motion. We employed functional magnetic resonance imaging and facial stimuli to localize motion-sensitive areas (Mf areas), which responded more to dynamic than to static faces, and face-selective areas, which responded selectively to faces compared to objects and places. Using multivariate analysis, we found that information about both dynamic and static facial expressions could be robustly decoded from Mf areas. By contrast, face-selective areas exhibited relatively less facial expression information. Classifiers trained with expressions from one motion type (dynamic or static) generalized poorly to the other motion type, suggesting that Mf areas employ separate, non-confusable neural codes for dynamic and static presentations of the same expressions. We also show that some of the motion sensitivity elicited by facial stimuli was not specific to faces but could also be elicited by moving dots, particularly in FST and STPm/LST, confirming their well-established low-level motion sensitivity. A different pattern was found in anterior STS, which responded more to dynamic than to static faces but was not sensitive to dot motion. Overall, we show that emotional expressions are represented mostly outside of face-selective cortex, in areas sensitive to motion. These regions may play a fundamental role in enhancing recognition of facial expression despite the complex stimulus changes associated with motion. PMID:23136433
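
    The cross-decoding logic above (train on one motion type, test on the other) can be illustrated with a brief scikit-learn sketch. The voxel patterns, labels, and signal structure below are simulated stand-ins, not the study's data or pipeline; signal is injected only into the dynamic patterns so that within-motion decoding succeeds while cross-motion generalization stays near chance.

      # Cross-decoding sketch (simulated): within-motion decoding succeeds,
      # cross-motion generalization fails, mimicking separate neural codes.
      import numpy as np
      from sklearn.svm import LinearSVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      n_trials, n_voxels = 120, 200                    # hypothetical ROI size
      y = rng.integers(0, 3, n_trials)                 # 3 expression labels
      X_dynamic = rng.normal(size=(n_trials, n_voxels))
      X_static = rng.normal(size=(n_trials, n_voxels))
      X_dynamic[np.arange(n_trials), y] += 1.5         # class signal in dynamic only

      within = cross_val_score(LinearSVC(), X_dynamic, y, cv=5).mean()
      cross = LinearSVC().fit(X_dynamic, y).score(X_static, y)
      print(f"within-motion accuracy: {within:.2f}, cross-motion: {cross:.2f}")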

  8. Facial redness, expression, and masculinity influence perceptions of anger and health.

    PubMed

    Young, Steven G; Thorstenson, Christopher A; Pazda, Adam D

    2016-12-29

    Past research has found that skin colouration, particularly facial redness, influences the perceived health and emotional state of target individuals. In the current work, we explore several extensions of this past research. In Experiment 1, we manipulated facial redness incrementally on neutral and angry faces and had participants rate each face for anger and health. Distinct effects of redness emerged: perceived anger increased linearly as facial redness increased, whereas health ratings showed a curvilinear trend, with both extreme paleness and extreme redness rated as less healthy than moderate levels of red. Experiment 2 replicated and extended these findings by manipulating the masculinity of both angry and neutral faces that varied in redness. The results showed that the effect of redness on perceived anger and health was moderated by masculine face structure. Collectively, these results show that facial redness has context-dependent effects that vary with facial expression and appearance, and that differentially impact ratings of emotional state and health.
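
    The linear-versus-curvilinear contrast above amounts to comparing first- and second-order polynomial fits of ratings against redness level. A minimal sketch with simulated ratings (the redness levels, coefficients, and noise are hypothetical, not the study's data):

      # Trend sketch (simulated): anger rises linearly with redness; health
      # peaks at moderate redness (negative quadratic coefficient).
      import numpy as np

      rng = np.random.default_rng(2)
      redness = np.linspace(0.0, 1.0, 9)                   # hypothetical increments
      anger = 2.0 + 3.0 * redness + rng.normal(0, 0.1, 9)
      health = 3.0 + 2.0 * redness - 2.5 * redness**2 + rng.normal(0, 0.1, 9)

      anger_slope = np.polyfit(redness, anger, 1)[0]
      health_quad = np.polyfit(redness, health, 2)[0]
      print(f"anger slope: {anger_slope:.2f} (positive => linear increase)")
      print(f"health quadratic: {health_quad:.2f} (negative => inverted U)")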

  9. Children's Prototypic Facial Expressions During Emotion-Eliciting Conversations With Their Mothers.

    PubMed

    Castro, Vanessa L; Camras, Linda A; Halberstadt, Amy G; Shuster, Michael

    2017-07-17

    Despite theoretical claims that emotions are primarily communicated through prototypic facial expressions, empirical evidence is surprisingly scarce. This study aimed to (a) test whether children produced more components of a prototypic emotional facial expression during situations judged or self-reported to involve the corresponding emotion than during situations involving other emotions (termed "intersituational specificity"), (b) test whether children produced more components of the prototypic expression corresponding to a situation's judged or self-reported emotion than components of other emotional expressions (termed "intrasituational specificity"), and (c) examine coherence between children's self-reported emotional experience and observers' judgments of children's emotions. One hundred twenty children (ages 7-9) were video-recorded during a discussion with their mothers, and emotion ratings were obtained for 441 episodes. Children's nonverbal behaviors were judged by observers and coded by FACS-trained researchers. Children's self-reported emotion corresponded significantly to observers' judgments of joy, anger, fear, and sadness, but not surprise. Multilevel modeling revealed that children produced joy facial expressions more in joy episodes than in nonjoy episodes (supporting intersituational specificity for joy) and produced more joy and surprise expressions than other emotional expressions in joy and surprise episodes (supporting intrasituational specificity for joy and surprise). However, children produced anger, fear, and sadness expressions more in noncorresponding episodes and produced these expressions less than other expressions in corresponding episodes. Findings suggest that communication of negative emotion during social interactions, as indexed by agreement between self-report and observer judgments, may rely less on prototypic facial expressions than is often theoretically assumed.
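
    The multilevel-modeling step above (episodes nested within children) can be sketched with statsmodels. The variable names, counts, and effect size below are simulated stand-ins for the coded FACS data, not the study's dataset.

      # Multilevel sketch (simulated): joy-expression components regressed on
      # episode type, with a random intercept per child.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n_children, eps = 40, 10
      df = pd.DataFrame({
          "child": np.repeat(np.arange(n_children), eps),
          "joy_episode": rng.integers(0, 2, n_children * eps),
      })
      df["joy_components"] = (1.0 + 0.8 * df["joy_episode"]
                              + rng.normal(0, 0.5, len(df)))

      fit = smf.mixedlm("joy_components ~ joy_episode", df,
                        groups=df["child"]).fit()
      print(fit.summary())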

  10. Adult attachment anxiety is associated with enhanced automatic neural response to positive facial expression.

    PubMed

    Donges, Uta-Susan; Kugel, Harald; Stuhrmann, Anja; Grotegerd, Dominik; Redlich, Ronny; Lichev, Vladimir; Rosenberg, Nicole; Ihme, Klas; Suslow, Thomas; Dannlowski, Udo

    2012-09-18

    According to social psychology models of adult attachment, a fundamental dimension of attachment is anxiety. Individuals who are high in attachment anxiety are motivated to achieve intimacy in relationships but are mistrustful of others and of their availability. Behavioral research has shown that anxiously attached persons are vigilant for emotional facial expressions, but the neural substrates underlying this perceptual sensitivity remain largely unknown. In the present study, functional magnetic resonance imaging was used to examine automatic brain reactivity to approach-related facial emotions as a function of attachment anxiety in a sample of 109 healthy adults. Pictures of sad and happy faces were presented masked by neutral faces, and the Relationship Scales Questionnaire (RSQ) was used to assess attachment style. Attachment anxiety was correlated with depressivity, trait anxiety, and attachment avoidance. Controlling for these variables, attachment-related anxiety was positively related to responses to masked happy facial expressions in left inferior, middle, and medial prefrontal areas, globus pallidus, claustrum, and right cerebellum. Attachment anxiety was not associated with brain activation in response to masked sad faces. Our findings suggest that anxiously attached adults are automatically more responsive to positive approach-related facial expressions in brain areas involved in the perception of facial emotion, facial mimicry, and the assessment of affective value and social distance.
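
    Because the reported association was estimated while controlling for depressivity, trait anxiety, and avoidance, the analysis reduces to a covariate-adjusted regression. A sketch with simulated variables (hypothetical stand-ins for the questionnaire scores and ROI response, not the study's data):

      # Covariate-controlled sketch (simulated): attachment anxiety predicting
      # ROI activation, partialling out three correlated covariates.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      n = 109
      df = pd.DataFrame({
          "anxiety": rng.normal(size=n),
          "depressivity": rng.normal(size=n),
          "trait_anxiety": rng.normal(size=n),
          "avoidance": rng.normal(size=n),
      })
      df["roi_activation"] = 0.4 * df["anxiety"] + rng.normal(size=n)

      fit = smf.ols("roi_activation ~ anxiety + depressivity"
                    " + trait_anxiety + avoidance", df).fit()
      print(f"adjusted anxiety effect: b={fit.params['anxiety']:.2f}, "
            f"p={fit.pvalues['anxiety']:.3f}")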

  11. Distributed representations of dynamic facial expressions in the superior temporal sulcus

    PubMed Central

    Said, Christopher P.; Moore, Christopher D.; Engell, Andrew D.; Todorov, Alexander; Haxby, James V.

    2011-01-01

    Previous research on the superior temporal sulcus (STS) has shown that it responds more to facial expressions than to neutral faces. Here, we extend our understanding of the STS in two ways. First, using targeted high-resolution fMRI measurements of the lateral cortex and multivoxel pattern analysis, we show that the response to seven categories of dynamic facial expressions can be decoded in both the posterior STS (pSTS) and anterior STS (aSTS). We were also able to decode patterns corresponding to these expressions in the frontal operculum (FO), a structure that has also been shown to res